National Coastal Condition Assessment 2015
Quality Assurance Project Plan
Version 2.1, May 2016
EPA No. 841-R-14-005

United States Environmental Protection Agency
Office of Water
Office of Environmental Information
Washington, DC
QUALITY ASSURANCE PROJECT PLAN
REVIEW & DISTRIBUTION ACKNOWLEDGMENT AND
COMMITMENT TO IMPLEMENT
for
National Coastal Condition Assessment 2015
I/We have read the QAPP and the methods manuals for the National Coastal Condition Assessment listed below. Our agency/organization agrees to abide by their requirements for work performed under the National Coastal Condition Assessment. Please check the appropriate documents.

Quality Assurance Project Plan [ ]
Field Operations Manual [ ]
Site Evaluation Guidelines [ ]
Laboratory Methods Manual [ ]

Field Crew leads: I also certify that I attended an NCCA 2015 training and that all members of my crew have received training in NCCA protocols. [ ]
Print Name
Title
(Cooperator's Principal Investigator)
Organization
Signature Date
Field Crews: Please return the signed original to the Logistics Contractor. The Logistics Contractor will
ensure all parties have signed the QA forms, compile them, and submit them to the EPA Project QA
Coordinator. Send your forms to: Chris Turner, Great Lakes Environmental Center, Inc.; 739 Hastings
Street; Traverse City, MI 49686. cturner@glec.com
Labs and others: Please return the signed original to Kendra Forde who will ensure all parties have
signed the QA forms, compile them, and submit them to the EPA QA Coordinator. Send your forms to:
Kendra Forde, US EPA; 1200 Pennsylvania Ave, NW (4503T); Washington, DC 20460.
forde.kendra@epa.gov
Retain a copy for your files.
Notices
The National Coastal Condition Assessment (NCCA) 2015 Quality Assurance Project Plan (QAPP) and
related documents are based on the previous Environmental Monitoring and Assessment Program's
(EMAP) National Coastal Assessment (NCA) conducted in 2001-2004, as well as the National Coastal
Condition Assessment 2010.
The complete documentation of overall NCCA project management, design, methods, and standards is
contained in four companion documents:
National Coastal Condition Assessment: Quality Assurance Project Plan (EPA 841-R-14-005)
National Coastal Condition Assessment: Field Operations Manual (EPA 841-R-14-007)
National Coastal Condition Assessment: Laboratory Methods Manual (EPA 841-R-14-008)
National Coastal Condition Assessment: Site Evaluation Guidelines (EPA 841-R-14-006)
This document (QAPP) contains elements of the overall project management, data quality objectives,
measurement and data acquisition, and information management for the NCCA 2015. Methods
described in this document are to be used specifically in work relating to the NCCA 2015 and related
projects. All Project Cooperators should follow these guidelines. Mention of trade names or commercial
products in this document does not constitute endorsement or recommendation for use. More details
on specific methods for site evaluation, field sampling, and laboratory processing can be found in the
appropriate companion document(s).
The citation for this document is:
U.S. EPA. National Coastal Condition Assessment Quality Assurance Project Plan. United States
Environmental Protection Agency, Office of Water, Office of Wetlands, Oceans and Watersheds.
Washington, D.C. EPA 841-R-14-005. 2014.
Approval Page

Previous phased approvals cover the following aspects of the NCCA project as of the approved date:

Field activities as described in the relevant portions of the QAPP and the FOM
Toxicity analysis for freshwater as described in the relevant portion of the QAPP and Laboratory Methods Manual
Water chemistry analysis for brackish/marine water as described in the relevant portion of the QAPP and Laboratory Methods Manual
Sediment toxicity for marine waters as described in the relevant portion of the QAPP and Laboratory Methods Manual

Signatures and dates on file:

Acting NCCA Project Manager
Sarah Lehmann, Project Quality Assurance Coordinator
Sarah Lehmann, National Aquatic Resource Surveys (NARS) Team Leader
Susan Holdsworth, Chief, Monitoring Branch
Virginia Fox-Norse, OCPD Quality Assurance Coordinator
Margarete Heber, OWOW Quality Assurance Officer
Table of Contents
Notices 4
Approval Page 5
Table of Contents 6
Acronyms 15
NCCA Executive Summary 17
Distribution List 20
Project Planning and Management 22
1.1 Introduction 22
1.2 Scope of the Quality Assurance Project Plan 23
1.3 Project Organization 23
1.4 Project Design 28
1.5 Project Schedule 30
1.6 Overview of Field Operations 30
1.7 Overview of Laboratory Operations 34
1.8 Data Analysis 36
1.9 Peer Review 36
2.0 Data Quality Objectives 37
2.1 Data Quality Objectives for the National Coastal Condition Assessment 38
2.2 Measurement Quality Objectives 38
2.2.1 Method Detection Limits (Laboratory Reporting Level (Sensitivity)) 38
2.2.2 Sampling Precision and Bias 39
2.2.3 Sampling Accuracy 41
2.2.4 Taxonomic Precision and Accuracy 42
2.2.5 Completeness 43
2.2.6 Comparability 43
2.2.7 Representativeness 44
3 Site Selection Design 44
3.1 Probability Based Sampling Design and Site Selection 44
3.2 Survey Design for the Marine Waters 45
3.3 Survey Design for the Great Lakes 45
3.4 Revisit Sites 46
4.0 Information Management 46
4.1 Roles and Responsibilities 46
4.2 State-Based Data Management 49
4.3 Overview of System Structure 50
4.3.1 Data Flow Conceptual Model 51
4.3.2 Simplified Data Flow Description 51
4.4 Core Information Management Standards 53
4.5 Data Formats 53
Public Accessibility 54
4.6 Data Transfer Protocols 55
4.7 Data Quality and Results Validation 56
4.7.1 Data Entry, Scanned, or Transferred Data 56
4.7.2 Analytical Results Validation 57
4.7.3 Database Changes 58
4.8 Metadata 58
4.9 Information Management Operations 58
4.9.1 Computing Infrastructure 58
4.9.2 Data Security and Accessibility 58
4.9.3 Life Cycle 59
4.9.4 Data Recovery and Emergency Backup Procedures 59
4.9.5 Long-Term Data Accessibility and Archive 59
4.10 Records Management 60
5 Indicators 61
5.1 In Situ Measurements 63
5.1.1 Introduction 63
5.1.2 Sample Design and Methods 63
5.1.3 Pertinent Laboratory QA/QC Procedures 63
5.1.4 Pertinent Field QA/QC Procedures 63
5.1.5 Data Review 68
5.2 Water Chemistry Measurements (Including chlorophyll-a-) 68
5.2.1 Introduction 68
5.2.2 Sample Design and Methods 68
5.2.3 Pertinent Laboratory QA/QC Procedures 68
5.2.4 Pertinent Field QA/QC Procedures 74
5.2.5 Data Review 76
5.3 Microcystins 77
5.3.1 Introduction 77
5.3.2 Sample Design and Methods 77
5.3.3 Pertinent Laboratory QA/QC Procedures 77
5.3.4 Pertinent Field QA/QC Procedures 79
5.3.5 Data Review 80
5.4 Benthic Invertebrates 81
5.4.1 Introduction 81
5.4.2 Sample Design and Methods 81
5.4.3 Pertinent Laboratory QA/QC Procedures 81
5.4.4 Pertinent Field QA/QC Procedures 84
5.4.5 Data Review 85
5.5 Sediment Contaminants, Total Organic Carbon (TOC) and Grain Size 86
5.5.1 Introduction 86
5.5.2 Sample Design and Methods 86
5.5.3 Pertinent Laboratory QA/QC Procedures 86
5.5.4 Pertinent Field QA/QC Procedures 93
5.5.5 Data Review 94
5.6 Sediment Toxicity 94
5.6.1 Introduction 94
5.6.2 Sample Design and Methods 95
5.6.3 Pertinent Laboratory QA/QC Procedures 95
5.6.4 Pertinent Field QA/QC Procedures 98
5.6.5 Data Review 99
5.7 Fecal Indicator: Enterococci 99
5.7.1 Introduction 99
5.7.2 Sampling Design and Methods 99
5.7.3 Pertinent Laboratory QA/QC Procedures 99
5.7.4 Pertinent Field QA/QC Procedures 100
5.8 Whole Fish Tissue Samples for Ecological Analysis 101
5.8.1 Introduction 101
5.8.2 Sample Design and Methods 101
5.8.3 Pertinent Laboratory QA/QC Procedures 101
5.8.4 Pertinent Field QA/QC Procedures 107
5.8.5 Data Review 108
5.9 Fish Tissue Filets (Great Lakes) 108
5.9.1 Introduction 108
5.9.2 Sampling Design and Methods 109
5.9.3 Sampling and Analytical Methodologies 110
5.9.4 Pertinent Laboratory QA/QC Procedures 110
5.9.5 Pertinent Field QA/QC Procedures 110
5.9.6 Data Management, Review and Validation 111
5.10 Fish Tissue Plugs 112
5.10.1 Introduction 112
5.10.2 Sample Design and Methods 112
5.10.3 Pertinent Laboratory QA/QC Procedures 112
5.10.4 Pertinent Field QA/QC Procedures 113
5.10.5 Data Review 114
5.11 Algal Toxins, Research Indicator 115
5.11.1 Introduction 115
5.11.2 Sample Design and Methods 115
5.11.3 Pertinent Laboratory QA/QC Procedures 115
5.11.4 Pertinent Field QA/QC Procedures 118
5.11.5 Data Review 118
6 Field and Biological Quality Evaluation & Assistance 119
6.1 National Coastal Condition Assessment Field Quality Evaluation and Assistance
Visit Plan 119
6.1.1 Preparation Activities 120
6.1.2 Field Day Activities 120
6.1.3 Post Field Day Activities 121
6.1.4 Summary 121
6.2 National Coastal Condition Assessment Laboratory Quality Evaluation and
Assistance Visit Plan 122
6.2.1 Remote Evaluation/Technical Assessment 123
6.2.2 Water Chemistry Laboratories 124
6.2.3 Inter-laboratory Comparison 124
6.2.4 Assistance Visits 124
NCCA 2015 Document Request Form Chemistry Laboratories 125
NCCA 2015 Document Request Form Biology Labs 126
7 Data Analysis Plan 127
7.1 Data Interpretation Background 127
7.1.1 Scale of Assessment 127
7.1.2 Selecting Indicators 127
7.2 Datasets to be Used for the Report 128
7.3 Indicators for the Coastal Assessment 128
7.4 NCCR Index Development Approach 129
7.5 Calculation of Population Estimates 129
7.6 Relative Extent, Relative Risk and Attributable Risk Analysis 129
7.7 Other Change Analyses 129
7.8 Index Precision and Interpretation 129
8 References 130
Attachment A 135
List of Tables
Table 1. Summary of IM Responsibilities 47
Table 2. Description of NCCA 2015 Indicators and Location Where Indicators are Collected....61
Table 3. Measurement Data Quality Objectives: Water Indicators 64
Table 4. Field Quality Control: Multiparameter Meter Indicator 66
Table 5. Data Reporting Criteria: Field Measurements 67
Table 6. Data Validation Quality Control for In-Situ Indicator 68
Table 7. Measurement Data Quality Objectives: Water Chemistry Indicator and Chlorophyll-a.70
Table 8. Laboratory Quality Control Samples: Water Chemistry Indicator 71
Table 9. Data Reporting Criteria: Water Chemistry Indicator 74
Table 10. Sample Field Processing Quality Control Activities: Water Chemistry Indicator
(CHEM) 75
Table 11. Sample Field Processing Quality Control: Chlorophyll-a (CHLA) and Dissolved
Nutrient (NUTS) Indicators 75
Table 12. Data Validation Quality Control for Water Chemistry Indicator 76
Table 13. Measurement Quality Objectives for Microcystins 77
Table 14. Sample Analysis Quality Control Activities and Objectives for Microcystins 78
Table 15. Sample Receipt and Processing Quality Control: Microcystins Indicator 79
Table 16. Data Reporting Criteria: Microcystins Indicator 79
Table 17. Sample Field Processing Quality Control: Microcystins Indicator 80
Table 18. Data Validation Quality Control for Microcystins Indicator 80
Table 19. Benthic Macroinvertebrates: Measurement Data Quality Objectives 82
Table 20. Benthic Macroinvertebrates: Laboratory Quality Control 83
Table 21. Sample Receipt and Processing Quality Control: Benthic Invertebrate Indicator 83
Table 22. Sample Collection and Field Processing Quality Control: Benthic Invertebrate Indicator 85
Table 23. Data Validation Quality Control for Benthic Macroinvertebrates 85
Table 24. Sediment Contaminants, Grain size and TOC: Precision and Accuracy Objectives...87
Table 25. Sediment Contaminants, Grain Size, and TOC: Analytical Methods 87
Table 26. Sediment Contaminants, Grain Size, and TOC: Required Parameters 88
Table 27. Sediment Chemistry, Grain Size, and TOC: Quality Control Activities for Samples ...90
Table 28. Data Reporting Criteria: Sediment Contaminants, TOC and Grain Size Indicators ....92
Table 29. Sample Collection and Field Processing Quality Control: Sediment Contaminant Indicator 93
Table 30. Sample Collection and Field Processing Quality Control: Sediment TOC and Grain Size Indicator 94
Table 31. Data Validation Quality Control for Sediment Contaminants, TOC and Grain Size Indicators 94
Table 32. Quality Control Activities for Sediment Toxicity Samples 96
Table 33. Data Reporting Review Criteria: Sediment Toxicity 97
Table 34. Sample Collection and Field Processing Quality Control: Sediment Toxicity Indicator 98
Table 35. Data Validation Quality Control: Sediment Toxicity 99
Table 36. Data Validation Quality Control: Fecal Indicator 100
Table 37. Sample Collection and Field Processing Quality Control: Fecal Indicator 100
Table 38. Whole Fish Tissue: Precision and Accuracy Objectives 102
Table 39. Whole Body Fish: Required Contaminants 102
Table 40. Whole Body Fish: Quality Control Activities 104
Table 41. Data Reporting Criteria: Eco-Fish Tissue Chemistry 106
Table 42. Method Quality Objectives for Field Measurement for Eco-Fish Indicator 107
Table 43. Field Quality Control: Whole Fish Tissue Samples for Ecological Analysis 107
Table 44. Data Validation Quality Control: Eco-Fish 108
Table 45. Data Validation Quality Control: Eco-Fish Tissue Indicator 108
Table 46. Recommended Target Species: Whole Fish Tissue Collection 109
Table 47. Field Data Types: Whole Fish Tissue Samples for Fillet Analysis 110
Table 48. Field Quality Control: Whole Fish Tissue Samples for Fillet Analysis 111
Table 49. Data Validation Quality Control: Whole Fish Tissue Samples for Fillet Analysis 111
Table 50. Measurement Data Quality Objectives for Mercury in Fish Tissue Plugs 112
Table 51. Quality Control for Mercury in Fish Tissue Plugs 112
Table 52. Data Reporting Criteria: Fish Tissue Plugs 113
Table 53. Method Quality Objectives for Field Measurement for the Fish Tissue Plug Indicator 114
Table 54. Field Quality Control: Fish Tissue Plug 114
Table 55. Data Validation Quality Control: Fish Tissue Plugs 115
Table 56. Measurement Quality Objectives for Algal Toxin Research Indicator 116
Table 57. Sample Analysis Quality Control Activities and Objectives for Algal Toxins 116
Table 58. Sample Receipt and Processing Quality Control: Algal Toxin Research Indicator ...117
Table 59. Data Reporting Criteria: Algal Toxin Research Indicator 117
Table 60. Sample Field Processing Quality Control: Algal Toxin Research Indicator 118
Table 61. Data Validation Quality Control for Algal Toxin Research Indicator 119
Table 62. Equipment and Supplies - Field Evaluation and Assistance Visits 120
Table 63. Summary of Field Evaluation and Assistance Visit Information 121
List of Figures
Figure 1. NCCA Project Organization and Flow 27
Figure 2. NCCA Marine Base Sites 28
Figure 3. NCCA Great Lakes Coastal Base Sites 29
Figure 4. Schedule for the NCCA 2015 30
Figure 5. Site Evaluation Diagram 33
Figure 6. Organization of the National Aquatic Resource Surveys Information Management
System (NARSIMS) for the NCCA 51
Figure 7. Conceptual Model of Data Flow into and out of the Master SQL Database for the
NCCA 52
Figure 8. Field Measurement Process for Water Chemistry Samples 65
Figure 9. Analysis Activities for Water Chemistry Samples 69
Acronyms
APHA American Public Health Association
ASCII American Standard Code for Information Interchange
CAS Chemical Abstracts Service
CRM Certified Reference Material
CSDGM Content Standards for Digital Geospatial Metadata
CV Coefficient of Variation
DDT dichlorodiphenyltrichloroethane
DO Dissolved Oxygen
DQOs Data Quality Objectives
EMAP Environmental Monitoring and Assessment Program
FGDC Federal Geographic Data Committee
FOIA Freedom of Information Act
GC Gas Chromatograph
GED Gulf Ecology Division
GLEC Great Lakes Environmental Center, Inc.
GPS Global Positioning System
GRTS Generalized Random Tessellation Stratified
ICP Inductively Coupled Plasma
IDL Instrument Detection Limit
IM Information Management
ITIS Integrated Taxonomic Information System
LDR Linear Dynamic Range
LRL Laboratory Reporting Level
LT-MDL Long-term Method Detection Limit
MDLs Method Detection Limits
MQOs Measurement Quality Objectives
NARSIMS National Aquatic Resource Surveys Information Management System
NARS National Aquatic Resource Surveys
NCA National Coastal Assessment (past surveys)
NCCA National Coastal Condition Assessment (current survey)
NCCRs National Coastal Condition Reports
NELAC National Environmental Laboratory Accreditation Conference
NEP National Estuary Programs
NERL U.S. EPA New England Regional Laboratory
NHD National Hydrography Dataset
NHEERL National Health and Environmental Effects Research Laboratory
NIST National Institute of Standards and Technology
NOAA National Oceanic and Atmospheric Administration
NRCC National Research Council of Canada
NWQL National Water Quality Laboratory
OARM Office of Administration and Resources Management
OCPD Oceans and Coastal Protection Division
ORD Office of Research and Development
OST Office of Science and Technology
OW Office of Water
OWOW Office of Wetlands, Oceans and Watersheds
PAHs Polycyclic Aromatic Hydrocarbons
PAR Photosynthetically Active Radiation
PBDE Polybrominated Diphenyl Ethers
PCBs Polychlorinated Biphenyls
PE Performance Evaluation
PFC Perfluorinated compound
PPT parts per thousand
PSU Practical Salinity Unit
PTD Percent Taxonomic Disagreement
PTL Phosphorus, total
QAPP Quality Assurance Project Plan
QA/QC Quality Assurance/Quality Control
qPCR quantitative Polymerase Chain Reaction
R-EMAP Regional Environmental Monitoring and Assessment Program
RSD Relative Standard Deviation
SAS Statistical Analysis System
SDTS Spatial Data Transfer Standard
SQL Structured Query Language
SRM Standard Reference Material
STORET Storage and Retrieval Data Warehouse
SWIMS Surface Water Information Management System
TKN Total Kjeldahl Nitrogen
TOC Total Organic Carbon
TSA Technical Systems Audits
US EPA United States Environmental Protection Agency
USGS United States Geological Survey
WED Western Ecology Division
WQX Water Quality Exchange
NCCA Executive Summary
Background
Several recent reports have identified the need for improved water quality monitoring and analysis at
multiple scales. In response, the U.S. EPA Office of Water, in partnership with EPA's Office of Research
and Development (ORD), EPA regional offices, states, tribes and other partners, has begun a program to
assess the condition of the nation's waters using a statistically valid design approach. Often referred to
as probability-based surveys, these assessments, known as the National Aquatic Resource Surveys,
report on core indicators of water condition using standardized field and lab methods and rely on
integrated information management plans, such as the one described in this Quality Assurance Project
Plan, to ensure confidence in the results at national and ecoregional scales.
The NCCA, which builds upon previous National Coastal Assessments led by ORD and the National
Coastal Condition Assessment 2010, aims to address three key questions about the quality of the
Nation's coastal waters:
What percent of the Nation's coastal waters are in good, fair, and poor condition for key
indicators of water quality, ecological health, and recreation?
What is the relative extent of key stressors such as nutrients and pathogens?
How are conditions in coastal waters changing over time?
The NARS are also designed to help expand and enhance state monitoring programs. Through these
surveys, states and tribes have the opportunity to collect data which can be used to supplement their
existing monitoring programs or to begin development of new programs.
NCCA Project Organization
Overall project coordination is conducted by EPA's Office of Water (OW) in Washington, DC, with
technical support from EPA's ORD. Each of the coastal EPA Regional Offices has identified regional
coordinators to assist in implementing the survey and coordinate with the state crews who collect the
water and sediment samples following NCCA protocols. As in 2010, the Office of Science and Technology
(OST) within OW is conducting the human health fish tissue study in the Great Lakes in partnership with
the Great Lakes National Program Office. The Great Lakes National Program Office and ORD in Duluth
are again conducting an intensification survey within embayments of the Great Lakes.
Quality Assurance Project Plan
The purpose of this QAPP is to document the project data quality objectives and quality
assurance/quality control measures that will be implemented in order to ensure that the data collected
meets those needs. The plan contains elements of the overall project management, data quality
objectives, measurement and data acquisition, and information management for the NCCA.
Information Management Plan
Environmental monitoring efforts that amass large quantities of information from various sources
present unique data management challenges. To meet these challenges, the NCCA employs a variety of
well-tested information management (IM) strategies to ensure the functional organization and integrity
of stored electronic data. IM is integral to all aspects of the NCCA
from initial selection of sampling sites through the dissemination and reporting of final, validated data.
A technical workgroup convened by the EPA Project Leader is responsible for development of a data
analysis plan that includes a verification and validation strategy. General processes are summarized in
the indicator-specific sections of this QAPP. Validated data are transferred to the central database
managed by EMAP information management support staff located at the Western Ecology Division
facilities in Corvallis. This database is known as the National Aquatic Resource Surveys Information
Management (NARS IM) system. All validated measurement and indicator data from the NCCA are
eventually transferred to EPA's Water Quality Exchange (WQX) for storage in EPA's STORET warehouse
for public accessibility. NCCA IM staff provides support and guidance to all program operations in
addition to maintaining NARS IM.
Overview of NCCA Design
The NCCA is designed to be completed during the index period of June through the end of September
2015. EPA used an unequal probability design to select 684 marine sites along the coasts of the
continental United States and 225 freshwater sites from the shores of the Great Lakes. Fifty sites were
drawn for Hawaii. For the NCCA, crews will revisit 66 of the marine sites and 25 of the freshwater sites
during the 2015 sampling index period. To improve our ability to assess embayments as well as
shorelines in the Great Lakes, EPA added 150 randomly selected sites in bays and embayments across all
five Great Lakes. Related sampling will also occur on reef flats (coastal areas) of American Samoa, Guam
and the Northern Mariana Islands during the 2015 field season. Additionally, EPA is conducting a pilot
study sampling 50 sites plus 5 revisits within the Huron-Erie Connecting Channel Corridor; EPA will
sample these sites first in 2014 and then again in 2015. EPA included oversample sites for each of these
components that must be used when a "base" site cannot be sampled for any reason. More information
can be found in the site evaluation guidelines.
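The idea behind unequal probability selection can be illustrated with a short sketch. This is not the actual NCCA procedure: the survey uses a Generalized Random Tessellation Stratified (GRTS) design, which also enforces spatial balance. The toy example below, with made-up site names and weights, shows only the simpler underlying notion that each candidate site's chance of being drawn is proportional to a weight, and that extra draws can serve as oversample sites.

```python
# Illustrative only: unequal-probability selection without replacement.
# The real NCCA design (GRTS) additionally guarantees spatial balance.
import random

# Hypothetical candidate sites with inclusion weights (made-up values),
# e.g., weight proportional to the coastal resource each site represents.
sites = {"site_A": 5.0, "site_B": 1.0, "site_C": 3.0, "site_D": 1.0}

def draw_sites(candidates, n):
    """Draw n distinct sites; each pick is proportional to its weight."""
    pool = dict(candidates)
    chosen = []
    for _ in range(n):
        names = list(pool)
        weights = [pool[s] for s in names]
        pick = random.choices(names, weights=weights, k=1)[0]
        chosen.append(pick)
        del pool[pick]  # remove so sampling is without replacement
    return chosen

# "Base" sites, then an oversample draw from the remaining candidates,
# used only if a base site cannot be sampled.
base = draw_sites(sites, 2)
oversample = draw_sites({s: w for s, w in sites.items() if s not in base}, 1)
print(base, oversample)
```

Because `site_A` carries the largest weight, it appears in the base draw far more often than the others over repeated runs, which is the behavior an unequal probability design is meant to produce.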
Overview of Field Operations
Field data acquisition activities are implemented in a consistent manner across the entire country. Each
site is given a unique ID which identifies it throughout the pre-field, field, lab, analysis, and data
management phases of the project. Specific procedures for evaluating each sampling location and for
replacing non-sampleable sites are documented in NCCA 2015: Site Evaluation Guidelines.
NCCA indicators include nutrients, light attenuation, sediment chemistry, sediment toxicity, benthic
communities, fish tissue, microcystins and pathogens. Field measurements and samples are collected by
trained teams. The field team leaders must be trained at an EPA-sponsored training session. Field
sampling audits or evaluation visits will be completed for each field team.
Overview of Laboratory Operations
NCCA laboratory analyses are conducted either by state-selected labs or "National Laboratories" set up
by EPA to conduct analyses for any state which so elects. All laboratories must comply with the QA/QC
requirements described in this document. Any laboratory selected to conduct analyses with NCCA
samples must demonstrate that they can meet the quality standards presented in this QAPP and the
NCCA 2015: Laboratory Methods Manual and NCCA 2015: Field Operations Manual.
Peer Review
Surveys undergo a thorough peer review process, where the scientific community and the public are
given the opportunity to provide comments. Cooperators have been actively involved in the
development of the overall project management, design, indicator selection and method
selection/refinements.
The EPA utilizes a three-tiered approach for peer review of the Survey: (1) internal and external review
by EPA, states, other cooperators and partners; (2) external scientific peer review; and (3) public review.
Outside scientific experts from universities, research centers, and other federal agencies have been
instrumental in indicator development and will continue to play an important role in data analysis.
Distribution List
This Quality Assurance Project Plan (QAPP) and associated manuals or guidelines will be distributed
to the following EPA and contractor staff participating in the NCCA and to State Water Quality Agencies
or cooperators who will perform the field sampling operations. The NCCA Project Quality Assurance
(QA) Coordinator will distribute the QA Project Plan and associated documents to participating project
staff at their respective facilities and to the project contacts at participating states, EPA offices,
laboratories and any others, as they are determined.
Hugh Sullivan, Acting NCCA Project Leader
U.S. EPA Office of Water, Office of Wetlands, Oceans, and Watersheds, Washington, DC
sullivan.hugh@epa.gov; 202-564-1763

Sarah Lehmann, NCCA Project QA Coordinator
U.S. EPA Office of Water, Office of Wetlands, Oceans, and Watersheds, Washington, DC
lehmann.sarah@epa.gov; 202-566-1379

Margarete Heber, OWOW Quality Assurance Officer
U.S. EPA Office of Water, Office of Wetlands, Oceans, and Watersheds, Washington, DC
heber.margarete@epa.gov; 202-566-1189

Virginia Fox-Norse, OCPD Quality Assurance Coordinator
U.S. EPA Office of Water, Office of Wetlands, Oceans, and Watersheds, Washington, DC
fox-norse.virginia@epa.gov; 202-566-1266

Steven G. Paulsen, EPA ORD Technical Advisor
U.S. EPA, ORD, Western Ecology Division, Corvallis, OR
paulsen.steve@epa.gov; 541-754-4428

Sarah Lehmann, NARS Team Leader
U.S. EPA Office of Water, Office of Wetlands, Oceans, and Watersheds, Washington, DC
lehmann.sarah@epa.gov; 202-566-1379

Colleen Mason, NCCA Logistics Coordinator
U.S. EPA Office of Water, Office of Wetlands, Oceans, and Watersheds, Washington, DC
mason.colleen@epa.gov; 202-343-9641

Marlys Cappaert, SRA International Inc., NARS Information Management Coordinator
Computer Science Corporation, Corvallis, OR 9733
cappaert.marlys@epa.gov; 541-754-4467; 541-754-4799 (fax)

Chris Turner, Contract Logistics Coordinator
Great Lakes Environmental Center, Traverse City, MI
cturner@glec.com; 715-829-3737

Leanne Stahl, OST Fish Tissue Coordinator
U.S. EPA Office of Water, Office of Science and Technology, Washington, DC
stahl.leanne@epa.gov; 202-566-0404

Bill Kramer, OST Fish Tissue QA Coordinator
U.S. EPA Office of Water, Office of Science and Technology, Washington, DC
kramer.bill@epa.gov; 202-566-0385

David Bolgrien, Great Lakes Embayment Enhancement Coordinator
U.S. EPA, ORD, Mid-Continent Ecology Division, Duluth, MN
bolgrien.david@epa.gov; 218-529-5216

Tom Faber, Region 1
U.S. EPA Region 1, North Chelmsford, MA
faber.tom@epa.gov; 617-918-8672
Darvene Adams, Region 2
U.S. EPA Region 2, Edison, NJ
adams.darvene@epa.gov; 732-321-6700

Bill Richardson, Region 3
U.S. EPA Region 3, Philadelphia, PA
richardson.william@epa.gov; 215-814-5675

David Melgaard, Region 4
U.S. EPA Region 4, Atlanta, GA
melgaard.david@epa.gov; 404-562-9265

Mari Nord, Region 5
U.S. EPA Region 5, Chicago, IL
nord.mari@epa.gov; 312-353-3017

Mike Schaub, Region 6
U.S. EPA Region 6, Dallas, TX
schaub.mike@epa.gov; 214-665-7314

Terry Fleming, Region 9
U.S. EPA Region 9, San Francisco, CA
fleming.terrence@epa.gov; 415-972-3452

Gretchen Hayslip, Region 10
U.S. EPA Region 10, Seattle, WA
hayslip.gretchen@epa.gov; 206-553-1685
PROJECT PLANNING AND MANAGEMENT
1.1 Introduction
Several recent reports have identified the need for improved water quality monitoring and analysis at
multiple scales. In 2000, the General Accounting Office (USGAO 2000) reported that EPA, states, and
tribes collectively cannot make statistically valid inferences about water quality (via 305[b] reporting)
and lack data to support key management decisions. In 2001, the National Research Council (NRC 2000)
recommended that EPA, states, and tribes promote a uniform, consistent approach to ambient
monitoring and data collection to support core water quality programs. In 2002, the H. John Heinz III
Center for Science, Economics, and the Environment (Heinz Center 2002) found that data were
inadequate for national reporting on fresh water, coastal and ocean water quality indicators. The
National Academy of Public Administration (NAPA 2002) stated that improved water quality monitoring
is necessary to help states and tribes make more effective use of limited resources. EPA's Report on the
Environment 2003 (USEPA 2003) said that there is not sufficient information to provide a national
answer, with confidence and scientific credibility, to the question, "What is the condition of U.S. waters
and watersheds?"
In response to this need, the Office of Water (OW), in partnership with states and tribes, initiated a
program to assess the condition of the nation's waters via a statistically valid approach. The current
assessment, the National Coastal Condition Assessment 2015 (referred to as NCCA 2015 throughout this
document), builds upon the National Coastal Condition Assessment 2010 and the original National
Coastal Assessments implemented by EPA's Office of Research and Development, state and other
partners. It also builds on the other National Aquatic Resource Surveys (NARS), such as the National
Lakes Assessment (NLA), the National Rivers and Streams Assessment (NRSA), and the National Wetland
Condition Assessment (NWCA). The NCCA 2015 effort will provide important information to states and
the public about the condition of the nation's coastal waters and key stressors on a national and
regional scale. It will also provide an assessment of trends across four time periods: 2000-2001,
2005-2006, 2010, and 2015.
EPA developed this QAPP to support project participants and to ensure that the final assessment is
based on data of high and known quality for their intended use. The QAPP contains elements of the
overall project management, data quality objectives, measurement and data acquisition, and
information management for NCCA 2015. EPA recognizes that states and tribes may add elements to the
survey, such as supplemental indicators, that are not covered in the scope of this integrated QAPP.
EPA requires that any supplemental elements be addressed by the states, tribes, or their designees
in a separate approved QAPP. This document covers all core NCCA QA activities. The
NCCA 2015 participants have agreed to follow this QAPP and the protocols and design laid out in this
document, and its associated documents - the NCCA 2015 Field Operations Manual (FOM), Lab
Operations Manual (LOM), and Site Evaluation Guidelines (SEG).
This cooperative effort between states, tribes, and federal agencies makes it possible to produce a
broad-scale assessment of the condition of the Nation's coastal waters with both a known confidence
and scientific credibility. Through this survey, states and tribes have the opportunity to collect data that
can be used to supplement their existing monitoring programs or to begin development of new
programs.
The NCCA 2015 has three main objectives:
Estimate the current status, trends, and changes in selected trophic, ecological, and recreational
indicators of the condition of the nation's coastal waters with known statistical confidence;
Identify the relative importance of key stressors; and
Assess changes and trends from the earlier National Coastal Assessments and the NCCA 2010.
Indicators for the 2015 survey will remain largely the same as those used in past surveys, with a
few modifications. This continuity is critical so that EPA and partners can track not only condition but
also changes over time in the quality of coastal water resources. Modifications include expanding the
area in which crews can collect fish and sediment to reduce the amount of missing data. Additionally,
for NCCA 2015 EPA and its partners added indicators related to human health and recreational concerns,
including an ELISA microcystin analysis, analysis of mercury in fish tissue filets, and a broader suite
of algal toxins as a research indicator.
Other EPA programs are conducting special studies under the NCCA in the Great Lakes. The Office of
Science and Technology (OST) within OW is conducting a human health fish tissue study in the Great
Lakes in partnership with the Great Lakes National Program Office. A brief description of the study is
provided in Section 5.5.1. ORD's National Health and Environmental Effects Research Laboratory in
Duluth, MN is conducting an enhanced assessment of Great Lakes embayments. This study adds additional
sites to the overall selection of sites within the Great Lakes but otherwise follows the procedures
outlined in this QAPP and the other NCCA documents. See Section 1.4 on the project design for more
information. Additionally, ORD's National Health and Environmental Effects Research Laboratory in
Duluth, MN and the Great Lakes National Program Office are implementing a special study in the Lake
Huron-Erie Connecting Channel Corridor using the same protocols that are used for the NCCA, although
these sites are outside of the NCCA target population.
1.2 Scope of the Quality Assurance Project Plan
This QAPP addresses the data acquisition efforts of NCCA, which focuses on the 2015 sampling of coasts
across the United States. Data from approximately 909 coastal sites (selected with a probability design)
located in the contiguous coastal marine and Great Lakes states and 45 sites along the Hawaiian
shoreline will provide a comprehensive assessment of the Nation's coastal waters. Additionally, EPA is
conducting special studies as described above. Companion documents to this QAPP that are relevant to
the overall project include:
National Coastal Condition Assessment: Field Operations Manual (EPA 841-R-14-007)
National Coastal Condition Assessment: Laboratory Methods Manual (EPA 841-R-14-008)
National Coastal Condition Assessment: Site Evaluation Guidelines (EPA 841-R-14-006)
1.3 Project Organization
The responsibilities and accountability of the various principals and cooperators are described here and
illustrated in Figure 1. Overall, the project is coordinated by the Office of Water (OW) in Washington, DC,
with support from the EPA Western Ecology Division (WED), the EPA Gulf Ecology Division (GED), and the
EPA Atlantic Ecology Division (AED). Each EPA Regional Office has identified a Regional EPA Coordinator
who is part of the EPA team providing a critical link with state and tribal partners. Cooperators will work
with their Regional EPA Coordinator to address any technical issues. A comprehensive quality assurance
(QA) program has been established to ensure data integrity and provide support for the reliable
interpretation of the findings from this project.
Contractor support is provided for all aspects of this project. Contractors will provide support ranging
from implementing the survey, sampling, and laboratory processing to data management, data analysis,
and report writing. Cooperators will interact with their Regional EPA Coordinator and the EPA Project
Leader regarding contractual services.
The primary responsibilities of the principals and cooperators are as follows:
Acting Project Leader: Hugh Sullivan, EPA Office of Water
* Provides overall coordination of the project and makes decisions regarding the proper functioning of
all aspects of the project.
* Makes assignments and delegates authority, as needed, to other parts of the project organization.
* Leads the NCCA Steering Committee and establishes needed technical workgroups.
* Interacts with the EPA Project Team on technical, logistical, and organizational issues on a regular basis.
EPA Field Logistics Coordinator: Colleen Mason, EPA Office of Water
* EPA employee who supports implementation of the project based on technical guidance established by
the EPA Project Leader and serves as point-of-contact for questions from field crews and cooperators
for all activities.
* Tracks progress of field sampling activities.
EPA Project QA Coordinator: Sarah Lehmann, EPA Office of Water
* Provides leadership, development, and oversight of project-level quality assurance for NARS.
* Assembles and provides leadership for a NCCA 2015 Quality Team.
* Maintains official, approved QAPP.
* Maintains all training materials and documentation.
* Maintains all laboratory accreditation files.
EPA Technical Advisor: Steven Paulsen, EPA Office of Research and Development
* Advises the Project Leader on the relevant experiences and technology developed within the Office
of Research and Development (ORD) that may be used in this project.
* Facilitates consultations between NCCA personnel and ORD scientists.
Laboratory Review Coordinator: Kendra Forde, EPA Office of Water
* Ensures participating laboratories complete sample analysis following the LOM.
* Ensures participating laboratories follow QA activities.
* Ensures data are submitted within the specified timelines.
* Coordinates activities of individual lab Task Order Project Officers to ensure methods are followed
and QA activities take place.
QA Assistance Visit Coordinator: Colleen Mason, EPA Office of Water
* The EPA employee who supervises the implementation of the QA audit program.
* Directs the field and laboratory audits and ensures the field and lab auditors are adequately trained
to correct errors immediately, to avoid erroneous data and the eventual discarding of information
from the assessment.
Human Health Fish Tissue Indicator Lead: Leanne Stahl, EPA Office of Water
* The EPA employee who coordinates implementation of the human health fish tissue effort on the
Great Lakes.
* Interacts with the EPA Project Leads, EPA Regional Coordinators, contractors, and cooperators to
provide information and respond to questions related to the human health fish tissue indicator.
* Responsible for the lab analysis phase of the project.
Great Lakes Embayment Enhancement Coordinator: Dave Bolgrien, EPA Office of Research
and Development
* The EPA employee who coordinates the embayment enhancement component of the Great Lakes NCCA.
* Interacts with the EPA Project Leads, EPA Regional Coordinators, contractors, and cooperators to
provide information and respond to questions related to the embayment enhancement effort.
Information Management Coordinator: Marlys Cappaert, SRA International, Inc.
* A contractor who supports implementation of the project based on technical guidance established by
the EPA Project Leader and Alternate EPA Project Leader.
* Under the scope of the contract, oversees the NARS Information Management team.
* Oversees all sample shipments and receives data forms from the Cooperators.
* Oversees all aspects of data entry and data management for the project.
EPA QA Officer: Margarete Heber, EPA Office of Water
* Functions as an independent officer overseeing all quality assurance (QA) and quality control (QC)
activities.
* Responsible for ensuring that the QA program is implemented thoroughly and adequately to document
the performance of all activities.
OCPD QA Coordinator: Virginia Fox-Norse, EPA Office of Water
* Functions as an independent coordinator reviewing all quality assurance (QA) and quality control
(QC) activities.
Regional EPA Coordinators
* Assists the EPA Project Leader with regional coordination activities.
* Serves on the Technical Experts Workgroup and interacts with the Project Facilitator on technical,
logistical, and organizational issues on a regular basis.
* Serves as primary point-of-contact for the Cooperators.
Steering Committee (Technical Experts Workgroup): States, EPA, academics, other federal
agencies
* Provides expert consultation on key technical issues as identified by the EPA coordination team and
works with the Project Facilitator to resolve approaches and strategies to enable data analysis and
interpretation to be scientifically valid.
Cooperator(s): States, Tribes, USGS, others
* Under the scope of their assistance agreements, plans and executes their individual studies as part
of the cross-jurisdictional NCCA 2015 and adheres to all QA requirements and standard operating
procedures (SOPs).
* Interacts with the Grant Coordinator, Project Facilitator, and EPA Project Leader regarding technical,
logistical, and organizational issues.
Field Sampling Crew Leaders
* Functions as the senior member of each Cooperator's field sampling crew and the point of contact
for the Field Logistics Coordinator.
* Responsible for overseeing all activities of the field sampling crew and ensuring that the Project field
method protocols are followed during all sampling activities.
National Laboratory Task Order Managers: EPA Office of Water
* EPA staff responsible for managing activities of the national contract laboratories.
* Provide direction to national and State labs on methods, timelines, and QA activities to ensure all
actions are followed.
* Provide updates to the EPA Laboratory Review Coordinator, the EPA QA Project Lead, and the Project
Leader on the sample processing status of labs and any questions or concerns raised by participating
labs in regards to timelines and deliverables.
Field Logistics Coordinator: Chris Turner, GLEC
* A contractor who functions to support implementation of the project based on technical guidance
established by the EPA Field Logistics Coordinator and the Project Leader.
* Serves as point-of-contact for questions from field crews and cooperators for all activities.
* Tracks progress of field sampling activities.
Project Management: Acting Project Lead: Hugh Sullivan, EPA-OW; Project QA Coordinator: Mari Nord, EPA-OW; Technical Advisor: Steve Paulsen, EPA-ORD
OWOW QA Oversight/Review: Margarete Heber
Study Design: Tony Olsen, EPA-ORD
Field Logistics Implementation Coordinator
Training: EPA OW, ORD and Regions, Contractors
Field Implementation: State/Tribal Water Quality Agencies, EPA ORD/Regions, Contractors
Select Indicator Leads: HH Fish Tissue - Leanne Stahl, EPA-OW; Pathogens - Rich Haugland, EPA-ORD; GL Enhancements - David Bolgrien, EPA-ORD
Field Protocols/Indicator Selection: EPA OW and ORD, State/Tribal Steering Committee
Sample Flow: Phytoplankton Lab (GL only); Human Health Lab (subset of GL sites only)
NCCA Project and Quality Team
Information Management: WED/SRA - Marlys Cappaert
Final Data: STORET/WQX - OW
Assessment: EPA-OW Lead; EPA ORD and Regions, States, Tribes, Cooperators and other partners
Figure 1. NCCA Project Organization and Flow
1.4 Project Design
The NCCA 2015 is designed to be completed during the index period of June through the end of
September 2015. Field crews will collect a variety of measurements and samples from predetermined
sampling locations (located with an assigned set of coordinates).
With input from the states and other partners, EPA used an unequal probability design to select 684
marine sites along the coasts of the continental United States and 225 freshwater sites along the shores
of the Great Lakes. Fifty sites were drawn for Hawaii. See maps of coastal sites in Figure 2 and Figure 3.
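The mechanics of an unequal probability draw can be illustrated with a short sketch. This is not EPA's actual design algorithm (NARS surveys use spatially balanced GRTS designs built in specialized survey software); it only shows how sites with larger weights end up selected more often, using hypothetical site IDs and weights:

```python
import random

def weighted_sample(frame, n, seed=1):
    """Draw n sites without replacement, with inclusion probability
    proportional to each site's weight (Efraimidis-Spirakis keys).
    Illustrative only; NOT the spatially balanced GRTS algorithm."""
    rng = random.Random(seed)
    # Larger weights produce keys closer to 1, so heavy sites rank first.
    keyed = sorted(frame, key=lambda s: rng.random() ** (1.0 / s["wt"]),
                   reverse=True)
    return keyed[:n]

# Hypothetical frame: candidate points, larger systems given larger weights.
frame = [{"site": f"NCCA15-{i:04d}", "wt": w}
         for i, w in enumerate([5.0, 1.0, 1.0, 3.0, 1.0, 2.0, 1.0, 1.0])]
chosen = weighted_sample(frame, 3)
```

Averaged over many draws, the site with weight 5.0 is selected far more often than a weight-1.0 site, which is the behavior the design weights later correct for when computing population estimates.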
To improve our ability to assess embayments as well as shorelines in the Great Lakes, EPA added 150
randomly selected sites in bays and embayments across all five Great Lakes (sites not included in the maps
below). This intensification constitutes the Great Lakes Embayment Enhancement. Additionally, EPA will
conduct a pilot study in the Huron-Erie Connecting Channel Corridor using the NCCA QAPP and related
documents (although these sites are not part of the NCCA 2015 target population). See Attachment A for
a map of the sites and study area.
Additional sites were also identified for Puerto Rico and Alaska to provide an equivalent design for these
coastal areas if these states and territories choose to sample them. Additionally, related sampling will
occur on reef flats (coastal areas) of American Samoa, Guam, and the Northern Mariana Islands during the
2015 field season (not included on the maps below).
[Figure 2: map of NCCA marine coastal sites]
Figure 3. NCCA Great Lakes Coastal Base Sites
1.5 Project Schedule
Training and field sampling will be conducted in spring/early summer of 2015. Sample processing and
data analysis will be completed by 2016 to support a published report in 2017. Figure 4 gives an
overview of the major tasks leading up to the final report.
[Figure 4: project schedule bar chart, 2013-2017, spanning survey planning, pilot studies, and indicator
selection (research); design frame and site selection (design); implementation manuals, field training,
and the sampling season (field); sample processing and data analysis (lab/data); and draft report, peer
review, and final report (report)]
Figure 4. Schedule for the NCCA 2015
1.6 Overview of Field Operations
Field data acquisition activities for the NCCA are implemented based on guidance developed by EMAP.
Funding for states and tribes to conduct field data collection activities is provided by EPA under
Section 106 of the Clean Water Act. Survey preparation is initiated with selection of the sampling
locations by the Design Team (ORD in Corvallis). The Design Team gives each site a unique ID which
identifies it throughout the pre-field, field, lab, analysis, and data management phases of the project.
The Project Lead distributes the list of sampling locations to the EPA Regional Coordinators, states, and
tribes. With the sampling location list, state and tribal field crews can begin site reconnaissance on the
primary sites and alternate replacement sites and begin work on obtaining access permission to each
site. EPA provides specific procedures for evaluating each sampling location and for replacing non-
sampleable sites in NCCA: Site Evaluation Guidelines. Each crew is responsible for procuring, as needed,
scientific collecting permits from State/Tribal and Federal agencies. The field teams will use standard
field equipment and supplies as identified in the Equipment and Supplies List (Appendix A of the Field
Operations Manual). Field crews will work with Field Logistics Coordinators to coordinate equipment
and supply requests. This helps to ensure comparability of protocols across all crews. EPA has
documented detailed lists of equipment required for each field protocol, as well as guidance on
equipment inspection and maintenance, in the Field Operations Manual.
Field measurements and samples are collected by trained teams/crews. The field crew leaders must be
trained at an EPA-sponsored training session. Ideally, all members of each field crew should attend one
EPA-sponsored training session before the field season. The training program stresses hands-on practice
of methods, consistency among crews, collection of high quality data and samples, and safety. Training
documentation will be maintained by the Project QA Coordinator. Field crew leaders will maintain
records indicating that members of their team who did not attend an EPA training were properly
trained to follow the NCCA protocols. Field crew leaders will provide EPA with this documentation if
requested by the NCCA Project Leader or QA Coordinator. EPA or other designated personnel (e.g.,
contractors) will conduct field sampling assistance visits for each field crew early in the sampling season.
For each site, crews prepare a dossier that contains the following applicable information: road maps,
copies of written access permissions to boat launches, scientific collection permits, coordinates of the
coastal site, information brochures on the program for interested parties, and local area emergency
numbers. Whenever possible, field crew leaders attempt to contact owners of private marinas or boat
launches (as appropriate) approximately two days before the planned sampling date. As the design
requires repeat visits to select sampling locations, it is important for the field crews to do everything
possible to maintain good relationships with launch owners. This includes prior contacts, respect of
special requests, closing gates, minimal site disturbance, and removal of all materials, including trash,
associated with the sampling visit.
The site verification process is shown in Figure 5. Upon arrival at a site, crews verify the location by a
Global Positioning System (GPS) receiver, landmark references, and/or local residents. Crews collect
samples and measurements for various parameters in a specified order (See the Field Operations
Manual). This order has been set up to minimize the impact of sampling for one parameter upon
subsequent parameters. All methods are fully documented in step-by-step procedures in the NCCA Field
Operations Manual. The manual also contains detailed instructions for completing documentation,
labeling samples, any field processing requirements, and sample storage and shipping. Field
communications will be through the Field Logistics Coordinator and may involve regularly scheduled
conference calls or contacts.
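The GPS-based location check above is essentially a great-circle distance test against the design coordinates. A minimal sketch, assuming a haversine distance and the relocation limit shown in the site evaluation diagram (the function names and the 0.02 nm default are illustrative, not from the FOM):

```python
import math

def distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles (haversine formula)."""
    r_nm = 3440.065  # mean Earth radius in nautical miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r_nm * math.asin(math.sqrt(a))

def within_relocation_limit(design, actual, limit_nm=0.02):
    """True if the actual sampling point is within the allowed
    relocation distance of the design (X-site) coordinates."""
    return distance_nm(*design, *actual) <= limit_nm
```

For example, a crew anchored exactly on the design coordinates trivially passes, while a point one degree of latitude away (about 60 nm) clearly fails.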
Standardized field data forms (see Appendix B, NCCA Field Operations Manual) are the primary means
of data recording. Field forms are available to crews in both hard copy and electronic versions. On
completion, the data forms are reviewed by a person other than the person who initially entered the
information. Prior to departure from the field site, the field team leader reviews all forms and labels for
completeness and legibility and ensures that all samples are properly labeled and packed.
Upon return from field sampling to the office, crews send completed data forms to the Information
Management Coordinator in Corvallis, Oregon for entry into a computerized data base. Crews will send
in hardcopy forms within 2 weeks of sample collection. Crews will send in electronic field forms as soon
as possible after reviewing the forms, but no longer than one week after sample collection. The
Information Management Coordinator will ensure that data uploaded from field forms are reviewed
independently to verify that values are consistent with those recorded on the field data form or original
field data file.
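The independent review described above amounts to a field-by-field comparison of two transcriptions of the same form. A minimal sketch (the field names are hypothetical, not the NCCA form fields):

```python
def compare_entries(entry_a, entry_b):
    """Return the fields where two independent transcriptions of the
    same field data form disagree, for follow-up review."""
    keys = set(entry_a) | set(entry_b)
    return {k: (entry_a.get(k), entry_b.get(k))
            for k in keys
            if entry_a.get(k) != entry_b.get(k)}

# Hypothetical double-entry of one form: only mismatches are flagged.
mismatches = compare_entries({"ph": "7.8", "secchi_m": "2.4"},
                             {"ph": "7.8", "secchi_m": "2.5"})
```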
Crews store and package samples for shipment in accordance with instructions contained in the Field
Operations Manual. EPA developed the NCCA shipping instructions so that sample holding times are not
exceeded. Samples which must be shipped are delivered to a commercial carrier; copies of bills of lading
or other documentation are maintained by the team. Crews notify the Information Management
Coordinator, as outlined in the FOM, that shipment has occurred so that tracing procedures can be
initiated quickly in the event samples are not received. Crews complete chain-of-custody forms for all
transfers of samples, with copies maintained by the field team.
The field operations phase is completed with collection of all samples or expiration of the sampling
window. Following the field season, EPA and the contractor Field Logistics Coordinator will hold
debriefings with crews and other project staff which cover all aspects of the field program and solicit
suggestions for improvements.
[Figure 5 flowchart: Locate the X-site on a map and conduct a preliminary evaluation (desktop/office).
If the X-site is in a coastal nearshore area, obtain permission to access (as needed) and perform X-site
verification on-site. If the X-site is sampleable, sample it; if not, identify the reason(s) why the site
is not sampleable and determine whether the X-site can be moved within a .02 nm area to achieve
sampleability. If it cannot, select an alternate site; the Regional NCCA rep and Coastal Team Leader
confirm that sites should be dropped. Complete the site verification and upload it to the NCCA
SharePoint.*]
Figure 5. Site Evaluation Diagram
* If you need access to the SharePoint site, please send an email to Kendra Forde at forde.kendra@epa.gov and cc: Hugh
Sullivan at sullivan.hugh@epa.gov. If you are having trouble with the SharePoint site, you may email interim and final
spreadsheets to the Contract Logistics Coordinator and your Regional Coordinator (see page 19 for contact information).
1.7 Overview of Laboratory Operations
Holding times for surface water samples vary with the sample types and analyte. Field crews begin some
analytical measurements during sampling (e.g., in situ measurements) while other analytical
measurements are not initiated until sampling has been completed (e.g., water chemistry, microcystins,
fecal indicators (Enterococci)). Analytical methods are summarized in the NCCA 2015 Laboratory
Operations Manual (LOM). When available, standard methods are used and are referenced in the LOM.
Where experimental methods are used or standard methods are modified by a laboratory, these
methods are documented in the laboratory methods manual by EPA or in internal documentation by the
appropriate laboratory. The laboratory coordinator will work with appropriate experts to describe
them in Standard Operating Procedures (SOPs) developed by the analytical laboratories.
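A holding-time check of the kind implied above is a simple elapsed-time comparison. The limits below are placeholders for illustration only; the actual analyte-specific holding times are those specified in the NCCA 2015 LOM:

```python
from datetime import datetime, timedelta

# Hypothetical holding times; real limits come from the LOM.
HOLDING_TIMES = {
    "chlorophyll-a": timedelta(days=28),
    "enterococci": timedelta(hours=8),
}

def within_holding_time(analyte, collected, analyzed):
    """True if analysis began before the analyte's holding time expired."""
    return analyzed - collected <= HOLDING_TIMES[analyte]
```

For a short-hold analyte, a sample collected at 09:00 and analyzed the same afternoon passes, while one analyzed the next day does not.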
Contractor and/or cooperator laboratories will perform chemical, physical, and biological analyses.
National contract labs will process most samples. Where those labs are currently in place, EPA
has identified them here. Dynamac, a lab managed by the ORD Western Ecology Division, will analyze
water chemistry and chlorophyll-a samples. PG Environmental, a national contract lab, will analyze
benthic invertebrates. EnviroScience, a national contract lab, will analyze sediment chemistry. PG
Environmental, a national contract lab, will analyze sediment toxicity. EnviroScience, a national contract
lab, will analyze whole fish tissue samples. A national contract lab, PG Environmental, will analyze fish
tissue plugs. A national contract lab, EnviroScience, will analyze microcystins samples. EPA's Office of
Research and Development lab in Cincinnati, OH will analyze samples for enterococci. A national
contract lab, Microbac, will analyze fish tissue filet samples. USGS will analyze algal toxins as a research
indicator. Additionally, EPA anticipates that a few pre-approved state labs may opt to analyze samples
for various indicators.
Laboratories providing analytical support must have the appropriate facilities to properly store and
prepare samples and appropriate instrumentation and staff to provide data of the required quality
within the time period dictated by the project. Laboratories are expected to conduct operations using
good laboratory practices. The following are general guidelines for analytical support laboratories:
A program of scheduled maintenance of analytical balances, water purification systems,
microscopes, laboratory equipment, and instrumentation.
Verification of the calibration of analytical balances using class "S" weights which are certified by the
National Institute of Standards and Technology (NIST) (http://www.nist.gov/).
Verification of the calibration of top-loading balances using NIST-certified class "P" weights.
Checking and recording the composition of fresh calibration standards against the previous lot of
calibration standards. Participating laboratories will keep a percentage of the previous lot of
calibration standard to check against the next batch of samples processed. This will ensure that a
comparison between lots can occur. Acceptable comparisons are less than or equal to two percent
of the theoretical value. (This acceptance is tighter than the method calibration criteria.)
Recording all analytical data in bound logbooks in ink, or on standardized recording forms.
Verification of the calibration of uniquely identified daily use thermometers using NIST-certified
thermometers.
Monitoring and recording (in a logbook or on a recording form) temperatures and performance of
cold storage areas and freezer units (where samples, reagents, and standards may be stored).
During periods of sample collection operations, monitoring must be done on a daily basis.
An overall program of laboratory health and safety including periodic inspection and verification of
presence and adequacy of first aid and spill kits; verification of presence and performance of safety
showers, eyewash stations, and fume hoods; sufficiently exhausted reagent storage units, where
applicable; available chemical and hazardous materials inventory; and accessible material safety
data sheets for all required materials.
An overall program of hazardous waste management and minimization, and evidence of proper
waste handling and disposal procedures (90-day storage, manifested waste streams, etc.).
If needed, having a source of reagent water meeting American Society for Testing and Materials
(ASTM) Type I specifications for conductivity (< 1 µS/cm at 25 °C; ASTM 2011) available in sufficient
quantity to support analytical operations.
Appropriate microscopes or other magnification for biological sample sorting and organism
identification.
Approved biological identification and taxonomic keys/guides for use in biological identification
(benthic macroinvertebrates) as appropriate.
Labeling all containers used in the laboratory with date prepared, contents, and initials of the
individual who prepared the contents.
Dating and storing all chemicals safely upon receipt. Chemicals are disposed of properly when they
have passed their expiration date.
Using a laboratory information management system to track the location and status of any sample
received for analysis.
Reporting results electronically using standard formats and units compatible with NARS IM (see LOM
for data templates). These files will be labeled properly by referencing the indicator and/or analyte
and date.
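One of the guidelines above, the between-lot calibration-standard comparison, reduces to a two percent acceptance window around the theoretical value. A minimal sketch (function name is illustrative):

```python
def lot_comparison_ok(measured, theoretical, tol_pct=2.0):
    """Acceptance check for a fresh calibration standard measured
    against the previous lot: within two percent of the theoretical
    value, per the guideline above."""
    pct_diff = abs(measured - theoretical) / abs(theoretical) * 100.0
    return pct_diff <= tol_pct
```

A measured value of 10.1 against a theoretical 10.0 (a 1% difference) passes; 10.3 (3%) fails.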
All laboratories providing analytical support to NCCA 2015 must adhere to the provisions of this
integrated QAPP and LOM. Laboratories will provide information documenting their ability to conduct
the analyses with the required level of data quality prior to data analysis. Different requirements will be
provided based on the type of analysis being completed by the laboratory (i.e. chemistry vs. biological
analyses).
Laboratories will send the documentation to the Project Quality Assurance Coordinator and the
Laboratory Review Coordinator at EPA Headquarters (or other such designated parties). The Project QA
Coordinator will maintain these files in NCCA QA files. Such information may include the following:
Signed Quality Assurance Project Plan by the laboratory performing analysis;
Signed Laboratory Form;
Valid Accreditation or Certification;
Laboratory's Quality Manual and/or Data Management Plan;
Method Detection Limits (MDL);
Demonstration of Capability;
Results from inter-laboratory comparison studies;
Analysis of performance evaluation samples; and
Control charts and results of internal QC sample or internal reference sample analyses to document
achieved precision, bias, and accuracy.
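The Method Detection Limits requested above are conventionally derived with the 40 CFR Part 136 Appendix B procedure: the standard deviation of at least seven low-level spiked replicates multiplied by the one-tailed 99% Student's t value. A sketch of that classic calculation (the NCCA labs' own MDL procedures govern in practice):

```python
import statistics

# One-tailed 99% Student's t values by degrees of freedom (n - 1),
# as tabulated in 40 CFR Part 136 Appendix B.
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821, 10: 2.764}

def method_detection_limit(replicates):
    """MDL = t(n-1, 0.99) x standard deviation of >= 7 replicate
    low-level spikes (classic Appendix B single-laboratory procedure)."""
    if len(replicates) < 7:
        raise ValueError("at least 7 replicates required")
    return T_99[len(replicates) - 1] * statistics.stdev(replicates)
```

With seven replicates, the multiplier is 3.143, so tighter replicate agreement (smaller standard deviation) yields a lower MDL.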
Other requirements may include:
Participation in calls regarding laboratory procedures and processes with participating laboratories;
Participation in a laboratory technical assessment or audit;
Participation in performance evaluation studies; and
Participation in inter-laboratory sample exchange.
Chemistry Lab Quality Evaluation
Participating laboratories will send requested documentation to the NCCA 2015 QA Team for evaluation
of qualifications. The NCCA 2015 QA Team will maintain these records in the project QA file.
Biological Laboratory Quality Evaluation
The NCCA 2015 Quality Team will review the past performance of biological laboratories. The biological
laboratories shall adhere to the quality assurance objectives and requirements as specified for the
pertinent indicators in the LOM.
See Section 6 of this QAPP and Appendix A of the LOM for additional information related to laboratory
certification. All qualified laboratories shall work with the NARS IM Center to track samples as specified
by the NARS Information Management Lead.
1.8 Data Analysis
A technical workgroup convened by the EPA Project Leader is responsible for development of a data
analysis plan that includes a verification and validation strategy. General processes are summarized in
the indicator-specific sections of this QAPP. The NCCA Quality team transfers validated data to the
central database managed by EMAP information management support staff located at WED in Corvallis.
Information management activities are discussed further in Section 4. Data in the WED database are
available to Cooperators for use in development of indicator metrics. EPA will transfer all validated
measurement and indicator data from the NCCA to EPA's Water Quality Exchange (WQX) for storage in
EPA's STORET warehouse for public accessibility. Additionally, the NCCA will maintain data files on the
internal project sharefile site for partners and on the NCCA website for public accessibility. The Data
Analysis plan is described in Section 7 of this QAPP.
1.9 Peer Review
The Survey will undergo a thorough peer review process, where the scientific community and the
public will be given the opportunity to provide comments. Cooperators have been actively involved in
the development of the overall project management, design, methods, and standards including the
drafting of four key project documents:
National Coastal Condition Assessment: Quality Assurance Project Plan (EPA 841-R-14-005)
National Coastal Condition Assessment: Field Operations Manual (EPA 841-R-14-007)
National Coastal Condition Assessment: Laboratory Methods Manual (EPA 841-R-14-008)
National Coastal Condition Assessment: Site Evaluation Guidelines (EPA 841-R-14-006)
Outside scientific experts from universities, research centers, and other federal agencies have been
instrumental in indicator development and will continue to play an important role in data analysis.
The EPA will utilize a three-tiered approach for peer review of the Survey: (1) internal and external
review by EPA, states, other cooperators and partners, (2) external scientific peer review, and (3)
public review.
Once data analysis has been completed, cooperators will examine the results at regional meetings or
webinars. The NCCA project team will incorporate comments and feedback from the cooperators into
the draft report. The NCCA team will send the draft report out for scientific peer review and
incorporate comments into the draft report. Finally, EPA will release the report for public comment.
This public comment period is important to the process and will allow EPA to garner a broader
perspective in examining the results before the final report is completed. The public peer review is
consistent with the Agency and OMB's revised requirements for peer review.
Below are the proposed measures EPA will implement for engaging in the peer review process:
1. Develop and maintain a public website with links to standard operating procedures, quality
assurance documents, fact sheets, cooperator feedback, and final report;
2. Conduct technical workgroup meetings or webinars composed of scientific experts,
cooperators, and EPA to evaluate and recommend data analysis options and indicators;
3. Hold national meetings or webinars where cooperators will provide input and guidance on
data presentation and an approach for data analysis;
4. Complete data validation on all chemical, physical and biological data;
5. Conduct final data analysis with workgroup to generate assessment results;
6. Engage peer review contractor to identify external peer review panel;
7. Develop draft report presenting assessment results;
8. Conduct regional meetings with cooperators to examine and comment on results;
9. Develop final draft report incorporating input from cooperators and results from data analysis
group to be distributed for peer and public review;
10. Issue Federal Register (FR) Notice announcing document availability and hold scientific/peer
review and 30-45 day public comment periods (when applicable);
11. Consider scientific and public comments (when applicable); and produce a final report.
2.0 Data Quality Objectives
It is a policy of the U.S. EPA that Data Quality Objectives (DQOs) be developed for all environmental data
collection activities following the prescribed DQO Process. DQOs are qualitative and quantitative
statements that clarify study objectives, define the appropriate types of data, and specify the tolerable
levels of potential decision errors that will be used as the basis for establishing the quality and quantity
of data needed to support decisions (EPA 2006B). Data quality objectives thus provide the criteria to
design a sampling program within cost and resource constraints or technology limitations imposed upon
a project or study. DQOs are typically expressed in terms of acceptable uncertainty (e.g., width of an
uncertainty band or interval) associated with a point estimate at a desired level of statistical confidence
(EPA 2006B). The DQO Process is used to establish performance or acceptance criteria, which serve as
the basis for designing a plan for collecting data of sufficient quality and quantity to support the goals of
a study (EPA 2006B). As a general rule, performance criteria represent the full set of specifications that
are needed to design a data or information collection effort such that, when implemented, it generates
newly collected data that are of sufficient quality and quantity to address the project's goals (EPA
2006B). Acceptance criteria are specifications intended to evaluate the adequacy of one or more
existing sources of information or data as being acceptable to support the project's intended use (EPA
2006B).
2.1 Data Quality Objectives for the National Coastal Condition Assessment
NCCA has established target DQOs for assessing the current status of selected indicators of condition
for the conterminous U.S. coastal resources as follows:
For each indicator of condition, estimate the proportion of the nation's estuaries and
combined area of the Great Lakes in degraded condition within a ± 5% margin of error and
with 95% confidence.
For each indicator of condition, estimate the proportion of regional estuarine (Northeast,
Southeast, Gulf of Mexico, and West Coast) or Great Lake resources in degraded condition
within a ± 15% margin of error and with 95% confidence.
For estimates of change, the DQOs are: Estimate the proportion of the nation's estuaries and
combined area of the Great Lakes (± 7%) that have changed condition classes for selected
measures with 95% confidence.
2.2 Measurement Quality Objectives
For each parameter, performance objectives (associated primarily with measurement error) are
established for several different data quality indicators (following USEPA Guidance for Quality
Assurance Plans EPA240/R-02/009). Specific measurement quality objectives (MQOs) for each
parameter are shown in chapter 5 of this QAPP and in the LOM. The following sections define the data
quality indicators and present approaches for evaluating them against acceptance criteria established
for the program.
2.2.1 Method Detection Limits (Laboratory Reporting Level (Sensitivity))
For chemical measurements, requirements for the MDL are typically established (see indicators in
Section 5). The MDL is defined as the lowest level of analyte that can be distinguished from zero with
99 percent confidence based on a single measurement (Glaser et al., 1981). The United States Geological
Survey (USGS) NWQL has developed a variant of the MDL called the long-term MDL (LT-MDL) to
capture greater method variability (Oblinger Childress et al. 1999). Unlike the MDL, it is designed to
incorporate more of the measurement variability that is typical for routine analyses in a production
laboratory, such as multiple instruments, operators, calibrations, and sample preparation events
(Oblinger Childress et al. 1999). The LT-MDL determination ideally employs at least 24 spiked samples
prepared and analyzed by multiple analysts on multiple instruments over a 6- to 12-month period at
a frequency of about two samples per month (EPA 2004B). The LT-MDL uses "F-pseudosigma" (Fσ) in
place of s, the sample standard deviation, used in the EPA MDL calculation. F-pseudosigma is a non-
parametric measure of variability that is based on the interquartile range of the data (EPA 2004B).
The LT-MDL may be calculated using either the mean or median of a set of long-term blanks, or from
long-term spiked sample results (depending on the analyte and specific analytical method). The LT-
MDL for an individual analyte is calculated as:
Equation 1a LT-MDL = M + t(n-1, 0.99) x Fσ
where M is the mean or median of blank results; t(n-1, 0.99) is the Student's t-statistic at the 99
percent confidence level for n-1 degrees of freedom; n is the number of spiked sample results; and
Fσ is F-pseudosigma, a nonparametric estimate of variability calculated as:
Equation 1b Fσ = (Q3 - Q1) / 1.349
where Q3 and Q1 are the 75th percentile and 25th percentile of spiked sample results, respectively.
LT-MDL is designed to be used in conjunction with a laboratory reporting level (LRL; Oblinger Childress
et al. 1999). The LRL is designed to achieve a risk of <1% for both false negatives and false positives
(Oblinger Childress et al. 1999). The LRL is set as a multiple of the LT-MDL, and is calculated as follows:
LRL = 2 x LT-MDL
Therefore, multiple measurements of a sample having a true concentration at the LRL should result in
the concentration being detected and reported 99 percent of the time (Oblinger Childress et al. 1999).
All laboratories will develop calibration curves for each batch of samples that include a calibration
standard with an analyte concentration equal to the LRL. Estimates of LRLs (and how they are
determined) are required to be submitted with analytical results. Analytical results associated with LRLs
that exceed the objectives are flagged as being associated with unacceptable LRLs. Analytical data that
are below the estimated LRLs are reported, but are flagged as being below the LRLs.
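The LT-MDL and LRL calculations above can be sketched in Python. This is a minimal sketch assuming the standard USGS formulation (median of blanks offset by a t-multiple of F-pseudosigma); the t-value must be supplied from a t-table for n-1 degrees of freedom, and the function names are illustrative.

```python
import statistics

def f_pseudosigma(spiked_results):
    """F-pseudosigma (Equation 1b): interquartile range divided by 1.349."""
    qs = statistics.quantiles(spiked_results, n=4, method="inclusive")
    q1, q3 = qs[0], qs[2]
    return (q3 - q1) / 1.349

def lt_mdl(blank_results, spiked_results, t_99):
    """Long-term MDL (Equation 1a): median of blanks plus t * F-pseudosigma.
    t_99 is the Student's t-statistic (99% confidence, n-1 degrees of freedom),
    looked up externally for the number of spiked results used."""
    m = statistics.median(blank_results)
    return m + t_99 * f_pseudosigma(spiked_results)

def lrl(ltmdl_value):
    """Laboratory reporting level: set at twice the LT-MDL."""
    return 2 * ltmdl_value
```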
2.2.2 Sampling Precision and Bias
Precision and bias are estimates of random and systematic error in a measurement process (Kirchmer,
1983; Hunt and Wilson, 1986; USEPA 2002). Collectively, precision and bias provide an estimate of the
total error or uncertainty associated with an individual measurement or set of measurements.
Systematic errors are minimized by using validated methods and standardized procedures across all
laboratories. Precision is estimated from repeated measurements of samples. Net bias is determined
from repeated measurements of solutions of known composition, or from the analysis of samples that
have been fortified by the addition of a known quantity of analyte. For analytes with large ranges of
expected concentrations, MQOs for precision and bias are established in both absolute and relative
terms, following the approach outlined in Hunt and Wilson (1986). At lower concentrations, MQOs are
specified in absolute terms. At higher concentrations, MQOs are stated in relative terms. The point of
transition between an absolute and relative MQO is calculated as the quotient of the absolute objective
divided by the relative objective (expressed as a proportion, e.g., 0.10 rather than as a percentage, e.g.,
10%). Precision and bias within each laboratory are monitored for every sample batch by the analysis of
internal QC samples. Samples associated with unacceptable QC sample results are reviewed and re-
analyzed if necessary. For selected analyses, precision and bias across all laboratories will be evaluated
by EPA (or an EPA contractor) sending performance evaluation samples to each lab. For more
information, see section 5 of this QAPP and the Laboratory Operations Manual. Equations used to
calculate precision, bias and accuracy follow.
Equation 1 Standard Deviation. Precision in absolute terms is estimated as the sample standard
deviation when the number of measurements is greater than two:
s = √( Σ(xᵢ - x̄)² / (n - 1) )
where xᵢ is the value of the ith replicate, x̄ is the mean of the repeated sample measurements, and n is
the number of replicates.
Equation 2 Relative Standard Deviation or Coefficient of Variation. Relative precision for such
measurements is estimated as the relative standard deviation (RSD, or coefficient of variation, [CV]):
RSD = (s / x̄) x 100
where s is the sample standard deviation of the set of measurements, and x̄ is the mean value for the
set of measurements.
Equation 3 Relative Percent Difference. Precision based on duplicate measurements is estimated based
on the range of measured values (which equals the difference for two measurements). The relative
percent difference (RPD) is calculated as:
RPD = ( |A - B| / ((A + B) / 2) ) x 100
where A is the first measured value and B is the second measured value.
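The precision statistics of Equations 1-3 can be sketched in Python; a minimal sketch using the standard library, with illustrative function names.

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (CV), Equation 2: 100 * s / mean,
    where s is the sample standard deviation (Equation 1)."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def rpd_percent(a, b):
    """Relative percent difference for duplicate measurements, Equation 3."""
    return abs(a - b) / ((a + b) / 2) * 100
```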
Equation 4 Net Bias. For repeated measurements of samples of known composition, net bias (B) is
estimated in absolute terms as:
B = x̄ - T
where x̄ equals the mean value for the set of measurements, and T equals the theoretical or
target value of a performance evaluation sample.
Equation 5 Relative Bias. Bias in relative terms (B[%]) is calculated as:
B[%] = ((x̄ - T) / T) x 100
where x̄ equals the mean value for the set of measurements, and T equals the theoretical or target
value of a performance evaluation sample.
2.2.3 Sampling Accuracy
Accuracy is generally a qualitative description rather than a quantitative description. Therefore, accuracy
is estimated for some analytes by calculating the percent recovery of a known quantity of an analyte
from fortified or spiked samples. For example, for water chemistry and chlorophyll a, accuracy is
estimated as the difference between the measured (across batches) and target values of performance
evaluation and/or internal reference samples at the lower concentration range, and as the percent
difference at the higher concentration range. See specific indicator sections in Chapter 5 for which
analytes include accuracy calculations.
Equation 6 Percent Recovery. Percent recovery is calculated as:
%recovery = ((Cs - Cu) / Csp) x 100
where Cs is the measured concentration of the spiked sample, Cu is the concentration of the unspiked
sample, and Csp is the concentration of the spike.
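The bias and recovery calculations of Equations 4-6 can be sketched in Python; a minimal sketch with illustrative function names, not part of the NCCA specification.

```python
import statistics

def net_bias(measurements, target):
    """Equation 4: mean of repeated measurements minus the known target value."""
    return statistics.mean(measurements) - target

def relative_bias_percent(measurements, target):
    """Equation 5: net bias expressed as a percentage of the target value."""
    return 100 * net_bias(measurements, target) / target

def percent_recovery(spiked_conc, unspiked_conc, spike_conc):
    """Equation 6: recovered fraction of a known spike, as a percentage."""
    return (spiked_conc - unspiked_conc) / spike_conc * 100
```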
2.2.4 Taxonomic Precision and Accuracy
For the NCCA, taxonomic precision will be quantified by comparing whole-sample identifications
completed by independent taxonomists or laboratories. Accuracy of taxonomy will be qualitatively
evaluated through specification of target hierarchical levels (e.g., family, genus, or species); and the
specification of appropriate technical taxonomic literature or other references (e.g., identification keys,
voucher specimens). To calculate taxonomic precision, 10 percent of the samples will be randomly
selected for re-identification by an independent, outside taxonomist or laboratory.
Equation 7 Percent Taxonomic Disagreement. Comparison of the results of whole sample re-
identifications will provide a Percent Taxonomic Disagreement (PTD) calculated as:
PTD = (1 - (comp_pos / N)) x 100
where comp_pos is the number of agreements, and N is the total number of individuals in the larger of
the two counts. The lower the PTD, the more similar the taxonomic results and the better the overall
taxonomic precision. A MQO of 15% is recommended for taxonomic difference (overall mean <15% is
acceptable). Individual samples exceeding 15% are examined for taxonomic areas of substantial
disagreement, and the reasons for disagreement investigated.
Sample enumeration is another component of taxonomic precision. Final specimen counts for samples
are dependent on the taxonomist, not the rough counts obtained during the sorting activity.
Equation 8 Percent Difference in Enumeration. Comparison of counts is quantified by calculation of
percent difference in enumeration (PDE), calculated as:
PDE = ( |Lab1 - Lab2| / (Lab1 + Lab2) ) x 100
where Lab1 and Lab2 are the specimen counts reported by the two laboratories.
An MQO of 5% is recommended (overall mean of <5% is acceptable) for PDE values. Individual samples
exceeding 5% are examined to determine reasons for the exceedance.
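The PTD and PDE checks above can be sketched in Python; a minimal sketch with illustrative names, showing how the two MQOs would be evaluated for a re-identified sample.

```python
def ptd_percent(agreements, n_larger):
    """Percent taxonomic disagreement (Equation 7): agreements relative to
    the larger of the two specimen counts. MQO: overall mean < 15%."""
    return (1 - agreements / n_larger) * 100

def pde_percent(count_lab1, count_lab2):
    """Percent difference in enumeration (Equation 8) between the counts
    reported by two laboratories. MQO: overall mean < 5%."""
    return abs(count_lab1 - count_lab2) / (count_lab1 + count_lab2) * 100
```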
Corrective actions for samples exceeding these MQOs can include defining the taxa for which re-
identification may be necessary (potentially even by third party), for which samples (even outside of the
10% lot of QC samples) it is necessary, and where there may be issues of nomenclatural or enumeration
problems.
Taxonomic accuracy is evaluated by having individual specimens representative of selected taxa
identified by recognized experts. Samples will be identified using the most appropriate technical
literature that is accepted by the taxonomic discipline and reflects the accepted nomenclature. Where
necessary, the Integrated Taxonomic Information System (ITIS, http://www.itis.usda.gov/) will be used
to verify nomenclatural validity and spelling. A reference collection will be compiled as the samples are
identified. Specialists in several taxonomic groups will verify selected individuals of different taxa, as
determined by the NCCA workgroup.
2.2.5 Completeness
Completeness is defined as "a measure of the amount of data collected from a measurement process
compared to the amount that was expected to be obtained under the conditions of measurement"
(Stanley and Vener, 1985).
Completeness requirements are established and evaluated from two perspectives. First, valid data for
individual parameters must be acquired from a minimum number of sampling locations in order to make
subpopulation estimates with a specified level of confidence or sampling precision. The objective of this
study is to complete sampling at 95% or more of the 1000 initial sampling sites. Percent completeness is
calculated as:
Equation 9 Percent Completeness.
%C = (V / T) x 100
where V is the number of measurements/samples judged valid, and T is the total number of planned
measurements/samples.
Within each indicator, completeness objectives are also established for individual samples or individual
measurement variables or analytes. These objectives are estimated as the percentage of valid data
obtained versus the amount of data expected based on the number of samples collected or number of
measurements conducted. Where necessary, supplementary objectives for completeness are presented
in the indicator-specific sections of this QAPP.
The completeness objectives are established for each measurement per site type (e.g., probability sites,
revisit sites, etc.). Failure to achieve the minimum requirements for a particular site type results in
regional population estimates having wider confidence intervals and may impact the ability to make
some subnational assessments. Failure to achieve requirements for repeat sampling (10% of samples
collected) and revisit samples (10% of sites visited) reduces the precision of estimates of index period
and annual variance components, and may impact the representativeness of these estimates because of
possible bias in the set of measurements obtained.
2.2.6 Comparability
Comparability is defined as "the confidence with which one data set can be compared to another"
(Stanley and Vener, 1985). A performance-based methods approach is being utilized for water chemistry
and chlorophyll-a analyses that defines a set of laboratory method performance requirements for data
quality. Following this approach, participating laboratories may choose which analytical methods they
will use for each target analyte as long as they are able to achieve the performance requirements as
listed in the Quality Control section of each Indicator section. For all parameters, comparability is
addressed by the use of standardized sampling procedures and analytical methods by all sampling crews
and laboratories. Comparability of data within and among parameters is also facilitated by the
implementation of standardized quality assurance and quality control techniques and standardized
performance and acceptance criteria. For all measurements, reporting units and format are specified,
incorporated into standardized data recording forms, and documented in the information management
system. Comparability is also addressed by providing results of QA sample data, such as estimates of
precision and bias, and conducting performance evaluation studies such as providing performance
evaluation samples to all appropriate labs and implementing an independent verification of taxonomic
identifications for 10% of samples processed at laboratories.
2.2.7 Representativeness
Representativeness is defined as "the degree to which the data accurately and precisely represent a
characteristic of a population parameter, variation of a property, a process characteristic, or an
operational condition" (USEPA 2002). At one level, representativeness is affected by problems in any or
all of the other data quality indicators.
At another level, representativeness is affected by the selection of the target surface water bodies, the
location of sampling sites within that body, the time period when samples are collected, and the time
period when samples are analyzed. The probability-based sampling design should provide estimates of
condition of surface water resource populations that are representative of the region. The individual
sampling programs defined for each indicator attempt to address representativeness within the
constraints of the response design (which includes when, where, and how to collect a sample at each
site). Holding time requirements for analyses ensure analytical results are representative of conditions
at the time of sampling. Use of duplicate (repeat) samples which are similar in composition to samples
being measured provides estimates of precision and bias that are applicable to sample measurements.
3. Site Selection Design
The overall sampling program for the NCCA project requires a randomized, probability-based approach
for selecting coastal sites where sampling activities are to be conducted. Details regarding the specific
application of the probability design to surface water resources are described in Paulsen et al. (1991)
and Stevens (1994). The specific details for the collection of samples associated with different indicators
are described in the indicator-specific sections of this QAPP.
3.1. Probability Based Sampling Design and Site Selection
The target population for this project includes:
All coastal waters of the United States from the head-of-salt to the confluence with the ocean, including
inland waterways and major embayments such as Florida Bay and Cape Cod Bay. For the
purposes of this study the head of salt is generally defined as < 0.5 psu (ppt) and represents the
landward/upstream boundary. The seaward boundary extends out to where an imaginary
straight-line intersecting two land features would fully enclose a body of coastal water. All
waters within the enclosed area are defined as estuarine, regardless of depth or salinity.
Near shore waters of the Great Lakes of the United States and Canada. The near shore zone is
defined as the region from the shoreline to 30 m depth, constrained to a maximum of 5 km from the
shoreline. The Great Lakes include Lake Superior, Lake Michigan, Lake Huron, Lake Erie, and Lake
Ontario. The NARS Great Lakes survey will be restricted to the United States portion.
3.2. Survey Design for the Marine Waters
The sample frame was derived from the prior National Coastal Assessment sample frame developed
by ORD Gulf Breeze Ecology Division. The prior GED sample frame was enhanced as part of the
National Coastal Monitoring Network design (National Water Quality Monitoring Network) by
including information from NOAA's Coastal Assessment Framework, boundaries of National Estuary
Programs (NEP) and identification of major coastal systems. Information on salinity zones was
obtained from NOAA for the NCCA. For Delaware Bay, Chesapeake Bay, Puget Sound and the state of
South Carolina, the prior NCCA sample frames were replaced by GIS layers provided by the South Carolina
Department of Health & Environmental Control, Washington Department of Ecology, Chesapeake Bay
Program and Delaware River Basin Commission, ensuring that no prior areas in NCCA were excluded
and any differences were clearly identified in the new NCCA sample frame.
A Generalized Random Tessellation Stratified (GRTS) survey design for an area resource was used for
the NCCA. The survey design is a stratified design with unequal probability of selection based on area
within each stratum. The details are given below:
Unequal probability categories were created based on area of polygons within each major estuary. The
number of categories ranged from 3 to 7. The categories were used to ensure that sites were selected in
the smaller polygons. The Design includes three panels: "Revisit" identifies sites that are to be visited
twice, "Base" identifies remaining sites to be visited, and "Over" identifies sites available to be used as
replacement sites. Over sample sites were selected independent of the other two panels. The expected
sample size is 682 sites for conterminous coastal states and 45 sites each for Hawaii and Puerto Rico.
The maximum number of sites for a major estuary was 46 (Chesapeake Bay). The total number of site
visits is 750, allocated across 682 unique sites, 68 of which are to be revisited. Additionally, over sample
sites were selected not only to provide replacement sites for sites that either are not part of the target
population or could not be sampled, but also to accommodate those states or National Estuary Programs
that may want to increase the number of sites sampled within their state for a state-level design or NEP design.
3.3. Survey Design for the Great Lakes
The sample frame was obtained from Jack Kelly, US EPA ORD. A Generalized Random Tessellation
Stratified (GRTS) survey design for an area resource was used. The survey design is stratified by Lake and
country with unequal probability of selection based on state shoreline length within each stratum.
Unequal probability categories are states or province within each Great Lake based on proportion of
state shoreline length within each stratum. The design uses a single panel, "Base", with an over sample
that was selected independent of the Base panel. The expected sample size is 45 sites in the shallow
near shore zone for each Great Lake and country combination, for a total of 405 sites. Sample sizes were
allocated proportional to shoreline length by state within each Great Lake. An over sample size of 405
(100%) was selected to provide replacement sites that either are not part of the target population or
could not be sampled. The over sample sites were selected independently of the base design.
3.4 Revisit Sites
Of the sites visited in the field and found to be target sites, a total of 10% will be revisited. The primary
purpose of this revisit set of sites is to allow variance estimates that would provide information on the
extent to which the population estimates might vary if they were sampled at a different time.
4.0 Information Management
Environmental monitoring efforts that amass large quantities of information from various sources
present unique and challenging data management opportunities. To meet these challenges, the NCCA
employs a variety of well-tested information management (IM) strategies to aid in the functional
organization and ensure the integrity of stored electronic data. IM is integral to all aspects of the NCCA
from initial selection of sampling sites through the dissemination and reporting of final, validated data.
And, by extension, all participants in the NCCA have certain responsibilities and obligations which also
make them a part of the IM system. This "inclusive" approach to managing information helps to:
Strengthen relationships among NCCA participants.
Increase the quality and relevancy of accumulated data.
Ensure the flexibility and sustainability of the NCCA IM structure.
This IM strategy provides a congruent and scientifically meaningful approach for maintaining
environmental monitoring data that will satisfy both scientific and technological requirements of the
NCCA.
4.1 Roles and Responsibilities
At each point where data and information are generated, compiled, or stored, the NCCA team must
manage the information. Thus, the IM system includes all of the data-generating activities, all of the
means of recording and storing information, and all of the processes which use data. The IM system also
includes both hardcopy and electronic means of generating, storing, organizing and archiving data, and
the effort to achieve a functional IM process is all-encompassing. To that end, all participants in the
NCCA play an integral part within the IM system. Table 1 provides a summary of the IM responsibilities
identified by NCCA group. See also roles/responsibilities in Section 1.3 of the QAPP. Specific information
on the field team responsibilities for tracking and sending information is found in the Field Operations
Manual.
Table 1. Summary of IM Responsibilities
NCCA Group: Field Crews
Contact: State partners and contractors
Primary Role: Acquire in-situ measurements and the prescribed list of biotic/abiotic samples at each
site targeted for the survey.
Responsibility:
Complete and review field data forms and sample tracking forms for accuracy, completeness, and
legibility.
Ship/fax field and sample tracking forms to the NCCA IM Center so information can be integrated into
the central database.
Work with the NCCA IM Center staff to develop acceptable file structures and electronic data transfer
protocols should there be a need to transfer and integrate data into the central database.
Provide all data as specified in the Field Operations Manual or as negotiated with the NCCA Project
Leader.
Maintain open communications with the NCCA IM Center regarding any data issues.

NCCA Group: Analytical Laboratories
Contact: State partners and contractors
Primary Role: Analyze samples received from field teams in the manner appropriate to acquire the
biotic/abiotic indicators/measurements requested.
Responsibility:
Review all electronic data transmittal files for completeness and accuracy (as identified in the Quality
Assurance Project Plan).
Use provided data templates and work with the NCCA IM Center staff to develop file structures and
electronic data transfer protocols for electronic-based data as needed.
Submit completed sample tracking forms to the NCCA IM Center so information can be updated in the
central database.
Provide all data and metadata as specified in the LOM and QAPP or as negotiated with the NCCA
Project Leader.
Maintain open communications with the NCCA IM Center regarding any data issues.
-------
National Coastal Condition Assessment 2015
Version 2.1 May 2016
Quality Assurance Project Plan
Page 48 of 134
NCCA Group
NCCAIM
Center staff
(led by
Information
Management
Coordinator)
NCCA Quality
Assurance
Coordinator
NCCA
Laboratory
Review
Coordinator
Contact
USEPAORD
NHEERL
Western
Ecology
Division-
Corvallis/Co
ntractor
USEPA
Office Of
Water
USEPA
Office of
Water
Primary Role
Provides
support and
guidance for all
IM operations
related to
maintaining a
central data
management
system for
NCCA.
Review and
evaluate the
relevancy and
quality of
information/dat
a collected and
generated
through the
NCCA surveys.
Coordinates
oversight of
participating
labs and
Responsibility
Develop/update field data forms.
Plan and implement electronic data flow and
management processes.
Manage the centralized database and implement
related administration duties.
Receive, scan, and conduct error checking of
field data forms.
Monitor and track samples from field collection,
through shipment to appropriate laboratory.
Receive data submission packages (analytical
results and metadata) from the NCCA Quality
team.
Run automated error checking, e.g., formatting
differences, field edits, range checks, logic
checks, etc.
Receive verified, validated, and final indicator
data files (including record changes and reason
for change) from QA reviewers. Maintain history
of all changes to data records from inception
through delivery to WQX.
Organize data in preparation for data verification
and validation analysis and public dissemination.
Implement backup and recovery support for
central database.
Implement data version control as appropriate
including maintaining data tracking
documentation for field and lab data received by
NARSIM.
Monitor instrument and analytical quality
control information.
Evaluate results stemming from field and
laboratory audits.
Investigate and take corrective action, as
necessary, to mitigate any data quality issues.
Issue guidance to NCCA Project Leader and IM
Center staff for qualifying data when quality
standards are not met or when protocols
deviate from plan.
Ensures participating laboratories complete
sample analysis following LOM.
Ensures participating laboratories follow QA
activities.
-------
National Coastal Condition Assessment 2015
Version 2.1 May 2016
Quality Assurance Project Plan
Page 49 of 134
NCCA Group
Contact
Primary Role
Responsibility
submission of
lab data.
Ensures data submitted within the specified
timelines.
Coordinates activities of individual lab Task
Order Project Officers to ensure methods are
followed and QA activities take place.
Submits laboratory data files to NARS IM
Coordinator for upload to the NARS IM
database. Maintains data tracking
documentation for laboratory submissions to
NARSIM.
NCCA Data
Analysis and
Reporting
Team
USEPA
Office of
Water
Provide the data
analysis and
technical
support for
NCCA reporting
requirements
Provide data integration, aggregation and
transformation support as needed for data
analysis.
Provide supporting information necessary to
create metadata.
Investigate and follow-up on data anomalies
identified data analysis activities.
Produce estimates of extent and ecological
condition of the target population of the
resource.
Provide written background information and
data analysis interpretation for report(s).
Document in-depth data analysis procedures
used.
Provide mapping/graphical support.
Document formatting and version control.
Data
Finalization
Team
USEPA
Office of
Water
Provides data
librarian
support
Prepare NCCA data for transfer to USEPA public
web-server(s).
Generate data inventory catalog record (Science
Inventory Record)
Ensure all metadata is consistent, complete, and
compliant with USEPA standards.
4.2 State-Based Data Management
Some state partners will be conducting field and laboratory analyses. While NCCA encourages states to
use these in-house capabilities, it is imperative that NCCA partners understand their particular role and
responsibilities for executing these functions within the context of the national program:
• If a state chooses to do IM in-house, the state will perform all of the functions associated with the following roles:
o Field Crew, including shipping/faxing of field data forms to the IM Coordinator (NCCA field forms must be used and the original field forms must be sent to the IM Center as outlined in the Field Operations Manual)
o Quality Control Team for internal laboratory data as required for the selected indicator in the NCCA 2015 QAPP and LOM
• All data will flow from the state to the Laboratory Review Coordinator. Typically, the state will provide a single point of contact for all things related to NCCA data. However, it may be advantageous for the NCCA team to have direct communication with the state-participating laboratories to facilitate the transfer of data (a point that may be negotiated between the primary state contact, the regional coordinator, and the NCCA Project Leader).
• States must submit all initial laboratory results (i.e., those that have been verified by the laboratory and have passed all internal laboratory QA/QC criteria) in the appropriate format to the Laboratory Review Coordinator and the Project QA Coordinator by May 2016 or as otherwise negotiated with EPA.
• The NCCA Quality Team will complete additional QC and then submit the data to the NCCA IM Center.
• Data transfers must be complete. For example, laboratory analysis results submitted by the state must be accompanied by related quality control and quality assurance data, qualifier code definitions, contaminant/parameter code cross-references/descriptions, test methods, instrumentation information, and any other relevant laboratory-based assessments or documentation related to specific analytical batch runs.
• The state will ensure that data meet minimum quality standards and that data transfer files meet negotiated content and file structure standards.
The NCCA Laboratory review coordinator will provide all participating labs with required data templates
for use in submitting data and metadata results.
4.3 Overview of System Structure
In its entirety, the IM system includes site selection and logistics information, sample labels and field
data forms, tracking records, map and analytical data, data validation and analysis processes,
reports, and archives. NCCA IM staff provides support and guidance to all program operations in
addition to maintaining a central database management system for the NCCA data. The central
repository for data and associated information collected for use by the NCCA is a secure, access-
controlled server located at WED-Corvallis. This database is known as the National Aquatic Resource
Surveys Information Management System (NARSIMS). The general organization of the information
management system is presented in Figure 6. Data are stored and managed on this system using the
Structured Query Language (SQL). Data review (e.g., verification and validation) and data analysis
(e.g., estimates of status and extent) are accomplished primarily using programs developed in either
SAS or R software packages.
[Figure 6 diagram: Sample site information (Tier II list; frame: site ID, weighting factor, location coordinates; logistics data: site ID information, location coordinates, access information; site verification data: site ID, measured location coordinates, sampling status); indicator research and development information (field data, laboratory data, QA/QC data, tracking data, historical data, and stressor data such as land use data and sampling status); assessment and reporting information, by indicator (annual population status data, population trend data, spatial data (GIS)); and meta-data (database documentation, quality assurance documentation, IM system user guides, methods documentation).]
Figure 6. Organization of the National Aquatic Resource Surveys Information Management
System (NARSIMS) for the NCCA.
4.3.1 Data Flow Conceptual Model
The NCCA will accumulate large quantities of observational and laboratory analysis data. To
appropriately manage this information, it is essential to have a well-defined data flow model and
documented approach for acquiring, storing, and summarizing the data. This conceptual model (Figure
7) helps focus efforts on maintaining organizational and custodial integrity, ensuring that data available
for analyses are of the highest possible quality.
4.3.2 Simplified Data Flow Description
There are several components associated with the flow of information; these are:
• Communication among the NCCA Quality Team, the IM Center, and the various data contributors (e.g., field crews, laboratories, and the data analysis and reporting team) is vital for maintaining an organized, timely, and successful flow of information and data.
• Data are captured or acquired from four basic sources (field data transcription, laboratory analysis reporting, automated data capture, and submission of external data files such as GIS data), encompassing an array of data types: site characterization; biotic assessment; sediment and tissue contaminants; and water quality analysis. Data capture generally relies on the
transference of electronic data, e.g., optical character readers and email, to a central data
repository. However, some data must be transcribed by hand in order to complete a record.
[Figure 7 diagram: Ecological indicator field and laboratory data flow. Field data collection (sample collection) and laboratory results (via laboratory information management systems), together with other data files (e.g., survey design, GIS attribute data), flow as raw data submission packages to the Information Management Center (WED-Corvallis) and into the NARS IM SQL Server (relational; 1 record per datum, held in data tables 1 through n). Outputs include assessment data files (extent and status estimates), final data records as flat files posted to a webpage or FTP site, and final data records transferred to the EPA Water Quality Exchange (WQX) for permanent archival.]
Figure 7. Conceptual Model of Data Flow into and out of the Master SQL Database for the
NCCA
• Data repository or storage provides the computing platform where raw data are archived,
partially processed data are staged, and the "final" data, assimilated into a final, user-ready data
file structure, are stored. The raw data archive is maintained in a manner consistent with
providing an audit trail of all incoming records. The staging area provides the IM Center staff a
platform for running the data through all of its QA/QC paces as well as providing data analysts a
first look at the incoming data. This area of the data system evolves as new data are gathered
and user-requirements are updated. The final data format becomes the primary source for all
statistical analysis and data distribution.
• Metadata: a descriptive document that contains information compliant with the Content
Standards for Digital Geospatial Metadata (CSDGM) developed by the Federal Geographic Data
Committee (FGDC).
4.4 Core Information Management Standards
The development and organization of the IM system is compliant with guidelines and standards
established by the EMAP Information Management Technical Coordination Group, the EPA Office of
Technology, Operations, and Planning (OTOP), and the ORD Office of Science Information
Management. Areas addressed by these policies and guidelines include, but are not limited to, the
following:
• Taxonomic nomenclature and coding;
• Locational data;
• Sampling unit identification and reference;
• Hardware and software; and
• Data catalog documentation.
The NCCA is committed to compliance with all applicable regulations and guidance concerning hardware
and software procurement, maintenance, configuration control, and QA/QC. To that end, the NCCA
team has adopted several IM standards that help maximize the ability to exchange data within the study
and with other aquatic resource surveys or similar large-scale monitoring and assessment studies (e.g.
EMAP, R-EMAP, state probability surveys). These standards include those of the Federal Geographic
Data Committee (FGDC 1999), the National Spatial Data Infrastructure (NSDI 1999), and the National
Biological Information Infrastructure (NBII 1999). More specific information follows.
4.5 Data Formats
Attribute Data
• SQL tables
• SAS data sets
• R data sets
• ASCII files: comma-separated values, space-delimited, or fixed column

GIS Data
• ARC/INFO native and export files; compressed .tar file of ARC/INFO workspace
• Spatial Data Transfer Standard (SDTS; FGDC 1999) format available on request
Standard Coding Systems
Sampling Site (EPA Locational Data Policy; EPA 1991)
• Latitude and longitude in decimal degrees ( +/- 7.4)
• Negative longitude values (west of the prime meridian)
• Datum used must be specified (e.g., NAD83, NAD27)

Chemical Compounds: Chemical Abstracts Service (CAS 1999)
Species Codes: Integrated Taxonomic Information System (ITIS 1999)
Land cover/land use codes: Multi-Resolution Land Characteristics (MRLC 1999); National Hydrography
Dataset Plus Version 1.0 (NHDPlus 2005)
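The locational conventions above (signed decimal degrees, negative longitudes west of the prime meridian, an explicit datum) can be sketched as follows. This is an illustrative helper, not part of the NCCA templates; the site ID and coordinates are invented.

```python
def to_signed_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.

    West and South coordinates are reported as negative values, per
    the EPA Locational Data Policy convention cited above.
    """
    dd = degrees + minutes / 60.0 + seconds / 3600.0
    return -dd if hemisphere in ("W", "S") else dd

# A site record carries its datum alongside the coordinates
# (site ID and values below are hypothetical, for illustration only).
site = {
    "site_id": "NCCA15-EXAMPLE-001",
    "lat": to_signed_decimal(44, 33, 48.0, "N"),
    "lon": to_signed_decimal(123, 15, 36.0, "W"),
    "datum": "NAD83",
}
```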
Public Accessibility
While any data created using public funds are subject to the Freedom of Information Act (FOIA), some
basic rules apply for general public accessibility and use.
• The program must comply with the Data Quality Act before making any data available to the public and must complete a signed Information Quality Guidelines package before any posting to the web or distribution of any kind.
• Data and metadata files are made available to the contributor or participating group for review or other project-related use from NARSIMS or in flat files before moving to an EPA-approved public website.
• Data to be placed on a public website will undergo QA/QC review according to the approved Quality Assurance Project Plan.
• Only "final" data (those used to prepare the final project report) are readily available through an EPA-approved public website. Other data can be requested through the NCCA Project Leader or NARS Coordinator.
As new guidance and requirements are issued, the NCCA information management staff will assess the
impact upon the IM system and develop plans for ensuring timely compliance.
4.6 Data Transfer Protocols
Field crews are expected to send in hard copies of field forms containing in-situ measurement and event
information to the IM Center as defined in the Field Operations Manual. Field crews using electronic
field forms email files to the Information Management Coordinator via the App email function.
Laboratories submit electronic data files. The NARS IM Team receives and maintains tracking records
for sampling and sample receipt including all records of crews logging in sampling events, shipment of
samples to the batch lab, shipment of samples to processing labs, and receipt of samples by both the
batch and processing labs. This information is maintained in the NARS database.
Labs must submit all sample tracking and analytical results data to the Laboratory Review Coordinator
and the Project QA Coordinator (or for EPA contract labs the applicable Contract Officer Representative)
in electronic form using a standard software package to export and format data. The Laboratory Review
Coordinator provides labs with EPA's standardized template for reporting results and metadata. The
Laboratory Review Coordinator transfers the lab data to NARS IM and maintains records of the transfer.
The Laboratory Review Coordinator oversees and works with the Project QA Coordinator and EPA Task
Order or Work Assignment Contracts Officer Representatives to ensure that all interim and final data
submissions from the labs are maintained on the NCCA g:drive. Examples of software and the associated
formats are:
Software          Export Options (file extensions)
Microsoft Excel   xls, xlsx, csv, formatted txt
Microsoft Access  mdb, csv, formatted txt
SAS               sas7bdat, csv, formatted txt
R                 csv, formatted txt
All electronic files submitted by the laboratories must be accompanied by appropriate documentation
(e.g., metadata, laboratory reports, QA/QC data, and review results). The laboratory-submitted
information shall contain sufficient information to identify field contents, field formats, qualifier codes,
etc. It is very important to keep EPA informed of the completeness of the analyses. Labs may send files
periodically, before all samples are analyzed, but must inform EPA (either the Laboratory Review
Coordinator or the applicable Task Order or Work Assignment Contracting Officer Representative)
that more data are pending if a partial file is submitted. Laboratory data files must be
accompanied by text documentation describing the status of the analyses, any QA/QC problems
encountered during processing, and any other information pertaining to the quality of the data.
Following is a list of general transmittal requirements each laboratory or state-based IM group should
consider when packaging data for electronic transfer to EPA:
• Provide data in row/column data file/table structure. Further considerations:
o Include sample id provided on the sample container label in a field for each record
(row) to ensure that each data file/table record can be related to a site visit.
o Use a consistent set of column labels.
o Use file structures consistently.
o Use a consistent set of data qualifiers.
o Use a consistent set of units.
o Include method detection limit (MDL) as part of each result record.
o Include reporting limit (RL) as part of each result record.
o Provide a description of each result/QC/QA qualifier.
o Provide results/measurements/MDL/RL in numeric form.
o Maintain result qualifiers, e.g., <, ND, in a separate column.
o Use a separate column to identify record-type. For example, if QA or QC data are
included in a data file, there should be a column that allows the NCCA IM staff to
readily identify the different result types.
o Include laboratory sample identifier.
o Include batch numbers/information so results can be paired with appropriate
QA/QC information.
o Include "True Value" concentrations, if appropriate, in QA/QC records.
o Include a short description of preparation and analytical methods used, either as
part of the record or as a separate description for the test(s) performed on the
sample. For example, EPAxxxx.x, ASTMxxx.x, etc. Provide a broader description, e.g.,
citation, if a non-standard method is used.
o Include a short description of instrumentation used to acquire the test result
(where appropriate). This may be reported either as part of the record or as a
separate description for each test performed on the sample. For example, GC/MS-
ECD, ICP-MS, etc.
• Ensure that data ready for transfer to NCCA IM are verified and validated, and results are qualified to the extent possible (final verification and validation are conducted by EPA).
• Data results must match expectations (analysis results) as specified by contract or agreement.
• Identify and qualify missing data (explain why the data are missing).
• Submit any other associated quality assurance assessments and relevant data related to laboratory results (i.e., chemistry, nutrients). Examples include summaries of QC sample analyses (blanks, duplicates, check standards, matrix spikes, standard or certified reference materials, etc.), results for external performance evaluation or proficiency testing samples, and any internal consistency checks conducted by the laboratory.
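A lab or state IM group could screen a file against several of these requirements before submission. The sketch below assumes a CSV layout with hypothetical column names; the authoritative template comes from the Laboratory Review Coordinator, not this example.

```python
import csv
import io

# Columns the transmittal guidance above asks for. These names are an
# assumption for illustration; the real template is provided by EPA.
REQUIRED = ["sample_id", "lab_sample_id", "batch_id", "analyte",
            "result", "result_qualifier", "units", "mdl", "rl", "record_type"]

def check_submission(text):
    """Return a list of problems found in a CSV lab submission string."""
    problems = []
    rows = list(csv.DictReader(io.StringIO(text)))
    header = rows[0].keys() if rows else []
    for col in REQUIRED:
        if col not in header:
            problems.append(f"missing column: {col}")
    for i, row in enumerate(rows, start=2):  # data start on line 2
        for col in ("result", "mdl", "rl"):
            val = row.get(col, "")
            try:
                float(val)          # results/MDL/RL must be numeric;
            except ValueError:      # qualifiers like '<' or 'ND' belong
                problems.append(    # in result_qualifier, not here
                    f"line {i}: non-numeric {col}: {val!r}")
    return problems
```

A file that reports "ND" in the result column, for instance, would be flagged so the qualifier can be moved to its own column before submission.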
Labs may send electronic files by e-mail attachments or they may upload files through EPA's SharePoint
site.
4.7 Data Quality and Results Validation
Data quality is integrated throughout the life-cycle of the data. Data received at the IM center are
examined for completeness, format compatibility, and internal consistency. Field collected data quality
is evaluated using a variety of automated and other techniques. Analytical results are reviewed by
subject matter experts. Any changes (deletions, additions, corrections) are submitted to the NCCA data
center for inclusion into the validated data repository. Explanations for data changes are included in the
record history.
4.7.1 Data Entry, Scanned, or Transferred Data
Field crews record sampling event observational data in a standard and consistent manner using field
data collection forms. The NARS IM Team logs in receipt of field forms to the NARS IM database and is
responsible for maintaining hard copies of the field forms submitted by crews for 5 years, scanned PDF
versions of the field forms in NARS IM until the Office of Water determines that NARS data are no longer
needed, and the data itself in the NARS IM database for the same period. The Information
Management Coordinator also transfers a copy of the scanned
pdf version of the field forms to the NARS g:drive as a back-up.
The IM Center either optically scans or transcribes information from field collection forms into an
electronic format (sometimes using a combination of both processes). The IM Center process includes
the following:
• During the scanning process, incoming data are subjected to a number of automated error checking routines.
• Obvious errors are corrected immediately.
• Suspected errors that cannot be confirmed at the time of scanning are qualified for later review by someone with the appropriate background and experience (e.g., a chemist or aquatic ecologist).
• The process continues until the transcribed data are 100% verified or no corrections are required.
The IM Center staff conduct additional validation by executing a series of programs (e.g., computer
code) to check for correct file structure and variable naming and formats, outliers, missing data,
typographical errors and illogical or inconsistent data based on expected relationships to other
variables. Data that fail any check routine are identified in an "exception report" that is reviewed by an
appropriate scientist for resolution. The code is maintained in Corvallis, OR by the NARS IM Center (the
Information Management Coordinator).
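The kinds of automated checks described here, missing values, range checks, and logic checks feeding an exception report, might look like the following sketch. The variable names and acceptance ranges are invented for illustration; they are not the NARS IM Center's validation code.

```python
# Illustrative check routines; ranges below are assumptions, not NARS values.
EXPECTED_RANGES = {"ph": (2.0, 12.0), "do_mgl": (0.0, 20.0)}

def exception_report(records):
    """Flag missing values, out-of-range values, and illogical combinations."""
    exceptions = []
    for rec in records:
        for var, (lo, hi) in EXPECTED_RANGES.items():
            val = rec.get(var)
            if val is None:
                exceptions.append((rec["site_id"], var, "missing value"))
            elif not lo <= val <= hi:
                exceptions.append((rec["site_id"], var, f"out of range: {val}"))
        # Logic check: a sample cannot come from deeper than the station depth.
        if rec.get("sample_depth_m", 0) > rec.get("station_depth_m", float("inf")):
            exceptions.append((rec["site_id"], "sample_depth_m",
                               "exceeds station depth"))
    return exceptions
```

Records that produce exceptions would then be routed to an appropriate scientist for resolution, as described above.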
The IM Center brings any remaining questionable data to the attention of the Project QA manager and
individuals responsible for collecting the data for resolution.
4.7.2 Analytical Results Validation
All data are evaluated to determine completeness and validity. Additionally, the data are run through a
rigorous inspection using SQL queries or other computer programs such as SAS or R to check for
anomalous data values that are especially large or small, or are noteworthy in other ways. Focus is on
rare, extreme values since outliers may affect statistical quantities such as averages and standard
deviations.
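A minimal form of such an outlier screen, assuming a simple standard-deviation rule, could be sketched as follows; the actual review also relies on subject-matter experts, which no statistic replaces.

```python
import statistics

def flag_outliers(values, k=3.0):
    """Flag values more than k standard deviations from the mean.

    A simple screen for especially large or small values; flagged
    results are reviewed, not automatically discarded.
    """
    mean = statistics.mean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return []
    return [v for v in values if abs(v - mean) > k * sd]
```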
All laboratory quality assurance (QA) information is examined to determine if the laboratory met the
predefined data quality objectives - available through the Quality Assurance Project Plan (QAPP). All
questionable data will be corrected or qualified through the NCCA IM staff with support of the Project
QA Coordinator and QA team.
4.7.3 Database Changes
Data corrections are completed at the lowest level by the IM Center staff to ensure that any subsequent
updates will contain only the most correct data. Laboratory results found to be in error are sent back to
the originator (lab) for correction by the Laboratory Review Coordinator or the Project QA Coordinator.
After the originator makes any corrections, they submit the file back to the Laboratory Review Coordinator.
It is the responsibility of the Laboratory Review Coordinator to resubmit the entire batch or file to the
IM Center. The IM Center uses these resubmissions to replace any previous versions of the same data.
The IM Center uses a version control methodology when receiving files. Incoming data are not always
immediately transportable into a format compatible with the desired file structures. When these
situations occur, the IM staff creates a copy of the original data file which then becomes the working file
in which any formatting changes will take place. The original raw data will remain unchanged. This
practice further ensures the integrity of the data and provides an additional data recovery avenue,
should the need arise.
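The copy-then-work practice can be sketched as below. The directory layout is an assumption for illustration, not the actual NARS IM structure.

```python
import shutil
from pathlib import Path

def stage_submission(incoming, raw_dir, work_dir):
    """Archive the raw file unchanged and return a working copy.

    Mirrors the practice described above: formatting changes happen
    only on the copy, so the original remains available for recovery.
    """
    incoming = Path(incoming)
    raw = Path(raw_dir) / incoming.name
    work = Path(work_dir) / incoming.name
    shutil.copy2(incoming, raw)   # immutable raw archive copy
    shutil.copy2(incoming, work)  # working file for any reformatting
    return work
```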
All significant changes are documented by the IM Center staff. The IM Center includes this information
in the final summary documentation for the database (metadata).
After corrections have been applied to the data, the IM Center will rerun the validation programs to re-
inspect the data.
The IM Center may implement database auditing features to track changes.
4.8 Metadata
The LOM lists the required metadata elements for each type of laboratory analysis. Metadata for
geospatial information (e.g., latitude and longitude) follow the Federal Geographic Data Committee,
Content standard for digital geospatial metadata, version 2.0. FGDC-STD-001-1998 (FGDC 1998). The
NARS IM Center uploads and maintains in the NARS IM database all appropriate metadata as provided
via the scanned field forms, electronic field forms, and the laboratory files submitted by the Laboratory
Review Coordinator.
4.9 Information Management Operations
4.9.1 Computing Infrastructure
The NCCA uses a centralized information management system to maintain electronic data in the primary
data repository at the NARS IM center that is housed at the Western Ecology Division. This server at the
NARS IM center stores all NCCA data in SQL native tables, SAS native data sets, or R data sets within a
central Windows Server 2003 R2 (current configuration) or higher computing platform.
4.9.2 Data Security and Accessibility
EPA ensures that all data files in the IM system are protected from corruption by computer viruses,
unauthorized access, and hardware and software failures (the server is protected through the general
EPA IT program, not specifically by the NARS program or the NARS IM team). The NARS IM team
manages who can access the SQL server which is limited to NARS IM staff and a few OW staff with
permission of the NARS Team Leader. NARS IM maintains the code for updating data, data validation
and quality assurance on EPA servers (e.g., share drives) and it is backed up by EPA's standard policies.
Prior to the release of the report, the NCCA team makes raw and preliminary data files accessible only to
the NCCA analysts, collaborators or others authorized by the NCCA Project Leader. The NCCA
Information Management Coordinator protects these preliminary data files from unauthorized use or
accidental changes by publishing them on a secure SharePoint site. Data analysis team members can
download the files and upload files with new names, but cannot edit or delete the originally posted
version. When the team is ready to release data to additional collaborators (e.g., states), the Laboratory
Review Coordinator, working with the NCCA Project Leader and the Information Management
Coordinator posts files in a NCCA SharePoint folder that can be accessed as read-only by authorized
partners.
EPA routinely stores data generated, processed, and incorporated into the IM system according to EPA's
overarching IT policies and procedures for back-ups. This ensures that if one system is corrupted or
destroyed, IM staff can work with the EPA IT staff and contractors to reconstruct the databases.
Data security and accessibility standards implemented for NCCA IM meet EPA's standard security
authentication (i.e., username, password) process in accordance with EPA's Information Management
Security Manual (1999; EPA Directive 2195 Al) and EPA Order 2195.1 A4 (2001D). The NCCA team
provides any data sharing through an authenticated site.
4.9.3 Life Cycle
The NCCA team, partners, and others can retrieve data electronically throughout the records retention
and disposition lifecycle, or as practicable.
4.9.4 Data Recovery and Emergency Backup Procedures
EPA security maintains a back-up copy of the server on which the NARS IM system resides which
includes NARS data files and programs for processing the data. All laboratories generating data and
developing data files are expected to establish procedures for backing up and archiving computerized
data. The Laboratory Review Coordinator maintains copies of laboratory submitted files on NCCA
g:drive.
4.9.5 Long-Term Data Accessibility and Archive
When the NCCA report is released to the public, the Data Finalization Team works with the Information
Management Coordinator and the NCCA Project Leader to prepare files of all data used in the report for
posting on EPA's website. The Data Finalization Team works with the NARS microsite owner
to post .csv files of the data and .txt file of related metadata. Additionally, following release of the final
report, the Data Finalization Team works with the OW Water Quality Exchange (WQX) team to transfer
the NCCA data to EPA's WQX system for long-term access by the public. WQX is a repository for water
quality, biological, and physical data and is used by state environmental agencies, EPA and other federal
agencies, universities, private citizens, and many others. Revised from STORET, WQX provides a
centralized system for storage of physical, chemical, and biological data and associated analytical tools
for data analysis. Once uploaded, states and tribes and the public will be able to download data from
EPA's website along with other water quality data.
4.10 Records Management
The NARS IM team maintains and tracks removable storage media (e.g., CDs, diskettes, tapes) and paper
records in a centrally located area at the NCCA IM center. As noted previously, the NARS IM Team logs
receipt of field forms into the NARS IM database and is responsible for maintaining: hard copies of the
field forms submitted by crews, for 5 years; scanned PDF versions of the field forms in NARS IM, until the
Office of Water determines that NARS data are no longer needed; and the data itself in the NARS IM
database, until the Office of Water determines that NARS data are no longer needed. The Information
Management Coordinator also transfers a copy of the scanned PDF versions of the field forms to the
NARS g:drive as a back-up. Records retention and disposition comply with U.S. EPA Directive 2160,
Records Management Manual (July 1984), in accordance with the Federal Records Act of 1950.
5 INDICATORS
This section of the QAPP provides summary information on laboratory and field performance and quality
control measures for the NCCA 2015 indicators. Additional details are described in the NCCA 2015 Field
Operations Manual and Laboratory Operations Manual. A description of the NCCA indicators and where
each is collected is found in Table 2.
Table 2. Description of NCCA 2015 Indicators and Location Where Indicators are Collected

In situ measurements [salinity (marine), conductivity (freshwater), temperature, DO, pH, depth]
  Description: Measurements taken to detect extremes in condition that might indicate impairment, and depth at the location.
  Location of sample collection: One set of measurements taken at the index site; readings are taken on a profile through the water column at the index site.

Secchi/light measurements (PAR)
  Description: Measurements to look at clarity.
  Location of sample collection: Measured at the index site.

Water chemistry [filtered sample for dissolved inorganic NO2, NO3, NH4, PO4; unfiltered sample for total N and P]
  Description: Water chemistry measurements will be used to determine nutrient enrichment/eutrophication.
  Location of sample collection: Collected from a depth of 0.5 m at the index site.

Chlorophyll-a
  Description: Chlorophyll-a is used to determine algal biomass in the water.
  Location of sample collection: Collected as part of the water chemistry sample.

Microcystins
  Description: Measurement used to determine the harmful algal bloom biomass in the water.
  Location of sample collection: Collected from a depth of 0.5 m at the index site.

Benthic invertebrate assemblage
  Description: Benthic invertebrate community information is used to assess the biological health of estuarine and Great Lakes waters. The NCCA will measure attributes of the overall structure and function of the benthic community (diversity, abundances, etc.) to evaluate biological integrity.
  Location of sample collection: Collected from a sediment grab at the index site.

Sediment chemistry
  Description: Measurement to determine contaminant levels in sediment.
  Location of sample collection: Collected from a sediment grab at the index site.

Sediment toxicity
  Description: Measurement to determine level of toxicity of sediment.
  Location of sample collection: Collected from a sediment grab at the index site.

Whole fish tissue
  Description: Measurement to determine contaminant levels in whole-body fish for ecological assessment.
  Location of sample collection: Target species collected within a 500 meter radius of the X-site (may expand to 1000 meters if needed).

Fecal indicator (Enterococci)
  Description: Enterococci are bacteria that are endemic to the guts of warm-blooded creatures. These bacteria, by themselves, are not considered harmful to humans but often occur in the presence of potential human pathogens (the definition of an indicator organism).
  Location of sample collection: Collected from a depth of 0.5 m at the index site.

Fish tissue plug
  Description: Fish tissue plugs will provide information on the national distribution of Hg, a bioaccumulative and toxic chemical in fish species.
  Location of sample collection: Target species collected within a 500 meter radius of the X-site (may expand to 1000 meters if needed).

Fish tissue fillet
  Description: Fish tissue fillet samples for Hg and PFCs will focus on analysis of fillet tissue because of associated human consumption and health risk implications.
  Location of sample collection: Target species collected at a subset of Great Lakes sites within a 500 meter radius of the X-site (may expand to 1500 meters if needed).

Algal toxins
  Description: Research indicator. Measurement used to look at concentrations of harmful algal toxins in coastal waters.
  Location of sample collection: Collected from a depth of 0.5 m at the index site.
5.1 In Situ Measurements
The first activities that should be conducted by crews upon arriving onsite are those that involve water
column measurements; these data need to be collected before disturbing bottom sediments.
5.1.1 Introduction
Crews make in situ measurements using field meters, and data are recorded on standardized data
forms. Field crews will measure dissolved oxygen (DO), pH, conductivity (fresh water) or salinity
(marine), and temperature using a multi-parameter water quality meter. Crews use a meter to read
photosynthetically active radiation (PAR) throughout the photic zone. Crews measure secchi disk depth
as well. At Great Lakes sites, crews will take underwater video at each site.
5.1.2 Sample Design and Methods
Detailed sample collection and handling procedures are described in the NCCA 2015 Field Operations
Manual.
5.1.3 Pertinent Laboratory QA/QC Procedures
Not applicable for in situ measurements.
5.1.4 Pertinent Field QA/QC Procedures
Several pieces of equipment that crews may use to collect or analyze environmental data for the NCCA
should have periodic maintenance and calibration verification performed by manufacturer's
representatives or service consultants. These procedures should be documented by date and the
signature of the person performing the inspection. Examples include:
CTDs or multiparameter probes - annual (or as-needed) maintenance and calibration check by the
manufacturer or a certified service center;
Light (PAR) meters - biannual verification of the calibration coefficient by the manufacturer;
Video cameras - as-needed maintenance as described in the manufacturer's information.
Crews will maintain all other sampling gear and laboratory instrumentation in good repair as per
manufacturer's recommendations to ensure proper function.
5.1.4.1 Field Performance Requirements
Measurement data quality objectives (measurement DQOs or MQOs) are given in Table 3. General
requirements for comparability and representativeness are addressed in Section 2.
Table 3. Measurement Data Quality Objectives: Water Indicators
Variable or Measurement    Maximum Allowable Accuracy Goal (Bias)    Maximum Allowable Precision Goal (%RSD)    Completeness
Oxygen, dissolved          ±0.5 mg/L              10%    95%
Temperature                ±1 °C                  10%    95%
Conductivity               ±1 µS/cm               10%    95%
Salinity                   ±1 ppt                 10%    95%
Depth                      ±0.5 m                 10%    95%
pH                         ±0.3 SU                10%    95%
PAR                        0.01 µmol s-1 m-2 *    5%     95%
Secchi Depth               ±0.5 m                 10%    95%
*Determined by biannual manufacturer calibration.
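The precision goals above are expressed as percent relative standard deviation (%RSD) of replicate readings. As an illustrative sketch only (the MQO values come from Table 3, but the function and key names are not defined by the QAPP), the check could be computed as follows:

```python
# Hedged sketch: screening replicate field readings against the Table 3
# precision MQOs (%RSD). Names and example values are illustrative.
from statistics import mean, stdev

PRECISION_MQO_RSD = {  # maximum allowable precision goal (%RSD), Table 3
    "dissolved_oxygen": 10.0, "temperature": 10.0, "conductivity": 10.0,
    "salinity": 10.0, "depth": 10.0, "ph": 10.0, "par": 5.0,
    "secchi_depth": 10.0,
}

def percent_rsd(values):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * stdev(values) / mean(values)

def meets_precision_mqo(variable, replicates):
    """True if replicate readings satisfy the Table 3 precision goal."""
    return percent_rsd(replicates) <= PRECISION_MQO_RSD[variable]

# Two DO readings of 7.9 and 8.1 mg/L give %RSD of about 1.8%, well under 10%.
print(round(percent_rsd([7.9, 8.1]), 2))                    # -> 1.77
print(meets_precision_mqo("dissolved_oxygen", [7.9, 8.1]))  # -> True
```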
5.1.4.2 Field Quality Control Requirements
For in situ measurements, each field instrument (e.g., multi-probe) used by the crews must be
calibrated, inspected prior to use, and operated according to manufacturer specifications. Figure 8
illustrates the general scheme for field chemistry measurement procedures.
5.1.4.3 Instrumentation
Sea-Bird CTDs and Multiparameter Probes: Sea-Bird CTDs and multiparameter probes are routinely
used in estuarine, Great Lakes, deep water or oceanographic surveys to measure and electronically
log various water column parameters. When properly maintained and serviced, they have an
established history of dependable utilization. The units can be configured with different arrays of
probes; for the purposes of the NCCA, when used, crews will equip them to measure DO,
temperature, salinity/conductivity, pH, and depth. Crews will follow the NCCA Field Operations
Manual and manufacturer's instructions for use of these instruments.
For instruments that are factory calibrated and checked (e.g. Sea-Bird Electronics meters, etc.),
crews must ensure that factory-certified diagnostics have been completed according to
manufacturer specifications (preferably conducted immediately prior to the sampling season) and
provide documentation copies during assistance visits. Meters such as these do not require the daily
calibration steps or the weekly diagnostic/QCS checks. Table 4 includes field quality control
measures for multiparameter probes.
[Flowchart, "Field Measurement Process: Water Chemistry Indicator": pre-departure check (probe
inspection, electronic checks, test calibration; replace the probe and/or instrument if a check fails) ->
field calibration -> QC sample and performance evaluation measurements -> field measurements taken
and data recorded, with QC sample and duplicate measurements -> data form review (qualify data,
correct errors) -> accept for data entry.]
Figure 8. Field Measurement Process for Water Chemistry Samples.
Table 4. Field Quality Control: Multiparameter Meter Indicator
Check Description: Verify performance of temperature probe using wet ice.
  Frequency: Prior to initial sampling, daily thereafter.
  Acceptance Criteria: Functionality = ±0.5 °C.
  Corrective Action: See manufacturer's directions.

Check Description: Verify depth against markings on cable.
  Frequency: Daily.
  Acceptance Criteria: ±0.2 m.
  Corrective Action: Re-calibrate.

Check Description: pH - internal electronic check if equipped; if not, check against Quality Check Solution (QCS).
  Frequency: At the beginning and end of each day.
  Acceptance Criteria: Alignment with instrument manufacturer's specifications, or QCS measurement in range.
  Corrective Action: AM: Re-calibrate. PM: Flag day's data; pH probe may need maintenance.

Check Description: Conductivity (Great Lakes only) - internal electronic check if equipped; if not, check against Quality Check Solution.
  Frequency: At the beginning and end of each day.
  Acceptance Criteria: Alignment with instrument manufacturer's specifications, or within ±2 µS/cm or ±10% of the QCS value.
  Corrective Action: AM: Re-calibrate. PM: Flag day's data; instrument may need repair.

Check Description: Salinity (marine only) - internal electronic check if equipped; if not, check against Quality Check Solution.
  Frequency: At the beginning and end of each day.
  Acceptance Criteria: Alignment with instrument manufacturer's specifications, or within ±0.2 ppt of the QCS value.
  Corrective Action: AM: Re-calibrate. PM: Flag day's data; instrument may need repair.

Check Description: Check DO calibration in field against atmospheric standard (ambient air saturated with water).
  Frequency: At the beginning and end of each day.
  Acceptance Criteria: ±1.0 mg/L.
  Corrective Action: AM: Re-calibrate. PM: Flag day's data; change membrane and re-check.
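The acceptance criteria and AM/PM corrective-action pattern in Table 4 can be sketched in code. This is an illustrative sketch only: the tolerances are taken from Table 4, while the function and parameter names are hypothetical.

```python
# Sketch of the daily multiparameter-meter QC logic in Table 4: compare a
# probe reading against its standard and apply the AM/PM corrective rule.

TOLERANCES = {                 # acceptance criteria from Table 4
    "temperature_C": 0.5,      # wet-ice check, ±0.5 °C
    "depth_m": 0.2,            # cable-marking check, ±0.2 m
    "salinity_ppt": 0.2,       # vs. QCS value, ±0.2 ppt
    "do_mg_per_L": 1.0,        # vs. air-saturated standard, ±1.0 mg/L
}

def qc_check(parameter, measured, expected, morning):
    """Return (passed, action) for one beginning- or end-of-day check."""
    passed = abs(measured - expected) <= TOLERANCES[parameter]
    if passed:
        action = "none"
    elif morning:
        action = "re-calibrate before sampling"
    else:
        action = "flag day's data; instrument may need maintenance"
    return passed, action

print(qc_check("do_mg_per_L", 8.4, 8.0, morning=True))      # within ±1.0 mg/L
print(qc_check("salinity_ppt", 30.5, 30.0, morning=False))  # outside ±0.2 ppt
```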
LICOR PAR meter: No daily field calibration procedures are required for the LICOR light meter; however,
the manufacturer recommends that the instrument be returned to the factory for bi-annual calibration
check and resetting of the calibration coefficient. Calibration kits are available from LICOR and this
procedure can be performed at the laboratory (see the LICOR operation manual). There are several field
QC measures that crews will take to help ensure accurate measurements of light penetration.
The "deck" sensor must be situated in full sunlight (i.e., out of any shadows).
Likewise, the submerged sensor must be deployed from the sunny side of the vessel and care
should be taken to avoid positioning the sensor in the shadow of the vessel.
For the comparative light readings of deck and submerged sensors, (ratio of ambient vs.
submerged), the time interval between readings should be minimized (approximately 1 sec).
Secchi Disk: No field calibration procedures are required for the Secchi disk. QC procedures that crews
will implement when using the Secchi disk to make water clarity measurements include designating a
specific crew member as the Secchi depth reader; taking all measurements from the shady side of the
boat (unlike LICOR measurements which are taken from the sunny side); and not wearing sunglasses or
hats when taking Secchi readings.
Underwater Video (Great Lakes only): No field calibration of camera is required but crews should check
the equipment prior to each field day to assure that it is operational. Crews will charge the battery
regularly.
5.1.4.4 Data Reporting
Data reporting units and significant figures are summarized in Table 5.
Table 5. Data Reporting Criteria: Field Measurements
Measurement         Units             No. Significant Figures    Maximum No. Decimal Places
Dissolved Oxygen    mg/L              2    1
Temperature         °C                2    1
pH                  pH units          3    1
Conductivity        µS/cm at 25 °C    3    1
Salinity            ppt               2    1
PAR                 µE/m2/s           2    1
Depth               meters            3    1
Secchi Depth        meters            3    1
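The reporting rules above combine a significant-figure limit with a decimal-place cap. A minimal sketch of how a reported value could be derived under those rules (illustrative only; the QAPP specifies the criteria, not this algorithm):

```python
# Sketch: round to the allowed number of significant figures, then cap the
# number of decimal places, per the Table 5 reporting criteria.
from math import floor, log10

def report_value(x, sig_figs, max_decimals):
    """Round x to sig_figs significant figures, then to max_decimals places."""
    if x == 0:
        return 0.0
    magnitude = floor(log10(abs(x)))              # exponent of leading digit
    rounded = round(x, sig_figs - 1 - magnitude)  # significant-figure rounding
    return round(rounded, max_decimals)           # decimal-place cap

# Dissolved oxygen: 2 significant figures, at most 1 decimal place
print(report_value(8.4567, 2, 1))   # -> 8.5
# Depth: 3 significant figures, at most 1 decimal place
print(report_value(12.345, 3, 1))   # -> 12.3
```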
5.1.5 Data Review
See Table 6 for data validation quality control.
Table 6. Data Validation Quality Control for In-Situ Indicator.
Activity or Procedure: Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots)
  Requirements and Corrective Action: Correct reporting errors or qualify as suspect or invalid.

Activity or Procedure: Review data from calibration and field notes
  Requirements and Corrective Action: Determine impact and possible limitations on overall usability of data.
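The range-check step in Table 6 can be sketched as a screening pass that qualifies, rather than deletes, out-of-range values. The screening ranges below are hypothetical placeholders for illustration; the QAPP does not prescribe them.

```python
# Sketch of a range-check pass: attach a "suspect" qualifier to values that
# fall outside a plausible physical range. Ranges here are illustrative only.

PLAUSIBLE_RANGES = {               # hypothetical screening ranges
    "temperature_C": (-2.0, 40.0),
    "do_mg_per_L": (0.0, 20.0),
    "ph": (2.0, 12.0),
}

def range_check(records):
    """Attach an 'ok' or 'suspect' qualifier to each measurement record."""
    flagged = []
    for rec in records:
        lo, hi = PLAUSIBLE_RANGES[rec["parameter"]]
        qualifier = "ok" if lo <= rec["value"] <= hi else "suspect"
        flagged.append({**rec, "qualifier": qualifier})
    return flagged

data = [{"parameter": "ph", "value": 7.8}, {"parameter": "ph", "value": 14.2}]
for row in range_check(data):
    print(row["value"], row["qualifier"])   # 7.8 ok / 14.2 suspect
```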
5.2 Water Chemistry Measurements (Including Chlorophyll-a)
5.2.1 Introduction
Water chemistry indicators based on field and laboratory methods evaluate estuarine and Great Lakes
condition with respect to nutrient over-enrichment and eutrophication. Data are collected for a variety
of physical and chemical constituents to provide information on water clarity, primary productivity,
and nutrient status. Data are collected for chlorophyll-a to provide information on the algal loading and
gross biomass of blue-green and other algae.
5.2.2 Sample Design and Methods
Detailed sample collection and handling procedures are described in the NCCA 2015 Field Operations
Manual. Detailed laboratory methods are in the NCCA 2015 Laboratory Operations Manual.
5.2.3 Pertinent Laboratory QA/QC Procedures
A single central laboratory and some State laboratories will analyze the water chemistry samples. The
specific quality control procedures used by each laboratory are implemented to ensure that:
Objectives established for various data quality indicators are being met.
Results are consistent and comparable among all participating laboratories.
The central laboratory demonstrated in previous studies that it can meet the required Laboratory
Reporting Levels (RLs) (USEPA 2004). All laboratories will follow the QA/QC procedures outlined in the
NCCA 2015 QAPP and the LOM. A summary and diagram of the QA processes related to water chemistry
samples for the NCCA 2015 is found in Figure 9.
[Flowchart: QC samples (laboratory blank, fortified sample, laboratory split sample, QC check samples
(QCCS), internal reference sample, LT-MDL QCCS) are prepared and inserted into each sample batch
during sample processing; evidence of contamination or biased calibration triggers a recheck,
re-calibration, and re-analysis of previous samples, or qualification of the batch for possible re-analysis,
before the batch is accepted for entry and verification.]
Figure 9. Analysis Activities for Water Chemistry Samples
5.2.3.1 Laboratory Performance Requirements
Table 7 summarizes the pertinent laboratory measurement data quality objectives for the water
chemistry indicators.
Table 7. Measurement Data Quality Objectives: Water Chemistry Indicator and Chlorophyll-a
Ammonia (NH3)
  Units: mg N/L
  Potential Range of Samples (1): 0 to 17
  Method Detection Limit Objective (2): 0.01 marine (0.7 µeq/L); 0.02 freshwater
  Transition Value (3): 0.10
  Precision Objective (4): ±0.01 or ±10%
  Accuracy Objective (5): ±0.01 or ±10%

Chloride (Cl) (Great Lakes only)
  Units: mg Cl/L
  Potential Range of Samples (1): 0 to 5,000
  Method Detection Limit Objective (2): 0.20 (6 µeq/L)
  Transition Value (3): 1
  Precision Objective (4): ±0.10 or ±10%
  Accuracy Objective (5): ±0.10 or ±10%

Conductivity
  Units: µS/cm at 25 °C
  Potential Range of Samples (1): 1 to 66,000
  Method Detection Limit Objective (2): 1.0
  Transition Value (3): 20
  Precision Objective (4): ±2 or ±10%
  Accuracy Objective (5): ±2 or ±5%

Nitrate-Nitrite (NO3-NO2)
  Units: mg N/L
  Potential Range of Samples (1): 0 to 360 (as nitrate)
  Method Detection Limit Objective (2): 0.01 marine; 0.02 freshwater
  Transition Value (3): 0.10
  Precision Objective (4): ±0.01 or ±10%
  Accuracy Objective (5): ±0.01 or ±10%

pH (Laboratory)
  Units: Std Units
  Potential Range of Samples (1): 3.5 to 10
  Method Detection Limit Objective (2): N/A
  Transition Value (3): 5.75, 8.25
  Precision Objective (4): <5.75 or >8.25 = ±0.07; 5.75-8.25 = ±0.15
  Accuracy Objective (5): <5.75 or >8.25 = ±0.15; 5.75-8.25 = ±0.05

Total Nitrogen (TN)
  Units: mg N/L
  Potential Range of Samples (1): 0.1 to 90
  Method Detection Limit Objective (2): 0.01
  Transition Value (3): 0.10
  Precision Objective (4): ±0.01 or ±10%
  Accuracy Objective (5): ±0.01 or ±10%

Total Phosphorus (TP) and ortho-Phosphate
  Units: mg P/L
  Potential Range of Samples (1): 0 to 22 (as TP)
  Method Detection Limit Objective (2): 0.002
  Transition Value (3): 0.02
  Precision Objective (4): ±0.002 or ±10%
  Accuracy Objective (5): ±0.002 or ±10%

Nitrate (NO3)
  Units: mg N/L
  Potential Range of Samples (1): 0 to 360
  Method Detection Limit Objective (2): 0.01 marine (10.1 µeq/L); 0.03 freshwater
  Transition Value (3): 0.1
  Precision Objective (4): ±0.01 or ±5%
  Accuracy Objective (5): ±0.01 or ±5%

Sulfate (SO4)
  Units: mg/L
  Potential Range of Samples (1): 0 to 5000
  Method Detection Limit Objective (2): 0.5 freshwater (10.4 µeq/L)
  Transition Value (3): 2.5
  Precision Objective (4): ±0.25 or ±10%
  Accuracy Objective (5): ±0.25 or ±10%

Chlorophyll-a
  Units: µg/L in extract
  Potential Range of Samples (1): 0.7 to 11,000
  Method Detection Limit Objective (2): 1.5
  Transition Value (3): 15
  Precision Objective (4): ±1.5 or ±10%
  Accuracy Objective (5): ±1.5 or ±10%
1 Estimated from samples analyzed at the EPA Western Ecological Division-Corvallis laboratory between 1999 and 2005
2 The method detection limit is determined as a one-sided 99% confidence interval from repeated measurements of a low-level
standard across several calibration curves.
3 Value for which absolute (lower concentrations) vs. relative (higher concentrations) objectives for precision and accuracy are
used.
4 For duplicate samples, precision is estimated as the pooled standard deviation (calculated as the root-mean square) of all
samples at the lower concentration range, and as the pooled percent relative standard deviation of all samples at the higher
concentration range. For standard samples, precision is estimated as the standard deviation of repeated measurements across
batches at the lower concentration range, and as percent relative standard deviation of repeated measurements across batches
at the higher concentration range.
5 Accuracy is estimated as the difference between the measured (across batches) and target values of performance evaluation
and/or internal reference samples at the lower concentration range, and as the percent difference at the higher concentration
range.
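Footnote 2 defines the MDL as a one-sided 99% confidence interval from repeated measurements of a low-level standard. A common implementation of that definition, sketched here under the assumption that the bound is computed as the one-sided Student's t critical value times the replicate standard deviation (the replicate values are illustrative):

```python
# Sketch of an MDL estimate: MDL = t(0.99, n-1) * s, where s is the standard
# deviation of n replicate low-level standard measurements.
from statistics import stdev

# One-sided 99% Student's t critical values, indexed by degrees of freedom
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def mdl(replicates):
    """MDL estimate from replicate low-level standard measurements."""
    n = len(replicates)
    return T_99[n - 1] * stdev(replicates)

# Seven replicate measurements of a low-level nutrient standard (mg N/L)
reps = [0.011, 0.009, 0.010, 0.012, 0.008, 0.010, 0.011]
print(round(mdl(reps), 4))   # -> 0.0042
```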
5.2.3.2 Laboratory Quality Control Requirements
Table 8 summarizes the pertinent laboratory quality control samples for the water chemistry
indicators.
Table 8. Laboratory Quality Control Samples: Water Chemistry Indicator
QC Sample Type and Description: Demonstrate competency for analyzing water samples to meet the performance measures
  Indicators: All
  Description: Demonstration of past experience with water samples in achieving the method detection limits.
  Frequency: Once
  Acceptance Criteria: See Appendix A of the LOM.
  Corrective Action: EPA will not approve any laboratory for NCCA sample processing if the laboratory cannot demonstrate competency. In other words, EPA will select another laboratory that can demonstrate competency for its NCCA samples.

QC Sample Type and Description: Check condition of sample when it arrives
  Indicators: All
  Description: Sample issues such as cracked container; missing label; temperature; adherence to holding time requirements; sufficient volume for test.
  Frequency: Once
  Acceptance Criteria: No sample issues, or determination that the sample can still be analyzed.
  Corrective Action: Lab determines whether the sample can be analyzed or has been too severely compromised (e.g., contamination). Assign the appropriate condition code identified in Table 7.1 of the LOM.

QC Sample Type and Description: Store sample appropriately
  Indicators: All
  Description: Check the temperature of the refrigerator per the laboratory's standard operating procedures.
  Frequency: Record the temperature of the sample upon arrival at the laboratory. Check the temperature of the refrigerator/freezer where samples are stored at least daily if using a continuous temperature logger, and twice daily (once at the beginning of the day and once at the end) if not using a continuous logger.
  Acceptance Criteria: While stored at the laboratory, the sample must be kept at a maximum temperature of 4 °C (for all aliquots except chlorophyll-a) and -20 °C for the chlorophyll-a sample.
  Corrective Action: If at any time samples are warmer than required, note the temperature and duration (either from the continuous temperature log or from the last manual reading) in the comment field. The lab will still perform the test. EPA expects that the laboratory will exercise every effort to maintain samples at the correct temperature.

QC Sample Type and Description: Analyze sample within holding time
  Indicators: All
  Acceptance Criteria: The test must be completed within the holding time specified in the analytical method.

QC Sample Type and Description: Analyze Laboratory/Reagent Blank
  Indicators: All
  Frequency: Once per day, prior to sample analysis
  Acceptance Criteria: Control limits

QC Sample Type and Description: Analyze Filtration Blank
  Indicators: All dissolved analytes
  Description: ASTM Type II reagent water processed through the filtration unit.
  Frequency: Prepare once per week and archive; prepare a filter blank for each box of 100 filters, and examine the results before any other filters are used from that box.

QC Sample Type and Description: Determine LT-MDL Limit for Quality Control Check Sample (QCCS)
  Indicators: All
  Description: Prepared so concentration is four to six times the LT-MDL objective.
  Frequency: Once per day

QC Sample Type and Description: Analyze Calibration QCCS
  Indicators: All
  Frequency: Before and after sample analyses

QC Sample Type and Description: Analyze Laboratory Duplicate Sample
  Indicators: All
  Frequency: One per batch
  Corrective Action: Prepare and analyze a split from a different sample (volume permitting). Review the precision of QCCS measurements for the batch. Check the preparation of the split sample. Qualify all samples in the batch for possible reanalysis.

QC Sample Type and Description: Analyze Standard Reference Material (SRM)
  Indicators: When available for a particular indicator
  Frequency: One analysis in a minimum of five separate batches
  Acceptance Criteria: Manufacturer's certified range
  Corrective Action: Analyze the standard in the next batch to confirm the suspected inaccuracy. Evaluate calibration and QCCS solutions and standards for contamination and preparation error. Correct before any further analyses of routine samples are conducted. Reestablish control by three successive reference standard measurements that are acceptable. Qualify all sample batches analyzed since the last acceptable reference standard measurement for possible reanalysis.

QC Sample Type and Description: Analyze Matrix Spike Samples
  Indicators: Only prepared when samples with potential for matrix interferences are encountered
  Frequency: One per batch
  Acceptance Criteria: Control limits for recovery cannot exceed 100 ± 20%
  Corrective Action: Select two additional samples and prepare fortified subsamples. Reanalyze all suspected samples in the batch by the method of standard additions: prepare three subsamples (unfortified, fortified with solution approximately equal to the endogenous concentration, and fortified with solution approximately twice the endogenous concentration).

QC Sample Type and Description: Use consistent units for QC samples and field samples
  Indicators: All
  Description: Verify that all units are provided consistently within each indicator.
  Frequency: Data reporting
  Acceptance Criteria: For each indicator, all field and QC samples are reported with the same measurement units.
  Corrective Action: If it is not possible to provide the results in consistent units, then assign a QC code and describe the reason for the different units in the comments field of the database.

QC Sample Type and Description: Maintain completeness
  Indicators: All
  Description: Determine completeness.
  Frequency: Data reporting
  Acceptance Criteria: Completeness objective is 95% for all indicators (usable with or without flags).
  Corrective Action: Contact the EPA HQ NCCA Laboratory Review Coordinator* immediately if issues affect the laboratory's ability to meet the completeness objective.
*Chapter 2 of the LOM provides contact information for the EPA HQ NCCA Laboratory Review Coordinator. Laboratories under
contract to EPA must contact the Task Order's Contracting Officer's Representative (TOCOR) instead of the Laboratory Review
Coordinator.
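The 95% completeness objective can be evaluated as the share of expected results that are usable (with or without flags). A trivial sketch, with illustrative counts:

```python
# Sketch: completeness as percent of expected analytical results that are
# usable, compared against the 95% objective. Counts are illustrative.

def completeness(usable, expected):
    """Percent of expected analytical results that are usable."""
    return 100.0 * usable / expected

pct = completeness(usable=691, expected=720)
print(round(pct, 1), pct >= 95.0)   # -> 96.0 True
```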
5.2.3.3 Data Reporting
Data reporting units and significant figures are summarized in Table 9.
Table 9. Data Reporting Criteria: Water Chemistry Indicator
Measurement                  Units             No. Significant Figures    Maximum No. Decimal Places
Total phosphorus             mg P/L            3    3
Total nitrogen               mg N/L            3    2
Nitrate-Nitrite              mg/L as N         3    2
Ammonia                      mg/L as N         3    2
Chlorophyll-a                µg/L              2    1
pH (laboratory)              pH units          3    2
Conductivity (Laboratory)    µS/cm at 25 °C    3    1
5.2.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA FOM. That quality is enhanced
by the training and experience of project staff and documentation of sampling activities. Field crews will
verify that all sample containers are uncontaminated and intact, and that all sample labels are legible
and intact.
Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the water chemistry sample on the Sample Collection
Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Store the CHEM and NUTS indicators on wet ice in a cooler. Maintain CHLA filters frozen until
shipping on wet ice.
Recheck all forms and labels for completeness and legibility.
5.2.4.1 Field Performance Requirements
Not Applicable
5.2.4.2 Field Quality Control Requirements
See Table 10 and Table 11 for quality control activities and corrective actions.
Table 10. Sample Field Processing Quality Control Activities: Water Chemistry Indicator (CHEM)
Quality Control Activity: Water Chemistry Container and Preparation
  Description and Requirements: Rinse collection bottles 3x with ambient water before collecting water samples.
  Corrective Action: Discard sample. Rinse bottle and refill.

Quality Control Activity: Sample Storage
  Description and Requirements: Store samples in darkness at 4 °C. Ship on wet ice within 24 hours of collection.
  Corrective Action: Qualify sample as suspect for all analyses.
Table 11. Sample Field Processing Quality Control: Chlorophyll-a (CHLA) and Dissolved Nutrient (NUTS)
Indicators
Quality Control Activity: Chlorophyll-a Containers and Preparation
  Description and Requirements: Rinse collection bottles 3x with ambient water before collecting water samples.
  Corrective Action: Discard sample. Rinse bottle and refill.

Quality Control Activity: Holding Time
  Description and Requirements: Complete filtration of chlorophyll-a after all water samples are collected.
  Corrective Action: Qualify samples.

Quality Control Activity: Filtration (done in field)
  Description and Requirements: Use Whatman 0.7 µm GF/F filter. Filtration pressure should not exceed 3.4 psig to avoid rupture of fragile algal cells. Rinse the sample bottle for dissolved nutrients (NUTS) 3x with 10-20 mL of filtrate before collecting 250 mL of filtrate for analysis.
  Corrective Action: Discard and refilter.

Quality Control Activity: Sample Storage
  Description and Requirements: CHLA: Filters are placed in a centrifuge tube wrapped in a foil square and stored on dry ice in the field. NUTS: Filtrate is stored on wet ice in the field. CHLA and NUTS are shipped on wet ice along with water chemistry (CHEM).
  Corrective Action: Qualify sample as suspect.
5.2.5 Data Review
Checks made of the data in the process of review and verification are summarized in Table 12. The NCCA
Project QA Coordinator is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members.
Table 12. Data Validation Quality Control for Water Chemistry Indicator
Activity or Procedure: Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots)
  Requirements and Corrective Action: Correct reporting errors or qualify as suspect or invalid.

Activity or Procedure: Review holding times
  Requirements and Corrective Action: Qualify value for additional review.

Activity or Procedure: Review data from QA samples (laboratory PE samples and interlaboratory comparison samples)
  Requirements and Corrective Action: Determine impact and possible limitations on overall usability of data.
5.3 Microcystins
5.3.1 Introduction
Crews will collect a water sample at the index site to measure concentrations of total microcystins, an
algal toxin.
5.3.2 Sample Design and Methods
Detailed sample collection and handling procedures are found in the NCCA 2015 Field Operations
Manual. Detailed laboratory methods are in the NCCA 2015 Laboratory Operations Manual.
5.3.3 Pertinent Laboratory QA/QC Procedures
A single central laboratory and some State laboratories will analyze the microcystins samples. The
specific quality control procedures used by each laboratory are implemented to ensure that:
Objectives established for various data quality indicators are being met.
Results are consistent and comparable among all participating laboratories.
All laboratories will follow the procedures outlined in the NCCA 2015 QAPP and the LOM.
5.3.3.1 Laboratory Performance Requirements
Performance requirements for the microcystins indicator are listed in Table 13.
Table 13. Measurement Quality Objectives for Microcystins

Microcystins, undiluted samples with salinities <3.5 parts per thousand (ppt)
  Units: µg/L
  Method Detection Limit Objective: 0.1
  Reporting Limit Objective: 0.15

Microcystins, undiluted samples with salinity greater than or equal to 3.5 ppt
  Units: µg/L
  Method Detection Limit Objective: 0.175
  Reporting Limit Objective: 0.263

Microcystins, diluted samples with salinities <3.5 ppt
  Units: µg/L
  Method Detection Limit Objective: 0.1 times the dilution factor
  Reporting Limit Objective: Will vary

Microcystins, diluted samples with salinity greater than or equal to 3.5 ppt
  Units: µg/L
  Method Detection Limit Objective: 1.75 times the dilution factor
  Reporting Limit Objective: Will vary
5.3.3.2 Laboratory Quality Control Requirements
Quality control requirements for the microcystins indicator are listed in Table 14. Sample receipt and
other processing requirements are listed in Table 15.
Table 14. Sample Analysis Quality Control Activities and Objectives for Microcystins
Quality
Activity
Kit-SheIf Life
Is within its expiration date listed on kit box.
rrective Action
If kit has expired, then discard or clearly
label as expired and set aside for training
activities.
Kit - Contents
All required contents must be present and in acceptable
condition. This is important because Abraxis has
calibrated the standards and reagents separately for
each kit.
If any bottles are missing or damaged,
discard the kit.
Calibration
All of the following must be met:
Standard curve must have a correlation coefficient of
>0.99;
Average absorbance value, Ao, for SO must be >0.80; and
Standards SO-S5 must have decreasing average
absorbance values. That is, if Aj is the average of the
absorbance values for Sj, then the absorbance average
values must be: Ao > AI > A2 > As > A4 >As
If any requirement fails:
Results from the analytical run are not
reported.
All samples in the analytical run are
reanalyzed until calibration provides
acceptable results.
Kit Control
The average concentration value of the duplicates (or
triplicate) must be within the range of 0.75 +/- 0.185
[ig/L. That is, the average must be between 0.565 |jg/L
and 0.935 Mg/L.
Negative Control
The values for the negative control replicates must meet
the following requirements:
All concentration values must be < 0.15 [ig/L (i.e., the
reporting limit; and
one or more concentration results must be
nondetectable (i.e., <0.10 [
Sample Evaluations
All samples are run in duplicate. Each duplicate pair must
have %CV<15% between its absorbance values.
If either requirement fails:
Results from the analytical run are not
reported
The lab evaluates its processes, and if
appropriate, modifies its processes to
correct possible contamination or other
problems.
The lab reanalyzes all samples in the
analytical run until the controls meet the
requirements. At its discretion, the lab
may consult with EPA for guidance on
persistent difficulties with calibration.
If %CV of the absorbances for the
sample>15%, then:
Record the results for both duplicates
using different start dates and/or start
times to distinguish between the runs.
Report the data for both duplicate results
using the Quality Control Failure flag
"QCF"; and
re-analyze the sample in a new analytical
run. No samples are to be run more than
twice.
If the second run passes, then the data
analyst will exclude the data from the first
run (which will have been flagged with
"QCF"). If both runs fail, the data analyst
will determine if either value should be
used in the analysis (e.g., it might be
acceptable to use data if the CV is just
slightly over 15%).
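The duplicate %CV test above can be sketched as follows (an illustrative sketch; the function names are mine, and %CV is computed as the sample standard deviation of the two absorbances divided by their mean, times 100):

```python
import statistics

CV_LIMIT = 15.0  # maximum %CV allowed between duplicate absorbance values

def duplicate_cv(a1, a2):
    """Percent coefficient of variation between two absorbance readings."""
    mean = (a1 + a2) / 2.0
    sd = statistics.stdev([a1, a2])
    return 100.0 * sd / mean

def flag_for_duplicates(a1, a2):
    """Return the flag the text above calls for: 'QCF' on a failed pair, else empty."""
    return "QCF" if duplicate_cv(a1, a2) > CV_LIMIT else ""

print(round(duplicate_cv(0.50, 0.52), 2))  # 2.77 -- small disagreement, passes
print(flag_for_duplicates(0.50, 0.80))     # QCF -- large disagreement, flagged
```

Flagged samples are then reanalyzed in a new run, with no sample run more than twice, as described above.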
Results Within Calibration Range
All samples are run in duplicate. If both of the values are less than the upper calibration range (i.e., < 5.0 µg/L for undiluted samples with salinity <3.5 ppt; < 8.75 µg/L for undiluted samples with salinity >3.5 ppt), then the requirement is met.
If a result registers as 'HIGH', then record the result with a data flag of "HI." If one or both duplicates register as 'HIGH,' then the sample must be diluted and re-run until both results are within the
calibration range. No samples are to be
run more than twice. The lab reports both
the original and diluted sample results.
External Quality Control Sample
External QC Coordinator, supported by QC contractor, provides 1-2 sets of identical samples to all laboratories and compares results.
Based upon the evaluation, the External
QC Coordinator may request additional
information from one or more
laboratories about any deviations from
the Method or unique laboratory practices
that might account for differences
between the laboratory and others. With
this additional information, the External
QC Coordinator will determine an
appropriate course of action, including no
action, flagging the data, or excluding
some or all of the laboratory's data.
Table 15. Sample Receipt and Processing Quality Control: Microcystins Indicator
Quality Control Activity | Description and Requirements | Corrective Action
Sample Log-in | Upon receipt of a sample shipment, record receipt of samples in the NARS IM system (within 24 clock hours) and the laboratory's Information Management System (LIMS). | Discrepancies, damaged, or missing samples are reported to the EPA HQ Laboratory QA Coordinator
Sample condition upon receipt | Sample issues such as cracked container; missing label; temperature (frozen); adherence to holding time requirements; sufficient volume for test. | Qualify samples
Sample Storage | Store sample frozen | Qualify samples
Holding time | Frozen samples can be stored for several months. | Qualify samples
5.3.3.1 Data Reporting
Data reporting units and significant figures are summarized in Table 16.
Table 16. Data Reporting Criteria: Microcystins Indicator
Measurement | Significant Figures | Maximum No. Decimal Places
Microcystins | |
5.3.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA 2015 FOM. That quality is
enhanced by the training and experience of project staff and documentation of sampling activities.
Crews will collect a single water sample for microcystins analyses. Field crews will verify that all sample
containers are uncontaminated and intact, and that all sample labels are legible and intact. While in the
field, the crew will store samples in a cooler on ice and will then freeze the sample upon returning to the
base site (hotel, lab, office). Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the microcystins sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Store the sample on ice in the field.
Recheck all forms and labels for completeness and legibility.
5.3.4.1 Field Performance Requirements
Not Applicable.
5.3.4.2 Field Quality Control Requirements
See Table 17 for quality control activities and corrective actions.
Table 17. Sample Field Processing Quality Control: Microcystins Indicator
Quality Control Activity | Description and Requirements | Corrective Action
Holding time | Hold sample on wet ice and freeze immediately upon return to the base site (hotel, lab, office) and keep frozen until shipping | Qualify samples
Sample Storage | Store samples in darkness and frozen (-20 °C). Monitor temperature daily | Qualify sample as suspect
5.3.5 Data Review
Checks made of the data in the process of review and verification are summarized in Table 18. The NCCA Project QA Coordinator is ultimately responsible for ensuring the validity of the data, although performance of the specific checks may be delegated to other staff members.
Table 18. Data Validation Quality Control for Microcystins Indicator
Activity or Procedure | Requirements and Corrective Action
Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots) | Correct reporting errors or qualify as suspect or invalid.
Review holding times | Qualify value for additional review
Review data from QA samples (laboratory PE samples, and interlaboratory comparison samples) | Determine impact and possible limitations on overall usability of data
5.4 Benthic Invertebrates
5.4.1 Introduction
Benthic invertebrates inhabit the sediment (infauna) or live on the bottom substrates or aquatic vegetation (epifauna) of coastal areas. The response of benthic communities to various stressors can often be used to determine types of stressors and to monitor trends (Klemm et al., 1990). The overall objectives of the benthic invertebrate indicators are to detect stresses on community structure in national coastal waters and to assess and monitor the relative severity of those stresses. The benthic invertebrate indicator procedures are based on various recent bioassessment literature (Barbour et al. 1999, Hawkins et al. 2000, Klemm et al. 2003), previous coastal surveys (US EPA 2001C, US EPA 2004A, US EPA 2008), and the procedures used in NCCA 2010.
5.4.2 Sample Design and Methods
Detailed sample collection and handling procedures are described in the NCCA 2015 Field Operations Manuals. Detailed information on the benthic processing procedures is described in the NCCA 2015 Laboratory Operations Manual.
5.4.3 Pertinent Laboratory QA/QC Procedures
A single central laboratory and some State laboratories will analyze the benthic invertebrate samples. The specific quality control procedures used by each laboratory are implemented to ensure that:
Objectives established for the various data quality indicators are met.
Results are consistent and comparable among all participating laboratories.
All laboratories will follow the procedures outlined in the NCCA QAPP and the LOM.
For the NCCA 2015, laboratories and EPA will implement quality control in three primary ways. First, laboratories will conduct internal QC for sorters as described in the LOM (10% of all samples (minimum of 1) completed per sorter). Second, laboratories will conduct internal QC for taxonomists identifying benthic invertebrates as described in the LOM (1 in 10 samples per taxonomist). Finally, EPA will randomly select 10% of samples for identification by an independent, external taxonomist as described in the LOM (10% of all samples completed by each laboratory).
5.4.3.1 Laboratory Performance Requirements
Measurement quality objectives (MQOs) are given in Table 19. General requirements for comparability
and representativeness are addressed in Section 2. Precision is calculated as percent efficiency,
estimated from examination of randomly selected sample residuals by a second analyst and
independent identifications of organisms in randomly selected samples. The MQO for sorting and
picking accuracy is estimated from examinations (repicks) of randomly selected residues by an
experienced QC Sorter.
Equation 4.1 Percent sorting efficiency (PSE)
Number of organisms found by the sorter (A) compared to the combined (total) number of organisms found by the sorter (A) and the number recovered by the QC Officer (B) from Sorter A's pickate for a sample. PSE should be >90%.

PSE = [A / (A + B)] x 100

Equation 4.2 Percent disagreement in enumeration (PDE)
Measure of taxonomic precision comparing the number of organisms, n1, counted in a sample by the primary taxonomist with the number of organisms, n2, counted by the internal or external QC taxonomist. PDE should be <5%.

PDE = [|n1 - n2| / (n1 + n2)] x 100

Equation 4.3 Percent taxonomic disagreement (PTD)
Measure of taxonomic precision comparing the number of agreements (positive comparisons, comppos) of the primary taxonomist and internal or external QC taxonomists. In the following equation, N is the total number of organisms in the larger of the two counts. PTD should be <15%.

PTD = [1 - (comppos / N)] x 100
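The three measures can be computed directly from the counts defined above. This is an illustrative sketch; the function and variable names (comp_pos for comppos, n for N) are mine, not the QAPP's:

```python
def pse(a, b):
    """Percent sorting efficiency: organisms found by the sorter (a) vs. those
    additionally recovered by the QC repick (b)."""
    return 100.0 * a / (a + b)

def pde(n1, n2):
    """Percent disagreement in enumeration between the primary taxonomist's
    count (n1) and the QC taxonomist's count (n2)."""
    return 100.0 * abs(n1 - n2) / (n1 + n2)

def ptd(comp_pos, n):
    """Percent taxonomic disagreement: comp_pos agreements out of the n
    organisms in the larger of the two counts."""
    return 100.0 * (1.0 - comp_pos / n)

print(pse(95, 5))             # 95.0 -> meets the >90% MQO
print(pde(98, 102))           # 2.0  -> meets the <5% MQO
print(round(ptd(90, 100), 1)) # 10.0 -> meets the <15% MQO
```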
Table 19. Benthic Macroinvertebrates: Measurement Data Quality Objectives
Variable or Measurement | Precision | Accuracy
Sort and Pick | 90% a | 90% a
Identification | 85% b | 95% c
NA = not applicable; a As measured by PSE; b As measured by (100%-PTD); c As measured by (100%-PDE)
5.4.3.2 Laboratory Quality Control Requirements
Quality Control Requirements for the benthic invertebrate indicator are provided in Table 20 and Table
21.
Table 20. Benthic Macroinvertebrates: Laboratory Quality Control
Check or Sample Description | Frequency | Acceptance Criteria | Corrective Action
SAMPLE PROCESSING AND SORTING
Sample pickate examined by another sorter | 10% of all samples (minimum of 1) completed per sorter | PSE > 90% | If < 90%, examine all residuals of samples by that sorter and retrain sorter
IDENTIFICATION
Duplicate identification by Internal Taxonomy QC Officer | 1 in 10 samples per taxonomist | PTD < 15% | If PTD > 15%, reidentify all samples completed by that taxonomist since last meeting the acceptance criteria, focusing on taxa of concern
Duplicate identification by Internal Taxonomy QC Officer | All uncertain taxa | Uncertain identifications to be confirmed by expert in particular taxa | Record both tentative and independent IDs
Independent identification by outside, expert taxonomist (External QC) | 10% of all samples completed per laboratory | PDE < 5%; PTD < 15% | If PDE > 5%, implement recommended corrective actions. If PTD > 15%, implement recommended corrective actions.
Use of widely/commonly accepted taxonomic references by all NCCA labs | For all identifications | All keys and references used by each lab must be on a bibliography prepared by one or more additional NCCA labs. This requirement demonstrates the general acceptance of the references by the scientific community. | If a lab proposes to use other references, the lab must obtain prior permission from the External QC Officer before submitting the data with the identifications based upon the references.
Prepare reference collection1 | Each new taxon per laboratory | Complete reference collection to be maintained by each individual laboratory | Internal Taxonomy QC Officer periodically reviews data and reference collection to ensure reference collection is complete and identifications are accurate
DATA VALIDATION
Taxonomic "reasonableness" checks | All data sheets | Taxa known to occur for coastal waters or Great Lakes. | Second or third identification by expert in that taxon
Table 21. Sample Receipt and Processing Quality Control: Benthic Invertebrate Indicator
Quality Control Activity | Description and Requirements | Corrective Action
Sample Log-in | Upon receipt of a sample shipment, record receipt of samples in the NARS IM system (within 24 clock hours) and the laboratory's Information Management System (LIMS). | Discrepancies, damaged, or missing samples are reported to the EPA HQ Laboratory QA Coordinator
Sample condition upon receipt | Sample issues such as cracked container; missing label; preservation. | Qualify samples
Sample Storage | Store benthic samples in a cool, dark place. | Qualify sample as suspect for all analyses
Preservation | Transfer storage to 70% ethanol for long term storage | Qualify samples
Holding time | Preserved samples can be stored indefinitely; periodically check jars and change the ethanol if sample material appears to be degrading. | Qualify samples

1 If requested, EPA can return reference collection materials and/or other sample materials.
5.4.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA 2015 Field Operations Manuals.
That quality is enhanced by the training and experience of project staff and documentation of sampling
activities. Field Crews enter a flag code and provide comments on the Sample Collection Form if there
are any problems in collecting the sample or if conditions occur that may affect sample integrity.
Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the benthic invertebrate sample on the Sample Collection
Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Preserve the sample with formalin.
Recheck all forms and labels for completeness and legibility.
5.4.4.1 Field Performance Requirements
Not Applicable
5.4.4.2 Field Quality Control Requirements
Specific quality control measures are listed in Table 22 for field quality control requirements.
Table 22. Sample Collection and Field Processing Quality Control: Benthic Invertebrate Indicator
Quality Control Activity | Description and Requirements | Corrective Action
Check integrity of sample containers and labels | Clean, intact containers and labels | Obtain replacement supplies
Sample Processing (field) | Use 0.5 mm mesh sieve. Preserve with ten percent buffered formalin. Fill jars no more than 1/2 full of material to reduce the chance of organisms being damaged. | Discard and recollect sample
Sample Storage (field) | Store benthic samples in a cool, dark place until shipment to analytical lab | Discard and recollect sample
Holding time | Preserved samples can be stored indefinitely; periodically check jars and change the ethanol (change from formalin to ethanol for long term storage) if sample material appears to be degrading.2 | Change ethanol
Preservation | Transfer storage to 70% ethanol for long term storage | Qualify samples
5.4.5 Data Review
Checks made of the data in the process of review and verification are summarized in Table 23. The NCCA Project QA Coordinator is ultimately responsible for ensuring the validity of the data, although performance of the specific checks may be delegated to other staff members.
Table 23. Data Validation Quality Control for Benthic Macroinvertebrates
Activity or Procedure | Requirements and Corrective Action
Review data and reports from Laboratories | Determine impact and possible limitations on overall usability of data
Review data and reports from External QC Coordinator | Determine impact and possible limitations on overall usability of data
Review taxonomic names and spellings | Correct and qualify
2 In most cases, crews will ship samples to the batch lab within 2 weeks, so long-term storage will not be an issue for
field crews.
5.5 Sediment Contaminants, Total Organic Carbon (TOC) and Grain Size
5.5.1 Introduction
Crews will collect sediment grabs for chemical analyses (organics/metals and TOC), and grain size
determination.
5.5.2 Sample Design and Methods
Detailed sample collection and handling procedures are described in the NCCA 2015 Field Operations
Manual. Detailed laboratory methods are in the NCCA 2015 Laboratory Operations Manual.
5.5.3 Pertinent Laboratory QA/QC Procedures
A single central laboratory and some State laboratories will analyze the sediment contaminants, TOC and grain size samples. The specific quality control procedures used by each laboratory are implemented to ensure that:
Objectives established for the various data quality indicators are met.
Results are consistent and comparable among all participating laboratories.
All laboratories will follow the QA/QC procedures outlined in the NCCA QAPP and the LOM.
5.5.3.1 Laboratory Performance Requirements
The laboratory shall perform analysis of the sediment samples to determine the moisture content, grain
size, and concentrations of TOC, metals, pesticides, PAHs, and PCBs.
To demonstrate its competency in analysis of sediment samples, the laboratory shall provide analyte
and matrix specific information to EPA. EPA will accept one or more of the following as a demonstration
of competency:
Memorandum that identifies the relevant services that the laboratory provided for the National
Aquatic Resource Surveys in the past five years.
Documentation detailing the competency of the organization, including professional
certifications for water-related analyses, membership in professional societies, and experience
with analyses that are the same or similar to the requirements of this method.
Demonstration of competency with sediment samples in achieving the method detection limits,
accuracy, and precision targets.
To demonstrate its competency in quality assurance and quality control procedures, the organization
shall provide EPA with copies of the quality-related documents relevant to the procedure. Examples
include Quality Management Plans (QMP), QAPPs, and applicable Standard Operating Procedures
(SOPs). To demonstrate its ongoing commitment to quality assurance, the person in charge of quality
issues for the organization shall sign the NCCA QAPP Certification Page.
Precision and accuracy objectives are identified in Table 24. Table 25 identifies the storage
requirements. Laboratories may choose to use any analysis method, including those in Table 25, which
measures the parameters to the levels of the method detection limits identified in Table 26.
Table 24. Sediment Contaminants, Grain Size and TOC: Precision and Accuracy Objectives
Parameter | Precision Objective (measured by)* | Accuracy Objective (measured by)*
All Contaminants | 30% (RPD between MS and MSD) | 20% (average %Rs between MS and MSD)
TOC | 10% (RPD between duplicates) | 10% (CRM)
Grain Size | 10% (LCS) | Not Applicable
* RPD=Relative Percent Difference; %Rs=%Recovery; MS=Matrix Spike; MSD=Matrix Spike Duplicate; CRM=Certified Reference Material; LCS=Lab Control Sample.
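The precision and accuracy measures cited in Table 24 can be computed as follows. This is an illustrative sketch: the function names are mine, RPD is the standard relative percent difference, and %recovery is the conventional matrix-spike form (measured minus native concentration, divided by the spiked amount), which I am assuming here rather than quoting from the QAPP:

```python
def rpd(x1, x2):
    """Relative percent difference between two results (e.g., MS and MSD)."""
    return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

def percent_recovery(measured, native, spiked):
    """Percent recovery of a spiked amount: (measured - native) / spiked * 100."""
    return 100.0 * (measured - native) / spiked

ms, msd = 12.0, 10.0                      # hypothetical matrix spike / duplicate results
print(round(rpd(ms, msd), 1))             # 18.2 -> within the 30% precision objective
print(percent_recovery(11.0, 1.0, 10.0))  # 100.0 -> within the 100±20% accuracy objective
```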
Table 25. Sediment Contaminants, Grain Size, and TOC: Analytical Methods
Parameter | Storage | Methods that Meet the QA/QC Requirements (any method that meets the QA/QC requirements is acceptable)
Metals (except Mercury) | Freeze samples to a temperature < -20 °C | Extraction: EPA Method 3051A; Analysis: EPA Method 6020A [3]
Mercury | Freeze samples to a temperature < -20 °C | EPA Method 245.7 [4]
PCBs, Pesticides, PAHs | Freeze samples to a temperature < -20 °C | Extraction: EPA Method 3540C; Analysis: EPA Method 8270D [5]
TOC | Freeze samples to a temperature < -20 °C | Lloyd Kahn Method [6]
Grain Size | Refrigerate at 4 °C (do not freeze) | Any method that reports the determination as %silt and meets QA/QC requirements
3 For example, see:
Method 3051A "Microwave Assisted Acid Digestion of Sediments, Sludges, Soils, and Oils" retrieved June 27, 2014 from http://www.epa.gov/osw/hazard/testmethods/sw846/pdfs/3051a.pdf; and
Method 6020A "Inductively Coupled Plasma-Mass Spectrometry" retrieved June 27, 2014 from http://www.epa.gov/osw/hazard/testmethods/sw846/pdfs/6020a.pdf.
4 For example, see Method 245.7 "Mercury in Water by Cold Vapor Atomic Fluorescence Spectrometry, Revision 2.0" (EPA-821-R-05-001, February 2005), retrieved June 27, 2014 from http://water.epa.gov/scitech/methods/cwa/bioindicators/upload/2007_07_10_methods_method_245_7.pdf.
5 For example, see:
Method 3540C "Soxhlet Extraction" retrieved June 27, 2014 from http://www.epa.gov/osw/hazard/testmethods/sw846/pdfs/3540c.pdf; and
Method 8270D "Semivolatile Organic Compounds by Gas Chromatography/Mass Spectrometry (GC/MS)" retrieved June 27, 2014 from http://www.epa.gov/osw/hazard/testmethods/sw846/pdfs/8270d.pdf.
6 For example, the "Lloyd Kahn Method" developed by Lloyd Kahn at EPA Region II and retrieved from www.nj.gov/dep/srp/guidance/rs/lloydkahn.pdf.
Table 26. Sediment Contaminants, Grain Size, and TOC: Required Parameters
Type | Parameter | Units | CAS Number | PCB Number (where applicable) | MDL Target
 | Grain Size | % sand and % silt/clay | not applicable | | 0.05%
 | Total Organic Carbon (TOC) | mg/kg | not applicable | | 0.01%
METAL | Aluminum | µg/g (ppm) dry weight | 7429-90-5 | | 1500
METAL | Antimony | µg/g (ppm) dry weight | 7440-36-0 | | 0.2
METAL | Arsenic | µg/g (ppm) dry weight | 7440-38-2 | | 1.5
METAL | Cadmium | µg/g (ppm) dry weight | 7440-43-9 | | 0.05
METAL | Chromium | µg/g (ppm) dry weight | 7440-47-3 | | 5.0
METAL | Copper | µg/g (ppm) dry weight | 7440-50-8 | | 5.0
METAL | Iron | µg/g (ppm) dry weight | 7439-89-6 | | 500
METAL | Lead | µg/g (ppm) dry weight | 7439-92-1 | | 1.0
METAL | Manganese | µg/g (ppm) dry weight | 7439-96-5 | | 1.0
METAL | Mercury | µg/g (ppm) dry weight | 7439-97-6 | | 0.01
METAL | Nickel | µg/g (ppm) dry weight | 7440-02-0 | | 1.0
METAL | Selenium | µg/g (ppm) dry weight | 7782-49-2 | | 0.1
METAL | Silver | µg/g (ppm) dry weight | 7440-22-4 | | 0.3
METAL | Tin | µg/g (ppm) dry weight | 7440-31-5 | | 0.1
METAL | Vanadium | µg/g (ppm) dry weight | 7440-62-2 | | 1.0
METAL | Zinc | µg/g (ppm) dry weight | 7440-66-6 | | 2.0
PCB | 2,2',3,3',4,4',5,5',6,6'-Decachlorobiphenyl | ng/g (ppb) dry weight | 2051-24-3 | 209 | 1.0
PCB | 2,4'-Dichlorobiphenyl | ng/g (ppb) dry weight | 34883-43-7 | 8 | 1.0
PCB | 2,2',3,3',4,4',5-Heptachlorobiphenyl | ng/g (ppb) dry weight | 35065-30-6 | 170 | 1.0
PCB | 2,2',3,4',5,5',6-Heptachlorobiphenyl | ng/g (ppb) dry weight | 52663-68-0 | 187 | 1.0
PCB | 2,2',3,4,4',5,5'-Heptachlorobiphenyl | ng/g (ppb) dry weight | 35065-29-3 | 180 | 1.0
PCB | 2,2',3,3',4,4'-Hexachlorobiphenyl | ng/g (ppb) dry weight | 38380-07-3 | 128 | 1.0
PCB | 2,2',3,4,4',5'-Hexachlorobiphenyl | ng/g (ppb) dry weight | 35065-28-2 | 138 | 1.0
PCB | 2,2',4,4',5,5'-Hexachlorobiphenyl | ng/g (ppb) dry weight | 35065-27-1 | 153 | 1.0
PCB | 2,2',3,3',4,4',5,5',6-Nonachlorobiphenyl | ng/g (ppb) dry weight | 40186-72-9 | 206 | 1.0
PCB | 2,2',3,3',4,4',5,6-Octachlorobiphenyl | ng/g (ppb) dry weight | 52663-78-2 | 195 | 1.0
PCB | 2,3,3',4,4'-Pentachlorobiphenyl | ng/g (ppb) dry weight | 32598-14-4 | 105 | 1.0
PCB | 2,2',4,5,5'-Pentachlorobiphenyl | ng/g (ppb) dry weight | 37680-73-2 | 101 | 1.0
PCB | 2,3',4,4',5-Pentachlorobiphenyl | ng/g (ppb) dry weight | 31508-00-6 | 118 | 1.0
PCB | 2,3,3',4',6-Pentachlorobiphenyl | ng/g (ppb) dry weight | 38380-03-9 | 110 | 1.0
PCB | 3,3',4,4',5-Pentachlorobiphenyl | ng/g (ppb) dry weight | 57465-28-8 | 126 | 1.0
PCB | 2,2',3,5'-Tetrachlorobiphenyl | ng/g (ppb) dry weight | 41464-39-5 | 44 | 1.0
PCB | 3,3',4,4'-Tetrachlorobiphenyl | ng/g (ppb) dry weight | 32598-13-3 | 77 | 1.0
PCB | 2,2',5,5'-Tetrachlorobiphenyl | ng/g (ppb) dry weight | 35693-99-3 | 52 | 1.0
PCB | 2,3',4,4'-Tetrachlorobiphenyl | ng/g (ppb) dry weight | 32598-10-0 | 66 | 1.0
PCB | 2,2',5-Trichlorobiphenyl | ng/g (ppb) dry weight | 37680-65-2 | 18 | 1.0
PCB | 2,4,4'-Trichlorobiphenyl | ng/g (ppb) dry weight | 7012-37-5 | 28 | 1.0
PEST | 2,4'-DDD | ng/g (ppb) dry weight | 53-19-0 | | 1.0
PEST | 2,4'-DDE | ng/g (ppb) dry weight | 3424-82-6 | | 1.0
PEST | 2,4'-DDT | ng/g (ppb) dry weight | 789-02-6 | | 1.0
PEST | 4,4'-DDD | ng/g (ppb) dry weight | 72-54-8 | | 1.0
PEST | 4,4'-DDE | ng/g (ppb) dry weight | 72-55-9 | | 1.0
PEST | 4,4'-DDT | ng/g (ppb) dry weight | 50-29-3 | | 1.0
PEST | Aldrin | ng/g (ppb) dry weight | 309-00-2 | | 1.0
PEST | Alpha-BHC | ng/g (ppb) dry weight | 319-84-6 | | 1.0
PEST | Beta-BHC | ng/g (ppb) dry weight | 319-85-7 | | 1.0
PEST | Delta-BHC | ng/g (ppb) dry weight | 319-86-8 | | 1.0
PEST | Alpha-Chlordane | ng/g (ppb) dry weight | 5103-71-9 | | 1.0
PEST | Gamma-Chlordane | ng/g (ppb) dry weight | 5566-34-7 | | 1.0
PEST | Dieldrin | ng/g (ppb) dry weight | 60-57-1 | | 1.0
PEST | Endosulfan I | ng/g (ppb) dry weight | 959-98-8 | | 1.0
PEST | Endosulfan II | ng/g (ppb) dry weight | 33213-65-9 | | 1.0
PEST | Endosulfan Sulfate | ng/g (ppb) dry weight | 1031-07-8 | | 1.0
PEST | Endrin | ng/g (ppb) dry weight | 72-20-8 | | 1.0
PEST | Endrin Aldehyde | ng/g (ppb) dry weight | 7421-93-4 | | 1.0
PEST | Endrin Ketone | ng/g (ppb) dry weight | 53494-70-5 | | 1.0
PEST | Heptachlor | ng/g (ppb) dry weight | 76-44-8 | | 1.0
PEST | Heptachlor Epoxide | ng/g (ppb) dry weight | 1024-57-3 | | 1.0
PEST | Hexachlorobenzene | ng/g (ppb) dry weight | 118-74-1 | | 1.0
PEST | Lindane | ng/g (ppb) dry weight | 58-89-9 | | 1.0
PEST | Mirex | ng/g (ppb) dry weight | 2385-85-5 | | 1.0
PEST | Cis-Nonachlor | ng/g (ppb) dry weight | 5103-73-1 | | 1.0
PEST | Oxychlordane | ng/g (ppb) dry weight | 26880-48-8 | | 1.0
PEST | Trans-Nonachlor | ng/g (ppb) dry weight | 39765-80-5 | | 1.0
PAH | Acenaphthene | ng/g (ppb) dry weight | 83-32-9 | | 10
PAH | Acenaphthylene | ng/g (ppb) dry weight | 208-96-8 | | 10
PAH | Anthracene | ng/g (ppb) dry weight | 120-12-7 | | 10
PAH | Benz(a)anthracene | ng/g (ppb) dry weight | 56-55-3 | | 10
PAH | Benzo(b)fluoranthene | ng/g (ppb) dry weight | 205-99-2 | | 10
PAH | Benzo(k)fluoranthene | ng/g (ppb) dry weight | 207-08-9 | | 10
PAH | Benzo(g,h,i)perylene | ng/g (ppb) dry weight | 191-24-2 | | 10
PAH | Benzo(a)pyrene | ng/g (ppb) dry weight | 50-32-8 | | 10
PAH | Benzo(e)pyrene | ng/g (ppb) dry weight | 192-97-2 | | 10
PAH | Biphenyl | ng/g (ppb) dry weight | 92-52-4 | | 10
PAH | Chrysene | ng/g (ppb) dry weight | 218-01-9 | | 10
PAH | Dibenz(a,h)anthracene | ng/g (ppb) dry weight | 53-70-3 | | 10
PAH | Dibenzothiophene | ng/g (ppb) dry weight | 132-65-0 | | 10
PAH | 2,6-Dimethylnaphthalene | ng/g (ppb) dry weight | 581-42-0 | | 10
PAH | Fluoranthene | ng/g (ppb) dry weight | 206-44-0 | | 10
PAH | Fluorene | ng/g (ppb) dry weight | 86-73-7 | | 10
PAH | Indeno(1,2,3-c,d)pyrene | ng/g (ppb) dry weight | 193-39-5 | | 10
PAH | 1-Methylnaphthalene | ng/g (ppb) dry weight | 90-12-0 | | 10
PAH | 2-Methylnaphthalene | ng/g (ppb) dry weight | 91-57-6 | | 10
PAH | 1-Methylphenanthrene | ng/g (ppb) dry weight | 832-69-9 | | 10
PAH | Naphthalene | ng/g (ppb) dry weight | 91-20-3 | | 10
PAH | Perylene | ng/g (ppb) dry weight | 198-55-0 | | 10
PAH | Phenanthrene | ng/g (ppb) dry weight | 85-01-8 | | 10
PAH | Pyrene | ng/g (ppb) dry weight | 129-00-0 | | 10
PAH | 2,3,5-Trimethylnaphthalene | ng/g (ppb) dry weight | 2245-38-7 | | 10
5.5.3.2 Laboratory Quality Control Requirements
The laboratory must conduct QC analyses for each batch of samples. Each batch shall consist of no more than 20 samples. Unique laboratory quality control lot numbers must be assigned to each batch of samples. The lot number must associate each batch of field samples with the appropriate measures such as laboratory control sample, matrix spike, laboratory duplicate, and method blank samples. Also, each laboratory QC sample (i.e., preparation and instrument blanks, laboratory control sample (LCS), spike/duplicate, etc.) must be given a unique sample identification. Table 27 provides a summary of the quality control requirements, including sample receipt and processing.
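The batching rule above (no more than 20 samples per batch, each batch carrying a unique QC lot number) can be sketched as follows. The "LOT-NNN" numbering scheme and sample-ID format are hypothetical, shown only to illustrate the association between field samples and a lot:

```python
def assign_qc_lots(sample_ids, batch_size=20, prefix="LOT"):
    """Split field samples into analysis batches of at most `batch_size` and
    assign each batch a unique laboratory QC lot number (illustrative scheme)."""
    lots = {}
    for i in range(0, len(sample_ids), batch_size):
        lot_number = f"{prefix}-{i // batch_size + 1:03d}"
        lots[lot_number] = sample_ids[i:i + batch_size]
    return lots

# 45 hypothetical field samples -> batches of 20, 20, and 5
lots = assign_qc_lots([f"NCCA15-{n:04d}" for n in range(1, 46)])
print(list(lots))             # ['LOT-001', 'LOT-002', 'LOT-003']
print(len(lots["LOT-003"]))   # 5
```

Each lot would then be analyzed with its own LCS, matrix spike, duplicate, and method blank, as the table below requires.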
Table 27. Sediment Chemistry, Grain Size, and TOC: Quality Control Activities for Samples
Activity | Evaluation | Corrective Action
Demonstrate competency for analyzing sediment samples to meet the performance measures | Demonstration of competency with sediment samples in achieving the method detection limits, accuracy, and precision targets | EPA will not approve any laboratory for NCCA sample processing if the laboratory cannot demonstrate competency. In other words, EPA will select another laboratory that can demonstrate competency for its NCCA samples.
Check condition of sample when it arrives. | Sample issues such as cracked container; missing label; sufficient volume for test. | Assign appropriate condition code identified in Table 6.4 of the LOM
Store sample appropriately. While stored at the laboratory, the sample must be kept at a temperature < -20 °C, except jars for grain analyses are refrigerated at 4 °C. | Check the temperature of the freezer and refrigerator per laboratory's standard operating procedures. Record temperature of sample upon arrival at the laboratory. If at any other time samples are warmer than required, note temperature and duration in comment field. | Data analyst will consider temperature deviations in evaluating the data. He/she will flag the deviations and determine whether the data appear to be affected and/or the data should be excluded from the analyses.
Analyze sample within holding time | The test must be completed within the holding time of 1 year. If the original test fails, then the retest also must be conducted within the holding time. | Perform test, but note reason for performing test outside holding time. EPA expects that the laboratory will exercise every effort to perform tests before the holding time expires.
Perform once at the start of each batch to evaluate the labeled compound recovery (LCR) in a Laboratory Control Sample (LCS). This tests the performance of the equipment. | Control limits for recovery cannot exceed 100±20%. | First, prepare and analyze one additional LCS. If the second LCS meets the requirement, then no further action is required. If the second LCS fails, then determine and correct the problem before proceeding with any sample analyses.
Perform once at the start of each batch to evaluate the entire extraction and analysis process using a Method Blank | Control limits cannot exceed the laboratory reporting level (LRL). | First, prepare and analyze one additional blank. If the second blank meets the requirement, then no further action is required. If the second blank fails, then determine and correct the problem (e.g., contamination, instrument calibration) before proceeding with any sample analyses. Reestablish statistical control by analyzing three blank samples. Report values of all blanks analyzed.
Check calibration immediately before and immediately after the sample batch (abbreviated as QCCS for quality control check sample) | Results must be ±10% of each other or as specified in method criteria | If calibration fails before analysis, recalibrate and reanalyze QCCS until it passes. If the check fails after all samples in the batch have been analyzed, verify the QCCS reading. If the QCCS reading fails a second time, then reanalyze all samples in the batch and report only the set of results associated with the acceptable QCCS reading. Also report all QCCS readings for the batch.
Compare results of one laboratory duplicate sample (for TOC) or matrix spike duplicate (for contaminant) sample for each batch (not required for grain size) | Results must be within the target precision goal | If both results are below LRL, then conclude that the test has passed. Otherwise, prepare and analyze a split from a different sample in the batch. If the second result is within the target precision goal of the original sample, then report the data and findings for both QC samples. However, if the two results differ by more than the target precision goal, review precision of QCCS measurements for the batch; check preparation of the split sample; etc., and report the evaluation and findings in the case narrative. Consult with the EPA HQ NCCA Laboratory Review Coordinator to determine if reanalysis of the entire batch (at the laboratory's expense) is necessary. If no reanalysis is necessary, report and quantify all samples in the batch. If reanalysis is necessary, then report all QC samples and the 2nd analysis of the batch. If the second set also is unacceptable, then assign a data code to each sample in the batch.
Compare results of one matrix spike sample per batch to evaluate performance in matrix (not required for TOC and grain size) | Evaluate performance after the first 3 batches, and then every subsequent batch. Ideally, control limits for recovery will not exceed the target accuracy goal, but this may not be realistic for all parameters with this matrix. | If both the original and duplicate results are below LRL, then conclude that the test has passed for the batch. Otherwise, if any results are not within the target accuracy goal for the first 3 batches, then within 2 working days contact the EPA HQ NCCA Laboratory Review Coordinator to discuss method performance and potential improvements. After achieving acceptable results or EPA's permission to continue, perform the test for every subsequent batch. For each batch, report the results from the original analysis and its duplicate and their RPD for TOC; the matrix spike, matrix spike duplicate, RPD and %recovery for contaminants.
Compare results of TOC Certified Reference Material once per each batch | Value must be within 10% of the certified value. | If value is outside the acceptable range, analyze a second CRM. If the second CRM also is measured outside the acceptable range, then determine and correct the problem (e.g., contamination, instrument calibration) before reanalyzing all samples in the batch.
Maintain the required MDL | Evaluate for each sample | If MDL could not be achieved, then provide dilution factor or QC code and explanation in the comment field.
Participate in External Quality Control | Evaluate QC samples provided by the External QC Coordinator | Based upon the evaluation, the External QC Coordinator may request additional information from one or more laboratories about any deviations from the Method or unique laboratory practices that might account for differences between the laboratory and others. With this additional information, the External QC Coordinator will determine an appropriate course of action, including no action, flagging the data, or excluding some or all of the laboratory's data.
Maintain completeness | Completeness objective is 95% for all parameters. | Contact EPA HQ NCCA Laboratory Review Coordinator immediately if issues affect laboratory's ability to meet completeness objective.
*Chapter 2 of the LOM provides contact information for the EPA HQ NCCA Laboratory Review Coordinator. Laboratories under
contract to EPA must contact the Task Order's Contracting Officer's Representative (TOCOR) instead of the Laboratory Review
Coordinator.
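The duplicate and matrix spike checks above come down to two statistics, relative percent difference (RPD) and percent recovery. A minimal sketch of both calculations (function names and example numbers are illustrative, not from the NCCA LOM):

```python
def rpd(original, duplicate):
    """Relative percent difference between original and duplicate results."""
    mean = (original + duplicate) / 2.0
    return abs(original - duplicate) / mean * 100.0

def percent_recovery(spiked_result, unspiked_result, spike_added):
    """Percent recovery of a matrix spike: fraction of added analyte measured."""
    return (spiked_result - unspiked_result) / spike_added * 100.0

# A TOC duplicate pair and a contaminant matrix spike (made-up numbers):
print(round(rpd(2.0, 2.2), 1))                    # 9.5 (percent)
print(round(percent_recovery(9.5, 2.0, 8.0), 2))  # 93.75 (percent)
```

Per the requirement above, results below the LRL are handled before these formulas are applied (both below LRL means the test passes without computing an RPD).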
5.5.3.3 Data Reporting
Data reporting units and significant figures are summarized in Table 28.
Table 28. Data Reporting Criteria: Sediment Contaminants, TOC and Grain Size Indicators

Measurement            Units                          Expressed to the Nearest
Pesticides and PCBs    ng/g; ppb (sediment: dry wt)   0.01
Metals                 µg/g; ppm (sediment: dry wt)   0.01
Hg                     µg/g; ppm (sediment: dry wt)   0.001
PAHs                   ng/g; ppb (dry wt)             0.01
TOC                    %                              0.01
Grain Size             %                              0.01
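The "expressed to the nearest" column amounts to rounding each result to a fixed reporting increment. A sketch of that rounding rule (the dictionary keys are shorthand for the Table 28 rows; this is not NCCA reporting software):

```python
# Reporting increments restated from Table 28 (keys are illustrative shorthand).
REPORT_INCREMENT = {
    "PCB": 0.01,
    "METAL": 0.01,
    "HG": 0.001,
    "PAH": 0.01,
    "TOC": 0.01,
    "GRAIN_SIZE": 0.01,
}

def report_value(value, parameter):
    """Round a result to the reporting increment for its parameter group."""
    inc = REPORT_INCREMENT[parameter]
    # Round to the nearest multiple of the increment, then trim float noise.
    return round(round(value / inc) * inc, 10)

print(report_value(0.12345, "HG"))  # 0.123
print(report_value(3.456, "TOC"))   # 3.46
```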
5.5.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA 2015 FOM. That quality is
enhanced by the training and experience of project staff and documentation of sampling activities.
Crews will collect a sediment sample for sediment contamination, TOC and grain size analyses. Field
crews will verify that all sample containers are uncontaminated and intact, and that all sample labels are
legible and intact.
Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the sediment sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Store the sediment contaminants and TOC samples on dry ice. Store grain size samples on wet ice.
Recheck all forms and labels for completeness and legibility.
5.5.4.1 Field Performance Requirements
Not Applicable
5.5.4.2 Field Quality Performance Requirements
Any contamination of the samples can produce significant errors in the resulting interpretation. Crews
must take care not to contaminate the sediment with the tools used to collect the sample (i.e., the
sampler, spoons, mixing bowl or bucket) and not to mix the surface layer with the deeper sediments.
Prior to sampling at each site, crews must clean the sampler and collection tools that will come into
contact with the sediment with Alconox and rinse them with ambient water at the site. Field processing
quality control requirements can be found in Table 29 and Table 30.
Table 29. Sample Collection and Field Processing Quality Control: Sediment Contaminant Indicator

Quality Control Activity: Check integrity of sample containers and labels
Description and Requirements: Clean, intact containers and labels.
Corrective Action: Obtain replacement supplies.

Quality Control Activity: Sample Storage (field)
Description and Requirements: Store sediment samples on dry ice and in a dark place (cooler).
Corrective Action: Discard and recollect sample.

Quality Control Activity: Shipping time
Description and Requirements: Frozen samples must be shipped on dry ice within 2 weeks of collection.
Corrective Action: Logistics coordinator contacts crew and requests samples be shipped every week.
Table 30. Sample Collection and Field Processing Quality Control: Sediment TOC and Grain Size Indicator

Quality Control Activity: Check for homogeneity
Description and Requirements: Sample must be homogeneous.
Corrective Action: Mix sample for a longer period of time.

Quality Control Activity: Sample Storage (field)
Description and Requirements: Store sediment (TOC) samples on dry ice and grain size samples on wet ice. Store all samples in a dark place (cooler).
Corrective Action: Discard and recollect sample.

Quality Control Activity: Holding time
Description and Requirements: TOC samples must be shipped on dry ice within 2 weeks of collection. Grain size samples must be shipped on wet ice every week.
Corrective Action: Qualify samples.

Quality Control Activity: Check integrity of sample containers and labels
Description and Requirements: Clean, intact containers and labels.
Corrective Action: Obtain replacement supplies.
5.5.5 Data Review
Checks made of the data in the process of review and verification are summarized in Table 31. The NCCA
Project QA Coordinator is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members.
Table 31. Data Validation Quality Control for Sediment Contaminants, TOC and Grain Size Indicators

Activity or Procedure: Range checks, summary statistics, and/or exploratory data analysis (e.g., box-and-whisker plots)
Requirements and Corrective Action: Correct reporting errors or qualify as suspect or invalid.

Activity or Procedure: Review holding times
Requirements and Corrective Action: Qualify value for additional review.

Activity or Procedure: Review data from QA samples (laboratory PE samples and interlaboratory comparison samples)
Requirements and Corrective Action: Determine impact and possible limitations on overall usability of data.
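The range-check and box-and-whisker screening described above can be sketched as a Tukey-fence outlier flag: values outside 1.5 × IQR of the quartiles become candidates for review. The 1.5 multiplier is the conventional box-plot fence, not an NCCA acceptance criterion:

```python
def tukey_outliers(values):
    """Flag values outside the Tukey fences (Q1 - 1.5*IQR, Q3 + 1.5*IQR)."""
    xs = sorted(values)
    n = len(xs)

    def quantile(q):
        # Simple linear-interpolation quantile on the sorted data.
        pos = q * (n - 1)
        lo, hi = int(pos), min(int(pos) + 1, n - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < low or v > high]

print(tukey_outliers([1.1, 1.2, 1.3, 1.2, 1.1, 9.9]))  # [9.9]
```

Flagged values would be corrected if they are reporting errors, or qualified as suspect or invalid otherwise, per Table 31.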
5.6 Sediment Toxicity
5.6.1 Introduction
Toxicity tests will be completed on sediments from both marine/estuarine and freshwater
environments. Both tests determine toxicity, in terms of survival rate of amphipod crustaceans, in whole
sediment samples.
5.6.2 Sample Design and Methods
Detailed sample collection and handling procedures are described in the NCCA 2015 Field Operations
Manual. Laboratory methods are in the NCCA 2015 Laboratory Operations Manual.
5.6.3 Pertinent Laboratory QA/QC Procedures
A single central laboratory and some State laboratories will analyze the sediment toxicity samples. The
specific quality control procedures used by each laboratory are implemented to ensure that:
Objectives established for various data quality indicators are met.
Results are consistent and comparable among all participating laboratories.
All laboratories will follow the QA/QC procedures outlined in the NCCA QAPP and the LOM.
5.6.3.1 Laboratory Performance Requirements
Laboratories may choose to use any analysis method using the required organisms of Hyalella azteca
(freshwater) or Leptocheirus plumulosus (marine). The laboratory's method must meet the quality
requirements in Section 8.7 of the LOM, including the requirement that mean control survival remain
greater than or equal to 80% (Hyalella azteca) and 90% (Leptocheirus plumulosus), respectively. It is
essential that the contractor require that all of its laboratory technicians use the same procedures and
meet the required quality elements.
At a minimum, the laboratory must:
1. Perform the procedures using the 10-day tests. Possible methods include those described in the
following documents:
a. Marine: Test Method 100.4 in EPA 600/R-94/025⁷ or ASTM E1367-03⁸
b. Freshwater: Test Method 100.1 in EPA 600/R-99/064⁹ or ASTM E1706¹⁰
2. Test the following number of replicates for each sample and control:
a. Marine: 5 replicates with 20 organisms per replicate
b. Freshwater: 4 replicates with 10 organisms per replicate
3. Test no more than 10 samples and one control within each batch.
4. Use the following organisms for the tests:
a. Marine: Leptocheirus plumulosus
b. Freshwater: Hyalella azteca
5. Select organisms for each batch of tests that are:
a. From the same culture;
b. Cultured at the same temperature as will be used for the tests;
7 Chapter 11 in Methods for Assessing the Toxicity of Sediment-associated Contaminants with Estuarine and Marine
Amphipods, June 1994, retrieved from http://water.epa.gov/polwaste/sediments/cs/upload/marinemethod.pdf.
8 American Society for Testing and Materials (ASTM). 2008. E1367-03 "Standard Guide for Conducting 10-Day
Static Sediment Toxicity Tests With Marine and Estuarine Amphipods." Annual Book of Standards, Water and
Environmental Technology, Vol. 11.05, West Conshohocken, PA.
9 Section 11 in Methods for Measuring the Toxicity and Bioaccumulation of Sediment-associated Contaminants with
Freshwater Invertebrates, Second Edition, March 2000, retrieved from
http://water.epa.gov/polwaste/sediments/cs/upload/freshmanual.pdf.
10 ASTM 2009 E1706. "Standard Test Method for Measuring the Toxicity of Sediment-Associated Contaminants
with Freshwater Invertebrates."
c. (optional) EPA would prefer but does not require that the organisms are cultured in the
same water as that used for testing.
6. Use a water source (for the overlying water) demonstrated to support survival, growth, and
reproduction of the test organisms.
a. For marine sediments, 175 mL of sediment and 800 mL of overlying seawater
b. For freshwater sediments, 100 mL of sediment and 175 mL of overlying freshwater
7. Use clean sediment for control tests.
8. Implement the following for exposure/feeding:
a. For marine sediments, exposure is static (i.e., water is not renewed), and the animals
are not fed over the 10-day exposure period.
b. For freshwater sediments, overlying water is renewed (i.e., 2 volumes a day) and the
animals are fed over the 10-day exposure period.
9. Use the following procedure for homogenization/sieving: Water above the sediment is not
discarded, but is mixed back into the sediment during homogenization. Marine samples should
be sieved (following the 10-day method) and the sieve size noted. Freshwater samples should
not be sieved to remove indigenous organisms unless there is good reason to believe indigenous
organisms may influence the response of the test organism. For freshwater samples, large
indigenous organisms and large debris can be removed using forceps; if sediments must be
sieved, the samples should be analyzed before and after sieving (e.g., pore-water metals, DOC,
and AVS) to document the influence of sieving on sediment chemistry (note sieve size).
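Requirements 2 through 4 above lend themselves to a simple configuration check before a batch is started. A sketch, with illustrative field and function names (this is not the laboratory's actual bench software):

```python
# Replicate/organism requirements restated from the numbered list above.
REQUIRED = {
    "marine": {"organism": "Leptocheirus plumulosus",
               "replicates": 5, "organisms_per_rep": 20},
    "freshwater": {"organism": "Hyalella azteca",
                   "replicates": 4, "organisms_per_rep": 10},
}

def batch_setup_ok(matrix, organism, replicates, organisms_per_rep, n_samples):
    """Check a proposed batch against the NCCA test-design requirements."""
    req = REQUIRED[matrix]
    return (organism == req["organism"]
            and replicates == req["replicates"]
            and organisms_per_rep == req["organisms_per_rep"]
            and n_samples <= 10)  # no more than 10 samples (plus a control)

print(batch_setup_ok("marine", "Leptocheirus plumulosus", 5, 20, 10))  # True
print(batch_setup_ok("freshwater", "Hyalella azteca", 5, 10, 10))      # False
```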
5.6.3.2 Laboratory Quality Control Requirements
The laboratory must conduct QC analyses for each batch of samples. Each batch shall consist of no more
than 10 samples. Unique laboratory quality control lot numbers must be assigned to each batch of
samples. The lot number must associate each batch of field samples to the appropriate measures such
as laboratory control samples. Table 32 provides a summary of the quality control requirements
including sample receipt and processing.
Table 32. Quality Control Activities for Sediment Toxicity Samples

Quality Control Activity: Laboratory demonstrates competency for conducting sediment toxicity analyses
Description and Requirements: EPA will review SOPs, lab certifications, past performance results, etc. as part of the lab verification process.
Corrective Action: EPA will not approve any laboratory for NCCA sample processing if the laboratory cannot demonstrate competency. In other words, EPA will select another laboratory that can demonstrate competency for its NCCA samples.

Quality Control Activity: Check condition of sample when it arrives.
Description and Requirements: Sample issues, such as cracked or leaking container; missing label; temperature; adherence to holding time requirements; insufficient volume for test.
Corrective Action: Assign appropriate condition code identified in Table 8.1 of the LOM.
Quality Control Activity: Sample storage
Description and Requirements: All samples: 4 °C at arrival at the laboratory (temperature recorded at arrival) and while stored at the laboratory.
Corrective Action: Record temperature upon arrival at the laboratory. Check temperature of the refrigerator where samples are stored at least daily if using a continuous temperature logger and twice daily (beginning and end of day) if the lab does not have a continuous logger. If the refrigerator is warmer than required, note temperature and duration (either from the continuous temperature log or from the last manual reading) in the comment field. The lab will still perform the test. EPA expects that the laboratory will exercise every effort to maintain samples at the correct temperature.

Quality Control Activity: Holding Time
Description and Requirements: The test must be completed within 8 weeks after sample collection. If the original test fails, then the retest also must be conducted within the 8 weeks after sample collection.
Corrective Action: Perform test, but note reason for performing test outside holding time. EPA expects that the laboratory will exercise every effort to perform tests before the holding time expires.

Quality Control Activity: Check that the organisms are healthy before starting the test
Description and Requirements: Unhealthy organisms may appear discolored or otherwise stressed (for example, greater than 20 percent mortality for the 48 hours before the start of a test).
Corrective Action: Don't start the test using unhealthy organisms.

Quality Control Activity: Maintain conditions as required in Section 8.3 of the LOM
Description and Requirements: Check conditions (e.g., temperature, DO) each test day. Record conditions in bench sheet or in laboratory database.
Corrective Action: Note any deviations in the comments field. In extreme cases, conduct a new toxicity test for all samples affected by the adverse conditions.

Quality Control Activity: Control survival rates
Description and Requirements: For a test of a batch of samples to be considered valid, mean control survival in Hyalella azteca and Leptocheirus plumulosus treatments must remain >80% and >90%, respectively.
Corrective Action: The data template includes a field to record if a test passed or failed the control requirements. If a test fails, retest all samples in the batch. Report both the original and retest results. If both tests fail, submit data to EPA for further consideration. Include comments in the data template noting any particular factors that may have caused the test to fail twice.
*Chapter 2 of the LOM provides contact information for the EPA HQ NCCA Laboratory Review Coordinator. Laboratories under
contract to EPA must contact the Task Order's Contracting Officer's Representative (TOCOR) instead of the Laboratory Review
Coordinator.
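The control-survival validity rule in Table 32 reduces to comparing mean replicate survival against a species-specific floor. A sketch, using the greater-than-or-equal reading of the LOM Section 8.7 requirement (an assumption; the table phrases it as >80% and >90%):

```python
# Minimum mean control survival, percent, by test species (from Table 32).
MIN_CONTROL_SURVIVAL = {
    "Hyalella azteca": 80.0,
    "Leptocheirus plumulosus": 90.0,
}

def control_test_passes(species, replicate_survival_pcts):
    """True if mean control survival meets the species-specific minimum."""
    mean = sum(replicate_survival_pcts) / len(replicate_survival_pcts)
    return mean >= MIN_CONTROL_SURVIVAL[species]

print(control_test_passes("Leptocheirus plumulosus", [95, 90, 85, 100, 90]))  # True
print(control_test_passes("Hyalella azteca", [70, 75, 80, 75]))               # False
```

A failed control test triggers the retest-and-report path described in the corrective action above.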
5.6.3.3 Data Reporting
Data reporting units and significant figures are given in Table 33.
Table 33. Data Reporting Review Criteria: Sediment Toxicity
5.6.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA 2015 FOM. That quality is
enhanced by the training and experience of project staff and documentation of sampling activities.
Crews will collect a sediment sample for sediment toxicity. Field crews will verify that all sample
containers are uncontaminated and intact, and that all sample labels are legible and intact.
Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the sediment sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Store the sample on wet ice.
Recheck all forms and labels for completeness and legibility.
5.6.4.1 Field Performance Requirements
Not Applicable
5.6.4.2 Field Quality Control Requirements
Any contamination of the samples can produce significant errors in the resulting interpretation. Crews
must take care not to contaminate the sediment with the tools used to collect the sample (i.e., the
sampler, spoons, mixing bucket) and not to mix the surface layer with the deeper sediments. Prior to
sampling at each site, crews must clean the sampler and collection tools that will come into contact with
the sediment with Alconox and rinse them with ambient water at the site. Field processing quality
control requirements are summarized in Table 34.
Table 34. Sample Collection and Field Processing Quality Control: Sediment Toxicity Indicator

Quality Control Activity: Check integrity of sample containers and labels
Description and Requirements: Clean, intact containers and labels.
Corrective Action: Obtain replacement supplies.

Quality Control Activity: Sample Volume
Description and Requirements: Preferred maximum volume 2000 mL; minimum volume 900 mL (marine). For Great Lakes sites, preferred volume is 900 mL; minimum is 400 mL.
Corrective Action: Qualify samples if less than 900 mL is available to submit to the lab (less than 400 mL for Great Lakes sites).

Quality Control Activity: Sample Storage (field)
Description and Requirements: Store sediment samples on wet ice and in a dark place (cooler).
Corrective Action: Discard and recollect sample.

Quality Control Activity: Holding time
Description and Requirements: Refrigerated samples must be shipped on wet ice within 1 week of collection.
Corrective Action: Qualify samples.
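The Table 34 volume thresholds can be sketched as a qualification check; the returned flags and site-type keys are illustrative, not NCCA qualifier codes:

```python
# Minimum sediment toxicity sample volumes (mL) from Table 34.
MIN_VOLUME_ML = {"marine": 900, "great_lakes": 400}

def volume_flag(site_type, volume_ml):
    """Return 'qualify' when the submitted volume is below the minimum."""
    return "qualify" if volume_ml < MIN_VOLUME_ML[site_type] else "ok"

print(volume_flag("marine", 1200))      # ok
print(volume_flag("great_lakes", 350))  # qualify
```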
5.6.5 Data Review
Checks made of the data in the process of review, verification, and validation are summarized in Table
35. The NCCA Project QA Coordinator is ultimately responsible for ensuring the validity of the data,
although performance of the specific checks may be delegated to other staff members.
Table 35. Data Validation Quality Control: Sediment Toxicity

Activity or Procedure: Summary statistics and/or exploratory data analysis (e.g., box-and-whisker plots)
Requirements and Corrective Action: Correct reporting errors or qualify as suspect or invalid.

Activity or Procedure: Review data from reference toxicity samples
Requirements and Corrective Action: Determine impact and possible limitations on overall usability of data.
5.7 Fecal Indicator: Enterococci
5.7.1 Introduction
The primary function of collecting water samples for Pathogen Indicator Testing is to provide a relative
comparison of fecal pollution indicators for coastal waters. The concentration of Enterococci (the
current bacterial indicator for fresh and marine waters) in a water body correlates with the level of more
infectious gastrointestinal pathogens present in the water body. While some Enterococci are
opportunistic pathogens among immuno-compromised human individuals, the presence of Enterococci
is more importantly an indicator of the presence of more pathogenic microbes (bacteria, viruses and
protozoa) associated with human or animal fecal waste.
5.7.2 Sampling Design and Methods
Detailed sample collection and handling procedures are described in the NCCA 2015 Field Operations
Manual.
5.7.3 Pertinent Laboratory QA/QC Procedures
Pertinent laboratory QA/QC procedures are in the EPA ORD manuals/QAPP.
5.7.3.1 Data Reporting, Review and Management
Checks made of the data in the process of review, verification, and validation are summarized in Table
36. All raw data (including all standardized forms and logbooks) are retained in an organized fashion for
seven years or until written authorization for disposition has been received from the NCCA Project Lead.
Once data have passed all acceptance requirements, the data are submitted to the NARS Project Lead
and then to the NARS IM processing center.
Table 36. Data Validation Quality Control: Fecal Indicator

Check: Duplicate sampling
Description and Frequency: Duplicate composite samples collected at 10% of sites
Acceptance Criteria: Measurements should be within 10 percent
Corrective Action: Review data for reasonableness; determine if acceptance criteria need to be modified

Check: Field filter blanks
Description and Frequency: Field blanks filtered at 10% of sites
Acceptance Criteria: Measurements should be within 10 percent
Corrective Action: Review data for reasonableness; determine if acceptance criteria need to be modified

DATA PROCESSING & REVIEW

Check: 100% verification and review of qPCR data
Description and Frequency: All qPCR amplification traces, raw and processed data sheets
Acceptance Criteria: All final data will be checked against raw data, exported data, and calculated data printouts before entry into LIMS and upload to NARS IM.
Corrective Action: Second tier review by contractor and third tier review by EPA.
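Interpreting the Table 36 criterion "within 10 percent" as a relative percent difference of at most 10 between a sample and its duplicate (one reasonable reading; the table does not state the convention explicitly), the check looks like:

```python
def duplicates_within_10_percent(result, duplicate):
    """True when the duplicate pair agrees within a 10% relative difference."""
    if result == duplicate:
        return True  # also covers the 0/0 case for clean field blanks
    mean = (result + duplicate) / 2.0
    return abs(result - duplicate) / mean * 100.0 <= 10.0

print(duplicates_within_10_percent(100.0, 105.0))  # True  (RPD ~4.9%)
print(duplicates_within_10_percent(100.0, 125.0))  # False (RPD ~22%)
```

Pairs that fail would be reviewed for reasonableness, per the corrective action in Table 36.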
5.7.4 Pertinent Field QA/QC Procedures
5.7.4.1 Field Performance Requirements
Not Applicable
5.7.4.2 Field Quality Control Requirements
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA Field Operations Manual. That
quality is enhanced by the training and experience of project staff and documentation of sampling
activities. Specific quality control measures are listed in Table 37 for field measurements and
observations.
Table 37. Sample Collection and Field Processing Quality Control: Fecal Indicator

Quality Control Activity: Check integrity of sample containers and labels
Description and Requirements: Clean, intact containers and labels.
Corrective Action: Obtain replacement supplies.
Quality Control Activity: Sterility of sample containers
Description and Requirements: Sample collection bottle and filtering apparatus are sterile and must be unopened prior to sampling. Nitrile gloves must be worn during sampling and filtering.
Corrective Action: Discard sample and recollect in the field.

Quality Control Activity: Sample Collection
Description and Requirements: Collect sample at the last transect to minimize holding time before filtering and freezing.
Corrective Action: Discard sample and recollect in the field.

Quality Control Activity: Sample holding
Description and Requirements: Sample is held in a cooler on wet ice until filtering.
Corrective Action: Discard sample and recollect in the field.

Quality Control Activity: Field Processing
Description and Requirements: Sample is filtered within 6 hours of collection and filters are frozen on dry ice.
Corrective Action: Discard sample and recollect in the field.

Quality Control Activity: Field Blanks
Description and Requirements: Field blanks must be filtered at 10% of sites.
Corrective Action: Review blank data and flag sample data.
5.8 Whole Fish Tissue Samples for Ecological Analysis
5.8.1 Introduction
Fish collected as indicators of ecological contamination (Eco-fish) will be collected at all sites and
analyzed for whole-body concentrations of organic and inorganic contaminants. This includes the
analysis and reporting of lipid content, sample weight, and percent moisture. Results from these
analyses will be used to help determine the ecological integrity of U.S. coastal resources.
5.8.2 Sample Design and Methods
Detailed sample collection and handling procedures are described in the NCCA 2015 Field Operations
Manual. Laboratory methods are in the NCCA 2015 Laboratory Operations Manual.
5.8.3 Pertinent Laboratory QA/QC Procedures
5.8.3.1 Laboratory Performance Requirements
A single central laboratory shall perform analysis of the homogenized composites to determine the lipid
content and concentrations of metals, mercury, pesticides, and PCBs. EPA also may require the national
contract laboratory to analyze the samples for PAHs; however, EPA will not require the State
laboratories to analyze for them. With the exception of sea urchins, NCCA does not provide support for
analyses of any other invertebrates such as crustaceans (e.g., lobster, crabs).
Laboratories may choose to use any analysis method that measures contaminants to the levels of the
method detection limits identified in Table 39. In addition, the method must meet the target precision
of 30% and the target accuracy identified in Table 38.
Table 38. Whole Fish Tissue: Precision and Accuracy Objectives

Parameter                               Precision   Accuracy
Metals                                  30%         20%
Organics (PCBs, pesticides, and PAHs)   30%         35%
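Treating precision as the duplicate RPD and accuracy as spike recovery bounded symmetrically around 100% (an assumption; Table 38 does not state the recovery convention), a QC result can be screened against the Table 38 objectives like this:

```python
# Precision/accuracy objectives restated from Table 38 (keys are shorthand).
TARGETS = {
    "metals":   {"precision": 30.0, "accuracy": 20.0},
    "organics": {"precision": 30.0, "accuracy": 35.0},
}

def meets_objectives(group, rpd_pct, recovery_pct):
    """Check a duplicate RPD and a spike recovery against Table 38 targets."""
    t = TARGETS[group]
    return (rpd_pct <= t["precision"]
            and abs(recovery_pct - 100.0) <= t["accuracy"])

print(meets_objectives("metals", 12.0, 85.0))     # True
print(meets_objectives("organics", 40.0, 110.0))  # False (precision exceeded)
```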
Table 39. Whole Body Fish: Required Contaminants

Type: LIPID — Units: % wet weight
Parameter: % LIPID

Type: METAL — Units: µg/wet g (mg/L)
Parameter    CAS Number   Target MDL
Aluminum     7429-90-5    10.0
Arsenic      7440-38-2    2.0
Cadmium      7440-43-9    0.2
Chromium     7440-47-3    0.1
Copper       7440-50-8    5.0
Iron         7439-89-6    50.0
Lead         7439-92-1    0.1
Mercury      7439-97-6    0.01
Nickel       7440-02-0    0.5
Selenium     7782-49-2    1.0
Silver       7440-22-4    0.3
Tin          7440-31-5    0.05
Vanadium     7440-62-2    1.0
Zinc         7440-66-6    50.0

Type: PCB — Units: ng/wet g (µg/L)
Parameter                                     CAS Number   PCB Number   Target MDL
2,2',3,3',4,4',5,5',6,6'-Decachlorobiphenyl   2051-24-3    209          2.0
2,4'-Dichlorobiphenyl                         34883-43-7   8            2.0
2,2',3,4',5,5',6-Heptachlorobiphenyl          35065-29-3   180          2.0
2,2',3,3',4,4',5,6-Octachlorobiphenyl         52663-78-2   195          2.0
2,2',3,4,5,5',6-Heptachlorobiphenyl           52663-68-0   187          2.0
2,2',3,3',4,4'-Hexachlorobiphenyl             38380-07-3   128          2.0
2,2',3,3',4,4',5-Heptachlorobiphenyl          35065-30-6   170          2.0
2,2',3,4,4',5'-Hexachlorobiphenyl             35065-28-2   138          2.0
2,2',4,4',5,5'-Hexachlorobiphenyl             35065-27-1   153          2.0
2,2',3,3',4,4',5,5',6-Nonachlorobiphenyl      40186-72-9   206          2.0
2,3,3',4,4'-Pentachlorobiphenyl               32598-14-4   105          2.0
2,2',4,5,5'-Pentachlorobiphenyl               37680-73-2   101          2.0
2,3',4,4',5-Pentachlorobiphenyl               31508-00-6   118          2.0
2,3,3',4',6-Pentachlorobiphenyl               38380-03-9   110          2.0
3,3',4,4',5-Pentachlorobiphenyl               57465-28-8   126          2.0
2,2',3,5'-Tetrachlorobiphenyl                 41464-39-5   44           2.0
3,3',4,4'-Tetrachlorobiphenyl                 32598-13-3   77           2.0
2,2',5,5'-Tetrachlorobiphenyl                 35693-99-3   52           2.0
2,3',4,4'-Tetrachlorobiphenyl                 32598-10-0   66           2.0
2,2',5-Trichlorobiphenyl                      37680-65-2   18           2.0
Type: PCB (continued) — Units: ng/wet g (µg/L)
Parameter                    CAS Number   PCB Number   Target MDL
2,4,4'-Trichlorobiphenyl     7012-37-5    28           2.0

Type: PEST — Units: ng/wet g (µg/L)
Parameter                    CAS Number   Target MDL
2,4'-DDD                     53-19-0      2.0
2,4'-DDE                     3424-82-6    2.0
2,4'-DDT                     789-02-6     2.0
4,4'-DDD                     72-54-8      2.0
4,4'-DDE                     72-55-9      2.0
4,4'-DDT                     50-29-3      2.0
Aldrin                       309-00-2     2.0
Alpha-BHC                    319-84-6     2.0
Beta-BHC                     319-85-7     2.0
Delta-BHC                    319-86-8     2.0
Alpha-Chlordane              5103-71-9    2.0
Gamma-Chlordane              5566-34-7    2.0
Dieldrin                     60-57-1      2.0
Endosulfan I                 959-98-8     2.0
Endosulfan II                33213-65-9   2.0
Endosulfan Sulfate           1031-07-8    2.0
Endrin                       72-20-8      2.0
Endrin Aldehyde              7421-93-4    2.0
Endrin Ketone                53494-70-5   2.0
Heptachlor                   76-44-8      2.0
Heptachlor Epoxide           1024-57-3    2.0
Hexachlorobenzene            118-74-1     2.0
Lindane                      58-89-9      2.0
Mirex                        2385-85-5    2.0
Cis-Nonachlor                5103-73-1    2.0
Oxychlordane                 26880-48-8   2.0
Trans-Nonachlor              39765-80-5   2.0

Type: PAHs* — Units: ng/wet g (µg/L)
Parameter                    CAS Number   Target MDL
Acenaphthene                 83-32-9      2.0
Acenaphthylene               208-96-8     2.0
Anthracene                   120-12-7     2.0
Benz(a)anthracene            56-55-3      2.0
Benzo(b)fluoranthene         205-99-2     2.0
Benzo(k)fluoranthene         207-08-9     2.0
Benzo(g,h,i)perylene         191-24-2     2.0
Benzo(a)pyrene               50-32-8      2.0
Benzo(e)pyrene               192-97-2     2.0
Biphenyl                     92-52-4      2.0
Chrysene                     218-01-9     2.0
Dibenz(a,h)anthracene        53-70-3      2.0
Dibenzothiophene             132-65-0     2.0
2,6-Dimethylnaphthalene      581-42-0     2.0
Fluoranthene                 206-44-0     2.0
Fluorene                     86-73-7      2.0
Indeno(1,2,3-c,d)pyrene      193-39-5     2.0
1-Methylnaphthalene          90-12-0      2.0
2-Methylnaphthalene          91-57-6      2.0
1-Methylphenanthrene         832-69-9     2.0
Naphthalene                  91-20-3      2.0
Perylene                     198-55-0     2.0
Phenanthrene                 85-01-8      2.0
Pyrene                       129-00-0     2.0
2,3,5-Trimethylnaphthalene   2245-38-7    2.0
* EPA also may require the national contract laboratory to analyze the samples for PAHs; however, EPA will not require the
State laboratories to analyze for them.
5.8.3.2 Laboratory Quality Control Requirements
The laboratory must conduct QC analyses for each batch of samples. Each batch shall consist of no more
than 20 samples. Unique laboratory quality control lot numbers must be assigned to each batch of
samples. The lot number must associate each batch of field samples with the appropriate measures,
such as laboratory control sample, matrix spike, laboratory duplicate, and method blank samples. Also,
each laboratory QC sample (i.e., preparation and instrument blanks, laboratory control sample (LCS),
spike/duplicate, etc.) must be given a unique sample identification. Table 40 provides a summary of the
quality control requirements, including sample receipt and processing.
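The batching rule above (at most 20 samples per batch, each batch tied to a unique QC lot number) can be sketched as follows; the lot-number format is invented for illustration:

```python
def make_batches(sample_ids, max_per_batch=20):
    """Split field samples into QC batches, each under a unique lot number."""
    batches = {}
    for i in range(0, len(sample_ids), max_per_batch):
        lot = "LOT-{:03d}".format(i // max_per_batch + 1)  # illustrative format
        batches[lot] = sample_ids[i:i + max_per_batch]
    return batches

# 45 samples split into batches of 20, 20, and 5:
batches = make_batches(["S%04d" % n for n in range(1, 46)])
print(len(batches))             # 3
print(len(batches["LOT-001"]))  # 20
print(len(batches["LOT-003"]))  # 5
```

Each lot number would then key the batch's QC measures (LCS, matrix spike, duplicate, method blank) back to its field samples.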
Table 40. Whole Body Fish: Quality Control Activities

Quality Control Activity: Demonstrate competency for analyzing fish samples with the required methods
Description and Requirements: Demonstration of competency with fish samples in achieving the method detection limits, accuracy, and precision targets.
Corrective Action: EPA will not approve any laboratory for NCCA sample processing if the laboratory cannot demonstrate competency. In other words, EPA will select another laboratory that can demonstrate competency for its NCCA samples.

Quality Control Activity: Check condition of sample when it arrives.
Description and Requirements: Sample issues, such as punctures or rips in wrapping; missing label; temperature; adherence to holding time requirements; sufficient volume for test. All samples should arrive at the laboratory in a frozen state.
Corrective Action: Assign appropriate condition code identified in Table 5.1 of the LOM.

Quality Control Activity: Store sample appropriately. While stored at the laboratory, the sample must be kept at a maximum temperature of −20 °C.
Description and Requirements: Check the temperature of the freezer per the laboratory's standard operating procedures.
Corrective Action: Record temperature of sample upon arrival at the laboratory. If at any other time samples are warmer than required, note temperature and duration in comment field.

Quality Control Activity: Determine if all fish meet the criteria
Description and Requirements: Evaluate if the sample contains fish of the same species that are similar in size (within 75%), and provides enough material to run the analysis.
Corrective Action: Contact the EPA HQ NCCA Laboratory Review Coordinator* for a decision on fish selection and/or chemical analysis.

Quality Control Activity: Analyze sample within holding time
Description and Requirements: The test must be completed within the holding time (i.e., 28 days for mercury; 6 months for other metals; and 1 year for all others). If the original test fails, then the retest also must be conducted within the holding time.
Corrective Action: Perform test, but note reason for performing test outside holding time. EPA expects that the laboratory will exercise every effort to perform tests before the holding time expires.

Quality Control Activity: Perform once at the start of each batch to evaluate the labeled compound recovery (LCR) in a Laboratory Control Sample (LCS). This tests the performance of the equipment.
Description and Requirements: Control limits for recovery cannot exceed 100±20%.
Corrective Action: First, prepare and analyze one additional LCS. If the second LCS meets the requirement, then no further action is required. If the second LCS fails, then determine and correct the problem before proceeding with any sample analyses.
Quality Control Activity: Perform once at the start of each batch to evaluate the entire extraction and analysis process using a Method Blank.
Description and Requirements: Control limits cannot exceed the laboratory reporting level (LRL).
Corrective Action: First, prepare and analyze one additional blank. If the second blank meets the requirement, then no further action is required. If the second blank fails, then determine and correct the problem (e.g., homogenization, reagent contamination, instrument calibration, or contamination introduced during filtration) before proceeding with any sample analyses. Reestablish statistical control by analyzing three blank samples. Report values of all blanks analyzed.

Quality Control Activity: Check calibration immediately before and immediately after the sample batch is run (abbreviated as QCCS for quality control check sample).
Description and Requirements: Results must be ±10% of each other or as specified in method criteria.
Corrective Action: If calibration fails before analysis, recalibrate and reanalyze QCCS until it passes. If the check fails after all samples in the batch have been analyzed, verify the QCCS reading. If the QCCS reading fails a second time, then reanalyze all samples in the batch and report both sets of results. For the first run, include a data qualifier that indicates that the QCCS reading taken immediately following the first run failed. For the second run, include a data qualifier that indicates that it is the second set and whether the QCCS reading immediately following that second run passed. No sample is to be analyzed more than twice.

Quality Control Activity: Evaluate rinsates for the first sample in each batch. This evaluation is a surrogate for assessing cross-contamination.
Description and Requirements: Results must be below the laboratory's LRL.
Corrective Action: If the original rinsate was above the LRL, analyze a rinsate from a second sample. If the second rinsate sample also has results above the LRL, then assign a data qualifier to all samples in the batch for the parameters with results above the LRL in the rinsates. Also, improve procedures for cleaning all surfaces, knives, and homogenization equipment between samples.

Quality Control Activity: Compare lipids in triplicate for the first sample in each batch. This evaluation is a surrogate for assessing homogenization.
Description and Requirements: Substitute the LRL for any value below the LRL before calculating the RSD. If the RSD of the triplicate results is <20%, then the homogenization effort is judged to be sufficient for all samples in the batch.
Corrective Action: If the RSD could not be achieved, then regrind all samples in the batch one or more times as described in Section 5.5 of the LOM.

Quality Control Activity: Compare results of one laboratory duplicate sample or matrix spike duplicate sample for each batch.
Description and Requirements: Results must be within the target precision goal in Table 38 (30% for all analytes).
Corrective Action: If both results are below the LRL, then conclude that the test has passed. Otherwise, prepare and analyze a split from a different sample in the batch. If the second result is within the target precision goal (see Table 38) of the original sample, then report the data and findings for both QC samples. However, if the two results differ by more than the target precision goal, review precision of QCCS measurements for the batch, check preparation of the split sample, etc., and report the evaluation and findings in the case narrative. Consult with the EPA HQ NCCA Laboratory Review Coordinator* to determine if reanalysis of the entire batch (at the laboratory's expense) is necessary. If no reanalysis is necessary, report and quantify all samples in the batch. If reanalysis is necessary, then report all QC samples and the 2nd analysis of the batch. If the second set also is unacceptable, then assign a data code to each sample in the batch.
Quality Control Activity: Compare results of one matrix spike sample per batch to evaluate performance in matrix.
Evaluation/Acceptance Criteria: Evaluate performance after the first 3 batches. Ideally, control limits for recovery will not exceed the target accuracy goal (Table 38), but this may not be realistic for all parameters with this matrix.
Corrective Action: If both results are below the LRL, then conclude that the test has passed for the batch. Otherwise, if any results are not within the target accuracy goal for the 3 batches, contact the EPA HQ NCCA Laboratory Review Coordinator* within 2 working days to discuss method performance and potential improvements. Continue to perform the test for every batch. Report the results from the original analysis, the matrix spike, the matrix spike duplicate, and %recovery.
Quality Control Activity: Maintain the required MDL for each parameter.
Evaluation/Acceptance Criteria: Evaluate for each sample.
Corrective Action: If the MDL could not be achieved, then provide a dilution factor or QC code and explanation in the comment field.
Quality Control Activity: Use consistent units for QC samples and field samples.
Evaluation/Acceptance Criteria: Verify that all units are provided in wet weight units and consistently within each indicator type as follows: metals in µg/g or ppm; PCBs, pesticides, and PAHs in ng/g or µg/L.
Corrective Action: If dry units are reported for any sample (QC or field), reanalyze the sample and report only the reanalysis results. If it is not possible to provide the results in wet units, then assign a QC code and describe the reason for dry units in the comments field of the database.
Quality Control Activity: Maintain completeness.
Evaluation/Acceptance Criteria: Completeness objective is 95% for all parameters.
Corrective Action: Contact the EPA HQ NCCA Laboratory Review Coordinator* immediately if issues affect the laboratory's ability to meet the completeness objective.
*Chapter 2 of the LOM provides contact information for the EPA HQ NCCA Laboratory Review Coordinator. Laboratories under
contract to EPA must contact the Task Order's Contracting Officer's Representative (TOCOR) instead of the Laboratory Review
Coordinator.
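The triplicate-RSD homogenization check and the duplicate-precision check described above can be sketched as follows. This is an illustrative example only, not part of the QAPP or any laboratory SOP; the function names are invented, and the thresholds (20% RSD, 30% precision goal) are taken from the text.

```python
# Illustrative sketch of the homogenization (triplicate RSD) and
# duplicate-precision checks. Not part of the required procedure.
from statistics import mean, stdev

def triplicate_rsd_passes(results, lrl, max_rsd_pct=20.0):
    """Substitute the LRL for any value below the LRL, then test RSD < 20%."""
    adjusted = [max(v, lrl) for v in results]
    rsd_pct = 100.0 * stdev(adjusted) / mean(adjusted)
    return rsd_pct < max_rsd_pct

def duplicate_within_goal(original, duplicate, lrl, goal_pct=30.0):
    """Pass if both results are below the LRL, or if the relative percent
    difference is within the target precision goal (Table 38: 30%)."""
    if original < lrl and duplicate < lrl:
        return True
    rpd_pct = 100.0 * abs(original - duplicate) / mean([original, duplicate])
    return rpd_pct <= goal_pct
```

A batch would fail the duplicate check, for example, if the original and duplicate differed by 40% and both exceeded the LRL; the laboratory would then analyze a split from a different sample as the table directs.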
5.8.3.3 Data Reporting
Data reporting units and significant figures are given in Table 41.
Table 41. Data Reporting Criteria: Eco-Fish Tissue Chemistry

Measurement: Pesticides and PCBs. Units: ng/g; ppb (dry wt and fish tissue wet weight). Expressed to the nearest: 0.01.
Measurement: Metals. Units: µg/g; ppm (dry wt and fish tissue wet weight). Expressed to the nearest: 0.01.
Measurement: Hg. Units: µg/g; ppm (dry wt and fish tissue wet weight). Expressed to the nearest: 0.001.
Measurement: PAHs. Units: ng/g; ppb (dry wt). Expressed to the nearest: 0.01.
5.8.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA 2015 FOM. That quality is
enhanced by the training and experience of project staff and documentation of sampling activities.
Crews will collect whole fish samples for analysis of organic and inorganic contaminants. Field crews will
verify that all sample containers are uncontaminated and intact, and that all sample labels are legible
and intact.
Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the fish sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Store the sample frozen.
Recheck all forms and labels for completeness and legibility.
5.8.4.1 Field Performance Requirements
Specific field performance requirements/checks are listed in Table 42.
Table 42. Method Quality Objectives for Field Measurement for Eco-Fish Indicator

Quality Control Activity: 75% rule.
Description and Requirements: Length of smallest fish in the composite must be at least 75% of the length of the longest fish.
Corrective Action: Indicator lead will review composite data and advise the lab before processing begins.
5.8.4.2 Field Quality Control Requirements
Specific quality control measures are listed in Table 43 for field measurements and observations.
Table 43. Field Quality Control: Whole Fish Tissue Samples for Ecological Analysis

Quality Control Activity: Check integrity of sample containers and labels.
Description and Requirements: Clean, intact containers and labels.
Corrective Action: Obtain replacement supplies.

Quality Control Activity: Set up fishing equipment.
Description and Requirements: An experienced fisheries biologist sets up the equipment. If results are poor, a different method may be necessary.
Corrective Action: Note on field data sheet.

Quality Control Activity: Field processing.
Description and Requirements: The fisheries biologist will identify specimens in the field using a standardized list of common and scientific names. A re-check will be performed during processing.
Corrective Action: Attempt to catch more fish of the species of interest.

Quality Control Activity: Holding time.
Description and Requirements: Frozen samples must be shipped on dry ice within 2 weeks of collection.
Corrective Action: Qualify samples.

Quality Control Activity: Sample storage (field).
Description and Requirements: Keep frozen and check integrity of sample packaging.
Corrective Action: Qualify sample as suspect for all analyses.
5.8.5 Data Review
Checks made of the data in the process of review, verification, and validation are summarized in Table
44 and Table 45. The NCCA Project QA Coordinator is ultimately responsible for ensuring the validity of
the data, although performance of the specific checks may be delegated to other staff members.
Table 44. Data Validation Quality Control: Eco-Fish

Activity or Procedure: Summary statistics and/or exploratory data analysis (e.g., box and whisker plots).
Requirements and Corrective Action: Correct reporting errors or qualify as suspect or invalid.

Activity or Procedure: Review data from reference toxicity samples.
Requirements and Corrective Action: Determine impact and possible limitations on overall usability of data.
Table 45. Data Validation Quality Control: Eco-Fish Tissue Indicator

Check Description: Taxonomic "reasonableness" checks.
Frequency: All data sheets.
Acceptance Criteria: Generally known to occur in coastal waters or geographic area.
Corrective Action: Second or third identification by expert in that taxon.

Check Description: Composite validity check.
Frequency: All composites.
Acceptance Criteria: Each composite sample must have 5 fish of the same species.
Corrective Action: Indicator lead will review composite data and advise the lab before processing begins.

Check Description: 75% rule.
Frequency: All composites.
Acceptance Criteria: Length of smallest fish in the composite must be at least 75% of the length of the longest fish.
Corrective Action: Indicator lead will review composite data and advise the lab before processing begins.
5.9 Fish Tissue Filets (Great Lakes)
5.9.1 Introduction
Fish are time-integrating indicators of persistent pollutants, and contaminant bioaccumulation in
fish tissue has important human and ecological health implications. The NCCA fish tissue fillet
collection will provide information on the distribution of selected chemical residues (mercury,
polychlorinated biphenyls (PCBs), fatty acids, perfluorinated compounds (PFCs), and additional
contaminants of emerging concern (e.g., polybrominated diphenyl ethers or PBDEs)) in predator
fish species from the Great Lakes.
The fish tissue indicator procedures are based on EPA's National Study of Chemical Residues in
Lake Fish Tissue (USEPA 2000a) and EPA's Guidance for Assessing Chemical Contaminant Data for
Use in Fish Advisories, Volume 1 (Third Edition) (USEPA 2000b).
5.9.2 Sampling Design and Methods
Field crews collect human health fish tissue composites at a subset of 150 of the Great Lakes sites
(30 sites per lake). Fish tissue samples must consist of a composite of fish (i.e., five individuals of
one predator species that will collectively provide greater than 500 grams of fillet tissue) from
each site.
As with the ecological fish tissue samples, crews collect human health fish tissue samples using
any reasonable method that represents the most efficient or best use of the available time on
station (e.g., gill net, otter trawl, or hook and line) to obtain the target recommended predator
species (Table 46). Five fish will be collected per composite at each site, all of which must be large
enough to provide sufficient tissue for analysis (i.e., 500 grams of fillets, collectively). Fish in each
composite must all be of the same species, satisfy legal requirements of harvestable size (or be of
consumable size if there are no harvest limits), and be of similar size so that the smallest individual
in the composite is no less than 75% of the total length of the largest individual. If the
recommended target species are unavailable, the on-site fisheries biologist will select an
alternative species (i.e., a predator species that is commonly consumed in the study area, with
specimens of harvestable or consumable size, and in sufficient numbers to yield a composite).
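The composite-acceptance rules above (five fish of one species, at least 500 grams of fillet tissue collectively, and the 75% length rule) can be sketched as follows. This is an illustrative example only; the Fish record and function name are invented for illustration.

```python
# Illustrative sketch of the composite-acceptance checks for human health
# fish tissue samples. Not part of the required field procedure.
from dataclasses import dataclass

@dataclass
class Fish:
    species: str
    total_length_mm: float
    fillet_mass_g: float  # estimated fillet yield for this specimen

def composite_is_valid(fish):
    """Five fish, one species, >= 500 g of fillet collectively, 75% rule."""
    if len(fish) != 5:
        return False
    if len({f.species for f in fish}) != 1:
        return False
    if sum(f.fillet_mass_g for f in fish) < 500.0:
        return False
    lengths = [f.total_length_mm for f in fish]
    # 75% rule: smallest individual no less than 75% of the longest
    return min(lengths) >= 0.75 * max(lengths)
```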
Table 46. Recommended Target Species: Whole Fish Tissue Collection

Centrarchidae: Ambloplites rupestris (Rock bass); Micropterus dolomieu (Smallmouth bass); Micropterus salmoides (Largemouth bass); Pomoxis annularis (White crappie); Pomoxis nigromaculatus (Black crappie)
Cyprinidae: Cyprinus carpio (Common carp)
Esocidae: Esox lucius (Northern pike); Esox masquinongy (Muskellunge); Esox niger (Chain pickerel)
Ictaluridae: Ictalurus punctatus (Channel catfish)
Gadidae: Lota lota (Burbot)
Moronidae: Morone americana (White perch); Morone chrysops (White bass)
Percidae: Perca flavescens (Yellow perch); Sander canadensis (Sauger); Sander vitreus (Walleye)
Salmonidae: Coregonus clupeaformis (Lake whitefish); Oncorhynchus gorbuscha (Pink salmon); Oncorhynchus kisutch (Coho salmon); Oncorhynchus tshawytscha (Chinook salmon); Oncorhynchus mykiss (Rainbow trout); Salmo salar (Atlantic salmon); Salmo trutta (Brown trout); Salvelinus namaycush (Lake trout)
Sciaenidae: Aplodinotus grunniens (Freshwater drum)

SECONDARY HUMAN HEALTH FISH TISSUE TARGET SPECIES
Catostomidae: Carpiodes cyprinus (Quillback); Catostomus catostomus (Longnose sucker); Catostomus commersonii (White sucker); Hypentelium nigricans (Northern hogsucker); Ictiobus cyprinellus (Bigmouth buffalo); Ictiobus niger (Black buffalo)
Centrarchidae: Lepomis cyanellus (Green sunfish); Lepomis gibbosus (Pumpkinseed); Lepomis gulosus (Warmouth); Lepomis macrochirus (Bluegill); Lepomis megalotis (Longear sunfish)
Ictaluridae: Ameiurus melas (Black bullhead); Ameiurus natalis (Yellow bullhead); Ameiurus nebulosus (Brown bullhead)
Salmonidae: Coregonus artedi (Cisco/lake herring); Coregonus hoyi (Bloater); Prosopium cylindraceum (Round whitefish); Salvelinus fontinalis (Brook trout)
5.9.3 Sampling and Analytical Methodologies
Detailed methods and handling for samples are found in the NCCA 2015 FOM.
5.9.4 Pertinent Laboratory QA/QC Procedures
Detailed methods and handling for samples are in the EPA OST Manuals/QAPP.
5.9.5 Pertinent Field QA/QC Procedures
5.9.5.1 Quality Assurance Objectives
The relevant quality objectives for fish tissue fillet sample collection activities are primarily related to
sample handling issues. Types of field sampling data needed for the fish tissue indicator are listed in
Table 47. Methods and procedures described in this QAPP and the FOMs are intended to reduce the
magnitude of the sources of uncertainty (and their frequency of occurrence) by applying:
standardized sample collection and handling procedures, and
trained scientists to perform the sample collection and handling activities.
Table 47. Field Data Types: Whole Fish Tissue Samples for Fillet Analysis
Variable or Measurement: Fish specimen. Measurement Endpoint or Unit: Species-level taxonomic identification.
Variable or Measurement: Fish length. Measurement Endpoint or Unit: Millimeters (mm), total length.
Variable or Measurement: Composite classification. Measurement Endpoint or Unit: Sample identification number.
Variable or Measurement: Specimen count. Measurement Endpoint or Unit: Specimen number.
5.9.5.2 Quality Control Procedures: Field Operations
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the FOM. That quality is enhanced by the
training and experience of project staff and documentation of sampling activities. Specific quality control
measures are listed in Table 48 for field measurements and observations.
Table 48. Field Quality Control: Whole Fish Tissue Samples for Fillet Analysis

Quality Control Activity: Check integrity of sample containers and labels.
Description and Requirements: Clean, intact containers and labels.
Corrective Action: Obtain replacement supplies.

Quality Control Activity: Field processing.
Description and Requirements: The crew will identify specimens in the field.
Corrective Action: Labs verify. If not the same species, the sample is not composited.

Quality Control Activity: Sample collection.
Description and Requirements: The crew will retain 5 specimens of the same species to form the composite sample.
Corrective Action: Labs verify. If not the same species, the sample is not composited.

Quality Control Activity: Sample collection.
Description and Requirements: The length of the smallest fish must be at least 75% of the length of the longest fish.
Corrective Action: If fish are outside the length range requirement, EPA is contacted for instructions.
5.9.6 Data Management, Review and Validation
Checks made of the data in the process of review, verification, and validation are summarized in Table
49. For the whole fish tissue fillet data, the Indicator Lead is ultimately responsible for ensuring the
validity of the data, although performance of the specific checks may be delegated to other staff
members. All raw data (including all standardized forms and logbooks) are retained in an organized
fashion for seven years or until written authorization for disposition has been received from the NCCA
Lead.
Table 49. Data Validation Quality Control: Whole Fish Tissue Samples for Fillet Analysis

Check Description: Composite validity check.
Frequency: All composites.
Acceptance Criteria: Each routine composite sample must have 5 fish of the same species.
Corrective Action: For non-routine composite samples, the EPA indicator lead is contacted for instructions before processing begins.

Check Description: 75% rule.
Frequency: All composites.
Acceptance Criteria: Length of smallest fish in the composite must be at least 75% of the length of the longest fish.
Corrective Action: For non-routine composite samples, the EPA indicator lead is contacted for instructions before processing begins.
5.10 Fish Tissue Plugs
5.10.1 Introduction
Fish are time-integrating indicators of persistent pollutants, and contaminant bioaccumulation in fish
tissue has important human and ecological health implications. The NCCA 2015 tissue plug will provide
information on the national distribution of mercury in fish species from all coastal waters.
5.10.2 Sample Design and Methods
Detailed methods and handling for samples are found in the NCCA 2015 Field Operations Manual. The
laboratory method for fish tissue is performance based. Example standard operating procedures are
provided in Appendix B of the LOM.
5.10.3 Pertinent Laboratory QA/QC Procedures
5.10.3.1 Laboratory Performance Requirements
Specific laboratory performance requirements are listed in Table 50.
Table 50. Measurement Data Quality Objectives for Mercury in Fish Tissue Plugs

Variable or Measurement: Mercury.
Method Detection Limit: 0.47 ng/g.
Quantitation Limit: 5.0 ng/g.
5.10.3.2 Laboratory Quality Control Requirements
Specific laboratory quality control requirements are listed in Table 51.
Table 51. Quality Control for Mercury in Fish Tissue Plugs

Activity: Demonstrate competency for analyzing fish samples to meet the performance measures.
Evaluation/Acceptance Criteria: Demonstration of past experience with fish tissue samples in applying the laboratory SOP in achieving the method detection limit.
Corrective Action: EPA will not approve any laboratory for NCCA sample processing if the laboratory cannot demonstrate competency. In other words, EPA will select another laboratory that can demonstrate competency for its NCCA samples.

Activity: Check condition of sample when it arrives.
Evaluation/Acceptance Criteria: Sample issues, such as punctures or rips in wrapping; missing label; temperature; adherence to holding time requirements; sufficient volume for test. All samples should arrive at the laboratory frozen.
Corrective Action: Assign an appropriate condition code.
Activity: Store sample appropriately. While stored at the laboratory, the sample must be kept at a maximum temperature of -20 °C.
Evaluation/Acceptance Criteria: Check the temperature of the freezer per the laboratory's standard operating procedures.
Corrective Action: Record the temperature of the sample upon arrival at the laboratory. If at any other time samples are warmer than required, note the temperature and duration in the comment field.

Activity: Analyze sample within holding time.
Evaluation/Acceptance Criteria: The test must be completed within the holding time (i.e., 1 year). If the original test fails, then the retest also must be conducted within the holding time.
Corrective Action: Perform the test, but note the reason for performing the test outside the holding time. EPA expects that the laboratory will exercise every effort to perform tests before the holding time expires.

Activity: Maintain quality control specifications from the selected method/SOP (that meets the measurement data quality objectives).
Evaluation/Acceptance Criteria: Data meet all QC specifications in the selected method/SOP.
Corrective Action: If data do not meet all QC requirements, rerun the sample or qualify the data. If the lab believes the data are to be qualified without rerunning the sample, the lab must consult with the EPA Survey QA Lead before proceeding.

Activity: Maintain the required MDL.
Evaluation/Acceptance Criteria: Evaluate for each sample.
Corrective Action: If the MDL could not be achieved, then provide a dilution factor or QC code and explanation in the comment field.

Activity: Use consistent units for QC samples and field samples.
Evaluation/Acceptance Criteria: Verify that all units are provided in wet weight units and consistently.
Corrective Action: If it is not possible to provide the results in the same units as most other analyses, then assign a QC code and describe the reason for different units in the comments field of the database.

Activity: Maintain completeness.
Evaluation/Acceptance Criteria: Completeness objective is 95% for all parameters.
Corrective Action: Contact the EPA Survey QA Lead immediately if issues affect the laboratory's ability to meet the completeness objective.
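The holding-time and storage-temperature requirements in Table 51 can be sketched as follows. This is an illustrative example only; the function names are invented, while the 1-year holding time and -20 °C storage limit come from the table.

```python
# Illustrative sketch of the Table 51 holding-time and storage checks
# for fish tissue plug samples. Not part of the required procedure.
from datetime import date, timedelta

HOLDING_TIME = timedelta(days=365)   # "i.e., 1 year"
MAX_STORAGE_TEMP_C = -20.0           # maximum storage temperature

def within_holding_time(collected: date, analyzed: date) -> bool:
    """True if the test was completed within the 1-year holding time."""
    return analyzed - collected <= HOLDING_TIME

def storage_excursions(temps_c):
    """Return freezer readings warmer than -20 C; each would be noted
    with its duration in the comment field."""
    return [t for t in temps_c if t > MAX_STORAGE_TEMP_C]
```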
5.10.3.3 Data Reporting
Table 52. Data Reporting Criteria: Fish Tissue Plugs

Measurement: Metals. Units: fish tissue wet weight. Expressed to the nearest: 0.01.
5.10.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA 2015 FOM. That quality is
enhanced by the training and experience of project staff and documentation of sampling activities.
Crews will collect fish plugs for mercury. Field crews will verify that all sample containers are
uncontaminated and intact, and that all sample labels are legible and intact.
Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the fish sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Store the sample frozen.
Recheck all forms and labels for completeness and legibility.
5.10.4.1 Field Performance Requirements
Specific field performance requirements are listed in Table 53.
Table 53. Method Quality Objectives for Field Measurement for the Fish Tissue Plug Indicator

Quality Control Activity: 75% rule.
Description and Requirements: Length of smallest fish in the composite must be at least 75% of the length of the longest fish.
Corrective Action: Indicator lead will review composite data and advise the lab before processing begins.
5.10.4.2 Field Quality Control Requirements
Specific quality control measures are listed in Table 54 for field measurements and observations.
Table 54. Field Quality Control: Fish Tissue Plug

Quality Control Activity: Check integrity of sample containers and labels.
Description and Requirements: Clean, intact containers and labels.
Corrective Action: Obtain replacement supplies.

Quality Control Activity: Set up fishing equipment.
Description and Requirements: An experienced fisheries biologist sets up the equipment. If results are poor, a different method may be necessary.
Corrective Action: Note on field data sheet.

Quality Control Activity: Field processing.
Description and Requirements: The fisheries biologist will identify specimens in the field using a standardized list of common and scientific names. A re-check will be performed during processing.
Corrective Action: Attempt to catch more fish of the species of interest.

Quality Control Activity: Holding time.
Description and Requirements: Frozen samples must be shipped on dry ice within 2 weeks of collection.
Corrective Action: Qualify samples.

Quality Control Activity: Sample storage (field).
Description and Requirements: Keep frozen and check integrity of sample packaging.
Corrective Action: Qualify sample as suspect for all analyses.
5.10.5 Data Review
Checks made of the data in the process of review, verification, and validation are summarized in Table
55. The Project QA Coordinator is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members.
Table 55. Data Validation Quality Control: Fish Tissue Plugs

Activity or Procedure: Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots).
Requirements and Corrective Action: Correct reporting errors or qualify as suspect or invalid.

Activity or Procedure: Review holding times.
Requirements and Corrective Action: Qualify value for additional review.

Activity or Procedure: Review data from QA samples (laboratory PE samples and interlaboratory comparison samples).
Requirements and Corrective Action: Determine impact and possible limitations on overall usability of data.
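A range check of the kind Table 55 describes can be sketched as follows, flagging values that fall outside the 1.5 x IQR "whiskers" of a box-and-whisker plot for review. This is an illustrative example only; the 1.5 x IQR convention is a common exploratory default, not a threshold specified by this QAPP.

```python
# Illustrative sketch of an exploratory range check: flag values outside
# the box-and-whisker "whiskers" (1.5 x IQR beyond the quartiles) so a
# reviewer can correct reporting errors or qualify them as suspect.
from statistics import quantiles

def flag_outliers(values, k=1.5):
    """Return the values falling outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]
```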
5.11 Algal Toxins, Research Indicator
5.11.1 Introduction
Crews will collect a water sample at the index site to measure concentrations of algal toxins including
Anatoxin-a, Cylindrospermopsin, Domoic acid, Microcystin-HtyR, Microcystin-LF, Microcystin-LR,
Microcystin-LW, Microcystin-LY, Microcystin-RR, Microcystin-WR, Microcystin-YR, Nodularin-R, and
Okadaic acid.
5.11.2 Sample Design and Methods
Detailed sample collection and handling procedures are found in the NCCA 2015 Field Operations
Manual. For this research indicator, the USGS laboratory method is found in the NCCA 2015 Laboratory
Operations Manual.
5.11.3 Pertinent Laboratory QA/QC Procedures
A single laboratory will analyze the algal toxin samples. The specific quality control procedures used are
implemented to ensure that:
Objectives established for various data quality indicators are being met.
The laboratory will follow the procedures outlined in the NCCA 2015 QAPP and the LOM.
5.11.3.1 Laboratory Performance Requirements
Performance requirements for the algal toxin indicator are listed in Table 56.
Table 56. Measurement Quality Objectives for Algal Toxin Research Indicator

Parameter: Algal toxins (Anatoxin-a, Cylindrospermopsin, Domoic acid, Microcystin-HtyR, Microcystin-LR, Microcystin-LY, Microcystin-RR, Microcystin-WR, Microcystin-YR, Nodularin-R, Okadaic acid).
Units: µg/L.
Method Detection Limit Objective: Matrix dependent.
Reporting Limit Objective: 0.10 µg/L.

Parameter: Microcystin-LF, Microcystin-LW.
Units: µg/L.
Method Detection Limit Objective: Matrix dependent.
Reporting Limit Objective: 0.30 µg/L.
5.11.3.2 Laboratory Quality Control Requirements
Quality control requirements for the algal toxin research indicator are listed in Table 57. Sample receipt
and other processing requirements are listed in Table 58.
Table 57. Sample Analysis Quality Control Activities and Objectives for Algal Toxins

Quality Control Activity: Reagents and standards shelf life (LCTX working standard; stock internal standards; intermediate internal standards; working internal standards; check standards).
Description and Requirements: Store in the dark at -20 °C. Shelf life is based on LC/MS/MS calibration curve response. If the curve has drifted outside of +/- 20% of the expected value, then new intermediate working standard mixes will be made.
Corrective Action: If a standard has expired, its storage temperature is exceeded, or it is otherwise compromised, then discard it or clearly label it as expired and set it aside for training activities, and mix a fresh standard according to the SOP in the LOM.

Quality Control Activity: Calibration.
Description and Requirements: Either an internal standard calibration curve or single point standard addition described in the algal toxin SOP at a level equivalent to 1.0 µg/L. Standard addition can be used exclusively or when matrix effects are greater than +/- 20% (28.3% RSD) of the spiked concentration.
Corrective Action: If any requirement fails: results from the analytical run are not reported; clean the instrument source per the manufacturer's directions; recalibrate or re-equilibrate the LC/MS/MS. All samples in the analytical run are reanalyzed until calibration provides acceptable results.

Quality Control Activity: LC/MS/MS equilibration.
Description and Requirements: Retention time values within 60 seconds, peak shape within 30% of historical abundance.
Corrective Action: Troubleshoot according to section 7.13.5 in the SOP in the LOM. Do not analyze samples until successful equilibration is achieved.
Quality Control Activity: Control standards (evaluate between injections).
Description and Requirements: Within 20% of expected concentration or abundance.
Corrective Action: Clean the instrument source per the manufacturer's directions; re-equilibrate the LC/MS/MS; check the internal standards vial (if empty, refill; if not empty, suspect matrix effects and use standard addition to bring to within 20% of the expected value); re-run all samples since the last successful equilibration.

Quality Control Activity: Internal standards, blanks, controls, standard additions.
Description and Requirements: Retention times within 60 seconds of historical values; peak shape within 30% of historical abundance.
Corrective Action: Clean the instrument source per the manufacturer's directions; re-equilibrate the LC/MS/MS; check the internal standards vial (if empty, refill; if not empty, suspect matrix effects and use standard addition to bring to within 20% of the expected value); re-run all samples since the last successful equilibration.
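The "within 20% of expected" control-standard check and the matrix-effect condition that triggers standard addition (Table 57) can be sketched as follows. This is an illustrative example only; the function names are invented, and the 20% tolerance is from the table.

```python
# Illustrative sketch of the Table 57 control-standard acceptance check
# and the matrix-effect trigger for single-point standard addition.
def control_standard_ok(measured, expected, tol_pct=20.0):
    """True if the measured value is within 20% of the expected value."""
    return abs(measured - expected) <= tol_pct / 100.0 * expected

def needs_standard_addition(spike_recoveries_pct, tol_pct=20.0):
    """Use standard addition when matrix effects exceed +/- 20% of the
    spiked concentration (i.e., any recovery outside 80-120%)."""
    return any(abs(r - 100.0) > tol_pct for r in spike_recoveries_pct)
```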
Table 58. Sample Receipt and Processing Quality Control: Algal Toxin Research Indicator
Quality Control Activity: Sample log-in.
Description and Requirements: Upon receipt of a sample shipment, record receipt of samples in the NARS IM system (within 24 clock hours) and the laboratory's Laboratory Information Management System (LIMS).
Corrective Action: Discrepancies, damaged, or missing samples are reported to the EPA HQ Laboratory QA Coordinator.

Quality Control Activity: Sample condition upon receipt.
Description and Requirements: Sample issues such as cracked container; missing label; temperature (frozen); adherence to holding time requirements; sufficient volume for test.
Corrective Action: Qualify samples.

Quality Control Activity: Sample storage.
Description and Requirements: Store sample frozen at -20 °C.
Corrective Action: Qualify samples.

Quality Control Activity: Holding time.
Description and Requirements: Frozen samples can be stored for at least 4 years.
Corrective Action: Qualify samples.
5.11.3.3 Data Reporting
Data reporting units and significant figures are summarized in Table 59.
Table 59. Data Reporting Criteria: Algal Toxin Research Indicator
Measurement | No. Significant Figures | Maximum No. Decimal Places
Algal Toxins
5.11.4 Pertinent Field QA/QC Procedures
Field data quality is addressed, in part, by application and consistent performance of valid procedures
documented in the standard operating procedures detailed in the NCCA 2015 FOM. That quality is
enhanced by the training and experience of project staff and documentation of sampling activities.
Crews will collect a single water sample for algal toxin analyses. Field crews will verify that all sample
containers are uncontaminated and intact, and that all sample labels are legible and intact. While in the
field, the crew will store samples in a cooler on ice and will then freeze the sample upon returning to the
base site (hotel, lab, office). Before leaving the field, the crews will:
Check the label to ensure that all written information is complete and legible.
Place a strip of clear packing tape over the label, covering the label completely.
Record the sample ID number assigned to the microcystins sample on the Sample Collection Form.
Enter a flag code and provide comments on the Sample Collection Form if there are any problems in
collecting the sample or if conditions occur that may affect sample integrity.
Store the sample on ice in field.
Recheck all forms and labels for completeness and legibility.
5.11.4.1 Field Performance Requirements
Not Applicable.
5.11.4.2 Field Quality Control Requirements
See Table 60 for quality control activities and corrective actions.
Table 60. Sample Field Processing Quality Control: Algal Toxin Research Indicator

Quality Control Activity: Holding time.
Description and Requirements: Hold sample on wet ice and freeze immediately upon return to the base site (hotel, lab, office) and keep frozen until shipping.
Corrective Action: Qualify samples.

Quality Control Activity: Sample storage.
Description and Requirements: Store samples in darkness and frozen (-20 °C). Monitor temperature daily.
Corrective Action: Qualify sample as suspect.
5.11.5 Data Review
Checks made of the data in the process of review and verification are summarized in Table 61. The NCCA
Project QA Coordinator is ultimately responsible for ensuring the validity of the data, although
performance of the specific checks may be delegated to other staff members.
Table 61. Data Validation Quality Control for Algal Toxin Research Indicator

Activity or Procedure: Range checks, summary statistics, and/or exploratory data analysis (e.g., box and whisker plots).
Requirements and Corrective Action: Correct reporting errors or qualify as suspect or invalid.

Activity or Procedure: Review holding times.
Requirements and Corrective Action: Qualify value for additional review.

Activity or Procedure: Review data from QA samples (laboratory PE samples and interlaboratory comparison samples).
Requirements and Corrective Action: Determine impact and possible limitations on overall usability of data.
6 Field and Biological Quality Evaluation & Assistance
6.1 National Coastal Condition Assessment Field Quality Evaluation and Assistance Visit Plan
EPA, contractor and other qualified staff will conduct evaluation and assistance visits with each field
crew early in the sampling and data collection process, if possible, and corrective actions will be
conducted in real time. These visits provide both a quality check for the uniform evaluation of the data
collection methods and an opportunity to conduct procedural reviews, as required, minimizing data loss
due to improper technique or interpretation of field procedures and guidance. Through uniform training
of field crews and review cycles conducted early in the data collection process, sampling variability
associated with specific implementation or interpretation of the protocols will be significantly reduced.
The visit also provides the field crews with an opportunity to clarify procedures and offer suggestions for
future improvements based on their sampling experience preceding the visit. The field evaluations,
while performed by a number of different supporting collaborator agencies and participants, will be
based on the uniform training, plans, and checklists. The field evaluations will be based on the
evaluation plan and field evaluation checklist. EPA has scheduled this review and assistance task for
each unique field crew collecting and contributing data under this program. If unforeseen events
prevent the EPA from evaluating every crew, the NCCA Quality Assurance Coordinator (QAC) will rely on
the data review and validation process to identify unacceptable data that will not be included in the final
database. If inconsistencies cannot be resolved, the QAC may contact the Field Crew Leader for
clarification.
One or more designated EPA, contractor, or other staff who are qualified (i.e., have completed training)
in the procedures of the NCCA 2015 field sampling operations will visit trained state, contractor, federal
agency and EPA field sampling crews during sampling operations on site. If membership of a field crew
changes, and at least two of the members have not been evaluated previously, the field crew must be
evaluated again during sampling operations as soon as possible to ensure that all members of the field
crew understand and can perform the procedures. If a deviation is needed from the process described
here, the staff member conducting the assistance visit (AV) must contact the Assistance Visit
Coordinator who will contact the NCCA Project Lead and the NCCA Project QA Coordinator to determine
an acceptable course of action.
The purpose of this on-site visit will be to identify and correct deficiencies during field sampling
operations. The process will involve preparation activities, field day activities and post field day activities
as described in the following sections. Additionally, conference calls with crews may be held
approximately every two weeks to discuss issues as they come up throughout the sampling season.
6.1.1 Preparation Activities
Each Field Crew Evaluator will schedule an assistance visit with their designated crews in
consultation with the Contractor Field Logistics Coordinator, Regional NCCA Coordinator, and
respective Field Sampling Crew Leader. Ideally, each Field Crew will be evaluated within the first two
weeks of beginning sampling operations, so that procedures can be corrected or additional training
provided, if needed.
Each Evaluator is responsible for providing their own field gear sufficient to accompany the Field
Sampling Crews during a complete sampling cycle. The schedule of field visits will be set by the
Evaluator in consultation with the respective Field Crew Leader. Evaluators should be prepared to
spend additional time in the field if needed (see below).
Each Field Crew Evaluator will ensure that field crews are aware of the visit plans and that capacity
and safety equipment will be provided for the Field Crew Evaluator.
Each Field Crew Evaluator will need to bring the items listed in Table 62.
Table 62. Equipment and Supplies - Field Evaluation and Assistance Visits
Documentation: Assistance Visit Checklist, Appendix D (see FOM); NCCA 2015 Field
Operations Manual; NCCA 2015 Quality Assurance Project Plan; clipboard; pencils (#2,
for data forms)/pen (or computer for electronic versions); field notebook (optional)
Gear: Field gear, as needed (e.g., protective clothing, sunscreen, insect repellent,
hat, water, food, backpack, cell phone)
6.1.2 Field Day Activities
The Field Crew Evaluator will review the Field Evaluation & Assistance Visit Checklist with each crew
during the field sampling day and establish a plan and schedule for their evaluation activities for
the day.
The Field Crew Evaluator will view the performance of a field crew through one complete set of
sampling activities as detailed on the checklist.
Scheduling might necessitate starting the evaluation midway on the list of tasks at a site, instead of
at the beginning. In that case, the Field Crew Evaluator will follow the crew to the next site to
complete the evaluation of the first activities on the list.
If the field crew misses or incorrectly performs a procedure, the Field Crew Evaluator will note this
on the checklist and immediately point this out so the mistake can be corrected on the spot. The role
of the Field Crew Evaluator is to provide additional training and guidance so that the procedures are
being performed consistent with the FOM, all data are recorded correctly, and paperwork is
properly completed at the site.
When the sampling operation has been completed, the Field Crew Evaluator will review the results
of the evaluation with the field crew before leaving the site (if practicable), noting positive practices
and problems (i.e., weaknesses [might affect data quality]; deficiencies [would adversely affect data
quality]). The Field Crew Evaluator will ensure that the field crew understands the findings and will
be able to perform the procedures properly in the future.
The Field Crew Evaluator will review the list and record responses or concerns from the field crew, if
any, on the checklist (this may happen throughout the field day).
The Field Crew Leader will sign the checklist after this review.
6.1.3 Post Field Day Activities
The Field Crew Evaluator will review the checklist that evening and provide a summary of findings,
including lessons learned and concerns.
If the Field Crew Evaluator finds major deficiencies in the field crew operations (e.g., fewer than two
members, or equipment or performance problems), the Field Crew Evaluator must contact the EPA
NCCA Project QA Coordinator. The EPA NCCA Project QA Coordinator will work with the EPA NCCA
Program Manager to determine the appropriate course of action. Data records from sampling sites
previously visited by this Field Crew will be checked to determine whether any sampling sites must
be redone.
The Field Crew Evaluator will retain a copy of the checklist and submit it to the EPA Logistics
Coordinator either via FedEx or electronically.
The EPA Logistics Coordinator and the NCCA Project QA Coordinator or authorized designee
(member of the NCCA 2015 quality team) will review the returned Field Evaluation and Assistance
Visit Checklist, note any issues, and check off the completion of the evaluation for each field crew.
6.1.4 Summary
Table 63 summarizes the plan, checklist, and corrective action procedures.
Table 63. Summary of Field Evaluation and Assistance Visit Information
Field Evaluation Plan - The Field Crew Evaluator:
Arranges the field evaluation visit in consultation with the Project QA Coordinator, Regional
NCCA Coordinator, and respective Field Sampling Crew Leader, ideally within the first two
weeks of sampling
Observes the performance of a crew through one complete set of sampling activities
Takes note of errors the field crew makes on the checklist and immediately points these out to
correct the mistake
Reviews the results of the evaluation with the field crew before leaving the site, noting positive
practices, lessons learned, and concerns
Field Evaluation Checklist - The Field Crew Evaluator:
Observes all pre-sampling activities and verifies that equipment is properly calibrated and in
good working order, and protocols are followed
Checks the sample containers to verify that they are the correct type and size, and checks the
labels to be sure they are correctly and completely filled out
Confirms that the field crew has followed NCCA protocols for locating the X-site
Observes the index site sampling, confirming that all protocols are followed
Observes the littoral sampling and habitat characterization, confirming that all protocols are
followed
Records responses or concerns, if any, on the Field Evaluation and Assistance Checklist
Corrective Action Procedures:
If the Field Crew Evaluator's findings indicate that the Field Crew is not performing the
procedures correctly, safely, or thoroughly, the Evaluator must continue working with this Field
Crew until certain of the crew's ability to conduct the sampling properly so that data quality is
not adversely affected.
If the Field Crew Evaluator finds major deficiencies in the Field Crew operations, the Evaluator
must contact the EPA NCCA Project QA Coordinator.
6.2 National Coastal Condition Assessment Laboratory Quality Evaluation and Assistance Visit
Plan
As part of the NCCA 2015, field samples will be collected at each assessment site. These samples will be
sent to laboratories cooperating in the assessment. To ensure quality, each Project Cooperator
laboratory analyzing samples from the NCCA 2015 will receive an evaluation from an NCCA Lab
Evaluator. All Project Cooperator laboratories will follow these guidelines.
No national program of accreditation for laboratory processing for many of our indicators currently
exists. For this reason, a rigorous program of laboratory evaluation has been developed to support the
NCCA 2015.
Given the large number of laboratories participating in the NCCA 2015, it is not feasible to perform an
assistance visit11 (AV) on each of these laboratories. An AV would include an on-site visit to the
laboratory lasting at least a day. As a result, the EPA Headquarters Project Management Team will
conduct remote review of laboratory certifications and accreditations of all laboratories. Additionally,
EPA will include an inter-laboratory comparison between some laboratories (mainly for biological
indicators). If issues arise from the remote review or inter-laboratory comparison that cannot be
resolved remotely, the EPA Quality Team and/or contractors will perform an on-site visit to the
laboratory. This process is in keeping with EPA's Policy to Assure Competency of Laboratories, Field
Sampling, and Other Organizations Generating Environmental Measurement Data under Agency-Funded
Acquisitions.
6.2.1 Remote Evaluation/Technical Assessment
A remote evaluation procedure has been developed for performing assessment of all laboratories
participating in the NCCA 2015.
The Laboratory Review Coordinator, the NCCA Project QA Coordinator and other members of the NCCA
QA Team will conduct laboratory evaluation prior to data analysis to ensure that the laboratories are
qualified and that techniques are implemented consistently across the multiple laboratories generating
data for the program. The EPA National Aquatic Resource Surveys team has developed laboratory
evaluation plans to ensure uniform interpretation and guidance in the procedural reviews.
The NCCA Quality Team is using a procedure that requests the laboratory to provide documentation of
its policies and procedures. For the NCCA 2015 project, the Quality Team is requesting that each
participating laboratory provide the following documentation:
The laboratory's Quality Manual, Quality Management Plan or similar document.
Standard Operating Procedures (SOPs) for each analysis to be performed.
Long term Method Detection Limits (MDLs) for each instrument used and Demonstration of
Capability for each analysis to be performed.
A list of the laboratory's accreditations and certifications, if any.
Results from Proficiency Tests for each analyte to be analyzed under the NCCA 2015 project.
If a laboratory has clearly documented procedures for sample receiving, storage, preservation,
preparation, analysis, and data reporting; has successfully analyzed Proficiency Test samples (if required
by EPA, EPA will provide the PT samples); has a Quality Manual that thoroughly addresses laboratory
quality including standard and sample preparation, record keeping and QA non-conformance;
participates in a nationally recognized or state certification program; and has demonstrated the ability to
perform the testing for the program/project for which the audit is intended, then the on-site visit will be
minimal in length, if not waived entirely. The QA Team will make a final decision on the need for an
actual on-site visit after the review and evaluation of the documentation requested.
If a laboratory meets or exceeds all of the major requirements and is deficient in an area that can be
corrected remotely by the lab, suggestions will be offered and the laboratory will be given an
opportunity to correct the issue. The QA Team will then verify the correction of the deficiency remotely.
The on-site visit by EPA and/or a contractor should only be necessary if the laboratory fails to meet the
major requirements and is in need of help or fails to produce the requested documentation.
In addition, all labs must sign a Lab Signature Form (see NCCA 2015 LOM) indicating that they will abide
by the following:
Utilize procedures identified in the NCCA 2015 Lab Operations Manual (or equivalent). If using
equivalent procedures, please provide procedures manual to demonstrate ability to meet the
required MQOs.
Read and abide by the NCCA 2015 Quality Assurance Project Plan (QAPP) and related Standard
Operating Procedures (SOPs).
Have an organized IT system in place for recording sample tracking and analysis data.
Provide data using the template provided in the Lab Operations Manual.
Provide data results in a timely manner. This will vary with the type of analysis and the number of
samples to be processed. Sample data must be received no later than May 1, 2016 or as otherwise
negotiated with EPA.
Participate in a lab technical assessment or audit if requested by EPA NCCA Quality Team staff (this
may be a conference call or on-site audit).
If a lab is participating in biology analyses, they must, in addition, abide by the following:
Use taxonomic standards outlined in the NCCA 2015 Lab Manual.
Participate in taxonomic reconciliation exercises during the field and data analysis season, which
include conference calls and other lab reviews (see more below on Inter-laboratory comparison).
6.2.2 Water Chemistry Laboratories
The water chemistry laboratory approval process, which is outlined in the previous paragraphs of this
section, is deemed appropriate because many laboratories participate in one or more national laboratory
accreditation programs such as the National Environmental Laboratory Accreditation Program (NELAP),
International Organization for Standardization (ISO-17025) as well as various state certification
programs which include strict requirements around documentation and procedures as well as site visits
by the accrediting authority. It builds on the processes used by the NLA 2012 and NRSA 2013/14. The
laboratories participating in NCCA 2015 meet these qualifications and as such have demonstrated their
ability to function independently. This process is one that has been utilized in Region 3 for many years
and is designed around the national accrediting programs listed above.
6.2.3 Inter-laboratory Comparison
The NCCA QA plan includes an inter-laboratory investigation for the laboratories performing analysis on
benthic invertebrates for the NCCA 2015. This process is defined as an inter-laboratory comparison since
the same protocols and method will be used by both laboratories as described in this manual. The QA
plan also includes an independent taxonomist (EPA Contractor) to re-identify 10% of the samples from
each laboratory. No site visit is envisioned for these laboratories unless the data submitted and
reviewed by EPA does not meet the requirements of the inter-laboratory comparison described.
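The 10% re-identification check can be summarized with a per-sample disagreement rate. The sketch below is illustrative only: the taxa names and the review threshold are invented assumptions, not NCCA acceptance criteria.

```python
# Compare the primary lab's identifications against the independent
# taxonomist's re-identification of the same organisms. Taxa names and the
# 15% review threshold are hypothetical, for illustration only.
primary = ["Nephtys", "Ampelisca", "Mediomastus", "Ampelisca", "Nucula"]
recount = ["Nephtys", "Ampelisca", "Mediomastus", "Tellina", "Nucula"]

disagreements = sum(a != b for a, b in zip(primary, recount))
ptd = 100.0 * disagreements / len(primary)  # percent taxonomic disagreement

# Flag the sample for follow-up reconciliation if disagreement is high.
needs_review = ptd > 15.0
```

A high disagreement rate would trigger the taxonomic reconciliation exercises described above rather than automatic rejection of the data.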
6.2.4 Assistance Visits
Assistance Visits will be used to:
Confirm the NCCA 2015 Laboratory Operations Manual (LOM) methods are being properly
implemented by cooperator laboratories.
Assist with questions from laboratory personnel.
Suggest corrections if any errors are made in implementing the lab methods.
Evaluation of the laboratories will take the form of administration of checklists which have been
developed from the LOM to ensure that laboratories are following the methods and protocols outlined
therein. The checklist will be administered on-site by a qualified EPA scientist or contractor.
Below are examples of the Document Request form used for both the Biological laboratories and the
Chemical laboratories.
NCCA2015 Document Request Form Chemistry Laboratories
EPA and its state and tribal partners will conduct a survey of the nation's coastal waters. This National
Coastal Condition Assessment (NCCA) is designed to provide statistically valid regional and national
estimates of the condition of coastal waters. Consistent sampling and analytical procedures ensure that
the results can be compared across the country. As part of the NCCA 2015, the Quality Assurance Team
will conduct a technical assessment to verify quality control practices in your laboratory and its ability to
perform chemistry analyses under this project. Our review will assess your laboratory's ability to receive,
store, prepare, analyze, and report sample data generated under EPA's NCCA 2015.
The first step of this assessment process will involve the review of your laboratory's certification and/or
documentation. Subsequent actions may include (if needed): reconciliation exercises and/or a site visit.
All laboratories will need to complete the following forms:
If your lab has been previously approved within the last 5 years for the specific parameters:
A signature on the attached Laboratory Signature Form indicates that your laboratory will follow the
quality assurance protocols required for chemistry laboratories conducting analyses for the NCCA
2015. A signature on the QAPP and the LOM Signature Form indicates that you will follow both the
QAPP and the LOM.
If you have not been approved within the last 5 years for the specific parameters, in order for us to
determine your ability to participate as a laboratory in the NCCA, we are requesting that you submit
the following documents (if available) for review:
Documentation of a successful quality assurance audit from a prior National Aquatic Resource
Survey (NARS) that occurred within the last 5 years (if you need assistance with this please contact
the individual listed below).
Documentation showing participation in a previous NARS for Water Chemistry for the same
parameters/methods.
Additionally, we request that all laboratories provide the following information in support of your
capabilities (these materials are required if neither of the two items above is provided):
A copy of your Laboratory's accreditations and certifications if applicable (i.e. NELAC, ISO, state
certifications, North American Benthological Society (NABS), etc.).
An updated copy of your Laboratory's QAPP.
Standard Operating Procedures (SOPs) for your laboratory for each analysis to be performed (if not
covered in NCCA 2015 LOM).
Documentation attesting to experience running all analytes for the NCCA 2015, including chlorophyll
a.
This documentation may be submitted electronically via e-mail to forde.kendra@epa.gov. Questions
concerning this request may be submitted to forde.kendra@epa.gov (202-566-0417) or
sullivan.hugh@epa.gov (202-564-1763).
NCCA2015 Document Request Form Biology Labs
EPA and its state and tribal partners will conduct a survey of the nation's coastal waters. This National
Coastal Condition Assessment (NCCA) is designed to provide statistically valid regional and national
estimates of the condition of coastal waters. Consistent sampling and analytical procedures ensure that
the results can be compared across the country. As part of the NCCA 2015, the Quality Assurance Team
will conduct a technical assessment to verify quality control practices in your laboratory and its ability to
perform biology analyses under this project. Our review will assess your laboratory's ability to receive,
store, prepare, analyze, and report sample data generated under EPA's NCCA 2015.
The first step of this assessment process will involve the review of your laboratory's certification and/or
documentation. Subsequent actions may include (if needed): reconciliation exercises and/or a site visit.
All laboratories will need to complete the following forms:
If your laboratory has been previously approved within the last 5 years for the specific parameters:
A signature on the attached Laboratory Signature Form indicates that your laboratory will follow the
quality assurance protocols required for biology laboratories conducting analyses for the NCCA
2015. A signature on the QAPP and the LOM Signature Form indicates you will follow both the QAPP
and the LOM.
If you have not been approved within the last 5 years for the specific parameters, in order for us to
determine your ability to participate as a laboratory in the NCCA, we are requesting that you submit
the following documents (if available) for review:
Documentation of a successful quality assurance audit from a prior National Aquatic Resource
Survey (NARS) that occurred within the last 5 years (if you need assistance with this please contact
the individual listed below).
Documentation showing participation in previous NARS for this particular indicator.
Additionally, we request that all laboratories provide the following information in support of your
capabilities (these materials are required if neither of the two items above is provided):
A copy of your Laboratory's accreditations and certifications if applicable (i.e. NELAC, ISO, state
certifications, NABS, etc.).
Documentation of NABS (or other) certification for the taxonomists performing analyses (if
applicable).
An updated copy of your Laboratory's QAPP.
Standard Operating Procedures (SOPs) for your lab for each analysis to be performed (if not covered
in NCCA 2015 LOM).
This documentation may be submitted electronically via e-mail to forde.kendra@epa.gov. Questions
concerning this request may be submitted to forde.kendra@epa.gov (202-566-0417) or
sullivan.hugh@epa.gov (202-564-1763).
7 Data Analysis Plan
The goal of the NCCA is to address three key questions about the quality of the Nation's coastal
waters:
What percent of the Nation's coastal waters are in good, fair, and poor condition for key
indicators of chemical water quality, ecological condition, and suitability for recreation?
How are conditions changing over time?
What is the relative importance of key stressors (e.g., nutrients and pathogens) in
impacting the biota?
The Data Analysis Plan describes the approach used to process the data generated during the field
survey to answer these three questions. Results from the analysis will be included in the final report and
used in future analysis.
7.1 Data Interpretation Background
The intent of data analyses is to describe the occurrence and distribution of selected indicators
throughout the estuaries and coastal waters of the United States within the context of regionally
relevant expectations. The analyses will culminate by categorizing and reporting the condition of coastal
waters as good, fair, or poor. Statistical analysis techniques appropriate for data
collected using probabilistic survey designs, such as those described at EPA's Aquatic Resource
Monitoring website, http://www.epa.gov/nheerl/arm/index.htm, will serve as the primary method for
interpreting survey results. However, other data analyses will be used for further assessment
investigations as described below.
Because of the large-scale and multijurisdictional nature of this effort, the key issues for data
interpretation are: the scale of assessment, selecting the effective indicators across the range of systems
included in the survey, and determining thresholds for judging condition. An NCCA Data Analysis work
group will be created to address these points and to help strengthen NCCA assessments.
7.1.1 Scale of Assessment
EPA selected the sampling locations for the NCCA survey using a probability-based design, and
developed rules for selection to meet certain distribution criteria, while ensuring that the design yielded
a set of coastal areas that would provide for statistically valid conclusions about the condition of the
population of coastal areas across the nation.
7.1.2 Selecting Indicators
Indicators for the 2015 survey will remain largely the same as those used in the previous National
Coastal Condition Assessment,12 with a few modifications. The indicators for NCCA 2015 include
nutrients in water, light attenuation, sediment chemistry, sediment toxicity, benthic communities,
whole body fish tissue, fish tissue plugs for mercury analysis, microcystins, and enterococci.
12 For more information visit the NCCA website at: https://www.epa.gov/national-aquatic-resource-surveys/ncca
Supplemental and research indicators also include algal toxins, fish tissue filets (Great Lakes only),
phytoplankton (Great Lakes only), and underwater video (Great Lakes only). Of these, fish tissue plugs,
microcystins and algal toxins are new indicators.
7.2 Datasets to be used for the Report
The dataset used for the 2015 assessment consists of data collected during the NCCA 2015 and the
NCCA 2010, plus data from historic National Coastal Condition Reports (NCCRs) for tracking changes in
water quality. Other data may be added as appropriate.
7.3 Indicators for the Coastal Assessment
Water Chemistry and Chlorophyll
A wide array of water chemistry parameters will be measured. Water chemistry analysis is critical for
interpreting the biological indicators. Chlorophyll-a, Secchi depth, light attenuation and nutrient
measurements will be used to create a water quality index and identify stressors.
Benthic Invertebrates
To distinguish degraded benthic habitats from undegraded benthic habitats, EMAP and NCA have
developed regional (Southeast, Northeast, and Gulf coasts) benthic indices of environmental condition
(Engle et al., 1994; Weisberg et al., 1997; Engle and Summers, 1999; Van Dolah et al., 1999; Hale and
Heltshe, 2008). A new Multi-metric approach (M-AMBI) is also being developed and peer reviewed for
potential use in the NCCA 2015 report.
Sediment Chemistry/Characteristics
The NCCA is collecting sediment samples, measuring the concentrations of chemical constituents and
percent TOC in the sediments, and evaluating sediment toxicity as described in the QAPP, field
operations manual and laboratory operations manual. The results of these evaluations will be used to
identify the percent of coastal waters with sediment contamination. The sediment quality index is based
on measurements of three component indicators of sediment condition: sediment toxicity, sediment
contaminants, and sediment TOC. This information will also be used in identifying stressors to
ecological/biological condition.
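The QAPP does not spell out the aggregation rule for the sediment quality index here; the actual rules appear in the NCCR documents. The sketch below assumes, purely for illustration, a simple "worst component wins" roll-up of the three component ratings, which is one common way such multi-component indices are combined.

```python
# Hypothetical roll-up of the three sediment component ratings (toxicity,
# contaminants, TOC) into one site rating. The "worst component wins" rule
# is an assumption for this sketch, not the published SQI definition.
RANKS = {"good": 0, "fair": 1, "poor": 2}

def sediment_quality_index(toxicity, contaminants, toc):
    """Return the worst of the three component condition classes."""
    return max((toxicity, contaminants, toc), key=lambda c: RANKS[c])
```

Under this assumed rule a site rated good for toxicity and TOC but poor for contaminants would be rated poor overall.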
Enterococci Data Analysis
The presence of certain levels of enterococci is associated with pathogenic bacterial contamination of
the resource. A single enterococci water sample will be collected at each site, then filtered, processed,
and analyzed using qPCR. Bacterial occurrence and distribution will be reported. Data interpretation will
be enhanced by comparison to USEPA thresholds13. In 2012, EPA released new recreational water
quality criteria recommendations for protecting human health in all coastal and non-coastal waters
13 For more information visit EPA's website at https://www.epa.gov/wqc/2012-recreational-water-quality-criteria-
documents
designated for primary contact recreation use. NCCA will use the enterococci statistical threshold values
for marine and freshwaters to assess the percent of coastal waters above and below human health
levels of concern.
Fish Chemistry
For the NCCA, both juvenile and adult target fish species will be collected from all monitoring stations
where fish are available, and whole-body contaminant burdens will be determined. The target species
typically include demersal (bottom-dwelling) and pelagic (water column-dwelling) species that are
representative of each of the geographic regions. The EPA-recommended values for fish advisories will
serve as the threshold against which to evaluate risk.
Algal toxins
The presence of algal toxins can be an indicator of human and/or ecological risk. Samples for microcystin
and other algal toxins will be collected at each site. Occurrence and distribution will be reported. Where
thresholds are available (such as World Health Organization or other applicable thresholds)
concentrations will be reported against those values.
7.4 NCCR Index Development Approach
EPA intends to calculate the indices used in previous NCCR reports. Information on this approach, the
indices, and related thresholds can be found in the National Coastal Condition Report III (EPA 2008).
7.5 Calculation of Population Estimates
Once the individual indicator values are calculated for each sampling location, population estimates will
be generated using the procedures outlined by EMAP and found on the Aquatic Resource Monitoring
website (https://archive.epa.gov/nheerl/arm/web/html/index.html). The population estimates will
include estimates of uncertainty for each indicator. The outputs of these analyses are the specific results
that will appear in the coastal assessment report.
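The core of a design-based population estimate is a weighted share: each site carries a design weight representing the amount of coastal resource it stands for. A minimal sketch of the idea, with invented weights and condition classes (the full EMAP procedures also produce the uncertainty estimates noted above):

```python
# Hypothetical site results: each site carries a design weight (the amount of
# coastal resource it represents) and an assessed condition class.
sites = [
    {"wgt": 120.0, "cond": "good"},
    {"wgt": 80.0, "cond": "poor"},
    {"wgt": 150.0, "cond": "fair"},
    {"wgt": 50.0, "cond": "poor"},
]

total = sum(s["wgt"] for s in sites)

# Horvitz-Thompson-style estimate: the percent of the resource in each class
# is the weight share of the sites assessed in that class.
percent = {
    c: 100.0 * sum(s["wgt"] for s in sites if s["cond"] == c) / total
    for c in ("good", "fair", "poor")
}
```

The class percentages sum to 100, so "percent of coastal waters in good, fair, and poor condition" falls directly out of the weighted shares.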
7.6 Relative Extent, Relative Risk and Attributable Risk Analysis
EPA intends to estimate the relative extent of poor conditions for each stressor, the relative risk posed
to biota by that stressor, and the population attributable risk, as outlined by Van Sickle and
Paulsen (2008).
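In this framework, relative risk compares the probability of poor biological condition where the stressor is poor versus where it is good, and population attributable risk estimates the fractional reduction in poor biology expected if the stressor were everywhere in good condition. A sketch with invented extent estimates (the numbers are illustrative, not survey results):

```python
# Hypothetical weighted extents (e.g., km^2) cross-classified by stressor
# condition and biological condition; all numbers are invented.
extent = {
    ("poor_stressor", "poor_bio"): 60.0,
    ("poor_stressor", "good_bio"): 40.0,
    ("good_stressor", "poor_bio"): 30.0,
    ("good_stressor", "good_bio"): 170.0,
}

def p_poor_bio(stressor_class):
    """P(poor biology | stressor class), from the weighted extents."""
    row = sum(v for (s, _), v in extent.items() if s == stressor_class)
    return extent[(stressor_class, "poor_bio")] / row

# Relative risk: how much more likely poor biology is where the stressor is poor.
relative_risk = p_poor_bio("poor_stressor") / p_poor_bio("good_stressor")

# Population attributable risk: fractional reduction in poor biology expected
# if every site had the stressor in good condition.
total = sum(extent.values())
p_poor_overall = sum(v for (_, b), v in extent.items() if b == "poor_bio") / total
attributable_risk = 1.0 - p_poor_bio("good_stressor") / p_poor_overall
```

With these invented extents, poor biology is four times as likely where the stressor is poor, and half of the poor biological condition is attributable to the stressor.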
7.7 Other Change Analyses
Biological and stressor/chemical data from the NCCA and previous reports will be analyzed to see what
changes have occurred over time.
7.8 Index Precision and Interpretation
NCCA indicators will be repeated at 10% of the sites during the summer index sampling period. These
repeat samples allow an assessment of the within-season repeatability of these indicators and metrics.
The NCCA will calculate the precision of select site condition indicators using a basic measure of
repeatability: the RMSrep, or the root mean square of repeat visits.
The RMSrep is a measure of the absolute (unscaled) precision of the whole measurement and analytical
process, as well as short-term temporal variability within the summer sampling period. The RMSrep for a
metric is an estimate of its average standard deviation if it were measured repeatedly at all sites and
the standard deviations for each site were then averaged. For log-transformed data, the antilog of the
RMSrep represents a proportional standard deviation. For example, if the RMSrep of the log10-transformed total
phosphorus data is 0.179, the antilog is 1.51. Therefore, an RMSrep of 0.179 for Log10(PTL+1) means
that the error bound on a measurement at a site is a factor of 1.51. Because the data are Log10 transformed,
the measured value times 1.51 gives the upper ("+") error bound and divided by 1.51 gives the lower
("-") error bound. So, the +/- 1 standard deviation error bounds on a PTL measurement of 10 ug/L during the index
period are (10 ÷ 1.51) to (10 × 1.51), or 6.6 to 15.1.
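The arithmetic above can be sketched in a few lines. This is an illustrative reconstruction, not the NCCA's analysis code: for duplicate visits the within-site variance reduces to half the squared difference between the two measurements, and for log10-transformed data the antilog of RMSrep becomes a multiplicative error factor.

```python
import math

# Illustrative sketch (assumed, not the NCCA's production code) of the
# RMSrep calculation and its use as an error bound on log10-transformed data.

def rms_rep(repeat_pairs):
    """Pooled repeat-visit standard deviation (RMSrep).

    repeat_pairs: list of (visit1, visit2) measurements, one pair per
    revisited site. With duplicate visits, each site's variance estimate
    is (x1 - x2)**2 / 2; RMSrep is the square root of their mean.
    """
    variances = [(x1 - x2) ** 2 / 2.0 for x1, x2 in repeat_pairs]
    return math.sqrt(sum(variances) / len(variances))

def log10_error_bounds(value, rmsrep_log10):
    """+/- 1 SD bounds on a raw measurement when RMSrep was computed on
    log10-transformed data: divide and multiply by the antilog of RMSrep."""
    factor = 10 ** rmsrep_log10
    return value / factor, value * factor

# The worked example from the text: an RMSrep of 0.179 on Log10(PTL+1)
# gives a factor of about 1.51, so a PTL of 10 ug/L has +/- 1 SD bounds
# of roughly 6.6 to 15.1 ug/L.
low, high = log10_error_bounds(10.0, 0.179)
```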
Another way of scaling the precision of metrics is to examine their components of variance. The NCCA
calculates signal-to-noise ratios for each indicator to determine whether the amount of variance is
acceptable for it to be used in the data analysis described above. The ratio of variance among sites to
measurement (or temporal) variation within individual sites has been termed a "signal-to-noise" (S/N) ratio.
The S/N ratio assesses the ability of the metric to discern differences among sites in this survey context.
If the among-site variance in condition in the region, large estuary, Great Lake, or nation is high, then the
S/N is high and the metric is able to adequately discern differences in site condition. The NCCA uses a
variance-partitioning approach explained in Kaufmann et al. (1999) and Faustini and Kaufmann (2007), in which
the authors referred to RMSrep as RMSE and evaluated S/N in stream physical habitat variables. In those
publications, the authors generally interpreted precision to be high relative to regional variation if S/N
>10, low if S/N <2.0, and moderate if in between. When S/N is over about 10, the effect of
measurement error on most interpretations is nearly insignificant within the national context; when S/N
is between 6 and 10, measurement effects are minor. When S/N ratios are between 2 and 5, the effects
of imprecision should be acknowledged, examined, and evaluated. Ratios between 2 and 4 are usually
adequate to make good-fair-poor classifications in the NCCA, but there is some distortion of cumulative
distribution functions and a significant limitation to the ability of a multiple linear regression to explain the
amount of among-site variance using single-visit data.
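A minimal sketch of the interpretation rules above (assumed code, with the variance components taken as given; in practice they would be estimated by partitioning variance in the repeat-visit data as in Kaufmann et al. 1999):

```python
# Illustrative sketch: compute and classify a metric's signal-to-noise
# ratio using the thresholds described in the text. The variance
# components are assumed inputs here; in the NCCA they come from
# variance partitioning of repeat-visit data.

def signal_to_noise(var_among_sites, var_within_site):
    """S/N = among-site variance / within-site (measurement + temporal) variance."""
    return var_among_sites / var_within_site

def interpret_sn(sn):
    if sn > 10:
        return "high precision: measurement error nearly insignificant"
    if sn >= 2:
        return "moderate precision: acknowledge and evaluate imprecision"
    return "low precision: metric poorly discerns among-site differences"

sn = signal_to_noise(24.0, 2.0)  # S/N = 12
label = interpret_sn(sn)         # falls in the "high precision" band
```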
8 References
American Public Health Association. 2006. Standard Methods for the Examination of Water and
Wastewater. 21st Edition. American Public Health Association, Washington, D.C.
American Society for Testing and Materials. 1991. Guide for conducting 10-day static sediment toxicity
tests with marine and estuarine amphipods. ASTM Standard Methods Volume 1104, Method Number E-
1367-90. American Society for Testing and Materials, Philadelphia, PA.
Arar, E.J., and J.B. Collins. 1992. EPA Method 445.0: "In Vitro Determination of Chlorophyll a and
Pheophytin a in Marine and Freshwater Phytoplankton by Fluorescence." EPA/600/R-92/121.
Barbour, M.T., J. Gerritsen, B.D. Snyder, and J.B. Stribling. 1999. Rapid Bioassessment Protocols for Use
in Streams and Wadeable Rivers: Periphyton, Benthic Macroinvertebrates and Fish, Second Edition. EPA
841-B-99-002. U.S. Environmental Protection Agency; Office of Water; Washington, D.C.
CAS - Chemical Abstracts Service (CAS 1999)
Engle, V.D., J.K. Summers, and G.R. Gaston. 1994. A benthic index of environmental condition of the Gulf
of Mexico Estuaries. Estuaries 17:372-384.
Engle, V.D., and J.K. Summers. 1999. Refinement, validation, and application of a benthic index for
northern Gulf of Mexico estuaries. Estuaries 22(3A):624-635.
Faustini, John M., and Philip R. Kaufmann. 2007. Adequacy of visually classified particle count statistics
from regional stream habitat surveys. Journal of the American Water Resources Association 43(5):1293-
1315. WED-06-126.
Federal Register, Part VIII, EPA. "Guidelines Establishing Test Procedures for the Analysis of Pollutants
Under the Clean Water Act: Final Rule and Proposed Rule." 40 CFR Part 136, Oct. 28, 1984.
FGDC, 1998. Content Standard for Digital Geospatial Metadata. FGDC-STD-001-1998, Federal Geographic
Data Committee, Reston, VA, USA.
FGDC, 1999. Geospatial Metadata, Part 1: Biological Data Profile. FGDC-STD-001.1-1999, Federal
Geographic Data Committee, Reston, VA, USA.
Glaser, P.M.; Wheeler, G.A.; Gorham, E.; Wright, H.E., Jr. 1981. The patterned mires of the Red Lake
Peatland, northern Minnesota: vegetation, water chemistry, and landforms. Journal of Ecology. 69: 575-599.
Hale, S.S., and J.F. Heltshe. 2008. Signals from the benthos: Development and evaluation of a benthic
index for the nearshore Gulf of Maine. Ecological Indicators 8: 338-350.
Hawkins, C. P., R. H. Norris, J. N. Hogue, and J. W. Feminella. 2000. Development and evaluation of
predictive models for measuring the biological integrity of streams. Ecological Applications 10:1456-
1477.
Heinz Center. 2002. The State of the Nation's Ecosystems. The Cambridge University Press.
Hunt, D.T.E., and A.L. Wilson. 1986. The Chemical Analysis of Water: General Principles and Techniques.
2nd ed. Royal Society of Chemistry, London, England. 683 pp.
Hydrolab Corporation. 1990. DataSonde 3 Operation Manual (and Performance Manual). Hydrolab
Corporation, Austin, TX.
Integrated Taxonomic Information System, 1999 (ITIS, http://www.itis.usda.gov/)
Kaufmann, P. R., P. Levine, E. G. Robison, C. Seeliger, and D. V. Peck. 1999. Quantifying Physical Habitat in
Wadeable Streams. EPA/620/R-99/003. US Environmental Protection Agency, Washington, D.C.
Kirchner, C.J. 1983. Quality control in water analysis. Environ. Sci. and Technol. 17 (4):174A-181A.
Klemm, D. J., K. A. Blocksom, F. A. Fulk, A. T. Herlihy, R. M. Hughes, P. R. Kaufmann, D. V. Peck, J. L.
Stoddard, W. T. Thoeny, M. B. Griffith, and W. S. Davis. 2003. Development and evaluation of a
macroinvertebrate biotic integrity index (MBII) for regionally assessing Mid-Atlantic Highlands streams.
Environmental Management 31(5): 656-669.
MRLC- Multi-Resolution Land Characteristics (MRLC 1999) http://www.epa.gov/mrlc/
NAPA. 2002. Environment.gov. National Academy of Public Administration. ISBN: 1-57744-083-8. 219
pages.
NBII - National Biological Information Infrastructure (NBII 1999)
http://www.nbii.gov/datainfo/metadata/
NHD - National Hydrography Dataset Plus Version 1.0 (NHDPlus 2005) http://www.horizon-
systems.com/nhdplus/index.php
NRC. 2000. Ecological Indicators for the Nation. National Research Council.
NSDI - National Spatial Data Infrastructure (NSDI 1999) http://www.fgdc.gov/nsdi/nsdi.html
National Water Quality Monitoring Network for U.S. Coastal Waters and Their Tributaries,
http://acwi.gov/monitoring/network/index.html
Oblinger Childress, C.J., Foreman, W.T., Connor, B.F. and T.J. Maloney. 1999. New reporting procedures
based on long-term method detection levels and some considerations for interpretations of water-
quality data provided by the U.S. Geological Survey National Water Quality Laboratory. U.S.G.S Open-
File Report 99-193, Reston, Virginia.
Paulsen, S.G., D.P. Larsen, P.R. Kaufmann, T.R. Whittier, J.R. Baker, D. Peck, J. McGue, R.M. Hughes, D.
McMullen, D. Stevens, J.L. Stoddard, J. Lazorchak, W. Kinney, A.R. Selle, and R. Hjort. 1991. EMAP -
surface waters monitoring and research strategy, fiscal year 1991. EPA-600-3-91-002. U.S.
Environmental Protection Agency, Office of Research and Development, Washington, D.C. and
Environmental Research Laboratory, Corvallis, Oregon.
SDTS - Spatial Data Transfer Standard (SDTS) http://mcmcweb.er.usgs.gov/sdts/
Stanley, T.W., and S.S. Verner. 1985. The U.S. Environmental Protection Agency's quality assurance
program. pp. 12-19 In: J.K. Taylor and T.W. Stanley (eds.). Quality Assurance for Environmental
Measurements, ASTM STP 867. American Society for Testing and Materials, Philadelphia, PA.
Stevens, D. L., Jr. 1994. Implementation of a National Monitoring Program. Journal of
Environmental Management 42:1-29.
Strobel, C.J. 2000. Coastal 2000 - Northeast Component: Field Operations Manual. U. S.
Environmental Protection Agency, National Health and Environmental Effects Research
Laboratory, Atlantic Ecology Division, Narragansett, RI. EPA/620/R-00/002.
U.S. EPA, 1984. EPA Order 2160 (July 1984), Records Management Manual, U.S. Environmental
Protection Agency, Washington, DC.
U.S. EPA 1993. EPA Requirements for Quality Assurance Project Plans for Environmental Data
Operations (EPA QA/R-5). U.S. Environmental Protection Agency, Quality Assurance Management Staff,
Washington, DC.
U.S. EPA. 1995. Environmental Monitoring and Assessment Program (EMAP): Laboratory Methods
Manual-Estuaries, Volume 1: Biological and Physical Analyses. U.S. Environmental Protection Agency,
Office of Research and Development, Narragansett, RI. EPA/620/R-95/008.
U.S. EPA, 1999. EPA's Information Management Security Manual. EPA Directive 2195 Al.
U.S. EPA, 2000a. EPA's National Study of Chemical Residues in Lake Fish Tissue.
http://www.epa.gov/fishadvisories/study/sampling.htm.
U.S. EPA. 2000b. Guidance for assessing chemical contaminant data for use in fish advisories, volume 1:
Fish sampling and analysis. Third edition. EPA/823/B-00/007.
http://www.epa.gov/waterscience/fish/ (available under "National Guidance").
U.S. EPA 2001A. Environmental Monitoring and Assessment Program (EMAP) National Coastal
Assessment Quality Assurance Project Plan 2001-2004, Office of Research and Development, National
Health and Environmental Effects Research Laboratory, Gulf Ecology Division, Gulf Breeze, FL.
EPA/620/R-01/002
U.S. EPA 2001B. National Coastal Assessment: Field Operations Manual 2001, Office of Research and
Development, National Health and Environmental Effects Research Laboratory, Gulf Ecology Division,
Gulf Breeze, FL. EPA/620/R-01/003.
U.S. EPA 2001C. National Coastal Condition Report. Office of Research and Development/ Office of
Water. Washington, DC 20460.
U.S. EPA, 2001D. Agency Network Security Policy. EPA Order 2195.1 A4.
U.S. EPA 2002. Guidance for Quality Assurance Plans EPA240/R-02/009 U.S. Environmental Protection
Agency, Office of Environmental Information, Washington, D.C.
U.S. EPA 2004A. National Coastal Condition Report II, Office of Research and Development/Office of
Water. Washington, DC 20460. EPA-620/R-03/002.
U.S. EPA. 2004B. Revised Assessment of Detection and Quantitation Approaches. EPA-821-B-04-005. U.S.
Environmental Protection Agency, Office of Science and Technology, Washington, D.C.
U.S. EPA, 2006A. Method 1606: Enterococci in water by Taqman Quantitative Polymerase Chain
Reaction (qPCR) assay (draft). U.S. EPA Office of Water, Washington DC December 2006.
U.S. EPA. 2006B. Guidance on Systematic Planning Using the Data Quality Objectives Process.
EPA/240/B-06/001. U.S. Environmental Protection Agency, Office of Environmental Information,
Washington, D.C.
U.S. EPA 2008. National Coastal Condition Report III, Office of Research and Development/Office of
Water. Washington, DC 20460. EPA/842-R-08-002.
U.S. EPA, 2009. National Coastal Condition Assessment Field Operations Manual. United States
Environmental Protection Agency, Office of Water, Office of Wetlands, Oceans and Watersheds.
Washington, D.C. EPA/841-R-09-003.
U.S. EPA, 2009. National Coastal Condition Assessment Laboratory Methods Manual. United States
Environmental Protection Agency, Office of Water, Office of Wetlands, Oceans and Watersheds.
Washington, D.C. EPA/841-R-09-002.
U.S. GAO. 2000. Water Quality. GAO/RCED-00-54.
Van Dolah, R.F., J.L. Hyland, A.F. Holland, J.S. Rosen, and T.T. Snoots. 1999. A benthic index of biological
integrity for assessing habitat quality in estuaries of the southeastern USA. Mar. Environ. Res. 48(4-
5):269-283.
Van Sickle, J. and S.G. Paulsen. 2008. Assessing the attributable risks, relative risks, and regional extents
of aquatic stressors. Journal of the North American Benthological Society 27:920-931.
Wade - Enterococcus DNA in a sample, epidemiological studies (Wade et al. 2005)
Weisberg, S.B., J.A. Ranasinghe, D.M. Dauer, L.C. Schaffner, R.J. Diaz, and J.B. Frithsen. 1997. An estuarine
benthic index of biotic integrity (B-IBI) for Chesapeake Bay. Estuaries 20(1):149-158.
Attachment A
Map of Huron Erie Corridor (HEC) Sampling Sites
[Map graphic: Huron Erie Corridor sampling site locations, from design file
HEC_2014_design_map_final_9_11_14. Legend (Sample2014): Base; Base & duplicate; Base & U of I grab;
Base & U of I core; Lake alt & U of I grab; Lake alternate; River alt & U of I grab; River alternate.
Sites are labeled with the LHLEC- prefix; the Detroit and Thames rivers appear on the map, which
includes a north arrow and a scale bar.]