United States Environmental Protection Agency
Office of Water
Washington, DC
EPA 841-B-16-003
National Lakes Assessment 2017
Quality Assurance Project Plan
Version 1.1, May 2017

Approval Page
Amina Pollard
National Lakes Assessment 2017 Project Leader
U.S. EPA Office of Water
Date

Sarah Lehmann
National Lakes Assessment Project Quality Assurance Coordinator
U.S. EPA Office of Water
Date

Susan Holdsworth
Chief, Monitoring Branch
U.S. EPA Office of Water
Date

Margarete Heber
Office of Wetlands, Oceans, and Watersheds Quality Assurance Officer
U.S. EPA Office of Water
Date
VERSION HISTORY
QAPP Version	Date Approved	Changes Made
1.0	3/10/2017	Not Applicable
1.1		Minor editorial and grammatical changes throughout QAPP; list of acronyms updated; Page 5, Section 2.1: additional microcystin sample added; Page 15, Sections 3.2.1 and 3.2.3: clarifications added on RLs, precision, bias; changes made to NLA 2017 FOM and LOM (see Appendix B for a summary of those changes)
Quality Assurance Project Plan Review & Distribution Acknowledgement
& Commitment to Implement the National Lakes Assessment 2017
I/We have read the Quality Assurance Project Plan and the methods manuals for the 2017 National Lakes Assessment listed below. Our agency/organization agrees to abide by its requirements for work performed under the National Lakes Assessment 2017. Check the boxes for the appropriate documents.
Quality Assurance Project Plan	□
Site Evaluation Guidelines	□
Field Operations Manual	□
Laboratory Operations Manual	□
Field Crew Leaders: I also certify that I attended an EPA-sponsored NLA 2017 training and that all members of my crew have received training in NLA protocols (check box)	□
Field Crews: Please send a signed, scanned copy of this page to the Logistics Contractor. The Logistics Contractor ensures all parties have signed the QA forms, compiles them, and submits them to the EPA Project QA Coordinator. Send your forms to: Chris Turner, cturner@glec.com.
Labs and others: Please return the signed, scanned copy to Kendra Forde, who ensures all parties have signed the QA forms, compiles them, and submits them to the EPA QA Coordinator. Send your forms to: Kendra Forde, forde.kendra@epa.gov.
Retain a copy for your files.
NOTICE
The intention of the National Lakes Assessment 2017 (NLA 2017) is to provide a comprehensive "State of
the Lakes" assessment for lakes, ponds, and reservoirs across the United States. The complete
documentation of overall project management, design, methods, and standards is contained in this
Quality Assurance Project Plan and companion documents, including:
National Lakes Assessment 2017: Site Evaluation Guidelines (EPA 841-B-16-001)
National Lakes Assessment 2017: Field Operations Manual (EPA 841-B-16-002)
National Lakes Assessment 2017: Laboratory Operations Manual (EPA 841-B-16-004)
This document, the NLA 2017 Quality Assurance Project Plan (QAPP), contains elements of the overall
project management, data quality objectives, measurement and data acquisition, and information
management for NLA 2017. The complete QAPP includes this document and its associated Field
Operations Manual (FOM), Laboratory Operations Manual (LOM), and Site Evaluation Guidelines (SEG),
which together comprise the integrated set of QAPP documents. Methods described in this document
are to be used specifically in work relating to the NLA 2017. All project cooperators should follow these
guidelines. Mention of trade names or commercial products in this document does not constitute
endorsement or recommendation for use.
The suggested citation for this document is:
USEPA. 2017. National Lakes Assessment 2017. Quality Assurance Project Plan. V.1.1. EPA 841-B-16-003.
U.S. Environmental Protection Agency, Washington, DC.

TABLE OF CONTENTS
TITLE	FRONT COVER
APPROVAL PAGE	III
QUALITY ASSURANCE PROJECT PLAN REVIEW & DISTRIBUTION ACKNOWLEDGEMENT & COMMITMENT TO
IMPLEMENT THE NATIONAL LAKES ASSESSMENT 2017	V
NOTICE	VI
TABLE OF CONTENTS	VII
LIST OF TABLES	IX
LIST OF ACRONYMS	X
DISTRIBUTION LIST	XI
1	EXECUTIVE SUMMARY	1
1.1	Background	1
1.2	Project Organization	1
1.3	Quality Assurance Project Plan	1
1.4	Information Management Plan	1
1.5	NLA 2017 Design	2
1.6	Field Operations	2
1.7	Laboratory Operations	2
1.8	Peer Review	2
2	PROJECT PLANNING AND MANAGEMENT	4
2.1	Introduction	4
2.1.1	Project Organization	5
2.1.2	Project Schedule	7
2.2	Scope of QAPP	7
2.2.1	Field Operations	7
2.2.2	Overview of Laboratory Operations	11
2.2.3	Data Analysis and Reporting	13
2.2.4	Peer Review	13
3	DATA QUALITY OBJECTIVES	15
3.1	Data Quality Objectives	15
3.2	Measurement Quality Objectives	15
3.2.1	Laboratory Reporting Level (Sensitivity)	15
3.2.2	Field Measurements	17
3.2.3	Chemical Precision, Bias, and Accuracy	19
3.2.4	Taxonomic Precision and Accuracy of Benthic Macroinvertebrates and Zooplankton	20
3.2.5	Precision of Physical Habitat Indicators	22
3.2.6	Completeness	24
3.2.7	Comparability	24
3.2.8	Representativeness	25
4	SAMPLING DESIGN AND SITE SELECTION	26
4.1	Probability Based Sampling Design and Site Selection	26
4.2	Reference (or Least-Disturbed) Site Selection	27
5	INFORMATION MANAGEMENT	28

5.1	Roles and Responsibilities	28
5.1.1 State/Tribe-Based Data Management	30
5.2	Overview of System Structure	31
5.2.1	Data Flow	31
5.2.2	Simplified Description of Data Flow	32
5.2.3	Core Information Management Standards	33
5.2.4	Data Formats	34
5.2.5	Public Accessibility	34
5.3	Data Transfer Protocols	35
5.4	Data Quality and Results Validation	36
5.4.1	Design and Site Status Data Files	37
5.4.2	Sample Collection and Field Data	37
5.4.3	Laboratory Analyses and Data Recording	38
5.4.4	Data Review, Verification, and Validation Activities	40
5.5	Data Transfer	42
5.5.1 Database Changes	42
5.6	Metadata	42
5.7	Information Management Operations	42
5.7.1	Computing Infrastructure	42
5.7.2	Data Security and Accessibility	42
5.7.3	Life Cycle	43
5.7.4	Data Recovery and Emergency Backup Procedures	43
5.7.5	Long-Term Data Accessibility and Archive	43
5.8	Records Management	43
6	INDICATORS	44
6.1 Summary	44
6.1.1	Sampling Design	44
6.1.2	Sampling and Analytical Methods	44
6.1.3	Quality Assurance Objectives	44
6.1.4	Quality Control Procedures: Field Operations	44
6.1.5	Quality Control Procedures: Laboratory Operations	44
6.1.6	Data Management, Review, and Validation	44
7	ASSISTANCE VISITS	47
7.1	Field Evaluation and Assistance Visit Plan	47
7.2	Laboratory Evaluation and Assistance Visit Plan	47
8	DATA ANALYSIS PLAN	48
8.1	Data Interpretation Background	48
8.1.1	Scale of assessment	48
8.1.2	Selecting the best indicators	48
8.1.3	Defining least impacted (reference) condition	48
8.1.4	Determining thresholds for judging condition	48
8.2	Geospatial Data	49
8.3	Datasets Used for the Report	49
8.3.1	Trophic status and water quality	49
8.3.2	Ecological integrity	49
8.3.3	Human use	49
8.4	Indicator Data Analysis	50
8.4.1	Algal Toxins	50
8.4.2	Bacteria (E. coli)	50
8.4.3	Benthic Macroinvertebrate and Zooplankton Assemblages	50

8.4.4	Dissolved Gases	50
8.4.5	Fish eDNA	50
8.4.6	Physical Habitat	51
8.4.7	Phytoplankton Assemblages	54
8.4.8	Sediment Contaminants	54
8.4.9	Atrazine Pesticide Screen	54
8.4.10	Trophic Status	54
8.4.11	Water Chemistry, Chlorophyll a and Secchi Depth	54
9 LITERATURE CITED	55
APPENDIX A: LABORATORY LIST	58
APPENDIX B: REVISION HISTORY	60
LIST OF TABLES
Table 2.1 Field training sessions for NLA 2017	10
Table 2.2 Proposed peer review schedule for NLA 2017 report	14
Table 3.1 Important variance components for aquatic resource assessments	18
Table 5.1 Summary of IM responsibilities	28
Table 5.2 NLA 2017 Data submission software and associated file formats	35
Table 5.3 Summary sample and field data quality control activities	38
Table 5.4 Summary laboratory data quality control activities	39
Table 5.5 Data review, verification, and validation quality control activities	41
Table 6.1 Summary of indicator QA procedures and coordinators	45
Table 8.1 Physical habitat measurement data quality objectives	51
Table 8.2 Physical habitat field quality control	51
LIST OF FIGURES
Figure 2.1 National Lakes Assessment 2017 project organization chart	9
Figure 4.1 Design sites for the 2017 National Lakes Assessment	27
Figure 5.1 Conceptual model of data flow into and out of the master SQL database for the NLA 2017	33
LIST OF EQUATIONS
Equation 3-1. LT-MDL calculation for an individual analyte	16
Equation 3-2. Precision in absolute terms	19
Equation 3-3. Relative precision	19
Equation 3-4. Relative percent difference	20
Equation 3-5. Net bias	20
Equation 3-6. Bias in relative terms	20
Equation 3-7. Percent recovery	20
Equation 3-8. Percent taxonomic disagreement	21
Equation 3-9. Percent similarity	21
Equation 3-10. Percent difference in enumeration	22
Equation 3-11. Repeat visit variance	22
Equation 3-12. Signal:noise ratio	22
Equation 3-13. Source of variation in habitat variable	22
Equation 3-14. Percent completeness	24

LIST OF ACRONYMS
ANC	acid neutralizing capacity
ASTM	American Society of Testing and Materials
CH4	methane
CO2	carbon dioxide
CSDGM	Content Standards for Digital Geospatial Metadata
DBH	diameter at breast height
DO	dissolved oxygen
DOC	dissolved organic carbon
DQO	Data Quality Objectives
eDNA	environmental deoxyribonucleic acid
EMAP	Environmental Monitoring and Assessment Program
FGDC	Federal Geographic Data Committee
FOIA	Freedom of Information Act
FOM	Field Operations Manual
GIS	geographic information system
GRTS	Generalized Random Tessellation Stratified (survey design)
HDPE	high density polyethylene
H2S	hydrogen sulfide
IM	information management
LIMS	Laboratory Information Management System
LOM	Lab Operations Manual
LRL	Laboratory Reporting Limit
LT-MDL	target long-term Method Detection Limit
MDL	Method Detection Limit
MΩ/cm	megaohms/centimeter
MMI	multimetric indices
MQO	Measurement Quality Objectives
NARS	National Aquatic Resource Surveys
ND	non-detect
NHD	National Hydrography Dataset
NIST	National Institute of Standards and Technology
NLA	National Lakes Assessment
N2O	nitrous oxide
OMB	Office of Management and Budget
ORD	USEPA Office of Research and Development
OW	USEPA Office of Water
PETG	polyethylene terephthalate
QA	quality assurance
QAPP	Quality Assurance Project Plan
QA/QC	quality assurance/quality control
QC	quality control
QCS	quality control sample
RL	Reporting Limit
SEG	Site Evaluation Guidelines
SOPs	Standard Operating Procedures
SQL	Structured Query Language
TN	total nitrogen
TOC	total organic carbon
TP	total phosphorus
USEPA	United States Environmental Protection Agency
USGS	United States Geological Survey
WED	USEPA Office of Research and Development's Western Ecology Division
WQX	USEPA Water Quality Exchange

DISTRIBUTION LIST
This QAPP, which includes the associated manuals or guidelines, is distributed to the following: USEPA,
States, Tribes, universities, labs, and contractors participating in the National Lakes Assessment 2017
(NLA). USEPA Regional Survey Coordinators are responsible for distributing the NLA QAPP to State and
Tribal Water Quality Agency staff or other cooperators who will perform the field sampling and
laboratory operations. The Logistics Coordinator distributes the QAPP and associated documents to
participating project staff at their respective facilities and to the project contacts at participating
laboratories, as they are determined. If the QAPP is updated, the project lead distributes the relevant
materials via email to necessary participants.
Title	Name	Contact Information
USEPA HQ Project Lead	Amina Pollard, OW	pollard.amina@epa.gov, 202-566-2360
USEPA HQ Project QA Coordinator	Sarah Lehmann, OW	lehmann.sarah@epa.gov, 202-566-1379
USEPA HQ QA Officer	Margarete Heber, OW	heber.margarete@epa.gov, 202-566-1189
USEPA HQ Logistics Lead	Brian Hasty, OW	hasty.brian@epa.gov, 202-564-2236
USEPA HQ Laboratory Review Coordinator	Kendra Forde, OW	forde.kendra@epa.gov, 202-566-0417
Contract Logistics Coordinator	Chris Turner, GLEC	cturner@glec.com, 715-829-3737
NARS Information Management (IM) Coordinator	Marlys Cappaert, CSRA	cappaert.marlys@epa.gov, 541-754-4467, 541-754-4799 (fax)
USEPA Regional NLA Coordinators	Hilary Snook, Region 1	snook.hilary@epa.gov, 617-918-8670
	Jim Kurtenbach, Region 2	kurtenbach.james@epa.gov, 732-321-6695
	Frank Borsuk, Region 3	borsuk.frank@epa.gov, 304-234-0241
	William Richardson, Region 3	richardson.william@epa.gov, 215-814-5675
	Chris McArthur, Region 4	mcarthur.christopher@epa.gov, 404-562-9391
	Mari Nord, Region 5	nord.mari@epa.gov, 312-886-3017
	Rob Cook, Region 6	cook.robert@epa.gov, 214-665-7141
	Gary Welker, Region 7	welker.gary@epa.gov, 913-551-7177
	Kris Jensen, Region 8	jensen.kris@epa.gov, 303-312-6237
	Jeff McPherson, Region 8	mcpherson.jeffrey@epa.gov, 303-312-7752
	Janet Hashimoto, Region 9	hashimoto.janet@epa.gov, 415-972-3106
	Matthew Bolt, Region 9	bolt.matthew@epa.gov, 415-972-3578
	Lil Herger, Region 10	herger.lillian@epa.gov, 206-553-1074

1 EXECUTIVE SUMMARY
1.1	Background
To address the need for improved water quality monitoring and analysis at multiple scales, the USEPA
Office of Water (OW), in partnership with USEPA's Office of Research and Development (ORD), USEPA
regional offices, states, tribes, and other partners, assesses the condition of the nation's waters via a
statistically valid approach. Often referred to as probability-based surveys, these assessments, known as
the National Aquatic Resource Surveys (NARS), report on core indicators of water condition using
standardized field and lab methods and utilize integrated information management (IM) plans to ensure
confidence in the results at national and ecoregional scales.
The NLA 2017, which builds upon the previous NLA 2012 and NLA 2007, aims to address three key
questions about the quality of the nation's lakes and reservoirs:
¦	What percent of the nation's lakes are least, moderately, and most disturbed for key
indicators of trophic state, ecological health, and human use (recreation)?
¦	What is the relative importance of key stressors such as nutrients and pathogens?
¦	What changes are occurring in the condition of the nation's lakes?
The surveys are also designed to help expand and enhance state and tribal monitoring programs.
Through these surveys, states and tribes have the opportunity to collect data that can be used to
supplement their existing monitoring programs or to begin development of new programs.
1.2	Project Organization
Overall project coordination is conducted by USEPA's Office of Water in Washington, DC, with technical
support from the ORD's Western Ecology Division (WED) in Corvallis, Oregon. Each of the USEPA
Regional Offices has identified regional coordinators to assist in implementing the survey and coordinate
with the state/tribal crews who collect the water and sediment samples following NLA 2017 protocols.
USEPA began planning the NLA 2017 with state, tribal, and other federal partners in 2015 and is
continuing this partnership effort. USEPA expects to report the results in December 2019 in compliance
with the Data Quality Act.
1.3	Quality Assurance Project Plan
The purpose of this QAPP is to document the NLA 2017 project data quality objectives and quality
assurance/quality control measures needed to ensure that the data collected meets those objectives.
The plan contains elements of the overall project management, data quality objectives, measurement
and data acquisition, and information management for the NLA 2017 and identifies where these
elements are described in detail. This QAPP and its associated documents, the Field Operations Manual,
Laboratory Operations Manual and Site Evaluation Guidelines, are interdependent, integrated and
together make up the full QAPP for the National Lakes Assessment 2017.
1.4	Information Management Plan
Environmental monitoring efforts that amass large quantities of information from various sources
present unique and challenging data management opportunities. To meet these challenges, the NLA
2017 employs a variety of well-tested information management (IM) strategies to aid in the functional
organization and ensure the integrity of stored electronic data. IM is integral to all aspects of the NLA 2017
from initial selection of sampling sites through the dissemination and reporting of final, validated data.

A technical workgroup convened by the USEPA Project Leader is responsible for development of a data
analysis plan that includes a verification and validation strategy. These processes are summarized in the
data analysis plan section of this QAPP. Validated data are transferred to the central database managed
by information management support staff located at the Western Ecology Division facilities in Corvallis.
This database is known as the National Aquatic Resource Surveys Information Management System
(NARS IM). All validated measurement and indicator data from the NLA 2017 are eventually transferred
to USEPA's Water Quality Exchange (WQX) for archival in USEPA's STORET warehouse for public
accessibility. NLA 2017 IM staff provides support and guidance to all program operations in addition to
maintaining NARS IM.
1.5	NLA 2017 Design
USEPA used an unequal probability design to select approximately 1,000 lakes and reservoirs greater
than 1 hectare (ha) in size (note: in NLA 2007, the lower size limit was 4 ha) in the continental United
States. The design also includes revisits to approximately 10% of lakes during the 2017 sampling season
for quality assurance purposes, including evaluation of an indicator's ability to distinguish differences
among sites from differences within individual sites. Of these 1,000 lakes, 218 were previously
sampled as part of the 2012 NLA and 226 as part of the 2007 NLA. These
are collectively referred to as resample lakes. Related designs were also completed for sampling of lakes
for 12 state intensification studies, including the state of Alaska.
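For illustration only, the following minimal Python sketch draws an unequal-probability sample from a hypothetical lake frame, with selection weight varying by lake size class. It is not the GRTS algorithm used for the actual design (GRTS also spreads sites spatially), and the frame, weights, and names are invented for the example.

    import random

    # Hypothetical frame: each lake carries a size class that drives its
    # selection weight, so scarce large lakes are not swamped by the far
    # more numerous small ones.
    frame = [{"site_id": "NLA17-%04d" % i,
              "size_ha": random.choice([2, 8, 40, 250, 1500])}
             for i in range(5000)]

    def weighted_sample(frame, n, weight_fn):
        """Unequal-probability draw without replacement (simplified)."""
        remaining = list(frame)
        chosen = []
        for _ in range(n):
            weights = [weight_fn(lake) for lake in remaining]
            pick = random.choices(remaining, weights=weights, k=1)[0]
            chosen.append(pick)
            remaining.remove(pick)
        return chosen

    # Upweight larger (rarer) size classes, then set aside roughly 10% of
    # the selected sites for quality assurance revisits.
    sites = weighted_sample(frame, 1000, lambda lake: lake["size_ha"] ** 0.5)
    revisits = random.sample(sites, k=len(sites) // 10)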
1.6	Field Operations
Sample collection for NLA 2017 is designed to be completed during the index period of June through the
end of September 2017. Field data acquisition activities are implemented in a consistent manner across
the entire country. Each site is given a unique ID which identifies it throughout the pre-field, field, lab,
analysis, and data management phases of the project. Specific procedures for evaluating each sampling
location and for replacing non-sampleable sites are documented in the 2017 NLA Site Evaluation Guidelines
(SEG, EPA-841-B-16-001).
NLA 2017 indicators include: algal toxins (microcystins and cylindrospermopsin), benthic
macroinvertebrates, physical habitat, phytoplankton, atrazine pesticide screen, water chemistry and
chlorophyll-a, and zooplankton. Additional research indicators include: bacteria (E. coli), sediment
contaminants, sediment total organic carbon (TOC), sediment grain size, fish environmental DNA
(eDNA), and dissolved gases. Field measurements and sampling methods are outlined in the NLA 2017
Field Operations Manual (FOM, EPA 841-B-16-002). Field crews are trained on these methods at a
required USEPA-sponsored training session. Field sampling assistance visits are completed for each field
crew for quality assurance.
1.7	Laboratory Operations
NLA 2017 laboratory analyses are conducted either by state/tribal-selected labs or "National
Laboratories" set up by USEPA to conduct analyses for any state/tribe which so elects. The designated
National Laboratories and state/tribal labs must comply with the QA/QC requirements described in this
document and in the National Lakes Assessment 2017: Laboratory Operations Manual (LOM, EPA 841-B-
16-004). Any laboratory selected to conduct analyses with NLA 2017 samples must demonstrate that it
can meet the quality standards presented in this NLA 2017 QAPP and in the NLA 2017 LOM.
1.8	Peer Review
The NARS program, including the NLA, utilizes a three-tiered approach for peer review of the Survey:

¦	internal and external review by USEPA, states, other cooperators and partners;
¦	external scientific peer review (when applicable); and
¦	public review (when applicable).
Cooperators have been actively involved in the development of the overall project management, design,
indicator selection, and methods. Outside scientific experts from universities, research centers, and
other federal agencies have been instrumental in indicator development and will continue to play an
important role in data analysis.

2 PROJECT PLANNING AND MANAGEMENT
2.1 Introduction
In the early 2000s, several reports identified the need for improved water quality monitoring and
analysis at multiple scales. In 2000, the General Accounting Office (USGAO 2000) reported that USEPA,
states, and tribes collectively cannot make statistically valid inferences about water quality (via 305[b]
reporting) and lack data to support key management decisions. In 2001, the National Research Council
(NRC 2000) recommended that USEPA, states, and tribes promote a uniform, consistent approach to ambient
monitoring and data collection to support core water quality programs. In 2002, the H. John Heinz III
Center for Science, Economics, and the Environment (Heinz Center 2002) found that there are inadequate
data for national reporting on freshwater, coastal, and ocean water quality indicators. The National
Academy of Public Administration (NAPA 2002) stated that improved water quality monitoring is
necessary to help states and tribes make more effective use of limited resources. USEPA's Report on the
Environment 2003 (USEPA 2003) stated that there is insufficient information to provide a national
answer, with confidence and scientific credibility, to the question, 'What is the condition of U.S. waters
and watersheds?'
In response to this need, OW, in partnership with states and tribes, began a program to assess the
condition of the nation's waters via a statistically valid approach. The current assessment, the National
Lakes Assessment 2017 (referred to as NLA 2017 throughout this document), builds upon the 2012 and
2007 National Lakes Assessment as well as other NARS surveys such as the National Rivers and Streams
Assessment, National Coastal Condition Assessment, and the National Wetland Condition Assessment.
The NLA 2017 effort will provide important information to states and the public about the condition of
the nation's lake resources and key stressors on a national and regional scale.
USEPA developed this QAPP to support project participants and to ensure that the final assessment is
based on high quality data that is documented and appropriate for its intended use. The QAPP contains
elements of the overall project management, data quality objectives, measurement and data
acquisition, and information management for NLA 2017. USEPA recognizes that states and tribes may
add elements to the survey, such as supplemental indicators, that are not covered in the scope of this
integrated QAPP. USEPA expects that any supplemental elements are addressed by the states, tribes, or
their designees, in a separate approved QAPP or an addendum to this QAPP. The NLA 2017 participants
have agreed to follow this QAPP and the protocols and design laid out in this document, and its
associated documents - the NLA 2017 FOM, LOM, and SEG.
This cooperative effort between states, tribes, and federal agencies makes it possible to produce a broad-scale assessment of the condition of the nation's lakes with both confidence and scientific credibility. Through this survey, states and tribes have the opportunity to collect data that can be used to supplement their existing monitoring programs or to begin development of new programs.
The National Lakes Assessment 2017 has three main objectives:
•	Estimate the current status, trends, and changes in selected trophic, ecological, and human use indicators of the condition of the nation's lakes with known statistical confidence.
•	Seek associations between selected indicators of natural and anthropogenic stresses and indicators of ecological condition.
•	Assess changes in population status between 2007 and 2017.

An NLA 2017 workgroup, comprised of USEPA, state, and other partners, decided on a few improvements
and changes to the NLA 2012 suite of indicators. The additions include bacteria, sediment contaminants,
sediment TOC, sediment grain size, dissolved gases, fish eDNA, and an algal toxin, cylindrospermopsin.
The following indicators from NLA 2012 are not being sampled or analyzed in the NLA 2017:
macrophytes assemblage, sediment mercury, sediment dating, sediment diatoms, and dissolved carbon.
Modifications from the NLA 2012 protocols include discontinuing collection of chlorophyll-a,
phytoplankton (for cyanobacteria), and algal toxins at the littoral site and sampling for them only at the
index site. While taxonomic information is included as part of the laboratory work for the phytoplankton
index site sample, the focus for NLA (including QC and assessment) is on cyanobacteria. Mercury is
included in the new sediment contaminants indicator, but changes from the previous method may result
in data that is not comparable. Crews will collect the primary microcystin sample in a polyethylene
terephthalate (PETG) sample container rather than the previously used high density polyethylene
(HDPE) sample containers. Research has shown microcystin adsorption to HDPE bottles. To assess
differences in these two approaches and to allow for comparison back to NLA 2007 and 2012, crews will
collect a second microcystin sample in the previously used HDPE containers.
2.1.1 Project Organization
The responsibilities and accountability of the various principals and cooperators are described here and
illustrated in Figure 2.1. Overall, the project is coordinated by the Office of Water (OW) in Washington,
DC, with support from USEPA Western Ecology Division (WED) in Corvallis, Oregon. Each USEPA Regional
Office has identified a Regional USEPA Coordinator who is part of the USEPA team providing a critical
link with state and tribal partners. Cooperators work with their Regional USEPA Coordinator to address
any technical issues. The NLA implements a comprehensive quality assurance (QA) program to ensure
data integrity and provide support for the reliable interpretation of the findings from this project. The
Project Lead convenes Technical Experts Workgroups to provide the team with support for determining
the best and most appropriate approaches for key technical issues, such as: (1) the selection and
establishment of reference conditions based on least-disturbed sites and expert consensus for
characterizing benchmarks for assessment of ecological condition; (2) selection and calibration of
ecological endpoints and attributes of the biota and relationship to stressor indicators; (3) a data
analysis plan for interpreting the data and addressing the objectives in a nationwide assessment; and (4)
a framework for the reporting of the condition assessment and conveying the information on the
ecological status of the nation's lakes.
Contractor support is provided for all aspects of this project, ranging from implementing the survey,
sampling, and laboratory processing to data management, data analysis, and report writing.
Cooperators interact with their Regional USEPA Coordinator and the USEPA Project
Leader regarding contractual services.
The primary responsibilities of the principals and cooperators are as follows:
Project Leader: Amina Pollard
¦	Provides overall coordination of the project and makes decisions regarding the proper functioning of all aspects of the project.
¦	Makes assignments and delegates authority, as needed, to other parts of the project organization.
¦	Leads the Lakes Steering Committee and establishes needed technical workgroups.
¦	Interacts with the USEPA Project Team on technical, logistical, and organizational issues on a regular basis.

USEPA Field Logistics Coordinator: Brian Hasty
¦	USEPA employee who functions to support implementation of the project based on technical
guidance established by the USEPA Project Leader and serves as point-of-contact for questions
from field crews and cooperators for all activities.
¦	Tracks progress of field sampling activities.
USEPA Project QA Coordinator: Sarah Lehmann
¦	Provides leadership, development, and oversight of project-level quality assurance for NLA.
¦	Assembles and provides leadership for a NLA 2017 Quality Team.
¦	Maintains official, approved QAPP.
¦	Maintains all training materials and documentation.
¦	Maintains all laboratory accreditation files.
EPA Laboratory Review Coordinator: Kendra Forde, OW
¦	Ensures participating laboratories have the appropriate technical competencies to process
samples.
¦	Ensures participating laboratories complete sample analysis following the Laboratory Operations Manual.
¦	Ensures participating laboratories follow QA activities.
Information Management Coordinator: Marlys Cappaert
¦	A contractor who functions to support implementation of the project based on technical
guidance established by the USEPA Project Leader and Alternate USEPA Project Leader.
¦	Oversees all sample shipments and receives data forms from the Cooperators.
¦	Oversees all aspects of data entry and data management for the project.
USEPA QA Officer, Office of Wetlands, Oceans and Watersheds: Margarete Heber
¦	Functions as an independent officer overseeing all Quality Assurance (QA) and quality control
(QC) activities.
¦	Responsible for ensuring that the QA program is implemented thoroughly and adequately to
document the performance of all activities.
Regional USEPA Coordinators
¦	Assists USEPA Project Leader with regional coordination activities.
¦	Serves on the Technical Experts Workgroup and interacts with the Project Facilitator on technical, logistical, and organizational issues on a regular basis.
¦	Serves as primary point-of-contact for the Cooperators.
Steering Committee (Technical Experts Workgroup): States, USEPA, and other federal agencies
¦	Provides expert consultation on key technical issues as identified by the USEPA Coordination crew and works with the Project Facilitator to resolve approaches and strategies to enable data analysis and interpretation to be scientifically valid.
Cooperator(s): States, Tribes, USGS, others
¦	Under the scope of their assistance agreements, plans and executes their individual studies as part of the cross-jurisdictional NLA 2017 and adheres to all QA requirements and standard operating procedures (SOPs).
¦	Interacts with the Grant Coordinator, Project Facilitator, and USEPA Project Leader regarding technical, logistical, and organizational issues.
Field Sampling Crew Leader
¦	Functions as the senior member of each Cooperator's field sampling crew and the point of
contact for the Field Logistics Coordinator.
¦	Provides training and oversight to their field crew as needed.
¦	Accompanies and oversees other members of the sampling crew in the field.
¦	Responsible for overseeing all activities of the field sampling crew and ensuring that the Project
field method protocols are followed during all sampling activities.
Contractor Field Logistics Coordinator: Chris Turner
¦	A contractor who functions to support implementation of the project based on technical
guidance established by the USEPA Field Logistics Coordinator and the Project Leader.
¦	Serves as point-of-contact for questions from field crews and cooperators for all activities.
¦	Tracks progress of field sampling activities.
EPA Technical Advisor: Steven Paulsen
¦	Advises the Project Leader on the relevant experiences and technology developed within ORD
that may be used in this project.
¦	Facilitates consultations between NLA personnel and ORD scientists.
EPA Study Design Manager: Tony Olsen, ORD
¦	Provides leadership and oversight of the Design Team.
¦	Coordinates with the Project Manager and Field Logistics Coordinator to develop and manage the Sampling Frame, select sampling locations, and track field evaluation and site reconnaissance.
2.1.2 Project Schedule
Training and field sampling are conducted in 2017. The team needs to complete sample processing and
data analysis by 2018 in order to publish a report in FY 2020.
2.2 Scope of QAPP
This QAPP addresses the data acquisition efforts of the NLA 2017, which focuses on the sampling of
lakes across the United States in 2017. Data from approximately 1,000 site visits (selected with a probability design) located within the contiguous 48 states provide a comprehensive assessment of the nation's lakes. Quality information, requirements, and procedures are contained in the QAPP and its accompanying documents: the SEG, FOM, and LOM. Much of the detailed quality assurance information is in the companion documents to avoid redundancy. In these cases, the QAPP directs readers to the primary sources of this information.
2.2.1 Field Operations
All field operations information is available in the FOM.
Field operations are implemented for the NLA 2017 based on guidance developed by EMAP (Baker and Merritt 1990), experience from NLA 2007 and NLA 2012, advice from the NARS Team, and through consultation with a steering committee comprised of various state, tribal, federal, and regional agencies. Funding for states and tribes to conduct field data collection activities is provided by USEPA under Section 106 of the Clean Water Act. The project lead initiates field operations preparation by working
with the Design Team (led by ORD in Corvallis) to revise, as needed, the target population and sample
frame and to identify state/tribal or other organization-requested intensifications/modifications. The
Design Team selects sampling locations. The Project Lead distributes the list of sampling locations to the
USEPA Regional NLA Coordinators, states, and tribes and to other partners. See the Site Evaluation
Guidelines for the detailed design documentation.
With the sampling location list, state and tribal field crews can begin site reconnaissance on the primary
sites and alternate replacement sites and begin work on obtaining permission to access each site.

[Figure: organization chart. Project Management (Project Lead Amina Pollard, USEPA OW; Project QA Sarah Lehmann, USEPA OW; Technical Advisor Steve Paulsen, USEPA ORD) is linked to Study Design (Tony Olsen, USEPA ORD), Quality Assurance (Margarete Heber, USEPA OW), Field Protocols (NLA 2017 Steering Committee), and the Field Logistics Coordinator (Brian Hasty, USEPA OW). Training and Field Implementation involve USEPA HQ, USEPA ORD, USEPA Regions, States, Tribes, and Contractors. Sample flow covers algal toxins, chlorophyll a, bacteria, benthic macroinvertebrates, dissolved gases, fish eDNA, sediment contaminants/TOC/grain size, atrazine pesticide, chemistry, phytoplankton, and zooplankton. Field data flow to Information Management (USEPA WED, Marlys Cappaert); final data flow to the Web and STORET/WQX (OW); Assessment is led by OW with USEPA ORD, USEPA Regions, States, Tribes, Federal Partners, and Cooperators.]
Figure 2.1 National Lakes Assessment 2017 project organization chart.

Specific procedures for evaluating each sampling location and for replacing non-sampleable sites are
documented in the NLA 2017 SEG. Field crews procure scientific collecting permits from State, Tribal,
and Federal agencies, as needed. The field crews use standard field equipment and supplies. Field Crew
Leaders from states and tribes work with USEPA Regional Coordinators and the NARS Information
Management (IM) Center to coordinate equipment and supply requirements. This helps to ensure
comparability of protocols across states. Detailed lists of equipment required for each field protocol, as
well as guidance on equipment inspection and maintenance, are contained in the FOM.
Trained crews collect field measurements and samples. Each Field Crew Leader must be trained at an
USEPA-sponsored training session prior to the start of the field season (see Table 2.1), along with as
many crew members as possible. USEPA provides the three-day training sessions in a number of
locations around the country for cooperators and contractors. It is strongly encouraged that field crews
attend all three days of training. The training program stresses hands-on practice of methods,
comparability among crews, collection of high quality data and samples, and safety. All field crews
providing field operational support to NLA 2017 must adhere to the provisions of this integrated QAPP,
FOM, and SEG. Trainers maintain a list of all personnel trained and provide the information to the NLA
Project Lead and the QA Project Lead.
The Project QA Coordinator or his/her designated member of the Quality Team maintains training
documentation in NLA 2017 QA files. Field crews may not operate without a trained field crew leader
present.
Table 2.1 Field training sessions for NLA 2017.
Date	Training Location	Primary Trainees*
March 6-9, 2017	Folsom Lake, CA	Train the trainer
April 4-6, 2017	Moss Landing, CA	AZ, CA, HI, NV
April 11-13, 2017	Broken Bow, OK	LA, AR, OK, NM, TX
April 18-20, 2017	Denver, CO	CO, MT, ND, SD, UT, WY
May 2-4, 2017	Flintstone, MD	PA, WV, VA, DE, MD, DC, NJ
May 9-11, 2017	St. Petersburg, FL	KY, TN, MS, AL, GA, FL, SC, NC
May 16-18, 2017	Lake Geneva, WI	IL, IN, MI, MN, OH, WI
May 25-27, 2017	Kansas City, MO	IA, MO, KS, NE
June 6-8, 2017	North Chelmsford, MA	CT, ME, MA, NH, RI, VT, NY
June 13-15, 2017	Lacey, WA	AK, ID, OR, WA
* Actual trainees will change based on training dates and who is conducting the sampling.
Trained evaluators conduct evaluation and assistance visits with each Field Crew early in the sampling
and data collection process. Evaluators provide corrective actions in real time. These visits provide
USEPA with a basis for the uniform evaluation of the data collection techniques, and an opportunity to
conduct procedural reviews to minimize data loss due to improper technique or interpretation of
program guidance. The field visit evaluations are based on the uniform training, plans, and checklists.
For more information on field assistance visits see Section 8 of the FOM.
Crews may use a variety of methods to access a lake. Some sampling locations require crews to hike in,
transporting all equipment in backpacks. For this reason, EPA and the steering committee considered
ruggedness and weight as important considerations in the selection of equipment and instrumentation.
Crews may need to camp out at the sampling location and may need to provide themselves with the
necessary camping equipment.
The site verification process is outlined in the NLA 2017 SEG and FOM. EPA fully documented all
methods used in the field in step-by-step procedures in the NLA 2017 FOM. The manual also contains
detailed instructions for completing documentation, labeling samples, any field processing
requirements, and sample storage and shipping. Field communication is through Field Crew Leaders
and involves regularly scheduled conference calls or contacts with the NLA 2017 logistics staff.
Standardized field data forms are the primary means of data recording. For NLA 2017, crews are using
electronic field forms (NLA eforms application). Back-up paper forms are available if needed. On
completion, a field crew member other than the person who initially entered the information reviews
the data forms. Prior to departure from the field site, the field crew leader reviews all forms and labels
for completeness and legibility and ensures that all samples are properly labeled and packed. This
review process is done for either form of data collection (electronic or paper). Field crews also back up
electronic field data to an iStick in case data are lost from the tablet.
Upon return from field sampling to the office, field crews using electronic forms send completed forms
via email as soon as they have access to email. Field crews using paper forms send completed data
forms to the information management staff at WED in Corvallis, Oregon for entry into a computerized
database. At WED, the IM team review electronic data files independently to verify that values are
consistent with those recorded on the field data form or original field data file (see Section 5.4.4).
Field crews store or package samples for shipment in accordance with instructions contained in the NLA
2017 Field Operations Manual, including taking precautions so holding times are not exceeded. Field
crews deliver samples which must be shipped to a commercial carrier; crews maintain copies of bills of
lading or other documentation. Using the tracking form, crews notify the NARS IM Center about sample
shipment; thus, NARS IM and Logistics staff can initiate tracking procedures quickly in the event samples
are not received. Crews complete chain-of-custody forms for all transfers of samples, with copies
maintained by the field crew. The Logistics staff follows up with field crews about any missing samples
and/or incomplete files.
The field operations phase is completed with collection of all samples or expiration of the sampling
window.
2.2.2 Overview of Laboratory Operations
Holding times for surface water samples vary with the sample types and analytes. Some analytical
measurements begin during sampling (e.g., in situ profiles) while others are not initiated until sampling
has been completed (e.g., phytoplankton and zooplankton). Analytical methods are summarized in the
NLA 2017 LOM.
Chemical, physical, or biological analyses may be performed by cooperator or contractor laboratories.
Laboratories providing analytical support must have the appropriate facilities to properly store and
prepare samples and appropriate instrumentation and staff to provide data of the required quality
within the time period dictated by the project. Laboratories are expected to conduct operations using
good laboratory practices. The following are general guidelines for analytical support laboratories:
• A program of scheduled maintenance of analytical balances, water purification systems,
microscopes, laboratory equipment, and instrumentation.

•	Verification of the calibration of analytical balances using class "S" weights which are certified by
the National Institute of Standards and Technology (NIST).
•	Verification of the calibration of top-loading balances using NIST-certified class "P" weights.
•	Checking and recording the composition of fresh calibration standards against the previous lot.
Acceptable comparisons are less than or equal to two percent of the theoretical value (this
acceptance criterion is tighter than the method calibration criteria); a check of this kind is sketched after this list.
•	Recording all analytical data in bound logbooks in ink, or on standardized recording forms.
•	Verification of the calibration of uniquely identified daily use thermometers using NIST-certified
thermometers.
•	Monitoring and recording (in a logbook or on a recording form) temperatures and performance
of cold storage areas and freezer units (where samples, reagents, and standards may be stored).
During periods of sample collection operations, monitoring must be done on a daily basis.
•	An overall program of laboratory health and safety including periodic inspection and verification
of presence and adequacy of first aid and spill kits; verification of presence and performance of
safety showers, eyewash stations, and fume hoods; sufficiently exhausted reagent storage units,
where applicable; available chemical and hazardous materials inventory; and accessible safety
data sheets for all required materials.
•	An overall program of hazardous waste management and minimization, and evidence of proper
waste handling and disposal procedures (e.g., 90-day storage, manifested waste streams, etc.).
•	If needed, having a source of reagent water meeting American Society of Testing and Materials
(ASTM) Type I specifications for resistivity (>18 MΩ/cm at 25 °C; ASTM D1193-06)
available in sufficient quantity to support analytical operations.
•	Appropriate microscopes or other magnification for biological sample sorting and organism
identification.
•	Approved biological identification and taxonomic keys/guides for use in biological identification
(zooplankton and benthic macroinvertebrates) as appropriate.
•	Labeling all containers used in the laboratory with date prepared, contents, and initials of the
individual who prepared the contents.
•	Dating and storing all chemicals safely upon receipt. Chemicals are disposed of properly when
they have passed their expiration dates.
•	Using a laboratory information management system to track the location and status of any
sample received for analysis.
•	Reporting results electronically using standard formats and units compatible with NARS IM (see
NLA 2017 LOM for data templates). These files are labeled properly by referencing the indicator
and/or analyte and date.
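As an illustration of the calibration-standard check in the list above, the short Python sketch below flags a fresh standard that differs from its theoretical value by more than two percent. The function name and example values are hypothetical, not part of the LOM.

    def check_calibration_standard(measured, theoretical, tolerance=0.02):
        """Return True when a fresh standard agrees with its theoretical
        value to within the tolerance (two percent by default)."""
        relative_difference = abs(measured - theoretical) / theoretical
        return relative_difference <= tolerance

    # Example: a 5.00 mg/L standard measured at 5.08 mg/L differs by
    # 1.6 percent, so it passes the <= 2 percent criterion.
    assert check_calibration_standard(5.08, 5.00)
    # A 3 percent difference fails the criterion.
    assert not check_calibration_standard(5.15, 5.00)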
All laboratories providing analytical support to NLA 2017 must adhere to the provisions of this
integrated QAPP and LOM. Laboratories provide information documenting their ability to conduct the
analyses with the required level of data quality prior to data analysis. EPA provides different
requirements based on the type of analysis being done by the lab (i.e., chemistry vs. biological analyses).

Labs send the documentation to the Laboratory Review Coordinator at USEPA Headquarters (or other
such designated parties) to maintain in the NLA 2017 QA files. Such information may include the
following, depending on the evaluation by the Quality Assurance Project Coordinator and the Laboratory
Review Coordinator:
•	Signed Quality Assurance Project Plan by the laboratory performing the analysis.
•	Signed Laboratory Form.
•	Valid Accreditation or Certification.
•	Laboratory's Quality Manual and/or Data Management Plan.
•	Method Detection Limits (MDL).
•	Demonstration of Capability.
•	Results from inter-laboratory comparison studies.
•	Analysis of performance evaluation samples.
•	Control charts and results of internal QC sample or internal reference sample analyses to document achieved precision, bias, accuracy.
Other laboratory requirements may include:
•	Participation in calls regarding laboratory procedures and processes with participating
laboratories.
•	Participation in a laboratory technical assessment or audit.
•	Participation in performance evaluation studies.
•	Participation in inter-laboratory sample exchange.
All qualified laboratories shall work with the NARS IM Center to track samples as specified in Section 1 of
the LOM.
2.2.2.1 Biological Laboratory Quality Evaluation
The NLA 2017 Quality Team requested and, whenever possible, reviewed the past performance of
biological laboratories. The biological laboratories shall adhere to the quality assurance objectives and
requirements as specified for the pertinent indicators in the LOM.
2.2.3	Data Analysis and Reporting
A technical data analysis and reporting workgroup convened by the USEPA Project Leader is responsible
for development of a data analysis plan that includes a verification and validation strategy. These
processes are summarized in the data analysis sections of this QAPP. Validated data are transferred to
the central database managed by NARS IM support staff located at WED in Corvallis. Information
management activities are discussed further in Section 5. Data in the WED database are available to
Cooperators for use in development of indicator metrics. All validated measurement and indicator data
from NLA 2017 are eventually transferred to USEPA's Water Quality Exchange (WQX) and then the
National STORET warehouse.
2.2.4	Peer Review
If deemed necessary, the NLA 2017 report will undergo a thorough peer review process. Cooperators
have been actively involved in the development of the overall project management, design, methods,
and standards, including the drafting of four key project documents:
•	Quality Assurance Project Plan.
•	Site Evaluation Guidelines.
•	Field Operations Manual.
•	Laboratory Operations Manual.
The USEPA NARS program, including the NLA 2017, utilizes a three-tiered approach for peer review of
the Survey: (1) internal and external review by USEPA, states, other cooperators and partners, (2)
external scientific peer review, when applicable, and (3) public review, when applicable.
Once data analysis has been completed, cooperators examine the results. The NLA team reviews
comments and feedback from the cooperators and incorporates such feedback into the draft report,
when appropriate. The NLA Project Team follows Agency and OMB requirements for public and peer
review. External scientific peer review and public review are initiated for new analyses or approaches as
appropriate. Additionally, following applicable guidance, other aspects of the NLA may undergo public
and scientific peer review.
Below are the measures USEPA proposes for engaging in the peer review process:
¦	Follow USEPA's Information Quality Guidelines and complete the checklist.
¦	Develop and maintain a public website with links to standard operating procedures, quality
assurance documents, fact sheets, scientific peer review feedback, and final report.
¦	Conduct technical workgroup meetings composed of scientific experts, cooperators, and USEPA
to evaluate and recommend data analysis options and indicators.
¦	Complete data validation on all chemical, physical and biological data.
¦	Conduct final data analysis with workgroup to generate assessment results.
¦	Engage peer review contractor to identify external peer review panel (if applicable).
¦	Develop draft report presenting assessment results.
¦	Develop final draft report incorporating input from cooperators and results from data analysis
group, to be distributed for peer review.
¦	Issue Federal Register (FR) Notice announcing document availability and hold public comment
(30-45 days) (if applicable).
¦	Consider public comments (if applicable) and produce a final report.
The proposed peer review schedule is provided below in Table 2.2 and is contingent upon timeliness of
data validation and schedule availability for regional meetings and experts for the data analysis workshop.
Table 2.2 Proposed peer review schedule for NLA 2017 report.
Proposed Schedule	Activity
May - December 2018	Data validation
May - August 2019	Internal data analysis and review meetings (e.g., web conferences)
August 2019	Draft released for external peer review (if applicable)
October 2020	Draft released for public review (if applicable)

3 DATA QUALITY OBJECTIVES
It is a policy of the USEPA that Data Quality Objectives (DQOs) be developed for all environmental data
collection activities following the prescribed DQO Process. DQOs are qualitative and quantitative
statements that clarify study objectives, define the appropriate types of data, and specify the tolerable
levels of potential decision errors that will be used as the basis for establishing the quality and quantity
of data needed to support decisions (USEPA 2006). Data quality objectives thus provide the criteria to
design a sampling program within cost and resource constraints or technology limitations imposed upon
a project or study. DQOs are typically expressed in terms of acceptable uncertainty (e.g., width of an
uncertainty band or interval) associated with a point estimate at a desired level of statistical confidence
(USEPA 2006). The DQO Process is used to establish performance or acceptance criteria, which serve as
the basis for designing a plan for collecting data of sufficient quality and quantity to support the goals of
a study (USEPA 2006). As a general rule, performance criteria represent the full set of specifications that
are needed to design a data or information collection effort such that, when implemented, it will
generate newly-collected data that are of sufficient quality and quantity to address the project's goals
(USEPA 2006). Acceptance criteria are specifications intended to evaluate the adequacy of one or more
existing sources of information or data as being acceptable to support the project's intended use (USEPA
2006).
3.1 Data Quality Objectives
Target DQOs established for the NLA 2017 relate to the goal of describing the current status of selected
indicators of the condition of lakes in the conterminous U.S. and ecoregions of interest. The formal
statement of the DQO for national estimates is as follows:
•	Estimate the proportion of lakes (± 5%) in the conterminous U.S. that fall below the designated
threshold for good conditions for selected measures with 95% confidence.
For the ecoregions of interest, the DQO is:
•	Estimate the proportion of lakes (± 15%) in a specific ecoregion that fall below the designated
threshold for good conditions for selected measures with 95% confidence.
For estimates of change, the DQOs are:
•	Estimate the proportion of lakes (± 7%) in the conterminous U.S. that have changed condition
classes for selected measures with 95% confidence.
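As a rough illustration of how these targets relate to the survey's sample sizes, the Python sketch below computes the normal-approximation margin of error for an estimated proportion under simple random sampling. The NLA's unequal-probability design requires weighted variance estimation, so this simplification is illustrative only.

    import math

    def margin_of_error(p_hat, n, z=1.96):
        """95% confidence half-width for an estimated proportion,
        using the simple random sampling approximation."""
        return z * math.sqrt(p_hat * (1 - p_hat) / n)

    # Worst case (p = 0.5) with roughly 1,000 national sites:
    print(margin_of_error(0.5, 1000))  # ~0.031, inside the +/-5% national target
    # A single ecoregion with roughly 100 sites:
    print(margin_of_error(0.5, 100))   # ~0.098, inside the +/-15% ecoregion target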
3.2 Measurement Quality Objectives
For each parameter, performance objectives (associated primarily with measurement error) are established for several different data quality indicators (following USEPA Guidance for Quality Assurance Plans, USEPA 2002). Specific Measurement Quality Objectives (MQOs) for each parameter are presented in the indicator-specific sections. The following subsections define the different data quality indicators and present approaches for evaluating them against acceptance criteria established for the program.
3.2.1 Laboratory Reporting Level (Sensitivity)
For water chemistry measurements, requirements for the method detection limit (MDL) are typically established (see indicator-specific information in the LOM for specifics on what is used for each indicator). The MDL is defined as the lowest level of analyte that can be distinguished from zero with 99
percent confidence based on seven measurements (40 CFR 136, App. B). USGS NWQL has developed a
variant of the MDL called the long-term MDL (LT-MDL) to capture greater method variability (Oblinger
Childress et al. 1999). Unlike MDL, it is designed to incorporate more of the measurement variability
that is typical for routine analyses in a production laboratory, such as multiple instruments, operators,
calibrations, and sample preparation events (Oblinger Childress et al. 1999). Because the LT-MDL
addresses more potential sources of variability than the MDL, the NLA uses the LT-MDL.
For the NLA, target LT-MDL values (following Oblinger Childress et al. 1999) were established for each chemical analyte based on the anticipated range of concentrations, the values required as thresholds for assigning lake condition based on chemical stressors (e.g., nutrients, acidification, salinity) or trophic state (oligotrophic vs. mesotrophic vs. eutrophic), and the capability of analytical laboratories to measure an analyte at low concentrations over time given available methods.
The LT-MDL determination ideally employs at least 24 blanks and spiked samples prepared and analyzed
by multiple analysts on multiple instruments over a 6- to 12-month period at a frequency of about two
samples per month (USEPA 2004). The LT-MDL uses "F-pseudosigma" (Fσ) in place of s, the sample standard deviation used in the EPA MDL calculation. F-pseudosigma is a nonparametric measure of variability that is based on the interquartile range of the data (USEPA 2004). The LT-MDL is calculated using either the mean or median of a set of long-term blanks, and from long-term spiked sample results (depending on the analyte and specific analytical method). The LT-MDL for an individual analyte is calculated as:
Equation 3-1. LT-MDL calculation for an individual analyte.

LT-MDL = M + (t_0.99 × Fσ)

where:
M = the mean or median of blank results
t_0.99 = the Student's t-statistic at the 99th percentile for n − 1 degrees of freedom
n = the number of spiked sample results
Fσ = F-pseudosigma, a nonparametric estimate of variability, calculated as:

Fσ = (Q3 − Q1) / 1.349

where:
Q3 = the 75th percentile of spiked sample results
Q1 = the 25th percentile of spiked sample results

The LT-MDL is designed to be used in conjunction with a laboratory reporting level (LRL; Oblinger Childress et al. 1999).

The laboratory monitors performance using the determined/calculated LT-MDL values, but uses MDLs determined based on 40 CFR 136, App. B to establish MDLs and Reporting Levels (RLs) for reporting purposes, estimates, and flagging (RLs are also known as minimal reporting levels). The RL values are designed to achieve a risk of <1% for both false negatives and false positives (Oblinger Childress et al. 1999). The Laboratory Reporting Limit (LRL) is set at two times the target LT-MDL value. Therefore, multiple measurements of a sample having a true concentration at the RL should result in the concentration being detected and reported 99 percent of the time (Oblinger Childress et al. 1999).
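A minimal R sketch of Equation 3-1 (the QAPP's analyses are implemented in SAS or R), assuming hypothetical vectors of long-term blank and spiked-sample results; the function and data names are illustrative only:

# Sketch of the LT-MDL calculation (Equation 3-1)
lt_mdl <- function(blanks, spikes, use_median = TRUE) {
  M <- if (use_median) median(blanks) else mean(blanks)  # center of blank results
  n <- length(spikes)
  # F-pseudosigma: interquartile range of spiked results, scaled so that it
  # approximates a standard deviation for normally distributed data
  f_sigma <- (quantile(spikes, 0.75) - quantile(spikes, 0.25)) / 1.349
  # Student's t at the 99th percentile with n - 1 degrees of freedom
  unname(M + qt(0.99, df = n - 1) * f_sigma)
}

# Illustrative concentrations (mg/L)
blanks <- c(0.001, 0.000, 0.002, 0.001, 0.000, 0.001)
spikes <- c(0.051, 0.048, 0.053, 0.049, 0.050, 0.052, 0.047, 0.054)
lt_mdl(blanks, spikes)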
Target MDL and RL values are based on the presumption that a laboratory receives samples from across
the United States. Laboratories analyzing NLA samples from a more restricted region may have modified
target RL values based on the range of expected concentrations and required threshold values. A
modified RL for a "regional" laboratory cannot be greater than a required threshold value used in the
NLA assessment. The objective for NLA is to minimize the number of values reported as "estimated" by
an individual laboratory (i.e., between an estimated MDL and the laboratory RL).
For chemical analyses, all participating laboratories will monitor their target RL values by one (or both)
of the following approaches:
1)	For every calibration curve, include a calibration standard with an analyte concentration equal to
the RL.
2)	Monitor the RL by including a Quality Control Sample (QCS) with a concentration equal to the RL
with each analytical batch. Results of each QCS analysis must meet the acceptance criteria
established for precision and bias (Section 3.2.3).
Laboratories are encouraged to conduct evaluations of analytical performance using samples at the
target RLs established based on a "national" laboratory (receiving samples from across the US). These
studies provide an indication of the confidence that can be placed on "estimated" results reported by
the laboratory.
Laboratories must submit estimates of RLs (and how they were determined) with analytical results. Laboratories must flag analytical results associated with RLs that exceed the objectives as being associated with unacceptable RLs. Laboratories must report analytical data that are below the estimated RLs but above the laboratory's MDL, flagging these as "estimated" values (detected but not quantified). Laboratories should report, if possible, values below the MDL, but must flag each such value as being below the MDL. If a laboratory has to report values below the MDL as being equal to the MDL, this must be clearly stated in the metadata submitted with any analytical results to avoid the misuse of these results in assessment analyses.
3.2.2 Field Measurements
Since analytical (or field) precision, bias, and accuracy of field measurements are not monitored separately during the NLA 2017, a revisit-site approach is implemented to help evaluate the quality of
data (revisiting sites within the NLA 2017 index period). The survey design also incorporates a plan for
resampling a subset of sites from previous NLAs (including a subset of 226 lakes that were originally
sampled in NLA 2007 and 218 lakes that were originally sampled in NLA 2012). Data from these repeat
visits provide estimates of important components of variance to evaluate the performance of ecological
indicators, evaluate the capability of the survey design to estimate status vs. detect trend, and to
potentially reduce bias in the population estimates by "de-convoluting" the variance. These variance
components are presented in Table 3.1. If estimates of these components are available from other
studies, they are used in conjunction with the project requirements to evaluate alternative design
scenarios (Larsen et al., 1995, 2001, 2004). Status estimates are influenced most by the interaction (if
multiple years are required to complete sampling) and residual variance components. Residual variance
is composed of temporal variance within a sampling period confounded with measurement error of

various types. If the magnitude of residual variance is sufficiently large to impact status estimates (see
above), then relative magnitudes of the interaction variance and various components of residual
variance are examined to determine if any reduction can be achieved in the future. Interaction variance
can only be reduced by increasing the sample size. Index variance can be reduced by either increasing
the number of sites, increasing the number of times a site is visited within a year, reducing the length of
the index period, or by reducing measurement error. Trend detection is evaluated using the equation to
determine the variance in the slope of the trend (Table 3.1). In the equation, residual variance also
includes the interaction component. For multi-site networks such as the national aquatic resource
assessments, trend detection is most sensitive to coherent year variance, which can only be reduced by
extending the time period for monitoring (Larsen et al., 1995, 2001, 2004). If residual variance is large
relative to the coherent year variance, then trend detection within a fixed time period can be improved
by increasing the number of sites sampled each year, increasing the number of times each site is
sampled within a year, or by reducing measurement error.
Table 3.1 Important variance components for aquatic resource assessments.
Model for status estimation:

σ²_total = σ²_sites + σ²_year + σ²_site×year + σ²_residual

and

σ²_residual = σ²_within-year + σ²_error

Model for trend detection:

var(slope) = [σ²_year + (σ'²_residual / n_sites)] / Σᵢ₌₁ⁿ (Yᵢ − Ȳ)²   (summed over the n years of monitoring, where Yᵢ is year i)

and

σ'²_residual = σ²_site×year + (σ²_residual / n_visits)

Components in parentheses represent "extraneous" variance.

Variance Component and Description:

σ²_sites — Observed variance among all sites or streams sampled over a multiple-year sampling cycle. If sites are revisited across years, this effect can be eliminated.

σ²_year — Coherent variance across years that affects all sites equally, due to regional-scale factors such as climate or hydrology. Principal effect on trend detection; reduced only by increasing the number of years.

σ²_site×year — "Interaction" variance occurring at each site across years that affects each site independently. Principal effect on status; reduce by increasing the number of sites.

σ²_residual — "Residual" variance: includes temporal variance at each site within a single index period (σ²_within-year) confounded with measurement error (σ²_error) due to acquiring the data from the site (e.g., sample collection and analysis). Principal effect on status. If σ²_index >> σ²_error, reduce by increasing the number of sites or altering the index period. If σ²_error is large relative to σ²_index, modify sampling and analysis procedures.
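The trend-detection model can also be evaluated numerically. The R sketch below implements the variance-of-slope expression as reconstructed in Table 3.1; all variance components and design numbers are illustrative assumptions, not NLA estimates:

# Variance of the trend slope given assumed variance components (Table 3.1)
var_slope <- function(s2_year, s2_sitexyear, s2_residual,
                      n_sites, n_visits, yrs) {
  # residual variance for trend detection includes the site-by-year interaction
  s2_resid_prime <- s2_sitexyear + s2_residual / n_visits
  (s2_year + s2_resid_prime / n_sites) / sum((yrs - mean(yrs))^2)
}

var_slope(s2_year = 0.02, s2_sitexyear = 0.05, s2_residual = 0.10,
          n_sites = 900, n_visits = 1, yrs = 2007:2017)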
For NLA 2017, 10 percent of all sample sites receive repeat visits to determine temporal variability plus analytical variability within the index period. Revisit sites must be sampled at least twice, with visits spaced as far apart as possible within the index period, to ensure that temporal variability is assessed. The NLA team

implements control measures to minimize measurement error among crews and sites. These control
measures include the use of standardized field protocols provided in the FOM, consistent training of all
crews, field assistance visits to all field crews, and availability of experienced technical personnel during
the field season to respond to site-specific questions from field crews as they arise.
3.2.3 Chemical Precision, Bias, and Accuracy
The information in this section is particularly relevant to the analysis of water chemistry precision, bias, and accuracy; sediment chemistry precision and accuracy; and bacteria precision. See the relevant sections of the LOM for specifics on how these are applied, and for additional information on QC procedures for other indicators.
Precision and bias are estimates of random and systematic error in a measurement process (Kirchmer,
1983; Hunt and Wilson, 1986; USEPA, 2002). Collectively, precision and bias provide an estimate of the
total error or uncertainty associated with an individual measurement or set of measurements.
Systematic errors are minimized by using validated methods and standardized procedures across all
laboratories. Precision is estimated from repeated measurements of samples. Net bias is determined
from repeated measurements of solutions of known composition, or from the analysis of samples that
have been fortified by the addition of a known quantity of analyte. For analytes with large ranges of
expected concentrations, MQOs for precision and bias are established in both absolute and relative
terms, following the approach outlined in Hunt and Wilson (1986). At lower concentrations, MQOs are
specified in absolute terms. At higher concentrations, MQOs are stated in relative terms. The point of
transition between an absolute and relative MQO is calculated as the quotient of the absolute objective
divided by the relative objective (expressed as a proportion, e.g., 0.10 rather than as a percentage, e.g.,
10%).
Precision in absolute terms is estimated as the sample standard deviation (s) when the number of measurements is greater than two:

Equation 3-2. Precision in absolute terms.

s = √[ Σᵢ₌₁ⁿ (xᵢ − x̄)² / (n − 1) ]

where xᵢ is the value of the ith replicate, x̄ is the mean of the repeated sample measurements, and n is the number of replicates. Relative precision for such measurements is estimated as the relative standard deviation (RSD, or coefficient of variation [CV]):

Equation 3-3. Relative precision.

RSD = s / x̄

where s is the sample standard deviation of the set of measurements, and x̄ equals the mean value for the set of measurements. Both RSD and CV can be expressed as percentages by multiplying by 100.
Precision based on duplicate measurements is estimated based on the range of measured values (which equals the difference for two measurements). The relative percent difference (RPD) is calculated as:

Equation 3-4. Relative percent difference.

RPD = [ |A − B| / ((A + B)/2) ] × 100
where A is the first measured value, and B is the second measured value.
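A minimal R sketch of Equations 3-2 through 3-4, using made-up replicate and duplicate results:

precision_abs <- function(x) sd(x)                      # Equation 3-2: s
precision_rel <- function(x) sd(x) / mean(x)            # Equation 3-3: RSD
rpd <- function(a, b) abs(a - b) / ((a + b) / 2) * 100  # Equation 3-4: RPD

reps <- c(10.1, 9.8, 10.3, 10.0)  # illustrative replicate measurements
precision_abs(reps)
precision_rel(reps) * 100         # RSD expressed as a percentage
rpd(10.2, 9.7)                    # illustrative duplicate measurements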
For repeated measurements of samples of known composition, net bias (B) is estimated in absolute terms as:

Equation 3-5. Net bias.

B = x̄ − T

where x̄ equals the mean value for the set of measurements, and T equals the theoretical or target value of a performance evaluation sample.

Bias in relative terms (B(%)) is calculated as:

Equation 3-6. Bias in relative terms.

B(%) = [ (x̄ − T) / T ] × 100

where x̄ equals the mean value for the set of measurements, and T equals the theoretical or target value of a performance evaluation sample.
Accuracy is estimated for some analytes from fortified or spiked samples as the percent recovery. Percent recovery (%recovery) is calculated as:

Equation 3-7. Percent recovery.

%recovery = [ (C′s − Cu) / Cs ] × 100

where C′s is the measured concentration of the spiked sample, Cu is the concentration of the unspiked sample, and Cs is the concentration of the spike.
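A minimal R sketch of Equations 3-5 through 3-7, with made-up performance evaluation (PE) sample and spike values:

net_bias     <- function(x, target) mean(x) - target                   # Equation 3-5
rel_bias_pct <- function(x, target) (mean(x) - target) / target * 100  # Equation 3-6
pct_recovery <- function(c_spiked, c_unspiked, c_spike) {
  (c_spiked - c_unspiked) / c_spike * 100                              # Equation 3-7
}

pe <- c(4.9, 5.2, 5.0, 5.1)  # illustrative PE results; target value T = 5.0
net_bias(pe, target = 5.0)
rel_bias_pct(pe, target = 5.0)
pct_recovery(c_spiked = 14.6, c_unspiked = 5.0, c_spike = 10.0)  # 96% recovery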
For NLA 2017, each laboratory must monitor precision and bias for every sample batch by analyzing internal QC samples. Laboratories also report percent recovery to determine accuracy. Laboratories must review and re-analyze samples associated with unacceptable QC sample results within one week or the analyte holding time, whichever is longer. Laboratories should consult with the Project QA manager about any unacceptable results within one week to verify that the appropriate corrective actions are taken.
3.2.4 Taxonomic Precision and Accuracy of Benthic Macroinvertebrates and Zooplankton
NLA 2017 includes two layers of quality assurance for biological data: internal and external.
3.2.4.1 Internal quality assurance and quality control for biological data
Each laboratory conducts internal, or within laboratory, quality assurance and quality control activities.
Each laboratory must evaluate the sorting efficiency of the NLA 2017 laboratory analysts. All laboratory

analysts responsible for taxonomic identification must participate in an internal taxonomic verification
check. The details of the sorting and taxonomic verifications can be found in the indicator-specific
sections of the NLA 2017 LOM.
3.2.4.2 External quality assurance for biological data
Each laboratory participates in external, or among laboratory, quality assurance. In general, external
quality assurance takes two forms: (1) an independent taxonomist re-analyzes 10% of samples or (2) all
of the laboratories participate in a round robin, where they swap 10% of samples among laboratories
and re-analyze them. The details of the external quality assurance requirements (e.g., taxonomic
resolution, calculations) are found in the indicator-specific sections of the NLA 2017 LOM.
For benthic macroinvertebrates and zooplankton, accuracy of taxonomy is qualitatively evaluated
through specification of target hierarchical levels (e.g., family, genus, or species); and the specification
of appropriate technical taxonomic literature or other references (e.g., identification keys). To calculate
taxonomic precision, USEPA randomly selects 10% of the samples for re-identification by an
independent, outside taxonomist or laboratory. Comparison of the results of whole sample re-
identifications provides a Percent Taxonomic Disagreement (PTD) calculated as:
Equation 3-8. Percent taxonomic disagreement.

PTD = [ 1 − (comp_pos / N) ] × 100

where comp_pos is the number of agreements, and N is the total number of individuals in the larger of the
two counts. The lower the PTD, the more similar the taxonomic results and the greater the overall
taxonomic precision. An MQO of 15% is recommended for taxonomic difference (overall mean <15% is
acceptable). Individual samples exceeding 15% are examined for taxonomic areas of substantial
disagreement, and the reasons for disagreement investigated.
In addition, percent similarity (PSC) is calculated between the taxonomic laboratories. Percent similarity
is a measure of similarity between two communities or two samples (Washington, 1984). Values range
from 0% for samples with no species in common, to 100% for samples which are identical. It is
calculated as follows:
Equation 3-9. Percent similarity.

PSC = 1 − 0.5 Σᵢ |aᵢ − bᵢ|

where aᵢ and bᵢ are, for a given species i, the relative proportions of the total samples A and B, respectively, that the species represents. An MQO of ≥85% is recommended for percent similarity of taxonomic identification. If the MQO is not met, the reasons for the discrepancies between analysts are discussed. If a major discrepancy is found in how the two analysts have been identifying organisms, the last batch of samples counted by the analyst under review may have to be re-counted.
Sample enumeration is another component of taxonomic precision. Final specimen counts for samples are dependent on the taxonomist, not the rough counts obtained during the sorting activity. Comparison of counts is quantified by calculation of Percent Difference in Enumeration (PDE), calculated as:

Equation 3-10. Percent difference in enumeration.

PDE = [ |Lab1 − Lab2| / (Lab1 + Lab2) ] × 100
An MQO of 5% is recommended (overall mean of <5% is acceptable). Individual samples exceeding 5%
are examined to determine reasons for the exceedance.
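Minimal R sketches of Equations 3-8 through 3-10, assuming hypothetical counts and relative proportions (the values are illustrative, not NLA results):

ptd <- function(comp_pos, n) (1 - comp_pos / n) * 100               # Equation 3-8
psc <- function(p_a, p_b) 100 * (1 - 0.5 * sum(abs(p_a - p_b)))     # Equation 3-9, as %
pde <- function(lab1, lab2) abs(lab1 - lab2) / (lab1 + lab2) * 100  # Equation 3-10

ptd(comp_pos = 470, n = 520)  # 9.6%; meets the <15% MQO
# relative proportions of each taxon in samples A and B (each sums to 1)
psc(p_a = c(0.50, 0.30, 0.20), p_b = c(0.45, 0.35, 0.20))  # 95%; meets the >=85% MQO
pde(lab1 = 512, lab2 = 498)   # 1.4%; meets the <5% MQO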
Corrective actions for samples exceeding these MQOs can include defining the taxa for which re-identification may be necessary (potentially by a third party), identifying the samples for which re-identification is necessary (even outside of the 10% lot of QC samples), and determining where there may be nomenclatural or enumeration problems. Specific corrective actions are identified in the indicator sections of the LOM.
Taxonomic accuracy is evaluated by having individual specimens (representative of selected taxa)
identified by recognized experts. Samples are identified using the most appropriate technical literature
that is accepted by the taxonomic discipline and that reflects the accepted nomenclature including the
NLA taxonomic lists from past surveys. Specific references are identified in the indicator sections in the
LOM. Any laboratory or taxonomist who believes these are not sufficient must contact the USEPA NLA
Project Leader and Project QA Coordinator to discuss options. The internal NLA taxonomic lists are used
to verify nomenclatural validity and spelling. A reference collection is compiled as the samples are
identified. If necessary, specialists in several taxonomic groups verify selected individuals of different
taxa, as determined by the NLA workgroup.
3.2.5 Precision of Physical Habitat Indicators
In a regional or national assessment of status, differences among lakes are the signal of interest, but real
differences can be obscured by noise variance (Paulsen et al. 1991, Kaufmann et al. 1999). The habitat
variables (metrics) of interest are lake summary variables based on measurements at 10 randomized,
equidistant nearshore stations employing measurements and observations at littoral, riparian, and
drawdown zone plots at each of those stations.
Measures of variance between repeat visits within the sampling season of the same year provide
accurate estimates of the variances in individual lake habitat metrics that would be encountered in a
spatially extensive survey carried out over a typical summer field season. Repeat visit variance includes
the combined effects of within-season habitat variation, measurement variation, changes in the
locations of sampling plots between visits to individual lakes, and variation in estimates obtained by
different field crews. Analysts employed variance components analysis to estimate repeat visit variance and the signal:noise (S/N) ratio, which is one expression of the relative precision of habitat metrics (Kaufmann et al. 1999).

Equation 3-11. Repeat visit variance.

σ²_rep = the variance among repeat visits to the same lake within the index period (the variance of E_ik in the simplified model below)

Equation 3-12. Signal:noise ratio.

S/N = σ²_lake / σ²_rep

Analysts used the general random-effects model of Kincaid et al. (2004) to model the sources of variation in any habitat variable, Y, as:

Equation 3-13. Sources of variation in a habitat variable.

Y_ijk = μ + L_i + T_j + LT_ij + E_ijk

Here Y_ijk is the measured metric value for the kth visit to lake i within the jth year. The grand mean value is μ, and L and T are random lake and year effects, respectively. For the NLA, data came from a single year, so the year (T) and lake:year interaction (LT) terms in Equation 3-13 are zero, and the model simplifies to the form Y_ik = μ + L_i + E_ik. The residual error (E_ik) of the simplified model represents within-year variation at any single lake, which was estimated from a subset of the lakes that were resampled the same summer. Analysts assume that L_i and E_ik are normally distributed random effects, with variances of σ²_lake and σ²_rep, respectively. The combined data set, containing samples from different lakes as well as revisits to the same lakes, enabled estimation of both among-lake variance (σ²_lake) and repeat visit variance (σ²_rep) using restricted maximum likelihood (Littell et al. 2006).
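As a hedged illustration, the simplified model Y_ik = μ + L_i + E_ik can be fit by REML in R with the lme4 package (the QAPP performs analyses in SAS or R); the data frame hab and its columns are hypothetical:

library(lme4)

# metric = habitat metric value; lake_id = lake identifier (revisited lakes
# appear more than once within the index period)
fit <- lmer(metric ~ 1 + (1 | lake_id), data = hab, REML = TRUE)
vc  <- as.data.frame(VarCorr(fit))
sigma2_lake <- vc$vcov[vc$grp == "lake_id"]   # among-lake variance (signal)
sigma2_rep  <- vc$vcov[vc$grp == "Residual"]  # repeat-visit variance (noise)
sigma2_lake / sigma2_rep                      # S/N ratio (Equation 3-12)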
In the synoptic survey context, variance among lakes (σ²_lake) is the signal of interest, and the variance in revisits within the index period from all sources (σ²_rep) is noise variance; we define their ratio as S/N. The
methods used to quantify precision, the precision of NLA lake physical habitat metrics and key habitat
condition indices, the implications of varying precision levels for monitoring and assessment, application
of habitat condition indicators in a national assessment, and the biological relevance of the NLA
indicators are comprehensively evaluated by Kaufmann et al. (2014a,b,c). Below is a summary of
precision for key physical habitat indicators based on the NLA 2012 survey data, which employed the
same field methods as NLA 2017.
The key NLA physical habitat indices had moderate to high S/N (2.2 - 11.0) over the entire NLA-2012
survey (Table 7 and Appendix B, USEPA 2016). Compared with the other composite indices, the human
disturbance index RDis_IX and horizontal drawdown index had the highest S/N (9.1-11), whereas the
littoral cover O/E index had the lowest S/N (2.2). The advantage of S/N as a precision measure is its
relevance to many types of statistical analysis and detecting differences in subpopulation means (Zar
1999). High noise in habitat descriptions relative to the signal (i.e., a low signal:noise ratio, S/N)
diminishes statistical power to detect differences among lakes or groups of lakes. Imprecise data limit
the ability to detect temporal trends (Larsen et al. 2001, 2004). Noise variance also limits the maximum
amount of variance that can be explained by models such as multiple linear regression (Van Sickle et al.
2005, Kaufmann and Hughes 2006). By reducing the ability to quantify associations between variables
(Allen et al. 1999, Kaufmann et al. 1999), imprecision compromises the usefulness of habitat data for
discerning likely controls on biota and diagnosing probable causes of impairment. The adverse effects of
noise variance on these types of analysis are negligible when S/N >10; becoming minor as S/N decreases
to 6, increasing to moderate as S/N decreases to 2, and finally becoming severely limiting as S/N
approaches 0 (Paulsen et al. 1991, Kaufmann et al. 1999). At S/N=0, all the metric variance observed
among lakes in the survey can be attributed to measurement "noise". Based on these guidelines, the
effects of imprecision are minor for all the indicators except for the Littoral Cover index, for which the
effects are minor-to-moderate.
Kaufmann et al. (2014a) explain that the S/N ratio may not always be a good measure of the potential of a given metric to discern ecologically important differences among sites. For example, a metric may easily discriminate between sparse and abundant littoral cover for fish, but S/N for the metric would be low in a region where littoral cover does not vary greatly among lakes. In cases where the signal variance (σ²_lake) observed in a regional survey reflects a large range of habitat alteration or a large range in natural habitat conditions, S/N would be a good measure of the precision of a metric relative to what we want it to measure. However, in random surveys or in relatively homogeneous regions, σ²_lake, and consequently S/N, may be less than would be calculated for a set of sites specifically chosen to span the full range of habitat conditions occurring in a region. To evaluate the potential usefulness of metrics, Kaufmann et al. (2014a) suggested that an alternate measure of relative precision, σ_rep divided by its potential or observed range (Rg_pot or Rg_obs), offers additional insight. The minimum detectable difference in means between 2 lakes (or between two times in one lake) is given by D_min = 1.96 σ_rep √2 ≈ 2.77 σ_rep, using a 2-sided Z-test with α = 0.05 (Zar 1999). Thus, to detect any specified difference between 2 lakes in a metric relative to its potential or observed range (Rg_pot or Rg_obs), the standardized within-lake standard deviation, σ_rep/Rg, cannot exceed (D_min/Rg)/2.77. By the criteria in Kaufmann et al. (2014a, Table 2), the key NLA physical habitat indices were precise or moderately precise, with σ_rep/Rg_obs between 0.052 and 0.107 (Table 7, USEPA 2016). Depending on the index, they have the potential to discern differences between single lakes (or one lake at two different times) that are between 1/3 and 1/8 the magnitude of the observed ranges of these indices.
3.2.6 Completeness
Completeness requirements are established and evaluated from two perspectives. First, valid data for
individual parameters must be acquired from a minimum number of sampling locations in order to make
subpopulation estimates with a specified level of confidence or sampling precision. The objective of this
study is to complete sampling at 95% or more of the 1000 initial sampling sites. Percent completeness
(%C) is calculated as:

Equation 3-14. Percent completeness.

%C = (V / T) × 100

where V is the number of measurements/samples judged valid, and T is the total number of planned measurements/samples.
measurements/samples. Within each indicator, completeness objectives are also established for
individual samples or individual measurement variables or analytes. These objectives are estimated as
the percentage of valid data obtained versus the amount of data expected based on the number of
samples collected or number of measurements conducted. Where necessary, supplementary objectives
for completeness are presented in the indicator-specific sections of the LOM.
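A one-line R illustration of Equation 3-14 with made-up counts:

pct_complete <- function(valid, planned) 100 * valid / planned
pct_complete(valid = 962, planned = 1000)  # 96.2%; meets the 95% objective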
In addition to evaluating completeness for each laboratory, the completeness objectives are established
for each measurement per site type (e.g., probability sites, revisit sites, etc.). Failure to achieve the
minimum requirements for a particular site type results in regional population estimates having wider
confidence intervals. Failure to achieve requirements for repeat sampling (10% of samples collected)
and revisit samples (10% of sites visited) reduces the precision of estimates of index period and annual
variance components, and may impact the representativeness of these estimates because of possible
bias in the set of measurements obtained.
3.2.7 Comparability
Comparability is defined as the confidence with which one data set can be compared to another
(USEPA, 2002). A performance-based methods approach is used for water chemistry and chlorophyll a analyses that defines a set of laboratory method performance requirements for data
quality. Following this approach, participating laboratories may choose which analytical methods they
use for each target analyte as long as they are able to achieve the performance requirements as listed in
Table 10.4 of the LOM. Requirements for reporting limits may be modified for regional laboratories
based on the expected range of concentrations for samples they may receive and required threshold
values for assessing condition. For all parameters, comparability is addressed by the use of standardized
sampling procedures and analytical methods by all sampling crews and laboratories. Comparability of
data within and among parameters is also facilitated by the implementation of standardized quality
assurance and quality control techniques and standardized performance and acceptance criteria. For all
measurements, reporting units and format are specified, incorporated into standardized data recording
forms, and documented in the information management system. Comparability is also addressed by

providing results of QA sample data, such as estimates of precision and bias, conducting methods
comparison studies when requested by the grantees, and conducting inter-laboratory performance
evaluation studies among state, university, and NLA 2017 contract laboratories. See indicator specific
sections of the LOM for more information when appropriate.
3.2.8 Representativeness
Representativeness is defined as "the degree to which the data accurately and precisely represent a
characteristic of a population parameter, variation of a property, a process characteristic, or an
operational condition" (USEPA, 2002). At one level, representativeness is affected by problems in any or
all of the other data quality indicators.
At another level, representativeness is affected by the selection of the target surface water bodies, the
location of sampling sites within that body, the time period when samples are collected, and the time
period when samples are analyzed. The probability-based sampling design should provide estimates of
condition of surface water resource populations that are representative of the region. The individual
sampling programs defined for each indicator attempt to address representativeness within the
constraints of the response design (which includes when, where, and how to collect a sample at each site). Holding time requirements for analyses ensure analytical results are representative of conditions at the time of sampling. See the indicator-specific sections of the LOM and Appendix B of the FOM for more information.

4 SAMPLING DESIGN AND SITE SELECTION
The overall sampling program for the NLA 2017 project requires a randomized, probability-based
approach for selecting lakes where sampling activities are to be conducted. Details regarding the specific
application of the probability design to surface water resources are described in Paulsen et al. (1991), Peck et al. (2013), and Stevens (1994).
4.1 Probability Based Sampling Design and Site Selection
The target population for this project includes all lakes, reservoirs, and ponds within the 48 contiguous
United States greater than 1 hectare (2.5 acres) in surface area that are permanent water bodies. Lakes
that are saline due to tidal influence are excluded as are those used for aquaculture, disposal-tailings,
sewage treatment, evaporation, or other unspecified disposal use. The National Hydrography Dataset
(NHD, 1:100,000 scale) was employed by USEPA to derive a list of lakes for potential inclusion in the
survey. The overall sample size was set to include 1000 lake sampling events. In NLA 2017, 904 lakes will
be sampled, and 96 of the lakes will be sampled twice, for a total of 1000 lake visits. The 904 lakes
consist of three sets of lakes. The first set is 226 lakes that were originally sampled in NLA 2007,
resampled in NLA 2012 and will be resampled again in NLA 2017. Of these, 43 lakes will be sampled
twice in NLA 2017. The second set is 218 lakes originally sampled in NLA 2012 and will be resampled
again in NLA 2017. Of these, 53 lakes will be sampled twice in NLA 2017. The third set is 460 new lakes
that will be sampled for the first time in NLA 2017. This design provides a robust number of sites that we
will use to evaluate change between the 2007 and the 2017 lakes assessments. Figure 4.1 displays the
distribution of the 904 base sites from the original NLA 2017 design.
A Generalized Random Tessellation Stratified (GRTS) survey design for a finite resource was used for site
selection. Lake selection for the survey provided for six size class categories (1-4 hectares (ha), 4-10 ha,
10-20 ha, 20-50 ha, 50-100 ha, >100 ha), as well as spatial distribution across the lower 48 states and
nine aggregated Omernik Level 3 ecoregions (for more information on Omernik ecoregions see
https://www.epa.gov/eco-research/ecoregions). USEPA developed another subset of lakes for states
that may want to do state level assessments (to increase the overall sample size to 50 per state).
Additional lakes were selected as potential replacement lakes (oversample sites). The oversample is
used to replace a candidate lake that is determined to be non-target or to replace a target lake that is
not accessible due to landowner denials, physical barriers, or safety concerns. Crews must take
replacement sites from the Oversample List in the order that they appear in the site list (numerically by SITE_ID). Skipping over sites on the list compromises the integrity of the survey design and complicates
the assessment analyses. It is important that crews assign a final status to all sites on the list regardless
of whether they end up being sampled.
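For illustration, a GRTS draw of this kind can be produced with the spsurvey R package, which implements the GRTS algorithm. The sketch below assumes the spsurvey version 5+ interface and a hypothetical sf point layer lakes of candidate lake locations; it is not the actual NLA 2017 design specification:

library(spsurvey)

set.seed(1)                    # illustrative seed for reproducibility
draw <- grts(sframe = lakes,   # sampling frame of candidate lakes (sf object)
             n_base = 904,     # base sites
             n_over = 200)     # hypothetical oversample for replacements
head(draw$sites_base)          # selected base sites with design weights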

[Figure: map of the conterminous United States showing the locations of the 904 NLA 2017 base design sites.]
Figure 4.1 Design sites for the 2017 National Lakes Assessment.
Complete documentation is included in Appendix C in the Site Evaluation Guidelines document.
4.2 Reference (or Least-Disturbed) Site Selection
A set of reference lakes (least disturbed lakes), i.e., those that USEPA will use to inform benchmarks in
the assessment, will be determined after the complete set of data is returned. At that point, USEPA will
run a set of screening criteria similar to those used in NLA 2012 (USEPA, 2009). Analysts will consider
whether information from these sites, combined with information from past surveys, indicates a need to
revise thresholds used in the NLA 2012.

5 INFORMATION MANAGEMENT
Environmental monitoring efforts that amass large quantities of information from various sources present unique data management challenges. To meet these challenges, the NLA 2017 employs a variety of well-tested information management (IM) strategies to aid in the functional organization and ensure the integrity of stored electronic data. IM is integral to all aspects of the NLA 2017, from initial selection of sampling sites through the dissemination and reporting of final, validated data. By extension, all participants in the NLA 2017 have certain responsibilities and obligations that also make them a part of the IM system. This "inclusive" approach to managing information helps to:
¦	Strengthen relationships among NLA 2017 cooperators.
¦	Increase the quality and relevance of accumulated data.
¦	Ensure the flexibility and sustainability of the NLA 2017 IM structure.
This IM strategy provides a congruent and scientifically meaningful approach for maintaining
environmental monitoring data that satisfies both the scientific and technological requirements of the
NLA 2017.
5.1 Roles and Responsibilities
At each point where data and information are generated, compiled, or stored, the NLA 2017 team must
manage the information. Thus, the IM system includes all of the data-generating activities, all of the
means of recording and storing information, and all of the processes that use data. The IM system also
includes both hardcopy and electronic means of generating, storing, organizing and archiving data, and
the effort to achieve a functional IM process is all-encompassing. To that end, all participants in the NLA 2017 play an integral part within the IM system. Table 5.1 provides a summary of the IM responsibilities
identified by the NLA 2017 IM group. Specific information on the field crew responsibilities for tracking
and sending information is found in the FOM.
Table 5.1 Summary of IM responsibilities.
NLA 2017
Contact
Primary Role
Responsibility
Group



<
o
i—
<
QC
o
28
Field Crews
State/tribal
partners and
contractor or
other field
crews
(regional
USEPA, etc.)
Acquire in-situ
measurements
and prescribed
list of
biotic/abiotic
samples at each
site targeted for
the survey
Complete and review field data forms and sample tracking
forms for accuracy, completeness, and legibility.
Ship/email field and sample tracking forms to NARS IM
Center so information can be integrated into the central
database.
Work with the NARS IM Center staff to develop acceptable
file structures and electronic data transfer protocols should
there be a need to transfer and integrate data into the
central database.
Provide all data as specified in FOM, SEG or as negotiated
with the NLA Project Leader.
Maintain open communications with NARS IM Center
regarding any data issues.
Analytical
Laboratories
State/tribal
partners and
contractors
Analyze samples
received from
field crews in the
Review all electronic data transmittal files for
completeness and accuracy (as identified in the QAPP).

-------
National Lakes Assessment 2017
Version 1.1, May 2017
Quality Assurance Project Plan
Page 29 of 64
IM Center
staff
USEPAORD
NHEERL
Western
Ecology
Division-
Corvallis,
Contractors
Project
Quality
Assurance
Coordinator
USEPA Office
of Water
manner
appropriate to
acquire
biotic/abiotic
indicators/measur
ements
requested.
Provides support
and guidance for
all IM operations
related to
maintaining a
central data
management
system for NLA
2017
Review and
evaluate the
relevancy and
quality of
information/data
collected and
generated
through the NLA
2017 survey.
Work with the NARS IM Center staff to develop file
structures and electronic data transfer protocols for
electronically-based data.
Submit completed sample tracking forms to NARS IM
Center so information can be updated in the central
database.
Provide all data and metadata as specified in the laboratory
transmittal guidance section of the QAPP or as negotiated
with the NLA Project Leader.
Maintain open communications with NARS IM Center
regarding any data issues.
Develop/update field data forms.
Plan and implement electronic data flow and management
processes.
Manage the centralized database and implement related
administration duties.
Receive, scan, and conduct error checking of field data
forms.
Monitor and track samples from field collection, through
shipment to appropriate laboratory.
Receive data submission packages (analytical results and
metadata) as compiled by the NLA 2017 Quality Team from
each laboratory or directly (e.g., national water chemistry
laboratory).
Run automated error checking, e.g., formatting differences,
field edits, range checks, logic checks, etc.
Receive verified, validated, and final indicator data files
(including record changes and reason for change) from QA
reviewers. Maintain history of all changes to data records
from inception through delivery to WQX.
Organize data in preparation for data verification and
validation analysis and public dissemination.
Implement backup and recovery support for central
database.
Implement data version control as appropriate.
Oversee NLA 2017 Quality Team including initial review of
laboratory electronic data deliverables, quality checks and
submission of compiled datasets to the NARS IM Center
Monitor quality control information.
Evaluate results stemming from field and laboratory audits.
Investigate and take corrective action, as necessary, to
mitigate any data quality issues.
29

-------
National Lakes Assessment 2017
Version 1.1, May 2017
Quality Assurance Project Plan
Page 30 of 64
Issue guidance to NLA 2017 Project Leader and IM Center
staff for qualifying data when quality standards are not met
or when protocols deviate from plan.
Steering
Committee
NLA Project
Lead and
other team
members,
USEPA
Regional and
ORD staff,
States, tribes,
other federal
agencies
Provide technical
recommendations
related to data
analysis, reporting
and overall
implementation
Provide feedback and recommendations related to QA,
data management, analysis, reporting and data distribution
issues.
Review and comment on QA and information management
documentation (QAPP, data templates, etc).
Data Analysis
and Reporting
Team
USEPA Office
of Water,
ORD WED,
Partners
Provide the data
analysis and
technical support
for NLA 2017
reporting
requirements
Provide data integration, aggregation and transformation
support as needed for data analysis.
Provide supporting information necessary to create
metadata.
Investigate and follow-up on data anomalies using
identified data analysis activities.
Produce estimates of extent and ecological condition of the
target population of the resource.
Provide written background information and data analysis
interpretation for report(s).
Document in-depth data analysis procedures used.
Provide mapping/graphical support.
Document formatting and version control.
Develops QA report for management.
Data
Finalization
Team
TBD
Provides data
librarian support
Prepare NLA 2017 data for transfer to USEPA public web-
servers).
Generate data inventory catalog record (Science Inventory
Record).
Ensure all metadata is consistent, complete, and compliant
with USEPA standards.
5.1.1 State/Tribe-Based Data Management
Some state or tribal partners manage activities for both field sampling and laboratory analyses. While the NARS program encourages states to use these in-house capabilities, it is imperative that NLA 2017 partners understand their particular role and responsibilities for executing these functions within the context of the national program. If a state or tribe chooses to do IM in-house, the state or tribe performs all of the functions associated with the following roles:
•	Field Crew—including shipping/emailing of field data forms to the IM Coordinator (NLA 2017 paper or electronic field forms must be used and the original field forms must be sent to the NARS IM Center as outlined in the NLA 2017 FOM).

•	Laboratory quality assurance including responding to the NLA 2017 Quality Team questions after
submitting data
•	Submission of data from the state or tribe to the Laboratory Review Coordinator or other
designated member of the Quality Team (who submit to the NARS IM Center). Typically, the
state or tribe must provide a single point of contact for all activities related to NLA 2017 data.
However, it may be advantageous for the Laboratory Review Coordinator to have direct
communication with the state or tribe-participating laboratories to facilitate the transfer of
data. This is a point that may be negotiated between the primary state or tribal contact, the
regional coordinator and the Laboratory Review Coordinator.
•	Data transfers to the NARS IM Center must be timely. States must submit all initial laboratory
results (i.e., those that have been verified by the laboratory and have passed all internal
laboratory QA/QC criteria) in the appropriate format to the Laboratory Review Coordinator by
May 2018, in order to meet NLA 2017 product deadlines (unless otherwise indicated for a
contract/grant requirement).
•	Data transfers must be complete. For example, laboratory analysis results submitted by the
state or tribe must be accompanied by related quality control and quality assurance data,
qualifiers code definitions, contaminant/parameter code cross-references/descriptions, test
methods, instrumentation information and any other relevant laboratory-based assessments or
documentation related to specific analytical batch runs.
•	The state or tribe must ensure that data meet minimum quality standards and that data transfer
files meet negotiated content and file structure standards.
The Laboratory Review Coordinator communicates the necessary guidance for data management and
submission requirements (i.e., data templates). Each group that performs in-house IM functions
incorporates these guidelines as is practicable or as previously negotiated.
5.2 Overview of System Structure
In its entirety, the NARS IM system includes site selection and logistics information, sample labels and
field data forms, tracking records, map and analytical data, data validation and analysis processes,
reports, and archives. NARS IM staff provides support and guidance to all program operations in
addition to maintaining a central database management system for the NLA data.
The central repository for data and associated information collected for use by NLA 2017 is a secure,
access-controlled server located at WED-Corvallis.
This database is known as the NARS IM. Data are stored and managed on this system using the
Structured Query Language (SQL). Data review (e.g., verification and validation) and data analysis (e.g.,
estimates of status and extent) are accomplished primarily using programs developed in either SAS or R.
5.2.1 Data Flow
The NLA 2017 will accumulate large quantities of observational and laboratory analysis data. To
appropriately manage this information, it is essential to have a well-defined data flow model and
documented approach for acquiring, storing, and summarizing the data. This conceptual model (Figure
5.1) helps focus efforts on maintaining organizational and custodial integrity, ensuring that data
available for analyses are of the highest possible quality.

5.2.2 Simplified Description of Data Flow
There are several components associated with the flow of information. These are described below and
also shown in Figure 5.1:
¦	Communication—between the NARS IM Center and the various data contributors (e.g., field
crews, the NLA Quality Team, laboratories and the data analysis and reporting team)—is vital for
maintaining an organized, timely, and successful flow of information and data.
¦	Data are captured or acquired from four basic sources — field data transcription, laboratory
analysis reporting, automated data capture, and submission of external data files (e.g., GIS
data)—encompassing an array of data types: site characterization; biotic assessment; sediment
and tissue contaminants; and water quality analysis. Data capture generally relies on the
transference of electronic data, e.g., optical character readers and email, to a central data
repository. However, some data must be transcribed by hand in order to complete a record.
¦	Data repository or storage—provides the computing platform where raw data are archived,
partially processed data are staged, and the "final" data, assimilated into a final, user-ready data
file structure, are stored. The raw data archive is maintained in a manner consistent with
providing an audit trail of all incoming records. The staging area provides the IM Center staff
with a platform for running the data through all of its QA/QC paces as well as providing data
analysts a first look at the incoming data. This area of the data system evolves as new data are
gathered and user-requirements are updated. The final data format becomes the primary
source for all statistical analysis and data distribution.
¦	Metadata—a descriptive document that contains information compliant with the Content
Standards for Digital Geospatial Metadata (CSDGM) developed by the Federal Geographic Data
Committee (FGDC).

[Figure 5.1. Ecological indicator field and laboratory data flow: field data collection and sample collection feed laboratory sample receipt and analysis (supported by laboratory information management systems); field forms, sample tracking forms, and other data files (e.g., survey design, GIS attribute data) pass through office and QA/QC review; raw data submission packages (NARS IM specification) are entered into the central database, where QA review produces flat files for use with SAS or R.]
NLA 2017 is committed to compliance with all applicable regulations and guidance concerning hardware
and software procurement, maintenance, configuration control, and QA/QC. To that end, the NLA 2017
team has adopted several IM standards that help maximize the ability to exchange data within the study
and with other aquatic resource surveys or similar large-scale monitoring and assessment studies (e.g.,
NARS, past EMAP and R-EMAP studies). Specific information follows.
5.2.4	Data Formats
5.2.4.1	Attribute Data
•	SQL tables
•	SAS data sets
•	R workspaces
•	American Standard Code for Information Interchange (ASCII) files: comma-separated values, space-delimited, or fixed-column
5.2.4.2	GIS Data
¦	ARC/INFO native and export files; compressed .tar file of ARC/INFO workspace
5.2.4.3	Standard Coding Systems
Sampling Site: (USEPA National Locational Data Policy; USEPA, 2004)
Coordinates: latitude and longitude in decimal degrees (±0.002)
Datum: NAD83
Chemical Compounds: Chemical Abstracts Service (CAS, 1999)
Species Codes: Integrated Taxonomic Information System (ITIS), when possible
Land cover/land use codes: Multi-Resolution Land Characteristics; National Hydrography Dataset Plus Version 1.0 (NHDPlus, 2005)
5.2.5	Public Accessibility
While any data created using public funds are subject to the Freedom of Information Act (FOIA), some
basic rules apply for general public accessibility and use.
¦	Program must comply with Data Quality Act requirements before making any data available to the public, and the person generating data must fill out and have signed the Information Quality Guidelines package available before any posting to the Web or distribution of any kind.
¦	Data and metadata files are made available to the contributor or participating group for review or other project-related use from NARS IM or in flat files before moving to a USEPA-approved public website.
¦	Only "final" data (those used to prepare the final project report) are readily available through a USEPA-approved public website.ᵇ
As new guidance and requirements are issued, the NARS IM staff assess the impact upon the IM system
and develop plans for ensuring timely compliance.
5.3 Data Transfer Protocols
Field crews are expected to submit the provided electronic field forms, containing in situ measurement and event information, to the NARS IM Center as defined in the FOM. If crews need to use paper
forms, they must send in hard copies of field forms within two weeks of sampling. Laboratories must
submit electronic data files. Field crews and laboratories must submit all sample tracking and analytical
results data to the NARS IM Center in electronic form using a standard software package to export and
format data. Data submission templates for laboratories are included in the LOM. Examples of software
and the associated formats are presented in Table 5.2:
Table 5.2 NLA 2017 Data submission software and associated file formats.

Software: Export Options (file extensions)
Microsoft Excel*: xls, xlsx, csv, formatted delimited txt
SAS*: csv, formatted delimited txt
R: csv, formatted delimited txt, R workspaces (.Rdata)
All electronic files must be accompanied by appropriate documentation (e.g., metadata, laboratory
reports, QA/QC data and review results). This documentation must contain sufficient information to
identify field contents, field formats, qualifier codes, etc. It is very important to keep USEPA informed of
the completeness of the analyses. Laboratories may send files periodically, before all samples are
analyzed, but USEPA must be informed that more data are pending if a partial file is submitted.ᶜ All data
files sent by the laboratories must be accompanied by text documentation describing the status of the
analyses, any QA/QC problems encountered during processing, and any other information pertaining to
the quality of the data. Following is a list of general transmittal requirements that each laboratory or state-based IM group should consider when packaging data for electronic transfer to the NLA team; these requirements are captured in the applicable data submission templates using a row/column data file/table structure (see Appendix C in the LOM for templates):
¦	Include NLA site and sample ID provided on the sample container label in a field for each
record (row) to ensure that each data file/table record can be related to a site visit.
¦	Use a consistent set of column labels.
¦	Use file structures consistently.
¦	Use a consistent set of data qualifiers.
¦	Use a consistent set of units.
b If data collected as part of the NLA are distributed with less rigorous QC applied because the data were not used in the NLA assessment, this shall be clearly indicated in the metadata.
c Laboratories must adhere to contract or grant requirements for submission of data.

¦	Include the method detection limit (MDL) as part of each result record.ᵈ
¦	Include the reporting limit (RL) as part of each result record.
¦	Provide a description of each result/QC/QA qualifier.
¦	Provide results/measurements/MDL/RL in numeric form.
¦	Maintain result qualifiers (e.g., <, ND) in a separate column.
¦	Use a separate column to identify record type. For example, if QA or QC data are included in a data file, there should be a column that allows the IM staff to readily identify the different result types.
¦	Include laboratory sample identifier.
¦	Include batch numbers/information so results can be paired with appropriate QA/QC
information.
¦	Include "true value" concentrations, if appropriate, in QA/QC records.
¦	Include a short description of preparation and analytical methods used (where appropriate)
either as part of the record or as a separate description for the test(s) performed on the sample.
For example, EPAxxxx.x, ASTMxxx.x, etc. Provide a broader description (e.g., citation) if a non-
standard method is used.
¦	Include a short description of instrumentation used to acquire the test result (where
appropriate). This may be reported either as part of the record or as a separate description for
each test performed on the sample. For example, GC/MS-ECD, ICP-MS, etc.
¦	Ensure that data ready for transfer to NARS IM are verified and validated, and results are
qualified to the extent possible (final verification and validation are conducted by USEPA).
¦	Data results must conform to expectations (analysis results) as specified by contract or agreement.
¦	Identify and qualify missing data (why are the data missing?).
¦	Submit any other associated quality assurance assessments and relevant data related to
laboratory results (i.e., chemistry, nutrients). Examples include summaries of QC sample
analyses (blanks, duplicates, check standards, matrix spikes, standard or certified reference
materials, etc.), results for external performance evaluation or proficiency testing samples, and
any internal consistency checks conducted by the laboratory. For requirements, please see
specific indicator sections of this QAPP and lab SOP.
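To make these requirements concrete, the following R sketch (R is one of the analysis tools used with NARS IM data) checks a hypothetical delimited submission file for required columns and numeric result fields. The column names are illustrative assumptions only; the authoritative layouts are the data submission templates in Appendix C of the LOM.

    # Illustrative completeness check on a laboratory submission file.
    # Column names are hypothetical; see the LOM templates for actual layouts.
    required <- c("SITE_ID", "SAMPLE_ID", "ANALYTE", "RESULT", "RESULT_UNITS",
                  "QUALIFIER", "MDL", "RL", "LAB_SAMPLE_ID", "BATCH_ID")

    check_submission <- function(path) {
      dat <- read.csv(path, stringsAsFactors = FALSE)
      missing_cols <- setdiff(required, names(dat))
      if (length(missing_cols) > 0)
        stop("Missing required columns: ", paste(missing_cols, collapse = ", "))
      # Results, MDLs, and RLs must be numeric; qualifiers stay in their own column
      stopifnot(is.numeric(dat$RESULT), is.numeric(dat$MDL), is.numeric(dat$RL))
      invisible(dat)
    }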
The Laboratory Review Coordinator works with the NARS IM Coordinator to establish a data load
process into NARS IM.
b If data collected as part of the NLA are distributed with less rigorous QC applied because the data were not
used in the NLA assessment, this shall be clearly indicated in the metadata.
c Laboratories must adhere to contract or grant requirements for submission of data.
d The national lab provides the MDL with each result, may provide an "estimate" comment for each result below
the RL but above the MDL, and provides a flag when a result is below the MDL.
5.4 Data Quality and Results Validation
Data quality is integrated throughout the life cycle of the data. This includes development of appropriate
forms, labels, etc. for capturing data, as well as verifying data entry, results, and other assessments.
Indicator workgroup experts and the data analysis and reporting teams submit any recommended
changes to the Project QA Coordinator, who recommends and submits any changes (deletions, additions,
corrections) to the NARS IM data center for inclusion in the validated data repository. The NARS IM
Center includes all explanations for data changes in the record history.
5.4.1	Design and Site Status Data Files
The site selection process described in Section 4 produces a list of candidate sampling locations,
inclusion probabilities, and associated site classification data (e.g., target status, ecoregion, etc.). The
Design Team provides this file to the NLA 2017 Project Leader, who in turn distributes it to the IM staff
and field coordinators. Field coordinators determine ownership and contacts for acquiring permission to
access each site, and conduct site evaluation and reconnaissance activities. Field Crews document
information from site evaluation and reconnaissance activities following the SEG and the FOM. The site
evaluation spreadsheets and verification forms are submitted to the Project Lead by the field crews via
SharePoint. The Contractor Field Logistics Coordinator and the NARS IM Center compile all information,
such as ownership, site evaluation, and reconnaissance information, for each site into a "site status" data
file. Any missing information in the site status data file is identified, and a request is made by the
Contractor Field Logistics Coordinator to the field crew (or site evaluator) to complete the record.
Revised information is then submitted to the NARS IM Center.
5.4.2	Sample Collection and Field Data
Field crews record sampling event observational data in a standard and consistent manner using field
data collection forms. Prior to initiation of field activities, the NARS IM staff works with the indicator
leads and analytical support laboratories to develop standardized field data forms and sample labels.
Adhesive labels, completed by the field crews, have a standard recording format and are affixed to each
sample container. Field protocols include precautions to ensure that label information remains legible
and the label remains attached to the sample.
NLA 2017 provides two options for completing field forms: electronic data entry using pre-developed
forms on a tablet or smart phone, or "traditional" paper forms. Paper forms are printed for field crews on
water-resistant paper. Copies of the field data forms and instructions for completing each form are
documented in the NLA 2017 FOM. Recorded data - whether through e-forms or paper - are reviewed
upon completion of data collection and recording activities by the Field Crew Leader. Field crews check
completed data forms and sample labels before leaving a sampling site to ensure information and data
were recorded legibly and completely. Errors are corrected by field crews if possible, and data
considered as suspect are qualified using a flag variable. The field sampling crew enters explanations for
all flagged data in a comments section. Field crews transmit e-forms to the NARS IM Staff by selecting
the "submit" button as described in the FOM. Field crews ship completed paper field data forms to the
NARS IM staff for entry into the central database management system.
All samples are tracked from the point of collection. Field crews ensure that copies of the shipping and
custody record accompany all sample transfers; other copies are transmitted to the NARS IM Center.
The NARS IM Center tracks samples to ensure that they are delivered to the appropriate laboratory, that
lost shipments can be quickly identified and traced, and that any problems with samples observed when
received at the laboratory are reported promptly so that corrective action can be taken if necessary.
Detailed procedures on shipping and sample tracking can be found in the Field Operations Manual.
Procedures for completion of sample labels and electronic field data forms are covered extensively in
training sessions. General QC checks and procedures associated with sample collection and transfer,
field measurements, and field data form completion for most indicators are listed in Table 5.3.
Additional QA/QC checks or procedures specific to individual indicators are described in the NLA 2017
Lab Operations Manual.
Table 5.3 Summary of sample and field data quality control activities.

Contamination Prevention: All containers for an individual site are sealed in plastic bags until use; specific contamination avoidance measures are covered in training.
Sample Identification: Pre-printed labels with a unique ID number are placed on each sample.
Data Recording: Data are recorded on pre-printed forms of water-resistant paper; the field sampling crew reviews data forms for accuracy, completeness, and legibility.
Data Qualifiers: Defined qualifier codes are used on the data form; qualifiers are explained in the comments section of the data form.
Sample Custody: Unique sample ID and tracking form information are entered in the LIMS; sample shipment and receipt are confirmed.
Sample Tracking: Sample condition is inspected upon receipt and noted on the tracking form, with copies sent to the ORD Technical Lead and/or IM.
Data Entry: Data are entered using customized entry screens that resemble the data forms; entries are reviewed manually or by automated comparison of double entry.
Data Submission: A standard format is defined for each measurement, including units, significant figures, decimal places, accepted code values, and required field width.
Data Archival: All data records, including raw data, are archived in an organized manner. For example, following verification/validation of the last submission into the NARS database, it is copied to a terabyte external hard drive and sent to the Project Leader for inclusion in the project file, scheduled as 501, permanent records. Processed samples and reference collections of taxonomic specimens are submitted for cataloging and curation at an appropriate museum facility.
5.4.3 Laboratory Analyses and Data Recording
Upon receipt of a sample shipment, analytical laboratory receiving personnel check the condition and
identification of each sample against the sample tracking record. Each sample is identified by
information written on the sample label. Any discrepancies, damaged samples, or missing samples are
reported to the NARS IM staff and NLA 2017 Project Lead electronically.
Most of the laboratory analyses for the NLA 2017 indicators, particularly chemical and physical analyses,
follow or are based on standard methods. Standard methods generally include requirements for QC
checks and procedures. General laboratory QA/QC procedures applicable to most NLA 2017 indicators
are described in Table 5.4. Additional QA/QC procedures specific to individual indicator and parameter
analyses are described in the LOM. Biological sample analyses are generally based on current acceptable
practices within the particular biological discipline. QC checks and procedures applicable to most NLA
2017 biological samples are described in the LOM.
Table 5.4 Summary of laboratory data quality control activities.

Instrument Maintenance: Follow the manufacturer's recommendations and specific guidelines in the methods; maintain a logbook of maintenance/repair activities.
Calibration: Calibrate according to the manufacturer's recommendations; recalibrate or replace the instrument before analyzing any samples if it produces erratic results.
QC Data: Maintain control charts and determine LT-MDLs and achieved data attributes; include a QC data summary (narrative and compatible electronic format) in the submission package.
Data Recording: Use software compatible with the NARS IM system; check all data entered against the original bench sheet to identify and correct entry errors. Review other QA data (e.g., condition upon receipt) for possible problems with the sample or specimen.
Data Qualifiers: Use defined qualifier codes; explain all qualifiers.
Data Entry: Use automated comparison of double entry or a 100% manual check against the original data form.
Submission Package: Includes:
• Letter by the laboratory manager
• Data
• Data qualifiers and explanations
• Electronic format compatible with NARS IM
• Documentation of file and database structures
• Metadata: variable descriptions and formats
• Summary report of any problems and corrective actions implemented
A laboratory's IM system may consist of only hardcopy records such as bench sheets and logbooks, an
electronic laboratory information management system (LIMS), or some combination of hardcopy and
electronic records. Laboratory data records are reviewed at the end of each analysis day by the
designated laboratory onsite QA coordinator or by supervisory personnel. Errors are corrected by
laboratory personnel if possible, and data considered as suspect by laboratory analysts are qualified by
the laboratory personnel with a flag variable. The laboratory explains all flagged data in a comments
section. Private contract laboratories generally have a laboratory Quality Assurance Plan and established
procedures for recording, reviewing, and validating analysis data.
Once analytical data have passed all of the laboratory's internal review procedures, the laboratory
prepares and transfers a submission package using the prescribed templates in the LOM. The contents
of the submission package are largely dictated by the type of analysis (physical, chemical, or biological).
Remaining sample material may be transferred to USEPA's designated laboratory or facilities as directed
by the NLA 2017 Project Lead. All samples and raw data files (including logbooks, bench sheets, and
instrument tracings) are to be retained by the laboratory for 3 years or until authorized for disposal, in
writing, by the USEPA Project Leader. Deliverables from contractors and cooperators, including raw
data, are permanent as per USEPA Record Schedule 258. USEPA's project records are scheduled 501 and
are also permanent.
5.4.4 Data Review, Verification, and Validation Activities
Raw data files are created from entry of field and analytical data, including data for QA/QC samples and
any data qualifiers noted on the field forms or analytical data package.
5.4.4.1	Paper Forms
The NARS IM Center either optically scans or transcribes information from field collection forms into an
electronic format (sometimes using a combination of both processes). During the scanning process,
incoming data are subjected to a number of automated error checking routines. The NARS IM Center
corrects obvious errors immediately at the time of scanning. Suspected errors that cannot be confirmed
at the time of scanning are qualified for later review by someone with the appropriate background and
experience (e.g., a chemist or aquatic ecologist). The process continues until the transcribed data are
100% verified or no corrections are required.
5.4.4.2	Electronic Forms
The NARS IM Center directly uploads information from the electronic field collection forms into its
database. During the upload process, incoming data are subjected to a number of automated error
checking routines. Omissions and errors are automatically noted in an email message to the field crew
lead.
5.4.4.3	Additional Review
Additional validation is accomplished by the NARS IM Center staff using a specific set of guidelines and
executing a series of programs (computer code) to check for correct file structure, variable naming and
formats, outliers, missing data, typographical errors, and illogical or inconsistent data based on
expected relationships to other variables. Data that fail any check routine are identified in an "exception
report" that is reviewed by an appropriate scientist for resolution.
The NARS IM Center brings any remaining questionable data to the attention of the QA manager and
individuals responsible for collecting the data for resolution.
The NLA Quality Team evaluates all data to determine completeness and validity. Additionally, the data
are run through a rigorous inspection using SQL queries or other computer programs such as SAS or R to
check for anomalous data values that are especially large or small, or are noteworthy in other ways.
Focus is on rare, extreme values since outliers may affect statistical quantities such as averages and
standard deviations.
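As a minimal illustration of such an inspection in R (one of the tools named above), the sketch below flags values that are extreme relative to a robust center and spread; the data frame, column name, and threshold are hypothetical, and the production checks are considerably more extensive.

    # Flag unusually large or small values for analyst review (illustrative only).
    chem <- data.frame(TOTAL_P = c(12, 15, 11, 14, 980, 13))  # hypothetical results

    # Median/MAD screen: robust to the very outliers it is trying to find
    screen_extremes <- function(x, k = 3.5) {
      z <- (x - median(x, na.rm = TRUE)) / mad(x, na.rm = TRUE)
      which(abs(z) > k)                 # row indices of candidate anomalies
    }

    suspect <- screen_extremes(chem$TOTAL_P)
    chem[suspect, , drop = FALSE]       # values referred to an appropriate scientist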
The NLA Quality Team examines all laboratory quality assurance (QA) information to determine if the
laboratory met the predefined data quality objectives - available through the QAPP.
Some of the typical checks made in the processes of verification and validation are described in Table
5.5. QA staff use automated review procedures. The primary purpose of the initial checks is to confirm
that each data value present in an electronic data file is accurate with respect to the value that was
initially recorded on a data form or obtained from an analytical instrument. In general, these activities
focus on individual variables in the raw data file and may include range checks for numeric variables,
frequency tabulations of coded or alphanumeric variables to identify erroneous codes or misspelled
entries, and summations of variables reported in terms of percent or percentiles. In addition, associated
QA information (e.g., sample holding time) and QC sample data are reviewed to determine if they meet
acceptance criteria. Suspect values are assigned a data qualifier. They are either corrected, replaced
with a new acceptable value from sample reanalysis, or confirmed suspect after sample reanalysis. For
biological samples, species identifications are corrected for entry errors associated with incorrect or
misspelled codes. Files corrected for entry errors are considered to be raw data files. Copies of all raw
data files are maintained in the centralized NARS IM System.
Any suspect data are flagged for data qualification.
The NARS IM staff, with the support of the NLA 2017 Quality Team, correct and qualify all questionable
data. Copies of the raw data files are maintained in NARS IM, generally in active files until completion of
reporting and then in archive files. Redundant copies of all data files are maintained and all files are
periodically backed up to the EPA headquarters shared G: drive system.
Table 5.5 Data review, verification, and validation quality control activities.

Review any qualifiers associated with a variable: Determine if the value is suspect or invalid; assign validation qualifiers as appropriate.
Determine if MQOs and project DQOs have been achieved: Determine the potential impact on achieving research and/or program objectives.
Exploratory data analyses (univariate, bivariate, multivariate) utilizing all data: Identify outlier values and determine if analytical error or a site-specific phenomenon is responsible.
Confirm assumptions regarding specific types of statistical techniques being utilized in development of metrics and indicators: Determine the potential impact on achieving research and/or program objectives.
In the final stage of data verification and validation, exploratory data analysis techniques may be used to
identify extreme data points or statistical outliers in the data set. Examples of univariate analysis
techniques include the generation and examination of box-and-whisker plots and subsequent statistical
tests of any outlying data points. Bivariate techniques include calculation of Spearman correlation
coefficients for all pairs of variables in the data set with subsequent examination of bivariate plots of
variables having high correlation coefficients. Multivariate techniques have also been used in detecting
extreme or outlying values in environmental data sets (Meglen, 1985; Garner et al., 1991; Stapanian et
al., 1993).
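For example, the univariate and bivariate screens described above could be sketched in base R as follows; all values and variable names are hypothetical.

    # Univariate: box-and-whisker outliers for a single variable
    chla <- c(2.1, 2.4, 1.9, 2.2, 14.8, 2.0)   # hypothetical chlorophyll-a results
    boxplot.stats(chla)$out                    # points beyond the whiskers

    # Bivariate: Spearman rank correlations for all pairs of variables
    dat <- data.frame(tp = c(10, 22, 15, 40, 31, 18),       # hypothetical total P
                      tn = c(250, 510, 330, 980, 700, 410)) # hypothetical total N
    cor(dat, method = "spearman")  # examine bivariate plots for highly correlated pairs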
The Quality Team reviews suspect data to determine the source of error, if possible. If the error is
correctable, the data set is edited to incorporate the correct data. If the source of the error cannot be
determined, the Quality Team qualifies the data as questionable or invalid. Data qualified as
questionable may be acceptable for certain types of data analyses and interpretation activities. The
decision to use questionable data must be made by the individual data users. Data qualified as invalid
are considered to be unacceptable for use in any analysis or interpretation activities and are generally
removed from the data file and replaced with a missing value code and explanatory comment or flag
code. After completion of verification and validation activities, a final data file is created, with copies
transmitted for archival and for uploading to the NARS IM system.
Once verified and validated, data files are made available for use in various types of interpretation
activities; each activity may require additional restructuring of the data files. These restructuring
activities are collectively referred to as "data enhancement." In order to develop indicator metrics from
one or more variables, data files may be restructured so as to provide a single record per lake.
5.5	Data Transfer
Field crews may transmit data electronically; hardcopies of completed data and sample tracking forms
may be sent via express courier service. Copies of raw, verified, and validated data files are transferred
from the Project QA Coordinator (or designee) to the NARS IM staff for inclusion in the NARS IM system.
All transfers of data are conducted using a means of transfer, file structure, and file format that has
been approved by the NARS IM staff. Data files that do not meet the required specifications are not
incorporated into the centralized data access and management system.
5.5.1 Database Changes
The NARS IM Center staff complete data corrections at the lowest level to ensure that any subsequent
updates contain only the most correct data. The NARS IM Center alerts the Laboratory Review
Coordinator if a laboratory result is found to be in error. The Laboratory Review Coordinator, or other
identified member of the NLA team, sends the laboratory results found to be in error to the originator
(lab) for correction. After the originator makes any corrections, the Laboratory Review Coordinator
resubmits the entire batch or file to the NARS IM Center (unless otherwise discussed with the NARS IM
staff). The NARS IM Center uses these resubmissions to replace any previous versions of the same data.
The NARS IM Center uses a version control methodology when receiving files. Incoming data are not
always immediately transportable into a format compatible with the desired file structures. When this
situation occurs, the IM staff creates a copy of the original data file, which then becomes the working
file in which any formatting changes take place. The NARS IM staff works with the Quality team to
address significant problems with formatting. The original raw data remain unchanged. This practice
further ensures the integrity of the data and provides an additional data recovery avenue, should the
need arise.
All significant changes are documented by the NARS IM Center staff. The NARS IM Center includes this
information in the final summary documentation for the database (metadata).
After corrections have been applied to the data, the NARS IM Center reruns the validation programs to
re-inspect the data.
The NARS IM Center may implement database auditing features to track changes.
5.6	Metadata
All metadata will be documented following the procedures outlined by the Federal Geographic Data
Committee, Content standard for digital geospatial metadata, version 2.0. FGDC-STD-001-1998 (FGDC,
1998).
5.7 Information Management Operations
5.7.1 Computing Infrastructure
The NARS IM Center collects and maintains electronic data within a central server housed at WED, using
a Windows Server (current configuration) or higher computing platform, with SQL native tables for the
primary data repository and SAS® native data sets or R data sets for data analysis. The NARS IM Center
conducts official IM functions in a centralized environment.
5.7.2 Data Security and Accessibility
The NARS IM Center ensures that all data files in NARS IM are protected from corruption by computer
viruses, unauthorized access, and hardware and software failures. The NARS IM Center follows guidance
and policy documents of USEPA and management policies established by the IM Technical Coordination
Group for data access and data confidentiality. Raw and verified data files are accessible only to the NLA
2017 collaborators. Validated data files are accessible only to users specifically authorized by the NLA
2017 Project Leader. Data files in the central repository used for access and dissemination are marked as
read-only to prevent corruption by inadvertent editing, additions, or deletions.
The NARS IM Center routinely stores and archives on redundant systems the data generated, processed,
and incorporated into the IM system. This ensures that if one system is destroyed or incapacitated, IM
staff can reconstruct the databases. Procedures developed to archive the data, monitor the process, and
recover the data are described in IM documentation.
Data security and accessibility standards implemented for NLA 2017 IM meet USEPA's standard security
authentication (i.e., username, password) process in accordance with USEPA's Information Security
Policy (USEPA Order 2150). Any data sharing requiring file transfer protocol (FTP) or internet protocol is
provided through an authenticated site.
5.7.3	Life Cycle
Data may be retrieved electronically by the NLA 2017 team, partners, and others throughout the records
retention and disposition lifecycle, or as practicable (see Section 5.7.5). Data in the NARS IM database
are subject to EPA Record Schedule 0089 as described in the NARSPROC-003 standard operating
procedure.
5.7.4	Data Recovery and Emergency Backup Procedures
The NARS IM Center maintains several backup copies of all data files and of the programs used for
processing the data. The NARS IM Center maintains backups of the entire system off-site. The IM
process used by the NARS IM Center for NLA 2017 also uses system backup procedures. The NARS IM
Center backs up and archives the central database according to procedures already established for WED
and NARS IM. All laboratories generating data and developing data files are expected to establish
procedures for backing up and archiving computerized data.
5.7.5	Long-Term Data Accessibility and Archive
All data are transferred by OW's Water Quality Exchange (WQX) team working with the NARS IM Team
to USEPA's agency-wide WQX data management system for archival purposes. WQX is a repository for
water quality, biological, and physical data and is used by state environmental agencies, USEPA and
other federal agencies, universities, and private citizens. Data from the NLA 2017 project are run
through an Interface Module in an Excel format and uploaded to WQX by the WQX team. Once
uploaded, states, tribes, and the public can download the data. Data are also provided in flat files on the
NARS website.
5.8 Records Management
The NARS IM Center maintains removable storage media (e.g., CDs, thumb drives) and paper records in a
centrally located area at the NARS IM Center. Paper records are returned to OW once the assessment is
complete, or are destroyed per records retention schedules. The NARS IM staff identifies and maintains files
using standard divisional procedures. Records retention and disposition comply with USEPA Directive
2160, Records Management Manual (July 1984), in accordance with the Federal Records Act of 1950.
6 INDICATORS
6.1 Summary
The NLA Project Team provides detailed, indicator-specific design, collection method, sample handling,
and quality control procedures for field operations in the National Lakes Assessment 2017 Field
Operations Manual. Similarly, the team provides detailed, indicator-specific sample handling, laboratory
procedure, and quality control procedures for laboratory operations in the National Lakes Assessment
2017 Laboratory Operations Manual. Quality assurance objectives for physical habitat, for which no
samples are collected and no laboratory analyses are performed, are in the data analysis
plan of this document. A summary of the QA procedures and the Indicator QA Coordinators is shown in
Table 6.1.
6.1.1	Sampling Design
Field crews collect samples from an index site and/or littoral sites on each lake as described in the
National Lakes Assessment 2017 Field Operations Manual.
6.1.2	Sampling and Analytical Methods
6.1.2.1	Sample Collection
Detailed sample collection and handling procedures are described in the National Lakes Assessment
2017 Field Operations Manual.
6.1.2.2	Analysis
Detailed analysis procedures are described in the National Lakes Assessment 2017 Laboratory
Operations Manual.
6.1.3	Quality Assurance Objectives
Quality assurance objectives are described in detail in the National Lakes Assessment 2017 Laboratory
Operations Manual.
6.1.4	Quality Control Procedures: Field Operations
Detailed design, collection, sample handling and quality control procedures for field operations are
described in the National Lakes Assessment 2017 Field Operations Manual.
6.1.5	Quality Control Procedures: Laboratory Operations
Specific information about sample receipt, processing, and analysis is in the National Lakes Assessment
2017 Laboratory Operations Manual.
6.1.6	Data Management, Review, and Validation
Detailed information about data management, review, and validation is in the National Lakes
Assessment 2017 Laboratory Operations Manual.
Table 6.1 Summary of indicator QA procedures and coordinators.

Algal Toxins (microcystins and cylindrospermopsin)
  Lab method verification: Documentation review (e.g., SOPs, lab certifications, prior experience); methods call; audit documentation (if applicable)
  Lab analyses QA: Interlab comparison; lab blanks, duplicates, and spiked samples
  Taxa verification requirements: N/A
  Indicator QA Coordinator: Kendra Forde. QA Analyst: Kendra Forde

Bacteria (E. coli)
  Lab method verification: Documentation review (e.g., SOPs, lab certifications, prior experience); methods call; audit documentation (if applicable)
  Lab analyses QA: Interlab comparison; lab reagent blanks and duplicates
  Taxa verification requirements: N/A
  Indicator QA Coordinator: Kendra Forde. QA Analyst: Kendra Forde

Benthic Macroinvertebrates
  Lab method verification: Taxa QC samples; documentation review (e.g., SOPs, lab certifications, prior experience); methods call; audit documentation (if applicable)
  Lab analyses QA: Outside lab QA taxonomist to review 10% of samples (photos); reconciliation calls
  Taxa verification requirements: Genus or family (see LOM)
  Indicator QA Coordinator: Brian Hasty. QA Analyst: Brian Hasty

Dissolved Gases
  Lab method verification: Documentation review (e.g., SOPs, lab certifications, prior experience); audit documentation (if applicable)
  Lab analyses QA: Third-party analytical standards; continuing calibration checks
  Taxa verification requirements: N/A
  Indicator QA Coordinator: Jake Beaulieu. QA Analyst: Jake Beaulieu

Fish eDNA
  Lab method verification: -
  Lab analyses QA: -
  Taxa verification requirements: N/A
  Indicator QA Coordinator: Erik Pilgrim. QA Analyst: Erik Pilgrim

Physical Habitat
  Lab method verification: -
  Lab analyses QA: -
  Taxa verification requirements: N/A
  Indicator QA Coordinator: Phil Kaufmann. QA Analyst: Phil Kaufmann

Phytoplankton
  Lab method verification: Taxa QC samples; documentation review (e.g., SOPs, lab certifications, prior experience); methods call; audit documentation (if applicable)
  Lab analyses QA: Outside lab QA taxonomist round robin; reconciliation calls
  Taxa verification requirements: Species
  Indicator QA Coordinator: Brian Hasty. QA Analyst: Brian Hasty

Sediment Contaminants, TOC, and Grain Size
  Lab method verification: Documentation review (e.g., SOPs, lab certifications, prior experience); methods call; audit documentation (if applicable)
  Lab analyses QA: Lab blanks, duplicates, and spiked samples (as appropriate)
  Taxa verification requirements: N/A
  Indicator QA Coordinators: Mari Nord, Kendra Forde. QA Analyst: Kendra Forde

Atrazine Pesticide Screen
  Lab method verification: Documentation review (e.g., SOPs, lab certifications, prior experience); methods call; audit documentation (if applicable)
  Lab analyses QA: Duplicates; standard solution
  Taxa verification requirements: N/A
  Indicator QA Coordinator: Kendra Forde. QA Analyst: Kendra Forde

Water Chemistry and Chlorophyll-a
  Lab method verification: Documentation review (e.g., SOPs, lab certifications, prior experience); methods call; audit documentation (if applicable)
  Lab analyses QA: Lab blanks, duplicates, and spiked samples (as appropriate)
  Taxa verification requirements: N/A
  Indicator QA Coordinators: Dave Peck, Alan Herlihy. QA Analyst: Dave Peck

Zooplankton
  Lab method verification: Taxa QC samples; lab blanks, duplicates, and spiked samples (as appropriate); methods call
  Lab analyses QA: Outside lab QA taxonomist to review 10% of samples (photos); reconciliation calls
  Taxa verification requirements: Species
  Indicator QA Coordinator: Brian Hasty. QA Analyst: Brian Hasty
7 ASSISTANCE VISITS
Assistance visits are a component of the QA program for the NLA 2017. Both the field and laboratory
evaluation and assistance visit plans are explained in the National Lakes Assessment 2017 FOM and
LOM, respectively, and therefore are not repeated here.
7.1	Field Evaluation and Assistance Visit Plan
Please see the NLA 2017 Field Operations Manual for details.
7.2	Laboratory Evaluation and Assistance Visit Plan
Please see the NLA 2017 Lab Operations Manual for details.
8 DATA ANALYSIS PLAN
The Data Analysis Plan describes the general process used to evaluate the data for the survey. It outlines
the steps taken to assess the condition of the nation's lakes and identify the relative impact of stressors
on this condition. Results from the analysis are included in the final report and used in future analysis.
The data analysis plan may be refined and clarified as the data are analyzed by USEPA and states.
8.1 Data Interpretation Background
The basic intent of data interpretation is to evaluate the occurrence and distribution of parameters
throughout the population of lakes in the United States within the context of regionally relevant
expectations for least disturbed reference conditions. This is presented using a cumulative distribution
function or similar graphic. For most indicators, the analysis categorizes the condition of waters as least,
moderately, or most disturbed. Because of the large-scale and multijurisdictional nature of this effort,
the key issues for data interpretation are unique and include: the scale of assessment, selecting the best
indicators, defining the least impacted reference conditions, and determining thresholds for judging
condition.
8.1.1	Scale of assessment
This is the third national report on the ecological condition of the nation's lakes using comparable
methods. USEPA selected the sampling locations for the assessment using a probability-based design,
and developed rules for selection to meet certain distribution criteria, while ensuring that the design
yielded a set of lakes that would provide for statistically valid conclusions about the condition of the
population of lakes across the nation. A challenge that this mosaic of waterbodies poses is developing a
data analysis plan that allows USEPA and other partners to interpret data and present results at a large,
aggregate scale.
8.1.2	Selecting the best indicators
Indicators should be applicable across all reporting units, and must be able to differentiate a range of
conditions. USEPA formed a steering committee for these discussions. The Committee, composed of
state representatives from each of the USEPA regions, provides advice and recommendations to USEPA
on matters related to the NLA 2017. This committee was able to develop and refine indicators and
sampling methodologies.
USEPA developed screening and evaluation criteria which included indicator applicability on a national
scale, the ability of an indicator to reflect various aspects of ecological condition, and cost-effectiveness.
8.1.3	Defining least impacted (reference) condition
Reference condition data are necessary to describe expectations for biological conditions under least
disturbed settings. Analysts expect to use an approach similar to that used in NLA 2012, which is
described in detail in the NLA 2012 Technical Report (EPA 841-R-16-114) (USEPA 2016). Analysts will
consider whether data from additional 2017 reference sites indicate that NLA 2012 thresholds need to
be updated or not.
8.1.4	Determining thresholds for judging condition
This reference site approach is then used to set expectations and benchmarks for interpreting the data
on lake condition. The range of conditions found in the reference sites for an ecoregion describes a
distribution of those biological or stressor values expected for least disturbed condition. The
benchmarks used to define distinct condition classes (e.g., least disturbed, moderately disturbed, most disturbed)
are drawn from this reference distribution. USEPA's approach is to examine the range of values for a
biological or stressor indicator in all of the reference sites in a region, and to use the 5th percentile of the
reference distribution for that indicator to separate the most disturbed of all sites from moderately
disturbed sites. Using the 5th percentile means that lakes in the most disturbed category are worse than
95% of the best sites used to define reference condition. Similarly, the 25th percentile of the reference
distribution can be used to distinguish between moderately disturbed sites and those in least disturbed
condition. This means that lakes reported as least disturbed are as good as 75% of the sites used to
define reference condition. Thresholds may also be adjusted following the process in Herlihy et al.
(2008). For some indicators, analysts use literature or other established values.
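As an illustration, the percentile benchmarks for a biological indicator (for which lower scores indicate greater disturbance) can be computed in R as follows; the reference-site values are hypothetical.

    # Hypothetical biological indicator scores at a region's reference lakes
    ref <- c(52, 61, 47, 58, 66, 43, 70, 55, 49, 63)
    bench <- quantile(ref, probs = c(0.05, 0.25))   # 5th and 25th percentiles

    # Below the 5th percentile: most disturbed; between the 5th and 25th:
    # moderately disturbed; at or above the 25th: least disturbed
    condition <- function(score) {
      cut(score, breaks = c(-Inf, bench, Inf), right = FALSE,
          labels = c("most disturbed", "moderately disturbed", "least disturbed"))
    }
    condition(c(40, 54, 68))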
8.2	Geospatial Data
Geospatial data are an integral part of data analysis for the NLA 2017, as they have been for all other surveys.
The following activities are anticipated: review of coordinate data and corrections, pourpoint (the outlet
of the lake) identification, watershed delineations, and computing landscape metrics. Through the site
evaluation process, lakes that have changed or are inaccurately represented in the National
Hydrography Dataset (NHD) will be noted and provided to those who update the NHD.
8.3	Datasets Used for the Report
The datasets available for use in the report were developed based on analytical methods selected during
the NLA data analysis workshop. Many of the analytical methods used in the survey stem from
discussions, input, and feedback provided by the National Lakes Assessment Steering Committee. Many
of the methods are an outgrowth of the testing and refinement of the existing and developed methods
and the logistical foundation constructed during the implementation of the Environmental Monitoring
and Assessment Program (EMAP) studies from 1991 through 1994 (Whittier et al., 2002), from a New
England pilot study conducted in 2005, from focused pilot studies for methods development, and from
various state water quality agency methods currently in use.
The survey uses indicators to assess trophic status and water quality, ecological integrity, and the human
use of lakes.
8.3.1	Trophic status and water quality
Lakes are typically classified according to their trophic state. Three variables, chlorophyll a, Secchi disk
depth, and total phosphorus, are used by USEPA to estimate biomass and define the trophic state of a
particular lake. Other variables are measured in conjunction with the trophic state variables to
supplement and enhance understanding of lake processes that affect primary productivity.
8.3.2	Ecological integrity
Ecological integrity describes the ecological condition of a lake based on different assemblages of the
aquatic community and their physical habitat. The indicators include zooplankton, benthic
macroinvertebrates, and the physical habitat of the shoreline and littoral zone. Analysts will also
examine a research indicator, fish eDNA.
8.3.3	Human use
Human use indicators address the ability of the population to support recreational uses such as
swimming, fishing, and boating. The protection of these uses is one of the requirements in the Clean
Water Act under section 305(b). The extent of algal toxins (microcystins and cylindrospermopsin), bacteria (E.
coli), sediment contaminants, and atrazine pesticide will serve as the primary indicators of human use.
8.4 Indicator Data Analysis
8.4.1	Algal Toxins
Cyanobacterial (blue-green algal) blooms are common midsummer to late fall events that occur in many
lakes and reservoirs throughout the United States. Algal toxin production has been identified as a
significant potential human health problem that has been associated with many of these bloom events.
However, little is known about the general occurrence of algal toxins in the pelagic zones of these water
bodies, where extensive blooms are less likely to occur than in near-shore areas.
Laboratories analyze the total (whole water) concentrations of microcystins and cylindrospermopsin in
lakes and reservoirs throughout the United States using a standardized immunoassay test. The data
analysis team compares these concentrations to national or other literature values. In addition, the data
analysis team analyzes and interprets the data for microcystin occurrence and concentration in the
context of other environmental data collected as part of the lake assessment (e.g., nutrients,
phytoplankton, chlorophyll a, turbidity, specific conductance, pH).
8.4.2 Bacteria (E. coli)
The presence of bacteria (E. coli) in water samples will be analyzed to indicate possible contamination by
human and other animal wastes. Laboratories analyze the concentration of E. coli in water samples from
lakes and reservoirs throughout the United States using a standard method. The data analysis team
plans to compare E. coli concentrations to USEPA's national guideline for recreation.
8.4.3	Benthic Macroinvertebrate and Zooplankton Assemblages
The data analysis team analyzes benthic macroinvertebrate and zooplankton assemblage data
using multimetric indices (MMIs). The MMI approach summarizes various assemblage
attributes, such as composition, tolerance to disturbance, trophic and habitat preferences, as individual
metrics or measures of the biological community. Candidate metrics are evaluated for aspects of
performance and a subset of the best performing metrics are combined into an index known as a
Macroinvertebrate Index of Biotic Condition. This index is then used to rank the condition of the
resource.
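As a generic illustration of combining scored metrics into an index (the metrics, floor/ceiling values, and scoring rule below are hypothetical and are not the NLA's actual scoring, which is documented in the technical reports):

    # Rescale each candidate metric to 0-10 between floor and ceiling values,
    # then sum the scored metrics into a simple MMI
    score_metric <- function(x, floor, ceiling) {
      pmin(pmax(10 * (x - floor) / (ceiling - floor), 0), 10)
    }
    taxa_richness <- c(24, 8, 15)    # hypothetical metric values for three lakes
    pct_tolerant  <- c(10, 65, 30)   # higher % tolerant taxa implies more disturbance
    mmi <- score_metric(taxa_richness, 5, 30) +
           score_metric(100 - pct_tolerant, 20, 95)  # inverted so higher is better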
8.4.4	Dissolved Gases
Researchers analyze samples for dissolved carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O)
concentrations, and the carbon isotopic composition of CO2 and CH4. The results will be used to estimate
the magnitude of CO2, CH4, and N2O emissions from lakes and reservoirs across the nation. This is a
supplemental research indicator and may not result in an assessment endpoint.
8.4.5	Fish eDNA
Water samples will be analyzed for fish environmental DNA. This is a supplemental research indicator
and may not result in an assessment endpoint. NLA will use this sample to evaluate whether general fish
occurrence information can be determined from it.
8.4.6 Physical Habitat
8.4.6.1 Quality assurance objectives and procedures
MQOs are presented in Table 8.1. General requirements for comparability and representativeness are
addressed in Section 3.2. The MQOs given in Table 8.1 represent the maximum allowable criteria for
statistical control purposes. Precision is determined from results of revisits (field measurements) taken
on a different day and by duplicate measurements taken on the same day.
Table 8.1 Physical habitat measurement data quality objectives.

Variable or Measurement: Field measurements and observations
Precision: ±10%; Accuracy: NA; Completeness: 90%
Specific quality control measures are listed in Table 8.2 for field measurements and observations.
Table 8.2 Physical habitat field quality control.

Quality Control
Check totals for cover class categories (vegetation type, substrate, cover) — Frequency: each station; Acceptance criteria: sum must be reasonable; Corrective actions: repeat observations.
Check completeness of station depth measurements — Frequency: each station; Acceptance criteria: depth measurements for all stations; Corrective actions: obtain the best estimate of depth where an actual measurement is not possible.

Data Validation
Estimate precision of measurements based on repeat visits — Frequency: 2 visits; Acceptance criteria: measurements should be within 10 percent; Corrective actions: review data for reasonableness; determine if acceptance criteria need to be modified.
8.4.6.2 Shoreline human disturbances
Crews record the presence or absence of 12 predefined types of human land use or disturbance for each
of the 10 stations. As part of the NLA 2017, crews separately identify additional human disturbances
outside of, but adjacent to, the plots. For each of the 12 disturbance categories, the data analysis team
calculates the proportion of lakeshore stations where the disturbance is observed on each lake.
Proportions are weighted according to the proximity of the disturbance before computing the whole-
lake metrics. Weightings are 1.0 for disturbance observations within the riparian sample plots and 0.33
for those behind or adjacent to the plots. Two types of summary metrics are calculated by synthesizing
all the human disturbance observations. The first, a measure of the extent of shoreline disturbance, is
calculated as the proportion of stations at which one or more human disturbances were observed. The
second, a measure of disturbance intensity, is calculated as the mean number of human disturbance
types observed at each of the 10 shoreline stations.
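A sketch of these calculations in R, using hypothetical presence/absence observations for the 10 stations, follows.

    # Proximity-weighted proportion for one disturbance category (10 stations)
    in_plot  <- c(1, 0, 1, 0, 0, 1, 0, 0, 1, 0)  # observed within the riparian plot
    adjacent <- c(0, 1, 0, 0, 1, 0, 0, 0, 0, 0)  # observed behind/adjacent to the plot
    w <- pmax(1.0 * in_plot, 0.33 * adjacent)    # weight each station by proximity
    prop_category <- mean(w)                     # weighted proportion for this category

    # Whole-lake summary metrics over all 12 categories (hypothetical 10 x 12 matrix)
    D <- matrix(rbinom(10 * 12, 1, 0.2), nrow = 10)
    extent    <- mean(rowSums(D) > 0)  # proportion of stations with any disturbance
    intensity <- mean(rowSums(D))      # mean number of disturbance types per station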
8.4.6.3 Riparian vegetation
Crews visually estimate riparian vegetation type and areal cover in three layers: the canopy (>5 m high),
mid-layer (0.5-5 m high) and ground cover (<0.5 m high). Coniferous and deciduous vegetation is
distinguished in the canopy and mid-layer; woody and herbaceous vegetation is distinguished in the
mid-layer and ground cover. As was done in NLA 2007 and NLA 2012, crews rate cover as absent (0) or in one of
four cover classes: sparse (0-10%), moderate (10-40%), heavy (40-75%), and very heavy (>75%). The data
analysis team calculates simple whole-lake metrics by assigning the cover class mid-point value to each
station's observations and then averaging those cover values across all 10 stations. The data analysis
team calculates summary metrics for each lake by summing the areal cover or tallying the presence of
defined combinations of riparian vegetation layers or vegetation types.
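A minimal R sketch of the midpoint calculation follows; the midpoint values shown are simply the midpoints of the class ranges given above.

    # Assign cover-class midpoints and average across the 10 stations
    midpoint <- c(absent = 0, sparse = 0.05, moderate = 0.25,
                  heavy = 0.575, very.heavy = 0.875)
    station  <- c("sparse", "moderate", "absent", "heavy", "sparse",
                  "moderate", "very.heavy", "sparse", "absent", "moderate")
    lake_mean_cover <- mean(midpoint[station])   # whole-lake areal cover metric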
8.4.6.4	Aquatic macrophytes
Using the same cover classes as for riparian vegetation, crews estimate areal covers of nearshore
emergent, floating, and submerged aquatic macrophytes visually. The data analysis team calculates
simple and summary aquatic macrophyte metrics for each lake in the same fashion as for riparian
vegetation.
8.4.6.5	Fish concealment features
Crews record the presence or absence of eight specified types of fish concealment features within each
10-m x 15-m littoral plot. Crews assign the areal cover of each type to one of the same cover classes
listed above. Simple metrics for each type of fish concealment feature are calculated as the proportion
of littoral stations with the particular concealment feature present. The data analysis team calculates
summary metrics as the mean number of concealment types per station. The team then uses the areal
cover class designation to down-weight very sparse cover in the calculation of both simple and summary fish
cover metrics.
8.4.6.6	Shoreline and littoral bottom substrate
Crews make visual estimates of areal cover of 9 defined substrate types (bedrock, boulders, cobble,
gravel, sand, silt/clay/muck, woody debris, organic matter, and vegetation) separately for the 1-m
shoreline band and the bottom within the 10-m x 15-m littoral plot. Cover classes are the same as for
riparian vegetation, with the same modification to include an additional higher cover class. In cases
where the bottom substrate cannot be observed directly, crew observers use a clear plastic viewing
bucket, a 3-m plastic (PVC) sounding tube, or an anchor to examine or obtain samples of bottom
sediments.
The data analysis team obtains simple metrics describing the lake-wide mean cover of littoral and
shoreline substrate in each cover class size category by averaging the cover estimates at each station,
using the cover class midpoint approach described for riparian vegetation. The team then calculates
three substrate summary metrics for both shoreline and littoral bottom substrates. First is the mean
cover of the dominant substrate type. Second and third are measures of the central tendency and
variety of substrate size. Because the size categories are approximately logarithmic, the data analysis
team calculates a cover-weighted mean substrate size class and its standard deviation: it ranks the
substrate classes by size from 1 to 6, weights each class by its lakewide mean cover, and then computes
the cover-weighted mean and variance across the size classes.
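For example, the cover-weighted mean size class and its variance could be computed in R as follows (the cover values are hypothetical).

    # Substrate size classes ranked 1 (smallest) to 6 (largest), with the lakewide
    # mean areal cover of each class (hypothetical values)
    size_rank <- 1:6
    cover     <- c(0.05, 0.30, 0.25, 0.20, 0.15, 0.05)
    w         <- cover / sum(cover)                  # normalize covers to weights
    mean_size <- sum(w * size_rank)                  # cover-weighted mean size class
    var_size  <- sum(w * (size_rank - mean_size)^2)  # cover-weighted variance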
8.4.6.7	Littoral depth, bank characteristics and other observations
Crews measure lake depth 10 m offshore using SONAR, sounding line, or sounding rod. Field crews
estimate the bank angle based on high and low water marks and the vertical and lateral range in lake
water level fluctuation. They also note the presence of water surface scums, algal mats, oil slicks, and
sediment color and odor. The data analysis team calculates whole-lake metrics for littoral depth and
water level fluctuations as arithmetic averages and standard deviations. For bank angle classes and
qualitative observations of water surface condition, sediment color, and odor, the team calculates the
proportion of stations where the described features are present.
8.4.6.8	Human Disturbances in Riparian/Littoral
12 Simple metrics describe the presence (proportion of shore) of: buildings, commercial land use, lawns,
developed parkland, roads/railroads, docks/boats, trash/landfill, seawalls/revetments, row crop
agriculture, pasture, orchards, and other human activities.
2 Summary metrics describe the mean number of disturbance types observed per station and the proportion of
shoreline with human disturbance of any type.
8.4.6.9	Riparian Vegetation Structure
8 Simple metrics describe areal cover of trees >0.3 m diameter at breast height (DBH) and <0.3 m DBH in
canopy layer; woody and herbaceous vegetation in mid-layer; barren ground and woody, herbaceous,
and inundated vegetation in ground cover layer.
6 Summary metrics describe aggregate covers in canopy + mid-layer, woody vegetation in canopy + mid-
layer, and canopy + mid-layer + ground cover layers; presence of vegetation in canopy layer; presence in
both canopy and mid-layer.
8.4.6.10	Littoral Aquatic Macrophytes
Simple metrics describe cover of emergent, floating, and submergent macrophytes; and presence of
macrophytes lakeward from the shoreline observation plot.
2 Summary metrics describe mean combined cover and proportion of shoreline with macrophytes
present.
8.4.6.11	Shoreline and Littoral Substrate Type and Size
14 Simple metrics separately describing shoreline and littoral substrate: areal cover estimates of bedrock
(>4000 mm), boulder (250-4000 mm), cobble (64-250 mm), gravel (2-64 mm), sand (0.06-2.0 mm), soil
or silt/clay/muck (<0.06 mm), and vegetation or woody debris (if concealing substrate).
6 Summary metrics (3 for shore and 3 for littoral bottom) estimating cover-weighted mean size class,
size class variance, and the areal cover of the dominant substrate type.
8.4.6.12	Littoral Fish Cover
8 Simple metrics estimating proportion of shore zone with various fish cover types: boulder, rock ledge,
brush, inundated live trees, overhanging vegetation, snags >0.3 m diameter, aquatic macrophytes, and
human structures (e.g., docks, enhancement structures).
Summary metrics describing the mean number of fish cover types.
8.4.6.13	Littoral Depth, Banks, and Level Fluctuations
7 Simple metrics describing mean depth and depth variation among sampling stations, bank angle, and
apparent height and extent of vertical and horizontal lake water level fluctuations.
1 Summary metric describing spatial variation of station depths on the lake.
8.4.6.14	Miscellaneous Habitat Variables
7 Simple metrics describing the proportion of sampling sites with sediment odor (petroleum, H2S), sediment
colors (black, brown, other), and water surface films (oil, algal mat, other).
1 Summary metric describing proportion of sampling sites with surface film of any type.
8.4.7	Phytoplankton Assemblages
Phytoplankton will be collected as an integrated sample from the euphotic zone in open water. Both
abundance and biovolume on a species-specific basis will be determined. The data will be used to
calculate cyanobacteria cell density, which will be compared to algal toxin benchmarks established by
the World Health Organization.
8.4.8	Sediment Contaminants
Concentrations of chemical constituents and percent TOC are measured in the sediments in order to
determine sediment condition in lakes and reservoirs. Sediment contaminant measures will be
compared to existing published quotients. Data analysts can use the total organic carbon and grain size
information to help interpret other sediment contaminant information as part of the analysis process.
8.4.9	Atrazine Pesticide Screen
Analysts plan to determine atrazine occurrence and concentration from lake water samples.
Comparisons will be made among lakes, relative to land use in the watershed and other water quality
characteristics (e.g., nutrient concentrations). Atrazine concentrations will be compared to USEPA's level
of concern for plant communities.
8.4.10	Trophic Status
The trophic state of lakes is analyzed using chlorophyll a concentration, which is considered the most
accurate estimator of trophic state. Trophic state is assessed using chlorophyll a concentration
thresholds, as follows: oligotrophic, <2 µg/L; mesotrophic, 2 to 7 µg/L; eutrophic, 7 to <30 µg/L; and
hypereutrophic, >30 µg/L. These categories will be used to rank the condition of lakes relative to their
trophic state.
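These thresholds translate directly into a classification rule; a minimal R sketch:

    # Classify lakes by chlorophyll a concentration (ug/L) using the thresholds above
    trophic_state <- function(chla) {
      cut(chla, breaks = c(-Inf, 2, 7, 30, Inf), right = FALSE,
          labels = c("oligotrophic", "mesotrophic", "eutrophic", "hypereutrophic"))
    }
    trophic_state(c(1.5, 4.2, 18, 55))  # oligotrophic, mesotrophic, eutrophic, hypereutrophic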
8.4.11	Water Chemistry, Chlorophyll a and Secchi Depth
Laboratories measure a wide array of water chemistry parameters, including DO, pH, total nitrogen (TN),
total phosphorus (TP), clarity, DOC, color, ANC, primary productivity, and other analytes. The data
analysis team plans to assess some of these parameters using the reference based approach and some
using nationally-consistent values. Additionally, the team reports on values for these parameters and
their distribution. Water chemistry analysis is critical for interpreting the biological indicators.
Temperature profiles are used to determine the degree of lake stratification.
9 LITERATURE CITED
Allen AP, Whittier TR, Kaufmann PR, Larsen DP, O'Connor RJ, Hughes RM, Stemberger RS, Dixit SS,
Brinkhurst RO, Herlihy AT, Paulsen SG. 1999. Concordance of taxonomic composition patterns across
multiple lake assemblages: effects of scale, body size, and land use. Canadian Journal of Fisheries
and Aquatic Sciences 56: 2029-2040.
Baker, J.R. and G.D. Merritt, 1990. Environmental Monitoring and Assessment Program: Guidelines for
Preparing Logistics Plans. EPA 600/4-91-001. U.S. Environmental Protection Agency. Las Vegas,
Nevada.
Carlson, R.E. 1977. A trophic state index for lakes. Limnology and Oceanography 22(2):361-369.
CAS - Chemical Abstracts Service (CAS 1999).
Code of Federal Regulations, Title 40 - Protection of Environment. 40CFR Part 136, App. B Definition and
Procedure for the Determination of the Method Detection Limit.
FGDC. 1998. Federal Geographic Data Committee. Content standard for digital geospatial metadata,
version 2.0. FGDC-STD-001-1998. https://www.fgdc.gov/metadata/csdgm.
Garner, F.C., M.A. Stapanian, and K.E. Fitzgerald. 1991. Finding causes of outliers in multivariate
environmental data. Journal of Chemometrics. 5: 241-248.
Heinz Center. 2002. The State of the Nation's Ecosystems. The Cambridge University Press.
Herlihy, A. T., S. G. Paulsen, J. V. Sickle, J. L. Stoddard, C. P. Hawkins, and L. L. Yuan. 2008. Striving for
consistency in a national assessment: the challenges of applying a reference-condition approach at a
continental scale. Journal of the North American Benthological Society 27:860-877.
Hunt, D.T.E. and A.L. Wilson. 1986. The chemical analysis of water: general principles and techniques. 2nd
edition. Royal Society of Chemistry, London, England.
Kaufmann PR, Hughes RM. 2006. Geomorphic and anthropogenic influences on fish and amphibians in
Pacific Northwest coastal streams. Pages 429-455 in Hughes RM, Wang L, Seelbach PW (editors).
Landscape influences on stream habitat and biological assemblages. American Fisheries Society
Symposium 48, Bethesda, Maryland.
Kaufmann PR, Hughes RM, Van Sickle J, Whittier TR, Seeliger CW, Paulsen SG. 2014a. Lake shore and
littoral habitat structure: A field survey method and its precision. Lake and Reservoir
Management. 30:157-176.
Kaufmann PR, Hughes RM, Whittier TR, Bryce SA, Paulsen SG. 2014b. Relevance of lake physical habitat
assessment indices to fish and riparian birds. Lake and Reservoir Management. 30:177-191.
Kaufmann PR, Levine P, Robison EG, Seeliger C, Peck DV. 1999. Quantifying physical habitat in wadeable
streams. EPA/620/R-99/003. U.S. Environmental Protection Agency, Office of Research and
Development, Washington, DC. Available at
http://www.epa.gov/emap/html/pubs/docs/groupdocs/surfwatr/field/phyhab.html. Accessed
April 2011.
Kaufmann PR, Peck DV, Paulsen SG, Seeliger CW, Hughes RM, Whittier TR, Kamman NC. 2014c.
Lakeshore and littoral physical habitat structure in a national lakes assessment. Lake and Reservoir
Management. 30:192-215.
Kincaid TM, Larsen DP, Urquhart NS. 2004. The structure of variation and its influence on the estimation
of status: indicators of condition of lakes in the Northeast USA. Environmental Monitoring and
Assessment 98:1-21.
Kirchmer, C.J. 1983. Quality control in water analysis. Environmental Science & Technology. 17: 174A-
181A.
Klemm, D.J., P.A. Lewis, F. Fulk, and J.M. Lazorchak. 1990. Macroinvertebrate Field and Laboratory
Methods for Evaluating the Biological Integrity of Surface Waters. EPA 600/4-90/030. U.S.
Environmental Protection Agency, Cincinnati, Ohio.
Larsen DP, Kincaid TM, Jacobs SE, Urquhart NS. 2001. Designs for evaluating local and regional scale
trends. Bioscience 51(12):1069-1078.
Larsen DP, Kaufmann PR, Kincaid TM, Urquhart NS. 2004. Detecting persistent change in the habitat of
salmon-bearing streams in the Pacific Northwest. Canadian Journal of Fisheries and Aquatic
Sciences 61:283-291.
Larsen, D. P., N. S. Urquhart, and D. L. Kugler. 1995. Regional-scale trend monitoring of indicators of
trophic condition of lakes. Water Resources Bulletin 31:117-139.
Lemmon, P.E. 1957. A new instrument for measuring forest overstory density. J. For. 55(9): 667-669.
Littel RC, Milliken GA, Stroup WW, Wolfinger RD, Schabenberger O. 2006. SAS for mixed models, Second
Edition. Cary, N.C. SAS Institute, Inc. 814p.
Meglen, R.R. 1985. A quality control protocol for the analytical laboratory. Pg. 250-270. IN: J.J. Breen and
P.E. Robinson (eds). Environmental Applications of Cehmometrics. ACS Symposium Series 292.
American Chemical Society, Washington, D.C.
NHDPIus 2005. NHD - National Hydrography Dataset Plus Version 1.0
http://www.horizonsystems.com/nhdplus/index.php.
NAPA. 2002. Environment.gov. National Academy of Public Administration. ISBN: 1-57744-083-8. 219
pages.
NRC. 2000. Ecological Indicators for the Nation. National Research Council.
Oblinger Childress, C.J., Foreman, W.T., Connor, B.F. and T.J. Maloney. 1999. New reporting procedures
based on long-term method detection levels and some considerations for interpretations of water-
quality data provided by the U.S. Geological Survey National Water Quality Laboratory. U.S.G.S
Open-File Report 99-193, Reston, Virginia.
Paulsen, S.G., D.P. Larsen, P.R. Kaufmann, T.R. Whittier, J.R. Baker, D. Peck, J., McGue, R.M. Hughes, D.
McMullen, D. Stevens, J.L. Stoddard, J. Lazorchak, W.Kinney, A.R. Selle, and R. Hjort. 1991. EMAP -
surface waters monitoring and research strategy, fiscal year 1991. EPA-600-3-91-002. U.S.
Environmental Protection Agency, Office of Research and Development, Washington, D.C. and
Environmental Research Laboratory, Corvallis, Oregon.
Peck, D. V., A. R. Olsen, M. H. Weber, S. G. Paulsen, C. Peterson, and S. M. Holdsworth. 2013. Survey
design and extent estimates for the National Lakes Assessment. Freshwater Science 32:1231-1245.
Peck, D.V., J.M. Lazorchak, and D.J. Klemm (editors). 2003. Unpublished draft. Environmental Monitoring
and Assessment Program - Surface Waters: Western Pilot Study Field Operations Manual for
Wadeable Streams. U.S. Environmental Protection Agency, Washington, D.C.

-------
National Lakes Assessment 2017
Version 1.1, May 2017
Quality Assurance Project Plan
Page 57 of 64
Peck, D. V., and R. C. Metcalf. 1991. Dilute, neutral pH standard of known conductivity and acid
neutralizing capacity. Analyst 116:221-231
Plafkin, J.L., M.T. Barbour, K.D. Porter, S.K. Gross, and R.M. Hughes. 1989. Rapid Bioassessment
Protocols for Use in Streams and Rivers: Benthic Macroinvertebrates and Fish. EPA 440/4-89/001.
U.S. Environmental Protection Agency, Office of Water, Washington, D.C.
Platts, W.S., W.F. Megahan, and G.W. Minshall. 1983. Methods for Evaluating Stream, Riparian, and
Biotic Conditions. USDA Forest Service, Gen. Tech. Rep. INT-183. 71pp.
Stapanian, M.A., F.C. Garner, K.E. Fitzgerald, G.T. Flatman, and J.M. Nocerino. 1993. Finding suspected
causes of measurement error in multivariate environmental data. Journal of Chemometrics. 7: 165-
176.
Stevens, D. L., Jr., 1994. Implementation of a National Monitoring Program. Journal of Environ.
Management 42:1-29.
USEPA. 1984. EPA Order 2160 (July 1984), Records Management Manual, U.S. Environmental Protection
Agency, Washington, DC.U.S. EPA, 1999. EPA's Information Management Security Manual. EPA
Directive 2195 Al.
USEPA. 2002. Guidance for Quality Assurance Project Plans (EPA QA/G-5). EPA/240/R-02/009. U.S.
Environmental Protection Agency, Office of Environmental Information, Washington, D.C.
http://www.epa.gov/quality/qs-docs/g5-final.pdf
USEPA. 2003. Draft Report on the Environment. ORD and OEI. EPA-260-R-02-006.
USEPA. 2004. National Geospatial Data Policy, https://www.epa.gov/sites/production/files/2014-
08/documents/national_geospatial_data_policy_0.pdf
USEPA. 2006. Guidance on Systematic Planning Using the Data Quality Objectives Process (EPA QA/G-4).
EPA/240/B-06/001. U.S. Environmental Protection Agency, Office of Environmental Information,
Washington, D.C. http://www.epa.gov/quality/qs-docs/g4-final.pdf
USEPA. 2009. National Lakes Assessment: Technical Appendix. EPA 841-B-09-001a. U.S. Environmental
Protection Agency, Washington, DC.
USEPA. 2013. EPA's Information Security Policy. EPA Order 2150.
https://www.epa.gov/sites/production/files/2013-ll/documents/ansp_interim_policy.pdf
USEPA. 2016. National Lakes Assessment 2012: Technical Report. EPA 841-R-16-114. U.S. Environmental
Protection Agency, Washington, D.C.
USGAO. 2000. Water Quality. GAO/RCED-OO-54.
Van Sickle J, Hawkins CP, Larsen DP, Herlihy AT. 2005. A null model for the expected
macroinvertebrate assemblage in streams. Journal of the North American Benthological Society.
24(1):178-191.
Washington, H.G. 1984. Diversity, biotic, and similarity indices. Water Research 18(6): 653-694.
Whittier, T. R., S. G. Paulsen, D. P. Larsen, S. A. Peterson, A. T. Herlihy, and P. R. Kaufmann. 2002.
Indicators of ecological stress and their extent in the population of northeastern lakes: a regional-
scale assessment. Bioscience 52:235-247.
Zar JH. 1999. Biostatistical Analysis, 4th ed. Prentice-Hall, Inc. New Jersey, USA.

APPENDIX A: LABORATORY LIST
National Lakes Assessment 2017 Contract Laboratory List

Analysis | Contact | Contractor | Contract No. & Task No. | Project Officer
Algal Toxins (microcystins and cylindrospermopsin) | Kendra Forde | EnviroScience | EP-C-12-002; TO 30 | Sarah Lehmann
Algal Toxins (microcystins, HDPE sample container) | Kendra Forde | GLEC | EP-C-16-008 | Sarah Lehmann
Benthic Macroinvertebrates | Brian Hasty | PG Environmental | EP-C-12-004; TO 28 | Sarah Lehmann
Bacteria (E. coli) | Kendra Forde | EnviroScience | EP-C-12-002; TO 33 | Sarah Lehmann
Sediment Contaminants | Kendra Forde | EnviroScience | EP-C-12-002; TO 29 | Sarah Lehmann
Atrazine Pesticide Screen | Kendra Forde | EnviroScience | EP-C-12-002; TO 31 | Sarah Lehmann
Water Chemistry | Dave Peck | CSS | EP-D-16-021; TO 02 | Dave Peck
Zooplankton and Phytoplankton | Brian Hasty | BSA Environmental Services | GS-10F-0302S | Sarah Lehmann

APPENDIX B: REVISION HISTORY
NLA 2017 Changes made to the Field Operations Manual Version 1.0
(incorporated into Version 1.1).
Background on change: The index water sampling protocol has recently changed. Two Algal Toxins samples will now be collected at the index site: the primary sample (MICX), collected in a 500 mL square clear PETG bottle, and a second sample for a bottle comparison (MICZ), collected in a 500 mL white HDPE bottle. MICX samples will be processed for both microcystins and cylindrospermopsin, while MICZ samples will be processed for microcystins only, as was done in 2007 and 2012.
There is new guidance suggesting that HDPE (the bottle material used in both 2007 and 2012) may adsorb cyanobacterial cells, which could lower the concentration reported for the sample. To help determine whether the 2007 and 2012 results were under-reported, it is essential that EPA conduct a side-by-side test with both the old and new bottle styles.
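The QAPP does not prescribe an analysis method for this comparison. Purely as an illustration of how the paired MICX/MICZ results could be screened for a systematic low bias in the HDPE bottle, the minimal Python sketch below applies a one-sided paired t-test; the function name, inputs, and use of SciPy are assumptions for this example, not NLA requirements.

    # Minimal sketch, not part of the NLA 2017 QAPP: a one-sided paired
    # t-test asking whether the HDPE bottle (MICZ) reports systematically
    # lower microcystin concentrations than the PETG bottle (MICX).
    from scipy import stats

    def hdpe_low_bias_pvalue(micx_ug_per_l, micz_ug_per_l):
        """Each index i pairs the PETG and HDPE results from one lake visit."""
        t_stat, p_two_sided = stats.ttest_rel(micx_ug_per_l, micz_ug_per_l)
        # A positive t statistic means MICX reads higher on average; halve
        # the two-sided p-value for the one-sided "HDPE reads lower" test.
        return p_two_sided / 2 if t_stat > 0 else 1 - p_two_sided / 2

Under this sketch, a small p-value would indicate that the HDPE bottle reads lower than PETG on the same water, consistent with under-reporting in 2007 and 2012.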
In order to do a true comparison, we need to provide the same bottle and volume as were collected in 2007 and 2012, which means crews will need to collect some additional water at the index site. Crews will now collect five integrated samples (5 full, 10 halves, etc.). The first will go straight to the CHLA sample bottle; #2 and #3 will be composited in the cubitainer and divided; and #4 and #5 will become the CHEM sample. The new steps have been incorporated into both Version 1.1 of the FOM and the Index Presentation.
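As a reading aid only, the pull-to-container allocation described above can be summarized in a short Python sketch; the mapping name and destination strings below are illustrative, not part of the FOM.

    # Illustrative only: destination of each of the five integrated sampler
    # pulls at the index site under the Version 1.1 protocol.
    PULL_DESTINATIONS = {
        1: "CHLA sample bottle",              # pull #1 goes straight to CHLA
        2: "cubitainer composite (divided)",  # pulls #2 and #3 are composited,
        3: "cubitainer composite (divided)",  #   then divided per Figure 5.3
        4: "CHEM sample",                     # pulls #4 and #5 together
        5: "CHEM sample",                     #   become the CHEM sample
    }

    def destination(pull: int) -> str:
        """Return where a given integrated sampler pull is dispensed."""
        return PULL_DESTINATIONS[pull]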
A summary of the FOM changes is below. Forms and labels will also include the new sample.
•	Figure 3.1 has been updated to reflect the change in the number of integrated sampler pulls and the bottles into which water will be dispensed.
•	Figure 5.3 (below) has been updated to include the changes in sample collection.
[Figure 5.3 (updated): index site sample collection. Panels show Integrated Sampler #1, Integrated Samplers #2 and #3, and Integrated Samplers #4 and #5, with labeled destinations for Phytoplankton, Nutrients/Water Chemistry, the clear square PETG bottle (MICX), and the white round HDPE bottle (MICZ).]
•	In Appendix A, the site kit equipment list has been updated to include both bottle types:
	PETG bottle (500 mL, clear, narrow-mouth, square), quantity 1: Algal Toxins (MICX)
	HDPE bottle (500 mL, white, wide-mouth, round): Algal Toxins (MICZ)
• In Appendix B, the shipping table and flowchart have been updated to reflect the new
sample, which will be shipped with the other T2 chilled immediate samples to GLEC.

NLA 2017 Changes made to the Laboratory Operations Manual

LOM Version: 1.1
Date Approved:
Changes Made:
•	Minor editorial and grammatical changes throughout the LOM; some section header language revised. "Chain of Custody" changed to "sample tracking form" throughout to reflect NARS procedures. Section and table numbering revised to reflect additions and deletions.
•	Sections 1.3 and 1.4 revised to clarify that data templates are available from EPA rather than as an appendix to the LOM.
•	Section 2.0 revised to clarify that all labs participate in a laboratory review process rather than receive an evaluation.
•	Section 2.1 (assistance visit section) deleted.
•	Section 3.0: information on sample shipping moved from old Section 3.5 for consistency with other chapters.
•	Section 3.1 header removed for consistency with other chapters.
•	Sections 3.2 and 3.4 removed because these issues are covered in Section 1.
•	New Section 3.2, Precautions, added.
•	Section 3.5.3, Step 5: text added related to adding conjugate solution.
•	Table 3.2 and references to it deleted because the required data submission elements are included in the data templates.
•	Section 3.6.2 (previously 3.7.2): information on precision and accuracy added.
•	Table 3.2 (previously 3.3): added >A6 in calibration and changed .605 to .60 in kit control.
•	Section 3.8, Sample and Record Retention, deleted because the information is included in Section 1.4.
•	Section 4.0: reference to the EPA standard method added; information on sample shipping moved from old Section 4.5 for consistency with other chapters.
•	Health and safety information moved to Section 4.2, Precautions, for consistency with other chapters.
•	Section 4.4 removed because these issues are covered in Section 1.
•	Table 4.2 and references to it deleted because the required data submission elements are included in the data templates.
•	Section 4.5.2 (previously 4.7.2): information on precision and accuracy added.
•	Section 4.8, Sample and Record Retention, deleted because the information is included in Section 1.4.
•	Section 5.4, Sample Receipt, added.
•	Section 5.5.4 (previously 5.5.3): previous Table 5.2 and related text deleted; information on EPA data templates inserted.
•	Table 5.3: information inserted on assuming samples were collected at 12:00 noon.
•	Section 6.4, Sample Receipt, added.
•	Section 7.4, Sample Receipt, added.
•	Section 7.7.2 (previously 7.6.2): information inserted to clarify that EPA may choose to conduct external QC.
•	Section 8.0 revised to indicate that TOC is frozen, not refrigerated, by the laboratory.
•	Section 8.2 removed because these issues are covered in Section 1.
•	Section 8.1, Personnel: reference to immunoassays deleted.
•	New Section 8.2, Precautions, added.
•	Section 8.4: Step 2 (new) deleted the reference to shipping, as this is covered in Section 8.0; Step 5 (new) clarified that TOC is frozen; Step 7 added regarding maintaining sample tracking forms.
•	Table 8.2 updated, including footnotes.
•	Table 8.3: "%" added as an additional unit for TOC.
•	Table 8.4 deleted because the required data elements are identified in the EPA data templates.
•	Section 8.7.1 deleted.
•	Section 8.6.2 (previously 8.7.3): matrix spike duplicate added.
•	Section 8.8, Sample and Record Retention, deleted because the information is included in Section 1.4.
•	Section 9.4, Sample Receipt, added.
•	Section 9.9.5, Data Entry, added.
•	Table 9.3 added.
•	Section 10.2, Sample Receipt, added.
•	Table 10.3 updated to change NH to NH3-N.
•	Section 10.5.4, Data Entry, added.
•	Section 11.4, Sample Receipt, added.
•	Section 11.6: information on EPA's data template added.
•	Section 11.7.2: information added on proportional QC analyses and to clarify that EPA may choose to conduct external QC.
