           United States Environmental Protection Agency
           Office of Water
           Office of Environmental Information
           Washington, DC
           EPA 843-R-10-003
National Wetland Condition Assessment

     Quality Assurance
          Project Plan
Version 2, March 2012


                      QUALITY ASSURANCE PROJECT PLAN
                REVIEW & DISTRIBUTION ACKNOWLEDGMENT AND
                          COMMITMENT TO IMPLEMENT

                                       for

                  National Wetland Condition Assessment (NWCA)

We have read the QAPP and the methods manuals for NWCA listed below. Our
agency/organization agrees to abide by their requirements for work performed under NWCA
(under CWA 106).

Quality Assurance Project Plan     [ ]
Field Operations Manual            [ ]
Site Evaluation Guidelines         [ ]
Laboratory Methods Manual          [ ]


Print Name
Title	
(Cooperator's Principal Investigator)

Organization	
Signature                                         Date


                                      NOTICE

The complete documentation of overall Wetlands Survey project management, design,
methods, and standards is contained in four companion documents, including:

   •   NWCA: Site Evaluation Guidelines (EPA-843-R-10-004)
   •   NWCA: Field Operations Manual (EPA-843-R-10-001)
   •   NWCA: Laboratory Methods Manual (EPA-843-R-10-002)
   •   Ecological Indicators for the 2011 National Wetland Condition Assessment (in
       preparation)

This document (Quality Assurance Project Plan) contains elements of the overall project
management, data quality objectives, measurement and data acquisition, and information
management for NWCA. Methods described in this document are to be used specifically in work
relating to NWCA. All Project Cooperators should follow these guidelines. Mention of trade
names or commercial products in this document does not constitute endorsement or
recommendation for use. More details on specific methods for site evaluation, field sampling,
and laboratory processing can be found in the appropriate companion document(s).
The suggested citation for this document is:

       USEPA. 2011 (draft). National Wetland Condition Assessment: Integrated
       Quality Assurance Project Plan. EPA 843-R-10-003. U.S. Environmental
       Protection Agency, Office of Water and Office of Research and Development,
       Washington, DC.

                           NWCA QAPP VERSION 2 SIGNATURE PAGE
           Michael Scozzafava                                      Date
           NWCA Project Leader
           U.S. EPA Office of Water
           Gregg Serenbetz                                           Date
           NWCA Project Co-Leader
           U.S. EPA Office of Water
           Tanya Code, Acting Chief                                  Date
           Wetlands Strategies and State Programs Branch
           U.S. EPA Office of Water

           Sarah Lehmann                                         Date
           National Aquatic Resource Surveys Team Leader
           U.S. EPA Office of Water
           Virginia Fox-Norse                                        Date
           OWOW QA Officer
           U.S. EPA Office of Water

                                VERSION HISTORY
 QAPP Version        Date Approved        Changes Made
 1                   4/26/2011            N/A
 2                   3/5/2012             Section 6.2

                                DISTRIBUTION LIST

This QA Project Plan and associated manuals or guidelines will be distributed to the following:
EPA, States, Tribes, universities, and contractors participating in NWCA. EPA Regional Survey
Coordinators are responsible for distributing the NWCA QA Project Plan to State and Tribal Water
Quality Agency staff or other cooperators who will perform the field sampling and laboratory
operations. The Great Lakes Environmental Center QA Officers will distribute the QA Project
Plan and associated documents to participating project staff at their respective facilities and to
the project contacts at participating laboratories, as they are determined.
Michael E. Scozzafava
Office of Wetlands, Oceans and
Watersheds
U.S. Environmental Protection Agency
1200 Pennsylvania Avenue, NW (4503 T)
Washington, DC 20460
202-566-1376
Scozzafava.MichaelE@epa.gov

Chris Faulkner
Office of Wetlands, Oceans and
Watersheds
U.S. Environmental Protection Agency
1200 Pennsylvania Avenue, NW (4503 T)
Washington, DC 20460
202-566-1185
Faulkner.Chris@epa.gov

Gregg Serenbetz
Office of Wetlands, Oceans and
Watersheds
U.S. Environmental Protection Agency
1200 Pennsylvania Avenue, NW (4503 T)
Washington, DC 20460
202-566-1253
Serenbetz.Gregg@epa.gov

Mary Kentula
Aquatic Monitoring and Assessment Branch
Western Ecology Division, NHEERL, ORD,
U.S. EPA 200 S.W. 35th St.
Corvallis, OR  97330
541-754-4478
Kentula.Mary@epa.gov
Teresa Magee
Aquatic Monitoring and Assessment Branch
Western Ecology Division, NHEERL, ORD,
U.S. EPA 200 S.W. 35th St.
Corvallis, OR 97330
541-754-4385
Magee.Teresa@epa.gov

Sarah Lehmann
U.S. EPA Office of Wetlands, Oceans and
Watersheds
1200 Pennsylvania Avenue, NW (4503T)
Washington DC 20460
202-566-1379
Lehmann.Sarah@epa.gov

Jeanne Voorhees
Regional EPA Coordinator
U.S. EPA-Region 1
1 Congress Street, Suite 1100
Boston MA 02114-2023
617-918-1686
Voorhees.Jeane@epa.gov

Tom Faber
Regional EPA Coordinator
U.S. EPA Region 1 - New England
Regional Laboratory
11 Technology  Drive
North Chelmsford,  MA 01863-2431
617-918-8672
Faber.Tom@epa.gov

Kathy Drake
Regional EPA Coordinator
U.S. EPA-Region 2
2890 Woodbridge Ave.
Edison, NJ 08837
212-637-3817
Drake.Kathleen@epa.gov

Darvene Adams
Regional EPA Coordinator
Division of Envir. Science and Assessment
U.S. EPA-Region 2
2890 Woodbridge Ave.
Edison, NJ 08837
732-321-6700
Adams.Darvene@epa.gov

Regina Poeske
EPA NWCA QA Assistance Visit
Coordinator
Regional EPA Coordinator
U.S. EPA-Region 3
1650 Arch Street
Philadelphia, PA 19103-2029
215-814-2725
Poeske.Regina@epa.gov

Dave Melgaard
Regional EPA Coordinator
U.S. EPA-Region 4
AFC Bldg., 15th Floor
61 Forsyth St., S.W.
Atlanta, GA 30303-8960
404-562-9265
Melgaard.David@epa.gov

Mari Nord
Regional EPA Coordinator
U.S. EPA-Region 5
77 W. Jackson Blvd.
Chicago, IL 60604
312-886-3017
Nord.Mari@epa.gov

Peter Jackson
Regional EPA Coordinator
U.S. EPA-Region 5
77 W. Jackson Blvd.
Chicago, IL 60604
312-886-3894
Jackson.Peter@epa.gov

Sue Elston
Regional EPA Coordinator
U.S. EPA-Region 5
77 W Jackson Blvd.
Chicago, IL 60604
312-886-6115
Elston.Sue@epa.gov
Mark Stead
Regional EPA Coordinator
U.S. EPA - Region 6 (6WQ-EWM)
1445 Ross Avenue - Suite 1200
Dallas, TX 75202-2733
214-665-2271
Stead.Mark@epa.gov

Laura Hunt
Regional EPA Coordinator
U.S. EPA - Region 6 (6WQ-EWM)
1445 Ross Avenue - Suite 1200
Dallas, TX 75202-2733
214-665-9729
Hunt.Laura@epa.gov

Eliodora Chamberlain
Regional EPA Coordinator
U.S. EPA-Region 7
901 North Fifth Street
Kansas City,  KS 66101
913-551-7945
Chamberlain.Eliodora@epa.gov

Gary Welker
Regional EPA Coordinator
U.S. EPA-Region 7
901 North Fifth Street
Kansas City,  KS 66101
913-551-7177
Welker.Gary@epa.gov

Karl Hermann
Regional EPA Coordinator
U.S. EPA-Region 8
1595 Wynkoop St.
Denver, CO 80202-2405
303-312-6084
Hermann.Karl@epa.gov

Julia McCarthy
Regional EPA Coordinator
U.S. EPA-Region 8
1595 Wynkoop St.
Denver, CO 80202-2405
303-312-6153
McCarthy.Julia@epa.gov

Paul Jones
Regional EPA Coordinator
U.S. EPA-Region 9
75 Hawthorne Street
San Francisco, CA 94105
415-972-3470
Jones.Paul@epa.gov

Janet Hashimoto
Regional EPA Coordinator
U.S. EPA-Region 9
75 Hawthorne Street
San Francisco, CA 94105
415-972-3452
Hashimoto.Janet@epa.gov

Mary-Anne Thiesing
Regional EPA Coordinator
U.S. EPA-Region 10
1200 Sixth Avenue
Seattle, WA 98101
206-553-6114
Thiesing.Mary@epa.gov
Gretchen Hayslip
Regional EPA Coordinator
U.S. EPA-Region 10
1200 Sixth Avenue
Seattle, WA 98101
206-553-1685
Hayslip.Gretchen@epa.gov

Dennis J. McCauley
Logistics Coordinator
Great Lakes Environmental Center
739 Hastings St.
Traverse City, MI 49686
231-941-2230
dmccauley@glec.com

                               TABLE OF CONTENTS

1   PROJECT PLANNING AND MANAGEMENT	13
  1.1     Introduction	13
  1.2     NWCA Project Organization	15
    1.2.1   Project Schedule	20
  1.3     Scope of QA Project Plan	21
    1.3.1   Overview of Field Operations	21
    1.3.2   Overview of Laboratory Operations	28
    1.3.3   Data Analysis and Reporting	30
    1.3.4   Peer Review	31
2   DATA QUALITY OBJECTIVES	32
  2.1     Data Quality Objectives for the National Wetland Condition Survey	32
  2.2     Measurement Quality Objectives	33
    2.2.1   Laboratory Reporting Level (Sensitivity)	33
    2.2.2   Sampling Precision, Bias, and Accuracy	34
    2.2.3   Taxonomic Precision and Accuracy	37
    2.2.4   Completeness	39
    2.2.5   Comparability	39
    2.2.6   Representativeness	40
3   SAMPLING DESIGN AND SITE SELECTION	40
  3.1     Probability-Based Sampling Design and Site Selection	40
4   INFORMATION MANAGEMENT	43
  4.1     Overview of System Structure	43
    4.1.1   Design and Site  Status Data Files	44
    4.1.2   Sample Collection and Field Data Recording	45
    4.1.3   Laboratory Analyses and Data Recording	46
    4.1.4   Data Review, Verification, and Validation Activities	48
  4.2     Data Transfer	50
  4.3     Hardware and Software Control	50
  4.4     Data Security	50
  4.5     Data Archive	51
5   INDICATORS	51
  5.1     Vegetation	53
    5.1.1   Introduction	53
    5.1.2   Training and  Field Audits	53
    5.1.3   Sampling Design	54
    5.1.4   Field Measurements and Sampling	54
    5.1.5   Laboratory Methods	57
    5.1.6   Quality Assurance Objectives	59
    5.1.7   Quality Control Procedures:  Field Operations	59
    5.1.8   Quality Control Procedures:  Laboratory Operations	60
    5.1.9   Data Management, Review,  and Validation	61
  5.2     Soils	61
    5.2.1   Introduction	61
    5.2.2   Sampling Design	62
    5.2.3   Sampling and Analytical Methods	67
    5.2.4   Quality Assurance Objectives	68
    5.2.5   Quality Control Procedures:  Field Operations	69
    5.2.6   Quality Control Procedures:  Laboratory Operations	70


    5.2.7   Data Management, Review, and Validation	71
  5.3    Hydrology	72
    5.3.1   Introduction	72
    5.3.2   Sampling Design	72
    5.3.3   Sampling and Analytical Methods	73
    5.3.4   Quality Assurance Objectives	73
    5.3.5   Quality Control Procedures	74
    5.3.6   Data Management, Review, and Validation	74
  5.4    Water Chemistry Indicator	75
    5.4.1   Introduction	75
    5.4.2   Field Collection	75
    5.4.3   Sampling and Analytical Methods	75
    5.4.4   Quality Assurance Objectives	76
    5.4.5   Quality Control Procedures: Field Operations	78
    5.4.6   Quality Control Procedures: Laboratory Operations	78
    5.4.7   Data Reporting, Review, and Management	83
  5.5    Algae Indicator	83
    5.5.1   Introduction	83
    5.5.2   Sampling Design	84
    5.5.3   Sampling and Analytical Methods	84
    5.5.4   Quality Assurance Objectives	89
    5.5.5   Quality Control Procedures: Field Operations	89
    5.5.6   Quality Control Procedures: Laboratory Operations	92
    5.5.7   Data Management, Review, and Validation	94
  5.6    Stressors Indicator	96
  5.7    Rapid Assessment Method	96
    5.7.1   Introduction	96
    5.7.2   Sampling Design	97
    5.7.3   Quality Assurance Objectives	97
    5.7.4   Quality Control Procedures: Field Operations	98
    5.7.5   Data Management, Review, and Validation	98
6   FIELD AND LABORATORY QUALITY EVALUATION AND ASSISTANCE VISITS	99
  6.1     Field Quality Evaluation and Assistance Visit Plan for the National Wetland Condition
  Assessment (NWCA)	99
  6.2    Laboratory Quality Evaluation and Assistance Visit Plan for the National Wetland
  Condition Assessment (NWCA)	102
7   DATA ANALYSIS PLAN	107
  7.1     Data Interpretation Background	107
  7.2    Datasets Utilized for the Report	108
  7.3    Vegetation, Soft Algae, and Diatom Data Analysis	109
  7.4    Soils, Hydrology and Water Quality Data Analysis	109
  7.5    Rapid Assessment Data Analysis and Methodology Evaluation	110
8   REFERENCES	111
 9   APPENDIX A: NATIONAL WETLAND CONDITION ASSESSMENT FIELD EVALUATION
AND ASSISTANCE SITE VISIT SUMMARY OF FORMS	116
10  APPENDIX B: WETLAND SURVEY LABORATORY LIST	117
11  APPENDIX C: Interlaboratory Total Microcystin Comparison by ELISA	118
  11.1    Introduction:	118
  11.2    Interlaboratory Comparison Study Design:	118
  11.3    Criteria for Acceptable Comparison:	119
  11.4    Corrective Action:	119

                                  TABLE OF TABLES

Table 1.3-1. Critical logistics elements (from Baker and Merritt, 1990)	23
Table 1.3-2. Guidelines for analytical support laboratories	30
Table 2.2-1. Important variance components for aquatic resource assessments	36
Table 4.1-1. Sample and field data quality control activities	46
Table 4.1-2. Laboratory data quality control activities	47
Table 4.1-3. Biological sample quality control activities	47
Table 4.1-4. Data review, verification, and validation quality control activities	49
Table 5.1-1. Field measurement methods: vegetation	55
Table 5.1-2. Guidelines for resolution when estimating percent cover	56
Table 5.1-3. Measurement data quality objectives: vegetation indicator	59
Table 5.2-1. Field measurement methods: soil profile metrics	67
Table 5.2-2. Soil Sample Collection	68
Table 5.2-3. Measurement quality objectives: soil indicator	68
Table 5.2-4. Field quality control: Soil indicator	69
Table 5.2-5. Lab analysis quality control: soils indicator	70
Table 5.2-6. Data validation quality control:  soils indicator	72
Table 5.3-1. Field measurement methods: hydrology metrics	73
Table 5.3-2. Measurement quality objectives: soil indicator	74
Table 5.3-3: Data quality control: hydrology	75
Table 5.4-1: Performance  requirements for water chemistry analytical methods	77
Table 5.4-2. Sample processing  quality control activities: water chemistry indicator	79
Table 5.4-3. Laboratory Quality Control Samples: Water Chemistry Indicator	80
Table 5.4-4: Data validation quality control: water chemistry indicator	83
Table 5.4-5. Data Reporting Criteria: Water Chemistry Indicator	83
Table 5.5-1. Field and laboratory methods:  Diatoms	86
Table 5.5-2. Field and laboratory methods:  Soft Algae	86
Table 5.5-3. Performance Requirements for chlorophyll a Analytical Methods	88
Table 5.5-4. Field Sample Processing Quality Control:  chlorophyll a Samples	90
Table 5.5-5. Sample Processing Quality Control: Composite and chlorophyll a Samples	93
Table 5.5-6: Lab sample processing quality controls: chlorophyll a	93
Table 5.5-7. Laboratory Quality Control: Composite Sample  (Diatoms and Soft Algae)	95
Table 5.5-8. Data validation quality control:  chlorophyll a indicator	95
Table 5.5-9. Data reporting criteria: chlorophyll a indicator	96
Table 5.7-1. Measurement data quality objectives: vegetation indicator	98

                                  TABLE OF FIGURES

Figure 1-1:  Relationship between the goals and  objectives of the National Wetland Condition
Assessment and the long-term goals of EPA's current strategic plan (EPA 2006b)	15
Figure 1-2:  NWCA  Project Organization	18
Figure 1-3:  Timeline of NWCA Activities	20
Figure 1-4:  Site verification activities for wetland field surveys	25
Figure 1-5:  Summary of field activities and site sampling	26
Figure 3-1:  NWCA  2011 Survey  Design Summary Map	43
Figure 4-1:  Organization of information management system modeled after EMAP Surface
Water Information Management  (SWIM) system for the NWCA	44
Figure 5-1:  Potential options for plant vouchers collected as  part  of the 2011 NWCA	58
Figure 5-2:  Example PPQ 525-A Regulated Soils Permit	64


Figure 5-3:  General Batch Water Sample Processing Scheme	79
Figure 5-4:  Analysis Activities for Water Chemistry Samples	82
Figure 5-5:  Sample Collection Form	91

  1   PROJECT PLANNING AND MANAGEMENT

 1.1  Introduction

Several recent reports have identified the need for improved water quality monitoring and
analysis at multiple scales. In 2000, the General Accounting Office (USGAO, 2000) reported
that the U.S. Environmental Protection Agency (EPA), states, and tribes collectively cannot
make statistically valid inferences about water quality (via 305[b] reporting) and lack data to
support key management decisions. In 2001, the National Research Council (NRC, 2000)
recommended that EPA, states, and tribes promote a uniform, consistent approach to ambient
monitoring and data collection to support core water quality programs. In 2002, the H. John
Heinz III Center for Science, Economics, and the Environment (Heinz Center, 2002) found that
there are inadequate data for national reporting on freshwater, coastal, and ocean water quality
indicators. The National Academy of Public Administration (NAPA, 2002) stated that improved
water quality monitoring is necessary to help states and tribes make more effective use of
limited resources. EPA's Report on the Environment 2003 (USEPA, 2003) states that there is
insufficient information to provide a national answer, with confidence and scientific credibility, to
the question, "What is the condition of U.S. waters and watersheds?"

The most commonly cited and scientifically valid sources of national-scale wetland information
are the U.S. Fish and Wildlife Service (FWS) Wetlands Status and Trends Reports (S&T
Report), which have documented trends in wetland acreage since the 1950s. The most recent
report, published in 2005, documented an annual net increase of 32,000 wetland acres from
1998 to 2004. At the same time, the report documented significant increases in freshwater ponds
(12 percent) and alarming decreases in highly productive emergent marshes and coastal
wetlands (Dahl 2005). In fact, a follow-up study recently published by the National  Oceanic and
Atmospheric Administration (NOAA) and U.S.  FWS showed that wetlands in coastal watersheds
of the eastern U.S. decreased at a rate of approximately 60,000 acres  per year during the same
study period  (Stedman and  Dahl,  2009). It is vitally important for wetland managers to
understand the causes and  sources of this loss to inform implementation of appropriate
management measures. While the S&T Report is  an invaluable source of information on trends
in wetland acreage and class,  it does not provide data on wetland condition.

In response to these needs, EPA's Office of Water (OW), in concert with EPA's Office of
Research and Development (ORD), the 10 EPA Regions, states, and tribes, has begun a
program to assess the condition of the nation's waters via a statistically valid approach. The
current assessment is the National Wetland Condition Assessment (NWCA), a national
assessment of the condition of the Nation's wetlands in the conterminous U.S. It is the first
wetland assessment based on data collected consistently using the same field and laboratory
protocols and on a statistical survey design that allows inferences about all wetlands from a
sample of wetlands across the country. The NWCA will provide important information about the
condition of the nation's wetland resources and key stressors at national and regional scales.
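
To make the design-based inference concrete, the sketch below (an illustration only, not part of
the NWCA analysis plan) shows how design weights attached to probability-selected sites can be
expanded into a population estimate; the site IDs, weights, and condition classes are hypothetical.

    # Illustrative sketch only: a design-based (Horvitz-Thompson style) estimate of the
    # proportion of the wetland resource in a given condition class, computed from
    # hypothetical probability-survey records (site ID, design weight in acres represented,
    # condition class).
    sites = [
        ("NWCA11-0001", 12500.0, "good"),
        ("NWCA11-0002",  8300.0, "fair"),
        ("NWCA11-0003", 15100.0, "good"),
        ("NWCA11-0004",  6200.0, "poor"),
    ]

    def weighted_proportion(records, condition):
        """Fraction of the represented wetland area falling in `condition`,
        using the design weights as expansion factors."""
        total = sum(w for _, w, _ in records)
        in_class = sum(w for _, w, c in records if c == condition)
        return in_class / total

    print(f"Estimated proportion in 'good' condition: {weighted_proportion(sites, 'good'):.2f}")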

EPA developed this Quality Assurance Project Plan (QAPP) to support project participants and
to ensure that the final assessment is based on high quality data and information. The QAPP
contains elements of the overall project management,  data quality objectives, measurement and
data acquisition, and information management for NWCA. EPA recognizes that states and tribes
may add elements to the Survey, such as supplemental indicators, that are not covered in the
scope of this integrated QAPP. EPA expects that any supplemental elements be addressed by
the states, tribes, or their designees, in a separate approved QAPP or an addendum to this
QAPP. The NWCA participants have agreed to follow this QAPP and the protocols and design
laid out in this document.

This cooperative effort between states, tribes, and federal agencies makes it possible to
produce a broad-scale assessment of the condition of the Nation's wetlands with both
confidence and scientific credibility. Development of the NWCA will build on the
accomplishments of the USFWS and their production of national reports on status and trends in
wetland acreage. When taken together, the NWCA and the S&T Report results will be used to
measure progress toward attainment of the national goal to increase the quantity and quality of
the Nation's wetlands. These complementary studies will influence how wetlands are managed
at local, state, and national scales (Scozzafava et al. 2007).

USEPA will collaborate with state, tribal, federal, and other partners to implement the NWCA to
meet three goals:

   1.   Produce a report that describes the ecological condition of the Nation's wetlands and
       ranks the predominant stressors associated with poor wetland condition.

   2.   Assist states and tribes in the implementation of wetland monitoring and assessment
       programs that will guide policy development and aid project decision-making.

   3.   Advance the science of wetlands monitoring  and assessment to support management
       needs.

Through the framework of its goals and objectives, the NWCA addresses the long-term goals
outlined in the Agency's current strategic plan (EPA 2006b) to improve the Nation's water
quality and to protect, sustain, and restore the health of critical natural habitats and ecosystems,
including wetlands (Figure 1-1).

   Figure 1-1: Relationship between the goals and objectives of the National Wetland Condition
        Assessment and the long-term goals of EPA's current strategic plan (EPA 2006b)
[Figure graphic: The 2011 NWCA goals (National Report; State and Tribal Capacity; Advance the
Science) are linked to the assessment objectives (report for the entire nation and within
ecoregions; report by wetland class for the entire nation; report by state or tribal area, provided
the state or tribe invests additional resources to conduct an intensification study; develop
scalable methods and indicators; provide field training and equipment; encourage and support
state or tribal intensification studies; share survey data, including reference data; develop plans
for continued monitoring, post survey, to identify wetland condition trends through time; integrate
reporting of wetland acreage with condition assessment; consider how wetland condition impacts
the delivery of ecosystem services; consider how climate change will affect wetland condition)
and to EPA Long-term Goal 2.3 (Improve Water Quality) and Goal 4.3 (Protect, Sustain, and
Restore the Health of Critical Natural Habitats and Ecosystems).]

 1.2  NWCA Project Organization

A comprehensive quality assurance (QA) program, including assigning roles and
responsibilities, was established to ensure data integrity and provide support for the reliable
interpretation of the findings from this project. The responsibilities and accountability of the
various principals and cooperators are described here and illustrated in Figure 1-2. The overall
coordination of the project will be provided by EPA's Office of Water (OW) in Washington, DC,
with support from EPA's Office of Research and Development (ORD). Each EPA Regional
Office has identified a Regional EPA Coordinator to provide the critical link with state and tribal
partners. State and Tribal Cooperators will work with their Regional EPA Coordinator to address
any technical issues. In addition, the National Wetlands Monitoring and Assessment Work
Group  (NWMAWG) is a group of experts from the federal family, states, tribes, academia, and
other cooperators.  Subsets of the NWMAWG membership will be tasked to decide on the most
appropriate approaches for key technical issues, such as: (1) the selection and establishment of
reference conditions based on least-disturbed sites and expert consensus for characterizing
benchmarks for assessment of ecological condition; (2) selection and calibration of ecological
endpoints and attributes of the biota and relationship to stressor indicators; (3) a data analysis
plan for interpreting the data and addressing the objectives in a nationwide assessment; and (4)
a framework for reporting the condition assessment and conveying the information on the
ecological status of the Nation's wetlands.

Contractor support is provided for all aspects of this project. Contractors will provide support
ranging from survey implementation, field sampling, and laboratory processing to data
management, data analysis, and report writing. Cooperators will interact with their Regional
EPA Coordinator
and the EPA Project Leader regarding contractual services.

The primary responsibilities of the principals and cooperators are as follows:

Project Leader - Michael Scozzafava, OW

   •   Provides overall coordination of the project and makes decisions regarding the proper
       functioning of all aspects of the project.
   •   Makes assignments and delegates authority, as needed to other parts of the project
       organization.

Alternate Project Leaders - Chris Faulkner and Gregg Serenbetz, OW

   •   Assist the EPA Project Leader with coordination and assume responsibility for certain
       aspects of the project, as agreed upon with the EPA Project Leader.
   •   Serve as primary point-of-contact for project coordination in the absence or
       unavailability of the Project Leader.
   •   Serve on the Technical Experts Workgroup and interact with the Project Leader on
       technical, logistical, and organizational issues on a regular basis.

QA Assistance Visit Coordinator - Regina Poeske, Region 3
   •   Assists in the implementation of the QA program.
   •   Coordinates all field and laboratory quality assurance visits.
EPA Project QA Lead - Sarah Lehmann, OW
   •   Provides leadership, development and oversight of project-level quality assurance for
       NWCA in Office of Water

Technical Advisor - Mary Kentula, ORD (Teresa Magee, alternate)
   •   Advises the Project Leader on the relevant experiences and technology developed
       within ORD that are to be used in this project.
   •   Facilitates consultations between NWCA personnel and ORD scientists.

EPA (OWOW) QA Officer - Margarete Heber, OW
   •   Functions as the primary officer overseeing all QA and quality control (QC) activities.
   •   Responsible for ensuring that the QA program is implemented thoroughly and
       adequately to document the performance of all activities.

Regional EPA Coordinators
   •   Assist Project Leader with regional coordination activities.
   •   Serve on the NWMAWG and interact with Project Leader on technical,  logistical, and
       organizational issues on a regular basis.


   •   Serve as primary points-of-contact for the Cooperators.

Contractor Technical Representative

   •   Provides contractor support to the project and works with Project Leader to ensure all
       needs for contractor support are covered.

Study Design Manager - Tony Olsen, ORD

   •   Coordinates with the Project Lead, REPACs, NWMAWG, and Field Implementation
       Coordinator to develop and manage the Sampling Frame, select sampling locations,  and
       track field evaluation and site reconnaissance.

[Organization chart: Project Management (Project Lead - Mike Scozzafava, OW; Alt. Project
Leads - Chris Faulkner and Gregg Serenbetz, OW; Project QA - Sarah Lehmann, OW; QA
Assistance Visit Coordinator - Regina Poeske, Region 3; Technical Advisors - Mary Kentula and
Teresa Magee, ORD; Regional EPA Coordinators; Contractor Technical Representative (TBD)),
with OWOW QA Oversight and Review (Margarete Heber, OW) and Study Design (Tony Olsen,
ORD). Supporting functions: Field Protocols (NWMAWG, ORD, OW); Field Logistics (Field
Implementation Coordinator); Training (ORD, EPA Regions, Contractors); Field Implementation
(Field Crew Leaders; State, Tribal, and contractor crews); QA Team (OW, Regional, State, Tribal,
and Contractor QA Officers). Sample flow: Vegetation (States, EcoAnalysts); Algae (States,
EcoAnalysts); Soils (NRCS); Algal Toxins (USGS KS); Water Quality (WED-Dynamac); Stressors
(WED-Dynamac); Rapid Assessment (N/A). Information Management (Information Management
Coordinator); Data Archiving (STORET); Data Analysis & Assessment (Data Analysis Team).]
Figure 1-2: NWCA Project Organization


NWMAWG - States, EPA, academics, other federal agencies

   •   Provides expert consultation on key technical issues as identified by the EPA Project
       Management team and works with  Project Lead to resolve approaches and strategies to
       enable data analysis and interpretation to be scientifically valid.

Logistics Coordinator - Dennis McCauley, Great Lakes Environmental Center (GLEC)

   •   A contractor who supports implementation of the project based on technical guidance
       established by the EPA Project Leader and Alternate EPA Project Leader; serves as the
       point-of-contact for questions from field crews and cooperators for all activities.
   •   Tracks progress of field sampling activities.
   •   Tracks progress of lab activities.

Field Crew Leader

   •   Functions as the senior member of each Cooperator's field sampling crew and the point
       of contact for the Field Implementation Coordinator.
   •   Responsible for overseeing all activities of the field sampling crew and ensuring that the
       Project field method protocols are followed during all sampling activities.

Cooperator(s) - States, Tribes,  academics, other federal agencies, contractors

   •   Under the scope of their assistance agreements, plan and execute their individual
       studies as part of the cross jurisdictional NWCA, and adhere to all QA requirements and
       standard operating procedures (SOPs).
   •   Interact with the Regional EPA Coordinators, Field Implementation Coordinator and  EPA
       Project Leader regarding technical, logistical, organizational issues.

QA Team

   •   Oversees the transfer of samples and related records for each indicator.
   •   Ensures the validity of data for each indicator.

Regional QA Project Officer(s)

   •   Oversee(s)  individual studies of cooperators (assistance recipients).
   •   Interacts with EPA Project Leader and Field Implementation Coordinator on issues
       related to sampling design, project  plan, and schedules for conduct of activities.
   •   Collects copies of all official field forms, field evaluation checklists and reports.
   •   Oversees and maintains records on field evaluation visits,  but is not a part of sampling
       teams.

State QA Officer(s)

   •   Work(s) with EPA Project Lead, Regional EPA Coordinators, and Field Implementation
       Coordinator to ensure that the NWCA field protocols and QA protocols are carried out.

Contractor QA Officer(s)

   •   Supervises the implementation of the QA program.

   •   Directs the field and laboratory audits and ensures the field and lab auditors are
       adequately trained to correct errors immediately to avoid erroneous data and the
       eventual discarding of information from the assessment.

Information Management Coordinator - Marlys Cappaert, WED-SRA

   •   A contractor who supports implementation of the project based on technical guidance
       established by the EPA Project Leader and Alternate EPA Project Leader; oversees all
       sample shipments and receives data forms from the Cooperators.
   •   Oversees all aspects of data entry and data management for the project.
Data Analysis Team - EPA OW, ORD, Regions, FWS, NRCS, States, Michigan State Univ,
                      Kenyon College, Utah State Univ, Oregon State Univ, and contractors

   •   Develops and implements the NWCA Data Analysis plan which details our approach for
        analyzing acquired data, generating metadata, developing one or more indices of biotic
        integrity (IBIs), discussing
       findings with stakeholders in workshops and other venues,  and contributing to the final
       report.

Program-level QA will be the responsibility of the OWOW QA Officer and the Project QA Officer.
A records system will be used to maintain a permanent hardcopy file of all NWCA
documentation, from site selection to data analysis. This file will be housed in the OW
Headquarters office.
 1.2.1 Project Schedule
The U.S. EPA has responded to a State and OW goal to report on the quality of the Nation's
wetlands no later than December 2013. Tasks leading up to the final report are described
throughout the QAPP.
[Timeline graphic, 2011-2013: Site Evaluation/Recon; Field Team Training; Field
Sampling/Shipping; Field Evaluation (Audits); Sample Processing; Lab Evaluation (Audits); Data
Management (QA/QC) Integration; Data Analysis; Report Preparation; Report Review; Peer
Review; Final Report Production.]
Figure 1-3: Timeline of NWCA Activities


 1.3  Scope of QA Project Plan

This QA Project Plan addresses all aspects of the data acquisition efforts of the NWCA, which
focuses on the 2011 sampling of wetland sites in the conterminous United States. Analysis of
data from approximately 1000 sites (selected with a probability design) will provide a
comprehensive assessment of the Nation's wetlands. Relevant companion documents to this
QAPP include the following, which can be found at the NWCA web site
(http://water.epa.gov/type/wetlands/assessment/survey/index.cfm):

   •   NWCA:  Site Evaluation Guidelines (EPA 843-R-10-004)
   •   NWCA:  Field Operations Manual (EPA 843-R-10-001)
   •   NWCA:  Laboratory Methods Manual (EPA 843-R-10-002)
   •   Ecological Indicators for the 2011 National Wetland Condition Assessment (in
       preparation)

 1.3.1 Overview of Field Operations

Field measurements and samples are collected by trained teams. Typically, each Field Crew is
composed of four members, divided into the Vegetation (Veg) Team and the Assessment Area
and Buffer (AB) Team. The number and size of crews depends on the duration of the sampling
window, geographic distribution of sampling locations, number and complexity of samples and
field measurements, and other factors. The two teams will work closely with each other, and
coordinate sampling activities.

1.3.1.1  Field Crew Duties and Qualifications

The NWCA Veg Team is composed of a Botanist/Ecologist and a Botanist Assistant.
Primary responsibilities for the Veg Team include:

   1.   Laying out the Assessment Area (AA) and vegetative plots;
   2.   Collecting high quality plant ecological data (including species identities, presence and
       cover of individual species, presence and cover of vertical vegetation strata, and counts
       of larger trees);
   3.   Collecting other information related to vegetation condition;  and
   4.   Collecting and processing plant specimens.

The Veg Team carefully follows protocols to make onsite decisions regarding layout and set-up
of the vegetation plots within the assessment area and to collect  ecological data. Accurate plant
species identification is critical to data quality. Careful descriptions of diagnostic characteristics,
habitat, and plant associations will be documented. Plant specimens must be collected for all
unknown taxa and quality assurance taxa, which will be later identified by expert taxonomists.
Careful attention to providing tracking information for all specimens is essential.

In addition,  NWCA will provide Veg Team members with training  on study goals, vegetation
sampling methods, field protocols, and plant collection requirements. Training will prepare the
Team to accurately complete data and specimen collection tasks.

In addition to the skills developed in the training, the Botanist/Ecologist will have the following
minimum qualifications:

   •   Understanding of basic wetland plant ecology.

   •   Familiarity with regional flora and proficiency in identifying common wetland plant
       species:
       o  capable of sight recognition of most dominant species to the level of genus and
          species, provided plants are at the proper phenological stage; or
       o  capable of sight recognition of dominant species to the family, and proficiency in
          keying in the field.

   •   Proficiency in keying many unknown plants (e.g., forbs, shrubs, trees) to species using
       regionally appropriate floras and diagnostic keys.

   •   Ability to recognize difficult graminoid taxa as Poaceae (grasses), Juncaceae (rushes),
       or Cyperaceae (sedges, bulrushes, spikerushes), and to distinguish unknown species
       within these families or genera from one another.

   •   College course-work in plant taxonomy or systematics that included field identification of
       plant species, and/or excellent references attesting to proficiency in botanical
       identification.

   •   Previous experience conducting botanical or ecological field work, including the
       collection and preservation of plant specimens.
All Botanist/Ecologist applicants will send their Curriculum vitae and references to the Regional
Lead. The Project Lead and RMCs will review and verify the qualifications of all applicants prior
to the applicants joining a Field Crew. If a State is unable to identify a Botanist/Ecologist, EPA
will work with the State to identify a
Botanist/Ecologist.
The NWCA AB Team is composed of two crew members, whose primary responsibilities include:

    1.  Collecting high-quality biological (e.g., % vegetative cover, water quality), hydrology,
        soils, and stressor data following the FOM and USA RAM protocols; and
    2.  Collecting and processing soil, algae, and water chemistry specimens.

The AB Team carefully follows protocols in both the FOM and USA RAM to make onsite
decisions regarding the collection of ecological data and placement of soil pits. All samples
(algae, soil, water chemistry) must be carefully collected, preserved, packed, and catalogued for
tracking.

            Field Crew Training

Each Field Crew Leader and Botanist/Ecologist must be trained at an EPA-sponsored training
session prior to the start of the field season, along with as many crew members as possible. The
training program stresses hands-on practice of methods, comparability among crews, collection
of high quality data and samples, and safety. Training will be provided in ten central locations for
cooperators and contractors. Project organizations responsible for training oversight are
identified in Figure 1-2. Training documentation will be maintained by the Project QA Officers.


AB Team members should have the following skills/abilities:

   •   Some previous experience conducting ecological field work;

   •   Ability to recognize evidence of human (or natural) landscape disturbance from the
       present or recent past;

   •   Ability to use common field equipment (compass, GPS, laser rangefinder, stadia rods,
       etc.);

   •   Experience measuring basic physical characteristics of soil;

   •   Knowledge of regional hydric soil indicators; and

   •   Knowledge of hydrogeomorphic classification.


In addition, NWCA will provide AB Team members with training on study goals, biological and
physical sampling methods, field protocols, and soil collection requirements. Training will
prepare the Team to accurately complete data and tracking tasks.

 1.3.1.2 Field Operations Timeline

Field data acquisition activities are implemented for the NWCA (Table 1.3-1), based on guidance
developed for earlier EMAP studies (Baker and Merritt 1990).

Table 1.3-1. Critical logistics elements (from Baker and Merritt, 1990)
 Logistics Plan Component            Required Elements
 Project Management                  Overview of Logistic Activities; Staffing and Personnel
                                     Requirements; Communications
 Access and Scheduling               Sampling Schedule and Site Access Reconnaissance
 Safety                              Safety Plan; Waste Disposal Plan
 Procurement and Inventory Control   Equipment, Supplies, and Services Requirements;
                                     Procurement Methods and Scheduling
 Training and Data Collection        Training Program; Field Operations Scenario; Laboratory
                                     Operations Scenarios; Quality Assurance; Information
                                     Management
 Assessment of Operations            Field Crew Debriefings; Logistics Review and
                                     Recommendations

Pre-Field Visit Activities

Survey preparation is initiated with selection of the sampling locations by EPA's Office of
Research and Development (WED in Corvallis). The list of sampling locations is distributed to
the EPA Regional Wetland Monitoring Coordinators and cooperators. With the sampling location
list, State and Tribal cooperators can decide to what level they wish to participate (vs. requesting
in-kind assistance). Participating State and Tribal Field Crews can then begin site
reconnaissance on both the primary sites and alternate/replacement sites (known as base and
oversample locations, respectively) and begin work on obtaining access permission to each
site.¹

¹ Specific procedures for evaluating each sampling location and for replacing target sites are
documented in the NWCA: Site Evaluation Guidelines (USEPA, 2010[a]).

          Equipment Use During NWCA Field Activities

The timely receipt, proper use (including inspection and calibration), and maintenance of
appropriate equipment are important contributors to acquiring quality data.

The Field Crews will use standard field equipment and supplies, which are provided by EPA and
contractors. The Field Implementation Coordinator will work with Regional Wetland Monitoring
Coordinators, Cooperators, States, Tribes, and Contractors to make certain the Field Crews
have the required equipment and supplies in a timely fashion. Detailed lists of equipment
required for each field protocol are contained in the NWCA: Field Operations Manual (USEPA
2010[b]).

Also, some sampling locations require teams to hike to them, transporting all equipment in
backpacks. For this reason, ruggedness and weight are important considerations in the
selection of equipment and instrumentation. In addition, Field Crews may need to camp out at
the sampling location, and if this is the case they must be equipped with the necessary camping
equipment.

The Field Crews will be responsible for the inspection, maintenance, and calibration of the
equipment they use. Detailed information (including guidance) on equipment inspection,
maintenance, and calibration is contained in the NWCA: Field Operations Manual (USEPA
2010[b]).

Field Crews need to acquire permission to access sites on private property, as well as permits
to access and sample federally protected land. The Field Crew Leader should begin contacting
private property owners (and the appropriate federal agency in the case of federally protected
land) as early as 2010. Because the design requires repeat visits to selected sites (i.e., for
sampling), it is important for the Field Crews to do everything possible to maintain good
relationships with landowners. This includes prior contacts, respect of special requests, closing
gates, minimal site disturbance, and removal of all materials including flagging and trash. More
details on the timing and acquisition of property access permissions and permits are found in
the NWCA: Site Evaluation Guidelines (USEPA 2010[a]).

In addition to the initial list of base and oversample sampling locations, Cooperators conducting
field operations (i.e., States and Tribes that decide to conduct field operations themselves, and
contractors performing in-kind support) will also receive and develop Site Packets for the base
locations. Each Site Packet will contain the following applicable information:

    •   topographic and aerial maps with the POINT location (lat/long) marked,

    •   copies of written access permissions,


   •   scientific collection permits,

   •   information brochures on the program for interested land owners,

   •   road maps, and local area emergency numbers.
Field Visit Activities
The site verification process is shown in Figure 1-4. Upon arrival at a site, the POINT location is
verified by a Global Positioning System (GPS) receiver. Samples and measurements for various
indicators are collected in a specified order (Figure 1-5). This order has been set up to minimize
the impact of sampling for one indicator upon subsequent indicators; for example, vegetation
sampling is to be completed before soil pits are dug and sampled. All methods are fully
documented in step-by-step procedures in the NWCA: Field Operations Manual (USEPA
2011[b]). The manual also contains detailed instructions for completing documentation, labeling
samples, any field processing requirements, and sample storage and shipping. Any revision of
methods must be approved in advance by the EPA Project Leader. Field communications will
be available through Field Coordinators, regularly scheduled conference calls, a
Communications Center, or an electronic  distribution.

[Flowchart: Locate POINT on map; conduct preliminary evaluation (desktop/office); if permission
to access is granted (Yes/Maybe), verify the POINT on site; if the POINT is in or near a target
wetland, sample it. If permission to access is denied, logistical and safety constraints prevent
sampling, or the POINT is not in or near a target wetland, select an alternate POINT.]
Figure 1-4: Site verification activities for wetland field surveys
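
As a reading aid, the sketch below restates the Figure 1-4 decision sequence in code; the field
names and return values are hypothetical, and the NWCA: Site Evaluation Guidelines (USEPA,
2010[a]) remain the authoritative description of this process.

    # Minimal sketch of the Figure 1-4 site verification flow for one candidate POINT.
    # Attribute names and status strings are hypothetical, not NWCA data elements.
    from dataclasses import dataclass

    @dataclass
    class CandidatePoint:
        site_id: str
        access_granted: bool             # landowner or agency permission obtained
        safe_and_accessible: bool        # no logistical or safety constraints
        in_or_near_target_wetland: bool  # verified on site with GPS

    def evaluate_point(p: CandidatePoint) -> str:
        """Return 'sample' if the POINT can be sampled, otherwise 'select alternate POINT'."""
        if not (p.access_granted and p.safe_and_accessible and p.in_or_near_target_wetland):
            return "select alternate POINT"
        return "sample"

    print(evaluate_point(CandidatePoint("NWCA11-0001", True, True, True)))  # -> sample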

[Figure graphic: time-of-day schedule of field activities. Vegetation (VEG) Team (2 persons): site
verification and description (verify wetland, determine HGM class, locate POINT, define
Assessment Area (AA), annotate site aerial photo, document with reference photos) and
intensive vegetation characterization (establish sampling plots; species composition and
abundance; alien and invasive species cover; community types and distribution; vegetation
structure and productivity; total absolute distribution and cover of growth forms; physiology and
plant stress; collect, label, and press unknown and voucher plants). Assessment Area/Buffer
(AB) Team (2 persons): USA RAM assessment (indicators of condition, stressors); hydrology
(identify water sources, hydrology stressors and rank, ditch depth, drift lines, surface water
extent); algae (composite taxonomic ID samples from benthic and vegetation habitats; benthic,
epiphytic, and phytoplankton toxin samples; chlorophyll-a); buffer (establish buffer and buffer
plots; natural cover, agriculture and rural stressors; hydrology stressors; targeted alien species);
soils (dig pits for analysis; composite samples for bulk density and soil chemistry; soil
disturbance; soil profile description, including organic layers and texture; soil isotope sample);
stressors (collected as part of the other indicators). Final on-site activities: conduct visual
assessment; review field data forms; prepare or collect any remaining algae, soil, or other
samples; inspect and package samples; clean up wetland site; clean equipment and crew of
plant propagules; Crew Chief final check-off. Next-day activities: fill out packing forms; ship
samples and original data forms (after copying data forms); call, fax, or e-mail status report;
travel to next wetland site.]
Figure 1-5: Summary of field activities and site sampling

Standardized field data forms are provided to the Field Crews as the primary means of data
recording. On completion, the data forms are reviewed by a Field Crew member other than the
person who initially entered the information. Prior to departure from the field site, the Field Crew
Leader reviews all forms and labels for completeness and legibility and ensures that all samples
are properly labeled and packed.  Each site has a unique identifier provided by the design. All
samples from a site must be labeled with this unique identifier.
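
A quick automated completeness check of this labeling rule might look like the following sketch
(illustrative only; the sample types and label formats shown are hypothetical).

    # Illustrative sketch only: flag any sample labels recorded for a visit that do not
    # carry the site's unique identifier. Labels below are hypothetical examples.
    def labels_missing_site_id(site_id, sample_labels):
        """Return the labels that do not contain the site's unique identifier."""
        return [label for label in sample_labels if site_id not in label]

    labels = ["NWCA11-0042-SOIL-1", "NWCA11-0042-CHEM-1", "CHLA-1"]  # last label is deficient
    bad = labels_missing_site_id("NWCA11-0042", labels)
    if bad:
        print("Labels missing the site identifier:", bad)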

Post-Field Visit Activities
Upon return from a field sampling site (either to the Field Crew's  home office or to a motel),
completed data forms are sent to the Information Management Staff at WED for entry into a
computerized database. At WED, electronic data files are reviewed independently to verify that
values are consistent with those recorded on the field data form or original field data file (refer to
section 4.1.4 of this document for more information).

Samples are stored or packaged for shipment in accordance with instructions contained in the
Field Operations Manual (USEPA 2010[b]). Samples which must be shipped are delivered to a
commercial carrier. The recipient  is notified to expect delivery; thus, tracing procedures can be
initiated quickly in  the event samples are not received. Bills  of lading and chain-of-custody forms
are completed for  all transfers of samples maintained by the labs, with copies also  maintained
by the field team. The Logistics Coordinator maintains a centralized tracking system of all
shipments.

The field operations phase is completed with collection of all samples or expiration of the
sampling window.  Following completion  of all sampling, a debriefing session will be scheduled
(see Table 1). These debriefings cover all aspects of the field program and solicit suggestions
for improvements.

 1.3.2 Overview of Laboratory Operations

Holding times for samples vary with the sample types and analytes. Thus, some analyses begin
during sampling (e.g., in situ profiles), while others are not initiated until sampling has been
completed (e.g., algae). Analytical methods conducted in the field are described in the Field
Operations Manual, and methods conducted in the laboratory are described in the  Laboratory
Methods Manual (USEPA, 2010[c]).

Chemical, physical, or biological analyses may be performed in-house or by contractor or
cooperator laboratories.  Laboratories providing analytical support must have the appropriate
facilities to properly store and prepare samples and appropriate instrumentation and staff to
provide data of the required quality within the time period dictated by the project. Laboratories
are expected to conduct operations using good laboratory practices. General guidelines for
analytical support laboratories include:

    •  A program of scheduled maintenance of analytical balances, water purification  systems,
       microscopes, laboratory equipment, and instrumentation.

    •  Verification of the calibration of analytical balances using  class "S" weights which are
       certified by the National  Institute  of Standards and Technology (NIST).

   •   Verification of the calibration of top-loading balances using NIST-certified class "P"
       weights.

    •   Checking and recording the composition of fresh calibration standards against the
        previous lot. Acceptable comparisons are within ±2 percent of the theoretical value. (This
        acceptance criterion is tighter than the method calibration criteria.)

   •   Recording all analytical data in  bound logbooks in ink, or on standardized recording
       forms.

   •   Verification of the calibration of uniquely identified daily use thermometers using NIST-
       certified thermometers.

   •   Monitoring and recording (in a logbook or on a recording form) temperatures and
       performance of cold storage areas and freezer units (where samples, reagents, and
       standards may be stored). During periods of sample collection operations, monitoring
       must be done on a daily basis.

   •   An overall program of laboratory health and safety including periodic inspection and
       verification of presence and adequacy of first aid and spill kits; verification of presence
       and performance of safety showers, eyewash stations, and fume hoods; sufficiently
       exhausted reagent storage units, where  applicable; available  chemical and hazardous
       materials inventory; and accessible material safety data sheets for all required materials.

   •   An overall program of hazardous waste management and minimization, and evidence of
       proper waste handling and disposal procedures (90-day storage, manifested waste
       streams, etc.).

    •   If needed, having a source of reagent water meeting the American Society for Testing and
        Materials (ASTM) Type I specifications for conductivity (< 1 µS/cm at 25 °C; ASTM
        1984) available in sufficient quantity to support analytical operations.

   •   Appropriate microscopes or other magnification for biological  sample sorting and
       organism identification.

   •   Labeling all containers used in the laboratory with date prepared, contents, and initials of
       the individual who prepared the contents.

   •   Dating and storing all chemicals safely upon receipt. Chemicals are  disposed of properly
       upon expiration.

   •   Using a laboratory information management system to track the location and status of
       any sample received for analysis.

   •   Reporting results using standard formats and units compatible with the information
       management system.

All laboratories providing analytical support to the NWCA must adhere to the provisions of this
integrated QAPP. Laboratories will provide information documenting  their ability to conduct the
analyses with the required level of data quality before analyses begin. The documentation will
be sent to the EPA Project QA Lead (Sarah Lehmann) at EPA Headquarters. Such information
might include results from inter-laboratory comparison studies, analysis of performance
evaluation samples, control charts and results of internal QC sample or internal reference
sample analyses to document achieved precision, bias, accuracy, and method detection limits.
Contracted laboratories will be required to provide copies of their Data Management Plan.
Laboratory operations may be evaluated by technical systems audits, performance evaluation
studies, and by participation in inter-laboratory sample exchange.

Table 1.3-2. Guidelines for analytical support laboratories

   •   A program of scheduled maintenance of analytical balances, water purification systems,
       microscopes, laboratory equipment, and instrumentation.

   •   Checking and recording the composition of fresh calibration standards against the previous
       lot. Acceptable comparisons are within ±2 percent of the theoretical value.

   •   Recording all analytical data in bound logbooks in ink, or on standardized recording forms.

   •   Monitoring and recording (in a logbook or on a recording form) temperatures and
       performance of cold storage areas and freezer units. During periods of sample collection
       operations, monitoring must be done on a daily basis.

   •   Verifying the efficiency of fume hoods.

   •   If needed, having a source of reagent water meeting the American Society for Testing and
       Materials (ASTM) Type I specifications for conductivity (< 1 µS/cm at 25 °C; ASTM 1984)
       available in sufficient quantity to support analytical operations.

   •   Appropriate microscopes or other magnification for biological sample sorting and organism
       identification.

   •   Labeling all containers used in the laboratory with date prepared, contents, and initials of the
       individual who prepared the contents.

   •   Dating and storing all chemicals safely upon receipt. Chemicals are disposed of properly
       upon expiration.

   •   Using a laboratory information management system to track the location and status of any
       sample received for analysis.

   •   Reporting results using standard formats and units compatible with the information
       management system.
 1.3.3 Data Analysis and Reporting

A technical workgroup convened by the EPA Project Leader is responsible for developing a data
analysis plan that includes verification and validation. These processes are described in the
internal indicator research strategies and summarized in the indicator-specific sections of this
QAPP. Validated data are transferred to the central  National Aquatic Resource Surveys (NARS)
surface waters information management system at WED-Corvallis and managed by Information
Management Staff. Data analysis to support this report will be conducted by the NWCA Data
Analysis Team. Information management  activities in support of this effort are discussed further
in Section 4. Data in the database are available to Cooperators for their own use upon
completion of the final verification and validation. All validated measurement and indicator data
from the NWCA will eventually be transferred to EPA's Water Quality Exchange (WQX), which will
replace the STORET data management system.


 1.3.4 Peer Review

The Survey will undergo a thorough peer review process, where the scientific community and
the public will be given the opportunity to provide comments. Cooperators have been actively
involved in the development of the overall project management, design, methods, and
standards including the drafting of five key project documents:

   •   NWCA:  Quality Assurance Project Plan (EPA 843-R-10-003)

   •   NWCA: Site Evaluation Guidelines (EPA 843-R-10-004)

   •   NWCA: Field Operations Manual (EPA 843-R-10-001)

   •   NWCA: Laboratory Methods Manual (EPA 843-R-10-002)

   •   Ecological  Indicators for the 2011 National Wetland Condition Assessment (in
       preparation)
Outside scientific experts from universities, research centers, and other federal agencies have
been instrumental in indicator development and will continue to play an important role in data
analysis.

The EPA will utilize a three-tiered  approach for peer review of the Survey: (1) internal and
external review by EPA, states, other cooperators and partners, (2) external scientific peer
review, and (3) public review.

Once data analysis is complete, cooperators will examine the results at regional meetings.
Comments and feedback from the cooperators will be incorporated into the draft report.  Public
and scientific peer review will occur simultaneously. This public comment period is an important
part of the process and will provide a broader perspective on the results before the final report is
issued. The public peer review is consistent with the Agency's and OMB's revised requirements
for peer review.

Below are the proposed measures EPA will implement for engaging in the peer review process:

   1.  Develop and maintain a public website with links to standard operating procedures,
       quality assurance documents, fact sheets, cooperator feedback, and final report

   2.  Conduct technical workgroup meetings composed of scientific experts, cooperators, and
       EPA to evaluate and recommend data analysis options and indicators

   3.  Hold national meeting where cooperators will provide input and guidance on data
       presentation and an approach for data analysis
   4.  Complete data validation on all chemical, physical and biological data

   5.  Conduct final data analysis with workgroup to generate assessment results

   6.  Engage peer review contractor to identify external peer review panel

   7.  Develop draft report presenting assessment results
   8.  Conduct regional meetings with cooperators to examine and comment on results

   9.  Develop final draft report incorporating input from cooperators and results from data
       analysis group to be distributed for peer and public review

   10. Issue Federal Register (FR) Notice announcing document availability and hold
       scientific/peer review and public comment (30-45 days)
   11. Consider scientific and public comments and produce a final report


The proposed peer review schedule is provided below and is contingent upon the timeliness of data
validation, the availability of schedules for regional meetings, and the availability of experts for the
data analysis workshop.

May 2011 - December 2011         Data validation
March 15, 2012                   Data analysis workshop
May - August 2013                Internal peer review meetings with states, cooperators, participants
October 19, 2013                 Release of draft for external peer and public review

  2   DATA QUALITY OBJECTIVES

It is U.S. EPA policy that Data Quality Objectives (DQOs) be developed for all environmental
data  collection activities following the prescribed DQO Process. DQOs are qualitative and
quantitative statements that:

   •   Clarify study objectives;

   •   Define the appropriate types of data; and

   •   Specify the tolerable levels of potential decision errors.

These statements are the basis for establishing the quality and quantity of data needed to
support decisions (EPA 2006). Data Quality Objectives thus  provide the criteria to design a
sampling program within cost and resource constraints or technology limitations imposed upon
a project or study.

DQOs are typically expressed in terms of acceptable uncertainty (e.g.,  width of an uncertainty
band or interval) associated with a point estimate at a desired level of statistical confidence
(EPA 2006). The DQO Process is used to establish performance or acceptance criteria, which
serve as the basis for designing a plan for collecting data of sufficient quality and quantity to
support the goals of a study (EPA 2006). As a general rule, performance criteria represent the
full set of specifications needed to design a data or information collection effort that, when
implemented, generates newly collected data of sufficient quality and quantity to address the
project's goals (EPA 2006). Acceptance criteria are specifications for evaluating
the adequacy of existing sources of information or data as being acceptable to support the
project's intended use (EPA 2006).

 2.1  Data Quality Objectives for the National Wetland Condition
      Assessment

Target DQOs established for the NWCA relate to the goal of describing the current status of
selected indicators of wetland condition in the conterminous U.S. and in ecoregions of interest.

The formal statement of the DQO for national estimates is as follows:

   Estimate the proportion of wetlands (± 5%) in the conterminous U.S. that fall below
   the designated threshold for good conditions for selected measures with 95%
   confidence.

For the ecoregions of interest the DQO is:

   Estimate the proportion of wetlands (± 15%) in a specific ecoregion that fall below
   the designated threshold for good conditions for selected measures with 95%
   confidence.
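
As a rough illustration of what these targets imply, the sketch below computes the half-width of an
approximate 95% confidence interval for an estimated proportion under a simple random sampling
assumption. The function name and numbers are illustrative only; actual NWCA estimates rely on the
GRTS survey design and design-based variance estimators rather than this simplification.

    import math

    def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
        """Half-width of an approximate 95% confidence interval for a proportion,
        assuming simple random sampling (the NWCA's GRTS design and unequal
        weights require design-based variance estimators instead)."""
        return z * math.sqrt(p_hat * (1.0 - p_hat) / n)

    # Worst case (p = 0.5): sample size needed for a +/- 5% margin at 95% confidence
    print(math.ceil((1.96 ** 2 * 0.25) / 0.05 ** 2))       # 385 sites
    print(round(margin_of_error(0.5, 900), 3))             # margin with ~900 sites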

 2.2  Measurement Quality  Objectives

For each parameter, performance objectives (associated primarily with measurement error) are
established for several different data  quality indicators (following USEPA [2002a]). Specific
measurement quality objectives (MQOs) for each parameter are presented in the indicator
section of this QAPP. The following sections define the data quality indicators and present
approaches for evaluating them against acceptance criteria established for the program.

 2.2.1 Laboratory Reporting Level (Sensitivity)

Generally, NWCA laboratory analyses can be divided into taxonomic metrics (i.e., vegetation and
algae, addressed in Section 2.2.3) and non-taxonomic metrics. Non-taxonomic metrics are
further broken out into physical metrics and chemical metrics.

For physical and chemical measurements, requirements for the method detection  limit (MDL)
are typically established. The MDL is defined as the lowest level of analyte that can be
distinguished from zero with 99 percent confidence based on a single measurement (Glaser et
al., 1981). The U.S. Geological Survey National Water Quality Laboratory (USGS NWQL) has
developed a variant of the MDL called the long-term MDL (LT-MDL) to capture greater method
variability (Oblinger Childress et al. 1999). Unlike the MDL, it is
designed to incorporate more of the measurement variability that is typical for routine analyses
in a production laboratory, such as multiple instruments, operators, calibrations, and sample
preparation events (Oblinger Childress et al. 1999).  Because the LT-MDL addresses more
potential sources of variability than the MDL, the NWCA uses the LT-MDL.

The LT-MDL determination ideally employs at least 24 blanks and spiked samples prepared and
analyzed by multiple analysts on multiple instruments over a 6- to 12-month period at a
frequency of about two samples per month (EPA 2004). The LT-MDL uses "F-pseudosigma"
(Fσ) in place of s, the sample standard deviation, used in the EPA MDL calculation. F-
pseudosigma is a non-parametric measure of variability that is based on the interquartile range
of the data (EPA 2004). The LT-MDL is calculated using either the mean or median of a set of
long-term blanks, and from long-term spiked sample results (depending on the analyte and
specific analytical method). The LT-MDL for an individual analyte is calculated as:

Equation 1a                 LT-MDL = M + t(n-1, 0.99) x Fσ

       where:
              M = the mean or median of blank results
              t(n-1, 0.99) = the Student's t value for n-1 degrees of freedom at the 99 percent
              confidence level
              n = the number of spiked sample results
              Fσ = F-pseudosigma, a nonparametric estimate of variability calculated as:

Equation 1b                 Fσ = (Q3 - Q1) / 1.349

       where:
              Q3 = the 75th percentile of spiked sample results
              Q1 = the 25th percentile of spiked sample results

LT-MDL is designed to be used in conjunction with a laboratory reporting level (LRL;  Oblinger
Childress et al. 1999). The LRL is designed to achieve a risk of <1% for both false negatives
and false positives (Oblinger Childress et al. 1999). The LRL is set as a multiple of the LT-MDL,
and is calculated as follows:

                      LRL = (2 x LT-MDL)/fractional spike recovery

Where fractional spike recovery is the mean or median recovered spike concentration divided
by the expected spike concentration. For example, at 50% recovery, LRL is 4 times the LT-
MDL.
Therefore, multiple measurements of a sample having a true concentration at the LRL should
result in the concentration being detected and reported 99 percent of the time (Oblinger
Childress et al. 1999).
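
To make the preceding calculations concrete, the sketch below derives an LT-MDL from hypothetical
blank and spiked-sample results using Equations 1a and 1b and then computes the corresponding LRL.
The data values, function names, and the choice of the median are illustrative assumptions, not
prescribed NWCA procedures.

    import numpy as np
    from scipy import stats

    def lt_mdl(blank_results, spike_results, use_median=True):
        """LT-MDL = M + t(n-1, 0.99) x F-pseudosigma, with F-pseudosigma = IQR / 1.349."""
        blanks = np.asarray(blank_results, dtype=float)
        spikes = np.asarray(spike_results, dtype=float)
        m = np.median(blanks) if use_median else np.mean(blanks)
        q1, q3 = np.percentile(spikes, [25, 75])
        f_sigma = (q3 - q1) / 1.349                      # nonparametric spread estimate
        t_99 = stats.t.ppf(0.99, df=len(spikes) - 1)     # one-sided 99% Student's t
        return m + t_99 * f_sigma

    def lrl(lt_mdl_value, spike_results, expected_spike):
        """LRL = (2 x LT-MDL) / fractional spike recovery."""
        fractional_recovery = np.median(spike_results) / expected_spike
        return 2.0 * lt_mdl_value / fractional_recovery

    # Hypothetical results in arbitrary concentration units
    blanks = [0.01, 0.00, 0.02, 0.01, 0.00, 0.01]
    spikes = [0.48, 0.52, 0.47, 0.55, 0.50, 0.49, 0.53, 0.46]
    mdl = lt_mdl(blanks, spikes)
    print(round(mdl, 3), round(lrl(mdl, spikes, expected_spike=0.50), 3))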

All laboratories will develop calibration curves for each batch of samples that include  a
calibration standard with an analyte concentration equal to the LRL. Estimates of LRLs (and
how they are determined)  are required to be submitted with analytical results. Analytical results
associated with LRLs that exceed the objectives are flagged as being associated with
unacceptable LRLs. Analytical data that are below the estimated LRLs are reported, but are
flagged as being below the LRLs.

 2.2.2 Sampling Precision, Bias, and Accuracy

Accuracy is a qualitative term referring to the proximity of a measurement to its "true" value.
Accuracy will be qualitatively evaluated for taxonomic data collected as part of the NWCA, as
described in Section 2.2.3 below; however, it will not be evaluated for all data.  Precision and
bias, on the other hand, are quantitative terms referring to the agreement between multiple
measurements and the distance between those measurements and the true value
(respectively). Precision and bias are estimates of random and systematic error in a
measurement process (Kirchmer 1983; Hunt and Wilson 1986; USEPA 2002a). Collectively,
precision and bias provide an estimate of the total error or uncertainty associated with an
individual measurement or set of measurements. Precision and bias MQOs are developed for
lab measurements. Precision, bias, and accuracy of field measurements will not be monitored
during the NWCA.2
2 Bias, for example, cannot be determined directly, since the "true" values at any particular site are not
known.


Laboratory Measurements
Systematic errors in water and soil chemistry metrics are minimized by using validated methods
and standardized procedures across all laboratories. Precision is estimated from repeated
measurements of samples. Net bias is determined from repeated measurements of solutions of
known composition, or from the analysis of samples that have been fortified by the addition of a
known quantity of analyte. For analytes with large ranges of expected concentrations, MQOs for
precision and bias are established in both absolute and relative terms, following the approach
outlined in Hunt and Wilson (1986). At lower concentrations, MQOs are specified in absolute
terms; at higher concentrations, MQOs are stated in relative terms. The point of transition
between an absolute and relative MQO is calculated as the quotient of the absolute objective
divided by the relative objective (expressed as a proportion, e.g., 0.10 rather than as a
percentage, e.g., 10%). For example, an absolute objective of 0.5 mg/L and a relative objective
of 10 percent give a transition concentration of 0.5 / 0.10 = 5 mg/L; the absolute MQO applies
below that concentration and the relative MQO above it.

Precision based on duplicate measurements (e.g., from revisited POINTs) is estimated based
on the range of measured values  (which equals the  difference for two measurements). The
relative percent difference (RPD) is calculated as:
Equation 1                   RPD = ( |A - B| / ((A + B) / 2) ) x 100

       Where:
                     A = the first measured value
                     B = the second measured value.

    Bias in relative terms (B%) is calculated as:

Equation 2                   B% = ( (x̄ - T) / T ) x 100

       Where:
              x̄ = the mean value for the set of measurements
              T = the theoretical or target value of a performance evaluation sample.
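
A minimal sketch of Equations 1 and 2, using hypothetical duplicate and performance evaluation
results; the numbers are for illustration only.

    def relative_percent_difference(a: float, b: float) -> float:
        """Equation 1: RPD between duplicate measurements A and B."""
        return abs(a - b) / ((a + b) / 2.0) * 100.0

    def relative_bias(measurements, target: float) -> float:
        """Equation 2: bias of a set of measurements relative to the target value T
        of a performance evaluation sample, expressed as a percentage."""
        mean = sum(measurements) / len(measurements)
        return (mean - target) / target * 100.0

    print(round(relative_percent_difference(10.2, 9.8), 2))    # 4.0
    print(round(relative_bias([10.2, 9.9, 10.4], 10.0), 2))    # 1.67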

Precision and bias within each laboratory are monitored for every sample batch by the analysis
of internal QC samples. Samples associated with unacceptable QC sample results are reviewed
and re-analyzed if necessary. Precision and bias across all laboratories will be evaluated after
analyses are completed by using the results of performance evaluation (PE) samples sent to all
laboratories (3 sets of 3 PE samples, with each set consisting of a low, moderate, and high
concentration sample of all analytes).

Field Measurements
Since precision, bias, and accuracy of field measurements will not be monitored during the
NWCA, a revisit site approach will be taken to ensure the quality of data. The survey design
incorporates a plan for repeated sampling of a subset of sites. Data from these repeat visits
provide estimates of important components of variance to evaluate the performance of
ecological indicators. These variance components are presented in Table 2.2-1. If estimates of
these components are available from other studies, they are used in conjunction with  the project
requirements to evaluate alternative design scenarios (Larsen et al. 1995, 2001, 2004). Status
estimates are influenced most by the interaction (if multiple years are required to complete
sampling) and residual variance components.  Residual variance is composed of temporal
variance within a sampling period confounded with measurement error of various types. If the
magnitude of residual variance is sufficiently large to impact status estimates (see above), then
relative magnitudes of the interaction variance and various components of residual variance are
examined to determine if any reduction can be achieved in the future. Interaction variance can
only be reduced by increasing the sample size. Index variance can be reduced by either
increasing the number of sites, increasing the number of times a site is visited within a year,
reducing the length of the index period, or by reducing measurement error. Trend detection is
evaluated using the equation to determine the variance in the slope of the trend (Table 2.2-1). In
this model, residual variance also includes the interaction component. For multi-site networks
such as the national aquatic resource assessments, trend detection is most sensitive to
coherent year variance, which can only be reduced by extending the time period for monitoring
(Larsen et al. 1995, 2001, 2004).  If residual variance is large relative to the coherent year
variance, then trend detection within a fixed time period can be improved by increasing the
number of sites sampled each year, increasing the number of times each site is sampled within
a year, or by reducing measurement error.

Table 2.2-1. Important variance components for aquatic resource assessments
   [Table not reproduced: one column presents the variance model for status estimation and the
   other the variance model for trend detection.]

Control measures to minimize measurement error among crews and sites will be employed. These control measures
include the use of standardized field protocols provided in the Field Operations Manual (FOM),
consistent training of all crews, field assistance visits to all crews, and availability of experienced
technical personnel during the field season to respond to site-specific questions from field crews
as they arise.

Each Field Crew Leader and Botanist/Ecologist must be trained at an EPA-sponsored  training
session  prior to the start of the field season, along with as many crew members as possible.
The training  program stresses hands-on practice of methods, comparability among crews,
collection of  high quality data and samples, and safety. Training will be provided in ten  central
locations for cooperators and contractors over the course of 3.5 days. Project organizations
responsible for training oversight are identified in Figure  1-2. Training documentation will be
maintained by the Project QA Officers.

It is anticipated that evaluation and assistance visits will be conducted with each Field Team
early in the sampling and data collection process, and that corrective actions will be conducted
in real time. These visits provide a basis for the uniform evaluation of the data collection
techniques, and an opportunity to conduct procedural reviews to minimize data loss due to
improper technique or interpretation of program guidance. The field visit evaluations will be
based on the uniform training, plans, and checklists. For more information on field assistance
visits see Chapter 6 of this document.

 2.2.3  Taxonomic Precision and Accuracy

For the NWCA, taxonomic precision will be quantified by comparing whole-sample
identifications completed by independent taxonomists or laboratories. Accuracy of taxonomy will
be qualitatively evaluated through specification of target  hierarchical  levels (e.g., family, genus,
or species); and the specification of appropriate technical taxonomic  literature or other
references (e.g., identification keys, voucher specimens). To calculate taxonomic precision for
vascular plants and algae, 10 percent of the samples will be randomly selected for re-
identification by an independent, outside taxonomist or laboratory. Comparison of the results of
whole sample re-identifications will provide a Percent Taxonomic Disagreement (PTD)
calculated as:
Equation 7                   PTD = ( 1 - (comppos / N) ) x 100

       Where:
              comppos = the number of agreements (specimens assigned the same taxon by both
              taxonomists)
              N = the total number of individuals in the larger of the two counts.

The lower the PTD, the more similar the taxonomic results and the better the overall taxonomic
precision. An MQO of 15% is recommended for taxonomic differences (an overall mean <15% is
acceptable). Individual samples exceeding 15% are examined for taxonomic areas of
substantial disagreement, and the reasons for disagreement are investigated.
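
The PTD calculation can be expressed as a short sketch. The specimen-by-specimen pairing of the two
identification lists and the taxon names below are illustrative assumptions.

    def percent_taxonomic_disagreement(ids_taxonomist_1, ids_taxonomist_2):
        """Equation 7: PTD = (1 - comppos / N) x 100, where comppos is the number of
        specimens assigned the same taxon by both taxonomists and N is the count in
        the larger of the two samples."""
        comp_pos = sum(1 for a, b in zip(ids_taxonomist_1, ids_taxonomist_2) if a == b)
        n = max(len(ids_taxonomist_1), len(ids_taxonomist_2))
        return (1.0 - comp_pos / n) * 100.0

    # Hypothetical re-identification: 18 of 20 specimens receive the same name
    first = ["Carex stricta"] * 12 + ["Typha latifolia"] * 8
    second = ["Carex stricta"] * 12 + ["Typha latifolia"] * 6 + ["Typha angustifolia"] * 2
    print(percent_taxonomic_disagreement(first, second))   # 10.0, within the 15% MQO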

Where re-identification by an independent, outside taxonomist or laboratory is not practical (i.e.,
phytoplankton, algae),  percent similarity will be calculated. Percent similarity is a measure of
similarity between two communities or two samples (Washington 1984). Values range from 0%
for samples with no species in common, to 100% for samples which are identical. It is calculated
as follows:
Equation 8                   PSC = 1 - 0.5 Σ |a - b|   (summed over all K species)

       where:
              a and b = for a given species, the relative proportions of the total samples A and
              B, respectively, which that species represents.

An MQO of >85% is recommended for percent similarity of taxonomic identification. If the MQO is
not met, the reasons for the discrepancies between analysts should be discussed. If a major
discrepancy is found in how the two analysts have been identifying organisms, the last batch of
samples counted by the analyst under review may have to be recounted.

Additionally, percent similarity should be calculated for re-processed subsamples. This provides
a quantifiable measure of the precision of subsampling procedures employed for various
parameters (i.e., phytoplankton, algae). An MQO of >70% is recommended for percent similarity
of subsamples. If a sample does not meet this threshold, additional subsamples should be
processed from that sample until the MQO is achieved.
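
A sketch of the percent similarity calculation for two taxon lists; it assumes the comparison is made on
relative proportions as defined above, and the taxon names, counts, and check against the 70%
subsample threshold are illustrative.

    from collections import Counter

    def percent_similarity(sample_a, sample_b) -> float:
        """Equation 8: PSC = 1 - 0.5 x sum(|a - b|) over all species, with a and b the
        relative proportions of each species in samples A and B (reported here as a
        percentage)."""
        counts_a, counts_b = Counter(sample_a), Counter(sample_b)
        total_a, total_b = sum(counts_a.values()), sum(counts_b.values())
        taxa = set(counts_a) | set(counts_b)
        overlap = 1.0 - 0.5 * sum(abs(counts_a[t] / total_a - counts_b[t] / total_b)
                                  for t in taxa)
        return overlap * 100.0

    # Hypothetical original count and re-processed subsample
    a = ["Navicula"] * 40 + ["Nitzschia"] * 35 + ["Cymbella"] * 25
    b = ["Navicula"] * 45 + ["Nitzschia"] * 30 + ["Gomphonema"] * 25
    psc = percent_similarity(a, b)
    print(round(psc, 1), psc >= 70.0)   # checked against the 70% subsample MQO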

Sample enumeration is another component of taxonomic precision. Final specimen counts for
samples are dependent on the taxonomist, not the rough counts obtained during the sorting
activity. Comparison of counts is quantified by calculation of percent difference in enumeration
(PDE), calculated as:
Equation 9                   PDE = ( |Lab1 - Lab2| / (Lab1 + Lab2) ) x 100

       where:
              Lab1 and Lab2 = the total number of specimens counted by the first and second
              laboratories (or analysts), respectively.
An MQO of 5% is recommended (overall mean of <5% is acceptable). Individual samples
exceeding 5% are examined to determine reasons for the exceedance.
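
A one-function sketch of Equation 9 with hypothetical counts from the primary taxonomist and the QC
re-count.

    def percent_difference_in_enumeration(lab1_count: int, lab2_count: int) -> float:
        """Equation 9: PDE = |Lab1 - Lab2| / (Lab1 + Lab2) x 100."""
        return abs(lab1_count - lab2_count) / (lab1_count + lab2_count) * 100.0

    pde = percent_difference_in_enumeration(312, 298)
    print(round(pde, 2), pde <= 5.0)   # 2.3, within the 5% MQO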

Corrective actions for samples exceeding these MQOs can include determining the taxa for which
re-identification is necessary (potentially by a third party), identifying which samples require
re-identification (even outside the 10% lot of QC samples), and determining where nomenclatural or
enumeration problems may exist.

Taxonomic accuracy is evaluated by having individual specimens representative of selected
taxa identified by recognized experts. Samples will be identified using the most appropriate
technical literature that is accepted by the taxonomic discipline and reflects the accepted
nomenclature. Where necessary, the Integrated Taxonomic Information System (ITIS,
http://www.itis.usda.gov/) will be used to verify nomenclatural validity and spelling. A reference
collection will be compiled as the samples are identified.

 2.2.4 Completeness

Completeness requirements are established and evaluated from two perspectives. First, valid
data for individual parameters must be acquired from a minimum number of sampling locations
in order to make subpopulation estimates with a specified level of confidence or sampling
precision. The objective of this study is to acquire valid data at 95% or more of the sampled
sites. Percent completeness is calculated as:

Equation 10                        %C = (V / T) x 100

       Where:
             V = the number of measurements/samples judged valid
             T = the total number of planned measurements/samples.
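
Equation 10 in code form, with hypothetical counts of valid and planned samples.

    def percent_completeness(valid: int, planned: int) -> float:
        """Equation 10: %C = (V / T) x 100."""
        return valid / planned * 100.0

    print(round(percent_completeness(valid=874, planned=900), 1))   # 97.1, meets the 95% objective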

Within each indicator, completeness objectives are also established for individual samples or
individual measurement variables or analytes. These objectives are estimated as the
percentage of valid data obtained versus the amount of data expected based on the number of
samples collected or number of measurements conducted.  Where necessary, supplementary
objectives for completeness are presented in the indicator-specific sections of this QAPP.

The completeness objectives are established for each measurement per site type (e.g.,
probability sites, revisit sites, etc.). Failure to achieve the minimum requirements for a particular
site type results in regional population estimates having wider confidence intervals. Failure to
achieve requirements for revisit samples (10% of sites visited) reduces the precision of
estimates of index period and annual variance components, and may impact the
representativeness of these estimates because of possible  bias in  the set of measurements
obtained.

 2.2.5 Comparability

Comparability is defined as the confidence with which one data set can be compared to another
(USEPA 2002). A performance-based methods approach, which defines a set of laboratory method
performance requirements for data quality, is being used for water chemistry analyses.
Following this approach, participating laboratories  may choose which analytical methods they
will use for each target analyte as long as they are able to achieve the performance
requirements as listed in Table 5.4-1. For all parameters, comparability is addressed by the use
of standardized sampling procedures and analytical methods by all sampling crews and
laboratories. Comparability of data within and among parameters is also facilitated by the
implementation of standardized quality assurance  and quality control techniques and
standardized performance and acceptance  criteria. For all measurements, reporting units and
format are specified, incorporated into standardized data recording forms, and documented in
the information management system. Comparability is also  addressed by providing results of
QA sample data, such as estimates of precision and bias, conducting methods comparison
studies when requested by the grantees and conducting interlaboratory performance evaluation
studies among state, university, and NWCA contract laboratories.

 2.2.6 Representativeness

Representativeness is defined as "the degree to which the data accurately and precisely
represent a characteristic of a population parameter, variation of a property, a process
characteristic, or an operational condition" (USEPA 2002). At one level, representativeness is
affected by problems in any or all of the other data quality indicators.

At another level, representativeness is affected by the selection of the target wetlands, the
location of sampling sites within that wetland, the time period when samples are collected, and
the time period when samples are analyzed. The probability-based sampling design should
estimate the condition of wetland resource populations that are representative of the region. The
individual sampling programs defined for each indicator attempt to address representativeness
within the constraints of the response design (which includes when, where, and how to collect a
sample at each site). Holding-time requirements for analyses ensure analytical results are
representative of conditions at the time of sampling.

  3   SAMPLING  DESIGN AND SITE SELECTION

The overall sampling program for the National Wetland Condition Assessment project requires a
randomized, probability-based approach for selecting wetlands where sampling activities are to
be conducted. Details regarding the specific application of the probability design to surface
water resources are described in Paulsen et al. (1991) and Stevens (1994). The specific
details for the collection of samples associated with different indicators are described in the
indicator-specific sections of this QAPP.

 3.1  Probability-Based Sampling Design and Site Selection

The objectives, or design requirements, for the National Wetland Condition Assessment are to
produce:

   1.  Estimates of the 2011 status of wetlands nationally and regionally (9 aggregated
       Omernik ecoregions, major river basins, EPA Regions, etc.),

   2.  Estimates of the 2011 status of seven S&T wetland classes nationally, and

   3.  Estimates of the 2011 status of wetlands in coastal watersheds nationally.

Generally, almost all wetlands in the conterminous U.S. are considered the target population for
the assessment (see the "Target population" sidebar for a more complete definition).

As stated  in Chapter 1 section 1.1 (Introduction), the USFWS National Wetlands Inventory
(NWI) S&T Reports (Dahl, 2005) are the most scientifically defensible sources of national-scale
information on wetland location and extent. The NWCA therefore used site-specific information
found in the S&T Reports (augmented as detailed below) to identify sampling sites. The sample
frame is the FWS National Wetland Status and Trend 2005 survey and was obtained from Tom
Dahl at the U.S. FWS. The sample frame consists of all polygons mapped based on 2005 remote
sensing information for over 5000 2 mi by 2 mi plots across the 48 states. Working with EPA,
FWS created additional plots for the Pacific Coast to help balance the spatial coverage of sites
nationally and to ensure the NWCA can produce a representative national assessment of
estuarine wetlands. Alaska, Hawaii, and the trust territories will not be included in the primary
design for the NWCA. Additional attributes added to the sample frame are state, EPA Region,
Omernik ecoregion level III, and Wadeable Stream Assessment 3 and 9 aggregated ecoregions.
The wetland types included are E2EM, E2SS, PEM, PSS, PFO, Pf, and PUBPAB (which includes
PAB, PUB, PUBf, PUBi, PUBn, and PUBu). The following land cover types were excluded: E1UB,
E2AB, E2US, I_AC, M1, M2, OUT, PUS, RIV, UA, UB, UFP, UO, and URD.

                              Target population
   The target population for the NWCA is tidal and nontidal wetlands of the conterminous U.S.,
   including certain farmed wetlands not currently in crop production. The wetlands have rooted
   vegetation and, when present, open water less than 1 meter deep. A wetland's jurisdictional
   status3 under state or federal regulatory programs will not factor into this definition of target
   population.

The NWCA design included sites from the following S&T Classes because these classes are
very likely to be consistent with the  NWCA target population:

    •   Estuarine Intertidal  Emergent

    •   Estuarine Shrub/Forested

    •   Palustrine Emergent

    •   Palustrine Scrub/Shrub

    •   Palustrine Forested

    •   Palustrine Unconsolidated Bottom and Aquatic Bed

    •   Palustrine Farmed

Some wetlands in the S&T Classes listed above will not be consistent with the NWCA target
population. These wetlands will most likely be found in the Palustrine Unconsolidated
Bottom/Aquatic Bed and Palustrine  Farmed classes and will have few, if any, characteristics of
naturally-occurring wetlands. If any  of these inconsistent sites are selected for sampling, they
will be dropped  as soon as they are identified (e.g., during desk-top or onsite evaluation).
3 Impacts to wetlands and other aquatic resources are regulated under the Clean Water Act when an aquatic
resource is determined to be a "Water of the United States." Jurisdictional Determinations are made on a case-by-
case basis according to the definition found in 40 CFR 230.3(s). For more information please visit the following
website: http://www.epa.gov/owow/wetlands/guidance/CWAwaters.html.

The survey has a two-stage design with the first stage from the FWS National Wetland Status
and Trend survey design. It is an area frame design stratified by state and physiographic region
where the area frame consists of 2 mi by 2 mi plots that cover the 48 contiguous states. The first
stage results in the identification of land cover types focused on wetland types within each 2 mi
by 2 mi plot selected (sample size is approximately 5000 plots). The second stage is a
Generalized Random Tessellation Stratified (GRTS) survey design for an area resource applied
to the stage one sample plots. It is a stratified design with unequal probability of selection based
on area within  each stratum.

Stratification is by state, and unequal probability of selection is by seven (7) wetland type
categories. Allocation of sites by state and wetland type category was completed by solving a
quadratic programming problem that minimized the sum of the squared deviations between the
expected sample sizes and a proportional allocation of sites by wetland type based on state area
within each wetland type, subject to the following constraints:

   1.  the sum across states of the expected sample sizes within each wetland type was:
       E2EM = 128, E2SS = 127, PEM = 129, PSS = 129, PFO = 129, Pf = 129, and PUBPAB = 129;

   2.  the minimum number of sites for a state was 8;

   3.  the maximum number of sites within a state for E2EM or E2SS was 13;

   4.  the maximum number of sites within a state for PEM, PSS, PFO, Pf, or PUBPAB was 10; and

   5.  the minimum number of sites was greater than or equal to zero for each wetland type and
       state combination.

This approach ensured that the sample size for the seven wetland types was sufficient for
national reporting, that each state received a minimum number of sites (which also improved the
national spatial balance of the sites), and that sites were otherwise allocated proportionally by
area within a wetland type.
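
The sketch below illustrates the general form of this allocation problem on a deliberately small,
hypothetical example (three states, two wetland types, a single per-cell cap). It uses a general-purpose
optimizer rather than the software actually used for the NWCA design, and the areas, targets,
minimums, and caps are invented for illustration.

    import numpy as np
    from scipy.optimize import minimize

    area = np.array([[100.0,  50.0],      # wetland area by state (rows) and type (columns)
                     [300.0, 150.0],
                     [ 50.0, 300.0]])
    type_totals = np.array([20.0, 20.0])  # expected sample size per wetland type
    min_per_state = 5                     # minimum sites per state (8 in the NWCA design)
    max_per_cell = 10                     # cap on sites per state within a wetland type

    # Proportional target: each type's sites shared out by state area within that type
    prop = area / area.sum(axis=0) * type_totals
    n_states, n_types = area.shape

    def objective(x):
        return np.sum((x - prop.ravel()) ** 2)   # squared deviation from proportionality

    constraints = []
    for j in range(n_types):    # column sums fixed to the per-type targets
        constraints.append({"type": "eq", "fun":
                            lambda x, j=j: x.reshape(n_states, n_types)[:, j].sum() - type_totals[j]})
    for i in range(n_states):   # each state receives at least the minimum number of sites
        constraints.append({"type": "ineq", "fun":
                            lambda x, i=i: x.reshape(n_states, n_types)[i, :].sum() - min_per_state})

    bounds = [(0.0, max_per_cell)] * (n_states * n_types)
    x0 = np.clip(prop.ravel(), 0.0, max_per_cell)
    result = minimize(objective, x0, method="SLSQP", bounds=bounds, constraints=constraints)
    print(np.round(result.x.reshape(n_states, n_types), 1))   # expected sites by state and type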

The design includes three panels.

   1.  Revisit: identifies sites that are to be visited twice.

   2.  Base: identifies remaining sites to  be visited.

   3.  Over: identifies sites available to be used as replacement sites.


The expected sample size is 900 sites for the conterminous 48 states. The maximum number of
sites for a state was 69 (Louisiana) and the minimum number of sites for a state was 8
(Vermont). The total number of site visits is 996, allocated to 900 unique sites with 96 sites to be
revisited. A 100% oversample was selected to provide replacement sites for sites that either are
not part of the target population or cannot be sampled. Sites should be used in SiteID order within
each state. If a revisit site cannot be sampled, the next site in the base panel within the state
should be used as a revisit site. The map below (Figure 3-1) identifies revisit sites in green,
base sites in red, and oversample sites in black.
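
The site-use rules in this paragraph can be written out as a small sketch. The field names ('site_id',
'state', 'panel', 'status') are illustrative, not the actual variable names in the NWCA design file.

    from typing import Dict, List, Optional

    def next_in_siteid_order(sites: List[Dict], state: str, panel: str) -> Optional[Dict]:
        """Return the unused site with the lowest SiteID for a given state and panel."""
        candidates = [s for s in sites
                      if s["state"] == state and s["panel"] == panel and s["status"] == "unused"]
        return min(candidates, key=lambda s: s["site_id"]) if candidates else None

    def replace_unsampleable_site(sites: List[Dict], state: str) -> Optional[Dict]:
        """A non-target or unsampleable site is replaced by the next oversample site."""
        return next_in_siteid_order(sites, state, panel="Over")

    def promote_base_to_revisit(sites: List[Dict], state: str) -> Optional[Dict]:
        """If a revisit site cannot be sampled, the next base-panel site in the state
        (in SiteID order) is designated as a revisit site."""
        site = next_in_siteid_order(sites, state, panel="Base")
        if site is not None:
            site["panel"] = "Revisit"
        return site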

Figure 3-1: NWCA 2011 Survey Design Summary Map
  4  INFORMATION MANAGEMENT

Like QA, information management (IM) is integral to all aspects of the NWCA, from initial
selection of sampling sites through dissemination and reporting of final, validated data. QA and
QC measures implemented for the IM system are aimed at preventing corruption of data at the
time of their initial incorporation into the system and maintaining the integrity of data and
information after incorporation into the system. The general organization of, and QA/QC
measures  associated with, the IM systems are described in this section.

 4.1  Overview  of System Structure

At each point where  data and information are generated,  compiled, or stored, the information
must be managed. Thus, the IM system includes all of the data-generating activities, all of the
means of recording and storing information, and all of the processes which use data. The IM
system includes both hardcopy and electronic means of generating, storing,  and archiving data.
All participants in the NWCA have certain responsibilities and obligations which make them a
part of the IM system. In its entirety, the IM system  includes site selection and logistics
information, sample labels and field data forms, tracking records, map and analytical data, data
validation and analysis processes, reports, and archives.  IM staff, supporting the NWCA at
WED, provide support and guidance to all program operations, in addition to maintaining a
central data base management system for the NWCA data.

The central repository for data and associated information collected for use by the NWCA is  a
secure, access-controlled server located at WED-Corvallis. The general organization of the
information management system is presented in Figure 4-1. Data are stored and managed on
this system using the Statistical Analysis System (SAS) software package. This centrally
managed IM system is the primary data management center for the NWCA research conducted
at WED and elsewhere. The IM staff receives, enters, and maintains data and information
generated by the site selection process (see Section 3), field sample and data collection, map-
based measurements, laboratory analyses, and verification and validation activities completed
by the Project Lead. In addition to receiving these inputs, the IM system provides data
files to NWCA staff and other users. The IM staff at WED is responsible for maintaining the
security and integrity of both the data and the system.

The following sections describe the major inputs to the central data base and the associated
QA/QC processes used to record, enter, and validate measurement and analytical data
collected. Activities to maintain the integrity and assure the quality of the contents of the IM
system are also described.
[Figure 4-1 (diagram not reproduced) shows the flow of information into the central data base: sample
site information (the Tier II list frame with site ID, weighting factor, and location coordinates; logistics
data with site ID information, location coordinates, and access information; and site verification data
with site ID, measured location coordinates, and sampling status); indicator research and development
information (field data, laboratory data, and sample tracking data); and assessment and reporting
information by indicator (annual population status data, population trend data, and spatial data (GIS)),
supported by meta-data and quality assurance documentation.]
Figure 4-1: Organization of information management system modeled after EMAP Surface Water
Information Management (SWIM) system for the NWCA

 4.1.1 Design and Site Status Data Files

The site selection process described in Section 3 produces a list of candidate sampling
locations, inclusion probabilities, and associated site classification data (e.g., target status,
ecoregion, etc.). This "design" data file is provided to the IM staff, Field Implementation
Coordinators, and Field Crew Leaders. Field Crew Leaders determine ownership and contacts
for acquiring permission to access each site, and conduct site evaluation and reconnaissance
activities. Ownership, site evaluation, and reconnaissance information for each site are
compiled into a "site status" data file. Generally, standardized forms are used during
reconnaissance activities (see the Site Evaluation Guidelines (USEPA 2011 [a]) for more
detailed information). Information from these forms may be entered into a SAS-compatible data
management system. Whether in electronic or hardcopy format, a copy of the logistics data
base is provided to the IM Staff for archiving.

 4.1.2 Sample Collection and Field Data Recording

Prior to initiation of field activities, the IM staff works with the Project  Lead and analytical support
laboratories to develop standardized data forms and sample labels. Preprinted adhesive labels
having a standard recording format are completed and affixed to each sample container.
Precautions are taken to ensure that label information remains legible and the label remains
attached to the sample. Examples of sample labels are presented in  the Field Operations
Manual.

Data forms are designed in conjunction with IM staff to ensure the format facilitates both field
recording and subsequent data entry tasks. All data forms which may be used in the field are
printed on water-resistant paper4. Copies of the data forms and instructions for completing each
form are documented in the Field Operations Manual. Recorded data are reviewed  upon
completion of data collection and recording activities by a person other than the one who
completed the form. The Field Crew Leader checks completed data forms and sample labels
before leaving a sampling site to ensure information and data were recorded legibly and
completely. Errors are corrected  if possible, and data considered as suspect are qualified using
a flag variable. The Field Crew enters explanations for all flagged data  in a comments section.
Completed data forms are transmitted to the IM staff at WED for entry into the central data base
management system; the ORD Technical Lead also receives copies  of all field-recorded data.

If portable PCs (or handheld data recorders) are to be used in the field, user screens are
developed that duplicate the standardized form to facilitate data entry. Specific output formats
are available to print data for review and for production  of shipping forms. Data may be
transferred via modem on a daily basis. Each week, CDs containing all downloaded data for the
week are mailed to the Information Management Coordinator (IMC).

All samples are tracked from the point of collection. If field PCs are used, tracking information is
entered on custom-designed electronic tracking forms. Hardcopy tracking and custody forms
are completed if PCs are not available for use. One copy of the shipping and custody record
accompanies all sample transfers; a second copy is transmitted to the IMC and ORD Technical
Lead. Samples are tracked to ensure that they are delivered to the appropriate laboratory, that
lost shipments can be quickly identified and traced, and that any problems with samples
observed when received at the laboratory are reported promptly so that corrective action can  be
taken, if necessary. Detailed procedures on shipping and sample tracking can be found in the
Field Operations Manual.
4 Water-resistant paper is not to be copied with photocopiers, as photocopying this type of paper can
damage photocopying equipment.

Procedures for completion of sample labels and field data forms, and use of PCs are covered
extensively in training sessions. General QC checks and procedures associated with sample
collection and transfer, field measurements, and field data form completion for most indicators
are listed in Table 4.1-1. Additional QA/QC checks or procedures specific to individual indicators
are described in the indicator sections in Section 5 of this QAPP.

Table 4.1-1. Sample and field data quality control activities

   Contamination Prevention: All containers for an individual site sealed in plastic bags until
   use; specific contamination avoidance measures covered in training.

   Sample Identification: Pre-printed labels with unique ID number on each sample.

   Data Recording: Data recorded on pre-printed forms of water-resistant paper; field sampling
   crew reviews data forms for accuracy, completeness, and legibility.

   Data Qualifiers: Defined qualifier codes used on data form; qualifiers explained in comments
   section on data form.

   Sample Custody: Unique sample ID and tracking form information entered in LIMS; sample
   shipment and receipt confirmed.

   Sample Tracking: Sample condition inspected upon receipt and noted on tracking form with
   copies sent to ORD Technical Lead and/or IM.

   Data Entry: Data entered using customized entry screens that resemble the data forms;
   entries reviewed manually or by automated comparison of double entry.

   Data Submission: Standard format defined for each measurement including units, significant
   figures, decimal places, accepted code values, and required field width.

   Data Archival: All data records, including raw data, archived in an organized manner. For
   example, following verification/validation of the last submission into the NWCA database, it is
   copied to a terabyte external hard drive and sent to the Project Leader for inclusion in his
   project file, scheduled as 501, permanent records. Processed samples and reference
   collections of taxonomic specimens submitted for cataloging and curation at an appropriate
   museum facility.
 4.1.3 Laboratory Analyses and Data Recording

Upon receipt of a sample shipment, analytical laboratory receiving personnel check the
condition and identification of each sample against the sample tracking record. Each sample is
identified by information written on the sample label and by a barcode label. Any discrepancies,
damaged samples, or missing samples are reported to the IM staff and Project Lead by
telephone.

Most of the laboratory analyses for the NWCA indicators, particularly chemical and physical
analyses, follow or are based on standard methods. Standard methods generally include
requirements for QC checks and procedures. General laboratory QA/QC procedures applicable
to most NWCA indicators are described  in Table 4.1-2. Additional QA/QC procedures specific to
individual indicator analyses are described in the indicator-specific sections of this QAPP.
Biological sample analyses are generally based on current acceptable practices within the
particular biological discipline. Some QC checks and procedures applicable to most NWCA
biological samples are described in Table 4.1-3. Additional QA/QC procedures specific to
individual parameters are described in the indicator-specific sections of this QAPP.

Table 4.1-2. Laboratory data quality control activities

   Instrument Maintenance: Follow manufacturer's recommendations and specific guidelines in
   methods; maintain logbook of maintenance/repair activities.

   Calibration: Calibrate according to manufacturer's recommendations; recalibrate or replace
   before analyzing any samples.

   QC Data: Maintain control charts, determine LT-MDLs and achieved data attributes; include
   QC data summary (narrative and compatible electronic format) in submission package.

   Data Recording: Use software compatible with NARS-SWIM system; check all data entered
   against the original bench sheet to identify and correct entry errors. Review other QA data
   (e.g., condition upon receipt, etc.) for possible problems with sample or specimens.

   Data Qualifiers: Use defined qualifier codes; explain all qualifiers.

   Data Entry: Automated comparison of double entry or 100% manual check against original
   data form.

   Submission Package: Includes: letter by the laboratory manager; data, data qualifiers and
   explanations; electronic format compatible with NARS-SWIM system; documentation of file
   and data base structures, variable descriptions and formats; summary report of any problems
   and corrective actions implemented.
Table 4.1-3. Biological sample quality control activities

   Taxonomic Nomenclature: Use accepted common and scientific nomenclature and unique
   entry codes.

   Taxonomic Identifications: Use standard taxonomic references and keys; maintain
   bibliography of all references used.

   Independent Identifications: Uncertain identifications to be confirmed by expert in particular
   taxa.

   Duplicate Identifications: At least 5% of all samples completed per taxonomist re-identified by
   a different analyst; less than 10% assigned different ID.

   Taxonomic Reasonableness Checks: Species or genera known to occur in given conditions
   or geographic area.

   Reference Collections: Permanent mounts or voucher specimens of all taxa encountered.
A laboratory's IM system may consist of only hardcopy records such as bench sheets and
logbooks, an electronic laboratory information management system (LIMS), or some
combination of hardcopy and electronic records. Laboratory data records are reviewed at the
end of each analysis day by the designated laboratory onsite QA coordinator or by supervisory
personnel. Errors are corrected if possible, and data considered as suspect by laboratory

-------
National Wetland Condition Assessment                                         March 2012
QA Project Plan Version 2	Page 48 of 120

analysts are qualified with a flag variable. All flagged data are explained in a comments section.
Private contract laboratories generally have a laboratory Quality Assurance Plan and
established procedures for recording, reviewing, and validating analysis data.

Once analytical data have passed all of the laboratory's internal review procedures, a
submission package is prepared and transferred to the IM staff. The contents of the submission
package are largely dictated by the type of analysis (physical, chemical, or biological),  but
generally includes at least the elements listed in the Field  and  Laboratory Operations Manuals.

Remaining sample material and voucher specimens may be transferred to EPA's designated
laboratory or facilities as directed by the EPA Project Leader. All samples and raw data files
(including  logbooks, bench sheets, and instrument tracings) are to be retained permanently or
until authorized for disposal, in writing, by the EPA Project Leader. (Deliverables from
contractors and Cooperators, including raw data, are permanent as per EPA Record Schedule
258.  EPA's project records are scheduled 501 and are also permanent.)

 4.1.4 Data Review,  Verification, and Validation Activities

Raw data files are created from entry of field and analytical data, including data for QA/QC
samples and any data qualifiers noted on the data forms or analytical data package. After initial
entry, data are reviewed for entry errors by either a manual comparison of a printout of the
entered data against the original data form or by automated comparison of data  entered twice
into separate files.  Entry errors are corrected and reentered. For biological samples, species
identifications are corrected for entry errors associated with incorrect or misspelled codes.
Errors associated with misidentification of specimens are corrected after voucher specimens
have been confirmed and the results are available. Files corrected for entry errors are
considered to be raw data files. Copies of all raw data files are maintained in the centralized IM
system.
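
The double-entry comparison described above lends itself to a simple script. The sketch below is
illustrative only; it assumes two independently keyed CSV files sharing a key column named
SAMPLE_ID, and the file layout and column name are hypothetical rather than the NWCA form
structure.

    import csv

    def load_entries(path, key="SAMPLE_ID"):
        """Read a keyed CSV file into a dict of {key value: row dict}."""
        with open(path, newline="") as f:
            return {row[key]: row for row in csv.DictReader(f)}

    def compare_double_entry(path_a, path_b, key="SAMPLE_ID"):
        """List every field that differs between two independent data entries."""
        a, b = load_entries(path_a, key), load_entries(path_b, key)
        discrepancies = []
        for sample_id in sorted(set(a) | set(b)):
            if sample_id not in a or sample_id not in b:
                discrepancies.append((sample_id, "<record missing in one file>", None, None))
                continue
            for field, value in a[sample_id].items():
                if value.strip() != b[sample_id].get(field, "").strip():
                    discrepancies.append((sample_id, field, value, b[sample_id].get(field)))
        return discrepancies

    # Each discrepancy is resolved against the original field data form before the file is
    # accepted as a raw data file.
    # for row in compare_double_entry("entry_pass1.csv", "entry_pass2.csv"):
    #     print(row)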

The Logistics Coordinator will work with the ORD Technical Lead and the IM staff (primary data
recipients) to ensure that sufficient QC activities are engaged in the various data management
processes. Copies of the raw data files are maintained in the central IM system,  generally in
active files until completion of reporting and then are transferred to archive files as static data
files. Redundant copies are maintained of all data files and all files are periodically backed up.

Some of the typical checks made in the  processes of verification and validation are described in
Table 4.1-4.  Automated review procedures may be used. The primary purpose of the initial
checks is to  confirm that a data value present in an electronic data file is accurate with respect
to the value that was initially recorded on a data form or obtained from an analytical instrument.
In general, these activities focus on individual variables in the raw data file and may include
range checks for numeric variables, frequency tabulations of coded or alphanumeric variables to
identify erroneous codes or misspelled entries, and summations of variables reported in terms
of percent or percentiles. In addition,  associated QA information (e.g., sample holding time) and
QC sample data are reviewed to determine if they meet acceptance criteria. Suspect values are
assigned a data qualifier. They will either be corrected, replaced with a new acceptable value
from sample reanalysis, or confirmed as suspect after sample reanalysis. Any suspect data will
be flagged for data qualification.
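
These per-variable checks can be automated. The following is a minimal sketch of a range check, a
frequency tabulation of coded values, and assignment of a qualifier flag; the variable names,
acceptance limits, code lists, and qualifier codes are hypothetical illustrations rather than NWCA
specifications.

    import pandas as pd

    RANGE_LIMITS = {"PH": (3.0, 11.0), "DEPTH_CM": (0, 500)}          # hypothetical limits
    VALID_CODES = {"HORIZON": {"O", "A", "E", "B", "C", "L", "R"}}     # hypothetical code list

    def verify(df):
        """Flag out-of-range numeric values and unrecognized codes with a qualifier."""
        df = df.copy()
        df["QUALIFIER"] = ""
        for col, (low, high) in RANGE_LIMITS.items():
            suspect = (df[col] < low) | (df[col] > high)
            df.loc[suspect, "QUALIFIER"] += f"R({col});"               # R = out of expected range
        for col, codes in VALID_CODES.items():
            print(df[col].value_counts())                              # frequency tabulation for review
            df.loc[~df[col].isin(codes), "QUALIFIER"] += f"C({col});"  # C = unrecognized code
        return df

    # flagged = verify(pd.read_csv("raw_soil_profile.csv"))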

Table 4.1-4. Data review, verification, and validation quality control activities

  Quality Control Activity                          Description and/or Requirements

  Review any qualifiers associated with variable    Determine if value is suspect or invalid;
                                                    assign validation qualifiers as appropriate

  Summarize and review replicate sample data        Identify replicate samples with large variance;
                                                    determine if analytical error or visit-specific
                                                    phenomenon is responsible

  Determine if MQOs and project DQOs have been      Determine potential impact on achieving
  achieved                                          research and/or program objectives

  Exploratory data analyses (univariate,            Identify outlier values and determine if
  bivariate, multivariate) utilizing all data       analytical error or site-specific phenomenon
                                                    is responsible

  Confirm assumptions regarding specific types of   Determine potential impact on achieving
  statistical techniques being utilized in          research and/or program objectives
  development of metrics and indicators

In the final stage of data verification and validation, exploratory data analysis techniques may be
used to identify extreme data points or statistical outliers in the data set. Examples of univariate
analysis techniques include the generation and examination of box-and-whisker plots and
subsequent statistical tests of any outlying data points. Bivariate techniques include calculation
of Spearman correlation coefficients for all pairs of variables in the data set with subsequent
examination of bivariate plots of variables having high correlation coefficients. Multivariate
techniques have also been used in detecting extreme or outlying values in environmental data
sets (Meglen, 1985; Garner et al.,  1991; Stapanian et al., 1993). A software
package, SCOUT, developed  by EPA and based on the approach of Garner et al. (1991) may
be used to validate multivariate data sets.
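
As a simple illustration of these exploratory checks (it is not the SCOUT procedure), the sketch
below computes Spearman correlation coefficients for all pairs of numeric variables and flags
univariate outliers using the box-and-whisker (1.5 x interquartile range) rule; the input file and
column contents are hypothetical.

    import pandas as pd

    def iqr_outliers(series):
        """Return index labels that fall outside the 1.5 x IQR whiskers."""
        q1, q3 = series.quantile([0.25, 0.75])
        low, high = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
        return series[(series < low) | (series > high)].index.tolist()

    def explore(df):
        numeric = df.select_dtypes("number")
        corr = numeric.corr(method="spearman")                  # all pairwise Spearman coefficients
        pairs = corr.abs().stack()
        high_pairs = pairs[(pairs > 0.8) & (pairs < 1.0)]       # candidates for bivariate plots
        outliers = {col: iqr_outliers(numeric[col]) for col in numeric.columns}
        return corr, high_pairs, outliers

    # corr, high_pairs, outliers = explore(pd.read_csv("verified_chemistry.csv"))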

Suspect data are reviewed to determine the source of error, if possible. If the error is
correctable, the  data set is edited to incorporate the correct data. If the source of the error
cannot be determined, data are qualified as questionable or invalid.  Data qualified as
questionable may be acceptable for certain types of data analyses and interpretation activities.
The decision to use questionable data must be made by the individual data users. Data qualified
as invalid are considered to be unacceptable for use in any analysis or interpretation activities
and will generally be removed from the data file and replaced with a missing value code and
explanatory comment or flag code. After completion of verification and validation activities, a
final data file is created, with copies transmitted for archival and for uploading to the centralized
IM system.

Once verified and validated, data files are made available for use in various types of
interpretation activities, each of which may require additional restructuring of the  data files.
These restructuring activities are collectively referred to as "data enhancement."  In order to
develop indicator metrics from one or more variables, data files may be restructured so as  to
provide a single  record per wetland.

 4.2  Data Transfer

Field crews may transmit data electronically via modem or electronic media disc; hardcopies of
completed data and sample tracking forms may be transmitted to the IM staff via portable
facsimile (FAX) machine or via express courier service. Copies of raw, verified, and validated
data files are transferred from the ORD Technical Lead to the IM staff for inclusion in the central
IM system. All transfers of data are conducted using a means of transfer, file structure, and file
format that have been approved by the IM staff. Data files that do not meet the required
specifications will not be incorporated into the centralized data access and management
system.

 4.3  Hardware and Software Control

All automated data processing (ADP) equipment and software purchased for or used in NWCA
research is subject to the requirements of the federal government, the particular Agency, and
the individual facility making the purchase or maintaining  the equipment and software. All
hardware purchased by EPA is identified with an EPA barcode tag label; an inventory is
maintained by the responsible ADP personnel at the facility. Inventories are also maintained  of
all software licenses; periodic checks are made of all software assigned to a particular PC.

The development and organization of the IM  system is compliant with guidelines and standards
established by the EMAP Information Management Technical Coordination Group, the EPA
Office of Technology, Operations, and Planning (OTOP),  and the EPA Office of Administrative
Resources Management (OARM). Areas addressed by these policies and guidelines include,
but are not limited to, the  following:

    •  Taxonomic Nomenclature and Coding

    •  Locational  data

    •  Sampling unit identification and reference

    •  Hardware and software

    •  Data catalog documentation


The NWCA is committed to compliance with  all applicable regulations and guidance concerning
hardware and software procurement,  maintenance, configuration control, and QA/QC. As new
guidance and requirements are issued, NWCA information management staff will assess the
impact upon the IM system and develop plans for ensuring timely compliance.

 4.4  Data Security

All data files in the IM system are protected from corruption by computer viruses, unauthorized
access,  and hardware and software failures.  Guidance and policy documents of EPA and
management policies established by the IM Technical Coordination Group for data access and
data confidentiality are followed. Raw and verified data files are accessible only to NWCA
Cooperators. Validated data files are accessible only to users specifically authorized by the EPA
Project Leader. Data files in the central repository used for access and dissemination are
marked as read-only to prevent corruption by inadvertent editing, additions,  or deletions.

Data generated, processed, and incorporated into the IM system are routinely stored as well as
archived on redundant systems. This ensures that if one system is destroyed or incapacitated,
IM staff will be able to reconstruct the data bases. Procedures developed to archive the data,
monitor the process, and recover the data are described in IM documentation.

Several backup copies of all data files and of the programs used for processing the data are
maintained. Backups of the entire system are maintained off-site. System backup procedures
are used. The central data base is backed up and archived according to pre-established
procedures. All data records, including raw data, are archived in an organized manner in
compliance with EPA and Federal Government records management policies. For example,
following verification/validation of the last submission into the NWCA database, all data is
copied to a terabyte external hard drive and sent to the Project Leader for inclusion in the project
file as permanent records. All laboratories generating data and developing data files must have
established procedures for backing up and archiving computerized data.

 4.5   Data Archive

Ultimately, all data will be transferred to U.S. EPA's agency-wide WQX (Water Quality
Exchange) data management system for archival purposes. WQX is  a repository for water
quality, biological,  and physical data and is used by state environmental agencies, EPA and
other federal agencies, universities,  private citizens, and many others. Revised from STORET,
WQX provides a centralized system for storage of physical, chemical, and  biological data and
associated analytical tools for data analysis. Data from the NWCA project in an Excel format will
be run  through an  Interface Module and uploaded to WQX. Once uploaded, states and tribes
will be able to download data (using Oracle software) from their region. In the  period after data
collection and before transfer to WQX, data will be archived in SWIMS.
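
The WQX Interface Module itself is an EPA tool and is not reproduced here. The sketch below only
illustrates the kind of pre-upload completeness check a Cooperator might run against an NWCA Excel
export before it is passed to the Interface Module; the file name and required column names are
hypothetical.

    import pandas as pd

    REQUIRED_COLUMNS = ["SITE_ID", "VISIT_NO", "DATE_COL", "ANALYTE", "RESULT", "UNITS"]  # hypothetical

    def check_export(path):
        """Report row count, missing required columns, and blank results in an Excel export."""
        df = pd.read_excel(path)                      # .xlsx files require the openpyxl engine
        missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
        blank = int(df["RESULT"].isna().sum()) if "RESULT" in df.columns else None
        return {"rows": len(df), "missing_columns": missing, "blank_results": blank}

    # print(check_export("nwca_chemistry_export.xlsx"))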

  5  INDICATORS

As first described in Chapter 2 (Data Quality Objectives, or DQOs), the NWCA has two DQOs:
one for condition estimates at the national scale, and the other for the condition estimates within
individual ecoregions. The DQO for national-scale estimates is as follows:

    Estimate the proportion of wetlands (± 5%) in the conterminous U.S. that fall below
   the designated threshold for good conditions for selected measures with 95%
   confidence.

The DQO for the ecoregions of interest is:

    Estimate the proportion of wetlands (± 15%) in a specific ecoregion that fall below
   the designated threshold for good conditions for selected measures with 95%
   confidence.

These two DQOs then govern the structure, performance, archiving,  and documentation of all
phases of the NWCA.
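
For orientation only, the normal-approximation margin of error for an estimated proportion under
simple random sampling shows roughly how sample size relates to the ±5% requirement. The NWCA uses
a more complex probability design, so this is an approximation, and the example sample size is
hypothetical.

    import math

    def margin_of_error(p_hat, n, z=1.96):
        """Approximate 95% confidence half-width for an estimated proportion (simple random sample)."""
        return z * math.sqrt(p_hat * (1 - p_hat) / n)

    # Example: with 900 sites and an estimated 50% of wetlands below the "good" threshold,
    # the half-width is about 0.033, i.e. within the +/- 5% national DQO.
    print(round(margin_of_error(0.5, 900), 3))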

The influence of these DQOs on the data-acquisition phase (i.e. field sampling) can generally be
divided into two types of sub-objectives. The first type of sub-objective influences Field Crew
performance. The second uses repeat visits to evaluate the overall effectiveness of the data
collection. The first sub-objective can be stated as follows:

   Sub-Objective 1:    The Field Crews implement the sampling protocols as designed to
                       collect high quality data.

Sub-objective 1 influences Field Crew operations across all indicators, and applies to both the
Crew doing the initial sampling of a POINT and the Crew doing the repeat samples.

QA on the implementation of the protocols by Field Crews would include:

   •  Appropriate expertise and qualifications (particularly the  Botanist);

   •  Training;

   •  Site Evaluation and Assistance Visits (i.e. QA audits);

   •  Review of data forms; and

   •  Sample tracking.


QA on ability to collect high quality data will focus on completeness (including consideration of
proportion of samples that cannot be processed because of problems with  how the sample was
taken) and accuracy as determined by checks by experts.  Two such checks include the check
of plant identification on the voucher specimens and the review of soil descriptions by the
regional soil scientist.

The second type of sub-objective addresses the overall effectiveness of data collection. This
type  is summarized in the following two sub-objectives:

   Sub-objective 2a:   The protocols, when implemented by a Field Crew meeting the QA
                       objective for sampling, generate a reproducible evaluation of
                       ecological condition as demonstrated with data from repeat samples
                       of the same POINT.

   Sub-Objective 2b:   The data collected by Field Crews meeting the QA objective for
                       sampling can distinguish between sites of different condition, i.e.,  are
                       robust, in the face of naturally-occurring and sampling variability.

Sub-Objective 2a addresses whether data collected by Field Crews at different times generate the
same answers about the condition of the site. This sub-objective influences post-sampling
analysis. Examples of QA measures are: (1) The same dominant species are observed in
repeat samples ± the MQO and (2) Vegetation structure and composition are not significantly
different in repeat samples as determined through the use of multivariate analysis.

Analysis for the Sub-Objective 2b is also done post sampling. It involves developing variance
estimates for data metrics which aid in characterizing the utility of metrics through signal to
noise ratios, etc. (Stoddard et al., 2008).
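
A minimal sketch of a signal-to-noise calculation of this kind, assuming a simple balanced case
with two visits per revisited site (the analyses cited in Stoddard et al. 2008 estimate the
variance components with mixed models rather than this shortcut):

    import statistics

    def signal_to_noise(visits):
        """visits: dict of site ID -> list of metric values from repeat visits.
        Signal is approximated by the variance of site means; noise by the mean
        within-site (revisit) variance."""
        site_means = [statistics.mean(values) for values in visits.values()]
        signal = statistics.variance(site_means)
        noise = statistics.mean(
            statistics.variance(values) for values in visits.values() if len(values) > 1
        )
        return signal / noise

    # Hypothetical metric values from two visits at each of four sites:
    print(round(signal_to_noise({"S1": [42, 45], "S2": [61, 58], "S3": [30, 33], "S4": [75, 71]}), 1))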

How each of the sub-objectives manifests itself in each of the indicators is detailed in the
following sections.


 5.1  Vegetation

 5.1.1 Introduction

Vegetation is a key attribute of most wetland ecosystems, is sensitive to human-caused
disturbance, and accurately reflects wetland condition and biological integrity. It has been used
effectively in assessing overall ecological condition and to distinguish particular stressors (Tiner
1999, Garnier et al. 2004, Quetier et al. 2007).

Wetland plant species 1) represent diverse adaptations, ecological tolerances, and life history
strategies, and 2) effectively integrate environmental conditions, species interactions, and
human-caused disturbance. Data describing species composition and abundance and
vegetation structure are powerful, robust, and relatively easy to gather.  In addition, they can be
used to derive a myriad of metrics or indicators that are useful descriptors of ecological integrity
or stress (e.g.,  USEPA 2002, Bourdaghs et al. 2006, Magee et al. 2008, Mack and Kentula  in
review). Examples of the types of data to be collected are:

   •   Species composition and abundance

   •   Native species

   •   Alien species

   •   Floristic quality

   •   Guild composition

   •   Community composition

   •   Vegetation structure


For more detailed information please see "Ecological Indicators for the 2011 National Wetland
Condition Assessment" (in preparation).

 5.1.2 Training and Field Audits

Protocols for collecting data describing species composition and abundance and vegetation
structure are provided in the Field Operations Manual (FOM) Vegetation Chapter. Standardized
training in implementation of these protocols will be provided to the Botanists and Field Crew
members who will assist with botanical data collection to  ensure collection of comparable data
across the national study area (see Section 1.3.1.1 for qualifications and duties). In addition,
quality assurance audits will be conducted at least once during the field season for each Field
Crew to ensure that the protocols are being implemented consistent with training. Ten percent
of all sample sites will  receive repeat visits to determine if differences exist in field data
collection on different days. Revisit sites must be sampled at least 2-4 weeks apart to ensure
that we are assessing  temporal variability.

 5.1.3 Sampling Design

There are two components to collecting vegetation information: the primary component involves
field or in situ measurement of various species composition indicators; the secondary
component involves collecting samples of plant specimens for all  unknown species and for 5
vouchers of known species from each site.

The vegetation sampling, observations and associated protocols developed for the NWCA are
based on the flexible-plot method of Peet et al. (1998), adapted to meet the objectives and data
collection needs of the NWCA. Vegetation sampling will take place in five 100-m2 plots arranged
systematically across the Assessment Area. The FOM Vegetation Chapter includes detailed
instructions for establishing the vegetation plots in standard or alternate configurations.
Vegetation composition, abundance and structure are assessed at the 100-m2 scale. Each plot
will contain  a series of nested quadrats established in two opposing corners to obtain estimates
of species diversity, based on species presence at multiple spatial scales (1.0 m2 and 10 m2).
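
As an illustration of how the nested-quadrat presence data might be structured once entered (the
record layout and species names here are hypothetical, not the NWCA database design):

    # For each plot, store the smallest quadrat area (m2) in which each species was observed.
    presence = {
        "PLOT_1": {"Typha latifolia": 1.0, "Carex stricta": 10.0, "Salix nigra": 100.0},
        "PLOT_2": {"Typha latifolia": 1.0, "Phalaris arundinacea": 10.0},
    }

    def richness_at_scale(plot_record, area_m2):
        """Count species detected at or below a given quadrat area within one plot."""
        return sum(1 for smallest in plot_record.values() if smallest <= area_m2)

    print(richness_at_scale(presence["PLOT_1"], 10.0))   # 2 of the 3 species appear by the 10-m2 scale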

To optimize vegetation characterization, field sampling for the NWCA will take place during the
peak growing season when most vegetation is in flower or fruit. Sampling during this period
minimizes seasonal phenological variability and enhances plant species identification accuracy,
particularly of difficult species such as grasses and sedges. Although some early ephemeral
flowering forbs may be missed by not sampling early in the season, most plant species will be in
mature reproductive stages and more readily detected.

On site, it is important to avoid trampling fragile wetland vegetation during sampling activities.
Also, to prevent spread of potentially harmful organisms between research sites,  all crew
members will employ ZERO TAXA TRANSPORT protocols (See FOM Daily Operations
Chapter) before leaving the AA.  Before entering the vehicle for return to base, field crews are
required to decontaminate shoes, clothing and person of all propagules or organisms.
Equipment must also be cleaned before replacing it in the vehicle.

 5.1.4 Field Measurements and Sampling

 5.1.4.1 Pre-Sampling Activities

Compiling data forms and organizing the equipment needed for the day's vegetation data
collection activities prior to beginning field work enhances efficiency of sampling throughout the
rest of the day. Some of this organizational work is completed at the base location or en route to
the road location nearest to the POINT.

The vegetation equipment checklist (FOM Vegetation Chapter) ensures all equipment is
present. Items should be  routinely located in the same places in the vehicle. Keeping the
equipment organized by storing and transporting items in the same  locations allows items to be
easily found, facilitates packing and unpacking the vehicle, minimizes mess and confusion, and
helps prevent loss. It is also important to confirm all needed gear  and supplies are present
before hiking in to the POINT, especially when the location of the POINT is a substantial
distance from the nearest road.

Several data forms are used in collecting vegetation data for the NWCA. Each data sheet
should be filled out according to the steps outlined in the Vegetation Chapter.  Plant specimen
labels and plant sample ID tags are also provided. Data forms include:

   a.  V-1 Vegetation Plot Establishment and Characterization Form
   b.  V-2 Vascular Species Presence and Cover Form
   c.  V-3 Ground Surface Attributes Form
   d.  V-4 Snag and Tree Counts and Tree Cover Form

 5.1.4.2 Sampling Activities

All field measurement and sampling operations will be conducted by a Vegetation (Veg) Team
consisting of a Botanist/Ecologist and Botanist Assistant.

All measurements and observations are recorded on standardized forms which are later entered
in to the central  National Aquatic Resource Surveys (NARS) surface waters information
management system at WED-Corvallis. Table 5.1-1 provides a brief summary of the
observations recorded by the Veg Team.

Table 5.1-1. Field measurement methods: vegetation

  Variable or Measurement    Units   Summary of Method

  Vascular strata coverage   %       Estimate total cover of emergent and non-aquatic vegetation by
                                     height class (<0.5, 0.5-2, 2-5, 5-15, 15-30, >30 m, or liana,
                                     vines, and epiphytes), submerged aquatic vegetation, and
                                     floating aquatic vegetation

  Non-vascular coverage      %       Estimate percent cover for non-vascular taxonomic groups
                                     (bryophytes, ground lichens, arboreal epiphytic bryophytes and
                                     lichens, filamentous or mat-forming algae, and macroalgae)

  Individual vascular        %       Estimate percent cover for each species and record the
  coverage                           predominant height class in which it occurs

  Ground surface             %       Estimate cover of water, bare ground, vegetative litter, and
  attributes: coverage               dead woody debris

  Ground surface             cm      Measurements for water (minimum, predominant, and maximum
  attributes: depth                  depth) and vegetative litter

  Tree coverage              %       Estimate percent cover for each species by height class (<0.5,
                                     0.5-2, 2-5, 5-15, 15-30, >30 m)

  Tree count                 None    Count stems for individuals >5 cm diameter breast height (dbh)
                                     by diameter class (5-10, 11-25, 26-50, 51-75, 76-100, 101-200,
                                     and >200 cm), by species

  Standing dead trees        None    Count total number of stems >5 cm dbh by diameter class (5-10,
  and snags                          11-25, 26-50, 51-75, 76-100, 101-200, and >200 cm), by species

  Species presence data      None    For each species present, record the smallest quadrat in which
                                     it occurs

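The stem tallies in Table 5.1-1 amount to binning each measured dbh into the listed diameter
classes. A minimal sketch, assuming dbh is recorded to the nearest centimeter (the species and
values in the example are hypothetical):

    from collections import Counter

    DBH_CLASSES = [(5, 10), (11, 25), (26, 50), (51, 75), (76, 100), (101, 200)]   # cm, from Table 5.1-1

    def diameter_class(dbh_cm):
        """Return the diameter-class label for a stem, or None if dbh is under 5 cm."""
        dbh = round(dbh_cm)
        if dbh < 5:
            return None                      # stems under 5 cm dbh are not tallied
        for low, high in DBH_CLASSES:
            if low <= dbh <= high:
                return f"{low}-{high}"
        return ">200"

    def tally(stems):
        """stems: iterable of (species, dbh_cm) pairs; count stems by species and diameter class."""
        return Counter((species, diameter_class(dbh)) for species, dbh in stems if diameter_class(dbh))

    print(tally([("Acer rubrum", 12.0), ("Acer rubrum", 33.5), ("Nyssa aquatica", 7.2)]))
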
General Cover Estimation Protocols:
The entire range of values from 0 to 100% may be used when estimating cover for a species or
other entity within each 100-m2 plot. However, it is not necessary or appropriate to deliberate
extensively over small differences in values for cover estimates (e.g., 0.1% or 0.5%, 5 or 7%, 25
or 30%, 75% or 85%). This degree of precision is likely to exceed the accuracy of the
Botanist/Ecologist's ability to detect cover differences over the area of the module. See Table
5.1-2 for guidelines on increments of resolution for different cover ranges.

Table 5.1-2. Guidelines for resolution when estimating percent cover

  For Cover Of:           Use % Increments Of:
  Trace (<1%) = 0.1%      NA
  1 to 5%                 1%
  5 to 25%                5%
  25 to 50%               5 to 10%
  55 to 100%              10 to 15%
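
As an illustration of the Table 5.1-2 guidance, the sketch below rounds a raw ocular estimate to an
increment consistent with the table; where the table gives a range of increments, the smaller value
is used here, which is an assumption for illustration only.

    def round_cover(raw_estimate):
        """Round an ocular cover estimate (%) to an increment consistent with Table 5.1-2."""
        if raw_estimate <= 0:
            return 0.0
        if raw_estimate < 1:
            return 0.1                         # trace cover is recorded as 0.1%
        if raw_estimate <= 5:
            step = 1
        elif raw_estimate <= 25:
            step = 5
        elif raw_estimate <= 50:
            step = 5
        else:
            step = 10
        return min(100, round(raw_estimate / step) * step)

    for raw in (0.4, 3.2, 17, 37, 83):
        print(raw, "->", round_cover(raw))     # 0.1, 3, 15, 35, 80
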
Nomenclature:
To effectively identify plant species in the field and to key unknown taxa, it will be necessary to
use local floras appropriate to each region or state. This means numerous taxonomies will be
applied across the 48 conterminous states comprising the study area. To reduce potential
discrepancies in nomenclature, it is suggested that each Botanist/Ecologist reconciles species
names to the USDA PLANTS nomenclature.

Collecting Plant Material for Specimens:
Throughout the sampling day,  the Botanist/Ecologist and Botanist Assistant collect all unknown
plant species and five known plant species (randomly selected from species identified in the
100-m2 vegetation plots) from the site. Specimens are carefully labeled with tracking information
and placed in a plant press to dry. The Veg Team ensures that all tracking information  always
remains with the specimens (pertinent information written on the Plant Specimen Tag and
affixed on the newspaper sleeve containing the specimen and on the  Plant Specimen Label
Form). Detailed instructions for specimen collection, pressing, labeling, shipping, and tracking
are found in the FOM Vegetation Chapter. Voucher specimens should not be collected for
plants that are rare within the vegetation plot or Assessment Area.

Pressing Plant Specimens:
Plant specimens represent critical vegetation data; thus, it is important to press plant material as
soon after collection as  practicable to preserve the morphological features of the plant for later
identification by a botanist. In those situations where important morphological features  may be
damaged or lost by pressing and drying (i.e., flower color, fruit color, and fruit shape) it is
important to document these features on the plant specimen label form. Plant specimen labels
are considered field data, and  a plant specimen is incomplete in the absence of accurate label
data. A completed Plant Specimen Label Form should be enclosed in the newsprint folder of
each specimen. Photographic documentation is also valuable. The field day is not considered
finished  until all plant specimens collected in the field are properly pressed and labeled. Key
elements of label data and steps for pressing plant specimens can be found in the appropriate
sections of the FOM Vegetation Chapter.


Drying, Storing, and Shipping Plant Specimens:
Normally, pressed plant specimens should be thoroughly dried before removing them from the
presses. Ideally, full plant presses will be returned to the base location after a few field days and
placed on plant dryers to dry. Once the specimens are dry they can be removed from the press
and shipped to an expert for identification or stored for later identification by the Veg Team
during non-field days of the field season.

For crews that work for more than four or five days in the field without returning to a location
where plants can be dried, wet plant specimens may need to be removed from the presses,
packaged and shipped to a location where the specimens can be dried and processed.

The steps for  handling specimens once they are in the press can be found in the appropriate
section of the  FOM Vegetation Chapter.

 5.1.4.3 Quality Assurance Objectives

As mentioned above in section 2.2.2 (Precision, Bias, and Accuracy), precision of field
measurements will not be monitored during the NWCA. Previous plant identification experience
or class work will be valuable for Veg Team members, but mandatory NWCA training will
prepare the crew to accurately complete vegetation data collection tasks according to the
standardized field protocols.

MQOs are given in Table 5.1-3. General requirements for comparability and representativeness
are addressed in Section 2. The MQOs given in Table 5.1-3 represent the maximum allowable
criteria for statistical control purposes. Precision is determined from results of revisits (field
measurements) taken on a different day (at least 2-4 weeks apart).

 5.1.5 Laboratory Methods

For the purposes of this manual, a Herbarium represents the person or facility identifying and
processing unknown specimens collected in the field. This could be a field botanist, a
state-identified herbarium, an EPA-identified regional herbarium, or a national EPA contractor. The Herbarium is
responsible for ensuring all plant identification and processing tasks outlined in this manual are
completed. In some cases this may require the Herbarium to identify partners to assist with the
work. The Herbarium identifies all  unknown plant species.

Known plant species collected for quality assurance are sent to the QA Herbarium for re-
identification. A QA Herbarium is an independent qualified botanist, state or EPA identified
regional herbarium that agrees to  use the NWCA prescribed methods to ensure that all QA
vouchers receive the same level of taxonomic precision. Ten percent of unknown specimens
identified by the Herbarium are also sent to the QA Herbarium for re-identification and quality
assurance. Details on how the Herbarium and QA Herbarium should handle and identify plant
specimens can be found in appropriate section of the LOM Vegetation Chapter. Voucher
specimens will arrive at the QA Herbarium without a species name. The QA Herbarium will then
blindly re-identify all species to ensure that the identifications are independent.

Voucher and unknown specimens may arrive at the Herbarium or QA Herbarium either dried
and pressed, or pressed and possibly still wet in the plant press. If specimens  are pressed  and
dried they must be treated for contamination (detritivores, molds, and pests) before
identification.  If specimens are still wet in the plant press they must be dried and treated for
contamination before identification (LOM Vegetation Chapter). It will be important to maintain a
record of specimen custody from shipping through identification to data entry to ensure the correct
species name is recorded for the appropriate NWCA site and Vegetation Plot in the database.

Tracking Specimens:
In the field, each voucher specimen collected is assigned a set of tracking information, which is
recorded on the Plant Specimen Tracking  Form. If a specimen does not have any of the
necessary information, contact the Logistics Coordinator as soon as possible. It is important that
every specimen sent to and received by the lab is tracked following the protocols described in
the appropriate section of the LOM Vegetation Chapter. Specimens may follow one of the paths
described in Figure 5-1.
[Flowchart showing the paths a specimen may follow:
   1a) Field Crew Botanist/Ecologist, or other State Botanist or Herbarium selected by the State,
       identifies unknown specimens
   1b) EcoAnalysts Plant Lab identifies unknown specimens
   2a) Quality Assurance Species (5 randomly selected species of known identity for QA
       verification, 1 species per Veg Plot) go to the Independent Botanist
   2b) 10% of the species identified in steps 1a or 1b, randomly selected for QA verification, go
       to the Independent Botanist
   3) Independent Botanist (for specimens from 2a, not the person who collected the QA specimens
      in the field; for specimens from 2b, not the person who identified these species)]
Figure 5-1: Potential options for plant vouchers collected as part of the 2011 NWCA

Nomenclature:
As noted in Section 5.1.4.2, numerous taxonomies will be applied during the 2011 NWCA. The Herbarium and
QA Herbarium will reconcile all species received to the standard found in USDA Plants. The
LOM Vegetation Chapter contains more information on reconciling taxonomy to USDA Plants.

 5.1.6 Quality Assurance Objectives

MQOs are given in Table 5.1-3. General requirements for comparability and representativeness
are addressed in Section 2. The MQOs given in Table 5.1-3 represent the maximum allowable
criteria for statistical control purposes. Precision is determined from results of revisits (field
measurements) taken on a different day (at least 2-4 weeks apart).

Table 5.1-3. Measurement data quality objectives: vegetation indicator

  Variable or Measurement                Precision   Taxonomic Disagreement   Completeness
  Field Measurements and Observations    ±10%        <15%                     90%

  NA = not applicable in most cases. This would apply if the field auditor did a separate
  assessment and compared the results to the crew's.

 5.1.7 Quality Control Procedures: Field Operations

Precision, bias and accuracy of field measurements will not be monitored during the NWCA (see footnote 5).
Control measures to minimize measurement error among crews and sites include the use of
standardized field protocols, consistent training of all crews, field assistance visits to all crews,
and availability of experienced technical personnel during the field season to respond to site-
specific questions from field crews as they arise.

Upon completion of sampling, the Botanist/Ecologist reviews all vegetation forms for
completeness, legibility, and for any errors in species names.

The Botanist/Ecologist checks the voucher collection record on the Vascular Plant Species
Presence and Cover Form (FOM Vegetation Chapter) for all taxa with pseudonyms to ensure
that specimens have been collected for all unknown species. Additionally, the Botanist/Ecologist
and Botanist Assistant collect 5 known plant species (randomly selected from species identified
from the 100-m2 vegetation plots) as voucher specimens. These voucher specimens will be sent
to a QC taxonomist for re-identification.

   1.  The QC taxonomist will perform re-identifications completing a copy of the Vegetation
       Taxonomic Bench Sheet for each specimen. Each bench sheet must be labeled with the
       term "QC ID." As each bench sheet is completed, it must be faxed or emailed to the
       project facilitator.
Footnote 5: Bias, for example, cannot be determined directly, since the "true" values at any
particular site are not known.

   2.  The project facilitator will compare the taxonomic results generated by the primary and
       QC taxonomists for each specimen and calculate percent taxonomic disagreement
       (PTD) as measures of taxonomic precision (Stribling et al. 2003) as follows:
                     PTD = [1 - (comp_pos / N)] x 100                    Equation 1

       where
              comp_pos = the number of agreements (positive comparisons)
              N        = the total number of specimens in the larger of the two counts.

       (An illustrative sketch of this calculation follows the numbered list below.)

   3.  Unless otherwise specified by project goals and objectives, the measurement quality
       objective for enumerations will be a mean PTD less than or equal to 15, calculated from
       all the specimens sent to the QC taxonomist. Results greater than these values will be
       investigated and logged for indication of error patterns or trends, but all values will
       generally be considered acceptable for further analysis, unless the investigation reveals
       significant problems.

   4.  Corrective action will include determining problem areas (taxa) and consistent
       disagreements, addressing problems through taxonomist interactions. Disagreements
       resulting from identification to a specific taxonomic level, creating the possibility to
       double-count "unique" or "distinct" taxa will also be rectified through corrective actions.

   5.  The project facilitator will prepare a report or technical memorandum. This document will
       quantify both aspects of taxonomic precision, assess data acceptability, highlight
       taxonomic problem areas, and provide  recommendations for improving precision. This
       report will be submitted to the project manager, with copies sent to the primary and QC
       taxonomists and another copy maintained in the project file.
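
The following is a minimal sketch of the PTD comparison defined in Equation 1, assuming the primary
and QC identifications are keyed by a specimen tracking number; the dictionary structure and
example names are hypothetical.

    def percent_taxonomic_disagreement(primary, qc):
        """primary, qc: dicts mapping specimen ID -> taxon name (reconciled to USDA PLANTS).
        Returns PTD per Equation 1: [1 - (comp_pos / N)] x 100, with N the larger count."""
        comp_pos = sum(1 for spec_id, name in primary.items()
                       if qc.get(spec_id, "").strip().lower() == name.strip().lower())
        n = max(len(primary), len(qc))
        return (1 - comp_pos / n) * 100

    primary_ids = {"NWCA-001": "Carex stricta", "NWCA-002": "Typha latifolia", "NWCA-003": "Juncus effusus"}
    qc_ids = {"NWCA-001": "Carex stricta", "NWCA-002": "Typha angustifolia", "NWCA-003": "Juncus effusus"}
    print(round(percent_taxonomic_disagreement(primary_ids, qc_ids), 1))   # 33.3, which exceeds the 15 MQO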

Ten percent of all sites will receive repeat sampling visits to be sampled by a Field Crew  to
determine the extent to which the population estimates might vary if they were sampled at a
different time (revisit sites must be sampled at least 2-4 weeks apart).

 5.1.8 Quality Control Procedures: Laboratory Operations

A subset of plant samples collected as unknowns and later identified by the lab will need  to be
verified by a QA taxonomist for additional Quality Assurance. The lab will randomly select 10%
of the identified unknown samples to be sent to the QA taxonomist, another experienced
taxonomist who did not participate in the original identifications. A chain-of-custody form will be
completed and sent with the specimens.

   6.  The QC taxonomist  will perform re-identifications completing another copy of the
       Vegetation Taxonomic Bench Sheet for each specimen.  Each bench sheet must be
       labeled with the term "QC Re-ID." As each bench sheet is completed, it must be faxed to
       the project facilitator.


   7.  The project facilitator will compare the taxonomic results generated by the primary and
       QC taxonomists for each specimen and calculate percent taxonomic disagreement
       (PTD) as measures of taxonomic precision (Stribling et al. 2003) as follows:
                     PTD = [1 - (comp_pos / N)] x 100                    Equation 1

       where
              comp_pos = the number of agreements (positive comparisons)
              N        = the total number of specimens in the larger of the two counts.

   8.  Unless otherwise specified by project goals and objectives, the measurement quality
       objective for enumerations will be a mean PTD less than or equal to 15, calculated from
       all the specimens in the 10% set sent to the QC taxonomist. Results greater than these
       values will be investigated and logged for indication of error patterns or trends, but all
       values will generally be considered acceptable for further analysis, unless the
       investigation reveals significant problems.

   9.  Corrective action will include determining  problem areas (taxa) and consistent
       disagreements, addressing problems through taxonomist interactions. Disagreements
       resulting from  identification to a specific taxonomic level, creating the possibility to
       double-count "unique" or "distinct" taxa will also be rectified through corrective actions.

   10. The project facilitator will prepare a report or technical memorandum. This document will
       quantify both aspects of taxonomic precision, assess data acceptability, highlight
       taxonomic problem areas, and provide  recommendations for improving precision. This
       report will be submitted to the project manager, with copies sent to the primary and QC
       taxonomists and another copy maintained in the project file.

 5.1.9 Data Management, Review, and Validation

The Botanist and Field Crew Leader are responsible for the validity of all field-generated data
(i.e. measurement and observation data) up to the point it is sent to EPA (ORD/Corvallis). The
Botanist and Field Crew Leader are likewise responsible for the proper labeling, storage, and
delivery for shipping of all voucher samples,  and for informing ORD/Corvallis when  samples
have been shipped. Laboratory SOPs (see Chapter 2 for details) will be followed to ensure that
data generated and delivered to EPA are valid. Once  data have been delivered to EPA, data
quality (DQ) procedures (as detailed in Chapter 2) will be followed to ensure the validity of data
in storage, analysis, reporting and archiving. All raw data (including all standardized forms and
logbooks) are retained permanently in an organized fashion in accordance with EPA records
management policies.

 5.2  Soils

 5.2.1 Introduction

The presence of hydric soil is a defining characteristic of wetland ecosystems.  Soils influence
surface and ground water movement in wetlands. Soils also provide a matrix for biogeochemical
processes (e.g., nutrient cycling, pollutant storage) which affect wetland vegetation and other
wetland ecosystem components that reflect ecological condition (Tiner 1991, Mitsch and
Gosselink 2007). Examples of the types of data to be collected are:

   •   Hydric soil field indicators

   •   Description of site, soil morphology, and other characteristics

   •   Soil chemistry

   •   Soil isotope and sediment enzymes

   •   Bulk density


For more detailed information please see "Ecological Indicators for the 2011 National Wetland
Condition Assessment" (in preparation).

 5.2.2 Sampling Design

The soil sampling, observations and associated protocols used in the NWCA were chosen in
consultation with USDA soil scientists, as well as wetlands scientists and field sampling experts
in the EPA, states, tribes, academia and the private sector. For the soil indicator class,
individual metrics were chosen to:

   •   Describe current physical and chemical properties of the soil;

   •   Identify the presence of hydric soils (which inform soil development and history); and

   •   Ascertain the presence and extent of disturbance which affects soil function.


There are two components to collecting soil information: The first component involves field
measurement and description of soil macromorphological properties, e.g., texture, color, and
structural attributes;  the second component involves collecting soil samples for laboratory
analysis of various physical characteristics and chemical constituents (NWCA FOM, 2009).

As described in Section 1.3.1 (Overview of Field Operations) above, NWCA Field Crews will be
divided into two teams, the Veg Team (2 members) and the AB Team (2 members) (NWCA
FOM, 2009). The AB Team will be responsible for collecting Soil  Indicator samples and site
descriptions.

After the Veg Team  has delineated the Vegetation plots (See Section 5.1 above), soil-related
sampling will be conducted at four soil pits, located at the southeast corner of the vegetation
plot. The Soils Chapter of the FOM details  how the four pits will be located, as well as detailed
procedures for completing the protocols; equipment and supplies required are also listed. Two
activities will take place 1) description of the soil and 2) collection of soil samples for laboratory
analysis.

Soil profile information will be the first data collected  at the pit, from a 25cm x 10cm x 60cm
slab. Soil samples will be collected at one pit,  chosen as representative of the soil in the AA,
only after soil  profile information is recorded. As detailed in the FOM Soils Chapter 6, there are
3 distinct soil sample collection protocols:


   1.   Samples collected for physical, chemical and nutrient analysis

   2.   Samples collected for soil isotope and sediment enzyme analysis

   3.   Samples collected for determination of bulk density (Db)


A modified 60 mL plastic syringe will be used to collect three isotope samples and six enzyme
core samples from three locations around the unexpanded representative pit. The isotope
sample will be placed in a clean quart-size zip lock plastic bag and the enzyme sample will be
placed in a gallon-sized zip lock bag. Each of the bags is pre-labeled (using an indelible pen)
with the Site ID  #, date, and sample number.

Bulk density and chemistry samples are  collected from each soil horizon that is greater than 8
cm thick. A special bulk density sampler is used to collect three cores of known volume for bulk
density, which are placed together in a pre-labeled soil bag. The label on the
outside of the bag will contain the depth  of the horizon. Another label is stapled on the outside
of the bag that contains the diameter and length of cores and volume of sample.
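
Although the Db determination itself is a laboratory task, the underlying calculation is simply
oven-dry soil mass divided by the known core volume. A minimal sketch with hypothetical core
dimensions and mass:

    import math

    def core_volume_cm3(diameter_cm, length_cm, n_cores=3):
        """Total volume of the cores collected for bulk density (three cores per horizon)."""
        return n_cores * math.pi * (diameter_cm / 2) ** 2 * length_cm

    def bulk_density_g_cm3(oven_dry_mass_g, total_volume_cm3):
        """Db = oven-dry soil mass / total core volume."""
        return oven_dry_mass_g / total_volume_cm3

    volume = core_volume_cm3(diameter_cm=5.4, length_cm=6.0)       # hypothetical sampler dimensions
    print(round(bulk_density_g_cm3(oven_dry_mass_g=520, total_volume_cm3=volume), 2))   # about 1.26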

Chemistry samples are collected simultaneously with bulk density samples.  When the corer is
placed, loose soil from the same horizon is broken off and placed in the pre-labeled soil bag.
The label will contain the depth of the horizon.  If large rocks are removed from the sample, the
estimated percent volume that they made up should be recorded on the second label that is
stapled to the outside of the bag.

Tools will be wiped clean between sampling pits to prevent contamination of soils collected at
different horizons, soil  pits, and sites.

On site, it is important to avoid trampling fragile wetland  vegetation during soil sampling
activities. Also, to prevent spread of potentially harmful organisms between research sites, all
crew members will employ ZERO TAXA  TRANSPORT protocols (See FOM Daily Operations
Chapter) before leaving the AA, and before entering the vehicle for return to base.
Decontaminate  shoes, clothing and person of all propagules or organisms. Clean equipment
before replacing it in vehicle.

Shipping protocols differ for the different soil samples. Soil enzyme and isotope samples are
shipped on ice within 24 hours of collection.  Bulk density and chemistry samples should be kept
cool and can be held for up to two weeks and batched for shipping.

Shipping and receiving regulated soils.  Soils that may contain pests (i.e., bacteria, plant
viruses, fungi, nematodes, and life  stages of destructive mollusks, acari, and insects) are
regulated by the U.S. Department of Agriculture's Animal and Plant Health Inspection Service
(APHIS). Areas within states that are under Federal quarantine must follow the conditions and
safeguards prescribed by APHIS before  shipping to another part of the country. To ensure that
the national survey is in compliance with APHIS recommendations all soils collected for the
survey will be shipped  as regulated soils. Participating labs are responsible for obtaining and
maintaining a valid permit for receiving regulated soils (USDA APHIS PPQ 525-A, Figure 5-2
below).

Upon arrival  at the lab, soil samples will be separated into regulated and non-regulated based
on their county and state of origin (as recorded on the waterproof label affixed to the outside of
the sample bag). The lab is responsible for following the APHIS protocols in 7 CFR 330 when
handling or disposing of regulated soils.

            APHIS  Animal and
                    Plant Health
                    Inspection Service
                                          Lnited States Department of Agriculture
                                        Animal and Plant Health Inspection Service
                                                      4700  River Road
                                                   Riverdale. MD 20737

                                                Permit to Receive Soil
                                                 Regulated by 1 CFR 330
                                 This permit was generated electronically via the ePermits system.
           PERMITTEE NAME:    Dr. Thomas Reinsch               PERMIT NUMBER:        P330-08-00009
           COMPANY:            USDA-NRCS-NSSC               APPLICATION NUMBER: P525-071002-007
           RECEI\TNG ADDRESS: Federal Building. Room 152. MS 41  DATE ISSUED:            01/14/2008
                                   100 Centennial Mall North
                                   Lincoln. XE 68508-3866
                                   Federal Building. Room 152, MS 41
                                   100 Centennial Mall North
                                   Lincoln. NE 68508-3866
                                   (402)437-4179
                                   (402)437-5760                   EXPIRES:               01/14/2011
MAILING ADDRESS:
PHONE:
FAX:
           PORTS OF ARRIVAL/PLANT INSPECTION STATIONS: Various Ports of Entry Staffed by CBP-Agnculrure
           Inspection
           HAND CARRY:         No
                                  Under the conditions specified, this permit authorizes the following:
                                          Quantity of Soil pel Shipment and Treatment
                                                          Over 3 Ibs
                                                   PERMIT CONDITIONS
             1.  This permit authorizes the importation of soil, under the conditions specified below. Upon arrival in the United
             States, the articles, shipping containers), and paperwork are subject to inspection by officials of Customs and
             Border Protection. Agriculture Inspection (CBP-AI) and the USDA. Plant Protection and Quarantine (PPQ).

             2.  Under the Plant Protection Act. individuals or corporations who fail to comply with the following conditions and
             authorizations, or who forge, counterfeit, or deface permits or shipping labels will receive civil or criminal penalties.
             and will have all current permits cancelled and future permit applications denied.

             3.  Any person who unloads, lauds, or otherwise brings or moves into or through the United States any regulated
             plants, plant products, plant pests, soil or other products or articles in violation of the regulations will be subject to
             prosecution under the applicable provisions of the law.
                                                                                                   Pe-mit iMjTite-
THIS PERMIT HAS BEEN APPROVED ELECTRONICALLY BY THE FOLLOWING
PPQ HEADQUARTER OFFICIAL VIA EPERMITS.
^~- ^T^r
Maria Corpuz
DATE
01/14/2008
            WARMING: Any alteration, forgery or unauthorized use of this Federal Form is subject to civil penalties of up to $25Q:0(X) (7 U.S.C.s 7734(b)) or punishable by a 5ne of net more -J
            S10.000, or imprisonment of net more ±au 5 years, or both (18 U.S.C.s 1001)

-------
National Wetland Condition Assessment
QA Project Plan Version 2
        March  2012
    Page 66 of 120
              APHIS  Animal and
                      Plant Health
                      Inspection Service
     Plant
     Protection &
13  Quarantine
              4. All foreign cargo of agricultural interest is inspected at the first port of arrival or the first port of unlading- If a
              shipment arrives at a port without the required official personnel available to do the proper inspection. and or
              treatment, any subsequent movement, or any transfer anct'or transloading. must be approved by CBP-AI

              5. A copy of this permit must accompany all shipments authorized under this permit.

              6. The soil is to be shipped in sturdy, leak-proof containers.
              7. CBP-AI and PPQ have the option to order and approve treatment, re-exportation or destruction of a shipment, a
              portion of a shipment, or any other material associated with the shipment (i.e. pallets, packaging, means of
              conveyance). This will be done if the official personnel  find that the shipment requires treatment as a condition of
              entry, is contaminated with a quarantine plant pest or pests, is commingled with prohibited plant material, or if
              required documentation is incomplete or missing.

              8. The shipment must be free from foreign matter or debris, plants, noxious weed seeds, and living organisms such
              as parasitic plants, pathogens, insects, snails, and mites. Material found to be commingled with unauthorized material
              will be subject to the same action (i.e. re-export, destruction) as the unauthorized material.

              9. All solid wood packing material (SWPM) present with this shipment must be in compliance with ISPM 15
              treatment and IPPC stamp requirements and enforcement. Noucoinphaut shipments, will be treated, re-exported or
              destroyed at the consignee's expense.
              10. All costs and arrangements for the safeguarding of the cargo and the transportation of the cargo are the
              responsibility of the importer, broker, or other parties associated with the shipment.
              11. The shipment can be released without treatment at the port of entry to the permittee's address listed on the permit
              or label, or an authorized user only if the final destination is an approved facility listed at
              https:, web01.apliis.usda.gov PPQ AuthSoilLabs.nsf web?opeufomi.

              12. Permit is to be utilized by the permittee or authorized user only (authorized users must present a written, dated.
              and signed statement on letterhead from the permittee, along with a valid ID and a copy of this permit).
              13. There is no further distribution of soil without prior approval from the State and Federal Regulatory Officials.
              Soil is to be used strictly for analysis in a laboratory environment at USDA-NRCS-NSSC located in Lincoln. NE.
              14. Upon receipt, all samples will remain within the approved soil laboratory identified on this permit. Laboratory
              access is restricted to individuals authorized by the permit holder.

              15. Tins permit does not authorize the use of soil for growing purposes and or the isolation or culture of organisms
              sourced from imported soil.

              16. All uuconsurned soil,  containers, and effluent is to be autoclaved. incinerated, or properly sterilized by the
              permittee at the conclusion of the project as approved and prescribed by PPQ in the compliance agreement.

              17. Valid for shipments of soil not heat treated at the port of entry, only if a Compliance Agreement (PPQ Form
              519) has been completed and signed. Compliance Agreements and Soil Permits are non-transferable. Notify local
              USDA office promptly if the permittee leaves the company.

              IS. Tins permit authorizes shipments from all foreign sources, including Guam. Hawaii.
              Puerto Rico, and the U.S. Virgin Islands through any U.S. port of entry.
                                                  END OF PERMIT CONDITIONS
                                                                                                        Permit Number
THIS PERMIT HAS BEEN APPROVED ELECTRONICALLY BY THE FOLLOWING
PPQ HEADQUARTERS OFFICIAL VIA EPERMITS
Maria Corpuz
DATE
01/14/2003
             WARNING: Any alteration, forgery, or unauthorized use of this federal form is subject to civil penalties of up to $250,000 (7 U.S.C. § 7734(b)) or punishable by a fine of not more than $10,000, or imprisonment of not more than 5 years, or both (18 U.S.C. § 1001).
Figure 5-2: Example PPQ 525-A Regulated Soils Permit

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 67 of 120
 5.2.3 Sampling and Analytical Methods

Field Observations and Sampling:

Field measurement and sampling operations will be conducted by the AB Team. Field
measurements collected are described in Table 5.2-1 below. All observations are recorded on
standardized forms, which are later entered into the central NARS surface waters information
management system at WED-Corvallis.

Table 5.2-1. Field measurement methods: soil profile metrics.

Variable or Measurement     Units     Summary of Method

For each pit
Date                        NA        Date of observations
Location                    d, m, s   Latitude & longitude, from GPS
Depth                       cm        Depth of profile observations
Hydric soil indicators      NA        Compare the soil texture as determined for the soil profile
                                      horizons (see below) to the descriptive keys in both generic
                                      and regionally specific versions of the Field Indicators of
                                      Hydric Soils in the United States (USDA and NRCS 2006)
Water level                 cm        Depth to water from soil surface
Saturation level            cm        Depth to saturated soil from soil surface

For each of the top 7 (O, A, E, B, C, L, R) soil horizons
Horizon depth               cm        Measure the depth of each soil horizon
Texture                     NA        Simple hand test described in FOM Figure 9-4 (NRCS 2009,
                                      modified from S.J. Thien 1979)
Matrix color                NA        Compare moist color to color chips from the Munsell Color Book
Redoximorphic features:     NA        Same as matrix color method
concentrations or
depletions color


As mentioned in Section 5.2.2, soil samples will be collected for soil isotope, enzyme, chemical,
and nutrient analysis, and for determination of bulk density (Db). For bulk density and chemistry,
separate samples will be collected from the representative pit for each soil horizon measuring
greater than 8 cm, down to 1.25 m if possible. Confirmation of sample collection status will be
recorded on the standardized form.
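
The horizon-selection rule above can be restated in a short illustrative sketch. The Python
function below is not part of the NWCA protocols or documentation; the function name, data
structure, example profile, and the treatment of a horizon that straddles the 125 cm cutoff are
assumptions used only to illustrate the "greater than 8 cm, down to 125 cm" rule.

    # Illustrative sketch only (not an NWCA tool): select the horizons that receive
    # separate bulk density and chemistry samples, i.e., horizons measuring more than
    # 8 cm, down to 125 cm.
    def horizons_to_sample(horizons, min_thickness_cm=8, max_depth_cm=125):
        """horizons: list of (name, top_cm, bottom_cm) tuples with top < bottom."""
        selected = []
        for name, top, bottom in horizons:
            if top >= max_depth_cm:                   # horizon starts below the cutoff
                continue
            thickness = min(bottom, max_depth_cm) - top
            if thickness > min_thickness_cm:          # "measuring greater than 8 cm"
                selected.append(name)
        return selected

    # Example profile (depths in cm): the 5 cm E horizon is skipped; the C horizon
    # is truncated at 125 cm but still sampled.
    profile = [("O", 0, 12), ("A", 12, 40), ("E", 40, 45), ("B", 45, 90), ("C", 90, 150)]
    print(horizons_to_sample(profile))                # ['O', 'A', 'B', 'C']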

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 68 of 120
Table 5.2-2. Soil Sample Collection

Sample Type     Summary of Method
Soil isotope    One core from each of three locations, from the uppermost horizon, around the
                unexpanded soil pit
Soil enzyme     Two cores from each of three locations, from the uppermost horizon, around the
                unexpanded soil pit
Bulk density    Three cores of known volume from each soil horizon that measures more than 8 cm,
                to 125 cm
Chemistry       Approximately 1 to 1.5 L of sediment from each soil horizon that measures more
                than 8 cm, to 125 cm

All receipts and records of shipping will be kept as part of the permanent record of the NWCA
and copies of the pertinent NWCA Soil Sample Form(s) will be included with shipped samples.

 5.2.4 Quality Assurance Objectives

As mentioned above in section 2.2.2 (Precision, Bias, and Accuracy), precision of field
measurements will not be monitored during the NWCA. Previous soils experience or class work
will be valuable for AB team members, but mandatory NWCA training will provide an
understanding of basic soil processes, soil description methods, and sampling techniques. This
training will prepare the crew to accurately complete soil data collection tasks according to the
standardized field protocols.

MQOs are given in Table 5.2-3. General requirements for comparability and representativeness
are addressed in Section 2. The MQOs given in Table 5.2-3 represent the maximum allowable
criteria for statistical control purposes. Precision is determined by the comparison of field
measurements from two visits to the same site; the  revisit is at least 2-4 weeks after the first
visit. Due to the high level of disturbance caused by the soil sampling methods, it is not
appropriate for the soil protocols to be completed in the same location twice. During the second
sampling event the AB team will locate the soil pits as close to the original soil pit locations as
possible without entering into the zone of disturbance created by the first sampling event. This
will ensure that the soil data collected are as similar to the original data as possible.
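
As an illustration of how the revisit comparison could be evaluated against the ±10% precision
MQO in Table 5.2-3, a minimal Python sketch is given below. The percent-difference formulation
(absolute difference as a percent of the pair mean) and all names are assumptions for
illustration; they are not prescribed by the NWCA.

    # Illustrative sketch only: compare paired field measurements from the first
    # visit and the 2-4 week revisit against a +/-10% precision objective.
    def percent_difference(visit1, visit2):
        mean = (visit1 + visit2) / 2.0
        return 0.0 if mean == 0 else abs(visit1 - visit2) / mean * 100.0

    def meets_mqo(visit1, visit2, mqo_percent=10.0):
        return percent_difference(visit1, visit2) <= mqo_percent

    # Example: depth to water (cm) measured at the same site on the two visits.
    print(meets_mqo(42.0, 45.0))   # True  (about 6.9% difference)
    print(meets_mqo(42.0, 55.0))   # False (about 26.8% difference)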

Table 5.2-3. Measurement quality objectives: soil indicator

Variable or Measurement                 Precision   Accuracy   Completeness
Field Measurements and Observations     ±10%        NA         90%

NA = not applicable in most cases. This would apply if the field auditor did a separate
assessment and compared the results to the crew's.

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 69 of 120
 5.2.5 Quality Control Procedures: Field Operations

Control measures to minimize measurement error among crews and sites include the use of
standardized field protocols, consistent training of all crews, and availability of experienced
technical personnel during the field season to respond to site-specific questions from field
crews. Additionally, field crews will apply a consistent labeling convention across all samples
(see FOM Soils Chapter for details on info to include on labels).

Other controls include audits and revisits. Quality assurance audits are conducted of each Field
Crew at least once during the field season, to ensure the protocols followed are consistent with
training. Ten percent of all sites will receive repeat sampling visits to be sampled by a Field
Crew to determine the extent to which the population estimates might vary if they were sampled
at a different time.

In addition, field Crew Leaders are responsible for reviewing all forms for completeness and
legibility, and  ensuring that all samples are properly collected and shipped. Field forms are then
sent to participating NRCS State Soil Scientists to ensure that horizon designations are correct.
Specific quality control measures are listed in Table 5.2-4 for field measurements and
observations.
Table 5.2-4. Field quality control: Soil indicator

Quality Control Activity        Frequency           Acceptance Criteria          Corrective Action

Quality Control
Check completeness of soil      Each soil horizon   Values for each soil         Repeat observations
descriptive data                                    horizon
Check for completeness of       Each station        Data sheets complete         Repeat observations
soil sample collection for                          where appropriate
chemical analyses and bulk
density
Sample Storage                  Each station        Nontidal soils: samples      Qualify sample as
                                                    kept in a cool dry place     suspect for all
                                                    until shipped.               analyses
                                                    Tidal soils: samples kept
                                                    on ice until placed in a
                                                    refrigerator or shipped
                                                    with cold packs

Data Validation
Estimate precision of           2 visits            Measurements should be       Review data for
measurement based on                                within 10 percent            reasonableness;
repeat visits                                                                    determine if
                                                                                 acceptance criteria
                                                                                 need to be modified

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 70 of 120
 5.2.6 Quality Control Procedures: Laboratory Operations

Standardized lab protocols, consistent training of all lab technicians, lab assistance visits to all
labs, and availability of experienced technical personnel to respond to site-specific questions as
they arise are important to ensuring the quality of lab data. Additionally, control measures to
minimize measurement error among lab technicians and laboratories include the use of a
Control Sample, a Blank Sample, Data Review, and Data Validation.

A Control Sample represents a sample of known concentration for a particular attribute. A
Control Sample is collected in bulk for an attribute and repetitively analyzed to determine
statistical control limits (i.e., range of expected values) for the particular method. A Control
Sample is analyzed in conjunction with every batch of samples to ensure the method was run
correctly. If the value of the Control Sample falls outside the expected range of values then the
process has failed and the batch is triggered for reanalysis.

A Blank Sample is used to ensure equipment is thoroughly cleaned before each use. A Blank
Sample is especially important when measuring soil chemistry (i.e., trace metals) because
concentrations may be quite small. A Blank Sample is analyzed in conjunction with every batch
of samples to ensure that proper equipment cleaning protocols are followed. If the value of the
Blank Sample does not equal zero or fall below the MDL, then the equipment is not clean and
the batch is triggered for reanalysis.
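
The batch-level decision rule described above can be expressed as a short illustrative check. The
sketch below is hypothetical (the function and all values are not part of the laboratory SOPs);
it simply restates the Control Sample and Blank Sample criteria in executable form.

    # Illustrative sketch only: flag an analytical batch for reanalysis when the
    # Control Sample falls outside its statistical control limits or the Blank
    # Sample is nonzero and at or above the method detection limit (MDL).
    def batch_needs_reanalysis(control_value, control_limits, blank_value, mdl):
        low, high = control_limits
        reasons = []
        if not (low <= control_value <= high):
            reasons.append("control sample outside expected range")
        if blank_value > 0 and blank_value >= mdl:
            reasons.append("blank sample at or above MDL; equipment not clean")
        return (len(reasons) > 0, reasons)

    # Example batch (all numbers are made up).
    flag, why = batch_needs_reanalysis(control_value=10.4, control_limits=(9.5, 10.5),
                                       blank_value=0.002, mdl=0.005)
    print(flag, why)   # False []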

The process of Data Validation is described here. Laboratory data undergo four Data Reviews,
first by the Bench Analysts, second by the Lead Analyst, third by the Project Coordinator Soil
Scientist, and fourth by a Soil Scientist Liaison with expertise in soils from the region where the
samples are from. The Bench Analysts verify that blank and control samples return results that
fall within established control limits. The  Lead Analyst examines the data for inconsistencies and
apparent anomalies; inconsistencies usually take the form of unexpected high or low values for
a particular analyte or values that do not fit with the expected trend of a soil profile. The Project
Coordinator will use professional judgment to determine whether the project data are self-
consistent and congruent with the site data  collected in the field; incongruities within the data
that can be explained either by site data or the results of other analytes are recorded. A final
review is given by the Soil Scientist Liaison for the area of sample origin before the data are
released.

Table 5.2-5. Lab analysis quality control: soils indicator

Activity or Procedure            Requirements and Corrective Action
Range check of Control Sample    If value is outside the expected range, the batch is triggered
                                 for reanalysis
Value check of Blank Sample      If value is above 0 or the MDL, the batch is triggered for
                                 reanalysis
Data Review                      Corrective reporting for explicable incongruities within the data
Data Validation                  Corrective reporting for explicable incongruities within the data

-------
National Wetland Condition Assessment                                        March 2012
QA Project Plan Version 2	Page 71 of 120

 5.2.7 Data Management, Review, and Validation

Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.2-6. The Field Crew Leader is responsible for the validity of all field-generated data (i.e.,
measurement and observation data) up to the point it is sent to EPA (ORD/Corvallis). The Field
Crew Leader is responsible for the proper labeling, storage, and delivery for shipping of all
samples. The Field Crew Leader is responsible for notifying both the laboratory and
ORD/Corvallis when samples have been shipped. Laboratory SOPs (see Chapter 2 for details)
will be followed to ensure that data generated and delivered to EPA are valid. Once
ORD/Corvallis receives the data, DQ procedures (as detailed in Chapter 2) will be followed to
ensure the validity of data in storage, analysis, reporting and archiving. Raw data (including
standardized forms and logbooks) are retained permanently in an organized fashion in
accordance with EPA records management policies.

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 72 of 120
Table 5.2-6. Data validation quality control: soils indicator

Activity or Procedure                             Requirements and Corrective Action
Range checks, summary statistics, and/or          Correct reporting errors or qualify as suspect
exploratory data analysis (e.g., box and          or invalid
whisker plots)
Review data from QA samples (e.g., laboratory     Determine impact and possible limitations on
control samples, blank samples, or other          overall usability of data
standards or replicates)

 5.3 Hydrology

 5.3.1  Introduction

Wetland hydrology is a key driver of wetland ecosystem formation and persistence. Hydrology
influences wetland soil condition as well as biotic community composition and structure. In turn,
hydrology is controlled by watershed characteristics, and geomorphic conditions found at each
site (Tiner 1999, Mitsch and Gosselink 2007).  Examples of the types of data to be collected are:

   •  Degree of saturation

   •  Degree of inundation

   •  Types of hydrologic alteration
For more detailed  information please see "Ecological Indicators for the 2011 National Wetland
Condition Assessment" (in preparation).

 5.3.2  Sampling Design

The collection of hydrologic data for the NWCA will occur entirely in the field; no hydrology
samples will be collected for laboratory analysis. Hydrologic data collection comprises a number
of tasks, including assessment of:

   1.  hydrologic  sources;
   2.  surface water connectivity to a floodplain;
   3.  indirect evidence of hydroperiod;
   4.  hydrologic  fluctuations based on evidence of seasonal water levels; and
   5.  extent of alterations or stressors.
After the Point has been identified, the AA perimeter defined, and the Veg Team has delineated
the Vegetation plots (See Section 5.1 above), the AB Team will collect hydrological information
from the entire AA. Hydrologic assessment of the Vegetation Plot should be done from the Plot
periphery. Groundwater depth information will be collected at the four pits dug to collect soil
indicator samples and information. Section 9 of the FOM details how the four pits will be located.
Chapter 10 of the FOM details hydrology data collection protocols, as well as the required
equipment and standardized data forms.

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 73 of 120
 5.3.3 Sampling and Analytical Methods

All field measurement and observation operations will be conducted by the AB Team.

All observations are recorded on standardized forms, which are later entered into the central
NARS surface waters information management system at WED-Corvallis. The form used to
collect most Hydrology information is Form H-1. Generally, the AB Team will collect hydrologic
information using the following steps:

      1.   Walk the perimeter of the AA and identify water sources for the AA.
     2.   Locate the deepest ditch that may provide connectivity between the AA and a
          floodplain and measure its depth.
     3.   Search for drift lines and record findings.
     4.   Use the data form to identify and record any hydrologic alterations found present in
          the AA including (but not limited to) damming features, ditches and their lengths and
          depths, and any fresh sediment influx across the wetland.
     5.   Maximum water depth and the percent of the AA covered by water are recorded on
          Form WQ-1.
     6.   At the end of the day just prior to filling in the 4 soil pits, measure the distance from
          the soil surface down to the surface of the groundwater (recorded on Form S-1).
Annual hydroperiod is covered under other indicator protocols.

Table 5.3-1. Field measurement methods: hydrology metrics.

Variable or Measurement    Units   Summary of Method
Water Sources                      Count of seasonal and perennial sources, including inlets,
                                   streams, springs, the ocean, ditches, and pipes
Hydrologic alterations             Count of damming features (e.g., dikes/berms, roads), length
                                   and depth of ditches/drains, evidence of tilling, and fresh
                                   sediment influx
Drift lines                        Evidence of leaf packs and other plant detritus, anthropogenic
                                   trash, and the percent of the AA with standing water
Connectivity                       Determine the width and depth of the deepest ditch in the AA
Water Depth                cm      Determine the maximum depth of surface water and the percent
                                   of the AA covered (recorded on Form WQ-1)
Depth to Groundwater       cm      Measured at each soil pit (recorded on Form S-1)


 5.3.4 Quality Assurance Objectives

As mentioned above in section 2.2.2 (Precision, Bias, and Accuracy), precision of field
measurements will not be monitored during the NWCA. Previous hydrology experience or class

-------
National Wetland Condition Assessment                                         March 2012
QA Project Plan Version 2	Page 74 of 120

work will be valuable for AB team members, but mandatory NWCA training will provide an
understanding of basic hydrology. This training will prepare the crew to accurately complete
hydrology data collection tasks according to the standardized field protocols.

MQOs are given in Table 5.3-2. General requirements for comparability and representativeness
are addressed in Section 2. The MQOs given in Table 5.3-2 represent the maximum allowable
criteria for statistical control purposes. Precision is determined from results of the revisits (field
measurements) taken on a different day (at least 2-4 weeks apart).

Table 5.3-2. Measurement quality objectives: hydrology indicator

Variable or Measurement                 Precision   Accuracy   Completeness
Field Measurements and Observations     ±10%        NA         90%

NA = not applicable in most cases. This would apply if the field auditor did a separate
assessment and compared the results to the crew's.
 5.3.5 Quality Control Procedures

To avoid impairing data collection for the vegetation indicator, the AB Team members must
avoid stepping into the Vegetation Plot modules and potentially trampling vegetation.
Assessments of the Vegetation Plot modules should be done from the Plot periphery.

Upon completion of data collection, the Field Crew Leader reviews all forms for completeness
and legibility.

In addition, quality assurance audits are conducted, at least once during the field season, for a
random subset of field crews to ensure that the protocols are being implemented consistently with
training. In addition, ten percent of all sites will receive a repeat visit to determine if differences
exist in field data collection on different days.

 5.3.6 Data Management,  Review, and Validation

Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.3-3. The Field Crew Leader is responsible for the validity of all field-generated data (i.e.
measurement and observation data) up to the point they are sent to EPA (WED-Corvallis).
EPA/ORD QA SOPs (see Chapter 2 for details) will be followed to ensure that data generated
and delivered to EPA are valid. Once data have been delivered to EPA, DQ procedures (as
detailed in Chapter 2) will be followed to ensure the validity of data in storage, analysis,
reporting and archiving. All raw data (including all standardized forms and logbooks) are
retained permanently in an organized fashion in accordance with EPA records management
policies.

-------
National Wetland Condition Assessment                                         March 2012
QA Project Plan Version 2	Page 75 of 120

Table 5.3-3: Data quality control: hydrology

Quality Control Activity    Frequency              Acceptance Criteria    Corrective Action

Quality Control
Check completeness of       Across AA and Buffer   Values where           Repeat observations
hydrology data                                     appropriate

 5.4 Water Chemistry Indicator

 5.4.1  Introduction

Along with vegetation and soil, water is one of the key determinants of wetland systems. Some
studies show that water chemistry analyses are useful for evaluating wetland ecological integrity
and for evaluating stressor-response relationships (Lane and Brown, 2007; Reiss and Brown,
2005). Examples of the types of data to be collected are:

   •  pH;

   •  Nutrient Enrichment;

   •  Dissolved oxygen; and

   •  Temperature


For more detailed information please see "Ecological Indicators for the 2011 National Wetland
Condition Assessment" (in preparation).

Water chemistry information will be obtained by collecting samples of surface water for
laboratory analysis. At each site, crews fill one 1-L container with surface water. All samples are
stored in a cooler packed with resealable plastic bags filled with ice and shipped to the
analytical laboratory within 24 hours of collection.

 5.4.2  Field Collection

While the AA boundary and subdivisions are determined by the Veg Team, the AB Team will
determine if surface water meeting the collecting criterion (2x depth of collecting dipper, ~15
cm) is present within the AA. If there is surface water meeting this criterion, the AB Team will
sample the surface water.

 5.4.3  Sampling and Analytical Methods

Surface Water Sample and Data Collection:
At the identified sample collection location6, rinse the collection cup and 1-L cubitainer three
times, and then collect enough surface water to just fill the 1-L cubitainer. Detailed procedures
6 The preferred sample location will be towards the center of the water body, away from inlets and outlets
and deep enough to avoid fouling the water as the dipper is used to collect water.

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 76 of 120
for sample collection and handling are described in the Field Operations Manual, Water Quality
Chapter (Sampling Procedure).
                             Example of long-handled dipper in use.
                          (Photo credit: Maine Dept. of Environmental Protection,
                             Protocols for Collecting Water Grab Samples in
                              Rivers, Streams, and Freshwater Wetlands)
                 http://www.maine.gov/dep/blwa/docmonitoring/biomonitoring/materials/sop watergrab.pdf
Analysis:
A performance-based methods approach is being utilized for water chemistry analysis that
defines a set of laboratory method performance requirements for data quality. Following this
approach, participating laboratories may choose which analytical method they will use for each
target analyte, as long as they are able to achieve the performance requirements as listed in
Table 5.4-1.

 5.4.4 Quality Assurance Objectives

Measurement quality objectives (MQOs) are given in Table 5.4-1. General requirements for
comparability and representativeness are addressed in Section 2. The MQOs given in Table
5.4-1 represent the maximum allowable criteria for statistical control purposes.

-------
National Wetland Condition Assessment
QA Project Plan Version 2
                                                                  March 2012
                                                               Page 77 of 120
Table 5.4-1. Performance requirements for water chemistry analytical methods

Analyte             Units            Potential Range   Long-Term MDL   Laboratory          Transition     Precision        Bias
                                     of Samples1       Objective2      Reporting Limit3    Value4         Objective5       Objective6
Conductivity        µS/cm at 25°C    1 to 15,000       NA              2.0                 20             ±2 or ±10%       ±2 or 5%
pH                  pH units         3.7 to 10         NA              NA                  <5.75 and      ±0.08 or ±0.15   ±0.05 or ±0.10
                                                                                           >8.25
Ammonia (NH3)       mg N/L           0 to 17           0.01            0.023               0.10           ±0.01 or ±10%    ±0.01 or ±10%
Nitrate-Nitrite     mg N/L           0 to 360          0.01            0.023               0.10           ±0.01 or ±10%    ±0.01 or ±10%
(NO3-NO2)                            (as nitrite)
Total Nitrogen      mg N/L           0.1 to 90         0.01            0.023               0.10           ±0.01 or ±10%    ±0.01 or ±10%
(TN)
Total Phosphorus    µg P/L           0 to 22,000                                           20             ±2 or ±10%       ±2 or ±10%
(TP)

1   Estimated from samples analyzed at the WED-Corvallis laboratory between 1999 and 2005 for TIME, EMAP-West, and WSA streams
    from across the U.S.
2   The long-term method detection limit is determined (eq. 1a) as a one-sided 99% confidence interval from repeated measurements
    of a low-level standard across several calibration curves, and includes median or mean method blank results (USGS Open File
    Report 99-193, EPA 2004). These represent values that should be achievable by multiple labs analyzing samples over extended
    periods with comparable (but not necessarily identical) methods.
3   The minimum reporting limit is the lowest value that needs to be quantified (as opposed to just detected), and represents the
    value of the lowest nonzero calibration standard used. It is set to 2x the long-term detection limit / fractional spike
    recovery, following USGS Open File Report 99-193 and EPA 2004.
4   Value at which performance objectives for precision and bias switch from absolute (< transition value) to relative
    (> transition value). Two-tiered approach based on Hunt, D.T.E. and A.L. Wilson. 1986. The Chemical Analysis of Water: General
    Principles and Techniques. 2nd ed. Royal Society of Chemistry, London, England.
5   For duplicate samples, precision is estimated as the pooled standard deviation (calculated as the root-mean square) of all
    samples at the lower concentration range, and as the pooled percent relative standard deviation of all samples at the higher
    concentration range. For standard samples, precision is estimated as the standard deviation of repeated measurements across
    batches at the lower concentration range, and as percent relative standard deviation of repeated measurements across batches
    at the higher concentration range.
6   Bias (systematic error) is estimated as the difference between the mean measured value and the target value of a performance
    evaluation and/or internal reference samples at the lower concentration range measured across sample batches, and as the
    percent difference at the higher concentration range.
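
To illustrate how the two-tiered objectives in Table 5.4-1 operate, the hypothetical sketch below
checks a single duplicate pair against an absolute limit below the transition value and a relative
limit above it. This per-pair check is a simplification of the pooled-statistics formulation in
footnote 5, and the function and example values are assumptions, not part of the NWCA procedures.

    # Illustrative sketch only: below the transition value the precision objective
    # is absolute; above it the objective is relative (percent of the pair mean).
    def duplicate_within_objective(value1, value2, transition, abs_limit, rel_limit_pct):
        mean = (value1 + value2) / 2.0
        diff = abs(value1 - value2)
        if mean <= transition:
            return diff <= abs_limit                      # absolute criterion
        return 100.0 * diff / mean <= rel_limit_pct       # relative criterion

    # Total nitrogen duplicates (mg N/L): transition 0.10, objective +/-0.01 or +/-10%.
    print(duplicate_within_objective(0.06, 0.065, 0.10, 0.01, 10))   # True
    print(duplicate_within_objective(2.0, 2.5, 0.10, 0.01, 10))      # False (about 22% difference)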

-------
National Wetland Condition Assessment                                        March 2012
QA Project Plan Version 2                                                  Page 78 of 120
 5.4.5 Quality Control Procedures: Field Operations

Control measures to minimize measurement error among crews and sites include the use of
standardized field protocols, consistent training of all crews, and availability of experienced
technical personnel during the field season to respond to site-specific questions from field
crews. Additionally, field crews will apply a consistent labeling convention across all samples
(see FOM Water Quality Chapter for details on info to include on labels).

Water chemistry sample duplicates will be collected for performing QA checks. Crews are
required to collect a duplicate sample for each 10 surface water samples collected overall. Each
crew should collect the QA sample for the first site visited containing sampleable surface water
in the AA and then every 10th surface water collection thereafter. This will ensure that a
duplicate sample is collected by each crew.
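
A minimal sketch of the duplicate-scheduling rule is given below. The helper function and the
site-ID values are illustrative assumptions; only the rule itself (a QA duplicate at the first
surface-water collection and at every 10th collection thereafter) comes from the text above.

    # Illustrative sketch only: which surface-water collections also get a QA duplicate.
    def collections_with_duplicates(surface_water_collections):
        """surface_water_collections: site IDs in the order surface water was sampled."""
        return [site for i, site in enumerate(surface_water_collections) if i % 10 == 0]

    visited = ["NWCA11-%04d" % n for n in range(1, 26)]   # 25 collections by one crew
    print(collections_with_duplicates(visited))
    # ['NWCA11-0001', 'NWCA11-0011', 'NWCA11-0021'] -> the 1st, 11th, and 21st collections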

Other controls include audits and revisits.  Quality assurance audits are conducted of each Field
Crew at least once during the field season, to ensure the protocols followed are consistent with
training. Ten percent of all sites will receive repeat sampling visits to be sampled by a Field
Crew to determine the extent to which the population estimates might vary if they were sampled
at a different time.

Whenever possible, surface water samples should be collected prior to 11:00 a.m. to
standardize the collection time frame for the NWCA.  This will limit the impact of diurnal changes
in the metabolic activity of the organisms in the water. Throughout the water chemistry sample
collection process it is important to take precautions  to avoid contaminating the sample.
Samples can  be contaminated quite easily by perspiration from hands, sneezing, smoking,
suntan lotion, insect repellent, fumes from gasoline engines or chemicals used during sample
collection. Also, the sampler should not enter the water to avoid fouling the water and potentially
contaminating or otherwise compromising the quality of the sample. Bottom sediments should
not be disturbed as the dipper cup is rinsed three times. The rinse water is poured onto the
wetland away from the collection site so that the water does not drain back into the sample area
and potentially affect the collected sample. Further, surface water samples should  be obtained
from areas which are completely free of surface debris.

 5.4.6 Quality Control Procedures: Laboratory Operations

 5.4.6.1 Sample Receipt  and Processing

QC activities associated with sample receipt and processing are presented in Table 5.4-2. The
communications center and information management staff are notified of sample receipt and any
associated problems as soon as possible after samples are received. The general scheme for
processing wetland water chemistry samples for analysis is presented in Figure 5-3. Several
aliquots are prepared from bulk water samples and preserved accordingly. Ideally, all analyses
are completed within a few days after processing to allow for review of the results and possible
reanalysis of suspect samples within seven days. Critical holding times for the various analyses
are the maximum allowable holding times, based on  current EPA and American Public Health
Association (APHA) requirements (American Public Health Association,  1989).
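
The holding-time requirements above lend themselves to a simple automated check. The sketch below
is illustrative only: the 48-hour processing window is taken from Table 5.4-2, the 28-day analysis
window is the total phosphorus and total nitrogen holding time shown in Figure 5-3, and the
timestamps, function, and argument names are assumptions.

    # Illustrative sketch only: qualify a sample whose processing or analysis
    # exceeds its holding time.
    from datetime import datetime, timedelta

    def qualify_holding_time(collected, processed, analyzed,
                             processing_limit=timedelta(hours=48),
                             analysis_limit=timedelta(days=28)):
        flags = []
        if processed - collected > processing_limit:
            flags.append("processing holding time exceeded")
        if analyzed - collected > analysis_limit:
            flags.append("analysis holding time exceeded")
        return flags

    collected = datetime(2011, 6, 1, 10, 30)
    processed = datetime(2011, 6, 2, 16, 0)     # about 29.5 hours after collection
    analyzed  = datetime(2011, 6, 20, 9, 0)     # 19 days after collection
    print(qualify_holding_time(collected, processed, analyzed))   # []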

-------
National Wetland Condition Assessment
QA Project Plan Version 2
                                                                March 2012
                                                             Page 79 of 120
Table 5.4-2. Sample processing quality control activities: water chemistry indicator

Quality Control       Description and Requirements                              Corrective Action
Activity
Sample Storage        Store samples in darkness at 4°C.                         Qualify sample as suspect
                      Monitor temperature daily.                                for all analyses
Holding time          Complete processing of bulk samples within 48 hours of    Qualify samples
                      collection if possible, or as soon as possible after
                      receipt.
Aliquot Containers    HDPE bottles. Rinse bottles and soak for 48 h with ASTM   Repeat the deionized water
and Preparation       Type II reagent water; test water for conductivity.       rinsing procedure on all
                      Prepare bottles to receive acid as preservative by        bottles cleaned since the
                      filling with a 10% HCl solution and allowing them to      last acceptable check.
                      stand overnight. Rinse six times by filling with          Check conductivity of the
                      deionized water. Determine the conductivity of the        final rinse on every fifth
                      final rinse of every tenth bottle. Conductivity must      bottle.
                      be < 2 µS/cm.
Filtration            0.4 µm polycarbonate filters required for all dissolved
                      analytes. Rinse filters and filter chamber twice with
                      50-mL portions of deionized water, followed by a 20-mL
                      portion of sample. Repeat for each filter used on a
                      single sample. Rinse aliquot bottles with two 25- to
                      50-mL portions of filtered sample before use.
(Figure 5-3 is a flow diagram. Its recoverable content: 4-L bulk samples are inspected at receipt
and the tracking form completed; samples are stored at 4°C in darkness and processed within 48
hours; filtered (0.4 µm) and unfiltered aliquots are prepared in HDPE bottles, acid washed or not
acid washed, with or without H2SO4 preservative; total phosphorus and total nitrogen analyses
carry a 28-day holding time.)
Figure 5-3:  General Batch Water Sample Processing Scheme

-------
National Wetland Condition Assessment
QA Project Plan Version 2
                                                        March 2012
                                                     Page 80 of 120
 5.4.6.2 Analysis of Samples

QC protocols are an integral part of all analytical procedures to ensure that the results are
reliable and the analytical stage of the measurement system is maintained in a state of
statistical control. Information regarding QC sample requirements and corrective actions is
summarized in Table 5.4-3. Figure 5-4 illustrates the general scheme for analysis of a batch of
water chemistry samples, including associated QC samples.
Table 5.4-3. Laboratory Quality Control Samples: Water Chemistry Indicator

QC Sample Type (Analytes),     Frequency                   Acceptance Criteria     Corrective Action
and Description
Laboratory/Reagent Blank       Once per day prior to       Control limits < LRL    Prepare and analyze new blank.
                               sample analysis                                     Determine and correct the problem
                                                                                   (e.g., reagent contamination,
                                                                                   instrument calibration, or
                                                                                   contamination introduced during
                                                                                   filtration) before proceeding with
                                                                                   any sample analyses. Reestablish
                                                                                   statistical control by analyzing
                                                                                   three blank samples.
Filtration Blank: (All         Prepare once per week       Measured
dissolved analytes) ASTM       and archive. Prepare a      concentrations
Type II reagent water          filter blank for each
processed through the          box of 100 filters, and
filtration unit                examine the results
                               before any other filters
                               are used from that box.
-------
National Wetland Condition Assessment
QA Project Plan Version 2
                                                        March 2012
                                                      Page 81 of 120
Table 5.4-3. (Continued).

QC Sample Type (Analytes),      Frequency            Acceptance Criteria    Corrective Action
and Description
Standard Reference Material:    One analysis in a    Manufacturer's         Analyze standard in next batch to confirm
(When available for a           minimum of five      certified range        suspected imprecision or bias. Evaluate
particular analyte)             separate batches                            calibration and QCCS solutions and
                                                                            standards for contamination and
                                                                            preparation error. Correct before any
                                                                            further analyses of routine samples are
                                                                            conducted. Reestablish control by three
                                                                            successive acceptable reference standard
                                                                            measurements. Qualify all sample batches
                                                                            analyzed since the last acceptable
                                                                            reference standard measurement for
                                                                            possible reanalysis.
Matrix spike samples: (Only     One per batch        Control limits for     Select two additional samples and prepare
prepared when samples with                           recovery cannot        fortified subsamples. Reanalyze all
potential for matrix                                 exceed 100±20%         suspected samples in the batch by the
interferences are                                                           method of standard additions. Prepare
encountered)                                                                three subsamples (unfortified, fortified
                                                                            with solution approximately equal to the
                                                                            endogenous concentration, and fortified
                                                                            with solution approximately twice the
                                                                            endogenous concentration).

-------
National Wetland Condition Assessment
QA Project Plan Version 2
                                                March 2012
                                            Page 82 of 120
(Figure 5-4 is a flow diagram. Its recoverable content: QC samples (laboratory blank, fortified
sample, laboratory split sample) are prepared during sample processing, and QC check samples
(QCCS) and an internal reference sample are prepared for analysis and inserted randomly into the
sample batch. A failed laboratory blank indicates contamination or a biased calibration and
prompts a recheck of the LT-MDL QCCS; a failed calibration QCCS prompts recalibration and
reanalysis of previous samples. Batches that pass are accepted for entry and verification;
otherwise the batch is qualified for possible re-analysis.)
Figure 5-4:  Analysis Activities for Water Chemistry Samples

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 83 of 120
 5.4.7 Data Reporting, Review, and Management

Checks made of the data in the process of review and verification are summarized in Table 5.4-4.
Data reporting units and significant figures are given in Table 5.4-5. The Project Lead is
ultimately responsible for ensuring the validity of the data, although performance of the specific
checks may be delegated to other staff members.
Table 5.4-4: Data validation quality control: water chemistry indicator

Activity or Procedure                              Requirements and Corrective Action
Range checks, summary statistics, and/or           Correct reporting errors or qualify as suspect
exploratory data analysis (e.g., box and           or invalid
whisker plots)
Review holding times                               Qualify value for additional review
Review data from QA samples (laboratory PE         Determine impact and possible limitations on
samples and interlaboratory comparison             overall usability of data
samples)
Table 5.4-5. Data Reporting Criteria: Water Chemistry Indicator

Measurement          Units            No. Significant Figures   Maximum No. Decimal Places
Dissolved Oxygen     mg/L             2                         1
Temperature          °C               2                         1
pH                   pH units         3                         2
Conductivity         µS/cm at 25°C    3                         1
Ammonia              mg N/L           3                         2
Nitrate-Nitrite      mg N/L           3                         2
Total nitrogen       mg N/L           3                         2
Total phosphorus     µg P/L           3                         0
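
As an illustration of how the reporting criteria in Table 5.4-5 might be applied programmatically,
the hypothetical sketch below rounds a result to the listed number of significant figures and then
limits it to the maximum number of decimal places. The rounding approach and all names are
assumptions, not an NWCA specification.

    # Illustrative sketch only: apply significant-figure and decimal-place limits.
    from math import floor, log10

    def report_value(value, sig_figs, max_decimals):
        if value == 0:
            return 0.0
        rounded = round(value, -int(floor(log10(abs(value)))) + (sig_figs - 1))
        return round(rounded, max_decimals)

    # (significant figures, maximum decimal places) from Table 5.4-5
    criteria = {"pH": (3, 2), "ammonia mg N/L": (3, 2), "total phosphorus ug P/L": (3, 0)}

    print(report_value(7.4638, *criteria["pH"]))                       # 7.46
    print(report_value(0.012345, *criteria["ammonia mg N/L"]))         # 0.01
    print(report_value(1234.5, *criteria["total phosphorus ug P/L"]))  # 1230.0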
 5.5  Algae Indicator

 5.5.1  Introduction

Algae, which include planktonic (open water), benthic (periphyton), and metaphyton forms, are
an extremely important ecosystem component, providing essential primary productivity as well
as being a food resource for higher level organisms including macroinvertebrates and fish
(Mitsch and Gosselink 2007). Like other biotic taxa, many indicator attributes or metrics which
describe ecological condition can be derived from data describing taxonomic composition and
abundance of algae. Due to high dispersal and growth rates, algae respond quickly to
environmental disturbances, both natural and anthropogenic, and are one of the first

-------
National Wetland Condition Assessment                                        March 2012
QA Project Plan Version 2	Page 84 of 120

indicators of ecological change in the wetland (McCormick and Cairns 1994). Examples of the
types of data to be collected are:

   •   Species composition and abundance, including guilds

   •   Productivity

   •   Toxicity


For more detailed information please see "Ecological Indicators for the 2011 National Wetland
Condition Assessment" (in preparation).

 5.5.2 Sampling Design

Algae collection procedures for the NWCA have generally been based upon the multi-habitat
procedures of the National Water-Quality Assessment Program (NAWQA) (Moulton et al 2002).
The design is based on a representative multi-habitat or composite sampling method, thus
samples are collected from multiple habitats rather than once at the POINT. This method
provides a qualitative sample, though a fixed area of sampling effort is used. Multi-habitat
samples will be collected from surface sediments (benthic sample), and from the surface of
vegetation stems or leaves if vegetation is present. Multi-habitat samples collected at the site
are composited in a bottle and homogenized to characterize taxonomic composition and relative
abundance of the algal assemblage in the AA. If the wetland AA has standing water present, a
phytoplankton sample will also be collected for biomass estimates.

A 250 ml sub-sample of the composite sample is collected and shipped to the designated lab
for identification. Surface water and epiphytic algae composite samples are sub-sampled to test
for microcystin toxicity. The phytoplankton sample for biomass (chlorophyll a) analysis is
collected on a glass fiber filter and shipped to the designated lab.

It is anticipated that the AB Team will collect and field process all  algae samples. At the end of
the day's sampling activities, at least one type of laboratory sample (i.e. to evaluate composite
taxonomic composition) will have been taken at all sites. In  addition, a biomass sample will be
collected at all sites with sampleable surface water.

 5.5.3 Sampling and Analytical Methods

Three distinct types of samples will be collected:  composite taxonomic samples, and, if the AA
includes standing water, a toxin  sample and a biomass (chlorophyll a) sample. The toxin sample
will consist of five epiphytic algae samples and surface water and the composite taxonomic
sample will  include the five epiphytic algae samples and surface water, as well as five substrate
samples. All the algae sampling will be conducted by the AB Team. While the AA is determined
by the Veg Team, the AB Team  will determine algal sample collection locations in the AA for
taxonomic assemblage, as well as determine if surface water meeting the collecting criterion (2x
depth of collecting dipper ~ 6 inches) is present within the AA.

If standing water is  located in the vegetation plot only, the AB Team will work with the Veg Team
to minimize vegetation impacts while collecting the algal sample.

-------
National Wetland Condition Assessment                                        March 2012
QA Project Plan Version 2                                                  Page 85 of 120
 5.5.3.1 Microcystin Toxin Sample

Sample Collection:
If aquatic or emergent plants are present, epiphytic algae samples are collected from one-inch-
square surface area plant scrapings for a total of 5 samples per AA. These are rinsed into a 125
mL bottle and surface water is used to fill it to the shoulder. Fifty mL of this sample are
measured out and used for the taxonomic ID sample, as described below, and surface water is
again added to the shoulder of the toxin sample. If no epiphytes are present, the bottle is simply
filled to the shoulder with surface water. The sample is then properly labeled, stored, and
shipped on ice.

Lab Analysis:
Toxin samples will be processed by performing Microtiter Plate ELISA of Microcystins using the
Abraxis Polyclonal ADDA kit at the USGS Organic Geochemistry Research Laboratory (OGRL)
in Lawrence, KS. Results for water samples are reported at concentrations between 0.10 µg/L and
5.0 µg/L without dilution.
 5.5.3.2 Composite Taxonomic Sample

Sample Collection if no surface water:
If there is no evidence of previous inundation or desiccated algae, a sample is not collected.  If
suitable substrate is located with  probable algal growth, ten substrate samples are collected
using a sampling device described by Moulton et al 2002. Sediment core samples are collected
using a 1-inch square modified syringe. The target length for a core sample is the top few
millimeters (1/2-inch). If the target length cannot be obtained after two consecutive attempts, the
maximum obtainable core should be used. All samples will be deposited into a 250 mL bottle
which will be filled to the shoulder with deionized water, then homogenized. Lugol's solution is
added to preserve the sample.

Sample Collection if AA has surface water:
Two types of surfaces are selected within the AA that are suitable for sampling including
depositional habitats (soft bottom, stones, sticks/wood) and vegetation for epiphytic habitat. Fifty
mL of the algal toxin sample is poured into the 250 mL taxonomic ID bottle. This provides the
portion of epiphytic algae needed. Five more samples are collected and added to the bottle.
These samples are representative of the predominant surfaces in the water being sampled.
One inch square surface area scrapings or 1/4-in core samples are collected from each
microhabitat, then the bottle is filled to the shoulder with surface water. This will make a total of
10 samples per AA, which are homogenized and preserved with Lugol's solution. Detailed procedures for
sample collection and handling are described in chapter 9 of the Field Operations Manual.

Lab Analysis:
Sediment samples are cleaned of organic matter with strong  oxidizing agents and slides are
made. The analysis is made by identifying and  counting 600  individual cells. Detailed
procedures for sample processing and enumeration are described in the laboratory methods
manual. Table 5.5-1 summarizes field and analytical methods for the diatoms and Table 5.5-2
summarizes the field and analytical methods for soft algae.

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 86 of 120
Table 5.5-1. Field and laboratory methods: Diatoms

Variable or         QA      Expected Range    Summary of Method                         References
Measurement         Class   and/or Units
Sample Collection   C       NA                Core sampler used to collect a 1 cm       Glew et al. 2001;
                                              core of sediments, epiphytic algae, or    Wetlands Survey Field
                                              algae from other substrates. Surface      Operations Manual 2009
                                              water is also added
Sample Digestion    N       NA                Add acid and heat at 200°C for 2 hrs.     Charles et al. 2003;
and Concentration                             Allow to settle, siphon off the           Wetlands Survey
                                              supernatant, and repeat until the final   Laboratory Methods
                                              volume is between 25-50 mL                Manual 2010
Slide preparation   N       NA                Prepare coverslips and mount on a slide   Charles et al. 2003;
                                              using Naphrax                             Wetlands Survey
                                                                                        Laboratory Methods
                                                                                        Manual 2010
Enumeration         C       0 to 300          Random systematic selection of rows and   Charles et al. 2003;
                            organisms         fields with a target of 600 organisms     Wetlands Survey
                                              from the sample                           Laboratory Methods
                                                                                        Manual 2010
Identification      C       genus             Specified keys and references

C = critical, N = non-critical quality assurance classification.
Table 5.5-2. Field and laboratory methods: Soft Algae

Variable or         QA      Expected Range    Summary of Method                         References
Measurement         Class   and/or Units
Sample Collection   C       NA                Core sampler used to collect a 1 cm       Glew et al. 2001;
                                              core of sediments, epiphytic algae, or    Wetlands Survey Field
                                              algae from other substrates. Surface      Operations Manual 2009
                                              water is also added
Palmer-Maloney      N       NA                0.05 mL of the soft algae subsample       USGS 1997; NAWQA
Cell Preparation                              viewed in two half Palmer-Maloney cells   Laboratory Methods
                                                                                        Manual
Sedgewick-Rafter    N       NA                Large soft algae viewed in a              USGS 1997; NAWQA
Cell Preparation                              Sedgewick-Rafter cell                     Laboratory Methods
                                                                                        Manual
Enumeration         C       0 to 300          Random systematic selection of rows and   USGS 1997; NAWQA
                            organisms         fields with a target of 600 organisms     Laboratory Methods
                                              from the sample                           Manual
Identification      C       genus             Specified keys and references

C = critical, N = non-critical quality assurance classification.

-------
National Wetland Condition Assessment                                        March 2012
QA Project Plan Version 2	Page 87 of 120

 5.5.3.3 Biomass (chlorophyll a) Sample

Sample Collection:
Water samples are collected using the long-handled dipper from the water chemistry protocol.
Take care to minimize disturbance of submerged, floating or emergent vegetation and
associated epiphytes so that none of this material is collected. Also, do not sample duckweed,
Wolffia, etc. The sample is filtered in subdued light to minimize degradation. The filter is then
stored in a centrifuge tube on ice before being shipped to the laboratory for chlorophyll a
analysis. Detailed procedures for sample collection and processing are described in the FOM
Algae Chapter.

Lab Analysis:
A performance-based methods approach is being utilized for chlorophyll a analysis that defines
a set of laboratory method performance requirements for data quality. Following this approach,
participating  laboratories may choose which  analytical method they will use to determine
chlorophyll a concentration as long as they are able to achieve the performance requirements
as listed in Table 5.5-3.

-------
National Wetland Condition Assessment                                                                                March 2012
QA Project Plan Version 2 _ Page 88 of 1 20

Table 5.5-3. Performance Requirements for chlorophyll a Analytical Methods

Analyte         Units                Potential Range   Long-Term MDL   Laboratory         Transition   Precision       Bias
                                     of Samples1       Objective2      Reporting Limit3   Value4       Objective5      Objective6
chlorophyll a   µg/L (in extract)    0.7 to 11,000     1.5             3                  15           ±1.5 or ±10%    ±1.5 or ±10%

1   Estimated from samples analyzed at the WED-Corvallis laboratory between 1999 and 2005 for TIME, EMAP-West, and WSA streams
    from across the U.S.
2   The long-term method detection limit is determined (eq. 1a) as a one-sided 99% confidence interval from repeated measurements
    of a low-level standard across several calibration curves, and includes median or mean method blank results (USGS Open File
    Report 99-193, EPA 2004). These represent values that should be achievable by multiple labs analyzing samples over extended
    periods with comparable (but not necessarily identical) methods.
3   The minimum reporting limit is the lowest value that needs to be quantified (as opposed to just detected), and represents the
    value of the lowest nonzero calibration standard used. It is set to 2x the long-term detection limit / fractional spike
    recovery, following USGS Open File Report 99-193 and EPA 2004.
4   Value at which performance objectives for precision and bias switch from absolute (< transition value) to relative
    (> transition value). Two-tiered approach based on Hunt, D.T.E. and A.L. Wilson. 1986. The Chemical Analysis of Water: General
    Principles and Techniques. 2nd ed. Royal Society of Chemistry, London, England.
5   For duplicate samples, precision is estimated as the pooled standard deviation (calculated as the root-mean square) of all
    samples at the lower concentration range, and as the pooled percent relative standard deviation of all samples at the higher
    concentration range. For standard samples, precision is estimated as the standard deviation of repeated measurements across
    batches at the lower concentration range, and as percent relative standard deviation of repeated measurements across batches
    at the higher concentration range.
6   Bias (systematic error) is estimated as the difference between the mean measured value and the target value of a performance
    evaluation and/or internal reference samples at the lower concentration range measured across sample batches, and as the
    percent difference at the higher concentration range.

-------
National Wetland Condition Assessment                                       March 2012
QA Project Plan Version 2	Page 89 of 120

 5.5.4 Quality Assurance Objectives

 5.5.4.1 Composite Taxonomic Sample

A taxonomic harmonization table for diatoms will be developed through co-operation of the
different taxonomic laboratories to ensure consistent identification among laboratories. The
harmonization table will begin with the National Water-Quality Assessment (NAWQA) program
diatom list, and taxonomic experts from each laboratory will work together to clean up the data
set to ensure that there will be no ambiguous or synonymous taxa in the final data set.

5.5.4.2  Microcystin Toxin Sample and Biomass (chlorophyll a)
           Sample:

MQOs are given in Table 5.5-3. General requirements for comparability and  representativeness
are addressed in Section 2. The MQOs given in Table 5.5-3 represent the maximum allowable
criteria for statistical control purposes. LT-MDLs are monitored over time by repeated
measurements of low level standards and calculated using Equation 1a.
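
Equation 1a is not reproduced in this section. A common formulation consistent with the
description above (a one-sided 99% confidence interval from repeated measurements of a low-level
standard) is LT-MDL = t(0.99, n-1) x s, where s is the standard deviation of the repeats; the
sketch below uses that assumed formulation with made-up values and should not be read as the
exact NWCA equation.

    # Illustrative sketch only, under the assumption LT-MDL = t(0.99, n-1) * s.
    from statistics import stdev

    def long_term_mdl(low_level_results, t_one_sided_99):
        return t_one_sided_99 * stdev(low_level_results)

    # Seven repeated measurements of a low-level chlorophyll a standard (ug/L);
    # the one-sided 99% Student's t value for 6 degrees of freedom is about 3.143.
    repeats = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1]
    print(round(long_term_mdl(repeats, t_one_sided_99=3.143), 2))   # about 0.54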

For precision, the objectives presented in Table 5.5-3 represent the 99 percent confidence
intervals about a single measurement and are thus based on the standard deviation of a set of
repeated measurements (n > 1). Precision objectives at lower concentrations are equivalent to
the corresponding LRL. At higher concentrations, the precision objective is expressed in relative
terms, with the 99 percent confidence interval based on the relative standard deviation (Section
2). Objectives for accuracy are equal to the corresponding precision objective, and are based on
the mean value of repeated measurements. Accuracy is generally estimated as net bias or
relative net bias (Section 2). Precision and bias are monitored at the point of measurement (field
or analytical laboratory) by several types of QC samples, including those in Table 5.5-4 (field)
and Table 5.5-5 (lab).

 5.5.5 Quality Control Procedures: Field Operations

5.5.5.1  Composite Taxonomic Sample and Microcystin Toxin Sample:

Any contamination of the samples can produce significant errors in the resulting interpretation.
Great care must be taken by the samplers not to contaminate the bottom sample with higher
levels of the core or with surface water or with the tools used to collect the sample (i.e., the
corer, core tube, and spatulas).  Prior to sampling, the corer device and collection tools should
be examined to ensure that they are clean and free of contaminants from previous sampling
activities.  After the core is sectioned off, the sectioning apparatus should be  removed and rinsed
in DI water.

After each sample is placed in the container, the label should be checked to  ensure that all
written information is complete and legible, and that the label has been completely covered with
clear packing tape. It should be verified that the bar code assigned to the sample is recorded
correctly on the Sample Collection Form (Figure 4-4). A flag code should be  recorded and
comments provided on the Sample Collection Form to denote any problems  encountered in
collecting the sample or the presence of any conditions that may affect sample integrity.

-------
National Wetland Condition Assessment
QA Project Plan Version 2
   March 2012
Page 90 of 120
 5.5.5.2 Biomass (chlorophyll a) Sample:

Chlorophyll can degrade rapidly when exposed to bright light. It is important to keep the sample
on ice and in a dark place (cooler) until it can be filtered. If possible, prepare the sample in
subdued light (or shade) by filtering as quickly as possible to minimize degradation. If the
sample filter clogs and the entire sample in the filter chamber cannot be filtered, discard the filter
and prepare a new sample, using incrementally smaller volumes.

Check the label to ensure that all written information is complete and legible. Place a strip of
clear packing  tape over  the label and bar code, covering the label completely. Record the bar
code assigned to the chlorophyll a sample on the Sample Collection Form (Figure 5-5). Also
record the volume of sample filtered on the Sample Collection Form. Verify that the volume
recorded on the label matches the volume recorded on the Sample Collection Form. Enter a flag
code and provide comments on the Sample Collection Form if there are any problems in
collecting the  sample or if conditions occur that may affect sample integrity. Store the filter
sample in a 50-mL centrifuge tube (or other suitable container) wrapped in aluminum foil and
freeze using dry ice or a portable freezer. Recheck all forms and labels for completeness and
legibility.

Table 5.5-4. Field Sample Processing Quality Control: chlorophyll a Samples

Quality Control      Description and Requirements                            Corrective Action
Activity
Filtration (done     Whatman GF/F (or equivalent) glass fiber filter.        Discard and refilter
in field)            Filtration pressure should not exceed 7 psi to avoid
                     rupture of fragile algal cells.
Wrap the vial in foil to keep sunlight from degrading the sample and place the vial in a small
Whirl-Pak bag also labeled with Site ID, Date, and chlorophyll a. Immediately place the sample
in an ice chest filled with ice or dry ice.

Record the chlorophyll a sample data on the data log sheet along with other perishable
sample's data going into the ice chest.
Send the sample to the contracted lab overnight via FedEx. If the chlorophyll a samples are held
for a period prior to shipping, place the samples in a freezer until they can be shipped.

-------
National Wetland Condition Assessment March 2012
QA Project Plan Version 2 Page 91 of 1 20


(Figure 5-5 reproduces the scanned NWCA Algae sample collection form, Form ALG-1. Its recoverable
content: header fields for Site ID (NWCA11-    ) and Date; an Epiphytic Algae Algal Toxins Sample
block (sample ID, whether a sample was collected, number of subsamples, comments); an Algae
Taxonomic ID Sample block (sample ID, epiphytic algae and substrate subsamples, total number of
subsamples, comments); and a Biomass: Chlorophyll-a Sample block (sample ID, volume filtered, 500
mL maximum, comments). The comment field is used to explain a missing measurement, a suspect
measurement, or an observation made.)

Figure 5-5: Sample Collection Form


 5.5.6 Quality Control Procedures: Laboratory Operations

 5.5.6.1 Composite Taxonomic Sample

A total of 10% of the samples collected will be analyzed for quality control. Analysts will swap
vials of material and recount only the dominant taxa (10-15% or more of the units counted). Re-
counts will be performed by another experienced taxonomist at an independent laboratory who
did not participate in the original identifications. EPA will inform the laboratories which random
samples will be re-counted. The samples must then be sent from the original laboratory to the
independent laboratory. The QC taxonomist should complete another copy of the Taxonomic
Bench Sheet for each sample. Each bench sheet should be labeled with the term "QC Re-ID."
As each bench sheet is completed, it should be faxed to the Information Management
Coordinator.

EPA will compare the taxonomic results generated by the primary and QC taxonomists for each
sample based on both the raw data and the appropriate metrics (i.e., taxa identified with similar
autecologies). EPA will then calculate percent similarity. It is expected that the soft algae counts
should have a similarity of >50% and the diatom counts should have a similarity of >70%. If not,
the reasons for the discrepancies between taxonomists should be discussed. Results below these
values will be investigated and logged to identify error patterns or trends.
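
For illustration only, the percent-similarity comparison can be sketched as below. This is a minimal
sketch, not an NWCA-prescribed calculation: it assumes the similarity measure is percent community
similarity (the sum of the minimum relative abundances shared by the primary and QC counts), and
the taxon counts shown are hypothetical.

    # Minimal sketch (assumption: percent community similarity of relative abundances).
    # Taxon counts below are hypothetical examples, not NWCA data.

    def percent_similarity(primary, qc):
        """Sum of the minimum relative abundances (%) shared by two counts."""
        taxa = set(primary) | set(qc)
        p_total = sum(primary.values())
        q_total = sum(qc.values())
        return sum(
            min(100.0 * primary.get(t, 0) / p_total,
                100.0 * qc.get(t, 0) / q_total)
            for t in taxa
        )

    primary_count = {"Navicula": 220, "Nitzschia": 150, "Gomphonema": 130}
    qc_count      = {"Navicula": 200, "Nitzschia": 170, "Cymbella": 130}

    similarity = percent_similarity(primary_count, qc_count)
    threshold = 70.0  # 70% expected for diatom counts; 50% for soft algae counts
    if similarity < threshold:
        print(f"Similarity {similarity:.1f}% below {threshold}% - investigate discrepancies")
    else:
        print(f"Similarity {similarity:.1f}% meets the {threshold}% expectation")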

A report or technical memorandum will be prepared by the QC taxonomists. This document will
quantify both aspects of taxonomic precision, assess data acceptability, highlight taxonomic
problem areas, and provide recommendations for improving precision. This report will be
submitted to the Information Management Coordinator, with copies sent to the primary and QC
taxonomists and another copy maintained in the project file. Significant differences may result in
the re-identification of samples  by the primary taxonomist and a second QC check by the
secondary taxonomist. All samples must be stored at the laboratory until the project officer
notifies the lab.

 5.5.6.2 Microcystin Toxin Sample

The Quality Assurance Officer or designee will evaluate overall data quality and QC compliance.
In the event data are not in compliance, the problem(s) will be identified and samples will be
reanalyzed, as appropriate, after corrective action is taken.

The standard curve should have a correlation coefficient of at least 0.99 (as suggested by the
ELISA kit manufacturer).

The absorbance of the blank must be >1.400 (as suggested by the ELISA kit manufacturer).

The Check Standard supplied with the ELISA kit should be analyzed a minimum of two times in each
run, once at the beginning and once at the end, to help ensure the plate was prepared within the
proper time frame. Values should be within ±20% (28.3% relative standard deviation [RSD]) of the
expected value.

Laboratory duplicates should have a  percent Relative Standard Deviation (% RSD) of 28.3
percent or less when compared to each other (as suggested by ELISA kit manufacturer).
If laboratory duplicates are outside of this range, then they should be reanalyzed in the next run.

Quality control samples are available for each project. Criteria for acceptance of measured
values are +/- 20% of expected concentration. These samples are analyzed every time samples
from the same project code are analyzed.

A designated archived project sample is reanalyzed with every run. Control charts are maintained
for these samples, and a running historical average of the concentration from each run is
maintained. The concentration of the QC sample in each successive run must be within ±20 percent
of that average to be acceptable.
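
For illustration, the duplicate, check-standard, and control-chart criteria above reduce to simple
arithmetic. The sketch below is not part of the laboratory method: the measurement values are
hypothetical, and it assumes %RSD is computed from the standard deviation and mean of the duplicate
pair and that the check-standard and control-chart tests are expressed as a fraction of the expected
or running-average value.

    # Illustrative QC arithmetic for the microcystin ELISA criteria above.
    # All numeric values are hypothetical examples.
    from statistics import mean, stdev

    def percent_rsd(values):
        """Percent relative standard deviation of replicate measurements."""
        return 100.0 * stdev(values) / mean(values)

    def within_percent(measured, expected, tolerance_pct=20.0):
        """True if measured is within +/- tolerance_pct of expected."""
        return abs(measured - expected) <= (tolerance_pct / 100.0) * expected

    # Laboratory duplicates: acceptable if %RSD <= 28.3
    dup = (0.82, 0.95)  # ug/L, hypothetical
    print("Duplicate %RSD:", round(percent_rsd(dup), 1),
          "-> pass" if percent_rsd(dup) <= 28.3 else "-> reanalyze in next run")

    # Check standard: measured value within +/- 20% of the expected value
    print("Check standard ok:", within_percent(measured=0.92, expected=1.0))

    # Control chart: QC sample within +/- 20% of the running historical average
    history = [1.05, 0.98, 1.10, 1.02]  # prior runs, hypothetical
    print("Control chart ok:", within_percent(measured=1.08, expected=mean(history)))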

 5.5.6.3 Biomass (chlorophyll a) Sample

QC activities associated with sample receipt and processing are presented in Table 5.5-5. The
communications center and information management staffs are notified of sample receipt and
any associated problems as soon as possible after samples are received.
 Table 5.5-5. Sample Processing Quality Control: Composite and chlorophyll a Samples

 Quality Control Activity      Description and Requirements                             Corrective Action
 Sample Storage                Store samples in darkness and frozen (-20 °C).           Qualify sample as suspect
                               Monitor temperature daily.                               for all analyses
QC protocols are an integral part of all analytical procedures to ensure that the results are
reliable and the analytical stage of the measurement system is maintained in a state of
statistical control. Most of the QC  procedures described here are detailed in the references for
specific methods. However, modifications to the procedures and acceptance criteria described
in this QAPP supersede those presented in the methods references. Information regarding QC
sample requirements, where applicable, and corrective actions are summarized in Table 5.5-6.
Table 5.5-6. Lab sample processing quality controls: chlorophyll a

QC Sample Type (Analytes) and Description: Laboratory Duplicate Sample (all analyses)
   Frequency: One per batch
   Acceptance Criteria: Control limits < precision objective
   Corrective Action: If results are below the LRL, prepare and analyze a split from a different
   sample (volume permitting). Review precision of QCCS measurements for the batch. Check
   preparation of the split sample. Qualify all samples in the batch for possible reanalysis.

QC Sample Type (Analytes) and Description: Standard Reference Material (when available for a
particular analyte)
   Frequency: One analysis in a minimum of five separate batches
   Acceptance Criteria: Manufacturer's certified range
   Corrective Action: Analyze the standard in the next batch to confirm suspected imprecision or
   bias. Evaluate calibration and QCCS solutions and standards for contamination and preparation
   error. Correct before any further analyses of routine samples are conducted. Reestablish control
   by three successive acceptable reference standard measurements. Qualify all sample batches
   analyzed since the last acceptable reference standard measurement for possible reanalysis.

QC Sample Type (Analytes) and Description: Matrix spike samples (only prepared when samples with
potential for matrix interferences are encountered)
   Frequency: One per batch
   Acceptance Criteria: Control limits for recovery cannot exceed 100 ± 20%
   Corrective Action: Select two additional samples and prepare fortified subsamples. Reanalyze all
   suspected samples in the batch by the method of standard additions. Prepare three subsamples
   (unfortified, fortified with solution approximately equal to the endogenous concentration, and
   fortified with solution approximately twice the endogenous concentration).
 5.5.7 Data Management, Review, and Validation

 5.5.7.1 Composite Taxonomic Sample

Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.5-7. The Project Lead is ultimately responsible for ensuring the validity of the data,
although performance of the specific checks may be delegated to other staff members. Once
data have passed all acceptance requirements, computerized  data files are prepared in a format
specified for the  NWCA project. The electronic data files are transferred to NWCA IM
Coordinator at WED-Corvallis for entry into a centralized data base. A hard copy output of all
files will also be sent to the NWCA IM Coordinator.

Sample residuals, vials, and slides are archived by each laboratory until the EPA Project Leader
has authorized, in writing, the disposition of samples. All raw data (including field data forms and
bench data recording sheets) are retained  in  an organized fashion by the IM Staff permanently
or until written authorization for disposition has been received from the EPA Project Leader.

Table 5.5-7. Laboratory Quality Control: Composite Sample (Diatoms and Soft Algae)

IDENTIFICATION
Check or Sample Description: Independent identification by outside taxonomist
   Frequency: All uncertain taxa
   Acceptance Criteria: Uncertain identifications to be confirmed by expert in particular taxa
   Corrective Action: Record both tentative and independent IDs

Check or Sample Description: Use standard taxonomic references
   Frequency: For all identifications
   Acceptance Criteria: All keys and references used must be on bibliography prepared by another
   laboratory
   Corrective Action: If other references desired, obtain permission to use from Project Leader

Check or Sample Description: Prepare reference collection
   Frequency: Each new taxon per laboratory
   Acceptance Criteria: Complete reference collection to be maintained by each individual laboratory
   Corrective Action: Lab Manager periodically reviews data and reference collection to ensure
   reference collection is complete and identifications are accurate

DATA VALIDATION
Check or Sample Description: Taxonomic "reasonableness" checks
   Frequency: All data sheets
   Acceptance Criteria: Genera known to occur in given site or geographic area
   Corrective Action: Second or third identification by expert in that taxon
 5.5.7.2 Microcystin Toxin Sample and Biomass (chlorophyll a) Sample

Checks made of the data in the process of review, verification, and validation are summarized in
Table 5.5-8. Data reporting units and significant figures are given in Table 5.5-9. The Project
Lead is ultimately responsible for ensuring the validity of the data, although performance of the
specific checks may be delegated to other staff members. Once data have passed all
acceptance requirements, computerized data files are prepared in a format specified for the
NWCA project. The electronic data files are transferred to the NWCA IM Coordinator at WED-
Corvallis for entry into a centralized data base. A hard copy output of all files will also be sent to
the NWCA IM Coordinator.

Table 5.5-8. Data validation quality control: chlorophyll a indicator

Activity or Procedure                                       Requirements and Corrective Action
Range checks, summary statistics, and/or exploratory        Correct reporting errors or qualify as suspect
data analysis (e.g., box and whisker plots)                 or invalid
Review data from QA samples (e.g., laboratory PE            Determine impact and possible limitations on
samples or other standards or replicates)                   overall usability of data

Table 5.5-9. Data reporting criteria: chlorophyll a indicator

Measurement      Units     No. Significant Figures     Maximum No. Decimal Places
chlorophyll a    µg/L      2                           1
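
As an illustration of the reporting criteria in Table 5.5-9 (not an NWCA-specified routine), the
sketch below rounds a hypothetical chlorophyll a result to two significant figures and then limits
it to one decimal place.

    # Illustrative rounding to the reporting criteria in Table 5.5-9
    # (2 significant figures, at most 1 decimal place). Hypothetical values.
    from math import floor, log10

    def report_chla(value_ug_per_l):
        if value_ug_per_l == 0:
            return 0.0
        # Round to 2 significant figures...
        sig = round(value_ug_per_l, 1 - int(floor(log10(abs(value_ug_per_l)))))
        # ...then limit to 1 decimal place.
        return round(sig, 1)

    for raw in (0.1234, 3.456, 12.34, 123.4):
        print(raw, "->", report_chla(raw), "ug/L")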
 5.6 Stressors Indicator

Stressors are an important component of a wetland assessment because they degrade ecological
condition and can be used to determine management responses designed to improve condition
(Adamus and Brandt 1990). As the number of stressors accumulates, overall wetland condition is
assumed to decline. We also assume this relationship holds true regardless of wetland class. All or
most of the data collection for stressors will be done as part of the protocols for the other
indicators. For example, hydrology-related stressors will be addressed under the hydrology indicator
in the "NWCA Field Operations Manual" (EPA-843-R-10-001). For more detailed information on stressors
and their application to wetland assessment, please see "Ecological Indicators for the 2011 National
Wetland Condition Assessment" (in preparation).

 5.7 Rapid Assessment  Method

 5.7.1  Introduction

The primary purpose of the USA-RAM is to assess overall condition and stress for the nation's
wetlands as part of the NWCA. The secondary purpose is to provide a rapid assessment method to
U.S. states and Tribes that they can further develop for their own purposes.

USA-RAM focuses on the form and structure of wetlands. For any wetland class, we assume that a
wetland with more complex form and structure, and less stress, tends to support higher levels of
ecological integrity.7 Individual metrics within the condition index are selected and organized to
reflect a set of four core wetland attributes describing ecosystem structure and form. One attribute
reflects wetland hydrology, as represented by water level fluctuation and connectivity to other
aquatic resources. Another attribute reflects physical structure, as represented by topographic
complexity and patch mosaic complexity in a wetland assessment area. The third attribute, the
biological structure of the wetland, is expressed in terms of the vertical complexity of the
vegetation community and overall plant community complexity. A fourth attribute, termed buffer, is
also part of the condition index.

The presentation of stressor metrics within USA-RAM is based on an assessment framework that
assumes wetland exposure to anthropogenic disturbance will affect ecosystem condition. The
magnitude of those effects is related to the proximity, intensity, and duration of stressors acting
on the wetland in a cumulative way. These influences and their interactions cannot be assessed with
a known level of certainty using USA-RAM. Instead, USA-RAM relies on an approach that classifies the
number of human-caused stressors that cause wetland degradation. The overall stress on a wetland is
assessed as the number of evident stressors and their intensity. As the number of stressors
accumulates, overall wetland condition declines. We assume that this relationship holds true
regardless of wetland class.

 Ecological Integrity: The condition of an unimpaired ecosystem as measured by combined chemical,
physical (including physical habitat), and biological attributes.
 Ecological Resilience: The capacity of an ecosystem to withstand disturbance and human-induced stress.

 5.7.2 Sampling Design

USA-RAM is designed to assess overall condition and stress for a 0.5-ha circular Assessment
Area (AA). Condition and stress are assessed separately for each of four attributes (Buffer,
Hydrology, Physical Structure, and Biological Structure), based on unique metrics and their field
indicators. The same attributes,  metrics, and indicators are applied to every AA. Details on the
field protocol can be found in the USA-RAM Manual (Collins and Fennessy 2010).
Attributes               Condition Metrics                      Stress Metrics
Buffer                   Percent of AA Having Buffer            Stress to the Buffer Zone
                         Buffer Width
Hydrology                Water Level Fluctuation                Stressors to Water Quality
                         Hydrological Connectivity              Alterations to Hydroperiod
Physical Structure       Topographic Complexity                 Habitat/Substrate Alterations
                         Patch Mosaic Complexity
Biological Structure     Vertical Complexity                    Percent Cover of Invasive Plants
                         Plant Community Complexity             Vegetation Disturbance
This rapid assessment method uses presence/absence checklists and other semi-quantitative and
narrative metrics that rely on best professional judgment and onsite evidence to measure aspects of
landscape, hydrology, physical structure, and biological structure, generating individual attribute
and aggregate scores that reflect condition at the site.

No USA-RAM data will be sent to a laboratory for further analysis; all metrics are based on field
observations.

 5.7.3 Quality Assurance Objectives

MQOs are given in Table 5.7-1. General requirements for comparability and representativeness are
addressed in Section 2. The MQOs in Table 5.7-1 represent the maximum allowable criteria for
statistical control purposes. Precision is determined from the results of revisits (field
measurements) taken on a different day (at least 2-4 weeks apart).

Table 5.7-1. Measurement data quality objectives: rapid assessment method

Variable or Measurement                    Precision    Accuracy    Completeness
Field Measurements and Observations        ±10%         NA          90%

NA = not applicable in most cases; accuracy would apply only if a field auditor performed a separate
assessment and compared the results to the crew's.
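
The ±10% precision objective can be checked against revisit results with simple arithmetic. The
sketch below is a hedged illustration: it assumes precision for a pair of revisit measurements is
expressed as relative percent difference (RPD), which is one common convention rather than a
statistic this QAPP prescribes here, and the scores are hypothetical.

    # Illustrative revisit precision check (assumption: relative percent difference).
    # Scores are hypothetical.

    def relative_percent_difference(x1, x2):
        """RPD of two repeat measurements, as a percent of their mean."""
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    visit1_score, visit2_score = 8.0, 7.5  # hypothetical USA-RAM scores from two visits
    rpd = relative_percent_difference(visit1_score, visit2_score)
    print(f"RPD = {rpd:.1f}%",
          "-> within +/-10% objective" if rpd <= 10.0 else "-> exceeds objective")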
 5.7.4 Quality Control Procedures: Field Operations

Precision, bias and accuracy of field measurements will not be monitored during the NWCA8.
Control measures to minimize measurement error among crews and sites include the use of
standardized field protocols, consistent training of all crews, field assistance visits to all crews,
and availability of experienced technical personnel during the field season to respond to site-
specific questions from field crews as they arise.

USA-RAM data are collected independently from other NWCA field data to allow for unbiased
calibration of the USA-RAM based on the more intensive NWCA data. The Field Crew Leader
directs the AB Team to complete all USA-RAM sampling activities upon arriving at the site.

Upon completion of sampling, the Field Crew Leader reviews all USA-RAM forms for
completeness, legibility, and errors.

 5.7.5 Data Management, Review,  and Validation

The Field Crew Leader is responsible for the validity of all field-generated data (i.e.,
measurement and observation data) up to the point it is sent to EPA (ORD/Corvallis). Once data
have been delivered to EPA, DQ procedures (as detailed in Chapter 2) will be followed to
ensure the validity of data in storage, analysis, reporting and archiving. All raw data (including
all standardized forms and logbooks) are retained permanently  in an organized fashion in
accordance with  EPA records management policies. No USA-RAM data will be sent to a
laboratory for further analysis; all metrics are based on field observations.

Tables for scoring each Metric are provided in this version of USA-RAM. The same tables are
included in a separate set of data sheets designed for use in the field.  These scoring tables are
preliminary. After the method is fully tested, the scoring tables will be removed from the manual
and the field data sheets. The final data sheets will only include the input data used to calculate
the Metric scores. Results from surveys of the regional networks of reference sites will be used
to develop Metric scoring tables for each region of the U.S. It is anticipated that the score for
each Attribute will be the sum of the scores for its respective Metrics, and that each AA score
will be the sum of its Attribute scores. Every AA will have one AA score, a set of Attribute
scores, and a set of Metric scores.
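
The anticipated roll-up of Metric scores into Attribute scores and a single AA score is a
straightforward summation. The sketch below illustrates it with hypothetical metric scores and the
attribute groupings shown in Section 5.7.2; actual scoring tables will come from the regional
reference-site surveys described above.

    # Illustrative USA-RAM score roll-up (hypothetical metric scores).
    # Attribute score = sum of its metric scores; AA score = sum of attribute scores.

    condition_metric_scores = {
        "Buffer": {"Percent of AA Having Buffer": 10, "Buffer Width": 8},
        "Hydrology": {"Water Level Fluctuation": 9, "Hydrological Connectivity": 7},
        "Physical Structure": {"Topographic Complexity": 6, "Patch Mosaic Complexity": 5},
        "Biological Structure": {"Vertical Complexity": 8, "Plant Community Complexity": 7},
    }

    attribute_scores = {attr: sum(metrics.values())
                        for attr, metrics in condition_metric_scores.items()}
    aa_score = sum(attribute_scores.values())

    for attr, score in attribute_scores.items():
        print(f"{attr}: {score}")
    print("AA condition score:", aa_score)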
 Bias, for example, cannot be determined directly, since the "true" values at any particular site are not
known.

  6   FIELD AND LABORATORY QUALITY EVALUATION AND
      ASSISTANCE VISITS

No national program of accreditation for vegetation and sample processing currently exists.
However, national standards of performance and audit guidance for biological laboratories are
being considered by the National Environmental Laboratory Accreditation Conference (NELAC).
For this reason, a rigorous program of field and laboratory evaluation and assistance visits has
been developed to support the National Wetland Condition Assessment.

Procedural review and assistance personnel are trained to the specific implementation and data
collection methods detailed in the NWCA:  Field Operations Manual (USEPA, 2011[b]). Plans
and checklists for field evaluation and assistance visits have been developed to reinforce the
specific techniques and procedures for both field and laboratory applications. The plans and
checklists are included in this section and describe the specific evaluation and corrective action
procedures.

It is anticipated that evaluation and assistance visits will be conducted with  each Field Team
early in the sampling and data collection process, and that corrective actions will be conducted
in real time. These visits provide a basis for the uniform evaluation of the data collection
techniques, and an opportunity to conduct procedural reviews as required to minimize data loss
due to improper technique or interpretation of program guidance. Uniform training of field crews
and review cycles conducted early in the data collection process will significantly reduce
sampling variability associated with  specific implementation or interpretation of the protocols.
The field visit evaluations, while performed by a number of different supporting collaborator
agencies and participants, will be based on the uniform training, plans, and checklists. This
review and assistance task will be conducted for each unique crew collecting and contributing
data under this program; hence, no data produced by an 'unaudited' process or individual will be
recorded to the project database.

Similarly, laboratory evaluation and  assistance visits will be conducted early in the project
schedule and soon after sample processing begins at each laboratory to ensure that specific
laboratory techniques are implemented consistently across the multiple laboratories generating
data  for the program. Laboratory evaluation and assistance visit plans and  checklists have been
developed to ensure uniform interpretation and guidance in the procedural  reviews. These
laboratory visits are designed such that full corrective action plans and remedies can be
implemented, without recollection of samples, when unacceptable deviations from the documented
procedures are observed in the review process.

The Field and Laboratory Evaluation and Assistance Visit Plans are as follows:

 6.1  Field Quality Evaluation and Assistance Visit Plan for the
      National Wetland Condition Assessment (NWCA)

Evaluators:  One or more designated EPA or Contractor staff members who are qualified (i.e.,
have completed training) in the procedures of the NWCA field sampling operations.

To Evaluate: Regional Monitoring Coordinator-appointed Field Crews during sampling
operations on site.


Purpose:  To identify and correct deficiencies during field sampling operations.

   1.  Training staff will review the Field Evaluation and Assistance Visit Plan and Check List
       with each Evaluator during field operations training sessions.

   2.  The Contractor QA Officer or authorized designee will send a copy of the final Plan and
       4-part carbonless copy versions of the final Check List pages, envelopes to return the
       Check Lists, a clipboard, pens, and NWCA QAPP and Field Operations Manual to each
       participating Evaluator.

   3.  Each Evaluator is responsible for providing their own field gear sufficient to accompany
       the Field Crews (e.g., protective clothing, sunscreen, insect repellent, hat, water bottle,
        food, back pack, cell phone) during a complete sampling cycle. The schedule of the field
        visits will be made by the Evaluator in consultation with the Contractor QA Officer and
       respective Field Crew Leader. Evaluators should be prepared to spend additional time in
       the field if needed (see below).

   4.  TBD Contractor and the Regional Coordinators will arrange the schedule of visitation
       with each Field Crew, and notify the Evaluators concerning site locations, where and
       when to meet the Crew, and how to get there. Ideally, each Field Crew will  be evaluated
       within the first two weeks of beginning sampling operations,  so that procedures can be
       corrected or additional training provided, if needed. EPA Evaluators will visit and
        evaluate TBD Contractor Field Crews. Any EPA or Contractor Evaluator may visit
       State/Tribal Field Crews.

   5.  An NWCA Field Crew consists of four persons where, at a minimum, the Field Crew
       Leader is fully trained.

   6.  If membership of a Field Crew changes, and at least two of the members have not been
       evaluated previously,  the Field Crew must be evaluated again during sampling
       operations as soon as possible to ensure that all members of the Field  Crew understand
       and can perform  the procedures.

   7.  The Evaluator will view the performance of a Crew through one complete set  of sampling
       activities as detailed on the Field Evaluation and Assistance Check List.

       a.  Scheduling might  necessitate starting the evaluation midway on the list of tasks at a
          site, instead of at the beginning. In that case, the Evaluator will follow the  Crew to the
          next site to complete the evaluation of the first activities on the list.

       b.  If the Crew misses or incorrectly performs a  procedure, the Evaluator will  note this on
          the  checklist and immediately point this out so the mistake can be corrected on the
          spot. The role of the Evaluator is to provide additional training and guidance so that
          the  procedures are being performed consistent with the Field Operations Manual, all
          data are recorded correctly, and paperwork is properly completed at the site.

       c.  When the sampling operation has  been completed, the Evaluator will review the
          results of the evaluation with the Field Crew Leader before leaving the site (if
          practicable), noting positive practices and problems (i.e.,  weaknesses [might affect
          data quality];  deficiencies [would adversely affect data quality]). The Evaluator will


          ensure that the Crew understands the findings and will be able to perform the
          procedures properly in the future.

       d.  The Evaluator will record responses or concerns, if any, on the Field Evaluation and
          Assistance Check List. They will review this list with the field sampling crew at the
          site.

       e.  If the Evaluator's findings indicate that the Field Crew is not performing the
          procedures correctly, safely, or thoroughly, the Evaluator must continue working with
          this Field Crew until certain of the Crew's ability to conduct the sampling properly so
          that data quality is not adversely affected.

       f.  If the Evaluator finds major deficiencies in the Field Crew operations (e.g., less than
          three members, equipment or performance problems) the Evaluator must contact
          one of the following QA officials:

             i.       Regina Poeske, EPA NWCA QA Assistance Visit Coordinator (215) 814-
                     2725.
             ii.       Sarah Lehmann, EPA NWCA Project QA Officer (202-566-1183)

                  The QA official will contact the EPA Project Leader (Michael Scozzafava -
                     202-566-1376) or Alternate EPA Project Leader (Chris Faulkner - 202-
                     566-1185 or Gregg Serenbetz 202-566-1253) to determine  the
                     appropriate course of action.

   8.  Data records from sampling sites previously visited by this Field Crew will be checked to
       determine whether any sampling sites must be redone.

   9.  Complete the Field Evaluation and Assistance Check List, including  a brief summary of
       findings, and ensure that all Crew members have read this and signed off before leaving
       the Crew.

   10. Fasten the pages of the check list for each Field Crew together with  a paper clip.

   11. Mail the remaining pages of each completed Field Evaluation and Assistance Check List
       to:

             Marlys Cappaert
             SRA/Raytheon
             USEPA-WED
             200 West 25th St.
             Corvallis, OR 97333

          Each set of Assistance Visit forms will be scanned and the data will be  entered into
          the NWCA Information Management System. The EPA NWCA QA Assistance Visit
           Coordinator will review the Field Evaluation and Assistance Check Lists, note any
           issues, and check off the completion of the evaluation for each Field Team.

 6.2  Laboratory Quality Evaluation and Assistance Visit Plan for the
      National Wetland Condition Assessment (NWCA)

Evaluators: One or more designated NWCA QA Assistance Visit staff members who are
qualified (i.e., have completed training) in the procedures of the NWCA laboratory operations.

To Evaluate: Laboratories performing chemical, vegetation, diatom or algal analysis or
subsampling, sorting, and taxonomic procedures to analyze wetland samples.

Purpose: To identify and correct deficiencies during laboratory operations and procedures.

   1.  NWCA QA Assistance Visit project staff will review the Laboratory Evaluation and
      Assistance Visit Plan and Check List with each Evaluator prior to conducting laboratory
      evaluations.

   2.  The Contractor QA Officer or authorized designee will send a copy of the final Plan and
      4-part carbonless copy versions of the final Check List pages, envelopes to return the
      Check Lists, a clipboard, pens, and NWCA QAPP and Laboratory Methods manual to
      each participating Evaluator.

   3.  Each laboratory analyzing samples will  receive an Assistance Visit or equivalent
      evaluation from an Evaluator. Those laboratories receiving assistance visits include the
      Natural Resource Conservation Service Laboratory (soil chemistry and bulk density) in
      Lincoln, Nebraska and EcoAnalyst Laboratory (algae and vegetation taxonomy) in
       Moscow, Idaho. A remote evaluation procedure will be used for all water chemistry
       laboratories, including the U.S. Geological Survey Laboratory in Denver, CO; the
       Wisconsin State Laboratory in Madison, WI; and the ORD Western Ecology Division
       Laboratory in Corvallis, OR. The Evaluator will also use a remote evaluation procedure
       for all state vegetation laboratories. An interlaboratory comparison investigation will be
       used for the algal toxins laboratories, including the Wisconsin State Laboratory in
       Madison, WI, and the U.S. Geological Survey Kansas Water Science Center in
       Lawrence, KS.
Lab                                                        Analysis                   Type of Evaluation
Natural Resource Conservation Service (NRCS)               Soil Chemistry and         Assistance Visit
Lab - Lincoln, NE                                          Bulk Density
EcoAnalyst Lab - Moscow, ID                                Algae and Vegetation       Assistance Visit
                                                           Taxonomy
U.S. Environmental Protection Agency (EPA) ORD             Water Chemistry            Remote Evaluation
Western Ecology Division Lab - Corvallis, OR - Dynamac
U.S. Geological Survey - Denver, CO                        Water Chemistry            Remote Evaluation
Wisconsin State Labs - Water Chemistry                     Water Chemistry            Remote Evaluation
State Vegetation Labs                                      Vegetation Taxonomy        Remote Evaluation
U.S. Geological Survey - Kansas Water Science              Algal Toxin                Interlaboratory
Center, Lawrence, KS                                                                  Comparison
Wisconsin State Lab - Algal Toxins                         Algal Toxins               Interlaboratory
                                                                                      Comparison
   4.  The Contractor will make sure that all lab audits are performed before the first ten
       percent of samples are analyzed, and notify the Evaluators concerning site locations,
       where and when to visit the laboratory, and how to get there. Ideally, each Laboratory
       will be evaluated within the first two weeks following initial receipt of samples, so that
       procedures can be corrected or additional training provided, if needed.

   5.  The Evaluator will schedule lab visits, schedule teleconference calls, and obtain
       documentation in consultation with the Contractor QA Officer and the respective
       Laboratory Supervisor Staff. Evaluators should be prepared to spend additional time in
       the laboratory if needed (see below).

       a.  For those laboratories (Natural Resource Conservation Service Laboratory and
          EcoAnalyst Laboratory) receiving an Assistance Visit, the Evaluator will observe the
          performance of the laboratory procedures and QC Officer through one complete set
          of sample processing activities as detailed on the Laboratory Evaluation and
          Assistance Check List.

            i.    Scheduling might necessitate starting the evaluation midway on the list of
                 tasks for processing a sample, instead of at the beginning. In that case, the
                 Evaluator will view the activities of the laboratory personnel when a new
                 sample is started to complete the evaluation of the first activities on the list.
            ii.    If laboratory personnel miss or incorrectly perform a procedure, the Evaluator
                 will note this on the checklist and immediately point this out so the mistake
                 can be corrected on the spot. The role of the  Evaluator is to provide
                 additional training and guidance so that the procedures are being performed
                 consistent with the Laboratory Operations Manual, all data are recorded
                 correctly, and paperwork is properly completed at the site.

            iii.    When the sample has been completely processed or analyzed, the Evaluator
                 will review the results of the evaluation with laboratory personnel and  QC
                 Officer, noting positive practices and problems (i.e., weaknesses [might affect
                 data quality]; deficiencies [would adversely affect data quality]). The
                 Evaluator will ensure that the laboratory personnel  and QC Officer
                 understand the findings and will  be able to perform the procedures properly in
                 the future.

            iv.    The Evaluator will record responses or concerns, if any, on the Laboratory
                 Evaluation and Assistance Check List. All Laboratory Evaluations and
                 completed checklists are sent to the NWCA Project Manager. The NWCA
                 Project Manager will retain the records permanently in an organized fashion
                 in accordance with EPA records management policies.

            v.    If the Evaluator's findings indicate that Laboratory staff are not performing the
                 procedures correctly, safely, or thoroughly, the Evaluator must continue
                 working with these staff members until certain of their ability to process the
                 sample properly so that data quality is not adversely affected.


           vi.    If the Evaluator finds major deficiencies in the Laboratory operations, the
                 Evaluator must contact one of the following QA officials:

                    1.  Regina Poeske, EPA NWCA QA Assistance Visit Coordinator (215)
                       814-2725.

                    2.  Sarah Lehmann, EPA NWCA Project QA Officer (202-566-1183)

                 The QA official will contact the EPA Project Leader (Michael Scozzafava -
                 202-566-1376) or Alternate EPA Project Leader (Chris Faulkner - 202-566-
                 1185 or Gregg Serenbetz 202-566-1253) to determine the proper course of
                 action. Data records from samples previously processed by this Laboratory
                 will be checked to determine if samples must be redone.  In cases where
                 irresolvable deficiencies are noted, the EPA Project Leader will direct the
                 laboratory to stop processing  samples and send them to another laboratory.

       b.  For those water chemistry laboratories receiving a remote evaluation  (U.S.
          Geological Survey Laboratory in Denver, CO, the Wisconsin State Laboratory, and
          the ORD Laboratory), the Evaluator will request the laboratory to provide
          documentation of its policies and procedures (see Section 1.3.2 Overview of
          Laboratory Operations and Laboratory SOPs in Chapter 2 for details), including:

             i.    The laboratory's Quality Manual, Quality Management Plan or similar
                 document

            ii.    Standard Operating Procedures (SOPs) for each analysis to be performed

           iii.    Method Detection Limits (MDLs) for each instrument used and Demonstration
                 of Capability (DOC) for each analysis to be performed

           iv.    A list of the laboratory's accreditations and certifications, if any

            v.    Results from Proficiency Tests for each analyte to be analyzed under the
                 NWCA project

          If a laboratory has clearly documented procedures for sample receiving, storage,
          preservation,  preparation, analysis, and data reporting; has successfully analyzed
          Proficiency Test samples; has a Quality Manual that thoroughly addresses laboratory
          quality including standard and sample preparation, record keeping and QA non-
          conformance; participates in a nationally recognized or state certification program;
           and has demonstrated the ability to perform the testing for the program/project for
           which the audit is intended, then the need for an on-site visit will be waived. The EPA Project
          Leader will make a final decision on the need for an actual on-site visit after the
          review and evaluation of the documentation requested.

       c.  For participating state vegetation laboratories that are receiving a remote  evaluation,
          the Evaluator will disseminate a checklist to each participating laboratory and
          schedule a teleconference to discuss the checklist.  This teleconference will also be
          used as an opportunity for the laboratories to ask questions about the analytical
          procedures, tracking, and reporting requirements.


            i.    The role of the Evaluator is to provide additional training and guidance so that
                 the procedures are performed consistent with the Laboratory Operations
                 Manual, all data are recorded correctly, and paperwork is properly completed
                 at the site. For vegetation laboratories, the checklist will focus on how
                 carefully the specimens were pressed, preserved and stored prior to their
                 identification by the expert botanists.  During the teleconference, the
                 Evaluator will  note any incorrectly performed procedures on the checklist and
                 immediately point them out so the mistake can be corrected.

            ii.    When the teleconference call is complete, the Evaluator will review the
                 results of the evaluation with laboratory personnel and QC Officer, noting
                 positive practices and problems (i.e., weaknesses [might affect data quality];
                 deficiencies [would adversely affect data quality]). The Evaluator will ensure
                 that the laboratory personnel and QC Officer understand the findings and will
                 be able to  perform the procedures properly in the future. The Evaluator will
                 send an email summary of the evaluation findings to each laboratory. All
                 Laboratory Evaluations and  completed checklists are sent to the NWCA
                 Project Manager. The NWCA Project Manager will retain the records
                 permanently in an organized fashion in accordance with EPA records
                 management  policies.
            iii.    The Evaluator will record responses or concerns, if any, on the Laboratory
                 Evaluation and Assistance Check List.

            iv.    If the Evaluator's findings indicate that Laboratory staff are not performing the
                 procedures correctly, safely, or thoroughly, the Evaluator must continue
                 working with these staff members until certain of their ability to process the
                 sample properly so that data quality is not adversely affected.

             v.    After all identifications have been completed by each state vegetation
                  laboratory, each state will calculate its Percent Taxonomic Difference (a
                  computational sketch follows this list). If this value exceeds 15 percent, the
                  Evaluator will follow up with the laboratory to make sure it is aware of the
                  deficiency. The EPA NWCA QA Assistance Visit Coordinator will arrange a
                  conference call between the participating laboratory botanists to try to
                  resolve the conflicting identifications.

            vi.    If the Evaluator finds major deficiencies in the Laboratory operations, the
                 Evaluator must contact one of the following QA officials:

                    1.  Regina Poeske,  EPA NWCA QA Assistance Visit Coordinator (215)
                        814-2725.

                    2.  Sarah Lehmann, EPA NWCA Project QA Officer (202-566-1183)

                 The QA official will contact the EPA Project Leader (Michael Scozzafava -
                 202-566-1376) or Alternate EPA Project Leader (Chris Faulkner - 202-566-
                 1185 or Gregg Serenbetz 202-566-1253) to determine what should be done.
                 Data records from samples previously processed by this Laboratory will be
                 checked to determine whether any samples must be reidentified.  In cases


                 where irresolvable deficiencies are noted, the EPA Project Leader will direct
                 the laboratory to stop  processing samples and send them to another
                 laboratory.

       d.  For those laboratories (Wisconsin State Laboratory and the U.S. Geological Survey
          Kansas Water Science Center) undergoing an interlaboratory investigation, the
          Evaluator will coordinate a blind interlaboratory comparison for microcystin
          measurements. This comparison will include an analysis of spiked and unspiked
          samples by the two participating laboratories. The study design for this
          interlaboratory comparison is found in Appendix C.  No site visit is envisioned for
          these labs unless the data submitted and reviewed  by EPA does not meet the
          requirements of the interlaboratory comparison.

            i.    The Evaluator will examine a spreadsheet of the results and discuss any
                 discrepancies with each laboratory.

            ii.    Corrective actions may include:
                    a.  A discussion with the laboratory of possible reasons for differences
                       outside of acceptable criteria.
                    b.  Reanalysis if there is a deviation from acceptable criteria.

   6.  The Evaluator will complete the Laboratory Evaluation and Assistance Check List,
       including a brief summary of findings for each laboratory as needed. When conducting
       an Assistance Visit, the Evaluator must ensure that the appropriate lab personnel and
       QC Officer have read this and signed off before leaving the Laboratory.

   7.  The Evaluator will send completed Laboratory Evaluation and Assistance Check Lists to
       the EPA NWCA QA Assistance Visit Coordinator.

   8.  The EPA NWCA QA Assistance Visit Coordinator will review the Laboratory Evaluation
       and Assistance Check Lists, note any issues, and check off the completion of the
       evaluation for each participating Laboratory. All Laboratory Evaluations and completed
       checklists are to be sent to the NWCA Project Manager and retained permanently in an
       organized fashion in accordance with EPA  records management policies.
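
As referenced in item 5.c.v above, the 15-percent criterion for Percent Taxonomic Difference can be
checked with a specimen-by-specimen comparison of the state laboratory identifications against the
QC identifications. The sketch below is illustrative only: it assumes PTD is the percentage of
compared specimens whose identifications disagree, and the specimen records are hypothetical.

    # Illustrative Percent Taxonomic Difference (PTD) check (hypothetical records).
    # Assumption: PTD = 100 * (number of disagreeing identifications / specimens compared).

    primary_ids = {"spec-01": "Carex stricta", "spec-02": "Typha latifolia",
                   "spec-03": "Phalaris arundinacea", "spec-04": "Salix nigra"}
    qc_ids      = {"spec-01": "Carex stricta", "spec-02": "Typha angustifolia",
                   "spec-03": "Phalaris arundinacea", "spec-04": "Salix nigra"}

    compared = [s for s in primary_ids if s in qc_ids]
    disagreements = sum(1 for s in compared if primary_ids[s] != qc_ids[s])
    ptd = 100.0 * disagreements / len(compared)

    print(f"PTD = {ptd:.1f}%",
          "-> acceptable" if ptd <= 15.0 else "-> follow up with the laboratory")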

  7   DATA ANALYSIS PLAN

The Data Analysis Plan describes the general process used to evaluate the data for the survey.
It outlines the steps taken to assess the condition of the nation's wetlands and identify the
relative impact of stressors on this condition. Results from the analysis will be included in the
final report and used in future analysis. This is the first analysis of wetlands of this scope and
scale, so the data analysis plan will likely be refined and clarified as the data are analyzed by
EPA and states.

 7.1  Data Interpretation  Background

The basic intent of data interpretation is to evaluate the occurrence and distribution of
parameters throughout the population of wetlands in the United States within the context of
regionally relevant expectations for least disturbed  reference conditions. This is  presented using
a cumulative distribution function  or similar graphic. For most indicators the analysis will also
categorize the condition  of the wetland as good, fair, or poor. Because of the large-scale and
multijurisdictional nature of this effort, the key issues for data interpretation are unique and
include: the scale of assessment, selecting the best indicators, defining the least impacted
reference conditions, and determining thresholds for judging condition.

Scale of assessment. This will be the first national report on the ecological condition of the
nation's wetlands using comparable methods. EPA selected the sampling locations for the
survey using a probability based design, and developed rules for selection to meet certain
distribution criteria, while ensuring that the design yielded a set of wetlands that would provide
for statistically valid conclusions about the condition of the population of wetlands across the
nation. A challenge that this mosaic of sites poses is developing a  data analysis plan that allows
EPA and other partners to interpret data and present results at a large, aggregate scale.

Selecting the best indicators. Indicators should be applicable across all reporting units, and
must be able to differentiate a range  of conditions. As part of the indicator selection process,
EPA Headquarters and EPA Office of Research and Development Western Ecology Division
held a conference in April 2008 to gather input from state experts. The Agency also formed a
steering committee with state and regional representatives to develop and refine indicators and
sampling methodologies.

EPA developed screening and evaluation criteria which included indicator applicability on a
national scale, the ability of an indicator to reflect various aspects of ecological condition, and
cost-effectiveness.

Defining least impacted reference condition. Reference condition data are necessary to describe
expectations for biological conditions under least disturbed settings. EPA has identified and will
sample 150-200 reference wetlands stratified by wetland class. EPA: (1) compiled lists of candidate
reference wetlands from the 10 regions based on best professional judgment from the states and/or
regions. Allocation of candidate wetlands to be sampled was based on wetland class, EPA Region, and
national Ecoregion; (2) examined candidate reference wetlands for disturbances using aerial
photographs of a 100 m buffer around the wetland. Disturbances were scored from 0-3 in seven
categories (residential, agricultural, recreational, industrial, forestry, water development,
roads). Disturbance scores for each category were summed into one "total photo" score for use as an
overall disturbance index (0 = no noted disturbances); (3) gave wetlands with a low "total photo"
score higher preference for inclusion. Wetlands were stratified by FWS Status and Trends wetland
category, and then states were used to spread out the sample spatially. In cases of "ties" (similar
total photo scores), wetlands with agricultural and industrial disturbances (as opposed to
road/recreation type disturbances) were dropped first. After that, "tie" wetlands were picked
randomly to fill out cells in the table. In addition to the selection of primary reference wetlands,
alternates were listed to be used in case of limited access issues with the primary wetlands. When
replacing a primary wetland with an alternate, those with a similar ecoregion/wetland size were
selected; (4) determined the number and types of reference wetlands appropriate and feasible for
each region and selected reference wetlands for inclusion in the 2011 sampling effort.

   1.   Selecting a classification system. The U.S. FWS Wetlands  Status and Trends
       classification system (a modified version of the Cowardin wetland classification system)
       will be used to determine how wetland sites are selected in the NWCA survey design.
        The design will stratify by state and wetland type. After field data are collected and
        compiled, we will test the utility of the various classification systems for use in
       establishing reference  condition.

   2.   Identifying Candidate Reference Sites. Candidate reference sites selected for NWCA will
        be screened to meet region-specific criteria based on a stressor profile, surrounding
       land use, and physical criteria. These sites will be drawn from a population of "hand-
       picked sites" and from  probability sample sites.

Determining thresholds for judging condition. This reference site approach is then used to
set expectations and  benchmarks for interpreting the  data on wetland condition. The range of
conditions found in the reference sites for an ecoregion describes a distribution of those
biological or stressor values expected for least disturbed condition. The benchmarks used to
define distinct condition classes (e.g., good, fair, poor/ least disturbed, intermediate, most
disturbed) are drawn  from this reference distribution. EPA's approach is to examine the range of
values for a biological or stressor indicator in all of the reference sites in a region, and to use the
5th percentile of the reference distribution for that indicator to separate the most disturbed of all
sites from  moderately disturbed sites. Using the 5th percentile means that wetlands in the most
disturbed category are worse than 95% of the best sites used to define reference condition.
Similarly, the 25th percentile of the reference distribution can be used to distinguish between
moderately disturbed sites and those in least disturbed condition. This means that wetlands
reported as least disturbed are as good as 75% of the sites used to define reference condition.
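
The benchmark logic described above can be expressed compactly: the 5th and 25th percentiles of the
reference-site distribution separate most disturbed, moderately disturbed, and least disturbed
classes. The sketch below is illustrative, uses hypothetical reference and survey values, and
assumes higher indicator scores indicate better condition.

    # Illustrative condition benchmarks from a reference-site distribution.
    # Assumes higher indicator scores mean better condition; values are hypothetical.
    import numpy as np

    reference_scores = np.array([62, 70, 75, 78, 80, 82, 85, 88, 90, 93])  # hypothetical reference sites

    p5, p25 = np.percentile(reference_scores, [5, 25])

    def condition_class(score):
        if score < p5:
            return "most disturbed (poor)"
        elif score < p25:
            return "moderately disturbed (fair)"
        return "least disturbed (good)"

    for site_score in (58, 72, 91):  # hypothetical survey sites
        print(site_score, "->", condition_class(site_score))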

 7.2   Datasets Utilized for the Report

The datasets  available for use in the report were developed based on analytical methods
selected during the NWCA data analysis workshop. Many of the analytical methods used in the
survey stem from discussions, input, and feedback provided by the National Wetland Condition
Assessment Steering Committee. Many of the methods are an outgrowth of the testing and
refinement of the existing and  developed methods and the logistical foundation constructed
during the implementation of the Environmental Monitoring and Assessment Program (EMAP)
studies from 1991 through 1994, from a Gulf Breeze pilot study conducted in 2008/9, from
focused pilot studies for methods development, and from  various State wetland  assessment
methods currently in use.


Ecological integrity. The survey will use indicators to assess ecological integrity. Ecological
integrity describes the ecological condition of a wetland based on the assemblages of its vegetative
community, its soil characteristics, the presence of appropriate hydrology, and its physical
habitat. The indicators include vegetation, soils, hydrology, algae, and water chemistry.

 7.3  Vegetation, Soft Algae, and  Diatom Data Analysis

Vegetation, Soft Algae, and Diatom data will be analyzed using both multimetric indices (MMI)
and observed/expected (O/E) models. The MMI approach summarizes various
assemblage attributes, such as composition,  tolerance to disturbance, trophic and habitat
preferences, as individual metrics or measures of the biological community. Candidate metrics
are evaluated for aspects of performance and a subset of the best performing metrics are
combined into an index known as a Vegetation, Algae, or Diatom Index of Biotic Condition (IBI).
This index is then used to rank the condition of the resource.

The predictive model or O/E approach estimates the expected taxonomic composition of an
assemblage in the absence of human stressors, using a set of least-disturbed sites and other
variables related to natural gradients, such as elevation,  wetland size, latitude and longitude.
The resulting  models are then used to estimate the expected taxa composition (taxa richness)
at each site sampled. The number of expected taxa actually observed at a site is compared to
the number of expected taxa as an observed/expected (O/E) ratio or index. Departures from a ratio of
one indicate that the taxonomic composition in the sample differs from that expected under least
disturbed conditions.
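
Once the predictive model supplies a capture probability for each expected taxon, the O/E
computation itself is a simple ratio. The sketch below is a hedged illustration of that final step
only; the taxa, capture probabilities, and observed list are hypothetical, and the modeling that
produces the expected values is not shown.

    # Illustrative O/E ratio (hypothetical taxa and model capture probabilities).
    # E = sum of modeled capture probabilities; O = number of those taxa observed.

    expected_probabilities = {           # from a predictive model, hypothetical
        "Taxon A": 0.9, "Taxon B": 0.8, "Taxon C": 0.6, "Taxon D": 0.5,
    }
    observed_taxa = {"Taxon A", "Taxon C"}  # taxa found in the sample, hypothetical

    E = sum(expected_probabilities.values())
    O = sum(1 for taxon in expected_probabilities if taxon in observed_taxa)

    print(f"O = {O}, E = {E:.1f}, O/E = {O / E:.2f}")
    # An O/E well below 1 suggests taxa expected under least disturbed conditions are missing.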

 7.4  Soils, Hydrology and  Water Quality Data Analysis

A wide array of soil and water parameters will be measured, including a mix of field and lab-
derived values. Results from an analysis of soil chemistry, water chemistry, soil structure, and
hydrologic alteration will feed into an assessment framework to estimate the  extent  of key
stressors and the relative risks that stressors pose to wetland condition.

EPA will develop a set of regional stressor profiles which are qualitative characterizations of the
general types of human-caused stressors that affect wetlands within a broadly defined
landscape. The analytical process of grouping stressors  into a profile takes into account the
dominant land use and  climatic conditions surrounding the surveyed population of wetlands.

We will then calculate a Human Disturbance Index (HDI) based on field observations tallying the
presence and proximity of types of human activities or disturbances at Y systematically located
positions within an assessment area. The HDI incorporates both the extent of human activities
and the intensity of those activities. The extent will be expressed simply as the proportion of an
assessment area that has at least one type of human activity recorded within its boundary. The
intensity of human disturbances will be expressed by the mean proximity-weighted tally of the
number of types of human land-use activities in the assessment area.
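
The two HDI components follow directly from the tallies recorded at the systematically located
positions. The sketch below is illustrative only: the position tallies, proximity classes, and
weights are hypothetical choices made for the example rather than the NWCA specification.

    # Illustrative Human Disturbance Index components (hypothetical tallies and weights).
    # Each record: number of human-activity types observed at a position, by proximity class.

    positions = [
        {"in_aa": 2, "adjacent": 1},   # activity types within the AA and adjacent to it
        {"in_aa": 0, "adjacent": 2},
        {"in_aa": 0, "adjacent": 0},
        {"in_aa": 1, "adjacent": 0},
    ]
    weights = {"in_aa": 1.0, "adjacent": 0.5}  # hypothetical proximity weights

    # Extent: proportion of positions with at least one activity recorded within the AA.
    extent = sum(1 for p in positions if p["in_aa"] > 0) / len(positions)

    # Intensity: mean proximity-weighted tally of activity types across positions.
    intensity = sum(sum(weights[k] * p[k] for k in weights) for p in positions) / len(positions)

    print(f"Extent = {extent:.2f}, Intensity = {intensity:.2f}")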

Relative Extent, Relative Risk, and Attributable Risk Evaluation

Each targeted reference site and survey site will be classified as being in either "Good", "Fair",
or "Poor" condition, separately for each stressor variable and for each MMI (response variable).
From these data, an estimate will be made of the relative extent (prevalence) of wetlands in Poor
condition for a specified stressor and an MMI.

The relative risk (RR) of each stressor for a biological response will also be estimated. RR
measures the severity of a stressor's effect on that response in an individual wetland
assessment area when that stressor is in Poor condition (Van Sickle et al. 2006).

Finally, the population attributable risk (AR) of each stressor for a biological response will be
estimated. AR combines RR and relative extent into a single measure of the overall impact of a
stressor on a biological response, over the entire wetland resource (Van Sickle and Paulsen
2008).
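
The sketch below (Python) illustrates unweighted versions of these three quantities for a single
stressor/response pair, using one common attributable-risk formula. The survey estimates would
incorporate the design weights and interval estimators, which are omitted here, and the example
site classifications are invented.

    # Unweighted illustration of relative extent, relative risk (RR), and
    # attributable risk (AR) for one stressor/response pair. Each site is a
    # (stressor_condition, response_condition) pair of class labels.
    def relative_extent(sites):
        """Proportion of sites with the stressor in Poor condition."""
        return sum(1 for s, _ in sites if s == "Poor") / len(sites)

    def relative_risk(sites):
        """P(Poor response | Poor stressor) / P(Poor response | stressor not Poor)."""
        exposed = [r for s, r in sites if s == "Poor"]
        unexposed = [r for s, r in sites if s != "Poor"]
        p_exp = sum(1 for r in exposed if r == "Poor") / len(exposed)
        p_unexp = sum(1 for r in unexposed if r == "Poor") / len(unexposed)
        return p_exp / p_unexp

    def attributable_risk(sites):
        """One common population attributable risk form, combining the extent
        of the Poor stressor condition with its relative risk."""
        p_e, rr = relative_extent(sites), relative_risk(sites)
        return p_e * (rr - 1.0) / (1.0 + p_e * (rr - 1.0))

    sites = [("Poor", "Poor"), ("Poor", "Poor"), ("Poor", "Good"),
             ("Good", "Poor"), ("Good", "Good"), ("Fair", "Good")]
    print(relative_extent(sites), round(relative_risk(sites), 2),
          round(attributable_risk(sites), 2))  # 0.5 2.0 0.33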

 7.5   Rapid Assessment Data Analysis and Methodology Evaluation

The USA Rapid Assessment Method (USA-RAM) is a field assessment method that
complements the other multi-metric indices used by NWCA. It includes a coarse multi-metric
index used to assess the ecological condition of wetlands, plus a separate set of stressor
metrics organized within USA-RAM to help diagnose the causes of observed degradation and to
identify opportunities for ecosystem protection, including restoration and enhancement.

USA-RAM focuses on the form and structure of wetlands. For any wetland class, we assume
that a larger wetland with more complex form and structure, and less stress, tends to support
higher levels of ecological integrity.1 Individual metrics within the condition index are selected
and organized to reflect a core set of hydrogeomorphic (structural) wetland attributes. Those
structural attributes reflect wetland hydrology, including the source of water, hydroperiod, and
connectivity to other aquatic resources. They also reflect physical structure, including the
topographic complexity in a wetland assessment area. The biological component of wetland
structure is expressed in terms of the general composition,  and vertical  and horizontal structure
of vascular plant communities. A fourth hydrogeomorphic attribute, termed landscape context, is
also part of the condition index. Wetland buffer characteristics are part of the landscape
attribute.

The presentation of stressor metrics within USA-RAM is based on an assessment framework
that assumes wetland exposure to anthropogenic disturbance will affect both ecosystem
condition and ecological resilience.2 The magnitude of those effects is related to the proximity,
intensity and duration of stressors acting on the wetland in a cumulative way. These influences
and their interactions cannot be assessed with a known level of certainty using USA-RAM.
Instead, USA-RAM relies on a weight-of-evidence approach to rank the causes of observed
wetland degradation. The approach involves a classification and sorting of stressor types and
an arithmetic "roll-up" of stressors based on their proximity, intensity and duration of effect on
wetland assessment areas. Results from the tallying process are used to screen for correlations
between wetland condition and the likely source or sources of degradation (i.e., stressor
occurrence).
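
As a simple illustration of such a roll-up, the sketch below (Python) sums proximity-, intensity-,
and duration-scaled scores within each stressor category; the category names and the 1-3 scores
are hypothetical and do not reproduce the USA-RAM checklists or weighting conventions.

    # Hypothetical arithmetic roll-up: each recorded stressor carries 1-3 scores
    # for proximity, intensity, and duration, and the products are summed by
    # stressor category to rank likely sources of degradation.
    def roll_up(stressor_records):
        totals = {}
        for category, proximity, intensity, duration in stressor_records:
            totals[category] = totals.get(category, 0) + proximity * intensity * duration
        return totals

    records = [("hydrology", 3, 2, 3),   # e.g., ditching inside the AA
               ("hydrology", 1, 1, 2),   # e.g., distant impoundment
               ("buffer",    2, 3, 1)]   # e.g., recent mowing adjacent to the AA
    print(roll_up(records))  # {'hydrology': 20, 'buffer': 6}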

USA-RAM will be calibrated for each specific wetland type using the Vegetation and Algae MMI
and O/E scores described above.
1 Ecological Integrity: The condition of an unimpaired ecosystem as measured by combined chemical, physical
(including physical habitat), and biological attributes.
2 Ecological Resilience: The capacity of an ecosystem to withstand disturbance and human-induced stress.



  8   REFERENCES

Adamus, P. R., and K. Brandt. 1990. Impacts on quality of Inland Wetlands of the United States:
A survey of indicators, techniques, and applications of community level biomonitoring data.
EPA/600/3-90/073, U.S. Environmental Protection Agency, Environmental Research
Laboratory, Corvallis, OR.

Baker, J.R., and G.D. Merritt. 1990. Environmental Monitoring and Assessment Program:
Guidelines for Preparing Logistics Plans. EPA 600/4-91-001. U.S. Environmental Protection
Agency, Las Vegas, Nevada.

Bender, John, March 2009,  personal communication.

Bourdaghs, M., C. A. Johnston, and R. R. Regal. 2006. Properties and performance of the
floristic quality index in Great Lakes coastal wetlands. Wetlands 26:718-735.

Dahl, T.E. 2005. Status and Trends of Wetlands in the Conterminous United States 1998 to
2004. U.S. Department of the Interior, Fish and Wildlife Service, Washington, D.C.

Diaz-Ramos, S., D.  L. Stevens, Jr., and A.  R. Olsen. 1996. EMAP Statistical Methods Manual.
EPA/620/R-96/002,  U.S. Environmental Protection Agency, Office of Research and
Development, NHEERL-Western Ecology Division, Corvallis, Oregon.

Garner,  F.C., M.A. Stapanian, and K.E. Fitzgerald. 1991. Finding causes of outliers in
multivariate environmental data. Journal of Chemometrics. 5: 241-248.

Garnier, E., J. Cortez, G. Billès, M.-L. Navas, C. Roumet, M. Debussche, G. Laurent, A.
Blanchard, D. Aubry, A. Bellmann, C. Neill, and J.-P. Toussaint. 2004. Plant Functional Markers
Capture Ecosystem Properties During Secondary Succession. Ecology 85:2630-2637.

Glaser et al., 1981 [Section  2.2.1, from Lakes QAPP. Cited,  but no complete reference]

Heinz Center. 2002. The State of the Nation's  Ecosystems. The Cambridge University Press.

Horn, C.R., and W.M. Grayman. 1993. Water-quality modeling with EPA reach file system.
Journal of Water Resources Planning and Management 119:262-274.

Hunt, D.T.E., and A.L. Wilson. 1986. The chemical analysis of water: general principles and
techniques. 2nd edition. Royal Society of Chemistry, London, England.

Kaufmann,  P. R., P. Levine, E. G. Robison, C. Seeliger, and D. V. Peck.  1999. Quantifying
physical habitat in wadeable streams. EPA 620/R-99/003, Office of Research and Development,
U.S. Environmental  Protection Agency, Washington, DC.

Kirchmer, C.J. 1983. Quality control in water analysis.  Environmental Science & Technology.
17: 174A-181A.


Lane, C.R. and M.T. Brown. 2007. Diatoms as indicators of isolated herbaceous wetland
condition in Florida, USA. Ecological Indicators. 7:521-540.

Larsen,  D. P., P. R. Kaufmann, T. M. Kincaid, and N. S. Urquhart. 2004. Detecting persistent
change  in the habitat of salmon-bearing streams in the Pacific Northwest. Canadian Journal of
Fisheries and Aquatic Sciences 61:283-291.

Larsen,  D. P., T. M. Kincaid, S. E. Jacobs, and N.  S. Urquhart. 2001. Designs for evaluating
local and regional scale trends. BioScience 51:1069-1078.

Larsen,  D. P., N. S. Urquhart, and D. L. Kugler. 1995. Regional-scale trend monitoring of
indicators of trophic condition  of lakes.  Water Resources Bulletin 31:117-139.

Mack, J. J., and M. E. Kentula. in review. Metric similarity in vegetation-based wetland
assessment methods. EPA XXX/X-XX/XXX. U.S. Environmental Protection Agency, National
Health and Environmental Effects Laboratory, Western Ecology Division, Corvallis, OR.

Magee,  T. K., and M. E. Kentula. 2005. Response of wetland plant species to hydrologic
conditions. Wetland Ecology and Management 13:163-181.

McCormick, P., and J. Cairns. 1994. Algae as indicators of environmental change. Journal of
Applied  Phycology 6:509-526.

Meglen, R.R. 1985. A quality control protocol for the analytical laboratory. Pp. 250-270 IN: J.J.
Breen and P.E. Robinson (eds). Environmental Applications of Chemometrics. ACS Symposium
Series 292. American Chemical Society, Washington, D.C.

Mitsch, W. J., and J. G. Gosselink. 2007. Wetlands. John Wiley & Sons, Hoboken, NJ.

Moulton II, S.R., J.G. Kennen, R.M. Goldstein, and J.A. Hambrook. 2002. Revised protocols for
sampling algal, invertebrate and fish communities  as part of the National Water-Quality
Assessment Program. United States Geological Survey, Reston, Virginia

Munsell  Color Corporation. 1998. Munsell soil color charts. GretagMacbeth, New Windsor, NY.

NAPA. 2002. Environment.gov. National Academy of Public Administration. ISBN: 1-57744-
083-8. 219 pages.

NRC. 2000. Ecological Indicators for the Nation. National Research Council.

Natural Resources Conservation Service. 2009. Website: http://soils.usda.gov/

Oblinger Childress, C.J., W.T. Foreman, B.F. Connor, and T.J. Maloney. 1999. New reporting
procedures based on long-term method detection levels and some considerations for
interpretations of water-quality data provided by the U.S. Geological Survey National Water
Quality Laboratory. U.S.G.S. Open-File Report 99-193, Reston, Virginia.

Overton, W.S., White, D., and Stevens, D.L. Jr. 1991. Design report for EMAP, the
Environmental Monitoring and Assessment Program. EPA/600/3- 91/053, U.S. Environmental
Protection Agency, Washington, D.C.

Paulsen, S.G., D.P. Larsen, P.R. Kaufmann, T.R. Whittier, J.R. Baker, D. Peck, J. McGue, R.M.
Hughes, D. McMullen, D. Stevens, J.L. Stoddard, J. Lazorchak, W. Kinney, A.R. Selle, and R.
Hjort. 1991. EMAP - surface waters monitoring and research strategy, fiscal year 1991. EPA-
600-3-91-002. U.S. Environmental Protection Agency, Office of Research and Development,
Washington, D.C. and Environmental Research Laboratory, Corvallis, Oregon.

Peet, R.K., T.R. Wentworth, and P.S. White. 1998. A flexible, multipurpose method for recording
vegetation composition and structure. Castanea 63(3):262-274.

Quetier, F., S. Lavorel, W. Thuiller, and I. Davies. 2007. Plant-trait-based modeling assessment
of ecosystem-service sensitivity to land-use change. Ecological Applications 17:2377-2386

Reiss, K.C. and M.T.  Brown. 2005. The Florida Wetland Condition Index (FWCI): Developing
Biological Indicators for Isolated Depressional Forested Wetlands. Florida Department of
Environmental Protection. #WM-683.

Scozzafava, M. E., T. E. Dahl, C.  Faulkner, and M. Price.  2007. Assessing status, trends, and
condition of wetlands  in the United States. National Wetlands Newsletter 29:24-28.

Selle, A.R., D.P. Larsen, S.G. Paulsen. 1991. GIS procedure to create a national lakes frame for
environmental monitoring. In:  Proceedings of the 11th Annual Environmental Systems
Research Institute User Conference; 1991 May 20-24; Palm Springs, CA. Corvallis, OR: U.S.
Environmental Protection Agency, Environmental Research Laboratory.

Stapanian, M.A., F.C. Garner, K.E. Fitzgerald, G.T. Flatman, and J.M. Nocerino. 1993. Finding
suspected causes of measurement error in multivariate environmental data. Journal of
Chemometrics.  7: 165-176.

Stedman, S. and T.E. Dahl. 2009. Status and trends of wetlands in the coastal watersheds of
the Eastern United States 1998 to 2004. National Oceanic and Atmospheric Administration,
National Marine Fisheries Service, and U.S. Department of the Interior, Fish and Wildlife
Service.

Stevens, D.L., Jr. 1994. Implementation of a National Monitoring Program. Journal of
Environmental Management 42:1-29.

Stevens, D.L., Jr. 1997. Variable density grid-based sampling designs for continuous spatial
populations. Environmetrics 8:167-195.

Stevens, D.L., Jr., and A.R. Olsen. 1999. Spatially restricted surveys over time for aquatic
resources. Journal of Agricultural, Biological, and Environmental Statistics 4:415-428.

Stevens, D. L., Jr., and A. R. Olsen. 2003. Variance estimation for spatially balanced samples of
environmental resources.  Environmetrics 14:593-610.

Stevens, D. L., Jr., and A. R. Olsen. 2004. Spatially-balanced sampling of natural resources in
the presence of frame imperfections. Journal of the American Statistical Association 99:262-278.

Strahler, A.N. 1957. Quantitative Analysis of Watershed Geomorphology. Trans. Am. Geophys.
Un. 38:913-920.

Taylor, J. K. 1987. Quality assurance of chemical measurements. Lewis Publishers, Chelsea,
Michigan.

Thien, S. J. 1979. A flow diagram for teaching texture by feel analysis. Journal of Agronomic
Education. 8:54-55.

Tiner, R. W. 1999. Wetland Indicators: A guide to wetland identification, delineation,
classification,  and mapping. Lewis Publishers, Boca Raton, Florida, USA.

U.S. GAO. 2000. Water Quality. GAO/RCED-00-54. U.S. General Accounting Office,
Washington, D.C.

Washington, H.G. 1984. Diversity, biotic, and similarity indices. Water Research 18(6):653-694.

USDA, and APHIS. 2010. How to import foreign soil and how to move soil within the United
States. Q-330.300-1. United States Department of Agriculture and Animal and Plant Health
Inspection Service Plant Protection and Quarantine.

USDA, and NRCS. 2006. Field Indicators of Hydric Soils in the United States, Version 6.0. In G.
W. Hurt and L. M. Vasilas, editors. United States Department of Agriculture, Natural Resources
Conservation Service in cooperation with the National Technical Committee for Hydric Soils,
Lincoln, NE.

USEPA. 2002. Methods for Evaluating Wetland Condition: #10 Using Vegetation to Assess
Environmental Conditions in Wetlands. EPA-822-R-02-020, Office of Water, U.S. Environmental
Protection Agency, Washington, DC.

U.S. EPA. 2002a. Guidance for Quality Assurance Project Plans. EPA/240/R-02/009. U.S.
Environmental Protection Agency, Office of Environmental Information, Washington, D.C.

U.S. EPA. 2003. Draft Report on the Environment. ORD and OEI. EPA-260-R-02-006.

U.S. EPA. 2004. Revised Assessment of Detection and Quantitation Approaches. EPA-821-B-
04-005. U.S. Environmental Protection Agency, Office of Science and Technology, Washington,
D.C.

U.S. EPA. 2006. Guidance on Systematic Planning Using the Data Quality Objectives  Process.
EPA/240/B-06/001. U.S. Environmental Protection Agency, Office of Environmental Information,
Washington, D.C.

U.S. EPA. 2006b. 2006-2011 EPA Strategic Plan: Charting Our Course. EPA-190-R-06-001.
U.S. Environmental Protection Agency, Washington, D.C.

U.S. EPA. 2007. National Rivers and Streams Assessment: Field Operations Manual.  EPA-841-
B-07-009. U.S. Environmental Protection Agency, Washington,  DC.

U.S. EPA. 2011 [a]. National Wetland Condition Assessment: Site Evaluation Guidelines. EPA-
843-R-10-004. U.S. Environmental Protection Agency, Washington, DC.

U.S. EPA. 2011[b]. National Wetland Condition Assessment: Field Operations Manual. EPA-
843-R-10-001. U.S. Environmental Protection Agency, Washington, DC.

U.S. EPA. 2011[c]. National Wetland Condition Assessment: Laboratory Operations Manual.
EPA-843-R-10-002. U.S. Environmental Protection Agency, Washington, DC.

Web Page: http://www.epa.gov/nheerl/arm

  9  APPENDIX A:  NATIONAL WETLAND CONDITION
     ASSESSMENT FIELD EVALUATION AND ASSISTANCE
     SITE VISIT SUMMARY OF FORMS
AV-1   Presampling/General Activities
AV-2   Health and Safety
AV-3   Personnel and AA Establishment
AV-4   Veg Plot Layout/Nomenclature
AV-5   USA RAM Metrics 4-12
AV-6   Buffer and USA RAM Metrics 1-3
AV-7   Veg Characterization
AV-8   Plant Specimen Collection and Handling
AV-9   Water Quality
AV-10 Algae
AV-11 Hydrology
AV-12 Soils
AV-13 Sample Handling and Shipping
AV-14 Post Sampling Activities
AV-15 Assistance Visit Summary
T-5    NWCA Tracking - Batched Samples
      Initial Site Activities
      General Activities

  10 APPENDIX B:  WETLAND SURVEY LABORATORY LIST
Water Chemistry:

Dynamac Corporation
200 S.W. 35th Street
Corvallis, OR  97333

Soils:
NRCS Soil Survey Research and Laboratory
National Soil Survey Center
Natural Resources Conservation Service
Federal Bldg., MS-41
100 Centennial Mall  North
Lincoln, NE 68508

Algae and Vegetation Taxonomy:

EcoAnalysts, Inc.
1420 S. Elaine St., Suite 14
Moscow, ID 83843

Algal Toxin:

U.S. Geological Survey
Kansas Water Science Center
4821 Quail Crest Place
Lawrence, KS 66049

Sediment Enzyme:

National Health & Environmental Effects Laboratory
Mid-Continent Ecology Division
6201 Congdon Blvd.
Duluth, MN 55804-2595

  11 APPENDIX C: INTERLABORATORY TOTAL MICROCYSTIN COMPARISON BY ELISA

 11.1 Introduction:

The US EPA was tasked by Congress to acquire nationally consistent data sets so that the
nation's water quality could be assessed to help guide management and regulation activities.
The National Aquatic Resource Surveys were developed in response, to evaluate the quality of
the nation's waters in partnership with states, tribes, and other federal agencies through
nationally consistent field and laboratory methodology.

This interlaboratory investigation is necessary because more than one laboratory will be
providing microcystin data for the 2011 National Wetland Condition Assessment. The goal of this
investigation is to ensure that the two labs are providing comparable data by analyzing samples
in a manner that is consistent with the NWCA Laboratory Operations Manual and the
manufacturer's instructions for the ELISA kits being used.  The two participating laboratories are
the U.S. Geological Survey's Organic Geochemistry Research Laboratory (OGRL) and the
Wisconsin State Laboratory of Hygiene  (WSLH). This study is defined as an interlaboratory
comparison since the same protocols and method will be used by both laboratories as
described in the 2011 NWCA Laboratory quality assurance project plan (QAPP).

All samples will be lysed by three sequential freeze/thaw cycles and filtered through a 0.7 micron
glass fiber filter. Filtered aliquots will then be analyzed per manufacturer directions by the Abraxis
microcystin/nodularin ADDA enzyme-linked immunosorbent assay (ELISA). The WSLH will be
responsible for measuring all samples collected from Wisconsin.  The OGRL will measure all
other samples. Both laboratories will participate in a blind  interlaboratory comparison for
microcystin measurements.


 11.2 Interlaboratory Comparison Study Design:

   1. All Wisconsin samples will be lysed and filtered at WSLH. WSLH expects a maximum
      of 17 total samples for this study. WSLH will ship a 10 mL aliquot of filtered sample for
      all samples collected to Abraxis, LLC using the US EPA NWCA shipping account.

   2. Abraxis, LLC will randomly select 3 samples to be analyzed as spiked samples. The
      spike will consist of microcystin-LR in a diluent matrix-matched to the calibration
      standards used in the ELISA kit.

   3. Abraxis, LLC. and U.S. EPA will  assign unique sample identification numbers for each
      unspiked and spiked aliquot. Sample identification numbers should be randomized so
      that participating labs will not know the identity of samples.

   4. All standard preparation, dilutions,  and spikes will be conducted by Abraxis, LLC.

   5. Abraxis will split the aliquots in half. One aliquot from each sample (5 mL) will remain
      unspiked and the other aliquot from each sample (5 mL) will be spiked with a
      microcystin-LR standard at a final concentration known only to Abraxis, LLC and US
      EPA Office of Wetlands, Oceans, and Watersheds.

   6.  Samples will be maintained frozen at all times except when sample splitting, spiking,
       or analysis occurs.

   7.  1 mL of each unspiked and spiked sample will be shipped frozen to the OGRL and
       WSLH for analysis. The remaining aliquot will be kept frozen in case issues arise.

   8.  Each laboratory will analyze the unspiked and spiked samples. If a sample
       concentration exceeds 5 ppb, it must be diluted until the final result falls between 0.1
       and 5 ppb. Quantitation will be by a 4-parameter curve fit (see the sketch following
       this list).

   9.  All values, including all raw data and calculated values, will be e-mailed in a
       spreadsheet back to the U.S. EPA NWCA QA Manager (Regina Poeske) and NWCA
       Project Lead (Michael Scozzafava).

   10. Percent difference and recovery are calculated by US EPA, taking into consideration
       any sample dilutions done for the spiked samples. Results should be compared with
       the expected values, and the percent difference between laboratories computed.

   11. Final results will be shared with the laboratories, and any discrepancies will be
       discussed.
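
The sketch below (Python) illustrates the 4-parameter curve fit referenced in step 8 and the
spike-recovery arithmetic in step 10. The calibration points, sample absorbances, and spike
level are invented for illustration; an actual analysis would follow the kit manufacturer's
instructions and software.

    # Hypothetical 4-parameter logistic (4PL) calibration for a competitive
    # ELISA, followed by the spike-recovery arithmetic. All numbers are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(x, a, b, c, d):
        """Absorbance as a function of concentration x (a = upper asymptote)."""
        return d + (a - d) / (1.0 + (x / c) ** b)

    def invert_four_pl(y, a, b, c, d):
        """Concentration producing absorbance y under the fitted curve."""
        return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

    # Invented calibration standards (ppb) and their mean absorbances.
    std_conc = np.array([0.15, 0.4, 1.0, 2.0, 5.0])
    std_abs = np.array([1.10, 0.85, 0.55, 0.35, 0.18])
    params, _ = curve_fit(four_pl, std_conc, std_abs, p0=[1.3, 1.0, 1.0, 0.05],
                          bounds=([0.5, 0.1, 0.01, 0.0], [2.0, 5.0, 10.0, 0.5]))

    spike_ppb = 1.0                                   # invented spike level
    unspiked = invert_four_pl(0.60, *params)          # unspiked aliquot
    spiked = invert_four_pl(0.38, *params)            # spiked aliquot
    recovery_pct = 100.0 * (spiked - unspiked) / spike_ppb
    print(round(unspiked, 2), round(spiked, 2), round(recovery_pct, 1))
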
 11.3 Criteria for Acceptable Comparison:

   1.  Laboratory temperature: 20 - 25 deg. C

   2.  Kit blank (0 ppb) > 0.8 absorbance units

   3.  4-parameter curve used

   4.  Kit controls are within ± 20% of the expected value. At least 1 kit control should be
       analyzed after the calibration standards and before the samples, and one at the end of
       the samples.

   5.  Any diluents needed to get samples onto calibration curve should be analyzed as well.
      Blank diluents should be <  0.1 ppb.

   6.  At least 1 duplicate analysis should be performed for both unspiked and spiked samples.

   7.  All samples should be within ± 20% of the expected value or the average value,
       whichever is appropriate.
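
As an illustration only, the sketch below (Python) screens a hypothetical batch record against
the acceptance criteria listed above; the field names and example values are invented.

    # Hypothetical helper that flags departures from the acceptance criteria
    # above for one analytical batch; structure and values are illustrative.
    def check_batch(batch):
        issues = []
        if not 20.0 <= batch["lab_temp_c"] <= 25.0:
            issues.append("laboratory temperature outside 20-25 deg C")
        if batch["kit_blank_abs"] <= 0.8:
            issues.append("kit blank (0 ppb) not > 0.8 absorbance units")
        for measured, expected in batch["kit_controls"]:
            if abs(measured - expected) > 0.2 * expected:
                issues.append(f"kit control {measured} outside +/- 20% of {expected}")
        for measured, target in batch["sample_checks"]:
            if abs(measured - target) > 0.2 * target:
                issues.append(f"result {measured} outside +/- 20% of {target}")
        return issues

    batch = {"lab_temp_c": 22.5, "kit_blank_abs": 0.95,
             "kit_controls": [(0.78, 0.75), (0.70, 0.75)],
             "sample_checks": [(1.9, 2.0)]}
    print(check_batch(batch) or "batch acceptable")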


 11.4 Corrective Action:

   1.  Discussion of possible reasons for differences outside of acceptable criteria.

   2.  Reanalysis if there is a deviation from acceptable criteria. New aliquots may need to be
      sent from the frozen batches made originally.

   3.  In case of further discrepancies, the kit manufacturer may need to provide an analysis
       of the shipped samples to confirm the aliquot results.
