United States Environmental Protection Agency
Great Lakes National Program Office
EPA xxxxxxxxx
Revision 3, Draft
May 2008

Great Lakes National Program Office
Quality Management Plan

                     Quality Management Plan Approvals

Document Title:            Quality Management Plan for the Great Lakes National
                           Program Office

Organizational Title:      Great Lakes National Program Office

Address:                   77 West Jackson Boulevard
                           Chicago, IL 60604


            Reggie Cheatham                                                 Date
            Director, Office of Environmental Information Quality Staff

            Gary Gulezian                                                   Date
            Director, Great Lakes National Program Office

            David Cowgill                                                   Date
            Chief, Technical Assistance & Analysis Branch

            Louis Blume                                                     Date
            Quality Manager

            Vicki Thomas                                                    Date
            Chief, Policy Coordination & Communications Branch

            Paul Horvatin                                                   Date
            Chief, Monitoring Indicators & Reporting Branch

            Judy Beck                                                       Date
            Regional Lake Team Manager - Lake Michigan

            Dan O'Riordan                                                   Date
            Regional Lake Team Manager - Lake Erie

            Elizabeth LaPlante                                              Date
            Regional Lake Team Manager - Lake Superior

                                      Table of Contents
QUALITY MANAGEMENT PLAN APPROVALS	III

ACKNOWLEDGMENTS	VII

ACRONYMS AND ABBREVIATIONS	IX

SECTION 1  QUALITY MANAGEMENT AND ORGANIZATION	1

   1.1   INTRODUCTION	1
   1.2   QUALITY MANAGEMENT POLICY, GOALS, AND OBJECTIVES	3
   1.3   PROGRAM DESCRIPTION	5
     1.3.1    Mission	6
        1.3.1.1     Accomplishing the Mission	7
        1.3.1.2     Setting Goals to Accomplish the Mission	8
     1.3.2    GLNPO Activities and Programs	8
     1.3.3    Base Monitoring Program	11
   1.4   ORGANIZATION OF THE GREAT LAKES NATIONAL PROGRAM OFFICE	12
     1.4.1    GLNPO Functional Teams	14
     1.4.2    Key Personnel and Associated Responsibilities	16
        1.4.2.1     GLNPO Director	16
        1.4.2.2     Branch Chiefs/Management Advisors	16
        1.4.2.3     Team Leaders	17
        1.4.2.4     Team Members	18
        1.4.2.5     Quality Manager	19
        1.4.2.6     Project Officer	21
        1.4.2.7     Principal Investigator	22
   1.5   WORKING WITH EPA REGIONS AND STATES	22
   1.6   HISTORY OF QUALITY MANAGEMENT AT GLNPO AND ORGANIZATIONAL ACCEPTANCE	23

SECTION 2 QUALITY SYSTEM COMPONENTS	25

   2.1   QUALITY MANAGEMENT PLANS	25
   2.2   PROJECT QUALITY OBJECTIVES AND SYSTEMATIC PLANNING	26
   2.3   QUALITY SYSTEM DOCUMENTATION	27
     2.3.1    GLNPO's Graded Approaches to Quality System Documentation	28
   2.4   STANDARD OPERATING PROCEDURES	28
   2.5   TRAINING	29
   2.6   TRACKING ENVIRONMENTAL INFORMATION COLLECTION ACTIVITIES	30
   2.7   QUALITY MANAGEMENT TEAM MONTHLY REPORTS TO MANAGEMENT	35
   2.8   REVIEWS OF QUALITY SYSTEM IMPLEMENTATION AT THE BRANCH LEVEL	35
   2.9   ANNUAL REPORT AND WORKPLAN	35
   2.10  QUALITY SYSTEM AUDITS AND TECHNICAL SYSTEM AUDITS	36

SECTION 3  PERSONNEL TRAINING AND QUALIFICATIONS	37

   3.1   QUALITY MANAGER TRAINING	37
   3.2   GLNPO PERSONNEL QUALITY SYSTEM TRAINING	37

SECTION 4  PROCUREMENT OF ITEMS AND SERVICES	39

   4.1   PROCUREMENT OF ITEMS	39
   4.2   PROCUREMENT OF SERVICES	40
     4.2.1    Contracts	44
     4.2.2    Assistance Agreements	46
     4.2.3    Special Conditions	48

SECTION 5  DOCUMENT CONTROL AND RECORDS	51
   5.1   MANAGEMENT OF DOCUMENTS AND RECORDS	51
   5.2   VERSION CONTROL	51
   5.3   INFORMATION QUALITY GUIDELINES	52
     5.3.1    Standard Operating Procedures for Pre-dissemination Review	52
     5.3.2    Standard Operating Procedures for Request for Correction	53

SECTION 6  INFORMATION MANAGEMENT	55

   6.1   GLNPO INFORMATION MANAGEMENT SYSTEM	55
   6.2   HARDWARE AND SOFTWARE REQUIREMENTS	57
   6.3   REPORTING STANDARDS	58
     6.3.1    GLNPO Locational Data Policy	58
   6.4   INFORMATION SECURITY	59
SECTION 7  QUALITY PLANNING	61

   7.1   PROJECT PLANNING AND SCOPING	61
   7.2   DOCUMENTATION	63
     7.2.1    GLNPO's Graded Approaches to Quality System Documentation	66
        7.2.1.1     Quality System documentation for programs	67
        7.2.1.2     Quality System Documentation for Projects	71
     7.2.2    QA Annual Report and Workplan	76
   7.3   PEER REVIEW	76
   7.4   HEALTH AND SAFETY ISSUES	77

SECTION 8  QUALITY IMPLEMENTATION OF WORK PROCESSES	79

   8.1   INTRAMURAL ACTIVITIES	79
   8.2   EXTRAMURAL ACTIVITIES	79
   8.3   COMMUNICATION	80
   8.4   DISPUTE RESOLUTION	80

SECTION 9  QUALITY ASSESSMENT AND RESPONSE	81

   9.1   QUALITY SYSTEMS AUDITS AND TECHNICAL SYSTEMS AUDITS	81
   9.2   PERFORMANCE EVALUATIONS	82
   9.3   PROJECT ASSESSMENTS	83
   9.4   ASSESSMENT IMPLEMENTATION	84
   9.5   DATA QUALITY ASSESSMENTS	85
   9.6   ASSESSMENT REPORTING	87
   9.7   RESPONSE ACTIONS	87

SECTION 10  QUALITY IMPROVEMENT	89

   10.1  PROGRAM REVIEW	89
   10.2  PROJECT REVIEWS	90

REFERENCES	91

GLOSSARY	95

Appendices
A. EPA Quality Manual for Environmental Programs (CIO Order 2105) and Policy and Program
   Requirements for the Mandatory Agency-wide Quality System (May 2003) (CIO Order 2105)
B. Great Lakes National Program Office Team Mission Statements
C. Monthly Quality Assurance Status Tracking Sheet
D. Great Lakes National Program Office Data Standard: Quality Assurance/Quality Control Codes
E. Great Lakes National Program Office Information Security Plan
F. Suggested Quality System Documentation Checklist
G. EPA Requirements for Quality Assurance Project Plans (EPA QA/R-5)
H. Project Inventory and Approval Form, Grant Agreement
I.  Project Inventory and Approval Form, Interagency Agreement
J.  Quality Assurance Project Plan and Quality Management Plan Checksheets
K. Examples of GLNPO's Graded Approaches to Quality System Documentation
L. Peer Review Checklists
M. Protocol to Address Missed Requirements in Great Lakes National Program Office Assistance
   Agreements
N. Example of Delinquency Notification Letter
O. Quality System Documentation Status Tracking Sheet
P. Example Audit Checksheet
Q. Audit Finding Response Form
R. Examples of Quality Assurance/Quality Control Analysis Checklists
S. GLNPO Quality System Documentation Review Procedures and Tracking
T. GLNPO Information Quality Products Approval Form

                                   Acknowledgments

       The update and revision of the QMP entailed numerous contributions from Headquarters OEI
Quality Staff, GLNPO employees, GLNPO quality support contract staff of Computer Sciences
Corporation, and numerous individuals in other EPA Programs, Regions, and States.

       Key EPA Quality Staff employees who have been instrumental in this revision are Pat Mundy,
Vincia Holloman, and Gary Johnson; their attention to detail in reviewing the various drafts and their
ongoing quality management review support began with the formal review in 2006.

       Additional thanks go to Reggie Cheatham for his agency-wide quality management leadership;
he has helped change the role of QA managers from an insurance-policy role, in which we were typically
utilized retroactively after the damage was done, to that of a value-added management consultant who is
utilized proactively in key EPA management and technical decisions. We also want to acknowledge the
numerous economies of scale that have been achieved by creating a new Quality Framework that
incorporates many of the related initiatives into an encompassing, inclusive, and logical quality strategy
for the future, one that uses all of the agency's programs to make improved and transparent decisions.
Other leaders on the Quality Staff who were instrumental in this achievement are Katherine Chalfont and
the recently retired Ron Shaffer. The timely and professional coordinating efforts provided by Kim Orr
on monthly conference calls and various workgroups were instrumental in allowing program office QA
managers to discuss common issues and share common tools.

       The role of the Regional QA managers was critical in serving as a voice of experience and
providing realistic approaches for the many changes.  Specific thanks are extended to the Region 5 QA
manager Kevin Bolger for his leadership and for providing continuous peer support to the GLNPO QA
manager.

       The quality support contract team led by Judy Schofield of Computer Sciences Corporation was
outstanding in helping GLNPO organize and review the large volume of project-level quality
information. Judy's leadership in the use of geostatistical tools for quality sampling design and
information reporting resulted in better decisions with significant cost savings.  Key members of her
team who contributed greatly in an unassuming manner include Molly Amos, Ken Miller, Chip McCarty,
Neal Jannelle, and Elizabeth Benjamin, among others.  The work of Rick Barbiero and his prolific use of
GLNPO monitoring data for scientific reporting helped us to constantly remember that good data are
irrelevant unless they are published, disseminated, and debated.

       Numerous GLNPO employees also deserve recognition: David Cowgill for his attention to
detail, and the numerous Stage 3 and 4 GLNPO project officers such as Scott Cieniawski, Marybeth Ross,
Marcia Damato, Jamie Schardt, Ted Smith, Todd Netteshiem, Paul Bertram, Beth Murphy, Tony
Kizlauskas, Michael Russ, Dan O'Riordan, Marvin Palmer, Brenda Jones, and Pranas Pranckevicius. The
administrative support provided by Linda Jacobs and Clemmie Hardy was critical to the day-to-day
operations.  The ORISE intern program, with special mention to Christine McConnaghy and Jacqueline
Adams, was instrumental in numerous phases of program development.

       Other program office folks, such as Marion Kelly, retired Bill Telliard, Bob Shippen, and
Beverly Randolph of the Office of Water, as well as Betsy Grim of OPP and Michael S. Johnson and
Lucinda Taylor of OSWER, were instrumental in providing shared contract infrastructure and expertise.

                          Acronyms and Abbreviations

AOC          Area of Concern
CISSP         Certified Information Systems Security Professional
CMM         Contracts Management Manual
CO           Contracting Officer
CRT          Communications & Reporting Team
CTD          Conductivity/temperature/depth
DCC          Document Control Coordinator
DCN          Document Control Number
DDT          Dichloro-diphenyl-trichloroethane
DEFT         Decision Errors Feasibility Trials
DQA          Data Quality Assessment
DQOs         Data Quality Objectives
EICAA        Environmental Information Collection and Assessment Activity
EMIT         Environmental Monitoring & Indicators Team
EPA          Environmental Protection Agency
EPAAR       EPA Acquisition Regulations
ERST         Ecological Protection Restoration Team
FAR          Federal Acquisition Regulations
FY           Fiscal Year
GIS           Geographical Information Systems
GLBTS        Great Lakes Binational Toxics Strategy
GLENDA      Great Lakes Environmental Monitoring Database
GLNPO       Great Lakes National Program Office
GLPs          Good Laboratory Practices
GLWQA      Great Lakes Water Quality Agreement
GPRA         Government Performance & Results Act
HST          Health & Safety Team
IADN         Integrated Atmospheric Deposition Network
IAG          Interagency Agreement
IJC           International Joint Commission
IMDI          Information Management & Data Integration Team
IQG          Information Quality Guideline
IRM          Information Resources  Management
ISO           International Organization for Standardization
LaMP         Lakewide Management Plans
LMMB        Lake Michigan Mass Balance
LAN          Local Area Network
MAD          Method, Accuracy, and Description
MDL          Method Detection Limit
MQOs         Measurement Quality Objectives
MT           Management Team
OEI           Office of Environmental Information
OMB          Office of Management  and Budget
OWOW       Office of Wetlands, Oceans, and Watersheds
P2            Pollution Prevention Team
PAH          Polycyclic Aromatic Hydrocarbons
PBT          Program  Planning & Budgeting Team
PCB          Polychlorinated Biphenyl
PC           Personal  Computer

PE            Performance Evaluation
PO            Project Officer
PTD           Project Tracking Database
QA            Quality Assurance
QAARWP        Quality Assurance Annual Report and Workplan
QAPP          Quality Assurance Project Plan
QC            Quality Control
QMP           Quality Management Plan
QMT           Quality Management Team
QSA           Quality System Audits
RAP           Remedial Action Plans
RFC           Requests for Correction
R/V           Research Vessel
SART          Sediment Assessment & Remediation Team
SOLEC         State of the Lake Ecosystem Conference
SOP           Standard Operating Procedure
SOW           Statement or Scope of Work
TSA           Technical System Audit
URL           Uniform Resource Locator
WAM           Work Assignment Manager
WQS           Water Quality Survey


                                        Section 1
                      Quality Management and Organization

1.1    INTRODUCTION

       CIO Order 2105, EPA Quality Manual for Environmental Programs, establishes policy and
program requirements for the preparation and implementation of quality management systems.  The order
requires that all Environmental Protection Agency (EPA) organizational units participate in a centrally
managed quality assurance (QA) program.  This Agency-wide quality management system provides the
necessary elements to plan, implement, document, and assess the effectiveness of QA and quality control
(QC) activities applied to environmental programs conducted by or for EPA.  The intent is to develop a
consistent approach to environmental decisions that ensures the collection of supporting data that are
scientifically sound, legally defensible, and of known and documented quality.  The Office of
Environmental Information's (OEI's) Quality Staff is responsible for developing, coordinating, and
directing the implementation of the Agency's QA program.

       To document adherence to CIO Order 2105, EPA requires  each organizational unit to develop a
quality management plan (QMP) per the specifications in Chapter 3 of EPA's Manual CIO Order 2105,
Policy and Program Requirements for the Mandatory Agency-wide Quality System (included as Appendix
A of this QMP).  The QMP is management's statement of the process that will govern the QA  activities
for a given organization. The QMP defines an organization's QA-related policies, areas of application,
roles,  responsibilities and authorities of staff, and the management  and technical practices that  assure that
environmental data used to support decisions are:

       >  of adequate quality and usability for their intended purpose, and
       >  where necessary, legally and scientifically defensible.

       This document defines the Great Lakes National Program Office's (GLNPO's) quality system.
GLNPO is a geographically-focused office, whose mission is to lead and coordinate United States efforts
to protect and restore the Great Lakes.  This Quality Management Plan (QMP) is a management tool that
describes how GLNPO will plan, implement, document, and assess its quality system to support its
mission. The document also communicates the policy and provides guidance on GLNPO's quality
management system to all personnel associated with GLNPO. The document has been approved by
GLNPO management and OEI's Quality Staff.

       Program management is responsible for ensuring that the QMP is implemented. In accordance
with policies and procedures established under CIO Order 2105, Program Office Directors and Senior
Managers shall:

       1.  ensure that all Program components and programs comply fully with the requirements of the
           Order;
       2.  ensure that quality management is an identified activity with associated resources adequate to
           accomplish its program goals and is implemented as prescribed in the organization's approved
           QMP;
       3.  ensure that the environmental data from environmental programs delegated to State, local,
           and Tribal governments are of sufficient quantity and adequate quality for their intended use
           and are used consistent with such intentions;
       4.  ensure that training is available for State, local, and Tribal governments performing
           environmental programs for EPA in the fundamental concepts and practices of quality
           management and QA and QC activities that they may be expected by EPA to perform;
       5.  perform periodic assessments of Regional organizations conducting environmental programs
           to determine the conformance of their mandatory quality system to their approved QMPs and
           the effectiveness of their implementations;
       6.  ensure that deficiencies highlighted in the assessments are appropriately addressed;
       7.  identify QA and QC training needs for all levels of management and staff and provide for this
           training; and
       8.  ensure that performance plans for supervisors, senior managers, and appropriate staff contain
           critical elements that are commensurate with the quality management responsibilities
           assigned by this Order and the organization's QMP.

       This QMP documents GLNPO's quality system to meet these requirements in fulfilling its
mission.  The QMP is organized in the following ten sections:

       >  Section 1 continues with a description of GLNPO's program, mission, organizational
           structure, and roles and responsibilities of GLNPO management and staff;
       >  Section 2 describes the components of GLNPO's quality system, including a description of
           the tools used by GLNPO staff to implement the quality system;
       >  Section 3 provides information regarding personnel qualifications and quality system training
           requirements;
       >  Section 4 discusses GLNPO's process for procuring items and services and ensuring
           suppliers provide items and services that  are of known and documented quality and meet
           associated technical requirements;
       >  Section 5 provides information on the  control and maintenance of documents and records;
       >  Section 6 discusses GLNPO's process for managing information, including a description of
           computer hardware and software administration;
       >  Section 7 discusses GLNPO's process for planning environmental projects, including a step-
           by-step list of questions that project planners can implement during initial stages;
       >  Section 8 discusses GLNPO's implementation of work processes;
       >  Section 9 provides a description of GLNPO's policies and procedures for assessing
           environmental information collection activities and responding to assessment findings; and
       >  Section 10 discusses GLNPO's  ongoing activities towards improving quality throughout the
           program.

       A list of references is included at the end of the document that provides detailed information
regarding EPA OEI reference and guidance documents. A number of appendices also are included in the
QMP to further communicate GLNPO's quality system to OEI and GLNPO staff and to provide
assistance to GLNPO staff and associated agencies and organizations in implementing GLNPO's quality
system. For example, Appendix K provides examples of GLNPO's graded approaches to quality system
documentation including a quality assurance project plan (QAPP) for a project using secondary data,
Great Lakes Sediment Data Support, and Appendix L contains a checklist for determining whether peer
review is needed for a given environmental information collection and assessment activity (EICAA).

       In accordance with the guidance provided in CIO Order 2105, this QMP is a dynamic document
that is subject to change as GLNPO's program progresses. A QMP should represent current activities.
This QMP is reviewed annually by the GLNPO Quality Manager to determine if revision is required. In
addition, as GLNPO's program progresses in accordance with the continuous improvement philosophy
consistent with International Organization for Standardization (ISO) 9001 (ISO 9001:2000, Quality
Management Systems-Requirements, 2000), all changes to procedures described in this QMP will be
reviewed by the GLNPO Quality Manager to determine if the changes significantly impact the quality
objectives of the program. If changes are deemed to be significant, the QMP will be revised accordingly
and distributed to the Great Lakes National Program Manager; GLNPO's Director, Branch Chiefs, and
Document Control Coordinator; and OEI's Quality Staff.

       In accordance with GLNPO's document control procedures, GLNPO's Quality Manager and
Document Control Coordinator will maintain controlled copies of this QMP in blue binders (Section 5).
These controlled copies will be distributed to GLNPO's Director, Branch Chiefs, and OEI's Quality Staff.
GLNPO managers and staff are instructed to locate electronic copies of the QMP on GLNPO's LAN
(G:USER/SHARE/ALL/QA/GLNPO QMP/QMP 02) and to obtain hard copies as controlled copies from
the Document Control Coordinator.

1.2    QUALITY MANAGEMENT POLICY, GOALS, AND OBJECTIVES

       GLNPO's quality management policy focuses on four operating principles: assistance,
flexibility, value-added, and continuous improvement.  GLNPO's quality staff provide assistance to
Project Officers (POs) and planners with quality tools necessary to implement their programs.

       GLNPO focuses on continuous improvement: the integrity of the quality management system
       is maintained when changes to the quality management system are planned and implemented.

       The quality program is flexible, with the policy that all QA policies and requirements should
provide added value to environmental information collection and assessment activities.  GLNPO strives
for continuous improvement by constantly evaluating the system to identify problem areas, potential
issues of concern, and areas for improvement, and then developing and implementing corrective actions
to address them.  The primary goal of GLNPO's quality system is to implement projects that are based
on sound science and provide information of adequate quality to support the underlying decision.
Providing cost-effective environmental programs to the taxpayer also is an overarching goal of the
quality program.

       GLNPO is committed to the protection of the Great Lakes system. To accomplish this, a myriad
of decisions must be made on the quality of the environment and the health of humans and wildlife.
These decisions usually depend on qualitative and quantitative measurements derived from various
EICAAs. Decision makers must be able to use these measurements with some level of confidence in
order to make informed decisions. It is the policy of the Great Lakes National Program Office to
ensure collected information is of adequate quality for the intended use. This overarching quality
management policy is implemented through a series of policies and practices that are described below.

                                   Policies and Practices

Allocation of appropriate resources
       GLNPO management will allocate adequate resources to meet the quality system goals and
       requirements outlined in this QMP for all EICAAs.

Inclusion of quality management in daily activities
       It is GLNPO policy that the quality system must be implemented in daily activities of GLNPO
       staff.  This policy is fostered through training of all GLNPO staff on the quality system
       philosophy, requirements, tools, and reference documents.  In addition, GLNPO's Quality
       Manager is involved in a supporting role at the project level in many GLNPO-funded EICAAs.
       GLNPO policy stresses management's responsibility to create an environment in which all
       personnel contribute to producing high quality products.

Systematic planning
       It is GLNPO policy that quality can be achieved only through systematic planning, assessment,
       and corrective action.  Every program supported by GLNPO funds should have clear objectives
       and a detailed plan to meet these objectives.  This is accomplished by employing a thorough
       systematic planning process at the initiation of any EICAA.

Quality system documentation
       It is GLNPO policy that appropriate quality system documentation (such as QAPPs) will be
       developed for any project which includes EICAAs.

Provision of quality training
       GLNPO's quality policy involves training POs and staff on the quality system requirements,
       available quality implementation tools, and reference and guidance documents to assist them in
       implementing their programs while meeting GLNPO quality goals and complying with GLNPO's
       quality system.
       EPA recognizes that a "one size fits all" approach to quality requirements will not work due to
the variety of projects conducted by EPA and funded entities.  Therefore, the implementation of the EPA
quality system is based on a graded approach.  In accordance with EPA's quality system, GLNPO
employs a graded approach philosophy throughout all quality system activities.  Applying a graded
approach means that quality systems for different organizations and programs vary according to the
specific objectives and needs of the organization.  A graded approach also is applied to quality system
documentation.  The level of effort needed to develop and document a quality system should be based on
the scope of the program and the nature of the decision.  Similarly, the level of detail for quality
documentation of specific projects varies according to the complexity of the work being performed and
the intended use of the data.  Examples of this philosophy are discussed in this QMP.

       GLNPO's Quality System Requirements are commensurate with:
          >  Importance of Work
          >  Availability of resources
          >  Unique needs of organization
          >  Consequences of potential decision errors

       As mentioned above, GLNPO employs a graded approach to documentation.  The Quality
Manager, in conjunction with Project Officers, will determine the appropriate level of quality system
documentation for each project.  In the past, GLNPO has used a four-tiered project category approach
(Simes, 1989) to determine the detail necessary for QAPPs.  These categories have been replaced by the
following categories (an illustrative sketch of how a project's category can inform the documentation
level follows the list):

       >  Existing data/Modeling
       >  Tribal grants
       >  Sediment assessment
       >  Consortium grants
       >  Cluster grants
       >  Volunteer monitoring
       >  State Agencies
       >  Habitat/Ecosystem restoration
       >  Ambient monitoring and research demonstration
       >  Pollution prevention and environmental education
       >  Repeating projects of similar scope
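
       To make the graded approach concrete, the sketch below shows one hypothetical way a
project's category could serve as a starting point when the Quality Manager and Project Officer select a
documentation level.  The category names come from the list above; the documentation tiers, the default
mapping, and the suggest_documentation helper are illustrative assumptions, not GLNPO policy.

    from enum import Enum

    class DocumentationLevel(Enum):
        """Illustrative documentation tiers; actual requirements are set project by project."""
        STREAMLINED_QAPP = "Streamlined QAPP addressing relevant elements"
        FULL_QAPP = "Full QAPP"

    # Hypothetical starting-point mapping from project category to documentation tier.
    # The real determination is made by the Quality Manager in conjunction with the Project Officer.
    CATEGORY_DEFAULTS = {
        "Existing data/Modeling": DocumentationLevel.STREAMLINED_QAPP,
        "Ambient monitoring and research demonstration": DocumentationLevel.FULL_QAPP,
        "Sediment assessment": DocumentationLevel.FULL_QAPP,
        "Repeating projects of similar scope": DocumentationLevel.STREAMLINED_QAPP,
    }

    def suggest_documentation(category: str) -> DocumentationLevel:
        """Return an illustrative default tier; unlisted categories default to a full QAPP."""
        return CATEGORY_DEFAULTS.get(category, DocumentationLevel.FULL_QAPP)

    if __name__ == "__main__":
        for cat in ("Existing data/Modeling", "Tribal grants"):
            print(f"{cat}: {suggest_documentation(cat).value}")

       A lookup of this kind captures only a starting point; factors such as the importance of the work
and the consequences of potential decision errors can move a project to more rigorous documentation.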

       This QMP encompasses environmental information collection and assessment activities for which
GLNPO has lead responsibility, including any contracting and assistance agreements requiring GLNPO
funds.  However, many agencies (Army Corps of Engineers, Department of Energy,  United States
Geological Survey, EPA Regions, etc.) and state and private organizations (The Nature Conservancy, The
International Joint Commission, The Great Lakes Commission, etc.) cooperate with GLNPO on multiple
and recurring projects. In cases where GLNPO is participating in projects which are not the direct
responsibility of GLNPO, GLNPO personnel will adhere to the approved quality systems of the lead
agency, if consistent with CIO Order 2105; otherwise personnel will adhere to GLNPO's quality policy.

1.3   PROGRAM DESCRIPTION

       The Great Lakes National Program Office (GLNPO) was created in 1978 to  fulfill the United
States' obligation under the Great Lakes Water Quality Agreement with Canada. Since inception,
additional responsibilities for GLNPO have been defined in Section 118 of the Clean Water Act, Section
112 of the Clean Air Act Amendments, and the Great Lakes Critical Programs Act of 1990. GLNPO is a
geographically-focused office, whose mission is to lead and coordinate United States efforts to protect
and restore the Great Lakes. GLNPO's responsibilities include:

       >  Overseeing fulfillment of EPA's international commitments under the U.S.-Canada Great
           Lakes Water Quality Agreement
       >  Monitoring lake ecosystem indicators
       >  Managing and providing public access to Great Lakes data
       >  Helping communities address contaminated sediments in their harbors
       >  Supporting local protection and restoration of important habitats
       >  Promoting pollution prevention through activities and projects such as the U.S.-Canada Great
           Lakes Binational Toxics Strategy
       >  Providing assistance for community-based Remedial Action Plans for Areas of Concern and
           for Lakewide Management Plans

       GLNPO assists Great Lakes partners (including federal, state, tribal, local, educational, and
industry organizations) in these areas through technical assistance and coordination,  as well as grants,
interagency agreements, and contracts.

       GLNPO has the primary responsibility for developing policies and coordinating programs
relating to the Great Lakes which are both national and international in scope and effect.  It functions as a
principal liaison with Canadian federal and provincial  governments, the International Joint Commission,
other EPA Regions, the EPA Office of International Activities, and the State Department.  The Office
serves as a focal point of EPA's activities in fulfillment of the Great Lakes Water Quality Agreement.  It
utilizes the coordinated efforts of contractors, grantees, Regional support staff, and other organizations to
generate necessary technical data and reports of findings.

1.3.1  MISSION

       The ultimate mission of the Great Lakes National Program Office is to promote the protection and
restoration of the chemical, physical, and biological integrity of the Great Lakes ecosystem.  GLNPO
supports its mission of Great Lakes protection by working towards the following goals:

Chemical Integrity - Reduce toxic substances in the Great Lakes Basin Ecosystem, with an emphasis on
persistent bioaccumulative substances, so that all organisms are protected.  Over time, these substances
will be virtually eliminated. Maintain an appropriate nutrient balance to ensure ecosystem health.

Physical Integrity - Protect and restore the physical integrity of the Great Lakes, including habitats vital
for the support of healthy and diverse communities of plants, fish, and other aquatic life and wildlife in
the Great Lakes Basin Ecosystem. Protect Great Lakes water as a regional natural resource from
diversions and exports.

Biological Integrity - Protect human and biological health. Restore and maintain stable, diverse, and
self-sustaining populations of fish and other aquatic life, wildlife, and plants in the Great Lakes Basin
Ecosystem, including controlling and eliminating pathogens and preventing the introduction and spread
of invasive species to the maximum extent possible, to protect human health, biological health, and
economic vitality.

       The basin of this international watershed includes two nations, eight U.S. states, a Canadian
       Province, over forty Tribes and First Nations, and many local governments.  Only through a
       cooperative partnership can we ensure the health of the Great Lakes.

       GLNPO, in cooperation with various States, federal agencies, Tribes, and other key partners, has
developed a Strategic Plan to achieve the above three environmental goals.  The Great Lakes Strategy
(Great Lakes 2002, A Plan for the New Millennium, April 2002) provides the agenda for Great Lakes
ecosystem management, reducing toxic substances, protecting and restoring important habitats, and
protecting human/ecosystem species health.  The Strategy was developed cooperatively by the U.S.
Policy Committee, a forum of senior-level representatives from federal, state, and tribal natural
resource management/environmental protection agencies.  The U.S. EPA Great Lakes National Program
Manager (also the Region 5 Administrator) chairs this forum.  The draft Strategy identifies the major
basin-wide environmental issues in the Great Lakes and establishes common goals that the agencies will
work toward.  It also will help fulfill domestic responsibilities described in the U.S.-Canadian Great
Lakes Water Quality Agreement.  This strategy includes a fourth goal:

Working Together - Work together as an environmental community to establish effective programs,
coordinate authorities and resources, report on progress, and hold forums for information exchange and
collective decision-making, so the Great Lakes are protected and the objectives of the Agreement are
achieved.

1.3.1.1    ACCOMPLISHING THE MISSION
       The Great Lakes Strategy identifies the major issues or challenges stakeholders face, establishes
major efforts to address these issues, and describes how stakeholders will work together. GLNPO also
has established five objectives that address issues specific to its role in protecting the Great Lakes
ecosystem.
                Five Objectives Established by GLNPO to Accomplish its Mission

  1. Integrate Monitoring and Data Interpretation - The Great Lakes face a multitude of
     environmental problems with less than adequate funding to address all. Monitoring efforts
     must be integrated in order to eliminate redundancies and develop consistent methods for
     reporting and interpretation.  GLNPO must be able to interpret and disseminate data and
     information to a myriad of data users and environmental managers.
  2. Promote Coordinated Efforts - GLNPO must ensure coordination and communication with
     cooperators such as EPA Regions, other Federal Agencies, other organizations, States, Tribal
     Nations, and Canada.
  3. Enhance Capabilities - By strengthening and fostering working relationships with other
     organizations with similar goals, GLNPO serves to enhance its capabilities to meet its mission.
     This can be accomplished by conducting and coordinating demonstration projects, developing
     guidance documentation, distributing final interpretative reports, conducting technology
     transfer, and participating in international symposia.
  4. Foster Public Understanding - GLNPO will strive to increase public awareness and
     knowledge of the Great Lakes through the implementation of its environmental data
     collection efforts and subsequent interpretative reports, and through implementation of an
     innovative Great Lakes Public Outreach and Education Program, which would include activities
     such as the Cities Tours by the Research Vessel Lake Guardian, Environmental Education
     grants, and publications.
  5. Manage to Reflect Quality - GLNPO's Quality Management Team implements a value-added
     quality management system for sound environmental decisions through collection,
     documentation, assessment, and peer review.

1.3.1.2    SETTING GOALS TO ACCOMPLISH THE MISSION

       At the beginning of each fiscal year, GLNPO staff meet to review and assess progress, identify
goals for the coming year and outline technical activities to meet those goals. These activities typically
include:

       >  Assessment and mitigation of the effects of air pollution and water pollution introduced into
           the Great Lakes through monitoring of water, air, biota, and sediments
       >  Assessments and mitigation of habitat loss or modification
       >  Development of environmental indicators
       >  Assessment and mitigation of invasive species
       >  Binational coordination
       >  Restoration and enhancement of degraded or lost ecological resources
       >  Evaluation of the toxicity and extent of contaminated sediments
       >  Pollution prevention

       GLNPO collects environmental measurements in support of these activities. These measurements
typically include:

       >  Sampling and analysis of organics and inorganics in sediments, water, precipitation, air, and
           wildlife
       >  Abundance and concentration surveys of wildlife
       >  Toxicity testing using bioassays
       >  Testing for physical properties of sediments and water
       >  Mapping and landscape characterization techniques through remote sensing platforms
       >  Meteorological and physical parameters
       >  Surveys regarding utilization of pollution prevention tools
       >  Modeling of secondary data
       >  Surveys of environmental education effectiveness

1.3.2  GLNPO ACTIVITIES AND PROGRAMS

       GLNPO has ongoing monitoring programs, conducts special studies to address new impacts of
concern, and is involved in several large scale cooperative studies including a lakewide pollutant
modeling study, the Lake Michigan Mass Balance Study. Several ongoing activities are summarized
below and GLNPO's base monitoring program is discussed  in detail in the next section.

Great Lakes Strategy - As discussed above, the strategy is being developed cooperatively by the U.S.
Policy Committee, a forum of senior-level representatives from federal, State, and Tribal natural
resource management/environmental protection agencies. The draft strategy identifies the major
basin-wide environmental issues in the Great Lakes and establishes common goals that the agencies will
work toward.

Great Lakes Binational Toxics Strategy (GLBTS) - Signed on April 7, 1997 by the EPA Administrator
and the Canadian Minister of the Environment, the strategy  has attracted a high level of interest both
nationally and internationally.  The strategy targets a suite of key chemicals that have impacted the Great
Lakes for decades, including mercury, polychlorinated biphenyls (PCBs), and dioxin. The GLBTS has
become instrumental in focusing attention on the importance of eliminating persistent, bioaccumulative,
toxic substances from the environment.

State of the Lakes Ecosystem Conference (SOLEC) - The fourth biennial SOLEC, held in October
2001, was attended by over 500 people from a wide variety of government and non-government sectors.
Objectives for SOLEC include to 1) assess the state of the Great Lakes ecosystem based on accepted
indicators; 2) strengthen environmental decision-making and management; 3) inform local
decision-makers of Great Lakes environmental issues; and 4) provide a forum for networking among all
the Great Lakes stakeholders. SOLEC has brought heightened awareness in the Great Lakes community
to several  emerging issues, including habitat loss, urban sprawl, and invasive species.

Environmental Indicator Implementation - To be successfully implemented, the SOLEC indicators
(discussed above) must be supported by: 1) a commitment to each indicator by at least one stakeholder
agency or organization for data collection, analysis, and reporting; 2) coordinated monitoring programs
among the stakeholders for cost-effective data collection; and 3) timely reporting through a binational,
interactive web site. GLNPO and Environment Canada are organizing the SOLEC efforts in all three
areas, but  strong cooperation and support of many other agencies and organizations is essential.

Lake Michigan Mass Balance (LMMB) - One of the  most extensive  studies of a lake ecosystem ever
undertaken, the LMMB study is providing important environmental information regarding toxic loadings,
transport, and bioaccumulation within the food web. The chemicals under study, PCBs, atrazine, mercury
and trans-nonachlor, present a cross-section of important contaminants in the environment. The
mathematical mass balance models, the main products of the  study, will provide state-of-the-art
management scenarios/options for control of toxics in Lake Michigan.  Final project results for several
components of the study are scheduled for 2001 and 2002.

Binational Consortium to Protect Great Lakes Coastal Wetlands - Over the last several years, Great
Lakes coastal wetlands have received increasing attention concerning their quantity and quality. A
consortium of Canadian and United States scientists and resource managers has been convened to monitor
the size and ecological health of Great Lakes coastal wetlands in order to guide their protection and
restoration. The tasks of consortium members over the next two years are  1) to design and validate
indicators to assess the ecological integrity of Great Lakes coastal wetlands; 2) to design an
implementable, long-term program to monitor Great Lakes coastal wetlands; and, 3) to create, and
populate, a binational database accessible to all scientists, decision-makers, and the public.

Invasive Species - In the Great Lakes Basin, over 139  non-indigenous aquatic species have become
established since the 1800s.  Many of the non-indigenous species have been introduced over the last 4
decades as a result of increased shipping and international trade.  Control programs cost millions of
dollars annually, and the species are a threat to the ecological and economic value of the Great Lakes.
GLNPO participates in regional, national and international efforts to prevent introductions and control the
impact of invasive species. Through its grant program, GLNPO is funding programs to investigate
control strategies, as well as basin-wide demonstrations of innovative restoration techniques.

Air Deposition -  GLNPO, working with Environment  Canada, operates the Integrated Atmospheric
Deposition Network (IADN), a network of 16 air monitoring stations (5 in the U.S.).  The objectives of
IADN are to determine concentration trends of priority toxic  chemicals, calculate atmospheric loadings
(amounts deposited to the  lakes), and to supply this information to environmental managers so that
appropriate control actions can be pursued.  Currently IADN monitors for organochlorine pesticides
(including dichloro-diphenyl-trichloroethane [DDT], dieldrin, and chlordane), PCBs, and a suite of
polycyclic aromatic hydrocarbons (PAHs). IADN is considered a model for long-term atmospheric
deposition monitoring.

Contaminated Sediments - GLNPO provides technical, financial, and field support for state and tribal
partners to help solve sediment problems. Recent actions have included remediation efforts involving the
Ottawa River in Toledo, Ohio, and the Fox River in Green Bay, Wisconsin and additional partnership
projects are targeted for the Trenton Channel and White Lake in Michigan, and the Minnesota Slip in
Duluth, Minnesota. GLNPO also is actively involved in demonstrating the use of treatment technologies
as an alternative to landfilling. GLNPO is coordinating efforts with the Army Corps of Engineers to
evaluate options for the beneficial reuse of sediments.

Remedial Action Plan Delisting Guidance - Great Lakes Areas of Concern (AOCs) were identified in
the  1987 Protocol to the Great Lakes Water Quality Agreement. To date, only one of the 43 identified
AOCs has been remediated and formally delisted through the Remedial Action Plan process.  A
workgroup composed of federal and state agency staff, as well as observers from Canada, the Province of
Ontario, and the International Joint Commission (IJC), is developing a guidance document that
describes what needs to be accomplished to achieve formal delisting.

Lakewide Management Plans - Under the Great Lakes Water Quality Agreement (GLWQA) as
amended in 1987, the United States and Canada agreed to develop and implement, in consultation with
state and provincial governments, Lakewide Management Plans (LaMP) for open waters and Remedial
Action Plans (RAP) for Areas of Concern (AOC). The LaMPs are intended to identify the critical
pollutants that affect the beneficial uses of the lake and to develop strategies, recommendations, and
policy options to restore those beneficial uses.  GLNPO Lake Team Managers are  involved in the
development of LaMPs for Lakes Michigan, Erie and Superior.

       In the case of Lake Michigan, the only Great Lake wholly within the borders of the United States,
the  Clean Water Act holds the U.S. EPA accountable for the LaMP. EPA has chosen a collaborative
approach to the implementation of this responsibility, and a partnership of federal, state, tribal, and local
governments in the basin is working with stakeholders in the Lake Michigan Forum to develop and
implement the LaMP.

       The Lake Erie  LaMP process began in 1995 with the publication of the Lake Erie LaMP Concept
Paper (U.S. EPA 1995) which provided a framework for building the LaMP. In keeping with the
direction of the GLWQA, the framework included an emphasis on public involvement. Throughout the
Lake Erie LaMP process and in preparation of LaMP technical reports and documents, the participation
and input of the Lake Erie Binational Public Forum has been promoted and encouraged.

       The Lake Superior LaMP is developed within the Lake  Superior Binational Program. In 1990,
the  fifth biennial report of the IJC to the U.S. and Canadian governments recommended that Lake
Superior be designated as a demonstration area where "no point source discharge of any persistent toxic
substance will be permitted."  In response, on September 30, 1991, the federal governments of Canada
and the U.S., the Province of Ontario, and the States of Michigan, Minnesota, and  Wisconsin announced a
Binational Program to Restore and Protect Lake Superior, also known as the Lake  Superior Binational
Program.

       The LaMP documents serve as the guides to a continuing process of collaborative ecosystem
management and partnership activities aimed at achieving the LaMP goals and restoring the 14 beneficial
use impairments outlined in the GLWQA. LaMPs are to be completed in four stages: (1) when problem
definition has been completed, (2) when the schedule of load reductions has been determined, (3) when
remedial measures are selected, and (4) when monitoring indicates that the contribution of the critical
pollutants to impairments of beneficial uses has been eliminated.

       In May 1999, the Great Lakes States Environmental Directors issued a challenge to the U.S.
Environmental Protection Agency that all LaMP documents were to be completed by Earth Day 2000.  It
is expected that the LaMP process will be an iterative process from 2000 forward and that the LaMPs will
be updated biennially, with the latest scientific and technical information incorporated into the existing
documents. LaMPs for Lake Michigan, Lake Erie, and Lake Superior have common chapters, but differ
in format and amount of detail. GLNPO Lake Team Managers assist in development and implementation
of the LaMPs and continue to develop updates to the LaMP documents to assist in achieving LaMP goals.

1.3.3 BASE MONITORING PROGRAM

       GLNPO has primary responsibility within the U.S. for conducting surveillance monitoring of the
offshore waters of the Great Lakes. The water quality surveys generally consist of two surveys per year:
a spring survey and a summer survey. The spring surveys are designed to collect water quality
information during unstratified (isothermal) conditions of the lake, so the survey circuit is planned to
move from warmest to coolest waters to ensure that sampling at all sites is conducted before stratification
begins. The summer surveys are designed to monitor the quality of each lake during stratified conditions.

       Survey activities are conducted onboard EPA's Research Vessel (R/V) Lake Guardian, a former
offshore oil field supply vessel.  The ship is operated by an onsite ship operations contractor and staffed
by GLNPO Chief Scientists, contractors, and grantees. Most of the survey measurements are made
onboard the ship, either on the bridge or deck (e.g., meteorological measurements such as wind speed and
direction, wave height and direction, air temperature, etc.), by the conductivity/temperature/depth (CTD)
probe attached to a sampling device, or in the onboard laboratories (e.g., turbidity, conductivity, pH, etc.).
The remaining measurements are made by grantee and contractor staff.
       GLNPO attempts to coordinate its spring and summer survey activities with other organizations,
such as EPA Regions 2, 3, and 5, who are involved in Great Lakes research activities.  The goal of this
coordination is to maximize sampling activities aboard the R/V Lake Guardian.  GLNPO offers the
vessel for special projects to States, universities, research institutions, agencies, and other organizations.
In these cases, GLNPO's Quality Management Team reviews quality system documentation, usually a
QAPP or streamlined QAPP that addresses relevant components consistent with the graded approach
policy of GLNPO.

       To support GLNPO's ongoing monitoring programs and other special studies, GLNPO has
developed a quality management system to assure that environmental information used to support
decisions is of adequate quality and usability for its intended purpose.
       GLNPO's quality system tools increase the effectiveness of its monitoring program through:

       >  Documenting and implementing SOPs
       >  Conducting annual training and readiness reviews
       >  Coordinating efforts among researchers, State, and Regional staff
       >  Maintaining, distributing, and implementing a QAPP
       >  Creating real-time electronic data files
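
       As an illustration of the "real-time electronic data files" item above, the following sketch shows
one hypothetical structure for an onboard measurement record that carries a QA/QC qualifier alongside
the reported value.  The field names, the example flag codes, and the Measurement class are illustrative
assumptions; GLNPO's actual qualifier codes are defined in its QA/QC data standard (Appendix D).

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Illustrative QA/QC qualifiers (accepted, estimated, rejected); GLNPO's actual
    # code list is defined in its QA/QC data standard.
    VALID_FLAGS = {"OK", "EST", "REJ"}

    @dataclass
    class Measurement:
        """One survey observation as it might be written to an electronic data file."""
        station: str
        parameter: str
        value: float
        units: str
        qc_flag: str
        collected_utc: datetime

        def __post_init__(self) -> None:
            if self.qc_flag not in VALID_FLAGS:
                raise ValueError(f"Unknown QC flag: {self.qc_flag}")

    def to_record(m: Measurement) -> str:
        """Render a comma-separated line suitable for a simple onboard data file."""
        return ",".join([m.station, m.parameter, f"{m.value:.3f}", m.units,
                         m.qc_flag, m.collected_utc.isoformat()])

    if __name__ == "__main__":
        obs = Measurement("LM27", "turbidity", 1.42, "NTU", "OK",
                          datetime(2008, 4, 15, 13, 30, tzinfo=timezone.utc))
        print(to_record(obs))

       Carrying the qualifier with the value from the moment of collection lets downstream users judge
usability without returning to field notes.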

1.4   ORGANIZATION OF THE GREAT LAKES NATIONAL PROGRAM OFFICE

       GLNPO is organized into nine teams that specialize in specific areas of interest and expertise for
data gathering activities. Each team is comprised of a team leader and staff from GLNPO, and where
appropriate, other organizations within EPA (such as Region 5, Region 2, etc.).  The teams conduct
projects, develop products, and provide support to other teams, making teams interdependent and
providing for efficiencies. A management team oversees the activities of all other teams.  In addition,
three Regional Lake Managers are responsible for implementing Lakewide Management Plans as
described in Section 1.3.2. Figure 1.1 represents the GLNPO organizational structure. As illustrated, the
organization is made up of a Director, Management Advisors, Branch Chiefs, Regional Lake Managers,
Senior Advisor, Teams, and staff. The following sections describe the functions of the teams and various
individuals. The Quality Management Team Leader, the  Quality Manager, reports directly to GLNPO's
Office Director.  GLNPO's Director reports to the Great Lakes National Program Manager, who is also
the Region 5 Administrator as dictated by Section 118 of the Clean Water Act, and thus the GLNPO
Director is part of Region 5 Senior Management. Therefore, administrative services such  as budget
development, training, implementation of contracts and grants, and facilities are provided  through Region
5. The GLNPO management and staff share responsibility for implementation of GLNPO's quality
system.

     Figure 1.1: Great Lakes National Program Office Functional Organizational Structure

     [Organizational chart; recoverable content summarized below.]

     >  Gary Gulezian, Director (Management Team)
     >  Regional Lake Team Managers: Lake Michigan, Judy Beck; Lake Erie, Dan O'Riordan; Lake
        Superior, Elizabeth LaPlante
     >  Vicki Thomas, Branch Chief, Policy Coordination & Communications, and Management Advisor
        (Teams: Pollution Prevention; Communications & Reporting; Program Planning & Budgeting)
     >  Dave Cowgill, Branch Chief, Technical Assistance & Analysis, and Management Advisor
        (Teams: Sediment Assessment & Remediation; Ecological Protection & Restoration; Information
        Management & Data Integration)
     >  Paul Horvatin, Branch Chief, Monitoring Indicators & Reporting, and Management Advisor
        (Teams: Environmental Monitoring & Invasive Species; Health & Safety)
     >  Louis Blume, Quality Manager (Team: Quality Management)
     >  Legend: Management Authority; Quality Management Authority


1.4.1  GLNPO FUNCTIONAL TEAMS

       In establishing teams, the following three premises apply:  (1) all teams will be established to
accomplish tasks identified as Office priorities; (2) each team and its members have responsibility for
grant and contract oversight where support resources are essential to achieving team commitments; and
(3) in general, PO and grant and contract oversight roles are determined through the teams, the team
leader, and the management advisor. These teams are:

       >  Management Team (MT)
       >  Ecological Protection and Restoration Team (ERST)
       >  Sediment Assessment and Remediation Team (SART)
       >  Information Management and Data Integration Team (IMDI)
       >  Environmental Monitoring and Indicators Team (EMIT)
       >  Health and Safety and Environmental Compliance Team (HST)
       >  Communications and Reporting Team (CRT)
       >  Pollution Prevention Team (P2)
       >  Program Planning and Budget Team (PBT)
       >  Quality Management Team (QMT)

Each team has developed a performance agreement (except for the Management Team and Teamlets),
which includes a mission statement. Mission statements for each team are provided in Appendix B. Team
leaders are responsible for developing the performance agreements. Although team leaders suggest and
help coordinate key people and functions for staff, they are not part of management and do not have
supervisory authority. However, as part of the development of performance agreements, tasks are
assigned to key staff, and these tasks are finalized with management approval of the agreements. For this
reason, it is important that team leaders draft performance agreements that are detailed enough to capture
staff assignments. Team performance agreements do not cover all GLNPO activities; examples of
activities not covered include Great Lakes Team activities; developing, publishing, and distributing
Reports to Congress; LaMP activities; and new activities addressing AOCs. Annually, each team submits
a workplan and proposed resources for discussion and approval by GLNPO managers. Team activities for
the year are planned based on these approvals. Two "functional areas" also are active and may become
teams in the future: Invasive Species and Emerging Issues. These functional areas coordinate grants and
contracts and prepare formal and informal reports to management.

       The Quality Management Team supports GLNPO's environmental information collection and
assessment activities by providing the necessary resources and tools to assure the collection of data of
known and appropriate quality.

       The Quality Management Team serves three major functions. First and foremost, the team
provides assistance to GLNPO staff and cooperators to assure that  studies produce data of adequate
quality for the environmental decisions being made. Secondly, the Quality Management Team functions
in a role of independent evaluation and oversight to assure adherence to GLNPO quality policy and
protocols. Lastly, the Quality Management Team assists in developing documents or products, including
policy statements, progress reports, plans, and reports for major data collection activities.

Table 1-1. Major Functions of the Quality Management Team

Provide assistance:
>  Assist in the development of quality system documentation and identification of project quality
   objectives
>  Provide tools (software, guidance documents, technical expertise) for the development of quality
   system products, including QA project plans, sampling designs, field and laboratory audits, data
   quality assessments, and QA reports
>  Provide training on various quality system requirements and concepts, with special emphasis on POs
>  Assist in defining appropriate project-specific quality system documentation
>  Provide assistance and guidance in implementing the National Peer Review Policy
>  Maintain a QA Library
>  Act as a liaison with EPA Quality Staff (monthly conference calls) for policy information
>  Attend national/bi-national QA meetings to keep abreast of QA improvements

Evaluation/Oversight:
>  Review and comment on QA project plans within 10 working days of receipt by the PO
>  Assist in implementing data quality and technical systems audits
>  Conduct data quality assessments
>  Use QATRACK as an evaluation tool for quality system documentation development (Section 2.6)
>  Serve as QA chair on various programs
>  Develop QA reports for major data collection activities
>  Oversee the peer review program

Documentation:
>  Revise GLNPO's QMP every five years or as needed to capture policy changes, submit the QMP to
   Quality Staff, and distribute it to GLNPO staff
>  Write the QA Annual Report and Workplan for EPA Quality Staff and GLNPO staff
>  Prepare monthly Quality Management Team briefing reports
>  Maintain a database of peer review and scientific activities; write a mid-year and annual report
>  Coordinate development of SOPs for ongoing monitoring projects
>  Maintain a database for key QA project-level documents
>  Assist in the development of QA reports for efforts supporting key environmental decisions


1.4.2  KEY PERSONNEL AND ASSOCIATED RESPONSIBILITIES

1.4.2.1    GLNPO DIRECTOR

        The Director has overall responsibility for managing GLNPO according to Agency policy and has
final authority at the program office level. As noted in CIO Order 2105, the direct responsibility for
assuring data quality rests with line management. Ultimately, the Director is responsible for establishing
QA policy and resolving QA issues which are identified through the Quality Management Team and
Quality Staff.

       GLNPO's Director fosters a seamless quality management policy by integrating quality activities
       into daily operations.
Table 1-2. Major Responsibilities of the Director

Management:
>  Serves as Management Team Leader
>  Approves the budget and planning processes
>  Ensures quality management policy is discussed with GLNPO management and is addressed in
   individual and team performance agreements

Quality policy establishment:
>  Ensures that GLNPO develops and maintains a current QMP and ensures adherence to the document
   by GLNPO staff, other EPA offices, and extramural cooperators funded by GLNPO
>  Establishes policies to ensure that quality management requirements are incorporated in all
   environmental information collection activities
>  Ensures that appropriate quality system documentation is developed for all data collection activities
   in which GLNPO is the project leader and submitted to the QA Manager for review and approval
   prior to project initiation
>  Encourages an atmosphere where quality management practices are a beneficial and integral part of
   GLNPO staff daily activities
>  Recognizes and awards exemplary quality implementation

QA issue resolution:
>  Maintains an active line of communication with the Quality Manager
>  Facilitates corrective action that may be required by the Quality Manager's findings
>  Ensures that the protocol to address missed requirements in GLNPO assistance agreements
   (Appendix M) is implemented

        The Director delegates partial responsibility for quality management system development and
implementation, in accordance with Agency policy, to the Branch Chiefs, Team Leaders, and Project
Officers. Oversight of the GLNPO quality management program is delegated to the Quality Manager.

1.4.2.2    BRANCH CHIEFS/MANAGEMENT ADVISORS

        Branch Chiefs and Management Advisors oversee and support the activities of the various
GLNPO teams. These individuals serve a dual role: that of supervisor to designated GLNPO staff or
Senior Advisors, and that of Management Advisor to designated teams. Because each advisor has
specific skills, education, and work experience, advisors are a resource to all teams.  The Branch Chief is
the delegated manager responsible for data collection and quality management activities of projects
occurring within their respective branch.  GLNPO's director is the management advisor for the Quality
Management Team.

Table 1-3. Major Responsibilities of Branch Chiefs and Management Advisors

Supervision/Management:
>  Supports the team members and assists in obtaining necessary resources
>  Helps select the team leaders and team members
>  Advocates the team cause and works to overcome barriers
>  Ensures that the protocol to address missed requirements in GLNPO assistance agreements
   (Appendix M) is implemented

Management of data collection and quality management activities:
>  Ensures that appropriate QA criteria for all projects and tasks are included in operating guidance for
   all teams for which the Senior Advisor is the Management Advisor
>  Ensures establishment of data quality acceptance criteria for all projects and tasks conducted by the
   branch
>  Ensures that an adequate degree of auditing is performed to determine compliance with quality
   management requirements
>  Ensures that deficiencies highlighted in audits are appropriately addressed
>  Develops quality management-related infrastructure and communications channels
>  Identifies project-specific quality training needs and provides for required quality management
   training
>  Evaluates QA/QC costs
>  Reviews and evaluates the quality of outputs generated by each project
>  Informs the GLNPO Quality Manager of all data collection activities occurring within his/her
   respective branches
>  Ensures that all POs understand their quality management responsibilities and that quality is
   addressed in the position descriptions of all subordinates involved in data collection activities
>  Ensures that quality management is an identifiable activity with associated resources adequate to
   accomplish program goals in the development and execution of all projects and tasks, both intramural
   and extramural, involving environmentally related measurements
>  Ensures that all projects and tasks involving environmentally related measurements are covered by
   appropriate quality system documentation and that the system is implemented

1.4.2.3     TEAM LEADERS

        The Team Leader is the person who manages the team, including orchestrating all team activities,
calling and facilitating meetings, handling administrative details, and overseeing preparations for reports
and presentations. The Team Leader is responsible for preparing Team performance agreements.
Ultimately, it is the Team Leader's responsibility to identify team tasks and, through the Supervisor, to
assign tasks and provide the means to enable team members to do their work. The Team Leader is the
contact point for communication between the team and the rest of the organization, including the
management team. The Team Leader position will be periodically reviewed and changes in leaders can
occur based on recommendations from the current leader, the team, or the Management Team.


Table 1-4. Role of the Team Leader

Team management:
>  Focuses the energies of the team on defining and accomplishing desired outcomes on projects as
   directed by the Management Team to accomplish an Office priority
>  Strengthens the team and its processes by seeing that all matters that involve and affect the team are
   dealt with by the group, while avoiding items or tasks that do not concern the group (these items are
   handled by appropriate subgroups or individuals who give feedback on progress and results to the
   whole team)
>  Shares information on quality system documentation delinquencies (except for the Safety Team)
>  Prepares monthly report and annual workplan (except for the Safety Team)
>  Considers quality system implementation in workplan and team discussions
>  Suggests POs for team projects

Communication contact point:
>  Helps the team use efficient communication processes that provide better information, more technical
   knowledge, more facts, and more experience for decision-making purposes
>  Routinely updates and briefs the Management Advisor and the Management Team on the status of
   projects, schedules, anticipated roadblocks with proposed solutions, budget projections, etc.

Team member encouragement and support:
>  Uses group decision making at every appropriate opportunity to earn team members' support for the
   final product or decision, thus gaining commitment to execute it fully
>  Knows that at times decisions must be made rapidly and cannot wait for group processes; therefore,
   anticipates these emergencies and establishes procedures with the team for handling them so that
   action can be taken rapidly with group support
>  Takes primary responsibility for establishing and maintaining a thoroughly supportive atmosphere
   throughout the team; encourages every member to participate
>  Encourages discovery of alternatives and solutions by protecting team members and their ideas from
   attack and criticism, so team members feel secure in sharing and exploring a multitude of proposals,
   ideas, thoughts, and opinions

1.4.2.4     TEAM MEMBERS

        Although one person in the team will have the formal responsibility of being the Team Leader,
each team member shares in the responsibility to accomplish the goals of the team. Whenever a team
comes together to trade information, develop strategies, solve problems, or make decisions, every member
of the team must share the responsibility for making the meeting as successful as possible.  In addition, all
GLNPO team members and GLNPO staff are responsible for implementing GLNPO's quality system.

Table 1-5. Responsibilities of Team Members

Accomplishing the goals and purpose of the team:
>  Team members will consider their participation a priority responsibility, not an intrusion on their real
   jobs
>  Team members are responsible for contributing as fully as possible to the project, sharing their
   knowledge and expertise, and participating in all team meetings and discussions, even on topics that
   may be outside their area
>  Team members will carry out their assignments between meetings

An individual may be needed as a team member if they:

        >  possess critical information, knowledge, or expertise pertinent to the subject or project
           under consideration;
        >  have a stake in the final outcome. That is, this individual will be directly impacted by what is
           decided, and his/her commitment is required for successful implementation;
        >  have responsibility for implementing the project;
        >  possess contrary viewpoints that will stimulate discussion, produce critical thinking, and
            move the group forward in its thought process; and
        >  need to be a team member because of his or her position in the management structure of
           the Office.

1.4.2.5   QUALITY MANAGER

        The Quality Manager is the delegated manager of the GLNPO quality management program.  The
main responsibilities of the Quality Manager are overseeing all quality management tasks and ensuring
that all personnel understand GLNPO's quality management policy and requirements as well as their
specific quality management responsibilities. The Quality Manager provides technical
support to plan, implement, document, and assess the effectiveness of QA and QC activities associated
with GLNPO-funded EICAAs.  To facilitate this role, the Quality Manager reviews and approves all
quality management products.


Table 1-6. Responsibilities of the Quality Manager

Oversight and management:
>  Interprets Agency quality policy and develops the QA policy for GLNPO in accordance with Agency
   quality management policies and direction from management
>  As a quality management advisor, reviews all acquisition packages (grants, cooperative agreements,
   inter-agency agreements) to determine the necessary QA requirements (the Quality Manager's
   approval signature is required on all grant acquisition packages to evaluate whether environmental
   data are collected or used)
>  Develops quality management budgets
>  Ensures that all laboratory, field, or office personnel involved in environmental information collection
   have access to any training or QA information needed to be knowledgeable in QA requirements and
   protocols
>  Ensures that audits/reviews are accomplished to assure adherence to approved quality system
   documentation and to identify deficiencies in QA/QC systems
>  Ensures that adequate follow-through actions are implemented in response to audit/review findings
>  Tracks the status of all quality system documentation
>  Identifies problems and advises the Director and Management Team on required management-level
   corrective actions
>  Serves as the program's liaison with Quality Staff
>  Implements the peer review component of the quality system

Provision of technical support:
>  Assists staff scientists and project managers in developing quality system documentation and in
   providing answers to technical questions
>  Ensures that all environmental information collection activities are covered by appropriate quality
   system documentation (e.g., QAPPs)
>  Ensures that sampling and analytical methods for routine operations are well-documented through
   Standard Operating Procedures (SOPs)
>  Assists in determining, for each project, the need for, type, and frequency of performance evaluation
   and reference samples
>  Assists in solving QA-related problems at the lowest possible organizational level

Development, review, and approval of QA products:
>  Develops a QMP and revises it as necessary
>  Develops a QA Annual Report and Workplan for the GLNPO Director and the Agency's Quality Staff


1.4.2.6    PROJECT OFFICER

        The PO has ultimate responsibility for ensuring GLNPO's quality policy is implemented for all
EICAAs under their primary direction. The PO, in consultation with the Quality Management Team,
determines the quality system requirements and criteria for scheduled projects based on the intended use
of the data. The PO has the responsibility for ensuring that these quality system activities are
communicated in the project-specific quality system documentation (e.g., QAPPs). The PO is the key
Agency spokesman for the grantee regarding implementation of the quality system. The PO is the
principal technical contact for the Agency regarding extramural investments. In general, the PO is the
only person who can evaluate the technical content of requested products and accept or reject these
products.

        GLNPO's Quality Management Team provides "tailor-made" support to assist the Project Officer
in implementing appropriate quality system activities.

Table 1-7. Responsibilities of the Project Officer
Coordination of specific project(s) and determination of QA criteria:
>  Develops or assists in the development of project quality objectives
>  Ensures the submission, review, and approval of appropriate quality system documentation (e.g.,
   QAPP) prior to information collection
>  Ensures the implementation of approved quality system documentation for the project
>  Ensures that standard operating procedures (SOPs) for each data collection operation are reviewed
   and approved
>  Implements systematic planning for all environmental information collection activities
>  Reviews project products for adherence to QA goals
>  Adheres to Agency peer review policy
>  Alerts management and the Quality Management Team when a grantee or contractor is not producing
   acceptable deliverables
>  Arranges for performance evaluation samples or reference samples (when applicable) in consultation
   with the Quality Manager
>  Arranges and conducts audits
>  Ensures that required corrective actions are implemented
>  Reports data quality problems to the Quality Manager
>  Reviews the quality system documentation with project staff and, for extramural projects, with the
   extramural organization's QA representative and principal investigator
>  Ensures that the protocol to address missed requirements in GLNPO assistance agreements
   (Appendix M) is implemented

        In some cases, GLNPO will fund programs in other EPA Regions and a Project Officer will be
identified from Region staff.  This individual will be responsible for the quality management activities
listed above as required in the Region's QMP.  However, the GLNPO Quality Manager and other staff
will have an opportunity to review quality management material as defined in Section 1.2.


1.4.2.7   PRINCIPAL INVESTIGATOR

        The principal investigator (both intramural and extramural) is responsible for adhering to
guidance and protocol specified in the quality system documentation when carrying out tasks for
GLNPO-funded EICAAs.

Table 1-8. Responsibilities of the Principal Investigator

Adherence to quality system documentation:
>  Prepares quality system documentation, in accordance with EPA Order 2105 and grant specifications,
   and submits it to the GLNPO PO for review and approval prior to information collection activities
>  Negotiates data quality requirements with the PO and appropriate QA representatives
>  Trains staff in the requirements of the quality system documentation and in the evaluation of QC
   measurements
>  Develops SOPs and implements good laboratory practices
>  Verifies that all required quality management activities were performed and that measurement quality
   standards were met as required in the quality system documentation
>  Follows all manufacturer's specifications for utilized instrumentation
>  Performs and documents preventative maintenance
>  Documents deviations from established procedures and methods
>  Reports all problems and corrective actions to the PO
>  Prepares quarterly reports and the final project report in addition to other deliverable requirements in
   the grant/contract

1.5    WORKING WITH EPA REGIONS AND STATES

        In implementing EICAAs to achieve its mission, GLNPO works cooperatively with EPA Regions
2, 3, and 5, as well as eight Great Lakes States (Pennsylvania, New York, Ohio, Indiana, Michigan,
Illinois, Minnesota, and Wisconsin). The Regions, as well as some of the Great Lakes States, have
developed and implemented quality systems specific to their EICAAs. Implementation of the quality
systems may not be similar; however, comparability of data collected from each organization is of prime
importance. Before cooperative projects are implemented, Regions, States, and programs participating in
the project will have an opportunity to review quality system documentation. The PO has the
responsibility of reviewing the quality system documentation from a technical perspective.  GLNPO will
distribute this QMP to Regions with which it frequently cooperates for review and comment.  GLNPO
also will review quality system documentation from the respective Regions, States, and other
organizations to gain an understanding  of their quality policy requirements.

        GLNPO will work with States to develop and implement an approved quality system through
review and approval of a quality management plan.  If a State has a GLNPO-approved quality system,
then GLNPO will assist the State with review of quality system documentation but will defer to the State
for approval of the documentation.  To  date, Wisconsin is the only state to develop a GLNPO-approved
QMP; however, additional states are in the process of developing QMPs.

        GLNPO will oversee projects delegated to states or regions to implement successful EICAAs
including reviewing  quality system  documentation, providing technical assistance for project design and
implementation, and providing quality system training to state and regional personnel. The degree of
GLNPO oversight will be dependent on the level of effort associated with GLNPO-funded activities, the
magnitude of the project, and the importance of the environmental decision that the project is supporting.
All project planning documents are subject to GLNPO review and approval. GLNPO also will participate
in Headquarters' management system reviews of regional offices and regional management system
reviews of States. Refer to Sections 7.2.1.1 and 7.2.1.2 for specific information regarding GLNPO
oversight and quality system requirements for states and regions working with GLNPO.

1.6    HISTORY OF QUALITY MANAGEMENT AT GLNPO AND ORGANIZATIONAL
       ACCEPTANCE

        GLNPO initiated an independent quality management program in 1992; previously, it operated
under the Region 5 QA Program. Since 1992, GLNPO's program has matured and now governs a wide
variety of environmental information collection activities. Currently, for all base monitoring program
activities that derive key environmental data for the Government Performance and Results Act (GPRA),
GLNPO has established sampling and analytical protocols that are documented in a retrievable and
publicly available format. GLNPO also has developed a database that includes a common data dictionary
and sufficient management controls to provide for data collection, storage, and reporting. In aggregate,
GLNPO's quality system provides for quality data collection supporting decisions based on sound
science.

        GLNPO's efforts over the next five years will emphasize maintaining the current program, as
documented in this QMP, as well as institutionalizing the use of our data systems and standards. Further,
GLNPO hopes to work with its many external stakeholders in implementing the new Agency QA Order,
which is consistent with International Standard 9000 (2000) and provides for all of our partners to
implement consistent quality management systems.


                                        Section 2
                             Quality System Components

       GLNPO must implement a quality management program that provides the management and
technical practices to ensure that environmental information used to support Agency decisions is of
adequate quality and usability for its intended purpose. GLNPO uses a wide variety of quality
management practices and tools to implement its quality system, including:

       >   quality management plans,
       >   project quality objectives and systematic planning,
       >   quality system documentation,
       >   standard operating procedures,
       >   training,
       >   tracking EICAAs,
       >   quality management team monthly reports to management,
       >   Director review of quality system implementation at the branch level, and
       >   GLNPO's annual report and workplan.

2.1    QUALITY MANAGEMENT PLANS

       This QMP serves to document GLNPO's quality system and also to communicate the quality
system to all GLNPO staff. The QMP is developed for use by all GLNPO staff, as detailed in Section 1.
This QMP is approved by the GLNPO Director, Branch Chiefs, Team Leaders, and a Quality Staff
representative. Staff also will be encouraged to use the QMP as a reference in support of EICAAs.

        It is GLNPO policy that before any funding of contracts or agreements containing EICAAs is
initiated, appropriate quality system documentation must be submitted. Contracts involving EICAAs will
include requirements for the provision of a quality management plan and quality assurance project plans,
or other appropriate quality system documentation.  Per EPA CIO Order 2105, environmental data are
any measurements or information that describe environmental processes or conditions, or the performance
of environmental technology. For EPA, environmental data include information collected directly from
measurements, produced from models, and compiled from other sources such as data bases or the
literature.  Environmental technology includes treatment systems, pollution control systems and devices,
and waste remediation and storage methods. Environmental information collection and assessment
activities represent any activities that involve collection, assessment, or use of environmental data.  All
applicants for grants or cooperative agreements involving environmental programs shall submit quality
system documentation that describes the quality system implemented by the applicant; this documentation
may be in the form of a quality management plan or equivalent documentation. In some cases, a QAPP with several
paragraphs describing quality system issues will be considered equivalent to a QMP.  For these cases, the
document should include discussions of management and oversight, approval of subcontracted
components of the projects, and independent quality management reviews. Quality system
documentation for contracts is discussed in Section 4.2.1 and quality system documentation for assistance
agreements is discussed in Section 4.2.2.  Further, quality system documentation for programs is
discussed in Section 7.2.1.1 and quality system documentation for projects is discussed in Section 7.2.1.2.

2.2    PROJECT QUALITY OBJECTIVES AND SYSTEMATIC PLANNING

        A crucial component of GLNPO's quality system is up-front systematic planning. Although
projects vary greatly in scope and importance, each should be started in essentially the same way: by
determining the level of quality required and by planning accordingly.  Consistent with GLNPO's graded
approach, the level of quality required will be determined by evaluating the importance of the activity,
available resources, the unique needs of the organization, and the consequences of potential decision
errors. A systematic planning process is used to facilitate the planning of data collection activities. It
asks the data user to focus their planning efforts by specifying: 1) the use of the data (the decision), 2) the
decision criteria, and  3) an acceptable probability threshold for making an incorrect decision based on the
data.  The process should:

        >  establish  a common language to be shared by decision makers, technical personnel, and
           statisticians in their discussion of program objectives and data quality;
        >  provide a mechanism to pare down a  multitude of objectives  into major critical questions;
        >  facilitate  the development of clear statements of program objectives and constraints which
           will optimize data collection plans; and
        >  provide a logical structure within which an iterative process of guidance, design, and
           feedback may be accomplished efficiently and cost effectively.

        Systematic planning must be a normal part of the project planning process and must be
accomplished based on cost-effectiveness and realistic capabilities of the  measurement process. A
detailed step-by-step process for planning is included in Section 7 that can be used to assist POs and
planners with planning effective EICAAs and complying with GLNPO's systematic planning
requirements.

        One approach that can be used for systematic planning is the Data Quality Objective process as
described in EPA's document Guidance on Systematic Planning using the Data Quality Objectives
Process, EPA QA/G-4, February 2006. Generally, the data quality objectives (DQOs) are statements of
the overall maximum uncertainty  associated with  the measurement system and the population that the
data users are willing to accept in the results derived from the EICAA. It  is the responsibility of the
GLNPO PO to define this allowable uncertainty and develop DQOs with the principal investigators and
cooperators. When a formal DQO document is required, it will be reviewed and approved by the Quality
Manager.  Training software on the DQO process  also can be acquired from Quality Staff: Decision
Errors Feasibility Trials (DEFT)  is an updated version of earlier software that assists in the
implementation of the DQO process.
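
       The decision-error arithmetic behind the DQO process can be illustrated with a short calculation.
The sketch below is not part of GLNPO's documented procedures or of the DEFT software; it assumes the
common case of a one-sample test of a mean with a known standard deviation, and the function name and
default error rates are illustrative assumptions only.

    import math
    from statistics import NormalDist

    def dqo_sample_size(sigma, delta, alpha=0.05, beta=0.20):
        """Illustrative sample-size estimate for a one-sample test of a mean.

        sigma : expected standard deviation of the measurements
        delta : width of the gray region (smallest difference that matters)
        alpha : tolerable probability of a false rejection decision error
        beta  : tolerable probability of a false acceptance decision error
        """
        z_alpha = NormalDist().inv_cdf(1 - alpha)  # normal quantile for alpha
        z_beta = NormalDist().inv_cdf(1 - beta)    # normal quantile for beta
        n = ((z_alpha + z_beta) * sigma / delta) ** 2
        return math.ceil(n)  # round up to a whole number of samples

    # Example: a standard deviation of 2.0 ug/L, a gray region of 1.0 ug/L, and
    # tolerable decision error rates of 5% and 20% imply roughly 25 samples.
    print(dqo_sample_size(sigma=2.0, delta=1.0))

Tightening either decision error rate or narrowing the gray region increases the required number of
samples, which is the cost/quality trade-off the systematic planning process asks data users to confront
explicitly.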

        A formal systematic planning document must be prepared for EICAAs that meet any of the
following criteria:

        >  EICAAs  in support of EPA regulations or enforcement
        >  Long-term monitoring programs at numerous sites throughout the Great Lakes and
           surrounding basin
        >  EICAAs that provide a basis for significant environmental decisions (relative to the
           importance of the decision)
        >  EICAAs  longer than two years
        >  Any EICAA for which the Director deems necessary


       Other projects that do not meet these criteria will require less formal documentation and should
include the following information:

       >  definition of the objectives of the study and why environmental data are needed;
       >  definition of the quality of the data needed in order to meet the objectives (acceptable
           uncertainty);
        >  time and resource constraints of the project and how they affect data quality;
       >  identification of the possible errors that may arise during the data collection process; and
       >  the calculations, statistical or otherwise, that will be performed on the data in order to arrive
           at a result.

       A systematic planning process assists the user in defining the purpose  for the EICAA and sets the
framework for the design, implementation, and quality management of the project.  Once project quality
objectives are defined, a quality management program can be developed.  Quality system documentation
detailing this quality management program is then created which describes all  the activities specifically
designed for controlling and evaluating the data in order to satisfy the project objectives.  A detailed
description of GLNPO's systematic planning process is presented in Section 7.1.

2.3   QUALITY SYSTEM DOCUMENTATION

       The  EPA quality policy requires every EICAA to have written and approved quality system
documentation (e.g., QAPPs) prior to the start of the EICAA. This policy and the information included in
this section applies equally to intramural and extramural quality system documentation. The purpose of
the documentation is to specify the policies,  organization, objectives, and the quality assurance activities
needed to achieve the project objectives of an EICAA. It is the responsibility of the PO to adhere to this
policy. GLNPO employs a checklist that can be used by the PO and the Quality Manager to determine if
formal quality system documentation is necessary for a given project (Appendix F). If the PO proceeds
without approved quality system documentation, the PO does so fully aware of the risks and assumes all
responsibility. This risk should only be taken in extreme emergencies. The PO also bears the
responsibility of providing copies of the approved quality system documentation to each individual  who
has a major responsibility in the EICAA and explaining the elements of the quality system documentation
to these individuals.

       If a QAPP is deemed to be  required  by the PO and Quality Manager, QAPPs are prepared,
reviewed, and approved in accordance with EPA QA/R-5, EPA Requirements for Quality Assurance
Project Plans (Appendix G).  This  document identifies and defines the 24 elements that must be
addressed in all formal QAPPs. For some EICAAs, only a subset of the 24 elements may be applicable,
and in accordance with GLNPO's graded approaches, GLNPO requires only that the applicable elements
be addressed in quality system documentation. These graded approaches are further discussed in Section
2.3.1.

       Review of the quality system documentation must include principal investigators, the PO, and the
Quality Manager.  It is recommended that the document be reviewed by the PO before submission to the
Quality Manager.  The Quality Manager will review quality system documentation for the required
elements, the soundness of the quality assurance  activities,  and compliance with GLNPO's quality
system. The Quality Manager will provide written comments on each element, which will be
accompanied by a QAPP or QMP checksheet (Appendix J). The checksheet is a summary that alerts the
PO as to whether or not QA requirements have been adequately described. The Quality Manager will
attempt to review quality system documentation within 10 working days of submission.


       All quality system documentation should be filed with the GLNPO Document Control
Coordinator (DCC), who will identify the document with a unique document control number (see section
5).  At present, this will be accomplished manually; however, an automated system is planned.  All
original copies of the quality system documentation will be secured by the DCC. The Quality Manager
will maintain a hard- and soft-copy of the QM review for the quality management files.  Tracking of
quality system documentation will be accomplished by the GLNPO Quality Manager utilizing
"QATRACK," a database system described in Section 2.6.

2.3.1  GLNPO's GRADED APPROACHES TO QUALITY SYSTEM DOCUMENTATION

       All EICAAs conducted by GLNPO staff (including Federal or private employees retained for
GLNPO services and located at the GLNPO offices) must be covered by appropriate quality system
documentation prior to the start of the EICAA. In the past, GLNPO has used a four-tiered project
category approach to its quality management program in  order to effectively focus quality management
activities.  This approach was developed by the U.S. EPA, Risk Reduction Engineering Laboratory,
Cincinnati, Ohio (EPA/600/9-89/087). As stated in this year's Quality Assurance Annual Report and
Workplan (QAARWP), GLNPO is no longer using these categories for development and review of
QAPPs. However, for historical clarification, GLNPO's  base monitoring program falls under Category 2
and a QAPP meeting these requirements is being implemented in support of that program. Guidelines for
each of these categories can be found in EPA document EPA/600/9-89/087.

       Every expenditure that GLNPO makes towards EICAAs and every intramural project that collects
environmental information driven by environmental decisions should have some documentation
commensurate with the importance of the question that is being  addressed. GLNPO's graded approaches
for quality system documentation for secondary data, modeling, consortium grants, habitat/ecosystem
restoration, and sediment assessment are discussed in Section 7.2.1 and examples are included in
Appendix K.

2.4   STANDARD OPERATING PROCEDURES

       Good laboratory practices and good management of field sampling operations include the
development and use of SOPs for all routinely used sampling, preparation and analytical laboratory
methods, and the housekeeping that supports them. SOPs facilitate comparability of data generated at
different times, or by different field or laboratory staff. These protocols should be detailed enough so that
someone else can reproduce results using the SOP (for example, a journal article is usually not sufficient).

       GLNPO will use SOPs to reduce variability in processes that are performed repeatedly by
multiple staff. During the planning phase of any EICAA, GLNPO will identify tasks that will be
performed routinely or by multiple staff, and will develop procedures for performing these tasks. SOPs
are written by the individuals performing the procedure and are  reviewed by one or more individuals with
appropriate training and experience with the process. The SOPs not only serve to ensure that routine
tasks are performed correctly and consistently, but also provide  the basis for staff training programs. The
GLNPO Quality Manager will review and audit staff conformance to SOPs and make recommendations
for updating these procedures annually. GLNPO POs or  their designees are responsible for approving
SOPs used for GLNPO EICAAs.  The POs in conjunction with the Quality Manager will work with the
principal investigators and quality staff from their organization to identify the procedures that would
benefit from the use of SOPs. This approach for developing and approving SOPs for use in new and
ongoing projects provides GLNPO with the following benefits:

       >  Consistency in performance, particularly in conducting data review and validation tasks
        >  Improved data comparability, credibility, and defensibility
        >  Reduced errors
        >  Increased efficiency in performing tasks, thus lowering costs

        In 2000, GLNPO developed a comprehensive manual titled Sampling and Analytical Procedures
for GLNPO's Water Quality Survey of the Great Lakes. GLNPO reviews all SOPs in the manual for
improvement and clarity during performance of the WQS and will develop new SOPs for appropriate
activities. This manual will be updated yearly using appropriate document control procedures to
incorporate improvements and clarifications identified during the survey. A controlled copy of this
document is maintained by GLNPO to ensure that all individuals participating in the survey have and
employ current SOPs (Section 5).
                      Recommended Elements  of an Analytical SOP
                        >   Scope and Application
                        >   Method Summary
                        >   Sample Handling and Preservation
                        >   Interferences
                        >   Safety
                        >   Equipment/Materials/Reagents
                        >   Calibration
                        >   Procedure
                        >   Calculations
                        >   QA/QC
                        >   References
        The elements listed above are recommended but will not be strictly enforced, because previously
documented methods may be used. However, long-term programs should contain SOPs that include these
elements, presented in a form that is useful to anyone performing the method.

        Methods can be included in the quality system documentation either in the body of the document
or as an appendix. If the referenced method is not followed precisely, addenda to the method must be
included in the documentation that clearly identify changes to the method, such that the changes are obvious
to any individual using the method. If this altered method is used for an extended period of time, the full
method must be revised and submitted.  A method cannot be revised during project implementation
without the prior consent of the PO. If the modification is accepted, it must be documented in a letter to
the PO and included in the next submitted report.  It is the responsibility of the PO to inform all relevant
project participants of the protocol  change.

        Laboratories working with GLNPO should have a good laboratory practices document that is
available for review during technical audits. Good laboratory practices (GLPs) refer to the general
practices that relate to the majority of measurements such as: facility and equipment maintenance, record
keeping, chain-of-custody, reagent control, glassware cleaning, and general safety.

2.5    TRAINING

        In order to facilitate staff awareness of the quality system, all GLNPO staff involved in EICAAs
will receive a training course on the quality system based on this QMP. As part of the training, staff will
be given an overview of the contents of this QMP and the location of the current QMP, in both electronic
and hard-copy form. Staff also will be encouraged to use the QMP as a reference in support of EICAAs.
Management personnel receive additional training on their specific roles and responsibilities for
implementing GLNPO's quality system. GLNPO's Quality Management Staff receive additional training
on quality planning, documentation, and assessment. Section 3 provides detailed information regarding
GLNPO's training requirements for quality management.

2.6    TRACKING ENVIRONMENTAL INFORMATION COLLECTION ACTIVITIES

       Due to the number of assistance agreements funded each year, GLNPO developed and employs a
database system (QATRACK) to track the development, review, and approval of quality system
documentation for all EICAAs involving GLNPO funds.  The database assists GLNPO with ensuring that
all grants and contracts have the required quality system documentation as determined by the Quality
Manager and PO. Because quality system documentation must be approved prior to initiation of an
EICAA, the Quality Manager is part of the assistance agreement signature chain (see Appendices H and
I), allowing for the earliest possible tracking of the agreement.  Data are entered into QATRACK during
the assistance agreement start-up stage, upon review and signature approval of the agreement by the
Quality Manager. The Quality Management Team enters new projects into QATRACK and maintains the
database. QATRACK has the capability to capture multiple rounds  of submissions and reviews of the
quality system documentation.
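
       To make the tracking concept concrete, the minimal sketch below shows how records of this kind
could be represented and summarized. It is an illustration only: it uses an in-memory SQLite table with
simplified, hypothetical field names and does not reproduce the actual QATRACK (Access) or PTD
(Oracle) design.

    import sqlite3

    # Simplified, illustrative documentation-tracking table.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE qa_tracking (
            glnpo_id        TEXT PRIMARY KEY,   -- GLNPO project identifier
            doc_type        TEXT,               -- e.g., 'QAPP' or 'QMP'
            date_submitted  TEXT,
            date_approved   TEXT,
            approval_status TEXT                -- e.g., 'under review', 'approved'
        )
    """)
    conn.execute(
        "INSERT INTO qa_tracking VALUES (?, ?, ?, ?, ?)",
        ("GL-2008-001", "QAPP", "2008-03-01", None, "under review"),
    )
    # A status summary like the ones prepared for management: the number of
    # ongoing EICAAs in each approval status.
    for status, count in conn.execute(
        "SELECT approval_status, COUNT(*) FROM qa_tracking GROUP BY approval_status"
    ):
        print(status, count)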

       The QATRACK database also is used to prepare reports on the status of quality system
documentation for ongoing EICAAs.  GLNPO's Quality Management Team uses the database to assist
them in preparing reports to management that provide the number of ongoing EICAAs and the status of
required quality system documentation. The database was designed also to function as a management
tool for evaluating adherence to the quality system and  staff workload distribution.  QATRACK is a
Microsoft Access-based application with one version available for the Quality Manager for entry, editing
and archive, and a second version available to staff which is "write protected" for review and query.
GLNPO is in the process of developing a version of this system in Oracle, GLNPO's Project Tracking
Database (PTD), and hopes to implement the system in the last quarter of fiscal year 2002. When
implemented, information for new projects will be entered into the PTD and information for past projects
will be rolled over into the new database as time allows. The database will be organized according to
nine tabs as follows:


Tab 1: General Project Information
>  GLNPO ID # (alpha-numeric)
>  Funding Mechanism (grant, interagency agreement, cooperative agreement, contract, in-house,
    procurement request)
>  Assistance Agreement #
>  Contract #
>  Solicitation Type (text)
>  Document Control Number (DCN) (alpha-numeric)
>  Title of Project (text)
>  Project Summary/Description (text)
>  Project Type (research or demonstration; survey, study, or investigation; other)
>  PI Information (name,  address, phone, fax, email, organization name, organizational type [checklist:
    State; Interstate Agency or Commission; Sub-state or special purpose district; County; Municipality;
    Federal Agency; College or University; Tribal Organization; Federally funded research and
    development center; Individual; For-profit Company; or Other])
>  PO Information (name, phone)
>  Category (contaminated sediments, ecological protection and restoration, pollution prevention and
    reduction, monitoring, indicator development, invasive species, strategic or emerging issues, LaMP,
    GLWQA, GPRA, other)
>  Lake basin (Ontario, Erie, St. Clair, Huron, Michigan, Superior, Connecting Channels, AOCs [list all
    out] All, Other)
>  State/province where project is located (NY, PA, OH, MI, IN, IL, WI, MN, Ontario, All, Other)
>  Congressional district (text)
>  Project Location (text)
>  Project period start (date)
>  Project period end (date)
>  Amendment start date(s)
>  Amendment end date(s)
>  Amendments (open)
>  Award date
>  QA needed (yes or no)
>  Peer review  needed (yes or no)
>  Data to GLENDA (yes or no)
>  Keyword (text)
>  Comments (open field)

Tab 2: Grant, Cooperative Agreement, Interagency Agreement Pre-Award Phase
>  GLNPO ID #
>  Funded? (Yes/no)
>  Amount requested (money)
>  Amount match (money)
>  Preproposal (open field to attach document)
>  Technical  screening (checkbox)
>  Lake team recommendations (checkbox)
>  Recommendations to management (checkbox)
>  Management decision (checkbox)
>  Date of Management decision
>  Comments (open field)

Tab 3: Grant, Cooperative Agreement, Interagency Agreement Award Phase
>  GLNPO ID #
>  Grant specialist name and phone
>  Budget period start (date)
>  Budget period end (date)
>  Fiscal year
>  Amount obligated (money)
>  Amount matched (money)
>  Congressional earmark? (Yes/no)
>  Date of full proposal request letter
>  Date full proposal received
>  Date documents/checklist come from assistance section
>  Date commitment notice sent (GLNPO)
>  Date of decision memo
>  Date final documents come from assistance section
>  Date assistance  agreement to NPM
>  Comments (open field)

Tab 4: Grant, Cooperative Agreement, Interagency Agreement Project Management Phase
>  GLNPO ID #
>  Progress report frequency (open)
>  Site visits (date) (be able to enter in multiple dates)
>  Site visit report (open field) (be able to enter in multiple reports)
>  Program requirements checklist (progress report, draft of final report, final report, project
    documentation, payment, meetings/conferences, subcontracting, quality assurance, locational
    information, data reporting, safety manual, signage, disposition of wastes, other)
>  Date progress reports due
>  Date progress reports received
>  Progress reports (open field)
>  Date receive QA plan
>  Unliquidated obligations (open field)
>  Unliquidated obligations dates
>  Date of financial status report
>  Financial  status report (checkbox)
>  Comments (open field)
>  Reminder e-mail sent (checkbox)
>  Recipient contacted (checkbox)
>  Demand letter sent (checkbox)
>  Delinquency notice sent (checkbox)
>  Letter sent to branch chief (checkbox)
>  Letter sent to grantee from BC (checkbox)
>  Teleconference (checkbox)
>  Suspension letter sent from BC (checkbox)
>  Meeting to discuss options (checkbox)
>  Comments (open)

Tab 5: Grant, Cooperative Agreement, Interagency Agreement Project Closeout Phase
>  GLNPO #
>  Final document checklist (MBE/WBE, property, inventions, final  FSR)
>  Money expended
>  Final report received (yes/no)
>  Final report approved (yes/no)
>  Final report (open field)
>  Final report location (e.g., web address)
>  Date closed-out
>  Comments (open field)

>  Reminder letter from assistance (check mark)
>  Contact recipient prior to due date (check mark)
>  Recipient contacted on due date (check mark)
>  Demand letter sent (check mark)
>  Teleconference (check mark)
>  Delinquency notice sent (check mark)
>  Letter sent to branch chief (check mark)
>  Letter to grantee from director (check mark)
>  Teleconference (check mark)
>  Suspension letter sent to grantee (check mark)
>  Teleconference (check mark)
>  Debarment and suspension call and letter sent (check mark)
>  Comments (open)

Tab 6: Contracts
>  GLNPO ID #
>  Statement of work/Specifications (open field)
>  Contract specialist name and phone
>  Contract officer name and phone
>  Contracting Officers Representative Training (yes or no)
>  Contracting Officers Representative Training Form (open)
>  Work assignment/Task order (open)
>  Work assignment # / Task order #
>  Work assignment/Task order Start Date
>  Work assignment/Task order End Date
>  Work plan (open)
>  Technical Directive (text)
>  Technical Directive (date)
>  Procurement Request (open)
>  Date of procurement requisition
>  Procurement requisition amount ($)
>  Committed amount ($)
>  Amount ($)
>  Date of commitment
>  DCN#(text)
>  Comments (open)

Tab 7: Communications
>  GLNPO ID #
>  Final report submitted to GLNPO? (yes/no)
>  Web information (open)
>  Press release (open)
>  Fact sheet (open)
>  Journal article publication (open)
>  Outreach/Tech transfer planned (open)
>  Slides (open)
>  Comments (open)
Tab 8: Quality System Documentation
>   GLNPO ID #
>   Date entered

>  Quality system documentation (check QMP or QAPP)
>  Quality system documentation (open)
>  Quality system documentation type (modeling, habitat restoration, secondary data, monitoring,
    research analytical, sediment assessment, sediment remediation)
>  Date QS documentation due
>  GLNPO Lead
>  Funding branch
>  Funding team
>  Date to QA manager
>  Quality system review checksheet (open)
>  Initial approval date
>  QA approval ID
>  Approval  status (no submittal, under review, approved, approved w/minor revisions, unacceptable,
    delinquent)
>  Date of review completion
>  Date of final approval
>  Audit checksheet (open field)
>  Date of audit
>  Audit summary report (open field)
>  Data files (open field)
>  Data assessment report (open field)
>  Comments (open field)
>  Contacted recipient (check mark)
>  Second contact with recipient (check mark)
>  Delinquent but not significantly affecting quality (check mark)
>  1st Phase Delinquency (check mark)
>  Letter sent to PO (check mark)
>  Teleconference (check mark)
>  Cc letter sent to branch chief (check mark)
>  2nd Phase  Delinquency (check mark)
>  Letter sent to branch chief (check mark)
>  Seriously  Delinquent (check mark)
>  Letter sent to grantee from director (check mark)
>  Suspension letter sent to grantee (check mark)
>  Teleconference (check mark)
>  Debarment and suspension call and letter sent to grantee (check mark)
>  Withdraw funds  (check mark)
>  Comments (open)

Tab 9: Peer Review
>  GLNPO ID #
>  Date entered into peer review database
>  Objective (open)
>  Cross-cutting science issues (older Americans, children's health, tribal science, contaminated
    sediments, cumulative risk, indoor environments, environmental justice, genomics)
>  Peer Review Leader (name and phone)
>  Science Category (major scientific/technical, non-major scientific/technical, major economic, non-
    major economic, major social science, non-major social science, other)
>  Environmental Regulatory Model (new, modified, new application, N/A)
>  Environmental Medium (air, human health, multimedia, terrestrial, water, other)
>  Peer Review Type (internal, external, to be determined)
>  Peer Review Mechanism (text)

>  Results of Peer Review Comments (substantive revision to final product, minor revision to final
    product, no significant change to final product, product was terminated, to be determined)
>  Peer Review Charge/Instructions (open)
>  Peer Reviewer Name & Affiliation (text)
>  Peer Review Comments (open)
>  Management Decision on Comments (open)
>  Location of Peer Review File (text)
>  File Contact Name, organization, telephone (text)
>  Additional Supporting Documentation / Comments (open)
>  Date of projected peer review
>  Date peer review was conducted
>  Date peer review was completed
>  Date final peer review comments were received
>  Date of management decision on peer review comments
>  Comments (open)
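
       The tabs above amount to a set of related record types keyed on the GLNPO ID #. The following
sketch is a hypothetical, abbreviated illustration of that structure in Python; it is not GLNPO's actual
tracking system, and the field names are shortened from the lists above.

from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical, abbreviated models of two tracking-database tabs (illustration only).

@dataclass
class GeneralInformation:            # Tab 1
    glnpo_id: str                    # GLNPO ID #, the key shared by all tabs
    project_title: str
    po_name: str
    po_phone: str
    category: str                    # e.g., "contaminated sediments", "monitoring"
    lake_basin: str                  # e.g., "Superior", "All"
    qa_needed: bool
    peer_review_needed: bool
    data_to_glenda: bool
    comments: str = ""

@dataclass
class QualitySystemDocumentation:    # Tab 8
    glnpo_id: str
    doc_type: str                    # "QMP" or "QAPP"
    date_due: Optional[date] = None
    date_to_qa_manager: Optional[date] = None
    approval_status: str = "no submittal"   # under review, approved, delinquent, ...
    date_of_final_approval: Optional[date] = None

# Example records linked by GLNPO ID # (all values hypothetical)
proj = GeneralInformation("GL-00123", "Hypothetical sediment survey", "J. Smith",
                          "312-555-0100", "contaminated sediments", "Michigan",
                          qa_needed=True, peer_review_needed=False, data_to_glenda=True)
qsd = QualitySystemDocumentation("GL-00123", "QAPP", date_due=date(2008, 9, 1))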

2.7   QUALITY MANAGEMENT TEAM MONTHLY REPORTS TO MANAGEMENT

       GLNPO's Quality Management Team presents a status report on quality management activities
associated with all GLNPO-funded EICAAs each month during a management meeting. GLNPO uses a
monthly quality assurance status tracking sheet (Appendix C) to summarize and present the status of these
activities to management. For all outstanding EICAAs, the Team provides a list of required quality
system documentation, the due dates for that documentation, and its review and approval status. These
reports are also designed to serve as a management tool for evaluating office workloads.
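
       As a rough illustration only (GLNPO's actual tracking sheet appears in Appendix C), a monthly
summary of this kind could be assembled from records like the hypothetical Tab 8 sketch above:

from datetime import date

def monthly_qa_status(records, as_of):
    """Summarize quality system documentation status for outstanding EICAAs.

    `records` is any iterable of objects with glnpo_id, doc_type, date_due,
    and approval_status attributes (see the hypothetical Tab 8 sketch above).
    """
    lines = []
    for r in records:
        overdue = (r.date_due is not None and r.date_due < as_of
                   and r.approval_status != "approved")
        flag = "  ** OVERDUE **" if overdue else ""
        lines.append(f"{r.glnpo_id}  {r.doc_type:<4}  due {r.date_due}  "
                     f"status: {r.approval_status}{flag}")
    return "\n".join(lines)

# Example: print(monthly_qa_status([qsd], as_of=date(2008, 10, 1)))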

2.8   REVIEWS OF QUALITY SYSTEM IMPLEMENTATION AT THE BRANCH LEVEL

       GLNPO has established a procedure to facilitate assessment of quality system implementation at
the Branch level. GLNPO's Office Director will meet periodically with each Branch Chief and Regional
Lake Manager, at the discretion of the Quality Manager, regarding adherence to the
quality system by all EICAAs administered through the branch. Prior to the meeting, GLNPO's Quality
Manager will prepare a briefing document on implementation of the quality system, including the status
of quality system documentation, the status and results of all technical system audits or reviews, and
outstanding issues. This meeting will function to close the loop on the quality system cycle from
planning and implementation to assessment and applying corrective action when needed.

2.9   ANNUAL REPORT AND WORKPLAN

       GLNPO summarizes the status of its quality system each year in the Quality Assurance Annual
Report and Work Plan (QAARWP), which the Quality Manager prepares in coordination with senior
management and submits to OEI's Quality Staff. The QAARWP reports on quality system documentation,
assessments, and quality system training (see Section 3) completed during the preceding year and
identifies the quality assurance activities planned for the coming year.

2.10  QUALITY SYSTEM AUDITS AND TECHNICAL SYSTEM AUDITS

       Quality systems audits (QSAs), previously termed management systems reviews, are on-site
evaluations by internal or external parties to determine if the organization is implementing a satisfactory
quality management program. They are used to determine the adherence to the program, the effectiveness
of the program, and the adequacy of allocated resources and personnel to achieve and ensure quality in all
activities. Internal QSAs are conducted by GLNPO senior management. GLNPO-funded entities also
may undergo QSAs led by GLNPO's Quality Manager. External QSAs are conducted by EPA Quality
Staff to determine compliance of GLNPO's program with this QMP. Technical systems  audits (TSAs) are
qualitative on-site evaluations of all phases of an EICAA (i.e., sampling, preparation, analysis). These
audits can be performed prior to or during the data collection activity, in order to evaluate the adequacy of
equipment, facilities, supplies, personnel, and procedures that have been documented in the quality
system documentation. Because a TSA is most beneficial at the beginning of a project, GLNPO
schedules audits at the initiation phase of an EICAA, when possible. GLNPO performs a QSA, site visit,
or TSA for the most high-profile EICAAs (i.e., those that support an important decision). The number
and frequency are dependent on the length of the project, the importance of the project objectives, and the
evaluations of prior audits. Technical System Audits and QSAs are discussed in Section 9.1.

                                        Section 3
                      Personnel Training and Qualifications

       The success of any quality management program ultimately lies with the personnel who
implement the program on a daily basis. GLNPO must not only support activities that will satisfy the
mandatory quality management program, but also instill the philosophy of improving activities to provide
the highest quality data in a cost-efficient manner.  It is GLNPO policy to provide the quality system
training necessary to ensure that all staff involved with the generation and use of environmental data
understand and use GLNPO's quality system. Management is committed to ensuring that GLNPO
personnel responsible for EICAAs have the necessary education, training, and experience to develop,
control and evaluate data quality. The following sections describe GLNPO's quality system training
program.

3.1    QUALITY MANAGER TRAINING

       The Quality Manager regularly attends national and, in some cases, international conferences and
meetings on quality systems and the development of quality management materials and protocols relevant
to GLNPO. The Quality Manager will participate in training courses on quality management topics, such
as data quality assessment and QAPP development. This, in turn, helps ensure that GLNPO personnel
receive up-to-date training on a variety of quality assurance subjects, including EPA's quality policy.

3.2    GLNPO PERSONNEL QUALITY SYSTEM TRAINING

       Supervisors are responsible for ensuring that staff have the qualifications to do their jobs,
including those related to the quality system. Managers are responsible for discussing quality training
needs with personnel involved in EICAAs during the mid-year and annual personnel performance
evaluations. As part of personnel performance evaluations, supervisors and personnel should incorporate
training and refresher training courses into the individual development plan to maintain competencies.
Personnel complete self-certification forms annually, and the Quality Staff manage these records to
ensure that staff training and refresher training remain adequate.

       In addition, because line management is ultimately responsible for the quality of data, managers
and supervisors also must receive the necessary training to ensure their understanding of the importance
of the quality system, their responsibilities as managers of data collection activities, and specific GLNPO
quality system policies and procedures. Repetition is critical to raising awareness.

       Training schedules will be developed in order to optimize attendance. Training may consist of
seminars or classes, or on-the-job training. If training needs cannot be met through in-house expertise,
training may be accomplished through external organizations. Training also will be available to all
personnel cooperating on GLNPO projects (universities, other agencies, etc.). It is the responsibility of
GLNPO project leads to make cooperators aware of these training opportunities. The Quality Manager
will provide the following training at least every three years:

>   Overview of GLNPO's Quality System
>   QA Project Plan development
>   GLNPO Quality System Training for Project Officers
>   Auditing and data verification/validation techniques

       The Quality Manager will develop a library of pertinent quality management documentation to
assist GLNPO technical staff. The library will include documentation as well as software training
programs. The minimum required quality management training for GLNPO staff is detailed in Table 3-1.

Table 3-1. Quality Management Training Requirements for GLNPO Staff

Managers (Branch Chiefs, Division Directors):
>   Overview of GLNPO's Quality System (every 3 years)
>   Orientation to Quality Assurance for Managers (1 time only)

Work Assignment Managers, Project Managers, Project Officers:
>   Overview of GLNPO's Quality System (every 3 years)
>   GLNPO Quality System Training for Project Officers (every 3 years)

GLNPO Quality Management Team:
>   Overview of GLNPO's Quality System (every 3 years)
>   Development of Quality Assurance Project Plans (1 time only)
>   Data Quality Assessment (1 time only)
>   Development of Quality Management Plans (suggested course)

All GLNPO staff involved in the generation or use of environmental information:
>   Overview of GLNPO's Quality System (every 3 years)
     Attendance at the courses will be recorded, and attendees will receive a written record from the
Quality Manager or instructor after completion of a course. The Quality Management Team will maintain
records of the quality system training taken by personnel in each Program Office.  A summary of the
quality system training will be provided in the annual report, including, but not limited to, a list of the
courses offered, the number of attendees, and a listing of all participating organizations. Whenever a new
QMP is developed or significant revisions to the QMP are made, training will take place within 6 months
of approval of the QMP by OEI's Quality Staff to ensure that GLNPO staff are kept fully informed of the
quality system.
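
       A minimal sketch, assuming a simple in-house training log (the staff names and dates below are
invented), of how refresher due dates on the three-year cycle above might be flagged:

from datetime import date, timedelta

REFRESHER_INTERVAL = timedelta(days=3 * 365)   # approximately three years

def refresher_due(last_taken):
    """Approximate date the next 'Overview of GLNPO's Quality System' refresher is due."""
    return last_taken + REFRESHER_INTERVAL

def staff_due_for_refresher(training_log, as_of):
    """training_log maps staff name -> date the overview course was last taken."""
    return [name for name, taken in training_log.items() if refresher_due(taken) <= as_of]

# Hypothetical log entries
log = {"A. Analyst": date(2004, 6, 1), "B. Officer": date(2007, 3, 15)}
print(staff_due_for_refresher(log, as_of=date(2008, 5, 1)))   # ['A. Analyst']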

                                        Section 4
                        Procurement of Items and Services

       GLNPO must ensure that procured items and services meet EPA regulations, are delivered in a
timely fashion, and are within GLNPO's specifications.  The following sections describe GLNPO's
procurement procedures.

       It is GLNPO policy that quality system requirements be explicitly addressed when acquiring
items or services that involve EICAAs. This policy applies to procurements such as contracts, as well as
to cooperative agreements, partnership agreements, interagency agreements, and grants to institutions of
higher education, other non-profit organizations, Tribes, States, and local governments. The following
Federal regulations contain sections relating to quality management or quality systems:

        >  48 CFR Part 46. Quality Assurance
       >  40 CFR Part 30. Grants and Agreements with Institutions of Higher Education, Hospitals,
                          and Other Non-Profit Organizations
       >  40 CFR Part 31. Uniform Administrative Requirements for Grants and Cooperative
                          Agreements to State and Local Governments
       >  40 CFR Part 35. State and Local Assistance

       In addition, there are other rules and regulations that apply to contracts and other forms of
financial assistance, including grants, assistance agreements, performance partnership agreements, and
interagency agreements, as described below.

4.1    PROCUREMENT OF ITEMS

       GLNPO utilizes the services of the EPA Region 5 Purchasing Section of the Contracts and Grants
Branch for its procurement needs. The Purchasing Section follows the guidelines in Section 13 of the
Federal Acquisition Regulations (FAR) which establishes government-wide policies and procedures
governing the acquisition process. Two EPA documents, the EPA 1900 Contracts Management Manual
and the EPA Acquisition Regulation (EPAAR), have been developed to supplement the FAR. Region
5 is required to implement the regulations in these documents.  EPA attempts to purchase through FAR
mandatory sources (i.e., GSA): items on the FAR source list that meet the minimum specifications on the
procurement request must be purchased through a FAR source.  Procurements of computer hardware and
software have a distinct process.  Computer procurements will be developed by the Information
Management and Data Integration Team and adhere to Region 5 policy.

       Requests for purchases and identification of funds begin at the planning stages of any GLNPO
project. To assure agreement among GLNPO POs, principal investigators, and the Region 5
Purchasing Section, requesters should explicitly identify all items and associated specifications required
to meet the government's minimum needs. These specifications will be required during the procurement
process. In order to provide the Region 5 Purchasing Section with the correct information, it is suggested
that the specifications be developed or reviewed with a purchasing agent before initiating the procurement
request.  This will assure that the GLNPO requestor will receive the proper item and reduce the chances
of purchase delays or incorrect purchases due to inadequate product specifications.  The purchasing agent
also can assist the requestor in preparing the procurement request form.

       GLNPO utilizes procurement request forms (EPA Form 1990-8) to initiate requests. These forms
will be reviewed by the GLNPO planning and management staff for completeness and accuracy, then
forwarded to appropriate GLNPO staff for required GLNPO reviews and approvals. The procurement
request will then be forwarded for additional reviews and approvals to the Region 5 Budget Office.  The
Budget Office certifies that funds are available and the Budget Services Center assigns a document
control number (DCN). The Budget Office forwards the procurement request to the Property
Management Officer for signature.  Finally, the request is sent to the Region 5 Purchasing Section. The
approval process may take up to two weeks from the time the procurement request is written until it
arrives at Region 5 Purchasing for approval and procurement. If the item is required sooner, the
procurement request must be "walked through" the approval process.
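
       The routing just described can be read as a fixed sequence of stops. The list below is a simplified,
hypothetical restatement of that sequence for illustration; it is not an official workflow definition.

# Simplified restatement of the procurement request routing described above.
APPROVAL_CHAIN = [
    "GLNPO planning and management staff review (completeness and accuracy)",
    "GLNPO reviews and approvals",
    "Region 5 Budget Office (certifies funds; Budget Services Center assigns DCN)",
    "Property Management Officer (signature)",
    "Region 5 Purchasing Section (approval and procurement)",
]

def next_stop(steps_completed):
    """Return the next stop for a request that has cleared `steps_completed` steps."""
    if steps_completed < len(APPROVAL_CHAIN):
        return APPROVAL_CHAIN[steps_completed]
    return "Procurement complete"

print(next_stop(2))   # -> "Region 5 Budget Office (certifies funds; ...)"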

4.2   PROCUREMENT OF SERVICES

       Two types of mechanisms are principally used to procure services: contracts and assistance
agreements (grants, interagency agreements, etc.). At GLNPO, contracting officers are the only individuals
authorized to obligate funds for services.

       Certain activities are of a policy and decision-making nature and should remain the sole authority
of the EPA. Therefore, contracts or assistance grants should not include the following services:

       >  The actual preparation of Congressional testimony
       >  The interviewing or hiring of individuals for employment by EPA
       >  Developing and/or writing of Position Descriptions and Performance Standards
       >  The actual determination of Agency policy
       >  Participating as a voting member on a Performance Evaluation Board;  participating in and
           attending Award Fee meetings
       >  Preparing Award Fee letters, even under typing services contracts
       >  The preparation of documents on EPA letterhead other than routine administrative
           correspondence
        >  Reviewing vouchers and invoices for the purpose of determining whether costs, hours, and
            work performed are reasonable
       >  The development of Statements of Work, Work Assignments, Technical Direction
           Documents, Delivery Orders, or any other work issuance document under a contract the
           contractor is performing or may perform
       >  On behalf of EPA, actually preparing responses to audit reports from the Inspector General,
           General Accounting Office, or other auditing entities
       >  On behalf of the EPA, actually preparing responses to Congressional correspondence
       >  The actual preparation of responses to Freedom of Information Act requests, other than
           routine, non-judgmental correspondence -  in all cases, EPA must sign it
       >  Any contract which authorizes a contractor to represent itself as EPA to outside parties
       >  Conducting administrative hearings

       In the past, contract and assistance agreements were utilized to provide QA/QC support to
EICAAs. Such use of non-EPA personnel could result in situations in which inherently governmental
functions are performed by non-governmental personnel or in which potential conflicts of interest could
occur. Table 4-2 lists quality management tasks as either EPA-only tasks or discretionary tasks
which may be performed by EPA or non-EPA support staff.

Table 4-2. Quality Management Task Performance by EPA and Non-EPA Personnel

Quality Management Task: Manage and coordinate quality system (quality management program)
Performed by: EPA only
>   Manages the day-to-day operation of GLNPO's mandatory quality system (quality management program)
>   Acts as a liaison between the organization and OEI's Quality Staff on matters of quality policy
>   Coordinates with senior management on the development and preparation of the organization's QMP
    that describes the quality system implemented by the organization
>   Coordinates with senior management on changes to the quality system as needed to assure its continued
    effectiveness and reports the results annually to management and to OEI's Quality Staff in the QAARWP
>   Manages organization resources designated for the quality system (quality management program)
>   Maintains pertinent records of all quality system activities performed by the organization

Quality Management Task: Review and approve procurement and financial assistance documents for QA
requirements
Performed by: EPA only
>   Reviews procurement and financial assistance documents to confirm that any need for QA requirements
    has been established, provides any necessary special language or conditions for such QA requirements,
    and approves by signing the appropriate Quality Assurance Review (QAR) Form
>   Participates directly or indirectly in the proposal or cooperative agreement/grant review processes to
    advise the PO on the suitability of the offeror's quality system (QA program) or QA/QC approach for
    the particular project
>   Reviews work assignments to certify that appropriate QA requirements have been established and that
    the necessary instructions are being communicated to the contractor to carry out the expected QA/QC
    tasks; provides signature approval

Quality Management Task: Review and approval of quality management planning documents
Performed by EPA only:
>   Reviews QAPPs for all projects, work assignments, grants, cooperative agreements, and inter-agency
    agreements involving data acquisition, data generation, and/or measurement activities that are
    performed on behalf of EPA
>   Approves all QAPPs for implementation in all applicable projects, work assignments, grants,
    cooperative agreements, and inter-agency agreements performed on behalf of EPA; where specific
    approval of QAPPs has been delegated to a responsible EPA official, the Quality Manager reviews the
    QAPPs for concurrence
>   Coordinates the correction of deficient QAPPs with the PO and his/her management, and assures
    through appropriate procedures (e.g., contract, financial assistance) that no data collection operations
    commence before a QAPP is approved
Discretionary tasks (EPA or non-EPA support staff):
>   Reviews, at the specific technical direction of the Quality Manager, QAPPs and other QA-related
    planning documents, such as sampling and analysis plans, DQO specifications, etc., to determine if the
    proposed QA approach documented is adequate for the work planned, based on explicit evaluation
    criteria provided by the Quality Manager (the reviews should bring specific technical deficiencies in
    the QAPP to the attention of the Quality Manager)

Quality Management Task: Review and technical assistance in developing and preparing experimental designs
Performed by EPA only:
>   Interprets Agency policy and requirements pertaining to developing and preparing experimental design
    requirements
>   Provides corrective action technical assistance and guidance to intramural and extramural researchers
    to enable them to produce, in a timely manner, satisfactory experimental design documents using the
    DQO process
Discretionary tasks (EPA or non-EPA support staff):
>   Using explicit criteria provided by the Quality Manager, reviews experimental designs produced to
    determine if satisfactory results can be obtained from the design and provides the Quality Manager
    with a technical assessment of strengths and weaknesses in the design

Quality Management Task: Tracking and reporting of QA program deliverables
Performed by EPA only:
>   Tracks critical QA program deliverables for the organization and makes periodic reports to senior
    management on the status of reporting actions and deliverables
Discretionary tasks (EPA or non-EPA support staff):
>   Compiles/logs administrative management information, including turnaround times to correct QAPPs,
    responses to audits, and quality reviews of final reports

Quality Management Task: Management of contractor support work assignments
Performed by: EPA only
>   Serves as PO of record for contracts established to provide QA support to the organization and usually
    serves as the work assignment manager (WAM) for specific work assignments involving QA activities
    within the same or other contracts

Quality Management Task: Conduct management assessments
Performed by EPA only:
>   Plans, directs/conducts, and reports to senior management the results of annual assessments of the
    effectiveness of the quality system (QA program) being applied to EICAAs
>   Coordinates with senior management any revision of the quality system (QA program) as necessary
    based on findings of the assessment
Discretionary tasks (EPA or non-EPA support staff):
>   Provides technical support to the EPA Quality Manager in the planning phase of management
    assessments (such activities are limited to the assembly and compilation of background information
    and data, guidance documents, technical reports, etc., available in the public domain, for use by EPA
    in designing the assessment goals and specifications)

Quality Management Task: Conduct technical assessments
Performed by EPA only:
>   Plans and directs, with the responsible EPA PO, the implementation of periodic technical assessments
    of ongoing EICAAs to confirm that technical and quality objectives are being met and that the needs of
    the customer are being satisfied (such assessments include technical systems audits, audits of data
    quality, surveillance, performance evaluations, and data quality assessments)
Discretionary tasks (EPA or non-EPA support staff):
>   Performs technical assessments (as listed above) of the organization's EICAAs, both intramural and
    extramural, according to a specific plan prepared by the Quality Manager and in the presence of the
    Quality Manager or an authorized EPA official (preparations for such assessments may include the
    acquisition or development of audit materials and standards; results (findings) are summarized and
    presented to the Quality Manager, or authorized EPA official, for determination of conclusions and
    necessary actions, if any)

Quality Management Task: Preparation and presentation of quality management subjects in the technical
literature and at meetings/symposia
Performed by EPA only:
>   Represents EPA in transferring quality management subjects to other Agency, public, or scientific
    groups through participation in technical meetings and symposia and through the technical literature,
    including peer-reviewed journal papers, oral presentations, and panel discussions
Discretionary tasks (EPA or non-EPA support staff):
>   Transfers quality management subjects to other groups through participation in technical meetings and
    symposia and through the technical literature, with an appropriate disclaimer that the information does
    not represent EPA policy or position (only EPA personnel may represent the Agency in an official role)

Quality Management Task: Research relative to quality management issues
Performed by: Discretionary tasks (EPA or non-EPA support staff)
>   The Quality Manager should be kept abreast of advances in quality management through the technical
    literature, training, and symposia to create opportunities for improvements to the organization's quality
    system (QA program)
>   Performs searches of the technical and quality management literature relative to specific QA/QC
    issues, including compiling summaries of alternative sampling and analytical methods, identification of
    QC reference materials, and availability of standard operating procedures for calibrating certain
    instrumentation

Quality Management Task: Preparation and presentation of QA training materials and courses
Performed by EPA only:
>   Develops and presents detailed guidance and training for QA/QC activities based on interpretation of
    Agency-wide requirements and guidance
>   Provides or coordinates quality-related training for the organization in special skill areas not generally
    available to the organization
Discretionary tasks (EPA or non-EPA support staff):
>   Provides assistance in preparing and presenting quality-related technical training (within constraints of
    potential conflict of interest)

Quality Management Task: Quality review and approval of final reports
Performed by EPA only:
>   Defines criteria for the acceptability of quality documentation in the organization's published papers
    and reports
>   Approves for publication only those papers and reports that contain an adequate discussion of the
    quality of the project's results and the usability of the data produced
Discretionary tasks (EPA or non-EPA support staff):
>   Conducts a review of all reports produced by the organization, using the qualitative and quantitative
    specifications obtained from the DQO process, to ensure that an adequate discussion of the quality of
    the project results and the usability of the data produced is included (this quality review complements
    the peer review process and documents whether the results of the EICAA have been reconciled with
    the quality objectives; results of any reviews performed by non-EPA support personnel are presented
    to the Quality Manager for a decision on the acceptability of the report)


       As it does for procurement of items, GLNPO utilizes the services of the EPA Region 5 Contracts
Management Branch for the procurement of services. This group must approve all contracts and
agreements before they are implemented. It is GLNPO policy that appropriate quality system
documentation must be submitted before any funding of contracts or agreements containing EICAAs is
initiated.
To that end, all procurement packages and associated quality system documentation are reviewed by the
Quality Manager.  The documentation must be reviewed by the GLNPO Quality Manager and determined
to be "acceptable with minor revisions" (see Appendix J) prior to initiation of an EICAA. Information on
the  development of quality system documentation is detailed in Section 2.  In order to determine whether
quality system documentation is required, GLNPO POs can use the Quality System Documentation
Checklist (Appendix F).  The PO is responsible for ensuring that the technical requirements of the quality
system are satisfied. It is GLNPO policy that all POs overseeing contracts and  assistance agreements be
certified through PO and contract administration training.

4.2.1  CONTRACTS

       GLNPO conducts procurement functions in accordance with the Federal Acquisition Regulations
(FAR), and generally accepted business practices for the acquisition process. The FAR was recently
amended to address contract quality systems requirements on a government-wide basis.  The new FAR
contract clause at 52.246 11, Higher Level Contract Quality Requirements (February 1999), as prescribed
by FAR 46.311, allows a Federal agency to select a voluntary consensus standard as the  basis for its
quality requirements for contracts.  The EPA Contracts Management Manual (April 7, 2004) identifies
ANSI/ASQ E4-2004, Quality Systems for Environmental Data and Technology Programs - Requirements
with Guidance for Use, as an acceptable standard.

       The EPA Directive 1900, Contracts Management Manual (CMM), Chapter 46 - Quality
Assurance, Section 46.1 Guidance for Use of Higher-Level Contract Quality Requirements in
Acquisitions, establishes guidance for program personnel and Contracting Officers (COs) regarding the
inclusion of higher-level contract quality requirements in applicable solicitations and contracts, and
supplements the procedures and requirements contained in FAR 46.202-4 and FAR 52.246-11 (Higher-
Level Contract Quality Requirement, Feb 1999). It also contains instructions for the use  of the requisite
FAR clause (FAR 52.246-11) for higher-level contract quality requirements (Appendices 46.1A and B),
the QA Review Form (Appendix 46.1D) and provides a variety of "tailored" clauses that can be
incorporated into contract actions (Appendix 46.1C). The FAR clause at 52.246-11 allows  acceptable
quality standards to be "tailored" to meet specific Agency needs.

       The EPA Office of Acquisitions Management issued Procurement Policy Notice No. 01-02 in
March 2001 that provides guidance for the use of these higher-level contract quality requirements. Notice
01-02 includes two attachments that provide directions for contracting officers and their representatives in
the  program offices (e.g., PO and Work Assignment Managers), as well as quality staff,  and describes the
process for determining the quality system requirements that must be included in contract acquisition
packages.

       Contracts are used when the government derives sole benefit from a particular product or service.
 Contracts can be specific and require a degree of lead time for development. Depending upon the scope
of the service, quality assurance requirements that must be adhered to under the terms and agreements of
the  contract are developed. Currently, POs and their supervisors are responsible for review of
procurement packages. GLNPO relies on the contract officer, PO, work assignment manager, and
specialist (as appropriate) to include required documentation. GLNPO quality staff also will assist in the
contracting process by evaluating quality system documentation submitted by contractors in response to
either pre-award or post-award requirements. As noted in the EPA 1900, Contracts Management Manual,
a member of GLNPO's Quality Management Team at the appropriate level will be involved in
procurements over $500,000 in cases where quality system requirements are applicable to the
procurement. GLNPO's Quality Manager will generally fulfill this role.

       GLNPO's management staff and Quality Management Team play active roles in assisting the
contract management staff in defining the quality system requirements for contracts.  Contracts involving
EICAAs will include requirements for the provision of a quality management plan and quality assurance
project plans, or other appropriate quality system documentation.

       In procuring services, responsibility does not follow the line of authority. The PO, as a functional
person, submits a request stating the desired service, measures the quality of the service, and accepts the
service. The Contracting Officer provides the means of establishing a contract and enforcing its provisions.
The PO has overall responsibility to see that the service is provided but works through the contracting
officer's authority. The PO is appointed by the Contracting Officer and formally designated as a
technical representative of the Contracting Officer in the contract. Project Officers must complete PO and
contract administration training to serve on a contract.  Chapter 7 of the EPA Contracts Manual
(EPA-1900) specifies the required training,  experience, and workload limitations for an individual to
serve as a PO.  GLNPO will adhere to these specifications.  Two major tools to ensure that adequate
service is provided are a well-defined  statement of work (SOW) and quality system documentation that
includes reviews (audits).

       Whenever the government enters into a contract, it is entitled to receive quality service. In order
to define and measure quality, the PO  must develop a statement or scope of work (SOW) that will
accurately define the minimum acceptable requirements for the service. This is the first step in the
procurement process that helps to ensure that services produce results or products of acceptable quality.
The PO must succinctly state their expectations of the product or service and be able to relate this to the
supplier. Good communication between the PO and the supplier of a product or  service is essential to a
mutual understanding of what the expectations are and how quality will be defined.  Methods used to
determine quality (audits, quarterly interviews, random inspections etc.) should be explained prior to
project implementation so that the supplier will understand how quality will be assessed.  Supplement #2
to OMB circular A-76, A Guide for Writing and Administering Performance Statements of Work for
Service Contracts, provides good guidance for writing SOWs and implementing  QA surveillance plans.
Another important source of information  is the EPA 1900-Contracts Management Manual which specifies
all required documents for developing contracts.  The Quality Manager will maintain copies of both
documents in the QA library.

       GLNPO personnel must be aware of "personal services," which are characterized by an
employer-employee relationship between government and contractor employees. These contracts are
prohibited at EPA. Personal services conflicts arise when government employees assume the right to
instruct, supervise, or control a contractor's employees in how they perform work. It is the contractor's
right to hire  and terminate, to assign, and to organize and implement tasks, as the contracting organization
deems appropriate. GLNPO may tell the contractor what to do within the terms and agreements of the
contract, but not how to do it.


4.2.2  ASSISTANCE AGREEMENTS

       Assistance agreements are used when both parties (EPA and the group providing the service)
derive benefit from the service. This usually occurs with grants, cooperative agreements, or interagency
agreements (IAGs) where universities or States derive benefits from participating in EICAAs. If the
project involves environmentally-related measurements or generation of either primary or secondary data,
then the applicant/recipient must develop and implement a quality management system. Grants are
assistance agreements where EPA has no substantial involvement in the project.  Cooperative agreements
are assistance agreements where EPA has substantial involvement in the project.

       As discussed in the EPA proposed Order, Policy for Competition in Assistance Agreements, July
15, 2002, it is EPA policy to promote competition in the award of assistance agreements to the maximum
extent practicable. When assistance agreements are awarded competitively, it is EPA policy that the
competitive process be fair and open and that no applicant receive an unfair competitive  advantage. It is
GLNPO policy to promote fair and open competition in the award of assistance agreements  and GLNPO
is committed to meeting the specifications of EPA's final order on this subject.  This policy  will be
discussed as part of the staff training session, Overview of GLNPO's Quality System.

       GLNPO follows  guidelines developed in the EPA Assistance Administration Manual (EPA-5700)
and in the 4th edition of Managing Your Financial Assistance Agreement - Project Officer Responsibilities
(EPA 202-B-96-002, January 2000).  Project Officers are responsible for incorporating project materials
into the official working  files located in the GLNPO central standardized filing system (see  section 5).

       A Project Inventory and Approval Form (Appendices H and I) is used to determine  if a grant or
IAG package contains all required components in appropriate format with sufficient documentation. This
is usually completed by the PO and the Grants Specialist. The GLNPO Quality Manager also will review
the application to determine the QA and peer review requirements and sign this checklist. All decisions
on the checklist are sanctioned by the GLNPO Director through signature.  The PO is responsible for
submitting the checklist to the Quality Manager. The Quality Manager also will use the  information for
tracking progress on the development of project quality system documentation.

       For assistance agreements,  SOWs are usually developed jointly. However, once the SOW is
completed, the parties also must agree on the quality standards for assuring the product or service. It is
the responsibility of the PO to be aware of EPA QA policy and to work with the GLNPO Quality
Manager to represent these standards during the development of the project's SOW.

       All assistance agreements originating within GLNPO must meet established administrative and
quality assurance requirements in the latest editions of the following:

       >  Assistance Administration Manual, EPA Directive 5700, 1984 Edition (or later)
       >  EPA Order 5700.1, Policy for Distinguishing Between Assistance and Acquisition, March 22,
           1994
       >  EPA Order 5730.1, Policy and Procedures for Funding Assistance Agreements,  January 21,
           1994
       >  40 CFR Part 30, Grants and Agreements with Institutions of Higher Education,  Hospitals, and
           Other Non-Profit Organizations
       >  40 CFR Part 31, Uniform Administrative Requirements for Grants and Cooperative
           Agreements to State  and Local Governments
       >  40 CFR Part 35, State and Local Assistance

       As stated in Managing Your Financial Assistance Agreement (EPA 202-B-94-001, May 1994), it
is Agency policy that applicants are required to develop and implement quality management practices for
all projects involving environmentally-related measurements or data generation.  These practices consist
of policies, procedures, specifications, standards, and documentation which will produce data of sufficient
quality to meet project objectives and will minimize loss of data due to out-of-control conditions or
malfunctions.  All applicants for grants or cooperative agreements involving environmental programs
shall submit quality system documentation which describes the quality system implemented by the
applicant, which may be in the form of a quality management plan or equivalent documentation. In
keeping with the graded approach described throughout this plan, GLNPO policy requires that all parties
receiving EPA grants/financial assistance under which EICAAs are performed include either a quality
assurance project plan that has been prepared in accordance with EPA Requirements for Quality
Assurance Project Plans (Final, March 2001), or equivalent quality system documentation.

       The applicant's quality system documentation shall indicate whether the assistance involves an
environmental information collection or use. The applicant is requested to submit a description of its
program or project as part of the workplan submitted with Standard Form 424.

       The level of documentation must be established by GLNPO staff when planning for the grant or
financial assistance. If the applicant has an EPA-approved QMP or QAPP and it covers the project in the
application, then they need only reference the plan in their application. The quality assurance project plan
must be acceptable to the Award Official in order to receive a grant award.

       The grant applicant is responsible for preparing the quality system documentation, which is then
reviewed and certified by the Quality Manager or his designee prior to initiation of the EICAA.  For
financial assistance grants under the purview of Regions, the Regional Quality Assurance Officer or his
designee is responsible for the review and approval of the quality system documentation.  At the request
of the Regional Quality Assurance Officer, the quality system documentation also may be reviewed and
cosigned by GLNPO's Quality Manager.

       If an applicant is unfamiliar with EPA and the GLNPO quality requirements, the PO should direct
them to the appropriate quality staff, either in GLNPO or in the Office of Environmental Information.
The following are quality requirements by applicant type:

       >  If an application is for research financial assistance, the application must include a quality
           statement which either addresses certain areas or provides justification why specific areas do
           not apply [see 40 CFR 30.503(d)].
       >  If an application is from a State or Tribal government (except for a wastewater treatment
           construction grant) the applicant must define their plans for completion of the necessary
           quality system documentation.
       >  All other applicants must submit quality system documentation with their application.

       Whether a grant or cooperative agreement involves environmental information collection or use
is determined by the GLNPO PO in consultation with the Quality Management Team and a review of the
project workplan. Project Officers approve projects, subject to their terms and conditions, during the
"award phase" of the project. At that time, they review award documentation prepared by the Assistance
Section, develop and initial a transmittal letter, and pass the package "up the chain" for sign-off by Team
Leaders, Management, and the Great Lakes National Program Manager.

       The applicant's quality system documentation will be reviewed and approved as a condition for
award of any assistance agreement. The quality system documentation must be submitted as part of the
application (unless GLNPO-approved quality system documentation is cited in the application).

If the quality system documentation is not submitted as part of the application, GLNPO will, in some
cases, fund the project and include a term and condition in the assistance agreement. This term and
condition requires the recipient to submit the quality system documentation within a specified time after
award of the agreement and notifies the recipient that they may not begin the EICAA until the GLNPO
PO informs them that the quality system documentation has been approved (Section 4.2.3).

       When States receiving funds from GLNPO agree to enter into performance partnership
agreements with GLNPO, the performance partnership agreements will be used as a mechanism to define
the quality system requirements for the effort and to establish the respective roles and responsibilities of
the State and GLNPO in quality management activities.

       Interagency agreements that are funded by GLNPO should include quality system documentation
requirements in the agreement. Because GLNPO cannot unilaterally impose such requirements, these
requirements must be negotiated into each agreement. Policies and administrative procedures governing
interagency agreements are defined in Chapter 5 of Managing Your Financial Assistance Agreement. The
GLNPO quality system requirements related to environmental data apply to all activities funded by
GLNPO through interagency agreements.  Cooperative agreements that will involve EICAAs must adhere
to the quality system documentation requirements in 40  CFR 30.503.  These standards must be included
explicitly in all cooperative funding agreements.

       All interagency agreements with EICAAs which GLNPO funds, or participates in, will include
quality system documentation. Where GLNPO is providing funds to another organization, that
organization is responsible for preparing the quality system documentation. If the other organization has
equivalent requirements for quality system documentation, that guidance may be employed. If there are
not comparable quality system procedures, the quality system procedures agreeable to both parties must
be negotiated prior to initiation of the program or effort  and are attached to the Memorandum of Decision.
 The quality system documentation will be reviewed and certified by GLNPO's Quality Manager prior to
initiation of the EICAA.  All proposed cooperative funding agreements shall be reviewed to determine the
applicability of quality system requirements as defined in CIO Order 2105. This determination shall be
documented by the GLNPO Quality Management Team.

       Where a quality management plan is required, the plan shall be prepared in accordance with the
specifications provided in the most current version of EPA Requirements for Quality Management Plans
(QA/R-2), which describes the quality system implemented by the party involved in the environmental
program. The plan shall define the approving officials of the plan, which, at a minimum will include the
GLNPO Quality Manager.

4.2.3  SPECIAL CONDITIONS

       Special conditions are usually included in assistance agreements. The PO will list the conditions to
which project participants must adhere. One of these conditions relates to quality system
documentation. Any assistance  agreement that includes environmental information collection activities
must include the following statement:

       Projects involving collection of environmental data (measurements or information that describe
       environmental processes, location, or conditions; ecological or health effects and consequences;
       or the performance of environmental technology) must meet the "Quality Systems for
       Environmental Data and Technology Programs  - Requirements with Guidance for Use,"
       ANSI/ASQ E4-2004. "Quality System Documentation" includes a Quality Management Plan
       (QMP), a Quality Assurance Project Plan  (QAPP), or such other documentation which
       demonstrates compliance with ANSI/ASQ E4-2004.

       An applicant with current, approved Quality System Documentation will, by the earlier of (i) the
       30th day prior to collection of environmental data and (ii) the 90th day after the project start date,
       notify GLNPO's Quality Assurance Manager of the way it is applying the above standard to this
       project. In all other cases, Quality System Documentation shall be submitted for approval to
       GLNPO by the earlier of (i) the 30th day prior to collection of environmental data and (ii) the 90th
       day after the project start date. Costs associated with data collection are not allowable costs until
       Quality System Documentation is approved by the GLNPO Quality Manager.

       Contact GLNPO's Quality Manager, Louis Blume (312) 353-2317 with questions or to request
       sample documentation. Further guidance is available in the Grants Requirements/Instruction in
        the Application Kit and from http://www.epa.gov/quality/qa_docs.html, which includes the
       document "QA/R-5: EPA Requirements for Quality Assurance Project Plans."

In some cases, exceptions to this condition are made. For example, the time constraint may be lengthened
in cases where GLNPO is working with State agencies and the funds are going to a subcontract.
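
       The timing rule in the condition above ("the earlier of ...") reduces to a simple date calculation.
The sketch below is one reading of that rule, offered for illustration only; the dates used are invented.

from datetime import date, timedelta

def qsd_due_date(project_start, first_data_collection):
    """Quality System Documentation deadline under the special condition above:
    the earlier of (i) 30 days before environmental data collection begins and
    (ii) 90 days after the project start date."""
    return min(first_data_collection - timedelta(days=30),
               project_start + timedelta(days=90))

# Example: project starts 2008-06-01 and sampling is planned for 2008-10-15.
print(qsd_due_date(date(2008, 6, 1), date(2008, 10, 15)))   # 2008-08-30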


                                        Section 5
                          Document Control and Records

       Organizations that perform EICAAs must establish and maintain procedures for the timely
preparation, review, approval, issuance, use, control, revision and maintenance of documents and records.
 A document is any volume that contains information which describes, defines, specifies, reports,
certifies, or provides data or results pertaining to environmental programs. As defined in the Records
Disposal Act of 1943 (now 44 U.S.C.  3301), records are: "...books, papers, maps, photographs, machine
readable materials, or other documentary materials, regardless of physical form or characteristics, made or
received by an agency of the United States Government under Federal Law or in connection with the
transaction of public business and preserved or appropriate for preservation by that agency or its
legitimate successor as evidence of the organization, functions, policies, decisions, procedures,
operations, or other activities of the Government or because of the informational value of data in them..."
 GLNPO adheres to the current EPA records management program and policy to ensure compliance with
federal laws and regulations, EPA policies, and best practices. This section will define GLNPO's
document control and records procedures.

5.1    MANAGEMENT OF DOCUMENTS AND RECORDS

       GLNPO has centralized the office function to manage documents and records by establishing a
staff role, the document control coordinator (DCC). The staff member who fulfills this role is within the
Communications and Reporting Team in the Policy Coordination and Communications Branch. The DCC
is responsible for the following activities:

       >  Coordinating assignment of EPA document numbers,
       >  Conducting review of documents as requested, and
       >  Maintaining inventory of all GLNPO documents with EPA numbers.

       POs are responsible for submitting deliverables associated with an EICAA (planning documents,
progress reports, final reports, etc.) to the DCC for entry into the  system. The DCC also will maintain
submitted copies of these deliverables in central marked files for proper storage and protection from
degradation.  Soft-copies are maintained on GLNPO's local area network (LAN). Backups of the LAN
are made periodically and maintained offsite. Document preparation, review, and approval will be
dependent upon the type of document being produced. For example, an internal document will have
different preparation, review, and approval requirements than an external document. The appropriate
process for each document will be determined by the task leader and the immediate supervisor. The process for
quality system documentation is discussed in Section 2.

5.2    VERSION CONTROL

       In order to ensure that GLNPO staff and involved parties are using current documents, GLNPO
uses version control procedures that are consistent with ISO 9000 for documents that support critical
activities. Currently, these include: 1) this Quality Management Plan, 2) the SOP document in support of
GLNPO's base monitoring program, Sampling and Analytical Procedures for GLNPO's Open Lake
Survey of the Great Lakes, and 3) the GLNPO Health,  Safety, and Environmental Compliance Manual.
GLNPO's procedure involves placing these controlled copies in blue binders for easy identification.
Controlled copies of a particular document will be provided to individuals with signature approval for
that document. All GLNPO staff are informed of this procedure during GLNPO's training on the quality
management system. POs are responsible for ensuring that all relevant parties involved in EICAAs are
aware of GLNPO's controlled copy policy so that they can locate the current document when needed.

-------
GLNPO QMP
Revision: 03
May 2008
Page 52 of 96	

When a new version of a controlled document is created, a summary of revisions will be maintained with
the controlled copies.
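
       As an illustration of the record-keeping described above, a summary-of-revisions entry for a
controlled document could be represented as a simple structured record. The sketch below uses
hypothetical field names and is not GLNPO's documented procedure.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class RevisionEntry:
        # One line in a controlled document's summary of revisions (hypothetical fields).
        revision: str        # e.g., "03"
        revision_date: date
        summary: str         # brief description of what changed
        approved_by: str     # individual with signature approval for the document

    @dataclass
    class ControlledDocument:
        title: str
        copy_holders: List[str]                          # holders of controlled (blue binder) copies
        revisions: List[RevisionEntry] = field(default_factory=list)

        def add_revision(self, entry: RevisionEntry) -> None:
            # The summary of revisions is kept with the controlled copies, newest entry last.
            self.revisions.append(entry)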

5.3   INFORMATION QUALITY GUIDELINES

       EPA developed Information Quality Guidelines to comply with an Office of Management and
Budget (OMB) guideline, Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and
Integrity of Information Disseminated by Federal Agencies (67 FR 8452, February 22, 2002). Section 515
of the Treasury and General Government Appropriations Act for fiscal year (FY) 2001 (Public Law
106-554) directed OMB to issue guidelines that "provide policy and procedural guidance to Federal
agencies for ensuring and maximizing the quality, objectivity, utility, and integrity of information,
including statistical information, disseminated by Federal agencies." EPA posted the final EPA
Information Quality Guidelines (IQGs) document, entitled Guidelines for Ensuring and Maximizing the
Quality, Objectivity, Utility, and Integrity of Information Disseminated by the Environmental Protection
Agency, on OEI's website on October 2, 2002. Information subject to the IQGs includes:

       >  Information produced by EPA to support or represent EPA's viewpoint,
       >  Information produced by EPA to formulate or support a regulation, guidance, or other
           Agency decision or position,
       >  Information from an external provider if EPA uses the information in a manner that
           reasonably suggests EPA endorses or agrees with the information,
       >  Information from an external provider if EPA uses the information to support or represent
           EPA's viewpoint, and
       >  Information from an external provider if EPA directs the outside party to disseminate the
           information on EPA's behalf.

       GLNPO believes its procedures will enable it to comply with these guidelines, and it is
GLNPO's goal to implement these guidelines as appropriate throughout the program. Implementation of
the requirements specified in these guidelines will be administered through GLNPO's quality system.
Specifically, the Information Quality Guidelines will be implemented through the Communications and
Reporting Team.  The Team Leader for the Communications and Reporting Team will serve as the
Divisional Tracking Contact and a Primary Consultant for this initiative. The Team Leader will be
supported by the Document Control Coordinator, whose role at GLNPO was established this year (see
Section 5.1). GLNPO's Quality Manager will serve as an IQG primary consultant and also as the IQG
Trainer. Training on IQGs will be included in the training course, Overview of GLNPO's Quality
System, discussed in Section 3. GLNPO will report on the status of this implementation in the QA annual
report and workplan.  The GLNPO Information Quality Products Approval Form serves as Appendix T to
this QMP.

5.3.1  STANDARD OPERATING PROCEDURES FOR PRE-DISSEMINATION REVIEW

       As part of the implementation of IQGs, GLNPO is developing standard operating procedures for
pre-dissemination review of GLNPO documents subject to the guidelines. These procedures will include
an approval process of pertinent documents by the GLNPO Director. The GLNPO Director may request
that other GLNPO staff with applicable experience, such as Team Leaders and the GLNPO Quality
Manager, participate in the review of particular documents, as he deems necessary. Formal peer  review,
as discussed in Section 7.3, also may be part  of the pre-dissemination review procedures. Currently,
GLNPO plans to track pre-dissemination review in GLNPO's Project Tracking Database described in
Section 2.6.

-------
                                                                              GLNPO QMP
                                                                               Revision: 03
                                                                                 May 2008
                                                                               Page 53 of 96
5.3.2  STANDARD OPERATING PROCEDURES FOR REQUEST FOR CORRECTION

       GLNPO is developing standard operating procedures for requests for correction (RFC) pertaining
to GLNPO documents. A response to an RFC will be processed within 90 days of receipt. If a correction
is made, the GLNPO Director will provide signature approval of the correction. If the correction is
denied, substantiation to defend the decision will be signed by the GLNPO Director. Currently, GLNPO
plans to track requests for correction in GLNPO's Project Tracking Database described in Section 2.6.

-------
                                                                               GLNPO QMP
                                                                                Revision: 03
                                                                                  May 2008
	Page 55 of 96

                                       Section 6
                             Information Management

       EPA's OEI manages EPA's information technology policy and infrastructure and oversees
compliance with Federal and Agency information technology statutes, regulations, and standards. With Agency Directive
2100, Information Resources Management Policy Manual, EPA established a policy framework for
information management at EPA. This manual defines information resources management as
encompassing activities associated with planning, budgeting, organizing, directing, and controlling
information.  GLNPO's efforts to manage environmental information will comply with Agency Directive
2100 and employ this manual and other guidance and planning documents developed by OEI. In
addition, all information management system development, enhancement, and modernization efforts
comply with the most recent versions of the System Design and Development Guidance (EPA Directive
2182, April 30, 1993) and the Operations and Maintenance Manual (EPA Directive 2181, April 1990)
available from the Office of Environmental Information. The IRM policy, standards, guidance, and
planning documents are listed on OEI's website.

       As GLNPO's information needs expand, the Office must develop and maintain hardware,
software, and information systems that  efficiently support GLNPO environmental information collection
activities and are compatible with Federal and State agencies. GLNPO's Information Management and Data Integration (IMDI) Team leads development
and implementation of GLNPO's information management policy.

6.1    GLNPO INFORMATION  MANAGEMENT SYSTEM

       The IMDI Team provides leadership and support to GLNPO and other involved parties in the
storage, access, and  retrieval of Great Lakes environmental information. The IMDI Team is  responsible
for:

       >  defining GLNPO information management policy,
       >  developing and maintaining information management systems (e.g., hardware, software, and
          networks),
       >  ensuring conformance with EPA and Federal information management policy,
       >  reviewing and processing hardware and software requests,
       >  providing technical support to operate information management systems,
       >  serving  as the contact point for electronic data dissemination,
       >  supporting specialized processing requirements (geographical information systems [GIS],
          remote sensing, etc.), and
       >  providing database administration and management.

       The IMDI Team oversees and coordinates all information management activities at GLNPO. The
IMDI Team assists GLNPO project planners with communicating pertinent details of GLNPO's
Information Management Systems to parties involved in GLNPO-funded EICAAs and facilitating their
use.  These systems  are made available  so that States, agencies, and organizations involved in GLNPO
EICAAs can implement them if desired.

-------
GLNPO QMP
Revision: 03
May 2008
Page 56 of 96	

       Most GLNPO employees transfer and retrieve information electronically using personal
computers (PCs). GLNPO PCs run a number of communications packages and are linked through LANs,
which allow groups of PCs to share information, applications, and equipment such as printers and
plotters.

       Central to GLNPO's information management  system is a computerized database system to house
environmental monitoring data. The system was developed and initially implemented to support the Lake
Michigan Mass Balance Study (LMMB) and is now supporting aspects of GLNPO's Base monitoring
program and other special studies. The Oracle-based system, the Great Lakes Environmental Monitoring
Database (GLENDA), was developed to provide data entry, storage, access, and analysis capabilities to
meet the needs of GLNPO staff, LMMB study modelers, and other potential users of Great Lakes data.

       Development of GLENDA began in 1993, based on the user requirements and logical design
from U.S. EPA's STORET modernization project. The close association with STORET has limited the
duplication of effort common to database development projects, and ensures maximum portability to the
future central home of most environmental data. GLENDA was developed with the following guiding
principles:

       >  True multi-media scope -  water, air, sediment, taxonomy, fish tissue, fish diet, and
           meteorology data can all be housed in the database,
       >  Data of documented quality - data quality is documented by including results of quality
           control parameters,
       >  Extensive  contextual indicators - ensures data longevity by including enough information
           to allow future or secondary users to make  use of the data,
       >  Flexible and expandable - the database is able to accept data from any Great Lakes
           monitoring project, and
       >  National-level compatibility - GLENDA  is compatible with STORET and allows ease of
           transfer between these large databases.

       During GLNPO's Water Quality Survey (WQS) of the Great Lakes in Summer 2000,  GLNPO
began phasing in the use of an on-board GLENDA data entry system designed to capture data directly
into the database on a daily basis.  The purpose of this pilot was to test the entry system as a means of
facilitating streamlined upload of survey data to GLENDA and dissemination of survey results to
interested parties. The  data entry system is available on the Lake Guardian and also  at GLNPO
headquarters.  The data entry system can be used for real-time data entry or by  entering data recorded on
hard-copy data forms. The system is designed to include real-time data entry checks to prevent analysts or
technical staff from entering "nonsensical" values.
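
       The kind of real-time check described above can be sketched as a simple range validation. The
acceptance ranges shown are hypothetical examples and do not reproduce GLENDA's actual
configuration.

    # Hypothetical acceptance ranges for a few water quality parameters (illustrative only).
    ACCEPTANCE_RANGES = {
        "water_temperature_c": (-1.0, 35.0),
        "dissolved_oxygen_mg_l": (0.0, 20.0),
        "ph": (0.0, 14.0),
    }

    def validate_entry(parameter: str, value: float) -> bool:
        """Reject 'nonsensical' values at the point of data entry."""
        low, high = ACCEPTANCE_RANGES.get(parameter, (float("-inf"), float("inf")))
        return low <= value <= high

    assert validate_entry("ph", 8.2)        # plausible value accepted
    assert not validate_entry("ph", 72.0)   # nonsensical value rejected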

       The WQS Chief Scientist has primary responsibility for assuring that all data gathered in the
survey are documented. Documentation includes raw instrument level printouts, summary bench sheets,
and electronic records generated on board ship and in the laboratories.  All shipboard-generated  strip
charts, bench records, and computer printouts are kept in a folder, indexed by station, until the remaining
samples are transferred to involved laboratories. All raw data are assembled and indexed by parameter,
by lake, and by survey  leg. All onboard results will be  recorded in the GLENDA data entry system, or on
hard-copy or soft-copy data forms for subsequent entry into the onboard system. These files will be
transferred to a compact disc or floppy diskettes.

       For extramural programs, GLNPO POs are responsible for obtaining all appropriate data derived
from GLNPO-funded EICAAs. At the time of project planning, the data reporting requirements,
including the type of data (raw, verified, QA/QC), and the media (paper, electronic, disk, tape, etc.), must
be explicitly determined and documented in the quality system documentation. It is highly recommended

-------
                                                                                  GLNPO QMP
                                                                                  Revision: 03
                                                                                     May 2008
	Page 57 of 96

that all environmental monitoring data funded by GLNPO be reported according to the data reporting
standards (Section 6.3).

6.2    HARDWARE AND SOFTWARE REQUIREMENTS

       The IMDI Team ensures that GLNPO is conforming to all Federal and EPA standards for
hardware and software and in-house information management systems. The Office of Environmental
Information develops standards for EPA data processing and telecommunications. They also provide
technical assistance and advice to EPA and State agencies concerning the acquisition and implementation
of information management technology. As part of this assistance, EPA developed an information
technology architecture road map that establishes the Agency's information technology portfolio, as
required under the Information Technology Management Reform Act of 1996. The road map forms the
basis for the selection and deployment of computing platforms and network connectivity between
computing platforms, as well as the software and related products that interconnect computing platforms.
GLNPO conforms with this road map when purchasing hardware and software. In addition, GLNPO
complies with the Delegation of Procurement Authority Guide to ensure that purchased software will
meet user requirements and will comply with the Office of Environmental Information standards. The
IMDI Team relies on guidance and planning documents developed by OEI on technology infrastructure to
purchase and manage information technology systems.

       The IMDI Team regularly conducts user needs assessments of GLNPO staff to identify when
additional infrastructure is needed. Once a need is defined, the IMDI Team solicits various vendors for
software demonstrations. Only vendors that comply with Agency information resources management
standards are solicited. The IMDI Team, along with appropriate GLNPO staff, evaluates the software to
determine its performance and future capabilities. Prior to purchase, the IMDI Team fully evaluates software in terms
of its intended use. Software is selected on the basis of minimum performance standards and cost.

       Software also may be developed in-house and may involve smaller, specialized information
systems that could include small databases, spreadsheets, and data entry tools. Many of these systems are
based on commercially-available software  and may only be employed for short periods.  In these cases,
the information system standards may not be applicable and may do little to ensure the quality of those
systems.  GLNPO's graded approach applies to planning and documentation of software development.
The PO and the Branch Chief are responsible for identifying when such "minor" information systems will
be employed and documenting all efforts by the project staff to ensure their quality.  Project
documentation should include a detailed description of the algorithm or software process used in the
project. For larger-scope software development projects, proper documentation must be developed that
includes:

       > Hard-copy documentation of the software
       > Developer's name
       > Names of all current maintenance personnel
       > Intended use of the software package (capabilities & non-capabilities)
       > Detailed process used to verify the software, including test examples
       >  Software code with sufficiently detailed comments to ensure understanding by other analysts
          or programmers according to EPA coding policy

       A formal quality management process must be developed for complex software development
projects (e.g., mathematical models).  Prior to initiation of these projects, GLNPO staff must prepare
planning  documentation that includes a section devoted to the quality management of the software
system. The documentation should address the planning, budgeting, organizing, directing, and training

-------
GLNPO QMP
Revision: 03
May 2008
Page 58 of 96	

needs associated with the software development project.  The planning documentation also should address
audits and tests of the software at various phases to assure the integrity of the software.

6.3   REPORTING STANDARDS

       To make information usable across GLNPO programs and shareable with other EPA Regions
and laboratories, emphasis must be placed on improving information compatibility. The EPA
Data Standards Program is established and documented in EPA Directive 2100 Information Resources
Management Policy Manual. GLNPO adheres to this program and all mandatory Agency data standard
policies.

       GLNPO devised reporting standards for data submissions for its EICAAs in accordance with
Agency policy. GLNPO's data standards are designed to ensure consistency in reporting and facilitate
data verification, data validation, and database development. GLNPO's data standard conforms to the
QA/QC codes derived from EPA Order 2180.2, Appendix B. To date, GLNPO has developed
standards for the following data: air/water analyses, fish tissue, fish diet, and sediments. The latest
reporting standards are available at the following web page:
http://www.epa.gov/glnpo/lmmb/rptstds/index.html. This site will be updated periodically as the
reporting standards are changed. The reporting standards also can be requested from GLNPO for transfer
by disk or e-mail. Examples  of GLNPO's data reporting standard and QA/QC codes are listed in
Appendix D.

       The  Field Reporting Standard contains information about the station visit, sample collection
activities, and results from in situ measurements and visual observations. Generally, it is used to record
information about what was  collected, where it was collected, how it was collected, by whom it was
collected, and the results of field measurements and observations. The Laboratory Reporting Standard
contains information about laboratory samples and analyses.  Generally, it is used to record information
about the analysis of field and QC samples and the results obtained from such analyses.  To capture all of
this information in a logical and useful way, the Reporting Standard is broken into several files - most of
which are used only in special cases.
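
       As a rough illustration of the distinction between the two standards, field and laboratory records
might be represented as shown below. The field names are hypothetical and do not reproduce the actual
Reporting Standard file layouts.

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Dict

    @dataclass
    class FieldRecord:
        # What was collected, where, how, and by whom, plus in situ results (hypothetical fields).
        station_id: str
        visit_time: datetime
        collected_by: str
        collection_method: str
        field_measurements: Dict[str, float]   # e.g., {"secchi_depth_m": 7.5}

    @dataclass
    class LabRecord:
        # Laboratory analyses of field and QC samples (hypothetical fields).
        sample_id: str
        analyte: str
        result: float
        units: str
        qa_qc_code: str                        # QA/QC code per the reporting standard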

       Data reporting requirements must be consistent with GLNPO and Agency information
management policy. POs are responsible for the format, quality and submittal of data derived from
GLNPO funded EICAAs. At the time of project planning, the type of data (raw,  verified, QA/QC) and
the media (paper, electronic, disk, tape, etc.) must be determined. A data submission is defined as a
logical combination of Reporting Standard files that are sent as a unit from an investigator to GLNPO.
Each submission is defined within a field delivery header file that contains a unique combination of
project code, sampling organization code, submission number, and version number. The submission
number and version number  are assigned by the investigator for tracking purposes. The version number is
simply a control number that distinguishes a specific collection of data from older (or newer) submissions
of the same data. The data submission concept becomes important when an investigator determines that
he/she needs to replace a data submission that was sent previously to GLNPO. An investigator can easily
replace a data submission by sending an updated, complete data submission that is identified with the
same submission number as the original data but with an incremented version number.
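
       The submission and version numbering convention described above can be illustrated with a
short sketch. The header fields follow the text; the class and helper names are hypothetical.

    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class SubmissionHeader:
        """Field delivery header identifying a data submission (per the text above)."""
        project_code: str
        sampling_org_code: str
        submission_number: int
        version_number: int

    def replacement_header(original: SubmissionHeader) -> SubmissionHeader:
        """A replacement submission keeps the same submission number
        and increments the version number."""
        return replace(original, version_number=original.version_number + 1)

    first = SubmissionHeader("LMMB", "ORG01", submission_number=3, version_number=1)
    corrected = replacement_header(first)   # same submission number, version 2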

6.3.1  GLNPO LOCATIONAL DATA POLICY

       IRM Policy Manual 2100, Chapter 13, requires geographic coordinates and associated method,
accuracy, and description codes (MAD) for all environmental measurements collected by EPA
employees, contractors, and grantees. This policy establishes the principles for collecting and
documenting latitude/longitude coordinates for facilities, sites, and monitoring and observation points

-------
                                                                                  GLNPO QMP
                                                                                  Revision: 03
                                                                                     May 2008
	Page 59 of 96

under Federal environmental programs.  The intent of this policy is to extend environmental analyses and
allow data to be integrated based upon location, thereby promoting the enhanced use of EPA's extensive
data resources for cross-media environmental analyses and management decisions. This policy
underscores EPA's commitment to establishing the data infrastructure necessary to enable data sharing
and secondary data use. To facilitate the integration of data into these systems it is important that coding
of geographic coordinates and associated attributes be standardized. All GLNPO projects, both
intramural and extramural, that include data collection, habitat restoration, or other "location dependent"
activities are required to adhere to GLNPO's Locational Data Reporting Format. The GLNPO Locational
Data Reporting Format is compliant with the Region 5 and Headquarters Locational Data Policy and was
approved by the Region 5 Locational Data Manager. The GLNPO Locational Data Reporting Format is
available on the GLNPO web page, http://www.epa.gov/glnpo/fund/ldp.html.
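
       A minimal sketch of a location record carrying coordinates plus method, accuracy, and
description (MAD) attributes is shown below. The field names are illustrative only and are not the
GLNPO Locational Data Reporting Format.

    from dataclasses import dataclass

    @dataclass
    class LocationRecord:
        """Latitude/longitude with MAD attributes (illustrative structure only)."""
        site_id: str
        latitude_dd: float      # decimal degrees; a geodetic datum such as WGS84 is assumed here
        longitude_dd: float
        method_code: str        # how the coordinates were determined (e.g., GPS)
        accuracy_code: str      # estimated accuracy of the coordinates
        description_code: str   # what the point represents (e.g., sampling location)

        def __post_init__(self):
            if not (-90.0 <= self.latitude_dd <= 90.0) or not (-180.0 <= self.longitude_dd <= 180.0):
                raise ValueError("Coordinates outside valid latitude/longitude ranges")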

6.4   INFORMATION SECURITY

       GLNPO has developed an information security plan to manage security controls of GLNPO's
LAN, named Great Lakes National Program Office Binational and Partner Network (GLNPO.net). The
purpose of GLNPO.net is to provide a development/production/collaboration Internet site available for
web projects that are binational or regional in scope without putting the U.S. EPA network at risk. The
system supports dozens of developers and hundreds of users, including GLNPO, Region 5, U.S. Army
Corps of Engineers, State, and Environment Canada staff and their contractors, as well as EPA grantees
and cooperators. GLNPO is responsible for development and implementation of the system. The users
that view pages on the site range from GLNPO partners (using user IDs and passwords) to the public, who
are directed to a particular uniform resource locator (URL) depending on content. The system is located
in the Metcalf Federal Building in Chicago, Illinois.

       The Information Security Plan is maintained by the Information Management and Data
Integration Team, which is responsible for implementation of the plan. Specifically, Pranas
Pranckevicius1, a Certified Information Systems Security Professional (CISSP) and the Team Leader and
Information Security Officer, maintains the plan. A cover page of the January 30, 2008 Information
Security Plan and the Security Accreditation and Authorization to Operate is included in Appendix E, which
includes authorizations for the plan and the table of contents for the Information Security Package. In the
plan, GLNPO documents assignment of responsibilities associated with the development and
implementation of GLNPO's system security as well as user responsibilities. GLNPO included in the
plan the rules for accessing GLNPO.net and using GLNPO  data. The plan also includes information on
training individuals to use the system and on security awareness in general. GLNPO also details
management of personnel controls for the system and incident response activities. Confidential or
sensitive data are not permitted on GLNPO.net. GLNPO is developing this system in accordance with the
following laws and regulations:

        > The Computer Security Act of 1987
        > Privacy Act
        > OMB Circular A-130, Management of Federal Information Resources
        > EPA Information Resources Management Policy Manual
        > Enterprise Technology Services Division LAN Operating Procedures Manual
1 Pranas Pranckevicius contact information: Phone - (312) 353-3437, E-mail - pranckevicius.pranas@epa.gov

-------
                                                                                  GLNPO QMP
                                                                                   Revision:  03
                                                                                     May 2008
	Page 61 of 96

                                         Section  7
                                    Quality Planning

       In order to accomplish its mission, GLNPO must effectively plan and implement environmental
information collection activities.  It is GLNPO policy that collecting environmental data of adequate
quality for its intended use can only be achieved with systematic planning prior to initiation of an EICAA.
 Further, GLNPO's quality management policy stipulates that planning processes must be adequately
documented. In accordance with GLNPO's graded approach, the level of detail in planning and
documenting the planning process should be commensurate with the importance and intended use of the
work, available resources, and the unique needs of the organization. The following sections describe the
processes GLNPO uses to plan EICAAs and document the planning process.

7.1    PROJECT PLANNING AND SCOPING

       Each year, GLNPO conducts planning meetings with local, State, Tribal, Federal, and
international Great Lakes stakeholders.  From these annual meetings and through Agency direction, Great
Lakes environmental issues of concern are identified and funds are reserved for discretionary grants and
intramural activities to address these issues.  GLNPO Teams then develop and distribute requests for
proposals (RFPs) regarding these issues of concern.  Once pre-proposals are submitted, GLNPO Teams
review them and identify those that are of interest to GLNPO in fulfilling its mission. The Quality
Management Team assists the POs in distributing to potential grantees a copy of EPA QA/R-5, EPA
Requirements for Quality Assurance Project Plans, and pertinent examples of quality system
documentation from similar past projects. Potential grantees are encouraged to submit a full proposal
based on applicable sections of QA/R-5. In this way, the quality planning process begins at the earliest
phase of the EICAA, and the probability that this process will add value to the EICAA is greatly
increased.

       Systematic planning is essential to managing quality and should be conducted by a group with
sufficient knowledge to ensure that the activities undertaken will result in a product with the level of
quality needed for its intended purpose. The planning steps listed below are one suggested approach that
GLNPO POs and task leads can use to plan effectively and meet the requirements of GLNPO's quality
system. Although the exact questions illustrated in the planning steps may not need to be answered, the
issues behind the questions need to be addressed before proceeding with the activity. A common
approach to answering these questions, and thus to planning, is to assemble a team or a work group of
knowledgeable staff to address these details. At GLNPO, the functional teams often fulfill this role with
their Management Advisor and additional participating staff with a firm technical grasp of the subject
matter. In addition, staff that control the budget and those that manage any contractors or grantees
involved in the effort also should be included in the group. It is also essential that the team consult with a
member of the Quality Management Team to ensure that GLNPO's quality system requirements are being
addressed.

Step 1 - Problem Identification

>  What is the problem and how does it relate to GLNPO's mission? (e.g., verbal statements of the
    general problem are narrowed into succinct questions that are unambiguous and can be
    answered with specific data.)

-------
GLNPO QMP
Revision: 03
May 2008
Page 62 of 96
Step 2 - Project Purpose

>  What is the primary purpose of the activity and why is it important to proceed?
>  Are environmental data required?
>  What is the schedule for completion and is it driven by forces outside of GLNPO (e.g., legislative or
    judicial deadlines)?

Step 3 - Project Design

>  What are the quality requirements for the activity?
>  What is the allowable level of uncertainty? (i.e., quantify the level of uncertainty that will be allowed
    while still being able to answer the project questions outlined in Step 1.)

Step 4 - Resource Requirements

>  What activities must be performed?
>  What staff members are needed to complete these activities? Are these staff available? If not, what
    other options exist (e.g., will staffing limitations dictate achievable project quality or project design)?
>  What resources and materials are needed to complete project activities? Are these resources/materials
    available? If not, what other options exist (e.g., will resource limitations dictate achievable project
    quality or project design)?
>  If data are required, what kind of data are needed, how will they be collected, and what are the quality
    requirements?
>  Can we achieve these requirements within the schedule, using the available technical, financial, and
    staffing resources?

Note:   Quality management is an integral part of any GLNPO EICAA and should be included as a
        budget line item.

Step 5 - Roles and Responsibilities and Project Products

>  Who is the customer and what are their expectations (e.g., senior EPA management, the public,
    Congress, the regulated community, etc.)?
>  What types of information does the customer need (i.e., summary information, detailed trends,
    graphs, GIS, etc.)?
>  Who is the supplier and what are their responsibilities (i.e., personnel responsible for management,
    planning, budgeting, reporting, etc. must be clearly identified in the planning documentation)?

Step 6 - Performance Measures

>  How can we measure the success of the project (e.g., through quantitative measures, surveys, peer
    review, etc.)?

Note:   The measure of success is an important aspect of the assessment and corrective action phases of
        the project, which are discussed  in Section 9.

-------
                                                                                 GLNPO QMP
                                                                                  Revision: 03
                                                                                    May 2008
	Page 63 of 96

       Upon completion of these steps, it should be clear whether the questions can be answered with
the resources available and at the desired level of uncertainty. The project leads may have to choose
between adding more resources or accepting less confidence in the data. In summary, systematic planning
involves determining and clearly defining the objective and developing a detailed plan to address it,
taking into account the resources allocated to support it.

       At GLNPO, the Quality Management Team is involved in planning EICAAs. The Team will
assist POs and task leads by identifying necessary quality management activities for a given project. The
Quality Management Team generally assists in:

       >  developing quality system documentation,
       >  implementing the quality system,
       >  conducting audits and system reviews,
       >  providing technical support to determine needed QC samples, and
       >  assessing data quality.

       In order to ensure that quality system requirements are considered for every EICAA, the Quality
Manager reviews every expenditure for extramural projects to  determine if quality system documentation
and peer review are needed for that project to meet GLNPO's quality policy.  A Project Inventory and
Approval Form (Appendices H and I) accompanies assistance  (grant and IAG) packages and serves to
document the Quality Manager's determination of whether quality system documentation or peer review
are needed for a given project. GLNPO senior management does not approve expenditures without a
Project Inventory and Approval Form containing the approval signature of the Quality Manager or a
designee from the Quality Management Team. This is a key part of GLNPO's quality system because it is
the first step in the inventory of all EICAAs at GLNPO.  The Quality Management Team and POs can
employ a series of checklists to assist them in making this determination.  The Quality System
Documentation Checklist provided in Appendix F can be used to guide POs, line management and the
Quality Management Team through the process of planning a project while complying with GLNPO's
quality system.

       If the project requires quality system documentation, the Quality Management Team begins
tracking the EICAA by entering information regarding the project into GLNPO's tracking database,
QATRACK.  The Quality Management Team uses QATRACK to provide the status of EICAAs and
associated required quality system documentation to GLNPO staff and management at monthly meetings.
 In addition, the database assists the Quality Management Staff in notifying POs when required
documentation is overdue. Once the documentation is overdue, GLNPO's Protocol to Address Missed
Requirements is implemented (Appendix M). The protocol addresses the failure of funded entities to
provide quality system documentation and other required work products on schedule by providing a
general framework for action including how and when to involve management. This protocol involves
notifying the PO of the delinquency through a written memo from the Quality Management Team.  An
example of this memo is provided in Appendix N.
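
       The overdue-documentation check that this tracking supports can be sketched as follows.
QATRACK's actual structure is not reproduced here; the record fields are hypothetical.

    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional

    @dataclass
    class TrackedDocument:
        """One quality system document tracked for an EICAA (hypothetical fields)."""
        project_id: str
        project_officer: str
        due_date: date
        received_date: Optional[date] = None

    def overdue_documents(records: List[TrackedDocument], today: date) -> List[TrackedDocument]:
        """Return tracked documents that are past due and not yet received,
        triggering the Protocol to Address Missed Requirements."""
        return [r for r in records if r.received_date is None and r.due_date < today]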

7.2    DOCUMENTATION

       Another critical component of GLNPO's quality system is that the planning process must be
documented.  The information generated during the  six planning steps listed above forms the basis of the
quality system document. GLNPO recognizes that documentation of the project is of utmost importance
and must address all phases of the EICAA. While the data from a project may be technically sound, lack
of proper documentation can make the data suspect and the defense of a project difficult, if not
impossible.

-------
GLNPO QMP
Revision: 03
May 2008
Page 64 of 96	

       Quality system documentation can be developed in a variety of forms. Two of the most common
forms are a quality management plan such as this one, and a quality assurance project plan for an activity
involving the collection of environmental data. The reference section of this document contains the titles
of the latest guidance and requirements documents for those plans that are available from the EPA Quality
Staff.  In some cases, a QAPP with several paragraphs describing quality system issues will be considered
equivalent to a QMP. For these cases,  the document should include discussions of management and
oversight, an approval process for any  delegated components of the projects, and independent quality
management reviews.

       These plans do not apply equally well to every decision-making activity that may be conducted
by or for GLNPO, and may apply poorly to some. Therefore, GLNPO's quality system explicitly provides for a
graded approach to the documentation  of environmental data collection activities. The most stringent
approach to such program-level documentation is a quality management plan  as described in EPA QA/R-
2 whereas the most stringent approach  to such project-level documentation is  a quality assurance project
plan as described in EPA QA/R-5. However, the planning process may be used to specify when other
forms of documentation will be employed. For example, GLNPO funds a variety of grants to States,
Tribes, and public and private organizations that advance its mission. Some of those grants involve the
collection of environmental data, but are for small dollar amounts that simply cannot support the
production of elaborate quality system  documents such as quality assurance project plans. Other
intramural activities involve the collection of environmental data that are never going to be used to make
an environmental decision, but rather are used as a means to raise public awareness of environmental
issues or provide educational outreach.  Therefore, these data collection activities need not be documented
in a format as formal as those mentioned above. For other projects, a document addressing all
components of QA/R-2 or QA/R-5 is appropriate.  For example, GLNPO has  developed a QAPP in
accordance with EPA QA/R-5 for the base monitoring program that is updated annually by POs and the
technical leads. GLNPO's graded approach will rely on EPA QA/R-2 or QA/R-5 to form the basis for
quality system documentation and will require documentation that addresses applicable components of
these documents for all GLNPO-funded EICAAs.

       According to the EPA Quality Manual for Environmental Programs,  the eight elements of the
planning process listed in Exhibit 6 must be documented. The specific details of these elements are
addressed in the six suggested planning steps described in the previous section.  Whatever form of
documentation is used, it must address  these elements of the planning process.

-------
                                                                                   GLNPO QMP
                                                                                   Revision: 03
                                                                                      May 2008
 	Page 65 of 96


                                           Exhibit 6
               Eight Elements of the Planning Process That Must  Be Documented

  1.  Identifying the project manager, the sponsoring organization and the responsible individual
     within that organization, the project personnel, the "customers" and "suppliers," and
     describing their involvement in the project
  2.  The project goal, objectives, and the questions and issues to be addressed
  3.  The project schedule, resources and budget, and milestones, and any applicable requirements
     (e.g., regulatory or contractual requirements)
  4.  The type of data needed and how those data will be used to support the project objectives
  5.  How the quantity of data needed was determined and how the criteria for the quality of the
     data were determined
  6.  How, when, and from where data will be obtained, including existing data.  Identifying any
     constraints on the data collection process
  7.  Specification of the activities during data collection that will provide the information used
     to assess data quality (i.e., field or laboratory quality control operations, audits, technical
     assessments)
  8.  How the data for the project will be analyzed, evaluated, and assessed against their
     intended use and the performance criteria established above
       It is important to establish efficient communications among all project participants to ensure
smooth operation of all phases of an EICAA. A "Project Organization" section in the quality system
documentation helps to establish the lines of communication.

       For projects or tasks involving environmental data performed through grants and cooperative
agreements (40 CFR Parts 30, 31, and 35), the planning process must identify the appropriate level of
quality system documentation that will be employed. The documentation must be reviewed and approved
by the relevant POs, task leads, and the Quality Management Team prior to the start of EICAAs.

       When working with States that have GLNPO-approved QMPs, approval authority for quality
system documentation for EICAAs under their supervision is delegated to the State.  The PO can review
quality system documentation to ensure that the technical requirements of the project are clearly met;
however, approval of the quality system is delegated to the State.  The States often request that GLNPO
review and provide input on quality system documentation.  GLNPO routinely offers its support and is
committed to working with States in the Great Lakes Region to effectively plan and implement EICAAs.

       For projects that employ data from other sources (i.e., secondary data), the level of quality system
documentation should be commensurate with the nature of the data and the decision to be made. The
Office of Environmental Information is developing guidance on using data from other sources.  When the
guidance is finalized, GLNPO will review it and, if appropriate, incorporate the guidance into the
procedures for assessing the quality of secondary data. In the meantime, GLNPO will continue to use the
planning process described in this plan to identify when secondary data will be used, to establish
acceptance criteria for the data, and to outline the manner and extent to which secondary data will be
verified.  The project staff will continue to employ professional judgment and the assessment procedures
outlined in Section 9 to ensure that the data meet the needs of the project.

       Once quality system documentation is submitted, the PO reviews the document to ensure that the
technical requirements of the project are clearly met. If approved by the PO, the quality system
documentation is forwarded to the Quality Management Team for review and approval.  The Team

-------
GLNPO QMP
Revision: 03
May 2008
Page 66 of 96	

employs formal procedures for logging, tracking, and maintaining all submitted documents as described
in Appendix S. As part of these tracking procedures, the Quality Management Team uses a Quality
System Documentation Status and Tracking Sheet (Appendix O) to monitor fulfillment of documentation
requirements.  All submitted documents are maintained by the Team in both hard-copy and soft-copy
(when available).  The Team conducts a standardized review using a checklist to document the
acceptability of all applicable requirements and determines that quality requirements are:

       >  included and acceptable,
       >  included and not acceptable,
       >  not included, or
       >  not applicable.

       The checksheets, one for reviewing QAPPs, and one for reviewing QMPs, can be found in
Appendix J. After completing the review, the Quality Management Team provides a cover memo and
check sheet with comments to the PO detailing conclusions, recommendations and any required revisions.
 When revisions are needed, the documentation is resubmitted and reviewed by the Team. GLNPO's
Quality Manager is available to assist involved parties with understanding and meeting the quality system
requirements and often participates in meetings or conference calls if requested by the  PO.

       As part of project planning, project leaders will develop timelines for the development, review,
and completion of required documentation. Appropriate reviewers of documentation also should be
identified as part of the planning process.  Documents  will be archived as detailed in Section 5.

7.2.1  GLNPO's GRADED APPROACHES TO QUALITY SYSTEM DOCUMENTATION

       GLNPO implements a graded approach to quality system documentation that is consistent with
the decision being made. Every expenditure that GLNPO makes towards EICAAs and every intramural
project that collects environmental information driven by environmental decisions should have quality
system documentation commensurate with the importance of the question that is being addressed. A
small environmental education grant will not need as much documentation as a large monitoring program
involving several agencies and large-scale sampling. At a minimum, all projects that collect
environmental information should default to quality system documentation in accordance with EPA
Requirements for Quality Management Plans (EPA QA/R-2) at the program level and EPA
Requirements for Quality Assurance Project Plans (EPA QA/R-5) at the project level, addressing all
applicable components. QMPs are not required for all of GLNPO's extramural projects, but
documentation of the quality system is. In some cases, a QAPP with several paragraphs describing
quality system  issues will be considered equivalent to a QMP. For these cases, the document should
include discussions of management  and oversight, an approval process for any delegated components of
the projects, and independent quality management reviews.  Available resources are a consideration, such
as when working with Tribes.  In these  cases, GLNPO will work more closely with organizations that do
not have the infrastructure to support a  large-scale quality system.

-------
                                                                                  GLNPO QMP
                                                                                   Revision: 03
                                                                                     May 2008
	Page 67 of 96

       The Quality Manager, in conjunction with Project Officers, will determine the appropriate level
of quality system documentation for each project.  In the past, GLNPO has used a four-tiered project
category approach to determine the detail necessary for QAPPs (GLNPO QMP, October 1997, Revision
03).  These categories have been replaced by the eleven categories listed in the table below.
Categories of Environmental Information Collection and Assessment Activities

Quality system documentation at the program level is most often required (QMPs):
       >  State agencies
       >  Consortium grants
       >  Cluster grants
       >  Repeating projects of similar scope

Quality system documentation at the project level is most often required (QAPPs), with a few paragraphs
describing the quality system, including discussions of management and oversight, an approval process
for any delegated components of the projects, and independent quality management reviews:
       >  Existing data/modeling
       >  Tribal grants
       >  Sediment assessment
       >  Habitat/Ecosystem restoration
       >  Ambient monitoring and research demonstration
       >  Pollution prevention and Environmental education
       >  Volunteer monitoring

Suggestions and requirements for quality system documentation for each of these categories are discussed
in detail in the following sections. In some cases, categories identified as most often requiring project-
level documentation may be better suited to program-level documentation.  This will be determined on a
case-by-case basis by GLNPO's Quality Manager.  Quality system documentation for programs is
discussed in Section 7.2.1.1 and quality system documentation for projects is discussed in Section 7.2.1.2.
 Examples of GLNPO's graded approaches for quality system documentation are included in Appendix K.

7.2.1.1    QUALITY SYSTEM DOCUMENTATION FOR PROGRAMS

       GLNPO often funds other organizations that conduct or oversee EICAAs. These activities fall
into four general categories: 1) consortium grants, 2) cluster grants, 3) state agency funding, and 4)
funding repeating projects with similar scope and common elements. Consortium grants include large
studies with multiple grantees and sub-grantees addressing a single issue. For example, GLNPO issued a
consortium grant for Lake Erie where many experts were assembled to address the single issue of oxygen
depletion in the lake.  Cluster grants generally involve one organization that administers multiple projects.
 GLNPO regularly administers funds to States around the Great Lakes covering a multitude of projects.
Repeating projects of similar scope are a unique type of agreement. They include projects where GLNPO
works with an approach or standard operating plan developed by a lead agency, such as The Nature
Conservancy, where multiple projects are administered individually through various grantees.

       For groups that are working with GLNPO in one of these categories, GLNPO expects the grantee
organization to have a quality system in place that is consistent with ISO 9001 and CIO Order 2105. For
major programs, EPA Requirements for Quality Management Plans (EPA QA/R-2) must serve as the
basis for quality system documentation. A list of components of a quality management plan according to
EPA QA/R-2 is provided in Table 7-1 at the end of this section. For non-major programs, a QAPP with

-------
GLNPO QMP
Revision: 03
May 2008
Page 68 of 96	

several paragraphs describing quality system issues will be considered equivalent to a QMP. For these
cases, the document should include discussions of management and oversight, an approval process for
any delegated components of the projects, and independent quality management reviews.  All of the
project planning documents, quality system documentation, and products of the projects are subject to
GLNPO review and approval.  Quality system documentation for specific projects including QAPPs and
SOPs must be provided to GLNPO upon request to facilitate a technical system or a quality system audit.
 Many POs at GLNPO request project-specific documentation in order to get an idea of the technical
direction and adequacy of specific projects, but they do not serve in an approval capacity. In cases where
GLNPO has delegated approval to other organizations, GLNPO will ensure that the quality system is
implemented as described in the quality system documentation through training and ongoing
communication with responsible quality managers and through management system reviews and technical
audits. A description of anticipated audits must be included in quality system documentation to ensure
the quality system is implemented.
       Note:  A true, functional, value-added quality system is not driven by approved documentation,
              but more so by the activities implemented on a daily basis that enhance the quality of the
              environmental decision.

       Once a quality management plan is approved, GLNPO will not micro-manage organizations or
agencies. GLNPO understands that time is needed to implement a true functioning quality system.
GLNPO will request quality system documentation only in specific instances in order to review technical
correctness and will not review documentation as a matter of course. GLNPO's Quality Management
Team will provide training on development of quality systems as requested.

       GLNPO administers funds to eight States around the Great Lakes: Minnesota, Wisconsin,
Michigan, Ohio, Indiana, Illinois, Pennsylvania, and New York.  Currently, GLNPO only recognizes the
quality management system implemented by the State of Wisconsin. For states that have obtained EPA
Regional approval of their quality management system, GLNPO will recognize this approval, as long as
the scope of GLNPO activities that the State is involved in is addressed in the quality system. GLNPO is
interested in working with each State to develop an approved quality system and will offer assistance
when requested.

       Quality  management resources often are lacking at the State level. In these cases, GLNPO may
provide assistance for reviewing quality system documentation and other quality management areas as
requested by the State. In some cases, grants to State agencies fit into the cluster grant category and are
termed performance partnership agreements.

       The quality system documentation requirements for the remaining three program-level categories
are further described below.

7.2.1.1.1   CLUSTER GRANTS

       GLNPO issues a large number of grants, many of which are quite small, and the administrative
burden can exceed GLNPO's  ability to effectively oversee and implement the projects.  GLNPO uses
cluster grants to coordinate groups of grantees and sub-grantees.  For example, GLNPO often administers
one cluster grant to the Great Lakes Commission which provides grants to 3  different agencies.  In these
agreements, GLNPO expects that project-level quality system documentation will be reviewed by a
trained quality manager. The principal investigator also can review the quality system documentation for
technical correctness and GLNPO encourages this procedure. The quality system documentation for
cluster grants must include detailed information regarding the review procedures for project-level quality
system documentation. This detailed information should discuss what is expected in project-level quality
system documentation, the procedures that will be implemented to review the quality system

-------
                                                                                   GLNPO QMP
                                                                                    Revision:  03
                                                                                      May 2008
	Page 69 of 96

documentation, and the roles and responsibilities of involved parties regarding the review and approval of
the project-level quality system documentation. The GLNPO Quality Management Team can assist
grantees with training for reviewing quality system documentation and can assist in the review, if
requested. Grantees should conduct a quality system audit at least once during the project, preferably
close to the start of the project.  Details of how this assessment will be conducted and who is responsible
for ensuring that it is conducted also should be included in the quality system documentation. Cluster
grants require regular coordination among quality managers at grantee and sub-grantee organizations and
GLNPO.

7.2.1.1.2  CONSORTIUM  GRANTS

       Consortium grants are similar to cluster grants, except that groups of grantees focus on one
key question. The quality system documentation requirements are the same as for cluster grants;
however, the focus is on the over-arching quality management plan, whereas for cluster grants the focus
is on project-specific quality system documentation. The Great Lakes Commission Quality Management
Plan is provided in Appendix K as example documentation for a consortium grant.

7.2.1.1.3  REPEATING PROJECTS OF SIMILAR SCOPE

       GLNPO works with various non-governmental organizations  that conduct projects of similar
scope with a variety of grantees. For example, GLNPO often enters into agreements with The Nature
Conservancy. In implementing projects, each grantee uses standard procedures  developed by The Nature
Conservancy that include:

       1.  Data dictionary,
       2.  Data reporting  procedures,
       3.  Data use and security restrictions, and
       4.  Standard operating procedures for species identification and abundance measures.

       GLNPO encourages organizations that implement these types of projects to develop a single
quality system document to cover all of these types of activities in order to save  resources needed to
develop and review multiple quality system documents. A quality management plan at the national level
will ease the administrative burden at the local level. GLNPO will assist these organizations in
developing program-level quality system documentation. A generic quality assurance project plan for the
repeating projects also may be appropriate. In these cases, additional  event-specific documentation, such
as a sampling and analysis  plan that documents specific information that is not addressed in the generic
QAPP, may be sufficient.

-------
GLNPO QMP
Revision: 03
May 2008
Page 70 of 96
Table 7-1. Required Elements in a Quality Management Plan (QMP)

A1  Management and Organization:  Give an overview of management and organization. Include title page,
    appropriate signatures, and organizational chart.
A2  Organization's QA Policy Statement:  State the importance of QA and QC, general objectives/goals of
    the Quality System, and policy for resource allocation for the Quality System.
A3  Distribution List:  Distribution list for the QMP revisions and final guidance.
A4  QA Manager/Staff Authorities:  Discuss the responsibilities and authorities of the QA Manager and
    other QA staff; document the independence of the QA Manager.
A5  Technical Activities/Programs:  Discuss specific programs that require quality management controls,
    where oversight of extramural programs is needed to assure data quality, and where internal
    coordination of QA and QC activities needs to occur.
B1  Quality System Components:  Describe the organization's quality system.
B2  Principal Components:  Discuss the principal components (quality system documentation, annual
    systems review, management assessments, etc.) which comprise the quality system.
B3  Tools for Implementing Components:  QMPs, management assessments, technical assessments,
    systematic planning, SOPs, QAPPs, and data quality assessments.
C1  Personnel Qualifications & Training:  Describe the organization's training policy, processes, and
    documentation.
D1  Procurement of Items & Services:  Describe the roles, responsibilities, and authorities of management
    and staff which pertain to all appropriate procurement documents or extramural agreements.
D2  Procurement Document Approval:  Describe review and approval procedures for procurement
    documents.
D3  Solicitation Response Approval:  Describe review and approval processes for all applicable responses.
E1  Documents and Records:  Discuss procedures for documents and records.
F1  Computer Hardware and Software:  Describe QA and QC processes for the use of computer hardware
    and software to support environmental data operations.
G1  Planning:  Describe the systematic planning process for environmental programs, discuss the QAPP
    process, and discuss the organization's secondary data policy.
H1  Implementation of Work Processes:  Describe developing and implementing procedures, planned
    procedures, and controlling measures.
I1  Assessment and Response:  Describe how and by whom assessments of environmental programs are
    planned, conducted, and evaluated, and the processes by which management determines assessment
    activities and tools appropriate for a particular project and expected frequency of use.
J1  Quality Improvement:  Describe how the organization will detect and prevent quality problems and
    ensure continual quality improvement; communicate expectations about quality improvement to staff.

GLNPO added note:  The above elements are the minimum requirements for a QMP and, therefore, should be
used during QMP development.
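
       As an illustration only (not a requirement of this QMP), the element list in Table 7-1 can be carried
as a simple completeness checklist during QMP development. The following Python sketch flags any required
element not yet addressed in a draft; the set of elements marked as addressed is hypothetical and is shown
only to demonstrate the idea.

    # Illustrative sketch: checking a draft QMP against the Table 7-1 element list.
    # Element codes and titles come from Table 7-1; the draft's status is hypothetical.
    REQUIRED_QMP_ELEMENTS = {
        "A1": "Management and Organization", "A2": "Organization's QA Policy Statement",
        "A3": "Distribution List", "A4": "QA Manager/Staff Authorities",
        "A5": "Technical Activities/Programs", "B1": "Quality System Components",
        "B2": "Principal Components", "B3": "Tools for Implementing Components",
        "C1": "Personnel Qualifications & Training", "D1": "Procurement of Items & Services",
        "D2": "Procurement Document Approval", "D3": "Solicitation Response Approval",
        "E1": "Documents and Records", "F1": "Computer Hardware and Software",
        "G1": "Planning", "H1": "Implementation of Work Processes",
        "I1": "Assessment and Response", "J1": "Quality Improvement",
    }

    def missing_elements(addressed):
        """Return the required element codes and titles not yet addressed in a draft QMP."""
        return [f"{code} ({title})"
                for code, title in REQUIRED_QMP_ELEMENTS.items()
                if code not in addressed]

    # Hypothetical draft that has not yet covered elements F1 and I1.
    draft_addressed = set(REQUIRED_QMP_ELEMENTS) - {"F1", "I1"}
    for gap in missing_elements(draft_addressed):
        print("Draft QMP still needs element", gap)

The same approach can be applied to the QAPP elements listed in Table 7-2.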

-------
                                                                                   GLNPO QMP
                                                                                    Revision: 03
                                                                                      May 2008
	Page 71 of 96


7.2.1.2    QUALITY SYSTEM DOCUMENTATION FOR PROJECTS

       GLNPO's environmental information collection activities typically fall into the seven categories
listed in 7.2.1. In accordance with GLNPO's graded approach, quality system documentation for these
activities will vary with the characteristics of the specific project, such as the impact of the decision, the
cost of the project, and the project objective. The required documentation for all of the following
categories must be based on EPA QA/R-5. A list of the components of quality system documentation
according to EPA QA/R-5 is provided in Table 7-2 at the end of this section; however, not all components
will apply to all projects. Suggestions for quality system documentation for the seven general categories
are described below.

7.2.1.2.1   EXISTING DATA/MODELING

       GLNPO environmental information activities can involve modeling and the use of existing data.
Project planning for modeling projects is important in order to ensure that the model is scientifically
sound, robust and defensible. EPA's Quality Staff has developed a guidance document (EPA QA/G-5M)
that can provide assistance in planning and implementing modeling projects. EPA's Quality Staff also
has developed a draft checklist, Using Data from Other Sources - A Checklist for Quality Concerns, for
use in planning modeling and other projects using existing data. The checklist includes the following
steps (adapted by GLNPO; an illustrative screening sketch follows the list):

       1. Identify the decision you are making or the project objectives.
       2. Identify the data and information from outside sources proposed for the project.
       3. Prioritize data needed for decision (i.e., what are the most important pieces of data).
       4. Determine whether the data have any constraints affecting their use in the new project.
       5. Determine where the acquired data will be used in the decision-making process.
       6. Scrutinize data for quality concerns pertinent to the intended use.
       7. Document your analysis plan.
       8. Execute your analyses and document the outcome appropriately.
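
       Where the existing data are available electronically, steps 4 and 6 of the checklist can be supported
with a simple screening script. The sketch below is illustrative only; the field names, acceptance limits, and
example records are hypothetical and would be defined during systematic planning for a specific project.

    # Minimal sketch of screening existing (secondary) data against project acceptance
    # criteria (checklist steps 4 and 6).  Field names, limits, and records are hypothetical.
    records = [
        {"station": "LM-01", "total_phosphorus_ug_L": 8.2, "detection_limit_ug_L": 0.5, "year": 1998},
        {"station": "LM-02", "total_phosphorus_ug_L": None, "detection_limit_ug_L": 0.5, "year": 1987},
    ]

    def quality_concerns(rec, earliest_year=1990, required_dl_ug_L=1.0):
        """Return a list of quality concerns that could constrain use of one record."""
        concerns = []
        if rec["total_phosphorus_ug_L"] is None:
            concerns.append("result missing")
        if rec["year"] < earliest_year:
            concerns.append("collected before the period of interest")
        if rec["detection_limit_ug_L"] > required_dl_ug_L:
            concerns.append("detection limit too high for the intended use")
        return concerns

    for rec in records:
        issues = quality_concerns(rec)
        status = "usable as-is" if not issues else "; ".join(issues)
        print(rec["station"], "->", status)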

       A past GLNPO modeling project, Lake Erie Total Phosphorus Loads, illustrates how the checklist
can be used for planning and documenting a project (see Appendix K). An example of quality system
documentation for modeling, Quality Assurance Project Plan for Lake Erie Total Phosphorus Loads:
1996 to 2000, also is provided in Appendix K.  In addition,  an example of quality system documentation
for secondary data, Quality Assurance Project Plan for Great Lakes Sediment Data Support, is provided
in Appendix K.

       In general,  for modeling projects and projects that use existing data, the  expertise of personnel
involved in the modeling effort is of utmost importance. Curriculum vitae and resumes of key personnel
should be included with the quality system documentation. The roles of involved personnel and their
expertise are critical because of the subjective components of model development and application. Experts
who are needed to provide related technical assistance, such as chemists, also must be identified in the
documentation.

       For modeling projects and other projects that use existing data, a QAPP  based on the applicable
components of EPA QA/R-5 would be considered sufficient to define the appropriate quality system. For
modeling projects, the importance of having all of the quality system documentation finalized prior to
initiation of the project is not as great as it is for other types of projects (such as when cost constraints
involved with sampling efforts require thorough documentation prior to sampling). GLNPO understands
that for modeling projects, much of the planning will depend on the outcome of the initial efforts that

-------
GLNPO QMP
Revision: 03
May 2008
Page 72 of 96	

involve evaluating existing models and available data. For this reason, the documentation could be
developed in steps, given the evolution of the modeling effort.  Documentation with as much detail as
practical should be developed prior to initiation of the project.  Thorough quality system documentation
that addresses other components of the project could be developed in the mid-stages of the project, once
the initial efforts have provided the final direction of the project.  For example, the roles and
responsibilities of involved personnel must be documented in detail prior to the initiation of the modeling
effort.  In addition, the criteria for evaluating data for acceptability into the model also must be
documented. Another important component to include in the initial documentation is the methods that
will be used to evaluate, assess, and test software and other modeling tools.

       For development of new models, comparison of the output to existing models would provide a
good assessment tool for the new model. Documentation should discuss these planned activities in
addition to planned sensitivity testing of the model and identification of key variables.
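
       Sensitivity testing is commonly documented as a one-at-a-time perturbation of model inputs, with the
most influential inputs identified as key variables. The sketch below illustrates the idea only; the toy load
model, its parameters, and the 10 percent perturbation are hypothetical and do not represent any GLNPO
model.

    # Illustrative one-at-a-time sensitivity sketch for identifying key model variables.
    # The model, its parameters, and the +10% perturbation are hypothetical.
    def phosphorus_load_model(flow_m3_s, concentration_ug_L, bypass_fraction):
        """Toy annual load estimate (metric tons/yr) used only to illustrate the procedure."""
        seconds_per_year = 365.25 * 24 * 3600
        load_ug = flow_m3_s * seconds_per_year * 1000.0 * concentration_ug_L * (1.0 - bypass_fraction)
        return load_ug / 1e12  # micrograms -> metric tons

    baseline = {"flow_m3_s": 175.0, "concentration_ug_L": 12.0, "bypass_fraction": 0.05}
    base_output = phosphorus_load_model(**baseline)

    for name in baseline:
        perturbed = dict(baseline)
        perturbed[name] *= 1.10                      # +10% one-at-a-time perturbation
        delta = phosphorus_load_model(**perturbed) - base_output
        sensitivity = 100.0 * delta / base_output    # % change in output per +10% input change
        print(f"{name}: {sensitivity:+.1f}% change in modeled load")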

7.2.1.2.2   TRIBAL GRANTS

       GLNPO administers grants to several Tribes in the Great Lakes region.  In some cases, Tribes are
characterized by less infrastructure and available resources than many other organizations that work with
GLNPO. However, in order to conduct effective projects and make good use of resources, quality
assurance issues need to be addressed. Tribal grantees should use EPA QA/R-2 and EPA QA/R-5 as the
basis for quality system documentation, while considering the tenet behind the graded approach discussed
throughout this document.  In fact, tribal grants are one of the driving forces behind the graded approach.
 GLNPO will implement a flexible approach when reviewing quality system documentation for tribal
grants; however, as stipulated for GLNPO's graded approach, quality activities for a given project must
be commensurate with the decision that is being made. GLNPO will offer training to tribal grantees when
requested and will provide examples  of quality system documentation from similar grants when available.
 GLNPO also encourages coordination among Tribes, so that quality management activities for one
project can serve as an example for similar projects.

7.2.1.2.3   SEDIMENT ASSESSMENT

       GLNPO conducts three types of sediment assessment projects: 1) assessments which include
screening level and site-specific assessments, 2) remediation projects, and 3) post-remediation projects.
For GLNPO's sediment assessment grants, GLNPO requests that grantees develop quality system
documentation that addresses all components of EPA QA/R-5.  Typically, remediation projects will
involve more detailed documentation than the other types of projects.   Sediment assessment projects use
the NOAA Query Manager Database to store and retrieve sediment data. Sediment assessment projects
stipulate data reporting in a format specific to the Query  Manager Database and the quality system
documentation should describe activities that implement this data standard.  Details of the database are
described in, The Quality Assurance Project Plan for Great Lakes Sediment Data Support included in
Appendix K. Sediment assessment projects also can involve existing data and the quality system
documentation issues described in Section 7.2.1.2.1 are pertinent for these projects. Examples of quality
system documentation are provided in Appendix K and include: for assessment, The Quality Assurance
Project Plan for Cuyahoga River Old Channel Assessment; for site-specific assessment, The Quality
Assurance Project Plan for Raisin River Sediment Sampling in  FY2002: A Follow-up to the 1997
Sediment Remediation Project; and for remediation projects, Quality Assurance Project Plan for
Kinnickinnic River Sediment Sampling in FY2002.

-------
                                                                                   GLNPO QMP
                                                                                   Revision: 03
                                                                                      May 2008
                                                                                   Page 73 of 96
7.2.1.2.4  HABITAT/ECOSYSTEM RESTORATION MANAGEMENT
       Activities supporting habitat and ecosystem restoration management generally fall into three
categories: assessment, restoration, and protection. Of these three categories, habitat assessment projects
generally require the highest level of data quality. For habitat restoration management projects, quality
system documentation should identify the expertise needed to support the project. Expertise involving
species identification, detailed knowledge of site conditions, and optimal site conditions associated with
target species is very important in habitat restoration and management activities and should be discussed
in the quality system documentation.  Species identification training for staff supporting the project also
should be included and often involves picture or photo keys that staff can use in the field or the lab.
Because some of the habitat measures involve subjective measurements, such as percent dieback, photo
documentation may be helpful.  Random checks on personnel involved in these subjective measures, such
as a side-by-side comparison of measurements against those of qualified personnel, can enhance confidence in
the data and, if implemented, should be discussed in the documentation. Other important considerations
for  habitat projects that should be included in the documentation include use of geographical standards
and proper handling of ecologically sensitive data such as those that involve endangered species. An
example of quality system documentation for habitat restoration management, Quality Assurance Project
Plan for Controlling the Spread of Swallow-wort, is provided in Appendix K.

7.2.1.2.5   AMBIENT MONITORING AND RESEARCH DEMONSTRATION

       GLNPO's ambient monitoring activities include programs on a variety of environmental media
such as the open lake, fish, and atmospheric toxics, and are described in Section 1.3. For
these long-term projects, comparability of data over time is critical because many projects involve data
sets spanning more than 30 years. Comparability with data from other monitoring agencies, such as the
Canadian government in the air toxics monitoring program, also is critical. For these programs, GLNPO
stipulates development of quality system documentation that addresses all components of EPA QA/R-5.
For GLNPO's fish monitoring program, sampling is conducted on a voluntary basis and therefore
additional oversight is needed and must be addressed in quality system documentation.

7.2.1.2.6   POLLUTION PREVENTION AND ENVIRONMENTAL EDUCATION

       GLNPO is involved in many environmental education and pollution prevention projects, whose
primary function often is to build stakeholder involvement and consensus for environmental initiatives.
These projects also function to empower other organizations by providing information and training. In
some cases, these projects involve compiling existing information to develop outreach materials for
educating the public and organizations.  These projects typically do not require as much documentation as
many of the other GLNPO projects. The quality system documentation may focus on objectives of the
project more than data quality. In general, quality management activities often are discussed in
qualitative, as opposed to quantitative, terms. For some projects, an original proposal may serve to
document the quality system sufficiently. In other cases, a quality management narrative that describes
the  objective of the project and the quality management activities that will be undertaken to ensure a
successful project would suffice.

7.2.1.2.7   VOLUNTEER MONITORING

       GLNPO administers several projects that involve volunteer monitoring. The quality system
documentation for volunteer monitoring depends on the primary reason for collecting the  information. In
some cases, the  primary reason is to educate and promote public stewardship of local environments. In
these cases, quality system documentation may be a scaled-back version of EPA QA/R-5 and should
focus on interactions with the individuals involved in monitoring and how they can be educated about the
environmental data they are gathering. In other cases, the monitoring is being conducted for a large study

-------
GLNPO QMP
Revision: 03
May 2008
Page 74 of 96	

and the data will be used to characterize environmental conditions and make decisions. In these cases,
standard operating procedures and training are of primary importance. The project may involve such
quality management activities as a minimum performance test at the completion of volunteer monitoring
training, to increase the confidence and quality of the data. Data collection forms are an important
component of all volunteer monitoring activities and should be included in the systematic planning of
projects and also in quality system documentation.  Oversight of the project by qualified personnel is an
important aspect of volunteer monitoring. Safety is of utmost importance in volunteer monitoring
projects and must be considered in the documentation. For projects where a high level of data quality is
needed, audits may be required at a high frequency to ensure proper procedures are being followed.

       A series of fact sheets, guidance documents and methods for volunteer monitoring have been
developed by EPA's Office of Wetlands, Oceans, and Watersheds (OWOW).  GLNPO's QA Manager
believes OWOW has done a very nice job of defining QA issues and considerations for quality system
documentation.  An OWOW website, http://www.epa.gov/owow/monitoring/vol.html, includes, The
Volunteer Monitor's Guide to Quality Assurance Project Plans, in addition to a variety of other guidance
documents and methods. GLNPO encourages individuals involved in volunteer monitoring projects to
review these materials.

Table 7-2. Required Elements in a Quality Assurance Project Plan (QAPP)

PROJECT MANAGEMENT

A1  Title and Approval Sheet:  Title and approval sheet.
A2  Table of Contents:  Document control format.
A3  Distribution List:  Distribution list for the QAPP revisions and final guidance.
A4  Project/Task Organization:  Identify individuals or organizations participating in the project and
    discuss their roles, responsibilities, and organization.
A5  Problem Definition/Background:  1) State the specific problem to be solved or the decision to be made.
    2) Identify the decision maker and the principal customer for the results.
A6  Project/Task Description:  1) Hypothesis test, 2) expected measurements, 3) ARARs or other
    appropriate standards, 4) assessment tools (technical audits), 5) work schedule and required reports.
A7  Data Quality Objectives for Measurement Data:  Decision(s), population parameter of interest, action
    level, summary statistics, and acceptable limits on decision errors. Also, scope of the project (domain
    or geographical locale).
A8  Special Training Requirements/Certification:  Identify special training that personnel will need.
A9  Documentation and Records:  Itemize the information and records that must be included in a data
    report package, including report format and requirements for storage, etc.

MEASUREMENT/DATA ACQUISITION

B1  Sampling Process Designs (Experimental Design):  Outline the experimental design, including
    sampling design and rationale, sampling frequencies, matrices, and measurement parameter of interest.
B2  Sampling Methods Requirements:  Sample collection method and approach.
B3  Sample Handling and Custody Requirements:  Describe the provisions for sample labeling, shipment,
    chain-of-custody forms, and procedures for transferring and maintaining custody of samples.
B4  Analytical Methods Requirements:  Identify analytical method(s) and equipment for the study,
    including method performance requirements.

-------
                                                                                         GLNPO QMP
                                                                                          Revision: 03
                                                                                            May 2008
                                                                                         Page 75 of 96
Table 7-2 (continued)

B5  Quality Control Requirements:  Describe routine (real-time) QC procedures that should be associated
    with each sampling and measurement technique. List required QC checks and corrective action
    procedures.
B6  Instrument/Equipment Testing, Inspection, and Maintenance Requirements:  Discuss how inspection
    and acceptance testing, including the use of QC samples, must be performed to ensure their intended
    use as specified by the design.
B7  Instrument Calibration and Frequency:  Identify tools, gauges, instruments, and other sampling or
    measurement devices that need calibration. Describe how the calibration should be done.
B8  Inspection/Acceptance Requirements for Supplies and Consumables:  Define how and by whom the
    sampling supplies and other consumables will be accepted for use in the project.
B9  Data Acquisition Requirements (Non-direct Measurements):  Define the criteria for the use of
    non-measurement data such as data that come from databases or literature.
B10 Data Management Requirements:  Outline the data management scheme, including the path and
    storage of the data and the data record-keeping system. Identify all data handling equipment and
    procedures that will be used to process, compile, and analyze the data.

ASSESSMENT/OVERSIGHT

C1  Assessments and Response Actions:  Describe the assessment activities needed for this project.
    These may include DQA, PE, TSA, and MSR/PR/RR.
C2  Reports to Management:  Identify the frequency, content, and distribution of reports issued to keep
    management informed.

DATA VALIDATION AND USABILITY

D1  Data Review, Validation, and Verification Requirements:  State the criteria used to accept or reject the
    data based on quality.
D2  Validation and Verification Methods:  Describe the process to be used for validating and verifying
    data, including the chain-of-custody for data throughout the lifetime of the project.
D3  Reconciliation With Data Quality Objectives:  Describe how results will be evaluated to determine if
    DQOs have been satisfied.

GLNPO added note:  The above elements are the minimum requirements for a QAPP and, therefore, should be
used during QAPP development.

-------
GLNPO QMP
Revision: 03
May 2008
Page 76 of 96	

7.2.2  QA ANNUAL REPORT AND WORKPLAN

       The Quality Manager may submit a QAARWP to GLNPO line management and EPA Quality
Staff.  This report summarizes the quality management activities for previous years and describes the
activities planned for the coming year. The QAARWP generally includes:

       >  A general status of the quality management program including strengths, weaknesses,
           successes, and problems
       >  An assessment of the adequacy of the QMP and recommended changes
       >  A list of quality management training for GLNPO personnel
       >  A list of quality system documentation and SOPs developed over the last year
       >  A list of the major EICAAs undertaken in the last year that require quality system
            documentation, and the status of that documentation, including budgeting and time estimates for
           the development of outstanding documents
       >  A list of the projects that were audited and reviewed in the preceding year
       >  A list of the major projects for which audits are planned for the coming year

       The report is submitted in conjunction with the GLNPO Team Workplans.  In addition, the
Quality Manager develops periodic reports that include categories for progress, problems and resolutions,
reports, and a list of activities scheduled for the next period.  These reports are presented to the
management team during the monthly team briefings. The Quality Manager will immediately report any
serious QA problems to line management as they arise. Serious matters requiring immediate attention
will be reported directly to the GLNPO Office Director.

7.3   PEER REVIEW

       EPA has  a formal Peer Review Policy, described in the EPA Peer Review Handbook. In
accordance with this policy, GLNPO requires that peer review be incorporated into the planning process
for all major, scientific or technical work products. This  documented, critical review is an in-depth
assessment of the assumptions, calculations, extrapolations, alternate interpretations, methodology,
acceptance criteria and conclusions pertaining to the major scientific or technical work product and of the
documentation that supports this product.  The determination that a scientific or technical product is
"major" is based on whether it meets at least one of the following criteria:

       >  Does it support major regulatory decisions or policy/guidance of major effect?
       >  Does it establish a significant precedent, model or methodology?
       >  Does it address controversial issues?
       >  Does it focus on significant emerging issues?
       >  Does it have significant cross-Agency/inter-Agency implications?
       >  Does it involve a significant investment of Agency resources?
       >  Does it consider an innovative approach for a previously defined
           problem/process/methodology?
       >  Does it satisfy a statutory or other legal mandate for peer review?

       GLNPO POs and the Quality Management Team can use the checklist, GLNPO Required Peer
Review Information (Appendix L), to assist them in complying with Agency peer review policy. Because
GLNPO manages approximately 100-200 assistance agreements and contracts each year, and most of these
projects do not need peer review, it is not practical to complete the checklist for every project. The
alternative approach that GLNPO uses meets the criteria in the Peer Review Handbook and entails
Project Inventory and Approval Forms (Appendices H and I) that are reviewed by the QA manager, who

-------
                                                                                  GLNPO QMP
                                                                                  Revision: 03
                                                                                     May 2008
	Page 77 of 96

recommends to GLNPO's Director whether the project needs quality system documentation or peer
review. The peer review recommendation made by the QA manager is based on the questions posed in
Appendix L and a final determination is made by the Director on approval of the package and
corresponding documents. A new database, GLNPO's Projects Tracking Database (see section 2.6),
which will be implemented in the last quarter of FY2002, will track the peer review status for each
project. This database will include the contracts and in-house projects that are not subject to grant project
inventory and approval forms and sign-offs. Peer review determinations will be sanctioned by the office.

       Those projects requiring peer review, including those to be peer reviewed via a refereed,
scientific journal, will be entered into the EPA Science Inventory which tracks national peer review
products and science activities. The Peer Review Leader will be given GLNPO's checklist #2, or will fill
out the appropriate fields in GLNPO's Projects Tracking Database. The progress of the peer review
planning process can be tracked by entering comments or attaching documents to section 14 of the
National Database, and when the peer review is completed the database manager will update the entry in
the National Database. Those projects which do not need peer review, but are scientific or technical, are
entered into the Science Activity section of the National Database. The project summaries for each fiscal
year are added to the appropriate GLNPO science category, which are: Atmospheric Deposition
Monitoring Program, Coastal Wetlands Monitoring and Indicator Development, Emerging Issues, Fish
Contaminant Monitoring, Great Lakes Limnology Monitoring, Great Lakes Planktonic and Benthic
Monitoring Program, Habitat Protection/Restoration, Invasive Species Program, Lake Michigan Mass
Balance, Pollution Prevention,  Sediment Assessment and Monitoring, and State of the Lakes Ecosystem
Indicators.

7.4     HEALTH AND SAFETY ISSUES

       The health and safety of project participants is of primary importance to GLNPO and health and
safety issues must be addressed in the planning process for all GLNPO-funded EICAAs.  For assistance
agreements, GLNPO will address safety requirements within the special conditions sections of written
agreements.

       GLNPO has a designated Health, Safety, & Environmental Compliance Coordinator and Health
and Safety Team who are responsible for managing health and safety issues at GLNPO.  GLNPO has
developed a comprehensive document outlining health and safety issues pertinent to GLNPO EICAAs
titled, Health, Safety, & Environmental Compliance Manual. This manual was updated in May 2002.
GLNPO conducts many EICAAs on the GLNPO research vessel R/V Lake Guardian and the R/V
Mudpuppy. The manual specifies comprehensive requirements for medical monitoring and safety training
for any GLNPO employee who will be working on these vessels. The R/V Lake Guardian and the R/V
Mudpuppy also accommodate other EPA personnel, contractors, and researchers implementing GLNPO-
funded EICAAs.  The vessels each contain a copy of the current safety manual and GLNPO requires that
all individuals working on the vessels review the manual. To facilitate this review, GLNPO provides
training on the manual as part of its mandatory onboard training conducted each Spring prior to GLNPO's
Water Quality Surveys.  Safety inspections of the vessel occur annually or more frequently if hazardous
situations are discovered. A report of the inspection is distributed to the GLNPO Health and Safety Team
who  then address any safety infractions. GLNPO's Health, Safety, & Environmental Compliance
Coordinator is responsible for reviewing the Health, Safety, & Environmental Compliance Manual and
revising it as needed to address all issues related to GLNPO EICAAs.

-------
GLNPO QMP
Revision: 03
May 2008
Page 78 of 96	

       Office safety also is addressed in the GLNPO Health, Safety, & Environmental Compliance
Manual as well as in the Region 5 Safety Manual (2000). GLNPO participates in the Region 5 evacuation
procedures and has personnel identified to assist the Region 5 Safety Manager in an emergency. The
Region 5 Safety Manager and the GLNPO Safety Manager annually inspect the GLNPO offices and
vessels for hazardous conditions. The GLNPO Safety Manager will inform GLNPO employees of any
specific hazards in their office space.

-------
                                                                                 GLNPO QMP
                                                                                  Revision: 03
                                                                                    May 2008
                                                                                 Page 79 of 96
                                        Section 8
                   Quality Implementation of Work Processes
       Implementation of a quality system is as important as planning the system. For intramural and
extramural projects, the GLNPO PO is responsible for ensuring that tasks supporting EICAAs are
performed according to plan. GLNPO's processes for ensuring implementation of the quality system
during EICAAs are detailed below.

8.1    INTRAMURAL ACTIVITIES

       As described in the previous section, EICAAs require detailed, comprehensive planning and
documentation of the planning process.  These planning documents should detail operational tasks of an
EICAA such as:

       >  Field sampling                      >   Sample analyses
       >  Sample handling and shipping        >   Sample archiving
       >  Sample preparation                  >   Data reporting

       GLNPO ensures that these tasks are carried out according to plan by ensuring all relevant parties
have copies of the planning documentation. It is the PO's responsibility to identify these relevant parties
and distribute the planning documentation. The PO is also responsible for ensuring all involved parties
understand the plan and their specific roles and responsibilities. This is often accomplished through a
"kick-off or "all-hands" meeting prior to the initiation of an EICAA.  Because GLNPO often employs
functional teams in the planning process and the individuals on the team will be subsequently involved in
the tasks involved in the project, this provides an added benefit to GLNPO where the involved staff are
fully informed of the purpose of the project and the project design.

       For example, in support of GLNPO's base monitoring program, the monitoring team coordinates
meetings of all involved staff in January or February as they develop their schedule for the year.  These
meetings include an assessment and subsequent revision of the QAPP for the survey and the SOP
document, Sampling and Analytical Procedures for GLNPO's Open Lake Water Quality Survey of the
Great Lakes, to capture small changes to the program. The schedules developed in the meetings are
communicated to all parties involved in the survey through distribution of an "information package." The
package contains a detailed survey schedule, sampling plan, and logistical information.  In addition,
onboard training is conducted prior to the survey. The training includes a review of health and safety
issues, an overview of the survey schedule, a detailed review of SOPs, and information regarding location
of current documents including all SOPs, planning documentation, and safety manuals.  In this way, the
PO ensures all involved parties are aware of the project purpose, plan, and their roles and responsibilities
as outlined in the planning documents.

8.2    EXTRAMURAL ACTIVITIES

       To ensure tasks are being performed in accordance with the project plan, the PO has regular
communication with investigators. For example, for all GLNPO-funded EICAAs, the grantee prepares
periodic (generally semi-annual or quarterly) reports on the status and progress of activities. For larger

-------
GLNPO QMP
Revision: 03
May 2008
Page 80 of 96	

scale or long-term EICAAs, the PO also conducts site visits, in accordance with government policy, to
view tasks in progress. As needed, the PO is available to provide technical and logistical support to the
investigator.

       The investigator is required to submit SOPs with their quality system documentation or prior to
initiating EICAAs. Any revisions to SOPs or quality system documentation must be approved by the PO
prior to implementing the changes. The investigator is responsible for ensuring personnel are trained in
all SOPs and laboratory and field operations according to the requirements in the quality system
documentation. During review of the quality system documentation, the PO and Quality Management
Team attempt to identify additional procedures that would benefit from development and implementation
of SOPs.  If additional SOPs are deemed to be necessary, the PO can request development and submittal
of these SOPs prior to initiation of the EICAA. SOPs are discussed further in Section 2.4.

       To assist POs in obtaining the required work products on schedule, the Quality Management
Team provides reports to the PO listing outstanding work products, associated due dates, and the review
and approval status of these reports.  As necessary, the Quality Management Team assists the PO with
implementing the Protocol to Address Missed Requirements (Appendix M). This protocol is discussed in
detail in Section 7.

       For extramural projects, GLNPO's relationship to external parties is maintained through the PO.
The PO is the only person who has the power to accept or reject the product from an EICAA that is being
funded by GLNPO. Although the Quality Management Team is intimately involved in the EICAA, they
function to assist the PO in effectively implementing the EICAA. For example, all correspondence
regarding the project is addressed to the PO, who must then communicate the information to the grantee.
The Quality Management Team does not work directly with the funded party without the PO's
involvement. The PO is responsible for implementing the system and has final authority for determining
the activities of the grantee.

8.3   COMMUNICATION

       Due to the multitude of external parties involved in GLNPO projects, communication is critical to
the success of GLNPO EICAAs and accomplishment of the mission.  The Quality Manager attends
monthly meetings with GLNPO management and presents a written report on the progress and status of
quality system activities for all ongoing EICAAs. In most cases, the  Quality Management Team will
copy Management Advisors on all internal and external communications for projects under their
responsibility.  Communication between the Quality Management Team and all GLNPO staff also is
fostered through the quality system training provided by the Team.

8.4   DISPUTE RESOLUTION

       Implementation of quality management activities may sometimes result in disagreements among
involved parties.  When these disputes occur, resolution will be sought at the lowest management level.
GLNPO staff will attempt to resolve the dispute through discussion and negotiation. Final resolution will
be made by GLNPO's Director when negotiations do not resolve the  issue.

-------
                                                                                 GLNPO QMP
                                                                                 Revision: 03
                                                                                    May 2008
	Page 81 of 96

                                        Section 9
                         Quality Assessment and Response

       Quality assessments are used to determine the effectiveness of a quality system in meeting its
goals - in GLNPO's case, in ensuring that environmental information is of adequate quality for its
intended use.  An assessment is a formal evaluation of performance relative to pre-determined standards.
Once an evaluation is conducted and documented, a response is implemented that provides corrective
actions to improve performance where necessary. A variety of tools are available to assess environmental
information collection activities including: audits, data quality assessments, quality systems audits, peer
reviews, technical reviews, performance evaluations, and technical systems audits.  These assessments are
the principal means used by EPA to determine compliance and to control systems in a real-time manner to
improve performance. This section describes GLNPO's use of these assessment tools.

9.1    QUALITY SYSTEMS AUDITS AND TECHNICAL SYSTEMS AUDITS

       QSAs, previously termed management systems reviews, are on-site evaluations by internal or
external parties to determine if the organization is implementing a satisfactory quality management
program. They are used to determine the adherence to the program, the effectiveness of the program, and
the adequacy of allocated resources and personnel to achieve and ensure quality in all activities. Internal
QSAs are conducted by GLNPO senior management. GLNPO-funded entities also may undergo QSAs
led by GLNPO's Quality Manager. External QSAs are conducted by EPA Quality Staff to determine
compliance of GLNPO's program with this QMP. The  QSA includes reviews of:

       >  adherence to the GLNPO quality management plan,
       >  procedures for developing project quality objectives and other acceptance criteria,
       >  procedures for planning EICAAs,
       >  procedures for developing and approving quality system documentation,
       >  the quality of existing quality system documentation,
       >  procedures for developing and approving SOPs,
       >  procedures, criteria, and schedules for designing and conducting audits,
       >  tracking systems for ensuring that the quality management program is operating and that
           corrective actions disclosed by audits have been taken,
       >  the degree of management support,
       >  responsibilities and authorities of the various line managers and the Quality  Manager for
           carrying out the quality system,
        >  the level of financial resources and personnel devoted to implementing the quality system,
           and
       >  existence of appropriate quality system documentation and its conformance  with the
           requirements of the quality management plan.

       To achieve the objectives of a QSA, the review should be conducted by an individual somewhat
independent of the organization, but still with a "stake" in seeing improvement. Because  GLNPO
cooperates with many agencies, an individual from one  of these agencies could lead an internal QSA.
The leader could then build a review team composed of individuals from GLNPO senior management that
would assist in the planning, scheduling, and implementation of the review.  Prior to the audit, the review
team should develop an audit plan that defines the scope, purpose, and details of the review as discussed
in Section 9.5.

-------
GLNPO QMP
Revision: 03
May 2008
Page 82 of 96	

       The Quality Management Team will coordinate periodic QSAs of specific GLNPO functional
teams, depending on the number and importance of the projects being conducted by a Team.  For
GLNPO's base monitoring program, the Quality Management Team will conduct a QSA on at least one
component of the program every year (e.g., limnology, fish monitoring, atmospheric monitoring, etc.) and
more frequently if serious deficiencies are identified. The review should occur in March or April, at the
start of the Spring Survey.

       Results of all QSAs coordinated by GLNPO are presented in reports to GLNPO management, the
Quality Manager, and EPA Quality Staff. The format for these reports is discussed in Section 9.5. For all
identified deficiencies, the review teams will develop proposed corrective actions and will discuss them in
the report. These corrective actions will become goals as part of the Quality Manager's performance
appraisal.  In addition, the Quality Manager will detail the progress on these corrective  actions at
management meetings, during Quality Manager performance reviews, and in subsequent QAARWPs.

       TSAs are qualitative on-site evaluations of all phases of an EICAA  (i.e., sampling, preparation,
analysis). These audits can be performed prior to or during the data collection activity, in order to
evaluate the adequacy of equipment, facilities, supplies, personnel, and procedures that have been
documented in the quality system documentation. Because a TSA is most beneficial at the beginning of a
project, GLNPO will schedule audits at the initiation phase of an EICAA, when possible. GLNPO will
perform a QSA, site visit, or TSA for the most high-profile EICAAs (i.e., those that support an important
decision). The number and frequency are dependent on the length of the project, the
importance of the project objectives, and the evaluations of prior audits. GLNPO will conduct TSAs on
two or more GLNPO grants or contracts that involve EICAAs each year. Selection of these grants and
contracts will be proposed by the Quality Manager and sanctioned by the Quality Management Team
Management Advisor (GLNPO's Director).

       Audits will be scheduled by the PO and tracked by the Quality Manager.  The Quality Manager
can assist in facilitating audits at the request of the PO.  In addition, the Quality Manager can participate
in audits at any time, to evaluate auditing procedures.  The PO, in conjunction with the  Quality
Management Team, is responsible for developing an audit plan (section 9.5) and documenting audit
results (section 9.7).

9.2   PERFORMANCE EVALUATIONS

       Performance evaluations  (PEs) are a means of independently evaluating data quality and the
variability associated with the overall measurement system or a distinct phase of the measurement system.
 This is accomplished through the analysis of samples of known composition and concentration.  These
samples can be introduced into the measurement system as blind samples where the identity and the
concentration are unknown to the analyst.  These samples can be used to evaluate bias and precision and
to determine whether DQOs or method quality objectives (MQOs) associated with a given project have
been satisfied. PEs also can be used to determine inter- and intra-laboratory variability over the  course of
long-term projects, and to evaluate laboratories prior to contract awards.
Note: GLNPO's value-added assessments focus on improvement of the data collection process through
direct involvement of the technical team members in the assessment response.

-------
                                                                                   GLNPO QMP
                                                                                    Revision: 03
                                                                                       May 2008
	Page 83 of 96

       PEs are required for projects involving important decisions or where multiple parties are involved
in data collection and data comparability is an issue.  At times, PE samples may be of interest to a PO but
appropriate reference material from reliable sources may not be available. If time permits, the PO, in
conjunction with the Quality Management Team, can coordinate development of reference samples from
a bulk source.  The samples are characterized by a laboratory, independent of the project, through analysis
of a statistically valid number of replicates and then used as reference material.  GLNPO participates in
PE programs that apply to its EICAAs and provide information regarding data quality. For example,
GLNPO participates in several PE programs coordinated by the National Laboratory for Environmental
Testing in Ontario. GLNPO distributes information regarding pertinent PE programs to grantees, states,
and other involved parties and encourages participation. As part of the systematic planning, POs should
consider the use of PE samples.
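
       As a minimal sketch of how PE results are commonly evaluated (this QMP does not prescribe a
specific algorithm), each result can be expressed as a percent recovery of the known value and compared
with project acceptance limits. All analytes, concentrations, and limits below are hypothetical.

    # Minimal sketch of evaluating PE (performance evaluation) sample results.
    # Known values, measured results, and acceptance limits are hypothetical.
    pe_results = [
        {"analyte": "total phosphorus", "known_ug_L": 10.0, "measured_ug_L": 9.1},
        {"analyte": "nitrate+nitrite",  "known_ug_L": 250.0, "measured_ug_L": 292.0},
    ]

    LOWER_LIMIT_PCT, UPPER_LIMIT_PCT = 85.0, 115.0   # hypothetical acceptance window

    for pe in pe_results:
        recovery = 100.0 * pe["measured_ug_L"] / pe["known_ug_L"]
        acceptable = LOWER_LIMIT_PCT <= recovery <= UPPER_LIMIT_PCT
        flag = "within limits" if acceptable else "outside limits - corrective action needed"
        print(f'{pe["analyte"]}: {recovery:.1f}% recovery ({flag})')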

9.3    PROJECT ASSESSMENTS

       In order to conduct efficient audits, POs in conjunction with the Quality Management Team
should thoroughly plan the audit and document the plan. The audit plan document is not a major
undertaking and in most cases will be a one page table or report. However, the document represents
thoughtful and concise planning for an efficient and successful audit. The audit plan should be made
available to the organization audited, with adequate lead time to ensure that appropriate personnel and
documents are available for the audit.  An audit plan for any type of audit will typically include the items
listed in Table 9-1. Additionally, all data must be assessed against their intended use by the technical
person doing the work; this is the responsibility of the project lead. Resources for this assessment
include EPA QA/G-9 and, most importantly, the project assessment criteria in the project-level quality
system documentation. The project-level quality system documentation must provide explicit assessment
criteria to allow proper assessment against the intended use of the data.

       Prior to an onsite evaluation, the POs in conjunction with the Quality Management Team discuss
and evaluate the expertise and minimum level of competence for personnel to effectively and efficiently
conduct onsite evaluations. The POs and Quality Management Team develop and assemble a checklist
prior to the onsite evaluation to establish the base level of competency and ensure the personnel meet the
requirements of the checklist in terms of knowledge, auditing skills, experience, etc. After personnel
expertise is established, the Quality Management Team assesses each person for conflicts of interest and
direct involvement in the work being evaluated. The Quality Manager is often the leader of the Onsite
Evaluation Team. In cases where the Quality Manager is not the leader, the Quality Manager and Project
Lead are involved in the coordination of the onsite evaluation and notify the appropriate personnel being
evaluated. Prior to the onsite evaluation, they clearly define the authority, access to programs and
managers, access to documents and records, and organizational freedom required to conduct the onsite
evaluation. The Quality Management Team and POs ensure that assessment personnel have the authority
to conduct the assessment without restraint or bias. Assessments of specific personnel being
considered for participation in an audit team are accomplished through thorough and complete reviews of
the auditor's education, experience, and auditing history. Recommendations by experts in the field of
interest are accepted and the POs ensure competency, conflict of interest, and authority are addressed
during the personnel selection process.

-------
GLNPO QMP
Revision: 03
May 2008
Page 84 of 96
Table 9-1. Items to be Included in Project Assessments

Item            Description

Project title   GLNPO project title.
Audit number    Year and number of the audit can be combined (e.g., 2001-1, 2001-2).
Date of audit   Date the audit is scheduled.
Scope           Establishes the boundary of the audit and identifies the groups and activities to be evaluated.
                The scope can vary from a general overview of the total system to part of the system, which
                will affect the length of the audit.
Purpose         Why the audit is being conducted and what the audit should achieve.
Standards       The criteria against which performance is evaluated. These standards must be clear and
                concise and should be used consistently when auditing similar facilities or procedures. The use
                of audit checklists is suggested to assure that the full scope of an audit is covered. An example
                checklist for an analytical laboratory audit can be found in Appendix P.
Audit team      Team leader and members and their affiliations. The audit team will consist of competent
                experts with no conflicts of interest and will demonstrate authority without restraint or bias.
Auditees        People who should be available for the audit from the audited organization. These should
                include the Program Manager, Principal Investigator, the organization's QA Representative, and
                other management and technicians as necessary.
Documents       Documents that should be available for the audit to proceed efficiently. Too often, documents
                are asked for during an audit, when auditors do not have time to wait for them to be found.
                Documents could include QMPs, QAPPs, SOPs, GLPs, control charts, raw data, QC data, and
                previous audit reports.
Timeline        A timeline of when organizations (auditors/auditees) will be notified of the audit, for efficient
                scheduling and full participation of all parties. The timeline also may include the schedule for
                the opening briefing, data collection, exit briefing, and draft report or other product.
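
       For projects that track audit planning electronically, the Table 9-1 items can be captured as a simple
structured record. The sketch below is illustrative only; the record layout and all field values are
hypothetical and are not prescribed by this QMP.

    # Illustrative sketch: the Table 9-1 audit plan items as a simple structured record.
    # All field values are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class AuditPlan:
        project_title: str
        audit_number: str          # e.g., year and sequence combined: "2008-1"
        audit_date: str
        scope: str
        purpose: str
        standards: list            # criteria/checklists performance is evaluated against
        audit_team: list
        auditees: list
        documents: list = field(default_factory=list)
        timeline: str = ""

    plan = AuditPlan(
        project_title="Hypothetical sediment assessment grant",
        audit_number="2008-1",
        audit_date="2008-06-15",
        scope="Field sampling and laboratory analysis phases",
        purpose="Verify conformance with the approved QAPP",
        standards=["Approved QAPP", "Laboratory SOPs", "Appendix P checklist"],
        audit_team=["GLNPO Quality Manager (lead)", "Project Officer"],
        auditees=["Principal Investigator", "Laboratory QA Representative"],
        documents=["QAPP", "SOPs", "Control charts", "Previous audit report"],
        timeline="Notification 30 days prior; opening briefing, observation, exit briefing, draft report",
    )
    print(plan.project_title, "-", plan.audit_number)
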
9.4   ASSESSMENT IMPLEMENTATION

       After the audit plan has been developed, the auditee is notified prior to conducting the on-site
visit, to inform the auditee about the audit including its scope, purpose, and logistics. During this initial
contact, certain information can be requested, such as the auditee's organizational structure, SOPs, GLPs,
control charts, etc., that will help to efficiently implement the audit.  Mutually acceptable audit dates
should be identified so that the appropriate staff are available during the audit period.

       During the actual audit, there should be an initial interview with the audit team and the auditees,
in order to restate the scope and purpose of the audit, and develop a detailed schedule that is acceptable to
all involved parties.  The schedule should include a list of the operational phases of the EICAA that will
be observed and time for debriefing activities.

-------
                                                                                 GLNPO QMP
                                                                                  Revision: 03
                                                                                    May 2008
	Page 85 of 96

       During the observation phase of the audit, one must be aware of one's own perceptions (a good
job to one auditor may be a sloppy job to another). Three concepts (Arter, 1989) to consider in order to
persuade the audited organization that the auditor's perception of the facts is useful include:

       >   Present items and facts that will satisfy the needs of the audited and auditing organizations.
            Make a contribution.  Show how the facts affect the product or service.

        >   Ignore or downplay mildly disturbing things. Do not nitpick. Strive to answer the "So
             what?" response.

        >   Pay attention to significant things. Chronic or persistent problems and weaknesses, along
             with trends, will get the auditee's attention.
                      The Six Steps to the Basic Observation Interview
                                (As explained by Arter,  1989)

    1.  Put them at ease. Give the auditee an opportunity to size you up and lower the natural
      sense of anxiety.
   2.  Explain your purpose. Tell  the auditee what you want and why you are asking the questions.
      Most people will express a desire to share information once it is known why it is wanted.
   3.  Find out what they are doing.  Use open-ended questions (e.g., "What do you do as soon as
      you get the sample; and then what happens?") The checksheet described in Appendix P can
      then be used during the auditees response to fill in the "yes/no" answers.
   4.  Analyze what they are doing. Once you have heard the words, analyze what they mean.
      You may want to repeat the process and "think out loud" which will force you to put the
      facts in perspective and in logical arrangement.
   5.  Make a tentative conclusion.  Conclusions of that phase of the EICAA can be made. If the
      initial analysis indicates that all is well, let the person know; they will continue to perform
      well with recognition from an outsider. If there is a deficiency, give the auditees an
      opportunity to produce additional factual evidence to show that you have made the error.
   6.  Explain your next step.  Conclude the discussions and let the person know what's next. It
      is important to remember that people want to know: (1) how they did in the interview, and
      (2) whether they are finished.
        The six steps listed above are presented in Arter's book, Quality Audits for Improved
Performance (1989), along with additional details of all phases of an audit. A reference copy is
available from the GLNPO Quality Manager and is a suggested source of further information.

9.5   DATA QUALITY ASSESSMENTS

       A data quality assessment (DQA) is used to evaluate the quality of specific data and determine if
this quality satisfies the stated project objectives of an EICAA.  POs and the Quality Management Team
can use the guidance document EPA QA/G-9R, Data Quality Assessment: A Reviewer's Guide, developed
by EPA Quality Staff, to assist them in assessing data quality. Another guidance document, Data Quality
Assessment: Statistical Tools for Practitioners (EPA  QA/G-9S), provides  useful techniques in assessing
the quality of data.

-------
GLNPO QMP
Revision: 03
May 2008
Page 86 of 96	

        Data quality assessments are generally conducted in conjunction with the second TSA, at the
completion of an EICAA as part of a QA report, or at the request of a PO when concerns about data
quality are identified.
Based on TSA reports, the Branch Chief or the Quality Manager may suggest a DQA to the PO. The PO
is responsible for determining the need for a DQA and will be responsible for conducting this assessment
and developing subsequent reports. The GLNPO Quality Manager can assist in facilitating the
assessment as necessary.

       Data quality assessments also can be made by conducting data review and verification.  These
reviews typically focus on the QA data collected with the routine field data.  Examples of QA analysis
checklists are provided in Appendix R.

       In support of the LMMB Study, GLNPO has developed "state-of-the-art" data quality assessment
approaches.  For the study analytes, GLNPO determined quantitative estimates of data quality in terms of
three attributes: sensitivity, precision, and bias.  In addition, GLNPO developed a novel approach to
assessing data quality: the percentage of variability due to sampling and analytical measurement
uncertainty.
These estimates are described below.

Sensitivity      The detection limit for each data set was estimated by the principal investigator. The
                types of detection limits vary among investigators, although a commonly used approach
                in LMMB was EPA's method detection limit (MDL), described at 40 CFR part 136,
                Appendix B, which involves analysis of seven replicate samples of known concentration.
                Sensitivity for each data set is presented as the percent of field sample results above and
                below the detection limit.

Precision       Precision was estimated through statistical analysis of analytical results for field and
               laboratory duplicate samples. System precision, a measure of the precision for the entire
               sampling and analytical procedure, was estimated through comparison of analytical
               results obtained for routine field samples and their associated field duplicates. Analytical
               precision,  a measure of the precision of the laboratory analytical component of the
               system, is a subset of the system precision, and was estimated through comparison of
               analytical  results obtained for routine field samples and their associated laboratory
               duplicates.

Bias           Bias was estimated through  statistical analysis of analytical results for laboratory-spiked
               routine field samples.  System bias, a measure of the bias of the entire sampling and
               analytical  procedure, can be estimated through analysis of samples that have been spiked
               in the field. Analytical bias, a measure of the bias of the laboratory analytical component
               of the  system, can be estimated through analysis of laboratory-spiked environmental or
               blank samples.

Percentage of variability due to sampling
and analytical measurement uncertainty

                The percentage of variability due to sampling and analytical measurement uncertainty is
                estimated as the proportion of variability among all routine field sample (RFS) results
                that can be attributed to sampling and analytical measurement uncertainty. This measure
                is estimated as the mean variance between field duplicate pairs as a proportion of the
                variance among all routine field samples.
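
       To make the preceding estimates concrete, the minimal sketch below illustrates how they might be
computed for a single analyte data set.  The function names, the use of relative percent difference (RPD)
for precision, and the use of percent recovery for bias are illustrative assumptions only; they are not
procedures prescribed by this QMP.

import statistics

# One-tailed 99th-percentile Student's t value for seven replicates (six degrees of
# freedom), as used in the 40 CFR part 136, Appendix B MDL calculation.
T_99_SEVEN_REPLICATES = 3.143

def method_detection_limit(replicate_results):
    # MDL = standard deviation of the (at least seven) low-level replicate results
    # multiplied by the one-tailed 99th-percentile t value.
    return statistics.stdev(replicate_results) * T_99_SEVEN_REPLICATES

def sensitivity_summary(rfs_results, detection_limit):
    # Percent of routine field sample (RFS) results above and below the detection limit.
    pct_above = 100.0 * sum(1 for x in rfs_results if x >= detection_limit) / len(rfs_results)
    return {"percent_above": pct_above, "percent_below": 100.0 - pct_above}

def mean_rpd(duplicate_pairs):
    # Mean relative percent difference for (routine, duplicate) result pairs.  Field
    # duplicates approximate system precision; laboratory duplicates approximate
    # analytical precision.
    return statistics.mean(200.0 * abs(a - b) / (a + b) for a, b in duplicate_pairs)

def mean_percent_recovery(spike_results):
    # Mean percent recovery for (spiked result, unspiked result, amount added) triples.
    # Field spikes indicate system bias; laboratory spikes indicate analytical bias.
    return statistics.mean(100.0 * (spiked - unspiked) / added
                           for spiked, unspiked, added in spike_results)

def percent_variability_sampling_and_analysis(field_duplicate_pairs, rfs_results):
    # Mean variance between field duplicate pairs expressed as a percentage of the
    # variance among all routine field sample results.
    mean_pair_variance = statistics.mean(statistics.variance([a, b])
                                         for a, b in field_duplicate_pairs)
    return 100.0 * mean_pair_variance / statistics.variance(rfs_results)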

        The quantitative estimates described above reflect data quality for an entire data set
produced by a single laboratory. Study modelers have requested an interval estimate for single study

-------
                                                                                   GLNPO QMP
                                                                                   Revision: 03
                                                                                      May 2008
	Page 87 of 96

results.  GLNPO is currently developing approaches to determine these interval estimates.  GLNPO plans
to continue to develop approaches to assessing data quality and apply these approaches to all EICAAs.

9.6    ASSESSMENT REPORTING

        At the completion of an audit, the audit team will meet with auditees to discuss the results of the
audit and the next steps of the process.  Positive and negative aspects of the EICAA will be discussed
between the audit team, management of the area audited, and, if necessary, technical personnel
performing the measurement activity. The review team should provide copies of the draft audit summary
and findings to all in attendance. The review team will discuss with the auditees all actions needed to
improve the measurement system.

        The PO is responsible for reporting results from TSAs, DQAs, and PEs, even when the PO is not
the  review team leader for the audit. For QSAs, the review team leader or an appointed designee is
responsible for preparing the report. These reports generally include:

         >  Audit title and number and any other identifying information
         >  Audit team leaders, audit team participants, and audited participants
         >  Background information about the project, purpose of the audit, dates of the audit, particular
            measurement phase or parameters that were audited, and a brief description of the audit
            process
         >  Summary and conclusions of the audit and corrective action requirements
         >  Attachments or appendices that include all audit evaluation forms and audit finding forms
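
       For illustration only, the elements listed above could be captured in a simple structured record such
as the hypothetical sketch below; the class and field names are assumptions and do not represent a format
prescribed by this QMP or its appendices.

from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class AuditReportRecord:
    # Identifying information
    title: str
    audit_number: str
    audit_type: str                          # e.g., "TSA", "DQA", "PE", or "QSA"
    # Participants
    team_leader: str
    team_members: List[str]
    audited_participants: List[str]
    # Background: project, purpose of the audit, and measurement phase or
    # parameters that were audited
    background: str
    audit_dates: List[date]
    # Summary, conclusions, and corrective action requirements
    summary_and_conclusions: str
    corrective_action_requirements: List[str]
    # Attachments: audit evaluation forms and audit finding forms
    attachments: List[str] = field(default_factory=list)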

        A report should be completed within five working days of completion of an audit.  The Quality
Manager and PO review TSA, DQA, and PE reports and document their review and approval through an
approval signature. The report is then filed with the Quality Manager.  For QSAs, the audit team leader
and GLNPO Director review the reports and provide the approval signatures.  The reports are filed with
the  Director and Quality Manager.  It is the responsibility of the review team leader to forward audit
reports to the appropriate project participants. Audit reports have restricted distribution in order to foster
constructive working relationships. When significant concerns are identified during an audit, GLNPO's
Quality Manager or Director will schedule a meeting to address these concerns with the appropriate
parties.

9.7    RESPONSE ACTIONS

        As mentioned above, audit reports are discussed with the audited organization, specifically
noting the corrective actions necessary to rectify and control the situation. Line management may be
requested to assist in problem resolution.  For each audit finding, an audit finding response form will be
developed to track corrective actions. An example audit finding response form is provided in Appendix
Q.  These forms will be included in the  audit file retained by the Quality Manager.  The PO (for TSAs,
DQAs, PEs, and QSAs of other organizations) or the GLNPO Director (for QSAs of GLNPO) is
responsible for ensuring compliance with the corrective actions.  If major deficiencies are found,
follow-up audits are often required.

-------
                                                                                GLNPO QMP
                                                                                 Revision:  03
                                                                                   May 2008
	Page 89 of 96

                                       Section 10
                                 Quality Improvement

       GLNPO's quality policy focuses on four operating principles: assistance, flexibility, value-added,
and continuous improvement.  This QMP details GLNPO's quality system policy and processes, many of
which facilitate improvement of GLNPO EICAAs. For example, monthly management meetings, quality
system training, data quality assessments, and peer review activities provide opportunities to identify
areas for improvement that can be addressed in subsequent projects.  GLNPO's quality system is designed
to facilitate identification and communication of these areas of improvement to all GLNPO staff through
meetings, training, and Quality Management Team reports. This section describes GLNPO's process for
realizing continuous improvement.

10.1   PROGRAM REVIEW

       This QMP is approved by the GLNPO Director and all GLNPO Branch Chiefs, thereby
demonstrating their  commitment to GLNPO's quality system. GLNPO management is responsible for
ensuring that GLNPO staff follow the guidelines of the quality system as documented in this QMP.  This
is facilitated through regular quality system training, Quality Management Team monthly reports on the
status of quality system activities, and maintenance of a central library that contains GLNPO quality
system documents and Agency guidance and requirements documents.  GLNPO's quality
system is constantly being evaluated for effectiveness.  Line
management and the Quality Manager meet quarterly to
discuss adherence to the QMP and to identify where
improvements can be made. For any major quality system
deficiencies, the Quality Manager will develop proposed
corrective action, implement the action, and finally assess
its implementation.  The Quality Manager will summarize
these meetings including the areas for improvement and
corrective actions in a quarterly report that is placed on
GLNPO's LAN in the QA directory.  These reports also will be included in the QAARWP.

       The input to management review shall include information on:

       >   Results of audits
       >   Stakeholder feedback on the quality system
       >   Staff competence
       >   Process performance
       >   The status of preventive and corrective actions

       Management should always seek positive methods for ensuring adherence to policy; however,
management must implement corrective action procedures for staff who do not adhere to GLNPO or EPA
policy (e.g., obtaining approval of quality system documentation prior to implementation of an EICAA).
If staff members continually disregard the policy, management should note this in the individuals'
performance evaluations.

       The Quality Manager will review this QMP each year to determine if the document is relevant to
GLNPO's mission and reflects current procedures.  Sections will be modified to address GLNPO's
evolving program and changing needs. GLNPO's quality system and this QMP also will be reviewed as
needed when changes in Agency policy or guidance occur. Revisions of the QMP will be noted by a
change in the revision number and revision date included in the header information and in the

-------
GLNPO QMP
Revision: 03
May 2008
Page 90 of 96	

Table of Contents. All revisions will be reviewed and approved by GLNPO's Director, Branch Chiefs,
and EPA Quality Staff before implementation. Upon approval, the QMP will be distributed to all GLNPO
staff, along with a summary of the revisions.

        As discussed in Section 9, GLNPO invites EPA Quality Staff to review GLNPO's quality system
at any time, and at a minimum every two years. Where deficiencies exist, the GLNPO Quality Manager
and line management will develop action plans and inform EPA Quality Staff of their progress in
rectifying them.

10.2  PROJECT REVIEWS

       As mentioned above, a variety of tools that are implemented as part of GLNPO's quality system
facilitate improvement.  Technical audits, peer reviews, and data quality assessments can improve the
quality of long-term EICAAs and of subsequent projects. In addition, to increase the effectiveness of its
EICAAs, GLNPO will conduct "wrap-up" meetings at the conclusion of as many EICAAs as possible.
These meetings should include as many project participants as practical.  The meetings provide an
opportunity to review the quality system  documentation to determine how the plan could be improved,
and how similar ongoing projects may benefit from addressing these areas for improvement. Preliminary
data quality assessments should be available to determine whether the quality system was successful in
controlling data quality to an acceptable level. The meetings serve to focus project planners and
participants on areas for improvement in  all aspects of the EICAA including planning, field and
laboratory procedures, and appropriateness of the quality system.  This project review will assist project
planners in identifying preventive  actions that can be included in future EICAAs.  The Quality
Management Team will document these meetings and maintain the records on the GLNPO LAN so that
these "lessons learned" can be applied to subsequent EICAAs where applicable.
                   GLNPO shall continually improve the effectiveness of the
                   quality management system through the use of the quality
                   policy, quality objectives, audit results, analysis of data,
                   corrective and preventive action, and management review.

-------
                                                                                 GLNPO QMP
                                                                                  Revision:  03
                                                                                     May 2008
                                                                                  Page 91 of 96
                                       References
       The EPA Quality Staff developed a series of documents describing the various requirements of
the overall EPA quality system as well as a series of guidance documents that describe how the system
can be implemented by EPA and by external organizations, including contractors and grantees. Many of
these documents are cited in the body of this quality management plan. All of the documents are
available from the Quality Staff web site in PDF format. The current uniform resource locator (URL) for
that web site is:  www.epa.gov/quality

       The Quality Staff also are working on a variety of new documents and revisions to existing ones,
and the reader is encouraged to check the web site above frequently for the latest available information.

Requirements Documents

        All of the documents that describe formal quality requirements for EPA organizations are defined
as "EPA Directives" and are policy documents.  These include:

> CIO 2105-P-01-0, EPA Quality Manual for Environmental Programs.  This document describes
    the quality requirements for EPA organizations that produce environmental data.

> CIO Order 2105, May 2003, EPA Policy and Program Requirements for the Mandatory Agency-
   wide Quality System. This document describes the specifications for satisfying the mandatory
   quality system defined in CIO Order 2105.

       Additional requirements documents apply to external organizations but can be used by EPA since
they may be clearer than the EPA directives. They are designated with the letter "R" followed by a
number. The documents that are available in final form at this time are:

> EPA QA/R-2, March 2001, EPA Requirements for Quality Management Plans.  QA/R-2 is the policy
   document containing the specifications and requirements for Quality Management Plans.

> EPA QA/R-5, March 2001, EPA Requirements for Quality Assurance Project Plans.  QA/R-5
   replaces the 1980 document QAMS-005/80. This external policy document establishes the
   requirements for QA Project Plans prepared for activities conducted by or funded by EPA. It is
   intended for use by organizations having extramural agreements with EPA.

Guidance Documents

       The Quality Staff have prepared a number of guidance documents that can assist in the
development and implementation of a suitable quality system for both EPA and non-EPA organizations.
The guidance documents are designated with the letter "G" followed by a number. The documents that
are available in final form at this time are:

> EPA QA/G-1, January 2008, Guidance for Developing Quality Systems for Environmental Programs.
   QA/G-1 provides guidance on developing and documenting the elements of a functional quality
   system in organizations that carry out environmental data operations within,  or on behalf of, EPA.

> EPA QA/G-3, March 2003,  Guidance on Assessing Quality Systems. QA/G-3 provides guidance on
   assessing the adequacy and effectiveness of an environmental quality system.

-------
GLNPO QMP
Revision: 03
May 2008
Page 92 of 96	

>  EPA QA/G-4, February 2006, Guidance on Systematic Planning using the Data Quality Objectives
    Process. QA/G-4 provides guidance to help organizations plan, implement, and evaluate the Data
    Quality Objectives (DQO) process, a systematic planning process for environmental data collection.
    It focuses on environmental decision-making for regulatory and enforcement purposes. The
    guidance presents a step-by-step description of the DQO process.

>  EPA QA/G-4D, September 2001, Data Quality Objectives Decision Errors Feasibility Trials (DEFT)
    Software. QA/G-4D provides guidance for using the Decision Error Feasibility Trials (DEFT)
    software to help organizations plan, implement, and evaluate the Data Quality Objectives (DQO)
    process. The guidance presents a step-by-step description of the use of the PC-based DEFT software
    within the DQO process.

>  EPA QA/G-5, December 2002, Guidance for Quality Assurance Project Plans.  QA/G-5 provides
    guidance to help organizations develop Quality Assurance Project Plans that will meet EPA
    expectations and requirements. The document provides a linkage between the DQO process and the
    QAPP. It contains tips, advice, and case studies to help users develop improved QAPPs.

>  EPA QA/G-5G, March 2003, Guidance for Geospatial Data Quality Assurance Project Plans. QA/G-
    5G provides guidance on developing Quality Assurance Project Plans for geospatial data projects.

>  EPA QA/G-5M, December 2002, Guidance for Quality Assurance Project Plans for Modeling.
    QA/G-5M discusses issues to be addressed in QA Project Plan elements in the context of data used in
    modeling, emphasizing systematic planning, the use of existing data, hardware and software
    configuration issues for modeling, and the graded approach.

>  EPA QA/G-5S, December 2002, Guidance on Choosing a Sampling Design for Environmental Data
    Collection. G-5S provides guidance on applying standard statistical sampling designs (such as simple
    random sampling) and more advanced sampling designs (such as ranked set sampling, adaptive
    cluster sampling) to environmental applications.

>  EPA QA/G-6, April 2007, Guidance for Preparing Standard Operating Procedures. QA/G-6
    provides guidance to help organizations develop and document standard operating procedures
    (SOPs).  The document contains tips, advice, and case studies to help users develop improved SOPs.

>  EPA QA/G-7, May 2006, Guidance on Technical Assessments for Environmental Data Operations.
    QA/G-7 provides guidance to help organizations plan, conduct, evaluate, and document technical
    assessments for their programs.

>  EPA QA/G-8, January 2008, Guidance on  Environmental Data  Verification and Data Validation.
    QA/G-8 provides guidance to help organizations conduct data verification and data validation
    activities.

>  EPA QA/G-9R, February 2006, Data Quality Assessment: A Reviewer's Guide. QA/G-9R
    provides general guidance to organizations on assessing data quality criteria and performance
    specifications for decision making. G-9R is a non-technical document that shows a reviewer what
    constitutes an appropriate Data Quality Assessment (DQA), and how to recognize  situations or
    reports where a DQA has been conducted.

>  EPA QA/G-9S, February 2006, Data Quality Assessment: Statistical Tools for Practitioners. QA/G-
    9S can be considered the technical aspect of G-9R. The document is designed as a "tool-box" of
    useful techniques in assessing the quality of data. The overall structure of the document will enable
    the analyst to investigate many different problems using a systematic methodology.

-------
                                                                                 GLNPO QMP
                                                                                  Revision: 03
                                                                                    May 2008
                                                                                 Page 93 of 96
>  EPA QA/G-10, May 2006, Guidance for Determining Quality Training Requirements for
    Environmental Data Operations. QA/G-10 provides guidance to help organizations determine and
    develop program-specific quality system training for all levels of management and staff.

>  EPA QA/G-11, January 2005, Guidance on Quality Assurance for Environmental Technology
    Design, Construction and Operation. G-11 provides guidance on basic quality assurance and quality
    control procedures, and good engineering principles/practices that may be used in the design,
    construction, or operation of environmental technologies.

>  No number, July 1999, Guidance on Quality Assurance Project Plans for Secondary Research Data.
    Example Quality Assurance Project Plan requirements for secondary research data developed by the
    QA Managers in EPA's National Risk Management Research Laboratory.
EPA QA/G-0               "EPA Quality System Description," November 1997.

EPA QA/G-3               "Guidance for the Management Systems Review Process," January 1994.

Arter, D.R. 1989.        "Quality Audits for Improved Performance," ASQC Quality Press,
                         Milwaukee, Wisconsin.  93 pp.

Byers, G.E. 1991.        "Quality Assurance Program in Monitoring and Research Strategy For
                         Forests," EPA 600/4-91/012.  U.S. Environmental Protection Agency,
                         Las Vegas, Nevada.

CIO Order 2105           "EPA Quality Manual for Environmental Programs," CIO 2105-P-01-0.

Proposed New EPA Order   Policy for Competition in Assistance Agreements, July 15, 2002.

ISO 9001:2000            "Quality Management Systems - Requirements," International
                         Organization for Standardization, 2000.

Simes, G.F. 1989.        "Preparing Perfect Project Plans," EPA/600/9-89/087.  U.S.
                         Environmental Protection Agency, Cincinnati, Ohio.

Taylor, J.K. 1987.       "Quality Assurance of Chemical Measurements," Lewis Publishers,
                         Chelsea, Michigan.  328 pp.

-------
                                                                                   GLNPO QMP
                                                                                    Revision: 03
                                                                                      May 2008
                                                                                   Page 95 of 96
                                          Glossary
Bias                      The level of agreement between an observed value and the "true"
                          value of a characteristic.

Blind sample              A reference sample submitted to the analyst in such a manner that
                          the sample is known as a reference sample but its concentration is
                          unknown.

Comparability             The similarity of data from different sources included in a given
                          set of data and the similarity of methodologies from related
                          projects across the regions of interest.

Completeness              The quantity of data that is successfully collected with respect
                          to the amount intended in the experimental design.

Data quality objective    User-defined criteria established for each parameter to evaluate
                          usability of data.

Environmentally-          Any laboratory or field data gathering activity or investigation
related data              involving the determination of chemical, physical, or biological
                          factors related to the environment.

Environmental             Environmental information collection and assessment activities
information               represent any activities that involve collection, assessment, or
collection and            use of environmental data.  Per CIO Order 2105, environmental data
assessment                are any measurements or information that describe environmental
activities                processes or conditions, or the performance of environmental
                          technology.  For EPA, environmental data include information
                          collected directly from measurements, produced from models, and
                          compiled from other sources such as data bases or the literature.
                          Environmental technology includes treatment systems, pollution
                          control systems and devices, and waste remediation and storage
                          methods.

Measurement quality       Critical level which, if exceeded, is considered to append
objective                 additional, and possibly unacceptable, measurement uncertainty to
                          the corresponding data.

Overall data              Confounded population and measurement uncertainty occurring in a
uncertainty               sample.

Precision                 The level of agreement among multiple measurements of a
                          characteristic.

Project                   An organized undertaking or specified unit of investigation
                          involving environmentally related measurements.

Quality assurance         The total integrated program for assuring the reliability of
                          monitoring and measurement data.

Quality control sample    Any sample utilized by the analyst to check measurement conditions,
                          whose measurement is expected to fall within specific acceptance
                          criteria or control limits.

Representativeness        The degree to which the data collected accurately represents the
                          population of interest.

Uncertainty               A measure of imprecision, bias, or other sources of variability in
                          a given value.

-------
GLNPO QMP
Revision: 03
May 2008
Page 96 of 96
Validation                The process of determining the legitimacy of data for its stated purpose,
                          involving internal consistency checks for outlier removal and definition of
                          levels of confidence.

Variability                Imprecision about a specific characteristic.

-------
                                  Appendix A

       EPA Quality Manual for Environmental Programs (CIO Order 2105-P-01-0)

                                       and

     Policy and Program Requirements for the Mandatory Agency-wide Quality System
                                (CIO Order 2105)
GLNPO Quality Management Plan - Appendix A                                             May 2008

-------
                  EPA
        QUALITY MANUAL
                  FOR
ENVIRONMENTAL PROGRAMS

                 CIO 2105-P-01-0
               (formerly 5360 Al)
               May 5, 2000
     United States Environmental Protection Agency
        Office of Environmental Information
                Quality Staff
            Washington, D.C. 20460

-------
EPA Quality Manual for Environmental Programs                                          5360 A1
                                                                             05/05/2000

                             TABLE OF CONTENTS

                                                                                 Page
1.     QUALITY SYSTEM POLICY AND RATIONALE  	1-1
       1.1    Introduction	 1-1
       1.2    Scope	 1-2
       1.3    Applicability	 1-2
             1.3.1  Applicability to Environmental Programs  	 1-2
             1.3.2  Applicability to Other EPA Programs 	 1-3
       1.4    Organizational Applicability	 1-4
             1.4.1  EPA Organizations	 1-4
             1.4.2  Extramural Agreements	 1-4
       1.5    Exemptions  	 1-4

2.     QUALITY SYSTEM IMPLEMENTATION	2-1
       2.1    Introduction	2-1
       2.2    Implementation of Quality System Functions 	2-1
             2.2.1  Background  	2-1
             2.2.2  Mandatory Quality Management Tasks and Descriptions	2-2
             2.2.3  Non-Mandatory Quality Management Tasks and Descriptions	2-7
       2.3    Reporting Requirements	2-8
             2.3.1  Quality Management Plans  	2-8
             2.3.2  Quality Assurance Annual Report and Work Plan	2-9
       2.4    Requirements for Extramural Agreements	2-9
             2.4.1  Contracts  	2-9
             2.4.2  Assistance Agreements	2-10
             2.4.3  Interagency Agreements	2-10
       2.5    Requirements for Reporting Environmental Data  	2-10
             2.5.1  Technical Reports	2-10
             2.5.2  Data Management and Storage	2-11
       2.6    QA and  QC Requirements and Guidance Documents  	2-11
             2.6.1  Agency-wide Requirements and Guidance Documents  	2-11
             2.6.2  Development of User-Specific QA/QC Guidance Documents  	2-12
       2.7    QA and  QC Requirements in Regulations	2-12
       2.8    Dispute  Resolution	2-13
             2.8.1  Technical Disputes 	2-13
             2.8.2  Management Systems Disputes	2-13
       2.9    Performance Agreements	2-14

3.     QUALITY MANAGEMENT PLANS	3-1
       3.1    Introduction	3-1

-------
EPA Quality Manual for Environmental Programs                                          5360 A1
                                                                             05/05/2000

                      TABLE OF CONTENTS - Continued

                                                                                 Page
       3.2    Preparation, Submission, Review, and Approval	3-2
             3.2.1   QMP Preparation Responsibility	3-2
             3.2.2   Internal Submission and Approval	3-2
             3.2.3   Agency Review and Approval of the QMP  	3-2
             3.2.4   QMP Revisions	3-3
       3.3    Quality Management Plan Requirements	3-3
             3.3.1   General Requirements	3-3
             3.3.2   Management and Organization 	3-4
             3.3.3   Quality System and Description	3-5
             3.3.4   Personnel Qualifications and Training	3-5
             3.3.5   Procurement of Items and Services	3-6
             3.3.6   Documents and Records	3-6
             3.3.7   Computer Hardware and Software 	3-7
             3.3.8   Planning  	3-8
             3.3.9   Implementation of Work Processes	3-9
             3.3.10  Assessment and Response	3-10
             3.3.11  Quality Improvement  	3-11

4.     QUALITY ASSURANCE ANNUAL REPORT AND WORK PLAN	4-1
       4.1    Background  	4-1
       4.2    Use of QAARWP Submissions	4-1
       4.3    Requirements	4-1
             4.3.1   QA Annual Report 	4-2
             4.3.2   Work Plan	4-5

5.     QUALITY ASSURANCE PROJECT PLANS	5-1
       5.1    Introduction	5-1
       5.2    QAPP Responsibilities and Application	5-1
             5.2.1   QAPP Preparation Responsibilities and Approvals	5-1
             5.2.2   QAPP Implementation and Revision	5-2
             5.2.3   Applicability of QAPPs	5-2
       5.3    QAPP Elements and Requirements	5-3
             5.3.1   General Content Requirements	5-3
             5.3.2   Group A, Project Management	5-5
             5.3.3   Group B, Measurement/Data Acquisition  	5-7
             5.3.4   Group C, Assessment/Oversight 	5-11
             5.3.5   Group D, Data Validation and Usability  	5-12

-------
EPA Quality Manual for Environmental Programs                                  5360 A1
                                                               05/05/2000

                  TABLE OF CONTENTS - Continued

                                                                  Page

6.    QUALITY SYSTEM ASSESSMENT
           (RESERVED)

7.    DESIGN, CONSTRUCTION, AND OPERATION OF ENVIRONMENTAL
     TECHNOLOGY
           (RESERVED)

APPENDIX A.  GLOSSARY 	A-1

APPENDIX B.  REFERENCES	B-1
                                   iii

-------
EPA Quality Manual for Environmental Programs                                          5360 A1
                                                                              05/05/2000

                                    CHAPTER 1
               QUALITY SYSTEM POLICY AND RATIONALE

1.1    Introduction

       EPA Order 5360.1 CHG 2, Policy and Program Requirements for the Mandatory
Agency-wide Quality System, provides requirements for the conduct of quality management
practices, including quality assurance (QA) and quality control (QC) activities, for all
environmental data collection and environmental technology programs performed by or for this
Agency.  The primary goal of the Agency-wide Quality System is to ensure that environmental
programs and decisions are supported by data of the type and quality needed and expected for
their intended use, and that decisions involving the design, construction, and operation of
environmental technology are supported by appropriate quality assured engineering standards and
practices.  The EPA  Quality Manual for Environmental Programs provides program
requirements for implementing the mandatory Quality System defined in EPA Order 5360.1 CHG
2.

       All EPA organizational units conducting environmental programs shall comply with EPA
Order 5360.1 CHG 2.  Work  performed on behalf of EPA through appropriate extramural
agreements shall comply with the quality system requirements defined by EPA Order 5360.1 CHG
2 or applicable regulations, unilateral orders, and negotiated agreements.  Environmental data are
any measurements or information that describe environmental processes or conditions, or the
performance of environmental technology. For EPA, environmental data include information
collected directly from measurements, produced from models, and compiled from other sources
such as data  bases or the literature.  Environmental technology includes treatment systems,
pollution control systems and devices, and waste remediation and storage methods.

       In accordance with EPA Order 5360.1 CHG 2, EPA requires that environmental programs
be supported by a quality system that complies with the American National Standard
ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental
Data Collection and Environmental Technology Programs, incorporated herein by reference.
ANSI/ASQC E4-1994  is a national consensus standard authorized by the American National
Standards Institute (ANSI) and developed by the American Society for Quality (ASQ) that
provides a basis for planning,  implementing, documenting, and assessing an effective quality
system for collecting and evaluating environmental data for decisions and for use in the design,
construction, and operation of environmental technologies.  Copies of the standard may be
obtained from:

                           ASQC Quality Press
                           P.O. Box 3005
                           Milwaukee, WI 53201-3005
                           Phone: (800)248-1946

                                          1-1

-------
EPA Quality Manual for Environmental Programs                                           5360 A1
                                                                                05/05/2000

       By invoking this American National Standard as the basis for its internal Quality System,
EPA recognizes that environmental programs are diverse and impact many constituencies,
including Federal, State, local, and Tribal governments and industry.  EPA realizes that a uniform,
consistent set of quality management criteria are essential to assure effective environmental
programs by Government and industry alike. ANSI/ASQC E4-1994 provides a stable foundation
upon which quality systems supporting all environmental data and technology work across the
Agency may be based.

1.2    Scope

       The scope of this Manual includes applicable environmental programs involving:

              the collection, evaluation,  and use of environmental data by and for the Agency,
              and

              the design, construction, and operation of environmental technology by the
              Agency.

       Environmental data are critical inputs to decisions involving the protection of the public
and the environment from the adverse effects of pollutants from natural and man-made sources.

       Decisions concerning environmental and human health protection often result in requiring
the design, construction, and operation of pollution control or waste remediation systems. For
example, environmental technologies are  required to reduce contamination levels in the
environment and to maintain the levels at concentrations that do not threaten the environment or
human health and safety.

1.3    Applicability

1.3.1   Applicability to Environmental  Programs

       This Manual contains the minimum specifications for quality management functions and
activities necessary to support EPA environmental programs and satisfy the requirements of EPA
Order 5360.1 CHG 2.  Such programs may include activities encompassing the:

              characterization of environmental or ecological systems and the health of human
              populations;

              characterization of ambient conditions in air, water, sediments, and soil in terms of
              physical, chemical, radiological, or biological characteristics;
                                           1-2

-------
EPA Quality Manual for Environmental Programs                                            5360 A1
                                                                                 05/05/2000

              characterization of radioactive, hazardous, toxic, and mixed wastes in the
              environment and their health and ecological effects;

              characterization and quantification of waste and effluent discharges to the
              environment from processes and operations (e.g., energy generation, metallurgical
              processes, chemicals production), during either normal or upset conditions (i.e.,
              operating conditions that cause pollutant or contaminant discharges);

              performance assessment of environmental technology used for pollution
              prevention; pollution control; waste treatment, storage, and disposal; and waste
              remediation;

              demonstration of environmental technology (e.g., treatability and pilot studies);

              investigation of chemical, biological, physical, or radioactive constituents in
              environmental and ecological systems, and their behavior and associated interfaces
              in those systems, including exposure assessment, transport, and fate;

              definition and evaluation of methods for use in the collection, analysis, and use of
              environmental data;

              development, evaluation, and use of computer or mathematical models (and their
              input data) that characterize environmental processes or conditions;

              use of environmental data  collected for other purposes or from other sources (also
              termed "secondary data"),  including literature, industry surveys, compilations from
              computerized data bases and information systems, results from computerized or
              mathematical models of environmental processes and conditions; and

              collection  and use of environmental data pertaining to the occupational health and
               safety of personnel in EPA facilities (e.g., indoor air quality measurements) and in
              the field (e.g., chemical dosimetry, radiation dosimetry).

1.3.2   Applicability to Other EPA Programs

       This Manual applies to the collection and use of medical testing data from Government
and non-Government personnel in EPA facilities for determination of substance abuse.
                                            1-3

-------
EPA Quality Manual for Environmental Programs                                           5360 A1
                                                                               05/05/2000

1.4    Organizational Applicability

1.4.1   EPA Organizations

       The Agency-wide Quality System requirements established in EPA Order 5360.1 CHG 2
shall apply to all EPA organizations and components thereof in which the environmental programs
conducted involve the activities described in Section 1.2 above.

1.4.2   Extramural Agreements

       The Agency-wide Quality System requirements defined by Order 5360.1 CHG 2 shall
apply to non-EPA organizations as defined by terms and conditions in EPA-funded extramural
agreements.  Compliance with the Agency-wide Quality System requirements may be specified in
applicable regulations for these extramural agreements. In other cases, such requirements may be
invoked as part of negotiated agreements such as memoranda of understanding. These extramural
agreements include:

              Any organization or individual under direct contract to EPA to furnish services or
              items or perform work (i.e., a contractor) under the authority of 48 CFR 46,
              (including applicable work assignments, delivery orders, and task orders);

              Institutions of higher education, hospitals, and other non-profit recipients of
              financial assistance  (e.g., Grants and Cooperative Agreements) under the authority
              of 40 CFR 30;

              State, local, and  Tribal governments receiving financial assistance under the
              authority of 40 CFR 31 and 35; and

              Other Government Agencies receiving  assistance from EPA through interagency
              agreements.

Extramural (non-EPA) quality systems that provide objective evidence (such as a Quality
Management Plan, quality manual, or audit report acceptable to EPA) of complying fully with the
specifications of ANSI/ASQC E4-1994 are in compliance with EPA policy.

1.5    Exemptions

       Statutory requirements for quality may supersede the specifications in this Manual or be
more rigorous. In such cases, affected programs shall  be exempt from the requirements of this
Manual. EPA organizations conducting exempted activities shall comply with EPA Order 5360.1
CHG 2 in  all other respects.  The following exemptions from these requirements apply:
                                           1-4

-------
EPA Quality Manual for Environmental Programs                                            5360 A1
                                                                               05/05/2000

              The collection of environmental data under the authority of Good Laboratory
              Practices as defined by 40 CFR 792, for the Toxic Substances Control Act.

              The collection of environmental data under the authority of Good Laboratory
              Practices as defined by 40 CFR 160, for the Federal Insecticide, Fungicide, and
              Rodenticide Act.

       Requests for exemptions should be directed to the Office of Environmental Information
(OEI) Quality Staff. Exemptions shall be made by the Assistant Administrator for the OEI
(AA/OEI).
                                           1-5

-------
EPA Quality Manual for Environmental Programs                                          5360 A1
                                                                              05/05/2000

                                    CHAPTER 2
                   QUALITY SYSTEM IMPLEMENTATION
2.1    Introduction

       This chapter addresses the implementation of quality management activities to satisfy EPA
Order 5360.1 CHG 2, including limitations on the use of extramural support to implement quality
systems for programs involving environmental data operations; reporting requirements;
requirements for extramural agreements; requirements for reporting results from applicable
environmental programs; QA and QC requirements and guidance documents; user-specific QA
and QC guidance; and dispute resolution.

2.2    Implementation of Quality System Functions

2.2.1   Background

       Many quality system activities involving environmental data operations are inherently
governmental functions and must be performed only by EPA personnel or by personnel explicitly
authorized by EPA based on statute, regulation, or by the terms of an extramural agreement.
Such representatives may include other governmental personnel and with specific authorization,
contractor personnel.  When such quality management tasks are performed by a contractor, the
contract must be appropriately managed and must remain under the control of the authorized EPA
contracting representatives.  EPA cannot use cooperative agreements or grants to provide quality
management activities such as QA and QC services for EPA because it is an inappropriate use of
financial assistance (Office of General Counsel memorandum, August 2, 1994).

       This section describes the quality management tasks necessary to comply with the Order
and identifies those tasks that may be performed by non-government personnel under appropriate
management controls.

       Two types of quality management functions are described:

             Exclusively EPA Functions - inherently governmental work which must be
             performed only by responsible EPA officials, including the QA Managers (QAMs),
             or authorized EPA representatives.

             Discretionary Functions - activities that may be performed either by EPA personnel
             or by non-EPA personnel under the specific technical direction of and performance
             monitoring by the QA Manager or other responsible EPA or Government official
             under an approved contract, work assignment, delivery order, task order, etc.
                                          2-1

-------
EPA Quality Manual for Environmental Programs                                           5360 A1
                                                                                05/05/2000

       In the situations involving the other associated functions, there may be instances involving
sensitive contracting services, advisory and assistance services, and vulnerable contracting
practices as defined by the Federal Acquisition Regulations, Office of Federal Procurement Policy
(OFPP), and the EPA Contracts Management Manual (EPA Order 1900). Such situations are
identified by italicized text in the following sections.  In addition, management approval of
services contracts as defined by OFPP Letter 93-1 must be obtained for many of the associated
tasks.

       Technical direction or other instructions to an extramural organization, relating to
performance of an extramural agreement, shall be provided only by authorized EPA or other
Government representatives in accordance with the terms of the applicable extramural agreement.
Only authorized EPA or other Government representatives are to provide direction or instructions
to an extramural organization providing quality systems support for environmental programs.
This is to avoid such actions as:

              the providing of directions or instructions that are inconsistent with the terms of an
              extramural agreement,

              unauthorized access to confidential business information (CBI), or

              unauthorized access to information that may allow an extramural organization to
              gain an unfair competitive advantage.

2.2.2   Mandatory Quality Management Tasks and Descriptions

       This section describes the activities and tasks  integral to an effective quality system.
These tasks are required to implement EPA Order 5360.1 CHG 2.

2.2.2.1         Manage and Coordinate the Quality System

       Exclusively EPA functions that must be performed by EPA QA personnel include:

              managing the day-to-day implementation of the mandatory quality system.

              acting as liaison between the organization and the OEI Quality Staff on matters of
              QA policy.

              coordinating with senior management the development of and preparation of the
              organization's Quality Management Plan.

              coordinating with senior management changes  to the Quality  System as needed to
              assure its continued effectiveness and  assisting  in reporting the results annually to

                                           2-2

-------
EPA Quality Manual for Environmental Programs                                            5360 A1
                                                                                05/05/2000

               management and to the OEI Quality Staff in the QA Annual Report and Work
              Plan.

              managing organization resources designated for the quality system.

              maintaining records of pertinent quality system activities performed by the
              organization.

2.2.2.2         Review and Approve Procurement and Financial Assistance Documents for QA
        Requirements

       Exclusively EPA functions that must be performed by EPA QA personnel include:

              reviewing procurement and financial assistance documents (e.g., statements of
              work, scopes of work, applications for assistance, funding requests, and purchase
              requests) to confirm any need for QA requirements, providing any necessary
              special language or conditions for such QA requirements, and approving by
              signature the appropriate Quality Assurance Review Form.

              participating directly or indirectly in the solicitation or agreement review process
               to advise the Project Officer on the suitability of the offeror's quality system or QA
              and QC approach for the particular project.

              reviewing work assignments, delivery orders, and task orders to certify that
              appropriate QA and QC requirements have been established and that the necessary
              instructions are being communicated to the contractor to carry out the required
              QA and QC tasks.  Approving by signature appropriate Quality Assurance Review
              Form (EPA Order 1900,  Chapter 2).

2.2.2.3         Review and Approve QA Planning Documents

       Exclusively EPA functions that must be performed by EPA QA personnel or their
authorized EPA representative include:

              reviewing Quality Assurance Project Plans (QAPPs) for all projects, work
              assignments, delivery orders, task orders, grants, cooperative agreements, and
              interagency agreements involving data acquisition, data generation,  and/or
              measurement activities that are performed on behalf of EPA.

              approving all QAPPs  for implementation in all  applicable projects, work
              assignments, delivery orders, task orders, grants, cooperative agreements, and
              interagency agreements performed on behalf of EPA.

                                           2-3

-------
EPA Quality Manual for Environmental Programs                                           5360 A1
                                                                                05/05/2000

              coordinating the correction of deficient QAPPs with the Project Officer and his/her
              management.

       Discretionary functions that may be performed by either EPA personnel or non-EPA
personnel include:

              reviewing, at the specific technical direction of the QAM, QA Project Plans and
               other QA-related planning documents, such as sampling and analysis plans, Data
              Quality Objectives (DQO) specifications, etc., and providing specific
              substantiated recommendations to the QAM on the adequacy of the QA approach
              in meeting the criteria provided by the QAM.  (The reviews should identify
              specific technical deficiencies in the planning documents.)

2.2.2.4         Track and Report Quality System Deliverables

       Exclusively EPA functions that must be performed by EPA QA personnel or their
authorized EPA representative include:

               tracking critical quality system deliverables for the organization and making periodic
               reports to senior management on the status of reporting actions and deliverables.

       Discretionary functions that may be performed by either EPA personnel or non-EPA
personnel include:

              compiling/logging administrative and management information including
              turnaround times to correct deficient QAPPs, responses to audits (e.g., responses
              and corrective actions), and quality reviews of final reports.

2.2.2.5         Manage Contractor Support Work Assignments. Delivery Orders, and Task
              Orders

       Exclusively EPA functions that must be performed by EPA QA personnel include:

              serving as the Contracting Officer Representative (for example, Project Officer,
              Work Assignment Manager, or Delivery Order Project Officer) for specific QA
              support contracts, work assignments, delivery orders, and task orders.

2.2.2.6         Plan and Conduct Management Assessments

       Exclusively EPA functions that must be performed by EPA QA personnel include:
                                           2-4

-------
EPA Quality Manual for Environmental Programs                                           5360 A1
                                                                                05/05/2000

              planning, directing, and conducting assessments of the effectiveness of the quality
              system being applied to environmental data operations and reporting results to
              senior management.  Such assessments may be conducted using the Management
              Systems Review (MSR) process.

              coordinating with senior management any revision of the quality system as
              necessary based on the findings of the assessment.

       Discretionary functions that  may be performed by either EPA personnel or non-EPA
personnel include:

              providing technical support to the EPA QAM in the planning phase of
              management assessments. (Such activities are limited to the assembly and
              compilation of background information and data,  guidance documents, technical
              reports, etc., available in the public domain, for use by EPA in designing the
              assessment goals and specifications.)

2.2.2.7         Plan and Conduct Technical Assessments

       Exclusively EPA functions that must be performed by EPA QA personnel or their
authorized EPA representative include:

              planning and directing with the responsible EPA project officials the
              implementation of periodic technical assessments  of ongoing environmental data
              operations to provide information to management to assure that technical and
              quality objectives are being met and that the needs of the customer are being
              satisfied. Such assessments may include technical systems audits, surveillance,
              performance evaluations, and data quality assessments.

              determining conclusions and necessary corrective actions (if any) based  on the
              findings of the assessments.

       Discretionary functions that  may be performed by either EPA personnel or non-EPA
personnel include:

              performing technical assessments of environmental data producing activities,
              both intramural and extramural (on-site and off-site) according to a specific plan
              approved by the QAM. Preparations for such assessments may include the
              acquisition or development of audit materials and standards. Results (findings)
              are summarized, substantiated,  and presented to  the QAM or authorized EPA
              representative.
                                           2-5

-------
EPA Quality Manual for Environmental Programs                                            5360 A1
                                                                                 05/05/2000

              A determination of whether an authorized Agency representative should
              accompany a contractor's personnel should be made on a case-by-case basis only
              after coordination between the responsible organization and contracting officer.
              Such coordination should include consideration of the purpose of the
              accompaniment and clear definition of the Agency representative's role and
              responsibility during the contractor's performance of the audit or technical
              assessment to avoid the appearance of a personal services relationship.

2.2.2.8         Prepare and Present QA Training Materials and Courses

       Exclusively EPA functions that must be performed by EPA QA personnel or their
authorized EPA representative include:

              developing and presenting detailed guidance and training for QA and QC activities
              based on interpretation of Agency-wide requirements and guidance.

       Discretionary functions that may be performed by either EPA personnel or non-EPA
personnel include:

              providing or coordinating quality-related training for the organization in special
              skill areas identified by the Agency  and not generally available to the organization.

              providing allowable technical and/or logistical assistance in preparing and
              presenting quality-related technical training (within the Agency's implementation
              of special management and control measures and the constraints of potential for
              conflict of interest, of revealing confidential business information, or of
              appearing to be interpreting or representing Agency policy).

2.2.2.9         Review and Approve Final Reports for Quality  Documentation

       Exclusively EPA functions that must be performed by EPA QA personnel or their
authorized EPA representative include:

              establishing criteria for the acceptability of quality documentation in the
              organization's published papers and reports; that  is, defining what is required for an
              adequate discussion of the quality of the project results and the usability of the
              information reported.

              approving for publication those papers and  reports that meet the defined criteria.

       Discretionary functions that may be performed by either EPA personnel or non-EPA
personnel include:

              conducting a substantiated technical review of all reports produced by the
              organization using the qualitative and quantitative specifications obtained from
              the DQO process or other criteria provided by EPA. This quality review
              complements the peer review process.

2.2.3   Non-Mandatory Quality Management Tasks and Descriptions

       This section describes other activities and tasks integral to an effective quality system.
They are not explicitly required in order to implement EPA Order 5360.1 CHG 2, but if adopted, they
must be implemented as described below.

2.2.3.1         Review and Assist in the Development and Preparation of Environmental Data
        Collection Survey Designs

       Exclusively EPA functions that must be performed by  EPA QA personnel or their
authorized EPA representative include:

              interpreting Agency policy and requirements pertaining to the development and
              preparation of environmental data collection survey/experimental design
              requirements.

              providing corrective action, technical assistance, and guidance to intramural and
              extramural personnel (in accordance with terms of the extramural agreement)
              conducting environmental data operations to enable them to produce satisfactory
              design documents in a timely manner.

       Discretionary functions that may be performed by either EPA personnel or non-EPA
personnel include:

              reviewing environmental data collection survey designs using criteria provided by
              the QAM and providing the QAM with a substantiated technical assessment of the
              strengths and weaknesses in the design.

2.2.3.2         Prepare and Present Quality Management Information in the Technical Literature
        and at Meetings/Symposia

       Exclusively EPA functions that must be performed by  EPA QA personnel or their
authorized EPA representative include:

              transferring information on EPA quality management subjects to other Agency,
              public, or scientific groups through participation in technical meetings and
              symposia, and papers or articles in the technical literature, including peer reviewed
              journal papers, oral presentations, and panel discussions.

       Discretionary functions that  may be performed by either EPA personnel or non-EPA
personnel include:

              transferring information on quality management subjects to other groups through
              participation in technical meetings and symposia and through the technical
              literature, with appropriate disclaimer that the information does not represent EPA
              policy or position. (NOTE: Only EPA personnel may represent the Agency in an
              official role.)

2.2.3.3         Research Relative to Quality Management Issues

       Discretionary functions that  may be performed by either EPA personnel or non-EPA
personnel include:

              providing information to the QAMs on the state-of-the-art in quality management.

              performing  searches  of the technical and quality management literature relative to
              specific QA and QC  issues. This may include compiling summaries of alternative
              sampling and analytical methods, identifying QC reference materials, and
              determining the availability of standard operating procedures for calibrating
              certain instrumentation.

              performing  studies of quality management issues having a mathematical or
              statistical foundation with the objective of optimizing QA and QC protocols and
              procedures.

2.3    Reporting Requirements

2.3.1   Quality Management Plans

       All Agency organizational units governed by EPA Order 5360.1 CHG 2 shall document
their quality system in a Quality Management Plan (QMP). The QMP is a policy statement
describing how an EPA organization shall comply with the requirements of EPA Order 5360.1
CHG 2. Quality systems encompass the management and technical activities necessary to plan,
implement, and assess the effectiveness of QA and QC operations applied to environmental
programs. The QMP provides the blueprint for how an individual EPA Program Office, Region,
and National Laboratory or Center will plan, implement,  and assess its quality system for the
environmental work to be performed as part of its mission. QMPs are reviewed and approved by
the OEI.  Approval is valid for a period of up to five years. Specific QMP requirements are
described in Chapter 3.

2.3.2   Quality Assurance Annual Report and Work Plan (QAARWP)

       All Agency organizations subject to the requirements of EPA Order 5360.1 CHG 2 must
submit a  Quality Assurance Annual Report and Work Plan (QAARWP) annually to the OEI. The
QAARWP shall summarize the results of having implemented the quality system during the previous
fiscal year and describe QA activities planned for the fiscal year beginning in October. The
QAARWP may be used to identify limited changes or updates to the organization's approved
QMP.  The QAARWP should provide helpful information to management by documenting the
past fiscal year's activities and estimating the current year's workload based on the prior year and
the expected activities in the current year. QAARWPs are described in Chapter 4.

2.4    Requirements for Extramural Agreements

       All environmental data operations performed under extramural  agreements shall comply
with the Agency-wide Quality System requirements as defined by the relevant regulations.
Accordingly, all acquisitions and assistance  agreements must be reviewed by an authorized QA
Manager, Officer, or Coordinator, as specified in the organization's Quality Management Plan, to
determine if environmental data operations are to be performed and, if so, to ensure that
appropriate QA and QC specifications are included or identified in the  acquisition and assistance
agreement solicitation package.  Upon their receipt in response to the solicitation, proposals or
applications must be reviewed by the QA approval authority to evaluate the adequacy with which
the offeror or applicant addressed the stated specifications, as well as the adequacy of Quality
Management Plans and QA Project Plans when submitted.

2.4.1   Contracts

       The requirements for QA and QC activities in contracts are given in 48 CFR 46 and in the
Contract  Management Manual, EPA Order  1900. The requirements and the specifications set
forth in this Manual extend to all contract forms involving environmental programs, including
work assignments, delivery orders, and task orders.

       The Quality  Assurance Review Form (QARF) (EPA Order 1900, Chapter 2) provides
confirmation to the Office of Acquisition Management (OAM) that appropriate QA and QC
requirements have been determined.  OAM officials shall then incorporate the necessary standard
clauses or conditions to assure that the minimum specifications for compliance with EPA policy
are included.  The specific statement of work may include  additional QA and QC specifications
and requirements identified by the project officer in consultation with the organization's QAM.

       Approved QARFs are required for work assignments, delivery orders, and task orders.
For level-of-effort and delivery order contracts, only general QA and QC requirements are
specified in the contract. Since not all work assignments, delivery orders, and task orders issued
involve data collection, QA and QC specifications may not be necessary. The QARF attached to
the work assignment shall identify clearly those work assignments that need QA and QC
specifications and shall identify the necessary requirements.

2.4.2   Assistance Agreements

       The regulatory requirements for QA and QC activities in assistance agreements are given
in 40 CFR 30, for assistance to institutions of higher education, hospitals, and other non-profit
organizations, and in 40 CFR 31, for assistance agreements to State, local, and Tribal
governments.  The requirements and the specifications set forth in this Manual extend to all
assistance agreement forms involving environmental programs, including grants and cooperative
agreements.

2.4.3   Interagency Agreements

       EPA cannot  unilaterally require other Federal agencies to comply with Agency-wide
Quality System requirements for interagency agreements funded by EPA. QA and QC
specifications for interagency agreements must be negotiated between EPA and the other agency.
When agreement is reached on the QA and QC specifications, the specifications must be included
in the interagency agreement.

       When EPA receives funding from another Agency through an interagency agreement, the
EPA QA and QC requirements  shall apply in addition to any specifications provided by the
funding organization.  If the funding agency does not specify any requirements, EPA QA and QC
requirements given by the Order and this Manual shall apply.

2.5    Requirements for Reporting Environmental Data

2.5.1   Technical Reports

       Published Agency reports containing environmental data shall be accompanied by a
readily-identifiable section or appendix that discusses the quality of the data and any limitations on
the use of the data with respect to their original intended application. Published EPA reports
include those reports printed  by the Government or distributed through publication services to the
general public. This requirement does not apply to papers, journal articles, etc., that undergo peer
review processes external to EPA.

       Agency reports shall be reviewed by the QA manager (or other authorized official) before
publication to ensure that an adequate discussion of QA and QC activities is included.  Adequacy
is a subjective determination that should be based on the nature of the environmental data
operations performed and the intended and likely use of the data by others, or on the objectives of
the study.  The purpose of the review is to ensure that sufficient information is provided to enable
a knowledgeable reader to determine if the technical and quality goals were met for the intended
use of the data.  Reports should include applicable statements regarding the use of any
environmental data presented as a caution about possible misuse of the data for other purposes.

2.5.2   Data Management and Storage

       Data management and electronic transfer and storage of environmental data shall conform
with applicable Agency information resources management policies and procedures, or with
applicable American National Standards.  Agency policy requires that environmental data shall
have their location reported and documented as well (EPA Directive 2100, Chapter 13). Other applicable EPA
Orders include 2180.1, 2180.2, 2180.3, and 7500.1A.

2.6    QA and QC Requirements and Guidance Documents

2.6.1   Agency-wide Requirements and Guidance Documents

       The OEI Quality Staff shall develop quality management practices and "tools" for use
Agency-wide to enable effective planning, implementation, documentation, and  assessment of
individual quality systems.  The Quality Staff produces the following types of documents for this
purpose:

             Requirements Documents (QA/R-Series)

             Requirements Documents  contain mandatory, minimum specifications or
             procedures for use by non-EPA organizations that must comply with Agency-wide
             Quality System requirements. These documents are usually invoked through
             regulations.  Requirements Documents are designated the QA/R-series and are
             issued by the AA/OEI as Agency Policy documents following appropriate review
             and approval.

             Guidance Documents (QA/G-Series)

             Guidance Documents contain non-mandatory guidelines for use by EPA and non-
             EPA organizations in implementing quality management practices or QA and QC
             activities. Such documents often provide suggestions on how to meet
             specifications given in Requirements Documents.  Guidance Documents are
             designated the QA/G-series and issued as OEI reports following  peer review and
             approval.

       All EPA quality-related Requirements Documents and Guidance Documents shall be valid
for a period of five (5) years from the approval date. After five years, some action must be taken
to reaffirm the document's validity, revise it, or delete it from the Agency-wide Quality System.
Any changes to approved documents must comply with the procedures used for the original
review and approval.

2.6.2  Development of User-Specific QA and QC Guidance Documents

       Additional user-specific QA and QC requirements and guidance that are tailored to a
particular organization and its mission may be appropriate given the diversity of Agency
programs. For the purposes of this Manual, "user-specific QA and QC guidance" includes but is
not limited to written documents, computer software, and videos (if used to provide instructions).
Such guidance shall be developed by the organization itself, consistent with Agency policy and the
organization's QMP.

       Management shall ensure that all changes to the guidance documents are available to all
personnel using that guidance, including active contractors and assistance agreement recipients
(e.g., grantees, cooperative agreement holders). The changes do not become binding on
contractors and assistance agreement holders until the Office of Acquisition Management or the
Office of Grants and Debarment revises the extramural agreement at the request of the EPA
project officer.

2.7    QA and QC Requirements in Regulations

       EPA regulations often require the regulated community to collect and submit
environmental data to EPA or the State as part of compliance or enforcement actions. Some
regulations include specific QA and QC requirements while others are less specific. Because
critical decisions are often made based on data collected from the  regulated community,  the
adequacy of those data is very important, both to EPA and to the regulated community. The
inclusion of appropriate QA and QC requirements in EPA regulations helps to assure that data of
the type and quality needed for a particular decision are obtained.  The authoring organization
shall ensure that appropriate and effective specifications for QA and QC activities are included in
proposed regulations.

2.8    Dispute Resolution

       Oversight responsibilities for QA and QC activities may sometimes result in disagreements
between the oversight group and the program reviewed regarding the results of the activity. Such
disputes may occur in situations involving technical issues (e.g., audits, surveillance, data quality
assessments) and management issues (e.g., QMP reviews, management systems reviews). This
section discusses the process for resolving such disputes.

2.8.1   Technical Disputes

       For those situations in which technical issues regarding QA and QC activities are in
dispute, resolution should be sought at the lowest management level practicable. All parties
should make every effort to resolve disputes through discussion and negotiation. If unsuccessful,
final resolution should be made by the senior manager for the organization.

       It is recommended that an organization's QMP include a process for dispute resolution
within that organization. While the process described below (Section 2.8.2) may be used for
technical disputes, an organization may develop a process that meets its particular needs.

2.8.2   Management Systems Disputes

       Implementation of the Agency-wide Quality System may create disagreements arising
from the results of the review and approval of Quality Management Plans (QMPs) and from the
performance of management assessments using Management Systems Reviews (MSRs). The
dispute resolution process should only be used when parties cannot achieve mutual resolution of
their disagreement at the lowest possible administrative level. The dispute resolution process is
terminated at any point where resolution is achieved and the issue resolved.

       Disagreements should be resolved at the lowest administrative level possible. The dispute
resolution officials for management systems issues are as follows:

              Within the components of the OEI - the AA/OEI.

              Between OEI and the National Program Offices or Regional Offices - the AA/OEI
              and the respective National Program Office Assistant Administrator (AA) or
              Regional Administrator (RA).

Should agreement not be reached at this level, the issue shall be resolved by the Deputy
Administrator.

       The dispute resolution process has the following steps (an illustrative sketch of the
timeline follows the list):

       (1)    The process begins when either disagreeing party declares an issue to be
              unresolvable and sends a memorandum to the other party invoking this dispute
              resolution process, defining the disputed issue, and presenting supporting
              arguments for the first party's position on the issue.

       (2)    Within 30 days, the second party must send a draft dispute resolution package to
              the first  party. As soon as possible after this, the two parties, working together,
               must submit a dispute resolution package to the dispute resolution official.  This
               package would contain both parties' arguments, both parties' rebuttals, and any
              supporting materials.

        (3)    The dispute resolution official shall schedule a meeting for resolving the dispute
               within 30 to 60 days from receipt of the dispute resolution package and shall
               notify both parties of this date. Both parties are invited to attend the resolution
              meeting to present arguments and answer questions.  The dispute resolution
              official may get advice from third parties. The decision of the dispute resolution
              official shall be binding on both parties.
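
       The following is a minimal sketch (in Python, with placeholder dates and an assumption of
calendar days) of the deadlines implied by the three steps above; it is included only to make the
30-day and 30-to-60-day windows concrete and is not part of the dispute resolution procedure itself.

    # Hypothetical sketch of the timeline described in steps (1)-(3) above.
    from datetime import date, timedelta

    def draft_package_due(memo_sent: date) -> date:
        """Step (2): the second party's draft package is due within 30 days of the memorandum."""
        return memo_sent + timedelta(days=30)

    def meeting_window(package_received: date):
        """Step (3): the resolution meeting falls 30 to 60 days from receipt of the joint package."""
        return package_received + timedelta(days=30), package_received + timedelta(days=60)

    # Example with placeholder dates:
    print(draft_package_due(date(2000, 5, 5)))    # 2000-06-04
    start, end = meeting_window(date(2000, 6, 15))
    print(start, end)                             # 2000-07-15 2000-08-14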

2.9    Performance Agreements

       Successful implementation of the Agency-wide Quality System requires that EPA
managers and staff perform specific quality management functions.  Performance agreements for
senior managers, supervisors, and applicable staff shall contain critical element(s)
commensurate with the quality management responsibilities assigned by EPA Order 5360.1 CHG
2, this Manual, and the organization's  Quality Management Plan.

                                     CHAPTER 3
                       QUALITY MANAGEMENT PLANS

3.1    Introduction

       All Agency organizational units governed by EPA Order 5360.1 CHG 2 shall document
their quality system in a Quality Management Plan (QMP).  The QMP is a policy statement
describing how an EPA organization shall comply with the requirements of EPA Order 5360.1
CHG 2. Quality systems encompass the management and technical activities necessary to plan,
implement, document, and assess the effectiveness of QA and QC operations applied to
environmental programs.  The QMP provides the blueprint for how an individual EPA Program
Office, Region, and National Laboratory or Center will plan, implement, document,  and assess its
quality system for the environmental work to be performed as part of its mission.

       The QMP defines an organization's QA-related:

              policies and procedures,

              criteria for and areas of application, and

              roles, responsibilities, and authorities.

The QMP, therefore, is a management tool that should be appropriately tailored to the needs of
the organization.  The QMP must be sufficiently inclusive, explicit, and readable to enable
managers and supervisors to understand the priority that senior management places on QA, the
established QA policies and procedures, and their respective QA roles.  The QMP must be
constructed and written so that an assessment of its effectiveness following implementation can be
made.  This enables managers to determine whether or not the quality system is being
implemented in a way that ensures successful results from environmental programs.  The QMP
should focus on the processes used to plan, implement, document, and assess the programs to
which it is applied. The level of detail should be based on a common sense, graded approach that
establishes QA and QC requirements commensurate with the importance of the work, the
available resources, and the unique needs of the  organization.

       QMPs shall be tailored to individual requirements and modified as the requirements
change. This chapter describes the quality management practices which are normally considered
to be critical to an effective quality system.  Each Agency organization shall evaluate these key
elements to see if they are applicable to its quality system.  Where a particular element is not
relevant, a brief explanation of why it is not relevant shall be provided in the QMP. If the QMP
preparer determines that additional quality management elements are useful or necessary for an
adequate quality system, these elements shall be developed and discussed in the QMP.

3.2    Preparation, Submission, Review, and Approval

3.2.1   QMP Preparation Responsibility

       The senior manager for each EPA organization is responsible for the preparation of a
QMP that covers all environmental programs for which the manager is accountable. A senior
manager is a manager who is responsible and accountable for mission accomplishment and overall
operations. Senior management shall ensure that the quality system documented in the QMP
complies with the requirements in this Manual.

       If desired by an organization's management, a draft of the QMP may be submitted to the
OEI Quality Staff one time for informal comment prior to the formal review and approval
process.  However,  an informal review by the Quality Staff shall not be used to replace a thorough
review of the quality system and the proposed QMP by the management of the organization
preparing the plan.

3.2.2   Internal Submission and Approval

       The QMP must be approved  and signed by the senior manager of the organization
preparing the QMP. The senior manager shall indicate his/her approval of the QMP on the
signature page of the document. The senior managers are designated as follows:

       Organization Preparing the QMP           Senior Manager

       Headquarters Office                      Office Director
       Headquarters Field Component             Office Director
       Laboratory/Center/Office                  Laboratory/Center/Office Director
       Regional Office                           Regional Administrator

The QMP must also be approved and signed by the QA manager of the organization, the OEI
Quality Staff Director, and the AA/OEI. The senior manager may require the concurrence on the
QMP by appropriate subordinate line managers to encourage the acceptance and implementation
of the quality system. Subordinate line managers may include Division Directors, Branch Chiefs,
and other supervisory personnel as defined  for a particular organization.

3.2.3   Agency Review and Approval of the QMP

       The QMP must be submitted to  the  OEI Quality Staff for review and approval.  Each
QMP will be reviewed to determine  compliance with Agency-wide Quality System requirements.
The review of the QMP shall focus on the substance of the group's QA management process and
not on format or technical details. QMPs that address all mandatory program  elements and
include acceptable QA policies, procedures, administrative criteria, and management systems for
key QA elements including systematic planning processes, QA Project Plans, other QMPs,
Standard Operating Procedures, assessments, and oversight of delegated programs shall be
approved. QMP approval shall be valid for a period not to exceed five years.

3.2.4   QMP Revisions

       Although the QMP approval by the Agency is valid for up to five years, all Agency quality
systems must be reviewed at least annually by their organizations to reconfirm the effectiveness of
the approved quality management practices.  This assessment must include an evaluation of the
effectiveness of the QMP.  The process of developing and annually updating the QMP provides an
opportunity for management and staff to review and clarify roles and responsibilities, to address
problem areas, and to acknowledge successes. Having an accurate QMP at all times is an
essential element in every quality system.  Changes in QA policy and procedures shall be
documented in a timely fashion by QMP revisions.  In general, a copy of any QMP revision(s)
made during the year should be submitted to the OEI Quality  Staff as an attachment to or as part
of the QA Annual Report and Work Plan (see Chapter 4).

       A revised QMP may be submitted at any time, but is required under certain conditions
including:

              expiration of the five-year life of the approved  QMP,

              a major reorganization that requires "Directive Clearance" review (EPA, 1995),

              a significant change in the organization's mission, such as may occur in the
              reauthorization or revision of enabling environmental legislation, or

              any major change to the organization's quality system.

       All appropriate personnel performing work for the organization, including active
contractors and assistance agreement holders, shall be notified of all changes to the quality system
and the QMP that may affect their work for EPA, so that they are kept apprised of the current
requirements.

3.3    Quality Management Plan Requirements

3.3.1   General Requirements

       The QMP describes the processes by which the organization plans, implements,
documents, and determines the effectiveness of quality management practices, including QA and
QC activities, to help management to obtain results of its  technical work that are of the type and
quality needed for their intended use.  Specifically, the QMP must  discuss:

              the mission and quality policy of the organization,

              the specific roles and responsibilities of management and staff with respect to QA
              and QC activities,

              the means and structure by which effective communications are assured,

              the process(es) used to plan, implement, and assess the work performed,

              the process(es) by which measures of effectiveness for QA and QC activities will
              be established and how frequently effectiveness will be measured, and

              the process for continual improvement of the organization's quality system.

The QMP should reflect the organization's commitment to quality management principles and
practices, tailored by senior management to meet the organization's needs.

       To reduce duplication between QMPs and QAPPs, the QMP shall include discussion of
those activities, policies, and procedures that are common to all projects.  Such discussions
provide an "umbrella" under which individual project activities may be performed.

       There are ten elements contained in a QMP. If an element is not applicable to an
organization's quality system, then the QMP must state why this is the case.  Specific
requirements for each of the elements follow (Sections 3.3.2 - 3.3.11).
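
       As an illustrative aid only (a minimal sketch in Python, not an Agency requirement), the
following lists the ten elements, taken from the headings of Sections 3.3.2 through 3.3.11, and
flags any element that a draft QMP neither addresses nor explains as not applicable; the check
logic and data layout are hypothetical conventions chosen for the example.

    # Hypothetical completeness check over the ten QMP elements (Sections 3.3.2 - 3.3.11).
    QMP_ELEMENTS = [
        "Management and Organization",
        "Quality System and Description",
        "Personnel Qualifications and Training",
        "Procurement of Items and Services",
        "Documents and Records",
        "Computer Hardware and Software",
        "Planning",
        "Implementation of Work Processes",
        "Assessment and Response",
        "Quality Improvement",
    ]

    def missing_elements(qmp: dict) -> list:
        """Return elements neither addressed nor justified as not applicable.

        `qmp` maps an element name to its discussion, or to a string such as
        "Not applicable: <reason>" when the element does not apply.
        """
        return [element for element in QMP_ELEMENTS if not qmp.get(element)]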

3.3.2  Management and Organization

       Provide or address the following management and organizational items:

              a statement of the organization's policy on QA, including the importance of quality
              in its products to the organization and its mission, and why; the general
              objectives/goals of the quality system;  and the policy for resource allocation for the
              quality system, including personnel, extramural funding, and travel funding;

              an organization chart that identifies all of the components of the organization and,
              in particular, the organizational position and lines of reporting for the QA Manager
              (and any QA staff) that confirms and documents the independence of the QA
              Manager from groups generating, compiling, and evaluating environmental data;

              a discussion of the responsibilities and authorities of the QA Manager and any
              other QA staff;

              a brief discussion of the technical activities or programs that are supported by the
              quality system and to which it applies; that is, the specific programs that require
              extensive quality management controls; where oversight of delegated, contracted,
              or other extramural programs is needed to assure data quality; and where internal
              coordination of QA and QC activities among the group's organizational units needs
              to occur;

              a discussion of the QA and QC roles and responsibilities of line management,
              technical staff, and any other staff, and how these roles and responsibilities are
              incorporated into performance standards;

              a discussion of the organization's process for resolving disputes regarding quality
              system requirements, QA and  QC procedures,  assessments, or corrective actions;

              a discussion of how management shall assure that applicable elements of the
              quality system are understood  and implemented in all environmental programs; and

              an approval page for the signatures of the senior manager, senior line management
              (as appropriate), the QA manager of the organization, the OEI Quality Staff
              Director, and the AA/OEI. This approval  page may be part of a title page or a
              separate sheet following the title page.  Approving officials whose signatures must
              be contained on the approval page are described in section 3.2.2.

3.3.3   Quality System and Description

       Discuss the principal components (or  "tools") comprising the quality system and how they
are used to implement the quality system. These components include, but are not limited to,
QMPs, management assessments (self and independent), systematic planning processes, QA
Project Plans, Standard Operating Procedures, technical assessments (self and independent), and
Data Quality Assessments. This discussion shall also identify how and when the components of
the quality system are to be applied to individual projects and tasks.

3.3.4   Personnel Qualifications and Training

       State the organization's policy regarding training for management and staff. Describe the
processes and the management and/or staff responsible for:

              identifying statutory, regulatory, or professional certifications that may be required
              to perform certain operations;  and

              identifying, designing, performing, and documenting technical, quality, and project
              management training.

       Describe how staff proficiency in critical technical disciplines is maintained and
documented.

3.3.5   Procurement of Items and Services

       Describe and discuss the organization's process for ensuring that all appropriate
extramural agreements, including grants, cooperative agreements, and contracted and
subcontracted activities, involving or affecting environmental programs  shall:

              contain appropriate QA and QC requirements in all applicable documents;

              receive the same review and approval for changes as for the original documents;

              address satisfactorily all QA and QC requirements in applicable responses to
              solicitations and include QA as an integral criterion in the evaluation criteria;

              provide objective evidence of quality furnished by suppliers and subcontractors for
              applicable items and services, including source selection, source inspections,
              supplier audits, and examination of deliverables; and

              provide evidence of the supplier's capability to satisfy EPA QA and QC
              requirements  as defined in the extramural agreement or applicable regulation (e.g.,
              40 CFR 30, 40 CFR 31, 48 CFR 46).

       Describe how procurement documents or financial assistance agreements shall require
suppliers (i.e., contractors, subcontractors, or financial assistance recipients) to have a quality
system consistent with EPA requirements.  This requirement applies only to those suppliers who
provide services or items that directly affect the quality of results or products from environmental
programs.

3.3.6   Documents and Records

       Describe or provide a reference to the process:

              for identifying quality-related documents and records requiring control;

              for handling documents and records to assure their accessibility, protection from
              damage and deterioration, and means of retention, including discussion of the roles
              and responsibilities for management and staff;

              by which all technical guidance documents are prepared, reviewed, approved,
              issued, used, and revised;

              by which all planning documents (e.g., QA Project Plans, Sampling and Analysis
              Plans) are prepared, reviewed, approved, issued, used, and revised; and

              that ensures compliance with all statutory, contractual, and assistance agreement
              requirements for records from environmental programs and that provides adequate
              preservation of key records necessary to support the mission of the organization.

Documents and records, including revisions, must be reviewed for conformance with the quality
system requirements and approved by authorized personnel before general use.

       Describe or provide a reference to the management process that ensures that records
accurately reflect completed work and/or fulfill statutory and contractual requirements, including
any specific record keeping requirements defined in EPA Order 2160 and EPA Directive 2100,
Chapter 10.  The maintenance of records includes defining requirements and responsibilities for
record transmittal, distribution, retention, protection, preservation, traceability, disposition, and
retrievability.

       Identify how the disposition of records, in accordance with regulatory requirements,
schedules, or directives from senior management, is accomplished.

3.3.7   Computer Hardware and Software

       Describe or discuss:

              how applicable EPA requirements for information resources management are
              addressed (EPA Directive 2100) including Year 2000 compliance, security, and
              privacy requirements  (Chapters 5, 8, and 11 of EPA Directive 2100, respectively);

              the process for ensuring that computer hardware used in environmental programs
              meets technical requirements and quality expectations (i.e., configuration testing);

              how changes to hardware shall be controlled to assess the impact of the change on
              performance;

              the process for developing computer software, for validating, verifying, and
              documenting the software for its use, and for assuring that the software meets the
              requirements of the user (EPA Directive 2182);

              how purchased software is evaluated to meet user requirements and to comply
              with applicable contractual requirements and standards; and

              the process for ensuring that data and information produced from or collected by
              computers meet applicable EPA information resources management requirements
              and standards (EPA Directive 2100).

These discussions shall include the roles and responsibilities assigned to management and staff.
The QMP shall document how the organization manages its computer hardware and software
operations that directly impact the quality of the results of environmental programs.  Computer
programs covered by this Manual include, but are not limited to, design, design analysis, data
handling, data analysis, modeling of environmental processes and conditions, operations or
process control, and data bases.
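
       By way of illustration only (a minimal sketch in Python; the routine and values are
hypothetical and not taken from this Manual), the following shows one simple form of software
verification evidence such a process might produce for a data-handling routine: a check of
computed output against a hand-calculated value before the routine is used in an environmental
program.

    # Hypothetical verification check for a simple data-handling routine.
    def annual_mean(concentrations):
        """Return the arithmetic mean of a list of measured concentrations."""
        return sum(concentrations) / len(concentrations)

    def test_annual_mean():
        # Hand-computed expectation: (2.0 + 4.0 + 6.0) / 3 = 4.0
        assert abs(annual_mean([2.0, 4.0, 6.0]) - 4.0) < 1e-12

    test_annual_mean()
    print("verification check passed")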

3.3.8   Planning

3.3.8.1         Systematic Planning

       Environmental data operations shall be planned using a systematic planning process that is
based on the scientific method. The planning process shall be based on a common sense, graded
approach to ensure that the level of detail in planning is commensurate with the importance and
intended use of the work and the available resources. Elements of a systematic planning approach
that shall be documented include:

              Identification and involvement of the  project manager, sponsoring organization
              and responsible official, project personnel, stakeholders, scientific experts, etc.
              (e.g., all customers and suppliers);

              Description of the project goal, objectives, and questions and issues to be
              addressed;

              Identification of project schedule, resources (including budget), milestones, and
              any applicable requirements (e.g., regulatory requirements, contractual
              requirements);

              Identification of the type of data needed and how the data will be used to support
              the project's objectives;

              Determination of the quantity of data  needed and specification of performance
              criteria for measuring quality;

              Description of how, when, and where the data will be obtained (including existing
              data) and identification of any constraints on data collection;

              Specification of needed QA and QC activities to assess the quality performance
              criteria (e.g., QC samples for both the field and laboratory, audits, technical
              assessments, performance evaluations, etc.);

              Description of how the acquired data will be analyzed (either in the field or the
              laboratory), evaluated (i.e., QA review, validation, verification), and assessed
              against its intended use and the quality performance criteria.

       A systematic planning process shall ensure that all organizations and/or parties who
contribute to the quality of the environmental program or use the results are identified and that
they participate in this process. The planning process shall also provide for direct communication
between the customer and the supplier to ensure that there is a clear understanding by all
participants of the needs and expectations of the customer and the product or results to be
provided by the supplier. EPA has developed a systematic planning process called the Data
Quality Objectives Process (EPA, 1994). While not mandatory, this process is the recommended
planning approach for many EPA data collection activities.
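
       A minimal sketch (in Python, with illustrative field names of our own choosing) of how the
documented elements of a systematic planning approach listed above might be captured in a single
record follows; the Manual prescribes the content of the documentation, not this or any particular
format or tool.

    # Hypothetical record of the systematic planning elements listed above.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class SystematicPlan:
        participants: List[str]       # project manager, sponsor, stakeholders, scientific experts
        objectives: str               # project goal, questions, and issues to be addressed
        schedule_and_resources: str   # schedule, budget, milestones, applicable requirements
        data_types_and_uses: str      # type of data needed and how it supports the objectives
        quantity_and_criteria: str    # quantity of data and performance criteria for quality
        collection_approach: str      # how, when, and where data (including existing data) are obtained
        qa_qc_activities: List[str] = field(default_factory=list)  # QC samples, audits, assessments
        assessment_approach: str = "" # how data are analyzed, evaluated, and assessed against intended use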

       Describe the process for planning environmental programs, including identification of who
is responsible and how general project planning is documented. Describe who uses the planning
"tools" (as defined in the Quality System Description section of the QMP) and the roles and
responsibilities of all management and staff involved in planning.

3.3.8.2         Quality Assurance Project Plans

       Discuss how the results of planning for environmental data operations shall be
documented in a Quality Assurance Project Plan (QAPP) (see Chapter 5) and approved by
authorized personnel for implementation. The process for developing, reviewing, approving,
implementing, and revising  a QAPP, must be described in this section of the QMP. Identify the
staff who is authorized to approve QAPPs.

       Describe or discuss how data obtained from sources outside EPA that did not use an
EPA-approved QAPP (or equivalent planning document) for data collection shall be evaluated
and qualified for use.  Discuss the process for qualifying such data, including the application of
any statistical methods used.

3.3.9   Implementation of Work Processes

       Describe how and by whom work shall be implemented within the organization, including
the processes for:

              ensuring that work is performed according to plan;

              development and implementation of procedures for appropriate routine,
              standardized, special, or critical operations, including those that address, but are
              not limited to:

                     identification of operations needing procedures;
                     preparation of procedures, including form, content, and applicability; and
                     review and approval of procedures; and

              use of QA and QC "tools" such as standard operating procedures (SOPs).

       Describe how appropriate measures for controlling the release, change, and use of planned
procedures are implemented. These measures provide for the necessary approvals, specific times
and points for implementing changes, removal of obsolete documentation from work areas, and
verification that the changes are made as prescribed.

       To help assure consistency in common procedures, SOPs are encouraged for
appropriate routine, standardized, or special/critical operations. The QMP shall contain the
organization's process for identifying the need for SOPs, the process for developing SOPs, and
the policy for using SOPs.  The QMP shall also  describe the process by which SOPs are reviewed
for initial and subsequent use.

3.3.10 Assessment and Response

       Describe how and by whom assessments of environmental programs are  planned,
conducted, and evaluated. Describe the process by which management chooses a particular
assessment tool and the expected frequency of its application to environmental programs.
Available assessment tools include audits, data quality assessments, management systems reviews,
peer reviews and technical reviews, performance evaluations, readiness reviews, technical systems
audits, and surveillances. Senior management shall assess (at least annually) the adequacy of the
quality system.

       Discuss or address the following items pertaining to management and technical
assessments:

              how the process for the planning, scheduling, and implementation of assessments
              works,  as well as how the organization shall respond to needed changes;

              responsibilities, levels of participation, and authorities for all management and staff
              participating in the assessment process; and

              how, when, and by whom actions shall be taken in response to the findings of the
              assessment, and how the  effectiveness of the response shall be determined.

       Describe how the level of competence, experience, and training necessary to ensure the
capability of personnel conducting assessments is determined. Personnel conducting
assessments shall be qualified, based on project-specific requirements, to perform the  assigned
assessment. Management is responsible for choosing the assessors, defining acceptance criteria,
approving audit procedures and check lists, and identifying goals prior to initiation of an
assessment. Assessors shall be technically knowledgeable with no real or perceived conflict of
interest.  If the assessors are chosen from within the organization, they must have no direct
involvement or responsibility for the work being assessed, except for self-assessments.

       Describe how personnel conducting assessments shall have sufficient authority, access to
programs and managers, access to documents and records, and organizational freedom to:

              identify quality problems;

              identify and cite noteworthy practices that may be shared with others to improve
              the quality of their operations and products;

              propose recommendations for resolving quality problems; and

              independently confirm implementation and effectiveness of solutions.

The QMP shall also discuss conditions under which a "stop work" order may be needed, and when,
how, and by whom the authority for such decisions shall be exercised.

       Describe how assessment results shall be documented, reported to, and reviewed by
management. Describe how management shall respond to the results (or findings) and
recommendations from assessments in a timely manner. When conditions needing corrective
action are identified, the appropriate response must be made promptly. Indicate how  follow-up
action shall be taken and documented to confirm the implementation and effectiveness of the
response action.  Describe how disputes, if encountered, as a result of assessments are addressed
and by whom.

3.3.11  Quality Improvement

       Describe how the organization shall detect and prevent quality problems.  Describe the
organization's process for ensuring continual quality improvement, including the management
process for determining, planning, implementing, and evaluating the effectiveness of quality
improvement activities; who (organizationally) is responsible for quality improvement; and the
corrective action program to ensure that conditions adverse to quality are identified promptly and
corrected as soon as practical.

       Corrective actions shall include the identification of root causes of problems, the
determination of whether the problem is unique or has more generic implications, and a
recommendation of procedures to prevent recurrence.

       The QMP shall describe how staff at all levels are encouraged to identify and establish
communications among customers and suppliers, identify process improvement opportunities,
identify problems, and offer solutions to those problems.

                                    CHAPTER 4
      QUALITY ASSURANCE ANNUAL REPORT AND WORK PLAN

4.1    Background

       Careful annual planning is necessary to the success of the Agency-wide Quality System.
In order for management to budget adequate resources to implement the quality system for an
organization, estimates of the quality system workload are needed.  Moreover, management must
give priority to quality system resource needs with respect to other mission requirements.

       All Agency organizations subject to the requirements of EPA Order 5360.1 CHG 2 shall
submit a Quality Assurance Annual Report and Work Plan (QAARWP) annually.  The QAARWP
shall summarize the results of having implemented the quality system during the previous fiscal year and
describe QA activities planned for the fiscal year beginning in October. The QAARWP may be
used to identify limited changes or updates to the organization's approved Quality Management
Plan (QMP) (see Chapter 3). The QAARWP should provide helpful information to management
by documenting the past fiscal year's activities and estimating the current year's workload based
on the prior year and the expected activities in the current year.

4.2    Use of QAARWP Submissions

       The contents of the QAARWPs shall be used by OEI Quality Staff to evaluate the overall
effectiveness of the Agency-wide Quality System. As problems related to  QA are identified by
EPA organizations, the Quality Staff shall use the QAARWP information to identify systemic or
Agency-wide problems and shall initiate plans to address such problems. Unlike QMPs,
QAARWPs are not subject to a formal approval process.

4.3    Requirements

       The QAARWP must be submitted under the signature of the senior manager for the
organization to the AA/OEI. Organization is defined here as an ORD Center, Laboratory, or
Office; individual Program Office one level under an AA-ship; or Region.  In order to simplify the
submittal,  the QAARWP and its attachments may be submitted electronically to the OEI Quality
Staff Director along with a copy of the original signature page.  The QAARWP has two parts, the
annual report for the previous fiscal year and the  proposed work plan for the new fiscal year.
Specifications for the QAARWP are contained in Sections 4.3.1 - 4.3.2.

4.3.1   QA Annual Report

4.3.1.1         Quality Management Resources

       Resources provided here may be estimated if not strictly accounted for by position or
expense tracking. For example, if 10% of the time of all project officers is spent on quality
assurance activities, then an estimate can be made by multiplying 10% times the number of project
officers.  If such an estimate is made, the algorithm should be provided to clarify the response.

              Provide a current estimate of the organization's filled FTE positions.

              State the total EPA (and other Federal) FTE (to the nearest tenth of an FTE)
              involved in the management of QA and QC activities.  Examples of these activities
              include providing input to management on the need for and use of QA resources,
              disseminating Agency QA policy, developing and ensuring the implementation of
              the organization's QMP (including document tracking, auditing, and training), and
              acting as a liaison with OEI Quality Staff.  Vacancies should be listed separately.

              State the total EPA (and other Federal) FTE involved in QA and QC support
              activities. Examples of these activities include writing and reviewing QAPPs and
              Standard Operating Procedures (SOPs) and validating data.

              If your organization uses contractor support to implement the quality system, state
              the total contractor FTEs involved in QA and QC support activities. Include the
              technical QA and QC support activities of Senior Environmental Employment
              Program (SEEP) grantees or other grantees.

              State the total FTEs involved in other non-technical QA and QC support activities.
              This category would include the non-technical SEEP employees and clerical staff.

              State the total dollar amounts (rounded to nearest $K) of other QA supporting
              funds, not including travel funds or training.

              State the total dollar amounts of travel funds used for QA activities such as
              oversight, surveillance, and audits/assessments.

              State the total dollar amount of funds used for QA and QC training, including
              travel for training, and the total number of people who attended the training.
              Include training for QA staff.

              Discuss the adequacy of the above listed resources in relation to their impact on
              your quality system, including the ability to implement last year's work plan.

4.3.1.2         Training

              Briefly describe the method used to assess the organization's QA training needs
              and the results of the assessment; i.e., what needs were identified.  Discuss
              separately any needs assessment made of other organizations participating in your
              quality system (i.e., States, Tribes, Regions).

              List courses and name the supplier(s) for QA and QC courses given to your
              organization and other non-EPA organizations participating in your quality system.
              The courses attended by the QA staff, while critically important for educating new
              QA staff members, are not the primary concern.  For example, contract
              administration and field/lab equipment techniques are not considered QA courses.
              Statistics courses, however, are considered related to QA in terms of project
              planning and data analysis.

              List the attendance for each course.  If course attendees included persons from
              outside your organization, identify the total attendance by each organization
              represented.

              For each course,  indicate whether the course goals were achieved; that is, were the
              training needs fulfilled? If not, please indicate why and what you believe needs to
              be done to satisfy the remaining needs.

4.3.1.3         Management Accomplishments

              Discuss any innovative quality management practices you have developed and used
              in planning, implementing, or assessing your quality system.  Include any proposed
              revisions to your Quality Management Plan reflecting changes to your operating
              practices for QA and QC activities.

              Summarize the technical assessments that were performed by your organization on
              itself and others participating in your quality system. For each assessment, identify
              the type of assessment performed, the organization and project which were the
              subject of the assessment, when the assessment was performed, and who did the
              assessment. Also provide a general statement of the assessment results and any
              corrective actions. For example, the QA staff members performed technical
              systems audits of these projects:  (list). Note, if this information is indicated on an
              attached quality system tracking program output, give the totals here.

              Summarize the technical assessments that were performed on your organization by
              others. For each assessment, identify the type of assessment performed, the
organization and project which were the subject of the assessment, when the
              assessment was performed, and who did the assessment.  Also provide a general
              statement of the assessment results and any corrective actions.

              Summarize technical assistance given on planning, data review, etc.  For QAPPs,
              the numbers of QAPPs reviewed (or re-reviewed) and an indication of average
              turnaround time is sufficient.  Note, if tracking program outputs are attached, give
              totals for intramural and extramural projects.  Include QA assistance given by the
              organization or individuals to non-EPA organizations, including states, private
              industry, and foreign countries. Organizations participating in your quality system
              that have QMPs reviewed and approved by you should also be listed.

              List any new or revised QA guidance developed by your organization. It is not
              necessary to list SOPs written and revised during the past year.  The numbers of
               both new and revised SOPs are sufficient.

              List any publications and presentations concerning QA and QC practices and
              results by your organization.  QA guidance listed above need not be repeated.
              Internet or other electronic document forms should be included.

              List any awards or recognition related to QA given to your organization or to
              individual staff members.

4.3.1.4         Management Assessment of the Approved Quality System

               Summarize the management assessments (i.e., management systems reviews) that
              were performed by your organization on itself and others participating in your
              quality system. Include the part of the implemented program that was examined
              and what was learned (including positive findings). If corrective actions were
              indicated, summarize the response  actions taken, and discuss progress toward their
              implementation. Report on their effectiveness,  if known.

              Summarize the management assessments that were performed by others (including
              the OEI Quality Staff) on your quality system.  Include the part of the implemented
              program that was examined and what was learned.  If corrective actions were
              indicated, briefly summarize the response actions taken, and discuss progress
              toward their implementation.  In particular, highlight any incomplete corrective
              actions resulting from the Quality Staff assessments. Report on the effectiveness
              of the corrective actions, if known.

4.3.2   Work Plan

4.3.2.1         Quality Management Resources

              State the total EPA (and other Federal) FTE (to the nearest tenth of an FTE)
              proposed for supporting quality management activities in your organization.
              Examples of these activities include providing input to management on the need
              for and use of QA resources, disseminating Agency QA policy, developing and
              ensuring the implementation of the organization's QMP (including document
              tracking, auditing, and training), and acting as a liaison with the OEI Quality Staff.
              Vacancies should be listed separately.

              State the total EPA (and other Federal) FTE proposed for QA and QC support
              activities. Examples of these activities include writing and reviewing QAPPs and
              SOPs and validating data.

              If your organization uses contractor support to implement the quality system, state
              the total contractor FTEs proposed for the QA and QC support activities.

              State the total FTEs proposed for other non-technical QA and QC support
              activities.  This category would include Senior Environmental Employment
              Program employees, other grantees, and clerical staff.

              State the total dollar amounts (rounded to nearest $K) of proposed QA and QC
              support funds, not including travel funds or training.

              State the total dollar amounts of proposed travel funds for QA activities such as
              oversight, surveillance, and  audits.

              State the total dollar amounts of proposed funds for training,  including travel.

4.3.2.2         Activities

              List and briefly describe anticipated major QA and QC activities expected during
              the year including QA and QC-related training to be given, taken, or developed,
              guidance to be developed or revised, technical and management assessments of
              your organization and others participating in your quality system, and
              implementation of corrective actions from prior MSRs of your quality system.

                                    CHAPTER 5
                  QUALITY ASSURANCE PROJECT PLANS

5.1    Introduction

       EPA policy requires that all work performed by or on behalf of EPA involving the
collection of environmental data shall be implemented in  accordance with an Agency-approved
Quality Assurance Project Plan (QAPP).  The QAPP defines and documents how specific data
collection activities shall be planned, implemented, and assessed during a particular project. This
chapter presents detailed specifications on the information that must be addressed in a QAPP for
environmental data operations performed by or on behalf of EPA and on the procedures for its
review and approval.  Guidance on developing QAPPs, including examples of QAPP elements,
may be found in Guidance on Quality Assurance Project Plans (QA/G-5) (EPA 1998).

       The QAPP is a critical planning document for any environmental data operation since it
documents how environmental data operations are planned, implemented, documented, and
assessed during the life cycle of a program, project, or task. The ultimate success of an
environmental program or project depends on the adequacy and sufficiency of the quality of the
environmental data collected and used in decision-making. This may depend significantly on the
adequacy of the QAPP and its effective implementation.  Quality planning (Section 3.3.8.1) is an
absolutely essential component of project management and the QAPP provides the mechanism for
documenting the results of the planning process.  This planning must include the "stakeholders"
(i.e., the data users, data producers, decision makers, etc.) to ensure that all needs are defined
adequately at the outset and that the planning for quality addresses the specific needs defined.

       In the  sections to follow, the elements of the QAPP are discussed in detail.  These
elements represent the information generally required for most data operations involving the
characterization of environmental processes and conditions. Because of the diversity of Agency
programs, some elements described in this chapter may not be applicable to all programs. The
final decision  on the applicability or use of any or all of these elements for QAPPs shall be made
by individual EPA organizations. EPA organizations may tailor these requirements in their own
implementation documents to better fit their specific needs.

5.2    QAPP Responsibilities and Application

5.2.1   QAPP Preparation Responsibilities and Approvals

       The EPA organization's Quality Management Plan (QMP) establishes how, when, and by
whom development,  review, approval, and effective oversight of  QAPPs occurs, including
situations in which QAPPs are prepared by extramural (non-EPA) organizations.  In some cases,
it may be necessary to add special requirements to the QAPP.  The EPA organization sponsoring
the work shall define any specific requirements beyond those listed in this manual.

       No environmental data collection work shall be started until the QAPP has been approved
and distributed to project personnel except under circumstances requiring immediate action to
protect human health and the environment or operations conducted under police powers.  Some
non-data collection activities such as equipment procurement, instrument calibration, etc., may be
conducted prior to the approval of the QAPP.  In limited circumstances, EPA may grant
conditional approval to a QAPP to permit some work to begin while non-critical deficiencies in
the QAPP are being resolved.

5.2.2   QAPP Implementation and Revision

       All QAPPs shall be implemented as approved by EPA.  The organization performing the
work shall implement the approved QAPP and ensure that all personnel involved in the work have
copies of the approved QAPP and all other necessary documents. Personnel implementing the
approved QAPP should understand the requirements prior to the start of data generation
activities.

       Because of the complex and diverse nature of environmental data operations, changes to
original plans are often needed. The EPA Project Manager, with the assistance  of the QA
Manager as appropriate, must determine the impact of such changes on the technical and quality
objectives of the project. When a substantive  change is warranted,  the originator of the QAPP
shall modify the QAPP to document the change and submit the revision for approval by the same
authorities that performed the original review.  Only after the revision has been approved and
received (at least verbally, with written follow-up) by project personnel shall the change be
implemented.

       It is essential that the QAPP be kept current and that all personnel involved in the work
have easy access to a current version of the QAPP. For programs or projects of long duration,
such as multi-year monitoring programs, the QAPPs shall be reviewed at least annually by the
Project Manager.  If revisions are necessary to reflect current needs, the QAPP must be revised
and resubmitted for review and approval.

5.2.3   Applicability of QAPPs

       The QAPP requirements in this chapter apply to all (intramural and extramural)
environmental data operations that acquire, generate, or compile environmentally-related data and
that are performed by or on behalf of EPA. Extramural data operations may include contracts and
work assignments, delivery orders, task orders, cooperative agreements, interagency agreements,
State-EPA agreements, State, local, and Tribal Financial Assistance/Grants, Research Grants, and
responses to statutory or regulatory requirements and to consent agreements negotiated as part of
enforcement actions. QA and QC requirements shall be negotiated  into applicable interagency
agreements, including sub-agreements, since EPA cannot unilaterally impose its QA and QC
requirements in these agreements. Where specific Federal regulations require QA and QC
activities, QAPPs shall be prepared, reviewed, and approved in accordance with the specifications
contained in this document for the data collection activity unless superseded by the regulation.

5.3    QAPP Elements and Requirements

       Environmental data operations encompass diverse and complex activities, including rule
making, compliance with regulations, and research. As a result, some environmental data
operations may only require a qualitative discussion of the experimental process and its objectives
while others may require extensive documentation in order to adequately describe a complex
environmental program.  The content and level of detail in each QAPP may vary according to the
nature of the work being performed and the intended use of the data. The final decision on QAPP
content and level of detail belongs to the EPA organization responsible for the work to be done,
consistent with the approved QMP.

5.3.1   General Content Requirements

       The QAPP must provide sufficient detail to demonstrate that:

              the project technical  and quality objectives (e.g., Data Quality Objectives) are
              identified;

              the intended measurements or data acquisition methods are appropriate for
              achieving project objectives;

              assessment procedures are sufficient for confirming that data of the type and
              quality needed and expected are obtained; and

              any limitations on the use of the data can be identified and documented.

Most environmental data operations require the coordinated efforts of many individuals, possibly
including managers, engineers, scientists, statisticians, and  others.  The QAPP must integrate the
contributions and requirements of everyone involved into a clear, concise statement of what needs
to be accomplished, how it shall be done, and by whom. It must provide understandable
instructions to those who must implement the QAPP, including the field sampling team, the
analytical laboratory, and the data reviewers.  The use of standard operating procedures and
national standards and practices is encouraged in all aspects of the QAPP.

       In order to be effective, the QAPP must specify the level or degree of QA and QC
activities needed for the particular environmental data operations. The QA  and QC technical
requirements of a project should be commensurate with:

              the purpose of the environmental data collection (e.g., enforcement action,
              research and development),

              the type of work to be done (e.g., monitoring, site characterization, bench level
              proof of concept), and

              how the results shall be used (e.g., regulatory enforcement, permit approval).

       The QAPP must be composed  of standardized, recognizable elements covering the entire
project from planning, through implementation, to assessment. The QAPP elements that follow
are presented in that order and have been arranged for convenience into four general groups. The
four groups of elements and their intent are summarized as follows:

              A. Project Management - These elements cover the basic area of project
              management, including the project history, project objectives, and roles and
              responsibilities of the participants.  These elements document that the project has a
              defined goal and that the participants understand the goal and the approach to be
              used.

              B. Measurement/Data  Acquisition - These elements cover all aspects of
              measurement systems design and implementation, ensuring that appropriate
              methods for sampling, analysis, data handling, and QC are employed and are
              properly documented.

              C. Assessment/Oversight  - These elements address the activities for assessing the
              effectiveness of the implementation of the project and associated QA and QC
              activities. The purpose of assessment is to ensure that the QAPP is implemented
              as prescribed.

              D. Data Validation and Usability - These elements cover the QA activities that
              occur after the data collection phase of the project is completed. Implementation
              of these elements ensures that the data conform to the specified criteria,  thus
              achieving the project objectives.

       All applicable elements defined by the EPA organization sponsoring the work must be
addressed in the QAPP.  Documentation, such as an approved Work Plan, Standard Operating
Procedures (SOPs), etc., may be referenced in response to a particular required QAPP element to
reduce the size of the QAPP and the time required for preparation and review. All referenced
documents must be attached to the QAPP itself or be placed on file with the appropriate EPA
office and available for routine referencing when needed. Such references must be kept current by
the submitter.  The QAPP shall also address related QA planning documentation (e.g., Quality
Management Plans) from subcontractors or suppliers of services critical to the technical and
quality objectives of the project or task.

5.3.2   Group A, Project Management

       This  group of QAPP elements covers the basic area of project management, including:

               A1           Title and Approval Sheet
              A2           Table of Contents
              A3           Distribution List
              A4           Project/Task Organization
              A5           Problem Definition/Background
              A6           Project/Task Description
              A7           Quality Objectives and  Criteria for Measurement Data
              A8           Special Training Requirements/Certification
              A9           Documentation and Records

5.3.2.1         A1. Title and Approval Sheet

       Include the title of the plan, name of the organization(s) implementing the project, and
names, titles, signatures of appropriate approving officials and  their approval dates.  Approving
officials include the Organization's Project Manager, Organization's Quality Assurance Manager,
EPA Project Manager, and EPA Quality Assurance Manager, as appropriate. Other officials, as
needed, may include the field operations manager, laboratory manager, State officials, and other
Federal Agency officials.

5.3.2.2         A2. Table of Contents

       List the sections, figures, tables, references,  and appendices.  Document control format
may be required at the option of the Project Manager  and QA Manager.  When required, use a
document control format on each  page following the Title and  Approval Sheet.  For example, the
following may be placed on the upper right hand  corner of each page:

                           Section No.
                           Revision No.
                           Date
                           Page	of
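
       As an illustrative aid only (not part of this Manual), the document control block above
could be generated as in the sketch below; the field values shown are hypothetical.

    # Illustrative sketch of composing the document control block shown above;
    # all field values are hypothetical examples.
    def document_control_block(section_no, revision_no, date, page, total_pages):
        """Return the four-line document control entry placed on each page."""
        return (f"Section No.  {section_no}\n"
                f"Revision No. {revision_no}\n"
                f"Date         {date}\n"
                f"Page {page} of {total_pages}")

    print(document_control_block("A2", 1, "05/05/2000", 5, 14))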

5.3.2.3         A3. Distribution List

       List the individuals and their organizations who shall receive copies of the approved
QAPP and any subsequent revisions.  Include all persons responsible for implementation
(including managers), the QA managers, and representatives of all groups involved.

5.3.2.4        A4.  Project/Task Organization

       Identify the individuals or organizations participating in the project and discuss their
specific roles and responsibilities. Include the principal data users, the decision-makers, the
project QA manager, and all persons responsible for implementation.  The project quality
assurance manager must be independent of the unit generating the data. (This requirement does
not extend to senior officials, such as corporate managers or agency administrators, who are
nominally, but not functionally, involved in data generation, data use, or decision-making.) If
someone other than the project manager is responsible for approving and accepting final products
and deliverables, then that individual shall be identified. The individual responsible for
maintaining the official,  approved QAPP shall also be identified.

       Provide a concise organization chart showing the relationships and the lines of
communication among all project participants. Include other data users who are outside of the
organization generating  the data, but for whom the data are nevertheless intended; e.g., modelers,
risk assessors, design engineers, toxicologists, etc.  The organization chart must also identify any
subcontractor relationships relevant to environmental data operations. Where direct contact
among project managers and data users does not occur, such as a Superfund Potentially
Responsible Party and the EPA Risk Assessment staff, the organization chart should show the
pathway by which information is exchanged.

5.3.2.5        A5.  Problem Definition/Background

       State the specific problem to be addressed or decision to be made.  Include sufficient
background information to provide the historical perspective for the project.

5.3.2.6        A6.  Project/Task Description

       Provide a description of the work to be performed and schedule for implementation.  This
discussion may not need to be lengthy or overly detailed, but it should give an overall picture of
how the project shall resolve the problem or question described in A5. Describe in general terms
the following, as needed:

              Measurements that will be made during the course of the project.

              Applicable technical, regulatory, or program-specific quality standards, criteria, or
              objectives.

              Any special personnel and equipment requirements.

              The assessment tools needed (i.e., program technical reviews, surveillances, and
              technical audits as needed by the project or specified by the QMP) for the project.

              A schedule for the work to be performed.

              Project and QA and QC records required, including the types of reports needed.

5.3.2.7         A7.  Quality Objectives and Criteria for Measurement Data

       Provide a statement of the project quality goals or objectives and measurement
performance criteria. EPA requires the use of a systematic planning process and prefers that most
project planning be accomplished using the DQO Process. For details on the DQO Process and
when it may be used, see the Guidance for the Data Quality Objectives Process (QA/G-4) (EPA,
1994).

5.3.2.8         A8.  Special Training Requirements/Certification

       Identify and describe any specialized training or certification requirements needed by
personnel in order to successfully complete the project or task. Discuss how such training shall be
provided and how the necessary skills shall be assured and documented.

5.3.2.9         A9.  Documentation and Records

       Describe the process  and responsibilities for ensuring that the  most current approved
version of the QAPP is available.

       Itemize the information and records which must be included in a data report package and
specify the desired reporting format. Documentation can include raw data, field logs, instrument
printouts, and results of calibration and QC checks.  Specify the level  of detail of the field
sampling and/or laboratory analysis narrative needed to provide a complete description of any
difficulties encountered during sampling or analysis. The narrative is an annotated summary of
the analytical work performed by a laboratory; it describes the activities that were performed,
identifies any problems encountered, and provides additional information to help users interpret
the data received.

       Specify any requirements for the final disposition of records and documents from the
project, including location and  length of retention period.

5.3.3   Group B, Measurement/Data Acquisition

       This group of QAPP elements covers all aspects of measurement systems design and
implementation, including:

              B1    Sampling Process Design (Experimental Design)
              B2    Sampling Methods Requirements
              B3    Sample Handling and Custody Requirements
              B4    Analytical Methods Requirements
              B5    Quality Control Requirements
              B6    Instrument/Equipment Testing, Inspection, and Maintenance Requirements
              B7    Instrument Calibration and Frequency
              B8    Inspection/Acceptance Requirements for Supplies and Consumables
              B9    Data Acquisition Requirements (Non-direct Measurements)
               B10   Data Management

5.3.3.1         B1.  Sampling Process Design (Experimental Design)

       Describe the experimental design or data collection design for the project, including as
appropriate the types and numbers of samples required, the design of the sampling network,
sampling locations and frequencies, sample matrices, measurement parameters of interest, and the
rationale for the design. Classify each measurement as critical (i.e., required to achieve project
objectives) or non-critical (informational purposes only).
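
       The design elements listed above are often easiest to review in tabular form. The sketch
below is an illustration only; the parameters, matrices, counts, and frequencies are hypothetical
and do not represent required content.

    # Illustrative tabulation of a sampling design; all entries are hypothetical.
    sampling_design = [
        {"parameter": "total phosphorus", "matrix": "water",    "locations": 12, "frequency": "monthly",  "critical": True},
        {"parameter": "turbidity",        "matrix": "water",    "locations": 12, "frequency": "monthly",  "critical": False},
        {"parameter": "mercury",          "matrix": "sediment", "locations": 4,  "frequency": "annually", "critical": True},
    ]

    # Measurements classified as critical (required to achieve project objectives).
    print([row["parameter"] for row in sampling_design if row["critical"]])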

5.3.3.2         B2.  Sampling Methods Requirements

       Describe the procedures for collecting samples and identify the sampling methods and
equipment, including any implementation requirements, sample preservation requirements,
decontamination procedures, and materials needed. Identifying sampling methods by number,
date, and regulatory  citation (as appropriate) is often sufficient. If a method allows the user to
select from various options, then the method citations should state exactly which options are
being selected. Describe specific performance requirements for the method. For each sampling
method, identify any support facilities needed. The discussion should also address what to do
when a failure in the sampling or measurement system occurs, who is responsible for corrective
action, and how the effectiveness of the corrective action shall be determined and documented.

       Describe the process for the preparation and decontamination of sampling equipment,
including the disposal of decontamination by-products; the selection and preparation of sample
containers, sample volumes, preservation methods, and maximum holding times prior to sample
extraction and/or analysis.

5.3.3.3         B3.  Sample Handling and Custody Requirements

       Describe the requirements and provisions for sample handling and custody in the field, in
the laboratory, and during transport, taking into account the nature of the samples, the maximum
allowable
sample holding times before extraction or analysis, and available shipping options and schedules.
Sample handling includes preservation, packaging, shipment from the site, and storage at the
laboratory. Examples of sample labels, custody forms, and sample custody logs should be
included.

5.3.3.4         B4.  Analytical Methods Requirements

       Identify the analytical methods and equipment required, including subsampling or
extraction methods, laboratory decontamination procedures and materials (such as in the case of
hazardous or radioactive samples), waste disposal requirements (if any), and any specific
performance requirements for the method. Address what to do when a failure in the analytical
system occurs, who is responsible for corrective action, and how the effectiveness of the
corrective action shall be determined and documented.

       Identifying analytical methods by number, date, and regulatory citation (as appropriate) is
often sufficient. If a method allows the user to select from various options, then the method
citations should state exactly which options are being selected. For non-standard methods, such
as unusual sample matrices and situations, appropriate method performance study information is
needed to confirm the performance of the method for the particular matrix. If previous
performance studies are not available, they must be developed during the project and included as
part of the project results.

5.3.3.5         B5.  Quality Control Requirements

       Identify QC procedures needed for each sampling, analysis, or measurement technique.
For projects at or beyond the "proof-of-concept" stage and projects employing well-characterized
methods, this section should list each required QC procedure,  along with the associated
acceptance criteria and corrective action. Because standard methods are often vague or
incomplete in specifying QC requirements, simply relying on the cited method to provide this
information is usually insufficient.

       Identify required measurement QC checks for both the field and the laboratory;  for
example, blanks, duplicates, matrix spikes, laboratory control samples, surrogates, or second
column confirmation.  State the frequency of analysis for each type of QC check, and the spike
compounds, sources, and levels. State or reference the required control limits for each QC check,
the corrective action required when control limits are exceeded, and how the effectiveness of the
corrective action shall be determined and documented.

       Describe or reference the procedures to be used to calculate each of the QC statistics,
including the QC checks described in the preceding paragraph as well as precision and bias.
Copies of the formulas are acceptable as long as the accompanying narrative or explanation
specifies clearly how the calculations will address difficult situations such as missing data values
and "less than" or "greater than" values.

5.3.3.6         B6.  Instrument/Equipment Testing, Inspection, and Maintenance Requirements

       Describe how inspections and acceptance testing of environmental sampling and
measurement systems and their components shall be performed and documented to assure their
intended use as specified by the design. Identify and discuss the procedure by which final
acceptance shall be performed by independent personnel (e.g., personnel other than those
performing the data collection work) and/or by the EPA Project Officer. Describe how
deficiencies are to be resolved, when re-inspection shall be performed, and how the
effectiveness of the corrective action shall be determined and documented.

       Identify the equipment and/or systems requiring periodic maintenance. Describe or
reference how periodic preventive and corrective maintenance of measurement or test equipment
shall be performed to ensure availability and satisfactory performance of the systems. Discuss
how the availability of critical spare  parts, identified in the operating guidance and/or design
specifications of the systems, shall be ensured and maintained.

5.3.3.7         B7.  Instrument Calibration and Frequency

       Identify all tools, gauges, instruments, and other sampling, measuring, and test equipment
used for data collection activities affecting quality that must be controlled and, at specified
periods, calibrated to maintain performance within specified limits. Describe or reference how
calibration shall be conducted using  certified equipment and/or standards with known valid
relationships to nationally recognized performance standards. If no such nationally recognized
standards exist, document the basis for the calibration. Identify the certified equipment and/or
standards used for calibration.  Indicate how records of calibration shall be maintained and be
traceable to the instrument.

5.3.3.8         B8.  Inspection/Acceptance Requirements for  Supplies and Consumables

       Describe how and by whom  supplies and consumables  shall be inspected and accepted for
use in the project.  State acceptance  criteria for such supplies and consumables.  They include, but
are not limited to: sample bottles, calibration gases, reagents, hoses, materials for decontamination
of sampling equipment, deionized water, and potable water.

5.3.3.9         B9. Data Acquisition Requirements (Non-direct Measurements)

       Identify any types of data needed for project implementation or decision making that are
obtained from non-measurement sources such as computer data bases, spreadsheets, programs,
and literature files. Define acceptance criteria for the use of such data in the project. Discuss any
limitations on the use of the data resulting from uncertainty in its quality and from the impact of
adding more error to the results.

5.3.3.10        B10. Data Management

       Describe the project data management scheme, tracing the path of the data from their
generation in the field or laboratory to their final use or storage.  Describe or reference the
standard record-keeping procedures, document control system, and the approach used for data
storage and retrieval on electronic media. Discuss the control mechanism for detecting and
correcting errors and for preventing loss of data during data reduction (i.e.,  calculations), data
reporting,  and data entry to forms, reports, and databases. Provide examples of any forms or
checklists  to be used.
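
       As an illustration only (not a required mechanism), a simple automated range check is one
way to flag likely transcription errors before records are loaded; the field names and limits
below are hypothetical.

    # Illustrative range check for flagging likely data-entry errors;
    # field names and acceptance ranges are hypothetical examples.
    ACCEPTANCE_RANGES = {"ph": (0.0, 14.0), "temperature_c": (-5.0, 45.0)}

    def flag_entry_errors(record):
        """Return the fields whose values are missing or outside plausible ranges."""
        problems = []
        for field, (low, high) in ACCEPTANCE_RANGES.items():
            value = record.get(field)
            if value is None or not (low <= value <= high):
                problems.append(field)
        return problems

    print(flag_entry_errors({"ph": 72.0, "temperature_c": 18.5}))  # ['ph'] -- likely a missed decimal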

       Identify and describe all  data handling equipment and procedures to  process, compile, and
analyze the data including procedures for addressing data generated as part  of the project as well
as data from other sources.  Include any required computer hardware and software and address
any specific performance requirements for the hardware/software configuration used.  Describe
the procedures that shall be followed to demonstrate acceptability of the hardware/software
configuration required.

       Describe the process for  assuring that applicable Agency  information resource
management requirements (EPA Directive 2100) are satisfied.  Agency policy requires that
locational  data be collected and reported with environmental data.  If other Agency data
management requirements are applicable, such as the Chemical Abstract Service Registry Number
Data Standard (EPA Order 2180.1), Data Standards for the Electronic Transmission of
Laboratory Measurement Results (EPA Order 2180.2), or the Minimum Set of Data Elements for
Ground-Water Quality (EPA Order 7500.1 A), discuss how these requirements  are addressed.
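
       For illustration only, a reported result that carries locational data and a Chemical
Abstract Service Registry Number with the measurement might be structured as in the sketch
below; the station, coordinates, and result are hypothetical (the CAS number shown is the actual
number for benzene).

    # Illustrative result record; station, coordinates, and result are hypothetical.
    result_record = {
        "station_id": "GB-07",
        "latitude_dd": 44.5321,      # locational data reported with the result
        "longitude_dd": -88.0054,
        "analyte": "benzene",
        "cas_number": "71-43-2",     # Chemical Abstract Service Registry Number
        "result_ug_per_l": 1.8,
    }
    print(result_record["cas_number"])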

5.3.4   Group C, Assessment/Oversight

       This group of QAPP elements addresses the activities for assessing the effectiveness of the
implementation of the project and associated QA and QC activities, including:

               C1    Assessments and Response Actions
              C2    Reports to Management

5.3.4.1         C1. Assessments and Response Actions

       Identify the number, frequency, and type of assessment activities needed for this project.
Assessments include, but are not limited to, surveillance, management systems review, readiness
review, technical systems audit, performance evaluation, audit of data quality, and data quality
assessment.

       List and describe the assessments to be used in the project.  Discuss the information
expected and the success criteria (i.e., goals, performance objectives, acceptance criteria
specifications, etc.) for each assessment proposed. List the approximate schedule of activities.
For any planned self-assessments (utilizing personnel from within the project groups), identify
potential participants and their exact relationship within the project organization. For independent
assessments, identify the organization and person(s) that shall perform the assessments if this
information is  available.  Describe how and to whom the results of the assessments shall be
reported.

       Define the scope of authority of the assessors, including stop work orders.  Define
explicitly the unsatisfactory conditions under which the assessors are authorized to act and
provide an approximate schedule for the assessments to be performed.

       Discuss how response actions to non-conforming conditions shall be addressed and by
whom. Identify who is responsible for implementing the response action. Describe how response
actions shall be verified, validated, and documented.

5.3.4.2         C2. Reports to Management

       Identify the frequency and distribution of reports issued to inform management of the
status of the project; results of performance evaluations and system audits; results of periodic data
quality assessments; and significant quality assurance problems and recommended solutions.
Identify the preparer and the recipients of the reports.

5.3.5   Group D, Data Validation and Usability

       This group of QAPP elements covers the QA activities that occur after the data collection
phase of the project is completed, including:

               D1     Data Review, Validation, and Verification Requirements
              D2     Validation and Verification Methods
              D3     Reconciliation with User Requirements

5.3.5.1         D1.  Data Review, Validation, and Verification Requirements

       State the criteria used to review and validate - that is, accept, reject, or qualify - data, in an
objective and consistent manner.  Provide examples of any forms or checklists to be used.
Identify any project-specific calculations required.

5.3.5.2         D2.  Validation and Verification Methods

       Describe the process to be used for validating and verifying data, including the chain of
custody for data throughout the life cycle of the project or task.  Discuss how issues shall be
resolved and the authorities for resolving such issues. Describe how the results are conveyed to
data users.

5.3.5.3         D3.  Reconciliation with User Requirements

       Describe how the results obtained from the project or task shall be reconciled  with the
requirements defined by the user.  Describe how issues shall be resolved. Discuss how limitations
on the use of the data shall be reported to decision makers.

                                     APPENDIX A
                                      GLOSSARY

assessment - the evaluation process used to measure the performance or effectiveness of a system
and its elements.  As used here, assessment is an all-inclusive term used to denote any of the
following: audit, performance evaluation, management systems review, peer review, inspection, or
surveillance.

audit (quality) -  a systematic and independent examination to determine whether quality
activities and related results comply with planned arrangements and whether these arrangements
are implemented effectively and are suitable to achieve objectives.

bias - the systematic or persistent distortion of a measurement process which causes errors in one
direction (i.e., the expected sample measurement is different from the sample's true value).

calibration - comparison of a measurement standard, instrument, or item with a standard or
instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those
inaccuracies by adjustments.

data quality assessment (DQA) - a statistical and scientific evaluation of the data set to
determine the validity and performance of the data collection design and statistical test, and to
determine the adequacy of the data set for its intended use.

data quality objectives (DQOs) - qualitative and quantitative statements derived from the DQO
Process that clarify study objectives, define the appropriate type of data, and specify tolerable
levels of potential decision errors that will be used as the basis for establishing the quality and
quantity of data needed to support decisions.

data quality objectives process - a systematic planning tool to facilitate the planning of
environmental data collection activities.  Data quality objectives are  the qualitative and
quantitative outputs from the DQO Process.

design - specifications, drawings, design criteria,  and performance requirements. Also the result
of deliberate planning, analysis, mathematical manipulations, and design processes.

document - any compilation of information which describes, defines, specifies, reports, certifies,
requires, or provides data or results pertaining to environmental programs.

environmental conditions - the description of a physical medium (e.g., air, water,  soil, sediment)
or biological system expressed in terms of its physical, chemical, radiological, or biological
characteristics.

environmental data - any measurements or information that describe environmental processes,
location, or conditions; ecological or health effects and consequences; or the performance of
environmental technology.  For EPA, environmental data include information collected directly
from measurements, produced from models, and compiled from other sources such as data bases
or the literature.

environmental data operations - work performed to obtain, use, or report information
pertaining to environmental processes and conditions.

environmental processes - manufactured or natural processes that produce discharges to, or that
impact, the ambient environment.

environmental programs  - work or activities involving the environment, including but not
limited to: characterization of environmental processes and conditions; environmental monitoring;
environmental research and development; and the design, construction, and operation of
environmental technologies; and laboratory operations on environmental samples.

environmental technology - an all-inclusive term used to describe pollution control devices and
systems, waste treatment processes and storage facilities, and site remediation technologies and
their components that may  be utilized to remove pollutants or contaminants from or prevent them
from entering the environment. Examples include wet scrubbers (air), soil washing (soil),
granulated activated carbon unit (water), and filtration (air, water).  Usually, this term applies to
hardware-based systems; however, it also applies to methods or techniques used for pollution
prevention, pollutant reduction, or containment of contamination to prevent further movement of
the  contaminants, such as capping, solidification or vitrification, and biological treatment.

extramural agreement - a legal agreement between EPA and an organization outside EPA for
items or services to be provided.  Such agreements include contracts, work assignments, delivery
orders, task orders, cooperative agreements, research grants, state and local grants, and EPA-
funded interagency agreements.

financial assistance - the process by which funds are provided by one organization (usually
government) to another organization for the purpose of performing work or furnishing services or
items. Financial assistance mechanisms include grants, cooperative agreements,  and government
interagency agreements.

graded approach - the process of basing the level of managerial controls applied to an item or
work on the intended use of the results and the degree of confidence needed in the quality of the
results.

independent assessment - an assessment performed by a qualified individual, group, or
organization that is not a part of the organization directly performing and accountable for the
work being assessed.

management - those individuals directly responsible and accountable for planning, implementing,
and assessing work.

management assessment - the qualitative assessment of a particular program operation and/or
organization(s) to establish whether the prevailing quality management structure, policies,
practices, and procedures are adequate for ensuring that the type and quality of results needed are
obtained. A management assessment may either be performed by those immediately responsible
for overseeing and/or performing the work (i.e., a management self-assessment) or by someone
other than the group performing the work (i.e., a management independent assessment).

management system - a structured non-technical system describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of an
organization for conducting work and producing items and services.

management systems  review (MSR) - the qualitative assessment of a data collection operation
and/or organization(s) to establish whether the prevailing quality management structure, policies,
practices, and procedures are adequate for ensuring that the type and quality of data needed are
obtained.

measurement and testing equipment - tools, gauges, instruments,  sampling devices or systems
used to calibrate, measure, test, or inspect in order to control or acquire data to verify
conformance to specified requirements.

method - a body of procedures and techniques for performing an activity (e.g., sampling,
chemical analysis, quantification) systematically presented in the order in which they are to be
executed.

observation - an assessment conclusion that identifies a condition (either positive or negative)
which does not represent a significant impact on an item or activity.  An observation may identify
a condition which does not yet cause a degradation of quality.

organization - a company, corporation, firm, enterprise, or institution, or part thereof, whether
incorporated or not, public or private, that has its own functions and administration. In the
context of this Manual, an EPA organization is an office, region, national center or laboratory.

peer review - a documented critical review of work by qualified individuals (or organizations)
who are independent of those who performed the work, but are collectively equivalent in technical
expertise.  A peer review is conducted to ensure that activities are technically adequate,
competently performed, properly documented, and satisfy established technical and quality
requirements.  The peer review is an in-depth assessment of the assumptions, calculations,
extrapolations, alternate interpretations, methodology, acceptance criteria, and conclusions
pertaining to specific work and of the documentation that supports them.

performance evaluation (PE) - a type of audit in which the quantitative data generated in a
measurement system are obtained independently and compared with routinely obtained data to
evaluate the proficiency of an analyst or laboratory.

precision - a measure of mutual agreement among individual measurements of the same property,
usually under prescribed similar conditions, expressed generally in terms of the standard deviation.

process - a set of interrelated resources and activities which transforms inputs into outputs.
Examples of processes include analysis, design, data collection, operation, fabrication, and
calculation.

quality - the totality of features and characteristics of a product or service that bear on its ability
to meet the stated or implied needs and expectations of the user.

quality assurance (QA) - an integrated system of management activities involving planning,
implementation, documentation, assessment, reporting, and quality improvement to ensure that a
process, item, or service is of the type and quality needed and expected by the customer.

quality assurance manager (QAM) - the individual designated as the principal manager within
the organization having management oversight and responsibilities for planning, documenting,
coordinating, and assessing the effectiveness of the quality system for the organization.

quality assurance project plan (QAPP) - a document describing in comprehensive detail the
necessary QA, QC, and other technical activities that must be implemented to ensure that the
results of the work performed will satisfy the stated performance criteria.

quality control (QC) - the overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the
stated requirements established by the customer; operational techniques and activities that are
used to fulfill requirements for quality.

quality improvement - a management program for improving the quality of operations.  Such
management programs generally entail a  formal mechanism for encouraging worker
recommendations with timely management evaluation and feedback or implementation.

quality management - that aspect of the overall management system of the organization that
determines and implements the quality policy. Quality management includes strategic planning,
allocation of resources, and other systematic activities (e.g., planning, implementation,
documentation, and assessment) pertaining to the quality system.

quality management plan (QMP) - a document that describes a quality system in terms of the
organizational structure, policy and procedures, functional responsibilities of management and
staff, lines of authority, and required interfaces for those planning, implementing, documenting,
and assessing all activities conducted.

quality system - a structured and documented management system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation
plan of an organization for ensuring quality in its work processes, products (items), and services.
The quality system provides the framework for planning, implementing, documenting, and
assessing work performed by the organization and for carrying out required QA and QC activities.

readiness review - a systematic, documented review of the readiness for the start-up or
continued use of a facility, process, or activity.  Readiness reviews are typically conducted before
proceeding beyond project milestones and prior to initiation of a major phase of work.

record - a completed document that provides objective evidence of an item or process. Records
may include photographs, drawings, magnetic tape, and other data recording media.

scientific  method - the principles and processes regarded as necessary for scientific investigation,
including  rules for concept or hypothesis formulation, conduct of experiments, and validation of
hypotheses by analysis of observations.

self-assessment - assessments of work conducted by individuals, groups, or organizations directly
responsible for overseeing and/or performing the work.

standard  operating procedure (SOP) - a written document that details the method for an
operation, analysis, or action with thoroughly prescribed techniques and steps,  and that is
officially approved as the method for performing certain routine or repetitive tasks.

supplier - any individual or organization furnishing items or services or performing work
according to a procurement document or financial assistance agreement. This is an all-inclusive
term used in place of any of the following: vendor, seller, contractor, subcontractor, fabricator, or
consultant.

surveillance (quality) - continual or frequent monitoring and verification of the status of an
entity and the analysis of records to ensure that specified requirements are being fulfilled.

                                           A-5

-------
EPA Quality Manual for Environmental Programs                                            5360 A1
                                                                                 05/05/2000

technical assessment - the evaluation process used to measure the performance or effectiveness
of a technical system and its elements with respect to documented specifications and objectives.
Such assessments may include qualitative and quantitative evaluations.  A technical assessment
may either be performed by those immediately responsible for overseeing and/or performing the
work (i.e., a technical self-assessment) or by someone other than the group performing the
(i.e., a technical independent assessment).

technical review - a documented critical review of work that has been performed within the state
of the art. The review is accomplished by one or more qualified reviewers who are independent
of those who performed the work, but are collectively equivalent in technical expertise to those
who performed the original work. The review is an in-depth analysis and evaluation of
documents, activities,  material, data, or items that require technical verification or validation for
applicability, correctness, adequacy, completeness, and assurance that established requirements
are satisfied.

technical systems  audit (TSA) - a thorough, systematic, on-site, qualitative audit of facilities,
equipment, personnel, training, procedures, record keeping, data validation, data management,
and reporting aspects of a system.

user - an organization, group, or individual that utilizes the results or products from
environmental programs or a customer for whom the results or products were collected or
created.

validation  - confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use are fulfilled. In design and development, validation
concerns the process of examining a product or result to determine conformance to user needs.

verification - confirmation by examination and provision of objective  evidence that specified
requirements have been fulfilled. In design and development, verification concerns the process of
examining a result  of a given activity to determine conformance to the stated requirements for that
activity.
                                           A-6

-------
EPA Quality Manual for Environmental Programs                                         5360 A1
                                                                            05/05/2000

                                  APPENDIX B
                                  REFERENCES

Title 40, Part 30, Code of Federal Regulations, "Grants and Agreements With Institutions of
      Higher Education, Hospitals, and Other Non-Profit Organizations."

Title 40, Part 31, Code of Federal Regulations, "Uniform Administrative Requirements for Grants
      and Cooperative Agreements to State and Local Governments."

Title 40, Part 160, Code of Federal Regulations, "Good Laboratory Practices (Pesticides)."

Title 40, Part 792, Code of Federal Regulations, "Good Laboratory Practices (Toxic Substances
      Control Act)."

Title 48, Part 46, Code of Federal Regulations, "Quality Assurance."

ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental
      Data Collection and Environmental Technology Programs, American National Standard,
      January 1995.

EPA Directive 2100, Information Resources Management Policy Manual, 1998.

EPA Directive 2182, EPA System Design and Development Guidance, December 1996.

EPA Order 1900, Contracts Management Manual, February 1998.

EPA Order 1900.1 A, Use of Contractor Services to Avoid Improper Contractor Relationships,
      April 1994.

EPA Order 2160, Records Management Manual, July 1984.

EPA Order 2180.1,  Chemical Abstract Service Registry Number Data Standard, June 1987.

EPA Order 2180.2,  Data Standards for the Electronic Transmission of Laboratory Measurement
      Results, December 1988.

EPA Order 2180.3,  Facility Identifications Standard, April 1990.

EPA Order 5360.1 CHG 2, Policy and Program Requirements for the Mandatory Agency-wide
      Quality System, XXXX 1999.

EPA Order 7500.1 A, Minimum Set of Data Elements for Ground-Water Quality, October 1992.

                                         B-1

-------
EPA Quality Manual for Environmental Programs                                          5360 A1
                                                                             05/05/2000

Office of Federal Procurement Policy, 1992.  OFPP Policy Letter 92-1, "Inherently Governmental
      Functions," 57 FR 45096, Washington, DC.

Office of Federal Procurement Policy, 1993.  OFPP Policy Letter 93-1, "Management Oversight
      of Service Contracting," 58 FR 63593, Washington, DC.

U.S. Environmental Protection Agency, 1998. Guidance for Quality Assurance Project Plans
      (QA/G-5), EPA/600/R-98/018, Office of Research and Development.

U.S. Environmental Protection Agency, 1996. Guidance for the Data Quality Assessment
      Process:  Practical Methods for Data Analysis (QA/G-9), EPA/600/R-96/084, Office of
      Research and Development.

U.S. Environmental Protection Agency, 1995. The Directives Clearance Process - What It Is
      and How It Works, EPA/200/F-95/001, Office of Administration  and Resource
      Management.

U.S. Environmental Protection Agency, 1994. Guidance for the Data Quality Objectives Process
      (QA/G-4), EPA/600/R-96/055, Office of Research and Development.

U.S. Environmental Protection Agency, 1994. Memorandum from Ray E. Spears, Associate
      General Counsel, to H. Matthew Bills, Director for Modeling, Monitoring Systems and
      Quality Assurance, "Use of Extramural Support to Implement Quality Systems for
      Environmental Programs Involving Environmental Data Operations."
                                         B-2

-------
 EPA
Classification No.:  CIO 2105.0 (formerly 5360.1 A2)

Approval Date:     May 5, 2000
                     POLICY AND PROGRAM REQUIREMENTS
             FOR THE MANDATORY AGENCY-WIDE QUALITY SYSTEM
1.  PURPOSE. This Order re-affirms the policy defined by EPA Order 5360.1 (April 1984) and
subsequent editions, and expands that policy to accommodate the current and evolving needs of
the Agency. The Order establishes policy and program requirements for the preparation and
implementation of organizational or programmatic management systems pertaining to quality and
contains the minimum requirements for the mandatory Agency-wide Quality System.

2.  BASIS. AUTHORITY. AND REQUIREMENTS.

       a. Since 1979, Agency policy has required participation in an Agency-wide Quality
System by all EPA organizations (office, region, national center or laboratory) supporting
environmental programs and by non-EPA organizations performing work on behalf of EPA
through extramural agreements. This policy was affirmed in EPA Order 5360.1 in April 1984 and
is reaffirmed in this Order.

       b. It is EPA policy that all environmental programs performed by EPA or directly for
EPA through EPA-funded extramural agreements shall be supported by individual  quality systems
that comply fully with the American National Standard ANSI/ASQC E4-1994, Specifications and
Guidelines for Quality Systems for Environmental Data Collection and Environmental
Technology Programs, incorporated herein by reference.  ANSI/ASQC E4-1994 is a national
consensus standard authorized by the American National Standards Institute (ANSI) and
developed by the American Society for Quality Control (ASQC) that will provide a basis for the
planning, implementation, documentation, and assessment of the Agency-wide Quality System.
Adoption of this standard is consistent with the statutory authority of the National Technology
Transfer and Advancement Act of 1995 and the implementation authority of Office of
Management and Budget (OMB) Circular A-119, Federal Participation in the Development and
Use of Voluntary Consensus Standards and in Conformity Assessment Activities.

       c. Under Delegation of Authority 1-41, "Mandatory Quality Assurance Program," the
Office of Environmental Information (OEI) is the focal point in the Agency for Quality System

-------
EPA ORDER                                                                  5360.1A2
                                                                              05/05/2000

policy. OEI is responsible for developing quality assurance (QA) and quality control (QC)
requirements and for overseeing implementation of the Agency-wide Quality System. The
Assistant Administrator for OEI (AA/OEI) is designated as the Agency Senior Management
Official for Quality. The Quality Staff is designated by the AA/OEI to serve as the central
management authority for this program.

       d.  Each EPA Headquarters Office, National Program Office, Region, and components
thereof, that conducts activities  described by ANSI/ASQC E4-1994 shall develop and implement
a quality system that complies with the requirements of this Order.

3. BACKGROUND.

       a.  The Agency-wide Quality System is a management system that provides the necessary
elements to plan, implement, document, and assess the effectiveness of QA and QC activities
applied to environmental programs conducted by or for EPA. This system embraces many
functions including:

       •      establishing quality management policies and guidelines for the development of
              organization- and project-specific quality plans;
       •      establishing criteria and guidelines for planning, implementing, documenting, and assessing
              activities to obtain sufficient and adequate data quality;
       •      providing an information focal point on QA and QC concepts and practices;
       •      performing management and technical assessments to ascertain effectiveness of QA and
              QC implementation; and
       •      identifying and developing training programs related to QA and QC implementation.

In addition, this Order expands the applicability of QA and QC activities to the design,
construction, and operation by EPA organizations of environmental technology such as pollution
control and abatement systems; treatment, storage, and disposal systems; and remediation
systems.

       b.  A consistent, Agency-wide Quality System  will provide, when implemented, the needed
management and technical practices to assure that environmental data used to support Agency
decisions are of adequate quality and usability for their intended purpose. Since most EPA
decisions rest on environmental data, a management system is needed that provides for: (1)
identification of environmental programs for which QA and QC activities are needed, (2)
specification of the quality of the data required from environmental programs, and (3) provision of
sufficient resources to assure that an adequate level of QA and QC activities are performed.

4. REFERENCES. The following documents contain provisions which, through reference in this
text, constitute provisions of this Order.  At the time of the issuance of this Order, the editions

-------
EPA ORDER                                                                5360.1A2
                                                                           05/05/2000

were valid.  Since policy documents and standards are subject to periodic revision, users of this
Order should apply the most recent editions of the documents indicated below.

       a. 40 CFR 30, "Grants and Agreements With Institutions of Higher Education, Hospitals,
and Other Non-Profit Organizations."

       b. 40 CFR 31, "Uniform Administrative Requirements for Grants and Cooperative
Agreements to State and Local Governments."

       c. 40 CFR 35, "State and Local Assistance."

       d. 48 CFR 46, "Quality Assurance."

       e. ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for
Environmental Data Collection and Environmental Technology Programs, American National
Standard, January 1995.

       f. Circular A-119, Federal Participation in the Development and Use of Voluntary
Consensus Standards and in Conformity Assessment Activities, Office of Management and
Budget, February 1998.

       g. Delegation of Authority 1-41, "Mandatory Quality Assurance Program," U.S.
Environmental Protection Agency, Washington, DC, April 1981.

       h. EPA Order 5360 CHG 1, EPA Quality Manual for Environmental Programs, 1999.

       i. National Technology Transfer and Advancement Act of 1995, PL104-113, March
1996.

5. SCOPE AND FIELD OF APPLICATION.

       a. Scope.  This Order  defines the minimum requirements for quality systems supporting
EPA environmental programs  that encompass:

             (1) the collection, evaluation, and use of environmental data by or for EPA, and

             (2) the design, construction, and operation of environmental technology by EPA.

       b. Applicability to Environmental Programs.  This Order applies to (but is not limited to)
the following environmental programs:

-------
EPA ORDER                                                                  5360.1A2
                                                                              05/05/2000

              (1) the characterization of environmental or ecological systems and the health of
human populations;

              (2) the direct measurement of environmental conditions or releases, including
sample collection, analysis, evaluation, and reporting of environmental data;

              (3) the use of environmental  data collected for other purposes or from other
sources (also termed "secondary data"), including literature, industry surveys, compilations from
computerized data bases and information systems, results from computerized or mathematical
models of environmental processes and conditions; and

              (4) the collection and use of environmental data pertaining to the occupational
health and safety of personnel in EPA facilities (e.g., indoor air quality measurements) and in the
field (e.g., chemical dosimetry, radiation dosimetry).

       c.  Applicability to Other EPA Programs. This order applies to the collection and use of
medical testing data from Government and non-Government personnel in EPA facilities for
determination of substance abuse.

       d.  Organizational Applicability.

              (1) EPA Organizations.  The Agency-wide Quality System requirements defined
by this Order apply to all EPA organizations, and components thereof, in which the environmental
programs conducted involve the scope of activities described in Section 5.a above. The authority
of this Order applies only to EPA organizations except as addressed by Section 5.d(2) below.

              (2) Extramural Agreements.  Agency-wide Quality System requirements may also
apply to non-EPA organizations.  These requirements are defined in the applicable regulations
governing extramural agreements. Agency-wide Quality System requirements may also be
invoked as part of negotiated agreements such as memoranda of understanding.  Non-EPA
organizations that may be subject to quality system requirements include:

                    (a) Any organization  or individual under direct contract to EPA to furnish
services or items or perform work (i.e., a contractor) under the authority of 48 CFR 46,
(including applicable work assignments, delivery orders, and task orders);

                    (b) Institutions of higher education, hospitals, and other non-profit
recipients of financial assistance (e.g., Grants and Cooperative Agreements) under the authority of
40 CFR 30;

                    (c) State, local, and Tribal governments receiving financial assistance
under the authority of 40 CFR 31 and 35; and

-------
EPA ORDER                                                                  5360.1A2
                                                                             05/05/2000

                    (d) Other Government Agencies receiving assistance from EPA through
interagency agreements.

6. QUALITY SYSTEM REQUIREMENTS AND IMPLEMENTATION.

       a.  Quality System Requirements. EPA organizations covered by the scope of this Order
shall develop, implement, and maintain a quality system that demonstrates conformance to the
minimum  specifications of ANSI/ASQC E4-1994 and that additionally provides for the following:

              (1) A quality assurance manager (QAM), or person assigned to an equivalent
position, who functions independently of direct environmental data generation, model
development, or technology development responsibility; who reports on quality issues to the
senior manager having executive leadership authority for the organization; and who has sufficient
technical and management expertise and authority to conduct independent oversight of and assure
the implementation of the organization's quality system in the environmental programs of the
organization.

              (2) A Quality Management Plan (QMP), which documents the organization's
quality policy,  describes its quality system, identifies the environmental programs to which the
quality system applies, and which is implemented following approval by the organization's
executive leadership and the AA/OEI.

              (3) Sufficient resources to implement the quality system defined in the approved
QMP.

              (4) Assessments of the effectiveness of the quality system at least annually.

              (5) Submittal to the AA/OEI of the Quality Assurance Annual Report and Work
Plan (QAARWP) for the organization that summarizes the previous year's QA and QC activities
and outlines the work proposed for the current year.

              (6) Use of a systematic planning approach to develop acceptance or performance
criteria for all work covered by this Order.  (See Section 3.3.8 of the EPA Quality Manual for
Environmental Programs)

              (7) Approved Quality Assurance Project Plans (QAPPs), or equivalent documents
defined by the QMP, for all applicable projects and tasks involving environmental data with
review and approval having been made by the EPA QAM (or authorized representative defined in
the QMP). QAPPs must be approved prior to  any data gathering  work or use, except under
circumstances  requiring immediate action to protect human health and the environment or
operations conducted under police powers.

-------
EPA ORDER                                                                 5360.1A2
                                                                            05/05/2000

             (8) Assessment of existing data, when used to support Agency decisions or other
secondary purposes, to verify that they are of sufficient quantity and adequate quality for their
intended use.

             (9) Implementation of Agency-wide Quality System requirements in all applicable
EPA-funded extramural agreements (see Section 5.d(2)).

             (10) Implementation of corrective actions based on assessment results.

             (11) Appropriate training, for all levels of management and staff, to assure that
QA and QC responsibilities and requirements are understood at every stage of project
implementation.

       b.  Quality System Implementation.

             (1) EPA Organizations. Mandatory requirements for implementing this Order are
contained in the EPA Quality Manual for Environmental Programs, hereafter referred to as the
EPA Quality Manual.  Additional non-mandatory guidance for implementing the requirements is
provided in EPA Guidance Documents which may be applied to intramural environmental
programs, as appropriate.

             (2) Extramural Agreements.

                    (a)  Mandatory requirements for implementing this Order are defined in
applicable EPA regulations.  EPA Requirements Documents provide specifications for satisfying
the requirements of these regulations.  The EPA Requirements Documents provide the equivalent
information to the EPA Quality Manual, except they have been written especially for the
extramural user.  Non-mandatory guidance for implementing the requirements is provided in
EPA Guidance Documents which may be applied to extramural  environmental programs, as
appropriate.

                    (b)  Extramural organizations that provide objective evidence (such as a
QMP or quality manual) of conforming to the specifications of the American National Standard
ANSI/ASQC E4-1994 are in compliance with this Order.

7. GENERAL REQUIREMENTS FOR MANAGERS AND STAFF.

       a.  AA for Environmental Information.  In addition to the requirements specified in
Section 7.b below, the AA/OEI,  as the Agency Senior Management Official for Quality, shall:

-------
EPA ORDER                                                                  5360.1A2
                                                                             05/05/2000

              (1) Establish, document, and periodically revise Agency policies and procedures
for planning, implementing, and assessing the effectiveness of the mandatory, Agency-wide
Quality System.

              (2) Review and approve QMPs from Agency components conducting
environmental  programs for implementation for up to five years.

              (3) Perform periodic management assessments of all EPA organizations
conducting environmental programs to determine the effectiveness of their mandatory quality
systems and recommend corrective actions.

              (4) Develop generic training programs, for all levels of EPA management and
staff, so that quality management responsibilities and requirements are understood at every stage
of project implementation.

       b.  National Program Office Assistant Administrators and Senior Managers.

              (1) Each National Program Office (NPO) Assistant Administrator (AA) shall
designate a representative for quality management and QA and QC activities to advise and assist
the AA in the planning, implementation, documentation, and assessment of the quality systems for
organizations under the AA's responsibility.

              (2) The National Program Office (NPO) Assistant Administrators and senior
managers shall:

                    (a)  Ensure that all NPO components and applicable programs comply fully
with the requirements of this Order.

                    (b)  Ensure that quality management is an identified activity with associated
resources adequate to accomplish its program goals and is implemented as prescribed in the
organization's approved QMP.

                    (c)  Ensure that all environmental programs implemented through
extramural agreements comply fully with applicable QA and QC requirements.

                    (d)  Ensure that environmental data from the parts of National Programs
implemented by the Regions or delegated to State, local, and Tribal governments or from research
and development programs are of sufficient quantity and adequate quality for their intended use
and are used consistent with such intentions.

-------
EPA ORDER                                                                  5360.1A2
                                                                              05/05/2000

                    (e) Ensure that all proposed and final regulations needing environmental
data during their development or implementation include the application of sufficient and
adequate QA and QC activities during the collection and use of such data.

                    (f) Perform periodic assessments of NPO organizations conducting
environmental programs to determine the conformance of their mandatory quality systems to their
approved QMPs and the effectiveness of their implementation.

                    (g) Ensure that deficiencies highlighted in the assessments are
appropriately addressed.

                    (h) Identify program-specific QA and QC training needs for all levels of
management and staff and provide for this training.

                    (i) Ensure that performance plans for supervisors, senior managers, and
appropriate staff contain critical element(s) that are commensurate with the quality management
responsibilities assigned by this Order and the organization's QMP.

       c.  Regional Administrators and Senior Managers.  Regional Administrators  and senior
managers shall:

             (1) Ensure that all Regional components and programs comply fully  with the
requirements of this Order.

             (2) Ensure that quality management is an identified activity with associated
resources adequate to accomplish its program goals and is implemented as prescribed in the
organization's approved QMP.

             (3) Ensure that all environmental programs implemented through extramural
agreements comply fully with applicable QA and QC requirements.

             (4) Ensure that the environmental data from environmental programs delegated to
State, local, and Tribal governments are  of sufficient quantity and adequate quality for their
intended use and are used consistent with such intentions.

             (5) Ensure that training is available for State, local, and Tribal governments
performing environmental programs for EPA in the fundamental concepts and practices of quality
management and QA and QC activities that they may be expected by EPA to perform.

             (6) Perform periodic assessments of Regional organizations conducting
environmental programs to determine the conformance of their mandatory quality systems to their
approved QMPs and the effectiveness of their implementation.

-------
EPA ORDER                                                                   5360.1A2
                                                                               05/05/2000

              (7)  Ensure that deficiencies highlighted in the assessments are appropriately
addressed.

              (8)  Identify QA and QC training needs for all levels of management and staff and
provide for this training.

              (9)  Ensure that performance plans for supervisors, senior managers, and
appropriate staff contain critical element(s) that are commensurate with the quality management
responsibilities assigned by this Order and the organization's QMP.

       d.  Quality Management Personnel. Quality management personnel, including the QAM,
refers to individuals within the organization who are assigned specific quality management duties
and are delegated authority for quality management as defined in the organization's QMP. The
functions of the quality management personnel may be totally related to quality system activities
or be in conjunction with other functions and responsibilities within the organization.  If these
personnel have other functions to perform, there should be no conflict of interest. Specific duties
and responsibilities of all quality management personnel shall be documented in the organization's
QMP.  Specific responsibilities shall include:

              (1)  facilitating QMP development and approval by the organization and preparing
updates to the approved QMP;

              (2)  representing the organization to QAD and other groups on matters pertaining
to quality management and QA and QC activities;

              (3)  providing expert assistance to the staff in the organization on QA and QC
policies, requirements, and procedures applicable to procurement and technical activities;

              (4)  reviewing  and approving QMPs and QAPPs submitted by intramural programs
and by holders of extramural agreements as defined in the organization's QMP;

              (5)  identifying QA and QC training needs for the organization;

              (6)  providing oversight of QA and QC implementation in the environmental
programs conducted by or for the organization; and

              (7)  performing assessments of environmental programs and confirming the
effectiveness  of corrective actions.

       e.  Agency Managers and Staff.

              (1)  Managers at all levels shall:

-------
EPA ORDER                                                                  5360.1A2
                                                                               05/05/2000

                     (a) Ensure that quality management is an identified activity with associated
resources adequate to accomplish its program quality goals.

                     (b) Ensure that all organizational components and programs comply fully
with the requirements of this Order.

                     (c) Ensure that all applicable environmental programs for which
management is responsible comply fully with the requirements of this Order.

                     (d) Perform all other quality management roles and responsibilities
assigned to them in their organizations' QMPs.

              (2)  Managers and staff shall:

                     (a) Ensure that all applicable intramural programs and activities comply
fully with the requirements of this Order.

                     (b) Ensure that all applicable extramural environmental programs for
which the manager or staff member is responsible comply fully with the requirements of this
Order.

                     (c) Assure that the results of environmental programs are of sufficient
quantity and adequate quality for their intended use.

                     (d) Perform all other quality management roles and responsibilities
assigned to them in their organizations' QMPs.

8.  DEFINITIONS. The following terms have special meanings in relation to this Order.

       a. assessment - the evaluation process used to measure the performance or effectiveness
of a system and its elements. As used here, assessment is an all-inclusive term used to denote any
of the following: audit, performance  evaluation, management review, peer review, inspection, or
surveillance.

       b. environmental data - any measurements or information that describe environmental
processes, location, or conditions; ecological or health effects and consequences; or the
performance of environmental technology.  For EPA, environmental data include information
collected directly from measurements, produced from models, and compiled from other sources
such as data bases or the literature.

       c. environmental programs -  work or activities involving the environment, including but
not limited to: characterization of environmental processes and conditions; environmental

                                           10

-------
EPA ORDER                                                                   5360.1A2
                                                                               05/05/2000

monitoring; environmental research and development; the design, construction, and operation of
environmental technologies; and laboratory operations on environmental samples.

       d.  environmental technology - an all-inclusive term used to describe pollution control
devices and systems, waste treatment processes and storage facilities, and site remediation
technologies and their components that may be utilized to remove pollutants or contaminants
from or prevent them from entering the environment. Examples include wet scrubbers (air), soil
washing (soil), granulated activated carbon unit (water), and filtration (air, water).  Usually, this
term applies to hardware-based  systems; however, it also applies to methods or techniques used
for pollution prevention, pollutant reduction, or containment of contamination to prevent further
movement of the contaminants,  such as capping, solidification or vitrification, and biological
treatment.

       e.  extramural agreement - a legal agreement between EPA and an organization outside
EPA for items or services to be  provided.  Such agreements include contracts, work assignments,
delivery orders, task orders, cooperative agreements, research grants, state and local grants, and
EPA-funded interagency agreements.

       f. management system - a structured non-technical system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation
plan of an organization for conducting work and producing items and services.

       g.  organization - a company,  corporation, firm, enterprise, or institution, or part thereof,
whether incorporated or not, public or private, that has its own functions and administration. In
the context of this Order, an EPA organization is an office, region, national center or laboratory.

       h.  process - a set of interrelated resources and activities which transforms inputs into
outputs. Examples of processes include analysis, design, data collection, operation, fabrication,
and calculation.

       i.  quality - the totality of features and characteristics of a product or service that bear on
its ability to meet the stated or implied needs and expectations of the user.

       j.  quality assurance (QA) - an integrated system of management activities involving
planning, implementation, documentation, assessment, reporting, and quality improvement to
ensure that a process, item, or service is of the type and quality needed and expected by the
customer.

       k.  quality assurance manager (QAM) - the individual designated as the principal manager
within the organization having management oversight and responsibilities for planning,
documenting, coordinating, and  assessing the effectiveness of the quality system for the
organization.

                                            11

-------
EPA ORDER                                                                   5360.1A2
                                                                               05/05/2000

       1. quality assurance project plan (QAPP) - a document describing in comprehensive detail
the necessary QA, QC and other technical activities that must be implemented to ensure that the
results of the work performed will satisfy the stated performance criteria.

       m.  quality control (QC) - the overall system of technical activities that measures the
attributes and performance of a process, item, or service against defined standards to verify that
they meet the stated requirements established by the customer;  operational techniques and
activities that are used to fulfill requirements for quality.

       n. quality management - that aspect of the overall management system of the organization
that determines and implements the quality policy.  Quality management includes strategic
planning, allocation of resources, and other systematic activities (e.g., planning, implementation,
documentation, and assessment) pertaining to the quality system.

       o. quality management plan (QMP) - a document that describes a quality system in terms
of the organizational structure, policy and procedures, functional responsibilities of management
and staff, lines of authority, and required interfaces for those planning, implementing,
documenting, and assessing all activities conducted.

       p. quality system - a structured and documented management system describing the
policies,  objectives, principles, organizational authority, responsibilities, accountability, and
implementation plan of an organization for ensuring quality in its work processes, products
(items), and services. The quality system provides the framework for planning, implementing,
documenting, and assessing work performed by the organization and for carrying out required QA
and QC activities.

       q. user - an organization, group, or individual that utilizes the results or products from
environmental programs or the customer for whom the results  or products were collected or
created.

9.  SUPERSESSION. This Order replaces previous editions of EPA Order 5360.1, in their
entirety.
                                            12

-------
                                 Appendix B
        Great Lakes National Program Office Team Mission Statements
GLNPO Quality Management Plan - Appendix B                                           May 2008

-------
GLNPO Team Mission Statements

Management Team

The purpose of the Management Team is to lead the United States' Great Lakes Program and to manage
GLNPO in a manner that steers public and private actions to protect and restore the Great Lakes
ecosystem and improves the quality of life for its citizens. The management team functions to:

           •       Continue to build relationships with external partners to foster and promote Great
                   Lakes ecosystem protection and establish the strategic direction together with multi-
                   year and annual priorities for the Great Lakes Program and for GLNPO.
           •       Link annual team performance agreements that set ambitious and achievable goals
                   consistent with GLNPO's mission and the Great Lakes Program strategic plan with
                   Agency and Regional performance management processes.
           •       Secure resources through the Agency budget process, prioritize funding and
                   resource needs, allocate resources among the implementing teams, and ensure
                   financial integrity and accountability in all GLNPO activities.
           •       Effectively guide individuals in career development, coordinate team activities, and
                   develop specific milestones to accomplish the office-wide mission.
           •       Target specific office-wide performance areas for improvement and monitor
                   progress.
           •       Ensure implementation of the GLNPO Health, Safety and Environmental
                   Compliance program.

Planning and Budget Team

The purpose of the Planning and Budget Team is to facilitate informed planning, budgeting, and resource
management by GLNPO's Management Team. Functions of the team are to:
           •       Secure resources for GLNPO through the Agency's planning and budgeting process
                   for Great Lakes priorities so that GLNPO teams will have sufficient resources to
                   support those priorities.
           •       Develop program priorities and funding process (funding guidance).
           •       Coordinate development of and track the internal GLNPO operating budget.
           •       Analyze Great Lakes National Program resource utilization to support appropriate
                   planning, management, and accountability.
           •       Identify processes in need of improvement/management action.
            •       Coordinate the grants process to expedite funding.  Provide relevant grants financial
                    information (from preproposal stage to closeout, as appropriate) for internal and
                    external agency release.
                                         Page 1 of 5

-------
Communications and Reporting Team

The Communications and Reporting Team's mission is to disseminate information on the state of the
Great Lakes ecosystem and on efforts by various parties in the protection, management and restoration of
the Great Lakes Basin. This includes activities to:

            •   Communicate information on GL basin-wide programs in an accurate and timely fashion,
                with special emphasis and priority on GLNPO-originated programs,
           •   Develop and maintain a collection of publications for general public consumption,
               including the annual Report to Congress, describing GLNPO and its activities as well as
               the relevant activities of other partners in the GL Program,
           •   Provide GLNPO-wide briefing materials for staff use in preparation for public-oriented
               meetings,
           •   Develop and maintain devices for communicating the GL Program, such as a slide
               catalog, videotapes,  and displays,
           •   Provide support to basin-wide education programs, including the Cities Tour of the R/V
               Lake Guardian, the Great Lakes, Great Minds teacher's workshops, etc.,
           •   Consult on major GLNPO technical reports, publications  and presentations to ensure
               relative uniformity in quality and compliance with publication regulations, and
            •   Facilitate, with the Information Management and Data Integration Team, the uploading
                of as much GLNPO-generated data as possible into the GLNPO Gopher/GLIN/CIESIN/REIS
                systems.

Ecological Protection and Restoration Team

The Ecological Protection and Restoration Team's mission is to effectively and efficiently protect, restore, and
enhance the habitats needed to sustain a healthy and diverse Great Lakes ecosystem. This includes
activities to:

         •      Plan, coordinate and implement effective restoration and  protection activities
               throughout the Great Lakes ecosystem,
         •      Acquire and disseminate information about ecosystem protection and restoration
               techniques and activities throughout the Great Lakes ecosystem,
         •      Award grants to organizations and individuals for protection and restoration projects
               throughout the Great Lakes ecosystem, and
         •      Tackle special assignments from EPA Headquarters or the Director as needed.
                                          Page 2 of 5

-------
Environmental Monitoring and Indicators Team

This Team is primarily accountable for assessing and reporting on the present status and trends of the
environmental quality of the Great Lakes ecosystem.  Major areas of responsibility include planning,
monitoring, research and outreach.

Several goals and their products or environmental results are recognized for this Team, and they are
summarized below.

           •   Planning: This Team will 1) lead U.S. efforts to establish international strategic
               monitoring plans for each of the Great Lakes, and will translate those plans into specific
               GLNPO monitoring plans; 2) actively support international efforts to establish
               ecosystem objectives and to identify appropriate environmental indicators for the Great
                Lakes; and 3) develop comprehensive work plans and quality assurance plans for
               specific special studies.
           •   Monitoring:  This Team will conduct or oversee field sampling operations and laboratory
               analyses to implement specific monitoring plans or special studies. Data analysis,
               interpretation and reporting are recognized as inherent functions of a comprehensive
               monitoring program.
           •   Outreach: This Team will exchange data and interpretive information with both internal
               (EPA) and external (other agencies, states, public, scientific community, etc.) clients
               through a variety of vehicles, including indicators reports, scientific publications,
               technical advice, data reports, etc., and will work in close cooperation with the GLNPO
               Communications and Reporting Team.
           •   Research: This team will identify and conduct or oversee important research activities
               required to develop new tools for monitoring the  Great Lakes ecosystem, to develop and
               test environmental indicators, and to understand critical components of the Great Lakes
               ecosystem.

Sediment Assessment and Remediation Team

The Sediment Assessment and Remediation Team's purpose is to reduce the impacts of contaminated
sediments on the Great Lakes ecosystem.  Specific functions include:

           •   Perform and provide support for sediment assessments throughout the Great Lakes.
           •   Provide support for sediment based mass balance modeling activities.
           •   Provide support for sediment based risk assessments.
           •   Provide technical support toward the selection and implementation of remedial
               alternatives.
            •   Foster partnerships between Great Lakes stakeholders to promote sediment clean-up
                activities.
            •   Participate actively in setting policy and direction for the Great Lakes contaminated
                sediment program.
                                          Page 3 of 5

-------
Pollution Prevention Team

The Pollution Prevention Team's purpose is to lead the development, coordination, and implementation
of the Virtual Elimination Project and the U.S.-Canada Binational Strategy; to initiate P2 ideas and
projects; to lead and coordinate GLNPO pollution prevention efforts and set the P2 agenda; and to serve
as the coordinating body within GLNPO for Regional and national pollution prevention efforts.

Health, Safety & Environmental Compliance Team

The Health, Safety & Environmental Compliance Team's purpose is to manage GLNPO's Health,
Safety and Environmental Compliance program in accordance with all applicable regulations and policies
defined in the GLNPO Health & Safety Manual and the Region 5 Safety Manual, and to provide a safe,
secure, and healthy work environment for all GLNPO and affiliated employees within all GLNPO
facilities. Its major functions are to:

        •   Conduct quarterly meetings to promote office awareness and serve as the communications
            focal point for health, safety, and environmental compliance issues.  Organize and conduct
            quarterly safety meetings that are chaired by the Office Deputy Director and open to all
            office personnel.  Safety team activities are discussed at these meetings, along with an
            update on all abatement requirements resulting from various inspections.
        •   Streamline, review, and update the GLNPO safety manual with emphasis on realistic,
            practical, and regulatory-required activities.
        •   Develop a Health, Safety and Environmental Compliance manual for the R/V Mudpuppy.
        •   Formulate and manage the GLNPO Health & Safety budget.
        •   Develop a new GLNPO tracking system for safety training and medical monitoring.
        •   Address recurring safety hazards identified on ships and in the warehouse, with specific
            emphasis on hoods and HVAC on the R/V Lake Guardian.
        •   Enhance office attitudes on safety, health, and environmental compliance through extensive
            communication on the LAN and seminars on the safety program.
        •   Participate as an informed team in the Regional Health, Safety & Environmental Compliance
            Committee and its subcommittees.  The four subcommittees are: Training; Security; Regional
            Issues; and Personnel Protective Equipment.

Information Management and Data Integration Team

The purpose of the Information Management and Data Integration Team is to provide leadership and
support to GLNPO and the multi-agency Great Lakes Program in the storage of, and access to, Great
Lakes environmental information.
                                         Page 4 of 5

-------
Quality Assurance Team

The purpose of the Quality Assurance Team is to support GLNPO's environmental data collection activities
by providing the necessary resources and tools to assure the collection of data of known and appropriate
quality.

The QA team serves three major functions. First and foremost, the team provides assistance to GLNPO staff
and cooperators to assure that studies produce data of appropriate quality. Second, the QA team provides
independent evaluation and oversight to assure adherence to GLNPO QA policy and protocols. Last, the
QA team develops various documents and products, including policy statements, progress reports, plans,
and reports for major data collection activities.

    Provide Assistance
        •   Assist in the development of QA project plans and data quality objectives
        •   Provide tools (software, guidance documents, technical expertise) for the development of QA
            products. These would include QA project plans, sampling designs, data quality assessments and
           QA reports.
        •   Provide training on various QA concepts.
        •   Maintain a QA Library.
        •   Act as a liaison with QAD (monthly conference calls) for policy information
        •   Attend national/bi-national QA meetings to keep abreast of QA improvements

    Evaluation/Oversight
        •   Review and comment on QA project plans within 10 working days
        •   Assist in and/or implement data quality and technical systems audits
        •   Develop and implement RDMQ data verification software for base monitoring programs
        •   Use QATRACK as an evaluation tool for QA project plan development
        •   Serve as QA chair on various programs
        •   Develop QA reports for major data collection activities

    Documentation
        •   Revise the GLNPO Quality Management Plan (yearly), distribute to GLNPO and submit to QAD
        •   Write the Yearly QA Report and Workplan for QAD and GLNPO Staff
        •   Write quarterly reports
        •   Write QA Program Plans
        •   Write QA Reports

Exotics Team

The purpose of the Exotics Team is to foster further understanding of the impact of exotic species on the Great
Lakes ecosystem and to explore management options for controlling and preventing the further spread of
exotic species.
                                          Page 5 of  5

-------
                                  Appendix C
               Monthly Quality Assurance Status Tracking Sheet
GLNPO Quality Management Plan - Appendix C                                             May 2008

-------
Monthly QA Status

P.O. Name and     Project          Grant       Project
Date Signed       Investigator     Number      Title              NG     A     A/R     NA     Approval Date




Legend - NG: new grant, A: approved, A/R: acceptable with minor revisions, NA: not approved       TOTAL:

                                          Page 1 of 1
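
The Legend codes on the sheet feed the monthly TOTAL entry. Purely as an illustration (the record
layout, names, and grant numbers below are hypothetical and are not part of the QMP), the monthly
totals could be tallied as in this minimal Python sketch:

    from collections import Counter

    # Legend codes from the tracking sheet:
    #   NG = new grant, A = approved, A/R = acceptable with minor revisions, NA = not approved
    STATUS_CODES = ("NG", "A", "A/R", "NA")

    # Hypothetical monthly rows:
    # (P.O. name, project investigator, grant number, project title, QAPP status)
    rows = [
        ("J. Smith", "University A", "GL-0000001", "Habitat restoration study", "A"),
        ("J. Smith", "University B", "GL-0000002", "Sediment survey", "A/R"),
        ("K. Jones", "State agency C", "GL-0000003", "Tributary monitoring", "NG"),
    ]

    def monthly_totals(entries):
        """Count how many QAPP reviews fall into each status category for the month."""
        counts = Counter(row[-1] for row in entries)
        return {code: counts.get(code, 0) for code in STATUS_CODES}

    print(monthly_totals(rows))
    # {'NG': 1, 'A': 1, 'A/R': 1, 'NA': 0}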

-------
                                Appendix D
             Great Lakes National Program Office Data Standard:
                  Quality Assurance/Quality Control Codes
GLNPO Quality Management Plan - Appendix D                                         May 2008

-------
                                             GLENDA FIELD REMARK CODES
                                                     (Fieldrmk.xls)

ALT - Alternate Method
    Sample was obtained using an alternate collection method. This flag alerts data users to read
    details presented in the sampling method exception text.

CONT - Known contamination
    Sample is known (i.e., confirmed) to have been contaminated in the field or during transport.
    Validity of results from this sample may be compromised.

FRZN - Freezing
    Sample was unintentionally frozen in field or during transport.

LOST - Lost/Not Submitted
    Sample was taken but either was not submitted for analysis or was lost before being analyzed.

SPIL - Spillage/Leakage
    Sample spilled or leaked in the field or during transport. Sample was submitted anyway. Validity
    of results from this sample may be compromised.

SUSP - Suspected contamination
    Sample is suspected (i.e., but not confirmed) to have been contaminated in the field or during
    transport. Validity of results from this sample may be compromised.

SXBD - Equipment Malfunction
    Sampling equipment malfunctioned or did not function as intended.

OTHER - Other
    Validity of results from this sample may be compromised due to conditions other than presented
    in this list. This flag alerts data users to read details presented in the field crew comments text.

Field Remark Code                                                                          Page 1 of 1
Version 1.03 (9-2-98)
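
The field remark codes travel with individual sample records in GLENDA and alert data users to
potential validity problems. As a minimal illustrative sketch only (the screening function and the
sample identifier are hypothetical; the data standard defines only the codes themselves), a data user
might screen incoming records like this:

    # Valid GLENDA field remark codes (from the table above)
    FIELD_REMARK_CODES = {
        "ALT": "Alternate Method",
        "CONT": "Known contamination",
        "FRZN": "Freezing",
        "LOST": "Lost/Not Submitted",
        "SPIL": "Spillage/Leakage",
        "SUSP": "Suspected contamination",
        "SXBD": "Equipment Malfunction",
        "OTHER": "Other",
    }

    # Codes whose descriptions state that result validity may be compromised
    VALIDITY_FLAGS = {"CONT", "SPIL", "SUSP", "OTHER"}

    def screen_field_remarks(sample_id, remark_codes):
        """Return warnings for unknown codes or codes that flag questionable validity."""
        warnings = []
        for code in remark_codes:
            if code not in FIELD_REMARK_CODES:
                warnings.append("%s: unrecognized field remark code '%s'" % (sample_id, code))
            elif code in VALIDITY_FLAGS:
                warnings.append("%s: %s - result validity may be compromised"
                                % (sample_id, FIELD_REMARK_CODES[code]))
        return warnings

    # Example: a (hypothetical) sample flagged with suspected contamination
    for message in screen_field_remarks("SAMPLE-001", ["SUSP"]):
        print(message)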

-------
                                               GLENDA LAB REMARK CODES
                                                      (Lab_rmrk.xls)

ALT - Alternate Method (Group: Procedure; Assignor: Lab, QC)
    Description: Reported value was obtained using an alternate analytic method. Validity of reported
    value may be compromised.
    Reporting Instruction: Information about the alternate analytic method used should be provided in
    the Exception to Method Text.

B5D - Below 5 Times MDL (Group: Other; Assignor: Lab, QC)
    Description: Reported value is greater than the method detection limit but less than 5 times the
    method detection limit. Validity of reported value and associated precision statistics (e.g., RPD)
    may be compromised.

BAG - Correction Factor, background (Group: Corrected; Assignor: Lab, QC)
    Description: Reported value was corrected for variable background contribution to the instrument
    signal in the determination of trace elements.
    Reporting Instruction: The value of the correction factor, if known, should be provided in the
    Correction Factor table.

BDL - Detection Limit, less than (Group: Limit; Assignor: Lab, QC)
    Description: Analyte produced an instrument response but reported value is below a detection
    limit. The type of detection limit was unspecified. Validity of reported value may be compromised.

BLQ - Between Instrument Detection and Quantification (Group: Limit; Assignor: Lab, QC)
    Description: Reported value is above calculated instrument detection limit but below
    quantification limit. Validity of reported value may be compromised.
    Reporting Instruction: Information about limits should be provided in the Project QA/QC Summary.

CAJ - Correction Factor, lab (Group: Corrected; Assignor: Lab, QC)
    Description: Reported value was corrected by a lab performance check factor.
    Reporting Instruction: The value of the correction factor, if known, should be provided in the
    Correction Factor table.

CAN - No Result Reported, analysis canceled (Group: No Result Reported; Assignor: Lab, QC)
    Description: Analysis was canceled and not performed. No result value was reported.
    Reporting Instruction: The reason for cancellation should be provided in the Exception to Method
    Text.

CBC - No Result Reported, cannot be calculated (Group: No Result Reported; Assignor: Lab, QC)
    Description: Result should have been a calculated value but it could not be determined because an
    operand value was qualified. No result value was reported.

CBL - Correction Factor, blank (Group: Corrected; Assignor: Lab, QC)
    Description: Reported value was corrected by a blank correction factor.
    Reporting Instruction: The value of the correction factor, if known, should be provided in the
    Correction Factor table.

CCA - Correction Factor, calibration (Group: Corrected; Assignor: Lab, QC)
    Description: Reported value was corrected by a calibration correction factor.
    Reporting Instruction: The value of the correction factor, if known, should be provided in the
    Correction Factor table.

CDI - Correction Factor, dilution (Group: Corrected; Assignor: Lab, QC)
    Description: Reported value was corrected by a dilution correction factor.
    Reporting Instruction: The value of the correction factor, if known, should be provided in the
    Correction Factor table.

Analytical Remark Code                                                                    Page 1 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
CLC
CON
CSP
CST
CSU
CTP
DDL
EER
EHT
EST
FAC
Name
Correction Factor,
other
Value Confirmed
Correction Factor,
standard pressure
Correction Factor,
standard
temperature
Correction Factor,
surrogate
Correction Factor,
standard
temperature and
pressure
Daily Detection
Limit, less than
No Result
Reported, entry
error
Exceeded Holding
Time
Estimated Value,
outside limit of
precision
No Result
Reported, field
accident
Group
Corrected
Other
Corrected
Corrected
Corrected
Corrected
Limit
No Result
Reported
Handling
Estimated
Value
No Result
Reported
Description
Reported value was corrected. Correction factor was
derived by unspecified means or means other than
those presented in this list
Reported value was confirmed by using an auxiliary
analytical technique
Reported value was corrected by a standard pressure
correction factor
Reported value was corrected by a standard
temperature correction factor
Reported value was corrected by a surrogate correction
factor
Reported value was corrected by a standard
temperature and pressure correction factor
Analyte produced an instrument response but reported
value is below the calculated daily detection limit.
Validity of reported value may be compromised
Original value is known to be incorrect due to a data
entry error. The correct value could not be determined.
No result value was reported
Sample or extract was held longer than the approved
amount of time before analysis. Validity of reported
value may be compromised
Reported value was not within expected limits of
precision and is therefore considered an estimate
Analysis was halted because a field accident either
destroyed the sample or rendered it not suitable for
analysis. No result value was reported
Reporting Instruction Description
The value of the correction factor, if known, should be provided in the
Correction Factor table. Information about how the correction factor
was derived should be provided in the Result Description
Information about confirmation technique should be provided in the
Analytic Method or the Exception to Method Text
The value of the correction factor, if known, should be provided in the
Correction Factor table
The value of the correction factor, if known, should be provided in the
Correction Factor table
The value of the correction factor, if known, should be provided in the
Correction Factor table
The value of the correction factor, if known, should be provided in the
Correction Factor table
Information about detection limits should be provided in the Project
QA/QC Summary

The length of time that the sample was held should be provided in the
Exception to Method Text

Information about the field accident should be provided in the
Exception to Method Text
Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 2 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
FBB
FBS
FCB
FCC
FCL
FCN
FCS
FCV
FDB
FDC
FDL
Name
Field Bottle Blank,
failed
Blank Sample,
failed
Lab Calibration
Blank, failed
Continuing
Calibration Check,
failed
Lab Control
Solution, failed
Calibration
Sample, failed
Field Control
Solution, failed
Coefficient of
Variation Limit,
failed
Dry Blank, failed
Drift Check, failed
Lab Duplicate,
failed
Group
QC Failed
QC Failed
QC Failed
QC Setup
QC Failed
QC Failed
QC Failed
Other
QC Failed
QC Setup
QC Failed
Description
A field bottle blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A blank sample associated with this analysis failed the
acceptance criteria. It is unknown whether the blank
that failed was a field blank or a lab blank. Validity of
reported value may be compromised
A lab calibration blank associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
A continuing calibration check associated with this
analysis failed the acceptance criteria. Validity of
reported value may be compromised
A lab control solution associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
A calibration sample (type unknown or unspecified)
associated with this analysis failed the acceptance
criteria. Validity of reported value may be compromised
A field control solution associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
Precision, measured as CV between multiple analyses
of a sample within and between instrumental analysis
runs, did not meet the method criteria. Validity of
reported value may be compromised
A dry blank associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A drift check associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A lab duplicate associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
Reporting Instruction Description











Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 3 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
FFB
FFD
FFR
FFS
FFT
FIB
FIC
FIS
FKB
FLA
FLB
Name
Field Matrix Blank,
failed
Field Duplicate,
failed
Field Blank, failed
Field Spike, failed
Trip Blank, failed
Field Instrument
Blank, failed
Lab Interference
Check Sample,
failed
Internal Standard,
failed
Continuing Check
Blank, failed
Field Lab Anomaly
Lab Matrix Blank,
failed
Group
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
Other
QC Failed
Description
A field matrix blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A field duplicate associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A field blank sample (type unknown or unspecified)
associated with this analysis failed the acceptance
criteria. Validity of reported value may be compromised
A field spike associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A trip blank associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A field instrument blank associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
A lab interference check sample associated with this
analysis failed the acceptance criteria. Validity of
reported value may be compromised.
An internal standard associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A continuing check blank associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
Reported value for lab measurement was inconsistent
with reported value for corresponding field
measurement. Validity of reported value may be
compromised
A lab matrix blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
Reporting Instruction Description











Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 4 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
FLC
FLR
FLS
FMB
FMS
FNB
FOB
FPB
FPC
FPS
FQC
Name
Linearity Check,
failed
Lab Blank, failed
Lab Spike, failed
Matrix Spike
Blank, failed
Matrix Spike,
failed
Lab Instrument
Blank, failed
Field Fortified
Blank, failed
Lab Procedural
Blank, failed
Performance
Check, failed
Lab Procedural
Spike, failed
Quality Control,
failed
Group
QC Setup
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
Description
A linearity check associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A lab blank sample (type unknown or unspecified)
associated with this analysis failed the acceptance
criteria. Validity of reported value may be compromised
A lab spike associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A matrix spike blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A matrix spike associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A lab instrument blank associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
A field fortified blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A lab procedural blank associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
A lab performance check sample associated with this
analysis failed the acceptance criteria. Validity of
reported value may be compromised
A lab procedural spike associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
Quality control criteria were exceeded during analysis.
Value was not rejected, however. Validity of reported
value may be compromised
Reporting Instruction Description











Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 5 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
FRB
FRF
FRM
FRN
FRS
FSB
FSD
FSF
FSK
FSL
FSP
Name
Field Reagent
Blank, failed
Reference
material, failed
Field Reference
Material, failed
Lab Reagent
Blank, failed
Lab Reference,
failed
Lab Solvent Blank,
failed
Lab Spike
Duplicate, failed
Surrogate Spike,
failed
Spike sample,
failed
Lab Spike Blank,
failed
Lab Solvent Spike,
failed
Group
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
QC Failed
Description
A field reagent blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A reference sample (type unknown or unspecified)
associated with this analysis failed the acceptance
criteria. Validity of reported value may be compromised
A field reference material associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
A lab reagent blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A lab reference associated with this analysis failed the
acceptance criteria. Validity of reported value may be
compromised
A lab solvent blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A spiked lab duplicate associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
Surrogate spike recoveries associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
A spike sample (type unknown or unspecified)
associated with this analysis failed the acceptance
criteria. Validity of reported value may be compromised
A spiked lab blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A lab solvent spike associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
Reporting Instruction Description











Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 6 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
FSR
FSS
FTB
FUB
FVS
FWB
GTL
HIB
IDL
IDS
INT
Name
Standard
Reference
Material, failed
Surrogate, failed
Field Filter Blank,
failed
Field Tubing
Blank, failed
Lab Calibration
Verification
Solution, failed
Field Source
Water Blank,
failed
Operating Range,
greater than
Likely Biased High
Instrument
Detection Limit,
less than
Analyte Not
Confirmed
Interference
Suspected
Group
QC Failed
QC Failed
QC Failed
QC Failed
QC Setup
QC Failed
Limit
Other
Limit
Other
Other
Description
A standard reference material associated with this
analysis failed the acceptance criteria. Validity of
reported value may be compromised
Surrogate recoveries associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A field filter blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A field tubing blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
A lab calibration verification solution associated with
this analysis failed the acceptance criteria. Validity of
reported value may be compromised
A field source water blank associated with this analysis
failed the acceptance criteria. Validity of reported value
may be compromised
Reported value is above the valid operating range of the
analytical system, quantitative process, or qualitative
process, or reported value is above the highest
calibration standard. Validity of reported value may be compromised
Reported value is probably biased high as evidenced by
LMS (matrix spike, lab) results, SRM (reference
material, standard) recovery, blank contamination or
other internal lab QC data. Reported value is not considered invalid, however
Analyte produced an instrument response but reported
value is below the calculated instrument detection limit.
Validity of reported value may be compromised
Identity of analyte could not be confirmed using an
alternate technique
Reported value is believed to be the result of
interference and not presence of the analyte. Validity of
reported value may be compromised
Reporting Instruction Description








Information about detection limits should be provided in the Project
QA/QC Summary


Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 7 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
INV
ISC
ISP
JCN
JCW
KCA
KCF
KCP
KCX
LAC
LOB
Name
Invalid
Correction Factor,
internal standard
Improper Sample
Preservation
Sample Container
Damaged, no
sample lost
Sample Container
Damaged, sample
lost
Known
Contamination, lab
analysis
Known
Contamination,
field
Known
Contamination, lab
preparation
Known
Contamination,
unknown
No Result
Reported, lab
accident
Likely Biased Low
Group
Other
Corrected
Handling
Handling
Handling
Contamination
Contamination
Contamination
Contamination
No Result
Reported
Other
Description
Reported value is deemed invalid by the QC
Coordinator
Reported value was corrected for the internal standard
recovery
Sample was not properly preserved. Validity of reported
value may be compromised
Sample container (jar, test tube, etc.) was damaged but
no portion of the sample was lost. Validity of reported
value may be compromised
Sample container (jar, test tube, etc.) was damaged. At
least a portion of the sample was lost. Validity of
reported value may be compromised
Contamination is known to have occurred during the
laboratory analysis process. Validity of reported value
may be compromised
Contamination is known to have occurred during the
field collection process. Validity of reported value may
be compromised
Contamination is known to have occurred during the
laboratory preparation process. Validity of reported
value may be compromised
Contamination is known to have occurred but the
source of that contamination is unknown. Validity of
reported value may be compromised
Analysis was halted because a laboratory accident
either destroyed the sample or rendered it not suitable
for analysis. No result value was reported
Reported value is probably biased low as evidenced by
LMS (matrix spike, lab) results, SRM (reference
material, standard) recovery or other internal lab QC
data. Reported value is not considered invalid, however
Reporting Instruction Description

The value of the correction factor, if known, should be provided in the
Correction Factor table



The source of contamination, if known, should be provided in the
Exception to Method Text
The source of contamination, if known, should be provided in the
Exception to Method Text
The source of contamination, if known, should be provided in the
Exception to Method Text

Information about the lab accident should be provided in the
Exception to Method Text

Assignor
QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
QC
Analytical Remark Code
Page 8 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
LTL
MBK
MDL
NAI
NRR
NSQ
NWL
OTHER
PNQ
PPD
REJ
Name
Operating Range,
less than
Blank, detected
below MDL
Method Detection
Limit, less than
No Result
Reported,
interference
No Result
Reported, other
No Result
Reported,
insufficient
quantity of sample
Operating Range,
not within
Other
No Quantifiable
Result Reported
Spiked Blank
Duplicate, failed
Value Rejected
Group
Limit
Other
Limit
No Result
Reported
No Result
Reported
No Result
Reported
Limit
Other
No Result
Reported
QC Failed
Other
Description
Reported value is below the valid operating range of the
analytical system, quantitative process, or qualitative
process, or reported value is less than the lowest
calibration standard. Validity of reported value may be compromised
Analyte was detected in a related lab blank at a
concentration below the method detection limit (MDL)
and/or blank action limit, however the related lab blank
did not fail
Analyte produced an instrument response but reported
value is below the calculated method detection limit.
Validity of reported value may be compromised
A valid result could not be obtained from the analysis
due to interference. Analysis was halted. No result
value was reported
Result value was not determined or entered for reasons
other than those presented in this list. No result value
was reported
Result value could not be obtained due to insufficient
quantity of the sample. No result value was reported
Reported value is outside (above or below not
specified) the valid operating range of the analytical
system, quantitative process, or qualitative process, or
outside the calibration standard. Validity of reported value may be compromised
Validity of reported value may be compromised for
reasons other than those presented in this list
Analyte was present in the sample but was not
quantifiable. No result value was reported
Analysis results showed unacceptable duplicate
precision between laboratory prepared spiked blank
duplicates. Validity of reported value may be
compromised
Reported value was rejected by the laboratory. Value
was not utilized in the calculation of any results
Reporting Instruction Description


Information about detection limits should be provided in the Project
QA/QC Summary
Information about the type of interference should be provided in the
Exception to Method Text
The reason the result was not determined or entered should be
provided in the Exception to Method Text


The reason the validity of the reported value may be compromised
should be provided in the Result Description


The reason that the value was rejected should be provided in the
Exception to Method Text
Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 9 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
REQ
RET
REX
RIN
RSL
SCA
SCF
SCP
SCX
SDL
SFF
Name
Method Not
Approved, re-
analyze
Value Not
Approved
Re-Prepared
Re-Analyzed
Resloped
Suspected
Contamination,
lab analysis
Suspected
Contamination,
field
Suspected
Contamination, lab
preparation
Suspected
Contamination,
unknown
System Detection
Limit, less than
Field Spike Blank,
failed
Group
Procedure
Other
Procedure
Procedure
Procedure
Contamination
Contamination
Contamination
Contamination
Limit
QC Failed
Description
Analytic method for the reported value was not
approved. The sample was re-analyzed using a
different method
Reported value is not approved by laboratory
management. The sample was re-analyzed with no
change in the method. Validity of reported value may be
compromised
Reported value was generated from a re-preparation of
the same sample
Reported value was generated from a re-analysis of the
same sample extract or aliquot using the same method
Reported value was quantified from a resloped
calibration curve during the instrument run
Contamination is suspected to have occurred during the
laboratory analysis process. Validity of reported value
may be compromised
Contamination is suspected to have occurred during the
field collection process. Validity of reported value may
be compromised
Contamination is suspected to have occurred during the
laboratory preparation process. Validity of reported
value may be compromised
Contamination is suspected to have occurred but the
source of that contamination is unknown. Validity of
reported value may be compromised
Analyte produced an instrument response but reported
value is below the calculated system detection limit.
Validity of reported value may be compromised
A field spike blank associated with this analysis failed
the acceptance criteria. Validity of reported value may
be compromised
Reporting Instruction Description

The reason that the value is not approved should be provided in the
Exception to Method Text



The source of contamination, if known, should be provided in the
Exception to Method Text
The source of contamination, if known, should be provided in the
Exception to Method Text
The source of contamination, if known, should be provided in the
Exception to Method Text

Information about detection limits should be provided in the Project
QA/QC Summary

Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 10 of 11
Version 1.05 (9-2-98)

-------
                                               GLENDA LAB REMARK CODES
                                                           (Lab_rmrk.xls)
Code
TIE
UDL
UNC
UNO
Name
Estimated value,
no calibration
standard
Sample-specific
Detection Limit,
less than
Value Not
Confirmed
Analyte Not
Detected
Group
Estimated
Value
Limit
Other
Limit
Description
Reported value has been estimated because no
calibration standard was analyzed
Analyte produced an instrument response but reported
value is below the calculated sample-specific detection
limit. Validity of reported value may be compromised
Reported value could not be confirmed by using an
auxiliary analytic method (e.g., an alternate GC
column). Validity of reported value may be
compromised
Analyte produced no instrument response above noise
Reporting Instruction Description

Information about detection limits should be provided in the Project
QA/QC Summary
Information about the confirmation technique should be provided in
the Analytical Method or the Exception to Method Text

Assignor
Lab, QC
Lab, QC
Lab, QC
Lab, QC
Analytical Remark Code
Page 11 of 11
Version 1.05 (9-2-98)
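
Because every code above carries a Group assignment, data users can automate a first-pass screen of reported values before detailed review. The short Python sketch below is illustrative only and is not GLNPO or GLENDA code: the five codes and their groups are copied from the tables above, while the screening rules themselves are assumptions that a project would tailor to its own data quality objectives.

    # Illustrative sketch only; codes and groups are taken from the GLENDA lab
    # remark code tables, but the screening rules are assumed for this example.
    LAB_REMARK_GROUPS = {
        "BDL": "Limit",            # below an unspecified detection limit
        "MDL": "Limit",            # below the method detection limit
        "EST": "Estimated Value",  # outside expected limits of precision
        "FLR": "QC Failed",        # associated lab blank failed acceptance criteria
        "REJ": "Other",            # value rejected by the laboratory
    }

    def screening_action(code):
        """Map a lab remark code to an assumed first-pass screening action."""
        group = LAB_REMARK_GROUPS.get(code)
        if code == "REJ":
            return "exclude from analysis"
        if group in ("Limit", "Estimated Value"):
            return "use with qualifier"
        if group == "QC Failed":
            return "hold for manual review"
        return "use as reported"

    print(screening_action("MDL"))   # prints: use with qualifier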

-------
                                                                           GLENDA QC IDs
                                                                                 (Qc_ident.xls)
Code
Name
Description
                                                                                                                                                                      Field or
CAL
       Calibration solution
                     Aliquot of target analyte(s) or reference material of known
                     concentration analyzed using the exact instrument and
                     conditions to analyze routine field samples
                                                          To set/calibrate instrument response relative to various concentrations of analyte(s)
                                                          prior to the laboratory production run
CHn
Standard check, high
("n"-th member from
lab)
The "n"-th aliquot of solution with known high concentration
(e.g., 80%) of subject analyte. Not carried to field. Analyzed
using exact instrument used to analyze routine field samples
To evaluate how closely reported result matches the "known" high value. If not
identical, can indicate (1) inaccurate instrumentation at high end of reporting spectrum
or (2) possible contamination from lab
CLB
Blank check,
continuing
Aliquot of reagent water analyzed for background levels of the
target analyte(s) using the exact instrument and conditions
used to analyze routine field samples. This aliquot will be run
several times during the production run
To verify instrument background and/or check for contaminant buildup in the
instrument during the production run
CLC
Calibration check,
continuing
Aliquot of subject analyte(s) or reference material (different
source than CAL solution) of known concentration analyzed
using the exact instrument used to analyze routine field
samples. This aliquot will be run several times during the
production run
To verify whether the initial calibration data are still valid at various points in the
laboratory production run (i.e., to measure instrument "drift")
CLM
Calibration solution,
initial of multiple point
Initial aliquot of target analyte(s) or reference material of known
concentration analyzed using the exact instrument used to
analyze routine field samples. Used when a group of
calibrations is required at different levels of target analyte
concentrations
To set/determine the initial instrument response during multipoint calibration (i.e.,
calibration using three or more standards of known, but different,  concentrations)
CLn
Standard check, low
("n"-th member from
lab)
The "n"-th aliquot of solution with known low concentration
(e.g., 20%) of subject analyte. Not carried to field. Analyzed
using exact instrument used to analyze routine field samples
To evaluate how closely reported result matches the "known" low value. If not
identical, can indicate (1) inaccurate instrumentation at low end of reporting spectrum
or (2) possible contamination from lab
CLS
Calibration solution,
initial of single point
Initial aliquot of target analyte(s) or reference material of known
concentration analyzed using the exact instrument used to
analyze routine field samples. Used when target analyte
concentration will be in a fixed range
To set/determine the initial instrument response during single-point calibration (i.e.,
calibration using a single standard of known concentration)
DDLS
Daily detection limit
solution
Aliquot of reagent water or other neutral item (resin, filter)
analyzed only to calculate daily detection limits of instruments
To calculate daily detection limits
FBB
Blank, field bottle
Aliquot of reagent water placed in one empty sample container
(e.g., bottle) in the field. Not exposed to any other field activity.
Otherwise handled same as routine field sample in all facets of
transport and lab analysis.
To isolate and evaluate potential contamination that may have pre-existed in the
sample containers prior to filling them with actual samples
QC Identifier
                                                                           Page 1  of 6
                                                                                                                           Version 1.04  (10-10-97)

-------
                                                          GLENDA QC IDs
                                                              (Qc_ident.xls)
Code
FBS
FBT
FCM
FDn
FFB
FFM
FMB
FRB
FRM
Name
Blank, field source
water
Blank, field tubing
Control solution, field
Duplicate, ("n"-th
member from field)
Blank, field filter
Blank, field fortified
Matrix blank, field
Blank, field reagent
Reference material,
field
Description
Aliquot of reagent water passed through entire train of
sampling equipment before it is used to take routine field
sample. Not exposed to any other field activity. Otherwise
handled, transported, and analyzed same as routine field
sample
Aliquot of reagent water passed through field tubing before it is
used to take routine field sample. Not exposed to any other
field activity. Otherwise handled, transported, and analyzed
same as routine field sample
Aliquot of reagent water or other neutral item (resin, filter) to
which known quantity of target analyte is added in the field.
Otherwise handled, transported, and analyzed same as routine
field sample
The "n"-th duplicate of a routine field sample (RFS). Taken at
the SAME TIME and SAME PLACE, using the same gear, and
treated same as RFS through all field, transport, and lab
procedures
Aliquot of reagent water passed through field filter material
before it is used on routine field sample. Not exposed to any
other field activity. Otherwise handled, transported, and
analyzed same as routine field sample
Aliquot of sample matrix (known to be below detection for
target analyte) to which a known concentration of target analyte
is added in field. Otherwise handled, transported, and analyzed
same as routine field sample
Unexposed sample collection medium (e.g., dry deposition
plate) carried to field and left unexposed for the duration of
sampling event. Otherwise handled, transported, and analyzed
same as routine field samples
Aliquot of reagent water or other neutral item (resin, filter)
containing all reagents, preservatives, solvents, standards
used to process routine field sample. Handled, transported,
and analyzed same as routine field sample
Aliquot containing a certified value of the target analyte (aliquot
usually from NIST). Not exposed to any field conditions,
equipment, or additives. Sent to lab from field crew. Handled,
transported, and analyzed same as routine field sample
Purpose
To isolate and evaluate potential contamination introduced to samples from entire
configuration of sampling gear
To isolate and evaluate potential contamination introduced to samples by the sampling
line
To evaluate how closely reported result matches the "known" value added in field. If
not identical, can indicate (1) presence of subject analyte in environment below
detection limits or (2) possible contamination from field, transport, or lab
To evaluate field sampling and matrix variability when duplicate samples theoretically
contain the same amount of the subject analyte
To isolate and evaluate potential contamination introduced to samples by filter
materials used in the field
To enable detection/quantification of subject analyte by raising the "known" amount in
the sample above detection/quantification limits
To evaluate contamination from the sampling medium, field collection activities, and
transportation practices
To identify and/or evaluate potential contamination introduced to samples from any
source in the field, during transport, or in the laboratory
To evaluate how closely lab reported result matches the "certified" value. If not
identical, can indicate (1) inaccurate analytical procedures or (2) possible
contamination from field, transport, or lab
Field or
Lab Flag
F
F
F
F
F
F
F
F
F
QC Identifier
Page 2 of 6
Version 1.04 (10-10-97)

-------
                                                          GLENDA QC IDs
                                                              (Qc_ident.xls)
Code
FSF
FTB
IDLS
IFB
ILB
LCB
LCM
LDB
LDF
LDn
Name
Spiked sample, field
(final value)
Blank, field trip
Instrument detection
limit solution
Blank, field
instrument
Blank, lab instrument
Blank, lab calibration
Control solution, lab
Blank, lab dry
Diluted sample, lab
(final value)
Duplicate, ("n"-th
member from lab)
Description
One part of a routine field sample that is split in the field. This
split (FSF) is fortified in field with known concentration of
analyte and analyzed in the lab according to the specified
method. The other split is analyzed without fortification
Aliquot of reagent water or other neutral item (resin, filter)
carried to field but NOT exposed to any field conditions,
equipment, or additives. Handled, transported, and analyzed
same as routine field sample
Aliquot of target analyte(s) or reference material of known
concentration analyzed only to calculate instrument detection
limits
Aliquot of reagent water or other neutral item (resin, filter)
created in field and analyzed in field for background levels of
the target analyte using the exact instrument to be used in
subsequent analyses when conducted in field
Aliquot of reagent water or other neutral item (resin, filter)
created in lab and analyzed for background levels of the target
analyte using the exact instrument to be used in subsequent
analyses
Aliquot of reagent water or other neutral material (resin, filter),
possibly adjusted in pH, but without addition of any other
reagents. Created in lab and analyzed using the exact lab
instrument used to analyze routine field samples
Aliquot of reagent water or other neutral item (resin, filter) to
which known quantity of target analyte is added. Contains
same reagents, solvents, standards, etc. as routine field
sample. Created in lab. Handled and analyzed same as routine
field sample
Aliquot with all reagents, internal standards, surrogates, and
solvents to be added to routine field sample EXCEPT has no
reagent water/neutral material. Created in lab. Handled and
analyzed same as routine field sample
One part of a routine field sample that is split in the lab. This
portion (LDF) is analyzed according to the specified method
after dilution. The other portion is analyzed without dilution
The "n"-th duplicate of a routine field sample. Created in the
lab and treated same as routine field sample through all
procedures
Purpose
To evaluate the amount of target analyte existing in the fortified sample so that it can
be compared to a "duplicate" sample (FSO) that should be identical in all ways except
that it did not have addition of the subject analyte
To isolate and evaluate potential contamination introduced to samples during sample
transport. Used as QC for samples taken during an entire trip
To calculate instrument detection limits
To 1) test for instrument contamination or 2) verify results from calibration blank
To 1) test for instrument contamination or 2) verify results from calibration blank
To test and adjust instrument settings for "zero level" prior to, or during, sample
analysis
To evaluate how closely reported result matches the "known" value added in lab. If not
identical, can indicate (1) presence of subject analyte in environment below detection
limits or (2) possible contamination from lab materials or equipment
To evaluate possible contamination from reagents, standards, solvents, surrogates,
etc. when reagent water is NOT present
To assess precision of lab dilution techniques and to evaluate potential contamination
in dilution material. Done by comparing to a "duplicate" sample (LDO) that should be
identical in all ways except that it did not have addition of the diluting material
To evaluate lab variation in reported results when duplicate samples theoretically
contain the same amount of the subject analyte. Provides precision assessment of
results
Field or
Lab Flag
F
F
L
F
L
L
L
L
L
L
QC Identifier
Page 3 of 6
Version 1.04 (10-10-97)

-------
                                                          GLENDA QC IDs
                                                              (Qc_ident.xls)
Code
LFn
LIM
LIS
LMB
LMn
LMS
LPB
LPC
LPn
LPS
Name
Spiked sample, lab
(Final values - "n"-th
member)
Interference check
sample, lab
Internal standard, lab
Matrix blank, lab
Matrix spike multiple,
lab
Matrix spike, lab
Blank, lab procedural
Performance check
solution, lab
Procedural spike
duplicate, lab
Procedural spike, lab
Description
The "n"-th duplicate of the "Spiked Sample, Lab Final Value"
(LSF). Used when routine field sample is split in two portions,
one portion is analyzed with spiking (LSF). The LSF may be
duplicated n times
Solution with known concentration of a suite of target analytes.
Created in lab and analyzed using exact instrument used to
analyze routine field samples
Routine field sample fortified with addition of a known
concentration of a standard compound which does not occur in
the environment but which does have similar spectral signature
during analysis
Unexposed sample collection medium (e.g., dry deposition
plate) NOT carried to field. Handled and analyzed in lab same
as routine field samples
N-th duplicate of the lab matrix spike (LMS). Aliquot of routine
field sample split from "true" sample. Fortified with a known
concentration of target analyte(s). Created in the lab. Handled
and analyzed same as routine field sample
Aliquot of routine field sample split from "true" field sample.
Fortified with a known concentration of target analyte(s).
Created (i.e., split and fortified) in the lab. Handled and analyzed
same as routine field sample
Aliquot containing all reagents, internal standards, surrogates,
and solvents in same volumes used to process/analyze RFS.
Created in lab. Contains no field collection media (XAD resin)
or dummy blank matrix (reagent water). Handled/analyzed
same as RFS
Aliquot containing a solution with known concentrations of
target analyte(s), surrogate(s), and/or internal standards used
to evaluate the performance of an instrument with respect to a
defined set of criteria
N-th duplicate of the lab procedural spike (LPS). Aliquot of
reagent water or other neutral item containing all reagents,
solvents, standards, surrogates as routine field sample.
Fortified with known quantity of target analyte. Created in lab.
Aliquot of reagent water or other neutral item (filter, resin)
containing all reagents, solvents, standards, surrogates as
routine field sample. Fortified with known quantity of target
analyte. Created in lab. Handled & analyzed same as rout, field
sample
Purpose
To verify the results from Spiked Sample, Lab Final Value (LSF)
To evaluate spectral interferences on the signature of one analyte caused by another
analyte in the suite being tested. Only checks interference from instrumentation (does
not check interference from matrix)
To enable quantification of the analyte(s) of interest by enhancing the magnitude of the
spectral signature. Often used when target analytes are "detected but not quantifiable"
without the lab internal standard
To evaluate lab-induced contamination from sample collection media, reagents, and
methods
To evaluate matrix effect on routine field samples. Checks interference both from
matrix and laboratory instrumentation. Provides precision assessment of LMS
To evaluate matrix effect on routine field samples (e.g., does organic content of matrix
sorb any of the spiked material). Checks interference both from matrix and laboratory
instrumentation
To evaluate possible contamination biases from the reagents and solvents used in the
process without the interfering presence of sample collection media or dummy sample
matrix
To evaluate the performance of a lab instrument with respect to a pre-defined set of
criteria
To evaluate the accuracy of extraction and analysis of target analytes in the absence
of field matrix interferences. Also to evaluate potential contamination from extraction
solvent. Provides precision assessment of LPS
To evaluate the accuracy of extraction and analysis of target analytes in the absence
of field matrix interferences. Also to evaluate potential contamination from extraction
solvent
Field or
Lab Flag
L
L
L
L
L
L
L
L
L
L
QC Identifier
Page 4 of 6
Version 1.04 (10-10-97)

-------
                                                                          GLENDA  QC IDs
                                                                                (Qc_ident.xls)
Code
Name
Description
                                                                                                                                                                   Field or
LRB
Blank, lab reagent
Aliquot of reagent water or other neutral item (resin, filter)
containing all reagents, internal standards, surrogates, and
solvents used to process/analyze routine field samples.
Created in lab. Handled and analyzed same as routine field
sample
To identify and/or evaluate potential contamination introduced to samples from any
source in the laboratory
LRS
Reference sample,
lab
Reference sample of the same matrix as a routine field
sample. The reference sample has a mean value, established
over time, which is specific to the lab running the analysis.
Created in the lab. Handled and analyzed same as routine field
sample
To evaluate performance of lab equipment against known concentrations of target
analyte. (Similar to standard solutions, except created by lab rather than some external
entity like EMSL or NIST)
LSB
Blank, lab solvent
Aliquot containing solvents used to process/analyze routine
field sample. Does not contain reagent water, standards,
surrogates, or other reagents.   Created in lab. Handled and
analyzed same as routine field sample
To isolate and evaluate possible contamination introduced to routine field samples
from solvents
LSD
Spike duplicate, lab
Routine field sample which is analyzed according to the
analytical method, and is the 2nd of two independent aliquots
of the sample taken for fortification with target analyte(s)
To assess lab precision on sample matrix and to assess matrix variability
LSF
Spiked sample, lab
(final values)
One part of a routine field sample that is split in lab. This split
(LSF) is fortified in lab with known concentration of analyte and
analyzed in the lab according to the specified method. The
other split is analyzed without fortification
To evaluate the amount of target analyte existing in the fortified sample so that it can
be compared to a "duplicate" sample (ISO) that should be identical in all ways except
that it did not have addition of the subject analyte
LSS
Surrogate spike, lab
Routine field sample fortified with a surrogate of the target
analyte(s) which mimics the target analyte but which is not
normally found in routine field sample. Handled and analyzed
same as routine field sample
To evaluate bias in the sample matrix (usually as a function of percent recovery of the
surrogate)
LTB
Blank, lab trip
Aliquot of reagent water or other neutral item (resin, filter)
created in lab. Not carried to field. Handled, transported, and
analyzed same as routine field sample
To isolate and evaluate potential contamination introduced to samples during lab
processing/analysis. Used as QC for samples taken during an entire trip
LVM
Calibration
verification solution,
lab
Aliquot of reagent water or other neutral item (resin, filter) to
which known quantity of target analyte is added. Created in lab.
Analyzed using exact instrument used to analyze routine field
samples
To verify calibration reached with LCM sample
MDLS
Method detection limit
solution
Standard solution containing known quantities of target
analytes in units comparable to the routine field sample.
Standard solution created in accordance with 40 CFR, Part
136, Appendix B (e.g., Ultra 10 congener and
pesticide/TNC/atrazine standards)
To establish concentration range of analytical equipment where quantification is
reliable
QC Identifier
                                                                         Page 5 of 6
                                                                                                                         Version 1.04  (10-10-97)

-------
                                                                          GLENDA QC IDs
                                                                               (Qc_ident.xls)
Code
Name
Description
                                                                                                                                                                  Field or
MSB
Matrix spike blank,
lab
Aliquot of same sample matrix as RFS (though not collected in
field for this project) with historically known/established
concentration of target analyte(s). Analyzed using exact
instrument used to analyze routine  field sample
To determine background levels of analytes in matrix used to process Laboratory
Procedural Spike (LPS). The MSB is a historically "clean" environmental sample used
as an LPS
RFS
Routine field sample
Sample or aliquot collected in the field. Routine field samples
are the actual, "real" samples taken in the field. Not a quality
control sample of any kind
To assess the environmental "level" of the subject analyte, species of interest, or other
collected entity
SFB
Spiked blank, field
Aliquot of reagent water/solvent used in routine field sample
extraction.  Includes internal standards and surrogates with
known level of target analytes added in the field.  Not
processed on adsorption media. Handled, transported, and
analyzed same as RFS
To evaluate recovery of target analytes without interference from adsorption media
SFDn
Sequential duplicate
("n"-th member from
field)
The "n"-th duplicate of a routine field sample (RFS). Taken at
the SAME PLACE but somewhat LATER TIME as RFS, using
the same gear, and treated same as RFS through all field,
transport, and lab procedures
To evaluate field sampling and matrix variability when duplicate samples theoretically
contain the same amount of the subject analyte. The sample is taken at a different time
than RFS when the method or conditions make it difficult for true duplication
SLB
Solvent spike, lab
Aliquot of solvent at same volume used in routine field sample
extraction, includes internal standards/surrogates, fortified in
lab with known levels of target analytes. Not processed on
adsorption media. Handled and analyzed same as routine field
sample
To evaluate recovery of target analytes without interference from adsorption media
SRHn
Standard check, high
("n"-th member from
field)
The n-th aliquot of solution with known high concentration (e.g.
80%) of target analyte. Carried to field and exposed to same
conditions/equipment as routine field sample. Handled,
transported, and analyzed same  as routine field sample
To evaluate how closely reported result matches the "known" value. If not identical,
can indicate (1) inaccurate instrumentation at high end of reporting spectrum or (2)
possible contamination from field, transport, or lab
SRLn
Standard check, low
("n"-th member from
field)
The n-th aliquot of solution with known low concentration (e.g.
20%) of target analyte. Carried to field and exposed to same
conditions/equipment as routine field sample. Handled,
transported, and analyzed same  as routine field sample
To evaluate how closely reported result matches the "known" value. If not identical,
can indicate (1) inaccurate instrumentation at low end of reporting spectrum or (2)
possible contamination from field, transport, or lab
SRM
Reference material,
standard
Aliquot containing a certified value of the target analyte (aliquot
usually from NIST). Never carried to field. Analyzed same as
routine field sample
To evaluate how closely reported result matches the "certified" value (ie, check on
accuracy/precision or calibration of the measurement system). If values are not
identical, can indicate (1) inaccurate analytical procedures or (2) possible
contamination
QC Identifier
                                                                         Page 6 of 6
                                                                                                                         Version 1.04  (10-10-97)
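
The QC identifiers above let a data user separate routine field samples (RFS) from the field and laboratory QC samples analyzed alongside them, and the Field or Lab flag indicates where each QC sample originates. The following Python sketch is illustrative only: the four QC codes and their "F"/"L" flags are taken from the tables above, while the record layout is a hypothetical assumption.

    # Illustrative sketch only; the record layout is an assumption. QC codes
    # and field ("F") / lab ("L") flags are taken from the GLENDA QC ID tables.
    QC_ID_FLAGS = {
        "FDn": "F",   # field duplicate ("n"-th member from field)
        "FTB": "F",   # field trip blank
        "LDn": "L",   # lab duplicate ("n"-th member from lab)
        "LCB": "L",   # lab calibration blank
    }

    def is_routine_sample(qc_id):
        """True for routine field samples; False for any QC sample type."""
        return qc_id == "RFS"

    records = [{"qc_id": "RFS"}, {"qc_id": "LCB"}, {"qc_id": "FDn"}]
    routine = [r for r in records if is_routine_sample(r["qc_id"])]
    qc_only = [r for r in records if not is_routine_sample(r["qc_id"])]
    print(len(routine), "routine sample(s),", len(qc_only), "QC sample(s)")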

-------
                                 Appendix E
        Great Lakes National Program Office Information Security Plan
GLNPO Quality Management Plan - Appendix E                                            May 2008

-------
United States Environmental Protection Agency
Great Lakes National Program Office
Binational and Partner Network
(glnpo.net)
              System Documentation
              Updated December 12, 2002


              Table of Contents

              Authorizations and Agreements
              1.  Request for Waiver to Operate Off-Network
                 Developmental Server - March 24, 2000 - Francis X.
                 Lyons to Margaret N. Schneider
              2.  Authorization to Operate Servers - May 15, 2000 -
                 Margaret N. Schneider to Francis X. Lyons
              3.  Request to Upgrade Operating System for GLNPOnet
                 August 31, 2000 - Gary Gulezian to Myra Galbraith
              4.  Approval to Upgrade Operating System - September
                 29, 2000 -  Mark Day to Gary V. Gulezian
              5.  Assignments of Responsibility: Management of
                 Security Controls; Information Security Officer
              6.  Joint Agreement between GLNPO and R5RMD
                 Regarding IT Activities Statement of Principles

              Information Security Package
              7.  Information Security  Plan
              8.  Network Description
              9.  Security Policies and Procedures Handbook
              10. Security Review Procedure Manual
              11. Continuity of Support Plan
              12. Final Risk Assessment of the Great Lakes National
                 Program Office's Information Technology
                 Infrastructure - July 12, 2001
              13. Automated Security Self-Evaluation and Reporting
                 Tool (ASSERT) Summary - September 2002

-------
United States
Environmental Protection Agency
                    Binational and Partner Network
                   (glnpo.net)
              Supplement to Information
             Security Plan
             1. System description
             2. System configuration
             3. Application configuration
             4. Continuity of operations
             5. Risk assessment
             6. Policies and procedures
             Great Lakes National Program Office
             Chicago, Illinois
              January 30, 2008

-------
  Appendix B GLNPO LAN General  Systems Security Plan Accreditation Memo
                               DATE:  May 30, 2008
MEMORANDUM
SUBJECT:  Security Accreditation, and Authorization to Operate, for the GLNPO.NET
             LAN System Security Planning Package
FROM:      Gary V. Gulezian, Director
             Great Lakes National Program Office
             Region 5 USEPA
TO:         David Cowgill, Chief
             Technical Analysis and Assistance Branch
             Region 5 USEPA
       After reviewing the results of the security certification of the GLNPO.NET LAN
and its constituent system-level components located at Region 5 USEPA, and the
supporting evidence provided in the associated security accreditation package, I have
determined that the risk to Agency operations, assets, or individuals resulting from the
operation of the information system is fully acceptable.

       Accordingly, I  am issuing an Authorization to Operate the  information system in
its existing operating environment. The  information system is accredited without any
significant restrictions or limitations. This security accreditation is my  formal declaration
that adequate security  controls have been implemented in the information system and that
a satisfactory level of security is present in the system.

   The security accreditation of the information system will remain in effect as long as
the conditions exist as follows:

    1.  The vulnerabilities reported during the  continuous monitoring process do not
       increase Agency-level risk to levels deemed unacceptable.
   2.  The system has neither exceeded its three (3) year Certification and Accreditation
       period nor undergone any major changes requiring the security  plan to be
       updated.
   3.  The system's owner commits to  complete any POAMs that are  established now or
       in the future to ensure the continued effectiveness of this security plan and the
       security controls  specified.
                                     PAGE 1

-------
  Appendix B GLNPO LAN General Systems Security Plan Accreditation Memo


   A copy of this letter with all supporting security Certification and Accreditation
documentation should be retained in accordance with the Agency's record retention
schedule.
Gary V. Gulezian, Director, Great Lakes National Program Office
Region 5 USEPA
Signature
cc:        Marian Cody, Chief Information Security Officer
          Office of Technology Operations and Planning
          Office of Environmental Information

Enclosures:
Attachments of: GLNPO.NET LAN System Documentation
               GLNPO.NET LAN Supplement to Information Security Plan
                                   PAGE 2

-------
                                 Appendix F
              Suggested Quality System Documentation Checklist
GLNPO Quality Management Plan - Appendix F                                            May 2008

-------
                  Suggested Project Quality System Documentation Checklist

        The purpose of this checklist is to guide GLNPO project officers and quality system staff through the
processes of planning a project, reviewing the planning documentation, and complying with GLNPO's quality
system requirements. You may use this form, or equivalent documentation, for any IN-HOUSE work effort, WORK
ASSIGNMENT, CONTRACT, COOPERATIVE AGREEMENT, GRANT, or INTERAGENCY AGREEMENT
where GLNPO provides funds or technical support.

Section 1 - General Project Information

 Brief Descriptive Project Title:
 Project Start Date:

 Anticipated Project Completion Date:

 EPA Project Manager:

 Project Team Members:


 Designated Quality System Team Member:

 Name of contractor or grantee (if any):
 Yes ___  No ___   Is this project related to a specific environmental decision, regulation, or enforcement
                   action?
 Yes ___  No ___   Will EPA be collecting data during this project?
 Yes ___  No ___   Will an EPA contractor or grantee be collecting data during this project?
 Yes ___  No ___   Will data from other sources be used during this project?
 Yes ___  No ___   If so, were the data collected in association with this project or for some other purpose?
                   (e.g., is this a secondary use of the data?)
 Sources of other data (if any):
 Yes ___  No ___   Is this a software/modeling development project?
 Yes ___  No ___   Is this a new contract, new work assignment, or new grant?
If the answer to any question above is "Yes," then complete the rest of this form.

If all answers above are "No," then sign this page and submit it with the procurement request or procurement
initiation notice.
  Project Manager's Signature
Date
                                              Page 1 of 3

-------
Section 2 - Quality System Documentation Requirements
(for projects involving environmental measurements or data)

The questions below are to be answered by the quality system staff member in order to establish the requirements for
quality system documentation for the project.
Does the project require that:
 Yes ___  No ___   A written quality management plan or other document that describes the commitment of the
                   offeror's management to meet the quality requirements of the scope of work be included in the
                   project plan, contract/cooperative agreement/grant proposal, etc.?
 Yes ___  No ___   A written quality assurance project plan (QAPP) be delivered as part of the project plan,
                   proposal, grant, contract task order, etc.?
 Yes ___  No ___   Quality system reports be delivered?
                   ___ with Progress Reports   ___ with Final Report?
 Yes ___  No ___   Quality system audits be conducted for the contract?
                   ___ Pre-Award   ___ During Contract?
 Yes ___  No ___   Procedures be in place to review data against acceptance criteria?
 Yes ___  No ___   Another form of documentation be used instead of a QAPP (see below)?
Rationale, if no QAPP required:  (if another form of documentation is used, please specify it here)
 Please identify:
 Organization responsible for preparing the QAPP or
 other quality system documentation
 If EPA, name of author

 Due date for QAPP or other documentation

 Anticipated start date of data collection
Section 3 - Review and Approval of Quality System Documentation
(to be completed by the quality system member)
 EPA reviewer for QAPP or other documentation

 Date review completed
 Date documentation approved

 Location of approved and signed documentation
 Project Manager's Signature
Date
Quality System Signature
Date
                                              Page 2 of 3

-------
Section 4 - Management Review (to be completed by the Branch Chief in consultation with the quality system
member)
                                                                                     Yes    No
 Are environmental data required for this project? (Section 1)                      ___    ___
 Have requirements for the quality system documentation been established?           ___    ___
 (Section 2)
 Has the quality system documentation been reviewed and approved by both the        ___    ___
 Project Manager and the quality system staff member? (Section 3)
 If this is a contract, work assignment, task order, grant, cooperative             ___    ___
 agreement, or TAG, have the quality system requirements been included in the
 activity and documented on the appropriate forms?
 May this project proceed as planned?                                               ___    ___
 Is concurrence required from the Division Director or Office Director?             ___    ___
Comments:
 Branch Chief's Signature
Date
For projects at the Office level:
 Office Director's Signature
Date
                                              Page 3 of  3

-------
                                Appendix G
    EPA Requirements for Quality Assurance Project Plans (EPA QA/R-5)
GLNPO Quality Management Plan - Appendix G                                          May 2008

-------
         United States Environmental Protection Agency
         Office of Environmental Information
         Washington, DC 20460
         EPA/240/B-01/003
         March 2001



         EPA Requirements for Quality Assurance Project Plans



         EPA QA/R-5

-------
Page Intentionally Blank

-------
                                     FOREWORD
       The U.S. Environmental Protection Agency (EPA) has developed the Quality Assurance
Project Plan (QA Project Plan) as a tool for project managers and planners to document the type
and quality of data needed for environmental decisions and to describe the methods for collecting
and assessing those data. The development, review, approval, and implementation of the QA
Project Plan is part of EPA's mandatory Quality System. The EPA Quality System requires all
organizations to develop and operate management structures and processes to ensure that data
used in Agency decisions are of the type and quality needed for their intended use.  The QA
Project Plan is an integral part of the fundamental principles and practices that form the
foundation of the EPA Quality System.

       This document provides the QA Project Plan requirements for organizations that conduct
environmental data operations on behalf of EPA through contracts, financial assistance
agreements, and interagency agreements; however, it may be used by EPA as well. It contains the
same requirements as Chapter 5 of EPA Order 5360 A1 (EPA 2000), The EPA Quality Manual
for Environmental Programs, which has been developed for internal use by EPA organizations.
A companion document, EPA Guidance for Quality Assurance Project Plans (QA/G-5) (EPA
1998) provides suggestions for both EPA and non-EPA organizations on preparing,  reviewing,
and implementing QA Project Plans that satisfy the requirements defined in this document.

       This document is one of the EPA Quality System Series documents which describe EPA
policies and procedures for planning, implementing,  and assessing the effectiveness of a quality
system. Questions regarding this document or other EPA Quality System Series documents
should be directed to:

                           U.S. EPA
                           Quality Staff (2811R)
                           Washington, DC 20460
                           Phone: (202) 564-6830
                           FAX:  (202)565-2441
                           e-mail: quality@epa.gov

Copies of Quality System Series documents may be  obtained from the Quality Staff or by
downloading them from the  Quality Staff Home Page:

                           www.epa.gov/quality
                                                                                   Final
EPAQA/R-5                                 i                                  March 2001

-------
                                ACKNOWLEDGMENTS
       This document reflects the collaborative efforts of many quality management professionals
who participate in the challenge for continual improvement in quality systems supporting
environmental programs.  These individuals, representing the EPA, other Federal agencies, State
and local governments, and private industry, reflect a diverse and broad range of needs and
experiences in environmental data collection programs.  Their contributions and the
comprehensive reviews during the development of this document are greatly appreciated.
                                                                                    Final
EPAQA/R-5                                  ii                                  March 2001

-------
                           TABLE OF CONTENTS

                                                                       Page

CHAPTER 1. INTRODUCTION  	1
      1.1   BACKGROUND	1
      1.2   QA PROJECT PLANS, THE EPA QUALITY SYSTEM, AND
           ANSI/ASQC E4-1994 	2
      1.3   THE GRADED APPROACH AND THE EPA QUALITY SYSTEM 	4
      1.4   INTENDED AUDIENCE 	4
      1.5   PERIOD OF APPLICABILITY	4
      1.6   ADDITIONAL RESOURCES	4
      1.7   SUPERSESSION	5

CHAPTER 2. QA PROJECT PLAN REQUIREMENTS 	7
      2.1   POLICY	7
      2.2   PURPOSE	7
      2.3   APPLICABILITY 	7
      2.4   GENERAL CONTENT AND DETAIL REQUIREMENTS	7
           2.4.1  General Content	7
           2.4.2  Level of Detail 	8
      2.5   QA PROJECT PLAN PREPARATION AND APPROVAL	8
      2.6   QA PROJECT PLAN IMPLEMENTATION	9
      2.7   QA PROJECT PLAN REVISION 	9

CHAPTER 3. QA PROJECT PLAN ELEMENTS	11
      3.1   CONTENT REQUIREMENTS 	11
      3.2   GROUP A:  PROJECT MANAGEMENT 	12
            3.2.1  A1 - Title and Approval Sheet	13
           3.2.2  A2 - Table of Contents	13
           3.2.3  A3 - Distribution List 	14
           3.2.4  A4 - Project/Task Organization	14
           3.2.5  A5 - Problem Definition/Background  	14
           3.2.6  A6 - Project/Task Description	14
           3.2.7  A7 - Quality Objectives and Criteria 	15
           3.2.8  A8 - Special Training/Certification 	15
           3.2.9  A9 - Documents and Records	15
      3.3   GROUP B: DATA GENERATION AND ACQUISITION	15
            3.3.1  B1 - Sampling Process Design (Experimental Design) 	16
           3.3.2  B2 - Sampling Methods  	17
           3.3.3  B3 - Sample Handling and Custody	17
           3.3.4  B4 - Analytical Methods	17
           3.3.5  B5 - Quality Control	18

                                                                        Final
EPAQA/R-5                             iii                             March 2001

-------
                                                                          Page
            3.3.6  B6 - Instrument/Equipment Testing, Inspection, and Maintenance	18
            3.3.7  B7 - Instrument/Equipment Calibration and Frequency 	18
            3.3.8  B8 - Inspection/Acceptance of Supplies and Consumables	19
            3.3.9  B9 - Non-direct Measurements	19
            3.3.10 B10 - Data Management	19
      3.4    GROUP C: ASSESSMENT AND OVERSIGHT 	20
            3.4.1  C1 - Assessments and Response Actions	20
            3.4.2  C2 - Reports to Management 	20
      3.5    GROUP D: DATA VALIDATION AND USABILITY	21
            3.5.1  D1 - Data Review, Verification, and Validation	21
            3.5.2  D2 - Verification and Validation Methods	21
            3.5.3  D3 - Reconciliation with User Requirements	21

REFERENCES  	23

APPENDIX A. CROSSWALKS AMONG QUALITY ASSURANCE DOCUMENTS . A-1
      A.1    BACKGROUND	A-1
      A.2   CROSSWALK BETWEEN EPA QA/R-5 AND QAMS-005/80	A-1
      A.3    CROSSWALK BETWEEN THE DQO PROCESS
            AND THE QA PROJECT PLAN	A-3

APPENDIX B. TERMS AND DEFINITIONS  	B-1
                                   FIGURES
                                                                          Page
1.  EPA Quality System Components and Tools	3
2.  Example Document Control Format	14
                                   TABLES
                                                                          Page
1.  Group A: Project Management Elements  	13
2.  Group B: Data Generation and Acquisition Elements 	16
3.  Group C: Assessment and Oversight Elements 	20
4.  Group D: Data Validation and Usability Elements	21
                                                                           Final
EPA QA/R-5                              iv                              March 2001

-------
                                      CHAPTER 1

                                   INTRODUCTION

1.1    BACKGROUND

       Environmental programs conducted by or funded by the U.S. Environmental Protection
Agency (EPA) involve many diverse activities that address complex environmental issues. The
EPA annually spends several hundred million dollars in the collection of environmental data for
scientific research and regulatory decision making. In addition, non-EPA organizations may
spend as much as an order of magnitude more each year to respond to Agency requirements.  If
decision makers (EPA and otherwise) are to have confidence in the quality of environmental data
used to support their decisions, there must be a structured process for quality in place.

       A structured system that describes the policies and procedures for ensuring that work
processes, products, or services satisfy stated expectations or specifications is called a quality
system. All organizations conducting environmental programs funded by EPA are required to
establish and implement a quality system. EPA also requires that all environmental data used in
decision making be supported by an approved Quality Assurance Project Plan (QA Project Plan).
This requirement is defined in EPA Order 5360.1 A2 (EPA 2000), Policy and Program
Requirements for the Mandatory Agency-wide Quality System, for EPA organizations. Non-EPA
organizations funded by EPA are required to develop a QA Project Plan through:

       •      48 CFR 46, for contractors;

       •      40 CFR 30, 31, and 35, for assistance agreement recipients; and

       •      other mechanisms, such as consent agreements in enforcement actions.

       The QA Project Plan integrates all technical and quality aspects of a project, including
planning, implementation, and assessment.  The purpose of the QA Project Plan is to document
planning results for environmental data operations and to provide a project-specific "blueprint"
for obtaining the type and quality of environmental data needed for a specific decision or use.  The
QA Project Plan documents how quality assurance (QA) and quality control  (QC) are applied to
an environmental data operation to assure that the results obtained are of the type and quality
needed and expected.

       The ultimate success of an environmental program or project depends on the quality of the
environmental data collected and used in decision-making, and this may depend significantly on
the adequacy of the QA Project Plan and its effective implementation.  Stakeholders (i.e., the data
users, data producers, decision makers, etc.) shall be involved  in the planning process for a
program or project to ensure that their needs are defined adequately and addressed.  While time
spent on such planning may seem unproductive and costly, the penalty for ineffective planning

                                                                                     Final
EPAQA/R-5                                  1                                   March 2001

-------
includes greater cost and lost time. Therefore, EPA requires that a systematic planning process be
used to plan all environmental data operations.  To support this requirement, EPA has developed
a process called the Data Quality Objectives (DQO) Process.  The DQO Process is the Agency's
preferred planning process and is described in the Guidance for the Data Quality Objectives
Process (QA/G-4) (EPA 2000b).  The QA Project Plan documents the outputs from systematic
planning.

       This requirements document presents specifications and instructions for the information
that must be contained in a QA Project Plan for environmental data operations funded by EPA.
The document also discusses the procedures for review, approval, implementation, and revision of
QA Project Plans. Users of this document should assume that all of the elements described herein
are required in a QA Project Plan unless otherwise directed by EPA.

1.2    QA PROJECT PLANS, THE  EPA QUALITY SYSTEM, AND ANSI/ASQC
       E4-1994

       EPA Order 5360.1 A2 and the applicable Federal regulations (defined above) establish a
mandatory Quality System that applies to all EPA organizations and organizations funded by
EPA. Components of the EPA Quality  System are illustrated in Figure 1. Organizations must
ensure that data collected for the characterization of environmental processes and conditions are
of the appropriate type and quality for their intended use and that environmental technologies are
designed, constructed, and operated according to defined expectations. The QA Project Plan is a
key project-level component of the EPA Quality System.

       EPA policy is based on the national consensus standard, ANSI/ASQC E4-1994,
Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental
Technology Programs.  The ANSI/ASQC E4-1994 standard describes the necessary management
and technical elements for developing and implementing a quality system. The standard
recommends a tiered approach to a quality system: first
documenting  each organization-wide quality system in a Quality Management Plan or Quality
Manual (to address requirements of Part A: Management Systems of the standard) and then
documenting the applicability of the quality system to technical activity-specific efforts in a QA
Project Plan or similar document (to address the requirements of Part B: Collection and
Evaluation of Environmental Data of the standard). EPA has adopted this tiered approach for its
mandatory Agency-wide Quality System.  This document addresses Part B requirements of the
standard.

       A Quality Management Plan, or equivalent Quality Manual, documents how an
organization structures its quality system, defines and assigns QA and QC responsibilities, and
describes the  processes and procedures used to plan, implement, and assess the  effectiveness of
the quality system. The Quality Management Plan may be viewed as the "umbrella" document
under which individual projects are conducted. EPA requirements for Quality Management Plans
are defined in EPA Requirements for Quality Management Plans (QA/R-2) (EPA 2001). The

                                                                                   Final
EPAQA/R-5                                 2                                  March 2001

-------
[Figure 1. EPA Quality System Components and Tools: graphic showing the Policy, Organization/Program, and Project levels of the EPA Quality System, supported by consensus standards (ANSI/ASQC E4, ISO 9000 series), internal EPA policies (EPA Order 5360.1, EPA Manual 5360), external policies (contracts under 48 CFR 46; assistance agreements under 40 CFR 30, 31, and 35; EPA program and regional policy), supporting system elements (e.g., procurements, computer hardware/software), training/communication (e.g., training plan, conferences), and systematic planning (e.g., the DQO Process).]
-------
Quality Management Plan is then supported by project-specific QA Project Plans. In some cases,
a QA Project Plan and a Quality Management Plan may be combined into a single document that
contains both organizational and project-specific elements. The QA Manager for the EPA
organization sponsoring the work has the authority to determine when a single document is
applicable and will define the content requirements of such a document.

1.3    THE GRADED APPROACH AND THE EPA QUALITY SYSTEM

       Recognizing that a "one size fits all" approach to quality requirements will not work in
organizations as diverse as EPA, implementation of the EPA Quality System is based on the
principle of graded approach. Applying a graded approach means that quality systems for
different organizations and programs will vary according to the specific objectives and needs of
the organization. For example, the quality expectations of a fundamental research program are
different from that of a regulatory  compliance program because the purpose or intended use of the
data is different.  The specific application of the graded approach principle to QA Project Plans is
described in Section 2.4.2.

1.4    INTENDED AUDIENCE

       This document specifies the requirements for developing QA Project Plans for
organizations that conduct environmental data operations  funded by EPA through contracts,
financial assistance agreements, and interagency agreements.  EPA organizations may also use this
document to develop QA Project Plans since this document is clearer and more user-friendly than
the equivalent requirements defined in Section 5.3 of EPA Order 5360 A1 (EPA 2000), The EPA
Quality Manual for Environmental Programs (an internal policy document).  However, the
preparation, submission, review, and approval requirements for EPA organizations are still
contained in Section 5.2 of EPA Order 5360 A1 as these represent internal EPA policy.

1.5    PERIOD OF APPLICABILITY

       This document shall be valid for a period of up to five years from the official date of
publication. After five years, it shall either be reissued without change, revised, or withdrawn
from the EPA Quality System.

1.6    ADDITIONAL RESOURCES

       Guidance on preparing QA Project Plans may be found in a companion document, EPA
Guidance for Quality Assurance Project Plans (QA/G-5)  (EPA 1998). This guidance discusses
the application of the QA Project Plan requirements and provides examples. Other documents
that provide guidance on activities critical to successful environmental data operations and
complement the QA Project Plan preparation effort include:
                                                                                   Final
EPAQA/R-5                                 4                                  March 2001

-------
        •      Guidance for the Data Quality Objectives Process (QA/G-4), (EPA 2000b)
       •      Guidance for the Preparation of Standard Operating Procedures for Quality-
             Related Documents (QA/G-6), (EPA 1995)
       •      Guidance for Data Quality Assessment: Practical Methods for Data Analysis
             (QA/G-9),  (EPA 2000a)

1.7    SUPERSESSION

       This document replaces QAMS-005/80, Interim Guidelines and Specifications for
Preparing Quality Assurance Project Plans (EPA 1980) in its entirety.
                                                                                Final
EPAQA/R-5                                5                                March 2001

-------
                               (This page is intentionally blank.)
                                                                                          Final
EPAQA/R-5                                   6                                     March 2001

-------
                                     CHAPTER 2

                        QA PROJECT PLAN REQUIREMENTS

2.1    POLICY

       All work funded by EPA that involves the acquisition of environmental data generated
from direct measurement activities, collected from other sources, or compiled from computerized
data bases and information systems shall be implemented in accordance with an approved QA
Project Plan.  The QA Project Plan will be developed using a systematic planning process based
on the graded approach. No work covered by this requirement shall be implemented without an
approved QA Project Plan available prior to the start of the work except under circumstances
requiring immediate action to protect human health and the environment or operations conducted
under police powers.

2.2    PURPOSE

       The QA Project Plan documents the planning, implementation, and assessment procedures
for a particular project and how specific QA and QC activities will be applied during it. The QA
Project Plan demonstrates conformance to Part B requirements of ANSI/ASQC E4-1994.

2.3    APPLICABILITY

       These requirements apply to all environmental programs funded by EPA that acquire,
generate, or compile environmental data including work performed through contracts, work
assignments, delivery orders, task orders, cooperative agreements, interagency agreements, State-
EPA agreements, State, local and Tribal Financial Assistance/Grants, Research Grants, and in
response to statutory or regulatory requirements and consent agreements. These requirements are
negotiated into interagency agreements, including sub-agreements, and, in some cases, are
included in enforcement settlement and consent agreements and orders. Where specific Federal
regulations require the application of QA and QC activities (see Section 1.1), QA Project Plans
shall be prepared, reviewed,  and approved in accordance with the specifications contained in this
document unless explicitly superseded by the regulation.

2.4    GENERAL CONTENT AND DETAIL REQUIREMENTS

2.4.1   General Content

       The QA Project Plan must be composed of standardized, recognizable elements covering
the entire project from planning, through implementation, to assessment. Chapter 3 of this
document describes specific  elements to address for QA Project Plans submitted to EPA. In some
cases, it may be necessary to add special requirements to the QA Project Plan.  The EPA
organization sponsoring the work has the authority to define any special requirements beyond

                                                                                   Final
EPAQA/R-5                                 7                                  March 2001

-------
those listed in this document.  If no additional requirements are specified, the QA Project Plan
shall address all required elements. Each EPA organization defines its organization-specific
requirements for QA Project Plan documentation in its Quality Management Plan. All
applicable elements defined by the EPA organization sponsoring the work must be addressed.

       While most QA Project Plans will describe project- or task-specific activities, there may be
occasions when a generic QA Project Plan may be more appropriate.  A generic QA Project Plan
addresses the general, common activities of a program that are to be conducted  at multiple
locations or over a long period of time; for example, it may be useful for a large monitoring
program that uses the same methodology at different locations. A generic QA Project Plan
describes, in a single document, the information that is not site or time-specific but applies
throughout the program. Application-specific information is then added to the approved QA
Project Plan as that information becomes known or completely defined. A generic QA Project
Plan shall be reviewed periodically to ensure that its content continues to be valid and applicable
to the program over time.

2.4.2  Level of Detail

       The level of detail of the QA Project Plan should be based on a graded approach so that
the level of detail in each QA Project Plan will vary according to the nature of the work being
performed and the intended use of the data. As a result, an acceptable QA Project Plan for some
environmental data operations may require a qualitative discussion of the experimental process
and its objectives while others may require extensive documentation to adequately describe a
complex environmental program.

2.5     QA PROJECT PLAN PREPARATION AND APPROVAL

       The QA Project Plan may be prepared by an EPA organization, a contractor, an assistance
agreement holder, or another Federal agency under an interagency agreement.  Except where
specifically delegated in the Quality Management Plan of the EPA organization  sponsoring the
work, all QA Project Plans prepared by non-EPA organizations must be approved by EPA before
implementation.

       The QA Project Plan shall be reviewed and approved by an authorized EPA reviewer to
ensure that the QA Project Plan contains the appropriate content and level of detail.  The
authorized reviewer, for example the EPA project manager1 with the assistance and approval of
the EPA QA Manager, or the EPA QA Manager alone, is defined by the EPA organization's
Quality  Management Plan.  In some cases, the authority to review and approve QA Project Plans
is delegated to another part of the EPA organization covered by the same Quality Management
1 This term refers to the EPA official responsible for the project. This individual may also be called Project Officer,
Delivery Order Project Officer, Work Assignment Manager, or Principal Investigator.

                                                                                     Final
EPAQA/R-5                                  8                                   March 2001

-------
Plan.  In cases where the authority to review and approve QA Project Plans is delegated in writing
by EPA to another organization (i.e., a Federal agency or a State under an EPA-approved Quality
Management Plan when the environmental data operation itself has been delegated to that
organization for implementation), it is possible that the EPA project manager and EPA QA
Manager may not be involved in the review and approval steps.

2.6    QA PROJECT PLAN IMPLEMENTATION

       None of the environmental work addressed by the QA Project Plan shall be started until
the QA Project Plan has been approved and distributed to project personnel except in situations
requiring immediate action to protect human health and the environment or operations conducted
under police powers. Subject to these exceptions, it is the responsibility of the organization
performing the work to assure that no environmental data are generated or acquired before the
QA Project Plan is approved and received by the appropriate project personnel. However, EPA
may grant conditional approval to  a QA Project Plan to permit some work to begin while non-
critical deficiencies in the QA Project Plan are being resolved.

       The organization performing the work shall ensure that the QA Project Plan is
implemented as approved and that all personnel involved in the work have direct access to a
current version of the QA Project Plan and all other necessary planning, implementation, and
assessment documents.  These personnel  should understand the requirements prior to the start of
data generation activities.

2.7    QA PROJECT PLAN REVISION

       Although the approved QA Project Plan must be implemented as prescribed, it is not
inflexible. Because of the complex and diverse nature of environmental data operations, changes
to original plans are  often needed.  When such changes occur, the approving official shall
determine if the change significantly impacts the technical and quality objectives of the project.
When a substantive change is warranted, the originator of the QA Project Plan  shall modify the
QA Project Plan to document the change  and submit the  revision for approval by the same
authorities that performed the original review.  Only after the revision has been received and
approved (at least verbally with written follow-up) by project personnel, shall the change be
implemented.

       For programs or projects of long duration, such as multi-year monitoring programs or
projects using a generic QA Project Plan, the QA Project Plans shall be reviewed at least annually
by the EPA Project Manager (or authorized representative).  When revisions are necessary, the
QA Project Plan must be revised and resubmitted for review and approval.
                                                                                     Final
EPAQA/R-5                                  9                                  March 2001

-------
                               (This page is intentionally blank.)
                                                                                         Final
EPAQA/R-5                                   10                                   March 2001

-------
                                      CHAPTER 3

                           QA PROJECT PLAN ELEMENTS

3.1    CONTENT REQUIREMENTS

       The QA Project Plan is a formal document describing in comprehensive detail the
necessary QA, QC, and other technical activities that must be implemented to ensure that the
results of the work performed will satisfy the stated performance criteria. The QA Project Plan
must provide sufficient detail to demonstrate that:

        •      the project technical and quality objectives are identified and agreed upon;

        •      the intended measurements, data generation, or data acquisition methods are
               appropriate for achieving project objectives;

        •      assessment procedures are sufficient for confirming that data of the type and
               quality needed and expected are obtained; and

        •      any limitations on the use of the data can be identified and documented.

Most environmental data operations require the coordinated efforts of many individuals, including
managers, engineers, scientists, statisticians, and others. The QA Project Plan must integrate the
contributions and requirements of everyone involved into a clear, concise statement of what is to
be accomplished, how it will be done, and by whom.  It must provide  understandable instructions
to those who must implement the QA Project Plan, such as the field sampling team, the analytical
laboratory, modelers, and the data reviewers. In all aspects of the QA Project Plan, the use of
national consensus standards and practices is encouraged.

       In order to be effective, the QA Project Plan must specify the level or degree of QA and
QC activities needed for the particular environmental data operations.  Because this will vary
according to the purpose and type of work being done, EPA believes  that the graded approach
should be used in planning the work. This means that the QA and QC activities applied to a
project will be commensurate with:

        •       the purpose of the environmental data operation (e.g., enforcement, research and
               development, rulemaking),

        •       the type of work to be done (e.g., pollutant monitoring, site characterization, risk
               characterization, bench level proof of concept experiments), and

        •       the intended use of the results (e.g., compliance determination, selection of
               remedial technology, development of environmental regulation).

                                                                                     Final
EPAQA/R-5                                  11                                   March 2001

-------
       The QA Project Plan shall be composed of standardized, recognizable elements covering
the entire project from planning, through implementation, to assessment. These elements are
presented in that order and have been arranged for convenience into four general groups. The
four groups of elements and their intent are summarized as follows:

       A      Project Management - The elements in this group address the basic area of project
              management, including the project history and objectives, roles and responsibilities
              of the participants, etc.  These elements ensure that the project has a defined goal,
              that the participants understand the goal and the approach to be used, and that the
              planning outputs have been documented.

       B      Data Generation and Acquisition - The elements in this group  address all aspects
               of project design and implementation. Implementation of these elements ensures
              that appropriate methods for sampling, measurement and analysis,  data collection
              or generation, data handling, and QC activities are employed and are properly
              documented.

       C      Assessment and Oversight - The elements in this group address the activities for
              assessing the effectiveness of the implementation of the project and associated QA
              and QC activities. The purpose of assessment is to ensure that the QA Project
              Plan is implemented as prescribed.

       D      Data Validation and Usability - The elements in this group address the QA
              activities that occur after the data collection or generation phase of the project is
              completed.  Implementation of these elements ensures that the data conform to the
              specified criteria, thus achieving the project objectives.

       All applicable elements, including the content and level of detail under each element,
defined by the EPA organization sponsoring the work must be addressed in the QA Project Plan.
If an element is not applicable, state this in the QA Project Plan. Documentation,  such as an
approved Work Plan, Standard Operating Procedures, etc., may be referenced in response to a
particular required QA Project Plan element to reduce the size of the  QA Project Plan. Current
versions of all referenced documents must be attached to the QA Project Plan itself or be placed
on file with the appropriate EPA office and available for routine referencing when needed. The
QA Project Plan shall also address related QA planning documentation (e.g., Quality Management
Plans) from suppliers of services critical to the technical and quality objectives of the project or
task.

3.2    GROUP A:  PROJECT MANAGEMENT

       The elements in this group (Table 1) address project management, including project
history and objectives, roles and responsibilities of the participants, etc.  These elements document
                                                                                     Final
EPAQA/R-5                                  12                                  March 2001

-------
that the project has a defined goal, that the participants understand the goal and the approach to
be used, and that the planning outputs have been documented.
Table 1. Group A: Project Management Elements
   A1    Title and Approval Sheet
   A2    Table of Contents
   A3    Distribution List
   A4    Project/Task Organization
   A5    Problem Definition/Background
   A6    Project/Task Description
   A7    Quality Objectives and Criteria
   A8    Special Training/Certification
   A9    Documents and Records
3.2.1   A1 - Title and Approval Sheet

       On the Title and Approval Sheet, include the title of the plan, the name of the
organization(s) implementing the project, the effective date of the plan, and the names, titles,
signatures, and approval dates of appropriate approving officials. Approving officials may
include:

                    Organization's Project Manager
                    Organization's QA Manager
                    EPA Project Manager
                    EPA QA Manager
                    Others, as needed (e.g., field operations manager,  laboratory managers,
                    State and  other Federal agency officials)

3.2.2   A2 - Table of Contents

       Provide a table of contents for the document, including sections, figures, tables,
references, and appendices. Apply a document control format (Figure 2) on each page following
the Title and Approval Sheet when required by the EPA Project Manager and QA Manager.
                                                                                      Final
EPA QA/R-5                                  13                                  March 2001

-------
                                 Section No. ____
                                 Revision No. ____
                                 Date ____________
                                 Page ____ of ____
                       Figure 2. Example Document Control Format

3.2.3  A3 - Distribution List

       List the individuals and their organizations who need copies of the approved QA Project
Plan and any subsequent revisions, including all persons responsible for implementation (e.g.,
project managers), the QA managers, and representatives of all groups involved. Paper copies
need not be provided to individuals if equivalent electronic information systems can be used.

3.2.4  A4 - Project/Task Organization

       Identify the individuals or organizations participating in the project and discuss their
specific roles and responsibilities. Include the principal data users, the decision makers, the
project QA manager, and all persons responsible for implementation.  The project quality
assurance manager must be independent of the unit generating the data.  (This does not include
being independent of senior officials, such as corporate managers or agency administrators, who
are nominally, but not functionally, involved in data generation, data use, or decision making.)
Identify the individual responsible for maintaining the official, approved QA Project Plan.

       Provide a concise organization chart showing the relationships and the lines of
communication among all project participants. Include other data users who are outside of the
organization generating the data, but for whom the data are nevertheless intended.  The
organization chart must also identify any subcontractor relationships relevant to environmental
data operations, including laboratories providing analytical services.

3.2.5  A5 - Problem Definition/Background

       State the specific problem to be solved, decision to be made, or outcome to be achieved.
Include sufficient background information to provide a historical, scientific, and regulatory
perspective for this particular project.

3.2.6  A6 - Project/Task Description

       Provide a summary of all work to be performed, products to be produced, and the
schedule for implementation. Provide maps or tables that show or state the geographic locations
of field tasks.  This discussion need not be lengthy or overly detailed, but should give an  overall
picture of how the project will resolve the problem or question described in A5.
                                                                                      Final
EPAQA/R-5                                  14                                   March 2001

-------
3.2.7  A7 - Quality Objectives and Criteria

       Discuss the quality objectives for the project and the performance criteria to achieve those
objectives. EPA requires the use of a systematic planning process to define these quality
objectives and performance criteria.

3.2.8  A8 - Special Training/Certification

       Identify and describe any specialized training or certifications needed by personnel in order
to successfully complete the project or task. Discuss how such training will be provided and how
the necessary skills will be assured and documented.

3.2.9  A9 - Documents and Records

       Describe the process and responsibilities for ensuring the appropriate project personnel
have the most current approved version of the QA Project Plan, including version control,
updates, distribution, and disposition.

       Itemize the information and records which must be included in the data report package
and specify the reporting format for hard copy and any electronic forms.  Records can include raw
data, data from other sources such as data bases or literature, field logs, sample preparation and
analysis logs, instrument printouts, model input and output files, and results of calibration and QC
checks.

       Identify any other records and documents applicable to the project that will be produced,
such as audit reports, interim progress reports, and final reports.  Specify  the level of detail of the
field sampling, laboratory analysis, literature or data base data collection,  or modeling documents
or records needed to provide a complete description of any difficulties encountered.

       Specify or reference all applicable requirements for the final  disposition of records and
documents, including location and length of retention period.

3.3     GROUP B: DATA GENERATION AND ACQUISITION

       The elements in this group (Table 2) address all aspects of data generation and acquisition
to ensure that appropriate methods for sampling, measurement and analysis, data collection or
generation, data handling, and QC activities are employed and documented. The following QA
Project Plan elements describe the requirements related to the actual methods or methodology to
be used for the:

       •      collection, handling, and analysis of samples;
                                                                                      Final
EPA QA/R-5                                  15                                  March 2001

-------
       •      data obtained from other sources (e.g., contained in a computer data base from
              previous sampling activities, compiled from surveys, taken from the literature); and

        •      the management (i.e., compiling, handling) of the data.

The methods described in these elements should have been summarized earlier in element A6. The
purpose here is to provide detailed information on the methods. If the designated methods are
well documented and are readily available to all project participants, citations are adequate;
otherwise, detailed copies of the methods and/or SOPs must accompany the QA Project Plan
either in the text or as attachments.
Table 2. Group B: Data Generation and Acquisition Elements
   B1     Sampling Process Design (Experimental Design)
   B2     Sampling Methods
   B3     Sample Handling and Custody
   B4     Analytical Methods
   B5     Quality Control
   B6     Instrument/Equipment Testing, Inspection, and Maintenance
   B7     Instrument/Equipment Calibration and Frequency
   B8     Inspection/Acceptance of Supplies and Consumables
   B9     Non-direct Measurements
   B10    Data Management
3.3.1   B1 - Sampling Process Design (Experimental Design)

       Describe the experimental data generation or data collection design for the project,
including as appropriate:

        •       the types and numbers of samples required,
        •       the design of the sampling network,
        •       the sampling locations and frequencies,
        •       sample matrices,
        •       measurement parameters of interest, and
        •       the rationale for the design (an illustrative sample-size sketch follows this list).
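       For illustration only (this is not a requirement of this document), one common way to
document the rationale for the number of samples, when the objective is to estimate a mean, is
the normal-approximation sample-size formula below; the margin of error d, the assumed
standard deviation sigma, and the confidence level are project-specific assumptions that would
come out of systematic planning (e.g., the DQO Process), not values prescribed here.

       \[ n \;\ge\; \left( \frac{z_{1-\alpha/2}\,\sigma}{d} \right)^{2} \]

where z_{1-alpha/2} is the standard normal quantile for the chosen confidence level (about 1.96
for 95% confidence). For example, an assumed sigma of 2 ug/L and a desired margin of error of
1 ug/L give n >= (1.96 x 2 / 1)^2, or about 16 samples.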
                                                                                      Final
EPA QA/R-5                                  16                                  March 2001

-------
3.3.2   B2 - Sampling Methods

       Describe the procedures for collecting samples and identify the sampling methods and
equipment, including any implementation requirements, sample preservation requirements,
decontamination procedures, and materials needed for projects involving physical sampling.
Where appropriate, identify sampling methods by number, date, and regulatory citation.  If a
method allows the user to select from various options, then the method citations should state
exactly which options are being selected.  Describe specific performance requirements for the
method. For each sampling method, identify any support facilities needed. The discussion should
also address what to do when a failure in the sampling or measurement system occurs, who is
responsible for corrective action, and how the effectiveness of the corrective action shall be
determined and documented.

       Describe the process for the preparation and decontamination of sampling equipment,
including the disposal of decontamination by-products; the selection and preparation of sample
containers,  sample volumes, and preservation methods; and maximum holding times to sample
extraction and/or analysis.

3.3.3   B3 - Sample Handling and Custody

       Describe the requirements for sample handling and custody in the field, laboratory, and
transport, taking into account the nature of the samples, the maximum allowable sample holding
times before extraction or analysis, and available shipping options and  schedules for projects
involving physical sampling. Sample handling includes packaging, shipment from the site, and
storage at the laboratory. Examples of sample labels, custody forms, and  sample custody logs
should be included.

3.3.4   B4 - Analytical Methods

       Identify the analytical methods and equipment required, including sub-sampling or
extraction methods, laboratory decontamination procedures and materials (such as in the case of
hazardous or radioactive samples), waste disposal requirements (if any), and any specific
performance requirements for the method. Where appropriate, analytical methods may be
identified by number, date, and regulatory citation. Address what to do when a failure in the
analytical system occurs, who is responsible for corrective action, and how the effectiveness of the
corrective action shall be determined and documented.  Specify the laboratory turnaround time
needed, if important to the project schedule.

       List any method performance standards. If a method allows the user to select from
various options, then the method citations should state exactly which options are being selected.
For non-standard method applications,  such as for unusual sample matrices and situations,
appropriate method performance study information is needed to confirm the performance of the
                                                                                    Final
EPAQA/R-5                                 17                                  March 2001

-------
method for the particular matrix. If previous performance studies are not available, they must be
developed during the project and included as part of the project results.

3.3.5  B5 - Quality Control

       Identify QC activities needed for each sampling, analysis, or measurement technique.  For
each required QC activity, list the associated method or procedure, acceptance criteria, and
corrective action. Because standard methods are often vague or incomplete in specifying QC
requirements, simply relying on the cited method to provide this information is usually insufficient.
QC activities for the field and the laboratory include, but are not limited to, the use of blanks,
duplicates, matrix spikes, laboratory control samples,  surrogates, or second column confirmation.
State the frequency of analysis for each type of QC activity, and the sources and levels of the
spike compounds. State or reference the required control limits for each QC activity and corrective action
required when control limits are exceeded and how the effectiveness of the corrective action shall
be determined and documented.

       Describe or reference the procedures to be used to calculate applicable statistics (e.g.,
precision and bias). Copies of the formulas are acceptable as long as the accompanying narrative
or explanation specifies clearly how the calculations will address potentially difficult situations
such as missing data values, "less than" or "greater than" values, and other common data
qualifiers.
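
       As an illustration only (the specific statistics are a project-level choice, not a requirement
of this document), duplicate precision is commonly expressed as relative percent difference
(RPD) and spike bias as percent recovery (%R):

       \[ \mathrm{RPD} = \frac{|x_{1} - x_{2}|}{(x_{1} + x_{2})/2} \times 100\%, \qquad
          \%R = \frac{C_{\mathrm{spiked}} - C_{\mathrm{unspiked}}}{C_{\mathrm{added}}} \times 100\% \]

where x_1 and x_2 are the duplicate results, C_spiked and C_unspiked are the measured
concentrations of the spiked and unspiked aliquots, and C_added is the known concentration of
the spike added. The accompanying narrative would still need to state how non-detects and
other qualified values enter these calculations.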

3.3.6  B6 - Instrument/Equipment Testing, Inspection, and Maintenance

       Describe how inspections and acceptance testing of instruments, equipment, and their
components affecting quality will be performed and documented to assure their intended use as
specified. Identify and discuss the procedure by which final acceptance will be performed by
independent personnel (e.g., personnel other than those performing the work) and/or by the EPA
project manager.  Describe how deficiencies are to be resolved, when re-inspection will be
performed, and how the effectiveness of the corrective action shall be determined and
documented.

       Describe or reference how periodic preventive and corrective maintenance of
measurement or test equipment or other systems and their components affecting quality shall be
performed to ensure availability and satisfactory performance of the systems. Identify the
equipment and/or systems  requiring periodic maintenance.  Discuss how the availability of critical
spare parts, identified in the operating guidance and/or design specifications of the systems, will
be assured and maintained.

3.3.7  B7 - Instrument/Equipment Calibration and Frequency

       Identify all tools, gauges, instruments, and other sampling, measuring, and test equipment
used for data generation or collection activities affecting quality that must be controlled and, at

                                                                                      Final
EPA QA/R-5                                  18                                  March 2001

-------
specified periods, calibrated to maintain performance within specified limits. Describe or
reference how calibration will be conducted using certified equipment and/or standards with
known valid relationships to nationally recognized performance standards. If no such nationally
recognized standards exist, document the basis for the calibration.  Identify the certified
equipment and/or standards used for calibration.  Indicate how records of calibration shall be
maintained and be traceable to the instrument.
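
       The sketch below is offered only as a hypothetical illustration of how electronically
maintained calibration records could be checked against a project-defined recalibration interval
and acceptance tolerance; the record fields, the 30-day interval, and the 5 percent tolerance are
assumptions made for the example, not values prescribed by this document.

    from dataclasses import dataclass
    from datetime import date, timedelta

    @dataclass
    class CalibrationRecord:
        """One calibration event for an instrument (hypothetical fields)."""
        instrument_id: str      # ties the record back to the instrument
        calibrated_on: date
        measured_value: float   # instrument response to the reference standard
        reference_value: float  # certified value of the reference standard

    def calibration_current(records, instrument_id, as_of,
                            max_age_days=30, tolerance_pct=5.0):
        """Return True if the most recent calibration for the instrument is
        recent enough and within the assumed percent-difference tolerance."""
        own = [r for r in records if r.instrument_id == instrument_id]
        if not own:
            return False
        latest = max(own, key=lambda r: r.calibrated_on)
        age_ok = (as_of - latest.calibrated_on) <= timedelta(days=max_age_days)
        pct_diff = abs(latest.measured_value - latest.reference_value) \
                   / latest.reference_value * 100.0
        return age_ok and pct_diff <= tolerance_pct

    # Example with made-up records for a hypothetical dissolved oxygen meter:
    records = [CalibrationRecord("DO-meter-01", date(2001, 3, 1), 8.05, 8.00)]
    print(calibration_current(records, "DO-meter-01", as_of=date(2001, 3, 15)))  # True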

3.3.8   B8 - Inspection/Acceptance of Supplies and Consumables

       Describe how and by whom supplies and consumables (e.g., standard materials and
solutions, sample bottles, calibration gases, reagents, hoses, deionized water, potable water,
electronic data storage media) shall be inspected and accepted for use in the project.  State
acceptance criteria for such supplies and consumables.

3.3.9   B9 - Non-direct Measurements

       Identify any types of data needed  for project implementation or decision making that are
obtained from non-measurement sources  such as computer data bases, programs, literature files,
and historical data bases. Describe  the intended use of the data.  Define the acceptance criteria
for the use of such data in the project and specify any limitations on the use of the data.

3.3.10 B10 - Data Management

       Describe the project data management process, tracing the path of the data from their
generation to their final use or storage (e.g., the field, the office, the laboratory).  Describe or
reference the standard record-keeping procedures, document control system, and the approach
used for data storage and retrieval on electronic media.  Discuss the control mechanism for
detecting and correcting errors and  for preventing loss of data during data reduction, data
reporting, and data entry to forms, reports, and databases. Provide examples of any forms or
checklists to be used.
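
       As a purely illustrative sketch (the field names and acceptance ranges shown are
hypothetical and would come from the project's own planning documents), an automated
screening step of the following kind can support the error-detection mechanism described above
during data entry and data reduction:

    # Minimal sketch of automated error detection during data entry/reduction.
    # Field names and acceptance ranges are hypothetical examples.
    ACCEPTANCE_RANGES = {
        "ph": (0.0, 14.0),               # standard pH units
        "temperature_c": (-5.0, 40.0),
        "dissolved_oxygen_mg_l": (0.0, 20.0),
    }

    def screen_record(record):
        """Return a list of human-readable problems found in one data record."""
        problems = []
        for field, (low, high) in ACCEPTANCE_RANGES.items():
            value = record.get(field)
            if value is None:
                problems.append(f"{field}: missing value")
            elif not (low <= value <= high):
                problems.append(f"{field}: {value} outside range [{low}, {high}]")
        return problems

    # Example: one record containing an apparent transcription error in pH.
    print(screen_record({"ph": 71.2, "temperature_c": 12.4,
                         "dissolved_oxygen_mg_l": 9.8}))

Records flagged in this way would still be resolved through the documented correction
procedure rather than silently edited.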

       Identify and describe all  data handling equipment and procedures to process, compile, and
analyze the data. This includes procedures for addressing data generated as part of the project as
well as data from other sources.  Include  any required computer hardware and software and
address any specific performance requirements for the hardware/software configuration used.
Describe the procedures that will be followed to demonstrate acceptability of the
hardware/software configuration required.

       Describe the process for assuring that applicable Agency information resource
management requirements (EPA Directive 2100) are satisfied (EPA QA Project Plans only). If
other Agency data management requirements are applicable, such as the Chemical Abstract
Service Registry Number Data Standard  (EPA Order 2180.1), Data Standards for the Electronic

                                                                                     Final
EPAQA/R-5                                  19                                 March 2001

-------
Transmission of Laboratory Measurement Results (EPA Order 2180.2), the Minimum Set of Data
Elements for Ground-Water Quality (EPA Order 7500.1 A), or new data standards as they are
issued by EPA, discuss how these requirements are addressed.

3.4    GROUP C: ASSESSMENT AND OVERSIGHT

       The elements in this group (Table 3) address the activities for assessing the effectiveness
of project implementation and associated QA and QC activities. The purpose of assessment is to
ensure that the QA Project Plan is implemented as prescribed.
Table 3. Group C: Assessment and Oversight Elements
   C1    Assessments and Response Actions
   C2    Reports to Management
3.4.1   C1 - Assessments and Response Actions

       Describe each assessment to be used in the project including the frequency and type.
Assessments include, but are not limited to, surveillance, management systems reviews, readiness
reviews, technical systems audits, performance evaluations, audits of data quality, and data quality
assessments. Discuss the information expected and the success criteria (i.e., goals,  performance
objectives, acceptance criteria specifications, etc.) for each assessment proposed. List the
approximate schedule of assessment activities.  For any planned self-assessments (utilizing
personnel from within the project groups), identify potential participants and their exact
relationship  within the project organization.  For independent assessments, identify  the
organization and person(s) that shall perform the assessments if this information is available.
Describe how and to whom the results of each assessment shall be reported.

       Define the scope of authority of the assessors, including stop work orders, and when
assessors are authorized to act.

       Discuss how response actions to assessment findings, including corrective actions for
deficiencies  and other non-conforming conditions, are to be addressed and by whom. Include
details on how the corrective actions will be verified and documented.

3.4.2   C2 - Reports to Management

       Identify the frequency and distribution of reports issued to inform management (EPA or
otherwise) of the project status; for example, reports on the results of performance evaluations
and system audits; results of periodic data quality assessments; and significant quality assurance
                                                                                      Final
EPA QA/R-5                                  20                                  March 2001

-------
problems and recommended solutions.  Identify the preparer and the recipients of the reports, and
any specific actions recipients are expected to take as a result of the reports.

3.5    GROUP D:  DATA VALIDATION AND USABILITY

       The elements in this group (Table 4) address the QA activities that occur after the data
collection phase of the project is completed.  Implementation of these elements determines
whether or not the data conform to the specified criteria, thus satisfying the project objectives.
Table 4. Group D: Data Validation and Usability Elements
   D1    Data Review, Verification, and Validation
   D2    Verification and Validation Methods
   D3    Reconciliation with User Requirements
3.5.1   D1 - Data Review, Verification, and Validation

       State the criteria used to review and validate - that is, accept, reject, or qualify - data, in
an objective and consistent manner.

3.5.2   D2 - Verification and Validation Methods

       Describe the process to be used for verifying and validating data, including the chain-of-
custody for data throughout the life of the project or task.  Discuss how issues shall be resolved
and the authorities for resolving such issues. Describe how the results are conveyed to data users.
Precisely define and interpret how validation issues differ from verification issues for this project.
Provide examples of any forms or checklists to be used. Identify any project-specific calculations
required.
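
       For illustration only, the fragment below (Python) separates verification checks (is the
record complete and correctly transcribed?) from validation checks (does the result meet the
project's performance criteria?), since the two must be defined and interpreted separately for the
project. All field names and criteria are hypothetical.

    # Minimal sketch only: hypothetical verification vs. validation checks for element D2.
    record = {"sample_id": "GL-001", "result": 4.2, "units": "ug/L",
              "method": "hypothetical method", "lab_duplicate_rpd": 12.0}

    def verify(rec):
        """Verification: completeness and transcription checks on the data record."""
        required = ("sample_id", "result", "units", "method")
        return [f"missing field: {f}" for f in required if rec.get(f) in (None, "")]

    def validate(rec, max_duplicate_rpd=20.0):
        """Validation: does the result meet project-specific performance criteria?"""
        issues = []
        if rec.get("lab_duplicate_rpd", 0.0) > max_duplicate_rpd:
            issues.append("laboratory duplicate RPD exceeds project criterion")
        return issues

    print(verify(record), validate(record))   # -> [] [] (record passes both checks)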

3.5.3   D3 - Reconciliation with User Requirements

       Describe how the results obtained from the project or task will be reconciled with the
requirements defined by the data user or decision maker. Outline the proposed methods to
analyze the data and determine possible anomalies or departures from assumptions established in
the planning phase of data collection. Describe how reconciliation with user requirements will be
documented, issues will be resolved, and how limitations on the use of the data will be reported  to
decision makers.
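
       For illustration only, the sketch below (Python) shows one simple reconciliation check of the
kind performed during data quality assessment: comparing measured concentrations against a
planning-phase action level using a 95% upper confidence limit on the mean. The data, action level,
and tabulated t value are hypothetical.

    # Minimal sketch only: hypothetical reconciliation check for element D3.
    from statistics import mean, stdev
    from math import sqrt

    results = [4.2, 3.8, 5.1, 4.6, 4.9, 4.4, 5.3, 4.0]   # hypothetical concentrations
    action_level = 5.0                                    # set during systematic planning
    t_critical = 1.895                                    # one-sided 95%, df = 7, from a t table

    n = len(results)
    xbar = mean(results)
    ucl95 = xbar + t_critical * stdev(results) / sqrt(n)  # 95% upper confidence limit on the mean

    if ucl95 < action_level:
        conclusion = "mean concentration is below the action level with 95% confidence"
    else:
        conclusion = "cannot conclude the mean is below the action level; report this limitation"
    print(f"n={n}, mean={xbar:.2f}, 95% UCL={ucl95:.2f}: {conclusion}")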

-------
                                   REFERENCES

40 CFR 30, Code of Federal Regulations, "Grants and Agreements With Institutions of Higher
       Education, Hospitals, and Other Non-Profit Organizations."

40 CFR 31, Code of Federal Regulations, "Uniform Administrative Requirements for Grants and
       Cooperative Agreements to State and Local Governments."

40 CFR 35, Code of Federal Regulations, "State and Local Assistance."

48 CFR 46, Code of Federal Regulations, "Federal Acquisition Regulations."

ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for Environmental
       Data Collection and Environmental Technology Programs, American National Standard,
       January 1995.

EPA Directive 2100 (1998), Information Resources Management Policy Manual, U.S.
       Environmental Protection Agency, Washington, DC.

EPA Order 2180.1 (June 1987), Chemical Abstract Service Registry Number Data Standard,
       U.S. Environmental Protection Agency, Washington, DC.

EPA Order 2180.2 (December 1988), Data Standards for the Electronic Transmission of
       Laboratory Measurement Results, U.S. Environmental Protection Agency, Washington,
       DC.

EPA Order 5360 A1 (May 2000), EPA Quality Manual for Environmental Programs, U.S.
       Environmental Protection Agency, Washington, DC.

EPA Order 5360.1 A2 (May 2000), Policy and Program Requirements for the Mandatory
       Agency-wide Quality System, U.S. Environmental Protection Agency, Washington, DC.

EPA Order 7500.1 A (October 1992), Minimum Set of Data Elements for Ground-Water Quality,
       U.S. Environmental Protection Agency, Washington, DC.

U.S. Environmental Protection Agency, 2001.  EPA Requirements for Quality Management
       Plans (QA/R-2), EPA/240/B-01/002, Office of Environmental Information.

U.S. Environmental Protection Agency, 2000a. Guidance for Data Quality Assessment:
       Practical Methods for Data Analysis (QA/G-9), EPA/600/R-96/084, Office of
       Environmental Information.

-------
U.S. Environmental Protection Agency, 2000b. Guidance for the Data Quality Objectives
      Process (QA/G-4), EPA/600/R-96/055, Office of Environmental Information.

U.S. Environmental Protection Agency, 1998. Guidance for Quality Assurance Project Plans
      (QA/G-5), EPA/600/R-98/018, Office of Research and Development.

U.S. Environmental Protection Agency, 1995. Guidance for the Preparation of Standard
      Operating Procedures (SOPs) for Quality-Related Documents (QA/G-6), EPA/600/R-
      96/027, Office of Research and Development.

U.S. Environmental Protection Agency, 1980. Interim Guidelines and Specifications for
      Preparing Quality Assurance Project Plans, QAMS-005/80, Office of Research and
      Development.

-------
                                    APPENDIX A
           CROSSWALKS AMONG QUALITY ASSURANCE DOCUMENTS

A.1    BACKGROUND

       This appendix contains crosswalks between this document and other QA planning
documents. The first crosswalk compares this requirements document with its predecessor
document, QAMS 005/80, Interim Guidelines and Specifications for Preparing Quality
Assurance Project Plans (EPA 1980). The second crosswalk compares the elements of the QA
Project Plan defined in this document with the steps defined in Guidance for the Data Quality
Objectives Process (QA/G-4) (EPA 2000b), the Agency's preferred systematic planning process
for environmental decision making. This crosswalk is provided to assist the reader in determining
how the outputs from the DQO Process can be integrated into a QA Project Plan.
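
       For illustration only, the fragment below (Python) shows how a few rows of the Section A.3
crosswalk could be kept in machine-readable form so that project planners can look up which DQO
Process outputs feed a given QA Project Plan element; the entries are a subset copied from the
crosswalk, not an addition to it.

    # Minimal sketch only: selected QA Project Plan elements mapped to DQO Process steps (Section A.3).
    qapp_to_dqo = {
        "A5 Problem Definition/Background": ["Step 1: State the Problem",
                                             "Step 2: Identify the Decision"],
        "A7 Quality Objectives and Criteria": ["Step 4: Define the Boundaries",
                                               "Step 5: Develop a Decision Rule",
                                               "Step 6: Specify Limits on Decision Errors"],
        "B1 Sampling Process Design": ["Step 5: Develop a Decision Rule",
                                       "Step 7: Optimize the Design for Obtaining Data"],
        "D3 Reconciliation with User Requirements": ["Step 7: Optimize the Design for Obtaining Data"],
    }

    # Example: list the DQO steps whose outputs should appear in element A7.
    for step in qapp_to_dqo["A7 Quality Objectives and Criteria"]:
        print(step)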

A.2    CROSSWALK BETWEEN EPA QA/R-5 AND QAMS-005/80
QAMS-005/80 ELEMENTS                                   QA/R-5 ELEMENTS
1.0  Title Page with Provision for                     A1  Title and Approval Sheet
     Approval Signatures
2.0  Table of Contents                                 A2  Table of Contents
3.0  Project Description                               A5  Problem Definition/Background
                                                       A6  Project/Task Description
4.0  Project Organization and                          A3  Distribution List
     Responsibility                                    A4  Project/Task Organization
                                                       A8  Special Training/Certification
                                                       A9  Documents and Records
5.0  QA Objectives for Measurement                     A7  Quality Objectives and Criteria
     Data (PARCC)
6.0  Sampling Procedures                               B1  Sampling Process Design
                                                       B2  Sampling Methods
7.0  Sample Custody                                    B3  Sample Handling and Custody
8.0  Calibration Procedures and                        B7  Instrument/Equipment Calibration and
     Frequency                                             Frequency

-------
QAMS-005/80 ELEMENTS                                   QA/R-5 ELEMENTS
9.0   Analytical Procedures                            B4   Analytical Methods
10.0  Data Reduction, Validation, and                  D1   Data Review, Verification, and Validation
      Reporting                                        D2   Verification and Validation Methods
                                                       B9   Non-direct Measurements
                                                       B10  Data Management
11.0  Internal Quality Control Checks                  B5   Quality Control
      and Frequency
12.0  Performance and Systems Audits                   C1   Assessments and Response Actions
13.0  Preventive Maintenance                           B6   Instrument/Equipment Testing,
                                                            Inspection, and Maintenance
14.0  Specific Routine Procedures,                     D3   Reconciliation with User Requirements
      Measurement Parameters Involved
15.0  Corrective Action                                C1   Assessments and Response Actions
16.0  QA Reports to Management                         C2   Reports to Management

-------
A.3   CROSSWALK BETWEEN THE DQO PROCESS AND THE QA PROJECT PLAN
Elements                      Requirements                                        DQO Overlap
PROJECT MANAGEMENT
A1  Title and Approval Sheet  Title and approval sheet.                           N/A
A2  Table of Contents         Document control format.                           N/A
A3  Distribution List         Distribution list for the QA Project Plan          Step 1: State the Problem
                              revisions and final guidance.
A4  Project/Task              Identify individuals or organizations              Step 1: State the Problem
    Organization              participating in the project and discuss their
                              roles, responsibilities and organization.
A5  Problem Definition/       1) State the specific problem to be solved or      Step 1: State the Problem
    Background                the decision to be made. 2) Identify the           Step 2: Identify the Decision
                              decision maker and the principal customer for
                              the results.
A6  Project/Task Description  1) Hypothesis test, 2) expected measurements,      Step 1: State the Problem
                              3) ARARs or other appropriate standards,           Step 2: Identify the Decision
                              4) assessment tools (technical audits),            Step 3: Identify the Inputs to the Decision
                              5) work schedule and required reports.             Step 6: Specify Limits on Decision Errors
A7  Quality Objectives and    Decision(s), population parameter of interest,     Step 4: Define the Boundaries
    Criteria                  action level, summary statistics and acceptable    Step 5: Develop a Decision Rule
                              limits on decision errors. Also, scope of the      Step 6: Specify Limits on Decision Errors
                              project (domain or geographical locale).
A8  Special Training/         Identify special training that personnel will      N/A
    Certification             need.
A9  Documents and Records     Itemize the information and records that must      Step 3: Identify the Inputs to the Decision
                              be included in a data report package, including    Step 7: Optimize the Design for Obtaining Data
                              report format and requirements for storage, etc.

-------
Elements                      Requirements                                        DQO Overlap
DATA GENERATION AND ACQUISITION
B1  Sampling Process Design   Outline the experimental design, including         Step 5: Develop a Decision Rule
    (Experimental Design)     sampling design and rationale, sampling            Step 7: Optimize the Design for Obtaining Data
                              frequencies, matrices, and measurement
                              parameter of interest.
B2  Sampling Methods          Sample collection method and approach.             Step 7: Optimize the Design for Obtaining Data
B3  Sample Handling and       Describe the provisions for sample labeling,       N/A
    Custody                   shipment, chain-of-custody forms, procedures
                              for transferring and maintaining custody of
                              samples.
B4  Analytical Methods        Identify analytical method(s) and equipment        Step 3: Identify the Inputs to the Decision
                              for the study, including method performance        Step 7: Optimize the Design for Obtaining Data
                              requirements.
B5  Quality Control           Describe quality control procedures that           Step 3: Identify the Inputs to the Decision
                              should be associated with each sampling and
                              measurement technique. List required checks
                              and corrective action procedures.
B6  Instrument/Equipment      Discuss how inspection and acceptance testing,     Step 3: Identify the Inputs to the Decision
    Testing, Inspection,      including the use of QC samples, must be
    and Maintenance           performed to ensure their intended use as
                              specified by the design.
B7  Instrument/Equipment      Identify tools, gauges and instruments, and        Step 3: Identify the Inputs to the Decision
    Calibration and           other sampling or measurement devices that
    Frequency                 need calibration. Describe how the calibration
                              should be done.
B8  Inspection/Acceptance     Define how and by whom the sampling                N/A
    of Supplies and           supplies and other consumables will be
    Consumables               accepted for use in the project.

-------
Elements                      Requirements                                        DQO Overlap
B9  Non-direct                Define the criteria for the use of non-            Step 1: State the Problem
    Measurements              measurement data, such as data that come           Step 7: Optimize the Design for Obtaining Data
                              from databases or literature.
B10 Data Management           Outline the data management scheme including       Step 3: Identify the Inputs to the Decision
                              the path and storage of the data and the data      Step 7: Optimize the Design for Obtaining Data
                              record-keeping system. Identify all data
                              handling equipment and procedures that will be
                              used to process, compile, and analyze the data.
ASSESSMENT AND OVERSIGHT
C1  Assessments and           Describe the assessment activities needed for      Step 7: Optimize the Design for Obtaining Data
    Response Actions          this project.
C2  Reports to Management     Identify the frequency, content, and               N/A
                              distribution of reports issued to keep
                              management informed.
DATA VALIDATION AND USABILITY
D1  Data Review,              State the criteria used to accept or reject the    Step 7: Optimize the Design for Obtaining Data
    Verification, and         data based on quality.
    Validation
D2  Verification and          Describe the process to be used for verifying      Step 3: Identify the Inputs to the Decision
    Validation Methods        and validating data, including the chain-of-
                              custody for data throughout the lifetime of the
                              project.
D3  Reconciliation With       Describe how results will be evaluated to          Step 7: Optimize the Design for Obtaining Data
    User Requirements         determine if performance criteria have been
                              satisfied.


-------
                                      APPENDIX B

                              TERMS AND DEFINITIONS

assessment - the evaluation process used to measure the performance or effectiveness of a system
and its elements. As used here, assessment is an all-inclusive term used to denote any of the
following: audit, performance evaluation, management systems review, peer review, inspection, or
surveillance.

audit (quality) - a systematic and independent examination to determine whether quality
activities and related results comply with planned arrangements and whether these arrangements
are implemented effectively and are suitable to achieve objectives.

calibration - comparison of a measurement standard, instrument, or item with a standard or
instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those
inaccuracies by adjustments.

chain-of-custody - an unbroken trail of accountability that ensures the physical security of
samples, data, and records.

contractor - any organization or individual that contracts to furnish services or items or perform
work; a supplier in a contractual situation.

data quality assessment - a statistical and scientific evaluation of the data set to determine the
validity and performance of the data collection  design and statistical test, and to determine the
adequacy of the data set for its intended use.

data usability - the process of ensuring or determining whether the quality of the data produced
meets the intended use of the data.

design - specifications, drawings,  design criteria,  and performance requirements.  Also the result
of deliberate planning, analysis, mathematical manipulations, and design processes.

environmental conditions - the description of a physical medium (e.g., air, water, soil, sediment)
or biological system expressed in terms of its physical, chemical, radiological, or biological
characteristics.

environmental data - any measurements or information that describe environmental processes,
location, or conditions; ecological  or health effects and consequences; or the performance of
environmental technology. For EPA, environmental data include information collected directly
from measurements, produced from models, and compiled from other sources such as data bases
or the literature.

-------
environmental data operations - work performed to obtain, use, or report information
pertaining to environmental processes and conditions.

environmental processes - manufactured or natural processes that produce discharges to or that
impact the ambient environment.

environmental programs - work or activities involving the environment, including but not
limited to: characterization of environmental processes and conditions; environmental monitoring;
environmental research and development; the design, construction, and operation of
environmental technologies; and laboratory operations on environmental samples.

environmental technology - an all-inclusive term used to describe pollution control devices and
systems, waste treatment processes and storage facilities, and site remediation technologies and
their components that may be utilized to remove pollutants or contaminants from or prevent them
from entering the environment.  Examples include wet scrubbers (air), soil washing (soil),
granulated activated carbon unit (water), and filtration (air, water). Usually, this term will apply
to hardware-based systems; however, it will also apply to methods or techniques used for
pollution prevention, pollutant reduction, or containment of contamination to prevent further
movement of the contaminants, such as capping, solidification or vitrification, and biological
treatment.

financial assistance - the process by which funds are provided by one organization (usually
government) to another organization for the purpose of performing work or furnishing services or
items.  Financial assistance mechanisms include grants, cooperative agreements, performance
partnership agreements, and government interagency agreements.

graded approach - the process  of basing the level of application of managerial  controls applied
to an item or work according to  the intended use of the results and the degree of confidence
needed in the quality of the results.

independent assessment - an assessment performed by a qualified individual, group, or
organization that is not a part of the organization directly performing and accountable for the
work being assessed.

information resources management - the planning, budgeting, organizing, directing, training
and controls associated with information. The term encompasses both information itself and
related resources such as personnel, equipment, funds and technology.

inspection - an activity such as measuring, examining, testing, or gauging one or more
characteristics of an entity and comparing the results with specified requirements in order to
establish whether conformance is achieved for each characteristic.

-------
management system - a structured, non-technical system describing the policies, objectives,
principles, organizational authority, responsibilities, accountability, and implementation plan of an
organization for conducting work and producing items and services.

method - a body of procedures and techniques for performing an activity (e.g., sampling,
modeling, chemical analysis, quantification) systematically presented in the order in which they are
to be executed.

participant - when used in the context of environmental programs,  an organization, group, or
individual that takes part in the planning and design process and provides special knowledge or
skills to enable the planning and design process to meet its objective.

performance evaluation - a type of audit in which the quantitative  data generated in a
measurement system are obtained independently and compared with routinely obtained data to
evaluate the proficiency of an analyst or laboratory.

quality - the totality of features and characteristics of a product or service that bear on its ability
to meet the stated or implied needs and expectations of the user.

quality assurance (QA) - an integrated system of management  activities involving planning,
implementation, documentation, assessment, reporting, and quality improvement to ensure  that a
process, item, or service is of the type and quality needed and expected by the client.

quality assurance manager - the individual designated as the principal manager within the
organization having management oversight and responsibilities for planning, documenting,
coordinating, and  assessing the effectiveness of the quality system for the organization.

quality assurance project plan - a document describing in comprehensive detail the necessary
QA, QC, and other technical activities that must be implemented to  ensure that the results of the
work performed will satisfy the stated performance criteria.

quality control (QC) - the overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards  to verify that they meet the
stated requirements established by the customer; operational techniques and activities that are
used to fulfill requirements for quality.

quality management - that aspect of the overall management system  of the organization that
determines and implements the quality policy.  Quality management  includes strategic planning,
allocation of resources, and other systematic activities (e.g., planning, implementation,
documentation, and assessment) pertaining to the quality system.

quality management plan - a document that describes a quality system in terms of the
organizational structure, policy and procedures, functional responsibilities of management and


-------
staff, lines of authority, and required interfaces for those planning, implementing, documenting,
and assessing all activities conducted.

quality system - a structured and documented management system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation
plan of an organization for ensuring quality in its work processes, products (items), and services.
The quality system provides the framework for planning, implementing, documenting, and
assessing work performed by the organization and for carrying out required QA and QC activities.

readiness review - a systematic, documented review of the readiness for the start-up or continued
use of a facility, process, or activity. Readiness reviews are typically conducted before proceeding
beyond project milestones and prior to initiation of a major phase of work.

record - a completed document that provides objective evidence of an  item or process. Records
may include photographs,  drawings, magnetic tape, and other data recording media.

specification - a document stating requirements and which refers to or includes drawings or other
relevant documents.  Specifications should indicate the means and the criteria for determining
conformance.

supplier - any individual or organization furnishing items or services or performing work
according to a procurement document or financial assistance agreement. This is an all-inclusive
term used in place of any of the following: vendor, seller, contractor, subcontractor, fabricator, or
consultant.

surveillance (quality) - continual or frequent monitoring and verification of the status of an
entity and the analysis of records to ensure that specified requirements are being fulfilled.

technical systems audit (TSA) - a thorough, systematic, on-site, qualitative audit of facilities,
equipment, personnel, training, procedures, record keeping, data validation, data management,
and reporting aspects of a  system.

validation - confirmation by examination and provision of objective evidence that the particular
requirements for a specific intended use are fulfilled. In design and development, validation
concerns the process of examining a product or result to determine conformance to user needs.

verification - confirmation by examination and provision of objective evidence that specified
requirements have been fulfilled. In design and development, verification concerns the process of
examining a result of a given activity to determine conformance to the  stated requirements for that
activity.

-------
                                Appendix H
                    Project Inventory and Approval Form
                              Grant Agreement
GLNPO Quality Management Plan - Appendix H                                          May 2008

-------
                     PROJECT INVENTORY AND APPROVAL FORM
                                GRANT AGREEMENT
PROJECT TITLE:	
GLNPO ID#                      COMMITMENT PHASE          GRANT #
                                 Routing List (Date and initial each step before proceeding)
	Folder Contents           	1. Project Officer:	Prepare blue folder.
(Left-Incoming   Outgoing-Right)         -  Commitment Notice (CN). Prepare and sign. Choose applicable form
    (Check items you enclose)               in:  ...SHARE\GRANTS\FORMS\COMMITMENT NOTICES\
                                       -  Decision Memo. Prepare original and 2 copies. Initial yellow.
      Decision Memo (R)                   [...SHARE\GRANTS\FORMS\DECISION.wpd]
	  Commitment Notice (R)             -  Award Document Markup. Mark changes for final.
                                       -  Checklist. Address all issues and sign.
	  Mark-up of Award Doc. (L)	2. Team Leader:	
	  Signed Checklist (L)                -  Initial Decision Memo and sign Commitment Notice. If Proposal
	  Application (L)                      amount exceeds approved budgeted amount, you are requesting that the
                                         excess come from:	
     Project Type (check 1)        	3. QA Officer/Peer Review Coordinator (L. Blume)
                                       -  Enter into QA Track: Y or N (circle one)
   [ ] Survey, Study, or                  -  Does Project need Agency Peer Review? Y or N (circle one). If
     Investigation                         questionable, use ...SHARE\GRANTS\FORMS\PeerCheckListl.wpd
                                  4. GLNPO Program Analyst (E. Marie Phillips - Elias Avalos as backup)
   [ ] Research or Demonstration          -  Sign Commitment Notice/Fix Budget issues
                                 	5. Management Advisor (for Subject Area)	
                                       -  Sign Commitment Notice / Initial Decision Memo (see next item)
                                 	6. GLNPO Director
                                       -  Sign Decision Memo. Management Advisor signs when Director is not
                                         available. (Date stamp, log, and file per front office procedures.)
                                 	7. Elias Avalos
                                       -  Deliver CN and Decision Memo to  Budget. "Yellow" of Decision
                                         Memo goes to Doreatha Oliver. CN copy to E. Marie and PO.
                                 	8. Budget and Finance Division (MF-10J) Deborah Harper
	9. Grants Specialist (MCG-10J)	(See reverse)

                                 AWARDS PHASE
____  To NPM
____  Award Letter (R)
____  Award Documents (R), original (clipped) + 4 (stapled) copies
____  Decision Memo (R)
____  Addressed envelope (R)
____  Commitment Notice (L)

      Reference Material
____  Application (L)
____  Final Scope of Work and Budget (L)
____  Administrative Checklist (L)
____  LAN Project Summary (L)

____ 1. Grants Specialist
     -  Prepare Award Document and deliver, w/copy of final Commitment
        Notice (DCN filled in and signed by budget), to Elias Avalos
____ 2. Elias Avalos - Track; deliver to PO; CN copy to Doreatha & E. Marie.
____ 3. Project Officer
     -  Prepare and initial Award Letter
        [...SHARE\GRANTS\FORMS\AWARDLTR.WPD]
     -  Paragraph project summary is in subdirectory (habitat sediment p2
        monitor exotic emerging other) of g:\user\share\grants\
        summary01\(Filename: ______________)
____ 4. Team Leader for Applicable Category
     -  Initial Award Letter
____ 5. Management Advisor
     -  Initial Award Letter
____ 6. Director
     -  Sign "Stop Sign" and initial Award Letter.
     -  Decision Memo and Commitment Notice stay in package. Reference
        Material goes to Doreatha.
     -  Hand-carry Award Package to National Program Manager
RETAIN THIS SLIP AS PART OF OFFICIAL PROJECT FILE

                                               Page 1 of 2

-------
                                            Additional Instructions

Applications. Applications are generally delivered to the Assistance Section (AS). AS will enter into GICs, then
deliver 2 copies of the application, the Administrative Review Checklist, the Draft Award Document and the Project File
to Elias.  Elias will give the File Folder and application to Doreatha Oliver. If the Application comes to the PO, deliver
it immediately to the Grants Specialist. Do not wait to develop a Commitment Notice or Decision Memo.

Checklist.  The PO should mark changes/corrections to the draft Award Document (title, locational information, NPM
title, etc.). If this is or supports a "Survey, Investigation, or Study," do not allow 40 CFR Part 40 to appear in the
"Regulatory Authority" box on page 2 of the Assistance Agreement. Address all issues raised in the Checklist. The
Assistance Section generally does not know whether costs  are necessary, reasonable, and allocable, so they automatically
check "N," meaning that they do not know and it is the project officer's responsibility to make that determination.
Note also the Assistance Section has taken on responsibility for the items marked "Grant Specialist" and "Administrative
Terms and Conditions."  The Assistance Section will be contacting the applicant directly to address their concerns about
these items.  If there are any questions about the checklist,  call the applicable grants specialist immediately to resolve
them.

Draft Award Document. The  PO should verify that the workplan incorporates the requirements specified in the
Application Request Letter.  If it does not, after the PO has discussed the issues with the Applicant, the PO can use the
Decision Memo to request that those items be included as special conditions in the Award Document.

Decision Memo.  Now signed and delivered  during Commitment Phase, instead of a startup memo. See
g:\user\share\grants\forms\decision.wpd

Management signoff. Pages to be signed or initialed should be flagged.  Write appropriate initials on the flags.
Director or Acting Director also need to follow front office procedures (date-stamping, logging, and filing).

During Awards Phase, the Director's office will hand-carry grants packages to the 19th floor. PO can walk through, if
desired, but the PO needs to let the Director's Office and Evelyn Cabrera do appropriate tracking after the Director (or
the Acting Director) signs.

The applicable Management Advisor may not be a PO's supervisor.  Vicki Thomas is the Management Advisor for P2
and Education/Outreach. Dave Cowgill is the Management Advisor for Habitat, Sediments, and Information
Management. Paul Horvatin is  the Management Advisor for Monitoring, QA, Health and Safety, Exotics, and Emerging
Issues. However, except for Management Advisor sign-off, staff work follows "home rule" - use your own staff secretary
for clerical support.

Grants Specialists.
Grants Specialist              States                                              Phone
Darlene Lewis                  IN, MI, MN, WI                                      3-2199
Francisca Ramos                non-R5, including PA, NY, and international         6-5945
George Stone (Team Leader)     IL, OH                                              6-7517
Procedures after NPM signature. After award documents are signed by NPM:
-  they go to the Assistance Section where they wait for the blackout period and are dated.
-  a copy is delivered to Finance and funds are obligated.
-  a copy is delivered to Elias Avalos, who re-delivers to Doreatha Oliver (for File copies) and the Project Officer.
                                                 Page 2 of 2

-------
                                      Appendix I
                          Project Inventory and Approval Form
                                 Interagency Agreement
GLNPO Quality Management Plan - Appendix I                                                 May 2008

-------
                      PROJECT INVENTORY AND APPROVAL FORM
                             INTERAGENCY AGREEMENT

PROJECT TITLE: ____________________________________________________

GLNPO ID # ____________                              IAG # DW-____________
(if applicable)
                                           STARTUP PHASE
Folder Contents                  Routing List (Date and initial each step before proceeding)
(Left-Incoming   Outgoing-Right)	1. Project Officer (	
    (Check items you enclose)

	   Commitment Notice (R)

	   Decision Memo (R)

       Reference Material

	   Proposal (L)

      Workplan/Scope of Work
      (L)

	   Budget (L)

	   Draft of form 1610 (L)
                  -  Prepare blue folder.  ✓ items at left.
                  -  Commitment Notice (CN). Prepare and sign. Choose applicable form in:
                     G:\USER\SHARE\GRANTS\FORMS\COMMIT\
                  -  Decision Memo. Prepare and initial.  Example is in:
                     [G:\USER\SHARE\IAG\FORMS\decision.wpd]
                  -  Prepare draft of form 1610 [G:\USER\SHARE\IAG\FORM\Iagl-3.wp6]
                 2. Team Leader:	
                 -  Initial Decision Memo. Sign Commitment Notice. If amount exceeds
                    approved budgeted amount, the excess comes from:	
            	3. QA Officer/Peer Review Coordinator (Lou Blume)
                 -  Enter into QA Track: Y or N (circle one)
                  -  Does Project need Agency Peer Review? Y or N (circle one). If
                    questionable, use ...SHARE\GRANTS\FORMS\PeerCheckListl.wpd
             4. GLNPO Program Analyst (E. Marie Phillips - Elias Avalos as backup)
                 -  Sign Commitment Notice/Fix Budget issues
            	5. Management Advisor
                 -  Sign Commitment Notice
                 -  Initial Decision Memo "yellow" (see next item)
            	6. GLNPO Director
                 -  Sign Decision Memo.  Management Advisor signs when Director is not
                    available. (Date  stamp, log, and file per front office procedures.)
             7. Elias Avalos
                 -  Deliver CN and Decision Memo to Budget. "Yellow" of Decision Memo
                    goes to Doreatha Oliver.  CN copy to E. Marie and PO.
            	9. Budget and Finance Division (MF-10J) Deborah Harper
            	10. AAB Specialist (Barbara Cash)
                 -  Final IAG, routing slip, and copy of Commitment Notice (signed, final)
                    to Elias Avalos then PO
Folder Contents
	  Decision Memo (R)
      2 Final IAG's (R)

       Reference Material
	  LAN Project Summary (L)
	  Proposal, including final
     scope of work and budget
     (L)
	  Commitment Notice Copy
     (L)

RETAIN THIS SLIP AS PART
OF OFFICIAL PROJECT FILE
                       AWARD PHASE

                  1. Elias Avalos - Track; deliver to PO; CN copy to Doreatha & E. Marie.
                 2. Project Officer
                 -   Paragraph project summary is in subdirectory (habitat sediment p2
                    monitor exotic emerging other) of G:\USER\SHARE\GRANTS\
                     summary.01\(Filename: ______________)
                     Format: G:\USER\SHARE\GRANTS\SUMMARY.00\format.wpd
                 3. Team Leader for Applicable Category - Review. Initial at left
                 4. Management Advisor -  Review.  Initial at left.
                 5. GLNPO Director
                  -   Sign 2 "original" IAG's (page 3)
                 -   Route Slip (copy) and LAN Project Summary to Doreatha
                  6. National Program Manager -  Sign 2 "original" IAG's (page 3)
                 7. AAB Specialist (Barbara Cash)
                    -   Transmit IAG under Signature of Assistance Section Chief
                    -   Return folder, this Slip, and copies  to Doreatha Oliver
                                              Page 1 of 1

-------
                                   Appendix J
                       Quality Assurance Project Plan and
                     Quality Management Plan Checksheets

      The following forms will be used to review all Quality Assurance Project Plans and
Quality Management Plans in which environmental data collection activities are GLNPO's
responsibility. These forms are very similar to review forms used in EPA Regions 10 and 3.
GLNPO Quality Management Plan - Appendix J                                              May 2008

-------
                 UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                     GREAT LAKES NATIONAL PROGRAM OFFICE
                           77 WEST JACKSON BOULEVARD
                               CHICAGO, IL 60604-3590
TO:      , PROJECT OFFICER
THRU:   LOUIS BLUME, QA MANAGER
FROM:
SUBJECT: REVIEW OF""
GRANT #:
DATE:

-------
                UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                        GREAT LAKES NATIONAL PROGRAM OFFICE
                              77 WEST JACKSON BOULEVARD
                                  CHICAGO, IL 60604-3590
         QUALITY ASSURANCE PROJECT PLAN CHECKSHEET
      Revision:
QAPP Category:
                            Project Number:
  GRANT TITLE:
      Author/P.I.:
     Date Review Submitted:
     Date Review Completed:
                     Project Officer:
                     Date Review Requested:
                     Reviewed by:
 Major (X) and/or minor (O) deficiencies, defined here as relevant information that is absent or
 incomplete, were found in the following elements:
          Title & Approval Sheet
          Table of Contents
          Distribution List
          Project/Task Organization
          Problem Definition/Background
          Project/Task Description
          Data Quality Objectives
          Special Training/Certification
          Documentation & Records
          Sampling Process Design
          Sampling Method
          Sample Handling
                      Analytical Methods
                      Quality Control
                      Instrument/Equipment Testing
                      Instrument Calibration & Frequency
                      Inspection/Acceptance for Supplies
                      Data Acquisition (Non-Direct)
                      Data Management
                      Assessments & Response Actions
                      Reports to Management
                      Data Review, Validation, & Verification
                      Validation and Verification Methods
                      Reconciliation with User Requirements
See attached sheets for discussion comments relative to all elements.
Conclusion/Recommendation:
     Acceptable
Acceptable with minor revisions
                                   Unacceptable with major revisions

-------
IA = Included & Acceptable        NI = Not Included
IU = Included & Unacceptable      NA = Not Applicable
Checkbox columns: IA | IU | NI | NA | COMMENTS                                      PAGE 2 OF 5
A1. Title & Approval Sheet
Title
Organization's name
Dated signature of project manager
Dated signature of QA officer
Other signatures, as needed
A2. Table of Contents
A3. Distribution List
A4. Project/Task Organization
Identifies key individuals with their
responsibilities (e.g., data users, decision
makers, project QA manager, Subcontractors,
etc.)
Organization chart shows lines of authority &
reporting responsibilities
A5. Problem Definition/Background
Clearly states problem or decision to be
resolved
Historical & background information
A6. Project/Task Description
Lists measurements to be made
Cites applicable technical, regulatory, or
program-specific quality standards, criteria, or
objectives
Notes special personnel or equipment
requirements
Provides work schedule
Notes required project & QA records/reports
A7. Quality Objectives & Criteria for
Measurement Data
States project objectives and limits, both
qualitatively & quantitatively
States & characterizes measurement quality
objectives as to applicable action levels or
criteria
A8. Special Training Requirements/Certifications
A9. Documentation & Records
Lists information & records to be included in
data report (e.g. raw data, field logs, results of
QC checks, problems encountered)
States requested lab turnaround time
Gives retention time and location for records
and reports
-------
IA = Included & Acceptable        NI = Not Included
IU = Included & Unacceptable      NA = Not Applicable
Checkbox columns: IA | IU | NI | NA | COMMENTS                                      PAGE 3 OF 5
B1. Sampling Process Design (Experimental
Design)
Types and number of samples required
Sampling network design & rationale for
design
Sampling locations & frequency of sampling
Sample matrices
Classification of each measurement parameter
as either critical or needed for information
only
Validation study information, for non-
standard situations
B2. Sampling Method Requirements
Identifies sample collection procedures &
methods
Lists equipment needs
Identifies support facilities
Identifies individuals responsible for
corrective action
B3. Sample Handling & Custody Requirements
Notes sample handling requirements
Notes chain of custody procedures, if
required
B4. Analytical Methods Requirements
Identifies analytical methods to be followed
(with all options) & required equipment
Provides validation information for non-
standard methods
Identifies individuals responsible for
corrective action
B5. Quality Control Requirements
Identifies QC procedures & frequency for
each sampling, analysis, or measurement
technique, as well as associated acceptance
criteria and corrective action
References procedures used to calculate QC
statistics ( e.g., precision, bias, accuracy)
B6. Instrument/Equipment Testing, Inspection,
and Maintenance Requirements
Identifies acceptance testing of sampling and
measurement systems
Describes equipment needing calibration and
frequency for such calibration
Notes availability & location of spare parts
-------
IA = Included & Acceptable        NI = Not Included
IU = Included & Unacceptable      NA = Not Applicable
Checkbox columns: IA | IU | NI | NA | COMMENTS                                      PAGE 4 OF 5
B7. Instrument Calibration & Frequency
Identifies equipment needing calibration and
frequency for such calibration
Notes required calibration standards and/or
equipment
Cites calibration records & manner traceable
to equipment
B8. Inspection/Acceptance Requirements for
Supplies & Consumables
States acceptance criteria for supplies &
consumables
Notes responsible individuals
B9. Data Acquisition Requirements for Non-Direct
Measurements
Identifies type of data needed from non-
measurement sources (e.g., computer data
bases and literature files), along with
acceptance criteria for their use
Describes any limitations of such data
B10. Data Management
Describes standard record keeping & data
storage and retrieval requirements
Checklist or standard forms attached to QAPP
Describes data handling equipment &
procedures used to process, compile and
analyze data ( e.g., required computer
hardware & software)
C1. Assessments & Response Actions
Lists required number, frequency, & type of
assessments, with approximate date & names
of responsible personnel
Identifies individuals responsible for
corrective actions
C2. Reports to Management
Identifies the preparer and recipients of
reports
Identifies frequency and distribution of
reports for:
Project status
Results of performance evaluations & audits
Results of periodic data quality assessments
Any significant QA problems
-------
IA = Included & Acceptable        NI = Not Included
IU = Included & Unacceptable      NA = Not Applicable
Checkbox columns: IA | IU | NI | NA | COMMENTS                                      PAGE 5 OF 5
D1. Data Review, Validation, & Verification
States criteria for accepting, rejecting, or
qualifying data
Includes project-specific calculations or
algorithms
D2. Validation and Verification Methods
Describes process for data validation and
verification
Identifies issue resolution procedure and
responsible individuals
Identifies method for conveying these results
to data users
D3. Reconciliation with User Requirements
Describes process for reconciling with DQOs
and reporting limitations on use of data
-------
                 UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                      GREAT LAKES NATIONAL PROGRAM OFFICE
                           77 WEST JACKSON BOULEVARD
                              CHICAGO, IL 60604-3590
TO:      , PROJECT OFFICER
THRU:   LOUIS BLUME, QA MANAGER
FROM:
SUBJECT: REVIEW OF""
GRANT #:
DATE:

-------
                UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                       GREAT LAKES NATIONAL PROGRAM OFFICE
                             77 WEST JACKSON BOULEVARD
                                 CHICAGO, IL 60604-3590
      QUALITY MANAGEMENT PLAN CHECKSHEET
      Revision:
                                     Project Number:
  TITLE:


  Author/P.I.:
  Date Review Submitted:
  Date Review Completed:
                              Project Officer:
                              Date Review Requested:
                              Reviewed by:
 Major (X) and/or minor (O) deficiencies, defined here as relevant information that is
 absent or incomplete, were found in the following elements:
 Management & Organization
 Organization's QA Policy Statement
 Distribution List
 QA Manager/Staff Authorities
 Technical Activities/Programs
 Quality System Components
 Principal Components
 Tools for Implementing Components
 Personnel Qualification and Training
                            Procurement of Items and Services
                            Procurement Document Approval
                            Solicitation Response Approval
                            Documents & Records
                            Computer Hardware & Software
                            Planning
                            Implementation of Work Processes
                            Assessment & Response
                            Quality Improvement
See attached sheets for discussion comments relative to all elements.
Conclusion/Recommendation:
 Acceptable
Acceptable with minor revisions
Unacceptable with major revisions

-------
IA = Included & Acceptable NI = Not included
IU = Included & Unacceptable NA = Not Applicable
A1. Management & Organization
Title Page
Organization's name
Dated signature of project manager
Dated signature of QA officer
Other signatures, as needed
A2. Organization's QA Policy Statement
Importance of QA and QC
General objectives and goals of a quality
system
Policy for resource allocation for the quality
system
A3. Distribution List
Identifies key individuals with their
responsibilities (e.g., data users, decision
makers, project QA manager, Subcontractors,
etc.)
Organization chart shows lines of authority &
reporting responsibilities
A4. QA Manager/Staff Authorities
Documents the organizational independence of
the QA Manager from groups generating,
compiling, and evaluating environmental data
Indicates how the organization will ensure that
QA personnel will have access to the
appropriate levels of management in order to
plan, assess, and improve the organization's
quality system
A5. Technical Activities/Programs
States the specific programs that require
quality management controls
States where oversight of delegated,
contracted, or other extramural programs is
needed to assure data quality
States where and how internal coordination of
QA and QC activities among the group's
organizational units needs to occur
(Checkbox columns: IA | IU | NI | NA | COMMENTS)
-------
IA = Included & Acceptable NI = Not included
IU = Included & Unacceptable NA = Not Applicable
B1. Quality System Components
Description of the organization's quality
system
B2. Principal Components
A description of the principal components
of the organization's quality system,
including the roles and responsibilities
of management and staff:
Quality system documentation
Annual systems review
Management assessments
Training
Systematic planning of projects
Project-specific quality documentation
Project and data assessments
B3. Tools for Implementing Components
QMP's
Quality system audits
Training plans
QAPP
Data verification and validation
A list of components of the organization that
develop QAPP's in support of the
organization's quality system and the review
and approval procedures for such
documentation
A discussion of how roles and responsibilities
for the principal components of the quality
system are incorporated into performance
standards
C1. Personnel Qualification & Training
A statement of the policy regarding training
for management and staff
A description of the process(es), including
the roles, responsibilities, and authorities of
management and staff for:
Identifying, ensuring, and documenting that
personnel have maintained the appropriate
knowledge, skill, and statutory, regulatory,
professional or other certifications,
accreditation, licenses, or other formal
qualification necessary
Identifying the need for retraining based on
changing requirements
(Checkbox columns: IA | IU | NI | NA | COMMENTS)
-------
IA = Included & Acceptable NI = Not included
IU = Included & Unacceptable NA = Not Applicable
D1. Procurement of Items & Services
Describes or references the process(es),
including the roles, responsibilities, and
authorities of management and staff,
pertaining to all appropriate procurement
documents or extramural agreements,
including grants, cooperative agreements, and
contracted and subcontracted activities,
involving or affecting environmental programs
D2. Procurement Document Approval
Reviewing and approving procurement
documents to ensure that procurement
documents are accurate, complete, and
clearly describe:
The item or service needed
The associated technical and quality
requirements
The quality system elements for which the
supplier is responsible
How the supplier's conformance to the
customer's requirements will be verified
D3. Solicitation Response Approval
Review and approval of all applicable
responses to solicitations to ensure that
these documents:
Satisfy all technical and quality requirements
Provide evidence of the supplier's capability
to satisfy EPA quality system requirements as
defined in the extramural agreement or
applicable Federal Regulation
Ensuring that procurement items and services
are of acceptable quality, including the review
of objective evidence of quality for applicable
items and services furnished by suppliers and
subcontractors, source selection, source
inspections, supplier audits, and examination
of deliverables
Review and approval procedures for
mandatory quality-related documentation (e.g.,
QMP's or QAPP's) from suppliers
Policies and criteria for delegations of EPA
authority to review and approve mandatory
quality-related documentation (e.g., QMP's or
QAPP's) from suppliers consistent with
Chapter 2.2 of EPA Order 5360
(Checkbox columns: IA | IU | NI | NA | COMMENTS)
-------
IA = Included & Acceptable NI = Not included
IU = Included & Unacceptable NA = Not Applicable
Ensures that EPA quality-related contracting
policies are followed, as defined by the Federal
Acquisition Regulations, the Office of Federal
Procurement Policy, and the EPA Contracts
Management Manual [EPA Order 1900 (EPA 1998)]
E1. Documents and Records
Describes or references the process(es),
including roles, responsibilities, and
authorities of management and staff, for:
Identifying quality-related documents and
records requiring control
Ensuring that records and documents
accurately reflect completed work
Establishing chain of custody
Ensuring compliance with all applicable
statutory, regulatory, and EPA requirements
for documents and records
Establishing and implementing confidentiality
procedures for evidentiary records
F1. Computer Hardware and Software
Describes or references the process(es),
including roles, responsibilities, and
authorities of management and staff, for:
Assessing and documenting the impact of
changes to user requirements and/or the
hardware and software
Evaluating purchased hardware and software to
ensure it meets user requirements and complies
with applicable and contractual requirements
and standards
Ensuring that data and information produced
from, or collected by, computers meet
applicable information resource management
requirements and standards
Developing, installing, testing, and documenting
hardware and software used in environmental
programs to ensure it meets technical and quality
requirements and directives from management
Ensuring that applicable EPA requirements for
information resources management are
addressed
(Checkbox columns: IA | IU | NI | NA | COMMENTS)
-------
IA = Included & Acceptable NI = Not included
IU = Included & Unacceptable NA = Not Applicable
G1. Planning
Describes or references the process(es)
including roles, responsibilities, and
authorities of management and staff for
planning environmental data operations
using a systematic planning process which
includes:
Identification and involvement of the project
manager, sponsoring organization and
responsible official, project personnel,
stakeholders, and scientific experts
Description of the project goal, objectives, and
questions and issues to be addressed
Identification of project schedule, resources,
milestones, and any applicable requirements
Identification of the type and quality of data
needed and how the data will be used to
support the project's objective
Specifications of performance criteria for
measuring quality
H1. Implementation of Work Processes
Describes or references the process(es),
including roles, responsibilities, and
authorities of management and staff for:
Ensuring that work is performed according to
approved planning and technical documents
Identification of operations needing procedures,
preparation, review, approval, revision, and
withdrawal of these procedures; and policy for use
Controlling and documenting release, change,
and use of planned procedures, including any
necessary approvals, specific times and points for
implementing changes, removal of obsolete
documentation from work areas, and verification
that the changes are made as prescribed
I1. Assessment and Response
Describes or references the process(es)
including roles, responsibilities, and
authorities of management and staff for:
Assesses the adequacy of the quality system at
least annually
Plans, implements, and documents
assessments and reporting assessment results
to management including how to select an
assessment tool, the expected frequency of
their application to environmental programs,
and the roles and responsibilities of the
assessors
(Checkbox columns: IA | IU | NI | NA | COMMENTS)
-------
IA = Included & Acceptable NI = Not included
IU = Included & Unacceptable NA = Not Applicable
Determines the level of competence,
experience, and training necessary to ensure
that personnel conducting assessments are
technically knowledgeable, have no real or
perceived conflict of interest, and have no
direct involvement or responsibility for the
work being assessed
Ensures that personnel conducting assessments
have sufficient access to programs, managers,
documents, and records
Management's review and response to
assessment
Identifies how and when corrective actions are
to be taken in response to the findings of the
assessment, ensuring corrective actions are
made promptly, confirming the
implementation and effectiveness of any
corrective action, and documenting such
actions
Addresses any disputes encountered as a result
of assessments
J1. Quality Improvement
Identifies who (organizationally) is responsible
for identifying, planning, implementing, and
evaluating the effectiveness of quality
improvement activities and describes the
process to ensure continuous quality
improvement, including the roles and
responsibilities of management and staff
Ensures that conditions adverse to quality
are:
Prevented
Identified promptly including a determination
of the nature and extent of the problem
Corrected as soon as practical, including
implementing appropriate corrective actions
and actions to prevent reoccurrence
Documented, including all corrective actions taken
Tracked to closure
Encourages staff at all levels to establish
communications between customers and
suppliers, identify process improvement
opportunities, and identify and offer solutions
to problems
(Checkbox columns: IA | IU | NI | NA | COMMENTS)
-------
                               Appendix K
                 Examples of GLNPO's Graded Approaches to
                       Quality System Documentation
GLNPO Quality Management Plan - Appendix K                                         May 2008

-------
                                 Great Lakes Commission
                            QUALITY MANAGEMENT PLAN

                                       Approval Sheet
Michael J. Donahue - President/CEO
Signature / Date

Julie Wagemakers - GLC Information Manager/Quality Assurance Manager
Signature / Date

Stuart Eddy - GIS/Systems Specialist
Signature / Date

Derek Moy - GIS Specialist 2/Internet Specialist
Signature / Date

                      - EPA Project Officer
Signature / Date

Lou Blume - US EPA Great Lakes National Program Office Quality Assurance Manager
Signature / Date


                                             -1-

-------
                                    Great Lakes Commission
                              QUALITY MANAGEMENT PLAN
A.1.    Management and Organization
The Great Lakes Commission (Commission) is a binational agency that promotes the orderly, integrated and
comprehensive development, use and conservation of the water and related natural resources of the Great
Lakes basin and St. Lawrence River. Its members include the eight Great Lakes states with Associate Member
status for the Canadian provinces of Ontario and Quebec. The Great Lakes Commission has been a pioneer
in applying principles of sustainability to the development, use and conservation of the natural resources of
the Great Lakes basin and St. Lawrence River. Three principal functions of the Commission include:
information  sharing among the entire Great Lakes-St. Lawrence community; policy research, development
and coordination on issues of regional interest; and advocacy of those positions.

The Commission is uniquely positioned to coordinate, develop, apply and institutionalize a management
support system incorporating ecological indicators, monitoring and comprehensive data sets. Project
participants consist of scientific and policy experts drawn from key U.S. and Canadian federal agencies, state
and provincial agencies, non-governmental organizations, and other interest groups.

A.2.    Quality Assurance
Quality assurance is  a top priority at the Great Lakes Commission (Commission). In order to adequately
manage the Great Lakes as both an economic and environmental resource, managers and decision makers
require the best data available. The Commission serves the eight Great Lakes states and the Provinces of
Ontario and Quebec. Accuracy and consistency are of utmost importance to our membership. The
Commission is mandated to serve its clients with the most accurate and timely products and services available
and does this through constant evaluation and review from its numerous task forces and advisory boards.

In order to assure consistency across the basin and among Commission projects, the following quality
management system is in place:
    *   A quality assurance manager will oversee the quality of all Commission projects.
    *   Each Commission project may also include a project management team comprised of members from
        one or more states or the provinces of Ontario and Quebec, and from the United States federal
        agency that has legislative reporting responsibility or regulatory responsibility. The project
        management team may also include representatives from academic institutions and non-governmental
        organizations.
    *   Results from each project will be scrutinized, tested and reviewed by a review team of experts for
        accuracy and practicality. Review teams will consist of staff from related projects or institutions. For
        example, a coastal wetland project review team may include the International Joint Commission's
        Lake Ontario Study or the University of Minnesota's Natural Resources Research Institute indicator
        development research team.
    *   The results of each project will be reviewed by at least three qualified independent reviewers.

The Commission's quality management system is designed to protect the integrity of each project. The system
will specifically assure that:
    *   Project results are accurate, reliable, practical and reviewed before release;
    *   Project results are obtained in a timely manner;
    *   Project results are designed in a practical way for ease of implementation; and,
    *   There is a process for solving project problems/issues that may arise throughout the project.

Resources for each project will be allocated to the quality assurance manager, quality assurance team, staff from
related projects, and independent reviewers to ensure the above results.
                                                 -2-

-------
                                    Great Lakes Commission
                              QUALITY MANAGEMENT PLAN
A.3.       Distribution List

Great Lakes Commission Organizational Chart:
[Organizational chart: Commissioners for each Great Lakes state, Ontario and Quebec, and federal
observers; Chair and Vice-Chair; Board of Directors; President/CEO; Program Managers/QA Manager; Staff]
Great Lakes Commission staff and board responsible for different aspects of project quality
assurance:

John Hummer - Program Specialist, works on environmental quality issues including wetlands, Lake MI
monitoring projects, statewide public advisory councils and watershed training projects. Mr. Hummer
supports projects through coordination of correspondence, including returning calls and e-mails, report
writing, and meeting support, and participates in staff reviews and discussions. Mr. Hummer reports to Ric
Lawson.

Ric Lawson - Project Manager, coordinates Lake MI monitoring work at the Great Lakes Commission and
heads up watershed training programs across the basin. Mr. Lawson also supports the wetlands project by
drafting reports, working with the program manager to design and conduct wetlands meetings, and working
with all project participants to develop and maintain materials related to a project. Mr. Lawson reports to Julie
Wagemakers.

Julie Wagemakers - Program Manager, manages various communications  and information management
projects. Ms. Wagemakers ensures that projects are conducted in a high quality, timely and fair manner. She
also works directly with project management teams, Commission Board members and is directly responsible
for Commission staff working on projects. Ms. Wagemakers also acts as the Commission's quality assurance
manager for the coastal wetlands project. Ms. Wagemakers reports to Michael Donahue.

Dr. Michael Donahue - President and CEO, manages all programs and Commission business. Dr. Donahue
oversees projects in terms of ensuring that projects are conducted in a manner consistent with Commission
                                                 -3-

-------
                                    Great Lakes Commission
                              QUALITY MANAGEMENT PLAN

and its members' policies. Dr. Donahue reports to the Executive Committee of the Commission's Board of
Directors.

Dr. Samuel Speck - Director of Ohio DNR, manages all departments and budgets for the Ohio Department
of Natural Resources. Dr. Speck is the vice chair of the Great Lakes Commission and reports to the Chair,
Nathaniel Robinson.

Nathaniel Robinson — Executive Assistant, Wisconsin Technical College System Board. Mr. Robinson is
Chair of the Great Lakes Commission.

A.4.    Quality Assurance (QA) Manager
The Commission's  Quality Assurance Manager is responsible for implementing the  Commission's quality
management system as well as the quality of data resulting from Commission projects. Commission staff,
project management teams, and independent reviewers are responsible to the QA Manager in that they will
assess project work directly related to data collection, analysis, and compilation, and draft quality assurance
plans as necessary. The  QA Manager is directly responsible to the President and CEO of the Commission,
who has final authority on all Commission projects.

A.5.    Technical Activities/Programs
Commission projects involving data collection and/or analysis require quality management controls, and
quality assurance plans will include the following components:

Project Management
Title and Approval Sheet
Table of Contents
Distribution List
Project/Task Organization
Problem Definition/Background
Project/Task Description
Quality Objectives and Criteria
Special Training/Certification
Documents and Records
Data Generation and Acquisition
Sampling Process Design
Sampling Methods
Sample Handling and Custody
Analytical Methods
Quality Control
Instrument/Equipment Testing, Inspection and Maintenance
Instrument/Equipment Calibration and Frequency
Inspection/Acceptance of Supplies and Consumables
Non-direct measures
Data Management

Assessment and Oversight
Assessments and Response Actions
Reports to Management
Data Validation and Usability Elements
Data Review, Verification and Validation
                                                -4-

-------
                                    Great Lakes Commission
                              QUALITY MANAGEMENT PLAN

Verification and Validation Methods
Reconciliation with User Requirements

Commission projects may require requests for proposals (RFP) that result in a selection of project
components from external sources. Proposals resulting from an RFP will undergo several review stages:
    *   The Program Specialist and Project Manager will preliminarily review and log all proposals received.
    *   Proposals will be presented to and reviewed by the project management team.
    *   Proposals qualifying as finalists will be presented to and reviewed by external reviewers. These
        reviewers will be experts with either an advanced academic degree and 5 years of professional
        experience or 10 years of professional experience in the discipline.
    *   The QA Manager will review proposals to determine whether a Quality Assurance Project Plan
        (QAPP) is needed.
    *   Upon completion of the reviews, grants will be processed.
    *   Follow-up reports will be reviewed periodically during the project period.
    *   Site visits will be conducted periodically or as needed to monitor the quality assurance elements of the
        workplan.

Commission quality management staff will ensure that quality assurance measures have been followed before
the release of any data or reports through regular briefings at subcommittees, project management, and
secretariat meetings.

The Commission will conduct a review of the quality management system annually to ensure there are no
problems encountered when finalizing project results.

B.1.    Quality System Components
In addition to the procedures for the content of quality assurance project plans (QAPP) and proposals
received by way of an RFP, the Commission will assess the quality of project work through the following:

    *   Site visits - As described earlier, periodic site visits to ensure projects are adhering to project
        workplans and QAPPs will be conducted by the Project Manager.
    *   Project reports - Project reports will be required twice annually, by both the principal project leaders
        and by Commission staff overseeing quality management issues and project progress.
    *   Tracking - The Commission QA Manager will be responsible for tracking all correspondence relating
        to quality system components.
    *   Approval - All quality checks will be approved by the Commission's QA Manager.

B.2.    Principal Components
Quality System Documentation
All documentation relating to the quality management system, including reviews, reports, Quality Assurance
Project Plans (QAPP), and approval sheets, will be kept on file at Commission offices by project. The QA
Manager will be responsible for all quality system documentation on behalf of each project, including record
keeping, reporting, and signatures for approvals. The Manager is also the point of contact for audits.
Commission project staff will support the QA Manager in drafting reports in consultation with principal
project leads. Commission management is responsible for acceptance  and approval of all quality system
documentation. In consultation with the  QA Manager, Commission management will review and approve
documentation and documentation procedures.

Annual Systems Review
The QA Manager will conduct a review of the Commission's quality management system together with
Commission staff and management on a yearly basis. Corrections or adjustments to the system will be

                                                 -5-

-------
                                    Great Lakes Commission
                              QUALITY MANAGEMENT PLAN

approved by Commission management and board and implemented by Commission staff under the guidance
of the QA Manager.
The QA Manager and the Commission President/CEO will conduct a management assessment as part of the
annual review process. This will include a review of all elements of the quality system, including an assessment
of effectiveness of project QAPPs and the process for quality system implementation.

Training
EPA offers training in quality management system development, review and documentation. The QA
Manager will attend a 1-day EPA QA training course at least once per year. In addition, Commission staff and
the President/CEO will participate in training and briefing sessions offered by the Commission QA Manager.
The QA Manager will serve as a resource for all Commission projects where QAPPs are required.

Where training is required, it is the Commission's policy to offer training to those individuals that request
technical training and have followed up with research on various options. The Commission maintains a highly
trained staff in order to provide state-of-the-art project results.

Systematic Planning of Projects
The project manager, project management team, QA Manager and Commission staff are responsible for
planning and guiding projects. Project planning and guiding includes project initiation, analysis, design,
implementation, and stewardship of all resulting data.

Project Specific Quality Documentation
Project-specific quality documentation is the responsibility of the project manager and the project
management team, with oversight by the QA Manager and Commission management. Documentation for
each project will include a work plan, budget, and QAPP if appropriate.

Project and Data Assessments
A standard checklist (with identical elements for each project) will be used for QAPP and quality assurance
system reviews (provided earlier in this document). The QA Manager is  responsible for consistency of reviews
and documentation of reviews for each project. Reports will be forthcoming from the Commission program
and project managers and the QA Manager on a project by project basis. Projects are reviewed semi-annually.
Data review, compilation, analysis and dissemination are undertaken at the beginning of each project and
reviewed on a quarterly basis for appropriateness and accuracy.

Ongoing Commission programs require annual planning  meetings. New projects must have management
documentation as to how they will serve the Commission's overall strategic plan. The Commission undergoes
an annual project and financial audit.

B.3.        Tools for Implementing Components
Quality Management Plan
The Commission's quality management plan is integral to each project and serves as insurance, a guide and a
framework for project management. The quality system details all elements of a delivery system for products
that are scientifically sound and practical. A process is defined to address how to ensure that project products
are consistent and to resolve problems should they arise.

Quality System Audits
An annual review of the quality system includes an audit of both financial and human resources.

-------
                                     Great Lakes Commission
                              QUALITY MANAGEMENT PLAN

Training Plans
Commission staff and the QA Manager will participate in quality management training courses offered by US
EPA.

Quality Assurance Project Plans (QAPP)
Quality assurance project plans (QAPP) are implemented to ensure quality results for the project. Project
QAPPs, accepted and approved by the QA Manager, are written by project principals. The QA Manager
conducts reviews and records documentation.
Verification and Validation
Data developed in-house by Commission staff will be compiled and released with FGDC-compliant metadata
(Z39.5 compliance). For surveyed data, spatial accuracy will be determined by the capabilities of the GPS unit
used to record data points. For data digitized on-screen from digital representations of topographic
quadrangles or orthophotography, data error will match that of the source imagery. For data created using a
digitizing tablet, a root mean square (RMS) error of 0.003 will be considered adequate for digitizing to proceed.
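
For illustration only, the following short Python sketch shows how an RMS registration error might be computed
from tablet control-point residuals and compared against the 0.003 acceptance value noted above; the function
name, the sample residuals, and the (dx, dy) residual representation are assumptions of this example, not part of
the Commission's documented procedures.

import math

def rms_error(residuals):
    """residuals: list of (dx, dy) offsets, in map units, between digitized
    control points and their known coordinates."""
    if not residuals:
        raise ValueError("at least one control-point residual is required")
    sum_sq = sum(dx * dx + dy * dy for dx, dy in residuals)
    return math.sqrt(sum_sq / len(residuals))

# Hypothetical registration check with four control points
residuals = [(0.001, -0.002), (0.002, 0.001), (-0.001, 0.001), (0.000, -0.002)]
rmse = rms_error(residuals)
print(f"RMS error = {rmse:.4f}")
print("digitizing may proceed" if rmse <= 0.003 else "re-register the source map")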

Data created by other agencies will be included in this project only if accompanied by FGDC-compliant
metadata which confirms that the data meet Commission standards for this project.

Quality Assurance Project Plan Development
The principal project leaders develop quality assurance project plans. The QA Manager approves all QAPPs.
QAPPs must contain the appropriate content identified in the QAPP checklist. The level of detail should be appropriate for
the data quality needs of the specific project. The  QA Manager must ensure that each project QAPP supports
the Commission's quality management plan.

Performance Standards
The QA Manager is responsible for staff evaluations at the Commission and will conduct an evaluation of
project staff and the project management team on an annual basis. Commission staff evaluations include
performance ratings for project work, implementation of the quality management system, and accomplishment
of tasks described within  the system. Within the review of project staff performance, requirements include full
disclosure of project files including project staff evaluations. All project documentation is disclosed to the
Commission for auditing and reporting requirements.

C.1.    Personnel Qualifications and Training
Employee Training
Employees of the Commission may, at the discretion of the President/CEO and QA Manager, receive
training and instruction related to their current responsibilities and/or advancement to more responsible
positions, within the limitations of the project budget. To the extent possible, all training deemed required for
the position shall be conducted during working hours.

Management staff are required to provide annual reviews of staff performance. These evaluations include
training assessments and implementation measures.

Certifications
The QA Manager must approve all applications for licenses, certifications or other regulatory qualifications
needed to conduct specific projects. These costs are borne by project budgets. The QA Manager must also
ensure that all appropriate licenses are obtained for collecting any form of natural resource (collecting permits)
or handling of hazardous materials (such as preservatives that may be used on samples collected).
                                                 -7-

-------
                                    Great Lakes Commission
                              QUALITY MANAGEMENT PLAN

D.1.    Procurement of Items and Services
Request for Proposals (RFP)
A request for proposals will detail requirements for collection, analysis and consolidation of data. Respondents
to the RFP must not only detail a work plan for accomplishing the proposal, they may also be required to develop
a QAPP. Within the QAPP they must indicate any need for procurement of equipment, training or other
related services. QAPPs are reviewed and approved by the QA Manager and the project team.

Contracted/Subcontracted Activities
All proposals requesting contract or subcontract activities will be reviewed and approved by Commission staff
and the project management team. Contractors and subcontractors automatically have the same
responsibilities of project teams, including development and implementation of a QAPP, and observation of
certification and training requirements.

D.2.    Procurement Document Approval
Commission Policy
The Commission will not discriminate against any person, business or organization because of race, color,
religion, sex or national origin of its employees or its ownership. The Commission will take affirmative action
to ensure that all procurement of goods and services is done without regard to race, color, religion, sex or
national origin.

Purchase of Services, Materials, Supplies or Equipment
The Commission is exempt from all state and federal taxes. Vendors shall be advised not to include any tax on
invoices.

Restricted fund purchases must be made in compliance with the terms and conditions of the funding source.

The University of Michigan general stores shall serve as the primary source for purchases. All purchases made
through the University of Michigan must be approved by the President/CEO. The purchase from another
vendor of items available from the University of Michigan must also be approved by the President/CEO.

Procurements not made through the University of Michigan must follow a competitive bid process. In all
cases, affirmative action processes shall be followed. When preparing the requisition for materials, supplies,
equipment, etc., the requester must include a detailed description of the goods or services along with an
estimate of the price. The requisition should also include a date when delivery is required.

If any vendor is habitually late in making deliveries, the vendor may be removed from any further consideration as
a source for Commission purchases.

Purchases not exceeding $25.00 - Supervisory personnel requiring materials, supplies or services can purchase
the items without approval of the President/CEO. An invoice for the purchase must be secured in the name
of the Commission and forwarded to the Financial Officer for payment, or the employee may pay for the item(s)
purchased, secure a receipt and file for reimbursement.

Requests for purchases between $25.00 and $500.00 shall be submitted to the President/CEO for approval in
writing. The Financial  Officer will prepare a purchase order and solicit at least three quotations. This can be
done by telephone as long as records are maintained.

Purchases between $500.00 and $7,500.00 must be approved in writing by the President/CEO. Upon receipt
of such approval, the Financial Officer will prepare a purchase order detailing goods and services requested.

-------
                                      Great Lakes Commission
                               QUALITY MANAGEMENT PLAN

The request for bids will be sent to appropriate vendors. Efforts should be made to obtain at least three bids.
Bids must be made in writing. The lowest cost qualified bidder will be selected.

Purchases of goods and services over $7,500 must be approved by formal action of the Board of Directors.
Upon Board approval to proceed, the financial officer shall prepare invitations for bids for distribution.
Requests  for bids shall be published in a manner likely to attract prospective bidders. There shall be at least 20
days between publication and the date bids are due.

Award of a contract will be made to the bidder with the lowest responsive cost. If, in the discretion of the
President/CEO, there are no acceptable bids, the Commission can re-advertise.
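
The purchasing thresholds above amount to a simple approval-routing rule. The sketch below restates them in
Python purely as an illustration; the function name and the treatment of amounts that fall exactly on a threshold
are assumptions of this example, not Commission policy.

def approval_path(amount):
    """Return the approval and bidding steps implied by a purchase amount (USD)."""
    if amount <= 25.00:
        return ("supervisory purchase without President/CEO approval; invoice or "
                "receipt forwarded to the Financial Officer")
    if amount <= 500.00:
        return ("written President/CEO approval; Financial Officer prepares a "
                "purchase order and solicits at least three quotations")
    if amount <= 7500.00:
        return ("written President/CEO approval; purchase order prepared and at "
                "least three written bids sought; lowest qualified bidder selected")
    return ("formal Board of Directors approval; published invitation for bids "
            "with at least 20 days before bids are due")

for amount in (15.00, 300.00, 5000.00, 12000.00):
    print(f"${amount:,.2f}: {approval_path(amount)}")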

For general fund expenditures:

The Commission may suspend and waive the above provisions requiring competitive bids whenever:
    *   The purchase is to be made from or the contract is to be made with the federal or any state government or
        any agency or political subdivision thereof, or pursuant to any open and bulk purchase contract of any of
        them;
    *   The public or agency requires the immediate delivery of the articles or performance of the service;
    *   Only one source of supply is available;
    *   The equipment to be purchased is of a technical nature and the procurement thereof without advertising is
        necessary in order to assure standardization of equipment and interchangeability of parts in the public
        interest; or
    *   Services are to be provided of a specialized or professional nature.


The financial officer shall provide the necessary forms required to implement the policies contained herein.

Technical and Quality Requirements
All requests for equipment, including technical requests for software, GPS units, nets, etc., are reviewed and approved by
the project management team and the Commission. The project management team assesses the quality of
procurement items.

Supplier Responsibilities
Within the project planning stage of each proposal, project teams must identify specific suppliers for procurements
indicated in their proposals. The Commission and project management team will approve all suppliers.

Supplier Conformance
It is the responsibility of the project principals to ensure that suppliers are providing exactly what has been
requested in the procurement documents.

D.3.    Solicitation Response Approval
Technical and Quality Requirements
Proposals will be selected by the project management team and grants will be awarded to those applicants that satisfy
technical and quality requirements as stated within the RFP.

EPA Approval
For projects funded by EPA, suppliers must comply with EPA requirements as defined in the Federal Regulations.

Acceptable Quality
                                                   -9-

-------
                                      Great Lakes Commission
                                QUALITY MANAGEMENT PLAN

Ensuring that procured items and services are of acceptable quality includes the review of objective evidence of
quality for applicable items and services furnished by suppliers and subcontractors, source selection, source
inspections, supplier audits,  and examination of deliverables.
Approval of Supplier QMPs/QAPPs
It is the responsibility of the QA Manager to review and approve procedures for mandatory quality-related
documentation from suppliers.

EPA approval of Supplier QMPs/QAPPs
The review and approval by the Commission of mandatory quality-related documentation from suppliers will be
consistent with Chapter 2.2 of EPA Order 5360, the EPA Contracts Management Manual (EPA Order 1900), and the
Federal Acquisition Regulations Office of Federal Procurement Policy.

E.1.    Documents and Records
Quality Related Documents
Program managers and quality management staff ensure that records and documents accurately reflect completed
work through continual communication, data checking and oversight of all Commission projects. All Commission
reports are approved by the President/CEO, all financial statements are approved by the financial officer and
ultimately assessed by the Commission's auditors.

Records Reflect Work
Maintaining documents and records is the responsibility of all staff. A chronological file is kept of all correspondence
the Commission receives and sends, including faxes and most email (email is backed up on our server). Access to
records is  at the discretion of the President/CEO. All contracts including those that cover loss or damage (insurance-
based) are negotiated by the President/CEO and approved by the Board of Directors. The Commission maintains
archives of all materials at an offsite secure location.

Chain of Custody
N/A

EPA Requirements for Documents and Records
Commission financial - payables are archived for a minimum of seven years, general accounting/general ledger
records date back to the establishment of the Commission, payroll is archived for a minimum of seven years.
Commission project files - are archived for a minimum of seven years.

All confidentiality and evidentiary procedures are at the discretion of the Commission President/CEO.

F.1.    Computer Hardware and Software
All systems development must be approved by the Commission program specialist for systems. Installation is either
through a contracted service, or when available, through Commission staff.  Server configuration is through
contracted services. Equipment testing is accomplished through a combination of in-house expertise and contracted
services. Documentation of Commission equipment and use is available from the program specialist for systems.

Computer hardware systems are primarily Dell or Gateway units purchased through an account maintained by the
University of Michigan. Other peripherals are name-brand products purchased as needed by this or other projects
under the direct supervision of Commission GIS and data processing staff.

All computer operating system and most application software are commercially available, industry-standard
products. Customization to enhance productivity within a given application is carried out using the macro languages
                                                  -10-

-------
                                      Great Lakes Commission
                               QUALITY MANAGEMENT PLAN

and techniques designed for that particular application. When required, beta testing of products before purchase will
be implemented if there is any question of performance.

Assessments of equipment performance take place on a monthly basis. Recommendations for upgrades,
enhancements or other improvements are subject to program manager and QA Manager approval, and the
President/CEO authorizes any purchases required by these improvements.

The Commission tracks all equipment and maximizes performance through direct contact with staff, inventory and
documentation.

Regular backups of hard drives and servers assist in data storage and management. Active data files stored on the
server are copied to magnetic tape in case of hard drive failure, and copies of completed data files are archived to
CD-ROM for long-term storage and for distribution to other agencies.
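
As a minimal sketch of the backup split described above (active files to the tape set, completed files to the
CD-ROM archive set), the following Python example sorts data files into the two sets; the directory path and the
file-naming convention used to mark completed files are assumptions of this example only.

from pathlib import Path

def plan_backups(data_root, completed_marker="final"):
    """Split data files into a tape (active) set and a CD-ROM (archive) set."""
    tape_set, cdrom_set = [], []
    for path in Path(data_root).rglob("*"):
        if not path.is_file():
            continue
        # Assumed convention: completed deliverables carry the marker in the file name.
        if completed_marker in path.stem.lower():
            cdrom_set.append(path)
        else:
            tape_set.append(path)
    return tape_set, cdrom_set

tape, cdrom = plan_backups("/data/projects")  # assumed server data root
print(f"{len(tape)} active files queued for tape backup")
print(f"{len(cdrom)} completed files queued for CD-ROM archive")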

Impact of Change to User Requirements
Should there be a requirement for updating or changing any systems requirements, this will be addressed and
approved by the QA Manager. Currently, all major software for office documentation is provided in at least two of
the most common formats (i.e. MS Word and WordPerfect, Netscape and Internet Explorer, etc...). All Commission-
related software is industry standard (ARC Info, ARC IMS, etc...).

Hardware and Software Standards
All hardware and software implemented at the Commission is under annual review and assessment. Should hardware
or software not perform to Commission staff standards, equipment will be updated or replaced at cost to the supplier
or the Commission, or provisions will be made under the particular project.

Information Management Compliance
Data developed in-house by Commission staff will be compiled and released with FGDC-compliant metadata (Z39
compliance). Data created by other agencies will be included in this project only if accompanied by FGDC-compliant
metadata which confirms that the data meet Commission standards for this project.

FGDC (Federal Geographic Data Committee) standards are listed at
http://www.fgdc.gov/standards/status/csdgm rs ex.html (extensive)
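
As a hedged illustration of the metadata requirement above (not an official FGDC validator), the Python sketch
below checks an XML metadata record for the presence of several top-level CSDGM sections before release; the
example file name and the particular set of sections checked are assumptions of this example.

import xml.etree.ElementTree as ET

CSDGM_SECTIONS = {
    "idinfo": "Identification Information",
    "dataqual": "Data Quality Information",
    "spref": "Spatial Reference Information",
    "eainfo": "Entity and Attribute Information",
    "distinfo": "Distribution Information",
    "metainfo": "Metadata Reference Information",
}

def missing_sections(metadata_path):
    """Return the names of any top-level CSDGM sections absent from the record."""
    root = ET.parse(metadata_path).getroot()
    return [name for tag, name in CSDGM_SECTIONS.items() if root.find(tag) is None]

# Example usage with a hypothetical file name:
# gaps = missing_sections("wetlands_layer_metadata.xml")
# print("metadata sections present" if not gaps else f"missing sections: {gaps}")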

Developed,  Installed and Tested Documenting Software
N/A

EPA Standards for Information Resources Management
For geographical standards see above. The Commission maintains Vshield virus protection software. Our Internet
security is covered under the Commission contract with Merit Networks and is subject to their firewalls or lack
thereof (basically we can block any address from entering our system as per our contract agreement). EPA has full
access to all Commission hardware and software documentation and agreements.

G.1.    Planning
Program Manager - oversees cradle-to-grave operations of a project, which includes approval of all communications
and deliverables including data collection, compilation and analysis. This individual is responsible for all internal and
external reviews including advisory boards and task forces.

Sponsoring organization - For some projects, the sponsoring organization may participate in all stages of the project
(e.g., EPA cooperative agreements). For other projects, the sponsoring organization has oversight responsibility only
(e.g., EPA grant awards).
                                                  -11-

-------
                                      Great Lakes Commission
                                QUALITY MANAGEMENT PLAN

Project personnel - Commission staff perform data collection, compilation and analysis level tasks and assist the
Program Manager in meeting the goals of the project and the goals of the quality management plan.

Stakeholders - Assess the relevancy of project work and advise on development and process. This group also
participates in quality management implementation through recommendations to the Program Manager.

Scientific experts - Project participants are selected for both their level of expertise, including years of experience,
and their ability to work collaboratively with numerous partners. Experts will be expected to standardize data
collection and develop evaluation and review procedures.

Project Goals/Objectives and Issues Addressed
The project management team, with all appropriate personnel as described above, will formulate appropriate goals
and objectives. Data collected, compiled and analyzed will be required to be statistically defensible and be in a
standard format for modeling purposes.

Project Schedule
Individual project work plans will specify the project schedule. Schedules will differ for each project, but each will
contain many or all of the following elements:

Project Scoping
Develop Management Infrastructure
Establish Protocols for Funding Program
Manage Small Grant/Pilot Studies Program
Design Management Support System
Secure Supplemental Funds
Provide Project Secretariat Services
QA Assessment
Evaluation of Results

The Commission has direct oversight of all project committees.

Support to Project Objectives
Data collection, analysis and compilation are often required and will therefore directly support project objectives.
The database will serve as an information management tool and be valuable throughout project time frames and beyond.
Quality requirements for data usage include quality checks of data before being released. As part of QAPP checks,
data submitted as results to projects are tested and validated by Commission staff, the project management team, and
the QA Manager.

Specifications of Performance Criteria for Measuring Quality
Quality measurements will be assessed at the beginning of each project, and will therefore be project specific.
Performance criteria will include:
    Data accuracy
    Achievement of project goals
    Meeting deadlines
    Issue resolution
    Staying within budget
    Personnel, supplier and contractor evaluations
    Meeting all QAPP requirements
    Compliance with the Quality Management Plan
    Report writing
    Access to research data
    Future considerations/recommendations
                                                   -12-

-------
                                      Great Lakes Commission
                                QUALITY MANAGEMENT PLAN

    Practicality
    Positive reviews recommending acceptance

Approved Planning and Technical Documents
Many projects are required to follow approved QAPPs, and revise or update them according to periodic reviews by
the Commission, the project management team and the QA Manager.  QAPPs required for EPA projects must adhere
to the elements listed in EPA's Region 5 quality planning document.

Review Procedures
The Commission is responsible for all review procedures as outlined in its Quality Management Plan.

Procedure Documentation
The Commission is responsible for all project documentation including procedures relevant to the project. All
procedures are reviewed under the quality management review process. Release of procedures is consistent with the
project work plan. Verification of any changes or updates to procedures is the responsibility of the QA Manager.
The QA Manager is also responsible for reporting findings to the appropriate sponsoring organization. The project
management team will review statistical models.

A procedures document will be developed for data collection, compilation and analysis for all project data. This
document will be reviewed on an annual basis. Policy associated with procedures will be reviewed by the
Commission, which in turn will assure that it falls within Commission guidelines. The Commission will approve all
updates, revisions or changes to policy and procedures related to each project and will keep records of these activities
for up to five years after project completion.

I.1.    Assessment and Response
Annual Assessment
The Commission's quality system is assessed annually as part of its annual strategic planning process. Project
personnel update the President/CEO on a bi-weekly basis, including quality management activities. Assessors of
quality performance include the project secretariat, project chair, project management team, subcommittee chairs and
stakeholders to the project. Assessments of data, process and work performance will ultimately reside with
Commission staff. Assessors may choose their own assessment tools, including geographic reference, adherence to
standards set in the procedures document, and accepted use policies.

Peer reviews - will be conducted on all RFP drafts, proposal submissions and all deliverables.
Technical reviews - project management team members will be required to provide technical review of each
other's work.
Performance evaluations - will be conducted at the Commission level and at various stages of the project, over an
annual and semiannual time frame.
Data quality assessments - will be administered and determined relevant by Commission staff.

The project management team is established and designed to be  an open forum to conduct project assessment from a
wide variety of participants. Problem identification and resolution building is within the purview of this team.
Effectiveness of recommendations and solutions will be weighed at the project secretariat level. Project management
team members may either approach the Chair or the secretariat with reviews and responses to findings related to
assessment of quality performance.

Assessment Documentation
A one-page report is developed for each assessment and serves as an update for Commission management and the
project management team. Assessment tools are  selected by the  QA Manager in consultation with the project
management team and will include:
    Ease and frequency for data collection
    Recommendations  for data collection
                                                  -13-

-------
                                      Great Lakes Commission
                                QUALITY MANAGEMENT PLAN

    Data analysis procedures
    Database review
    Recommendations for frequency of review
    Review of practicality for implementation
    Review of implementation
    Reviews beyond the project period
    Review of effectiveness
    Review of roles of project participants
    Review of agency cooperation and mandates
    Follow up recommendations

Assessing the Assessors
Assessors are selected by reviewing expertise, level of education or training and on whether they stand to gain from
assessing elements of the project. The QA Manager selects assessors on recommendations from the project
management team and members of the stakeholder group. The QA Manager will assess qualifications of assessors
according to components of the project that they will be assessing (e.g., if they are testing a statistical model, they
should have a strong background in statistical applications/modeling). To ensure there are no conflicts of interest, no
assessor will be reviewing his/her own work or the work of a colleague with whom they may be engaged on a
project.
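
The assessor-screening rule described above (match expertise to the component being assessed and exclude
anyone with direct involvement in that work) can be expressed compactly; the data structure and field names in
this Python sketch are assumptions used only for illustration, not part of the Commission's procedures.

from dataclasses import dataclass, field

@dataclass
class Candidate:
    name: str
    expertise: set
    worked_on: set = field(default_factory=set)  # project components the person contributed to

def eligible_assessors(candidates, component, required_skill):
    """Keep candidates who have the needed skill and no involvement in the component."""
    return [c for c in candidates
            if required_skill in c.expertise and component not in c.worked_on]

pool = [
    Candidate("Reviewer A", {"statistical modeling"}),
    Candidate("Reviewer B", {"statistical modeling"}, {"wetlands model"}),
]
print([c.name for c in eligible_assessors(pool, "wetlands model", "statistical modeling")])
# -> ['Reviewer A']; Reviewer B is excluded for direct involvement in the work assessed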

Assessor Checklists
Each assessor will be provided with a checklist including elements listed above, from the QA Manager. The QA
Manager will act as a resource for any questions/issues that arise as a result of the assessment. Likewise, for EPA
projects,  the EPA Quality Assurance Manager will act as a resource for the Commission QA Manager. All corrective
action is at the discretion of the President/CEO of the Commission, who in this role will address any disputes
encountered as a result of assessments.

Corrective Actions
Corrective actions will take place as a result of the assessments and be accompanied by a recommendation for
corrective action that has been approved by the QA Manager and the Commission President/CEO. Corrective action
will be made promptly so as not to  cause further delay to the project's progress. An interim assessment review will
be conducted two to four weeks after the corrective action has taken place to determine whether the situation has
been rectified and to assess whether further action needs to take place. The QA Manager is responsible for
documenting any such action.

Disputes
Should there be a dispute after corrective measures have been made, the Commission will confer with the project
management team on steps to ensure project integrity.

J.1.    Quality Improvement
The Commission President/CEO is responsible for review, approval, and implementation of the quality system.
Working together with staff, the President/CEO responds to problems identified within the project and works to
identify solutions and implement them. All actions will be documented. The President/CEO will act from a
consensus driven process and track each issue through to an acceptable level of completion.

Continuous Quality Improvement
The QA Manager is responsible for leading the effort to identify, plan, implement and evaluate the effectiveness of
quality improvement activities. Quality improvement activities may include:
        QMP
        QAPP
        Checklists
        Evaluations
                                                  -14-

-------
                                      Great Lakes Commission
                               QUALITY MANAGEMENT PLAN

        Assessments
        Data collection, management, storage and access
        Evaluation
        Planning procedures
        Technical procedures
        Training procedures
        Documentation and record keeping
        Staffing
        Related project elements

A checklist for quality improvements will include the above and be circulated by the QA Manager to Commission
staff and the project management team. This checklist will be circulated on an annual basis.

Conditions for Quality
Prevention - The Commission President/CEO delegates decisions related to quality management to the QA Manager.
Through implementation of the quality management plan, including regular assessments and reviews of QAPPs and
other procedural practices, issues should be minimized. Open and frequent communication with all project staff
should prevent misunderstandings and misconceptions and also highlight expectations.
Nature and extent - The QA Manager, in consultation with the President/CEO, will determine the nature and extent
of action to be taken after an occurrence has been noted or disclosed through a site visit or regular review and
assessment. All occurrences will be documented. Any corrective action (as determined necessary - described above)
will be documented.
Correction - Any corrective actions will be taken as soon as necessary (see above). Through regular review,
incorporation of enhancements and frequent communication, recurrence should be kept at a minimum.
Documentation - The QA Manager tracks all corrective actions.
Tracking to closure - The Commission conducts bi-weekly staff meetings to ensure that problems encountered are
identified early and dealt with immediately. All issues not addressed within a two-week time frame will be
documented and tracked to completion.
Communications - The  QA Manager will encourage Commission staff and other project participants to
communicate and document all project activities they are involved in, including supplier reviews, evaluations and
other management system elements. They will identify process improvement opportunities as well as offer solutions
to problems and forward these recommendations to the QA Manager.

Attachments
QAPP Checklist - see http://www.epa.gov/glnpo/fund/qachcklst.html
                                                  -15-

-------
             Quality Assurance Project Plan
                Title and Approval Sheet
               Final Draft, August 26, 2002
        Cuyahoga River Old Channel Assessment
              Demaree Collier, Project Officer
       U.S. EPA-Great Lakes National Program Office
  77 W. Jackson Blvd (G17-J), Chicago, Illinois 60604-3590
         Louis Blume, Quality Assurance Manager
       U.S. EPA-Great Lakes National Program Office
  77 W. Jackson Blvd (G17-J), Chicago, Illinois 60604-3590
        Lisa Morris, Chief, Division of Surface Water
   Ohio EPA, P.O. Box 1049, Columbus, Ohio 43216-1049
  Linda Friedman, Chief, Division of Environmental Services
Ohio EPA, Murray Hall, 1571 Perry St., Columbus, Ohio, 43210
Tutu Rosanwo, QA Officer, Division of Environmental Services
Ohio EPA, Murray Hall, 1571 Perry St., Columbus, Ohio 43210
             Julie Letterhos, Project Manager
           Ohio EPA, Division of Surface Water
         P.O. Box 1049, Columbus, Ohio 43216-1049

-------
                                       Table of Contents

QAPP Signature Page	1
Table of Contents	2
List of Figures	2
List of Tables	2
Appendices	3
QAPP Distribution List	4
A.    Project Management	5
  A.1.   Project Organization	5
  A.2.   Background and Problem Definition	6
  A.3.   Project/Task Description	7
    A.3.1.   Historical Reconnaissance Survey of Old Channel and Adjacent Properties	7
    A.3.2.   Sediment Chemistry Sampling	7
    A.3.3.   Fish Community Sampling	8
    A.3.4.   Fish Tissue Sampling	8
  A.3.5.   Project Schedule	9
  A.4.   Data Quality Objectives and Criteria	9
    A.4.1.   Historical Use Reconnaissance Survey	9
    A.4.2.   Sediment Chemistry	9
    A.4.3    Fish Tissue Sampling	9
    A.4.4.   Fish Community Survey	9
  A.5.   Special Personnel, Training and Equipment Requirements	10
    A.6.   Documentation and Records	10
B.    Data Generation and Acquisition	10
  B.1   Sampling Process Design and Rationale	10
    B.1.2.   Definition of Sample Types	11
    B.1.3.   Type and Number of Samples	11
  B.2   Sampling Methods	12
  B.3.   Sample Handling and Custody	12
    B.3.1    Sample Containers	12
    B.3.2.   Sample Labeling	13
  B.4.   Analytical Methods Requirements	13
  B.5   Quality Control  Requirements	13
  B.6.   Instrument/Equipment Testing, Inspection and Maintenance Requirements	13
  B.7.   Instrument Calibration and  Frequency	14
  B.8.   Inspection/Acceptance Requirements for Supplies and Consumables	14
  B.9.   Data Acquisition Requirements	14
  B.10.    Data Management	14
C.    Assessment and Oversight	14
  C.1.     Assessment and Response Actions	14
  C.2.     Reports to Management	16
D.    Data Validation and Usability	17
E.    References	17

List of Figures

Figure 1.       Project Management Organization Chart	4
Figure 2.       Map of sampling sites	7

List of Tables

Table 1.       Project Schedule	7
Table 2.       Summary of Type and Number of Samples to be Collected	10
Table 3.       Sediment Volume, Container Type and Holding Times	11

-------
Appendices

Appendix A.   Fish Community SOP
Appendix B.   Copies of field sheets and forms
Appendix C.   Sediment SOPs
Appendix D.   Fish tissue SOPs
Appendix E.   Analytical Methods and MDLs

-------
QAPP Distribution List

Each of the following individuals will receive a hard copy of this Quality Assurance Project Plan (QAPP).
A copy of the final signed QAPP should be retained by each of these individuals until the completion of
laboratory analysis and final acceptance of the data report. All individuals listed below must receive a
hard copy of any changes or addendums to the QAPP.

       Demaree Collier, Project Officer
       U.S. EPA, GLNPO
       77 West Jackson Boulevard (G17J)
       Chicago, Illinois 60604-3590
       312-886-0214

       Julie Letterhos, Project Manager
       Ohio EPA, Division of Surface Water
       P.O. Box 1049
       Columbus, Ohio 43216-1049
       614-644-2871

       Tutu Rosanwo, QA Officer
       Ohio EPA, Division of Environmental Services
       Murray Hall, The Ohio State University
       1571 Perry Street
       Columbus, Ohio 43201
       614-644-4247

       Roger Thoma
       Ohio EPA, NEDO
       2110 East Aurora Rd.
       Twinsburg, Ohio 44087
       330-963-1141

       Kelvin Rogers
       Ohio EPA, NEDO
       2110 East Aurora Rd.
       Twinsburg, Ohio 44087
       330-963-1117

-------
A.     Project Management

       A.1.    Project Organization

              Figure 1  provides a summary of the project organization for this study.

              Figure 1. Project Management Organization
              Julie Letterhos - Project Manager, Ohio EPA, 614-644-2871
              Louis Blume - QA Manager, USEPA-GLNPO, 312-353-2317
              R/V Mudpuppy Field Crew - Joe Bonem/Polly Brooks, Cetacean Marine, Inc., 517-893-1033
              Roger Thoma - Field Coordinator, Ohio EPA, 330-963-1141
              Kelvin Rogers - Field Crew, Ohio EPA, 330-963-1117
              Linda Friedman - Lab Manager, Ohio EPA, 614-644-4236
              Kathie Haas - Sample Receiving, Ohio EPA, 614-644-4243
              Cuyahoga RAP
              Ohio EPA is the principal investigating agency for this survey.  Ohio EPA is responsible
              for the development, coordination and implementation of the sampling plan and QAPP,
              and is the principal client for the final data. Sediment cores will be collected in
              association with U.S. EPA - GLNPO and the use of the R/V Mudpuppy. The Ohio EPA,
              Division of Environmental Services, will provide all analytical services. Staff associated
              with this project and their responsibilities  include:
               Person: Demaree Collier, U.S. EPA-GLNPO, 77 W. Jackson Blvd. (G-17J), Chicago, IL 60604,
               Phone: 312-886-0214, Collier.demaree@epa.gov
               Responsibilities: Project Officer; grant oversight and budgeting; coordinating sediment
               collection; technical guidance; field team member; review and approval of final deliverables

               Person: Louis Blume, U.S. EPA-GLNPO, 77 W. Jackson Blvd. (G-17J), Chicago, IL 60604,
               Phone: 312-353-2317, Blume.louis@epa.gov
               Responsibilities: GLNPO QA Manager; review and approve QAPP

               Person: Julie Letterhos, Ohio EPA-DSW, P.O. Box 1049, Columbus, OH 43216-1049,
               Phone: 614-644-2871, julie.letterhos@epa.state.oh.us
               Responsibilities: Project Manager; grant oversight and budgeting; coordinate Ohio EPA
               support; prepare QAPP; review Final Report; field team member; coordinate delivery of field
               samples to lab

-------
        Person: Roger Thoma, Ohio EPA-NEDO, 2110 East Aurora Rd., Twinsburg, OH 44087,
        Phone: 330-963-1141, Roger.thoma@epa.state.oh.us
        Responsibilities: Fish Field Coordinator; conduct fish community sampling; conduct fish tissue
        sampling; provide SOP for electro-fishing; review QAPP; analyze fish community data; review Final
        Report; ensure all QA/QC procedures for fish sampling are followed in the field

        Person: Kelvin Rogers, Ohio EPA-NEDO, 2110 East Aurora Rd., Twinsburg, OH 44087,
        Phone: 330-963-1117, Kelvin.rogers@epa.state.oh.us
        Responsibilities: Cuyahoga River RAP coordinator; field team member; prepare historical use study;
        review QAPP; prepare Final Report; communicate with Cuyahoga RAP

        Person: Linda Friedman, Ohio EPA-DES, OSU, Murray Hall, 1517 Perry St., Columbus, OH 43210,
        Phone: 614-644-4236, Linda.friedman@epa.state.oh.us
        Responsibilities: Laboratory Manager; provide lab SOPs; review QAPP; provide all analytical support

        Person: Tutu Rosanwo, Ohio EPA-DES, OSU, Murray Hall, 1517 Perry St., Columbus, OH 43210,
        Phone: 614-644-4247, Tutu.rosanwo@epa.state.oh.us
        Responsibilities: Ohio EPA QA Officer; review QAPP; ensure all analytical procedures adhere to
        approved laboratory QA/QC methods; review data quality indicators; review final analytical data

        Person: Kathie Haas, Ohio EPA-DES, OSU, Murray Hall, 1517 Perry St., Columbus, OH 43210,
        Phone: 614-644-4243, Kathie.haas@epa.state.oh.us
        Responsibilities: Sample Receiving; assure the integrity of samples once received at the lab and that
        appropriate chain of custody is followed; provide all sample bottles and preservatives; coordinate with
        field team for sample delivery
A.2.    Background and Problem Definition

The Cuyahoga River flows into the central basin of Lake Erie at Cleveland, Ohio. Within the Lake
Erie basin, the Cuyahoga River has long been considered the single most environmentally
disturbed system and is a Great Lakes area of concern. Considerable modification of the
physical habitat of the lower river has occurred, particularly as a result of navigation needs. In
1827, a bypass channel was dredged allowing the river to enter the lake approximately one mile
above the natural mouth.  This eased some navigation problems,  but the bypass channel diverted
river flow and the natural mouth was filled  in, leaving the original course of the river a blind
channel.  Both the old channel and the lower main stem of the river are now heavily used
navigation channels lined with factories, commercial docks and storage facilities, marinas and
entertainment complexes. These  segments of the river have been severely modified by deep

-------
dredging, bank-shaping and shoreline structures of steel sheet piling, cement seawall and
limestone rip-rap.

In spite of the massive physical alterations and decades of pollutant discharge, the area at the
upper end of the old channel has returned to some semblance of naturalness by regaining beds
of submerged aquatic vegetation.  Fish community surveys over the past several years indicate
the potential for the old channel to support a diverse fish community. However, the fish that
populate this area display high levels of DELT anomalies. Brown bullhead, observed in the old
channel by both USGS-BRD and Ohio EPA, have a high incidence of tumors.  Much of the old
channel is continually dredged to maintain a depth of 21 ft., and it is assumed that this would
reduce the potential for contaminants to concentrate there. However, the upper blind end of the
channel is much shallower, rarely dredged and has little circulation or flushing.  Lake Erie seiches
also contribute to backflow and the potential to direct polluted water from the main  river channel
into the old channel. There is also the potential that sediments along the stream banks adjacent
to the dredged channel may harbor elevated levels of contaminants. Existing sediment data for
the old channel is  limited to several sites sampled by the Corps of Engineers in the dredged
navigation channel. This project will focus on the approximately one-mile long old channel.

The high incidence of brown bullhead tumors and DELTs in other species suggests that a serious
contamination problem exists.  Ohio EPA proposes a screening assessment of the old channel
portion of the Cuyahoga River navigation channel to determine the potential presence and
sources of contaminants that may be causing a high incidence of tumors and DELT anomalies in
brown bullhead and other fish. The results of this project will assist Ohio EPA in determining if
additional actions are needed to characterize contaminant problems in the old channel. The
results will also assist the Cuyahoga River RAP to better document the beneficial use
impairments in the old channel and the need for remedial actions.
A.3.   Project/Task Description

Ohio EPA proposes to conduct a screening assessment of the old channel of the Cuyahoga River
to provide a background environmental status report. Ohio EPA will collect fish community and
fish tissue data, and coordinate with U.S. EPA/GLNPO to collect sediment core samples using
the R/V Mudpuppy. Ohio EPA will also conduct a reconnaissance survey to compile the potential
past and  present sources of contaminants to the area. A map of the old channel and the
approximate sampling locations is presented as Figure 2.  Specific tasks are listed below.

A.3.1.  Historical Reconnaissance Survey of Old Channel and Adjacent Properties
A history  of the area will be compiled to investigate past and present industrial use, presence of
landfills or abandoned disposal sites, past spills or "pollution events" (e.g., a warehouse fire),
dredging  history (401/404 permits), marina development, current use and location of discharges
(NPDES  permits), and history of the relocation of the river mouth.  The preliminary results of this
study will provide background documentation to identify potential source areas and may help to
better position the sampling sites. This task will also include compilation of any existing
environmental data for the old channel.

A.3.2.  Sediment Chemistry Sampling
The Ohio EPA proposes to sample 18 sites in the old river channel for sediment analysis. Seven
sites will be in the shallow upstream, non-dredged area; five sites will be located between the
dredged channel and  the riverbank;  and six sites will be in the slips adjacent to the old channel.
Sample locations are  approximate and the actual sample site locations will be recorded in the
field. Two to three subsamples will be  collected per sediment core, with the 0-1 ft. horizon initially
established.  Remaining subsamples will be collected at the discretion of the field crew based on
appearance of the sediment cores and suspected zones of contamination. Each sample will
consist of subsamples from the selected horizon that are composited and homogenized and then
placed in  sample containers. All sediment samples will be analyzed for total metals (including

-------
mercury), particle size, total organic carbon, VOCs, PCBs and organochlorine pesticides and
BNAs (including PAHs).  Also, in order to remain comparable to historical data, surface grabs
using a Ponar will be collected at sites within each of the six representative zones.

A.3.3.  Fish Community Sampling
Six zones, approximately 400 to 500 meters long and within one meter of shore, will be surveyed
for fish community composition.  Electro-fishing gear will be used and all fish collected will be
identified to species, counted, measured, weighed and checked for DELTs and external tumors.
The six zones will cover the length of the old  channel.

A.3.4.  Fish Tissue Sampling
Fish tissue samples will be collected in each of the six fish community zones.  Samples will
consist of a composite of whole body carp. Samples will be analyzed for the bioaccumulative
metals (mercury, selenium, cadmium, arsenic and lead), PCBs and organochlorine pesticides.
Fish tissue will not be analyzed for PAHs because fish tend to metabolize PAHs rather than
bioaccumulate them.

Figure 2.  Map of Cuyahoga Old Channel with approximate locations of sampling sites and
fish zones (yellow dots are sediment sites; blue dots are fish sites).

-------
A.3.5.  Project Schedule

       Table 1.  Tentative Project Schedule
       Task                                                      Completion Date
       QAPP development and sign-off                             April 30, 2002
       Preliminary Historical Use report                         May 5, 2002
       Sediment sampling                                         Week of May 13, 2002
       Fish community and fish tissue sampling                   Week of May 6, 2002
       Obtain analytical results                                 August 2002
       Submit progress report to Project Officer                 September 2002
       Review results and prepare draft report                   November 2002
       Submit final report and recommendations for next steps    February 2002
       Project End                                               February 2002

A.4.   Data Quality Objectives and Criteria

A.4.1.  Historical Use Reconnaissance Survey
Records of Ohio EPA, USEPA-EDO, local agencies, local libraries, Cuyahoga RAP, etc. will be
searched to provide a history of any activities that may have contributed contaminants to the old
channel.  Particular interest is in identifying any sources of PAHs and PCBs.

A.4.2.  Sediment Chemistry
Sediment chemistry data will be compared to existing sediment quality guidelines (SQGs) as
referenced in MacDonald et al. (2000) and Persaud et al. (1993).  They will also be compared to
background concentrations in the Lake Erie basin sediment project report (Ohio EPA and
Heidelberg College 1998). Of particular concern will be screening for levels of contaminants that
may be associated with tumor development - PAHs and PCBs. Since this is a screening survey,
the sediment data will be used to determine if the contaminants of concern exist at elevated
levels that may contribute to tumor development in brown bullhead and/or trigger further
investigations. Results will also be discussed with Dr. Paul Baumann, USGS-BRD, regarding their
possible connection to fish tumor occurrence.

A.4.3.  Fish Tissue Sampling
Fish tissue chemistry analysis will be conducted following Ohio EPA analytical methods used
statewide so results will be comparable to any existing recent data.  Levels of contaminants will
be compared to guidelines used by the Ohio Department of Health to determine fish consumption
advisories, and also to the Fish Consumption Beneficial Use Impairment Report prepared for the
Lake Erie Lakewide Management Plan (LaMP) (Lambert, 1999).

A.4.4.  Fish Community Survey
Fish community surveys will be conducted in accordance with the SOP presented in Appendix A.
Ohio EPA has developed fish monitoring methodology specifically for the lower tributary and
harbor areas (lacustuaries). IBIs and MIWBs will  be calculated for this area and compared to the
"lacustuary" values developed by Ohio EPA (Thoma, 1999). The percentage of DELT anomalies
and external tumors will also be recorded to estimate the general health of the resident fish
community. A qualitative habitat evaluation (QHEI) will also be conducted.
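
As an illustration of how the DELT and tumor incidence described above could be tallied from the field
data sheets, the short Python sketch below computes the percentage of fish in each zone showing DELT
anomalies or external tumors. The record layout and field names are illustrative assumptions only; the
actual metrics (IBI, MIWB, QHEI) will be calculated using Ohio EPA's published methods.

    # Hypothetical sketch: per-zone percentage of fish with DELT anomalies or tumors.
    # The record layout (zone, delt, tumor) is illustrative, not an Ohio EPA data format.

    def delt_percentages(records):
        """records: list of dicts like {"zone": 1, "delt": True, "tumor": False}"""
        totals, affected = {}, {}
        for fish in records:
            zone = fish["zone"]
            totals[zone] = totals.get(zone, 0) + 1
            if fish["delt"] or fish["tumor"]:
                affected[zone] = affected.get(zone, 0) + 1
        return {z: 100.0 * affected.get(z, 0) / totals[z] for z in totals}

    example = [
        {"zone": 1, "delt": True,  "tumor": False},
        {"zone": 1, "delt": False, "tumor": False},
        {"zone": 2, "delt": False, "tumor": True},
    ]
    print(delt_percentages(example))   # {1: 50.0, 2: 100.0}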

All field and laboratory procedures will adhere to those developed and approved by Ohio EPA.
Definitions of precision, accuracy and completeness are contained in Ohio EPA's  QAPP
Integrated Work Program (Ohio EPA,  1995). Standard Operating Procedures (SOPs) for all field
sampling and laboratory analytical work are contained in the Manual of Laboratory Standard
Operating Procedures, Volumes I,  II and III (Ohio EPA, 2001).

-------
       A.5.   Special Personnel, Training and Equipment Requirements

        Sediment sampling will require the use of the U.S. EPA/GLNPO's R/V Mudpuppy and her captain
       and crew.

       Ohio EPA staff will conduct fish community surveys using electro-fishing methodology designed
       specifically for Lake Erie lake-effect tributaries and harbors (lacustuaries).

       A.6.   Documentation and Records

       The Ohio EPA project manager will ensure that all the appropriate project personnel have the
       most current approved version of the QAPP and that all staff are aware of scheduling and their
       responsibilities.

       The project manager will maintain a project field notebook to contain all planning meeting notes,
       field data sheets, copy of ship's log, historical background report, copies of laboratory chain of
       custody forms, field expenses, and any other communication that may be important to
       preparation of the final project report.  Field data sheets will be completed for each sediment
       sampling station, each fish tissue sampling area, and each fish community zone. Latitude and
       longitude will be recorded for each station  using GPS to ensure accurate interpretation of where
       contaminant problems exist. Copies of field data sheets and forms to be used are included in
       Appendix B.

       Copies of all laboratory records documenting sample handling and analysis will be submitted to
       the project manager and maintained in the permanent project file.  All laboratory reports and
       analytical data will be handled as outlined in the Ohio EPA QAPP Integrated Work Program (Ohio
       EPA 1995).

       All data will be entered into Ohio EPA databases and submitted to USEPA/GLNPO in MS Excel
       format. Per GLNPO project/grant guidelines, a final copy of the project report will be provided in
        hard copy, electronic, and HTML versions.
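
        As a purely illustrative sketch of the required Excel deliverable, the Python snippet below writes a small
        table of station results to a spreadsheet. The column names, file name, and use of the pandas library are
        assumptions for illustration; the actual layout will follow Ohio EPA database conventions and GLNPO
        guidance.

            # Hypothetical export of station results to Excel for submission to GLNPO.
            # Column names and the file name are placeholders, not a specified format.
            # Requires the pandas and openpyxl packages.
            import pandas as pd

            stations = pd.DataFrame(
                {
                    "station_id": ["OC-01", "OC-02"],
                    "latitude": [41.50, 41.50],
                    "longitude": [-81.71, -81.70],
                    "total_pcb_ug_kg": [120.0, 340.0],
                }
            )
            stations.to_excel("old_channel_sediment_results.xlsx", index=False)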


B.     Data Generation and Acquisition

       B.1    Sampling Process Design and Rationale

       A high incidence of tumors in brown bullhead observed by Ohio EPA and USGS-BRD in separate
       surveys suggests that contamination is present in the old channel. Lack of sediment data and
       environmental background information in general make it impossible to determine the reason for
       the fish tumors. This screening assessment was designed to determine if contaminated sediment
       is present, what contaminants are present, and if contamination is high enough to initiate
       additional site characterization and remediation. Sites were chosen based on proximity to
       potential sources, whether the area has been or is regularly dredged, and the location of
       aquatic vegetation. The sites were also chosen to provide representative coverage of the
       various channel characteristics, i.e. shallow upper channel, area adjacent to the regularly
       dredged commercial navigation channel, and slips off the river. Fish community surveys will be
       conducted in six zones to cover the basic habitat types and to determine the diversity available,
       as well as the general condition of the fish community (i.e. DELTs and tumors).

        Using the rationale and SOPs in Ohio EPA 2001 and Ohio EPA 1994a, whole body carp samples will
        be collected for fish tissue analyses. Each sample will be a composite of five fish of similar size. The
        only variance from the SOP will be that the aluminum foil used to wrap the fish samples will not be
        decontaminated with acetone. Tests by the Ohio EPA laboratory determined that this step is no longer
        necessary.  Fish tissue SOPs are presented in Appendix D.

-------
Sediment cores will be collected at 18 sites.  Seven sites will be in the most upstream shallow
area where deep sediment is expected and where contaminants may have accumulated over
time since little to no maintenance dredging is done in this area. Five sites will be along the edge
of the navigation channel and six sites will be in the three slips adjacent to the channel. Cores
were chosen rather than grab samples to provide an indication of past contaminant loading and
the preliminary depths as to where contaminants might exist.  Pictures will be taken of each core
and labeled for reference when preparing the final project report. To stay within limited resources,
each core will be split into two to three subsections. Subsamples from each section
will be composited and homogenized. Sediment samples will be collected using SOPs in
Appendix C. VOCs are not typically found in sediments in the Lake Erie watershed, so only
minimal screening will be done unless the background reconnaissance survey suggests that
VOCs may be a contaminant of concern.

The fish zones will cover the length of the approximately one-mile-long channel, and each  will be
400-500 meters long and one meter from shore. Samples will be collected once, in the spring, to
coincide with when brown bullhead will be in  the area.

B.1.2.  Definition of Sample Types
Three types of sediment samples will be collected  during this survey: routine field samples (RFS);
field replicates (FR); and field duplicates (FD).  Each sample type is described below.

Routine Field Samples (RFS):  Routine field samples will be collected by splitting a  core into
several sections, taking subsamples from each portion and homogenizing them, and filling all
required sample jars. Routine field samples will be collected at 18 locations, with two to three
samples collected at each site.

Field Duplicates (FD):  Prepared by filling a second set of sample jars from a homogenized core
subsample. Four FDs will be collected.

Field Replicates (FR):  Prepared by collecting a second, separate sediment core sample,
homogenizing the material separate from the RFS and filling the required sample jars. FRs will
be collected at two locations.

B.1.3.  Type and Number of Samples
Table 2 summarizes the type and number of samples to be collected during this project. The
estimated number of samples includes all RFS, FD, and FRs.
       Table 2.  Summary of Type and Number of Samples to be Collected

       Sample Type                      Number of   Sample Matrix   Analysis Required
                                        Samples
       Sediment Chemistry from Cores    69          Sediment        Total PCBs, BNAs (PAHs), metals, TOC,
       and Surface Grabs                                            particle size, organochlorine pesticides
       Sediment VOCs                    20          Sediment        VOCs
       Whole Body Carp                  6           Fish Tissue     Total PCBs, metals, organochlorine
                                                                    pesticides
       Fish Community                   6           Water           Identified to species

All of the data listed in Table 2 are considered critical to the success of this screening assessment
project.
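
One way to see how the totals in Table 2 are built up is to tally the planned routine field samples, surface
grabs, and QC samples described in Section B.1.2. The Python sketch below does this tally under
illustrative assumptions about the number of subsamples per core; it is not intended to reproduce the
Table 2 totals exactly, since the final counts depend on decisions made in the field.

    # Illustrative tally of planned sediment chemistry samples.
    # Per-site subsample counts are assumptions; actual counts are set in the field.

    cores_per_site = {f"site_{i:02d}": 3 for i in range(1, 19)}  # 2-3 subsamples per core
    surface_grabs = 6      # one Ponar grab per representative zone
    field_duplicates = 4   # second jar set filled from a homogenized subsample
    field_replicates = 2   # second, separately homogenized core

    routine = sum(cores_per_site.values())
    total = routine + surface_grabs + field_duplicates + field_replicates
    print(routine, total)  # 54 routine samples, 66 total under these assumptions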

-------
B.2    Sampling Methods
SOPs for all field measurements are contained in Ohio EPA, 1991 or 2001. At each sediment
site, measurements will be taken for latitude/longitude location. This data is critical for use in
determining where sediment samples were collected. The two Differential Global Positioning
Systems (DGPS) onboard the R/V Mudpuppy are both capable of determining horizontal
locations to within 5 meters. To achieve this accuracy, it is important that the DGPSs
are in good working order and are obtaining strong satellite signals. The R/V Mudpuppy field
team will be responsible for checking the satellite signal strength for the DGPS systems prior to
recording this data and for ensuring that the two systems are recording equivalent horizontal
locations. Any problems with signal strength or differences between the two systems shall be
recorded in the field sample log. A qualitative description should  also be provided in the field log
utilizing any available permanent landmarks. Ohio EPA will collect other field data including a
water column profile for temperature, dissolved oxygen, pH, and conductivity.  Water depth will
also be recorded.
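
To illustrate the kind of cross-check the field team could apply to the two DGPS readings, the sketch
below computes the horizontal separation between the units' reported positions with the haversine
formula and flags any disagreement larger than the 5-meter accuracy criterion noted above. The function
name, example coordinates, and threshold handling are assumptions for illustration only.

    # Hypothetical cross-check of the two onboard DGPS readings against the
    # 5-meter horizontal accuracy criterion.
    from math import asin, cos, radians, sin, sqrt

    def separation_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters between two lat/lon readings."""
        r = 6371000.0  # mean Earth radius, meters
        p1, p2 = radians(lat1), radians(lat2)
        dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
        return 2 * r * asin(sqrt(a))

    d = separation_m(41.4975, -81.7050, 41.49752, -81.70503)
    if d > 5.0:
        print(f"DGPS units disagree by {d:.1f} m -- record in field sample log")
    else:
        print(f"DGPS agreement OK ({d:.1f} m)")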

Sediment cores will be collected utilizing the vibracorer sampling  device located  on the R/V
Mudpuppy. Appendix C contains the equipment needs, the SOPs, and the decontamination
procedures for the collection of sediment core samples. Sediment cores will be split with
subsamples of the upper 1-foot composited and homogenized as one sample, and subsamples of
two or three sections of the  lower core composited and homogenized as additional samples.  In
the unexpected event that it is not possible to obtain a core sample, a ponar sample will be taken
instead following the SOP in Appendix C.

The fish community will be surveyed in six 400-500  meter zones using the SOP in Appendix A.

Fish tissue samples will be collected for whole body carp using the SOP for fish tissue included in
Appendix D.

B.3.    Sample Handling and Custody

B.3.1   Sample Containers
After processing, sediment samples will  be placed into the appropriate sample containers as
summarized  in Table 3. A field sample data sheet will be completed for each sampling location.
All containers, lab analytical request sheets, and chain-of-custody forms will be provided by Ohio
EPA, Division of Environmental Services (DES). Notify Kathie Haas, DES, 24 hours in advance
for supply  pick-up. Coolers will be provided by Ohio EPA, Northeast District Office, Division of
Surface Water.

Table 3. Sediment Sample Volume and Container Type
Parameter         Sample Volume   Container Type                                   Holding Time
VOCs              60 ml           60 ml wide mouth glass vial with Teflon lined    Extracted within 14 days
                                  lid; fill to eliminate head space
Semi-volatiles    100 g           500 ml wide mouth amber glass jar with           Extracted within 14 days
and PAHs                          Teflon lined lid
Pesticides/PCBs   100 g           500 ml wide mouth amber glass jar with           Extracted within 14 days
                                  Teflon lined lid
Metals            250 g           500 ml wide mouth glass jar with Teflon lid      Hg - 28 days; all other
                                                                                   metals - 6 months
Particle Size     500 g           Plastic ziplock bag or 500 ml HDPE
TOC               125 g           125 ml glass jar with Teflon lined lid           14 days

-------
All samples except particle size will be immediately stored on ice in coolers and shipped to the
lab within 48 hours.

Fish tissue samples will be wrapped in foil packages, frozen with dry ice and delivered to the lab
within 48 hours.

All sample handling and custody procedures will follow those referenced in Ohio EPA, 1995 and
2001.

B.3.2.  Sample Labeling
All sample containers will be labeled with the site name/sample ID as it appears on the laboratory
submission form and field data sheets, date, time (military), type of sample and name of collector.
All labeling will be done using tape or waterproof labels and indelible blue or black ink.
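
A small sketch of how the label fields listed above could be assembled consistently is given below. The
field order, separators, and example values are illustrative assumptions rather than an Ohio EPA labeling
convention.

    # Illustrative label string covering the fields required in B.3.2: site/sample ID,
    # date, military time, sample type, and collector name.
    from datetime import datetime

    def sample_label(sample_id, sample_type, collector, when):
        return " | ".join(
            [sample_id, when.strftime("%Y-%m-%d"), when.strftime("%H%M"),
             sample_type, collector]
        )

    print(sample_label("CUY-OC-07", "Sediment core 0-1 ft", "J. Smith",
                       datetime(2002, 5, 13, 14, 30)))
    # CUY-OC-07 | 2002-05-13 | 1430 | Sediment core 0-1 ft | J. Smith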

B.4.   Analytical Methods Requirements

EPA approved standard methods will be used. Quantitative methods to be employed include
atomic absorption (AA), emission spectroscopy (ICP), and manual cold vapor for heavy metals,
GC/MS for semi-volatiles, VOCs and PAHs, and GC for PCBs and organochlorine pesticides.
Quantitative analyses will be performed by the Ohio EPA, Division of Environmental Services
(DES) as referenced in Ohio EPA 2001.

Methods and RLs are included in Appendix E.

B.5    Quality Control Requirements

All analytical procedures are documented in writing as SOPs, and each SOP includes QC
information that addresses the minimum QC requirements for the procedure. The internal QC
checks might differ slightly for each individual  procedure.  Examples  of some of the QC samples
that will be used during this project include:
    •   Method blanks
    •   Reagent/preparation blanks
    •   Surrogate standards
    •   Analytical spikes
    •   Field replicates
    •   Field duplicates
    •   Laboratory duplicates
    •   Matrix spike/matrix spike duplicate
    •   Laboratory quality control check standards

The actual QC sample  requirements will be dictated by the method requirements. Details on the
use of each QC check are provided in the analytical SOPs provided for each measurement (see
Ohio EPA 2001).
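
Relative percent difference (RPD) between field or laboratory duplicate results is one of the checks listed
above. The sketch below shows the calculation; the 30% acceptance limit is only an illustrative
placeholder, since the actual limits are set in the analytical SOPs (Ohio EPA 2001).

    # Illustrative RPD check for a duplicate pair.  The 30% limit is an assumed
    # placeholder; actual acceptance criteria come from the analytical SOPs.

    def rpd(primary, duplicate):
        """Relative percent difference between two results."""
        mean = (primary + duplicate) / 2.0
        return 0.0 if mean == 0 else abs(primary - duplicate) / mean * 100.0

    value = rpd(12.0, 15.0)               # e.g. a duplicate pair of PCB results
    print(f"RPD = {value:.1f}%")          # RPD = 22.2%
    print("within limit" if value <= 30.0 else "flag for corrective action")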

B.6.   Instrument/Equipment Testing, Inspection and Maintenance Requirements

The purpose of this section is to discuss the procedures used to verify that all instruments and
equipment are maintained in sound operating  condition, and are capable of operating at
acceptable  performance levels.

As part of the Ohio EPA, Division of Environmental Services QA/QC  program, a routine
preventative maintenance program will be conducted to minimize the occurrence of instrument
failure and other system malfunctions. All laboratory instruments are maintained in accordance

-------
       with manufacturer's specifications and the requirements of the specific method employed. This
       maintenance is carried out on a regularly scheduled basis.

       B.7.    Instrument Calibration and Frequency

       Parameter specific calibration procedures and frequency requirements for quantitative analyses
       are defined in the method SOPs and contained in Ohio EPA 2001.

       B.8.    Inspection/Acceptance Requirements for Supplies and Consumables

       The purpose of this section is to establish and document a system for inspecting and accepting
       all supplies and consumables that may directly or indirectly affect the quality of the project.

       Supplies for sediment core collection and handling will be provided by the crew of the R/V
       Mudpuppy. All sample bottles and distilled/deionized  water will be provided by Ohio EPA, DES.
       The Ohio EPA laboratory will use high quality supplies (i.e. gases, reagents, etc.) during the
       analysis of samples from this project.  Water purification systems are tested on a regular basis
       and solvent blanks are run to verify the purity of solvents used in organic analyses. The lab will
       utilize their standard  practices to handle, process and analyze samples.

       B.9.    Data Acquisition Requirements

       Information collected from files and documents for the historical use report will be documented
       and listed in the References section of the report.  Any existing data that is collected for the old
       channel area will be reviewed for data quality, methods and detection limits.

       B.10.  Data Management

       Data reduction, data validation and data reporting policies and procedures will adhere to those
       established by the Ohio EPA, DES for quantitative analysis procedures and are contained in Ohio
       EPA, 1995, Section 10.1.


C.     Assessment and  Oversight

       C.1.    Assessment and Response Actions

       During the planning process, many options for sampling design, sample handling, sample
       cleanup and analysis, and data  reduction are evaluated  and chosen for the project.  In order to
       ensure that the data collection is conducted as planned, a process of evaluation and validation is
       necessary.  This section of the QAPP describes the internal and external checks necessary to
       ensure that:

          •   All elements of the QAPP are correctly implemented as prescribed.
          •   The quality of the data generated by implementation of the QAPP is adequate.
          •   Corrective actions, when needed, are implemented in a timely manner and their
              effectiveness is confirmed.

       The most important part of this section is documenting all planned internal assessments.
       Generally, internal assessments are initiated or performed by the QA Officer. Two types of
       assessments can be performed as described below:

          Management Systems Review (MSR). A form of  management assessment, this process is a
          qualitative assessment of a data collection operation or organization to establish whether the
          prevailing quality management structure, policies, practices, and procedures are adequate for
          ensuring that the type and quality of data needed  are obtained. The MSR is used to ensure

-------
    that sufficient management controls are in place and carried out by the organization to
    adequately plan, implement, and assess the results of the project.

    Readiness Reviews. A readiness review is a technical check to determine if all components
    of the project are in place so that work can commence on a specific phase.

It is anticipated that routine readiness review by the laboratory manager as described in the
laboratory quality assurance plan will be sufficient for this project.  No management systems
review is anticipated for this project.

Assessment of Project Activities
Assessment of project activities can involve the following tasks:

    •    Surveillance
    •    Technical Systems Audit (TSA)
    •    Performance Evaluation (PE)
    •    Audit of Data Quality (ADQ)
    •    Peer Review
    •    Data Quality Assessment.

Surveillance will be the primary assessment technique of project activities. This will most readily
occur by the Lab Manager and QA Officer.

Number, Frequency, and Types of Assessments
Due to the short-term nature of this project no types of assessments are planned other than
general surveillance.

Assessment Personnel
External and internal laboratory audits are coordinated by the laboratory QA Officer.

Schedule of Assessment Activities
External audits are conducted at the discretion of the GLNPO QA Officer and/or the GLNPO Project
Manager. The scheduling of regular internal audits at the laboratory is at the discretion of the
laboratory QA Officer.

Reporting and Resolution of Issues
Any audits or other assessments that reveal findings of practice or procedure that do not conform
to the written QAPP need to be corrected  as soon as possible. The Project Manager and QA
Officer need to be informed immediately of critical deviations that compromise the acceptability of
the test.

For noncompliance problems, a formal corrective action program will be determined and
implemented at the time the problem is identified. The person who identifies the problem will be
responsible for notifying the project manager.  Implementation  of corrective actions will be
confirmed in writing through the same channels.

Corrective actions in the laboratory may occur prior to, during, and after initial analysis. A number
of conditions, such as broken sample containers, multiple phases, and potentially high
concentration samples may be identified during sample log-in or just prior to analysis.  Following
consultation  with laboratory analysts  and section leaders, it may be necessary for the Laboratory
QA Officer to approve the implementation of corrective  actions. The submitted SOPs specify
some conditions  during or after analysis that may automatically trigger corrective actions of
samples, including additional sample extract cleanup and automatic re-injection/reanalysis when
certain quality control criteria are not met.

-------
Corrective actions are required whenever an out-of-control event or potential out-of-control event
is noted.  The investigative action taken is somewhat dependent on the analysis and the event.

Laboratory personnel are alerted that corrective actions may be necessary if:

    •   QC data are outside the warning or acceptable windows for precision and accuracy
    •   Blanks contain target analytes above acceptable levels
    •   Undesirable trends are detected in spike recoveries or RPD between  duplicates
    •   There are unusual changes in detection limits
    •   Deficiencies are detected by the Laboratory and/or GLNPO QA Officer(s) during any
       internal or external audits or from the results of performance evaluation samples
    •   Inquiries concerning data quality are received.

Corrective action procedures are often handled at the bench level by the analyst, who reviews the
preparation or extraction procedure for possible errors, checks the instrument calibration, spike
and calibration mixes, instrument sensitivity, experimental set-up, and so on.  If the problem
persists or cannot be identified, the matter is referred to the Laboratory Manager and/or
Laboratory QA Officer for further investigation.  Once resolved, full documentation of the
corrective action procedure is filed with the Laboratory QA Officer.

These corrective actions are performed prior to release of the data from the laboratory.  The
corrective actions will be documented in both the laboratory's corrective action log and the
narrative  data report sent from the laboratory.
C.2.   Reports to Management

DES shall provide analytical results to the Ohio EPA Project Manager. Unless otherwise noted, it
will be assumed that all QA/QC results have met the approved Laboratory standards (Ohio EPA
1995 and 2001). Written QC data and appropriate QA/QC reports generated by Ohio EPA, DES
shall be included in the Analytical Data Report provided to the Ohio EPA Project Manager as
needed.

Any serious QA problems needing  immediate decisions will be discussed orally between the
Project Manager and laboratory staff, with such discussions recorded in the overall project file.
These problems will be noted in the final project report.

A summary of QA/QC  information will be provided in the final written report to USEPA. This
report will  include information on adherence of measurements to the QA objectives. The final
report will  contain detailed discussions of QA/QC  issues, including any changes in the QAPP, a
summary of the contract laboratory's QA/QC reports, results of any internal performance audits,
any significant QA/QC problems, detailed information on how well the QA objectives were met,
and their ultimate impact on decision making. The following is a list of items that should be
included in the final project report:

    •   Changes in the QAPP
    •   Results of any internal system audits
    •   Significant QA/QC problems, recommended solutions, and results of corrective actions
    •   Data quality assessment in terms of precision, accuracy, representativeness,
       completeness, and sensitivity
    •   Indication of fulfillment of QA objectives
    •   Limitations on  the use of the measurement data

-------
    D.      Data Validation and Usability

       The project manager will make a final decision regarding the validity and usability of the data
       collected during this project. The project manager will evaluate the entire sample collection,
        analysis, and data reporting processes to determine if the data are of sufficient quality to meet
        project objectives and whether representativeness objectives were met. Data validation involves
       all procedures used to accept or reject data after collection and prior to use. These include
       screening, editing, verifying, and reviewing through external performance evaluation audits.  Data
       validation procedures ensure that objectives for data precision and bias will be met, that data will
       be generated in accordance with the QA project plan and SOPs, and that data are traceable and
       defensible. The process is both qualitative and quantitative and is used to evaluate the project as
       a whole.

       Procedures Used to Validate Field Data
       Procedures to evaluate field data for this project primarily include checking for transcription errors
       and reviewing field logs.

       Procedures Used to Validate Laboratory Data
       The laboratory QA Officer will conduct a systematic review of the analytical data for compliance
       with the established QC criteria based on the spike, duplicate, and blank results provided by the
       laboratory.  All technical holding times will be reviewed, the laboratory analytical instrument
       performance will be evaluated, and results of initial and continuing calibration will be reviewed
       and evaluated.

       The data review will identify any out-of-control data points and data omissions, and the
       Laboratory QA Officer will interact with the laboratory to  correct data.

       The Ohio EPA QA Officer will compare all field and laboratory duplicates for RPD.  Based on the
       results of these comparisons, the QA Officer will determine the acceptability of the data.  One
       hundred percent of the analytical data will be validated.
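
        As one example of the kind of automated screening that could support this review, the sketch below
        compares collection and extraction dates against the holding times in Table 3 and flags exceedances.
        The record layout is an assumption; the authoritative validation criteria are those in Ohio EPA 1995
        and 2001.

            # Illustrative holding-time screen against the limits summarized in Table 3.
            # The record layout is an assumption; 6 months is approximated as 182 days.
            from datetime import date

            HOLDING_DAYS = {"VOCs": 14, "Semi-volatiles/PAHs": 14, "Pesticides/PCBs": 14,
                            "Mercury": 28, "Other metals": 182, "TOC": 14}

            def holding_time_flags(records):
                flags = []
                for rec in records:
                    elapsed = (rec["extracted"] - rec["collected"]).days
                    limit = HOLDING_DAYS[rec["parameter"]]
                    if elapsed > limit:
                        flags.append(f"{rec['sample_id']} {rec['parameter']}: "
                                     f"{elapsed} d exceeds {limit} d holding time")
                return flags

            example = [{"sample_id": "CUY-OC-07", "parameter": "VOCs",
                        "collected": date(2002, 5, 14), "extracted": date(2002, 5, 30)}]
            print(holding_time_flags(example))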


E.     References

Baumann, P., V. Cairns, B. Kurey, L. Lambert, I. Smith and R. Thoma. 2000. Fish Tumors and Other
Deformities. Lake Erie LaMP Technical Report Series. Tech.  Rept. No. 6.

Cuyahoga River Remedial Action Plan Coordinating Committee. 1992. Cuyahoga River RAP Stage One
Report and Appendices.  Ohio EPA, Division of Surface Water, Columbus, Ohio.

Cuyahoga River Remedial Action Plan Coordinating Committee. 1996. Cuyahoga River RAP Stage One
Update Report and Appendices. Ohio EPA, Division of Surface Water, Columbus, Ohio.

Lambert, L. 1998. Impairment Assessment of Beneficial Use: Restrictions on Fish and Wildlife
Consumption. Lake Erie LaMP Technical Report Series. Tech. Rept. No. 2.

MacDonald, D.D., C.G. Ingersoll, and T.A. Berger. 2000. Development and evaluation of
consensus-based Sediment Quality Guidelines for freshwater ecosystems. Archives of Environmental
Contamination and Toxicology, 39:20-31.

Ohio EPA and Heidelberg College. 1998.  Lake Erie Basin Sediment Project Statistical Analysis of Data
Organized by Analyte. Volumes I  and  II. Ohio EPA, Division of Surface Water, Columbus, Ohio.

Ohio EPA. 1991. Manual of Ohio EPA Surveillance Methods and Quality Assurance Practices, Volumes I,
II and III, Division of Environmental Services, Columbus, Ohio.

-------
Ohio EPA. 1994a. Fish Tissue Guidance Manual. Ohio EPA Tech. Bull. MAS/1994-11-1. Ohio EPA,
Division of Surface Water. Columbus, Ohio.

Ohio EPA. 1994b. Biological and Water Quality Study of the Cuyahoga River and Selected Tributaries.
Volumes I and II. Division of Surface Water, Ecological Assessment Section, Columbus, Ohio.

Ohio EPA. 1995. Quality assurance project plans integrated work program. Columbus, Ohio.

Ohio EPA. 2001. Manual of Laboratory Standard Operating Procedures, Volumes I, II and III. Division of
Environmental Services, Columbus, Ohio.

Ohio EPA. 2001. Sediment Sampling Guide and  Methodologies (Second Edition). Ohio EPA, Division of
Surface Water, Columbus, Ohio.

Persaud, D., R. Jaagumagi, and A. Hayton. 1993. Guidelines for the protection and management of
aquatic sediment quality in Ontario. Water Resources Branch, Ontario Ministry of the Environment, Toronto,
Ontario.

Thoma, R.F. 1999. Biological Monitoring and an Index of Biotic Integrity for Lake Erie's Nearshore
Waters, pp. 225-246 in T.P. Simon [Ed]: Assessing the Sustainability and Biological Integrity of Water
Resources Using Fish Communities. CRC Press, Boca Raton, FL.

U.S. EPA. 2001. Methods for Collection, Storage and Manipulation of Sediments for Chemical and
Toxicological Analyses: Technical Manual.  EPA 823-B-01-002. U.S. Environmental Protection Agency,
Office of Water, Washington, DC.

-------
                                       Appendix A

                           Fish Community Sampling Methods
         for Ohio's Lake Erie Nearshore Waters, Harbors, and Lacustuaries

Fish field sample collection methods:

All fish collected will be identified to species, enumerated, examined for external anomalies, and either
returned to the lake, preserved as voucher specimens and stored at the Ohio State University Museum of
Biodiversity, or used for fish tissue samples. Weights will be taken on a representative sub-sample if
more than 15 individuals of a species are captured. All fish will be weighed if 15 or fewer individuals of a
species are captured.

Electro-fishing

Electro-fishing consistently catches more species and individuals, in less time and with less effort, than other
sampling methods used by Ohio EPA. It is a method that can be used under all habitat conditions, thus
yielding a database that is easily comparable (in terms of catch/effort) under the variable conditions
encountered. Previous Ohio EPA work indicates that night electro-fishing captures as many species and
individuals as day electro-fishing in lacustuary habitats.

       General electro-fishing methodologies

A 5.8-meter modified V-hull John boat will be used for electro-fishing.  Electrical current is provided by a
7,000 watt generator and Smith-Root pulsator. Controls  are set on DC current, 60 pulses per second,
240-340 volts, and run at 5 to 6  amps. In low conductivity conditions the voltage is increased to 360-500
volts and the frequency increased to 120 pulses per second in order to maintain 5 to 6 amps. Anodes are
two separately charged electrospheres, 1 m in circumference, suspended from two articulated booms (3 m
total length) supported by distal floats and positioned about 2.1 m in front of the boat, one to the port and
one to the starboard side at angles of approximately 20 degrees from the center line. Sampling will not be
conducted under wave conditions of 0.6 m or more.
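
The voltage and pulse-frequency adjustment described above amounts to a simple decision rule for
holding the target amperage. A hedged sketch of that rule is shown below; deciding when conductivity
counts as "low" is left to the crew, so the boolean input here is an assumption for illustration.

    # Illustrative encoding of the pulsator settings described above: 60 pulses/s at
    # 240-340 V by default, 120 pulses/s at 360-500 V under low conductivity,
    # in both cases targeting 5-6 amps.

    def pulsator_settings(low_conductivity):
        if low_conductivity:
            return {"waveform": "DC", "pulses_per_s": 120,
                    "volts": (360, 500), "target_amps": (5, 6)}
        return {"waveform": "DC", "pulses_per_s": 60,
                "volts": (240, 340), "target_amps": (5, 6)}

    print(pulsator_settings(low_conductivity=True))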

Each sampling site will be 400 to 500 meters long and within 1 meter of the shore.  A set sampling time
will not be used and time may vary between 2,000 and 5,000 seconds.  The greater the number of fish
captured  in  the zone and the greater the complexity of the shore line the longer it will  take to complete the
sample. A crew of three individuals will be used in all electro-fishing efforts. During sampling, one
individual will be positioned on the bow of the boat with a dip net and serve as the principal collector of
fish captured in the electrical field; a second person will be at mid-ship and serve as an assistant who will
collect any fish missed by the principal netter; and the third person will operate the outboard motor and
pulsator controls and collect any fish that surface at the back of the boat. All fish will be placed in live-
wells supplied with fresh water from a pump. Common carp will be placed in their own live-well to avoid
excess oxygen consumption and the death of small fish that otherwise frequently are  trapped in common
carp mouths and crushed.

The anode and cathode array deployments used in this study are different from those used in previous
Ohio EPA sampling efforts. Anodes are two separately charged electrospheres, 1 m  in circumference,
and constructed with two stainless steel salad bowls approximately 2 mm  thick. The  anodes will be
suspended  5 cm below the surface, 2.1 m in front of the boat on articulated booms, one to each side.
Three sets of cathodes, each  to be used at different depths, will be used.  All have electrified portions 1.6
m in length.  The cathode sets are designed to be deployed  with the electrified surface at a maximum of
1.8 m, 3 m, and 7.3 m deep. The 1.8 m cathodes are used under all conditions where bottom depths
are 2.5 m or less, and the 3 m cathodes are used in depths greater than 3 m. Cathodes are deployed in one
of two ways: 1) in most areas  eight cathodes are deployed from the sides  of the boat  at mid-ship (1.8 and
3 m cathodes), four on each side, 2) in ship channel areas, four, 7.3 m depth cathodes are deployed from
the front of the boat.

-------
                        Appendix B




Copies of field data sheets and Chain of Custody Form



         (See hard copy for copies of field data sheets)

-------
                                  Appendix C
                      Sampling SOPs for Sediment
                          SURFICIAL SEDIMENT SAMPLING
                        USING A PONAR DREDGE SAMPLER
                            ONBOARD THE R/V MUDPUPPY

Overview
The following Standard Operating Procedure (SOP) explains the technique for collecting
sediment samples using a ponar dredge sampling device.  The procedures cover the following
activities:

       •   Positioning of Vessel
       •   Securing the Vessel for Sampling
       •   Sample Collection Procedures

NOTE: This SOP only illustrates the technique for collecting surficial sediment samples (i.e., to
an approximate depth of 6 inches).  Details regarding the labeling and transport of sediment
samples should be addressed by individual project leads in the project specific Quality
Assurance Project Plan (QAPP).

Sample Handling and Preservation
Due to the expense of operating a vessel to collect sediment samples and the costs of analytical
sample analysis, every sediment sample is considered important.  Any contamination through
mishandling or lack of preservation could cause biases in data results.

The following considerations should be addressed.
       •   At a minimum, nitrile gloves and Saranac coveralls should be worn during sampling
          and sub-sampling activities.
       •   Sample containers must  be kept free of contamination and should remain sealed
          until use.
       •   Preservatives should be fresh and labeled.
       •   Samples should be stored in coolers and freeze packs or ice as soon as possible.
       •   Cooler temperatures should be checked at a minimum frequency of twice a day to
          ensure that the appropriate temperature is maintained.
       •   Mode of sample transport must maintain the integrity of the samples.

Safety
In any field operation, emphasis should be placed on  safety.  Site operators must be aware of
the potential safety hazards to which they will be subjected.  The Great Lakes National
Program Office has developed a safety manual specific to R/V Mudpuppy operations. All
personnel on board must be familiar with the contents of this document prior to implementing
any data collection activities.  Additionally, onboard personnel must follow all safety protocols
and equipment guidelines, and be prepared for emergency situations. The ship's captain is the
primary authority during vessel operations.  The Captain is also responsible for determining
whether a sampling activity will be undertaken during  inclement weather.  Sampling personnel
are responsible for their safety from  potential hazards including, but not limited to:

-------
       •   Electrical Hazards.  For obvious hazards or problems (fire, scorching, blown fuses,
          etc.), turn off power for the circuit involved and notify the Captain who will take the
          lead in responding to the hazard. Never attempt electrical repairs.
       •   Personal Protective Equipment. Sampling personnel should be prepared to work
          with large and/or heavy equipment where safety shoes, head, hand, protective
          clothing (Tyvek or Saranac suits), and eye protection are necessary. Some sites
          may require respirators.  Sampling personnel should have clothing available in the
          event of weather extremes.
       •   Sampling Equipment.  Never force glassware with unprotected hands. Care should
          be taken around the sampling equipment to avoid injuries and slipping overboard
          while positioning sampling equipment.
       •   Chemical Dangers.  Organic solvents and acids are occasionally used to
          decontaminate sampling equipment. These materials should not be ingested nor
          come into contact with bare skin or flames (if flammable).  Sampling personnel
          should be familiar with the Material Safety Data Sheets for each and every chemical
          used during sampling.

Equipment and Supplies
The following equipment and supplies are required for the collection of a single sediment
sample at a typical sampling location.
       •   Ponar Dredge Sampler
       •   Winch
       •   Stainless Steel Bowls, Spoons, and Spatulas
        •   High Density Polyethylene (HDPE) Sediment Sample Bottles(1)
       •   Glass Sample Bottles (for Organic Contaminant Samples)(1)
       •   Coolers/Ice Chests(1)
       •   Sample Labels(1)
       •   Indelible Markers and Pencils(1)
       •   Global Positioning  System or Locational Equipment
       •   Generator (230 Volt, 60  Hz, 3-Phase,  14 Amps)
       •   Safety Equipment (including:  Hard  Hat with Face Shield, gloves, safety glasses(2),
          Saranac/Tyvek suit, Steel-Toed  Boots(2), and Boot Covers)
Notes:
(1)  Must be provided by Grantee
(2)  Must be provided by each individual user

Sample Collection Procedures
A sampling activity may consist of collecting more than one type of sample at a particular site.
This procedure will detail the collection of surficial sediment samples (to an approximate depth
of 6 inches) from a single location. When benthic organism samples are being collected at the
same site, it is important to collect the benthic organism samples prior to the collection of
sediment samples in order to minimize the disturbances to the benthic organisms.

Every attempt should be made to stabilize the vessel as much as possible in order to collect
vertical sediment samples. When the vessel is moving, the dredge may enter the sediments at
an angle.

-------
Sample Location
The sample location may be defined either prior to sampling, where the R/V Mudpuppy would
be destined for a specific location, or the site may be selected during the sampling process.
Generally, sites can be located with an onboard GPS system that is accurate to within 1-3
meters.  If the vessel is headed to a pre-determined site, the locational equipment shall be used
to locate the site. However, the actual locational readings for the site should be recorded after
the vessel is anchored at the sampling site since waterway conditions, obstructions, boat traffic,
or other circumstances may influence the exact sampling location.  The ultimate location of the
sample collection shall be determined by the Captain of the R/V Mudpuppy in consultation with
the sampling crew.
Securing the Vessel
Once at the sampling site, the vessel must be secured in order to avoid drifting and rotation that
could cause the sample device to enter the sediment at an angle.

Procedures are as follows:
       •   Triple anchor the vessel as instructed by the Captain
       •   Establish the exact sampling location with locational equipment (if location does not
          need to be accurately predetermined, the reading can be taken during sampling
          activities).

Sampling Procedures - Ponar Dredge Sampler

The following sampling procedure is used to collect surficial sediment samples on the R/V
Mudpuppy.

1.  Measure and Record Water Depth.
2.  Remove the straight (locking) bolt from the sampler,  place Ponar into its open position and
   insert the spring loaded  bolt into the assembly.
3.  Lift the ponar sampler into a vertical position using the Winch so that the sampler is
   suspended just off of the bow of the sampling vessel.
4.  Lower the sampler with the winch until the sampler is imbedded in the sediments.
5.  Using your hands, pull up on the winch cable to close the sampler and raise the ponar
   above the sediment surface.
6.  Maintain hand tension on the cable while slowly reversing the winch to remove the slack
   from the cable.
7.  Slowly lift the sampler back above the water surface until the sampler is in a vertical position
   just off of the bow of the sampling vessel
8.  Tip the ponar sampler to drain excess water from the sample.
9.  Place a stainless steel bowl under the ponar sampler and lower the sampler into the bowl.
10. Press down on hinge arms  to open sampler and discharge the sediment sample.
11. Replace straight (locking) bolt into the ponar sampler and set aside.
12. Handle and subsample the sediment sample as described in the QAPP.
13. Wash the sampler, bowl, spoons, deck, and other equipment using site water pumped through
    onboard garden hoses or include other decontamination as required by the project QAPP.
    (Note:  The R/V Mudpuppy and its crew supply only site water and hoses for
   decontamination. Accompanying personnel are responsible for providing all other
   decontamination equipment and supplies required by the QAPP.)

-------
              LONG CORE SEDIMENT SAMPLING USING A VIBRO-CORER
                            ONBOARD THE R/V MUDPUPPY

Overview
The following Standard Operating Procedure (SOP) explains the technique for collecting
sediment core samples (up to 15 feet in length) using a Rossfelder Vibro-corer. The procedures
cover the following activities:

       •   Positioning of Vessel
       •   Securing the Vessel for Sampling
       •   Sample Collection Procedures

Questions regarding sampling methods or operation of sampling equipment should be directed
to:

       Dr. Marc Tuchman
       U.S. EPA Great Lakes National Program Office
       77 West Jackson Blvd. (G-17J)
       Chicago, IL 60604
       Phone:  312/353-1369
       Fax:  312/353-2018

NOTE: This SOP only illustrates the technique for sampling and sub-sampling long sediment
core samples (i.e., up to 15 feet in length). Details regarding the labeling and transport of
sediment samples should be addressed by individual project leads in the project specific Quality
Assurance Project Plan (QAPP).

Sample Handling and Preservation
Due to the expense of operating a vessel to collect sediment samples and the costs of analytical
sample analysis, every sediment sample is considered important. Any contamination through
mishandling or lack of preservation could cause biases in data results.

The following considerations should be addressed.
       •   At a minimum, nitrile gloves and Saranac coveralls should be worn during sampling
          and sub-sampling activities.
       •   Sample containers must be kept free of contamination and should remain sealed
          until use.
       •   Preservatives should be fresh and labeled.
       •   Samples should be stored in coolers and freeze packs or ice  as soon as possible.
       •   Cooler temperatures should be checked at a minimum frequency of twice a day to
          ensure that the appropriate temperature is maintained.
       •   Mode of sample transport must maintain the integrity of the samples.

Safety
In any field operation, emphasis should be placed on safety.  Site operators must be aware of
the potential  safety hazards to which they will be subjected.   The Great  Lakes National
Program Office has developed a safety manual specific to R/V Mudpuppy operations.  All
personnel on board must be familiar with the contents of this document prior to implementing
any data collection activities. Additionally, onboard personnel must follow all safety protocols

-------
and equipment guidelines, and be prepared for emergency situations.  The ship's captain is the
primary authority during vessel operations. The Captain is also responsible for determining
whether a sampling activity will be undertaken during inclement weather.  Sampling personnel
are responsible for their safety from potential hazards including, but not limited to:

       •   Electrical Hazards.  For obvious hazards or problems (fire, scorching, blown fuses,
          etc.), turn off power for the circuit involved and notify the Captain who will take the
          lead in responding to the hazard. Never attempt electrical repairs.
       •   Personal Protective Equipment. Sampling personnel should be prepared to work
          with large and/or heavy equipment where safety shoes, head, hand, protective
          clothing (Tyvek or Saranac suits), and eye protection are necessary. Some sites
          may require respirators. Sampling personnel should have clothing available in the
          event of weather extremes.
       •   Sampling Equipment.  Never force glassware with unprotected hands. Care should
          be taken around the sampling equipment to avoid injuries and slipping overboard
          while positioning sampling equipment.
       •   Chemical Dangers.  Organic solvents and acids are occasionally used to
          decontaminate sampling equipment. These materials should not be ingested nor
          come into contact with  bare skin or flames (if flammable). Sampling personnel
          should be familiar with  the Material Safety Data Sheets for each and every  chemical
          used during sampling.

Equipment and Supplies
The following equipment and supplies are required for the collection of a single sediment core at
a typical sampling location.

       •   Vibrocorer
       •   Rolling Box for Vibrating Head
       •   Winch
       •   4" Acetate Butyrate Core Tubes
       •   Stainless Steel Bowls, Spoons, and Spatulas
        •   High Density Polyethylene (HDPE) Sediment Sample Bottles(1)
       •   Glass Sample Bottles (for Organic Contaminant Samples)(1)
       •   Coolers/Ice Chests(1)
       •   Sample Labels(1)
       •   Indelible Markers and Pencils(1)
       •   Global Positioning System or Locational Equipment
       •   Generator (230 Volt, 60 Hz, 3-Phase,  14 Amps)
       •   Heavy Duty Riveter and Steel Rivets
       •   Battery Powered Cordless Drill
       •   Battery Powered Cordless Saw
       •   Safety Equipment (including:  Hard Hat with Face Shield, gloves, safety glasses(2),
          Saranac/Tyvek suit, Steel-Toed Boots(2), and Boot Covers)
       •   Core Caps
Notes:
(1) Must be provided by Grantee
(2) Must be provided by each individual user

-------
Sample Collection Procedures
A sampling activity may consist of collecting more than one type of sample at a particular site.
This procedure will detail the collection of long sediment core samples (up to 15 feet) from a
single location. When benthic organism samples are being collected at the same site, it is
important to collect the benthic organism samples prior to the collection of sediment samples in
order to minimize the disturbances to the benthic organisms.

Every attempt should be made to stabilize the vessel as much as possible in order to collect
vertical sediment cores.  When the vessel is moving, the cores may enter the sediments at an
angle.

Sample Location
The sample location may be defined either prior to sampling, where the R/V Mudpuppy would
be destined for a specific location, or the site may be selected during the sampling process.
Generally, sites can be located with an onboard GPS system that is accurate to within 1-3
meters.  If the vessel  is headed to a pre-determined site, the locational equipment shall be used
to locate the site.  However, the actual locational readings for the site should be recorded after
the vessel is anchored at the sampling site since waterway conditions, obstructions, boat traffic,
or other circumstances may influence the exact sampling  location.  The ultimate location of the
sample collection shall be determined by the Captain of the R/V Mudpuppy in consultation with
the sampling crew.
Securing the Vessel
Once at the sampling site, the vessel must be secured in order to avoid drifting and rotation that
could cause the coring device to enter the sediment at an angle.

Procedures are as follows:
       •  Triple anchor the vessel as instructed by the Captain
       •  Establish the exact sampling location with locational equipment (if location does not
          need to be accurately predetermined, the reading can be taken during sampling
          activities).

Sampling Procedures - Vibrocorer

The following sampling procedure is used to collect long core sediment samples on the R/V
Mudpuppy.

1.  Measure and Record Water Depth.
2.  Lift the Vibrating Head into a vertical position using the Winch so that the Vibrating Head is
   Suspended Just Off of the Bow of the Sampling Vessel.
3.  Insert the Core Tube into the Vibrating Head, Making sure that the Tube Slides into the
   Check Valve.
4.  Tighten the Collar to the Vibrocorer (Two Bolts on Each Side of the Assembly).
5.  Lower the Entire Assembly until the Core Nose is just above the Sediment Surface. Turn on
   the Generator and  the Vibrating Head.
6.  Slowly Lower the Vibrocorer by running out 6-10 Inches of cable at a time.  Monitor Core
   Tube Penetration by feeling  for slack in the cable.
7.  When the Vibrocorer ceases to penetrate the sediment (i.e., the Unit stops lowering or is
   "Refused"), or the Vibrating  Head is near the sediment surface, reverse the winch and pull

-------
   the unit from the sediment.  Do Not Allow the Vibrating Head to become Imbedded in the
   Sediment.
8.  Turn off Power to the Vibrating Head and the Generator when the Core "Breaks Free" of the
   Sediment Surface.
9.  Lift the Entire Assembly so that the Sediment/Water Interface is Visible.  Drill Holes through
   the Core Tube at the Sediment/Water Interface to Decant Water from the Tube.
10. Disengage the Core Tube.
11. Lay Sediment Core on the Deck of the Vessel, Saw off the Excess Core Tube at the
   Sediment Surface and Cap the Top  of the Tube with a Red Cap Plug.
12. Lower the vibracore head back into  its holding cart.
13. Handle and Subsample the Sediment Core as Desired, Either On-Board the R/V Mudpuppy
   or at a Shore Location.  (R/V Mudpuppy is equipped to roughly section the sediment core
   into sub-sections of six (6) inches or greater using a battery powered circular saw.
   Subsamples can  be homogenized onboard using stainless steel bowls and  spoons.)
14. Dispose of excess sediments back into  the water body.
15. Wash the sampler, bowl, spoons, deck, and other equipment using site water pumped through
    onboard garden hoses or include other decontamination as required by the project QAPP.
    (Note:  The R/V Mudpuppy and its crew supply only site water and hoses for
   decontamination. Accompanying personnel are responsible for providing all other
   decontamination equipment and supplies required by the QAPP.)

-------
                                        Appendix D

                                   Fish Tissue SOP
                   Fish Tissue  Monitoring Program
                              Guidance Manual
                         Ohio EPA Technical Bulletin MAS/1994-11-1
                        State of Ohio Environmental Protection Agency
                             Monitoring and Assessment Section
                                Ecological Assessment Unit
                                   1685 Westbelt Drive
                                  Columbus, Ohio 43228
                                    November 30, 1994
  (Includes only pages 5 and 6 of the manual, referencing collection of whole body fish tissue samples)

Fish Tissue Sample Collection Methodology

In this section we will discuss the actual methods used to prepare a fish tissue sample following the
collection of the fish from the waterbody. As discussed previously, there are three different types of fish
tissue composite samples.

Whole Body Composite (WBC)
Skin-on Fillet Composite (SOFC)
Skin-off Fillet Composite (SFFC)

We  will discuss each of these three types  of composite samples in detail exploring the types of fish
species to collect, the length categories to collect, and sample preparation techniques. In all cases, the
lab requires about 150 grams of sample to perform the required analyses. Make sure to obtain enough
fish for a properly sized sample if possible.

Whole Body Composites (WBC)

The whole body composite sample is the easiest of the three types of samples to prepare because it does
not involve cutting the fish. Instead, the fish  is preserved whole for later analysis. Ohio EPA is the only
agency that should be obtaining WBC samples as they are the sole agency overseeing the Fish Tissue
Baseline Monitoring Program.

-------
The primary fish species we are interested in collecting for WBC samples include (in order of preference):

Common carp (Cyprinus carpio)
Channel catfish (Ictalurus punctatus)
Bullheads (Ameiurus spp.)
Redhorse (Moxostoma spp.)
Buffalo (Ictiobus spp.)
White sucker  (Catostomus commersoni)

The WBC sample consists of up to five (5) fish of the same species, with a minimum of two (2) fish per
sample. An attempt should be made to collect the same species within an entire subbasin.  In order to
reduce variability in the sample, it is also important to collect fish of the same length grouping.  To
accomplish this, we ask that each of the fish be premeasured to determine their appropriateness for the
sample. The smallest fish in the WBC sample should be within 10% of the total length of the
largest fish chosen for the sample. For example:

On a sampling run in Mack Creek you collect 8 carp of the following lengths (in millimeters): 325, 330,
340, 350, 360, 600, 650, and 660. Which ones do you use for the WBC sample? There are two obvious
length groups to choose from here.

The 325, 330, 340, 350, and 360 mm carp (325 mm is within 36 mm of 360 mm), or the 600, 650, and
660 mm carp (600 mm is within 66 mm of 660 mm). The larger fish are probably older and therefore have
been exposed to pollutants longer (although this is not always true) and may provide a better sample than
the smaller fish.

Since length  correlates well with age, we can be reasonably sure that fish of a similar  length  are of a
similar age  class  and have therefore  been "exposed" to  contamination  in their  surroundings  for
approximately the same amount of time (if possible, it is  also preferable to have fish from the same size
range throughout the entire subbasin).  These same two  generalizations will be true for the two types of
fillet composite samples (SOFC,  SFFC).
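
The length-grouping rule above can be expressed as a simple screening check. The following sketch is
illustrative only and is not part of the Ohio EPA manual; the function name and the 10% tolerance
parameter are assumptions based on the rule as described:

    def group_by_length(lengths_mm, tolerance=0.10, min_fish=2, max_fish=5):
        """Group candidate fish lengths (mm) into composites so that the smallest
        fish in each group is within `tolerance` (10%) of the largest fish's length."""
        groups = []
        for length in sorted(lengths_mm, reverse=True):   # work from the largest fish down
            for group in groups:
                if length >= group[0] * (1 - tolerance) and len(group) < max_fish:
                    group.append(length)
                    break
            else:
                groups.append([length])
        # Only groups with at least two fish can form a WBC sample.
        return [g for g in groups if len(g) >= min_fish]

    # Example from the manual: eight carp collected in Mack Creek.
    print(group_by_length([325, 330, 340, 350, 360, 600, 650, 660]))
    # [[660, 650, 600], [360, 350, 340, 330, 325]]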

Once you have collected the appropriate fish and determined which ones are to be kept for the sample,
then you are ready to prepare the WBC sample. First, obtain a bucket of water from the stream from which
the fish were removed. Dip the fish in the water to rinse any residue (sediment, organic matter, etc.) off of the
outside of the fish. Sacrifice the fish by applying a blow to the nape using the fiberglass fish club. This will
break the fish's neck and kill it.  Take care to strike the fish with some restraint, as you do not want to
cause bleeding or other unnecessary fluid losses. It does not take a heavy blow to break the neck of most
fish, except carp. Try lighter blows at first to experiment with the right amount of force.

After sacrificing the fish, weigh the entire fish on the 1000 gram (to the nearest gram) or 10 kg scale  (to
the nearest 50 grams)  and measure the length again,  to the nearest millimeter using  the  measuring
board. These  weights and measures should then be recorded on the sample label.

Following this, wrap the fish in decontaminated aluminum foil (decontaminated side in towards the fish)
and tape  the ends of the  foil to secure the wrapping.  Number the fish according to its place  in the
composite sample (i.e.,  1 of 5, 2  of 5,...5 of 5) with a permanent marker. Wrap the fish together with tape
and affix the  sample label ensuring that the label is securely fixed to the sample (NOTE: If the fish
happen to be  large, you may need to label each individual fish in the composite as it will be impossible to
tape all of the fish together and store them properly). Quickly place the fish  in the sample  cooler after
affixing the label.

The cooler should be filled with  about 40 pounds of dry ice  (enough for about 4 days in  the field) which
will serve to freeze the  fish. Dry  ice  should be  placed on the top of the  fish initially to provide for quicker
freezing. Keep an eye on the dry ice to ensure that it does not run too low, thus causing  the fish to thaw
(Safety Note: Dry ice is very cold! Do not handle it with your bare hands or else severe burns may result.
Ask the dry ice vendor to wrap each  20  pound block in paper as this will ease handling.)

-------
                    Appendix E




        Analytical Methods and MDLs



(See hard copy. Appendix E is not available electronically.)

-------
Kinnickinnic River QAPP, Draft, April 2002
                                Quality Assurance Project Plan
                                   Title and Approval Sheet
                                        Draft, April 2002
                      Kinnickinnic River Sediment Sampling in FY2002:

                                     IAG#DW96947964-01
                             Demaree Collier, Project Officer, GLNPO
                     77 West Jackson Blvd (G-17J), Chicago, Illinois 60604-3590,
                             Phone:312-886-0214, Fax: 312-353-2018
                      Paul Baxter, Project Coordinator, U.S. Army Corps of Engineers
                       Detroit District, 477 Michigan Ave., Detroit, Michigan 48226,
                              Phone (313) 226-7555, Fax (313) 226-2013
                                 Xiaochun Zhang, Project Investigator,
                              Wisconsin Department of Natural Resources
                                      101 Webster St., Box 7921
                                      Madison, WI 53707-7921
                                       Phone: (608) 264-8888
                                        Fax (608) 267-2800
                      Ian Kerr, Project Manager, Altech Environmental Services, Inc.
                    24209 Northwestern Hwy., Suite 230, Southfield, Michigan 48219,
                              Phone (248) 353-3832, Fax (248) 353-5485
                     Ann Preston, Project Manager, Trace Analytical Laboratories, Inc.
                        2241 Black Creek Rd., Muskegon, Michigan 49444-2673,
                          Phone (231) 773-5998, Ex. 224, Fax (231) 773-6537
                      Scott Strigel, Project Geologist, Coleman Engineering Company
                        635 Industrial Park Drive, Iron Mountain, Michigan 49801
                              Phone (906) 774-3440, Fax (906) 774-7776
                           Louis Blume, Quality Assurance Manager, GLNPO
                            77 W. Jackson Blvd. (G-17J), Chicago, IL 60604
                              Phone (312) 353-2317, Fax (312) 353-2018

-------
Kinnickinnic River QAPP, Draft, April 2002

                                  QAPP Distribution List
Demaree Collier, Project Officer
USEPA- GLNPO
77 West Jackson Blvd (G-17J)
Chicago, Illinois 60604-3590
phone: 312-886-0214
E-mail: Collier.Demaree@epamail.epa.gov

Paul Baxter, Project Coordinator
U.S. Army Corps of Engineers - Detroit District
477 Michigan Avenue
Detroit, MI 48226
phone: 313-226-7555
E-mail: paul.r.baxter@lre02.usace.army.mil

Xiaochun Zhang,
Project Investigator
Wisconsin Department of Natural Resources
101 Webster St., Box 7921
Madison, WI 53707-7921
Phone : (608)264-8888
E-mail:  ZHANGX@DNR.STATE.WI.US
E-mail: ZhangX@mail01.dnr.state.wi.us

Ian Kerr
Altech Environmental Services, Inc.
24209 Northwestern Hwy., Suite 230
Southfield, Michigan 48219
Phone(248)353-3832
E-mail: ikerr@altechenvironmental.com

Ann Preston, Project Manager
Trace Analytical Laboratories, Inc.
2241 Black Creek Rd., Muskegon, Michigan 49444-2673
phone: (231) 773-5998, Ex. 224
E-mail: traceanalytical@mad.scientist.com

 Scott Strigel
Coleman Engineering Company
635 Industrial Park Drive, Iron  Mountain, Michigan 49801
Phone (906) 774-3440
E-mail:  colemanengineering@uplogon.com

Lou Blume, QA Manager
USEPA- GLNPO
77 West Jackson Blvd (G-17J)
Chicago, Illinois 60604-3590
phone: 312-353-2317
E-mail: blume.louis@epa.gov

-------
Kinnickinnic River QAPP, Draft, April 2002

Table of Contents

QAPP Signature Page   .........        1
QAPP Distribution List .........        2
Table of Contents      .........        3

1.      Summary      .........        5
       1.1    Purpose        ........        5
       1.2    Background                  ......        5
       1.3    Project Organization   .......        5
2.      Project Description     .       .               .....        9
       2.1    Data Uses and Expected Measurements. .       ....        9
       2.2    Criteria and Objectives .       .        .       .       .       .       .        11
              2.2.1    Sediment Chemistry    .        .       .       .       .       .        11
       2.3    Special Personnel, Training, and Equipment Requirements      .       .        13
       2.4    Project Schedule       .......        13
3.      Sampling Plan  .........        13
       3.1    Sampling Network Design and Rationale       .       .       .       .        13
       3.2    Definition of Sample Types    ......        14
       3.3    Type and Number of Samples  ......        14
       3.4    Field Data Collection   .       .        .       .       .       .       .        15
4.      Sample Collection and Handling.       .        .       .       .       .       .        16
       4.1    Sample Collection.     .......        16
              4.1.1    Sediment Cores.       ......        16
              4.1.2    Sediment Grab Samples.        .....        16
               4.1.3    Hand Augered Samples .        .....        17
       4.2    Sample Handling      .......        17
              4.2.1    Sample Processing     ......        17
              4.2.2    Equipment Decontamination    .....        17
              4.2.3    Sample Containers     .        .       .       .       .       .        17
              4.2.4    Sample Labeling       ......        18
              4.2.5    Shipping and Chain-of-Custody .       .       .       .       .        19
              4.2.6    Receipt of Samples     ......        20
5.      Laboratory Analysis    ........        20
       5.1    Analytical Methods    .......        20
       5.2    Data Quality Objectives.       ......        20
              5.2.1    Method Detection Limits and Level of Quantitation     .       .        21
              5.2.2    Bias   ........        22
              5.2.3    Precision      .......        22
              5.2.4    Accuracy      .......        23
              5.2.5    Representativeness     ......        23
              5.2.6    Comparability  .......        23
              5.2.7    Completeness         ......        24
6.      Documentation and Records    .......        25
              6.1     Field Documentation   ......        25
              6.2     Laboratory Reports     ......        25
7.      Special Training Requirements .......        27
8.      Quality Control Requirements  .......        27
       8.1    Instrument/Equipment Testing,  Inspection, and Maintenance Requirements       28
       8.2    Instrument Calibration and Frequency   .       .       .       .       .        28
       8.3    Inspection/Acceptance Requirements for Supplies and Consumables     .        29
       8.4    Data Management      .......        30
       8.5    Data Acquisition Requirements (Non-Direct)    .       .       .       .        31

-------
Kinnickinnic River QAPP, Draft, April 2002

Table of Contents Continued

9.      Assessment and Oversight       ........        32
       9.1     Assessment and Response Actions       ......        32
       9.2     Reports to Management   ........        34
10.    Data Validation and Usability   ........        35

List of Figures
Figure 1.      Organizational Chart
Figure 2.      Project Location Map
Figure 3.      Sampling Station Locations
Figure 4.      Example Sample Label
Figure 5.      Kinnickinnic River Illustration of Sediment Boring

List of Tables
Table 1.       Kinnickinnic River Approximate Station Boring Depths    .       .        10
Table 2.       Kinnickinnic River Sediment Testing Requirements        .       .        11
Table 3.       Kinnickinnic River TCLP Metals Requirements     .       .       .        11
Table 4.       Kinnickinnic River TCLP Volatiles Requirements  .       .       .        12
Table 5.       Kinnickinnic River TCLP Semivolatiles Requirements      .       .        12
Table 6.       Kinnickinnic River TCLP Pesticides Requirements .       .       .        12
Table 7.       Kinnickinnic River TCLP Herbicides Requirements .       .       .        12
Table 8.       Tentative Project Schedule      .       .       .       .       .        13
Table 9.       Summary of Data and Analyses at Sampling Locations      .       .        14
Table 10.      Summary of Type and Number of Samples to be Collected   .       .        15
Table 11.      Sample Container and Preservation Requirements  .       .       .        18
Table 12.      Addresses for Shipment of Samples       .       .       .       .        20

Appendices
Appendix A:  Example Chain of Custody Form
Appendix B:  Minimum QA/QC Checklist for Post-Sampling Data Evaluation
Appendix C:  Cooler Receipt Form
Appendix D:  Shipping Container Checklist Summary
Appendix E:  Sampling Station Coordinates

-------
Kinnickinnic River QAPP, Draft, April 2002                                                             5

1.     SUMMARY

1.1    Purpose

This document outlines a plan for sampling to determine the extent of
contamination in a stretch of the Kinnickinnic River, Milwaukee, Wisconsin. The data collected
during this study will be used to evaluate alternatives for environmental clean-up of the
contaminated sediments. Proposed testing for this sampling includes PCBs, PAHs, TOC, and
Toxicity Characterization. Samples for particle size distribution, hydrometer, loss upon ignition,
and Atterberg limits will be collected but not analyzed within the scope of this project.

Primary Objective:  Collect sufficient data to ascertain the current state of contamination in a
stretch of the Kinnickinnic River.

Secondary Objective: Collect samples for geotechnical analysis to generate sufficient data to
create plans and specifications for dredging the contaminated sediments.

1.2    Background

Site Location
The Kinnickinnic River flows primarily east through the southeast corner of Wisconsin, discharging
into Lake Michigan after converging with the Milwaukee River in Milwaukee, Wisconsin.
The stretch of the Kinnickinnic River subject to this sediment sampling investigation begins at
the State Highway 32 bridge and extends approximately 1,700 ft. upstream to the Becher Street
bridge.

Historical Sampling
A limited amount of sediment chemistry data is available to document contamination conditions
at the site.  If available, the following historical data sets will be evaluated as part of this project:

           1.  Wisconsin DNR sediment testing;
           2.  University of Wisconsin-Milwaukee sediment testing;
           3.  2001 Tetra Tech EM, Inc. (EPA Superfund sampling).

The above sediment sampling and analytical projects revealed that sediments within this project area
are contaminated to a depth of at least 7 feet, sufficient to warrant further investigation.
This project is designed to delineate the full extent of contamination within the project area.

1.3    Project Organization

Figure 1 summarizes the project organization. A description of the
duties of each individual is provided below.

-------
Kinnickinnic River QAPP, Draft, April 2002

Figure 1.  Organizational Chart

       Project Officer:  Scott Cieniawski, USEPA-GLNPO, 312-353-9184
       Project Manager:  Demaree Collier, USEPA-GLNPO, 312-886-0214
       GLNPO QA/QC Manager:  Louis Blume, USEPA-GLNPO, 312-353-2317
       Project Coordinator:  Paul Baxter, U.S. Army Corps of Engineers - Detroit District, 313-226-7555
       Site Coordinator:  Xiaochun Zhang, Wisconsin DNR
       Field Support (WDNR):  Steve Westenbroek, Marsha Burzynski
       Analytical Services Coordinator:  Ian Kerr, Altech Environmental Services, Inc., 248-353-3832
       Sediment/Water Chemistry Analysis:  Ann Preston, Trace Analytical Laboratories, Inc., 231-773-5998
       Sample Collection/Geotechnical:  Scott Strigel, Coleman Engineering Company, 906-774-3440

USEPA-GLNPO
USEPA-GLNPO is the principal investigating agency for this sediment survey. It is
responsible for coordination and approval of the Scope of Work and QAPP and is the
principal client for the final data. USEPA-GLNPO staff associated with this project include:
Person:                                        Responsibilities:

Scott Cieniawski                               Coordinate Project Funding
Project Manager                                Project Management
77 W. Jackson Blvd. (G-17J)                    Review/Approve QAPP
Chicago, IL 60604
Phone: 312-353-9184
Cieniawski.Scott@epamail.epa.gov

-------
Kinnickinnic River QAPP, Draft, April 2002
Demaree Collier                                Perform Project Management Tasks
Project Manager                                Review/Approve QAPP
77 W. Jackson Blvd. (G-17J)
Chicago, IL 60604
phone: 312-886-0214
collier.demaree@epa.gov

Louis Blume
GLNPO QA Manager
77 W. Jackson Blvd. (G-17J)
Chicago, IL 60604
phone: 312-353-2317
blume.louis@epa.gov
U.S. Army Corps of Engineers - Detroit District
The U.S. Army Corps of Engineers will provide laboratory contracting support and an initial
QA/QC review of the project data. The USACE representative will be responsible for
developing a project Scope of Work and QAPP, contracting the collection and analysis of
sediment samples, contacting USEPA-GLNPO/WDNR regarding any concerns with the data
received from the laboratories, and advising USEPA-GLNPO/WDNR regarding any concerns
expressed by the laboratory.  The USACE individual involved in this project is:
Person:                                        Responsibilities:
Paul Baxter                                    Prepare Scope of Work and Project QAPP
Project Coordinator                            Contract Sampling and Analytical Testing
U.S. Army Corps of Engineers                   Perform Contract Management Activities
477 Michigan Ave.                              Initial QA/QC Review
Detroit, Michigan 48226
313-226-7555
Paul.R.Baxter@lre02.usace.army.mil
Wisconsin DNR
The Wisconsin DNR (WDNR) will provide oversight/coordination and field support to this
project.  Field support will be provided during sediment sampling. WDNR will also provide
historical data, results, and information on sampling and analysis methods used during historical
studies.  WDNR will be responsible for evaluation of analytical test results, which will be
submitted to the EPA-GLNPO Project Manager. The WDNR staff involved in this project are:
Person:                                        Responsibilities:
Xiaochun Zhang                                 Oversee/Coordinate Project Sampling Activities
Project Investigator - WDNR                    Provide Information on Historical Sampling Results
Wisconsin DNR                                  and Analytical Methods
101 Webster St., Box 7921                      Review/Approve QAPP
Madison, Wisconsin 53707-7921                  Analyze Data and Prepare Report
608-264-8888
ZhangX@mail01.dnr.state.wi.us

-------
Kinnickinnic River QAPP, Draft, April 2002
Steve Westenbroek                              Provide Field and Technical Support
Water Resources Engineer
Wisconsin DNR
2300 N. Dr. Martin Luther King Jr. Dr.
Milwaukee, Wisconsin 53212
414-263-8576
westes@dnr.state.wi.us

Jim Killian                                    Provide Field and Technical Support
Water Resources Specialist
Bureau of Watershed Management
Wisconsin DNR
101 S. Webster St., Box 7921
Madison, Wisconsin 53707-7921
608-264-6123
killij@dnr.state.wi.us

Marsha Burzynski                               Provide Field and Technical Support
Water Resources Planner
Southeast Region-Milwaukee
Wisconsin DNR
2300 N. Martin Luther King Jr. Dr.
Milwaukee, Wisconsin 53212
414-263-8708
Laboratory
Laboratory analyses for this project will be performed by Trace Analytical Laboratories, Inc.
Altech Environmental Services, Inc. (Altech) will coordinate analytical services from the
laboratory under a contract agreement with the USACE. Altech will be responsible for sub-
contracting for sample analysis. Trace Analytical Laboratories, Inc. will have sample analysis and
review responsibilities on this project. The laboratory will have its own provisions for
conducting an internal QA/QC review of the data before it is released to the U.S. Army Corps of
Engineers.  The laboratory contract supervisor listed below will contact the USACE project
coordinator with any data concerns.

Written QA/QC reports will be filed by the analytical laboratory when data are submitted to the
USACE. Corrective actions will be reported to the USACE project coordinator along with the
QA/QC report (see Section 9). The laboratory may be contacted directly by USEPA or USACE
personnel to discuss QA concerns. Altech will act as laboratory coordinator on this project, and all
correspondence from the laboratory should be coordinated with Altech. Responsibilities of the
laboratory and the laboratory coordinator are provided below:
Person:                                        Responsibilities:
Ian Kerr                                       Review/Approve QAPP
Project Manager                                Review Final Analytical Report
Altech Environmental Services, Inc.            Ensure Sub-Contract Laboratory Resources are
313-535-7882                                   Available on an As-required Basis
ikerr@altechenvironmental.com

-------
Kinnickinnic River QAPP, Draft, April 2002
Ann Preston                                    Coordinate Chemical Analyses
Client Services Manager                        Ensure Laboratory Resources are
Trace Analytical Laboratories, Inc.            Available on an As-required Basis
2241 Black Creek Rd.                           Supply Required Sample Bottles
Muskegon, Michigan 49444-2673                  and Coolers (including temperature blanks)
231-773-5998                                   Review/Approve QAPP
traceanalytical@mad.scientist.com

Gregory J. Hayes                               Perform QA/QC Review on Analytical Test Results
QA/QC Manager                                  Prepare a QA/QC Case Narrative
Trace Analytical Laboratories, Inc.
2241 Black Creek Rd.
Muskegon, Michigan 49444-2673
traceanalytical@mad.scientist.com
Sample Collection
Sample collection will be accomplished utilizing the services of Coleman Engineering Company
(Coleman). Coleman will perform sample collection under the direction of the on-site field
coordinator.  Coleman will be responsible for preparing sample boring logs and supplying all
equipment necessary for sample collection and containers for geotechnical testing of the sediments.
Coleman's on-site project manager is:

Person:                                        Responsibilities:
Scott Strigel                                  Review/Approve QAPP
Project Manager                                Coordinate and Perform Sample Collection
Coleman Engineering Company
635 Industrial Park Dr.
Iron Mountain, Michigan 49801
906-774-3440
colemanengineering@uplogon.com

2.     Project Description
2.1    Data Uses and Expected Measurements

GLNPO proposes an assessment of contamination. Work will be coordinated with the
Wisconsin DNR to ensure that the extent of contamination is thoroughly mapped. The proposed
work components are summarized below.

Determination of Existing Data Availability
The first step of this project will be to identify sources and availability of data for this site.  The
Wisconsin DNR, the University of Wisconsin-Milwaukee, and the U.S. EPA have collected data at or near
the project site.

GLNPO will coordinate with Wisconsin DNR to determine the quality and availability of
existing data.

Sediment Chemistry Sampling
Sediment chemistry sampling will consist of the collection of sediment core samples at
approximately 16 locations. Two (2) of the locations will be upstream of the project area and 14

-------
Kinnickinnic River QAPP, Draft, April 2002
                                                                                      10
of the locations will be within the project area. All sediment cores will be sectioned into sub-
samples of 0'-2', 2'-4', 4'-6', 6'-8', 8'-10', 10'-12', 12'-14', 14'-16', 16'-18', and 18'-19'. At
four locations cores will extend an additional 7 ft for geotechnical testing. It is anticipated there
will be a minimum of 5 and a maximum of 10 samples per boring for chemical testing. The two
locations upstream of the project area will be surficial grab (ponar) samples.  All sediment
samples collected will be analyzed for PCB Aroclors and PAHs. Seven of the borings (borings
ending with an odd number) will also have vertical composites for toxicity characterization
testing as defined in 40 CFR Chapter 1, Part 261, as well as TOC analysis on the discrete boring
samples. Refer to Figure 5 for illustrations of sediment borings.  Table 1 summarizes the
anticipated water and boring depths and number of samples.

Table 1. Kinnickinnic River Approximate Station Boring Depths

  Station I.D.   Water Depth (Ft.)   Boring Depth (Ft.)   No. Environmental Samples   No. Geotechnical Samples
  KK0201                 3                  16                        8                          4
  KK0202                 5                  14                        7                          3
  KK0203                 0¹                 19                       10¹                         4
  KK0204                 7                  12                        6                          3
  KK0205                 0¹                 19¹                      10¹                         4
  KK0206                 0¹                 19¹                      10¹                         4
  KK0207                 9                  10                        5                          2
  KK0208                 0¹                 19                       10¹                         4
  KK0209                 0                  16                        8                          4
  KK0210                 6                  13                        7                          3
  KK0211                 3                  16                        8                          4
  KK0212                 6                  13                        7                          3
  KK0213                 8                  11                        6                          3
  KK0214                 7                  12                        6                          3
  KK02US1               N/A                N/A                        1                          1
  KK02US2               N/A                N/A                        1                          1
  Totals                                                            110¹,²                      50¹

1 - Estimated
2 - Does not include field QA/QC samples
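
The sectioning scheme and the sample counts in Table 1 can be cross-checked with a short calculation.
The sketch below is illustrative only and is not part of the QAPP (the function name is hypothetical); it
generates the 2-ft sub-sample intervals for a given boring depth and reproduces the 10-sample maximum
for a 19-ft boring:

    def core_subsample_intervals(boring_depth_ft, section_ft=2):
        """Return (top, bottom) depth intervals for a boring sectioned into 2-ft
        sub-samples; a shorter final section (e.g., 18'-19') is kept as its own sample."""
        intervals, top = [], 0
        while top < boring_depth_ft:
            bottom = min(top + section_ft, boring_depth_ft)
            intervals.append((top, bottom))
            top = bottom
        return intervals

    # A 19-ft boring yields 10 chemistry samples, the maximum shown in Table 1.
    print(len(core_subsample_intervals(19)), core_subsample_intervals(19)[-1])   # 10 (18, 19)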

2.2    Criteria and Objectives

2.2.1   Sediment Chemistry

Tables 2 through Table 7 provide the requirements necessary for sediment chemistry testing.
Standard Operating Procedures for analysis of sediments for this project are available upon
request to the USACE Project Coordinator, Paul Baxter.

-------
Kinnickinnic River QAPP, Draft, April 2002
                                                                                          11
Table 2. Kinnickinnic River Sediment Testing Requirements

  Parameter                      No. of Samples   TDL¹            Precision (RPD)   Analytical Method
  Bulk Sediment PCBs                  122         Per Method           50%          SW-846/8082
  Bulk Sediment PAHs                  122         Per Method           50%          SW-846/8270C
  Bulk Sediment TOC                    61         1,000 mg/kg          20%          Walkley-Black
  TCLP Procedure                        7         N/A                  N/A          SW-846/1311
  TCLP Procedure for Volatiles          7         N/A                  N/A          SW-846/1311
  TCLP Metals                           7         Table 3              Table 3      SW-846 (Table 3)
  TCLP Volatiles                        7         Table 4              Table 4      SW-846/8260B
  TCLP Semivolatiles                    7         Table 5              Table 5      SW-846/8270C
  TCLP Pesticides                       7         Table 6              Table 6      SW-846/8081
  TCLP Herbicides                       7         Table 7              Table 7      SW-846/8150
  Corrosivity                           7         N/A                  20%          SW-846/9040/9045B
  Reactive Cyanide                      7         0.5 mg/kg            7.5%         SW-846 Ch. 7/EPA 9012
  Reactive Sulfide                      7         5.0                  20%          SW-846 Ch. 7/EPA 376.2
  Ignitability                          7         >200° F              20%          SW-846/1020A
  Paint Filter Test                     7         N/A                  N/A          EPA 9095

1 - Target Detection Limit
Table 3.  Kinnickinnic River TCLP Metals Requirements

  Analyte     Analytical Method           TDL¹ (mg/l)   Precision (RPD)
  Arsenic     SW-846 6010/6020/7000          0.30            20%
  Barium      SW-846 6010/6020               1.00            20%
  Cadmium     SW-846 6010B/6020/7000A        0.10            20%
  Chromium    SW-846 6010B/6020              0.50            20%
  Lead        SW-846 6010B/6020/7000A        0.50            20%
  Mercury     SW-846 7471A                   0.01            12%
  Selenium    SW-846 6010B/6020              0.60            20%
  Silver      SW-846 7761/6010B/6020         0.10            20%

1 - Target Detection Limit

-------
Kinnickinnic River QAPP, Draft, April 2002
                                                                                                      12
Table 4. Kinnickinnic River TCLP Volatiles Requirements

  Analyte                  TDL¹ (mg/l)   Precision
  Benzene                     0.05       50% RPD
  Carbon Tetrachloride        0.05       50% RPD
  Chlorobenzene               0.05       50% RPD
  Chloroform                  0.05       50% RPD
  Methyl ethyl ketone         0.25       50% RPD
  1,4-Dichlorobenzene         0.05       50% RPD
  1,2-Dichloroethane          0.05       50% RPD
  1,1-Dichloroethylene        0.05       50% RPD
  Trichloroethene             0.05       50% RPD
  Tetrachloroethylene         0.05       50% RPD
  Vinyl chloride              0.05       50% RPD

1 - Target Detection Limit
Table 5. Kinnickinnic River TCLP Semivolatiles Requirements

  Analyte                  TDL¹ (mg/l)   Precision (RPD)
  2-Methylphenol              0.10            50%
  3/4-Methylphenol            0.10            50%
  Methylphenol (2,3,4)        0.10            50%
  2,4-Dinitrotoluene          0.10            50%
  Pentachlorophenol           0.10            50%
  Hexachlorobenzene           0.10            50%
  Hexachlorobutadiene         0.10            50%
  Hexachloroethane            0.10            50%
  Nitrobenzene                0.10            50%
  Pyridine                    0.10            50%
  2,4,5-Trichlorophenol       0.10            50%
  2,4,6-Trichlorophenol       0.10            50%

Table 6. Kinnickinnic River TCLP Pesticides Requirements

  Analyte              TDL¹ (mg/l)   Precision (RPD)
  Chlordane               0.020           50%
  Endrin                  0.010           50%
  Heptachlor              0.008           50%
  Heptachlor Epoxide      0.008           50%
  4,4-DD                  0.010           50%
  Lindane                 0.010           50%
  Methoxychlor            0.500           50%
  Toxaphene               0.500           50%
Table 7. Kinnickinnic River TCLP Herbicides Requirements

  Analyte             TDL¹ (mg/l)   Precision (RPD)
  2,4-D                  10.0             50%
  2,4,5-TP (Silvex)       1.0             50%

-------
Kinnickinnic River QAPP, Draft, April 2002                                                          13

2.3    Special Personnel, Training, and Equipment Requirements

Sediment Sampling
Sediment sampling will require the use of Coleman Engineering's barge and associated
drilling rig or an equivalent.  Equipment requirements for collecting sediment core samples are
contained in Appendix A.

Under normal operations, the minimum Personal Protective Equipment (PPE) required to be
worn by personnel working on deck aboard the drill rig barge is Modified Level D Protection.
Modified Level D Protection includes: hard hat with face shield, steel toed footwear, Tyvek
coveralls, boot covers, Personal Flotation Device, and double gloves. Modified Level D
indicates that no respiratory protection is required.

This survey will require PPE suitable for normal operating conditions as described above.  The
main method of avoiding exposure to the contaminants present is to avoid direct contact with skin.
Washing hands immediately after sampling will also reduce potential exposure to the
contaminants.

2.4    Project  Schedule

A tentative project schedule is provided in Table 8. All personnel shown in Figure 1 should be
contacted regarding significant schedule changes.

Table 8. Tentative Project Schedule

  Task                                          Completion Date
  Scope of Work Acceptance                      May 2002
  QAPP Development and Approval                 May 2002
  Sediment Sampling                             June 2002
  Completion of Sediment Analysis               July 2002
  Draft Analytical Report Due to USACE          July 2002
  Final Analytical Report Due to USACE          August 2002
  Report Due to GLNPO and WDNR                  September 2002
  QA/QC Review Due to GLNPO and WDNR            November 2002

3.     Sampling Plan

3.1    Sampling Network Design and Rationale

The purpose of this sampling survey is to determine the quality of the sediments in the project
area. In order to obtain a full picture of the extent of contamination, a large number of samples
need to be collected.  Sediment chemistry and geotechnical test samples will be collected.  These
samples will allow WDNR and the COE to determine the levels of contaminants present in the
sediments and to develop disposal options for the dredged sediments.

Figure 2 presents an overview of the project area and approximate locations for collection of
sediment samples. Latitude/longitude of the sampling points within the project area are provided
in Appendix E.  Exact sample locations may be relocated by the on-site field coordinator during

-------
Kinnickinnic River QAPP, Draft, April 2002
                                                                                     14
sampling, depending upon, but not limited to, site characteristics and the ability of the
sampling team to collect sufficient sample material.

The sampling locations are designed to provide focused coverage of the project area as well as
some general coverage upstream of the project area.  Table 9 summarizes the types of data and
analyses to be collected at each type of sampling location.

Table 9. Summary of Data and Analyses at Sampling Locations
Core Sample (14 locations)
Sediment Chemistry
Sediment Depth
Water Depth (Actual & Corrected)
Geotechnical Sample Collection
Physical Descriptions of Samples
Photographs of Samples
Grab Samples (2 Locations)
Sediment Chemistry
Water Depth (Actual & Corrected)
Physical Descriptions of Samples
Photographs of Samples


3.2    Definition of Sample Types

Three types of sediment samples will be collected during this survey: Routine Field Samples
(RFS), Field Replicates (FR), and Field Duplicates (FD). Each sample type is described below.

Routine Field Samples (RFS):  Prepared by collecting a section of a sediment core,
homogenizing the sediments collected, and filling all required sample jars. Routine field
samples will be collected at fourteen (14) locations. Refer to Figure 3 for locations of the RFS.

Field Duplicates (FD): Prepared by filling a second set of sample jars from a sediment core after
the cores have been homogenized. FDs will be collected at one (1) sediment core location. This
is approximately equivalent to a ratio of FDs to RFSs of 1 to 10 (10%).   Location of the FDs
will be determined in the field by the on-site  sample collection coordinator.

Field Replicates (FR):  Prepared by collecting a second sediment core sample, homogenizing
the material separately from the RFS, and filling the required sample bottles.  FRs will be
collected at one (1) sediment core location. This is approximately equivalent to a ratio of FRs  to
RFSs of 1 to 10 (10%). Locations of the FRs will be determined in the field by the on-site
sample collection coordinator.

3.3    Type and Number of Samples
Table 10 summarizes the type and number of samples to be collected during this sampling event.
The estimated number of samples does include all RFS, FD, and FR samples.

-------
Kinnickinnic River QAPP, Draft, April 2002                                                          15

Table 10. Summary of Type and Number of Samples to be Collected

  Sample Type              Estimated Number of Samples¹   Sample Matrix   Analysis Required
  Sediment Chemistry                  122                   Sediment      PCB Aroclors and PAHs
  Sediment Chemistry                   61                   Sediment      TOC
  Sediment Chemistry                    7                   Sediment      Toxicity Characterization
  Geotechnical Samples²                50                   Sediment      Grain Size with Hydrometer and Loss Upon Ignition
  Geotechnical Samples²                 8                   Sediment      Atterberg Limits

1 - Includes field QA/QC samples.
2 - To be analyzed at a later date; analysis is not included within this project.

All of the data listed in Table 10 is considered critical to the success of this assessment project.

3.4    Field Data Collection

Three sets of field data will be collected that are critical to the data quality objectives for this
project.

       Latitude/Longitude Location: These data are critical for determining where sediment
       samples were collected. The Differential Global Positioning System (DGPS) onboard
       the Coleman Engineering barge will be capable of ascertaining horizontal locations with
       < 5 meters of accuracy.  To achieve this accuracy, it is important that the DGPS is in
       good working order and is obtaining strong satellite signals. The field team will be
       responsible for checking the satellite signal strength for the DGPS system prior to
       recording this data and for ensuring that the system is recording equivalent horizontal
       locations.  Any problems with signal strength shall be recorded in the field boring log.
       If problems are noted, the field team should provide a qualitative description of the
       sampling location utilizing any available, permanent landmarks. The DGPS unit will
       have its accuracy checked prior to each day's sampling activities by locating one of the
       USACE survey markers shown on Figure 3 (see the sketch at the end of this section).
       The DGPS unit's antennas will be located as close to the marker as possible and the
       reading will be compared to those on Figure 3.

       Sediment Depth: Sediment depth data is critical for determining the volume of sediments
       with a potential for being contaminated. Sediment depth will be measured to the nearest
       0.1 ft.

       Water Depths: Water depths will be taken directly over the location of the sampling site
       prior to sample collection with a weighted measuring tape.  Water depths will be reported
       as the actual depth measured and as the water depth corrected to Low Water Datum.  Low Water
       Datum is available for Milwaukee Harbor from the closest daily and hourly water level

-------
Kinnickinnic River QAPP, Draft, April 2002                                                          16

       station for Lake Michigan, which can be obtained from the Internet at the NOAA home
       page for water elevations. Water depths will be measured to the nearest 0.1 ft.

            [Note: Low Water Datum is available in 6-minute intervals.  The address is:
          http://www.opsd.nos.noaa.gov/data res.html. From this address, under Preliminary
            Water Level Data, select Great Lakes Stations, then choose Milwaukee and display
                                   recorded water levels in feet.]
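
The daily DGPS accuracy check described above can be summarized as a distance comparison against a
known survey marker. The following sketch is illustrative only; the marker coordinates are placeholders,
not surveyed values, and the 5-meter tolerance is taken from the accuracy requirement stated above:

    import math

    def dgps_offset_m(reading, marker):
        """Great-circle distance in meters between a DGPS reading and a survey
        marker, both given as (latitude, longitude) in decimal degrees."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*reading, *marker))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 6371000 * 2 * math.asin(math.sqrt(a))   # mean Earth radius in meters

    def passes_daily_check(reading, marker, tolerance_m=5.0):
        """True when the DGPS reading agrees with the known marker within 5 meters."""
        return dgps_offset_m(reading, marker) <= tolerance_m

    # Placeholder coordinates only; actual marker positions come from Figure 3.
    marker = (43.0155, -87.9020)
    print(passes_daily_check((43.01552, -87.90198), marker))   # True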

4.     Sample Collection and Handling

4.1    Sample Collection

4.1.1   Sediment Cores

Sediment cores will be  collected utilizing a two inch and/or three inch diameter (depending on
amount of recovery) split spoon sampler and associated hollow stem auger (ASTM Method D-
1586-84, Standard Method for Penetration Test and Split-Barrel Sampling of Soils).  The split
spoon sampler is capable of collecting continuous sediment cores to the depths required for this
project. All sediment cores will be analyzed for sediment chemistry as summarized in Table 10
and explained in detail in Section 5.

Once the barge has been positioned over a given sampling station, the following activities will
take place, but not necessarily in this order:

       1.  Water depth will be measured through the hole in the barge where the samples
              will be collected;
       2.  Location coordinates will be recorded by placing the GPS antenna over the sampling
          hole;
       3.  The split spoon sampler will be lowered, penetrating two feet into the sediment; if
           applicable, the sampler will be hammered into the sediment with a 30 inch free fall of
           a 140 lb. hammer, and the blow counts per every six inches will be recorded in the
           boring log.  The split spoon sampler will be retrieved to the barge deck for sample
           handling.  The hollow stem auger will be lowered to the sediment surface. If, upon
           retrieval, the split spoon sampler did not retain/collect any sample, the hollow stem
           auger will be slowly rotated to a depth of two feet and slowly retrieved to the surface
           of the barge. A sediment sample representative of the two foot sample depth will then
           be collected from the hollow stem auger fins.
       4.  After the 0 to 2 ft. depth has been sampled, the hollow stem auger will be advanced to
          the 2 foot depth and flushed with site water to remove  any residual sediments within
          the auger.
       5.  The split spoon sampler will then be advanced to the 4 ft. depth and retrieved for
          sample handling;
       6.  This procedure will continue until either native material (such as clay) is encountered
          or the predetermined depth for a given boring is achieved.

4.1.2. Sediment Grab  Samples

Sediment grab samples  will be collected utilizing a ponar dredge sampler.

-------
Kinnickinnic River QAPP, Draft, April 2002                                                           J 7

4.1.3. Hand Augered Samples

Four of the sampling locations within the project area are not accessible to the drill rig barge.
Therefore, samples will be collected from these locations utilizing a bucket hand auger. Samples
will be collected to the deepest depth practical, depending upon, but not limited to, hole
collapse, complete resistance, or attainment of the project sample depth of 19 ft.

4.2    Sample Handling

4.2.1   Sample Processing

Samples not analyzed for TCLP Toxicity Characterization.

Upon retrieval of the split spoon/ponar sampler, the sampler will be carefully opened. The sample
retained within the split spoon sampler will be measured for recovery, transferred to a clean
stainless steel mixing bowl or equivalent, photographed, described, thoroughly homogenized, and
transferred into the appropriate sample containers. Samples for chemical analysis will be placed
on ice within a cooler for shipment to the laboratory.  Samples for geotechnical testing will be
placed into an appropriate sample container, ensuring that there will be no loss of moisture from
the samples, and then stored in a storage container for transport to Coleman Engineering
Company's testing facility.

 Samples analyzed for TCLP Toxicity Characterization.

Upon retrieval of the sediment sample, the sample will be transferred to a clean stainless steel
mixing bowl or equivalent, photographed, and a description will be recorded. For TCLP
volatiles, an aliquot will be transferred into the appropriate laboratory supplied sample container
(TCLP volatiles samples will be composited at the laboratory). Another aliquot will be
transferred to the sample container for TOC analysis. The remainder of the material will be
thoroughly homogenized and a sufficient aliquot will be transferred into a second stainless steel
mixing bowl for compositing. This process will be repeated for  each 2 ft. split spoon sample
collected from the boring. Upon completion of the boring, the sediment placed into the second
stainless steel mixing bowl will be photographed and thoroughly homogenized and transferred
into the proper TCLP extractable sample container.

4.2.2   Equipment Decontamination

Immediately after the samples have been transferred from the split spoon/ponar sampler, the
equipment will be scrubbed with on-site water, scrubbed with an Alconox/Liquinox solution, and
rinsed with on-site water. The on-site water wash and rinse may be disposed of on-site.
The Alconox/Liquinox wash solution will be retained by the sampling team and disposed of
properly at the completion of this sampling project.  Disposal should be to a wastewater
treatment facility.

4.2.3   Sample Containers

After processing, sediment samples will be placed into the appropriate sample containers as
summarized in Table 11.  A field sample log shall be filled out for each sampling location.

-------
Kinnickinnic River QAPP, Draft, April 2002
                                                                                    18
Note: The analyzing laboratory will supply all required chain-of-custody forms, sample
containers, and sample coolers, including a temperature blank with each sample cooler.  The
coolers and sample bottles shall be shipped to the following address no later than June 14,
2002:

Scott Strigel
Coleman Engineering Company
635 Industrial Park Dr.
Iron Mountain, Michigan 49801

Table 11.  Sample Container and Preservation Requirements

  Analyses                        Container                 Preservation Technique              Holding Times
  PCBs                            8 oz Glass                Cool/dark, < 4° C                   14 days/40 days²
  PAHs                            4 oz Glass                Cool/dark, < 4° C                   14 days/40 days²
  TOC                             4 oz Glass                Cool/dark, < 4° C                   28 days
  TCLP Volatiles                  16 oz Glass               Cool/dark, < 4° C, no head space    14 days/40 days²
  TCLP Extractables¹              16 oz. Widemouth          Cool/dark, < 4° C                   14 days/40 days²
  Ignitability, Corrosivity,      Plastic                   Cool/dark, < 4° C                   Analyze as soon as practical
  and Reactivity
  Grain Size                                                Cool/dark, < 4° C                   No hold time
  Hydrometer                      Included in grain size    Cool/dark, < 4° C                   No hold time
  Atterberg Limits                Included in grain size    Cool/dark, < 4° C                   No hold time
  Loss upon ignition              Included in grain size    Cool/dark, < 4° C                   No hold time
  Percent Moisture                Included in PCBs          Cool/dark, < 4° C                   28 days

  1 As, Ba, Cd, Cr, Cu, Pb, Hg, Ag, Semivolatiles, Pesticides, and Herbicides
  2 From time of collection to extraction/From time of extraction to analysis
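
The holding times in Table 11 lend themselves to a simple verification at data review. The sketch below is
illustrative only (the function name and dates are hypothetical); it checks the 14-day collection-to-extraction
and 40-day extraction-to-analysis limits:

    from datetime import date

    def within_holding_times(collected, extracted, analyzed,
                             prep_limit_days=14, analysis_limit_days=40):
        """Check the 14-day (collection to extraction) and 40-day (extraction to
        analysis) holding times applied to PCB, PAH, and TCLP samples in Table 11."""
        return ((extracted - collected).days <= prep_limit_days
                and (analyzed - extracted).days <= analysis_limit_days)

    # Hypothetical dates roughly matching the June 2002 sampling window.
    print(within_holding_times(date(2002, 6, 18), date(2002, 6, 28), date(2002, 7, 20)))   # True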

4.2.4   Sample Labeling

Each sample bottle shall be individually labeled using a waterproof pen. The label shall contain,
but not be limited to, the following information:

       •  Unique Sample Number: KK02XX-XX/XX; where "KK02" refers to the
          Kinnickinnic River 2002 sampling event, "XX-XX/XX" refers to the numerical
          sequence of the sample locations and the depth interval of the sample (KK0201-00/02
          is sample number 1 collected from the sediment depth of 0 to 2 feet). Field duplicates
          and field replicates shall have a suffix of "R"  for replicate and "D" for duplicate.
       •  Sample Date (MM-DD-YYYY)
       •  Sample Time (HHMM, on a 24-hour clock)
       •  Analysis to be performed (e.g. PCBs, PAHs, etc.)
       •  Sampler's Initials
       •  Client: Altech Environmental Services
       •  Project:  Kinnickinnic River

-------
Kinnickinnic River QAPP, Draft, April 2002
                                                                                      19
An example label is shown in Figure 4.  Clear tape will be placed over the label after the label
has been completely filled out and attached to the sample container. The sample identification
number and date of sample collection will be written on the sample container closure with a
water proof marker.

Figure 4.  Example Sample Label
                      Project: Kinnickinnic River
                      Client: Altech Environ. Services
                      K0201-00/04             6-18-2002
                      PCBs & PAHs           1300 hrs.
                                               DC
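
The sample numbering convention in Section 4.2.4 can be generated consistently with a short helper.
The following sketch is illustrative only; the function name is hypothetical and simply applies the
KK02XX-XX/XX pattern and the "D"/"R" suffixes described above:

    def sample_id(location, top_ft, bottom_ft, qc_suffix=""):
        """Build a sample number of the form KK02XX-XX/XX, where `location` is the
        station sequence number, `top_ft`/`bottom_ft` the depth interval in feet,
        and `qc_suffix` is "" (routine), "D" (field duplicate), or "R" (field replicate)."""
        return f"KK02{location:02d}-{top_ft:02d}/{bottom_ft:02d}{qc_suffix}"

    print(sample_id(1, 0, 2))        # KK0201-00/02  (routine sample, station 1, 0-2 ft)
    print(sample_id(3, 4, 6, "D"))   # KK0203-04/06D (field duplicate)
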
4.2.5   Shipment and Chain-of-Custody

After collection and labeling, all glass containers shall be placed in a zip-lock bag, wrapped in
bubble wrap and placed in an appropriate sample cooler. Within 24 hours of sample collection,
the samples will be sent to the analyzing laboratory. After samples are collected each day, the
Field Team Coordinator shall be responsible for shipping and/or arranging pickup of samples.  A
Shipping Container Checklist is provided for guidance (Appendix D). The Field Team
Coordinator shall ensure that:
       1.  The coolers contain sufficient ice to keep the samples below 4° C during the shipment
           process and samples are immobilized with bubble pack to reduce the risk of breakage,
       2.  The chain of custody form (see example in Appendix A) is properly filled out,
       3.  A copy of the chain-of-custody form shall be retained and provided to the project
          manager,
       4.  A copy of the chain-of-custody form will be placed in a "ziploc" bag and taped to the
          inside lid of the cooler,
       5.  A temperature blank is included in each sample cooler (temperature blank to be
          supplied by the laboratories),
       6.  The outside of the container will be sealed using fiberglass or duct tape,
       7.  The laboratory name, phone number, and address, as well as the return name and
          address, will be clearly labeled on the outside of the cooler,
       8.  The samples will be sent to the contract laboratory by an overnight courier, and
       9.  Receipts of bills of lading will be retained as part of the permanent documentation
           and a copy of the air bill and/or bill of lading will be sent to the Altech Environmental
           Services Project Manager, Ian Kerr,
       10. Commercial couriers are not required to sign off on the sample tracking form,
       11. Laboratories are contacted prior to shipment to ensure they are prepared for sample
           arrival.

-------
Kinnickinnic River QAPP, Draft, April 2002
                                                                                    20
Note: The analyzing laboratory will supply chain-of-custody forms to the Project Field Sample
Collection Team Leader, Scott Strigel, prior to the sampling event.

Table 12 summarizes where each of the respective types of samples shall be shipped.

Table 12.  Addresses for Shipment of Samples

  Analysis                                        Laboratory Contact Information
  PCBs, PAHs, TOC, and Toxicity                   Ann Preston
  Characterization                                Trace Analytical Laboratories, Inc.
                                                  2241 Black Creek Rd.
                                                  Muskegon, Michigan 49444-2673
                                                  (231) 773-5998 Ext. 224

  Grain Size with Hydrometer, Atterberg Limits,   Coleman Engineering - Scott Strigel
  and Loss Upon Ignition¹                         635 Industrial Park Dr.
                                                  Iron Mountain, Michigan 49801
                                                  (906) 774-3440

1 - Geotechnical samples may be held on-site and in custody by Coleman Engineering Company
and taken to Coleman Engineering's facility with the sampling team after completion of the
project.

4.2.6   Receipt of Samples

Upon receipt of project samples for chemical analysis, the laboratory shall:

       •  Complete their portion of the chain-of-custody forms,
       •  Contact the Altech Environmental Services Project Manager to inform him of sample
          receipt and to discuss any problems or issues,
       •  Ensure that the samples are maintained at < 4°C,
       •  Complete a Cooler Receipt Form (see example in Appendix C),
       •  If there are any sample shipment problems, contact the Altech Environmental Services
          Project Manager (Ian Kerr), who shall contact the USACE Project Coordinator (Paul
          Baxter) and the USEPA Project Manager (Demaree Collier) as soon as the sample
          shipment problem is discovered,
       •  Fax a copy of the chain-of-custody form to the Altech Environmental Services Project
          Manager, Ian Kerr, at (248) 353-5485.

5.      Laboratory Analysis

5.1    Analytical Methods

Analysis and preparation methods for all required analyses are provided in Table 2.

5.2    Data Quality Objectives (DQOs)

Data from the historical sampling events contains very little information regarding the extent of
contamination within the project area.  Additionally, the analytical data obtained in the historical
sampling events is not sufficient to meet the primary and secondary objectives of this project.

-------
Kinnickinnic River QAPP, Draft, April 2002                                                           21

Therefore, the DQOs chosen for this project will be based on the objectives required to
adequately assess the current state of contamination within the project area.

The DQOs for the laboratory analysis portion of this project are defined according to the
following four quality assurance objectives.

Definitions

Instrument Detection Limit (IDL): The instrument detection limit (IDL) is the lowest analyte
concentration that an instrument can detect. The IDL is determined on samples that have not
gone through any sample preparation (e.g. calibration standards).

Limit of Quantification (LOQ): The limit of quantification is the lowest analyte concentration
that can be accurately measured and reported, as opposed to simply detected.

Method Detection Limit (MDL): Method detection limits (MDL) will be determined by making
repeated measurements (a minimum of seven) over several non-consecutive days of either a
calibration blank or a low-level standard with a concentration  within 1-5 times the IDL. The
MDL is calculated, at the 95 percent confidence level, as 3 times the standard deviation of the
measured sample concentrations.

Target Detection Limit (TDL):  The target detection limit (TDL) is the concentration at which
each analyte must be detected and quantified in order to meet the study objectives.  This means
that, if possible, all IDLs, MDLs, and LOQs should be less than the TDLs for all analytes.  If the
laboratory expects any of the IDLs, MDLs, or LOQs to exceed the required TDLs, they must
contact the USACE and USEPA project managers to develop corrective action procedures.
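
The MDL and TDL definitions above imply a straightforward calculation and acceptance check. The sketch
below is illustrative only; the replicate values are hypothetical, and the calculation follows the definition
given above (3 times the standard deviation of at least seven replicate low-level measurements):

    import statistics

    def method_detection_limit(replicates):
        """MDL as defined above: 3 times the standard deviation of repeated
        (minimum seven) blank or low-level standard measurements."""
        if len(replicates) < 7:
            raise ValueError("MDL determination requires at least seven replicates")
        return 3 * statistics.stdev(replicates)

    def meets_tdl(mdl, tdl):
        """A laboratory MDL is acceptable when it does not exceed the TDL."""
        return mdl <= tdl

    # Hypothetical low-level standard results (mg/kg); not project data.
    mdl = method_detection_limit([0.11, 0.13, 0.10, 0.12, 0.14, 0.11, 0.12])
    print(round(mdl, 2), meets_tdl(mdl, tdl=0.5))   # 0.04 True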

5.2.1   Method Detection Limits and Level of Quantification

For quantitative chemical analyses, the analytical laboratory will be required to determine the
instrument detection limit (IDL) prior to any analysis of the routine samples. The target detection
limit (TDL) is the concentration at which the presence of an analyte must be detected to properly
be able to assess and satisfy the DQOs.  To be acceptable, a laboratory must demonstrate that the
MDL is less than  or equal to the TDL through use of laboratory quantitation standards. The
laboratory shall also strive to set the dry sample Levels of Quantification (LOQs) below the
applicable TDLs.  For chemicals that have calculated threshold effect concentrations (TECs),
Tables 2 through 7 list TDLs based on those TECs; a few additional parameters that do not have
calculated TECs are also listed with a TDL for each parameter.

Target detection limits for all required sediment chemistry are provided in Tables 2 through
Table 7.

-------
Kinnickinnic River QAPP, Draft, April 2002                                                           22


Note: If a laboratory is unable to obtain MDLs and LOQs that are below the respective TDLs
for each analyte, the laboratory shall contact the U.S. Army Corps of Engineers Project
Coordinator and/or the U.S. Environmental Protection Agency's Project Manager to discuss the
required course of action. Decisions to be made could include: implementation of additional
sample clean-up procedures prior to analysis, USEPA acceptance of higher MDLs and LOQs, or
implementation of other potential suggestions.

Note: It is understood that potentially high moisture contents of the sediments could impact MDLs
and LOQs achieved by the laboratory. In an effort to reduce the impact of high water content on
MDLs and LOQs, the labs shall decant free water from the surface of the sediment samples prior
to analysis.

5.2.2   Bias

Bias is the systematic or persistent distortion of a measurement process that causes errors in one
direction. Bias assessments for environmental measurements are made using personnel,
equipment, and spiking materials or reference materials as independent as possible from those
used in the calibration of the measurement system. When possible, bias assessments should be
based on analysis of spiked samples rather than reference materials so that the effect of the
matrix on recovery is incorporated into the assessment. A documented spiking protocol and
consistency in following that protocol are important to obtaining meaningful data quality
estimates.  Spikes should  be added at concentrations approximately at the mid-range.  Spiked
samples shall be used in accordance with the specified method.

Bias will be assessed through the use of certified reference materials (CRMs), standard reference
materials (SRMs: reference materials certified by the U.S. National Institute of Standards and
Technology [U.S. NIST]), or other standards, such as matrix spikes. Spiked surrogate compounds
will also be used to assess bias for the GC and GC/MS procedures for PCB and PAH compounds,
respectively.

Matrix spike and matrix spike duplicate samples (MS/MSD) also will be used to assess bias as
prescribed in the specified methods. Acceptable recovery values will be within the recoveries
specified by each of the analysis methods. Control samples for assessing bias will be analyzed at
a rate as specified in the analytical SOPs and specified analytical methods.
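
Bias assessment by matrix spike recovery reduces to a percent-recovery calculation compared against
method-specified limits. The sketch below is illustrative only; the spike values are hypothetical and the
70-130% limits are placeholders, since the actual acceptance ranges come from each analytical method:

    def percent_recovery(spiked_result, unspiked_result, spike_added):
        """Percent recovery of a matrix spike:
        100 * (spiked result - unspiked result) / amount of spike added."""
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    def within_limits(recovery, low=70.0, high=130.0):
        """Compare recovery to acceptance limits; 70-130% is a placeholder, since
        the actual limits are specified by each analytical method."""
        return low <= recovery <= high

    # Hypothetical values (mg/kg): native result 0.8, spiked result 2.7, spike of 2.0 added.
    rec = percent_recovery(spiked_result=2.7, unspiked_result=0.8, spike_added=2.0)
    print(round(rec, 1), within_limits(rec))   # 95.0 True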

5.2.3   Precision

Precision is a measure of agreement among replicate measurements of the same property, under
prescribed similar conditions.  This agreement is calculated as either the range (R) or as the
standard deviation (s). It  may also be expressed as a percentage of the mean of the
measurements, such as relative percent difference (RPD) or relative standard deviation (RSD)
(for three or more replicates).

Laboratory precision is assessed through the collection and measurement of laboratory
duplicates.  The laboratories shall follow the protocols in the specified method and
corresponding SOPs regarding the frequency of laboratory duplicates. This allows intra-
laboratory precision information to be obtained on sample acquisition, handling,  shipping,
storage, preparation, and analysis. Both samples can be carried through the steps in the
measurement process together to provide an estimate of short-term precision. An estimate of
long-term precision can be obtained by separating the two samples and processing them at
different times, by different people, and/or on different instruments. Acceptable RPDs will be in
accordance with those specified by each analysis method.

For duplicate measurements, relative percent difference (RPD) is calculated as follows:

              RPD = |D1 - D2| / ((D1 + D2)/2) x 100%

                     RPD = relative percent difference
                     D1 = sample value
                     D2 = duplicate sample value

For three or more replicates, relative standard deviation (RSD) is calculated as follows:

              RSD = (s/x) x 100

                     RSD = relative standard deviation
                     s = standard deviation of three or more results
                     x = mean of three or more results

Standard deviation is defined as follows:

              s = (SUM[(yi - ymean)^2] / (n - 1))^0.5

                     s = standard deviation
                     yi = measured value of the ith replicate
                     ymean = mean of replicate measurements
                     n = number of replicates

Quality control limits for Precision, Accuracy, and Completeness are summarized in Tables 2
through Table 7.
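
For illustration only, the following Python sketch shows how the precision statistics defined
above (RPD, RSD, and standard deviation) could be computed from duplicate or replicate
results; the function names and example values are illustrative and are not taken from any
method or SOP referenced in this QAPP.

    import math

    def rpd(d1, d2):
        """Relative percent difference between a sample result and its duplicate."""
        return abs(d1 - d2) / ((d1 + d2) / 2.0) * 100.0

    def std_dev(results):
        """Standard deviation, s, of three or more replicate results."""
        n = len(results)
        mean = sum(results) / n
        return math.sqrt(sum((y - mean) ** 2 for y in results) / (n - 1))

    def rsd(results):
        """Relative standard deviation (percent) of three or more replicate results."""
        return std_dev(results) / (sum(results) / len(results)) * 100.0

    # Hypothetical duplicate pair and triplicate results:
    print(round(rpd(12.0, 10.0), 1))           # 18.2
    print(round(rsd([9.8, 10.4, 10.1]), 1))    # 3.0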

5.2.4   Accuracy

Accuracy measures how close analytical results are to a true or expected value. Accuracy
objectives will be evaluated by calculating the percent recovery of laboratory matrix spikes and
matrix spike duplicates; accuracy is also expressed as the relative percent difference between the
expected value and the actual analytical result.
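
As an illustrative sketch only (the recovery and difference calculations below are the
conventional matrix spike calculations and are not quoted from the specified analytical
methods), percent recovery for a matrix spike/matrix spike duplicate pair could be computed as
follows; the example concentrations are hypothetical.

    def percent_recovery(spiked_result, unspiked_result, spike_added):
        """Percent recovery: 100 x (spiked result - unspiked result) / amount spiked."""
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    def recovery_rpd(ms_recovery, msd_recovery):
        """Relative percent difference between the MS and MSD recoveries."""
        return abs(ms_recovery - msd_recovery) / ((ms_recovery + msd_recovery) / 2.0) * 100.0

    # Hypothetical example: native result 5.0, spike of 20.0 added, spiked results 23.5 and 24.1.
    ms = percent_recovery(23.5, 5.0, 20.0)     # 92.5
    msd = percent_recovery(24.1, 5.0, 20.0)    # 95.5
    print(round(recovery_rpd(ms, msd), 1))     # 3.2
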
5.2.5   Representativeness

Representativeness is the degree to which the sampling data properly characterize the study
environment. For the field-sampling phase, the previously established sampling sites reasonably
cover the entire project area, and have been previously deemed to adequately represent any
various sub-units within the project area.

In the analytical phase, and as specified elsewhere in this document, appropriate sample storage
and preservation, and sample homogenization will ensure that the samples analyzed adequately
reflect conditions as they existed in the natural environment.

5.2.6   Comparability

Comparability expresses the confidence with which one data set can be compared to another.
Comparability will be enhanced by the consistent use of standardized sampling methods and
specified protocols for the sampling phase and through the use of standard documented
methodologies for analyte determinations. Any deviations from the standardized, selected
methods or protocols will be clearly documented by the laboratories and noted in the final
analytical report. A number of factors determine whether two data sets are comparable, and the
presence of each of the following enhances their comparability:

       •  Two data sets should contain the same set of variables of interest
       •  Units in which these variables were measured should be convertible to a common
          metric
       •  Similar analytical procedures and quality assurance should be used to collect data for
          both data sets
       •  Time measurements of certain characteristics (variables) should be similar for both
          data sets
       •  Measuring devices used for both data sets should have approximately similar
          detection levels
       •  Rules for excluding certain types of observations from both samples should be similar
       •  Samples within data sets should be selected in a similar manner
       •  Sampling frames from which the samples were selected should be similar
       •  Number of observations in both data sets should be of the same order of magnitude.

These characteristics vary in importance depending on the final use of the data.  The closer two
data sets are with regard to these characteristics, the more appropriate it will be to compare them.
Large differences between characteristics may be of only minor importance, depending on the
decision that is to be made from the data.

For this investigation, comparability will be satisfied by ensuring that the field sampling plan is
followed, that standard EPA methods of analysis are used for sample analysis, and that proper
sampling techniques are used.

5.2.7   Completeness

Completeness is a measure of the amount of valid data obtained from a measurement system
compared to the amount that was expected to be obtained under normal conditions. Field
completeness is a measure of the number of valid measurements obtained from all the field
measurements taken in the project; the field completeness objective for this project is greater
than 90%. Laboratory completeness is a measure of the number of valid measurements obtained
from the samples submitted to the analytical laboratories; the laboratory completeness objective
for this project is greater than 90% of the total number of samples submitted.

The calculation for percent completeness is as follows:

             %C = 100% x (V/n)

             %C = percent completeness
             V = number of valid measurements
             n = number of measurements planned
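
As a minimal illustration (not a requirement of this QAPP), the completeness calculation and
the greater-than-90% objectives stated above could be checked as follows; the sample counts
used in the example are hypothetical.

    def percent_completeness(valid, planned):
        """%C = 100% x (V/n)."""
        return 100.0 * valid / planned

    def meets_objective(valid, planned, objective=90.0):
        """True when completeness exceeds the stated objective (greater than 90%)."""
        return percent_completeness(valid, planned) > objective

    # Hypothetical example: 14 analyses planned, 13 valid results obtained.
    print(round(percent_completeness(13, 14), 1))   # 92.9
    print(meets_objective(13, 14))                  # True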

6.     Documentation and Records

6.1    Field Documentation

Field logs, boring logs, ship logs, and chain of custody documents will be used to record
appropriate sample collection information in the field.

Sediment Sample Collection/Boring Logs:  A sediment sample collection and/or boring log will
be filled out by the field crew for each sample collected.  All original field data sheets shall be
turned over to the Project Coordinator at the conclusion of the field sampling and shall be kept as
part of the permanent project file. A summary of sample collection information shall be
maintained for each day of field sampling. Information to be included in the field log shall
include but not be limited to: sample location ID, latitude/longitude of each sampling location,
time of sample collection, water depth, etc.

Chain-of-Custody Forms:
An example chain-of-custody form is provided in Appendix A. A chain-of-custody form will be
filled out for each set of samples shipped to the laboratory. A copy of the chain-of-custody form
will be faxed to the Altech Environmental Services Project Manager at the end of the field
sampling portion of this project.

6.2    Laboratory Reports

All laboratory data and records will be included in the final analytical report submitted to the
project manager.  A complete copy of the QAPP will be provided to the lab. The project
manager will be responsible for maintaining the reports in the permanent project file. The
following laboratory-specific records will be compiled by the laboratory and included in the final
analytical report submitted to the project manager.

Sample Data. These records contain the times that samples were analyzed to verify that they met
holding times prescribed in the analytical methods. Included should be the overall number of
samples, sample location information, any deviations from the SOPs, time of day, and date.
Corrective action procedures to replace samples violating the protocol also should be noted.

Sample Management Records.  Sample management records document sample receipt, handling
and storage,  and scheduling of analyses.  The records verify that sample tracking and proper
preservation were maintained, reflect any anomalies in the samples (such as receipt of damaged
samples), note proper log-in of samples into the laboratory, and address procedures used to
ensure that holding time requirements were met.

Test Methods.  Unless analyses are performed exactly as prescribed by SOPs, this documentation
will describe how the analyses were carried out in the laboratory. This includes sample
preparation and analysis, instrument standardization,  detection and reporting limits, and test-
specific QC criteria. Documentation demonstrating laboratory proficiency with each method
used should be included (e.g., LCS data).

QA/QC Reports.  These reports will include the general QC records, such as instrument
calibration, routine monitoring of analytical performance, calibration verification, etc.  Project-
specific information from the QA/QC checks such as blanks (e.g., reagent, method), spikes (e.g.,
matrix, matrix spike duplicate, surrogate spike), calibration check samples (e.g., zero check, span
check, and mid-range check), replicates, and so on should be included in these reports to
facilitate data quality analysis.

Data Reporting Package Format and Documentation Control Report: The format of all data
reporting packages must be consistent with the requirements and procedures used for data
validation and data assessment described in Sections 8, 9, and 10 of the QAPP.  The Field
Sampling Coordinator will ensure that data are being recorded appropriately on the sample
labels, sample tracking forms, and in the field notebook.  All  entries will be made using
permanent ink, signed, and dated, and no erasures will be made. If an incorrect entry is made,
the information will be crossed out with a single strike mark that is signed and dated by the
sampler. A similar data entry process will be followed by the contract laboratory. Only
QC/calibration summary forms will be provided at this time, unless analytical raw data are
necessary.

The contract laboratory will be expected to provide a data package with the following components:

       •  Case Narrative
       •  Date of issuance
       •  Laboratory analyses performed
       •  Any deviations from intended analytical strategy
       •  Laboratory batch  number
       •  Numbers of samples and respective matrices
       •  Quality control procedures utilized and also references to the acceptance criteria
       •  Laboratory report contents
       •  Project name and  number
       •  Condition of samples "as received"
       •  Discussion of whether or not sample holding times were met
       •  Discussion of technical problems or other observations  which may have created
          analytical difficulties
       •  Discussion of any laboratory QC checks which failed to meet project criteria
       •  Signature of the Laboratory QA Manager.

Chemistry Data Report:
       •  Case narrative for each analyzed batch of samples
       •  Summary page indicating dates of analyses for samples and laboratory quality control
          checks
       •  Cross referencing of laboratory sample to project sample identification numbers
       •  Descriptions of data qualifiers
       •  Sample preparation and analyses for samples
       •  Sample and laboratory quality control results
       •  Results of (dated) initial and continuing calibration checks
       •  Matrix spike and  matrix spike duplicate recoveries, laboratory control samples,
          method blank results, calibration check compounds, and system performance check
          compound results
       •  Results of tentatively identified compounds.

** An electronic copy of the Analytical Data Report, containing the analytical test results,
will be submitted in MS Excel format on CD. **


7.     Special Training Requirements

No special training is required for this project.


8.     Quality Control Requirements

All analytical procedures are documented in writing as SOPs and each SOP includes QC
information, which addresses the minimum QC requirements for the procedure. The internal
quality control checks might differ slightly for each individual procedure.  Examples of some of
the QC samples that will be used during this project include:

       •  Method blanks
       •  Reagent/preparation blanks
       •  Instrument blanks
       •  Surrogate spikes
       •  Analytical spikes
       •  Field replicates
       •  Laboratory duplicates
       •  Matrix Spike/Matrix Spike Duplicate
       •  Laboratory control standards
       •  Internal standard areas for GC/MS or GC/ECD analysis; control limits.

The actual QC sample requirements will be dictated by the method requirements. Details on the
use of each QC check are provided in the analytical SOPs provided for each measurement.
Method detection limits will be calculated for each analyte.
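
The designated analytical methods and SOPs govern how method detection limits are actually
determined. As an illustration only, the conventional EPA MDL procedure (the one-tailed
Student's t value at the 99% confidence level multiplied by the standard deviation of at least
seven low-level replicate analyses) could be sketched as follows; the abbreviated t-table and the
replicate values are assumptions for the example.

    import math

    # One-tailed Student's t values at the 99% confidence level (degrees of freedom: t).
    # Abbreviated, assumed table; other replicate counts would need additional entries.
    T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

    def mdl(replicates):
        """MDL = t(n-1, 0.99) x s for n >= 7 low-level spiked replicate results."""
        n = len(replicates)
        if n < 7:
            raise ValueError("at least seven replicates are required")
        mean = sum(replicates) / n
        s = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))
        return T_99[n - 1] * s

    # Hypothetical replicate results near the expected detection limit:
    print(round(mdl([0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.49]), 3))   # about 0.09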

Note: Instrument calibration concentrations, method validation procedures, internal quality
control protocols, analytical routines, maintenance and corrective actions,  and the data reduction
procedures are included in and will be performed as specified in the Standard Operating
Procedures as required by the designated analytical methods.


8.1     Instrument/Equipment Testing, Inspection, and Maintenance Requirements

The purpose of this section is to discuss the procedures used to verify that  all instruments and
equipment are maintained in sound operating condition, and are capable of operating at
acceptable performance levels.

Testing, Inspection, and Maintenance
The success of this project is dependent on well-functioning field, analytical, and toxicological
equipment. Preventative maintenance of this equipment is the key to reducing possible project
delays due to faulty equipment.

As part of each laboratory's QA/QC program, a routine preventative maintenance program will
be conducted to minimize the occurrence of instrument failure and other system malfunctions.
All laboratory instruments are maintained in accordance with manufacturer's specifications and
the requirements of the specific method employed. This maintenance is carried out on a regular,
scheduled basis and is documented in the laboratory instrument service logbook for each
instrument.

8.2     Instrument Calibration and Frequency

This section concerns the calibration procedures that will be used for instrumental analytical
methods and other measurement methods that are used in environmental measurements.
Calibration is defined as checking physical measurements against accepted standards.

Instrumentation Requiring Calibration
All of the equipment used to analyze the sediment samples will require calibration.

Calibration Methods That Will Be Used For Each Instrument
Instrument calibration procedures are dependent on the method and corresponding SOP.  All
ongoing calibration measurements must be within the requirements of the corresponding SOP to
be considered adequate.

Calibration Apparatus
None of the analytical instruments will be calibrated using a calibration apparatus.

Calibration Standards
The working linear range of an instrument should be established prior to performing sample
analyses. Calibration standards as specified in the applicable methods and SOPs will be used
when establishing the working linear range. The working linear range for a specific analysis
should bracket the expected concentrations of the target analyte in the samples to be analyzed.

Calibration Frequency
Instrument calibration is performed before sample analysis begins and is continued during
sample analysis at the intervals specified within the applicable methods and SOPs in order to
ensure that the data quality objectives are met. The verification of instrument stability is
assessed by analyzing continuing calibration standards at regular intervals during the period that
sample analyses are performed.  Standards will be analyzed on a schedule as specified in the
analytical SOPs. The concentration of the continuing calibration standard should be equivalent to
the midpoint of the working linear range of the instrument.
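
For illustration only, a continuing calibration verification could be screened against an
acceptance window as sketched below; the +/-20% limit shown is a placeholder, since the actual
acceptance limits come from the applicable methods and SOPs.

    def calibration_drift(measured, true_value):
        """Percent difference of a continuing calibration standard from its true value."""
        return 100.0 * (measured - true_value) / true_value

    def calibration_acceptable(measured, true_value, limit_pct=20.0):
        """True if the continuing calibration result falls inside the acceptance window.
        The +/-20% default is a placeholder; actual limits come from the method/SOP."""
        return abs(calibration_drift(measured, true_value)) <= limit_pct

    # Hypothetical mid-range standard: true value 50 units, measured at 46.5 units.
    print(round(calibration_drift(46.5, 50.0), 1))   # -7.0
    print(calibration_acceptable(46.5, 50.0))        # True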

Equipment logbooks will be maintained at the laboratory, in which will be recorded the usage,
maintenance, calibration, and repair of instrumentation.  These logbooks will be available during
any audits that may be conducted.

8.3     Inspection/Acceptance Requirements for Supplies and Consumables

The purpose of this section is to establish and document a system for inspecting and accepting all
supplies and consumables that may directly or indirectly affect the quality of the project or task.

Identification of Critical Supplies and Consumables
Critical supplies and consumables include sample bottles, gases, reagents, hoses, materials for
decontamination activities, and  distilled/deionized water.  The laboratory will utilize high quality
supplies and consumables to reduce the chances of contaminating the samples. All water
purification systems are tested on a regular basis to ensure that water produced is acceptable for
use. Solvent blanks are run to verify the purity of solvents used in the organic analyses. The
contract laboratory may also incorporate other measures, such as the dedicated use of glassware
for certain analyses.

Establishing Acceptance Criteria
Acceptance criteria must be consistent with overall project technical and quality criteria. The
laboratory should utilize its own acceptance criteria for normal operations when analyzing
and/or testing contaminated sediments.

Inspection of Acceptance Testing Requirements and Procedures
The contract laboratory should document inspections of acceptance  testing, including procedures
to be followed, individuals responsible, and frequency of evaluation. In addition, handling and
storage conditions for supplies and consumables should be documented.

Tracking and Quality Verification of Supplies and Consumables
Procedures should be established to ensure that inspections or acceptance testing of supplies and
consumables are adequately documented by permanent, dated, and signed records or logs that
uniquely identify the critical supplies or consumables, the date received, the date tested, the date
to be retested (if applicable), and the expiration date. These records should be kept by the
responsible individual(s) at the laboratory. In order to track supplies and consumables, labels
with the information on receipt  and testing should be used. These or similar procedures should
be established to enable project personnel to: 1) verify, prior to use, that critical supplies and
consumables meet the project objectives;  and 2) ensure that supplies and consumables that have
not been tested, have expired, or do not meet acceptance criteria are not used for the project.

8.4    Data Management

This section will present an overview of all mathematical operations and analyses performed on
raw data to change their form of expression, location, quantity, or dimensionality. These
operations include data recording, validation, transformation, transmittal, reduction, analysis,
management, storage, and retrieval.

Laboratory Data Recording
All raw analytical and toxicity data will be recorded in numerically identified laboratory
notebooks or data sheets.  The data will be promptly recorded in black ink on appropriate forms
that are initialed and dated by the person collecting the data.  Changes to recorded data are made
in black ink, with a single line cross-out, initials, and date.  No "whiteout" will be allowed.

If a laboratory has the capability to enter or download data directly into a computerized data
logger, this is preferable. All laboratories shall download data directly into a computerized
database. Sample data are recorded along with other pertinent information, such as the sample
identification number. Other details which will also be recorded include: the analytical method
used (SOP #), name of analyst, the date of analysis or toxicity test, matrix sampled, reagent
concentrations, instrument settings, and the raw data. Each page of the notebook or data sheet
will be signed and dated by the analyst. Copies of any  strip chart printouts (such as gas
chromatograms) will be maintained on file. Periodic review of these notebooks by the
Laboratory Supervisors will take place prior to final data reporting.  Records of notebook entry
inspections are maintained by the Laboratory QA/QC Officer.

Data Verification
The method, instrument, or system should generate data in a consistent, reliable, and accurate
manner. Data validation will be shown by meeting acceptable QC limits for analytical
parameters and sediment toxicity tests.  In addition, the application of preventative maintenance
activities and internal QA/QC auditing will ensure that field and laboratory generated data will
be valid. Quality control data (e.g., laboratory duplicates, matrix spikes, matrix spike duplicates,
and performance of negative controls) will be compared to the method acceptance criteria.  Data
considered to be acceptable will be entered into the laboratory computer system. Data
verification is performed by a second, designated senior/experienced staff member at the technical
level, where QC results, hold times, and instrument calibration are evaluated. All QA requirements are
programmed into automated systems and flagged where appropriate.

Data Transformation
Data transformations result from calculations based on  instrument output, readings, or responses.
The procedures for converting calibration readings into an equation that will be applied to
measurement readings are given in the SOPs for analytical parameters.
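
The SOPs specify the actual calibration models. As an illustration of the kind of transformation
described above, a simple least-squares linear calibration could be fitted and then applied to
measurement readings as follows; the calibration points and sample response are hypothetical.

    def fit_linear_calibration(concentrations, responses):
        """Least-squares fit of instrument response versus standard concentration.
        Returns (slope, intercept) for: response = slope * concentration + intercept."""
        n = len(concentrations)
        mean_x = sum(concentrations) / n
        mean_y = sum(responses) / n
        slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(concentrations, responses))
                 / sum((x - mean_x) ** 2 for x in concentrations))
        return slope, mean_y - slope * mean_x

    def to_concentration(response, slope, intercept):
        """Apply the calibration equation to convert a measured response to a concentration."""
        return (response - intercept) / slope

    # Hypothetical five-point calibration, then conversion of a sample response of 230 units:
    slope, intercept = fit_linear_calibration([1, 5, 10, 25, 50], [12, 58, 117, 290, 575])
    print(round(to_concentration(230.0, slope, intercept), 1))   # about 19.9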

Data Transmittal
Data transmittal occurs when data are transferred from  one person or location to another or when
data are copied from one form to another. Some examples of data transmittal are copying raw
data from  a notebook onto a data entry form for keying into a computer file and electronic
transfer of data over a computer network.  The transmittal of field data will be double-checked
by the PI.  The transmittal of laboratory data will be checked by the individual analyst with
periodic checks by the Laboratory Project Manager and/or QA/QC Officer.

Data Reduction
Data reduction includes all processes that change the number of data items. The laboratory has
its own data reduction techniques, which are usually documented in its QA Manual. For the
analytical  results, data reduction will involve calculating the arithmetic mean and standard
deviation of field and laboratory replicates.

Data Analysis
Data analysis will involve comparing the surficial contaminant concentrations to the screening
values contained in Tables 2 through 7. The analysis shall be performed by the WDNR Project
Manager.

Data Tracking
Data management includes tracking the status of data as they are collected, transmitted, and
processed.  The laboratory will have its own data tracking system in place.

Data Storage and Retrieval
The contract laboratory will have its own data storage and retrieval protocols. USEPA-GLNPO
will retain all the analytical data packages in the project files for this study.  In addition, the
sediment contaminant data will be added to GLNPO's contaminated sediment database.

8.5    Data Acquisition Requirements (Non-Direct)

Additionally, sets of screening values will be used to evaluate the potential impacts of the
contaminant concentrations found in the sediments during this survey. All parameter data will
be compared to existing sediment quality guidelines available in MacDonald et al. (2000) and
Persaud et al. (1993). All of these screening levels were specifically developed for freshwater
ecosystems and have been published in peer reviewed journals and documents.  Therefore, these
guidelines are considered sufficient for a screening level analysis of sediment data.
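
For illustration only, the screening comparison could be carried out as sketched below; the
screening values in the example are placeholders, since the values actually used are those
tabulated in Tables 2 through 7 from the cited references.

    # Placeholder screening values (hypothetical); the project uses the values in Tables 2-7.
    SCREENING_VALUES = {"arsenic": 9.8, "lead": 36.0, "total PAHs": 1600.0}

    def exceedances(sample_results, screening_values=SCREENING_VALUES):
        """Return the analytes whose measured concentration exceeds the screening value."""
        return {analyte: conc for analyte, conc in sample_results.items()
                if analyte in screening_values and conc > screening_values[analyte]}

    # Hypothetical surficial sediment result for one station:
    print(exceedances({"arsenic": 12.4, "lead": 20.1, "total PAHs": 2400.0}))
    # {'arsenic': 12.4, 'total PAHs': 2400.0}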

Water surface elevation data will be obtained from the NOAA web page.  Only data from the
"verified/historical water level data" page will be utilized in the study. However, NOAA has
attached the following disclaimer on data from this web page:

"These raw data have not been subjected to the National Ocean Service's quality control or
quality assurance procedures and do not meet the criteria and standards of official
National Ocean Service data. They are released for limited public use as preliminary data
to be used only with appropriate caution."

Since the water surface elevation data are non-critical, these preliminary data are sufficient for
our needs.

9.      Assessment and Oversight

9.1     Assessment and Response Actions

During the planning process, many options for sampling design, sample handling, sample
cleanup and analysis, and data reduction are evaluated and chosen for the project. In order to
ensure that the data collection is conducted as planned, a process of evaluation and validation  is
necessary.  This section of the QAPP describes the internal and external checks necessary to
ensure that:

       •  All elements of the QAPP are correctly implemented as prescribed.
       •  The quality of the data generated by implementation of the QAPP is adequate.
       •  Corrective actions, when needed, are implemented in a timely manner and their
          effectiveness is confirmed.

The most important part of this section is documenting all planned internal assessments.
Generally, internal assessments are initiated or performed by the QA Officer.

Assessment of Subsidiary Organizations

Two types of assessments of the subsidiary organizations can be performed as described below.

       •  Management Systems Review (MSR). A form of management assessment, this
          process is a qualitative assessment of a data collection operation or organization to
          establish whether the prevailing quality management structure, policies, practices, and
          procedures are adequate for ensuring that the type and quality of data needed are
          obtained.  The MSR is used to ensure that sufficient management controls are in place
          and carried out by the organization to adequately plan, implement, and assess the
          results of the project.
       •  Readiness Reviews. A readiness review is a technical check to determine if all
          components of the project are in place so that work can commence on a specific
          phase.

It is anticipated that a readiness review by each contract laboratory project manager will be
sufficient for this project. No separate management systems review is anticipated for this
project; a pre-project QA/QC conference call (already held) and submittal of laboratory
certifications and/or QA plans shall suffice as an MSR.

Assessment of Project Activities
Assessment of project activities can involve the following tasks:

       •  Surveillance
       •  Technical Systems Audit (TSA)
       •  Performance Evaluation (PE)
       •  Audit of Data Quality (ADQ)
       •  Peer Review
       •  Data Quality Assessment.

Surveillance will be the primary assessment technique for project activities. Surveillance will
most readily be performed by the Project Manager and QA Officer of the contract laboratory.

Number, Frequency, and Types of Assessments
Due to the short-term nature of this project for the contract laboratory, no assessments are
planned other than general surveillance, a data quality assessment by USACE representatives,
and peer review by USACE and USEPA.

Assessment Personnel
Internal audits of the contract laboratory are regularly performed by their respective QA Officers.

Schedule of Assessment Activities
External audits by the GLNPO QA Officer and/or the GLNPO Project Manager are at their
discretion. The scheduling of regular internal audits at the contract laboratories is at the
discretion of the respective QA/QC Officer.

Reporting and Resolution of Issues
Any findings from audits or other assessments revealing practices or procedures that do not
conform to the written QAPP need to be corrected as soon as possible. The Laboratory Project
Manager and Laboratory QA/QC Officer need to be informed immediately of critical deviations
that compromise the acceptability of the test. For any critical deviations from the QAPP (e.g.,
elevated detection levels, surrogate recoveries outside control limits, etc.) that cannot be
corrected within the laboratory's standard procedures, the Laboratory Project Manager must
contact both the USEPA Project Manager and the USACE Project Coordinator within 24 hours
of being informed of the deviation. The Laboratory Project Manager should be ready to provide
suggestions for corrective action. For non-critical deviations, the USEPA Project Manager and
USACE Project Coordinator need to be informed by the next business day.

Corrective actions should only be implemented after approval by both the USACE Project
Coordinator and the USEPA Project Manager. If immediate corrective action is required,
approvals secured by telephone from the USEPA Project Manager should be documented in an
additional memorandum. In general, communications from the laboratory should follow the
chain-of-command shown in Figure 1. However, if the contract laboratory is unable to
contact the Altech Environmental Services Project Manager on any time-critical matter, the
laboratory shall contact either the USACE Project Coordinator or the USEPA Project Manager
as necessary.

For noncompliance problems, a formal corrective action program will be determined and
implemented at the time the problem is identified. The  person who identifies the problem will be
responsible for notifying the project manager.  Implementation of corrective actions will be
confirmed in writing through the same channels.  The laboratory shall issue a nonconformance
report for each nonconformance condition.

Corrective actions in the laboratory may occur prior to,  during, and after initial analysis.  A
number of conditions, such as broken sample containers, multiple phases, and potentially high
concentration samples may be identified during sample log-in or just prior to analysis.
Following consultation with laboratory analysts and section leaders, it may be necessary for the
Laboratory QA/QC Officer to approve the implementation of corrective actions.  The submitted
SOPs specify some conditions during or after analysis that may automatically trigger corrective
actions of samples, including additional sample extract  cleanup and automatic re-
injection/reanalysis when certain quality control  criteria are not met.

Corrective actions are required whenever an out-of-control event or potential out-of-control
event is noted.  The investigative action taken is  somewhat dependent on the analysis and the
event.

Laboratory personnel are alerted that corrective actions may be necessary if:
       •  QC data are outside the warning or acceptable windows for precision and accuracy
       •  Blanks contain target analytes above acceptable levels
       •  Undesirable trends are detected in spike recoveries or RPD between duplicates
       •  There are unusual changes in detection limits
       •  QC limits for sediment toxicity tests are not  met
       •  Deficiencies are detected by the Laboratory  and/or GLNPO QA Officer(s) during any
          internal or external audits or from the results of performance evaluation samples
       •  Inquiries concerning data quality are received.

Corrective action  procedures are often handled at the bench level by the analyst, who reviews the
preparation or extraction procedure for possible errors,  checks the instrument calibration, spike
and calibration mixes, instrument sensitivity, experimental set-up, and so on.  If the problem
persists or cannot be identified, the matter is referred to the Laboratory Project Manager and/or
Laboratory QA/QC Officer for further investigation. Once resolved, full documentation of the
corrective action procedure is filed with the Laboratory QA/QC Officer.

These corrective actions are performed prior to release of the data from the laboratory. The
corrective actions will be documented in both the laboratory's corrective action log and the
narrative data report sent from the laboratory to the Altech Environmental Services Project
Manager.

If corrective action does not rectify the situation, the laboratory will contact the Altech
Environmental Services Project Manager, who will then contact the USACE Project Coordinator
and USEPA Project Manager to discuss details of the corrective actions and required future
actions.

9.2    Reports to Management

Responsible Organizations
Written QC data and appropriate QA/QC reports generated by the laboratory shall be included in
the Analytical Data Report. The Analytical Data Report will be provided by the laboratory to the
Project Manager(s) identified in Section 1.3 whenever sample measurements are reported. The QC
section of the Analytical Data Report should include the QC data (including
results, recoveries, and RPDs), any non-conformance reports, and chains of custody. The report
should give detailed results of analysis of QC samples, and provide information on the precision,
accuracy, and completeness for each sample run. These written reports will note any significant
QA/QC problems encountered during sample analyses, as well as state the corrective actions
taken.

Any serious QA problems needing immediate decisions will be discussed orally between the
USACE Project Coordinator and laboratory staff, with such discussions documented in writing.
Communication should follow the chain-of-command summarized in Figure 1. These problems
will be noted in the final project report to the USEPA Project Manager.

The USACE Project Coordinator will provide summary QA/QC information in the final written
report to USEPA. This report will include information on adherence of measurements to the QA
objectives. The final report will contain detailed discussions of QA/QC issues, including any
changes in the QAPP, a summary of the contract laboratory QA/QC reports, results of any
internal performance audits, any significant QA/QC problems, detailed information on how well
the QA objectives were met, and their ultimate impact  on decision making.  The following is a
list of items to be included in the final project report:

       •  Changes in the QAPP
       •  Results of any internal system audits
       •  Significant QA/QC problems, recommended solutions, and results of corrective
          actions
       •  Data quality assessment in terms of precision, accuracy, representativeness,
          completeness, and sensitivity
       •  Indication of fulfillment of QA objectives
       •  Limitations on the use of the measurement data.

10.    Data Validation and Usability

The USEPA Project Manager will make a final decision regarding the validity and usability of
the data collected during this project. The project manager will evaluate the entire sample
collection, analysis, and data reporting processes to determine if the data is of sufficient quality
to meet project objectives. Data validation involves all procedures used to accept or reject data
after collection and prior to use. These include screening, editing, verifying, and reviewing
through external performance evaluation audits. Data validation procedures ensure that
objectives for data precision  and bias will be met, that data will be generated in accordance with
the QA project plan and SOPs, and that data are traceable and defensible. The process is both
qualitative and quantitative and is used to evaluate the project as a whole.

Procedures  Used to Validate Field Data
Procedures to evaluate field data for this project primarily include checking for transcription
errors and reviewing field notebooks.  This task will be the responsibility of the WDNR project
manager.

Procedures  Used to Validate Laboratory Data
The Laboratory QA/QC Officer will conduct a systematic review of the analytical data for
compliance with the established QC criteria based on the spike, duplicate, and blank results
provided by the laboratory. All technical holding times will be reviewed, the laboratory
analytical instrument performance will be evaluated, and results of initial and continuing
calibration will be reviewed  and evaluated.

Upon receipt of the draft laboratory report, the U.S. Army Corps of Engineers will perform a
QA/QC review of the analytical report. At a minimum, this review will include an analysis of:

          •   Sample Receipt Verification/Documentation
          •   Detection Limits
          •   Surrogate Recoveries
          •   Laboratory QC Documentation and Results
          •   Holding Time Data
          •   Process Bias and Sensitivity
          •   MS/MSD Recoveries
          •   Analytical Method Documentation

At the conclusion of the review, the U.S. Army Corps of Engineers will prepare a report
describing the results of the review, providing recommendations on data items requiring
corrective action or further documentation/information, and drawing conclusions as to the
usability of the data provided.  A draft report will be provided to the analyzing laboratory and the
U.S. EPA project manager for review and comment prior to finalizing conclusions and
recommendations.

The data review will identify any out-of-control data points and data omissions, and the
Laboratory QA Officer will interact with the laboratory to correct data deficiencies.  Decisions to
repeat sample collection and analysis may be made by the USEPA Project Manager based on the
extent of the deficiencies and their importance in the overall context of the project.

Additionally, the USEPA project manager will compare all field and laboratory duplicates for
RPD. Based on the results of these comparisons, the USEPA project manager will determine the
acceptability of the data.  One hundred percent of the analytical data will be validated.
Reconciliation of laboratory and field duplicates shall be the responsibility of the USEPA project
manager.

Finally, the USACE project coordinator will compare the laboratory methods and results to the
QA/QC Review checklist contained in Appendix B. Any critical problems identified by this
checklist that cannot be rectified through corrective actions may be cause for rejecting portions
or all of the data provided.

                                     APPENDIX A
                           Example Chain-of-Custody Form
                   U.S. Army Corps of Engineers, Detroit District
                           Environmental Analysis Branch

[Blank example chain-of-custody form. Header fields: Address, To, Phone, Fax, Turnaround
time; Project Name, Project #, P.O. #, Sampled by (Name); Samples Received: Cold (Y/N, °C),
Intact (Y/N), Initials. Sample table columns: Sample Identification, Date, Time, Comp/Grab,
Matrix, #, Requested Analysis. Signature blocks: Relinquished By / Received By, with
Date/Time. Lab Comments. Notes.]

                                    APPENDIX B

                Minimum QA/QC Checklist for Data Evaluation

Upon receipt of the Draft Analytical Report, the draft report will be checked to verify that the
following are included:

1.   Project name and number
2.   Date of issuance
3.   Laboratory report contents
4.   Case Narrative
5.   Numbers of samples analyzed
6.   Laboratory analysis performed
7.   Condition  of the samples "as received"
8.   Copy of the cooler receipt form
9.   Any deviations from the intended analytical strategy
10. Discussion of whether or not sample hold times were met
11. Discussion of technical problems or other observations which may have created analytical
    difficulties
12. Discussion of any laboratory QC checks which failed to meet project criteria
13. Analytical test results in spreadsheet format using USACE sample I.D.s and laboratory
    sample I.D.s
14. Summary page indicating dates of analyses for samples and laboratory quality control
    checks
15. Analytical test methods utilized
16. Quality control test results
17. Descriptions of data qualifiers
18. Matrix spike/matrix spike duplicate recoveries, laboratory control samples, method blank
    results, calibration check compounds, system performance check compound results, and
    precision results
19. Statement  signed by laboratory QA/QC officer that all data and information submitted is
    valid.


                                  APPENDIX C

                            COOLER RECEIPT FORM

LIMS #                                              Number of Coolers
PROJECT:	  Date/Time Received

A. PRELIMINARY EXAMINATION PHASE:
Cooler opened by (print)	  (sign)	
1.  Did cooler come with a shipping label (air bill, etc)?	YES   NO
   If yes, enter carrier name & air bill number	
2.  Were custody seals outside of cooler?	YES   NO
How many & where	, sealed date:	seal name:	

3.  Were seals unbroken and intact at the date and time of arrival?	YES   NO

4.  Were Chain-of-Custody papers in a plastic bag & taped to the cooler lid?	YES  NO

5.  Were Chain-of-Custody papers filled out properly?	YES  NO

6.  Did you sign the Chain-of-Custody papers in the appropriate location?	YES NO

7.  Were temperature blanks used?	YES NO
   Cooler Temperature	(°C) Thermometer ID No.
8.  Have designated person initial here to acknowledge receipt of
   cooler:                    Date/time
Comments:
                                                                     Continued


                        COOLER RECEIPT FORM Continued

B. LOG-IN PHASE: Date samples were logged in:	

By (print)	(sign)	

11. Describe type of packing in cooler:	
12. Were all bottles sealed in separate plastic bags?	YES NO

13. Did all bottles arrive unbroken with labels in good condition?	YES NO

14. Were all bottle labels complete (ID, date, time, initials, etc.?)	YES NO

15. Did all labels agree with Chain-of-Custody?	YES NO

16. Was a sufficient amount of sample sent for tests indicated?	YES NO

17. If answered NO to any of the above, was laboratory manager notified and project manager
called to discuss	YES NO

Document discussion/comments:


                                    APPENDIX D

                  SHIPPING CONTAINER CHECKLIST SUMMARY

Failure to properly handle or document the Project samples could jeopardize the usability of the
sample results and ultimately the Project.  Prior to sending a cooler to the Analytical Laboratory
please check the following items:

  *    Is the Project clearly identified on the Chain-of-Custody (official project name, project
       location)?

  *    Are all enclosed sample containers clearly labeled with waterproof ink, is the label
       covered with clear tape, enclosed in a plastic bag, and wrapped in bubble wrap?

  *    Are the sample labels complete?

  *    Are the desired analyses indicated  on the bottle labels and Chain-of-Custody?

  *    Does the information on the Chain-of-Custody match the information on the sample
       container labels?

  *    Is the sample identification clearly marked on the sample container enclosure with
       waterproof ink?

  *    Has the Chain-of-Custody been placed into a plastic bag and attached to the inside of the
       cooler lid?

  *    Has the shipping Bill of Lading been properly and clearly filled out, including laboratory
        contact name and phone number?

  *    Is there sufficient ice (double bagged in ziploc baggies) or "blue ice" in the cooler?

  *    Are the sample containers secured (no free space between containers) with bubble wrap or
       equivalent?

                                 APPENDIX E
                         Sampling Station Coordinates
Station          Degrees Minutes Seconds      Degrees Minutes         State Plane NAD 83
Identification
KK0201           N 43° 00' 24.582"            N 43° 00.4097'          N 373938.5418
                 W 87° 54' 49.461"            W 87° 54.8244'          E 2526476.8398
KK0202           N 43° 00' 26.052"            N 43° 00.4342'          N 374084.8365
                 W 87° 54' 50.774"            W 87° 54.8462'          E 2526375.5895
KK0203           N 43° 00' 27.127"            N 43° 00.4521'          N 374193.0581
                 W 87° 54' 51.121"            W 87° 54.8520'          E 2526374.0692
KK0204           N 43° 00' 27.587"            N 43° 00.4598'          N 374242.5581
                 W 87° 54' 49.501"            W 87° 54.8250'          E 2526466.2548
KK0205           N 43° 00' 29.083"            N 43° 00.4847'          N 377391.5662
                 W 87° 54' 50.804"            W 87° 54.8467'          E 2526365.6385
KK0206           N 43° 00' 28.793"            N 43° 00.4799'          N 374368.3612
                 W 87° 54' 47.492"            W 87° 54.7915'          E 2526612.4021
KK0207           N 43° 00' 30.135"            N 43° 00.5022'          N 374505.5048
                 W 87° 54' 46.746"            W 87° 54.7791'          E 2526664.4580
KK0208           N 43° 00' 29.027"            N 43° 00.4838'          N 374397.3218
                 W 87° 54' 44.620"            W 87° 54.7437'          E 2526825.1040
KK0209           N 43° 00' 29.594"            N 43° 00.4932'          N 374455.7424
                 W 87° 54' 44.067"            W 87° 54.7344'          E 2526864.7492
KK0210           N 43° 00' 30.668"            N 43° 00.5111'          N 374570.0809
                 W 87° 54' 41.053"            W 87° 54.6842'          E 2527085.9289
KK0211           N 43° 00' 30.479"            N 43° 00.5080'          N 374554.1841
                 W 87° 54' 39.337"            W 87° 54.6556'          E 2527213.8477
KK0212           N 43° 00' 29.352"            N 43° 00.4892'          N 374444.6494
                 W 87° 54' 36.910"            W 87° 54.6152'          E 2527396.9646
KK0213           N 43° 00' 29.196"            N 43° 00.4866'          N 374432.3383
                 W 87° 54' 35.002"            W 87° 54.5834'          E 2527539.0922
KK0214           N 43° 00' 30.198"            N 43° 00.5033'          N 374540.8733
                 W 87° 54' 31.186"            W 87° 54.5198'          E 2527819.9944

-------
                                  EXHIBIT 1
        KINNICKINNIC RIVER ILLUSTRATION OF SEDIMENT BORINGS, 2002

[Figure: schematic of the sediment borings showing the sediment surface (depths vary) and
boring depth intervals in feet, with intervals designated for environmental analyses,
geotechnical analyses, and Atterberg limits.]
-------
                                  FIGURE 2
                             KINNICKINNIC RIVER
                 SEDIMENT SAMPLING STATION LOCATIONS, 2002

[Map figure: proposed environmental sampling locations on the Kinnickinnic River.
Scale: 1 cm = 100 ft. Legend: borings to 18 ft; borings to 26 ft; area that may not be
accessible to the drill rig barge.]
-------
                                  FIGURE 1
                       MILWAUKEE HARBOR LOCATION MAP

[Map figure: regional location map of the Great Lakes area showing Milwaukee and the
Milwaukee Harbor project location.]
-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002
                             Quality Assurance Project Plan
                                Title and Approval Sheet
                             Version 1.0, December 23, 2001
                         "Great Lakes Sediment Data Support"
                                IAG No.: DW13947973-01
                         Demaree Collier, Project Officer, GLNPO,
                 77 West Jackson Blvd (G-17J), Chicago, Illinois 60604-3590
                         Phone:  312-886-0214, Fax:312-353-2018
                     Scott Cieniawski, Data Entry Coordinator, GLNPO,
                 77 West Jackson Blvd (G-17J), Chicago, Illinois 60604-3590
                          Phone: 312-353-9184, Fax:312-353-2018
                    Jay Field, Project Manager, Hazmat (N/ORCA3), NOAA
                      7600 Sand Point Way NE, Seattle, Washington 98115
                          Phone: 206-526-6404, Fax:  206-526-6865
                       Louis Blume, Quality Assurance Manager, GLNPO
                        77 W. Jackson Blvd. (G-17J), Chicago, IL 60604
                          Phone(312)353-2317, Fax (312) 353-2018


                               QAPP Distribution List

Demaree Collier, Project Officer
Lou Blume, QA Manager
USEPA- GLNPO
77 West Jackson Blvd (G-17J)
Chicago, Illinois 60604-3590
phone: 312-353-9184

Jay Field, Project Manager
Hazmat (N/ORCA3), NOAA
7600 Sand Point Way NE
Seattle, Washington 98115
Phone: 206-526-6404


TABLE OF CONTENTS

  QAPP Signature Page  . . . . . . . . . . . . . . . . . . . . . . . . . . . . .    1
  QAPP Distribution List . . . . . . . . . . . . . . . . . . . . . . . . . . . .    2
  Table of Contents  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .    3

  1.      Summary  . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .    4
          1.1    Purpose . . . . . . . . . . . . . . . . . . . . . . . . . . . .    4
          1.2    Background  . . . . . . . . . . . . . . . . . . . . . . . . . .    4
          1.3    Project Organization  . . . . . . . . . . . . . . . . . . . . .    5
  2.      Project Description  . . . . . . . . . . . . . . . . . . . . . . . . .    7
          2.1    Expected Measurements and Data Uses . . . . . . . . . . . . . .    7
          2.2    Description of Work to be Performed . . . . . . . . . . . . . .    7
          2.3    Special Personnel, Training, and Equipment Requirements . . . .    8
          2.4    Project Schedule  . . . . . . . . . . . . . . . . . . . . . . .    8
  3.      QA/QC Requirements for Population and Use of Database  . . . . . . . .    8
          3.1    Identification and Evaluation of Candidate Data Sets  . . . . .    9
          3.2    Check Data In MS Excel Format . . . . . . . . . . . . . . . . .    9
          3.3    Check Data In MS Access Format  . . . . . . . . . . . . . . . .   10
          3.4    Field Data Collection . . . . . . . . . . . . . . . . . . . . .   11
          3.5    Documentation Provided to Data Users  . . . . . . . . . . . . .   13
                 3.5.1   Custodial Records . . . . . . . . . . . . . . . . . . .   13
                 3.5.2   Technical Documentation . . . . . . . . . . . . . . . .   13
  4.      Response Actions . . . . . . . . . . . . . . . . . . . . . . . . . . .   13

  List of Figures
  Figure 1.     Organizational Chart . . . . . . . . . . . . . . . . . . . . . .    5

  List of Tables
  Table 1.      Tentative Project Schedule . . . . . . . . . . . . . . . . . . .    8
  Table 2.      List of Data Qualification Codes . . . . . . . . . . . . . . . .   12

  Appendices
  Appendix A:   Data Quality Documentation Checksheet
  Appendix B:   Minimum Requirements for Laboratory Data Package Reports
  Appendix C:   QA/QC Audit Checklists
  Appendix D:   Technical Documentation of Data Translation Process


1.      SUMMARY

  1.1    Purpose

  The U.S. EPA's Great Lakes National Program Office (GLNPO) and the National Oceanic and
  Atmospheric Administration (NOAA) are coordinating on a project to develop and populate a
  database system to house GLNPO's historical and future sediment data. Under the project,
  sediment data will be formatted using NOAA's existing Query Manager (QM) database format
  for use in the MARPLOT visualization software, a general-purpose mapping application.
  Additionally, this data will be stored for many years and will be made available to other federal,
  state, and local agencies, as well as private citizens and companies as requested.  Therefore, it is
  important to ensure that a high quality database is created.

  In order to ensure construction of a high-quality database, it is important that certain quality
  assurance/quality control (QA/QC) actions be followed during development and population of
  the database. This Quality Assurance Project Plan (QAPP) details the required QA/QC
  components to be adhered to during this project.

  1.2    Background

  Since 1993, GLNPO has been providing funds for the assessment of sediments within the Great
  Lakes Basin, through an annual grants program.  The data collected during these sediment
  assessment surveys has been primarily used by GLNPO and the grantee to determine current
  sediment quality conditions, to prioritize sediment management decision, and for site-specific
  program-focused objectives (i.e., regulatory enforcement).  For these purposes, it was
  considered unnecessary to house the data in a relational database, since each of the individual
  programs only dealt with a relatively small amount of data, and the data was primarily stored in
  hard copy format, or in electronic spreadsheet tables.

  However, in recent years, GLNPO has received numerous requests for sediment data for use in
  determining spatial and temporal trends (e.g. the National Sediment Inventory), prioritizing
  sediment management decisions on a regional basis, and other potential uses. It has become
  apparent that the current format of the GLNPO sediment data is too unwieldy for widespread
  dissemination.  The QM database format provides several advantages over other existing
  database formats, including:

            1.  Ability to incorporate both chemical and toxicity testing data,
            2.  Compatibility with the universally available MS ACCESS database system, and
             3.  Streamlined compatibility with the existing and universally available MARPLOT
               visualization software.

  Therefore, the QM format was selected for use with the GLNPO sediment data.

  Several complicated, time-intensive, and error-prone steps will be required in order to convert
  the sediment data from its  current format into the QM format for use with the MARPLOT
  software. It is important, therefore, that the QA/QC steps detailed in this document be adhered
  to during population and development of the database. The rest of this document details the
responsibilities, tasks, and QA/QC efforts necessary to ensure the development of a high-quality
sediment database.
1.3    Project Organization

  Figure 1 summarizes the project organization.  A description of the duties of each of the major
  participants in the population of the database is provided below.

  Figure 1. Organizational Chart

     PROJECT OFFICER:          Demaree Collier, USEPA-GLNPO, 312-886-0214
     PROJECT MANAGER:          Jay Field, NOAA Hazmat, 206-526-6404
     DATA ENTRY COORDINATOR:   Scott Cieniawski, USEPA-GLNPO, 312-353-9184
     GLNPO QA MANAGER:         Louis Blume, USEPA-GLNPO, 312-353-2317
     DATABASE DEVELOPER:       Corinne Severn, Premier Environmental Services, 702-255-9685
     DATABASE DEVELOPER:       Peggy Myre, EVS Environmental Consultants, 206-217-9337

USEPA-GLNPO
USEPA-GLNPO is currently in charge of data management for the data sets to be used on this
project, and GLNPO and its Great Lakes partners will be the principal users of the sediment
database.

GLNPO is responsible for development of the project QAPP, data input and conversion, and
QA/QC reviews of the candidate data sets prior to and after inclusion into the Query Manager
format.  USEPA-GLNPO staff associated with this project include:
Person:
Demaree Collier
Project Officer
77 W.Jackson Blvd. (G-17 J)
Chicago, IL 60604
phone: 312-886-0214
collier.demaree@epa.gov
          Responsibilities:
          Coordinate GLNPO and NOAA Activities
          Approve Work Plan and QAPP
          Participate in QA/QC Review of Candidate Data Sets
          Participate in Data Entry Activities
          Participate in QA/QC Review of Formatted Data Sets
          Participate in Query Manager Training
          Controls and tracks the release of data to requestors

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002
Scott Cieniawski                           Prepare QAPP
Data Entry Coordinator                     Participate in QA/QC Review of Candidate Data Sets
77 W. Jackson Blvd. (G-17J)                Participate in Data Entry Activities
Chicago, IL 60604                          Participate in QA/QC Review of Formatted Data Sets
phone: 312-353-9184                        Participate in Query Manager Training
cieniawski.scott@epa.gov

Louis Blume                                Review/Approve QAPP
GLNPO QA Manager
77 W. Jackson Blvd. (G-17J)
Chicago, IL 60604
phone: 312-353-2317
blume.louis@epa.gov
National Oceanic and Atmospheric Administration
NOAA representatives will provide contract management, technical support, and database and
MARPLOT software training for this project. NOAA representatives involved in this project
include:
Person:
Jay Field
Project Manager
NOAA Hazmat (N/ORCA3)
7600 Sand Point Way NE
Seattle, Washington 98115
Phone: 206-526-6404
       Responsibilities:
       Contract for database development
       Contract and coordinate computer training
       Provide technical support on database formatting
       questions to USEPA
       Review and Approve QAPP
       Review technical documentation for database
Contractors
The contractors will be responsible for development of data tools and documentation as directed
by the NOAA project manager.
Person:                                    Responsibilities:
Peggy Myre                                 Develop User Interface for Database Conversions
EVS Environmental Consultants              Provide training on use of database
Phone: 206-217-9337                        Prepare technical documentation on database format

Corinne Severn                             Develop User Interface for Database Conversions
Premier Environmental Services             Provide training on use of database
13900 Panay Way, Suite SR 305              Prepare technical documentation on database format
Marina del Rey, CA 90292
Phone: 310-578-9667

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002                                                 7

2.     Project Description

2.1    Expected Measurements and Data Uses

No new sediment surveys will be undertaken as part of this project. Instead, historical sediment
quality data will be compiled, evaluated and used to populate a sediments database utilizing the
existing database structure, Query Manager. This project is intended to provide data from
GLNPO-sponsored sediment sampling surveys to all interested users in a consistent, usable
format.

Currently, the database structure is only set up to handle data relating to the collection and
analysis of sediment and tissue chemistry and biological toxicity samples. Information that we
expect to compile and format during this data compilation effort includes:

           1.  Results of Sediment Chemistry Analysis (including QA/QC samples),
           2.  Results of Biological Toxicity Testing (including QA/QC samples),
           3.  Locational Information on Sampling Points,
           4.  Information on Principal Investigator(s), and
           5.  Information on Physical Attributes of Samples.

This GLNPO sediment database will have several useful applications, including allowing users:

            1. To openly and easily access data on sediment chemistry, biological testing, and
               physical properties of sediments,
            2. To view and analyze data spatially on electronic mapping software,
            3. To perform spatial and temporal trend analyses on the data,
            4. To identify priority sites for additional investigations or sediment management
               decisions,
            5. To make decisions regarding the quality and usability of the data for their
               intended purposes,
            6. To include GLNPO data in site-specific, regional, and national sediment inventories,
            7. To compare these data to data collected in other surveys, and
            8. To support other, as yet unanticipated, uses of the data.
2.2    Description of Work to be Performed

The following work items (described in Chapter 3) are required to be accomplished during this
project:

          1.  Development of User Interface to Convert data from MS Excel format to MS
             Access format,
          2.  Identification and QA/QC Evaluation of Candidate Data Sets,
          3.  Input Data into Query Manager MS Excel Data Format,
          4.  QA/QC Evaluation of MS Excel formatted data,
          5.  Convert data from MS Excel to MS Access format,
          6.  QA/QC Evaluation of MS Access formatted data,
          7.  Produce technical guidance for formatting and converting data sets,
          8.  Maintain custodial records of electronic and hard copy data sets.
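
For illustration only, the sketch below shows the general read-check-load pattern behind work
items 1, 3, and 5 above. The actual conversion into MS Access is performed with the
NOAA-supplied routines documented in Appendix D; this sketch is written in Python, uses
SQLite as a stand-in for MS Access, and assumes hypothetical file, sheet, and column names.

    # Illustrative sketch only -- the project uses the NOAA-supplied conversion
    # routines (Appendix D) to load data into MS Access. The file name, sheet
    # name, and column names below are hypothetical.
    import sqlite3
    import pandas as pd

    # Read the Query Manager-style MS Excel worksheet (hypothetical layout).
    results = pd.read_excel("qm_sediment_results.xlsx", sheet_name="results")

    # Minimal completeness check before loading.
    required = {"station_id", "analyte", "result", "units", "qualifier"}
    missing = required - set(results.columns)
    if missing:
        raise ValueError(f"Worksheet is missing required columns: {sorted(missing)}")

    # Load the worksheet into a relational table (SQLite stands in for MS Access).
    with sqlite3.connect("glnpo_sediment_sketch.db") as conn:
        results.to_sql("sediment_results", conn, if_exists="replace", index=False)
        count = conn.execute("SELECT COUNT(*) FROM sediment_results").fetchone()[0]

    print(f"Loaded {count} records ({len(results)} rows read from the worksheet).")

A mismatch between the number of rows read and the number of records loaded would be one
simple indicator of a conversion problem.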

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002                                                 8

2.3     Special Personnel, Training, and Equipment Requirements

Database Population and Use
Formatting data for inclusion into the GLNPO database is a complicated process and requires
knowledge or training in MS Excel, MS Access, the Query Manager data reporting format, the
data conversion routines to convert data from MS Excel to MS Access format, and MARPLOT
visualization software. NOAA personnel and contractors will train GLNPO personnel on the use
of the QM system and MARPLOT  software during a 2-day training session to be held in
November 2001.

Technical guidance for the conversion routines will be provided as part of this project. Training
in the reporting format, data conversion, and the MARPLOT  software will also take place as part
of this project. Familiarity with MS Excel and MS Access will be required of those participating
in the data entry and/or conversion  process.
2.4    Project Schedule

A tentative project schedule is provided in Table 1. All personnel shown in Figure 1 should be
contacted regarding significant schedule changes.

Table 1. Tentative Project Schedule

Task                                                       Completion Date
Training for GLNPO Personnel                                November 15, 2001
Production of Conversion Routines                             December 31, 2001
QAPP Development and Sign-Off                              January 15, 2002
Selection and Evaluation of Candidate Data Sets                 January 31, 2002
Input Data Sets into MS Excel Format                          February 7, 2002
QA/QC Evaluation of MS Excel Formatted Data                 February 15, 2002
Conversion of Data into MS Access Format                     February 20, 2002
QA/QC Evaluation of MS Access Formatted Data                February 28, 2002
Production of Technical Guidance for Conversion Process         March 31, 2002
 3.     QA/QC Requirements for Population and Use of Database

There are five main project tasks where it is critical for the project participants to implement
thorough QA/QC strategies, in order to ensure the success of the project. These tasks include:
          1.  Identify and Evaluate Candidate Data Sets,
          2.  Check Data after Inputting into Query Manager MS Excel Data Format,
          3.  Check Data after Converting from MS Excel to MS Access format,
          4.  Provide Users with Proper Qualification Codes and Documentation to Aid Proper
             Use of Data, and
          5.  Maintain custodial records of electronic and hard copy data sets.

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002

Each of these tasks is vital to producing a quality database, as well as for troubleshooting
potential problems during the project. The QA/QC strategies required for each of these five
project tasks are described in detail in the remainder of Section 3.

The QA/QC review process will be documented using the Data Quality Documentation
Checksheet provided in Appendix A.
3.1    Identification and Evaluation of Candidate Data Sets

All sediment chemistry and toxicity data collected and/or analyzed since 1993 utilizing GLNPO
funding is considered eligible for inclusion in this database.  However, GLNPO has determined
that the data sets to be included in this sediment database must meet the following criteria:

          1. Data must have been collected and analyzed under a QAPP or QMP approved by
             GLNPO,
          2. GLNPO must have possession of Final Approved QAPP and a copy of the
             Laboratory Data Report Package (either electronic or hard copy). Minimum
             requirements for this report are provided in Appendix B,
          3. Data must pass the Post-Sampling QA/QC Audit Checklists provided in Appendix
             C, and
          4. Data Analysis must have been performed utilizing standard method and
             procedures, wherever available (e.g. ASTM, SW-846, etc.).

**  GLNPO Representatives will be responsible for all evaluation procedures described above.

The data checklists were developed in consultation between GLNPO and state agency
representatives, and provide a method for ensuring a minimum quality level for data put into the
database.  Any data points that are questionable, due to failing QA/QC requirements or an
inability to confirm questionable numbers (e.g., contaminant concentrations several orders of
magnitude higher than those at other nearby sites), will be qualified, if possible (see discussion
in Section 3.4), or deleted from the data set. Note that all or only portions of a given data set
may be excluded from use in the database based on the screening criteria described above.

Additionally, since we are dealing with historical data, GLNPO representatives must verify the
current contact information for the Principal Investigator (PI).  If GLNPO is unable to verify
contact information for the PI, GLNPO personnel's contact information will be substituted for
the PI's information.
3.2    Check Data In MS Excel Format

GLNPO personnel will be responsible for translating all selected data sets from the hard copy or
electronic format to the MS Excel Query Manager reporting format. This translation process
will require significant data input and manipulation and provides ample opportunities for
typographical and data entry errors. In fact, this step is probably the most error-prone task of the
entire project.

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002                                              10

Therefore, in order to check for errors during the data entry process, all data (laboratory analysis
and locational data) entered in the MS Excel format will be double-checked, 100% number for
number, in order to ensure 100% accuracy of the data input process. This data check will
entail the use of two (2) GLNPO personnel, and will proceed as follows:

          1. One person will be responsible for reviewing the original hard copy or
             electronic laboratory format (Person #1) and one person will be responsible for
             reviewing the MS Excel formatted data set (Person #2),
          2.  Person #1  will read off a line of data, including: site number, analysis performed,
             numerical analysis result, units of measure, and any accompanying qualifying
             code,
          3. Person #2 will read off the corresponding line of data from the MS Excel formatted
             data set, including: site number, analysis performed, numerical analysis result,
             units of measure, and any accompanying qualifying code,
          4. Persons #1 and #2 will  confirm the accuracy of each line of data or make
             corrections as required,
          5. Repeat Steps 2 through 4 until all lines of data have been checked,
          6. Sign and Document QA/QC review on Data Quality Documentation Checksheet
             (Appendix A)

** GLNPO personnel are responsible for completing this portion of the QA/QC Evaluation.
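
Where the original laboratory deliverable is already electronic, a simple comparison script can
supplement (but not replace) the two-person read-back check above by flagging rows that differ
between the laboratory file and the MS Excel formatted file. The sketch below is illustrative
only; the file names and column names are hypothetical.

    # Supplemental sketch only -- it does not replace the two-person read-back
    # check. File names and column names are hypothetical.
    import pandas as pd

    KEY_COLS = ["station_id", "analyte"]
    CHECK_COLS = ["result", "units", "qualifier"]

    lab = pd.read_excel("lab_report_original.xlsx")        # electronic lab deliverable
    entered = pd.read_excel("qm_sediment_results.xlsx")    # QM-formatted data entry

    # Align the two files on station/analyte and compare the entered values.
    merged = lab.merge(entered, on=KEY_COLS, how="outer",
                       suffixes=("_lab", "_entered"), indicator=True)

    problems = merged[merged["_merge"] != "both"]          # rows present in only one file
    for col in CHECK_COLS:
        mismatch = merged[(merged["_merge"] == "both") &
                          (merged[f"{col}_lab"] != merged[f"{col}_entered"])]
        problems = pd.concat([problems, mismatch])

    print(f"{len(problems)} rows need manual review")
    problems.to_csv("entry_check_discrepancies.csv", index=False)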
3.3    Check Data In MS Access Format

The MARPLOT visualization software requires a relational database format for use in viewing
the sediment data.  Since MS Excel is not a relational database format the MS Excel spreadsheets
must be converted to a relational database format.  Therefore, after completing the QA/QC
review described in Section 3.2 above, each data set will be converted from MS Excel format
into MS Access format utilizing the conversion routines and technical documentation provided in
Appendix D.

The conversion routines developed for this project have several QA/QC checks imbedded into
the routines (see Appendix D for details). These imbedded QA/QC routines are helpful, but they
are incidental to the overall QA/QC evaluation of the final database. The next critical step of the
QA/QC evaluation is to ensure that the  data has been properly converted from MS Excel format
into MS Access format.

The conversion routines used to translate the data from MS Excel worksheets to an MS Access
database are less error prone than the initial data entry steps. Additionally, errors in the
conversion routines would tend to be universal, rather than individual (e.g., translation of all
PAH data from ppm to ppb). Therefore, although the same method will be used to check this
converted data as was used in Section 3.2 above, a less exhaustive process will be employed.
The MS Access data sets will be double-checked on a semi-random, 10% basis. A total of
10% of all data points will be checked. The semi-random method refers to the fact that, instead
of checking a random 10% of all data entered, 10% of all site locations will be checked for 100%
of their data (a sketch of this station selection step is provided after the numbered list below).
This data check will entail the use of two (2) GLNPO personnel, and will proceed as follows:

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002                                              11

          1. Use of a random number generator to select 10% of station locations to check,
          2. One person will be responsible for reviewing the original hard copy or
             electronic laboratory format (Person #1) and one person will be responsible for
             reviewing the MS Access formatted data set (Person #2),
          3.  Person #1  will read off a line of data, including: site number,  analysis performed,
             numerical analysis result, units of measure, and any accompanying qualifying
             code,
          4. Person #2 will read off the corresponding line of data from the MS Access formatted
             data set, including: site number, analysis performed, numerical analysis result,
             units of measure, and any accompanying qualifying code,
          5. Persons #1 and #2 will confirm the accuracy of each line of data or make a note of
             errors encountered,
          6. Repeat Steps 3 through 5 until all randomly selected lines of data have been
             checked,
          7. If ANY errors are encountered the entire MS Access formatted data set should be
             double-checked, 100% number for number, in order to determine if there
             are any imbedded errors in the conversion routines, or in the implementation
             of the conversion routines. Representatives should also check the MS Excel
             spreadsheets to determine if the original spreadsheets contained the
             identified errors.
          8. If ANY errors are encountered, NOAA representatives shall be contacted for
             trouble-shooting support and for assistance in determining if changes need to be
             made in the conversion routines or documentation (all data input and conversion
             activities shall stop until the errors have been identified and corrected.)
          9. If no errors are discovered, Personnel shall sign and document QA/QC review on
             Data Quality Documentation Checksheet (Appendix A)

** GLNPO personnel are responsible for completing this portion of the QA/QC Evaluation.
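
The station selection in Step 1 can be carried out with any documented random number
generator. The sketch below is illustrative only; the database, table, and column names are
hypothetical, and the fixed seed is shown simply so that the draw can be recorded on the Data
Quality Documentation Checksheet.

    # Sketch of Step 1 (semi-random selection): pick 10% of station locations,
    # then pull 100% of the records for those stations for the number-for-number
    # check. Database, table, and column names are hypothetical.
    import math
    import random
    import sqlite3

    with sqlite3.connect("glnpo_sediment_sketch.db") as conn:
        stations = [row[0] for row in
                    conn.execute("SELECT DISTINCT station_id FROM sediment_results")]

        rng = random.Random(20020124)        # fixed seed so the draw can be documented
        n_check = max(1, math.ceil(0.10 * len(stations)))
        selected = sorted(rng.sample(stations, n_check))
        print(f"Checking {n_check} of {len(stations)} stations: {selected}")

        placeholders = ",".join("?" * len(selected))
        rows = conn.execute(
            "SELECT station_id, analyte, result, units, qualifier "
            f"FROM sediment_results WHERE station_id IN ({placeholders})",
            selected).fetchall()

    print(f"{len(rows)} records to verify against the MS Excel worksheets.")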
3.4    Documentation Provided to Data Users

The relational database constructed through these efforts represents a secondary use of sediment
chemistry and toxicity data. For most projects, the primary use of the data was for the grantee
to characterize current sediment quality conditions at a particular site and to make sediment
management decisions.  Often, the primary use is the only data use specified in a QAPP,
and therefore, it is difficult to determine if the data collected was of sufficient quality to support
the multitude of secondary uses that users of the database may envision.

The documentation contained within the database provides adequate information for most users
to determine if the data provided is of sufficient quality to be utilized for their intended purposes.
At the very least the user will be able to determine if additional QA/QC information is necessary
to support their intended use. The database also contains contact and report availability
information for the user to obtain additional QA/QC data if required.

Additionally, the data will only be available upon individual requests from GLNPO Sediment
Assessment and Remediation Team members. Whenever such requests are fulfilled, the full
GLNPO MS Access formatted sediment database will be sent to the requestor accompanied by
the qualifications provided below, and the data qualification codes provided in Table 2.  Because

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002
                                                                                           12
the relational database may be used independently of the MARPLOT software and MS Access is
a universally available database system, no documentation will be provided on use of the
MARPLOT visualization tool or MS Access.
QUALIFICATION
The sediment data contained in the database was collected and analyzed under a Quality
Assurance Project Plan (QAPP) and/or Quality Management Plan (QMP) approved by GLNPO.
However, the quality of the data contained in this database may not be sufficient for use for any
purposes except the purposes outlined in the original QAPP/QMP document. IT IS THE
RESPONSIBILITY OF THE USER TO ENSURE THAT THE QUALITY OF THE DATA IS
SUFFICIENT TO MEET ITS INTENDED USE.  The database contains contact information for
the principal investigator, and the user is strongly advised to contact the PI to discuss any
questions regarding quality assurance and quality control and potential uses of the data.

Table 2. List of Data Qualification Codes

                              GLNPO QUALIFIER CODES
  Code    Description
  B       Analyte was detected in laboratory method blank.
  E       Estimated Concentration. Analyte to internal standard ratio exceeds the
          range of the calibration curve.
  I       Estimated maximum possible concentration due to peak interferences.
  J       Estimated Concentration. Result is greater than the detection limit, but
          less than the reporting limit.
  LD      Lab Duplicate: Batch quality control for the lab duplicate exceeds upper
          or lower control limits.
  LS      Lab Surrogate: Batch quality control for the lab surrogate exceeds upper
          or lower control limits.
  M       MS/MSD recoveries exceed the upper or lower control limits.
  MX      MS/MSD: Analyte was present in the original sample at a concentration X
          times greater than the spike concentration; therefore, control limits for
          the MS/MSD samples are not applicable.
  P       The lower of the two values is reported when the percent difference
          between the results of two GC columns is greater than applicable control
          limits.
  Q       Result was qualitatively confirmed, but not quantified.
  U       Analyte was not detected at or above the reporting limit.
  X       Estimated value. Analyte co-elution resulted in an inability to
          differentiate analytes.
** GLNPO personnel are responsible for providing the data users with the documentation
described in this section.
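
For illustration, one simple way to carry the Table 2 definitions alongside a data release is a
code-to-description lookup such as the sketch below; only a subset of the codes is shown, and
the descriptions are taken from Table 2.

    # Sketch: expand selected GLNPO qualifier codes (Table 2) for a data requestor.
    QUALIFIER_CODES = {
        "B": "Analyte was detected in laboratory method blank.",
        "J": "Estimated concentration; result is greater than the detection limit "
             "but less than the reporting limit.",
        "LS": "Lab surrogate: batch quality control for the lab surrogate exceeds "
              "upper or lower control limits.",
        "U": "Analyte was not detected at or above the reporting limit.",
    }

    def describe_qualifier(code: str) -> str:
        """Return the Table 2 description for a qualifier code, if known."""
        return QUALIFIER_CODES.get(code.strip().upper(), "Unrecognized qualifier code")

    print(describe_qualifier("j"))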

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002                                              13

3.5     Custodial Records and Documentation

3.5.1   Custodial Records

A separate GLNPO Sediment Database project file will be maintained for this project for as long
as the GLNPO Sediment Database is maintained. This project file will be stored at the GLNPO
offices.  A separate folder will be maintained for each dataset included in the sediment database.
The following items will be maintained in the folder for each dataset:

          1.  Final approved QAPP or QMP,
          2.  Copy of the Laboratory Data Report Package (either electronic or hard copy),
          3.  Completed Post-Sampling QA/QC Audit Checklists (see Appendix C),
          4.  Final project report summarizing the data, and
          5.  Completed Data Quality Documentation Checksheet (see Appendix A)
3.5.2   Technical Documentation

Technical documentation on the data translation routines is contained in Appendix D. This
documentation will be stored and maintained in the GLNPO Sediment Database project files for
as long as the GLNPO Sediment Database is maintained.  Whenever this documentation is
updated, all previous versions of the technical documentation will be removed from the project
files.

** Demaree Collier of GLNPO is responsible for maintaining the GLNPO Sediment Database
project files. It is her responsibility to keep the files current.  She is also responsible for
delegating this responsibility to another member of the GLNPO Sediment Assessment and
Remediation Team if and when she leaves the service of GLNPO.
4.     Response Actions

Potential problems and errors on this project fall into three major categories: (1) Data entry
errors, (2) imbedded conversion routine errors, and (3) incomplete or incorrect technical
documentation.

Data Entry Errors
All data entry errors will be identified and corrected by USEPA GLNPO personnel. Scott
Cieniawski and Demaree Collier will have the responsibility of identifying and correcting data
entry errors utilizing the steps outlined in Section 3. No documentation of data entry errors is
required.

Imbedded Conversion Routine Errors and Incomplete or Incorrect Technical Documentation
All problems involving conversion routines and technical documentation will be referred
immediately to Jay Field of NOAA. Either Scott Cieniawski or Demaree Collier may contact
Mr. Field directly. Response actions will be documented and carried out as follows, with the
responsible party shown in parentheses:

-------
GL Sediment Data Support QAPP, Final Draft, January 24, 2002                                              14

          1.  Identify and Document Problem in an E-mail (GLNPO personnel)
          2.  Contact Jay Field of NOAA via E-mail (GLNPO Personnel)
          3.  Identify and Correct Cause of the Problem (NOAA Personnel and/or Contractors)
          4.  Document Corrective Action in E-mail to Demaree Collier (NOAA Personnel)

-------
Raisin River QAPP, Final, October 4, 2001
                                Quality Assurance Project Plan
                                   Title and Approval Sheet
                                Final Version, October 4, 2001
                         Raisin River Sediment Sampling in FY2002:
                   A Follow-Up to the 1997 Sediment Remediation Project
                                     IAG#DW96947964-01
                             Scott Cieniawski, Project Officer, GLNPO
         77 West Jackson Blvd (G-17J), Chicago, Illinois 60604-3590, Office Phone:312-353-9184,
                       Cellular Phone (field) 312-961-0592, Fax: 312-353-2018
              Paul Baxter, Project Coordinator, U.S. Army Corps of Engineers - Detroit District
                          19215 W. Eight Mile Road, Detroit, Michigan 48226,
                              Phone (313) 226-7555, Fax (313) 226-2013
                   Patricia Novak, Project Manager, Lakeshore Engineering Services, Inc.
                          19215 W. Eight Mile Road, Detroit, Michigan 48219,
                              Phone (313) 535-7882, Fax (313) 535-7875
                     Ann Preston, Project Manager, Trace Analytical Laboratories, Inc.
                        2241 Black Creek Rd., Muskegon, Michigan 49444-2673,
                          Phone (231) 773-5998, Ex. 224, Fax (231) 773-6537
                           Al Mozol, Operations Manager, AScI Corporation
                             444 Airpark Blvd., Duluth, Minnesota 55811,
                              Phone (218) 722-4040, Fax (218) 722-2592
                   Kathleen Loewen, Quality Assurance Manager, Lancaster Laboratories
                      2425 New Holland Pike, Lancaster, Pennsylvania 17605-2425,
                              Phone (717) 656-2300, Fax (717) 656-0450
                           Louis Blume, Quality Assurance Manager, GLNPO
                            77 W. Jackson Blvd. (G-17J), Chicago, IL 60604
                              Phone (312) 353-2317, Fax (312) 353-2018

-------
Raisin River QAPP, Final, October 4, 2001

                                  QAPP Distribution List

Scott Cieniawski, Project Officer
Lou Blume, QA Manager
USEPA- GLNPO
77 West Jackson Blvd (G-17J)
Chicago, Illinois 60604-3590
phone: 312-353-9184

Paul Baxter, Project Coordinator
U.S. Army Corps of Engineers - Detroit District
477 Michigan Avenue
Detroit, MI 48226
phone: 313-226-7555

Roger Jones
Michigan DEQ - SWQD
P.O Box 30273
Lansing, MI 48909
phone: 517-373-4704

Patricia Novak, Project Manager
Lakeshore Engineering Services, Inc.
19215 W. Eight Mile Road, Detroit, Michigan 48219
phone: (313)535-7882

Ann Preston, Project Manager
Trace Analytical Laboratories, Inc.
2241 Black Creek Rd., Muskegon, Michigan 49444-2673
phone: (231) 773-5998, Ex. 224

Al Mozol, Operations Manager
AScI Corporation
4444 Airpark Blvd., Duluth, Minnesota 55811
phone: (218) 722-4040

Kathleen Loewen, Quality Assurance Manager
Lancaster Laboratories
2425 New Holland Pike, Lancaster, Pennsylvania 17605-2425
phone: (717) 656-2300

-------
  Raisin River QAPP, Final, October 4, 2001

TABLE OF CONTENTS

  QAPP Signature Page                                                           1
  QAPP Distribution List                                                        2
  Table of Contents                                                             3

  1.      Summary                                                               5
          1.1    Purpose                                                        5
          1.2    Background                                                     5
          1.3    Project Organization                                           5
  2.      Project Description                                                  10
          2.1    Data Uses and Expected Measurements                           10
          2.2    Criteria and Objectives                                       12
                 2.2.1   Sediment Chemistry                                    12
                 2.2.2   Fish Tissue and L. variegatus Tissue Chemistry        13
          2.3    Special Personnel, Training, and Equipment Requirements       13
          2.4    Project Schedule                                              14
  3.      Sampling Plan                                                        14
          3.1    Sampling Network Design and Rationale                         14
          3.2    Definition of Sample Types                                    16
          3.3    Type and Number of Samples                                    17
          3.4    Field Data Collection                                         17
  4.      Sample Collection and Handling                                       18
          4.1    Sample Collection                                             18
                 4.1.1   Sediment Cores                                        18
                 4.1.2   Sediment Ponars                                       18
                 4.1.3   Cage Fish Samples                                     19
          4.2    Sample Handling                                               19
                 4.2.1   Sample Containers                                     19
                 4.2.2   Sample Labeling                                       20
                 4.2.3   Shipping and Chain-of-Custody                         21
                 4.2.4   Receipt of Samples                                    22
  5.      Laboratory Analysis                                                  23
          5.1    Analysis Methods                                              23
          5.2    Data Quality Objectives                                       24
                 5.2.1   Method Detection Limits and Level of Quantitation     24
                 5.2.2   Bias                                                  26
                 5.2.3   Precision                                             26
                 5.2.4   Accuracy                                              28
                 5.2.5   Representativeness                                    28
                 5.2.6   Comparability                                         28
                 5.2.7   Completeness                                          29
  6.      Documentation and Records                                            29
          6.1    Field Documentation                                           29
          6.2    Laboratory Reports                                            30
  7.      Special Training Requirements                                        31
  8.      Quality Control Requirements                                         31
          8.1    Instrument/Equipment Testing, Inspection, and Maintenance
                 Requirements                                                  32
          8.2    Instrument Calibration and Frequency                          32
          8.3    Inspection/Acceptance Requirements for Supplies and
                 Consumables                                                   33
          8.4    Data Management                                               34
          8.5    Data Acquisition Requirements (Non-Direct)                    36

-------
Raisin River QAPP, Final, October 4, 2001

Table of Contents

9.      Assessment and Oversight                                               36
        9.1     Assessment and Response Actions                                36
        9.2     Reports to Management                                          39
10.     Data Validation and Usability                                          40
11.     References                                                             41

List of Figures
Figure 1. Organizational Chart                                                  7
Figure 2. Sampling Locations                                                   15
Figure 3. Example Sample Label                                                 21
List of Tables
Table 1.       Screening Values for Contaminants of Concern  ....       13
Table 2.       Required Reporting Limits for Tissue Chemistry to Allow Comparison to
                     Historical and Future Data Sets .       .       .       .       .       13
Table 3.       Tentative Project Schedule     .       .       .       .       .       .       14
Table 4.       Summary of Data and Analyses at Sampling Locations  .       .       .       16
Table 5.       Summary of Type and Number of Samples to be Collected      .       .       17
Table 6.       Sample Container and Preservation Requirements.      ...       20
Table 7.       Addresses for Shipment of Samples            ....       22
Table 8.       Laboratory Analysis and Preparation Methods.  ....       23
Table 9.       Target Detection Limits for Sediment Chemistry        ...       25
Table 10.      Target Detection Limits for Tissue Chemistry   ....       25
Table 11.      Quality Control Limits .......       27

Appendices
Appendix A:   Sampling SOPs
Appendix B:   Benthic Community Assessment SOP
Appendix C:   Caged Fish Sampling SOP (GLEAS Procedures #62 and #31)
Appendix D:   Field Measurement SOP
Appendix E:   Field Sample Log
Appendix F:    USACE Survey Markers
Appendix G:   Example Chain of Custody Form
Appendix H:   Cooler Receipt Form
Appendix I:    Laboratory Method Detection Limits, Method Reporting Limits, & SOP Numbers
Appendix J:    Minimum QA/QC Checklist for Post-Sampling Data Evaluation
Appendix K:   Summary Tables for Sample Collection and Analysis

-------
  Raisin River QAPP, Final, October 4, 2001                                                            5

1.      SUMMARY

  1.1    Purpose

  This document outlines a sampling plan to determine the extent of recovery of the aquatic
  system at the site of the 1997 River Raisin - Ford Outfall Site sediment remediation project.
  The data collected during this study will be compared with data collected prior to the
  remediation project.  The data will also be compared with future data to document the progress
  of the system's recovery over time. Proposed testing for this round of sampling includes:

            1.  Caged Fish Testing
            2.  Sediment Bioaccumulation Testing (with Lumbriculus variegatus),
            3.  Sediment Chemistry (PCBs, metals, SVOCs, SEM-AVS, TOC, nutrients)
            4.  Whole Sediment Toxicity Testing (with Hyalella azteca & Chironomus tentans),
            5.  Benthic Community Analysis (specific to lowest taxonomic level below family)

  Primary Objective:  Collect sufficient data to ascertain the current state of the sediment
  environment in the vicinity of the Raisin River, Ford Outfall, Sediment Removal Project

  Secondary Objectives: 1.  Collect sufficient data to compare current sediment chemistry and
  caged fish testing data to historical data collected at the site.

  2. Collect adequate sediment data (chemistry, toxicity, tissue chemistry, etc.) to provide an
  overall summary of sediment quality conditions within the Raisin River AOC.

  1.2    Background

  Site Location
  The Raisin River flows southeast through the southeast corner of Michigan's Lower Peninsula,
  discharging into Lake Erie at Monroe Harbor near the city of Monroe, Michigan. The Area of
  Concern (AOC) is defined as "the lower 2.6 miles of the River Raisin, downstream from Dam
  Number 6 at Winchester Bridge in the city of Monroe, extending [downstream] one-half mile
  into Lake Erie,  and including Plum Creek which discharges into Lake Erie through a canal."
  (USEPA,  1994)

  Once forested with mature hardwood  stands, the AOC now consists  of mostly cleared land that is
  urban, suburban, and industrial in nature. Industries within the AOC include automotive, steel
  and paper manufacturers. Several landfills also border the river within the AOC. Contaminants
  of concern in the AOC include: polychlorinated biphenyls (PCBs), chromium, copper, zinc, and
  oil and grease.  (USEPA, 1994).

  Remedial Activities
  In 1997 the U.S. Environmental Protection Agency (USEPA) completed a Superfund removal
  project at the Ford Monroe contaminated sediment site.  The cleanup resulted in the remediation
  of approximately 27,000 cubic yards of PCB-contaminated sediments (maximum concentration
  49,000 ppm) by the Ford Motor Company from a large depositional  area near an old 48-inch
  outfall from the Ford Monroe Stamping Plant.  Remedial work also included the removal of

-------
Raisin River QAPP, Final, October 4, 2001                                                            6

contaminated in-plant sewer material to eliminate the potential for on-going releases.  The
remedial work at the site was completed in October 1997.  (USEPA, 1998).

Historical Sampling
A significant amount of sediment chemistry and caged fish testing data is available to document
pre-remedial conditions at the site.  A limited amount of post-remediation data is also available.
If available, the following historical data sets will be evaluated as part of this project:

          1.  1988 Michigan DNR caged fish testing,
          2.  April 1991 sediment sampling survey by Michigan State University,
          3.  November 1991 sediment sampling survey by Michigan DNR's Surface Water
              Quality Division,
          4.  1991 Michigan DEQ caged fish testing,
          5.  October 1992 sediment sampling survey by Region 5 USEPA,
          6.  1995 Michigan DEQ sediment sampling survey,
          7.  1997 Michigan DEQ sediment sampling survey
          8.  1998 Michigan DEQ caged fish testing,
          9.  1998 Michigan DEQ sediment sampling survey, and
          10.  2000 Army Corps of Engineers sediment sampling survey.

Selection of Site for Post-Remediation Testing
The Great Lakes National Program Office (GLNPO) has been interested in collecting data on:
(1) the ability of environmental dredging projects to reach remedial  targets, (2) levels of residual
contamination  left behind, and (3) recovery of the aquatic system after completion of
environmental  dredging projects. However, in order to address these questions,  significant pre-
remedial data must exist for a site, a major sediment remediation project must have been
completed that addresses a significant portion of the contamination present at the site, and, based
on information gained  on post-remedial assessments at the Black River Ohio site, the aquatic
system must be given a minimum of three years to recover before post-remedial work is
performed.

GLNPO believes that the Raisin River is one of the few sites within the Great Lakes Basin that
meets all three of these criteria. It has now been approximately four years since sediment
remediation and source control work was completed at the Ford Monroe sediment site. The Ford
Monroe removal  represents a significant portion of the PCB contamination present at the site,
and significant pre-remedial sediment and caged fish data exists for the AOC.
1.3    Project Organization

Figure 1 summarizes the project organization.  A description of the duties of each individual is
provided below.

-------
Raisin River QAPP, Final, October 4, 2001

Figure 1. Organizational Chart

   PROJECT MANAGER/FIELD TEAM LEADER:    Scott Cieniawski, USEPA-GLNPO, 312-353-9184
   ENVIRONMENTAL SCIENTIST:              Demaree Collier, USEPA-GLNPO, 312-886-0214
   GLNPO QA MANAGER:                     Louis Blume, USEPA-GLNPO, 312-353-2317
   PROJECT MANAGER:                      Paul Baxter, U.S. Army Corps of Engineers - Detroit District, 313-226-7555
   SITE COORDINATOR:                     Roger Jones, Michigan DEQ, 517-373-4704
   ANALYTICAL SERVICES COORDINATOR:      Patricia Novak, Lakeshore Engineering Services, 313-535-7882
   CAGED FISH FIELD SUPPORT:             Mike Alexander, Michigan DEQ, 517-335-4189
   CAGED FISH FIELD SUPPORT:             Bob Day, Michigan DEQ, 517-335-3314
   BENTHIC COMMUNITY ANALYSIS:           AScI, 218-722-4040
   BIOACCUMULATION TESTING:              Al Mozol/Kathleen Loewen, AScI/Lancaster Labs, 218-722-4040/717-656-2300
   FISH TISSUE ANALYSIS:                 Kathleen Loewen, Lancaster Labs, 717-656-2300
   WHOLE SEDIMENT TOXICITY TESTING:      AScI, 218-722-4040
   SEDIMENT/WATER CHEMISTRY ANALYSIS:    Ann Preston, Trace Analytical Laboratory, 231-773-5998

USEPA-GLNPO
USEPA-GLNPO is the principal investigating agency for this sediment survey. GLNPO is
responsible for coordination and development of the sampling plan and QAPP, and is the
principal client for the final data. USEPA-GLNPO staff associated with this project include:
Person:
Scott Cieniawski
Project Coordinator/Field Team Manager
77 W.Jackson Blvd. (G-17 J)
Chicago, IL 60604
phone: 312-353-9184
cieniawski.scott@epa.gov

Demaree Collier
Environmental Scientist
77 W. Jackson Blvd. (G-17J)
Chicago, IL 60604
phone: 312-886-0214
collier.demaree@epa.gov

Louis Blume
GLNPO QA Manager
77 W. Jackson Blvd. (G-17J)
Chicago, IL 60604
phone: 312-353-2317
blume.louis@epa.gov
                        Responsibilities:
                        Prepare Sampling Plan
                        Prepare QAPP
                        Oversee Sample Collection
                        Field Team Member
                        Analyze data and write report
                        Perform project management tasks

                        Review/Analyze Data
                        Field Team Member
                        Review/Approve QAPP

-------
Raisin River QAPP, Final, October 4, 2001
U.S. Army Corps of Engineers - Detroit District/USACE HTRW Center of Expertise
The U.S. Army Corps of Engineers will provide laboratory contracting support and an initial
QA/QC review of the project data.  The USACE representatives will be responsible for
contacting USEPA-GLNPO about any concerns regarding the data received from the
laboratories, and for advising USEPA-GLNPO of any concerns expressed by the
laboratories. USACE individuals involved in this project include:
Person:
Paul Baxter
Project Coordinator
313-226-7555
Paul.R.Baxter@lre02.usace.army.mil

Cheryl Groenjes
Chemist
402-697-2568
Cheryl.A.Groenjes@nwd02.usace.army.mil
Responsibilities:
Contract out analytical work
Perform contract management activities
Review QAPP
Perform QA/QC review of the analytical report
Michigan DEQ
The Michigan DEQ will provide coordination and field support to this project. Field support will
be provided during sediment sampling and MDEQ will conduct the caged fish sampling. MDEQ
will also provide historical data, results, and information on sampling and analysis methods used
during historical studies.
Person:
Roger Jones
Site Coordinator - MDEQ - SWQD
P.O. Box 30273
Lansing, MI 48909-7973
517.373.4704
jonesrjj@state.mi.us

Mike Alexander
P.O. Box 30273
Lansing, MI 48909-7973
alexandm@state.mi.us

Bob Day
P.O. Box 30273
Lansing, MI 48909-7973
517-335-3314
Responsibilities:
Coordinate MDEQ Support to Project
Provide information on historical sampling results
and analytical methods
Review final report
Review QAPP
Conduct Caged Fish Sampling
Provide SOP for caged fish sample collection
Review QAPP
Provide SOP for processing caged fish samples
Provide Analytical SOP for caged fish samples
Review QAPP
Laboratories
Laboratory analyses for this project will be performed by several different laboratories.  Lakeshore
Engineering Services, Inc. will coordinate analytical services from a variety of separate
laboratories under a contract agreement with the USACE. Lakeshore Engineering Services, Inc.
will be responsible for sub-contracting for sample analysis. Both the contract laboratory and the
analyzing laboratories will have sample analysis and review responsibilities on this project. Each
laboratory will have their own provisions for conducting an internal QA/QC review of the data

-------
Raisin River QAPP, Final, October 4, 2001
before it is released to the U.S. Army Corps of Engineers. The laboratory contract supervisors
listed below will contact the USACE project coordinator with any data concerns.

Several different laboratories will be utilized to perform all of the testing for this project.  The
following is a summary of the analytical work to be performed by each laboratory.
Analyses                                        Laboratory
Whole Sediment Toxicity Tests                   AScI
Benthic Community Assessment                    AScI
Bioaccumulation Testing                         AScI/Lancaster Laboratories
Caged Fish Tissue Analysis                      Lancaster Laboratories
Geotechnical Analysis                           Coleman Engineering Company
All Other Chemical and Physical Analyses        Trace Analytical Environmental Laboratories/
                                                Lancaster Laboratories
Written QA/QC reports will be filed by the analytical laboratories each time data is submitted to
the USACE.  Corrective actions will be reported to the USACE project coordinator along with the
QA/QC report (see Section 9). Any of the laboratories may be contacted directly by USEPA or
USACE personnel to discuss QA concerns. Lakeshore Engineering Services will act as laboratory
coordinator on this project and all correspondence from the laboratories should be coordinated
through Lakeshore Engineering Services. Trace Analytical Laboratories and Lancaster Laboratory
will perform all chemical analyses and AScI Corporation will perform the whole sediment toxicity
testing. Coleman Engineering Co. will perform geotechnical testing (grain size distribution).
Responsibilities of each lab and the laboratory coordinator are provided below:
Person:                                    Responsibilities:
Patricia Novak                             Review final analytical report
Lakeshore Engineering Services             Ensure Sub-Contract Laboratory Resources are
313-535-7882                               available on an as-required basis
pattin@lakeshoreeng.com

Aziz Khandker                              Review final analytical reports
Lakeshore Engineering Services             Review Quality Assurance Plan
313-535-7882                               Perform independent technical review of final
azizk@lakeshoreeng.com                     analytical reports

Ann Preston                                Coordinate/Perform chemical and physical analyses
Trace Analytical Laboratories              Ensure Laboratory resources are available on an
231-773-5998                               as-required basis
                                           Review final analytical reports
                                           Supply required sample bottles, required preservatives,
                                           and coolers (including temperature blanks)

-------
Raisin River QAPP, Final, October 4, 2001                                                            10

Al Mozol                                 Coordinate/Perform Whole Sediment Toxicity Tests
AScI Corporation                          Ensure Laboratory resources are available on an
218-722-4040                             as-required basis
                                         Review final analytical reports
                                         Supply Required sample bottles, required preservatives,
                                         and coolers (including temperature blanks)
                                         Perform Lumbriculus exposure to test sediments
                                         Ship Lumbriculus tissue samples to Lancaster
                                         Laboratory for analysis

Kathleen Loewen                         Coordinate/Perform chemical analyses
Lancaster Laboratory                       Ensure Laboratory resources are available on an
717-656-2300                             as-required basis
                                         Review final analytical reports
                                         Supply required sample bottles, required preservatives,
                                         and coolers (including temperature blanks)
                                         Submit a QA/QC Case Narrative

Jim Strigel                                Coordinate/Perform geotechnical analysis
Coleman Engineering                       Ensure Laboratory resources are available on an
906-774-3440                             as-required basis
2.     Project Description

2.1    Data Uses and Expected Measurements

GLNPO proposes a full-scale post-remediation assessment at the Ford Monroe site to augment
the data collected during the 1998 assessment.  The full-scale post-remediation assessment
would incorporate the collection of sediment cores and surficial grabs at and downstream of the
remedial site, and caged fish testing at and downstream of the remedial site.  Work would be
coordinated with the Michigan DEQ to ensure comparability with the existing pre-remediation
data.  The proposed work components are summarized below.

Determination of Existing Pre-Remedial Data Availability
The first step of this project will be to identify sources and availability of pre-remediation data
for this site. The USEPA Superfund program, the Michigan DEQ, and the U.S. Army Corps of
Engineers (USACE) have all collected pre-remedial data at or near this site.  Both sediment data and caged
fish data exist for documenting pre-remedial conditions at this site.

GLNPO will coordinate with each of these agencies to determine the quality and availability of
existing data.

Sediment Chemistry Sampling
Sediment chemistry sampling will consist of the collection of a sediment core or a surficial
sediment grab sample at approximately 20 locations.  Three (3) of the locations will be upstream
of the remediated area, twelve (12) of the locations will be in the remediated area, and five (5) of
the locations will be downstream of the remediated area.   These locations were selected to meet
the priority goal of determining the success of the remediation effort, and the secondary goal of
providing data on the overall state of sediment chemistry within the entire AOC.

-------
Raisin River QAPP, Final, October 4, 2001                                                           11


The 12 locations within the remediated area were selected utilizing a systematic-aligned
sampling algorithm in order to minimize the probability of missing a major sediment deposit
(greater than 25 feet in diameter) within the remediated area. Sampling locations outside the
remediated area were selected by identifying sediment deposits within the AOC, assigning the
deposits to an "upstream" or "downstream" category, and then randomly selecting five locations
from "downstream" deposits and three locations from "upstream" deposits.  More locations were
selected downstream of the remediated area, since historical sampling indicates that the
Ford-Monroe Outfall was the major source of contamination within the AOC. A sketch of these
two selection schemes is provided below.
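
The following sketch is illustrative only: the remediated-area footprint, grid dimensions, and
candidate deposit lists are hypothetical, and the actual locations were selected by the project
team.

    # Illustrative sketch of the two location-selection schemes. The footprint,
    # grid dimensions, and deposit identifiers are hypothetical.
    import random

    # Systematic-aligned grid: 12 locations (3 x 4) over a rectangular footprint,
    # laid out to reduce the chance of missing a deposit more than 25 ft across.
    x0, y0, width, length = 0.0, 0.0, 100.0, 300.0     # feet (hypothetical footprint)
    cols, rows = 3, 4
    grid_points = [(x0 + (i + 0.5) * width / cols, y0 + (j + 0.5) * length / rows)
                   for j in range(rows) for i in range(cols)]

    # Random selection outside the remediated area: 5 downstream and 3 upstream deposits.
    downstream_deposits = [f"DS-{k:02d}" for k in range(1, 11)]    # hypothetical IDs
    upstream_deposits = [f"US-{k:02d}" for k in range(1, 7)]

    rng = random.Random(2001)
    downstream_sites = sorted(rng.sample(downstream_deposits, 5))
    upstream_sites = sorted(rng.sample(upstream_deposits, 3))

    print(f"{len(grid_points)} systematic points in the remediated area")
    print("Downstream sites:", downstream_sites)
    print("Upstream sites:", upstream_sites)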

If insufficient sediment is present for the collection of a grab sample and/or core within the
remediated area, no sample will be collected at the site, and the site will be identified as "No
Sediment Present". If insufficient sediment is present for the collection of samples outside of the
remediated area, the sampling crew will probe the entire sediment deposit until an adequate
sampling location is identified.

All sediment cores will be sectioned into sub-samples of 0"-6", 6"-18",  18"-54", and 54"-90".
GLNPO anticipates collection of an average of two (2) samples per site.  All sediment samples
collected will be analyzed for PCBs, using the PCB Arochlor (to be consistent with the historical
data sets) and PCB congener-specific analysis as provided in USEPA SW-846 Method 8082;
heavy metals  (USEPA Methods 6020 and 6010B); and mercury (USEPA Method 7471 A).

Whole Sediment Toxicity Testing
Surficial grab samples will be collected from 8  locations using a ponar dredge sampler. The
sampling locations will be positioned as follows: two  (2) locations upstream of the remediated
area, four (4)  locations within the remediated area, and two (2) locations downstream of the
remediated area.  The rationale and sample location algorithms for the toxicity sampling locations
are similar to those described above under "Sediment Chemistry Sampling".  Fewer sites were
selected based on budget constraints.

These surficial grab  samples will be used to conduct Hyalella azteca 28-day and Chironomus
tentans  10-day whole sediment toxicity tests according to USEPA Test Methods 100.4 and
100.2, respectively, as detailed in Methods for Measuring the Toxicity and Bioaccumulation of
Sediment-Associated Contaminants in Freshwater Invertebrates (USEPA, 2000).  Sediment
samples will also be analyzed for PCB  concentrations, simultaneously extracted metals-acid
volatile sulfide (SEM-AVS), total metals, grain size and Total Organic Carbon (TOC).

Benthic Community Structure Evaluations
Surficial grab samples will be collected from 8  locations using a ponar dredge sampler. The
sampling locations will be positioned as follows: two  (2) locations upstream of the remediated
area, four (4)  locations within the remediated area, and two (2) locations downstream of the
remediated area. Testing  manuals recommend  co-locating benthic community evaluations with
whole sediment toxicity testing locations to allow for  examination of correlations between the
data.

These surficial grab  samples will be used to conduct benthic macroinvertebrate community
assessments. Organisms contained in the sediment samples will be sorted and enumerated into
the following orders or families: Oligochaeta, Chironomidae, Bivalvia, Gastropoda,
Ephemeroptera, Odonata, Plecoptera, Hemiptera, Megaloptera, Trichoptera, Coleoptera,

-------
Raisin River QAPP, Final, October 4, 2001                                                           12

Diptera (other than Chironomidae), Hirudinea, and Amphipoda using published taxonomic keys
(e.g. Wiederholm [1983]; Merritt and Cummins [1984], Pennak [1989], Thorp and Covich
[1991]).  Samples will be used to estimate macroinvertebrate numerical abundance (individuals
per square meter), species composition, and taxa richness. Identification will be to the lowest
taxonomic level below family.
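
For illustration, the sketch below shows how a sorted and enumerated sample could be
summarized as numerical abundance and taxa richness; the organism counts and the sampled
grab area are hypothetical.

    # Sketch: summarize one benthic sample as numerical abundance (individuals per
    # square meter) and taxa richness. Counts and sampled area are hypothetical.
    counts = {
        "Oligochaeta": 180,
        "Chironomidae": 95,
        "Bivalvia": 12,
        "Gastropoda": 4,
    }

    area_m2 = 0.052                      # sampled area of the grab (hypothetical value)

    total_individuals = sum(counts.values())
    abundance_per_m2 = total_individuals / area_m2
    taxa_richness = sum(1 for n in counts.values() if n > 0)

    print(f"Abundance: {abundance_per_m2:.0f} individuals per square meter")
    print(f"Taxa richness: {taxa_richness}")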

Sediment Bioaccumulation Tests
Surficial grab samples will be collected from 8 locations using a ponar dredge sampler. The
sampling locations will be positioned as follows: two (2) locations upstream of the remediated
area, four (4) locations within the remediated area, and two (2) locations downstream of the
remediated area. Testing manuals recommend co-locating bioaccumulation evaluations with
whole sediment toxicity testing locations to allow for examination of correlations between the
data.

These surficial grab samples will be used to conduct Lumbriculus variegatus 28-day
bioaccumulation testing for PCB congeners.  Tests will be conducted according to USEPA Test
Method 100.3 as detailed in Methods for Measuring the Toxicity and Bioaccumulation of
Sediment-Associated Contaminants in Freshwater Invertebrates (USEPA, 2000).

Caged Fish Testing
Caged fish testing, using young or one-year-old catfish (4"-8" long), will take place at
approximately four (4) locations, plus one replicate at one of the four locations within the AOC,
as close as possible to MDEQ's historical caged fish sampling locations. Duration of the caged
fish testing will be 28 days. At the initial startup of the sampling, four (4) day-0 fish will be
sampled for the control group.  After the end of the 28 days, four (4) fish from each sampling
location plus another four (4) fish from the replicate cage will be sampled (five cages x four
fish = 20 fish, plus the four day-0 fish), making a total of twenty-four (24) samples collected
for the caged fish testing.  MDEQ will supply the cages for the study. The caged fish will be
analyzed for total PCB levels using the method outlined in Bulletin of Environmental
Contamination and Toxicology,  1986, Vol. 37, pages 1-9  (copy in Appendix I)

Prior to  setting each cage, one sediment and one water column sample will be collected at each
site, including the replicate location. Prior to pulling each trap, a water sample will be collected
at each site. The sediment samples will be collected using a ponar grab sampler and water
samples will be collected using a Van Dorn sampler. All  water and sediment samples collected
will be analyzed for PCB concentrations using the PCB Arochlor and congener-specific analysis
(EPA Method 8082).

The Michigan DEQ has the most experience conducting caged fish testing at this site.  The
Michigan DEQ will assist in coordinating the caged fish testing and corresponding sediment
and water sampling at the site. The USACE contract laboratory will perform the laboratory
analysis and costs will be covered using the existing Inter-Agency Agreement.

Care should be taken to ensure that caged fish testing is performed at approximately the same
time of year as historical caged fish testing, which is between August and September. This
testing should also be scheduled to avoid any significant dredging projects within the Raisin
River watershed, though none are currently scheduled for 2002.

-------
Raisin River QAPP, Final, October 4, 2001
                                                                                       13
Bathymetric Survey
A sediment probe will be used to estimate the depth of sediments present at the 20 sediment
coring locations.  Sediment depth will be estimated to the nearest 0.25 feet.
2.2    Criteria and Objectives

2.2.1   Sediment Chemistry

Sediment Chemistry data will be compared to existing sediment quality guidelines (SQGs) such as
those of MacDonald et al. (2000) and Persaud et al. (1993). Table 1 provides the required
reporting limits necessary to allow for sediment chemistry results to be compared directly to
these screening guidelines.
Table 1.  Screening Values for Contaminants of Concern

       Analyte                       Unit          Required Reporting Limits
       Arsenic                       mg/kg DW*     6.0
       Cadmium                       mg/kg DW*     0.6
       Chromium                      mg/kg DW*     26.0
       Copper                        mg/kg DW*     16.0
       Lead                          mg/kg DW*     31.0
       Mercury                       mg/kg DW*     0.2
       Nickel                        mg/kg DW*     16.0
       Zinc                          mg/kg DW*     120.0
       Total PCBs (as Aroclors)      mg/kg DW*     0.676
       PCB Congeners                 mg/kg DW*     0.01

       *      DW = Dry Weight
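
To illustrate how the Table 1 screening values can be applied, the short Python sketch below flags
analytes in a sediment result that exceed their screening values. The dictionary mirrors Table 1; the
example result and the function name are hypothetical and not part of this QAPP.

    # Illustrative sketch (not part of the QAPP): flag sediment results that
    # exceed the Table 1 screening values. The example result is hypothetical.
    SCREENING_VALUES_MG_KG_DW = {
        "Arsenic": 6.0, "Cadmium": 0.6, "Chromium": 26.0, "Copper": 16.0,
        "Lead": 31.0, "Mercury": 0.2, "Nickel": 16.0, "Zinc": 120.0,
        "Total PCBs (as Aroclors)": 0.676, "PCB Congeners": 0.01,
    }

    def exceedances(result_mg_kg_dw):
        """Return analytes whose reported concentration exceeds the screening value."""
        return {
            analyte: conc
            for analyte, conc in result_mg_kg_dw.items()
            if analyte in SCREENING_VALUES_MG_KG_DW
            and conc > SCREENING_VALUES_MG_KG_DW[analyte]
        }

    # Example: a hypothetical ponar sample result (mg/kg dry weight)
    print(exceedances({"Lead": 45.0, "Mercury": 0.1, "Total PCBs (as Aroclors)": 1.2}))
    # -> {'Lead': 45.0, 'Total PCBs (as Aroclors)': 1.2}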

2.2.2   Fish Tissue Chemistry and L. variegatus Tissue Chemistry

Fish tissue and L.  variegatus tissue chemistry analysis must be of sufficient quality to allow for
comparison to historical and future data collected at the site. Table 2 provides the required
reporting limits necessary to allow for fish tissue chemistry results to be compared directly to
historical and future data.

Table 2.      Required Reporting Limits for Tissue Chemistry to Allow Comparison to
              Historical and Future Data Sets

       Analyte                       Level of Chlorination      Units      Required Reporting Limits
       Total PCBs (as Aroclors)                                 mg/kg      0.30
       PCB congeners                 Mono- to tri-chloro        mg/kg      0.02
                                     Tetra- to Hexa-chloro      mg/kg      0.03
                                     Hepta- to Octa-chloro      mg/kg      0.04
                                     Nona- to Deca-chloro       mg/kg      0.05
2.3    Special Personnel, Training, and Equipment Requirements

Sediment Sampling
Sediment sampling will require the use of the USEPA's Research Vessel (R/V) Mudpuppy or an
equivalent vessel. Additional equipment requirements for collecting sediment core and sediment
ponar samples are contained in Appendix A.
Benthic Community Assessment
Collection and filtering of sediment samples for benthic community assessments requires the use
of an elutriator fitted with a 500 micron mesh filter (equipment available on the R/V Mudpuppy).
Additionally, a 0.500 M formaldehyde solution is required for preserving the benthic community
samples. A detailed SOP is provided in Appendix B.

Caged Fish Sampling

Collection of caged fish samples will follow the SOP provided in Appendix C in order to be
consistent with historical sampling performed by the MDEQ.  Training and equipment
requirements are provided in Appendix C.
2.4    Project Schedule
A tentative project schedule is provided in Table 3. All personnel shown in Figure 1 should be
contacted regarding significant schedule changes.

Table 3.  Tentative Project Schedule

       Task                                                          Completion Date
       QAPP Development and Sign-Off                                 August 31, 2001
       Sediment Sampling (chemistry, toxicity, bioaccumulation,      October 2001
              and benthic community)
       Completion of Sediment Analysis and Testing                   January 2002
       Analytical Report Due to USACE                                March 2002
       Sediment Sampling Summary Report and Analytical Data          April 2002
              Due to USEPA-GLNPO
       Caged Fish Sampling                                           August/September 2002
       Sediment and Water Sampling at Caged Fish Sites               August/September 2002
       Completion of Caged Fish, Sediment, and Water Analytical      October/November 2002
              Analysis
       Caged Fish Sampling Summary and Analytical Data Report        December 2002
              Due to USEPA-GLNPO

 3.    Sampling Plan

3.1   Sampling Network Design and Rationale

The purpose of this sampling survey is to determine the quality of the sediments and health of
the aquatic ecosystem in the vicinity of a previously completed sediment remediation project as
well  as to provide an overall summary of sediment quality within the Raisin River AOC. In
order to obtain a full picture of the health of the aquatic ecosystem, data on a large number of
ecosystem metrics need to be collected.  Sediment chemistry, sediment toxicity, in-situ
bioaccumulation, and ex-situ bioaccumulation samples will be collected along with water and
benthic community samples. These samples will allow us to determine the levels of
contaminants present in  the water, the direct impact of sediments on the benthic community, and
the potential for bioaccumulation of contaminants through the aquatic food chain.

Figure 2 presents an overview of the assessment area, a delineation of the remediated area, and
approximate locations for collection of sediment samples. Approximate latitude/longitude of the
sampling points and specific analytical tests to be performed on each sample are provided in
Appendix K.

The sampling locations are designed to provide focused coverage of the removal area as well as
some general coverage of areas upstream and  downstream of the removal area. In-depth
rationale and algorithms used for selecting sampling locations are provided in Section 2.1. Table
4 summarizes the types of data and analyses to be collected at each type of sampling location.


Figure 2.  Sampling Locations

           [Map of the River Raisin Sampling Plan showing Upstream Points (3), Downstream
           Points (5), Dredged Area Points (12), and the outline of the Dredged Area.]


Table 4. Summary of Data and Analyses at Sampling Locations

       Core Samples (20 Locations)
              Sediment Chemistry
              Sediment Depth
              Water Depth (corrected to LWD)
              Latitude/Longitude

       Ponar Samples (8 Locations)
              Sediment Chemistry
              Whole Sediment Toxicity
              Benthic Community
              Water Depth (corrected to LWD)
              Latitude/Longitude

       Caged Fish Sample (4 Locations)
              Sediment Chemistry
              Water Chemistry
              Fish Tissue Chemistry
              Latitude/Longitude

       Bioaccumulation Samples (8 Locations)
              Tissue Chemistry
              Latitude/Longitude
3.2    Definition of Sample Types

Four types of sediment samples will be collected during this survey: Routine Field Samples
(RFS), Field Replicates (FR), Field Duplicates (FD), and Matrix Spikes/Matrix Spike Duplicates
(MS/MSD).  Each sample type is described below.

Routine Field Samples (RFS): Prepared by collecting a single ponar grab sample or section of a
sediment core, homogenizing the sediments collected, and filling all required sample jars.
Routine field samples will be collected at twenty (20) locations.  Locations of the RFS are
indicated in Appendix K.
Field Duplicates (FD):  Prepared by filling a second set of sample jars from a single ponar grab or
section of a sediment core. FDs will be collected at three (3) sediment core locations and one (1)
ponar sampling location. This is approximately equivalent to a ratio of FDs to RFSs of 1 to 10
(10%).  Locations of the FDs are indicated in Appendix K.

Field Replicates (FR): Prepared by collecting a second, separate ponar grab or sediment core
sample, homogenizing the material separately from the RFS, and filling the required sample
jars. FRs will be collected at two (2) sediment core locations, one (1) ponar sampling location,
and one (1) caged fish location. This is approximately equivalent to a ratio of FRs to RFSs of 1
to 10 (10%). Locations of the FRs are indicated in Appendix K.

Matrix Spike/Matrix Spike Duplicates (MS/MSD):  The MS/MSD samples will be prepared from
the same homogenized sample as the RFS by filling the required sample jars.  One MS/MSD
sample will be collected from each of six (6) sediment ponar
locations.  This is approximately equivalent to a ratio of 1 to 10 (10%) since a total of at least
sixty samples will be collected for this project. Locations of MS/MSDs are indicated in
Appendix K.

The tables in Appendix  K summarize the types of samples to be collected and analyzed for each
sampling location.


3.3    Type and Number of Samples
Table 5 summarizes the type and number of samples to be collected during this sampling event.
The estimated number of samples includes all RFS, FD, FR, and MS/MSD samples.

Table 5.      Summary of Type and Number of Samples to be Collected

       Sample Type                       Estimated    Sample     Analysis Required
                                         Number of    Matrix
                                         Samples
       Sediment Chemistry from Cores     48           Sediment   PCBs (Aroclors and Congeners), Oil & Grease,
                                                                 TOC, % Moisture
       Sediment Chemistry from Ponars    11           Sediment   PCBs (Aroclors and Congeners), SVOCs (PAHs),
                                                                 Metals (As, Ba, Cd, Cr, Cu, Fe, Pb, Mn, Hg, Ni,
                                                                 Se, Ag, and Zn), TOC, SEM-AVS, Grain Size,
                                                                 Nutrients (Total Phosphorus, Ammonia-Nitrogen,
                                                                 TKN), Moisture Content
       Sediment Toxicity                 11           Sediment   28-Day H. azteca Survival and Growth (weight
                                                                 and length), 10-day C. tentans Survival and
                                                                 Growth (weight)
       Benthic Community                 11           Sediment   Identified to Family Level
       Caged Fish Tissue                 24           Tissue     PCBs (Aroclors and Congeners)
       Sediment from Caged Fish Sites    6            Sediment   PCBs (Aroclors and Congeners), Metals (As, Ba,
                                                                 Cd, Cr, Cu, Fe, Pb, Mn, Hg, Ni, Se, Ag, and Zn),
                                                                 TOC, Moisture Content
       Water from Caged Fish Sites       12           Water      PCBs (Aroclors and Congeners), Metals (As, Ba,
                                                                 Cd, Cr, Cu, Fe, Pb, Mn, Hg, Ni, Se, Ag, and Zn),
                                                                 TOC
       Sediment Bioaccumulation          11           Sediment   PCBs (Congeners)
All of the data listed in Table 5 are considered critical to the success of this assessment project.
3.4    Field Data Collection

SOPs for all field measurements are contained in Appendix D.  Two pieces of field data will be
collected that are critical to the data quality objectives for this project.

       Latitude/Longitude Location: This data is critical for determining where sediment
       samples were collected.  The two (2) Differential Global Positioning Systems (DGPS)
       onboard the R/V Mudpuppy are both capable of ascertaining horizontal locations to
       within 5 meters.  To achieve this accuracy, it is important that the DGPSs are in
       good working order and are obtaining strong satellite signals. The field team will be
       responsible for checking the satellite signal strength for the DGPS systems prior to
       recording this data and for ensuring that the two systems are recording equivalent
       horizontal locations. Any problems with signal strength or differences between the two
       systems shall be recorded in the field sample log (Appendix E). If problems are noted,
       the field team should provide a qualitative description of the sampling location utilizing
       any available, permanent landmarks. Both DGPS units will have their accuracy checked
       prior to each day's sampling activities by locating one of the USACE survey markers
       shown in Appendix F. The DGPS units' antennas will be located as close to the marker
       as possible and the readings will be compared to those in Appendix F.
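
       As an illustration of the daily DGPS accuracy check described above, the following Python
       sketch compares a DGPS reading against a known survey-marker position and flags horizontal
       offsets greater than 5 meters. The coordinates shown are hypothetical placeholders; in
       practice the USACE survey-marker coordinates from Appendix F would be used.

           # Hedged sketch of the daily DGPS accuracy check: compare a DGPS reading
           # against a known survey-marker position. Coordinates are hypothetical.
           import math

           def horizontal_offset_m(lat1, lon1, lat2, lon2):
               """Great-circle (haversine) distance between two lat/lon points, in meters."""
               r = 6371000.0  # mean Earth radius, m
               p1, p2 = math.radians(lat1), math.radians(lat2)
               dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
               a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
               return 2 * r * math.asin(math.sqrt(a))

           marker = (41.9000, -83.3800)      # hypothetical survey-marker position (see Appendix F)
           reading = (41.90002, -83.38003)   # hypothetical DGPS reading taken at the marker

           offset = horizontal_offset_m(*marker, *reading)
           print(f"offset = {offset:.1f} m",
                 "OK" if offset < 5.0 else "RECORD PROBLEM IN FIELD LOG")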


       Sediment Depth:  Sediment depth data is critical for determining the volume of sediments
       and mass of PCBs remaining in the removal area.

Other field data to be collected includes:

           1.  Water Depth (corrected to Low Water Datum post-sampling),
          2.  Log of Major Ship Traffic in the Survey Area,
          3.  Log of Significant Navigation Dredging in the Survey Area, and
          4.  Sediment Physical Observations.

However, these additional pieces of data are not considered critical to the objectives of this
project.

       Water Depths
       Water depths will be taken directly over the location of the sampling site prior to sample
       collection with a weighted measuring tape. Water depths will be reported both as the
       actual depth measured and as the water depth corrected to Low Water Datum. Low Water
       Datum for the Raisin River is referenced to the closest daily and hourly water level
       station for Lake Erie, at the Fermi Power Plant, which can be obtained from the NOAA
       home page for water elevations.

       [Note:  Water level data are available in 6-minute intervals.  The address is:
       http://www.opsd.nos.noaa.gov/data_res.html. From this address, under Preliminary
       Water Level Data, select Great Lakes Stations, then choose Fermi Power Plant and display
       recorded water levels in feet.]
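
       A minimal sketch of the water-depth correction to Low Water Datum is shown below. It
       assumes the corrected depth equals the measured depth minus the height of the observed
       water surface above LWD at the reference gauge; the gauge values used are hypothetical.

           # Minimal sketch of the correction to Low Water Datum (LWD). Assumption (not
           # stated explicitly above): corrected depth = measured depth minus the height
           # of the observed water surface above LWD. Values are hypothetical.
           def depth_corrected_to_lwd(measured_depth_ft, observed_water_level_ft, lwd_elevation_ft):
               """Reference a measured water depth to Low Water Datum."""
               water_level_above_lwd = observed_water_level_ft - lwd_elevation_ft
               return measured_depth_ft - water_level_above_lwd

           # Example: 8.2 ft measured with the gauge reading 1.3 ft above LWD -> 6.9 ft (LWD)
           print(round(depth_corrected_to_lwd(8.2, observed_water_level_ft=572.6,
                                              lwd_elevation_ft=571.3), 2))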
4.     Sample Collection and Handling

4.1    Sample Collection

4.1.1   Sediment Cores

Sediment cores will be collected utilizing the vibracorer sampling device located on the USEPA-
GLNPO's sampling vessel, the R/V Mudpuppy.  The vibracorer is capable of collecting
continuous sediment cores up to 15 feet in length. Appendix A contains the equipment needs,
the Standard Operating Procedures (SOP), and the decontamination procedures for the collection
of sediment core samples during this sediment survey.

All sediment cores will be analyzed for sediment chemistry as summarized in Table 5 and
explained in detail in Section 5.

4.1.2   Sediment Ponars

Sediment ponar samples will be collected utilizing the ponar dredge sampling device on the
USEPA-GLNPO's sampling vessel, the R/V Mudpuppy. The ponar dredge sampler collects a
surficial sediment sample of approximately six inches (6") in depth. Appendix A contains the
equipment needs, the Standard Operating Procedures (SOP),  and the decontamination procedures
for the collection of sediment ponar samples during this sediment survey.



Sediment ponar samples will be used for sediment chemistry analysis, whole sediment toxicity
testing, laboratory bioaccumulation testing, and benthic community analysis as summarized in
Table 5  and explained in detail in Section 5.

4.1.3   Caged Fish Samples

Caged fish samples will be collected according to the Standard Operating Procedures contained
in Appendix C.  Appendix C also contains the SOP for collecting the surficial sediment and
water samples that need to be collected in conjunction with the caged fish sampling. Locations
of the caged fish testing will be selected by the MDEQ to correspond to their historical sampling
locations, but may be modified slightly to account for current site conditions. The MDEQ will
determine the latitude/longitude locations of the caged fish samples after the cages have been set.
Latitude and longitude will be recorded in a field notebook, along with the site number (i.e., C-1,
C-2, etc.) and relayed to Scott Cieniawski via e-mail.

Fish tissue samples, sediment from caged fish sites, and water from caged fish sites will be
analyzed for chemistry as summarized in Table 5  and explained in detail in Section 5.
4.2    Sample Handling

4.2.1   Sample Containers

After processing, sediment samples will be placed into the appropriate sample containers as
summarized in Table 6. A field sample log shall be filled out for each sampling location.

Note: The analyzing laboratory will supply all required sample containers, preservatives, and
sample coolers, including a temperature blank with each sample cooler.  The coolers, sample
bottles, and required preservatives for all samples, except the fish tissue samples, shall be
shipped to the following address no later than September 30, 2001:

Scott Cieniawski
USEPA-GLNPO
77 W. Jackson Blvd. (G-17J)
Chicago, IL 60604

Logistics for the delivery of the fish tissue sample containers will be arranged among the MDEQ,
USEPA, USACE, and the analyzing laboratory prior to commencement of the caged fish
sampling.


Table 6.  Sample Container and Preservation Requirements

       Analyses                      Container                       Preservation Technique        Holding Times
       PCBs                          8 oz. Widemouth Glass           Cool/dark, < 4° C             14 days/40 days**
       SVOCs (PAHs)                  4 oz. Widemouth Glass           Cool/dark, < 4° C             14 days/40 days**
       Mercury                       Included in metals              Cool/dark, < 4° C             28 days*
       Metals (Cd, Cr, Cu,           4 oz. Widemouth Glass           Cool/dark, < 4° C             6 months*
       Ni, Pb, Se, Zn)
       AVS/SEM                       4 oz. Widemouth Glass           Cool/dark, < 4° C,            7 days*
                                     with Teflon liner               no head space
       Nutrients, TOC                Included in metals              Cool/dark, < 4° C             28 days*
       Total Solids                  Included in metals              Cool/dark, < 4° C             7 days*
       Percent Moisture              Included in metals              Cool/dark, < 4° C             40 days*
       Particle Size                 1 Quart Zip Lock Baggies        Sealed container,             6 months*
                                                                     Do Not Cool
       Whole Sediment Toxicity       4 L, Plastic                    Cool/dark, < 4° C             14 days
       L. variegatus                 4 L, Plastic                    Cool/dark, < 4° C             14 days
       Bioaccumulation
       Benthic Community             1 L Plastic                     Cool/dark, < 4° C,            14 days
       Assessment                                                    airtight, preservative
                                                                     of formalin
       Fish Tissue                   Aluminum packages               Frozen/dark, < 4° C           14 days/40 days**

              *  From time of collection to analysis
              ** From time of collection to extraction/From time of extraction to analysis
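
The holding-time entries in Table 6 can be checked as sketched below in Python. The sketch assumes
that entries of the form "14 days/40 days" run from collection to extraction and from extraction to
analysis, respectively, while single values run from collection to analysis; the dates in the example
are hypothetical.

    # Hedged sketch (not part of the QAPP) of a holding-time check based on Table 6.
    from datetime import date
    from typing import Optional

    def holding_time_met(collected: date, analyzed: date, analysis_limit_days: int,
                         extracted: Optional[date] = None,
                         extraction_limit_days: Optional[int] = None) -> bool:
        """Return True if the applicable holding-time window(s) were met."""
        if extracted is not None and extraction_limit_days is not None:
            return ((extracted - collected).days <= extraction_limit_days and
                    (analyzed - extracted).days <= analysis_limit_days)
        return (analyzed - collected).days <= analysis_limit_days

    # PCBs: 14 days collection-to-extraction, 40 days extraction-to-analysis
    print(holding_time_met(collected=date(2001, 10, 10), analyzed=date(2001, 11, 20),
                           analysis_limit_days=40, extracted=date(2001, 10, 22),
                           extraction_limit_days=14))   # True
    # Mercury: 28 days from collection to analysis
    print(holding_time_met(date(2001, 10, 10), date(2001, 11, 20), 28))   # False (41 days)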
4.2.2   Sample Labeling

Each sample bottle shall be individually labeled using a waterproof pen.  The label shall contain
the following information:

       •  Unique Sample Number:  RR01-XX-A, where "RR01" refers to the Raisin River
          2001 sampling event, "XX" refers to the numerical sequence of the sample locations,
          and "A" refers to the alphabetical sequence indicating sample depth ("A" is the 1st
          layer/segment, "B" is the 2nd layer/segment, etc.; a ponar sample will be designated
          "P").  Field duplicates and field replicates shall receive their own unique sample
          number so as to provide a blind duplicate/replicate for laboratory analysis.  Caged
          fish sampling locations are numbered C-1, C-2, etc.; therefore, caged fish samples
          will be labeled RR01-C-1, RR01-C-2, etc.
       •  Sample Date (MM-DD-YYYY)
       •  Sample Time (HH:MM, on a 24-hour clock)
       •  Analysis to be performed  (e.g. PCBs, metals, whole sediment toxicity, etc.)
       •  Sampler's Initials


An example label is shown in Figure 3.  Clear tape will be placed over the label after the label
has been completely filled out and attached to the sample container. The sample identification
number and date of sample collection will be written on the sample container closure with a
waterproof marker.

Figure 3. Example Sample Label

                     RR01-09-A
                     PCBs
                     10-10-2001
                     13:30
                     SEC
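
For illustration only, the following Python sketch builds sample numbers that follow the RR01-XX-A
convention described above; the helper function is hypothetical and not part of the QAPP.

    # Illustrative sketch only: build sample numbers that follow the RR01-XX-A convention.
    def sample_number(location, segment="A", event="RR01"):
        """Build a unique sample number.

        'segment' is the depth designation ("A" = 1st layer/segment, "B" = 2nd, ...,
        "P" = ponar sample); caged fish sites are passed as strings such as "C-1".
        """
        loc = location if isinstance(location, str) else f"{location:02d}"
        return f"{event}-{loc}" if loc.startswith("C-") else f"{event}-{loc}-{segment}"

    print(sample_number(9))            # RR01-09-A  (core location 9, first segment)
    print(sample_number(9, "P"))       # RR01-09-P  (ponar sample at location 9)
    print(sample_number("C-1"))        # RR01-C-1   (caged fish site 1)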
4.2.3   Shipment and Chain-of-Custody

After collection and labeling, all glass containers shall be placed in a zip-lock bag and placed in an
appropriate sample cooler. Within 24 hours of sample collection, the samples will be sent to the
respective analyzing laboratory. After samples are collected each day, the USEPA Field Team
Leader shall be responsible for shipping and/or arranging pickup of samples. The Field Team
Leader shall ensure that:
       1.  The coolers contain sufficient ice to keep the samples below 4° C during the shipment
           process,
       2.  The samples are immobilized with bubble pack to reduce the risk of breakage,
       3.  The chain-of-custody form (see example in Appendix G) is properly filled out,
       4.  A copy of the chain-of-custody form is retained and provided to the project
           manager,
       5.  A copy of the chain-of-custody form is placed in a "ziploc" bag and taped to the
           inside lid of the cooler,
       6.  A copy of the chain-of-custody form is faxed to the Lakeshore Engineering Services
           Project Manager at (313) 535-7875,
       7.  A temperature blank is included in each sample cooler (temperature blank to be
           supplied by the laboratories),
       8.  The outside of the container is sealed using fiberglass or duct tape,
       9.  The laboratory name and address, as well as the return name and address, are
           clearly labeled on the outside of the container,
       10. The samples are sent to the contract laboratory by an overnight courier,
       11. Receipts or bills of lading are retained as part of the permanent documentation,
       12. Commercial couriers are not required to sign off on the sample tracking form as long
           as it is sealed inside the sample cooler,
       13. Laboratories are contacted prior to shipment to ensure they are prepared for sample
           arrival, and
       14. Whole fish samples and L. variegatus (bioaccumulation) samples are shipped frozen.

Note: Each analyzing laboratory will supply chain-of-custody forms to the USEPA field team
leader prior to the sampling event.


Table 7 summarizes where each of the respective types of samples shall be shipped.

Table 7.  Addresses for Shipment of Samples

       Analysis Type                            Laboratory Contact Information
       PCB Aroclors and congeners               Kathleen Loewen
       (sediment, water, and tissue),           Lancaster Laboratory
       SEM & AVS                                2425 New Holland Pike
                                                Lancaster, PA 17605-2425
                                                phone: (717) 656-2300

       Metals (including Hg),                   Ann Preston
       Nutrients, TOC, PAHs,                    Trace Analytical Laboratories
       % Moisture                               2241 Black Creek Road
                                                Muskegon, MI 49444-2673
                                                phone: (231) 773-5998 ext. 224

       Whole Sediment Toxicity,                 Al Mozol
       Benthic Community Assessment,            AScI Corporation
       and Bioaccumulation Exposure             4444 Airpark Blvd
                                                Duluth, MN 55811
                                                phone: (218) 722-4040

       Caged Fish Samples                       Lancaster Laboratory/Sample Administration Grp.
       (PCB Aroclors and                        2425 New Holland Pike
       Congener Analysis)                       Lancaster, PA 17605-2425
                                                phone: (717) 656-2300

       Grain Size                               Coleman Engineering - Jim Strigel
                                                635 Industrial Park Road
                                                Iron Mountain, MI 49801
                                                phone: (906) 774-3440
4.2.4   Receipt of Samples

Upon receipt of project samples, each laboratory shall

       •   Complete their portion of the chain-of-custody forms,
       •   Contact the Lakeshore Engineering Services Project Manager to inform her of sample
           receipt and to discuss any problems or issues,
       •   Ensure that the samples are maintained at < 4° C,
       •   Complete a Cooler Receipt Form (see example in Appendix H),
       •   If there are any sample shipment problems, contact the Lakeshore Engineering
           Services Project Manager (Patricia Novak), who shall contact the USEPA Project
           Manager (Scott Cieniawski) as soon as the sample shipment problem is discovered,
           and
       •   Fax a copy of the chain-of-custody form to the USEPA project manager, Scott
           Cieniawski, at 312-353-2018.


5.     Laboratory Analysis

5.1    Analysis Methods

Analysis and preparation methods for all required analyses are provided in Table 8.

Table 8. Laboratory Analysis and Preparation Methods

       Moisture Content (Total Solids)
              Analysis Method: ASTM D2937;  Preparation Method: N/A;  Laboratory SOP: Per ASTM

       Grain Size without hydrometer
              Analysis Method: ASTM D422;  Preparation Method: N/A;  Laboratory SOP: Per ASTM

       28-day H. azteca and 10-day C. tentans Whole Sediment Toxicity Tests
              Analysis Method: EPA/600/R-99/064, Methods 100.4 (Survival & Length) and
              100.2 (Survival & Weight);  Preparation Method: N/A;  Laboratory SOP: S-301 and S-304

       TOC
              Analysis Method: 9060/Lloyd-Kahn;  Preparation Method: N/A;  Laboratory SOP: UWC-SOP-KAHN

       Ammonia-Nitrogen
              Analysis Method: E 350.1;  Preparation Method: N/A;  Laboratory SOP: UWC-SOP-350.2

       TKN
              Analysis Method: E 351.2;  Preparation Method: N/A;  Laboratory SOP: UWC-SOP-351.3

       Total Phosphorus
              Analysis Method: E 365.2M;  Preparation Method: N/A;  Laboratory SOP: UWC-SOP-365.2

       Metals Kit
              Analysis Method: EPA 6020, 6010A (As & Cd analyzed using 6010B ICAP Trace
              procedures), 7471A;  Preparation Method: EPA 3051 for sediments and EPA 3015 for
              water;  Laboratory SOP: UME-SOP-6010B-T, UME-SOP-245.1

       PCBs (Sediments), Aroclors & Congeners (1)
              Analysis Method: EPA 8082B;  Preparation Method: EPA 3540C;  Cleanup Method:
              EPA Method 3620B, 3665A, or 3630C;  Laboratory SOP: UGE-SOP-8082

       PCBs (Fish Tissue)
              Analysis Method: EPA 8082B;  Preparation Method: Lancaster Analysis #2487 "Food
              and Tissue Preparation";  Laboratory SOP: UGE-SOP-8082

       Oil and Grease
              Analysis Method: EPA 9070 for water, EPA 9071A for sediments;  Preparation Method: N/A

       PCBs (Water)
              Analysis Method: EPA 8082B;  Preparation Method: EPA 3510C;  Cleanup Method: EPA 3630C

       PAHs
              Analysis Method: EPA 8270C;  Preparation Method: EPA 3540C;  Cleanup Method: EPA Method 3630C

       Acid Volatile Sulfide (AVS) (2)
              Analysis Method: EPA-121-R91-100;  Preparation Method: Acid Leach;
              Laboratory SOP: EPA-121-R91-100

       Simultaneously Extracted Metals (SEM) (2)
              Analysis Method: EPA 6020/6010A;  Preparation Method: N/A;
              Laboratory SOP: EPA-121-R91-100

(1) The following 19 PCB congeners, listed by their International Union of Pure and Applied Chemistry number
(IUPAC #), will be analyzed and reported for sediment chemistry: 1, 5, 18, 31, 44, 52, 66, 87, 101, 110, 138, 141,
151, 153, 170, 180, 183, 187, 206.

(2) The AVS-SEM analysis method can be found in the "Draft Analytical Method for Determination of Acid Volatile
Sulfide in Sediment," EPA #821/R-91-100, 1991, NTIS #PB93-155901, EPIC #D-121.


5.2    Data Quality Objectives (DQOs)

Data from the historical sampling events contain very little information regarding data quality
objectives. Additionally, the analytical detection limits obtained in the historical sampling
events (i.e., PCB Aroclor detection limits) may not be sufficient to meet the secondary objectives
of this project. Therefore, the DQOs chosen for this project will be based on the objectives
required to adequately assess the current state of the aquatic system in the study area.

The DQOs for the laboratory analysis portion of this project are defined according to the
following six quality assurance objectives.

Definitions

Instrument Detection Limit (IDL):  The instrument detection limit (IDL) is the lowest analyte
concentration that an instrument can detect.  The IDL is determined on samples that have not
gone through any sample preparation (e.g., calibration standards).

Limits of Quantification (LOQ): The limit of quantification is the lowest analyte concentration
that can be accurately measured and reported, as opposed to simply detected.

Method Detection Limit (MDL): Method detection limits (MDL) will be determined by making
repeated measurements (a minimum of seven) over several non-consecutive days of either  a
calibration blank or a low-level standard with a concentration within 1-5 times the IDL. The
MDL is calculated, at the 95 percent confidence level, as 3 times the standard deviation of the
measured sample concentrations.
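
A minimal Python sketch of the MDL calculation as defined above (at least seven replicate
measurements, MDL = 3 times their standard deviation) is shown below; the replicate concentrations
are hypothetical.

    # Sketch of the MDL calculation defined above; replicate values are hypothetical.
    import statistics

    def method_detection_limit(replicate_concentrations):
        """MDL = 3 * sample standard deviation of >= 7 replicate measurements."""
        if len(replicate_concentrations) < 7:
            raise ValueError("MDL determination requires at least seven replicates")
        return 3 * statistics.stdev(replicate_concentrations)

    print(method_detection_limit([0.012, 0.015, 0.011, 0.014, 0.013, 0.016, 0.012]))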

Target Detection Limit (TDL): The target detection limit (TDL) is the concentration at which
each analyte must be detected and quantified in order to meet the study objectives. This means
that, if possible, all IDLs, MDLs and LOQs, should be less than the TDLs for all analytes.  If the
laboratory expects any of the IDLs, MDLs, or LOQs to exceed the required TDLs, they must
contact the USACE and USEPA project managers to develop corrective action procedures.

5.2.1   Method Detection Limits and Level of Quantification

For quantitative physical and chemical analyses, analytical laboratories will be required to
determine the instrument detection limit (IDL) prior to any analysis of the routine samples. The
target detection limit (TDL) is the concentration at which the presence of an analyte must be
detected to properly assess and satisfy the DQOs.  To be acceptable, a laboratory must
demonstrate that the MDL is less than or equal to the TDL through use of laboratory quantitation
standards. The laboratories shall also strive to set the dry sample Limits of Quantification
(LOQs) below the applicable TDLs. Tables 1 and 2 contain the threshold effect concentrations
(TECs) for those chemicals to be analyzed for which TECs have been calculated.  Tables 9
and 10 contain the same information, plus a few additional parameters that do not have
calculated TECs, and list the TDL for each parameter.

Target detection limits for all required  sediment chemistry and tissue chemistry are provided in
Tables 9 and 10, respectively.

Table 9. Target Detection Limits for Sediment Chemistry

       Analyte                       TDL         Unit
       Arsenic (1)                   1.2         mg/kg DW
       Cadmium (1)                   0.2         mg/kg DW
       Chromium                      5.0         mg/kg DW
       Copper                        3.0         mg/kg DW
       Lead                          6.0         mg/kg DW
       Mercury                       0.05        mg/kg DW
       Nickel                        3.2         mg/kg DW
       Zinc                          20.0        mg/kg DW
       Total Organic Carbon          1000.0      mg/kg DW
       PCBs, as Aroclors (2)         0.1         mg/kg DW
       PCBs, as Congeners (2)        0.01        mg/kg DW
       PAHs                          0.6         mg/kg DW
       Acid Volatile Sulfides        50.0        mg/kg DW
       TKN                           20.0        mg/kg DW
       Total Phosphorus              5.0         mg/kg DW
       Ammonia Nitrogen              1.0         mg/kg DW
(1) Analysis may require EPA Method 6010B ICAP trace procedures to achieve the stated detection limits.
(2) Analysis may require additional sample cleanup preparation to achieve the required detection limit.
(3) For the metals not listed above that complete the 13 Metals Kit (Ba, Fe, Mn, Se, and Ag), the assigned
  laboratory should achieve the lowest detection limit possible.

Table 10.      Target Detection Limits for Tissue Chemistry

       Analyte                       Level of Chlorination      Units      TDL
       Total PCBs (as Aroclors)                                 mg/kg      0.30
       PCB congeners                 Mono- to tri-chloro        mg/kg      0.02
                                     Tetra- to Hexa-chloro      mg/kg      0.03
                                     Hepta- to Octa-chloro      mg/kg      0.04
                                     Nona- to Deca-chloro       mg/kg      0.05
Note:  If a laboratory is unable to obtain MDLs and LOQs that are below the respective TDLs
for each analyte, the laboratory shall contact the U.S. Army Corps of Engineers Project
Coordinator and/or the U.S. Environmental Protection Agency's Project Manager to discuss the
required course of action.  Decisions to be made could include:  implementation of additional
sample clean-up procedures prior to analysis, USEPA acceptance of higher MDLs and LOQs, or
implementation of other potential suggestions.

Note: It is understood that potentially high moisture contents of the sediments could impact MDLs
and LOQs achieved by the laboratory.  In an effort to reduce the impact of high water content on
MDLs and LOQs, the laboratories shall decant free water from the surface of the sediment samples
prior to analysis. Tentative laboratory reporting limits are listed in Tables 9 and 10.
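
The comparison implied by these notes, checking each analyte's MDL and LOQ against its TDL and
listing the analytes that require corrective action, is sketched below in Python. Only a few Table 9
entries are included, and the laboratory MDL/LOQ values are hypothetical.

    # Hedged sketch: flag analytes whose laboratory MDL or LOQ exceeds the Table 9 TDL.
    TDL_MG_KG_DW = {"Arsenic": 1.2, "Mercury": 0.05, "PCBs, as Aroclors": 0.1, "PAHs": 0.6}

    def needs_corrective_action(lab_mdl, lab_loq):
        """Return analytes whose MDL or LOQ exceeds the target detection limit."""
        return sorted(analyte for analyte, tdl in TDL_MG_KG_DW.items()
                      if lab_mdl.get(analyte, 0.0) > tdl or lab_loq.get(analyte, 0.0) > tdl)

    mdl = {"Arsenic": 0.8, "Mercury": 0.02, "PCBs, as Aroclors": 0.15, "PAHs": 0.3}
    loq = {"Arsenic": 1.0, "Mercury": 0.04, "PCBs, as Aroclors": 0.30, "PAHs": 0.5}
    print(needs_corrective_action(mdl, loq))   # ['PCBs, as Aroclors'] -> contact USACE/USEPA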
5.2.2   Bias

Bias is the systematic or persistent distortion of a measurement process that causes errors in one
direction.  Bias assessments for environmental measurements are made using personnel,
equipment, and spiking materials or reference materials as independent as possible from those
used in the calibration of the measurement system. When possible, bias assessments should be
based on analysis of spiked samples rather than reference materials so that the effect of the
matrix on recovery is incorporated into the assessment. A documented spiking protocol and
consistency in following that protocol are important to obtaining meaningful data quality
estimates.  Spikes should be added at concentrations approximately at the mid-range. Spiked
samples shall be used in accordance with the specified method.

Bias will be assessed through the use of certified reference materials (CRMs), standard reference
materials (SRMs: reference materials certified by the U.S. National Institute of Standards and
Technology [NIST]), or other standards, such as matrix spikes. Spiked surrogate compounds
for GC and GC/MS procedures (for PCB and PAH compounds, respectively) will also be used to
assess bias.

Matrix spike and matrix spike duplicate samples (MS/MSD) also will be used to assess bias as
prescribed in the specified methods. Acceptable recovery values will be within the recoveries
specified by each of the analysis methods. Control samples for assessing bias will be analyzed at
a rate as specified in the analytical SOPs and specified analytical  methods.
5.2.3   Precision

Precision is a measure of agreement among replicate measurements of the same property, under
prescribed similar conditions. This agreement is calculated as either the range (R) or as the
standard deviation (s). It may also be expressed as a percentage of the mean of the
measurements, such as relative percent difference (RPD) or relative standard deviation (RSD)
(for three or more replicates).

Laboratory precision is assessed through the collection and measurement of laboratory
duplicates.  The laboratories shall follow the protocols in the specified method and
corresponding SOPs regarding the frequency of laboratory duplicates. This allows intra-
laboratory precision information to be obtained on sample  acquisition, handling, shipping,
storage, preparation, and analysis.  Both samples can be carried through the steps in the
measurement process together to provide an estimate of short-term precision. An estimate of
long-term precision can be obtained by separating the two samples and processing them at
different times, by different people, and/or analyzing them using different instruments.
Acceptable RPDs will be in accordance with those specified by each analysis method.
For duplicate measurements, relative percent difference (RPD) is calculated as follows:

              RPD = |D1 - D2| / [(D1 + D2)/2] x 100%

                                         RPD = relative percent difference
                                         D1 = sample value
                                         D2 = duplicate sample value

For three or more replicates:

              RSD = (s/x) x 100

                                         RSD = relative standard deviation
                                         s = standard deviation of three or more results
                                         x = mean of three or more results

Standard deviation is defined as follows:

              s = [ Σ(yi - mean y)² / (n - 1) ]^0.5

                                         s = standard deviation
                                         yi = measured value of the replicate
                                         mean y = mean of replicate measurements
                                         n = number of replicates
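
The precision statistics defined above can be computed directly; the Python sketch below implements
RPD for a duplicate pair and RSD for three or more replicates, using hypothetical values.

    # Minimal sketch of the precision statistics defined above; values are hypothetical.
    import statistics

    def rpd(d1, d2):
        """Relative percent difference of a sample/duplicate pair, in percent."""
        return abs(d1 - d2) / ((d1 + d2) / 2) * 100

    def rsd(replicates):
        """Relative standard deviation (percent) of three or more replicate results."""
        return statistics.stdev(replicates) / statistics.mean(replicates) * 100

    print(round(rpd(12.0, 10.0), 1))           # 18.2 -> within the <40 RPD limit for PAHs (Table 11)
    print(round(rsd([12.0, 10.5, 11.2]), 1))   # percent RSD of a laboratory triplicate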

Additionally, precision will be assessed by the collection of field duplicates and field replicates.
Field duplicates are collected by splitting samples AFTER the homogenization process. The
duplicates are assigned a separate Sample ID number and, therefore, provide a blind measure of
the laboratory's precision by providing the laboratory with two essentially identical samples. Field
duplicates measure the ability of the laboratory to obtain similar results from two separate, blind
samples. RPD for field duplicates should meet the RPD control limits specified in Table 11.

Field replicates are collected from slightly different locations (<3 feet away) than the original
sample. Field replicates provide a measure of the variability inherent in the entire sampling and
analysis process, including small-scale variability of site conditions, consistency of sampling
and homogenization process, and laboratory analysis.  The field replicates provide a general
picture of the amount of variability that can be expected between this and future sampling events,
even if site conditions do not change substantially. This is an important consideration since this
data will be compared to historical and future sampling events.  Since site variability can greatly
influence RPD for field replicates, no strict RPD measures will be used to evaluate this measure.
However, most sediment guidance recommends that RPD measures for field  replicates be in the
same range as that for field duplicates.

Quality control limits for Precision, Accuracy, and Completeness are summarized in Table 11.


Table 11. Quality Control Limits (Aqueous and Sediment Matrices)

       Analyte                             Precision (RPD)      Accuracy                        Completeness (%)
       Individual PAHs                     <40                  As determined by Laboratory     90%
       PCBs (Congeners and Aroclors)       <50                  As determined by Laboratory     90%
       Mercury                             <40                  As determined by Laboratory     90%
       Metals (As, Cd, Cr, Cu, Ni,         <35                  As determined by Laboratory     90%
       Pb, Zn)
       Acid Volatile Sulfides              <40                  As determined by Laboratory     90%
       Simultaneously Extracted Metals     <40                  As determined by Laboratory     90%
       (Cd, Cu, Pb, Ni, Ag, Zn)
       Percent Moisture                    As specified in      As specified in method          90%
                                           method
       Particle Size                       As specified in      N/A                             90%
                                           method
       Ammonia                             <25                  80-120                          90%
       TOC                                 <20                  50-130                          90%
       Toxicity Testing                                         >80% Mean Survival of           90%
                                                                Negative Control Samples

RPD = Relative Percent Difference

5.2.4   Accuracy

Accuracy measures how close analytical results are to a true or expected value.  Accuracy
objectives will be determined by calculating the percent recovery range of laboratory matrix
spikes and matrix spike duplicates. Accuracy measures are calculated using the RPD between
the expected value and the actual analytical results.
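
As one common way of expressing matrix-spike accuracy, the Python sketch below calculates percent
recovery from a spiked and unspiked result; the concentrations are hypothetical, and the acceptable
recovery ranges remain those specified by the individual analysis methods.

    # Hedged sketch of a matrix-spike percent-recovery calculation; values are hypothetical.
    def percent_recovery(spiked_result, unspiked_result, spike_added):
        """Percent recovery = (spiked result - unspiked result) / amount spiked x 100."""
        return (spiked_result - unspiked_result) / spike_added * 100

    # Example: native sample at 0.8 mg/kg, spiked with 2.0 mg/kg, measured at 2.6 mg/kg
    print(round(percent_recovery(2.6, 0.8, 2.0), 1))   # 90.0 percent recovery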
5.2.5   Representativeness

Representativeness is the degree to which the sampling data properly characterize the study
environment. For the field-sampling phase, the previously established sampling sites reasonably
cover the entire AOC, and have been previously deemed to adequately represent the various sub-
units within the AOC.  A statistical analysis of the 12 sampling locations designed to cover the
remediated area indicates a less than 20% probability of missing a significant sediment deposit
over 25 feet in diameter. This probability is deemed acceptable for this project.

Additionally, due to the mobile, semi-fluid nature of sediment deposits over time, the sediment
within a particular depositional zone tends to be stratified vertically (as a surrogate of time) rather
than horizontally.  However, sediments from one depositional zone can vary drastically from
sediments in other depositional zones based on location and time of contamination release.


Therefore, by randomly sampling sediment deposits from a range of spatial locations within the
AOC, we obtain a representative description of sediment quality within the AOC.

In the analytical phase, and as specified elsewhere in this document, appropriate sample storage
and preservation, and sample homogenization, will ensure that the samples analyzed adequately
reflect conditions as they existed in the natural environment.
5.2.6   Comparability

Comparability expresses the confidence with which one data set can be compared to another.
Comparability will be enhanced by the consistent use of standardized sampling methods and
specified protocols for the sampling phase and through the use of standard documented
methodologies for analyte determinations. Any deviations from the standardized, selected
methods or protocols will be clearly documented by the laboratories and noted in the final
analytical report. There are a number of issues that can make two data sets comparable, and the
presence of each of the following items enhances their comparability:

       •  Two data sets should contain the same set of variables of interest
       •  Units in which these variables were measured should be convertible to a common
          metric
       •  Similar analytical procedures and quality  assurance should be used to collect data for
          both data sets
       •  Time measurements of certain characteristics (variables) should be similar for both
          data sets
       •  Measuring devices used for both data sets should have approximately similar
          detection levels
       •  Rules for excluding certain types of observations from both samples should be similar
       •  Samples within data sets should be selected in a similar manner
       •  Sampling frames from which the samples were selected should be similar
       •  Number of observations in both data sets should be of the same order of magnitude.

These characteristics vary in importance depending on the final use of the data.  The closer two
data sets are with regard to these characteristics, the more appropriate it will be to compare them.
Large differences between characteristics may be of only minor importance, depending on the
decision that is to be made from the data.

For this investigation, comparability will be satisfied by ensuring that the field sampling plan is
followed, standard EPA Methods of analysis are used for sample analysis and that proper
sampling techniques are used.
5.2.7   Completeness

Completeness is a measure of the amount of valid data obtained from a measurement system
compared to the amount that the project expected to obtain under normal conditions. Field
completeness is a measure of the amount of valid measurements obtained from all the
measurements taken in the project. Field completeness objectives for this project will be greater
than 90%. Laboratory completeness is a measure of the amount of valid measurements obtained
from all the measurements taken in the project. Laboratory completeness for this project will be
greater than 90% of the total number of samples submitted to the analytical laboratories.

The calculation for percent completeness is as follows:

       %C = 100% x (V/n)

                                  %C = percent completeness
                                  V = number of valid(1) measurements
                                  n = number of measurements planned

(1) For this sampling event, a valid measurement is defined as the arrival at a sampling location
and the collection and analysis of a sediment sample.  However, since remediation would be
expected to remove all sediment that is present, for the remediated area only, a valid measurement
is the collection and analysis of a sediment sample or the measurement of sediment depth and the
recording of "No Sediment Present".


6.     Documentation and Records

6.1    Field Documentation

Field logs, ship logs, and chain of custody documents will be used to record appropriate sample
collection information in the field.

Sediment  Sample Collection Logs: An example sediment sample collection log is provided in
Appendix E. A sediment sample collection log will be filled out by the field crew for each
sample collected. All original field data sheets shall be turned over to the Project Coordinator at
the conclusion of the field sampling and  shall be kept as part of the permanent project file.

Ship Log: A ship log summarizing sample collection information shall be maintained for each
day of field sampling.  Information to be included in the ship log shall include the sample
location ID, the latitude/longitude of each sampling location, and the time of sample collection.
The ship log shall remain with the ship files for a period of at least 2 years following the
conclusion of field sampling.

Chain-of-Custody Forms:
An example chain of custody form is provided in Appendix G. A chain-of-custody form will be
filled out for each set of samples shipped to the laboratory.  A copy of the chain-of-custody form
will be faxed to the Lakeshore Engineering Services Project Manager at the end of each field
day. All copies of the chain-of-custody form will be returned to the PI at the conclusion of the
project.

6.2    Laboratory Reports

All laboratory data and records will be included in the final analytical report submitted to the
project manager.  A complete copy of the QAPP will be provided to the labs. The project
manager will be responsible for maintaining the reports in the permanent project file. The
following laboratory-specific records will be compiled by the appropriate laboratory and
included in the final analytical report submitted to  the project manager.



Sample Data.  These records contain the times that samples were analyzed to verify that they met
holding times prescribed in the analytical methods. Included should be the overall number of
samples, sample location information, any deviations from the SOPs, time of day, and date.
Corrective action procedures to replace samples violating the protocol also should be noted.

Sample Management Records.  Sample management records document sample receipt, handling
and storage, and scheduling of analyses. The records verify that sample tracking and proper
preservation were maintained, reflect any anomalies in the samples (such as receipt of damaged
samples), note proper log-in of samples into the laboratory, and address procedures used to
ensure that holding time requirements were met.

Test Methods.  Unless analyses are performed exactly as prescribed by SOPs, this documentation
will describe how the analyses were carried out in the laboratory. This includes sample
preparation and analysis, instrument standardization,  detection and reporting limits, and test-
specific QC criteria. Documentation demonstrating laboratory proficiency with each method
used should be included (e.g., LCS data).

QA/QC Reports. These reports will include the general QC records, such as instrument
calibration, routine monitoring of analytical performance, calibration verification, etc.  Project-
specific information from the QA/QC checks such as blanks (e.g., reagent, method),  spikes (e.g.,
matrix, matrix spike duplicate, surrogate spike), calibration check samples (e.g., zero check,  span
check, and mid-range check), replicates, and so on should be included in these  reports to
facilitate data quality analysis.

Data Reporting Package Format and Documentation Control Report:  The format of all data
reporting packages must be consistent with the requirements and procedures used for data
validation and data assessment described in Sections 8,  9, and 10 of the QAPP.  The Project
Manager will ensure that data are being recorded appropriately on the sample labels, sample
tracking forms, and in the field notebook.  All entries will be made using permanent  ink, signed,
and dated, and no erasures will be made.  If an incorrect entry is made, the information will be
crossed out with a single strike mark that is signed and dated by the sampler. A similar data
entry process will be followed by the contract laboratories. Only QC/Calibration summary forms
will be provided at this time, unless analytical raw data  is necessary.

Contract laboratories will be expected to provide a data package with the following components:

       •  Case Narrative:
              •  Date of issuance
              •  Laboratory analyses performed
              •  Any deviations from intended analytical strategy
              •  Laboratory batch number
              •  Numbers of samples and respective matrices
              •  Quality control procedures utilized and also references to the acceptance criteria
       •  Laboratory report contents:
              •  Project name and number
              •  Condition of samples "as received"
              •  Discussion of whether or not sample holding times were met
              •  Discussion of technical problems or other observations which may have created
                 analytical difficulties
              •  Discussion of any laboratory QC checks which failed to meet project criteria
              •  Signature of the Laboratory QA Manager.

Chemistry Data Report:
       •  Case narrative for each analyzed batch of samples
       •  Summary page indicating dates of analyses for samples and laboratory quality control
          checks
       •  Cross referencing of laboratory sample to project sample identification numbers
       •  Descriptions of data qualifiers
       •  Sample preparation and analyses for samples
       •  Sample and laboratory quality control results
       •  Results of (dated) initial and continuing calibration checks
       •  Matrix spike and matrix spike duplicate recoveries, laboratory control samples,
          method blank results, calibration check compounds, and system performance  check
          compound results
       •  Results of tentatively identified compounds.

** An electronic copy of the Analytical Data Report will be submitted in MS Excel format
containing the analytical test results.
7.     Special Training Requirements

No special training is required for this project.


8.     Quality Control Requirements

All analytical procedures are documented in writing as SOPs and each SOP includes QC
information, which addresses the minimum QC requirements for the procedure. The internal
quality control checks might differ slightly for each individual procedure.  Examples of some of
the QC samples that will be used during this project include:

       •  Method blanks
       •  Reagent/preparation blanks
       •  Instrument blanks
       •  Surrogate spikes
       •  Analytical spikes
       •  Field replicates
       •  Laboratory duplicates
       •  Matrix Spike/Matrix Spike Duplicate
       •  Laboratory control standards
       •  Internal standard areas for GC/MS or GC/ECD analysis; control limits.


The actual QC sample requirements will be dictated by the method requirements. Details on the
use of each QC check are provided in the analytical SOPs provided for each measurement (see
Appendix I).  Method detection limits will be calculated for each analyte.

Note: Instrument calibration concentrations, method validation procedures, internal quality
control protocols, analytical routines, maintenance and corrective actions, and the data reduction
procedures are included in and will be performed as specified in the Standard Operation
Procedures found in Appendix I and/or as required by the designated analytical methods.
8.1     Instrument/Equipment Testing, Inspection, and Maintenance Requirements

The purpose of this section is to discuss the procedures used to verify that all instruments and
equipment are maintained in sound operating condition, and are capable of operating at
acceptable performance levels.

Testing, Inspection, and Maintenance
The success of this project is dependent on well-functioning field, analytical, and toxicological
equipment. Preventative maintenance of this equipment is the key to reducing possible project
delays due to faulty equipment.

As part  of each laboratory's QA/QC program, a routine preventative maintenance program will
be conducted to minimize the occurrence of instrument failure and other system malfunctions.
All laboratory instruments are maintained in accordance with manufacturer's specifications and
the requirements of the specific method employed. This maintenance is carried out on a regular,
scheduled basis and is documented in the laboratory instrument service logbook for each
instrument.
8.2     Instrument Calibration and Frequency

This section concerns the calibration procedures that will be used for instrumental analytical
methods and other measurement methods that are used in environmental measurements.
Calibration is defined as checking physical measurements against accepted standards.

Instrumentation Requiring Calibration
All of the equipment used to analyze the sediment samples will require calibration, as will the
water quality equipment used to monitor overlying water quality parameters in the sediment
toxicity tests.


Calibration Methods That Will Be Used For Each Instrument
Instrument calibration procedures are dependent on the method and corresponding SOP (see
Appendix I).  All ongoing calibration measurements must be within the requirements of the
corresponding SOP to be considered adequate.

Calibration Apparatus
None of the analytical instruments will be calibrated using a calibration apparatus.

Calibration Standards
The working linear range of an instrument should be established prior to performing sample
analyses.  Calibration standards as specified in the applicable methods and SOPs will be used
when establishing the working linear range.  The working linear range for a specific analysis
should bracket the expected concentrations of the target analyte in the samples to be analyzed.

Calibration Frequency
Instrument calibration is performed before sample analysis begins and is continued during
sample analysis at the intervals specified within the applicable methods and SOPs (see Appendix
I) in order to ensure that the data quality objectives are met. The verification of instrument
stability is assessed by analyzing continuing calibration standards at regular intervals during the
period that sample analyses are performed. Standards will be analyzed on a schedule as
specified in the analytical SOPs. The concentration of the continuing  calibration standard should
be equivalent to the midpoint of the working linear range of the instrument.

Equipment logbooks will be maintained at each laboratory to record the usage, maintenance,
calibration, and repair of instrumentation.  These logbooks will be available during
any audits that may be conducted.

Thermometer Calibration           See Appendix I

Deionized Water Supply            See Appendix I

Analytical Balances                See Appendix I

Glassware Calibration/Verification  See Appendix I


8.3     Inspection/Acceptance Requirements for Supplies  and Consumables

The purpose of this section is to establish and document a system for  inspecting and accepting all
supplies and consumables  that may directly or indirectly affect the quality of the project or task.

Identification of Critical Supplies and Consumables
Critical supplies and consumables include sample bottles, gases, reagents, hoses, materials for
decontamination activities, and distilled/deionized water. Each of the laboratories will utilize
high quality supplies and consumables to reduce the chances of contaminating the samples. All
water purification systems are tested on a regular basis to ensure that  water produced is
acceptable for use. Solvent blanks are run to verify the purity of solvents used in the organic
analyses.  The contract laboratories may also incorporate other measures, such as the dedicated
use of glassware for certain analyses (e.g., inorganics, organics) or toxicity tests.



Establishing Acceptance Criteria
Acceptance criteria must be consistent with overall project technical and quality criteria. Each of
the laboratories should utilize their own acceptance criteria for normal operations with analyzing
and/or testing contaminated sediments.

Inspection of Acceptance Testing Requirements and Procedures
Each contract laboratory should document inspections of acceptance testing, including
procedures to be followed, individuals responsible, and frequency of evaluation.  In addition,
handling and storage conditions for supplies and consumables should be documented.

Tracking and Quality Verification of Supplies and Consumables
Procedures should be established to ensure that inspections or acceptance testing of supplies and
consumables are adequately documented by permanent, dated, and signed records or logs that
uniquely identify the critical supplies or consumables, the date received, the date tested, the date
to be retested (if applicable), and the expiration date.  These records should be kept by the
responsible individual(s) at each laboratory. In order to track supplies and consumables, labels
with the information on receipt and testing should be used. These or similar procedures should
be established to enable project personnel to:  1) verify, prior to use, that critical supplies and
consumables meet the project objectives; and 2) ensure that supplies and consumables that have
not been tested, have expired, or do not meet acceptance criteria are not used for the project.
8.4    Data Management

This section will present an overview of all mathematical operations and analyses performed on
raw data to change their form of expression, location, quantity, or dimensionality. These
operations include data recording, validation, transformation, transmittal, reduction, analysis,
management, storage, and retrieval.

Laboratory Data Recording
All raw analytical and toxicity data will be recorded in numerically identified laboratory
notebooks or data sheets.  The data will be promptly recorded in black ink on appropriate forms
that are initialed and dated by the person collecting the data.  Changes to recorded data are made
in black ink, with a single line cross-out, initials, and date.  No "whiteout" will be allowed.

Direct entry or downloading of the data into a computerized data logger is preferable where a
laboratory has that capability. All labs shall enter their data into a computerized
database. Sample data are recorded along with other pertinent information, such as the sample
identification number. Other details which will also be recorded include:  the analytical  method
used (SOP #),  name of analyst, the date of analysis or toxicity test, matrix sampled, reagent
concentrations, instrument settings, and the raw data.  Each page of the notebook or data sheet
will be signed and dated by the analyst. Copies of any strip chart printouts (such as gas
chromatograms) will be maintained on file. Periodic review of these notebooks by the
Laboratory Supervisors will take place prior to final data reporting. Records of notebook entry
inspections are maintained by the Laboratory QA Officer.

-------

Data Verification
The method, instrument, or system should generate data in a consistent, reliable, and accurate
manner. Data validity will be demonstrated by meeting acceptable QC limits for analytical
parameters and sediment toxicity tests. In addition, the application of preventive maintenance
activities and internal QA/QC auditing will ensure that field- and laboratory-generated data are
valid. Quality control data (e.g., laboratory duplicates, matrix spikes, matrix spike duplicates,
and performance of negative controls) will be compared to the method acceptance criteria. Data
considered to be acceptable will be entered into the laboratory computer system. Data
verification is performed at the technical level by a second, designated senior/experienced staff
member, who evaluates QC results, holding times, and instrument calibration. All QA
requirements are programmed into automated systems and flagged where appropriate.

Data Transformation
Data transformations result from calculations based on instrument output, readings, or responses.
The procedures for converting calibration readings into an equation that will be applied to
measurement readings are  given in the SOPs for analytical parameters (Appendix I).
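
As a generic illustration (not taken from the SOPs in Appendix I; the standard concentrations and
responses shown are hypothetical), a linear calibration curve can be fit to the readings for the
calibration standards and then inverted to convert sample readings into concentrations:

    import numpy as np

    # Hypothetical calibration standards: known concentrations and instrument responses
    std_conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])               # e.g., mg/L
    std_response = np.array([0.002, 0.101, 0.198, 0.505, 0.998])  # instrument signal

    # Fit response = slope * concentration + intercept
    slope, intercept = np.polyfit(std_conc, std_response, 1)

    def response_to_concentration(signal):
        """Invert the calibration equation to convert a sample reading to a concentration."""
        return (signal - intercept) / slope

    print(response_to_concentration(0.350))  # concentration corresponding to a signal of 0.350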

Data Transmittal
Data transmittal occurs when data are transferred from one person or location to another or when
data are copied from  one form to another. Some examples of data transmittal are copying raw
data from a notebook onto a data entry form for keying into a computer file and electronic
transfer of data over a computer network.  The transmittal of field data will be double-checked
by the PI. The transmittal  of laboratory data will be checked by the individual analyst with
periodic checks by the Laboratory Project Manager and/or QA Officer.

Data Reduction
Data reduction includes all processes that change the number of data items. Each laboratory has
its own data reduction techniques, which are usually documented in its QA Manual. For the
analytical results, data reduction will involve calculating the arithmetic mean and standard
deviation of field and laboratory replicates.
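
As a simple illustration of this reduction step (the replicate values shown are hypothetical):

    import statistics

    replicate_results = [12.1, 11.8, 12.6]  # hypothetical replicate concentrations, mg/kg
    mean_result = statistics.mean(replicate_results)
    std_dev = statistics.stdev(replicate_results)  # sample standard deviation
    print(f"mean = {mean_result:.2f} mg/kg, s = {std_dev:.2f} mg/kg")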

Data Analysis
Data analysis will involve  comparing the surficial contaminant concentrations to qualitative
values contained in Table  1. The analysis shall be performed by the USEPA Project Manager.

Data Tracking
Data management includes tracking  the status of data as they are collected, transmitted, and
processed.  Each laboratory will have its own data tracking  system in place.

Data Storage and Retrieval
Each contract laboratory will have its own data storage and  retrieval protocols. USEPA-GLNPO
will retain all the analytical data packages in the project files for this study. In addition, the
sediment contaminant data will be added to GLNPO's contaminated sediment database.

8.5    Data Acquisition Requirements (Non-Direct)

We will be utilizing historical sediment chemistry data and fish tissue data for this project.  Prior
to utilizing this data,  the USEPA Project Manager will be responsible for verifying the quality of
this data. Historical data will only be used if original laboratory reports are available for review
and assessment.  At a minimum, the USEPA Project Manager will utilize the checklists
contained in Appendix J to verify the quality of the historical data. If the historical data does not
meet the requirements of the checklist, or if the Project Manager is unable to ascertain the quality
of the historical data, this data will not be used in the project analysis.

Additionally, sets of screening values will be used to evaluate the potential impacts of the
contaminant concentrations found in the sediments during this survey. All parameter data will
be compared to existing sediment quality guidelines available in MacDonald et al. (2000) and
Persaud et al. (1993). All of these screening levels were specifically developed for freshwater
ecosystems and have been published in peer reviewed journals and documents.  Therefore, these
guidelines are considered sufficient for a screening level analysis of sediment data.
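
A minimal sketch of this comparison step is shown below; the numeric thresholds are placeholders
only, not values from MacDonald et al. (2000) or Persaud et al. (1993), and the actual guideline
values must be taken from those publications:

    # Placeholder screening values (mg/kg dry weight); substitute the published
    # guideline values from MacDonald et al. (2000) and Persaud et al. (1993).
    screening_values = {
        "lead": {"lower_screen": 35.0, "upper_screen": 130.0},
        "zinc": {"lower_screen": 120.0, "upper_screen": 460.0},
    }

    sample_results = {"lead": 88.0, "zinc": 95.0}  # hypothetical surficial sediment results

    for analyte, result in sample_results.items():
        screens = screening_values[analyte]
        if result >= screens["upper_screen"]:
            category = "exceeds upper screening value"
        elif result >= screens["lower_screen"]:
            category = "between screening values"
        else:
            category = "below lower screening value"
        print(f"{analyte}: {result} mg/kg -> {category}")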

Water surface elevation data will be obtained from the NOAA web page. Only data from the
"verified/historical water level data" page will be utilized in the study. However, NOAA has
attached the following disclaimer on data from this web page:

"These raw data have not been subjected to the National Ocean Service's quality control or
quality assurance procedures and do not meet the criteria and standards of official
National Ocean Service data. They are released for limited public use as preliminary data
to be  used only with appropriate caution."

Since the water surface elevation data are non-critical, these preliminary data are sufficient for
our needs.
9.      Assessment and Oversight

9.1     Assessment and Response Actions

During the planning process, many options for sampling design, sample handling, sample
cleanup and analysis, and data reduction are evaluated and chosen for the project. In order to
ensure that the data collection is conducted as planned, a process of evaluation and validation is
necessary. This section of the QAPP describes the internal and external checks necessary to
ensure that:

       •  All elements of the QAPP are correctly implemented as prescribed.
       •  The quality of the data generated by implementation of the QAPP is adequate.
       •  Corrective actions, when needed, are implemented in a timely manner and their
          effectiveness is confirmed.

The most important part of this section is documenting all planned internal assessments.
Generally, internal assessments are initiated or performed by the QA Officer.

-------


Assessment of Subsidiary Organizations
Two types of assessments of the subsidiary organizations can be performed as described below.

       •  Management Systems Review (MSR). A form of management assessment, this
          process is a qualitative assessment of a data collection operation or organization to
          establish whether the prevailing quality management structure, policies, practices, and
          procedures are adequate for ensuring that the type and quality of data needed are
          obtained.  The MSR is used to ensure that sufficient management controls are in place
          and carried out by the organization to adequately  plan, implement, and assess the
          results of the project.
       •  Readiness Reviews. A readiness review is a technical check to determine if all
          components of the project are in place so that work can commence on a specific
          phase.

It is anticipated that a readiness review by each contract laboratory project manager will be
sufficient for this project. No formal management systems review is anticipated for this project; a
pre-project QA/QC conference call (already held) and submittal of laboratory certifications and/or
QA plans shall suffice in lieu of an MSR.

Assessment of Project Activities
Assessment of project activities can involve the following tasks:

       •  Surveillance
       •  Technical Systems Audit (TSA)
       •  Performance Evaluation (PE)
       •  Audit of Data Quality (ADQ)
       •  Peer Review
       •  Data Quality Assessment.

Surveillance will be the primary assessment technique for project activities and will be carried out
chiefly by the Project Manager and QA Officer of each contract laboratory.

Number, Frequency, and Types of Assessments
Due to the short-term nature of this project for the contract laboratories, no assessments are
planned other than general surveillance, a data quality assessment by USACE representatives, and
peer review by USACE and USEPA.

Assessment Personnel
Internal audits of the contract laboratories are regularly performed by their respective QA
Officers.

Schedule of Assessment Activities
External audits are at the discretion of the GLNPO QA Officer and/or the GLNPO Project
Manager. The scheduling of regular internal audits at contract labs is at the discretion of the
respective QA Officers.

-------


Reporting and Resolution of Issues
Any audits or other assessments that reveal findings of practice or procedure that do not conform
to the written QAPP need to be corrected as soon as possible. The Laboratory Project Manager
and Laboratory QA Officer need to be informed immediately of critical deviations that
compromise the acceptability of the test. For any critical deviations from the QAPP (e.g.,
elevated detection levels, surrogate recoveries outside control limits) that cannot be
corrected within the laboratories' standard procedures, the Laboratory Project Manager must
contact both the USEPA Project Manager and the USACE Project Coordinator within 24 hours
of being informed of the deviation. The Laboratory Project Manager should be ready to provide
suggestions for corrective action. For non-critical deviations, these individuals need to be
informed by the next business day.

Corrective actions should only be implemented after approval by both the USACE and the
USEPA Project Managers. If immediate corrective action is required, approvals secured by
telephone from the USEPA Project Manager should be documented in an additional
memorandum. In general, communications from the laboratories should follow the chain of
command shown in Figure 1. However, if the subcontract laboratories are unable to contact
the Lakeshore Engineering Services Project Manager on any time-critical matter, the laboratories
shall contact either the USACE Project Coordinator or the USEPA Project Manager as necessary.

For noncompliance problems, a formal corrective action program will be determined and
implemented at the time the problem is identified.  The person who identifies the problem will be
responsible for notifying the project manager.  Implementation of corrective actions will be
confirmed in writing through the same channels. Each laboratory shall issue a nonconformance
report for each nonconformance condition.

Corrective actions in the laboratory may occur prior to, during, and after initial analysis. A
number of conditions, such as broken sample containers, multiple phases, and potentially high
concentration samples may be identified during sample log-in  or just prior to analysis.
Following consultation with laboratory analysts and section leaders, it may be necessary for the
Laboratory QA Officer to approve the implementation of corrective actions. The submitted
SOPs specify some conditions during or after analysis that may automatically trigger corrective
actions of samples, including additional sample extract cleanup and automatic re-
injection/reanalysis when certain quality control criteria are not met.

Corrective actions are required whenever an out-of-control event or potential out-of-control
event is noted.  The investigative action taken is somewhat dependent on the analysis and the
event.

Laboratory personnel are alerted that corrective actions may be necessary if:
       •  QC data are outside the warning or acceptable windows for precision and accuracy
       •  Blanks contain target analytes above acceptable levels
       •  Undesirable trends are detected in spike recoveries or RPD between duplicates
       •  There are unusual changes in detection limits
       •  QC limits for sediment toxicity tests are not met
       •  Deficiencies are detected by the Laboratory and/or GLNPO QA Officer(s) during  any
          internal or external audits or from the results of performance evaluation samples
       •  Inquiries concerning data quality are received.

-------


Corrective action procedures are often handled at the bench level by the analyst, who reviews the
preparation or extraction procedure for possible errors, checks the instrument calibration, spike
and calibration mixes, instrument sensitivity, experimental set-up, and so on. If the problem
persists or cannot be identified, the matter is referred to the Laboratory Project Manager and/or
Laboratory QA Officer for further investigation. Once resolved, full documentation of the
corrective action procedure is filed with the Laboratory QA Officer.

These corrective actions are performed prior to release of the data from the laboratory. The
corrective actions will be documented in both the laboratory's corrective action log and the
narrative data report sent from the laboratory to the Lakeshore Engineering Services Project
Manager.

If corrective action does not rectify the situation, the laboratory will contact the Lakeshore
Engineering Services Project Manager, who will then contact the USACE Project Coordinator
and USEPA Project Manager to discuss details of the corrective actions and required future
actions.
9.2    Reports to Management

Responsible Organizations
Written QC data and appropriate QA/QC reports generated by the laboratories shall be included
in the Analytical Data Report. The Analytical Data Report will be provided to the Project Manager
by the laboratory personnel identified in Section 1.3 whenever sample measurements
are reported. The QC section of the Analytical Data Report should include the QC data
(including results, recoveries, and RPDs), any non-conformance reports, and chains of custody.
The report should give detailed results of analysis of QC samples, and provide information on
the precision, accuracy, and completeness for each sample run. These written reports will note
any significant QA/QC problems encountered during sample analyses, as well as state the
corrective actions taken.

Any serious QA problems needing immediate decisions will be discussed orally between the
USACE Project Coordinator and laboratory staff, with such discussions documented in writing.
Communication should follow the chain of command summarized in Figure 1. These problems
will be noted in the final project report to the USEPA Project Coordinator.

The USACE Project Coordinator will provide summary QA/QC information in the final written
report to USEPA. This report will include information on adherence of measurements to the QA
objectives. The final report will contain detailed discussions of QA/QC issues, including any
changes in the QAPP, a summary of the contract laboratories' QA/QC reports, results of any
internal performance audits, any significant QA/QC problems, detailed information on how well
the QA objectives were met, and their ultimate impact on decision making. The following is a
list of items to be included in the final project report:

       •  Changes in the QAPP
       •  Results of any internal system audits
       •  Significant QA/QC problems, recommended solutions, and results of corrective
          actions

-------

       •  Data quality assessment in terms of precision, accuracy, representativeness,
          completeness, and sensitivity
       •  Indication of fulfillment of QA objectives
       •  Limitations on the use of the measurement data.
10.           Data Validation and Usability

The USEPA Project Manager will make a final decision regarding the validity and usability of
the data collected during this project.  The project manager will evaluate the entire sample
collection, analysis, and data reporting processes to determine if the data is of sufficient quality
to meet project objectives. Data validation involves all procedures used to accept or reject data
after collection and prior to use. These include screening, editing, verifying, and reviewing
through external performance evaluation audits. Data validation procedures ensure that
objectives for data precision and bias will be met, that data will be generated in accordance with
the QA project plan and SOPs, and that data are traceable and defensible. The process is both
qualitative and quantitative and is used to evaluate the project as a whole.

Procedures Used to Validate Field Data
Procedures to evaluate field data for this project primarily include checking for transcription
errors and reviewing field notebooks.  This task will be the responsibility of the project manager.

Procedures Used to Validate Laboratory Data
The respective Laboratory QA Officer will conduct a systematic review of the analytical data for
compliance with the established QC criteria based on the spike, duplicate, and blank results
provided by the laboratory. All technical holding times will be reviewed, the laboratory
analytical instrument performance will be evaluated, and results of initial and continuing
calibration will be reviewed and evaluated.

Upon receipt of the draft laboratory report, the U.S. Army Corps of Engineers will perform a
thorough QA/QC review of the chemical data.  At a minimum, this review will include an
analysis of:

          •   Sample Receipt Verification/Documentation
          •   Detection Limits
          •   Surrogate Recoveries
          •   Laboratory QC Documentation and Results
          •   Holding Time Data
          •   Process Bias and Sensitivity
          •   MS/MSD Recoveries
          •   Analytical Method Documentation

At the conclusion of the review, the U.S. Army Corps of Engineers will prepare a report
describing the results of the review, providing recommendations on data items requiring
corrective action or further documentation/information, and drawing conclusions as to  the
usability of the data provided.  A draft report will be provided to both the analyzing laboratories
and the U.S. EPA project manager for review and comment prior to finalizing conclusions and
recommendations.

-------


The data review will identify any out-of-control data points and data omissions, and the
Laboratory QA Officer will interact with the laboratory to correct data deficiencies. Decisions to
repeat sample collection and analysis may be made by the USEPA Project Manager based on the
extent of the deficiencies and their importance in the overall context of the project.

Additionally, the USEPA project manager will  compare all  field and laboratory duplicates for
RPD. Based on the results of these comparisons, the USEPA project manager will determine the
acceptability of the data.  One hundred percent  of the analytical and toxicity data will be
validated.  Reconciliation of laboratory and field duplicates shall be the responsibility of the
USEPA project manager.
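
The relative percent difference between a sample result and its duplicate is commonly computed as
RPD = |x1 - x2| / ((x1 + x2) / 2) x 100. A minimal sketch of this comparison (the 20% acceptance
limit is only a placeholder; the limits specified in the methods and SOPs govern):

    def relative_percent_difference(x1, x2):
        """RPD between a sample result and its duplicate, in percent."""
        mean = (x1 + x2) / 2.0
        if mean == 0:
            return 0.0
        return abs(x1 - x2) / mean * 100.0

    # Example: a hypothetical result of 42 mg/kg and its duplicate at 38 mg/kg
    rpd = relative_percent_difference(42.0, 38.0)
    acceptable = rpd <= 20.0  # placeholder limit; use the SOP-specified criterion
    print(f"RPD = {rpd:.1f}% -> {'acceptable' if acceptable else 'review required'}")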

Finally, the USEPA project manager will compare the laboratory methods and results to the
QA/QC Review checklists contained in Appendix J. Separate checklists are provided for chemistry
data and toxicity data. Any critical problems identified by these checklists that cannot be rectified
through corrective actions may be cause for rejecting portions or all of the data provided.
11.    References

MacDonald, Donald D., Ingersoll, C.G., and Berger, T.A., 2000, "Development and Evaluation
of Consensus-Based Sediment Quality Guidelines for Freshwater Ecosystems", Archives of
Environmental Contamination and Toxicology, 39:20-31

NATO, 1988. International Toxicity Equivalency Factor (I-TEF) Method of Risk Assessment for
Complex Mixtures of Dioxins and Related Compounds. North Atlantic Treaty Organization,
Report Number 176.

Persaud, D., Jaagumagi, R., and Hayton, A. (1993), "Guidelines for the Protection and
Management of Aquatic Sediment in Ontario", Water Resources Branch, Ontario Ministry of the
Environment, Toronto, Ontario.

USEPA (2000), Methods for Measuring the Toxicity and Bioaccumulation of Sediment-
Associated Contaminants with Freshwater Invertebrates, Office of Research and Development,
U.S. Environmental Protection Agency, Washington, DC, EPA/600/R-99/064.

-------
QAPP Quality Assurance Project Plan

CONTROLLING THE SPREAD OF SWALLOW-WORT

The Nature Conservancy
State University of New York-College of Environmental Science and Forestry
Environmental Protection Agency

Project Manager:
Kris Agard, The Nature Conservancy	

Quality Assurance Manager:
Dr. Dudley J. Raynal, SUNY-ESF	
EPA Project Manager:
Robert Beltran	
EPA Quality Assurance Manager:
Louis Blume
Originator:
Frances M. Lawlor, SUNY-ESF

-------
A2 - TABLE OF CONTENTS

A3 - Distribution List                                       page  3
A4 - Project/Task Organization                                     3
A5 - Problem Definition/Background                                4
A6 - Project/Task Description                                      5
A7 - Quality Objectives and Criteria for Measurement Data             7
A8 - Project Narrative                                            10
A9 - Special Training Requirements/Certification                  10
A10 - Documentation and Records                                   10
B1 - Sampling Process Design (Experimental Design)                10
B2 - Sampling Methods Requirements                                13
B3 - B4                                                           13
B5 - Quality Control Requirements                                 13
B6 - B7 - B8                                                      13
B9 - Data Acquisition Requirements                                14
B10 - Data Management                                             14
C1 - Assessment and Response Actions                              15
C2 - Reports to Management                                       15
Literature cited                                                  16
Appendix                                                       17

-------
A3 - DISTRIBUTION LIST

The following are to receive copies of the QAPP and any subsequent revisions of this portion
of the project:

EPA:  Louis Blume, QA Manager
Robert Beltran, Project Manager

The Nature Conservancy: Kris Agard, Director of Science and Stewardship

SUNY-ESF Research Foundation: Dr. Dudley Raynal, Technical Representative and
       Principal Investigator
             Frances Lawlor, Graduate Student Research Assistant
             Robert Mason, Administrative Representative

NYS Department of Environmental Conservation:
             Bernie Davies, Region 6 Office, Lowville Subregion

United States Fish and Wildlife Service: Tracy Gingrich, Biologist, Montezuma National
       Wildlife Refuge


A4 - PROJECT/TASK ORGANIZATION

       This project, Controlling the Spread of Swallow-wort, is one part of a larger, three-part
project, Restoration of Habitats and Natural Processes on the Great Lakes Plain of New
York State, Assistance ID# GL985591-01-0, to be executed under the auspices of The
Nature Conservancy (TNC). Research is subcontracted to the State University of New
York College of Environmental Science and Forestry (ESF). Kris Agard is Director
of Science and Stewardship at TNC and will be project manager. Dr. Dudley Raynal,
Distinguished Teaching Professor, is the Principal Investigator and Technical Representative
for ESF. He will provide professional oversight for the work of the graduate
research assistant. Frances Lawlor, graduate research assistant, will do the field
work and statistical analysis and prepare reports to the funding agency and project
principals. Quality Assurance will be managed by Dr. Dudley Raynal for ESF and TNC.

The following organizational chart shows the relationships and the lines of
communication among all project participants. Project design, field work and reports will
be generated by the graduate research assistant. All work will be reviewed by the
Principal Investigator before being sent on to TNC and EPA. Study location management
will also receive reviewed reports.

-------
Figure 1. Relationships and lines of communication among cooperators.
[Organizational chart: EPA; TNC project manager (Kris Agard); Principal Investigator,
SUNY-ESF (Dr. Raynal); NYS certified pesticide applicator; USFWS Montezuma National
Wildlife Refuge; NYS-DEC Henderson Shores Unique Area.]

A5 - PROBLEM DEFINITION/BACKGROUND

       Swallow-wort (Cynanchum rossicum) is an exotic, twining herbaceous perennial
found on calcareous soils in the northeastern U.S. and adjacent Canada. A member
of the milkweed family, the species, introduced from Europe more than a century ago,
exhibits highly invasive characteristics and is capable of displacing native species over
extensive areas.  The species is expanding through much of the Great Lakes area.

       Capable  of forming dense monocultures, swallow-wort grows well under a wide
range of light conditions on limestone derived soils.  It occurs beneath forest canopy, in
thickets and in many open habitats. It produces copious amounts of wind-dispersed  seeds,
making control difficult once populations become established.

       While the distribution and life history characteristics of swallow-wort have been
investigated (Sheeley 1992, Sheeley and Raynal 1996), effective control methods have
not been developed. Preliminary control experiments at one of The Nature Conservancy's
preserves have shown that both hand pulling and herbicide application alone are
ineffective. Cleared areas are quickly reinvaded by swallow-wort or other invasive
weeds, including garlic mustard (Alliaria petiolata) and buckthorn (Rhamnus cathartica).
It is likely that effective control measures will require both herbicide treatment and the
planting of desirable native plants.

       We need to know which herbicide treatments are most effective in controlling
swallow-wort. We also need to obtain information on effective restoration techniques that
will encourage establishment of native species after  swallow-wort is  removed.
       Swallow-wort has invaded roadsides, state parks and wildlife management areas,
federal lands such as the Montezuma Wildlife Refuge as well as state and private lands in
central New York. The plant has invaded significant habitats such as Great Gully in
Cayuga county, Rush Oak Openings in Monroe County, and the alvar communities in
Jefferson County. Further it threatens sites in Onondaga County that support state and
federally protected plant species. Land managers require information about effective
control of swallow-wort and control technique details. Knowledge of swallow-wort
control is essential to provide a basis for maintaining ecological integrity in central New
York and throughout the Great Lakes Basin where swallow-wort is spreading.
A6 - PROJECT/TASK DESCRIPTION

       In order to standardize herbicidal application, two herbicides will be tested for
effectiveness in controlling swallow-wort under three different light conditions, i.e.,
under forest canopy (70% to 100% canopy cover), edge (30% to 70% canopy cover), and
full sun (0 to 30% canopy cover). Glyphosate, in three concentrations of Roundup (2.5%,
5%, and a 50% cut stem application), and triclopyr, in two concentrations of Garlon 4
(25% cut stem application and 1% foliar spray), will be tested. Glyphosate is a broad
spectrum herbicide, whereas triclopyr is limited to killing broadleaf plants. Treated plots
will be subdivided and seeded with one of three nurse crop species to test establishment
of a cover crop for the prevention of re-establishment of swallow-wort.  A baseline
survey of site and plot characteristics including vegetation presence and percent cover,
percent swallow-wort cover, density, soil type, light exposure, aspect, slope, topographic
position, moisture regime,  and disturbance history will be made. Changes in vegetation
coverage will be monitored throughout the project. The cover crop establishment will be
measured for percent coverage at the end of each growing season.

Field activities will be conducted in accordance with state, local, and OSHA requirements and
with industry guidelines for environmental and health safety and operator training.
Herbicides will be U.S. EPA registered. All waste and hazardous waste materials,
containers, and equipment associated with pesticides or herbicides will be treated or
disposed of in  an environmentally  sound manner at approved facilities consistent with
federal and state law and regulations and local ordinances.

       Personnel trained in plant identification and field sampling techniques, as well as
being certified in pesticide application, will be required.

       Assessment of field work will be provided by the graduate research assistant's
steering committee, including Dr. Dudley Raynal, Kris Agard, and Dr. Don Leopold, also
of SUNY-ESF and on the Board of Directors for TNC, Central and Western New York
Chapter. Herbicide application technical oversight will be provided by George Spak,
N.Y.S. certified pesticide applicator.

The work will  be performed according to the following schedule:

A. Develop research program
1.  Select 3 potential research locations                      9/97
2.  Identify plants for use in restoration                     9/97
3.  Collect/obtain seeds of restoration plants                 9/97 - 10/98
4.  Conduct literature review                                  9/97, ongoing
5.  Submit QA/QC plan for EPA review and approval              180 days from contract origin

B. Implement research program
1.  Initiate monitoring program                                5/98
2.  Initiate swallow-wort control                              6/98 - 10/98
3.  Implement restoration                                      6/98 - 10/99

C. Analyze and report results
1.  Summarize data from field season                           12/98, 12/99
2.  Summarize restoration techniques                           12/98, 12/99
3.  Complete final research results                            12/99
4.  Document progress in photographic slides                   6/98 - 10/99
5.  Prepare quarterly reports                                  1998, 1999

A7 - Quality Objectives and Criteria for Measurement Data

       Three sets of study sites, one for each of the three light conditions, under forest
canopy, edge/gap, and full sun, will be established at three locations, Montezuma
Wildlife Refuge, Henderson Shores Unique Area, and Great Gully Preserve.  Study sites
will be divided into six 2m x 2m plots, each to receive a different herbicidal treatment.
Plots will be further subdivided into 4 subplots, each to be seeded with a different cover
crop treatment. Each location is considered a replication, providing 3 repetitions for each
treatment. The following diagrams  illustrate the study site design with 3  sites per
location, one for each light condition:

-------
Figure 2. Schematic diagram of study. Numbered plots are herbicidal treatments. Lettered
subplots are nurse/cover crop treatments.
[Schematic: at each location (Montezuma, Great Gully, Henderson Shores), one study site is
established for each light condition (full sun, edge/gap, under canopy); each site contains six
numbered herbicide treatment plots (1-6), and each plot is divided into four lettered
nurse/cover crop subplots (a-d).]

All study sites will be on land with public access and in management areas that will
provide the least possible disruption of study plots. Plots will have at least 70% swallow-
wort cover. Treatment will be assessed for effectiveness by percent cover of swallow-
wort after treatment and one year after treatment. Species composition and percent cover of the
vegetation will be surveyed before treatment, after the treatment has taken effect, at the beginning
of the second field season, and at the end of each field season. Nurse crop establishment will be
assessed at the end of September during both field seasons. Percent cover and density of
establishing species will be visually estimated.

Herbicide study field data sheets (Appendix 1):
       Pre-study site characteristics will be noted including: percent canopy cover (light
regime), swallow-wort population percent cover and density, woody and herbaceous
species and per cent cover; physical characteristics of slope, aspect, topographic position;
soil designation, texture and pH; moisture regime (mesic through xeric); evidence of
herbivory or disturbance. At spray time, date, time, temperature, soil moisture, time since
most recent rain, cloud cover, humidity, wind speed and direction, herbicide used,
concentration, application rate, hours until rain after spray, amount of rain within 24
hours, and days until maximum apparent effect will be documented. During the herbicide
effect period, on-site rain gauges and maximum/minimum thermometers will
measure rainfall and temperature. The daily and nightly temperature and humidity, as
recorded at the closest NOAA weather stations, will also be recorded. Study sites will
be surveyed again after maximum herbicide effect; changes in woody and herbaceous vegetation
percent cover, the density of live swallow-wort stems, and the percent herbicide effect on
swallow-wort during the first season will be recorded. An additional vegetation survey will
be done at the end of each swallow-wort growing season (early September), including
any additional signs of herbivory or disturbance.

Post-control restoration study field sheets (Appendix 2):
       Species seeded, rate of seeding, date seeded, date germinated, approximate
percent germination, weather data (as above) during germination and establishment, signs
of herbivory on the nurse crop, percent cover of the nurse crop at the end of the first season, on
the spray anniversary, and at the end of the second season, and the percent cover and density of
swallow-wort will be recorded for each subplot.

       Voucher specimens of predominant plant species will be collected and preserved
in the SUNY-ESF herbarium.  Pilot plots will be used to refine application techniques
before use in the field. Random resampling of 5 plots will be done to check for survey
accuracy. Photographic slides  will provide a visual record of herbicide effect.
A8 - Project Narrative
       see sections A5-A7.

A9 - Special Training Requirements/Certification
       Taxonomic skill and field sampling skills have been assured through successful
completion of appropriate courses at SUNY-ESF. The field worker has completed the 30-
hour pesticide applicator's training course and has provisional certification.

A10 - Documentation and Records

       Sample Field record sheets are included in Appendix 1. Voucher plant specimens
will be kept in the SUNY-ESF Herbarium. Calibration of pesticide application rate will
be done according to equipment manufacturer's instructions. Pesticide concentrations will
be according to manufacturer's instruction.

-------
B1 - Sampling Process Design (Experimental Design)

       Three locations have been selected for the study: Henderson Shores State Unique
Area, Jefferson County, NY (longitude 76 degrees, 16', latitude 43 degrees, 51');  Great
Gully Preserve, Cayuga County, NY (longitude 76 degrees, 40', latitude 42 degrees, 48');
Montezuma National Wildlife Refuge, Seneca County, NY (longitude 76 degrees, 44',
latitude 42 degrees, 58'). All locations have been selected for the presence of large
populations of swallow-wort, for ease of public access and low probability of site
disturbance. At each location, study plots will be established under 3 light conditions:
under forest canopy (70%-100% canopy cover), forest gap or forest edge (30%-70%
canopy cover), and full sun  (0%-30% canopy cover). In locations with large areas of at
least 70% swallow-wort cover, one study site from a selection of three sites for each light
condition will be randomly  established. In locations with smaller areas of dense swallow-
wort cover, the study site will be established where space allows. Each of the three
locations, Henderson Shores Unique Area, Great Gully and Montezuma Wildlife Refuge,
provides a replication of light requirements for the herbicide and nurse crop
establishment study. Once the sites are established, site characteristics will be recorded,
including the following: light regime, swallow-wort percent cover, swallow-wort density,
slope, aspect, topographic position, soil characteristics (texture, drainage, pH), moisture
regime (mesic to xeric). Overstory and shrub  layer woody vegetation will be sampled by
species presence and  percent cover, both total and by species. The herbaceous layer will
be likewise sampled.

       Six plots will  be established in each study site, one  for an untreated control and
five for the herbicidal treatments: 2.5% glyphosate, 5% glyphosate, 50% glyphosate cut
stem application, triclopyr 1% and 25% triclopyr cut stem application. Treatments will be
randomly assigned to the 6 plots. Herbicides will be applied with a D.B. Smith "Field
King" backpack sprayer. Recommended timing for related  milkweed family species,
Asclepias syriaca, is at peak flower bud stage (Cramer and Burnside 1981). Lowest
second year survival of A. syriaca after glyphosate treatment is when treatment is at
flower bud stage of growth (Bhowmik 1982). The 2.5% concentration has been selected for
trial because a study with Canada thistle, Cirsium arvense, indicated that greater
concentrations are toxic to leaf tissue and impede herbicide translocation (Boerboom
and Wyse, 1988). Five percent glyphosate is the recommended concentration for
milkweed. The cut stem applications will direct the herbicide to the exact plant desired
without concern for spray drift. Triclopyr is commonly used in natural areas as a broadleaf
herbicide and poses less of a soil residual problem than most broadleaf systemic herbicides. If
graminoid species are unaffected by the herbicidal treatment, reestablishment of invasive
plants may be slowed as the grasses are able to use resources that would otherwise be taken
up by the competing invasives. Triclopyr concentration will be at the label recommended
concentration, applied when plants are in flower.

       Herbicide applications will be documented according to concentration,
application method, application rate, time of day and date applied, temperature, soil
moisture and  cloud cover at time of application, as well as  hours or days since last
rainfall, hours until rain after spray, amount of rain if rain occurs within 24 hours of
spray. Also recorded will be days until maximum apparent effect is obtained, density of
live stems and per cent effect of each treatment. Treatment effects to be looked for will
include yellowing, mottling, distortion of leaves and stems, early senescence and any
atypical symptoms as compared to the no herbicide treatment control plots. Vegetation
species composition and cover changes will be measured in the shrub and herbaceous
populations. The vegetation will  be sampled after a 15 day burndown period and again at
the end of the swallow-wort growing season, when most neighboring swallow-wort
plants are showing signs of senescence, usually the end of August to beginning of
September in our area.

       After the herbicide treatment has had maximum apparent effect, three herbicide
treatment plots will be divided, creating 4 subplots in each selected plot. These subplots
will be used to study the feasibility of using a nurse crop to prevent re-establishment of
invasive species and to set the stage for succession of native plants. One subplot will be
used as a control for herbicidal treatment with no seeding treatment.  The  other subplots
will be seeded with one of three graminoid nurse crop  species. Treatments will be
randomly assigned to the subplots. The three wild ryes, Elymus virginicus, Elymus
villosus and Elymus hystrix are native to central New York and occur  in all counties
involved in the study (New York  State Flora Association, 1990). Rather than seed all 6
herbicide treatment plots, I will only seed the label rate glyphosate plots,  the triclopyr
plots and the 20% glyphosate bundled and cut stem plots. This will reduce the number of
seeding subplots from 216 to 108. The comparison in these subplots  is  seeding
treatments. It will be assumed that the herbicide treatments are effective.  The no
herbicide treatment control plots will not be seeded, as swallow-wort has demonstrated
ability to exclude successful competitors, hence its success as an invader. Date seeded,
rate of seeding, date of germination, percent germination, percent cover at end of first
season, percent cover at spray anniversary, and percent cover end of second year,
evidence of disturbance or herbivory of nurse crop will all be documented.

       The end-of-study summary will evaluate the combined effect of the herbicidal
treatment and nurse crop treatment on swallow-wort coverage and on the plant
community in general. Successful herbicidal treatment will show a persistent decrease in
swallow-wort presence by 90% or more. Success of cover crop treatment will be assessed
by ability to prevent the establishment of new swallow-wort seedlings.

       The split-plot design of nine study sites, each with 6 herbicide plots, will allow 30 degrees
of freedom for the light and herbicide treatments, and 54 degrees of freedom for the light
and herbicide/seeding treatments. Study uncertainties will be in quantifying percent cover,
in taxonomy, and in herbicide and seed application rates. A blind recheck of plant
identification and percent cover estimates of 5 out of 54 plot surveys will be done to
assure accuracy and precision. The pilot spray and seeding plots will assure proper
application rates. The herbicide pilot plots will be done under the supervision of George
Spak, a New York State certified pesticide applicator in the horticultural and forestry classes.

B2 - Sampling Methods Requirements

       Estimate of percent cover is subjective, but will be randomly verified. Plant
identification will be according to Gleason and Cronquist (1991). The project has
sufficient replication to minimize unforeseen events.

B3 - Sample Handling and Custody Requirements.  N/A

B4 - Analytic Methods Requirements    N/A

B5 - Quality Control Requirements

       Quality control will be provided by voucher specimens, pilot plots for herbicide and seed
treatments, and oversight by a New York State certified applicator for herbicide application
rates. See section B1 for details.

B6 - B7 - B8 - N/A

B9 - Data Acquisition Requirements (Non-direct Measurements) N/A

-------
B10 - Data Management

       Sun, edge and shade study sites are levels of the whole plot factor of light in a
completely randomized block design at 3 locations. Herbicide plot treatments are the
subplot factor. Seeding subplot treatments are the sub-subplot factor. The split plot,
further divided into a split-split plot, satisfies the need to maintain spatial control of the
plots. Degrees of freedom (df) are provided as follows:
Herbicide treatment:                         df
block                                         2
light                                         2
Error A                                       4
Herbicide                                     5
Herbicide x light                            10
Error AB                                     30
TOTAL                                        53

Seeding treatment:                           df
block                                         2
light                                         2
Error A                                       4
Herbicide (3 treatments)                      2
Herbicide x light                             4
Error AB                                     12
Seeding                                       3
Light x seeding                               6
Herbicide x seeding                           6
Light x herbicide x seeding                  12
Error ABC                                    54
TOTAL                                       107

There is low power to detect differences in the light treatment, as indicated by the Error A test,
but this comparison is a low priority objective. Error AB and Error ABC have large
enough numbers of degrees of freedom that meaningful biological differences will be
detectable. The herbicide and herbicide x light effects are tested against the Error AB term.
Seeding (establishment), light x seeding, herbicide x seeding, and light x herbicide x seeding are
tested against the Error ABC term. The significance level for hypothesis testing will be 0.05. This
study is a screening process to identify useful treatments; there are no critical health issues, and we
do not want to restrict candidate treatments by using too small an alpha level. The null
hypothesis will be that all treatments are equal; the alternative hypothesis will be that at
least two treatments differ. Overall, an F-test will be used against the null hypothesis.
Further, Waller-Duncan tests will be used to compare pairs of treatments. It is expected that
SAS or Minitab will be used for statistical analysis of the data.
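
As a quick arithmetic check on the error degrees of freedom in the tables above (a sketch of the
standard split-plot formulas, not the planned SAS or Minitab analysis):

    # Split-split-plot degrees of freedom check (blocks = the 3 locations).
    r, a = 3, 3              # blocks (locations) and whole-plot factor levels (light)
    b_all, b_seeded = 6, 3   # herbicide treatments in the full and seeded analyses
    c = 4                    # seeding (sub-subplot) treatments

    error_a = (r - 1) * (a - 1)                     # 4
    error_ab = a * (r - 1) * (b_all - 1)            # 30 for the herbicide analysis
    error_ab_seeded = a * (r - 1) * (b_seeded - 1)  # 12 for the seeding analysis
    error_abc = a * b_seeded * (r - 1) * (c - 1)    # 54

    print(error_a, error_ab, error_ab_seeded, error_abc)  # 4 30 12 54
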
C1 - Assessment and Response Actions

       The results of the first season work will be the basis for the research assistant's
Master's thesis. Execution of this project will be subject to oversight by the steering
committee listed on page 5 in section A6. Oversight of possible uncertainties will be as
described in section B1, page 13. The quarterly and final reports will be reviewed by the
TNC project manager and the SUNY-ESF principal investigator.

-------
C2 - Reports to Management




      See section A6 (C).

-------
Literature Cited

Bhowmik, P.C., 1982. Herbicidal control of common milkweed (Asclepias syriaca).
Weed Science 30: 349-351.

Boerboom, C.M. and D.L. Wyse, 1988. Influences of glyphosate concentration on
glyphosate  absorption and translocation in Canada thistle (Cirsium arvense). Weed
Science 36: 291-295.

Cramer, G.L. and O.C. Burnside, 1981. Control of common milkweed (Asclepias
syriaca). Weed Science 29: 636-640.

Gleason, Henry A. and Cronquist, Arthur. 1991. Manual of vascular plants of
Northeastern United States and adjacent Canada. Second edition.  New York Botanical
Gardens, New York.

Sheeley, Scott. 1992. The distribution and life history characteristics of Swallow-wort
(Vincetoxicum rossicum). Master's thesis. SUNY-ESF.

Sheeley, S.E., and D.J. Raynal, 1996. The distribution and status of species of
Vincetoxicum rossicum in eastern North America. Bulletin of the Torrey Botanical
Club 123(2): 148-156.

-------
              Project Planning and Documentation for Projects using Existing Data
   Illustration of the checklist, Using Data from Other Sources - A Checklist for Quality Concerns

Project planning for modeling projects is important in order to ensure that the model is scientifically
sound, robust and defensible. EPA's Quality Staff has developed a draft checklist for use in planning
modeling and other projects using existing data, Using Data from Other Sources - A Checklist for
Quality Concerns. A past GLNPO modeling project, Lake Erie Total Phosphorus Loads, illustrates how
the checklist can be used for planning and documenting a project, as detailed below.

1.  Identify the decision you are making or the project objectives: The overall objective of the project was
to  revive phosphorus  load estimation efforts for the Great Lakes, using Lake Erie (1996-2000) as an
example. The information required by GLNPO was the estimate of total phosphorus load for Lake Erie
for the years 1996-2000. The over-arching decision was whether changes in environmental management
were needed to address phosphorus loads to Lake Erie.

2.  Identify the data and information from outside sources proposed for the project: A list of government
agency databases was identified and used to provide data for input to study models.

3.  Determine whether the data have any constraints affecting their use in the new project: The
government agency databases that were used were verified to be available and accessible for use in the
project.

4.  Determine where the acquired data will be used in the decision making process: The data were used as
inputs to the phosphorus loading model. Use of the data and model calculations were detailed.

5.  Scrutinize data for quality concerns pertinent to the intended use: The Principal Investigator (PI) and
GLNPO noted that the responsibility for basic data review, validation and verification was with the
agency that collected the data. However, additional data screening for the purpose of load estimation was
conducted prior to input to the model. The PI used statistical programs to identify outliers by checking
for internal consistency and comparing to historical information (an illustrative sketch of this kind of
screening appears after this checklist). A plan was developed to investigate
outliers and determine when data points would not be used in the model. Another quality concern
involved the critical assumption that the quality objectives and criteria associated with the targeted
agency databases would be adequate  for the purposes of this project. GLNPO and the PI noted that
numerous  studies have been conducted in the past to ensure that this was the case. Data comparability
also was a concern. A main project requirement was that the monitoring data should be obtained in the
same way  as previous load estimates  to ensure comparability to historical data. The PI determined that
the flow and total phosphorus data currently being generated by the agencies responsible for the existing
databases are of comparable quality to data reported by these agencies previously. The PI noted one
exception and developed a plan to address this data issue.

6.  Document your analysis plan in a Quality Assurance Project Plan (QAPP): GLNPO and the PI
developed a QAPP according to EPA QA/R-5, EPA Requirements for Quality Assurance Project Plans.
In  accordance with the graded approach, the QAPP noted when components  discussed in R-5 were not
applicable to the project. For example, Section B.4 Sampling Methods, was  cited as "not applicable"
because the project used existing data and environmental sampling was not a component of the project
(see more on this at the end of this  section). The plan also included a detailed description and citations
for peer reviewed  equations used in the model.

7.  Execute your analyses and document the outcome appropriately: The QAPP included a schedule for
model completion and development of a final report.  As noted under Step 5, a main project  requirement
was data comparability to historical data.  To address this, the PI used standard methods for load
estimation and documented any deviations in the final report.
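
As referenced in item 5 above, the following is a minimal sketch of the kind of outlier screening
described there; the column names, values, and historical bounds are hypothetical, and this is not
the PI's actual screening program:

    import pandas as pd

    # Hypothetical daily tributary records: flow (m3/s) and total phosphorus (mg/L).
    data = pd.DataFrame({
        "flow_m3s": [45.0, 52.3, 48.1, 900.0, 50.2],
        "tp_mgL":   [0.032, 0.041, 0.038, 0.620, 0.036],
    })

    # Assumed screening bounds drawn from historical records for a station; real bounds
    # would come from the agency database and historical information being used.
    historical_bounds = {"flow_m3s": (1.0, 400.0), "tp_mgL": (0.005, 0.500)}

    def flag_outliers(df, bounds):
        """Return a boolean mask marking rows with any value outside its historical range."""
        mask = pd.Series(False, index=df.index)
        for col, (lo, hi) in bounds.items():
            mask |= (df[col] < lo) | (df[col] > hi)
        return mask

    data["flag_for_review"] = flag_outliers(data, historical_bounds)
    print(data)  # flagged rows are investigated before deciding whether to exclude them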

-------
       "Lake Erie Total Phosphorus Loads, 1996-2000"

                  Quality Assurance Project Plan
                          Submitted to
           U.S. EPA Great Lakes National Program Office
                               and
                       U.S. EPA Region V
                           Chicago, IL
                               By
                         David M. Dolan
                University of Wisconsin-Green Bay
               Natural and Applied Sciences, ES-317
                         2420 Nicolet Dr.
                       Green Bay, WI 54311

                          920-465-2986
                        doland@uwgb.edu
A1. Approvals
David M. Dolan, Principal Investigator & Project Manager, UWGB       Date
David Rockwell, Project Manager, USEPA GLNPO                    Date
Louis Blume, Quality Assurance Manager, USEPA GLNPO              Date

-------
A2. Table of Contents
  A1. Approvals	1
  A2. Table of Contents	2
  A3. Distribution List	3
  A4. Project/Task Organization	4
  A5. Problem Definition/Background	4
  A6. Project/Task Description	5
  A7. Quality Objectives and Criteria for Measurement Data	5
  A8. Special Training/Certification	6
  A9. Documentation and Records	6
  SECTION B. DATA GENERATION AND ACQUISITION	6
  B1. Sampling Process Design	6
  B2. Sampling Methods	6
  B3. Sample Handling and Custody	6
  B4. Analytical Methods	6
  B5. Quality Control Requirements	7
  B6. Instrument/Equipment Testing, Inspection and Maintenance	7
  B7. Instrument/Equipment Calibration and Frequency	7
  B8. Inspection/Acceptance of Supplies and Consumables	7
  B9. Data Acquisition Requirements	7
  B10. Data Management and Load Estimation	8
  SECTION C. ASSESSMENT AND OVERSIGHT	9
  C1. Assessment and Response Actions	9
  C2. Reports to Management	9
  SECTION D. DATA VALIDATION AND USABILITY	9
  D1. Data Review, Verification and Validation	9
  D2. Verification and Validation Methods	9
  D3. Reconciliation with User Requirements	9
  REFERENCES	11

-------
A3. Distribution List

Louis Blume
U.S.EPA - Great Lakes National Program Office
77 W Jackson Blvd.
Chicago, IL 60604-3590

David Rockwell
U.S.EPA - Great Lakes National Program Office
77 W Jackson Blvd.
Chicago, IL 60604-3590

David M. Dolan
University of Wisconsin - Green Bay
Natural and Applied Sciences, ES-317
2420 Nicolet Dr.
Green Bay, WI 54311

-------
A4. Project / Task Organization

Dr. Dolan is the Principal Investigator of this project, and as such has the responsibility to
oversee all aspects of this project. The PI will review all load estimates made by his
personnel, and review the performance of data management by his personnel; he will
direct and evaluate any necessary corrective action. He has the overall responsibility to
ensure the quality of all loads estimated by this project.

Dr. Dolan reports to the EPA Great Lakes National Program Office (GLNPO) Project
Officer, David Rockwell or his designate. The QAPP is reviewed and approved by the
EPA GLNPO Quality Assurance Officer, Louis Blume.

Dr. Dolan will provide interpretation of the data generated by this project in coordination
and cooperation with appropriate staff from EPA GLNPO.
A5. Problem Definition / Background

One of the best developed indicators of progress in achieving Great Lakes water quality
goals is the mean annual loading of total phosphorus to each lake. For the past 25 years,
particular emphasis has been placed on total phosphorus because the International
Reference Group on Great Lakes Pollution from Land Use Activities (PLUARG)
recommended target loads for each of the lakes based on detailed analysis of their
eutrophication status. Tributary monitoring programs and associated point source and
atmospheric deposition monitoring allowed progress in meeting these targets to be
reported throughout the 1980s and into the 1990s. Key components of the total loading
such as municipal and non-point sources were tracked as remedial control measures were
put in place. Improvements in computer availability and networking allowed greater
accessibility and quicker reporting of results. Unfortunately, budget cuts to environmental
programs in the mid 1990s had a dramatic effect on monitoring efforts, particularly on
tributary  sampling. No total  phosphorus load estimates are available for any lake after
1995.

The Lake Erie Millennium Plan (LEMP) is a binational effort of research organizations,
including the U.S. EPA, "to foster and coordinate research that will identify and solve
basic ecological questions relevant to the Lake Erie Ecosystem through a binational,
collaborative network." At a recent LEMP event (The Lake Erie in the Millennium
Conference, March 28-29, 2001, at the  University of Windsor), U.S. and Canadian
researchers presented the latest information on the ecosystem components and processes
that were changing most rapidly and/or were of the greatest concern. Without exception,
investigators reporting on water quality, plankton communities, fish stocks and the food
webs that link these components noted  that their efforts have been hampered by a lack of
knowledge of nutrient loads to Lake Erie. Further, the key nutrient that drives many
aspects of the ecosystem is still phosphorus.

-------
A6. Project / Task Description

The overall objective of this project is to revive phosphorus load estimation efforts for the
Great Lakes, using Lake Erie (1996-2000) as an example. Point source and atmospheric
deposition monitoring results are available to allow continued estimation of these
components. Several key watersheds are still being monitored, making some tributary
load estimation possible. Additional tributary data are available from the U.S.G.S. study
conducted from 1996-1998.

Specific tasks include: 1. Assess status of phosphorus data needed for Lake Erie load
estimation. 2. Collate data from U.S. and Canada into common formats required for
computation. 3. Make estimates for each component (tributary, point source and
atmospheric) of total lake loading including unmonitored areas. 4. Prepare tables and
graphs of results showing subtotals for lake basin (Western, Central and Eastern) and
source (point, nonpoint, etc.). 5. Assess feasibility of extending approach to the other four
lakes. 6. Recommend improvements to streamline load estimation process. 7. Train
environmental science students to understand and conduct load estimation studies.

A7. Quality Objectives and Criteria for Measurement Data

No new environmental measurement data will be obtained during this project. However,
data are to be acquired from government agencies as indicated in B9 below. The
assumption that the quality objectives and criteria of these agencies are adequate for the
purposes of this project is critical to the completion of this study.
Numerous studies have been conducted in the past to ensure that this was indeed the case.
Further studies of this nature are beyond the scope of this project.

In the opinion  of the PI, the flow and total phosphorus data currently being generated by
the agencies in B9 below are of comparable quality to data reported by these agencies
previously with one notable exception. Some tributary total phosphorus concentrations
are being reported as "less than the detection limit" or censored. When data are reported
in this way, it is not possible to estimate phosphorus loads directly. This problem is being
dealt with in one of two ways depending on the availability of additional data.

If additional data are available on the same sampling  date that are not censored (from
other agencies at the same location or duplicate sampling), the censored value is
discarded.

If additional data are not available, a replacement value is imputed to the data set using
the following procedure (a simplified sketch follows the list):

       1.  Loads for all sampling dates for a given tributary and year are calculated by
          taking the product of flow and total phosphorus concentration or the detection
          limit (whichever has been reported).
       2.  These loads are log-transformed to ensure admissible estimates.

-------
       3.  Replacement loads for all censored sampling dates are estimated by the MLE
          procedure developed by El-Shaarawi and Dolan (1989).
       4.  Replacement total phosphorus concentrations are calculated by dividing the
          replacement load by the flow for that date.
       5.  Replacement values are substituted into the data set for purposes of stratified
          ratio estimation. These values are flagged for identification purposes.
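
The sketch below, in Python, shows one generic maximum likelihood imputation for censored
lognormal data. It illustrates the general approach only; it is not the El-Shaarawi and Dolan
(1989) procedure itself, and the function and variable names are assumptions made for the example.

    # Illustrative sketch only: a generic censored lognormal MLE, not the exact
    # El-Shaarawi and Dolan (1989) procedure. Names are hypothetical.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def impute_censored_loads(loads, censored):
        """loads: daily loads (flow x concentration, or flow x detection limit
        where censored); censored: True where the concentration was censored."""
        x = np.log(np.asarray(loads, dtype=float))        # step 2: log-transform
        cens = np.asarray(censored, dtype=bool)

        def neg_loglik(params):
            mu, log_sigma = params
            sigma = np.exp(log_sigma)
            # uncensored dates contribute the normal density of the log load;
            # censored dates contribute P(log load < log detection-limit load)
            ll = norm.logpdf(x[~cens], mu, sigma).sum()
            ll += norm.logcdf(x[cens], mu, sigma).sum()
            return -ll

        start = np.array([x.mean(), np.log(x.std(ddof=1))])
        mu, log_sigma = minimize(neg_loglik, start, method="Nelder-Mead").x
        sigma = np.exp(log_sigma)

        # replacement load = conditional mean of the fitted lognormal below the limit
        out = np.asarray(loads, dtype=float).copy()
        for i in np.where(cens)[0]:
            z = (x[i] - mu) / sigma
            out[i] = np.exp(mu + 0.5 * sigma ** 2) * norm.cdf(z - sigma) / norm.cdf(z)
        return out    # dates where censored is True should be flagged downstream

Replacement concentrations would then be obtained by dividing each replacement load by that
date's flow, and the substituted values flagged, as in steps 4 and 5 above.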

A8. Special Training / Certification

The personnel at the University of Wisconsin - Green Bay involved in completing the
technical aspects of this project have strong backgrounds in environmental science.
Further, while they are not computer programmers, they are familiar with databases,
spreadsheets, statistical packages and computer files. They also have training in statistics
beyond the introductory level.

The PI has twenty years of experience in conducting this kind of load estimation project.

A9. Documentation and Records

Project documentation will include notebooks, raw data files, final processed data (in
spreadsheet files), and summary tables. This information is available for review on site
by the EPA Project Officer or QA Officer.
SECTION B. DATA GENERATION AND ACQUISITION

Bl. Sampling Process Design

N/A.

B2. Sampling Methods

N/A.

B3. Sample Handling and Custody

N/A.

B4. Analytical Methods

N/A.

-------
B5. Quality Control Requirements

N/A.

B6. Instrument / Equipment Testing, Inspection and Maintenance

N/A.

B7. Instrument / Equipment Calibration and Frequency

N/A.

B8. Inspection/Acceptance of Supplies and Consumables

N/A.

B9. Data Acquisition Requirements

All of the data used to estimate Lake Erie Total Phosphorus Loads come from government
databases. The following is a list of the required data, database name (if any) and the responsible
agency.

U.S. Point Source dischargers in Lake Erie Basin (Monthly Average Effluent Flow and Total
Phosphorus Concentration). Database: Permit Compliance System (PCS). Responsible agencies:
U.S. EPA and States: New York, Pennsylvania, Ohio, Indiana and Michigan.

Canadian Point Source dischargers in Lake Erie Basin (Monthly Average Effluent Flow and Total
Phosphorus Concentration). Database: Municipal and Industrial Strategy for Abatement (MISA)
database. Responsible Agency: Ontario Ministry of the Environment.

U.S. Daily Average Tributary Flows for gauged Lake Erie tributaries. Database: WATSTORE.
Responsible Agency: U.S.G.S. Water Division.

Canadian Daily Average Tributary Flows for gauged Lake Erie tributaries. Database: Available
on CD-ROM. Responsible Agency: Environment Canada, Water Survey Canada.

U.S. Tributary Total Phosphorus Concentrations for monitored Lake Erie tributaries. Database:
STORET. Responsible Agencies: U.S. EPA, U.S.G.S., New York DEC, Ohio EPA, Michigan
DEQ.

Canadian Tributary Total Phosphorus Concentrations for monitored Lake Erie tributaries.
Database: Stream Monitoring System. Responsible Agencies:  Ontario Ministry of the
Environment.

Total Phosphorus Concentrations in Rainfall and Rainfall Amounts in the Lake Erie Basin.
Database: Available in spreadsheet format. Responsible Agency: Environment Canada.

-------
B10. Data Management and Load Estimation Methods

Once the data have been acquired from the sources identified in B9, their receipt will be
documented and they will be maintained as an archive. Copies will be made of the original
data (in "flat" or text-file format) for use with existing SAS and FORTRAN programs on
the EPA-NCC mainframe computer. This software is known to perform reliably.
These programs are used to prepare the data for the load estimation process, estimate the
load from individual and area sources, and summarize the resulting loads by type and
geographic sub-units. The methods used to estimate loads vary by type of source, but the
basic calculation involved is forming the product of concentration multiplied by volume
per unit time to produce the mass per unit time. This quantity is then averaged over the
required time period. For example, to estimate loads from point sources, the following
calculations (Dolan, 1993) can be used to report annual average point source phosphorus
loads:

       Loading = (Σ Ci Qi) / n        for i = 1, 2, ..., 12

where  Ci is the average total phosphorus effluent concentration for the ith month,
       Qi is the mean effluent flow for the ith month,
and    n is the number of months of monitoring.

These calculations are performed on a "per pipe" basis and the estimates are summed (for
multi-pipe facilities) to provide loads on a "per facility" basis.
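
As an illustration of the calculation above (and not the project's actual SAS/FORTRAN
implementation), the per-pipe annual average could be computed as follows; the unit handling
shown is an assumption made for the example.

    # Minimal sketch of the point source loading formula above; illustrative only.
    def annual_point_source_load(monthly_conc_mg_L, monthly_flow_m3_d):
        """Average of the monthly products Ci * Qi over the n months monitored.
        1 mg/L x 1 m3/day = 1 g/day, so dividing by 1000 gives kg/day."""
        pairs = [(c, q) for c, q in zip(monthly_conc_mg_L, monthly_flow_m3_d)
                 if c is not None and q is not None]       # months actually monitored
        n = len(pairs)
        return sum(c * q for c, q in pairs) / n / 1000.0   # kg/day, per pipe

    # illustrative numbers for a single pipe monitored all 12 months
    pipe_load = annual_point_source_load([1.2] * 12, [5000.0] * 12)   # -> 6.0 kg/day
    # summing the per-pipe results for a facility gives the per-facility load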

For monitored tributaries, the Stratified Beale's Ratio Estimator (Beale, 1962;  Tin, 1965;
Dolan et al., 1981) is used. Daily tributary loads are calculated on a yearly basis for each
tributary and then the data are stratified into one or more strata depending on the nature
of the flow and concentration relationship within each stratum. In general, tributaries with
greater than monthly sampling frequency are stratified into at least two strata.
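
A sketch of the Beale ratio estimator for a single stratum is shown below, in the bias-corrected
form commonly cited for Great Lakes tributary loads; some implementations substitute a finite
population correction (1/n - 1/N) for the 1/n terms, so this should be read as an illustration
rather than the project's exact code, and the names used are assumptions.

    import numpy as np

    def beale_stratum_load(sampled_flow, sampled_load, daily_flow):
        """sampled_flow, sampled_load: flow and daily load on the n sampled days
        in the stratum; daily_flow: the complete daily flow record for the stratum."""
        x = np.asarray(sampled_flow, dtype=float)
        y = np.asarray(sampled_load, dtype=float)
        n = len(x)
        mx, my = x.mean(), y.mean()
        mu_x = np.mean(daily_flow)                  # mean daily flow over all days
        s_xy = np.cov(x, y, ddof=1)[0, 1]
        s_xx = x.var(ddof=1)
        # Beale bias-corrected ratio of load to flow
        ratio = (my / mx) * (1.0 + s_xy / (n * mx * my)) / (1.0 + s_xx / (n * mx ** 2))
        return mu_x * ratio * len(daily_flow)       # stratum load = daily estimate x days

In practice the sampled days are first assigned to strata (for example, by flow regime), the
estimator is applied within each stratum, and the stratum loads are summed.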

For unmonitored tributaries, a unit area load (UAL) is estimated from nearby monitored
tributaries and applied to the unmonitored basin area. For details of this procedure, see
Rathke and McCrae (1989).

For atmospheric loadings, the flux of phosphorus in units of mass per area is estimated
from precipitation collectors and applied to the lake area  that the collector represents.
This is also detailed in Rathke and McCrae (1989).
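
The two area-scaling steps described above reduce to simple products; a minimal sketch, with
illustrative names and units, follows.

    def unmonitored_tributary_load(monitored_load, monitored_area, unmonitored_area):
        # unit area load (e.g., kg per km2 per year) from a nearby monitored
        # watershed, applied to the unmonitored drainage area
        unit_area_load = monitored_load / monitored_area
        return unit_area_load * unmonitored_area

    def atmospheric_load(phosphorus_flux, represented_lake_area):
        # flux in mass per unit area per year from a precipitation collector,
        # applied to the lake surface area that the collector represents
        return phosphorus_flux * represented_lake_area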
SECTION C. ASSESSMENT AND OVERSIGHT

Cl. Assessment and Response Actions

Dr. Dolan will monitor all project-related activities.

-------
The results of this project may be published, and if so will undergo anonymous outside
peer review by experts in the field.
C2. Reports to Management

Due to the short duration of this project, the PI will provide one progress report to the
EPA Project Officer in December 2001, which will summarize all progress to date. A
Final Project Report will be provided to the EPA Project Officer at the end of the project
that includes all project results.
SECTION D. DATA VALIDATION AND USABILITY

Dl. Data Review, Verification and Validation

The responsibility for basic data review, validation and verification lies with the agency
that collected the data. However, additional data screening for the purpose of load
estimation will be conducted. A series of SAS programs is used to identify outliers by
checking for internal consistency and comparing to historical information. In general,
data points that depart from the current and/or historical average by more than two
standard deviations are investigated.
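
The screening step is a two standard deviation test; the sketch below is illustrative only,
since the project's screening is implemented in SAS, and the function and argument names are
assumptions for the example.

    import numpy as np

    def flag_outliers(values, reference_mean, reference_sd, k=2.0):
        """Flag points departing from the current and/or historical average by
        more than k standard deviations (k = 2 per the text above)."""
        v = np.asarray(values, dtype=float)
        return np.abs(v - reference_mean) > k * reference_sd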
D2. Verification and Validation Methods

Data points that are flagged as potential outliers are investigated for assignable cause.
Adjustments can be made for causes such as a change in units or analytical method.
When the cause is failure of instrumentation or a missing sample, data points are deleted
and, if necessary, replacement values are imputed using accepted procedures. See A7
above.
D3. Reconciliation with User Requirements

The information required by the user is the estimate of total phosphorus load for Lake
Erie for the years 1996-2000. The main requirement is that the information should be
obtained in the same way as previous load estimates to ensure comparability. A statement
will be provided that documents how standard methods for load estimation were used and
that notes any deviations.

-------
REFERENCES

Beale, E.M.L. 1962. Some uses of computers in operational research. Industrielle
Organisation 31:51-52.

Dolan, D.M., Yui, A.K., and Geist, R.D. 1981. Evaluation of river load estimation
methods for total phosphorus. J. Great Lakes Res. 7(3): 207-214.

Dolan, D.M. 1993. Point source loadings of phosphorus to Lake Erie: 1986-1990.
J. Great Lakes Res. 19(2): 212-223.

El-Shaarawi, A. H. and Dolan, D.M. 1989. Maximum likelihood estimation of water
quality concentrations from censored data. Can. J. Fish. Aquat. Sci. 46(6): 1033-1039.

Rathke, D. E. and McCrae, G. 1989. Appendix B, Volume III, Report of the Great Lakes
Water Quality Board. International Joint Commission, Windsor, Ontario.

Tin, M. 1965. Comparison of some ratio estimators. J. Amer. Statist. Assoc. 60: 294-307.
                                       10

-------
                                    Appendix L
                              Peer Review Checklists
GLNPO Quality Management Plan - Appendix L                                               May 2008

-------
                                                                        GLNPO - Version 7/9/02
                      Great Lakes National Program Office
                        Required Peer Review Information
 If peer review for the work product has not yet been completed, please fill out sections 1-9 and 18A of this form.


1.     Title of Work Product: _______________________________________________

2.     Objective / Intended Use: ____________________________________________

3.     Abstract / Summary:    (attach document if necessary)

4.     Web Site Address: ____________________________________________________

5.     Cross-Cutting Science Issues:
       _____  Children's Health              _____  Environmental Justice
       _____  Cumulative Risk                _____  Genomics
       _____  Contaminated Sediments         _____  Indoor Environments
       _____  Older Americans                _____  Tribal Science

6.     Peer Review Leader: __________________________________________________

7.     Science Category:
       _____  Major Scientific/Technical     _____  Non-major Scientific/Technical
       _____  Major Economic                 _____  Non-major Economic
       _____  Major Social Science           _____  Non-major Social Science
       Other: ____________________

       Environmental Regulatory Model:
       _____  New                            _____  Modified
       _____  New Application                _____  N/A

       Environmental Medium:
       _____  Air            _____  Human Health      _____  Multimedia
       _____  Terrestrial    _____  Water             _____  Other

       Peer Review Type and Mechanism:
       _____  Internal  ____________________
       _____  External  ____________________
       _____  To Be Determined

-------
                                                                             GLNPO - Version 7/9/02
10.    Results of Peer Review Comments:
       _____  Substantive revision to final product      _____  Minor revision to final product
       _____  No significant change to final product     _____  Product was terminated
       _____  To Be Determined
11.    Peer Review Charge / Instructions:  (attach document if necessary)
12.    Peer Reviewer Name & Affiliation:
13.    Peer Review Comments:    (attach document if necessary)
14.    Management Decision on Comments:    (attach document if necessary)
15.    Location of Peer Review File:
16.    File Contact:
               Name, organization & telephone number, if different from Peer Review Leader.


17.    Additional Supporting Documentation / Comments:  (attach documents if necessary)
18.    External / Internal Peer Review Dates
       A.     Date of Projected Peer Review:	
       B.     Date Peer Review was Conducted:	
       C.     Date Peer Review was Completed:	
       D.     Date Final Peer Review Comments were received:	
       E.     Date of Management Decision on Peer Review Comments:_

-------
                                Appendix M
                 Protocol to Address Missed Requirements in
         Great Lakes National Program Office Assistance Agreements
GLNPO Quality Management Plan - Appendix M                                          May 2008

-------
       Protocol to Address Missed Requirements in GLNPO Assistance Agreements
       Although most recipients of GLNPO funding comply with requirements of grants,
cooperative agreements, and inter-agency agreements (collectively, Assistance Agreements),
some do not.  This written protocol addresses the failure of recipients to provide Quality System
Documentation, Final Reports, and Progress Reports, so that Project Officers know how and
when to engage the various tools at their disposal and so that GLNPO grants personnel and
GLNPO management know when to become involved.  This protocol provides a general
framework for addressing these issues, recognizing that the individual Project Officer is still
in the best position to exercise judgment in addressing Assistance Agreement issues with
recipients.

Problem - Quality System Documentation Not Provided

Day¹    Action                          Responsible                      Explanation

  45    Contact Recipient               PO                               PO calls and e-mails. QA Manager
                                                                         calibrates database.

  60    Contact Recipient               PO                               PO calls/e-mails. Must talk with
                                                                         recipient, not just voice-mail.

  90    Tele-conference.                PO, QA Mgr                       PO and QA manager talk with
        Delinquency Notice to PO                                         recipient, following up w/e-mail.

  97    cc of Delinquency Notice        PO, QA Mgr, Br Chief             PO convenes meeting. Branch Chief
        to Branch Chief                                                  calls counterpart w/recipient
                                                                         organization. PO follows up
                                                                         w/e-mail.

 127    Stage 2 Delinquency Notice      PO, QA Mgr, PBT Ldr,             PO convenes meeting w/Assistance
        to Branch Chief                 Br Chief, Assist Spec/Chief      Section to discuss options. Follow-up
                                                                         briefing of GLNPO Director.

¹ Days after the award is made. The time period would be adjusted if data collection is to occur before the 90th day
after an award is made, or if the grantee is cooperating.
                                        Page 1 of 2

-------

Problem - Quality System Documentation Not Provided (continued)

Day¹    Action                          Responsible                      Explanation

 134    Letter to Grantee from          PO, QA Mgr, PBT Ldr,             Lay out consequences. After 30 days,
        GLNPO Director.                 Br Chief                         suspend payments and make no new
                                                                         GLNPO awards. Debarment and
                                                                         suspension and withdrawal of funds
                                                                         2 weeks later. QA plan must be
                                                                         approved in advance of future awards
                                                                         for a two-year period.

 164    Teleconference. GLNPO           PO, QA Mgr, PBT Ldr,             GLNPO Director notifies recipient
        Suspension via letter from      Br Chief                         verbally and in writing. PO and
        GLNPO Director.                                                  Assistance Section notify Finance to
                                                                         withhold payments.

 178    Teleconference. Debarment       PO, PBT Ldr, Br Chief,
        & Suspension via letter and     Assist Spec/Chief, GLNPO
        phone call. Withdraw funds.     Director, RMD Director

                                        Page 2 of 2

-------
                                  Appendix N
                  Example of Delinquency Notification Letter
GLNPO Quality Management Plan - Appendix N                                             May 2008

-------
                       UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                             REGION 5
                              GREAT LAKES NATIONAL PROGRAM OFFICE
                                   77 WEST JACKSON BOULEVARD
                                       CHICAGO, IL 60604-3590
   DATE:

SUBJECT: Delinquent QAPP notice

   FROM: Louis Blume, QA Manager

    TO:        , Project Officer
This is a notification that the QAPP for the project entitled, "                          ,"
with grant number X, has not been submitted and it is now considered delinquent.

This delinquency signifies a departure from the contractual agreement between GLNPO and the grantee.
Office policy states that a QAPP must be submitted within 90 days of the grant award AND 30 days prior to the
commencement of any environmental data collection activities. This step is implemented to ensure that all
quality assurance issues are adequately resolved and integrated prior to any data collection activities and that
projects are funded within the appropriate time frame.  As a reminder, costs associated with data collection are
not allowable costs until the QAPP is approved by the GLNPO QA Manager.

Since the Annual QA Report to Headquarters is due on November 16, 1998, it is critical that prompt action be
taken to resolve this issue. Please convey this information to the Project Manager and develop an acceptable
action plan describing when the QAPP will be delivered.

If you have  any questions or concerns about this memorandum, please contact me as soon as possible.
CC:          , Branch Chief,
             , Team Leader*
* Your respective Branch Chief and Team Leader will be notified within 7 days of this announcement if an
acceptable action plan has not been received by the QA Manager.

-------
                                 Appendix O
             Quality System Documentation Status Tracking Sheet
GLNPO Quality Management Plan - Appendix O                                            May 2008

-------
                                                                QA Monthly Delinquency Status
        QUALITY SYSTEM DOCUMENTATION SUMMARIES AS OF 12/13/2001 FOR KNOWN GRANTS/IAGS/COOPS AWARDED DURING/AFTER 2000
Seriously Delinquent
P.O.          Team          P.I./Institute          Grant #          Title/Comments          Award Date


Delinquent but not significantly affecting quality
P.O.          Team          P.I./Institute          Grant #          Title/Comments          Award Date


Waiting for revisions
P.O.          Team          P.I./Institute          Grant #          Title/Comments          Award Date


Resolved Delinquency
P.O.          Team          P.I./Institute          Grant #          Title/Comments          Award Date


Legend - EMIT: Environmental Monitoring & Indicators Team, E: Ecological Protection Restoration Team, IST: Invasive Species Team,
SED: Sediment Assessment & Remediation Team, P2: Pollution Prevention Team
                                                                         Page 1 of 1

-------
                                    Appendix P
                            Example Audit Checksheet
       The following audit questionnaire was used in the EPA Direct Delayed Response
Program to consistently and objectively evaluate contract laboratories analyzing soil samples.
The example checklist below covers pH only and is not the complete questionnaire.
GLNPO Quality Management Plan - Appendix P                                                May 2008

-------
      ANALYTICAL LABORATORY ON-SITE EVALUATION QUESTIONNAIRE
                              DDRP SOIL SURVEY

                                   GENERAL
Date:
Laboratory:
Street Address:

Mailing Address
(if different from above):
City:                                State:                    Zip:

Laboratory Telephone
Number:

Laboratory Director:

Laboratory Quality
Assurance Officer (Quality
Control Chemist):

Type of Evaluation:

Contract Number:

Contract Title:
                                   Page 1 of 23

-------
                             GENERAL (continued)





Personnel Contacted:





              Name                                       Title
              Name                                       Title
                                  Page 2 of 23

-------
                       ORGANIZATION AND PERSONNEL
Laboratory Organization Chart:
                                  Page 3 of 23

-------
                         ORGANIZATION AND PERSONNEL

Laboratory Personnel:

                                      Academic            Special           Years
   Position          Name             Training*           Training       Experience**
  * List highest degree obtained and specialty. Also list years toward a degree.
** List only experience directly relevant to task to be performed.
                                      Page 4 of 23

-------
Item                                                                   Yes      No      Comment

Do personnel assigned to this project have
the appropriate educational background to
successfully accomplish the objectives of
the program?

Do personnel assigned to this project have
the appropriate level and type of experience
for the program?

Is the organization adequately staffed to
meet project commitments in a timely
manner?

Does the laboratory Quality Assurance
Supervisor report to senior management
levels?

Was the Project Manager available during
the evaluation?

Were chemists and technicians available
during the evaluation?

Was the Quality Assurance Supervisor
available during the evaluation?
                                       Laboratory Manager:
Item                                                                   Yes      No      Comment
Does the laboratory manager have his/her
own copy of the standard operating
procedures?

Does the laboratory manager have his/her
own copy of the instrument performance
data?

Does the laboratory manager have his/her
own copy of the latest monthly QC plots?

Is the laboratory manager aware of the most
recent control limits?

   a. The data itself?

   b. The quality control data sheet with
        analyst notes?

   c. The general instrument performance
        and routine maintenance reports?
                                            Page 5 of 23

-------
                      STANDARD OPERATING PROCEDURES (SOP)
Item                                                                   Yes      No      Comment
Does the laboratory have a standard
operating procedure (SOP) manual?

Is the SOP manual followed in detail?

Does the SOP manual contain quality
control practices?

Does each analyst/technician have a copy of
the SOP manual?

Does the SOP manual deviate from the
procedures required by the project?

If the SOP manual does deviate, are the
deviations documented in written form?

Does each analyst/technician have a copy of
all methods and procedures required by this
project?

Are plots of instrumental accuracy and
precision available for every analysis?

Are detection limit data tabulated for each
analysis?
                                          Page 6 of 23

-------
                                    LABORATORY FACILITIES
When touring the facilities, give special attention to:
   (1) the overall appearance of organization and neatness,
   (2) the proper maintenance of facilities and instrumentation,
   (3) the general adequacy of the facilities to accomplish
     the required work.
 Item                                                                  Yes      No      Comment
 Does the laboratory appear to have adequate workspace
 (6 linear meters of unencumbered bench space per
 analyst)?

 Does the laboratory have a source  of
 distilled/demineralized water?

 Is the specific conductance of distilled/demineralized
 water routinely checked and recorded?

 Are the analytical balances located away from drafts and
 areas subject to rapid temperature changes?

 Has the balance been calibrated within one year by a
 certified technician?

 Is the balance checked with a class S standard before
 each use and recorded in a logbook? Have the technician
 demonstrate how this is done.

 Are exhaust hoods provided to  allow efficient work with
 volatile materials?

 Have the hoods been checked for operating efficiency?
 How often is this done?

 Is the laboratory maintained in  a clean and
 organized manner?

 Are contamination-free work areas provided for the
 handling of toxic materials?

 Are adequate facilities provided for separate storage of
 samples, extracts, and standards, including cold storage?

 Is the temperature of the cold storage units recorded daily
 in logbooks?

 Are chemical waste disposal policies/procedures
 adequate?

 Are contamination-free areas provided for trace level
 analytical work?

 Can the laboratory supervisor document that trace-free
 water is available for preparation of standards and
 blanks?
                                              Page 7 of 23

-------
                                LABORATORY FACILITIES
Item                                                                   Yes      No      Comment
Do adequate procedures exist for disposal of waste
liquids from the ICP and AA spectrometers?

Do adequate procedures exist for disposing of liquid and
solid wastes?

Is the laboratory secure?

Are all chemicals dated on receipt and thrown away
when shelf life is exceeded?

Are all samples stored in the refrigerator
between analyses?

Are acids and bases stored in separate
areas?

Are hazardous, combustible, and toxic
materials stored safely?
                                LABORATORY FACILITIES
Item                              Available (Yes / No)    Comments (where applicable, cite system, QC check, adequacy of space)
Gas
Lighting
Compressed air
Vacuum system
Electrical services
Hot and cold water
Distilled water
Laboratory sink
Ventilation system
Hood space
Cabinet space
Storage space (m2)
Refrigerated storage (4°C)
                                         Page 8 of 23

-------
                       LABORATORY FACILITIES




COMMENTS ON LABORATORY FACILITIES:
                              Page 9 of 23

-------
                                  EQUIPMENT GENERAL
Item                               # of Units    Make    Model    Condition/Age (Good / Fair / Poor)    Comments

Balance, analytical

(1)

(2)

(3)

Balance, top loader

Class "S" weights

Balance table

NBS-calibrated thermometer

Desiccator

Distilled water

Double deionized,
distilled/deionized, or double
distilled water

Glassware

(1) Beakers

(2) Erlenmeyer flasks

(3) Sedimentation cylinders

(4) Graduated cylinders

(5) Fleakers

(6) Other

Drying ovens

Hot plates

Water bath

Centrifuge

Vortex mixer

                                         Page 10 of 23

-------
EQUIPMENT GENERAL
 Item                              # of Units    Make    Model    Condition/Age (Good / Fair / Poor)    Comments

 Eppendorf pipets
 (or equivalent)

 Reciprocating shaker

Comments:
                                    Page 11 of 23

-------
                                  pH DETERMINATION

Item                          Manufacturer      Model      Installation Date      Comments

Digital pH meter

Combination
electrodes,
non-gel type


Item                          Available         Quantity   Type                   Comments

Thermometer

Beakers, 50 mL

Stirrers

QCCS standard


Chemical                      Quantity          Grade      Expiration Date        Comments

Calcium Chloride
(CaCl2)

Calcium hydroxide
(Ca(OH)2)

Chloroform
(CHCl3) or Thymol
(C10H14O)

Hydrochloric acid
(HCl)

National Bureau of
Standards (NBS)
buffers

Potassium
Biphthalate
(KHC8H4O4)

Potassium chloride
(KCl)
                                       Page 12 of 23

-------
pH DETERMINATION

 Question                                                              Yes     No     NA     Comments

 Are chemicals reagent grade or
 better?

 Is the air-dried soil stored in sealed
 containers?

 Does the pH meter have internal
 temperature compensation to ±0.5°C?

 Is the combination electrode a non-
 gel type?

 Is the combination electrode of the
 recommended style with retractable
 sleeve junction?

 Are the buffers calibrated daily to
 ±0.01 pH units?

 Is the pH meter:
 - calibrated before samples are
         analyzed?
 - checked every batch as stated in
         methods?

 Is the temperature compensation
 manual or internal?

 Are equilibrium times required for
 standards checked, to see if
 electrode response is slowing?

 Is a spare combination electrode
 available and properly stored?

 Is manufacturer recommended
 warm-up time allowed before
 samples are run?

 Are pH meters placed away from
 drafts and areas of rapid
 temperature change?

 Are the specified between-sample
 procedures followed?

 Are pH units equipped with
 programmable sampling times?

 If yes above, are they used in this
 analysis?

 Are electrodes properly stored and
 maintained?
                                            Page 13 of 23

-------
pH DETERMINATION

Question                                                               Yes     No     NA     Comments

Are the QC results plotted in real
time?

What is the QCCS sample?

Is the QCCS solution analyzed first
and thereafter as called for in the
methods?

Are a QCCS and duplicate sample
included in each run?

Is the quality control data reviewed
by the analyst before deciding
whether to release the data for
reporting?
                                        Page 14 of 23

-------
                              DOCUMENTATION/TRACKING
Item                                                                   Yes      No      Comment
Is a sample custodian designated? If
yes, name of sample custodian:
Are the sample custodian's procedures
and responsibilities documented? If
yes, where are these documented?

Is sample tracking performed via paper
or computer?

Are written standard operating
procedures (SOPs) developed for
receipt of samples?  If yes, where are
they documented?


Are written standard Operating
procedures (SOPs) developed for
compiling and maintaining sample
document files?  If yes, where are they
documented?

Are samples stored under refrigeration?
At what temperature?
After completion of the analysis, are
the samples properly stored for six
months or until laboratory personnel
are told otherwise?
                                          Page 15 of 23

-------
                               ANALYTICAL METHODOLOGY
Item                                                                   Yes      No      Comment
Are the required methods used?
Is there any unauthorized deviation
from contract methodology?


Are written analytical procedures
provided to the analyst?
Are reagent grade or higher purity
chemicals used to prepare standards?


Are fresh analytical standards prepared
at a frequency consistent with good
QA?


Are reference materials properly
labeled with concentrations, date of
preparation, and the identity of the
person preparing the sample?
Is a standard preparation and tracking
logbook maintained?

Do the analysts record bench data in a
neat and accurate manner? Is the
appropriate instrumentation used in
accordance with the required
protocol(s)?
                                           Page 16 of 23

-------
ANALYTICAL METHODOLOGY




COMMENTS ON ANALYTICAL METHODS AND PRACTICES:
                            Page 17 of 23

-------
QUALITY CONTROL
Item                                                                   Yes      No      Comment
Does the laboratory maintain a quality
control manual?
Does the manual address the important
elements of a QC program, including
a. Personnel?
b. Facilities and equipment?
c. Operation of instruments?
d. Documentation of procedures?
e. Procurement and inventory?
f. Preventive maintenance?
g. Reliability of data?
h. Data validation?
i. Feedback and corrective action?
j. Instrument calibration?
k. Record keeping?
l. Internal audits?
Are QC responsibilities and reporting
relationships clearly defined?
Have standard curves been adequately
documented?
Are laboratory standards traceable?
Are quality control charts maintained
for each routine analysis?
Do QC records show corrective action
when analytical results fail to meet QC criteria?
Do supervisory personnel review the
data and QC results?
    Page 18 of 23

-------
QUALITY CONTROL
Item                                                                   Yes      No      Comment
Does the QC chemist have a copy of
the standard operating procedures?
Does the QC chemist have a copy of
the instrument performance data?
Does the QC chemist have a copy of
the latest QC plots?
Is the QC chemist aware of the most
recent control limits?
Does the QC chemist prepare a blind
audit sample once per week?
Does the QC chemist routinely review
and report blank audit data to the
laboratory manager?
Does the QC chemist update control
limits and obtain new control charts
once per batch?
Are all QC data (e.g., control charts,
regression charts, QC data bases) up to
date and accessible?
Are minimum detection limits
calculated as specified?
Is QC data sheet information reported
to the analyst?
                             Page 19 of 23

-------
DATA HANDLING
Item                                                                   Yes      No      Comment
Does the data clerk check all input to the
computer for accuracy?
Are calculations checked by another
person?
Are calculations documented?
Does strip chart reduction by on-line
electronic digitization receive at least
5% manual spot checking?
Are data from manually interpreted
strip charts spot-checked after initial
entry?
Do laboratory records include the
following:
- Sample identification number
- Sample type
- Date sample received in laboratory
- Date of analysis
- Analyst
- Result of analysis (including raw
analytical data)
- Recipient of the analytical data
Does the laboratory follow required
sample tracking procedures from
sample receipt to discard?
Does the data clerk routinely report
quality control data sheet information
to the analyst?
Does the data clerk submit quality
control data sheet information to the
laboratory manager, along with the
analytical data to be reported?
Do records indicate corrective action
taken?
   Page 20 of 23

-------
DATA HANDLING

Item                                                                   Yes      No      Comment
Are provisions made for data storage
for all raw data, calculations, quality
control data, and reports?

Are all data and records retained the
required amount of time?

Are computer printouts and reports
routinely spot-checked against
laboratory records before data are
released?
                                           Page 21 of 23

-------
SUMMARY
Item                                                                   Yes      No      Comment
Do responses to the evaluation indicate
that project and supervisory personnel
are aware of QA and its application to
the project?
Do project and supervisory personnel
place positive emphasis on QA/QC?
Have responses with respect to QA/QC
aspects of the project been open and
direct?
Has a cooperative attitude been
displayed by all project and
supervisory personnel?
Does the organization place the proper
emphasis on quality assurance?
Have any QA/QC deficiencies been
discussed before leaving?
Is the overall quality assurance
adequate to accomplish the objectives
of the project?
Have corrective actions recommended
during previous evaluations been
implemented?
Are any corrective actions required? If
so, list the necessary actions below.
Page 22 of 23

-------
SUMMARY




Summary Comments and Corrective Actions:
                                   Page 23 of 23

-------
                                Appendix Q
                       Audit Finding Response Form
       The following forms will be used to document GLNPO audit activities.
         These forms are very similar to the finding forms published in:
 Arter, D. 1989. Quality Audits for Improved Performance. ASQC Quality Press,
                  Milwaukee, Wisconsin. 93 pp. Appendix P
GLNPO Quality Management Plan - Appendix Q                                          May 2008

-------
                  Audit Finding Response Form

Audit Title:	
Audit #:
Finding #:

Finding:
Cause of problem:
Actions taken or planned for correction:
Responsibilities and timetable for the above actions:
Prepared by: ____________________________________    Date: ____________
Reviewed by:	Date:

Remarks:
                            Page 1 of 1

-------
                                   Appendix R

      Examples of Quality Assurance/Quality Control Analysis Checklists
   These example checklists were used for QA/QC review of whole sediment toxicity tests and
                             sediment chemistry analysis.
GLNPO Quality Management Plan - Appendix R                                              May 2008

-------
                             QA/QC Analysis Checklist for
           ACUTE AND CHRONIC WHOLE SEDIMENT TOXICITY TESTS
                     (10-day C. tentans and 10-day or 28-day H. azteca)
GRANT/IAG NUMBER:
PROJECT NAME: 	
REVIEWER:	
DATE:
1.  Did toxicity tests employ appropriate procedures? [ASTM: E1367, E1611, E1706, USEPA (2000)]

                    YES	
                    NO	(UNACCEPTABLE)

2.  Does sample storage time exceed the allowable storage time specified in the QAPP?

        Allowable Storage Days Specified in QAPP	
       Number of Storage Days Prior to Testing	
                    YES   	(UNACCEPTABLE)
                    NO    	

3.  Was the age of the H. azteca organisms between 7 and 14 days at the start of the test, with an age range
 of less than 2 days?
                    YES   	
                    NO    	 (UNACCEPTABLE)

4A. Were all of the C. tentans organisms second- to third-stage larvae with at least 50% at the third
 instar?
                    YES   	
                    NO    	 (UNACCEPTABLE)

4B. How was the developmental stage of the C. tentans larvae measured?
                    Head Capsule Width	(See Table 10.2 of EPA/600/R-99/064, March 2000)
                    Length	(Should fall between 4 mm and 6 mm)
                    Weight	(Should fall between 0.08 and 0.23 mg/individual)

5.  Do flow rates through the different test chambers differ by more than 10% at any particular time
during the test?
                    YES	(UNACCEPTABLE)
                    NO	

6.  Did Dissolved Oxygen remain above 2.5 mg/L?


                    YES
                    NO    	(Provide Explanation at end of Checklist)
                                       Page 1 of 5

-------
7.  Does daily mean Temperature remain at 23 ± 1°C?
                    YES	
                    NO	(UNACCEPTABLE)

8.  Does the instantaneous temperature remain within 23 ± 3°C?

                    YES	
                    NO	(UNACCEPTABLE)

9.  Do the ranges for Hardness, Alkalinity, pH, and Ammonia fluctuate by more than 50%?
       Ranges:
             DO	Alk	
             pH	NH3    	
                    YES	(UNACCEPTABLE)
                    NO
10.  Was the Ammonia concentration greater than 20 mg/L?

                    YES   	  (See EPA/600/R-99/064, March 2000 to determine if ammonia
                                  contributed to toxicity of H. azteca.)
                    NO    	

11.  Was the Ammonia concentration greater than 82 mg/L?

                    YES   	  (See EPA/600/R-99/064, March 2000 to determine if ammonia
                                  contributed to toxicity of C. tentans.)
                    NO    	

12.  Was the Mean Control Survival in the H. azteca Control Sediments greater than or equal to 80%?

                    YES   	
                    NO	(UNACCEPTABLE)

13.  Was the Mean Control Survival in the C. tentans Control Sediments greater than or equal to 70%?

                    YES	
                    NO	(UNACCEPTABLE)

14.  Was the mean weight per surviving C. tentans control organism greater than 0.48 mg (ash-free dry
weight)?
                    YES	
                    NO	(UNACCEPTABLE)

15.  Was the overlying water renewed at a rate of 2 volumes per day?
                    YES	
                    NO	(UNACCEPTABLE)
                                       Page 2 of 5

-------
16.  Please provide details for all of the "UNACCEPTABLE" responses marked above. Include details
on the specific results that may potentially be affected by any QA/QC discrepancies, and
recommendations regarding usability of data.
                                          Page 3 of 5

-------
                             QA/QC Analysis Checklist for
                        SEDIMENT CHEMISTRY ANALYSIS
GRANT/IAG NUMBER:
PROJECT NAME:	
REVIEWER:	
DATE:
1.  What sediment chemistry data has been collected (CHECK ALL THAT APPLY)?

       Total Metals	    PCBs	pH	TOC	
       Dioxins/Furans	  PAHs	Pesticides	DO	  AVS
       SEM Metals	Particle Size	Other	

2.  Were the target detection limits met for each parameter?

                    YES	
                    NO    	(UNACCEPTABLE)

3.  Were the Method Blanks less than the established MDL for each parameter?
                    YES
                    NO	(UNACCEPTABLE)

4.  Did the results of Field Duplicate Analysis vary by less than the % RPD specified in the QAPP?

                    YES   	
                    NO	(UNACCEPTABLE)

5.  Did the results of the Field Replicates Analysis vary by less than the % RPD specified in the QAPP?

                    YES   	
                    NO	(UNACCEPTABLE)

6.  Did the surrogate spike recoveries and MS/MSD recoveries meet the limits set forth in the QAPP?

                    YES   	
                    NO	(UNACCEPTABLE)

7.  Did the initial calibration verification standards meet the requirements set forth in the QAPP?

                    YES   	
                    NO	(UNACCEPTABLE)

8.  Were any levels of contaminants detected above the MDL for the trip blanks and storage blanks?

                    YES	(UNACCEPTABLE)
                    NO
                                      Page 4 of 5

-------
9.  Did all required analyses take place within the required holding times set forth in the QAPP?

                     YES   	
                     NO     	  (UNACCEPTABLE)

10. Did the laboratory duplicates vary by less than the % RPD specified in the QAPP?

                     YES   	
                     NO     	  (UNACCEPTABLE)

11.    Are measured dry weight contaminant concentrations reported? (Note: Conversion from wet
 weight to dry weight concentration may occur ONLY if data on moisture or TOC are provided.
 Nominal concentrations are unacceptable.)

                     YES
                     NO	(UNACCEPTABLE)

12.     Please provide details for all of the "UNACCEPTABLE" responses marked above. Include details on the
specific analytes affected by any QA/QC discrepancies, and recommendations regarding usability of
data.
                                        Page 5 of 5

-------
                                Appendix S
   GLNPO Quality System Documentation Review Procedures and Tracking
GLNPO Quality Management Plan - Appendix S                                          May 2008

-------
                    GLNPO's Quality System Documentation
                         Review Procedures and Tracking
NEW PROJECT/GRANT

1.  Blue folder
   •   Determine if QAPP and/or peer review are needed, then initial and date (Lou should
       initial the routing slip, but if he is not here, then have another member of the QA team
       initial it: Marvin, Scott)
   •   Make a copy of the routing slip, memo, and workplan
   •   If peer review is needed make a second copy of just the routing slip

2.  Enter into QA Track
   •   Enter the date entered into QA Track, assistance agreement number, Project Investigator
       (PI), Pi's organization, GLNPO Project Officer, funding team (determine team based on
       the Team Leader listed on the routing slip), branch (determined by the branch which
       manages the team), project title, and award date (if available)

3.  Enter into Monthly QA log
   •   Enter P.O., P.I., assistance agreement number, and project title; check NG for "new
       grant"
   G:/User/Share/All/QA/QA Team/Forms/Monthly Status.xls

4.  File
   •   Create a label with assistance agreement number, P.I., P.O., funding year, and title; attach
       label to green file folder
   G:/User/Share/All/QA/QA Team/Forms/QAPP file labels.doc
   •   Put copies made from blue folder into green folder, and file by fiscal year and assistance
       agreement #
   •   If peer review is needed, put copy of the routing slip into the appropriate manila folder
       (by fiscal year) in the green peer review folder, and enter into the National Peer Review
       Database

QAPP/OMP REVIEW

1.  Checksheets
   •   You have ten business days to review a QAPP/QMP from the date of the P.O.'s  signature
   •   Review the QA document by filling out the appropriate checksheet
   G:/User/Share/All/QA/Library/Checklists/QAPPChklst.doc
   G:/User/Share/All/QA/Library/Checklists/QMPChklst.doc
   •   Send checksheet to Lou, who will review and send his comments to the P.O.

2.  Status
   •   If the QA document is approved, Lou will sign and date the title page of the plan
   •   If the QA document is approved with minor revisions, Lou will sign and date it,

-------
       contingent on approval of received revisions
   •   If the QA document is not approved, Lou will return it to the P.O. with his comments

3.  File
   •   File an electronic copy of the QAPP/QMP and the review checksheet in the appropriate
        folder under G:/User/Share/All/QA/Library/QAPPs or QMPs; also name the QA document
       and review checksheet as specified in the QA directory document
   G:/User/Share/All/QA/QA Team/Forms/QA Directory.doc
   •   File the hard copy QAPP/QMP and the review checksheet in the respective file folder

4.  Update QA Track
   •   Enter the review date, status, and comments in QA Track

5.  Update Monthly Log
   •   Enter the P.O., date signed by P.O., P.I., assistance agreement number, title, and status
       into the monthly log

6.  Update Delinquency List
   •   If a delinquent QA submission or revision is approved, remove that document from the
       delinquency list and enter into delinquency resolved list

7.  Consequent Submittals / Revisions
   •   Repeat steps 1-6

DELINQUENCY

1.  1st phase (after 90  days of award date)
   •   Send memo to P.O. and "cc" branch chief within 7 days
   •   Enter P.O., team, P.I., assistance agreement number, project title, and award date into the
       delinquency table (Excel spreadsheet)
    G:/User/Share/All/QA/QA Team/Delinquency/Delinquent/DelinquentMM-DD-YY.xls
   •   File memo into 1st phase delinquency folder, and a copy into project folder
   •   If plan or response is received within 7 days, no memo is sent to branch chief, and project
       is taken off of delinquency status
   •   If no response is received, send memo to branch chief, and file memo in 7-day 'cc' folder
   •   If no response is received in the next 30 days, move to 2nd phase delinquency
   •   Update QA Track

2. 2nd phase (3 weeks after 1st phase memo is sent out)
   •   Send a memo to the director and "cc" branch chief, P.O., and the grants person
   •   File the memo in the 2nd phase delinquency folder  and project folder
   •   Send a memo every 30 days until a response is received
   •   Update QA Track

3. Resolved
   •   Notify P.O.; and/or branch chief, team leader, director, and grants person by memo

-------
    •   Cross project off the delinquency summary list and enter in resolved delinquency section
    •   Update the QA track
    •   Enter approved project and date into monthly log
    •   File a copy of the memo into the resolved folder
    •   File memo and revisions/QA plan into respective project folder

CLOSE-OUT

1.  Update QA Track
    •   Enter close-out date; if not in QA Track just file into close-out folder because it didn't
       have a QA plan; if found in QA Track follow steps 2 and 3

2.  File
    •   File a copy of the memo into the respective project folder
    •   File the original memo into the close-out folder

3.  Label
    •   Make and attach a new label that indicates close-out (gray background color; with close
       out, assistance agreement #, P.I.'s organization, year, and project title)
       G:/User/Share/All/QA/QA Team/Forms/QAPP file labels.doc

NOTE: Verify that the closed-out project had approved quality system documentation; if not,
bring to Lou's attention.

OTHER

Each month:

    •   Sum the total number of new grants, and the number of QA documents approved,
       approved with minor revisions, and not approved
    •   Update the QA Monthly Delinquency Status report with the QA documents which are
       seriously delinquent, delinquent but not significantly affecting quality, and resolved
       delinquencies
        G:/User/Share/All/QA/QA Team/Delinquency/Delinquent/DelinquentMM-DD-YY.xls

The QA  Manager will report the QA status to management on the second Tuesday of
each month.
GRANTS PEOPLE

George Stone - Grants / lAGs / Coops
Brigitte Manzke - Contracts

-------
                                Appendix T
            GLNPO Information Quality Products Approval Form
GLNPO Quality Management Plan - Appendix T                                          May 2008

-------
                             GLNPO Information Quality Products Approval Form
                     (for Publication Number Assignment & Web Site/Internet Products)

                                                                  Date of Request: ______________
               Originator's Name(s): _______________________________________________
                       Telephone: __________________________________________________
 EPA Document Control #: (to be assigned by NSCEP)                 Is this a Revised Document?  Y   N
             Month / Year Document to be Published: ____________   (If so, what is the original EPA #?)
                                  Number of Pages: _____________

  Does this Document contain information that the Agency deems sensitive and is not suitable for the public?  Y   N
                                                Is this Document for Internal Agency Distribution ONLY?  Y   N

          1. Title of Product:
          2. Description / Abstract:
          3. Product Type Code: (Choose One) ______        Web Site Content: (Choose One)    New: ____    Revision: ____
             A = Reprinted Articles; B = Reference; C = Computer; D = Draft; E = Exhibit-Display-Booth; F = Unbound Publication;
             H = Photo, Filmstrip, etc.; J = Peer-Review Journal; K = Bound Publication; M = Microfilm, Microfiche;
             N = Periodical (other than Peer-Review Journal); P = Public Comment Draft; R = Report; S = Summary, Brief, Issue
             Paper; U = Audio; V = Video; Z = Federal Register
          4. Web Site Address (URL): http://www.epa.gov/greatlakes/ ______________________
                                     http://binational.net/ ______________________________
          5. Branch Chief Approval on Conformance to IQG
             Signature                                                            Date

          6. GLNPO Office Director Approval on Conformance to IQG
             Signature                                                            Date

          7. Received by IQG Tracking Contact
             Signature                                                            Date

                                                                                       revised: June 20, 2007

-------