FIFTH ECOLOGICAL QUALITY ASSURANCE WORKSHOP
October 14-15, 1992, Toronto, Ontario, Canada
Proceedings
"Towards a Generic Quality Assurance Program Plan"
Sponsored by:
Environment Canada
U.S. Environmental Protection Agency
Ontario Ministry of Environment & Energy
-------
PROCEEDINGS OF THE
FIFTH ANNUAL
ECOLOGICAL QUALITY ASSURANCE
WORKSHOP
October 14-15, 1992
Meeting Coordinators
George Crawford
Dan Orr
Ontario Ministry of the Environment & Energy
Robert Bisson
Environment Canada, NWRI
Mike Papp
Robert Graves
James Lazorchak
Linda Kirkland
Steven Hedtke
U.S. Environmental Protection Agency
Craig Palmer
University of Nevada
Don Hart
Beak Consultants Limited
Sponsored by
United States Environmental Protection Agency
Environmental Monitoring Systems Laboratory
Las Vegas, Nevada, U.S.A.
Environmental Research Laboratory
Duluth, Minnesota, U.S.A.
Environment Canada
National Water Research Institute
Burlington, Ontario, Canada
Great Lakes Environment Office
Toronto, Ontario, Canada
Ontario Ministry of Environment and Energy
Laboratory Services Branch
Toronto, Ontario, Canada
-------
TABLE OF CONTENTS
Acknowledgements
List of Acronyms
Disclaimer Notice
Executive Summary
Abstract
Foreword
KEYNOTE SPEAKERS
Quality Assurance in the New Federal Great Lakes Program
Harvey Shear
Environment Canada
Toronto, Ontario
Quality Assurance in the Great Lakes National Programs Office
James Giattina
U.S. Environmental Protection Agency
Chicago, Illinois
PLENARY SESSION
EMAP on the Great Lakes - Quality Assurance Program Planning:
Current EPA Thoughts and Issues
Linda Kirkland
U.S. Environmental Protection Agency
Washington, D.C.
The Niagara River Toxics Management Plan - Quality Planning
Considerations, Past, Current and Future - Field Sampling
Design and Quality Control Procedures
Francis Philbert and Ken Kuntz
Environment Canada
Burlington, Ontario
iii
-------
MISA Database Evaluation - A Roadmap for the Integration,
Assessment and Critical Review of an Interlaboratory Database
George Steinke
Ontario Ministry of the Environment
Rexdale, Ontario
Quality Assurance Planning in Great Lakes Ecological Studies -
Framework for a Quality Assurance Plan
Donald Hart
Beak Consultants Limited
Brampton, Ontario
Comments and Questions
Workshop Participants
WORKGROUP PANEL DISCUSSION (reports and comments)
Program Management Issues
Mike Papp (leader) and Todd Howell (rapporteur)
U.S. Environmental Protection Agency, Chicago, Illinois
Ontario Ministry of the Environment, Rexdale, Ontario
Field Operations QA/QC
Dan Orr (leader) and Steven Hedtke (rapporteur)
Ontario Ministry of the Environment, Rexdale, Ontario
U.S. Environmental Protection Agency, Duluth, Minnesota
General Quality Assurance
Robert Bisson (leader) and Linda Kirkland (rapporteur)
Environment Canada, Burlington, Ontario
U.S. Environmental Protection Agency, Washington, D.C.
Laboratory Operations QA/QC
George Crawford (leader) and Craig Palmer (rapporteur)
Ontario Ministry of the Environment, Toronto, Ontario
U.S. Environmental Protection Agency, Las Vegas, Nevada
APPENDIX A: WORKSHOP PARTICIPANT ADDRESSES
APPENDIX B: ORGANIZING COMMITTEE ADDRESSES
APPENDIX C: A GENERIC QA PROGRAM PLAN FOR GREAT
LAKES ECOLOGICAL STUDIES
iv
-------
ACKNOWLEDGEMENTS
The Organizing Committee for this Fifth Ecological Quality Assurance Workshop
included George Crawford and Dan Orr of the Ontario Ministry of the Environment and
Energy; Robert Bisson of Environment Canada; Robert Graves, James Lazorchak, Linda
Kirkland and Steven Hedtke of the U.S. Environmental Protection Agency; and Craig Palmer
of the University of Nevada. The Committee wishes to acknowledge the efforts of all
participants in making the workshop a success. The proceedings were prepared by Don Hart
of Beak Consultants Limited, with assistance from Committee members.
A very special thank you is extended to Veronica Headley, Ontario Ministry of
Environment and Energy, Laboratory Services Branch, Rexdale, and Debbie Hall, EPA,
Environmental Monitoring Systems Laboratory, Cincinnati, without whose assistance this
workshop could not have happened.
The Ontario Ministry of Environment and Energy, Laboratory Services Branch, as
host agency of the Fifth Ecological Quality Assurance Workshop, extends its heartfelt thanks
to this year's organizing committee members for their unfailing support and constant
encouragement.
Funding was provided by the U.S. Environmental Protection Agency (Environmental
Monitoring Systems Laboratory and Environmental Research Laboratory), the Ontario
Ministry of the Environment and Energy (Laboratory Services Branch), and Environment
Canada (National Water Research Institute and Great Lakes Environmental Office).
v
-------
LIST OF ACRONYMS
ARB
Air Resource Branch (MOEE)
ARCS
Assessment and Remediation of Contaminated Sediments
ASTM
American Society for Testing and Materials
ATG
Analytical Test Group
BHC
Benzene Hexachloride
CAEAL
Canadian Association of Environmental Analytical Laboratories
CCIW
Canadian Centre for Inland Waters
DCM
Dichloromethane
DDW
Double-distilled water
DFO
Department of Fisheries and Oceans (Canada)
DQO
Data Quality Objective
DQS
Data Quality Subcommittee (of the NRTC)
EC
Environment Canada
EC50
Effective Concentration (50% response)
ECD
Electron Capture Detector
EMAP
Environmental Monitoring and Assessment Program
EPA
Environmental Protection Agency
FE
Fort Erie
GC
Gas Chromatograph
GLISP
Great Lakes International Surveillance Plan
GLNPO
Great Lakes National Program Office (EPA)
GLSE
Goulden Large Sample Extractor
IADN
Integrated Atmospheric Deposition Network
IJC
International Joint Commission
LC50
Lethal Concentration (50% mortality)
LMDL
Laboratory Method Detection Limit
LSB
Laboratory Services Branch (MOEE)
MISA
Municipal Industrial Strategy for Abatement
MOE
Ministry of the Environment (Ontario)
MOEE
Ministry of Environment and Energy
MQO
Measurement Quality Objective
NLET
National Laboratory for Environmental Testing (Canada)
NOTL
Niagara-on-the-Lake
NRTC
Niagara River Toxics Committee
NRTMP
Niagara River Toxics Management Plan
NWRI
National Water Research Institute (Canada)
NYSDEC
New York State Department of Environmental Conservation
QA
Quality Assurance
QAMS
Quality Assurance Management Staff (EPA)
QC
Quality Control
QMO
Quality Management Office
RMC
River Monitoring Subcommittee (Niagara)
RMDL
Reference Method Detection Limit
SD, S
Standard Deviation
SOP
Standard Operating Procedure
SRM
Standard Reference Material
TP
Total Phosphorus
TQM
Total Quality Management
WQB
Water Quality Branch (EC)
WRB
Water Resources Branch (MOEE)
vii
-------
DISCLAIMER NOTICE
The Proceedings of the Fifth Annual Ecological Quality Assurance Workshop have
not been subjected to formal review by either the U.S. Environmental Protection Agency,
Environment Canada or the Ontario Ministry of Environment and Energy. Consequently,
the opinions expressed are those of the individual authors and participants and do not
necessarily reflect the views or policies of any agency. No official endorsements should be
inferred.
viii
-------
EXECUTIVE SUMMARY
The Fifth Ecological Quality Assurance Workshop was held on October 14-15, 1992,
at Queen's Park, Toronto, Ontario, Canada. The purpose of this workshop was to provide
a forum for interdisciplinary exchange of ideas and resolution of issues associated with
quality assurance in ecological studies on the Great Lakes, and specifically to evolve a
Generic QA Program Plan which would provide a common framework and elements of
guidance for implementation and description of integrated QA programs by federal, state and
provincial participants in Great Lakes monitoring.
Keynote speakers included Harvey Shear, Director of the Great Lakes Environmental
Office (GLEO) of Environment Canada, and James Giattina, Assistant Director of the Great
Lakes National Program Office (GLNPO) of the U.S. Environmental Protection Agency. Dr.
Shear outlined the goals, objectives and strategies of the federal Great Lakes monitoring and
management program in Canada, emphasizing the vision of sustainable development within
the basin. He noted the importance of quality assurance in both tracking and demonstrating
progress toward those goals. Mr. Giattina presented a similar outline, emphasizing the
strategic shift from chemical regulation to ecosystem assessment, diagnosis, protection and
remediation. He noted the importance of quality assurance in defining the risks associated
with ecosystem management decisions, and minimizing the risk of making a wrong decision.
Invited papers were presented in a plenary session on the first day of the workshop.
Linda Kirkland (U.S. Environmental Protection Agency) described QA program planning for
the Environmental Monitoring and Assessment Program (EMAP) on the Great Lakes. She
emphasized the statistical basis of the program design, and the systematic approach to quality
management, both focused on assessment of ecological endpoints. Francis Philbert and Ken
Kuntz (Environment Canada, NWRI) described the Niagara River Toxics Management Plan
for bi-national coordination and evaluation of collective pollution control measures. They
outlined the data quality planning considerations, general objectives, sampling design and
QA/QC features of the program. George Steinke (Ontario Ministry of the Environment)
described a QA/QC database evaluation associated with the province's Municipal Industrial
Strategy for Abatement (MISA) program of effluent monitoring. He illustrated techniques
for identification of sampling, sample stability and analytical problems that might affect data
interpretation, and discussed some of the data management problems associated with
handling of large multi-laboratory databases. Don Hart (Beak Consultants Limited)
presented a retrospective review of approaches to QA planning in Great Lakes ecological
studies. He noted some progress over the past few years with respect to development of
quality controls for biological monitoring, awareness of goal-driven approaches to sampling
design and quality assessment, and integration of diverse monitoring activities. He suggested
a framework and elements of guidance for preparation of QA plans and identified some
related issues that may need to be addressed in development of integrated plans for the Great
Lakes.
ix
-------
Participants were divided into four workgroups on the second day of the workshop,
and used Don Hart's presentation as a 'straw-plan' to assist them in identifying, developing
and refining the skeleton of a Generic QA Program Plan. The intent was to produce a plan
that could be used as a template across the Great Lakes to encourage some degree of
consistency in development and description of quality assurance programs by the monitoring
agencies.
The four workgroups respectively addressed (1) program management issues, (2) field
operations QA/QC, (3) general quality assurance, and (4) laboratory operations QA/QC. The
program management group (1) focussed on (a) study objectives, questions and data uses to
address the questions, (b) basic study design approaches, (c) relationships (e.g., data
linkages) between programs, (d) reporting channels and communications, and (e) document
control. The field operations group (2) focussed on (a) sampling design, (b) sampling
protocols, and (c) data management. The general quality assurance group (3) focussed on
(a) accreditation and interlaboratory studies, (b) system and performance audits, (c) use of
QC data, and (d) data review and management. The laboratory operations group (4)
focussed on (a) laboratory facilities and management, (b) analytical procedures, (c) sample
handling, and (d) data management and reporting.
Suggestions from the workgroups in each of these areas were incorporated into a revised
Generic QA Program Plan. The revised document is attached as Appendix C, and is also
available from any member of the Organizing Committee (Appendix B).
x
-------
ABSTRACT
The Fifth Ecological Quality Assurance Workshop was held on October 14-15, 1992,
at Queen's Park, Toronto, Ontario. Manuscripts of invited papers and reports of workgroup
discussions, conclusions and recommendations are presented in this proceedings document
along with the framework for a generic QAPP.
The purpose of this workshop was to provide a forum for interdisciplinary exchange
of ideas and resolution of issues associated with quality assurance (QA) in ecological studies
on the Great Lakes, and specifically to evolve a Generic QA Program Plan which would
provide a common framework and elements of guidance for implementation and description
of integrated QA programs by federal, state and provincial participants in Great Lakes
monitoring.
Presentations included:
1. Quality Assurance in the Great Lakes Environment Office (GLEO)
2. Quality Assurance in the Great Lakes National Programs Office (GLNPO)
3. EMAP on the Great Lakes - Quality Assurance Program Planning: Current EPA
Thoughts and Issues
4. The Niagara River Toxics Management Plan - Quality Planning Considerations,
Past, Current and Future - Field Sampling Design and Quality Control
Procedures
5. MISA Database Evaluation - A Roadmap for the Integration, Assessment and
Critical Review of an Interlaboratory Database
6. Quality Assurance Planning in Great Lakes Ecological Studies - Framework for
a Quality Assurance Plan
Workgroup topics included:
1. program management issues,
2. field operations QA/QC,
3. general quality assurance, and
4. laboratory operations QA/QC.
Each group focused specifically on elements of guidance for inclusion in a Generic
QA Program Plan, using the last presented paper as a straw plan. Group rapporteurs
presented a summary of group conclusions for general discussion in a plenary session held
on the last day.
xi
-------
FOREWORD
The Fifth Ecological Quality Assurance Workshop was conceived as a crucible for
the creation of a generic quality assurance program plan (QAPP). The workshop offered 75
participants both plenary and working group forums to kindle discussions and aid the
contribution of ideas from the participants. The proceedings document captures the details
from the workshop.
It was the hope of the coordinating committee members that this workshop would
yield a practical template - a focus - for the development of more comprehensive quality
management plans by workshop participants, scientists, project managers and administrators
responsible for the local and bi-national, Great Lakes environmental programs.
A Generic QA Program Plan for Great Lakes Ecological Studies (Generic QAPP) is
the template proposed for consideration in environmental program design. The template
provides: guidance in 4 fundamental areas of program design; insights into 7 activities within
the program design; questions for 47 specific issues related to program implementation;
and more than 30 references that provide advice for the answers. This Generic QAPP is a
first step toward improved environmental program quality design and bi-national program
harmonization. It is an introductory step. Implementation, tracking, discussion, addition and
modification are the keys to its continuous improvement.
On behalf of the coordinating committee I encourage you to use this Generic QAPP
during your next program design, and tell us of your successes and suggestions.
George Crawford
Ontario Ministry of Environment and Energy
October, 1993
xiii
-------
FIFTH ANNUAL QUALITY ASSURANCE WORKSHOP
KEYNOTE SPEAKERS
Wednesday, October 14, 1992
-------
[transcription from oral presentation]
QUALITY ASSURANCE IN THE NEW FEDERAL
GREAT LAKES PROGRAM
Harvey Shear, Director
Great Lakes Environment Office
Conservation and Protection - Ontario Region
Environment Canada
Toronto, Ontario
I thought it might be useful to look at where the Great Lakes Programme is heading
in Canada and try to illustrate some of the areas where quality assurance is absolutely
essential. As you know, in 1987 at the signing of the Great Lakes Water Quality Agreement
Protocol, the focus of the Great Lakes Programmes shifted from the International Joint
Commission, which had been facilitating the Agreement, back to the Parties, that is the
United States and Canada, and the Parties are now fully responsible for delivering all aspects
of the Canada/U.S. Agreement. The public of course is watching and commenting, and the
spotlight is always on us.
In Canada, we are currently looking at our Great Lakes Programme, looking towards
the turn of the century, with a new strategic direction for the Great Lakes Programme. We
have had at least twenty years of history and concurrently we are also looking at our funding
for the Great Lakes Programme and also renewing our Agreement with the Province of
Ontario. So a number of factors have come together to make us look at our programme and
focus it to the turn of the century and see where we want to be heading. With the United
Nations Conference on Environment Development, the Rio conference in June, Canada
played a major role. We have essentially adopted Agenda 21 which came out of that
conference. Our Minister Charest has told us that if we cannot deliver sustainable
development of Canada and hopefully the United States, certainly in Canada; if we cannot
deliver sustainable development in the Great Lakes Basin with our resources, our finances
and intelligence, we cannot do it anywhere. We should be able to do it and show the world
that it can work.
So we have taken sustainable development to the highest level of our programme.
The Great Lakes strategic framework - at the moment it has been largely a federal
framework. We are working with the Province of Ontario as I said on a new Canada
Ontario Agreement and most of the components that you will see here are very consistent
with our discussions with Ontario. As I said, the vision, if you will, of our programme, is
sustainable development, a healthy environment. It is a healthy ecosystem and a healthy
economy. We are often faced with the question of "Well if we implement these
environmental rules and regulations, we will lose jobs". This does not have to happen. I
think there are sufficient examples on both sides of the border to show that this is not the
case.
3
-------
[transcription from oral presentation]
In going down our strategic framework, you will see ultimately it is just a one page
synopsis of what we are doing. There will be a lot of words behind this explaining what we
are attempting to achieve. We have two main goals; let us fix up the mess that we have
made in the last 150 years. Let us cure our environment and let us make sure it does not
happen again because it is a lot easier and a lot less expensive to prevent the problems than
to cure them. We have also got a lot of programmes which cut across all our goals and
objectives and they involve research and monitoring which you will see in a minute.
Within these two major goals, we have some programme objectives. As far as curing
past mistakes we are really focusing here on restoring degraded sites in the Great Lakes,
many of which will be our Areas of Concern in Canada. We have got seventeen of them,
five shared with the U.S. and twelve domestic; and other sites as well. In the prevention,
the future-looking side, we want to prevent pollution from occurring and we have got an
active pollution prevention programme and we also want to make sure that our regulatory
programme is active so we have got two sides of the pollution situation; continue to control
pollutants, regulate them as strictly as necessary and work towards preventing pollution in
the first place. And on the ecosystem side, including humans, we are looking to conserve
ecosystem health and human health; in fact promote human health. We have got a very
active human health effects programme within our Great Lakes Programme. This all sounds
very vague and I am sure you are wondering what I am talking about but it will start to take
shape I hope.
What we will try to identify is a number of areas where we will have some action
in restoring degraded sites throughout the Great Lakes, in our remedial action plan
programme with which most of you are familiar. In our areas of concern - we are going
full steam ahead. We are going to restore habitat, fish and wildlife populations in areas of
concern. We are going to deal with ground water. That says remediate ground water. Well
that is a monumental task and I am not sure that anyone is up to doing that throughout the
Basin but at certain waste sites we are going to look at what can be done to remediate
ground water problems and probably more importantly, to prevent them in the first place,
cleaning up waste sites, and there are some others there that I need not go into.
We have got a number of actions in each of those other two objectives. We have
tried to be as comprehensive as possible in this. At the moment, it was developed federally.
We see the provincial government in here as well and I said most of the objectives of the
talk are consistent with what we are currently negotiating with Ontario. There are generic
types of activities that could apply to the federal government, or the provincial government
within the box I have labelled Programme Strategies. These are the generic kinds of
activities we carry out to deliver these objectives. Each of those boxes in fact will be an
environmental objective. What do we want to do for example in protected areas, for
example, fish sanctuaries, wildlife refuges, etc. How many hectares or acres of habitat do
we want to protect? How many parks do we want to create? So we are looking at actual
targets. We were criticized in the past. People were saying "Well your programmes are too
vague. We do not know what you are doing. It is all process. Where are you really
4
-------
[transcription from oral presentation]
heading? What are your objectives here?" What we are attempting to do is establish those
objectives and then have a programme in place to measure our attainment of those
objectives. Where does quality assurance fit into all of this? Well just about everywhere.
Ecosystem processes, which we talk about, ecosystem research, and management decisions
based on the ecosystem concept clearly require extremely good information. We
unfortunately, in government, seem to get criticized for whatever reason, for apparently not
telling the truth or distorting information or whatever. I have heard this many times and it
really bothers me. I think the only way to build a defence against something like that is
good, solid information. There are a number of environmental groups around who have their
own agendas and who will take shots at government programmes for their own purposes.
I think the best way to counter that is with good solid information which depends to a large
extent on the work that you people are doing. A sad commentary, but unfortunately it is the
truth.
As a biologist I understand the kind of variables that you are dealing with in
ecological studies and reducing that variability or at least understanding it and being able to
explain it and provide the kind of information that managers need to take action is
particularly critical. One of the most important areas that we will be dealing with and that
requires very good solid quality assurance work would be in Lake Superior. Last year the
two governments announced a major programme in response to the International Joint
Commission's recommendation on Lake Superior. The governments went beyond the zero
discharge zone that was asked for. We in fact came up with a plan to restore and protect
the Lake Superior ecosystem which is quite a broad plan involving both pollution
components as well as ecosystem components. It is a long-term plan. Because of the nature
of the Lake Superior ecosystem which I guess one could describe as "pristine" relative to the
other lakes and probably more fragile than the others, good solid quality information is
going to be absolutely critical. As we report the objectives that we are developing, we are
going to have to, as I said earlier, defend against the kinds of criticism that will come back to us
saying "Well, you really have not achieved anything. You are comparing information from
year X against year Y. Well, we really do not see much of a change." We have to be able
to show without any shadow of a doubt that in fact we have made progress. And I guess
looking sort of selfishly at the whole framework that we have put up here, we are using that
as a way of convincing our government to give us money for this programme and the only
way to do that is to show what results we have achieved, what targets we are setting for
ourselves and how we are going to be able to measure our progress against those targets.
And for the Canadians, the federal people who are here, there is certainly an opportunity as
we develop our Great Lakes Action Plan submission to go through to our federal cabinet,
to build in a good solid quality assurance component with the appropriate arguments for how
it helps to deliver an entire programme. That is kind of a rambling description of where the
Canadian programme is heading. We are currently in negotiations with the Province on the
new Canada/Ontario Agreement which has been in place since 1971. That agreement allows
Canada to deliver its commitments under the Canada/ U.S. agreement and within that
agreement I think we are going to see a fairly significant quality assurance component as
well.
5
-------
[Figure: Great Lakes Strategic Framework (one-page synopsis). Vision: sustainable basin
development (healthy ecosystem, vigorous economy). Goal #1: cure past environmental
mistakes; Objective #1: restore degraded sites (implement RAPs in Areas of Concern; restore
habitat and populations; remediate groundwater; clean up waste sites; remediate
contaminated sediments; protect humans at risk). Goal #2: prevent future ecosystem
degradation; Objective #2: prevent and control pollutant impacts; Objective #3: conserve
human and ecosystem health; Objective #4: take decisive action through ecosystem science
and management; Objective #5: lay the foundation for behavioural change. Program
strategies and actions shown include: virtually eliminate toxics; regulate sources of
pollutants; improve drinking water and sewage treatment; minimize solid wastes; stop
long-range air pollution; protect and promote human health; sustain fish and wildlife
populations; enhance protected areas; ensure sustainable land and resource use; prevent or
manage nuisance exotic species; prevent spills and fugitive emissions; prevent or mitigate
climate change impacts; develop and apply ecosystem-based decision processes; implement
policies and enforce legislation; conduct ecosystem-based research; develop and apply
technology; monitor ecosystem trends and effects; share strategic ecosystem information;
promote environmental citizenship and stewardship; promote use of "economic instruments";
enhance consultation and partnerships; manage Canada/U.S. relations strategically.]
-------
QUALITY ASSURANCE IN THE GREAT LAKES
NATIONAL PROGRAMS OFFICE
James Giattina, Assistant Director
Great Lakes National Programs Office
U.S. Environmental Protection Agency
Chicago, Illinois
What I'd like to do this morning, is give you, as Harvey did, a general overview of
how we see quality assurance fitting into the programme activities in the future. And I'm
going to do that in just about twenty minutes. In 1978, the Great Lakes National Programme
Office was created to fulfill the United States obligation under the Great Lakes Water
Quality Agreement. In 1987, the programme got a tremendous boost from Congress: under
the Clean Water Act, they basically recognized the Great Lakes as a valuable national
resource and mandated GLNPO as responsible for coordinating the United States response
under the agreement.
These are exciting times in the Great Lakes Basin. Longstanding grass-roots public
support for environmental action to address issues such as fish contamination and its
potential effects on humans (in particular, pregnant and nursing mothers and their offspring,
recreational fishermen, and subsistence fishermen, particularly Native Americans and the
poor), together with concerns for environmental equity, has combined with political support
in Congress and with the Administration.
As a result, we are getting both financial support and political attention at an
unprecedented level. This, of course, also carries with it lots of responsibility, an emphasis
on accountability and what I believe is a very narrow window of opportunity to establish a
solid foundation/infrastructure needed to achieve our long-term environmental objectives.
If you accept the premise that we must address the key environmental issues both
globally and in subecosystems such as the lakes, you must also accept the fact that we can't
attempt to do the job alone and we can't rely solely on environmental statutes as a basis for
action.
The ultimate objective of the Great Lakes National Program Office is to promote the
protection and restoration of the chemical, physical and biological integrity of the Great
Lakes ecosystem.
Recognizing this fact, we convened 18 federal, state and tribal agencies and, this past
year, we have hammered out a set of environmental goals and specific steps to be undertaken
by the participating agencies to move us toward achieving those goals.
What is important here is that this activity sets a precedent for the Great Lakes basin
within the United States in that it is the first time this breadth of players responsible for
7
-------
resource management in the basin have come to the table together. The environmental
vision is a joint vision.
The vision is by necessity broad, eliminating the release of toxics, particularly
persistent, bioaccumulative toxics to eliminate concerns for human health and wildlife;
protecting and restoring critical and important habitats to ensure the survival of important
species; protecting human health from non-toxic substances or pathogens and restoring
biological diversity of the region and recognizing our joint responsibility to foster our
partnership in order to achieve the above goals by rebuilding our infrastructure for research
and monitoring, and information management.
Information is the common currency that will bind these organizations together, and
that will bind our two countries together in our efforts to improve and protect the
environment.
As far as monitoring goals, GLNPO's vision, within the EPA, is a shift from a purely
regulatory emphasis to an ecosystem emphasis, one which places our regulatory
capabilities and responsibilities in a richer context.
The ecosystem approach focuses on our ability to apply research and monitoring in
assessment, diagnosis and implementation of activities to improve the environment. That
approach is a very tall order. Data come out every day on different environmental problems
in the lakes. Every one of them is of major concern to someone, and rightfully so, for every
individual, every ecosystem, has a right to a healthy existence. The bottom line is that we
can't address every concern nor can we be sure every concern is real.
And so it comes down to managers, who must choose, from a myriad of
environmental problems, the ones we feel are the most important to remedy. Our choices
must be educated, because every choice has an associated risk. We're required to assess
these risks and make appropriate decisions. What we as managers hope for are ways to
minimize our risks of making a wrong decision and maximize our environmental successes.
So what is required to make appropriate decisions? Obviously we need appropriate
data. These are data that, when evaluated, provide the manager with enough certainty that
he or she is willing to risk making an inappropriate decision. This is the basis for quality
assurance - assuring management that the risk is worth taking.
Every day we weigh risks and make countless decisions based on data of perceived
quality. Within the Great Lakes community, we make decisions on whether or not to eat
fish based upon advisories. Each person determines the quality of the data and acts
accordingly. Some decide the risk is too great to eat any fish from the Great Lakes, others
consume within prescribed limits and others ignore the information altogether.
This example serves to illustrate that the decisions we make every day are based on
our perception of the quality of data and our choice to take an action that may be
appropriate.
8
-------
Managers responsible for environmental programs of the magnitude of the Great
Lakes face very difficult decisions. Why are the decisions difficult? In the times we're
living in, our environmental decisions have economic implications. The state of the world
economy and the moral need to continue to foster economic growth in an environmentally
sustainable manner demands that we make sound decisions, that we make sound choices to
protect the environment based on facts with measurable certainty. Another complicating
factor is our subject, the environment. We are still learning how the Great Lakes ecosystem
functions and the interrelated and interdependent factors that govern the ecological health of
this system. Aldo Leopold said that the outstanding scientific discovery of the twentieth
century is not television or radio, but rather the complexity of the land organism. Only those
who know the most about it can appreciate how little is known about it. This statement is
perhaps even more true for an ecosystem as complex as the Great Lakes.
To complicate this, science does not always provide one answer. There are many
conflicting journal articles to prove this. Therefore, our decisions tend to be those that are
the most appropriate with the knowledge we have at the time. Managers must rely on others
for data. This complicates the decision-making process because often, by the time
information is received by the manager, data quality attributes are non-existent. We rely on
others to provide us with data of a quality they perceive as adequate. The great equalizer
for decision-making is the quality of the data on which those decisions are based.
Congressman G.E. Brown said that no number is significant, and subsequently worthy of
being recorded, without an estimate of its uncertainty. A good manager needs to know the
uncertainty to determine the potential risk for making an inappropriate decision. Therefore,
data quality attributes that are associated with data are necessary for any educated ecological
management decision.
In many studies, we run into the problem of determining the data quality after the
study is over and having to admit that it's not adequate for making a decision. In a world
where we are facing a multitude of environmental problems with less than adequate funding
to address them, the "ready, shoot, aim" paradigm will not work. We have to manage and
plan our environmental data collection and implementation activities more appropriately. If
we as managers make a decision to fund project A and not B, we better bear the
responsibility for ensuring the success of A. My presentation this morning is really a
challenge to environmental managers. This success starts with planning and communication.
It's been stated that managers often don't know what they want. "Bring me a rock, and I'll
know it's the right rock when I see it". In reality, I think that management knows what it
wants, but needs some assistance in the development of programme objectives that are
technically sound and cost-efficient. Therefore, management needs experts to ask the
question "Why do you need a rock,how big, what colour, and what kind?" The
communication loop between the expert and manager is essential. Management must be able
to convey the decisions they need to make with data and the risks they want to take.
Managers must allow the technical personnel to be part of this decision-making process, and
therefore a participant in the risks as well as the successes.
In order for this to happen, managers must perceive themselves not only as customers
but also as suppliers. We need to be able to articulate our programmatic needs. To do this,
9
-------
we must first be cognizant of our overall goal (the overall needs based on the Water Quality
Agreement) and then be able to articulate how an individual program helps us achieve the
goal.
This reminds me of a basic surveying course in college where you were given two
benchmarks, one to start on and one on which to end. The overall goal was to determine
the difference in elevation between the benchmarks within a certain allowable error. Since
you could not directly see benchmark 2, a number of intermediate segments had to be
measured. The fewer segments you surveyed, the lower your risk of a transit set-up error.
However, fewer set-ups meant a greater measuring distance which increased your risk of a
measurement error. By evaluating distance prior to beginning the operation, an appropriate
implementation plan could be developed optimizing distance and set-up.
To put this example into the context of environmental programmes, the present
condition of the Great Lakes is benchmark 1 and the Water Quality Agreement, what we are
aiming for, is benchmark 2. In order to get from where we are to where we want to go, we
must carefully evaluate the landscape from here to there and we must optimize the
environmental programs in light of our ultimate goal as we move from benchmark 1 to
benchmark 2. In order to do this, we need to build a strong communication link between
managers and their experts.
We can't afford to simply ask for the rock or point to the Agreement and say "make
it so". We must ask ourselves the following questions:
What is our vision for the future, a 10 to 50 year time frame?
What are our short-term, measurable objectives to move us toward that vision?
What are our strategies for achieving those objectives?
What information do we need to assure us that our strategies are effective?
What degree of certainty do we need in this information and how much are we
willing to pay for that certainty?
In order to answer these questions, we as managers need to work effectively with a
broad range of expertise. In particular, we need social, economic and scientific expertise,
statistical expertise and quality assurance expertise. The manager and experts lead the
planning effort while the statistician and the QA representative keep them honest. The up-
front interaction of these players would lead to a greater understanding of each other's need
and greatly improve the chances of a successful programme.
Through communication and reiteration, the manager and the experts come to a
mutual understanding of the programme, the decisions that will be made with the data, and
the quality of data needed in order for the manager to risk making an inappropriate decision.
We managers must set better objectives and we need to ask better questions.
Through this process, we may very well recognize that we don't have the money to
support a monitoring programme that will provide a reliable answer within the stated error
and confidence intervals. I may have to allow for more uncertainty than I really wanted.
10
-------
However, the important part is that I, as an environmental manager, must understand my
options before the project is implemented.
Both Canada and the United States are embarking on cooperative programmes such
as the Integrated Atmospheric Deposition Network and others. These programmes will, in
the near future, be extensive and multifaceted, including research and monitoring. My hope
is that this workshop can develop strong QA concepts and guidance that will help managers
understand the roles they need to play in this process. Managers are on the front lines. We
need to be assured that the decisions we make are justified. With continued work on this
generic QA Programme Plan, we may be setting the stage for bi-national agreement on the
basic needs for data collection activities in the Great Lakes.
11
-------
FIFTH ANNUAL ECOLOGICAL QUALITY ASSURANCE WORKSHOP
PLENARY SESSION
Wednesday, October 14, 1992
-------
EMAP ON THE GREAT LAKES
QUALITY ASSURANCE PROGRAM PLANNING:
CURRENT EPA THOUGHTS AND ISSUES
Linda Kirkland
Acting EMAP QA Coordinator
U.S. Environmental Protection Agency
Washington, D.C.
ABSTRACT
The Environmental Monitoring and Assessment Program (EMAP) is being designed
to provide annual statistical summaries and periodic interpretive reports on ecological status
and trends to resource managers and the public. EMAP's Quality System is being developed
in a phased manner and will focus on planning of data use and evaluation in its current
research and future monitoring. A systematic approach to quality management is
emphasized, stressing the whole's relationship to the parts. This presentation will outline
the current thoughts and issues for EMAP's Quality System with examples from the Great
Lakes Resource Group.
OVERVIEW OF EMAP - GREAT LAKES
In 1988, the U.S. Environmental Protection Agency's (EPA) Science Advisory Board
(EPA, 1988) recommended implementation of a program within the EPA to monitor
ecological status and trends, and to develop innovative methods for anticipating emerging
environmental problems before they reach crisis proportions. The Environmental Monitoring
and Assessment Program (EMAP) is part of the EPA Office of Research and Development's
response to the Science Advisory Board's recommendations.
These objectives pose a challenge that cannot be met without 1) commitment to
environmental monitoring, research, and assessment on long-term regional and national scales
and 2) experience and expertise of other environmental management organizations
conducting more localized monitoring. Good coordination in planning is essential to
optimize the usefulness of EMAP to other levels of monitoring (Figure 1). The EMAP-Great
Lakes Monitoring and Research Strategy (Hedtke, et al., 1992) is exemplary in emphasizing
good coordination both within EMAP and with external users such as the scientists
implementing the Great Lakes International Surveillance Plan. Workshops, like this one, are
also a valuable mechanism for communication and coordination.
EMAP is utilizing a probability design for its sampling program to allow
determination of the status of resources, or condition, with known statistical confidence. The
basic framework is derived from a 12,600 point hexagonal grid placed randomly over the
conterminous United States. Those points that fall on the Great Lakes are considered the
15
-------
base grid locations. However, an important feature of EMAP is that each resource group
(e.g., Great Lakes) has defined a series of classes within the resource category. The
sampling design then can be modified to account for the different level of homogeneity
within those classes. EMAP-Great Lakes has selected four classes: offshore, nearshore,
harbors and embayments, and coastal wetlands.
Although the design for each class will remain probability based, various options are
being explored for selecting sites and the necessary sampling frequency to characterize
condition (Figure 2). Condition is characterized by a suite of biological, chemical, and
physical measurements providing information to be aggregated to estimate condition of the
class or lake. Another important aspect is the temporal and spatial interpenetrating nature
of the site selection and field visits. In order to cost-effectively sample the resource with
sufficient spatial and temporal coverage, only one-fourth of the sites will be visited each
year. Early design questions include investigations on the adequacy of the various options
to characterize condition with the desired degree of confidence.
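To make the interpenetrating visit schedule concrete, the sketch below (Python) shows one way the probability-selected sites could be dealt into rotating annual panels so that one-fourth of the sites are visited each year. The site identifiers, the 16-site pool and the four-panel cycle are purely hypothetical illustrations, not EMAP design parameters.

```python
import random

def assign_rotating_panels(site_ids, n_panels=4, seed=0):
    """Deal probability-selected sites into interpenetrating annual panels;
    with n_panels=4, one-fourth of the sites are visited in each year of
    the cycle."""
    rng = random.Random(seed)
    sites = list(site_ids)
    rng.shuffle(sites)
    # Round-robin assignment keeps panel sizes as equal as possible.
    return {year: sites[year::n_panels] for year in range(n_panels)}

# Hypothetical example: 16 nearshore grid points sampled over a 4-year cycle.
example_sites = [f"LM-NS-{i:02d}" for i in range(1, 17)]
for year, panel in assign_rotating_panels(example_sites).items():
    print(f"Cycle year {year + 1}: {panel}")
```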
[Figure 1: Concept of the EMAP Tiered Approach as Applied to EMAP - Great Lakes.
The tiers range from localized monitoring (e.g., the NOAA Sediment Trap Program) and
representative samples of the 43 Areas of Concern in the Great Lakes, through each of the
four resource classes within each Great Lake (offshore, nearshore, harbors and embayments,
and coastal wetlands), to each of the five Great Lakes (Michigan, Superior, Ontario, Huron
and Erie).]
16
-------
[Figure 2: Eighty-five meter depth contour for Lake Michigan, with the EMAP base grid
for the offshore zone and a 3-fold grid for the nearshore zone.]
Sampling at specific sites or areas for characterization and estimation of condition is
beyond the scope of EMAP. However, integration of EMAP monitoring with more localized
sampling can maximize monitoring efforts by providing background for areas of concern
such as Remedial Action Plans and ecological research. Figure 3 shows a model (EPA,
1992) for how this might occur.
[Figure 3: Lakewide management planning model (EPA, 1992). The model asks whether
ecological stability is impaired or threatened; if not, ecosystem response and source
reductions are monitored; if so, critical pollutants are determined, loads are quantified, load
reduction targets are established, and prevention, reduction and remediation activities are
identified, drawing on regional EMAP, IADN, lake surveys and lake monitoring.]
17
-------
EMAP is being designed with a "top down" approach in a risk assessment framework
focussing on the endpoints of concern rather than the environmental perturbations or
stressors. Its indicators must relate to the assessments of endpoints of biotic integrity and
trophic status including:
Response Indicator: A characteristic of the environment measured to provide
evidence of the biological condition of a resource at the organism, population, community,
or ecosystem level of organization.
Stressor Indicator: A characteristic measured to quantify a natural process, an
environmental hazard, or a management action that affects changes in exposure and habitat.
Exposure Indicator: A characteristic of the environment measured to provide
evidence of the occurrence or magnitude of a response indicator's contact with a chemical
or biological stressor.
Habitat Indicator: A physical, chemical, or biological attribute measured to
characterize conditions necessary to support an organism, population, or community in the
absence of pollutants.
Indicators are identified through the development of conceptual models of ecosystems.
These models may be based primarily on how current and anticipated stresses affect
ecosystems, or from a perspective of the structural, functional, and recuperative features of
ecosystems. The research aspect of EMAP involves the long term process of selecting (i.e.,
joint U.S.-Canada work group) and developing ecological indicators as shown in Figure 4.
[Figure 4: General Approach for Selection and Development of Indicators for EMAP.
Candidate indicators are identified and prioritized (expert knowledge, literature review, peer
review) and their expected performance is evaluated; research indicators are examined
through analysis of existing data and limited field tests; developmental indicators have their
actual performance evaluated in regional demonstration projects; core indicators are then
implemented in regional and national monitoring, with periodic reevaluation.]
18
-------
In addition to determining the ecological condition of the Great Lakes, EMAP will
determine statistical associations between the occurrence of poor conditions as defined by
response indicators and values for the exposure, habitat, and stressor indicators.
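As one simple, hedged illustration of such an association, the sketch below (Python) cross-classifies sites as poor or nominal condition against high or low values of a stressor indicator and summarizes the table with an odds ratio. The two-by-two formulation, the thresholds implied by it and the counts are hypothetical and are not an EMAP analysis plan.

```python
import math

def odds_ratio(poor_high, poor_low, nominal_high, nominal_low):
    """Odds ratio relating poor condition (response indicator) to a high
    stressor-indicator value; a 0.5 continuity correction guards against
    zero cells, and an approximate 95% interval is returned for the ratio."""
    a, b, c, d = (x + 0.5 for x in (poor_high, poor_low, nominal_high, nominal_low))
    ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of the log odds ratio
    lower = math.exp(math.log(ratio) - 1.96 * se)
    upper = math.exp(math.log(ratio) + 1.96 * se)
    return ratio, (lower, upper)

# Hypothetical site counts cross-classified by condition and stressor level.
ratio, ci = odds_ratio(poor_high=18, poor_low=7, nominal_high=12, nominal_low=33)
print(f"Odds ratio: {ratio:.1f}, 95% CI: {ci[0]:.1f} to {ci[1]:.1f}")
```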
An essential complement to the careful design and indicator strategies discussed
above is planning for data quality and evaluating the progress made in obtaining it. The
decentralized Resource Groups, such as EMAP-Great Lakes, will maintain data bases for raw
data and data aggregated by location. A proof of concept pilot is being developed for the
centralized EMAP Center database for aggregated summary data. These databases will
support the production of EMAP's products, the annual statistical reports and periodic
interpretive reports (Figure 5).
[Figure 5: EMAP Architecture. A central EMAP relational database (RDBMS), GIS and
central repository are linked with the task group RDBMS, GIS data and SAS files maintained
by the individual task groups.]
19
-------
EMAP'S QUALITY MANAGEMENT PLANNING FOCUS
Phased Implementation of the EMAP Quality Assurance Program
A Quality Management Plan for EMAP has been prepared and is currently being
reviewed internally. This QA Program blueprint emphasizes planning for, and assessment
of, data quality as the focus for the QA design. Reports from the assessments are related
to control systems for EMAP's data collection operations and their procurements. In this
manner good communication with EMAP's data suppliers in terms of providing
specifications and feedback is emphasized.
Another aspect of EMAP's Quality System is annual reporting of the status of the QA
program to top management and planning of the next year's activities in concert with the
annual budget cycle. The EMAP QA Coordinator integrates the QA elements of the central
planning of the technical coordinators and the seven resource groups in this planning effort.
This emphasizes coordination between different resource groups as well as assessment of
needs of the whole. For example, in the next year the development of guidance on
preparation of QA project plans and proof of concept testing of data quality planning and
assessment procedures have high priority.
Planning for Data Quality Objectives and Indicators
EPA's management accountability stresses producing data of the type and quality
needed for their intended use, which for EMAP will include multiple levels (Figure 6).
Therefore, a systematic process is needed to define data use, such as answering research
questions on indicators performance or supporting defensible resource trend estimates, and
to plan for data quality indicators to assure the data are adequate for that use. EMAP is
currently adapting the EPA Data Quality Objectives (DQO) Process to its needs on three
levels. The first is the program level where target DQOs are being developed (i.e., for
trends: "Detect a 2% per year change, over a decade, with an alpha = 0.2 and beta = 0.3
(0.7 power), in a response indicator, for a resource class (population), on a regional scale").
This provides a context for developing data quality objectives for second level or research
questions and hypotheses at the indicator level (i.e., "The EMAP-Resource Group requires
75% certainty that a benthic index of condition distinguishes between nominal and
subnominal condition correctly for at least 80% of sites throughout a region"). This
uncertainty, establishing bounds for acceptable total error in the indicator, provides a context
for the third level of measurement quality objectives (i.e., "benthic organism counting bias
should not exceed 10%"). Method selection must be done to best accommodate the error
rates allowable after consideration of those from other sources such as sampling design and
statistical data analysis.
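As an illustration of how such a target DQO might be checked before committing to a design, the sketch below (Python) uses a simple Monte Carlo simulation to estimate the power of a one-sided trend test against the example objective of detecting a 2% per year change over a decade at alpha = 0.2. The site count, lognormal noise model and coefficient of variation are assumptions chosen only for illustration, not EMAP parameters.

```python
import numpy as np
from scipy.stats import linregress

def trend_detection_power(change_per_year=0.02, years=10, n_sites=30, cv=0.25,
                          alpha=0.2, n_sim=2000, seed=1):
    """Monte Carlo estimate of the power to detect a sustained increase in the
    regional mean of a response indicator, using a one-sided test on the slope
    of a log-linear regression of the annual estimates against time."""
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(np.log(1.0 + cv ** 2))       # lognormal shape for the given CV
    t = np.arange(years)
    detections = 0
    for _ in range(n_sim):
        true_mean = 100.0 * (1.0 + change_per_year) ** t
        # Annual regional estimate: mean of n_sites noisy site-level values.
        yearly = [rng.lognormal(np.log(m) - sigma ** 2 / 2, sigma, n_sites).mean()
                  for m in true_mean]
        fit = linregress(t, np.log(yearly))
        if fit.slope > 0 and fit.pvalue / 2 < alpha:   # one-sided test at alpha
            detections += 1
    return detections / n_sim

print(f"Estimated power: {trend_detection_power():.2f} (DQO example target: 0.7)")
```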
20
-------
[Figure 6: Relationship of Classes of EMAP Users and Data. A tiered diagram relating user
classes (the public, public officials and environmental managers; EMAP assessment and
environmental decision officials; regional and state regulators and scientists; regional
logistics and field staff; task group senior scientists) to the field, regional and task group
information centers, with the amount of data increasing toward the field level and the
number of users increasing toward the public.]
Development of Procedures for Data Quality Assessment
Quality Assurance design is underway to connect planning for data quality with
assessment of data quality. Data validation and verification at the Resource Group level will
be largely automated and must be coordinated with the information management system
design. For example, quality control data from check samples, blanks, or replicate samples
will be used to determine if there are any problems with sample collection or measurement.
Also, internal consistency checks will be used, such as summing species proportions to
ensure they do not exceed 100%. Validation and verification for summary statistics like
indices must also be planned and coordinated with design of the EMAP Center data base.
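A minimal sketch of such automated checks is given below (Python). The blank, replicate and species-proportion records, the 30% relative-percent-difference limit and the rounding tolerance are hypothetical values chosen only to illustrate the logic, not prescribed EMAP acceptance criteria.

```python
def check_blank(blank_value, detection_limit):
    """Flag a field or laboratory blank that exceeds the method detection limit."""
    return "ok" if blank_value <= detection_limit else "possible contamination"

def check_replicates(rep_a, rep_b, max_rpd=30.0):
    """Flag a replicate pair whose relative percent difference exceeds a limit."""
    rpd = 100.0 * abs(rep_a - rep_b) / ((rep_a + rep_b) / 2.0)
    return "ok" if rpd <= max_rpd else f"high RPD ({rpd:.1f}%)"

def check_species_proportions(proportions, tolerance=0.5):
    """Internal consistency check: species proportions at a site should not sum
    to more than 100% (a small tolerance allows for rounding)."""
    total = sum(proportions)
    return "ok" if total <= 100.0 + tolerance else f"proportions sum to {total:.1f}%"

# Hypothetical records illustrating each check.
print(check_blank(blank_value=0.8, detection_limit=0.5))
print(check_replicates(rep_a=12.0, rep_b=19.0))
print(check_species_proportions([42.0, 31.5, 30.0]))
```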
EMAP will be developing a pilot project to test a proof of concept of Data Quality
Objectives for EMAP to determine if the results can be used to develop procedures for
assessing data quality for adequacy in its intended use (i.e., validation of an ecological index
used in determining status and trends). EMAP's summary statistics, such as ecological
indices, are highly compiled and related to landscape characteristics in time and space. A
vision of full implementation is needed to focus planning for research on indicators and
design testing particularly for integration. Also, EMAP needs to identify mechanisms to
21
-------
caution future non-EMAP data users on the limitations of the data to avoid inappropriate use
of the data.
The EMAP Center data base will need procedures for storing QA/QC information or
meta-data and assessment of data quality for verification of summary statistics (indices)
validated by Resource Group suppliers. Decisions on what QA/QC information to keep and
how to use it (i.e., a QA Design) are needed. The design for the EMAP Center data base
needs to be compatible with Resource Group data bases in terms of QA/QC information
access and procedures for data validation and verification. Also, data need to have
information associated with them to caution users when they may be violating EMAP design
constraints (i.e., number of samples too low to provide a reliable index of status).
Developing an Infrastructure for Management Control
Periodic reports on findings of data quality audits and assessments need to be
planned. These reports will become part of the feedback to EMAP managers on the
effectiveness of their data collection operations. These reports will assist management's
control of data acquisitions and provide facts to support continuous improvement in EMAP
at both the Resource Group and EMAP Center levels. Roles and responsibilities for
oversight and corrective action within Resource Groups and EMAP Center are being refined
in this area.
CONCLUSION
From this brief overview of EMAP and its activities in the Great Lakes, it is readily
apparent how important the identification of future data users and their intended uses of
EMAP data are to the success of EMAP. We need to develop a vision of success for
EMAP's part (i.e., in Great Lakes monitoring) in order to plan for it. This includes defining
EMAP's data uses in its methods research and its program outputs, annual statistical
summaries and periodic interpretive reports, and development of indicators of data quality
to ensure data adequate for those uses. These data uses and quality indicators provide the
basis for the QA design needed for control of EMAP's data collection operations. The QA
design is important input into the design of the information management architecture and
procedures needed to carry out assessments of data quality and support reports to EMAP
managers.
22
-------
REFERENCES
EPA (U. S. Environmental Protection Agency). 1988. Future Risk: Research Strategies of
the 1990's. SAB-EC-88-040. U.S. Environmental Protection Agency, Science Advisory
Board. Washington D.C.
EPA (U.S. Environmental Protection Agency). 1992. Lakewide Management Planning
Process. Region 5 Water Division, LaMPOST, FY92, 3rd Quarter.
Hedtke, S., A. Pilli, D. Dolan, G. McRae, B. Goodno, R. Kreis, G. Warren, D. Swackhamer,
and M. Henry. 1992. EMAP-Great Lakes Monitoring and Research Strategy. U.S.
EPA Environmental Research Laboratory-Duluth, Duluth, Minnesota.
23
-------
THE NIAGARA RIVER TOXICS MANAGEMENT PLAN -
QUALITY PLANNING CONSIDERATIONS
PAST, CURRENT AND FUTURE -
FIELD SAMPLING DESIGN AND
QUALITY CONTROL PROCEDURES
Francis Philbert and Ken Kuntz
Environmental Quality Branch
Inland Waters Directorate
Burlington, Ontario
ABSTRACT
A water quality monitoring program has been established by Environment Canada to
measure nutrient, major ion and contaminant concentrations and loads in the Niagara River.
A station established at Niagara-on-the-Lake in 1975 and another station at Fort Erie
established in 1983 are used to measure the upstream and downstream differences in the
river. Since 1986, weekly, 24-hour, time-integrated water and suspended sediment samples
have been collected at both stations. The water is pumped from the river and passed through
a continuous-flow centrifuge to collect the suspended solids. The centrifugate is extracted
for organics with dichloromethane in a Goulden Large Sample Extractor (GLSE) on site.
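The station comparison described above ultimately rests on converting measured concentrations and river flows into loads; the short sketch below (Python) illustrates that arithmetic. The concentration and flow values, and the resulting station difference, are hypothetical numbers used only to show the unit conversion, not results from the monitoring record.

```python
def daily_load_kg(concentration_ng_per_L, flow_m3_per_s):
    """Daily contaminant load (kg/day) from a time-integrated concentration
    (ng/L) and the mean river flow (m^3/s)."""
    litres_per_day = flow_m3_per_s * 1000.0 * 86400.0
    return concentration_ng_per_L * litres_per_day * 1.0e-12   # ng -> kg

# Hypothetical weekly values for an organic contaminant at the two stations.
load_fe = daily_load_kg(concentration_ng_per_L=0.8, flow_m3_per_s=5800.0)    # Fort Erie
load_notl = daily_load_kg(concentration_ng_per_L=1.3, flow_m3_per_s=5800.0)  # Niagara-on-the-Lake
print(f"Upstream/downstream load difference: {load_notl - load_fe:.2f} kg/day")
```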
An ongoing quality control program has been in place since 1975, but was
considerably expanded in 1986 with the signing of a Four Party Statement of Intent. The
following components of this QC program will be described: sample preservation, blanks,
duplicates, surrogate spikes, field audits, lab audits, practical detection limits, sample
program design, field sampling protocol and lab analytical protocol. All aspects of this
QA/QC program have been examined in detail and have been agreed to by the four agencies
involved, i.e., Ontario Ministry of Environment and Energy (MOEE), New York State
Department of Environmental Conservation (NYSDEC), Environment Canada (EC) and the
United States Environmental Protection Agency (U.S. EPA).
INTRODUCTION
The Niagara River is the main drainage system for the three Upper Great Lakes
(Superior, Huron and Michigan) and Lake Erie into Lake Ontario. The approximately 58
km river is a source of municipal drinking water to over 930,000 people in Canada and the
United States, including the City of Buffalo. It provides about 83% of the total tributary
flow to Lake Ontario and thus has a significant impact on the quality of the lake which,
itself, among other things, is the source of drinking water for approximately 3.8 million
25
-------
Canadians and about 805,000 Americans. The river itself is a major source of water for
industry, municipalities, commerce, power generation, recreation and tourism.
THE NIAGARA RIVER POLLUTION PROBLEM
The abundance and availability of the Niagara River water for both
municipal/domestic uses and as a source of inexpensive hydroelectric power, led to the
extensive industrialization of the area surrounding the river. This in turn resulted in the
classic case of resource use conflict in which the river became the receptor of inordinate
amounts of pollutants originating primarily from a massive complex of chemical, steel and
petrochemical plants and municipal outfalls along its banks. The extent and severity of the
Niagara River pollution problem and its attendant social and economic implications have
been an issue of major public concern during the past decade or so. The river has been
listed by the International Joint Commission (IJC) as an Area of Concern since 1974.
Accordingly, governments and industries have been spending several million dollars in
implementing cleanup programs, determining the effectiveness of cleanup programs, and
identifying additional contamination sources needing action.
MANAGEMENT FRAMEWORK
A number of constitutional, institutional and jurisdictional factors make the
management of the Niagara River an involved and complicated matter. Nevertheless, Canada
and the United States have succeeded in developing and implementing a management plan
for the river, which could only be described as an exemplary cooperative approach to a
major international environmental problem.
Over the past several decades, and particularly since the mid 1940's, the Niagara
River pollution problem has been intently studied and the principal jurisdictions have been
attempting, both unilaterally through their own regulatory programs and legislation, and on
a joint international basis, to address the issue. In January 1981, the IJC issued a special
report (International Joint Commission, 1981) on pollution in the Niagara River. The
Commission made a number of specific recommendations which, in part, called for:
1. the undertaking of a comprehensive and coordinated study of the river, including the
identification of sources, concentrations, fate and probable effects of all detected
organic compounds and metals so as to permit assessment of the problem and to
implement the required remedial or preventative action on a common basis;
2. that a comprehensive and continuing monitoring program for the entire Niagara River
and western end of Lake Ontario be developed and maintained, coordinated and
supported by all relevant jurisdictions.
26
-------
The first comprehensive integrated assessment of toxic chemicals in the Niagara River
was undertaken in 1981, by the Niagara River Toxics Committee (NRTC), a four party
committee consisting of representatives from Environment Canada, the Ontario Ministry of
Environment and Energy, the U.S. Environmental Protection Agency (US EPA), and the
New York State Department of Environmental Conservation. The NRTC goal was to
determine what chemicals were in the river, identify their sources, recommend control
measures and establish a procedure to monitor the effectiveness of those control measures.
The results of the NRTC three-year study were presented in a comprehensive report
(Niagara River Toxics Committee, 1984) completed in October 1984. The report contained
a number of recommendations directed to the four parties. In a series of meetings between
the Environment Canada Minister and the U.S. EPA Administrator, Canada and the U.S.
reaffirmed their commitment to developing and implementing an action plan for the Niagara
River. Several months of intensive work and negotiations led to the development and
adoption by the four agencies of a Niagara River Toxics Management Plan (NRTMP)
consisting of a detailed work plan which is updated regularly, and a Declaration of Intent
signed by the four parties on February 4,1987 (The Niagara River Toxics Management Plan,
1990). The Declaration of Intent (DOI) formalizes the plan and is a commitment by the
parties to use the full powers of their domestic laws and regulations to work together to fully
implement the programs and activities outlined in the plan. In a joint communique issued
on May 14, 1986 by the Environment Canada Minister and the US EPA Administrator, it
was stated that "It (the Plan) is the top priority for a cooperative approach to solving a
problem in the Great Lakes Basin ecosystem".
The NRTMP identifies an organization and management/implementation mechanism
for the coordination and evaluation of collective pollution control measures which are
directed to achieving significant reductions of toxic chemical loadings to the River.
Specifically, under the DOI, the goal is to achieve by 1996 a 50% reduction in loadings of
certain persistent toxic chemicals of concern entering the river from point and non-point
sources on both sides.
The Plan was initially comprised of the following eight basic components: (i) river
monitoring, (ii) point sources, (iii) non-point sources, (iv) chemicals of concern, (v) technical
and scientific cooperation, (vi) a communication plan, (vii) organization and implementation,
and (viii) reporting. It calls for a senior level coordination committee to coordinate and
oversee plan implementation. Under the umbrella of the Coordination Committee, a number
of sub-committees have been formed to perform tasks specified in the plan. Naturally, the
technical and scientific data gathering activities relate primarily to the first four components.
It should be noted at this juncture, however, that the NRTC could provide no indication of
the magnitude of toxic loadings to the Niagara River from non-point sources. The NRTMP
identified two activities directed to the estimation of such contributions. Attempts are being
made to derive an initial estimate using ambient river monitoring data from the head and
mouth of the river (input-output differential monitoring) together with updated loading data
from municipal and industrial sources. In addition, pending the development of additional
27
-------
site-specific data and more direct measurements, estimates of potential contaminant loadings
would be derived from existing hydrogeological and contaminant data at the various sites.
DATA QUALITY PLANNING CONSIDERATIONS
The NRTC, in carrying out its study, struck a number of subcommittees to provide
technical and scientific support. One such committee was the Data Quality Subcommittee.
Its primary charge was to investigate the chemical data for the Niagara River Project and
make recommendations to the NRTC on the interpretation and usability of the data.
The DQS, formed in October 1982, submitted its final report to the NRTC in
November 1983; an updated version followed in March 1984. In consideration of
future monitoring and surveillance activities, the DQS made a number of recommendations
which in substance addressed the following points:
i) the need to incorporate a carefully planned quality assurance program in the planning
and implementation of data gathering activities;
ii) definition and communication of project objectives, data and QA requirements to all
concerned;
iii) for multi-component or multi-agency projects, the need to reach agreement at the
planning stages on key aspects of the program, including analytical laboratory
performance criteria, parameters, and desired detection limits;
iv) provisions for contracting out analytical work; and
v) the need for ongoing consultation/communication between laboratory analysts and
project manager/data user.
These recommendations, made almost a decade ago (NRTC Data Quality
Subcommittee, 1984), still remain relevant to the theme of this workshop.
AMBIENT RIVER MONITORING COMPONENT
The rest of my presentation will be focussed on the Ambient River Monitoring
component of the plan which is being coordinated and managed by the River Monitoring
Committee (RMC) formed in 1986. The RMC is ably supported by its three technical work
groups, i.e.,:
Sampling Protocol Group
Analysis Protocol Group
Data Interpretation Group.
The primary charge of the Niagara River Monitoring Committee was:
28
-------
"To develop and implement a river monitoring program that will be capable of
assessing whether the input of toxic chemicals to the river is being reduced,
increased, or kept constant over a five to ten-year period".
An upstream/downstream ambient monitoring program, endorsed by all four parties
and being implemented by Environment Canada, constitutes the basic four-party ambient
river monitoring program. Given the intent to deduce non-point source loading estimates
from the ambient and point source monitoring data, data comparability considerations were
of paramount concern from the beginning. Sampling station (outflow) representativeness has
been confirmed through a special study of the downstream station at Niagara-on-the-Lake.
Based on all available information, the representativeness of the upstream station (inflow)
at Fort Erie has been judged to have been established; nevertheless, a special study to
confirm this is just about to begin.
The development of sampling and analytical procedures, quality assurance/control and
data validation procedures, statistical treatment and interpretation of the data, and data
exchange protocols have all occurred with the participation and consensus of the four parties.
All procedures are fully documented. Field and laboratory operations are scheduled to
undergo technical systems audits biennially.
The project results speak for themselves in that the ambient data is considered to be
one of the best, if not the best, data sets available on any river in the Great Lakes basin and,
perhaps, even beyond.
PROGRAM OBJECTIVES AND SAMPLING DESIGN
The specific objectives of the program are to:
estimate mean concentration and loading of nutrients, major ions, trace metals and
organics at the head and mouth of the Niagara River;
identify those chemicals which change along the river; and
identify changes of various chemicals with time along the river.
The key elements of the sampling design are as follows:
collection of samples at two permanent monitoring stations located at the head and
mouth of the river;
grab samples collected weekly for nutrients, trace metals, etc.;
29
-------
24-hour time integrated collection of water and suspended sediment for organic
contaminant analyses; and
15- to 18-hour time delay between collection of upstream and downstream samples
to simulate the hydrologic regime of the river.
The sampling locations are shown in Figure 1. The upstream station is located on the
Canadian side at Fort Erie approximately 3,000 metres downstream from the head of the
river.
The downstream station is located on the Canadian side at Niagara-on-the-Lake
approximately 1,500 metres from the river mouth. Stations were established at locations that
were expected to provide samples that represented as closely as possible water entering the
river at Fort Erie and leaving the river at Niagara-on-the-Lake. A study was done by Chan
in 1974 (Chan, 1977) to measure cross-stream variability before the station at NOTL was
established. Another study by Seastar in 1988 confirmed that organics also were not
different across the river at NOTL. Similar studies are to be done by NYSDEC at Fort Erie
in 1993-94.
FIELD SAMPLING PROCEDURES
The sampling system at both stations consists of a polyethylene intake line, magnetic
drive submersible pump, Westfalia continuous-flow centrifuge and Goulden Large Sample
Extractor. The intake is positioned in the middle of the water column so as to avoid
sampling bedload and to avoid being a navigational hazard:
at NOTL 5.5 metres above bottom and 60 metres from shore (Figure 2), and
at FE 1 metre above bottom and 30 metres from shore.
The sampling process and schedule for specific chemical groups is as follows:
Nutrients: grab samples collected three times per week.
Major Ions: grab samples once per week.
Trace Metals: grab samples once per week.
Volatiles: grab samples collected outside the trailer, in duplicate, once per week.
Organics: 24-hour continuous extraction of a 50 L water sample on site with the GLSE
(Figure 3).
30
-------
Suspended Solids: collected with a Westfalia continuous-flow centrifuge which has
recovery efficiency of 80 to 90% when operated at flow rate of 6 L/min over 24 hours
(about 8,600 L of water centrifuged); 15- to 18-hour delay between upstream and
downstream sample collection is allowed to simulate river flow through time.
Phenols: 20 L grab samples acidified in lab, before collection.
The overall sampling regime is illustrated in Figure 4.
QA/QC FEATURES OF THE PROGRAM
Bottle/Field Sampling Equipment Cleaning Procedures
A detailed washing protocol is followed for each type of bottle washed. For example,
nutrient sample containers are filled to the rim with acidified (H2SO4) distilled water (DDW),
capped and allowed to soak for at least three days. They are then rinsed three or four times
with DDW. Each type of bottle is used for only one type of analysis, i.e., nutrients, trace
metals, etc., thus minimising chances of cross-contamination from preservatives. Each batch
of bottles is washed, then analyses are run on 5% of the group, picked at random. If cleared,
only then is this batch of bottles approved for field sampling.
Sample/reagent bottles for organics analysis and centrifuge bowls are cleaned with soap
and hot water, then petroleum ether and acetone before use each week. All sampling
equipment is comprised of glass, stainless steel, teflon or polyethylene/polypropylene to
avoid contamination.
Preservatives
Preservatives are made up by the National Laboratory for Environmental Testing
(NLET) of CCIW in Burlington and checked for contamination and only then used by field
staff for preserving samples.
Sample Preservation and Storage
Nutrients are refrigerated in the field, then total phosphorus samples are acid preserved.
Trace metal samples are preserved with nitric acid. Organics are GLSE extracted on site
with dichloromethane (DCM). Phenols are preserved with sulphuric acid. Sample storage
times (e.g., volatiles = 14 days) are not to be exceeded or samples are discarded.
Blanks
Distilled water is added to pre-cleaned sample bottles, preservatives are added and
bottles are stored and run along with regular samples. Blanks are collected at each station,
once per month for nutrients, major ions and trace metals.
31
-------
Duplicates
Duplicates are collected at each station, once per month for nutrients, major ions and
trace metals.
QA/QC Data
Blank and duplicate data for 1989 are illustrated in attached charts for total phosphorus
(TP), nitrate, chloride and sodium.
Solvents Checked
A 300-mL portion of each lot of solvent (DCM) is evaporated to 1 mL using the
Goulden evaporator and injected into a GC to check for interferences. If okay, the lot is
approved and used to extract samples. Each bottle is opened, lot number recorded and then
used for sample extraction. Each week a new bottle is used.
Surrogate Spike Recoveries
Spike recoveries are charted for organics in GLSE and suspended sediments.
Field Spikes
Each sample is spiked with a known quantity of the following compounds in the field:
1,3,5-tribromobenzene
1,2,4,5-tetrabromobenzene
delta BHC
These compounds monitor the extraction, concentration and fractionation steps.
Lab Spikes
Each sample is spiked with a known quantity of 1,3-dibromobenzene in the lab prior
to the concentration step. This compound monitors the volatility losses of the
chlorobenzenes in the concentration step of the analysis.
Each sample is spiked with a known quantity of the following compounds to the non-
polar concentrate:
endrin ketone for fraction A
2,3,5,6-tetrachlorobiphenyl for fraction B
These compounds monitor analytical performance throughout the cleanup steps.
32
-------
Each sample is spiked with octachloronaphthalene to the final 1 mL extract prior to
GC-ECD to monitor system response during the GC steps. Each of these compounds was
carefully chosen so that it falls into a distinct region of the chromatogram, is not generally
found in the environment, is similar to the chemicals being analyzed (so its analytical
behaviour is similar), and is readily available from suppliers.
A couple of recovery control charts with upper and lower (2 SD) warning limits are
attached as examples.
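As an illustration only (this is not part of the agreed NRTMP protocol or software), the warning limits on such a recovery control chart follow directly from the historical spike recoveries. The short Python sketch below assumes the weekly percent recoveries for one surrogate are available as a simple list, and computes the mean +/- 2 standard deviation warning limits and any excursions.

    import statistics

    def recovery_control_limits(recoveries):
        # Warning limits set at the mean +/- 2 standard deviations.
        mean = statistics.mean(recoveries)
        sd = statistics.stdev(recoveries)
        return mean, mean - 2 * sd, mean + 2 * sd

    def flag_excursions(recoveries):
        # Recoveries falling outside the 2 SD warning limits.
        mean, lower, upper = recovery_control_limits(recoveries)
        return [r for r in recoveries if r < lower or r > upper]

    # Hypothetical weekly 1,3,5-tribromobenzene recoveries (percent).
    weekly = [96.0, 88.5, 102.3, 74.1, 91.0, 99.8, 85.2]
    print(recovery_control_limits(weekly))
    print(flag_excursions(weekly))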
Field Sampling Protocol
All methodology, including field sampling, preservation methods, sampling methods,
equipment used, operating procedures, frequency and duration of sampling, storage times and
procedures for updating protocols, are contained in a field sampling protocol document
which has been agreed to by the four parties - NYSDEC, MOEE, U.S. EPA and EC.
Field Audits
Audits are carried out by the four parties to ensure that sampling protocols are being
followed as required.
Lab Analytical Protocol
All steps in the analytical methods are documented and approved by the four parties.
Lab QA/QC procedures are also documented.
Lab Audits
Lab audits are carried out by the four parties to ensure that lab analytical protocols are
being followed.
Performance Evaluation Study
A blind sample was submitted by U.S. EPA for analysis by NLET. It contained known
amounts of various analytes of interest so that method performance could be evaluated.
Proficiency Studies
The capability of labs and individual analysts was established prior to analysis of samples.
Samples were spiked, blanks extracted, etc., to determine analysts' proficiency. These samples
were also used to train new staff.
33
-------
SUMMARY AND CONCLUSION
The adoption and implementation of the NRTMP places a firm commitment on each
of the four principal jurisdictions to follow an agreed upon management strategy for the
effective coordination and evaluation of collective pollution abatement programs and
activities aimed towards the achievement of significantly reduced loadings of toxic chemicals
to the river. As far as project planning and the associated QA/QC are concerned, the
invaluable lessons learned from the NRTC study were applied in the formulation and
implementation of the river monitoring component of the NRTMP. In addition to
commitments at the most senior levels, there was, from the inception, the recognition and
acceptance by management within all four agencies of the basic principles governing the
production of data of the requisite quality for predetermined uses. QA/QC requirements
were incorporated up front at the project planning stage and have been continuing
throughout project implementation. In so doing, laboratory personnel have been playing a
prominent role throughout the overall process. The laboratory staff are involved as a partner
or team member and not merely as the provider of a service. Thus, the Niagara River
ambient monitoring component of the NRTMP is arguably an exemplary international
multi-agency cooperative monitoring project.
REFERENCES
International Joint Commission: 1981, "Special Report (under the 1978 Great Lakes Water
Quality Agreement) on Pollution in the Niagara River".
Niagara River Toxics Committee: 1984, A joint publication of Environment Canada,
United States Environmental Protection Agency, Ontario Ministry of Environment and
Energy, and New York State Department of Environmental Conservation.
Niagara River Toxics Management Plan: 1990 update, A document by Environment Canada,
United States Environmental Protection Agency, New York State Department of
Environmental Conservation, and Ontario Ministry of the Environment.
Niagara River Toxics Committee's Data Quality Subcommittee: 1984, Final Report to the
Niagara River Toxics Committee.
34
-------
Figure 1: Sampling Locations on the Niagara River (scale 1:250,000)
35
-------
Figure 2: Water Delivery System at Niagara-on-the-Lake
-------
Figure 3: Large Volume Counter Current Extractor (GLSE)
37
-------
Figure 4: Niagara River Sampling Regime
38
-------
Control chart: NOTL-Water, 1,3,5-Tribromobenzene, 1987/88/89, mean +/- 2 std. dev. (surrogate recovery vs. sampling date)
-------
Control chart: FE-Water, 1,3,5-Tribromobenzene, 1987/88/89, mean +/- 2 std. dev. (surrogate recovery vs. sampling date)
-------
MISA DATABASE EVALUATION - A ROADMAP FOR THE
INTEGRATION, ASSESSMENT AND CRITICAL REVIEW OF
AN INTERLABORATORY DATABASE
George Steinke
Laboratory Services Branch
Ontario Ministry of Environment and Energy
Rexdale, Ontario
ABSTRACT
From the inception of the MISA (Municipal Industrial Strategy for Abatement) program to
the promulgation of the first effluent monitoring regulation in July of 1988, the Quality
Management Office (QMO) of the Laboratory Services Branch (LSB) of the Ministry of
Environment and Energy (MOEE) has been actively involved in the development of
sampling, analytical and reporting criteria that would support the data quality objectives of
the program.
The QMO has continued to provide support to the MISA program, providing consultation
on quality assurance/quality control (QA/QC) concerns to both ministry and private
stakeholders, as well as laboratory audit services, methods distribution, inter-laboratory
studies, and a window to the analytical expertise of the LSB.
Continuing the tradition of providing services that encourage the improvement of the data
produced by Ontario's private and commercial laboratories, the QMO has undertaken an
intensive evaluation of the QA data contained in the MISA databases.
In June of 1991, the Laboratory Services Branch of the Ministry began to collect MISA
databases in order to extract the wealth of field QA data they contain.
The databases contain the results of the analyses of more than 140,000 QA samples
(travelling blanks, travelling spiked blanks, and field duplicates).
The goal of this project is to summarize in tabular and graphical formats the performance
of 114 laboratories that were involved in the analyses of 245 MISA parameters.
INTRODUCTION
The MISA program is an initiative of the Ministry that was conceived in 1986 and whose
goal, at that time, was the "virtual elimination of toxic contaminants in municipal and
industrial discharges into waterways" (Government of Ontario, 1990).
41
-------
194 Industries were categorized into 9 sectors: Petroleum, Organic Chemical, Inorganic
Chemical, Pulp and Paper, Electric Power Generating, Mining, Metal Casting, Iron and Steel,
and Industrial Minerals.
The 9 sectors completed a year of self-monitoring during which their effluents were analyzed
for up to 254 parameters at frequencies varying from annual to daily.
Field and laboratory QA/QC were built into the MISA program to support the effluent data
being generated (Government of Ontario, 1988). Bench QC requirements included the
analyses of blank, spiked blank and replicate samples at specified frequencies, as well as
requiring the use of standard reference materials (SRM's). Laboratories were required to
establish "reasonable" control limits for the bench QC samples.
The field requirements included the analyses of travelling blank, travelling spiked blank, and
duplicate samples at specified frequencies.
The objective of this report is to provide the groundwork whereby current levels of
laboratory performance can be evaluated. Laboratories will be able to compare their
performance against other laboratories performing the same analyses on similar matrices.
Data users will be able to identify parameters (methods ?) that have chronic (across
laboratories) performance shortcomings.
Future laboratory audits will be more effective, since problem parameters for a laboratory
can be easily identified in advance.
Of course, the validity of the results of this report is dependent on the honesty of the
laboratories that provided the data. Assuming the data are valid, they have a characteristic that
most round-robin data sets lack: namely, the analyses would have been performed by the
same analysts that normally perform the analyses, and probably with the same level of care
with which "routine" samples are treated. The huge workload imposed by MISA on most
laboratories would have made it difficult to set aside the extra time and care usually afforded
round-robin samples, and laboratories had no reason to try to achieve superior results for QA
samples since there was no indication that the results would ever be made public.
Travelling Spiked Blank Samples
A travelling spiked blank sample is a sample of uncontaminated water to which a known
quantity of the target parameters has been added by the laboratory in the 24-hour period
preceding travel. This sample accompanies sample containers from the laboratory to the
sampling site but is not opened at the site (Thorten et al., 1992).
Travelling spiked blank data will indicate whether the recovery of a parameter is influenced
by transport and storage time of the sample. Since sample storage conditions were clearly
defined for MISA, it is reasonable to look across laboratories and evaluate inter-laboratory
differences in analyte recoveries as well as differences in precision.
42
-------
Travelling Blank Samples
A travelling blank sample is a sample of uncontaminated water that accompanies a set of
sample containers from the laboratory to a sampling point where it is opened, preserved,
resealed and returned to the laboratory with a set of samples for analysis (Thorten et al.,
1992).
Travelling blank samples will reveal any field contamination, as well as indicate potential
laboratory (analytical) problems.
Duplicate Samples
A duplicate sample is one of two samples collected at the same time, and at the same
sampling point in a manner that minimizes differences between the samples (Thorten et al.,
1992).
Analysis of duplicate samples allows the evaluation of combined sampling and analytical variability.
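For illustration only (the field names and values below are hypothetical, not MISA database fields), the three QA measures described above reduce to simple calculations once the results are in hand: percent recovery for travelling spiked blanks, the blank result expressed as a multiple of the RMDL, and the duplicate difference expressed in RMDL units. A minimal Python sketch:

    def percent_recovery(found, expected):
        # Travelling spiked blank: recovery of the spiked amount, in percent.
        return 100.0 * found / expected

    def blank_ratio(blank_result, rmdl):
        # Travelling blank: result expressed as a multiple of the RMDL.
        return blank_result / rmdl

    def duplicate_difference(result1, result2, rmdl):
        # Field duplicates: absolute difference expressed in RMDL units.
        return abs(result1 - result2) / rmdl

    # Hypothetical benzene results (ug/L), assuming an RMDL of 0.5 ug/L.
    print(percent_recovery(found=9.3, expected=10.0))   # 93.0
    print(blank_ratio(blank_result=0.7, rmdl=0.5))      # 1.4
    print(duplicate_difference(4.1, 3.6, rmdl=0.5))     # 1.0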
DATA HANDLING
The nine sector databases were received over a period of more than a year, with the last
database (Electric Power Generating) arriving in September of 1993.
Programs were written using PCFocus that extracted the QA data from the databases. The
extracted QA data was combined into 3 sub-databases: MISADUP contains the field
duplicate/replicate data; MISASPK contains the travelling spiked blank data; and MISABLK
contains the travelling blank data.
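The extraction itself was done with the PCFocus programs reproduced in Appendix 1. Purely to illustrate the routing idea, the Python sketch below uses a hypothetical 'qa_type' field to send each QA record to the appropriate sub-database; the field names and values are illustrative and are not the actual MISA database fields.

    from collections import defaultdict

    def split_qa_records(records):
        # Route each record to MISASPK, MISABLK or MISADUP by its QA sample type.
        routing = {"SPIKE": "MISASPK", "BLANK": "MISABLK", "DUP": "MISADUP"}
        out = defaultdict(list)
        for rec in records:
            target = routing.get(rec["qa_type"])
            if target is not None:
                out[target].append(rec)
        return out

    # Hypothetical records; the real databases carry many more fields (see below).
    records = [
        {"qa_type": "SPIKE", "ptest": "BENZ", "found": 9.3, "expected": 10.0},
        {"qa_type": "BLANK", "ptest": "BENZ", "result": 0.7},
        {"qa_type": "DUP", "ptest": "BENZ", "result1": 4.1, "result2": 3.6},
    ]
    print({k: len(v) for k, v in split_qa_records(records).items()})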
43
-------
STRUCTURES OF ORIGINAL AND EXTRACTED DATABASES

ORIGINAL DATABASE STRUCTURE
FIELD NAME     FIELD TYPE
SECTOR         A5
COMPANY        A10
CTRLPT         A4
SDATE          I6YMD
STIME          A4
STYPE          A3
SNUM           A8
SFREQ          A2
PTEST          A6
PUNIT          A3
PSCAL          A3
PREMK          A3
PVAL           P12.3
PMTHD          A6
ADATE          I6YMD
CDATE          I6YMD

MISASPK STRUCTURE
FIELD NAME     FIELD TYPE
SECTOR         A6
PTEST          A6
RMDL           D15.4
PMTHD          A6
COMPID         A14
SDATE          I6YMD
FOCLIST        I5
DAA            D15.4
PREMK          A3
YAA            D15.4
PCTRECOV       D15.4
SPKLEVEL       D15.4
CODE           A6

MISABLK STRUCTURE
FIELD NAME     FIELD TYPE
SECTOR         A6
PTEST          A6
RMDL           D15.4
PMTHD          A6
COMPID         A14
SDATE          I6YMD
FOCLIST        I5
GAA            D15.4
PREMK          A3
GAARATIO       D15.4
CODE           A6

MISADUP STRUCTURE
FIELD NAME     FIELD TYPE
SECTOR         A6
PTEST          A6
PMTHD          A6
COMPID         A14
SDATE          I6YMD
FOCLIST        I5
SAMPLE         D15.4
PREMK1         A3
DUP_REP        D15.4
PREMK2         A3
DUP_DIF        D15.4
AVG_CONC       D15.4
CODE           A6
The original databases were linked to 2 support files. A "company" file contained specifics
about the sampling location, while a "parameter" file contained specifics about each
parameter relevant to the sector.
The original databases required anywhere from about 10 to 40 megabytes of disk storage
capacity. In total, close to 200 megabytes of storage capacity was required to handle all the
databases.
At the project management level several groups took responsibility for the structure of the
various sector databases, and as a result some inconsistencies arose between the databases
and their support files across sectors. In order to allow the merging of QA data from all of
the sectors, it became necessary to standardize the database structure, as well as create a
unique "parameter" support file. The generic support file "Testlist" was created in order to
allow links to be made with sample information across all sectors.
44
-------
STRUCTURES OF SUPPORT FILES

"TESTLIST" SUPPORT FILE STRUCTURE
FIELD NAME     FIELD TYPE
PTEST          A6
ATG            A2
PARAMETER      A30
RMDL           D15.4
RUNIT          A7
CTRLPT         A4
SUBCAT         A2
S_SORT         A1

TYPICAL "COMPANY" SUPPORT FILE STRUCTURE
FIELD NAME     FIELD TYPE
COMPID         A14
NUMBER         I2
NAME           A30
SCLASS         A2
E_TYPE         A2
STREAM         A33
CATEGORY       A1
Examples of inconsistencies between sectors include: missing sector designations; different
field types for the same item; different reporting units for the same parameter; and
classification of the same parameter into different analytical test groups (ATG's). In order
to address these inconsistencies, numerous programs were written that allowed the
standardization of the databases (see Appendix 1: Misaspk.fex).
Once the QA databases had been extracted, programs were written to generate Lotus
compatible spreadsheets (see Appendix 1: Spkstat4.fex, Boxplot.fex). These spreadsheets
were imported into Statgraphics, which was used to produce the graphical summaries.
Confidentiality was ensured by the use of numerical codes in place of laboratory identifiers.
Some 790 graphs were produced. Every attempt was made to condense the huge volume of
data into formats that would allow both an overview of parameter recovery and
laboratory intercomparisons.
45
-------
TABULAR SUMMARIES
Summary statistics for travelling blank and travelling spiked blank data are presented in
tabular form. The tables present the number of observations (N), average (AVG), standard
deviation (S), as well as the quartiles (1st, 2nd, 3rd, 4th), for each parameter Parameters are
grouped into analytical test groups (ATG's) e.g. Metals are in ATG 9.
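As a minimal sketch of how one row of such a table can be produced (assuming the percent recoveries, or blank/RMDL ratios, for a single parameter are available as a plain list; this is not the PCFocus code actually used), with the "4th" quartile reported as the maximum, as in Tables 1 and 2:

    import statistics

    def summarize(values):
        # N, AVG, S, plus the 1st, 2nd and 3rd quartiles and the maximum ('4th').
        q1, q2, q3 = statistics.quantiles(values, n=4)
        return {
            "N": len(values),
            "AVG": round(statistics.mean(values), 1),
            "S": round(statistics.stdev(values), 1),
            "1st": round(q1, 1),
            "2nd": round(q2, 1),
            "3rd": round(q3, 1),
            "4th": round(max(values), 1),
        }

    # Hypothetical zinc spiked-blank recoveries (percent).
    print(summarize([100.0, 104.5, 96.0, 110.0, 122.7, 88.0, 139.0]))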
TABLE 1: SUMMARY STATISTICS FOR TRAVELLING SPIKED BLANK DATA FROM
THE COMBINED MISA DATABASES.
RESULTS ARE EXPRESSED AS PERCENT RECOVERY.

ATG  PARAMETER            N    AVG    S     1st    2nd    3rd    4th
09   Zinc                 45   108.7  24.2  100.0  104.5  110.0  210.0
16   Dichloromethane      436  98.9   55.1  75.0   90.0   108.0  416.0
17   Benzene              530  93.4   26.3  85.0   96.4   105.0  257.3
20   2,4-Dimethyl phenol  279  77.5   57.6  49.0   72.0   90.0   440.0
23   Hexachloroethane     262  84.4   53.9  51.6   76.0   100.0  346.0
TABLE 2: SUMMARY STATISTICS FOR TRAVELLING BLANK DATA FROM THE COMBINED MISA
DATABASES. RESULTS ARE EXPRESSED AS CONCENTRATION DIVIDED BY
THE REGULATION METHOD DETECTION LIMIT (RMDL).

ATG  PARAMETER                    RMDL   RUNIT  N    AVG   S      1st   2nd   3rd    4th
04   Ammonia plus Ammonium        0.25   mg/L   180  1.7   8.60   0.10  0.36  1.08   111.68
09   Aluminum                     30.00  ug/L   163  1.9   2.90   0.43  1.00  1.80   20.57
16   Methylene chloride           1.30   ug/L   109  9.5   25.70  0.92  2.00  6.38   207.69
17   Toluene                      0.50   ug/L   63   8.0   23.40  0.60  1.40  3.94   146.00
19   Bis(2-ethylhexyl) phthalate  2.20   ug/L   63   14.7  22.30  1.14  5.00  16.82  100.00
19   Di-n-butyl phthalate         3.80   ug/L   17   1.1   2.10   0.03  0.16  0.55   7.58
23   Hexachloroethane             0.01   ug/L   23   3.1   2.20   1.20  2.10  4.40   9.60
25   Oil and grease               1.00   mg/L   113  1.5   1.30   0.60  1.00  1.80   7.60
46
-------
GRAPHICAL SUMMARIES
The travelling spiked blank data is summarized as 3-dimensional histograms, xy scatterplots
and notched box-and-whisker diagrams. The travelling blank data is summarized as 3-
dimensional histograms, and notched box-and-whisker diagrams. Duplicate data is
summarized as xy-scatterplots.
3-Dimensional Histograms
Travelling Spiked Blanks:
These graphs summarize the average recovery of a parameter for a laboratory. They allow
for a simple (perhaps over-simplified) means of comparing multiple laboratories and multiple
parameters.
Figure 1: Average recoveries for several laboratories (A to N) for several parameters in
analytical test group (ATG) 16 (Halogenated Volatiles).
Parameters are (from top to bottom): bromomethane; ethylene dibromide; chlorobenzene;
1,4-dichlorobenzene; 1,3-dichlorobenzene; 1,2-dichlorobenzene; vinyl chloride;
trichloroethylene; tetrachloroethylene; trichlorofluoromethane; trans-1,2-dichloroethylene;
dichloromethane; and carbon tetrachloride.
47
-------
Travelling Blanks:
These graphs summarize the average blank levels (concentration/RMDL) a laboratory found
for a given parameter. Only data that was above the laboratory's method detection limit
(LMDL) was used in these summaries.
Figure 2: Average travelling blank levels for several laboratories (A to K) for several
parameters in the Halogenated Volatiles test group (ATG 16).
Parameters are (from left to right): 1,1,2,2-tetrachloroethane; chlorodibromomethane;
chloromethane; chloroform; carbon tetrachloride; dichloromethane; trans-1,2-
dichloroethylene; tetrachloroethylene; trichloroethylene; vinyl chloride; 1,3-
dichlorobenzene; and bromomethane.
48
-------
XY SCATTERPLOTS
Travelling Spiked Blanks:
XY scatterplots summarize the recovery of a parameter over the range of spiking
concentrations used by the laboratories. Outliers have been excluded using a criterion of the
median +/- 1.5 x the interquartile (3rd minus 1st) range at any given spiking concentration.
The vertical axis is expressed as the difference from expected (found-expected) divided by
the RMDL. The horizontal axis represents the spiking concentration divided by the RMDL.
XY scatterplot: Benzene spiked blanks (difference from expected / RMDL vs. spiking concentration / RMDL)
Figure 3: Differences from expected for travelling spiked blank data expressed as average
(dot) and standard deviation (whisker) at each spiking concentration.
49
-------
Field Duplicates:
XY scatterplots summarize the difference between duplicate results across the average
concentration of the duplicate pair. The vertical axis is expressed as duplicate difference
divided by the RMDL. The horizontal axis is expressed as average concentration divided by
the RMDL. The lower graph shows the average duplicate difference in average concentration
groupings of 1 RMDL. A factor may be applied to the regression line that would establish
a control limit for duplicate results.
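A sketch of that idea under stated assumptions: the duplicate pairs for one parameter are given along with the RMDL, pairs are binned in 1-RMDL steps of average concentration, the mean difference per bin is regressed against the bin centre, and an assumed multiplying factor (3 here, purely illustrative) turns the fitted line into a provisional control limit. This is not the procedure actually adopted, only an illustration.

    import statistics
    from collections import defaultdict

    def duplicate_control_limit(pairs, rmdl, factor=3.0):
        # Mean |difference|/RMDL vs. average concentration/RMDL, in 1-RMDL bins,
        # fitted by least squares and scaled by 'factor' to give a limit function.
        bins = defaultdict(list)
        for a, b in pairs:
            avg = (a + b) / 2.0 / rmdl
            diff = abs(a - b) / rmdl
            bins[int(avg)].append(diff)
        xs = [k + 0.5 for k in sorted(bins)]
        ys = [statistics.mean(bins[k]) for k in sorted(bins)]
        slope, intercept = statistics.linear_regression(xs, ys)  # Python 3.10+
        return lambda conc: factor * (slope * conc / rmdl + intercept)

    # Hypothetical benzene duplicate pairs (ug/L), RMDL assumed to be 0.5 ug/L.
    pairs = [(4.1, 3.6), (1.2, 1.0), (9.8, 8.9), (0.6, 0.7), (15.0, 13.2)]
    limit = duplicate_control_limit(pairs, rmdl=0.5)
    print(round(limit(5.0), 2))  # provisional limit (RMDL units) at 5.0 ug/L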
XY scatterplots: Benzene duplicates (upper) and benzene duplicates, smoothed (lower); duplicate difference / RMDL vs. average concentration / RMDL
Figure 4: Field duplicate results. The upper graph shows all the duplicate (difference) data,
while the lower graph shows the average difference at intervals of 1 RMDL with a regression line
added.
50
-------
NOTCHED BOX-AND-WHISKER PLOTS
These plots illustrate the 1st, 2nd (median), and 3rd quartiles of the data (found minus
expected) produced by a given lab (lab_code) for each parameter. The "whiskers" extend to
the furthest data point within 1.5 times the interquartile (3rd minus 1st) range.
The width of the "box" is proportional to the square of the number of data points. The
"notch" represents the 95% confidence limit of the median (Frigge et al, 1989).
Notched box-and-whisker plot: Benzene spiked blanks (found minus expected, by laboratory code)
Figure 5: Notched box and whisker plot of travelling spiked blank data from several
laboratories for benzene.
51
-------
Note that the x-axis (lab_code) identifies not only the laboratory that produced the data, but
also the spiking concentration used by that laboratory e.g. 005_6675, where the first 3 digits
represent the spiking concentration, and the last four digits represent the laboratory.
This format was chosen because a great deal of information can be gleaned from these plots
for each laboratory, as well as allowing relatively easy comparisons between laboratories.
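As a small illustration of the convention (the code itself is hypothetical, not part of the MISA programs), a composite label such as 005_6675 can be split back into its two parts:

    def parse_lab_code(lab_code):
        # Split a composite code like '005_6675' into spiking level and lab id.
        spike_part, lab_part = lab_code.split("_")
        return int(spike_part), lab_part

    # Spiking level 5 (as a multiple of the RMDL, per SPKLEVEL in Appendix 1),
    # laboratory 6675.
    print(parse_lab_code("005_6675"))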
Notched box-and-whisker plot: Aluminum travelling blanks (by laboratory code)
Figure 6: Notched box and whisker plot of travelling blank data from several laboratories for
aluminum.
52
-------
DISCUSSION
The information obtained from these QA data summaries will be valuable as a model of
laboratory performance for a broad range of parameters.
Laboratories that are biased can be identified, as well as parameters that pose problems
across the majority of laboratories.
Questions such as the following can be addressed through the use of these models: Is there
an optimum spiking concentration for parameter X? Is there a consistent bias across
laboratories for parameter X? What level of precision can I expect for parameter X? Is
sample contamination a concern for parameter X?
Much as round-robins promote the improvement of analytical performance, largely due to
a new "awareness" of a laboratory's results relative to those of other laboratories, the
information contained in these QA summaries will allow similar intercomparisons, and
hopefully promote improvements. Intercomparisons will be facilitated by releasing
confidential laboratory codes to the corresponding laboratories.
QA data summaries will assist future audits (as mandated by provincial legislation) of
Ontario's analytical laboratories by providing background information that can identify areas
on which the audits should focus.
Future regulatory programs, such as MISA, may stipulate minimum performance criteria
based on the models derived from these QA summaries.
53
-------
REFERENCES
Frigge, M. et al. 1989. Some implementations of the boxplot. American Statistician 43:50-
54.
Government of Ontario. 1990. A Policy and Program Statement of the Government of
Ontario on Controlling Municipal and Industrial Discharges into Surface Waters.
Queen's Printer for Ontario.
Government of Ontario. 1988. Ontario Regulation 695/88 as Amended to Ontario
Regulation 533/89 under the Environmental Protection Act.
King, D.E. Principles of Control Charting. Ontario Ministry of Environment and Energy,
Laboratory Services Branch, Rexdale.
Miller, J.C. and J.N. Miller. 1992. Statistics for Analytical Chemistry. Second Edition. Ellis
Horwood, West Sussex.
Ontario Ministry of Environment and Energy (MOEE). 1989. MIDES - MISA Data Entry
System, General Users Guide.
Thorten et al. 1992. Report on the Analysis of the Quality Assurance and Quality Control
Data for the Petroleum Refining Sector. Queen's Printer for Ontario.
54
-------
APPENDIX 1
PROGRAM: MISASPK.FEX
-* THIS PROGRAM EXTRACTS THE TRAVELLING SPIKED BLANK RESULTS
-* ('DAA'=FOUND AND 'YAA'=EXPECTED) FROM THE MISA DATABASES, CORRECTING
-* FOR UNIT DISCREPANCIES AND CODING LABORATORIES. PERCENT RECOVERIES
-* ARE CALCULATED. THE OUTPUT IS A FOCUS
-* DATABASE FOR EACH SECTOR, AND THESE EXTRACTED DATABASES
-* CAN BE LINKED AND TREATED AS ONE QA DATABASE.
CLS
-RUN
-TYPE MISASPK IS NOW PROCESSING ...
-SET &OUTFILE = &&SECTOR || 'S' || &&N;
SET MORE=OFF
-*********CORRECT FOR UNIT DISCREPANCIES AMONG SECTORS*********
DEFINE FILE &&SECTOR
YAA1/D15.4=IF (STYPE EQ 'YAA') THEN PVAL
  ELSE IF (SNUM CONTAINS 'YAA') THEN PVAL
  ELSE 0;
DAA1/D15.4=IF (STYPE EQ 'DAA') THEN PVAL
  ELSE IF (SNUM CONTAINS 'DAA') THEN PVAL
  ELSE 0;
YAA/D15.4=IF (ATG EQ '09') AND (PTEST EQ 'FEUT' OR 'MGUT') AND
    (PUNIT EQ '064') THEN YAA1
  ELSE IF (ATG EQ '09') AND (PTEST EQ 'FEUT' OR 'MGUT')
    AND (PUNIT EQ '063') THEN YAA1/1000
  ELSE IF (ATG EQ '09') AND (PTEST NE 'FEUT' OR 'MGUT')
    AND (PUNIT EQ '063') THEN YAA1
  ELSE IF (ATG EQ '09') AND (PTEST NE 'FEUT' OR 'MGUT')
    AND (PUNIT EQ '064') THEN YAA1*1000
  ELSE IF (ATG EQ '10' OR '11' OR '12' OR '13' OR '14' OR '23'
    OR '27' OR '29') AND (PUNIT EQ '063') THEN YAA1
  ELSE IF (ATG EQ '10' OR '11' OR '12' OR '13' OR '14' OR '23'
    OR '29') AND (PUNIT EQ '064') THEN YAA1*1000
  ELSE IF (ATG EQ '15') AND (PUNIT EQ '064')
    THEN YAA1
  ELSE IF (ATG EQ '15') AND (PUNIT EQ '063')
    THEN YAA1/1000
  ELSE IF (ATG EQ '23' OR '27') AND (PUNIT EQ '062')
    THEN YAA1/1000
  ELSE IF (ATG EQ '24') AND (PUNIT EQ '061')
    THEN YAA1/1000
  ELSE IF (ATG EQ '24') AND (PUNIT EQ '062')
    THEN YAA1
  ELSE IF (ATG EQ '24') AND (PUNIT EQ '063')
    THEN YAA1*1000
  ELSE IF (PUNIT EQ '006' OR '007') THEN 0
  ELSE YAA1;
DAA/D15.4=IF (ATG EQ '09') AND (PTEST EQ 'FEUT' OR 'MGUT') AND (PUNIT EQ '064')
    THEN DAA1
  ELSE IF (ATG EQ '09') AND (PTEST EQ 'FEUT' OR 'MGUT') AND (PUNIT EQ '063')
    THEN DAA1/1000
  ELSE IF (ATG EQ '09') AND (PTEST NE 'FEUT' OR 'MGUT') AND (PUNIT EQ '063')
    THEN DAA1
  ELSE IF (ATG EQ '09') AND (PTEST NE 'FEUT' OR 'MGUT') AND (PUNIT EQ '064')
    THEN DAA1*1000
  ELSE IF (ATG EQ '10' OR '11' OR '12' OR '13' OR '14' OR '23'
    OR '27' OR '29') AND (PUNIT EQ '063') THEN DAA1
  ELSE IF (ATG EQ '10' OR '11' OR '12' OR '13' OR '14' OR '23'
    OR '29') AND (PUNIT EQ '064') THEN DAA1*1000
  ELSE IF (ATG EQ '15') AND (PUNIT EQ '064')
    THEN DAA1
  ELSE IF (ATG EQ '15') AND (PUNIT EQ '063')
    THEN DAA1/1000
  ELSE IF (ATG EQ '23' OR '27') AND (PUNIT EQ '062')
    THEN DAA1/1000
  ELSE IF (ATG EQ '24') AND (PUNIT EQ '061')
    THEN DAA1/1000
  ELSE IF (ATG EQ '24') AND (PUNIT EQ '062')
    THEN DAA1
  ELSE IF (ATG EQ '24') AND (PUNIT EQ '063')
    THEN DAA1*1000
  ELSE IF (PUNIT EQ '006' OR '007') THEN 0
  ELSE DAA1;
-*********GENERATE CODES FOR LABORATORIES*********
(CONFIDENTIAL)
-*********EXTRACT THE AVERAGE YAA PVAL'S AND HOLD AS SPK1*********
TABLE FILE &&SECTOR
PRINT YAA CODE
BY SECTOR BY PTEST
BY RMDL BY PMTHD BY COMPID BY SDATE
IF YAA GT 0
IF PMTHD NE ''
IF ATG NE ''
ON TABLE HOLD AS SPK1
END
MATCH FILE &&SECTOR
PRINT DAA PREMK CODE
BY SECTOR BY PTEST
BY RMDL BY PMTHD BY COMPID BY SDATE
IF DAA GT 0
IF PMTHD NE ''
IF ATG NE ''
RUN
FILE SPK1
SUM AVE.YAA
BY SECTOR BY PTEST
BY RMDL BY PMTHD BY COMPID BY SDATE
AFTER MATCH HOLD AS SPK2 OLD-AND-NEW
END
-* DATA FROM SPK2 IS USED TO CALCULATE % RECOVERIES AND SPIKE LEVELS
-* AND THE OUTPUT IS STORED AS "OUTFILE". THE END OF THE PROGRAM WILL
-* PRODUCE A TABLE SUMMARIZING RECOVERIES FOR EACH TEST AS A FUNCTION OF
-* SPIKE LEVEL (OPTIONAL).
DEFINE FILE SPK2
PCTRECOV /D15.4=DAA/YAA*100;
56
-------
SPKLEVEL/D15.4=YAA/RMDL;
END
TABLE FILE SPK2
PRINT DAA PREMK YAA PCTRECOV SPKLEVEL CODE
BY SECTOR BY PTEST
BY RMDL BY PMTHD BY COMPID BY SDATE
ON TABLE HOLD AS &OUTFILE FORMAT FOCUS
END
DOS ERASE SPK1.FTM
DOS ERASE SPK1.MAS
DOS ERASE SPK2.FTM
DOS ERASE SPK2.MAS
57
-------
PROGRAM: SPKSTAT4.FEX
-* THIS PROGRAM WILL PRODUCE WK1 FILES FROM THE MISASPK
-* DATABASE THAT CAN BE IMPORTED INTO STATGRAPHICS TO
-* PRODUCE BOTH XY SCATTERPLOTS AND BOX-AND-WHISKER DIAGRAMS
-* THAT SUMMARIZE THE TRAVELLING SPIKED BLANK DATA ACROSS
-* ALL SECTORS. NOTE THAT THE XY SCATTERPLOT DATA HAS BEEN
-* SCREENED TO REMOVE OUTLIERS BASED ON 1.5 TIMES THE INTER-
-* QUARTILE RANGE FOR A GIVEN SPIKING LEVEL ... IN ORDER TO
-* PRODUCE A MORE MEANINGFUL ESTIMATE OF STD_DEV. ALSO THE
-* X-AXIS HAS BEEN CUT OFF AT 50 TIMES THE RMDL SINCE A
-* CONSISTENT SCALE IS DESIRABLE (50 SEEMS BEST COMPROMISE).
-* BOXWHISK IS INVOKED TO PRODUCE THE BOX-AND-WHISKER WK1'S.
-SET &&N=1;
-START
TABLE FILE HLIMITS
PRINT *
IF LIST EQ &&N
ON TABLE HOLD
END
-RUN
-READ HOLD &LIST.I5. &PTEST.A6. &HZERO.D7. &RIGHT.D7. &ATG.A2.
-RUN
-"IF ((&ATG EQ 09 OR 27) OR ({&ATG GE 16) AND (&ATG LE 23)))
-*IF (&ATG CONTAINS '04' OR '08' OR '09' OR '10' OR '11'
>OR '12' OR '15' OR '23' OR '24'
-OR '27' OR '29' OR '14' OR '15' OR 16' OR '17' OR '18')
-IF &ATG IS '26'
-GOTO BEGIN ELSE GOTO LOOP;
-* CALCULATE VERTICAL LIMITS FOR OUTLIERS BASED ON
1.5 X THE INTERQUARTILE RANGE.
-BEGIN
-TYPE PROCESSING PTEST: StPTEST NUMBER: &&N
TABLE FILE MISASPK
SUM AVE.PCTRECOV RANKED BY SPKLEVEL
IF PTEST EQ &PTEST
ON TABLE HOLD AS LIST
END
-RUN
-READ LIST &LIST2.I5.
DEFINE FILE MISASPK
DIFF/D15.3=(DAA-YAA)/RMDL;
END
TABLE FILE MISASPK
SUM CNT.DIFF BY SPKLEVEL
LIST DIFF PTEST
BY SPKLEVEL BY DIFF NOPRINT
IF PTEST EQ &PTEST
ON TABLE HOLD AS QUART1 FORMAT FOCUS
58
-------
END
DEFINE FILE QUART1
QUART/A6=IF (FOCLIST LE E02*.25) THEN '1ST' ELSE
IF ((FOCLIST LE E02*.5) AND (FOCLIST GT E02*.25)) THEN '2ND' ELSE
IF ((FOCLIST LE E02*.75) AND (FOCLIST GT E02*.5)) THEN '3RD' ELSE
IF (FOCLIST GT E02*.75) THEN '4TH';
END
TABLE FILE QUART1
SUM MAX.E04 ACROSS QUART
BY PTEST BY SPKLEVEL BY E02
ON TABLE HOLD AS QUART2
END
-RUN
-** IF LESS THAN 4 RECORDS IN QUART2, THEN THE PTEST IS IGNORED,
-** SINCE NOT ENOUGH DATA TO DRAW MEANINGFUL CONCLUSIONS.
-*IF &LINES LT 4 GOTO LOOP;
TABLE FILE QUART2
SUM MAX.E03
ON TABLE HOLD AS COUNT
END
-RUN
-READ COUNT &E01.I5.
-RUN
-IF &E01 LT 4 GOTO LOOP;
DEFINE FILE QUART2
1ST/D15.3=E04;
2ND/D15.3=E05;
3RD/D15.3=E06;
4TH/D15.3=E07;
UPPER/D15.3=IF E03 GE 4 THEN 2ND+(1.5*(3RD-1ST)) ELSE
IF E03 LE 3 THEN 4TH;
LOWER/D15.3=IF E03 GE 4 THEN 2ND-(1.5*(3RD-1ST)) ELSE
IF E03 EQ 3 THEN 2ND ELSE
IF E03 EQ 2 THEN 2ND ELSE
IF E03 EQ 1 THEN 4TH;
END
TABLE FILE QUART2
PRINT 1ST 2ND 3RD 4TH UPPER LOWER
BY PTEST BY SPKLEVEL
ON TABLE HOLD AS QUART3
END
59
-------
-* MATCH QUART3 WITH MISASPK TO CREATE A FILE CONTAINING DIFFS
-* AND UPPER AND LOWER LIMITS (BASED ON 1.5 X INTERQUARTILE RANGE)
MATCH FILE QUART3
SUM UPPER LOWER BY PTEST BY SPKLEVEL
RUN
FILE MISASPK
PRINT DIFF BY PTEST BY SPKLEVEL
IF PTEST EQ &PTEST
-WHERE (SPKLEVEL LE &RIGHT);
IF SPKLEVEL LE 50
AFTER MATCH HOLD AS UPLOW OLD-AND-NEW
END
TABLE FILE UPLOW
PRINT DIFF BY PTEST BY SPKLEVEL
WHERE (DIFF LE UPPER);
WHERE (DIFF GE LOWER);
ON TABLE HOLD AS VERTSPK
END
-* CALCULATE AVERAGE AND STD_DEV FOR XY PLOTS ... SPKLEVEL VS DIFF
-* FOR EACH PTEST.
-TYPE PROCESSING WORKSHEET FOR XY SCATTERPLOT FOR &PTEST
TABLE FILE VERTSPK
SUM CNT.DIFF NOPRINT SUM.DIFF NOPRINT
COMPUTE
N/I5=CNT.DIFF;
AVG/D15.3=AVE.DIFF;
STD_DEV/D15.3=SQRT((ASQ.DIFF/((N-1)/N))-((SUM.DIFF*SUM.DIFF)/(N*(N-1))));
BY SPKLEVEL
IF PTEST EQ &PTEST
ON TABLE HOLD AS &PTEST FORMAT WK1
END
-RUN
-* GENERATE PREFIXES FOR LAB CODES TO ALLOW SORTING BY
-* SPKLEVEL FOR GRAPHING PURPOSES (BOX AND WHISKERS)
-TYPE PROCESSING WORKSHEET FOR BOX AND WHISKER PLOT FOR &PTEST
-INCLUDE BOXWHISK
-RUN
-RUN
-LOOP
-SET &&N=&&N+1;
-IF &&N LE 160 GOTO START ELSE GOTO QUIT;
QUIT
60
-------
PROGRAM: BOXWHISK.FEX
-* THIS PROGRAM IS INVOKED BY SPKSTAT4 TO PRODUCE WK1
-* FILES TO BE IMPORTED TO STATGRAPHICS FOR BOX-AND-WHISKER
-* SUMMARIES FOR TRAVELLING SPIKED BLANK DATA.
-* PREFIXES ARE ATTACHED TO LAB-CODES TO ASSOCIATE SPIKING
-* LEVEL WITH THE LAB'S RESULTS.
-* GENERATE PREFIXES FOR LAB CODES TO ALLOW SORTING BY
-* SPKLEVEL FOR GRAPHING PURPOSES (BOX AND WHISKERS)
FILEDEF FILLER DISK FILLER.FTM
-TYPE PROCESSING WORKSHEET FOR BOX AND WHISKER PLOT FOR &PTEST
DEFINE FILE MISASPK
DIFF/D15.3=(DAA-YAA)/RMDL;
SPKLEV/D=INT(SPKLEVEL);
SPK_CODE/A3=EDIT(SPKLEV);
LAB_CODE/A8=SPK_CODE || '_' || CODE;
END
TABLE FILE MISASPK
PRINT PTEST DIFF SPKLEVEL RANKED BY LAB_CODE
IF PTEST EQ &PTEST
IF SPKLEVEL LE 999
ON TABLE HOLD AS CODELBT
END
-*************************************************************
-*START LOOP THAT WILL GENERATE WK1 FILES FOR BOX & WHISKERS
-*AND FILL ANY MISSING RECORDS (I.E. IF LESS THAN 10 LAB_CODES)
-*USING THE FILE "FILLER"
-SET &LOWER=0;
-SET &UPPER=10;
-SET &M=1;
-FILL
-SET &CODENAME = &PTEST || '_' || &M;
TABLE FILE CODELBT
PRINT DIFF
RANKED BY LAB_CODE
IF RANK LE &UPPER
IF RANK GT &LOWER
ON TABLE HOLD
END
-RUN
-IF &LINES LT 1 GOTO BOTTOM ELSE GOTO NEXT;
-NEXT
MATCH FILE HOLD
61
-------
PRINT DIFF LAB_CODE BY RANK
RUN
FILE FILLER
SUM DIFF LAB_CODE BY RANK
AFTER MATCH HOLD AS HOLD2 NEW-NOT-OLD
END
MATCH FILE HOLD
PRINT RANK BY LAB_CODE BY DIFF
RUN
FILE HOLD2
PRINT RANK BY E05 BY E04
AFTER MATCH HOLD AS HOLD3 OLD-OR-NEW
END
TABLE FILE HOLD3
PRINT LAB_CODE DIFF
ON TABLE HOLD AS &CODENAME FORMAT WK1
END
-RUN
-SET &M=&M+1;
-SET &UPPER=&UPPER+10;
-SET &LOWER=&LOWER+10;
-GOTO FILL
-BOTTOM
DOS ERASE HOLD*.*
END
-------
QUALITY ASSURANCE PLANNING IN
GREAT LAKES ECOLOGICAL STUDIES -
FRAMEWORK FOR A QUALITY ASSURANCE PLAN
Donald R. Hart
Beak Consultants Limited
INTRODUCTION
I was asked to present in this paper a generic framework for quality assurance (QA)
planning and documentation related to Great Lakes monitoring, based on a retrospective
review of Great Lakes programs. This charge arises, in part, from a widely recognized need
for integration of monitoring activities under the Great Lakes Water Quality Agreement. The
Parties to this agreement are committed to develop Lakewide Management Plans. Clearly,
monitoring activities should be designed to service the data uses identified in these plans.
This means planning for data of sufficient quality to support these uses.
Having said this, I want to make it clear that I am not putting forward a definitive
quality assurance program plan. This is a task for the Parties to complete, and individuals
have been assigned to coordinate this effort on both sides of the border. At lakewide or
basinwide levels, the task may have to await better definition of integrated data uses. I am
simply offering a template that might be followed in preparing a QA plan, with general
guidance as to content for different levels of planning, and identification of problematic
issues that may need to be resolved as a QA plan evolves.
Everything I say today, and in the paper, is intended to solicit and perhaps provoke
comment from workshop participants so that together we can add, delete or modify elements
of guidance, resulting in the evolution of a more useful product. I have been involved in the
collection and interpretation of Great Lakes monitoring data for various Environment Canada
and Ontario MOE projects, and have spent some time on the IJC Data Quality Workgroup,
but the individuals in this room can bring far more experience to bear. The workshop
coordinators and I want to capitalize on that experience, both in planning and implementing
Great Lakes monitoring programs.
A brief aside on levels of QA planning may be appropriate here. In the U.S., three
levels of planning have been recognized: quality management plans which focus mainly on
process, often for larger integrated monitoring programs (EPA, 1992a); QA program plans
which cover both process and QA method for an integrated program (EPA, 1987); and QA
project plans which focus, mainly on methods for sampling, analysis and QAJQC specific
to an individual project (EPA, 1980).
In Canada, while QA elements are addressed in a project or program, there is no
standardized approach to producing a QA plan. Also, the distinction between levels of QA
63
-------
planning is generally less rigid. I will be talking mainly to the program level today, but also
highlighting issues that might normally be the subject of a quality management plan or a QA
project plan. I am more concerned that critical issues should be addressed in QA planning,
and less concerned about the packaging of the material, as, for example, in one, two or three
different documents. I will avoid such packaging questions over the next two days.
At the outset, I should emphasize that the basic principles of QA planning and
implementation are no different in Great Lakes monitoring than they are in any other field
of environmental science or, for that matter, industrial manufacturing. However, as I go
along, I will try to relate key points of guidance to specific examples of problems
encountered or progressive approaches to QA/QC in Great Lakes programs. I draw these
from a review of Great Lakes surveillance programs which I completed over this past year
under the auspices of this workshop, and from a similar review of selected QA program
plans during 1986-88.
The framework of my presentation and of the guidance document that I have in mind,
is outlined in Table 1. Many of you may recognize 'points of light' from the EPA guidance
on preparation of QA project plans. I have added some additional headings, reorganized into
four main sections, and increased the level of detail. I am aiming for flexible guidance that
will be useful at multiple levels. The user can direct the emphasis where it belongs. For
example, a program plan need not elaborate on field and laboratory operations when these
are covered in detail in separate project plans.
RETROSPECTIVE
Past Situation
My previous review of quality assurance programs associated with Great Lakes
monitoring activities (BEAK, 1988) included 36 different projects at federal and state or
provincial levels. The key findings are listed in Table 2. I will expand briefly on each of
these.
QA Plans. One of the first problems I encountered in 1986 was the general lack of
QA plans in Canada. For EPA and state programs, I was able to start by reviewing a QA
plan of some sort and follow-up with additional questions as needed. For Environment
Canada and Ontario MOE programs, I had to extract QA information from sampling or
analytical protocols, data reports and questionnaire responses. Quality assurance for
biological measurements was poorly developed in both the United States and Canada.
Data Uses. The IJC Data Quality Workgroup was reviewing round-robin data from
chemistry laboratories and finding statistically significant interlaboratory discrepancies
(Rathke and McCrea, 1987). However, the program significance was always unclear, for
various reasons, the main one being lack of pre-defined data uses, particularly by the IJC and
its workgroups. Thus, data comparability was often a contentious issue, because the desired
comparisons were seldom defined as program goals.
64
-------
For example, 'tributary' laboratories tended to be biased high relative to 'open lake'
laboratories (by 2 to 3 µg/L for phosphorus), and on-board analyses differed systematically
between ships (by 2 to 4 µg/L for phosphorus). These differences were equivalent to the
estimated change in lake mean over a five- to seven-year period. For metals and nutrients
in a water matrix, 20 to 40% of laboratory results in round-robins were flagged as statistical
outliers from the group median. For round-robins on organics in water, median values were
well below target values, suggesting systematic difficulties in maintenance of calibration
standards.
DQOs. Data quality objectives (DQOs) were usually defined, at least in the U.S.
where they were a required element of QA plans, but often they originated in the analytical
laboratory by default. A justification in terms of program or project data uses was
consistently lacking. Likewise, definitions of detection, precision and accuracy goals were
inconsistent among projects, or lacking from project plans.
For example, detection limits might be defined in terms of: reading increments,
instrument detection limits, method detection limits, limits of quantitation or 'practical'
quantitation limits. Precision might be computed by replication before or after sample
preparation, within-run or between-run, for one or several concentration ranges. The Niagara
River Toxics Committee (NRTC, 1984) noted the data compatibility problems which may
arise from such inconsistent definitions.
Data quality objectives were never defined in terms of total uncertainty, or power to
answer a question. Moreover, the questions themselves, usually about spatial patterns and/or
temporal trends in contamination, or about compliance evaluation, were not specific enough
to support such an objective development process. The domain of the question was often
unclear.
For example, the question 'is there a temporal trend in water quality?' could be
clarified to ask 'is there a temporal trend in annual area average water quality over a ten-
year timeframe in areas x, y or z?' The latter version of the question, with some preliminary
information on temporal and spatial variability, allows an appropriate sampling frequency
and spatial intensity to be determined, so as to detect a real change of particular magnitude,
with minimal risk of non-detection (i.e., increased power).
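By way of illustration only (this sketch is not part of the original review; the two-sample
normal approximation and the numbers are assumptions), the sampling effort implied by such a
clarified question can be approximated from preliminary variability data:

    # Approximate number of samples per area and time period needed to detect
    # a change of size delta in an annual area average (two-sided test,
    # two-sample normal approximation).
    from scipy.stats import norm

    def n_per_group(delta, sigma, alpha=0.05, power=0.80):
        """Samples per group to detect a mean difference of `delta` given a
        between-sample standard deviation `sigma`."""
        z_alpha = norm.ppf(1 - alpha / 2)
        z_beta = norm.ppf(power)
        return 2 * ((z_alpha + z_beta) * sigma / delta) ** 2

    # Hypothetical values: detect a 2 ug/L change when sigma is 4 ug/L
    print(round(n_per_group(delta=2.0, sigma=4.0)))   # about 63 samples per period

A calculation of this kind makes the trade-off between sampling intensity and the risk of
non-detection explicit before the survey is designed.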
Study Design. Designs were specified as far as station locations and sampling
periods, but the statistical basis of the design was seldom defined. There was little rationale
in terms of ability to satisfy project or program goals or data uses. Project goals were
diverse, and integrated lakewide program goals were not defined at all.
Data Reporting. Reporting practice was highly variable with respect to such factors
as blank and recovery correction and data flagging. Use of the ASTM flags T and W for
low level chemical data was encouraged, but not universally practised. In fact, there seemed
to be several T, W systems in use (Crane, 1986). Data quality information was typically
confined to the laboratory, and if reported was often difficult to associate with project data.
65
-------
For example, only 25 to 50% of laboratories routinely reported precision or detection
limits, both of which may vary from sample to sample. Reported average values may or
may not apply to a particular sample. What you really need to report is the precision
function and this was not commonly done. However, most laboratories identified an
individual responsible for the report, facilitating user enquiry.
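As an illustration (not from the review itself; the linear form of the precision function and
the duplicate values below are assumptions), a precision function can be estimated from routine
duplicate analyses rather than reporting a single average value:

    # Fit a simple linear precision function s(C) = s0 + r*C from duplicate analyses.
    import numpy as np

    def precision_function(dup1, dup2):
        """Intercept s0 and slope r of a fit of the within-duplicate standard
        deviation on the mean concentration of each pair."""
        dup1, dup2 = np.asarray(dup1, float), np.asarray(dup2, float)
        mean_c = (dup1 + dup2) / 2.0
        sd = np.abs(dup1 - dup2) / np.sqrt(2.0)   # sd estimate from one duplicate pair
        r, s0 = np.polyfit(mean_c, sd, 1)         # slope, intercept
        return s0, r

    # Hypothetical phosphorus duplicates (ug/L)
    s0, r = precision_function([2.1, 5.0, 10.2, 20.5], [2.4, 4.6, 9.5, 21.9])
    print(f"s(C) = {s0:.2f} + {r:.3f} * C")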
Coordination. A primary recommendation arising from this review in 1988 was the
appointment of a binational coordinator to facilitate specific definition of binational program
objectives, data uses, and data quality needs, to evaluate achievement of program objectives
and recommend program changes as required, and to develop a quality management plan for
this purpose. As of today, there are two coordinators, one on each side of the border. They
are beginning to address these tasks. Hopefully, the Workshop Proceedings this year may
serve as a catalyst.
Present Situation
My recent (1992) review of quality assurance programs has been more selective and
perhaps more focused on newer programs. I will refer to some of these in developing the
elements of guidance that follow in detail. However, I can now summarize this effort
briefly.
During this review, I found several Canadian QA plans in place (e.g., for the Ontario
MOE (1990) Toxics Deposition Monitoring Network, and for Great Lakes Water Quality
Monitoring by Environment Canada (L'Italien, 1992)). Furthermore, it is clear that this is
a growing trend. And, although there is currently no Canadian standard as to content, format
or level of detail, many other QA plans were in various stages of preparation.
Two recent programs south of the border represent, in my view, a long step forward
in QA planning on the Great Lakes: the Environmental Monitoring and Assessment Program
(EMAP) and the Assessment and Remediation of Contaminated Sediments (ARCS) Program.
Both give particular attention to QA for biological measurements.
The first steps are now being taken toward binational integration of U.S. and
Canadian programs. A QA plan for an Integrated Atmospheric Deposition Network (IADN)
is near completion, which should provide a model for integration of other programs.
By virtue of both U.S. and Canadian efforts, I think we are making progress,
particularly in relation to development of quality controls for environmental monitoring,
awareness of goal-driven approaches to sampling design and quality assessment, and
integration of diverse monitoring activities.
If we can collectively develop a generic framework for QA plans, with input from
both Canadian and U.S. Great Lakes monitoring communities, I believe it would help to
maintain the momentum and facilitate further binational integration.
66
-------
The rest of my talk follows a possible framework for a QA plan. The material is
presented to solicit critical review by the working groups this afternoon and tomorrow. Is
the guidance offered appropriate? Is it complete? Are there critical issues to be dealt with
that I have neglected to mention?
FRAMEWORK FOR A QUALITY ASSURANCE PLAN (FOR DISCUSSION)
Quality Management
Quality management begins with a description or vision of the program (or project),
its overall purpose and specific data quality objectives, and its organizational structure and
management process focused on achievement of objectives. Each of these topics should be
addressed as a major heading in a QA program (or project) plan. However, the program
plan may be focused at a higher level of organization, picking up where existing project
plans leave off.
Program (Project) Description (Overview)
The description provides an overview of the main project(s) which is (are) the focus
of the QA plan, any related projects and relationships between projects, data collection
activities, basic design approaches, objectives in terms of questions to be resolved, data uses
to address these questions, and the overall framework for decision making based on the
answers to these questions.
Project(s). First, we identify the project(s) which is (are) the focus of the QA project
or program plan. In a program plan, a group of related projects is encompassed within an
umbrella program.
Relationships. Any blocks of data to be produced by one project and used by
another, either as input data or in comparisons between projects, must be clearly identified
by measurement type, matrix, timeframe and geographic region. A data producer x user
matrix with data blocks listed in each cell can be a convenient platform for presentation of
these relationships.
Activities. The list of data collection activities generally corresponds to the data
blocks defined above. More detail can be given here as to the specific measurements included
in each block, and the general methods of sampling and analysis (referencing SOPs).
Basic Design. For each data collection activity, we identify the sampling strategy
(e.g., random, stratified random, systematic) and level of effort (e.g., numbers of samples or
observations) over the entire domain of study, and within any identified environmental strata.
The domain and any identified strata should be clearly defined.
67
-------
Objectives. The objectives can be stated as general or specific questions to be
resolved, usually about the nature and extent of contaminants or bioindexes, and often in
comparison to environmental criteria, reference areas or baseline time periods. General
questions can be partitioned into more specific questions pertaining to individual variables
or indexes that are testable as hypotheses.
Data Uses. These are largely dictated by the form of the specific questions posed; however,
statistical methods (including transformations) may depend on the assumptions that the user
wishes to make, which in turn may depend on preliminary examination of data. Also, the
variables and domains of interest may be defined or revised after preliminary data review.
This kind of flexibility is allowed and should be identified.
Decision Framework. The logic leading from the answers to specific questions, to
the ultimate selection of abatement, remedial or further investigative initiatives, should be
defined as clearly as possible. Often there are significant socio-political rather than scientific
inputs to these decisions, which should be acknowledged.
Issues. These requirements point to a number of key issues that may arise in the
integration of Great Lakes monitoring programs:
1. Development of a decision framework for lake or basin management. This is the
primary issue and should be the first step in binational integration. A possible
framework is shown in Figure 1 which addresses both toxics and habitat issues.
Existing projects fall within one or several of these boxes. In a very general way, data
relationships are defined by the arrows between boxes.
2. Geographical and technical scope of the program. Is it lakewide, basinwide; single or
multimedia? Which existing projects will be included?
3. Integration of sampling strategies to meet both project and program objectives. What
compromises are necessary? What efficiencies can be realized?
4. Binational process for definition of integrated objectives, questions, data uses and
sampling strategies. Who is involved (besides the coordinators)? What are the steps?
Program (Project) Data Quality Objectives
The linkage of fine scale questions to broader scale questions means that data may
have to meet the needs of different users at project and program levels. The data quality
(and quantity) needs of these users may differ, and it is critical that project managers
understand the broader context, including the lakewide or basinwide management program.
A process for defining and assessing achievement of DQOs is essential. The process
often involves compromise between what we would like to achieve and what we can afford,
68
-------
but ensures an appreciation of what is achievable with available budget, encourages cost-
effective resource allocation, and guards against spreading resources too thinly.
Power. A specified power to answer program or project questions with a specified
resolution is the ultimate data quality objective. It is determined as an objective from
consideration of the consequence of getting incorrect answers and thus making 'wrong'
decisions. Measurement quality objectives (MQOs), in turn, are designed to control components of the
overall uncertainty that limits the power of the study.
Sensitivity. Sample detection limits should be sufficiently low that unacceptable bias
due to censored data is not introduced to the summary statistics that are used to answer
questions (e.g., mean or variance). The required sensitivity typically depends on the average
measured value in each domain of interest (e.g., one-tenth of mean or less). Sensitivity is
determined from analytical precision of blanks or low level standards.
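A minimal sketch of a detection limit computed in the style of EPA (1984), i.e., the Student's
t value at 99% confidence times the standard deviation of replicate low-level analyses; the
replicate values here are hypothetical:

    # Method detection limit: MDL = t(n-1, 0.99) * s of n low-level replicates.
    import numpy as np
    from scipy.stats import t

    def mdl(replicates, confidence=0.99):
        reps = np.asarray(replicates, float)
        return t.ppf(confidence, df=len(reps) - 1) * reps.std(ddof=1)

    # Hypothetical low-level spike results (ug/L)
    print(round(mdl([0.52, 0.61, 0.47, 0.55, 0.58, 0.50, 0.64]), 2))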
Precision is specified as a permissible limit for random error. It may be specified at
any or all levels, for the relevant environmental domain (total precision), the measurement
process (including sampling method) or the analytical method. It is determined by
replication at each level.
Accuracy is specified as a permissible limit for control sample deviation from expected
value. It is determined by analysis of lab and field blanks, reference standards and/or matrix
spikes.
Specificity. The measurement system should respond only to the analyte of interest.
It is normally evaluated during verification of an analytical method.
Comparability. Two data sets may be considered comparable if they are both
essentially unbiased or share essentially the same bias as determined by average of lab and
field blanks, reference standards and/or matrix spikes at any concentration. Differences in
total precision may require special methods of comparison leading to some loss of power,
but generally do not preclude comparison.
Compatibility. Two data sets may be pooled for subsequent treatment as a single data
set, only if they are essentially the same with respect to both mean value and total precision
for all variables and domains of interest.
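One way to screen for these conditions (an illustration only; the tests, significance level and
data are assumptions rather than a prescribed procedure) is to test the two data sets for a
difference in means and a difference in spread:

    # Screen two data sets for compatibility: similar mean (Welch t-test) and
    # similar total precision (Levene's test for equality of variances).
    import numpy as np
    from scipy import stats

    def compatible(a, b, alpha=0.05):
        a, b = np.asarray(a, float), np.asarray(b, float)
        _, p_mean = stats.ttest_ind(a, b, equal_var=False)
        _, p_var = stats.levene(a, b)
        return p_mean > alpha and p_var > alpha

    print(compatible([5.1, 4.8, 5.4, 5.0], [5.2, 5.5, 4.9, 5.3]))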
Completeness. The percentage of planned measurements that are actually obtained and
useable for any measurement variable and matrix. Criteria for any declaration of data as not
useable should be stated or referenced. Note that DQO exceedence does not necessarily make
data unusable.
Representativeness. Samples may be considered to represent the domain of interest
if collected by an unbiased sampling plan and if associated accuracy is acceptable.
Assessment involves consideration of time and location and other conditions of sampling.
69
-------
Assessments. We should define the schedule and general process for assessing
whether each DQO is being or has been achieved, and for determining, based on this,
whether the basic study design, management practice (organization) or technical operations
(field, laboratory, quality assurance) should be revised.
Issues. A number of key issues are related to the requirement for data quality
objectives:
1. Complexity of process for defining DQOs, MQOs. How should it vary with type and
scale of program? How can it be phased-in to existing programs? Both EPA (EMAP)
and EC (NWRI) have identified a commitment to the process.
2. Complexity of the optimized sampling design. Different levels of effort for different
variables? Or average level of effort, different degrees of power?
3. Standardization of data quality measures. Is it necessary? Is it possible? The need
for such standardization has been recognized (NRTC, 1984).
4. Accuracy without standard reference materials? Consistency over time instead? (see
ARCS program, Schumacher, 1992). Development of appropriate standards? (see
Elliot and Drake, 1981)
5. Bias correction to permit comparison. Should it be done? If so, who does it?
6. Specifically which data sets require comparison? This should fall out of Lake
Management Plans.
Power to detect a spatial pattern of environmental impact has been identified by EC/DFO
as a required consideration in design of environmental effects monitoring (Figure 2).
Similarly, Reckhow considers how uncertainty in a loading estimate affects power to predict
lake trophic status (Figure 3).
Some MQOs from the ARCS program are shown in the next two Tables (3 and 4). The
footnotes define the data quality measures. Clear definitions are needed, both for data
quality assessment, and for comparison/integration of programs. The goals for toxicity
testing are interesting. The control response limit is a measure of test sensitivity, similar to
an MDL. Precision is measured by replication of both reference toxicant tests and tests of
project samples. Accuracy is a relative standard deviation over time of reference toxicant
test results. No MQOs are defined for benthic enumeration. However, they could be
defined in terms of species recovery (Figure 4) or relative standard errors of organism
density estimates.
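The following sketch (illustrative only; all data are hypothetical) shows how such measures
might be computed: precision as the %RSD among replicate tests, consistency over time as the
%RSD of reference toxicant results, and a possible benthic measure as the relative standard
error of a density estimate:

    # Illustrative MQO-style calculations for toxicity tests and benthic counts.
    import numpy as np

    def pct_rsd(values):
        v = np.asarray(values, float)
        return 100.0 * v.std(ddof=1) / v.mean()

    def rel_std_error(counts):
        c = np.asarray(counts, float)
        return 100.0 * c.std(ddof=1) / (c.mean() * np.sqrt(len(c)))

    replicate_lc50 = [32.0, 36.5, 30.8]          # hypothetical %, one sample
    reference_lc50 = [0.28, 0.31, 0.25, 0.30]    # hypothetical mg/L, over time
    benthos_counts = [410, 520, 365, 480, 455]   # hypothetical organisms per grab

    print(f"precision %RSD: {pct_rsd(replicate_lc50):.1f}")
    print(f"consistency %RSD: {pct_rsd(reference_lc50):.1f}")
    print(f"density RSE (%): {rel_std_error(benthos_counts):.1f}")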
70
-------
Program (Project) Organization
The structure and process by which management ensures that objectives are achieved
must be defined in order to be effective. This includes definition of responsibilities and
authorities, reporting channels and schedules, QA policy, resource allocations and strategies,
document controls, procurement processes and staff training and evaluation procedures.
Issues. A number of key issues are related to program organization:
1. Independence of the QA officer. Is it necessary? Is it desirable? How to accomplish?
2. Strategy for schedule vs. quality tradeoffs. Single vs. multiple laboratories?
3. Strategy for budget vs. quality tradeoffs. Lab information systems for small
operations?
4. QA policy statement. How to make it more than motherhood? Specifically address
tradeoff situations?
5. Secure vs. accessible information systems? What is the best compromise?
6. Standardization of work statements for contract laboratories?
7. Binational training and evaluation programs? Particularly for field techniques?
FIELD OPERATIONS
Field operations include sampling design and sampling procedures. The latter may
reference any pertinent SOPs that exist, rather than repeating them in detail.
Design. We can design to meet project and data quality objectives, based on previous
data. The design will specify time period and area selection, adequate numbers of stations
or time points in each, unbiased sampling within areas or time periods, sampling locations,
and adequate field and laboratory replication to check or control components of error.
Procedures. We can reference SOPs for standard methods. Non-standard methods
or deviations from standard methods are described in the QA plan. They include any real-
time quality control checks, such as repeated measurements, or sampling efficiency tests.
Issues. A number of key issues related to field operations seem to need attention
and/or standardization:
1. Justification of sampling design in relation to objectives. What is the expectation of
confidence, in estimators and decisions? (see Hedtke et al., 1992; L'Italien, 1992)
71
-------
2. Selection of appropriate reference areas for impact assessment? (EC/DFO, 1991)
3. Allowance for bias as well as imprecision in estimation of required sampling effort?
4. Development of real-time quality controls on the sampling process and performance
criteria for field method equivalence. Replicate field samples are often collected, but
less often utilized to estimate or control sampling method error (a variance-component
sketch of that use follows below). Real-time controls are necessarily limited to rapidly
determined characteristics of the sample.
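A minimal sketch of that variance-component use (the additive model and the data are
assumptions, added here only for illustration):

    # Partition total replicate variance into sampling and analytical components,
    # using co-located field replicates and laboratory duplicates on split samples.
    import numpy as np

    def variance_components(field_replicates, lab_duplicate_pairs):
        total_var = np.var(field_replicates, ddof=1)
        pairs = np.asarray(lab_duplicate_pairs, float)
        analytical_var = np.mean((pairs[:, 0] - pairs[:, 1]) ** 2 / 2.0)
        sampling_var = max(total_var - analytical_var, 0.0)
        return sampling_var, analytical_var

    s_var, a_var = variance_components([9.8, 11.2, 10.5, 12.0],
                                       [(10.1, 10.4), (11.0, 10.6), (9.7, 9.9)])
    print(f"sampling variance ~ {s_var:.2f}, analytical variance ~ {a_var:.2f}")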
A method comparison study on the St. Marys River (Figure 5) illustrates the order of
magnitude differences in benthic enumeration that can arise between methods. In this case,
the sieve size is the dominant source of error, but the effect varies from station to station.
The same study examined the effect of replicate sampling effort on characterization of
species richness (Figure 6). Preliminary studies of this nature provide useful data on which
to base a survey design.
LABORATORY OPERATIONS
Laboratory operations include laboratory facilities and analytical procedures. The latter
may reference pertinent SOPs, rather than repeating them in detail. The Ontario Ministry
of the Environment Code of Practice for Environmental Laboratories (King, 1989) provides
a useful framework for preparation of laboratory SOPs.
Facilities. Describe general layout, separation of areas, reagent and climate control.
Procedures. Again, we can reference SOPs for standard methods. Non-standard
methods or deviations from standard methods are described in the QA plan. They include
or reference a description of the internal quality control program.
Issues. Most chemical laboratories have some sort of quality control program in place,
but many biological laboratories do not. Key issues include:
1. Development of 'analytical' QC check points for toxicity testing and taxonomic
laboratories.
2. Development of performance criteria for method equivalence.
Range and average control charts for toxicity testing (Figure 7) are similar to those
used in the chemistry laboratory, except that standard reference materials generally do not
exist. Thus, accuracy is problematic but we can address consistency.
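As a sketch of how such limits are often constructed in the absence of reference materials
(the log transformation, the two-standard-deviation limits and the data are assumptions, not a
restatement of the cited guidance):

    # Control limits for reference toxicant LC50s on a log scale.
    import numpy as np

    def lc50_control_limits(historical_lc50, k=2.0):
        logs = np.log10(np.asarray(historical_lc50, float))
        centre, s = logs.mean(), logs.std(ddof=1)
        return 10 ** centre, 10 ** (centre - k * s), 10 ** (centre + k * s)

    # Hypothetical rainbow trout / sodium pentachlorophenate LC50s (mg/L)
    mean_lc50, lcl, ucl = lc50_control_limits([0.25, 0.31, 0.22, 0.28, 0.26, 0.30])
    print(f"centre {mean_lc50:.2f} mg/L, limits {lcl:.2f} - {ucl:.2f} mg/L")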
72
-------
Discrepancies between sorters in benthic sample processing (Figure 8) can be checked
periodically, just as instrument operators are compared in the chemistry laboratory. Sample
archives permit assessment of comparability between old and recent surveys.
QUALITY ASSURANCE
Quality assurance includes interlaboratory and accreditation studies, performance and
systems audits, internal QC data review, project or program data reviews, corrective actions
when problems are identified, assessment of DQO achievement and QA reporting. Each of
these topics should be addressed as a major heading in a QA program (or project) plan.
Interlaboratory and Accreditation Studies. Typically are external to the project or
program, but the results can be pertinent.
Audits. Performance audits involve blind submission of standard samples. System
audits involve site visits, interviews and document review.
QC Data Review. Involves examination of control sample frequency, association with
project sample data, control limit exceedences and real-time response to limit exceedence.
Project Data Review. Includes data validation, reduction and reporting. Reduction
includes any bias correction of data (e.g., blank or recovery correction). Validation is based
on associated QC data.
Corrective Actions. These may be taken in real-time or after QC or project data
reviews. Specific trigger conditions should be defined for each corrective action.
Assessment. Achievement of analytical or measurement DQOs for each sample can
be assessed and tracked continually, since the data flagging criteria typically relate to their
achievement. Other DQOs involve the entire data set and are not assessed definitively until
all planned samples that can be collected are processed.
QA Reports should contain study design recommendations based on assessment of
DQO achievement. The schedule often coincides with the schedule for QAPP review and
revision. The content is a summary of results from all of the QA operations.
Issues. A number of key issues related to these QA operations will likely need
attention in the integration of Great Lakes programs.
1. Interlaboratory studies vs. performance audits. Audits can be conducted more
frequently and designed to correspond more closely with periods and types of project
activity for each laboratory. However, round-robins may provide a better indication
of interlaboratory bias in relevant matrices.
73
-------
2. Development of appropriate performance audit samples and standardized system audit
checklists. The Canadian Association of Environmental Analytical Laboratories is
developing audit programs for both chemical and biological laboratories.
3. Standardization of QC data uses across the Great Lakes, e.g., to flag or reject project
data.
4. Standardization of key data reductions such as blank and recovery correction.
5. Development of a binational process for feedback from data quality assessment to
study design.
6. Development of minimum standards for QA reporting - frequency, level of detail.
An inter-ship study on Lake Erie (Figure 9) addressed both sampling and analytical
comparability. Three ships exchanged samples. In this instance, the analytical biases
between ships exceeded the sampling method bias (Rathke and McCrea, 1987).
A biotesting round-robin (Figure 10) illustrates a problem that is unique to biotesting.
Test protocols permit different dilution waters to be used, resulting in large interlaboratory
differences in LC50. We should know about these discrepancies, but should not use them
to judge lab performance. An estimated toxicant concentration based on the LC50 is a better
measure of laboratory performance.
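One plausible way to express that back-calculation (an assumed formulation, for illustration
only; it is not the procedure used in the round-robin):

    # Estimate the toxicant concentration in an unknown sample from two results:
    # the LC50 of the sample expressed as a percent dilution, and the laboratory's
    # own reference LC50 for the pure toxicant (mg/L).
    def estimated_concentration(sample_lc50_percent, reference_lc50_mg_per_l):
        dilution_fraction = sample_lc50_percent / 100.0
        return reference_lc50_mg_per_l / dilution_fraction

    # Hypothetical values: LC50 at 25% dilution, reference LC50 of 0.30 mg/L
    print(estimated_concentration(25.0, 0.30))   # about 1.2 mg/L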
SUMMARY
A common framework for implementation and description of quality assurance
programs is needed on the Great Lakes. Within a suggested framework, I have presented
some elements of guidance that may be useful in preparation of quality assurance plans, and
some related issues that may need to be addressed in development of integrated plans for the
Great Lakes. I look forward to input from the working groups today and tomorrow, and
based on this, to the evolution of a Generic QA Program Plan as a guide to actual plan
preparation.
74
-------
REFERENCES
Alldredge, J.R. 1987. Sample size for monitoring of toxic chemical sites. Environ.
Monitor. Assess. 9: 143-154.
ASTM. 1986. Manual on Presentation of Data and Control Chart Analysis. Committee E-
11 on Statistical Methods. ASTM Special Technical Publication 15D. American
Society for Testing and Materials.
Beak Consultants Limited (BEAK). 1988. Quality Assurance Coordination of Great Lakes
Surveillance. Report to Environment Canada, Inland Waters Directorate, Ontario
Region.
Beak Consultants Limited (BEAK). 1991. Quality Assurance Guidelines for Biology in
Aquatic Environmental Protection. Environment Canada, Conservation and Protection,
National Water Research Institute, Burlington.
Brydges, T.G., C.A. Franklin, I.K. Morrison, P.W. Summers, B.B. Hicks, K.L. Demeijian,
D.L. Radloff, M.L. Wesely and B.M. Levinson. 1991. Integrated Monitoring in the
U.S.-Canada Transboundary Region, Monitoring for Integrated Analysis. Report to the
International Joint Commission, Air Quality Advisory Board.
Canadian Association of Environmental Analytical Laboratories. 1992. Laboratory
Accreditation Program for Biological Environmental Toxicity Testing (Draft).
Crane, J.L. 1986. Workgroup Summary: New Quality Control Issues. In: Methods for
Analysis of Organic Compounds in the Great Lakes, Volume II. Proceedings of an
International Workshop. W.C. Sonzogni and D.J. Duke, Chairmen. WIS-SG-86-244.
University of Wisconsin Sea Grant Institute, Madison.
Dux, J.P. 1986. Handbook of Quality Assurance for the Analytical Chemistry Laboratory.
Van Nostrand Reinhold Co., New York.
Elliot, J.M. and C.M. Drake. 1981. A comparative study of seven grabs used for sampling
benthic macroinvertebrates in rivers. Freshwater Biology 11: 99-120.
Environment Canada. 1990. Guidance Document on Control of Toxicity Test Precision
Using Reference Toxicants. EPS 1/RM/12. Environmental Protection Service, Ottawa.
Environment Canada/Department of Fisheries and Oceans (EC/DFO). 1991a. Aquatic
Environmental Effects Monitoring Requirements, Annex 1, Aquatic Environmental
Effects Monitoring at Pulp and Paper Mills and Off-Site Treatment Facilities Regulated
under the Pulp and Paper Effluent Regulations. EPS 1/RM/18. Environmental
Protection Service, Ottawa.
75
-------
Environment Canada/Department of Fisheries and Oceans (EC/DFO). 1991b. Technical
Guidance Manual for Aquatic Environmental Effects Monitoring at Pulp and Paper
Mills. Volume 1 Overview and Study Design, Volume 2 Methodology. Environmental
Protection Service, Ottawa.
Environment Canada/Department of Fisheries and Oceans (EC/DFO). 1991c. Toxic
Chemicals in the Great Lakes and Associated Effects: Vol. 1. Contaminant Levels and
Trends. Government of Canada, Ottawa.
Environmental Protection Agency (EPA). 1980. Interim Guidelines and Specifications for
Preparing Quality Assurance Project Plans. Quality Assurance Management Staff,
Washington, DC.
Environmental Protection Agency (EPA). 1984. Definition and Procedure for the
Determination of the Method Detection Limit - Revision 1.11. Appendix B to Part 136.
Federal Register 49(209): 26. October 1984.
Environmental Protection Agency (EPA). 1987. Guidelines and Specifications for Preparing
Quality Assurance Program Plans and Quality Assurance Annual Report and Workplans
for EPA National Program Offices and the Office of Research and Development.
Quality Assurance Management Staff, Washington, DC.
Environmental Protection Agency (EPA). 1991. Environmental Monitoring and Assessment
Program, Quality Assurance Program Plan. Environmental Monitoring Systems
Laboratory, Cincinnati.
Environmental Protection Agency (EPA). 1992a. EPA Requirements for Quality Assurance
Management Plans. Quality Assurance Management Staff, Washington, DC.
Environmental Protection Agency (EPA). 1992b. Lakewide Management Planning Process.
Region 5 Water Division, LaMPOST, FY92, 3rd Quarter.
Fay, L.A. 1987. Great Lakes Methods Manual - Field Procedures (draft). Report to the
International Joint Commission Surveillance Work Group. Center for Lake Erie Area
Research, Columbus.
Gilbert, R.O. 1987. Statistical Methods for Environmental Pollution Monitoring. Van
Nostrand Reinhold Co., New York.
Green, R.H. 1989. Power analysis and practical strategies for environmental monitoring.
Environ. Res. 50: 195-205.
Hedtke, S., A. Pilli, D. Dolan, G. McRae, B. Goodno, R. Kreis, G. Warren, D. Swackhamer
and M. Henry. 1992. Environmental Monitoring and Assessment Program, EMAP-
Great Lakes Monitoring and Research Strategy. Environmental Protection Agency,
Environmental Research Laboratory, Duluth.
76
-------
Hunt, D.T. and A.L. Wilson. 1986. The Chemical Analysis of Water. General Principles
and Techniques. Second Edition. Royal Society of Chemistry, London.
International Joint Commission (IJC). 1988. Summary: A Plan for Assessing Atmospheric
Deposition to the Great Lakes. Surveillance Work Group, Atmospheric Deposition
Monitoring Task Force, IJC Great Lakes Regional Office, Windsor.
King, D.E. 1984. Principles of Control Charting. Ontario Ministry of the Environment,
Laboratory Services Branch, Rexdale.
King, D.E. 1989. Code of Practice for Environmental Laboratories. Ontario Ministry of
the Environment, Laboratory Services Branch, Rexdale.
Kuntz, K.W. and B. Harrison. 1991. Field Sampling Design and Quality Control
Procedures used in the Niagara River Program. Chemical Institute of Canada
Conference, Hamilton, June 1991.
L'Italien, S. 1992. Great Lakes Water Quality Monitoring Quality Assurance/Quality
Control. Environment Canada, Water Quality Branch, Inland Waters Directorate,
Burlington.
McCrea, R.C. and J.D. Fischer. 1992. Quality Assurance and Control Considerations for
the ISOMET Stream Sampler (draft). Environment Canada, Burlington.
Metikosh, S., J.O. Nriagu, J.M. Bewers, O. El-Kei, D.C. Rockwell, R. Rossman, I. Sekerka
and P.A. Yeats. 1987. Recommended Sampling and Analytical Protocols for Trace
Metals Measurement in Great Lakes Open Waters (draft). Report to the Surveillance
Work Group from the Technical Group on Sampling and Analytical Protocol.
Neilson, M. and R. Stevens. 1988. Evaluation of a large-volume extractor for determining
trace organic contaminant levels in the Great Lakes. Water Pollut. Res. J. Canada 23:
578-588.
Niagara River Sampling Protocol Group. 1988. Niagara River Sampling Protocol.
Environment Canada, Burlington.
Niagara River Toxics Committee (NRTC). 1984. Final Report to the Niagara River Toxics
Committee. Report by the Data Quality Subcommittee.
Ontario Ministry of the Environment (MOE). 1988. Estimation of Analytical Detection
Limits (MDL).
Ontario Ministry of the Environment (MOE). 1990. Toxics Deposition Monitoring
Network, Quality Assurance Plan. ARB-060-90. Air Resources Branch, Atmospheric
Research and Special Projects Section, Toronto.
77
-------
Ontario Ministry of the Environment (MOE). 1992. Report on the Analysis of Quality
Assurance and Quality Control Data for the Petroleum Sector.
Overton, W.S., D.L. Stevens and D. White. 1991. Design Report for EMAP, Environmental
Monitoring and Assessment Program (Draft). Environmental Protection Agency,
Environmental Research Laboratory, Corvallis.
Peck, D.V., J.L. Engels, K.M. Howe and J.E. Pollard. 1988. Aquatic Effects Research
Program, Episodic Response Project, Integrated Quality Assurance Plan. EPA 600/x-88-
274. U.S. EPA Environmental Monitoring Systems Laboratory, Las Vegas, Nevada.
QAMS. 1986. Development of Data Quality Objectives. Description of Stages I and II
(draft). Quality Assurance Management Staff, Environmental Protection Agency,
Washington, DC.
Rang, S., J. Holmes, S. Slota, D. Bryant, E. Nieboer and H. Regier. 1991. The Impairment
of Beneficial Uses in Lake Ontario. Report to Environment Canada, Great Lakes
Environmental Program Office.
Rathke, D. and G. McRae. 1987. 1987 Annual Report of the Water Quality Board to the
International Joint Commission, Appendix B, Sections 4.4 and 4.5. International Joint
Commission, Windsor.
Reckhow, K.H. 1979. Uncertainty analysis applied to Vollenweider's phosphorus loading
criterion. J. Water Poll. Control Fed. 51: 2123-2128.
Schumacher, B.A. 1992. Quality Assurance Management Plan for the Assessment and
Remediation of Contaminated Sediments (ARCS) Program. EPA/600/92.
Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Las
Vegas.
Shackleton, M.N. 1990. Technical and Operating Manual, Toxics Deposition Monitoring
Program. Ontario Ministry of the Environment, Air Resources Branch, Toronto.
Turle, R. and R.J. Norstrom. 1987. The CWS Guide to Practical Quality Assurance for
Contracted Chemical Analysis. Technical Report Series No. 21. Environment Canada,
Canadian Wildlife Service, Headquarters.
Weseloh, D.V., J.E. Elliot, T.J. Miller and J.H. Olive. 1988. Herring Gull Surveillance
Program, Great Lakes International Surveillance Plan (GLISP), Volume III. Surveillance
Handbook. Report to the International Joint Commission, Water Quality Board.
Weseloh, D.V. and P. Mineau. 1991. Herring Gull Surveillance Program, Field Protocol.
Environment Canada, Canadian Wildlife Service, Burlington.
78
-------
TABLE 1:
GENERIC FRAMEWORK FOR A QUALITY ASSURANCE PLAN
Quality Management (in QA Program and Project Plans)
Program (Project) Description (overview)
Main and Related Projects, Relationships between Projects
Data Collection Activities, Basic Design Approaches
Objectives and Questions, Data Uses to Address Questions, Decision Framework
Program (Project) Data Quality Objectives
Power to Answer Questions with Specified Resolution
Sensitivity, Precision, Accuracy, Comparability, Compatibility
Completeness, Representativeness, DQO and Related Assessments
Program (Project) Organization
Responsibilities and Authorities, Reporting Channels and Schedules
QA Policy Statement, Resource Allocations and Strategies
Document Control (for QMP, QAPP, SOPs, QA reports, data files)
Procurement Process, Staff Training, Evaluation and Safety
Field Operations and QA/QC (in QA Project Plan)
Sampling Design (based on objectives and previous data)
Adequate Replication, Time and Area Selection, Unbiased within Areas
Sampling Procedures (may reference SOPs)
Equipment and Instrument Maintenance and Calibration
Sample Collection, Preservation, Custody, Storage, Records
In-field Quality Control
Laboratory Operations and QA/QC (in QA Project Plan)
Laboratory Facilities, Layout, Reagent and Climate Control
Analytical Procedures (may reference SOPs)
Equipment and Instrument Maintenance and Calibration
Sample Receiving, Storage, Preparation, Analysis, Records
Internal Quality Control
Quality Assurance (in QA Program and Project Plans)
Interlaboratory and Accreditation Studies (participation, follow-up)
Performance and System Audits (schedule, checkpoints, criteria)
Internal QC Data Review (schedule, checkpoints, criteria)
Project Data Review (reduction, validation, reporting)
Corrective Actions (problems, actions, effectiveness)
Assessment of DQO Achievement (recommendations)
QA Reports (schedule, content, include with QAPP revision)
79
-------
TABLE 2: PAST REVIEW OF GREAT LAKES PROGRAMS 1986-88
(BEAK, 1988)
poor documentation of QA programs - especially for biological measurements.
round-robins detecting interlaboratory bias - significance to programs unclear
no systematic approach to bias estimation - no statement of comparability or compatibility
goals
analytical DQOs were stated for sensitivity, precision, accuracy
definitions and uses of these performance measures were inconsistent or lacking
no goals defined for total uncertainty or power to answer questions
questions and data uses too vague to permit definition of such goals
study designs were stated but not linked to goals, objectives or uses
projects and objectives diverse - no integrated lakewide objectives
data reporting practice variable - bias correction, quality flags, low level flags (T, W)
inconsistent and infrequent reporting of data quality and therefore little use of such information
80
-------
TABLE 3: MQOs FOR INORGANIC AND ORGANIC CHEMISTRY ANALYSES FOR THE ARCS PROGRAM (a) (excerpt)

MDLs (b) are reported for sediment (µg/kg), tissue (µg/kg), elutriate (mg/L) and water (mg/L).

Parameter              MDL (b)     Accuracy (c)   Frequency   Precision (d)   Frequency
DO                     10          ± 0.5 mg/L     1/batch     ± 0.1 mg/L      1/batch
Conductivity           N/A         ± 1 µS/cm      1/batch     ± 2 µS/cm       1/batch
Chlorophyll-a          100         ± 20%          1/batch     ≤ 20%           1/batch
Total solids           0.001 g     N/A            N/A         ≤ 20%           1/batch
Volatile solids        0.002 g     N/A            N/A         ≤ 20%           1/batch
TSS                    0.001 g     N/A            N/A         ≤ 20%           1/batch
PSA                    0.001 g     windows        1/batch     ≤ 20%           1/batch
(illegible)            0.001 g     ± 20%          1/batch     ≤ 20%           1/batch
Moisture content       0.001 g     N/A            N/A         ≤ 20%           1/batch
(illegible) content    0.001 g     ± 20%          1/batch     ≤ 20%           1/batch

a - MQOs presented do not apply to the measurement of water quality parameters associated with bioassays or fish bioaccumulation studies.
b - MDLs for water include porewater and water column samples. Units presented in the subheading are applicable to all parameters unless
otherwise noted. If no MDL is presented, then that parameter is not measured in that given matrix. N/A = not applicable.
c - Accuracy determined from CRM, SRM or standard, and is measured from the known concentration.
d - Precision is calculated as % RSD. It should be noted that LLRS will only be performing duplicate analyses; therefore, the limit will
be calculated as a RPD. Precision requirements listed here are for analytical replicates only; field duplicates are required to have a RPD
≤ 30%.
-------
TABLE 4: MQOs FOR BIOASSAYS AND FISH BIOACCUMULATION STUDIES FOR THE ARCS PROGRAM (excerpt)

Assay Organism/Community      Endpoint (a)           Response limit (mean) (b)   Precision (c,d)   Accuracy (d)
Pimephales promelas           Survival               80%                         1 CR              60%
                              Larval growth          0.25 mg                     25% or 1 CR       60%
Daphnia magna                 Survival               80%                         45%               40%
                              Reproduction           60 young                    1 CR              40%
Ames Assay (Salmonella)       Revertant colonies     N/A                         25%               N/A
Alkaline phosphatase          Enzyme activity        20%*                        25%               80%
Dehydrogenase                 Enzyme activity        20%*                        25%               80%
β-Galactosidase               Enzyme activity        20%*                        25%               80%
β-Glucosidase                 Enzyme activity        20%*                        25%               80%
Benthic community structure   Abundance of species   N/A                         N/A               N/A
Rapid Bioassessment II, III   Community indices      N/A                         N/A               80%

a - For bioassays in which survival is the primary endpoint, MQOs are only presented for the survival endpoint. The additional endpoints,
such as avoidance, uptake, development and molting frequency, will be recorded by the PIs.
b - The response limit is presented as the mean of the test replicates in the control or blank sample. For Ceriodaphnia dubia and Daphnia
magna, the reproduction response limit is the cumulative total of three broods. N/A = not applicable.
c - CR = concentration range in the serial dilution assays. The percentages are the maximum % RSD of EC50 and LC50 values allowed
among the replicates.
d - Precision values presented are the maximum % RSDs allowed among the replicate tests. Accuracy limits presented are the maximum
% RSDs compared through time for a given test of reference toxicant or reference sediment.
-------
FIGURE 1: DECISION FRAMEWORK FOR GREAT LAKES MANAGEMENT
(modified from EPA, 1992b)
[Flowchart. The central question, "Are Ecological Resources Impaired or Threatened?", leads either to "Declare Success!" or into parallel toxics and habitat tracks whose boxes include: Determine Critical Pollutants; Identify Sources; Quantify Loads; Establish Load Reduction Targets; Identify Prevention, Reduction, and Remediation Activities; Implement Activities; Monitor Source Reductions; Monitor Ecosystem Response; Identify Causes; Identify Critical Habitats; Estimate Habitat Losses; Habitat Inventory Targets; Identify Protection/Restoration Activities; Reduce Harvest Targets; Monitor Habitat Value.]
83
-------
FIGURE 2: CONSEQUENCE OF MEASUREMENT ERROR IN DECISION-MAKING.
Power = 1 - β (from EC/DFO, 1991a,b)
[Distributions under H0 and H1 plotted against the mean response, with:
1. Pr(rejecting H0 when µC - µT = 0) ≤ α
2. Pr(not rejecting H0 when µC - µT = 0) ≥ 1 - α
3. Pr(rejecting H0 when µC - µT ≥ Δ) ≥ 1 - β
4. Pr(not rejecting H0 when µC - µT ≥ Δ) ≤ β]
84
-------
Model:  P = L / (11.6 + 1.2 qs)
where:  L = areal phosphorus loading, with error sL
        11.6 = net settling velocity to sediment (m/a)
        qs = areal hydraulic loading (m/a)
        P = predicted phosphorus concentration (mg/L)
        K = sL/L (relative error in the loading estimate)
[Panels show the probability of assigning a given trophic classification as a function of the predicted phosphorus concentration P, for values of K from 0 to 1.0.]
FIGURE 3: CONSEQUENCE OF LOADING ERROR IN TROPHIC STATUS DESIGNATION
(after Reckhow, 1979)
-------
FIGURE 4: SPECIES RECOVERY OBJECTIVE IN A BIOLOGICAL SURVEY
[Species-accumulation curve (from Smith, 1974): number of species versus number of quadrats (5 to 40), with the point of maximum return indicated.]
86
-------
FIGURE 5: SAMPLING METHOD EFFECTS ON LOG TOTAL DENSITY
[Log total density by station for three sampling methods: Ponar grab with 200 µm sieve, Ponar grab with 500 µm sieve, and airlift sampler with 200 µm sieve.]
-------
FIGURE 6: REPLICATION EFFECTS ON APPARENT SPECIES RICHNESS
[Apparent species richness (number of taxa) versus number of replicates; station SMD 4.2E, airlift sampler.]
88
-------
FIGURE 7: RANGE AND AVERAGE CONTROL CHARTS FOR TOXICITY TESTING
[Upper panel: average control chart for reference toxicants (rainbow trout, sodium pentachlorophenate); LC50 (mg/L) plotted by test date through 1989, with upper and lower control limits. Lower panel: control charting of duplicate toxicity test data; range (%) of the difference between duplicates using rainbow trout, plotted against the mean effluent LC50 (%) of the duplicates, with mean range and control limit.]
89
-------
FIGURE 8: SORTER AND TEMPORAL EFFECTS ON LOG TOTAL DENSITY
[Log total density by station for three sample sorters: DF (1985), EJ (1985) and MS (1983).]
-------
FIGURE 9: [Inter-ship study on Lake Erie; TFP results compared among ships (Rathke and McCrea, 1987).]
-------
FIGURE 10: INTERLABORATORY STUDY OF TOXICITY TEST PERFORMANCE
[LC50 results from seven laboratories (BC, ED, HFX, YUK, IEC, ONT, QC), grouped by alkalinity (<700 mg/L and >700 mg/L), compared with the true concentration of the unknown.]
-------
COMMENTS AND QUESTIONS
Workshop Participants
[Prior to break-out to individual working groups]
George Crawford:
Don Hart has talked to us in some detail and given us many good and useful
examples on all of these subjects, particularly those that are highlighted by Table 1. What
we're interested in now - as a result of the plenary session that you've listened to, through
the presentation by Don and in consideration of discussions with your colleagues here - is
whether there is any subject that you feel is missing, too soft, too hard or needs further
exploration to proceed?
Dave Bigelow:
While listening through this, I have the following comments ... I think, in this design,
from my experience that it's a mistake not to include data management as a separate topic
along with field and laboratory. I challenge you and your groups to consider what you
would put in data management. To give you some examples, things like ... conversion
factors. Supposing that all the U.S. people are working in the English system and everyone
else is working in the metric system and we decide that we're going to convert to two
significant digits versus three significant digits versus four significant digits. We can all
introduce biases there. It's an item that can all be collapsed together under an issue of data
management.
I look at some issues about censoring or sequestering of data ... think about spelling
this out in advance. If you are an observer and you look out and say "well, I didn't see
anything". Well, that's a little different than if you said "well, I forgot to go out and the
data's missing" or "I went out and looked and didn't see anything, so it must be beyond
"x"". That's censored data. There are big differences in terms of how you're going to report
for all these different situations. I think it's important that you think about this when you're
bringing a generic plan together.
Another important issue is when you're doing integrated studies you almost always
end up with a bunch of data sets that are pulled together. You need to consider whether or
not this (combined) data set is a 'product' of the program. In most cases ... yes, it is ... and
it is the primary point where you're going to jump off and find out if you met your program
objectives. The data set is a product and it needs to be explicitly defined. All the
parameters that you're going to use ... I don't think it should be left up to program
management, per se. I think that it's a data management question.... That's often the way
work is distributed anyway, at least in my experience. One group has responsibility for
some laboratory measurements, another group or another contractor does some of the field
work and yet another group does the data management. So why not organize your data
quality objectives and quality control activities in the way that the work's done?
93
-------
... Think about the quality assurance in each of these areas. You can think about
interlaboratory or accreditation studies in data management. You can do round-robins with
instruments. You can send instruments around to various places and find out if they perform
the same way. You can send people around to work on a particular instrument to make sure
they all use it the same way. Well, that's no different than the round-robins that you're
doing in the laboratory - you can do the same things with data. If you've been involved in
data management and you take something from a DOS machine and you put it into a Mac
or you put it into a VAX or you put it into some UNIX box, you don't always get the same
number. You need to control that - you need to make sure you're getting the same number.
So some of these techniques that are in 'quality assurance' can be applied exclusively in
many operational areas. You can ask yourself: what's the precision needed in field sampling
versus the precision needed in laboratory analysis, which is the precision I need to carry
along to data management? For any other one of those quality parameters, you can do that.
So I'd offer that as some food for thought for the groups: "Do you really want 'quality assurance'
addressed outside of the other operational categories, or do you want to deal with it
explicitly in each operational area?"
John Lawrence:
I think we also need to keep in mind when we have the discussions that we heard a
lot today in the Great Lakes program and the EMAP program where we're starting to put
much more attention than in the past on integration and integrated monitoring. I think we
need to really bear in mind that we're not talking about QA programs for a water sampling
program or a terrestrial program or an atmospheric program - we're really talking about
integration and looking at everything from an ecosystem perspective. I think that holistic
view probably comes in primarily within the quality management block but also it's going
to have to be reflected down in the field, the lab and the QA sections as well.
Don King:
I think we compartmentalize too much, "there's lab people, there's field people,
there's program people, there's data management people". I would hope that we don't
maintain the areas of the past where those people are so separate that they don't even talk
to each other and, in fact, they exclude people from input. So much of what lab people do
is affected by what they think the client thinks or the boss or the public might need and we
end up screening everything through several vertical layers of management or committees.
If we can take a more TQM approach ... spread it out and get everyone involved so that if
an idea comes up, it comes from me to the group - not from me to my boss to my boss's
boss. We've got to get ideas disseminated and broadly discussed.
94
-------
Jim Bowers:
... I'm from Westinghouse. I'm a limnologist. I work for a private corporation so, in
some respects, maybe my perspective is a little different.... First of all, we hear a lot about
total quality management. That's rampant in my own corporation, it's rampant in a lot of
governments right now. I would suggest... to this group - always remember, for those of us
who have read Deming's books, total quality management was built for manufacturing
THINGS. So if you're hanging doors on Toyotas, that system of thought works very well....
To implant that QA strategy and management thought template on development work or
QA/QC work for an ecological risk assessment or Superfund site is not the same thing.
We're not making anything. So I would temper our acceptance of that.
As a certified ecologist, I would also humbly suggest... some ecology here. QA, to
me as a scientist, when I review an NSF proposal or an EPA research proposal, means an
algebra equation. QA = quality plus accountability. They are two very distinctive things.
I have monitored and seen and reviewed several programs that have a lot of excellent
accountability. Quality wise they were terrible. Why? Because they were trying to test an
untestable hypothesis or they asked the wrong question over the issue. We've got to be very
careful there, as to what you call quality and what you call accountability. I can run all the
blanks and do the round-robins with the samples but if you're measuring the wrong
parameter, what difference does it make? There's been a lot of environmental research
practised like that, where we ask the wrong question. Let's put that in the QA program to
bring up the quality as much as the accountability.
If you think about population biology ... this whole environmental business is getting
beyond measuring pH and a bunch of physical and chemical parameters ... but we're starting
to measure biological rate processes or fate and effects of a chemical at the population or
community level. We're trying to formalize that. I don't see the ecology coming in on that.
I don't see the latest breakthroughs that are given at the Ecological Society of America, the
latest paradigm or observations on what works or doesn't work getting into the QA/QC
process. I don't see bottom-up, top-down theory in ecosystems. That hasn't been mentioned
here, but it is the paradigm driving all ecologists around the world now on how they view
ecosystems. If you don't know what top-down, cascading trophic effects are or if you don't
know what bottom-up effects are, that was developed in open water oceanography, then how
can you talk about ecological stress if you don't even know ecosystem structures and how
they function. That comes to quality - let's get some modern ecology into the QA/QC
process right up front, right up with our management.
Integration - I like people to use that word. I've done some projects on Lake
Michigan and if you're going to look at trophic dynamics and try to unconfound the effects
of toxicants, you've got to sample together. I'm on the Laurentian one day taking biological
samples and somebody else gets there two days later on the Inland Seas and takes samples
for chemicals. Then they're going to integrate the results - I'm sorry, you can't do it, it won't
work. In studying primary production - the nutrient samples come out of the same water
bottle that you run your 14C assimilation on. That's integration, where it's time and space
-------
exactly together. If you pull a sample out of a lake, like Tenolli always said, it represents
neither the place nor the history nor the environment of where you pulled that sample from.
Don King:
I'm going to follow on what Jim's indicating. In the MISA program, we've looked
at hundreds of parameters. I think it was a useful exercise in the end because it
demonstrated the things that everybody was worried about, really weren't there to worry
about. But we never did find out whether they had been there to worry about. We know
that several parameters disappear - like resin and fatty acids, as soon as you try to spike a
biologically-active sample, it's gone, even when it's preserved. So if you're going to try to
sample what's going on in the environment, which is a dynamic equilibrium at best, are our
static procedures - taking samples back to a lab that's hundreds of miles away and completing
the analysis five days later - going to give us the answer that we want? We're going
to have to focus on what we're trying to measure and why we're trying to measure it before
we start telling labs to do a lot of work, which will end up giving information but not the
right information.
George Crawford:
That sounds similar to Jim's point, i.e., be sure you ask the right question first before
you start things off.
Don King:
... Quality assurance should be up front, to help us decide not to do something rather
than a year later finding out we've wasted our time.
James Stribling:
I just wanted to echo that myself. I think what Jim was saying and also what Don
was saying were close to the same thing - understanding the questions to be asked and how
you're going to answer those questions. A lot comes from current ecological thinking, even
ecological studies on the forefront of that science. My understanding from when I started
learning about DQO's from the proceedings of the third workshop, is that part of the DQO
process is to state your objectives very clearly, break them down into more specific
questions, decide what measurements are going to be taken to answer those specific questions,
decide what variables exist which might affect your being able to answer those questions
and, then the next step, to decide which of those measurements you're going to take. You're
not taking an encyclopedic approach, but asking which suite of variables are going to give
you data to help you address your original objectives. You don't have to measure everything
96
-------
in the world. Chosen variables should be based on the best current ecological thinking and
that does have a lot to do with the indicator development process that EMAP is using.
Dave Bigelow:
Listening to the last two people, I think there are several levels to keep in mind. One
is that you're doing some quality assessments on the project as a whole. But you're also
assessing the parts. You have two functions: ... you're trying to ensure that the end product
was done in a quality way .... you're also trying to give some information to management.
Those are two different things. For some of these things we're referring to the quality of
the end product and for other things we're talking about the quality of the parts that went
into it. To some extent, those are two different things. The bottom line is, if you look at
the data when all is said and done, it ought to make some sense. One thing that I like to do
with different kinds of data, especially temporal data, is to use one of these manufacturing
techniques. Put up a control chart, look at the median, look at the pseudo-sigmas or the
inter-quartile ranges. If you look at that over ten years of data or twenty years of data, it's
kind of interesting. You get an idea of the distribution. In certain periods if I have lots of
variability, maybe I'll go back and look at that QA information. Maybe there's something
there that tells me why it's variable. Or maybe there's something there that gives me a good
confidence that the structure is real. So there's a reality check there. You go through all
these steps - but you want to make sure at the end that the result makes good sense
scientifically. For example, is there an ion balance in the chemistry?
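[A minimal sketch of the robust screening described above, added for illustration; the data
and the screening limits are hypothetical.]

    # Median and pseudo-sigma (IQR / 1.349) screening of a time series.
    import numpy as np

    def robust_limits(values, k=3.0):
        v = np.asarray(values, float)
        median = np.median(v)
        q1, q3 = np.percentile(v, [25, 75])
        pseudo_sigma = (q3 - q1) / 1.349   # robust analogue of the standard deviation
        return median - k * pseudo_sigma, median, median + k * pseudo_sigma

    lo, med, hi = robust_limits([5.2, 4.9, 5.4, 5.1, 9.8, 5.0, 5.3, 4.8])
    print(f"median {med:.1f}, screening limits {lo:.1f} to {hi:.1f}")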
Craig Palmer:
I wanted to, first of all, support what Dave said at the very start about data
management. We had a situation this past year where we took all of our field data and,
when we went to collapse the data into plot level aggregate data, we found that there was
a glitch in our program. For a certain percentage of our data it created gobbledegook.
Fortunately, some of our analysts picked it up before we published our final report. But it
brought to mind that there could be a number of things happening in information
management systems. I was talking to a fellow who handles lots of data, not in an
ecological study but at a test range where they fly these new airplanes.... What they do on
their data, because they have lots of data, is they send out these test data packs ... and then
send it back home through their information management system to see if it comes back the
same. We don't do that in ecological studies but we collect an awful lot of data in the field,
in our program it's all on portable data recorders, but we don't check the data management
system by sending out data and seeing if it comes back through the whole system intact. So
there are a number of things about QA that we can expand in terms of information
management.
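[A minimal sketch of such a round-trip check on a data management system, added for
illustration; the record format and the hashing approach are assumptions.]

    # Send a known test record set through the system and verify that what comes
    # back is identical, independent of key order.
    import hashlib, json

    def fingerprint(records):
        canonical = json.dumps(records, sort_keys=True).encode("utf-8")
        return hashlib.sha256(canonical).hexdigest()

    sent = [{"station": "A1", "param": "TP", "value": 12.4}]
    received = [{"param": "TP", "station": "A1", "value": 12.4}]   # after round trip
    print("intact" if fingerprint(sent) == fingerprint(received) else "corrupted")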
As a second point, it seems to me, that under field operations there is a real emphasis
on sample collection, for subsequent submission to the laboratory. In our program, I'm in
EMAP forests, we collect most of our data through field observations, and I think a lot of
97
-------
ecological data is like that where it's actual field observations. I'm not in that group but I
would like to suggest that the group that is looking at the field operations might give some
thought to quality assurance and some of the different things we have to do to address field
measurements in which the actual data is collected right in the field through field observers.
Don King:
Something that I've been involved in recently is looking at the quality of organic
data. We're in the beginning of looking at ways to improve the quality of that information.
It's always a practice to set up limits for expectations and if a number exceeds that
expectation, then you immediately turn a knob, push a button, do something. But if you
haven't got any information to support your expectations, you might end up fixing something
that didn't need to be fixed. Several years ago, a field study brought home to me the need
to record other things that are unusual. Not necessarily everything, but things that are
different. We were doing a between lab, between province/state transect of the Detroit
River. Two boats were out there taking samples morning and afternoon for two or three
days. One day, in the morning, all of our phosphate data was flying all over the place.
Where Michigan had a nice smooth curve going from one shore to the other shore, ours went
across part way. There's an island upstream -on one side of the island the data were all over
the place. On the other side of the island, it was fine. In the afternoon, phosphate was
beautiful across the river. The ammonia and the nitrate were up and down all over the place.
It turned out that the ammonia and nitrate were related ... 1 to 2. So after plotting the data,
we thought we had a lab problem. I called up the field people. I say, "Was there a strong
wind down the river that day?" He says, "Yeah, it was about 30-35 miles an hour." I say,
"Is there a fertilizer plant upstream?" He says, "Yes." They were using open-mouth jars,
sitting on the deck, not a usual practice but in this particular situation it was the wrong thing
to do. We gathered up about a milligram of fertilizer in these containers while we were on
the one side of the island.... I think it's important to consider up front what other observations
may be keys to data interpretation, because there may be something that shows up.
Dave Bigelow:
I'd like to mention another part of the QA process ... for lack of a better way to put
it. It's important early on to begin using the data. When you start to do that, then you have
some explicit knowledge of how things are really going together. To me, that's something
that needs to go into management. Piloting is one way to put it. I think that's very
important. I don't know exactly where it goes in a QA plan.
98
-------
Dan Orr:
You've been talking about integrating data. I'm just wondering what sort of
mechanisms are in place, or does some agency exist to act as a clearinghouse for Great Lakes
data? I know that there are a number of integrated data sets in Canada, but I'm just
wondering if there is something for the Great Lakes? Or who would have that
responsibility?
Craig Palmer:
I was at a workshop recently and I asked that question and they responded by saying,
"Since you asked the question, you're responsible".... This past year we had a beautiful QA
Project Plan. The only one that knew what was in it was our QA officer. All of our lead
scientists had written little parts of it... and when it came to implementing it, it really fell
apart. Where can we put into the QA planning process or the Project Plan some action to
make sure it happens (in the sense that all of it gets implemented logistically and that people
understand that these are the QA requirements for this project, that it's not just the QA
Officer who knows about it)? Does that come in under Quality Management? Don, did you
address that in any part?
Don Hart:
It's probably a part of communications. I guess I would see it as more of a program
organization issue.
George Crawford:
Ok, so that's a suggestion that comes under program organization, that third bullet
on Table 1 (Responsibilities, Authorities, Reporting Channels).
Craig Palmer:
But why organization when we're communicating...?
George Crawford:
Maybe the word communications is lacking.
99
-------
Don King:
I came to this workshop thinking that it was just going to be an exercise. I get here
and I find out that there is a major new initiative in the Great Lakes starting. One of the
questions is: if we're going to do QA, quality accountability, how long in advance do we
have to plan it? Can they afford to give us two years to get our act together, or is it going
to be another project that starts November 1st... and gets the report out by November 30th?
Should we be discussing how long it takes to get our act together? ... I think the key thing
is that the program management organization has to really address the communication
aspects and make sure that everybody is on the team. They can't wait until after the project
is started to have that happen.
George Crawford:
We've got thirteen extra points here, although there may be duplication. I'm not
quite sure where they belong, where we should place them. But that's a half hour so I'm
going to hold you to my word and suggest we cut that off at this point. ... we'll get these
printed up for tomorrow, and then it will be up to each of the working groups to decide
where they fit. I don't think I would be presumptuous enough to try to categorize for you,
where these items would fit under a generic plan. But given Table 1 as a starting point for
us all, let's try to figure out where these belong, and whether some are more important than
other things listed here, or what their status might be.
100
-------
TABLE 1: VALUE ADDED ELEMENTS FOR CONSIDERATION AS A RESULT OF DAY
1 OPEN FORUM FOLLOWING PLENARY SESSION
The following represent additional elements for inclusion in a Generic Quality Assurance Program
Plan, as identified by the Workshop participants following Dr. Hart's presentation; the category
assigned to each item is shown in parentheses. Apologies to the contributors for any variance
between their original comments and this text.

Item 1 (Lab; Data Mgt.) - Standardization of lab QA practices to improve data integration (e.g., standardization of correction factors, recovery, blanks, surrogates, etc.)
Item 2 (Data Mgt.; Lab; Field) - Censoring and sequestering of data; need for standardization of rules, protocols
Item 3 (Data Mgt.) - Data integration, creating a new "product" that requires standardized approaches for review, edit, categorization
Item 4 (DQOs; Prg. Mgt.; Data Mgt.) - DQOs: development of details at separate and appropriate levels for lab, field and data management, then integration into the overall plan
Item 5 (Prg. Mgt.) - Integration of ecosystem-wide studies into a major program: impacts of diversity and need to flag potential problems
Item 6 (Prg. Mgt.) - DQOs and TQM as a mechanism for identifying goals, objectives, criteria for success and ongoing review, increasing the "power to answer the question"
Item 7 (Prg. Mgt.) - TQM ("pure" Deming) and the translation from production/manufacture to the areas of ecological program management: a perceived limitation
Item 8 (Prg. Mgt.) - DQOs and TQM: help to ensure you are asking the right question, the first time, in the development of the program
Item 9 (Prg. Mgt.; Field Ops.) - Field data: integration of temporal, spatial, visual, logical data, ensuring a "complete database" (metadata)
Item 10 (Data Mgt.; Prg. Mgt.) - Quality information being integrated from many sources: quality data versus information management, ensuring data integration integrity
Item 11 (Data Mgt.; Lab Ops.; Field Ops.) - Data management: data handling/transfer verification; use of a "standard reference data set" passed through all stages of data handling as a check-sum error test
Item 12 (Data Mgt.) - Data utilization: hands-on assessment and identification of problems while running the program; key people hands-on, less accumulation for review later or at the end of the program
Item 13 (Data Mgt.; Prg. Mgt.) - Data management: multi-party or multi-jurisdiction data integration; identification of a "clearing-house", responsibility assignment, protocols, data sharing agreements
101
-------
FIFTH ANNUAL ECOLOGICAL QUALITY ASSURANCE WORKSHOP
WORKGROUP PANEL DISCUSSION
Thursday, October 15, 1992
-------
PROGRAM MANAGEMENT ISSUES
Working Group #1
Facilitators:
Dr. Mike Papp (leader) - U.S. EPA, GL National Program Office
Dr. Todd Howell (rapporteur) - Ontario MOEE, WRB
Participants:
Dr. Martine Allard - Environment Canada
Robert Bilyea - MOEE, WRB, Great Lakes
Sharon Brown - MOEE, WRB, Great Lakes
A.L. DaCosta - Imperial Oil Limited
Dr. Sushil Dixit - Queen's University
Mark Gordon - MOEE, WRB, Watershed Management
Dr. John Lawrence - Environment Canada, NWRI
Francis Philbert - Environment Canada, Conserv. & Protection
J.-Guy Zakrevsky - Environment Canada, Saskatchewan
David Bigelow - Colorado State University
James Giattina - U.S. EPA, GL National Program Office
Steven Nemeth - General Motors Canada Ltd.
Dr. Harvey Shear - Environment Canada, GL Env. Office
Dave Rokosh - MOEE, WRB, Watershed Management, Bioasmt.
Larry King - Acres International
105
-------
PROGRAM MANAGEMENT ISSUES
Todd Howell
Ontario Ministry of Environment and Energy
The process that our group took to address our charge was to start yesterday by
prioritizing the elements in Table 1 of Don Hart's paper, with respect to management issues.
We identified five issues or subheadings within that table that we wanted to spend most of
our effort on, on the basis that we thought they were important. I'll just give you a listing -
under program or project description, there were three. We spent a great deal of effort in
this particular area. Objectives and Questions and Data Uses to Address Questions were the
first heading. Basic Design Approaches was our second area. Relationships Between
Programs was a third. And then under Program Organization, two areas - Reporting
Channels and Communications and Document Control. Just as a qualifier, most of the
discussion approached the topic on the basis of a management level or a program level and
really wasn't at a project level of examination. I'll just briefly go through the primary topics
which we talked about in the five areas.
OBJECTIVES, QUESTIONS AND DATA USES
Under Objectives and Questions and Data Uses to Address Questions, one of the
primary areas of discussion was that the planning process may be iterative
and that development of the QA Plan may need to go through several
cycles. It was important to clearly identify whether that was to be the case and, if so, to
identify some sort of time line to achieve the desired product. We also agreed that it was
important to identify the objectives, the ultimate needs and the degree of confidence required,
bearing in mind the interdependency of the objectives and any other
uses of the information for each of the objectives identified.
There are several other points - I'll just list them. We thought that the cost
implications of the QA planning should be identified. The implications of not completing
elements of the QA Plan should be identified. We thought that socio-political concerns and
issues which impact on the program were important and should be identified. And I guess
the thing that was most important and crucial - we thought that the objectives needed to be
expressed in realistic and achievable terms and that they needed to be developed jointly
by managers, technical experts and the QA expert.
BASIC DESIGN APPROACHES
Moving on to design, basic design questions - we basically developed a list of items
that we considered should be part of the design section. I will just go down through it:
106
-------
Sampling concurrency - It should be recognized that the goals and objectives of a project
may be achieved by merging two sampling protocols that are not concurrent. One,
for instance, may be hourly and another daily, but both can be aggregated to a
weekly estimate (a brief aggregation sketch follows this list). If, however, concurrent
sampling is required, say for meteorological modelling, then the concurrency
requirement should be clearly stated. Oftentimes, considerable savings can be achieved
by recognizing early that concurrent sampling is not a requirement for achieving
project goals or objectives.
Fixed versus dynamic sampling - Programs requiring "on-demand" sampling versus
fixed schedules should be clearly identified. Sampling to capture random or
"triggered" events with a high level of completeness will probably be much more
difficult to implement than those using a fixed schedule with anticipated
incompleteness.
Spatial scale - The spatial scale of the design and the basis for defining the scale
should be clearly defined. Geographic, political, scientific and statistical criteria that
define the spatial interpretation of the data should be identified. If the achievement
of goals requires a random sampling design or a resolution to a political entity (a
city, stream, region, etc.), this should be clearly defined in the planning process. The
bounds of spatial interpolation and extrapolation of the data set should also be
considered.
Finally, it should be identified whether scoping work, such as research or a pilot
investigation, would be required to enable clear definition of the sampling and study design.
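The aggregation sketch referred to under sampling concurrency is given below; the hourly and daily series are simulated and purely illustrative, and pandas is assumed only for convenience.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)

    # Two illustrative series covering the same fortnight, sampled on different schedules.
    hourly = pd.Series(rng.normal(10.0, 1.0, 14 * 24),
                       index=pd.date_range("1992-10-01", periods=14 * 24, freq="60min"))
    daily = pd.Series(rng.normal(10.0, 1.0, 14),
                      index=pd.date_range("1992-10-01", periods=14, freq="D"))

    # Although sampling was never concurrent, both collapse to comparable weekly estimates.
    weekly = pd.DataFrame({"from_hourly": hourly.resample("W").mean(),
                           "from_daily": daily.resample("W").mean()})
    print(weekly)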
RELATIONSHIPS BETWEEN PROGRAMS
Preliminary data - Is it to be released?
The strategy for merging data sets - it has to be identified and evaluated as to its
adequacy. We also thought it was important to make the distinction between
merging equal or like sets of variables, with some degree of understanding of their
precision and accuracy, and merging distinctly different pieces of information
which might be used to build a data set for modelling exercises or for developing
second- or third-level relationships. The criteria for merging these two types of
information should be identified separately.
REPORTING CHANNELS AND COMMUNICATIONS
There was consensus that this was a very important area of the Plan. It was
identified that communications could really serve as a remedial action plan for the overall
QA Plan. If there were problems, effective communication would certainly facilitate the
addressing of difficulties. The point was made that communications is very much tied into
achieving completeness objectives and timeliness goals. By not communicating information
107
-------
effectively or speedily, you could actually lose information because of the time lag in
fixing the problem, and that could impact the overall quality of the project.
The QA person responsible for the project or program should have ... access
privileges, should be part of the management team for the project, should have some degree
of independence and should have access to communication channels both in a vertical and
horizontal direction.
And finally, QA data should never be sequestered.
DOCUMENT CONTROL
We agreed that a review schedule was needed for the elements of the QA Plan.
There was both a need for a formal, fixed schedule and also the option for irregular additions
and modifications to be made. There should be a primary list of changes that are made to
these documents over time, permitting the identification of when and how changes were
made to the quality-related documentation.
You should have a policy to identify how information and data are to be released in
your program. You should have a person responsible for keeping all of your QA
documentation together in one place so you have a complete set that's accessible. And
finally, if and when the project does come to an end, it's very important to have an archive
arrangement worked out so that all of your QA information doesn't just vanish and is
accessible if someone wishes to use your data ten years down the road.
I'm going to stop right here. We did, however, discuss very briefly an alternative
strategy for putting together a management plan. I'll just tell you what the headings are: it
included a data quality management section that would basically cover concerns about
the use and maintenance of the data. Most importantly, the thought was raised that an
integration section, separate from the management section, would be valuable as a place
to identify how all the various bits of QA/QC information would fit together and be used to
assess the overall data product, and how well the overall data product addressed the
goals and objectives of the program.
COMMENTS
None
108
-------
FIELD OPERATIONS QA/QC
Working Group #2
Facilitators:
Dan Orr (leader) - MOEE, Air Resources Branch
Steven Hedtke (rapporteur) - U.S. EPA, Duluth
Participants:
Dr. Peter Kauss - MOEE, WRB
Kenneth Kuntz - Environment Canada, Conserv. & Protection
Emery Law - MOEE, WRB
Karen Ralph - Bayfield Institute
Lisa Richman - MOEE
Jane Richards - Ecologistics Ltd., Lucan, Ontario
Michael Shackleton - MOEE, ARB
Charles Barnett - USDA, Forest Service
Rhonda Mendel - Lotic Inc.
Serge L'Italien - Environment Canada, WQB
Joseph B. Hunn - U.S. Fish and Wildlife, NFCRC
Jim Bowers - Westinghouse Savannah River Lab
109
-------
FIELD OPERATIONS QA/QC
Steven Hedtke
U.S. Environmental Protection Agency
Well, first of all, I have to say that it was the general consensus of our group that
Don's straw document was pretty comprehensive and covered a lot of the things
we talked about. We continued to talk about them anyway, but a lot of the things we did
discuss were in that document. If I could summarize, the three things we felt were most
important were documentation, communication and assessment.... Those were continuous
themes throughout our discussions.
SAMPLING DESIGN
It's interesting that from the field operations side we came up with many of the same
things that the program management side came up with. In spite of the fact that we weren't
developing (necessarily) data quality objectives, we believed that a clear definition of the
question that was going to be addressed and a clear understanding of an acceptable
confidence level or risk or uncertainty had to be expressed by management in order for the
field team to develop an appropriate sampling design to meet those needs. In my
experience, that's been notably lacking in most of the management and organizations that I've
been involved with.
In the process of developing a sampling design, you also need to communicate - you
need to talk to people, you need to get more information, you need to continually express
what you're trying to do and document that so that as you go through the process of
changing the design (which inevitably you will do) you have a good process that documents
all these changes and the justifications and rationales for them.
Unfortunately, most programs in the Great Lakes are limited to some extent by
resources. We recognize that's a constraint, something that upfront also needs to be
expressed in order to develop the appropriate sampling design - one that we can live with.
When the manager tells you that he wants 100% accuracy all of the time and tells you that
you have $50,000 to do that, then you can go back to him and tell him "well, that isn't
exactly what you can accomplish with that amount of money".
Todd also mentioned pilot studies ... we felt that was an important part of developing
the sampling design for almost any program. The pilot studies don't necessarily have to be
ones where you go out in the field and spend six months doing sampling, but could be
taking historical data or available data and doing some paper exercises with that to do pilot
studies. Continually feed back as you go through this process to try to improve the design
and make it as cost-effective as possible.
110
-------
SAMPLING PROTOCOLS
Under sampling protocols, there was a good list of the things that should be included,
that Don has in his working draft. We felt that a methods manual was an important feature,
one organized so that it could be updated as needed. A methods manual
that is prepared and then sits on somebody's shelf, and isn't communicated to the people who are
going out into the field or the new people coming into the program, doesn't do anybody any
good, so you have to make sure that you are, in fact, communicating those methods to the
people.
DATA MANAGEMENT
In terms of data management, we felt that standardized forms are needed for a
particular project (not a generic standard form for all projects) ... in essence, a checklist of
what crews need to look at, including observational data, places for making comments,
things that they need to initial at the end (indicating) that they followed all the methods and
met all of the requirements at a sampling point.
We also felt that the field crews were in a position to do some early
assessment of the quality of the data by looking at the data, going through their
field notebook when they get back to their home base, back to the ship or out of the rain.
Right then, they should be looking at the data to see if there's anything that stands out to
them, that is not appropriate or looks odd, and they need to check on it right away. They are
the ones with the freshest memory of what's gone on.
SUMMARY
This would then be a summary of how we see this process working at the project
level. You could expand it to the program level or reduce it down to the small,
measurement level. Some sort of ... DQOs need to be developed, enabling you to develop
a sampling design which includes what you're going to measure, how you're going to
measure it, and when you're going to measure it. Actually going through a sampling procedure
can be helpful, where you're determining how it is you're actually going to go out and
measure things. You need to have the assessment phase where you are looking at the data
as soon as possible. A number of participants in our group talked about data sitting around
for months or a year or, in some cases, a number of years. You might think it's just fine and
(later) you look at it and you find there are some things you just don't understand. You may
have wasted the whole time period there, whereas if you look at it right away and start
making modifications as necessary, it won't be a wasted effort. We put that assessment
phase in the whole process and it includes the field crews, analytical crews or technicians
... striving not only to improve the sampling procedures but also to optimize the
sampling design. Even if the DQO is met, you may still be able to optimize the sampling
design (and reduce the number of stations, for example). Without this assessment phase, we
111
-------
often get "mindless monitoring" ... where the data builds up, nobody ever looks at it and the
money keeps getting spent and then, finally, somebody looks at it and decides they don't
know why they got it or what they'll do with it. If we follow this sort of cycle, we think
we can prevent that.
COMMENTS
Don King: If you're making changes ... in a year-long project, and six months into
the data set you realize this thing isn't working and you make a change.
Suppose you keep track of the fact that you don't use the previous data
any more - you don't average across these two inconsistent data sets. Does
this mean that the project is now a year and a half long? If you're going for
a full year's data, and you change half way through it, what effect does that
have on the project?
Steve Hedtke: I don't know if there's a generic answer to that. We have lots of problems
and questions like that, because there's this whole range from a very small
one-person research study to a major basin-wide monitoring program. The
significance of that change and how you handle the data depends on the
particular circumstances. If it's a one-person effort, they're the ones that
are going to throw away the data. If it's a multi-agency survey, hopefully
there's a data management scheme in place to track and deal with those
problems.
Dave Bigelow: Just because you change methods doesn't automatically mean the data are
not comparable. One thing you can do is ... establish a relationship
between the old method and the new method. Maybe you can. If you can,
you can decide to retroactively change it, and then you have a data set that
has integrity again over time. There are resources involved as well. If you
run a side-by-side comparison for a while when you make the transition,
and you plan for that, you can build it into your design. So there are
some options, but again they have to be planned for.
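A minimal sketch of the side-by-side comparison just described; the paired results from the transition period and the historical values are hypothetical, and a simple linear fit stands in for whatever relationship a real comparison would support.

    import numpy as np

    # Hypothetical paired results from a side-by-side run during the method change.
    old_method = np.array([1.8, 2.4, 3.1, 4.0, 5.2, 6.1])
    new_method = np.array([2.0, 2.7, 3.3, 4.4, 5.6, 6.5])

    # Fit new ~ a * old + b, then map the historical (old-method) record onto the
    # new-method scale so the long-term series keeps its integrity.
    a, b = np.polyfit(old_method, new_method, 1)
    historical_old = np.array([1.5, 2.2, 2.9, 3.8])
    historical_adjusted = a * historical_old + b
    print(f"slope {a:.3f}, intercept {b:.3f}")
    print(historical_adjusted.round(2))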
112
-------
GENERAL QUALITY ASSURANCE
Working Group #3
Facilitators:
Robert Bisson (leader) - Environment Canada
Linda Kirkland (rapporteur) - U.S. EPA, Washington
Participants:
Don King - MOEE, Laboratory Services Branch
Dr. Peter MacDonald - McMaster University
Claire MacInnis - New Brunswick Department of Environment
Deneen Sebastian - Ecologistics Ltd., Lucan, Ontario
James Bowers - Westinghouse Savannah River Laboratory
Sylvia Cussion - MOEE, Laboratory Services Branch
Maryann Bogard - MOEE, Laboratory Services Branch
Elizabeth Cline - Colorado State University
Havard Hovind - Norwegian Institute for Water Research
Dave Warry - Environment Canada
113
-------
GENERAL QUALITY ASSURANCE
Linda Kirkland
U.S. Environmental Protection Agency
We looked over the list that Don had put together and basically had no problem with
agreeing that interlaboratory and accreditation studies, performance and systems audits,
internal QC data review, project data review and assessments of DQO achievement should
all be included. However, when we discussed these, we incorporated the corrective action
and QA reports as we went along, because most of these things have to be reported and are
associated with corrective actions. When we went through each one of the items, we tried
to locate the purpose of the activity in terms of the general QA system. We tried to get a
feeling for where the responsibility lies for taking care of that activity, the nature of the
information generated and then once the information got to where it needed to go, what kind
of follow-up would be obtained. We noted that there was a definite thread going through all
of our discussions: we have to be careful in planning for QA information to make
sure it's something that will get used. Is it needed? What information is going to be in the
results of that activity? Where is it going to be communicated? What kind of follow-up is
going to be associated with it? We tried to take the information management or data
management component and look at that as the technology through which these things would
be done. We had a systematic overview of all the areas we went through.
ACCREDITATION AND INTERLABORATORY STUDIES
We noted when we were talking about accreditation studies versus interlaboratory
studies that very often the purpose for these is somewhat different, and therefore so are the
type of information and who uses it. Accreditation seemed to be very much
information that was needed for ranking labs, so that on the one end the lab knew what its
credentials were and on the other end the procurers of those labs would have some way of
making a selection. The information usually consisted of results from PE studies, usually from
fairly large ones, greater than twenty labs. So they did produce some good information for
various parameters on what kind of biases you could expect from various labs. They were very
often planned to be followed up by site visits. These studies were a little more procurement-
oriented and larger than inter-lab studies, which, in our experience, seemed to be related more
to projects, so there would be fewer labs actually involved. The actual definition of what the
project was doing would be a little tighter for an inter-lab study. The inter-lab study would
be more specific in terms of what it is we need to know here, based on, maybe, data you're
pooling or something of that nature, where you might use the data from the interlaboratory
study to evaluate your project information.
As to internal use of QA study data, there wasn't much question that data were going
to be used for continuous improvement. However, when it came to any idea of external use,
114
-------
other than ranking the labs, it was a little unclear how an external user doing the data
analysis might use that information. There should be some use for it but it has to be planned
in advance, and you would certainly want some information on the method attributes. It
would be important to have that associated with the study data. Good planning would be
needed on what information, and what associated information, fits in your program. If you are
going to have this done right, you need to centre on what range of parameters you're dealing
with, so that the QA study can give useful information, and you need to have your samples
selected to fit your needs. If possible, give some idea as to the approach of the tests. For example,
if you're wanting information on variability in analysis, you don't want the labs varying in
how they prep the samples. You would like to have that held constant so that the QA study
will tell you what you really want to know about. So just as you plan your sampling
information, you really need to plan your QA studies, too.
SYSTEM AND PERFORMANCE AUDITS
Under systems audits and performance audits, we looked into the balance of effort,
knowing that a systems audit is more in-depth. If you don't know a system, it's good
information overall. However, it is far more resource-consuming than a performance
audit, which might be more needed if you know you have trouble areas where variability
creeps in that you want to control. We can see the different audits have different uses, and
you might want to tailor your frequency to the actual needs of your program and the kind
of projects you are doing (we kept talking of data quality objectives (DQOs) for our QA
information). We recognized that either of these audits could be done with either your field
component or your lab component. We found it was most important to make sure the results
actually would get to the right people, where they would impact on the continuous
improvement of the project.
USE OF QC DATA
Internal QC data use, we found, was very important in terms of both field and
laboratory. In some cases, if you are working with a team approach, you would combine
these to make sure the lab people could tell the field people what information they might
need to know about the sample coming in, or the field people would warn the lab if they
took in their field notes some unusual observations that might be needed to do an
interpretation. We got down to talking about basic principles of... TQM, (where) we needed
buy-in by our field and lab personnel. Very often, these people would be closest to the
actual real time corrective action, and, if they weren't bought in, and they didn't know their
place in the system, they would not be able to communicate the information where it needed
to go in real time (i.e., you could flag if you needed to re-run an analysis, or if you were
having problems in a prep and you needed to use some of the remaining sample, or needed
to collect another sample). We noted that adequate training was a very important part of the
QA system. Otherwise, even if staff did collect QC information and put it in a control chart,
it might not be used for real-time corrective activity. There needs to be a real
115
-------
communication to the personnel, of their roles and responsibilities in this, i.e., the
communication links they are expected to react to and the ideas of team-building.
In order to have some idea of what the data are to be used for, and what is really
important about it, you need to have a good producer-client relationship, that is,
communication and team building between the two, so that the producer isn't a mindless
supplier ... but is actually part of a team. The nature of the final product and interim
product (which is usually a data set) needs to be understood enough to connect the field or
the lab to the project level so that there is good vertical communication as well as a
horizontal communication within the organization.
DATA REVIEW AND MANAGEMENT
Under project data review/data management, we found that information access could
be a real problem; we had to think through information access to make sure the right
stuff could get to the right place. Obviously, up-front planning for this is a critical
management task. We were concerned about resources for hardware and software needs to
make sure we could store enough of the QC information that it would be accessible at a later
date, particularly in long-term monitoring like we envision might be necessary for the Great
Lakes. We were very much concerned with fostering the use of new technology like optical discs
with the advantage that you can have your QC information attached to the data in some
relational manner. It would be retrievable in interpretation or retrospective project reviews
like DQO-type assessment. We could see a use (in two ways) of project data review. One
might be in procurement - if the activity is relying on external suppliers rather than internal
people, the review of the data packages coming in as a deliverable would be needed as a
short-term check. There would be some way that you would need to verify the validation
of that data by the supplier, some sort of check on what you're buying. We recognized that
we would need also a long-term assessment - this would be more important for monitoring
where there was no legal use involved. Very often, your own people will be doing the work.
In that case, it would be more likely you would probably have special reports evaluating the
data over several years rather than an annual report that's done for procurement-type things.
We recognized that levels of accountability needed to be shared. Again, this gets back
to letting the staff know what their roles are, how they fit in the big picture, where to
communicate, in real-time, middle-time (for an annual basis) and long-term (for a project
basis) reporting. We felt very strongly that you needed to feed the information back into the
project. Continuous improvement can't happen if that's not ongoing.
Troubleshooting could be very important, but it requires that some procedure be set
up to feed information back into the QA system (e.g., annual project review). Problems may
be highlighted for attention in the next go-around of audits to check out what the problem
is or to take care of any variability that's a problem.
116
-------
COMMENTS
Don King: One quick one that I seem to have missed in all three talks - in particular, in the
last talk where quality control seemed to be directed towards improvement.
What I'm missing is some target or objective in the quality control program
where one says that the data is so bad that we can't use it at all and let's just
start from scratch. Or at the other end that the data is good, the program is in
control, let's not fix it any more. There doesn't seem to be any definition of
who sets the objectives for data quality - where is it in the process?
Kirkland: Well, when I mentioned data validation by the supplier, they would do their own
check against the specs they provided in procurement - that's one level of project
review. At the client's end there would be a verification (maybe not 100%) to
make sure the specs are satisfied. Obviously, if the work is internal, you're
going to have objectives worked into your performance control charts and how
you do your normal business inside. There would be a difference if it's internal
or external but the same kind of project review or data review would have to be
fairly routine. The criteria, of course, would be in the Project Plan.
117
-------
LABORATORY OPERATIONS QA/QC
Working Group #4
Facilitators:
George Crawford (leader) - MOE, Pollution Prevention Office
Dr. Craig Palmer (rapporteur) - University of Nevada
Participants:
Rob Cuello - Battelle
Jane Sexton - PTI Environmental Services
James B. Stribling (Sam) - Tetra Tech Inc.
Rainer Lichtenthaler - Norwegian Institute for Water Research
Daniel Andrews - Acres International
Anna Del Carlo - MOEE
George Steinke - MOEE, Laboratory Services Branch
Don Wismer - Ontario Hydro
Guat-Peng Lee - Environment Canada, WQB
Sathi Selliah - MOEE, LSB, Quality Management Office
Pat Baulu - MOEE, LSB, MISA Coordinator
Keijo Aspila - Environment Canada, NWRI
Mark Gordon - MOEE, Water Resources Branch
Maggie Wilson - Tetra Tech Inc.
Bill McIlveen - MOEE, ARB
119
-------
LABORATORY OPERATIONS AND QA/QC
Craig Palmer
University of Nevada
Just a general comment ... our group, in terms of the workshop, had an interesting
observation. There was some concern that all of this work that we are putting into producing
this utility document (guidance document) is meant only for the Great Lakes. We'll make sure
that Mike Papp and Alfie Chau use it. The utility document is somewhat limited to the
people here that helped develop it. There is some concern that a lot of groups that could
make good use of the document may not see it. The question arose as to how we could get
the word out (about the document) once it was done? One way possibly is to have our next
quality assurance workshop broaden our invitation list to some folks who haven't been
invited in the past. A lot of people here have found out ... through word-of-mouth or past
participation. A suggestion was made to try to broaden the mailing list, possibly by putting
some notices in some ecological journals. We went back and forth on that a little bit
because we've tried to keep the workshops about this size so you can accomplish things.
However, there is a desire to spread it out a bit. The other thing is to see if we couldn't
produce a paper - someone might take that on. There are some journals that publish quality
assurance papers.
There is a real need to integrate the field operations with the laboratory operations.
All the planning described under management is very critical to the success of the operations
and the overall program. In particular, there is the ability to partition error - the importance of
laboratory measurement error as opposed to field measurement error - and we need a feel for
how each of those contributes.
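One conventional way to get that feel is a nested design of field duplicates and laboratory splits; the sketch below uses simulated numbers under an assumed two-by-two nesting and is illustrative only, not a description of any program's actual design.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed nested design: at each site, two co-located field samples, each split
    # and analysed twice in the lab (true field sigma = 3, lab sigma = 1 in the simulation).
    n_sites, n_field, n_lab = 8, 2, 2
    site_means = rng.normal(50.0, 10.0, n_sites)
    field_effect = rng.normal(0.0, 3.0, (n_sites, n_field))
    lab_noise = rng.normal(0.0, 1.0, (n_sites, n_field, n_lab))
    values = site_means[:, None, None] + field_effect[:, :, None] + lab_noise

    # Lab (analytical) variance: spread between lab splits of the same field sample.
    lab_var = values.var(axis=2, ddof=1).mean()
    # Field-duplicate variance includes a share of the lab variance; subtract it out.
    dup_var = values.mean(axis=2).var(axis=1, ddof=1).mean()
    field_var = max(dup_var - lab_var / n_lab, 0.0)
    print(f"estimated lab variance ~ {lab_var:.2f}, field sampling variance ~ {field_var:.2f}")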
As was mentioned, our group tackled laboratory operations and QA/QC. What we
did really, first, was to brainstorm - what are some of the things that are not in Don's Table
1, things that we would like to add to the outline? We came up with:
We should probably say more about how to document the procedures;
More about sample handling (as it comes to the lab, in the lab) and more about
analysis;
How we're going to handle reference materials and standards, and documenting
those;
How we're going to report our data.
We decided to try to summarize these points into five categories, which might help
to organize the section on laboratory operations.
120
-------
LABORATORY FACILITIES AND MANAGEMENT
First of all, we felt that in the Plan there should be some discussion of the nature of
the facilities required from a physical standpoint (some of those things are there, such as
layout; contamination control is particularly important), and then of how the laboratory itself
is organized - in particular, who is going to handle quality assurance in the lab and who is
accountable for the data. Health and safety is also an important issue.
ANALYTICAL PROCEDURES
The third area that we suggest talking about is the analytical procedures and the
methods. We thought it would be worthwhile to put in a little more detail, a kind of
annotated information: the procedures should be verified and they should have
standard operating procedures. If possible, reference where the procedure came from in the
literature. We kept coming back to reference materials and the importance of reference
materials in analytical procedures, and to having some good method performance objectives.
There should be some sort of inventory of standards, and these standards should be
traceable back to international standards, if possible.
Bench Level
We had a separate section which we called ... analytical procedures at the bench
level. These are general laboratory QC issues about how to handle samples - when they
arrive, do you have some criteria for refusing to handle samples? Sometimes, that's
important. Is there any treatment of samples that needs to be taken care of when they arrive
at the lab? And then, there is verifying your analytical procedures - having a real-time
assessment of how they are performing.
Bench QA/QC
Continuing in that area, we thought you should expand a bit on quality control and
quality assurance of these analytical procedures at the bench. And specify who is going to
take a look at the quality assurance data. Some groups have what is called a Quality
Management Office, others just call it Quality Assurance specialists, to verify the data. But
there need to be levels of looking at the data; internal review is good, but you also need some sort
of independent, external verification. On occasion, you need external experts to come in and
help make sure that everything is working.
We also tried to look at laboratories, not only for chemical analyses but for biological
analyses (such as analysis of benthic samples or root disease). It's very important in your
analytical procedures to give some thought to standard reference materials or tools, so that
you have some standards to which your data can be related for accuracy assessment.
121
-------
It's also important, as you verify your data, that you have some verification checks
(ion balances are an example). For all of your data, try to build in some way of
checking the data for its logical attributes, either chemical or historical, against past
data sets you can compare with.
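A minimal version of the ion-balance check mentioned above is sketched here; the major-ion values (already converted to meq/L) are hypothetical, and the plus-or-minus five percent acceptance threshold is an illustrative program choice rather than a workshop recommendation.

    def ion_balance_error(cations_meq, anions_meq):
        # Percent charge-balance error from summed cation and anion
        # concentrations, both expressed in meq/L.
        total_cat = sum(cations_meq.values())
        total_an = sum(anions_meq.values())
        return 100.0 * (total_cat - total_an) / (total_cat + total_an)

    # Hypothetical sample with major ions already converted to meq/L.
    cations = {"Ca": 1.90, "Mg": 0.72, "Na": 0.43, "K": 0.05}
    anions = {"HCO3": 2.05, "SO4": 0.62, "Cl": 0.40}

    error = ion_balance_error(cations, anions)
    print(f"charge-balance error: {error:.1f} %  (flag: {abs(error) > 5.0})")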
SAMPLE HANDLING
Those are analytical procedures. We suggest you might want to add a section on
sample logistics in the lab. How you handle the samples when they come in, what sort of
chain-of-custody they have, what tracking mechanism you use so you don't get your samples
mixed up (mislabelled, or whatever). On the other end, what you are going to do with the
result. Also, in your planning you need to give some thought to safety. Some of these
samples may or may not be toxic and you may treat them with toxic chemicals in extracting
them. It was also felt that, for the lab to do a good job, they needed some feedback from
the field data ... being able to know what's happening with the field data that's been
collected. When you're done with these samples, you need to give some thought as to what
to do with them (archiving).
DATA MANAGEMENT AND REPORTING
We felt it was important to have some consistency in our format (for data
management and reporting), that we have an ability to flag data that was felt to be
questionable and to provide some statements to the data users about quality of data. A lot
of this information comes in terms of blinds and double blinds. These are important for
assessment of data quality. Other samples are also important for quality control. In
verification, we started giving thought to the information management system and the problems
that can happen within it. By the time we were done, we
decided to tackle a whole separate section on information management systems as opposed
to laboratory operations ... because, when we got into it, we realized there were a number
of important components to quality assurance in the lab and in the field where real-time
turnaround really makes a difference to the quality of the data.
Laboratory information management systems are very important and should be considered
as an approach to improving the quality of data and finding data errors with rapid turnaround.
COMMENTS
Dave Bigelow: From my experience, I've been involved in a project that involved field
studies and lab studies and after several decades, it was up to me to try to
integrate everything. It would have been useful if, at the beginning, there
was a conscious decision to choose among field and lab studies in terms
of the amount of error you could live with. It's a tradeoff - in a lab study,
122
-------
you've got more control over treatments, you've got randomization, but
you lose environmental realism. In a field study, it's the other way -
you've got more environmental realism but you lose control.
Ultimately, somebody like me, who gets all this stuff after several decades
has to try to total up the errors that have accumulated from field and lab
(studies). I was worried looking at Table 1. The assumption was we're
dealing with samples people collected in the field then sent in to the lab
for analysis. We haven't thought of projects where you may have 15
different studies at once - some are field and some are lab, but they all
contribute to some central theme.
Todd Howell: We actually did think along those lines in our discussion. Some projects
were obviously lab, others more field but we did recognize a whole class
of projects that were fusions of many types of operations. We thought that
the best strategy to deal with these was the clear definition of the
objectives and the ways in which you were going to bring your information
to bear on the objectives.
Craig Palmer: We considered that in the laboratory, we couldn't make up for things that
got messed up in the field and we had some fun times telling horror stories
about samples that were collected inappropriately in the field. We felt that
QA plans should give attention to audits and verifications and performance
checks in the field, as well as the lab. We should also make sure our
sampling equipment is nicely specified.
123
-------
APPENDIX A
Workshop Participant Addresses
-------
FIFTH ECOLOGICAL QUALITY ASSURANCE WORKSHOP
ATTENDANCE LIST
CANADA
Dr. Martine Allard
A/Science Liaison Officer
Research & Application Branch
National Water Research Institute
Environment Canada
867 Lakeshore Rd.
P.O. Box 5050
Burlington, Ontario
L7R 4A6
Phone (416) 336-4877
FAX (416) 336-4989
Daniel Andrews
Acres International
480 University Ave., 13th Floor
Toronto, Ontario
M5G 1V2
Phone (416) 595-2000
FAX (416) 595-2004
Nabil Arafat
National Water Research Institute
Research & Application Branch/CCIW
Environment Canada
867 Lakeshore Rd.
Burlington, Ontario
L7R4A6
Phone (416) 336-4645
FAX (416) 336-4989
Keijo Aspila
National Water Research Institute
Environment Canada
867 Lakeshore Rd.
Burlington, Ontario
Phone (416) 336-4645
FAX (416)336-4989
Robert W. Bilyea
Ministry of the Environment
Water Resources Branch
Great Lakes section
1 St. Clair Ave. West, 6th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4947
FAX (416) 323-4973
Dr. Robert Bisson
Science Liaison Officer
National Water Research Institute
Environment Canada
867 Lakeshore Road
P.O.Box 5050
Burlington, Ontario
L7R4A6
Phone (416) 336-6441
FAX (416)336-4989
Mary Ann Bogard
Laboratory Services Branch
Ministry of the Environment
125 Resources Road
P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-5761
FAX (416)235-6110
Duncan Boyd
Water Resources Branch
Ministry of the Environment
1 St. Clair Ave. West, 6th Floor
Toronto, Ontario
M4V 1P5
Phone (416) 323-4984
FAX (416)323-4973
-------
ATTENDANCE LIST (CANADA) (Cont'd)
Patricia Baulu
Laboratory Services Branch
Ministry of the Environment
125 Resources Rd.
P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 323-2770
FAX (416)323-2785
Sharon Brown
Water Resources Branch
Great Lakes Section
1 St. Clair Ave. West, 6th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4955
FAX (416) 323-4973
Alfred S.Y. Chau
Quality Assurance
National Water Research Institute
Canada Centre for Inland Waters
Environment Canada
867 Lakeshore Rd.
Burlington, Ontario
L7R4A6
Phone (416) 336-4653
FAX (416) 336-4989
Anna Del Carlo
Ministry of the Environment
1 St. Clair Ave. West, 3rd Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-2787
FAX (416) 965-9807
Sylvia Cussion
Laboratory Services Branch
Ministry of the Environment
125 Resources Rd., P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-5842
FAX (416)235-6110
Gordon Craig
Director, Bio-testing Services
Beak Consultants Limited
14 Abacus Road
Brampton, Ontario
L6T5B7
Phone (416) 794-2325 EXT 226
FAX (416) 794-2338
George Crawford
Laboratory Services Branch
Ministry of the Environment
125 Resources Road
P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 323-5075
FAX (416)323-5166
A.L. DaCosta
Imperial Oil Limited
Sarnia Refinery
P.O. Box 3004
Sarnia, Ontario
N7T 7M5
Phone (519) 339-5657
FAX (519)339-5945
Dr. Sushil Dixit
Queen's University
Department of Biology
Kingston, Ontario
K7L 3N6
Phone (613) 545-6159
FAX (613) 545-6617
A.H. El-Shaarawi
National Water Research Institute
Canada Centre for Inland Waters
Environment Canada
867 Lakeshore Rd.
Burlington, Ontario
L7R4A6
Phone (416) 336-4584
FAX (416) 336-4989
-------
ATTENDANCE LIST (CANADA) (Cont'd)
Gilles Gagnon
Direction de la Recherche
Ministere des Forets
2700 Rue Einstein
Sainte-Foy, Quebec
GlP 3W8
Phone (418) 643-7994
FAX (418) 643-2165
Beverly Genest-Conway
Quality Assurance
National Laboratory for Environmental Testing
Environment Canada
P.O.Box 5050
Burlington, Ontario
L7R4A6
Phone (416) 336-4761
FAX (416) 336-6404
Mark Gordon
Water Resources Branch
Watershed Mgmt. Section
Ministry of the Environment
1 St. Clair Ave.West, 7th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4817
FAX (416)323-4823
Dr. Don Hart
Beak Consultants Limited
14 Abacus Rd.
Brampton, Ontario
L6T5B7
Phone (416) 794-2325
FAX (416) 794-2338
Larry King
Acres International
5259 Dorchester Road, P.O. Box 1001
Niagara Falls, Ontario
L2E 6W1
Phone (416) 374-5200
Todd Howell
Water Resources Branch
Great Lakes Section
Ministry of the Environment
1 St Clair Ave. West, 6th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4801
FAX (416) 323-4973
Dr. Peter Kauss
Water Resources Branch
Ministry of the Environment
1 St. Clair Ave. West, 6th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4952
FAX (416) 323-4973
Dr. Gail Krantzberg
Water Resources Branch
Great Lakes Section
Ministry of the Environment
1 St. Clair Ave. West, 6th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4956
FAX (416)323-4973
Kenneth W. Kuntz
Water Quality Branch
Environment Canada
867 Lakeshore Rd., P.O. Box 5050
Burlington, Ontario
L7R 4A6
Phone (416) 336-4965
FAX (416)336-4609
Don King
Laboratory Services Branch
QA/QC Section
Ministry of the Environment
125 Resources Rd., P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-5838
FAX (416)235-6110
-------
ATTENDANCE LIST (CANADA) (Cont'd)
Emery Law
Water Resources Branch
Great Lakes Section
Ministry of the Environment
1 St. Clair Ave. W., 6th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4959
FAX (416)965-9807
Dr. John Lawrence
Director
Research and Applications Branch
National Water Research Institute
Environment Canada
P.O.Box 5050, 867 Lakeshore Rd.
Burlington, Ontario
L7R4A6
Phone (416) 336-4927
FAX (416)336-4989
Serge L'Italien
Environment Canada -WQB
867 Lakeshore Rd., P.O. Box 5050
Burlington, Ontario
L7R4A6
Phone (416) 336-4960
FAX (416)336-4609
Guat-Peng Lee
Water Quality Branch
Environment Canada
11 Innovation Blvd.
Saskatoon, Sask. S7N 3H5
Phone (306) 975-5389
FAX (306)975-5143
Dr. Wan Li
Environment Canada
National Water Research Institute
867 Lakeshore Rd.
Burlington, Ontario
L7R4A6
Phone (416) 336-4926
FAX (416) 336-4989
Dr. Peter Macdonald
McMaster University
Dept. of Math & Statistics
1280 Main St. W.
Hamilton, Ontario
L8S4K1
Phone (416) 529-7070 EXT. 3423
FAX (416) 522-0935
Claire Maclnnis
QA Officer
New Brunswick Dept. of Environment
12 McGloin St.
Fredericton, New Brunswick E3A 5T8
Phone (506) 453-2477
FAX (506) 453-3269
Anne Neary
Ministry of the Environment
Dorset Research Centre, P.O. Box 39
Dorset, Ontario
P0A 1E0
Phone (705) 766-2412
FAX (705) 766-2254
Steven Nemeth
General Motors of Canada Limited
570 Glendale Ave.
St. Catharines, Ontario
L2R7B3
Phone (416) 641-6375
FAX (416) 641-6361
Dan Orr
Air Resources Branch
Ministry of the Environment
125 Resources Road, P.O. Box 213
Rexdale, Ontario M9W 5L1
Phone (416) 235-6177
FAX (416)235-6235
Francis Philbert
Canada Centre for Inland Waters
Environment Canada
867 Lakeshore Road, P.O. Box 5050
Burlington, Ontario L7R 4A6
Phone (416) 336-4663
FAX (416)336-4609
-------
ATTENDANCE LIST (CANADA) (Cont'd)
Karen Ralph
DFO/Great Lakes Lab.for Fisheries
and Aquatic Sciences
Bayfield Institute
867 Lakeshore Road
Burlington, Ontario
L7R4A6
Phone (416) 336-6425
FAX (416)336-6437
Lisa Richman
Ministry of the Environment
Water Resources Branch
Great Lakes Section
1 St. Clair Ave. W., 6th Floor
M4V 1K6
Phone (416)323-4953
FAX (416) 323-4073
Jane Sadler Richards
Ecologistics Limited
Box 399
Lucan, Ontario
NOM 2J0
Phone (519) 293-3251
FAX (519)293-3319
Deneen Sebastian
Ecologistics Limited
Box 399
Lucan, Ontario
NOM 2J0
Phone (519) 293-3251
FAX (519)293-3319
Jega Selliah
Ministry of the Environment
Water Resources Branch
1 St. Clair Ave. W., 3rd Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4862
FAX (416) 965-9807
Dr. A.K. Sharma
Water Resources Branch
Ministry of the Environment
1 St Clair Ave. West, 6th Floor
Toronto, Ontario
M4V 1K6
Phone (416) 323-4891
FAX (416)965-9807
Dr. Harvey Shear
Great Lakes Environment Office
Conservation & Protection - Ontario Region
25 St Clair Ave. East, 6th Floor
Toronto, Ontario
M4T1M2
Phone (416) 973-1104
FAX (416) 973-7438
Michael Shackleton
Air Resources Branch
Ministry of the Environment
125 Resources Rd., P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-6163
FAX (416)235-6235
George Steinke
Laboratory Services Branch
QA/QC Section
Ministry of the Environment
125 Resources Rd., P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-5758
FAX (416)235-6110
Jane Tims
New Brunswick Dept. of the Environment
P.O. Box 6000
Fredericton, New Brunswick
E3B5H1
Phone (506) 457-4846
FAX (506) 457-7823
-------
ATTENDANCE LIST (CANADA) (Cont'd)
Don Wismer
Ontario Hydro
Nuclear Safety & Licensing
Kipling Ave.
Toronto, Ontario
M5G 1X6
Phone (416) 592-8596
FAX (416)978-0156
Jean-Guy Zakrevsky
Water Quality Branch
Western and Northern Region Inland
Environment Canada
Room 300, Park Plaza
2365 Albert Street
Regina, Sask.
S4P 4K1
Phone (306) 780-6423
FAX (306)780-5311
David Rokosh
Water Resources Branch
Ministry of the Environment
125 Resources Rd., P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-5967
FAX (416)235-6091
Sathi Selliah
Laboratory Services Branch
Ministry of the Environment
125 Resources Road, P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-5700
FAX (416)235-6110
Dr. W.D. Mcllveen
Ministry of Environment and Energy
Air Resources Branch
Phytotoxicology Section
880 Bay Street
Toronto, Ontario
M5S 1Z8
-------
ATTENDANCE LIST (U.S.A.)
Charles J. Barnett
USDA Forest Service
5 Radnor Corporate Ctr, Suite 200
100 Matsonford Rd.
Radnor, PA 19087
Phone (215) 975-4038
David S. Bigelow
National Atmospheric Deposition Program
Colorado State University
Fort Collins, CO 80523
Phone (303) 491-5574
FAX (303) 491-1965
James A. Bowers
Westinghouse Savannah River Lab
Building 773-42 AC
126 Riviera Rd.
Aiken, SC 29803
Phone (803) 725-5213
FAX (803) 725-4704
Bill Burkman
USDA Forest Service
5 Radnor Corporate Ctr, Suite 200
100 Matsonford Rd.
Radnor, PA 19087
Phone (215) 975-4037
FAX (215) 975-4200
Elizabeth Whippo Cline
CSU - BLM Global Change Data Center
Natural Resource Ecology Laboratory
Colorado State University
Fort Collins, CO 80523
Phone (303) 491-5643
FAX (303)491-1965
Rob Cuello
Battelle
Pacific Northwest Division
Marine Sciences Laboratory
439 Sequim Bay Road
Sequim, WA 98382
Phone (206) 681-3645
FAX (206)681-3699
James Giattina
US Environmental Protection Agency /GLNPO
230 South Dearborn St.
Chicago, IL 60604
Phone (312) 886-4063
FAX (312) 353-2018
Steven Hedtke
US Environmental Protection Agency
6201 Congdon Blvd.
Duluth, MN 55804
Phone (218) 720-5610
FAX (218)720-5539
Joseph B. Hunn
NFCRC - US Fish & Wildlife Services
4200 New Haven Road
Columbia, MO 65201
Phone (314) 875-5399
FAX (314) 876-1896
Dr. Linda Kirkland
EPA-EMAP
ORD RD 680, 401 M Street S.W.
Washington, DC 20460
Phone (202) 260-5775
FAX (202)260-4346
Rhonda J. Mendel
Lotic Inc.
P.O. Box 279
Unity, ME 04988
Phone (207) 948-3062
Robert A. Mickler
Southern Global Change Program
1509 Varsity Drive
Raleigh, NC 27606
Phone (919) 515-3311
FAX (919)515-3593
-------
ATTENDANCE LIST (U.S.A.) (Cont'd)
Dr. Craig J. Palmer
USEPA and University of Nevada
USEPA Environmental Monitoring Systems Lab.
P.O. Box 93478
Las Vegas, NV 89193-3478
Phone (702) 798-2186
FAX (702) 798-2454
James B. Stribling
Tetra Tech Inc.
10045 Red Run Blvd., Ste 110
Owings Mills, MD 21117
Phone (410) 356-8993
FAX (410) 356-9005
Dr. Mike Papp
QA Officer
USEPA/GLNPO
230 South Dearborn St.
Chicago, IL 60604
Phone (312) 886-4063
FAX (312)353-2018
Maggie Wilson
Tetra Tech Inc.
10306 Eaton Place, Suite 340
Fairfax, VA 22030
Phone (703) 385-6000
FAX (703) 385-6007
George Schupp
US EPA - Region 5, Quality Assurance Section
77 W. Jackson Blvd.
Chicago, IL 60604
Mail Code 5Q-14J
Phone (312) 886-6221
FAX (312) 353-4342
Jane E. Sexton
PTI Environmental Services
15375 SE 30th Place, Suite 250
Bellevue, WA 98007
Phone (206) 643-9803
FAX (206)643-9827
ATTENDANCE LIST (EUROPEAN)
Havard Hovind
Norwegian Institute for Water Research
P.O. Box 69
Korsvoll 0808
OSLO Norway
Phone (47) 223-5280
FAX (47)295-2189
Rainer Lichtenthaler
Norwegian Institute for Water Research
P.O. Box 69
Korsvoll 0808
OSLO, Norway
Phone (47) 223-5280
FAX (47)295-2189
-------
APPENDIX B
Organizing Committee Addresses
-------
FIFTH ECOLOGICAL QUALITY ASSURANCE WORKSHOP
ORGANIZING COMMITTEE
Dr. Robert Bisson
Science Liaison Officer
National Water Research Institute
Environment Canada
867 Lakeshore Road
P.O. Box 5050
Burlington, Ontario
L7R 4A6
Phone (416) 336-6441
FAX (416) 336-4989
George Crawford
Laboratory Services Branch
Ministry of the Environment
125 Resources Road
P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 323-5075
FAX (416) 323-5166
Robert Graves
United States Environmental
Protection Agency
26 W. Martin Luther King Dr.
Cincinnati, OH 45268-1525
Phone (513) 569-7197
FAX (513) 569-7115
Steven Hedtke
United States Environmental
Protection Agency
6201 Congdon Blvd.
Duluth, MN 55804
Phone (218) 720-5610
FAX (218) 720-5539
Dr. Linda Kirkland
EPA-EMAP
ORD-RD 680, 401 M Street S.W.
Washington, DC 20460
Phone (202) 260-5775
FAX (202) 260-4346
Dr. James Lazorchak
United States Environmental
Protection Agency
3411 Church Street
Newtown Facility
Cincinnati, OH 45244
Dan Orr
Air Resources Branch
Ministry of the Environment
125 Resources Road
P.O. Box 213
Rexdale, Ontario
M9W 5L1
Phone (416) 235-6177
FAX (416) 235-6235
Dr. Craig Palmer
USEPA and University of Nevada
USEPA Environmental Monitoring
Systems Lab
P.O. Box 93478
Las Vegas, NV 89193-3478
Phone (702) 798-2186
FAX (702) 798-2454
Mike Papp
QA Officer
USEPA/GLNPO
230 South Dearborn St.
Chicago, IL 60605
Phone (312) 886-4063
FAX (312) 353-2018
-------
APPENDIX C
A Generic QA Program Plan
for
Great Lakes Ecological Studies
-------
FIFTH ECOLOGICAL QUALITY ASSURANCE WORKSHOP
October 14-15,1992, Toronto, Ontario, Canada
Draft Q.A.P.P.
"Towards a Generic Quality Assurance Program Plan"
Sponsored by:
Environment Canada
U.S. Environmental Protection Agency
Ontario Ministry of Environment & Energy
-------
A GENERIC QA PROGRAM PLAN FOR
GREAT LAKES ECOLOGICAL STUDIES
by:
Donald R. Hart
Beak Consultants Limited
Brampton, Ontario
With Contributions from Participants of the:
Fifth Annual
Ecological Quality Assurance Workshop
October 14-15, 1992
Sponsored by:
United States Environmental Protection Agency
Environmental Monitoring Systems Laboratory
Las Vegas, Nevada, U.S.A.
Environmental Research Laboratory
Duluth, Minnesota, U.S.A.
Environment Canada
National Water Research Institute
Burlington, Ontario, Canada
Great Lakes Environment Office
Toronto, Ontario, Canada
Ontario Ministry of Environment and Energy
Laboratory Services Branch
Toronto, Ontario, Canada
-------
LIST OF ACRONYMS
ARB
Air Resource Branch (MOEE)
ARCS
Assessment and Remediation of Contaminated Sediments
ASTM
American Society for Testing and Materials
CAEAL
Canadian Association of Environmental Analytical Laboratories
DFO
Department of Fisheries and Oceans (Canada)
DQO
Data Quality Objective
EC
Environment Canada
EMAP
Environmental Monitoring and Assessment Program
EPA
Environmental Protection Agency
GLISP
Great Lakes International Surveillance Plan
IADN
Integrated Atmospheric Deposition Network
IJC
International Joint Commission
LC50
Lethal Concentration (50% mortality)
MDL
Method Detection Limit
MOE
Ministry of the Environment (Ontario)
MOEE
Ministry of the Environment and Energy
MQO
Measurement Quality Objective
NRTC
Niagara River Toxics Committee
NWRI
National Water Research Institute (Canada)
QA
Quality Assurance
QAMS
Quality Assurance Management Staff (EPA)
QAPP
Quality Assurance Program Plan
QAMP
Quality Assurance Management Plan
QAARW
Quality Assurance Annual Report and Work Plan
QC
Quality Control
RSD
Relative Standard Deviation
RMDL
Reference Method Detection Limit
S
Standard Deviation
SOP
Standard Operating Procedure
TQM
Total Quality Management
iii
-------
TABLE OF CONTENTS
Page
FOREWORD vii
INTRODUCTION 1
RETROSPECTIVE 2
Past Situation 2
Present Situation 6
SUGGESTED FRAMEWORK FOR A QUALITY ASSURANCE PLAN 8
Quality Management 8
Program (Project) Description 8
Program (Project) Data Quality Objectives 10
Program (Project) Organization 13
Field Operations and QA/QC 15
Laboratory Operations and QA/QC 17
General Quality Assurance 18
SUMMARY 21
APPENDIX 1: GENERIC QUALITY ASSURANCE PLANNING CHECK LIST
APPENDIX 2: REFERENCES
APPENDIX 3: EXAMPLES
v
-------
FOREWORD
The Fifth Ecological Quality Assurance Workshop was conceived as a crucible for
the creation of a generic quality assurance program plan (QAPP). The workshop offered its 75
participants both plenary and working group forums to kindle discussion and draw out their
ideas. This proceedings document captures the details of the workshop.
It was the hope of the coordinating committee members that this workshop would
yield a practical template - a focus - for the development of more comprehensive quality
management plans by workshop participants, scientists, project managers and administrators
responsible for local and bi-national Great Lakes environmental programs.
A Generic QA Program Plan for Great Lakes Ecological Studies (Generic QAPP) is
the template proposed for consideration in environmental program design. The template
provides: guidance in 4 fundamental areas of program design; insights into 7 activities within
the program design; questions for 47 specific issues related to program implementation;
and more than 30 references that provide advice for the answers. This Generic QAPP is a
first step toward improved environmental program quality design and bi-national program
harmonization. It is an introductory step. Implementation, tracking, discussion, addition and
modification are the keys to its continuous improvement.
On behalf of the coordinating committee I encourage you to use this Generic QAPP
during your next program design, and tell us of your successes and suggestions.
George Crawford
Ontario Ministry of Environment and Energy
October, 1993
vii
-------
1
INTRODUCTION
This guidance document has evolved from a paper presented at the Fifth Ecological
Quality Assurance Workshop, held on 14-15 October 1992 at Queen's Park, Toronto,
Ontario, Canada. The purpose of this workshop was to provide a forum for interdisciplinary
exchange of ideas and resolution of issues associated with quality assurance in ecological
studies on the Great Lakes, and specifically to develop a Generic QA Program Plan. The
purpose of the generic plan is to provide a reference framework and elements of guidance
for the design and implementation of integrated QA programs by federal, state and provincial
participants in Great Lakes monitoring.
Four workgroups reviewed a draft version of the paper, respectively addressing:
program management issues; field operations/QA/QC; laboratory operations/QA/QC; and
general quality assurance. Suggestions from the workgroups were incorporated into this
revised guidance document.
The Organizing Committee had requested that the paper propose a generic framework for quality
assurance planning and documentation related to Great Lakes monitoring, based on a
retrospective review of Great Lakes programs. This charge arose, in part, from a widely
recognized need for integration of monitoring activities under the Great Lakes Water Quality
Agreement. The Parties to this agreement are committed to develop Lakewide Management
Plans. Clearly, monitoring activities should be designed to service the data uses identified
in these plans. This means planning for data of sufficient quality to support these uses.
This generic plan offers guidance on structure and content, but does not put forward
a definitive quality assurance program plan. That is a task for the Parties to complete, and
individuals have been assigned to coordinate this effort on both sides of the border. At
lakewide or basinwide levels, the task may have to await better definition of integrated data
uses. The generic plan offers a template to be followed in preparing a QA plan, and
contributes to the identification of problematic issues that can then be resolved as the QA
plan evolves.
A brief aside on levels of QA planning may be appropriate here. In the U.S., three
levels of planning have been recognized: quality management plans which focus mainly on
process, often for larger integrated monitoring programs (EPA, 1992a); QA program plans
which cover both process and QA method for an integrated program (EPA, 1987); and QA
project plans which focus mainly on methods for sampling, analysis and QA/QC specific to
an individual project (EPA, 1980).
In Canada, while QA elements are addressed in a project or program, there is no
standardized approach to producing a QA plan. Also, the distinction between levels of QA
planning is generally less rigid. This document is focused on the program level of the U.S.
model, but also highlights issues that might normally be the subject of a quality management
plan or a QA project plan. Notwithstanding the use of this hybrid model, the need for
critical issues to be addressed in QA planning is more important than the packaging of
material in one, two or three different documents.
-------
2
The framework of this guidance document, and the framework suggested for QA
plans, is outlined in Table 1. Many readers may recognize 'points of light' from the EPA
guidance on preparation of QA project plans. As a result of the Workshop some additional
headings have been added and these have been reorganized into four main sections. The aim
is for flexible guidance that will be useful at multiple levels. The user can direct the
emphasis where it belongs. For example, a program plan need not elaborate on field and
laboratory operations when these are covered in detail in separate project plans.
RETROSPECTIVE
Past Situation
A previous review of quality assurance programs associated with Great Lakes
monitoring activities (BEAK, 1988) included 36 different projects at federal and state or
provincial levels. The key findings are listed in Table 2, and each is discussed briefly
below.
QA Plans. One of the first problems encountered in 1986 was the general lack of
binational consistency in QA plans. For EPA and state programs, it was possible to start by
reviewing a QA plan of some sort and follow-up with additional questions as needed. For
Environment Canada and Ontario MOEE programs, it was necessary to extract QA
information from sampling or analytical protocols, data reports and questionnaire responses.
Quality assurance for biological measurements was poorly developed in both countries.
Data Uses. The IJC Data Quality Workgroup was reviewing round-robin data from
chemistry laboratories and finding statistically significant interlaboratory discrepancies
(Rathke and McCrea, 1987). However, the program significance was always unclear, for
various reasons, the main one being lack of pre-defined data uses, particularly by the IJC and
its workgroups. Thus, data comparability was often a contentious issue, because the desired
comparisons and/or the degree of discrimination required were seldom defined as program
goals.
For example, tributary laboratories tended to be biased high relative to 'open lake'
laboratories (by 2 to 3 µg/L for phosphorus), and on-board analyses differed systematically
between ships (by 2 to 4 µg/L for phosphorus). These differences were equivalent to the
estimated change in lake mean concentrations over a five- to seven-year period, but there
was no consensus as to the acceptability of such bias. For metals and nutrients in a water
matrix, 20 to 40% of laboratory results in round-robins were flagged as statistical outliers
from the group median. For round-robins on organics in water, median values were well
below target values, suggesting among other problems, systematic difficulties in maintenance
of calibration standards.
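To illustrate the kind of screening involved (an editorial sketch, not the Data Quality Workgroup's actual procedure; the laboratory results below are hypothetical), one common way to flag a result that departs markedly from the group median is a robust z-score based on the median absolute deviation:

    from statistics import median

    # Hypothetical round-robin results for one test sample.
    results = {"lab 1": 9.8, "lab 2": 10.1, "lab 3": 10.0,
               "lab 4": 13.5, "lab 5": 9.9, "lab 6": 10.3}

    med = median(results.values())
    mad = median(abs(v - med) for v in results.values())

    for lab, value in results.items():
        # Scale the MAD so it behaves like a standard deviation for normal data.
        robust_z = (value - med) / (1.4826 * mad)
        flag = "outlier" if abs(robust_z) > 3 else "ok"
        print(f"{lab}: {value} ({flag})")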
-------
3
TABLE 1: GENERIC FRAMEWORK FOR A QUALITY ASSURANCE PLAN
Quality Management (in QA Program and Project Plans)
Program (Project) Description (overview)
Main and Related Projects, Relationships between Projects
Data Collection Activities, Basic Design Approaches
Objectives and Questions, Data Uses to Address Questions, Decision Framework
Program (Project) Data Quality Objectives
Power to Answer Questions with Specified Resolution
Sensitivity, Precision, Accuracy, Comparability, Compatibility
Completeness, Representativeness, DQO and Related Assessments
Program (Project) Organization
Responsibilities and Authorities, Reporting Channels and Communications
QA Policy Statement, Resource Allocations and Strategies
Document Control (for QMP, QAPP, SOPs, QA reports, data files)
Procurement Process, Staff Training, Evaluation and Safety
Field Operations and QA/QC (in QA Project Plan)
Sampling Design (based on objectives and previous data)
Adequate Replication, Time and Area Selection, Unbiased within Areas
Sampling Procedures (may reference SOPs)
Equipment and Instrument Maintenance and Calibration
Sample Collection, Preservation, Custody, Storage, Records
In-field Quality Control and Data Validation
Laboratory Operations and QA/QC (in QA Project Plan)
Laboratory Facilities, Layout, Reagent and Climate Control, Management
Analytical Procedures (may reference SOPs)
Equipment and Instrument Maintenance and Calibration
Sample Receiving, Storage, Preparation, Analysis, Records
Internal Quality Control and Data Validation
Quality Assurance (in QA Program and Project Plans)
Interlaboratory and Accreditation Studies (participation, follow-up)
Performance and System Audits (schedule, checkpoints, criteria)
Internal QC Data Review (schedule, checkpoints, criteria)
Project Data Review (reduction, validation, reporting)
Corrective Actions (problems, actions, effectiveness)
Assessment of DQO Achievement (recommendations)
QA Reports (schedule, content, include with QAPP revision)
-------
4
TABLE 2: PAST REVIEW OF GREAT LAKES PROGRAMS 1986-88
(BEAK, 1988)
poor documentation of QA programs - especially for biological measurements.
round-robins detecting interlaboratory bias - significance to programs unclear
no systematic approach to bias estimation - no statement of comparability or compatibility
goals
analytical DQOs were stated for sensitivity, precision, accuracy
definitions and uses of these performance measures were inconsistent or lacking
no goals defined for total uncertainty or power to answer questions
questions and data uses too vague to permit definition of such goals
study designs were stated but not linked to goals, objectives or uses
projects and objectives diverse - no integrated lakewide objectives
data reporting practice variable - bias correction, quality flags, low level flags (T, W)
inconsistent and infrequent reporting of data quality and therefore little use of such
information
-------
5
DQOs. Data quality objectives (DQOs) were usually defined, particularly in the U.S.
where they were a required element of QA plans. But, often, they originated in the
analytical laboratory by default. Their justification in terms of program or project data uses
was consistently lacking. Likewise, definitions of detection, precision and accuracy goals
were inconsistent among projects, or lacking in project plans.
For example, "detection" limits were defined in terms of: reading increments,
instrument detection limits, method detection limits, limits of quantitation or 'practical'
quantitation limits. Precision was computed by replication before or after sample
preparation, within-run or between-run, for one or several concentration ranges. The Niagara
River Toxics Committee (NRTC, 1984) noted the data compatibility problems which have
arisen from such inconsistent definitions.
Data quality objectives were never defined in terms of total uncertainty, or power to
answer a question. Moreover, the questions themselves, usually about spatial patterns and/or
temporal trends in contamination, or about compliance evaluation, were not specific enough
to support an objective development process. The domain of the question was often unclear.
For example, the question 'is there a temporal trend in water quality?' could be
clarified to ask 'is there a temporal trend in annual area average water quality over a ten-
year timeframe in areas x, y or z?' The latter version of the question, with some preliminary
information on temporal and spatial variability, allows an appropriate sampling frequency
and spatial intensity to be determined, so as to detect any real change of particular
magnitude, with minimal risk of non-detection (i.e., adequate power).
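As an editorial illustration only (the change, variability and error rates below are hypothetical, and the calculation is a standard approximation rather than a prescribed procedure), the required number of annual area-average observations in each of two periods can be derived from the change to be detected and the variability of those averages:

    from math import ceil
    from statistics import NormalDist

    def samples_per_period(delta, sigma, alpha=0.05, power=0.90):
        """Approximate n per period for a two-sided, two-sample z comparison."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value of the test
        z_power = z.inv_cdf(power)           # quantile giving the desired power
        return ceil(2 * ((z_alpha + z_power) * sigma / delta) ** 2)

    # Example: detect a 2 ug/L change when annual area averages vary with SD = 3 ug/L.
    print(samples_per_period(delta=2.0, sigma=3.0))   # roughly 48 observations per period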
Study Design. Designs were specified as far as station locations and sampling
periods, but the statistical basis of the design was seldom defined. There was little rationale
in terms of ability to satisfy project or program goals or data uses. Project goals were
diverse, and integrated lakewide program goals were often not defined at all.
Data Reporting. Reporting practice was highly variable with respect to such factors
as blank and recovery correction and data quality flagging. Use of the ASTM (1985) flags
T and W for low level chemical data was encouraged, but not universally practised (T =
instrument response less than criterion of detection; W = lowest readable positive value, no
instrument response). In fact, there seemed to be several T, W systems in use (Crane, 1986).
Data quality information was typically confined to the laboratory, and if reported was often
difficult to associate with project data.
For example, only 25 to 50% of laboratories routinely reported precision, recovery
or detection limits, all of which may vary from sample to sample. Reported average values
may or may not have applied to a particular sample. What was needed was the precision
function and this was not commonly reported.
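As a sketch of what reporting a precision function could look like (the linear form and the duplicate results below are assumptions for illustration, not a standard), the standard deviation can be modelled as a function of concentration from routine duplicate analyses:

    from statistics import linear_regression  # Python 3.10+

    # (concentration, absolute difference between duplicate analyses) -- illustrative values.
    duplicates = [(1.0, 0.10), (2.0, 0.15), (5.0, 0.30), (10.0, 0.55), (20.0, 1.10)]

    conc = [c for c, _ in duplicates]
    # For a duplicate pair, the within-pair standard deviation estimate is |difference| / sqrt(2).
    sd = [d / 2 ** 0.5 for _, d in duplicates]

    slope, intercept = linear_regression(conc, sd)
    print(f"s(c) = {intercept:.3f} + {slope:.3f} * c")   # precision applicable to any sample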
-------
6
Coordination. A primary recommendation arising from the review in 1988 was the
appointment of a binational coordinator to facilitate specific definition of binational program
objectives, data uses, and data quality needs, to evaluate achievement of program objectives
and recommend program changes as required, and to develop a quality management plan for
this purpose. As of today, there are two coordinators, one on each side of the border. They
are beginning to address these tasks. We hope that this guidance document will serve as a
catalyst.
Present Situation
A recent (1992) review of quality assurance programs, performed in preparation of
this workshop paper, has been more selective and perhaps more focused on newer programs.
Some of these are cited in developing the elements of guidance that follow in detail.
However, this effort can be summarized briefly.
Several Canadian QA plans are now in place (e.g., for the Ontario MOE (1990)
Toxics Deposition Monitoring Network, and for Great Lakes Water Quality Monitoring by
Environment Canada (L'Italien, 1992)). Furthermore, it is clear that this is a growing trend.
Although there is currently no Canadian standard as to content, format or level of detail,
many other QA plans are in various stages of preparation.
Two recent U.S. programs represent a long step forward in QA planning on the Great
Lakes: the Environmental Monitoring and Assessment Program (EMAP) and the Assessment
and Remediation of Contaminated Sediments (ARCS) Program. Both give particular attention
to QA for biological measurements.
The first steps are now being taken toward binational integration of U.S. and
Canadian programs. A QA plan for an Integrated Atmospheric Deposition Network (IADN)
is near completion, which should provide a model for integration of other programs. An
evolving decision framework for Great Lakes management (Figure 1) should help to guide
both definition and integration of program goals.
By virtue of both U.S. and Canadian efforts, we are making progress, particularly in
relation to development of quality controls for environmental monitoring, awareness of goal-
driven approaches to sampling design and quality assessment, and integration of diverse
monitoring activities. We hope that this guidance document may help to maintain the
momentum and facilitate further binational integration.
-------
7
FIGURE 1: DECISION FRAMEWORK FOR GREAT LAKES MANAGEMENT
(modified from EPA, 1992b)
[Flow chart; box labels include: Are Ecological Resources Impaired or Threatened?; Determine Critical Pollutants; Quantify Loads; Establish Load Reduction Targets; Identify Sources; Identify Causes; Identify Prevention, Reduction, and Remediation Activities; Implement Activities; Monitor Source Reductions; Identify Critical Habitats; Estimate Habitat Losses; Habitat Inventory Targets; Reduce Harvest Targets; Identify Protection/Restoration Activities; Monitor Habitat Value; Monitor Ecosystem Response; Declare Success.]
-------
8
SUGGESTED FRAMEWORK FOR
A QUALITY ASSURANCE PLAN
Quality Management
Quality management begins with: a description or vision of the program (or project);
an overall purpose and specific data quality objectives; and details on organizational structure
and management process focused on achievement of objectives. Each of these topics should
be addressed as a major heading in a QA program (or project) plan. However, the program
plan may be focused at a higher level of organization, picking up where existing project
plans leave off. The cost of such front-end planning is usually offset by savings associated
with increased program efficiency.
Program (Project) Description
The description provides an overview of the main project(s) which is (are) the focus
of the QA plan, any related projects and relationships between projects, data collection
activities, basic design approaches, objectives in terms of questions to be resolved, data uses
to address these questions, and the overall framework for decision making based on the
answers to these questions.
Project(s). First, we identify the project(s) which is (are) the focus of the QA project
or program plan. In a program plan, a group of related projects is encompassed within an
umbrella program.
Relationships. Any blocks of data to be produced by one project and used by
another, either as ancillary data or in comparisons or in data pooling, must be clearly
identified by measurement type, matrix, timeframe and geographic region. A data producer
x user matrix with data blocks and uses listed in cells can be a convenient platform for
summarizing these relationships (Table 2.1). Any criteria that would permit or preclude such
data uses should be stated.
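A minimal sketch of such a matrix (the project names and data blocks below are hypothetical, and this is simply one way to represent the table referred to above) keys each cell by producer and user and lists the linking data blocks and their uses:

    # Hypothetical producer x user matrix; each cell lists data blocks and their uses.
    producer_user = {
        ("Tributary monitoring", "Lakewide loading assessment"):
            ["total phosphorus in water, 1993-1997, Lake Ontario tributaries - load estimation"],
        ("Open-lake surveillance", "Lakewide loading assessment"):
            ["total phosphorus in water, 1993-1997, open Lake Ontario - trend comparison"],
    }

    for (producer, user), blocks in producer_user.items():
        print(f"{producer} -> {user}")
        for block in blocks:
            print(f"  {block}")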
Activities. The list of data collection activities generally corresponds to the data
blocks defined. More detail can be given here as to the specific measurements included in
each block, and general methods of sampling and analysis (reference SOPs).
Basic Design. For each data collection activity, we identify the sampling strategy
(e.g., random, stratified random, systematic) and level of effort (e.g., numbers of samples or
observations) over the entire domain of study, and within any identified environmental strata.
The spatial and temporal domain of study and bounds of interpolation or extrapolation should
be clearly defined and justified. Any needs for sampling concurrency, dynamic sampling or
additional pilot study and design effort should be stated.
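For example (a sketch with invented strata; the allocation rule shown is the standard Neyman formula, offered as an illustration rather than a requirement of this plan), a fixed number of samples can be allocated among environmental strata in proportion to stratum size and variability:

    def neyman_allocation(total_n, strata):
        """strata: dict of name -> (stratum size or area, stratum standard deviation)."""
        weights = {name: size * sd for name, (size, sd) in strata.items()}
        total = sum(weights.values())
        return {name: round(total_n * w / total) for name, w in weights.items()}

    # Hypothetical strata for a lake survey: (relative area, expected standard deviation).
    strata = {"nearshore": (100.0, 4.0), "offshore": (400.0, 1.5), "embayments": (50.0, 6.0)}
    print(neyman_allocation(60, strata))   # e.g. {'nearshore': 18, 'offshore': 28, 'embayments': 14}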
-------
9
Objectives. The objectives can be stated as general or specific questions to be
resolved, usually about the nature and extent of contaminants or bioindexes, and often in
comparison to environmental criteria, reference areas or baseline time periods. General
questions can be partitioned into more specific questions pertaining to individual variables
or indexes that are testable as hypotheses. All questions should have a sound basis in
current ecological theory.
Data Uses. These are largely dictated by the form of the specific questions posed;
however, statistical methods (including transformations) may depend on the assumptions that
the user wishes to make, which in turn may depend on preliminary examination of data.
Also, the variables and domains of interest may be defined or revised after preliminary data
review. This kind of flexibility is allowed and should be identified.
Decision Framework. The logic leading from the answers to specific questions, to
the ultimate selection of abatement, remedial or further investigative initiatives, should be
defined as clearly as possible. Often there are significant socio-political rather than scientific
inputs to these decisions, which should be acknowledged.
ISSUES
These requirements point to a number of key issues that may arise related to program
definition in the integration of Great Lakes monitoring programs:
1. Development of a decision framework for lake or basin management. This is
the primary issue and should be the first step in binational integration. A
possible framework is shown in Figure 1 which addresses both toxics and
habitat issues. Existing projects fall within one or several of these boxes. In a
very general way, data relationships are defined by the arrows between boxes.
Individual Lake Management Plans are jointly under development, with most
progress to date on the Niagara River and on Lake Ontario, particularly in the
area of toxics management (Rang et al., 1991). However, not all of the
problems on the Great Lakes are related to toxic chemicals.
2. Geographical and technical scope of the program. Is it lakewide, basinwide;
single or multimedia? Which existing projects will be included? The
atmospheric monitoring programs are now in the process of binational
integration, with development of an Integrated Atmospheric Deposition Network
(IADN). A general approach to integrated atmospheric monitoring is outlined
by Brydges et al. (1991).
-------
10
3. Integration of sampling strategies to meet both project and program objectives.
What compromises are necessary? What efficiencies can be realized?
Integration of sampling strategies will be needed in order to address lakewide or
basinwide questions without the complications of interjurisdictional sampling
bias. These complications may particularly affect large scale indexes of lake
status. General guidance on sampling design can be found in Gilbert (1987).
4. Binational process for definition of integrated objectives, questions, data uses
and sampling strategies. Who is involved (besides the coordinators)? What are
the steps? The U.S. and Canadian QA coordinators will be an integral part of
this process, but they will need input from all the project managers. They will
also need budget and manpower resources.
Program (Project) Data Quality Objectives
The linkage of fine scale questions to broader scale questions means that data may
have to meet the needs of different users at project and program levels. The data quality
(and quantity) needs of these users may differ, and it is critical that project managers
understand the broader context.
A process for defining and assessing achievement of DQOs is essential. The process
of DQO development is iterative, involving both program managers and technical experts.
It should lead to DQOs that are realistic and achievable. The process often involves
compromise between what we would like to achieve and what we can afford, but ensures an
appreciation of what is achievable with available budget, encourages cost-effective resource
allocation, and guards against spreading resources too thinly.
Power. A specified power to answer program or project questions with a specified
resolution is the ultimate data quality objective. It is determined as an objective from
consideration of the consequence of getting incorrect answers and thus making 'wrong'
decisions. Other measurement quality objectives (MQOs) are designed to control components of the
overall uncertainty that limits the power of the study.
Sensitivity. Sample detection limits should be sufficiently low that unacceptable bias
due to censored data is not introduced to the summary statistics that are used to answer
questions (e.g., mean or variance). The required sensitivity typically depends on the average
measured value in each domain of interest (e.g., one-tenth of mean or less). Sensitivity is
determined from analytical precision of blanks or low level standards.
-------
11
Precision is specified as a permissible limit for random error. It may be specified at
any or all levels, for the relevant environmental domain (total precision), the measurement
process (including sampling method) or the analytical method. It is determined by
replication at each level.
Accuracy is specified as a permissible limit for control sample deviation from
expected value. It is determined by analysis of lab and field blanks, reference standards
and/or matrix spikes.
Specificity. The measurement system should respond only to the analyte of interest.
Specificity is normally evaluated during verification of an analytical method.
Comparability. Two data sets may be considered comparable if they are both
essentially unbiased or share essentially the same bias as determined by average of lab and
field blanks, reference standards and/or matrix spikes at any concentration. Differences in
total precision may require special methods of comparison leading to some loss of power,
but generally do not preclude comparison.
Compatibility. Two data sets may be pooled for subsequent treatment as a single data
set, only if they are essentially the same with respect to both mean value and total precision
for all variables and domains of interest. Compatibility is not necessarily ensured by use of
similar sampling and analytical protocols.
Completeness. The percentage of planned measurements that are actually obtained and
useable for any measurement variable and matrix. Criteria for any declaration of data as not
useable should be stated or referenced. Note that DQO exceedence does not necessarily make
data unusable.
Representativeness. Samples may be considered to represent the domain of interest
if collected by an unbiased sampling plan and if associated accuracy is acceptable.
Assessment of representativeness involves consideration of time and location and other
conditions of sampling. Representativeness is not necessarily ensured by adherence to
sampling and analytical protocols.
Assessments. We should define the schedule and general process for assessing whether
each DQO is being or has been achieved, and for determining, based on this, whether the
basic study design, management practice (organization) or technical operations (field,
laboratory, quality assurance) should be revised.
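To make several of these measures concrete (an editorial sketch with invented numbers; the detection limit convention shown is one common approach, not a binational standard), sensitivity, precision, accuracy and completeness might be computed as follows:

    from statistics import mean, stdev

    # Sensitivity: method detection limit from replicate low-level standards (n = 7).
    low_level = [0.21, 0.18, 0.24, 0.20, 0.19, 0.23, 0.22]
    t_99 = 3.143                       # one-sided Student t, 99%, 6 degrees of freedom
    mdl = t_99 * stdev(low_level)

    # Precision: relative standard deviation of replicate analyses of one sample.
    replicates = [10.2, 9.8, 10.5, 10.1]
    precision_rsd = 100 * stdev(replicates) / mean(replicates)

    # Accuracy: recovery of a matrix spike.
    spike_added, spike_found = 5.0, 4.6
    recovery_pct = 100 * spike_found / spike_added

    # Completeness: usable results as a percentage of planned measurements.
    planned, usable = 120, 111
    completeness_pct = 100 * usable / planned

    print(f"MDL = {mdl:.2f}, RSD = {precision_rsd:.1f}%, "
          f"recovery = {recovery_pct:.0f}%, completeness = {completeness_pct:.1f}%")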
-------
12
ISSUES
A number of key issues are related to the requirement for data quality objectives:
1. Complexity of process for defining DQOs, MQOs. How should it vary with
type and scale of program? How can it be phased-in to existing programs?
Both EPA (EMAP) and EC (NWRI) have identified a commitment to the
process. General guidance on the process is provided by QAMS (1986).
2. Complexity of the optimized sampling design. Different levels of effort for
different variables? Or average level of effort, different degrees of power?
Alldredge (1987) and Green (1989) provide useful guidance on power analysis.
3. Standardization of data quality measures. Is it necessary? Is it possible? The
need for such standardization in an integrated binational program has been
recognized (NRTC, 1984). In particular, various types of detection limits are in
use (e.g., EPA, 1984; Hunt and Wilson, 1986; MOE, 1988).
4. Accuracy without standard reference materials? Consistency over time instead?
Development of appropriate standards? The Niagara River program utilizes an
uncertified 'reference' sediment sample to assess the consistency of sediment
chemical analysis through time. The Canadian Bioindex program (Ralph, 1992,
pers. comm.) has used blind recounts of old phytoplankton samples to assess
consistency of taxonomic enumeration. Elliot and Drake (1981) describe a
synthetic reference benthic sample that has been used to assess accuracy (i.e.,
recovery) in sampling benthic organisms for enumeration. Additional-effort
methods, as described for EMAP-Great Lakes (Hedtke et al., 1992), can serve
the same purpose.
5. Bias correction to permit comparison. Should it be done? If so, who does it?
Use of censored data techniques (El-Shaarawi and Dolan, 1989) to compensate
for test insensitivity? Metikosh et al. (1987) noted that detection limits for
metals analysis in open lake waters have often been inadequate for purposes of
trend evaluation.
6. Specifically which data sets require comparison? This should fall out of Lake
Management Plans, but requires explicit definition. The Ontario MOE (1990)
Toxics Deposition QA Plan defines two data sets as comparable when statistical
tests indicate no significant difference (presumably with respect to measurement
accuracy and precision) at a given confidence level. However, it is not clear
which data sets will be compared. The EMAP-Great Lakes program (Hedtke et
al., 1992) lists both external and contemporary EMAP databases with which
comparison is desired but does not say how comparability will be determined.
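One way the comparability question could be answered (a sketch with hypothetical split-sample results, not the procedure of any of the programs cited) is to test the difference between laboratory means, and the ratio of their variances, against tabulated critical values at the stated confidence level:

    from math import sqrt
    from statistics import mean, stdev

    # Hypothetical split-sample results from two laboratories.
    lab_a = [4.1, 3.9, 4.4, 4.0, 4.2, 4.3]
    lab_b = [4.6, 4.4, 4.8, 4.5, 4.9, 4.4]

    na, nb = len(lab_a), len(lab_b)
    ma, mb = mean(lab_a), mean(lab_b)
    sa, sb = stdev(lab_a), stdev(lab_b)

    # Welch-type statistic for the difference in means (compare with a t table): bias check.
    t_stat = (ma - mb) / sqrt(sa ** 2 / na + sb ** 2 / nb)
    # Variance ratio (compare with an F table): precision check relevant to compatibility.
    f_stat = max(sa, sb) ** 2 / min(sa, sb) ** 2

    print(f"mean difference = {ma - mb:.2f}, t = {t_stat:.2f}, F = {f_stat:.2f}")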
-------
13
Power to detect a spatial pattern of environmental impact has been identified by EC/DFO
(1991) as a required consideration in design of environmental effects monitoring (Figure 2).
Similarly, Reckhow (1979) considers how uncertainty in a loading estimate affects power
to predict lake trophic status (Figure 3). Power to detect change and trend has been
identified as the driving force behind the design of the EMAP-Great Lakes program (Hedtke
et al, 1992).
Some MQOs from the ARCS program are shown in the next two Tables (3 and 4). The
footnotes define the data quality measures. Clear definitions are needed, both for data
quality assessment, and for comparison/integration of programs. The goals for toxicity
testing are interesting. The control response limit is a measure of test sensitivity, similar to
an MDL (MOE, 1988). Precision is measured by replication of both reference toxicant tests
and tests of project samples. Accuracy is a relative standard deviation over time of reference
toxicant test results. No MQOs are defined for benthic enumeration. However, they could
be defined in terms of species recovery (Figure 4) or relative standard errors of organism
density estimates.
Program (Project) Organization
The integrated structure and process by which management ensures that objectives are
achieved must be defined in order to be effective. This includes definition of responsibilities
and authorities, reporting channels and schedules, QA policy, resource allocations and
strategies, document controls, procurement processes and staff training and evaluation
procedures. The EMAP QA Program Plan (EPA, 1991) provides good examples of such
definition, based on an overall philosophy of Total Quality Management (TQM).
Responsibilities and Authorities. Identify by title and agency the individuals
responsible for planning, implementing and assessing each of the main field, laboratory and
QA operations, and list these areas of responsibility for each individual.
Reporting Channels and Communications. Define each report product, the schedule
or frequency of preparation, the individuals responsible for preparation, review and final
approval and the dissemination to program participants. Dissemination of QA information
is particularly important.
QA Policy Statement. Affirm the senior management commitment to, and ultimate
responsibility for, achievement of data quality objectives as a priority equal to or greater than
that of schedule or budget. Specifically address strategy for situations when, due to
unforeseen circumstances, tradeoffs may be necessary.
Resource Allocations. Identify strategy for budget and manpower allocations to
specific field, laboratory and QA operations, including QA planning, and any process for
cost monitoring and/or increasing allocations to meet program needs.
-------
14
Document and Data Control. Identify the process for revision, certification and
circulation of key documents such as QAMP, QAPP, SOPs, project or QA reports and data
files, so as to ensure information integrity and use of most recent versions. A regular review
schedule, a process for irregular revisions and a list of changes are recommended for all living
documents, and an archive of dead documents should be maintained. For large, multi-user
programs, a data clearing house can be established to check on integrity of data transfers,
perhaps using reference data sets.
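A minimal sketch of such an integrity check (the file names are placeholders, and the digest algorithm is simply one readily available choice) is to compute a digest of each data file before and after transfer and confirm that the two agree:

    import hashlib

    def file_digest(path):
        """Return a hexadecimal digest of the file contents."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # if file_digest("sent/survey_1993.dat") == file_digest("received/survey_1993.dat"):
    #     print("transfer verified")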
Procurement Process. Describe the process for ensuring that all purchased goods and
services meet established quality and performance specifications related to program (project)
needs.
Staff Training and Evaluation. Describe the process for design and implementation
of program (project) related staff training, evaluation and certification programs.
ISSUES
A number of key issues are related to program organization:
1. Independence of the QA officer. Is it necessary? Is it desirable? How can it
be accomplished? For small operations this function is not a full time job and
there is a strong economic incentive to assign other duties. However, when
these other duties involve either management of the project or data generation,
the 'peer review' aspect of quality evaluation may be lost.
2. Strategy for scheduling vs. quality needs. Single vs. multiple laboratories?
Tradeoffs are often required between scheduling and data consistency. For
example, one way of avoiding interlaboratory bias is to assign all work to a
single laboratory, but this can create a large sample backlog and a protracted
analytical schedule. Splitting the work among several laboratories can reduce
the backlog at each lab, and in theory shorten the schedule, but can also
introduce data comparability problems. A clear statement of data quality needs,
schedule needs, and their relative priority can help to prevent situations where
quality may be unduly compromised to meet a schedule.
3. Strategy for budget vs. quality tradeoffs. Lab information systems for small
operations? Tradeoffs may be required between budget and quality, particularly
when surveys initiated without adequate planning must be repeated to achieve
objectives. Also, for small operations, some tools of quality assurance, such as
laboratory information systems, may not be affordable.
-------
15
4. QA policy statement. How to make it more than motherhood? Specifically
address tradeoff situations? The policy statement often sounds like
'motherhood'. No one would oppose it because it does not specifically address
the conflicting priorities of quality, schedule and budget that are typically
responsible for sacrificing quality assurance. An effective policy statement
may well be controversial.
5. Secure vs. accessible information systems? What is the best compromise? An
absolutely secure system often restricts or discourages access. In practice a
balance must be achieved.
6. Standardization of work statements and QA/QC specifications for contract
laboratories?
7. Binational training and evaluation programs? Particularly for field techniques?
Joint training programs are a good way of reducing bias, particularly in
sampling operations for which SOPs are often lacking.
Field Operations and QA/QC
Field operations include sampling design and sampling procedures. Design requires
good communication between managers and technical staff, and consideration of available
resources. Procedures may reference or build on any pertinent SOPs that exist, rather than
re-creating them. Examples of field SOPs include those developed for atmospheric
deposition monitoring by the Ontario Ministry of the Environment (Shackelton, 1990), for
water quality monitoring on the Niagara River (Niagara River Sampling Protocol Group
1988), and for herring gull surveillance by Environment Canada (Weseloh and Mineau
1991).
Design. Studies can be designed to meet project and data quality objectives, based on
previous pilot or historical data, clearly defined questions and stated acceptable confidence
levels. The design will specify time period and area selection, adequate numbers of stations
or time points in each, unbiased sampling within areas or time periods, sampling locations,
and adequate field and laboratory replication to check or control components of error.
Procedures. It is acceptable to reference or build on existing SOPs for standard
methods. Non-standard methods or deviations from standard methods are described in the
QA plan. They include any real time quality control checks, such as repeated measurements,
or sampling efficiency tests. They should be organized to facilitate updating as needed, and
must be communicated to field staff.
-------
16
Field Data Management. Standardized forms facilitate consistent data recording and
should include places for observational (meta) data and comments, as well as places to
indicate that required methods were followed, or to note exceptions. In-field data quality
checks should be encouraged.
ISSUES
A number of key issues related to field operations seem to need attention and/or
standardization:
1. Justification of sampling design in relation to objectives. What is the
expectation of confidence in estimators and decisions? The water quality
sampling effort on Lake Ontario by Environment Canada has been justified in
relation to confidence in estimation of the lakewide mean (L'Italien, 1992). In
EMAP, estimator variances have been specifically defined in relation to
sampling design features (Overton et al., 1991; Hedtke et al., 1992).
2. Selection of appropriate reference areas for impact assessment? This selection
is a critical but often rather subjective task. General guidelines on reference
area selection are included with the technical guidance for environmental effects
monitoring in Canada (EC/DFO, 1991b).
3. Allowance for bias as well as imprecision in estimation of required sampling
effort? Often, bias is assumed to make a negligible contribution to measurement
uncertainty, and ignored. Alternatively, the square of a bias allowance may be
added to the variance estimate used in sampling design. A consistent approach
is desirable.
4. Development of real-time quality controls on the sampling process and
performance criteria for field method equivalence. Replicate field samples are
often collected, but less often utilized to estimate or control sampling method
error. Real-time controls are necessarily limited to rapidly determined
characteristics of the sample. Replicate sampling and analysis are used to
estimate sampling method error in EMAP-Great Lakes (Hedtke et al., 1992) and
Lake Ontario water quality monitoring by Environment Canada (L'Italien,
1992). Collocated samplers serve this purpose in Ontario MOE (1990) toxics
deposition monitoring and in the Canada-U.S. IADN program (IJC, 1988). Fay
(1987) has reviewed lake sampling methods and identified a number of field QC
checkpoints. Quality control for autosampling of streams is described by
McCrea and Fischer (1992).
-------
17
A method comparison study on the St. Marys River (Figure 5) illustrates the order of
magnitude differences in benthic enumeration that can arise between methods. In this case,
the sieve size is the dominant source of error, but the effect varies from station to station!
The same study examined the effect of replicate sampling effort on characterization of
species richness (Figure 6). Preliminary studies of this nature provide useful data on which
to base a survey design.
Laboratory Operations and QA/QC
Laboratory operations include laboratory facilities and analytical procedures. The latter
may reference pertinent SOPs, rather than repeating them in detail. The Ontario Ministry
of the Environment Code of Practice for Environmental Laboratories (King, 1989) and the
CAEAL Code of Practice for Analytical Laboratories (ZENON, 1991) provide a useful
framework for preparation of laboratory SOPs.
Facilities. Describe general layout, separation of areas, reagent and climate control
and management structure. Emphasize prevention of contamination, tracking of reagents and
working solutions with respect to source, quality and expiry dates, adequate ventilation, air
quality, water quality and temperature control, and responsibilities for QA, health and safety,
sample handling and data management.
Procedures. Again, we can reference SOPs for standard methods. Non-standard
methods or deviations from standard methods are described in the QA plan. They include
or reference a description of the internal quality control, data validation, data management
and reporting systems. Analytical procedures should be verified and method performance
criteria stated.
Sample Handling. Login, pre-treatment and storage, criteria for refusing samples, chain
of custody and sample tracking mechanics and archiving procedures should be described.
All relevant field data should be captured during login.
Lab Data Management. Standardized forms facilitate consistent data recording and
reporting. They should allow flagging of questionable data, with some indication of the
reason. Automated information management systems facilitate real-time data validation and
flagging.
-------
18
ISSUES
Most chemical laboratories have some sort of quality control program in place, but
many biological laboratories do not. Key issues include:
1. Development of 'analytical' QC check points for toxicity testing and taxonomic
laboratories. Range and average control charting techniques for chemical
analysis (King, 1984; ASTM, 1986; Dux, 1986) are also generally applicable for
toxicity testing (Environment Canada, 1990), although reference toxicants
cannot be interpreted as absolute standards in 'accuracy' assessment. Duplicate
processing, sorting verification and taxonomic verification provide for similar
quality controls on benthic enumeration (Peck et al., 1988; BEAK, 1991).
2. Development of performance criteria for method equivalence. These should
correspond to standard QC check points.
Range and average control charts for toxicity testing (Figure 7) are similar to those
used in the chemistry laboratory, except that standard reference materials generally do not
exist. Thus, accuracy is problematic but we can address consistency.
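A sketch of such a chart (the reference toxicant results below are invented; limits derived from the laboratory's own history are one common convention) flags a new result that falls outside warning or control limits:

    from statistics import mean, stdev

    # Hypothetical reference toxicant LC50 results from previous tests in this laboratory.
    historical_lc50 = [1.8, 2.1, 1.9, 2.2, 2.0, 1.7, 2.1, 1.9, 2.0, 2.2]

    centre = mean(historical_lc50)
    s = stdev(historical_lc50)
    warning = (centre - 2 * s, centre + 2 * s)   # warning limits
    control = (centre - 3 * s, centre + 3 * s)   # control limits

    new_result = 2.6
    if not control[0] <= new_result <= control[1]:
        print("out of control - investigate before reporting associated test data")
    elif not warning[0] <= new_result <= warning[1]:
        print("warning - review test conditions")
    else:
        print("in control")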
Discrepancies between sorters in benthic sample processing (Figure 8) can be checked
periodically, just as instrument operators are compared in the chemistry laboratory. Sample
archives permit assessment of comparability between old and recent surveys.
General Quality Assurance
General quality assurance issues include interlaboratory and accreditation studies,
performance and systems audits, internal QC data review, project or program data reviews,
corrective actions when problems are identified, assessment of DQO achievement and QA
reporting. Each of these topics should be addressed as a major heading in a QA program
(or project) plan.
Interlaboratory and Accreditation Studies. Typically these are external to the project
or program, but the results can be pertinent, particularly in interlaboratory studies with
project-relevant matrices. Variations on the interlaboratory theme include split sample
exchanges among participating laboratories, or with a reference laboratory throughout the
course of a multi-laboratory program. Identify most recent, current or upcoming relevant
studies, relevant parameters in each one, any deficiencies noted and follow-up actions.
Audits. Performance audits involve blind submission of standard samples and can be
focused on particular problem parameters in the field or laboratory. System audits involve
site visits, interviews and document review and are generally more comprehensive. For
audits within the project or program, identify type, schedule, system checkpoints and
performance criteria for each parameter. Also describe the dissemination of results and
follow-up procedures to ensure that any problems are rectified.
-------
19
QC Data Review. Involves examination of control sample frequency, association with
project sample data, control limit exceedences and real-time response to limit exceedence.
Identify review schedule, checkpoints and follow-up action criteria at bench level and above.
Field and laboratory staff must understand the use of QC data in order to respond effectively.
Project Data Review. Includes data validation, reduction and reporting. Reduction
includes any bias correction of data (e.g., blank or recovery correction). Validation is based
on associated QC data, in relation to invalidation or flagging criteria. Identify review
schedule, checkpoints and follow-up action criteria at bench level and above. Also define
the mechanism for feedback to the QA program design (e.g., audits to focus on identified
problems), and the technology used to associate QC and other project data. Optical discs
can be used to avoid sequestering of QC data.
Corrective Actions. These may be taken in real-time or after QC or project data
reviews. Specific trigger conditions should be defined for each corrective action. Any such
action taken should be recorded and its effectiveness assessed.
Assessment. Achievement of analytical or measurement DQOs for each sample can
be assessed and tracked continually, since the data flagging criteria typically relate to their
achievement. Other DQOs (e.g., power to answer a question) involve the entire data set and
are not assessed definitively until all planned samples that can be collected are processed.
However, interim evaluations are highly recommended.
QA Reports. These should contain study design recommendations based on assessment
of DQO achievement. The schedule often coincides with the schedule for QAPP review and
revision. The content is a summary of results from all of the QA operations. Both schedule
and content should be defined in QA plans.
ISSUES
A number of key issues related to these QA operations will likely need attention in
the integration of Great Lakes programs.
1. Interlaboratory studies vs. performance audits. Audits can be conducted more
frequently and designed to correspond more closely with periods and types of
project activity for each laboratory. However, round-robins may provide a
better indication of interlaboratory bias in relevant matrices. The IJC Data
Quality Workgroup has conducted many round-robins for chemical analysis
related to Great Lakes monitoring (Rathke and McRae, 1987). The Canadian
Association for Environmental Analytical Laboratories now operates an
accreditation program for chemical laboratories, and is expanding the program
to include toxicity laboratories (CAEAL, 1992).
-------
20
2. Development of appropriate performance audit samples and standardized system
audit checklists. The Canadian Association of Environmental Analytical
Laboratories is developing audit programs for both chemical and biological
laboratories. Field and laboratory audits are a regular part of EMAP, ARCS and
Niagara River monitoring programs (Kuntz and Harrison, 1991; Hedtke et al.,
1992; Schumacher, 1992). Laboratory audits are conducted in the Environment
Canada Herring Gull program (Turle and Norstrom, 1987), and field audits are
conducted in the Ontario MOE (1990) Toxics Deposition program.
3. Standardization of QC data uses across the Great Lakes, e.g., to flag or reject
project data. Both EMAP and ARCS programs (Hedtke et al., 1992;
Schumacher, 1992) call for real-time review of QC data, such as repeated
measurements in the field, and duplicate, standard, spike or blank measurements
in the laboratory, by field and laboratory staff. Electronic data logging or
portable computer use (Hedtke et al., 1992) can facilitate automation of this
process.
4. Standardization of key data reductions such as blank and recovery correction.
Both EMAP and ARCS program plans (Hedtke et al., 1992; Schumacher, 1992)
describe a data validation and verification process which includes range checks
and internal consistency checks, in addition to evaluation of MQO achievement
based on QC data. The Canadian Lake Surveillance Programs use a surrogate
recovery criterion for data validation (Neilson and Stevens, 1988).
5. Development of a binational process for feedback from data quality assessment
to study design. This feedback is often missing from QA plans. It should be
identified as an ongoing process leading to annual updates of the QA plan.
While there is a tendency to stick with original designs for the sake of
consistency, consistent failure to meet program goals is not desirable.
6. Development of minimum standards for QA reporting, e.g., frequency, content,
level of detail. While QA reports have not been standard practice in Canada,
they provide a convenient vehicle for and record of feedback from quality
assessment to study design. The Ontario MOE (1990) Toxics Deposition
program requires an annual QA report to be prepared by the network
coordinator. The ARCS program calls for at least one QA report by each
participating laboratory, and one final QA report by the program QA staff, over
the duration of the program (Schumacher, 1992). The EMAP program (EPA,
1991) calls for a QA annual report and workplan (QAARW) from each of the
resource and integration groups, including Great Lakes.
-------
21
An inter-ship study on Lake Erie (Figure 9) addressed both sampling and analytical
comparability. Three ships exchanged samples. In this instance, the analytical biases
between ships exceeded the sampling method bias (Rathke and McCrea, 1987).
A biotesting round-robin (Figure 10) illustrates a problem that is unique to biotesting.
Test protocols permit different dilution waters to be used, resulting in large interlaboratory
differences in LC50. We should know about these discrepancies, but should not use them
to judge lab performance. An estimated toxicant concentration based on the LC50 is a better
measure of laboratory performance.
SUMMARY
A common framework for implementation and description of quality assurance
programs is needed on the Great Lakes. Within a suggested framework, this document has
presented some elements of guidance that may be useful in preparation of quality assurance
plans, and some related issues that may need to be addressed in development of integrated
plans for the Great Lakes. The document is intended to serve as a template to encourage plan
preparation and focus binational discussion as integrated quality assurance plans are
developed. Toward this end, a QA planning checklist is attached as Appendix 1.
-------
APPENDIX 1
Generic Quality Assurance
Planning Checklist
-------
APPENDIX 1: GENERIC QUALITY ASSURANCE PLANNING CHECKLIST
Page 1 of 5
PROGRAM (PROJECT) COMPONENT/ACTIVITY
KEY QUESTIONS/ISSUES TO BE ADDRESSED
CHECK STATUS
EXAMPLES OR REFERENCES
QUALITY MANAGEMENT
Program (Project) Description
Overall scope of program or project
EPA, 1987
Main and Related Projects
Project names and data products
Relationships Between Projects
Data producer x user matrix
Nature of data uses and criteria for use
Table 2.1
Data Collection Activities
General sampling methods
Parameters to be measured in each sample type
Basic Design Approaches
Sampling times and areas
Integration of project designs
Randomization strategy (e.g., stratified, systematic)
Brydges et al., 1991
Gilbert, 1987
Objectives and Questions
Ecological basis of questions
Clear spatial/temporal domains and parameters
Data Uses to Address Questions
Statistical methods, transformations, tests
Gilbert, 1987
Decision (Action) Framework
Link from answers to actions
Rang et al., 1991
Figure 1
Data Quality Objectives
Definition, derivation, assessment
Power to Answer Question
Conditional on degree of effort, magnitude of effect
Variation by parameter in power or effort
Alldredge, 1987
Green 1989
Figures 2 and 3
Sensitivity, Precision, Accuracy.
Standardization of data quality measures
Accuracy (consistency) without SRMs
NRTC, 1984
Elliot and Drake, 1981
Hedtke et al., 1992
EPA 1984; MOE, 1988.
Tables 3 and 4
-------
APPENDIX 1: GENERIC QUALITY ASSURANCE PLANNING CHECKLIST
Page 2 of 5
PROGRAM (PROJECT) COMPONENT/ACTIVITY
KEY QUESTIONS/ISSUES TO BE ADDRESSED
CHECK STATUS
EXAMPLES OR REFERENCES
Comparability and Compatibility
Data sets to be compared or pooled
Criteria for comparison or pooling
Bias correction and uncensoring to permit comparison
Hedtke et al 1992
MOE, 1990
El-Shaarawi and Dolan, 1989
Metikosh et al., 1987
Representativeness of Data
Relevance of sampling domain to question posed
Unbiased sampling strategy
Process for DQO Definition
Process complexity appropriate to scale of program
Phase-in to existing programs
QAMS, 1986
Process for Quality Assessment
Nature and frequency of reviews
Feedback to study design
Program (Project) Organization
General process and structure
EPA, 1991, 1992a
Responsibilities and Authorities
Management and technical staff included
Independence of quality reviews
Reporting (Communication) Channels
Horizontal and vertical pathways defined
QA Policy Statement
Strategy for tradeoff situations defined
Resource Allocation Process
Budget and manpower adjustments
Document Control Process
Security vs. accessibility balanced
Procurement Process
Standardization of work statements
Staff Training, Evaluation
Integration across projects and jurisdictions
FIELD OPERATIONS AND QA/QC
Sampling Design
Temporal and spatial distribution of effort
Gilbert, 1987
Time and Area Selection
Appropriateness of reference areas
EC/DFO, 1991
-------
APPENDIX 1: GENERIC QUALITY ASSURANCE PLANNING CHECKLIST
Page 3 of 5
PROGRAM (PROJECT) COMPONENT/ACTIVITY
KEY QUESTIONS/ISSUES TO BE ADDRESSED
CHECK
STATUS
EXAMPLES OR
REFERENCES
Adequate Replication
Justification of sample sizes
Based on preliminary data
Expectation of power and confidence
L'Italien, 1992
Overton et al., 1991
Hedtke et al., 1992
Sample Locations and Timing
Specify both, with methods of station location
Randomization Strategy
Demonstration of unbiased design
Sampling Procedures (SOPs)
Written, accessible and understood
NRSPG, 1988
Shackleton, 1990
Weseloh and Mineau, 1991
Equipment/Instrument Maintenance
Frequency of each procedure (e.g., cleaning gear)
Instrument Calibrations
Frequency, range and standards used
Sample Collection Process
Method performance criteria and comparisons
Figure 5
Sample Preservatives, Storage, Shipping
Permissible conditions and transit times
Sampling and Custody Records
Paper trail from field to laboratory
In-field Quality Controls
Real-time checks and corrective actions
Additional effort checks (e.g., taxa recovery)
MOE, 1990; Fay, 1987
McCrea and Fischer, 1992
Hedtke et al., 1992
BEAK, 1991; Figure 4
In-field Data Validation
Checks performed and criteria for flags assigned
LABORATORY OPERATIONS AND QA/QC
Laboratory Facilities
Physical features and management process
King 1989; ZENON, 1991
Spatial Layout
Separation of critical areas (e.g., receiving, testing)
Climate Control
Specifications, monitoring systems and data
Reagent Control
Process to ensure adequate reagent quality
Lab Management
Organization, process and structure
-------
APPENDIX 1: GENERIC QUALITY ASSURANCE PLANNING CHECKLIST
Page 4 of 5
PROGRAM (PROJECT) COMPONENT/ACTIVITY
KEY QUESTIONS/ISSUES TO BE ADDRESSED
CHECK
STATUS
EXAMPLES OR
REFERENCES
Analytical Procedures (SOPs)
Written, accessible and understood
King, 1989; ZENON, 1991
Equipment/Instrument Maintenance
Frequency of each procedure (e.g., cleaning bottles)
Instrument Calibration
Frequency, range and standards used
Sample Receiving, Storage, Preparation
Permissible conditions and holding times
Analytical Methods
Method performance criteria and comparisons
Analysis and Custody Records
Paper trail from field to lab to data report
Lab Quality Control
Real-time checks and corrective actions
Control charts, benthic sorting checks
King, 1984; Dux, 1986
ASTM 1986; EC, 1990
Peck, 1988; BEAK, 1991
Figures 7 and 8
Lab Data Validation
Checks performed and criteria for flags assigned
QUALITY ASSURANCE ACTIVITIES
Interlaboratory/Accreditation Studies
Rationale for use of these data
Relevance to program objectives
Rathke and McRae, 1987
CAEAL, 1992
Figures 9 and 10
Performance and System Audits
Standardization of audit checklists
Appropriateness of audit samples
Kuntz and Harrison, 1991
Hedtke et al., 1992
Schumacher, 1992
Turle and Norstrom, 1987
Review of QC Data
Standardization of QC data uses
Hedtke et al., 1992
Schumacher, 1992
Project Data Review
Standardization of data reductions
Standardization of data validations
Hedtke et al., 1992
Schumacher, 1992
Neilson and Stevens, 1988
Corrective Action Plan
Trigger criteria, response options
-------
APPENDIX 1: GENERIC QUALITY ASSURANCE PLANNING CHECKLIST
Page 5 of 5
PROGRAM (PROJECT) COMPONENT/ACTIVITY
KEY QUESTIONS/ISSUES TO BE ADDRESSED
CHECK
STATUS
EXAMPLES OR
REFERENCES
Assessing DQO Achievement
Quality Assurance Reports
Process for feedback to study design and methods
Minimum standards specified for QA reports
MOE, 1990; EPA, 1991
Schumacher, 1992
-------
APPENDIX 2
References
-------
APPENDIX 2: REFERENCES
Alldredge, J.R. 1987. Sample size for monitoring of toxic chemical sites. Environ.
Monitor. Assess. 9: 143-154.
ASTM. 1985. Standards on Precision and Bias for Various Applications (2nd. ed.).
D4210-83. Practice for Intralaboratory Quality Control Procedures and a Discussion on
Reporting Low-Level Data. American Society for Testing and Materials, Philadelphia.
ASTM. 1986. Manual on Presentation of Data and Control Chart Analysis. Committee E-
11 on Statistical Methods. ASTM Special Technical Publication 15D. American
Society for Testing and Materials, Philadelphia.
Beak Consultants Limited (BEAK). 1988. Quality Assurance Coordination of Great Lakes
Surveillance. Report to Environment Canada, Inland Waters Directorate, Ontario
Region.
Beak Consultants Limited (BEAK). 1991. Quality Assurance Guidelines for Biology in
Aquatic Environmental Protection. Environment Canada, Conservation and Protection,
National Water Research Institute, Burlington.
Brydges, T.G., C.A. Franklin, I.K. Morrison, P.W. Summers, B.B. Hicks, K.L. Demeijian,
D.L. Radloff, M.L. Wesely and B.M. Levinson. 1991. Integrated Monitoring in the
U.S.-Canada Transboundary Region, Monitoring for Integrated Analysis. Report to the
International Joint Commission, Air Quality Advisory Board.
Canadian Association of Environmental Analytical Laboratories. 1992. Laboratory
Accreditation Program for Biological Environmental Toxicity Testing (Draft).
Crane, J.L. 1986. Workgroup Summary: New Quality Control Issues. In: Methods for
Analysis of Organic Compounds in the Great Lakes, Volume II. Proceedings of an
International Workshop. W.C. Sonzogni and D.J. Duke, Chairmen. WIS-SG-86-244.
University of Wisconsin Sea Grant Institute, Madison.
Dux, J.P. 1986. Handbook of Quality Assurance for the Analytical Chemistry Laboratory.
Van Nostrand Reinhold Co., New York.
Elliot, J.M. and C.M. Drake. 1981. A comparative study of seven grabs used for sampling
benthic macroinvertebrates in rivers. Freshwater Biology 11: 99-120.
El-Shaarawi, A.H. and M. Dolan. 1989. Maximum likelihood estimation of water quality
concentrations from censored data. Can. J. Fish. Aquat. Sci. 46:1033-1039.
Environment Canada. 1990. Guidance Document on Control of Toxicity Test Precision
Using Reference Toxicants. EPS 1/RM/12. Environmental Protection Service, Ottawa.
-------
Environment Canada/Department of Fisheries and Oceans (EC/DFO). 1991a. Aquatic
Environmental Effects Monitoring Requirements, Annex 1, Aquatic Environmental
Effects Monitoring at Pulp and Paper Mills and Off-Site Treatment Facilities Regulated
under the Pulp and Paper Effluent Regulations. EPS 1/RM/18. Environmental
Protection Service, Ottawa.
Environment Canada/Department of Fisheries and Oceans (EC/DFO). 1991b. Technical
Guidance Manual for Aquatic Environmental Effects Monitoring at Pulp and Paper
Mills. Volume 1 Overview and Study Design, Volume 2 Methodology. Environmental
Protection Service, Ottawa.
Environment Canada/Department of Fisheries and Oceans (EC/DFO). 1991c. Toxic
Chemicals in the Great Lakes and Associated Effects: Vol. 1. Contaminant Levels and
Trends. Government of Canada, Ottawa.
Environmental Protection Agency (EPA). 1980. Interim Guidelines and Specifications for
Preparing Quality Assurance Project Plans, Quality Assurance Management Staff,
Washington, DC.
Environmental Protection Agency (EPA). 1984. Definition and Procedure for the
Determination of the Method Detection Limit - Revision 1.11. Appendix B to Part 136.
Federal Register 49(209): 26. October 1984.
Environmental Protection Agency (EPA). 1987. Guidelines and Specifications for Preparing
Quality Assurance Program Plans and Quality Assurance Annual Report and Workplans
for EPA National Program Offices and the Office of Research and Development.
Quality Assurance Management Staff, Washington, DC.
Environmental Protection Agency (EPA). 1991. Environmental Monitoring and Assessment
Program, Quality Assurance Program Plan. Environmental Monitoring Systems
Laboratory, Cincinnati.
Environmental Protection Agency (EPA). 1992a. EPA Requirements for Quality Assurance
Management Plans. Quality Assurance Management Staff, Washington, DC.
Environmental Protection Agency (EPA). 1992b. Lakewide Management Planning Process.
Region 5 Water Division, LaMPOST, FY92, 3rd Quarter.
Fay, L.A. 1987. Great Lakes Methods Manual - Field Procedures (draft). Report to the
International Joint Commission Surveillance Work Group. Center for Lake Erie Area
Research, Columbus.
Gilbert, R.O. 1987. Statistical Methods for Environmental Pollution Monitoring. Van
Nostrand Reinhold Co., New York.
Green, R.H. 1989. Power analysis and practical strategies for environmental monitoring.
Environ. Res. 50: 195-205.
-------
Hedtke, S., A. Pilli, D. Dolan, G. McRae, B. Goodno, R. Kreis, G. Warren, D. Swackhamer
and M. Henry. 1992. Environmental Monitoring and Assessment Program, EMAP-
Great Lakes Monitoring and Research Strategy. Environmental Protection Agency,
Environmental Research Laboratory, Duluth.
Hunt, D.T. and A.L. Wilson. 1986. The Chemical Analysis of Water. General Principles
and Techniques. Second Edition. Royal Society of Chemistry, London.
International Joint Commission (IJC). 1988. Summary: A Plan for Assessing Atmospheric
Deposition to the Great Lakes. Surveillance Work Group, Atmospheric Deposition
Monitoring Task Force, IJC Great Lakes Regional Office, Windsor.
King, D.E. 1984. Principles of Control Charting. Ontario Ministry of the Environment,
Laboratory Services Branch, Rexdale.
King, D.E. 1989. Code of Practice for Environmental Laboratories. Ontario Ministry of
the Environment, Laboratory Services Branch, Rexdale.
Kuntz, K.W. and B. Harrison. 1991. Field Sampling Design and Quality Control
Procedures used in the Niagara River Program. Chemical Institute of Canada
Conference, Hamilton, June 1991.
L'Italien, S. 1992. Great Lakes Water Quality Monitoring Quality Assurance/Quality
Control. Environment Canada, Water Quality Branch, Inland Waters Directorate,
Burlington.
McCrea, R.C. and J.D. Fischer. 1992. Quality Assurance and Control Considerations for
the ISOMET Stream Sampler (draft). Environment Canada, Burlington.
Metikosh, S., J.O. Nriagu, J.M. Bewers, O. El-Kei, D.C. Rockwell, R. Rossman, I. Sekerka
and P.A. Yeats. 1987. Recommended Sampling and Analytical Protocols for Trace
Metals Measurement in Great Lakes Open Waters (draft). Report to the Surveillance
Work Group from the Technical Group on Sampling and Analytical Protocol.
Neilson, M. and R. Stevens. 1988. Evaluation of a large-volume extractor for determining
trace organic contaminant levels in the Great Lakes. Water Pollut. Res. J. Canada 23:
578-588.
Niagara River Sampling Protocol Group. 1988. Niagara River Sampling Protocol.
Environment Canada, Burlington.
Niagara River Toxics Committee (NRTC). 1984. Final Report to the Niagara River Toxics
Committee. Report by the Data Quality Subcommittee.
Ontario Ministry of the Environment (MOE). 1988. Estimation of Analytical Detection
Limits (MDL). Laboratory Services Branch, Rexdale.
-------
Ontario Ministry of the Environment (MOE). 1990. Toxics Deposition Monitoring
Network, Quality Assurance Plan. ARB-060-90. Air Resources Branch, Atmospheric
Research and Special Projects Section, Toronto.
Ontario Ministry of the Environment (MOE). 1992. Report on the Analysis of Quality
Assurance and Quality Control Data for the Petroleum Sector.
Overton, W.S., D.L. Stevens and D. White. 1991. Design Report for EMAP, Environmental
Monitoring and Assessment Program (Draft). Environmental Protection Agency,
Environmental Research Laboratory, Corvallis.
Peck, D.Y., J.L. Engels, K.M. Howe and J.E. Pollard. 1988. Aquatic Effects Research
Program, Episodic Response Project, Integrated Quality Assurance Plan. EPA 600/x-88-
274. U.S. EPA Environmental Monitoring Systems Laboratory, Las Vegas, Nevada.
QAMS. 1986. Development of Data Quality Objectives. Description of Stages I and II
(draft). Quality Assurance Management Staff, Environmental Protection Agency,
Washington, DC.
Rang, S., J. Holmes, S. Slota, D. Bryant, E. Nieboer and H. Regier. 1991. The Impairment
of Beneficial Uses in Lake Ontario. Report to Environment Canada, Great Lakes
Environmental Program Office.
Rathke, D. and G. McRae. 1987. 1987 Annual Report of the Water Quality Board to the
International Joint Commission, Appendix B, Sections 4.4 and 4.5. International Joint
Commission, Windsor.
Reckhow, K.H. 1979. Uncertainty analysis applied to Vollenweider's phosphorus loading
criterion. J. Water Poll. Control Fed. 51: 2123-2128.
Schumacher, B.A. 1992. Quality Assurance Management Plan for the Assessment and
Remediation of Contaminated Sediments (ARCS) Program. EPA/600/92.
Environmental Protection Agency, Environmental Monitoring Systems Laboratory, Las
Vegas.
Shackleton, M.N. 1990. Technical and Operating Manual, Toxics Deposition Monitoring
Program. Ontario Ministry of the Environment, Air Resources Branch, Toronto.
Turle, R. and R.J. Norstrom. 1987. The CWS Guide to Practical Quality Assurance for
Contracted Chemical Analysis. Technical Report Series No. 21. Environment Canada,
Canadian Wildlife Service, Headquarters.
Weseloh, D.V., J.E. Elliot, T.J. Miller and J.H. Olive. 1988. Herring Gull Surveillance
Program, Great Lakes International Surveillance Plan (GLISP), Volume III. Surveillance
Handbook. Report to the International Joint Commission, Water Quality Board.
-------
Weseloh, D.V. and P. Mineau. 1991. Herring Gull Surveillance Program, Field Protocol.
Environment Canada, Canadian Wildlife Service, Burlington.
Zenon Environmental Laboratories (ZENON). 1991. Code of Practice for Laboratory
Analysis in Support of the MISA Program. Report to the Canadian Association of
Environmental Analytical Laboratories (CAEAL).
-------
APPENDIX 3
Examples
-------
TABLE 2.1: HYPOTHETICAL EXAMPLES OF DATA PRODUCER X USER MATRIX FORMAT
(Data users across the matrix: Lake Ontario Task Force; Niagara R. Toxics Comm.; Hamilton Harbour RAP; Ontario MOEE)

Data Producer: Ontario MOEE
Data Block: ON Tributary Metals 1991, Flows 1991
Data Uses: Input to Lake Model; Load to Lake Trend Eval.; Load to Harbour Trend Eval.; Harbour W. Qual. Trend Eval.

Data Producer: New York DEC
Data Block: NY Tributary Metals 1991, Flows 1991
Data Uses: Input to Lake Model; Load to Lake Trend Eval.

Data Producer: MOEE / NYDEC
Data Block: Niagara R. Metals 1991, Flows 1991
Data Uses: Input to Lake Model; Load to Lake Trend Eval.; Load to River Trend Eval.; River W. Qual. Trend Eval.

Data Producer: Ontario MOEE
Data Block: N. shore L. Ont. Metals 1991
Data Uses: Calibration of Lake Model; Comparison to Harbour W. Qual.

Data Producer: New York DEC
Data Block: S. shore L. Ont. Metals 1991
Data Uses: Calibration of Lake Model

Data Producer: DOE / EPA
Data Block: Open Lake Ont. Metals 1991
Data Uses: Calibration of Lake Model; Lake W. Qual. Trend Eval.
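
For programs that wish to keep such a producer x user matrix in machine-readable form, a minimal sketch of one possible layout follows. It is illustrative only: the nested-mapping structure and the assignment of individual uses to individual users are assumptions made for the example, not taken from the table above.

    # Illustrative sketch: a data producer x user matrix kept as a nested mapping,
    # using hypothetical entries in the spirit of Table 2.1.
    producer_user_matrix = {
        ("Ontario MOEE", "ON Tributary Metals 1991, Flows 1991"): {
            "Lake Ontario Task Force": ["Input to Lake Model", "Load to Lake Trend Eval."],
            "Hamilton Harbour RAP": ["Load to Harbour Trend Eval."],
        },
        ("New York DEC", "NY Tributary Metals 1991, Flows 1991"): {
            "Lake Ontario Task Force": ["Input to Lake Model", "Load to Lake Trend Eval."],
        },
    }

    # List every use of each data block, grouped by data user.
    for (producer, block), users in producer_user_matrix.items():
        for user, uses in users.items():
            print(f"{producer} -> {user}: {block}: {'; '.join(uses)}")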
-------
FIGURE 2:
CONSEQUENCE OF MEASUREMENT ERROR IN DECISION-MAKING.
Power = 1 - β (from EC/DFO, 1991a,b)
[Figure: sampling distributions of the mean response under H0 and H1, separated by the minimum detectable difference Δ]
1. Pr (rejecting H0 when μc - μr ≤ 0) ≤ α
2. Pr (not rejecting H0 when μc - μr ≤ 0) ≥ 1 - α
3. Pr (rejecting H0 when μc - μr ≥ Δ) ≥ 1 - β
4. Pr (not rejecting H0 when μc - μr ≥ Δ) ≤ β
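
To illustrate how these error-rate constraints connect replication to detectable effect size, the short sketch below computes the power of a one-sided, two-sample comparison under a normal approximation. The effect size, standard deviation, replication levels and alpha are hypothetical values chosen for illustration, not figures from the workshop.

    # Illustrative sketch (hypothetical numbers): power of a one-sided comparison of a
    # contaminated-area mean against a reference-area mean, normal approximation.
    from math import sqrt
    from statistics import NormalDist

    def one_sided_power(delta, sigma, n_per_area, alpha=0.05):
        """Pr(reject H0 | true difference = delta) with equal replication in each area."""
        z = NormalDist()
        se = sigma * sqrt(2.0 / n_per_area)      # SE of the difference of two means
        z_crit = z.inv_cdf(1.0 - alpha)          # one-sided critical value
        return 1.0 - z.cdf(z_crit - delta / se)  # power = 1 - beta

    if __name__ == "__main__":
        # Hypothetical DQO: detect a difference of 0.5 units when sigma = 0.8
        for n in (3, 5, 10, 20):
            print(n, round(one_sided_power(delta=0.5, sigma=0.8, n_per_area=n), 2))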
-------
Model: P = L / (11.6 + 1.2 qs)
where: P = predicted phosphorus concentration (mg/L)
       L = areal phosphorus loading, with error sL
       11.6 = net settling velocity to sediment (m/a)
       qs = areal hydraulic loading (m/a)
       K = sL/L (relative error in the loading estimate)
[Figure: probability distributions of the predicted phosphorus concentration for relative loading errors K = 0.0, 0.3, 0.6 and 1.0]
FIGURE 3: CONSEQUENCE OF LOADING ERROR IN TROPHIC STATUS DESIGNATION
(after Reckhow, 1979)
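
A minimal sketch of the error propagation behind Figure 3 follows: a hypothetical loading estimate with relative error K is pushed through the phosphorus model by Monte Carlo simulation, and the fraction of simulated predictions that cross an assumed trophic boundary is reported. The loading, hydraulic loading and 0.010 mg/L boundary are illustrative assumptions only.

    # Illustrative sketch (hypothetical numbers): propagating relative loading error K
    # through the phosphorus model P = L / (11.6 + 1.2 * qs) by Monte Carlo simulation.
    import random

    def predicted_p(load, qs):
        """Predicted phosphorus concentration for areal loading 'load' and hydraulic loading qs."""
        return load / (11.6 + 1.2 * qs)

    def boundary_crossing_rate(load, qs, k, boundary, n=10000, seed=1):
        """Fraction of simulated predictions that fall above an assumed trophic boundary."""
        rng = random.Random(seed)
        above = 0
        for _ in range(n):
            noisy_load = rng.gauss(load, k * load)   # sL = K * L
            if predicted_p(max(noisy_load, 0.0), qs) > boundary:
                above += 1
        return above / n

    if __name__ == "__main__":
        for k in (0.0, 0.3, 0.6, 1.0):               # relative loading errors shown in Figure 3
            rate = boundary_crossing_rate(load=0.24, qs=12.0, k=k, boundary=0.010)
            print(f"K = {k:.1f}: {rate:.2f} of simulations exceed the assumed boundary")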
-------
TABLE 3: MQOs FOR INORGANIC AND ORGANIC CHEMISTRY ANALYSES FOR THE ARCS PROGRAM (a) (excerpt)

Parameter          MDL (b)     Accuracy (c)   Frequency   Precision (d)   Frequency
DO                 10          ± 0.5 mg/L     1/batch     ± 0.1 mg/L      1/batch
Conductivity       N/A         ± 1 µS/cm      1/batch     ± 2 µS/cm       1/batch
Chlorophyll-a      100         ± 20%          1/batch     ≤ 20%           1/batch
Total solids       0.001 g     N/A                        ≤ 20%           1/batch
Volatile solids    0.002 g     N/A                        ≤ 20%           1/batch
TSS                0.001 g     N/A                        ≤ 20%           1/batch
PSA                0.001 g     windows        1/batch     ≤ 20%           1/batch
SER                0.001 g     ± 20%          1/batch     ≤ 20%           1/batch
Moisture content   0.001 g     N/A                        ≤ 20%           1/batch
Lipid content      0.001 g     ± 20%          1/batch     ≤ 20%           1/batch

MDL units by matrix: sediment (µg/kg), tissue (µg/kg), elutriate (mg/L), water (µg/L).

a - MQOs presented do not apply to the measurement of water quality parameters associated with bioassays or fish bioaccumulation studies.
b - MDLs for water include porewater and water column samples. Units presented in the subheading are applicable to all parameters unless
otherwise noted. If no MDL is presented, then that parameter is not measured in that given matrix. N/A = not applicable.
c - Accuracy determined from CRM, SRM or standard, and is measured from the known concentration.
d - Precision is calculated as % RSD. It should be noted that LLRS will only be performing duplicate analyses; therefore, the limit will
be calculated as a RPD. Precision requirements listed here are for analytical replicates only; field duplicates are required to have a RPD
≤ 30%.
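
The precision statistics named in footnote d can be computed as in the short sketch below; the example concentrations are invented and serve only to show the calculations.

    # Illustrative sketch (invented values): percent relative standard deviation (% RSD)
    # for replicate analyses and relative percent difference (RPD) for duplicate analyses.
    from statistics import mean, stdev

    def percent_rsd(values):
        """% RSD = 100 * (sample standard deviation / mean) for three or more replicates."""
        return 100.0 * stdev(values) / mean(values)

    def rpd(x1, x2):
        """RPD = 100 * |x1 - x2| / mean(x1, x2) for a duplicate pair."""
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    if __name__ == "__main__":
        print(round(percent_rsd([10.2, 9.8, 10.5]), 1))   # analytical replicates
        print(round(rpd(10.2, 12.1), 1))                  # field duplicate pair (limit <= 30%)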
-------
TABLE 4: MQOs FOR BIOASSAYS AND FISH BIOACCUMULATION STUDIES FOR THE ARCS PROGRAM (excerpt)

Assay Organism/Community      Endpoint (a)            Response limit (mean) (b)   Reference Toxicant (c)   Precision (d) & Accuracy
Pimephales promelas           Survival                80%                         1 CR                     60%
                              Larval growth           0.25 mg                     25% or 1 CR              60%
Daphnia magna                 Survival                90%                         45%                      40%
                              Reproduction            60 young                    1 CR                     40%
Ames Assay (Salmonella)       Revertant Colonies      N/A                         25%                      N/A
Alkaline phosphatase          Enzyme Activity         20%*                        25%                      80%
Dehydrogenase                 Enzyme Activity         20%*                        25%                      80%
β-Galactosidase               Enzyme Activity         20%*                        25%                      80%
β-Glucosidase                 Enzyme Activity         20%*                        25%                      80%
Benthic community structure   Abundance of species    N/A                         N/A                      N/A
Rapid Bioassessment II, III   Community Indices [10]  N/A                         N/A                      no%

a - For bioassays in which survival is the primary endpoint, MQOs are only presented for the survival endpoint. The additional endpoints,
such as avoidance, uptake, development and molting frequency, will be recorded by the PIs.
b - The response limit is presented as the mean of the test replicates in the control or blank sample. For Ceriodaphnia dubia and Daphnia
magna, the reproduction response limit is the cumulative total of three broods. N/A = not applicable.
c - CR = concentration range in the serial dilution assays. The percentages are the maximum % RSD of EC50 and LC50 values allowed
among the replicates.
d - Precision values presented are the maximum % RSDs allowed among the replicate tests. Accuracy limits presented are the maximum
% RSDs compared through time for a given test of reference toxicant or reference sediment.
-------
FIGURE 4:
SPECIES RECOVERY OBJECTIVE IN A BIOLOGICAL SURVEY
[Figure: number of species recovered versus number of quadrats sampled (from Smith, 1974), with the DQO set at 90% of the maximum return]
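
As a hedged illustration of the recovery check shown in Figure 4, the sketch below accumulates taxa over successive quadrats and reports the quadrat count at which an assumed 90-percent-of-maximum-return objective is first met; the taxon lists are invented.

    # Illustrative sketch (invented data): cumulative species recovery across quadrats,
    # checked against a 90%-of-maximum-return objective as in Figure 4.
    def quadrats_to_reach_target(taxa_per_quadrat, target_fraction=0.90):
        """Return the number of quadrats at which the cumulative taxon list first
        reaches target_fraction of the total recovered over all quadrats."""
        all_taxa = set().union(*taxa_per_quadrat)
        target = target_fraction * len(all_taxa)
        seen = set()
        for i, quadrat in enumerate(taxa_per_quadrat, start=1):
            seen |= set(quadrat)
            if len(seen) >= target:
                return i
        return len(taxa_per_quadrat)

    if __name__ == "__main__":
        survey = [["chironomid A", "oligochaete B"],
                  ["chironomid A", "amphipod C"],
                  ["oligochaete B", "sphaeriid D"],
                  ["chironomid A", "mayfly E"],
                  ["sphaeriid D"]]
        print(quadrats_to_reach_target(survey))   # hypothetical survey of five quadrats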
-------
FIGURE 5: SAMPLING METHOD EFFECTS ON LOG TOTAL DENSITY
[Figure: log total density (approximately 0 to 4.5) plotted by station for three collection methods: Ponar grab with 200 µm sieve, Ponar grab with 500 µm sieve, and airlift sampler with 200 µm sieve]
-------
FIGURE 6:
REPLICATION EFFECTS ON APPARENT SPECIES RICHNESS
[Figure: apparent species richness plotted against the number of replicates for station SMD 4.2E, airlift sampler]
-------
FIGURE 7: RANGE AND AVERAGE CONTROL CHARTS FOR TOXICITY TESTING
Average control charts for reference toxicants
[Panel: rainbow trout LC50 (mg/L) for a sodium pentachlorophenate reference toxicant plotted by test date (February to June 1989), with the lower control limit shown]
Control charting of duplicate toxicity test data
[Panel: range of difference (%) between duplicate rainbow trout tests plotted against the mean effluent LC50 (%) of the duplicates, with the mean range and control limit shown]
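
A minimal sketch of how the centre line and limits of a reference-toxicant chart such as Figure 7 are commonly derived follows, using the conventional mean plus or minus two and three standard deviations of historical LC50s (cf. King, 1984); the LC50 series is invented.

    # Illustrative sketch (invented LC50 series): warning and control limits for a
    # reference-toxicant control chart, using the usual mean +/- 2 and +/- 3 sigma rule.
    from statistics import mean, stdev

    def control_limits(lc50_history):
        """Return (centre line, warning limits, control limits) from historical LC50s."""
        m, s = mean(lc50_history), stdev(lc50_history)
        return m, (m - 2 * s, m + 2 * s), (m - 3 * s, m + 3 * s)

    if __name__ == "__main__":
        history = [0.32, 0.29, 0.35, 0.31, 0.28, 0.33, 0.30]   # hypothetical LC50s (mg/L)
        centre, warning, control = control_limits(history)
        new_lc50 = 0.21
        print("centre line:", round(centre, 3))
        print("warning limits:", tuple(round(x, 3) for x in warning))
        print("control limits:", tuple(round(x, 3) for x in control))
        print("new test outside control limits:", not (control[0] <= new_lc50 <= control[1]))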
-------
FIGURE 8:
SORTER AND TEMPORAL EFFECTS ON LOG TOTAL DENSITY
[Figure: log total density (approximately 0 to 4.5) plotted by station for three sorters: d = sorter DF, 1985; e = sorter EJ, 1985; m = sorter MS, 1983]
-------
[Figure: panels of laboratory and ship means for TFP, TP and SRP, comparing the analytical variability (AV) results with the overall variability (OV) results at both depths]
FIGURE 9:
LABORATORY MEANS FOR THE ANALYTICAL VARIABILITY (AV) COMPARISON
AND LABORATORY AND SHIP MEANS FOR BOTH DEPTHS OF THE OVERALL VARI-
ABILITY (OV) COMPARISON IN THE THREE-SHIP INTERCOMPARISON STUDY ON
LAKE ERIE, AUGUST 1985.
-------
FIGURE 10: INTERLABORATORY STUDY OF TOXICITY TEST PERFORMANCE
[Figure: estimated concentration of the unknown (mg/L) reported by laboratories BC, ED, HFX, YUK, IEC, ONT and QC, compared against the true concentration of the unknown, for samples with alkalinity <700 mg/L and alkalinity >700 mg/L]
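
As a hedged sketch of the comparison summarized in Figure 10, each laboratory's estimate of a blind unknown can be expressed as a percent recovery against the true spiked concentration. The laboratory codes below are those shown in the figure; all concentrations are invented.

    # Illustrative sketch (invented concentrations): percent recovery of a blind unknown
    # by each participating laboratory, relative to the true spiked concentration.
    def percent_recovery(estimate, true_value):
        """Laboratory estimate expressed as a percentage of the true concentration."""
        return 100.0 * estimate / true_value

    if __name__ == "__main__":
        true_conc = 500.0                                # hypothetical true concentration (mg/L)
        results = {"BC": 470.0, "ED": 520.0, "HFX": 610.0, "YUK": 455.0,
                   "IEC": 505.0, "ONT": 490.0, "QC": 540.0}
        for lab, est in sorted(results.items()):
            print(f"{lab}: {percent_recovery(est, true_conc):.0f}% of true value")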
------- |