EPA
U.S. Environmental Protection Agency
April 13-16, 1999
Regal Cincinnati Hotel
Cincinnati, Ohio
Annual National Conference on
Managing Quality Systems
for Environmental Programs
Quality for the Next Millennium

-------
18th Annual National Conference on
Managing Quality Systems for Environmental Programs
Table of Contents

1  Welcome Letter
2  Agenda
3  Participant List
4  EPA/QA Community Phone List
5  Tuesday Sessions
6  Session W11: QA for Secondary Data Use - Lessons Learned
7  Session W12: Implementing NELAP - Status Report (Panel Discussion)
8  Session W13: The QA Professional as an Expert Witness in Court
9  Session W21: Environmental QA/QC Practices - Issues in Research
10  Session W22: Implementing QA in Environmental Labs
11  Session W23: A Sample Is a Sample Is a Sample?
12  Session W31: Information Management for the QA Professional
13  Session W32: Environmental QA/QC Practices
14  Session W33: Benchmarking - Processes and Lessons Learned
15  Session W41: Information Management for the QA Professional
16  Session W42: Putting QA into Operation: Planning, Implementation & Assessment
17  Session W43: Environmental QA/QC Practices - Oversight & Implementation
18  Session T11: Cost of Quality - Real Dollars or Otherwise
19  Session T12: Practical Approaches to Quality Auditing
20  Session T13: Implementing QA in Health and Ecological Research
21  Session T21: Field Analytical Methods - Experiences and Lessons Learned
22  Session T22: Practical Approaches to Quality Auditing
23  Session T23: What's Happening in Int'l Standards & Why You Should Care
24  Plenary Wrap-up Session: Reports from Breakout Sessions
25  Miscellaneous
26  Notes


-------

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
NATIONAL CENTER FOR ENVIRONMENTAL RESEARCH
AND QUALITY ASSURANCE
WASHINGTON, DC 20460
APR 13 1999
OFFICE OF
RESEARCH AND DEVELOPMENT
Welcome to the 18th Annual National Conference on Managing Environmental Quality
Systems. On behalf of the U.S. Environmental Protection Agency (US EPA) and the Quality
Assurance Division (QAD), I thank you for taking time from your busy schedule to participate in
this important conference.
The world around us is changing. We stand on the threshold of a new century and a new
millennium. The dynamics of the business place have shown that the old ways of doing things
will not work well in the future. Accordingly, quality management concepts and practices must
also change in order to provide sufficient and adequate support to evolving environmental
programs in government and in business.
The theme of this Conference is Quality for the Next Millennium. The solutions to
today's problems may require new tools and approaches for use by quality professionals. We
have assembled a conference program to inform you about emerging tools and approaches and to
engage you in a constructive dialogue. We need to learn from you and become better equipped to
meet your future needs in the environmental arena. Accordingly, this Conference will include
sessions on Performance Based Measurement Systems, implementation of the National
Environmental Laboratory Accreditation Program (NELAP), "secondary" data use, quality
auditing, and science and law, among others. We have brought together professionals and
practitioners on these subjects to share their perspectives with you. Your challenge is to engage
them, to learn from them, and to provide your perspectives to them. Our goal is to leave
Cincinnati better informed and better equipped to meet the needs of the next millennium.
I look forward to meeting you during the week. I believe that this will be an exciting and
productive week as we seek to continually improve our quality practices and our management of
quality systems.
Director
Quality Assurance Division
Recycled/Recyclable • Printed with Vegetable Oil Based Inks on 100% Recycled Paper (40% Postconsumer)

-------

-------
THE 18TH ANNUAL NATIONAL CONFERENCE ON MANAGING QUALITY
SYSTEMS FOR ENVIRONMENTAL PROGRAMS:
Quality for the Next Millennium
April 11-15, 1999
Regal Cincinnati Hotel
Cincinnati, Ohio
C199 TECHNICAL PROGRAM OUTLINE
SUNDAY, APRIL 11, 1999
PRECONFERENCE TRAINING COURSES
8:30 am - 12:00 pm Training Course: "Integrating Quality Assurance into Project Development"
(Two Days) Prerequisites: Completion of "Introduction to the Data Quality Objectives
[DQO] Process" or equivalent experience.
The purpose of this two-day workshop is to show project managers responsible
for planning and implementing projects involving environmental data
collection how to integrate adequate and sufficient quality assurance (QA) and
quality control (QC) into the planning of such projects. By the end of this
workshop, participants will be able to: (1) explain the necessity of planning for
data collection; (2) describe the linkages among Data Quality Objectives
(DQOs), Data Quality Indicators (DQIs), and QA Project Plans; and (3)
discuss issues related to selecting methods for assessing data quality.
This is a highly interactive workshop that will demand active cooperation
among participants, who will be organized into independent teams
attempting to find a solution to a complex project. The workshop will be an
integrated mixture of lecture, example, and the use of simulated project teams
to provide the necessary inputs to achieve success at each stage of the
project's life-cycle. This workshop will provide attendees with a more complete
understanding of how quality assurance affects the entire project.
Workshop Materials: Agency publications EPA QA/G-4 (Guidance for the
Data Quality Objectives Process), EPA QA/G-5 (Guidance on Quality
Assurance Project Plans), and EPA QA/G-9 (Guidance for Data Quality
Assessment: Practical Methods for Data Analysis) will be used or referenced as
text materials together with supplementary papers and programs.
Instructors:
•	John Warren (U.S. EPA, Quality Assurance Division, Washington,
DC)
•	Malcolm Bertoni (Research Triangle Institute, Washington, DC)
•	Nancy Hassig (Battelle Pacific Northwest Laboratories, Richland, WA)

-------
12:00 noon
Lunch Break
1:00 - 5:00 pm	Training Course: "NELAP Accrediting Authority Assessor Training"
(Enrollment is limited to State and Federal Employees.) (Continued on Monday)
The National Environmental Laboratory Accreditation Program (NELAP)
recognizes State and federal organizations as Accrediting Authorities, which,
in turn, accredit environmental laboratories in the United States. NELAP
assessments of Accrediting Authorities are conducted by NELAP-trained
assessors. These assessors must be either State or federal employees. NELAP
assessors review applications for NELAP recognition and make
recommendations on whether the accreditation program should be recognized.
This course is intended for potential NELAP Accrediting Authority assessors.
The course includes a full day of instruction on NELAP assessments and one-
half day on interviewing and listening skills. Students must complete the entire
course in order to receive credit for the training. Note: This course does not
teach specific skills associated with laboratory assessments.
Instructors:
•	Elizabeth Dutrow (U.S. EPA, Quality Assurance Division,
Washington, DC)
•	Fred Siegelman (U.S. EPA, Quality Assurance Division, Washington,
DC)
Training Course: "Integrating Quality Assurance into Project Development"
(Continued from morning)

-------
MONDAY, APRIL 12, 1999
8:30 am - 12:00 noon Training Course: "Integrating Quality Assurance into Project Development"
(Continued from Sunday)
Training Course: "NELAP Accrediting Authority Assessor Training"
(Continued from Sunday)
Training Course: "Introduction to EPA Quality System Requirements" (One Day)
In July 1998, EPA Order 5360.1 CHG 1 was issued to update the Agency's
requirements for quality systems supporting environmental programs. This
course provides an introduction to these requirements and discusses some of
the key "tools" available for planning, implementing, and assessing the
effectiveness of the quality systems. Attendees completing this course will have
a better understanding of EPA quality requirements, including those for
extramural agreements. (This course replaces the "QA Orientation" course.)
Instructors:
•	Tom Dixon (U.S. EPA, Quality Assurance Division, Washington, DC)
•	Linda Kirkland (U.S. EPA, Quality Assurance Division, Washington,
DC)
Training Course: "Introduction to the Management Systems Review Process"
The U.S. EPA has developed the Management Systems Review (MSR) process
to assist users in conducting management assessments of the suitability and
effectiveness of quality systems. The MSR utilizes interviews, file reviews, and
objective evidence of quality systems documentation to compile the needed
information for the assessment. This one-day course includes a detailed
discussion of the MSR process and how to plan, implement, evaluate, and
report the results of the assessment to managers of quality systems, and
training in interviewing and listening skills.
Instructors:
•	Gary Johnson (U.S. EPA, Quality Assurance Division, Research
Triangle Park, NC)
•	Diann Sims (U.S. EPA, Quality Assurance Division, Washington, DC)
12:00 noon Lunch Break

-------
1:00 - 5:00 pm	Training Course: "Integrating Quality Assurance into Project Development"
(Conclusion)
Training Course: "NELAP Accrediting Authority Assessor Training"
(Conclusion)
Training Course: "Introduction to EPA Quality System Requirements for
Environmental Programs" (Conclusion)
Training Course: "Introduction to the Management Systems Review Process"
(Conclusion)
7:00 - 9:00 pm	Regional QA Managers Meeting (EPA Only)
"Accreditation of State Laboratories"

-------
Managing Quality Systems for Environmental Programs:
Quality for the Next Millennium
TUESDAY, APRIL 13, 1999
8:00 am	Welcome and Introduction of Guests
Nancy Wentworth (Director, Quality Assurance Division)
8:15 am	Introduction of Keynote Speaker
Nancy Wentworth (Director, Quality Assurance Division)
8:20 am	Keynote Address:
RADM Frank C. Collins, Jr., USN (Ret.)
Frank C. Collins Associates/Survival 21
Alexandria, VA
9:00 am	The New Information Office: What Does this Mean for Quality at EPA?
Nancy Wentworth (Director, Quality Assurance Division)
9:30 am	Break
10:00 am	Highlights and Reports from QAD
Changes to Extramural Agreement Requirements
-	Status Report on Requirements and Guidance Documents
-	Status Report on Training Program
-	Status Report on Peer Review
-	Status Report on Management Assessments
-	NELAP Implementation Status at EPA
11:30 am	Lunch Break
1:00 pm	Technical Plenary Session: Report of the Inter-Governmental Task Force on Data
Quality
This plenary session will provide a status report on the Inter-Governmental
Task Force on Data Quality and its efforts to harmonize quality assurance
practices on environmental data collection, analysis, evaluation, and use
across the Government user sector. Perspectives from EPA and other Federal
departments and agencies will be presented.
Session Manager: Mike Carter (U.S. EPA, Federal Facilities Restoration
and Reuse Office, Washington, DC)

-------
Speakers:
•	Jackie Sample (U.S. Navy, Naval Sea Systems Command, Charleston,
SC)
•	Stan Morton (U.S. Department of Energy, Idaho Falls, ID)
2:00 pm	Technical Plenary Session: Implementing Performance Based Measurement
Systems in a Regulatory Setting - Issues and Answers (Panel Discussion)
This plenary session will examine current issues affecting the implementation
of Performance-Based Measurement Systems (PBMS) in a regulatory setting
like that at EPA. The session will include an update on the current status of
PBMS implementation at EPA, discussion of the practicality aspects of PBMS
implementation, and perspectives from the user community.
Panel Moderator: Nancy Wentworth (Director, Quality Assurance
Division)
Panelists:
TBA
3:00 pm	Break
3:30 pm	Discussion Breakout Sessions
Following the plenary PBMS session, breakout sessions are planned to enable
small group discussions of key issues to occur. Each group will be asked to
consider one or more relevant issues and to develop responses based on
consensus. Facilitators and recorders will be assigned to capture the results
and to report them to the plenary closing session on Thursday.
5:00 pm	Adjourn Tuesday activities

-------
WEDNESDAY, APRIL 14, 1999
8:00 - 9:30 am	Session W11: QA for Secondary Data Use - Lessons Learned
Developing and implementing appropriate QA and QC for data collected from
sources other than direct measurements is a growing issue. Data are
increasingly compiled from models, computerized data bases, the literature,
etc. for use in environmental decision making. This session examines some
experiences and techniques in the application of QA/QC to such "secondary
data."
Session Manager: Patricia Lafornara (U.S. EPA, QAD, Edison, NJ)
Speakers:
•	"Secondary Data Use - Lessons Learned" - Paul Mills (DYNCORP,
Reston, VA)
•	"ORD's Science Information Management System - Support for Data
Usability" - Linda Kirkland (U.S. EPA, QAD, Washington, DC)
•	CEIS Paper - Ronald Shafer (U.S. EPA, CEIS, Washington, DC)
Session W12: Implementing NELAP - Status Report (Panel Discussion)
A status report on the development and implementation of the National
Environmental Laboratory Accreditation Program (NELAP) will be presented,
including the status of NELAC standards, progress in accrediting authority
recognition, and accomplishments in laboratory accreditation.
Session Manager: Jeanne Mourrain (U.S. EPA, QAD, Research Triangle
Park, NC)
Panelists:
•	"NELAP Activities and Status" - Elizabeth Dutrow (U.S. EPA,
Washington, DC)
•	"ELAB Recommendations" - Jerry Parr (Catalyst, LLC, Evergreen,
CO)
•	"State Experience" - Richard Sheibley (Pennsylvania Department of
Environmental Protection, Harrisburg, PA)

-------
Session W13: The QA Professional as an Expert Witness in Court
The QA professional as a factual/expert witness. Previous "Science, law, and
QA" sessions focused on laboratory fraud and case law affecting the
admission of junk science in the courtroom. This session will address why a
QA professional would be called as a factual/expert witness, how to prepare
for the courtroom, how to conduct oneself in the presence of a jury, and how
to avoid being called.
Session Manager: George Michael Brilis (U.S. EPA, NERL, Las Vegas,
NV)
Speakers:
•	Dave Preston (Partner, Varnum, Riddering, Schmidt, and Howlett,
LLP, Grand Rapids, MI)
•	Jeffrey Worthington (U.S. EPA, Quality Assurance Division,
Washington, DC)
•	George Michael Brilis (U.S. EPA, NERL, Las Vegas, NV)
9:30 am	Break
10:00 - 11:30 am Session W21: Environmental QA/QC Practices - Emerging Issues in Research
The application of QA/QC to environmental programs often entails a variety of
concepts, practices, and procedures. This session examines experiences from
several QA/QC practices that have been applied to a variety of environmental
issues in a research setting but which may be applied to other organizations as
well.
Session Manager: Nancy Adams (U.S. EPA, APCD/NRMRL, Research
Triangle Park, NC)
•	"Facility Manuals: Documentation Unique to Research Facilities" -
Shirley Wasson (U.S. EPA, APCD/NRMRL, Research Triangle Park,
NC)
•	"Auditing New Technologies: Issues and Approaches to QA in an
R&D Setting" - Nancy Adams (U.S. EPA, APCD/NRMRL, Research
Triangle Park, NC)
•	"Quality Assurance Project Plan Development for Hydrological
Simulation Program FORTRAN (HSPF) Model - a Progress Report" -
Jean Chruscicki (U.S. EPA, Water Division, Region 5, Chicago, IL)

-------
Session W22: Implementing QA in Environmental Labs
The implementation of QA in environmental analytical laboratories involves a
broad array of technical issues and considerations. This session examines
several issues confronting laboratories as they seek to implement appropriate
quality practices in their operations.
Session Manager: Fred Siegelman (U.S. EPA, Quality Assurance
Division, Washington, DC)
Speakers:
•	"NELAC and PBMS: the Good, the Bad, and the Ugly" - Jerry Parr
(Catalyst, LLC, Evergreen, CO)
•	"Automation in Laboratory Information Management Systems: Myths
and Virtues" - Cesar Muedas (Brown and Caldwell, Nashville, TN)
•	"Visual Estimation: Damage to Data Quality Through Improper
Technique" - Denise K. MacMillan (U.S. Army Corps of Engineers,
Omaha, NE)
Session W23: A Sample Is a Sample Is a Sample? A Comparison of Different
Water Media Sampling and the QA Implications of Different Methods and
Protocols
Safe drinking water and clean surface water are some of our most important
natural resources, and a considerable amount of effort is spent collecting and
analyzing water samples. This session will address problems in sampling
different water media; sampling objectives; components of sample collection
(design, sampling methods, sample handling); laboratory methods; and data
analysis. The focus of the session is weighted equally between which methods and
equipment to select (i.e., quality of data) and where, when, and how much to
sample (i.e., quantity and representativeness of data).
This session seeks to address the following questions:
•	Are all water samples the same?
•	What is common/distinct among different water media (i.e.,
groundwater, surface water, drinking water, lakes and streams) in data
collection and in QA protocols?
•	How do sampling techniques for different water media affect analysis
of sampling results?
•	What aspects of sampling have the greatest impact on the quality of
results?
•	Where should most of the sampling dollars be spent?
Session Presenters: John Warren (U.S. EPA, QAD, Washington, DC)
Nancy Hassig (Battelle PNL, Richland, WA)

-------
Lunch Break
Session W31: Information Management for the QA Professional
(Mini-Workshop)
This session will help participants to understand the roles that policy,
technology, and coalition-building all play in developing and maintaining
effective information management infrastructures. The participants will also
learn what areas they need to address during the requirement phase, how to
analyze problems that arise during implementation and operation, and how to
bring together the best teams for building effective information systems.
Session Presenters: Greg Fraser and David Blair (Information Technology
University, Central Intelligence Agency, Washington,
DC)
Session W32: Environmental QA/QC Practices - Emerging Issues on
Questionable Practices
The application of QA/QC to environmental programs often entails a variety of
concepts, practices, and procedures. This session examines experiences from
several QA/QC practices that have been applied to a variety of environmental
issues involving questionable practices.
Session Manager: Frederic Siegelman (U.S. EPA, Quality Assurance
Division, Washington, DC)
•	"The Use and Abuse of QA/QC Samples in Environmental Data
Acquisition" - Emma P. Popek (IT Corporation, Martinez, CA) and
Garabet Kassakhian (Consultant, Glendale, CA)
•	"Questionable Practices in the Laboratory" - Joseph Solsky (U.S. Army
Corps of Engineers, Omaha, NE)
Session W33: Benchmarking - Processes and Lessons Learned
Benchmarking has been an effective tool for quality assurance/quality control
managers in manufacturing operations for assessing the effectiveness of systems
and processes through comparative analysis. This practice has not been applied
as widely to environmental work as to manufacturing. This session
examines some of the basics of benchmarking and the application of these
techniques to current work.
Session Manager: Esperanza Renard (U.S. EPA, Quality Assurance
Division, Edison, NJ)

-------
Speakers:
•	"Benchmarking - Processes and Lessons Learned" - Silky Labie
(Florida Department of Environmental Protection, Tallahassee, FL)
•	"Lessons Learned While Using the Technical Project Planning (TPP)
Process" - Heidi Novotny and Larry Becker (U.S. Army Corps of
Engineers, Washington, DC)
•	"Beyond Environmental Data - Using the Malcolm Baldrige Criteria to
Assess Organizations" - Mark Doehnert (U.S. EPA, ORIA,
Washington, DC)
2:30 pm	Break
3:00 - 4:30 pm	Session W41: Information Management for the QA Professional
(Mini-Workshop continued from Session W31)
Session W42: Putting QA into Operation: Planning, Implementation, and
Assessment
This session examines techniques employed by quality professionals in order to
plan, implement, and assess the effectiveness of QA practices applied to
environmental programs.
Session Manager: Tom Dixon (U.S. EPA, Quality Assurance Division,
Washington, DC)
Speakers:
•	"Operational Quality Assurance - 'Completing' the End State" - Dave
Bottrell and Kevin Kelkenberg (U.S. DOE, Office of Site Operations)
and Tim Harms (U.S. DOE, Office of Waste Management)
•	"The Significance of Analytical Error in Project Planning,
Implementation, and Assessment" - Cliff J. Kirchmer (Washington
State Department of Ecology, Olympia, WA)
•	"Pitfalls in Performance Auditing" - Nancy Adams (U.S. EPA,
APPCD/NRMRL, Research Triangle Park, NC)
Session W43: Environmental QA/QC Practices - Oversight and Implementation
The application of QA/QC to environmental programs often entails a variety of
concepts, practices, and procedures. This session continues to examine
experiences from several QA/QC practices that have been applied to oversight
and implementation issues.

-------
Session Manager:	Esperanza Renard (U.S. EPA, Quality Assurance
Division, Edison, NJ)
Speakers:
•	"Superfund QA Oversight - Where Do We Go from Here?" - Joan Fisk
and Duane Geuder (U.S. EPA, Office of Emergency and Remedial
Response, Washington, DC) and Conrad Kleveno and Marguerite
Jones (DYNCORP, Reston, VA)
•	"An Alternative Approach to Quality Assessment Sample Allocation" -
Daniel Michael (Neptune and Company, Los Alamos, NM) and
Katherine Campbell (Los Alamos National Laboratory, Los Alamos,
NM)
•	"Remedial Process Quality Assurance and Optimization" - Javier
Santillan, Jeffrey Perl, and Chung Yen (USAF, AFCEE/ERC, Brooks
AFB, TX)
4:30 pm	Adjourn Wednesday activities
THURSDAY, APRIL 15, 1999
8:00 - 9:30 am	Session T11: Cost of Quality - Real Dollars or Otherwise
Determining the cost of quality has often been a struggle, particularly for
environmental operations wherein QA/QC is largely a prevention mechanism.
When "poor data" are prevented from occurring because effective QA/QC has
been applied, it is difficult to determine the value of the savings in dollars.
Increasingly, however, techniques are emerging that enable users to determine
the cost of quality (or the lack thereof). This session examines some techniques
and related issues.
Session Manager: Jeffrey Worthington (U.S. EPA, QAD, Washington,
DC)
Speakers:
•	"Measuring the Cost of Quality - Analytical Program Considerations" -
Paul Mills (DYNCORP, Reston, VA) and Jeffrey Worthington (U.S.
EPA, QAD, Washington, DC)
•	"An Evaluation of the EPA Quality Assurance Annual Report and
Work Plan from the Perspective of a Traditional Cost of Quality
Model" - Lora Johnson (U.S. EPA, NERL, Cincinnati, OH)
•	"Efficient QA Through Consolidated Metrology" - Paul W. Groff (U.S.
EPA, APPCD/NRMRL, Research Triangle Park, NC)

-------
Session T12: Practical Approaches to Quality Auditing (Mini-Workshop)
Quality auditing is a continually evolving science. While there are basic
principles of auditing that underpin all types of audits, auditing environmental
programs often requires mastery of additional skills and techniques due, in
large part, to the diversity and breadth of the environmental sector. This mini-
workshop provides some practical approaches to quality auditing of
environmental programs and is intended to augment existing auditing skills.
The session will be highly interactive and attendees will be asked to contribute
to the discussions from their own experiences.
Session Presenter: Craig R. Mesler (QMS, Uncasville, CT)
Session T13: Implementing QA in Health and Ecological Research
The implementation of suitable and effective QA/QC in health and ecological
research studies may not always follow traditional practices. Such is the
nature of research in general and especially in health and ecological settings
in which well-known QA/QC tools and techniques may not be applicable. This
session examines experiences by the research community in seeking solutions
to implementation problems.
Session Manager: Joe LiVolsi (U.S. EPA, NHEERL/AED, Narragansett,
RI)
•	"Quality Assurance Reviews at the National Health and Environmental
Effects Research Laboratory, Mid-Continent Ecology Division,
Duluth" - Allan R. Batterman (U.S. EPA, NHEERL/MED, Duluth,
MN)
•	"Quality Assurance for the Lake Michigan Mass Balance Modeling
Project" - W.L. Richardson, D.D. Endicott, and K.R. Rygwelski (U.S.
EPA, NHEERL/MED, Grosse Ile, MI)
•	"NHEERL Approach to Technical Systems Reviews of Health Effects
Studies" - B. Michael Ray and Brenda T. Culpepper (U.S. EPA,
NHEERL, Research Triangle Park, NC)
9:30 am	Break
10:00 - 11:30 am Session T21: Field Analytical Methods - Experiences and Lessons Learned
The use of field analytical methods offers considerable promise in speeding up
the collection and analysis of field samples. This session discusses the use of
such techniques from several perspectives including Superfund site
remediation, legal defensibility, and lessons learned from actual application of
such methods.
Session Manager: Robert Hitzig (U.S. EPA, Office of Emergency and
Remedial Response, Washington, DC)

-------
Speakers:
•	"Using Field Methods - Experiences and Lessons Learned" - Bart
Simmons (California EPA, Berkeley, CA)
•	"Applying Field Analytical Methods: Case Studies and Lessons
Learned" - Kira Lynch (U.S. Army Corps of Engineers, Seattle, WA)
•	"Using Field Methods in Superfund Projects" - Robert Hitzig (U.S.
EPA, Office of Emergency and Remedial Response, Washington, DC)
Session T22: Practical Approaches to Quality Auditing
(Mini-Workshop continued from Session T12)
Session T23: What's Happening in International Standards and Why You
Should Care
Recent and forthcoming changes in international consensus standards
pertaining to quality and environmental management and laboratories may
have implications for environmental programs in the government and private
sectors. Currently, ISO 9001 and ISO 9004 on quality management are being
revised. Within a few months, ISO 14001 and ISO 14004 on environmental
management will also be revised. Both sets of standards are being made more
"compatible" with each other, and the quality and environmental auditing
standards are being combined into a single standard. In addition, ISO Guide
25 is being elevated to an international standard as ISO 17025. This session
examines the changes taking place and discusses possible implications for
traditional environmental programs that may be affected by these standards.
Session Manager: Gary Johnson (U.S. EPA, Quality Assurance Division,
Research Triangle Park, NC)
Speakers:
•	"Overview and use of Consensus Standards" - Jeffrey Worthington
(U.S. EPA, Quality Assurance Division, Washington, DC)
•	"Revisions of the ISO 9001 Quality Management System and ISO
14001 Environmental Management Systems Standards - Increased
Compatibility or Confusion?" - Gary Johnson (U.S. EPA, Quality
Assurance Division, Research Triangle Park, NC)
•	"Integration of the ISO 10011 Quality Auditing and ISO 14010
Environmental Auditing Standards - What Does It Mean to the User?"
- Michael A. Ross (Registrar Accreditation Board, Milwaukee, WI)
•	"ISO Guide 25 to International Standard ISO 17025" - Paul Mills
(Mentorprises, Inc., Reston, VA)
11:30 am	Lunch Break

-------
1:00 - 2:30 pm	Plenary Wrap-up Session: Reports from Breakout Sessions
The accomplishments of the conference will be summarized. In particular, the
results from the breakout sessions on PBMS implementation issues will be
presented.
Session Moderator: Nancy Wentworth (U.S. EPA, Quality Assurance
Division, Washington, DC)
2:30 pm	Adjourn Conference
FRIDAY, APRIL 16, 1999
8:00 am - 12:00 noon EPA QA Managers Meetings (Closed to public)

-------

-------
18TH ANNUAL NATIONAL CONFERENCE ON MANAGING QUALITY SYSTEMS
FOR ENVIRONMENTAL PROGRAMS
Quality for the Next Millennium
April 11-15, 1999
Regal Cincinnati Hotel
Cincinnati, Ohio
LIST OF PARTICIPANTS
Susan J. Abbgy
Al Alwan
Nancy Barmakian
QA Coordinator
Chemist, Water Division
Chief, Quality Assurance Unit
Atmospheric Science & Applied
U.S. EPA - Region 5
U.S. EPA
Technology
77 West Jackson Boulevard
60 Westview Street
Battelle Memorial Institute
Chicago, IL 60604
Lexington, MA 02421
505 King Avenue
Phone: (312)353-2004
Phone: (781)860-4684
Columbus, OH 43201
Fax: (312) 886-0168
Fax:(781)860-4397
Phone: (614)424-4194
E-mail: alwan.al@epamail.epa.gov
E-mail:
Fax: (614) 424-3638

barmakian.nancy@epamail.epa.gov
E-mail: abbgys@battelle.org
Amelia (Amy) I. Arceo


Senior QA Specialist
Allan Batterman
Nancy Adams
CTAC/MDM/Lamb, IncVQA
Quality Assurance Manager
QA Manager, APPCD/NRMRL
WIPP-DOE Carlsbad Area Office
ORD, NHEERL, MED-Duluth
APPCD, Technical Services Branch
P.O. Box 2074
U.S. EPA
U.S. EPA
Carlsbad, NM 88221-2074
6201 Congdon Blvd
Mail Drop 91
Phone: (505) 234-3208
Duluth, MN 55804
Research Triangle Park, NC 27711
Fax: (505) 234-3195
Phone: (218) 529-5027
Phone:(919)541-5510
E-mail: aarceo@wipp.carlsbad.nm.us
Fax: (218)529-5015
Fax: (919)541-0496

E-mail: Batterman.Allan@epa.gov
E-mail:
Gavin Armstrong

adams.nancy@epamail.epa.gov
QA/QC Coordinator
Libby Beach

Div. of Emergency & Remedial
QA Manager
James R. Agin
Response
Raleigh/RTP
Microbiology Supervisor
Ohio EPA
ARCADIS Geraghty & Miller
Consumer Analytical Laboratory
Lazarus Government Center
4915 Prospectus Drive
Ohio Department of Agriculture
P.O. Box 1049
Durham, NC 27713
8995 East Main Street
Columbus, OH 43216-1049
Phone: (919)544-4535
Reynoldsburg, OH 43068
Phone: (614) 644-4850
Fax: (919) 544-5690
Phone: (614) 7280198
Fax: (614) 644-3146
E-mail: lbeach@acurex.com
Fax: (614) 728-6322
E-mail:

E-mail: agin@odant.agri.state.oh.us
gavin.armstrong@epa.state.oh.us
Larry Becker


QAM
Larry Alderson
Ernest Arnold
CEMP-RA/HQ
Supervisor, Superfund/RCRA Unit
Regional Quality Assurance Manager
U.S. Army COE
Environmental Services Program
Region VII
20 Massachusetts Avenue, NW
Department of Natural Resources
U.S. EPA
Washington, DC 203141000
2710 West Main Street
25 Funston Road
Phone: (202) 761-8882
Jefferson City, MO 65109
Kansas City, KS 66115
Fax: (202) 761-4879
Phone: (573) 526-3364
Phone: (913) 551-5194
E-mail: larryd.becker@usace.army.mil
Fax: (573) 526-3350
Fax:(913)551-5218

E-mail:
E-mail: arnold.ernie@epa.gov
Berne Bennett
nraldel@mail.dnr.state.mo.us

Chemist
HEASD/SACB
U.S. EPA
Research Triangle Park, NC 27711
Phone: (919) 541-2366
Fax: (919) 541-0960
E-mail:
Bennett.Bernie@epamail.epa.gov

-------
Gary Bennett
Quality Assurance Manager
Office of Quality Assurance
U.S. EPA - Region 4
980 College Station Road
Athens, GA 30605
Phone: (706) 355-8551
Fax: (706) 355-8803
E-mail:
bennett.gary@epamail.epa.gov
Malcolm Bertoni
Senior Research Environmental
Scientist
Center for Environmental
Measurements & QA
Research Triangle Institute
1615 M Street, NW
Suite 740
Washington, DC 20036
Phone: (202) 728-2067
Fax: (202) 728-2095
E-mail: mjb@rti.org
Elizabeth Betz
QA Manager
Human Exposure & Atmospheric
Sciences Division
U.S. EPA
NERL-RTP, MD-77
Research Triangle Park, NC 27711
Phone:(919)541-1535
Fax: (919) 541-0239
E-mail:
betz.elizabeth@epamail.epa.gov
Katherine (Katie) Biggs
Associate Director
NEPA Compliance Division/OECA
U.S. EPA
Office of Federal Activities (2252A)
401 M Street, SW
Washington, DC 20460
Phone: (202) 564-7144
Fax: (202) 564-0072
E-mail:
biggs.katherine@epamail.epa.gov
David Blair
Info. Technology University/CIA
Washington, DC
Anthony Blake
Associate Environmental Scientist
Harding Lawson Associates
90 Digital Drive
Novato, CA 94949
Phone: (415) 884-3186
Fax: (415) 884-3300
E-mail: tblake@harding.com
Louis Blume
QA Manager
GLNPO
U.S. EPA
77 W. Jackson
Chicago, IL 60604
Phone:(312)353-2317
Fax: (312)353-2018
Kevin Bolger
Chemist
QA Core/ RMD
U.S. EPA - Region 5
77 West Jackson Blvd, M-9J
Chicago, IL 60604
Phone: (312) 886-6762
Fax:(312)353-4342
E-mail: bolger.kevin@epa.gov
Tom Bond
Senior Engineering Tech.
Comarco Systems, Inc.
RR 6, Box 28
Bloomfield, IN 47424
Phone: (812) 384-3587 x267
Fax: (812)384-3744
E-mail: TBOND@comarcosystems-
blfd.com
Patricia Boone
Chemist
Waste, Pesticides & Toxics Division
U.S. EPA - Region 5
77 West Jackson Blvd., DT-8J
Chicago, IL 60604-3590
Phone: (312) 886-3172
Fax: (312) 353-4788
E-mail: boone.patricia@epa.gov
Alison Boshes
Environmental Scientist
Environmental Research Planning
Department
Research Triangle Institute
1615 M Street, NW
Suite 740
Washington, DC 20036
Phone: (202) 728-2488
Fax: (202) 728-2095
E-mail: amb@rti.org
Dave Bottrell
Chemist
Site Operations
U.S. Department of Energy
19901 Germantown Road, CLOV
Germantown, MD 208741290
Phone:(301)903-7251
Fax:(301)903-7613
E-mail: david.bottrell@em.doe.gov
Timothy Bowren
Chemist
Office of Water Management /
Assessment Branch
Ind. Dept. of Environmental
Management
P.O. Box 6015 (65-40-2)
Indianapolis, IN 46206-6015
Phone: (317)308-3181
Fax:(317)308-3219
E-mail:
TBOWREN@DEM.STATE.IN.US
Carl Brickner
Environmental Scientist
U.S. EPA
75 Hawthorne Street (PMD3)
San Francisco, CA 94105
Phone: (415)744-1536
Fax:(415)744-1476
George M. Brilis
QA Manager
NERL/ESD
U.S. EPA
P.O. Box 93478
Las Vegas, NV 89193
Phone: (702) 798-3128
Janice Brown
Quality Assurance Manager
NHEERL/RTD
U.S. EPA
MD-71 NHEERL Bid.
Research Triangle Park, NC 27711
Phone: (919) 541-0331
Fax: (919) 541-1499
E-mail: brown.janice@epamail.epa.gov
Rex C. Bryan
Senior Geostatistician
Contractor
DynCorp
2221 East Street
Golden, CO 80401
Phone: (303) 277-0070
Fax: (303) 278-4749
E-mail: rexbryan@compuserve.com
Christian Byrne
Quality Assurance Manager
BEAD/Environmental Chemistry
Laboratory
U.S. EPA
Building 1105
Stennis Space Center, MS 39529
Phone: (228) 688-3213
Fax: (228) 688-3536
E-mail: byrne.christian@epa.gov

-------
Katherine Campbell
Kenyon C. Carlson
Environmental Program Supervisor
ADEQ QA Unit
AZ Dept. Environmental Quality
3033 N. Central Avenue
Phoenix, AZ 850122809
Phone: (602) 207-4866
Fax: (602) 207-4705
E-mail:
carlson.kenyon@ev.state.az.us
Constance Carpenter
Laboratory Manager
Springfield Environmental Inc.
1001 East Street
PO Box 2728
Springfield, OH 45501-2728
Phone: (937)324-8001
Fax: (937)324-5185
E-mail: clcarpenter@email.msn.com
Mike Carter
Physical Scientist
Federal Facilities Restoration &
Reuse Office
U.S. EPA
Mail Code 5101
401 M St., SW
Washington, DC 20460
Phone: (202) 260-5686
Fax: (202) 2605486
E-mail: carter.mike@epa.gov
James Chambers
QA Engineer
QA/R Data Quality
DOE-Fernald
P.O. Box 538704
Cincinnati, OH 452538704
Phone: (513)648-7543
Fax:(513)648-7598
E-mail:
james.chambers@fernald.gov
Luke Charpentier
QA Coordinator
Policy and Planning Division
MN Pollution Control Agency
520 Lafayette Road
St. Paul, MN 55155
Phone:(651)296-8445
Fax:(651)297-8676
E-mail:
luke.charpentier@pca.state.mn.us
Jean Chruscicki
Water Division
U.S. EPA - Region 5
Chicago, IL
Patrick Churilla
Lab Certification Program Manager
Water Division
U.S. EPA - Region 5
77 W. Jackson
WD-15J
Chicago, IL 60604
Phone:(312)353-6175
Fax: (312) 886-6171
E-mail: churilla.patrick@epa.gov
Bob Clark
Environmental Engineer
U.S. EPA-Region 6
1445 Ross Ave.
Dallas, TX 75202
Phone: (214) 665 6487
Fax: (214) 665-2168
E-mail: clark.robertc@epamail.epa.gov
Steve Clark
Engineer, Drinking Water Division
U.S. EPA
401 M Street, SW (4601)
Washington, DC 20460
Phone: (202) 260-7159
Fax: (202) 260-4383
E-mail: clark.stephen@epa.gov
Roy Cohen
Quality Assurance Officer
Analytical Laboratory Services
Fluor Daniel Fernald
P.O. Box 538704, MS-35
Cincinnati, OH 45253-8704
Phone: (513) 648-4226
Fax:(513)648-5451
E-mail: Roy_Cohen@fernald.gov
Harriet Colbert
Information Management Specialist
OW/Office of GW & DW
U.S. EPA
401 M ST., SW (4607)
Washington, DC 20460
Phone: (202) 260-2302
Fax: (202) 260-3762
E-mail:
COLBERT.HARRIET@EPA.GOV
Cheryl Cole
Environmental Specialist
Division of Underground Storage Tanks
TN Dept. of Environment &
Conservation
401 Church Street
L&C Tower, 4th Floor
Nashville, TN 372431541
Phone: (615) 532-0985
Fax: (615) 532-0938
E-mail: ccol2@mail.state.tn.us
Sharon R. Coleman
Quality Assurance Specialist
TNRCC
P.O. Box 13087, MC-176
Austin, TX 787113087
Phone: (512) 239-6340
Fax: (512)239-6390
E-mail: scoleman@tnrcc.state.tx.us
James P. Cross
Environmental Scientist
Fluor Daniel Fernald
3246 Floridale Lane
Cincinnati, OH 45239
Phone: (513) 648-7537
Fax: (513)648-7595
E-mail: james_cross@fernald.gov
Brenda Culpepper
QA Manager
ORD/NHEERL/Experimental
Toxicology Div.
U.S. EPA
86 Alexander Drive (MD-66)
Research Triangle Park, NC 27711
Phone: (919)541-0153
Fax:(919)541-4284
E-mail: culpepper.brenda@epa.gov
Richard L. Daddow
Hydrologist/Contract Officer's Rep.
DODEC Program
U.S. Geological Survey
P.O. Box 25046
DFC, MS-425
Denver, CO 80225
Phone:(303)236-0156
Fax: (303) 236-5046
E-mail: rldaddow@usgs.gov
Anamary R. Daniel
President/CEO
Informatica International, Inc.
795 Main Street, West, 2nd Floor
Oak Ridge, TN 37830
Phone: (423) 220-8700
Fax:(423)481-3234
E-mail: danielar@ick.net
Robert Darley
Chemist
NAVSEA
U.S. Navy
1661 Redbank Road
Goose Creek, SC 29445
Phone: (843) 764-7337 x20
Fax: (843) 764-7360
E-mail:
darley_skip@hq.navsea.navy.mil

-------
Tim Dawson
Mark Doehnert
Carl Eshelman
Divisional QA Officer
Quality Assurance Manager
Quality Assurance Supervisor
U.S. EPA - Region 6
OAR/ ORIA
Consumer Analytical laboratory
1445 Ross Ave.
U.S. EPA
Ohio Department of Agriculture
Dallas, TX 75202
6602J
Building Three
Phone: (214) 665-2218
401 M Street, SW
8995 East Main Street
Fax: (214) 665-8072
Washington, DC 20460
Reynoldsburg, Ohio 43068
E-mail:
Phone: (202) 564-9386
Phone: (614) 728-2619
dawson.timothy@epamail.com
Fax: (202) 565-2042
Fax:(614)728-6322

E-mail: doehnertmark@epa.gov
E-mail: eshelman@odant.agri.state.oh.us
Donalea Dinsmore


QA Coordinator
Lisa Doucet
Larry Fade
Bureau of Integrated Science
Quality Assurance Officer
Chemist
Services
Region 6 Laboratory
San Francisco District, DMMO
WI DNR
U.S. EPA
U.S. Army Corps of Engineers
P.O. Box 7921
10625 Fallstone
333 Market Street
Madison, WI 53707
Houston, TX 77099
Room 8111
Phone: (608) 266-8948
Phone:(281)983-2129
San Francisco, CA 94105-2197
Fax: (608) 266-5226
Fax:(281)983-2124
Phone: (415) 977-8477
E-mail: dinsmd@dnr.state.wi.us
E-mail: doucet.lisa@epa.gov
Fax: (415) 977-8483


E-mail: lfade@spd.usace.army.mil
John Dirgo
Lauren Drees

QA Manager
QA Manager
Nabil Fayoumi
TetraTech EM Inc.
Sustainable Technology Division
Environmental Scientist, WPTD
200 E Randolph Drive, Suite 4700
USEPA NRMRL
U.S. EPA - Region 5
Chicago, IL 60601
26 W. Martin Luther King Dr.
77 W. Jackson Boulevard
Phone: (312) 856-8765
Cincinnati, OH 45268
DRP-8J
Fax: (312)938-0118
Phone: (513)569-7087
Chicago, IL 60604
E-mail: dirgoj@ttemi.com
Fax: (513) 569-7585
Phone: (312) 886-6840

E-mail: drees.lauren@epamail.epa.gov
Fax: (312)353-4788
Khouane Ditthavong

E-mail: fayoumi.nabil@epa.gov
Environmental Scientist
Elizabeth Dutrow

Office of Water
Chemist, Quality Assurance Division
Loreto Ferrada
U.S. EPA
U.S. EPA
Technical Assistant
401 M Street, SW
401 M Street, SW (8724R)
Informatica International, Inc.
Washington, DC 20460
Washington, DC 20460
795 Main Street, West, 2nd Floor
Phone: (202) 260-6115
Phone: (202) 564-9061
Oak Ridge, TN 37830
Fax: (202) 260-7185
Fax:(202)565-2441
Phone: (423) 220-8700
E-mail:

Fax: (423)481-3234
ditthavong.khouane@epa.gov
Joe Elkins
E-mail: ferradali@ick.net

QA Manager, OAQPS

Thomas Dixon
U.S. EPA
Luba Finkelberg
Quality Assurance Division
MD-14
Chemist, Superfund Division
U.S. EPA
RTP, NC 27711
U.S. EPA - Region 5
401 M Street, SW
Phone: (919)541-5653
77 W. Jackson
Washington, DC 20460
Fax: (919) 541-5678
Chicago, IL 60604
Phone: (202) 564-6877
E-mail: elkins.joe@epamail.epa.gov
Phone: (312) 353-7712
Fax: (202) 565-2441

Fax: (312)353-9281
E-mail: dixon.thomas@epa.gov
D.D. Endicott
E-mail: finkelberg.luba@epa.gov

NHEERL/MED

Jacquelyn Doan
U.S. EPA
Joan Fisk
QA/QC Chemist
Grosse Ile, MI
Chemist/Regional Coordinator
Environmental Qual. Management,

OERR
Inc.

U.S. EPA
1310 Kemper Meadow Drive

401 M Street, SW (5204G)
Cincinnati, OH 45240

Washington, DC 20460
Phone: (800) 229-7495

Phone: (703) 603-8791
Fax:(513)825-7495

Fax: (703)603-9104
E-mail: jdoan@eqm.com

E-mail: fisk.joan@epa.gov

-------
Paul Fitzgerald
QA Officer
Duke Power Analytical Lab
Duke Energy
MG03A2
13339 Hagers Ferry Rd.
Huntersville, NC 28078
Phone: (704) 875-5208
Fax: (704) 875-5038
E-mail: pjfitzge@duke-energy.com
Vance Fong
Quality Assurance Manager
Region 9
U.S. EPA
PMD-3
75 Hawthorne Street
San Francisco, CA 94105
Phone: (415)744-1492
Fax: (415)744-1476
E-mail:
fong.vance@epamail.epa.gov
Alfredo O. Fontanilla
Chemist 3/QA Officer
Laboratory Services
Washington State Department of
Agriculture
21 North 1st Avenue, Suite 106
Yakima, WA 98901
Phone: (509) 575-2759
Fax:(509)454-7699
Alexa Fraser
Senior Project Director
Environmental Studies Area
Westat
1650 Research Boulevard
Room RP4044
Rockville, MD 208503129
Phone: (301)294-2842
Fax: (301)294-2829
E-mail: Fraseral@westat.com
Greg Fraser
Info. Technology Univ./CIA
Washington, DC
Reinhard Friske
Quality Assurance
Soil & Water Characterization
Project
Fluor Daniel Fernald
P.O. Box 538704
Cincinnati, OH 45253
Phone: (513)648-5477
Fax: (513)648-3973
E-mail:
reinhard_friske@fernald.gov
Duane Geuder
QA Manager
Office of Emergency & Remedial
Response
U.S. EPA
401 M Street, SW
Washington, DC 20460
Clifford R. Glowacki
Senior Program Coordinator
Environmental Analytical Services
Division
Ashland Chemical Company
R&D/Analytical Services & Technology
5200 Blazer Parkway
Dublin, OH 43107
Phone: (614) 790-3482
Fax: (614) 490-4294
E-mail: cglowacki@ashland.com
Paul Golden
Chemist
OPPTS/OPP/BEAD/ACB
U.S. EPA
701 Mapes Road
Fort Meade, MD 20755-5350
Phone: (410) 305-2960
Fax: (410) 305-3091
E-mail: golden.paul@epamail.epa.gov
John Goldsmith
QA Chemist
Great Lakes National Program Office
U.S. EPA
77 West Jackson Blvd., G-17J
Chicago, IL 60604
Phone: (312)353-5002
Fax:(312)353-2018
E-mail:
goldsmith.john@epamail.epa.gov
Michael Goodis
Chemical Review Manager
OPP/SRRD
U.S. EPA
401 M Street, SW (7508C)
Washington, DC 20460
Phone: (703)308-8157
Fax:(703)308-8041
E-mail: goodis.michael@epa.gov
Paul Groff
Environmental Scientist
Air Pollution Prevention Control
Division/Technica
U. S. EPA - ORD/NRMRL
MD-91
86 Alexander Drive
RTP, North Carolina 27711
Phone: (919) 541-0979
Fax:(919) 541-0496
E-mail: groff.paul@epa.gov
Diane Guthrie
Environmental Engineer
Office of Quality Assurance
U.S. EPA
960 College Station Road
Athens, GA 30605
Phone: (706)355-8622
Fax:(706)355-8803
E-mail: guthrie.diane@epamail.epa.gov
Tim Harms
Office of Waste Management
U.S. DOE
Tom Harper
Personnel Officer
OAR, OMS, DOD, HRO
U.S. EPA
2000 Traverwood Drive
Ann Arbor, MI 48105
Phone: (734)214-4308
Fax: (734)214-4550
E-mail:
HARPER.THOMAS@EPA.GOV
Kathleen Harris
QA Officer
Indiana State Chemistry Department
Purdue University
1154 Biochemistry
W. Lafayette, IN 47907
Phone: (765) 494-1549
Fax:(765)494-4331
E-mail: harrisk@isco.purdue.edu
Nancy Hassig
Senior Research Scientist
Battelle Pacific Northwest Labs.
P.O. Box 999
Richland, WA 99352
Phone: (415) 969-3969
Fax: (415)967-4170
RaeAnn Haynes
QA Manager, Laboratory
OR Department of Environ. Quality
1712 SW 11th Avenue
Portland, OR 97201
Phone: (503) 229-5983
Fax: (503) 229-6924
E-mail: haynes.raeann@deq.state.or.us
Rosemarie Hemmen
Quality Assurance Chemist
Bureau of Public Health Laboratories
Ohio Department of Health
1571 Perry Street
P.O. Box 2568
Columbus, OH 432162568
Phone: (614) 644-4639
Fax:(614) 421-2324
E-mail: rhemmen@gw.odh.state.oh.us

-------
Kathy Hillig
Team Leader, Prod. Regs. &
Analytical Services
Corporate Ecology
BASF Corporation
1609 Biddle Avenue
Wyandotte, MI 48192
Phone: (734) 324-6334
Fax: (734) 324-5226
E-mail: hilligk@basf.com
Robert Hitzig
Environmental Scientist
Superfund Division
U.S. EPA
401 M Street, SW (MC-5204G)
Washington, DC 20460
Phone: (703) 603-9047
Fax: (703) 603-9112
E-mail: hitzig.robert@epa.gov
Janice Huang
Environmental Protection Specialist
Water Division
USEPA - Region 5
77 W. Jackson Blvd. (WA-16J)
Chicago, IL 60604
Phone: (312) 353-8228
Fax: (312) 886-7804
E-mail: huang.janice@epa.gov
Elizabeth Hunike
QA Specialist
Atmos Methods & Monitoring
Branch HEASD/NERL
U.S. EPA
MD-46
Research Triangle Park, NC 27711
Phone: (919)541-3737
Fax: (919) 541-1153
E-mail:
Hunike.Elizabeth@epamail.epa.gov
Margo Hunt
Environmental Scientist
Division of Environmental Science
and Assessment
U.S. EPA - Region 2
2890 Woodbridge Ave.
Edison, NJ 08837
Phone: (732) 321-6606
Fax: (732) 321-6616
E-mail: hunt.margo@epa.gov
Bill Ingersoll
Chemist
NAVSEA
U.S. Navy
1661 Redbank Road
Goose Creek, SC 29445
Phone: (843)764-7337
Fax: (843) 764-7360
E-mail:
ingersoll_william_s@hq.navsea.navy.mil
Syed Iqbal
Quality Assurance Scientist
Measurement & Analysis Research
Division
Environment Canada
4905 Dufferin Street
Downsview, Ontario M3H 5T4
CANADA
Phone: (416) 739-4827
Fax: (416)739-5704
E-mail: syediqbal@ec.gc.ca
Terry Jackson
QA Officer
Division of Inspection Services
CA Department of Food & Agriculture
Center for Analytical Chemical
3292 Meadowview Road
Sacramento, CA 95832
Phone: (916) 262-1498
Fax:(916) 262-1572
E-mail: tjackson@cdfu.ca.gov
Roland E. Jenkins
Division Chief
Consumer Analytical Laboratory
Ohio Department of Agriculture
8995 E. Main Street, Bldg. 3
Reynoldsburg, OH 43068
Phone: (614) 728-6230
Fax:(614)728-6322
E-mail: jenkins@odant.agri.state.oh.us
Wiley Jenkins
Laboratory Quality Specialist
Division of Laboratories
Illinois Department of Public Health
825 North Rutledge
Springfield, IL 62702
Phone: (217) 524-6228
Fax: (217)524-7924
E-mail: wjenkins@IDPH.state.il.us
David Johnson
Senior Quality Engineer
Technical Services Division
Westinghouse Savannah River Co.
766-H / 2432
Aiken, SC 29808
Phone: (803) 208-0930
Fax:(803)208-0920
E-mail: david.johnson@srs.gov
Don Johnson
QA Officer, Superfund Division
U.S. EPA Region 6
1445 Ross Avenue, 6SF-D
Dallas, TX 75202
Phone: (214) 665-8343
Fax: (214)665-7330
E-mail: johnson.donald@epa.gov
Gary Johnson
Environmental Engineer
Quality Assurance Division
U.S. EPA
MD-75
Research Triangle Park, NC 27711
Phone: (919) 541-7612
Fax: (919) 541-4261
E-mail: johnson.gary@epamail.epa.gov
Kimberly Johnson
Environmental Scientist, QAD
Jason Associates Corporation
477 Shoup Ave, Ste 201
Idaho Falls, ID 83402
Phone: (208) 522-1662
Fax: (208) 522-2076
E-mail: kjohnson@jason.com
Lora Johnson
Director of Quality Assurance
National Exposure Research Lab.
USEPA
26 W. MLK Drive
Cincinnati, OH 45268
Phone: (513)569-7299
Fax: (513) 569-7424
E-mail: Johnson-lora@epamail.epa.gov
Jack Jojokian
QA Coordinator OECA/OSRE/PPED
U.S. EPA
401 M Street, SW
Washington, DC 20460
Phone: (202) 564-6058
Fax: (202) 564-0074
E-mail: jojokian.jack@epamail.epa.gov

-------
Maggie Jones
Senior Quality Assurance Analyst
Mission Support Services
DynCorp I&ET, Inc.
300 North Lee Street, Suite 500
Alexandria, VA 22314
Phone: (703) 519-1284
Fax: (703) 548-4766
E-mail: jonesme@dyncorp.com
Monica Jones
Environmental Scientist
ESD/OASQA/QAT
U.S. EPA - Region 3
701 Mapes Road
Fort Meade, MD 20755-5350
Phone: (410) 305-2747
Fax: (410) 305-3095
E-mail: jones.monica@epa.gov
Tony Jover
Sr. IRM Official
Office of Solid Waste and
Emergency Response
U.S. EPA
401 M Street, S. W.
Washington, DC 20460
Phone: (202) 260-2387
Fax: (202) 260-6754
E-mail: jover.tony@epa.gov
Edward Kantor
QA Representative
HEADS/HERB
U.S. EPA
944 East Harmon Ave.
Las Vegas, NV 89121
Phone: (703) 603-9021
Fax:(703)603-9112
E-mail: kantor.edward@epa.gov
James Kariya
Environmental Scientist
Off. of Science Coordination &
Policy
U.S. EPA
Office of Prevention, Pesticides, &
Toxic Substances
401 M Street, SW (7101)
Washington, DC 20460
Phone:(202)260-2916
Fax: (202) 401-0849
E-mail: kariya.jim@epa.gov
Garabet Kassakhian
Consultant
Glendale, CA
Kris Kehoe
Chemist, Office of Water
Management/Assessment Branch
Indiana Department of Environmental
Management
OWM/Assessment Branch
P.O. Box 6015 (65-40-02)
Indianapolis, IN 46206-6015
Phone: (317)308-3105
Fax: (317) 308-3219
E-mail: KKEHOE@dem.state.in.us
Kevin Kelkenberg
Office of Site Operations
U.S. DOE
Karyn K. Kennedy
QA Manager
Public Health Assurance Lab.
NE Health & Human Services
3701 S. 14th Street
Lincoln, NE 68502
Phone: (402) 471-8428
Fax: (402) 471-2080
E-mail: kk3sk@msn.com
Ann Kern
Quality Assurance Manager
Land Remediation and Pollution
Control Division
USEPA/ORD
26 W. Martin Luther King Dr.
Cincinnati, OH 45268
Phone: (513)569-7635
Fax: (513) 569-7585
E-mail: kern.ann@epamail.epa.gov
Cliff J. Kirchmer
Quality Assurance Officer
Environmental Assessment Program
WA Department of Ecology
P.O. Box 47600
Olympia, WA 47600
Phone: (360) 407-6455
Fax: (360) 407-6884
E-mail: ckir461@ecy.wa.gov
Linda Kirkland
ORD/NCERQA/Quality Assurance Div.
U.S. EPA
8724R
401 M Street SW
Washington, DC 20460
Phone: (202) 564-6873
Fax: (202) 565-2441
E-mail: Kirkland.Linda@EPA.gov
Conrad Kleveno
Senior Environmental Scientist
DynCorp, I & ET
300 N. Lee Street
Alexandria, VA 22314
Phone: (703)519-1172
Fax: (703) 739-2233
E-mail: klevenoc@dyncorp.com
Mark Krigbaum
Product Manager
Tekmar-Dohrmann
7143 E. Kemper Road
Cincinnati, OH 45249
Phone:(513)247-7038
Fax: (513) 247-7043
E-mail: markri@tekmar.com
Feng-Chao Kuo
QA Supervisor
Technical Department
Formosa Plastics Corporation
201 Formosa Drive
Point Comfort, TX 77978
Phone: (361) 987-7575
Fax:(361)987-7487
E-mail: chaokuo@ftpc.fpcusa.com
Sylvia ("Silky") Labie
Environmental Administrator
Quality Assurance Division
FL Dept. of Environmental Protection
2600 Blair Stone Road, MS6505
Tallahassee, FL 32399-2400
Phone: (850) 488-2796
Fax: (850) 922-4614
E-mail: labie_s@dep.state.fl.us
Patricia Lafornara
Environmental Scientist, QAD
U.S. EPA
2890 Woodbridge Ave. (MS-104)
Edison, NJ 08837-3679
Phone: (732) 906-6988
Fax: (732) 321-6640
E-mail: lafornara.patricia@epa.gov
Leslie Laing
QA/QC Mgr
Springfield Environmental
PO Box 2728
Springfield, OH 45501
Phone: (937) 324-8001
Fax: (937) 324-5185
E-mail: Charlielaing@compuserve.com

-------
Norrell Lantzer
Senior Operations Research Analyst
Comarco Systems Inc.
RR6, Box 28
Bloomfield, IN 47242
Phone: (812)384-3587 x267
Fax: (812)384-3744
E-mail:
lantzer@eodmgate.navsea.navy.mil
Moira Lataille
Chemist, Quality Assurance Unit
U.S. EPA
60 Westview Street
Lexington, MA 02421
Phone: (781)860-4312
Fax: (781)860-4397
E-mail:
lataille.moira@epamail.epa.gov
Sharon Laycock
QA/QC Chemist
Environmental Qua!. Management,
Inc.
1310 Kemper Meadow Drive
Cincinnati, OH 45240
Phone: (513) 825-7500
Fax: (513) 825-7945
E-mail: slaycock@eqm.com
Thomas Leek
Water Resource Specialist III
New Mexico Environment
Department
NM State Government
4131 Montgomery Boulevard, NE
Albuquerque, NM 87110
Phone: (505) 841-9479
Fax: (505) 881-9645
E-mail:
thomas_leck@nmenv.state.nm.us
Santiago Lee
Chemist
Weiss Associate
5500 Shellmound Street
Emeryville, CA 94608
Phone: (510) 450-6156
Fax: (510) 547-5043
E-mail: sml@weiss.com
Ida Levin
QA Team Leader
Superfund Division
U.S. EPA - Region 5
77 West Jackson
SMF-4J
Chicago, IL 60604
Phone: (312) 886-6254
Fax: (312) 353-9281
E-mail: levin.ida@epamail.epa.gov
Mary Ellen Ley
QA Coordinator, ICPRB
Chesapeake Bay Program
410 Severn Ave. Suite 109
Annapolis, MD 21403
Phone: (410) 267-5750
Fax: (410)267-5777
E-mail: ley.mary@epamail.epa.gov
Joe LiVolsi
NHEERL/AED
U.S. EPA
Narragansett, RI
Robert Lordo
Principal Research Scientist
Statistics and Data Analysis Systems
Battelle
505 King Avenue
Columbus, OH 43201
Phone:(614)424-4516
Fax: (614) 424-4250
E-mail: lordor@battelle.org
Mary Lumpkin
Physical Science Tech, HEASD/EMMB
U.S. EPA - NERL
EPA Annex
79 Alexander Drive, MD-44
Research Triangle Park, N.C. 27711
Phone: (919) 541-4292
Fax: (919) 541-3527
E-mail:
lumpkin.susan@epamail.epa.gov
Kira Lynch
U.S Army COE
Seattle, WA
Carol L. Lynes
Environmental Scientist
Division of Environmental Science &
Assessment
U.S. EPA
Hazardous Waste Support Section
2890 Woodbridge Avenue
Edison, NJ 08837
Phone:(732)321-6760
Fax:(732)321-6622
Gregory A. Mack
VP, Environmental Monitoring &
Assessment
Environmental Division
Battelle
505 King Avenue
Columbus, OH 43201
Phone: (614) 424-7241
Fax: (614) 424-4250
E-mail: mackga@battelle.org
Denise MacMillan
Quality Assurance Officer
Division of Chemistry QA
U.S. Army COE
Bremen Laboratory
420 S 18th Street
Omaha, NE 68102
Phone: (402) 444-4304
Fax:(402)341-5448
E-mail:
Denise.k.macmillan@nwo02.usace.army.mil
Caroline Madding
QA Coordinator, OGWDW
U.S. EPA
26 W M L King Dr.
Cincinnati, OH 45268
Phone: (513) 569-7402
Fax: (513)569-7191
E-mail: madding.caroline@epa.gov
Lee Mao
Quality Assurance Officer
Environmental Monitoring Branch
U.S. Bureau of Reclamation
2800 Cottage Way
Sacramento, CA 95825
Phone: (916) 978-5282
Fax: (916) 978-5290
E-mail: lmao@mp.usbr.gov
Marisol Marrero
Chemist/QA Officer
Hazardous Waste Permit Division
PR Environmental Quality Board
P.O. Box 11488
San Juan, PR 00910
Phone: (787) 766-2817
Fax: (787) 767-8118
John Martinson
QA Manager
ORD/NERL Cincinnati
U.S. EPA
26 W. Martin L. King Drive
Cincinnati, OH 45268
Phone: (513) 569-7564
Fax: (513) 569-7424
E-mail: Martinson.John@epa.gov
Linda Mauel
Environmental Engineer
Division of Environmental Science &
Assessment (DE
U.S. EPA - Region 2
2890 Woodbridge Ave. MS-220
Edison, NJ 08837
Phone: (732) 321-6766
Fax: (732) 321-6616
E-mail: mauel.linda@epa.gov

-------
Doris Maxwell
Management Analyst
OAR/OAQPS/ESD
U.S. EPA
MD-13
Research Triangle Park, NC 27711
Phone: (919) 541-5312
Fax: (919) 541-0072
E-mail:
maxwell.doris@epamail.epa.gov
Margaret McCue
Associate Director
Waste, Pesticides & Toxics Division
U.S. EPA - Region 5
77 W. Jackson
Chicago, IL 60024
Phone: (312) 886-0653
Fax: (312)353-4788
E-mail: mccue.margaret@epa.gov
Kathleen McEnerny
Associate
DPRA, Inc.
1300 N. 17th Street, Suite 950
Rosslyn, VA 22209
Phone: (703) 841-8052
Fax: (703) 524-9415
E-mail: kmcenerny@dpra.com
Fred S. McLean
Chemist
NAVSEA
U.S. Navy
1661 Redbank Road
Goose Creek, SC 29445
Phone: (843) 764-7337
Fax: (843) 764-7360
E-mail: mclean-fred-s@hq.navsea.navy.mil
Mary Sue McNeil
Senior Scientist, Research Division
ManTech Environmental
919 Kerr Research Drive
Ada, OK 74820
Phone: (580) 436-8711
Fax: (580)436-8501
E-mail: lafever.mary-sue@epa.gov
Heather Medley
Environmental Scientist
Soil&Water/Sample & Data Mgmt.
Fluor Daniel Femald
P.O. Box 538704
Cincinnati, OH 45253-8704
Phone:(513)648-7541
Fax: (513)648-7595
E-mail:
heather_medley@fernald.gov
Stephen Mehay
Environmental Protection Specialist
Comarco Systems Inc.
RR 6, Box 28
Bloomfield, IN 47424
Phone: (812) 384-3587 x250
Fax: (812)384-3744
E-mail: smehay@comarcosystems-
blfd.com
Ray Merrill
Senior Program Manager
Engineering and Science Division
Eastern Research Group
1600 Perimeter Park, Box 2010
Morrisville, NC 27560
Phone: (919) 468-7887
Fax: (919) 468-7801
E-mail: rmerrill@erg.com
Craig R. Mesler
QMS
Uncasville, CT
Edward Messer
Environmental Chemist
Office of Quality Assurance
U.S. EPA-Region 4 SESD
980 College Station Road
Athens, GA 30605
Phone: (706) 355-8560
Fax: (706) 355-8803
E-mail:
messer.edward@epamail.epa.gov
Sheila L. Meyers
Quality Assurance Specialist, QAD
TNRCC
P.O. Box 13087
MC-176
Austin, TX 787113087
Phone: (512) 239-6340
Fax: (512) 239-6390
E-mail: smeyers@tnrcc.state.tx.us
Paul Mills
Senior Technical Specialist, I&ET
DynCorp
2000 Edmund Halley Drive
Reston, VA 20191
Phone: (703) 264-9560
Fax: (703) 264-9236
E-mail: millsp@dyncorp.com
Martha Mitchell
Sr. Tech. Manager, ER Project Sandia
Roy F. Weston
2955 Hyder, SE
Albuquerque, NM 87106
Phone: (505) 844-8208
Fax: (505) 284-2617
E-mail: mjmitch@sandia.gov
James Moore
Quality Assurance Manager
Gulf Ecology Division
U.S. EPA-NHEERL
One Sabine Island Drive
Gulf Breeze, FL 32561
Phone: (850)934-9236
Fax: (850) 934-9201
E-mail: moore.jim@epamail.epa.gov
James B. Moore
QA Coordinator
U.S. EPA - NAREL
540 South Morris Avenue
Montgomery, AL 36115
Phone: (334) 270-3451
Fax: (334) 270-3454
E-mail: moore.James@epamail.epa.gov
Paul Morrison
Groundwater & Regulatory Service
Section
Dept Agriculture, Trade & Consumer
Prot
State of Wisconsin
2811 Agriculture Drive
PO Box 8911
Madison, WI 53708
Phone: (608) 224-4512
Fax: (608) 225-4656
E-mail:
morripa@wheel.datcp.state.wi.us
Stan Morton
U.S. Department of Energy
Idaho Falls, ID
Jeanne Mourrain
Chemist, ORD/NELAP
U.S. EPA
MD 75
RTP, NC 27711
Phone: (919) 541-1120
Fax: (919)541-4261
Cesar A. Muedas
QA Manager, Kenfelder Laboratory
Brown and Caldwell
227 French Landing Drive
Nashville, TN 37228
Phone: (615) 255-2288 x524
Fax: (615) 256-8332
E-mail: cmuedas@brwncald.com
Dave Neal
Service Manager, Service Division
Tekmar-Dohrmann
7143 E. Kemper Road
Cincinnati, OH 45249
Phone: (513)247-7081
Fax: (513)247-7043
E-mail: davnea@tekmar.com
9

-------
Robert Nichols
Environmental Scientist
ENSV/DISO/RQAO
U.S. EPA - Region 7
25 Funston Rd
Kansas City, KS 66115
Phone: (913) 551-5189
Fax: (913) 551-5218
E-mail:
nichols.robert@epamail.epa.gov
Melvin Nolan
Environmental Scientist
ORD/NCEA-IO
U.S. EPA
401 M Street, SW (8601D)
Washington, DC 20460
Phone: (202) 564-3354
Fax: (202) 565-0061
E-mail:
nolan.melvin@epamail.epa.gov
Heidi Novotny
U.S. Army COE
Washington, DC
Mary J. O'Donnell
Environmental Scientist
PMD/QAP
U.S. EPA - Region 9
75 Hawthorne Street
San Francisco, CA 94105
Phone: (415) 744-1533
Fax: (415) 744-1476
E-mail:
odonnell.mary@epamail.epa.gov
Brenda Odom
Mathematical Statistician
Quality Assurance Division
Environmental Protection Agency
401 M. Street, SW (8724R)
Washington, DC 20460
Phone: (202) 564-6881
Fax: (202) 565-2441
E-mail: odom.brenda@epa.gov
Steve Ostrodka
Chief, Field Services Section
Superfund Division
U.S. EPA - Region 5
77 West Jackson Boulevard
Chicago, IL 60604
Phone: (312) 886-3011
Fax: (312) 353-9281
E-mail: ostrodka.stephen@epa.gov
Jerry Parr
Principal
Catalyst Information Resources, L.L.C.
1153 Bergen Parkway
Evergreen, CO 80439
Phone: (303) 670-7823
Fax: (303) 670-2964
Nan Parry
QA Manager, NCERQA
U.S. EPA
401 M Street, S.W.
Mail Code 8721R
Washington, DC 20460
Phone: (202) 564-6859
Fax: (202) 565-2444
E-mail: parry.nan@epa.gov
Jeffrey Perl
AFCEE/ERC
USAF
Brooks AFB, TX
John Petiet
Environmental Chemist 2
NYSDEC
50 Wolf Road
Albany, NY 12233-7258
Phone: (518) 457-2052
Fax: (518) 485-7733
E-mail: txpetiet@gw.dec.state.ny.us
Debra Piper
QA Manager
Grace Analytical Laboratory
USEPA GLNPO
36W580 Lancaster Rd.
St. Charles, IL 60175
Phone: (312) 353-0377
Fax: (630) 587-9645
E-mail: piper.debra@epamail.epa.gov
Charles Plost
Chemist
Office of Research & Development
U.S. EPA
401 M Street, SW (8724)
Washington, DC 20460
Phone: (202) 564-6874
Fax: (202) 564-2551
E-mail: plost.charles@epamail.epa.gov
Tom Pomales
Air Pollution Specialist
Monitoring & Laboratory Division
CA Air Resources Board
1309 T Street
Sacramento, CA 95814
Phone: (916) 322-7053
Fax: (916) 327-8217
E-mail: tpomales@arb.ca.gov
Wade Ponder
Chief, Technical Services Branch
ORD/NRMRL/APPCD
U.S. EPA
Mail Drop 91
RTP, NC 27711
Phone: (919) 541-2818
Fax:(919)541-0496
E-mail: ponder.wade@epamail.epa.gov
Emma P. Popek
Technical Manager
IT Corporation
4585 Pacheco Boulevard
Martinez, CA 94553
Phone: (925) 372-4486
Fax:(925)372-5220
E-mail: popek@ohm.com
Dave Preston
Partner
Varnum, Riddering, Schmidt, and
Howlett, LLP
Grand Rapids, MI
Mike Ray
QA Manager, NHEERL/HSD
U.S. EPA
MD-58A
86 Alexander Drive
RTP, NC 27711
Phone: (919) 966-0625
Fax:(919)966-6212
E-mail: ray.mike@epamail.epa.gov
William R. Ray
QA Program Manager
Division of Water Quality
CA State Water Resources Control Brd
901 P Street
P.O. Box 944213
Sacramento, CA 94244-2130
Phone: (916)657-1123
Fax: (916) 653-8628
E-mail: rayb@dwq.swrcb.ca.gov
Kenneth Reid
Program Analyst (Alternate QA Coord.)
NCEA-w
U.S. EPA
401 M Street, S.W.
Washington, DC 20460
Phone: (202) 564-3229
Fax: (202) 565-0050
E-mail: reid.kenneth@epa.gov
10

-------
Esperanza P. Renard
NCERQA/QAD
U.S. EPA
2890 Woodbridge Avenue
MS-104
Edison, NJ 08837
Phone: (732) 321-4355
Fax: (732) 321-6640
E-mail: Renard.Esperanza@epamail.epa.gov
William Richardson
Environmental Engineer
ORD/NHEERL/MED-Duluth/CBSSS
9311 Groh Road
Grosse Ile, MI 48138
Phone: (734) 692-7611
Fax: (734) 692-7603
E-mail: richardson.william@epa.gov
Gene Riley
Environmental Protection Spec.
EMAD, OAQPS
U.S. EPA
2300 Southern Drive
Durham, NC 27703
Phone: (919) 541-5239
Fax: (919) 541-1039
E-mail: riley.gene@epamail.epa.gov
Charles Ritchey
U.S. EPA
1445 Ross Avenue, 6PD
Dallas, TX 75202-2733
Phone: (214) 665-8350
Fax: (214) 665-6762
E-mail: Ritchey.Charles@epamail.epa.gov
César Rodriguez
Quality Assurance Officer
Air Quality Area
PR Environmental Quality Board
P.O. Box 11488
Santurce, PR 00910
Phone: (787) 723-8184
Fax: (787) 725-0140
Jenine Rogers
Environmental Scientist, SWP/SDM
Fernald
11003 Hamilton-Cleves Rd
Dos Building
Ross, OH 45061
Phone: (513) 648-7534
Fax: (513) 648-7595
E-mail: Jenine_Rogers@fernald.gov
Ron Rogers
QA Manager, Env. Carcinogenesis Div.
U.S. EPA, NHEERL
MD-70
RTP, NC 27711
Phone: (919) 541-2370
Fax: (919) 541-4002
E-mail: rogers.ron@epamail.epa.gov
Randall Romig
U.S. EPA - Region 6
1445 Ross Ave. (6WQ-D)
Dallas, TX 75202-2733
Phone: (214) 665-8346
Fax: (214) 665-6049
E-mail: Romig.Randall@EPAmail.EPA.Gov
Debbie Rosano-Reece
Task Order Manager
Marasco Newton Group
2801 Clarendon Boulevard
Arlington, VA 22209
Phone: (703) 516-9469
Fax: (703) 516-9108
E-mail: dreece@marasconewton.com
Ann Rosecrance
Corporate QA Director
Core Laboratories
5295 Hollister Road
Houston, TX 77040
Phone: (713) 329-7414
Fax: (713) 895-8982
E-mail: arosecrance@corelabcorp.com
Michael A. Ross
President
Registrar Accreditation Board
P.O. Box 3005
Milwaukee, WI 53201-3005
Phone: (414) 272-3937 x7811
Fax: (414) 765-8661
E-mail: mross@rabnet.com
Robert M. Runyon, Jr.
Quality Assurance Manager
DESA/HWSB
U.S. EPA - Region 2
2890 Woodbridge Avenue
Edison, NJ 08837
Phone: (732) 321-6645
Fax: (732) 906-6824
E-mail: runyon-robert@epamail.epa.gov
Kenneth Rygwelski
Environmental Scientist
ORD/NHEERL/MED
U.S. EPA
9311 Groh Rd
Grosse Ile, MI 48138
Phone: (734) 692-7641
Fax: (734) 692-7603
E-mail: RYGWELSKI.KENNETH@EPAMAIL.EPA.GOV
Jackie Sample
Navy Laboratory Program Manager
NAVSEA 04XQ/LABS
U.S. Navy
1661 Redbank Road
Goose Creek, SC 29445-6511
Phone: (843) 764-7337 x11
Fax: (843) 764-7360
E-mail: Sample_Jackie_H@hq.navsea.navy.mil
Anays A. Santaliz
Quality Assurance Officer
Air Quality Area
PR Environmental Quality Board
P.O. Box 11488
Santurce, PR 00910
Phone: (787) 723-8184
Fax: (787) 725-0140
Javier Santillan
AFCEE/ERC
USAF
Building 532
3207 North Road
Brooks AFB, TX 78235-5363
Phone: (210) 536-5207
Fax: (210) 536-6408
Eric Sappington
Environmental Specialist
Division of Environmental Quality
MO Department of Natural Resources
P.O. Box 176
Jefferson City, MO 65102
Phone: (573) 526-3373
Fax: (573) 526-3350
Cheryl Scholten
Quality Assurance Coordinator
Minnesota Pollution Control Agency
Outcomes Division
520 Lafayette Road
St Paul, MN 55155
Phone: (651) 296-7387
Fax: (651) 297-8324
E-mail: cheryl.scholten@pca.state.mn.us
11

-------
Mary Ellen Schultz
Environmental Scientist
Office of Analytical Services & Quality Assurance
U.S. EPA - Region 3
Environmental Science Center
701 Mapes Road
Fort Meade, MD 20755-5350
Phone: (410) 305-2746
Fax: (410) 305-3095
E-mail: schultz.maryellen@epa.gov
George Schupp
Quality Assurance Coordinator
Central Regional Laboratory
U.S. EPA - Region 5
Mailcode: ML-10C
77 W. Jackson Blvd.
Chicago, IL 60604-3511
Phone: (312) 353-1226
Fax: (312) 886-2591
E-mail: schupp.george@epa.gov
Charles Sellers
Chemist, OSW
U.S. EPA
401 M Street, SW
Washington, DC 20460
Phone: (703) 308-0504
Fax: (703) 308-0509
E-mail: sellers.charles@epamail.epa.gov
Fred Seto
Public Health Chemist
Hazardous Materials Laboratory
CA EPA
2151 Berkeley Way, #55
Berkeley, CA 94704
Phone: (510) 540-3388
Fax: (510) 540-2305
E-mail: fseto@ix.netcom.com
Ronald Shafer
Operation Research Analyst, CEIS
U.S. EPA
401 M Street, SW (MC 2163)
Washington, DC 20460
Phone: (202) 260-6966
Fax: (202) 260-4968
Richard Sheibley
Chief, Lab. Accreditation Section
Bureau of Laboratories
PA Dept of Environmental Protection
P.O. Box 1467
Harrisburg, PA 17105-1467
Phone: (717) 705-2425
Fax: (717) 783-1502
E-mail: sheibley.richard@a1.dep.state.pa.us
Scott D. Siders
Divisional QAO
Division of Laboratories
Illinois EPA
1021 North Grand Avenue East
Springfield, IL 62794-9276
Phone: (217) 785-5163
Fax: (217) 524-0944
E-mail: epa6113@epa.state.il.us
Fred Siegelman
Quality Assurance Division
U.S. EPA
Washington, DC
Guy Simes
Scientist
Sustainable Technology Division
US EPA
26 W. Martin Luther King Dr.
Cincinnati, OH 45268
Phone: (513) 569-7845
Fax: (513) 569-7471
E-mail: simes.guy@epamail.epa.gov
Barton P. Simmons
California EPA
2151 Berkeley Way
Berkeley, CA 94704
Phone: (510) 540-3003
Fax: (510) 540-2305
E-mail: bsimmons@dtsc.ca.gov
Terry Simpson
Environmental Scientist, OIG/ESS
U.S. EPA
401 M Street, SW
Washington, DC 20460
Phone: (202) 260-3276
Fax: (202) 260-3030
E-mail: simpson.terry@epa.gov
Diann Sims
Environmental Scientist
Quality Assurance Division
U.S. EPA
401 M St. SW (8724R)
Room 31158
Washington, DC 20460
Phone: (202) 564-6872
E-mail: sims.diann@epa.gov
Al Smith
Regional Quality Assurance Mgr.
U.S. EPA - Region 6
1445 Ross Ave.
Dallas, TX 75202
Phone: (214) 665-8347
Fax: (214) 665-8072
E-mail: smith.alva@epa.gov
Joseph Solsky
Chemist
U.S. Army COE
CENWO-HX-C
12565 West Center Road
Omaha, NE 68144
Phone: (402) 697-2573
Fax: (402) 697-2595
Elaine S. Sorbet
Laboratory Program Manager
Louisiana DEQ
8618 GSRI
Baton Rouge, LA 70810
Phone: (225) 765-2405
Fax: (225) 765-2408
E-mail: elaines@deq.state.la.us
Margaret St Germain
Environmental Chemist
Environmental Science Division
U.S. EPA - Region 7
Quality Assurance Office
25 Funston Road
Kansas City, KS 66115
Phone: (913) 551-5242
Fax: (913) 551-5242
E-mail: stgermain.margie@epa.gov
Lucy Stanfield
Assistant Data Manager
Great Lakes National Program Office (GLNPO)
U.S. EPA
77 W. Jackson Blvd., G-17J
Chicago, IL 60604
Phone: (312) 886-1121
Fax: (312) 353-2018
E-mail: stanfield.lucy@epamail.epa.gov
Carvin D. Stevens
QA Specialist, HEASD/HEAB
U.S. EPA
79 T.W. Alexander Drive
RTP, NC 27711
Phone: (919) 541-1515
Fax: (919) 541-4368
E-mail: stevens.carvin@epamail.epa.gov
Shari L. Stevens
Chief, DESA-HWSB
U.S. EPA - Region 2
Building 209
2890 Woodbridge Avenue
Edison, NJ 08837
Phone: (732) 906-6994
Fax: (732) 321-6622
E-mail: stevens.shari@epamail.epa.gov
12

-------
Victor Stokmanis
Quality Assurance Chemist
Division of Resources Management
U.S. Bureau of Reclamation
2800 Cottage Way
Sacramento, CA 95825
Phone: (916) 978-5285
Fax: (916) 978-5290
E-mail: vstokmanis@mp.usbr.gov
Barry Stoll
Team Leader
OIG OA Engineering and Science Staff
U.S. EPA
401 M Street SW
Washington, DC 20460
Phone: (202) 260-4976
Fax: (202) 260-3030
E-mail: bstoll@EPA.gov
Peter Styczen
Technical Assistant
Informatica International, Inc.
795 Main Street, West, 2nd Floor
Oak Ridge, TN 37830
Phone: (423) 220-8700
Fax: (423) 481-3234
E-mail: pstyczen@ick.net
James Sutton
QA Manager
Neurotoxicology
U.S. EPA - NHEERL
MD-74B, ERC
86 TW Alexander Dr
RTP, NC 27711
Phone: (919) 541-7610
Fax: (919) 541-4849
E-mail: Sutton.james@epamail.epa.gov
Deborah Szaro
Environmental Scientist, QA Unit
U.S. EPA
60 Westview Street
Lexington, MA 02421
Phone: (781) 860-4312
Fax: (781) 860-4397
E-mail: szaro.deb@epamail.epa.gov
Cynthia Szymanski
Quality Assurance Officer
OPP/BEAD/APPB
U.S. EPA
401 M St. SW
Mailcode 7503C
Washington, DC 20460
Phone: (703) 308-8191
Fax: (703) 308-8090
E-mail: szymanski.cynthia@EPA.gov
Cheng-Wen Tsai
Chemist/QA Expert
RMD/QA Core
U.S. EPA - Region 5
RMD/QA Core (M-9J)
77 West Jackson Blvd.
Chicago, IL 60604
Phone: (312) 886-6234
Fax: (312) 353-4342
E-mail: tsai.cheng-wen@epa.gov
Dana S. Tulis
Director
Analytical Operations Ctr., OERR
U.S. EPA
401 M Street, SW (MC 52046)
Washington, DC 20460
Phone: (703) 603-8993
Fax: (703) 603-9112
E-mail: tulis.dana@epa.gov
Steve Vandegrift
QA Manager
Subsurface Protection and Remediation Division
U.S. EPA
P.O. Box 1198
Ada, OK 74820
Phone: (580) 436-8684
Fax: (580) 436-8528
E-mail: vandegrift.steve@epamail.epa.gov
Thomas Wagner
Senior Scientist
Environmental & Health Sciences
Science Applications Int'l. Corp.
2260 Park Avenue, Suite 401A
Cincinnati, OH 45206
Phone: (513) 569-5869
Fax: (513) 569-4800
E-mail: THOMAS.J.WAGNER@cpmx.saic.com
John Warren
QA Division
U.S. EPA
401 M Street, SW (8724R)
Washington, DC 20460
Phone: (202) 564-6876
Fax: (202) 565-2441
E-mail: warren.john@epamail.epa.gov
Shirley Wasson
Chemist, QA Team
APPCD/NRMRL/ORD
EPA
MD-91
Research Triangle Park, NC 27711
Phone: (919) 541-1439
Fax: (919) 541-0496
E-mail: wasson.shirley@epa.gov
Carl Weaver
Lab Supervisor
Consumer Analytical Laboratory
Ohio Department of Agriculture
8995 E. Main St. Bld. #3
Reynoldsburg, OH 43214
Phone: (614) 728-6313
Fax: (614) 728-6322
E-mail: cweaver@odant.agri.state.oh.us
Nancy Wentworth
Director, Quality Assurance Division
U.S. EPA
401 M Street, SW (8201)
Washington, DC 20460
Phone: (202) 564-6830
Fax: (202) 565-2441
E-mail: wentworth.nancy@epamail.epa.gov
Dennis Wesolowski
Regional QA Manager
U.S. EPA - Region 5
Mail Code M-9J
Chicago, IL 60604
Phone: (312) 886-1970
Fax: (312) 353-4342
E-mail: Wesolowski.Dennis@epa.gov
J. Kaye Whitfield
Environmental Engineer
Air Pollution Prev. & Control Div./Tech. Svcs. Br.
U.S. EPA
MD-91
Research Triangle Park, NC 27711
Phone: (919) 541-2509
Fax: (919) 541-0496
E-mail: whitfield.kaye@epa.gov
Chuck Wibby
Vice President
ERA
5540 Marshall
Arvada, CO 80002
Phone: (303) 431-8454
Fax: (303) 421-0159
E-mail: eracxw@aol.com
13

-------
Carolyn Wieland
Management Analyst/Project
Officer-QA Contract
ORD/NRMRL/TTSD/IO
U.S. EPA
26 W. MLK Drive
MS-163
Cincinnati, OH 45268
Phone: (513) 569-7846
Fax: (513)569-7585
E-mail:
wieland.carolyn@epamail.epa.gov
Aileen Winquist
Environmental Scientist
Marasco Newton Group
2801 Clarendon Boulevard
Arlington, VA 22209
Phone: (703) 247-4254
Fax: (703)516-9108
E-mail:
awinquis@marasconewton.com
Rebecca Wiscovitch
QA Officer
Air Quality Area
PR Environmental Quality Board
P.O. Box 11488
Santurce, PR 00910
Phone: (787) 723-8184
Fax: (787) 725-0140
Kevin Worden
MDA-PDP QAO, Agriculture
Division
State of Michigan
1615 South Harrison Road
East Lansing, MI 48823
Phone: (517) 337-5106
Fax: (517) 337-5094
E-mail: wordenk@state.mi.us
Jeffrey Worthington
Quality Manager, QAD
USEPA - ORD NCERQA
401 M. Street SW (8724R)
Washington, DC 20460
Phone: (202) 564-5174
Fax: (202) 565-2441
E-mail: Worthington.Jeffrey@EPAMAIL.EPA.GOV
Dallas Wright, Jr.
QA Manager OPP
EPA- OPP
U.S. Government
3112 Winifred Dr.
Burtonsville, MD 20866
Phone: (703) 605-0644
Fax: (703) 305-6309
E-mail: wright.dallas@epa.gov
Chieh Wu
Environmental Engineer
National Center for Environmental
Assessment
U.S. EPA - ORD
401 M Street, SW
Washington, DC 20460
Phone: (202) 564-3257
Fax: (202) 565-0076
E-mail: wu.chieh@epamail.epa.gov
Chung Yen
AFCEE/ERC
USAF
Brooks AFB, TX
14

-------
4

-------
EPA's QA Community
(Note: The area code for all HQ phone numbers with the "260" and "564" exchanges is 202.)
Group	Name	Phone	Mail Code	Email Access Code	Fax Number
QAD
CROSS, Carolyn
919/541-3151
MD-75
cross-carolyn
919/541-4261
QAD
DIXON, Tom
564-6877
8724R
dixon-thomas
565-2441
QAD
DUTROW, Betsy
564-9061
8724R
dutrow-elizabeth
565-2441
QAD
HOLLOMAN, Vini
564-5176
8724R
holloman-virxaa
565-2441
QAD
HUNT, Margo
732/321-6606

hunt-margo
732/321-6616
QAD
JOHNSON, Gary
919/541-7612
MD-75
johnson-gary
919/541-4261
QAD
KIRKLAND, Linda
564-6873
8724R
kirkland-linda
565-2441
QAD
LAFORNARA, Patricia
732/906-6988
MS-104
lafornara-patricia
732/321-6640
QAD
MAISONNEUVE, Betty
564-6879
8724R
maisonneuve-betty
565-2441
QAD
MOURRAIN, Jeanne
919/541-1120
MD-75
mourrain-jeanne
919/541-4261
QAD
ODOM, Brenda
564-6881
8724R
odom-brenda
565-2441
QAD
RENARD, Esperanza
732/321-4355
MS-104
renard-esperanza
732/321-6640
QAD
PLOST, Charles
564-8874
8724R
plost-charles
565-2441
QAD
SIEGELMAN, Fred
564-5173
8724R
siegelman-frederic
565-2441
QAD
SIMS. Diann
564-6872
8724R
sims-diann
565-2441
QAD
WALDRON, Betty
564-6830
8724R
waldron-betty
565-2441
QAD
WARREN. John
564-6876
8724R
warren-John
565-2441
QAD
WENTWORTH, Nancy
564-6830
8724R
wentworth-nancy
565-2441
QAD
WORTHINGTON, Jeff
564-5174
6724R
worthington-jeffrey
565-2441
National Program Offices
OAR
AA QA rep
MAZZA, Carl
260-4672
6101
mazza-carl
260-5155
OAP
DIGIOVANNI .Vincent
202/564-8981
6204J
digiovanni-vincent
202/565-2141 or -2139
OAQPS
VACANT




PAPP, Michael
919/541-2408
MD-14
papp-michael
919/541-1903

AUTRY, Lara
919/541-5544
MD-14
autry-lara
919/541-1039

MAXWELL, Doris
919/541-5312
MD-13
maxwell-doris
919/541-0072
OMS
HARPER, Thomas
734/214-4308

harper-thomas
734-214-4550
ORIA
DOEHNERT, Mark
564-9386
6802J
doehnert-mark
565-2042/3

FISHER, Eugene
564-9418
6604J
fisher-eugene
565-2038

EAGLE, Mike
564-9376
6602J
eagle-mike
565-2062

MOORE. Jim (NAREL)
334/270-3451

moore-james
334/270-3454

LEVY, Richard (LV)
702/798-2466

levy-richard
702/798-2465

SELLS, Mark
702/798-2336

sells-mark
702/733-6013

BRAGANZA, Emilio
702/798-2430

braganza-emilio
702/798-2465

FLOTARD, Richard
702/798-2113

flotard-richard
702/798-2109
AA QA rep





OA
PASTORE, Tom
260-2064
3201
pastore-tom
260-6591

DAVIDSON, Jeff
260-1650
3207
davldson-jeff
260-0215
OHROS
CLARK, Joanne
260-3182
3601
clark-joanne
260-1039
AA QA rep
O'Brien, Kathy Sedlak-
260-1162
2724
obrien-kathy
260-
AA QA rep
MARION, Greg
564-7139
2254A
marion.greg
564-0073
FFEO
JONES. Kelly
564-2459
2261A
jones-kelly
501-0069
OC
BROZENA, Steve
202/564-4126
2225A
brozena-stephen
564-0029
OCEFT
TOPPER, Martin
564-2564
2231A
topper-martin
501-0599
NEIC
HUGHES, Barbara (QAM)
303/236-6116

hughes-barbara
303/236-5116

ROHRER, Mary
6295

rohrer-mary
303/236-2395

LEE, Johnny
6356

lee-johnny
303/236-5116

MATHEWS, Kaye
-6281

mathews-kaye
303/236-2395

YARBROUGH, Kenna
-6711

yarbrough-kenna
303/236-2395

ALEYNIKOV, Marina
-6062

aleynikov-marina
303/236-5116
OEJ
SETTLE, Mary E
564-2594
2201A
settle-mary
501-0740
OFA
BIGGS, B. Katherine
564-7144
2252A
biggs-katherine
564-0072
ORE
OLSON. Don
564-5558
2243A
olson-don
564-0054
OSRE
JOJOKIAN, Jack
564-6058
2273A
jojokian-jack


SIMPSON, Terry
260-3276
2421
simpson-terry
260-3030
AAQA rep
OARM
OCFO
OECA
OIG
OP
OPPTS
OSWER
AA QA rep
KARIYA, Jim
260-2916
7101
kariya-jim
401-0849
OPP
WRIGHT, Dallas(acting)
703/605-0644
7503W
wright-dallas
301/504-8060

BYRNE, Christian (ECL/BEAD)
601/666-3213
7503W
byrne-christian
601/688-3536

GOLDEN, Paul(ACUBEAD)
301/504-8189
7503W
golden-paul
301/504-8060

SZYMANSKI, Cynthia (BEAD)
703/308-6191
7503W
szymanski-cynthia
703/308-6090

THOMPSON, Peter (Arm p Div)
703/308-7372
7510W
thompson-peter
703/308-6466

GLASGOW, Carol (BPPD)
703/308-6810
7511C
glasgow-carol
703/308-7026

NGUYEN, Thuy (EFED)
703/605-0562
7507C
nguyen-thuy
703/308-6181

OLINGER, Christine (HED)
703/305-5406
7509C
olinger-christine
703/305-5529

BRADLEY. Cheryl (IRSD)
703/305-5981
7502C
bradley-cheryl

OPPT
GLATZ, Jay
260-3990
7401
glatz-joseph
260-6704
AA QA rep
JOVER, Tony
260-2387
5103
jover-tony
260-6754
OERR
GEUDER, Duane (QAM)
703/603-8891
5202G
geuder-duane
703/603-9132

COAKLEY, Bill
732/321-6921
5204
coakley-william
732/906-6921

WAETJEN, Hans
703/603-8906
5201 G
waetjen-hans
703/603-9133
OSW
SELLERS, Charles
703/308-0504
5307W
sellers-charles
703/308-0511
OUST
DEPONT, Lynn
703/603-7148
5401G
depont-lynn
703/603-9163
FFRRO
CARTER. Mike
202/260-5686
5101
carter-mike
202/260-5646
Date of printout: 13:58/06Apr99

-------
Group	Name	Phone	Mail Code	Email Access Code	Fax Number
AAQArep
COLEMAN, Wendy Blake
260-5680
4102
coleman-wendy
260-7923
AIEO
LIU, Edwin (QAM)
260-9872
4104
liu-ed
260-7509
OGWDW
CLARK, Steve (QAM)
260-7575
4601
clark-stephen
260-3762

MADDING, Carol
513/569-7402

madding-caroline
513-684-7191

SMITH, Bob
260-5559
4602
smith-robert-e
260-0732

HAERTEL, Frances
214/665-8090

haertel-frances
214/665-2191
OST
TELLIARD, Bill
260-7134
4303
telliard-william
260-7185

BISWAS, Hiranmay
260-7012
4305
biswas-hira
260-9830
OWM
WALKER, John (QAM)
260-7283
4204
walker-john
260-0118

SMITH, Tony (Roberto A.)
260-1017
4203
smith-tony
260-1460

BENROTH. Barry (MSD)
260-2205
4204
benroth-barry
260-0116
OWOW
BROSSMAN, Martin
260-7023
4503F
brossman-martin
260-1977

PAN, Paul
260-9111
4504F
pan-paul
260-9960

SIPPLE. William
260-6066
4502F
sipple-william
260-8000
Regional Offices
Region 1
BARMAKIAN, Nancy
781/860-4684
barmakian-nancy
781/860-4397

SZARO, Deborah
781/860-4312
szaro-deb
781/860-4397

LATAILLE, Moira
781/860-4635
lataille-moira
781/860-4397
Region 2
RUNYON, Bob
732/321-6645
runyon-robert
732/906-6824

Dore LaPosta
732/321-6686
laposta-dore
732/906-6616
Region 3
VACANT


Region 4
BENNETT,Gary
706/355-8551
bennett-gary
706/355-8803
Region 5
WESOLOWSKI, Dennis
312/886-1970
M-9J wesolowski-dennis
312/353-4135
Region 6
SMITH, Alva
214/665-8347
6MD-HX smith-alva
214/665-8072
DOUCET. Lisa
281/983-2129
doucet-lisa
281/983-2124 or 2248
Region 7
ARNOLD. Ernie
913/551-5194
arnold-ernie
913/551-5218
Region 8
MEDRANO, Tony
303/312-6336
medrano.tony
303/312-6961
Region 9
FONG. Vance
415/744-1492
fong-vance
415/744-1476
Region 10
TOWNS, Barry
206/553-1675
ES-095 towns-barry
206/553-8210
ORD
NCERQA
PARRY, Nan
564-6859
8721R
parry-nan
565-2449
NCEA
NOLAN, Melvin (QAM)
564-3354
8601-D
nolan-melvin
565-0061
NCEA (HQ)
WU, Chieh
564-3257
8623-D
wu-chieh
565-0078
NCEA (Cinti)
WILLIAMS, Doug
513/569-7361
MD-185
williams-doug
513/569-7475
NCEA (RTP)
FENNELL, Douglas
919/541-3789
MD-52
fennell-douglas
919/541-1818
NHEERL AED (Narra.)
LIVOLSI, Joe
401/782-3163
livolsi-joseph
401/782-3030
ECD (RTP)
ROGERS, Ronald
919/541-2370
MD-70
rogers-ron
919/541-5394
RTD (RTP)
BROWN, Janice
919/541-0331
MD-71
brown-janice
919/541-1499
ETD (RTP)
CULPEPPER, Brenda
919/541-0153
MD-66
culpepper-brenda
919/541-4284
HSD (RTP)
RAY, Mike
919/968-0625
MD-58A
ray-mike
919/966-6212
NTD (RTP)
SUTTON, Jim
919/541-7610
MD-70
sutton-james
919/541-5394
GED (GB)
MOORE, James C.
850/934-9236
moore-jim
850/934-9201
MED (Duluth)
BATTERMAN, Allan
218/529-5027
batterman-allan
218/529-5015
MED (Grosse Ile)
RYGWELSKI, Ken
734/692-7641
rygwelski-kenneth
734/692-7603
WED (Corvallis)
MCFARLANE, Craig
541/754-4606
mcfarlane-craig
541/754-4614
WED (Corvallis)
HENDRICKS, Charles
541/754-4799
hendricks-charles
541/754-4799
NRMRL (Cinti)
ADAMS, Nancy (Acting QAM)
919/541-5510
MD-91
adams-nancy
919/541-0496
WSWRD (Cin)
HAYES, Sam
513/569-7514
689
hayes-sam
513/569-7655
LRPCD (Cinti)
KERN, Ann
513/569-7635
G-77
kern-ann
513/569-7585
STD (Cinti)
DREES, Lauren
513/569-7087
G-77
drees-lauren
513/569-7585
APPCD (RTP)
ADAMS, Nancy (QAM)
919/541-5510
MD-91
adams-nancy
919/541-0496
APPCD (RTP)
WHITFIELD, Kaye (TSB)
919/541-2509
MD-91
whitfield-kaye
919/541-0496
APPCD (RTP)
GROFF, Paul
919/541-0979
MD-91
groff-paul
919/541-0496
APPCD (RTP)
WASSON, Shirley
919/541-1439
MD-91
wasson-shirley
919/541-0496
APPCD (RTP)
SHORES, Richard
919/541-4983
MD-91
shores-richard
919/541-0496
SPRD (Ada)
VANDEGRIFT, Steve
580/436-8684
vandegrift-steve
580/436-8528
NERL
JOHNSON, Lora
513/569-7299
MS587
johnson-lora
513/569-7424
EERD/MCEAD (Cinti)
MARTINSON, John
513/569-7564
MS587
martinson-john
513/569-7424
HEASD (RTP)
BETZ, Elizabeth
919/541-1535
MD-77
betz-elizabeth
919/541-0239
AMD (RTP)
VIEBROCK, Herbert
919/541-4543
MD-80
viebrock-herbert
919/541-1379
ERD (Athens)
SWANK, Robert
706/355-6008
swank-robert
706/355-8007
ERD (Athens)
POPE, John
706/355-3268
pope-john
706/355-8160
ESD (LV)
BRILIS, George
702/798-3126
brilis-george
702/798-2233
Other QA People
ORO
LUTTNER, Pamela
260-2441
1108
luttner-pamela
260-2159
GLNPO
BLUME, Louis
312/353-2317
blume-louis
312/353-2018
CBP
LEY, Mary Ellen
410/267-5750
ley-mary
410/267-5777
OIRM
O'BRIEN, Kathleen Sedlak
260-1162
2724
obrien-kathy
401-1515
OCFO
Date of printout: 13:56/06Apr99

-------
To get a copy of the current list, or to make any corrections,
Please Phone QAD on 202/564-6830
Acromania:
OAR	Office of Air and Radiation
OAP	Office of Atmospheric Programs
OAQPS	Office of Air Quality Planning and Standards
OMS	Office of Mobile Sources
ORIA	Office of Radiation and Indoor Air
OARM Office of Administration and Resource Management
OA	Office of Administration
OECA Office of Enforcement and Compliance Assurance
FFEO	Federal Facilities Enforcement Office
OC	Office of Compliance
OCEFT	Office of Criminal Enforcement, Forensics and Training
OEJ	Office of Environmental Justice
OFA	Office of Federal Activities
ORE	Office of Regulatory Enforcement
OSRE	Office of Site Remediation Enforcement
OIG Office of Inspector General
OP Office of Policy
OSPED	Office of Strategic Planning and Environmental Data
OPPTS Office of Prevention, Pesticides and Toxic Substances
OPP	Office of Pesticide Programs
EFED	Environmental Fate and Effects Div
HED	Health Effects Division
IRSD	Information Resources and Services Division
OPPT	Office of Pollution Prevention and Toxics
ORD Office of Research and Development
NCERQA	National Center for Environmental Research and Quality Assurance
NCEA	National Center for Environmental Assessment
NHEERL	National Health and Environmental Effects Research Laboratory
AED	Atlantic Ecology Division (Narragansett, RI)
ECD	Environmental Carcinogenesis Div (RTP)
GED	Gulf Ecology Division (Gulf Breeze, FL)
MED	Mid Continent Ecology Division (Duluth, MN)
NTD	Neurotoxicology Division
WED	Western Ecology Division (Corvallis OR)
NRMRL	National Risk Management Research Laboratory
APPCD	Air Pollution Prevention and Control Division (RTP, NC)
LRPCD	Land Remediation and Pollution Control Division (Cinti, OH)
SPRD	Subsurface Protection and Remediation Division
STD	Sustainable Technology Div (Cinti, OH)
WSWRD	Water Supply and Water Resources Division (Cinti, OH)
NERL National Exposure Research Laboratory
AMD	Atmospheric Modeling Division (RTP, NC)
EERD	Ecological Exposure Research Division (Cinti, OH)
ERD	Ecosystem Research Division (Athens, GA)
ESD	Environmental Sciences Division (Las Vegas, NV)
MCEAD	Microbiological and Chemical Exposure Research Division (Cinti, OH)
HEASD	Human Exposure and Atmospheric Sciences Division (RTP, NC)
OSWER Office of Solid Waste and Emergency Response
OERR	Office of Emergency and Remedial Response
OSW	Office of Solid Waste
OUST	Office of Underground Storage Tanks
FFRRO	Federal Facilities Restoration and Reuse Office
OW Office of Water
AIEO	American Indian Environmental Office
OGWDW	Office of Ground Water and Drinking Water
OST	Office of Science and Technology
OWM	Office of Waste Management
MSD	Municipal Support Division
OWOW	Office of Wetlands, Oceans, and Watersheds
Other QA People
OCFO Office of the Chief Financial Officer
ORO Office of Regional Operations
GLNPO Great Lakes National Program Office
CBP Chesapeake Bay Program
OIRM Office of Information Resources Management
Date of printout: 13:56/06Apr99

-------

-------
CURRENT ACTIVITIES OF THE INTERGOVERNMENTAL DATA QUALITY TASK FORCE
Mike H. Carter
U.S. Environmental Protection Agency
Federal Facilities Restoration and Reuse Office (5101)
401 M Street SW
Washington, DC 20460
SUMMARY
In March, 1997, the EPA Office of Inspector General (OIG) issued a report titled Laboratory
Data Quality at Federal Facility Superfund Sites (E1SKB6-09-0041-7100132). In this report and
others, the OIG has raised concerns about the quality of the environmental data upon which
decisions critical to protection of human health and the environment are based. The issues raised
by the OIG focus on the need for effective, intergovernmental Quality Systems and processes for
developing Quality Assurance Project Plans. The OIG stated that problems due to current
deficiencies lead to poorly designed Quality Assurance Project Plans, deficient Data Quality
Objectives, serious problems with laboratory data quality and insufficient EPA oversight.
Recommendations in the report for the Assistant Administrator for Solid Waste and Emergency
Response (OSWER) include:
•	Develop a national quality management plan,
•	Assess the adequacy of DOD's and DOE's environmental data management systems,
•	Issue guidance that specifies regional oversight responsibilities for Federal facility
Superfund cleanups, and;
•	Issue program specific QAPP guidance.
To address the recommendations, the Federal Facilities Restoration and Reuse Office (FFRRO)
led the formation of the Intergovernmental Data Quality Task Force (IDQTF), composed of
representatives from EPA Program Offices, Regional Offices, DoD, DOE and other Federal
Agencies.
CURRENT ACTIVITIES
The representatives from the Regional Offices and the other Federal Agencies have made it clear
that they will participate in the IDQTF only if their organizations derive significant benefit from
that participation; simply helping OSWER address Office of Inspector General recommendations
is not sufficient justification for their participation. The goals of the IDQTF are as follows:
•	To develop a written agreement on what constitutes an adequate QA program,
•	To develop guidance/framework that outlines the roles and responsibilities of the EPA
(Headquarters and Regions) and Federal Facilities with regard to QA/QC oversight, and;
•	To develop guidance for Federal Agency-wide requirements and procedures regarding
data quality.

-------
The additional goals of the Departments of Defense and Energy involve the establishment of
national consistency in the implementation of the quality assurance guidance and framework.
They quite understandably point to the need for, and the benefits of, being held to a single standard
of quality assurance/quality control instead of different requirements in each Region and under
every EPA media program. This standardization of monitoring requirements across the entire
Agency was the driver for the establishment of the Environmental Monitoring Management
Council a number of years ago.
The members of the IDQTF from the Regions have also made it clear that their participation is
strongly motivated by the need to conduct Quality Assurance business in a more efficient and
effective manner in the future. In too many instances, basic quality assurance functions, such as
the development, review and approval of Quality Assurance Project Plans (QAPPs), has become
a perfunctory exercise that accomplishes very little. We heard anecdotal accounts of QAPPs that
are largely boilerplate, based on little or no thought given to their real purpose. Reviewers of
QAPPs recount experiences where they had to look through 20 pages of language cut and pasted
from previous QAPPs just to find one relevant paragraph. Apparently it is not unusual to find
the wrong site name on the QAPP because it was a part of the cut and paste from another QAPP.
With these goals and objectives in mind, the IDQTF is working to develop Quality Assurance
requirements with enough specificity to provide standardization, while allowing flexibility to
deal with project specific needs. The use of the term "requirements" has become an important
consideration if the efforts of the IDQTF are to be effective. The IDQTF is concerned that
simply providing yet another set of guidance will not accomplish the national consistency,
effectiveness and efficiency that are needed to meet the needs of the federal quality assurance
community. Quality Assurance activities across the Agency could also benefit from the same
approach to conducting the business of quality.
The documentation the IDQTF is developing is based on the American National Standard
Specifications and Guidelines for Quality Systems for Environmental Data Collection and
Environmental Technology Programs, ANSI/ASQC E4-1994 (E-4). While compliance with the
E-4 standard is required for all Federal agencies, that compliance takes many forms. Therefore,
the IDQTF concluded that the only way to achieve consistency in the application of E-4 is to
provide much more detail and specificity for implementation. Requirements documents are
under development for QA Management Systems, corresponding to Part A of E-4, and for the
collection and evaluation of environmental data, corresponding to Part B of E-4.
The review comments for the first draft of the QA Management Systems document made it clear
that the relationship between E-4, EPA Requirements for Quality Management Plans, EPA
QA/R2 (R2) and the IDQTF must be explicit. The introduction of a third QA Management
requirement, in the absence of a specification of the precedence and the relationship of all the
documents, produces confusion, not consistency. IDQTF requirements must either stand alone
with no need to refer to E-4 or R2, or they must exactly follow the format of the source
document. They must provide a solid requirements base to establish consistency and they must
not cause confusion in implementation.

-------
For detailed guidance on the preparation of QAPPs, the IDQTF selected the Region I QAPP
guidance as a starting point. The other Federal agencies and all EPA Regional offices have been
asked to provide a critical review, comments and recommendations on the Region I guidance as a
model for a national standard. The IDQTF representatives from the other Federal Agencies and
the EPA Regional Offices were adamant that development of requirements and guidance on the
collection and evaluation of environmental data must be led by those who are actually involved
with that collection and evaluation. The IDQTF agreed with that position and the work group
leaders are all from the Regions.
CONCLUSION
The objective of the IDQTF is to establish national, intergovernmental standards for quality
assurance, leading to consistency and reciprocity between EPA Regions and other Federal
Agencies. These standards must be clear and concise and they must be implemented
consistently. The goal of the effort is the production of environmental data of known and
acceptable data that can be used to establish environmental restoration and waste management
requirements and monitor compliance with regulations and compliance agreements.

-------

-------
SECONDARY DATA USE: HOW GOOD ARE THE DATA?
Paul Mills & Sean Kolb
DynCorp
2000 Edmund Halley Drive
Reston, VA 20191
SUMMARY
Users who plan to "mine" primary data from computerized databases for a secondary purpose
must specify, "How good do the data need to be?" With these established criteria, the primary
data can be evaluated to determine "How good are the data?" Unless the primary data are
specifically qualified, flagged, identified with quality indicators, cataloged, segregated, and
stored by quality characteristics, the data may not meet secondary use needs. How can the data
be accessed?
What must you know about the various primary data sets: censored data, outlier treatment,
estimated results, screening data, etc.? Standards are needed that allow users to decide whether
the data are suitable for their (secondary) use. What should those standards be? How can they
be developed and applied to environmental data? What decisions don't require validated data?
Which ones do? Is data quality adequately described by DQO levels and PARCCS indicators?
What about other data without such descriptors—caveat emptor?
This paper discusses and suggests standards for, and uses of, secondary data, using the Contract
Lab Program (CLP) laboratory data reports as a source of primary analytical data. The CLP has
specified the answers to many of these questions, and serves as a model for discussion.
INTRODUCTION
The EPA Quality Manual (EPA Order 5360.1 CHG 1, July 1998) refers to "secondary data" as
"environmental data collected for other purposes or from other sources, including literature,
industry surveys, compilations from computerized data bases and information systems, results
from computerized or mathematical models of environmental processes and conditions."
Primary data, actual empirical results from environmental measurements, are used to make the
decision called for in a project's DQOs. Secondary data use could be to "mine" the data for other
purposes, such as determining how well samplers, labs, and methods perform. Secondary data, then,
are derivative: previously collected data applied to decision-making purposes other than their original
intent. So how can you determine that the data collected by someone else for a different purpose
can meet the data quality needs of your project? What would some of those uses be? Secondary
data could be used to identify trends and tendencies for programmatic and conceptual reporting.
Examples of the possible uses of secondary data from environmental databases include:
1) Building a profile of a lab, to determine its performance against established criteria over time,
and compare it with a desired model as well as other labs. What indicators would be used to
determine if there were problems requiring on-site visits for detailed inspections? The power
of the database and secondary data usage is in the identification of patterns that would not be

-------
visible if looked at on a project-specific basis. Trends and changes would not be noticed
unless some comparisons were made.
2)	Plot the number of samples and matrixes by lab to determine if there are capacity and related
quality trends—more samples mean more problems, or are the SDGs too much for a lab's
capacity?
3)	Toxic Release Inventories and information from EPA reports have been used by independent
environmental activist organizations to provide Web-accessible maps of "Hazardous Waste
Sites in your Zip Code." The data may not be qualified regarding timeliness, currency,
quality, or associated risk. It is dangerous to release unqualified information for general use.
The quality of the data in the database that would be included in such reports must be known
and documented (i.e., CLP criteria).
CLP DATA: OF KNOWN AND DOCUMENTED QUALITY
The CLP's analytical services provide customers with various options for combinations of
analytical parameters, data turnaround times, and detection limits. CLP services help data users
to make the best use of their limited analytical testing resources while reducing the overall cost
of analytical services at the site. CLP operations bring efficiencies in sharing information
regarding labs' performance and quality, uniform methods and variances, standardized
monitoring, timely and effective PE samples, prompt validation, consistent treatment of
performance problems, and overall lower prices.
Quality: By specifying standardized services, the Quality Assurance Project Plan's elements are
defined for: analytical method, preventative maintenance for laboratory equipment, calibration
and corrective action for laboratory equipment, chain of custody forms for sample shipment,
analytical precision and accuracy (including quantitation limits), laboratory quality control
requirements, data management and documentation for laboratory analysis including the
structure and format of the data package.
Standardized data submissions and evaluations are comparable over time. Readily available
program-wide statistics are available for cross-lab comparisons, and trending.
Performance results and lessons learned are shared quickly with all program users.
Legal: Contracts are standardized, with known performance requirements and conditions, and
liquidated damages clauses. Records retention, management, and access procedures protect data
integrity. Constant monitoring reveals suspicious patterns of activity, prompting investigation
for possible fraud. Environmental data of known and documented quality are produced that can
withstand independent review and confirmation.
Technical: A central repository of historic and current analytical data is available through the
information infrastructure, as well as the tools to assess these data. Performance of standardized
methods is captured in a database, and method options, waivers, exceptions, and improvements
can be tracked and shared. This will enable Performance Based Measurement Systems to be
implemented effectively.

-------
Data Validation and Assessment: CLP data deliverables allow different levels of validation and
assessment - from a review of sample results to a full evaluation of raw analytical data. Data are
validated as soon as hardcopy and electronic versions are available, rather than held as raw data
until all testing is completed. This avoids delays in detecting and solving problems. All
contracts awarded under a solicitation have identical requirements, ensuring delivery of
consistent analytical data.
PRINCIPLES OF SECONDARY DATA USE:
The quality of the primary data in any database is the paramount consideration for any secondary
usage, in applications like data mining, and has to be considered in data warehousing. Is
"Garbage In, Garbage Out" (GIGO) true? Yes! Identify and segregate data by quality, before it
is stored in any database! The following are additional recommended guiding principles
developed in response to questions about secondary data usage. The CLP approach to the
production and review of primary data is referenced as a model.
How are the data accessed? The CLP stores data electronically, and in hard copy format.
Requests for the data are handled by Regional and contractor points of contact, based on EPA
authorizations to see the data. Data may be in summary form, electronic format (diskettes, tapes,
computer hard drives), and in raw form. It may be complete in one set (Sample Delivery Group,
or SDG), or spread over time across a number of data packages (multiple SDGs in a case) by a
particular analytical method (Statement of Work). The user must specify the data desired.
Who is responsible for data quality? First, responsibility is held by the originators of the data. In
the CLP, the laboratories append data qualifiers as required by the Statement of Work (SOW) for
the particular analysis. Second, the CLP program Contract Laboratory Analytical Support
Services (CLASS) contractor applies semi-automated and manual Contract Compliance
Screening (CCS) software programs that check each data package's electronic deliverables for
completeness and compliance with the SOW. The CLASS contractor uses an automated Data
Assessment Tool (DAT) to review the electronic data and provide EPA Regions with PC-
compatible reports, spreadsheets, and electronic files to facilitate transfer of analytical data into
Regional databases. QC data are examined for all the analytical results and evaluated against
applicable review criteria. Third, the Regional data users conduct a manual hardcopy data
review, using the National Functional Guidelines for Data Validation, and apply a different set of
qualifiers based on the data's intended use. Each responsible party in this process has the
responsibility not to misuse the data, to twist it from its original purpose or stretch it to cover
more. How those qualifiers are used later, by others, must be addressed by providing as many
indicators of data quality as possible on the original data set.
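To make these screening responsibilities concrete, the sketch below shows the kind of completeness check an automated screening step might perform on an electronic data package before qualified data move downstream. It is only an illustration: the field names and the Python structure are hypothetical and do not represent the actual CCS or DAT software or the CLP deliverable formats.

# Illustrative completeness screen for an electronic data package.
# Field names are hypothetical; they do not reflect actual CLP formats.
REQUIRED_FIELDS = {"sample_id", "analyte", "result", "units", "method", "qualifier"}

def screen_package(records):
    """Return (record index, missing field names) for records that fail the screen."""
    problems = []
    for i, rec in enumerate(records):
        missing = sorted(REQUIRED_FIELDS - rec.keys())
        if missing:
            problems.append((i, missing))
    return problems

package = [
    {"sample_id": "MA1234", "analyte": "benzene", "result": 5.2,
     "units": "ug/L", "qualifier": "", "method": "OLM04.2"},
    {"sample_id": "MA1235", "analyte": "benzene", "result": 0.5,
     "units": "ug/L", "method": "OLM04.2"},  # qualifier field missing
]

for index, missing in screen_package(package):
    print(f"Record {index} is incomplete; missing: {', '.join(missing)}")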
How can primary data quality be determined? Data quality can be described by DQO levels,
PARCCS indicators, and other data acquisition elements. In EPA QA/R-5 (EPA, October 1998),
Elements B1 through B10 list the manner in which data acquisition
elements (primary data use) are to be described. Element B1, Sampling Process Design, says to
classify all measurements as either "critical" that is, required to achieve project objectives, or
"non-critical" for information purposes only. For sampling (B2) and analytical methods (B4)
selected, the method citations should specify exactly which options were selected. B5, Quality

-------
Control, requires specification of the QC procedures needed for each sampling, analysis, or
measurement technique, and how the QC statistics were calculated. B9 discusses data
acquisition requirements for decision-making that uses data from non-direct measurement
sources, such as computer databases, programs, literature files, and historical databases. "Define
the acceptance criteria for the use of such data in the project, and discuss any limitations on the
use of the data resulting from uncertainty in its quality. Document the rationale for the original
collection of data and indicate its relevance to this project." B10, Data Management, is a key in
this discussion. "Describe the project data management scheme, tracing the path of the data
from their generation in the field or laboratory to their final use or storage...Describe or
reference the standard record-keeping procedures, document control system, and the approach
used for data storage and retrieval on electronic media. Discuss the control mechanism for
detecting and correcting errors and for preventing loss of data during data reduction, data
reporting, and data entry to forms, reports, and databases." By specifying standardized services,
the CLP Quality Assurance Project Plan's elements are defined for the categories of:
•	analytical methods,
•	preventative maintenance for laboratory equipment,
•	calibration and corrective action for laboratory equipment, chain of custody forms for sample
shipment,
•	analytical precision and accuracy (including quantitation limits),
•	laboratory quality control requirements,
•	data management/documentation, including the structure and format of the data package.
How should data be qualified? Flag the data with qualifiers, and add the "reason codes" to show
why it was qualified. Also indicate whether the data are resubmitted/corrected. From QA/R-5,
D1, Data Review, Validation, and Verification Requirements: "State the criteria used to review
and validate—that is, accept, reject, or qualify—data in an objective and consistent manner."
D2, Validation and Verification Methods: "Describe the process used for validation and
verifying data." D3, Reconciliation with User Requirements: "Describe how the results
obtained from the project or task are reconciled with the requirements defined by the data user or
decision maker... Describe how issues will be resolved and limitations on the use of the data will
be reported." CLP data are properly assessed using the National Functional Guidelines.
What about other data without such descriptors? Data that do not clearly state the data
acquisition elements and acceptance criteria are less certain, and should be used with greater care
in drawing any conclusions. The CLP provides qualifiers for indeterminate or estimated data.
What are the detection and reporting limits, the effects of errors throughout the measurement
system? These must be reported with the original data and maintained with any secondary use,
to include adjustments made to reporting limits. The CLP requires reported data to be adjusted
for % moisture content of samples, dilutions, sub-optimal sample volumes, or other reasons.
CLP labs report their Method Detection Limits (organics), Instrument Detection Limits
(inorganics) and must meet Contract Required Reporting Limits for specified analytes. Quality
control sample information is reported in each data package so the user can readily determine the
effects of sample matrices upon recovery and reproducibility, and if any contamination occurred
during sample shipment, storage, and processing.

-------
Who should be the keeper of the data—EPA or its designates? Should the data be maintained in
its original form, or in a reduced form? The producer of the original primary data has the
responsibility for maintaining its security and integrity. The CLP lab maintains the original raw
data until EPA requests it, or for the duration specified in the laboratory's contract. Copies of
the data are distributed from the lab for data assessment of the contract deliverables. If the data
are released in any form, the qualifiers associated with the data acquisition elements must be
taken as an integral part of the data.
How should any transformations be documented? The database should have a tracking system
that indicates who accessed it, when, and what changes were made to the data. Database security
procedures are essential in maintaining data that can be trusted. Any data reduction of the
primary data set must be described, stating the algorithms used and the extent of transformations,
their purposes and effect. It must be possible to reconstruct the data back to its original source.
How current is the information? Were the latest methods used, or were out-dated ones applied?
Version control for methods must document the specific sampling and analytical methods used
to produce primary data. CLP contracts specify the version of methods to be used, and the
deliverable requirements. Traffic Reports and Chain of Custody documentation describe the
samples collected as grab or composite samples, and as one-time-only or continuous, quarterly,
or annual monitoring. The exact location of the sample in time and space is recorded.
What models were used, and what are their weaknesses? Is the quality of data acceptable for the
model? Are the scales of measurement used by the model consistent with that of the original
data? If pollutant transport models, for example, were used, their version must be noted, along
with any information about the inputs that applied. Are project-specific data appropriate for use
in a programmatic model?
How should data be used for ecological risk assessment? Assumptions about the data, and the
decisions made must be clearly stated. For example, "non-detect" data may be treated as "zero,"
or as ½ the MDL, or the CRDL, or the IDL.
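The sketch below shows how that choice propagates into even a simple summary statistic; the values and the single "U" flag are illustrative only, not drawn from any actual data set.

results = [
    {"value": 10.0, "qualifier": "",  "mdl": 0.5},
    {"value": 4.0,  "qualifier": "",  "mdl": 0.5},
    {"value": 0.5,  "qualifier": "U", "mdl": 0.5},  # non-detect reported at the MDL
]

def substituted(records, rule):
    """Substitute non-detects ("U") with zero or half the MDL before summarizing."""
    subs = {"zero": lambda r: 0.0, "half_mdl": lambda r: r["mdl"] / 2.0}
    return [subs[rule](r) if r["qualifier"] == "U" else r["value"] for r in records]

for rule in ("zero", "half_mdl"):
    values = substituted(results, rule)
    print(rule, round(sum(values) / len(values), 3))  # the mean shifts with the rule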
How much of the data were actually validated, and to what extent? 100% full CLP validation
lends more confidence to the data set than only 10% validation of a summary level package, with
no raw data examined. If a percentage check were to be applied, what data were assessed, and
which were not? What conclusions and qualifications were made and applied across the entire
data set? Was there a semi-automated screen, or full manual validation, or both? How do these
agree? These amounts may vary depending on individual projects' needs. The CLASS
contractor applies the DAT check to 100% of the electronic data.
What decisions don't require validated data? If you are interested in patterns or relative
performance, rather than ultimate decision-making, perhaps validated data aren't required.
How were outliers treated in the data set? The originators should state if there were any data
"censored," or not used for some reason. The CLP deliverables package must indicate data that
failed acceptance criteria, and the attempts made to correct laboratory problems.

-------
CONCLUSION
The CLP specifies a known level of data quality and provides quality indicators and a system
for verifying and maintaining established quality levels. Other analytical programs may start
from scratch to establish project-specific DQOs, and to set up all the programs necessary to
qualify analytical data, including auditing programs, data validation systems, etc. With data
specifically qualified by CLP programs, secondary data users can readily determine the quality of
the primary data.
REFERENCE LIST
"EPA Quality Manual for Environmental Programs." EPA Order 5360.1 CHG 1 (July 1998),
USEPA, Office of Research and Development, National Center for Environmental Research and
Quality Assurance, Quality Assurance Division, Washington, DC.
"EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations."
EPA QA/R-5, External Review Draft Final, (October, 1998) USEPA, Office of Research and
Development, National Center for Environmental Research and Quality Assurance, Quality
Assurance Division, Washington, DC.
"USEPA Contract Laboratory Program National Functional Guidelines for Organic Data
Review," EPA540/R-94-012 (February, 1994)
"USEPA Contract Laboratory Program National Functional Guidelines for Inorganic Data
Review," EPA540/R-94-013 (February, 1994)

-------
ORD's Science Information Management
System - Support for Data Usability
Linda Kirkland, Ph.D., Robert Shepanek, Ph.D. and
Gary Collins, Ph.D.
USEPA Office of Research and Development
Introduction: Implementation Planning Process
Goals and vision derived from
strategic documents and
program plans.
Definition of coordination and
leadership roles.
Articulation of guiding
principles used throughout
SIMCorB activities.
System IM requirements derived from user and client needs.
Assessment of current IM organization and ORD activities.
Articulation of an IM system vision and architecture.
Definition of specific implementation projects.
ORD/SIMCorB

-------

Strategy: ORD Strategic Plan
Develop scientifically sound approaches to assessing and
characterizing risks to human health and environment.
Integrate human health and ecological assessment methods
into a comprehensive multimedia assessment technology.
Provide common sense and cost-effective approaches for
preventing and managing risks.
Provide credible, state-of-the-art risk assessments, methods,
models, and guidance.
Exchange reliable scientific, engineering, and risk
assessment/risk management information.
Provide leadership in identifying, assessing, and preventing
emerging environmental issues.
ORD/SIMCorB

Strategy: IM Component Goals
Planning
~	ORD will integrate IM planning into its research planning
process to ensure ORD information is made available.
Awareness
~	ORD will provide the awareness, tools, and services needed
to make stakeholders aware of ORD information.
Access
~	ORD will provide communication paths and equipment
required to allow stakeholders access to ORD information.
Usability
~	ORD will provide the planning, policies and standards,
training, and user tools required to make ORD information
available to stakeholders.
ORD/SIMCorB

-------
IM Coordination Board
~ Charter and Organization

~ ORD—Chief Information Officer

~ SIMCorB participation

~ Project Management Strategy

~ SEAC—Executive Sponsorship of SIMCorB projects


ORD/SIMCorB
Science IM Coordination Board
~ SIMCorB Standing Sub-Groups
~	Requirements Definition and Planning (NCEA and NERL, co-leads)
~	Data Administration and Quality Assurance (NCERQA lead)
~	Systems Engineering and Operations (NRMRL and ORMA, co-leads)
~	Advanced Technology Evaluation and Modeling (NERL lead)
~	Science Direction (NHEERL and OAA, co-leads)
~	Outreach and Liaison
ORD/SIMCorB


-------
Science IM Coordination Board
~ SIMCorB Operations
~	Evaluation and adoption of scientific information
management policies.
~	Recommendations on priorities and strategic IM direction.
~	Coordination and leadership on cross-cutting issues and
multi-project initiatives.
~	Sponsorship, oversight, and management of IM
development projects.
~ Information Management
~	Quality of ORD scientific information must be known.
~	Information systems/documentation demonstrating quality
must be developed.
~	Methods and costs must be recognized and documented.
~	Scientific data must be made publicly available in a timely manner.
ORD/SIMCorB

-------
Guiding Principles
~ Policy
~	ORD will be a catalyst for information standards.
~	ORD will identify, prototype, and produce information
standards internally, within EPA, and across other agencies.
~	ORD will adopt and/or re-evaluate existing agency IM
policies and procedures.
~	ORD IM must be cost-effective in deployment of information
technology and associated procedures.
ORD/SIMCorB
Guiding Principles
~ Data and Data Administration
~	Metadata should be created for all data; data should not be
distributed without accompanying metadata.
~	All projects are subject to documentation, quality control,
archival, and directory recording policies adopted by ORD.
~	Implement common definition and tracking of information
pedigree, archiving, quality, and physical security.
ORD/SIMCorB
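The first bullet above states that data should not be distributed without accompanying metadata. As a small illustration of that principle, the following sketch bundles records with a metadata description and refuses to release them otherwise; the field names are invented for the example and do not reflect an actual EIMS or SIMS schema.

dataset = {
    "metadata": {
        "title": "Example stream chemistry results",
        "originator": "ORD laboratory (illustrative)",
        "collection_purpose": "method comparison study",
        "quality_indicators": {"validated": True},
        "archive_reference": "placeholder",
    },
    "records": [
        {"site": "S-01", "analyte": "nitrate", "value": 1.2, "units": "mg/L"},
        {"site": "S-02", "analyte": "nitrate", "value": 0.8, "units": "mg/L"},
    ],
}

def distribute(ds):
    """Refuse to release data that lack an accompanying metadata record."""
    if not ds.get("metadata"):
        raise ValueError("dataset has no metadata; distribution is not allowed")
    return ds

distribute(dataset)  # passes because the metadata record travels with the data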

-------
Guiding Principles
~ System Design

~ Economy, efficiency, and interoperability should be key

criteria in system development.

~ IM systems must support the risk management/risk

assessment mission of ORD.

~ System designs should employ a common language to

define function and operation of system components.

ORD/SIMCorB
Guiding Principles
~ Operations
~	Operations should provide secure, quality services.
~	Operations should facilitate sharing of data and software.
~	Operations should enable effective communication of project
and scientific information.
ORD/SIMCorB

-------

Guiding Principles
~ Outreach and Liaison
~	ORD information should be made available to users in a
timely manner and in a usable format.
~	Facilitate collection and recognition of user requirements.
~	ORD should understand the scientific IM direction of other
organizations and promote collaboration where appropriate.
ORD/SIMCorB

ORD User Requirements
ORD/SIMCorB

-------
Priority Projects
Architectural Vision
Current Projects
Users
Guiding Principles
User Requirements
SIMCorB
Strategy
Collect data or generate data models
Analyze data and/or perform assessments
ORD/SIMCorB

-------

~ Requirements for ORD Research Scientists
~	Planning/Design
~	Data Access
~	Data Integrity
~	Modeling, Visualization, and Analysis Tools
~	Documentation
~	Queries and Reports
~	Methods, Indicators, and Reference Databases
~	IM Research Tools
ORD User Requirements
ORD/SIMCorB
ORD User Requirements
~ Requirements for ORD Executives and Managers

~ Researcher Support

~ Cost Efficiency

~ Cooperative Inter-office Efforts

~ Risk Assessment/Risk Management

~ Accountability

~ Publicity


ORD/SIMCorB

-------
ORD User Requirements
~ Requirements for External Users

~ Information Access

~ Data Integrity


ORD/SIMCorB

-------
Architecture Overview
[Diagram components: Information Object Types (Datasets and Databases; Software, Models, Systems and Tools; Documents and Methods); ORD SIMS (Scientific Information Management System); Long-Term Archives System; EPA Libraries; OMIS/LIPS with QA Info; 13 EPA National Systems; EIMS Master Metadata; OMIS Yearly ORD Research Products; EPA Environmental Data Registry; Domain-Specific Systems]
ORD/SIMCorB
Priority Projects
Architectural Vision
Current Projects
Users
Guiding Principles
SIMCorB
Strategy
[Diagram: SIMS components]
SIMI - Scientific Information Management Interface
GEOSIM - Geographic Information Model: GIS, spatial analysis and visualization tools
MODSIM - Modeling Component: modeling algorithms
STATSIM - Statistical Toolkit: statistical analysis tools
SDFW - SIMS Data Format Wizard: data converters, subsetters, aggregators
METADATA
DATA: datasets and databases
DOCUMENTATION: project documentation, bibliographic information, education/training


-------

Architecture: Data and Information
~ Data and Information Architecture
~	Health-General
~	Hazard Identification
~	Dose-Response
~	Exposure Assessment
~	Health-Risk Characterization
~	Ecological-General
~	Problem Formulation
~	Characterization of Exposures
~	Characterization of Effects



ORD/SIMCorB


Architecture: Function
~ Function Architecture
~	Data administration and security
~	Project definition and tracking
~	Metadata management
~	Human health and ecological risk assessments
~	Spatial analysis
~	Development of methods, models, and indicators
~	Report publication
~	Data distribution
~	Administration



ORD/SIMCorB

-------

Architecture: Function
Risk Assessment and SIMS
[Diagram: Steps in Risk Assessment and SIMS Database Components]
ORD/SIMCorB
Architecture: Network
~ Network Architecture

~ ORD Central Computing Network

~ ORD Laboratories and Centers

~ Public Access to ORD Information


ORD/SIMCorB

-------

Risk Assessment and SIMS
[Diagram: Steps in Risk Assessment (Planning, Problem Formulation, Analysis with Characterization of Exposure and Characterization of Effects, Risk Characterization, Risk Management) mapped to SIMS Database Components (Directory, Catalog, Dictionary, Analytical Database); annotations: identify and document data sources and models; load selected datasets into database and develop analytical products; select data and models and provide full documentation]
ORD/SIMCorB

-------
Architecture: Organization

~ Organizational Architecture

~ NHEERL

~ NCERQA

~ NCEA

~ NERL

~ NRMRL


ORD/SIMCorB
Priority Projects: Organization
~ Activities Supporting the Organization of ORD Scientific
Information Management
~	Architecture for Development of the Knowledge Base for
Scientific Information Management Across ORD.
~	Enhance Capacity for Scientific Information Management
Coordination in ORD Laboratories and Centers.
~	Fully Develop SIMCorB Coordination and Advisory
Capability.
ORD/SIMCorB

-------
Priority Projects: Policy
~ Projects Supporting Scientific Information Management Policies,
Procedures, and Standards
~	Develop Data Administration and Quality Assurance
Standards for Scientific IM.
~	User Requirements and Policies/Standards Development.
~ Projects Supporting Outreach Activities for Scientific Information
Management
~	Develop a SIMCorB Web Site to Publicize and Promote
Awareness of Scientific IM Activities within ORD.
~	Develop a Strategy to Promote Partnerships with Other
Organizations.
ORD/SIMCorB
Priority Projects: Outreach
ORD/SIMCorB

-------
Priority Projects: Development
~ Projects Supporting Development of Common System
Components for Scientific Information Management
~	Implement Architecture for the Scientific Information
Management System (SIMS).
~	Implement Architecture to Support Long-term Scientific Data
Archives.
~	Implement Architecture to Support System Integration with
External Systems.
~	Evaluate New and Emerging Technologies for Scientific IM.
~	SIMS Proof of Concept Project—Information Management
for Endocrine Disruptors.
~	Current Development Project for Integration of Scientific
Data into SIMS.
ORD/SIMCorB

-------

-------
Environmental Laboratory
Advisory Board
Issues on NELAC Implementation
Managing Quality Systems for Environmental Programs
March 1999

-------
NELAC Structure

Environmental Laboratory
Advisory Board
Board of Directors
Accrediting Authority
Review Board

-------
ELAB Overview
¦	Established in accordance with the Federal Advisory
Committee Act (FACA)
¦	Develop recommendations regarding requirements for
accreditation of environmental laboratories
¦	Provide analyses, conduct reviews, produce reports, and
perform all other activities necessary
¦	Fairly reflect the opinions and positions of the affected
public

-------
NELAC Charter
ELAB advises EPA and
NELAC on matters affecting
the interests of the regulated
laboratories and other
interested parties.

-------
Composition
¦	Environmental laboratory industry
¦	Regulated community
¦	Environmental public interest groups
¦	Academia
¦	Local government
¦	Indian tribes
¦ Laboratory assessment bodies

-------
FACA/ELAB Requirements
¦	Open meetings
¦	Opportunity to file comments or make
statements
¦	No compensation

-------
Current ELAB Issues
¦	NELAC Implementation
¦	Interim Status
¦	PBMS
¦	Audit Checklists
¦	Small Lab Issues
¦	Sample Preservation

-------
ELAB Members
¦	J.Wilson Hershey
¦	Janet Hall
¦	Kathy J. Dien Hillig
¦	William G. Kavanagh
¦	Gary Kramer
¦	Jerry Parr
¦	Patricia O'Brien Pomerleau
¦	Romano Travota
¦	Michael J. Smolen
¦	Allen W. Verstuyft
¦	Frieda White
¦	Sandra Wroblewski

-------
ELAB Recommendation on NELAC Implementation
¦	NELAC standards become effective and
enforceable one year after adoption
¦	For the first group of laboratories, the 1999
standards be used for compliance and the
related timelines for acceptance of
applications be adjusted accordingly

-------
Rationale for ELAB
Recommendation
¦	Standards approved in voting sessions
during the Annual meeting
¦	No specification for when the newly
approved standards become effective
¦	1999 Standards are "better"
¦	Implementation schedule can allow

-------

ELAB Proposed Implementation
Schedule
¦	6/99: First accrediting authorities approved
¦	7/99-9/99: Laboratory applications in first round
processed
¦	9/99-6/00: Accrediting authorities review all first
round applications and perform many on-sites
¦	7/00: First round of laboratories approved and
announced
¦	8/00-6/01: Remainder of on-sites completed for first round

-------
Issues Presented by ELAB
Proposal
¦	SOPs may need to be modified
¦	Regulations may need to be modified.
¦	State-provided copies of the standards delayed
until after June 1999
¦	New checklist for the laboratory on-site
assessment
¦	Laboratory applications delayed until July 1999
¦	Interim status provisions require modification

-------
Changes to Standards Since 1997
¦	More flexibility for labs
¦	Well rounded PT program
¦	Qualifications for Technical Directors
¦	Exceptions for small labs
¦	More stringent training for assessors
¦	Details
¦	Clarity

-------
Increased Flexibility
¦	Calibration
¦	MDL

-------
Details
¦	Calendar vs. working days
¦	Use of logo
¦	Glossary
¦	etc.

-------
ELAB Recommendation on Interim Status
¦	Interim status not be recorded in database

-------
Rationale for ELAB
Recommendation
¦	Allow 2 years for first round
¦	Allow AA to prioritize labs
¦	Requires full conformance except for on-
site
¦	Prevents unwarranted marketing advantage

-------
Closing Thoughts
¦	On-site details finalized by 1999
¦	States must be able to address one-year timeline
¦	Many resources available to help labs
~	NELAC Website
~	State Websites
~	Catalyst
~	Lab Associations
¦	Current delay in schedule is marginally acceptable

-------
Do you have a NELAC Issue? Call me!
or Wilson, or Gary, or Janet, or ...
Jerry L. Parr
The Information	Resource	for
Catalyst Information Resources, L.L.C.
303-670-7823
catalyst@eazy.net
www.catalystinforesources.com

-------




8

-------
DAUBERT GUIDELINES
Determine...
¦	The "falsifiability" of the theory: is it testable and has it been tested?
¦	The "known or potential error rate" associated with applications of the theory
¦	Whether the findings have been subjected to peer review and publication, and
¦	The "general acceptance" of the science being offered
GENERAL ELECTRIC v JOINER
(1997)
¦ Expanded the triers' latitude to determine
whether Daubert's specific factors are, or are
not, reasonable measures of reliability in a
particular case.
KUMHO v CARMICHAEL (1999)
• Reinforced the Joiner decision by upholding the District Court's exclusion of the expert evidence at hand: the District Court did not question Carlson's qualifications, but excluded his testimony because it initially doubted his methodology and then found it unreliable after examining the transcript in some detail and considering respondents' defense of it.
SUMMARY
• Quality Assurance and Peer Review will play
an increasingly important role because they
have been identified as specific factors to be
considered by the courts and provide some
flesh on the bones of the Daubert decision.

-------
The Role of The
Quality Expert In Legal Decision Making
By David E. Preston
Partner, Varnum, Riddering, Schmidt & Howlett LLP
SUMMARY
All too often the attorney and technical expert do not understand how the
weaknesses and strengths of technical data may contribute to the success of a legal
decision-making strategy. Communication between the attorney and quality expert at the
earliest stages of strategy development concerning objectives, data reliability, alternative
approaches, and available resources will result in better legal decision-making.
THE QUALITY EXPERT
In the past decade sensational trials and the popularity of legal themes in
television heightened professional and public awareness of the role and potential
problems faced by the quality expert in trial proceedings. As a direct result of this
attention quality professionals may tend to give high priority to that brief period of
activity which precedes testimony at trial and performance during direct and cross-
examination. The significance of these activities should not be minimized. These
activities and preparation for testifying as an expert can shape the objectives and scope of
work performed by the quality professional for months and perhaps years before the
relatively brief period of trial. Quality professionals who offer their services as expert
witnesses may suggest to potential clients that their value lies in their experience and
credentials as presented before the trier of fact. Although these qualifications are critical,
the most important work of a quality expert in preparing for trial occurs outside of the
courtroom in preparations conducted before trial. This is also true in a host of routine
legal and technical decision-making processes which do not involve the types of trial
court proceedings which are more familiar to the general public.
LITIGATION STRATEGY
In the early stages of preparing litigation strategy the quality expert and the
attorney can assist each other by reviewing and discussing available data resources, legal
and technical theories, and information needs. The quality expert should understand that
while counsel may have a firm grasp on the procedure for qualifying and introducing
Page 1

-------
evidence at trial, he may have only the most limited understanding of reliability of
available data, and opportunities for developing better information before trial. Counsel
can assist the expert by providing an accurate description of client objectives, available
information, schedule, and the resources available to develop additional data. Such
communications between the attorney and the quality expert can strengthen the litigation
strategy.
LEGAL DECISION-MAKING
This approach also applies to a range of legal decision-making processes
involving business transactions or similar contexts other than litigation. Common
examples of these processes include environmental due diligence in the sale or purchase
of a business or real estate, and acquiring environmental permits and licenses. A third
example is the evaluation of environmental data in development of new regulations. In
each of these examples, a clear grasp of objectives, available resources, information, and opportunities for development of additional information contributes to a comprehensive and stronger strategy. Take environmental due diligence as an example: the environmental cleanup laws passed in the early 1980s made industry keenly aware of current environmental conditions and past hazardous substance handling practices at candidate sites for acquisition. The potential that a purchaser, by
acquiring property, could become liable for the hazardous substance disposal activities of
his predecessor at that location, quickly caused purchasers to examine the history of past
practices and current environmental conditions prior to completing the transaction.
ENVIRONMENTAL INVESTIGATION
The scope of these environmental investigations is as wide-ranging as the
complexity of the transactions which they involve. At the highest level the acquirer must
understand the extent of environmental liabilities and compliance obligations of the
target, perhaps involving many facilities in many countries. At the lower end of the
spectrum the acquirer must, for example, determine if there has ever been an
underground storage tank on the property. Attorneys are often involved at the initial
stages of these efforts because they help shape other aspects of the transaction and can
recommend contractual methods of sorting out responsibility for addressing these
concerns. For example, funds can be escrowed, or sales price adjustments can be made,
to account for environmental remediation costs. The attorney for a buyer or a seller will
normally consult technical experts to assist in the evaluation of environmental conditions
at an acquisition target.
These evaluations pose data acquisition and evaluation challenges which are
Page 2

-------
similar to those confronted in the traditional litigation setting. Technical experts may be
presented with an extensive or very limited body of technical data concerning conditions
on the property. They may be faced with time limits to conduct the evaluations and
gather necessary data. There are far-reaching legal and financial consequences
associated with the evaluations. For example, if investigators fail to discover
environmental problems, a potential purchaser may end up shouldering all or part of the
burden of an environmental cleanup. A prospective purchaser may face difficulties in
establishing legal defenses to liability afforded by federal and state environmental
cleanup laws.
STANDARDIZATION
The risks associated with these undertakings resulted in the standardization of
these environmental investigations. ASTM standards for due diligence investigations were developed that describe in detail the level of investigation that should be conducted. These standards enable the legal and technical communities to predict the
required level of work and assure themselves of the sufficiency of the investigation
efforts. The standard for due diligence investigations generally does not call for
generating new environmental testing data. This is not to say that performance in
accordance with the standard is the last word on evaluation of environmental conditions.
Discovery of a contamination problem will lead to a choice to conduct further
investigations including the likely possibility of sampling and testing. The
standardization of due diligence investigations illustrates how legal decision-making
needs have influenced development of a set of data quality criteria aimed at supporting
the efficient completion of business transactions.
PERMITTING
Another area where data quality evaluations are closely related to a legal decision-
making process is environmental permitting. Federal and state statutes often require
extensive technical information gathering to support decisions by regulators to issue or
renew environmental permits. The scope and extent of data gathering are dictated by the
affected media, types of contaminants involved, and the significance of potential impacts.
To illustrate the combined data quality and legal issues, consider the air emissions of a
proposed new manufacturing facility which seeks to obtain an emissions permit to
construct a source of air contaminants. A permit application for a new source of air
contaminants must project the types, concentrations, and amounts of emissions. Based
on these projections, regulators will make findings concerning applicable regulatory
requirements and whether a permit can be granted. Data supplied by the applicant are
used to establish mass and rate emission limits in the form of permit conditions.
Page 3

-------
Accurate projections of emissions must be developed by the applicant to satisfy permit
application requirements, determine which control requirements may apply, determine
associated operating and construction costs, and make sure that compliance can be
achieved in practice. Failure to do so could result in project delays, permit application
rejection, or non-compliance if emission limitations cannot be achieved.
The tools available to environmental technical professionals to project emissions
are wide-ranging. Emission projections can be based on contemporary testing at similar
facilities, mass balance determinations, and emission factors available in scientific
publications. Depending on the available resources, very accurate projections of
emissions can potentially be made. The quality expert, in making a judgment about the
data to use as the basis for projecting facility emissions, might easily be able to help
solve the legal problem of getting a permit with available published data, but ultimately
find that the facility is so dissimilar from those used to determine the emission factors
that compliance cannot be achieved in practice. Here again, planning and available
resources can have a significant effect on the chain of legal and technical decisions
involved in obtaining a permit, and later in complying with permit conditions. The
attorney must make the data quality expert aware of applicable legal requirements and
resources available to meet data collection objectives. The data quality expert must
provide the attorney with information concerning limitations of available data. All of the
participants must understand that business and compliance objectives will shape the
information gathering and evaluation tasks of the quality experts. In the ideal situation
the problems of data quality will be discussed at the planning stages rather than
discovered during operation.
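To make the emission-factor approach described above concrete, the sketch below computes a projected annual emission from an activity rate, a published emission factor, and an assumed control efficiency. The numbers, units, and function name are hypothetical illustrations, not values from any permit application.

# Illustrative emission projection (hypothetical numbers): the common
# emission-factor form E = A x EF x (1 - control efficiency).

def projected_emissions(activity_rate, emission_factor, control_efficiency=0.0):
    """Annual emissions = activity rate * emission factor * (1 - control efficiency)."""
    return activity_rate * emission_factor * (1.0 - control_efficiency)

# Hypothetical inputs: 50,000 tons of product per year, 2.3 lb VOC per ton
# from a published factor, and 85% capture/control efficiency.
tons_per_year = 50_000
lb_voc_per_ton = 2.3
control = 0.85

annual_lb = projected_emissions(tons_per_year, lb_voc_per_ton, control)
print(f"Projected VOC emissions: {annual_lb:,.0f} lb/yr ({annual_lb/2000:.1f} tons/yr)")

A projection of this kind is only as good as the emission factor behind it, which is exactly the data quality judgment discussed above.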
REGULATORY PROGRAMS
A third example of the role of data quality experts in legal decision-making is the
development of new regulatory programs in which information concerning the emission
of contaminants from industrial processes is collected and considered for the purpose of
establishing emission control standards. The development of the Maximum Achievable
Control Technology (MACT) standards, mandated under the Clean Air Act
Amendments of 1990, provides a good example of the data quality problems faced by
industry and regulators in establishing new requirements for affected industries.
In general, the law requires that the U.S. EPA establish standards regulating the
emission of air toxics for specified types of industries based on the level of control
achieved by the top twelve percent (12%) of industry operations in an industry category.
The U.S. EPA is faced with the task of collecting information of varying quality from the
affected industry and making determinations about the reliability of the data. The legal
Page 4

-------
consequences of establishing these requirements are significant, including the
requirement to install new emission controls at affected industries.
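As a rough illustration of the "top twelve percent" provision described above, the sketch below frames a MACT-floor-style benchmark as the average emission rate of the best-performing 12 percent of reporting sources. The data and the simple averaging rule are assumptions for illustration only; they are not the Agency's actual rulemaking procedure.

# Illustrative sketch (not EPA's actual procedure): a "MACT floor" style benchmark
# taken as the average emission rate of the best-performing 12% of sources
# reporting data for a category. All values are hypothetical.

def illustrative_floor(emission_rates, top_fraction=0.12):
    """Average emission rate of the best-performing (lowest-emitting) sources."""
    if not emission_rates:
        raise ValueError("no emission data supplied")
    ranked = sorted(emission_rates)                  # lowest emitters first
    n_best = max(1, round(len(ranked) * top_fraction))
    best = ranked[:n_best]
    return sum(best) / n_best

# Hypothetical survey responses (lb of pollutant per ton of product)
reported = [0.8, 1.2, 0.5, 2.4, 0.9, 1.7, 0.6, 3.1, 1.1, 0.7, 2.0, 1.5]
print(f"Illustrative benchmark: {illustrative_floor(reported):.2f} lb/ton")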
Recognizing the potential effects of these new regulatory programs, industry
groups have formed to work with and/or monitor the U.S. EPA information collection
activities so that their constituents understand the information collection efforts and
recognize that the quality of information provided will bear on the outcome of the U.S.
EPA's efforts. Data quality concerns have been given high priority by industry groups
working with the Agency. For example, trade associations have worked closely with
their constituents urging them to respond completely and accurately to the U.S. EPA
questionnaires. Organized industry groups have formed many years in advance of the
U.S. EPA information requests to review the questionnaires and consider issues of data
availability.
The MACT development process also illustrates other legal decision-making
concerns that can affect data quality. One industry group found that a key roadblock to
collection of actual test data is the difficulty of reaching an agreement with the
government under which the tested industry groups would be immune from penalties if
violations were discovered in the course of testing. Industry trade associations that have
attempted to collect information from their constituents have had to consider the
development of safeguards to protect the identities of respondents who would otherwise
be unwilling to share the results with their competitors.
CONCLUSION
Data quality issues are encountered routinely in connection with legal decision-
making. The unique facts and circumstances which drive the decision-making process
will often set the stage for considering data quality objectives. In decisions involving
environmental requirements, careful establishment of quality requirements can improve
the potential for success in meeting business and compliance objectives. Frequently
technical and legal partners in these endeavors are unaware that there may be alternative
and better solutions to solving these problems. In almost all cases legal counsel and the
quality expert will better understand the risks and benefits of data quality issues through
discussion of these matters in the planning stages leading to the legal decision.
Page 5

-------
EXTENDED ABSTRACT
RECONCILIATION OF
NON-AUTHENTIC ANALYTICAL DATA
Jeffrey C. Worthington, CQA, CQM
Quality Assurance Division
National Center for Environmental Research and Quality Assurance
USEPA Office of Research and Development
401 M Street S.W. (89724R)
Washington, DC 20460
Analytical laboratories, including environmental laboratories, strive to ensure the delivery of
analytical data which meet their customer's requirements. Assurance that the data are authentic
is often simply assumed by the user and often also assumed by the data producer. In fact,
sometimes the data are not authentic. This problem is not unique to the analytical or
environmental community.
Many organizations and individuals are responsible for developing and delivering data that are
authentic. These include research organizations, individual researchers, government agencies at
all levels, and the regulated community. A host of terms are used in reference to non-authentic data. Some people use "data integrity" as an all-encompassing term for all of these issues. When data are intentionally reported as non-authentic, data users' concerns are greater because more is at stake than the quality of the data. When non-authentic data are reported purely in error, the problem is solely one of quality; it remains a quality problem even when the reporting is intentional. Intentional reporting is also termed scientific misconduct, which others link to research integrity, and is often discussed in terms of "ethical behavior."
This presentation explores the role of the quality manager in attempting to identify and separate
the quality issues from the other issues. If that is possible, the quality manager can then develop
and implement a system to "reconcile" the quality of the data for possible use of the data. This
presentation provides a systematic approach for such a process.

-------
r

-------
Facility Manuals: Documentation Unique to Research Facilities
Shirley Wasson, M.S.
Quality Assurance Staff
Air Pollution Prevention and Control Division
National Risk Management Research Laboratory
Office of Research and Development
Environmental Protection Agency
Research Triangle Park, NC 27711
In our Division, planning for cutting-edge research projects is generally handled differently from
that performed for the large multi-year, expensive projects often associated with EPA. Many of
our projects are conducted in specially-constructed facilities in which the location, equipment,
instruments, documentation, procedures, and other features generally stay the same; however, the
projects conducted within the facilities change with time.
Our projects generally proceed in incremental steps. Each step requires planning, often building
on the results of those preceding. While no one disputes the value of planning, researchers often
find it difficult to justify the time to produce a full-blown "R-5 style" QA plan* for each step of
the project. They express concern that it takes more time and resources to write the plans than it
does to carry out the work, that producing these plans are an inefficient use of scarce resources,
and that the many individual plans contain repetitious material. Examples of such material are
laboratory layouts, descriptions of facilities and instrumentation, descriptions of facility
documentation such as log books, instrument manuals, health and safety manuals, and training
records, and preparation of procedures such as facility operating procedures, calibrations, quality
control procedures, and health and safety protocols.
Research data producers and users understand the value of the information and the need for it to
complete the study record, particularly in long-term, high-profile, or sensitive projects.
Therefore, APPCD decided to gather the material common to all the projects performed in their
facilities into living documents, and to reduce the plan-writing requirements to address only those
elements which change from step to step. This presentation explores how the facility manual
concept is implemented in our Division. It discusses our experiences with the production and use
of the facility manual, successes, failures, and lessons learned.
*EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations,
EPA QA/R-5, USEPA, Quality Assurance Division, Washington, DC 20460, 1994.

-------
Documentation Unique to Research Facilities
Shirley Wasson
Air Pollution Prevention and
Control Division (APPCD)
NRMRL, ORD
Acknowledgements
~	Judith Ford, former APPCD Quality Assurance
Officer, for the initial concept of facility manuals
~	Laura Beach, ARCADIS, APPCD's in-house contractor
Quality Assurance Manager, for coordinating and
overseeing contractor personnel during preparation of
facility manuals
~	All the APPCD facility managers and operators for
their time, energy, and resources. Without them,
facility manuals would not exist.
Facility Manuals in the APPCD
~	What they are
~	Why they exist
~	Who prepares them and who uses them
~	When they are prepared
~	Where they are kept
~	How they are prepared and used

-------
What is a Facility Manual?
~ A living document
containing the generic
information about a
research facility, written
by the managers and
operators, accessible to
assessors of projects
performed by the
facility, and to users of
generated data.
Facility Manual Generic Index
~	1.0 Introduction
~	2.0 Facility Charter
~	3.0 Management
~	4.0 Description
~	5.0 Equipment
~	6.0 Documentation
~	7.0 Operation
~	8.0 Quality Assurance
~	9.0 Quality Control
~	10.0 Data Handling
~	11.0 Corrective Action
~	12.0 Health and Safety
Appendices
~	A.	Current Facility Personnel
~	B.	Operating Procedures
~	C.	Standard Methods
~	D.	Technical Systems Audit Checklist
~	E.	Performance Evaluation Audit Ranges

-------
Who Prepares Facility Manuals
¦ EPA owns all the facilities, but they may be
~	EPA managed and operated, or
~	Contractor managed and operated under Work
Assignments
~	If EPA managed, EPA Principal Investigators,
researchers, and analysts write the facility manual
~	If contractor managed, facility manuals written as a
task under work assignments for projects performed in
the facility
Large Scale Facilities in the APPCD
(Partial List)
~	Rotary Kiln Incinerator Simulator
~	Rainbow Horizontal Tunnel Combustor
~	Package Boiler Simulator
~	North American Package Boiler
~	Flue Gas Cleaning System
~	Large Indoor Air Quality Environmental Test Chamber
Rotary Kiln Incinerator Simulator

-------
Rainbow Furnace
Horizontal Tunnel Combustor
Package Boiler Simulator
North American Package Boiler

-------
Mobile Facilities in the APPCD
~	On-Road Diesel Emissions Characterization Facility
~	Instrumented Car

-------
Mobile Facilities
Instrumented Car

-------
Laboratory Facilities (Partial List)
~	Organic Analysis
~	Coatings
~	Biomass
~	Mercury
~	Small Environmental Chambers (Indoor Air)
~	Metrology
~	Inorganic Analysis
Organic Analysis
Coatings Laboratory

-------
Coatings Sampling and Analysis

Coatings Laboratory
Biomass Laboratory

-------
Mercury Laboratory
Metrology Laboratory

-------
Inorganic Analysis Laboratory
Why Prepare Facility Manuals
Research planning in APPCD is often different from that of other parts of the agency
~	Planning is generally for smaller, cutting-edge research projects
~	Projects generally proceed in incremental steps
~	Next steps build on results of preceding step
~	QAD guidance documents address large projects
~	Planning is different for mega-dollar projects
For Proof-of-Concept Research...
~ Principal investigators say:
¦	Full-blown "R-5 style" QA plan too complex
¦	Priorities not compatible with small projects
¦	Costs more time and resources to write the plans than to carry out the work
¦	Individual plans contain repetitious material

-------
Why Prepare Facility Manuals?
~	Information generated only once
~	Available in one document
~	Once prepared, available for easy reference
~	Citable in abbreviated Test/QA plans
Who Prepares and Uses Facility Manuals
~	Facility Managers and Operators
¦	Prepare manuals and procedures, keep manuals current
¦	Train new personnel
~	Project Managers, Data Users
¦	Archive for facility layout, operations, procedures, quality
assurance, calculations
¦	Referenced in test/QA plans
~	Auditors
¦	Source of facility information
Preparation of Facility Manuals
~	Phase I: Construction of Facility
¦	Manual begins as task in statement of work
¦	Document layout, engineering systems, instruments
¦	Expense borne by construction project
~	Phase II: Setup and Operation
¦	Establish logbooks, documentation
¦	Select software, modify as needed
¦	Establish and document procedures
¦	Expense borne by data collection projects

-------
Maintenance of Facility Manuals
~	Phase III: Concordance
¦	EPA managed: EPA manager, QA officer, safety officer
¦	Contractor managed: appropriate EPA approvals, plus contractor task manager, QA officer, safety officer
~	Phase IV: Revisions
¦	Managers and operators keep manual up to date
¦	Hand-written entries, formal revision as necessary
¦	In response to audits
~	Phase V: Retirement
¦	Kept for 5 years after decommissioning of facility
In Conclusion
~	Facility manuals a useful tool in decreasing paperwork
required in planning research projects
~	Organization, layout, operating procedures, QA
requirements collected in one document for project
data users
~	Provides archive of facility history
~	Training tool for new facility personnel
~	Source for auditing information

-------
Auditing New Technologies:
Issues and Approaches to QA in an R&D Setting
Nancy Adams, U.S. EPA, APPCD/NRMRL, Research Triangle Park, NC
In the U.S. EPA's Office of Research and Development, many of the projects that quality
assurance (QA) staff members audit involve novel methods, recently developed and not yet
standardized. This precludes the approach of using standard acceptance criteria and well-
established audit methods and materials. There is, however, a real and important need for QA
support to the projects. This presentation illustrates some useful approaches to auditing research
projects, using examples from new instruments and techniques for the measurement of fine
particles.
The recently enacted National Ambient Air Quality Standard for fine (<2.5 micrometer)
particle and ozone concentrations has accelerated the development of new methods and
instruments for measuring fine particle concentration and size in air. However, several factors
complicate the measurement of fine particles in air (aerosols):
•	Different measurement methods are based on different particle characteristics (e.g., light
scattering versus speed of migration of a charged particle in an electrical field).
Sometimes it is unclear how the measured value corresponds to physical characteristics
such as actual size and density; aerodynamic size, for example, is determined by inertial and viscous forces, but smaller denser particles could have the same aerodynamic size as larger particles that are less dense (see the sketch following this list).
•	Properties of specific particles affect their measurement (e.g., shape, density, charge,
presence of volatile layers, chemical reactivity).
•	Properties of the carrier gas affect particle measurement (e.g., relative humidity, flow,
temperature).
•	Size calibration standards, such as the widely used National Institute of Standards and
Technology (NIST) polystyrene latex (PSL) spheres, may not accurately match the
characteristics (e.g., shape, density, absorbance) of the aerosol to be measured. Many
aerosol measurement systems assume that all particles are spheres.
•	There are no widely recognized and readily available primary standards for mass
concentration or number measurements, particle size distribution, or particle density.
•	Mass measurements may be confounded by adsorption, desorption, or chemical reactivity.
In cascade impactors, under some conditions, particles can bounce from the stage in
which they would properly be collected to a subsequent stage in which smaller particles
would be collected; this particle bounce phenomenon distorts size distribution
measurements.
•	Particle count measurements, as well as size distribution measurements, may be
confounded by the number of particles in an airstream. A stream overloaded with smaller

-------
particles may show "phantom" large particles, due to coincident smaller particles in the
sensing portion of the counter. Likewise, in some systems, a large particle can create a
signal that is also counted as several smaller particles.
• Particles can be lost in the inlet or transfer lines.
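To illustrate the aerodynamic-size point flagged in the first bullet, the short sketch below converts a physical diameter to an approximate aerodynamic diameter using the unit-density-sphere convention, ignoring shape and slip corrections. The function and values are illustrative assumptions, not part of the original presentation.

# Illustrative sketch (simplified; ignores shape factor and slip correction):
# aerodynamic diameter = physical diameter * sqrt(particle density / unit density),
# i.e., the diameter of a unit-density (1 g/cm^3) sphere with similar settling behavior.
import math

def aerodynamic_diameter_um(physical_diameter_um, particle_density_g_cm3):
    unit_density = 1.0  # g/cm^3
    return physical_diameter_um * math.sqrt(particle_density_g_cm3 / unit_density)

# A 1.0 um particle of density 2.5 g/cm^3 and a ~1.58 um particle of unit density
# have roughly the same aerodynamic diameter.
print(aerodynamic_diameter_um(1.0, 2.5))   # ~1.58 um
print(aerodynamic_diameter_um(1.58, 1.0))  # ~1.58 um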
Given these complex problems in measuring aerosols and given the lack of primary
standards, this presentation discusses some practical approaches to ensuring aerosol measurement
quality, using traditional QA methods: planning (using the Data Quality Objectives process to
establish how good the measurements must be, preparing a QA/Test Plan), QC (checks of flow
measurements, checks of specific measurement characteristics), quality assessment (appropriate
types of assessments and timing of assessments), analyzing and reporting data (software and
electronic interface checks, data qualifiers), and quality improvement (using information gained
in the conduct of the assessment and the measurement program to develop better procedures, to
design better instrumentation, and to improve quality checks).

-------
Auditing Aerosol Measurements:
QA for R&D
Nancy Adams
ORD/NRMRL/APPCD/TSB
Presented at the Annual EPA QA Meeting, April, 1999
Increasing Interest in
Aerosol Measurements
¦	Revision to NAAQS
~	Measure < 2.5 micrometer particles
~	New instrumentation
¦	Health effects research
~	Particles and mortality
~	Particles and lung function deficits
~	Particles and infectivity
~	Particles and asthma/allergic response

-------
Differing Measurement
Methods
¦	Light-scattering
¦	Light blocking (turbidity)
¦	Relative coloration of filter
¦	Migration in electric field
¦	Mass on filter or stage
~	collected by filtration
~	collected by inertial mass methods
Particle Properties
Affecting Measurement
¦	Shape
¦	Density
¦	Charge
¦	Volatile coating
¦	Chemical reactivity

-------
Properties of Gas Stream
Affecting Measurement
¦	Viscosity
¦	Relative humidity
¦	Flow rate
¦	Temperature
¦	Turbulence
Other Confounders
¦	Agglomeration
¦	Precipitation
¦	Static attraction to walls, tubing
¦	Adsorption/desorption of volatile
components
¦	Chemical reactivity
~	Particle-particle
~	Gas-particle

-------
Measurement-Specific Confounders
¦	Particle bounce in cascade impactors
¦	Phantom particles (coincident small
particles)
¦	Phantom small particles (misclassification
of a large particle)
Available Standards
¦	NIST
~	PSL spheres
~	Glass spheres
¦	Variety of single size preparations
¦	Sold as liquid suspensions

-------
Standards not available for:
¦	Mass concentration (micrograms/cubic
meter)
¦	Number concentration (1,000,000/cubic
meter)
¦	Size distribution
¦	Particle density
QA for Aerosol Measurements
¦	Planning
~	DQO process
~	QA/test plan
¦	Quality control
~	Flow measurements
~	Specific measurement checks
¦	Quality assessment
¦	Analyzing/reporting
¦	Quality improvement

-------
Planning
¦	DQO process
~	State the problem
~	Identify the decision
~	Delineate inputs
~	Set boundaries
~	Establish a decision rule
~	Set error limits
~	Optimize design
¦	QA/test plan
Quality control
¦	Calibrations
~	Size
~	Flow
~	Temperature
~	Relative humidity
¦	Replicate measurements
¦	Instrument comparisons

-------
Sampling QC
¦	Probe - Mechanical defects
¦	Transfer lines - Loss estimate
¦	Pumps - Operational
¦	Flow meters - Range of use
Sampling QC (continued)
¦	Instrument
~	Diameter of interest
~	Units
~	Concentration limits
~	Diluter calibration
¦	Particle collection substrate
~	Interactions
~	Loading limit

-------
QC for Flow Measurement
¦	Mass flow controller (range varies)
~	Calibrate with NIST-traceable device
~	Calibrate over usage range
¦	Dry gas meter (>1 L/min)
¦	Wet test meter (>5 L/min)
¦	Roots meter (>5 L/min)
~	NIST-traceable bell prover
¦	Orifice (>1 L/min)
¦	Bubble flow meter (1 cm3/min to 25 L/min)
Quality Assessment
¦	TSAs
~	Early in project
~	May be done by project team
¦	PEs
~	Early in project
¦	DQAs

-------
Data Analysis and Reporting
¦	Programs to check software
¦	Measurement of differing sample sizes
¦	Data descriptors
~	Bias, precision, MDL
~	Calibrations
~	Temperature, RH
~	Audits
~	Maintenance, repairs
Quality Improvement
¦	Documented methods
¦	Documented timeframes
¦	Modified SOPs
¦	Publishing of improved methods

-------
References
¦	Factors Affecting Aerosol Measurement
Quality, pp. 130-145, in Aerosol
Measurement, K. Willeke and P.A. Baron
¦	http://www.epa.gov/ttn/amtic/qalisthtml,
Section 2.10, QA Handbook
Acknowledgements
¦	Les Sparks
¦	Richard Shores
¦	Paul Groff
Research Triangle Institute
¦	Dave Ensor

-------
Quality Assurance Project Plan
Development for the HSPF Model
Jean Chruscicki
Environmental Scientist
USEPA WT-15J
77 W. Jackson Blvd.
Chicago, IL 60604
Charles S. Melching
Hydrologist
US Geological Survey
221 N. Broadway
Urbana, IL 61801
Brian Bicknell
Senior Engineer
Aqua Terra Consultants
2685 Marine Way, Suite 1314
Mountain View, CA 94043

-------
SUMMARY
The following is a condensed Quality Assurance Project Plan (QAPP) for the Hydrological Simulation Program -
FORTRAN (HSPF) model. The model simulates surface water in an area where diverse biota and wild rice grow and which is critical to several Tribes, before and after construction of a proposed underground mine near Crandon, Wisconsin.
THE QUALITY ASSURANCE PROJECT PLAN - PROJECT DATA QUALITY OBJECTIVES
The formulation of the Project Data-Quality Objectives (PDQO's) includes the following seven steps: 1) Stating the
problem to be studied; 2) Identifying the decision that will be made; 3) Identifying the information needed; 4)
Specifying the boundaries to which the decisions apply; 5) Specifying how the environmental data will be
summarized; 6) Specifying acceptable error rates; and, 7) Selecting the most resource-efficient study design that
will achieve all of the PDQO's. A total of eleven questions evolved in the original QAPP, but the focus was then narrowed to five objectives that could realistically be answered (see 1(a) through 1(e) below). The five questions must be addressed by: 1(a) hydrologists familiar with lake-level fluctuations in northern Wisconsin; 1(b) biologists familiar with the growth of wild rice; 1(c) biologists familiar with the flora, fauna, and habitat of interest; 1(d), which is indirectly addressed by the model; and 1(e) hydrologists familiar with surface water flow in northern Wisconsin.
1(a) Identification of Surface-Water Bodies that Strongly Interact with Ground Water
Problem: According to the Environmental Impact Report (EIR) prepared by the mining company, a number of the
lakes and ponds in the vicinity of the proposed Crandon Mine are poorly connected to the primary aquifer and, thus,
would be minimally affected by drawdown resulting from mine operations. One lake is described as a 'perched
lake' with a silty clay lake bed above the water table through which little water percolates. Thus, the level of this
lake is considered to be independent of the level of the primary aquifer. Others are described as 'ground-water-
head-dependent' lakes. The water levels in these lakes are affected by the ground-water levels. However, the lake
beds limit percolation to the aquifer, maintaining a higher water level in the lakes than the surrounding water table.
The water budget for a lake in HSPF is modeled as AS = P +1+GW - E - O, where P is the precipitation directly
onto the lake, I is the surface and subsurface inflow from tributary areas and streams, GW is the deep ground-water
inflow or outflow (which would have a negative value), E is evaporation from the lake surface, and O is surface
outflow to a draining stream. Also, for the ground-water-head dependent lakes, the head difference between the
lake level and the water table may be too small under natural conditions to result in substantial flows from the lakes
to ground water. However, when the water table drops as a result of mining, the head difference could become
large enough to result in substantial flows from the lakes to ground water. This issue cannot be evaluated because
the available lake-level data do not reflect the effects of a substantially lower water table.
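For readers less familiar with HSPF's lake routing, the brief sketch below simply steps a lake water budget forward in time using the ΔS = P + I + GW - E - O relation above. The values are hypothetical and the code illustrates only the bookkeeping, not HSPF itself.

# Illustrative lake water budget bookkeeping (not HSPF code); all terms in
# acre-feet per month, hypothetical values. GW is negative when the lake
# loses water to ground water.

def update_storage(storage, P, I, GW, E, O):
    """Return next-month storage from dS = P + I + GW - E - O."""
    return storage + P + I + GW - E - O

storage = 1200.0  # acre-ft, hypothetical starting storage
monthly_terms = [
    # (P, I, GW, E, O)
    (90.0, 40.0, -5.0, 60.0, 30.0),
    (70.0, 35.0, -8.0, 80.0, 25.0),
    (50.0, 20.0, -12.0, 95.0, 15.0),
]
for P, I, GW, E, O in monthly_terms:
    storage = update_storage(storage, P, I, GW, E, O)
    print(f"End-of-month storage: {storage:.1f} acre-ft")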
Decision: Are the lakes listed previously sufficiently affected by ground-water level fluctuations that lake levels and
the overall water budget cannot be reliably simulated without considering ground-water interactions in HSPF? If
lake levels cannot be simulated without consideration of ground-water interactions, then future large-scale, ground-
water drawdown would be expected to have substantial effects on lake levels and water budgets. However, if
natural lake levels can be simulated without consideration of ground-water interactions, the effects of future large-
scale, ground-water drawdown on lake levels cannot be reliably evaluated because the HSPF model has not been
developed and evaluated under such conditions.
Information Needed: A long series of measured monthly lake levels is needed to determine if lake levels can be
reliably simulated without considering ground-water interactions.
Boundaries: The areas to which the decision will apply are Oak, Duck, Deep Hole, Skunk, and Little Sand Lakes.
2

-------
Data Summarization: The simulated time series of runoff will be summarized as a series of monthly lake levels
that will be compared with the measured series of monthly lake levels. If the average difference between the
measured and simulated lake levels (error) is less than or equal to X ft and the maximum error in simulated lake levels is less than or equal to X ft, then the hypothesis that the lake levels are not dependent on the surrounding ground-water levels is accepted. The items underlined above must be specified by a hydrologist.
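One way the comparison described above might be scripted is sketched below: compute the average and maximum absolute error between measured and simulated monthly lake levels and test them against tolerances. The series and tolerance values are hypothetical placeholders for the hydrologist-specified criteria.

# Illustrative check of simulated vs. measured monthly lake levels (hypothetical
# data and tolerances; the actual tolerances are to be set by a hydrologist).

def lake_level_errors(measured, simulated):
    errors = [abs(m - s) for m, s in zip(measured, simulated)]
    return sum(errors) / len(errors), max(errors)

measured_ft  = [1602.3, 1602.1, 1601.8, 1601.9, 1602.4, 1602.6]
simulated_ft = [1602.5, 1602.0, 1601.5, 1602.1, 1602.2, 1602.9]

avg_tolerance_ft, max_tolerance_ft = 0.3, 0.6  # hypothetical
avg_err, max_err = lake_level_errors(measured_ft, simulated_ft)
accept = avg_err <= avg_tolerance_ft and max_err <= max_tolerance_ft
print(f"average error {avg_err:.2f} ft, maximum error {max_err:.2f} ft, accept = {accept}")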
Acceptable Error Rates: The acceptable error rate in the decision has been factored into the selection of the
simulated lake-level-error tolerances given previously.
1(b) Effect of Runoff Changes on Wild Rice
Problem: Rice Lake holds the largest and densest stand of wild rice on an inland lake in Wisconsin. Wild rice is
highly dependent on a limited range of flow velocities, flows, water levels, pH, dissolved organic carbon, metals,
nutrients, and sulfate for reproduction and growth. It is of great cultural and economic value to the Sokaogon
Chippewa Community. Therefore, changes in hydrology or hydraulics resulting from the construction, operation,
and closure of the proposed Crandon Mine must not adversely affect the wild rice. This phase of the HSPF
modeling project will focus on runoff simulation and, thus, HSPF simulation will be applied to determine if various
phases of the mine development result in velocities and water levels substantially different from natural conditions.
Decision:
Are water levels in Rice Lake maintained at a range of x inches to x feet in mid- to late-April when the
grain begins to germinate such that the growth of the wild rice will not be impaired?
Are water levels in Rice Lake maintained at a range of x inches to x feet during emergent leaf stage May
to early July, and would the frequency of mid-summer flooding not substantially change such that the
growth of the wild rice would not be impaired?
Once the plants are in midseason, August to early September, will the frequency of water levels negatively
affecting growth substantially increase such that the increase will cause the plants to topple and lodge?
Once the plants have completed growth, mid-September, will the frequency of water levels negatively
affecting growth substantially increase such that the increase will batter the stems, tangle the leaves and
panicles and damage the seed crop? The items underlined must be specified by a biologist familiar
with the growth of wild rice.
Information Needed: A long time series (20 years or more) of lake levels corresponding to hypothetical runoff
from the Swamp Creek watershed under natural conditions are needed to determine the range in lake levels at key
times in the growing season resulting from natural fluctuations in runoff.
Boundaries: Decisions will apply to Swamp Creek and Rice Lake.
Data Summarization: The simulated long-term time series of runoff for natural and mining conditions will be
summarized as frequency distributions of lake levels during the key times of the growing season. The changes in
the frequency of water levels negatively affecting growth and in high and low water levels during the growing
season can be compared to the criteria previously given and a decision can be made regarding the magnitude of the
effect of mining on wild rice.
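A simple way to produce the frequency comparison described above is sketched below: for a given window of the growing season, count how often simulated levels fall outside a stress band under natural versus mining conditions. All thresholds and series are hypothetical.

# Illustrative frequency comparison (hypothetical thresholds and data): fraction
# of years in which the simulated level during a key growing-season window falls
# outside a biologist-specified band, for natural vs. mining conditions.

def stress_frequency(levels_ft, low_ft, high_ft):
    stressed = sum(1 for lv in levels_ft if lv < low_ft or lv > high_ft)
    return stressed / len(levels_ft)

low_ft, high_ft = 1.0, 3.0          # hypothetical acceptable band for mid-April levels
natural = [1.8, 2.1, 0.9, 2.6, 3.4, 2.0, 1.5, 2.2, 2.8, 1.1]   # one value per year
mining  = [1.2, 0.8, 0.7, 2.4, 3.6, 1.4, 0.9, 2.0, 3.2, 0.8]

print(f"natural: {stress_frequency(natural, low_ft, high_ft):.0%} of years outside band")
print(f"mining:  {stress_frequency(mining, low_ft, high_ft):.0%} of years outside band")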
Acceptable Error Rates: Because the lake levels will be simulated for a long time period, (20 years or more) and
the simulated data will be analyzed utilizing a frequency distribution, the acceptable error rate should be factored
into the biological criteria on the unacceptable frequency of stressful water levels.
3

-------
1(c) Will the changes in runoff and water levels resulting from mine construction, operation, and closure impair the
health of flora, fauna, and habitat?
Problem: A land and water survey conducted around the proposed mine site observed a total of five endangered,
five threatened, 40 special concern, and one proposed special concern species as listed by the State of Wisconsin.
Further, in the project area 36 mammal species, 132 bird species, and 45 butterfly species have been identified.
These and numerous other biotic components could be affected by changes in habitat resulting from changes in
streamflow, lake levels, and groundwater levels. Clearly, it would not be possible to evaluate the potential effects of
mining on all species, so certain key indicator species must be identified whose presence and population robustness
indicate the general health of the ecosystem.
Decision: Are water levels or discharges in selected lakes, streams, or wetlands higher or lower, or the frequency of
stressful water levels substantially increased at key times in the life cycle for selected species such that their health
or habitat requirements could be impaired by the change in the physical parameters that comprise aquatic or
terrestrial habitats? The items underlined in the above decision must be specified by a biologist familiar with
the habitat requirement for selected species in streams, lakes, and wetlands.
Information needed: Same as above in 1(b)
Boundaries: The area to which decisions will apply consists of the surface water bodies for which the most
significant effects on flora, fauna, and habitat from runoff changes resulting from mine construction, operation, and
closure are expected. These include all the lakes, streams and wetlands within the modeling domain.
Data Summarization: The simulated long-term time series of runoff for natural and mining conditions will be
summarized as frequency distributions of lake levels, wetland levels, and/or discharges during the key times of the
life cycles of the indicator species. The changes in frequency of stressful water levels and/or discharges will be
determined for mining conditions compared to natural conditions.
Acceptable Error Rates: Same as 1(b).
1(d) Effect of Runoff Changes on Tribal Cultural Resources
Problem: The greatest socioeconomic and cultural impacts will be to the Tribes on established reservations in the
area, including in order from greatest impact to the least, the Sokaogon Chippewa Community Mole Lake Band, the
Menominee, the Forest County Potawatomi, and the Stockbridge-Munsee. The greatest impact of any adverse
effects from the mining would be to the Mole Lake Band due to their proximity downstream from and directly
adjacent to the mine, and because of the historic and current growing and harvesting of wild rice, Zizania aquatica,
at Rice Lake on the reservation.
Decision: The decision to be made is whether the changes in stream discharge, water levels, or flood/drought
frequency and intensity resulting from mine construction, operation, and closure will vary enough from the natural
conditions in the watershed to have a deleterious effect on the water resources and related cultural resources such as
wild rice, and hence the rice growing and gathering traditions of the people. The wild rice issue has been an
historical focus, not just a recent response to the potential mining impacts.
Information Needed: More information is available than will be discussed for this model. The focus of this study
is on the direct impact of the mine to the natural environment of the Tribal peoples, and will not expand to
economics, demographics, infrastructure, or other historical mining data.
Boundaries: The watersheds that the reservations are part of must be recognized.
4

-------
Data Summarization: A complete analysis of the impacts to the Tribal culture is available by Cleland, Nesper, and
Cleland in a report prepared by Aurora Associates, Williamston, Michigan, under contract with the Sokaogon Band
of Chippewa, the Menominee Tribe of Wisconsin, and the Forest County Potawatomi, in cooperation with the Great
Lakes Indian Fish and Wildlife Commission (GLIFWC) on behalf of the Lake Superior Chippewa.
Acceptable Error Rates: as above in Data Summarization, not directly modeled.
1(e) Effect of Runoff Changes on Flood Frequency and Magnitude
Problem: The construction of the mine facilities will convert 550 acres of forested (main land cover), open space,
and wetland areas into a mill for ore processing, a tailings-management area, a water management and treatment
system, offices, maintenance shops, storage buildings, and parking. Surface runoff resulting from storms will be
substantially higher from the constructed facilities than from the natural areas. It is also necessary to evaluate the
changes in downstream flooding resulting from mine construction.
Decision:
Are the frequency and duration of discharges greater than X ft3/s at a key location along the streams
substantially increased for runoff after mine construction relative to the frequency resulting for runoff
under natural conditions? Has the magnitude for the X-year storm at a key location along the streams
substantially increased for runoff after mine construction relative to the frequency resulting for runoff
under natural conditions? The items underlined in this decision must be repeated as necessary to
include all key locations along the streams, as specified by a hydrologist.
Is the frequency of water levels greater than X ft or Y ft on Little Sand Lake and Rolling Stone Lake,
respectively, substantially increased for runoff after mine construction relative to the frequency resulting
for runoff under natural conditions?
•	If the answer to any of these questions is yes, then the mining company must redesign the stormwater-
detention facilities such that release rates are sufficiently small to keep downstream flows and water levels
within acceptable tolerances relative to flows resulting from natural conditions.
Information Needed: A long time series (20 years or more) of discharges and lake levels corresponding to
hypothetical runoff from the watershed potentially affected by the mine under natural conditions are needed to
establish a comparison baseline.
Boundaries: The area to which this decision will apply includes Hemlock, Swamp, and Hoffman Creeks and Little
Sand and Rolling Stone Lakes.
Data Summarization: The simulated long-term time series for natural and mining conditions will be summarized
as frequency distributions of discharges and lake levels. The changes in exceedance frequency for specified
discharges and lake levels will be determined for mining conditions compared to natural conditions. Changes in the
duration of discharges and lake levels above the specified targets also will be determined for mining conditions
compared to natural conditions. Finally, changes in the magnitude of discharges of specified return periods will be
determined for mining conditions compared to natural conditions.
Acceptable Error Rates: Because the simulated data will be analyzed utilizing a frequency distribution, the
acceptable error rate should be factored into the criteria on unacceptable increases in high discharges, lake levels, or
flood magnitudes.
5

-------
PROJECT ORGANIZATION AND RESPONSIBILITY/QUALITY ASSURANCE OBJECTIVES FOR THE
SIMULATION MODEL
HSPF calibration is performed in a stepwise manner primarily using data available at stream flow gages and
matching the overall water budget, the annual water budgets, the monthly and seasonal water budgets, and finally,
considering storm-runoff volumes and frequencies. In evaluating the monthly and seasonal water budgets and
storm-runoff volumes, the relative proportions of high flows and low flows are considered. It is recommended by
experienced modelers that graphical and statistical means be used to assess the quality of fit because trends and
biases can be easily detected on graphs, and statistics provide an objective measure of whether one simulation is an
improvement over another.
For the overall and annual water budgets only the percent error will be considered. Experienced modelers state
that for HSPF simulation the annual or monthly fit is very good when the error is less than 10 percent, good when
the error is between 10 and 15 percent, and fair when the fit is between 15 and 25 percent (Donigian and others
1984). The target for acceptable calibration and verification for this study is simulation of the overall and
annual water budgets within 10 percent of the measured values.
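As a simple illustration of the percent-error criterion, the sketch below classifies an annual water-budget fit using the error bands quoted above from Donigian and others (1984); the observed and simulated annual totals are hypothetical values.

def classify_water_budget_fit(observed_total, simulated_total):
    # Percent error in the annual (or overall) water budget and its rating:
    # <10 percent very good, 10-15 percent good, 15-25 percent fair.
    error = abs(simulated_total - observed_total) / observed_total * 100.0
    if error < 10.0:
        rating = "very good (meets the 10-percent calibration target)"
    elif error <= 15.0:
        rating = "good"
    elif error <= 25.0:
        rating = "fair"
    else:
        rating = "poor"
    return error, rating

# Hypothetical annual runoff totals, in inches over the watershed.
error, rating = classify_water_budget_fit(observed_total=21.4, simulated_total=19.8)
print(f"Percent error = {error:.1f} percent; fit is {rating}")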
Plots of observed and simulated runoff will be prepared for the monthly water budget and checked for periods of
consistent oversimulation or undersimulation of runoff. The quality of fit for monthly values also will be examined
using three statistics: (1) the correlation coefficient between simulated and observed flows, (2) the coefficient of
model-fit efficiency between simulated and observed flows, and (3) the number of months for which the percentage
error is less than a specified percentage. In areas where snowmelt is a major factor and meteorological data are
sparse, it may be difficult to obtain the high correlation coefficients and coefficients of efficiency reported in the
previously listed studies. The targets for acceptable calibration and verification of monthly flows are a
correlation coefficient greater than 0.85 and the coefficient of model-fit efficiency greater than 0.8.
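The two monthly-flow statistics can be computed directly from paired observed and simulated series. In the sketch below, the coefficient of model-fit efficiency is assumed to take the Nash-Sutcliffe form commonly used with HSPF, and the twelve monthly flow values are hypothetical.

import numpy as np

def correlation_coefficient(observed, simulated):
    # Pearson correlation coefficient between observed and simulated monthly flows.
    return float(np.corrcoef(observed, simulated)[0, 1])

def model_fit_efficiency(observed, simulated):
    # Coefficient of model-fit efficiency (Nash-Sutcliffe form assumed):
    # 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    residual = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return float(1.0 - residual / variance)

# Hypothetical monthly mean flows (ft3/s) for one water year.
obs = np.array([12.0, 15.0, 30.0, 55.0, 48.0, 33.0, 20.0, 14.0, 10.0, 9.0, 11.0, 13.0])
sim = np.array([13.0, 14.0, 27.0, 58.0, 45.0, 35.0, 21.0, 15.0, 9.5, 8.5, 10.5, 12.0])

print(f"r = {correlation_coefficient(obs, sim):.3f} (target > 0.85)")
print(f"model-fit efficiency = {model_fit_efficiency(obs, sim):.3f} (target > 0.8)")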
The daily flows will be checked graphically by comparing the observed and simulated runoff-duration curves and
time series. General agreement between the observed and simulated runoff-duration curves indicates adequate
simulation over the range of the simulated flow conditions. Substantial or consistent departures between the
observed and simulated runoff-duration curves indicate inadequate calibration. Three statistics are utilized in
HSPEXP (HSP Expert System), the expert system for calibration of HSPF, to numerically evaluate the high-flow/low-
flow distribution indicated in a flow-duration curve. The target criteria for acceptable calibration and
verification for the high-flow/low-flow distribution in the simulated runoff relative to measured runoff are a
mean low-flow-recession rate difference ≤ 0.02, an error in the mean of the lowest 50 percent of the daily
mean flows ≤ 10 percent, and an error in the mean of the highest 10 percent of the flows ≤ 15 percent.
The quality of fit for the larger storms will be measured graphically by the agreement between the simulated and
observed partial-duration series of runoff volumes. The annual probability of exceedance of each storm will be
determined. Also, the following criteria are utilized in HSPEXP for storm volumes: (1) the error in total flow
volumes for the sum of selected storms must be less than 20 percent, and (2) the error in total flow volumes for the
sum of selected summer storms must be less than 50 percent. Runoff volumes are used in this study because
changes in lake water levels are dependent on accurate simulation of runoff volumes. The criteria for acceptable
calibration and verification for storm-runoff simulation are (1) the error in total flow volumes for the sum of
up to 36 selected storms must be less than 20 percent, and (2) the error in total flow volumes for the sum of
selected summer storms must be less than 50 percent.
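The storm-volume criteria reduce to a percent error on summed event volumes. The sketch below applies the two criteria to hypothetical storm-runoff volumes for a handful of selected storms; the volumes and units are invented for the example.

def storm_volume_error(observed_volumes, simulated_volumes):
    # Percent error in total flow volume summed over the selected storms.
    observed_total = sum(observed_volumes)
    simulated_total = sum(simulated_volumes)
    return abs(simulated_total - observed_total) / observed_total * 100.0

# Hypothetical storm-runoff volumes (acre-feet) for selected storms.
obs_all, sim_all = [120.0, 85.0, 240.0, 60.0, 150.0], [110.0, 95.0, 215.0, 70.0, 160.0]
obs_summer, sim_summer = [85.0, 60.0], [95.0, 70.0]

print(f"All selected storms: {storm_volume_error(obs_all, sim_all):.1f} percent (criterion: < 20 percent)")
print(f"Summer storms only:  {storm_volume_error(obs_summer, sim_summer):.1f} percent (criterion: < 50 percent)")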
Accurate simulation of lake and wetland levels is a vital component of evaluating the effects of the proposed
Crandon Mine on surface-water resources in the vicinity of the mine. Therefore, calibration also will consider

-------
accurate simulation of available monthly lake-level and well-level data in the Swamp Creek watershed. Verification
will be done by spatial transposition of the calibrated model as well as temporal transposition of the calibrated
model. Verification through spatial transposition involves application of the runoff relations calibrated for the
Swamp Creek watershed to the Pickerel Creek watershed. Verification through temporal transposition involves
application of the runoff relations calibrated for a given time period to a second independent time period.
The accuracy of the calibration and verification for the monthly lake-level and well water-level data will be
evaluated using the correlation coefficient and coefficient of model-fit efficiency previously described. Maximum
and average errors in water levels also will be considered relative to the range of water-level fluctuations for a given
lake or wetland. The targets for acceptable calibration and verification of monthly water levels are a
correlation coefficient greater than 0.85 and the coefficient of model-fit efficiency greater than 0.8.
The mine-affected area will be a combination of impervious surface and compacted open space, whereas the tailings
management area is a unique land use/land cover. Ranges of parameter values will be established and sensitivity-
analysis methods will be applied to determine the expected value of and ranges for runoff, lake levels, and water
levels in wells for the watersheds affected by mine operations. The expected values and ranges will be used to
determine which conditions result in substantial changes in runoff and water levels relative to baseline conditions.
CALIBRATION PROCEDURES
The water budget is simulated in HSPF on a continuous basis by subdividing the watershed into areas of specified
land use/land cover and summing the runoff. In this project, verification will be done by spatial transposition of the
model calibrated for runoff estimation on the Swamp Creek watershed to runoff estimation on the Pickerel Creek
watershed. For the area of the proposed Crandon Mine, model-parameter values reflecting the current, natural
conditions can be determined by calibration and verification utilizing runoff data at the Swamp Creek stream gages
above Rice Lake and Swamp Creek below Rice Lake at Mole Lake, assuming adequate daily and hourly rainfall
data and other meteorological data can be obtained. Flow from much of the immediate area, and representative of
the remaining affected area in the Pickerel Creek watershed, is measured at the Swamp Creek stream gage above
Rice Lake. The data from the gage below Rice Lake will be used to ensure flows and water levels in Rice Lake are
correctly represented in the model.
The 9-year period of streamflow data will be subdivided into a 5-year calibration period and a 4-year
verification period. Verification also will be evaluated by applying the HSPF model with parameters determined
for the Swamp Creek Basin to the Pickerel Creek Basin, simulating monthly lake and wetland water levels, and
comparing the simulated values to the measured values. Initial values for model parameters will be selected from
the results of previous studies on similar watershed characteristics and preliminary model simulations. The
calibration process will then be facilitated by the use of the HSPEXP, though not exclusive of other methods. The
basis of the HSPEXP is a compilation of more than two decades of experience with HSPF and similar models over a
wide range of climates and topographies.
ANALYTICAL PROCEDURE TO EVALUATE CHANGES IN RUNOFF
Because the mine is proposed and not yet constructed, a procedure must be developed to estimate changes in runoff
relative to natural conditions resulting from mine construction, operation, and closure. Runoff from natural
conditions can be assessed on the basis of available data for the watersheds near the proposed mine. However,
runoff from the modified conditions of mine construction, operation, and closure cannot be assessed on the basis of

-------
available data for watersheds near the proposed mine. Therefore, a computer model that is capable of simulating
runoff resulting from natural and mine conditions through physically defensible selection of model-parameter values
must be used to evaluate changes in runoff. The model-parameter values corresponding to natural conditions can be
determined by calibration and verification utilizing available data, including detailed water-table and stream-bed
elevation data for the areas with high water tables and (or) wetlands.
In application of HSPF, each watershed studied is subdivided on the basis of rain-gage locations and land-cover
categories. The primary purpose of segmenting the watershed is to divide the study area into land segments that are
assumed to produce a homogeneous hydrologic and water-quality response.
INTERNAL QUALITY-CONTROL CHECKS
Quality-control checks are required for two aspects of the HSPF modeling effort: (1) the meteorological,
streamflow, lake-level, and well-level data utilized to calibrate and verify the model, and (2) the assumptions and
procedures utilized to calibrate and verify the model to the available data and to parameterize the model for mine
construction, operation, and closure conditions.
MODEL-OUTPUT REDUCTION AND REPORTING/PERFORMANCE AND SYSTEM AUDITS
A manual procedure is used in HSPF model calibration even when the HSPEXP is applied. In this procedure, the
model user decides which parameter values are changed and by how much for each calibration simulation. Thus, it
is essential to keep an iteration log during the manual calibration process.
CONCLUSION
This is a progress report with no conclusion at this time. However, the objectives continue to be valid and viable
goals for the project, and the development and daily use of this QAPP are integral to completion of the project.
ACKNOWLEDGMENTS
As described in the Project Organization and Responsibility section, this project is a cooperative effort among three
agencies: the USEPA, USGS - Illinois and Wisconsin Districts, and Aqua Terra Consultants. The Great Lakes
Indian Fish and Wildlife Commission (GLIFWC), the US Fish and Wildlife Service (FWS), the US Army Corps of
Engineers Waterways Experiment Station (WES), the Mole Lake Band of the Sokaogon Chippewa Community and
the Menominee have made significant contributions to this project.
Appendix in original document entitled "Application of HSPF to the Upper Wolf River Basin for Evaluation of
Hydrologic Impacts of Crandon Mine Simulation Plan" by Brian Bicknell of Aqua Terra Consultants.
REFERENCES (furnished upon request; most citations omitted from this article for brevity)
Cleland, C., Nesper, L., and Cleland, J. 1995. "The Potential Cultural Impact of the Development of the Crandon
Mine on the Indian Communities of Northeastern Wisconsin", Aurora Associates, Williamston, Michigan.
Donigian, A.S., Jr., Imhoff, J.C., Bicknell, B.R., and Kittle, J.L., Jr. 1984. Application Guide for Hydrological
Simulation Program-FORTRAN (HSPF). EPA-600/3-84-065, Environmental Research Laboratory, 177 p.

-------
The United States Environmental Protection Agency (USEPA) is applying a hydrology and hydraulic model,
Hydrological Simulation Program - FORTRAN (HSPF), for simulation of changes in a watershed in northern
Wisconsin resulting from a proposed underground zinc and copper mine during construction, operation, and
closure. The watershed is a forested wetland with many lakes and streams, including a shallow lake that supports a
natural stand of wild rice on the reservation of the Mole Lake Band of the Sokaogon Chippewa Community. The
entire hydrologic cycle will be simulated with HSPF with an emphasis on the surface waters (streams, lakes,
wetlands), the water budget, and fluctuations of the water budget. These results will be compared to other models.
Final and critical steps, still in a developmental phase, relate results obtained from HSPF simulation to the risk to
habitat on a biological and cultural level.
The HSPF model will be used to complement the impact analysis for the water budget already in progress - the
MODFLOW ground-water model completed by the mining company and being reviewed by the Wisconsin
Department of Natural Resources (WDNR). The U.S. Army Corps of Engineers (USACE) is currently running a
FEMWATER (Finite Element Mesh for groundwater) model to evaluate the mining impacts on the groundwater.
The results of the FEMWATER model also will provide some input to the HSPF model. The results and
interpretations from the HSPF model will be available to the WDNR and USACE for use in their respective
Environmental Impact Statements (EISs) as they determine necessary. The model can also be expanded or
modified in the future for simulation of other parameters, such as chemistry or sediment, as data are available. The
affected Tribes in the area may also modify the model. These changes can be simulated in HSPF; however,
determination of appropriate model parameters for simulating the solute- and sediment-transport processes on the
basis of limited data is complex and is deferred to a possible second phase of the HSPF modeling project.
The points that could effectively be addressed within this Quality Assurance Project Plan (QAPP) evolved from an
all-encompassing total of eleven questions. Some of the commitments and the model itself had already been chosen,
so the following Project Data Quality Objectives (PDQOs) greatly helped focus the overall endeavor, given the
model and the time and resource constraints.
a)	Which surface-water bodies (lakes, creeks, wetlands) strongly interact with ground water and, thus, can be
substantially affected by ground-water pumpage from the mine?
b)	Will the changes in runoff and water levels resulting from mine construction, operation, and closure impair the
growth of wild rice?
c)	Will the changes in runoff and water levels resulting from mine construction, operation, and closure impair the
health of other flora, fauna, and habitat?
d)	Will the changes in runoff and water levels resulting from mine construction, operation, and closure substantially
impact the culture or cultural resources of the various Native American Tribes in the vicinity of the proposed mine
(with a particular focus on the Mole Lake Reservation)?
e)	Will mine construction, operation, and closure substantially alter the frequency and intensity of flooding
downstream from the mine site?
The HSPF model simulates the surface water balance with an input of precipitation, potential evapotranspiration,
and meteorologic conditions, and outputs of actual evapotranspiration, streamflow (the sum of surface runoff,
interflow, and base flow from ground water), and percolation to deep aquifers that do not contribute to the local
surface streams. This water-balance simulation also will be valid only for a limited range of values for recharge to
ground water. By comparing the valid ranges of values for recharge in HSPF to those for MODFLOW and (or)
FEMWATER, the models can be used jointly to determine a recharge rate that yields reliable simulation of the
surface-water budget and ground-water levels and flow rates. Because the model had already been utilized rather
extensively, past projects using the model helped in deciding what calibration, verification, and other objectives
could be reached, and with what degree of confidence, when comparing the simulated to the observed runoff-
duration curves. Goals for the correlation coefficient and the coefficient of model-fit efficiency were established so that
a "very good" fit could be achieved; several criteria had to be met within a specified plus-or-minus percentage of
the goals. These criteria include correlation of storm events, water budgets (overall, annual, and monthly),
high- and low-flow distribution, seasonal volume error, recession rates, and lake and well water levels.
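The surface-water balance described at the start of the preceding paragraph can be written as a simple closure check. The sketch below is illustrative only; the component names follow the paragraph, and the annual depths (in inches) are hypothetical.

def water_balance_residual(precipitation, actual_et, surface_runoff, interflow,
                           base_flow, deep_percolation, storage_change):
    # Residual of the water balance, all terms expressed as inches of depth:
    # precipitation minus outflows minus the change in storage.
    outflows = (actual_et + surface_runoff + interflow
                + base_flow + deep_percolation)
    return precipitation - outflows - storage_change

# Hypothetical annual values for a forested wetland watershed, in inches.
residual = water_balance_residual(precipitation=31.0, actual_et=19.5,
                                  surface_runoff=2.5, interflow=3.0,
                                  base_flow=5.0, deep_percolation=0.5,
                                  storage_change=0.4)
print(f"Water-balance residual: {residual:.2f} inches")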

-------


-------
NELAC AND PBMS: THE GOOD, THE BAD, AND THE UGLY
Jerry L. Parr
Principal
Catalyst Information Resources, L.L.C.
1153 Bergen Parkway, #238
Evergreen, CO 80439
SUMMARY
NELAC and PBMS are both on the verge of implementation. We have the opportunity to create
a new, good program that will provide rugged, reliable, economical analyses with oversight by a
sound accreditation process; to maintain our current bad program, with its proliferation of trivial
method requirements, little regard for data quality, and inspections that focus on adherence
to method details; or to create an ugly program with poor, biased, and unreliable analyses and no
oversight.
NELAC and PBMS can work effectively together to bring a new era of environmental analyses,
one where the data needs drive the laboratory analyses, where laboratories are held accountable
for data quality, not method compliance, and where a sound accreditation process is used to judge
laboratory competence.
INTRODUCTION
On January 14, 1999, the Environmental Laboratory Advisory Board (ELAB), a Federal Advisory
Committee that provides advice and counsel concerning the systems and standards of
accreditation for environmental laboratories, published a report with recommendations for
implementing PBMS. ELAB recommended that EPA incorporate 13 "Essential Elements" into its
PBMS implementation plans and that NELAC continue to establish a framework for
incorporation of PBMS in its program (ELAB 1999). The ELAB report provides a basis for
establishing an effective PBMS program. NELAC can provide the independent oversight
necessary for this program to be successful.
ELAB RECOMMENDATIONS FOR IMPLEMENTING PBMS
ELAB placed a great deal of importance on the fundamental principles of a successful PBMS
program. These principles are referred to as Essential Elements (see Table 1). While all of the
Elements are essential to PBMS, six were deemed vital to its proper implementation. The
Essential Elements were developed using a "wide perspective" approach to describe key features
while not becoming embroiled in details that could restrict its options. The report provides
examples to clarify the intent and illustrate the main point of the Elements.
ELAB believes these Essential Elements are applicable to compliance monitoring independent of
PBMS.

-------
Table 1. Essential Elements for a Successful PBMS Implementation
A.	Critical Elements
A-l. Legal Standing: Data generated in compliance with the PBMS framework must have the same legal
standing as data generated using a promulgated EPA method.
A-2. Cost Effectiveness: Requirements for PBMS for method validation, demonstration of capability,
and ongoing quality control should be consistent across all EPA programs and should also apply
consistently to EPA published methods, modifications to EPA published methods, and new methods. Such
requirements should be cost effective for small laboratories performing limited analyses, large complex
laboratories working nationwide, and instrument manufacturers.
A-3. Scientifically Sound and Relevant Validation Process: Both the method validation and the PBMS
documentation requirements should be based on principles that are widely accepted in the scientific
community and on the intended use of the data.
A-4. Clearly Articulated and Appropriate Performance Criteria: EPA should develop and publish
PBMS performance criteria appropriate to the anticipated regulatory use. PBMS performance criteria are
the sensitivity, selectivity, precision and accuracy of the data needed to demonstrate compliance with the
regulation.
A-5. Regulatory Development: EPA should employ or develop laboratory methods that have been
demonstrated to be capable of achieving the regulatory compliance monitoring requirements. In order to
assure the quality of the science used in the development of regulations, EPA should submit all the
technical studies used to develop a regulation to peer review as part of the regulatory process, prior to
finalizing any such regulation.
A-6. Documentation: The documentation required under PBMS must be sufficient for independent
verification (i.e., auditing) and reproduction by another laboratory which is skilled in the art.
B.	Important Elements
B-l. Flexibility: Regulated entities should have flexibility to modify methods, or use new methods, as
long as the PBMS requirements are met.
B-2. EPA Optional Approval Process: The scientific community should have an effective system for an
optional EPA approval of new analytical methods. There should be no unnecessary barriers to these
approvals.
B-3. Consistency: Consistency in definitions, objectives and criteria for all aspects of PBMS among
Program Offices, EPA Regions and States is essential.
B-4. Simplicity: The implementation of PBMS should be made as simple as possible without departing
from the Essential Elements and the PBMS goals. Guidance developed for PBMS should be written with
simplicity and clarity to ensure consistent interpretation and implementation.
B-5. Clarity of Intent: Performance criteria must be represented by unambiguous requirements or
objectives, which can be easily understood, applied, demonstrated and readily auditable by the laboratory
community (laboratories, data users, laboratory assessors).
B-6. Careful Implementation: Implementation of PBMS should consider how existing regulations and/or
monitoring requirements will be affected.
B-7. Widely Available Reference Materials: Readily affordable reference materials should be widely
available to assist in the method validation effort.

-------
FRAMEWORK FOR AN EFFECTIVE PBMS
The ELAB report described the key features which must exist under PBMS. These activities can
be organized into four broad categories: Performance Criteria, Method Validation, Laboratory
Requirements, and Accreditation.
Performance Criteria refers to the quality of data needed. This term was used by ELAB in lieu of
comparable terms such as Measurement Quality Objectives (MQOs) or Data Quality Indicators
(DQIs). ELAB believes that the beginning of a successful PBMS program is the establishment of
these data quality needs. ELAB described these needs in terms of the bias, precision, detection
level, and confidence of the data. Ideally, these data quality needs would be described in the
regulations, and, there would be at least one laboratory method whose documented performance
can demonstrate compliance with the regulation. Alternatively, the data user can specify the
measurement quality objectives.
Method Validation is performed to document that the required data quality can be met. ELAB
believed there could be varying levels of validation, and various ways to validate a method.
Conceptually, ELAB envisioned three levels of method validation. These levels relate to the
degree of control over two critical aspects of laboratory quality control: ruggedness and sample
variability. A laboratory that analyzes only one particular sample type (e.g., a municipal
wastewater laboratory) has a great deal of control over the quality system it employs, and the sample
matrix has little variability. The validation performed by this type of organization could be less
than that performed by a laboratory that analyzes many different types of samples. In this latter
case, the laboratory still has control over the quality system, but not the sample types. Finally, a
method developed for use by many laboratories on many different sample types should have the
most validation performed, as the method developer has no control over the laboratories' quality
systems.
In terms of the method validation activities themselves, there are several acceptable approaches,
including comparison to a Reference Method, analysis of a Certified Reference
Material, and interlaboratory comparisons (Taylor 1983). Validation represents the activities
required to show that a method has the capability to generate data of the quality needed and
should be differentiated from those activities (QC) performed to document the ongoing quality
achieved with routine sample analysis.
Laboratory Requirements refers to the activities that must be performed by a laboratory. The
laboratory must have a Quality System in place. The laboratory must document the method in a
Standard Operating Procedure (SOP). It must perform some type of method start-up activity,
such as an initial demonstration of competence, and it must analyze quality control samples to
document that the method was performed correctly. Finally, the laboratory must analyze data
quality assessment QC samples (e.g. matrix spikes, field blanks) to document the data quality
obtained on the samples.

-------
If the activities above are performed, Laboratory Accreditation is then simply a review of the
documentation to ensure that these activities were performed. If so, the lab is competent to
perform analysis.
In summary, the regulations (or data users) establish measurement objectives, method validation is
used to demonstrate method capability, laboratory QC is used to document data quality, and
accreditation is used for laboratory competence, not data quality.
EXAMPLE PBMS PROJECT
As an example of how this new system would work, a hypothetical project was established. The
sample matrix is a sludge. Expected contaminants at the site are polynuclear aromatic
hydrocarbons (PAHs). The cleanup level established for these compounds is 200 ug/kg. The
project is a RCRA site, and other compounds on the 40 CFR 264 Appendix IX list are of limited
concern.
In my vision of a new framework, the first activity is to establish MQOs as follows:
•	Accuracy of 70-130% for PAHs at the action level
•	Accuracy of 50-200% for other SVOCs at 330 ug/kg
•	<5% false positives
In this "Good" project, the client request data to meet MQOs, not methods. The laboratory
validates the performance of the method on a solid matrix, and accreditation is based on the
quality system approach.
The Good Method
•	Pressurized Fluid Extraction, methylene chloride/acetone
•	50 g sample
•	GC/MS
•	5 point calibration for PAH, RSD < 15%
•	Low standard equivalent to 100 ug/kg
•	Single point standard at 300 ug/kg for other Appendix IX compounds
•	Continuing calibration for PAH only, 15%D
•	LCS at 400 ug/kg for PAH, 70-130%
•	MS/MSD is PAH mix, spike level at 400 ug/kg
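As a rough illustration of judging results against the MQOs rather than against method prescriptions, the sketch below checks spike recoveries against the 70-130 percent accuracy window for PAHs at the action level; the LCS and MS/MSD recovery values are invented for the example.

def meets_accuracy_mqo(percent_recovery, low=70.0, high=130.0):
    # True if a spike recovery falls inside the PAH accuracy MQO (70-130 percent).
    return low <= percent_recovery <= high

# Hypothetical recoveries from the LCS and MS/MSD spiked at 400 ug/kg.
recoveries = {"LCS": 96.0, "MS": 81.0, "MSD": 128.0}

for sample_type, recovery in recoveries.items():
    verdict = "meets MQO" if meets_accuracy_mqo(recovery) else "fails MQO"
    print(f"{sample_type}: {recovery:.0f} percent recovery -> {verdict}")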
In the existing "Bad" system, the client specifies a method (e.g. 8270), the method cannot meet
the MQO's, auditors audit from detailed method checklists, and laboratory competence is
demonstrated on PE results on a water matrix (WS/WP). The laboratory does not establish that
the method can meet the required project objectives and data validation is performed to assess
method conformance, not data quality.
The Bad Method
•	Soxhlet with methylene chloride
•	30 g sample

-------
•	GC/MS
•	5 point calibration for all, CCC RSD < 30%
•	Continuing calibration CCC %D < 20%
•	Reporting Limit of 200 ug/kg supported by MDL
•	Low standard equivalent to 330 ug/kg
•	LCS & MS at mid point, 8270 sub-list
A PBMS program implemented poorly, with no oversight, has the potential to decrease data
quality. In this "Ugly" system, the client specifies a method, the laboratory modifies the method,
but with no method validation and no documentation. There is no independent inspection, and no
QC activities are performed.
The Ugly Method
•	Shake with methylene chloride
•	10 g sample
•	GC/FID
•	Single point calibration at 500 ug/kg
•	No LCS or MS; No Cal Check; No SS
CONCLUSION
If implemented correctly, PBMS can allow flexibility and improve quality. However, PBMS will
only work if a rigorous oversight program such as NELAC exists. Together, NELAC and PBMS
can provide a sound foundation for laboratory analyses.
REFERENCES
ELAB. 1999. Recommendations for the Implementation of Performance-Based Measurement Systems (PBMS).
ELAB-9901, January 1999.
Taylor, J.K. 1983. Validation of Analytical Methods. Analytical Chemistry 55: 600A-608A.

-------
AUTOMATION IN LABORATORY INFORMATION MANAGEMENT SYSTEMS: MYTHS
AND VIRTUES.
Cesar A. Muedas, Ph.D.
Quality Assurance Manager
Eckenfelder Laboratory
Brown and Caldwell
227 French Landing Drive
Fifth Floor
Nashville, TN 37228
SUMMARY
Recently, our laboratory undertook the project of substitution of old data management systems.
The total project was managed independently in each Section (Organics, Inorganics, Sample
Management) and a small "task force" served as link among them. After the installation of the
new software, the need to expedite validation and final roll-out led to a realistic inventory of the
contributions the LIMS would make (i. e., the lasting "virtues" that the laboratory would exploit),
but it also revealed distorted expectations that had no support in valid, practical terms (i. e., the
"myths" that were debunked).
One single system did not meet all our needs. We settled for a combination of Hewlett Packard's
ChemServer and The Khemia Company's Omega. One single laboratory-wide approach to
systematic field nomenclature did not work. The structure of tables within the databases was
flexible enough, however, to accommodate correspondence of terms and field designations. The
function of Quality Assurance could not be taken to limits of "full automation" because of the
complex combination of method requirements within each analytical run. We had to settle for a
sensible balance between LIMS-directed processes and effort-driven manual activities, i. e.,
computer power did not replace manpower. Finally, what we thought was the "end of the
transition" simply indicated the beginning of a continuous evolution that still today makes our
LIMS a living project.
Commitment to the integration of one LIMS product and one chromatography data system was a
logical decision consistent with the strategic goals of the laboratory. Training proved to be the
key ingredient to maximize the benefits of the LIMS. The high demand for electronic data
deliverables (EDDs) led to expedient progress in database customization. The integrity of the
LIMS structure turned it into a valuable repository of all lab-related information. Despite the
crises that had to be overcome, having migrated to new, dependable data management systems
has resulted in improved lab operations and renewed competitive advantage in the
marketplace.
INTRODUCTION
In today's marketplace, the appetite for information and knowledge has not excluded the
analytical data generated by commercial environmental laboratories. In order to compete
effectively, the laboratory must commit time and money to the tools, resources and technologies

-------
that support production and management of its data. Laboratory Information Management
Systems (LIMSs), therefore, have become critical elements in any laboratory environment where
processing of data and dissemination of information are indispensable.
The theory and the practice of LIMSs do not necessarily evolve at the same exact pace. The
search for a "next generation", computer-based LIMS has become a major challenge, even
though for many years the principles and guidelines for LIMS functionalities and structure have
been well defined (Hinton 1995). The task can be overwhelming when the constraints of the
implementation process must be made compatible with the demand for flexibility (if not stress)
imposed upon the laboratory.
OUR LIMS SOLUTION
Beyond the dichotomy of vendor-generated vs. laboratory-developed LIMS, our initiative for
migration to a new system began with the end in mind. We came to the conclusion that
continuing support of the current system was untenable and unrealistic; we knew what we
wanted changed; we knew what we could no longer live with or without. We decided
to screen and select an "off-the-shelf" LIMS product.
We had closely observed how, during the last three years, the LIMS industry had demonstrated
incremental versatility in its products and increasing reliability in its services. At the same time,
the number of vendors had more than doubled, and the product offering had become further specialized,
to the point of making available software packages specifically targeted to the commercial
environmental laboratory. The major instrument manufacturers also had secured a strong
presence in the LIMS market.
Our relationship with Hewlett-Packard was the first one to develop. Its chromatographic system
combining a ChemServer network and Target software (CS/T) was a logical choice to revamp
the entire data processing system in the Organic Section. The CS/T system, however, was not a
LIMS. It could be (and, in our case, it was) integrated with a LIMS acting as a client-server
application.
The choice of The Khemia Company's Omega was based on timeliness and fitness for use. The
company had just released a new version of Omega developed in MSAccess97, and we had just migrated
company-wide to MSOffice 97; furthermore, we had established MSAccess97 as our "internal
standard" for handling of databases. At the same time, the structure of Omega matched very
closely the overall reorganization of workflow and data processing, started months earlier in
preparation for the transition to a new system.
The implementation proceeded in a stepwise fashion with minor customization required to
incorporate the sample registration (or "log-in") procedures. More extensive efforts were
necessary during the steps of creation of tests and definition of method-specific variables. The
most difficult obstacles were encountered during the customization of the invoicing capabilities.
Confronted with series of predictable and unexpected circumstances, the laboratory took a dual
approach to managing the implementation process. The Organic Section remained independent

-------
with exclusive use of the CS/T system. The Inorganic and Sample Management Sections
focused on expediting the migration to Omega while still processing data in the "old LIMS".
The old system was abandoned as soon as the "log-in" module of Omega was capable of sharing
sample information with the CS/T system and of distributing accurate workload reports to the
analysts.
During the evolution of the project, it became evident that some preconceived potential benefits
of the LIMS would not be realized. The extent of the integration between the CS/T system and
Omega was going to depend on our flexibility to reach consensus on customization of specific
features. The uncharacteristic simplicity of the selection process had also raised expectations
about the speed of delivery of the final LIMS product.
Our interest in expediting the final validation and full rollout of the system led to an inventory of
the main contributions the LIMS would make (i. e., the lasting "virtues" that the laboratory
would exploit). On the other hand, those expectations that would not be met began to take the
shape of "myths" that we (with the evidence collected) would be in the position to debunk.
MYTHS
A.	"One LIMS fits all."
One single system was not the solution to our problems, and a single system may not be the solution
for commercial laboratories with characteristics similar to ours. Our LIMS is (and will continue to be) a true
hybrid: the combination of the CS/T system and the Omega client-server application. The
former is, unquestionably, a powerful chromatographic data system that centralizes the
processing, reduction, validation and reporting of all the data in the Organic Section. It is
not, however, an isolated network; it is directly linked to Omega through import/export
routines that automate the retrieval of sample information and updating of analysis status.
Omega, on the other hand, gave us the flexibility to handle log-in, invoicing and inorganic
data without having to replicate the multiple features already present in the CS/T system.
B.	"LIMS is a universal language."
Not all LIMS are created equal, and that is first revealed through the simple inspection of
table structures in the databases. "Translation" is possible but it adds to the list of items to
customize, to the list of training issues to resolve, and to the potential conflicts that may arise
when the final product needs to be extracted by a third party who uses a different data
system. The "language barrier" applies to field names and types and may also cover
referential links among tables, the ability to export/import specific file types, or the structure
of analytical runs.
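As a minimal sketch of the kind of field-name "translation" described above, the fragment below renames the fields of an exported record so that a second system can import it; the field names and the sample record are hypothetical and do not correspond to the actual CS/T or Omega table structures.

# Hypothetical mapping between field names used by two different LIMS databases.
FIELD_MAP = {
    "SampleID": "LabSampleNumber",
    "CollectDate": "DateCollected",
    "MatrixCode": "SampleMatrix",
    "AnalysisStatus": "RunStatus",
}

def translate_record(record, field_map=FIELD_MAP):
    # Rename the keys of an exported record; unmapped fields pass through unchanged.
    return {field_map.get(name, name): value for name, value in record.items()}

exported = {"SampleID": "99-0417-03", "CollectDate": "1999-03-12", "MatrixCode": "SO"}
print(translate_record(exported))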
C.	"LIMS automates Quality Assurance."
The function of Quality Assurance (QA) benefits enormously from a versatile LIMS but is
never replaced by it. The most sensitive area is the definition of tests, quality control (QC)
samples and method detection limits (MDLs). The Quality Assurance Officer will be most

-------
directly involved in the process of establishing test parameters in adherence to each method's
standard operating procedures. Once in place, the tests incorporated in the LIMS will
introduce automation to the extent that multiple samples will share the same test information
and QA/QC requirements, thus simplifying the data validation steps. The Quality Assurance
function, however, must maintain an active role in the following areas:
(1)	Verifying the timely incorporation of new test-related information or method-specific
criteria; e. g., updating MDLs and quality control limits.
(2)	Documentation of any deviation from key criteria; e. g., issuing Corrective Action
Reports, judging the relevance of data points in control charts, generating Case Narratives
for specific projects.
(3)	Evaluating the integrity and completeness of data as they apply to specific projects with
distinct regulatory drivers.
D.	"Computer power will replace manpower."
Particularly in small commercial environmental laboratories, a good number of effort-driven
manual activities find no substitute among alternative LIMS-directed procedures. Even the
most sophisticated database features cannot replace the ability of the analyst to prioritize
workload and assignments in order to maximize throughput and meet turn-around times.
Computer power eventually replaces itself, i. e., every new generation of software and
hardware introduces irreversible changes and guaranteed improvements.
E.	"All good things must come to an end."
The end of the transition is simply the beginning of a long term LIMS project. For the sake
of the laboratory and its clients, the LIMS must behave like a living organism. Assigning a
"life cycle" to a LIMS, however, may be an oversimplification. With continuous evolution
and improvement a LIMS may exceed the typical "life expectancy" of a software product.
Before entering "maturity" and advancing into its "decline", the LIMS can pick up a new
"life cycle" and revitalize itself. This approach may require investment in new hardware and
customized programming but both can be incorporated within the same software architecture.
In the foreseeable future, in fact, Y2K-compliance and throughput capacity may be the key
limitations for extended life of an "aging" LIMS.
VIRTUES
F.	Foundations of solid logic.
The more solid its internal and external logic, the more effective the LIMS solution will be.
Internally, the logic of the LIMS refers to the integrity and validity of the database elements
(tables, queries, forms, macros and modules) and their relationships. This logic is demonstrated
by the flexibility of the application, by its ability to accept customization that maintains
consistency in the programming code. Externally, the logic is reflected in the equivalency

-------
between LIMS operation and laboratory procedures. A new LIMS will demand changes in
some lab procedures, and vice versa. This mutual influence should reinforce strengths. Any
value-adding lab activity should find a counterpart in the LIMS structure.
G.	The LIMS will be as good as its users.
Only through extensive and thorough training of all LIMS users will the system unleash its
full potential. The training effort will also be the tool to transform compliance with new
procedures into commitment to a long-term undertaking. Every user must realize the
importance of each individual's contribution, the "big picture" of LIMS implementation, and
the consequences of sub-standard performance.
H.	Electronic deliverables and electronic clients.
The LIMS package includes the capability of generating multiple types of electronic data
deliverables (EDDs). These EDDs have become the paperless equivalent of a full size data
package. The clients' increasing demand for EDDs indicates a trend towards a more
sophisticated level of data management. In fact, if all client-lab interactions can effectively
focus on electronic exchanges, a "new breed" of electronic clients may lead to the mass
customization of reporting formats submitted exclusively as EDDs.
I.	LIMS as a virtual lab.
The LIMS is ultimately the repository of all the data generated by the laboratory. The proper
management of data through the LIMS functionalities turns analytical results and sample
descriptions into meaningful information. Through the dissemination of information, the
LIMS becomes a surrogate for data-driven operators, i. e., a group of "virtual data analysts".
The "physical" lab analysts interact with the LIMS as virtual clients. The virtual lab is then
the collection of procedures that are automated through the LIMS and accessible by the
virtual clients: acquisition, processing, validation, reporting and archival. The "external" lab
clients interact not with the LIMS but with customized EDDs that the virtual clients generate.
J. LIMS as a laboratory's inflection point.
An industry's inflection point has been defined as the point in time when a serious threat
dislocates all activities and demands a redefinition of paradigms, structures and goals (Grove
1996). Such are precisely the circumstances when a laboratory commits to the full
implementation of a new LIMS. The strategies of the past do not work any more; the
imminent threat is the loss of any advantage over the competition. The new LIMS brings
about a new environment; the lab discovers new competencies and sets new goals consistent
with them. The benefits of the new LIMS become visible in the bottom line but also generate
a "return on knowledge" that enriches the expertise of the analysts and the value of the
product delivered to the clients.
CONCLUSION

-------
Every "LIMS project" may tell, in the details, a different story. The moral of ours is that, during
the implementation process, myths and virtues will surface accompanied by clear evidence of
their validity (or lack thereof). The lessons learned can be encapsulated in the need to fulfill
three very important "C's":
•	Communication: Its quality and depth will determine the overall pace of the process. It must
incorporate the "voice of the customer".
•	Collaboration: Everyone in the implementation team is a stakeholder, a knowledge worker
and an owner of the process; the best results will derive from the best-concerted efforts.
•	Customization: With a solid foundation of internal logic, any LIMS software can and must
be intelligently customized to incorporate value-adding features.
ACKNOWLEDGEMENTS
All personnel at the Eckenfelder Laboratory contributed to the LIMS implementation process.
Art Teter and Doyce Blair (The Khemia Company) provided vital assistance. Discussions with
Rick Davis (Brown and Caldwell, Nashville) and Scott Bash (Brown and Caldwell, Atlanta)
clarified perspectives on LIMS issues.
REFERENCE LIST
Grove, A.S. 1996. Only the Paranoid Survive. New York: Currency/Doubleday.
Hinton, M.D. 1995. Laboratory Information Management Systems. New York: Marcel Dekker,
Inc.

-------
The Use and Abuse of QA/QC Samples in Environmental Data
Acquisition
Emma P. Popek, Ph. D.
Field Analytical Services Manager
IT Corporation
4585 Pacheco Blvd.,
Martinez, CA 94553
Garabet H. Kassakhian, Ph. D.
Consultant
1409 Valverde Place,
Glendale, CA 91208
SUMMARY
The types and quantities of field quality control (QC) and quality assurance (QA) samples should
be based on knowledge of the nature of the contaminant and the project data quality objectives
(DQOs), rather than on protocol requirements (EPA 1993). DQO-based selection of project-
specific QA/QC samples reduces the overall analytical cost of the project. During data validation,
field QA/QC sample results are usually discussed in terms of precision, sample handling, and/or
data comparability. Our experience shows that even when field sample results may be qualified
based on field QC sample results, the overall data usability is hardly ever affected by out-of-
control field QC samples. Other factors, such as missed holding times or poor analytical
quality, usually govern the re-sampling and reanalysis decisions. This paper focuses on the
misuse of soil QC samples, such as field duplicates, trip blanks, and QA splits; however, many of
the issues discussed are also relevant to the use of QA/QC water samples.
INTRODUCTION
The plethora of QA/QC samples required by various protocols can be truly overwhelming,
even to seasoned environmental professionals. The use of these samples is
recommended by the Environmental Protection Agency (EPA) and prescribed by the Department
of Defense (DOD) branches. As Table 1 shows, QA/QC samples include trip blanks, field
duplicates, QA splits, equipment rinsate blanks, and ambient blanks. These samples are collected and
analyzed at a substantial cost to the project; the costs related to QA/QC samples may be as high
as 30% of the total analytical cost. Collection and analysis of these samples are usually mandated
by the protocol that is applied, regardless of the project DQOs and the intended use of the data.
DISCUSSION
The collection of field QA/QC samples has become a routine that is rarely questioned or revised
to better meet the project needs. On many projects QA/QC samples are collected for the sake of
satisfying the protocol requirements; the data obtained are rarely used to impact project
decisions. The qualitative and quantitative acceptance criteria for field QA/QC samples are often
arbitrarily selected and do not take into account the site-specific matrix variability or the nature
of contaminant distribution. Meeting these acceptance criteria is of little consequence for the
project execution. To reflect current industry practice, we flow-charted a prevalent process for
soil sample acceptance, shown in Figure 1. One can see that the outcome of QA/QC sampling
does not, in fact, affect project activities because cost and schedule considerations generally
override the quality issues originating from the analyses of field QA/QC samples.

-------
Table 1. Types of Soil QA/QC Samples

QA/QC sample | Army Corps of Engineers (USACE 1994, 1998) | Navy (NFESC 1996) | Air Force (AFCEE 1996, 1998) | EPA (EPA 1996)
Trip blank | Water and soil (clean sand) | Water blank only | Water blank only | Analyte-free media; no blanks for solid
Field duplicates | Homogenized or collocated for VOCs | Homogenized or collocated for VOCs | Homogenized or collocated for VOCs | Separate samples taken from the same point in space and time
Field replicates or QA splits | Homogenized or collocated for VOCs | Homogenized or collocated for VOCs | Sample divided into two equal parts | No reference
Rinsate (equipment) blanks | For all projects when sampling equipment is cleaned (all agencies)
Ambient (field) blanks | Per DQOs | Per DQOs | Per DQOs | Per DQOs
Source water for rinsate blanks | No requirement or recommendation; however, this practice has been observed (all agencies)
Matrix spikes | On project samples | On project samples | On project samples | Not project specific
Temperature blanks | Required | No reference | Required | No reference

VOC denotes volatile organic compounds
Field Duplicates for Soil
Duplicate samples are usually collected and analyzed at a frequency of 10% of the field samples.
The purpose of field duplicates is to evaluate the sampling precision as relative percent
difference (RPD), with the RPD goal, depending on the protocol, varying from 30 to 50%.
Sample results, as a rule, are not qualified during data validation based on the RPD criterion, and
data sets are not rejected if this criterion is not met. In such cases, a token statement on "matrix
inhomogeneity" usually satisfies all parties involved. Therefore, the main outcome of field
duplicate analysis is some assessment of matrix inhomogeneity with no real consequences on
data usability.
According to the Navy Laboratory Guide (NFESC 1996), field duplicate precision is affected by
sample homogeneity/heterogeneity, precision of sample collection and handling, and analytical
precision, which is determined separately by use of laboratory control samples. Field duplicate
precision calculated as RPD allows the following parameters to be established:
1.	Total sampling and analysis precision
2.	An indirect measure of matrix inhomogeneity
3.	Representativeness of a sampling program or how well the samples represent the site
Experience shows that the third parameter is the most important and the most overlooked one. To
substantiate this statement we offer some case studies.

-------
Figure 1. Soil Sample Acceptance Process Flowchart
Case Study 1.
Samples for VOC analyses cannot be homogenized due to contaminant volatility; therefore,
collocated duplicate samples are commonly collected for VOC analysis. This type of sampling
by its nature reduces the probability of good sampling precision, and the duplicate RPD for such
samples depends on analytical precision and matrix effects.
Table 2 shows a comparison of the VOC results for collocated stockpile soil samples. The
duplicate RPD ranges from 6 to 168. To complicate matters further, collocated QA samples
were collected from each location. Although the QA sample results are comparable to the sample
and duplicate results, the relative standard deviation (RSD), which ranges from 22 to 158%,
confirms what we already know from the RPD data: there is great variability in the
contaminant distribution in the analyzed soil. These data had absolutely no impact on the project
decisions.

-------
Table 2. Unhomogenized Soil Duplicate Precision
1,1,2,2-Tetrachloroethane concentration, mg/kg

Result | No. 1 | No. 2 | No. 3 | No. 4 | No. 5 | No. 6 | No. 7 | No. 8
Sample | 0.010 | 28 | 87 | 74 | 1.6 | 0.2 | 4.2 | 0.2
Duplicate | 0.024 | 51 | 56 | 66 | 1.7 | 2.3 | 0.64 | 0.18
RPD (1) | 82 | 58 | 43 | 11 | 6 | 168 | 147 | 11
QA sample | 0.081 | 20 | 71 | 46 | 0.67 | 2.4 | 3.5 | 6.3
RSD (2) | 98 | 49 | 22 | 23 | 43 | 76 | 68 | 158

(1) RPD is the difference between two concentration values divided by the mean concentration
(2) RSD is the standard deviation divided by the mean concentration
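The two footnoted statistics can be recalculated directly from the reported concentrations. The sketch below reproduces the tabulated RPD of 82 and RSD of 98 for location No. 1 in Table 2 (0.010, 0.024, and 0.081 mg/kg), using the sample standard deviation for the RSD.

import statistics

def relative_percent_difference(value_1, value_2):
    # RPD: difference between two concentrations divided by their mean, in percent.
    return abs(value_1 - value_2) / ((value_1 + value_2) / 2.0) * 100.0

def relative_standard_deviation(values):
    # RSD: sample standard deviation divided by the mean, in percent.
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Location No. 1 from Table 2: field sample, field duplicate, and QA split (mg/kg).
sample, duplicate, qa_split = 0.010, 0.024, 0.081

print(f"RPD (sample vs. duplicate): {relative_percent_difference(sample, duplicate):.0f} percent")
print(f"RSD (all three results):    {relative_standard_deviation([sample, duplicate, qa_split]):.0f} percent")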
These results raise more questions than they answer. Which one of these results
represents the true concentration in a particular section of the stockpile? Don't we already know
that the precision of unhomogenized collocated samples is likely to be poor due to one single
major contributing factor, that is, variability in contaminant distribution in soil? Why did we
gather this information in the first place, since it did not have any influence on project
decisions? And finally, should we have collected more discrete samples instead of collocated
duplicates to better estimate the overall concentrations in the stockpile?
Case Study 2
Homogenized duplicate samples were collected from a former ammunition demolition site. Field
samples collected from surface areas were homogenized four-point composites, and their
duplicates, subsampled from the same composite, were homogenized splits. The target analytes
were metals and trace explosives. Thousands of samples were collected in this manner from
excavations at the site. In this example we have randomly selected nineteen duplicate pairs with
eighty-eight RPD values calculated for various detected target analytes. Table 3 shows that the
RPD for a vast majority of duplicate pairs was below 30%, with the higher RPD values observed
for such metals as lead and barium.
Table 3. Homogenized Composite Duplicate Precision

RPD Range (percent) | Number of Duplicate Pairs
0-30 | 68
30-50 | 12
Greater than 50 | 8
This information tells a story different from the one in Case Study 1. For this project, the
collection and analysis of duplicate samples was useful for determining the sampling precision,
i.e. the precision of compositing and homogenization. The RPD values for these samples truly
represent the sampling and analytical precision. Reproducible results represent true
environmental conditions; therefore, good sampling precision is a measure of representativeness.
Nonetheless, the 10% frequency of the duplicate sample collection at this site seems to be
excessive. Once the homogenization procedure has been established for the project and

-------
confirmed by analytical results, it needs to be re-confirmed only if the procedure
changes.
Trip Blanks
Another example of QC sample misuse is the analysis of water trip blanks with soil samples.
EPA and most of the DOD agencies do not clearly state whether trip blanks should be used for
soil samples. Decisions related to this matter are usually made during the Quality Assurance
Project Plan (QAPP) preparation. We have experienced significant difficulty justifying the DQO-
based use of trip blanks.
Trip blanks are used to determine whether contamination could have been introduced into the
samples while they are handled in the field and are in transport, i.e. in coolers with ice
transported from a site to the analytical laboratory. Ambient contaminants or VOCs emanating
from the samples would then travel through a Teflon-lined septum into a 40-ml vial filled with
analyte-free water. These contaminants are usually chlorinated hydrocarbons, such as methylene
chloride or Freon. Water trip blanks make sense only when they accompany water samples for
VOC analysis, and low or unknown contaminant concentrations are expected.
Soil samples do not provide the same contamination pathway as water samples because they are
not collected in 40-ml vials with Teflon-lined septa. In addition, the soil matrix does not have the
same transport mechanism as water does (adsorption versus dissolution). There are other
differences that make the comparison impossible: different sample handling in the laboratory, a
variety of purge and trap sample introduction techniques, and differences in soil and water
detection limits. Comparison of low level trip blank contamination to low level soil
concentrations is not conclusive and hence meaningless.
The same is true for the practice of monitoring cross-contamination between samples with trip
blanks. This type of contamination is best controlled by such QA measures as sample segregation
and proper packaging. Based on years of reviewing data packages, one can conclude
that trip blank contamination most likely originates in the laboratory or in the source water
used to prepare the blank, not in the field.
The use of clean oven-baked sand for soil trip blanks has also been proposed (USACE 1994,
Lewis 1988). This cumbersome practice raises a different set of issues. For example, how does
VOC transport in organic-free sand compare with VOC transport in soil of mixed lithological
composition and organic carbon content? How can contamination of the sand blank with
ambient laboratory VOCs be prevented?
Convincing an agency or a DOD reviewer to exclude a trip blank shipment with soil samples is
not a trivial matter. Common sense does not always work, and often we are directed to analyze
water trip blanks for soil samples on projects where the action levels are orders of magnitude
greater than the ambient levels that may be found in the trip blanks. With many sample coolers
going to the laboratory, the cost of obtaining this marginal information can be substantial.

-------
QA Splits
QA splits are a valuable tool in detecting data quality problems, and as such, their use is
recommended in Region 9 Best Practices for the Detection and Deterrence of Laboratory Fraud
(EPA 1997). Homogenized QA splits have the most value as they serve as confirmation of
results reported by the primary laboratory. Unless two laboratories employ the exact same
procedures for extraction and analysis, results are usually not comparable. If samples are not
homogenized or laboratories use different extraction and analysis techniques, comparison of
results will not be meaningful.
QA splits are useful for making real-time decisions on data quality only when data comparison is
conducted while the project is still in the field phase. Unfortunately, the reality is quite different,
and data comparisons are usually made long after the field work has been completed.
CONCLUSION
The two case studies provide striking examples of the use and abuse of QA/QC samples. In our
view, the data quality objectives of Case Study 1 did not justify the collection and analysis of
duplicate samples, because collocated samples are not true field duplicates and their precision
reflects only matrix variability with respect to contaminant distribution.
Substantial sample variability is likely to be expected for any sample collection technique for
VOC analysis (Vitale 1999). The same is true for many other contaminants. The analysis of eight
duplicate samples increased the project analytical cost by 10%; however, the obtained data did
not have any impact on the project decisions.
In Case Study 2, however, duplicate samples provided important information because they truly
reflected the sample handling procedures that were susceptible to human error. These procedures
were built into the sampling protocol, and therefore, if conducted improperly, would have
produced unrepresentative site data. It is interesting to note that even when the RPD values were
very high, no corrective action (re-sampling, additional excavation, etc.) was undertaken, and
project activities were not impacted.
A wealth of experience exists today that equips us with knowledge of contaminant transport
in soil. Coupled with available information on site history and background, this knowledge
allows us to make valid assumptions about contaminant distribution and variability in soil at the
site. If the distribution is expected to be sporadic, to better characterize the site we should collect
more field samples instead of duplicate samples, or consider compositing as a sampling option. If
the historic use of the site is completely unknown, a few duplicates collected early in the
remedial investigation will provide the information needed for making educated decisions about the
future sampling design.
As we move toward implementation of Performance Based Measurement Systems (PBMS), we need to re-evaluate the practices related to the collection of QA/QC sample data. Some recommendations that will significantly reduce project analytical costs without compromising quality are as follows:
6

-------
1.	Collect trip blanks for water samples only when low level contamination is a matter of
concern
2.	Justify the type and frequency of QC duplicate collection based on the project DQOs; use the
obtained data in sampling design process for further site investigation or remediation
3.	Use QA samples only when justified by the sampling design and make decisions related to
data comparability while the project is still active
4.	Reduce excessive collection of equipment rinsate samples
5.	Perform project-specific matrix spikes only if the obtained data will be used for project
decisions
To reduce unnecessary costs, QA/QC samples should not be collected if the resulting data will not be used for project decisions. When justified by the project DQOs and meaningfully interpreted, these sample results can be used to develop representative sampling designs or to establish the true sampling precision; otherwise, their collection becomes an expensive yet unnecessary exercise.
REFERENCES
1.	United States Environmental Protection Agency (EPA). 1993. Data Quality Objectives
Process for Superfund, Interim Final Guidance. EPA 540-R-93-071.
2.	United States Army Corps of Engineers (USACE). 1994. Requirements for Preparation of
Sampling and Analysis Plans. EM 200-1-3.
3.	United States Army Corps of Engineers. 1998. Shell for Analytical Chemistry Requirements.
Version 1.0.
4.	Naval Facilities Engineering Service Center (NFESC). 1996. Navy Installation Restorations
Laboratory Quality Assurance Guide.
5.	Air Force Center for Environmental Excellence (AFCEE). 1996. Model Field Sampling Plan.
Version 1.0.
6.	United States Environmental Protection Agency. 1996. Test Methods for Evaluating Solid
Waste, Physical/Chemical Methods, SW-846, Third Edition, Update III.
7.	Air Force Center for Environmental Excellence. 1998. Quality Assurance Project Plan.
Version 3.0.
8.	David L. Lewis. 1988. Assessing and Controlling Sample Contamination. In Principles of
Environmental Sampling. Edited by Lawrence H. Keith. American Chemical Society.
9.	United States Environmental Protection Agency. 1997. Region 9 - Best Practices for the
Detection and Deterrence of Laboratory Fraud. Version 1.0.
10.	Rock J. Vitale et al. 1999. Comparison of VOC Results between Methods 5030 and 5035 on
a Large Multi-State Hydrocarbon Investigation. Environmental Testing and Analysis.
Volume 8, Number 1. January/February.
7

-------
14

-------
THE FLORIDA PERSPECTIVE - LESSONS LEARNED
Sylvia S. Labie
Florida Department of Environmental Protection
Quality Assurance Section MS 6505
2600 Blair Stone Road
Tallahassee, FL 32399-2400
SUMMARY
Several programs in two different agencies administer the oversight of environmental data
quality in Florida. Differences in objectives and an occasional conflicting requirement leave the
regulated community and the data consumer with a sometimes confusing, slightly skewed
concept of data quality. Recognizing that the existing system of programs resulted in inefficient
quality assurance oversight, the Department of Environmental Protection and the Department of
Health entered into a Performance Partnership Agreement with EPA Region IV to explore
procedures that would make the most efficient use of all resources.
A list of quality management tools was identified through a series of focus groups, meetings with
EPA and benchmarking exercises that considered programs in other states. In addition,
electronic tools were developed to assist in many of the routine, repetitive, time consuming
activities.
The result is a balanced mix of activities and responsibilities that takes full advantage of the
growing NELAP program, electronic tools, information access and the concept that quality is a
shared responsibility. The outcome should mean less intrusive oversight and better-informed data consumers who will, in turn, have the ability to understand and utilize the data to make sound environmental decisions.
SETTING THE STAGE
Florida, to the rest of the world, has a strong quality assurance program, with requirements
mandated by the Quality Assurance Section of the Department of Environmental Protection and
the Department of Health's Laboratory Certification Program. Why is it, then, that DEP compliance staff encounter increasing incidents of data quality problems, shoddy laboratory operations, and even data fraud? Part of the answer lies in a growing understanding of data
quality by DEP program managers, but of greater concern is the manner in which Florida
currently conducts business.
The oversight of data quality in Florida resides in two agencies. The Florida Department of
Health Laboratory Certification Program (DOH LCP) certifies laboratories for testing drinking
water and environmental samples. Participation in the program is mandatory for laboratories
providing compliance data for drinking water and domestic wastewater; and is voluntary for all
other laboratories. For most other environmental programs, except air, the Department of
Environmental Protection's Quality Assurance Section (DEP QAS) requires that the sample
collection and testing organizations submit a Comprehensive Quality Assurance Plan for

-------
approval by DEP. These plans, while similar to Quality Assurance Project Plans, address all
capabilities of an organization rather than those that might be selected for a particular project.
In addition to these programs, the DEP District staff conduct performance audit inspections
(PAIs) as a part of an ongoing commitment to the NPDES program. These PAIs include audits
of the laboratory that generates data for a given facility.
A laboratory in Florida has the potential of being audited by any of the above programs -
sometimes concurrently, but more often than not, at separate times. Each program has a different
set of objectives and requirements, which can, in some cases, impose conflicting requirements on
the laboratory. Clearly, audits by three state programs are a poor use of resources, and create
confusion in the laboratory community about which requirements to follow. This, in conjunction
with the growing number of problem laboratories, led the Quality Assurance Section to conclude
that the current system was inefficient and did not provide adequate quality assurance oversight.
IDENTIFYING RESOURCES
DEP began to assess the overall effectiveness of the quality management program in Florida. A
series of focus groups, a performance partnership agreement with EPA Region IV and
benchmarking activities sought to identify the basic principles and activities of an effective
Quality Management Program.
The result of these exercises was the identification of four basic principles and a set of tools to be
used in managing an effective quality assurance oversight program.
These principles are:
•	Good decisions require good data;
•	An effective quality management system is the basis for quality data;
•	Quality of Science Reviews are necessary to determine if the Department's decisions are adequately supported by the collected data; and
•	Data quality is a shared responsibility.
The quality assurance tools that are needed to implement an effective quality assurance oversight
program include quality planning (the DQO and DQA process as well as quality plans), systems
audits, proficiency testing and data validation.
THE CURRENT PROGRAM
In evaluating the current roles and responsibilities of the DOH LCP and the DEP QAS programs,
it was determined that the programs emphasized three of the QA tools: systems audits,
proficiency testing and quality plans. These activities tend to stress the importance of the
process rather than the end product, focusing on a laboratory's potential to produce good data
rather than assessing the generated results. Further, DEP and the data consumers (permittees,
respondents, etc.) depended on the expertise and evaluations of the 15 professionals in the DEP
QAS and DOH LCP programs to provide them with the answer to "How good are my data and
my laboratory?", and the answer, based on the use of these three tools, provided a skewed, overly
optimistic picture of laboratory data quality in Florida.

-------
BACK TO BASICS
Recognizing the overemphasis on certain QA tools, the Department reevaluated the manner in
which all the basic principles and quality assurance tools were being implemented. It was
determined that data validation, which is based on the evaluation of actual data generated, and is
a measure of an organization's routine performance, had been sadly neglected.
Also considered were the emerging national laboratory accreditation program, enhanced
information access, interactive software programs to minimize tedious, repetitive tasks, and the
current resources in the DOH LCP and DEP QAS.
THE NEW LOOK
The result of all the meetings and benchmarking exercises is a new look for data quality
oversight in Florida, beginning with the realignment of responsibilities within the DOH LCP and
the DEP QAS.
DOH LCP will be responsible for accrediting all environmental laboratories in Florida, and DEP
QAS will concentrate its resources on evaluating the resulting data. While the new mix does not appear to be drastically different from the current division of responsibilities, the difference is the
National Environmental Laboratory Accreditation Conference (NELAC) and its associated
National Environmental Laboratory Accreditation Program (NELAP). The NELAC standards
effectively consolidate the currently separate requirements of DOH and DEP into a single
program, making NELAP the logical step in consolidating Florida's laboratory oversight
requirements in a single agency.
DEP will be able to concentrate on oversight of field activities as well as data validation and data
quality assessments. By using industry's approach to quality control - routinely testing widgets
from the assembly line for acceptability - the Department will be able to monitor routine data
quality. The focus on the quality of the end product (data) as opposed to the initial specifications
(QAP) will provide data managers relevant data quality information necessary for making sound
environmental decisions.
Once the new roles and responsibilities were determined, it became apparent that neither the
DOH LCP nor DEP QAS had sufficient resources to implement all the desired changes. Since
the DEP program managers are responsible for making the environmental decisions, it was felt
that they are in the best position to determine project objectives and data quality needs as well as
deciding whether the submitted data met expectations - tasks that were often left to the DEP QAS, who lacked in-depth knowledge of a given program. The success of the revised program lay in sharing the responsibility for data quality. However, if the concept of a collectively shared responsibility was to become a reality, the program managers and data users needed access to the same information and tools that the LCP and QAS use in evaluating data or conducting audits.
DEP programs now have access to many of the data bases that provide information on the
capabilities, certification status, and credentials of organizations. In addition, the DEP Internet

-------
and Intranet sites provide access to SOPs, guidances, checklists, rules, software and other tools
that are used by QA staff in auditing and evaluating data.
The most significant accomplishments have been in the area of software development. These
software packages minimize the amount of time required to perform tedious, often repetitive
tasks.
DEP currently offers QA Planner, an interactive software package that allows the user to develop a
Quality Plan in the format required by DEP. It also evaluates the plan for completeness and
approvability during the development process, and electronically reviews the plan when received
by DEP. This process occurs in a matter of 10 minutes instead of 30 days and reduces the
iterative process associated with manual reviews.
QA Planner is being transformed into the Florida Planner/Certifier, which will be used by the
DOH LCP to accept electronic applications, as well as a means of managing the certification
program.
The newest electronic tool is the Validator. In its current form, the program accepts manual or
electronic data sets that are evaluated for permit/standard exceedances and data quality. The
process concludes with a report that identifies problems and recommends appropriate actions. A
200-page groundwater monitoring report is reduced to the critical problems in a matter of
minutes instead of hours. As an added benefit, the resulting database is designed so that the
information can be readily uploaded to the various department data repositories.
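To illustrate the kind of check the Validator performs, the sketch below flags permit/standard exceedances in a small data set. It is not the DEP software; the parameter names, limits, and results are hypothetical illustration values.

```python
# Minimal sketch (not the DEP Validator): flag permit/standard exceedances
# in a groundwater monitoring data set. All limits and results are hypothetical.

permit_limits = {          # parameter -> limit (mg/L), hypothetical
    "benzene": 0.001,
    "lead": 0.015,
    "nitrate": 10.0,
}

results = [                # (well, parameter, result in mg/L), hypothetical
    ("MW-1", "benzene", 0.0004),
    ("MW-1", "nitrate", 12.3),
    ("MW-2", "lead", 0.021),
]

def find_exceedances(results, limits):
    """Return only the results that exceed their permit/standard limit."""
    flagged = []
    for well, parameter, value in results:
        limit = limits.get(parameter)
        if limit is not None and value > limit:
            flagged.append((well, parameter, value, limit))
    return flagged

for well, parameter, value, limit in find_exceedances(results, permit_limits):
    print(f"{well}: {parameter} = {value} mg/L exceeds limit of {limit} mg/L")
```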
In order to determine if the changes in the QA program have a positive impact on the
environmental decision-making process, the Department instituted the use of Quality of Science
Reviews (QSRs) as a mechanism to evaluate whether the Department's decisions are truly
supported by adequate and valid data. QSRs are large-scale program audits that review those
program activities relating to data quality including identification and use of project/study
objectives; selection of appropriate and sufficient parameters, methods and equipment; processes
to evaluate the data in terms of quality and applicability; and mechanisms to manage the data.
This process will identify the strengths and weaknesses in each program so that the overall
efficiency and effectiveness of the decision-making process can be improved.
LESSONS LEARNED
•	Accreditation doesn't necessarily ensure good data. Accreditation assesses the potential for generating quality data;
•	Quality Assurance Plans don't guarantee quality. They are a statement of intent - a commitment to follow a document that should result in data that meet the expectations or data quality objectives of a particular project;
•	There is a need to balance the use of the quality assurance tools to achieve a realistic picture of data quality; and
•	Auditing - on-site audits of organizations, internal Quality of Science Reviews of departmental programs, and evaluation of submitted data - is crucial in determining if

-------
organizations are functioning as expected, the resulting data are usable for their intended use, and the Department is utilizing the data to make effective environmental decisions.
EXPECTED OUTCOMES
The more effective use of the QA tools and QA principles in Florida will lead to a better
understanding of data quality among data consumers and an improved product: environmental data. In addition, the Department expects that the process will result in:
•	more effective use of the QA tools, and a more balanced, less intrusive approach to QA oversight in Florida as a result of realigning the responsibilities of DOH and DEP;
•	a heightened awareness of the importance of quality data in environmental decision making through the promotion of the concept of shared responsibility. In addition, program managers, who are in the best position to understand the needs and goals of their programs, will have the resources and expertise available to them to take a more active role in determining the type of data quality that best fits their needs; and
•	more efficient use of staff time through the use of electronic tools that have proven effective in minimizing the time spent on repetitive, but critical activities.
The end result will be data users who have a better understanding of the quality of data. They
will also have the tools necessary to evaluate the usefulness of the data in making the types of
decisions that will best protect the public health and the environmental resources in the State of
Florida.

-------
Larry Becker and Heidi Novotny
U.S. Army Corps of Engineers, Washington DC
US Army Corps
of Engineers ®
Engineer Manual 200-1-2
(Download from http://www.usace.army.mil/inet/usace-docs/eng-manuals/em.htm.)
TECHNICAL PROJECT PLANNING
(TPP) PROCESS
[Diagram: Existing Site and Customer's Goals lead to:]
•	Detailed Project Objectives
•	Detailed Data Quality Objectives
•	Technical Basis for Sampling and Analysis Plan; Quality Assurance Project Plan; and Work Plan
•	Accurate Cost Forecasting
•	Progress to Site Closeout
✓	Focused on site closeout!
✓	Useful for all sites (small/simple to large/complex)!
✓	Applicable when planning site investigation; design; construction; operation and maintenance; and long-term monitoring activities!
✓	Guidance for project managers, engineers, scientists, attorneys, customers, regulators, and other stakeholders!
✓	Use of TPP Process typically saves 10 to 15 percent of project time and costs!
This brochure provides only an overview of the TPP guidance provided in EM 200-1-2.

-------
Phase I
Identify Current Project
Phase I activities
accelerate protection of
human health and the
environment and expedite
progress to desired future
use conditions at a site.
•	Decision makers and
technical personnel are
brought together;
•	Current project is
identified; and
•	Project objectives are
documented.
Phase I is designed to
"front-load" conflicts and
decision making.
Resultant project
efficiency more than
compensates for the early
commitment to proactive
communications and
detailed, site-specific
planning.
Phase II
Determine Data Needs
Phase II activities involve
an evaluation to determine
if additional data are
needed to satisfy the site-
specific project objectives.
•	Data needs are
determined; and
•	Data needs are
documented.
Phase II is designed to
support the detailed
planning required to
determine and document
data needed for the current
project, and subsequent
Who should use the TPP Process?
Project managers and their technical personnel should use the TPP
Process to help satisfy a customer's expectations. The customer,
regulator, and other stakeholders should also participate during the
TPP Process to maximize the effectiveness of planning,
implementation, and assessment efforts.
What is the TPP Process?
The TPP Process is a comprehensive and systematic process that
involves four phases of planning activities. The TPP Process was
developed for identifying project objectives and designing data
collection programs for hazardous, toxic, and radioactive waste
(HTRW) sites. Use of the TPP Process is consistent with the
philosophy of taking a graded approach to planning that will
produce the type and quality of results needed for site-specific
decision making.
Why should the TPP Process be used?
Use of the TPP Process ensures effective and efficient progress to
site closeout within all project constraints. Use of the TPP Process
saves resources by reducing both the project duration and the
project expenditures. Application of the TPP Process is also
simpler and more complete than EPA's 7-Step Data Quality
Objective (DQO) Process.
When should the TPP Process be used?
The TPP Process should be used as follows:
•To plan a new project;
•	To review existing project plans; and
•	To plan the next executable stage of site activities.
Where should the TPP Process be used?
The TPP Process should be used when planning any site activity
(i.e., investigation; design; construction; operation and
maintenance; or long-term monitoring).
How is the TPP Process used?
•	Use of the TPP Process is led by the Project Manager, and may be facilitated by an outside party;
•	A multi-disciplinary team, identified during Phase I, uses the
TPP Process to guide their planning efforts; and
•	Use of the TPP Process requires that personnel represent
decision maker, data user, and data implementor planning
perspectives.
2

-------
Phase III
Develop Data Collection
Options
Phase III activities ensure
the customer will have the
information required for
related business decisions.
•	Sampling and analysis
approaches are planned;
•	Data collection options
are developed; and
•	Data collection options
are documented.
Phase III is designed to
support planning sampling
and analysis approaches
that will satisfy the data
needs for a project.
Phase IV
Finalize Data Collection
Program
Phase IV activities
challenge a TPP team to
discuss data collection
options and finalize a data
collection program that
best meets the customer's
short- and long-term goals
for a site.
•	Data collection program
is finalized; and
•	Data collection program
is documented.
Phase IV is designed to
provide guidance for
documenting data
collection programs with
project-specific DQO
statements. Many TPP
products can also be
attached to a project's
management plan.
KEY CONCEPTS
•	Site Closeout is achieving the "walk away goal," or final
condition of a site, as envisioned by the customer. The team
develops an effective site closeout statement after considering
future land use; the site's regulatory compliance status and
issues; and the customer's preferences for the final condition of
the site.
•	Project Objectives must be satisfied or resolved in order to
progress from the current site status and condition to site
closeout. Phase I efforts to identify and clearly document project
objectives ensure that site-specific regulatory issues and
environmental conditions are successfully addressed.
•	Basic, Optimum, and Excessive are very powerful terms used
for classifying project objectives, grouping data needs, and
presenting data collection options for a customer's consideration.
•	Data Quality Objective (DQO) statements are prepared during
Phase IV, include nine data quality requirements, and meet
EPA's definition of a DQO.
EFFECTIVE AND TIMELY PLANNING
A premise of the TPP Process is that each individual contributing
to a project has his/her own project execution style. The
systematic TPP Process enables a project manager to achieve an
appropriate balance of project execution styles within a team,
accelerate progress to site closeout, and reduce expensive time and
efforts during the "do," "check," and "finish" stages of any project.
As illustrated below, benefits of effective and timely planning
include:
•	Less time is expended to "check" and "finish" a well planned
project; and
[Figure: comparison of project timelines, showing that planning benefits diminish with too much commitment to project planning.]

-------
• Less overall time (and
money) is expended when
early efforts are focused and
the team strives to optimally
plan a project.
Applicability
The TPP Process applies
to all HQUSACE elements
and USACE commands
responsible for HTRW
projects.
Availability
Electronic copies of the
TPP Process guidance and
other USACE publications can be downloaded from http://www.usace.army.mil/inet/usace-docs/.
Points of Contact
•	HQ Proponent Larry
Becker, USACE (202)
761-8882
•	Subject Matter Expert
Heidi Novotny, USACE
(402) 697-2626
•	Subject Matter Expert
Craig Willis,
Black &Veatch (913)
458-6656
•	PROSPECT Workshop
Joy Rodriquez, USACE
(256) 895-7448
Workshops
A hands-on case study
workshop is available as a
2.5-day PROSPECT
workshop for individuals
or entire project teams.
On-Board Facilitation
On several projects, a
facilitator has introduced
the TPP Process and then
helped the TPP team to
apply the process and
capture the TPP plans for
a project.
4

-------
TPP Process Guidance
Foreword
Chapter 1	Identify Current Project (Phase I)
Chapter 2	Determine Data Needs (Phase II)
Chapter 3	Develop Data Collection Options (Phase III)
Chapter 4	Finalize Data Collection Program (Phase IV)
Chapter 5	Implement and Assess Data Collection Program
Appendix A	References
Appendix B	Abbreviations and Acronyms
Appendix C	Definitions
Appendix D	Outline of TPP Activities
Appendix E	Crosswalk to EPA's 7-Step DQO Process
Appendix F	Worksheets for Documentation
Appendix G	Verification of DQO Attainment
✓	The TPP Process is a critical component of USACE's quality management system that conforms to the American National Standard for planning the collection and evaluation of environmental data.
✓	The TPP Process supports development of management plans for projects as required by the Engineer Regulation governing program and project management.
✓	The TPP Process satisfies the systematic planning requirements of EPA's mandatory agency-wide quality system.
✓	Documentation tools provided within the TPP Process guidance encourage detailed data collection planning and contribute to maintaining institutional site knowledge.

-------
Measurable Benefits of TPP Facilitation
USACE Project
TPP Facilitation
Cost & Schedule Effects
Qualitative Effects
Point of Contact
CELRB
Luckey Site
(FUSRAP)
3-4 February 1998
4 USACE personnel
2 USACE contractors
TPP Contractor ($ 1,700)
Cost Savings > $ 150,000
Short-term schedule was met and
long-term schedule was created
to meet mandated milestones.
Project costs and schedules were
brought under control by
CELRB.
CELRB personnel discovered
where historical site data was
stored and what site-specific
contractor expertise existed.
Resulting technical plans were
better and much more
defensible.
Jim Karsten, Project Manager
716/879-4245
Jim Thurn, Project Lead
716/879-4180
Enabled CELRB to "develop an effective contracting strategy for the RI work." "The streamlined contracting resulted in a more efficient RI."
CELRB
Painesville Site
(FUSRAP)
4 February 1998 (only 2 hours)
2 USACE personnel
TPP Contractor ($300)
Cost savings cannot be estimated
because application of TPP
Process and TPP facilitation was
limited to identifying a realistic
project schedule and the TPP
tools and concepts to be used on
the project.
Technical team was better
prepared to use comprehensive
and systematic process for
reviewing and finalizing data
collection plans.
Technical roles were established.
Jim Karsten, Project Manager
716/879-4245

-------
USACE Project
TPP Facilitation
Cost & Schedule Effects
Qualitative Effects
Point of Contact
CEMVS
St. Louis Airport Site
(FUSRAP)
25-27 February 1998
10-12 USACE personnel
6-7 USACE contractors
2-3 Federal and State Regulators
TPP Contractor ($3,000)
$2,500,000 cost reduction when
"excessive" data needs were
deleted.
$500,000 cost increase to
include omitted sample
collection and analyses needed
for site decisions.
Additional cost savings are not
yet quantifiable.
19 specific action items were
identified and directly linked to
short-term project schedule.
"Train wreck" schedule was met
and Regulatory review cycles
were significantly reduced.
Improved communications and
technical relationships with
Federal and State Regulators.
Clear planning objectives
enabled field personnel to
further control costs and make
good field decisions.
Changed field conditions were well managed, and acceptable data was collected on schedule and within budget.
Regulators agreed that several
existing technical evaluations
were sufficient and that some
data collection efforts could be
deleted as "excessive" to the
program.
Legal Counsel issues related to
future land use and other PRPs
were resolved before field work.
Tom Freeman, Project Manager
314/331-8785
Don Meier, Technical Lead
816/983-3890
CEMVS
St. Louis Downtown Site
(FUSRAP)
25 March 1998
9-11 USACE personnel
5 USACE contractors
2-3 Federal and State Regulators
4 representing other PRPs
TPP Contractor ($ 1,100)
Cost Savings > $ 180,000
8 administrative and technical
constraints were recognized as
critical path action items. These
critical path items would have
otherwise delayed the already
demanding schedule.
"Train wreck" schedule was met.
Resolved complex and
contentious issues regarding
background sampling within
historic fill material.
Gained Regulators' support for
using available data from
adjacent sites in lieu of some
additional site-specific data
collection.
Tom Freeman, Project Manager
314/331-8785
Don Meier, Technical Lead
816/983-3890

-------
USACE Project
TPP Facilitation
Cost & Schedule Effects
Qualitative Effects
Point of Contact
CESWT
AOC #34, Little Rock
AFB
(DERA)
3 November 1998
4-5 USACE personnel
TPP Contractor ($2,400)
"Value-added is an intangible
benefit that remains unseen by
Customer. Lessons learned
using TPP Process have
subsequently improved project
execution on several other
AOCs."
(Measurable cost savings are
limited because Site Inspection
samples had been collected, but
not yet analyzed or interpreted.)
Team recognized that site-
specific "background"
concentrations should be
immediately obtained and
tabulated to prevent subsequent
schedule delay.
Clear reporting of current and
future project objectives was
also realized as an important
means to achieving the site's
schedule.
Timely involvement of the Risk
and Compliance Data User
perspectives were determined to
be critical to the project's
success.
Additional discussions with
Customer and Regulator were
planned to resolve several
anticipated regulatory issues.
Verification of future land use
was identified as a critical action
item to ensure team could
develop a more specific site
closeout plan.
Bob Thurman, Technical Lead
918/669-7170
Lloyd Lewis, Environmental Engr.
918/669-7172

-------
USACE Project
TPP Facilitation
Cost & Schedule Effects
Qualitative Effects
Point of Contact
CESWT
Tulsa Brownfield Project
(SFO)
4 November 1998
6-8 USACE personnel
TPP Contractor ($2,400)
Cost savings are expected, but
can not be estimated because
project had not begun.
Team determined that very
precise project objectives would
be needed to better define the
current project and ensure
$50,000 funding constraint was
not exceeded.
Strategy for dealing with
limited funding was created to
prepare CESWT for subsequent
discussions with State and
Federal Regulators, as well as
City of Tulsa and other
interested parties.
Team realized that the roles and
responsibilities of all the
decision makers and
stakeholders needed to be better
understood.
Team determined that both
Compliance and Responsibility
Data User perspectives need to
contribute to the project
planning.
Determined that evaluation of
potential points of compliance
and media of concern must be
prioritized for efforts to remain
within project's budget
constraint.
Carol Wies, Project Lead
918/669-7519
CESWT
Landfill #28, Fort Chaffee
(BRAC)
5 November 1998
4-5 USACE personnel
2 USACE contractors
2 USACE Customers
TPP Contractor ($2,400)
Cost savings are now being
estimated.
Potential schedule effects of new
site operator, as well as technical
review cycles were considered
and factored into short-term
schedule.
An interim site closeout
statement and schedule was
drafted by the team to focus
planning efforts.
Team realized that current role,
responsibility, and authority of
the future owner needed to be
clarified prior to data collection.
Need for Compliance and
Responsibility Data User
perspectives was realized.
Andrew McVeigh, Project Manager
918/669-4326
Susan Trussell, Technical Lead
918/669-7046

-------
USACE Project
TPP Facilitation
Cost & Schedule Effects
Qualitative Effects
Point of Contact
CENWK
Former Schilling AFB
(FUDS)
12, 25 January 1999
1 February 1999
4-6 USACE personnel
1-2 State Regulators
TPP Contractor ($4,400)
Cost savings are now being
estimated. (Cost savings,
pending Regulator approval of
plans could exceed $230,000.)
Inter-related tasks were
identified and assigned to ensure
that schedule for data collection
efforts would not be delayed.
Current schedule of activities
will be very focused to ensure
that IRAs are employed if
appropriate and quickly verify if
the site's relative risk score
actually drops priority of site to
a low level.
Assembled and activated an
effective technical project team.
Constrained technical resources
will be able to selectively
support the project and remain
very focused because they
contributed to documenting and
prioritizing the project's
objectives.
Improved the Regulator's
tolerance for the USACE's
limited data collection plans
within select OUs.
Judy Meier, Project Manager
816/983-3569
Amy Darpinian, Project Chemist
816/983-3897
Carol Dona, Environmental Engr.
816/983-3573
CENAB
MOTBY
(BRAC-Army)
22-24 February 1999
3-6 USACE personnel
1 USACE contractor
TPP Contractor ($5,000)
Cost Savings >$1,100,000
(Cost savings, pending
Regulator approval of plans and
contractor negotiations, may
actually exceed $1,500,000.)
"Neck Breaking" schedule will
now be met and all Customer's
milestones can now be satisfied.
Long-term schedule understood.
Technical strategies formulated
for discussions with Regulators,
local redevelopment authority,
and active RAB.
Technical and funding
constraints and dependencies
much better understood and
addressed.
Components produced for PMP
and project plans should now
pass AEC's Interim Technical
Review.
Gladys Hester, Design Team Leader
410/962-2217
Margaret Martin, Environmental
Engr.
410/962-3500

-------
USACE Project
TPP Facilitation
Cost & Schedule Effects
Qualitative Effects
Point of Contact
9 USACE Projects
All Multi Disciplinary Teams!
Only limited Customer and
Regulator participation to date,
yet tremendous savings and
effects of TPP Process!
TPP Contractor Costs < $23,000
Minimum Savings > $4,000,000
All Schedules Met!
"Train wreck" and "Breakneck"
Milestones Achieved!
Improved Project Focus!
More Defensible Technical
Plans!
Increased Customer Satisfaction!
Improved Regulator Relations!
Note: Each Point of Contact edited,
reviewed, and approved this
summary information. Contacts can
also provide additional information
as requested.

-------
BEYOND ENVIRONMENTAL DATA:
Using the Malcolm Baldrige Criteria to Assess Organizations and
Becoming a Certified Quality Manager
Mark Doehnert
Quality Assurance Manager
U. S. Environmental Protection Agency
Office of Radiation and Indoor Air
401 M St., SW
Washington, DC 20460
SUMMARY
Organizations today face significant challenges, and these challenges translate to
new opportunities for the environmental data quality professional. As quality professionals, one
significant opportunity to share quality information and knowledge is promoting the Malcolm
Baldrige Criteria to assess and improve your organization. Becoming professionally certified is
another opportunity.
Organizations today face significant challenges, and these challenges translate to new opportunities
for the environmental data quality professional that go well beyond the traditional tasks and
responsibilities for environmental data. As specialists in quality management and quality
systems, we are uniquely positioned to help the organizations we work for and with to improve
processes and results affecting all key stakeholders. These stakeholders include customers,
employees, suppliers, and the public. Two recent articles in Quality Progress illustrate the point.
In his regular Career Corner, Greg Hutchins' article "Paddling like Crazy" (Hutchins 1998) likens not meeting the challenges and moving to new opportunities to standing still in a moving stream. The moving stream is organizational change such as growth, downsizing, and reorganization. If we don't see the stream and effectively move with it, we might suddenly end up in the wrong spot. His challenges to the quality professional include getting a boat, which he suggests is lifelong learning. He then suggests what we should be doing, and the intensity with which to do it is captured in his term "paddling like crazy." His paddling includes actions like getting
out of the box in our thinking, venturing into in-demand multi-disciplinary areas, becoming a
vertical quality specialist and a horizontal business generalist, and becoming our organization's
chief quality knowledge officer and the sharer of its quality information and knowledge. He also
includes becoming professionally certified as one of his suggestions.
In her article "Quality Today, Recognizing the Critical SHIFT," Lori Silverman (Silverman
1999) outlines five trends in the quality field. First, she points out that quality is going softer in
that business success cannot be achieved without attending to needs of the employees. This
includes factors like working in teams and measuring employee satisfaction. Secondly, she
points out that quality went into disfavor or hiding in the early 1990's as prior quality efforts like
total quality management were discontinued. Third, she points out that many U. S. businesses
1

-------
implemented quality tools and methods without appreciating an overall system. She says that
quality must be integrative such that tools are used in concert with each other. Fourthly, she says
that quality is far-flung, with interest in quality reaching to the far corners of the world. Finally,
she points out that quality has gone technical in that highly sophisticated, technical, statistically
based tools are needed.
An example of applying much of what Hutchins and Silverman suggest and moving beyond the
traditional quality system for environmental data is promoting the Malcolm Baldrige Criteria to
assess and improve organizations. The term "manage by Baldrige" is being heard more
frequently as thousands of U. S. organizations apply Baldrige. The Baldrige criteria are even
being used by government organizations such as the Naval Sea Systems Command, the Air
Force Air Mobility Command, Brookhaven National Laboratory, and Los Alamos National
Laboratory.
The Malcolm Baldrige criteria are widely accepted in the U. S. and around the world. More than a million copies of the criteria have been distributed since 1988. The criteria are goal-focused, aimed at delivering improving value to customers and improving overall performance. The criteria address approach, deployment, and results, and the focus is on key processes, business results, and mission effectiveness. The criteria also support the Government Performance and Results Act.
In the program being used by the Naval Sea Systems Command through its Inspector General's office (IG), the organization being assessed completes a Self Assessment based on the current Malcolm Baldrige criteria. The IG team completes a validation visit to validate and assist rather
than just verify compliance. The IG team is made up of assessors from other organizations
within the Naval Sea Systems Command, and I personally served as an assessor for seven
inspections using both traditional and Malcolm Baldrige techniques. The organization being
assessed then develops an improvement plan based upon their self-assessment and the IG's
validation report. Responses to traditional compliance-based inspections were just in response to
each specific finding, while a Baldrige-based improvement plan has a total business focus.
The Baldrige criteria for business are founded on seven categories: leadership, strategic planning,
customer and market focus, information and analysis, human resource focus, process
management, and business results. Now, there are also criteria for education and health care.
The criteria are evaluated on the dimensions of approach, deployment, and results. Approach is
how one addresses the Baldrige requirements — the methods used. Deployment is the extent to
which an approach is applied. Results describe the outcomes in achieving purposes of an item,
such as customer focused results. Results are based on current performance, performance
relative to comparisons and/or benchmarks, and rate, breadth, and importance of improvements.
The criteria also have scoring guidelines. A maximum of one thousand points is distributed over
the seven categories. Nearly half of the total points are in the business results category.
The benefits realized with the Naval Sea Systems Command's new inspection process include
the IG assuming a new role of a business consultant rather than a compliance inspector. The
process reinforces the Naval Sea Systems Command's Strategic Plan. There is total command
2

-------
commitment, increased awareness of and interface with customers, and an opportunity to improve processes and business results. Since the inspections, by their new nature, identify best practices,
these are then widely disseminated.
Finally, we as environmental data quality professionals have an opportunity to become
professionally certified. I personally support certification because the quality system itself
promotes training and qualification, and because it promotes verification and objective quality
evidence. Being certified presents a verification that we have, in the words of the American
Society for Quality criteria, "demonstrated a proficiency within and a comprehension of a
specified body of knowledge." Furthermore, the body of knowledge for the Certified Quality
Manager helps us become a vertical quality specialist and a horizontal business generalist. It
positions us to become our organization's chief quality knowledge officer and the sharer of its
quality information and knowledge.
For an environmental data quality professional, becoming certified as a quality manager requires a
fair amount of reading, study, and experience beyond that based upon American National
Standard ANSI/ASQC E4-1994. The body of knowledge includes understanding: quality
standards (like the Malcolm Baldrige criteria), organizational assessment, customer satisfaction
surveys, project management, statistical analysis, control charts, quality control tools, team
management, and training needs analysis.
CONCLUSION
Yes, we are in a time of significant change. I believe that we as environmental data quality
professionals can and must take advantage of lifelong learning. Promoting effective programs
that go beyond environmental data quality like the Malcolm Baldrige Criteria can prove a
significant win-win for you and your organization. Finally, when you become certified, you gain
a much greater appreciation for how everything fits together.
ACKNOWLEDGMENTS
I wish to thank the Commander, Naval Sea Systems Command's Office of the Inspector General
for providing me the opportunity to participate in the Malcolm Baldrige Command Performance
Inspection (CPI) Process and for providing me with material for this paper.
REFERENCE LIST
American Society for Quality. 1999. Certified Quality Manager Certification Requirements.
American Society for Quality. 1995. American National Standard: Specifications and
Guidelines for Quality Systems for Environmental Data Collection and Environmental
Technology Programs. ANSI/ASQC E4-1994.
Hutchins, G. 1998. Career Corner: Paddling like Crazy. Quality Progress 31:11
3

-------
(November): 144.
National Institute of Standards and Technology (NIST). 1999. Malcolm Baldrige National
Quality Program: Criteria for Performance Excellence.
Silverman, L. L. 1999. Quality Today, Recognizing the Critical SHIFT. Quality Progress 32:2
(February): 53-60.
4

-------

-------

-------
OPERATIONAL QUALITY ASSURANCE - COMPLETING THE END STATE
David Bottrell, chemist and
Kelvin Kelkenberg, Director
Office of Transportation and
Emergency Management
Timothy Harms, Program Manager
Office of Waste Management
U.S. Department of Energy (DOE)
19901 Germantown Road
Germantown, MD 20874-1290
SUMMARY
Environmental program decisions developed and implemented through planning processes like
the Data Quality Objectives (DQO) process define unacceptable contaminant levels, site status
after cleanup, and verification/monitoring procedures. However, these decisions and their time
lines often lack consideration for other regulators, such as the Department of Transportation
(DOT). Frequently missing are the realities of packaging, shipping, and waste acceptance. A
decision capable of implementation requires early integration of these and other considerations
into the planning process.
PROJECT DESCRIPTION
The Project Description, as developed through various planning processes, can focus data
collection on a well-defined problem. However, details necessary for implementation of
solutions are not necessarily part of initial considerations. DOE, working with the EPA,
developed a process called the Streamlined Approach for Environmental Restoration (SAFER) to
integrate intermediate remediation decisions earlier in the decision process, thus saving time,
reducing duplication of effort, and using resources efficiently. Other activities, such as
expedited site characterization, have focused data collection to meet immediate needs to direct
actions. These approaches can adequately address project implementation for most
environmental programs. They work because a quantitative decision rule exists. Project
definition establishes acceptable confidence and assesses monitoring options. Flexible
documentation and moving the decision closer to the field are actions that have been completed
safely and efficiently.
However, environmental program activities with higher risk and visibility have more obstacles to
implementation. One potential obstacle is satisfying multiple regulators, as described by the
EPA (40CFR, 1999). For example, DOE is responsible to "make hazardous waste go away," or,
in the case of onsite storage, at least minimize the exposure. When this goal or end point is
defined and accepted, programs will be designed to consider how to achieve the end state.
Critical components to reach the end state often are not adequately emphasized to ensure
availability consistent with end state time commitments. To address this kind of environmental
project, early involvement of expertise outside the scope generally associated with DQO-like
1

-------
planning is essential. The traditional process lists specific expertise necessary to formulate the
problem and to design data collection appropriate to reach a decision within defined acceptable
limits. However, additional, more specific expertise is necessary to ensure an acceptable
approach can be implemented immediately. An example of specific expertise is knowledge in
transportation and packaging regulations and options.
Some relatively complex decisions necessary for project implementation include onsite vs
offsite storage, existence of acceptable packaging, mode and route of transport, and acceptance
of waste for treatment/storage/disposal. These decisions are largely outside the scope of
EPA/state environmental regulations, falling primarily under jurisdictions listed in Table 1. The
degree of jurisdiction and how it is shared is dependent on waste classification. Waste
constituents dictate packaging requirements, transportation requirements, material disposal
options, and regulatory responsibility. The EPA and state governments control the RCRA
constituents but not radioactive material.
Table 1. Regulatory Authority for Path to End State (radioactive material)

Activity            Department of        Nuclear Regulatory    Department of Energy
                    Transportation       Commission
Packaging           X                    X                     X
Transportation      X                    X                     X
Waste Acceptance                         X                     X
This summary is incomplete, since it does not include the authority of states and tribes to affect
transportation routes and emergency capabilities along potential routes. The summary indicates
that commitments are very much in hands other than DOE or EPA. Additional stakeholders
should often be considered during early planning phases.
Definitions affecting waste classification (49 CFR part 171.8)
Radioactive material (YES, there is a de minimis) - Has a specific activity greater than 70 becquerels per gram (0.002 microcuries per gram)
Hazardous Waste - Anything subject to EPA manifesting requirements (40 CFR part 262)
Hazardous substance - Listed in tables, quantity > reportable amount (weight per package)
Hazardous material - Mostly hazardous waste and hazardous substance
Transuranic waste - Atomic number >92; half-life > 20 years; activity > 10 nanocuries / gram -
definitions are not exactly standardized
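A minimal sketch of how the thresholds listed above might be applied is shown below. The function names are illustrative only; the numeric cut points are taken directly from the definitions in the text and are not a substitute for the regulations.

```python
# Minimal sketch applying the definitions listed above; thresholds come from
# the text (70 Bq/g de minimis; atomic number > 92, half-life > 20 yr,
# activity > 10 nCi/g). Function names are illustrative, not regulatory terms.

DE_MINIMIS_BQ_PER_G = 70.0  # DOT "radioactive material" threshold (49 CFR 171.8)

def is_dot_radioactive(specific_activity_bq_per_g: float) -> bool:
    """Material is DOT 'radioactive material' above the de minimis activity."""
    return specific_activity_bq_per_g > DE_MINIMIS_BQ_PER_G

def is_transuranic(atomic_number: int, half_life_years: float,
                   activity_nci_per_g: float) -> bool:
    """Transuranic waste per the (not fully standardized) criteria above."""
    return (atomic_number > 92 and half_life_years > 20
            and activity_nci_per_g > 10)

print(is_dot_radioactive(50.0))        # False - below the de minimis activity
print(is_transuranic(94, 24_100, 85))  # True  - e.g., Pu-239-bearing waste
```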
2

-------
Three basic considerations are:
•	EPA regulations are based on concentration; DOT regulations are based on contribution
to package weight.
•	EPA regulations referenced (40 CFR Parts 260-265) mean that the regulation directly
applies to RCRA. Constituents regulated by other programs, such as PCBs and asbestos
under TSCA, aren't directly considered.
•	A high percentage of radioactive waste is mixed.
Based on these and other terms and considerations, significant potential for confusion exists.
Examples include the following:
1)	Solid waste vs. free-flowing liquid
DOT considers these terms mutually exclusive. Packaging, transport, and disposal assume the absence of free liquid in a solid material. This distinction is a basic decision element relating to physical protection of the driver and to factors that increase exposure.
2)	Ignitable - D001 (liquid) flash point below 60°C (140°F) for EPA, but less than or equal to 60.5°C for DOT
DOT and EPA have different primary drivers resulting in significantly different concerns. DOT
needs details because the primary reason for regulation is protection of the transporter and, to a
limited extent, to facilitate emergency response. For this reason, information to meet DOT
regulatory requirements is similar to environmental data but often not identical. In addition to
basic RCRA distinctions, DOT considers parameters and cut points listed in Table 2.
Table 2. Class 3 - Assignment of Packing Groups (from 49 CFR 173.121)

Packing Group*      Flash Point (Closed Cup)      Initial Boiling Point
I                   --                            ≤ 35°C
II                  < 23°C                        > 35°C
III                 ≥ 23°C, ≤ 60°C                > 35°C

* Packing Groups relate to how dangerous the material is. Packing Group I is a robust package for very hazardous materials. Packing Group III is for the least dangerous materials.
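The logic of Table 2 can be expressed as a simple decision rule. The sketch below is an illustration only, assuming the cut points as reconstructed above; it is not a substitute for 49 CFR 173.121.

```python
# Minimal sketch of the Table 2 logic for assigning a Class 3 packing group
# from flash point and initial boiling point. Cut points follow the table
# above and are assumptions to the extent they differ from the regulation.

def class3_packing_group(flash_point_c: float, initial_bp_c: float) -> str:
    """Return the Class 3 packing group, or 'not Class 3' if no row applies."""
    if initial_bp_c <= 35.0:
        return "I"              # low boiling point governs, regardless of flash point
    if flash_point_c < 23.0:
        return "II"
    if 23.0 <= flash_point_c <= 60.0:
        return "III"
    return "not Class 3"        # flash point above the Class 3 cut point

print(class3_packing_group(flash_point_c=-18.0, initial_bp_c=25.0))   # I
print(class3_packing_group(flash_point_c=10.0, initial_bp_c=80.0))    # II
print(class3_packing_group(flash_point_c=45.0, initial_bp_c=120.0))   # III
```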
3

-------
A timely example of confusion is the concept of corrosiveness as measured by EPA and DOT.
EPA Corrosive - D002 is pH ≤ 2 or ≥ 12.5, or corrodes steel > 6.35 mm/yr at 55°C
Table 3. DOT Corrosive - Class 8

                                     Group I       Group II              Group III
Exposure Time                        ≤ 3 min       > 3 min, ≤ 60 min     > 60 min, ≤ 4 hr
Observation Time                     ≤ 60 min      ≤ 14 days             ≤ 14 days
Corrosion Rate on steel/aluminum     --            --                    > 6.25 mm/yr at 55°C
Packing Group I - Causes full-thickness destruction of intact skin tissue within an observation period of up to 60 minutes, starting after an exposure of 3 minutes or less
Packing Group II - Causes full-thickness destruction within 14 days of an exposure of more than 3 minutes but not more than 60 minutes
Packing Group III - The group into which the general range of environmental samples and most environmental-level waste would fall; its steel corrosion rate is the only criterion at all comparable to the EPA criteria
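The sketch below contrasts the two corrosivity tests just described; the functions and sample values are illustrative only, and the DOT check covers only the metal corrosion criterion, not the skin-destruction criteria.

```python
# Minimal sketch comparing the EPA D002 corrosivity criteria to the DOT
# steel corrosion criterion quoted above; note the slightly different cut
# points (> 6.35 mm/yr for EPA vs > 6.25 mm/yr for DOT). Values are illustrative.

def is_epa_d002(ph: float, steel_corrosion_mm_per_yr: float) -> bool:
    """EPA corrosive waste: pH <= 2, pH >= 12.5, or steel corrosion > 6.35 mm/yr at 55 C."""
    return ph <= 2.0 or ph >= 12.5 or steel_corrosion_mm_per_yr > 6.35

def is_dot_class8_by_corrosion(steel_corrosion_mm_per_yr: float) -> bool:
    """DOT Class 8 (Packing Group III) based only on the metal corrosion rate."""
    return steel_corrosion_mm_per_yr > 6.25

sample = {"ph": 12.8, "steel_corrosion_mm_per_yr": 6.30}
print(is_epa_d002(**sample))                                            # True (pH >= 12.5)
print(is_dot_class8_by_corrosion(sample["steel_corrosion_mm_per_yr"]))  # True (> 6.25 mm/yr)
```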
These discrepancies are both good and bad. On the good side, EPA concerns about sending
corrosive samples are probably not a problem. The matrix and contaminants would almost always fall below these limits for corrosion and for limited quantities.
On the bad side, recent interlaboratory comparisons (Vitale et al. 1998) indicate that
classification for RCRA determination of flash point may be a relatively low confidence
measurement. Extrapolating this to DOT requirements would likely make the situation worse.
This is problematic because this hazard is likely the biggest single DOT regulatory concern.
In transportation, a third significant figure in concentration is not as important as "can it blow
up" or "how long will it burn." These basic questions or problems are much different from the
traditional DQO statements for remediation of average concentrations, maximum exposure, or
specific contaminant. DOT regulations can work with limited classification data by assuming
the worst case. In many cases, this assumption provides an approach at a lower total program
cost than would come with complete constituent identification (very expensive characterization).
Although these examples involve relatively simple packaging and transportation issues, even at
this level, potential for confusion exists. Potential difficulties can be eliminated early in the
decision process with no or minimal additional data collection. Parallel situations exist for
transportation and waste acceptance; starting over to collect data or continuing a project on the
assumption that specific activities are not important is a poor approach.
4

-------
For example, at Rocky Flats (see reference 5), Restoration Programs and Safeguards and
Security are evaluating additional options for onsite storage because neither packaging,
transportation, nor an appropriate disposal site is available for various waste materials, including
111,000 cubic meters of rubble.
These oversights are major contributors to missing programmatic milestones; a commitment to
move material must be based on thorough planning and realistic considerations, and not just on
regulatory/legal requirements. Make sure a mechanism is in place to package and ship before
making long-term, legal commitments on waste transfer. Without responsible planning, the
commitment may not be feasible, and, in turn, may not happen!
The concept of expedited response and moving the decision to the field has considerable merit in
making programs more pragmatic. However, the flexible work plans should consider specific
packaging and transportation cut points to ensure that transportation requirements are met. For
example, package limits and external dose requirements still apply. Expediting an
environmental program action while introducing a DOT violation may not be a good trade off.
Opportunities to improve communication and program functionality include:
•	Incorporate additional regulatory expertise in the planning of complex cleanup and waste management projects, e.g., transuranic / mixed waste;
•	Recognize inconsistent use of the same term, e.g., solid waste (recognition matters because each program's usage is unique rather than wrong, and standardization is probably impossible); and
•	Work cooperatively to meet human and environmental health needs across project life
cycle, e.g., generation, storage, transportation, disposal, etc.
SUMMARY
Neither characterization, packaging, nor transportation is THE "environmental project end
point." However, all three elements are critical and ultimately relative to successful project
implementation. A guiding premise should be:
DON'T DIG IT UP UNTIL --
• You can get it there (packaging and transport); and
	•	They will accept it (waste acceptance criteria).	
On occasion there may be additional, urgent health and safety concerns; but these can be
responsibly managed with proper consideration and preparation.
Project Perspective:
• Waste is defined by a regulatory process (RCRA, CERCLA).
5

-------
•	Waste is packaged and shipped to be compliant with DOT/DOE/EPA (waste designation).
•	Waste is accepted for treatment/storage/disposal to be compliant with waste acceptance criteria or commercial contracts (NRC/DOE/EPA/states).
Characterization Needs
•	Qualitative identification recognizes that specific concentrations are important at defined
decision points.
•	Transportation requirements are more often based on mass than concentration.
•	Mass and physical / chemical state dictate packaging requirements, amount of material
per package, and the amount of material per shipment.
Examples of how these factors affect implementation:
•	There is essentially no way to send large shipments of radioactive waste with moisture >5
percent except with small volumes of limited quantities, such as test samples.
•	For plutonium, ~40 grams can be put into a single package. A 250-kg shipment of mixed
contaminated material can result in more than 6,000 expensive little packages instead of
a partially filled 55-gallon drum. Don't make this kind of commitment too quickly, since
the packages may not exist.
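The arithmetic behind the package count above, using the approximate figures from the text (a sketch, not project data):

```python
# Worked arithmetic for the plutonium packaging example above; the ~40 g
# per-package figure is the paper's approximation, not a regulatory limit.

shipment_kg = 250          # mixed contaminated material
per_package_g = 40         # approximate plutonium mass limit per package

packages = (shipment_kg * 1000) / per_package_g
print(f"{packages:.0f} packages")   # 6250 packages, consistent with "more than 6,000"
```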
CONCLUSION
For most CERCLA decisions, acceptable packaging exists. Transportation options are available
and waste acceptance generally is not a problem. However, for mixed waste, the problems can
become more complicated.
For projects involving mixed waste transfer, quality planning requires early consideration of
packaging availability, transport options, and acceptability of waste. Expertise to help identify
alternatives should be directly involved with the initial project scoping activities. These
considerations are basic to determining the lowest total project cost end state and to ensuring the
established goal can be implemented in an acceptable time frame.
6

-------
REFERENCE LIST
1.	Code of Federal Regulations, 40 CFR Chapter 1. Approach to Reinventing Regulations
of Storing Mixed Low-Level Radioactive Waste; Proposed Rule, March 1, 1999.
2.	Code of Federal Regulations, 49 CFR (Transportation) PARTS 100-185, Revised
October, 1997.
3.	Code of Federal Regulations, 40 CFR (Protection of Environment) PARTS 260-265,
Revised July, 1998.
4.	Vitale, R.J., L.J. Dupes, and D.J. Caillout. 1998. "Are Your Waste Streams Being
Correctly Characterized?" Environmental Testing & Analysis, Sept/Oct 18-21 & 30-31.
5.	The Advisor, Publication of Rocky Flats Citizens Advisory Board, Spring 1999.
7

-------
THE SIGNIFICANCE OF ANALYTICAL ERROR IN PROJECT PLANNING,
IMPLEMENTATION AND ASSESSMENT
Cliff J. Kirchmer, Ph.D., Quality Assurance Officer
Stewart Lombard, Ph.D., Quality Assurance Specialist
Department of Ecology
PO Box 47600
Olympia, WA 98504-7600
SUMMARY
Considerations of error and its control are important in project planning, implementation and
assessment. Analytical results are affected by errors in both sampling and analysis, which need
to be taken into account in the data quality objectives (DQO) and data quality assessment (DQA)
processes. Recommendations are made for reducing analytical bias before decisions are made
based on the data.
INTRODUCTION
When projects involve acquiring environmental data, the U.S. Environmental Protection Agency
(EPA) has described a systematic process involving three phases: planning, implementation and
assessment (U.S. EPA 1998). The Water Research Centre (WRC) in England has described a
systematic approach to analytical quality control (Cheeseman and Wilson, 1989). Both are
concerned with controlling errors originating during sampling and analysis. This paper examines
the significance of analytical error in relation to total error, and how the WRC approach to
controlling analytical error can be incorporated into the project level of EPA's Quality System.
NATURE OF ERRORS IN ANALYTICAL RESULTS
The term analytical result denotes a numerical estimate of the concentration of a determinand
(i.e., analyte) in a sample, and is obtained by carrying out the procedure specified in an
analytical method once. Note that a method may specify analysis of more than one portion of a
sample in order to produce one analytical result (Hunt and Wilson 1986).
The error, E, of a result, R, is defined as E = R - T, where T is the true value. Accuracy is
defined as the total error of a result; that is, accuracy represents the combined random and
systematic errors of a result and is said to improve as the total error decreases. Note that, in this
definition of error, there is no implication that anyone has made a mistake, though of course
mistakes may cause error (Hunt and Wilson 1986).
It is common experience that, when a stable homogeneous sample is analyzed repeatedly, the
results of those analyses are not identical. The results obtained generally differ among
themselves and are more or less scattered about some central value. The scatter obtained is
attributed to random errors, which are so named because the sign and magnitude of the error
differs from one result to another. There are many possible sources of random errors in analyses.

-------
Example sources of random errors in analyses include slight variations in the volumes of
reagents added to samples, variations in the times allowed for chemical reactions, variable
contamination effects, fluctuations in instrument response and temperature variations. Random
error in sampling can occur when there are variations in the concentrations because of the way
the samples were taken or because of the way the samples were preserved, transported, and
stored prior to analysis.
The errors due to sampling and analysis are therefore the two major contributions to the total
error of a result. Of course there is also variability of the analyte in the environment, but the
error in estimating this variability, corresponding to the sampling design error as defined by
EPA, cannot easily be quantified.
Systematic error is present when there is a consistent tendency for results to be either greater or
smaller than the true value. In statistical terminology, the mean of n analytical results from the
same sample approaches a definite value, μ, as n increases indefinitely. When μ differs from the
true value, T, results are said to be subject to systematic error of magnitude B, where B = μ - T.
In this paper, the term "bias", B, is used synonymously with systematic error. The term accuracy
is sometimes used to denote bias, but this ignores the fact that random error also affects the
accuracy of a result (Hunt and Wilson 1986). Many methods refer to "precision and accuracy"
but, since accuracy includes precision, this expression should be discouraged. Note that any test
or experimental design used to estimate bias will include a component of random error in the
estimate.
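As a concrete illustration of these definitions, bias and random error can be estimated from repeated analyses of a stable sample of known concentration; the short Python sketch below is illustrative only (the data values are hypothetical, not taken from this paper).

# Illustrative estimate of bias and random error from repeat analyses of a
# stable reference sample with known (true) concentration T. Hypothetical data.
import statistics

T = 10.0                                   # true concentration, e.g., ug/L
results = [9.1, 9.6, 8.8, 9.4, 9.0, 9.3]   # repeated analytical results, R

mean_R = statistics.mean(results)
bias = mean_R - T                          # estimate of systematic error, B = mu - T
s = statistics.stdev(results)              # estimate of random error (standard deviation)
print(f"mean = {mean_R:.2f}, bias B = {bias:.2f}, standard deviation s = {s:.2f}")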
It has been stated that there are six possible sources of bias or systematic error in water analyses
(Hunt and Wilson, 1986):
•	Unrepresentative sampling
•	Instability of samples between sampling and analysis
•	Interference effects
•	A biased calibration
•	A biased blank correction
•	An inability to determine all forms of the determinand (i.e., analyte)
A seventh, mistakes by the analyst, could be added to this list, and cannot be overlooked, but
should be controlled by proper organization, training and procedures.
The first two sources in the above list correspond to bias originating prior to analysis (i.e. during
sampling or between sampling and analysis), while the last four correspond to bias occurring
during analysis. Representative samples imply that there is a good sampling design as well as
good execution of that design during measurement. While recognizing the importance of taking
representative samples and assuring that the analytes in these samples remain stable until they
are analyzed, the emphasis in this paper is on the analytical error.
2

-------
EPA APPROACH TO ESTIMATING AND CONTROLLING ERRORS
EPA has described a Data Quality Objectives (DQO) Process which consists of seven iterative
steps, starting with a clear statement of the problem and ending in an optimum design to obtain
data. Part of the DQO Process involves the development of a decision rule and the specification
of limits on decision errors.
EPA has defined the total study error as being equal to the combination of sampling design error
and measurement error (U.S. EPA, 1994). Sampling design error is said to occur when the
sampling design is unable to capture the complete extent of natural variability that exists in the
true state of the environment. Measurement error refers to a combination of random and
systematic errors that inevitably arise during the various steps of the measurement process (for
example, sample collection, sample handling, sample preparation, sample analysis, data
reduction, and data handling). Measurement error is therefore synonymous with the total error of
a result as defined by the WRC, since both include sampling and analysis as causes of error.
Since we cannot know the true state of the environment, it is impossible to quantify the sampling
design error, which occurs either because of poor judgement in making the design or because of
limitations in resources. In the latter sense, it is more of a choice than an error. Measurement
error occurs as a result of samples being taken and analyzed and therefore the results of these
analyses, along with the results of analyses of quality control samples, can be evaluated to
estimate its magnitude.
According to EPA, since it is impossible to eliminate error in measurement data, basing
decisions on measurement data will lead to the possibility of making a decision error. And, since
there is uncertainty in measurement data, there must also be uncertainty in the decisions based on
that data. The quantitative statement in the Data Quality Objectives (DQO) process includes
developing a decision rule as an "if...then..." statement that incorporates the parameter of
interest, the scale of decision making, the action level, and the action(s) that would result from
resolution of the decision (U.S. EPA 1994). The decision rule is then used as the basis for
specifying statistical limits on decision errors. EPA recommends guarding against making the
decision error that has the greatest undesirable consequence by setting the null hypothesis equal
to the condition that, if true, has the greatest consequence of decision error. Thus, the site
manager may decide that a site is contaminated (null hypothesis) in the absence of strong
evidence (study data) that indicates that the site is not contaminated (alternative hypothesis).
It is noteworthy that only random measurement error is directly considered in deciding whether
the action level has been exceeded. According to EPA, the analytical error determination
measures the precision of the analytical method (U.S.EPA 1993). Thus, the data used in judging
whether or not decision errors are present are assumed to be unbiased and evaluated statistically,
assuming a normal distribution. This means that bias in sampling and analysis must be
eliminated or reduced significantly before the mean concentration of the results can be
considered to be a good estimate of the true value and evaluated using a Design Performance
Goal Diagram (sometimes called a Decision Performance Curve).
3

-------
WRC APPROACH TO ESTIMATING AND CONTROLLING ERRORS
The WRC has described a recommended approach to analytical quality control (Cheeseman and
Wilson 1989). Like EPA's DQO Process, it involves a stepwise iterative process, but the
emphasis is on estimating and controlling analytical error affecting results, while recognizing
that sampling can be an important source of error in results. While the WRC approach is
designed for water samples, many of the principles included in the approach can also be applied
to other matrices.
There are some parallels in the EPA and WRC approaches. EPA's DQO Process starts with a
statement of the problem and includes the development of a quantitative decision rule and
quantitative limits on decision errors, while the WRC approach starts with a definition of
analytical objectives which include quantitative targets for accuracy (i.e. precision and bias).
EPA's DQA Process includes statistically based judgements regarding whether the data quality
objectives have been met, while the WRC approach includes statistically based judgements
regarding whether targets for accuracy have been met. In the WRC approach, most of the work
is done prior to collection of samples for routine analyses, and control charts are used to verify
that targets for precision and bias are met on a continuing basis (Kirchmer 1983). The EPA and
WRC approaches complement one another, since in the EPA approach there is no direct
examination of the existence or causes of analytical error and in the WRC approach analytical
error is examined in detail to assure that the targets for accuracy are met. Meeting targets for
accuracy helps meet the data quality objectives, as expressed through the decision rule and the
limits on decision errors.
Targets for accuracy are set in the following manner. In percentage terms, the total error of
individual analytical results should not exceed 2p% of the result. If the total error is partitioned
equally between systematic and random error, it follows that the systematic and random errors
should each not exceed p% of the result.
The magnitude of random error can be defined only with respect to a chosen probability level,
and the 0.05 probability level (equivalent to 95% confidence limits) is considered reasonable for
most purposes. Since this probability level corresponds to approximately two times the standard
deviation, it follows that the standard deviation should not exceed 0.5p% of the result.
At very low concentrations, the relative error increases and it is not possible to meet the
percentage targets. In this case, the total error of results should not exceed the lowest
concentration of interest, CL.
In summary, therefore, the systematic error should not exceed 0.5CL or p% of the result,
whichever is greater; and the total standard deviation should not exceed 0.25CL or 0.5p% of
the result, whichever is greater. Table 1 (Cheeseman and Wilson, 1989) shows the effect of
these recommendations, for a range of values of the concentration of primary interest (e.g. water
quality standards).
4

-------
Table 1. Calculation of Maximum Tolerable Errors in Analytical Results

Maximum allowable   Smallest concentration     Maximum tolerable error
concentration*      of interest, CL*           Standard deviation, σt†   Systematic error†   Total error†
100                 10.0                       2.5 or 0.05C              5.0 or 0.1C         10.0 or 0.2C
50                  5.0                        1.25 or 0.05C             2.5 or 0.1C         5.0 or 0.2C
10                  1.0                        0.25 or 0.05C             0.5 or 0.1C         1.0 or 0.2C
5.0                 0.5                        0.125 or 0.05C            0.25 or 0.1C        0.5 or 0.2C
1.0                 0.1                        0.025 or 0.05C            0.05 or 0.1C        0.1 or 0.2C

* All expressed in the same concentration units: e.g., if the maximum allowable concentration is
expressed in units of μg/L, the smallest concentration of interest and the maximum tolerable
errors are also expressed in units of μg/L.
† In the quoted numerical values, C denotes the concentration of analyte in the sample. Of the
two values tabulated for each error, that one applies which has the larger value for a given
analyte concentration.
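These tabulated limits follow directly from the rules above; the short Python sketch below reproduces the calculation, assuming the p = 10% target that Table 1 implies (the function is illustrative, not part of the WRC manual).

# Maximum tolerable errors for a result of concentration C, given the smallest
# concentration of interest CL and the percentage target p (Table 1 implies p = 10).
def max_tolerable_errors(C, CL, p=10.0):
    std_dev = max(0.25 * CL, 0.005 * p * C)    # total standard deviation: 0.25*CL or 0.5p% of C
    sys_error = max(0.5 * CL, 0.01 * p * C)    # systematic error: 0.5*CL or p% of C
    tot_error = max(CL, 0.02 * p * C)          # total error: CL or 2p% of C
    return std_dev, sys_error, tot_error

# Example: CL = 10, result near the maximum allowable concentration of 100
print(max_tolerable_errors(C=100.0, CL=10.0))  # (5.0, 10.0, 20.0)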
To determine whether the targets for precision and bias have been met, an experimental design is
used to estimate within-laboratory precision and some sources of bias. The experimental design
includes 5-10 batches, each batch including the duplicate analyses of blanks, standards, samples
and spiked samples. The following information is obtained from the data: estimates of standard
deviations for analyses of blanks, standards and samples, and an estimate of bias due to
interference in samples. Duplicate analyses are done in order to estimate both within-batch and
between-batch components of standard deviation. The within-batch standard deviation of the
blank can also be used to estimate the limit of detection. Calibration data can also be evaluated
to determine if there is a biased calibration.
If resources prevent the implementation of an experimental design prior to sample collection and
analysis, estimates of within-laboratory precision and some sources of bias can be obtained by
including duplicate analyses of blanks, standards, samples, and spiked samples along with
routine sample analyses. However, in this case, if corrective actions are needed, it will not be
possible to make them prior to routine sample analysis.
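One common way to separate the within-batch and between-batch components of standard deviation from such duplicate analyses is sketched below in Python; the data are hypothetical and the calculation is a standard range-based variance analysis, not code taken from the WRC manual.

# Within-batch and between-batch standard deviation from duplicate analyses of the
# same standard (or sample) in m batches. All data are hypothetical.
import math
import statistics

batches = [(9.8, 10.1), (10.4, 10.2), (9.6, 9.9), (10.3, 10.5), (9.9, 10.0)]  # duplicate pairs
m = len(batches)

# Within-batch standard deviation from the duplicate differences: sw = sqrt(sum(D^2) / (2m))
sw = math.sqrt(sum((a - b) ** 2 for a, b in batches) / (2 * m))

# The variance of the batch means equals sb^2 + sw^2/2, so subtract the within-batch part
batch_means = [(a + b) / 2 for a, b in batches]
sb2 = max(statistics.variance(batch_means) - sw ** 2 / 2, 0.0)

total_sd = math.sqrt(sw ** 2 + sb2)   # combined within- plus between-batch standard deviation
print(f"within-batch = {sw:.3f}, between-batch = {math.sqrt(sb2):.3f}, total = {total_sd:.3f}")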
PERFORMANCE CHARACTERISTICS OF ANALYTICAL METHODS
An important step in the WRC recommended approach to analytical quality control is to choose
analytical methods with satisfactorily small sources of bias and capable of adequate precision.
Some performance characteristics for analytical methods, that are useful when choosing a
method, are: the substance determined, the type of sample, and the basis of the method; the
range of application; the calibration curve; the total standard deviation; the criterion of detection
(based on the within-batch standard deviation of the blank); the sensitivity; bias; interferences;
and the time required for analysis. With respect to the quantitative targets for accuracy,
performance characteristics of standard deviation, criterion of detection, bias, calibration, and
interferences are most important.
Calibration is seldom recognized as a common source of bias in analytical results. A principle of
analytical quality control is that standards and samples should be analyzed the same way, and
that failure to do so can result in calibration bias (Wilson 1974). Calibration is a very common
source of bias for labs analyzing trace organics (Kirchmer and Schupp 1986). For example, in
the analyses of water samples for organics, standards are usually prepared in pure solvent
whereas samples go through an extraction step and sometimes also a cleanup step. The analyses
of aqueous standard solutions containing the target analytes often show that the "recovery" of
the target analytes is considerably less than 100%. Low "recovery" for analyses of standard
solutions means that the bias can only be due to calibration. In other words, if the calibration
standards had been taken through the same process as the samples, the recovery would be close
to 100% and the bias due to calibration would be low. This assumes that sufficient standards
have been analyzed so that the random error in calibration is also low, since random error in
calibration becomes a systematic error when the calibration curve or factor is used to quantify
sample results.
Table 2 summarizes some results for 8 interlaboratory studies done by EPA (U.S. Environmental
Protection Agency, 1984). The results of these studies show that poor recoveries were obtained
for the analyses of organics in pure (distilled) water. It is worth noting that recoveries for some
specific compounds of interest were significantly less than the mean recoveries given in Table 2.
Thus, for example, the recovery for phenol was only 43% and that for N-nitrosodimethylamine
was only 37%. The studies also included recoveries for compounds spiked in tap water, surface
water, and wastewater, but only the results for pure water are included in Table 2 since they
would not be affected by interference or matrix effects. Any bias would have to be due to the
differences in the procedures for calibration and analysis, namely that samples go through
extraction and cleanup steps while calibration standards do not. Bias due to calibration was
found to be significant for these methods. Interestingly, there was little evidence in these studies
for any significant bias due to interference, which is commonly believed to be a more significant
source of bias.
Table 2. Results for Interlaboratory Studies of some Organics Methods (U.S. EPA 1984)

EPA Method                        No. Compounds   Mean Recovery*   Calibration Bias
604 - Phenols                     11              67.6%            -32.4%
606 - Phthalates                  6               65.6%            -34.4%
607 - Nitrosamines                3               65.9%            -34.1%
608 - Organochlorine Pesticides   24              86%              -14%
610 - PNAs                        16              61.5%            -38.5%
611 - Haloethers                  5               82.2%            -17.8%
612 - Chlorinated Hydrocarbons    8               76.3%            -23.7%
625 - BNAs                        65              74%              -26%

* Percent recoveries are for 100 μg/L of compounds in pure (distilled) water
6

-------
In some cases, however, interference can also be an important source of bias, especially for
complex samples such as hazardous wastes. Spiked samples are often analyzed to determine if
interference or matrix effects are present. But the power of the spike recovery test to detect
interference is low. It is much more effective to estimate bias due to interference by
implementation of an experimental design than by routine spiking of samples. Experimental
designs can include an examination of the interference effects of specific substances (Hunt and
Wilson 1986).
Other sources of analytical bias may be significant for some projects. For example, if the
concentration of interest, CL, is near the limit of detection, biased blank determinations or failure
to blank correct may result in bias. This source of bias may be critical, for example, in trace
metal analyses. Finally, one must be certain that the chosen method is able to determine the
forms of the analyte that are of interest. For example, one may need to determine only dissolved
orthophosphate when other phosphates (e.g., undissolved orthophosphates, condensed inorganic
phosphates) are also present (Wilson 1974).
On the other hand, method performance can be characterized by low bias and high random error.
An example of this would be EPA Method 624, for the analysis of purgeable organic compounds
in water by gas chromatography-mass spectrometry. A method validation study for EPA Method
624 found that overall recoveries for the volatile organic compounds were very good, with an
average recovery of 100%, indicating no calibration bias. This is to be expected, since
calibration standards and samples are analyzed the same way. But the median percent relative
standard deviation at 100 μg/L was 24%, indicating a large degree of imprecision. In this case,
accuracy for individual analytical results could be improved by basing results on the average of
several replicate analyses, since the standard deviation of the mean is equal to s/√n, where s is
the standard deviation for the individual analytical results and n is the number of results used to
calculate the mean. This illustrates that random error, unlike systematic error, can always be
reduced by replicate analyses of samples, and this should be taken into account in reducing the
analytical measurement errors.
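For example, with a relative standard deviation of 24% for single results, basing each reported value on the mean of four replicates would cut the random error roughly in half; a minimal Python sketch of the s/√n relationship (the replicate counts are illustrative):

# Reduction of random error by averaging n replicate analyses: s_mean = s / sqrt(n)
import math

s_single = 24.0   # percent relative standard deviation of a single result (EPA Method 624 example)
for n in (1, 2, 4, 9):
    print(f"n = {n}: RSD of the mean = {s_single / math.sqrt(n):.1f}%")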
ERROR DUE TO SAMPLING
It is difficult to quantify error due to sampling, although there are ways to optimize the frequency
and time of sampling and to reduce bias and random error arising from the way samples are
taken, preserved, transported and stored prior to analysis.
Replicate samples can be taken both to reduce the random error in sampling and to obtain
estimates of the magnitude of that random error. For example, duplicate samples can be taken in
the field to estimate the contribution of sampling to the total standard deviation of a result. If
one of those samples is split in the laboratory, estimates of total and analytical standard deviation
can be obtained. This is usually applied to grab samples, but could also be applied to composites
if two composite samples are taken simultaneously. It can be thought of as the variability of a
sample representing a particular combination of space and time. The equation for additivity of
variances is used to determine the contributions of the standard deviations due to sampling, ss,
and analysis, sa, to the total standard deviation, st. That is, st² = sa² + ss². So if duplicate samples,
sometimes called collocated samples, are taken, the total standard deviation can be estimated
from the equation st = D/√2. This is a poor estimate of standard deviation, however, since there is
only one degree of freedom in the estimate. In order to get a better estimate, several duplicate
samples can be taken from a location and a pooled estimate of standard deviation obtained using
the following equation: st = √(ΣD²/(2m)), where D is the difference between two sample results
and m is the number of pairs of sample results.
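A minimal Python sketch of these calculations, using hypothetical duplicate (collocated) field samples and laboratory splits, is shown below; the data values and variable names are illustrative only.

# Pooled standard deviations from duplicate (collocated) field samples and from
# laboratory splits, and the sampling component by difference of variances.
import math

def pooled_sd_from_pairs(pairs):
    # s = sqrt(sum(D^2) / (2m)), where D is the difference within each pair
    m = len(pairs)
    return math.sqrt(sum((a - b) ** 2 for a, b in pairs) / (2 * m))

field_duplicates = [(12.0, 14.5), (8.3, 7.6), (20.1, 17.9), (5.5, 6.2)]   # collocated samples
lab_splits       = [(12.2, 12.9), (8.0, 8.3), (19.5, 18.8), (5.8, 5.6)]   # samples split in the lab

s_total      = pooled_sd_from_pairs(field_duplicates)                   # total standard deviation, st
s_analytical = pooled_sd_from_pairs(lab_splits)                         # analytical standard deviation, sa
s_sampling   = math.sqrt(max(s_total ** 2 - s_analytical ** 2, 0.0))    # ss, from st^2 = sa^2 + ss^2
print(f"st = {s_total:.2f}, sa = {s_analytical:.2f}, ss = {s_sampling:.2f}")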
CORRECTION OF ANALYTICAL BIAS
Bias due to calibration has been identified as a significant source of bias for some methods of
analysis, particularly for organic compounds. This bias needs to be corrected before using
EPA's Decision Performance Curve to decide if an action level has been exceeded. The
preferred correction would be to choose another method or modify an existing method so that
calibration bias is eliminated. Some methods, such as EPA 504.1, do not exhibit calibration bias
because calibration standards and samples are analyzed by the same procedure, and should be
selected whenever possible. EPA 504.1 states that "Aqueous calibration standards are extracted
and analyzed in an identical manner as the samples in order to compensate for possible
extraction losses." Isotope dilution methods, such as EPA 1625 for BNAs, also do not exhibit
calibration bias, since the isotopically labeled internal standards are added to the samples and
carried through the extraction and cleanup procedure. If it is not possible or permissible to change to
a less biased method, results can be mathematically corrected, based on the recoveries found for
analyses done on analytes spiked into pure matrix (e.g. pure water).
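Where such a mathematical correction is applied, it amounts to dividing each reported result by the mean recovery fraction observed for spikes into pure matrix; a minimal, hypothetical Python sketch:

# Correcting reported results for calibration bias using the mean recovery observed
# for analytes spiked into pure matrix (recovery value taken from Table 2; the
# reported concentrations are hypothetical).
mean_recovery = 0.676          # 67.6% mean recovery, EPA Method 604 phenols
reported = [12.0, 45.3, 7.8]   # reported concentrations, ug/L

corrected = [r / mean_recovery for r in reported]
print([round(c, 1) for c in corrected])   # recovery-corrected estimates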
Bias due to interferences is best corrected by using a method less subject to interferences. For
organics analyses, isotope dilution mass spectrometry is less subject to interferences. For metals
analyses, the method of standard additions can often be used to reduce interferences, but it
should be remembered that interference effects that are independent of the concentration of the
target analyte are not corrected by this procedure.
A change in the procedure for determining the blank or in making blank corrections will be
necessary if either of these are a source of bias. Any changes should be documented in the
written procedure.
PERFORMANCE BASED MEASUREMENT SYSTEMS
EPA's proposal for performance based measurement systems defines acceptability of alternative
methods in relation to a reference method. When that reference method exhibits bias (calibration
bias, interference, or failure to blank correct), it should be permissible to define acceptability in
relation to the true value, and not just in comparison to a reference method.
NATIONAL ENVIRONMENTAL LABORATORY ACCREDITATION PROGRAM
In the National Environmental Laboratory Accreditation Program (NELAP), laboratories are audited to
determine if they are performing analyses correctly. As performance based methods are
increasingly used by laboratories, it will be the auditor's responsibility to verify that the proper
documentation is present to demonstrate that the alternative method is as good as or better than
the reference method. The auditor should be permitted to approve alternative methods if they
exhibit less bias than the reference methods.
CONCLUSION
At the project level of EPA's Quality System approach, steps are included to identify and control
errors that may keep one from being able to meet the DQOs and thus avoid decision errors. The
approach effectively deals with random errors of sampling and analysis, but systematic errors in
analysis are assumed to be negligible when in fact they may be significant. The WRC approach
to AQC complements EPA's approach through a series of iterative steps that include the
definition of analytical objectives, selection of methods, estimation of precision and bias, and
routine control of precision. Sources of analytical bias that may need to be controlled are bias
due to calibration, bias due to interference, biased blank corrections, and the use of inappropriate
methods. Random error can often be reduced by specifying the analysis of more than one
portion of a sample to produce one analytical result, and it is important that random error be
considered in the definition of DQOs and targets for accuracy. Finally, it is important to estimate
random errors for both sampling and analysis. Too much effort should not be expended to
reduce analytical error if the random error of sampling is much larger and cannot be reduced.
REFERENCE LIST
Cheeseman, R.V. and Wilson, A.L. A Manual on Analytical Quality Control for the Water
Industry. NS 30. Revised by M.J. Gardner June 1989.
Hunt, D.T.E and Wilson, A.L. 1986. The Chemical Analysis of Water, 2nd ed. The Royal
Society of Chemistry, London.
Kirchmer, C.J. 1983. Quality Control in Water Analyses. Environ. Sci. Technol., 17
(April):174A-181A.
Kirchmer, C.J. and Schupp, G.C. 1986. Quality Control for Organic Trace Analyses. Trends in
Analytical Chemistry. 5 (4) (April): 86-89.
U.S. Environmental Protection Agency 1984. Project Summaries for EPA Method Studies 604,
606, 607, 608, 610, 611, 612, 624, and 625. PB 600/S4-84-044, -056, -051, -061, -063, -052,
-039, -054, and -053. U.S. National Technical Information Service.
U.S. Environmental Protection Agency September 1993. Data Quality Objectives Process for
Superfund, EPA540-R-93-071, PB94-963203.
U.S. Environmental Protection Agency. September 1994. Guidance for the Data Quality
Objectives Process. EPA QA/G-4: 28.
U.S. Environmental Protection Agency. October 1998. EPA Requirements for Quality
Assurance Project Plans for Environmental Data Operations, EPA QA/R-5, External Review
Draft Final.
Wilson, A.L. 1974. Performance Characteristics of Analytical Methods-IV. Talanta 21:1109-
1121.
9

-------
Pitfalls in Performance Auditing
Volatile organic compounds (VOCs) and aldehydes have been implicated in sick building
syndrome in new office buildings. My organization, the Air Pollution Prevention and Control
Division (APPCD) of the National Risk Management Research Laboratory, participated with a
private sector "verification partner" in an Environmental Technology Verification project to test
emissions from low-emitting office furniture. The purpose of this project was to produce
methods to verify performance, based on manufacturers' claims that their furniture did not emit
significant levels of VOCs, including aldehydes, into the indoor office air
environment.
A standard protocol exists for measuring VOCs emitted from materials placed in large
environmental chambers. The procedure involves placing newly manufactured furniture into a
large environmental chamber, passing clean air over the furniture, and collecting any emitted
volatiles on sorbent cartridges, which are subsequently analyzed in a laboratory, following the
standard protocol. The QA Team in APPCD needed to assess the capabilities of three analytical
laboratories to analyze the compounds emitted by office furniture.
The plan to audit was straightforward: We would spike several sorbent tubes with two
levels of some compounds emitted by furniture and send these spiked tubes to the analytical
laboratories. At the same time, we would spike some tubes for analysis by our reference
laboratory, to confirm spiking levels and to determine the inherent variability in the method. We
would assess results from the participating laboratories by comparison with the reference
laboratory.
The illusion of simplicity was short-lived. Each laboratory uses a different sorbent, and
each laboratory uses tubes of different size, none of which were compatible with our reference
laboratory's equipment. None of the laboratories wished to change their system. In addition, the
laboratories wanted to begin with known compounds and unknown quantities, testing only
VOCs. There were scheduling problems and problems with the proprietary nature of data from
prior tests done on furniture emission, used to determine spiking levels. The solvent used in
preparing the spiking solutions was not compatible with one of the sorbent systems.
A sequential description of problems encountered in conducting this audit and solutions
reached is presented in the hope that others conducting performance assessments can glean
helpful suggestions.
Nancy Adams
U.S. EPA, APCD/NRMRL
Research Triangle Park, NC

-------
Pitfalls in Performance Auditing
Nancy Adams
ORD/NRMRL/APPCD/TSB
Presented at the Annual EPA QA Meeting, 1999
Background:
¦	Sick building syndrome
¦	ETV project on IAQ
~	Verification partner
~	Stakeholders
¦	"Green" office furniture manufacturers
¦	Verification of performance
¦	SOP for VOCs/aldehydes
¦	Audits to verify capabilities of labs

-------
Performance Audit Procedure
Issue: Proprietary data

-------
Issue: Audit materials
-	VOCs and aldehydes?
-	Unknowns v. knowns
Solution: Plan for multiple rounds of audits
Issue: Participants
Solution: Two labs withdraw, one added

-------
Status:
¦	Three labs
¦	Four known analytes
¦	Two unknown levels plus blanks
¦	23 tubes
~	7 high level - 3 reference tubes
~	7 low level - 3 reference tubes
~	7 blanks - 3 reference tubes
~	2 spares
More Issues:
¦	Labs can't spare 23 tubes
—	Use 11 tubes (3+3+3+2)
¦	Each lab uses a different sorbent
—	Assume similar and assess data
¦	Each lab uses different size tubes
~	Incompatible with reference lab
~	Participants can't change
~	Ref. lab can't purchase equipment
—	Ref. lab can't validate spikes
—	Ref. lab will validate own sorbent

-------
¦	Different calibration ranges
~	Adjust spiking levels
¦	Scheduling problems
~	Wait
¦	Concerns with proprietary procedures
~	Encourage sharing of knowledge
Status (with more issues):
¦	Spiked samples prepared
~	One lab's tubes inconsistently packed
- Use the best tubes; inform lab.
~	One lab's tubes were metal and heated too
rapidly
—Modified spiking procedure
¦	Shipment to labs, with instruction for
reporting results

-------
Status (with more issues):
(continued)
¦ Validation with reference lab's sorbent
tubes
~	Ref. lab forgot to run TVOC standard
~	Can't assess TVOC levels
More issues:
¦	No response from two labs after
expiration of time limit for analysis
~	Phoned
~	E-mailed
~	&*#%$!
¦	Verification partner scheduled conference
call
~ Labs submitted results at the last minute

-------
Still more issues:
¦ One lab's data unacceptable
~	Accusations!
~	Rebuttals!
~	Technical solution proposed (incompatible
solvents)
—	Remade spiked tubes
—	Reanalyzed, with delays due to instrument
problems
—	Acceptable results
Lessons Learned:
• Nothing is completely standardized
~	Look for possible areas of difference
~	Work with smart and experienced people
who can solve problems!
¦ Expect the unexpected, especially in
scheduling

-------
Lessons Learned: (continued)
¦	Maintain constructive relationships
~	No one likes to be audited; show respect.
~	Everyone wants to do well; show
appreciation.
¦	Remember that performance audits are
the only way to verify analyses
Acknowledgements:
¦	Dr. Roy Fortmann and Libby Beach,
Arcadis Geraghty & Miller
¦	Shirley Wasson, APPCD liaison for IAQ
studies
¦	Dr. Les Sparks, APPCD Project Officer
¦	Dr. Dave Ensor and Debbie Franke,
Research Triangle Institute

-------

-------
SUPERFUND QA OVERSIGHT- WHERE DO WE GO FROM HERE?
AUTHORS:
Joan Fisk, Chemist (5204G)
Duane Geuder, QA Manager (5202G)
US Environmental Protection Agency
Office of Emergency and Remedial Response
401 M Street SW
Washington, DC 20460

Marguerite Jones, Sr. QA Analyst
Conrad Kleveno, Sr. Environmental Scientist
DynCorp Information & Enterprise Technology, Inc.
300 N. Lee St.
Alexandria, VA 22314
Extended Abstract
Background:
Introduction: EPA's Superfund program has evolved, over its approximately 19 years of
existence, from a program of mainly site assessment and characterization - as its National
Priorities List of sites was continually growing - to one having a larger focus on design and
construction of remedial actions and actual cleanup. This is true both for "Fund-lead" projects
and, on a growing basis, for Potentially Responsible Parties (PRPs) and Federal Facilities. As the
Superfund "pipeline" status was changing, so was the Agency perspective on how to do business
- changing from a prescriptive to a performance-based approach where possible. This
combination of paradigm shifts stimulated the Superfund Headquarters QA personnel to set up a
series of visits to the 10 EPA Regions (and ERTC) to review current practices being carried out
for QA oversight of Superfund activities, regardless of lead.
Justification for these visits was strengthened by the March 1997 report by the Office of the
Inspector General. The ultimate goal of the Regional reviews is to develop relevant guidance on
the minimum elements needed for QA oversight for all Superfund environmental data collection
activities (planning, implementation, and assessment). The guidance will take advantage of the
information gained during the reviews to identify and share the best and most innovative practices
being followed and to identify the areas of weakness that will benefit from guidance. Both
global/program level (e.g., Regional QA staff and, in some Regions, Superfund QA staff) and
project level [e.g., Remedial Project Managers (RPMs), On-Scene Coordinators (OSCs), and Site
Assessment Managers (SAMs)] QA oversight was reviewed and will be covered. Some
perspective on Regional Superfund Program management was also gained. The Agency policy
defined in EPA Order 5360.1, change 1 and its accompanying 5360 Quality Manual will provide
a valuable framework for the upcoming guidance.
Approach: Separate questionnaires were used for interviewing Regional QA staff and
Superfund staff (the QA staff questionnaires are being used also for Superfund QA personnel)
because of the importance of assuring that there is an adequate Regional QA program in place as
the underpinning for individual project level QA oversight. The questions for both audiences
covered the gamut including systematic planning through determination of whether project goals
were met, roles and responsibilities throughout the process, and adequacy of experience/training

-------
of those performing program and project functions. In addition, the more fundamental project
level review looked at oversight of contractors (for both Fund and non-Fund lead) because of the
high dependence on the commercial sector for support. Many Quality Assurance Project Plans
(QAPPs) were collected and reviewed according to a very detailed checklist based on EPA's R-
5/G-5 QAPP documents.
General Findings:
It is important to reiterate that our reviews were not intended in any way to provide a "report
card" or focus on problems that would demand formal corrective action - such as an OIG
audit/report or a Management Systems Review (MSR) by QAD. Rather, they were intended to
gather information to learn about the good practices, share innovation, and provide a pointer to
the areas most needing improved guidance. It is also important to understand that the findings
that follow are not all encompassing for all Regions. Each Region has high points and
weaknesses scattered throughout. The Regions were very gracious about giving up their time to
us and candidly sharing both their descriptions of how they carry out their roles and their overall
feelings about currently-needed "help" from Headquarters (e.g., guidance, training,
involvement), even when negative, and how they can better be helped. General findings follow.
•	Project management personnel were sincerely interested in doing a good job and believed
that decisions were supported by adequate and appropriate data.
•	Regional Superfund senior management was interested in QA and supportive of QA efforts.
The level of this management support varied across the Regions.
•	There is a major gap in QA training - it is not mandatory in most Regions, and when it is
provided by the Regions, not usually well attended. A large number of interviewees had no
QA training, though they were interested in acquiring basic QA knowledge.
•	Most personnel interviewed (QA and program staff) indicated that QA training should be
mandatory. The general recommendation from Superfund project managers was that QA
training should be more relevant to their jobs and several people recommended that it should
be presented by program staff (especially true for DQO training - as noted by many RPMs).
•	The DQO process is rarely used according to the national guidance (G-4). There is general
lack of understanding of step 6 (establishing the limits on decision error) and skepticism on
the value of going through a complex exercise of null and alternate hypotheses, etc., and
statistical planning.
•	Many QAPPs reviewed were quite good, though it was difficult to evaluate them against the
R-5 requirements since many of the R-5 elements are found in other project documents (e.g.,
Work Plans, Sampling and Analysis Plans, Field Sampling Plans, to name a few). Since R-5
is still an interim document, some Regions did not require that it be followed.
•	Generally, the project managers felt that they had good contractor support for their activities.
There was some indication of lack of oversight of contractor activities and, in some cases, a
possibility of too much trust without oversight activities in place to verify performance.
•	There is great variability across the Regions in how much involvement Regional QA staff
have in project planning and implementation and even global oversight. There is a pattern of
"personality-based" involvement, and often little structure to obtaining QA support or formal
policy for when QA staff involvement should occur.

-------
•	There is strong feeling from QA staff that they could provide much valuable support to
project managers in planning, QAPP review, field oversight, and data acceptability
determination - these roles being successfully carried out in some cases.
•	There is still some perception of the role of QA as a burden and obstacle.
•	A Superfund QA function/staff is on the rise.
•	The Removal program staff often feel that their program is not well understood and that their data
collection must be less stringent (e.g., emergencies/time-critical removals).
•	Many interviewees said the QA documents should be "requirements," not "guidance."
Next Steps:
Regional profiles have been developed and are being reviewed by Regional QA and Superfund
staff as to their representativeness of overall Regional practices. Those profiles will be the basis
for the National Summary, which highlights exceptional practices and areas of
weakness to focus improvement recommendations. Lastly, the "Superfund Guidance on
Minimum QA Oversight Program Elements" (proposed title) will be developed, with anticipated
strong input from the Regional Superfund programs and QA staff. It will be a clear
interpretation of "5360" for Superfund activities, regardless of lead. The guidance will
incorporate any Interagency Data Quality Task Force (IDQTF) products, either by reference or
appendix, in order to assure that no contradiction or redundancy in products occurs and
interagency buy-in is more likely.
QA oversight guidance elements likely for inclusion cover: systematic planning for data
collection, generation of planning documents needed for carrying out data collection (e.g.,
QAPPs, SAPs, FSPs), oversight of the data collection process (e.g., adherence to the sampling
(and other relevant) plans and documentation of deviations, equipment decontamination, field
record-keeping, field measurements, sample packaging and shipment), evaluation of data for
suitability for use, laboratory QA oversight, sample and field QC practices (e.g., types of quality
assessment samples to use), appropriate training for various roles (program and project
management and QA staff), communications (up, down, and sideways) and feedback
mechanisms, and data management. Needed flexibility and a performance-based approach will
be retained - at the same time making QA and management goals clear.
Conclusion:
The Regional reviews were very useful as a starting point for guidance document development.
There are many fine examples of good project management and oversight and many examples of
weaknesses that will benefit from guidance. There appears to be enthusiasm for a single, new,
concise, and user-friendly set of minimum QA elements for Superfund activities - with the hope
that the myriad of existing guidance can be declared obsolete. Management understanding and
support for QA should be enhanced.
Audience participation will be requested on related questions.

-------
AN ALTERNATIVE APPROACH TO QUALITY ASSESSMENT SAMPLE ALLOCATION
Daniel Michael, Vice President
Neptune and Company, Inc.
1505 15th St, Suite B
Los Alamos, NM 87544

Katherine Campbell, Technical Staff Member, and Larry Maassen, Quality Program Project Leader
Los Alamos National Laboratory
Los Alamos, NM 87545
SUMMARY
An alternative approach to the determination of the type and number of field quality assessment samples
such as collocated samples, field splits and a variety of blanks (rinsate, field blanks and trip blanks) has
been developed as part of the Los Alamos National Laboratory Environmental Restoration Project. This
approach is designed to reduce the burden associated with the collection and analyses of these samples,
while maximizing the benefit, and hence value. The alternative approach starts by recognizing the
overall objectives of collecting field QA samples and the problems associated with traditional
approaches. The process involves assigning QA samples to aggregates of sites and focuses on
representing specific environmental media and analyte classes of interest. The process has widespread
applicability to Federal Facilities or industry where large cleanup or monitoring programs require the
collection and analysis of vast quantities of environmental data, and has the potential to save vast sums
of money while improving the quality of the product generated.
INTRODUCTION
A great deal of time, money and effort associated with large environmental data collection efforts is
devoted to the collection and chemical analysis of a variety of field quality assessment and quality
control samples. Until recently, the Los Alamos National Laboratory (LANL) Environmental
Restoration (ER) Project followed the same approach as that taken at many facilities across the country -
- to routinely collect field duplicates (either collocated samples or field splits), and a variety of blanks at
a set frequency. For example, the standard procedures followed by LANL from 1992-1996 called for
field QA sampling (field duplicates; rinsates, field and reagent blanks) at the 1 in 20 or 5% rate for soil
sampling. In practice, this approach (in combination with the QA/QC samples used in the laboratory)
resulted in 40-60% of the sampling and analytical budget being spent on this ancillary data, rather than
on field samples. It is no wonder that the value of QA/QC samples has begun to be called into question.
Recognizing these issues, LANL initiated an effort to develop a rigorous QA Oversight program. As a
first step in this process, a decision was made to download and assess all historical QA data to learn
about the overall performance of the measurement systems that have been in place and to determine the
relative value of different sample types and the information in general. The following were stated as the
products that the QA Oversight program was to produce:
1.	A method to determine what measurements are most critical for generating quality measures.
2.	Estimates of precision and bias for the entire measurement system by media (matrix), analyte, method
(analytical, sample preparation, field sampling), field season, year or groups of years, and lab
3.	Estimates of the performance of field data including completeness (in data base), and precision and
bias
4.	Data to evaluate the comparability of methods

-------
5.	Summary of how frequently different validation flags are put on data by: lab, analyte, and matrix
6.	Relative contribution of measurement variability to total study variability
Efforts to generate these products based on the historical review were impeded for a number of reasons.
First, attempts to access the data were fraught with difficulties due to historical problems in coding QA
data that interfered with the ability to query the data base, multiple codes used to identify QA samples,
and missing entries for QA results. In addition to data base related problems, duplicates that were
identifiable frequently contained numerous non-detects for constituents of interest, or the number of
duplicates related to any particular site was inadequate to generate a meaningful estimate of variability.
Early conclusions of the QA Oversight effort were twofold: 1) the recognition that numerous changes in
the data recording and management systems would need to be made, and 2) the recognition that the
information obtained from the current allocation of field QA samples was of questionable value to the
decision making process and the process for determining the type and number of samples should be
reconsidered. In the interest of doing more with less, an evaluation was made of the types and numbers
of field QA and other samples that should be introduced by LANL during the sample collection and
handling operations. Emphasis was placed on (1) identifying QA samples that provide information that
is necessary to assess data quality for decision making, and (2) establishing a minimum standard for use
across the ER Project. An alternative process for designing the QA sampling program was developed,
and over time has evolved to a new SOP. This process, and its basis, is presented in this paper.
GOALS AND OBJECTIVES
The goal of this effort was to come up with an alternative process to follow in determining the minimum
number of each type of field QA sample. The specific objective established for QA data was to augment
normal field samples in such a way that would support estimates of total measurement system precision
and bias1, by media, analyte and to some extent, sampling method (e.g., surface soil methods versus
subsurface soil methods). It is the total measurement system performance in combination with statistical
sampling error that directly impacts the probability of making a decision error. The number and allocation
of field samples should be adequate to support the decisions of interest. The role of the QA samples is
primarily to support the evaluation of the performance of the measurement systems used; not to improve
the quality of data being generated.
THE ALTERNATIVE APPROACH
The approach presented herein reflects recent EPA guidance on the use of field assessment sampling
contained in (EPA 1998a, and EPA 1998b). These documents emphasize the importance of integrating
1 Total measurement system error is the sum of all errors associated with generating an individual result including
error introduced through the processes of sample acquisition, handling, transport and storage, preparation and
analysis. An estimate of total measurement system variability can be obtained through the use of collocated and/or
field split samples. Whenever practical, use of collocated samples is recommended, since in that way the sample
acquisition process is repeated. This is typically impractical when dealing with subsurface cores, where samples
should be split. Total bias is usually estimated by analyzing matrix specific PE samples and blanks. Additional lab
procedures utilizing spiked samples, surrogates and reference materials will also provide some useful estimates of
analytical bias.

-------
field assessment sampling with the sampling and analysis plan (SAP) and the overall project objectives,
rather than achieving a specified ratio of field assessment samples to field samples.
The alternative approach is designed to support an aggregate-level evaluation of the performance of the
measurement system, rather than assessment for a single project or sampling event. Aggregation is
essential, since many studies at LANL (as well as other facilities with multiple sites) involve a small
number of samples from a given matrix at a given site, analyzed for a focused set of analytes. Using the
traditional methods, these sites would be allocated one duplicate and one or two blanks. Analyzed
alone, one duplicate or blank sample provides almost no information of value for interpreting the field
samples, or in designing future studies. Project aggregation is therefore the first step of the procedure for
allocating QA samples.
A project aggregate is defined here as a set of individual projects (e.g., small sites) for which one series
of QA samples will be collected and evaluated. A project aggregate should meet a number of criteria: it
addresses a defined geographic area; it is under the control of a common manager and has a single point
of contact; sampling should be performed within a set time frame, typically a calendar year or field
season; the aggregate sampling effort should generate a minimum of 50 field samples (more than
150 samples are preferred); and either sampling should be performed by a single contractor, or QA
samples representative of each applicable matrix and analyte suite should be collected by each contractor
(which may increase the number of QA samples required). By defining aggregates in this way, an
adequate number of QA samples can be collected to generate some valuable information, without
overburdening individual sampling efforts. This approach should be easily adaptable to other large
Federal Facilities, as well as large industrial areas, where the majority of the nation's environmental data
collection efforts are ongoing.
Identify the Field Samples to be Collected by Projects in the Aggregate
This step in the process requires the project aggregate manager to project how many samples of each
matrix will be collected during the time period to be represented by the QA samples taken (e.g., calendar
year or field season). In addition, if at least 20 samples within a matrix are to be collected using two or
more different sampling techniques (that may affect total measurement error), they should be separated,
so that QA samples specific to that technique can be allocated. Once this has been done, the next
question to answer is how many of these samples will be analyzed for each of the analyte suites of
interest, and what method(s) will be used to perform the analyses. If two or more analytical methods are
to be used to perform a suite of analyses, these groups should be separated and QA samples allocated to
each group. In addition, any unusual methods that are to be employed should be identified. The results
of this exercise should be recorded in a table like that shown below (Table 1) which includes the analyte
and matrix classes that pertain to the LANL ER project.

-------
Table 1. Example format for recording the number of field samples and types of analyses to be taken
within an aggregate

Analyte suite   | Aqueous Liquid | Gas/vapor | Soil/Sediment | Tuff | Other (e.g., sludge, debris)
High Explosives | 0 | 0 | 83 (45 lab analyses, 40 field screening methods) | 35 (all lab analyses) | 0
Inorg/metals    | 0 | 0 | 85 | 35 |
PCBs/Pesticides | 0 | 0 | 10 | 10 |
RAD             | 0 | 0 | 85 | 35 |
SVOCs           | 0 | 0 | 10 | 10 |
VOCs            | 0 | 0 | 0 | 0 |
Special         |   |   |    |    |
Allocation of Field Duplicates
To begin the process of determining the number and type of field duplicates, the first step is to highlight
the matrix/suite pairs in Table 1 that contain enough samples (usually more than 20) to make it
worthwhile assigning some number of field duplicates to the cell. Also, cells for which information is required
because a novel or difficult sampling or analytical method is to be used should be identified, even if
they contain fewer than 20 samples. Other cells are generally viewed as being too small to warrant
allocation of QA samples.
Having completed the table, the project aggregate manager and site experts then contribute their best
judgment regarding what to expect from the sampling campaign in order to decide whether to collect
collocated or split samples. Collocated field duplicate samples are recommended if site heterogeneity is
potentially significant (such as when contaminants are expected to be particulate or chunky in nature).
Spatially or temporally collocated samples are also required where field splits are impractical, e.g., when
collecting samples for volatile analysis. Either field splits or collocated samples are acceptable for other
situations. Slightly different components of the total variance are estimated if field splits are used,
compared to what is estimated using collocated samples (see Table 3).
For each highlighted matrix/suite combination, the next step in the process is to determine the number of
field duplicate pairs required using Table 2. Project aggregate area managers will notice that as the
number of field samples goes up, the proportional burden goes down. The numbers of field duplicate
pairs listed in Table 2 are the minimum recommended numbers, and reflect practicality considerations at
least as much as statistical considerations. Managers will be motivated to form larger aggregates to keep
the cost of QA sampling to a minimum. Additional duplicate pairs are recommended when unfamiliar
sampling or analyses methods are used, or when the estimate of measurement error is an important
outcome of the study (e.g., when conducting a Phase I study that will produce variance estimates used to
determine the number of Phase II samples required).

-------
Table 2. Minimum Numbers of Field Duplicate Pairs

Number of Field Samples    Number of Field Duplicate Pairs
<20                        0
20-45                      3
45-75                      4
76-120                     5
121-180                    6
181-245                    7
>245                       8
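A minimal Python sketch of how Table 2 and the 20-sample screening rule might be applied to an aggregate's projected sample counts is shown below; the function and data layout are illustrative and not part of the LANL SOP.

# Minimum number of field duplicate pairs per Table 2, applied to each
# matrix / analyte-suite cell of an aggregate (cell counts taken from Table 1).
def duplicate_pairs(n_field_samples):
    if n_field_samples < 20:
        return 0
    for upper, pairs in ((45, 3), (75, 4), (120, 5), (180, 6), (245, 7)):
        if n_field_samples <= upper:
            return pairs
    return 8

aggregate = {("Soil/Sediment", "Inorg/metals"): 85,
             ("Tuff", "Inorg/metals"): 35,
             ("Soil/Sediment", "PCBs/Pesticides"): 10}

for cell, n in aggregate.items():
    print(cell, "->", duplicate_pairs(n), "duplicate pair(s)")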
Having determined the number of duplicate pairs, the remaining step is to determine which field samples
will be duplicated to obtain the required numbers for each highlighted matrix/suite combination. In
general, field duplicates should be distributed evenly across the project in both space and time.
However, to maximize the value of these samples, more duplicates should be taken from locations that
are likely to yield detected results for the analytes of interest. By incorporating QA sampling design into
the overall QA and SAP planning, sample locations for duplicate pairs can be identified that achieve
these goals.
Allocation of Equipment Rinsate Blanks and other Types of Field Blanks
Other types of field assessment samples are required only for certain types of sampling and/or analysis
and no minimum numbers are required. These other types of field assessment samples should be
carefully targeted to anticipated problems associated with novel or difficult sampling and analysis
methods. For many routine sampling situations, no equipment rinsate blanks are recommended; however,
if equipment is not being immediately reused, such as when equipment is decontaminated in bulk at the
conclusion of a project, a single rinsate blank may be collected prior to equipment reuse. Other
situations where rinsates may be deemed appropriate include: use of an analysis procedure with
especially low detection limits, collection of samples of relatively low analyte concentration following
the sampling of areas of high concentration, or collection samples after equipment has been in contact
with relatively adhesive materials such as asphalt. Action levels that would trigger a change in sampling
procedure or resampling if exceeded must be specified. Rinsate blanks are most useful as a QC tool,
but this requires scheduling the collection early in the field season and the use of quick turnaround
analysis to provide results prior to collecting the majority of field samples. Problems found after the fact
(when rinsates are used as a QA tool) may lead to rejection of field data and potentially expensive
resampling, while problems found during the field season can be corrected. If appropriate (i.e., if
methods of adequate specificity and sensitivity are available), rinsate blanks may be analyzed by field
methods or quick turnaround methods; it is not necessary that the methods used for the routine samples
be used.
Other types of blanks, such as field blanks, are most useful at sites where VOCs are known or strongly suspected to be present. Contaminated sampling containers and reagents are very rarely a problem for routine sampling and analysis methods. If required, the appropriate analyte-free matrix should be used for the blank: for example, distilled or deionized water for liquid or solid matrices, and unexposed filters for gas samples. Blanks can be opened at the site during the sample collection process to determine whether cross-contamination from volatiles is likely (e.g., open field blanks), or can be kept closed (e.g., trip blanks) to evaluate the potential for cross-contamination during shipping and storage. Like rinsate blanks, field blanks, if used, may be concentrated near the beginning of a project if contamination problems are suspected, or inserted at the rate of one per site or round of sampling. Again, an action level that, if exceeded, would trigger changing sampling methods or resampling should be set ahead of time.
Table 3. Quality Assessment/Quality Control Samples
Columns: Sample/Data Type | Parameter Estimated | How Data Are Used (pool the results for media/analyte classes)

Field Split
Sample/Data Type: Field sample homogenized and subdivided in the field, then individual portions submitted for analysis.
Parameter Estimated: Variance. Provides an estimate of total measurement variance, excluding the effects of sample collection and small-scale site heterogeneity. Useful for estimating the sum of handling, storage, preparation, and analysis variances and for estimating the effectiveness of sample homogenization protocols.
How Data Are Used: Compare the observed concentration difference between the split and the original sample with differences between lab duplicates and with reported laboratory precision. Look for outliers and trends. Calculate RSDs. Evaluate the effect of analyte concentrations on RSD.

Collocated Sample
Sample/Data Type: Sample collected as near in space and time to the original field sample as the sampling equipment and procedure allow.
Parameter Estimated: Variance. Provides an estimate of total measurement variance, including the effect of population (site) heterogeneity on the scale of the sample (i.e., differences that actually exist in the field on the scale of the collocated sample).
How Data Are Used: Compare the observed concentration difference between the collocated and the original sample with the variability among field samples, differences between lab duplicates, and reported laboratory precision. Look for outliers and trends. Use to estimate overall sampling variability. Calculate RSDs. Evaluate the effect of analyte concentrations on RSD.

Rinsate Blank
Sample/Data Type: Blank material poured over decontaminated equipment, collected and submitted for analysis.
Parameter Estimated: Bias. Indicates the potential for contamination of field samples (positive bias) through contact with contaminated sampling equipment.
How Data Are Used: Look for detections in the blank; a detection alone may be indicative of a problem. If found, evaluate against decision criteria (e.g., action levels) to determine whether the contamination matters, as is done for field blanks. Corrective actions may include revision of the decontamination procedure or resampling.

Field Blank
Sample/Data Type: Analyte-free matrix transported to the field in a closed sample container and, optionally, opened in the field. Appropriate reagents are added and the sample is submitted for analysis.
Parameter Estimated: Bias. Indicates potential contamination of field samples from exposure to site conditions (if opened in the field), contact with reagents, and field and laboratory storage/shipping/handling.
How Data Are Used: Compare the observed concentration to the detection limit. Use the comparison to indicate the potential for sample contamination. Corrective actions may include revision of sampling procedures or resampling.

Performance Evaluation Sample
Sample/Data Type: Sample of prepared and certified performance evaluation material collected and analyzed in the field.
Parameter Estimated: Bias. Provides an estimate of the bias of the field analytic method relative to the method by which the performance evaluation material was certified. Variance. Provides an estimate of the variance of the field analytic method.
How Data Are Used: Compare measurements to the certified values. Estimate the bias and variance of the measurements. Measurements with large bias or variance may indicate the need to modify decision rules that use the field analyses in order to compensate for the limitations of the field method.
Table 3 summarizes the different types of duplicates, blanks, and performance evaluation samples that should be considered when designing a SAP. By applying the methods described in this document, we believe these samples will provide greater value to a project.
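For example, the duplicate-pair comparisons called for in Table 3 reduce to simple summary statistics. The sketch below uses hypothetical concentrations (mg/kg); it computes the relative percent difference and the pair RSD that would be pooled across a media/analyte class:

from statistics import mean, stdev

def rpd(x1: float, x2: float) -> float:
    """Relative percent difference between an original sample and its duplicate."""
    return abs(x1 - x2) / mean([x1, x2]) * 100.0

def pair_rsd(x1: float, x2: float) -> float:
    """Relative standard deviation (%) of a duplicate pair."""
    return stdev([x1, x2]) / mean([x1, x2]) * 100.0

# Hypothetical collocated-pair results (mg/kg) for one matrix/suite cell
pairs = [(12.0, 10.5), (3.1, 3.4), (48.0, 61.0)]
for original, duplicate in pairs:
    print(f"RPD = {rpd(original, duplicate):5.1f}%   RSD = {pair_rsd(original, duplicate):5.1f}%")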
CONCLUSION
An alternative procedure for allocating field QA samples during the design of environmental data collection efforts has been developed. This procedure will increase the value of the samples collected by ensuring that all important matrices and analyte classes are represented in a manner that facilitates pooling data for assessment purposes. The procedure encourages managers to build QA sampling into their overall SAP and to create meaningful aggregates of sites that will yield enough field samples to generate a meaningful QA data set without overburdening a project. By focusing on the important objectives associated with QA samples, and by recognizing up front how they will be used, the likelihood that they will in fact be assessed increases. Therefore, not only will the burden of collecting the samples go down (especially as aggregate sizes increase), but the amount of useful information obtained will go up.
ACKNOWLEDGMENTS
The alternative procedure discussed herein has been developed as LANL ER-SOP 1.05 by Katherine
Campbell. Paul Black, Wendy Swanson, Dean Neptune and Kristen Lockhart (Neptune and Company),
Tom Johnston (formerly of Neptune and Company), and Larry Souza (LANL) played an important role
in developing the concept. Terre Mercier, Wendy Swanson, Paul Black and Kristen Lockhart (Neptune
and Company) and Catherine Smith (LANL) contributed by pursuing QA Oversight, and Brad Martin
and Alison Dorries (LANL) encouraged the search for a more efficient and meaningful approach to QA
sample collection.
REFERENCES
USEPA 1998a. EPA Guidance for Quality Assurance Project Plans, EPA QA/G-5, EPA/600/R-98/018. Office of Research and Development, Washington, D.C., 1998.
USEPA 1998b. EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations, EPA QA/R-5, External Review Draft Final. Quality Assurance Division, Washington, D.C., October 1998.

-------
MEASURING THE COST OF QUALITY-
ANALYTICAL PROGRAM CONSIDERATIONS
Paul Mills
DynCorp
2000 Edmund Halley Drive
Reston, VA 20191
Jeffrey C. Worthington
USEPA
401 M Street SW
Washington, DC 20460
SUMMARY
When the USEPA buys data, the Agency pays for "defect-free" data, plus the costs of imperfect
data. The USEPA pays for sampling, and the bid price for laboratory services. Laboratory
services include analysis, reporting, bottles, and built-in lab internal QC checks and reanalysis
fees. The USEPA also pays third party contractors such as Contract Laboratory Analytical
Support Services (CLASS) and Quality Assurance Technical Support (QATS) to review work,
check invoices, provide reports, etc. It is to the government's advantage to use quality costs to
identify cost savings, and work with the Contract Laboratory Program (CLP) labs to reduce
appraisal and failure costs.
Laboratories are evaluated on a number of factors, including completeness/compliance and data
turnaround time, results of PE samples and audits, etc. Laboratories performing better on these
indicators are candidates to receive additional work, with the expectation that the labs will
continue to perform at a high level. If these appraisal costs are quantified, the results can be
factored into an analysis of quality costs. This paper examines the various costs and proposes
evaluation tools to lower total quality costs.
INTRODUCTION
The Quality Cost Improvement Philosophy is simply stated:
PRIVATE SECTOR: Every dollar saved in the total cost of quality is directly
translatable into a dollar of pretax earnings.
PUBLIC SECTOR: Every dollar saved in the total cost of quality may be translated into
increased ability to meet public needs (protect human health and the environment).
Quality improvements and quality cost reductions cannot be dictated by management—
they must be earned through the process of problem solving. The first step in the process is the
identification of problems. Every problem identified by quality costs is an opportunity for profit
improvement. Quality improvement results in cost improvement. Designing and building a
product right the first time always costs less. Solving problems by finding their causes and
eliminating them results in measurable savings.

-------
BASICS OF QUALITY COSTS
J.M. Juran developed the cost of quality (CoQ) technique more than 40 years ago. There are two
kinds of costs: those incurred because of a lack of quality (nonconformance to specifications)
and those incurred in the achievement of quality (conformance to specifications). Costs due to a
lack of quality are further divided into costs of internal failures and costs of external failures.
Costs of achieving quality are divided into appraisal costs and defect-prevention costs (see Table
1, Cost of Quality Categories).
Table 1, Cost of Quality Categories
Columns: Category | Definition | Typical Examples

Internal failures
Definition: Quality failures detected before providing a service or shipping a product to the customer.
Typical Examples: Rework, retesting; costs of scrap, reinspection.

External failures
Definition: Quality failures detected after delivery or shipment of the product or service to the customer.
Typical Examples: Rework, retesting, resubmission, liquidated damages; costs of processing customer complaints; cost of lost or reduced business; liability costs.

Appraisal
Definition: Costs associated with measuring, evaluating, or auditing products or services to assure conformance to quality standards and performance requirements.
Typical Examples: Checking and screening activities, product reviews, incoming and source inspection of purchased material, audits, calibrations.

Prevention
Definition: Efforts to ensure product quality.
Typical Examples: Inspections, process studies, metrics collection, training, equipment maintenance, quality planning, supplier capability surveys, process capability evaluations, quality improvement projects.
The Total Cost of Quality is the sum of all these costs of failure, prevention, and appraisal. They
represent the difference between the actual cost of a product or service and what the reduced cost
would be if there were no possibility of substandard service, failure of products, or defects in
their manufacture. To improve or maintain quality involves cost. Product or service must be
inspected to assure that bad output does not reach the customer. Control systems must be established
and maintained. Workers must be trained.
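A sketch of the bookkeeping implied here, using entirely hypothetical category totals, is simply a sum over the four categories:

# Hypothetical annual quality costs for one program, in dollars
quality_costs = {
    "prevention": 40_000,        # planning, training, quality planning
    "appraisal": 120_000,        # audits, PE samples, data review
    "internal_failure": 55_000,  # rework and retesting caught in-house
    "external_failure": 85_000,  # resubmissions, liquidated damages, lost business
}

total_cost_of_quality = sum(quality_costs.values())
failure_share = (quality_costs["internal_failure"]
                 + quality_costs["external_failure"]) / total_cost_of_quality

print(f"Total Cost of Quality: ${total_cost_of_quality:,}")
print(f"Failure costs as a share of the total: {failure_share:.0%}")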

-------
Each participant working with or supporting the CLP—the laboratories, the contractors
supporting EPA (sampling team, CLASS, QATS), and the Regions—incurs the three classic
categories of quality costs—prevention, inspection, and failure (see Tables 2 and 3). CLASS, the
Regions, and the labs each have their own QC programs that monitor internal performance.
Table 2, Example Activity-Based Quality Costs for the EPA (with CLASS, QATS)
Columns: EPA Activity | Costs | Quality Cost Category

Planning, Scheduling, Training
Costs: Data Quality Objectives, EPA and field contractors' costs, Quality Assurance Project Plans, Health & Safety Plans, Sampling & Analysis Plans, program-wide standardized methods and deliverables
Quality Cost Category: Prevention

Sampling
Costs: Packaging and shipping expenses, QC sample costs, review costs
Quality Cost Category: Prevention and appraisal

Review/Validation
Costs: Semi-automated and manual hardcopy validation, pre- and post-award on-site audits, blind and quarterly PE samples, data package and tape audits
Quality Cost Category: Appraisal

Rework
Costs: Resampling, corrective actions
Quality Cost Category: Failure
Table 3, Example Activity-Based Costs and Quality Costs for a Laboratory
Columns: Lab Activity | Costs | Quality Cost Category

Planning, Scheduling, Training
Costs: Peer, management, and QA reviews
Quality Cost Category: Prevention

Receipt/Login
Costs: Check paperwork, preservation, custody
Quality Cost Category: Appraisal

Prep/Analysis
Costs: QC checks, calibrations, Standard Reference Materials, traceability, maintenance, review
Quality Cost Category: Prevention/Appraisal

QA Program
Costs: SOPs, audits, PE samples, training, reports
Quality Cost Category: Appraisal

Reporting/Customer Service
Costs: Initial report, answering questions
Quality Cost Category: Appraisal

Rework
Costs: Re-prep, re-analysis, re-reporting, corrective actions, Liquidated Damages, potential lost business, reduced receipts
Quality Cost Category: Failure

-------
ESTIMATED SAVINGS USING QUALITY COSTS
The following questions are worth asking when considering the possible savings that could be
realized in using a quality cost approach:
•	To minimize the total of these costs, how should they be measured and used?
•	How can managers use them to identify areas where small increases of resources can mean
substantial quality increases, and large quality cost decreases?
•	Can they be compared using cost/benefit procedures, or by standardized accounting criteria
such as Net Present Value (NPV), or Return on Investment (ROI)?
•	Can they be compared using a simple "Hassle Factor" (HF), with the combinations causing
the fewest problems equated to the lowest quality cost?
CLP EXAMPLES OF QUALITY COSTS
Assume a situation in which purchased material rejections (i.e., failed data deliverables) are the
Agency's biggest problem. Pareto analysis can be used to determine which suppliers cause the
most problems, based on Contract Compliance Screens (CCS) and the need for manual data
reviews. Then the Agency can focus on these suppliers and take appropriate action. The Agency
might convince them to institute quality cost programs. Improved profitability for the supplier
may eventually result in lower prices for the Agency in a competitive market. The supplier
quality costs can be incorporated into a supplier rating system. A ranking of suppliers by quality
cost performance index can be constructed. What information could be used in such a rating
system? CCS is designed to work on a consistent and fast-turnaround basis. To achieve this,
systems have been implemented to perform the following tasks:
•	Track the progress of deliverables through all parts of the CCS system.
•	Examine and report on the presence of defects, completeness of variables, and compliance
specifications of the data reported on the electronic diskette deliverables.
•	Generate reports summarizing the full compliance status of deliverables in terms of summary
sheets and approved defect statements.
•	Generate CCS records containing coded compliance status and error records summarizing
details of detected defects.
•	Report compliance status and trends across time, laboratories, Regions, and/or CCS criteria.
The following are examples of CLASS routine CCS reports that measure supplier performance:
•	Failed Sample Delivery Group report
•	CCS Defect report and Resubmittal report (semi-automated)
•	CCS Defect report and Resubmittal report (full manual)
•	Compliance Analysis and Late Trend Data Report (each SOW)
•	SDG Completed Monthly Report
•	Weekly CCS Updates
•	Diskette Deliverable Compliance Analysis for Contracts
•	Data Acceptance/Rejection/Reduced Value (DARRV) Forms
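Data drawn from reports such as these could feed the Pareto analysis described above. The sketch below ranks suppliers by quality-cost impact; the laboratory names and defect costs are hypothetical:

# Hypothetical failure costs ($) attributed to each laboratory's deliverables
defect_costs = {"Lab A": 62_000, "Lab B": 9_500, "Lab C": 41_000,
                "Lab D": 4_200, "Lab E": 13_300}

total = sum(defect_costs.values())
cumulative = 0.0
print("Laboratory    Cost      Cumulative share")
for lab, cost in sorted(defect_costs.items(), key=lambda item: item[1], reverse=True):
    cumulative += cost
    print(f"{lab:<12}  ${cost:>7,}  {cumulative / total:6.1%}")

In this hypothetical ranking, two laboratories account for roughly 80 percent of the failure costs, which is where corrective attention would be focused first.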

-------
As an example of the magnitude of quality failure costs (lab's external failures), the following
figures in Table 4 for FY98 show liquidated damages for all SDGs:
Table 4, FY98 CLP Liquidated Damages Charges
SDGs based on Data Receipt Dates:        2,540
Total SDGs with Liquidated Damages:      1,480
% of SDGs with Liquidated Damages:       58%
Total Liquidated Damages:                $233,000
(Note: Figures are rounded)
HASSLE FACTOR REDUCTION
"Hassle Factor" is a term applied to the amount of effort and aggravation necessary to solve a
problem. For CLP data, a hassle factor can be quantified in terms (such as "What easily-
avoidable failure category items take the most time to deal with?"), and are consequently the
most aggravating types of problems handled? A simple example is when labs don't call to report
discrepancies in sample identification/labeling. A lab may "guess" which sample numbers are
correct, based on information on the bottle labels or Traffic Reports and Chain of Custody forms.
If the lab is incorrect, the sample numbers on the diskette deliverable will be unacceptable, since
the information provided from the sampling team doesn't match what was anticipated from the
laboratory. When the diskette fails in the CCS, the CLASS Environmental Program Coordinator
must talk with the lab and the Region to determine the correct sample numbers and institute
changes in the appropriate documents. The electronic information must then be corrected and
the electronic data screen repeated. If the problem is not resolved within 5 days, and a manual
screen of the data is performed, review costs are much higher for the EPA. Skipping a small amount of prevention (calling about discrepancies) leads to a large cost of quality when this failure occurs (failed screen, corrective action, cost of a second screen and a possible manual screen, liquidated damages, etc.). Diskette failures that trigger manual screening cost the Agency more than $1500
per SDG, even if there are no problems with the analytical data being reported. The lab incurs
additional expenses in resubmitting the diskette, too.
RETURN ON INVESTMENT (ROI)
Consider a lab's investment in software to pre-screen Organic SOW diskettes using CCS
software checks. The software performs the same screening function as the Agency's software,
and can help the lab identify and fix problems before they are reported. A few thousand dollars
invested could save many thousands in late fees, liquidated damages, reduced value, etc.
Assume $2,000 expenses for software in the first year of a three-year analytical services contract.
If the software check prevents 2 SDGs from being manually screened by the CLASS and the
Regional customer, this is a savings of about $3000 that could have been charged to the lab for
liquidated damages and reduced payments. The present value of saving $3000 each year of the
contract, assuming a 15% cost of capital to the lab, is $6849, giving a Net Present Value
(determined from present value tables) of $4849 for the ROI in the software. (This calculation
doesn't account for the cost of training lab staff in its use, and possible software licensing fees. It
also doesn't account for the extra costs to the lab of reprocessing the diskettes of the failed SDGs,
and the loss of client goodwill.)
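The present-value arithmetic in this example is easy to reproduce. A sketch using the figures above (a $2,000 investment, $3,000 in avoided charges in each year of a three-year contract, and a 15% cost of capital):

def npv(rate: float, initial_cost: float, annual_savings: float, years: int) -> float:
    """Net present value of a stream of equal annual savings against an up-front cost."""
    present_value = sum(annual_savings / (1 + rate) ** t for t in range(1, years + 1))
    return present_value - initial_cost

value = npv(rate=0.15, initial_cost=2000, annual_savings=3000, years=3)
print(f"NPV: ${value:,.0f}")  # roughly $4,850; the text rounds the table-derived figure to $4,849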

-------
COST/BENEFIT COMPARISON
Another simple example compares the difference in quality costs and performance for individual
labs. Two labs may each receive the same number of samples and SDGs, but a lab with
problems will have a higher cost of quality than one without. It is spending more money to
correct failures than to detect and prevent them.
IMPLEMENTATION OF A QUALITY COST PILOT PROGRAM
Economic benefits of CoQ are usually reported in terms of productivity gains, improved product
quality, reduced cycle time, total cost and effort savings, or return on investment. Managers
want to know that money spent on better practices will yield good returns, and they want to see
numbers that show a positive impact on the bottom line. Convincing management of the value
of tracking CoQ is often the first hurdle. The main purpose of the initial CoQ effort is to show
the opportunity for cost savings. Juran and Gryna (1980) suggest that the following relationships
have the greatest impact on management:
•	Quality costs as a percent of sales
•	Quality costs compared to profit
•	Quality costs compared to the magnitude of the current problem
Campanella (1999) says "The real value of a quality program is determined by its ability to
contribute to customer satisfaction and to profits." The payoff, or monetary gain, to a company
in seeking better customer satisfaction is Gain in Contribution Margin: Cash flow from the
incremental gains in revenue due to additional repeat sales from more satisfied customers, less
the costs of those sales. For most organizations, the principal investment in the entire quality
cost program is the expense of the quality cost collection system.
The strategy for using quality costs is simple: (1) attack failure costs in an attempt to drive them
to zero; (2) invest in the "right" prevention activities to bring about improvement; (3) reduce
appraisal costs according to results achieved; (4) continuously evaluate and redirect prevention
efforts to gain further improvement. If the basic quality measurement system of a company
cannot provide the identification of defects or problems to which quality costs can be attached,
the first corrective action required is to establish a system that does. If failure costs are collected
in a defect-tracking system, the most expensive defects can be identified for root-cause analysis.
This strategy is based on the premise that
•	For each failure there is a root cause.
•	Causes are preventable.
•	Prevention is always cheaper.
All department representatives should be encouraged to make program suggestions from their
expert viewpoints. Ask them to prepare a list of those tasks or functions performed by their
departments that can be considered quality costs—work that would not have to be performed if
quality were, and always would be, perfect.

-------
Steps in an effective implementation of a Cost of Quality program are (Houston, 1999):
1.	Obtain management support for a CoQ assessment.
2.	Select those parts of the organization to be assessed.
3.	Establish an assessment team from work groups, management, and accounting.
4.	Define the various CoQ components within each of the categories.
5.	Collect data on each CoQ component for a given time period and convert it into dollars.
6.	Compare with a base (e.g., total sales, cost of services sold).
7.	Analyze the CoQ cost components into drivers, symptoms, and root causes.
8.	Determine the investment and impact to eliminate root causes of poor quality.
9.	Do a cost/benefit analysis.
10.	Present to management the anticipated improvements in product or service quality, the
associated savings, the required investment, and the time period for realization.
11.	With approval, select and train the improvement team, develop an action plan, and
implement the improvements.
12.	Continually assess progress of the improvement projects and continually monitor and report
Quality Costs.
CONCLUSION
Investment in a Cost of Quality program will produce lower overall costs for the Agency and its
suppliers. By focusing management attention on the costs of failure and appraisal, root causes of
problems will be identified, corrected, and ultimately prevented. The Agency and the public
benefit from improved performance and cost savings, completing scheduled work on time and
within budget. Laboratories receive more revenue with lower costs, and can demonstrate that
improved quality leads to higher profits. They perform more efficiently, allowing them to
perform additional analyses and maintain customer satisfaction. There are clear advantages for
the Agency to work with the laboratories to apply the Cost of Quality principles.
REFERENCES:
Campanella, Jack, Editor, 1999. Principles of Quality Costs: Principles, Implementation, and Use, 3rd Edition. Milwaukee, WI: Quality Press.
Houston, Dan, 1999. "Cost of Software Quality: Justifying Software Process Improvement to
Managers" in Software Quality Professional Journal 1, No. 2, 1999. ASQ, Milwaukee, WI.
Juran, J. M., and F. M. Gryna, 1980. Quality Planning and Analysis, Second Edition, McGraw-
Hill, New York, NY.

-------
Efficient QA Through Consolidated Metrology
Paul W. Groff
QA Specialist
ORD/NRMRL/APPCD/TSB
EPA order 5360.1 requires that all environmental data be of known quality. When
obtaining environmental measurement data, "known quality" requires calibrations of the
instruments obtaining such data to be of known quality, usually traceable to an accepted standard.
Using "factory" calibrations usually requires returning the device to the manufacturer, which
necessarily means taking it out of the system. The device is then tested in the "factory" setting,
which may not be the same setting as the researcher will be collecting data with it. To obtain on-
site calibrations can be extremely costly. Performing your own calibrations requires the
researchers to train someone to do the calibrations and to obtain the required calibration
instruments and standards. What this all can boil down to is excessive time and resources spent
on calibrating instruments when compared to the amount of usage the instruments receive.
APPCD researchers recognized this and consolidated their funds to enable a metrology
laboratory to support APPCD's calibration needs.
The following presentation shows the cost savings associated with operation of the metrology laboratory; additionally, improved data quality, record keeping, and measurement systems have enhanced many projects. Many of these improvements are documented in this paper and shown to be effective both from the QA perspective and from the researcher's point of view.

-------
QUALITY ASSURANCE REVIEWS AT THE
ORD, NHEERL,
MID-CONTINENT ECOLOGY DIVISION
Allan R. Batterman
Quality Assurance Manager,
Environmental Scientist
USEPA, ORD, NHEERL, MED
6201 Congdon Boulevard
Duluth, MN 55804
EPA policy requires that each Research Project be documented in a Research Plan and
associated Quality Assurance Project Plan (QAPP). This is then implemented by the research
team. The Division Quality Assurance Program is then tasked to assess the implementation of
the QAPP. This assessment is accomplished through the review process, its implementation at
MED will be explained.
Principal Division Research Tasks are targeted for review on a three year rotation.
Under this plan these tasks should be assessed at the beginning and again just before the research
has ended. Thus, the review is to assist the team in understanding the requirements of the
Quality Assurance Program before the project becomes too involved and again to document that
the Quality Assurance/Quality Control procedures as explained in the QAPP were followed.
Materials available at this presentation will provide the participants with a format for interactive, user-friendly documentation of a Research Work Plan, Quality Assurance Project Plan, Health and Safety Plan, and Animal Control and Use Plan (available on WordPerfect disk at the presentation), as well as a simple format for the basic questions asked in most QA Reviews
(see following). The specific detail of the review is based on the Research Plan and the Quality
Assurance Project Plan. This detail is added to this form by the interview team following their
review of the Work Plan and QAPP prior to the scheduled interview with the research team.

-------
STANDARD NHEERL/MED QA INTERVIEW
TEAM LEADER/PRINCIPAL LEAD:
AUDIT FOCUS:
AUDIT TEAM MEMBERS:
DATE OF AUDIT:
PERSONNEL INTERVIEWED:
NAME | POSITION








OPERATIONAL AREAS THAT WERE OBSERVED:
Laboratory Facilities and Equipment
Internal QA/QC Procedures
Methods and QA Documentation
Review of QA/QC or Performance Evaluation Data
Sample Receipt and Storage
Data Management Capability
Sample Compositing and Sorting
Data Entry and Review Operations
Sample Identification and Enumeration
Data Verification and Validation Activities
Sample Tracking within Laboratory, between Agency and Contractor
Data Base Documentation, Reporting and Transfer Activities


-------
STANDARD NHEERL/MED QA INTERVIEW QUESTIONS
Based on general Environmental Protection Agency requirements as described in EPA QA/R-2 and
specific requirements described in the Research Projects - Quality Assurance Project Plan (QAPP)
ITEM
QUESTION
RESPONSE
PLANNING DOCUMENTS AND BACKGROUND

1
Is there a written and approved protocol, research plan, or workplan
for this study?
If yes, how is it identified and has the most current version been
distributed to key study personnel?
If there is not one, briefly describe how/where the study plan is
documented.

2
Describe the internal process (team/branch) for preparation, review,
and approval of Research Plan or other similar document (QAPP).

3
Are there deviations from/to the QAPP?
What are they?
How are they noted?

4
Are written and approved standard operating procedures (SOPs) used
in this study?
If yes, list below and note whether they have been
distributed to key study personnel and are available to all users
in this study. If not, briefly describe how/where study procedures
are documented.

5
List known SOPs here:
List tentative/planned SOPs here:

6
Are there additional planning documents (Agency
Guidance/Intraagency Agreement)for this study?
If yes, list below and note whether each has been distributed to key
study personnel.

7
How are the QA resources set and allocated for your team?
Who has input and what is included?
What is the current resource level ($, FTE) and is this
adequate to meet your QA needs?


-------
8
How is the fulfillment of existing Agency, NHEERL. and MED QA
requirements confirmed?
Describe QA processes within the team (e.g. planning, experimental
design, data review, documentation)
and external to the team (e.g. research plan approval, peer reviews,
audits).

9
What portion of your project is subject to the Agency QA mandate
(5360.1 CHG 1., 07/16/98) regarding environmental data/programs?
What are Agency's requirements for these studies?
Are there additional QA requirements?

QUALITY OBJECTIVES AND PERFORMANCE CRITERIA

10
Identify the clients for the research with which you are involved.
How are their needs addressed in the research planning process?

11
Briefly describe the intramural research planning process and your
role in it.

12
Is the anticipated use of the data known and documented?

13
Have study quality objectives, consistent with anticipated data use,
been established and documented?

14
Have performance criteria for measurement data (e.g., detection
limits, precision, bias) been established and documented?

15
Are there established procedures for assessing whether quality
objectives and measurement data criteria have been met?
If yes, briefly describe.
Are these assessments documented?

16
Are QC analyses being performed at required frequencies and within
acceptance criteria stated in the QAPP?
Are QC responsibilities and reporting relationships clearly defined?
Does the lab maintain a QC manual?
Are all QC data up-to-date and accessible?
Is there a mechanism in place to analyze QC data for trends?

STUDY ORGANIZATION AND PERSONNEL


-------
17
Describe the organizational structure of your project.
Who on your team is responsible for quality assurance duties?
What portion of their time is devoted to QA?
What are their principal responsibilities with respect to:
-	research planning?
-	project monitoring?
-	project management or oversight?

18
Do all of the principals on your team have in their performance
agreement an "environmental data quality" statement as directed by
the EPA Senior Leadership Council in FY 96? (Probably as part of a
research standard).

19
What are the primary QA related duties and responsibilities of the
individual team members and how are these communicated to them
and to others?
Does each team member regularly attend team meetings?
Is QA a discussed topic at these meetings or are additional
meetings held to address QA issues?
How are these discussions documented and related to the rest of
the team?

20
What are the primary research tasks, both intramural and extramural,
in your team?
How are these research tasks documented?
How are these tasks allocated among team members?

21
Are there standard forms for use in this study? If yes, list below and
note whether these are available to all anticipated users.

22
Does the team have ready access to a copy of the current approved
Quality Assurance Project Plan as well as MED's Quality
Management Plan?
Where are they kept?
Are you familiar with both?

23
Are there any extramural projects involved in this research?
How are QA/QC issues addressed in planning the extramural studies?
How is extramural QA compliance monitored?
If contracted: How do you ensure that EPA is directing the research
activities (i.e. control against contractor performance of inherently
government function) without violating rules against personal
services?


-------
24
How and by whom are your studies identified for QA reviews and
audits?
What is the frequency of these reviews?

25
ORD employs a hierarchical QA approach which reflects the
intended use of the data and the type of work being done.
To what Category (I-IV) does your project belong?
How does this affect your team QA requirements?

26
How do you measure the overall success of your QAPP and the
success of individual components, and how is this information
communicated to Laboratory or Division management?

TRAINING REQUIREMENTS

27
How does your team identify training needs?

28
Is needed training provided? How?

29
How are training records maintained?

FACILITIES EQUIPMENT AND SUPPLIES

30
List below any key facilities used in the study (e.g., research
laboratories, analytical support laboratories, exposure facilities,
animal care facilities), identify the location of each, and briefly
describe the major activities performed in support of the study.
Indicate whether each facility is adequate.
If not, briefly describe areas where improvements may be desirable or
necessary.

31
List below key instruments or equipment used in the study. As
appropriate for each instrument listed, indicate whether routine
maintenance and calibration or calibrations checks are conducted.
If yes, specify:
-- frequency of routine maintenance
-- frequency and range of routine calibrations or calibration checks
-- type of calibration standards/devices used
-- persons or organizations responsible for performing routine maintenance and calibrations/calibration checks
-- if procedures are documented in SOPs
-- if maintenance and calibration logs are kept.


-------
32
Is acceptance inspection or testing performed on any of the
instruments or equipment listed above?
If yes, list each instrument below and briefly describe inspection or
testing procedures and associated acceptance criteria.

33
Is acceptance inspection or testing performed on any supplies and
consumables used in this study?
If yes, list each supply/consumable below and briefly describe
inspection or testing procedures and associated acceptance criteria.

TEST SYSTEMS AND QUALITY ASSESSMENT

34
What types of planning documents are used to address QA issues and
who prepares these documents?

35
What types of QA assessments are performed to evaluate adherence
of intramural studies to approved plans, by whom are these
assessments conducted and to whom are the observations reported?

36
Does your team have an acceptance program for critical research
supplies and services?

37
What is the role of the Division QA Manager in dealing with this
research?

38
Have you been able to use the DQO process to your perceivable
advantage?

RECORD KEEPING AND DATA MANAGEMENT

39
Are floppy disks, logbooks, and notebooks identified with the
study/protocol number?

40
Are there procedures to ensure the security of hand recorded and
electronic data?

41
What sort of document control process does your team use to help
manage plans, logbooks, forms, records, QA guidelines, SOPs and
software?

42
What quality and security measures are taken to ensure the
defensibility of data resident on network and stand-alone computers?

43
Are provisions made for storage of both raw and reduced data?

44
Are duplicate sets of electronic and hard copy data being stored?

45
How do you identify and communicate the quality of the data?

46
Are data calculations double checked by a second staff member?

47
Is the data review and approval process documented and available?


-------
48
Is there an index list of all data, records, samples and specimens to be
maintained for this study?

49
Are all study records (e.g., floppy disks, log books, notebooks,
instrument outputs, samples/specimens, correspondence) clearly
cross-referenced (e.g., by protocol number, date, experiment
number)?
If yes, briefly describe

50
Is there an individual responsible for compiling all study data and
reporting to the principal investigator?

51
Are study records maintained in a central file?

52
Are standard forms listed above used to record and/or compile data?

53
Are hand-written records recorded in numbered or otherwise uniquely
identified notebooks or binders which are assigned to individual staff
members?

54
Are the initials of each person using a notebook or binder listed in the
front?

55
Is dark permanent ink used and are corrections made with a strikeover
and initialed?

56
Are there procedures for routine verification of the data collection
and management activities listed below?
If yes, briefly describe and note whether verifications are documented
in study records.

57
Are data reduction and analysis procedures clearly documented?

58
Have data reduction and analysis procedures been validated?
If yes, briefly describe.
Is this documented?

59
Are all data files and samples named according to a standard
convention?
If yes, briefly describe.

60
Are all data records identified with a test/sample ID number and a
protocol or study number?

61
How do you ensure that specialized computer software is developed
and documented in accordance with specifications or approved
procedures, and that it performs as required?


-------
Modeling Quality Assurance Plan for the Lake Michigan Mass Balance Project
William L. Richardson (Environmental Engineer), Douglas D. Endicott (Environmental Engineer), and Kenneth R. Rygwelski (Environmental Scientist)
USEPA, ORD, NHEERL, MED-Duluth, Community-Based Science Support Staff
Large Lakes Research Station
9311 Groh Road
Grosse Ile, MI 48138
SUMMARY
A quality assurance plan has been developed for all modeling aspects of the Lake Michigan Mass Balance Project (LMMBP). This has been a difficult undertaking, but the successful completion of a complex, multi-media project involving multiple organizations, investigators, and models is better ensured when the concepts in these plans are implemented.
INTRODUCTION
With the ever-increasing complexity and costs of environmental and ecosystem protection and remediation, the USEPA is placing more emphasis on ensuring the quality and credibility of scientific tools, such as models, that are used to help guide decision-makers who are faced with difficult management choices in these areas. The Agency has issued several documents covering broad requirements for the development and use of mathematical models, and these were used in formulating the Modeling Quality Assurance Plan (MQAP) for the Lake Michigan Mass Balance Project (LMMBP). The MQAP is a stand-alone document, separate from, but related to, the LMMBP Quality Assurance Project Plan (QAPP). Because guidance for modeling QA plans is new and somewhat limited, the LMMBP MQAP could be viewed as a prototype, particularly for projects that involve holistic, multi-media modeling approaches for large systems like the Great Lakes.
The LMMBP models include hydrodynamics, sediment transport, eutrophication, chemical
transport and fate, and food chain bioaccumulation. In addition, the MQAP report (Richardson
1999) includes the quality assurance (QA) process for the development of atmospheric models
used to describe the emission of the current-use herbicide, atrazine, from the agricultural lands in
the U.S. and its transport and deposition to the lake. It also includes the quality assurance
process for the estimation of tributary and atmospheric depositional loads for measured atrazine,
as well as persistent, bioaccumulative chemicals including polychlorinated biphenyls (PCBs),
trans-nonachlor (TNC), and mercury.
This plan does not cover the quality assurance process for field collection and laboratory
analyses. These topics are covered in the LMMBP QAPP (USEPA, 1997a,b,c,d,e,f).
THE LAKE MICHIGAN MASS BALANCE PROJECT OVERVIEW
The LMMBP was initiated by the USEPA, Great Lakes National Program Office (GLNPO) in
cooperation with the USEPA, Office of Research and Development (ORD) and other federal and
state agencies. The project was initiated in response to regulatory mandates contained in the
Great Lakes Water Quality Agreement (GLWQA) between the United States and Canada and
federal legislation that requires the development of "Remedial Action Plans" (RAPs) and
"Lakewide Management Plans" (LaMPs). The purpose of the LaMPs is to restore and maintain
the chemical, physical, and biological integrity of the waters of the Great Lakes Basin
ecosystem. The USEPA also intends that the LaMP process serve as the basis for the
development of State Water Quality Management Plans. This project also has implications and
applications to the Great Lakes Binational Toxics Strategy (Virtual Elimination Strategy) and the
Great Waters Program.
The primary goal of the LMMBP is to develop a sound scientific base of information to guide
future toxic load reduction efforts at the federal, state, tribal, and local levels. Objectives
include: (1) determination of relative loading rates of critical pollutants from major sources to the
Lake Michigan Basin; (2) evaluation of relative loading rates by media (tributaries, atmospheric
deposition, contaminated sediments) to establish a baseline loading estimate to gauge future
progress and load reductions; (3) development of the predictive ability to determine magnitude
of chemical concentration reductions in water, sediment, and biota due to specific load reduction
scenarios and the time to realize those reductions; and (4) improving the understanding of key
environmental processes which govern the cycling, dynamics, and availability of contaminants
within relatively closed ecosystems.
BASIC MODELING CONCEPTS
As this paper is intended for a non-modeling audience, a few basic modeling concepts will be presented. All models of natural systems, whether for air, water, groundwater, or sediment, are based on one or more of the thermodynamic principles of the conservation of mass, momentum, and energy. Air and water transport models usually include all three. Mass balance models include conservation of mass with implied inclusion of the other two. Bioaccumulation models are usually based on mass conservation principles, but if population dynamics are modeled, then energy conservation is included. In general, for any medium, the modeling equations are solved to produce computed concentrations in space and time, which are functions of the sum of the mass inputs, outputs, and transformations occurring in each model segment:
C(t,x,y,z) = f(\sum Mass_{in} - \sum Mass_{out} \pm Mass_{transformed})    (1)
Figure 1 illustrates hypothetical model output as a function of time. The computed chemical
concentration for a point (or spatial segment) in space is represented by the solid line. Field data
representing that point in space and time are represented by the box with error bars. The error
bars represent the data variance due to collection and analytical laboratory variability and any
spatial and temporal averaging. Model credibility is determined in part by the ability to calibrate
computed model concentrations to a set of measurements in time and space. Ultimately model
validity can only be determined by comparing predicted concentrations to a set of measurements
taken independent of calibration and under different environmental or loading conditions. When
a model is deemed acceptable, it can be used to extrapolate concentrations in time and space.
Also, a model can be used to determine the relationship between loading reductions and
concentrations, at various times following the load reduction (Figure 2). As the last example of
mass balance utility, the model can be used to construct the mass budget of a system at any point
in time (Figure 3). The mass budget quantifies the chemical-specific inputs, outputs, and
transformations computed in the model, either for the entire system or for specific spatial
segments of interest.
Figure 1. Hypothetical Model Computation (concentration vs. time at point x,y,z)
Figure 2. Hypothetical Load-Concentration Relationship at any time, t.
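As a toy illustration of the mass-balance concept behind equation (1) and Figure 1, the sketch below steps a single well-mixed segment through time; the volume, load, and loss rate are hypothetical and are not LMMBP values:

# One-segment mass balance: dM/dt = load_in - k*M, with C = M / V
volume = 5.0e9     # segment volume, m^3 (hypothetical)
load_in = 200.0    # chemical load, kg/day (hypothetical)
k = 0.002          # lumped first-order loss rate (outflow, settling, decay), 1/day
dt = 1.0           # time step, days

mass = 0.0         # chemical mass in the segment, kg
for day in range(1, 3651):
    mass += (load_in - k * mass) * dt          # explicit Euler step of the mass balance
    if day % 365 == 0:
        concentration = mass / volume * 1.0e6  # kg/m^3 converted to mg/m^3
        print(f"year {day // 365:2d}: C = {concentration:.2f} mg/m^3")

The computed concentration rises toward a steady state (the load divided by the loss rate, divided by the volume), which is the kind of trajectory sketched as the solid line in Figure 1.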
GENERAL MODEL DATA REQUIREMENTS
In general, models require three types of data: (1) input, (2) calibration, and (3) validation.
Input data include such items as loads, initial
conditions, boundary conditions, and process
rates. Calibration data include chemical
concentrations at strategic points in space and
time. Validation data are obtained for the system
under different loading and environmental
conditions. Data collected to satisfy model
requirements are usually better than "data of
opportunity" collected for some other purpose.
Synoptic (contemporaneous) information is also
desirable and possibly necessary. In other words,
data collected in the lake (or whatever media is of
concern) should be collected during the same
period as the loads and boundary conditions. It is also essential that modelers be directly
involved with the designation of Data Quality Objectives (DQOs) and the design of the data
collection program including the specification of data acceptance criteria, sampling locations and
times, and frequency of sampling. This was the case for LMMBP.
MODELING APPROACH
A general approach to the multi-media modeling processes has evolved from modeling the Great
Lakes that may be useful as guidance for others. The basic premise is to start with the problem
and specific management questions and fit the model specifications to the problem. Otherwise a
wrong or inadequate model with no chance of really helping with the decision making process might be developed.
Figure 3. Hypothetical Mass Budget for whole system at time t.
This approach includes the following steps:
1)	Determine specific management questions for the problems being addressed.
2)	Define the appropriate modeling framework needed to address these questions.
3)	Propose alternative modeling/project designs for management review to narrow the range of
expectations and costs.
4)	Using historical data and available computer programs, construct a preliminary screening
model to test the sensitivity of various model components.
5)	Perform statistical analyses of historical data to determine optimal sampling designs.
6)	Make specific sampling design recommendations.
7)	Maintain a continuing dialog with other groups involved with the project.
8)	Work with investigators who collected and analyzed samples to conduct a joint "data quality
assessment."
9)	Evaluate data replicates and other QA notations to determine appropriate uses of data.
10)	Update model process algorithms as necessary according to current theories.
11)	Develop, calibrate, and validate the models. Testing includes comparison of calculated concentrations to field data and adjustment of model parameters within realistic and justifiable ranges to obtain a model fit that satisfies the statistical criteria established in the MQAP (a minimal sketch of such a comparison follows this list). Ideally, models are confirmed by comparing predictions to results. However, this may not be possible until after decisions are made and implemented, which is the primary reason that model uncertainty must be defined.
12)	Conduct uncertainty analysis to estimate reliability of predictions.
13)	Provide answers to specific management questions by simulating conditions under alternative
load management scenarios.
14)	Document models and results.
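A minimal sketch of the kind of model-to-data comparison called for in step 11; the observed and computed values are hypothetical, and the acceptance criterion mentioned in the comment is illustrative rather than the criterion established in the MQAP:

from math import sqrt

observed = [4.2, 3.8, 5.1, 4.6, 3.9]   # hypothetical field data at matched stations/times
computed = [4.0, 4.1, 4.8, 4.9, 3.5]   # hypothetical model output at the same points

n = len(observed)
rmse = sqrt(sum((o - c) ** 2 for o, c in zip(observed, computed)) / n)
relative_bias = sum(c - o for o, c in zip(observed, computed)) / sum(observed)

# A calibration might be accepted only if, for example, the absolute relative bias is
# below some percentage and the RMSE is small relative to the variability of the field data.
print(f"RMSE = {rmse:.2f}, relative bias = {relative_bias:+.1%}")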
One important aspect of this approach is the use of a screening-level model for project design.
This task may take a year or longer to complete and may require its own MQAP and review
process. Sometimes the screening-level model may be used for certain aspects of decision making
before the project model is developed and applied. Model integrity can be improved by
implementing screening models for various space/time scales and by using alternative
approaches and computer programs and comparing the results.
LMMBP MODELING FRAMEWORK
The modeling framework for the LMMBP that resulted from the design process is shown in
Figure 4. This framework is being applied to Lake Michigan at three levels of spatial resolution.
The greater the level of spatial resolution, the more effort in terms of computational power/time
and effort and time spent in calibration. Greater resolution can also mean more state variables
and/or more complex process descriptions. A more resolute model can simulate the ecosystem in
greater detail, but if the resolution is not properly constrained by observations, then the resulting
model can be less reliable than a simpler model. The question that always arises is, "What is the
optimal resolution?" Modelers and managers must reach some agreement about this. Usually,
this is done by estimating the resources needed at different levels of effort along with the
estimate of model uncertainty at each level.

-------
Figure 4. Lake Michigan Modeling Framework
GUIDANCE FOR PREPARING
THE LAKE MICHIGAN MQAP
The mandate to prepare an MQAP for the LMMBP was given to the LMMBP modeling workgroup by GLNPO management. The project was already well underway, and GLNPO was undergoing a QA audit from which the mandate for the MQAP originated. At that time, project planning documents began to be formulated using some general guidance that was available (USEPA 1991, SUNY 1993, USEPA 1994, USEPA 1995, ASTM 1992). The most specific guidance was provided in the "Quality Assurance Guidelines for Modeling Development and Application Projects: A Policy Statement" (USEPA 1991). This policy applied primarily to the development and application of relatively simple modeling projects, unlike the LMMBP, which involves numerous, linked multi-media models. The policy was used as a basis, supplemented with additional items provided by the other documents and by our own experience and history in developing and applying Great Lakes models over the past 28 years. Essential factors of a good modeling project include:
1)	Qualified personnel including education, training, experience, expertise, integrity, and
publication record.
2)	Infrastructure, including laboratories and offices, computers, software tools, supporting
administrative staff and progressive and supportive management.
3)	Adequate extramural research budgets for acquisition of expertise beyond that of the in-
house research staff.
4)	The administrative means to include extramural researchers and contractors via cooperative
agreements and contracts including the ability to build coordinated teams and partnerships
directed at answering relevant scientific and management questions.
5)	Interaction within the scientific and engineering communities at scientific meetings and
workshops and through publication in journals to ensure use of the most currently
accepted scientific theory.
6)	Professional engineering judgement.
7)	Computer programming support to implement the theory into computer code.
8)	Verification of computer code and calculations.
9)	Evaluating and reporting uncertainties of calculations and stating assumptions, qualifications,
and caveats which could affect research application to regulatory problem-solving.
10)	Peer review of research including theoretical construct, computational methodology,
appropriateness of application, assumptions, and interpretations.
11)	Common sense and hard work.

-------
LMMBP-MQAP FORMAT
Lacking any other guidance for actually preparing a written report, ORD's Handbook for
Preparing Office of Research and Development Reports was used (USEPA 1995). The report
evolved after several drafts into three main chapters with five appendices. Chapter 1,
Introduction, presents information related to the mandate for preparing the MQAP, general
considerations for modeling quality assurance, and background and history of the primary air and
water modeling programs in EPA and NOAA.
Chapter 2 follows the Duluth policy outline (USEPA 1991), with some additions, and presents
"common quality assurance topics as applied to all project models." Topics such as project
description, quality objectives and acceptance criteria, products, and timetable, project personnel,
and support facilities, general modeling approach, and data quality, peer review reduce
redundancy in presenting information for individual models in Chapter 3.
Chapter 3 focuses on the individual models and each section was prepared by the principal
modeler(s). These follow the Duluth policy guidance (USEPA 1991). Each model framework is
described along with specific procedures for model calibration, uncertainty analysis, etc.
The appendices include: A) The original "Lake Michigan Mass Balance Project Modeling
Work Plan" (USEPA 1995a); B) The LMMBP Modeler's Curriculum Vitae; C) description of
the "Revision Code System" used by modelers in tracking versions of computer code; D) Project
Approvals and E) Model Development and Progress. As the project evolves, modifications to
the models will be included as either further appendices or as addenda rather than modifying the
body of the report.
QUALITY CONTROL
The primary objective of QC for modeling is to make sure that the modeling theory is
implemented correctly by the computer program. To accomplish this in practical ways, a list of
requirements was developed and agreed to:
1)	All modeling activities including data interpretation, load calculations or other related
computational activities are subject to audit and/or peer review, so careful written and
electronic records should be kept for all aspects of model development and application.
2)	Written rationale will be provided for selection of models or versions of models like WASP4
or WASP-IPX, SEDZL, etc.
3)	As modeling computer programs are modified, the code will be checked and a written record
made as to how the code is known to work (i.e., hand calculation checks, checks against
other models, etc.). This should include input and output, if appropriate, or results of external calculations used to confirm the code (a minimal sketch of such a check follows this list).
4)	If historical data are used, a written record on where this was obtained and any information
on its quality will be maintained. A written record on where this information is located on a
computer or server will be maintained.
5)	If new theory is incorporated into the model framework, references for the theory and how it
is implemented in any computer code will be documented.
6) All new and modified computer codes will be documented. This should include internal
documentation, as revision notes in program headers, and external documentation, in user's
guides and supplements.
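As a minimal example of the hand-calculation check described in item 3 (the decay model, rates, and tolerance here are hypothetical stand-ins for a modified model routine), a numerical solution can be compared against the known analytical solution before results are accepted:

from math import exp, isclose

def euler_decay(c0: float, k: float, t_end: float, dt: float) -> float:
    """Explicit Euler solution of dC/dt = -k*C, standing in for a modified code."""
    c, t = c0, 0.0
    while t < t_end:
        c += -k * c * dt
        t += dt
    return c

c0, k, t_end = 10.0, 0.05, 30.0
numerical = euler_decay(c0, k, t_end, dt=0.01)
analytical = c0 * exp(-k * t_end)   # hand calculation: C(t) = C0 * exp(-k*t)

assert isclose(numerical, analytical, rel_tol=1e-3), "code check failed"
print(f"numerical = {numerical:.4f}, analytical = {analytical:.4f}")

The comparison, its inputs, and its outputs can then be kept as part of the written record required above.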
Audits of each modeler's work will be conducted periodically by the Agency QA auditing team, the project QA officer, the MED-Duluth QA officer, or one or more of their designees.
PEER REVIEW
The Agency has provided very clear guidance for peer review of modeling projects that are
involved with regulatory decision making (USEPA 1994). Since the LMMBP will be used in the
LaMP process and perhaps in establishing "total maximum daily loads" (TMDL), it was
determined that the modeling activities should undergo an ongoing peer review process. A
group of scientists and engineers was invited to become members of a "Science Review Panel"
(SRP) and to participate in a series of reviews. It was thought that involving the SRP throughout
the modeling process, in contrast to a single peer review at the end of the project, would result in
a better product and would provide time to resolve problems. The first review was held in June
1998. As a result of that review, the SRP submitted a review report including a list of questions
and concerns. Each of these was addressed by the modeling workgroup in a response report. It
is expected that the SRP will meet at least once per year until the end of the project.
CONCLUSIONS
In the final analysis, the quality of the work and the reliability and credibility of the models will
be enhanced by the preparation and implementation of the MQAP. But most importantly, the
quality of the modeling work is determined primarily by the desire and integrity of the project
personnel. History has shown that mathematical models of Great Lakes water quality have been
reliable in predicting future conditions in support of regulatory and remedial decisions, and have
contributed to the success of these efforts. The Lake Michigan modeling efforts build on this
long history of model development by the ORD's Great Lakes Modeling Program at Grosse Ile,
Michigan; the Modeling Program at Research Triangle Park, North Carolina; the experience of
the National Oceanic and Atmospheric Administration (NOAA) Great Lakes hydrodynamic
modeling program at the Great Lakes Environmental Research Laboratory (GLERL) in Ann
Arbor, Michigan; the Modeling Program of the U.S. Army Corps of Engineers at Waterways
Experiment Station, Vicksburg, Mississippi; and several other federal, private, and academic
organizations. It is hoped that the MQAP fulfills the requirements of Agency QA managers and will
contribute to the overall success of the project.
REFERENCES
USEPA. 1999. The Lake Michigan Mass Balance Project: Quality Assurance Plan for Mathematical
Modeling. Edited by William L. Richardson, Douglas D. Endicott, Russell G. Kreis, Jr., and Kenneth R.
Rygwelski. In preparation. U.S. Environmental Protection Agency, National Health and Environmental
Effects Research Laboratory, Mid-Continent Ecology Division-Duluth, Large Lakes Research Station,
Grosse Ile, Michigan. Draft.
7

-------
USEPA. 1997a. Lake Michigan Mass Budget/Mass Balance Work Plan. U.S. Environmental Protection
Agency, Great Lakes National Program Office, Chicago, Illinois. EPA-905/R-97-018.
USEPA. 1997b. Enhanced Monitoring Program Quality Assurance Program Plan. U.S. Environmental
Protection Agency, Great Lakes National Program Office, Chicago, Illinois. EPA-905/R-97-017.
USEPA. 1997c. Lake Michigan Mass Balance Study (LMMB) Methods Compendium, Volume I:
Sample Collection Techniques. U.S. Environmental Protection Agency, Great Lakes National Program
Office, Chicago, Illinois. EPA-905/R-97-012a.
USEPA. 1997d. Lake Michigan Mass Balance Study (LMMB) Methods Compendium, Volume 2:
Organic and Mercury Sample Analysis Techniques. U.S. Environmental Protection Agency, Great Lakes
National Program Office, Chicago, Illinois. EPA-905/R-97-012b.
USEPA. 1997e. Lake Michigan Mass Balance Study (LMMB) Methods Compendium, Volume 3:
Metals, Conventionals, Radiochemistry, and Biomonitoring Sample Analysis Techniques. U.S.
Environmental Protection Agency, Great Lakes National Program Office, Chicago, Illinois. EPA-905/R-
97-012c.
USEPA. 1997f. Lake Michigan Mass Balance Data Reporting Format. U.S. Environmental Protection
Agency, Great Lakes National Program Office, Chicago, Illinois.
USEPA. 1991. Quality Assurance Guidelines for Modeling Development and Application Projects: A
Policy Statement. U.S. Environmental Protection Agency, Office of Research and Development, ERL-
Duluth, Duluth, Minnesota.
State University of New York. 1993. Reducing Uncertainty in Mass Balance Models of Toxics in the
Great Lakes-Lake Ontario Case Study. Great Lakes Program, State University of New York at Buffalo,
Buffalo, New York.
USEPA. 1994. Agency Guidance for Conducting External Peer Review of Environmental Regulatory
Modeling. U.S. Environmental Protection Agency, Agency Task Force on Environmental Regulatory
Modeling, Washington, D.C.
USEPA. 1995. Handbook for Preparing Office of Research and Development Reports, Third Edition.
Office of Research and Development, Washington, D.C. 20460. EPA/600/K-95/002.
ASTM. 1992. Standard Practice for Evaluating Mathematical Models for the Environmental Fate of
Chemicals. Designation: E978-92.
USEPA. 1995a. Lake Michigan Mass Balance Project: Modeling Work Plan (Draft). U.S. Environmental
Protection Agency, National Health and Environmental Effects Research Laboratory, Mid-Continent
Ecology Division-Duluth, Large Lakes Research Station, Grosse Ile, Michigan.
8

-------
B. Michael Ray and Brenda T. Culpepper
U.S. EPA, NHEERL, Research Triangle Park
NHEERL APPROACH TO TECHNICAL
SYSTEMS REVIEWS OF HEALTH
EFFECTS STUDIES
Mike Ray and Brenda Culpepper
ORD/NHEERL-RTP/4-99

TSR Requirement
Agency mandate requires NHEERL to establish a
Quality Management System designed to assure the
quality and defensibility of all our environmental data.
Integral to that System are elements that address the
planning, implementation, and assessment of the
research efforts that generate NHEERL's
environmental data.
ORD/NHEERL-RTP/4-99

-------
THE ORD/NHEERL RESEARCH PROCESS
ORD/NHEERL-RTP/4-99

-------
THE ORD/NHEERL RESEARCH PROCESS
ORD/NHEERL-RTP/4-99

TSR Objectives
~	Assess whether a group is likely to achieve results of
sufficient quality to meet the stated objectives of a
task or project
~	Improve the understanding of the need to ensure the
defensibility and reconstructibility of data
ORD/NHEERL-RTP/4-99

-------

TSR Awareness Training
~	Mandatory TSR Training
~	277 Scientists Attended 22 Training Sessions
ORD/NHEERL-RTP/4-99
Types of Studies and Facilities Reviewed
~	Extramural and Intramural Epidemiology Field Studies
~	Human and Animal Controlled-Exposure Studies
~	Pathology Studies
~	Bioassays
~	GIS Modeling Studies
~	GLP-Mandated Study
ORD/NHEERL-RTP/4-99
6

-------
TSR PROCESS OVERVIEW
~	Who's Involved?
~	Principal Investigator and Support Staff
~	NHEERL-RTP QA Staff
~	One primary QA reviewer
~	One primary technical reviewer
~	Branch Chief and/or Project Officer
~	Frequency of TSRs
~	All Category I and II Studies — At Least Once
~	10% of All Research Studies Annually
TSR PROCESS OVERVIEW
~	Selection of Studies
~	Requests on the QA Review Form
~	Requests by Line Managers
~	Requests by QA Staff
~	Use of Checklist
7
ORD/NHEERL-RTP/4-99
8
ORD/NHEERL-RTP/4-99

-------
TSR CHECKLIST
[Screen image of the TSR checklist document]
9
ORD/NHEERL-RTP/4-99

-------
TSR CHECKLIST
[Screen image of the TSR checklist document]
ORD/NHEERL-RTP/4-99
Checklist
~	Based on the Following
	~	IRP
	~	QAPP
	~	OP
	~	Data Management Plan
10
ORD/NHEERL-RTP/4-99

-------
CHECKLIST QUESTIONS FOR EPIDEMIOLOGY STUDIES
~	Consent Forms
~	Questionnaires
~	Coded Data Entry
~	Physiological Testing
~	Blood, Urine, and Stool Samples
~	Migration of Data
~	Data Archival
11 ORD/NHEERL-RTP/4-99
CHECKLIST QUESTIONS FOR INHALATION STUDIES
~	Test Atmosphere Generation
	~	Aerosol vs. Gas vs. Particulate Matter
~	Test Atmosphere Analysis
	~	Safety and Quality Issue
~	Medical Surveillance
	~	Physical Exam
~	Physiological Data Acquisition
	~	Pulmonary Function Testing, Heart Rate Variability
~	Environmental Conditions
	~	Temperature, Humidity, pH, Sound Level
12 ORD/NHEERL-RTP/4-99

-------
CHECKLIST QUESTIONS FOR PATHOLOGY
~	Tissue Extraction, Preservation, Staining Techniques
~	Labeling of Slides and Blocks
~	Blind-Reading of Slides by Pathologist
~	Use of Independent Pathologist when Warranted
~	Archival of Slides and Blocks
~	Incorporation of Pathology Findings in Study Report
13	ORD/NHEERL-RTP/4-99
CHECKLIST QUESTIONS FOR GIS MODELING STUDIES
~ Secondary Data
~	Birth Certificates
~	Number of Full-Term Live Births and Premature Live
Births
~	Birth Weights
~	Mothers' Residences
~	City Water Districts Information
~	Water Treatment
~	Extent of Piping Network
~	Water Analysis
14	ORD/NHEERL-RTP/4-99

-------
CHECKLIST QUESTIONS FOR GIS MODELING STUDIES
~ Validation of GIS Software
~	Modeling Results vs. Hydraulic Measurements
~	Modeling Results vs. Water Analysis
15	ORD/NHEERL-RTP/4-99
BENEFITS OF TSRs
~	Additional Field Staff Provided for an EPI Study
~	Additional Funding Provided for Vital Records Archival
~	Backup Power to Sample Freezer
~	Improved Storage and Chain-of-Custody for Test Substance
~	Improved Extraction Methods for Air Sample Filters
~	Support for Contract Negotiations
16 ORD/NHEERL-RTP/4-99

-------
ANYTHING CAN HAPPEN DURING A FIELD STUDY!
~	Robbery at Gun Point
~	Unexpected Exposure to Nudity
~	GI Distress—Green vs. Red
~	Long Hours
~	Delightful Test Subjects
~	Nifty Thrifty
17 ORD/NHEERL-RTP/4-99

-------

-------
USING FIELD METHODS - EXPERIENCES AND LESSONS:
DEFENSIBILITY OF FIELD DATA
Barton P. Simmons
Chief, Hazardous Materials Laboratory
Department of Toxic Substances Control
2151 Berkeley Way, Room 515
Berkeley, CA 94704
bsimmons@dtsc.ca.gov
SUMMARY
One perceived obstacle to the use of field methods is the legal defensibility of field data. The
standards which are used by the courts are quite different than the standards used in the
environmental testing community. The rules on the acceptability of scientific evidence are
different in federal courts than in some state courts. The federal rules were changed significantly
by the Daubert v. Merrell-Dow decision handed down by the U.S. Supreme Court in 1993. In
that decision, the Supreme Court gave judges considerable latitude to decide what evidence was
relevant and reliable. California, on the other hand, still uses a standard based on "techniques
which are generally accepted by the scientific community."
Neither the federal nor California standards for admissibility distinguish between analysis done
in a fixed laboratory and analysis done in the field. Nor do the standards require adherence to
methods approved by U.S. EPA or other standard-setting organizations. In one California case,
People v. Hale, there were major deviations from the relevant EPA method, but an appeals court
found that the deviations were harmless and allowed the data to be used.
In order for data to be accepted as evidence, whether the data come from a fixed laboratory or the
field, the technique may need to be generally recognized in the scientific community (state
standard) and must be shown to be relevant and reliable (federal standard). Once evidence has
been accepted, the weight which is given to the evidence may depend on a variety of factors,
including the training and experience of the personnel, the accuracy of the equipment, and the
reliability of the method. The rules for the defensibility of field methods are no different than
those for fixed laboratory methods.
INTRODUCTION
A real obstacle to the wider use of field methods is the perception that field data are legally less
defensible than fixed laboratory data. To examine this perception, it is necessary to examine the
legal standards actually applied to scientific data. Although environmental scientists have their
own standards for analysis, the standards for the legal defensibility of scientific data involve the
interaction of science and law. The courts have made significant changes in recent years to the
rules for scientific evidence, changes which culminated in the U.S.

-------
Supreme Court opinion in the case of Daubert v. Merrell Dow Pharmaceuticals.
FEDERAL RULES FOR SCIENTIFIC DATA
First, we must realize that the rules for scientific data may be different in federal courts than in
state courts. This, however, does not necessarily pose an insurmountable problem. The federal
rules changed in 1993 when the U.S. Supreme Court issued an opinion in the case of Daubert v.
Merrell Dow Pharmaceuticals. Although the case involved allegations that a drug, Bendectin,
caused birth deformities, the ruling had a broad application because it abandoned an earlier
standard, based on Frye v. United States. In its 1993 Daubert ruling, the court established a
more flexible and liberal test of admissibility of scientific evidence. The Supreme Court
received a considerable number of briefs from scientific organizations, and this is reflected in
its opinion, which even dealt with the definition of science.
"...under the Rules the trial judge must ensure that any and all scientific testimony or
evidence admitted is not only relevant, but reliable (Daubert v. Merrell Dow
Pharmaceuticals, 4827)".
Readers who are interested in a thorough examination of the Daubert ruling may want to look at
Foster and Huber's book, Judging Science. The question of what constitutes reliable scientific
evidence is still subject to debate, but the impact of the Court's ruling was to give the judge
considerable flexibility in deciding that question in a particular case.
STATE RULES FOR SCIENTIFIC DATA
Unlike the federal courts, California courts still maintain a standard based on "general
acceptance" in the relevant scientific community (People v. Kelly, 1976). The three "prongs" of
this standard are:
1)	The scientific test's reliability must be established by its general acceptance in the
relevant scientific community;
2)	The testifying witness must be properly qualified; and
3)	The proponent of the evidence must demonstrate that the correct scientific procedures
were used.
Again, none of these standards would distinguish field methods from fixed laboratory methods.
They also should not pose a significant barrier, with the exception of a "black box," which may
operate using principles that have not been accepted in the scientific community.
CASE HISTORIES
People v. Hale, 1994: The first line of this California Appellate Court ruling reads:
"SW-846 is not the name of some new gasoline additive marketed by an oil company. It
is the title of a manual compiled by the United States Environmental Protection Agency (EPA) dealing

-------
with the collection and testing of hazardous waste."
The case involved illegal dumping of 1,1,1-trichloroethane into waste dumpsters. The appeal
focused on major deviations from SW-846: no sampling plan was used; the lab had used Method
8015 (with a flame-ionization detector) instead of the accepted Methods 8010 or 8240; the
samples were frozen instead of being cooled to 4°C; and the 14-day holding time was exceeded. The
court held that the deviations were harmless.
"We discern no per se rule which does automatically precludes the introduction of
evidence of disposal of hazardous waste just because the gathering of the sample does not
follow every jot and tittle of the EPA manual."
People v. K&L Plating, 1997: Although this is not a case published by an appellate court, this
case involved the use of field methods. This was a manslaughter case, in which a worker died
after rescuing another worker who was cleaning out sludge in a waste treatment tank. The
prosecution used results from Draeger tube testing of the head space in a jar of sludge and a
hydrogen cyanide monitor as evidence that hazardous levels of hydrogen cyanide were emitted
from the waste. The defense challenged the reliability of all of the data. Review of validation of
the Draeger tube showed that a lower estimate of HCN concentration could be calculated even
though the tube changed color after one stroke instead of the required ten strokes. The
prosecution argued that the HCN monitor operated on an accepted principle and provided an
expert witness to support the data. The defendant pleaded guilty.
People v. Sangani, 1994: This case involved illegal disposal of hazardous waste into a sewer
system. The defendant was convicted, but appealed, in part, because the lab which did the
analysis was not certified by the California Department of Toxic Substances Control. The
Appellate Court found that even if the Hazardous Waste Control Law required the use of an
accredited lab, the data would be admissible.
"Failure to follow precise regulatory or statutory requirements for laboratory tests
generally does not render the test results inadmissible, provided the foundational
requirements for establishing the reliability of the tests are met. The necessary
foundational requirements are:
(1)	the testing apparatus is in proper working order;
(2)	the test was properly administered; and
(3)	the operator was competent and qualified. (People v. Sangani, p. 1276)"
People v. Adams: In what has been described as an explanation of the general rule of evidence in
California, the court found:
"Where a-statute ...does not specifically provide that evidence shall be excluded for
failure to comply with said statute...such evidence is not inadmissible. Statutory
compliance or noncompliance goes to the weight of the evidence {People v. Adams,
567)."

-------
THE APPLICATION OF RULES OF EVIDENCE
The legal cases which established the rules of evidence arose primarily to deal with new
scientific techniques (e.g., a crude predecessor to the lie detector) or to distinguish real science
from "junk science." The rules for admissibility of evidence illustrated in the examples
above should pose little problem for a validated technology operated correctly by a
trained operator.
CONCLUSION
The rules on the legal defensibility of scientific data do not distinguish between measurements
made in the field and measurements made in the laboratory. The rules used by the courts are
very different than those established in regulation. In particular, courts have found that evidence
may be reliable even if there were major deviations from methods specified in regulation, or if
the analysis was done in a non-accredited laboratory, even if accreditation were required by
regulation. As to the weight given to evidence, the validation of the method and the
quality system documentation are certainly relevant.
REFERENCE LIST
Foster, K.R., and P.W. Huber, 1997. Judging Science: Scientific Knowledge and the Federal
Courts: MIT Press.
People v. Adams, 59 Cal.App. 3d at 567 (1976).
People v. Hale, 29 Cal.App. 4th 730 (1994).
People v. Kelly, 17 Cal.3d 14 (1976).
People v. Sangani, 94 C.D.O.S. 1273 (1994).
U.S. EPA Office of Solid Waste. Test Methods for Evaluating Solid Waste, Physical/Chemical
Methods (SW-846).

-------
22

-------
23

-------
EXTENDED ABSTRACT
GUIDE TO STANDARDS FOR
EPA QUALITY MANAGERS
Jeffrey C. Worthington, CQA,CQM
Quality Assurance Division
National Center for Environmental Research and Quality Assurance
USEPA Office of Research and Development
401 M Street S.W. (89724R)
Washington, DC 20460
Individuals responsible for managing the quality of environmental and related measurements
traditionally must have a working knowledge of specific standards. Until recently, the types of
standards encountered were limited to EPA and other Federal agency guidance/requirements and
standard methodology for sampling and analysis. In 1996, the EPA adopted a standard
developed through the American Society for Quality. This standard, Specifications and
Guidelines for Quality Systems for Environmental Data Collection and Environmental
Technology Programs ANSI/ASQC E4-1994, was adopted under the statutory authority of the
National Technology Transfer and Advancement Act of 1995.
In the particular case of E4, a "quality standard" was referenced which also is an "environmental
standard." Other quality and environmental standards may be useful to quality managers
working in enforcement, environmental investigations, and research. Also, many of the
standards that address products or processes may be useful, but it is difficult to assess their
potential usefulness without a road map to the resources. Yes, "the truth is out there."
This technical presentation presents a road map to standards which may be useful to the
environmental quality manager.
The presentation considers the history of standards, the current standards under development,
resources for new standards, and an overview of the various parties responsible for standards
development and maintenance as well as Internet addresses for the resources.

-------
ISO GUIDE 25 TO INTERNATIONAL STANDARD ISO 17025
Paul Mills
Mentorprises Corporation
11619 Charter Oak Court, #202
Reston, VA 20190-4513
This paper addresses the move to change the existing International Organization for Standardization
(ISO) Guide 25, "General requirements for the competence of calibration and testing laboratories,"
to ISO Standard 17025, "General requirements for the competence of testing and calibration
laboratories." The reasons for the proposed changes and their implications for the US
marketplace are discussed. Differences in content and coverage between the Guide and draft
Standard are presented. The relationships of the ISO Guide and proposed Standard to other ISO
Standards, and to NELAC, are explained. Potential benefits and costs of this proposed Standard
are listed.

-------

-------
25

-------
26

-------

-------
&EPA
April 13-16, 1999
Regal Cincinnati Hotel
Cincinnati, Ohio
U.S. Environmental Protection Agency
presents the
18th Annual National Conference on
Managing Quality Systems
for Environmental Programs
with	Preconference Training Courses
April 11-12, 1999
Agenda At A Glance
Sunday, April 11, 1999

7:30 am - 8:30 am
8:30 am - 5:00 pm
1:00 pm - 5:00 pm
Registration, Grand Ballroom Foyer
Preconference Training Course: Integrating Quality Assurance into Project Development, Bronze B
Preconference Training Course: NELAP Accrediting Authority Assessor Training (State and Federal employees only), Colonnade B
Monday, April 12, 1999

7:30 am - 8:30 am
8:30 am - 5:00 pm
8:30 am - 5:00 pm
8:30 am - 5:00 pm
8:30 am - 5:00 pm
7:00 pm -10:00 pm
Registration, Grand Ballroom Foyer
Preconference Training Course: Integrating Quality Assurance into Project Development (Continued), Bronze B
Preconference Training Course: NELAP Accrediting Authority Assessor Training (Continued), Colonnade B
Preconference Training Course: Introduction to EPA Quality System Requirements for Environmental Programs, Colonnade A
Preconference Training Course: Introduction to Management Systems Review Process, Pavilion
Regional QA Managers Meeting (EPA only), Pavilion
Tuesday, April 13,1999
7:30 am - 8:00 am
8:00 am -11:30 am
Registration, Grand Ballroom Foyer
Plenary Session: Keynote Address, Special Reports, Grand A
11:30 am-1:00 pm
Lunch Break

1:00 pm - 2:00 pm *
2:00 pm - 3:00 pm
3:30 pm - 5:00 pm
Technical Plenary Session: Report of the Inter-Governmental Task Force on Data Quality, Grand A
Technical Plenary Session: Implementing Performance Based Measurement Systems in a Regulatory Setting-Issues and Answers, Grand A
Discussion Breakout Sessions, Grand A
Wednesday, April 14, 1999	Grand A
Grand B
Colonnade
8:00 am - 9:30 am
Concurrent Sessions
Session Wll: QA for Secondary
Data Use-Lessons Learned
Session W12: Implementing NELAP
Status Report
Session W13: The QA Professional as
an Expert Witness in Court
10:00 am -11:30 am
Concurrent Sessions
Session W21: Environmental QA/QC
Practices-Emerging Issues in Research
Session W22: Implementing QA in
Environmental Labs
Session W23: A Sample is a
Sample is a Sample
11:30 am -1:00 pm
Lunch Break


1:00 pm - 2:30 pm
Concurrent Sessions
Session W31: Information Management
for the QA Professional (Mini-Workshop)
Session W32: Environmental QA/QC
Practices-Emerging Issues on
Questionable Practices
Session W33: Benchmarking-
Processes and Lessons Learned
3:00 pm - 4:30 pm
Concurrent Sessions
Session W41: Information Management
for the QA Professional
(Mini-Workshop continued)
Session W42: Putting QA into
Operation: Planning, Implementation,
and Assessment
Session W43: Environmental QA/QC
Practices-Oversight and Implementation
Thursday, April 15, 1999	Grand A
Grand B
Colonnade
8:00 am - 9:30 am
Concurrent Sessions
Session Til: Cost of Quality-
Real Dollars or Otherwise
Session T12: Practical Approaches to
Quality Auditing (Mini-Workshop)
Session T13: Implementing QA in
Health and Ecological Research
10:00 am -11:30 am
Concurrent Sessions
Session T21: Field Analytical Methods-
Experiences and Lessons Learned
Session T22: Practical Approaches
to Quality Auditing
(Mini-Workshop continued)
Session T23: What's Happening in
International Standards and Why You
Should Care
11:30 am - 1:00 pm
Lunch Break
1:00 pm - 2:30 pm
Plenary Wrap-up Session, Reports from Breakout Sessions, Grand B
Friday, April 16,1999

8:00 am -12 noon
EPA QA Managers Meetings (Closed to public), Bronze A, Grand A
A registration and information desk will be open during the entire conference.

-------