EPA
20th Annual National Conference on
Managing Environmental
Quality Systems
St. Louis, MO
April 2-6, 2001
-------
TABLE OF CONTENTS
20th Annual National Conference on Managing Environmental Quality Systems
Welcome Letter 1
Agenda (Technical Program) 2
Hotel Floor Plan 3
Evaluation Form 4
EPA/QA Community Phone List 5
Tuesday Plenary Session 6
Leveraging Data Quality Needs and Resources 7
Methodologies for Assessing Environmental Data Quality 8
Ensuring Quality of Secondary Data 9
QA Practices in Site Remediation 10
Leveraging Quality Planning and Value Management Tools and Techniques 11
QA Oversight Applications 12
Using the Graded Approach to QA Requirements for Assistance Agreements 13
Advances in Managing Laboratory Data and Information 14
NELAC: Changes in Laboratory Quality Systems 15
Federal and State Partnerships in Environmental Decision Making 16
QA Practices for Research 17
The Role of QA in Environmental Compliance Sampling and Analysis 18
Managing Uncertainty for Environmental Decision Making 19
Data Usability in Site Assessment 20
Data Quality/Information Management I 21
EPA Data Standards Implementation Effort 22
Global Positioning Systems: EPA Policies, Procedures and QA Considerations 23
Data Quality/Information Management II 24
Geostatistical Error Management 25
E.O. 13148: Environmental Management Systems at Federal Facilities 26
Monday Training Courses 27
Participant List 28
Other Information 29
Updates 30
Notes 31
-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
APR - 3 2001
OFFICE OF
ENVIRONMENTAL INFORMATION
On behalf of the U.S. Environmental Protection Agency and the Quality Staff, it is my
pleasure to welcome you to the 20th Annual National Conference on Managing Environmental
Quality Systems. We are honored to have the opportunity to host this conference again.
As we embrace the challenges of a new millennium, I am reminded of how far we have
come in the field of environmental protection. We have made great strides in improving the
Quality Systems that support decisions on environmental issues. But I am also struck by how far
we still have to go. National and international interest in the environment seems to be consistent
with our ability to do something about the problems we identify. This conference celebrates our
commitment to using Quality Management principles to address the constantly changing issues
in environmental protection.
It is our hope that, during this conference, you will gain knowledge that will enhance
your skills and challenge you. Our panels and technical sessions reflect best practices and
innovative advances in current environmental topics. I encourage you to use this opportunity to
learn about disciplines other than your own, seek opportunities to network with peers, and
proactively share your experiences and knowledge.
The Quality Staff and I look forward to meeting you during the week. We hope that you
will find this week both rewarding and productive.
Nancy W. Wentworth
Director, Quality Staff
Internet Address (URL) • http://www.epa.gov
Recycled/Recyclable • Printed with Vegetable Oil Based Inks on Recycled Paper (Minimum 30% Postconsumer)
-------
TECHNICAL PROGRAM OUTLINE
20th Annual National Conference
on Managing Environmental Quality Systems
TUESDAY, APRIL 3, 2001 - 2nd Floor Promenade Ballrooms Conference Area
8:00 a.m. Welcome and Introduction of Keynote Speaker
Nancy W. Wentworth (Director, U.S. EPA Quality Staff, Washington, DC)
9:30 a.m. Keynote Address: The Honorable Miriam Naveira de Rodon
Justice of the Supreme Court of Puerto Rico
11:00 a.m. Awards and Recognitions
Noon Lunch Break
1:30-4:30 p.m. Keynote Presentation: Larry English
Information Impact International
WEDNESDAY, APRIL 4, 2001 - 2nd Floor Promenade Ballrooms Conference Area
8:00 - 9:30 a.m. Leveraging Data Quality Needs and Resources
Session Chair: Monica Jones, U.S. EPA
1. Improving the Quality of Data Used in Environmental Economic
Research (Clay Ogg, U.S. EPA)
2. Ensuring Data Quality Through Effective Life Cycle Management
(Darby Chellis, Jessica Yocum, Marasco Newtown Group)
3. DOE's Quality System Program: Cooperative Development and
Implementation (David Bottrell, U.S. Dept. of Energy)
Methodologies for Assessing Environmental Data Quality
Session Chair: Brenda Young, U.S. EPA
1. Radiological Data in Environmental Decision Making (Raymond Bath,
U.S. Dept. of Energy)
2. Raising the Curtain on the Gray Area (Cliff Kirchmer, Washington State
Department of Ecology)
3. A Proactive Way to Establish Useful DQOs (Evelyn Holly, Quality By
Design)
Ensuring Quality of Secondary Data
Session Chair: Patricia Lafomara, U.S. EPA
1. QA Plans for Surveys (Malcolm Bertoni, RTI)
2. Automated Data Reporting to Ensure Data Quality (Jianwen She,
Department of Toxic Substances Control, CalEPA)
3. EPA's Data Quality Strategic Plan (Cindy Bethell, U.S. EPA)
-------
9:30-10:00 a.m.
BREAK
10:00 -11:30 a.m. Quality Assurance Practices in Site Remediation
Session Chair: John Warren, U.S. EPA
1. Remedial Process Optimization (Major Jeff Cornell, United States Air
Force, HQ AFCEE Technology Transfer Division)
2. Using Field Data Analysis for Environmental Decision Making and
Subsequent Remediation at Two Example Sites (Laura Splichal, CDM
Federal Programs Corporation)
3. TBD
WORKSHOP: Leveraging Quality Planning and Value Management
Tools and Techniques to Ensure Required Data Quality for
Environmental Compliance and Decision Making (Craig Willis, Booz
Allen and Hamilton)
Quality Assurance Oversight Applications
Session Chair: Margo Hunt, U.S. EPA
1. FY2000 Highlights of QA Activities Within ORD's Largest Megalab-
NHEERL (Brenda Culpepper, U.S. EPA)
2. ISO 19011:2002 A Combined Auditing Standard for Quality and
Environmental Management Systems (Gary Johnson, U.S. EPA)
3. The Role of Field Auditing in Environmental Quality Assurance
Management (Daniel Claycomb, Environmental Standards, Inc.)
11:30 a.m. -1:00 p.m. LUNCH
1:00 - 2:30 p.m. WORKSHOP: Using the Graded Approach to QA Requirements for
Assistance Agreements (Patricia Lafornara and Louis Blume, U.S. EPA)
Advances in Managing Laboratory Data and Information
Session Chair: Vincia Holloman, U.S. EPA
1. Automated Data Review, Contract Compliance Screening, and
Environmental Database Management System Software Applications
for the Sacramento District Fort Ord Project (Richard Amano, Lab Data
Consultants, Inc.)
2. Important Considerations in Selecting and Implementing a LIMS in a
Water Quality Testing Laboratory (Kim Ryals, Washington Aqueduct
Laboratory)
3. An Innovative Approach in Defining and Producing Laboratory
Electronic Data Deliverables (Michael S. Johnson, U.S. EPA)
NELAC: Changes in Laboratory Quality Systems
Session Chair: Fred Siegelman, U.S. EPA
1. NELAC Quality Systems: The Integration of ISO/IEC 17025 and PBMS
(Scott Siders, Illinois EPA)
2. Observations of Laboratory Changes as a Result of the NELAC
Standards (Marlene Moore, Advanced Systems, Inc.)
3. Authority Review Board (Carol Madding, U.S. EPA)
-------
2:30 - 3:00 p.m. BREAK
3:00 - 4:30 p.m. Federal and State Partnerships in Environmental Decision Making
Session Chair: Allan Batterman, U.S. EPA
1. Will Anyone Ever Read the Lake Michigan Mass Balance Quality
Assurance Report? (Louis Blume, U.S. EPA)
2. EPA's Coastal 2000 Monitoring Program in the Northeast U.S. -
Consistency in Methods and Quality Assurance (Joe LiVolsi, U.S. EPA)
3. The Quality Management System as a Tool for Improving Stakeholder
Confidence (Denise K. MacMillan, Environmental and Molecular
Chemistry Branch, Army Engineer Research and Development Center)
Quality Assurance Practices for Research
Session Chair: Brenda Culpepper, U.S. EPA
1. A Lotus Notes Application for Preparing, Reviewing, and Storing
NHEERL Research Protocols (Ron Rogers, U.S. EPA)
2. Implementation of QA on Multilaboratory Studies Within the U.S. EPA
(Thomas Hughes, U.S. EPA)
3. A Solution for the Need to Have Defensible, Documented, Quality Data
(Paul Groff, U.S. EPA)
The Role of Quality Assurance in Environmental Compliance
Sampling and Analysis
Session Chair: Thomas Dixon, U.S. EPA
1. Lessons Learned in Preparing Method 29 Filters (Robert Martz, Eastern
Research Group, Inc.)
2. Passive Diffusion Bag Samplers (Dr. Javier Santillan, United States Air
Force, HQ AFCEE Technology Transfer Division)
3. Measurement Uncertainty for Environmental Programs (Marlene Moore,
Advanced Systems, Inc.)
THURSDAY, APRIL 5, 2001 - 2nd Floor Promenade Ballrooms Conference Area
8:00 a.m. - 4:30 p.m. WORKSHOP: Managing Uncertainty for Environmental Decision
Making (David Bottrell, U.S. Dept. of Energy)
Participation in this workshop is limited to 40 people. The workshop is highly
interactive and requires full participation in exercises and discussions. The
workshop stresses Data Quality Objectives process implementation and
documentation as an integrated system and suggests enhancements based on the
evolving systematic planning concept. A primary component is Visual Sample
Plan, a currently available software package that demonstrates cost, impact, and
value of alternative environmental program data collection options.
The sign-up sheet will be available at the main registration desk at 5:00 p.m.
Wednesday, April 4. Other conference participants are welcome to sit in as
observers if desired.
-------
8:00 a.m. - 9:30 a.m. WORKSHOP: Data Usability in Site Assessment (Debbie Reece,
Marasco Newtown Group)
Data Quality/Info. Mgt I
Session Chair: Jeffrey Worthington, U.S. EPA
Panel Presentations:
1. Two Databases in Every Garage: Information Quality Systems (Jeffrey
Worthington, U.S. EPA)
2. How Good is My Data?: Information Quality Assessment Methodology
(George Brilis, U.S. EPA)
3. Data Standards are Back Seat Drivers! Methodology for Incorporating
Information Quality into Quality Assurance Project Plans (Lora Johnson,
U.S. EPA)
9:30-10:00 a.m. BREAK
10:00 -11:30 a.m. WORKSHOP: EPA Data Standards Implementation Effort (Sara Hisel
McCoy, U.S. EPA)
Global Positioning Systems: EPA Policies, Procedures and QA
Considerations (George Brilis, U.S. EPA)
11:30-1:00 p.m. LUNCH
1:00 - 5:00 p.m. WORKSHOP: Geostatistical Error Mgt: Quantifying Uncertainty for
Environmental Sampling and Mapping (Jeff Myers, Washington
Group, Intl)
1:00 - 2:30 p.m. Data Quality/Info Mgt II
Session Chair: Jeffrey Worthington, U.S. EPA
1. Department of Energy Environmental Data Exchange Network Project
(Robert Murray, U.S. Dept. of Energy)
2. OEI Best Practices: Guides to Improve Information Quality (Evangeline
Tsibris, U.S. EPA)
3. (TBD)
2:30 -3:00 p.m. BREAK
3:00 - 4:30 p.m. Executive Order 13148: Requirements for Environmental Management
Systems at Federal Facilities
Session Chair: Gary Johnson, U.S. EPA
-------
Meeting Rooms at the Adam's Mark
All Conference Sessions will be held on the Second Floor of the Adam's Mark Hotel,
Promenade Ballroom. EPA meetings (Friday, April 6) will be held in the Director's Row
Meeting Rooms.
Second Floor
[Floor plan diagram: Promenade Ballrooms, guest elevators, and pre-convene area]
Affordable Restaurants in the Adam's Mark area
Wendy's (fast food): 2 blocks east
St. Louis Bread Company (eat in or carry out): 2 1/2 blocks east
Max and Erma's (lunch $7-$9 / dinner $12-$15): 1 block south, 316 Market St.
TGI Friday's (lunch $9-$12 / dinner $15-$25): 3 blocks east, 529 Chestnut
Caleco's (lunch $9-$12 / dinner $15-$25): 2 blocks east, 101 N. Broadway
Chinese (lunch $5-$8): 1 1/2 blocks north
Delectable Delights (lunch $6-$10): 4 blocks east
St. Louis Center Food Court: 4 1/2 blocks northeast; includes McDonald's, Chinese, Chick-Fil-A, pizza, Arby's, sub sandwiches, and more.
-------
CONFERENCE SURVEY
Thank you for attending EPA's 20th Annual National Conference on Managing Environmental Quality Systems,
Please take a few minutes to complete all of this survey regarding your impressions of the National Conference. This
survey will allow us to assess the conference, which will help us plan for the next conference in 2002.
Please leave your completed form at the conference registration desk. We appreciate your contribution and look
forward to seeing you next year.
1. What is your overall assessment of the conference? Please check one.
[ ] Excellent    [ ] Good    [ ] Average    [ ] Below Average
Comments:
2. How satisfied were you with the following aspects of the conference? If you did not use or experience a particular part
of the conference listed below, check N/A.
(Satisfied / Not Satisfied / N/A for each item)

Pre-registration Service
    Advance registration                                        [ ] [ ] [ ]
    Confirmation of registration received in a timely manner    [ ] [ ] [ ]
    Overall pre-registration process                            [ ] [ ] [ ]

On-site Registration
    Staff provided prompt, courteous, knowledgeable service     [ ] [ ] [ ]
    Registration materials were complete and accurate           [ ] [ ] [ ]
    On-site registration process was efficient and complete     [ ] [ ] [ ]

Conference Facilities and Services
    Hotel meeting facilities                                     [ ] [ ] [ ]
    Audio/visual                                                 [ ] [ ] [ ]
    Your hotel accommodations                                    [ ] [ ] [ ]
3. Overall, how would you rate the following aspects of the conference?
(Excellent / Favorable / Fair / Unsatisfactory for each item)
    Conference format                               [ ] [ ] [ ] [ ]
    Applicability of session topics to your work    [ ] [ ] [ ] [ ]
    Quality of speakers and their presentations     [ ] [ ] [ ] [ ]
    Written materials                               [ ] [ ] [ ] [ ]

4. How did you learn about this conference? Please check all that apply.
    [ ] Conference Brochure    [ ] EPA Quality Staff web site    [ ] Email announcement
    [ ] Colleague              [ ] Past attendee                 [ ] Other

5. What additions/improvements would you suggest for next year's conference? _______________
6. Please rank the plenary sessions and presentations that you attended.
Keynote Address by Justice Naveira de Rodon
[ ] Excellent    [ ] Good    [ ] Average    [ ] Below Average    [ ] Did not Attend
Comments:
Keynote Presentation by Larry English
[ ] Excellent    [ ] Good    [ ] Average    [ ] Below Average    [ ] Did not Attend
Comments:
-------
7. Please rank the technical sessions, workshops, and training courses that you attended.
(Excellent / Good / Average / Below Average / Did not attend for each item)
Leveraging Data Quality Needs and Resources
Methodologies for Assessing Environmental Data
Quality
Ensuring Quality of Secondary Data
Quality Assurance Practices in Site Remediation
Leveraging Quality Planning and Value Management
Tools and Techniques to Ensure Required Data Quality
for Environmental Compliance and Decision Making
Workshop
Quality Assurance Oversight Applications
Using the Graded Approach to QA Requirements for
Assistance Agreements Workshop
Advances in Managing Laboratory Data and Information
NELAC: Changes in Laboratory Quality Systems
Federal and State Partnerships in Environmental
Decision Making
Quality Assurance Practices for Research
The Role of Quality Assurance in Environmental
Compliance Sampling
Managing Uncertainty for Environmental Decision
Making Workshop
Data Usability in Site Assessment Workshop
EPA Data Standards Implementation Effort
Data Quality/Information Management I
Data Quality/Information Management II
Global Positioning Systems: EPA Policies, Procedures,
and QA Considerations
Geostatistical Error Management: Quantifying
Uncertainty for Environmental Sampling and Mapping
Workshop
Executive Order 13148: Requirements for
Environmental Management Systems at Federal
Facilities
Name (optional)
Affiliation
-------
EPA'S QA Community
GROUP
NAME
PHONE
E-MAIL ADDRESS
(@epa.gov)
FAX NUMBER
Quality Staff
Office of Environmental Information (2811R)
Director
WENTWORTH, Nancy
202/564-6830
wentworth.nancy
202/565-2441
DIXON, Tom
202/564-6877
dixon.thomas
202/565-2441
DOUCET, Lisa
202/564-1416
doucet.lisa
202/565-2441
HOLLOMAN, Vini
202/564-5176
holloman.vincia
202/565-2441
HUNT, Margo
202/564-6457
hunt.margo
202/565-2441
JOHNSON, Gary
919/541-7612
johnson.gary
919/541-7670
KIRKLAND, Linda
202/564-6873
kirkland.linda
202/565-2441
LAFORNARA, Patricia
732/906-6988
lafornara.patricia
732/321-6640
MAISONNEUVE, Betty
202/564-6879
maisonneuve.betty
202/565-2441
PLOST, Charles
202/564-6874
plost.charles
202/565-2441
RENARD, Esperanza
732/321-4355
renard.esperanza
732/321-6640
SIEGELMAN, Fred
202/564-5173
siegelman.frederic
202/565-2441
SIMS, Diann
202/564-6872
sims.diann
202/565-2441
STEMMLE, James
202/564-3908
stemmle.james
202/565-2441
WALDRON, Betty
202/564-6830
waldron.betty
202/565-2441
WARREN, John
202/564-6876
warren.john
202/565-2441
YOUNG, Brenda
202/564-6881
young.brenda
202/565-2441
REGIONAL OFFICES
RS&T Lead Region
Coordinator
MAUEL, Linda (Region 2)
732/321-6766
mauel.linda
732/321-4381
Region 1
WOOD, Carol (Acting QAM)
781/860-4316
wood.carol
781/860-4397
LATAILLE, Moira
781/860-4635
lataille.moira
781/860-4397
SZARO, Deborah
781/860-4312
szaro.deb
781/860-4397
Region 2
RUNYON, Bob (QAM)
732/321-6645
runyon.robert
732/906-6824
LAPOSTA, Dore (QAM)
732/321-6686
laposta.dore
732/906-6616
STEVENS, Shari
732/906-6994
stevens.shari
732/321-6622
KANTZ, Marcus
732/321-6690
kantz.marcus
732/321-6616
Region 3
JONES, Monica (QAM)
410/305-2747
jones.monica
410/305-3095
JONES, Charlie
215/814-2734
jones.charlie
215/814-2782
Region 4
BENNETT, Gary (QAM)
706/355-8551
bennett.gary
706/355-8803
MESSER, Ed
706/355-8560
messer.edward
706/355-8803
TURNBULL, Wayne
706/355-8554
turnbull.wayne
706/355-8803
Region 5
BOLGER, Kevin (QAM)
312/886-6762
bolger.kevin
312/353-4342
CHURILLA, Patrick
312-353-6175
churilla.patrick
312-886-6171
JUPP, Marilyn
312-353-5882
jupp.marilyn
312-353-4342
KARNAUSKAS, Joan
312/886-6090
karnauskas.joan
312/353-4342
LEHRMAN, Loretta
312/886-5482
lehrman.loretta
312/353-4342
LEVIN, Ida
312/886-6254
levin.ida
312/886-9281
SCHUPP, George
312/353-1226
schupp.george
312/886-2591
TSAI, Cheng-Wen
312/886-6234
tsai.cheng-wen
312/353-4342
Region 6
JOHNSON, Don (QAM)
214/665-8343
johnson.donald
214/665-8072
DAWSON, Tim
214/665-2218
dawson.timothy
214/665-8072
HELMICK, Walt
214/665-8373
helmick.walt
214/665-8072
GOROSTIZA, Sylvia
281/983-2134
gorostiza.sylvia
281/983-2248
RITCHEY, Charles
214/665-8350
ritchey.charles
214/665-8072
ROMIG, Randall
214/665-8346
romig.randall
214/665-8072
Region 7
HARRIS, Diane (Acting QAM)
913/551-7258
harris.diane
913/551-9258
NICHOLS, Robert
913/551-7295
nichols.robert
913/551-9195
Region 8
MEDRANO, Tony (QAM)
303/312-6336
medrano.tony
303/312-7828
WARNER, William
303/312-7289
warner.william
303/312-7828
Region 9
FONG, Vance (QAM)
415/744-1492
fong.vance
415/744-1476
EIDELBERG, Joseph
415/744-1527
eidelberg.joseph
415/744-1476
Region 10
TOWNS, Barry (QAM)
206/553-1675
towns.barry
206/553-8210
WOODS, Bruce
206/553-1193
woods.bruce
206/553-8210
-------
GROUP
NAME
PHONE
E-MAIL ADDRESS
(@epa.gov)
FAX NUMBER
Office of Research and Development
AA QA Rep CORTESI, Roger
202/664-6852
cortesi.roger
202/565-2444
NCER
PARRY, Nan (QAM)
202/564-6859
parry.nan
202/565-2449
NCEA
NOLAN, Melvin (QAM)
202/564-3354
nolan.melvin
202/565-0061
HQ
WU, Chieh
202/564-3257
wu.chieh
202/565-0076
Cin
WILLIAMS, Doug
513/569-7361
williams.doug
513/569-7475
RTP
FENNELL, Douglas
919/541-3789
fennell.douglas
919/641-1818
NHEERL
CULPEPPER, Brenda (DQA)
919/541-0153
culpepper.brenda
919/641-2581
AED
LIVOLSI, Joe
401/782-3163
livolsi.joseph
401/782-3030
ECD (RTP)
ROGERS, Ronald
919/541-2370
rogers.ron
919/541-0694
ETD (RTP)
HUGHES, Thomas
919/541-7644
hughes.thomas
919/541-4264
HSD (RTP)
RAY, Mike
919/966-0625
ray.mike
919/966-6212
GED (GB)
MOORE, James C.
850/934-9236
moore.jim
850/934-9201
MED (Duluth)
BATTERMAN, Allan
218/529-5027
batterman.allan
218/529-5015
(Grosse Ile)
RYGWELSKI, Ken
734-692-7641
rygwelski.kenneth
734/692-7603
NTD (RTP)
SUTTON, Jim
919/541-7610
sutton.james
919/541-5394
WARD, Tom
919/546-2544
ward.tom
919/546-5394
RTD (RTP)
BROWN, Janice
919/541-0331
brown.janice
919/541-1499
WED (Corvallis)
MCFARLANE, Craig
541/754-4670
mcfarlane.craig
541/754-4799
NRMRL
WAGNER, Tom (DQA)
513/569-7013
wagner.tom
513/569-7585
APPCD (RTP)
ADAMS, Nancy (QAM)
919/541-5510
adams.nancy
919/541-0496
FERGUSON, Holly
919/541-0949
ferguson.holly
919/541-0496
GROFF, Paul
919/541-0979
groff.paul
919/541-0496
WASSON, Shirley
919/541-1439
wasson.shirley
919/541-0496
LRPCD (Cin)
VEGA, Ann
513/569-7635
vega.ann
513/569-7620
AL-ABED, Souhail
513/569-7849
al-abed.souhail
513/569-7879
ROULIER, Mike
513/569-7796
roulier.michael
513/569-7879
RICHARDSON, Teri
513/569-7949
richardson.teri
513/569-7015
AUSTERN, Barry
513/569-7638
austern.barry
513/569-7015
RICHARDS, Marta
513/569-7692
richards.marta
513/569-7676
REISMAN, Dave
513/569-7588
reisman.david
513/569-7676
SPRD (Ada)
VANDEGRIFT, Steve
580/436-8684
vandegrift.steve
580/436-8528
STD (Cin)
DREES, Lauren
513/569-7087
drees.lauren
513/569-7787
WSWRD (Cin)
OWENS, Jim
513/569-7235
owens.jim
513/569-7327
NERL
JOHNSON, Lora (DQA)
513/569-7299
johnson.lora
513/569-7424
AMD(RTP)
VIEBROCK, Herbert
919/541-4543
viebrock.herbert
919/541-1379
EERD & MCEARD (Cin) MARTINSON, John
513/569-7564
martinson.john
513/569-7424
MEES, Bill
513/569-7582
mees.william
513/569-7424
ERD (Athens)
HOLM, Harvey
706/355-8008
holm.harvey
706/355-8007
KITCHENS, James
706/355-8043
kitchens.james
706/355-8068
ESD (LV)
BRILIS, George
702/798-3128
brilis.george
702/798-2233
LCB (RTP)
DUNCAN, John
919/541-2187
duncan.john
919/541-1138
EPIC (VA)
GAROFALO, Don
703/648-4285
garofalo.donald
703/648-4290
HEASD(RTP)
BETZ, Elizabeth
919/541-1535
betz.elizabeth
919/541-0239
ACPB (RTP)
LUMPKIN, Tom
919/541-3611
lumpkin.thomas
919/541-7953
AMMB (RTP)
HUNIKE, Elizabeth
919/541-3737
hunike.elizabeth
919/541-1153
EMMB (RTP)
LUMPKIN, Susan
919/541-4292
lumpkin.susan
919/541-3527
HEAB (RTP)
STEVENS, Calvin
919/541-1515
stevens.calvin
919/541-1486
HERB (LV)
KANTOR, Edward
702/798-2690
kantor.edward
702/798-2261
SACB (RTP)
WEINSTEIN, Jason
919/541-4207
weinstein.jason
919/541-3451
-------
GROUP
NAME
PHONE
E-MAIL ADDRESS
(@epa.gov)
FAX NUMBER
MISCELLANEOUS QA CONTACTS
Ches. Bay Program
Geospatial
Gulf of Mexico
Nat Env Lab Accr Prog
Office of Regional Ops.
LEY, Mary Ellen
BRILIS, George
KOPFLER, Fred
HANKINS, Jeanne
LUTTNER, Pamela
410/267-5750
702/798-3128
228/688-2712
919/541-1120
202/564-3107
ley.mary
brilis.george
kopfler.fred
hankins.jeanne
luttner.pamela
410/267-5777
702/798-2233
228/688-2709
919/541-4261
202/501-0062
NATIONAL PROGRAM OFFICES
GLNPO
BLUME, Louis
312/353-2317
blume.louis
312/353-2018
OA
OCHP
CHU, Ed
202/564-2196
chu.ed
OA/OPEI
MCGARTLAND, Al
202/260-3354
mcgartland.al
202/401-1642
OGG, Clay
202/260-6351
ogg.clay
202/401-6637
OAR
AA QA rep
MAZZA, Carl
202/260-4672
mazza.carl
202/260-5155
OAP
KERTCHER, Larry
202/564-9121
kertcher.larry
202/565-2141
OAQPS
ELKINS, Joe
919/541-5653
elkins.joe
919/541-3613
AUTRY, Lara
919/541-5544
autry.lara
919/541-2357
MAXWELL, Doris
919/541-5312
maxwell.doris
919/541-0072
MUSICK, David
919/541-2396
musick.david
919/541-1903
PAPP, Michael
919/541-2408
papp.michael
919/541-1903
ORIA
DOEHNERT, Mark
202/564-9386
doehnert.mark
202/565-2042
EAGLE, Mike
202/564-9376
eagle.mike
202/565-2062
FISHER, Eugene
202/564-9418
fisher.eugene
202/565-2038
NAREL
WISDOM, Mary
334/270-3476
wisdom.mary
334/270-3454
LV
LEVY, Richard
702/798-2466
levy.richard
702/798-2465
FLOTARD, Richard
702/798-2113
flotard.richard
702/798-2109
MOSLEY, Robert
702/798-2259
mosley.robert
702/798-2375
SELLS, Mark
702/798-2336
sells.mark
702/733-8013
OTAQ
HARPER, Thomas
734/214-4308
harper.thomas
734/214-4550
OARM
OA
PASTORE, Tom
202/564-2084
pastore.tom
202/564-6591
DAVIDSON, Jeff
202/564-1650
davidson.jeff
202/564-0215
OHROS
WALLACE, Linda
202/564-3182
wallace.linda
202/564-1369
OCFO
AA QA rep
O'Brien, Kathy Sedlak-
202/260-1162
obrien.kathy
202/260-9650
OECA
AA QA rep
FFEO
OC
OCEFT
IO
CID
NEIC
OEJ
OFA
ORE
OSRE
MARION, Greg
202/564-7139
marion.greg
202/564-0073
JONES, Kelly
202/564-2459
jones.kellyac
202/501-0069
TOPPER, Martin
202/564-2564
topper.martin
202/501-0271
ALGAR, Linda
202/564-2546
algar.linda
202/501-0599
GUARINO, Kevin
303/236-6120
guarino.kevin
303/312-6134
HUGHES, Barbara (QAM)
303/236-6116
hughes.barbara
303/236-5116
ALEYNIKOV, Marina
303/236-6062
aleynikov.marina
303/238-5116
MATHEWS, Kaye
303/236-6281
mathews.kaye
303/236-2395
ROHRER, Mary
303/236-6295
rohrer.mary
303/236-2395
YARBROUGH, Kenna
303/236-6711
yarbrough.kenna
303/236-2395
SETTLE, Mary E
202/564-2594
settle.mary
202/501-0740
BIGGS, B. Katherine
202/564-7144
biggs.katherine
202/564-0072
OLSON, Don
202/564-5558
olson.don
202/564-0010
JOJOKIAN, Jack
202/564-6058
jojokian.jack
202/564-0074
-------
GROUP NAME PHONE E-MAIL ADDRESS (@epa.gov) FAX NUMBER
OEI
WORTHINGTON, Jeff (DQA)
202/564-5174
worthington.jeffrey
202/565-2441
BETHELL, Cindy
202/260-2580
bethell.cindy
202/260-8550
OIG
SIMPSON, Terry
202/260-3276
simpson.terry
202/260-3030
OPPTS
AA QA Rep
KARIYA, Jim
202/260-2916
kariya.jim
202/401-1282
OPP
GRIM, Betsy
703/305-7634
grim.betsy
703/305-6309
AD
CHERRY, Juanita
703/308-6428
cherry.juanita
703/308-8270
BEAD
GRUBE, Arthur
703/305-8095
grube.arthur
703/308-8090
BEAD/ACL
WRIGHT, Dallas
410/305-2909
wright.dallas
410/305-3091
GOLDEN, Paul
410/305-2960
golden.paul
410/305-3091
BEAD/ECL
BYRNE, Christian
601/688-3213
byrne.christian
601/688-3536
BEAD/ML
LASOTA, Leo
410/305-2965
lasota.leo
410/305-3091
SZYMANSKI, Cynthia
703/308-8191
szymanski.cynthia
410/305-3091
BPPD
FRAZER, Carol
703/308-8810
frazer.carol
703/308-7026
EFED
NGUYEN, Thuy
703/605-0562
nguyen.thuy
703/308-6181
FEAD
KENDALL, Ron
703/305-5561
kendall.ron
703/308-3259
ROELOFS, Jim
703/305-2964
roelofs.jim
703/308-3259
HED
JARVIS, Christina
703/305-0312
jarvis.christina
703/308-7157
IRSD
HOWELL, Faye
703/305-5462
howell.faye
703/305-5512
RD
MALAK, Sami
703/308-9365
malak.sami
703/308-9382
SRRD
GOODIS, Michael
703/308-8157
goodis.michael
703/308-8041
OPPT
GLATZ, Jay
202/260-3990
glatz.joseph
202/260-6704
OSWER
AA QA Rep
JOVER, Tony
202/260-2387
jover.tony
202/260-6754
AA
BISHOP, Kathleen
202/260-7912
bishop.kathy
202/260-6754
FFRRO
CARTER, Mike
202/260-5686
carter.mike
202/260-5646
OERR
GEUDER, Duane (QAM)
703/603-8891
geuder.duane
703/603-9132
COAKLEY, Bill
732/906-6921
coakley.william
732/321-6724
WAETJEN, Hans
703/603-8906
waetjen.hans
703/603-9133
OSW
SELLERS, Charles
703/308-0504
sellers.charles
703/308-0511
CIRMD
EMRAD
JOHNSON, Barnes
703/308-8855
johnson.barnes
703/308-0511
KIRKLAND, Kim
703/308-0490
kirkland.kim
703/308-0509
HWID
LEBLEU, Wanda
703/308-0438
lebleu.wanda
703/308-0511
HWMMD
RAUENZAHN, Scott
703/308-8477
rauenzahn.scott
703/308-8433
MISWD
CASSIDY, Paul
703/308-7281
cassidy.paul
703/308-8686
PSPD
BROWN, Ernesto
703/308-8608
brown.ernie
703/308-8608
OUST
DEPONT, Lynn
703/603-7148
depont.lynn
703/603-9163
TIO
AMf
CRUMBLING, Deana
703/603-9910
crumbling.deana
703/603-9135
OW
AA QA Rep
TELLIARD, Bill
202/260-7134
telliard.william
202/260-7185
AIEO
LIU, Edwin (QAM)
202/260-9872
liu.ed
202/260-7509
OGWDW
CLARK, Steve (QAM)
202/260-7575
clark.stephen
202/260-3762
COLBERT, Harriet
202/260-2302
colbert.harriet
202/260-3762
DAMRON, Craig
202/260-5556
damron.craig
202/260-0732
HAERTEL, Frances
214/665-8090
haertel.frances
214/665-2191
KYLE, Lee
202/260-1154
kyle.lee
202/401-3041
MADDING, Carol
513/569-7402
madding.caroline
513/684-7191
PARRISH, Cayce
202/260-0876
parrish.cayce
202/401-6135
SMITH, Bob
202/260-5559
smith.robert-eu
202/260-0732
OST
TELLIARD, Bill (QAM)
202/260-7134
telliard.william
202/260-7185
BISWAS, Hiranmay
202/260-7012
biswas.hira
202/260-9830
HEBER, Margarete
202/260-7144
heber.margarete
202/260-7024
OWM
WALKER, John (QAM)
202/260-7283
walker.john
202/260-0116
MSD
BENROTH, Barry
202/260-2205
benroth.barry
202/260-0116
OWOW
BROSSMAN, Martin (QAM)
202/260-7023
brossman.martin
202/260-1977
PAN, Paul
202/260-9111
pan.paul
202/260-9960
SIPPLE, William
202/260-6066
sipple.william
202/260-8000
-------
The Honorable Miriam Naveira de Rodon
Associate Justice of the Supreme Court
of the Commonwealth of Puerto Rico
Justice Miriam Naveira de Rodon has an extensive and impressive record as a jurist, law
professor, and public servant. She started her professional career as a law clerk in the Supreme
Court of Puerto Rico. From 1963 to 1971, she worked for the Department of Justice of Puerto
Rico where in 1966, she was the first woman to be appointed Assistant Attorney General; and in
1973 she became the first woman to be appointed Solicitor General of the Commonwealth of
Puerto Rico. She was appointed Associate Justice of the Supreme Court of the Commonwealth
of Puerto Rico in 1985, the first and currently the only woman to hold this position. Justice
Naveira de Rodon also held a professorship at the Inter American University School of Law of
Puerto Rico.
Justice Naveira de Rodon holds a Bachelor's Degree in Science with a concentration in
Chemistry from the College of Mt. St. Vincent in Riverdale, New York. She received a Master's
degree from Columbia University School of Law in New York City, New York. Additionally,
she pursued postgraduate studies at the University of Leyden School of Law in Holland.
During her career, Justice Naveira de Rodon has received numerous honors. In 1990, she
received a Doctor of Law, Honoris Causa, from the Georgetown University School of Law in
Washington, D.C. In 1995, The College of Mt. St. Vincent conferred upon her a doctorate of
Law, Honoris Causa, for her contributions to the administration of justice. In 2000, Justice
Naveira de Rodon was elected the treasurer of the International Association of Women Judges,
participated as a member of the Board of Directors in the V Biennial Conference held in Buenos
Aires, Argentina, and was invited by the United Nations to participate as a panelist in the First
Encounter of Women Justices from Supreme Courts and Constitutional Courts of Latin America
and the Caribbean.
-------
NOMINATIONS FOR THE 2000 BARBARA M. METZGER
QUALITY ASSURANCE MANAGER AWARD
Scott E. Cieniawski (Region 5)
Outstanding quality management of grants and contracts for the Great Lakes National Program
Office through exemplary performance.
Priscilla S. Farrel (Region 9)
Significant national leadership in development of relevant, accessible information tools for
strategic thinking and public accountability.
Barbara A. Finazzo (Region 2)
Demonstrated initiative and leadership in developing and implementing a nationwide assessment
program for EPA's Regional Laboratories.
Brian Freeman (Region 5)
Outstanding achievement in quality assurance management through the implementation of
uniquely valuable data quality concepts and tools.
Harvey Holm (ORD)
Revitalizing and enhancing the Ecosystems Research Division Quality Assurance Program that
is a model for research laboratories.
Gary G. Lear (OAR)
Exceptional service in quality assurance through developing a sound, comprehensive, and
credible quality assurance system for the Clean Air Status and Trends Network (CASTNET).
James V. Roelofs (OPPTS)
Exceptional service in assisting EPA's State and Tribal partners to improve Quality Assurance
policies and procedures for pesticide programs.
-------
PREVIOUS WINNERS OF THE BARBARA M. METZGER
QUALITY ASSURANCE MANAGER OF THE YEAR AWARD
1999 Mark P. Doehnert, OAR
1998 Joe Elkins, OAQPS
1997 Vance Fong, Region 9
1996 Barbara M. Metzger, Region 2
1994 Michael Papp, GLNPO
1994 John Scalera, OPPT
1993 Llewellyn Williams, EMSL-Las Vegas
1992 Lora Johnson, EMSL-Cincinnati
1991 Marvin Kendall Young, Region 6
1990 Martin W. Brossman, OWOW
1989 Guy F. Simes, RREL
1988 Gerard F. McKenna, Region 2
1987 Barry Towns, Region 10
-------
Improving The Quality of Data Used in Environmental Economics Research
Clayton Ogg
Abstract - Valuing environmental goods through contingent valuation and
through other survey techniques is difficult because of the lack of traditional
markets for those goods. EPA provides leadership in overcoming these
challenges, using focus groups, workshops, handbooks, and teams of
distinguished economists to obtain more reliable economic data from the surveys.
Agency economists have provided leadership also in employing production
economics tools to identify win-win approaches to solve pervasive and neglected
environmental problems.
In addition, we exploit the speed of sophisticated computers, building field run-off
models into economic models and aggregating results from tens of thousands of
field sites. These very flexible models provide much more credible, site specific
analyses than previous models, and they offer flexible and efficient remedies to
support Total Maximum Daily Loads and other new programs.
Past environmental programs committed hundreds of billions of U.S. dollars to achieve
environmental benefits. Policy makers are keenly interested in quantifying the economic benefits
from these past investments, as well as any future investments in improving the environment.
Although quantifying economic benefits from environmental decisions rarely, if ever, leads to
confrontation or litigation, assuring the quality of economic data and analyses is critically
important. EPA's economic analyses support important policy and programmatic decisions.
Of the many areas of research conducted by the Office of Policy, Economics, and Innovation, this
paper will focus on three areas where generation of quality economic data appears most relevant
to Agency decision making: 1) Agency economists collect primary economic data through their
own contingent valuation (CV) surveys or use data from earlier CV surveys; these surveys face
immense challenges in developing reliable data, but provide a direct valuation of
environmental products not sold in any marketplace. 2) Other studies employ agricultural
production economics to quantify producer benefits from adopting nutrient planning and other
technologies which benefit producers' bottom line, and the environment, by reducing the use of
costly chemical inputs. 3) Economists also team up with physical scientists (such as those
attending this conference) to build environmental indicators into economic models; these models
are then used to identify site specific benefits as environmental decisions devolve to the local
level through Total Maximum Daily Loads (TMDL) and other locally run programs. Agency
economists continue to play a leadership role within their professions as they improve the
quality of data and models in each of the above research areas and assure that the models,
assumptions, and findings are employed correctly.
Each of these three areas of economic/environmental research offers unique contributions to
support environmental decision making, and each offers its own set of challenges to those
providing reliable data and models. These Quality Assurance (QA) challenges faced by
economists are very different from the challenges faced by others attending this conference.
After describing past efforts to assure the quality of data, the paper will suggest some additional
measures that can be undertaken in the future.
Direct Valuation
Economists traditionally rely on the price of a good to represent its value to society.
However, with environmental goods and services (such as national parks or clean air) there is no
market in which a price, or value, of the good is revealed. Therefore, economists have developed
alternative methods for estimating such values. Three of the most common such techniques are
contingent valuation, hedonics, and travel cost methods. This paper will focus on the first
technique, contingent valuation. Contingent valuation (CV) is a survey method that provides a
hypothetical "market" in which individuals are asked to express a value for a particular
environmental good or service. For example, a survey might ask individuals to express a
willingness to pay to improve water quality in a particular lake from a boatable to a swimmable
standard. These stated values can then be used as an input into benefit-cost analysis or policy
decisions. CV is arguably the most widely used valuation technique at EPA because of its ability
to capture non-use values (e.g., individuals who do not directly "use," or visit, a particular lake
may still have a value for its water quality improvements) and its ability to be applied to many
different scenario types. However, CV is also one of the most controversial methods because of
the hypothetical nature of the survey methods. Designing these CV questions offers perhaps the
greatest challenge in attaining reliable data and estimates (USEPA, 2000).
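To make concrete how such stated responses become a benefit number, the fragment below is a minimal Python sketch of one standard, conservative approach, a Turnbull-style lower-bound estimate of mean willingness to pay from yes/no answers at a set of bid amounts. The bid levels and response shares are invented for illustration and are not drawn from any EPA survey.

    import numpy as np

    # Hypothetical dichotomous-choice CV data: bid amounts offered to respondents
    # and the share answering "no" (i.e., willingness to pay is below the bid).
    bids = np.array([5.0, 15.0, 30.0, 60.0])          # dollars per household (assumed)
    share_no = np.array([0.20, 0.35, 0.30, 0.80])     # raw "no" proportions (assumed)

    # The estimated CDF of willingness to pay must be non-decreasing in the bid;
    # enforce monotonicity with a running maximum (pooling adjacent violators).
    cdf = np.maximum.accumulate(share_no)

    # Probability mass below the lowest bid, between adjacent bids, and above the
    # highest bid (respondents who accepted every bid they were offered).
    cdf_ext = np.append(cdf, 1.0)
    mass = np.diff(np.insert(cdf_ext, 0, 0.0))

    # The lower bound assigns each interval its lower endpoint (zero below the first bid).
    lower_endpoints = np.insert(bids, 0, 0.0)
    wtp_lower_bound = float(np.sum(lower_endpoints * mass))

    print(f"Turnbull lower-bound mean WTP: ${wtp_lower_bound:.2f} per household")

Design choices such as the bid vector, the treatment of protest responses, and whether to use a parametric model instead are exactly the survey-design questions that the workshops and handbook described below are meant to address.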
Activities led by National Center for Environmental Economics (NCEE) economists during the
past year are illustrative of the Agency's efforts to improve the data from CV surveys:
• NCEE co-sponsored a workshop focused primarily on improving survey data obtained for
CV studies, entitled "Stated Preference: What Do We Know? Where Do We Go?" (The
term "stated preference" and "CV" are often used interchangeably). The workshop
included sessions on theory and design, validity, and applications related to health and
ecosystems. Some of the discussion focused on actual surveys supported by EPA,
including a survey designed to value visibility improvements and one designed to
determine the value older people place on health risks. Panel discussions addressed how
agencies use these survey techniques, suggested research priorities, and identified ways to
improve the survey instruments. Presentations and discussions from these and other
workshops are published on-line (www.epa.gov/economics).
• NCEE is writing a Stated Preference Handbook for use by EPA economists and policy
analysts. The Handbook will focus on evaluating the validity and reliability of existing
CV studies, as well as how these issues impact the design of an original study. Topics
will include the design of studies, survey administration, data analysis, and benefits
transfer techniques. The Handbook will encourage analysts to incorporate methods of
evaluating their survey instrument and results into their study design and to demonstrate
why they adopted certain procedures. This document was reviewed internally and by
distinguished reviewers from academia.
• Finally, several CV surveys conducted by NCEE during 2000, including surveys relating
to benefits from improving drinking water, freshwater, and estuarine water, employed the
quality assurance techniques favored by the above handbooks and a recently published
"Guidelines for Preparing Economic Analyses." These techniques included extensive use
of focus groups to test people's perceptions regarding the questions in the survey, as well
as a review of the survey instruments by teams of distinguished economists.
Over the past two decades, EPA economists have relied on the above approaches to improve the
data used by contingent valuation studies, hosting conferences which identify problems and
solutions, developing handbooks and other publications outlining the latest techniques for
improving the quality of data generated by contingent valuation and other surveys (Cummings,
et. al.; Murdoch, et. al.; Sylvan Environmental Consultants; Martin Marietta Corporation), and
supporting research by teams of the most talented economists. These studies identify problems
with earlier studies and attempt to design and use new survey and focus group techniques which
will improve the methodology as well as improve the actual estimates of environmental benefits
(Anderson and Kobrin).
One study in 1990 attempted to overcome difficulties afflicting previous CV studies of air quality
by confronting the diverse perceptions by survey respondents. Some survey respondents might
see the Denver Brown Cloud as just that while others also perceive inhalation of harmful
chemicals associated with the cloud (Schulze, et. al.). This study was typical of a number of CV
studies funded by EPA over the last two decades in its use of a large team of economists, and in
some cases, psychologists, who work to identify problems in previous CV studies and attempt to
overcome those problems (Anderson and Kobrin).
CV and many related benefit valuation techniques caught the imagination of resource
economists. The work identified above attracts the attention of dozens of talented researchers
inside EPA as well as many other resource economists across the U.S. Thus, the data problems
associated with CV and related analyses are being addressed in a major way by many skillful
economists, with excellent support from their profession. Some QA concerns in other areas of
research, including those described below, do not enjoy the same high level of support, but EPA
has provided much leadership in these other areas, as well.
Production Economics
Over two decades ago, agronomists identified large potential environmental gains in the form of
reduced nitrogen fertilizer use (25-50 percent reductions) by delaying fertilizer applications until
late in the Spring when plants were ready to use it (Bouldin, Reid, and Lathwell; Olson, et. al.)
and by taking credit for farm produced nutrients already in the soil when applying fertilizer
(Magdoff, Ross, and Amadon; El-Hout and Blackmer; Fox and Piekielek; Meisinger).
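The crediting arithmetic behind these findings is simple enough to show directly. The sketch below is purely illustrative Python; the yield goal, credit values, and fertilizer price are invented numbers, not figures from the cited agronomic studies.

    # Hypothetical per-acre nitrogen budget illustrating crediting of nutrients
    # already supplied by the soil and a previous legume crop.
    yield_goal_bu = 150                        # corn yield goal, bu/acre (assumed)
    n_recommendation = 1.2 * yield_goal_bu     # rule-of-thumb rate, lb N/acre (assumed)

    soil_test_credit = 35    # lb N/acre indicated by a late-spring soil test (assumed)
    legume_credit = 25       # lb N/acre following soybeans or alfalfa (assumed)

    credited_rate = max(n_recommendation - soil_test_credit - legume_credit, 0)

    fertilizer_saved = n_recommendation - credited_rate        # lb N/acre
    percent_saved = 100 * fertilizer_saved / n_recommendation
    cost_saved = fertilizer_saved * 0.25                        # at an assumed $0.25/lb N

    print(f"Rate without credits: {n_recommendation:.0f} lb N/acre")
    print(f"Rate after credits:   {credited_rate:.0f} lb N/acre")
    print(f"Savings: {fertilizer_saved:.0f} lb N/acre ({percent_saved:.0f}%), about ${cost_saved:.2f}/acre")

With these invented inputs the credits cut the application rate by roughly a third, the same order of magnitude as the 25-40 percent savings reported in the studies discussed below.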
Economists at EPA and elsewhere anticipated the questions policy makers might raise regarding
these remarkable discoveries, such as, "Why are farmers not already using these nutrient
planning technologies if they are so profitable, and what is the actual fertilizer savings achieved
by farmers who have adopted these technologies?" Because policy makers invest as much or
more in programs which support farm income as they invest in programs which improve the
nation's environment, learning about producer benefits and environmental benefits from nutrient
planning appeared to be a policy relevant area for economic research.
In contrast to the CV studies mentioned above, the production economics question appeared
relatively easy. We could issue a call for research proposals which would focus on learning from
farmers' actual experience in adopting and profiting from nutrient crediting technologies. This
would be done in certain states which were ahead of the rest of the country in calibrating the
nitrogen crediting technologies and making them available to farmers. Farmers know what you
are saying when you ask 1) what they have spent on fertilizer and 2) whether they have used the
late Spring soil test. Providing credible estimates in more than one state was considered
desirable because policy makers, as well as economists, are naturally skeptical of win-win types
of claims linking environmental improvements with producer benefits.
Economists in Pennsylvania, Iowa, and Nebraska, three states that were farthest along in
adopting the improved nutrient planning technologies, conducted regional or state-wide
economic analyses regarding use of these technologies. The Pennsylvania study (Shortle, et. al.)
was the only one of the three which was supported by EPA. It found that within five years of its
introduction, over a third of farmers in the state were already using the late spring soil test and
that nitrogen fertilizer savings ranged from 25 to 40 percent. The Nebraska switching regression
study (Fuglie and Bosch) was also based on a large survey of farmers and found a similar
reduction in fertilizer applications and a 50 percent adoption rate for the deep soil test available
in that state. The Iowa study (Babcock and Blackmer) used that state's extensive experimental
data in a production function and again suggested fertilizer savings in the 25-40 percent range
from Iowa's version of the late Spring soil test.
Publication in credible, peer reviewed journals supported the reliability of these and other
(Fleming and Babcock; Ogg,1999; Tractenberg and Ogg) studies which document producer
benefits from improved management of fertilizer and livestock nutrients. Partly in response to
these economic and environmental opportunities, policy makers have focused in the past four
years on addressing the over abundance of nutrients on the landscape. This policy change
occurred after decades of investing in costly remedies, such as tertiary treatment systems for
urban nitrate sources. Past policies neglected the agricultural pollutants that provide by far the
greatest share of nutrient loadings in U.S. streams (USEPA, 1990; Pucket). Nutrient planning,
which typically includes soil testing or other methods of crediting farm produced nutrients, is the
focus of "management measures" for coastal waterways (Ogg, 1999), of hypoxia initiatives
(Doering, et. al.), and of USDA's new Environmental Quality Incentives Program (Ogg, 1999).
Nutrient planning also will likely play a large role as States carry out new programs for confined
animal feeding operations and develop Total Maximum Daily Loads (TMDL).
Production economics research, including the studies identified above, is very versatile and
draws from a solid base of economic theory. Providing analyses that take advantage of this
versatility lends credibility to the quality of the findings, as studies use very different data and
approaches, yet arrive at mutually supporting conclusions. Since these production economics
studies are much easier to carry out and since they produce reliable conclusions, much can be
accomplished with relatively modest research investments.
Although not much of this production economics work is supported by the agency at the present
time, economists looking at integrated pest management (IPM) have begun to find win-win sorts
of opportunities that parallel those from the above nutrient studies. However, because of the
large variety of IPM techniques in use, documenting which IPM practices are beneficial to
farmers and to the environment will be more challenging (Norton and Mullen).
Site Specific Research
Devolution of environmental decision making to the local level is another policy option aimed at
reducing costs. TMDL programs and new USDA programs created by the 1996 Farm Bill
attempt to offer flexible solutions that achieve environmental benefits in ways that are much
more efficient than the past, one-size-fits-all approaches. These programs encourage States to
rank local watersheds for treatment based on which watershed program offers environmental
benefits at the lowest cost. For the watersheds targeted for early treatment, environmental goals
are identified, and practices that most efficiently achieve those goals receive funding first.
In order to support the above ranking of potential watershed programs and to take advantage of
the opportunity to achieve economic efficiency, economists work together with scientists from
other disciplines to develop models with the capability to identify site specific benefits and costs.
Working together, they combine economic models with the field run-off models and the stream
models developed by physical scientists. Lack of the technical capability to model site specific
impacts at a reasonable cost has hampered past efforts (Boyd) to advance TMDL programs, but
researchers cooperating with EPA are producing models that provide the credibility and
flexibility that is needed.
Early efforts to combine natural resource indicators into economic models involved building the
Universal Soil Loss Equation (USLE) into linear programming models to analyze costs and
benefits of soil conservation policy options. These models identified site specific land use
changes from implementing various policies to reduce soil erosion. They were most useful in
anticipating the erosion reductions and costs of policies such as the Conservation Reserve
Program and Conservation Compliance (Ogg, Webb, and Huang).
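A stylized version of that structure can be written in a few lines. The sketch below, with entirely hypothetical field coefficients and practice costs, embeds the USLE (A = R x K x LS x C x P, tons of soil loss per acre per year) in a small linear program that allocates a field's acreage across tillage practices at least cost subject to an erosion ceiling.

    import numpy as np
    from scipy.optimize import linprog

    # Universal Soil Loss Equation factors for one field (assumed values)
    R, K, LS = 160.0, 0.32, 1.8                  # rainfall, erodibility, slope-length factors
    practices = ["conventional till", "no-till", "no-till + contouring"]
    C = np.array([0.40, 0.10, 0.10])             # cover-management factors (assumed)
    P = np.array([1.00, 1.00, 0.60])             # support-practice factors (assumed)
    cost = np.array([20.0, 28.0, 34.0])          # $/acre/year for each practice (assumed)

    erosion_per_acre = R * K * LS * C * P        # tons/acre/year under each practice

    total_acres = 100.0
    erosion_limit = 800.0                        # policy target for the field, tons/year (assumed)

    # Minimize cost subject to: total erosion <= limit, acres sum to the field size.
    res = linprog(
        c=cost,
        A_ub=[erosion_per_acre], b_ub=[erosion_limit],
        A_eq=[np.ones(len(practices))], b_eq=[total_acres],
        bounds=[(0, None)] * len(practices),
        method="highs",
    )

    for name, acres in zip(practices, res.x):
        print(f"{name:>22s}: {acres:6.1f} acres")
    print(f"Least-cost plan: ${res.fun:,.2f}/year, erosion {erosion_per_acre @ res.x:,.0f} tons/year")

Scaling the same idea to many fields and policy scenarios is what the early soil conservation models did; the limitation discussed next arose when run-off, rather than average annual erosion, became the quantity of interest.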
However, as economists attempted to include phosphorus pollution as indicators in these linear
programming models (Ogg, Pionke, and Heimlich), they had to address run-off. Run-off was a
much more difficult problem than soil erosion because it had to be addressed for individual
storms. Since computer models could not model large numbers of storm events at a manageable
cost, modelers faced the impossible task of determining which storms could be considered as
most representative or relevant. Also, the early linear programming models tended to average
certain landscape characteristics over a geographic area rather than run the model for individual
sites and then average the model's outputs for those sites. Depending on their disposition,
physical scientists responded with wit, humor, or anger at how economists were using their
models. The larger the landscape being modeled, the greater the opportunity to cause offense.
As computer capabilities have improved, EPA supported research which addresses the above
problems. One recent cooperative study (Wu and Babcock) at Iowa State University ran the
EPIC crop growth model every day for thirty years for ten percent of the 128,591 National
Resources Inventory sample points within its 12-state Midwestern region being modeled. It then
used "meta-model" regression techniques to expand the results to the rest of the 128,591 sample
points. Models used in this study had been calibrated and tested. The models are extremely
flexible in outputting results to counties (as in the article cited above) or to impaired watersheds
within the 12 state region. This is done by averaging the models' results for the sample points
within the respective county or watershed. Sediment runoff at the field level, nutrient leaching
and run-off at the field level, herbicide leaching and run-off at the field level, and soil carbon are
the environmental indicators available within this model. In the Wu and Babcock analysis,
nitrogen leaching and run-off was reduced by 15 to 30 percent in most counties using the win-
win types of fertilizer management technologies (Fuglie and Bosch; Babcock and Blackmer;
Shortle, et. al.) discussed above.
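The two-step logic of that study, run the process model on a subsample and use a regression meta-model to fill in and aggregate the rest, can be sketched with synthetic data. Everything below (the site attributes, the response surface standing in for EPIC output, and the watershed labels) is fabricated for illustration.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)

    # Synthetic stand-in for NRI sample points: site attributes and watershed labels.
    n_points = 5000
    points = pd.DataFrame({
        "watershed": rng.choice(["WS-01", "WS-02", "WS-03"], size=n_points),
        "fert_rate": rng.uniform(80, 200, n_points),    # lb N/acre applied (assumed)
        "soil_perm": rng.uniform(0.1, 1.0, n_points),   # relative permeability (assumed)
        "rainfall":  rng.uniform(25, 45, n_points),     # inches/year (assumed)
    })

    # Step 1: "run the process model" on a 10 percent subsample; a made-up
    # response surface stands in for EPIC-simulated nitrate leaching.
    sample = points.sample(frac=0.10, random_state=0).copy()
    sample["leach"] = (0.15 * sample["fert_rate"] * sample["soil_perm"]
                       + 0.3 * sample["rainfall"]
                       + rng.normal(0, 2, len(sample)))

    # Step 2: fit an ordinary least squares meta-model on the subsample.
    X = np.column_stack([np.ones(len(sample)),
                         sample[["fert_rate", "soil_perm", "rainfall"]]])
    beta, *_ = np.linalg.lstsq(X, sample["leach"].to_numpy(), rcond=None)

    # Step 3: predict for every sample point, then average up to watersheds.
    X_all = np.column_stack([np.ones(n_points),
                             points[["fert_rate", "soil_perm", "rainfall"]]])
    points["leach_hat"] = X_all @ beta

    print(points.groupby("watershed")["leach_hat"].mean().round(2))

The aggregation in the last step is the same averaging of point-level results to a county or watershed described above; whether a linear meta-model is adequate is itself a data quality question that would need to be checked against additional process-model runs.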
The Iowa model appears relevant to programs which require ranking watersheds to prioritize
those watersheds which produce the largest benefits per dollar. They also could help in
analyzing remedies to reach environmental goals set within the watersheds. If the model can be
used in ways that avoid the costs of building a separate model for each TMDL watershed,
considerable cost savings can be realized, although this author is not certain whether TMDL
goals can be established without building a separate model for each stream.
Potential Problems Confronting Site-Specific Analyses
While the above Iowa model outputs physical science and economic information in a credible
manner, many regional models used by economists (e.g. Doering, et. al.; Faeth) still resemble
more closely the older models described above. They lack the credibility which could be
provided by modern, fast computers.
Serious problems have occurred, also, where models are used to address problems that they were
not designed to address. One recent study (Faeth, 1995) estimated the costs of managing nitrogen
fertilizer by assuming that policy makers would reduce nitrogen to levels that starve the plant for
nitrogen and reduce yields. This approach was suggested by the options that were available
within the model, not by a review of the options under consideration by policy makers or by
agronomists who assist farmers. In fact, agronomists insist that the technologies such as timing
nitrate applications and taking credit for farm produced nutrients will not reduce yields, and these
scientists work to ensure that yield reductions do not ever occur (Fox and Piekielek; Magdoff;
Meisinger). Policy makers focus on nutrient planning technologies which assure adequate
nutrients to reach yield goals, but discourage nutrient applications beyond recommended amounts
(Ogg, 1999). The Faeth (1995) study failed to consider the technologies that farmers actually rely
on to address the over abundance of nutrients on their land, as have other studies before it (Ogg,
1978; Taylor and Frohberg).
Models are used in many ways. It is not the models, themselves, that pose problems, but rather
how models are used. The same model that produced misleading estimates regarding the costs of
fertilizer management provided an important contribution in a later study that considered the
costs of energy taxes for agricultural producers (Faeth, 2000). Because this model correctly
portrayed producers' flexibility to shift to reduced tillage and other technologies in the face of
higher energy costs, it was able to contribute important economic information to the policy
debate regarding energy taxes and agriculture. For example, the analysis indicated that farm
incomes would be affected only modestly and multiple resource conservation benefits would be
realized.
QA for Economics Research
Economic research is very different from physical science research, in part, because we do not
have a physical phenomenon, such as certain chemicals in a stream or in the air, that is there
for us to measure. There are as many different economic phenomena to measure as there are
ways to measure them, and who is to say which theoretical model captures the more correct
answer, or which offers the best measurement technique? Thus, the challenges to producing high
quality economic data and analyses are very different from the challenges to producing physical
science data and analyses.
Economic models used for quantifying environmental benefits are also not subject to the legal
challenges associated with setting environmental standards. However, our studies need to be
reliable and believable for use in important policy decisions. We are addressing this challenge in
each of the research areas identified above.
EPA has provided leadership in addressing the major problems faced by contingent valuation and
related studies as they attempt to value environmental goods which are not sold in the market
place. The agency produces handbooks, works with focus groups, and assembles teams of the
most knowledgeable economists to assure the quality of data. Many of the past studies attempted
to remedy problems raised by earlier studies as teams of economists designed survey instruments
that address the problems. These efforts will continue to play an important role in the coming
year and beyond.
The challenges facing production function analyses are much less formidable. By working with
economists who publish in credible, peer reviewed journals, and by employing a variety of
research techniques, we can produce analyses that scientists and policy makers use.
The Agency also has supported development of site specific modeling tools that avoid criticism
with regard to their physical science components. High speed computers allow researchers to
assemble data at thousands of sample points and run a calibrated model for each sample point in
a credible manner. Results for each point are then aggregated to the watersheds of interest. We
need to employ these new, more reliable modeling techniques where they are most needed.
In the future, economists in OPEI will improve on the above measures by explicitly including
data quality in the reviews conducted for grants and for contracts. Unlike the studies conducted
by program offices, policy analyses have very wide flexibility as to which projects are funded. If
research proposals do not assure appropriate use of data and models, an excellent remedy is to
employ Agency resources elsewhere.
References
Anderson, R.C., and P. Kobrin. 1998. "Introduction to Environmental Economics Research at
EPA." Environmental Law Institute, Washington, DC.
Babcock, B.A., and A.M. Blackmer. 1992. "The Value of Reducing Temporal Input
Nonuniformities." J. Agr. and Resour. Econ. 17:335-47.
Bouldin, C.R., W.S. Reid, and D.J. Lathwell. 1971. "Fertilizer Practices Which Minimize
Nutrient Loss." In: Agricultural Wastes: Principles and Guidelines for Practical Solutions.
Proceedings of Cornell University Conference on Agricultural Waste Management, Syracuse,
1971.
Boyd, J. 2000. "Unleashing the Clean Water Act: The Promise and Challenge of the TMDL
Approach to Water Quality." Resources 139:1-10.
Cummings, R.G., D.S. Brookshire, and W.D. Schulze. 1986. "Valuing Environmental Goods: An
Assessment of the Contingent Valuation Method." Rowman & Allanheld, Totowa.
Doering, O.C., F. Diaz-Hermelo, C. Howard, R. Heimlich, F. Hitzhusen, R. Kazmierczak, L.
Libby, W. Milon, A. Prato, and. M. Ribaudo. 1999. "Evaluation of Economic Costs and Benefits
of Methods for Reducing Nutrient Loads to the Gulf of Mexico." Mimeo.
Faeth, Paul. 1995. "Growing Green: Enhancing the Economic and Environmental Performance of
U.S. Agriculture." World Resources Institute, Washington, DC.
Faeth, Paul. 2000. "A Climate and Environmental Strategy for U.S. Agriculture." World Resources
Institute, Washington, DC.
El-Hout, N.M. and A.M. Blackmer. 1990. "Nitrogen Status of Corn after Alfalfa in 29 Iowa
Fields." Journal of Soil and Water Conservation 45: 115-117.
Fleming, R.A. and B.A. Babcock. 1997. "Resource or Waste? The Economics of Swine Manure
Storage and Management." Rev. of Agri. Econ. 20:96-113.
Fox, R.H. and W.P. Piekielek. 1983. "Response of Corn to Nitrogen Fertilizer and the Prediction
of Soil Nitrogen Availability with Chemical Tests in Pennsylvania." Pennsylvania Agricultural
Experiment Station Bulletin 843, Pennsylvania State University, University Park, Pennsylvania.
Fuglie, K.O. and D. J. Bosch. 1995. "Economic and Environmental Implications of Soil Nitrogen
Testing: A Switching-Regression Analysis." Amer. J. Ag. Econ.77: 891-900.
Magdoff, F.R., D. Ross, and J. Amadon. 1984. "A Soil Test for Nitrogen Availability to Corn."
Soil Science Society of America Journal 48:1301-1304.
Meisinger,J.J. 1984. "Evaluating Plant-Available Nitrogen in soil-Crop Systems." In: Nitrogen in
Crop Production, Madison: ASA-CSSA-SSA.
Martin Marietta Corporation. 1994. "Using Contingent Valuation to Measure Non-Market
Value." Washington, DC.
Mueller, D.K., P A. Hamilton, DR. Helsel, K.J. Hitt and G.C. Ruddy. 1995. "Nutrients in
Ground Water of the United States—An Analysis of Data Through 1992." U.S. Geological
Survey, Water-Resources Investigations Rep. No. 95-4031, Denver, Colorado.
Murdoch, J., J. Thayer, G. Gegax, W. Schulze, D. Anderson, S. Gerking, A. Coulson, D.
Tashkin, D. Anderson, M. Dickie, D. Brookshire, R. Cummings. Valuing Environmental Goods:
A State of the Arts Assessment of the Contingent Valuation Method, Volume 1 .A, Experimental
Methods for Assessing Environmental Benefits." Rowman &Allanheld, Totowa.
Norton, G.W. and J. Mullen. 1994. "Economic Evaluation of Integrated Pest Management
Programs: A Literature Review." Virginia Coop. Ext. Pub. 448-120, Virginia Polytechnic
Institute and State University, Blacksburg, Virginia.
Ogg, C. 1978. "The Welfare Effects of Erosion Controls, Banning Pesticides, and Limiting
Fertilizer Application in the Corn Belt: Comment." Amer. J. Agr. Econ.60:559.
Ogg, C. 1999. "Benefits from Managing Farm Produced Nutrients." J. of the Amer. Water Res.
Association3 5:1015-1021.
9
-------
Ogg, C., W.E. Webb, and W.Y. Huang. 1984. "Economic Analysis of Acreage Reduction
Alternatives Including a Soil Conservation Reserve and Competitive bids." J. of Soil and Water
Cons.39:397-383.
Olson, R.A., A.F. Dreier, C. Thompson, K. Frank, and P.H. Grabouski. 1964. "Using Fertilizer
Nitrogen Effectively on Grain Crops." Nebraska Agr. Exp. Sta. SB 479, Lincoln, Nebraska.
Shortle, J.S., W.N. Musser, W.C. Huang, B. Roach, K. Kreahling, D. Beegle, and R.M. Fox.
1997 "Economic and Environmental Potential of the Pre-Sidedressing Soil Nitrate Test." Agri.
Econ. Rev. 17:25-35.
Schulze, W.D., J R. Irwin, D.J. Schenk, G.H. McClelland, T. Streard, L. Deck, and M. Tobin.
1990. "Urban Visibility: Some Experiments on the Contingent Valuation Method." In: C.V.
Mathai, Visibility of Fine Particles. Transactions of the Air and waste Management Association,
Pittsburgh, Pennsylvania.
Sylvan Environmental Consultants. 2000. "Stated Preference: What Do We Know? Where Do
We Go?" Environmental Law Institute, Washington, DC.
Taylor, C. R. and K. Frohberg. 1977. "The Welfare Effects of Erosion Controls, Banning
Pesticides, and Limiting Fertilizer Application in the Corn Belt." Amer. J. Aer. Econ.59:25-36.
Trachtenberg, E., and C. Ogg. 1994. "Potential for Reducing Nitrogen Pollution Through
Improved Agronomic Practices." Water Resour. Bull. 30:1109-18.
U.S. Environmental Protection Agency (USEPA) and U.S. Department of Agriculture. 1990.
"National Water Quality Inventory, 1990 Report to Congress." Washington DC.
U.S. Environmental Protection Agency and Other Federal Agencies. 1998. Clean Water Action
Plan: Restoring and Protecting America's Waters. Washington, DC.
U.S. Environmental Agency. 2000. "Guidelines for Preparing Economic Analyses." EPA 240-R-
00-003, Washington, DC.
Wu, J. J., P.G. Lakshminararyan, and B .A. Babcock. 1999. "Impacts of Agricultural Practices and
Policies on Potential Nitrate Water Pollution in the Midwest and Northern Plains of the United
States." J. ofEnviro. Oualitv28:1916-1928.
10
-------
ENSURING DATA QUALITY THROUGH
EFFECTIVE LIFE CYCLE MANAGEMENT
Daiby Chellis, Marasco Newton Group, Ltd.
Jessica Yocum, Marasco Newton Group, Ltd.
Abstract—By ensuring data is of the highest quality, the Environmental Protection Agency
(EPA) is able to make informed decisions regarding budgetary needs, staff allocation, and EPA-
wide and program-specific goals; report program accomplishments; and plan appropriately
for future activities. Quality data is imperative to accurately reflect program progress,
justify and negotiate expenditures, and provide a unified EPA vision to internal and external
parties.
Like every large organization in a data-reliant society, EPA faces many challenges in
ensuring data quality. To protect the data integrity of an information system, a multi-
dimensional team must be implemented and effectively managed to support all aspects of
a system's life cycle. This team must include a strong project manager as well as requirements,
development, independent third-party test, training, change management, and user support
staff. Each of these team members contributes to the quality of the system and ultimately
the data tracked in the system.
The life-cycle team must use its programmatic knowledge and technical expertise to create a
system that not only supports data tracking and reporting, but also facilitates data entry and thus
data quality. The life-cycle team ensures that the needs of the users are clearly documented
along with the detailed system specifications in order to promote knowledge sharing.
Without such an integrated approach, an information system cannot adequately support the
user community, nor can it provide the support mechanism where the information system
can grow and change as programmatic policy changes. A dynamic life-cycle team thus
ensures a dynamic and reliable information system.
By ensuring data is of the highest quality, the Environmental Protection Agency (EPA) is able to
make informed decisions regarding budgetary needs, staff allocation, and EPA-wide and program-
specific goals; report program accomplishments; and plan appropriately for future activities. Quality
data is imperative to accurately reflect program progress, justify and negotiate expenditures, and
provide a unified EPA vision to internal and external parties. For example, data tracked in EPA's
information systems is used to respond to Congressional and Freedom of Information Act (FOIA)
inquiries. It is essential that this data be of the highest quality so that it accurately reflects ongoing
trends and accomplishments. Quality data is also relied upon during internal EPA negotiation
processes and externally with Congress to demonstrate where resources should be allocated for the
upcoming fiscal year. Both planning and accomplishment data can be used to demonstrate a need
for additional funding and to determine upcoming goals and objectives.
1
-------
Like every large organization in a data-reliant society, EPA faces many challenges in ensuring data
quality. To protect the data integrity of an information system, a multi-dimensional team must be
implemented and effectively managed to support all aspects of a system's life cycle. This team must
include a strong project manager as well as requirements, development, independent third-party test,
training, change management, and user support staff. Each of these team members contributes to the
quality of the system and ultimately the data tracked in the system. For example, when documenting
requirements, systems analysts work to ensure that program processes are supported and that these
requirements are translated into data entry screens that clearly display which data is required.
Systems analysts also work to ensure that the system is thoroughly documented so that detailed
knowledge of functionality and programmatic processes is retained. The test team works with the
requirements team not only to ensure a defect-free system, but also to ensure that requirements are
clear, concise, and unambiguous for the developer. Developers work with the life-cycle team to build
the system and provide technical input on efficient database and application design implementation.
Trainers work with system users and the life-cycle team to ensure that users understand how to
navigate the screens. Trainers also provide real-life data entry scenarios and gather feedback on the
system's effectiveness in supporting day-to-day activities. User support staff draw on the team's
knowledge so that when questions regarding functionality arise, they can quickly respond and ensure
that data is entered and extracted in a timely manner. Change Management provides a vehicle for
organized communication of data needs and works to prioritize these needs for system releases.
Based on feedback from the user community and the life-cycle team, Change Management can assess
trends and user needs and prioritize changes for future releases.
An information system is only as useful as the data put into it. The life-cycle team is integral to
ensuring the continuity of design and knowledge sharing that protects data integrity. The team must
use its programmatic knowledge and technical expertise to create a system that not only supports data
tracking and reporting, but also facilitates data entry and thus data quality. The life-cycle team ensures
that the needs of the users are clearly documented along with the detailed system specifications in
order to promote knowledge sharing. Without such an integrated approach, an information system
cannot adequately support the user community, nor can it provide the support mechanism by which the
information system can grow and change as programmatic policy changes. A dynamic life-cycle
team thus ensures a dynamic and reliable information system.
Project Management
Every successful IT project owes its success to a strong, multi-disciplinary team. The first step that
must be taken when building that team is to appoint a strong, experienced project manager. The
project manager must perform many tasks throughout the life-cycle including: working with the
customer to define the scope and goals of the project; communicating the scope, goals, and progress
to the project team; allocating the correct staff and technical resources to the specific life-cycle
phases; ensuring the team meets deadlines and milestones; establishing a plan if tasks must slip,
motivating and mentoring the team throughout the life-cycle process to ensure that the team remains
focused and performing at their optimum level; and ensuring the project stays within its financial and
time constraints.
2
-------
At the beginning of every IT project, the project manager must work with the client to define the
scope of the project. The scope will define overall project timeframes, due dates, the responsibilities
of each team member involved with the project, and expected costs. In addition, the scope will
provide team members with a detailed document that can be referred to throughout the project life
cycle for a clear definition of client expectations and deliverables. The project manager will also
ensure that the team understands the performance measures that will be used to determine success.
Finally, the scope may include specific decision points in the project where requirements may
change and decision makers will need to meet to arrive at a consensus.
The project manager will also work with the client throughout the project to establish and implement
a clear project management approach. This approach will be based on the needs of the client and
the strengths and weaknesses of the team. It will allow team members to understand
expectations, as well as how to communicate progress, identify potential issues, and mitigate risks.
Understanding the expectations of the client and identifying the potential risks of the project up front
will allow the team to plan ahead and eliminate obstacles before they arise.
The project manager will also work in conjunction with the life-cycle team and the client to develop
a release schedule. This release schedule will include milestones at which requirements will be
gathered, reviewed, and finalized; development will take place; and versions of the application will
be released for testing and defects addressed. The schedule will also include a checkpoint
for client review and approval of the system and a proposed system release date.
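As a rough illustration only, a release schedule of this kind can be recorded as a list of dated milestones with explicit client sign-off points; the milestone names and dates below are invented for the example, not taken from any EPA project.

# Illustrative release schedule: each milestone records a life-cycle phase,
# a due date, and whether client sign-off is required at that point.
from dataclasses import dataclass
from datetime import date

@dataclass
class Milestone:
    name: str
    phase: str          # "requirements", "development", "test", or "release"
    due: date
    client_signoff: bool = False

release_1 = [
    Milestone("Requirements finalized", "requirements", date(2001, 6, 1), True),
    Milestone("Build delivered to test", "development", date(2001, 7, 15)),
    Milestone("Defects resolved, regression complete", "test", date(2001, 8, 10)),
    Milestone("Client review and approval", "release", date(2001, 8, 20), True),
    Milestone("Production release", "release", date(2001, 9, 1)),
]

for m in release_1:
    flag = " (client sign-off)" if m.client_signoff else ""
    print(f"{m.due}  {m.phase:<12} {m.name}{flag}")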
A strong project manager, a clearly defined scope, and an effective management approach promote
project success by ensuring that all team members understand the project goals and are working
toward the same expected end point.
Requirements
Requirements define the needs of the user and what functionality will be needed in the system to
accommodate those needs. The role of the requirements analyst is integral to effective system
development and maintenance as the system requirements drive all other phases of application
design. As such, the success of an information system and the quality of the system's data is
contingent upon the effectiveness of the requirements phase. To ensure success, the requirements
analyst must be familiar with the scope and intent of the system that is being developed. The reason
for building an information system is to track and retrieve data for specific purposes. The
requirements analyst must have a firm understanding of the data needs and the ways in which the
data are used. To fully understand these needs, the requirements analyst must know the workflow
processes and programmatic needs and challenges facing the user community.
In order to gain this knowledge, the requirements analyst is responsible for leading application
development/requirements sessions or interviewing and working with the various business units and
the user community to understand the business function. Individuals designated to give requirements
will include a representative sample of the user community, the system owner and other interested
stakeholders. During requirements gathering sessions, the requirements analyst must be an effective
3
-------
facilitator. Participants need to clearly understand the purpose and scope of the system so that they
can focus on the specific functional requirements that are integral to accomplishing the system's
purpose. The process by which requirements will be collected, reviewed, and approved must also
be established and clearly communicated. For example, a designated group of individuals with the
final authority to make decisions when there is disagreement must be identified. Milestones or
decision points must be established where this group will review and approve of the requirements.
As requirements are collected and analyzed, the requirements analyst must provide documentation
of these requirements that is clear, concise and testable. To ensure that these requirements are of the
highest caliber, a thorough quality assurance process must be in place. Both the test and
development teams must review the documentation to ensure that the requirement is unambiguous,
comprehensive, and can be met by the technology. This team review is designed to ensure that
inconsistencies and omissions in the requirements are addressed before development begins,
avoiding costly and complicated redevelopment. Once the internal team review is complete, the
requirements must be reviewed and approved by the individuals who gave requirements input and
by the designated decision makers.
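One hedged illustration of this review workflow is to carry each requirement with a testability flag and explicit review sign-offs, so that a requirement is released to development only after every review has passed. The field names below are assumptions made for the example, not a prescribed format.

# Illustrative requirement record: ready for development only after the test
# team, development team, and stakeholders have all reviewed and approved it.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str
    text: str
    testable: bool = False
    reviews: dict = field(default_factory=lambda: {
        "test_team": False, "dev_team": False, "stakeholders": False})

    def ready_for_development(self) -> bool:
        return self.testable and all(self.reviews.values())

r = Requirement("REQ-042",
                "The system shall require a completion date before a record "
                "can be marked closed.",
                testable=True)
r.reviews.update(test_team=True, dev_team=True, stakeholders=True)
print(r.req_id, "ready for development:", r.ready_for_development())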
Establishing the requirements collection and review processes will protect data quality by ensuring
that the necessary data is collected and can be reported in a meaningful way. Users will understand
the intent and scope of the system and can focus their requirements by pinpointing key data and
functionality needed to meet the information needs for which the system is being built. Reviewing
workflow processes and procedures will help not only the requirements analyst and lifecycle team
to understand the purpose of the system, but it will also help the user community and stakeholders
to define and examine their business processes. Both the user community and the lifecycle team will
have a unified understanding of the scope, purpose and direction of the system. An organized and
well communicated approach leads directly to a purposeful system design.
Development
The development team provides the technical expertise required to build the information system.
As such, the developer must review the requirements, both for content and for the level of effort
required to implement them, to ensure that they are complete, concise, and feasible. The developer
should prepare a level-of-effort estimate so that the system owner will be aware of the complexity of
the requirements. Upon reviewing these requirements, the developer's responsibility is to alert the
requirements analyst to any additional questions or clarifications needed from the user that must be
addressed before the requirements can be implemented. Developers must also identify any software
limitations that should be communicated to the appropriate stakeholders. Once requirements are
reviewed and approved, the developer will document the technical approach that will be taken to
bring the requirements to fruition.
An organized approach for development will protect data quality by ensuring that a system is
produced according to clearly defined requirements. Key data will be tracked according to the
requirements and in the formats defined. Establishing checkpoints throughout the development
4
-------
process will allow defects that could lead to poor data, data omissions, or errors in various modules
to be identified.
Testing
The role of the test team is to ensure that a defect-free system is developed according to
requirements. To do this, the test team must review the requirements to ensure that they are clear,
concise, and testable. Based on the requirements, the test team must write test case scenarios that
document the state of the system before the test is executed, the process for performing the test
(e.g., the steps involved in data entry), and the expected results of the test. In documenting these test cases,
the test team should include testing scenarios on real life data entry processes that would be followed
by the user community. In this way, the test team will ensure that they identify any defects that
would be encountered by the user community during normal operations. Test case scenarios must
be reviewed by the requirements analyst to ensure that they are comprehensive and fully cover all
intricacies of the requirement.
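A test case scenario of this kind can be captured as a simple record of the initial system state, the steps to perform, and the expected result, with a pointer back to the requirement it verifies. The example below is hypothetical and only sketches the idea.

# Illustrative test case: documents the state of the system before the test,
# the data-entry steps to perform, and the expected result.
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str
    requirement_id: str       # traceability back to the requirement under test
    precondition: str
    steps: list
    expected_result: str

tc = TestCase(
    case_id="TC-107",
    requirement_id="REQ-042",
    precondition="An open record exists with no completion date.",
    steps=["Open the record",
           "Leave the completion date blank",
           "Attempt to mark the record closed"],
    expected_result="The system rejects the change and prompts for a date.",
)
print(f"{tc.case_id} traces to {tc.requirement_id}; expect: {tc.expected_result}")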
Test cases must be run on each function of the system and the results clearly documented. When the
outcome of a test case does not match the expected outcome, the tester is responsible for entering
defects and communicating their findings to the developer. When a defect is entered, it should
include the steps taken to reproduce the defect and detailed information about the nature of the error.
The test team will track and monitor the progress of the defects through to their resolution. To do
this, testers may need to obtain feedback from the requirements analyst to clarify the requirements.
The test team will work with the requirements analyst to certify that all requirements have been
correctly implemented and the system is ready for deployment. The test team's job in certifying that
a system is ready for deployment is to ensure that the system design exactly reflects the requirements
and that these requirements have been interpreted correctly by the developer.
A systematic test approach will promote data quality by ensuring that defects that may lead to
inaccurate data are identified and addressed before the system is placed in production. Testers also
provide an additional check to ensure that requirements are implemented exactly as written and no
interpretations or assumptions are made that are outside the scope of the requirement. Thus, through
testing, the system owner and user community is presented with a functional system that is designed
according to requirements.
Training
The role of the Training team is to provide a wide array of system expertise to all members of the
user community. The training team is expected to provide all users with a complete understanding
of the intent of the system, demonstrate how the system can be used in "real-life" situations,
communicate how the system can be used, and demonstrate how to use the information and data that
is entered. Trainers are expected to be fully knowledgeable about the application, supporting
materials, and all training processes in order to provide appropriate and comprehensive training to
all users. To do this, the training team must work closely with all other teams throughout the life-
cycle process to ensure that they understand the scope of the system, the requirements and business
5
-------
needs driving the development of the system. Trainers must also understand how requirements were
implemented (e.g., how the system was developed technically), and how requirements were verified
through test.
The training team will begin preparing for training long before a system is implemented. The
training team will do this because they will need to complete several steps before they can perform
training. First, the training team must identify the user community, or who will be using the system
once it is implemented. The training team must also identify the training or documentation needs
of that community. In other words, the training team must identify how the users will be trained (in
individual sessions, in group sessions, via video-conferencing, etc.) and what documentation they
need in order to support their day-to-day activities (e.g., Quick Reference Guides, on-line help).
The training team will then develop the course curriculum and materials for that user community and
their needs including training scripts, training outlines, activity sheets, and documentation materials.
At this point, the training team will work closely with the project team to perform practice training
sessions to ensure that all system information is being accurately communicated to the user. The
training team will work closely with the requirements team to ensure that the trainers have a
complete understanding of the business needs driving the system and that they understand why the
system was built the way it was. The training team will work closely with the development team to
ensure that the trainers have a complete understanding of how requirements were implemented.
Finally, the training team will work closely with the testing team in order to confirm that all
documentation is accurate and thorough.
A complete approach to training will ensure that users are provided with the resources and the
knowledge they need in order to accurately enter data into the system and/or get data from the
system. In addition, a complete approach to training will ensure that users understand what data is
being asked for and why. Understanding the information needs that lead to the creation of a system
or new functionality can lend legitimacy to the information system. Without a strong training
approach and documentation, the users will not be presented with the knowledge, or guidance they
need to gain a complete understanding of the goals of the system and how their roles affect overall
data quality.
Change Management
As organizations and their systems grow and the business needs driving requirements become more
complex, the need for a clear, concise, well-defined way to manage change is becoming more and
more important. Managing change is absolutely integral to assure high system and data quality and
to make the best use of team resources. It is very easy for the number of enhancement requests,
defect reports, and requirements issues to overwhelm a team that is not properly managed. This is
why it is important to have a change manager in place who can effectively facilitate the change
process. The change manager must ensure that changes are organized, reviewed, prioritized, and
implemented. In addition, the change manager must ensure that all changes are communicated to
the user community. The change manager must also work closely with the full life-cycle team to
ensure that there is a complete understanding of the change being requested and that it is effectively
implemented. In addition, the change manager must compare requested changes with the overall
6
-------
scope of the system to ensure that the system is evolving in accordance with the purpose for which
the system was developed. If a requested change is not part of the overall scope of the system, the
change manager will work with the client to determine if the scope needs to be updated or if the
change should be rejected.
When a change to the system is requested, the change manager will identify the change as an
enhancement, defect, or requirements issue. The change manager will then forward the change to
the appropriate team. For example, if the change is an enhancement, the change manager will
forward the issue to the requirements team who will work with the client to detail their enhancement
requirements. Once the enhancement is fully understood, the original requirements document will
be updated to include the new information and this new version will be forwarded to the developer
and test team for review and implementation. If the change is a defect, the change manager will
forward the issue to the testing team who will work with the development team to identify the defect
and determine what functionality conflicts with the original requirement. The change manager will
also work with the test team to ensure that all changes have been implemented correctly. The test
team will verify all new requirements and will re-test defects to ensure that they have been
addressed. Finally, the change manager will communicate the changes to the user community.
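The routing described above can be summarized in a few lines; the categories and receiving teams simply mirror the text and do not represent any particular change-tracking tool.

# Illustrative change routing: classify a request and forward it to the team
# that works it first, as described in the text.
ROUTING = {
    "enhancement": "requirements team",    # detail the enhancement with the client
    "defect": "testing team",              # reproduce and isolate with development
    "requirements issue": "requirements team",
}

def route_change(category: str) -> str:
    try:
        return ROUTING[category]
    except KeyError:
        raise ValueError(f"Unknown change category: {category}")

for cat in ("enhancement", "defect", "requirements issue"):
    print(f"{cat:>19} -> {route_change(cat)}")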
The change manager plays a very important role in ensuring data quality because the change manager
works directly with the user community to ensure that users have the data they need to support their
business requirements and to make accurate business decisions. As the data needed to make decisions
evolves, the change manager works with the user community to identify changes and with the project
team to ensure that the changes are implemented and communicated back to the user community.
User Support
The role of the user support or Help Desk team is to clarify user questions and issues. The goal of
the user support team is to accurately respond to all user issues and ultimately to support the building
of a knowledgeable user community that understands how the system's functions meet their business
needs.
The user support team will play a role very similar to the training team in that they must work closely
with all other teams throughout the life-cycle process to ensure that they understand the scope of the
system, the requirements and business needs driving the development of the system, how
requirements were implemented during development, and how requirements were verified through
test. In addition, the user support team will work closely with the testing team to understand any
defects that may exist in the system, when they will be fixed, and if there is a "work-around" for the
user. The user support team will also have to work with the change manager to understand any
changes that are planned in an upcoming system release or that recently have been made to the
system in order to answer any questions from confused users about new or planned functionality.
Once a system is implemented, the user support team, with help from the client, will determine the
extent to which the user support team will be available to the users. For example, some user support
teams provide around-the-clock support, whereas other teams provide only a few set hours of support
a day. In addition, the user support team will work with the client to determine what methods the
7
-------
users will adopt to communicate with the user support team and what methods the team will use to
respond. Phone, e-mail, and in-person visits are all potential options. Once the user support team
is operational, they will address questions and issues as they are identified from the user community
and either answer them directly, or forward them to another team member (i.e., tester, requirements,
trainer) to answer. The user support team is responsible for following up on all issues to ensure
closure and to ensure that the user was satisfied with the response they received. In addition, based
on the question or issues the user support team received, they may recommend that documentation,
standard operating procedures, or training sessions be updated to better address that particular issue.
The user support team will ultimately support data quality by ensuring users have an accurate
understanding of how to get data into the system and how to get the data they need out of the system.
In addition, the user support team provides the user with a contact point once the training team has
finished training. Providing this point of contact will make the users feel supported and ensure there
is a resource to answer any questions that may arise.
Effective management of information systems allows system owners and the user community to
make informed decisions about which data is integral to the fulfillment of the organization's mission.
A thoughtful examination of the workflow processes, user needs, and available technology allows
organizations to create systems that achieve a defined goal and report accurate information.
Understanding the roles and responsibilities of the players involved in creating an information
system is key to successful system implementation and maintenance. Only through effective
management during requirements gathering, development, testing, user support, training and change
management can the integrity of a system's data be protected.
8
-------
DOE's QUALITY SYSTEM PROGRAM:
COOPERATIVE DEVELOPMENT AND IMPLEMENTATION
Dave Bottrell, U.S. Department of Energy, Germantown, Maryland
Mary Verwolf, U.S. Department of Energy, Idaho Falls, Idaho
Abstract —Implementation of a Quality Systems approach to making defensible
environmental program decisions depends upon multiple, interrelated components.
Often, these components are developed independently and implemented at various
facility and program levels in an attempt to achieve consistency and cost savings. The
U.S. Department of Energy Office of Environmental Management (DOE-EM) focuses on
three primary system components to achieve effective environmental data collection and
use:
• Quality System guidance, which establishes the management framework to plan, implement,
and assess work performed;
• A standardized Statement of Work for analytical services, which defines data generation and
reporting requirements consistent with user needs; and
• A laboratory assessment program to evaluate adherence of work performed to defined needs,
e.g., documentation and confidence.
This paper describes how DOE-EM fulfills these requirements and realizes cost savings
through participation in interagency working groups and integration of system elements as
they evolve.
Introduction
The Office of Safety, Health and Security (EM-5) within the DOE Office of Environmental
Management (DOE-EM) is responsible for establishing policy and guidance related to analytical
service activities. These activities include systematic planning, sample collection and analysis,
performance evaluation programs, data validation, and laboratory assessment. To support these
activities at the DOE Headquarters level, EM-5 established the Data, Decision, and
Documentation (3D) Program. The 3D Program enables DOE Field and Program Offices to
increase value and reduce risk from the $300-$600 Million spent annually on environmental data
collection required to support decisions regarding health and safety, environmental restoration,
packaging, waste management, stewardship, and transportation. Within this general mission, the
3D Program works cooperatively with internal and external organizations to improve regulatory
and internal program acceptance and to leverage DOE-EM technical and management resources.
To meet these broad objectives, DOE-EM, through the 3D Program, is working to develop policy,
guidance, and tools to establish the required components of a cost-effective and technically sound
Quality System. In cooperation with the DOE National Analytical Management Program
(NAMP), external federal organizations and Field Office contacts, the 3D Program has
developed products in three specific areas: Quality Systems guidance, standardized laboratory
1
-------
contracting, and audit consolidation.
Quality Systems Guidance
Quality Systems are required by DOE Order 414.1 and various other regulatory drivers.
Generally, these systems are based on ANSI/ASQC E4, Specifications and Guidelines for
Quality Systems for Environmental Data Collection and Environmental Technology Programs.
DOE-EM, through EM-5, participated as a consensus member of the Intergovernmental Data
Quality Task Force which developed the Uniform Federal Policy for Implementing
Environmental Quality Systems (UFP). The UFP outlines essential elements of a Quality System
specific to managing environmental data collection and use and environmental technology
programs. The development of the policy is a joint initiative among the U.S. Environmental
Protection Agency (EPA), the Department of Energy (DOE), and the Department of Defense
(DoD). The objectives of developing a policy that applies to all federal environmental
programs are to address real or perceived inconsistencies and/or deficiencies in current
environmental data collection practices and to improve processes to generate environmental data
of known, documented quality that is suitable for the intended use. The benefits of consistent
Quality Systems across federal agencies include:
• Improved effectiveness of federal environmental programs by focusing on results, quality of
data and services, and customer satisfaction;
• Clarification of roles and responsibilities in managing and overseeing environmental data and
environmental technology programs;
• Sufficient confidence in the systems such that duplication of oversight efforts is minimized;
and,
• Enhanced accountability and public confidence in environmental decisions.
The EPA Office of Solid Waste and Emergency Response is adopting the UFP and has provided
a written request to DOE-EM to join the EPA and DoD in implementing a Quality System
consistent with the UFP. The UFP was issued by the Assistant Secretary of Energy for
Environmental Management to the DOE complex as publication DOE/EM-0556 in January
2001. DOE site operations personnel are reviewing current data collection and use practices to
achieve consistency with UFP requirements.
Standardized Laboratory Statement of Work (SOW) for Analytical Services
Historically, procurement of laboratory services has been conducted at the site level through the
local Sample Management Office or through individual projects. This approach necessitated the
development of site-specific performance criteria and solicitations for commercial analytical
services. In spite of the fact that DOE-EM is one of the largest federal buyers of testing services,
this buying power was not being leveraged at the national level. Additionally, the lack of
complex-wide performance criteria contributed to redundancy in auditing and other quality
2
-------
assurance activities. Although the necessity to address site-specific internal and regulatory
requirements is basic to analytical services, most aspects of a technical statement of work are
relatively consistent (e.g., standard radioanalytical protocols or EPA reference methods).
Recognizing this fact, DOE embarked on efforts to streamline analytical services procurement
through the development of a standardized complex-wide statement of work. To achieve this
objective, an Integrated Contractor Procurement Team (ICPT) formed by DOE federal and
contractor personnel developed a consensus model to standardize laboratory requirements.
Through the use of a Basic Ordering Agreement (BOA) for analytical services, standard
requirements and technical criteria are established. The system also has built-in flexibility, in
that site-specific details may be added as dictated by regulatory requirement or other special
concerns. This feature avoids the pitfalls of the "one size fits all" approach.
As a component of DOE-EM's Quality Systems, implementation of the Standard Statement of
Work enhances quality and efficiency in the following ways:
• Value of consensus technical expertise provided by the ICPT and available to DOE's smaller
facilities;
• Cost avoidance, improved quality, and process improvement from standardized auditing and
shared reports provided by the consolidated audit program;
• An efficient mechanism to introduce specific quality improvements, e.g., participation and
acceptable performance in external performance evaluation programs; and
• Simplified procurements because negotiating the basic agreement and multiple pre-award
laboratory assessments are not necessary.
Approximately 20 commercial laboratories have entered into the BOA with DOE sites. The
approach has proven effective, and DOE is considering the development of requirements for
radiobioassay and industrial hygiene testing services. This broadened scope will allow DOE to
further leverage its buying power while at the same time enhancing data quality through the
standardization of technical requirements and performance criteria.
Environmental Management Consolidated Audit Program (EMCAP)
Commercial analytical laboratories provide the bulk of the testing data used by DOE-EM for
critical environmental decision making. To ensure that these decisions are made with data that is
of known, documented quality, virtually all DOE-EM Operations/Field Offices conduct
laboratory audits. Historically, each Operations/Field Office performed laboratory audits
according to site specific requirements. In some cases, audits of certain laboratories were
conducted by multiple programs and contractors from the same DOE facility. This led to a
redundant, and therefore costly, approach to auditing. These inefficiencies were documented by
the DOE Office of the Inspector General (IG) in the report "Audit of the Department of Energy's
Commercial Laboratory Quality Assurance Evaluation Program (DOE/IG-0374)." This report
3
-------
also highlighted that DOE had not established uniform criteria for the evaluation of commercial
laboratories. To address these issues, a working group was formed with the goal of developing a
consolidated audit program. The Environmental Management Consolidated Audit Program
(EMCAP) was initiated in early 2000 and is based on procedures and audit checklists developed
through the evaluation of each site-specific audit program. We combined the most effective
features of each site's activities to form the basic structure. Representatives from multiple DOE
sites conduct EMCAP audits. The DOE complex, laboratories, and potentially stakeholders share
audit reports and corrective action plans via a web-based data system. The main objectives of the
program are to:
• Determine laboratory ability to generate and document data that is technically defensible and
consistent with defined requirements;
• Facilitate sharing of audit results across the DOE complex and potentially with regulators to
reduce program costs and potential risk from use of unacceptable laboratory data; and,
• Avoid unnecessary costs and improve the value of laboratory audits through establishing a
consistent, controlled process.
EMCAP is managed by the DOE Oak Ridge Operations Office. In FY 2000, the program
completed 19 audits, with over 40 currently scheduled for FY 2001.
Cost Benefits of Quality Systems Implementation
Although continuous improvement is the primary focus, implementation of complex-wide
quality initiatives such as the Standard Statement of Work and the Consolidated Audit Program
also avoids unnecessary costs and program delays. In the case of the standardized statement of work,
we avoid or minimize administrative and technical procurement resource requirements. For
example, the DOE Oak Ridge Operations Office estimated its one-time savings from avoiding
costs for contract administration at $150,000. The DOE Oakland Operations Office saved 2-3
months of technical and administrative time for one contract through the BOA but did not
estimate specific costs.
EM-5 recently completed a study to evaluate relative costs for various contract mechanisms. The
data suggest that Basic Ordering Agreements achieve analytical service costs roughly 30% lower
than those achieved by fixed-unit-price Indefinite Delivery/Indefinite Quantity contracts. Three Office of Closure
facilities (Rocky Flats Field Office, Ohio Field Office, and Oak Ridge Operations) that have
actively worked to standardize and implement an analytical services Statement of Work have
achieved the lowest cost analytical services across DOE.
DOE-EM is in the process of estimating projected and actual cost saving from the consolidated
audit program. An inherent difficulty is identifying the number, type, and cost of the audits of
environmental laboratories conducted by EM. The previously cited Inspector General Audit
(DOE/IG-0374) reported that 50% of DOE's audits of commercial laboratories were redundant
4
-------
(103 of 206). This percentage could escalate because the laboratory community is becoming
more competitive and shrinking, i.e., the DOE complex has fewer laboratories to consider,
increasing potential for redundancy. For example, if certain laboratories recognized for high
technical quality were audited by six Field Offices, excess costs of > $50,000 per commercial
lab (5 redundant audits X $11,500/audit) would be incurred. In FY 2000, 87 audits were
identified and nearly half were redundant ($400,000 unnecessary cost). As described earlier, the
EMCAP will meet current audit needs of participating Offices with less than 50 audits (compared
to >200 audits historically performed). This represents a cost avoidance > $1 million and likely
approaching $2 million. The primary audit costs are time and travel, both of which are minimized by
using audit teams located in the same geographical area as the laboratory, a situation
impossible in the case of site-specific audits.
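The arithmetic behind these estimates is easy to reproduce. The sketch below simply restates the figures cited in the text (roughly $11,500 per audit, six Field Offices auditing the same laboratory, and a drop from more than 200 historical audits to fewer than 50 under EMCAP).

# Reproduce the cost-avoidance arithmetic cited in the text.
COST_PER_AUDIT = 11_500  # approximate cost of one laboratory audit (USD)

# A laboratory audited by six Field Offices: five of the six audits are redundant.
redundant_audits = 6 - 1
print("Excess cost per laboratory:", redundant_audits * COST_PER_AUDIT)   # 57,500

# Historical baseline (>200 audits) versus the consolidated program (<50 audits).
historical, consolidated = 206, 50
audits_avoided = historical - consolidated
print("Audits avoided:", audits_avoided)
print("Approximate cost avoidance:", audits_avoided * COST_PER_AUDIT)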
Conclusion
The Department of Energy, including the Office of Environmental Management, is committed
and mandated to maintain a Quality System solution to facilitate, document and defend
environmental decisions. This paper focuses on the Quality System infrastructure and specific
programs to define laboratory requirements and assess adherence to these requirements. The
DOE Standard Statement of Work and Consolidated Audit Program are two elements that have
recently reached an implementation phase. More information on participation, contacts, and
specific audit materials may be found on the Internet at http://www.em.doe.gov/safetyhealth/3d/.
References
Audit of the Department of Energy's Commercial Laboratory Quality Assurance Evaluation
Program, U.S. Department of Energy Office of the Inspector General, Report Number DOE/IG-
0374, June 1995.
Intergovernmental Data Quality Task Force, Uniform Federal Policy for Implementing
Environmental Quality Systems: Evaluating, Assessing and Documenting Environmental Data
Collection/Use and Technology Programs, Interim Final Version 1, DOE Publication
DOE/EM-0556, November 2000.
Specifications and Guidelines for Quality Systems for Environmental Data Collection and
Environmental Technology Programs, American National Standard ANSI/ASQC E4-1994.
5
-------
THE RADIOCHEMIST'S ROLE IN THE QUALITY EVALUATION AND
ASSESSMENT OF RADIOLOGICAL DATA IN ENVIRONMENTAL
DECISIONMAKING
Svetlana Bouzdalkina, Research Institute of Radiology, Gomel, Republic of Belarus; Raymond J. Bath Ph.D.,*
US DOE/Environmental Measurements Laboratory, NY, NY; Pamela D. Greenlaw, US DOE/Environmental
Measurements Laboratory, NY, NY; and David Bottrell, US DOE/EM-5, Germantown, MD.
"This report was prepared as an account of work sponsored by an agency of the United States Government and the Republic
of Belarus. Neither the United States Government nor the Republic of Belarus nor any agency thereof; nor any of their
employees, makes any warranty, express or implied, or assumes any legal liability or responsibility for the accuracy,
completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not
infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name,
trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or
flavoring by the United States Government or the Republic of Belarus or any agency thereof! The views and opinions of
authors expressed herein do not necessarily state or reflect those of the United States Government or the Republic of Belarus
or any agency thereof
Abstract — The quality evaluation and assessment of radiological data is the final step in the
overall environmental data decision process. This quality evaluation and assessment
process is performed outside of the laboratory, and generally the radiochemist is not
involved. However, with the laboratory quality management systems in place today, the
data packages of radiochemical analyses are frequently much more complex than the
project/program manager can effectively handle; with little involvement from radiochemists
in this process, the potential for misinterpretation of radiological data is increasing.

The quality evaluation and assessment of radiochemistry data consists of making three
decisions for each sample result, bearing in mind that the laboratory reports all the data
for each analysis along with the uncertainty in each of these analyses. At the data evaluation
and assessment point, the decisions are: is the radionuclide of concern detected (each data
point always has a number associated with it); is the uncertainty associated with the result
greater than would normally be expected; and, if the laboratory rejected the analysis, are
there serious consequences for other samples in the same group? The need for the
radiochemist's expertise in this process is clear. Quality evaluation and assessment requires
the input of the radiochemist, particularly in radiochemistry, because of the lack of
redundancy in the analytical data. This paper will describe the role of the radiochemist in
the quality assessment of radiochemical data for environmental decision making.
Introduction:
The goal of the environmental data collection process is to produce quality, credible, and cost-effective data to
support the decision-making process. The data collection process can be divided into the stages of planning,
sampling, analysis, verification, validation, assessment, and use (Figure 1). Even though these stages have unique
requirements for the determination of radionuclides in the environment, radiochemists are not usually involved in
any of the stages except the laboratory analysis. The reasons for the lack of radioanalytical support are complex, but
usually, since the evaluation and assessment of radiochemical data is performed outside of the laboratory, input from
the radiochemist is not available. With the laboratory quality management systems in place today, the data
packages of radiochemical analyses are frequently much more complex than the project/program manager can
1
-------
effectively handle and without radiochemist support the potential for misinterpretation of radiological data is
increasing.
Radiochemistry data is distinctly different from other analytical data since each measurement always: 1)
results in a number; 2) has an associated uncertainty; and 3) is reported with an MDA (minimum detectable
activity). In order for a quality evaluation and assessment of this type of data to be accomplished, three distinct
decisions must be made for each sample result. They are: 1) Is the sample result greater than background? 2) Is the
uncertainty in the sample result normal? and 3) Is the sample result so uncertain that the data point must be
rejected? These distinct analytical decisions can be provided throughout the entire data collection process if the
program manager utilizes input from the radiochemist. This paper demonstrates the role for the radiochemist in all
the stages of the environmental data collection process and demonstrates that role for a quality evaluation and
assessment of radionuclide data.
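These three decisions can be expressed compactly. The sketch below is illustrative only: it treats the reported result and its one-sigma combined standard uncertainty as the inputs, uses a conventional 1.645 multiplier for the one-sided detection decision, and uses arbitrary thresholds for "larger than expected" and "unusable" relative uncertainty; in practice these values would be set as MQOs during directed planning.

# Illustrative evaluation of a single radiochemistry result. The multiplier
# k = 1.645 and the relative-uncertainty thresholds are assumptions for this
# sketch; actual values would be established during project planning.
def evaluate_result(activity, u_sigma, k=1.645,
                    abnormal_rel_u=0.30, reject_rel_u=1.00):
    detected = activity > k * u_sigma           # 1) above background?
    rel_u = abs(u_sigma / activity) if activity else float("inf")
    abnormal = rel_u > abnormal_rel_u           # 2) uncertainty larger than expected?
    rejected = rel_u > reject_rel_u             # 3) too uncertain to use?
    return {"detected": detected,
            "abnormal_uncertainty": abnormal,
            "rejected": rejected}

# Example: 0.52 Bq/kg reported with a 0.10 Bq/kg standard uncertainty.
print(evaluate_result(0.52, 0.10))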
The Environmental Data Collection Process
An efficient environmental data collection activity depends on a series of well thought out, logical steps. These steps
address what data need to be collected and how the collected data will be assessed for usability to support the decision-
making process. For radiochemical investigations, more emphasis is placed on the individual determination. It should
be emphasized that the integration of a chemist at each step is essential for quality use of the data. The data collection
activity is presented here in stepwise sections for ease of illustrating the chemist's role in the data collection process and
is not meant to be a full treatise on the entire process. The chemist's role in each step is described below.

The process begins with a directed planning step, which assures that sufficient planning is carried out to adequately
define a problem, to determine its importance, and to develop an approach to solutions prior to spending resources.
Essentially, each process presents a stepwise approach that includes a planning, implementation, and assessment phase
in each stage. These stages are all interconnected so that the assessment phase of one step is in reality the first part of
the planning phase of the next step. These interconnected steps will allow a complete environmental data collection
program to emerge that will meet the program needs and, in effect, be the "quality evaluation and assessment" of the
program.
1) Directed Planning
Directed planning is the foundation of the data collection process. This planning process follows the Data Quality
Objectives (DQO) program. The radiochemist should participate in the initial planning for the project to offer input on
the adequacy of existing radiological data to determine the need for further sampling, the radionuclides of concern, and
expected concentrations. The chemist ensures that the proper radionuclides are selected, Measurement Quality Objectives
(MQOs) are clearly defined, methods of analysis are adequate and meet the objectives, and that the number of sample
results will enable the program manager to meet the stated goals of the process.

The radiochemist's role includes developing an appropriate quality system that is capable of implementing the quality
controls and the quality assurance necessary for success. The quality assurance system will oversee the implementation
of Quality Control (QC) samples, documentation of QC sample compliance or non-compliance with MQOs, audits,
surveillances, performance evaluation sample analyses, corrective actions, quality improvement, and reports to
management. The documentation generated by these quality assurance activities and their outputs during project
implementation will be a key basis for subsequent assessments and data usability decisions.
2) Sampling
Sampling includes all the activities up to and including the taking of a sample for shipment to a laboratory. The
radiochemist provides input for contract negotiations, sample design, and sample handling, including sample preservation,
sample container requirements, compositing, and subsampling. The radiochemist prepares the Statement of Work for
2
-------
the laboratory analyses including the Limits of Detection, sampling and laboratory quality control activities, and all
required data deliverables.
3) Analysis
Analysis includes all the activities that result in the production of the data package.
The radiochemist's role is to provide input to all of these activities, from the field Chain of Custody to analysis quality
control activities to final data output. The radiochemist provides a quality oversight role that includes review of and
consultation on the methods used, especially for development and approval of a performance-based measurement system
(PBMS), instrument calibration, laboratory and matrix interferences, etc. The radiochemist should be in
communication with the program manager and be responsible for any changes required by the program.

The last three steps in the data collection process are collectively known as the assessment phase. These steps all
require a radiochemist's expertise. The radiochemist's role should include technical input to the Verification and Validation
steps and support for data Assessment to ensure that the data meet the needs of the program.
4) Verification
Verification assures that laboratory conditions and operations were compliant with the statement of work (SOW) and project
plan documents. The chemist's role is to verify that the data package delivered by the field or laboratory meets the requirements
(compliance) that were outlined in planning, to check for consistency and comparability of the data throughout the data
package, that the QC parameters were within limits, the correctness of basic calculations and the data behind them, and the
completeness of the results to ensure all necessary documentation is available. The primary function of the radiochemist
should be to apply appropriate feedback to the laboratory, resulting in corrective action or a recommendation that the project
planning process be revisited.
5) Validation
The validation of the data addresses the issues of reliability and uncertainty of the data. The role of the radiochemist
is to provide input from a review of the verification report and laboratory data package to identify its areas of strength
and weakness and to apply qualifiers to the data. These qualifiers reflect the impact of not meeting
the MQOs and can result in the entire data set being sent back through the planning process.

The radiochemist has a unique role in the validation process by being able to evaluate the data to determine the presence
or absence of a radionuclide and the uncertainty of the measurement process. During this validation, the technical
reliability and the degree of confidence in reported analytical data are presented.
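As a further illustration, a validator applying qualifiers might flag each result against the project MQOs. The flag letters below follow common usage (U for not detected, J for estimated, R for rejected), but the decision rules themselves are invented for this example and are not a prescribed validation procedure.

# Illustrative validation qualifiers; the thresholds are assumptions, and a
# real project would derive them from its MQOs.
def qualify(activity, u_sigma, k=1.645, estimate_rel_u=0.30, reject_rel_u=1.00):
    if activity <= k * u_sigma:
        return "U"               # not detected above the critical level
    rel_u = u_sigma / activity
    if rel_u > reject_rel_u:
        return "R"               # too uncertain to use
    if rel_u > estimate_rel_u:
        return "J"               # usable, but treated as an estimate
    return ""                    # no qualifier needed

for act, u in [(0.05, 0.10), (0.52, 0.10), (0.30, 0.12)]:
    print(act, u, "->", qualify(act, u) or "unqualified")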
6) Assessment
Assessment is the last phase of the data collection process and consists of a scientific and statistical evaluation of
project-wide knowledge to assess the usability of data sets. The radiochemist compares the data produced with the
planning documents and any other analytical process requirements that were developed in the planning process. To
assess and document overall data quality and usability, the chemist assists the data quality assessor to integrate the
validation report, field information, assessment reports, and historical project data, and compares the findings to the
original project planning documents. The DQA process uses the combined findings of these multi-disciplinary
assessments to determine data usability for the intended decisions, and to generate a report documenting usability and
the causes of any deficiencies. It may be useful for a validator to work with the assessor to assure the value of the
validation process (e.g., the appropriateness of rejection decisions) and to make the process more efficient.
3
-------
The assessment of data requires the input of a radiochemist in terms of sampling and analytical MQOs, QC sample data
(e.g., % yield), and compliance with specifications and requirements (e.g., required combined standard uncertainty).
If these records are missing or inadequate, then compliance with analytical protocol specifications, including the MQOs
which were identified during the planning phase, will not be ascertainable, which will raise questions regarding the
quality of the data.
7) Use
The radiochemist's role in the development of these steps during the directed planning process will increase the likelihood
that the appropriate documentation will be available for regulatory and other program-related activities. Documentation
and record keeping during the environmental data collection process are essential to subsequent data verification, data
validation, and data quality assessment. Thorough documentation will allow for a determination of data quality and data
usability objectives.

It is important to note that the radiochemist is uniquely qualified to perform the technical roles in the environmental data
collection process. Radiochemists will provide expertise in radiation/nuclide measurement systems and the knowledge
of the characteristics of the analytes of concern to evaluate their fate and transport. The radiochemist will also provide
knowledge about sample transportation issues, preparation, preservation, sample size, subsampling, available analytical
protocols, and achievable analytical data quality. The use of a radiochemist will ensure the effective use of resources
available to the project.
4
-------
[Figure: Environmental Data Collection Process. Flow diagram of the directed planning process leading through sampling, analysis, data verification, validation (uncertainty), and assessment to use of the data, with plan/implement/assess steps at each stage and feedback paths that may lead back to further planning.]
5
-------
RAISING THE CURTAIN ON THE GRAY REGION
Cliff J. Kirchmer, Agency QA Officer, Environmental Assessment Program
Washington State Department of Ecology
Stewart M. Lombard, Program QA Coordinator, Environmental Assessment Program
Washington State Department of Ecology
Abstract - The gray region in EPA Document QA/G-4 is defined as the range of possible
parameter values near the action level where the cost of determining that the alternative
condition is true outweighs the expected consequences of a decision error. EPA
Document QA/G-4HW clarifies that during the planning stage the action level is based
on an ideal decision rule, while during the assessment stage an operational decision rule
is used.
This paper analyzes the factors that define the gray region and the action level,
including the errors of the first kind (α) and second kind (β) and the number of samples
taken to determine the mean result. The relationship between the Decision Performance
Curve presented in EPA QA/G-4 and the statistical power curve is also discussed. The
statistically derived critical level is identified as the concentration of importance for
decision-making. The action level is defined in terms of the critical level so that its value
is consistent for decisions made during both planning (a priori decisions) and
assessment (a posteriori decisions).
This paper is a result of our effort to understand the statistical basis for the Decision Performance
Curves and Decision Performance Goal Diagrams described in EPA documents QA/G-4 and
QA/G-4HW. The Decision Performance Goal Diagram, which approximates the Decision
Performance Curve, is presented by EPA as a means of stipulating your tolerable risks of
decision errors and communicating them to others.
Figure 1 is an example of a Decision Performance Curve (DPC) which is in both QA/G-4 and
QA/G-4HW. Both an ideal DPC and a realistic DPC are illustrated. The ideal DPC corresponds
to the probabilities of decisions made if there were no random error and the realistic DPC
corresponds to the real world situation where decisions must be made against a smokescreen of
random error. It is assumed that systematic error (bias) has been controlled and only random
error is considered in decision-making. It is important to note that the Action Level specified in
Figure 1 corresponds to the idealistic DPC. The Action Level is defined as being either fixed
standards (e.g. drinking water standards or technology-based standards) or investigation-specific
(e.g. background standards or specific risk-based standards). In the real world, because of
random error, one cannot have realistic action levels set equal to the standards.
Thus, it must be understood that the Action Level identified in Figure 1 is to be used only in the
planning stage when considering an ideal DPC. This can be a source of confusion if one
1
-------
identifies the ideal Action Level used during project planning with the realistic Action Level that
must be used during project assessment.
Figure 2 is an example of a DPC overlaid on a Decision Performance Goal Diagram (DPGD),
taken from EPA QA/G-4HW. A DPGD is a graphical representation of the tolerable risks of
decision errors, and is used in conjunction with a Decision Performance Curve. DPGDs, like
DPCs, are planning tools and specify theoretical Action Levels, which can also be a potential
source of confusion.
The DPGD includes a gray region. One boundary of the gray region is located at the action level
based on the theoretical decision rule and the other boundary is located at the true concentration
value at which the consequences of a false acceptance decision error are considered significant
enough to set a limit on the probability of it occurring.
EPA's approach to specifying tolerable limits on decision errors and the subsequent assessment
of the data provide good mechanical step-by-step procedures that do not require a detailed
understanding of statistics. But for some, an understanding of the statistical basis for decision-
making, including the differences between action levels during planning and assessment phases
and the meaning of the gray region and its boundaries, may help to give more confidence in the
process.
To understand the use of DPCs as planning tools, it helps to understand that each point on the
curve corresponds to a true value for the parameter, and that each true value can be thought of as
having an associated underlying population distribution.
Similarly, the boundary values of the DPGDs have associated underlying population
distributions. Recognition of these underlying population distributions can help to understand
the meaning of the gray region. The population distributions associated with the true values at
the boundaries of the gray region are of particular importance in understanding decision errors.
Figure 2 corresponds to the baseline condition where the parameter exceeds the action level. In
the case of site cleanup, this could be described as a baseline condition of "The Site is Dirty".
From a statistical point-of-view this can be understood as corresponding to a null hypothesis that
the parameter equals the Action Level (H0: μ = A.L.) with the alternative hypothesis being that the
parameter is less than the Action Level (Ha: μ < A.L.). The values of α and β shown in
-------
Figure 3 are the same (0.05), while in Figure 2, α = 0.05 but β = 0.1. The values represented on
the distribution curves are not the true values, but rather the estimated probability of obtaining
those values for the population under study when the true values are either 80 or 100.
During planning, in addition to needing an estimate of the population standard deviation for
individual results, decisions must be made regarding tolerable false acceptance and false
rejection decision error rates, as well as the minimum detectable difference (equal to the width of
the gray region). These planning decisions will determine how many samples need to be taken,
analyzed, and averaged to obtain a result. This is an iterative process, since cost also needs to be
taken into account. The first estimate of the number of samples needed may exceed the available
budget, and in that case a compromise will need to be made regarding the tolerable decision error
rates and minimum detectable difference in order to lower the cost of sampling.
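The sketch below illustrates this planning calculation using the common z-approximation for a one-sided test; EPA guidance formulas may add a small correction term, and the numbers shown (standard deviation, gray-region width) are illustrative only.

    from math import ceil
    from scipy.stats import norm

    def required_samples(sigma, delta, alpha=0.05, beta=0.10):
        """Approximate number of samples (z-approximation) so that a one-sided
        test with false rejection rate alpha and false acceptance rate beta can
        resolve a difference delta (the width of the gray region)."""
        z_a = norm.ppf(1 - alpha)
        z_b = norm.ppf(1 - beta)
        return ceil(((z_a + z_b) * sigma / delta) ** 2)

    # Illustrative values: sigma = 25 ppm, gray region 80-100 ppm (delta = 20)
    print(required_samples(sigma=25, delta=20))  # about 14 samples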
As Figure 3 illustrates, during assessment only the false rejection error (a) is used for deciding
whether to reject the null hypothesis (e.g. in the case of site cleanup, to reject the null hypothesis
that the site is dirty). The decision point for the chosen value of a is called the critical level. As
an alternative to designating the Action Level at the same concentration as the standard being
used, one could designate the critical level as the Action Level. The critical level would be
estimated during the planning stage based on the estimated standard deviation and chosen levels
for α, β, and the minimum detectable difference (gray region). During the assessment stage the
critical level would be calculated based on the t-test, using the data obtained during project
implementation. The advantage of this approach would be to have a consistent Action Level
during planning and assessment, avoiding the need for theoretical and operational Action Levels.
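A minimal sketch of that critical level calculation, assuming a one-sample, one-sided t-test with the null hypothesis that the mean equals the Action Level; the numbers continue the illustrative example above.

    from math import sqrt
    from scipy.stats import t

    def critical_level(action_level, s, n, alpha=0.05):
        """Sample-mean concentration below which H0 (mu = action_level) is
        rejected in favor of Ha (mu < action_level) at significance alpha."""
        return action_level - t.ppf(1 - alpha, df=n - 1) * s / sqrt(n)

    # Illustrative values: action level 100 ppm, s = 25 ppm, n = 14 samples
    print(round(critical_level(100, 25, 14), 1))  # about 88 ppm, inside the gray region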
The gray region depicted in Figure 2 corresponds to the region between the means of the
distributions in Figure 3 (i.e. between 80 and 100). EPA defines the gray region as the range of
possible parameter values near the action level where the cost of determining that the alternative
condition is true outweighs the expected consequences of a decision error. The gray region is
also described as a range of true parameter values within the alternative condition near the Action
Level where it is "too close to call". This may have some meaning during the theoretical
planning stage. But during the realistic assessment phase, as can be seen in Figure 3, the critical
level where the call must be made is located within the gray region. Again, this can lead to
confusion if one believes that assessment decisions cannot or should not be made for results
found in the gray region.
From the perspective of planning, enough samples are taken to reduce the probability of a false
acceptance to a tolerable level, but one can never be certain that a false acceptance (or false
rejection) error has occurred. For the sampled data, the critical level is located within the gray
region, and corresponds to the concentration that must be used during assessment to make the
call as to whether to accept or reject the null hypothesis. We would rather not get results in the
gray region but, if we do, we want the power of the test to be at a level that will give confidence
to our decision.
3
-------
From a planning perspective, the boundaries of the gray region (minimum detectable difference
in statistics) are important, since they are needed to determine how many samples must be
taken, analyzed and averaged in order to achieve the chosen decision errors for acceptance and
rejection. Another way to express this is that if the true value is equal to the left boundary value
in Figure 2, the probability of correctly rejecting the null hypothesis (i.e. concluding that the site
is clean) is equal to 1-β, which is the statistical "power" of the test to detect an effect (i.e. for
Figures 1-3, that the concentration is less than the standard level).
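Continuing the same illustrative numbers, the power at the left boundary of the gray region can be approximated as follows (z-approximation; an exact calculation would use the noncentral t distribution).

    from math import sqrt
    from scipy.stats import norm

    def power_at_boundary(sigma, delta, n, alpha=0.05):
        """Approximate probability of correctly rejecting the null hypothesis
        when the true mean lies at the lower boundary of the gray region."""
        z_a = norm.ppf(1 - alpha)
        return norm.cdf(delta / (sigma / sqrt(n)) - z_a)

    # Illustrative values: sigma = 25 ppm, gray region width 20 ppm, n = 14
    print(round(power_at_boundary(25, 20, 14), 2))  # roughly 0.9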
Decisions can be thought of as being "a priori" or "a posteriori", depending on whether they are
made during planning or during assessment. Both α and β errors must be taken into account for
"a priori" decision-making, while only the α error needs to be addressed for "a posteriori"
decision-making. There are some advantages to identifying the action level as being at
approximately the same concentration level during planning and assessment. ASTM document
D5792-95 achieves this by identifying an operational decision rule rather than an ideal decision
rule. Figure 4 shows a Decision Performance Curve taken from ASTM D5792-95, in which the
Action Level is located at a concentration where the probability of taking action is 0.5 (i.e. p
=0.5) and is distinguished from the Regulatory Threshold (EPA's theoretical Action Level). The
plot of Figure 4 is for Possible True Concentrations rather than the True Values of the Parameter
plotted on Figure 1. An even better description would be to identify the Action Level on Figure
4 as a value obtained from sample data.
The DPC (and DPGD) are prepared during the planning phase. EPA defines the DPC as a power
curve (or a reverse power curve depending on the hypothesis being tested). A power curve in
statistics is a plot of 1-β vs. the true value. Figures 1 and 4 are reverse power curves, since they
are plots of p vs. the true value, corresponding to the probability of deciding that the parameter
exceeds the Action Level using sampled data. A power curve is a mirror image of Figures 1 or 4
and corresponds to the "a priori" probability that one will detect an effect (i.e. reject the null
hypothesis) for different values of the true concentration. The power curve may provide some
additional insight to decision-makers that is not provided by the DPC or reverse power curve. A
DPGD analogous to a power curve could be constructed by deciding what probability of
rejection of the null hypothesis or baseline condition is desired at a given "a priori"
concentration (corresponding to the left side of the gray region or minimum detectable
difference). These and other considerations will be discussed during the presentation.
4
-------
[Figure 1. Ideal Versus Realistic Performance Curve. Plot of the probability of deciding that the parameter exceeds the Action Level against the true value of the parameter (mean concentration, ppm), showing an ideal performance curve, a realistic performance curve, and the Action Level.]
5
-------
[Figure 2. An Example of a Performance Curve Overlaid on a Decision Performance Goal Diagram (Baseline Condition: Parameter Exceeds Action Level). X-axis: true value of the parameter (mean concentration, ppm), 40 to 200, with the Action Level marked.]
6
-------
[Figure 3. Statistical approach to quality control. Distributions plotted against concentration, with shaded areas showing α, the probability of calling for action when it is not needed, and β, the probability of not calling for action when it is needed.]
7
-------
[Figure 4. Decision Rule Development (Decision Performance Curve from ASTM D5792-95). Probability of taking action plotted against possible true concentrations (0.7 to 1.2 mg/L), with a false negative rate of 10%, a false positive rate of 20%, and the Action Level distinguished from the Regulatory Threshold.]
8
-------
ESTABLISHING USEFUL DATA QUALITY OBJECTIVES
Evelyn Holly, Senior Chemist
Abstract —Many Quality Assurance Project Plans (QAPP) incorporate the laboratory's QC
requirements, such as detection limits, spike recoveries, and duplicate control windows as
part of the project's data quality objectives (DQOs). Environmental projects require site-
specific decision-making criteria. Using the above approach to establish quality control
(QC) limits for the project is like putting the cart before the horse.
Once the end use of the data and the level of quality effort have been identified, the laboratory
deliverable level and the data validation level needed to support those efforts may be developed.
The QAPP should specify the expectations for the project, including laboratory reporting
requirements and DQOs. These expectations establish the QC limits within which the laboratory
should operate.
QAPP Preparation
This paper will discuss how the QC limits are determined based on the analytical method,
the laboratory's capabilities, regulatory limits, and the end use of the data. This includes
how project DQOs are used to establish laboratory reporting limits and DQOs for precision,
accuracy, representativeness, completeness, and comparability (PARCC). The laboratory
deliverable levels will be discussed along with the appropriate validation level needed to
achieve the DQOs for the project.
Data Review
All laboratory data and the associated quality control results need to be reviewed by the end
user of the data for acceptability. The review may range from cursory to full, and the data may
need to be qualified or rejected. The review of the data determines whether the PARCC
parameters are acceptable, which is then used to measure the success of the project. The data
review process, typical decisions that result in qualified or rejected data, and how qualification
of data affects the project will be discussed.
Essentially, this paper will offer insight into how to tie site-specific DQOs, associated
analytical levels, data validation levels and data qualifier flags together to meet the project
goals.
In the Data Quality Objective (DQO) development process, one needs to start at the end to develop a plan
for the beginning. The project background, the intended use of the data, and the decisions that need to
be made determine the level of quality effort for the project. This in turn will help determine the level
of documentation needed to support the quality effort. For example, if the end use of the data is site
closure, risk assessment, or legally defensible data, a higher level of quality effort is needed. The
level of quality effort is then defined by how rigorous the QA/QC is for the project and the supporting
documentation that is needed. The QAPP should then define the documentation for the level of QA/QC
needed. The QA/QC is implemented in a "cradle to grave" fashion and the documentation must start at
the field level and continue to the final report to the client.
1
-------
QUALITY ASSURANCE PROJECT PLAN (QAPP)
The QAPP is one of the customized blueprints that define the specific requirements and tasks for the
project. From the laboratory standpoint, the most important issues to address in the QAPP are DQOs
(PARCC parameters), any special laboratory requirements, corrective action plans, documentation,
Contract Required Detection Limits (CRDLs), analytical methodology, and reporting requirements.
However, most of the time the QAPP incorporates this information by referencing a laboratory's QA
program and fails to address the specific needs of the project, because a laboratory's QA program
often has a one-size-fits-all approach. If there is a need to change laboratories, the prior inclusion of
another laboratory's program in the QAPP often places the project in violation of the comparability
DQO.
Laboratory Data Quality Objectives
Laboratory DQOs are defined by the PARCC (Precision, Accuracy, Representativeness,
Comparability, and Completeness) parameters. The acceptance criteria for the PARCC parameters
are set by a combination of method guidance, laboratory limits, professional judgment, and project
goals.
Precision
Precision measures the variability of the measurements by evaluating if replicate analyses are
reproducible. Data precision is measured by calculating the relative percent difference (RPD) of the
results for field and laboratory duplicates, and the percent relative standard deviation for field and
laboratory replicates.
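Both precision measures are simple to compute; the sketch below follows the usual definitions, with made-up results for a sample and its duplicates.

    from statistics import mean, stdev

    def rpd(result, duplicate):
        """Relative percent difference between a result and its duplicate."""
        return abs(result - duplicate) / ((result + duplicate) / 2) * 100

    def percent_rsd(replicates):
        """Percent relative standard deviation for three or more replicates."""
        return stdev(replicates) / mean(replicates) * 100

    print(round(rpd(42.0, 38.0), 1))                  # 10.0 percent RPD
    print(round(percent_rsd([42.0, 38.0, 40.5]), 1))  # about 5 percent RSD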
The end user of the data must decide in advance how much variability is allowable for the project. If
the laboratory analysis and the sampling techniques are unable to achieve the needed data confidence,
then the end user needs to develop an alternate plan to ensure that good decisions are made.
For example, the laboratory's analysis of arsenic in soil may have a 20 percent variability. If the
action limit for soil remediation is 200 ppm and the analytical results are expected to range from 20 to
80 ppm, then the variability is probably acceptable. However, if the action limit is 200 ppm and the
analytical results range from 150 to 190 ppm, then the variability does not permit good decision-
making. Either the laboratory needs to commit to tighter precision, or the project manager will have
to modify the sampling plan and decision-making process, perhaps by collecting more samples so that
a 95 or 99 percent confidence limit may be established.
Accuracy
Accuracy is the evaluation of the bias in a measurement system, and is measured by calculating the
percent recoveries of surrogates, known value check samples and standards, matrix spikes, and blank
spikes. The end user of the data must decide in advance how much error is allowable for the project,
and make plans for decision making if the laboratory is unable to achieve the goal.
For example, the laboratory's analysis of lead from a waste pile may have 85-115 percent matrix
spike recovery. However, if the clean-up action limit is 100 ppm and the analytical result is 95 ppm,
then the bias is probably unacceptable. The project manager may need to develop a decision tree in
2
-------
advance, so that even though the action limit is 100 ppm, results greater than 85 ppm trigger a
continuation of the remediation. If the results are less than 85 ppm, then no more remediation is done
and the groundwater monitoring stage begins.
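Percent recovery itself is a simple calculation; the sketch below uses the standard matrix spike formula with illustrative numbers, not values from this paper.

    def percent_recovery(spiked_result, unspiked_result, spike_added):
        """Matrix spike recovery: how much of a known added amount is found."""
        return (spiked_result - unspiked_result) / spike_added * 100

    # Illustrative: sample reads 12 ppm; an aliquot spiked with 50 ppm reads 57 ppm
    print(percent_recovery(57.0, 12.0, 50.0))  # 90.0 percent recovery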
Most EPA methods contain suggested (but not required) accuracy and precision limits. Table 1 is
taken from the 18th Edition of Standard Methods, and suggests reasonable accuracy and precision
limits for both evaluating laboratory limits, and for when the laboratory does not have limits
established.
Representativeness
Representativeness evaluates if the data is representative of the area that was sampled. This involves
an evaluation of confidence limits, sampling techniques, and the control of contamination. From the
laboratory's perspective, the most important issue is the use of blanks to demonstrate that
contamination did not occur.
The QAPP should indicate that blanks will be collected, and describe decontamination and special
subsampling procedures such as pouring off standing water from sediment samples, removing sticks
and stones from a soil sample, or homogenizing a solid sample.
Comparability
Comparability is judging if the data collected are comparable both to previous and subsequent data.
Because different analytical methods often yield different results for the same sample, standardized
procedures, reporting units (e.g., mg-N/l vs. mg-NO3/l), and approved methods should be followed
during field sampling and laboratory analysis. In addition, CRDLs should be established. This
establishes limits that are based upon the decision-making process instead of reporting limits
developed statistically at a specific laboratory, which other laboratories may not be able to match.
For example, if the water protection standard is 50 ppb for a pesticide, then a CRDL could reasonably
be ten times less than that standard, at 5 ppb. The end user may then make decisions independent of
two different laboratories with contradictory reporting limits of 0.5 and 4 ppb.
Other examples of situations that have numerical compliance requirements and decision-making
criteria include aquifer protection standards, residential and non-residential soil remediation levels,
specific NPDES permits, and negotiated agreements with regulators.
Completeness
Data completeness is assessed in terms of the percent of usable analytical results produced. Although
100 percent data completeness is ideal, 90 percent data completeness is generally associated with
Level III and Level IV analyses (EPA, 1987) and allows for unforeseen incidents. The project
manager should plan contingencies and corrective actions in advance. For example, if a 90 percent
completeness is expected, and the end user needs 100 samples to calculate confidence limits, then at
least 110 samples should be collected.
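A small sketch of the completeness arithmetic; note that dividing by the expected completeness rate gives a slightly more conservative sample count (112) than the ten percent margin cited above.

    from math import ceil

    def completeness_pct(usable_results, reported_results):
        """Percent of reported analytical results that are usable."""
        return usable_results / reported_results * 100

    def samples_to_collect(results_needed, expected_completeness_pct=90.0):
        """Samples to collect so the expected usable count still meets the need."""
        return ceil(results_needed / (expected_completeness_pct / 100.0))

    print(completeness_pct(95, 100))  # 95.0 percent completeness
    print(samples_to_collect(100))    # 112 samples at 90 percent completeness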
3
-------
LABORATORY DELIVERABLE LEVEL
The laboratory deliverable level should correspond to the types of decision-making needed for the
project. The phase of the project or the end use of the data will affect the type of deliverable needed.
If the project is "high profile" and may end up in court, detailed documentation is needed for a full
review of the data. If the data is from field screening, extensive documentation may not exist and the
data may be useful only for refining the sampling plan and determining sampling locations.
Several laboratory deliverable levels are typically used in the data review process and different
programs, i.e., EPA, Department of Defense, and state programs, may have varying descriptions for
deliverables. Table 2 lists various EPA deliverable levels; however, Level III and Level IV are the
levels most commonly used in the data review process.
Level III
Level III data packages or analytical support documentation typically consist of laboratory test results
using EPA approved methodologies. The analytical documentation will include QC data in
the form of typed reports and summaries, which consist of:
• Results and method reference;
• Case narrative;
• Date sampled and date analyzed;
• Field and laboratory sample identifications;
• Chain-of-custody documentation;
• Documentation of the sample condition upon receipt;
• QC compilation and acceptance limits;
• Detection/reporting limits;
• Approval signature from a responsible officer of the laboratory; and
• Name and identification of the laboratory, including the accreditation number, if applicable.
Level IV
Level IV data packages include the Level III documentation, plus reports and summaries for:
• Initial and continuing calibration;
• Instrument and calibration blanks;
• System monitoring compound results;
• Daily tuning results;
• Internal standard area and retention time summaries;
• Tentatively identified compound (TIC) results; and
• Raw data such as chromatograms, worksheets, logs, and instrument printouts.
A Level IV deliverable is needed when the project requires data that must be fully validated.
4
-------
DATA REVIEW
Three different activities are involved in the data review process: a completeness check, data
verification, and data validation. The type of review should be determined in advance and will
depend upon the end use of the data and the confidence needed. Again, if the project is "high
profile," due diligence may suggest a more thorough review. Typically, non-NPL sites and sites that
are not in litigation undergo data verification. NPL and state Superfund sites, or sites with the
possibility of litigation, undergo a data validation.
Laboratory data should be validated according to accepted and published criteria such as the EPA
Contract Laboratory Program (CLP) guidelines described in Laboratory Data Validation Functional
Guidelines for Evaluating Inorganics (EPA, 1994b), and Laboratory Data Validation Functional
Guidelines for Evaluating Organics Analysis (EPA, 1999).
These references were developed for the EPA's Contract Laboratory Program. If appropriate, even
the criteria for data review may be customized in the QAPP and approved by regulators. For
example, many consultants substitute instrument calibration criteria from RCRA methods (SW-846)
for the calibration criteria found in Functional Guidelines.
Completeness Check
A completeness check verifies that all laboratory analyses that were requested were performed and
reported. It is also a cursory quality control review that will identify if any laboratory information is
missing or if samples need to be re-analyzed.
Data Verification
Data verification should be performed by a chemist or other professional with data validation or
analytical laboratory experience who is familiar with the QC requirements specified for the analytical
methods being reviewed. Data verification is done on Level III type deliverables and is a systematic
process for evaluating whether data has been generated with acceptable quality control. Verification
is a cursory data review that includes a review of the laboratory's typewritten report and the tabulated
items provided in the Level III laboratory deliverable level. Data verification is not an in-depth
review of the laboratory's testing, but is a cursory review of the laboratory's quality control and may
suggest that a more thorough validation is needed.
Data Validation
Data validation should be performed on a laboratory deliverable Level IV data package by
experienced chemists. Data validation follows EPA protocols and the decision-making criteria set
forth in CLP's Functional Guidelines. These guidelines are used to evaluate data packages that
include the raw data and back-up documentation for the analysis. Because the raw data is provided,
the validation checks calculations and assesses instrument performance, calibration, and reporting
5
-------
limits in addition to the quality control checks. The data validation levels and the associated
laboratory deliverable levels are listed in Table 2.
Data Qualifier Flags and Decision-Making Criteria
Data may have been qualified for any of the following reasons:
• By the laboratory prior to receipt by the reviewer;
• Because of laboratory deviation from the designated method;
• Because the data may not meet QAPP criteria or review guidelines; or
• By the professional judgment of the reviewer.
In essence, the validator is making two decisions when deciding to flag a data point. First, can the
result be used to determine whether there is a non-detect or a positive hit? If the result cannot be used for
this decision, then the result is flagged "R" to indicate that the result is unusable and should not be
used for any purpose, whether qualitative or quantitative, screening or definitive. Second, if the
validator is sure that the result is a firm "detect" or "non-detect," then the quantitation is
evaluated. If the value fulfills the requirements listed in the QAPP, the result has been validated or
qualified as acceptable. If the value does not fulfill the requirements of the QAPP or the validator has
noted some other issue that impacts the usability of the data, then flags are assigned to the value.
Commonly used flags are:
J The analyte was positively identified but the quantitation is an estimation, or the analyte was
positively identified but the value is an estimation above the MDL and below the reporting
limit;
U The analyte was analyzed for but not detected at the reporting limit;
B The analyte was found in a blank;
UJ The analyte was analyzed for but not detected, and the reporting limit is estimated;
N The analyte was tentatively identified;
M The sample had evidence of matrix problems; and
D The sample results were taken from a diluted sample.
USING FLAGGED DATA
"J" flagged values are usually counted as usable data points when calculating completeness. The
final decision about the usability of the data and the severity of the flags (e.g., how many reasons for
applying "J" flags are needed before they become an "R"?) depends upon your company philosophy,
client and program requirements, the end use of the data, defensibility, confidence, and gut feelings.
6
-------
AVOIDING THE FLAGGING SYNDROME
In order to avoid excessive and unusual amounts of data qualifier flags, it is prudent to plan so that
you know your site and can set reasonable DQOs and expectations. Also, talk to the laboratory and
the data validator in advance to decide what course of action the laboratory will take when things go
wrong (i.e., take corrective action or flag the result). Finally, put the policy in the QAPP and the
laboratory contract.
REFERENCES
Data Quality Objectives for Remedial Response Activities: Development Process. Prepared by the
Office of Emergency Response and Office of Waste Programs Enforcement. EPA, 1987a.
Laboratory Data Validation Functional Guidelines for Evaluating Inorganics Analyses. Prepared by
the Office of Emergency and Remedial Response. EPA 540/R-94/013, 1994a.
Laboratory Data Validation Functional Guidelines for Evaluating Organics Analyses. R-583-5-5-01.
Prepared by the Hazardous Site Control Division. EPA 540/R-99/008, 1999.
Guidance for the Data Quality Objectives Process, EPA QA/G-4. Prepared by the U.S. EPA Quality
Assurance Management Staff. EPA/600/R-96/055. August 2000.
Laboratory Documentation Requirements for Data Validation. Draft DC No. 9QA-07-97. Prepared
by the Quality Assurance Program. EPA Region 9, July 1997.
EPA Requirements for Quality Assurance Project Plans for Environmental Data Operations.
Prepared by the U.S. EPA Quality Assurance Division. EPA QA/R-5. October 1997.
EPA Guidance for Quality Assurance Project Plans. Prepared by the U.S. EPA Quality
Assurance Management Staff. EPA QA/G-5. February 1998.
Standard Methods for the Examination of Water and Wastewater. 18th Edition. APHA, AWWA, &
WEF. Edited by Eaton, Clesceri, & Greenberg. 1992.
7
-------
Table 1
General Acceptance Limits for Spikes and Duplicates for
Water and Wastewater Samples

Analysis                        Recovery of Spike   Precision of Low-Level    Precision of High-Level
                                Additions (%)       Duplicates* (+ or - %)    Duplicates** (+ or - %)
Metals                          80-120              25                        10
Volatile organic                70-130              40                        20
Volatile gases                  50-150              50                        30
Base/neutrals                   70-130              40                        20
Acids                           60-140              40                        20
Anions                          50-120              25                        10
Nutrients                       80-120              25                        10
Other inorganic                 80-120              25                        10
Total organic carbon            80-120              25                        10
Total organic halogens          80-120              25                        15
Herbicides                      40-160              40                        20
Organochlorine pesticides       50-140              40                        20
Captan                          20-130              40                        20
Endosulfans                     25-140              40                        20
Endrin aldehyde                 25-140              40                        20
Organophosphorus pesticides     50-200              40                        20
Trichlorophon                   20-200              40                        20
Triazine pesticides             50-200              40                        20
Carbamate pesticides            50-150              40                        20

Taken from Standard Methods, 18th ed.
* Low level refers to concentrations less than 20 times the MDL. High level refers to concentrations greater than 20 times the MDL.
** Acceptance limits for independent laboratory control standards and certification of operator competence.
8
-------
Table 2
Summary of EPA QA Levels

EPA Level I
  Reference: Various field screening methods
  Report deliverables: Simple analytical data format
  Data validation: Review of calibration, blanks, completeness, etc. (no common name)

EPA Level II
  Reference: Various sources
  Report deliverables: Analytical data with some QC data
  Data validation: Review of calibration, blanks, completeness, etc. (no common name)

EPA Level III
  Reference: SW-846, Std. Methods, EPA approved methods
  Report deliverables: Lab test results, QC data in the form of typed reports and summaries
  Data validation: Review of holding times, duplicates, spikes, etc. on lab report only (a.k.a. "cursory" validation)

EPA Level IV
  Reference: CLP SOW
  Report deliverables: CLP or CLP equivalent deliverables, includes instrument printouts and raw data
  Data validation: Review of lab report and QC raw data, including calculations and transcriptions (a.k.a. "Full,"
  "100%," "90/10" validation). EPA Level 1 is 100% calculation/transcription checks; Level 2 is 90/10%.

N/A
  Reference: SW-846, Std. Methods, EPA approved methods
  Report deliverables: Sample data, holding times, blanks, blank spikes only in typed report and summary
  Data validation: Abbreviated review of holding times, blanks, & blank spikes only (a.k.a. "cursory" validation)

EPA Level V
  Reference: CLP (Special Analytical Services Research)
  Report deliverables: CLP or CLP equivalent deliverables, includes instrument printouts, raw data, and all project notes
  Data validation: Review of results & QC from raw data for accuracy & precision, ruggedness, reproducibility (no common name)

* Note: "EPA Level 2" validation is performed on "EPA Level IV" deliverables.
9
-------
A Data Quality Strategic Plan
For the U. S. Environmental Protection Agency
Cindy Bethell, U.S. EPA
Since July 2000, a workgroup has been analyzing the Environmental Protection Agency's
quality system processes to identify where data quality vulnerabilities exist. This paper
highlights the factors that brought this workgroup together, the process used for the
analysis, and the recommendations which have been submitted to EPA's Quality
Subcommittee.
I. Introduction
Data and information are vital to informing public policy decisions and the regulations that
help protect the nation's air, land, and water-the mission of the Environmental Protection Agency
(EPA). Reliable information and data of documented quality also constitute a valuable resource
for the public and leaders across society who increasingly demand access to accurate
environmental information that is comparable and complete.
In recent years, the Agency's data and data systems have come under increasing scrutiny
from Congress, federal offices including the General Accounting Office (GAO), EPA's Office of
Inspector General and Science Advisory Board, and the Office of Management and Budget, who
assert the Agency's environmental data lack validity, consistency, and reliability.
Most of EPA's major data collections were initiated decades ago, prior to the current
understanding of data quality principles and in the absence of the standards and metadata
requirements that are vital to the reliability and secondary use of data. The vast majority of data
used by the Agency is collected by state and local agencies, and the regulated community using
inconsistent collection and analytical methodologies, identification standards, and documentation.
The impact of this variability was noted in a February 2000 GAO report stating that, due to
unreliable data, the Agency is unable to give an accounting of the environmental health of the
nation's water bodies.1 The report cited several causal factors, including inconsistencies in water
body assessments and methodologies, lack of standards and common definitions, and questionable
data consistency and reliability.
EPA's new information office, the Office of Environmental Information (OEI), was
established in October 1999. Since that time, numerous separate inquiries regarding the state of the
Agency's data quality have been sent to OEI from Senators Bond, Smith, and Baucus, House Committee
Chairmen Fowler and McIntosh, and others. EPA's data quality and completeness are squarely on
their radar screen and the new office has raised expectations that the issue of data quality will
receive greater attention from EPA. In March 2000, Margaret Schneider, Principal Deputy
1 Managing for Results: Challenges Agencies Face in Producing Credible Performance Information
(February 2000, GAO/GGD-00-52)
1
-------
Assistant Administrator for OEI, testified before the House Subcommittee on Oversight,
Investigations and Emergency Management, Committee on Transportation and Infrastructure, to
answer questions on the quality of the Agency's data, and in a follow-up request from the House
Subcommittee a data quality assessment was requested for four EPA data systems. The
assessment was completed and a report submitted in August 2000.
The Data Quality Strategic Plan
The Quality and Information Council, the Agency's senior-management body for setting
information and quality policy, is assisted by four subcommittees. One is the Quality
Subcommittee, which in July 2000, formed a cross-Agency workgroup to develop a Data Quality
Strategic Plan (DQSP). The Subcommittee's charge to the Workgroup was first to identify where
and how to improve the quality of the Agency's environmental data and second, to recommend
how to improve the quality culture at EPA; to further embed an appreciation for the role and
importance of quality assurance at all levels.
What is Quality?
Understanding and assessing data quality is matrix-like in complexity, where the x-axis represents
different types of data (regulatory compliance, permitting, violations, ambient concentration
measurements, geo-spatial, laboratory analysis, monitoring, technology performance data, etc.) and
the y-axis represents different types of quality along the entire data life cycle from planning to
sample measurement, analysis, assessment, through transmission, storage and reporting. At any
given point on this x-y grid, the notion of quality will take on different meanings.
The conventional wisdom among EPA's data detractors, and among some EPA staff,
appears to be that, in general, the Agency's data is unreliable. This view is part perception but also
part substance. Because standard data quality assessments are not routinely performed on the
Agency's data, we lack a quality baseline, and even standard quality criteria by which to determine
the veracity of the conventional view. For those systems which have been recently assessed for
"quality," the Toxic Release Inventory System and Safe Drinking Water Information System, for
instance, the aspect of quality examined is primarily at the data system level. The questions
answered are not, 'do the values accurately represent the actual pollutant release or ambient
condition,' but rather, 'are the data in the data repository consistent and complete in comparison to
the originating system or documentation?' While this back-end assessment of "quality" is of
value, it says nothing about highly significant front-end stages of the data lifecycle.
EPA's major statutes and the programs they have spawned are very different in purpose
and structure. Clean Air Act regulations and guidance are quite specific in their quality assurance
requirements for ambient air monitor siting, instrument precision testing protocols,
and the scientific rigor that underpins the entire monitoring program. The Superfund Program also has
an effective quality management system and its data and analysis are generally of known quality.
The same cannot be said for all EPA environmental data programs.
2
-------
Methodology
Views and Reviews
The Workgroup analyzed reviews of EPA's data and information quality management by
oversight organizations such as the General Accounting Office, Science Advisory Board, National
Academy of Public Administration, the Environmental Council of the States, and business groups
such as the Business Roundtable and Coalition of Effective Environmental Information. Examples
of their observations and advice follow.
The Science Advisory Board:2
• EPA's Quality System implementation is uneven and varies from organization to
organization, increasing the likelihood of problems with data quality and the
associated decisions;
• Over 75% of states are generating data of unknown quality because they lack
approved Quality Management Plans;
• Incomplete implementation of the Agency's Quality System precludes proper
evaluation and produces the potential for waste, fraud and abuse;
• The reporting status of quality staff denies access to the proper level of authority
and the independence necessary to oversee the Agency's services and products;
• The problem of access also exists at the regional, program and state levels;
• Senior managers need to be champions for successful implementation of the
Agency's Quality System and need to implement a more complex web of
persuasion, administrative mandates, and rewards.
In March 2000, GAO reported that the National Water Quality Inventory, or 305(b)
report on surface waters, is not a reliable representation of nationwide water quality conditions due
to incomplete and inconsistent data, yet EPA uses this report for decision making because it is the
only source on whether waters are meeting water quality standards.3 No factual understanding of
how well the Agency is achieving its mission to protect the nation's waters exists.
EPA's Office of Inspector General has identified various weaknesses in the Agency's
quality system implementation and management, stating, "Without an effective Agency-wide
program, EPA could not fulfill its mission" which depends on having environmental data of
known and adequate quality.4
2 Review of the Implementation of the Agency-Wide Quality System, by the Quality Management
Subcommittee of the Environmental Engineering Committee, Science Advisory Board, February 25, 1999.
3 Water Quality: Key EPA and State Decisions Limited by Inconsistent and Incomplete Data
(GAO/RCED-00-54).
4 EPA Had Not Effectively Implemented Its Superfund Quality Assurance Program (E1SKF7-08-0011-8100240, September 30, 1998)
3
-------
In the National Academy of Public Administration's November 2000 report, Transforming
Environmental Protection for the 21st Century, four of ten recommendations apply to EPA's
Quality System:
• Invest in Information and Assessment. "Develop objective data of high quality."
• Hold States Accountable For Results. "Redefine EPA's expectations of states in
terms of environmental results rather than only process."
• Invest in Information. "Appropriate sufficient funds for major improvements in
environmental data and in program assessment."
• Challenge EPA, Congress, and One Another to Transform Environmental
Governance. "Build evaluation into the design of... programs."
The Business Roundtable (BRT) has developed its Blueprint 2001: Drafting Environmental
Policy for the Future, which includes the following recommendations:
• EPA needs a more disciplined focus on data quality and scientific rigor.
• Improve data collection, use electronic data collection and reporting, move toward
integrated reporting, recordkeeping, and monitoring.
• The government should provide better information stewardship, policies that place
environmental information in context, and tools for assessing its accuracy.
Sherwood Boehlert, chairman of the House Science Committee, has endorsed BRT's proposal to
move to performance-based management and has promised the group to take a serious look at the
proposals. Boehlert said:
Sound science is the key to reaching consensus on tough environmental problems,
and technology is the key to affordably solving those problems.5
Analytical Process
Building upon the observations and comments of external reviewers, the Workgroup
developed a 12-step model to identify where along the data/information lifecycle, vulnerabilities to
data quality exist and then identify ways to mitigate those vulnerabilities. The model spanned
from planning for a data collection to ultimate storage in a data system. About 90 vulnerabilities resulted from
that exercise. The Workgroup next grouped this long list under five categories and developed
white papers exploring seven key themes. Finally, interviews were conducted with managers and
data collectors or evaluators from all program offices and six regions (a total of 40 interviews) to
better understand the views of decision makers regarding their use of data, quality priorities, and
their expectations of the Data Quality Strategic Plan.
5 Inside EPA, February 23, 2001. Boehlert's comments were quoted from a February 8, 2001, briefing with
the Business Roundtable.
4
-------
Recommendations
The Workgroup developed seven sets of recommendations and prioritized them according to
their importance for improving data quality and quality management. These recommendations
were presented to EPA's Quality Subcommittee on March 8, 2001, and the Subcommittee has
taken them under advisement. The list of recommendations appears below, followed by a
description of each.
1. National Information Quality Management
2. Environmental Data and Metadata Standards
3. Performance Reporting
4. Data Transmission and Storage
5. Grants and Permits
6. Data Stewardship
7. Better Quality Assurance Project Plans
#1 - National Information Quality Management
EPA's current approach to Agency-wide quality assurance (QA) relies on decentralized
implementation of broadly stated policies contained in EPA Order 5360.1, and the Agency Quality
Manual. These documents describe the role of OEI's Quality Staff as providing quality assurance
primarily for environmental data collection and technology development, and reviewing
documents. The Quality Staff develops policies and guidance and assesses their implementation
across the Agency, but has little authority for assuring compliance in organizations that have not
developed credible QA programs. The Quality Staff is not responsible for coordination of policy
implementation across EPA on a day-to-day operational basis, nor is there a process for elevating
and resolving common or shared issues to the Agency level.
Many examples illustrate the lack of a cohesive national quality system. For instance, the
quality requirements in state grants programs are dramatically inconsistent. The Office of Air and
Radiation has clearly defined quality assurance requirements for its air monitoring program, as
does the Superfund program. But other EPA program offices do not and there are few incentives in
place to compel a less inclined office, governed by quality-silent or weak statutes and regulations,
to expend scarce resources to assure the quality its managers believe is already a part of their
standard business process. The cost of inconsistent implementation among EPA's regional offices
can be illustrated by a contractor's dilemma where its Quality Management Plan is approved by
one region but disapproved in another, resulting in confusion, frustration, and additional costs.
After 20 years of using a decentralized approach to managing quality assurance at EPA,
implementation of quality in the business processes for collecting, analyzing, using, and storing
information is fragmented and inconsistent in program offices, regions, and laboratories, with little
accountability for the results.
5
-------
This creates information quality vulnerabilities and results in inefficient data collection and
analysis expenditures, barriers to information sharing, and the potential for lost credibility in
Agency decision making.
To improve EPA's quality culture and provide a national framework for achieving greater
consistency in quality assurance policy implementation, the Workgroup has recommended that
changes in the current quality management structure be effected.
A. Create a National Information Quality Office. This office would report to the Agency's
chief information officer (CIO), supported by a network of regional and national program
CIO's with explicit authority, responsibility and accountability to:
• Assure consistent coordination and integration of quality across all EPA programs
and regions by developing an Agency-wide EPA Quality Management Plan.
The plan must describe how all sectors of the Agency Quality System
work together to assure information quality. It must describe consistent
roles, responsibilities, and processes for headquarters, national program
offices, regional offices, delegated states, grantees, and contractors to
implement quality procedures; and must establish core criteria to assure
even implementation across the entire Agency.
• Serve as the final arbiter for resolving differences in quality policy interpretation.
• Implement and manage a formal data and information stewardship program that
spans the entire data lifecycle.
• Serve as a central Agency focal point for external contacts on quality policies and
interpretation.
• Lead an Information Quality Office initiative to influence the Agency's quality
culture in positive ways, including:
- Institute an education and training program among QA staff and managers
to engender a customer-assistance orientation to quality assurance processes.
- Offer broad Agency training to emphasize the role and value-added of
quality assurance to all our work.
- Shift the approach to QA to become more results-oriented, e.g.,
demonstrate the connection between Quality Management Plans and data quality.
- Perform a sort of triage to identify national QA priorities that will inform
managers where to target limited resources.
B. Quality Assurance Manager Independence. The Science Advisory Board found that the
reporting status within EPA lowers the quality profile and denies quality assurance
managers access to the proper level of authority within program offices and regions.
6
-------
According to ISO Guide 25,6 "The quality manager shall have direct access to the highest
level of management." And, "The quality assurance unit shall be entirely separate and
independent from the personnel engaged in the direction and conduct of the study."7 (Italics
added)
Providing sufficient organizational independence for Agency quality assurance manager
(QAM) positions assures that quality vulnerabilities will be raised to the proper level of authority.
The QAM position must have sufficient level of placement, with access to senior management,
grantees, and contractors, to be heard. Currently, there are QAMs in offices with substantial data
collection responsibility that are not full-time positions, nor do they have sufficient access to their
office management. To provide sufficient independence for quality assurance managers:
a. Require institutional placement of QA managers that ensures access to senior
managers, and require that they hold a GS-13 level or above.
b. Guarantee an appropriate measure of independence to QA managers to avoid
conflicts of interest.
#2 - Data and Metadata Standards
Metadata are defined here as the who, when, why, what, where, and how of data values.
Where metadata are absent, EPA is unable to "defend" its data, making the Agency vulnerable in
terms of external inquiries, and because the public assumes data on our Web site are Agency
"approved." To collect data that can be defended on their merits and gain a higher return on the
enormous investments made in data and data systems by planning for the secondary use of data,
quality assurance considerations need to be explicitly incorporated into OEI's data standards
development process.
In OEI's first eighteen months, the standards process accounted for the successful
completion of the six Reinventing Environmental Information data standards. Fourteen of EPA's
major data systems are in various stages of implementing those standards on the back end, at the
database level. To meet the large demand for approved data standards, the current development
process needs to be accelerated, and it also needs to be broadened to include scientific
measurement data.
6 International Standards Organization, General Requirements for the Competence of Calibration and
Testing Laboratories.
7 See 21 CFR 58; 40 CFR Part 160.35.
7
-------
There are many drivers for requiring the collection, documentation, and storage of certain
metadata parameters. Among them, data quality language was inserted into the FY 2001
appropriations bills,8 raising the possibility of a lawsuit where "inaccurate" public data are
discovered. A lack of standard data definitions and metadata can also result in confusion and
skewed analysis when incompatible data are wrongly compared. Data collection is a huge Agency
investment (direct collections, grants to states and researchers). When data are treated as a
reusable resource, for use after the initial collection purpose, the return on the investment will be
far greater. Where metadata are missing, data are of unknown quality, resulting in decisions that
may be unsound and indefensible. Non-standard data formats and missing metadata are both
barriers to large-scale, regional and national analysis and comparisons. EPA's Quality System
requires that where pre-existing data are used to make environmental decisions or new data
collections are planned, the quality of the data must be known and documented.9
The following recommendations address the Agency's need for standardized metadata and
data quality indicators.
A. Analyze EPA's commonly used scientific data types to identify where the most compelling
needs for data standards exist. The selection criterion could be EPA's largest scientific
data collections, or some other. Standardizing Superfund program data has been an
identified need for years.
B. A series of Standards Action Teams should be convened to 1) identify the needed
elements for core sets of metadata (who, what, how, when, where, why), and 2) identify data
quality indicators (precision, bias, confidence level). Standard transmission and storage
formats for these elements can be developed using OEI's data standards process. The core
set should consist of a minimum number of priority data elements and quality indicators.
C. Collection and documentation of core metadata elements will be required of all major
EPA data collections, and these standard data fields will be required when a data
system is developed or re-engineered. When data are collected, transmitted, or stored, the
inclusion of, or linkage to, any existing metadata connected to that dataset will be
required.
D. Linking technologies and data transmission languages such as XML should be identified
for use in legacy systems that do not accommodate metadata storage, so that available
metadata can be accessed with the corresponding data.
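As a purely illustrative sketch of the kind of linkage described above, a core metadata record could be carried as a small XML document alongside a legacy data value. The element names and values below are hypothetical and do not represent an EPA data standard.

    import xml.etree.ElementTree as ET

    # Hypothetical core metadata record attached to a legacy result so that the
    # who, what, when, where, why, and how travel with the data value.
    record = ET.Element("MetadataRecord", id="example-000123")
    for tag, text in [("Who", "State monitoring program"),
                      ("What", "Dissolved lead, mg/L"),
                      ("When", "2001-04-03"),
                      ("Where", "Station 12, Example River"),
                      ("Why", "Ambient trend monitoring"),
                      ("How", "EPA Method 200.8")]:
        ET.SubElement(record, tag).text = text
    print(ET.tostring(record, encoding="unicode"))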
E. The workgroup should consider two options: 1) develop a series of core sets of metadata,
specific to different data types, or 2) identify one core metadata set comprised of elements
common to most data types, then require program offices to develop their own office-
specific metadata requirements as a complement to the core set.
8 Section 515, Consolidated Spending Bill H.R. 4577, Untitled Section of the Fiscal Year 2001
Consolidated Appropriations Act (P.L. 106-554).
9 EPA Order 5360.1, Section 2.5.1.
8
-------
F. A plain English educational brochure needs to be developed for non-scientific EPA staff,
decision makers, and the public, to explain the role and importance of metadata to
understanding issues such as random error, confidence and uncertainty. The brochure
should be published and distributed, and a link displayed on EPA's Web sites wherever
data are displayed to promote a more informed understanding and use of EPA's data.
G. Use OEI's data standards implementation process to incorporate standardized data formats
and metadata requirements into all appropriate, data-related, Agency regulations, policies,
permits, data system requirements, and Information Collection Requests.
#3 - Performance Reporting
A. Data Assessments: Perform standard data quality assessments of all major data bases
every two years. A standardized protocol and criteria should be developed for performing
data quality assessments, using statistically significant sampling. The first assessment will
create a baseline from which to judge all future progress. From this, the program can
establish data system-specific improvement goals.
B. Improve the FMFIA Reporting Process: Reporting under the Federal Managers
Financial Integrity Act (FMFIA) can promote greater accountability in improving the
consistency of Agency data. A formal collaboration process should be developed between
OEI and the Office of the Chief Financial Officer (OCFO) to identify national programs
and regions that are not adequately addressing data quality management in their FMFIA
reports, so that reminders and assistance can be provided.
C. Integrate Data Quality Management into the Government Performance and Results
Act (GPRA) Processes: To engender a responsive quality culture in EPA, a system of
carrots and sticks should be employed. Tying program and regional development and
implementation of Quality Management Systems to the Agency's Planning, Budgeting,
Analysis and Accountability system and GPRA implementation efforts could provide the
incentives. Since the GPRA process helps determine EPA's resource use, directly tying
quality performance to this process would help improve and broaden the "quality culture"
at the Agency.
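As one way to picture the assessment in item A above, the following sketch draws a simple random
sample from a database extract and estimates an error rate with an approximate 95 percent
confidence interval; the result could serve as the baseline measure described above. The record
fields, the completeness check, and the sample size are hypothetical and not an Agency protocol.

    # Hypothetical sketch of a baseline data quality assessment; the record
    # structure and the completeness check are illustrative, not an EPA protocol.
    import math
    import random

    def estimate_error_rate(records, sample_size=400, seed=0):
        """Randomly sample records, flag those failing a simple completeness
        check, and return the error rate with an approximate 95% confidence
        interval (normal approximation)."""
        rng = random.Random(seed)
        sample = rng.sample(records, min(sample_size, len(records)))

        def has_error(rec):
            # Illustrative check: required fields present and result non-negative
            required = ("facility_id", "parameter", "result", "sample_date")
            missing = any(rec.get(f) in (None, "") for f in required)
            return missing or rec.get("result", 0) < 0

        errors = sum(1 for rec in sample if has_error(rec))
        n = len(sample)
        p = errors / n
        half_width = 1.96 * math.sqrt(p * (1 - p) / n)
        return p, (max(0.0, p - half_width), min(1.0, p + half_width))

    # Example with fabricated records, for illustration only
    demo = [{"facility_id": i, "parameter": "TP", "result": 0.1,
             "sample_date": "2000-08-14"} for i in range(5000)]
    demo[10]["result"] = -1  # seeded error
    print(estimate_error_rate(demo))

Repeating the same sampled assessment every two years against the same criteria would make the
system-specific improvement goals in item A directly measurable.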
#4 - Data Transmission and Storage
A. EPA should formally acknowledge that information technology systems and security
provisions have a direct effect on the overall quality of our information, and should be
explicitly considered part of the enterprise quality system. The Agency needs to develop
explicit quality policies and guidance across the three functional areas of information,
technology, and security.
-------
B. Life-cycle documentation should be required for all software, as should independent
verification and validation of the software system. Custom hardware should be
subject to comparable documentation procedures.
C. Explore the development of an Agency-wide metadata repository. This would achieve
significant economies since individual systems would not have to store metadata
information but could link to the appropriate metadata instead.
D. Provide guidance for systems managers, describing methods for checking the quality of
transmitted data and making corrections. These steps must be included in system
operational procedures, and be available to the project manager.
E. Conduct periodic quality review of systems as part of quality program reviews. Upgrade
Agency software engineering guidance and policies (e.g., IRM Directives 2100) to require
quality documentation.
#5 - Data Steward Roles and Responsibilities
Develop a policy to codify the Data Stewardship Network in EPA's Integrated Error Correction
Process. Formally identify data stewards with the responsibility to guard data integrity from cradle
to grave, or from data collection to data repository. The Data Stewardship Network should include
staff from program offices, regions, and states; individuals with a vested interest in overseeing the
development and use of reliable data.
#6 - Quality Assurance Project Plan Improvements
A. Quality Assurance Project Plans (QAPPs) are the first line of defense for reliable data
quality. When effectively developed, they encompass the thought and planning that are
vital to an effective data collection process. The Intergovernmental Data Quality Task
Force and Region 1 are developing guidance to help staff prepare streamlined and
meaningful QAPPs. Many QAPPs currently consist mainly of "boilerplate" that
has little to do with project requirements.
B. Training in QAPP development is a continuous requirement. Management must be
committed to providing essential training.
#7 - Grants and Permits
Permits - It is important that the Agency improve quality assurance requirements for its permit
programs, which currently offer minimal assurance. Two possible approaches are:
A. Have each permit program assess the quality of data reported to EPA by permittees to
determine whether the data are adequate for their intended use. This assessment could
drive the need for regulatory change. If data are adequate, no regulatory change is
necessary. If the data are inadequate for a given program, the program would be required
-------
to develop a plan to upgrade the quality of the data (this might include regulatory changes
or other changes);
B. Identify other tactical points where change could be made relatively quickly that would
lead to overall quality improvements. For example, improved data quality screening tools
to prevent inaccurate data from entering the system.
Grants - The EPA Quality System could be modified to specifically require grants that contain
environmental data operations (EDO) to have Quality Management Plans in accordance with EPA
guidance. In addition, 40 CFR Parts 30 and 31 could be modified so that the determination of
whether a grant contains an EDO is made by the QA staff of the organization approving the grant. The
language in 40 CFR §31.45 is outdated and inadequate to ensure that grants having environmental
data operations will establish adequate quality systems. The section needs to be rewritten. The
Agency Grants Management Manual needs a section that adequately reflects the modified
language in 40 CFR §31.45 and holds grants management officials accountable for the QA
component of the grant.
Where We Go From Here
On March 22, 2001, the Quality Subcommittee members will meet in a closed-door session to
decide their response to the recommendations presented in this paper. Based on the
Subcommittee's direction, a Data Quality Strategic Plan (DQSP) will be drafted. The current
intention is to distribute the draft Plan for review at the end of April, and later to meet with state
representatives to receive their feedback on a second draft Plan. The Workgroup and others with
an investment in the Agency's Quality System and data hope that the Subcommittee's
deliberations will result in strategic shifts in the Agency's culture and approach to information
quality management, but we also understand the daunting obstacle of cultural resistance in a large
organization. In short order we will know if the time for this Plan has arrived.
Cindy Bethell
DQSP Workgroup Co-chair
Special Assistant
Office of Information Collection
Office of Environmental Information
U.S. Environmental Protection Agency
Bethell.Cindy@epa.gov
Acknowledgements
This paper represents not only my own research but the cumulative input from the DQSP
Workgroup, whose knowledge and experience have made this endeavor possible. The
Workgroup: Robert Runyon, co-chair, Gary Bennett, Mike Carter, Roger Cortesi, Joe Elkins,
Joseph Greenblott, Lora Johnson, Tony Jover, Russ Kinerson, Don Olson, David Taylor, Nancy
Wentworth, Jeffrey Worthington.
-------
LEVERAGING QUALITY PLANNING AND VALUE MANAGEMENT
TOOLS AND TECHNIQUES TO ENSURE REQUIRED DATA QUALITY
FOR ENVIRONMENTAL COMPLIANCE AND DECISION-MAKING
Craig Willis, PE, CVS
Abstract —Exhausting and frequent conference calls, meetings, contract modifications, and
multiple data collection efforts are symptoms of fundamental problems on many projects.
Data collection, data analysis, and more data collection are recurrent situations on many
projects where quality management is a reactive exercise. Keys to successful projects
include early and ongoing efforts to properly leverage quality planning and value
management processes. Project teams, project managers, owners, and regulators can all
either suffer the effects of a project's fundamental problems or converge early to ensure the
required data quality is obtained. Quality planning and value management tools and
techniques must be used to proactively accomplish environmental compliance and decision
making within cost and schedule constraints. This paper describes how quality planning
and value management tools and techniques have recently been integrated to improve
project execution. The conference workshop will include an interactive exercise to
introduce how quality planning and value management processes should be employed both
early and during ongoing stages of a project. This paper and the workshop offer anecdotes
from teams who have used the integrated processes.
Introduction
Since the early 1980s, scientists, engineers, attorneys, responsible parties, and regulators across the
United States have been working to collect data for environmental compliance and decision making
at many types of environmental sites. Many of these compliance issues and environmental decisions
require multi-disciplinary expertise to plan for and perform sampling and analysis efforts to ensure
the required data quality is achieved. Armed with science and engineering data and analyses, several
people representing a wide diversity of project stakeholder responsibilities then work to interpret the
data in hopes of ultimately resolving federal, state, and local compliance issues. Yes, quality
management tends to be a reactive exercise rather than a proactive, deliberate and effective element
of project planning. Yet decision-making related to compliance issues and investigation,
remediation, and/or monitoring efforts must be sound and defensible. Quite simply, the right tools
and techniques must be employed on a project to drive project execution to timely, cost effective
protection of human and ecological health at each facility or site.
The complexities and slow progress within the environmental site investigation, remediation, and
monitoring "industry" are not unique. Construction, manufacturing, transportation, and health care
industries have faced, and continue to face, similar problem solving challenges. However, a
significant difference in problem solving within these industries is the extent that quality planning
and value management (VM) tools and techniques are employed.1
-------
The Problems!
Resolution of compliance issues and environmental decision-making is both complex and unique
to each site or facility. Some barriers and hurdles include:
• Applicable, relevant, and appropriate regulations can change and are open to
interpretation and negotiation.
• Regulatory enforcement and stakeholder influence varies substantially.
• The project decision-making process is not always evident.
• Future use(s) of a site may be unknown, and can change as a site is characterized.
• Site closure goals or compliance can be difficult to define and achieve.
• Compliance issues tend to involve facility personnel from management, legal,
engineering, safety and health, operations, and maintenance, with varying levels of
experience in sound and defensible sample collection and analysis.
• Environmental site progress spans science and engineering professionals who are
typically study-, design-, or construction-oriented individuals.
• Evolution of investigation, remediation, and compliance monitoring technologies
continues to be dramatic.
"The" Solution!
In 1992, the U.S. Army Corps of Engineers (USACE) began to research quality-related problems and
then to develop a better, more deliberate and integrated planning approach. These efforts produced
the Technical Project Planning (TPP) Process2, recognized by the Society of American Value
Engineers (SAVE) as an early, life-cycle VM technique for environmental projects. The TPP Process
and the VM methodology are Best Practices for project teams to effectively resolve the myriad of
complicated and oftentimes non-technical, politically sensitive issues associated with compliance
issues and decision-making for the investigation, remediation, and monitoring of environmental sites
and facilities.
The TPP Process
The TPP Process, documented in USACE's EM 200-1-2, is a quality planning process to ensure that
the required data quality is obtained to resolve each compliance issue and support environmental
decision-making. The four-phase TPP Process (Figure 1) involves many phase-specific, step-wise
activities that are detailed throughout EM 200-1-2. Although the four-phase TPP Process is presented
as a linear sequence, EM 200-1-2 challenges planners to use the process not only early in a project,
but also iteratively rather than in a reactive planning mode.
-------
[Figure 1 diagram: existing site information feeds the four-phase TPP Process (Phase I, Identify
Current Project; Phase II, Determine Data Needs; Phase III, Develop Data Collection Options;
Phase IV, Finalize Data Collection Program), with outputs that include detailed project objectives,
detailed data quality objectives, the technical basis for the Sampling and Analysis Plan, Quality
Assurance Project Plan, and Work Plan, accurate cost forecasting, and progress to site closeout.]
The four-phase TPP Process is a
comprehensive and systematic
planning process that will accelerate
progress to site closeout within all
project constraints. Project objectives
are identified and documented early
during Phase I of the TPP Process to
establish the focus required to achieve
site closeout for the customer. Phases
II and III provide a framework to
develop data collection options for the
customer's consideration during Phase
IV. The project-specific data quality
requirements established throughout
the TPP Process are then documented
as data quality objectives during Phase
IV. Many other documentation tools
within this EM also encourage detailed
data collection planning and contribute
to maintaining institutional site
knowledge.2
Figure 1 The TPP Process2
Phase I of the TPP Process involves defining the project. Decision-makers and technical personnel
are brought together. The personnel represent the owner or customer, regulatory agency(ies), and
other key stakeholder groups. Each is provided with all available site information as well as the
owner's concept of site closure (or walk away goal), schedule and budget. Phase I is designed to
"front-load" conflicts and decision-making. Outputs include specific project objectives and a clear
definition of the scope of the project or compliance issue.
Phase II involves an evaluation to determine if additional data are required to satisfy the site specific
project objectives. Data needs are determined and documented. This phase supports the detailed
planning required to execute the current project and subsequent stages. "Data Users" work to
identify "basic" data needs, "optimum" data needs that are cost-effective and prudent to fill during
the current project for a future executable phase, and any "excessive" data needs specifically
requested by someone besides the Data Users. Data Users are defined by data user perspective (i.e.,
-------
Risk, Compliance, Remedy, and Responsibility Data Users).
Phase III ensures the owner or customer will have the information required for business decisions.
"Data Implementors" define sampling and analysis approaches, develop data collection options, and
document data collection decisions. The Data Implementors consider analytical and field source
error to ensure the resultant data are usable for their intended use. Data need trade-offs are discussed, and
sample locations and numbers are adjusted to meet cost and schedule constraints. Other issues
resolved include probabilistic and non-probabilistic sampling, field screening and analysis, expedited
characterization techniques, and dynamic work planning.
Phase IV planning activities challenge the team to discuss data collection options and finalize a data
collection program that best meets the owner's or customer's short and long-term goals. The data
collection program is finalized and documented. Project-specific data quality objective statements
are produced and agreed to by all project team members.
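As a rough illustration of the Phase II through Phase IV concepts above, the sketch below catalogues
data needs by Data User perspective and by the "basic," "optimum," and "excessive" priorities
described in the text. The category names follow the paper's terminology, but the specific data-need
entries are invented for illustration.

    # Hypothetical sketch: cataloguing Phase II data needs by Data User
    # perspective ("Risk", "Compliance", "Remedy", "Responsibility") and by
    # priority ("basic", "optimum", "excessive"); the example entries are invented.
    from dataclasses import dataclass

    @dataclass
    class DataNeed:
        description: str
        perspective: str   # Risk, Compliance, Remedy, or Responsibility
        priority: str      # basic, optimum, or excessive

    data_needs = [
        DataNeed("Groundwater VOC concentrations at the site boundary", "Compliance", "basic"),
        DataNeed("Surface soil lead for residential exposure scenario", "Risk", "basic"),
        DataNeed("Aquifer hydraulic conductivity for remedy screening", "Remedy", "optimum"),
        DataNeed("Historical aerial photos for cost allocation among PRPs", "Responsibility", "optimum"),
        DataNeed("Sitewide sampling on a 10-foot grid", "Compliance", "excessive"),
    ]

    # "Excessive" needs are candidates for deletion during Phase IV trade-off
    # discussions; "basic" needs define the minimum data collection scope.
    for priority in ("basic", "optimum", "excessive"):
        selected = [d.description for d in data_needs if d.priority == priority]
        print(priority, "->", selected)

Organizing data needs this way makes the Phase III and Phase IV trade-off discussions explicit: what
must be collected now, what is prudent to collect for a future executable phase, and what can be cut.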
Value Management (VM)
Value Analysis and Value Engineering (commonly
referred to as VE) techniques evolved out of the
necessity to find substitutions for manufacturing
materials during World War II. Today, value
management (VM) methodologies are used across a
wide variety of manufacturing, construction, and service
industries to maximize project value.
VM is an organized, systematic approach to achieve
best value when planning or reviewing a project, product, service, or program. A multi-disciplinary
team, led by a Certified Value Specialist (CVS), participates in a VM study in a workshop setting that
involves these efforts3:
• Information Gathering
• Function Analysis
• Creative Speculation
• Alternative Evaluation
• Alternative Development
• Presentation of Recommendations
Comparison of TPP Process and VM Methodologies
The TPP Process parallels the standard VM Job Plan. A team's use of the TPP Process involves
specific analytical techniques, creative efforts, and evaluation methods in a systematic sequence of
50 activities. In total, the VM Job Plan involves 28 separate steps, with a specific step called
Function Analysis. Within the TPP Process, the equivalent functional analysis activities are actually
accomplished throughout the four-phase TPP Process. The TPP Process and traditional VM
methodologies are most commonly employed during quality planning and environmental decision-
making phases of work, respectively. The relationship between value, quality and performance, and
cost is the basis for VM and can be expressed as follows:

    Value = (Quality + Performance) / Total Life-Cycle Costs

Traditional VM methodologies can be used during design,
construction, and operation of environmental remediation systems (Figure 2).
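To make the VM value relationship concrete, here is a minimal worked example. The quality and
performance scores and the cost figures are invented, loosely patterned on the 42 percent life-cycle
cost reduction cited later in this paper; with quality and performance held constant, the alternative
with the lower total life-cycle cost yields the higher value.

    # Illustrative arithmetic only; the scores and costs are invented.
    def value_index(quality, performance, total_life_cycle_cost):
        """Relative value per the VM relationship: (quality + performance) / cost."""
        return (quality + performance) / total_life_cycle_cost

    # Two alternatives that meet the same quality and performance requirements
    baseline = value_index(quality=8, performance=8, total_life_cycle_cost=8_800_000)
    proposal = value_index(quality=8, performance=8, total_life_cycle_cost=5_100_000)

    print(f"baseline value index: {baseline:.2e}")
    print(f"proposal value index: {proposal:.2e}")
    print(f"value improvement: {proposal / baseline:.2f}x")  # roughly 1.73x for a ~42% cost cut

The point of the exercise is simply that value rises either by improving quality and performance or by
reducing total life-cycle cost while the required quality and performance are preserved.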
Figure 2. VM Workshop Opportunities
VM for Environmental Compliance
and Decision-Making
VM techniques can be leveraged to resolve
environmental compliance issues; to plan site
investigations; to evaluate feasible site
remedies and compliance alternatives; to
design remedial systems; and to optimize
monitoring activities. For example, a CVS
will use Function Analysis during a VM
workshop to first define the "why" before
systematically establishing the "how" of the
overall project, and several key project
components or issues.
Use of VM techniques maximizes project
value as a result of the multi-disciplinary
team's efforts to identify alternate methods,
means, and design elements that meet the
quality and performance requirements at the
minimum cost.
Application of the TPP Process
The following project summaries show how use of the TPP Process improved project execution on
a variety of recent regulatory compliance and environmental projects.
Remediation Investigation and Monitoring
A highly successful TPP Process workshop was conducted for a complex hazardous waste site
located near Buffalo, New York. The goal of the workshop was to establish the objectives for the
remedial investigation and monitoring at the 191-acre site. The CVS team leader convened 12
owner representatives, 6 remediation contractors, and 2 regulatory agency staff for a two-day
workshop. Results included:
• Determined site characterization could be performed using the existing groundwater data.
• Developed effective human health risk evaluation plan for multi-level exposures.
• Estimated cost avoidance in excess of $1 million (reduction in total life-cycle costs).
-------
The savings were more than 25 percent of the project costs and represented a 30 to 1 return on the
TPP investment!
Redevelopment Program Impacted by Contaminated Site
The TPP Process was used for a contaminated site where site improvement plans called for a
roadway realignment and future airport expansion in St. Louis, Missouri. A CVS team leader
involved representatives from the owner, Federal and State of Missouri regulators, and other
potentially responsible parties (PRPs) to achieve the following results:
• $2.5 million cost reduction when excessive data needs were deleted.
• $500,000 increase to add omitted sample collection and analysis needed for site decisions.
• 19 specific action items were identified and directly linked to short-term project schedule.
• "Train-wreck" schedule was met and regulatory review cycles were significantly reduced.
• Clear planning objectives enabled field personnel to further control costs and make better field
decisions.
• Changed conditions were better managed and acceptable data was collected on schedule and
within budget.
• Legal issues related to future land use and PRPs were resolved before fieldwork began.
Brownfield Utility Site Planning
A one-day TPP workshop was conducted by technical team members to begin environmental
investigation and remediation of a brownfield utility site in Tulsa, Oklahoma. Use of the TPP
Process highlighted several potential areas of concern and issues that otherwise would likely not
have been considered. Results of this workshop included:
• Team realized that roles and responsibilities of all decision-makers and stakeholders needed
to be better defined and understood.
• Team requested information regarding the Customer's site-specific schedule requirements
and expectations.
• Determined that both the "Compliance" and "Responsibility" Data User perspectives would
need to contribute to the project planning.
• Determined evaluation of potential points of compliance and media of concern must be
prioritized for the efforts to remain within the project budget.
• Team decided that very precise project objectives were required to better define the current
project and ensure the $50,000 funding constraint was not exceeded.
• Created a strategy for use of the limited funding so the owner would be ready for later
discussions with Federal and State Regulators, as well as City of Tulsa representatives.
The rest of the story on this project is that use of the TPP Process helped the team discover that
neither the first nor a second candidate brownfield site was suitable for the Customer's intended
use. Iterative use of the TPP Process ultimately enabled the team to select a third site and
successfully complete the project within the Customer's cost and schedule constraints.
-------
Application of the VM Methodology
The following project summaries show how use of the Value Management (VM) Methodology
improved project execution on several regulatory compliance and environmental projects.
Environmental Permitting: Underground Storage Tank Removal & Replacement
Five environmental design and remediation personnel, led by a CVS, worked with part-time
participants representing the owner, and several design engineers and contractors. This VM
workshop involved an extensive tour of the multiple sites located in San Diego, California, to ensure
the team complied with existing environmental permits and these regulations:
• Clean Water Act, 40 CFR Parts 104-140, 401, 403, and 433
• California Code of Regulations, Title 22 (tank storage of non-RCRA wastes)
• State of California's Department of Toxic Substances Control
• San Diego County Air Pollution Control District
• San Diego County Hazardous Materials Management Division
Ten proposals were developed and recommended for implementation. Potential VM savings of
$362,000 corresponded with a potential project cost savings in excess of 12 percent. The team also
documented three specific design suggestions to improve constructability.
Alternative Groundwater Treatment
This CVS-led VM workshop was conducted to compare groundwater treatment technologies for two
sites near Baltimore, Maryland. The evaluation focused on total life-cycle costs (e.g., process
equipment, chemicals, waste disposal, electrical usage) directly related to the groundwater treatment
technologies under consideration.
During the 3-day VM workshop, Chemical, Process, and Environmental Engineers completed
detailed analyses of life-cycle chemical costs based on the treatment dosages required to treat
extracted groundwater flows; investigated relative operability and maintainability of alternative
groundwater treatment systems; and developed and evaluated two disposal options for spent
stripping solutions. This VM workshop produced a proposal that will reduce life-cycle costs by 42
percent (a savings of $3.7 million) as compared to the previously planned approach!
Hazardous Waste Storage Facility
An eight-person, CVS-led team conducted a 4-day VM workshop on the Conceptual Design of a
$5.4 million hazardous waste storage facility near Dayton, Ohio. Structural, Electrical, Mechanical,
and Environmental Engineers were selected for this VM workshop to ensure construction and
operation of the new facility would be in compliance with all Federal, State, and local regulations.
VM workshop results included:
• Generated 278 alternative ideas;
• Developed the 60 top-ranked alternatives;
• 32 VM workshop proposals were fully developed and recommended;
• 10 design suggestions were documented to improve constructability;
-------
• Over $400,000 of proposals offered to improve the building design scheme; and
• Site modification proposals ranged from $2,300 up to $112,800.
Groundwater Treatment Facility
CVS-led VM workshops were conducted to improve design and operation of a $10 million
groundwater extraction, treatment, and discharge system for an urban site in Phoenix, Arizona.
Five remediation specialists and the owner's technical staff completed a 4-day VM workshop on
the Preliminary Design. Fourteen recommended proposals and 30 specific design suggestions were fully
developed. Potential capital cost savings of the recommendations ranged from $10,000 to $687,000.
Potential life-cycle cost savings were estimated in excess of 40 percent for a VM proposal that
recommended an alternate configuration for the GAC treatment train.
A CVS led nine other personnel, specializing in environmental remediation and constructability
issues, during a 5-day VM workshop of the Detailed Design. This workshop involved collaborative
input from a few owner personnel and the designer's Project Manager. Twenty-three recommended proposals
were fully developed during the VM workshop and presented for consideration by the owner.
Potential capital and annual O&M cost savings of recommended proposals ranged up to $270,800
and $30,420, respectively.
Soil Excavation, Decontamination, and Institutional Controls
This 5-day VM workshop, conducted by a six-person, CVS-led VM team, addressed a Preliminary
Design for a highly urbanized site in Buffalo, New York. The VM team performed a site visit to
validate workshop assumptions and to ensure technically sound alternatives were being developed.
Estimated total life-cycle cost savings for individual recommendations ranged from $32,000 to
$3,290,000. Implementation of accepted recommendations is forecast to save more than $4
million, avoiding 17 percent of the project costs.
Summary and Conclusions
TPP and VM workshops guide a team to deliberately analyze a project until the quality and
performance requirements are established. Only then can a multi-disciplinary team effectively
identify alternate methods, means, and design elements that meet the quality and performance
requirements at a minimum cost, and deliver the best value possible. The TPP Process and VM tools
and techniques can obviously help to ensure the required data quality is achieved for environmental
compliance issues and decision-making. In particular, these approaches can be leveraged during
instances of scope creep, imperfect coordination, time constraints, and when the consensus seems
to be that "more is better" or "we have always done it this way."
The discussions and anecdotes in this paper should have introduced you to how quality planning and
VM can be leveraged at sites and facilities. Environmental professionals, regulators, and other
stakeholders are now encouraged to begin or expand their application of these techniques on
ongoing and future projects!
-------
References
1 Fong, Patrick Sik-wah, 1999. Charting the Future Directions of Value Engineering, SAVE
International Conference Proceedings, 1999.
2 U.S. Army Corps of Engineers (USACE), 1998. Technical Project Planning (TPP) Process,
Engineering Manual (EM) 200-1-2, August 31, 1998.
3 Value Methodology Standard, SAVE International (October 1998).
-------
ISO 19011:2002 -
A COMBINED AUDITING STANDARD
FOR QUALITY AND ENVIRONMENTAL MANAGEMENT SYSTEMS
Gary L. Johnson, U.S. EPA, Office of Environmental Information
Abstract — In a precedent-setting decision in 1998, the International Organization for
Standardization (ISO) directed ISO Technical Committee (TC) 176 on Quality
Management and ISO TC 207 on Environmental Management to develop jointly a single
guideline standard for auditing quality and environmental management systems. When
approved, this standard would replace ISO 10011-1, ISO 10011-2, and ISO 10011-3 on
quality auditing and ISO 14010, ISO 14011, and ISO 14012 on environmental auditing.
A Joint Working Group (JWG) was established comprising experts from both TC 176
and TC 207 to develop the new standard, ISO 19011, Guidelines on Quality and/or
Environmental Management Systems Auditing, and to incorporate lessons learned from
efforts to improve compatibility between ISO 9001/9004 and ISO 14001/14004, the
standards for quality and environmental management systems, respectively. Work is
proceeding on the development of ISO 19011 with an expected completion in the spring
of 2001.
INTRODUCTION:
This paper discusses the combined auditing standard, ISO 19011:2002, Guidelines on Quality
and/or Environmental Management Systems Auditing. It includes a description of the standard, a
discussion of relevant issues addressed during its development, and a summary of its current
status. Following the approval of the ISO 14001 and ISO 14004 environmental management
systems (EMS) standards and the start of a revision to the ISO 9000 quality management systems
(QMS) standards, there was considerable interest by ISO in increasing the compatibility between
the EMS and QMS standards. Early in the discussions, it became clear that the similarities
among the existing EMS and QMS auditing standards would make them a prime candidate for
integration.
The Joint Working Group (JWG) was created by ISO to develop the new standard. The JWG
would have co-conveners, one from TC 176 and one from TC 207, and experts would be drawn
from both technical committees. Because this venture had never been attempted by ISO before,
the ground rules for operating the standard-setting process also had to be revised. Both TC 176
and TC 207 would participate fully in the process. Ballots would be sent to national member
bodies for both technical committees, but ISO's rule of "one country, one vote" would require
that both TC's agree on the vote for a particular ballot. Otherwise, a country's vote would not be
counted. To ensure that a consensus position is reached in the USA, the U.S. Technical Advisory
Groups (TAGs) to TC 176 and TC 207 formed a Liaison Group with representatives from TAG
176/Subcommittee 3 on Quality Auditing and TAG 207/ Subcommittee 2 on Environmental
Auditing to formulate the USA position on ballots.
-------
In November 1998, the first meeting of the JWG to develop a common auditing standard was
held in The Hague, The Netherlands. Experts from TC 176 and TC 207 representing 34
countries attended that meeting with the purpose of charting the development process for the new
standard. From the outset, the stronger experience was with the quality auditing standards. The
environmental auditing standards had been published only for a little over two years and there
wasn't much experience in their use. While very similar, there were some distinct differences
between the quality auditing philosophy and that of environmental auditing. Issues getting early
attention included auditor competency, usability by small-to-medium enterprises (SMEs) and
developing countries, and the structure of the standard.
By the spring of 1999 and the second JWG meeting in Buenos Aires, Argentina, an initial
Working Draft (WD.l) of the standard had emerged. Discussions were held at the TC 207
meeting in Seoul in June 1999 and at the TC 176 meeting in San Francisco in September 1999,
which resulted in the first Committee Draft (CD.l) of ISO 19011. CD.l was balloted in late
1999 and more than 1400 comments from 35 countries were received by the JWG Secretariat by
the end of February 2000.
The JWG met in Berlin, Germany, in March 2000, to address the comments on CD.l. The JWG
was divided into two sub-groups, one to address comments on the structure and process aspects
of the standard, and one to address the comments on auditor competency. Each sub-group had
about half of the comments. After considerable debate, the draft for CD.2 emerged and was
balloted for comments in April 2000. The comments were received in August and were
addressed by the JWG in Cancun, Mexico in September. The Cancun meeting produced CD.3,
which was subsequently distributed for comments in late fall 2000. The international comments on
CD.3 will be addressed in Sydney, Australia, in March 2001. Depending on the comments and
ballot results, the Sydney meeting should produce a Draft International Standard for a six month
ballot among the ISO member countries. The goal is to publish ISO 19011 as an international
consensus standard in early 2002.
PURPOSE OF THE STANDARD:
ISO 19011 is intended to provide guidelines for auditing ISO 9001-based quality management
systems and ISO 14001-based environmental management systems. The standard will replace
the following current standards:
ISO 10011-1, -2, -3, Guidelines for Auditing Quality Systems
ISO 14010, Guidelines for Environmental Auditing - General Principles
ISO 14011, Guidelines for Environmental Auditing - Audit Procedures - Auditing
of Environmental Management Systems
ISO 14012, Guidelines for Environmental Auditing - Qualification Criteria for
Environmental Auditors
-------
ISO 19011 reflects the changes made to ISO 9001:2000, Quality Management Systems -
Requirements, published in December 2000, including the new business
model for the standard. ISO 19011 is intended to apply to both internal and external auditing,
and may be used as part of auditor certification and training.
STRUCTURE OF ISO 19011:
The structure of ISO CD.3 19011 is as follows:
0   Introduction
1   Scope
2   Normative References
3   Terms and Definitions
4   Principles of Auditing
5   Managing an Audit Program
6   Audit Activities
7   Competence of Quality and/or Environmental Management System Auditors
The standard also contains an informative annex on examples of the evaluation process for audit
team selection. It also includes several diagrams to aid users in understanding and
using the guidance.
ISO 19011 is a guideline standard which means its use is not mandatory unless it is invoked as
part of a multiple party agreement, such as contract or other legal agreement. As a guideline
standard, its implementation is generally not auditable because the elements of the standard are
not requirements and because there may be other ways of accomplishing the same objectives.
ISO 19011 is generally organized in two parts: Clauses 4 through 6 address the process of
planning, conducting, and evaluating audits and Clause 7 addresses issues pertaining to auditor
competence and selection. Clause 0, Introduction, assists the reader in understanding the reason
for the standard and who might use it. Clause 1, Scope, defines the scope and applicability of the
standard which extends beyond QMS and EMS auditing.
THE AUDIT PROCESS:
Clause 4 - Principles of Auditing:
The standard provides a brief summary of auditing principles in Clause 4. These principles
should be used to drive the establishment and implementation of the audit process for an
organization. Key among the principles cited for auditor behavior are:
ethical conduct - the foundation of professionalism,
fair presentation - the obligation to report truthfully and accurately, and
-------
due professional care - application of reasonable care in auditing.
Two other principles of auditing relate to the audit process primarily. They are:
independence - the basis for impartiality and objectivity of the audit conclusion,
and
evidence - the rational basis for reaching audit conclusions.
Clause 5 - Managing an Audit Program:
Clause 5 provides guidance for those who need to establish and maintain an ongoing set of audits
for an organization. The standard utilizes the Plan-Do-Check-Act cycle to define the audit
program. Some of the key actions addressed are:
establishing the objectives and extent of the audit program,
establishing the responsibilities, resources, and procedures,
ensuring the implementation of the audit program,
monitoring and reviewing the audit program to improve its efficiency and
effectiveness, and
ensuring that appropriate program records are maintained.
Because the standard may be applied to internal and external auditing, establishing the objectives and
extent of the audit program is a critical early step in defining the audit program for a particular
organization or application. Any audit program must be managed by managers having
appropriate authorities and resources to implement the program.
The audit program must also address the possibility of "combined audits" and "joint audits." A
"combined audit" occurs when a QMS and EMS are audited at the same time by the same audit
team. A "joint audit" occurs when two audit teams cooperate to audit an organization during the
same period with one team auditing the QMS and the other team auditing the EMS.
The audit program must be monitored and reviewed to ensure its ongoing effectiveness in
meeting the needs of the organization. Adjustments to the audit program should be made when
needed in order to foster improvements.
Clause 6 - Audit Activities:
Clause 6 describes the six general steps in planning and conducting an audit. The steps include:
initiating the audit,
initial document review,
-------
preparing for on-site audit activities,
performing on-site audit activities,
reporting the audit results, and
audit completion (including any follow-up activity that may be needed).
Initiating an audit requires consideration of several factors, including:
having defined audit objectives,
confirmation that the audit is feasible, and
establishing a satisfactory audit team.
Once established, the audit team will review any available documents pertaining to the audit and
prepare for the on-site phase of the audit, including the logistics required and arrangements (such
as travel) to be made, specific assignments to audit team members, etc.
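As a minimal sketch of the Clause 6 flow, the six steps listed above can be treated as a simple ordered
workflow. The step wording and the status-tracking mechanics below are illustrative only; they are
not part of the standard.

    # Sketch only: the six steps mirror the Clause 6 outline described above;
    # the tracking logic is invented for illustration.
    AUDIT_STEPS = (
        "initiate the audit",
        "initial document review",
        "prepare for on-site audit activities",
        "perform on-site audit activities",
        "report the audit results",
        "complete the audit (including any follow-up)",
    )

    def next_step(completed):
        """Return the next step given a set of completed steps,
        enforcing that the steps are taken in order."""
        for step in AUDIT_STEPS:
            if step not in completed:
                return step
        return None  # audit finished

    done = {"initiate the audit", "initial document review"}
    print(next_step(done))  # -> "prepare for on-site audit activities"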
Whether a QMS or EMS audit, the on-site activities are similar and include:
opening meeting with the auditee,
roles and responsibilities of guides (as needed),
collection and verification of information,
audit findings,
communication with the audit client and auditee,
preparation of the closing meeting, and
closing meeting.
Reporting on the audit results is a critical step and must accurately reflect what transpired during
the audit. The key is to address the extent of conformance to the audit criteria, the effectiveness
of the management system implementation, and the ability of the management review process to
assure the continuing suitability and effectiveness of the management system. This is a
significant difference from QMS audit criteria in the past when auditors frequently commented
on the suitability and effectiveness of the management system itself. This was inappropriate for
two reasons: (1) management is responsible for assessing the value (i.e., "suitability and
effectiveness") of the management system and (2) the auditors may lack critical knowledge
about the organization's operations in order to assess the value of the management system.
The standard provides for audit follow-up as needed to confirm that all non-conformances have
been addressed.
-------
COMPETENCE OF QUALITY AND/OR ENVIRONMENTAL MANAGEMENT
SYSTEM AUDITORS:
It is a given that auditors must be competent to perform their assigned tasks. The extent to which
ISO 19011 should address auditor competence has been the principal source of debate among the
JWG members. There is no question that guidance is needed to define the general areas of
competence based on:
education,
audit experience,
auditor training,
work experience, and
personal attributes.
The guidance also describes the knowledge, skills, and personal attributes needed for an audit team
leader, and addresses the unique competence needed for combined audits.
The biggest issue is whether or not the standard should specify minimum levels of training or
work experience. Sentiments have been strong that this is needed, but representatives of some
developing countries have expressed concern that the requirements are too burdensome for them.
In CD.3, Clause 7 contains a table of "recommended education, training, and work and audit
experience." While ISO 19011 is officially a guideline, inclusion of this table in the standard
would imply that these are minimum experience levels. The USA has been concerned that this
table reflects experience levels appropriate to certification or registration audits by third parties
and that some users could be influenced to apply the table to other audit situations, including
internal audits and second-party supplier audits. The USA has proposed that the table be deleted
or, as a best case, moved to an Informative Annex of the standard with additional examples that
cover the full range and scope of auditing to be addressed by the standard. Each national
standards body would be responsible for defining the minimum experience levels appropriate for
auditors, recognizing that there are differences between the major industrialized nations and the
developing countries in terms of capabilities.
The standard does define general areas of competence that should be considered when
determining the suitability of an auditor. These include competence in:
audit procedures, methods, and techniques;
management systems and related documents;
organizational situations; and
relevant laws, regulations and other requirements.
-------
The standard also provides guidance for the maintenance of auditor knowledge and skills,
including continuing professional development and auditing ability. This would be assured
through implementation of an auditor evaluation process.
CONCLUSIONS:
ISO 19011 CD.3 has accomplished several important objectives in the development of the
standard:
the contents of ISO 10011-1, -2, and -3 have been fully incorporated into the
standard;
the contents of ISO 14010, ISO 14011, and ISO 14012 have been fully
incorporated into the standard;
the interests of the environmental and quality communities have been successfully
integrated into one document; and
the new standard has been made easier to use with a logical structure and with a
number of diagrams and examples.
While some critical issues remain to be resolved, the USA remains optimistic that they will be
resolved in Sydney, Australia, in March 2001 and that a Draft International Standard will emerge
for approval and publication by early 2002.
REFERENCES:
1. ISO 19011/CD.3, Guidelines on Quality and/or Environmental Management Systems
Auditing. International Organization for Standardization, Geneva, Switzerland
(October 2000).
2. ANSI/ISO/ASQ Q9001:2000, Quality Management Systems - Requirements. American
National Standards Institute, New York, NY (December 2000).
-------
THE ROLE OF FIELD AUDITING IN
ENVIRONMENTAL QUALITY ASSURANCE MANAGEMENT
Mr. Daniel Claycomb, Director of Geosciences at Environmental Standards, Inc.,
Valley Forge, Pennsylvania
Abstract — Environmental data quality improvement continues to focus on analytical
laboratory performance, with little, if any, attention given to improving the performance
of field consultants responsible for sample collection. Many environmental
professionals often assume that the primary opportunity for data error lies within the
activities conducted by the laboratory. Experience in the evaluation of environmental
data and project-wide quality assurance programs indicates that an often-ignored factor
affecting environmental data quality is the manner in which a sample is acquired and
handled in the field. If a sample is not properly collected, preserved, stored, and
transported in the field, even the best of laboratory practices and analytical methods
cannot deliver accurate and reliable data (i.e., bad data in equals bad data out). Poor
quality environmental data may result in inappropriate decisions regarding site
characterization and remedial action.
Field auditing is becoming an often-employed technique for examining the performance of the
environmental sampling field team and how their performance may affect data quality. The field
audits typically focus on: (1) verifying that field consultants adhere to project control documents
(e.g., Work Plans and Standard Operating Procedures [SOPs]) during field operations; (2)
providing third-party independent assurance that field procedures, quality assurance/quality
control (QA/QC) protocol, and field documentation are sufficient to produce data of satisfactory
quality; (3) providing a "defense" in the event that field procedures are called into question; and
(4) identifying ways to reduce sampling costs.
Field audits are typically most effective when performed on a "surprise" basis; that is, the
sampling contractor may be aware that a field audit will be conducted during some phase of
sampling activities but is not informed of the specific day(s) that the audit will be conducted. The
audit should also be conducted early in the sampling program so that deficiencies noted
during the audit can be addressed before the majority of field activities have been completed.
A second audit should be performed as a follow-up to confirm that the recommended changes
have been implemented.
A field auditor is assigned to the project by matching, as closely as possible, the auditor's
experience with the type of field activities being conducted. The auditor uses a project-specific
field audit checklist developed from key information contained in project control documents.
Completion of the extensive audit checklist during the audit focuses the auditor on evaluating
each aspect of field activities being performed. Rather than examine field team performance
after sampling, a field auditor can do so while the samples are being collected and can apply
real-time corrective action as appropriate.
As a result of field audits, responsible parties often observe vast improvements in their
consultant's field procedures and, consequently, receive more reliable and representative field
-------
data at a lower cost. The cost savings and improved data quality that result from properly
completed field audits make the field auditing process both cost-effective and functional.
INTRODUCTION
A commonly overlooked factor in developing and implementing environmental quality assurance
systems is providing a means for verifying that field personnel are conducting each aspect of field
investigation activities in compliance with project control documents. Environmental professionals
commonly assume that the primary opportunities for adverse impact to environmental data quality
lie within the walls of the laboratory. Consequently, quality assurance programs focus on laboratory
operations and consist of developing detailed laboratory analytical procedures, evaluation of
laboratory blind duplicate and performance evaluation sample results, stringent laboratory auditing
programs, and rigorous validation of laboratory data. Although these laboratory quality assurance
activities are necessary and valuable, the relevance of their findings is limited unless the
environmental samples were properly collected, stored, and transported and proper sample custody
was maintained.
The preparation of project control documents for the implementation of field activities (e.g., work
plans, health and safety plans [HASPs], sampling and analysis plans [SAPs], standard operating
procedures [SOPs], etc.) is only the beginning of the quality assurance process for field
investigations. It is critical that all field personnel review and follow the provisions of the control
documents. To make sure that field personnel are not deviating from the approved procedures
specified in the control documents (or that field personnel are justified in deviating from the
procedures and are appropriately documenting the deviations), it is important to conduct independent
third-party field audits of field sampling teams.
Field audits are conducted for three primary purposes: (1) to verify that procedures identified in the
project control documents are adhered to during field operations; (2) to provide verification from an
independent organization that field procedures, quality assurance/quality control (QA/QC) protocol,
and field documentation are sufficient to produce data of satisfactory (usable) quality; and (3) to
provide project stakeholders with a "defense" in the event that field procedures are called into
question.
FIELD AUDITING PROCEDURES
Typically, field audits are done on a "surprise" basis; that is, the sampling organization is aware that
a field audit will be conducted during some phase of sampling activities but is not informed of the
specific day(s) that the audit will occur. Ideally, the audit is conducted early in the field program
such that deficiencies noted during the audit can be addressed before the majority of field activities
have been completed. Often, a second audit is conducted later in the program to confirm that the
recommended changes have been implemented. Rather than examine field team performance after
-------
sampling, a field auditor can do so while the samples are being collected and can apply real-time
corrective action as appropriate.
Once the need for a field audit has been determined, a field auditor is assigned to the project by
matching, as closely as possible, the auditor's experience with the type of field activities being
conducted. The designated field auditor then reviews the applicable field event control documents.
A project-specific field audit checklist is developed based on the information in these documents.
The multi-page checklist follows general headings (such as pre-task planning, field documentation,
sample containers, sampling activities, QA/QC samples, chain-of-custody, decontamination, sample
packaging, waste management, and health and safety) to group field activities. Completion of an
extensive audit checklist during the audit focuses the auditor on evaluating each aspect of field
activities being performed. When developing the checklist, the auditor also reviews field team
performance with the client for input regarding any field team deficiencies that the client has
observed or about which other involved parties (e.g., the laboratory) may have informed the client.
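As a rough sketch of how such a project-specific checklist might be organized, the structure below
uses the general headings described above; the individual check items and the deficiency-recording
scheme are hypothetical, not drawn from any particular audit program.

    # Hypothetical sketch of a project-specific field audit checklist; the headings
    # follow the general groupings described above, the items are invented examples.
    checklist = {
        "pre-task planning": [
            "Field team has current, approved Work Plan and SOPs on site",
            "Daily objectives reviewed at tailgate meeting",
        ],
        "sample containers": [
            "Correct container types and preservatives for each analyte group",
            "Container labels completed before collection",
        ],
        "chain-of-custody": [
            "Custody form initiated at time of collection",
            "Samples in custody or secured at all times",
        ],
        "decontamination": [
            "Non-dedicated equipment decontaminated between locations",
        ],
    }

    def record_findings(observations):
        """Given {item: True/False} observations, return the list of deficiencies
        (items observed not to conform) for the audit report."""
        return [item for item, conforms in observations.items() if not conforms]

    # Example usage with invented observations
    obs = {item: True for items in checklist.values() for item in items}
    obs["Container labels completed before collection"] = False
    print(record_findings(obs))

Organizing observations against the checklist in this way makes it straightforward to carry only the
noted deficiencies and associated recommendations forward into the audit report.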
On the day of the audit, the auditor should arrive at the project site at the time the field crew arrives;
this will provide the auditor the opportunity to communicate the purpose of the visit and will allow
observation of each aspect of field activities in their entirety. During introductions, the auditor
describes how the audit will proceed during the course of field activities and explains that the
auditor's role is not an adversarial one and that the goal of the auditor, as part of the project team, is to
collect the highest quality of data possible so that the data accurately represent actual site conditions.
The field crew must also be aware that, although the auditor will attempt to minimize interruptions
to field activities, the auditor's presence and questioning throughout the day will likely slow the pace
of field activities. In some cases, there may be delays while corrective actions (in response to the
auditor observing a significant deviation or deficiency) are discussed. The auditor should be given
the authority to act on behalf of the client and to delay and/or cease field activities until significant
deficiencies have been addressed.
Upon completion of the field audit, the field team leader attends a debriefing meeting (typically held
the same day as the audit at the end of the day) with a client representative and the field auditor to
discuss the auditor's observations and recommendations. It is best that each of these parties be
present so that audit findings can be openly discussed and corrective actions can be determined and
implemented immediately. If all parties cannot meet at this time, the auditor should review each of
his findings with the field team leader, should indicate that he will present these same findings to the
client, and should stipulate that any corrective action will be communicated as soon as possible to
the field team leader by the client.
As soon as possible following the audit, the auditor prepares a field audit report that summarizes
audit events, presents deficiencies observed during the audit, and provides recommendations to
address the observed deficiencies. In addition, the report should identify the type and source of
deviations from the control document of concern. Due to the nature of a field audit, some critical
statements will be presented in the report. The statements should be based on observations made in the
-------
field and should address only those areas in which project field team deficiencies were noted and where
changes may be appropriate. An exhaustive list of the activities performed in accordance with the project
control documents and observed to be in compliance with standard industry protocol is typically not
presented in the audit report.
WHICH SAMPLING PROGRAMS AND FIELD ACTIVITIES SHOULD BE AUDITED
The need, reason, and usefulness of conducting field audits are not limited to certain types of clients,
projects, or sampling activities. Field audits have proven to be a valuable part of sampling
programs conducted at manufacturing facilities, former landfills, wastewater treatment plants, and
compressor and metering stations along natural gas pipelines. There is a diverse range of reasons
why a particular site is undergoing an environmental assessment. Some sites are involved in
property transactions (where a potential buyer may wish to assess environmental liabilities of a
property before entering a purchase agreement, or a potential seller may want to know what impacts
his/her operations at the site may have had before the property is put on the market, or a property
owner may wish to establish baseline environmental conditions at his/her site before the site is leased
to another party); some sites are under administrative consent orders or memorandums of agreement
with federal or state regulatory agencies; others may be undergoing assessment and remediation
under a voluntary cleanup program; and yet others are being considered for redevelopment under
a Brownfields program. No matter what the reason for the assessment or remedial activity, project
stakeholders deserve and expect that dollars spent on environmental assessment activities generate
accurate and reliable data. The proper preparation for and implementation of field sampling
activities starts the data generation process.
The types of media of concern and the resultant field activities conducted at sites like those described
above will vary a great deal. Sampling programs may require field personnel to collect multi-media
samples, possibly including soil (both surface and subsurface), groundwater, surface water,
sediment, air, surface wipe, wastewater effluent, or chip samples. The methods used to collect these
samples may include drilling operations (such as rotosonic, air rotary, hollow stem auger, and
Geoprobe®), hand sampling tools (such as augers, bailers, trowels, chisels, etc.), motorized equipment
(electric samplers, for instance), pumps (such as peristaltic, submersible, centrifugal, and electrical),
and trenching and excavation activities under personal protection levels B, C, and D.
If one considers the many variables involved in a sampling program (e.g., the types of sites, the
different media and constituents of concern at each site, the purposes and objectives of the sampling
programs, the array of sampling equipment and techniques, the implications of regulatory and
contractual terms and conditions, strict and complete documentation requirements, and unforgiving
project schedules), one can quickly realize that there are enormous opportunities for poorly executed
field activities to have a negative impact on sample data quality and integrity. One way to minimize
the potential for these negative impacts is to have an experienced, qualified, independent, third-party
field auditor develop and implement a field auditing program.
-------
BENEFITS OF FIELD AUDITS
Each of the parties involved in the field auditing process realizes significant benefits from having
field audits conducted. Clients achieve improvements in their consultant's field procedures and,
consequently, receive more reliable and representative field data on which to base important
decisions. There is a cost control aspect that is addressed by the auditing process as well. The audits
help to promote efficient use of project resources, to locate and correct deficiencies immediately,
to verify that the client is getting what was contracted for, and to make sure the
work is done once and done right. In terms of liability control, field audits can be used to support
litigation activities, to help ensure that proper health and safety protocols are followed, and to
confirm that properly trained and appropriately experienced field personnel have been assigned to
the project.
The audited field contractors typically are open-minded, view the field auditor's comments as
constructive, and consider the audit a good learning experience. Field auditors benefit by expanding
their field experience and capabilities through observing other organizations' field methods and through
continual exposure to the environmental industry's most up-to-date, "hi-tech" equipment and
technologies.
CONCLUSIONS
As a result of field audits, responsible parties often observe vast improvements in their consultant's
field procedures and, consequently, receive more reliable and representative field data at a lower
cost. The cost savings and improved data quality that result from properly completed field audits
make the field auditing process both cost-effective and functional.
5
-------
WORKSHOP ON THE USE OF THE GRADED APPROACH
Lou Blume and Pat Lafornara, co-chairs of the Graded Approach Work Group, will lead a
workshop to resolve issues regarding the implementation of the graded approach to
documenting quality assurance in assistance agreements. They will present the issue with
other members of the work group, who will use their own programs as examples.
Discussion will be documented and distributed to attendees.
The graded approach is advocated frequently in EPA QA requirements. The documentation
requirements for assistance agreements are often cited as candidates for applying the graded
approach because of both the circumstances of the program making the awards (many applicants,
no resources for review and oversight, no direct Agency application for the results, etc.) and the
recipients (limited resources, limited proportion of funding by EPA, small effort, no direct
Agency application, QA not a major factor, one-shot project, etc.). However, questions about the
application of the graded approach remain for all concerned. These include whether the graded
approach can limit the ultimate usefulness of the products of the project or program, or whether it is a
necessary policy to keep assistance programs viable. Interpretations of the graded approach
range from streamlining quality system requirements for quality management plans to
eliminating or modifying elements of the quality assurance project plans. Other considerations
include the program's use of systematic planning, training, data/record management and
oversight. The appropriate use of categories and generic documentation is often mentioned.
The workshop will use real examples as the basis for discussion of the application of the graded
approach. Speakers, primarily drawn from the graded approach work group, will present case
studies from their program experience covering the range of EPA programs, including National
Program Offices, Regions and the Office of Research and Development.
Workshop participants will be encouraged to share with and question the presenters, as
participants attempt to apply the principles discussed to their own programs and projects.
Anyone involved in awarding and overseeing assistance agreements, including representatives of
research programs, state and regional partnership programs with larger grants, and small grant
programs with tribes and other non-profits, is invited to attend. Attendees will be asked to bring
information to the workshop that may be relevant to the application of the graded approach to
specific agreements. This may include the intended use of the data and planning context,
program description, size and resources (personnel and dollars) both of the recipients and the
EPA program, available QA guidance (including review or oversight checklists) or requirements,
time frames and management constraints.
The desired outcome is a rational approach to determining acceptable QA planning,
documentation and oversight of assistance agreements. Criteria for addressing the adequacy of
the documentation will be explored and written down during the workshop so that participants
will have them to take home for consideration by their programs.
1
-------
AUTOMATED DATA REVIEW (ADR), CONTRACT COMPLIANCE
SCREENING (CCS), AND ENVIRONMENTAL DATABASE
MANAGEMENT SYSTEM (EDMS) SOFTWARE APPLICATIONS
FOR THE SACRAMENTO DISTRICT FORT ORD PROJECT
Tony Blake, Nicole Ortega, Pam Wehrmann, John Esparza, Richard M. Amano
Abstract — This presentation is an overview of the Contract Compliance Screening
(CCS), Automated Data Review (ADR), and Environmental Database Management
System (EDMS) software programs developed by Laboratory Data Consultants, Inc.
under contract to the Army Corps of Engineers, Sacramento District for the Fort Ord
RI/FS project. The software programs use an electronic data deliverable (EDD) format
based upon data elements originally documented in the Implementation Guide for the
Department of Energy Environmental Management Electronic Data Deliverable Master
Specification (DEEMS). The software was developed on a Microsoft ACCESS 97
platform. Customized modules perform automated data review (EPA Level 3) and
provide the user with discrete data qualification. The qualified data is exported into a
master database for overall project use.
The EDD format includes QA/QC batch links and routine accuracy and precision
parameters such as surrogate, matrix spike, and laboratory control sample recoveries.
In addition, initial and continuing calibration and GC/MS tuning data are delivered in
this format. Development of the EDD integrated these data elements required by end
users with consideration for the current data deliverable capabilities of commercial
laboratories.
The CCS software verifies the completeness and compliance with the EDD format. The
software references a project specific library built upon the QAPP in verifying
compliance and completeness. EDD deficiencies are detailed in an outlier report.
Access to the EDD file in table format allows for quick and easy correction of errors.
The ADR software is initiated by the data user (e.g., Army Corps chemist, prime
contractor, etc.) to review analytical data based upon project specific criteria. Upon
execution of the program, data is qualified using standard Army Corps/EPA data flags
and exported into a master database. Command buttons generate reports such as a
rejected data table, method blank contamination, surrogate outliers, etc. Forms and view
screens also provide on-line review of data qualifiers.
The Environmental Database Management System (EDMS) compiles the validated data
downloaded from the ADR system. The database program has user functions that allow for
comparison of primary data versus QA split lab data, comparison of results against project
action limits or PRGs/MCLs, and calculation of the completeness values for
each test over any period of time. The four types of completeness values include
contract, analytical, technical, and field sampling completeness.
In conclusion, the CCS, ADR, and EDMS software programs were developed as tools to
support technical staff in the evaluation of analytical chemistry data using an expedited
and cost-effective automated process. The EDD provides a standardized format. This
format allows for streamlining at the laboratories to produce deliverables which can be
verified immediately for completeness and compliance against project criteria using the
1
-------
CCS software. The EDMS allows the data end user to efficiently evaluate large data sets
for key indicators and ultimately determine the usability of the data.
Introduction
This poster presentation is an overview of the Contract Compliance Screening (CCS), Automated
Data Review (ADR), and Environmental Database Management System (EDMS) software
programs developed by Laboratory Data Consultants, Inc. under contract to the Army Corps of
Engineers, Sacramento District for the Fort Ord RI/FS project. The software programs use an
electronic data deliverable (EDD) format based upon data elements originally documented in the
Implementation Guide for the Department of Energy Environmental Management Electronic
Data Deliverable Master Specification (DEEMS). The software was developed on a Microsoft
ACCESS 97 platform. Customized modules perform automated data review (EPA Level 3) and
provide the user with discrete data qualification. The qualified data is exported into a master
database for overall project use.
Summary
The Electronic Data Deliverable (EDD) format is divided into a three-table relational structure.
The tables are linked with selected key fields. The tables consist of a Results Table, a Sample
Analysis Table, and an Instrument Calibration Table. These files include QA/QC batch links and
routine accuracy and precision parameters such as surrogate, matrix spike, and laboratory control
sample recoveries and initial and continuing calibration and GC/MS tuning data. Development of
the EDD integrated these data elements required by end users with consideration for the current
data deliverable capabilities of commercial laboratories. The following is the list of field names
in the three tables.
Results Table (A1): Client_Sample_ID, Lab_Analysis_Ref_Method_ID, Analysis_Type, Lab_Sample_ID,
Lab_ID, Client_Analyte_ID, Analyte_Name, Result, Result_Units, Lab_Qualifiers, Detection_Limit,
Detection_Limit_Type, Retention_Time, Analyte_Type, Percent_Recovery

Instrument Table (A2): Instrument_ID, QC_Type, Analyzed, Alternate_Lab_Analysis_ID,
Lab_Analysis_ID, Lab_Analysis_Ref_Method_ID, Client_Analyte_ID, Analyte_Name, Run_Batch,
Analysis_Batch, Lab_Reporting_Batch, Relative_Percent_Standard_Deviation, Percent_Difference,
and several fields for BFB/DFTPP ratios and peak ID

Sample Analysis Table (A3): Project_Number, Project_Name, Client_Sample_ID, Collected,
Matrix_ID, Lab_Sample_ID, QC_Type, Shipping_Batch_ID, Temperature,
Lab_Analysis_Ref_Method_ID, Preparation_Type, Analysis_Type, Prepared, Lab_ID, QC_Level
2
-------
Results Table (A1), Instrument Table (A2), Sample Analysis Table (A3), continued:
Relative_Percent_Difference, Result_Basis, Reporting_Limit, Total_Or_Dissolved,
Reporting_Limit_Type, Dilution, Reportable_Result, Handling_Type, Handling_Batch,
Leachate_Date, Percent_Moisture, Method_Batch, Preparation_Batch, Run_Batch,
Analysis_Batch, Lab_Reporting_Batch, Lab_Receipt, Lab_Reported
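
As an illustration of the relational structure described above, the sketch below links records from the
three tables on shared key fields using plain Python dictionaries. The field values and the join function
are invented examples for illustration only, not part of the EDD specification.

    # Minimal sketch of the three-table EDD structure; sample values are invented.
    sample_table = [      # Sample Analysis Table (A3): one record per field sample
        {"Client_Sample_ID": "MW-01", "Lab_Sample_ID": "P909410-01",
         "Matrix_ID": "AQ", "Collected": "1999-09-15"},
    ]
    results_table = [     # Results Table (A1): one record per analyte result
        {"Client_Sample_ID": "MW-01", "Lab_Analysis_Ref_Method_ID": "8260B",
         "Analyte_Name": "Trichloroethene", "Result": 0.7, "Result_Units": "ug/L"},
    ]
    instrument_table = [  # Instrument Table (A2): calibration/tuning records per run
        {"Lab_Analysis_Ref_Method_ID": "8260B", "QC_Type": "CCV",
         "Percent_Difference": -12.4},
    ]

    def results_for_sample(client_sample_id):
        """Join Results (A1) to Sample Analysis (A3) on the shared sample key."""
        return [r for r in results_table
                if r["Client_Sample_ID"] == client_sample_id]

    print(results_for_sample("MW-01"))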
The CCS software verifies the completeness and compliance with the EDD format. The software
references a project specific library built upon the Quality Assurance Project Plan (QAPP) in
verifying compliance and completeness. EDD deficiencies are detailed in an outlier report.
Access to the EDD file in table format allows for quick and easy correction of errors.
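
The following sketch illustrates the kind of screening the CCS performs, using a hypothetical project
library (required fields and allowed methods drawn from a QAPP); it is an illustration of the approach,
not the CCS code itself.

    # Minimal compliance-screening sketch; the library contents are hypothetical.
    PROJECT_LIBRARY = {
        "required_fields": ["Client_Sample_ID", "Analyte_Name", "Result", "Result_Units"],
        "allowed_methods": {"8260B", "6010B"},
    }

    def screen_edd(records):
        """Return (record number, deficiency) pairs for an outlier-style report."""
        outliers = []
        for i, rec in enumerate(records, start=1):
            missing = [f for f in PROJECT_LIBRARY["required_fields"] if not rec.get(f)]
            if missing:
                outliers.append((i, "missing fields: " + ", ".join(missing)))
            if rec.get("Lab_Analysis_Ref_Method_ID") not in PROJECT_LIBRARY["allowed_methods"]:
                outliers.append((i, "analysis method not in project library"))
        return outliers

    edd = [{"Client_Sample_ID": "MW-01", "Analyte_Name": "Benzene",
            "Result": 1.2, "Lab_Analysis_Ref_Method_ID": "8260B"}]
    for rec_no, problem in screen_edd(edd):
        print(f"EDD record {rec_no}: {problem}")   # reports the missing Result_Units field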
The ADR software is initiated by the data user (e.g., Army Corps chemist, prime contractor, etc.)
to review analytical data based upon project specific criteria. Upon execution of the program,
data is qualified using standard Army Corps/EPA data flags and exported into a master database.
Command buttons generate reports such as a rejected data table, method blank contamination,
surrogate outliers, etc. Forms and view screens also provide on-line review of data qualifiers.
See Attachment 1 for an example of user screens.
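
A simplified sketch of this kind of automated qualification logic is shown below; the surrogate
recovery window and the flag choices are hypothetical stand-ins for the project-specific criteria in the
QAPP, not the actual ADR rules.

    # Minimal data-qualification sketch; limits and flag logic are illustrative only.
    SURROGATE_LIMITS = {"Toluene-d8": (80.0, 120.0)}   # hypothetical % recovery window

    def qualify(result, surrogate, recovery):
        """Attach a validation qualifier based on surrogate recovery."""
        low, high = SURROGATE_LIMITS[surrogate]
        if recovery < low:
            flag = "UJ" if not result["detected"] else "J"   # low bias
        elif recovery > high:
            flag = "J" if result["detected"] else ""         # high bias matters for detects
        else:
            flag = ""
        result["Validation_Qualifier"] = flag
        return result

    r = qualify({"Analyte_Name": "Benzene", "Result": 0.9, "detected": True},
                "Toluene-d8", 65.0)
    print(r["Validation_Qualifier"])   # prints "J"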
The Environmental Database Management System (EDMS) compiles the validated data
downloaded from the ADR system. The database program has user functions that allow for
comparison of primary data versus QA split lab data, comparison of results against project action
limits or PRGs/MCLs, and calculation of the completeness values for each test over any period of
time. The four types of completeness values include contract, analytical, technical, and field
sampling completeness. See Attachments 2 and 3 for examples of user screens.
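
The sketch below shows, under a hypothetical action limit, the two kinds of EDMS user functions just
described: comparing results against a PRG/MCL and computing a simple completeness percentage.
It is an illustration, not the EDMS code.

    # Minimal sketch of result-vs-limit comparison and completeness calculation.
    ACTION_LIMITS = {"Trichloroethene": 5.0}   # hypothetical limit, ug/L

    def exceedances(records):
        """Results above their project action limit (PRG/MCL)."""
        return [r for r in records
                if r["Analyte_Name"] in ACTION_LIMITS
                and r["Result"] > ACTION_LIMITS[r["Analyte_Name"]]]

    def completeness(records):
        """Percent of results not rejected ('R'), one simple completeness measure."""
        usable = sum(1 for r in records if r.get("Validation_Qualifier") != "R")
        return 100.0 * usable / len(records) if records else 0.0

    data = [{"Analyte_Name": "Trichloroethene", "Result": 7.2, "Validation_Qualifier": ""},
            {"Analyte_Name": "Trichloroethene", "Result": 1.1, "Validation_Qualifier": "R"}]
    print(exceedances(data))    # the 7.2 ug/L result exceeds the 5.0 ug/L limit
    print(completeness(data))   # 50.0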
In summary, the CCS, ADR, and EDMS software programs were developed as tools to support
technical staff in the evaluation of analytical chemistry data using an expedited and cost-effective
automated process. The EDD provides a standardized format. This format allows for
streamlining at the laboratories to produce deliverables which can be verified immediately for
completeness and compliance against project criteria using the CCS software. The EDMS allows
the data end user to efficiently evaluate large data sets for key indicators and ultimately
3
-------
determine the usability of the data.
4
-------
Attachment 1

Figure 1. Quality Control Outlier Report - The ADR program has built-in QC outlier reports which
summarize calibration deficiencies using the project-specific method and validation criteria.

Figure 2. Data Qualification Summary Report - The ADR program provides a summary of qualified
data itemizing each quality control area.

Attachment 2

Figure 3. Comparison of Analytical Results vs. PRGs and MCLs - The EDMS application provides
user tools to allow for the comparison of field sample results against project-specific PRGs and
MCLs. The data can be filtered and selected based on many unique criteria.

Attachment 3

Figure 4. Laboratory QA Split Sample Comparison - The EDMS program allows the user to compare
QA split sample results generated from two different laboratories.
7
-------
IMPORTANT CONSIDERATIONS IN SELECTING AND
IMPLEMENTING A LIMS IN
WATER QUALITY TESTING LABORATORY
Kim Ryals, Elizabeth Turner, Christine Paszko, and Don Kolva
Accelerated Technology Laboratories and Washington Aqueduct
Abstract — Today's environmental laboratory faces numerous challenges, from enhanced
regulatory oversight to decreasing costs per test to the numerous laboratory accreditations
that are offered and/or required. Selecting a LIMS that will "fit" your laboratory is
important, but just as important is a system with the flexibility to conform to the changes
that the laboratory will require over the years, not only in terms of reporting but also in
terms of the ability to integrate new instrumentation, new calculations, new screens and
future software.
In selecting a LIMS, it is important to have a good understanding of the requirements of
both current and future laboratory needs. Equally important as selecting the functionality
that matches the laboratory requirements is selecting a technology platform that is easy to
manage, is a market leader and utilizes an open architecture. The laboratory chose
Microsoft SQL Server as the database engine. The selection of Microsoft SQL Server
provides seamless integration with the Microsoft Office Suite (Word, Excel, Access,
PowerPoint), which is used in the laboratory. This allows users to export directly from the
LIMS to any of the programs in the suite and vice versa. This synergy enhances
the flexibility of the LIMS.
Many laboratories produce a request for proposal that includes a list of features and
functionality that is required to help automate the laboratory and to provide a system to
integrate the various data systems and reporting within the laboratory. Some
laboratories also include hardware in their request for proposal and ask the vendor to
deliver a complete system, hardware and software. This provides the LIMS vendor with a
clear understanding of what the primary needs of the laboratory are currently and in the
future. There are many features in a LIMS; however, the primary functionality includes
sample tracking, data entry, sample scheduling, QA/QC, electronic data entry, chemical
and reagent inventory, and personnel and equipment management. After the proposals
had been reviewed, the top three vendors were invited in to provide a scripted demo of
the features and functions that were important to the laboratory. Other features that help
increase productivity and efficiency include the use of bar codes, data loggers,
instrument integration and specialized software modules such as cost accounting/time
tracking. This demonstration is highly beneficial because it gives laboratory personnel
the opportunity to see how the software will function and they can begin to visualize how
the LIMS can assist them in their jobs.
Implementing a LIMS begins with the installation and configuration of the server (Dell
PowerEdge 2400), installation of the LIMS software on the server and client machines,
and creating custom reports for end-users. The installation phase also involved
configuring and integrating label printers (to print barcodes), hand-held CCD scanners, and
data loggers (for remote collection of field data which can then be uploaded to the LIMS)
with the LIMS. In addition, integration of several high-throughput instruments (Agilent
ICP-MS, Tekmar-Dohrman TOC, Varian Saturn GC-MS,
1
-------
Agilent GC-MS, Dionix IC) was another phase of the project. More laboratories are
turning to integration of their instrumentation to avoid manual entry of results into the
LIMS and also to avoid transcription errors. Finally, there is a verification and training
phase. In verification, the LIMS trainers and laboratory personnel review the features and
functions utilizing a checklist to ensure that all the components of the installation are in
place and operating accordingly. Next, the database administrators and end-users
receive training manuals and go over examples in the manuals, followed by a question
and answer period. Once the LIMS is installed, users have the ability to participate in
follow-up training courses offered by the LIMS vendor and to attend user group meetings
to continually learn about new features and keep abreast of the latest technology.
Introduction
Selecting a LIMS for the environmental testing laboratory requires a solid understanding of what
tasks are performed by the laboratory currently and an idea of which tasks the laboratory may
want to perform with the LIMS in the future. This is important because demands made upon the
laboratory will change over time and will require a LIMS to provide flexibility to accommodate
these needs.
A detailed LIMS Request for Proposal (RFP) was presented to Accelerated Technology
Laboratories together with several other LIMS vendors. Each vendor was asked to answer a
series of questions relating to the WAD LIMS specifications, the company, support options,
LIMS experience and references. The specifics of the RFP include a detailed description of the
functionality of the various features or modules of the LIMS. The RFP began with questions on
Sample Tracking, Data Entry, Sample Scheduling, QA/QC, Electronic Data Transfer, Chemical
and Reagent Inventory, and Personnel and Equipment Management. Other key elements of the
RFP included requests for information on statistical capabilities, data loggers to upload data to
the LIMS, bar-coding, instrument integration, time tracking software for cost accounting, custom
report creation and integration with the laboratory's SCADA system. The RFP also requested
the LIMS vendor to provide the hardware (server for the LIMS) and all necessary software tools
to manage the server, including Microsoft SQL Server licenses and back-up software. Another
section of the RFP focused on the expertise of the LIMS vendor and the personnel responsible
for the installation and implementation. The WAD laboratory requested the LIMS vendor to
provide a "turn-key" system.
The LIMS vendors with the highest scores on the LIMS questionnaire were invited to visit the
laboratory and provide an on-site demo based on a script prepared by the laboratory. The demos
were viewed by laboratory management and staff to gain a thorough understanding of how each
LIMS works and whether it matches the way in which the laboratory operates. Once the feedback
from the laboratory was gathered, the scores tallied and the cost proposals reviewed, the laboratory
selected the LIMS that best fit its operations and the needs of an environmental testing
laboratory.
It is important for the LIMS vendor to understand how the samples flow through the laboratory.
Figure 2 depicts the typical sample flow through the WAD laboratory.
2
-------
Installation:
Accelerated Technology Laboratories, Inc. was the successful bidder: Sample Master® Pro
LIMS best matched the specifications, and the ATL staff had the expertise required by the WAD
laboratory. Before the installation could begin, all hardware and software systems were ordered
from the respective vendors:
WAD Configuration:
Server Hardware:
• Dell Server with 17" Monitor
• Dual Pentium III 933 MHz processors
• Integrated 3Com 10/100 Ethernet controller
• 512 MB RDRAM (2 RIMMS)
• Three (3) 18 GB SCSI hard drives
• RAID 5 Parity
• 20/48x CD-ROM Drive
• 3.5" 1.44 MB Floppy drive
• 40 GB DDS-4 Tape Drive with 10 tapes
Data loggers (2) from Intermec
Network: Ethernet network, with the NT 4.0 network operating system.
Software:
• SQL Server Licenses from Microsoft
• Arc Serve from Computer Associates
• Sample Master® Pro LIMS
• Delta-one Fieldworker software
• Diskeeper Server edition
ATL's project team consisted of three software engineers and a project manager. ATL engineers
installed the server and required software. The engineers worked closely with the laboratory to
ensure that there would be minimal impact on the day-to-day laboratory operations during the
installation. The project manager was responsible for ensuring that third party software and
hardware products were delivered on time and free of defects. Following the configuration of the
server and installation of the software, ATL software engineers reviewed the custom reports that
were required by the laboratory and also reviewed the requirements for integration with the
WAD SCADA database.
Sample Master® Pro LIMS was implemented in phases. The first phase consisted of acquiring all
the necessary hardware and software required for the project. Once the various components
arrived, they were inspected and installed at the WAD laboratory. The focus of the first phase was
installation and configuration of the server and installation of the Sample Master® Pro LIMS software.
This phase also involved collecting output files from instruments that were to be integrated with
the LIMS and installing the required software for the data loggers, two handheld units that allow
field workers the ability to collect field data and upload that data to Sample Master® Pro LIMS.
3
-------
The focus of the second phase was instrument integration and installation and testing of the
software to integrate the LIMS with the WAD SCADA database. The following instruments
were interfaced with Sample Master® Pro LIMS: Agilent ICP-MS, Tekmar-Dohrman TOC,
Varian Saturn GC-MS, Agilent GC-MS, Dionix IC.
The benefits of instrument integration include the following: a reduction in transcription errors, an
increase in automation, improved data accuracy and increased throughput. The cost savings alone
justify the integration of instruments with Sample Master® Pro LIMS.
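
As a rough illustration of what an instrument interface involves, the sketch below parses a
hypothetical comma-delimited instrument export into records suitable for LIMS import; the actual
parsers are specific to each vendor's native output format, and the column names and import call
shown here are assumptions.

    # Minimal instrument-file parser sketch; the CSV column names are assumptions.
    import csv

    def parse_instrument_export(path):
        """Map rows of an instrument result file to LIMS import records."""
        records = []
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                records.append({
                    "sample_id": row["SampleID"],
                    "analyte": row["Analyte"],
                    "result": float(row["Result"]),
                    "units": row["Units"],
                })
        return records

    # Example use (file name and LIMS import call are hypothetical):
    # records = parse_instrument_export("icpms_run.csv")
    # lims.import_results(records)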
The last phase focused on training and final installation of the last two instrument parsers. In
addition to training end-users, ATL software engineers also trained the LIMS database
administrator (dba). End-users were trained with user manuals that went through each module
with step-by-step instructions, and the administrator guide was used to train the dba.
Conclusions
The selection and installation of Sample Master® Pro LIMS has resulted in many data
management and automation improvements in the laboratory. There is now a full chain of
custody, audit trail and many quality control enhancements that assist the laboratory in their data
management challenges. By limiting users to tests and methods on pull-down lists, integrating
instruments, and requiring users to log onto the system with a user name and password, the
laboratory has achieved a higher degree of data integrity.
Instrument integration has significantly reduced the amount of time analysts devote to data entry.
Prior to the LIMS, entry of VOC data (60+ parameters per sample) would take hours. Since
instrument integration, data entry has been reduced to review of the data and importing it into the
LIMS - a process completed in minutes. Transcription errors are eliminated. Analyst time once
spent entering data can now be devoted to other tasks.
4
-------
Prior to implementation of the LIMS, reports were generated on a weekly to semimonthly basis.
Sample Master Pro LIMS has been set up for automatic report generation so that reports are
automatically printed once sample results have been approved by the laboratory manager. The
turnaround time of reports has been greatly decreased.
Implementation of the LIMS has greatly increased the efficiency of the laboratory. Data is
readily available to view and approve. Instrument integration has reduced data entry time and
transcription errors. Data loggers have reduced the amount of time required to log in samples to
the laboratory. Quality control records are readily available for review, and generation of control
charts, once a laborious process, can now be completed in minutes.
(Schematic elements: server, client machines, shared printers, instrument controllers with LIMS
access, built-in bar-code support, and instrument integration.)
Figure 3. The schematic depicts a client/server configuration of Sample Master Pro LIMS.
5
-------
J. Status Monitoring (checklist columns: REF. / COMPLY / DO NOT COMPLY / COMPLY WITH MOD.)

Provide methods for monitoring sample status throughout the sample life-cycle log-in (J1):
- Automatic update of sample status based on events or transactions (J1)
Provide a method to monitor test and analysis data (J2)
Provide codes to monitor sample status for the following conditions (J3):
- Sample received by the laboratory
- Samples expected or logged but not received
- Sample has tests assigned that are in progress
- All assigned tests are completed
- Sample results have been reviewed and verified
- A re-test has been ordered
- Broken sample container
- Custom status codes defined by the laboratory
Provide codes to monitor test and analysis status for the following conditions (J4):
- Test is complete
- Test results have failed quality control
- Test results exceed specified limits
- Test results have associated text or limits violations
- Test is assigned to a bench sheet and is in progress
- Test results have been reviewed
- A re-test has been ordered for the same sample and test
Provide a means for informing when a sample may be disposed of (J5)
Allow customers read-only access to their data via the internet or customer call-up (J6)
Customers can easily view their current and historical results (J6)
Figure 1. A sampling of the questions taken from the Request for Proposal
created by the Washington Aqueduct Laboratory.
6
-------
Figure 2. Schematic sample flow through the Washington Aqueduct Laboratory prior to the
installation of Sample Master Pro Laboratory Information Management System.
7
-------
AN INNOVATIVE APPROACH IN DEFINING AND PRODUCING
LABORATORY ELECTRONIC DATA DELIVERABLES
Michael S. Johnson, Analytical Operations/Data Quality Center (AOC), U.S. Environmental
Protection Agency, Office of Emergency and Remedial Response
Abstract — The Contract Laboratory Program (CLP), managed by USEPA's Analytical
Operations/Data Quality Center (AOC), has been receiving and utilizing laboratory
Electronic Data Deliverables (EDDs) for over 14 years. These deliverables are utilized for
verification of contractual and technical compliance, data assessment as part of the data
validation process, monitoring analytical method performance and populating databases
for statistical analysis, Geographic Information Systems (GIS) or other support and
monitoring activities. The EDDs have been rigid, contract-specific formats, tailored to
specific CLP reporting requirements. The CLP has been evolving over the last several years
to become more flexible and to focus on customers' changing needs for method flexibility
and utilizing alternative methods. To accommodate this program flexibility requirement,
AOC developed the framework and the tools to introduce and utilize a new type of EDD, the
Superfund Electronic Data Deliverable (SEDD).
SEDD utilizes Extensible Markup Language (XML) technology and the Internet to provide
the basis for a laboratory EDD that is flexible, yet provides uniformity for EDDs across
various methods and levels of reporting requirements. AOC provides an Internet-based tool
set that can be utilized by laboratories to generate and deliver a requested EDD from the
laboratory's database. Reporting requirements for specific methods or projects can be
defined in a Document Type Definition (DTD) that is downloadable by the laboratory. AOC
has also been collaborating with the U.S. Army Corps of Engineers (USACE) to utilize a
uniform set of data element definitions and to define multiple stages of data deliverables
that meet the needs of differing data requesters. Both AOC and USACE are piloting the new
data reporting technology and toolset with analytical laboratories.
Introduction:
The Analytical Operations/Data Quality Center (AOC) manages the environmental analytical data
for the Contract Laboratory Program (CLP) in support of USEPA's Superfund program. CLP
provides data through its routine chemical analytical services, and has implemented supporting
services to ensure that known quality data is provided to CLP users. Since the CLP's inception in
1980, more than 1,850,000 samples from over 10,000 sites have been analyzed by over 430
laboratories. The CLP has been utilizing Electronic Data Deliverables (EDDs) in addition to
hardcopy reports for verifying contractual and technical compliance to specified analytical
methodologies and QC requirements. These EDDs are then available for performing data
assessment, monitoring analytical method performance and populating databases for statistical
analysis, Geographic Information Systems (GIS) or other decision support and monitoring activities.
Currently, AOC is developing the Superfund Electronic Data Deliverable (SEDD) system to enhance
1
-------
CLP's data exchange needs in order to support multiple users' electronic data requirements by taking
advantage of Internet-based data exchange standards.
Background
The use of EDDs within the CLP is vital for maintaining the integrity and availability of high quality
data in large volumes, and has the following advantages:
• Facilitates system-to-system data transmission,
• Minimizes human intervention,
• Speeds up the processing of data,
• Reduces error, and
• Facilitates information transfer, storage and utilization.
In 1988 AOC introduced the "Format A" EDD to receive electronic data. The data organization of
"Format A" was very similar to the hardcopy forms, essentially containing header information
followed by detailed information for results. AOC then proceeded to enhance the capabilities of the
EDDs by introducing "Format B" in 1989. Because of limited gain in capabilities, "Format B" was
eventually replaced in 1991 by Agency Standard Format (ASF). The ASF, still in use today, to a
great extent successfully captured the analytical results along with quality control information
including instrument tuning and calibration data. The information was captured using the relative
position of the data items in the file structure to establish relationships within the data groups and
within analytical runs. The USEPA mainframe systems, such as Contract Compliance Screening
(CCS), were utilized for verification of contractual and technical compliance of the data delivered
in ASF files. Some instrument data processing software vendors and Laboratory Information
Management Systems (LIMS) supported the creation of ASF files and CLP hardcopy forms within
the laboratory.
Need for Flexible EDD
Although these EDDs have proven to be functional and achieved support from participating
laboratories and their software vendors, they were soon facing constraints and limitations due to the
rigid format. The current EDDs require strict adherence to structure and format in order to be
effective. The resource impacts from changes were severe even when new analytical and reporting
requirements were minimal. Adding or removing data elements, introducing Quality Control
Samples, or Data Quality/Validation criteria windows all required large investments of time and
money by USEPA and the laboratories. The current EDDs are dependent on software vendors'
development time lines and software updates. They are not easily adaptable to the changing
analytical needs of USEPA and other customers of analytical data. These limitations severely
impacted CLP's evolution to serve its customers' changing needs.
2
-------
AOC soon realized the following attributes would be necessary for the next generation of EDDs:
• Increased flexibility and compatibility with open industry standards,
• Reduced complexity,
• Reduced dependence on programmers to empower users, and
• A generic information format that can accommodate all environmental analytical data.
The SEDD Initiative
In its quest for a flexible EDD solution, AOC evaluated several EDDs including the Agency
Standard Specifications, other EDDs used by USEPA offices and Federal agencies, and Electronic
Data Interchange (EDI) standards such as ANSI X12 and UN/EDIFACT. Although none of the
EDDs adequately addressed the changing needs of the CLP, information and requirements were
extracted from these specifications pertinent to a Superfund EDD. AOC decided that SEDD would
be an open transmission standard taking advantage of the benefits of Extensible Markup Language
(XML) instead of a traditional EDD. Recommended by the World Wide Web Consortium (W3C),
XML was selected as a key to SEDD's implementation because it is license-free, platform-
independent, encapsulates structured data in text files and is well supported by freely available third
party tools. The SEDD system has been deployed as a Web-based data-driven application.
Innovative technologies were applied to the development of SEDD, which provide the following
key benefits:
• The SEDD system interfaces seamlessly with the legacy mainframe data processing systems,
saving USEPA the cost of developing new systems that process the XML data. Additionally
it will streamline data assessment activities by reducing processing time and increasing
reported analytical quality and results information submitted by the laboratories.
• The delivery of industry-standard XML files by the SEDD system enables the chemists/data
evaluators and data users to browse and review original laboratory EDD deliverables
conveniently with widely available XML editing and reporting tools.
• The SEDD system provides a flexible deliverable format and tools to accommodate changes
without the additional cost and time previously required with the rigid ASF. The system's
flexibility allows CLP to expand its capabilities by accommodating changes in existing
Routine Analytical Service (RAS) analytical methods and utilizing new and additional non-
RAS methods.
• The SEDD system reduces the burden on laboratories to create and format EDD deliverables.
It also allows laboratory personnel to focus more on data generation and data quality, and
less on producing EDDs that conform to a multitude of specified data structures and formats.
3
-------
How SEDD Works
The SEDD system supports multiple data deliverable requirements by maintaining Document
Type Definitions (DTDs). Each DTD specifies the data requirements of an XML deliverable and is
maintained on a Web server accessible to laboratories with a standard Web browser. The browser
hosts a user-friendly Java applet interface, allowing the laboratory user to browse the DTDs for the specific
data reporting requirements specified for a client's deliverable. The specifications are available
for download to the local laboratory workstation. The downloaded information, known as a Data
Element Map (DEM) file, is a representation of the DTD requirements. This information can be
viewed within SEDD as a tree-like hierarchical structure of nodes and elements, defining the
XML tag names and the data node relationships. SEDD allows users to map DTD requirements
to their database.
The SEDD interface also accesses tables and fields from the ODBC-compliant laboratory data
source and provides a query builder, which allows the user to select table names and field names
through pick lists. Mapping the lab's data source to the EDD requirements is accomplished by
building SQL statements through a simple interface. The mapped configuration is saved in the
DEM file, which can be used repeatedly to generate specific deliverables as XML files from the
laboratory data source. Once a file is generated, it is then validated against the selected DTD to
ensure that the XML file is complete and valid. The SEDD transmission utility sends the XML file
via FTP to the SEDD server for data verification and validation.
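
As a rough sketch of the final step in this sequence, the code below builds a small results-only XML
deliverable from rows queried out of a laboratory database and writes it to a file. The element and
attribute names are hypothetical and do not represent the SEDD DTDs; validation against the project
DTD would be performed with a validating parser before FTP transmission.

    # Minimal XML deliverable sketch; element and attribute names are invented.
    import xml.etree.ElementTree as ET

    def build_deliverable(rows, path):
        """Write sample/result rows from a laboratory database as an XML file."""
        root = ET.Element("Deliverable")
        for row in rows:
            sample = ET.SubElement(root, "Sample", ClientSampleID=row["sample_id"])
            result = ET.SubElement(sample, "Result", Analyte=row["analyte"])
            result.text = str(row["value"])
        ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

    rows = [{"sample_id": "MW-01", "analyte": "Benzene", "value": 1.2}]
    build_deliverable(rows, "stage1_example.xml")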
The use of the SEDD tool to generate and validate these XML data files is very convenient and
offers a 'turn-key' solution for the reporting of this data. However, most present-day laboratories
may not store all reported data in a single database or even in multiple databases.
only by report generators or reported manually to customers. Some laboratories still have very
limited automated data handling capabilities. The use of the SEDD tool to deliver data in the
XML format is not required. The laboratories could independently generate these data files. The
laboratories would be free to choose the approach that best fits their data generation capabilities.
The required XML files could be directly generated from LIMS or generated by software
provided by independent private sector vendors. These files could then be validated against the
same DTDs as used by the SEDD tool and delivered to the client.
AOC and USACE
AOC has been collaborating with the U.S. Army Corps of Engineers (USACE) to utilize a
uniform set of data element definitions and to define multiple stages of data deliverables that
meet the needs of differing data requesters. AOC and USACE are jointly proposing three stages
of data specifications defined as follows:
Stage 1. Contains the minimum number of analytical data elements to transmit results-only
data to the end user.
4
-------
Stage 2. Data content builds on Stage 1 by adding method and instrument quality control data
(e.g., initial calibration, continuing calibrations, method QC limits, sample QC
relationships).
Stage 3. Data content builds on the Stage 2 data set by adding additional measurement data to
allow for independent recalculation of the reported results. Data in this stage will be
similar in detail to the current CLP EDDs.
The three-staged approach was taken in order to provide uniform and scalable EDD requirements
for the data providers as well as the data users. Proposed data elements and schema for reporting
in each of the 3 Stages can be viewed on the AOC Web site at
http://www.epa.gov/superfund/programs/clp/sedd.htm or at the USACE Web site at
http://www.environmental.usace.army.mil/info/technical/chem/chemtopics/chemedd/chemedd.html.
The data elements identified in each Stage represent the maximum reporting requirements
for that Stage; however, specific programmatic EDDs (as defined in the DTD) may only require a
subset of the specified data elements.
Utilizing the SEDD XML format would permit laboratories to support a particular reporting
Stage by maintaining the required data in their LIMS or other database, or by capturing the data
in a database view or utilizing commercial XML tools to extract the data from multiple sources.
Once the three Stages of data specification are adopted, the environmental user community will
benefit from having a single XML format that can support various EDD specifications. While
few laboratories could currently support Stage 3 reporting requirements, most laboratories could
now support Stage 1 reporting requirements and could move up to Stage 2 and 3 reporting as
they update or implement new LIMS or databases. Also, different programs requiring Stage 1 or
2 data could implement their EDDs quickly utilizing the SEDD XML specifications.
Both AOC and USACE are piloting the new data reporting technology and toolset with
analytical laboratories. AOC has also been working with the Office of Environmental
Information (OEI) to utilize the SEDD tool set for laboratory reporting in other programs and for
reporting other types of data.
Conclusion
SEDD's use of XML provides flexibility and format independence for all data reporting needs.
DTDs are used to specify data requirements including data groups, data elements and their
relationships. Each deliverable's data requirements are represented by a DTD on the SEDD
Application Server, giving SEDD the ability to support multiple users' electronic data
requirements. The SEDD system easily accommodates future changes in requirements with
minimal modification to existing systems, because such changes are made through modifications of the DTDs.
5
-------
The XML output file is independent of proprietary data systems. A variety of parsers are
available for viewing, editing, or programmatically processing these files to interface with
different database systems.
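As one hedged example of such processing, the sketch below parses a deliverable with Python's built-in XML parser and loads the results into a small SQLite table; the element names again follow the hypothetical structure used earlier rather than an actual SEDD DTD.

```python
# Sketch: load results from an XML deliverable into a local database for review.
# Element names are hypothetical; any conforming XML parser could be substituted.
import sqlite3
import xml.etree.ElementTree as ET

conn = sqlite3.connect("results.db")
conn.execute("CREATE TABLE IF NOT EXISTS results "
             "(sample_id TEXT, cas TEXT, result REAL, units TEXT, qualifier TEXT)")

tree = ET.parse("deliverable.xml")
for sample in tree.getroot().iter("Sample"):
    for analyte in sample.iter("Analyte"):
        conn.execute(
            "INSERT INTO results VALUES (?, ?, ?, ?, ?)",
            (sample.get("sampleID"), analyte.get("casNumber"),
             float(analyte.findtext("Result")), analyte.findtext("Units"),
             analyte.findtext("Qualifier")),
        )
conn.commit()
```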
SEDD's innovative XML approach will substantially contribute to the enhancement of analytical
data quality by minimizing errors inherent in the reformatting and restructuring of data to comply
with a multitude of reporting formats and will permit the use of common tool sets for data
verification, validation, and processing. The SEDD system can be an integral part of any Data
Management System, providing data exchange while inherently improving data quality.
6
-------
NELAC QUALITY SYSTEMS:
THE INTEGRATION OF ISO/IEC17025 AND PBMS
Scott D. Siders, Division of Laboratories, Illinois Environmental Protection Agency
Abstract - Within the past year the National Environmental Laboratory Accreditation
Conference (NELAC) Quality Systems Committee has been working on a major
restructuring of the quality systems standards to integrate the ISO/IEC 17025
international standard and performance-based measurement system (PBMS) concepts
(i.e., additional flexibility) into the present standards. This paper will provide the
rationale for that effort and will give an update on the status of the Committee's activities
in this and other key areas. Further, the paper will provide an overview of the present
draft language that relates to ISO/IEC 17025 and PBMS for the NELAC Quality
Systems chapter.
INTRODUCTION
The adopted June 29, 2000, NELAC Quality Systems standards (i.e., NELAC Chapter Five) are
based on ISO/IEC Guide 25. NELAC has also stated its commitment to the use of Performance-
Based Measurement Systems (PBMS) in environmental testing and toward providing a
foundation for PBMS implementation in the standards. Hence, with the advent of ISO/IEC
17025 as the replacement for ISO/IEC Guide 25 and the Environmental Laboratory Advisory
Board's (ELAB) PBMS Straw Model (presented at NELAC VIi), the NELAC Quality Systems
Committee has begun efforts to develop proposed language for NELAC Chapter 5 that would
integrate both ISO/IEC 17025 and the PBMS Straw Model concepts/elements into the standards.
Obviously, the NELAC Quality Systems Committee's initiation of this effort was done with the
knowledge and support of both the NELAC Board of Directors and ELAB.
The primary goal of this effort is to improve overall quality of compliance data via the NELAC
quality system standards. The Committee views the incorporation of the superior ISO/IEC 17025
international standard as its base and further utilization of a PBMS approach for performing
environmental analyses under NELAC as a means to do just that: improve data quality.
Additionally, many NELAC stakeholders view this integration effort as a means to bring about
some positive and needed improvements in the current NELAC Chapter 5 language.
CURRENT ACTIVITIES AND DIRECTION
During NELAC VIi (November 2000), the NELAC Quality Systems Committee discussed its
ISO/IEC 17025 integration efforts and also formed a PBMS Subcommittee to address the PBMS
Straw Model. The NELAC Quality Systems Committee's ISO/IEC 17025 integration efforts
were essentially delayed between NELAC VI and NELAC VIi due to ongoing ISO/IEC 17025
copyright and licensing fee issues that NELAP had with ANSI. Those issues are still
being considered by the NELAC Board of Directors and have had an impact on the direction the
1
-------
Quality Systems Committee has taken. Essentially, both of the above NELAC Quality Systems
Committee efforts got underway only after NELAC VIi.
As part of its ISO/IEC integration effort, the NELAC Quality Systems Committee initially
developed a spreadsheet that contrasts ISO/IEC 17025, NELAC Chapter 5, and ISO/IEC Guide
25 elements. This tool provided direction on the Committee's next steps. The Committee then
identified the current ISO/IEC Guide 25 language in NELAC Chapter 5 for possible removal.
The Committee has also inserted the present Chapter 5 language under the
appropriate/corresponding ISO/IEC 17025 section since the ISO/IEC 17025 language will
provide the framework for any revised Chapter 5. Lastly, due to the ANSI copyright issue, the
Committee was also directed to be ready to provide a version of NELAC Chapter 5 that would
only cite ISO/IEC 17025 by reference.
At present the Committee is working on drafting a revised NELAC Chapter 5 version that would
have ISO/IEC 17025 sections as the main framework (yet structured so as to be able to cite these
sections by reference only if needed) with current and revised NELAC Chapter 5 language (either
minus the old ISO/IEC Guide 25 language or with the ISO/IEC Guide 25 language highlighted)
inserted where appropriate. Further, this revised NELAC Chapter 5 version would have inserted
in it the revised sections of NELAC Chapter 5 that the PBMS Subcommittee is working on. The
goal was initially to have this document ready for proposal at NELAC VII (May 22-25, 2001),
but the extent and depth of the undertaking did not allow us to meet the imposed March 19,
2001, deadline for Committees to submit final proposed language to the NELAP Director.
ELAB's PBMS Straw Model, which was heavily influenced by ISO/IEC 17025's Section 5.4 as it
relates to how laboratories should implement and use laboratory methods, brought two key
concepts to the table. The two concepts that have most influenced the PBMS
Subcommittee's efforts are:
- Method Selection; and
- Method Validation.
The PBMS Subcommittee identified NELAC Chapter 5's sections 5.10 (Test Methods and
Standard Operating Procedures), 5.9.4 (Calibration), and Appendix C (Demonstration of
Capability) as important areas to revise to address the PBMS Straw Model concepts/elements.
To date the PBMS Subcommittee has essentially rewritten, or plans to rewrite, Chapter 5
section 5.10 and Appendix C. Significant revisions are being drafted for section 5.9.4
(Calibration), along with a few changes in section 5.5.4 (Quality Manual) and elsewhere in the main
body of Chapter 5.
Again, the NELAC Quality Systems Committee's ISO/IEC 17025 integration effort and the
PBMS Subcommittee's efforts will be fused into a single discussion document (not to be put up for
2
-------
a vote) that will be brought to NELAC VII in Salt Lake City for public consideration and
discussion during the NELAC Quality Systems session.
However, the NELAC Quality Systems Committee will be bringing to NELAC VII as proposed
language and putting up for a vote the Committee's rewrite of Appendices D.1 (Chemical
Testing) and D.3 (Microbiology Testing). The D.3 proposed language was discussed at NELAC
VIi, and the D.1 proposed language is based on ELAB's May 2000 proposed revisions to D.1.
The ELAB's May 2000 proposed changes to D.1 were publicly discussed and widely supported
at NELAC VIi as part of the NELAC Quality Systems Session.
LANGUAGE THAT RELATES TO ISO/IEC 17025 & PBMS STRAW MODEL
Obviously, if the entire ISO/IEC 17025 international standard is brought into NELAC
Chapter 5, there would be new language representing some changes from the current
ISO/IEC Guide 25-based Chapter 5. While ISO/IEC 17025 has more emphasis/detail in the
technical requirements, it also appears to offer greater flexibility.
New ideas in the technical requirements in ISO/IEC 17025 that will likely be brought into
NELAC Chapter 5 are:
- reference to the "needs" of the clients;
- requirement for a sampling plan when sampling is done by the laboratory;
- method validation;
- calculation/estimation of measurement uncertainty for testing laboratories; and
- provisions for inclusion of interpretations and opinions in test reports.
ISO/IEC 17025's management requirements also introduce some new aspects as compared to
ISO/IEC Guide 25. Some new aspects found are:
- identification of potential conflicts of interest;
- more detailed requirements for the quality policy statement;
- specific requirements for the control, review, approval, issue, and amendment of
documents;
- major changes in the Requests, Tenders, and Contracts section (e.g., identifying customer
needs, ensuring capability to meet needs, dealing with changes and deviations);
- incorporation of ISO 9001 requirements in simplified form for purchasing;
- specific procedures for dealing with non-conforming work/results and the need for
corrective action;
- specific procedures for cause analysis, selection and implementation of corrective action,
subsequent monitoring, and follow-up audits;
3
-------
- preventive action requirements that deal with potential problems and the quality improvement
process;
- records requirements now consistent with ISO 9001; and
- specific guidance on matters to be considered during management reviews.
Again, while the above are generally new aspects that will need to be considered, the overall
ISO/IEC 17025 standard is much less prescriptive and introduces greater flexibility on how to
accomplish the requirements. Some of the above items, such as corrective action, management
reviews, records, and reporting, are already addressed in detail in NELAC Chapter 5. In fact, the
present NELAC Chapter 5 utilized some draft ISO/IEC 17025 language for the management
reviews and corrective actions sections.
It is this inherent flexibility written into ISO/IEC 17025, especially in relation to method
validation, that the PBMS Subcommittee hoped to capture in its below draft 5.10 language for
Chapter 5. Again the PBMS Straw Model elements/concepts are also based upon section 5.4 of
the ISO/IEC 17025 standard. The following is the draft language for Chapter 5 section 5.10 that
had been developed as of February 28, 2001.
Please Note: This language is still draft and undergoing internal review and comment by other
PBMS Subcommittee members. It has not been reviewed by the NELAC Quality Systems
Committee. It is being shared as part of this paper only as a means to communicate the
general direction the PBMS Subcommittee is heading with its extensive rewrite of section 5.10.
The PBMS Subcommittee has just started work on revisions to the current Appendix C in
NELAC Chapter 5. The revised 5.10 and Appendix C will be the keys to implementing ISO/IEC
17025 section 5.4 within the NELAC Quality Systems standards.
Here is the February 28, 2001, draft section 5.10 as drafted by the PBMS Subcommittee:
5.10.1 Methods Documentation
a) The laboratory shall have documented SOPs on the use and operation of all
equipment involved in the measurement, on the handling and preparation of samples,
and on calibration and/or testing, where the absence of such instructions could
jeopardize the reliability of calibrations or tests.
b) All instructions, standards, manuals and reference data relevant to the work of the
laboratory shall be maintained up-to-date and be readily available to the staff.
4
-------
5.10.2 Laboratory Methods Manual and Standard Operating Procedures (SOPs)
The laboratory shall maintain a methods manual. The methods manual shall contain the laboratory's
standard operating procedures (SOPs). The SOPs shall accurately reflect all phases of current
laboratory activities such as sample receipt, sample storage, sample analysis, assessing data integrity,
corrective actions, handling customer complaints, all test methods, and data and record storage.
a) An SOP may be an equipment manual provided by a manufacturer, or an internally
written document so long as the SOP is adequately detailed to permit someone other
than the analyst to reproduce the procedures that had been used to produce a given
result.
b) The test method SOPs may be copies of published methods as long as any changes
or selected options in the methods are documented and included in the SOPs (see
5.10.1.2). Reference test methods that contain sufficient and concise information on
how to perform the tests do not need to be supplemented or rewritten as internal
procedures if these methods are written in a way that they can be used as published
by the laboratory. It may be necessary to provide additional documentation for
optional steps in the method or additional details.
c) Copies of all SOPs shall be accessible to all personnel.
d) SOPs shall be organized in a manner such that they are easily accessible to an
auditor.
e) Each SOP shall clearly indicate its effective date, its revision number and shall bear
the signature(s) of the approving authority.
f) Each test method SOP shall give or reference the following information, where
applicable:
1.0 Scope and Application
2.0 Summary of Method
3.0 Definitions
4.0 Interferences
5.0 Safety
6.0 Equipment and Supplies
7.0 Reagents and Standards
8.0 Sample Collection, Preservation, and Storage
9.0 Quality Control
10.0 Calibration and Standardization
11.0 Procedure
5
-------
12.0 Data Analysis and Calculations
13.0 Method Performance
14.0 Pollution Prevention
15.0 Waste Management
16.0 References
17.0 Tables, Diagrams, Flowcharts, and Validation Data
5.10.3 Use of Test Methods
All measurements made while operating as a NELAC accredited laboratory must have an adequate
demonstration that the measurement system provided data consistent with its intended use. The
laboratory shall ensure the quality of results provided to clients by implementing a system to
document the quality of the laboratory's analytical results. This demonstration consists of three
activities: 1) an initial determination that the measurement system is capable of providing data of the
quality needed to meet client and/or regulatory requirements (see 5.10.3.2); 2) an acceptable
instrument calibration and verification that the system has remained calibrated during the period that
it was used for analysis; and 3) documentation of the quality of any data that were obtained. The
specific activities performed for this demonstration are defined below and in Appendices C and D.
5.10.3.1 Method Selection
The laboratory shall utilize methods within its scope (including sample collection, sample handling,
transport and storage, sample preparation and sample analysis) which are appropriate and applicable
to client needs (i.e., to meet regulatory or other requirements specified by the client). These
requirements may specify that a particular method, or group of methods, be employed for a given
project or program; or that specific data or measurement quality objectives be achieved; or both. That is,
data or measurement quality objectives specified by the client, or required of the client to demonstrate
regulatory compliance, define the boundary conditions of the method selection process.
a) When the use of a particular test method is mandated by a regulatory agency or is
requested by a client, only that method shall be used. Deviations from a reference test
method shall occur only if the deviation has been documented, technically justified,
authorized, and accepted by the client and/or regulatory agency. The laboratory shall
inform the client when the method proposed by the client is considered to be
inappropriate or out of date.
b) In the event that a specific method is not required by a regulation or a client, the
laboratory may select an alternative method, provided that it will yield data of
sufficient quality to meet client requirements. When use of a particular method is not
required by a client, the laboratory should preferentially employ methods published
by consensus standards organizations, government agencies such as USEPA,
reputable technical organizations, or those that are published in peer reviewed
6
-------
journals. When using such a method, the laboratory shall ensure that it uses the latest
valid edition of a method unless it is not appropriate or possible to do so. When
necessary, the method shall be supplemented with additional details to ensure
consistent application.
c) A laboratory-developed method or a method adopted by the laboratory may also be
used if validated for the intended use. The client shall be informed as to the method
chosen. If the selected method is changed, the validation shall be repeated.
d) Client approval must be obtained prior to implementation. Modifications must be
documented and referenced in reports to the client.
5.10.3.2 Method Validation
The laboratory must routinely perform and document the quality of the measurement system relative
to the materials being tested. This activity is termed "method validation." The thoroughness and
robustness of the validation depends on what is already known about the performance of the method
on the analyte-matrix combination of concern over the concentration range of interest. Properties
of the measurement system to be validated include bias, precision, sensitivity, and selectivity. The
measurement system includes the analyst (operator) or work cell and method.
Essential elements of method validation include measures to determine positive or negative bias, to
assess variability and/or repeatability, to determine sensitivity, range, and response, to ensure
selectivity of the test method for its intended purpose, and to ensure constant and consistent test
conditions where required by the system.
The laboratory shall validate each method for its intended use according to Appendix C. The
laboratory shall record the results of the validation, the protocol used for the validation, and the basis
for the stated measurement system performance. When changes are made in a validated method, the
influence of such changes shall be documented and, if appropriate, a new validation shall be carried
out.
The thoroughness of any method validation is always a balance between costs, technical possibilities,
available time, and the consequences of error. There are many cases in which the range and
uncertainty of the values (e.g., accuracy, detection limit, selectivity, linearity, repeatability,
reproducibility, robustness and cross-sensitivity) can only be approximated. However, so long as
the level of approximation is commensurate with the needs of the client, such tradeoffs are
acceptable.
5.10.3.3 Quality Control Procedures
In addition to the requirement for validation, the following general quality control procedures shall
apply, wherever applicable. The manner in which they are implemented is dependent on the types
7
-------
of tests performed by the laboratory (i.e., chemical, whole effluent toxicity, microbiological,
radiological, air) and is further described in Appendix D. The standards for any given test type shall
assure that the applicable principles are addressed:
a) The laboratory shall have quality control procedures in place to monitor the
performance of the measurement system on an on-going basis, including:
1) procedures to ensure that the measurement system is free of laboratory induced
interferences;
2) procedures to identify if and when analytical instruments are in an out-of-control
condition;
3) procedures to verify continuing analyst proficiency;
4) procedures to ensure the suitability of reagents and standards; and
5) measures such as temperature, humidity, light, or specific instrumental
conditions, to assure constant and consistent test conditions (both instrumental
and environmental) where required by the test method.
b) All quality control measures shall be assessed and evaluated on an on-going basis,
and quality control acceptance criteria shall be used to determine the usability of the
data. (See Appendix D.)
c) The laboratory shall have procedures for the development of accept/reject criteria
where no method or regulatory criteria exist. (See 5.11.2, Sample Acceptance
Policy.)
The essential quality control measures for testing are found in Appendix D of this Chapter.
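Stepping outside the draft standard language for a moment, the bias and precision elements referred to above are commonly demonstrated with replicate analyses of a sample spiked at a known concentration. The sketch below is offered purely as an illustration of that arithmetic; the numbers and acceptance limits are made up and are not NELAC requirements.

```python
# Illustration only: bias and precision from replicate spike recoveries.
# The replicate values and the 20% acceptance limits are hypothetical.
from statistics import mean, stdev

true_value = 50.0                      # spiked concentration, ug/L
replicates = [48.2, 51.0, 49.5, 47.8]  # measured results, ug/L

recoveries = [100.0 * x / true_value for x in replicates]
bias_pct = mean(recoveries) - 100.0                       # positive or negative bias
rsd_pct = 100.0 * stdev(replicates) / mean(replicates)    # precision as relative std. deviation

print(f"Mean recovery {mean(recoveries):.1f}%, bias {bias_pct:+.1f}%, RSD {rsd_pct:.1f}%")
print("Within limits" if abs(bias_pct) <= 20 and rsd_pct <= 20 else "Re-evaluate the method")
```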
As you can tell, the yet-to-be-revised Appendix C will be integral to the method
validation step. The PBMS Subcommittee's draft Appendix C will hopefully be ready to include
in any discussion document taken to NELAC VII. I want to reiterate that the above is only an
internal draft still subject to change and does not represent proposed language up for vote at
NELAC VII. I hope sharing this draft language helps foster discussion and disseminates
information on the NELAC Quality System Committee's present efforts.
CONCLUSION
The advent of ISO/IEC 17025 and the PBMS Straw Model is generating considerable
discussion and effort within the NELAC Quality Systems Committee and its PBMS
8
-------
Subcommittee. This paper is an attempt to capture the direction the Committee is heading to
address these items as they relate to quality systems. It is the hope of the NELAC Quality
Systems Committee to present a complete discussion document at NELAC VII, for discussion
purposes only, that will highlight possible proposed language to be presented at the next NELAC
Interim Meeting (NELAC VIIi). At NELAC VII the Quality Systems Committee and the PBMS
Subcommittee will welcome your feedback on the direction they are taking. The USEPA, DOD,
other federal agencies, States and the private sector are significant stakeholders in this process
and need to participate fully in NELAC to ensure the quality systems standards developed do
indeed improve overall data quality.
9
-------
OBSERVATIONS OF LABORATORY CHANGES
AS A RESULT OF THE NELAC STANDARDS
Marlene O. Moore, Advanced Systems, Inc.
Abstract — As laboratories implement and adopt the NELAC standards, uniformity of
records, quality control, standards/reagent labeling, and SOP contents has been
observed. Although much criticism about the consistency of assessments is presented
at NELAC and other meetings, there are a number of areas where consistency is better
defined and implemented. This consistency provides data users with uniform
documented information that was not previously available from laboratories. This
presentation includes the observations made by one independent assessor relating to
laboratory operational consistency as a result of NELAC. In addition, the top laboratory
findings from August 1999 to June 2000, as compiled from the eleven accrediting
authorities, will be included in the presentation.
During the last year of assessments, a number of items found in laboratories across the country
have become more uniform and consistent. This consistency is a result of the implementation of
the NELAC standards by laboratories. As laboratories implement and adopt the NELAC
standards, uniformity of records, quality control, standards/reagent labeling, and SOP contents
has been observed. Although much criticism about the consistency of assessments is
presented at NELAC and other meetings, there are a number of areas where consistency is better
defined and implemented.
NELAC continues to struggle to ensure assessment consistency as accrediting authorities
conduct on-site assessments. Nonetheless, laboratories are implementing the standard, and uniformity is
evident.
Before the implementation of NELAC, training records were often found in various places and
contained a variety of information. In some laboratories no records were found at all. NELAC
changed this by requiring the laboratory director and QA manager to be responsible for certifying
personnel as trained in the laboratory. NELAC provides a form (Appendix C) that laboratories
have adopted, and its use is resulting in more consistent records. In addition, personnel are more
familiar with the methods cited on the form and with the performance measures required to
maintain the status of a trained analyst.
Test records now contain the same minimum number and types of quality control samples.
Records before NELAC did not always include a method blank and laboratory control sample.
Since many reference methods do not call for these samples within the specific method, laboratories
did not analyze them. The reference method often included the requirement for a blank
and control sample, but it was located in another section of the reference document.
With NELAC, the chemistry and microbiology Appendices clearly indicate the need for these
quality control samples. This allows data review to include these parameters on a more consistent
1
-------
basis. In addition, NELAC requires a quality control sample at the reporting limit, which ensures
that measurements reported to clients are bracketed with standards. In many reference methods
this is not clearly stated, and results are often reported for values without a standard to verify that
the measurement is possible.
Besides record improvement, a consistent method of labeling standards and reagents is now found.
NELAC requires the labeling of these materials to indicate the expiration date. The preparation
date, open date, and receipt date are not required on the label. In the past, labeling requirements
depended on the program, the auditors, and the state. They were not uniformly defined, and personnel
were often cited for not having a receipt date or open date. NELAC requires only the necessary
information on the label, that is, the expiration date. Standards and reagent logbooks include all
information, but the label is required to have only the information needed by the user: the label
must indicate when the material expires. The records of the laboratory allow traceability of the
preparation and lot number of reagents and standards. Containers in the laboratory are now
more uniformly labeled, not only within the laboratory but also between laboratories.
Many laboratories have spent a significant amount of time and money to rewrite their laboratory
procedures to meet the content requirements specified in NELAC. As a result,
laboratories' SOPs are more complete and follow a uniform standard for content. These
SOPs are more reflective of the laboratory operations and document modifications from the EPA
methods. In the past, assessors had to compare the laboratory method with the reference method.
Now many laboratories are identifying the modifications in order to indicate that these
modifications do not impact data quality. These modifications are often justified in the SOP,
with a demonstration of equivalency cited.
Although progress toward consistency is underway, many parts of the standard are subject to
interpretation. These areas are presented to NELAC and improvements are underway. In the
meantime, findings among the accrediting authorities are consistent. Although outliers may exist
for any one state or one assessor, the assessments are finding the same types of deficiencies.
The NELAC standard from 1999 contains many new and additional requirements that
laboratories have not routinely implemented. The number after each citation is the NELAC 1999
standard reference; the wording from the standard is summarized here rather than quoted in full.
The top ten findings from August 1999 to June 2000 as compiled from the eleven accrediting
authorities are:
1. Laboratory shall have processes to ensure that its personnel are free from any commercial,
financial and other undue pressures which might adversely affect the quality of their work
(5.4.2.b)
2. Nominate deputies in case of the absence of the technical director(s) and/or quality assurance
officer (5.4.2.h)
3. Have documented policy and procedures to ensure the protection of clients' confidential
information and proprietary rights (5.4.2.i)
2
-------
4. Procedures for protecting confidentiality (including national security concerns), and
proprietary rights (5.5.2.r)
5. All audit and review findings and any corrective actions that arise from them shall be
documented.
6. The laboratory management shall ensure that these actions are discharged within the agreed
time frame. (5.5.3.3)
7. The laboratory shall use appropriate test methods and procedures for all tests and related
activities... The method and procedures shall be consistent with the accuracy required and
with any standard specifications... (5.10.2.a)
8. Where computers or automated equipment are used for the capture, processing, manipulation,
recording, reporting, storage or retrieval of test data, the laboratory shall ensure that: all
requirements are met... (5.10.6.b - e)
9. If more stringent standards or requirements are included in a mandated test method or by
regulation, the laboratory shall demonstrate that such requirements are met. (5.1.b)
10. Method Blanks - performed at a frequency of 1 / batch / matrix type / prep. The source of
contamination must be investigated, corrected.... Any sample associated with the
contaminated blank shall be reprocessed or the results qualified. (D.l.l.a.l)
Additional findings that are frequently reported from the accrediting authorities include:
Organization
• The laboratory must specify and document the responsibility, authority, and interrelation
of all personnel ...
- in job descriptions for all positions (5.4.2.d)
• The QA officer
- arranges for or conducts internal audits on the entire technical operation annually
- notifies lab management of deficiencies in the quality system
- monitors corrective action (5.4.2.g.6)
System - Findings
• Quality Manual incomplete (5.5.2)
• Objectives not documented (5.5.2.a; 5.5.1.c)
• Records retention and document control procedures not available (5.5.2.d)
• Quality Manual signatures missing (5.5.2.f)
• Procedures for measurement traceability not available (5.5.2.g)
• Review of new work not defined (5.5.2.i)
• Ethical training not documented (5.5.2.u)
• Management review not complete (5.5.3.2)
• Corrective actions not implemented (5.5.3.5)
Training - Findings
3
-------
• Demonstration of Capability not documented (5.6.2.b, C-1, 5.10.2.1, D.1.3.a)
• Training not kept up to date (5.6.2.c)
• Proactive program for detection of improper actions (5.6.2.h)
Facilities & Equipment - Findings
• Recording & control of environmental conditions (5.7.1.c)
• Equipment records not complete (5.8.e)
Traceability - Findings
• Calibration and verification of equipment including balances, thermometers, standards
(5.9.1)
• Maintenance records and service calls (5.9.4.1.a)
Methods - Findings
• Documented instructions (5.10.1.a)
• SOPs not available (5.10.1.1)
• Effective date not defined (5.10.1.1.e)
• Methods manual not available (5.10.1.2)
• Procedures for obtaining subsamples (5.10.3)
• Procedures for purchase, reception, and storage of consumable materials (5.10.5)
Sample Handling - Findings
• No system for uniquely identifying samples (5.11.1.a)
• Documentation of sample conditions not per reference method or program (5.11.3.a)
Records - Findings
• Records do not include all activities (5.12)
• Control of logbooks and records incomplete (5.12.2.d)
Report - Findings
• Reports contents not per standard (5.13.a)
• Subcontractors not identified on report (5.13.c)
• Amendments to reports not identified (5.13.d)
Subcontracting - Findings
• No records for subcontracting (5.14.c)
Chemistry - Findings
• Matrix spikes not performed at required frequency (D.1.1.b.2)
• MS duplicates and other duplicates not performed at required frequency (D.1.2)
• Calibration verification not performed at beginning and end of run (5.9.4.2.2.b)
Microbiology - Findings
• Duplicates and PT testing (D.3.2)
• Temperature devices calibrated annually and appropriate for use (D.3.8.c)
4
-------
Reference:
National Environmental Laboratory Accreditation Conference Standard, Chapters 1 to 5 with
Appendices, July 1999, USEPA.
5
-------
The Quality Management System as a Tool for Improving
Stakeholder Confidence
Denise K. MacMillan, Environmental Chemistry Branch, Army Engineer Research and
Development Center
The Corps of Engineers works with local restoration advisory boards (RAB) to exchange
information and develop plans for restoration of closed military bases for civilian re-use.
Meetings of the RAB to discuss progress in environmental assessment and restoration of
former defense sites can be contentious due to the complex technical nature of the
information to be shared and the personal stake that the members of the community have
in ensuring that contaminated areas are restored for safe use. A prime concern of
community representatives is often the quality of the data used to make environmental
decisions. Laboratory case narratives and data flags may suggest laboratory errors and
low data quality to those without an understanding of the information's full meaning.
RAB members include representatives from local, state, and tribal governments, the
Department of Defense, the Environmental Protection Agency, and the local community.
The Corps of Engineers representatives usually include project technical and
management personnel, but these individuals may not have sufficient expertise in the
project quality assurance components and laboratory data quality procedures to
completely satisfy community concerns about data quality. Communication of this
information to the RAB by a quality assurance professional could serve to resolve some
of the questions members have about the quality of acquired data and proper use of
analytical results, and increase community trust that appropriate decisions are made
regarding restoration. Details of the effectiveness of including a quality assurance
professional in RAB discussions of laboratory data quality and project quality
management will be provided.
The US Army Corps of Engineers uses a twelve-element quality assurance program to assure
that the data acquired for site investigations, remediation, monitoring, and other environmental
projects meet project-specific data quality objectives. The elements are comprehensive in scope
and integrate quality assurance activities into the planning, sampling, analysis, data assessment,
and data validation stages of a project. The primary elements are laboratory validation, technical
document review, sample handling quality assurance, split sample collection and analysis, data
comparisons, and data assessment at the user level. Secondary elements are primary data review
by the user, performance evaluation samples, field audits, laboratory audits, and tape audits.
None of the elements are required by the Corps, though the primary elements are highly
recommended. With inclusion of at least some of these activities in District projects, questions
and concerns about data quality can be addressed more readily.
Split sample analysis is frequently one of the data quality activities that are used by projects. By
selecting this element, project personnel are also able to obtain information concerning sample
handling directly from the quality assurance laboratory. Data obtained from the primary
laboratory and the quality assurance laboratory are then also available for comparison. Thus
through selection of one of the QA elements, two more could be readily obtained.
1
-------
The Environmental Chemistry Branch of the Engineer Research and Development Center is the
Corps resource for split sample analysis and associated quality assurance elements. When the
Branch (then known as the Missouri River Laboratory) first became involved with quality
assurance, provided services were limited to split sample analysis, sample handling quality
assurance, and primary/QA data comparisons. In recent years, though, the Branch's role in the
Corps environmental program has expanded to include performance of all twelve quality
assurance activities detailed above. Through this expansion of services, Corps project personnel
are able to obtain all their desired quality assurance elements from one facility.
Since the Branch provides such a wide range of quality assurance services, we are able to use our
comprehensive project knowledge to assist Corps Districts in several ways. In some instances,
District personnel have come to us for help in assessing data usability. In other instances, our
work uncovered problems with primary laboratory performance that would jeopardize data
integrity or completion of the project. These services are the expected benefits of the corporate
quality assurance program. Another benefit is the ability to communicate the quality and
usability of project data to Federal, State, local, and other stakeholders and partners. A specific
example is participation in a restoration advisory board meeting to answer stakeholder questions
about data flags, laboratory comments, and other analysis-related concerns. By improving
stakeholder confidence in and understanding of findings, all these services directly support
projects by resolving problems that are related to usability of data for its intended purpose.
The Corps of Engineers works with local restoration advisory boards (RAB) to exchange
information and develop plans for restoration of closed military bases for civilian re-use.
Recently, a Corps District was concerned about a RAB's interpretation of laboratory data for a
local site investigation. The quality of the data used to make environmental decisions was a
prime concern for RAB members. The RAB membership included people with technical
backgrounds, but, overall, the language of the laboratory's technical report was a barrier to
members' understanding of the analytical results. None of the project personnel who regularly
participated in the RAB meetings had sufficient understanding of typical data packages and
environmental analytical laboratory practices to adequately answer RAB members' questions.
To minimize misunderstanding, contention, and confusion, the District decided that a person
with direct experience with laboratory procedures and the quality assurance systems of both the
Corps and commercial laboratories was needed to answer questions. The District requested that
an Environmental Chemistry Branch QA officer attend the RAB meeting for this purpose.
The QA officer's primary role was to explain case narratives and data flags included in the
project's primary laboratory results reports. Review of the data packages provided by the
commercial laboratory prior to the RAB meeting showed no significant quality control
deficiencies or laboratory error. The RAB members, however, were troubled by the presence of
"J" and "B" flags in previous data packages, and were concerned that some samples required
dilution. In one member's estimation, these flags indicated that the laboratory was making
excuses for poor performance. Another significant RAB concern was that a laboratory might
have transposed numbers when converting instrument raw data to final results reports. The RAB
requested copies of the laboratory's raw data, including calibration data, sample preparation logs,
and analytical run logs for project samples, with the intention of verifying the correctness of the
2
-------
reported results. Each question and concern was on a fundamental, routine procedure that, while
easily understood by analytical environmental chemists, suggested significant problems to the
RAB members.
The idea to have a QA expert present at the RAB meeting was sound, but the implementation
was a limited success. First, the culture of the RAB led to delays during the meeting that
prevented sufficient time for adequate discussion of the data. RAB members expressed mistrust
and irritation at several points during the meeting, starting from the introduction of a facilitator.
Resolution of issues such as these consumed most of the meeting. Secondly, the discussion of
the laboratory results was limited to only 15 minutes at the end of the meeting. There was not
sufficient time for the members to review all the results and formulate questions. Also, the
questions that were framed did not cover enough of the unease to alleviate the concerns held by
the RAB. Thirdly, the follow-on meeting was scheduled to occur two months later. This delay
between receipt of the data and response to concerns could only add to the general frustration
experienced by the RAB members and project personnel.
Resolution of problems such as these is difficult and will require significant effort. These
difficulties are not limited to the project described here, though. To improve stakeholder
confidence in environmental decisions, a QA representative should be included as part of the
project team at the very start of a project, and should be part of the team that interacts with
stakeholders. Incorporation of a QA professional would emphasize the value of a quality
assurance system to the project personnel and serve to minimize misunderstandings with the
public early in the project. Stakeholder concerns tended, in the situation described here, to be basic
and easily answered. But because the questions went so long unanswered, they created a
negative impact on the project that led to delays and inefficiency. The QA representative, if
involved in a project from the planning phase through the stage where data are used to make
decisions, as is suggested by comprehensive use of the US Army Corps of Engineers quality
assurance program elements, would be a resource for overall improvement of the quality and
efficiency of an environmental project.
For more information, contact:
Denise K. MacMillan, Environmental Chemistry Branch, Army Engineer Research and
Development Center, 420 South 18th Street, Omaha, NE 68102-2586
Phone: (402) 444-4304, Fax: (402) 341-5448
e-mail: denise.k.macmillan@nwo02.usace.army.mil
3
-------
LESSONS LEARNED IN PREPARING METHOD 29 FILTERS
FOR COMPLIANCE TESTING AUDITS
Joan T. Bursey, Ph.D., Eastern Research Group, Morrisville, NC
Clyde E. Riley, Emission Measurement Center, U.S. Environmental Protection Agency, Research
Triangle Park, NC
Judith E. McCartney, Eastern Research Group, Morrisville, NC
Mr. Robert Martz, Eastern Research Group, Inc.
Abstract — Companies conducting compliance testing are required to analyze audit
samples at the time they collect and analyze the stack samples if audit samples are
available. Eastern Research Group (ERG) provides technical support to the EPA's
Emission Measurements Center's Stationary Source Compliance Audit Program
(SSCAP) for developing, preparing, and distributing performance evaluation samples
and audit materials. These audit samples are requested via the regulatory Agency and
include spiked audit materials for EPA Method 29 - Metals Emissions from Stationary
Sources, as well as other methods.
To provide appropriate audit materials to Federal, State, tribal, and local governments,
as well as agencies performing environmental activities and conducting emission
compliance tests, ERG has recently performed testing of blank filter materials and
preparation of spiked filters for EPA Method 29. For sampling stationary sources using
an EPA Method 29 sampling train, the use of filters without organic binders containing
less than 1.3 µg/in² of each of the metals to be measured is required. Risk Assessment
testing imposes even stricter requirements for clean filter background levels. Three
vendor sources of quartz fiber filters were evaluated for background contamination to
ensure that audit samples would be prepared using filters with the lowest metal
background levels. A procedure was developed to test new filters and a cleaning
procedure was evaluated to see if a greater level of cleanliness could be achieved using
an acid rinse with new filters.
Background levels for filters supplied by different vendors and within lots of filters from
the same vendor showed a wide variation, confirmed through contact with several
analytical laboratories that frequently perform EPA Method 29 analyses. It has been
necessary to repeat more than one compliance test because of suspect metals
background contamination levels. An acid cleaning step produced improvement in
contamination level, but the difference was not significant for most of the Method 29
target metals.
As a result of our studies, we conclude:
• Filters for Method 29 testing should be purchased in lots as large as possible.
• Testing firms should pre-screen new boxes and/or new lots of filters used for
1
-------
Method 29 testing.
• Random analysis of three filters (top, middle, bottom of the box) from a new box
of vendor filters before allowing them to be used in field tests is a prudent
approach.
• A box of filters from a given vendor should be screened, and filters from this
screened box should be used both for testing and as field blanks in each test
scenario to provide the level of quality assurance required for stationary source
testing.
Eastern Research Group (ERG) provides technical support to the EPA's Emission Measurements
Center's Stationary Source Compliance Audit Program (SSCAP) for developing, preparing, and
distributing performance evaluation samples and audit materials. For the past four years, ERG
has provided contractor support to the EPA in the area of compliance audits.
Companies conducting compliance testing are required to analyze audit samples (if available) at
the time they collect and analyze stack samples using any of several EPA test methods. To
provide appropriate audit materials to Federal, State, tribal, and local governments, as well as
agencies performing environmental activities and conducting emission compliance tests, these
audit samples are requested via the regulatory Agency and include spiked audit materials for
EPA Method 29 - Metals Emissions from Stationary Sources, as well as other methods.
Consistent with the components of the EPA Method 29 sampling train, audit materials are
supplied as spiked filters and as spiked aqueous media.
In performing compliance testing, field testing personnel and laboratory personnel
collaboratively apply approved EPA sampling and analytical methods to determine whether a
given facility is in compliance with environmental regulations. A "successful" compliance test
(from the point of view of the affected facility) demonstrates that the facility is complying with
applicable regulations - i.e., levels of tested materials are measured below the level that will
trigger remedial activity and repeated testing on the part of the affected facility. It has been
brought to our attention that several field groups and testing firms have been experiencing test
failures because of so-called "dirty filters" with high background levels that yield "out of
compliance" results for certain metals on blank and sample filters. EPA Method 29 states that
the filters used in testing shall contain less than 1.3 µg/in² of each of the metals to be measured.1
Risk Assessment testing may impose even stricter requirements for cleanliness of filter
background. This target level for cleanliness has not been achieved in several cases of "out of
the box" filters used in sampling for metals at stationary sources.
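The per-filter acceptance limit follows directly from this per-area criterion and the filter dimensions. For the 4.25-inch diameter filters used in this study (about 14.19 sq. in.; see Table 4), the arithmetic works out to roughly 18.4 µg of each metal per filter, as the short calculation below shows.

```python
# Convert the Method 29 per-area background criterion to a per-filter limit.
import math

diameter_in = 4.25                                 # filter diameter, inches
area_sq_in = math.pi * (diameter_in / 2.0) ** 2    # about 14.19 sq. in.
criterion_ug_per_sq_in = 1.3                       # Method 29 limit for each target metal

per_filter_limit_ug = criterion_ug_per_sq_in * area_sq_in
print(f"Filter area: {area_sq_in:.2f} sq. in.; limit: {per_filter_limit_ug:.1f} ug/filter")
# Prints approximately: Filter area: 14.19 sq. in.; limit: 18.4 ug/filter
```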
Examples of high levels of filter contamination in two different field studies are shown in
Table 1. These data show that significant levels of arsenic, barium, and other metals were found
in "blank" filters from field sampling efforts. Two different filter lots from two different vendors
1 Method 29 - Metals Emissions from Stationary Sources, Code of Federal Regulations,
Title 40, Part 60, Appendix A, February, 2000.
2
-------
were used in tests conducted at two separate geographic sources. The metals observed were not
expected to be found at the source, and the analytical laboratory rechecked each measurement to
verify that there was no laboratory contamination or extenuating circumstances related to the
analysis process. The levels of contamination in the two filters were so great that expensive field
tests had to be repeated to verify the results. More than one compliance test has been repeated
due to suspect metals background contamination levels. Several analytical laboratories had
observed and reported high levels of background contamination on filter media used for metals
sampling and analysis. In some cases, the levels of filter blanks have been too high for an
accurate determination of the source contribution, and in other cases, the final results from field
work had suspicious levels of contamination. Background levels between vendors and within
lots of filters from the same vendor showed a wide variation, confirmed through contact with
several analytical laboratories that perform EPA Method 29 analyses. A background study was
initiated by ERG to investigate these claims.
Three selected vendor sources of quartz fiber filters were evaluated by ERG for background
contamination in order to ensure that our audit samples would be prepared using filters with the
lowest possible metal background levels. In each case, filters were purchased and three random
filters chosen from an individual vendor's box were digested separately, the digests combined,
and analyzed according to EPA Method 29 procedures. The results are reported in Table 2 as per
filter values. The apparent background level of the filters varied greatly from vendor to vendor,
with one vendor delivering "new clean" filters with surprisingly high levels of barium and
phosphorus.
Filters were also evaluated to gain additional insight into the per filter variation of blank filters.
The "hits" from Vendors A and B in Table 2 were tested to see if there was significant variation
from filter to filter. Only those metals that were found at levels above the analytical method
detection limit were selected and three filters were evaluated to look at the differences. Results
are shown in Table 3. The filters were reasonably consistent, but the decision was made to use
the average of three analyzed filters for future determinations rather than using a single filter.
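In practice such a lot check reduces to averaging the three analyzed filters for each target metal and comparing the result to the per-filter limit. The sketch below illustrates the idea using the Vendor B values from Table 3 and the 18.4 µg/filter limit derived earlier; the accept/reject logic is ours, not a requirement of Method 29.

```python
# Illustrative lot screening: average three randomly selected filters per metal and
# compare to the per-filter background limit. Values follow Table 3 (Vendor B).
PER_FILTER_LIMIT_UG = 18.4   # 1.3 ug/sq. in. x 14.19 sq. in. (4.25-inch filter)

filter_results_ug = {
    "Barium":   [6.0, 3.0, 2.0],
    "Chromium": [2.2, 2.1, 2.2],
    "Lead":     [0.4, 0.4, 0.4],
    "Zinc":     [2.0, 2.0, 4.0],
}

for metal, values in filter_results_ug.items():
    average = sum(values) / len(values)
    status = "acceptable" if average < PER_FILTER_LIMIT_UG else "reject lot"
    print(f"{metal}: average {average:.2f} ug/filter -> {status}")
```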
While the best filter shown in Table 3 was reasonably adequate for spiking experiments, a
procedure was developed by ERG to clean new filters and that cleaning procedure was evaluated
to see if a greater level of cleanliness could be achieved using an acid rinse with new filters.
Acid cleaning had been suggested by a few specialists in the inorganic analysis field, but no
specifics had been documented. Batches of 25 - 50 new filters were acid washed as a group
multiple times with a 10% nitric acid solution in a Büchner filtration apparatus. Filters were
pre-rinsed with deionized water, then immersed in 10% nitric acid, which was applied three
times to the filter group. Filters were never allowed to dry while loaded in the Büchner apparatus, to
avoid passive sampling of room air through the filters. The filters were immersed and rinsed
thoroughly with deionized water, and gently dried individually in a clean laboratory oven.
The comparison of the blank versus acid washed filters is shown in Table 4. Acid cleaning
3
-------
produced improvement in background contamination level, but the difference was not significant
for most of the listed Method 29 target metals. Again, three random filters were chosen from a
vendor's box, then digested separately, combined, analyzed, and compared to three random
filters chosen from the acid washing treatment. Barium, chromium, and zinc levels were slightly
improved by the acid treatment, while the other elements were essentially unchanged in their per
filter concentrations. While we believe that the filter washing is effective, it does not appear
warranted for filter vendor batches that are already clean enough to meet Method 29 cleanliness
criteria. Perhaps acid washing is a technique that could be applied by the vendors to produce a
more consistent quality filter for use in EPA Method 29 metals sampling.
As a result of our EPA Method 29 studies, we have reached the following conclusions:
• A vendor providing "quartz" fiber filters designated as EPA Method 29 or Metals
Sampling filters does not necessarily provide filter media that is clean enough to meet
method specifications. Consulting other analysts and comparison shopping is
recommended.
• Filters for EPA Method 29 testing and spiking should be purchased in lots as large as
possible (100 - 200 filters or more) to allow a full set to be evaluated for cleanliness and
the entire lot certified as acceptable for spiking studies, field use, or laboratory studies.
A large single lot is the most convenient to use for spiking or ongoing field testing.
Filters from the same lot and the same vendor should be used for testing, field blanks, and
laboratory blanks.
• Testing firms should pre-screen new boxes and/or new lots of filters used for Method 29
testing. We are aware of several firms that are sending all of their filters off to analytical
laboratories to "pre-screen" the lots of filters and avoid contamination problems at the
end of an expensive field test or compliance study. While this extra testing is expensive,
it is much less expensive than repeating an entire field test.
• We agree with fellow laboratories that randomly checking and analyzing three filters (top,
middle, bottom of the box) from a new box or new lot of vendor filters before allowing
them to be used in field tests is a prudent approach. The "cleanliness" of the filters can
be documented as an initial quality assurance step in the test sequence or spiking study.
• The screened boxes of filters in combination with filters chosen and used as field blanks
should be used in each test scenario to provide the level of quality assurance required for
stationary source testing. Screening the boxes, combined with the careful handling of the
filters, will not eliminate all filter contamination issues, but it will avoid the very large
background levels that some testing firms have encountered.
4
-------
Table 1. Examples of High Levels of Filter Contamination

Metal of    Contaminated Blank      Contaminated Blank      Laboratory Blank
Interest    Filter 1 (µg/filter)    Filter 2 (µg/filter)    Filter (µg/filter)
Sb          16                      26                      <0.4
As          513                     853                     0.5
Ba          107                     259                     3.4
Be          0.2                     0.2                     <0.1
Cd          <0.1                    <0.1                    <0.1
Cr          4.7                     8.4                     0.2
Co          0.4                     2.2                     <0.1
Cu          2.2                     3.2                     0.4
Pb          10                      19                      0.6
Mn          27                      57                      <0.75
Ni          2.7                     23                      0.25
Zn          14                      54                      1.7

Filters #1 and #2 were from different vendors. The Laboratory Blank Filter was from a third vendor.
5
-------
Table 2. Quartz Filters Evaluated for Background Contamination

Metal         Symbol    Vendor A (µg/filter)    Vendor B (µg/filter)    Vendor C (µg/filter)
Antimony      Sb        <0.5                    1                       <0.5
Arsenic       As        <0.5                    <0.5                    <0.5
Barium        Ba        14                      9                       134
Beryllium     Be        <0.02                   <0.02                   <0.02
Cadmium       Cd        <0.01                   <0.01                   <0.01
Chromium      Cr        1.4                     1.8                     2.5
Cobalt        Co        <1.5                    <1.5                    <1.5
Copper        Cu        <2                      <2                      <2
Lead          Pb        1.4                     0.4                     1
Manganese     Mn        <1                      1                       2
Mercury       Hg        <0.4                    <0.4                    <0.4
Nickel        Ni        <0.5                    0.7                     <0.5
Phosphorus    P         <30                     <30                     114
Selenium      Se        <0.3                    <0.3                    <0.3
Silver        Ag        <0.02                   <0.02                   <0.02
Thallium      Tl        <0.5                    <0.5                    <0.5
Zinc          Zn        14                      4                       15
Table 3. Filter Evaluation

Metal        Filter #1 (µg/filter)    Filter #2 (µg/filter)    Filter #3 (µg/filter)    Average (µg/filter)
Barium       6                        3                        2                        3.67
Chromium     2.2                      2.1                      2.2                      2.17
Lead         0.4                      0.4                      0.4                      0.40
Zinc         2                        2                        4                        2.67

Filters were from Vendor B.
6
-------
Table 4. Blank versus Acid Washed Filters

Metal         Symbol    Blank filters    Acid washed filters    Method blank criteria*
                        (µg/filter)      (µg/filter)            (µg/filter)
Antimony      Sb        <0.7             <0.7                   18.4
Arsenic       As        <0.5             0.5                    18.4
Barium        Ba        3.3              1.8                    18.4
Beryllium     Be        <0.05            <0.05                  18.4
Cadmium       Cd        <0.05            <0.05                  18.4
Chromium      Cr        3.1              2.6                    18.4
Cobalt        Co        <1.0             <1.0                   18.4
Copper        Cu        <0.5             <0.5                   18.4
Lead          Pb        <0.5             <0.5                   18.4
Manganese     Mn        <1.0             <1.0                   18.4
Nickel        Ni        <1.0             <1.0                   18.4
Selenium      Se        <0.6             <0.6                   18.4
Silver        Ag        <0.02            <0.02                  18.4
Thallium      Tl        <0.5             <0.5                   18.4
Zinc          Zn        2.9              2.0                    18.4

* 4.25 inch diameter filters (14.19 sq. in.) were used for these tests.
7
-------
U.S. Geological Survey
Passive Diffusion Bag
Samplers
Don A. Vroblesky (USGS),
Javier Santillan (AFCEE), and Maj. Jeff Cornell (AFCEE)
Presentation Date: 4 April 2001
20th Annual National Conference on Managing
Environmental Quality Systems
Objectives
• To discuss the use of passive diffusion bag samplers as an inexpensive and effective
alternative for sampling VOCs in wells.
• To discuss the implications of contaminant stratification in well screens on the data
quality obtained by diffusion bag samplers and other types of sampling.
U.S. Geological Survey
Description of passive
diffusion bag (PDB) samplers
• PDB samplers are low-density polyethylene tubes closed at both ends and containing
deionized water.
• Vendors:
  - Columbia Analytical Services, 206-824-8933
  - Eon Products, 800-474-2490
U.S. Geological Survey
-------
Groundwater and Contaminant
Flow
[Figure: well screen diagram]
In a situation where ground water freely moves horizontally through a well screen, the screen
typically functions as a more permeable part of the aquifer.
U.S. Geological Survey
Lab Tests of Water-Filled
Diffusion (PDB) Samplers
Benzene, EDB, 1,2-DCA, Ethyl Benzene, 1,1,2-TCA, BDCM, DBM, 1,1-DCE, MC, TCE,
Bromoform, 1,2-DCB, c-DCE, Naphthalene, TCFM, Carbon Tet., 1,3-DCB, t-DCE, PCA,
1,2,3-TCPA, Chlorobenzene, 1,4-DCB, 1,2-DCPA, PCE, Vinyl Chloride, Chloroethane,
DCDFM, c-DCPE, Toluene, Total Xylenes, DBCM, 1,1-DCA, t-DCPE, 1,1,1-TCA
U.S. Geological Survey
2
-------
Advantages of PDB samplers
¦ Inexpensive and easy to deploy
¦ Have the potential to eliminate or substantially
reduce the amount of purge water during sampling.
¦ The samplers are disposable, so there is no down-
hole equipment to be decontaminated between sites.
¦ They have the potential to delineate contaminant
stratification in the screened or open intervals of
wells.
U.S. Geological Survey
Disadvantages of PDB Samplers
¦ Not applicable to inorganic and highly-soluble or
highly-insoluble organic compounds, including the
following tested compounds:
¦ MTBE (too soluble)
¦ Pesticides (too insoluble)
¦ Most PAHs (naphthalene is an exception)
¦ Water should be flowing through the screen or open
interval for maximum effectiveness.
U.S. Geological Survey
PDB Sampler Equilibration
Time in Lab Studies
¦ 48 hours for TCE and several tested compounds
(Vroblesky, 2000, USGS)
¦ 98 to 168 hours for VC and some chloroethenes
(Sivavec and Baghel, 2000, General Electric
Company)
¦ BUT: under field conditions, samplers should
equilibrate long enough for well water, contaminant
distribution, and flow dynamics to restabilize
(typically 2 weeks)
U.S. Geological Survey
-------
Multiple Diffusion Samplers to
Examine VOC Stratification
U.S. Geological Survey
Comparison of PDB sampling
and low-flow sampling
[Photograph: tubing for low-flow sampling deployed alongside diffusion samplers.]
In some cases, positive-displacement pumps were deployed with the PDB samplers.
U.S. Geological Survey
TCE Stratification in 10-Ft Well Screens
[Figure: TCE concentration (mg/L) profiles from LF bladder pump, LF peristaltic pump, and
diffusion samples; NAS North Island, CA.]
U.S. Geological Survey
4
-------
Stratification Definition May be Necessary to
Optimize Sampling and Remediation
[Figure: TCE (µg/L) profile; NAS North Island, CA.]
U.S. Geological Survey
Contaminant Profile May Provide Clues
Regarding Contaminant Degradation
[Figure: changes in the TCE/DCE ratio along a vertical profile in the screened interval, in
micrograms per liter; NAS North Island, CA, and Fridley, MN, Nov. 1999.]
U.S. Geological Survey
TCE stratification in 10-ft well
screens
[Figure: 10-ft well screens, Fridley, MN; low-flow samples versus PDB samples for
1,2-dichloroethene and trichloroethene, in micrograms per liter.]
U.S. Geological Survey
-------
Different sampling approaches can produce different
results in a VOC-stratified well screen
U.S. Geological Survey
Low-flow Sampling effects near
TCE stratification
[Figure: TCE, in µg/L, for well PW-15, NAS North Island, CA, January 2000; peristaltic
low-flow samples (numbered in sampling sequence) compared with diffusion samples.]
As the sequential low-flow sampling approaches a chemical interface, in-well mixing may
obscure the TCE profile.
U.S. Geological Survey
Sequential sampling from high to
low concentrations
[Figure: TCE, in µg/L; diffusion samples compared with low-flow peristaltic samples,
numbered in sampling sequence.]
In this well, the TCE profile from low-flow sampling is a subdued version of the profile
from diffusion sampling.
U.S. Geological Survey
6
-------
Comparison of PDB and Purge
Sampling Methods at Well 18S
[Figure: comparison of PDB results and 4-casing-volume purge results (values of 130, 650,
570, and 2,300); Fridley, MN (Nov. 1999).]
U.S. Geological Survey
Switched to Multiple PDB samplers and
Low-Flow Sampling at Well 18S
[Figure: PDB and low-flow samples of total 1,2-dichloroethene and trichloroethene,
concentration in micrograms per liter; Fridley, MN (May 2000).]
These data imply that the low-flow sampling results can be a mixture of waters within the
screened interval.
U.S. Geological Survey
Varying degrees of mixing
during sampling
¦ Thus, diffusion samplers typically constitute a point sample.
This is useful for targeting the high concentrations (for
instance). Average concentrations for a screened interval are
obtained by multiple samplers.
¦ Low-flow samples sometimes constitute a point sample (North
Island) providing no information on average concentrations in
a well screen. In other wells, LF samples constitute a mixed
sample over varying intervals.
¦ 3 or more casing-volume purge sampling averages aqueous
concentrations even more by mixing, sometimes inducing flow
from horizons not in the vicinity of the well screen.
U.S. Geological Survey
-------
Relation between well sampling
and well construction
¦ In some cases, VOC stratification in a well and
disagreement between sampling methods can result
from inadequate wells.
¦ Examples include wells that connect zones of
significantly different hydraulic head or contaminant
concentration.
¦ Consider the following examples
U.S. Geological Survey
-------
High TCE Concentrations in the
Zone of Stagnation
[Figure: TCE (µg/L) profiles from purge-and-sample and diffusion methods; data from Peter
Church, USGS.]
U.S. Geological Survey
Differing Source Waters for
PDB and Purge Methods
¦ Low-permeability aquifer
¦ VOCs were higher in purged sample than in PDB sample.
¦ PDB samples were local. Pumped sample was from above the screen.
[Figure: well construction diagram showing sand, sand and gravel, and the water level;
Davis Global Comm., CA, Well DMW-5.]
U.S. Geological Survey
Summary
¦ Different approaches to sampling in VOC-stratified intervals
can produce different results. The user should be aware of
these differences when matching a sampling methodology to
the data quality objectives.
¦ PDB samplers constitute a point sample. Average
concentrations for a screened interval are obtained by
multiple samplers.
¦ Low-flow samples sometimes constitute a point sample and
sometimes constitute a mixed sample over varying
intervals. It is not always obvious which is being produced.
¦ 3 or more casing-volume purge sampling averages aqueous
concentrations even more by mixing, sometimes inducing
flow from horizons not in the vicinity of the well screen.
U.S. Geological Survey
-------
Summary (continued)
¦ PDB samplers can result in substantial cost savings.
¦ Typically a close match is obtained between PDB samplers and
conventional sampling methods.
¦ When they don't match:
¦ Usually because of differences in the amount of
mixing from each method in a stratified system.
¦ Multiple diffusion samplers can provide information on
contaminant stratification in wells, particularly when used in
conjunction with borehole flowmeters.
¦ Delineating stratification can be useful for
¦ Sampling optimization
¦ Remediation optimization
¦ Gaining a better understanding of contaminant
degradation
U.S. Geological Survey
-------
MEASUREMENT UNCERTAINTY FOR
ENVIRONMENTAL PROGRAMS
Marlene O. Moore, Advanced Systems, Inc.
Abstract — Currently the primary topic of discussion for many testing laboratories in the
United States and the world is measurement uncertainty. Except in a limited number of
testing fields, such as calibration, radiochemistry and some biological studies, the
uncertainty of measurements is not commonly defined or practiced by testing
laboratories. With the recent publication, adoption and implementation of ISO/IEC
17025, testing laboratories must address the understanding, documentation and
evaluation of measurement uncertainty. ISO/IEC 17025 requires the laboratory to have
and apply procedures for measurement uncertainty. The ISO/IEC 17025
definition of uncertainty is that stated in the "Guidelines for Expression of
Measurement Uncertainty" (GUM). This uniform definition requires the reevaluation of
the uncertainty expressions being used by all laboratories when expressing measurement
results. The presentation will include a review of the internationally agreed definition of measurement
uncertainty and alternatives under consideration for testing
laboratories worldwide, with emphasis on environmental applications.
ISO/IEC 17025 requires the laboratory to have and apply procedures for measurement uncertainty.
The ISO/IEC 17025 definition of uncertainty is that stated in the "Guidelines for
Expression of Measurement Uncertainty" (GUM). This uniform definition requires the reevaluation
of the uncertainty expressions being used by all laboratories when expressing measurement results.
Currently the primary topic of discussion for many testing laboratories in the United States and the
world is measurement uncertainty. Except in a limited number of testing fields, such as calibration,
radiochemistry and some biological studies, the uncertainty of measurements is not commonly
defined or practiced by testing laboratories. With the recent publication, adoption and
implementation of ISO/IEC 17025, testing laboratories must address the understanding,
documentation and evaluation of measurement uncertainty.
First we will review the definition for measurement uncertainty, second we will review some of the
methods used for determining uncertainty and finally we will look at the application of measurement
uncertainty for environmental programs.
Measurement Uncertainty Definition
Definition of terms is critical when discussing uncertainty. The definitions used for measurement
uncertainty today are based on the international metrological definition. Over the past several years
a uniform definition for measurement uncertainty has evolved and been adopted worldwide. This
definition is found in "Guidelines for Expression of Measurement Uncertainty" (GUM). Laboratory
accreditation bodies have adopted this guideline to ensure a uniform application and understanding
for measurement uncertainty. However, some in the testing laboratory community believe this
1
-------
definition is not applicable to testing laboratories and that the implementation of this measurement
uncertainty standard by testing laboratories is too costly and not practical. In addition the statistical
application and evaluation of testing data is not understood by most chemists, biologists and other
testing laboratory scientists.
A person's academic, technical and work background will play a significant role in the
understanding of uncertainty. Here are a few definitions obtained from a variety of sources.
Absolute Uncertainty: the instrument or equipment reliability as defined by the
manufacturer, e.g., the balance permits no better operation than ±0.05 g. (Chemistry
textbook 1969)
Relative Uncertainty: Expression of reliability defined as the fraction obtained by
dividing the absolute uncertainty by the value of the result. (Chemistry textbook
1969)
Uncertainty of measurement: parameter, associated with the result of a measurement,
that characterizes the dispersion of the values that could reasonably be attributed to
the measurand. (VIM 3.9)
Combined standard Uncertainty: standard uncertainty of the result (y) of a
measurement when the result is obtained from the values of a number of other
quantities, equal to the positive square root of a sum of terms, the terms being the
variances or covariances of these other quantities weighted according to how the
measurement result varies with these quantities (GUM 2.3.4)
Expanded Uncertainty: quantity defining an interval about the result of a
measurement that may be expected to encompass a large fraction of the distribution
of values that could reasonably be attributed to the measurand. Obtained by
multiplying the combined standard uncertainty by a coverage factor, usually
expressed as "k". (GUM 2.3.5)
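The combined and expanded uncertainty definitions quoted above can also be written compactly. The following restates GUM 2.3.4 and 2.3.5 for independent input quantities (covariance terms omitted); it is offered only as a notational summary of the definitions, not as additional guidance.

```latex
% Combined standard uncertainty (GUM 2.3.4) for y = f(x_1, ..., x_N),
% assuming independent input quantities (covariance terms omitted):
u_c(y) = \sqrt{\sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)}
% Expanded uncertainty (GUM 2.3.5) with coverage factor k (typically k = 2):
U = k \, u_c(y)
```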
When discussing and using scientific terms be sure that the client, regulatory agency, accrediting
body, and others use the terms in the same way. Many clients, regulators and other data users
express these terms based on various in-house and externally defined programs. Knowing the reason
for the test and the client's proposed application of the data should ensure that all parties involved with
the data apply the same terms.
In environmental programs, measurement uncertainty is expressed for instrumental uncertainty, such
as radiochemistry, and for probability distributions, such as biological testing. However, current
environmental programs do not define the reporting of measurement uncertainty.
Measurement uncertainty must include sampling, site characteristics, matrix effects and the
laboratory effects. Therefore the definition and determination of measurement uncertainty is
required by the data user and is not a laboratory-generated value. The laboratory provides the data
for calculating the uncertainty, but the data user must generate and define the measurement
2
-------
uncertainty for environmental reporting and decisions. This is better understood when the steps for
determining uncertainty are reviewed.
Measurement Uncertainty Methods
The uncertainty provides the data user with the interval about the result. This interval expresses the
random and systematic effects on the measurement. The values present the variability of the result,
thus providing a level of confidence when making a decision using the result. With the adoption of
a single standard (GUM) for defining uncertainty for the international presentation of data, the use
of the plus/minus symbol should become more commonplace in the future.
Uncertainty expresses the range of values that could reasonably be attributed to the measured
quantity. The expanded uncertainty provides the level of confidence that the value actually lies
within the range defined by the uncertainty interval.
The estimation of uncertainty is a quantitative indication of the quality of the result. This estimation
provides the data user with the confidence to allow comparability. This is needed in order to reduce
trade barriers, allow accreditation bodies an objective approach to resolving data comparability
complaints and provide the data user with information related to the risk in making a decision.
For example, one accredited laboratory performs method A and determines the result to be 4.5 or
within specification (< 5.0). A second accredited laboratory performs method A on the same product
and finds a result of 5.3 or outside the specification (> 5.0). If both laboratories have evaluated their
uncertainties for the method, the data user can request this information for method A. Although
other factors may be prevalent, if the uncertainties overlap, it can be determined that the test method
results overlap the specification criteria, which explains the seemingly acceptable and unacceptable
product. If they do not overlap, the data user's evaluation should look for other sources of variation
to explain the disparity in results. In time, if both labs are performing the same method they should
determine similar uncertainties. However, this will only become possible when the reference
method A states the uncertainties achieved with the method, based on inter- or intra-laboratory
comparison data, and a uniform method for determining uncertainty is defined.
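As an illustration of the comparison just described, the following minimal Python sketch checks whether the two laboratories' uncertainty intervals overlap. Only the results (4.5 and 5.3) and the specification limit (5.0) come from the text; the expanded uncertainty of 0.5 assigned to each laboratory and the function name are hypothetical values used for demonstration.

```python
# Minimal sketch of the two-laboratory comparison, assuming each lab reports a result with
# an expanded uncertainty U at the same coverage factor. The 0.5 values are hypothetical.

def intervals_overlap(result_a, u_a, result_b, u_b):
    """True if the two uncertainty intervals (result +/- U) overlap."""
    low_a, high_a = result_a - u_a, result_a + u_a
    low_b, high_b = result_b - u_b, result_b + u_b
    return low_a <= high_b and low_b <= high_a

# Lab 1 reports 4.5 and Lab 2 reports 5.3 against a specification of < 5.0.
print(intervals_overlap(4.5, 0.5, 5.3, 0.5))
# True -> the stated intervals overlap, so the apparent pass/fail disagreement
# may lie within the methods' evaluated uncertainties rather than the product.
```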
Currently there are several ways to determine the uncertainty of the measurement and more are being
proposed. This paper presents several methods which use the international definition of
uncertainty. Before any calculations are completed, the measurement must be evaluated; then the
calculations are performed. This can be presented as a seven-step process.
1. Write down what is being measured, including any relationship between the measurand and the
parameters upon which it depends. For example, measuring nitrate: the standard used is a potassium
nitrate salt. The measurand may be expressed as nitrate as NO3 or NO3 as N. Document how
the final result is defined by the calibration standard or spiking standard.
2. State or refer to the SOP defining all measurement conditions. If your SOPs reflect what is
included in the process and all the conditions of the test method are defined, then summarize the
3
-------
conditions and processes that contribute to the uncertainty. These include calibration equipment
or standard uncertainty, environment, operator, sample or item under test, and procedure.
3. List possible sources of uncertainty, e.g., drying of the primary standard, weighing, personnel ability
to weigh and make volumetric measurements, reading the analog dial on the spectrophotometer,
wavelength drift, glassware variations, etc. This listing from Item 2 indicates the components
under consideration. Detailed studies are not required on any one or all of the contributors, but
a listing of all assumptions should be documented.
4. Consolidate the components. Look for interdependence and eliminate any components that
overlap. This review must ensure that the components are independent variables.
5. Measure or estimate the size of the components. If the random component is known or expected
to be significant, perform measurements to determine the component's standard uncertainty. This
can be compared to the listed systematic components of uncertainty to evaluate the random
compared to the systematic effects.
Review all the components and convert all components to the units of the measurand. This
requires a technical understanding of the measurement process. In some cases the values are
converted to percent or a defined unit and the final result's uncertainty expressed in that unit. For
example, if the percent recovery is used for the LCS and the uncertainty is expressed as a percent,
the uncertainty for each value is calculated from the result: if the result is 12.6 mg/L with an
uncertainty budget of ±10%, then the result is expressed as 12.6 ± 1.3 mg/L.
6. Convert the components to the standard uncertainty. For random error, divide the standard
deviation by the square root of the number of measurements. For systematic components, determine if
normal or other distributions exist.
7. Calculate the combined uncertainty and the expanded uncertainty.
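The following is a minimal Python sketch of steps 5 through 7 for independent components: converting a random component to a standard uncertainty, combining standard uncertainties in quadrature, and applying a coverage factor. The function names and the numeric component values are illustrative assumptions, not a prescribed calculation procedure.

```python
import math

# Sketch of steps 5-7, assuming independent uncertainty components already expressed
# in the units of the measurand; names and values are illustrative only.

def standard_uncertainty_random(std_dev, n):
    """Step 6, random component: standard deviation divided by sqrt(n)."""
    return std_dev / math.sqrt(n)

def combined_standard_uncertainty(standard_uncertainties):
    """Step 7: root-sum-of-squares of independent standard uncertainties."""
    return math.sqrt(sum(u ** 2 for u in standard_uncertainties))

def expanded_uncertainty(u_combined, k=2):
    """Step 7: U = k * u_c; k = 2 gives roughly a 95 % level of confidence."""
    return k * u_combined

# Hypothetical components already converted to mg/L (step 5):
u_random = standard_uncertainty_random(std_dev=0.9, n=7)   # ~0.34 mg/L from replicate data
u_calibration = 0.40                                        # mg/L, from the standard's certificate
u_volumetric = 0.25                                         # mg/L, from glassware tolerances

u_c = combined_standard_uncertainty([u_random, u_calibration, u_volumetric])
U = expanded_uncertainty(u_c)
print(f"result: 12.6 +/- {U:.1f} mg/L (k=2)")
```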
Some groups are attempting to use control charts for determining the uncertainty of the
measurement. However, the use of control charts does not necessarily express the measurement
uncertainty. In fact the control chart plots the stability of the measurement. Caution must be used
when evaluating these charts for uncertainty since many control charts do not include all sources of
bias and precision. The control chart should be used to evaluate the stability of the measurement to
ensure that blunders or outliers are not incorporated into the measurement result. The control chart
is used to ensure the process is in control and is useful for ensuring the quality assurance program
is effective. The control chart provides the information regarding the stability of the measurement.
The measurement must be stable in order to calculate the uncertainty. The GUM assumes the
measurement is stable and the error (true value - observed value) is limited. If this is not the case,
the uncertainty should be qualified.
Another method for determining the uncertainty of the test method is based on a recent paper found
in "Environmental Testing & Analysis" Nov/Dec. This approach allows the determination of the
4
-------
method uncertainty using the LCS with a bias correction. Others are determining the uncertainty of
the method by using over 50 LCS data points and calculating the standard uncertainty. These
methods allow the laboratory to evaluate its test method over time, but do not provide any
information on the measurement uncertainty. That is, the environmental sample analysis uncertainty
is not one of the components evaluated.
The derivation of the uncertainty may be presented as an uncertainty budget or
using cause-and-effect diagrams. This method for determining uncertainty presents all the components
associated with the method and lists them in a fishbone or Ishikawa diagram, another term
for a cause-and-effect diagram.
For more complicated analysis, it is preferred to use this type of diagram in order to present all
components. The Eurachem/Citac Guide provides a detailed discussion of this type of presentation.
The Eurachem document provides specific examples for presenting the evaluation of uncertainty.
The diagram presents a graphical presentation of the uncertainty budget.
This approach takes the existing QC elements found in all environmental chemical measurements
and estimates the uncertainty for a single measurement. The uncertainty includes both laboratory
and field sampling activities. In most environmental data the uncertainty for the field is often ignored
even though many statistical evaluations indicate that the majority of the error is attributed to the
field activity.
The objective statistical evaluation of the uncertainty for field activities has not been possible until
now. Using this new technique, it is possible to back out each of the components and evaluate them
separately, thereby allowing the overall estimation of uncertainty for measurements of a specific
sample.
The nested approach determines the uncertainty of the sample being representative of the population.
A probabilistic sampling approach along with the uncertainty of measurement should be combined
for the uncertainty of the measurement for the site to be representative of the population.
Environmental Programs Application
Estimation of measurement uncertainty may be achieved by a nested hierarchical study of
uncertainties inherent in each component of the analytical process. The nested study approach
applies mathematical techniques defined as backing-out, normalization, and integration to
estimate component and sample uncertainties. These techniques are simple mathematical operations
that correct for systematic errors and estimate analytical uncertainty inherent in the random variation
of test measurements.
The approach estimates the uncertainty of:
A. intrinsic instrumental test measurement method uncertainty
B. inherent spike uncertainty and spike preparation uncertainty
C. preparation method uncertainty
5
-------
D. matrix interference uncertainty
E. sample collection uncertainty
F. sampling strategy uncertainty
G. sampling site parameter (target analyte) uncertainty
Method uncertainty is A and C.
Population uncertainty in the environment is G.
Measurement uncertainty is estimated by combining A, B, C, and D, as sketched below.
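As a sketch of how the listed components might be combined, the following Python fragment assumes that standard uncertainties for components A through D have already been estimated in common units; the component labels follow the list above, but the numeric values are placeholders, not data from the cited study.

```python
import math

# Hypothetical standard uncertainties for components A-D, in common units; independence
# of the components is assumed so they can be combined in quadrature.
components = {
    "A_instrumental": 0.8,
    "B_spike": 0.4,
    "C_preparation": 1.1,
    "D_matrix": 1.5,
}

# Measurement uncertainty = A, B, C, and D combined by root-sum-of-squares.
u_measurement = math.sqrt(sum(u ** 2 for u in components.values()))
print(round(u_measurement, 2))
```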
The expression of the uncertainty of the measurement allows data comparability. As environmental
programs move away from regulatory method comparison to an action level or maximum
contaminant level, a uniform basis of comparison is required. The expression of
uncertainty provides the data user with the range possible from the measurement.
Regulatory programs will need to define the error (true value minus observed value) and the
uncertainty of the measurement expected before making a decision. Some programs define the
expected recovery of the contaminant using any method, such as requiring that at least a 60% recovery
be achieved with the test method. This may be difficult since current methods do not always achieve
this goal.
Decisions based on a single number without the stated uncertainty are not meaningful in a
performance-based regulatory program. Knowing that the value and its expressed uncertainty are outside
the specification limit allows for a clear decision. The specification criteria may be client,
regulatory or product driven. The specification limit does not always have to be a range. In many
cases the specification is a single value that the result must be less than or greater than.
The specification limit may be a regulatory limit or product acceptance criteria. Expressing the
value and the range provides the user with the information necessary to understand any risk
or uncertainty associated with making the decision. When the value and the expressed uncertainty
are inside or outside the specification, the decision is clear. When the value and its uncertainty are not
clearly inside or outside, the decision is not clear. In fact, the chance of making an incorrect decision is
more likely.
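A minimal Python sketch of such a decision rule follows, assuming a single upper specification limit and a reported expanded uncertainty; the function name and the numeric values are illustrative assumptions.

```python
# Sketch of a decision rule: compare a result and its expanded uncertainty U to a single
# upper specification limit. Names and numbers are illustrative only.

def compliance_decision(result, expanded_u, upper_limit):
    """Return a clear or indeterminate decision relative to the upper limit."""
    if result + expanded_u < upper_limit:
        return "clearly below the limit"
    if result - expanded_u > upper_limit:
        return "clearly above the limit"
    return "indeterminate: the uncertainty interval straddles the limit"

print(compliance_decision(4.2, 0.5, 5.0))  # clearly below the limit
print(compliance_decision(5.3, 0.5, 5.0))  # indeterminate: the uncertainty interval straddles the limit
print(compliance_decision(6.0, 0.5, 5.0))  # clearly above the limit
```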
Without an estimate of uncertainty of the measurement the opinion and interpretation require a
significant amount of documentation to justify the decision. In all cases the client and laboratory
must agree who is making the decision relative to the data and what are the decision rules that need
to be applied to the data when making that decision. Knowing regulatory requirements, industrial
standards and conformity assessment criteria is a must before expressing an opinion. Documentation
of these decision rules ensures consistent implementation and a documented understanding should
future interpretation be required.
References:
"Environmental Analytical Uncertainty Estimation: Nested Hierarchical Approach," William
Ingersoll, Charleston, S.C., 2000
6
-------
"Estimation of Laboratory Analytical Uncertainty Using Laboratory Control Samples,"
Thomas Georgian, U.S. Army Corps of Engineers, "Environmental Testing & Analysis,"
Nov/Dec 2000, pp. 20-24, 51
"Guide to the Expression of Uncertainty in Measurement," ISO, Geneva, Switzerland 1993
(ISBN 92-67-10188-9) Known as "GUM"
"International Vocabulary of Basic and General Terms in Metrology," VIM ISO(1993),
second ed, International Organization for Standardization, Geneva, Switzerland, ISBN 92-
67-01075-1
"Quantifying Uncertainty in Analytical Measurement," EURACHEM/CITAC Guide, Second
Edition: 2000 (Eurachem)
"Quality Assurance of Chemical Measurements ," John Keenan Taylor, Lewis Publishers,
1987
"The Expression of Uncertainty and Confidence in Measurement," M3003, United Kingdom
Accreditation Service (UKAS), England, edition 1, December 1997
7
-------
TWO DATABASES IN EVERY GARAGE:
INFORMATION QUALITY SYSTEMS
Jeff Worthington, USEPA, Office of Environmental Information
KEY WORDS
information quality, life cycle, value chain
INTRODUCTION
The first automobiles were assembled by hand and no two were alike.
Parts were not interchangeable. Invention of the assembly line
(Ideafinder 2001) by Ransom Olds (Henry Ford added the conveyor
belt) is often cited as an important example of how standardization
helped an enterprise achieve success. [Photograph: 1900 Classic Oldsmobile]
Standardizing individual parts
promoted interchangeability and facilitated maintenance and product reliability. Quality system
techniques evolved to support various assembly line processes.
Technical information generators and customers are confronted with a similar problem.
Technical database systems are often crafted by a small group. Database elements, data
structures, and relationships that give value to information may not be interchangeable between
systems. Systems may not be able to directly interact with each other or be comparable.
Standardizing data system parts was not a priority.
Now, information workers and the enterprise recognize that information is an important strategic
resource for the enterprise and needs to be managed as a resource. Management systems for
many enterprises are adapting to support standardization and a centralized framework for all
individual components to facilitate information resource access and usability. Successful
management reduces the need to "retrofit" data to meet the needs of new users in a different
model of the data system. Just as in the automotive industry, an enterprise's quality system must
evolve to support these new approaches to managing information, including technical
information.
This paper describes integrating quality systems with both technical science systems and
information systems and how the resulting integrated system will ensure quality of production
and distribution of technical information as a strategic resource.
Basic techniques that are useful to all quality managers are presented including:
• identifying information quality indicators
• managing information as an enterprise resource
• reconciling information quality with existing quality systems
• assessing information quality
1
-------
BACKGROUND
Information management quality systems
Prior to creation of the USEPA in 1970, Federal
agencies were already actively developing and collecting environmental information such as
water quality measurements. New EPA programs, with increased analytical technical capabilities
including automation and computerization of analytical operations, created a literal deluge of
both paper and electronic information. Also, new information sources, such as geospatial
information, increased both scope and application of environmental information. Information
management and database systems evolved to meet customers' needs in each individual
environmental program information application. Quality systems designed for information
management were based on identifying functional and data requirements and subsequent design
and development of software systems, subject to intensive testing programs. Quality
management for hardware systems included assurance of both reliability and maintainability.
The most recent innovation, the Internet, increased access to these growing storehouses of
environmental information and changed customers' expectations regarding quality of
information.
Technical measurement quality systems
USEPA established formal quality policy in 1984 and re-affirmed policy in 2000 in USEPA
Order 5360.1 A2, Policy and Program Requirements for the Mandatory Agency-wide Quality
System, and USEPA Manual 5360 A1, USEPA Quality Manual. These documents provide for
quality of environmental measurements, environmental technology development, and use of
measurements by secondary users. Guidance development in the specific area of information
quality is left to each office to develop relative to their own work efforts and outputs.
Quality and data caught in the middle
Environmental measurement "centered" quality system may treat application of computer
technology as a support operation, subservient to the real focus of the process: collection and
analysis of environmental samples and information. The required planning document, a Quality
Assurance Project Plan (QAPP), may not contain a detailed section describing minimal quality
expectations for performance of a computer system. Likewise, quality systems for development
of software may not take adequate consideration of the meaning and ultimate use of
environmental measurements and supporting quality indicators as a non-depleting resource. The
result of this situation is that the "quality of information" may not be adequately measured or
known. Quality managers may need to look at unification of disparate quality systems to ensure
that all quality interests of the enterprise are adequately captured. For example, a technical
measurements quality system takes into account accuracy as a measure of the ability of a
technical measurement to provide results as close as possible to the actual value. A software
quality system may take accuracy to mean a demonstration of freedom from defects when data
are entered into an information system.
Standardization in technical measurements and information
Quality systems are based on and promote use of standard methodology. The EPA quality
system is based on a national consensus standard. Environmental analyses are routinely
performed according to standardized analytical methodologies (e.g., EPA standard methods).
-------
Capture of environmental measurement data and other information into EPA systems has
recently been the subject of standardization in the form of data standards. To date, the Agency
has completed work on the following data standards:
biological taxonomy, chemical identification, date, facility identification, longitude/latitude, and
SIC/NAICS (business and industry classification).
KEY CONCEPTS
Following are some key concepts to consider when planning integration of disparate quality
systems and developing systems which address both production and distribution of technical
information.
Concept 1: Recognizing government information as a strategic and national resource
Enterprises, including government Agencies, have focused on providing resources for
information and data and laid out authority and responsibility for IT support and operations.
Identification and management of information as a strategic resource is an important aspect to
establishing and endorsing formal stewardship in the new millennium. This identification allows
organizations to consider how to manage information as a resource. For example, as a resource,
the enterprise may need to look at resource availability, cost, and disposal of the information
resource. One unique aspect of information as a resource is that information is not a
"consumable" resource. The fact that information is a strategic resource and the fact that it is not
consumable greatly impact the type of quality system that should be used when considering
production and distribution of this resource.
Many resources used in a manufacturing facility are not reusable. They are produced or
purchased and are subject to routine inspections to determine or verify that resources are
acceptable for their use in the facility. Non-conforming materials are rejected. In a similar
manner, non-conforming information produced as the result of environmental measurement or
recording of compliance monitoring may be rejected if information is inspected to determine
conformance for acceptability. However, for information, even if the information is not
acceptable for its intended use, the enterprise may value and plan to keep the information for
other purposes. Also, the very fact that the enterprise has possession of a growing warehouse of
data may lead the enterprise to evaluate the use and re-use of this information for new and
previously unplanned purposes. In that sense, information resource is not "consumable." Also,
the quality of this resource may be perceived differently. For example, in using this warehoused
data, new users may apply new quality indicators, such as the degree to which information
describes conditions across a broad geographic area, where information was only originally
intended to be used to apply to a specific area for a simple need relative to that area.
Concept 2: Unification generation in information systems
Zachman Framework authors (Zachman, Inmon, Geiger 1997) describe the following
evolutionary generations for the computer environment:
• formation generation - introduction of the computer to the enterprise
• proliferation generation - the enterprise recognizes value and expends significant
resources to acquire and support computer systems
3
-------
• dispersion generation - computers and computer operations are dispersed widely
throughout the enterprise without a focused effort to manage them
• unification generation - where we all should be now; the enterprise shifts management
focus from managing the computer and associated technologies to managing the
environment within which computers operate and to managing data as an enterprise
resource.
USEPA is actively moving from the dispersion generation to the unification generation of
computer systems. Information managers and quality managers focused quality management
systems on distinct elements in the dispersion generation of the computer environment (and the
overall enterprise). For example, quality planning was performed according to the following
model:
TABLE: Dispersed quality system foci

activity: software development
  quality system focus: QA planning for software, life cycle development, data/functional
    requirements, verification and validation of software

activity: hardware
  quality system focus: reliability and maintainability of hardware, purchasing requirements

activity: technical information development
  quality system focus: QA planning for science activities, scientific method, measurement
    objectives, verification of conformance to quality control criteria

activity: information collection into a data system
  quality system focus: data integrity checks, error correction protocols

activity: data warehouses
  quality system focus: gap identification, completeness, consistency
In addition to the generally disparate areas identified in the table above, in each activity area, the
quality system activities are often performed according to different methodologies. For example,
for technical information, development activities are often different for each type of information.
Unification of all these activities in a single quality system should be consistent with efforts to
unify information systems.
Concept 3: Understanding needs and expectations for information quality
An enterprise planning to actively manage information quality must know what information
quality means to the enterprise. That knowledge is based on understanding needs and
expectations of customers for information. Information customers may include any or all of the
following:
• enterprise knowledge workers
• enterprise managers
• clients
As in any other quality model, quality managers need to work with these parties to document
their needs and expectations. Some typical types of features and characteristics of information
that are of interest to customers include various aspects of either production or distribution of
information or both. Some features and characteristics that may be of interest are discussed in
following sections.
4
-------
Concept 4: Identifying information quality value chains
"A chain is only as strong as its weakest link." Information quality characteristics are not simply
measured attributes like those of an end product in a
manufacturing process, such as an automobile. Information quality is both additive and separate.
It is additive because each link in the process may affect the next link and the sum total
expression of quality is related to the overall linked chain of activities that led to information
product. It is separate because there are processes related to each link in the chain and efficacy
of each process may be assessed for its contribution to a specific quality indicator. Also, in any
enterprise, the enterprise may not maintain overall responsibility for every link in the chain.
Information may be "customer-supplied products" or "raw materials" provided by another party;
information and information components may be supplied externally. Therefore, an enterprise
must understand processes for those links in the chain over which they maintain control and
understand how information quality is impacted by those links over which they do not have
control.
An enterprise may have a single straightforward information quality chain OR the enterprise
may have several different and interacting information quality chains. A classic example for
USEPA includes links involved in collection, analysis, and reporting of technical measurements.
Each potential link in that specific information quality chain and associated processes is
summarized below:
TABLE: Information Chain for an Environmental Measurement Project

link: planning
  processes that impact quality: location selection; data quality objective development;
    data standards development
  associated quality features and characteristics: relevant information; acceptable quality
    objectives; standard and comparable data

link: sample collection
  processes that impact quality: sample identification; sample bottle preparation;
    sampling procedures; sampling preservation; quality control checks
  associated quality features and characteristics: authenticity of sample preserved; no
    contamination, control of quality; known, comparable, repeatable procedures; sample
    stability, control of quality; control of quality

link: sample transfer
  processes that impact quality: chain-of-custody maintenance; transfer labeling
  associated quality features and characteristics: authenticity of sample

link: sample receipt
  processes that impact quality: identification verification; chain-of-custody documentation
  associated quality features and characteristics: authenticity of sample

link: analysis
  processes that impact quality: preparation procedures; analysis procedures; QC checks;
    data validation/verification
  associated quality features and characteristics: known, comparable, repeatable procedures;
    control of quality; known and acceptable measurement results

link: distribution
  processes that impact quality: electronic data transfer; data transfer standards
    development; validation of transfer; verification of data; usability determination
  associated quality features and characteristics: timely distribution of data; consistent
    and comparable data distribution; known and acceptable transfer; known and acceptable
    information content; results are usable for their intent

link: data handling
  processes that impact quality: software actions on data; software quality assurance
  associated quality features and characteristics: data remains free of errors

link: data warehousing
  processes that impact quality: labeling and storage of data
  associated quality features and characteristics: unnecessary duplicates are not in the
    warehouse; warehouse data records are complete

link: data reporting
  processes that impact quality: manipulating data into customer-designated formats
  associated quality features and characteristics: timely distribution of reports; usability
    and format of reports

link: data accessing
  processes that impact quality: providing data via the Internet to the public
  associated quality features and characteristics: timely distribution of data; completeness
    of data; usability of data

link: data archival
  processes that impact quality: storing data when not actively needed
  associated quality features and characteristics: retrievability of data
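One way to work with a chain like the one above is to record each link with its processes and quality indicators so that an indicator can be traced back to the links that affect it. The following Python sketch uses an abbreviated subset of the links in the table; the data structure and variable names are illustrative assumptions, not an EPA data model.

```python
from collections import defaultdict

# Abbreviated sketch of the information quality value chain: each link carries its
# processes and the quality indicators they support.
value_chain = [
    {"link": "sample collection",
     "processes": ["sampling procedures", "quality control checks"],
     "indicators": ["known, comparable, repeatable procedures", "control of quality"]},
    {"link": "analysis",
     "processes": ["analysis procedures", "data validation/verification"],
     "indicators": ["control of quality", "known and acceptable results"]},
    {"link": "distribution",
     "processes": ["electronic data transfer", "verification of data"],
     "indicators": ["timely distribution of data", "known and acceptable information content"]},
]

# Build an index answering: which links influence a given quality indicator?
links_by_indicator = defaultdict(list)
for link in value_chain:
    for indicator in link["indicators"]:
        links_by_indicator[indicator].append(link["link"])

print(links_by_indicator["control of quality"])  # ['sample collection', 'analysis']
```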
Concept 5: Reconciling disparate terminologies
One challenge for USEPA is at the confluence of quality for environmental measurements and
quality for information management. Quality managers and information managers need to
understand terminology used by each party and to agree on a standard terminology because many
words may have conflicting meanings to different parties. Examples include:
TABLE: Disparate terminologies

term: data
  potential meaning in environmental science: environmental measurement
  potential meaning in information technology: any representation of a fact

term: data quality
  potential meaning in environmental science: measurement parameters such as precision,
    accuracy, representativeness, completeness, comparability
  potential meaning in information technology: data records are complete

term: data integrity
  potential meaning in environmental science: potential synonym for "quality"
  potential meaning in information technology: conformance to technical criteria and
    business rules

term: reliable data
  potential meaning in environmental science: data were generated by a reputable source and
    are of the quality needed
  potential meaning in information technology: data are correct and have been securely
    maintained
Quality managers know that there are often multiple definitions for a single term. Attachment
One is an example of how terminology may be organized for a quality system which considers
environmental and scientific measurements. Each may be useful to quality managers in planning
and assessing information quality. The works of both Larry English (English 1999) and Thomas
Redman (Redman 1996) were critical resources in developing this list.
Concept 6: Four basic models for information quality systems
Experience is growing in the area of information quality. Quality managers may be able to use
one of the information quality models provided in various texts. However, they may just as
likely need to develop their own information quality model by:
• identifying information that is required
• understanding and structuring individual information management system components
and other processes which act on information in the information quality value chain
• selecting information quality indicators of interest and measuring them
6
-------
Quality managers may be confronted with issues regarding the quality of information in any or all
of the following four models:
• Information as an enterprise product - Some enterprises acquire, develop, manage,
process, and sell information as a product of the enterprise. For these enterprises, the
process to identify and ensure quality characteristics of their information product
resembles a traditional quality system model.
• Information and information systems which support development of product -
Enterprises may need to access, develop, manage, or process information via information
systems to ensure development of the enterprise's products. A great deal of information
is created internally. The enterprise may be reliant on the quality of this information to
ensure all processes are working as required. In the quality system model, information
quality is a distinct portion of the overall enterprise quality system.
• Information needed to support the quality system - successful implementation of the
enterprise's quality system may be dependent on information generated by the enterprise.
For example, measurements made during manufacturing processes may generate
quantities of information used to continue quality management for the enterprise. In this
quality system model, information quality is a part of the quality management and quality
record portion of the quality system.
• Information quality in communication with potential purchasers and the public -
Increasingly, enterprises are reliant on information and information systems to
communicate with purchasers and the public via the Internet, an electronic environment.
Quality of information available via this electronic medium may directly impact
customers' needs and expectations. In this quality system model, both production and
distribution of information are a distinct component in the enterprise's quality system.
Concept 7: Nested quality systems and Russian dolls
Which came first, "the chicken or the egg?" "The data or the information system?" This may be
a critical question. There are several ways to look at "value-added" components of processes that
affect data production and information management systems. One useful analogy is to consider
quality systems to be "nested," similar to nested Russian dolls. One inside doll could be the
quality system for production of environmental measurements data. Another inside doll could be
the quality system for production of other data types (e.g., GIS measurements). The outside doll
is the quality system for distribution of data via the information management system. Regardless
of the outside system, the inside system must still meet quality requirements for its own system.
The outside system can, however, impose some higher-level requirements based on higher-level
needs.
7
-------
DEVELOPING AN INFORMATION QUALITY SYSTEM
Quality managers can develop a management system to ensure quality of production and
distribution of technical information by the following process:
• Select a standard quality system model (ISO 9001, E4, EPA quality manual chapter 3)
• Identify the information product of the organization
• Assess and determine all individual processes in the information quality value chain
• Identify quality indicators that are valued by customers for information product
• Determine assessment and measurement methodology for those quality indicators
• Apply the quality system model elements including quality policies and procedures in key
areas
- general description
- quality system overview
- personnel qualifications and training
- procurement of items and services
- documents and records
- planning
- implementation of work processes
- measurement
- assessment
- quality improvement
Select a standard quality system model
Some standard quality system models are available for use in establishing an organization-wide
information quality system. EPA's 5360.1 A1 Chapter 3 provides guidance for individual
elements that may be considered in a quality system. Another is Part A: Management System of the American
National Standard, ANSI/ASQC E4-1994, Specifications and Guidelines for Quality Systems for
Environmental Data Collection and Environmental Technology Programs. Part A is based on
general quality system elements expressed in the ISO 9000 standard. The model for E4 is also
useful because it proposes a general management system and then provides a focus on separate
technical areas. Additional areas may need to be added for each individual user.
Identify information product of the enterprise
Determine to what extent information is an enterprise's product. Are there types of information?
Who are the customers? What is information used for? How is information developed? What is
the relation of information product to other enterprise products?
Assess and determine all individual processes in the information quality value chain
Chart out each type of information using an information quality value chain. Identify individual
processes that create or act on information at each step in the process. Are there standard
operating procedures for these processes? Are there already process controls in place for these
processes? Do you understand the relationship of these processes to the quality of the
information? Document the results. Be sure to clearly express the relationship between
production (developing, observing, and recording data) and distribution (receipt, processing,
warehousing, and reporting) aspects.
8
-------
Identify quality indicators valued by customers
Identify customers for information product, both external customers and those who may be
considered to be "internal" users of information. Have they identified information as a product?
Do they have expectations for the information? Do they have written requirements for
production or distribution of information? Are known quality indicators well defined?
Document the information quality indicators and write formal definitions for each indicator and
relate them directly to processes identified in the previous section.
Determine assessment and measurement methodology for quality indicators
Review the information quality indicators and evaluate the need for measurement methodology.
Has the customer already detailed a specific measurement methodology? Is there more than one
way to measure for each information quality indicator? Has the customer already expressed
minimum acceptance criteria for information? Have baseline measures already been established?
Has anyone already been tracking conformance to any information quality measures against other
variables such as time? Is terminology for measurement consistent among all users? Document
the results.
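As an illustration of measurement methodology for two common indicators, the following Python sketch computes completeness and a duplication rate for a small batch of records. The field names, example records, and the choice of indicators are hypothetical assumptions; actual indicators and acceptance criteria should come from the customer requirements discussed above.

```python
# Sketch of measuring two illustrative information quality indicators, completeness and
# duplication, for a batch of data records; field names and records are hypothetical.

records = [
    {"facility_id": "F001", "sample_date": "2001-04-02", "result": 12.6},
    {"facility_id": "F002", "sample_date": None,         "result": 3.1},
    {"facility_id": "F001", "sample_date": "2001-04-02", "result": 12.6},  # exact duplicate
]

required_fields = ["facility_id", "sample_date", "result"]

# Completeness: fraction of records with every required field populated.
complete = [r for r in records if all(r.get(f) is not None for f in required_fields)]
completeness = len(complete) / len(records)

# Duplication rate: fraction of records that repeat an earlier record exactly.
unique = {tuple(sorted(r.items())) for r in records}
duplication_rate = 1 - len(unique) / len(records)

print(f"completeness = {completeness:.0%}, duplication rate = {duplication_rate:.0%}")
# completeness = 67%, duplication rate = 33%
```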
Develop a written management system for information quality
Plan out the contents of the written management system for information quality. One useful
approach is to develop the system using a three-tiered approach based on the ISO 9001 model.
THREE-TIERED MODEL
The following three tiers are suggested for an information quality management system model:
Top tier - vision, mission, and general description of the information quality system
Record the mission and vision of the enterprise. Develop a written statement of enterprise-wide
quality policy which captures the overall emphasis on the quality system by management.
Develop a general description of the quality system which identifies key elements for planning
implementation and assessment. Include in this level tables which detail quality system
commitments in the following areas:
• roles and responsibilities - identification of management and quality management roles
cross-referenced to activities critical to quality system development (resource
commitment, quality system records, assessment schedule, training, quality records,
procurement, etc.)
• quality system records - record type (quality plan, quality reports, etc.) cross-referenced
to responsibilities for preparation, review, approval, frequency of development, and
distribution
• quality assessment schedule - assessment types (e.g., project, product, system, quality
system, data system, etc.) cross-referenced to assessment tool, assessors, basis for
assessment, minimum frequency, purposes for assessment, and review authority
9
-------
Tier two - organization-level quality policies and procedures
Develop individual statements of quality policy for each key area of the information quality
system and include higher level procedures. This approach will allow for future editing to the
overall quality manual for a single quality area without re-drafting the entire overall document.
For each individual quality policy, include the following elements at a minimum:
• policy title
• approval authority and date
• succinct policy statement
• individual statement of quality requirements (if greater detail is needed)
• purpose
• scope
• responsibility and the role for implementing that responsibility
• listing of any associated documents
• procedures
• quality control (checklist) items that must be addressed
Inclusion of the checklist will encourage quality managers and staff to not develop any specific
policies and procedures that are not planned to be implemented.
The types of specific quality policies and procedures that may be most useful in an information
quality system include:
• general quality system "housekeeping" - quality documents, quality records, quality
system roles and responsibilities implementation, quality system dispute resolution,
quality system improvement
• quality support operations - standard operating procedures format and development
protocols, quality and other technical training processes, customer satisfaction surveys
• identification of quality indicators - information production indicators, information
distribution indicators, technical information indicators, measurement methodology,
measurement methodology development (both production and distribution criteria)
• information production, project and program planning requirements - customized
requirements for planning each kind of information production product including
objective development, acceptance criteria, and ensuring appropriate quality indicators
related to information distribution are also addressed where appropriate
• information distribution, project and program planning requirements - customized
requirements for planning each kind of information distribution product including,
objective development, acceptance criteria, and ensuring appropriate quality indicators
related to information production are also addressed where appropriate
• information and data warehouse maintenance planning requirements - requirements
for ongoing maintenance and operation of a large database system including
responsibilities for data stewardship and routine monitoring and reporting of applicable
quality indicators
• information security - minimum requirements for monitoring and maintenance of
information security
10
-------
• assessment procedures - processes for assessing information products, processes,
systems, database systems, developing assessment schedules, corrective action in
response to assessments
Tier three - standard operating procedures
Tier three consists of the individual SOPs of the enterprise or each unit in the enterprise. SOPs
are the actual work instructions for performing individual activities and are subject to frequent
change. SOPs should be written in a way which facilitates their use and each modification for
improvement.
The EPA Office of Environmental Information developed a Management System for Quality
which details both level one and level two as described above. OEI is working to implement the
system and develop requisite SOPs. Electronic copies of OEI's new quality system (and this
technical paper) can be obtained by sending an Email request to the author
(Worthington.Jeffrey@epa.gov).
CONCLUSION
USEPA and other Federal Agencies are actively unifying and integrating disparate information
systems. As businesses and more government operations increase reliance on these centralized
and standard information systems, the quality manager's job will be easier. Understanding the
nature of the quality of the information and how information processes may act on the quality of
the information will remain key to the ability of the quality manager to develop useful
measurement tools for managers.
ACKNOWLEDGMENTS
Larry English, President and principal of INFORMATION IMPACT International, Inc.
Brentwood, TN and his Total Quality data Management (TqdM®) methodology for information
quality improvement. Author, lecturer, reviewer and a valuable resource in applying quality to
information issues.
www.infoimpact.com
Thomas Redman, President and Founder of Navesink Consulting Group, Little Silver, NJ.
Author, lecturer, reviewer, and a valuable resource in defining and understanding data quality
issues.
www.navesink-dq.com
USEPA information resources available on the web: www.epa.gov/oei
REFERENCES
ANSI/ASQC E4-1994. American National Standard: Specifications and Guidelines for
Environmental Data Collection and Environmental Technology Programs.
Brackett, Michael H. 1996. The Data Warehouse Challenge: Taming Data Chaos. New York,
New York: John Wiley & Sons, Inc.
11
-------
English. Larry P. 1999. Improving Data Warehouse and Business Information Quality': Methods
for Reducing Costs and Increasing Profits. New York, New York: John Wiley & Sons, Inc.
Principia Cybemetica Web site 2000. pespmc 1 .vub.ac.be/ASC/Inform_syste.html
Redman. Thomas C. 1996. Data Quality for the Information Age. Norwood, Massachusetts:
Artech House.
Zachman, John A., Inmon, W. H., Geiger, Jonathan G. 1997. Data Stores Data Warehousing
and the Zachman Framework, Managing Enterprise Knowledge. New York, New York,
McGraw-Hill.
USEPA 2000. EPA Order 5360.1 A2: Policy and Program Requirements for the Mandatory
Agency-Wide Quality System. Washington, D.C.: USEPA.
http://www.ideafinder.com/history/inventions/story002.htm
12
-------
ATTACHMENT ONE
INFORMATION QUALITY AND DATA QUALITY DEFINITIONS
The following definitions may be useful for discussion or reference when developing an integrated quality
system to support production and distribution of technical information.
DATA DEFINITIONS
datum - (data item) is a representative triple which consists of e, a, v where
e = entity (and entity's meaning)
a = attribute (and attribute's meaning)
v = value (and value's meaning). Value may include units when the datum represents a measurement.
(Redman 1996)
NOTES
a. The datum represents some element in a model; the element is a real-world thing (tangible = physical; intangible = e.g., an idea) or event. As an event, the datum would need to be captured at the point of the event.
b. The datum usually represents a fact, a truth, or observation about the real world; but does not always
have to represent a fact.
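For illustration only, the representative triple can be sketched in a few lines of Python; the class and field names below are ours, not part of Redman's definition:

    # Minimal sketch of a datum as an (entity, attribute, value) triple, following the
    # definition above. All names here are illustrative, not drawn from Redman (1996).
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Datum:
        entity: str                   # the real-world thing or event being described
        attribute: str                # the property of that entity
        value: str                    # the recorded value
        units: Optional[str] = None   # units apply when the datum represents a measurement

    # Example: a hypothetical environmental measurement datum
    d = Datum(entity="monitoring well MW-3", attribute="pH", value="6.8", units="standard units")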
data representation - a set of rules for recording data (representative triples) on some medium
NOTES
a. Therefore, the same data may be represented in different ways.
b. Therefore, data represented in a prescribed manner may be recorded many times.
c. Data can exist without being represented.
d. These rules are a form of "metadata" (Redman 1996)
data record - a physical instance standing for a set of data items according to the data representation
NOTES
a. Data can exist without being recorded. (Redman 1996)
environmental data - data of measurements or observations that describe environmental processes or
conditions, or the performance of environmental technology (ANSI/ASQC E4-1994)
NOTE: In a broader sense, these data may include ancillary data which are needed so that the data have
meaning (are useful as information), data such as: name of the sample site, sample location, sample no.,
collection methodology, etc.
geospatial data - data of geospatial measures that include a three-dimensional reference system (usually based
on a model of the real world)
NOTE: Often the three-dimensional reference system is cross-referenced to observational data regarding a physical attribute for locations and is often considered to be environmental data.
quality indicator data - data of the quality indicators.
NOTES:
1. When associated with environmental measurements, this data is usually developed and recorded at the
same time the measurements are developed and recorded.
2. This type of data is sometimes referenced as meta-data.
INFORMATION DEFINITIONS
information - a datum or data presented to meet customer expectations
NOTES:
a. Data presentation must be "knowledge worker-friendly "
b. Data presentation must impart meaning to the data.
13
-------
information production - that aspect of information associated with creating, updating, collecting, and storing information that gives the information value to the stakeholder (vs. other aspects of the information, such as the data representation)
information distribution - that aspect of information that is associated with the distribution (i.e., extraction,
manipulation, and presentation) of information.
information system - in the broadest sense, a system of functions concerning the acquisition and transfer of
information. (Principia Cybernetica 2000)
NOTES:
a. Carriers in an information system can be biological, social, or personal units, etc.
b. An information system is dedicated to a certain type of information (e.g., environmental information).
c. A storage device is usually part of an information system.
QUALITY AND SYSTEMS DEFINITIONS
quality - the totality of features and characteristics of a product or service that bear on its ability to meet the
stated or implied needs and expectations of the customer.(ANSI/ASQC E4-1994)
quality assurance (QA) - an integrated system of management activities involving planning, implementation,
assessment, reporting, and quality improvement to ensure that a process, item, or service is of the type and
quality needed and expected by the customer.(ANSI/ASQC E4-1994)
quality control - the overall system of technical activities that measures the attributes and performance of a
process, item, or service against defined standards to verify that they meet the stated requirements established by
the customer; operational techniques and activities that are used to fulfill requirements for
quality.(ANSI/ASQC E4-1994)
quality feature - an individual feature of a product or service that is identified as a feature of interest for the
purpose of a quality system.
NOTE: A quality feature may be subject to measurement (see quality indicator).
quality indicators - measurable attributes of the attainment of the necessary quality (quality
features).(ANSI/ASQC E4-1994)
NOTE: In USEPA, quality indicators originally were applied solely to the quality necessary for a particular environmental decision and included: precision, bias, completeness, representativeness, reproducibility, comparability, and statistical confidence. OEI is identifying a greater breadth of quality indicators to describe and measure the quality of Agency information overall.
quality management - that aspect of the overall management system of the enterprise that determines and
implements the quality policy. Quality management includes strategic planning, allocation of resources, and
other systematic activities (e.g., planning, implementation, assessment) pertaining to the quality
system.(ANSI/ASQC E4-1994)
quality system - "the management system for quality"; a structured and documented management system
describing the policies, objectives, principles, organizational authority, responsibilities, accountability, and
implementation plan of an enterprise for ensuring quality in its work processes, products (items), and services.
The quality system provides the framework for planning, implementing, assessing work performed by the
enterprise and for carrying out required QA and QC.(ANSI/ASQC E4-1994)
management system - a structured non-technical system describing the policies, objectives, principles,
organizational authority, responsibilities, accountability, and implementation plan of an enterprise for
conducting work and producing items and services.(ANSI/ASQC E4-1994)
DATA QUALITY DEFINITIONS
data quality - the totality of features and characteristics of data that bear on its ability to meet the stated or
implied needs and expectations of the customer.
NOTE: One narrow definition of data quality is
"data quality = data representation quality + data record quality"
14
-------
data representation quality - attributes of data representation quality include:
• the rules for recording data ensure that the data meet the customers' definition
• the format allows for processing by explicit procedures
• the format allows data to retain its characteristics during repeated use
data record quality - attributes of data record quality include:
• the record is a true record of the element that was meant to be recorded (special cause bias)
• the record was accurately recorded (freedom from common cause bias; e.g., systematic data entry error)
data standards quality - the degree to which the data standards enable people to easily define data
completely, consistently, accurately, and understandably. (English 1999)
data architecture quality - the degree to which the data models are reused, stable, and flexible and how well
they depict the data requirements of the enterprise; and how well the databases implement those requirements
and enable capture, maintenance, and dissemination of the data among the information customers. (English
1999)
INFORMATION QUALITY DEFINITIONS
information quality - the quality of the information production + the quality of the information distribution
(see following sections)
INFORMATION PRODUCTION QUALITY DEFINITIONS
information production quality - the totality of features and characteristics of information production that
bear on its ability to meet the stated or implied needs and expectations of the customer.
(environmental) measurement quality - the quality indicators that describe the (inherent) quality of
environmental measurement results. These include precision, bias, completeness, representativeness,
reproducibility, comparability, and statistical confidence.
information verification and validation - the degree to which the information has been verified and validated and shown to meet requirements related to development of the data (e.g., analytical methods validation)
INFORMATION DISTRIBUTION QUALITY DEFINITIONS
information distribution quality - the totality of features and characteristics of information distribution that
bear on its ability to meet the stated or implied needs and expectations of the customer (e.g., data entry quality, data warehouse quality, information architecture quality, etc.)
data entry quality - those quality features that describe quality related to the data entry process (e.g.,
correctness, completeness, data entry verification, data entry validation)
data warehouse quality - those quality features that describe the quality of data resident in Agency data
warehouses (e.g., duplicate data entry, completeness)
information architecture quality - the degree to which information models are reused, stable, and flexible
and how well they depict the information requirements of the enterprise (e.g. non-redundant system processes,
business information model clarity, operational data model clarity) (English 1999)
software quality - those quality features of the software that ensure that the software meets the data and
operational requirements of the stakeholders and ensures the quality of the information managed and delivered
by the software (e.g., verified software, validated software, conformance of software to enterprise
requirements)
hardware quality - those quality features of the hardware that ensure that the hardware meets the
requirements of the stakeholders and ensures the quality of the information managed and delivered by the
hardware (e.g., reliability, maintainability)
information usability - the degree to which information is usable for its intended purposes.
15
-------
HOW GOOD ARE MY DATA?:
INFORMATION QUALITY ASSESSMENT METHODOLOGY
Jeff Worthington, USEPA, Office of Environmental Information
George M. Brilis, MBA, JD, USEPA Office of Research and Development
(ORD) National Exposure Research Laboratory (NERL) Environmental
Sciences Division, Las Vegas
Abstract— Quality assurance techniques used in software development and
hardware maintenance/reliability help ensure that data in a computerized information
management system are maintained well. However, information workers may not know
the quality of data resident in their information systems.
Knowledge of the quality of information and data in an enterprise provides managers
with important facts for managing and improving the processes which impact
information quality. This paper provides information to assist information workers in
planning and implementing effective assessment of information and data quality. The
areas covered here include:
• identifying appropriate information quality indicators
• developing assessment procedures
• conducting information quality assessments
• reporting information assessment results
• tracking improvements in information quality
KEY WORDS
information quality, measurement, database
BACKGROUND
The source of information and data that may be of value to a customer may not be directly
known, may be unreliable, or may need to be checked for verification purposes. Also, data and
information may be old and there may be reasons to doubt its reliability. Alternatively, processes
in use when data and information were developed may not be understood in relationship to the
quality of the information and data themselves. Other questions about the veracity of the
information and data may be of interest to a customer and therefore, a quality concern. For
example:
• to what degree are two disparate databases comparable?
• is the quality of the data affected by transfer into a new system?
• am I getting my data fast enough?
1
-------
Many quality systems focus primarily on data production. For example, in the USEPA, the
quality system considers production of environmental measurements and recording resulting data
and quality indicators (AKA, metadata). In software development, software quality systems may
only consider writing consistent code and the valid operation of the system according to
identified data requirements and system requirements. However, for information and data
themselves, quality may not always be known.
For quality managers to assist enterprise management in systematic planning and improvement of information and data quality, quality managers must have the ability to assess all aspects of data and information quality. Although assessment capability alone cannot replace implementation of a robust quality system, it is a critical feature of a quality system and an important tool for understanding the current status of information and data quality.
The following provides a discussion of the types of information quality assessments that quality
managers may consider. Assessment planning, assessment scope development, and assessment
implementation are also discussed.
TYPES OF INFORMATION QUALITY ASSESSMENTS
There are several ways to consider types of information quality assessments.
For example, one way is to look at the product-process-system trilogy:
• information/data product - the end product of all the processes
• information/data process - an individual process in production or distribution of
data or information
• information/data system - the entire collection of processes which make up the
system
Information/data product assessments
Assessment of information/data products evaluates conformance of product to customers'
expectations. The expectations may be expressed as quality indicators and include a basic
measure of correctness.
Information/data process assessments
Assessment of information/data processes evaluates process effectiveness and process impact on the quality of the information/data product.
Information/data system assessments
Assessment of information/data systems evaluates all aspects of the management system and technical system to determine system effectiveness for achieving intended results. This level of assessment may also focus on conformance to industry-wide, external, or other standard specifications.
2
-------
In addition to the above scope approach, assessments can also be objective-based. The following are some examples of objective-based information quality assessments:
• external data quality - quality of all information/data provided by an external
provider
• pre-award assessment - preliminary assessment of a supplier's information
quality system
• data element assessment - quality of data element definition for all data elements
of a certain type
• individual quality indicator assessment - assessment of an individual quality indicator, such as timeliness of the delivery of all information
• conformance assessment - conformance to an external standard
INFORMATION/DATA PRODUCT ASSESSMENT EXAMPLE
For brevity, this technical paper addresses a single example of the process that
might be employed for assessing information/data product. The paper also explores the potential
relationship of the end result of the assessment to the information/data processes and system that
produced the product. The example product considered here is a large database maintained by an
enterprise for a long period.
PRE-ASSESSMENT PLANNING
The assessor needs to carefully plan the assessment in advance in order to perform an efficient
and effective assessment. In some cases, the assessor may be able to perform the assessment
within an electronic environment and not need to travel to a separate location. This is most likely
when the assessment involves quality of data in one or more databases. In those situations where
the process or system must be looked at, the assessor will most likely need to visit the site and
people involved in the processes and overall system.
Determine purpose and scope of the assessment
The assessor should meet with customer or management representatives and determine the purpose and scope of the assessment. How assessment results may be used is critical
in planning the assessment. Collecting assessment information that has no use is a waste of
resources. For a database assessment, assessment scope is based on:
• amount of data in the system
• quality indicators that are of interest to the customer
Assessment results often need to be the subject of corrective actions and planning for future
preventative actions. If that is the case, the process by which the assessor identifies
nonconformances and defects and how the corrective action process will be implemented must be
discussed in advance. Assessors may be involved in follow-up review of a written corrective
action plan or even the revised information/data product itself. It is critical to establish this
process prior to conducting the assessment.
3
-------
Identify applicable information quality indicators
Customers for the product, process, or system have needs and expectations for data and
information which are produced and distributed. Meeting with the customers for the data allows
the assessor to identify information quality indicators valued by the customer. Attachment 1 identifies some potential information quality indicators. Alternatively, if the person/group who
requested the assessment has a robust information quality management system, that system may
be a good resource for identifying information quality indicators.
Establish measurement methodology
Once quality indicators are selected, the measure of the quality indicator must be determined.
There may be more than one possible measure for a single quality indicator. A good example of this is the case of timeliness, which can be expressed as two forms of information float:
• information float 1 - the time it takes for an item of information to be collected into a data system from the time the information was first available
• information float 2 - the time it takes for an item of information to be available
to a system user from the time it is first collected into the data system
For either type of information float, there are at least two possible measures:
• time units - a direct measure of time (e.g., days, hours, minutes, seconds)
• conformance - a measure of whether the information was received in time for its use (e.g.,
yes or no)
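As a hedged illustration of the two floats and the two kinds of measures, the following Python sketch computes both floats from three timestamps; the timestamp names are hypothetical examples, not a prescribed schema:

    # Illustrative only: information float 1 and 2 computed from three event times.
    from datetime import datetime, timedelta

    def information_floats(first_available, entered_in_system, available_to_user):
        float_1 = entered_in_system - first_available    # collection delay (float 1)
        float_2 = available_to_user - entered_in_system  # distribution delay (float 2)
        return float_1, float_2

    f1, f2 = information_floats(
        datetime(2001, 3, 1, 8, 0),    # information first available
        datetime(2001, 3, 3, 14, 0),   # collected into the data system
        datetime(2001, 3, 4, 9, 0),    # available to the system user
    )
    # Time-unit measure vs. conformance measure for the same indicator:
    print(f1, f2)
    print("received in time for use:", f2 <= timedelta(days=1))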
Statistical sampling
Selecting a sample of the overall data population may be necessary to evaluate an individual
quality indicator. Sampling methodologies include (English, 1999):
• random sampling - use of random number generator to provide equal chance to
select every item of data
• systematic sampling - selection of every nth record, based on the ratio of required sample size to total population (for use when data records are already random)
• stratified sampling - when there is more than one stratum in the records, to ensure the selection of adequate records in each stratum
• cluster sampling - selection of subsamples from logical clusters in the database
and combining them
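The following Python sketch is one possible rendering of these four sampling approaches, assuming the records of interest have already been extracted from the database into a list; the function and parameter names are ours, for illustration only:

    # Illustrative sampling helpers for information quality assessment.
    import random

    def random_sample(records, n):
        return random.sample(records, n)                     # equal chance for every record

    def systematic_sample(records, n):
        step = max(1, len(records) // n)                     # every nth record
        return records[::step][:n]

    def stratified_sample(records, n_per_stratum, stratum_of):
        strata = {}
        for r in records:
            strata.setdefault(stratum_of(r), []).append(r)   # group records by stratum
        return [r for group in strata.values()
                for r in random.sample(group, min(n_per_stratum, len(group)))]

    def cluster_sample(clusters, n_clusters, n_per_cluster):
        chosen = random.sample(clusters, n_clusters)         # pick logical clusters, then subsample
        return [r for c in chosen for r in random.sample(c, min(n_per_cluster, len(c)))]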
Determine the need for acceptability criteria
Depending on the scope of the assessment and maturity of the quality system in place for
information and data, the assessor may need to establish acceptability criteria to report any
measurement as a nonconformance.
4
-------
When sample methodology is employed, acceptability criteria form the basis for the
determination of sample size based on the desired confidence level. Larry English provides a
detailed explanation of the applicability of acceptance sampling methodology in his recent book
(English, 1999).
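English (1999) should be consulted for the full acceptance sampling methodology; as a generic sketch only (not English's procedure), the familiar normal approximation gives a feel for how the acceptability criteria and desired confidence level drive sample size:

    # Generic sketch: records to sample to estimate a defect rate within a margin of
    # error at a given confidence level, using n = z^2 * p * (1 - p) / e^2.
    import math

    Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}           # common two-sided z values

    def sample_size(confidence=0.95, margin_of_error=0.05, expected_rate=0.5):
        z = Z[confidence]
        return math.ceil(z ** 2 * expected_rate * (1 - expected_rate) / margin_of_error ** 2)

    print(sample_size(0.95, 0.05))                        # about 385 records for +/-5% at 95%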
Identify alternative information source
For some quality indicators (e.g., accuracy to original data), assessment of information quality
may require identification of an alternative or additional information source to use as the basis
for comparison. Identify those sources prior to the assessment, if possible, and verify the authenticity/acceptability of the alternative information source with the customer for the assessment.
ASSESSMENT WORKING PAPERS
The assessor should develop documents which serve as the basis of the assessment and facilitate
the recording of both observations and conclusions. This approach is consistent with all
assessments.
Assessment plan
The assessment plan need not be long, but it should be documented and should include:
• assessment identifier (number)
• type of assessment
• scope of assessment
• purpose of assessment
• proposed assessment date(s)
• proposed assessors (phone/address)
• location of assessment
• selected assessment target areas
• contact persons
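For illustration only, the plan elements above could be captured in a simple structure so that plans are recorded and reported consistently; all field names and values below are hypothetical:

    # Hypothetical assessment plan record; fields mirror the list above.
    assessment_plan = {
        "assessment_id": "IQ-2001-007",
        "type": "information/data product assessment",
        "scope": "timeliness and accuracy-to-surrogate for one database table",
        "purpose": "establish a quality baseline and support corrective action planning",
        "proposed_date": "2001-05-14",
        "assessors": [{"name": "J. Assessor", "phone": "555-0100", "address": "HQ"}],
        "location": "electronic assessment; no site visit",
        "target_areas": ["data entry", "data warehouse loading"],
        "contacts": ["database steward"],
    }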
Assessment standard operating procedures (SOPs)
Assessors may need access to, and training in, standard operating procedures for the purpose of conducting routine and consistent assessments. These SOPs should include details of the measurement methodology for information/data quality.
Assessment requirements
The assessors may benefit from developing a list of assessment requirements based on their own
expertise and the customer's needs for the assessment. This list of assessment requirements
helps focus assessment planning, development of the checklist, and the conduct of the assessment.
Assessment checklist
Assessors need to develop an assessment checklist to serve as a reminder of all the areas that the
assessors intend to cover in their assessment of the database. This checklist also then becomes a
formal record of the assessment in combination with whatever electronic records are created in
the process.
5
-------
Notification and request for information letter/memorandum
Prior to the conduct of the assessment, the assessors should formally provide notification of the
assessment in a letter or memorandum. The letter should include the assessment plan. One
option is to include the assessment checklist to allow the persons responsible for the data an
opportunity to prepare for the assessment.
Reporting format
Assessors will need a standard reporting format for communicating the results of the assessment.
The structure of this reporting format should reflect the planning for the corrective action
process. The most important feature of the report is to ensure that the assessors can easily
develop this report so that no time is lost in reporting the assessment. The later that assessment results are provided, the less impact and credibility the assessment process has. One method to ensure rapid reporting is to severely limit the approval process. A well-organized assessment
system should empower the assessor to produce a final report with no management review.
CONDUCTING THE ASSESSMENT
Communication during the entire process of assessment is crucial in garnering support during the process and in effective utilization of assessment results.
Pre-assessment briefing
Meet with the parties that are responsible for the database; go over the audit plan, carefully explaining the purpose, scope, and assessment methodology; and ask if there are any questions. This is a good time to work out last-minute details, such as concerns about access to data and how assessment results might be received. Be sure to go over in detail any corrective action
processes that were planned.
Assessment implementation
Make a record of all electronic processes used in the assessment process and, if possible, provide
a printout and electronic file of any nonconformances identified in the information under review.
Assessment debriefing
At the conclusion of the assessment, be sure to provide the persons responsible for the data and
information with a personal debriefing of the findings of the assessment. Discussion of the
corrective actions as well as preventative actions that can be implemented immediately may be
helpful.
CORRECTIVE ACTION IMPLEMENTATION
Planning actions to correct identified problems with information quality can be a meaningless
exercise and a waste of resources unless there is a process to ensure implementation of the
planning. Verification by assessors is useful; however, this approach places the burden of
verification on the assessors and requires additional resources to perform the verification. The
corrective action process should be a standard process of the enterprise that is assessed, and the
process must provide some form of verification for each type of finding reported in an
assessment.
6
-------
PREVENTATIVE ACTIONS
Establishing preventative action processes will ensure improvement in the quality of the
information and reduce reliance on the assessment process to determine and monitor the quality
of the information.
ASSESSMENT RESULTS IN ONGOING QUALITY SYSTEM MONITORING
An important use of the results of information quality assessments is for ongoing monitoring
operations. For certain information quality indicators, quality managers can routinely monitor
the quality of the information in the form of a control chart. For example, the number of defects
in information received from an outside party may be a variable subject to measurement.
Ongoing measurement and charting of the number of defects will allow the quality manager to
calculate upper and lower control limits. Using this information, the quality manager can
examine the processes used to develop the information that is being assessed and determine if
improvements to the process actually result in increased quality.
[Control chart graphic: on-time delivery rate]
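As a hedged sketch of the arithmetic involved, a simple count-of-defects (c) chart sets its control limits from the average defect count per period; the counts below are hypothetical:

    # Illustrative c-chart limits for periodic defect counts from information quality assessments.
    import math

    def c_chart_limits(defect_counts):
        c_bar = sum(defect_counts) / len(defect_counts)   # average defects per period
        ucl = c_bar + 3 * math.sqrt(c_bar)                # upper control limit
        lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))      # lower control limit (floored at zero)
        return c_bar, lcl, ucl

    counts = [4, 7, 3, 5, 6, 2, 8, 5]                     # hypothetical monthly defect counts
    c_bar, lcl, ucl = c_chart_limits(counts)
    out_of_control = [c for c in counts if c > ucl or c < lcl]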
COMMENTS ON MEASUREMENTS
Users of technical information resident in computer systems need to pay special attention to the
issue of measuring data quality because the technical information in many cases consists of
measurement data. Measurement data include quality indicators which provide useful information regarding the measurement in terms of the accuracy (precision and bias), representativeness, completeness, comparability, and sensitivity of the measurement methodology used.
Both technical measurement results and associated quality indicators are subject to quality
concerns related to distribution of data, because once recorded in the electronic environment,
they are essentially equivalent data elements. Assessment of information quality for distribution processes is also a measurement process. Development of measurement methodology,
acceptance criteria, sampling techniques, and confidence intervals result in similar quality
7
-------
indicators for information distribution. For example, accuracy and precision of a measurement
process to determine the number of defects in a database are important indicators of the efficacy
of the quality measurement.
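To make this concrete with a hedged example: the precision of an estimated defect rate depends directly on how many records were checked, which a simple normal-approximation interval makes visible (the numbers below are hypothetical):

    # Illustrative only: approximate 95% interval for a defect rate estimated from a sample.
    import math

    def defect_rate_interval(defects_found, records_checked, z=1.96):
        p = defects_found / records_checked               # estimated defect rate
        se = math.sqrt(p * (1 - p) / records_checked)     # standard error of the estimate
        return p, max(0.0, p - z * se), min(1.0, p + z * se)

    rate, low, high = defect_rate_interval(37, 500)       # hypothetical assessment result
    print(f"defect rate {rate:.1%}, approximate 95% interval {low:.1%} to {high:.1%}")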
Assessors must be able to clearly explain the unique nature of the categorization of various types
of measurement quality indicators so they can communicate quality system needs, assessment
results, and opportunities for improvement without confusion.
CONCLUSION
Quality managers can apply existing assessment methodologies to all quality aspects of technical
information held as data in information systems. A well-operated and consistent assessment process will provide valuable tools for managers to know and improve the quality of their
information. Identifying usable quality indicators, measures for those quality indicators, and
acceptance criteria is an important process for planning assessment. Establishing and
communicating the relationships of these indicators to specific processes for both production and
distribution of information will facilitate development of quality improvement approaches.
8
-------
RESOURCES
Brackett, Michael H. 1996. The Data Warehouse Challenge: Taming Data Chaos. New York, New York:
John Wiley & Sons, Inc.
English, Larry P. 1999. Improving Data Warehouse and Business Information Quality: Methods for Reducing
Costs and Increasing Profits. New York, New York: John Wiley & Sons, Inc.
Redman, Thomas C. 1996. Data Quality for the Information Age. Norwood, Massachusetts: Artech House.
Juran, J.M., Gryna, Frank M. 1988. Juran's Quality Control Handbook: Fourth Edition. New York, New
York. McGraw-Hill, Inc.
Deming, W. Edwards 1986. Out of the Crisis. Cambridge, MA. Massachusetts Institute of Technology.
Tamini, Rajan, and Sebasianelli. 2000. Benchmarking the Homepages of "Fortune 500" Companies. ASQ Quality Progress, July 2000.
http://www.thedecalogue.com/cchartq2.htm - control chart graphic
9
-------
INFORMATION QUALITY INDICATORS FEATURES MATRIX
(Columns: TYPE | QUALITY FEATURE | QUALITY INDICATOR | DEFINITION | MEASURE)

TYPE: DATA

Quality feature: data representation
• data representativeness - a measure of the degree to which the set of rules for recording data meet the needs of the user [% or Y/N]
• data rep. completeness - a measure of the degree to which the set of rules for recording data ensure data are completely represented [% or Y/N]
• data rep. documented - a determination if adequate documentation of the data representation is provided [% or Y/N]
• data rep. granularity - a measure of the degree to which the rules for recording the data provide for recording the correct amount of granularity [% or Y/N]
• data rep. validity to business rule - a measure of the degree to which the rules for recording the data are a valid representation of the associated business rules [% or Y/N]
• data name - the degree to which the data name, entity name, and attribute name clearly communicate the meaning of the objects named (English, 1999) [% or Y/N]
• data name consistency - the degree to which the data and entity names are consistent across all presentation media, such as field names, screens, and reports [% or Y/N]

Quality feature: data record
• data record accuracy to surrogate - a measure of the agreement of the data record with the information record on a surrogate (such as a field sheet or survey form) [% or Y/N]
• data record accuracy to reality - a measure of the agreement of the data record with the data source [% or Y/N]
• data record business rule conformance - a measure of the conformance of data values to their domain and business rules [% or Y/N]
• data record timeliness (information float 1a) - a measure of time for the data record to be made and for the data record to be placed in a formal database system [time (days, hours, minutes, etc.)]
• data record timeliness (information float 1b) - the measure of failures to accomplish the enterprise's goal(s) because the data record was not available to the data system when needed [failure rate]

Quality feature: data standard
• data standards - the degree to which the data standards enable people to easily define data completely, consistently, accurately, and understandably [Y/N]
10
-------
INFORMATION QUALITY INDICATORS FEATURES MATRIX (continued)

TYPE: INFORMATION PRODUCTION

Quality feature: scientific measures
• measurement precision (meas. accuracy 1) - a measure of mutual agreement among individual measurements of the same property (usually under prescribed similar conditions) [standard deviation]
• measurement bias (meas. accuracy 2) - a systematic or persistent distortion of a measurement process which causes errors in one direction (i.e., the expected measurement is different than the sample's true value) [numerical difference between expected and true value]
• measurement representativeness - a measure of the degree to which results (data) accurately and precisely represent a characteristic of a population, parameter variations at a sampling point, a process condition, or an environmental condition (ANSI/ASQC E4-1994)
• measurement completeness - a measure of the amount of valid results (data) obtained from a measurement system compared to the amount that was expected to be obtained under correct, normal conditions (ANSI/ASQC E4-1994)
• measurement comparability - a measure of the confidence with which one set of environmental measurement results (a data set) can be compared to another (ANSI/ASQC E4-1994)
• measurement reproducibility - a measure of the reproducibility of a measurement methodology
• measurement verification - a measure of the verification that a measurement was assessed to process requirements
• measurement validation - a measure of the validation of a measurement to results requirements
• measurement usability - an assessment of a measurement for conformance to use requirements
• measurement documentation - measurement was adequately documented

Quality feature: geospatial measures
• to be determined (NOTE: May include quality indicators for scientific measures above.)

Quality feature: survey measures
• to be determined (NOTE: May include quality indicators for scientific measures above.)
11
-------
INFORMATION QUALITY INDICATORS FEATURES MATRIX (continued)

Quality feature: administrative data
• conformance to the enterprise's business rule - the degree to which the data conform to all the business rules and administrative requirements of the organization [% or Y/N]

Quality feature: financial data
• correct classification - financial data are recorded in the correct classification [% or Y/N]

Quality feature: system meta-data
• system meta-data completeness - the degree to which meta-data are complete [%]
• system meta-data business rule conformance - a measure of the conformance of meta-data to the business rules [% or Y/N]

TYPE: INFORMATION DISTRIBUTION

Quality feature: data collection and input
• data entry freedom from defect - a measure of the correctness in the data entry of information [%]
• data verification - the degree to which data were verified to meet process requirements [%]
• data validation - the degree to which data were validated to meet output requirements [%]

Quality feature: operations, analysis, software
• verification of software - a measure of the degree to which software are verified [% or Y/N]
• validation of software - a measure of the degree to which software are validated [% or Y/N]
• conformance of software to enterprise requirements - a measure of the degree to which software conform to the requirements of the enterprise [% or Y/N]
• efficiency in software operations - a measure of the use of resources compared to the scope and complexity of the assignment [% or Y/N]
12
-------
INFORMATION QUALITY INDICATORS FEATURES MATRIX (continued)

Quality feature: architecture (conformance to enterprise information requirements)
• redundant system processes - a measure of the redundancy of unnecessary system processes [%]
• business information model clarity - a measure of the clarity of the business information model (does it provide all the information needed in a clear manner) (English, 1999) [Y/N]
• operational data model clarity - a measure of the clarity of the operations data model (stable, flexible, clear, complete) [Y/N]
• distributed database architecture and design - the degree to which the processes control the physical distribution of database data [Y/N]

Quality feature: facility, facility security, hardware
• facility conformance to hardware requirements - a measure of conformance of the facility to hardware requirements (and enterprise requirements) [Y/N]
• hardware conformance to enterprise needs - a measure of the conformance of hardware to enterprise requirements [Y/N]
• reliability of hardware - a measure of the reliability of the hardware [failure rate, etc.]
• hardware maintainability - a measure of the resources needed to maintain hardware [money or resources]

Quality feature: output/reports (data warehouse)
• data report availability - a measure of the availability of reports on data from a data system [% or Y/N]
• data report contextual clarity - a measure of the degree to which data presentation enables the information customer to understand the meaning of the data and avoid misinterpretation (English, 1999) [%]

Quality feature: Internet/cyber
• web information availability - a measure of the availability of information that is needed by the information customer (see GOAL 7) [% or Y/N]
• web information accessibility - a measure of accessibility of information that is needed by the information customer (see GOAL 7) [% or Y/N]
• page loading speed - the time it takes for individual pages to fully load at a "normal" work station (Tamini, 2000) [time]
13
-------
INFORMATION QUALITY INDICATORS FEATURES MATRIX (continued)

Quality feature: Internet/cyber (continued)
• contact information visibility - the presence/absence of contact points if the information customer needs additional information or has a question (Tamini, 2000) [% or Y/N]
• timeliness - the amount of time from when information (e.g., environmental data) is available to an organization until it is available to information customers who use the information at the web site (Tamini, 2000) [time]
• functionality of links - a measure of the degree to which there are inactive links in a web site (Tamini, 2000) [% or Y/N]
• spelling, clarity, organization - a measure of the "readability" of the information provided at a web site [Y/N (potentially subjective)]
• web site modification timeliness - the amount of time from when changes need to be made to reflect organization changes (e.g., re-organization, changes in programs, etc.) to the time the changes are made to the web pages; this is the amount of time incorrect information is being provided to information customers [time]

Quality feature: data architecture
• data relationship correctness - the degree to which relationships among the real-world objects (entity type to entity type; attribute to entity type; entity type to entity subtype) are correctly represented by the data [% or Y/N]

Quality feature: storage
• duplicate database records - a measure of the number of incidents of duplicate data entry in a single database [%]
• unnecessary multiple data representation - a measure of the number of incidents where data are unnecessarily entered in more than one data representation [%]
• redundant storage of system data records - a measure of the agreement of data when data are necessarily entered into redundant storage
14
-------
INFORMATION QUALITY INDICATORS FEATURES MATRIX (continued)

• data report potential accessibility - the degree to which all potential data needed by the enterprise for information customers are accessible (English, 1999) [%]
• data report actual accessibility - the degree to which the data that are accessible to information customers can be actually accessed (i.e., ease of use) (English, 1999) [%]

Quality feature: archiving
• archival timeliness - the degree to which data are placed in archival storage according to enterprise requirements [time or %]

Quality feature: process failure costs
• irrecoverable costs - costs which are not subject to recovery (such as mailing notification letters to the wrong person) (English, 1999)
• liability and exposure costs - actual costs and potential risks (such as the liability potential if incorrect information is used to make a decision) (English, 1999)
• recovery costs of unhappy users - the compensation costs and resource costs to fix a problem because of poor information quality (English, 1999)

Quality feature: information scrap and rework
• redundant data handling and support costs - the costs of developing and maintaining alternative data systems to handle the same data because the information customer cannot use the data in the first database system (English, 1999)
• costs of hunting or chasing missing information - the costs of finding missing information, lost productivity because those resources were searching for information, and the cost of doing "rework" correcting the problem (English, 1999)
• business rework costs - the costs of re-performing processes that failed, such as reprinting reports because the first report generation efforts failed (English, 1999)
• workaround costs and decreased productivity - the costs of performing alternative work when poor quality information prevents performing the normal process, such as completing administrative documents manually when the software fails to work (English, 1999)
• data verification costs - the costs to the information customers of performing additional manual "quality inspections" to verify the quality of the information because they do not trust the quality (English, 1999)
15
-------
INFORMATION QUALITY INDICATORS FEATURES MATRIX (continued)

Quality feature: information scrap and rework (continued)
• software rewrite costs - the costs to fix application programs when they fail, recover from the problems caused, and rerun the programs (English, 1999)
• data cleansing and correction costs - the costs of data cleansing (which are usually waste costs because they would often be unnecessary if the information was correctly created and maintained)
• data cleansing costs - the costs of software to cleanse data from a source database (English, 1999)
16
-------
QUALITY MANAGEMENT SOLUTIONS FOR
TODAY'S ENVIRONMENTAL CHALLENGES
EPA DATA STANDARDS WORKSHOP
Sara Hisel McCoy
Abstract —EPA's Data Standards effort is part of the Agency's overall information
management strategy to improve the integration, reliability, longevity, and usefulness of
the environmental data the Agency relies on to help direct its regulatory and policy
decisions. The Agency has recently approved the last of an original group of six data
standards and now must implement these standards in its various data systems. EPA's
Data Standards Branch, within the Office of Information Collection, is tasked with
providing assistance to EPA's program offices in understanding and conforming with the
requirements of each standard. This group also designs and manages the Environmental
Data Registry (EDR), which among other functions, can provide this assistance. The
presenter will cover the following topics in detail during the workshop:
• EPA's Motivation for Data Standard Adoption
• EPA's Reinventing Environmental Information (REI) Initiative
• REI Action Plan
• Data Standard Status and Implementation Dates
• Data Standard Implementation Strategy
• New Data Standard Development
• Available Assistance
• EPA's Environmental Data Registry (EDR) Capabilities
1
-------
TECHNICAL INFORMATION MANAGEMENT
WITHIN THE U.S. DEPARTMENT OF ENERGY
OFFICE OF SAFETY, HEALTH AND SECURITY (EM-5)
Robert Murray
US Department of Energy
Germantown, MD
Richard Sassoon
Science Applications International Corp
Gaithersburg, MD
Abstract — Acquisition, use, and dissemination of environmental information are critical
to meeting the mission of the DOE Office of Environmental Management (DOE-EM).
Environmental information is stored in a vast array of databases, each with its own
structure and definitions. This presents challenges for access which, if overcome, present a tremendous opportunity to streamline current practices.
The DOE-EM Office of Safety, Health and Security (EM-5) is developing an innovative program based on knowledge management principles to leverage new technologies to access data via the Internet from dispersed databases. Central to this process is standardization and integration of existing data. EM-5 is currently participating with other
federal and international agencies in a cooperative project called the Environmental Data
Exchange Network (EDEN). This project applies data mining software to simplify the
acquisition and use of the data that resides in geographically dispersed database systems.
The EM-5 Data, Decision, and Documentation (3D) program will present its strategy as to
how it plans to leverage new and innovative web based tools to access critical data
necessary to meet its mission needs.
Introduction
Effective information management is a critical component in the process needed for successful
accomplishment of the mission of the Department of Energy's (DOE's) Office of Environmental
Management (EM) to clean up the Nation's former nuclear weapons production facilities.
Information management includes the technical and management practices necessary to ensure that
the correct type, quantity and quality of data are collected, organized, analyzed and disseminated to
achieve effective decision making. Good information management practices within EM will lead to:
more effective management of EM technical projects; better business practices and oversight of
costs; improved planning and decision-making at all levels of the organization; and the generation
of more rigorous and defensible technical data to support these decisions.
Data generated and stored in a vast array of databases and other data sources within EM is only
useful to the organization if it can be converted into environmental technical information that is
easily available and accessible to key decision-makers within EM. This notion is illustrated in
1
-------
Figure 1. Key to successful implementation of this concept is the standardization and integration
of data which allows the decision-maker common access to multiple data sources and the ability to
obtain critical information.
Figure 1: Effective Information Management
DOE EM-5 Role
The Office of Safety, Health and Security (EM-5) within EM is tasked with a variety of technical
and management roles that are central to the execution of the DOE-EM mission. These include: EM safety and health; establishment of safety requirements for the packaging used in EM shipments; supporting EM in addressing its safeguards and security concerns; and developing and maintaining programmatic quality systems in the areas of quality assurance, analytical services, emergency management, and risk management. In order to leverage for EM the array of data produced and used
by these diverse programs, a knowledge management approach is being developed within the Data,
Decision and Documentation Program (3D) of the EM-5 Quality Systems team. Knowledge
management is the science of using structured and unstructured data from a variety of sources to
obtain information oriented toward making a specific decision. This approach requires the
development of tools to integrate information within existing EM-5 databases.
2
-------
EM-5 Strategy
EM-5 recently conducted a survey of managers of EM analytical information and database systems
to determine whether a consistent approach for utilizing data in decision making would be of value.
This study revealed that there is a general lack of integration among analytical data systems
throughout DOE-EM, which renders it difficult, if not impossible, to access and utilize data from
the full array of databases. Furthermore, the study suggested that EM technical data sources
associated with other program areas of the EM-5 Quality Systems team may also benefit from a
consistent approach. As a result of this study, EM-5 is undertaking the following activities to
address this need.
• Definition of a Generalized Knowledge Management Process
Essential to developing a consistent approach for utilizing data in decision making is the definition
of a generalized knowledge management process that can be used as a basis for ensuring that all
major elements of an iterative decision making process using data from multiple sources are in place.
Figure 2 represents the generalized knowledge management process defined by EM-5.
Figure 2: Generalized Knowledge Management Process
3
-------
Central to this process is understanding customer and regulatory drivers to define the key decision
to be made and how that decision will be documented. The planning stage determines what type and quantity of data must be accessed and converted into useful information to become the knowledge on which the decision is based. The portal is the mechanism through which various data sources are accessed. The process can be repeated and modified through numerous cycles until sufficient knowledge is acquired to render a defensible decision.
• EM Oversight of the Environmental Data Exchange Network (EDEN) Demonstration
An example of the application of this process is the Environmental Data Exchange Network (EDEN)
demonstration, which is illustrated in Figure 3.
Figure 3: Environmental Data Exchange Network
4
-------
DOE has teamed with the US Environmental Protection Agency (EPA), the US Department of
Defense (DoD) and the European Environmental Agency (EEA) to demonstrate sharing of
environmental information and data from a set of eight diverse and geographically dispersed
database systems from each of the participating agencies. Each of the participating agencies has
a vested interest in the acquisition, use and dissemination of environmental information. EDEN
permits the exchange of information among organizations through the establishment of a
common environmental vocabulary, which is now being defined in EPA's Environmental Data Registry (EDR). EDEN then uses InfoSleuth® data mining software to acquire, analyze, and
summarize the data that resides in the multiple database systems in response to a query made by
the user.
After an initial proof of concept was demonstrated at EPA, an advanced EDEN pilot demonstration
is currently being conducted on the DOE Office of Environmental Safety and Health (EH)
information portal, and EM-5 has a lead role in overseeing this important activity. Once fully
developed and implemented, EDEN will be accessible to any organization or individual through a
standard Internet browser. Successful demonstration of the EDEN concept will yield a tool to access
multiple databases without restructuring existing data.
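The general idea can be sketched generically in Python; this is not the InfoSleuth® API, and the source names, field mappings, and classes below are hypothetical, purely to illustrate how a common vocabulary lets one query fan out across dispersed sources:

    # Generic federated-query illustration: a shared term is translated to each source's
    # local field name, the query runs against every source, and the results are merged.
    COMMON_VOCABULARY = {
        "contaminant": {"epa_source": "pollutant_name", "doe_source": "analyte"},
    }

    class InMemorySource:                                 # stand-in for one remote database
        def __init__(self, records):
            self.records = records
        def query(self, field, value):
            return [r for r in self.records if r.get(field) == value]

    sources = {
        "epa_source": InMemorySource([{"pollutant_name": "benzene", "site": "A"}]),
        "doe_source": InMemorySource([{"analyte": "benzene", "facility": "B"}]),
    }

    def federated_query(term, value):
        results = []
        for name, source in sources.items():
            local_field = COMMON_VOCABULARY[term][name]   # translate the shared term
            results.extend(source.query(local_field, value))
        return results                                    # merged records for summary/analysis

    print(federated_query("contaminant", "benzene"))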
• Application of Knowledge Management Tools to EM-5
After the knowledge management process is accepted, and the EDEN concept fully demonstrated,
these tools will first be applied within EM on the programs and activities within EM-5. Data will
be integrated from multiple data sources associated with, for example, analytical services, emergency
response, risk avoidance, waste shipment packaging, and safety and health through a consistent
knowledge management process. This is expected to enhance the effectiveness of each of these
programs and provide benefits across all of EM.
Conclusion
Information management is a critical component of environmental program execution. Ready access
to a wide array of data increases confidence in decision making and reduces risks associated with
decision errors. The ability to acquire data from geographically and technically dispersed databases
presents several tangible benefits to the decision-maker. The primary benefit is that data can be
extracted from existing sources, thereby eliminating the need to collect and process additional
redundant data. DOE-EM recognizes the value in this approach and EM-5 is taking the lead in
developing and applying these concepts.
5
-------
TECHNICAL INFORMATION MANAGEMENT
WITHIN THE U.S. DEPARTMENT OF ENERGY
OFFICE OF SAFETY, HEALTH AND SECURITY (EM-5)
Robert Murray
US Department of Energy
Germantown, MD
Richard Sassoon
Science Applications International Corp
Gaithersburg, MD
Abstract —Acquisition, use, and dissemination of environmental information are critical
to meeting the mission of the DOE Office of Environmental Management (DOE-EM).
Environmental information is stored in a vast array of databases, each with its own
structure and definitions. This presents challenges for access, which if overcome, present
a tremendous opportunity to streamline current practices.
The DOE-EM Office of Safety, Health and Security (EM-5) is developing an innovative
program based on knowledge management principles to leverage new technologies to
access data via the Internet from dispersed databases. Central to this process is
standardization and integration ofexisting data. EM-5 is currently participating with other
federal and international agencies in a cooperative project called the Environmental Data
Exchange Network (EDEN). This project applies data mining software to simplify the
acquisition and use of the data that resides in geographically dispersed database systems.
The EM-5 Data, Decision, and Documentation (3D) program will present its strategy as to
how it plans to leverage new and innovative web based tools to access critical data
necessary to meet its mission needs.
Introduction
Effective information management is a critical component in the process needed for successful
accomplishment of the mission of the Department of Energy's (DOE's) Office of Environmental
Management (EM) to clean up the Nation's former nuclear weapons production facilities.
Information management includes the technical and management practices necessary to ensure that
the correct type, quantity and quality of data are collected, organized, analyzed and disseminated to
achieve effective decision making. Good information management practices within EM will lead to:
more effective management of EM technical projects; better business practices and oversight of
costs; improved planning and decision-making at all levels of the organization; and the generation
of more rigorous and defensible technical data to support these decisions.
Data generated and stored in a vast array of data bases and other data sources within EM is only
useful to the organization if it can be converted into environmental technical information that is
easily available and accessible to key decision-makers within EM. This notion is illustrated in
1
-------
Figure 1. Key to successful implementation of this concept is the standardization and integration
of data which allows the decision-maker common access to multiple data sources and the ability to
obtain critical information.
Figure 1: Effective Information Management
DOE EM-5 Role
The Office of Safety, Health and Security (EM-5) within EM is tasked with a variety of technical
and management roles that are central to the execution of the DOE-EM mission. These include: EM
safety and health, establishment of safety requirements for the packaging used in EM shipments;
supporting EM in addressing its safeguards and security concerns; and developing and maintaining
programmatic quality systems in the areas of quality assurance; analytical services, emergency
management, and risk management. In order to leverage for EM the array of data produced and used
by these diverse programs, a knowledge management approach is being developed within the Data,
Decision and Documentation Program (3D) of the EM-5 Quality Systems team. Knowledge
management is the science of using structured and unstructured data from a variety of sources to
obtain information oriented toward making a specific decision. This approach requires the
development of tools to integrate information within existing EM-5 databases.
2
-------
EM-5 Strategy
EM-5 recently conducted a survey of managers of EM analytical information and database systems
to determine whether a consistent approach for utilizing data in decision making would be of value.
This study revealed that there is a general lack of integration among analytical data systems
throughout DOE-EM, which renders it difficult, if not impossible, to access and utilize data from
the full array of databases. Furthermore, the study suggested that EM technical data sources
associated with other program areas of the EM-5 Quality Systems team may also benefit from a
consistent approach. As a result of this study, EM-5 is undertaking the following activities to
address this need.
• Definition of a Generalized Knowledge Management Process
Essential to developing a consistent approach for utilizing data in decision making is the definition
of a generalized knowledge management process that can be used as a basis for ensuring that all
major elements of an iterative decision making process using data from multiple sources are in place.
Figure 2 represents the generalized knowledge management process defined by EM-5.
Figure 2: Generalized Knowledge Management Process
3
-------
Central to this process is understanding customer and regulatory drivers to define the key decision
to be made and how that decision will be documented. The planning stage determines what type and
quantity of data mustbe accessed, converted into useful information to become knowledge on which
the decision is based. The portal is the mechanism through which various data sources are accessed.
The process can be repeated and modified though numerous cycles until sufficient knowledge is
acquired to render a defensible decision.
• EM Oversight of the Environmental Data Exchange Network (EDEN) Demonstration
An example of the application of this process is the Environmental Data Exchange Network (EDEN)
demonstration, which is illustrated in Figure 3.
Data Sources
Query
Figure 3: Environmental Data Exchange Network
4
-------
DOE has teamed with the US Environmental Protection Agency (EPA), the US Department of
Defense (DoD) and the European Environmental Agency (EEA) to demonstrate sharing of
environmental information and data from a set of eight diverse and geographically dispersed
database systems from each of the participating agencies. Each of the participating agencies has
a vested interest in the acquisition, use and dissemination of environmental information. EDEN
permits the exchange of information among organizations through the establishment of a
common environmental vocabulary, which is now being defined in EPA's Environmental Data
Register (EDR). EDEN then uses InfoSleuth® data mining software to acquire, analyze, and
summarize the data that resides in the multiple database systems in response to a query made by
the user.
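As a rough, hypothetical sketch of this idea (not the InfoSleuth® interface and not EDEN code), a query stated in the common vocabulary can be translated into each source's local terminology, issued against every participating database, and the returned records merged and tagged with their origin for the user. All names below are assumed for illustration only.

    # Hypothetical sketch of a common-vocabulary federated query (Python).
    # `sources` is a list of dicts, each with a "name" and a "query" callable;
    # `vocabulary` maps (source name, common term) -> that source's local term.
    def federated_query(common_term, sources, vocabulary):
        results = []
        for source in sources:
            local_term = vocabulary.get((source["name"], common_term))
            if local_term is None:
                continue                                 # no mapping for this concept here
            for record in source["query"](local_term):   # query the source in its own terms
                results.append({"origin": source["name"], "record": record})
        return results                                   # combined answer across all agencies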
After an initial proof of concept was demonstrated at EPA, an advanced EDEN pilot demonstration
is currently being conducted on the DOE Office of Environmental Safety and Health (EH)
information portal, and EM-5 has a lead role in overseeing this important activity. Once fully
developed and implemented, EDEN will be accessible to any organization or individual through a
standard Internet browser. Successful demonstration of the EDEN concept will yield a tool to access
multiple databases without restructuring existing data.
• Application of Knowledge Management Tools to EM-5
After the knowledge management process is accepted, and the EDEN concept fully demonstrated,
these tools will first be applied within EM on the programs and activities within EM-5. Data will
be integrated from multiple data sources associated with, for example, analytical services, emergency
response, risk avoidance, waste shipment packaging, and safety and health through a consistent
knowledge management process. This is expected to enhance the effectiveness of each of these
programs and provide benefits across all of EM.
Conclusion
Information management is a critical component of environmental program execution. Ready access
to a wide array of data increases confidence in decision making and reduces risks associated with
decision errors. The ability to acquire data from geographically and technically dispersed databases
presents several tangible benefits to the decision-maker. The primary benefit is that data can be
extracted from existing sources, thereby eliminating the need to collect and process additional
redundant data. DOE-EM recognizes the value in this approach and EM-5 is taking the lead in
developing and applying these concepts.
-------
THE OEI BEST PRACTICES SERIES FOR
ANALYTICAL INFORMATION PRODUCTS
Evangeline Tsibris, U.S. EPA, Office of Environmental Information, Environmental
Analysis Division
Abstract — EPA's Office of Information Analysis and Access (OIAA) is developing a Best
Practices Series for analytical information product development. Building on the findings
and recommendations of the OEI report, "Lessons Learned about Designing, Developing
and Disseminating Environmental Information Products," OIAA is leading the Agency-wide
effort to develop and release guidance documents on each important phase of the
information product lifecycle: planning, design, development, review, release, maintenance
and close-out. Each guide will consist of EPA examples of best practices, case study
applications of these practices, and references to pertinent policy and guidance that already
exist. OIAA welcomes participation on the Best Practices Team for this important effort.
Background
EPA maintains a large and complex holding of public data. Most of the information is generated
from regulatory reporting requirements, such as emissions monitoring, but the Agency also
maintains public records on ambient monitoring and remote sensing; various regulatory and
voluntary program activities; and compliance records. Once the information is collected and compiled,
Program Offices and Regional Offices use it for managing their programs and for decision-making,
and they make portions of it available for public use. Importantly, the Agency is using this data to
create analytical information products that summarize and interpret the information, thereby facilitating
the use and understanding of EPA's data. Creating accurate and appropriate information products
often reduces the complexity of information being made public, but may also reduce the
transparency of how the information has been processed and is being presented.
EPA's work to create information products and educate users is now moving at a speed that requires
a more consistent and thorough approach to the planning and development of analytical information
products. One of the first steps toward understanding what makes a product widely used, relevant
and technically credible has been background research conducted under an advisory committee
composed of representatives of all AAships and the Regions. The final report, Lessons Learned
about Designing, Developing and Disseminating Environmental Information Products, documents
the diverse types of products and the corresponding steps of product development, as well as the current
issues that interviewees emphasized for attention if environmental information products
throughout the Agency are to be of the highest quality.1 Importantly, the findings are distilled from
interviews with information product developers themselves.
1 EPA 260R-00-001
-------
Critical areas for improvement identified by product developers included:
• Articulate a clear sense of the product's purpose early in the design process. As
obvious as this may seem, interviewees stress that this step is frequently overlooked,
resulting in redundancy among products, sub-optimal user functionality, and other
problems.
• Avoid attempts to develop a generic information product. While information
products do circulate beyond their intended audience, this should not be used as a
rationale for avoiding the need to specify a target audience and its unique needs for
environmental information.
• Develop a mechanism to facilitate communication among producers and users of
EPA data. While some users of secondary information products take the initiative
and communicate effectively with originators of data, EPA currently lacks an
organized framework to guide this kind of technical interaction.
• Seek stakeholder and product audience input at all points in the information product
life-cycle. Early stakeholder involvement is especially important to help clarify
goals, reduce skepticism, build consensus, and ensure the dissemination of
information that meets audience needs.
• Address concerns over data accuracy. Some developers and managers interviewed
note the importance of creating a data feedback loop, through which facilities and
other data providers are asked to verify data used in a particular information product.
Beyond data verification, some interviewees also feel that the Agency should
consider initiating an internal comment period via an intranet-accessible form,
allowing data to undergo internal scrutiny prior to public release.
Information product developers probably recognize these issues as ones they wrestle with day-to-day.
To internal and external users of environmental information, however, such challenges are
perhaps less obvious but of no lesser concern.
What is the Best Practices Series?
The challenge facing the Agency at this time is the need to grow and enhance the infrastructure that
supports the way analytical information products are developed so that:
• EPA product developers are given the guidance they need to create credible
information products with sound data; and
-------
• all information product users receive the information they want and the guidance
they need to understand the information received.2
To these ends, an Agency-wide network of staff involved with information product development is
being convened to research and eventually develop a series of Best Practices Guides for information
product developers and managers. The series of guides will span the product life cycle from initial
conceptualization through development, testing, and maintenance, and will provide practical advice
and examples of excellence. And while OIAA will be coordinating these efforts, the participation
of a wide range of staff throughout the Agency is the most critical input to creating a successful Best
Practices Series.
Purpose
There are existing Best Practices around the Agency that should be highlighted in order to promote
Agency information sharing, thereby strengthening both information partnerships and the
EPA information infrastructure. OIAA has adopted a best practices approach for several reasons:
• It is among the simplest means of demonstrating tried and true success and helping
others implement that success in their everyday business.
• It is a voluntary, rather than command-and-control, approach. Tools to help
information product developers will be created, instead of rules. This approach
emphasizes and illustrates attention to excellence throughout the product
development process by fostering partnerships and collaborations.
• Because it is a process involving many offices and individuals, there are
opportunities for such partnerships, as with the Office of Research and Development
Environmental Information Management System (EIMS), the OEI Information
Products Bulletin (IPB), the Office of Communication, Education and Media
Relations (OCEMR), the data quality efforts of OEI's Data Quality Staff, and other
quality efforts around the Agency, thereby connecting the information product
experts around the Agency under one guidance objective.
• In addition, this series will complement other ongoing government-wide best
practices efforts.
The Best Practices Series, the Best Practices Network, and the very process of developing the series
will also generate direct benefits for product developers, the Offices sponsoring the development of
information products, and users. On the development side, the Best Practices Series will streamline
product development, thereby reducing costs and duplication of effort. On the user side, better
2 EPA Strategic Plan Goal 7: Expansion of Americans' Right to Know about Their Environment
Objective 7.1: Increase the availability of quality health and environmental information
Objective 7.2: Improve the public's ability to use and understand environmental and health information
-------
coordinated development and products of higher quality will be more credible and increase the
likelihood that those seeking environmental information from EPA will get the answers to their
questions more quickly and in a form that is more understandable. As the best practices approach
is applied to analytical information product development, the Agency will provide all users with the
true power of its environmental information holdings.
Development Plan
Successful development of the Best Practices Series is very much dependent on OEI's ability to
organize and support an effective Agency work process. The planning, research and review
process is a collaborative effort of staff throughout the Agency and will take place throughout the
life of the project. A network of staff in program and Regional offices will contribute their
collective expertise and experience to the initial research and review phases. The work of
researching and developing draft Guides is the responsibility of a smaller Best Practices Team,
coordinated by OEI. The involvement of the Best Practices Network and the Best Practices
Team will ensure that the Best Practices Series creates the most relevant and useful guides
possible. It is important to note that the Best Practices Network consists of Agency volunteers,
and these key contacts are critical to the success of the series as a whole. The potential for
external (non-EPA) stakeholder participation is still being considered by OIAA in the context of
how best to deliver a valuable product to our audience, the EPA product developer. The ultimate
deliverables are an interactive website, consisting of a Best Practices toolkit, feedback forum, and
library, to assist users in implementing best practices in their everyday projects and in
communicating with others in this information community. In addition to the website, the Series
will produce 12 hardcopy guides that will be maintained as living documents into the future.
Four topics have been selected for priority development: Metadata Development, Product
Review, Environmental Indicators, and Conducting a Data Suitability/Quality Assessment.
Structure of the Best Practices Guides
Each of the Best Practices Guides will have a similar structure and will include:
• Examples of EPA Best Practices - The initial step of guidance development is the
identification of existing practices as candidates for inclusion in a Best Practices
guide. Agreed-upon best practice criteria will then be applied to determine which
findings can be showcased as examples of best practices. These examples will then
potentially be used in at least one case study application and, pending further
revision, will ultimately be recommended for Agency use.
• Best Practices Case Studies - Each guide will contain examples of best practices
along with at least one case study that demonstrates the application of some or all of
these practices. By providing these examples and case studies, the guides will enable
information product developers and managers to adapt the guidance to their specific
needs and situations. For example, the guidance on effective stakeholder
-------
involvement might include a more in-depth look at how one Office engaged external
groups in the development and final release of its product, including a detailed
history of how the project accomplished its objectives, the role of each stakeholder
group, and the resources involved.
• Resources - Each guidance recommendation will also be contextualized within the
necessary framework of existing guidance and policy, both within the Agency and
outside the Agency.
• OEI Recommendations - Each guide will include recommendations that delineate the
best practice or practices for that topic area and how to implement them.
These recommendations will be based upon the research conducted by the Team that
resulted in the examples of best practices, the findings from the multiple case study
applications, and the relationship between the Team's findings and existing policy
and/or guidance. Each recommendation will be expressed in the form of practical
advice for product developers and product managers.
Network Building
A network of interested and involved stakeholders is critical to the development, dissemination and
use of the guides. These stakeholders include: current, past, and future product developers; current
information users; primary data system stewards; and the management staff in all Agency Offices
and Regions. These stakeholders will be actively solicited for their expertise, opinions, and
assistance in reviewing proposals and draft material for the Series as it is being developed. As a first
step, OEI is asking all interested individuals to sign up for future bulletins (email
tsibris.evangeline@epa.gov) and is developing a method to keep the network informed of current
and planned activities.
The Best Practices Team
The Best Practices Team will oversee development of the Best Practices Series. Currently, the Team
consists of representatives from OEI, the Office of Communications, Education and Media Relations
(OCEMR), EPA Region 5, and EPA Region 3. The Team will be instrumental in guiding the
research on best practices within each topic area, as well as the development of the series and
interactions with the Network and Agency management. OEI will be soliciting nominations for a
lead staff person from each Office and Region to serve on the Team.
Research
The initial stage in the development of the Best Practices Series has been the needs assessment and
definition of scope for the entire effort. For each Series topic, previous related research will be
collected and summarized to identify best practices, and to develop guidance criteria. Additional
research on each topic may be undertaken following review of the draft Guide.
Best Practice Criteria & Evaluation
A critical part of developing the Best Practices Series is to determine what constitutes best practices
and how to evaluate their appropriateness to Agency information products.
-------
In this step, the initial criteria are evaluated for their utility and appropriateness using existing
information products. Testing the draft criteria helps to determine 1) whether there are gaps in the issues
addressed by the draft criteria; 2) whether additional text explanations are required to help developers
understand or use the criteria; and 3) which real-world examples to bring into the Guide.
Finally, once the initial testing of the criteria is complete, the criteria will be incorporated into the
draft Guide and distributed to interested stakeholders in order to obtain feedback from the widest
possible group of reviewers. This review procedure will also help determine whether further research is
needed before the particular guidance document is finalized.
Publishing of the Series
OEI intends to publish the twelve Best Practices Guides individually as the research, evaluation, and
review are completed for each topic (it is not expected that they will be completed in order). They
will be published simultaneously in both hard copy and via the Best Practices Intranet website. This
website will contain downloadable PDF guides in addition to a more in-depth look at each topic area
and links to other related materials on key subject matter. It will also enable interested parties
to submit comments, questions, or even their own recommendations for a best practice while
searching for the information resources they need.
Maintenance of the Series
The Best Practices Team and OEI will be responsible for maintaining the Series. It is expected that
each Guide will be updated and/or completely redone periodically as practices, information, and
technologies evolve. Consequently, the web-based versions will be actively maintained and updated
as living tools for information product developers. This approach also allows interested parties, even
if not involved in the original development of a Guide, to provide comments on an ongoing basis,
which can be incorporated into any subsequent revisions of the Guides.
External Stakeholders
Much of EPA's data is collected for regulatory purposes; when this data is used in analytical
information products, there are many external stakeholders who have a legitimate interest in these
activities. External stakeholders such as state agencies, industry associations, and environmental
groups will have strong opinions on what EPA's best practices should be for analytical information
products. Options for establishing the proper forum to receive stakeholder comments will be
explored.
Topic Areas for the Best Practices Series
Each stage of product development will be supported by one or more guidance documents. The
following is a list of these product development stages and the Best Practices Guides currently
proposed for each stage:
-------
Product Planning Stage:
1. Product Plan - identifies best practices and offers advice on creating a strategic plan for
product development and for the project as a whole, including factors such as budgetary
constraints.
2. Audience Identification - identifies best practices and offers advice on identifying audience(s)
for the product.
3. Stakeholder Involvement - identifies best practices and offers advice on determining when
and how stakeholders should be involved in product development.
Product Design Stage:
4. Product Design - identifies best practices and offers advice on designing the best product
based on its information content, function, audience, project goals, and performance measures.
5. Data Suitability Objectives - identifies best practices for assessing how well existing data
meet the objectives and purposes of an analytical information product.
Product Development Stage:
6. Metadata Development - identifies best practices and offers advice on preparing supporting
explanations and caveats that facilitate appropriate data/product use and understanding.
7. Using Environmental and other Indicators - identifies best practices and offers advice on the
development and appropriate use of environmental indicators.
Product Publishing Stage:
8. Product Review and Release - identifies best practices and offers advice on fulfilling all
requirements for product review and release.
Maintenance & Revision Stage:
9. Product Maintenance and Close-out - identifies best practices and offers advice on planning
for and carrying out product maintenance for the anticipated life of the product.
10. User Feedback and Revision - identifies best practices and offers advice on creating and
implementing a user feedback and revision process.
11. Error Correction - identifies best practices and offers advice on implementing an error
correction process once the product is released and users are able to provide feedback.
Process/Next Steps
Four Best Practices Guides are currently in the initial stages of development. Once each
guide is completed, it will be published in both hard copy and on the Best Practices
Intranet site, where it can be periodically revised and updated. The Best Practices Team
will be developing an Agency-wide network to assist in identifying examples of best
practices and in reviewing draft documents and website material.
-------
20th ANNUAL NATIONAL CONFERENCE ON
MANAGING ENVIRONMENTAL QUALITY SYSTEMS
April 2-6, 2001, St. Louis, MO
Adams, Nancy H.
QA Manager
National Risk Management Research Laboratory
MD-91
Research Triangle Park, NC 27711
Ph: (919) 541-5510 Fax: (919) 541-0496
E-mail: adams.nancy@epa.gov
Adly, Michael A.
Chemist
U.S. EPA
2890 Woodbridge Ave
Edison, NJ 08837
Ph: (732) 906-6161 Fax: (732) 321-6622
E-mail: michael.adly@epamail.epa.gov
Almodovar, Lisa
Program Manager
U.S. EPA
1200 Pennsylvania Ave, NW (4304)
Washington, DC 20121
Ph: (202) 260-1310 Fax: (202) 260-1036
E-mail: almodovar.lisa@epa.gov
Amano, Richard
Principal Chemist
Lab Data Consultants, Inc.
7750 El Camino Real, Suite 2L
Carlsbad, CA 92009
Ph: (760) 634-0437 Fax: (760) 634-1674
E-mail: ramano@lab-data.com
Ancog, Narciso A.
QA Officer
Naval Facilities Engineering Command - Southwest
1220 Pacific Hwy
San Diego, CA 92132
Ph: (619) 532-2540 Fax: (619) 532-2607
E-mail: ancogna@efdsw.navfac.navy.mil
Arenovski, Andrea L.
Project Coordinator
National Decentralized Water Resources Capacity
Development Project (NCDP)
3901 Grand Ave , Suite 304
Oakland, CA 94610
Ph: (510) 658-2686 Fax: (510) 658-2618
E-mail: a_arenovski@earthlink.net
Bath, Raymond
USDOE
Washington, DC 20585
Ph: (212) 620-3637 Fax: (212) 620-3600
E-mail: bath@eml.doe.gov
Batterman, Allan R.
Env. Scientist, QA Manager
NHEERL
U.S. EPA
6201 Congdon Blvd
Duluth, MN 55804
Ph: (218) 529-5027 Fax: (218) 529-5015
E-mail: Batterman.Allan@epa.gov
Beach, Laura O.
QA Manager
ARCADIS Geraghty & Miller
4915 Prospectus Dr
Durham, NC 27713
Ph: (919) 544-4535 Fax:(919)544-5690
E-mail: lbeach@arcadis-us.com
Beard, Mitch
President
EarthSoft
78 Harvard Rd
Littleton, MA 01460
Ph: (978) 486-9057
E-mail: mbeard@earthsoft.com
Beelman, Joyce
Water QA Officer
Alaska Department of Environmental Conservation
610 University Ave
Fairbanks, AK 99709
Ph: (907) 541-2141 Fax: (907) 541-2187
E-mail: jbeelman@envircon.state.ak.us
Bennett, Gary
Chief, QA and Data Integration
U.S. EPA
980 College Station Rd
Athens, GA 30605
Ph: (706) 355-8551 Fax: (706) 355-8803
E-mail: bennett.gary@epa.gov
Registration as of March 20, 2001
-------
Bertoni, Malcolm J.
Senior Research Env. Scientist
Research Triangle Institute
1615 M Street, NW, Suite 740
Washington, DC 20036
Ph: (202) 728-2067 Fax: (202) 728-2095
E-mail: mjb@rti.org
Bethell, Cindy
Special Assistant
U.S. EPA/OEI/OIC
1200 Pennsylvania Ave, NW (2822)
Washington, DC 20460
Ph: (202) 260-2580 Fax:(204)401-4544
E-mail: bethell.cindy@epa.gov
Betz, Elizabeth
ORD, MD-77
U.S. EPA
Research Triangle Park, NC 27711
Ph: (919) 541-1535 Fax: (919) 541-0239
E-mail: betz.elizabeth@epamail.epa.gov
Biggs, Katherine
Associate Director
Office of Federal Activities
1200 Pennsylvania Ave, NW ( 2252A)
Washington, DC 20460
Ph: (202) 564-7144 Fax: (202) 564-0072
E-mail: biggs.katherine@epa.gov
Bingham, Alan K.
QA/QC Coordinator
Unified Sewerage Agency
2550 SW Hillsboro Hwy
Hillsboro, OR 97123
Ph: (503) 846-8939 Fax: (503) 846-8937
E-mail: Binghama@usa-cleanwater.org
Bird, Judy
United Water
3600 West 3rd Ave
Gary, IN 46406
Ph: (219) 944-1211 Fax: (219) 949-6885
E-mail: jbird@indy.wrep.com
Bishop, Kathy
Program Analyst
U.S. EPA OSWER
1200 Pennsylvania Ave, NW (5103)
Washington, DC 20460
Ph: (202)260-7912 Fax: 202-401-1496
E-mail: bishop.kathy@epa.gov
Black, Kelly J.
Neptune & Company, Inc.
2031 Kerr Gulch Rd
Evergreen, CO 80439
Ph: (720) 746-1803 Fax: (720) 746-1605
E-mail: kblack@neptuneandco.com
Blaine, Kathleen A.
CO
LabOnline, Inc.
17 Bass Lane
Belleville, IL, 62223
Ph: (618) 476-9103 Fax: (734) 747-9229
E-mail: kblaine@norcom2000.com
Blake, Anthony
Associate Environmental Scientist
Harding ESE
90 Digital Dr
Novato, CA 94949
Ph: (415) 884-3186 Fax: (415) 884-3300
E-mail: afblake@mactec.com
Blume, Lou
QA Manager
U.S. EPA
77 West Jackson Blvd
Chicago, IL 60604
Ph: (312) 353-2317 Fax: (312) 353-2018
E-mail: blume.louis@epa.gov
Bolger, Kevin
Regional QA Manager
U.S. EPA, Region 5
77 West Jackson Blvd, (M-9J)
Chicago, IL 60604
Ph: (312) 886-6762 Fax: (312) 353-4342
E-mail: bolger.kevin@epa.gov
Boone, Patricia
Coordinator, Pesticide QA
U.S. EPA, Region 5
77 W. Jackson Blvd, DT-8J
Chicago, IL 60604
Ph: (312) 886-3172 Fax: (312) 353-4788
E-mail: boone.patricia@epa.gov
-------
Boone, Denise
Environmental Scientist
U.S. EPA, Region 5
77 W. Jackson Blvd
Chicago, IL 60604
Ph: (312) 886-6217 Fax: (312) 353-9306
E-mail: boone.denise@epa.gov
Boring, Wade C.
Supervisor, Geographic Analysis
Illinois EPA
1021 N. Grand Ave East
Springfield, IL 62704
Ph: (217) 785-4787
E-mail: wade.boring@epa.state.il.us
Bottrell, David
U.S. DOE
19901 Germantown Rd
Germantown, MD 20874
Ph: 301-903-7251 Fax: 301-903-7613
E-mail: DAVID.BOTTRELL@em.doe.gov
Boulos, Emile I.
Physical Scientist
U.S. DOE
1000 Independence Ave, SW
Washington, DC 20585
Ph: (202) 586-1306 Fax: (202) 586-0955
E-mail: emile.boulos@eh.doe.gov
Braun, Leslie
Sr. QA Analyst
DynCorp
2000 Edmund Halley Dr
Reston, VA 20191
Ph: (703) 264-8743 Fax: (703) 264-9236
E-mail: braunl@dyncorp.com
Brilis, George
QA Manager
NERL
U.S. EPA
PO Box 93478
Las Vegas, NV 89119
Ph: (702) 798-3128 Fax: (702) 798-2233
E-mail: Brilis.George@epa.gov
Broadway, Rayna
Air Quality Monitoring Specialist II
Missouri Department of Natural Resources
2710 W. Main
Jefferson City, MO 65109
Ph: (573) 522-1937 Fax: (573) 526-3350
E-mail: nrbroar@mail.dnr.state.mo.us
Brotherton, Mark
Geologist
Independent
7103 Bristol Ridge Dr
Houston, TX 77095
Ph: (281)550-2271
E-mail: mabenderl4@juno.com
Brown, Sherie
Biological Technician
ORD
26 W. Martin Luther King Dr, MD-314
Cincinnati, OH 45268
Ph: (513) 569-7613 Fax: (513) 569-7170
E-mail: brown.sherie@epa.gov
Bruner, Phillip E.
Air Quality Monitoring Specialist
Environmental Services Program
2710 West Main
Jefferson City, MO 65109
Ph: (573) 526-3359 Fax: (573) 526-3350
E-mail: nrbrunp@mail.dnr.state.mo.us
Burrell, Robert C.
Water Program Liaison
Texas Natural Resource Conservation Commission
(TNRCC)
PO Box 13087
Austin, TX 78711
Ph: (512) 239-0431 Fax: (512) 239-0404
E-mail: rburrell@tnrcc.state.tx.us
Burson, Marcolm C.
Director of Special Projects
Maine Department of Environmental Protection
SHS #17
Augusta, ME 04333
Ph: (207) 287-7755 Fax: (207) 287-2814
E-mail: malcolm.c.burson@state.me.us
-------
Buscher, William E.
Supervisor, Hydrogeology/Compliance
Illinois EPA
1021 N. Grand Ave East, PO Box 19276
Springfield, IL 62794
Ph: (217) 785-4787 Fax: (217) 557-3182
E-mail: bill.buscher@epa.state.il.us
Byrne, Christian
QA Officer
OPP/BEAD
U.S. EPA
Env. Chemistry Laboratory, Building 1105
John C. Stennis Space Ctr, MS 39529
Ph: (228) 688-3213 Fax: (228) 688-3536
E-mail: byrne.christian@epa.gov
Byvik, Richard L.
Chemist
U.S. EPA, Region 5
77 W. Jackson Blvd
Chicago, IL 60604
Ph: (312) 353-3114 Fax: (312) 353-9281
E-mail: byvik.richard@epamail.epa.gov
Calacsan, Stacie
QA Manager
Hampton Roads Sanitation District
1432 Air Rail Ave
Virginia Beach, VA 23455
Ph: (757) 460-4217 Fax: (757) 460-6586
E-mail: scalacsan@hrsd.dst.va.us
Campbell, Susan
Chief, Program Operations Staff
National Exposure Research Laboratory
26 W. Martin Luther King Dr
Cincinnati, OH 45268
Ph: (513) 569-7426 Fax: (513) 569-7584
E-mail: campbell.susan@epa.gov
Carter, Mike
QA Manager
U.S. EPA
1200 Pennsylvania Ave, NW (5106)
Washington, DC 20460
Ph: (202) 260-5686 Fax: (202) 260-5646
E-mail: carter.mike@epa.gov
Carty, Lawrence V.
QA Officer
Illinois EPA
2125 S. First St
Champaign, IL 61820
Ph: (217) 278-5858 Fax: (217) 278-5868
E-mail: verne.carty@epa.state.il.us
Charpentier, Luke
QA Coordinator
Minnesota Pollution Control Agency
520 Lafayette Rd
St. Paul, MN 55155
Ph: (651) 296-8445 Fax: (651) 297-8676
E-mail: luke.charpentier@pca.state.mn.us
Chellis, Darby
Consultant
Marasco Newton Group
2801 Clarendon Blvd
Arlington, VA 22201
Ph: (703) 247-4756 Fax: (703) 516-9108
E-mail: dchellis@marasconewton.com
Childress, Steven
QA Specialist/Chemist
PO Box 13087, MC-143
Austin, TX 78711
Ph: (512) 239-2440 Fax: (512) 239-2450
E-mail: schildre@tnrcc.state.tx.us
Churilla, Patrick J.
Manager, Lab. Certification
U.S. EPA, Region 5
77 W. Jackson Blvd, (WD-15J)
Chicago, IL 60604
Ph: (312) 353-6175 Fax: (312) 886-6171
E-mail: churilla.patrick@epa.gov
Claycomb, Daniel P.
Director of Geosciences
Environmental Standards, Inc.
1140 Valley Forge Rd, PO Box 810
Valley Forge, PA 19482
Ph: (610) 935-5577 Fax: (610) 935-5583
E-mail: Dclaycomb@EnvStd.com
-------
Cobb, Rick
Manager
Illinois EPA
1021 N. Grand Ave
Springfield, IL 62794
Ph: (217) 785-4787 Fax: (217) 782-0075
E-mail: rick.cobb@epa.state.il.us
Collins, Barbara W.
Acting QA Manager
NHEERL, MD-71
Research Triangle Park, NC 27711
Ph: (919) 541-5766 Fax: (919) 541-1499
E-mail: collins.barbara@epa.gov
Corbett-Colbert, Harriet T.
QA Coordinator
OW/OGWDW
U.S. EPA
1200 Pennsylvania Ave, NW (4607)
Washington, DC 20460
Ph: (202) 260-2302 Fax: (202) 260-3762
E-mail: colbert.harriet@epa.gov
Cornell, Jeff
U.S. Air Force
AFCEE, 3207 North Road
Brooks AFB
San Antonio, TX 78235-5363
Ph. (210) 536-4329
Jeff.Cornell@HQAFCEE.brooks.af.mil
Cortesi, Roger
QA Coordinator
U.S. EPA/ORD/NCER
1200 Pennsylvania Ave, NW (8701R)
Washington, DC 20460
Ph: (202) 564-6852 Fax: (202) 565-2444
E-mail: cortesi.roger@epa.gov
Crane, David
Laboratory Director
CA Department of Fish and Game
2005 Nimbus Rd
Rancho Cordova, CA 95670
Ph: (916) 358-2859 Fax: (916) 985-4301
E-mail: dcrane@ospr.dfg.ca.gov
Crespo-Galan, Jorge L.
QA/QC Officer
Environmental Quality Board
PO Box 11488
San Juan, Puerto Rico 00910
Ph: (787) 766-2823 Fax: (787) 766-0150
E-mail: jcrespol973@hotmail.com
Culpepper, Brenda T.
Director of Quality Assurance
National Health and Environment Effects Research
Laboratory
86 Alexander Dr
Research Triangle Park, NC 27711
Ph: (919) 541-0153 Fax: (919) 541-2581
E-mail: culpepper.brenda@epa.gov
Daddow, Richard L.
Hydrologist
U.S. Geological Survey
Denver Federal Center, Bldg 53, MS-425
Denver, CO 80225
Ph: (303) 236-5050 Fax: (303) 236-5046
E-mail: rldaddow@usgs.gov
Darley, Skip
U.S. Navy
1661 Redbank Rd
Goose Creek, SC 29445
Ph: (843) 764-7337 Fax: (843) 764-7360
E-mail: darleyre@navsea.navy.mil
Dempsey, Carla
Program Manager
Lockheed Martin
1655 N. Ft. Myer Dr #360
Arlington, VA 22209
Ph: (703) 812-3902 Fax: (703) 516-9050
E-mail: carla.h.dempsey@lmco.com
Di Rienzo, Robert P.
Vice President QA
DataChem Laboratories, Inc.
960 West LeVoy Dr.
Salt Lake City, UT 84123
Ph: (801) 266-7700 Fax: (801) 268-9992
E-mail: dirienzo@datachem.com
-------
Dinsmore, Donalea
QA Coordinator
Wisconsin Department of Natural Resources
PO Box 7921
Madison, WI 53707
Ph: (608) 266-8948 Fax: (608) 266-5226
E-mail: dinsmd@dnr.state.wi.us
Dinwiddie, Robert S.
Hazardous Waste Bureau
2044 Galisteo St
Santa Fe, NM 87505
Ph: (505) 827-1561 Fax: (505) 827-1544
E-mail: stu_dinwiddie@nmenv.state.nm.us
Dirgo, John
QA Manager
Tetra Tech EM Inc.
200 E. Randolph Dr., Suite 4700
Chicago, IL 60601
Ph: (312) 856-8765 Fax: (312) 938-0118
E-mail: dirgoj@ttemi.com
Dixon, Thomas
Quality Staff
U.S. EPA
1200 Pennsylvania Ave, NW, (2811R)
Washington, DC 20460
Ph: (202) 564-6877 Fax: (202) 565-2441
E-mail: dixon.thomas@epa.gov
Doehnert, Mark
QA Manager
US Environmental Protection Agency
1200 Pennsylvania Ave, NW (6608J)
Washington, DC 20460
Ph: (202) 564-9386 Fax: 202-565-2042
E-mail: doehnert.mark@epa.gov
Drees, Lauren
QA Manager
NRMRL
26 W. Martin Luther King Dr
Cincinnati, OH 45268
Ph: (513) 569-7087 Fax: (513) 569-7787
E-mail: drees.lauren@epa.gov
Dulka, Anthony
Public Service Administrator
Illinois EPA
1021 N. Grand Ave, East
Springfield, IL 62794
Ph: (217) 785-4787 Fax: (217) 782-0075
E-mail: Anthony.Dulka@epa.state.il.us
Duncan, John W.
Environmental Scientist
National Exposure Research Laboratory
Landscape Characterization Branch
79 TW Alexander Dr., Annex S-427, MD-56
Research Triangle Park, NC 27711
Ph: (919) 541-2187 Fax: (919) 541-1138
E-mail: duncan.john@epa.gov
Duncan, Keith
Env. Program Manager
Oklahoma DEQ
PO Box 1677
Oklahoma City, OK 73101
Ph: (405) 702-4140 Fax: (405) 702-4100
E-mail: keith.duncan@deq.state.ok.us
Elkins, Robert B.
Environmental Scientist
Bechtel Nevada Corporation
PO Box 98521
M/S NTS273
Las Vegas, NV 89193
Ph: (702) 295-5381 Fax: (702) 295-4773
E-mail: elkinsrb@nv.doe.gov
Elkins, Joe
QA Manager
U.S. EPA
1202 Belfry Dr
Knightdale, NC 27545
Ph: (919) 541-5653 Fax: (919) 541-3613
E-mail: Elkins.joe@epa.gov
England, Jeff
Environmental Specialist
Iowa Waste Reduction Center
1005 Technology Parkway
Cedar Falls, IA 50613
Ph: (319) 273-8905 Fax: (319) 268-3733
E-mail: Jeff.England@UNI.edu
-------
Fayoumi, Nabil
Environmental Scientist
U.S. EPA, Region 5
77 W. Jackson Blvd
Chicago, IL 60604
Ph: (312) 886-6840 Fax: (312) 353-4788
E-mail: fayoumi.nabil@epa.gov
Fennell, Douglas B.
QA Coordinator
ORD
U.S. EPA
348 Catawba Building, MD-52
Research Triangle Park, NC 27711
Ph: (919) 541-3789 Fax: (919) 541-1818
E-mail: fennell.douglas@epa.gov
Ferris, Larry E.
Env. Services Specialist
Tinker AFB
7701 Arnold St
Oklahoma City, OK 73145
Ph: (405) 734-3002 Fax: (405) 736-4381
E-mail: larry.ferris@tinker.af.mil
Fong, Vance S.
Regional QA Manager
U.S. EPA, Region 9
75 Hawthorne St
San Francisco, CA 94105
Ph: (415) 744-1492 Fax:(415)744-1476
E-mail: Fong.vance@epa.gov
Francis-Holloman, Vincia C.
Environmental Scientist
U.S. EPA
1300 Pennsylvania Ave, NW (2811R)
Washington, DC 20004
Ph: (202)564-5176 Fax:(202)565-2441
E-mail: holloman.vincia@epa.gov
Friedman, David M.
Chemist
U.S. EPA, Region 3
1650 Arch St
Mail Code 3WC11
Philadelphia, PA 19103
Ph: (215) 814-3395 Fax: (215) 814-3113
E-mail: friedman.davidm@epa.gov
Gerrish, Harlan
Geologist
U.S. EPA
77 W. Jackson Blvd
UIC Branch, WU-16J
Chicago, IL 60604
Ph: (312) 886-2939 Fax: (312) 886-4235
E-mail: gerrish.harlan@epa.gov
Gilbert, John
Quality Systems
1230 Columbia St, Suite 400
San Diego, CA 92101
Ph: (619) 744-3049 Fax:(619)744-3049
E-mail: jmgilber@bechtel.com
Good, Gregg
Manager, Surface Water
Illinois EPA
1021 N. Grand Ave East, PO Box 19276
Springfield, IL 62794
Ph: (217) 782-3362 Fax: (217) 785-1225
E-mail: Gregg.Good@epa.state.il.us
Goodis, Michael L.
Team Leader
U.S. EPA
1200 Pennsylvania Ave, NW (7508C)
Washington, DC 20460
Ph: (703) 308-8157 Fax: (703) 308-8005
E-mail: goodis.michael@epa.gov
Gourley, Don
Supervisor, Air Quality Assurance
Missouri Dept. of Natural Resources
2710 West Main
Jefferson City, MO 65109
Ph: (573) 526-3393 Fax: (573) 526-3350
E-mail: nrgourd@mail.dnr.state.mo.us
Grim, Betsy
QPP QA Manager
OPP/EPA
2029 Hunter Mill Rd
Vienna, VA 22181
Ph: (703) 305-7634 Fax: (703) 305-6309
E-mail: grim.betsy@epa.gov
-------
Groff, Paul
U.S. EPA
Research Triangle Park, NC 27711
Ph: 919-541-0979 Fax: (919) 541-0979
E-mail: groff.paul@epa.gov
Gualo, Alan
Water Quality Coordinator
Illinois Department of Agriculture
PO Box 192871, State Fairgrounds
Springfield, IL 62794
Ph: (217) 782-6297
E-mail: agulso@agr.state.il.us
Guarino, Kevin S.
Special Agent-in-Charge
U.S. EPA
Office of Criminal Enforcement, Forensics and Training
Bldg 53, Box 25227, Denver Federal Center
Denver, CO 80225
Ph: (303) 236-6120 Fax: (303) 236-6174
E-mail: Guarino.Kevin@epa.gov
Gustin, RoseMary J.
QA Director
CDM Federal Programs Corporation
13135 Lee Jackson Memorial Hwy, Suite 200
Fairfax, VA 22033
Ph: (703) 968-0900 Fax: (703) 968-0915
E-mail: gustinrj@cdm.com
Hankins, Jeanne
NELAP Director
U.S. EPA, MD-75A
Research Triangle Park, NC 27711
Ph: (919) 541-1120 Fax: (919) 541-4261
E-mail: hankins.jeanne@epa.gov
Harris, Diane
Acting QA Manager
U.S. EPA, Region 7
901 N. 5th St
Kansas City, KS 66101
Ph: (913) 551-7258 Fax: (913) 551-9258
E-mail: harris.dianee@epa.gov
Haynes, RaeAnn
QA Manager
Oregon Department of Environmental Quality
1712 SW 11th Ave
Portland, OR 97201
Ph: (503) 229-5983 Fax: (503) 229-6924
E-mail: haynes.raeann@deq.state.or.us
Heffern, Richard
Chemist
State of Alaska Dept. Environmental Conservation
410 Willoughby Ave, Suite 303
Juneau, AK 99801
Ph: (907) 456-5111 Fax: (907) 465-5129
E-mail: rheffern@envircon.state.ak.us
Helmick, Walter
QA Officer
U.S. EPA, Region 6
1445 Ross Ave, (MC: 6SF-D)
Dallas, TX 75202
Ph: (214) 665-8373 Fax: (214) 665-6660
E-mail: helmick.walt@epa.gov
Henderson, Michelle
QA Manager
International Technology Corporation
11499 Chester Rd
Cincinnati, OH 45246
Ph: (513) 782-4763 Fax: (513) 782-4749
E-mail: mhenderson@theitgroup.com
Henebry, Michael S.
QA Officer
Illinois Environmental Protection Agency
1021 N. Grand Ave, PO Box 19276
Springfield, IL 62794
Ph: (217) 785-3944 Fax: (217) 785-1225
E-mail: mike.henebry@epa.state.il.us
Hernando, Jennifer
President
Mountain Edge Environmental, Inc.
PO Box 237
Kaawaii, HI 96730
Ph: (808) 237-1503 Fax: (808) 237-8425
E-mail: jjkleveno@aol.com
Hisel McCoy, Sara
Program Analyst
OEI/OIC
U.S. EPA
401 M. St, SW
Washington, DC 20460
Ph: (202) 260-7937 Fax: (202) 401-4544
E-mail: hisel-mccoy.sara@epa.gov
-------
Hite, Robert L.
Manager
Illinois EPA
2309 W. Main St, Suite 116
Marion, IL 62959
Ph: (618) 993-7200 Fax: (618) 997-1281
E-mail: Robert.Hite@epa.state.il.us
Holland, Therese A.
Limnologist
Illinois EPA
1021 N. Grand Ave East
PO Box 19276
Springfield, IL 62794
Ph: (217) 782-3362 Fax: (217) 785-1225
E-mail: teri.holland@epa.state.il.us
Holly, Evelyn
Senior Chemist
Quality by Design
1438 W. Broadway Rd, Suite B230
Tempe, AZ 85282
Ph: (480) 967-2380 Fax: (480) 967-2381
E-mail: qbdphx@gte.net
Howell, Faye M.
Assistant to Director
U.S. EPA
1200 Pennsylvania Ave, NW, (7502C)
Washington, DC 20460
Ph: (703) 305-5462 Fax: (703) 305-5112
E-mail: HOWELL.FAYE@EPA.GOV
Hughes, Barbara A.
QA Coordinator
U.S. EPA-OECA-Office of Criminal Enforcement,
Forensics, and Training
PO Box 25227, Bldg 53-Denver Federal Center
Denver, CO 80225
Ph: (303) 236-6116 Fax: (303) 236-5116
E-mail: hughes.barbara@epa.gov
Hughes, Thomas J.
QA Manager
NHEERL, MD-66
U.S. EPA
Research Triangle Park, NC 27711
Ph: (919) 541-7644 Fax: (919) 541-4284
E-mail: hughes.thomas@epa.gov
Hull, Kevin
Manager
Neptune & Company, Inc.
4600A Montgomery Blvd NE, Suite 100
Albuquerque, NM 87109
Ph: (505) 884-8455 Fax: (505) 884-8475
E-mail: khull@neptuneandco.com
Hunike, Elizabeth T.
QA Specialist
NERL, MD-46
79 Alexander Dr.,
Research Triangle Park, NC 27711
Ph: (919) 541-3737 Fax: (919) 541-1153
E-mail: Hunike.Elizabeth@epa.gov
Hunt, Margo E.
Microbiologist
U.S. EPA
1200 Pennsylvania Ave, NW (2811R)
Washington, DC 20460
Ph: (202) 564-6457 Fax: (202) 565-2441
E-mail: hunt.margo@epa.gov
Ingersoll, William S.
Chemist
United States Navy
1661 Redbank Rd
Goose Creek, SC 29445
Ph: (843) 764-7337 Fax: (843) 764-7360
E-mail: ingersollws@navsea.navy.mil
Ishida, Dwayne
Project Chemist
IT Group
3347 Michelson Dr #200
Irvine, CA 92612
Ph: (949) 660-7561 Fax: (949) 475-5433
E-mail: dishida@theitgroup.com
Jackson, Terry
QA Officer
Calif. Dept. Of Food and Ag., Center for Analytical
Chemistry
3292 Meadowview Rd
Sacramento, CA 95832
Ph: (916) 262-1498 Fax: (916) 262-1572
E-mail: tjackson@cdfa.ca.gov
-------
Jarvis, Christina
Env. Protection Specialist
U.S. EPA
1200 Pennsylvania Ave, NW (7509C)
Washington, DC 20460
Ph: (703) 305-0312
E-mail: jarvis.christina@epa.gov
Johnson, Don
QA Manager
U.S. EPA, Region 6
1445 Ross Ave
Dallas, TX 75202
Ph: (214) 665-8343 Fax: (214) 665-8072
E-mail: johnson.donald@epa.gov
Johnson, Gary L.
Environmental Engineer
U.S. EPA
Mail Drop 34
Research Triangle Park, NC 27711
Ph: (919) 541-7612 Fax: (919) 541-7670
E-mail: johnson.gary@epa.gov
Johnson, Lora
Director of Quality
U.S. EPA/NERL
26 W. Martin Luther King Dr
Cincinnati, OH 45220
Ph: (513) 569-7299 Fax: (513) 569-7424
E-mail: johnson.lora@epa.gov
Johnson, Michael S.
Chemist/IT Coordinator
OSWER
U.S. EPA
1200 Pennsylvania Ave, NW (5204G)
Ariel Rios Building
Washington, DC 20460
Ph: (703) 603-0266 Fax: (703) 603-9112
E-mail: johnson.michaels@epa.gov
Jojokian, Jack
Ecologist
Office of Site Remediation Enforcement
1200 Pennsylvania Ave, NW (2273A)
Washington, DC 20460
Ph: (202) 564-6058 Fax: (202) 564-0074
E-mail: Jojokian.Jack@EPA.Gov
Jones, Monica D.
Manager
U.S. EPA, Region 3
701 Mapes Rd
Fort Meade, MD 20755
Ph: (410) 305-2747 Fax: (410) 305-3095
E-mail: jones.monica@epa.gov
Jones, Marguerite E.
Senior QA Analyst
DynCorp I&ET, Inc.
6101 Stevenson Ave
Alexandria, VA 22304
Ph: (703) 461-2247 Fax: (703) 461-2020
E-mail: maggie.jones@dyncorp.com
Jordan, James E.
QA Coordinator
Department of Environmental Quality
122 West 25th St
Cheyenne, WY 82002
Ph: (307) 777-7352 Fax: (307) 777-5616
E-mail: jjorda@missc.state.wy.us
Joslyn, Jim
QA Coordinator
Minnesota Pollution Control Agency
520 Lafayette Rd, N.
St. Paul, MN 55155
Ph: (651) 296-7387 Fax: (651) 297-0319
E-mail: james.joslyn@pca.state.mn.us
Juehnert, Jane A.
Lead Chemist
TffiSE
7701 Arnold Ave, Suite 204
Tinker AFB, OK 73145
Ph: (405) 736-5871 Fax: (405) 734-7076
E-mail: jane.kuehnert@tinker.af.mil
Jupp, Marilyn
Environmental Scientist
U.S. EPA, Region 5
77 W. Jackson Blvd
Chicago, IL 60604
Ph: (312) 353-5882 Fax: (312) 353-4342
E-mail: jupp.marilyn@epa.gov
-------
Kanaan, Muhannad
DynCorp I&ET, Inc.
6101 Stevenson Ave
Alexandria, VA 22304
Kantor, Edward
Chemist
NERL/HEADS/HERB
U.S. EPA
944 East Harmon Ave
Las Vegas, NV 89119
Ph: (702) 798-2690 Fax: (702) 798-2261
E-mail: kantor.edward@epa.gov
Kantz, Marcus E.
Air and Water QA Team Leader
U.S. EPA, Region 2
2890 Woodbridge Ave
Edison, NJ 08837
Ph: (732) 321-6690 Fax: (732) 321-6616
E-mail: kantz.marcus@epa.gov
Karnauskas, Joan M.
U.S. EPA, Region 5
77 West Jackson Blvd
Chicago, IL 60604
Ph: (312) 886-6090 Fax: (312) 886-0168
E-mail: Karnauskas.Joan@EPA.gov
Keatley, Aaron
Internal Policy Analyst
Environmental Protection
14 Reilly Rd
Frankfort, KY 40601
Ph: (502) 564-2150 Fax: (502) 564-4245
E-mail: Aaron.Keatley@mail.state.ky.us
Keim, Steve
DPRA. Inc.
BOON. 17th St, Suite 950
Arlington, VA 22209
Ph: (703) 841-8041 Fax: (703) 524-9415
E-mail: skeim@dpra.com
Kelly, Marion
Office of Science and Technology, Office of Water
1200 Pennsylvania Ave, NW (4303A)
Ariel Rios Bldg
Washington, DC 20460
Ph: (202) 260-7117 Fax: (202) 260-7185
E-mail: Kelly.Marion@epamail.epa.gov
Kendall, Ronald
QA Officer
U.S. EPA
4400 Brookside Dr
Alexandria, VA 22312
Ph: (703) 305-5561 Fax: (703) 308-3259
E-mail: Kendall.Ron@epa.gov
Kenton, Roger
Senior Chemist
Eastman Chemical Company
PO Box 7444
Longview, TX 75607
Ph: (903) 237-6882 Fax: (903) 237-6395
E-mail: rogerk@eastman.com
King, Judy
Environmental Technologist
Environmental Protection
14 Reilly Road
Frankfort, KY 40601
Ph: (502) 564-3410 Fax:(502)564-5105
E-mail: Judy.King@mail.state.ky.us
Kirchmer, Cliff J.
QA Officer
Department of Ecology
PO Box 47710
Olympia, WA 98504
Ph: (360) 407-6455 Fax: (360) 407-6884
E-mail: ckir461@ecy.wa.gov
Kitchens, James L.
QA Officer
NERL
U.S. EPA
960 College Station Rd
Athens, GA 30605
Ph: (706) 355-8043 Fax: (706) 355-8068
E-mail: kitchens.james@epa.gov
Kleveno, Conrad O.
Senior Environmental Scientist
DynCorp I&ET, Inc.
6101 Stevenson Ave
Alexandria, VA 22304
Ph: (703) 461-2287 Fax: (703) 461-2020
E-mail: Conrad.Kleveno@dyncorp.com
-------
Koran, Dave
Chemist
U.S. Army Corps of Engineers
441 G St, NW
Washington, DC 20314
Ph: (202) 761-4989 Fax: (202) 761-4891
E-mail: david.koran@usace.army.mil
Kuo, Feng-Chao
Engineer
Formosa Plastics Corporation, Texas
201 Formosa Dr.
Point Comfort, TX 77978
Ph: (361) 987-7575 Fax: (361) 987-7487
E-mail: chaokuo@ftpc.fpcusa.com
Lafornara, Pat
Environmental Scientist
U.S. EPA
2890 Woodbridge Ave (MS-104)
Edison, NJ 08837
Ph: (732) 906-6988 Fax: (732) 321-6640
E-mail: lafornara.patricia@epa.gov
Lataille, Moira M.
Quality System Team Leader
U.S. EPA, Region 1
60 Westview St
Lexington, MA 02421
Ph: (781) 860-4312 Fax: (781) 860-4397
E-mail: lataille.moira@epa.gov
Levin, Ida
QA Team Leader/Superfund
U.S. EPA, Region 5
77 W. Jackson
Chicago, IL 60604
Ph: (312) 886-6254 Fax: (312) 353-9281
E-mail: LEVIN.IDA@EPA.GOV
LiVolsi, Joseph
QA Officer
U.S. EPA
27 Tarzwell Dr
Narragansett, RI 02882
Ph: (401) 782-3163 Fax: (401) 782-3030
E-mail: livolsi.joseph@epamail.epa.gov
Lombard, Stewart
Program QA Coordinator
Washington State Department of Ecology
PO Box 488
Manchester, WA 98353
Ph: (360) 895-6148 Fax: (360) 895-6180
E-mail: slom461@ecy.wa.gov
Lopez-Luna, Sergio
HWSB
2890 Woodbridge Ave
Edison, NJ 08837
Ph: (732) 321-6778 Fax: (732) 321-6622
E-mail: lopez.sergio@epa.gov
Lumpkin, Mary S.
Technician, Physical Scientist
U.S. EPA
79 Alexander Dr (MD-44)
Research Triangle Park, NC 27711
Ph: (919) 541-4292 Fax: (919) 541-3527
E-mail: lumpkin.susan@epa.gov
MacMillan, Denise
QA Officer
Engineer Research and Development Center
420 S. 18th St
Omaha, NE 68102
Ph: (402) 444-4304 Fax: (402) 341-3848
E-mail: denise.k.macmillan@nwo02.usace.army.mil
Madding, Caroline
QA Manager
OGWDW
U.S. EPA
26 W. Martin Luther King Dr
Cincinnati, OH 45268
Ph: (513) 569-7402 Fax: (513) 569-7191
E-mail: madding.caroline@epa.gov
Maher, Iwona L.
Chemist
U.S. EPA
1200 Pennsylvania Ave, NW (7507C)
Ariel Rios Building
Washington, DC 20460
Ph: (703) 605-0569 Fax: (703) 305-6309
E-mail: Maher.Iwona@epamail.epa.gov
-------
Malak, Sami
Chemist
OPP/EPA
1200 Pennsylvania Ave, NW (7505C)
Ariel Rios Building
Washington, DC 20460
Ph: (703) 308-9365 Fax: (703) 308-9382
E-mail: malak.sami@epa.gov
Marcellin, Verline
QA/QC Officer
Department of Planning and Natural Resources
1118 Watergut Homes
Christiansted, St Croix, VI 00820
Ph: (340) 773-0565 Fax: (340) 773-9310
E-mail: marv@viaccess.net
Martinson, John
QA Manager
U.S. EPA
26 W. Martin Luther King Dr
Cincinnati, OH 45268
Ph: (513) 569-7564 Fax: (513) 569-7424
E-mail: Martinson.John@epa.gov
Martz, Robert F.
Senior Scientist
Eastern Research Group
PO Box 2010
Morrisville, NC 27560
Ph: (919) 468-7927 Fax: (919) 468-7803
E-mail: rmartz@erg.com
Matthews, Wanda
National Exposure Research Laboratory
79 Alexander Dr.
Research Triangle Park, NC 27710
Ph: (919) 541-3491 Fax: (919) 541-3527
E-mail: matthews.wanda@epa.gov
Maxwell, Doris L.
Management Analyst
OAR/OAQPS, MD-13
U.S. EPA
Research Triangle Park, NC 27711
Ph: (919) 541-5312 Fax: (919) 541-0072
E-mail: maxwell.doris@epa.gov
McFarlane, Craig
QA Manager
NHEERL
200 S.W. 35th St
Corvallis, OR 97333
Ph: (541) 754-4670 Fax: (541) 754-4799
E-mail: craigm@mail.cor.epa.gov
McKenna, Dennis
QA Manager/Advisor
Illinois Department of Agriculture
PO Box 19281
Springfield, IL 62794
Ph: (217) 785-4723 Fax: (217) 524-4882
E-mail: dmckenna@agr.state.il.us
McLean, Fred
Chemist
Navy Laboratory Programs Office
1661 Red Bank Road
Goose Creek, SC 29445
Ph: (843) 764-7337 Fax: (843) 764-7360
E-mail: mcleanfs@navsea.navy.mil
McMillan, Dave
Manager, Ground Water
U.S. EPA
1021 N. Grand Ave East, PO Box 19276
Springfield, IL 62794
Ph: (217) 524-8111 Fax: (217) 782-0075
E-mail: epa3189@EPA.STATE.IL.US
Medrano, Tony
Director, QA Programs
U.S. EPA, Region 8
999 18th St
Denver, CO 80202
Ph: (303) 312-6336 Fax: (303) 312-7828
E-mail: medrano.tony@epa.gov
Mees, William M.
U.S. EPA
26 W. Martin Luther King Dr
Cincinnati, OH 45268
E-mail: mees.w@epa.gov
Michaels, Daniel
Vice President
Neptune & Company, Inc.
1505 15th St, Suite B
Los Alamos, NM 87544
Ph: (505) 662-2121 Fax: (505) 662-0500
E-mail: dmichael@neptuneandco.com
-------
Moore, James C.
QA Manager/Statistician
NHEERL
U.S. EPA
1 Sabine Island Dr
Gulf Breeze, FL 32561
Ph: (850) 934-9236 Fax: (850) 934-9201
E-mail: moore.jim@epa.gov
Moore, Marlene
Advanced Systems, Inc.
Quality Systems Design, Development, Auditing, and
Cost Control
PO Box 8032
Newark, DE 19714
Ph. 1-302-834-9796
mmoore@advancedsys.com
Mosley, Robert E.
Environmental Specialist
U.S. EPA
944 East Harmon Ave
Las Vegas, NV 89119
Ph: (702) 798-2259 Fax: (702) 798-2375
E-mail: Mosley.Robert@epa.gov
Murray, Robert
U.S. DOE
Germantown, MD
Ph: (301) 903-5984 Fax: (301) 903-7613
E-mail: robert.murray@em.doe.gov
Myers, Jeffrey C.
Geoscientist
Washington Group Intl.
10822 W. Toller Dr
Littleton, CO 80127
Ph: (303) 948-4678
E-mail: jeff.myers@wgint.com
Neal, Jill
Environmental Engineer
U.S. EPA
26 West Martin Luther King Dr
Cincinnati, OH 45268
Ph: (513) 569-7277 Fax: (513) 569-7185
E-mail: neal.jill@epa.gov
Nguyen, Thuy L.
Chemist
U.S. EPA/OPP
1200 Pennsylvania Ave, NW (7507C)
Washington, DC 20460
Ph: (703) 605-0562
E-mail: nguyen.thuy@epa.gov
Nichols, Robert E.
U.S. EPA
901 N. 5th St
Kansas City, KS 66101
Ph: (913) 551-7195 Fax: (913) 551-9195
E-mail: Nichols.Robert@EPA.GOV
Nixon Cook, Brenda
Coordinator
U.S. EPA, Region 6
1445 Ross Ave
Dallas, TX 75202
Ph: (214) 665-7436 Fax: (214) 665-7447
E-mail: cook.brenda@epa.gov
Nolan, Melvin
QA Officer/Env. Scientist
National Center for Environmental Assessment
1200 Pennsylvania Ave, NW (8601D)
Washington, DC 20460
Ph: (202) 564-3354 Fax: (202) 565-0061
E-mail: nolan.melvin@epa.gov
Ogg, Clayton
Economist
OPEI
401 M St, SW
Washington, DC 20460
Ph: (202) 260-6351 Fax: (202) 260-0290
E-mail: ogg.clayton@epa.gov
Ohta, Ron
QA Chemist
Tetra Tech EM Inc.
10670 White Rock Rd, Suite 100
Rancho Cordova, CA 95670
Ph: (916) 853-4506 Fax: (630) 604-9055
E-mail: ohtar@ttemi.com
-------
Olivero, Ramon
Program Manager
Lockheed Martin
100 Capitola Dr., Suite 111
Durham, NC 27502
Ph: (919) 572-2764 Fax: (919) 572-2765
E-mail: ramon.olivero@lmco.com
Olson, Donald M.
Env. Scientist, QA Officer
OECA, Office of Regulatory Enforcement
1200 Pennsylvania Ave, NW (2248A)
Washington, DC 20460
Ph: (202) 564-5558 Fax: (202) 564-0010
E-mail: olson.don@epa.gov
Osenbaugh, Ruthann
QA Manager
Indiana Department of Environmental Management
100 North Senate Ave
Indianapolis, IN 46204
Ph: (317) 234-1627 Fax: (317) 233-6647
E-mail: rosenbau@dem.state.in.us
Ostrodka, Steve
Chief, Field Services Section
U.S. EPA, Region 5
77 W. Jackson Blvd
Chicago, IL 60604
Ph: (312) 886-3011 Fax: (312) 353-9281
E-mail: OSTRODKA.STEPHEN@EPA.GOV
Owens, James
NRMRL
26 W. Martin Luther King Dr
Cincinnati, OH 45268
Ph: (513) 569-7235 Fax: (513) 569-7327
E-mail: owens.jim@epa.gov
Owens, Jr., James E.
QA Officer
South Carolina Dept. of Health and Environmental
Control
8231 Parklane Rd
Columbia, SC 29223
Ph: (803) 896-0981 Fax: (803) 896-0980
E-mail: owensje@columb36.dhec.state.sc.us
Parry, Nan
QA Manager
National Center for Environmental Research
U.S. EPA
1200 Pennsylvania Ave, NW (8721R)
Washington, DC 20460
Ph: (202) 564-6859 Fax: (202) 565-2444
E-mail: parry.nan@epa.gov
Perelli, Vincent R.
Senior Planner and QA Manager
New Hampshire Department of Environmental Services
6 Hazen Dr, PO Box 95
Concord, NH 03302
Ph: (603) 271-8989 Fax: (603) 271-2867
E-mail: vperelli@des.state.nh.us
Perkins, Edwin E.
Environmental Chemist in
NYS Department of Environmental Conservation
50 Wolf Rd, Rm 211
Albany, NY 12233
Ph: (518) 457-9262 Fax: (518) 485-7733
E-mail: eeperkin@gw.dec.state.ny.us
Phagan, Carl J.
Quality Manager, Env. Monitoring
Pantex Plant
PO Box 30020
Amarillo, TX 79120
Ph: (806) 477-5481 Fax: (806) 477-5641
E-mail: cphagan@pantex.com
Pham, Howard
Chemist
U.S. EPA, Region 5
77 W. Jackson Blvd, SM-5J
Chicago, IL 60618
Ph: (312) 353-2310 Fax: (312) 886-0186
E-mail: pham.howard@epa.gov
Ponder, Wade
Chief, Technical Services Branch
National Risk Management Research Laboratory
86 TW Alexander Dr
Durham, NC 27711
Ph: (919) 541-2818 Fax: (919) 541-0496
E-mail: ponder.wade@epa.gov
-------
Prince, Cheryl D.
Quality Processes Manager
IT Corporation
PO Box 93838
Las Vegas, NV 89193
Ph: (702) 295-1986 Fax: (702) 295-2025
E-mail: cprince_it@nv.doe.gov
Pulsipher, Brent
Stats Technical Resource Mngr
Pacific Northwest National Laboratory
PO Box 999, MS K5-12
Richland, WA 99352
Ph: (509) 375-3989 Fax: (509) 375-2604
E-mail: brent.pulsipher@pnl.gov
Rarajan, Sunda
Environmental QA/QC
Indian Institute of Technology
Chennai
Tamil Nadu, India 600 036
Ph: (044) 243-4285 Fax: (044) 235-3686
E-mail: srajan@niot.ernet.in
Ratcliff, Betty
Chemist II
Indiana Department of Environmental Management
PO Box 6015
Indianapolis, IN 46206
Ph: (317) 308-3182 Fax: (317) 308-3219
E-mail: bratclif@dem.state.in.us
Ray, William
QA Program Manager
CA State Water Resources Control Board
1001 I St
PO Box 944213
Sacramento, CA 94244
Ph: (916) 341-5583 Fax: (916) 341-5584
E-mail: rayb@dwq.swrcb.ca.gov
Ray, Mike
QA Manager
NHEERL
U.S. EPA
86 Alexander Drive
Research Triangle Park, NC 27711
Ph: (919) 966-0625 Fax: (919) 966-6212
E-mail: ray.mike@epa.gov
Reece, Debbie
Senior Chemist
Marasco Newton Group
2801 Clarendon Blvd
Suite 100
Arlington, VA 22201
Ph: (703) 284-9469 Fax: (703) 516-9108
E-mail: dreece@marasconewton.com
Renard, Esperanza P.
Chem. Engineer/Env. Scientist
U.S. EPA
2890 Woodbridge Ave (MS-104)
Edison, NJ 08837
Ph: (732) 321-4355 Fax: (732) 321-6640
E-mail: Renard.Esperanza@epa.gov
Reynolds, Eric S.
CLP QA Coordinator
Office of Emergency and Remedial Response
Ariel Rios Building (5204G)
1200 Pennsylvania Ave, NW
Washington, DC 20460
Ph: (703) 603-9928 Fax: (703) 603-9112
E-mail: reynolds.eric@epa.gov
Rhotenberry, William
Superfund Site Assessment Mngr
U.S. EPA, Region 6
1445 Ross Ave, Suite 1200
Dallas, TX 75202
Ph: (214) 665-8372 Fax: (214) 665-7447
E-mail: rhotenberry.william@epa.gov
Riley, Clyde
Env. Protection Specialist
OAQPS
U.S. EPA
2300 Southern Dr
Durham, NC 27703
Ph: (919) 541-5239 Fax: (919) 541-1039
E-mail: riley.gene@epa.gov
Robeen, Joe
Info. Services Specialist
Illinois EPA
1021 N. Grand Ave East
Springfield, IL 62794
Ph: (217) 524-8116 Fax: (217) 782-0075
E-mail: Joe.Robeen@epa.state.il.us
-------
Rogers, Ron
QA Manager
NHEERL
U.S. EPA
86 TW Alexander Dr., MD-68
Research Triangle Park, NC 27711
Ph: (919) 541-2370 Fax: (919) 541-0297
E-mail: rogers.ron@epa.gov
Rogers, Kevin
Env. Protection Geologist
Illinois Dept of Agriculture
PO Box 19281, State Fairgrounds
Springfield, IL 62794
Ph: (217) 524-0542 Fax: (217) 524-4882
E-mail: krogers@agr.state.il.us
Romig, Randall
Environmental Scientist
U.S. EPA, Region 6
1445 Ross Ave, (6WQ-D)
Dallas, TX 75202
Ph: (214) 665-8346 Fax: (214) 665-6490
E-mail: romig.randall@epa.gov
Ruggles, Lee
Tech. Services Branch Manager
KY State Government
100 Sower Blvd, Suite 104
Frankfort, KY 40601
Ph: (502) 564-6120 Fax: (502) 564-8930
E-mail: Lee.Ruggles@mail.state.ky.us
Runyon, Robert M.
Chief, Hazardous Waste Support
U.S. EPA, Region 2
2890 Woodbridge Ave
Edison, NJ 08837
Ph: (732) 321-6645 Fax: (732) 906-6824
E-mail: runyon.robert@epa.gov
Ryals, Kim
Software Engineer
Accelerated Technology Laboratories, Inc.
496 Holly Grove School Rd
West End, NC 27376
Ph: 1.800.565-5467
E-mail: KRyals@atlab.com
Rygwelski, Kenneth R.
Environmental Scientist
U.S. EPA
9311 Groh Rd
Grosse Ile, MI 48138
Ph: (734) 692-7641 Fax: (734) 692-7603
E-mail: rygwelski.kenneth@epa.gov
Sakamoto, Roseanne
Environmental Scientist
U.S. EPA
75 Hawthorne St
San Francisco, CA 94105
Ph: (415) 744-1535 Fax: (415) 744-1476
E-mail: sakamoto.roseanne@epa.gov
Sample, Jackie
Manager, Navy Env. Lab.
U.S. Navy
1661 Redbank Rd
Charleston, SC 29445
Ph: (843) 764-7337 Fax: (843) 764-7360
E-mail: samplejs@navsea.navy.mil
Santillan, Javier
U.S. Air Force
AFCEE, 3207 North Road
Brooks AFB
San Antonio, TX 78235-5363
Ph: (210) 536-4329
Javier. Santillan@HQAFCEE.brooks.af.mil
Sassoon, Richard
Science Applications International Corp.
Germantown, MD
Saynuk, Marcella A.
Saynuk Scientific Incorporated
9531 Caboose Court
Columbia, MD 21045
E-mail: msaynuk@erols.com
Schneider, Mary
Program QC Manager
Foster Wheeler Environmental Corp.
779 S. Leyland Dr
Diamond Bar, CA 91765
Ph: (562) 598-6150 Fax: (562) 596-8498
E-mail: mschneider@fwenc.com
-------
Schofield, Judy
Senior Environmental Scientist
DynCorp I&ET, Inc.
6101 Stevenson Ave
Alexandria, VA 22304
Ph: (703) 461-2027 Fax: (703) 461-8056
E-mail: judy.schofield@dyncorp.com
Schroder, LeRoy
Hydrologist
U.S. Geological Survey
PO Box 25046, MS 401
Lakewood, CO 80225
Ph: (303) 236-1871 Fax: (303) 236-1880
E-mail: schroder@usgs.gov
Schuchardt, Mel
Air Laboratory Manager
Illinois EPA
1021 North Grand Ave East
Springfield, IL 62702
Ph: (217) 782-9281 Fax: (217) 557-4233
E-mail: melvin.schuchardt@epa.state.il.us
Schultz, Mary Ellen
Environmental Scientist
U.S. EPA, Region 3
701 Mapes Rd
Fort Meade, MD 20755
Ph: (410) 305-2746 Fax: (410) 305-3095
E-mail: schultz.maryellen@epa.gov
Schupp, George
QA Coordinator
U.S. EPA, Region 5
77 W. Jackson Blvd, ML-10C
Chicago, IL 60604
Ph: (312) 353-1226 Fax: (312) 886-2591
E-mail: schupp.george@epa.gov
Seibert, Tom
Environmental Technologist
Environmental Protection
14 Reilly Road
Frankfort, KY 40601
Ph: (502) 564-6716 Fax: (502) 564-4049
E-mail: Tom.Seibert@mail.state.ky.us
Seith, William D.
Deputy Director
Illinois EPA
1021 N. Grand Ave East
PO Box 19276
Springfield, IL 62794
Ph: (217) 557-7824 Fax: (217) 524-3336
E-mail: epa8851@epa.state.il.us
Sellers, Charles
QA Manager
EMRAD
1200 Pennsylvania Ave, NW (5307W)
Washington, DC 20460
Ph: (703) 308-0504 Fax: (703) 308-0509
E-mail: sellers.charles@epa.gov
She, Jianwen
California EPA
Hazardous Materials Laboratory, Department of Toxic
Substances Control, CalEPA
2151 Berkeley Way, Berkeley, CA 94704
Ph: (510) 540-2680
E-mail: Jianwen@aol.com
Short, Matthew
Illinois EPA
4500 S. 6th Street Rd
Springfield, IL 62706
Ph: (217) 786-6892
E-mail: matt.short@epa.state.il.us
Siders, Scott
Divisional QA Officer
Illinois EPA
1021 N Grand Ave East
Springfield, IL 62794
Ph: (217) 785-5163 Fax: (217) 524-0944
E-mail: scott.siders@epa.state.il.us
Siegelman, Frederic
Chemist
U.S. EPA
1200 Pennsylvania Ave, (2811R)
Washington, DC 20460
Ph: (202) 564-5173 Fax: (202) 565-2441
E-mail: siegelman.frederic@epa.gov
Sikes, John A.
QA Specialist
U.S. Army Engineering and Support Center
PO Box 1600
Huntsville, AL 35807
Ph: (256) 895-1334 Fax: (256) 722-8709
E-mail: John.A.Sikes@hnd01.usace.army.mil
Sims, Diann
Environmental Scientist
U.S. EPA
1200 Pennsylvania Ave, NW (2811R)
Washington, DC 20460
Ph: (202) 564-6872 Fax: (202) 565-2441
E-mail: sims.diann@epa.gov
Solsky, Joseph F.
Chemist
U.S. Army Corps of Engineers
12565 W Center Rd
Omaha, NE 68144
Ph: (402) 697-2573 Fax: (402) 697-2595
E-mail: Joseph.F.Solsky@usace.army.mil
Sorbet, Elaine S.
Laboratory Manager
LA Department of Environmental Quality
8618 GSRI
Baton Rouge, LA 70810
Ph: (225) 765-2406 Fax: (225) 765-2408
E-mail: elaines@deq.state.la.us
Soto, Alejandro D.
EMAS Administrator
Guam Environmental Protection Agency
PO Box 22439 GMF
Barrigada, GU 96921
Ph: (671) 475-1650 Fax: (671) 477-9402
E-mail: alexsoto@ite.net
Splichal, Laura
CDM Federal Programs Corp.
9200 Ward Parkway, Suite 500
Kansas City, MO 64114
Ph: (816) 444-3600 Fax: (816) 523-2600
E-mail: splichalll@cdm.com
Stevens, Shari L.
Chief, Hazardous Waste Support
U.S. EPA, Region 2
2890 Woodbridge Ave, Bldg 209
Edison, NJ 08837
Ph: (732) 906-6994 Fax: (732) 321-6622
E-mail: stevens.shari@epa.gov
Stevens, Carvin D.
QA Specialist
NERL
79 TW Alexander Dr
Research Triangle Park, NC 27711
Ph: (919) 541-1515 Fax: (919) 541-4368
E-mail: STEVENS.CARVIN@EPA.GOV
Strobel, Charles
Research Biologist
U.S. EPA
27 Tarzwell Dr
Narragansett, RI 02882
Ph: (401) 782-3180 Fax: (401) 782-3030
E-mail: strobel.charles@epa.gov
Sudduth, Bill
Supervisor, Env. Control
Environmental Protection
14 Reilly Road
Frankfort, KY 40601
Ph: (502) 573-3382 Fax: (502) 573-3787
E-mail: Bill.Sudduth@mail.state.ky.us
Sy, William
U.S. EPA
2890 Woodbridge Ave
Edison, NJ 08837
Ph: (732) 632-4766
E-mail: sy.william@epa.gov
Szaro, Deborah A.
Quality System Team Leader
U.S. EPA, Region 1
60 Westview Street
Lexington, MA 02421
Ph: (781) 860-4312 Fax: (781) 860-4397
E-mail: szaro.deb@epa.gov
Szymanski, Cynthia
Microbiologist/QA Officer
Office of Pesticide Programs
1200 Pennsylvania Ave, NW, Ariel Rios Building
Washington, DC 20004
Ph: (703) 308-8191 Fax: (703) 308-8091
E-mail: Szymanski.Cynthia@epa.gov
Tarquino, John
Quality Manager
PA EPA, Office of Information Technology
400 Market St, PO Box 8761
Harrisburg, PA 17105
Ph: (717) 772-5838 Fax: (717) 772-1676
E-mail: jtarquino@state.pa.us
Taylor, David R.
U.S. EPA, Region 9
75 Hawthorne Street, PMD-3
San Francisco, CA 94105
Ph: (415) 744-1497 Fax: (415) 744-1476
E-mail: Taylor.David@epa.gov
Telliard, William A.
Director, Analytical Methods Staff
Office of Science and Technology, Office of Water
1200 Pennsylvania Ave, NW, (4303)
Ariel Rios Bldg.
Washington, DC 20460
Ph: (202) 260-7134 Fax: (202) 260-7185
E-mail: Telliard.William@epamail.epa.gov
Thorstenberg, Lisa
State Program Manager
U.S. EPA, Region 5
77 W. Jackson, WS-15J
Chicago, IL 60657
Ph: (312) 353-1938 Fax: (312) 886-0168
E-mail: thorstenberg.lisa@epa.gov
Tindall, Sebastian
Bechtel Hanford Inc.
3350 George Washington Way, MS: H0-02
Richland, WA 99352
Ph: (509) 372-9195 Fax: (509) 372-9718
E-mail: sctindal@bhi-erc.com
Topper, Martin, Ph.D.
Quality Manager
U.S. EPA/OCEFT
1200 Pennsylvania Ave, NW (2231 A)
Washington, DC 20004
Ph: (202) 564-2564 Fax: (202) 501-0271
E-mail: Topper.Martin@epa.gov
Tsai, Cheng-Wen
Chemist/QA Expert
U.S. EPA, Region 5
77 West Jackson Blvd (M-9J)
Chicago, IL 60604
Ph: (312) 886-6234 Fax: (312) 353-4342
E-mail: tsai.cheng-wen@epa.gov
Tsibris, Evangeline
OEI
U.S. EPA
1200 Pennsylvania Ave, NW (2842A)
Ariel Rios Building
Washington, DC 20460
Ph: (202) 260-1655 Fax: (202) 401-1617
E-mail: tsibris.evangeline@epa.gov
Turkovich, Sydney
QA Officer
Michigan Department of Agriculture
1615 S. Harrison Rd
E. Lansing, MI 48823
Ph: (517) 337-5098 Fax: (517) 337-5094
E-mail: turkovichs@state.mi.us
Vandergrift, Steve
QA Manager
ORD
PO Box 1198
Ada, OK 74820
Ph: (580) 436-8684 Fax: (580) 436-8528
E-mail: vandegrift.steve@epa.gov
Vassmer, Mark
QA Manager
Illinois Department of Public Health
525 W. Jefferson St
Springfield, IL 62761
Ph: (217) 785-2043 Fax: (217) 785-0253
E-mail: mvassmer@idph.state.il.us
Vazquez, Rafael E.
Environmental Engineer
HQ AFCEE/ERT
3207 North Rd
Brooks AFB, TX 78235
Ph: (210) 536-1431 Fax: (210) 536-4330
E-mail: Rafael.Vazquez@hqafcee.brooks.af.mil
Vega, Ann
QA Manager
U.S. EPA
26 W. Martin Luther King Dr
Cincinnati, OH 45268
Ph: (513) 569-7635 Fax: (513) 569-7620
E-mail: vega.ann@epa.gov
Verwolf, Mary
U.S. DOE
Idaho Falls, ID 83415
Ph: (208) 526-7001 Fax: (208) 526-2548
E-mail: VERWOLMC@ID.DOE.GOV
VonLanken, Vicky
Adm. Assistant
Illinois EPA
1021 N. Grand Ave, East
Springfield, IL 62794
Ph: (217) 782-7001 Fax: (217) 782-2468
E-mail: vicky.vonlanken@epa.state.il.us
Vroblesky, Don, Ph.D.
U.S. Geological Survey
720 Gracern Rd
Columbia, SC 29210
Ph: (803) 750-6115 Fax: (803) 750-6181
E-mail: vroblesk@usgs.gov
Wagner, Tom
Director of Quality Assurance
NRMRL
26 West Martin Luther King Dr.
Cincinnati, OH 45268
Ph: (513) 569-7013 Fax: (513) 569-7585
E-mail: wagner.tom@epa.gov
Warren, John
Senior Statistician
U.S. EPA
1200 Pennsylvania Ave, NW, (2811R)
Washington, DC 20460
Ph: (202) 564-6876 Fax: (202) 565-2441
E-mail: warren.john@epa.gov
Warren, Julieann
Chief, Superfund Site Evaluation
Missouri Department of Natural Resources
PO Box 176
Jefferson City, MO 65102
Ph: (573) 751-8629 Fax: (573) 751-7869
E-mail: nrwarrj@mail.dnr.state.mo.us
Wehrmann, Pamela
District Chemist
Army Corps of Engineers, Sacramento District
1325 J. St
Sacramento, CA 95814
Ph: (916) 557-6662 Fax: (916) 557-5307
E-mail: pwehrmann@spk.usace.army.mil
Weinstein, Jason
Physical Scientist
U.S. EPA/ORD/NERL
79 Alexander Dr, MD-47
Research Triangle Park, NC 27711
Ph: (919) 541-4207 Fax: (919) 541-4207
E-mail: weinstein.jason@epa.gov
Wentworth, Nancy W.
Director, Quality Staff
U.S. EPA
1200 Pennsylvania Ave, NW (2811R)
Washington, DC 20460
Ph: (202) 564-6830 Fax: (202) 565-2441
E-mail: wentworth.nancy@epa.gov
Wetherell, Will
Department of Natural Resources
2710 West Main
Jefferson City, MO 65109
E-mail: nrwethw@mail.dnr.state.mo.us
Willis, Craig
PE, CVS
Booz Allen and Hamilton
10999 Metcalf Avenue
Overland Park, KS 66210
Ph: (913) 906-9076 Fax: (913) 906-9018
E-mail: willis_craig@bah.com
Wisdom, Mary
QA Coordinator
NAREL
540 South Morris Ave
Montgomery, AL 36115
Ph: (334) 270-3476 Fax: (334) 270-3454
E-mail: wisdom.mary@epa.gov
Woods, Bruce A., Ph.D.
Chemist
U.S. EPA
1200 Sixth Ave, OEA-095
Seattle, WA 98101
Ph: (206) 553-1193 Fax: (206) 553-8210
E-mail: woods.bruce@epa.gov
Worthington, Jeff
Director of Quality Assurance
OEI
U.S. EPA
1200 Pennsylvania Ave, NW, (2812A)
Washington, DC 20460
Ph: (202) 564-5174 Fax: (202) 501-1718
E-mail: Worthington.Jeffrey@epa.gov
Wright, Dallas, Jr.
QA Officer
OPPTS/OPP/BEAD
U.S. EPA
701 Mapes Rd
Ft. Meade, MD 20755
Ph: (410) 305-2909 Fax: (410) 305-3091
E-mail: wright.dallas@epa.gov
Wu, Chieh
Environmental Engineer
U.S. EPA
Ariel Rios Bldg
1200 Pennsylvania Ave, NW (8623D)
Washington, DC 20460
Ph: (202) 564-3257 Fax: (202) 565-0076
E-mail: wu.chieh@epa.gov
Yocum, Jessica
Consultant
Marasco Newton Group
2801 Clarendon Blvd
Arlington, VA 22201
Ph: (703) 247-4068 Fax: (703) 516-9108
E-mail: jyocum@marasconewton.com
Young, Brenda
Quality Staff
U.S. EPA
1200 Pennsylvania Ave, NW (2811R)
Washington, DC 20460
Ph: (202) 564-6881 Fax: (202) 565-2441
E-mail: young.brenda@epa.gov
Zaragoza, Larry
Director
Office of Emergency and Remedial Response/OSWER
1200 Pennsylvania Ave, NW, (5202G)
Washington, DC 20460
Ph: (703) 603-8867 Fax: (703) 603-9133
E-mail: Zaragoza.Larry@EPA.gov
Registration as of March 20, 2001
-------
US EPA QUALITY SYSTEM
REQUIREMENTS AND GUIDANCE DOCUMENTS
EPA Quality Management Conference
St Louis, MO - April 2001
This report provides a current listing of requirements and guidance documents developed by the EPA
Quality Staff. These documents support the implementation of the EPA Quality System across the
Agency and the implementation of quality management and quality assurance requirements in applicable
extramural agreements. Document descriptions and downloads are available from the Quality Staff Web
Site at:
www.epa.gov/quality
FINAL DOCUMENTS CURRENTLY AVAILABLE:
QA/R-2 EPA Requirements for Quality Management Plans (EPA/240/B-01/002, March 2001)
QA/R-5 EPA Requirements for Quality Assurance Project Plans (EPA/240/B-01/003, March 2001)
QA/G-4 Guidance for the Data Quality Objectives Process (EPA/600/R-96/055, August 2000)
QA/G-4D* Data Quality Objectives Decision Errors Feasibility Trials (DEFT) Software, V. 4.0
(EPA/600/R-96/056, September 1994)
QA/G-4HW Data Quality Objectives Process for Hazardous Waste Site Investigations
(EPA/600/R-00/007, January 2000)
QA/G-5 Guidance for Quality Assurance Project Plans (EPA/600/R-98/018, February 1998)
QA/G-6* Guidance for the Preparation of Standard Operating Procedures for Quality-Related
Documents (EPA/600/R-96/027, November 1995)
QA/G-7 Guidance on Technical Audits and Related Assessments (EPA/600/R-99/080, January
2000)
QA/G-9 Guidance on Data Quality Assessment: Practical Methods for Data Analysis
(EPA/600/R-96/084, July 2000)
QA/G-9D Data Quality Evaluation Statistical Toolbox (DataQUEST) (EPA/600/R-96/085,
December 1997)
QA/G-10 Guidance for Developing a Training Program for Quality Systems (EPA/600/B-00/004,
December 2000)
NOTE: * indicates that the document is undergoing its mandatory 5-year review to determine if it will be
reissued without change, revised, or withdrawn.
-------
DRAFT DOCUMENTS CURRENTLY AVAILABLE:
Overview of the EPA Quality System (Peer Review Draft, September 2000)
QA/G-5S Guidance on Choosing a Sampling Design for Environmental Data Collection
(Peer Review Draft, August 2000)
DOCUMENTS CURRENTLY IN PREPARATION/DRAFTS NOT AVAILABLE:
Overview of the EPA Quality System Management Assessment Program
QA/R-1 EPA Quality Systems Requirements for Environmental Programs
QA/G-4A Guidance on Performance and Acceptance Criteria
QA/G-5M Guidance for Quality Assurance Project Plans for Modeling
QA/G-8 Guidance on Environmental Data Validation and Verification
QA/G-11 Guidance on Quality Assurance for Environmental Technology Design, Construction,
and Operation
-------
INSTRUCTIONS: This questionnaire lists quality systems documents that are under consideration for development by the Quality
Staff. These documents are intended to help organizations implement and manage quality systems. Please indicate how
important each document would be to you or your organization by placing an "x" in the appropriate box. The information provided
here will help the Quality Staff determine where needs for guidance exist and how immediate those needs are. Thank you for your
participation.
Quality Systems Documents (rate each as Unimportant, of Little Importance, Moderately Important, Important, or Very Important):
Quality Management Plans for Laboratories
Guidance for Quality Management Project Plans (Combined QMP and QA Project Plan)
Guidance on Conducting Process Quality Audits
Quality System Requirements for non-EPA Organizations
Guidance on Conducting Performance Evaluations
Guidance on Conducting Technical Systems Audits
Guidance on Implementing Quality Requirements in Contracts, Grants, and Other Extramural Agreements
Other Guidance Documents You Would Like to See Developed
Please drop this survey off at the registration desk along with your conference survey.
-------