ABSTRACTS
      2007 EPA
 ANNUAL CONFERENCE on
MANAGING ENVIRONMENTAL
    QUALITY SYSTEMS

TABLE OF CONTENTS

Wednesday 8:30 AM

Ecological Monitoring and Assessment of the Great Rivers Ecosystems	7

A Strategic Framework for Implementing the EPA Information Quality
Guidelines 	8

Quality Policy Development  	8

Overview of Ambient Air Quality Regulations, Projects and Issues  	9

Target Areas of Childhood Lead Poisoning	9

Wednesday 9:00 AM

River Watershed Partnership Project 	10

Measuring the Benefits of Quality Management Systems  	10

Status and Changes in EPA Infrastructure for Bias Traceability to NIST	11

Test of Hypothesis Using WesVar Regression Models	12

Wednesday 9:30 AM

Pursuit of an International Approach to QA for Environmental
Technology Verification 	13

New Quality System in OPPT's Information Management Division 	13

Stages of Quality	13

Transition to Primary QA Organizations in the Air Quality
System (AQS) 	13

Estimating Populations Around Superfund Sites	14

Wednesday 10:30 AM

QA in Modeling - Design and Implementation of New Tools	15

Strategy for Policy Management to Improve the Quality of
EPA Information 	16

Information Policies and the Information Policy Process	16

I Policies + Q Policies = EPA Product & Service Quality	16

PDCA Strategy for a Policy Management System	16

Getting it Right - Best Practices for Developing an Agency
Quality Glossary	17

Updates and Enhancements to EPA's QA Handbook for Air Pollution
Measurement Systems (Volume IV)  	18

Statistical Methods for Environmental Applications Using Data Sets with Below
Detection Limit Observations  	18

Wednesday 11:00 AM

Creation of the Single Pass Quality Assurance Plan	19

Data Entry  Program for National Performance Audit Program (NPAP) 	20

Handling Nondetects in Contaminant Trend Analysis	20

Wednesday 11:30 AM

Perspectives on Tribal Quality Assurance Training	21

QA Issues for Energy Efficiency Measures at Ambient Air
Monitoring  Stations	21

A Simple Procedure for Estimating Method Quantitation  Limits	22

Wednesday 1:00 PM

EPA Region 2 Approach to Quality Assurance Training	22

A Successful Strategy for Implementing the EPA IQG	23

What the G-5 Rewrite Intends to Accomplish 	23

Progress Made in the G-5 Rewrite 	23

Discussion on the G-5  Rewrite	24

Integration of Statistics,  Remote Sensing and Existing Data to
Locate Changes in Land Resources	24

Wednesday 1:30 PM

Use of Information Technology to Manage Documents and Training	24

The SAS Enterprise Miner: Sequence Analysis, Decision  Trees, and Neural
Networks	25

Wednesday 2:00 PM

APHL as Home Base for State Environmental Laboratories	25

Statistical Support Pilot in the EPA Office of Research
and Development	25

Wednesday 3:00 PM

Ethical Dilemmas:  An On-going Discussion of Challenges Facing
Environmental Laboratories	26

Data Quality versus Information Quality:  Is there a Difference?	27

eQMP Project, Development and Implementation of Electronic Quality
Management Plans (Session)	27

Wednesday 3:30 PM

External Laboratory Audits, Problems and Lessons Learned	28

Wednesday 4:00 PM

Laboratory Data Quality and the Bottom Line	28

Environmental Indicators	29

Thursday 8:30 AM

Transition of NELAC from EPA to TNI	29

Using Data from Diverse Sources - QC Procedures in
Database Management	29

Discipline-specific Standards Provide the Framework for Science
and Information Quality 	30

Science, Analytical and Quality  Management Standards	30

Data, Information, and Technology Standards - Open Discussion
on Standards Implementation	31

Ohio EPA's Division of Emergency and Remedial Response's
Data Verification Review Guidance Tools 	31

Thursday 9:00 AM

TNI National Environmental Laboratory Accreditation Program	32

High Quality Systems Engineering - Using CMMI as a Process
Improvement Framework	33

Thursday 9:30 AM

Laboratory Accreditation System Program	33

Addressing Address Matchers: Geocoding Performance Metrics	34

QA Guides for Improving Data Usability Assessment	34


Thursday 10:30 AM

TNI Proficiency Testing Program	35

The Nexus of Quality Assurance and Information Assurance 	35

Novel Electronic Tools for Improving and Streamlining Quality System
Operation	36

Visual Sample Plan Expert Mentor: An Aid to Correctly
Using VSP 	37

Thursday 11:00 AM

 TNI Consensus Standards Development Program	38


Thursday 11:30 AM

TNI Advocacy Program	38

Wednesday 8:30 AM
Quality Assurance/Quality Control of a project involving Cooperative
Agreements, Intra-agency Agreements, Agency Staff and Contracts to
conduct research; Ecological Monitoring and Assessment of the Great
Rivers Ecosystems in the Central Basin of the United States
Allan R. Batterman,  U.S. EPA ORD/NHEERL

EPA ORD's Environmental Monitoring and Assessment Program (EMAP) is a
long-term research  effort to enable status and trend assessments  of aquatic
ecosystems  across  the  U.S.   This is a national program that depends  on
partnerships with states, other federal  agencies, tribes, and local  groups to
develop the science to inventory our natural resources, report their condition,
and forecast future  risks. From 2004-2006, EPA coordinated field crews from
cooperating state and federal agencies to sample biological organisms, water,
sediment, and habitat in the upper Mississippi, Missouri, and Ohio Rivers.
Sampling of the lower Mississippi is planned for 2007-2009. The ecological
complexities of these rivers provide scientific challenges for the EPA EMAP-
GRE team. Operationally, assessing these rivers is challenged by their trans-
jurisdictional nature, the number of crews needed, and the diversity of
samples collected. From the quality assurance perspective, we will show the planning,
documentation,  and  review  processes  necessary  to  maintain   scientific
assurance that the data  being collected,  analyzed,  and reported on  are
comparable across all parties involved.

Key components of the Quality Assurance process are: 1) uniform sampling
protocols, published as the EMAP-GRE Field Operations Manual; 2) a single
probabilistic survey design; 3) a web-based information management and
sample tracking system; and 4) standard taxonomic/analytical laboratory
procedures. Most of the components were developed in cooperation with
various  partners. All of the components were  reviewed, documented, and
audited  throughout the program.  EMAP-GRE  requires collaborations across
multiple  scientific  disciplines  (including  chemistry,  hydrology, toxicology,
biology,  geography, genomics), as  well  as  skills  in  acquisition, contract
management, project management, and Information Technology (IT).

The EPA EMAP-GRE team coordinates the activities of the  15-20  state and
federal  partners through interagency agreements,  cooperative agreements
and contracts. Approximately 15 field crews were used each  year to collect
and process about  8000 samples from 200 sites. Samples  are dispersed to
nine analytical and taxonomic laboratories. Field  and  lab data are compiled by
the EPA contract IT support staff in  a web-based  database. The team conducts
quality assurance audits of each crew  in the  field and laboratory  to ensure
that established protocols are being  followed. All field data  are repeatedly
reviewed. First, the crew leader reviews all data entries at the end of the
sampling. IT staff reviews the completeness and logic of data as the data are
entered into the database. Field crews compare 100% of the entered  data
with original data.  Finally, EPA staff reviews the data making  inter-crew and
inter-annual  comparisons.  Changes  to the  database are documented  to
preserve lineage. All crews participate in annual training sessions. The EMAP-
GRE team communicates with the partners through a newsletter, conference
calls, a website, "all-hands" emails, and workshops (e.g., Reference Condition,
Indicators). Partners are encouraged to present preliminary results at
scientific meetings.
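
The completeness and logic review described above lends itself to simple
automation. The following Python sketch illustrates the general idea of such
screening; the record fields, plausibility bounds, and values are invented for
illustration and are not the actual EMAP-GRE schema.

RECORDS = [
    {"site_id": "GRE-001", "visit_date": "2005-07-12", "ph": 8.1, "do_mgl": 7.4},
    {"site_id": "GRE-002", "visit_date": "2005-07-13", "ph": None, "do_mgl": 6.9},
    {"site_id": "GRE-003", "visit_date": "2005-07-14", "ph": 13.2, "do_mgl": 8.0},
]

REQUIRED = ("site_id", "visit_date", "ph", "do_mgl")
RANGES = {"ph": (4.0, 11.0), "do_mgl": (0.0, 20.0)}  # assumed river-water bounds

def screen(record):
    """Return a list of QA flags for one field record."""
    flags = []
    for key in REQUIRED:                      # completeness check
        if record.get(key) is None:
            flags.append(f"missing {key}")
    for key, (lo, hi) in RANGES.items():      # logic/range check
        value = record.get(key)
        if value is not None and not lo <= value <= hi:
            flags.append(f"{key}={value} outside [{lo}, {hi}]")
    return flags

for rec in RECORDS:
    problems = screen(rec)
    if problems:
        print(rec["site_id"], "->", "; ".join(problems))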

Through  networking, training,  and  oversight,  the  EMAP-GRE  team  has
demonstrated  leadership  in  operating  across  organizational  boundaries
within and across both state and federal agencies, public and private entities,
multidisciplinary fields, and large geographic regions.  The result has been a
timely  completion of our mission to develop the scientific understanding for
translating environmental monitoring data from multiple spatial and temporal
scales  into policy-relevant  assessments of current ecological  condition  and
forecasts of future risks  to  our natural resources. This has  required the
development of ecological  monitoring and  assessment tools, culminating  in
better  methods  yielding   better information  for  making  more informed
management decisions for our nation's Great Rivers. Partners have both
common and unique goals.  Rather than fostering competition among agencies
for the shrinking federal budget, involving our  partners from  the planning
stages through analysis has allowed concurrent and timely completion of our
several missions, as well as allowing the public to realize considerable  savings
in time and dollars. The EMAP-GRE team  strategy forms the basis  for the
research needed to establish  the condition of the nation's resources, as a
necessary first  step  in the  Agency's  overall strategy  for environmental
protection and restoration. While condition reports are useful to  managers,
demonstrating  how to implement a monitoring and assessment program  in
the future is also  an important project goal. In the end, better  monitoring
methods will make  more information more widely available to better manage
the nation's  great rivers.
A Strategic Framework for Implementing the EPA Information Quality
Guidelines
Monica Jones, IQG Team Leader, U.S. EPA OEI Quality Staff
Laurie Ford, U.S. EPA OEI Quality Staff

This  presentation  will  give an overview  of  the outcomes  from the  EPA
Information Quality  Guidelines  Workshop  held in Washington,  DC in April
2007.  The  presenters  will provide the audience with a  summary  of the
discussions about:

     •   Assessing the Agency's information quality
     •   Improving Database Quality
     •   Enhancing the role of the EPA IQG in improving information
         quality
     •   Improving  the  EPA's  administrative   mechanism   for
         responding to the IQG Requests for Correction
     •   Identifying  how the   EPA  IQG  process  can  improve  the
         Agency's  environmental outcomes
     •   Implementing  Pre-Dissemination  Review (PDR)  of EPA's
         disseminated products
Quality Policy Development
Ron Shafer, U.S. EPA OEI Quality Staff

This paper will provide background on the activities of the Executive Advisory
Group and the recommendations it made for revising the draft Quality
Policy. Issues of scope and of roles and responsibilities will be discussed.
The process used by the Executive Advisory Group to make decisions and
materials used to support their decisions will be described.  Any significant
changes to the draft Quality Policy will be identified and discussed. The status
and schedule for finalizing the Quality Policy will also be provided.
Overview of Ambient Air Quality Regulations, Projects and Issues
Dennis K. Mikel, U.S. EPA OAQPS-AQAD

OAQPS has recently promulgated new regulations in the Code of Federal
Regulations (CFR) in support of the new Ambient Air Monitoring Strategy.
40 CFR Parts 50, 53, and 58 have been re-written and updated to
reflect the new Monitoring Strategy. The new Monitoring Strategy has the
following recommendations:

      •    Including a greater level of multi-pollutant monitoring sites
          in representative urban  and rural areas across the Nation;
      •    Expanding  use   of  advanced  continuously   operating
          instruments and new information transfer technologies;
      •    Integrating emerging hazardous air pollutant (HAP)
           measurements into mainstream monitoring networks; and
      •    Supporting advanced research level stations

In addition to the monitoring regulations being re-written, a number of the
Quality Assurance requirements were also revised and updated.
This presentation will give an overview of the changes to 40 CFR 58
Appendix A, which deals with Quality Assurance regulations. The items added
and subtracted will be highlighted.  In addition, issues confronting and
projects that are currently underway in EPA's Office of Air Quality Planning
and Standards, Air Quality Assessment Division will also be highlighted.
Targeting Geographic Areas Remaining At-Risk for Childhood
Lead Poisoning
Margaret Conomos, Barry Nussbaum, and Heather Case
Battelle:   Warren Strauss, Tim Pivetz, Jyothi Nagaraja, Elizabeth Slone, Rona
Boehm, Michael Schlatt, Darlene Wells, Michele Morara and Bruce Buxton

This pilot study seeks to develop statistical models to predict risk of childhood
lead poisoning within specified geographic areas based on  a  combination of
demographic,   environmental,   and  programmatic   information   sources.
Exposure  factors  associated with childhood lead poisoning were investigated
within census tracts for a community-focused set of  Models in  Massachusetts,
as  well   as  within  counties across  the  U.S.  in  a  series of  National
Models.  Aggregated summary measures of childhood lead poisoning within
defined geographic areas (census tracts and counties) are being used as the
response  variable in the statistical models, including geometric mean blood-
lead concentration as well as the proportion of children screened at or above 5,
10, 15, and 25 ug/dL. These summary measures are reported at 3-month
(quarterly) intervals over a several-year period within each geographic
area, allowing EPA to assess how the risk of childhood lead poisoning changes
over time using a Generalized Linear Mixed Modeling  Approach.
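
As a rough illustration of the modeling approach, the following Python sketch
fits a linear mixed model (via the statsmodels package) to simulated quarterly
log geometric-mean blood-lead summaries with a random intercept per county.
The variable names, the single predictor, and all data are invented stand-ins;
the study's actual GLMM specification may differ.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_counties, n_quarters = 30, 12
df = pd.DataFrame({
    "county": np.repeat(np.arange(n_counties), n_quarters),
    "quarter": np.tile(np.arange(n_quarters), n_counties),
})
# One invented predictor plus a county-level random effect
df["pct_pre1950_housing"] = np.repeat(rng.uniform(5, 60, n_counties), n_quarters)
county_effect = np.repeat(rng.normal(0, 0.2, n_counties), n_quarters)
df["log_gm_pb"] = (1.0 - 0.03 * df["quarter"]
                   + 0.005 * df["pct_pre1950_housing"]
                   + county_effect + rng.normal(0, 0.1, len(df)))

# Random intercept per county; the 'quarter' coefficient estimates the trend
model = smf.mixedlm("log_gm_pb ~ quarter + pct_pre1950_housing",
                    df, groups=df["county"])
result = model.fit()
print(result.summary())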

Wednesday 9:00 AM
The Blanchard River Watershed Partnership Project 2007-2008
M. T. Homsher, The University of Findlay
H. Bryan and M. Wilson, Findlay City Schools

The Blanchard Valley  Watershed  has been rated "impaired" by the  Ohio
Environmental Protection Agency.  Together, students will design an action
service plan that seeks to change the impaired status to one of quality
while remaining sensitive to the needs of all stakeholders.  As a central
service  project,  these groups  will  conduct research  within the  watershed,
identify  pollution sources, analyze types  of  pollution  while  cleaning  park
streams, and  determine a  list  of current and  potential  park activities for
middle school, high school,  and college students.  The students will present
their data  to the  Middle  School  Student  Body, the Findlay High  School
Environmental Club, The University  of  Findlay  Scholarship  and Creativity
Symposium, the Blanchard River Watershed Partnership (BRWP), the
Ohio Environmental Protection  Agency, and  other  appropriate  groups  or
organizations. The three groups of students will complement each other as
they participate  in  both the social and scientific  input to a public decision
process.

They will develop the ability to  identify and define scientific issues, seek valid
answers, solve  problems, and impact public decisions  in  a diverse and
multicultural  global community, while reflecting on  their results and  their
service's ties to their topic of study.  This activity is planned for the 2007-
2008 academic  year  and includes  a University  of  Findlay  Sampling and
Analysis Class with a  service-learning component taught by  Dr. Michael T.
Homsher, a sophomore Environmental Studies  Class from Findlay High School
taught by  Heather  Bryan, and  the Environmental  Science Club composed of
5th grade students at Central Middle School, led by Mike Wilson. Each teacher
will  present the investigation and partnership to his/her respective class and
develop  student projects around the scientific problem-solving process and
the  student  interests  related to the  Blanchard Valley Watershed.  These
students will  partner  with  the Blanchard  River  Watershed  Partnership,  a
nonprofit that exists to facilitate stakeholder education about the  watershed,
assist Ohio EPA in evaluation of the watershed, and work with stakeholders to
improve watershed quality.
Measuring the Benefits of Quality Management Systems
Ron Shafer, U.S. EPA OEI Quality Staff

The Office of Environmental Information's Quality Staff is sponsoring a long-
term  effort  to develop a  more  meaningful  performance  measurement
approach for EPA's quality management system. The objectives of this effort
are to:

    •  Identify better ways of measuring  the results and  benefits
        of a quality management system,
    •  Develop EPA quality goals for its products and services,
    •  Develop a  long-term  plan and  governance   process  to
        promote continuous improvement of the Agency's quality
        management system, and
    •   Provide meaningful  metrics that assist senior management
        in planning the future directions of the Quality Program.

To support this performance  metric initiative, the Quality Staff has conducted
two  pilot  efforts in  conjunction  with QAARWP  reporting  and  conducted
interviews with quality community  and Agency program  management staff.
This  technical  paper will describe  the objectives  and the results of these
efforts.  The paper  will also  provide a proposed set of performance metrics
based on the logic model approach.  Finally the paper will introduce Agency-
wide quality progress reporting and the elements associated with this type of
reporting.
Status and Changes in EPA Infrastructure for Bias Traceability to NIST
Mark Shanis, U.S. EPA OAQPS/AQAD/AAMG

Improvements continue to be made in a number of parts of the
EPA QA infrastructure authorized and established by EPA's ORD QA staff in
RTP and DC in the 1980s to characterize and promote traceability of EPA
ambient air  monitoring data to NIST standards.  EPA's benchmark Quality
Assurance (QA) programs support the comparability of the calibrations that all
reporting  organizations  use to  assign  values  to the otherwise undefined
instrumental signals that air monitors  provide as  the basis  of the  data
reported to EPA for compliance and other uses.

This  discussion  will address  status  of changes  in  the EPA's  National
Performance Audit  Program  (NPAP) for Ambient  Air Criteria and  other
Pollutants, the EPA's Standard Reference  Photometer (SRP)  Program for
traceably standardizing ambient ozone measurements,  and the  EPA  Protocol
Gas Verification by an independent, EPA-approved third party.  In 1996 EPA
OAQPS agreed to take the programs over from EPA ORD, to the extent
allowed each year by resources and priorities.

The transition of the gaseous pollutant part of NPAP from a mailed, back-of-
the-analyzer (BOA)-only program into the through-the-probe (TTP) program is
nearly complete. A system for auditing Trace Level Precursor Gas monitors is
being assembled. It will be tested and a draft SOP developed this calendar
year. A more portable gaseous
Criteria  Pollutant TTP PE system was used  in 2 east coast EPA Regions in
2006. The system has advantages  of  lower cost and much easier access.
Planning is under way to add a second system for use in the west. A new
database has been developed to automate entry of NPAP audit data into AQS,
and a new Access-based data entry form is being developed to improve field
data capture and preparation for reporting, and to automate prescreening for
transmission into AQS. Changes are being implemented to improve ease of
audit point generation. Changes related to the October 2006 regulation are
discussed.

NPEP TTP  PEs have now been conducted by staff from 6 EPA Regions, using a
sharing  approach. Problems have been identified, and the causes determined
in a number of states,  some with  strong internal  programs.  Changes are
already  being  made in some programs. Attention is now being  refocused on
materials  and  high flow  sampling systems.  Draft  TTP  SOPs  and  an
Implementation Plan are now posted  on www.epa.gov/ttn/amtic/quality.

The SRP network of 10 NIST-manufactured and certified systems is
deployed, based, and operated in 8 of the 10 EPA Regions. They are compared
to the NIST SRP using a stationary  and a traveling  SRP, now both based in
RTP, NC.  In the last 12 months,  the last  Regional SRP  automation upgrade
was completed, and coordination of the central SRP has been brought back to
RTP. See  latest Operator list on www.epa.gov/ttn/amtic   A new accuracy-
improving upgrade of the SRP is being planned by NIST.

After 1996, ORD's EPA Protocol Gas Verification Program (PGVP) was not
continued. Although the original program's sample size was small and its cost
very low, vendors paid attention, and results improved over
the 4-5 years of the program. EPRI (ca. 1998), and then EPA (2003, 2006), in
response to complaints from the user communities, have performed additional
blind sampling studies. The studies indicated that problems, across pollutants,
have  recurred. EPA has promulgated  new  Protocol Gas  language,  and is
proposing more, to require verification, and has worked with stakeholders to
develop, review, and revise a detailed PGVP Implementation Plan for
establishing a vendor-funded, EPA-approved, third party-operated, blind
sampling, publicly-reported verification program.
Test of Hypothesis to Determine if Hispanics in  NHANES 2001-2002
Have Significant Differences Using WesVar Regression Models
Dr. Hans D. Allender, P.E., U.S. EPA OPP/HED

Introduction
To evaluate pesticide exposure to the US population, the Office of Pesticide
Programs (OPP) at EPA works with a database produced by the Centers for
Disease Control and Prevention (CDC) known as NHANES, the National Health and
Nutrition Examination Survey (http://www.cdc.gov/nchs/nhanes.htm). The
survey produces extensive and comprehensive information on more than
20,000 individuals, including concentrations of different pesticides, or their
metabolites, in the urine of the population. Because the survey is designed to
permit statistical statements to be made about certain  minority groups within
the population, the survey over-samples these subgroups and uses sampling
weights to subsequently adjust for this over-sampling. To extend the survey
results  to  the rest of the population,  special statistics are  necessary; this
includes the utilization of replications for each data point. Testing hypotheses
under these conditions becomes unconventional and requires specialized
software such as WesVar.

The specific example in this presentation tests the hypothesis that the Hispanic
group in the NHANES sample is no different from the rest of the population.
This will be achieved by using replication methods under the conditions of
non-normal data, a non-random sample design, and weighted data.

Objectives of the Presentation
The objectives of this  presentation are to explore how WesVar deals with  a
test of hypothesis under the above conditions by using a regression model.
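
The replication idea can be sketched as follows: re-estimate the statistic of
interest under each set of replicate weights and build its variance from the
spread of the replicate estimates. The Python sketch below applies this to a
weighted regression with a subgroup indicator. The data, weights, and
replicates are simulated stand-ins (not NHANES data), and WesVar's actual
estimators (JK1, JK2, BRR, Fay) differ in detail.

import numpy as np

def wls_beta(X, y, w):
    """Weighted least-squares coefficients."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

def jk_variance(X, y, w, replicate_weights):
    """Replicate-weight (JK1-style) variance of the WLS coefficients."""
    beta_full = wls_beta(X, y, w)
    reps = np.array([wls_beta(X, y, rw) for rw in replicate_weights])
    R = len(replicate_weights)
    var = (R - 1) / R * ((reps - beta_full) ** 2).sum(axis=0)
    return beta_full, var

rng = np.random.default_rng(1)
n = 500
hispanic = rng.integers(0, 2, n).astype(float)   # subgroup indicator
X = np.column_stack([np.ones(n), hispanic])
y = 2.0 + rng.normal(0, 1, n)                    # null: no subgroup difference
w = rng.uniform(0.5, 2.0, n)                     # stand-in survey weights
reps = [w * rng.uniform(0.8, 1.2, n) for _ in range(32)]  # stand-in replicates

beta, var = jk_variance(X, y, w, reps)
t_stat = beta[1] / np.sqrt(var[1])               # H0: subgroup coefficient = 0
print(f"beta = {beta[1]:.3f}, t = {t_stat:.2f}")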
Wednesday 9:30 AM

Pursuit of an International Approach to Quality Assurance
For Environmental Technology Verification
Lora Johnson, Director of Quality Assurance, U.S. EPA National  Exposure
Research Laboratory

In the mid-1990s, the USEPA began the Environmental Technology
Verification (ETV) Program in order to provide purchasers of environmental
technology with independently acquired,  quality-assured,  test  data, upon
which  to  base their purchasing decisions.   From the beginning,  a strong
program of quality assurance  was  specifically  devised  for  ETV  and was
documented in the ETV Quality and Management Plan.  During the intervening
years, the ETV program and accompanying quality system have continued to
evolve. One feature of the quality system has remained constant:  EPA QA
staff continue to provide QA oversight for the program.  This has created a
conundrum for the program as the actual technology  testing  moves toward
economic  self-sufficiency  (i.e.  vendors pay for  testing), but  QA  oversight
remains in-house.   It  has presented a particularly interesting problem for
collaborating  with other countries where technology verification programs are
also  being  developed  and reciprocity is desired.  Tapping  into  existing
resources  for accreditation and certification in  the international conformity
assessment   arena  may  be a  viable  approach,  but  undoubtedly  some
infrastructure must be  developed to replace the hands-on oversight that  has
historically been provided  by EPA  QA staff.  Discussion of this  approach and
other  activities  related to  development  of  an  international approach  to
environmental technology verification will be the focus of this presentation.
OPPTS/OPPT New Quality System
Todd Holderman, U.S. EPA OPPTS/OPPT/IMD

OPPT's Information Management Division (IMD) is striving to ensure that
quality  is built into all  aspects  of the Division's activities,  services and
operations.  To this  end the Division is revitalizing its quality management
program with the goal of interconnecting all  management and performance
activities as  well as service  delivery under the  auspices  of the Quality
Management Plan. This briefing will highlight the various pieces of this new
program - still a work in progress  - to show how the Division  is attempting to
integrate  performance management, standard  operating procedures, and
records management with program/project planning  and budgeting.  The
establishment   of  a  quality  program  and  system  where  continuous
improvement in all aspects of the Division's mission is the objective.
Stages of Quality
Louis Blume, U.S. EPA Great Lakes National Program Office
(Abstract Unavailable)
Transition  to Primary Quality  Assurance  Organizations in the Air
Quality System (AQS)
Jonathan K.  Miller, U.S. EPA OAQPS/OID/NADG

With the promulgation of the regulations in 40 CFR Part 58, a new entity was
introduced in Appendix A.  As of January 1, 2007, the organization that is
responsible for the submission and quality of ambient air quality data is
referred to as the Primary Quality Assurance Organization (PQAO).  This
change was made for two primary reasons:
    1.  Historically this role had been referred to as the "Reporting
        Organization".  However, it has been evident that in some
        circumstances the term "Reporting Organization" was
        being used to mean "the organization supplying the data to
        AQS" and not necessarily the agency responsible for the
        results and quality of the information. It is felt that the new
        name more clearly describes the intended role of the defined
        organization.
    2.  To provide organizations the opportunity to reduce auditing
        requirements   outlined   in   the   National   Performance
        Evaluation  Program  (NPEP).   By  defining  the new role,
        smaller agencies have the opportunity  to "combine" with
        other organizations to help  reduce the number of audits
        required by the NPEP program.

As part of the creation of the PQAO role, AQS also needed to make changes in
order to be consistent with  the needs of the new role.  The changes can be
broken down  into the following categories:
    1.  Conversion  of existing  Reporting  Organizations to  PQAO
        (Estimated completion in March, 2007)
    2.  Business  Rule Changes  (Estimated  completion  in  March
        2007)
    3.  Changes in Reports (All changes estimated to be complete
        by October 2007)

It is thought that there will not be significant changes in the type or volume of
data  required from  the  submitting agencies once the initial conversion is
complete.  The benefits of the changes should be reflected not only in the
increased accuracy and consistency of the data reported to and from AQS, but
also in reduced overall costs of collecting and validating the data.
Estimating  Populations  Around  Superfund  Sites:  A  Comparison
Between  Simple  Census Block and National Land Cover  Infused
Geographic Interpolation
Larry Lehrman and Tom Brody
U.S. EPA Region 5 RMD OIS

A November 13, 2006 memo from  EPA Deputy Administrator  Marcus Peacock
instructed  agency  staff to  undertake  a  Workload  Assessment for the
Superfund  Program. The Assessment was in response to several reviews that
recommended that EPA manage Superfund resources more effectively. Additionally,
the  House Appropriations  Committee,  in  its 2007 Appropriations Report,
stressed that EPA must do a  better job  of using limited staff resources and
praised EPA  for initiating a  workforce  assessment. A working  group was
formed to tackle the various facets of prioritizing initiatives in the Superfund
program. One of these facets included  establishing the population affected
around the portfolio of sites in the program.

The  working  group established  the definition of the  population to be the
"population within the site area,  in addition to the population within an area
which extends outward from the  site boundary by one mile and encompasses
the site in its entirety." Geographic Information System (GIS) Analysts from
several Regions spent time generating both the site boundaries of their
portfolios and developing methods to easily capture the population within a
one  mile buffer of these boundaries. Most groups resorted to interpolating
these buffers with block data from the Census to determine the population
around sites. This exercise simply appropriates the percentage of the  block
inside the buffer to  the population number of the block with the assumption
that  population is uniformly distributed within the block. Although block data
can  be  very fine in  urban  and  even suburban  settings,  the interpolation
method  may break down in rural  areas where  wider blocks may capture
several unpopulated areas such as farms and water bodies.

To this  end, Region  5  decided to see  if there  are  significant differences
between simple interpolation of the Census Data  and  a method that infuses
the newly  released 2001  National  Land  Cover Dataset (NLCD 2001)  first.
NLCD 2001 separates developed areas from such categories as water bodies,
barren land, forests,  crops,  and wetlands. The  exercise  proportioned the
population in the blocks to the developed areas and then interpolated the data
in the one mile boundary. This paper will show whether we are seeing significant
differences between the two methods, and provide some anecdotes from our
exploration of the NLCD 2001.
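
Reduced to a single hypothetical census block, the two estimators compare as
in the short Python sketch below (all figures are invented for illustration):

block_population = 400.0
block_area_km2 = 10.0
overlap_area_km2 = 4.0          # part of the block inside the 1-mile buffer

# Simple areal weighting: population assumed uniform across the block.
simple_estimate = block_population * (overlap_area_km2 / block_area_km2)

# NLCD-infused weighting: population assigned only to developed land.
developed_area_km2 = 2.0        # developed land in the whole block
developed_in_buffer_km2 = 1.5   # developed land inside the buffer
nlcd_estimate = block_population * (developed_in_buffer_km2 / developed_area_km2)

print(f"simple areal weighting: {simple_estimate:.0f} people")   # 160
print(f"NLCD-infused weighting: {nlcd_estimate:.0f} people")     # 300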
Wednesday 10:30 AM
Quality Assurance in Modeling - Design and Implementation
of New Tools
John Smaldone, U.S. EPA

Improving  the application  of quality  assurance  project  plans  (QAPPs) in
modeling may reduce vulnerability to legal challenges.  Guidance  has been
developed and is being implemented for assuring quality in modeling  in the
Total  Maximum  Daily  Load  (TMDL)  program.  A  template and companion
checklist has  been designed for  writing  QAPPs for models  used in  TMDL
decisions.  Flexibility in the design and implementation strategy allows the
template to be used for writing one state-wide generic QAPP or project-
specific QAPPs.  The checklist assists QAPP writers in tracking the
completeness  of their  work and  may assist QAPP reviewers by  readily
identifying deviations from the template.

Examples of some noteworthy features  of the QAPP template include internet
links to:
    •   EPA quality assurance references;
    •   State-of-the-art modeling guidance; and
    •   An EPA modeling glossary

Other noteworthy specifications called for in the template include:
    •   Identification of model selection and model fit to problem;
    •   Notification of changes to model code;
    •   Use of independent data sets for model calibration;
    •   Identification of calibration stop criteria;
    •   Sensitivity and uncertainty analyses; and
    •   Modeling report requirements

Readily available new tools from Europe complement the QAPP template.
These tools place a compendium of expert consensus on good modeling
practice at your fingertips.  The combination of these quality system tools and
best practices helps ensure quality data from external partners.
Strategy for  Policy Management  to Improve The Quality  of EPA
Information (Session)

Policy  implementation is not  always  performed  as discrete  activities  by
discipline-specific groups  in an enterprise.  For example, implementing quality
policy is not only the responsibility of quality managers.  Individuals need to
understand  and  ensure  implementation  of all  applicable policies.   This
presentation  provides an overview of Agency information policies and their
current status.  The second presentation describes how those information
policies in concert with quality policies ensure the quality of a wide variety of
EPA products and services.   The third  presentation describes  how OEI is
strengthening its policy management system by incorporating plan-do-check-
act principles and offering this more robust system in a web-based application
for use by all employees.
        Information Policies and the Information Policy Process
        Lyn Burger, Chair, OEI Information Policy Workgroup, U.S. EPA OEI

        Information  policies  and  associated  procedures,  standards,  and
        guidelines  provide the framework for the consistent delivery  of
        information to the public. The policy development process must keep
        pace  with  Government-wide  priorities, EPA mission  support needs,
        and the public's needs.  This presentation reviews the status of EPA's
        information policies and associated documents  and  describes the
        progress made in the last  several years to improve the information
        policy process.
        I Policies + Q Policies = EPA Product & Service Quality
        Laurie Ford, U.S. EPA OEI Quality Staff

        The quality policies are not the only policies that ensure the quality of
        EPA  activities.    Other policies  provide  for uniform  work  and
        encourage oversight to ensure implementation.  Information policies,
        in particular, help  ensure the quality of information delivery and the
        functionality  of the  systems  that  provide information  to EPA's
        customers.  This presentation describes how information policies
        and quality policies work together to ensure that there is the
        right information, at the right time, in the right place.
        PDCA Strategy for a Policy Management System
        Kevin Hull, Neptune & Company

        Information policy developers are faced with many challenges.
        Information processes and technology frequently change and
        require policies to be updated.  Many policies must be approved at
        several levels before receiving senior management approval. With a
        large volume of policies, the 3-year update cycle itself can create a
        substantial workload. Implementation of many policies in a single
        area can be difficult.  One solution is to apply the plan-do-check-act
        cycle to the policy management process and provide it in a web-
        based form. This presentation describes such a system.
Getting  it Right - Best  Practices for Developing an Agency  Quality
Glossary (Session)
Katherine Breidenstine - U.S. EPA OEI Quality Staff
Support from  Project Performance Corporation (PPC) Glossary Development
Team

Background:
It is critical for the Quality Assurance (QA) community to use correct
terminology to communicate complex quality issues throughout our diverse
disciplines. Currently, there are many glossaries containing QA terms throughout
the Agency, including one developed by the Quality Staff.  In our efforts to
ensure consistency of terms and make improvements for our various
stakeholder communities, the Quality Staff has secured resources to assist in
the development of a fully functional Agency Quality Glossary.

Session Content:
This session will  share and discuss with participants the  key project efforts
and challenges:

    •   Conducting a  literature  review  and  analysis of  existing
        terminologies, terminology formats, and glossaries available
        on the EPA  internet  and intranet that address quality for
        science,   engineering,   and   information   technology/
        management activities.

    •   Establishing a Glossary Governance Council to participate  in
        discussions for determining  business  rules and  identify
        terms for inclusion in the Agency Quality Glossary.

    •   Establishing business rules for a comprehensive listing of all
        quality and information terms.

    •   Developing a comprehensive listing of all quality and
        information terms, including the type of term (e.g., statistics,
        information quality, measurement), definitions, plain-English
        explanations, sources, and web links, based on the current
        terminology used by the Quality Staff in their documentation
        to describe, plan, implement, and assess quality for Agency
        science, engineering, and information technology/
        management activities.

    •   Developing the design for a fully functional Agency Quality
        Glossary Web site to allow for the following functionality:
            - Sorting by term type and term.
            - Linking terms in one  definition  to  other terms in the
                 glossary.
            - Linking terms to the appropriate Web page.

    •   Establishing  a  management  system to  include proposed
        methodologies for periodic updates to the terms contained
        in the glossary.
Updates and Enhancements to EPA's Quality Assurance Handbook for
Air Pollution  Measurement  Systems (Volume IV):   Meteorological
Measurements
Dennis K. Mikel, U.S. EPA OAQPS-AQAD

Meteorological  data  has proven  to be  an important  part  of  air quality
management activities.   Many State,  Local and Tribal air agencies perform
meteorological  monitoring to support their efforts to  improve air  quality.
Often, these agencies lack clear guidance on appropriate methods and
techniques  to  ensure the  quality of  the meteorological  data  they are
collecting.  This often results in  poor meteorological data being collected that
is unusable by the agency.  EPA has addressed this lack of guidance by
updating the Volume IV  QA Handbook for  Meteorological  Measurements so
that  air agencies  can  work  towards   improving  data  quality  for  their
meteorological data.

This  presentation  will provide  the results of  EPA's effort  to  identify the
meteorological  monitoring needs of air quality management agencies.  The
Handbook should be  in final form and  will be available to the public. The
presentation will outline the enhancements, including a description of the latest
state-of-the-art meteorological instruments and updated information on
calibration and auditing. The updated Volume IV will have in-depth technical
discussions of  wind  parameters (cup/vane and sonic), temperature,  solar
radiation,  rain  measurements, discussions of vector and sigma calculations
and upper air meteorological  measurements.

In-depth audio/video files were created in support of the Handbook. These
AV files are medium length (5-18 minutes) and illustrate the calibration of
typical meteorological equipment.  This presentation will also feature these
AV files.
Statistical Methods for Environmental Applications Using Data Sets
With Below Detection  Limit Observations as Incorporated in ProUCL
4.0
Anita Singh, Lockheed-Martin
John Nocerino, U.S. EPA, Las Vegas, NV

Nondetect (ND)  or below detection limit (BDL)  results  cannot  be  measured
accurately, and, therefore, are reported as less than certain detection limit
(DL) values. However, since the presence of some contaminants (e.g., dioxin)
in environmental  media  may  pose  a  threat  to  human  health  and the
environment, even at trace levels, the NDs cannot be ignored or deleted from
subsequent statistical analyses. Using data sets with  NDs and  multiple DLs,
practitioners  need to compute  reliable  estimates of the population  mean,
standard deviation, and various upper limits, including the upper confidence
limit (UCL) of the population mean, the upper prediction limit (UPL), and the
upper tolerance  limit (UTL).  Exposure assessment, risk management, and
cleanup decisions at potentially impacted sites are often made based upon the
mean concentrations and the UCLs of the means of the contaminants of
potential concern (COPCs), whereas background evaluations and comparisons
require the computations of UPLs and UTLs to estimate background threshold
values (BTVs) and other not-to-exceed values. The 95% UCLs are used  to
estimate the exposure  point  concentration  (EPC) terms  or to  verify the
attainment of cleanup levels; and upper percentiles, UPLs, and UTLs are used
for screening of the COPCs, to identify polluted site areas of concern and hot
spots, and also to compare site concentrations with those of the background.

Even though methods exist in the literature to estimate the population mean
and the standard deviation for data sets with  NDs, no specific guidance with a
theoretical  justification  is  available  on how  to  compute appropriate UCLs,
UPLs, and other limits based upon data sets  with NDs and multiple DLs. The
main objective of this paper is to present defensible statistical methods that
can be used to compute appropriate estimates of environmental parameters,
EPC terms, BTVs, and other not-to-exceed  values based upon data sets with
NDs. This paper describes both  parametric  and  nonparametric methods  to
compute UCLs, UPLs, and UTLs based upon data sets with NDs  having multiple
DLs. Some of the methods considered include: the Maximum Likelihood
Estimation (MLE) method,  the  regression on  order statistics (ROS) methods,
and  the  Kaplan-Meier  (KM)  method.  Based   upon our findings,  it  is
recommended to avoid the use of ad hoc UCL methods based  upon Student's
t-statistic on ML estimates. It is also suggested to avoid the use of the DL/2
method on data  sets even with  low (<5%-10%) censoring intensities. It is
shown that, just like for uncensored data  sets, for highly skewed data sets
with NDs, one should use the Chebyshev inequality based UCLs (e.g. using KM
estimates) to provide an  adequate coverage for the population mean.
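
As a minimal sketch of one of these methods, the following Python code
computes a Kaplan-Meier (product-limit) mean and standard deviation for a
toy left-censored data set and a Chebyshev-style 95% UCL. It uses the
standard flip to right censoring and a simple plug-in standard error;
ProUCL's exact algorithms (e.g., its KM standard error and its
recommendation logic) differ, so this is illustrative only.

import numpy as np

def km_stats_left_censored(values, detected):
    """Product-limit (KM) mean and SD for left-censored data via the usual
    flip to right censoring. Nondetects carry their detection limit in
    `values` with detected=False. Leftover probability mass (when the
    smallest observation is a nondetect) is placed at the lowest DL --
    one common convention, assumed here."""
    v = np.asarray(values, float)
    d = np.asarray(detected, bool)
    order = np.argsort(-v, kind="stable")  # descending value = ascending flipped time
    v, d = v[order], d[order]
    n = len(v)
    s, mass, points = 1.0, [], []
    for i in range(n):
        at_risk = n - i
        if d[i]:                     # a detect: survival curve steps down
            step = s / at_risk
            mass.append(step)
            points.append(v[i])
            s -= step
    if s > 1e-12:                    # smallest observation was a nondetect
        mass.append(s)
        points.append(v[-1])
    mass, points = np.array(mass), np.array(points)
    mean = (mass * points).sum()
    sd = np.sqrt((mass * (points - mean) ** 2).sum())
    return mean, sd

# Toy data set: two nondetects reported at DL = 0.5 (units arbitrary)
vals = np.array([0.5, 0.5, 1.2, 2.0, 3.5, 8.0, 15.0, 40.0])
det = np.array([False, False, True, True, True, True, True, True])
m, sd = km_stats_left_censored(vals, det)
se = sd / np.sqrt(len(vals))         # simple plug-in SE (ProUCL uses a KM SE)
ucl95 = m + np.sqrt(19.0) * se       # Chebyshev factor sqrt(1/0.05 - 1)
print(f"KM mean = {m:.2f}, KM sd = {sd:.2f}, 95% Chebyshev UCL = {ucl95:.2f}")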

Several of these methods  have been  incorporated  into  the ProUCL 4.0
software package. ProUCL 4.0 makes some recommendations based upon the
results and findings of Singh, Maichle, and Lee (EPA 2006). Some examples  to
elaborate on the issues of  distortion of the  various statistics and upper limits
by outliers and  by  the  use of a lognormal model to accommodate those
outliers will be discussed using ProUCL 4.0.

Reference:
Singh, A.,  Maichle,  R.,  and Lee, S. On the Computation  of a 95% Upper
Confidence Limit of the Unknown  Population Mean Based Upon Data Sets With
Below Detection Limit Observations. EPA/600/R-06/022. March 2006.
Wednesday 11:00 AM
Creation of the Single Pass Quality Assurance Plan
Michelle Henderson, Shaw Environmental, Inc., Cincinnati, OH

Prior to the physical startup of most environmental projects, a Quality
Assurance Plan (QAP) must be written to describe the process and method
details of the construction, remediation, or experimental plan.  The writing of
the QAP is often delegated to a junior engineer or technician.  These
technical documents incorporating the details of how the project is to be
accomplished can take many hours to complete. If the QAP does  not include
the appropriate detail in language understood by the client or agency,  the QAP
may be returned with multiple comments and requirements for revision.
Sometimes multiple review rounds are required to satisfy the client or agency,
wasting project time, money and resources. In addition, repeated returns of a
QAP may cause low customer satisfaction and lack of confidence even before
the project has begun. The single pass QAP is defined as one which has been
approved on the first pass through the client and/or approving agency prior to
work start. The creation of a single pass QAP requires several specific tasks,
and success is never guaranteed.  Upfront planning, understanding and
meeting the appropriate requirements, and the involvement of all stakeholders
are critical to the creation of a single pass QAP.
Data Entry Program for National Performance Audit Program (NPAP)
Jonathan K. Miller, EPA Office of Air Quality, Planning and Standards
Outreach and Information Division

In an effort to produce accurate information from the National
Performance Audit Program (NPAP), OAQPS has developed a data entry
program in Microsoft Access to assist field operators with the input and
formatting of these data.  The overall application consists of a field operator's
version and a version kept at EPA Headquarters in Research Triangle Park,
North Carolina (RTP).   The field version of the application focuses  on data
entry (providing appropriate fields, look-up values, etc), and  creating files to
export to RTP.  The field version  of the application also provides a summary of
the results of the  audit.  The RTP version  of  the application would focus  on
receiving files generated from the field version of the software as well as long
term maintenance and storage of the data. The RTP version would also have
a variety of summary and detail reports based on the supplied information.

This presentation would be largely a demonstration of the field version of the
application, allowing participants to comment on the provided user interface
modules and application features and to offer suggestions for improvement.  Reports
from the RTP version of the application will also be demonstrated.
Handling Nondetects in Contaminant Trend Analysis
Douglas Mclaughlin, NCASI

Substituting nondetect values with values  such as  0, one-half the  detection
limit, or the full detection limit prior to the analysis of censored environmental
data  is a  common  practice.  However, this practice is problematic as many
environmental professionals consider it arbitrary with significant potential to
lead to incorrect environmental decisions.  It is also  increasingly unnecessary.
Several data analysis methods that do not  require substitution of nondetects,
while not new, have in  recent years become easier to  carry out  for those
responsible  for the data analyses that are needed to inform environmental
management decisions.  Yet the  number of examples of their use, including
comparisons of results to those obtained using more common "substitution"
methods,  remains small.  One type of data analysis problem that remains of
interest to environmental  managers and commonly  involves censored data is
the investigation of time trends in fish tissue contaminant concentrations and
exposure.  Several methods for censored data are  illustrated and compared
with  more common "substitution" methods using a data  set of fish  tissue
dioxin concentrations that contains nondetects.  Data were analyzed using a
common commercially available statistical  software program.  Methods used
to calculate summary statistics for censored data include robust regression on
order statistics (robust ROS), maximum likelihood estimation, and Kaplan-
Meier.  Methods used to characterize trend direction and rate include ordinary
least squares  regression,  maximum  likelihood estimation, and Akritas-Theil-
Sen  regression.  Approaches  for the examination  of  outliers and potential
covariables during the analysis are also presented.  In addition, the effects of
decisions made during the analysis,  for example regarding the treatment of
outliers,  are  discussed  and  the  results  obtained from  all methods  are
compared.    This  work  also reinforces  the  importance  of  establishing
quantitative decision criteria prior to data analysis in order to better define the
relevance of differences  among statistical results  obtained  from multiple
reasonable  data   analysis  decisions,  regardless  of  whether  substitution
methods are used.
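
As a simplified sketch of one of these censored-data methods, the following
Python code implements the core idea of the Akritas-Theil-Sen estimator:
choose the slope at which Kendall's S of the residuals, counting only pairs
whose ordering is unambiguous under left censoring, crosses zero. The toy
data are invented, and a production analysis would rely on an established
implementation rather than this sketch.

import numpy as np

def kendall_s(time, value, detected, slope):
    """Kendall's S for residuals value - slope*time, counting a pair only
    when its ordering is unambiguous given left-censored observations."""
    r = value - slope * time
    s, n = 0, len(r)
    for i in range(n):
        for j in range(i + 1, n):
            dt = np.sign(time[j] - time[i])
            if detected[i] and detected[j]:
                dr = np.sign(r[j] - r[i])
            elif detected[i]:                 # j is a nondetect
                dr = -1 if r[j] < r[i] else 0
            elif detected[j]:                 # i is a nondetect
                dr = 1 if r[i] < r[j] else 0
            else:                             # both nondetects: ambiguous
                dr = 0
            s += dt * dr
    return s

def ats_slope(time, value, detected, lo=-10.0, hi=10.0, tol=1e-6):
    """Akritas-Theil-Sen slope: the slope at which Kendall's S of the
    residuals crosses zero (bisection works since S is monotone in slope)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if kendall_s(time, value, detected, mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Toy declining fish-tissue series with a detection limit of 2.0
t = np.arange(12, dtype=float)
conc = 10.0 * np.exp(-0.15 * t) + 0.3 * np.sin(t)
det = conc >= 2.0
y = np.where(det, conc, 2.0)                  # nondetects reported at the DL
print(f"ATS slope estimate: {ats_slope(t, y, det):.3f} per year")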
Wednesday 11:30 AM
Perspectives on Tribal Quality Assurance Training
David R. Taylor and Eugenia McNaughton, U.S. EPA Region 9 QA Office

Tribes throughout the nation are now required to provide water quality
monitoring data, the collection of which is supported by USEPA, to the Office
of Water using STORET.  These data collection efforts require the development
of Quality Assurance Project Plans (QAPjPs).  Following the publication of the
Quality  Assurance Project Plans Development Tool for Clean Water Act 106
grants  (in  a  CD-ROM format), the  Region 9  Quality Assurance (QA) Office
provided a number of trainings to tribes  using a variety of different formats
and approaches.   A summary  of these training efforts and the QA Office's
perspective on their effectiveness was presented at the Annual Tribal - EPA
Conference in October 2006.   As a result of the  audience  response  to the
presentation,  the Regional  Tribal  Operations Committee  (RTOC)  Quality
Assurance Subcommittee, on which Region 9 Quality Assurance Office has two
members,  is evaluating future tribal  training needs and the  most effective
means  of delivering that training.   The results of this  evaluation will be
presented.
QA Issues for Energy Efficiency Measures at Ambient Air
Monitoring Stations
Meredith Kurpius, U.S. EPA Region 9
Joel Craig, San Luis Obispo Air Pollution Control District

Implementing energy efficiency measures reduces the cost of operating air
monitoring stations and promotes improvements in air quality.  In addition,
there are a number of co-benefits such as possible labor
savings, reduced wear-and-tear on vehicles, reduced equipment expenditure
(e.g., A/C replacement), possible noise reduction, and improved instrument
operation.  The  challenge is  in implementing energy saving measures while
maintaining complete and high quality ambient air data.

The major energy consumers at an ambient air quality station are likely to be
heating/cooling systems, instrument pumps, and particulate matter samplers.
It is also important to consider travel to/from stations and energy use by idle
electronics  and  equipment.   Numerous options exist  for  improving  energy
efficiency at air monitoring stations, including: setting up remote operation
and automating monitoring tasks (less driving); reconfiguring venting of
waste  heat  from instruments  to minimize use of cooling/heating  systems;
enclosing  instruments to minimize volume to be  cooled/heated; redesigning
inlets to eliminate pumps; replacing energy-intensive instruments/pumps with
more efficient ones; programming calibration systems to power down  when
not in use; and installing solar power.

A major limitation to instituting energy efficiency measures may be state/local
agency resources (staff time, upfront capital expenditures, etc.).  However,
other limitations are imposed by EPA QA requirements such as station/
instrument  temperature,  residency   time,  and  instrument   equivalency
designations.  In addition, other QA issues to consider include calibrations at
high/low  temperatures,  effectiveness  of   remote  QA/QC  checks,   and
instrument performance with ambient air at  a different temperature than the
instruments.  EPA  Region 9 has been  working  with  San  Luis Obispo Air
Pollution Control District to evaluate energy efficiency options that  comply
with all EPA QA requirements.

The San Luis Obispo County Air Pollution Control  District initiated an energy
reduction  program on their air monitoring network in 2006.  In  this program
various approaches  to energy  reduction were  tested and  evaluated,  while
assessing any impact to  data  quality.  Significant reductions in  station
electrical consumption were achieved through alternative temperature control
strategies as well as slight modification to the station designs.  Reductions in
driving were also achieved through automation and remote  operation of the
monitoring network.
A Simple Procedure for Estimating Method Quantitation Limit (MQL)
Chung-Rei Mao, U.S. Army Corps of Engineers

Method Quantitation Limit (MQL) is the lowest amount of analyte in a sample
that can be quantitatively  determined with  an acceptable  level  of precision
and accuracy. Several procedures are available for estimating or determining
MQL.  Traditionally, an MQL may be estimated  based on  a  multiple of the
Method Detection  Limit (MDL) or determined  based  on multiple analyses  of
fortified samples at low concentrations.  However, information on precision
and accuracy is usually not available for an estimated MQL, so that the
estimated MQL would be of limited or no practical value.  For a determined
MQL,  the  multiple  analyses  could  be  prohibitively costly  for  routine
applications at a production laboratory.

A simple  procedure for determination of MQL based  on  the MDL and the
Laboratory Control Sample (LCS) will be presented.  Because any quality
assurance  program requires frequent analysis of LCS, the mean and standard
deviation  of LCS recoveries are  well known.  Using  MDL and LCS data, the
proposed  procedure provides a  reliable estimate for MQL  with well  defined
precision and bias.  An MQL determined based  on the proposed procedure will
provide data at or near the  MQL of known quality for reliable decision-making.
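
The abstract does not spell out the proposed formula, but the quantities it
builds on can be illustrated with invented numbers: an MDL from low-level
replicate spikes (40 CFR Part 136, Appendix B) and precision/bias statistics
from routine LCS recoveries. The 10-times-SD quantitation convention shown
in this Python sketch is a common generic choice, not necessarily the
author's actual procedure.

import numpy as np

# MDL per 40 CFR 136 App. B: Student's t (99%, n-1 df) times replicate SD
low_spikes = np.array([0.52, 0.44, 0.48, 0.55, 0.46, 0.50, 0.47])  # ug/L
t_99_6df = 3.143
mdl = t_99_6df * low_spikes.std(ddof=1)

# Routine LCS recoveries give well-characterized precision and bias
lcs_recoveries = np.array([96.0, 102.0, 98.5, 101.0, 95.0, 99.5, 103.0])  # %
bias = lcs_recoveries.mean() - 100.0
rsd = lcs_recoveries.std(ddof=1) / lcs_recoveries.mean() * 100.0

# A common convention sets the quantitation limit near 10x the replicate SD
# (roughly 3x the MDL); the LCS statistics then document the precision and
# bias attached to results reported near that level.
mql = 10.0 * low_spikes.std(ddof=1)
print(f"MDL = {mdl:.3f} ug/L, MQL ~ {mql:.3f} ug/L")
print(f"LCS bias = {bias:+.1f}%, LCS RSD = {rsd:.1f}%")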
Wednesday 1:00 PM
The EPA Region 2 Approach to Quality Assurance Training
Kevin Kubik, U.S. EPA Region 2

Region 2 has a centralized Quality Assurance Program; organizationally it
is located within the Division of Environmental Science and Assessment and
physically it is located in Edison, NJ. The Region 2 Program Offices are located in
New York City, Puerto Rico and the Virgin Islands, and  in  Edison.  After the
March 2003 Quality Systems Assessment of Region 2 it was determined that
Quality Assurance training was needed throughout the region  and that  the
current approach  to QA Training was lacking.  This presentation  will describe
the former QA Training approach and  describe the thought processes that
went into the development of the new approach and how the new  approach
was implemented, given the logistical  challenges.   Results of  the training
program  will be discussed in conjunction with the findings of the  February
2007 Quality Systems Assessment of Region 2.
A Successful Strategy for Implementing the EPA Information Quality
Guidelines (IQG) (Session)
Monica Jones, IQG Team Leader, U.S. EPA OEI Quality Staff
Tom Nelson, U.S.  EPA Region 6
Walter Helmick, U.S.  EPA Region 6

At the beginning of the session, the  facilitator will provide a brief overview of
EPA's process for responding to an  IQG Request for Correction (RFC) and
Request for Reconsideration (RFR).  Using an example of an  actual EPA IQG
RFC and RFR, each panelist will describe their role in the  preparation of EPA's
response to these requests. The panelists will give examples of the successful
strategies and lessons  learned in preparing EPA's response to the IQG RFC
and RFR.  At the conclusion of the panel presentation,  we will engage the
audience in a dialogue  of other strategies that can be used to improve EPA's
response to IQG requests.
What the G-5 Rewrite Intends to Accomplish (Session)
John Warren, U.S. EPA OEI

The  current G-5, Guidance for Quality Assurance  Project Plans, is due for
renewal in December 2007, and Neptune & Company is the lead contractor
assisting the Quality Staff in revising and rewriting the guidance.  Input from
all  the  Programs   and  Regions  has  been  solicited  through  regular
communication with the G-5 Revision Workgroup.  The workgroup has advised
the Rewrite committee  to focus on the use of QAPPs applied to existing data
sets without losing the connection to the Uniform Federal Policy (UFP) QAPP
issued in December 2005.  This presentation examines the differences between
the current G-5 guidance and the revised G-5 guidance.
Progress Made in the G-5 Rewrite
Daniel Michael, Neptune & Co.

This presentation shows the progress made in developing the revised sections
of G-5.  The rationale for the new ordering of the elements of a QAPP will be
discussed especially  with reference to  existing  data.  The  presentation will
conclude with a discussion of what areas are yet to be completed and where
further input from the QA community is needed.
Discussion on the G-5 Rewrite
Daniel Michael and  John Warren

The final presentation will be an open forum where the audience is invited to
share knowledge and experience with establishing performance or acceptance
criteria for the consideration of existing data for potential use in a project.  It
is anticipated that their advice can be incorporated into the G-5 rewrite.
Integration of Statistics, Remote Sensing and Existing Data to Locate
Changes in Land Resources
Maliha S.  Nash, Deborah J. Chaloud, and William Kepner, U.S. EPA LEB
Samuel Sarri, CCSN Las Vegas, NV

Stability of a nation  is dependent on the availability of natural resources.
When  land is degraded and natural resources become limited,  socioeconomic
status declines and  emigration  increases in  developing countries.   Natural
resource utilization without proper management may result in irreversible land
degradation. Early detection of resource depletion  may enable protective
actions to be taken  prior to significant decline in resources and associated
socioeconomic conditions.  We have developed a simple method based on
readily available data to locate areas of concern.  Our method integrates
results from statistical analyses of inexpensive remote sensing data (e.g., the
Normalized Difference Vegetation Index).  Results are mapped using
ArcView.  Ancillary information is used to verify results and to assist in
identification of probable causes of significant positive and negative change.
These results can be  used by authorities in developing management plans to
preserve  or conserve  natural  resources and  maintain  or  improve  the
socioeconomic status of the resident population.

Notice: The U.S.  Environmental Protection Agency (EPA), through its Office of
Research  and  Development (ORD), funded  and performed  the  research
described  here. It has been peer reviewed  by the  EPA and approved for
publication.   Mention of trade names  or commercial products  does  not
constitute endorsement or recommendation for use.
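The abstract leaves the statistical analysis unspecified; the sketch below
shows one generic form such an analysis could take, fitting a per-pixel
least-squares trend to an annual NDVI image stack and flagging strong
downward trends.  The synthetic data, array shapes, and threshold are
illustrative assumptions, not the authors' method.

```python
import numpy as np

# Hypothetical annual NDVI composites: shape (years, rows, cols).
years = np.arange(1990, 2006)
ndvi = np.random.default_rng(0).uniform(0.1, 0.7, size=(years.size, 50, 50))

# Fit a least-squares line to every pixel's time series at once;
# np.polyfit accepts a 2-D y and returns one slope per pixel.
flat = ndvi.reshape(years.size, -1)
slopes, intercepts = np.polyfit(years, flat, deg=1)
slope_map = slopes.reshape(ndvi.shape[1:])

# Flag pixels with a strong downward trend as candidate degradation areas
# (the -0.005 NDVI/year cutoff is an arbitrary illustrative threshold).
degrading = slope_map < -0.005
print(f"{int(degrading.sum())} of {degrading.size} pixels flagged for review")
```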
Wednesday 1:30 PM
 Laboratory  Quality  Systems:  Use  of Information  Technology to
 Manage Documents and Training
 Robert P.  Di Rienzo, DataChem Laboratories, Inc.

 The laboratory shall establish and maintain procedures to control all documents
 that form part of its management system (internally generated or from external
 sources), such as regulations, standards, other normative documents, test and/or
 calibration methods, as well as drawings, software, specifications, instructions and
 manuals. ISO/IEC 17025:2005

 The laboratory shall  establish  and  maintain  procedures  to control all
 documents that form part of its  quality system (internally generated or from
 external  sources).  Documents include  policy  statements,  procedures,
 specifications,  calibration   tables,   charts,   textbooks,  posters,   notices,
 memoranda, software, drawings, plans, etc.  These may be on various media,
whether hard copy or electronic, and they may be digital, analog,
photographic or written. NELAC 2003

A laboratory should have a document control system that complies with the
above requirements from ISO and NELAC.  Using information technology to
manage the document control system helps ensure that documents remain
controlled.  This presentation will show how documents can be controlled
through web-based technologies that are readily available, easy to maintain,
and cost-effective.

An internal website (intranet) can be used to manage documents, training,
policies, procedures, and applicable records.  This presentation will show how
documents are managed in this fashion, identify the documents in a quality
system that must be controlled, and demonstrate simple techniques for
controlling them.

A well-constructed document control system ensures that the data
generated are of known  and  documented quality. A good document control
system can  improve the laboratory  quality  system, improve consistency in
operations, increase productivity and  most of all  lead the laboratory toward
continuous quality improvement.
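As one illustration of a "simple technique" of the kind such a talk might
show (not DataChem's actual system), the sketch below keeps a minimal
controlled-document register in which each approved revision is identified by
a content hash, so an altered or uncontrolled copy on the intranet is
detectable.  The register layout, IDs, and placeholder digests are
hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical register: document ID -> (revision, SHA-256 of the approved file).
# The digests shown are placeholders; a real register stores full values.
register = {
    "SOP-001": ("Rev 3", "<sha256-of-approved-SOP-001>"),
    "QM-100":  ("Rev 7", "<sha256-of-approved-QM-100>"),
}

def sha256_of(path: Path) -> str:
    """Hash a file so silent changes to a controlled copy stand out."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify(doc_id: str, path: Path) -> bool:
    """Check an intranet copy against the registered approved revision."""
    revision, approved_digest = register[doc_id]
    ok = sha256_of(path) == approved_digest
    print(f"{doc_id} {revision}: {'controlled' if ok else 'DOES NOT MATCH register'}")
    return ok
```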
The SAS Enterprise Miner:  Sequence Analysis,  Decision Trees, and
Neural Networks
Jodi Blomberg, SAS Institute, Inc.
(Abstract Unavailable)
Wednesday 2:00 PM
APHL as Home Base for State Environmental Laboratories
Jack Bennett, State of Connecticut Department of Health

EPA recognized that there was a need for State environmental laboratories to
come together and  collaboratively  form partnerships among themselves as
well as with EPA. The importance  of those partnerships was heightened by
EPA's responsibilities under Homeland Security Presidential Directive 9 (HSPD
9) for the environmental aspects of an event of national significance.  To meet
this need, EPA entered into a cooperative agreement with the Association of
Public Health Laboratories (APHL).  The purpose of this paper is to introduce
the audience to APHL and the Environmental Laboratory Subcommittee as well
as its role under the cooperative  agreement  and  to discuss some of the
Quality initiatives being brought forward by the subcommittee.
Statistical  Support  Pilot  in  the  U.S.  EPA Office of  Research and
Development
Lynne Petterson, Ph.D.
U.S. EPA ORD, National Exposure Research Laboratory

In 2007, EPA's Office of Research and Development is sponsoring a statistical
support pilot.   The statistical support pilot  provides a  hotline  service to
address short-term questions via telephone or email, as  well as a contract
vehicle  for  long-term  projects requiring  specialized statistical  support.

Statistical support  for  short-term  questions  is  limited to 4-6  hours and
provided at no charge to ORD organizations.  Long-term statistical support is
funded by the requesting ORD organization  and is limited to no longer than
three (3) months.

The  pilot will offer three types of statistical  support.  These include classical
statistical support,  spatial  analysis support,  and  bioinformatics  statistical
support.

The  pilot will address  the  management of data sets in conjunction  with
statistical analyses.  Researchers  who  will generate, or  use,  substantial
amounts of data in  conjunction with their statistical analyses will be required
to complete a data management questionnaire before statistical support will
begin.  The questionnaire will identify issues  such as the size of the data,
where it is stored,  who has access to it, and how it will be archived.  The
results of the data management questionnaire will help ORD enhance scientific
data management.

The  statistical support pilot is designed to run for approximately 6-8 months.
At the conclusion of the pilot, the service will be evaluated in terms of who
requested support,  the  types of support  provided, and the level  of customer
satisfaction. The final report will  include  recommendations for strengthening
statistical training and guidance in ORD.
Wednesday 3:00 PM
Ethical Dilemmas: An On-going, Interactive Discussion of Challenges
Facing Environmental Laboratories
Michael F. Delaney, Ph.D., Massachusetts Water Resources Authority

A continuing series of presentations and discussions on laboratory ethics has
taken place at the quarterly meetings of the Independent Testing Laboratory
Association (www.itla-ma.org).  ITLA is an organization of representatives of
New England certified environmental laboratories who come together for open
communication in a highly competitive industry.

Over the course of 15 sessions in the past seven years, the ethics discussions
at ITLA meetings have explored a wide range of topics of importance to
testing  laboratories. We  have looked at ethics statements of professional
organizations, ethics training requirements, and various technical quandaries.

The sessions have two types of regular components:

    •    Ethical  Challenges.  These  are  open-ended items  for
         discussion.  They  are  intended  to  be  realistic  but  also
         thought-provoking. Where does technical judgment leave off
         and impropriety  begin? For example: "In every extraction
         batch, two method blanks are included so that if one is  lost
         in a lab accident there will still be a valid blank. What do you
         do if one blank is clean and the other is dirty?"

    •    Labs Behaving Badly. Ripped from the recent headlines,
         these are examples of people making bad choices—and
         getting caught.  In addition to labs, these items have
        included  treatment plant  operators, university scientists,
        Nobel   laureates,   research   institutions,   politicians,
        government regulators,  celebrities, and  many others. The
        key message is that everyone  is vulnerable to making wrong
        decisions  and that laboratories should  encourage  open
        discussion of ethical issues.

Sessions have included guest speakers, review of on-line ethics training, and
examination of regulatory methods and  regulations. The tone  is generally
light-hearted  and  introspective. Our   goal  is to  make  laboratory  ethics
discussions both fun and informative.  This talk will give an  overview  of the
ITLA Ethical Dilemma series.
Data Quality versus Information Quality: Is There a Difference?
Monica Jones, IQG Team Leader, U.S. EPA OEI Quality Staff

There is an old saying that "one person's data is another person's
information".  Some people use the terms "information quality" and "data
quality" as synonyms.  Is there a difference between these two terms? This
paper will give EPA's perspective on what is meant by "information quality"
and "data quality".
eQMP Project: Development and Implementation of Electronic Quality
Management Plans (Session)
Gary L. Johnson, U.S. EPA
Kevin J. Hull and John Tauxe, Neptune & Company

Since 1980, EPA policy has required that all Agency organizations collecting
and using environmental data for Agency decision making shall develop and
implement a  quality management  system  (QMS)  in  programs  providing
products and services based on such  data.  The Quality Management  Plan
(QMP) is the key document to describe the processes and procedures in an
organization's QMS, the roles and responsibilities of personnel,  and how the
QMS  is  applied to  the  organization's  business  lines; that  is,  the QMP
documents  how an organization's  QMS  meets the  requirements  of the
Agency's quality policy and implementation standards.

These  QMPs  have  been  paper  documents   that  were  written  by  an
organization, reviewed by the  Quality Staff, and approved by  the Office  of
Environmental Information (OEI) for implementation for up to five years.  The
paper form of the QMP has enabled the QMP to remain a  stable document, but
one that may not be easy to  use and that may not reflect current quality
assurance practices and procedures  in the organization's QMS.  Building on
the innovative implementation of a web-based QMP by OEI, the Quality Staff
has begun a project to develop a workable framework for electronic QMPs that
meets all EPA policy requirements and that provides  expanded flexibility and
accessibility to EPA users.

The eQMP Project was initiated in the fall of 2006. The design will ensure that
the eQMP  structure  addresses  all current QMP requirements  and  the  new
requirements expected to  emerge in the  expanded Agency Quality Policy in
2007.  A key design goal is to build into the structure the capability for many
access pathways and for linkages to other web-based information sources that
pertain to the QMS and its implementation.  Because of the diversity of EPA
organizations, the eQMP design will be tested  in pilots in  two Regions, one
National  Program Office, and one National Research Laboratory.  The long-
term goal will be to implement the final design structure across EPA and to
migrate existing paper-based QMPs to the eQMP framework.

This presentation will describe the  progress to date in developing the eQMP
framework and testing its suitability in  the pilots, and outline the remaining
work to be completed.  The presentation will offer some possible expansions
of scope to an eQMP as users consider options for links to new tools and other
information sources.
Wednesday 3:30 PM
External Laboratory Audits, Problems and Lessons Learned
Zachary Willenberg, Battelle Memorial Institute

Assessing external laboratories is a regular and ongoing responsibility for
many in the quality field.  Very often external audits occur because
the facility is contracted  by your company  on a project and you  need to
perform an audit of their contracted work.  In other cases, you  may  be
contracted by another organization to perform assessments on their behalf at
facilities that are part of  a larger program (e.g.  air  quality monitoring
stations).  External audits are essential, especially now that quality
requirements  for contracted   work  are   becoming  more  common  and
increasingly stringent.  This  presentation will  review  typical and  unusual
problems or issues encountered during actual external laboratory audits and
the  resulting  lessons  learned.   Areas  to be  addressed  include audit
documentation, accreditation vs. non-accreditation, scheduling, auditor/auditee
interactions, travel planning, and pre- and post-audit communication.
Wednesday 4:00 PM
Laboratory Data Quality and the Bottom Line
Dianne Buckheister McNeill and Robert Thielke, ECC

Laboratory and field data are the lifeblood  of many remedial investigations,
feasibility studies, waste characterization  activities, and monitoring projects.
Often the quality and nature of the data are crucial to making project decisions
that have a significant impact on project budgets and schedules.  Far too often,
project managers view  costs  associated with verifying and assuring data
quality, at best,  as  a  necessary  cost of doing business and, at worst, as a
burden on project budgets and schedules.  However, data of poor or
inappropriate quality can have a direct impact on a project's bottom line.
At times, the impact can be significant.  In this paper, examples of how data
quality impacts the bottom line are presented, including examples of how
using the low bidder can lead to profit loss.  Steps to take to avoid this issue by
assuring that only qualified  laboratories are selected for the bidding process,
as well as some good-practice follow-up procedures, are described.
Environmental Indicators
Nancy Wentworth, U.S. EPA

The Environmental Protection Agency produces and contributes to numerous
indicator reports that differ in purpose, audience, scope and scale, review and
data standards, and Web presence. Even the term "indicator" is loosely used,
and  can  mean  anything  from  scientifically  robust  information  about
environmental conditions (as in the Report on the Environment)  to information
demonstrating progress on specific program/policy initiatives.  Moreover, the
fundamental  systems  from  which these  reports  are generated differ  in
process, age, maturity, designer credentials, and user feedback mechanisms.
Under these conditions, audiences for information about environmental status
and trends are faced with navigating a highly disorganized set of
information, with indicators that may be contradictory and/or may vary in
quality.

In this session the speaker will tee up this issue, briefly describe the definition
and criteria for ROE indicator information, and lead a dialogue session on
ways the QA community may be able to help ensure that the environmental
indicator information the Agency publishes meets a minimum "bar" for
information quality and peer review.
Thursday 8:30 AM
The Transition of the National Environmental Laboratory Accreditation
Program (NELAC) from EPA to The NELAC Institute (TNI)
Lara Autry, U.S. EPA Office of Research and Development
David Speis, Accutest Laboratories

On November 6, 2006, a giant step was taken toward a long-term goal of the
environmental laboratory and monitoring communities: a national
accreditation program.  After years of an evolving program under
the  auspices   of  the  National   Environmental  Laboratory  Accreditation
Conference (NELAC) and the Institute  for National Environmental Laboratory
Accreditation (INELA),  both Boards of Directors took action to form The NELAC
Institute (TNI).   The  purpose  of  TNI  is  to  foster  the  generation of
environmental  data of known  and documented quality through  an  open,
inclusive, and  transparent process that is responsive to the needs of the
community.

As reflected in  the  new name, The NELAC Institute has combined  the heritage
of NELAC with the consensus process of INELA into one organization.   This
presentation will summarize the activities leading up to the formation of TNI,
describe in  detail the core programs being performed by the new  organization
and provide information about the future of national laboratory accreditation.
Using Data from Diverse  Sources  - Quality Control  Procedures  in
Database Management
Rosanna Buhl and Suzanne Deveney, Battelle

Environmental managers are often interested in using pre-existing
(secondary) data to support management decisions.  This makes sense from
both schedule and financial perspectives.  Large environmental databases are
often placed on  the Internet  by federal  and state agencies, NGOs, and
universities and are  readily available for inspection and downloading.  Their
use  reduces  the  need for new field sampling and the associated cost and
schedule constraints.  However, the compilation of data held in a  variety of
sources is often  not straightforward and  must be  conducted systematically
with appropriate   planning  and management oversight to  avoid  disaster.
Quality control procedures must be designed and implemented to ensure that
the  data are  standardized and  usable.  Queries  made against  a  non-
standardized database  can result in  inaccurate  data  summaries,  incorrect
conclusions, improper protection of the environment, or legal challenges.

This paper discusses quality control procedures that should be incorporated
into the  data acquisition, loading, and  query  phases of a  project.   Prior to
selecting data from the  existing databases, the critical database fields should
be identified  along with  details such as data types and parameters of interest
and temporal and spatial  boundaries.   Data should be  pre-screened for
relevance and quality.  A review of the database  structure, data dictionary,
and primary  keys ensures  that the  data contents and  established  data
linkages  are understood.   QC check scripts, data standardization, database
codes,  and   duplicate   record  elimination  are  key  examples  of  data
management best practices.  Best practices will ensure successful merging of
data sets from multiple sources into a  useable centralized  project database.
Two case studies will  be used to demonstrate  a systematic,  high-quality
approach to data  management and the importance of performing these quality
control  procedures.   These  cases will  be contrasted with  examples  of
erroneous conclusions drawn from a  database developed without rigorous
quality control procedures.
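The abstract names QC check scripts, data standardization, and
duplicate-record elimination as best practices; the sketch below shows what a
minimal check script of that kind might look like.  The pandas table, column
names, code map, and date bounds are illustrative assumptions, not the
authors' scripts.

```python
import pandas as pd

# Hypothetical records compiled from two downloaded sources.
df = pd.DataFrame({
    "station":     ["A1", "A1", "B2", "B2"],
    "sample_date": ["2006-05-01", "2006-05-01", "2006-06-12", "2006-06-12"],
    "parameter":   ["Cu", "Cu", "Pb", "lead"],   # a non-standard code sneaks in
    "unit":        ["ug/L", "ug/L", "mg/L", "mg/L"],
    "value":       [4.2, 4.2, 0.01, 0.01],
})

# 1. Standardize parameter codes before loading.
df["parameter"] = df["parameter"].replace({"lead": "Pb", "copper": "Cu"})

# 2. Flag exact duplicates introduced by overlapping sources.
dupes = df.duplicated(subset=["station", "sample_date", "parameter", "value"])
print(f"{int(dupes.sum())} duplicate record(s) to review")

# 3. Check fields against the data dictionary: allowed units and date range.
assert df["unit"].isin(["ug/L", "mg/L"]).all(), "unexpected unit code"
df["sample_date"] = pd.to_datetime(df["sample_date"])
assert df["sample_date"].between("1990-01-01", "2007-12-31").all(), "date out of range"
```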
Discipline-specific Standards Provide the Framework
For Science and Information Quality (Session)

Scientists use analytical standards and standard methods to ensure the
accuracy and precision of their measurement methodologies.  Project
managers  use data  standards  to ensure  that  information and data  are
captured and stored in a comprehensive, complete, and comparable manner.
Information managers use standard configurations and  hardware to  ensure
compatibility and reliability  of  technology operations.   This presentation
provides an  overview of all those standards used in the EPA for science,
quality, and information operations.  The terminology and standards used by
these discipline-specific users will be compared and contrasted  and audience
members will be encouraged to share solutions for ensuring that all standards
are accommodated in quality planning, regardless of the discipline-specific
needs.
         Science, Analytical, and Quality Management Standards
         Gary Johnson, U.S. EPA OEI Quality Staff

         A wide variety of standards are in use in EPA to support science and
         analytical applications including quality management standards.  This
         presentation explores the basis of these standards to ensure results
         of the quality needed based on intended use.  A summary and  road
         map to most standards is provided.

        Data, Information, and Technology Standards
        Jeff Worthington, Director of Quality, U.S. EPA OEI

        Highly specific technical standards are the tools that drive the
        consistent operation of technology and ensure the interoperability of
        systems as well as the comparability of data in  databases.  This
        presentation  reviews  the  wide  array  of data,  information,  and
        technology standards and considers how they support the quality of
        EPA products and services.

        Interactive Discussion With Audience Members
        The speakers will present a series of points and questions to the
        audience members to share lessons learned and to encourage
        discussion on:
            •   How organizations at both the program level and
                the project level can ensure that all standards and
                applicable policies and procedures are implemented.
            •   How  to  identify  who  has   the  key  role  for
                implementing each kind of standard.
            •   How to  reconcile the various roles in  ensuring
                implementation of standards.
            •   How  to  ensure  that  all  the  elements  are
                implemented and there is adequate oversight.
Ohio EPA's  Division  of Emergency and Remedial Response's  (Ohio
EPA-DERR) Data Verification Review Guidance Tools
Gunars Zikmanis, Nancy Zikmanis, and Timothy Christman, Ohio EPA - DERR

Introduction:
Ohio  EPA -  DERR  designed  guidance and checklists  to assist regulatory
reviewers in evaluating whether analytical data provided by a laboratory may
contain deficiencies which could result in data quality that is questionable for
use  in a project. The  guidance is designed  to evaluate laboratory quality
control processes which could document issues with the sample and  data
processing within the laboratory. The checklists and guidance work hand in
hand to help reviewers to better  understand  Quality Assurance (QA) issues
and concerns as they review laboratory data.  Please note this guidance is not
an in-depth data validation process, but an initial review of data to determine
whether possible errors exist.  This tool allows reviewers with little laboratory
experience to evaluate the usability of their data.

It is also part of the larger Data  Quality Objective (DQO) and Data Quality
Assessment   (DQA)  processes. Therefore, additional  evaluation  may be
necessary to fully determine  the  usability of the  data.   The full DQO/DQA
process should be adhered to  in order to make well informed and appropriate
decisions on data usability and accuracy required for a project.

        Overview of the DQO/DQA Processes, with a Focus on the Steps
        Leading to the Data Verification Guidance:
        A  brief overview of the DQO process which  leads  up to  the
        development  of  laboratory data.  The  DQO process outlines  the
        development stages of a project to ensure the user is obtaining the
        appropriate type and quality of data for their project.  The overview
        denotes the DQO  process  steps, its iterative nature,  and what a
        reviewer would need to think about in development of quality data
        (Quality Assurance Project Plans).  The overview proceeds into a
        review of the quality steps used to reach conclusions and the DQA
        steps, of which data verification is the second.

        Ohio EPA DERR's Data Verification Tools!
        A review of our guidance  and  checklist to  demonstrate what our
        initial  review would entail  using these tools.   The review would
        include a look at QA requirements for  both  laboratory data review
        and for field data reviews, including mobile laboratory data packages.
        By using the guidance and checklists, reviewers would note where
        more  information is needed in a data package or where errors may
        be identified and require a more in-depth data review.  Ohio EPA's
        guidance is geared toward initial review only; more in-depth
        reviews would be conducted by more qualified personnel under a
        data validation process.

        Let's  Field Test The Process!
        A case study is used to demonstrate why reviewing project data is an
        important step to ensure a quality project. The case study points to
        potential problems, how they were discovered using both the DQA
        process and data  verification review, and the final in-depth  data
        review to document the data quality.
Thursday 9:00 AM
TNI National Environmental Laboratory Accreditation Program
Kenneth Jackson, New York State Department of Health
Dan Hickman, Oregon Department of Environmental Quality
Judith Duncan, Oklahoma Department of Environmental Quality

The purpose of the National  Environmental Laboratory Accreditation Program
(NELAP)  is to establish and  implement a program for the accreditation of
environmental laboratories. The primary components of this program are:
     •   The recognition of accreditation bodies,
     •   The adoption of acceptance limits for proficiency testing
         developed in the Proficiency Testing (PT) Program, and
     •   The adoption of the laboratory accreditation system developed
         in the Laboratory Accreditation System Program (LASP).

The NELAP Board has final authority for implementation of the program for the
accreditation  of environmental laboratories.  It develops the  policies and
procedures that govern the operation of this program, and is responsible for
ensuring the  successful implementation of the program.  To ensure that the
program is  implemented effectively  and  to  address  the  needs  of  the
stakeholder community, the  NELAP Board will  work in cooperation with other
core programs and committees within The NELAC Institute.  Specifically, the
NELAP Board:
     •   Will work with the LASP in the development of the laboratory
         accreditation system,
     •   Will work with the Consensus Standards Development Program
         to ensure that accreditation standards developed for this
         program are suitable for use, and
     •   Will work with the PT Board to ensure that PT acceptance limits
         developed by the PT Board are acceptable for use.

This presentation will summarize the activities of the NELAP Board since its
formation and provide information about the plans to recognize accreditation
bodies.
High Quality Systems Engineering - Using CMMI as a Process
Improvement Framework
Stephen Hufford, U.S. EPA OEI

This session explores OEI's experiences in applying the Capability Maturity
Model Integration (CMMI) as a key contract provision within EPA's largest
computer programming contract.

Topics will include OEI's strategy for assessing CMMI maturity, costs and
benefits of this process  improvement  framework, applicable process areas
(with an emphasis on process and product quality assurance),  and challenges
in implementing process improvement.
Thursday 9:30 AM
TNI Laboratory Accreditation System Program
June Flowers, Flowers Laboratories
Judy Morgan, Environmental Science Corporation

The purpose of the Laboratory Accreditation System Program is to develop a
system for the accreditation of environmental laboratories that consists of the
policies and  procedures, interpretations, guidance documents, and any related
tools  used   by  the  accrediting   authorities  to  implement  a  national
environmental laboratory accreditation program.

To ensure that the program  is implemented  effectively and to address the
needs of the stakeholder community, the Laboratory Accreditation Committee
will work in cooperation with other core programs within The NELAC Institute.
Specifically,  the LAC:
     •   Will work with the NELAP Board and the PT Program in the
         development of the laboratory accreditation system,
     •   Will work with the Consensus Standards Development Program
         to ensure that accreditation standards developed for this
         program are suitable for use, and
     •   Will seek the assistance of Expert Committees when developing
         guidance.

In addition to developing the laboratory accreditation system, this  program is
also  responsible  for   establishing  a  national  database  of  accredited
laboratories. This presentation will  summarize the activities of  the LAC since
its formation.
Addressing the Address Matchers:
Geocoding Performance Metrics for EPA Facilities
Pat Garvey, U.S. EPA OEI

Background
Reliable and  accurate geocoding (i.e., the  process of assigning  latitude and
longitude  values to a location) is a core requirement for the EPA's  facility
management  system.   As  geocoding  requirements for  the  EPA's  user
communities  grow, expectations for positional accuracy  and performance
become all the more significant.  To this end, this paper maps out the various
geocoding engines EPA has at its disposal and compares their functionalities
within the overarching intent of providing the EPA with the most robust
geocoding solutions for all of its facility management needs.  The three vendor
geocoding packages used in this analysis are Oracle Geocoding 10g, ESRI
ArcGIS Server (ArcSDE 9.2), and ArcIMS Route Server 9.1.  Each vendor's
best  configuration  practices will  be  applied to  ensure  consistency  in
deployment.

Methods
In this analysis, 108,000 EPA site locations that have missing  latitude and
longitude  values  will  be  randomly selected from  the EPA Facility Registry
System (FRS) to encompass all EPA regions and program offices.  The records
will be independently geocoded within  the same server  environment using
Dynamap 2000®, version 16.1 street network data.  The records pulled from
the FRS database will have a minimum  requirement of sufficient information
to produce a geocoded latitude and  longitude (records  containing a street
address, a city, and a  state).

To  confirm  these  latitude  and   longitude values,  several  precision  and
performance metrics will be utilized for each geocoding engine.  The percentage
of 'hit' versus 'missed' values will be used to assess the degree of compatibility
of the geocoding algorithms used.  Performance will be measured by timing the
duration from start to finish for each geocoder.  Positional accuracy will be
assessed based on a comparison of 100 arbitrary GDT parcel centroid
coordinates matched against expected address points.  Each set of results will
be  tested for general variations in positional  accuracy using Oracle Spatial
procedures and the ESRI  Geostatistical Analyst. Correlations of latitude and
longitude  values from each geocoding engine  will be presented as well  as a
covariance analysis  to ascertain significant trends within each engine.

Results
Results from this analysis will address  relative positional accuracy for each
geocoding  engine, percentage of matches  vs.  missed locations, and  coding
times for each  engine.  Specific  resource  measurements will include  data
preparation needs for each  geocoding  engine, CPU  memory utilization per
engine, and engine parse times.
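As a rough illustration of the hit-rate and positional-accuracy metrics
described above (not the paper's actual scripts), the sketch below scores
geocoder outputs against reference coordinates using a great-circle distance.
The record layout and the 100 m reporting threshold are illustrative
assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lon2 - lon1) / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical results: (geocoded lat, geocoded lon, reference lat, reference lon);
# None marks a record the engine failed to match.
results = [
    (38.8977, -77.0365, 38.8979, -77.0366),
    (None,    None,     40.7128, -74.0060),
    (41.8781, -87.6298, 41.8786, -87.6251),
]

hits = [r for r in results if r[0] is not None]
errors = sorted(haversine_m(*r) for r in hits)

print(f"hit rate: {len(hits) / len(results):.0%}")
print(f"median positional error: {errors[len(errors) // 2]:.0f} m")
print(f"within 100 m: {sum(e <= 100 for e in errors)}/{len(errors)}")
```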
QA Guides for Improving the Data Usability Assessment Section  in
Project Plans
Pat Mundy, U.S. EPA OEI Quality Staff

Project personnel  are often  vague about  how to assess project  results  in
relation to project objectives.  QA staff who review QA project plans have little
guidance as to what is acceptable, and so frequently make few suggestions.  As a
result, many project plans have little information in the section for reconciling
data to user requirements (D3).

Drawing from detailed project planning guidance and examples from several
sources,  this  presenter  will  highlight the planning  tasks  relevant to data
usability and discuss how these tasks "play out"  in the translation of project
objectives to results.   This is important to QA managers  because comparing
the  written  plan  to  the  project results  can  be a tool  to  evaluate the
effectiveness of systematic planning.
Thursday 10:30 AM
TNI Proficiency Testing Program
Carl Kircher, Florida Department of Health
Carol Batterton, The NELAC Institute

The purpose of the Proficiency Testing Program is to ensure that an effective
proficiency testing system exists to support the accreditation of environmental
laboratories.  The  PT  Board  has  the following duties in implementing this
program:
     •   Provide assistance to the Board of Directors on the selection
         of a Proficiency Test Provider Accreditor (PTPA).
     •   Monitor  the  PTPA  to  assure  that  it  is  following  the
        requirements set forth  by the organization.
     •   Facilitate an annual caucus on proficiency testing.
     •   Review and evaluate PT data for the purpose of determining
        the appropriateness  of proficiency test study limits.
     •   Provide  recommendations  to  the  NELAP  Board   as to
        acceptance limits.

This presentation  will  summarize the  activities  of the PT  Board since its
formation.
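The abstract does not state how acceptance limits are evaluated; one widely
used convention in proficiency testing (the z-score of ISO/IEC-style PT
schemes) is sketched below with hypothetical assigned values and target
standard deviations, purely as background for the duties listed above.

```python
def z_score(result, assigned, sigma_target):
    """Conventional PT score: |z| <= 2 satisfactory, 2 < |z| < 3 questionable."""
    return (result - assigned) / sigma_target

# Hypothetical PT study: assigned value 50.0 ug/L, target SD 5.0 ug/L.
assigned, sigma = 50.0, 5.0
for lab, result in [("Lab A", 52.1), ("Lab B", 41.3), ("Lab C", 66.0)]:
    z = z_score(result, assigned, sigma)
    status = ("satisfactory" if abs(z) <= 2
              else "questionable" if abs(z) < 3 else "unsatisfactory")
    print(f"{lab}: z = {z:+.1f} ({status})")
```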
Quality-in-Depth: The Nexus  of Quality Assurance And Information
Assurance (Security) (Session)
Jeffrey Worthington, Director of Quality, U.S. EPA OEI

Many people feel there are really two essential issues for information:
    •   Information Assurance (i.e., information security) and
    •   Information Quality.

Information Assurance is the process to ensure the confidentiality, integrity,
and availability of information services.  Quality assurance is the means by
which managers plan and ensure the quality of products and services.

A  key concept  in  information assurance  is  to  ensure  security through  a
defense-in-depth approach.  Defense-in-depth  identifies  various perspectives
in  the  overall  information  system  and  encourages  establishing  specific
protocols at each level.  Quality professionals typically focus on the overall
management system and the project planning components.  Quality activities
include: 1) assuring  the  quality of processes which impact the product and
service quality and 2) establishing the capability to measure and understand
the quality of the products and services.

Quality professionals may benefit from the information assurance defense-in-
depth model. Because information is one of the enterprise's critical resources,
information  presentation and availability can  be  just  as  important  as  the
program  and project  processes which established initial information content
quality.   Once  that  content has entered a system, a new model can  be
applied,  a quality-in-depth  approach.   This suggested  approach requires
recognition that there are various levels, disciplines, and distinct processes
which ensure quality.  These include:
    •   Overall information  system quality  (which may be inherited
        from another organization)
    •   Web content/presentation quality
    •   Application quality
    •   Database (stewardship) quality
    •   Program/project quality (science quality if  science project,
        general product quality if an  administrative product)
    •   Customer service quality

This model is not the same as Michael Porter's "value chain" model, where
value is imparted to a product on its path to the customer.  This quality-in-
depth model recognizes that there is no single product; what the customer
gets from EPA is a matrix-product and, in some cases, a product within a
product.  In either case, quality must be assured at each level.  This
presentation explores the information assurance process and provides a
"how-to" for developing and implementing a "quality-in-depth" approach to
quality system management that can be incorporated into a quality
management plan.
Novel Electronic Tools for Improving and Streamlining Quality System
Operation (Session)
Clyde  M.  Hedin, Yogeshkumar C.  Patel, Arthur B.  Clarke, and  Garabet H.
Kassakhian, Ph.D., Shaw Environmental, Inc.
John D. Nebelsick,  U.S. EPA OSRTI

Shaw  Environmental's  USEPA Quality  Assurance  Technical Support (QATS)
Program supports the USEPA's Superfund Contract Laboratory Program (CLP)
by  developing performance evaluation  samples,  scoring laboratory results,
conducting data audits and  on-site laboratory audits,  and evaluating  EPA
methods.  The Shaw QATS Program's Quality Management System (QMS), as
required by its USEPA contract, has been ISO 9001 certified since 2001. As a
result of standard mandated customer focus and continual improvement, and
working with the Client to enhance customer satisfaction, QATS developed a
number of novel and  effective electronic tools to  streamline its QMS  and
increase productivity.  The workshop will include a  hands-on demonstration of
four electronic tools developed to streamline QMS  management, along with a
discussion of their applicability to your QMS.

The Integrated  Quality  System  (IQS)  is  a  database application that
supports core QMS management functions at Shaw's QATS Program.  It
electronically documents customer communications, customer satisfaction
reports, non-conformances, preventive actions, and continual improvements,
and manages Standard Operating Procedure (SOP) documentation, SOP
acknowledgment signatures, employee training needs assessments, and
training matrices.  Information from the IQS is extracted annually for
monitoring performance trending for metrics including positive  and negative
customer  feedback,  corrective  and  preventive  actions,  and  continual
improvements.  The Software Tools Records System is a data driven web
site and centralized database for recording  changes and modifications to in-
house software applications.  In addition to replacing the hardcopy verification
and validation documentation with  an electronic change control, the system
allows  for electronic  sign-off and/or comment by  organization  management
and the Quality Assurance Manager.  The Data  Archive  System is an
electronic  report repository, accessed by web  links  which  can directly access
and open any and all  source data files and any  raw data that comprise specific
deliverables.  The Electronic Deliverable System is a Microsoft SharePoint
web-portal which allows secure transfer of  deliverables to clients, to be
downloaded  at  their convenience.   In addition  to  reducing deliverable
turnaround times by replacing the traditional express mail  or e-mail, this
system allows transfer to multiple  clients and includes a toggle feature for
customer approval.
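The internals of the QATS tools are not described in the abstract; as a
generic illustration of the kind of database application the IQS represents,
the sketch below logs nonconformances and SOP acknowledgments in SQLite and
pulls an annual trend extract.  All table and column names are hypothetical.

```python
import sqlite3
from datetime import date

con = sqlite3.connect(":memory:")   # a real system would use a shared database
con.executescript("""
CREATE TABLE nonconformance (
    id INTEGER PRIMARY KEY,
    opened TEXT NOT NULL,
    description TEXT NOT NULL,
    corrective_action TEXT,
    closed TEXT
);
CREATE TABLE sop_acknowledgment (
    sop_id TEXT NOT NULL,
    revision TEXT NOT NULL,
    employee TEXT NOT NULL,
    acknowledged TEXT NOT NULL
);
""")

con.execute("INSERT INTO nonconformance (opened, description) VALUES (?, ?)",
            (date.today().isoformat(), "PE sample scoring discrepancy"))
con.execute("INSERT INTO sop_acknowledgment VALUES (?, ?, ?, ?)",
            ("SOP-123", "Rev 4", "J. Analyst", date.today().isoformat()))

# Annual extract of open nonconformances, of the kind used for trend metrics.
rows = con.execute("""
    SELECT substr(opened, 1, 4) AS year, COUNT(*)
    FROM nonconformance WHERE closed IS NULL GROUP BY year
""").fetchall()
print(rows)
```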
Visual Sample Plan Expert Mentor:   An Aid to  Correctly Using VSP
(Session)
Brent A. Pulsipher, John E. Wilson, and Richard O. Gilbert
Pacific Northwest National Laboratory

Visual Sample Plan (VSP) software, which can be downloaded free from
www.dqo.pnl.gov/vsp, can be used to determine the number and location of
environmental samples for a variety of sampling objectives.  VSP is based on
the Data Quality Objectives (DQO) systematic planning process, with
particular emphasis on Steps 6 and 7, which deal with the quantitative and
statistical issues related to determining the number of samples needed.
Recently, an effort has begun to develop and embed an "Expert Mentor" into
VSP.   The  purpose of the  Expert  Mentor is  to  provide guidance, help,
recommendations, and warnings to the VSP user to (1) help prevent
inadvertent misuse of VSP, (2) help novice users understand how VSP works,
and (3) help ensure that the number and location of samples obtained
using VSP are appropriate for the particular sampling situation at the site
being studied. This paper will discuss and demonstrate the progress made to
date in implementing the Expert Mentor.

The  completed Expert  Mentor is expected to include the  following  four
components: (1) Pre-Design Assistance to help implement the first 5 steps of
the DQO systematic planning process  and to help users set  up in VSP the site
maps, floor plans, target populations, etc. that are  needed,  (2) Sample-
Design Guidance for selecting the sampling design (e.g., judgmental,  simple
random, grid, sequential) that is most appropriate for the site, (3) Parameter
Input Guidance on selecting appropriate values of performance criteria  for the
site, such as the probabilities of making decision errors, and (4) Data Analysis
Guidance to provide assistance and recommendations with data analyses such
as testing assumptions and computing statistical tests.

This paper focuses on progress to date on components 1 and 2.  The
presentation will include demonstrating the use of the Expert  Mentor to help
ensure that VSP  is not used to determine the number of samples without
adequate attention to systematic planning, the development of the conceptual
site  model,  the definition of "sample support," and the  practical sampling,
sample handling, laboratory and measurement methods needed to assure that
representative samples from the target population are obtained.  Additional
demonstrations of the  methods developed to support implementation of site
maps and other visual  aspects of VSP,  as well as the approach developed to
provide guidance in choosing the most efficient design will be provided.
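For context on the DQO Step 7 calculations that VSP automates, the sketch
below evaluates one standard sample-size formula for a one-sample test of a
mean, as given in EPA's QA/G-4 guidance; the numeric inputs are illustrative,
and VSP itself covers many more designs than this.

```python
import math
from statistics import NormalDist

def samples_for_mean_test(sigma, delta, alpha=0.05, beta=0.20):
    """EPA QA/G-4 sample size for a one-sample test of the mean:
    n >= sigma^2 * (z_{1-alpha} + z_{1-beta})^2 / delta^2 + 0.5 * z_{1-alpha}^2,
    where delta is the width of the gray region."""
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(1 - beta)
    n = sigma ** 2 * (z_a + z_b) ** 2 / delta ** 2 + 0.5 * z_a ** 2
    return math.ceil(n)

# Illustrative inputs: SD 20 ppm, gray region 10 ppm wide,
# 5% false rejection and 20% false acceptance error rates.
print(samples_for_mean_test(sigma=20, delta=10))   # -> 27 samples
```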
Thursday 11:00 AM
TNI Consensus Standards Development Program
Kenneth Jackson, New York State Department of Health
Jane Wilson, NSF International

The purpose of the Consensus Standards Development Program is to develop
standards for the accreditation of environmental laboratories.   The NELAC
Institute (TNI)  is a voluntary consensus standards development organization
that is  committed to the development of standards for  use  by accreditation
bodies. Standards are being developed that will be widely applicable, and will
therefore  promote  a uniform  national  program of environmental laboratory
accreditation.   These  standards are modular, allowing their  assembly into a
series  of  volumes, each  specifically  designed  for  a  stakeholder  group
(Laboratories;  Accreditation Bodies; Proficiency Test Providers; Proficiency
Test Provider Oversight Bodies).  A volume is also  being  prepared for Field
Sampling and Measurement. The consensus process used by TNI in standards
development will  be described.
Thursday 11:30 AM
TNI Advocacy Program
Jerry Parr, The NELAC Institute

The purpose  of  this program  is  to promote  a national  program for the
accreditation of environmental laboratories by:
     •    Establishing  relationships with  other  organizations (e.g.,
         ACIL, AWWA, WEF) that  have an interest in  accreditation
         issues,
     •    Establishing relationships with EPA program offices,
     •    Developing  presentations and papers to promote national
         accreditation,
     •    Developing presentations and papers to promote The NELAC
         Institute,
     •    Providing outreach at national, regional and local  meetings,
         and
     •    Assisting with publication of the member newsletter.