              UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                       WASHINGTON, D.C. 20460
                                                       OFFICE OF
                                                 RESEARCH AND DEVELOPMENT
SUBJECT:  Report on EPA's QA  Program  from the National
          Academy of Sciences
FROM:     Stanley M. Blacker, Director
          Quality Assurance Management Staff (RD-680)

TO:       See Below

     At the request of the Agency's senior management, the National
Academy of  Sciences undertook  a  multi-year  study of the Agency's
quality assurance program.   The  final report resulting from this
study has now been issued.  A copy is attached.

     I believe  this report presents  a  strong  endorsement of the
key elements of the Agency's strategy for assuring the quality of
its data.  For  instance, the NAS panel singled out the concept of
Data Quality Objectives  (DQOs)  for particular endorsement.  How-
ever, the NAS panel also found some areas where implementation of
the QA program ought to be improved by shifts of emphasis (e.g.,
increased involvement and accountability of top management in
determining needed quality) and by some initiatives, e.g., development
of guidance on common usage of QA terms such as "bias" and "com-
pleteness."

     Also attached is a  copy of a 3-page summary of the NAS Report.
I urge you  to  read the  entire report since  it contains important
statements  about  the Agency's QA program and valuable recommenda-
tions for its improvement.  The summary was  prepared to  enable you
to obtain a quick  and  accurate picture of the  main substantive
points of the report.

     A workgroup composed of QA Officers representing National
Program Offices,  Regional Offices, and ORD  Laboratories  prepared
the  summary in cooperation with QAMS.   We  used this approach to
assure the  summary's  accuracy.

-------
Addressees

Office Directors
Deputy Regional Administrators
ORD Laboratory Directors
Quality Assurance Officers
AA Quality Assurance Representatives
Office of Regional Operations
ESD Directors

-------
Summary of the Report of the



National  Academy of  Sciences



           on the



EPA Quality Assurance Program
      September 28,  1988

-------
Introduction

     In 1983, EPA asked for the assistance of the National Academy
of Sciences (NAS) to assess the Agency's QA program and to provide
recommendations on improving its effectiveness.     The NAS panel,
chaired by Dr.  Stephen Berry of the University of Chicago, examined
the QA activities of National  Program Offices,  Regional Offices,
and the  Office of Research  and  Development  and worked  with  the
Quality Assurance Management Staff (QAMS)  over the  next several
years,  issuing an  interim report  in 1985.    The final  report,
submitted to EPA in August 1988, confirms the fundamental
strategy and principles of  QA which EPA has  adopted  and provides
recommendations for  further  improvement in the Agency's implementa-
tion of  QA.    In  particular,  the NAS  panel  endorses accelerated
institutionalization of the DQO process and recommends holding
management accountable for key portions of the QA program.

General Findings

     The NAS panel believes that EPA can achieve an effective
Quality Assurance (QA) program if it specifies the fundamental
elements of  a  QA program  and if  the responsibilities inherent in
these fundamental elements are embraced throughout the Agency.

     The Agency,  through  its Quality Assurance  Management Staff
(QAMS),  has  generally  done an  adequate  job  of  specifying  the
elements essential  to a  successful QA program and  has provided
detailed guidance regarding QA principles to Agency groups involved
with the generation  and/or use of environmental data.  Furthermore,
QA  responsibilities have  been  embraced whole-heartedly  by many
Agency groups and their managers.  However, not all components of the
Agency's QA program are yet fully implemented.

     The NAS Panel report had major findings in four areas:

  1. Accountability of management
  2. The importance of specifying  the adequacy of  data
  3. The need  for technical  guidance  documents
  4. The role  of the QA Officer.

1.   Accountability of Management.

     The NAS panel recommended that EPA make explicit precisely
where accountability rests for data quality. (p.11)*

     The panel determined that the operation of  an effective QA
program has been hampered  by the unclear placement of  accountabili-
ty  for data  quality within the Agency's structure.   As  a  means of
achieving  better understanding  and  more  involvement  of  program
managers,  specific  individuals or offices must be accountable if
the data supplied by  or  through their offices  fail to meet the
     *Page numbers refer to the NAS report.

-------
need.  Only if that accountability is recognized and responded to
will the accountable person have a viable QA program. (p.8)

     The panel recognized more than a little cynicism from the
regulatory and enforcement people concerning QA.  These people said
that it is wasteful to be concerned at all about data quality.  As
long as reasonable professional standards are maintained, people
are satisfied with simple QC.  Unless there is accountability for
collecting the right quality of data, no one is motivated to go
beyond simple QC. (p.21)  Furthermore, when accountability is not
specified at a high enough level, EPA officials have little
motivation to use funds for QA.

     The panel found:

  -  No clear placement of accountability for data quality in EPA.
     (p.11)
  -  Top management accountability has not yet been established.
     (p.14)
  -  The role of QA is still not apparent in the administrative
     chart of EPA.

2.   Data Quality Goals

     The initial  specification of  the quality of  data needed to
support Agency  decisions  and programs  is  an especially important
QA  element  because the  highest attainable quality may  be quite
unrelated to the  quality that makes data  adequate  for a purpose.
EPA must be careful to use its resources efficiently and not
purchase data quality in excess of what it needs (above items from
p.9).

     The process which translates the needs of Agency decision
makers into the specifications for data quality which scientists
and technicians can implement in their data collection is not
trivial, and the responsible Agency managers need to be involved
directly.  The key element is two-way communication between data
users and data generators. (p.18)
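
     The report does not give a formal notation for a DQO, but as a
purely illustrative sketch (the field names and threshold values
below are hypothetical, not drawn from the NAS report or from EPA
guidance), the outcome of that two-way negotiation might be recorded
roughly as follows:

    from dataclasses import dataclass

    @dataclass
    class DataQualityObjective:
        """Hypothetical record of what a data user needs from a supplier."""
        decision: str             # the Agency decision the data will support
        parameter: str            # what is to be measured
        max_relative_bias: float  # acceptable systematic error (fraction)
        max_relative_sd: float    # acceptable imprecision (fraction)
        min_completeness: float   # fraction of planned samples that must be valid

    # The data user states the need; the supplier designs QC to meet it.
    dqo = DataQualityObjective(
        decision="Does the site exceed the ground-water action level?",
        parameter="benzene concentration (ug/L)",
        max_relative_bias=0.10,
        max_relative_sd=0.20,
        min_completeness=0.90,
    )
    print(dqo)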

     Communication  Between  the  Agency's  Data Users  and  Data
Suppliers.  The NAS panel found that QAMS, through  its  DQO initia-
tive, has proposed a means to achieve the communication link  needed
between data users and data suppliers.  The DQO process is seen as
the  mechanism for  incorporating data  into decision  making  in a
reasonable, effective, and efficient manner.  EPA's Deputy Ad-
ministrator issued memos in May 1984 and November 1986 on DQO
institutionalization within EPA.  The NAS panel found that DQO
institutionalization is happening more slowly than the NAS panel
considers reasonable. (p.20)

     The NAS panel  recommends  that the Agency use  the  DQO process
to control the  scope for data gathering and that  DQOs be  developed

-------
in a cooperative, participatory way with QAMS taking a strong
initiative in all aspects of the DQO process. (p.22)  This
activist role should be explicitly stated by the Administrator as
one of QAMS's responsibilities. (p.19)  The panel also suggested
that development of DQOs for broad areas of data use (e.g.,
regulation of nonradioactive dumps for toxic waste or enforcement
of point-source emission regulations) would assure reasonable
coverage of Agency data collection while avoiding the danger of
trying to do too many case-specific DQOs.

3.   Technical Guidance Documents

     The NAS panel also found a need for Agency-wide standardiza-
tion  of  common practices  for  data measurement,  treatment,  and
management.  The panel recommended that QAMS initiate and guide the
development of  technical  guidance documents  for measurements and
data practices that have Agency-wide application.

4.   The Role of the QA Officer.

     In general,  the NAS panel calls for an  empowerment  of the
Agency's QAOs.  The QAO "protects" the accountable Agency manager
from the embarrassment of making wrong decisions because of inade-
quate data  and accordingly needs enhanced  stature and authority
within the organization.  (p.12).

Other Recommendations

     The NAS panel also recommended that  EPA:

     Develop and implement mechanisms  for assessing the  reliability
     of data  to be archived and disseminated  in  one of the many
     Agency data banks and for including notations of this assessed
     reliability in the archived data.   Currently, a multitude of
     data generated by EPA is stored in computer data banks without
     any indication of  its quality.   (p.29)

     Allocate resources to what is most needed, not necessarily to
     what people know best how to do. (p.24)

     Reestablish QA training programs. (p.14)

     Establish standing panels of experts to advise about data
     collection in crisis situations. (p.16)

     Use the SAB to monitor QC and EPA's progress in meeting these
     recommendations. (p.16)

-------
 Final Report on
 Quality Assurance
 to the
 Environmental Protection Agency

-------
                        FINAL REPORT
                    ON QUALITY ASSURANCE
                           TO THE
              ENVIRONMENTAL PROTECTION AGENCY
           Panel on Quality of Environmental  Data
               Numerical Data Advisory Board
Commission on Physical Sciences, Mathematics, and Resources
                 National Research Council
                   NATIONAL ACADEMY PRESS
                   Washington,  D.C.  1988

-------
NOTICE:  The project that is the subject of this report was approved by
the Governing Board of the National Research Council, whose members are
drawn from the councils of the National Academy of Sciences, the
National Academy of Engineering, and the Institute of Medicine.  The
members of the committee responsible for the report were chosen for
their special competences and with regard for appropriate balance.
    This report has been reviewed by a group other than the authors
according to procedures approved by a Report Review Committee
consisting of members of the National Academy of Sciences, the National
Academy of Engineering, and the Institute of Medicine.

    The National Academy of Sciences is a private, nonprofit,
self-perpetuating society of distinguished scholars engaged in
scientific and engineering research, dedicated to the furtherance of
science and technology and to their use for the general welfare.  Upon
the authority of the charter granted to it by the Congress in 1863, the
Academy has a mandate that requires it to advise the federal government
on scientific and technical matters.  Dr. Frank Press is president of
the National Academy of Sciences.

    The National Academy of Engineering was established in 1964, under
the charter of the National Academy of Sciences, as a parallel
organization of outstanding engineers.  It is autonomous in its
administration and in the selection of its members, sharing with the
National Academy of Sciences the responsibility for advising the
federal government.  The National Academy of Engineering also sponsors
engineering programs aimed at meeting national needs, encourages
education and research, and recognizes the superior achievements of
engineers.  Dr. Robert M. White is president of the National Academy of
Engineering.

    The Institute of Medicine was established in 1970 by the National
Academy of Sciences to secure the services of eminent members of
appropriate professions in the examination of policy matters pertaining
to the health of the public.  The Institute acts under the
responsibility given to the National Academy of Sciences by its
congressional charter to be an adviser to the federal government and,
upon its own initiative, to identify issues of medical care, research,
and education.  Dr. Samuel O. Thier is president of the Institute of
Medicine.

    The National Research Council was organized by the National Academy
of Sciences in 1916 to associate the broad community of science and
technology with the Academy's purposes of furthering knowledge and
advising the federal government.  Functioning in accordance with
general policies determined by the Academy, the Council has become the
principal operating agency of both the National Academy of Sciences and
the National Academy of Engineering in providing services to the
government, the public, and the scientific and engineering
communities.  The Council is administered jointly by both Academies and
the Institute of Medicine.  Dr. Frank Press and Dr. Robert M. White are
chairman and vice chairman, respectively, of the National Research
Council.
Available from:

Numerical Data Advisory Board
2101 Constitution Avenue, N.W.
Washington, D.C.  20418

Printed in the United States of America

-------
                  PANEL ON QUALITY OF ENVIRONMENTAL DATA
R. STEPHEN BERRY, University of Chicago, Chairman
JACK G. CALVERT, National Center for Atmospheric Research
EDWARD D. GOLDBERG, University of California, San Diego
PAUL MEIER, University of Chicago
R. BRADFORD MURPHY, Bell Laboratories
JOHN J. ROBERTS, Argonne National Laboratory
                      NUMERICAL DATA ADVISORY BOARD
GERD ROSENBLATT, Lawrence Berkeley Laboratory, Chairman
SIDNEY ABRAHAMS, Bell Laboratories
JOHN C. DAVIS, University of Kansas
MORRIS DE GROOT, Carnegie-Mellon University
F. W. MCLAFFERTY, Cornell University
STANLEY L. MILLER, University of California, San Diego
R. BRADFORD MURPHY, Bell Laboratories
ROGERS RITTER, University of Virginia
DAVID W. H. ROTH, JR., Allied Signal, Incorporated
JAMES J. WYNNE, Thomas J. Watson Research Center, IBM
GESINA C. CARTER, Staff Director
                                   iii

-------
       COMMISSION ON PHYSICAL SCIENCES, MATHEMATICS, AND RESOURCES
NORMAN HACKERMAN, Robert A. Welch Foundation, Chairman
GEORGE F. CARRIER, Harvard University
DEAN E. EASTMAN, IBM, T. J. Watson Research Center
MARYE ANNE FOX, University of Texas
GERHART FRIEDLANDER, Brookhaven National Laboratory
LAWRENCE W. FUNKHOUSER, Chevron Corporation (retired)
PHILLIP A. GRIFFITHS, Duke University
J. ROSS MACDONALD, University of North Carolina at Chapel Hill
CHARLES J. MANKIN, Oklahoma Geological Survey
PERRY L. McCARTY, Stanford University
JACK E. OLIVER, Cornell University
JEREMIAH P. OSTRIKER, Princeton University Observatory
WILLIAM D. PHILLIPS, Mallinckrodt, Inc.
DENIS J. PRAGER, MacArthur Foundation
DAVID M. RAUP, University of Chicago
RICHARD J. REED, University of Washington
ROBERT E. SIEVERS, University of Colorado
LARRY L. SMARR, National Center for Supercomputing Applications
EDWARD C. STONE, JR., California Institute of Technology
KARL K. TUREKIAN, Yale University
GEORGE W. WETHERILL, Carnegie Institution of Washington
IRVING WLADAWSKY-BERGER, IBM Corporation
RAPHAEL G. KASPER, Executive Director
LAWRENCE E. McCRAY, Associate Executive Director
                                    iv

-------
                                 PREFACE
    In April 1983, the Environmental Protection Agency (EPA) asked the
National Research Council (NRC) to review the agency's control of the
quality of its scientific data, and particularly its Quality Assurance
(QA) program.  The NRC called upon its Numerical Data Advisory Board
(NDAB) to carry out the task.  To conduct the requested assistance and
review, the NDAB established a panel of six scientists to work with
EPA, under the guidance of the NDAB, for three years, to (1) engage in
discussions and briefings to help the EPA establish a sound QA program,
(2) review methods, methodology, and selected key EPA QA documents, and
(3) present its findings and recommendations.  The final report is to
be distributed as an advisory document that EPA could use to strengthen
its programs, particularly those involving numerical scientific data.
    The panel's study is now concluded, and this is the report of its
findings and recommendations.  An interim report was prepared in early
1985 to give the EPA information and advice on several particularly
urgent problems at the midpoint of the study.  The agency responded to
that report in a very positive way; this report in part discusses the
issues covered in the interim report and how the EPA is currently
dealing with them.  In addition, this report discusses several issues
that were not discussed in the interim report.  There are still some
significant unresolved problems, and there are some very positive steps
that could be taken toward solving some of the biggest, most
refractory, and long-standing problems.  Our overall outlook is
hopeful, and we wish the EPA success in the establishment and
maintenance of an exemplary Quality Assurance program.
    We wish to thank the EPA for opening its operation to us and
particularly to thank Stanley Blacker and Thomas Stanley for their
assistance and candid insight into the problems of the agency.
                                          R. S. Berry
                                          Chairman
                                          Panel on Quality of
                                          Environmental Data

-------
                                CONTENTS
1.      EXECUTIVE SUMMARY AND KEY RECOMMENDATIONS                   1

             Key Concepts Required for an Effective
             QA System                                             1
             Key Responsibilities for an Effective
             QA System                                             2
             Key Recommendations                                   3

  2.    INTRODUCTION                                                5

             Charge to the Panel and Its Realization               5
             Means of Addressing the Charge                        5
             What the Report Aims to Accomplish                    6

  3.    TERMS OF REFERENCE:  CONCEPTS OF QA AND QC                  8

             Definitions of QA and QC                              8
             Specifying "Adequacy" of Data Quality                 9

  4.    INSTITUTIONAL ISSUES                                       11

             Accountability and Organizational Structure in EPA   11
             Characteristics Necessary for a QA Program to
             Function                                             12
             How Does EPA's QA Program Compare with These
             Characteristics?                                     13
             Responsibilities:  Which Should be Centralized and
             Which Distributed                                    14
             "Crisis" Panels                                      16
             Role of the Science Advisory Board                   16

   5.   DEFINITION OF ADEQUATE QUALITY AND THE PURPOSE
       OF DATA:  THE DQO                                          18

   6.   IMPLEMENTATION OF THE DATA QUALITY OBJECTIVE
       PROGRAM:  THE PURPOSES OF DATA AND THE ADEQUACY OF
       THEIR QUALITY                                              20

   7.   PRIORITIES IN ALLOCATING EFFORTS TOWARD DATA GATHERING     24

   8.   ESTABLISHING COMMON PRACTICES                              26

   9.   ARCHIVING AND DISSEMINATING QUALITY-CHARACTERIZED DATA     29
                                   vi

-------
10.    ILLUSTRATIVE EXAMPLE:  THE RCRA "GROUND WATER MONITORING
       TECHNICAL ENFORCEMENT GUIDANCE" DOCUMENT--HOW THE
       SYSTEM COULD BE MADE TO WORK                               31
APPENDIXES
       A.     Meeting Agenda                                       34
       B.     Interim Report                                       35
       C.     List of Abbreviations                                50
                                  vii

-------
               1.  EXECUTIVE SUMMARY AND KEY RECOMMENDATIONS
    The primary  issues associated with establishing and maintaining an
 effective  quality assurance program  in the Environmental Protection
 Agency fall  conveniently  into two categories:  the specification and
 acceptance throughout the Agency of  a few key concepts, and the recog-
 nition and acceptance, again throughout the Agency, of the responsibil-
 ities following  inevitably from the  basic concepts.
    This summary first defines the key concepts, starting with the very
 notion of  quality assurance itself,  and then integrates the implemen-
 tation of  that concept into the existing institutional structure of EPA
 by defining  precisely the components of this structure.
    Then,  with the conceptual structure in hand, this summary lays out
 the responsibilities that fall on each of the individuals and
 offices—and here, the panel emphasizes that this includes data users
 as much as data  suppliers—if EPA is to have effective quality
 assurance  and, more important, the underlying substantive provision of
 data of known and acceptable quality.
    Key recommendations then follow.
            Key concepts required  for an effective QA system

    1.   Quality assurance  (QA)  is the guarantee that the quality of a
product is actually what is claimed by the supplier, as achieved by the
practice of quality control (QC).   The purpose of QA is to assure the
supplier as well as possible that  the needs of the user will be met, or
at least that the user will know the quality of the product.  For EPA's
data, this means that the user—meaning the person responsible for
assessments of legislation or regulation or compliance monitoring--can
be assured that the quality of the data on which the criteria are based
is indeed that for which he or she has asked, or is, at the very least,
that specified by the supplier.  Quality assurance, being a guarantee,
is a means for an accountable individual or office to protect the
integrity of that individual or  office.

    2.   A quality assurance officer (QAO) reports directly to the
person who requires the protection of quality assurance.  The QAO
should be independent of those whose data are being quality assured and
should have"no other conflict-of-interest positions.  The QAO should be
technically Highly trained, and  have competence in statistics.  The QA
officer should have some authority over the suppliers of the product.
This authority should be exercised in a mode of friendly guidance as
much as possible, which can in many cases be done because one presumes
that the supplier and the QA officer have a common goal.

    3.   Data quality objectives (DQOs) are an interactive management

                                    -1-

-------
                                   -2-

tool used to interpret data needs of the data-using administrator/
regulator to the data supplier, in a manner the supplier can translate
into objectives for quality assurance and appropriate levels of quality
control.  These devices, intended to allow data suppliers to improve
the services they provide to data users, are now seen more as harass-
ment than as aids.  It should be a responsibility of the Quality
Assurance Management Staff to change and improve this situation and
see that the DQO system becomes effective.

    4.   The Quality Assurance Management Staff (QAMS) is an important
office that must be defined on the administrative chart.  QAMS needs to
be at a high enough level and staffed by recognized technical experts
so that it has the influence and respect required to help ensure that
quality assurance is correctly carried out in all parts of the Agency
and to facilitate the work of the QAO and the preparation of the DQOs.
             Key responsibilities for an effective QA system

    1.   At every level, responsibilities of all parties must be
clearly  defined.  If QA is to be useful or effective, an office must
be held accountable for the quality of the product.  Accountability for
data quality should be assigned by EPA if it wishes to use QA to assure
the quality of its data.

    2.   Data users (administrators, regulators), by the nature of
their responsibilities, are accountable for their decisions, including
those depending on environmental data.  They therefore need the assis-
tance of a quality assurance program through which they can find a
way to communicate their needs to EPA's suppliers of data.  This in-
cludes assessments of sites, situations, and effects as well as labora-
tory data.  Data users need to collaborate interactively and draw on
the assistance of the relevant QAO and QAMS people in DQO development.

    3.   The QAO is responsible for four items:  (a) assuring, by
suitable checks, the accountable person or office supplying data that
the quality of data meets the needs of the user so far as possible and
that the user in any case knows the quality of the data; (b) carrying
out the interactive development of the DQO, by data user and provid-
er, which specifies a scheme appropriate to the required data and data
quality (when possible, DQOs should be prepared for classes of similar
data situations, to eliminate unnecessary and costly repetition);
(c) monitoring site observation plans and their execution; and
(d) overseeing and assisting the field scientists  in their
responsibilities (next item).

    4.   The field, or "bench" scientists are responsible for (a) the
technical conduct of the measurement, including quality control  (QC)
and "good laboratory practices"; (b) participation in development of
the DQO, indicating what is technically feasible for a given level or
size of project; and (c) participating in Agency-wide development of
documents defining common practices.  Skills must  be commensurate with
these tasks.

-------
                                   -3-

     5.   The Quality Assurance Management Staff (QAMS) should have the
 responsibility for the QA  infrastructure, oversight, and active inter-
 action with all parties.   The visibility and powers of this office need
 to  be elevated in EPA (currently QAMS does not appear on the
 administrative chart).  The panel believes QAMS's responsibilities
 include  (a) scientific and technical support to the QAOs; (b) coordi-
 nation of Agency-wide development and ultimately, preparing and dis-
 seminating documents for common practices; (c) establishment of crisis
 panels for data, composed  of outside voluntary experts; and (d) con-
 ducting  ongoing quality assurance educational and assistance programs
 aimed at all three parties:  data users, data suppliers and QAOs, and
 management staff.  QAMS should employ technically qualified personnel
 whose competence includes  statistics.

    6.   The Science Advisory Board (SAB) is responsible for scientific
 and technical overview and backup for QAMS.  SAB's own expertise should
 include  quality control, quality assurance, and data handling.

     At the EPA, some elements of these concepts and assignments exist,
 others are being developed, and still others are lacking and must be
 put in place, for reasons  described in the remainder of this report.
                           Key recommendations

    The following are the key recommendations of the panel, beyond
those dealing with the concepts and responsibilities just mentioned.

    o    Specific accountability for data quality should be estab-
lished, in the sense that the holders of particular positions take on
responsibility that the quality of data supplied by or through their
offices is indeed as claimed.  Note that this is not a recommendation
to make individuals responsible for maintaining particular levels of
data quality.

    o    Levels of data quality should be expressed as goals set by the
needs of the users, which will sometimes but not always be possible to
achieve; when suppliers cannot assure that they can provide data of the
quality the users want, they should make the users fully aware of the
situation.

    o    Through efforts by QAMS and higher administrative levels, a
degree of communication between data-using and data-supplying arms of
the Agency should be established, so that data suppliers understand
what constitutes "adequate" quality of data, as needed by the data
users, and can respond to the needs in a cooperative manner.

    o    Effort and funds for data should be allocated in a manner
assuring (a) that the needs of the user are met through achieving
adequate quality of the data and (b) that efforts yield what is most
needed, not necessarily what people know best how to do.

-------
                                   -4-

    o    Educational efforts should be reestablished within EPA
regarding quality control and quality assurance, particularly to
train new QA officers.

    o    A document or documents should be developed that set stan-
dards, guidelines, and common practices for data measurement,
treatment, and management, for Agency-wide use.  Setting of standards
on common practices should be done on an ongoing basis involving
scientists both from within and outside EPA and should incorporate, as
much as possible, conventions already established by international
agreement, or, lacking that, those with national acceptance.

    o    Standing panels of experts should be established who can be
called upon to advise about data collection in crisis situations.

    o    The Science Advisory Board should be used to monitor quality
control within the Agency and to follow the progress of the Agency
toward meeting the recommendations of this report.

-------
                             2.   INTRODUCTION
                 CHARGE TO THE PANEL AND ITS REALIZATION

    The charge given by the Environmental Protection Agency to the
 Panel on Quality of Environmental Data was to assist and advise the EPA
 Quality Assurance program so that this program would provide integrity
 and quality assurance of EPA's scientific data adequate to fulfill the
 EPA's intended scientific and regulatory purposes.  After discussion
 with the EPA staff, this was interpreted to mean that the panel should:

    1. Examine the structural and institutional situation of the QA
 program to see whether these fit the needs of the Agency and the
 program.
    2. Examine the specification of objectives of the QA program
 itself, to evaluate how well the objectives fit with the Agency's
 mission.
    3. Examine the budgeting, staffing, and documentation for the EPA
 QA program, to estimate how well those help the program and the Agency
 achieve their goals.
    4. Examine how the substantive operations of QA programs function
 in the regional offices, the Environmental Monitoring Systems
 Laboratories (EMSLs), and the support laboratories.
    5. Examine the relation between the maintenance of professional
 quality control in the Agency and the operation of the QA program.
                      MEANS OF ADDRESSING THE CHARGE

    The panel took its primary responsibility to be improving the
assurance of the quality of scientific data and the functioning of the
QA program of the EPA.  With this view, it operated with the
realization that only some of the panel's goals would be accomplished
by written reports.  Others were achieved by the running constructive
dialogue between the panel and the EPA staff and the review of relevant
documents in various stages of development.  Throughout this report,
the panel will refer to matters that were handled through such dialogue
and point out how the EPA's ways of dealing with these matters have
evolved.
    The panel carried out a series of meetings with various groups
within the EPA during its three-year study.  The calendar of these
meetings is in Appendix A; it shows how the panel tried to see a wide
range of EPA's component parts in action, and to talk with as many as
possible of the key actors involved in the maintenance and assurance of
data quality.  These included staff in the Washington headquarters at
many levels, from deputy administrator to staff scientist; people in
regional offices and laboratories, in Environmental Monitoring Systems

                                   -5-

-------
                                   -6-

Laboratories (EMSLs), and in support laboratories; and staff in
regulation and enforcement.  Most of the panel's regular contacts with
EPA were made through the Quality Assurance Management Staff (QAMS) of
the Washington office.  Many meetings were held with the full panel and
the selected EPA staff; several meetings consisted of one, two, or
three panel members with EPA staff in a particular office, in such
instances usually in EPA offices away from Washington, D.C.  The panel
also read a fairly large number of written reports, primarily internal
EPA documents, ranging from documents on guidelines and common
practices to documents on audits and documents involving DQOs.
    The interaction between the panel and the EPA involved frequent
discussions with QAMS, the preparation and distribution of the interim
report, two briefings with the deputy administrator and several senior
staff members particularly to discuss the interim report, and
collaboration on this final report.  A final briefing for EPA on this
report is anticipated.
                    WHAT THE REPORT AIMS TO ACCOMPLISH

    This report is  intended primarily as advice to the Environmental
Protection Agency,  including

    o  the administrator and deputy administrator;
    o  the assistant administrator for research and development;
    o  the Office of Acid Deposition, Environmental Monitoring and
Quality Assurance;
    o  the Quality  Assurance Management Staff;
    o  all the regional administrators and  their staffs;
    o  all EPA staff members charged with responsibility for data
quality, including  the assistant administrators for water, air, and
hazardous waste;
    o  the assistant administrator for enforcement and compliance
monitoring and the  Enforcement Offices under that administrator; and
    o  the Office of Standards and Regulations and the Science Advisory
Board.

    These are the primary generators and users of scientific data, the
people who are directly responsible for the quality of data and those
who depend on that  quality in order to carry out their missions.
    Secondarily, the report may be useful to others concerned with
knowing the reliability of data generated or used by  the EPA.  This
community may include, among others, committees of the Congress and
their staffs, government agencies concerned with the  effective
functioning of the  EPA, and other scientific, medical, and engineering
communities and institutions, such as the Center for  Disease Control.
    The report attempts to examine how the  quality of scientific data
generated or used by the EPA is maintained  and, more  particularly,
assured to the primary users of those data, especially the people
within the EPA using the data for regulatory, enforcement, or
scientific purposes.  The panel hopes that  this report outlines the
issues and problems of maintaining and assuring data  quality, the ways

-------
                                   -7-

the EPA has already made progress toward solving those problems, and
some ways in which the panel believes that the EPA should go further,
in many cases with little or no extra expenditure, to improve its QA
program and aspects of its operation that rest on scientific data.
    Some of the suggestions involve nothing more than a little
redistribution of responsibilities by asking data users to articulate
their needs more effectively to data generators.  Some involve
redistribution of efforts from areas where skills are high and answers
can be found easily toward areas where answers are badly needed, even
at very approximate levels, but where finding those answers is
relatively difficult.
    This report is intended as a stimulus for EPA to find new
approaches toward ensuring data quality, to enable it to move toward
its goals more effectively.

-------
              3.  TERMS OF REFERENCE:  CONCEPTS OF QA AND QC
                         DEFINITIONS OF QA AND QC

    Quality control (QC) is the maintenance and statement of the
quality of a product, specifically that it meets or exceeds some
minimum standard based on known, testable criteria.  Quality control in
science is a professional obligation that is usually assumed to be
deeply ingrained in  the training of aspiring scientists.  Science as we
know it cannot function unless its  products are so reliable that each
new piece of explicit work can be built  on the preceding results.  The
products of science  are data, insight, and understanding.  Quality
control regarding insight and understanding usually means maintaining
the validity of the  logic in the steps of an inference.  It is the
control of the quality of scientific data that demands special skills
and attention, in this case to the  degree that an outside advisory
study was requested  by EPA.  Establishing the inherent uncertainties of
the procedures and seeing that the  procedures are properly carried out
are typical central  components of quality control.  In carrying out a
scientific task, one is expected to take the responsibility to
establish quality control over the  data, including maintaining
surveillance over that quality.
    Quality assurance (QA) is the guarantee that the quality of a
product is actually what is claimed on the basis of the quality control
applied in creating that product.  Quality assurance is not synonymous
with quality control.  Quality assurance is meant to protect against
failures of quality  control.  The need for quality assurance is
predicated on the notion that some person or office is accountable if
the quality of the product fails to meet the standards claimed for it
and therefore that whoever is accountable has a strong interest in
seeing that the claims of quality are valid.
    In the context of scientific data, the logic behind a quality
assurance program is this:

    1.  A body of scientific data is put  to use by somebody who depends
on the data meeting  or exceeding a  specific level of quality in order
to achieve a particular objective.
    2.  If the intended use fails because the data do not meet the
standard of quality  claimed for them by  the supplier, then the supplier
of the" data—some particular person or office--can be held accountable
for that failure.
    3.  The accountable person or office  therefore has a strong vested
interest not only in providing data as economically as possible, but
also in seeing that  the standards of adequate quality are actually met;
an institutionalized quality assurance program is therefore created to
serve that purpose.
    4.  The balance between achieving adequate quality and economy leads

                                    -8-

-------
                                   -9-

to a design of the data-gathering effort commensurate with the user's
needs; that is, quality assurance assures that an assay is neither
underdesigned nor overdesigned.
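
    As a minimal sketch of this balance (the numbers and the margin
defining "far exceeds" are arbitrary illustrations, not values from
the report), the comparison of achieved quality against the quality
the user declared adequate might look like this:

    def classify_design(achieved_relative_sd: float,
                        required_relative_sd: float,
                        margin: float = 0.5) -> str:
        """Classify an assay design against the user's stated precision need.

        'underdesigned' -- the data do not meet the need;
        'adequate'      -- the data meet the need without large excess;
        'overdesigned'  -- quality (and cost) far exceed what the use requires.
        """
        if achieved_relative_sd > required_relative_sd:
            return "underdesigned"
        if achieved_relative_sd < required_relative_sd * margin:
            return "overdesigned"
        return "adequate"

    print(classify_design(0.30, 0.20))  # underdesigned: improve method or relax need
    print(classify_design(0.15, 0.20))  # adequate for the stated purpose
    print(classify_design(0.05, 0.20))  # overdesigned: quality bought in excess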


                  SPECIFYING "ADEQUACY" OF DATA QUALITY

    Scientific data serve many functions in the Environmental
Protection Agency.  They play roles in the proposal and drafting of the
laws and regulations, in the enforcement of laws and regulations, in
scientific research on environmental matters, and in guiding the
thinking in fields such as environmental medicine.  In each of these
uses, scientific data may or may not be of a quality that is adequate
for the use.  It is important for EPA to maintain the quality of its
data at a level that will make the data valid for those uses.  It is
also important for EPA to recognize that the highest attainable quality
may be quite unrelated to the quality that makes data adequate for a
purpose.  Sometimes, the highest attainable quality may fall below the
standard considered adequate for a particular purpose.  In this case,
either the methods of getting the data must be improved or the
expectations for data quality must be lowered.  Sometimes the highest
attainable quality may far exceed in quality and cost the quality
adequate for particular uses.  In this situation, EPA must be careful
to use its resources efficiently and not purchase data quality far in
excess of what it needs.
    The adequacy of the quality of a product for its task can only be
judged by the users of the product.  The suppliers should know what
quality can be achieved, respond to the users' needs in terms of
quality, and avoid unnecessary high costs associated with making the
quality far higher than is needed.  This maxim, translated explicitly
into the EPA situation, means that the users of EPA's scientific data
must specify as best they can what quality of data is adequate for
their purposes.  Then the suppliers can tune their quality control
procedures to meet these needs, and those accountable for data quality
can see that their quality assurance support is maintaining the
correspondence between claims and performance at the selected level.
    In the past, it was apparently never considered within EPA (or at
least, so the panel's findings showed) that the users of data should
convey to their suppliers what the quality of the data should be.*  The
discussions between the panel and the EPA seemed to have helped to
stimulate the Agency to establish procedures for articulating what
constitutes "adequate" quality of data for particular users  (e.g.,
those drafting or enforcing laws and regulations).  The panel discusses
this issue in detail in Chapters 5 and 6.  Here, it simply comments
*In the late 1970s, before the existence of  the Quality Assurance
 Management Staff (QAMS), a call was  issued  for specification  of data
 use and quality, but this recommendation was never  fulfilled.  This
 may have been part of the motivation to establish a QA program.

-------
                                    -10-
that an awareness of this issue is growing within EPA and that active
steps are being taken to implement a system to attain data quality that
meets intended use at the program level.

-------
                          4.  INSTITUTIONAL ISSUES
             ACCOUNTABILITY AND ORGANIZATIONAL STRUCTURE IN EPA

    Some of EPA's functions depend on the knowledge and reliability of the
quality of scientific and technical data.  Without dwelling unnecessarily
on specific instances, the panel can say that the Agency has, on occasion,
functioned badly because of failures in the maintenance of data quality;
such occasions were important stimuli for the creation of this panel.  In
discussions with the highest levels of the EPA staff, it was said to panel
members that there were probably no clear placements of accountability for
data quality.  It may be possible in principle for the Agency to function
without such specification, provided the QC procedures are virtually
impeccable.  However, in most situations, an extreme being field sampling
in crises, it is necessary to specifically place accountability to see
that the needed quality of data is achieved.  This in turn becomes the
justification for the existence of a QA program.
    The panel recommends that the EPA make explicit precisely where the
accountability rests for data quality.  No one person or office could
carry the accountability for the quality of all the data supplied by EPA.
However, each kind of data must be associated with some responsible
office--laboratory data with the appropriate EMSL, regional field data
with the appropriate regional office, and so forth.  It is useful to make
this specification not only because it focuses a responsibility that was
heretofore diffused, but also because it has direct implications for the
institutional structure of the QA program, as we shall now see.
    The administrative structure of EPA largely separates the
responsibility for supplying data from the responsibilities of using the
data to carry out the Agency's mission.  Presumably, when accountability
for data quality is designated, it will fall on those who administer the
programs that supply data.  Anyone in a position designated as accountable
for scientific data has direct interest in having the protection of a QA
program--perhaps only a single individual who reports directly to the
accountable person.  When accountability was not specified at a high
enough level in the decision process, EPA officials had little motivation
to use funds under their control for quality assurance.  As a consequence,
quality assurance was often left to people ill-trained to carry out its
critical demands, or to people whose other responsibilities conflicted
with their QA duties.  Thus, quality assurance was often unable to
function well enough to achieve its goals.  Only if there are people both
capable of establishing the programs and motivated by their responsibil-
ities to see that they function, can a QA program perform its job.
                                    -11-

-------
                                    -12-

           CHARACTERISTICS NECESSARY FOR A QA PROGRAM TO FUNCTION

    In any situation, a QA program must have certain characteristics if it
is to function:

    1.  Independence.
    2.  Technical competence and the capacity to assist.
    3.  Clearly understood authority.
    4.  Vigorous management.

It is vital that a person responsible for quality assurance be independent
of the people whose performance he or she is overseeing.  A QA officer
must be a staff representative, free to report to the person that officer
is protecting.  Second, a QA officer must be technically competent to
evaluate the work of those he or she oversees and must have the confidence
of those people, to the extent that the QA officer should be a resource to
assist the bench scientists in the control and maintenance of the quality
of their product.  Third, the authority of the QA officer must be clear
enough that the person to whom the QA officer reports can get accurate
information and that the QA officer can act to maintain or restore
adequate quality of the product.  Fourth, the QA program must be treated
as an active, ongoing monitoring program requiring vigilance and
cooperation among the data user, the data generator, and the QA officer.
    Within the EPA, these general terms translate into specifics in this
way:
    Quality assurance officers must report to and be members of staffs of
the accountable individuals they protect.  They must not be on the
payrolls of the people they oversee.  This is necessary to maintain the
independence that QA demands.  The QA officers must have the scientific
and mathematical competence to assist the people they oversee and evaluate
their work, to help them carry out their normal professional responsibil-
ities for quality control.  Furthermore, the central management of the
EPA's quality assurance program, the Quality Assurance Management Staff
(QAMS), should be technically competent to provide training and assistance
to the QA officers in the field on such general issues as statistical
problems and to see that QA officers get any needed assistance on specific
scientific problems.  The interaction of QA officers and QAMS with data
suppliers needs to be one of friendly support carried out in a
constructive yet authoritative relationship.  The Science Advisory Board
has a responsibility to provide scientific expertise and oversight to
assist QAMS and the QA officers on these matters.
    The necessary authority involves more than mandated power, but
maintaining that authority does require a clear designation of
responsibility and reporting in the organizational chart.  Each
accountable person should have the authority to place responsibility for
maintaining quality assurance on a QA officer.  Beyond that, the QA
officer should have the support of a set of Agency-wide documents
describing rules, norms, standards, and definitions regarding technical
terms such as precision, bias, limits of detection, and other scientific
terms and concepts that demand uniform  treatment throughout the Agency.
Otherwise, data use and interpretation will be riddled with errors.  This
is discussed further in Chapter 8.  Furthermore, the QA officers must be

-------
                                    -13-

empowered to carry out on-site observations and audits as they see
necessary.
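
    To illustrate why uniform definitions of such terms matter, the
following sketch adopts one common convention for each; these formulas
are examples only and are not the Agency-wide definitions the panel
calls for, which remain to be written:

    import statistics

    def relative_bias(measured: list[float], reference_value: float) -> float:
        """Systematic error of replicate measurements of a known reference,
        expressed as a fraction of the reference value (one convention)."""
        return (statistics.mean(measured) - reference_value) / reference_value

    def relative_sd(measured: list[float]) -> float:
        """Precision expressed as relative standard deviation of replicates."""
        return statistics.stdev(measured) / statistics.mean(measured)

    def completeness(valid_results: int, planned_results: int) -> float:
        """Fraction of planned measurements that yielded valid results."""
        return valid_results / planned_results

    replicates = [9.6, 10.4, 10.1, 9.9]
    print(relative_bias(replicates, reference_value=10.0))     # 0.0 (no bias)
    print(relative_sd(replicates))                              # about 0.034
    print(completeness(valid_results=45, planned_results=50))  # 0.9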
    Vigorous management of the QA program requires organizational ability,
particularly in  its formative stage.  After the program is established, it
must, above all, have technical strength.  Each particular scientific area
needs leadership that stays aware of the current progress in that area.
All the QA officers must possess skills in the statistical treatment of
data or must have access to staff with such skills, statistics being the
single topic that pervades all of data quality control and assurance.
This implies that QAMS needs at least one person who is widely recognized
for his or her expertise in statistics and who interacts with the
professional statistical community.
    An aspect of vigor and authority is very important in EPA.  The QA
staff must be able to elicit from the data users in the Agency clear
statements of what constitutes "adequate quality" of the data supplied to
them.  Providing this information is a new challenge for many of the
users, particularly those involved  in regulation and enforcement rather
than in scientific application of the data.  Therefore the QA staff must
be able to educate and work with the data users in an understanding manner
to bring those statements of the needed quality—now called data quality
objectives (DQOs)--into a form that can be used by the data suppliers.
This topic is addressed in detail in Chapters 5 and 6.
    Education is another task that  the QAMS must carry out.  Initially,
the concept of quality assurance and its role in the functioning of the
Agency must be communicated to the  accountable individuals and to those
who have to explain "adequate quality" to the suppliers.  When the first
generation of these people understands quality assurance, one can expect
that maintenance of their own best  interests will keep this knowledge
alive in the successors of that first generation.  A different kind of
training, a continuing program, will be required for the QA officers.
This is a task that cannot be left  to haphazard learning-by-doing.  New QA
officers should  receive the benefit of some supervised experience before
they are considered trained for their tasks.  A regular training program
should be part of the QAMS responsibilities.
    Short-term exchanges among QA staff from different agencies with QA
programs could be a beneficial adjunct to an internal training program.
       HOW DOES EPA'S QA PROGRAM COMPARE WITH THESE  CHARACTERISTICS?

    When the panel began its  review,  the QA  officers were working
generally only part-time in that capacity  and were frequently working  for
the people they were supposed to be overseeing.  It was usual for QA
officers to be on the payrolls of the people from whom they should be most
independent.  The position of QA officer was considered a drain on the
budget and was sometimes used as a place to put people who could not
contribute to the substantive work of the Agency.  It was easy to see that
this situation arose naturally because there was no  clear accountability
for data quality and, therefore, little sense of any need for the
protection whose provision is the  main reason for a  QA program. In fact,
when the panel began its tasks, the distinction  between  quality control
and quality assurance was not well understood in the EPA.

-------
                                    -14-

    Much progress has been made to correct this situation insofar as the
administrator's office and QAMS recognize the problem.  However, the panel
feels that the accountability for data quality has not yet been
established, and, until that is done, no QA program will be able to
function as it should.  The role of quality assurance is still not
apparent in the administrative chart of the EPA.
    The responsibilities of data suppliers are not yet understood.  At
present, it seems to the panel that the QA program encourages those
responsible for supplying data to write long documents, often largely
formal restatements of standard model documents, which at best say how
quality assurance and quality control will be maintained.  However, the QA
officers do not in many cases have the authority to inspect and audit
contractors, to see whether there is any relation between the statements
and reality.  The panel encourages the Agency to put much more emphasis on
the field activity of the QA officers and much less emphasis on the formal
documentation through lengthy written QA program plans and QA project
plans.
    Some of the QA officers are very competent, responsible, and dedicated
individuals.  Others are in their positions by default or for other
improper reasons.  Assignment of accountability for data quality to the
data supplier would motivate the supplier to insist on the technical
competence of the QA officer.
    The educational component of the QA program no longer exists.  This is
clearly inconsistent with what the panel considers necessary for the
program to function.
    The panel recommends that EPA restore a training program, carried out
by QAMS, for its QA officers and that this program draw in part on
experienced individuals who have participated in successful QA programs
elsewhere.  QAMS should also provide technical training and assistance to
the data users, but in order to do this it will need a more technically
skilled staff than it now has.
    The authority of the QA officers is not clear.  The mandates are not
well specified, and the officers are often in very uncomfortable positions
in which they simply cannot make critical reports.  Some of them do not
have the technical competence to provide scientific assistance to their
colleagues.  Again, the panel returns to the point that the present
situation, although changing, leaves some QA officers in situations of
strong conflict of interest.
   RESPONSIBILITIES:  WHICH SHOULD BE CENTRALIZED AND WHICH DISTRIBUTED?

    Whenever there is a need for Agency-wide uniformity, there must be  a
centrally coordinated program or effort.
                              Quality Control

    The panel was shown, especially  in  its earlier visits, a number  of
documents generated in the laboratories and  regions  in reference  to
quality control and "good laboratory practices."  These did not
necessarily have Agency-wide acceptance.  In contrast, the EMSL/Las Vegas

-------
                                    -15-

 staff  showed  the panel  a guidance document valid on an Agency-wide level.
 The EPA  is  encouraged to prepare, as much as possible, guidance documents
concerning measurements and data practices that have Agency-wide
 application.  The preparation of such documents should be given to a group
 with the required expertise and responsibility for the subject, perhaps
 drawn  together just  for this purpose.   However, QAMS should encourage and
 coach  the units in the  preparation of these documents, for it must bear
 the ultimate  responsibility for their preparation, distribution, and
 utility.  Those who  draw up the documents should consult with future users
 of the documents from all  relevant sectors of the Agency and should
 respond  to  their needs.  The Agency should also consult representatives
 from the community of EPA's contractors and from technically competent,
 relevant, independent environmental organizations.  The document itself
 should reflect equivalencies of different measurement approaches and
 should allow  flexibility,  where applicable, among technical options.  This
 kind of  general, Agency-wide guide alleviates ambiguities, reduces errors
 in the handling and  use of data, and eliminates both redundant effort in
 writing  documents and conflicting instructions.  EPA personnel and
 contractors must understand that if the data are requested for EPA's
 purposes, then the measurements must be made in harmony with EPA
 guidelines  and practices.   Application  of the guidelines is then the
 responsibility of the distributed units responsible for the measurements.
                             Quality Assurance

    While the responsibilities of the QA officers are strictly to those
individuals who require QA protection, and further centralization of this
protective mechanism is not required, the functions of oversight,
education, and scientific/technical support  for quality assurance as
described in this report should be carried out by QAMS.  Without strong
QAMS support in these areas, quality assurance may have mixed results, and
EPA's QA effort may receive  poor overall  recognition from those it should
be serving.  It also  may not be possible  for the Science Advisory Board to
review adequately the QA activities that  are distributed throughout the
Agency, and QAMS must be the primary focus of interaction for this board.
                              Data Management

    Centralization of data has already  been under way at  the  EPA  in  data
bases such as STORET, SAROAD, and now BIOSTORET.  Uniformity  of data
practices, as well as data quality control, quality documentation, and
quality descriptions, is essential to the specification of the data ("meta
data"); centralization of the responsibility for rules and guidelines is
the key to uniformity here, too.  It must be added here that simply
transferring data gathered by the laboratories, states, regions, and
contractors merely produces unusable data dumps.  Access to and use of
data in such repositories are extremely difficult,  if possible at all.
The Agency must provide adequate support for maintenance  and  further
development of these data bases, which  it  will certainly  have to  use in
the future.  Some issues to be addressed by a central office  include:

What data are worth keeping, what data can or should be compacted, and
what should be discarded?  When is application of expert systems
worthwhile?  To what level of detail do modelers need to know each data
element? These and a host of other broad data issues must be addressed by
the managers of these Agency-wide data bases, in consultation with data
users.
                              "CRISIS" PANELS

    The panel has discussed the creation of "crisis" panels, or standing
panels of experts, to allow EPA to have access to a preselected voluntary
group of experts as advisors in an emergency.  EPA should not maintain
in-house expertise on the entire spectrum of potentially hazardous acute
situations it might face.  But EPA is responsible for coping with such
crises.  The intent of this recommendation is to reduce EPA's response
times in crisis situations that require immediate data gathering and to
improve the quality of the data gathered.  The panel has no specific
structure in mind for such crisis panels.  A single large panel could be
on call to the QAMS office, or about three experts could be on call to
each assistant administrator.  Other ways to implement this intent are
also possible.  These experts could then identify other advisors
especially suitable for addressing the emergency at hand.
    The selection of these crisis panel experts could be assisted by
obtaining recommendations from the scientific community, particularly from
members of the Science Advisory Board, the Council on Environmental
Quality, and the National Research Council, and from other  interested
parties such as the American Statistical Association.  It is proposed that
appointment as a crisis panel member would be an honorary position and
that scientists in that position would provide voluntary service, with
reimbursement for out-of-pocket expenses only.  As the EPA's crisis
response becomes established and takes over labor-intensive work, these
volunteers could step back, while continuing to give advice.
    An existing crisis panel or panels should reduce both the time of
reaction and the risk of inappropriate first actions in these particularly
crucial times for EPA and the public by making better data  available on
shorter notice.
                     ROLE OF THE SCIENCE ADVISORY BOARD

    Prior to the establishment of this NRC panel  for the EPA,  the Science
Advisory Board (SAB) had an Environmental Measurements Committee, which
assisted the EPA with measurement data aspects during the crisis of Love
Canal and with other data matters.  At the time of Administrator Gorsuch,
this committee was disbanded while the NRC panel  was created.
Consequently, the expertise of SAB on quality assurance has been reduced
so that SAB involvement in matters related to QAMS is now practically
nonexistent.
    The panel believes, after the period it has spent with QAMS, that  the
progress made is encouraging, so that further continuous panel assistance
may not be required, at least in the near future.  EPA might, however,
call upon the NRC two or three years from now to evaluate the QA program
and make further recommendations.  The panel believes that it is a proper
part of SAB's mission to advise QAMS and to monitor the progress of the
Agency in achieving the objectives of this report.  In order to do so, SAB
must include members with expertise in statistics, sampling, measurement
protocol, and guidelines preparation, and especially also in the conduct
of QA programs.  Possibly, a separate committee or subgroup of SAB should
be assigned the responsibility of regular interaction with QAMS.

-------
    5.  DEFINITION OF ADEQUATE QUALITY AND THE PURPOSE OF DATA:  THE DQO
    The amount of effort to be devoted to data gathering must be carefully
tailored to the specific purposes for which the data are intended.
Section III of the interim report (see Appendix B) describes the reasons
for this and reflects the situation at that time, namely the great need
for two-way communication between the administrators and regulators who
use the data and the laboratories in the regions and states and the
contractors who generate the data.
    At the time of the writing of the interim report, the concept of data
quality objectives (DQOs) had just been introduced in EPA by QAMS.  This
was a major step in a direction strongly endorsed by the panel but not yet
implemented at that time.  Since then, the panel has followed some of the
attempts at establishing the DQO mechanism to assure that data-gathering
efforts are neither inadequate for the intended purpose nor exorbitant:

    1. At a meeting held on November 25-26, 1985, at the EMSL in
Cincinnati, an example was shown in which a DQO document was completed at
the laboratory.  It appeared to the panel that, while a paper product
resulted, in reality the necessary two-way dialog was not achieved to
guide the EMSL so that it could translate the intended use of the data
into requirements placed on the data-gathering effort.  A better two-way
dialog must be achieved between data users and data suppliers in order for
the DQO system to effectively function as "translator" between the two.
The panel certainly does not intend that senior administrators spend
excessive time learning scientific formulations and jargon and recognizes
that technical implementation of DQOs is the scientists' task.  However,
the provision of data cannot function in a vacuum.  This issue of
translation from the language of regulation and policy into scientific
language is not just an EPA problem, it is a major issue of our time.  The
EPA is notable here because of its especially intensive need for
quantitative scientific input into its decision making; for this Agency,
making the translation is crucial.  The panel believes that properly drawn
DQOs will provide a useful, realistic vehicle for achieving these goals.
    2. At a meeting held at EPA headquarters on March 18, 1986, the panel
specifically requested the presence of senior management representatives
from policy and regulatory offices to discuss how they present their needs
to those who collect the data they use in order to assure that the data
will meet these users' requirements.  This meeting was disappointing.  No
deputy assistant administrators or their senior representatives were
present.  The panel did become aware that some discussion takes place
among scientists and lower-level administrative personnel regarding
acquisition of data for a variety of purposes including modeling.  There
does seem to be communication among scientists in different laboratories
at a more or less horizontal level, but no dialog involving people at
other levels of responsibility or, most important, between data users  and
data suppliers at a level where programs that supply data are planned.  In
other words, the panel has the impression that data suppliers (measurers)
talk to data suppliers and data users (administrators) to data users, but
that a bridge does not yet exist between these groups and that the status
quo ante has not yet been affected by the introduction of the DQO process.

    If it is to have a chance to fulfill its intended function, the DQO
mechanism  should be implemented whole-heartedly.  The panel realizes that
this is a  difficult task requiring patience and education.  QAMS should
play an active, substantive role in this process, especially in supplying
sympathetic, technical aid to data users unfamiliar with communicating
their needs to technical workers.  The  panel hopes that in the end EPA
will have  a mechanism to assure that enforcement and regulatory tasks of
the Agency are always stated in terms that allow those who supply the
essential  data for these tasks to translate the data needs into
meaningful, usually quantitative terms  that they can implement.
    The Quality Assurance Management Staff has indicated to the panel that
it sees its future role in the DQO process as one of auditing, not of
engaging itself actively in DQO production.  This is consistent with the
concept that the data users should carry the responsibility for
articulating their needs.  QAMS also will deal with, or should take
responsibility for, the training of others in DQO development.  QAMS
indicated  it does not want to take on the responsibility of actually
translating the end purpose of the data into quantitative statements of
adequacy of data quality.  This panel is not comfortable with that
position.  The panel believes it is unrealistic to expect the DQO process
to be embraced by EPA if QAMS limits its role to audits and training,
excluding  the very heart of the process.  The panel believes that QAMS,
at least for the foreseeable future, should take a strong initiative in
all aspects of the DQO process, particularly in the central, substantive
tasks.  If QAMS does not now have people with the skills needed for this
work, EPA  has the responsibility to add them.  This should be explicitly
stated by  the administrator as one of QAMS's responsibilities.  QA
programs elsewhere have been successful when part of their function is the
provision  of advice and assistance.
    A clear distinction should be kept  in mind between DQOs and a QA
project plan.  The former involves the  data user (regulator,
administrator, modeler, or other); the  latter is strictly a part of
developing the details of a project plan that achieves the objectives.  QA
project plans can be formulated at the  project level with overview by
QAMS.  These plans should address details to the degree called for by the
objectives.
    Data quality objectives are more general statements of the degree of
quality of data required for the transformation of data to Agency
directives and actions.  Accordingly, where the occasion presents itself,
DQOs could be applicable to more than one use, thus reducing the long-term
load on DQO development.

-------
         6.  IMPLEMENTATION OF THE DATA QUALITY OBJECTIVE PROGRAM:
             THE PURPOSES OF DATA AND THE ADEQUACY OF THEIR QUALITY
    Within QAMS, there has long been a realization that a data-generating
and data-management system is effective only if it serves the needs of
those who need those services.  This panel urged from its beginnings that
the data users of EPA convey to the data suppliers what kinds of data,
meaning what quality of data, would be adequate for their uses.  QAMS has
proposed a means to achieve this goal in the form of specified data
quality objectives (DQOs), to be provided by the data users to the data
suppliers.  Accomplishing this will break new ground in the way in which
data are incorporated into decision making.  It may be very important, and
it is not a simple or trivial job, to create a DQO program that does what
it should.  The panel has welcomed and encouraged this QAMS initiative.
    However, to date this panel has not been able to find any office of
EPA that has established adequate DQOs or any other means by which data
suppliers learn from the users what the users' needs are.  A memorandum of
November 14, 1986, from the deputy administrator to the assistant
administrators urges this implementation as had been directed in a May
1984 memorandum from the same office.  However, this implementation still
seems to be happening more slowly than this panel considers reasonable.
    The panel has discussed this problem with many data users and data
suppliers at several levels within EPA.  As a result of these discussions
and of its own deliberations and experience, the panel now offers some
recommendations to help achieve this goal of institutionalizing a
mechanism to educate data suppliers about the needs of the data users.
    This goal seems so obviously desirable to everyone concerned that it
is difficult at first to understand why there is any hesitation at all in
trying enthusiastically to achieve it.  The reasons for the hesitation
have to be examined to see why there has been no rush to set up a DQO
system and, more important, to understand what can be done to make the
users and suppliers want to use it.  If it is not established on this
basis, a DQO system will be treated as a bureaucratic harassment by all
the participants on both sides and will not be used to achieve its
purpose; it will simply add to the paperwork burden and reduce the
efficiency of the agency.  The people for whom the program is created must
want it.
    QAMS has described three stages through which it believes the data
users and suppliers must go, in order that the DQO system can come into
effective operation:

    Stage 1.  The data user formulates what decision will be made, what
information is needed, how the information will be used in making the
decision, what the consequences would be of having inadequate or incorrect
data, and what resources can be made available to provide those data.
    Stage 2.  The program staff of the data user determines which elements
of the decision depend on environmental data; specifies the data needed;
determines the domain, in time, space, and technical characteristics, from
which the data should be taken; defines the technical form in which the
data should be presented and used; specifies the "desired performance" of
the data in terms of what kinds of errors or uncertainties are tolerable
and what are not; and then determines any needs for new data.
    Stage 3.  The data supplier implements a program to meet the needs
defined by Stage 2, in the form of a prescribed data-gathering program.
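
    To make these stages concrete, the sketch below records, in schematic
form, what the products of Stages 1 and 2 might look like for a purely
hypothetical ground-water compliance decision.  Every field name and value
in the sketch is an illustrative assumption introduced here, not an EPA
requirement or actual DQO content.

# Illustrative sketch only: a hypothetical record of the products of DQO
# Stages 1 and 2 for an imagined ground-water compliance decision.  All
# field names and values are assumptions made for illustration.

stage_1 = {
    "decision": "Does ground water downgradient of a hypothetical disposal "
                "site exceed the applicable standard?",
    "information_needed": "Contaminant concentrations in monitoring wells",
    "use_in_decision": "An exceedance triggers an enforcement action",
    "consequences_of_error": {
        "false_exceedance": "unwarranted enforcement and remediation costs",
        "missed_exceedance": "continued exposure of downgradient water users",
    },
    "available_resources": "illustrative limits on budget and sampling staff",
}

stage_2 = {
    "data_needed": "concentrations of the regulated parameters",
    "domain": {"space": "upgradient and downgradient monitoring wells",
               "time": "quarterly samples over one year"},
    "reporting_form": "concentration per parameter, with detection limits",
    "desired_performance": {
        "tolerable_false_exceedance_rate": 0.05,   # assumed for illustration
        "tolerable_missed_exceedance_rate": 0.10,  # assumed for illustration
    },
    "new_data_required": True,
}

# Stage 3 would be the data supplier's prescribed data-gathering program,
# designed so that the requirements recorded above are met.
print(stage_1["decision"])
print(stage_2["desired_performance"])

    Even a schematic record of this kind makes the Stage 1 and Stage 2
responsibilities visible and reviewable by both data users and suppliers.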

    The memorandum of November 14, 1986, asks each assistant administrator
in the program offices to work with the Assistant Administrator for
Research and Development (AARD) and the Assistant Administrator for
Policy, Planning and Evaluation (AAPPE) to assure that each program office
understands what the DQO process  is and what its utility is, to reach
agreement with each program and regional office on developing DQOs for
their proposed major data collection activities, and to set up reasonable
schedules for preparing DQOs for  each major data collection activity.  The
memorandum also asks that responsibilities for working out the DQOs be
assigned specifically and that the AARD and AAPPE review progress on DQOs
and report quarterly to the deputy administrator on that progress.
    To this panel, it seems that  the most essential and most difficult
(but not the most expensive or lengthy) parts of the activity are in Stage
1.  Furthermore, the memoranda that the panel has seen directed at the
individuals responsible for carrying out Stage 1--the decision makers,
among the data users--are not convincing.  The documents did not explain
what these people should be doing or why they should be motivated to do
what seems like a very burdensome task for purposes that seem unclear from
the viewpoint of carrying out their primary jobs.
    If a DQO program, or anything else meant to achieve a similar end, is
to succeed, it must have the support of the people it is most meant to
help.  It must have the support of the data users.  The data users must
understand that the data suppliers are trying to serve them and can serve
them most effectively if the suppliers know what they should be
supplying.  Heretofore, the provision of information about what data users
need has been meager and haphazard.  The panel has encountered more than a
little cynicism from people in enforcement and regulatory roles about the
role of data and the importance of adequate quality; some such people have
expressed openly their feelings that regulatory and especially enforcement
decisions are made so commonly on grounds irrelevant to the validity of
data that it is wasteful to be concerned at all about data quality, so
long as reasonable everyday professional standards are maintained in
situations where such standards exist.  In other words, some of the users
would, it seems, be satisfied with a simple quality control program.
    One important reason why data users can hold such views  is the lack of
accountability, which has been discussed elsewhere in this report.  The
accountability meant here is accountability for the claims of adequate
quality of data.  There are surely situations in which the supplier simply
cannot provide data of quality as high as the user wishes;  in such cases,
it is exceedingly important that  the user know from the supplier what  the
data quality really is.  Unless there is accountability at this level,
nobody needs to be motivated to go beyond simple quality control.  But, of
course, the administrator is eventually accountable for the  performance of
the agency, so it seems reasonable to the panel that the administrator
would wish to assign accountability within the agency, and the rest of the
logic follows, leading to the creation of the DQO system or something like
it.
    To make the DQO system work, it is absolutely necessary for the data
suppliers and especially QAMS to work with data users, on a
person-to-person basis, to carry out the heavy responsibilities that have
been put under QAMS's Stage 1.  The tasks may at first reading seem well
 defined in the "Draft Information Guide on Data Quality Objectives"* of
 November 4, 1986,  but it seems to the panel that carrying out these  tasks
 is not at all easy and that it would  be rather difficult for someone
 working in enforcement or in drafting regulations to decide even what is
 meant by the statements of the various tasks when they  are to be applied
 to specific kinds of decisions.  The  development of DQOs must be carried
 out in a cooperative, participatory way with individuals from QAMS
 learning the viewpoints and problems  of the data users  at the same time
 that the data users learn from QAMS and the data suppliers what they need
 to know in order to give users what they most want.  Putting a formal plan
 into words, however well ordered, is  not going to make  a plan into a real
 activity, especially when it is not understood or appreciated by the
 people who are being told to do the work.  They have to have the support
 of regular, face-to-face interaction  with people who understand the
 directions while they work out the implementation together.  Bluntly put,
 the formal steps of Stage 1 simply do not necessarily apply to the tasks
 and responsibilities of many of the data users.  Furthermore, QAMS has to
 approach the data users in a manner that lets the QAMS  people first  listen
 to the users rather than tell the users what they must  do.  Then the DQO
 plan can be developed in collaboration with the users so that the users
 consider the DQO program as something really meant to help them rather
 than to harass them.
     The development of DQO procedures in each program office should  be
 done through direct,  person-to-person collaboration among the data user
 and his QAO, someone from QAMS, and perhaps, one or two data suppliers.
     It may be helpful to start with the "Draft Information Guide," but the
 people developing the means to achieve DQOs should be allowed to work with
 the flexibility to learn by doing.
     A second important point is the matter of the level of generality at
 which the DQOs should be written.  The lack of clarity about this point
 appears in the November 14 memorandum from the deputy administrator, which
 contains the expression "develop a reasonable schedule for preparing DQOs
 for each major data collection activity" (underline ours).  What is meant
 by the underlined expressions is left undefined.  This panel believes
 strongly that the principle of DQOs is sound but that there are two
 extremes to be avoided, either of which could undermine the approach.  One
is to make DQOs so general that they become hollow.  The other, which the
panel believes is the greater threat at this time, is the danger of
trying to do too many, and to make them too case-specific.  The panel
urges that broad areas of data use be defined by the data users, areas
such as the regulation of nonradioactive dumps for toxic waste or the
enforcement of industrial point-source emission regulations.  The EPA
should not need a DQO for each kind of hazardous waste or for each indexed
air pollutant, although air pollutants or hazardous wastes may be so
various that more than one category is required and therefore more than
one DQO is required to describe the corresponding data needs.  The DQOs
should be written so that the data suppliers, with the help of QAMS or of
QA officers for the appropriate program, can infer how to carry out, for
each particular case, data-gathering efforts that keep the data quality at
the appropriate level.  Because the establishment of an effective DQO
system is new, it will require a period of learning by doing.  To go
through this, some trial-and-error will probably be needed.  Some first
approximations to Stage 1 can be worked out at a general level; then QAMS
and data suppliers can develop specific interpretations for test cases,
and the result of these can be brought back to the data users to see
whether the procedure achieved its goals from their viewpoints.

*Limited availability, unpublished.  One copy was circulated among panel
 members.
    Whenever possible, each DQO should be written for a whole class of
problems, not for a single case or a single pollutant.  EPA should, by
iterative, close interactions among data users, data suppliers, QAOs, and
QAMS, develop the skills for writing DQOs and interpreting them for
specific situations.

-------
         7.  PRIORITIES IN ALLOCATING EFFORTS TOWARD DATA GATHERING
    It is important for EPA to allocate effort and funding for data in a
way that assures that all the requisite data, insofar as possible, are of
the quality needed for their intended use.  On several occasions, EPA
staff members themselves have expressed their frustrations to the panel
about the difficulty of making such allocations in proportion  to what is
required to achieve their quantitative relevance to the issue  at hand (see
Figure 1).  There is a tendency to spend less, rather than more, on the
more difficult aspects, which offer smaller apparent returns for a given
effort.
    For example, measuring lead content quantitatively in soils  is done
with ease in comparison with developing models for human exposure, which
in turn is frequently easier than assessing health effects, which in turn
is easier than evaluating effects on the quality of life.  Yet,  EPA spends
larger amounts of manpower on assessment of chemical contamination than on
assessments of exposure or of biological or health effects (of course, the
Centers for Disease Control have some responsibility in health effects
assessment as well).  The concern here is that the precision of  chemical
measurements may far exceed the precision that the total assessment can
achieve:  a waste in measurement and thus budget.  Again, a DQO  would
control the scope set for the data-gathering effort.  In the above
example, the higher precision may still be required to measure changes in
contamination, but a need for this measurement must then have  been
expressed as part of the original plan.  Panel members in their  visits to
regions in the earlier phase of this study noted that the data gatherers
often were unaware of the purpose for which the data would be  used.
    Other examples in which there is inverse proportionality of  needs to
spending include carbon dioxide assessment, for which measurement is easy
in relation to model development.
    EPA is pressed on one side to economize on data collection and
evaluation (as it is pressed to economize on all things) and on  the other
side to collect and provide environmental data for a broad clientele,
present and potential.  The panel supports the collection of data with
long-term use but no immediate application, particularly in situations
where the data are likely to be important either as baselines  or as
indicators of long-term trends.  However, EPA is not a service Agency for
the entire scientific community and cannot be expected to collect and
maintain data sets on every aspect of the environment.  Users  of long-term
data who wish EPA to provide those data should provide DQOs to those
offices of EPA that could supply the data they need.

-------
                     8.  ESTABLISHING COMMON PRACTICES
    It is important that Agency-wide common practices be established and
implemented as much as possible for numerical data-intensive activities.
These common practices would include guidelines, codes, or standards for
data nomenclature, units, measurements, and handling (including
statistical treatments) and for data validation/computerization/
management.  Equivalent procedures for which alternatives are in common
but competitive use should be issued.  A good start was made in this
direction with a "Chapter 5" document shown to the panel at its meeting at
the EMSL in Las Vegas.  The panel urges the EPA to continue this effort.
Agency-wide setting of guidelines, standards, and codes should be an
ongoing activity with continual feedback from scientists in the field both
within the EPA and outside.  An Agency-wide document stating a standard
code or practice would be very helpful and could replace much of the
repetitive documentation now required to accompany the QA plans and
programs that laboratories and contractors must present.  The common
practices must also be subject to continuous revision to satisfy the needs
of the DQO.  With time, new techniques and concepts become available whose
possible adoption requires timely response, flexibility, and judgment.*
(When the subject at hand does not involve Agency-wide interest, common
practices should be prepared and agreed upon by those parts of EPA to
which they pertain; however, statistical practices and definitions are by
their nature Agency-wide.)
    At its meetings, the panel has discussed practices regarding
statistical handling of data.  The Agency needs a uniform, deliberately
adopted set of conventions or definitions concerning such terms as "bias,"
"precision," and "minimum detection limit (MDL)".  Treatment of outliers
needs to be put into a uniform code of practices as well, although this is
not meant to imply that all outliers should be handled in the same way.  A
uniform code would spell out how to decide how to handle outliers.  The
approach of adopting a common practice of how to decide can be applied in
other situations appropriate for a guide to uniform practices.  One
example might involve dealing with results obtained by different
monitoring methods and showing systematic differences.  A uniform code of
practices would tell how to choose among available approaches.  The EPA
should seek outside expert assistance and evaluation in the preparation of
a guide to uniform practices.

*A case in point came up regarding measurement in Region III:  the
 determination of total particulate nitrogen, phosphorus, and carbon
 contents of Chesapeake Bay waters.  Such measurements were necessary for
 mass balance and modeling studies of the contractors.  Yet the techniques
 that the EPA administration demanded were outmoded and indirect and
 lacked the necessary sensitivity for the needs of the investigation.  For
 example, EPA methods involved measurement of total Kjeldahl nitrogen and
 Kjeldahl nitrogen in the filtered solution.  The particulate nitrogen was
 determined by measuring the difference.  Direct measurement of
 particulate nitrogen in a gas analyzer gives more accurate results with
 greater sensitivity.  The contractors were prohibited from using this
 latter method.  Informed judgment and rapid response to the needs of the
 contractors would allow a much more rational assessment of the problems
 at hand.

[Figure 1 (graphic not reproduced).  The middle column of the figure lists
data-gathering tasks from most difficult and most uncertain to least:
health effects assessment, exposure assessment, other data interpretation
and modeling, sampling strategy, sample handling, chain of custody,
reporting, controlled lab studies, and chemical analysis; EPA emphasis and
expenditures increase toward the bottom of the list.  Caption:  There is a
tendency to emphasize those measurements for which increased precision and
accuracy can be scientifically achieved, while expending relatively less
effort on more difficult measurements (e.g., biological) and model
development that are required for the assessment of total exposure and
health effects.  The word "relatively" is used here in the context of
proportionality to the size of the error associated with the task or
process given in the middle column of this figure.]
    Agency-wide use of such terms and practices is important to EPA
(1) in reducing the high costs of duplication of effort in preparing the
definitions and the supporting documents, particularly in the contexts of
QA program and project plans, and (2) in reduced ambiguity and misuse of
the data and, hence, increased data utility.  Preparation of common
practices is relevant not only to the data collection process but also to
program plans, project plans, and data quality objectives; thus the
advantages are wide-ranging.
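
    As a purely illustrative sketch of what uniform working conventions
might look like in practice, the fragment below applies one common set of
textbook conventions for bias, precision, and minimum detection limit to
hypothetical replicate measurements.  These conventions are offered only
for illustration; they are not the Agency-wide definitions recommended
here, which EPA would still have to adopt after expert review.

# Illustrative sketch only.  One common set of working conventions for
# "bias," "precision," and "minimum detection limit" applied to
# hypothetical replicate measurements; these are not EPA's adopted
# Agency-wide definitions.

import statistics

def bias(replicates, true_value):
    """Bias: mean of replicate measurements of a reference sample minus
    the accepted true value of that sample."""
    return statistics.mean(replicates) - true_value

def precision(replicates):
    """Precision expressed as the sample standard deviation of replicates."""
    return statistics.stdev(replicates)

def detection_limit(low_level_replicates, t_99=3.143):
    """Minimum detection limit estimated as a one-sided 99% Student's t
    value times the standard deviation of replicate low-level measurements
    (the default t value assumes seven replicates)."""
    return t_99 * statistics.stdev(low_level_replicates)

# Hypothetical replicates: a reference sample with true value 10.0 and a
# low-level spiked sample (units arbitrary).
reference_runs = [10.4, 9.8, 10.1, 10.6, 9.9, 10.2, 10.3]
low_level_runs = [0.21, 0.26, 0.18, 0.24, 0.22, 0.27, 0.20]

print("bias      =", round(bias(reference_runs, 10.0), 3))
print("precision =", round(precision(reference_runs), 3))
print("MDL       =", round(detection_limit(low_level_runs), 3))

    The value of fixing such conventions Agency-wide lies less in the
particular formulas than in every laboratory, contractor, and data user
computing and reading these quantities in the same way.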
    Project plans and program plans seemed to the panel to have been
carried much too far.  Instead of drawing on existing literature and
definitions, and developing only the specifics necessary for a particular
project or program, these documents are enormous and consist of large
amounts of "boiler plate," repeating standard material that should be part
of a uniform code of practice for the EPA and its contractors.  Because
these documents were, in many instances, so large and unwieldy, the panel
had some doubt about whether anyone could use them effectively.
    EPA should consider the adoption, whenever possible, of definitions
and practices prepared by well-recognized organizations outside the
Agency.  Major efforts toward standardization have been made at the
national and international levels (e.g., by the American Society for
Testing and Materials (ASTM) and the International Union of Pure and
Applied Chemistry (IUPAC)).  While these standardization programs have
attracted a number of notable statisticians, agreement on definitions  is
yet to become a reality after many years of discussion.  However, it now
appears that the IUPAC is nearing agreement on definitions for bias,
precision, and minimum detection limit.*  At least one other international
organization, CODATA,** looks to these  definitions as perhaps not perfect
but nevertheless useful in the absence  of universal agreement.  Data
measurers and reporters cannot proceed  without a working definition;
utilization of an agreed-upon set such as the IUPAC's proposed set should
be implemented for those instances for which these definitions are
technically appropriate.  Application of common practices on an
international scale not only reduces duplication of work within the EPA
but also enhances data exchange and utilization on an international
scale.  This point should not be forgotten, particularly for data that
deal with shared resources (e.g., contaminants in air and shared waters).

*"Recommendations for Nomenclature in Evaluation of Analytical
  Methods", IUPAC Commission V.3, Draft Report, 1988 (in review).

**The Committee on Data for Science and Technology of the International
  Council of Scientific Unions (ICSU).  The IUPAC is a union of ICSU as
  well.
    It is recommended that EPA develop a uniform set of definitions of
"bias," "precision," and "minimum detection limit" and other terms and
practices not only with personnel within the EPA but also with
consultation and review by outside expert statisticians.  EPA should study
and adopt, where appropriate, work done by existing organizations that
have convened world experts on the subject.  These organizations include
the IUPAC, the International Standards Organization (ISO), and the ASTM.
Adopting inappropriate or misleading definitions could do a large and
unnecessary disservice to the EPA's data gathering efforts and QAMS's
image within the Agency.

-------
         9.   ARCHIVING AND DISSEMINATING QUALITY-CHARACTERIZED  DATA
    Much of the environmental data now gathered by EPA (and other
governmental agencies) are stored in computer banks.  Most of these
numbers are stored without indicators of quality.  Similar to Gresham's
Law of Economics, there is a "First Law of Environmental Science"
enunciated by S.O. Makarov, the noted Russian oceanographer, about a
century ago:  "One careless observation will spoil 100 good ones" (S.O.
Makarov and the significance of his research in oceanography.  Bull. Inst.
Oceanogr. Monaco, Special Volume No. 2, pp. 615-625 (1968)).  Many of the
data in storage (e.g., STORET) are poor.  The panel was particularly
disappointed during its visit to Cincinnati to learn that in the formation
of BIOSTORET, an  information bank of biological data, no quality control
parameters would  be associated with the entries.
    The storage of environmental data, especially of data that can be used
for decision making, should be done with appropriate screening and with
enough information for the user to assess the reliability of the numbers.
Clearly, measures of precision and of accuracy are essential.  But more
than this, files  of analytical data on the substances present in water
should tell the user what techniques were used, what primary standards
were employed to  calibrate the method, with what intercalibration
exercises the laboratory was involved, and the dates and times of
analysis.
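    A minimal, purely illustrative sketch of such a record follows; the
field names and values are assumptions introduced here for illustration
and do not represent actual STORET, SAROAD, or BIOSTORET record formats.

# Illustrative sketch only: a hypothetical stored analytical result that
# carries the quality descriptors discussed above.  Field names and values
# are assumptions for illustration, not actual EPA data base formats.

stored_result = {
    "parameter": "lead, total",
    "value": 12.0,
    "units": "ug/L",
    "sample_datetime": "1987-06-03 10:15",
    "analysis_datetime": "1987-06-10 14:30",
    "technique": "graphite furnace atomic absorption",      # method used
    "primary_standard": "traceable lead calibration standard",
    "intercalibration": "interlaboratory round-robin, spring 1987",
    "precision_stdev": 0.8,     # standard deviation of replicates, ug/L
    "estimated_bias": -0.3,     # from reference-sample recovery, ug/L
    "detection_limit": 1.0,     # method detection limit, ug/L
}

# With this information attached, a user can judge whether the number is
# adequate for a particular decision rather than taking it on faith.
print(stored_result["parameter"], stored_result["value"],
      stored_result["units"], "| MDL:", stored_result["detection_limit"])
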
    The flagging  of data with QC parameters will require a variety of
resources.  The storage needs will be greater.  There is a cost to the
procurement of information.  Clearly there should be a complementary cost
to the assessment of its worth.  Further, an active and continuing group
of scientists should be involved in the data validation process.  The
techniques most useful for data validation include screening for
consistency and reasonableness as well as assessing the statistical and
methodological parameters cited above.  Intercomparisons with related data
sets can often be used to identify or flag apparent discrepancies.
Sampling or measurement problems or even errors of transcription can be
discovered from such an assessment.  Statistical methods such as
regression analyses or even simple data plots are useful in many cases to
verify good data or cast suspicion on poor data.  Confidence-level
criteria for rejection should be adopted in advance as a matter of policy
and should be strictly followed.  Only if such measures are adopted can
our data banks provide a useful and valid service to regulatory
activities.
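    The sketch below illustrates, under stated assumptions, what one such
pre-adopted screening rule might look like; the particular criterion and
the data are hypothetical, and values flagged by such a screen would go to
scientific review rather than automatic rejection.

# Illustrative sketch only: a screening rule adopted in advance for
# flagging suspect values before they enter a data base.  The criterion
# (flag values more than 2 sample standard deviations from the mean of
# the series) and the data are hypothetical; flagged values are referred
# to scientific review, not rejected automatically.

import statistics

def flag_suspect(values, k=2.0):
    """Return the indices of values lying more than k sample standard
    deviations from the mean of the series."""
    mean = statistics.mean(values)
    spread = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mean) > k * spread]

# Hypothetical nitrate series (mg/L) with one likely transcription error.
nitrate_series = [2.1, 2.3, 2.0, 2.4, 2.2, 21.9, 2.3, 2.1]
print("indices flagged for review:", flag_suspect(nitrate_series))

    As with the confidence levels discussed above, the criterion itself
would be fixed as a matter of policy before the data are examined.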
    Data users outside the EPA have been actively working with some EPA
staff to develop  better methods for documentation of data quality in data
bases.   Particularly, two workshops were convened at the initiation of the
Chemical Manufacturers Association (the Workshop on Data Quality
Indicators, February 10-12, 1982, Gaithersburg, Maryland; the Workshop on
Data Documentation Quality Indicators, May 8-9,  1984, Bethesda,
Maryland).  EPA participated in the organization and programs of these
workshops.  However, there seems to be no active EPA effort at this time
to follow up the recommendations of the workshops.  This matter should be
of primary interest to those responsible for data bases such as STORET,
BIOSTORET, and SAROAD.
    As a public Agency, EPA has the responsibility to make all nonsensi-
tive data it has gathered available to the public in a responsible
fashion.  This includes notation of the assessed reliability of the data.
Two workshops have, in fact, identified a practical approach in terms of
procedures for documenting quality indicators.  The EPA should develop the
notations for quality indicators further and implement such indicators so
that it can join the data base technology of the 1980s by providing useful
machine access to its data bases.

-------
             10.  ILLUSTRATIVE EXAMPLE:  THE RCRA "GROUND WATER
                  MONITORING TECHNICAL ENFORCEMENT GUIDANCE"
                DOCUMENT--HOW THE SYSTEM COULD BE MADE TO WORK
    The panel read a number of documents intended to present data quality
objectives.  A few were so far from the mark that they clearly were
written by people who did not understand what was intended.   None that the
panel saw has achieved what the panel believes DQO statements should do.
All were much too long and too vague.  The best of them do begin to
reflect an appreciation of the issues and the intent.  However, the most
consistent impression the documents gave the panel is that the EPA staff
charged with providing these do not yet understand what the documents
should do, so that those people simply provide a pack of paper formidable
enough to satisfy a bureaucratic edict.  Instead, they should be providing
terse, clear statements of what they, as data users, want.  If the
language in such a DQO statement is not clear enough for the data
supplier's guidance, the data supplier should be able to come back to the
user, assisted by QAMS, and, through direct discussion, work out what the
supplier should be doing to give the user what he or she needs.  If QAMS
cannot carry out this function, it is failing in its characteristic 2
(listed in Chapter A on  Page 12) of technical competence and the capacity
to assist, in this case, in one of the aspects of quality assurance that
the panel considers central.
    To describe the way a DQO document might be written, the panel
presents here a discussion of one particular document.*  It was  initially
given to the panel as a developing inspection manual, but compared with
other documents identified as examples of DQO implementation, this
document, although itself not a DQO document, serves at least as well,
insofar as it has at least as much relevance to an  interactive DQO process
as those shown to the panel as DQO documents and is a moderately good
report.  The intent of this discussion is to show by example some of the
aspects of DQOs that have not yet been understood and to  indicate how EPA
can make the DQO into a more useful tool to help achieve  its mission.
    At one point, the document in question states that  its  data  quality
objective is "to develop and implement procedures that  will provide
legally defensible results in a court of law."  EPA staff will presumably
be the primary users of the data if a decision  is made  to litigate.  The
litigated group(s) will probably be the ones that have  contaminated  the
water supply.  The necessary regulatory and  legal requirements are
stated explicitly and well:  "the data are used to evaluate compliance
with the National Interim Primary Drinking Water Regulations Standards and
the applicable National Secondary Drinking Water Standards; for this
purpose, they should have method detection limits that are below 20
percent of the maximum allowable levels on a parameter by parameter
basis."

*The RCRA (Resource Conservation and Recovery Act) Ground Water Monitoring
 Technical Enforcement Guidance Document (TEGD), EPA No. 530/SW-86-055,
 September, 1986.
    Some details of the action plan are left open, although in general it
is well conceived.  For example, the number of soil samples in composites
is not given nor are any guidelines to  the determination of the number
provided.  Here, details necessary for  executing the plan are lacking.
These are matters that should be worked out between users and suppliers of
the data, which may require the technical expertise that QAMS should
provide.  On the other hand, there was adequate flexibility in the initial
survey for levels of radiation at the site.  The simple statement is
made:  "If radiation above 10 millirems per hour is recorded, the advice
of a radiation health physicist will be obtained."  Clearly, contamination
by radioactivity could arise from a number of origins.  Some knowledge of
possibilities at a given site is necessary to make measurements with
appropriate instrumentation.  This again is a matter best dealt with by
direct discussion rather than by formal documents.
    This document was examined by the RCRA Groundwater Monitoring
Technical Document Review Committee.  They found a document that in
general was technically sound but needed modest changes to make more
effective the operation of the groundwater monitoring systems, i.e., QA/QC
practices.  In some cases, the review committee suggested making greater
use of professional judgment.  The committee recommends, for example, in
studying well pollution that "EPA should allow substantially greater
flexibility in the recommended length of well screens.  The maximum
limitation of 10 feet on well screen length may not adequately
accommodate all hydrogeologic situations which may be encountered in the
field and may, in fact, lead to the collection of misleading samples.  If
the screen length is suitable given the hydrogeologic complexity, it
should also be sufficient for water quality sampling."
    On the other hand, the committee identified inadequacies in the
specification of drilling methods:  "Emphasis should be placed upon the
selection of drilling methods which . . . present the least potential for
introducing contamination . . . fluorocarbon resins (such as Teflon . .  .)
or stainless steel 304 or 316 should be specified for use in the saturated
zone when potentially sorbing organics  are to be determined.  In such
cases, and where potential for corrosion exists or is anticipated,
fluorocarbon resins are preferable to stainless steel.  PVC has utility
when shown not to leach or absorb contaminants significantly."  The panel
supports these recommendations and again encourages DQO statements that
are flexible and apply broadly.
    The examples from this report and its review illustrate how a careful
study of the tactics used in monitoring programs can reveal both a lack  of
detail and too much rigidity in the step-by-step methods.  To ensure  that
QC and QA objectives are satisfied, it  is crucial that QA/QC methodologies
in EPA documents are periodically and critically reviewed by disinterested
scientists, either internally or externally situated.  Such reviews can  be
cost-effective in eliminating unnecessary activities with respect to
development of DQOs and QA project plans.  The panel has in mind here both
redundant and excessively project-specific documents.  Reviews can
pinpoint technically unsound practices that impinge upon quality control.
They can identify areas in which the lack of flexibility in strategies
available to the user can lead to misleading or inappropriate results.
They can uncover practices that may not lead to accurate results and point
out more effective methodologies.  This may have to be an iterative
process of review:  the reviewers make recommendations for change; the
authors of the document respond; the reviewers then review the responses.
Such apparently was the case with the Environmental Engineering Committee
of the Science Advisory Board, which reviewed the RCRA Groundwater
Monitoring Document, and the result seems to have been salutary.

-------
                                 APPENDIX A

                               MEETING AGENDA

May 26, 1983          First organizational meeting, East Coast, at
                      NAS/NRC, Wash., D.C.

June 27, 1983         First organizational meeting, West Coast, at San
                      Francisco, Calif.

Aug. 8-12, 1983       Three panel representatives attend EPA Conference on
                      Quality Assurance for Environmental Measurements,
                      Boulder, Colo.

Aug. 12-13, 1983      Full panel meeting at NCAR, Boulder, Colo.

Days in Nov., 1983    Selected panel members visit regional offices
                      nearest their residence.  Regions II, V, VIII, and
                      IX were visited

Dec. 7-8, 1983        Full panel meeting at EPA headquarters, Wash., D.C.

March 8-9, 1984       Panel chairman and two members attend EPA/QA
                      workshop organized by QAMS

June 6-7, 1984        Full panel meeting at EMSL/Las Vegas, Nev.

July 12, 1984         Panel chairman and member meet with QAMS and EPA
                      assistant administrator at EPA headquarters, Wash.,
                      D.C.

Feb. 27, 1985         Panel chairman meets with QAMS and EPA assistant
                      administrator at EPA headquarters, Wash., D.C.

Nov. 25-26, 1985      Full panel meeting at EMSL/Cincinnati, Ohio

March 18, 1986        Full panel meeting at EPA headquarters, Wash., D.C.

Nov. 5, 1986          Panel members attend QAMS annual officers meeting at
                      EPA headquarters, Wash., D.C.

March 27, 1987        Selected panel members meet with QAMS at EPA
                      headquarters, Wash., D.C.

Summer 1988           Panel chairman to meet with QAMS or EPA assistant
                      administrator for transmittal of final report and
                      debriefing.

-------
                        APPENDIX B
                      INTERIM REPORT
                   ON QUALITY ASSURANCE
                          to the
              ENVIRONMENTAL PROTECTION AGENCY
          Panel on Quality of Environmental Data
               Numerical Data Advisory Board
Commission on Physical Sciences, Mathematics,  and Resources
                 National Research Council
                   NATIONAL ACADEMY PRESS
                   Washington,  D.C.  1985

-------

NOTICE:  The project that is the subject of this report was approved by
the Governing Board of the National Research Council,  whose members are
drawn from the councils of the National Academy of Sciences, the
National Academy of Engineering, and the Institute of  Medicine.  The
members of the committee responsible for the report were chosen for
their special competences and with regard for appropriate balance.
    This report has been reviewed by a group other than the authors
according to procedures approved by a Report Review Committee
consisting of members of the National Academy of Sciences, the National
Academy of Engineering, and the Institute of Medicine.
    The National Research Council was established by the National
Academy of Sciences in 1916 to associate the broad community of science
and technology with the Academy's purposes of furthering knowledge and
of advising the federal government.  The Council operates in accordance
with general policies determined by the Academy under the authority of
its congressional charter of 1863, which establishes the Academy as a
private, nonprofit, self-governing membership corporation.  The Council
has become the principal operating agency of both the National Academy
of Sciences and the National Academy of Engineering in the conduct of
their services to the government, the public, and the scientific and
engineering communities.  It is administered jointly by both Academies
and the Institute of Medicine.  The National Academy of Engineering and
the Institute of Medicine were established in 1964 and 1970,
respectively, under the charter of the National Academy of Sciences.
Available from

NUMERICAL DATA ADVISORY BOARD
National Research Council
2101 Constitution Avenue, N.W.
Washington, D.C.  20418
Printed in the United States of America

-------
                PANEL ON QUALITY OF ENVIRONMENTAL DATA
R. STEPHEN BERRY, University of Chicago,  Chairman
JACK G. CALVERT, National Center for Atmospheric Research
EDWARD D. GOLDBERG, University of California,  San  Diego
PAUL MEIER, University of Chicago
R. BRADFORD MURPHY, Bell Laboratories
JOHN J. ROBERTS, Argonne National Laboratory
                     NUMERICAL DATA ADVISORY BOARD
R. STEPHEN BERRY, University of Chicago, Chairman
ROBERT A. ALBERTY, Massachusetts Institute of Technology
AARON N. BLOCH, Exxon Research and Engineering Co.
CARL O. BOWIN, Woods Hole Oceanographic Institution
MORRIS DE GROOT, Carnegie-Mellon University
SAMUEL ETRIS, The Silver Institute
WILLIAM W. HAVENS, JR., American Physical Society
GERD ROSENBLATT, Lawrence Berkeley Laboratory
JAMES J. WYNNE, Thomas J. Watson Research Center, IBM
GESINA C. CARTER, Staff Director

-------
      COMMISSION ON PHYSICAL SCIENCES, MATHEMATICS, AND RESOURCES
HERBERT FRIEDMAN, National Research Council, Chairman
THOMAS BARROW, Standard Oil Company
ELKAN R. BLOUT, Harvard Medical School
BERNARD F. BURKE, Massachusetts Institute of Technology
GEORGE F. CARRIER, Harvard University
HERMAN CHERNOFF, Massachusetts Institute of Technology
CHARLES L. DRAKE, Dartmouth College
MILDRED S. DRESSELHAUS, Massachusetts Institute of Technology
JOSEPH L. FISHER, Office of the Governor, Commonwealth of Virginia
JAMES C. FLETCHER, University of Pittsburgh
WILLIAM A. FOWLER, California Institute of Technology
GERHART FRIEDLANDER, Brookhaven National Laboratory
EDWARD A. FRIEMAN, Science Applications, inc.
EDWARD D. GOLDBERG, Scripps Institution of Oceanography
MARY L. GOOD, UOP, Inc.
THOMAS F. MALONE, Saint Joseph College
CHARLES J. MANKIN, Oklahoma Geological Survey
WALTER H. MUNK, University of California, San Diego
GEORGE E. PAKE, Xerox Research Center
ROBERT E. SIEVERS, University of Colorado
HOWARD E. SIMMONS, JR., E.I. du Pont de Nemours & Co., Inc.
ISADORE M. SINGER, Massachusetts Institute of Technology
JOHN D. SPENGLER, Harvard School of Public Health
HATTEN S. YODER, JR., Carnegie Institution of Washington
RAPHAEL G. KASPER, Executive Director
LAWRENCE E. McCRAY, Associate Executive Director

-------
                               CONTENTS
  I.  INTRODUCTION

 II.  STRUCTURAL PROBLEMS:  THE ROLES OF QUALITY CONTROL AND QUALITY
      ASSURANCE IN EPA

III.  SPECIFICATIONS OF OBJECTIVES:  THE PURPOSE OF DATA

 IV.  BUDGET AND STAFF

-------
                            I  INTRODUCTION
    In April 1983 the Environmental Protection Agency (EPA) contracted
with the National Academy of Sciences (NAS) to review the Agency's
control of the quality of its scientific data, and particularly its
Quality Assurance (QA) program.  The contract implemented an initiative
of 1980 by the EPA requesting assistance from the National Academy of
Sciences (NAS).  The NAS delegated the responsibility to its operating
arm, the National Research Council (NRC), which, in turn, called upon
its own Numerical Data Advisory Board (NDAB)  to carry out the task.  To
conduct the requested assistance and review,  the NDAB established a
panel of six scientists—with recognized expertise in one or more
relevant fields—to work with EPA, under the guidance of the NDAB,   for
three years, to (1)  engage in discussions and briefings to help the  EPA
establish a sound QA program,  (2) review methods, methodology, and
selected key EPA QA documents, and (3) present its findings and
recommendations.  The final report is to be distributed as an advisory
document that EPA could use to strengthen its programs, particularly
those involving numerical scientific data.
    The panel's study is now essentially half done.  The document in
hand is an interim report to provide EPA with the findings that the
panel has arrived at thus far and the recommendations it can firmly
make on the basis of those findings.  The interim report sets out
matters that the panel believes can be addressed now by EPA,
independently of other matters intended for the final report.
    Three topics are discussed here:

    1.  structural problems:  what are the roles of quality control
(QC) and quality assurance (QA) in EPA, and what do these roles imply?
Why have a quality assurance program, and who is it for?
    2.  specification of objectives:  how can "adequate" quality of
data be determined and what should a QA program and QC accomplish?
    3.  management of budgeting, staffing, and documentation for EPA's
quality assurance program:  how can the aims of the two previous
questions best be achieved?

These topics are not the only  issues that have concerned the panel,  but
they are the ones regarding which the panel feels sufficiently informed
to make statements at this time. These and other matters will be
covered in the final report.


    Before turning to the body of the report, the panel would like to
make explicit two fundamental assumptions that have guided it.
Mentioning the first seems obtuse, but it is no less important for
being obvious:

    1.  Scientific data gathered by, for, or under the auspices of  EPA
are used for purposes requiring specific levels of known  validity of
those data.
    2.  A clear line of accountability is in place so that  the lines of
responsibility for scientific data quality are known to all
participants in the relevant EPA activities.

    Without clear lines of accountability, a quality assurance program
for scientific data is ineffective.  If scientific questions were  the
only ones EPA faced, it might be possible for  EPA to leave  such
accountability diffuse or undefined and rely on professional standards
of data quality control.  However, EPA's responsibilities are  to the
public to protect its health and the quality of the environment.
Consequently, the panel considers such an informal course inconsistent
with implications of EPA Order 5360.1 labeled  "Policy and Program
Requirements to Implement the Quality Assurance Program."  The panel
agrees with this Order, that the matter of accountability must be
examined by the EPA administrator, in order for the Agency  to  fulfill
its responsibilities.  So long as there is accountability for  failure
to maintain specified levels of data quality,  the service performed by
a QA program in EPA becomes essential.

-------
        II.  STRUCTURAL PROBLEMS:  THE ROLES OF QUALITY CONTROL
                     AND QUALITY ASSURANCE  IN EPA
    Quality control (QC) and quality assurance (QA)  are instruments
used by institutions to maintain the quality of the goods or services
those institutions provide at the level desired by the institutions'
managers.  QC encompasses all the means by which line workers achieve
and maintain a desired level of quality in the products.  QA consists
of the means used by the institutions' managers to guarantee to their
own satisfaction that the quality of the products is indeed being
maintained at the desired level and at the level claimed by the line
workers.  These two functions are widely recognized and distinguished.
The former is the consequence of the skill and sense of responsibility
of a professional carrying out a task conscientiously.  The latter is a
kind of insurance that management constructs to protect itself and its
institution against failures or weaknesses in QC.  It is predicated on
accountability for such failures and weaknesses.
    To see what these functions mean in the context of EPA, it is
useful to draw parallels with the pre-1984 Bell System, which had a
highly developed QA program.  In some respects the parallels are
limited, but they provide a useful starting point.  Understanding what
QC and QA are requires identifying the suppliers, the goods, and the
customers.  In the Bell System, the suppliers were the manufacturers  of
telephone equipment and those who maintained the operation of the
telephone system.  The intermediate goods were devices and operations;
the ultimate goods, telecommunication services.  The customers were of
two kinds:  intermediate customers, including the Western Electric
plants in which the devices were made and the operating companies that
bought and sold devices; and ultimate customers, the customers
purchasing the telephone services or ultimate goods.
    In the EPA, the suppliers are those who provide environmental data,
and the intermediate goods are the data themselves.  The intermediate
customers are the compliance officials of EPA and others, such as
researchers within EPA and outside and people drafting proposed
regulations and laws for environmental management, whether in EPA, in
Congress, or in the private world.  The data  (intermediate products)
are used by the intermediate customer to produce the ultimate product
of environmental protection and enhancement to the ultimate consumer,
the taxpayer.  Environmental protection is achieved by deciding what
the data mean and taking actions appropriate to that meaning.


    QC in EPA must thus be equated with the means  the suppliers of data
use to check and control the quality of those data,  through control
over all the technical aspects of their work, beginning with decisions
about what, where, when, and how to sample, going through analyses of
samples, to maintenance of the integrity of lines  of custody and data
storage, and, finally, to documentation of the data. A QA program
constitutes the activities of oversight and monitoring that ensure that
the quality of the data does indeed meet the expectations  of its
intermediate customers.  The purpose of the QA program, crassly put, is
to prevent any embarrassing situation in which the data fail to serve
their intended function because their quality turns  out to be  too far
from what was expected.  Stated more generously, QA  for data in EPA  is
the certification, independent of the data supplier, that  those data
are adequate for their intended purposes.
    Within EPA, the panel found some offices where the distinction
between QC and QA is clearly understood.  In other parts of the Agency,
the distinction between the two activities was not so clearly
understood.  Nowhere in the parts of EPA visited by  the panel  did the
panel feel that the roles of the participants were distinguished enough
for the two functions to be carried out in a healthy, complementary
way.  One specific recommendation of the panel is  as follows:  The
roles of quality control and quality assurance should be clearly
understood throughout EPA, and implemented in a manner that keeps the
two clearly separated.
    QC is primarily a technical matter of sound scientific practices.
QA is also partly technical but at least equally institutional.  That
aspect is discussed in more detail here.
    Because QA is a service to management,  specifically a kind of
protective insurance, any QA program must have certain characteristics:

    1.  Independence from those groups whose performance  it  is
scrutinizing.
    2.  Technical competence and ability to evaluate and assist  those
groups.
    3.  Clearly understood authority.
    4.  Vigorous management.

    These characteristics have emerged from many years of  industrial
experience with QA programs.  The panel is persuaded that  the  same
characteristics are required for a successful QA program in  EPA.
    In the Bell System, there was a single QA program, integrated
throughout the system and headquartered in Bell Laboratories.  The QA
personnel reported to the central management of the Bell System; the
perception was that it was the managers of the System who were being
served by the QA staff.
    EPA is quite a different sort of institution.   Many of its
activities and responsibilities go on in regional administrations or
laboratories.  The structure of the Agency and the distribution of
responsibilities within it differ enough from most private firms that
the panel felt that it should not make a simple, explicit  recommendation
regarding the structure of the Agency's QA program.  However, the panel
does make a very strong recommendation,  one that it  feels  is one of
its most important, regarding the structure of EPA's  QA program:  The
administrator must determine who it is that the quality assurance
program should be serving and protecting, and design  the program so
that it carries out that function.
    The panel has drawn up two options for EPA's QA program that it
considers viable, and has described a third situation,  corresponding to
the status quo, that it believes cannot achieve the purposes of a QA
program.
    Option 1:  In each assistant administrator's office associated with
an EPA program there would be a chief QA Officer (QAO)  responsible for
QA in that program throughout the Agency.  The person served by that
QAO and those QA staff reporting to that QAO would be the assistant
administrator.  QA staff members from each program office would be
located in each region and relevant Environmental Monitoring Systems
Laboratory (EMSL) to assure the assistant administrator of the program
of the quality of data.  A headquarters QA staff would  be maintained in
a form similar to the present Quality Assurance Management Staff
(QAMS), with responsibilities for coordination, Agency-wide guidelines,
procedures, and audits.  In this option, the number of  individuals to
be associated with QA in the program offices and central QAMS staff
should probably be roughly half the present number of full-time
equivalents (FTE) identified with QA activities.  The regional
directors and other senior staff would be free to establish their own
QA programs, in manners that they would see as protecting them.  To
assist the implementation of this option, uniform documentation of
recommended QA and QC practices should be widely available to all
branches of EPA and to its contractors.
    This first option is based on the panel's understanding that in
EPA, as it now functions, the principal data users—enforcement staffs,
drafters of standards and regulations, and analyzers  of trends—work
within the specific program offices and get any data  they need  from
suppliers primarily in that same program office.  The option is drawn
on the supposition that the assistant administrator for each program
would be the person accountable for a failure in performance due to
inadequate or misrepresented data quality.  It is the panel's
impression that the regional administrators might also be accountable
for such failures within their own regions and would consequently  find
it very much to their own advantage to establish their own QA
programs.  The panel would strongly recommend that the regional
administrators use funds under their control for this purpose.
    In the case of the assistant administrators for programs,  the  panel
recommends that the establishment of QA activity for the programs be
mandatory and not optional, for the following reason.  While the
assistant administrators can be accountable to the administrator for
program inadequacies, the person ultimately accountable as a public
official is the administrator.  Short of going to the second option
(below), it seems to the panel that the administrator must mandate the
effective operation of QA activities centered at the program level as
insurance of the integrity of the Agency.

    The third Section of this report returns to the matter of specific
forms of management for this option.  It is the option that the panel
believes most effectively meets the aims of a QA program within the
style of governance that places responsibility and authority at the
lowest level where it will be effective.
    Option 2:  This option is predicated on a supposition that much of
EPA's scientific data is used in parts of the Agency well separated
from those parts responsible for gathering and evaluating data.  If
this is the case, the panel recommends centralization of the primary
responsibilities for QA so that the chief QAO reports directly to
someone in the administrator's office.  In the model supposed for this
option, which the panel believes did apply to EPA for some time, there
is nobody below the level of the administrator's office who could be
held accountable by the intermediate and ultimate data users for a
failure due to inadequate or inadequately represented data quality.  It
would be the administrator who would need the assurance that one side
of the house was supplying goods as specified to the users on the other
side.
    According to this option, lines of responsibility would run from
the chief QAO to all the program offices, regional offices, and EMSLs.
Unless a large part of the use of scientific  data occurs  in offices
well separated from data collection, such as  the Office of Enforcement
and Compliance Monitoring, the panel would recommend against this
option, in favor of the first.
    The third situation, which we do not consider a serious option, is
to maintain the status quo in the QA program, with responsibilities
delegated in terms of performance at the lowest  level of management.
The panel considers this unacceptable because it creates
self-contradictory, self-defeating situations that are  likely to  thwart
the purposes of QA.  By defining  (implicitly) the performance of QA
activities at the low levels where these responsibilities  frequently
fall now, EPA has exposed a confusion between QC and QA.  The function
of QC is indeed carried out, and properly so, at the  level of the
sampling, laboratory testing, sample handling, transmission and
storage, and data analysis.  The function of  QA is an  activity  of
service directly to a manager at some well-defined level of
accountability.  In Option 1, this is the level of the  assistant
administrators.  In allowing responsibility for "quality assurance" to
be distributed all the way to the level of laboratory  directors,  EPA
has at present put the responsibility for surveillance  and auditing
into the hands of those who are being surveyed and audited.   By
allowing managers to make their QAOs part-time,  QA personnel  become
part-time objects of review by the QA staff,  i.e., by  themselves.
There is nothing wrong with having some people work  part  time  in
support of QA, but it is altogether incompatible with  the function of a
responsible QA officer to have loyalties divided between  the  manager
being served by the QA function and the bench supervisor  whom that
manager oversees.  The panel urges very strongly that the present
structure be transformed as rapidly as can be done smoothly into one of
the other options, preferably Option 1.

    Whatever option is chosen, it is important that the QA staff are
seen not only as police but also as skilled,  helpful technical counsels
to those whom they oversee.  Policing and helpful counseling may appear
to be incompatible roles but in practice, well-managed and staffed QA
programs succeed in combining these tasks.  Industrial experience has
proven that this dual role can be carried out successfully.  The panel
sees no reason why it should not work in EPA.
    One issue has come to the attention of the panel through its own
interactions with EPA and through previous studies of the Agency.  This
is the difficulty the Agency has had with determining what constitutes
•adequate" data in crisis situations.  To cope with unpredictable
crises, it is inefficient to try to maintain a permanent staff.
Nevertheless, a high level of expertise is usually required to make a
sound judgment in such situations.  A standing board of expert
scientists and engineers functioning as a "volunteer fire department"
might alleviate this longstanding problem.
    On this matter, the panel specifically recommends that within the
structure of the QC and QA programs:  The EPA should establish a means
to call up "crisis panels" of experts to advise on quality control and
quality assurance in emergency or short-term extreme situations.
    The experience of the Agency with such "crises" in recent years
indicates what value such panels would have.  It would be relatively
simple to maintain contact with rotating groups of outside experts
willing to serve in emergency situations, groups who would  function
like volunteer fire departments and who could give the Agency guidance
and credibility when needed.

-------
        III.  SPECIFICATION OF OBJECTIVES:  THE PURPOSE OF DATA
    EPA must translate the scientific and regulatory objectives  of
programs that acquire environmental monitoring data into specific
requirements for quality assurance and quality control.
    Virtually every guideline and statement of QA principles and
procedures issued by EPA officials and by QAMS over the  past five years
mentions the need to define the purpose of each environmental
monitoring effort and therefrom derive requirements for  the quality of
the data.  However, until the issuance by the deputy administrator of
the May 24, 1984, memo on Data Quality Objectives, negligible emphasis
had been given to this critical step in defining a QA program.   None of
the many QA program and project plans (including so-called model plans)
that the panel has reviewed address this subject in a meaningful way;
in fact, most totally ignore it.
    For example, the Las Vegas EMSL is nominally responsible for QA and
QC in support of the Superfund program.  In point of fact, their QA and
QC procedures are developed in a consensus forum by representatives
from the contractor laboratories, regions, and headquarters Superfund
staff without any significant input from the principal end-user,
namely, the enforcement division of the Superfund Office.  To quote one
senior EMSL staff member:  "We are somewhat blind to the end use of the
data."
    For example, the General Accounting Office Report, "Underlying
Problems in Air Quality Monitoring Systems Affect Data Reliability"
(GAO/CED-82-101, September 1982), places great emphasis  on the
importance of EPA's need for "accurate, reliable air quality data," and
points to a lack of resources devoted by the federal EPA and the states
to QA.  In the GAO report the EPA counters that the National Air
Monitoring Station network is "complete" and that current procedures
are adequate for directing federal and state resources to QA.  However,
nowhere in this discourse is any attention paid to (1) the basis upon
which the network itself was established, (2) current and anticipated
future requirements for such data, and (3) the status of QA/QC efforts
in terms of such needs.
    For example, the Great Lakes Atmospheric Deposition (GLAD) network
has been collecting wet and dry samples since the late 1970s, and plans
are under way to expand the network.  An impressive set of QC
procedures is in place.  Yet, there appears to be no documented
scientific or regulatory rationale behind the network layout, sampling
criteria, and specific QA/QC protocols.

    The panel's reviews of headquarters programs, regional offices, and
field laboratories consistently confirm this disregard of the purpose
of environmental monitoring data and lack of critical user input  in
establishing related QA programs and QC protocols.  The QAMS
requirement that a QA plan be developed for each major program and
research project is being met by the field offices and major EPA
contractors.  The panel has been told that several of the latter  have
set up special offices in Research Triangle Park,  North Carolina, to
generate such plans.  Yet those plans reviewed by  the panel uniformly
begin with a statement of data requirements, void  of any supporting
rationale based on the ultimate purpose of the data.
    There has been within EPA among line managers  in headquarters and
the field a widespread lack of support for QAMS and its QA directives.
There has been, correspondingly, a reluctance to allocate adequate
resources for QA and to provide adequate authority for those
responsible for QA.  The panel believes that the root of such attitudes
is the lack of clear coupling between the objectives of the line
regulatory programs and the imposed requirements for QA and QC.   As
long as there is no more than a vague understanding by the line
management of the rationale behind the QA program  and its specific
directives, this situation will persist.  Without  well-understood
objectives for data quality, QAMS lacks the necessary foundation  for
its whole program, so that its goal becomes assuring "data of known
quality" instead of "data of appropriate quality."  The former may have
to suffice as an interim, pragmatic objective but  must not be the
long-term goal of EPA's data activities.
    The panel is very much encouraged by the recent QAMS  initiative to
work with the programmatic assistant administrators and their senior
staff in the development of "data quality objectives" for all
significant environmental measurement activities,  and  it  looks  forward
over the coming year to the opportunity to observe the progress of this
critical effort.  The panel emphasizes the importance of each of  the
following five steps:

    1.  The QAMS guidelines should be revised to place much greater
emphasis on the requirement for a systematic analysis of  the purposes
of the project or program and of the data to be acquired, whether the
data used be of legislative and regulatory or scientific  and
interdisciplinary nature.
    2.  A guideline for carrying out such analyses and for deriving
specific QA and QC activities should be prepared by the QAMS
management.
    3.  Prototypical project and program QA plans should be developed
jointly by the program offices and  QAMS to demonstrate the above.
    4.  A realistic schedule should be established to evaluate all
major ongoing environmental monitoring networks in these  terms.
    5.  The EPA QA program should be structured so that  the assistant
administrators responsible for individual regulatory programs have as
one principal measure of their performance the implementation of
effective QA programs for environmental monitoring data,  where  such QA
activities derive from an understanding of the ultimate purposes of the
data.

-------
                         IV.  BUDGET AND STAFF
    The need for strong QA programs has been recognized  by  the EPA,
although it is evident that many of the operating  sections  of the EPA
have not yet assimilated this realization or determined  the best manner
in which to organize,  and to benefit from,  a strong  QA  program.  There
are probably several reasons for this.  QA at EPA  appears to have
developed in response to a mandate that each program  within the Agency
must have its own QA program;  in some cases these were  added easily,
without extensive change; in other cases, the transition was not so
easy.  The principal client of the program is not  well identified  in
many cases, as is pointed out in Section I.   This  situation can lead to
the development of a less-than-constructive  adversary relationship
between the QA officer and the Agency line personnel, unless the QA
program is organized to protect the QA officer when she  or  he  is in the
vigorous pursuit of some well-recognized QA problem.
    Of serious concern to the panel is the question of the  adequacy of
the Agency's financial commitment to the QA program as a result of the
decentralization of responsibilities for QA.  It appears to the panel
that, in too many cases, the QA program has  been treated as a
meaningless, burdensome, bureaucratic formality.  Because the QA budget
must be derived from the existing program budget,  which  may have been
planned without QA in mind, the responsible  assistant administrator in
some cases sees the QA program as undesirable baggage, competing with
the "real" activity of his or her program.  In some cases the  panel has
encountered very high quality in the designated QA officers;  in other
cases, the appointments appear to have been made to individuals who may
not have fit in well elsewhere; such QA appointments  served as
convenient "dumping grounds."  It is evident that  if  a QA program  is to
be of maximum benefit to the Agency, the personnel appointed  to the QA
officer positions must be highly enough qualified  in  the major areas of
interest to his or her particular office or  laboratory to have the
respect of the people they advise and monitor.  As the panel has stated
previously, it is important that the QA officers be independent of the
groups for which they have oversight and report to the individuals who
seek the protection of the QA program (e.g., the assistant
administrators).  They must be on the payroll of the person to whom
they report, not of the person they monitor.
    Further, the panel is concerned about the undermanning of the QA
offices in the regions as well as in the assistant administrator's
office.  Clearly, this diminishes the confidence that can be  placed by
the administrator in the programs.  The very existence of part-time QA
officers at the regional level reflects the confusion of QC and QA
discussed in Section I.  Part-time QA officers have divided loyalties
even outside QA or QC and cannot provide the protection or advice that
independent, skilled QA personnel should be giving.  Carrying this
point further, the panel recommends that at all levels the
responsibilities for the application of QA/QC procedures be assigned in
ways such that conflicts of interest are minimized.
    At this point in the panel's review of the EPA's QA program, the
panel does not wish to give precise recommendations to the
Administrator regarding expenditures on QA since it does not  yet have
the necessary information regarding staff and  budget to judge properly
the present support for the QA program.  It has been the experience of
experts in private firms that about 0.5 percent of the total  cost of a
technical project should be spent for QA.  This figure could  serve as a
guide for EPA in budgeting for its QA program.
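    As a purely illustrative arithmetic example (the dollar figure is
hypothetical and not drawn from any EPA budget), a monitoring program
costing $20 million would, under this 0.5 percent guideline, allocate
roughly 0.005 x $20,000,000 = $100,000 to QA.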
    As an aid to QA management (and, incidentally, to the panel in its
formulation of useful final recommendations to the Agency), it would be
useful for the Agency to compile some concise information on  QA
personnel and their roles and responsibilities in the QA program.
    1.  A list should be maintained for the officer responsible for QA
of all employees of the Environmental Protection Agency and  its
associated laboratories and agencies who have responsibilities in any
aspect of the QA program (site visits to check the QA procedures  in
contract laboratories, etc.).  This list should indicate the
educational degrees and previous experience of the personnel  shown and
the percentage of their time spent on QA activities, and their assigned
responsibilities to other offices.
    2.  An organizational chart of the QA personnel is important  to  the
effective operation of the QA program.  This should be prepared by the
EPA administration and  should designate clearly to whom each QA
officer reports and from whom the money to support his or her position
is received.  The relationships  (lines of reporting), if any, between
the QA officers within the Agency's major laboratories, regional
offices and laboratories, and the enforcement branches such  as the
National Enforcement and Investigations Center (NEIC) must be
delineated.
    3.  At times of environmental crisis, it is essential that funds be
available to the appropriate EPA office or laboratory to formulate  the
special QA/QC plans that will allow the collection of relevant and
meaningful data to answer the needs as defined by the administrator  or
his or her deputy.  As discussed in Section I, the panel recommends
that a special emergency QA panel of experts be maintained on call,  to
be convened at times of crisis, to establish the necessary sampling and
analytical protocols that will assure the quality of the data to be
collected.  It is at times like this that those responsible for QA may
be especially helpful in advising on QC.  Perhaps the present EPA
contractors and consultants could be among those who serve in this
capacity, although their role as advisor may be compromised by
self-serving interests for further work.  It is important to establish
the degree of precision necessary in the particular measurements,  so as
not to choose methods that result in wasteful overkill.  Obviously,
very precise analyses of large cost will be required for some cases,
and less expensive, less precise analyses will do for others.

    QAMS itself is beginning to function well now.   However, a more
effective QA program in EPA might result from placing a national office
for QA in the EPA's Table of Organization, as opposed to the current
situation where visibility and clout are lacking because QAMS  is a
small activity in the Office of Exploratory Research under the Office
of Research and Development (ORD).  Putting QAMS in the Table would
help assure, as well as emphasize, its importance to all of the EPA
regulatory activities.  Centralization and unification of  basic
principles, methods, procedures, and criteria could more easily issue
from above to the QA officers and their program, regional, or
divisional activities.  As the role of QAMS shifts from program
development to the highest level of auditing, it can monitor  the QA
program and its own performance can be examined for effectiveness.
    There are many compelling reasons for the preparation  of nationally
applicable documents relevant to required scientific data  practices and
the conduct of a sound QA program.  For example, a national QA/QC
Superfund volume could give better direction to regional activities in
solid waste and toxics and would eliminate the need for  regional
offices to develop their individual (and differing) guidelines, such as
were shown to panel members in visits to their regions.  Wasteful
incompatibilities among data sets originating from different  regions
would thus be reduced.  Further, such a centralization of  the
preparation of nationally applicable documents may alleviate  the
situation described aptly by one panel member as the writing  "of
documents about documents," characteristic of the overwriting  that one
does when one's ideas are unclear.  Often, such volumes  produced by
small in-house groups do not provide guidelines that reflect  the
cutting edge of present-day wisdom.  In all fairness, some of EPA's
documents presented to the panel display the Agency's use  of most
current well-grounded statistical methods, for which the Agency clearly
deserves commendation.  It would be advisable for EPA to draw  on the
most knowledgeable and substantial talent in and outside of EPA  to
develop such documents.
    Generally, the panel has observed that there appear  to be no
strategies whatsoever for establishing statistically valid procedures
for field collections.  Samples are sometimes taken without having
properly developed a statistically valid sampling design; rationales
for locating sampling or monitoring stations are likewise often without
logical underpinning.  This is not only ineffective, but costly.
Documentation of the strategy of sampling, not the tactics, would be
extremely helpful, especially if it were reviewed and revised roughly
every five years.
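    To make the point concrete, the following short sketch (in Python,
purely illustrative and not drawn from any EPA or panel material; the
sampling frame, sample size, and seed are hypothetical) shows the
minimal elements of a documented, probability-based field sampling
design:  an explicit frame of candidate locations, a sample size tied
to stated precision needs, and a recorded random seed so that the
selection can be reproduced and audited.

        import random

        # Hypothetical sampling frame: candidate grid cells from the
        # program's documented study area.
        candidate_cells = [f"cell_{i:03d}" for i in range(1, 201)]

        # The sample size would be derived from stated data quality
        # objectives (required precision), not chosen arbitrarily.
        n_stations = 12

        # A recorded seed makes the random selection reproducible and
        # therefore auditable by a QA reviewer.
        rng = random.Random(1988)
        stations = sorted(rng.sample(candidate_cells, n_stations))
        print(stations)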

    In summary, the panel feels that the EPA  administration must
insure that a firm and realistic budgetary commitment is provided to
support the quality assurance program so that well-qualified personnel
can be attracted into the quality assurance officer program and the
necessary standards of an excellent quality assurance program can be
supported.  The amount of commitment should be guided in the interim by
the private sector's policy that the expenditure on QA should be about
0.5 percent of the total operating costs of the office or laboratory.
    The panel also recommends that the quality assurance program see to
it that nationally applicable documents regarding data measurement,
recording, and handling as well as QA procedures are generated by
qualified persons and made widely available to the branches of the
Agency and to its contractors.
    A last but also important recommendation  of this interim report is
that the EPA take care to assign QA responsibilities to  staff that has
competence in this specialized task and has no other assignments that
pose conflicts of interest.

-------
                                 APPENDIX C
                           LIST OF ABBREVIATIONS

AAPPE       Assistant Administrator for Policy, Planning and Evaluation
AARD        Assistant Administrator for Research and Development
ASTM        American Society for Testing and Materials
BIOSTORET   Data bank on biological data
CODATA      Committee on Data for Science and Technology
DQO         Data quality objective
EMSL        Environmental Monitoring Systems Laboratory
EPA         Environmental Protection Agency
ICSU        International Council of Scientific Unions
ISO         International Standards Organization
IUPAC       International Union of Pure and Applied Chemistry
MDL         Minimum detection limit
NAS         National Academy of Sciences
NCAR        National Center for Atmospheric Research
NDAB        Numerical Data Advisory Board
NRC         National Research Council
PVC         Polyvinyl chloride
QA          Quality assurance
QAMS        Quality Assurance Management Staff
QAO         Quality assurance officer
QC          Quality control
RCRA        Resource Conservation and Recovery Act
SAB         Science Advisory Board
SAROAD      A data bank of the EPA
STORET      A data bank on environmental data
TEDG        Technical Enforcement Guidance Document

-------
          UNITED STATES ENVIRONMENTAL PROTECTION AGENCY

                      WASHINGTON, D.C. 20460
                               5 1388               OFFICE OF
                                          POLICY, PLANNING AND EVALUATION
MEMORANDUM

SUBJECT:  GAO Transition Report
FROM:     Robert Wayland  III
          Deputy Assistant Administrator

TO:       Assistant Administrators
          General Counsel
          Inspector General
          Regional Administrators
          Staff Office Directors
     Attached is a GAO report  focusing on specific
environmental issues prepared  for the administrative
transition at the Agency.

     The report is a synopsis  of selected GAO  reports  on
major issues facing the Agency.  GAO presents  their concerns,
suggestions, and recommendations in five basic areas:

          -  overall program management,
          -  RCRA and Superfund issues,
          -  reduced ozone  levels,
          -  pesticide issues, and
          -  water issues.

     This environmental transition report is one  of  26 reports
that GAO prepared to brief  the incoming Administration.   The
other reports concern other significant issues facing  the
federal government.

     GAO will be briefing the  Administrator-designate  and
Assistant Administrator-designates on the issues  discussed
in the report in the coming months.

     Should your office require additional  reports,  please
call me or Steve Tiber, the EPA/GAO Liaison Officer,
at 382-4010.

Attachments

cc:  Lee M. Thomas
     Dr. John A. Moore

-------