February 25, 1999

EPA-SAB-EEC-LTR-99-002

Honorable Carol M. Browner
Administrator
U.S. Environmental Protection Agency
401 M Street, SW
Washington, DC 20460

             Subject:      Science Advisory Board Review of the Implementation of the
                          Agency-Wide Quality System

Dear Ms. Browner:

      In response to a request from the National Center for Environmental Research and
Quality Assurance (NCERQA) of EPA's Office of Research and Development, the Quality
Management Subcommittee of the Science Advisory Board's (SAB) Environmental Engineering
Committee (EEC) reviewed the implementation of the Agency-wide Quality System (QS). The
Subcommittee met in Washington, DC on September 22-24, 1998.

      In the Charge for this meeting, the EEC Quality Management Subcommittee was asked to
use available information to evaluate the Agency's success in implementing the Agency-wide
Quality System. The Subcommittee restricted its review of implementation to data collection
because the Agency's Quality System presently does not address other activities, such as
modeling or environmental technology.

      At an earlier meeting, the Subcommittee addressed the relevance, completeness and
practicality of the Agency's Quality System in a letter report (EPA-SAB-EEC-LTR-98-003).
The focus of the Agency's Quality System on data collection, rather than other activities, was
documented as an issue of concern in the previously cited SAB report (The complete Charge for
both meetings is provided in Enclosure A, and Enclosure B contains a summary of the findings
detailed in the previous report).

      The Subcommittee commends the Agency for developing its Quality Program and
recognizes the Quality Assurance Division (QAD) for its precedent-setting work.  The
Subcommittee also credits the QAD for its efforts to implement the Quality System, including:

      a)     the generation of widely-accepted  policy and project-level guidance and
             requirements;

      b)     a training program that has introduced the Quality System concepts to many
             Agency  personnel;

       c)     a review and approval process for Quality Management Plans (QMPs) that has
             guided implementation of the Agency's Quality System;

       d)     the use of management system reviews (MSRs) to identify areas of needed
             improvement and success stories for Quality System implementation; and

       e)     the use of Quality Assurance Annual Reports and Work Plans (QAARWP) to
             document past and planned efforts to implement the Quality System.

       The Subcommittee wishes to draw the following overarching findings to your attention:

       a)     Implementation of the Quality System is uneven within the Agency, increasing
             the likelihood of problems with data quality and the associated decisions.

       b)     The majority of states that have primacy for implementing numerous
             environmental programs are generating data of unknown quality because they
             lack approved Quality Management Plans (QMP).

       c)     A well-implemented Quality System is necessary to accurately measure progress
              under the Government Performance and Results Act (GPRA).

       d)     Incomplete implementation of the Agency's Quality System precludes proper
             evaluation of internal and external activities; this results in potential for waste,
             fraud and abuse.

       e)     The present reporting status of the Quality System function within the Agency
              organization lowers the profile of the Quality System and denies it access to the
              proper level of authority and the independence necessary to oversee the quality of
              the Agency's services and products.

       f)     Senior Management needs to be a champion for successful implementation of the
             Agency's Quality System or acceptance and implementation of the Quality
             System will remain uneven and incomplete.

       These findings are discussed in more detail below (in addition, findings and
recommendations specific to Quality System Organizational and Program components will  be
found in Enclosure C; Enclosure D addresses Project level components).

       a)     Implementation of the Quality System is uneven across the Agency: After
              reviewing program and regional Quality Management Plans (QMPs), management
              system review (MSR) reports, reviews, and surveys, and after conducting interviews, the
             Subcommittee concluded that implementation of the Agency's Quality System
             varies across the Agency, within regions and programs, and among states and
             grantees.

       Because the QMP is the blueprint for implementing an organization's Quality
       System, the QMP is an indicator of the importance an organization assigns to the
       Quality System.  Timely, coherent, and approved QMPs are reassuring.
       Organizations that lack a QMP, that are tardy in preparing the QMP, that operate
       for extended periods without an approved QMP,  or that suffer from internal
       inconsistencies between QMPs at different levels are worrisome.  Such behaviors
       imply that compliance with the Agency's Quality System is unimportant to the
       organization.  The Subcommittee found actual implementation of the QMP
        requirement varied significantly, from some Agency organizations that functioned
        under a detailed and approved QMP to some states that lacked QMPs entirely.

        The organizational authority for Quality Assurance Managers varies from
        organization to organization, as do the source and availability of funding.
       Typically, funding of regional Quality System activities is negotiated annually
       with the programs.  The Superfund program, which is commonly the largest
       source of funding for Quality System activities, not surprisingly has implemented
       more Quality System activities than other programs.

       Implementation of the Quality System at the project level depends upon the
       manager or supervisor. Where champions exist,  Quality System activities such as
       a structured planning process and the use of quality assurance project plans
       (QAPP) are more likely to be employed.  At the project level, the  unevenness of
       implementation may also reflect the different emphasis on training in quality
       management that exists across the Agency.

       The Subcommittee believes that this uneven and  inadequate implementation of
       the Quality System  can impede achievement of Agency goals.

b)     The majority of states that have primacy for implementing numerous
       environmental programs are lacking approved Quality Management Plans
       (QMP), and most likely are generating data of unknown quality: States with
       primacy for the implementation of many  of the Agency's environmental programs
       have legal responsibility to protect the environment and human health.  More than
       75% of states lack approved Quality Management Plans (QMP) for all or
       significant numbers of their environmental programs.  Subcommittee fact-finding
       with a limited number of states discovered that some states not only lacked an
       approved QMP but  also lacked quality assurance managers (or equivalent
       positions) and any semblance of an independent quality assurance function.

       States lacking a Quality System for environmental programs are unlikely to
        document the quality of their data.  Such a state exposes itself, the reliability of its
        decisions, and its credibility to criticism because of its reliance upon data of
        unknown quality.  The same is true for those Agency programs that depend upon
        those data.

c)     A well-implemented Quality System is necessary to accurately measure
       progress towards compliance with the Government Performance and Results
       Act (GPRA): The intent of GPRA is to improve public confidence in federal
       agency performance by holding agencies accountable for achieving program
       results. GPRA's goals include:

       (1)     improving the accountability of federal agencies by setting annual
              performance objectives;

       (2)     developing a means of measuring progress; and then

       (3)     measuring and documenting the progress made toward reaching
              objectives.

       Thorough implementation of the Agency's Quality System will produce data that
       account for progress made towards GPRA-related objectives.  This
       implementation should include a structured planning process (e.g., Data Quality
       Objective, DQO, process), project plans, and appropriate data assessment. While
       the DQO process cannot assure that the objectives chosen are the appropriate ones
       or that the tools are available to measure progress towards the environmental
       objectives, the DQO process can result in the ability to benchmark and measure
       progress in achieving objectives and therefore will increase public confidence and
       facilitate oversight by Congress.

       Data collected under an appropriate Quality System will be of known quality, and
       as such a determination can be made about the usability of the data in terms of
        their intended use. Data of unknown quality are problematic, since such data may
        lead to unjustified confidence and inappropriate use.  It is not possible to evaluate
        the quality of data, or the quality of associated decisions, projects, and activities,
        unless the data are generated under the auspices of a program that defines their
        quality.  Questions such as whether a site has been cleaned up, whether a
        contractor has completed a project, or whether a facility is in compliance cannot be
        confidently answered.

d)     Incomplete implementation of the Agency's Quality System precludes proper
       evaluation of internal and external activities, with a  resulting potential for
       waste, fraud and abuse: The Agency's Quality System contains the necessary
       components to generate data of known quality.  These, or equivalent components,
       are essential  for identifying the objectives of a data-gathering activity,
       documenting important planning issues for subsequent implementation, and
       assessing the suitability of the resulting data for their intended use.

       When the Agency's Quality System is not properly implemented, one or more of
       the following events is likely to occur:

       (1)    the objectives for collecting data are poorly or incorrectly identified, data
             are collected for the wrong reasons and the resulting data are of
             diminished value;

        (2)    the wrong type of data, or too much or too little data, are collected;

       (3)    the spatial and temporal boundaries of the study are not properly defined;

       (4)    constraints and limitations are not accounted for by the plan or during data
             collection;

        (5)    the necessary degree of confidence in decision-making is neither considered
              nor accommodated;

       (6)    the sampling and analytical activities are not implemented according to a
             well-designed plan; and

       (7)    the data are not properly assessed to determine their usability.

       These events result in data of unknown quality.

       Data of unknown quality are  problematic. Typically, such data are unsuitable for
       decision making  or determining whether a project has been successfully
        completed.  Collecting data without adherence to the Agency's Quality System (or
        an equivalent system) is apt to result in data that cannot achieve the objectives for
        which they were intended.

        The Subcommittee is convinced that a more complete application of the Agency's
       Quality System to data collection projects will provide more usable data, better
       decisions, and improved oversight of internal and external data collection
       activities. The Quality System is designed to reduce  EPA's vulnerabilities by
       producing data of known quality, an evidentiary trail, and reduced sampling and
       analytical costs (by specifying a better match between EPA's information needs
       and data collection). At present the potential for waste, fraud and abuse exists  due
       to the incomplete implementation of the Quality System.

e)     The reporting status of the  Quality System function within the Agency
       organization lowers the profile and diminishes the effectiveness  of the
        Quality System: The Subcommittee's earlier letter report concluded that the
        Agency's current organization does not allow national-level QA managers adequate
        access to the proper level of authority within the Agency (refer to Enclosure B).
        Additional documents examined by the Subcommittee and the Subcommittee's
        discussions with Agency and state personnel indicate that the problem of access
        also exists at the regional, program, and state levels. The Subcommittee found that
        the reporting status of the QA authority correlates with the degree to which the
        Quality System has been implemented. The need to have quality assurance
        managers report at a senior level is expressed clearly in EPA Order 5360.1, and is
        an accepted practice within the quality assurance community.

       "The Quality Manager shall have direct access to the highest level of
       management" (International Standards Organization Guide 25, "General
       Requirements for the Competence of Calibration and Testing Laboratories"): The
       need for quality assurance managers to report to a high level is also implied by the
       repeated requirement that the QA function be independent of those doing the
       work.

       (1)    "It is highly desirable that the QA function or person be independent of
              the functional groups that generate the data." (48 CFR 1546.2, Contract
              Quality Requirements);

       (2)    "The quality assurance unit shall be entirely separate from and
             independent of the personnel engaged in the direction and conduct of the
             study." (21 CFR 58 Federal Drug Administration and 40 CFR Part 160.35,
             Federal Insecticide, Fungicide, Rodenticide Act); and

       (3)    "The QA monitoring function shall be entirely separate from, and
             independent of personnel engaged in direct supervision or performance of
              the work being monitored." (ASTM D5283-92)

        The predominant philosophy within the QA community emphasizes the need for
       independence of the QA function and direct access to the highest levels of
       management.  It is inappropriate to argue that these accepted practices are
       applicable to other and lower tier organizations but not applicable to the Agency,
       the leading QA authority on environmental issues.

       The Subcommittee recommends that the Agency place the Quality System at a
       higher level within the Agency structure so that it can function properly and so its
       position reflects the importance of its role within the Agency. The Agency's
       ability to collect data of known quality adequate for its decisions depends upon a
       long-term Agency-wide commitment to quality that will withstand changes in
       administrations. If the Quality System is perceived as a passing management fad
       or a factor that can be ignored in doing the Agency's business, the Agency will
       remain at risk on data quality issues.

f)      Senior Management needs to be a champion for successful implementation of
        the Agency's Quality System: The Subcommittee found that the Quality
        Assurance Division (QAD) has generated great enthusiasm for quality assurance
        amongst a cadre of officials who are distributed throughout the EPA and have
        data collection responsibility. These are the early adopters, motivated by their
        shared vision of the potential benefits that the Quality System can provide.  To
        date, quality assurance has been primarily a social movement within the EPA (and
        associated state and tribal organizations), with QAD as the chief cheerleader.  To
        gain wider diffusion will require senior management to implement a more
        complex web of persuasion, administrative mandates, and rewards for sound
        quality practices.  Studies of innovation call this moving up an "S"
        curve from early adopters to wider diffusion.  However, these studies also indicate
        that most innovations fail on the rocky shoals of bureaucratic inertia. The
        Subcommittee is concerned that this might be a real possibility for the Quality
        System.  Clearly, the benefits of the Quality System have not been sold to a large
        number of EPA, state, and tribal officials.  While this may be
        understandable given constraints on time and resources, it also poses dangers for
        the sustainability of quality assurance as a function within the Agency.

       The following ideas emerged in the Subcommittee's review and may assist senior
management in providing the means to counter incomplete buy-in and encourage wider
acceptance of the Quality System within the Agency, states and tribes:

       a)      Consider revisiting the reporting status of the Quality System and
              institutionalizing it within the Agency structure.

       b)      Create senior and lower level champions for the Quality System within EPA,
              states and tribal organizations.  Recognize and reward those exemplars who
               incorporate the Quality System so that other EPA, state, and contractor personnel
               can emulate them.  The Agency may wish to expand its present QA awards program
               to include exemplars from the states, tribes, grantees, and contractors.

        c)      Emphasize the benchmarking and oversight advantages of the Quality System as
              management tools and for complying with requirements such as those established
              by the GPRA. The Subcommittee was impressed with the potential of a well-
              implemented Quality System to facilitate compliance with GPRA. A better
              understanding of the synergistic relationship between GPRA and  an operable
              Quality System should advance the movement of the quality system from a
              peripheral activity to a central role within the Agency.

       d)      Articulate the need to have independent oversight of the quality of the Agency's
              products and services and how this oversight will add to the Agency's ability to
              protect human health and the environment as well as to gain increased credibility
              and legitimacy when dealing with political leaders  and the public.

       e)      Articulate the benefits and cost reductions that will eventually accrue following
              incorporation of a Quality System within the Agency structure.

        The Subcommittee finds that the Quality System cannot be successfully implemented
without buy-in and demonstrated commitment from senior management and agrees with the
following quote from ANSI/ASQC E4-1994, the consensus standard upon which the Agency's
Quality System is based: "While management delegates quality management functions to staff,
they cannot abrogate the ultimate responsibility for the success of the quality system."

       The Subcommittee also finds the Agency to be the national and international leader for
quality assurance activities within the environmental community.  To maintain this leadership
position,  and to continue  to improve the data upon which environmental decisions are made, the
Agency's senior management must be prepared to assign resources to the implementation of the
Quality System. The Subcommittee believes that the Agency's senior management and Congress
must recognize that initially, as the Quality System is implemented, there is the potential that the
quality of products and services will improve at the expense of the  total amount of work
performed.  It has been argued that the benefits of a Quality System come at no net cost, but the
validity of this assumption depends on amortizing the costs over the longer term. While the
Subcommittee cannot estimate the incorporation costs, the Subcommittee is confident that the
return on investment will be realized.

        In conclusion, the Subcommittee commends NCERQA and QAD management for
requesting a review of the Agency-wide Quality Management Program, and thanks the QAD
staff for their openness and assistance in faithfully responding to our continued requests for
information and answering our numerous questions.

        As stated in the earlier report, the above findings and recommendations attest to the
significant contributions made towards the quality of data collection activities, while recognizing
that the job is far from done. Continued and increased attention from senior management is
needed to implement the Quality System uniformly across all activities on an Agency-wide basis.

       We look forward to your response to the advice contained in this report.

                                 Sincerely,

                                 Dr. Joan Daisey, Chair
                                 Science Advisory Board
                                 Dr. Hilary Inyang, Chair
                                 Environmental Engineering Committee
                                 Science Advisory Board
                                 Dr. John Maney, Chair
                                 Quality Management Subcommittee
                                 Environmental Engineering Committee

                           ENCLOSURE  A

 CHARGE FOR REVIEW OF THE QUALITY MANAGEMENT SYSTEM

a)     To evaluate the relevance, completeness, and practicality of the policy and
       organizational components of the Agency-wide Quality System.

       (1)    Relevance: Evaluate whether the policy and organizational components of
             the Agency's Quality  System are applicable to the Agency's mission
             statement, goals (EPA/190-R-97-002) and those activities identified in
             Section 1.3 of the Quality Manual.

       (2)    Completeness: Evaluate whether the policy and organizational
             components of the Agency's Quality System are structured such that the
             quality of all Agency activities needed to comply with the Agency's
             mission statement and goals will be monitored and assessed versus
             performance measures.

       (3)    Practicality: Is the present structure of the policy and organizational
             components of the Agency's Quality System designed for success (i.e.,
             will the policy and organizational levels of the Quality System properly
             assess and control pertinent activities and facilitate achievement of EPA
              goals)?

b)     To evaluate the relevance, completeness, and practicality of the project level of
       the Agency-wide Quality System.

       (1)    Relevance: Evaluate whether the project level of the Agency's Quality
             System is applicable to the Agency's mission statement and goals
             (EPA/190-R-97-002) and the covered activities identified in Section  1.3 of
              the Quality Manual.

       (2)    Completeness: Evaluate whether the project level of the Agency's Quality
              System is structured such that all project activities needed to comply
             with the Agency's mission statement and goals will be monitored and
             assessed versus performance measures. Evaluate whether the project level
             guidance documents consider all essential aspects necessary to monitor
             and measure the quality of environmental measurement data.

       (3)    Practicality: Is the present structure of the project level of the Agency's
              Quality System designed for success (e.g., is it cost-effective, efficient
              to implement, and understandable by the intended audience, and will the project
              level of the Quality System facilitate achievement of EPA goals)?

c)    Use available information to evaluate the Agency's success in implementing the
      Agency-wide Quality System.

                           ENCLOSURE B

   SUMMARY OF ISSUES NOTED TO EPA MANAGEMENT AT THE
           SUBCOMMITTEE'S INITIAL PUBLIC MEETING

a)     The need to include all activities that have the potential to affect the quality of
        the Agency's products and services under the auspices of the Quality System;

b)     The need for Agency management to review the appropriateness of the reporting
       status of the Quality System function within the Agency organization.  The
       Subcommittee recognizes that this recommendation is of a policy nature.
       However, the Subcommittee believes this recommendation is justified due to the
       impact of reporting status on the efficacy of the Quality System;

c)     The lack of an Agency-wide focal point for quality issues and needed corrective
       actions;

d)     The need for Quality System training to be expanded to include the training of
       senior management;

e)     The need to identify metrics for "benchmarking" existing levels of quality and
       changes over time;

f)     The need for guidance appropriate for the development and use of
       mathematical/computer models and the associated data; and

g)     The need to determine if budgeted resources for QAD are sufficient to meet the
       increased demand for its services.

                                  ENCLOSURE C

     Implementation of the Organization/Program Components of the Quality System

       This enclosure addresses the Subcommittee's findings, organized according to the various
Quality System components that had been implemented at the time of the review.

1. Implementation of Quality Management Plans

        Finding: The Subcommittee felt that the sluggishness in updating QMPs was
       symptomatic of a more general lack of commitment to the Quality System by senior
       managers.

        Recommendation: The Subcommittee recommends that the Agency provide incentives
        for states, contractors, and grantees to complete their QMPs.

        Recommendation: The Subcommittee recommends that the Agency explore means of
        making the existence of the QMP, a summary of its contents, and its significance more
        widely known to its staff.

       The Quality Management Plan (QMP) is the first step in creating data of known quality.
The QMP provides an overview of responsibilities and lines of authority with regard to quality
issues within an organization.  EPA managers have described the QMP as similar to the
organization's legal agreement for implementation of the Quality System.

       The percentage of EPA operations currently covered by QMPs is unknown. There are
several Agency organizations that claim exemption from QMPs because they do not collect data
as part of their function. Of those EPA organizations identified as collecting data,
over 90% have an approved QMP.  This coverage does not extend, however, to the
states, contractors, and grantees. None of these groups is required by law or regulation to
complete a QMP, and EPA officials report that only a minority of these groups has elected to
complete one.  Given that the states are major data sources for the EPA, the absence of QMPs
for these groups is a weakness in the Quality System. The Subcommittee recommends that the
Agency provide incentives for states to complete their QMPs.

       Due to the relevance of the QMP to operations, the QMP should be a document of great
interest to senior managers and a  document in whose design and development they would likely
want to participate. For example, the QMP is the natural document for each organization to
identify success criteria for the Quality System.  The QMP can be the vehicle for generally
linking policy goals to quality practices in data collection. Senior managers can also use the
QMP to articulate how the Quality System will be used to measure achievement of objectives,
a tool that will be helpful for management oversight and compliance with GPRA.

       Managers have described the QMP as one of the lesser-known documents within EPA.
When quizzed, few managers had seen their QMP, much less were familiar with its content.
More generally, the Subcommittee finds that there needs to be a greater awareness of the
existence and content of the QMP amongst EPA managers and staff. The Subcommittee
recommends that the Agency explore means of making the existence of the QMP, a summary
of its contents, and its significance more widely known to its staff. Several EPA officials suggested
the use of a web site to facilitate easy access.

       Finally, a barrier to successful implementation of the Quality System is the relative
slowness observed in updating QMPs.  This  is particularly acute following reorganizations by
EPA offices.  The Subcommittee felt that the sluggishness in updating QMPs was symptomatic
of a more general lack of commitment to the Quality System by senior managers.

2. The Implementation of Management System Reviews (MSRs)

       Finding: The  Subcommittee found that the MSRs are a necessary component of the
       overall Quality System and serve the purpose of identifying problems that need to be
       addressed to improve the  Quality System.

       Finding: The  Subcommittee found that the delayed completion of the final MSR report
       decreased the  overall effectiveness of the MSR.

       Recommendation: The Subcommittee recommends that implementation of MSRs and
       completion of MSR reports should be given high priority.

        Management System Reviews are conducted periodically, with a goal of every 3-4 years, as
defined in EPA Order 5360.1.  The MSR process consists of a site visit; a draft report that details
findings and recommended corrective actions; consideration of the reviewed organization's
formal response to the draft report; and the authoring of a final report. There are 40 divisions and
regions that are required to undergo an MSR, either as part of their own QMP (as is the case for
regions) or in response to another QMP (for example, the Office of Science and Technology
under the Office of Water). According to the information the Subcommittee received, for the
most part MSRs are now conducted on the correct schedule (approximately every 3-4 years), and
those that have not been conducted appear to have been delayed because the QMP was under
review or only recently approved.

       The Subcommittee found that the MSRs are a necessary component of the overall Quality
System and serve the  purpose of identifying  problems that need to be addressed to improve the
Quality System. Informal interviews by the Subcommittee and a survey taken by the Office of
the Inspector General (OIG) support this claim, since it was concluded that QA managers
(QAMs) found MSRs to be a useful means to bring attention to areas of needed improvement and a
useful mechanism for communication between QAD and the office/region being reviewed.  In
addition, the QAMs found that the MSR comments could function as a change agent by bringing
Quality System issues to the attention of management.

      A major problem with the MSR process has been the timeliness of the final report.
During discussions with QAMs, some had expressed the belief that MSRs would be more
effective if they were more timely. In some cases it is the slow turnaround of the initial draft
report; however, in others it is the fact that comments from the reviewed organization were not
received in a timely fashion. While QAD has limited resources to complete the MSRs, the MSRs
should be given high priority since they are the only tool available for formalizing corrective
actions and giving the office/region a benchmark for improvement (following several MSRs).
The Subcommittee found that the delayed completion of the final MSR report decreased the
overall effectiveness of the MSR.

      As mentioned in an earlier report prepared by this Subcommittee (EPA-SAB-EEC-LTR-
98-003), the Subcommittee recommends that the Agency investigate whether more frequent use
of MSRs would facilitate broader acceptance of the Quality System.

3. Implementation of Training

      Finding: The Subcommittee found the Agency's overall  strategy or formal program for
      training in the Quality System is inadequate.

      Recommendation: The Subcommittee recommends that QAD provide  general guidance
      on how to effectively identify those staff needing training, and on how to design
      appropriate learning activities.

       Recommendation: The Subcommittee recommends that QAD develop record keeping
       procedures and systems so the Agency can effectively focus its resources and training
       capacity.

      Recommendation: The Subcommittee applauds the above positive efforts and urges the
      Agency to identify means for supporting a comprehensive training strategy and program
       which will ensure effective implementation of the Quality System throughout EPA.

       The Subcommittee found that the Agency's overall strategy or formal program for
training in the Quality System is lacking.  The Subcommittee's first report on the Quality System
(EPA-SAB-EEC-LTR-98-003) recommends that the QAD develop a comprehensive strategic
plan for training. Factors that are hindering EPA's ability to effectively implement the Quality
System at all levels include: the lack of a long-term vision to meet the Agency's Quality training
needs, the lack of clarity about QAD's and others' roles, the inconsistency of training
requirements and opportunities, the infrequent tailoring of course content for the audience, the
lack of wide support for course attendance, and the lack of resources allocated to training.

      The Quality training strategy should clearly identify: Agency-wide goals and objectives
for the program; the target audiences; the type and level of training needed by job
responsibilities; approaches to design and deliver effective learning experiences (e.g., courses,
workshops, conferences, etc.); means to evaluate the short- and long-term effectiveness of
training opportunities; the approach for consistent record keeping; and the means to obtain the
ongoing resources necessary to support training.

       Currently, few programs, offices or regions have identified which staff and managers
need Quality training or what types and levels of training would be appropriate for them.  During
the review, the Subcommittee was not able to locate an inventory of personnel's Quality System
learning needs by job responsibilities.  The Subcommittee recommends that QAD provide
general guidance on how to effectively identify those staff needing training, and how to design
appropriate learning activities. Not only should training be tailored to the audience's technical
needs and work context, but the format of the activities should also be suitable for the audience's
method of learning. For example, most adults learn more effectively when their work
experiences are incorporated into the training process, and when they are actively engaged
during the course.

       Although the Subcommittee found evidence that some trainers use exit evaluation forms
to assess the audience's reactions to the training, no long-term (e.g., 3 or more months after
training) evaluation processes were found.  Resources could be focused on determining the long-
term effectiveness of training for selected courses for different levels of staff and managers.
Over time, evaluating whether attendees use the knowledge and skills they learned would
provide valuable information to improve future training activities.

       There is very little evidence that personnel are aware of what training is available. There
appears to be no systematic record keeping of who has been trained or which courses were
offered, even when specific courses are required of specific staff. Throughout most of the
Agency, records are insufficient to assure that training needs are either inventoried or met. The
Subcommittee recommends that QAD develop record keeping procedures and systems so the
Agency can effectively focus its resources and training capacity.

       Where Quality System responsibilities have been decentralized, the number of personnel
involved in Quality and requiring training has increased.  This situation presents a particular
challenge for the Agency when resources allocated for training activities are very limited. It is
important to consider the extent of resources  required to support decentralized Quality systems.
Strategies need to be developed to obtain the essential resources for training to implement a
successful Quality System.

       The Subcommittee noted some exemplary training initiatives, which were exceptions to
the above  concerns. In one region, technical  staff recognized specific training needs and, above
and beyond their routine duties, committed their time, expertise and energies to meet the needs.
Some regions reach out to state, tribal, and contractor staff to invite them to training activities.
Others conduct monthly conference calls or meetings to ensure that lessons learned are shared
broadly.

       The Subcommittee recognizes that QAD staff understand the importance of tailoring
courses to the audience's responsibilities and work context.  They have begun to design training
guidance, modules and materials more targeted to the audience's routine work.  In collaboration
with staff from one office, QAD is developing a new approach to address training concerns.

       The Subcommittee applauds the above positive efforts and urges the Agency to identify
means for supporting a comprehensive training strategy and program which will ensure effective
implementation of the Quality System throughout EPA.

4. Implementation of the Quality Assurance Annual Report and Work Plan

       Recommendation: The Subcommittee recommends that the Agency encourage all
       organizations to complete QAARWPs in a timely manner and that the content of these
        QAARWPs be sufficient for the benchmarking of quality, the measurement of changes
        in quality, and the planning of the Quality System.

The purpose of the QAARWP is to provide information on the previous year's QA/QC activities
and those planned for the current year. The QAARWP details current and planned resources for
the management and implementation of QA/QC activities, training, accomplishments, and
assessments, and it facilitates communication between QA staff and management.  The QAARWP
functions as an important management tool at the organizational level, as well as at the Agency-
wide level when QAARWP-supplied information is compiled across organizations.

The Subcommittee had limited access to QAARWPs; however, additional information was
gleaned from interviews with QAD personnel and organizational QAMs. The Subcommittee
found the QAARWPs to be:
       (1)    an excellent strategic planning tool for organizational and Agency-wide QA
             activities;

       (2)    a mechanism for raising the visibility of QA issues within an organization and on
             an Agency-wide level;

        (3)    a mechanism for benchmarking quality and changes over time; and

       (4)    a platform for recognizing accomplishments related to QA.

       While the EPA Quality Manual emphasizes the importance of the QAARWP to the
Agency's Quality System, interviews indicated that compliance with its annual requirement was
variable across organizations. In fact, information presented to the Subcommittee indicates that
1997 was the first year that all 10 regions submitted a QAARWP, and among the regions, research
laboratories, centers, and offices that complied, the content and usefulness of the QAARWPs varied.  In
response to QAD's encouragement and critiques, organizations have, over time, improved the
content and value of the average QAARWP. However, the Subcommittee recommends that the
Agency encourage all organizations to complete QAARWPs in a timely manner and that the
content of these QAARWPs be sufficient for the benchmarking of quality, the measurement of
changes in quality, and the planning of the Quality System.

                                  ENCLOSURE D

         Implementation of the Project Level Components of the Quality System

1. Implementation of the Data Quality Objective Process

        Finding: The Subcommittee found that adoption of the DQO process by EPA's regions,
       program offices, the states and tribes has been sporadic to nonexistent.

       Finding: The Subcommittee found that there are no demonstrable consequences for
        managers who apply, or do not apply, the DQO process (or its equivalent) in the
       environmental data collection planning process.

       Finding: The Subcommittee found that a lack of statistical expertise was a barrier to
       implementation of the DQO process.

       Recommendation: The Subcommittee recommends that the Agency develop training in,
       and guidance for, stakeholder identification and participation.

Systematic planning is required by Order 5360.1 to develop acceptance or performance criteria
for all operations generating environmental data. The elements of a systematic plan are listed in
Section 3.3.8 of the EPA Quality Manual. The EPA Quality Manual also recommends that this
required planning be conducted using EPA's Data Quality Objectives (DQO) Process. Detailed
guidance in the use of the DQO process has been provided in Guidance for the Data Quality
Objectives Process (QA/G-4), (EPA/600/R-96/055) and supplemental guidance in Data Quality
Objectives Process Decision Error Feasibility Trials, DQO/DEFT (QA/G-4D), (EPA/600/R-
96/056).  The Subcommittee found the DQO process to be relevant, nearly complete and
certainly practical as indicated in the Subcommittee's letter report (EPA-SAB-EEC-LTR-98-
003).  However, the Subcommittee found adoption of the DQO process by EPA's regions,
program offices, the states and tribes has been sporadic to nonexistent.

The DQO process apparently has been adopted by the Department of Energy (DOE) for planning
the cleanup of DOE facilities, indicating that a large government agency can incorporate a
structured planning process into its environmental program. However, acceptance and use of
the DQO process by the Agency's regions and program offices varies greatly even though
training has been provided on the basics of the process. Reasons frequently given for not using
the DQO process fall into one of three principal categories: administrative,
psychological, and technical. With regard to the administrative barriers, the Subcommittee found
that there are no demonstrable rewards or punishments for respectively applying or not applying
the DQO process (or its equivalent) in the environmental data collection planning process. The
lack of a requirement to employ the DQO planning process, and the lack of demonstrable
consequences for those who do  not, may have led to indifference on the part of some project
managers.

With regard to psychological barriers, anecdotal information indicates that a fear of admitting
that decision errors are possible or a desire to avoid the responsibility associated with defining a
tolerable decision error are reasons for avoiding the DQO process. Although the risk of making
a wrong decision based on environmental sampling data has always been present, the DQO
process requires that this risk be explicitly stated and quantified. For many project managers,
this approach represents a paradigm shift in project planning.  Most project managers would
readily agree that there are errors associated with both analytical and sampling procedures, but
admitting that these errors may lead to wrong decisions may be disquieting to them. The
psychological barriers to quantification of the tolerable potential decision error may be overcome
through involvement of all stakeholders in the planning process and consensus-based decision-
making.  The Subcommittee recommends that the Agency develop training in, and guidance for,
stakeholder identification and participation. In addition, the Subcommittee recommends that, in
future revisions to the applicable guidance documents (G-4 and G-4D), QAD more clearly
emphasize that correct decisions will be made most frequently when systematic planning such as
the DQO process is employed.

Attempts at using the DQO process often stop short of implementing the complete process.
Technical barriers are  often a major impediment to the implementation of the DQO process.  For
example, project managers perceive the DQO process to be inflexible and frequently fail to
comprehend and apply Step 6 "Specify Tolerable Limits on Decision Errors."  Part of the cause
for not readily employing Step 6 is that some  project managers feel uncomfortable with the
statistical nature of this step. To properly specify limits on decision errors requires recognition
on the part  of the project manager that the probability of making an incorrect decision can be
statistically controlled. However, a general lack of appreciation for the consequences of ignoring
potential decision errors and a lack of awareness of the added  value of the DQO process still
persist. For those project managers who are uncomfortable with statistical analysis, the DQO
training should be tailored to include a statistical component together with practical examples
that illustrate the impact of establishing decision error probability limits on sampling design and
costs.
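
As an illustrative sketch only (the simplified formula and the numerical values below are
hypothetical and are not drawn from Agency guidance), the influence of the Step 6 error limits on
sampling design can be seen in an approximate sample-size expression for a one-sample test of a
mean against an action level:

       n \approx \frac{(z_{1-\alpha} + z_{1-\beta})^{2}\, s^{2}}{(AL - \mu_{1})^{2}}

where alpha and beta are the tolerable false rejection and false acceptance decision error rates, s is
the estimated standard deviation, AL is the action level, and mu_1 is the concentration at which the
false acceptance rate applies. For example, with s = 2 ppm, AL = 10 ppm, mu_1 = 8 ppm, and
alpha = beta = 0.05 (z = 1.645), roughly n = (1.645 + 1.645)^2 (2)^2 / (10 - 8)^2, or about 11,
samples are needed; relaxing the limits to alpha = beta = 0.20 (z = 0.842) reduces this to about 3
samples. Examples of this kind make concrete how the decision error limits chosen in Step 6 drive
sampling design and cost.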

2. Implementation of the Quality Assurance Project Plans (QAPPs)

       Finding: The Subcommittee found that while QAPPs are not uniformly employed, the
       frequency of QAPP usage is significant; yet anecdotal  information indicates that QAPPs
       frequently lack the project specific details necessary for successful implementation.

       Recommendation: The Subcommittee recommends that the Agency consider approaches
       that will encourage the increased incorporation of project-specific details by Agency
       staff.

The Quality Assurance Project Plan (QAPP) is described  as a  key component of the Quality
System.  The QAPP is the principal output of the DQO process and is the project-specific
blueprint for obtaining data appropriate for decision-making.  Agency policy (Order 5360.1)
requires use of an approved QAPP for any environmental data collection operation in which data
are collected for or on behalf of the EPA.

Although Agency requirements mandate that 100% of environmental data collection operations
on behalf of the EPA require an approved QAPP, in practice this is not always the case.  For
example, the Agency's Inspector General (Report # E1SKB6-09-0041-7100132, March  1987)
uncovered data collection activities, including activities in support of a risk assessment,  that were
carried out without a QAPP; of the 19 QAPPs that were reviewed, 14 did not specify DQOs
and 11 lacked data assessment requirements. Management system reviews have uncovered
Program QMPs and practices that did not require approval of QAPPs prior to data collection.

Due to the intricacies and individual nature of projects, QAPPs should be project specific.
However, in many instances (e.g., federal facilities), a generic or off-the-shelf QAPP may be
prepared and applied for the entire facility regardless of the number and type of environmental
data collection activities that exist. Clearly, this approach to the utilization of a QAPP does not
conform to the QAPP preparation guidance provided in EPA QA/G-5, and typically results in a
document that lacks the specific technical and quality assurance details necessary for successful
implementation of the project.  Since the extent of the specific details contained in a QAPP is
dependent on the type of project, the use of a generic QAPP would be practical in only limited
circumstances.

The EPA QA/G-5 guidance document states that QAPPs must identify and characterize
measurement quality objectives pertaining to specific applicable action levels and data quality
criteria. These measurement quality objectives may be developed through the DQO process or a
similar systematic planning process. The QAPP translates the DQOs into performance
specifications and QA/QC procedures  for the data collectors. Thus the usefulness of the QAPP
will be a function of how well the preceding planning process has identified and defined the
DQOs, a previously cited weakness. Improper or incomplete systematic planning will result
in an improperly developed or incomplete QAPP.

Because of the potential misuse of generic QAPPs  as well as improperly prepared QAPPs, the
EPA document QA/G-5 provides a checklist to assist quality assurance (QA) managers in their
review of submitted QAPPs.  It is unclear to what extent QA managers use this checklist in
screening the accuracy and completeness of QAPPs.  However, if the checklist were used
routinely, improperly prepared or incomplete QAPPs should be readily identified and corrected.

In summary, the Subcommittee found that while QAPPs are not uniformly employed, the
frequency of QAPP usage is significant; yet anecdotal information indicates that QAPPs
frequently lack the project specific details necessary for successful implementation.

3. Implementation of Data Quality Assessment Guidance

        Finding: The Subcommittee found a general lack of awareness by regional, state, and
        grantee personnel of the G-9 document, its content, and its application.

       Finding: The  Subcommittee found that a lack of statistical expertise was often a barrier
       to implementation of G-9.

       Recommendation: The Subcommittee recommends that the Agency determine whether
       there are better approaches for increasing awareness of its requirements and guidance
       documents.

The logical last step in the data quality assurance system is the usability assessment of the
acquired data to determine whether they meet the assumptions and objectives of the systematic
planning process that resulted in their collection.  In other words, this step determines whether the
data are usable because they are of the quantity and quality required to support Agency decisions.
The Agency guidance that addresses the overall data usability assessment process is presently
incomplete. Technical Assessments for Environmental Data Operations (EPA QA/G-7) and
Environmental Data Verification and Validation (EPA QA/G-8) exist only in working
draft form and are not ready for release. Since the Agency's Quality System is presently lacking
these key components, this review focused on the implementation of the Data Quality
Assessment (EPA QA/G-9) component of the data usability assessment process.

The conduct of a data quality assessment (DQA) is often sufficiently complex that consultation with a professional
statistician is required. QAD has rightly recognized that professional statisticians are not in
sufficient supply to meet the demand. Therefore, QAD has prepared Guidance for Data Quality
Assessment: Practical Methods for Data Analysis (EPA QA/G-9), (EPA/600/R-96/084) and a
supporting statistical software package Data Quality Evaluation Statistical Toolbox (DataQuest)
EPA QA/G-9D, (EPA/600/R-96/085).  These documents provide a compendium of statistical
tools for environmental data evaluation and associated guidance for their use.
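
As a minimal illustration of the kind of statistical tool that G-9 compiles (the numbers below are
hypothetical and are not drawn from the guidance), a one-sample t-test can be used to judge whether
a mean contaminant concentration is below an action level AL. With n = 11 samples, a sample mean
of 8.2 ppm, a sample standard deviation of 2.1 ppm, and AL = 10 ppm,

       t = \frac{\bar{x} - AL}{s/\sqrt{n}} = \frac{8.2 - 10}{2.1/\sqrt{11}} \approx -2.84,

which lies below the one-sided critical value -t_{0.05,10} = -1.81, so the data support, at the 95%
confidence level, the conclusion that the true mean is below the action level. Assessments of this
kind are most defensible when the underlying data were collected under a systematic planning
process such as the DQO process.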

While it must be recognized that there is a risk of misuse of these statistical techniques when
employed by those who are not statistical professionals, G-9 provides the project team with the
tools to evaluate data claims by contractors and other stakeholders. The benefits of having these
DQA tools available may well outweigh the risk of their misuse in situations where the DQO
process, and/or other systematic planning process, has not been employed.

Regarding how well the G-9 guidance has been implemented, there is limited anecdotal and
factual evidence to evaluate its application to the Agency's data collection operations. As a result
of interviews, however, the Subcommittee found a general lack of awareness by regional, state,
and grantee personnel of the G-9 document, its content, and its application, and found that a lack of
statistical expertise, as with Step 6 of the DQO process, was a barrier to implementation.

                                        NOTICE
       This report has been written as part of the activities of the Science Advisory Board, a
public advisory group providing extramural scientific information and advice to the
Administrator and other officials of the Environmental Protection Agency.  The Board is
structured to provide balanced, expert assessment of scientific matters related to problems facing
the Agency. This report has not been reviewed for approval by the Agency and, hence, the
contents of this report do not necessarily represent the views and policies of the Environmental
Protection Agency, nor of other agencies in the Executive Branch of the Federal government, nor
does mention of trade names or commercial products constitute a recommendation for use.

      U.S. ENVIRONMENTAL PROTECTION AGENCY
                          Science Advisory Board
                    Environmental Engineering Committee (FY99)

CHAIR
Dr. Hilary I. Inyang, Director, Center for Environmental Engineering and Science
      Technologies (CEEST), University of Massachusetts, Lowell, MA

MEMBERS
Dr. Edgar Berkey, Vice President and Chief Science Officer, Concurrent Technologies
      Corporation, Pittsburgh, PA

Dr. Calvin C. Chien, Senior Environmental Fellow, E. I. Du Pont Company, Wilmington, DE

Mr. Terry Foecke, President, Waste Reduction Institute, St. Paul, MN

Dr. Nina Bergan French, President, SKY+, Oakland, CA

Dr. Domenico Grasso, Head of Department of Civil and Environmental Engineering,
      Environmental Research Institute, University of Connecticut, Storrs, CT

Dr. JoAnn Slama Lighty, Associate Dean for Academic Affairs, Associate Professor of
      Chemical Engineering, University of Utah, Salt Lake City, UT

Dr. John P. Maney, President, Environmental Measurements Assessment, Hamilton, MA

Dr. Michael J. McFarland, Associate Professor, Utah State University, River Heights, UT

Ms. Lynne M. Preslo,  Senior Vice President,  Technical Programs, Earth Tech, Long Beach,
      CA

SCIENCE ADVISORY BOARD STAFF
Mrs. Kathleen W. Conway, Designated Federal Officer, Science Advisory Board, U.S. EPA,
      Washington, DC

Mrs. Dorothy M.  Clark, Management Assistant, Science Advisory Board, U.S. EPA,
      Washington, DC

U.S. ENVIRONMENTAL PROTECTION AGENCY
             Science Advisory Board - Environmental Engineering Committee
                            Quality Management Subcommittee

CHAIR
Dr. John P. Maney, President, Environmental Measurement Assessment, Hamilton, MA

MEMBERS
Dr. Edgar Berkey, Vice President and Chief Science Officer, Concurrent Technologies, Corporation,
       Pittsburgh, PA

Dr. Hilary I. Inyang, University Professor and Director, Center for Environmental Engineering, Science
       and Technology (CEEST), University of Massachusetts Lowell, Lowell, MA

Dr. JoAnn Slama Lighty, Associate Dean for Academic Affairs, Associate Professor of Chemical
       Engineering, University of Utah, College of Engineering, Salt Lake City, UT

MEMBER OF OTHER SAB COMMITTEES
Dr. William J. Adams, Director, Environmental Science, Kennecott Utah  Copper Corp., Magna, UT
       (Ecological Processes and Effects Committee)

CONSULTANTS
Dr. Mohammad A. Ansari, President, American Institute for Pollution Prevention, Chester, VA

Dr. Gordon Kingsley, School of Public Policy, Georgia Institute of Technology, Atlanta, GA

Dr. Michael J. McFarland, Associate Professor, Utah State University, River Heights, UT

Dr. Rebecca Parkin, Director, Scientific, Professional, and Section Affairs, American Public Health
       Association, Washington, DC

Mr. Douglas Splitstone, Principal, Splitstone & Associates, Murrysville, PA

SCIENCE ADVISORY BOARD STAFF
Kathleen W. Conway, Designated Federal Officer, U.S. EPA, Science Advisory Board, Washington, DC

Samuel R. Rondberg1, Designated Federal Officer, U.S. EPA, Science Advisory Board, Washington, DC

Dorothy M. Clark, Management Assistant, U.S. EPA, Science Advisory Board, Washington, DC
1 Provided editorial support for this report, but did not participate in the review.

                       DISTRIBUTION LIST
Administrator
Deputy Administrator
Assistant Administrators
EPA Regional Administrators
Director, Office of Science Policy, ORD
EPA Laboratory Directors
EPA Headquarters Library
EPA Regional Libraries
EPA Laboratory Libraries
Library of Congress
National Technical Information Service
Congressional Research Service
