                    Office of Inspector General

                         Report of Audit

                     Laboratory Data Quality
                       at Federal Facility
                         Superfund Sites

                    E1SKB6-09-0041-7100132
                        March 20, 1997

                      Tooele Army Depot

-------
Inspector General Division              Western Audit Division
 Conducting the Audit:                 Sacramento Branch Office

Regions Covered:                      Regions 8, 9, and 10

Program Offices Involved:              Office of Solid Waste and Emergency Response
                                     Office of Research and Development
                                     Office of Enforcement and Compliance Assurance

Cover Photograph:                     Tooele Army Depot, Tooele, Utah
                                     Photograph by Dan Cox, EPA OIG

-------
                  UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                WASHINGTON, D.C. 20460
                                   MAR 20, 1997
                                                                            OFFICE OF
                                                                       THE INSPECTOR GENERAL
MEMORANDUM

SUBJECT:   Laboratory Data Quality at Federal Facility Superfund Sites
             Audit Report No. E1SKB6-09-0041-7100132

FROM:      Michael Simmons
             Deputy Assistant Inspector General
               for Internal Audits

TO:          Timothy Fields, Jr.
             Acting Assistant Administrator
               for Solid Waste and Emergency Response

             Robert J. Huggett
             Assistant Administrator
               for Research and Development

             Steven A. Herman
             Assistant Administrator
               for Enforcement and Compliance Assurance

       Attached is our audit report titled Laboratory Data Quality Oversight at Federal
Facility Superfund Sites. The purpose of this audit was to determine whether EPA had
sufficient procedures in place to ensure laboratory data was of known and acceptable quality
under Federal facility agreements.  We performed this audit due to the serious problems with
laboratory data quality found in our previous audit of Department of Defense (DOD)
Superfund sites in Region 9.

       The report identifies corrective actions the Office  of Inspector General (OIG)
recommends involving data quality at  federal facility Superfund sites. As such, it represents
the opinion of the OIG.  Final determinations on the matters in the report will be made by
EPA managers in accordance with established EPA audit resolution procedures.  Accordingly,
the findings described  in this report do not necessarily represent the final EPA position and are
not binding upon EPA in any enforcement proceeding brought by EPA or the Department of
Justice.

-------
       Since the recommendations are addressed to three assistant administrators, we are
designating the Acting Assistant Administrator for Solid Waste and Emergency Response as
the primary action official.  As such, the primary action official should take the lead in
coordinating the Agency's official response to this report so that the 90-day time frame for
response is met. Thus the Assistant Administrator for Research and Development and the
Assistant Administrator for Enforcement and Compliance Assurance are secondary action
officials and should coordinate with the primary action official.

       EPA Order 2750 requires the primary action official to provide our office with a
written response to the audit report within 90 days of the report date.  The response should
address all recommendations.  For corrective actions planned but not completed by the
response date, reference to the specific milestone dates will assist us in deciding whether to
close this report.  We have no objection to the release of this report to the public.

       We appreciate the cooperation from your staff during this review. Should you or your
staff have any questions about this report, please contact Truman Beeler, Western Divisional
Inspector General for Audit, at (415) 744-2445, or Katherine Thompson of our Sacramento office
at (916) 498-6535.

Attachment

Distribution: Appendix I
                                           ii

-------
                            EXECUTIVE SUMMARY
PURPOSE
 The purpose of the audit was to determine if EPA had sufficient
 procedures in place to ensure that laboratory data was of known
 and acceptable quality under Federal facility agreements.
FINDING
 Our audit of nine Federal facility Superfund sites in EPA Regions
 8, 9, and 10 showed that EPA and Federal facilities did not have
 sufficient procedures in place to ensure that data was of known and
 acceptable quality. Specifically, we found that:

 •   Quality assurance project plans, the primary means for
    controlling laboratory quality, were not well designed to
    prevent and detect inappropriate data;

 •   Oversight of laboratory data quality needed to be increased;

 •   EPA had not assessed the adequacy of other Federal agencies'
    quality systems for environmental data; and,

 •   There was no Federal system to share laboratory evaluations
    between agencies.

 We believe one primary reason for these weaknesses was that
 EPA's oversight role at Federal facility Superfund sites was
 unclear. In our opinion, effective quality assurance systems could
 have helped avoid $11 million spent on rejected analyses,
 resampling, and associated costs and cleanup delays of up to 2½
 years at the nine sites we audited.

 Because of the problems with EPA oversight and Federal quality
 assurance systems, it is our opinion that laboratory analyses
 conducted to date at the Department of Defense (DOD) and the
 Department of Energy (DOE) sites cannot  be presumed to be of
 appropriate quality for cleanup decision making. This should be a
 national concern since DOD and DOE have over 90 percent of the
 160 Federal facility Superfund sites on or pending inclusion on the
National Priorities List.
                                         iii

-------
Prior EPA Actions

  We noted that EPA, the Department of Defense, and the
  Department of Energy had developed model Federal facility
  agreements in 1988.  These agreements required Federal facilities
  to prepare quality assurance project plans.  Also, these plans were
  required to be primary documents subject to EPA review.

RECOMMENDATIONS

  Our recommendations to improve laboratory data quality at
  Federal facilities include:

 •   Revising the guidance for quality assurance project plans to
     require the inclusion of the more effective quality assurance
     activities;

 •   Issuing guidance specifying regional oversight responsibilities;

 •   Assessing other Federal agencies' environmental data quality
     systems; and,

 •  Requesting that Executive Order 12580 be modified to
    expressly identify EPA's oversight role for environmental data
    quality.

 The Agency program offices generally agreed with the findings
 and recommendations, and advised that the Office of Enforcement
 and Compliance Assurance and the Office of Solid Waste and
  Emergency Response, "...working with the regions and other
  federal departments and agencies, will undertake a program to
  improve the quality of the RI/FS work the federal departments and
  agencies conduct...At this time, OECA and OSWER view the best
  approach to improving the data quality supporting federal facility
  response actions is the cooperative, yet aggressive, approach..."
                                         iv

-------
                              Table of Contents

                                                                          Page

PURPOSE ...............................................................    1

FINDING

Laboratory Data Quality at Federal Facility Superfund Sites
   Results in Brief ...................................................    1
   Background .........................................................    2
   Serious Problems With Laboratory Data Quality ......................    4
   Quality Assurance Project Plans Not Well Designed ..................    6
   EPA Oversight Insufficient .........................................   16
   Federal Quality Systems Not Evaluated ..............................   19
   Laboratory Evaluations Not Shared ..................................   20
   EPA Oversight Role Not Defined .....................................   22
   Environmental Data Quality a Material Weakness .....................   24
   Performance Measures Not Established ...............................   24
Recommendations .......................................................   26
Agency Comments .......................................................   28

AUDIT SCOPE ...........................................................   30

APPENDICES
A  Program Office Responses to Draft Report ...........................   33
B  Acronyms ...........................................................   53
C  How Federal Facilities on the NPL are Cleaned Up ...................   55
D  Data Quality Problems ..............................................   57
E  Definitions of Quality Assurance Activities ........................   59
F  Planning Procedure for Defining Data Quality Objectives ............   61
G  Example of Quality Assurance Report ................................   63
H  Activities Contacted During the Audit ..............................   65
I  Distribution .......................................................   67

-------
(This page intentionally left blank.)
               vi

-------
                          Laboratory Data Quality at
                      Federal Facility Superfund Sites
PURPOSE
The purpose of the audit was to determine if EPA had sufficient
procedures in place to ensure that laboratory data was of known
and acceptable quality under Federal facility agreements.  The
national audit was triggered as a result of serious problems found
in our 1995 audit of environmental data quality at DOD Superfund
sites in Region 9.

RESULTS IN BRIEF

   "It shall be the policy of all EPA organizational units to
   ensure that...environmentally related measurements are of
   known quality."
                                             -EPA Order 5360.1

Our audit of nine Federal facility Superfund sites in three EPA
regions showed that EPA and Federal facilities did not have
sufficient procedures in place to ensure that data was of known
and acceptable quality.  Specifically, we found that:
                         •   Quality assurance project plans, the primary means for
                             controlling laboratory quality, were not well designed to
                             prevent and detect inappropriate quality data;

                         •   EPA did not have controls to ensure these plans were in place
                             and operating;

                         •   EPA had not assessed the adequacy of other Federal agencies'
                             quality systems for environmental data; and,

                         •   There was no Federal system to share laboratory evaluations
                             between agencies.

                         We believe one primary reason for these weaknesses was that
                         EPA's oversight role at Federal facility Superfund cleanups was
                         unclear. In our opinion, effective quality assurance systems could
                         have helped avoid $11 million spent on rejected analyses,
                          resampling, and associated costs and cleanup delays of up to 2½
                         years at the nine sites included in our audit.

                         Because of the problems with EPA oversight and Federal quality
                         assurance systems, it is our opinion that laboratory analyses

-------
 conducted to date at the Department of Defense (DOD) and the
 Department of Energy (DOE) sites cannot be presumed to be of
 appropriate quality for cleanup decision making. This should be a
 national concern since DOD and DOE have over 90 percent of the
 160 Federal facility Superfund sites on or pending inclusion on the
 National Priorities List.

 We recommend that EPA strengthen oversight of data quality at
 Federal facility Superfund sites.  Our specific recommendations to
 correct data quality problems start on page 26.  The Agency
 generally agreed with these recommendations, as discussed starting
 on page 28. The Agency's complete response is presented in
 Appendix A.

Prior EPA Actions

 We noted that EPA, DOD, and DOE had developed model Federal
 facility agreements in 1988. These agreements required Federal
 facilities to prepare quality assurance project plans; also, these
 plans were required to be primary documents subject to EPA
 review.
BACKGROUND
Federal facilities are a significant part of EPA's Superfund
workload. In 1995, Federal facilities had 160 sites on or pending
inclusion on EPA's Superfund National Priorities List, a register of
the nation's worst contaminated hazardous waste sites. DOD and
DOE had over 90 percent of these sites, including military bases,
manufacturing plants, and laboratory facilities. (Acronyms are
explained in Appendix B.)

Federal facilities comprise nearly 60 percent of EPA's Superfund
workload under remedial investigation or feasibility study phases.
These are the cleanup phases when most environmental data is
collected. Environmental data is collected by sampling
contaminated water, soil, air, and other materials, and having the
samples analyzed by a laboratory.

EPA Regions 8, 9, and 10 oversee about 40 percent of the Federal
facility Superfund sites, including:

•   Hanford Nuclear Reservation, one of DOE's (and the nation's)
    two largest Superfund cleanups; and

•   Rocky Mountain Arsenal, one of DOD's two largest cleanups.

-------
Cleanup Rules

CERCLA, Executive Order 12580, and Federal facility agreements
set rules for Superfund cleanups at Federal facilities. Under
CERCLA (the Comprehensive Environmental Response,
Compensation, and Liability Act), Federal agencies are required to
carry out their hazardous waste cleanups according to the same
guidelines as other facilities.  Executive Order 12580 further
delegates certain Superfund cleanup authorities to DOD and DOE.

Federal facility agreements are site-specific agreements that govern
cleanups. These agreements set requirements and enforceable
schedules for completing studies, reports, and cleanup decisions.
Once a site is placed on the Superfund National Priorities List,
EPA, the Federal facility, and the state typically negotiate a
Federal facility agreement.  EPA is responsible for overseeing
these agreements and has final decision-making authority for
selecting the cleanup remedy. (The Federal facility Superfund
cleanup process is described in Appendix C of this report.)

Data of Known Quality

   "The primary goal of the QA program is to ensure that all
   environmentally related measurements...[laboratory analysis]
   produce data of known quality.  The quality of data is known
   when all components...are thoroughly documented, such
   documentation being verifiable and defensible."
                                               -EPA Order 5360.1

In order to oversee Federal facility cleanups, EPA should ensure
that environmental data supporting decisions is of appropriate
quality.  EPA Order 5360.1 requires environmental data to be of
known quality and defensible.  The quality of this data may be
adversely impacted by weaknesses in sampling, laboratory analysis,
and the validation of results.  Poor quality data can negatively
impact or delay the decision-making process.  Further, incorrect
decisions can lead to inadequate health protection or expenditures
for unneeded cleanup remedies.

Steps for Laboratory Analysis Quality

There are two major steps to plan for appropriate quality laboratory
analyses at a site.

First, data quality objectives (DQOs) must be determined.  Such
objectives define how data will be used, and establish corresponding
quality objectives before data is collected, thereby resulting in a
defensible decision-making process.

-------
Second, a quality assurance project plan (QAPP) must be developed
according to 40 CFR 300.430.  The quality assurance activities
necessary to achieve the DQOs are incorporated into a QAPP.  This
plan is a blueprint for ensuring the laboratory analyses produce
data of appropriate quality and quantity for decision-making.

[Figure: Data Quality Objectives]

EPA Oversight

EPA regions oversee Federal facility Superfund cleanups.  Three EPA
headquarters offices provide guidance that impacts this oversight
role: the Federal Facilities Restoration and Reuse Office; the
Federal Facilities Enforcement Office; and the Quality Assurance
Division.

   "The mission of the Federal Facilities Restoration and Reuse
   Office is to assist the Federal government to promote effective
   and timely clean up and reuse of Federal facilities."
                -EPA's Federal Facilities Restoration and Reuse Office

Federal Facilities Restoration and Reuse Office

This office, under the Office of Solid Waste and Emergency Response
(OSWER), develops guidance and policy for Superfund cleanups at
Federal sites; it also supports the development of related policies
by other agencies.

Federal Facilities Enforcement Office

This office is part of the Office of Enforcement and Compliance
Assurance, and is responsible for developing national Federal
facility enforcement and compliance policy and managing the
resolution of enforcement disputes.

Quality Assurance Division

This division, part of the Office of Research and Development, is
responsible for directing and overseeing implementation of
Agency-wide policy for quality assurance applicable to all
environmental data collection activities.

SERIOUS
PROBLEMS WITH
LABORATORY
DATA QUALITY
Federal facilities have experienced serious problems with the
quality of laboratory analyses used to make cleanup decisions.
There is evidence these problems are widespread. To illustrate:

•  Extensive laboratory fraud was found at one laboratory, which
   was used by 28 DOD installations in three EPA regions,

-------
resulting in about $5 million of lost data, resampling
costs, and associated expenses.
                           "As we protect public health and
                           the environment, we need to be
                           confident that the laboratory
                           results we rely on are accurate."

                                 -Director, Superfund Programs
                                              EPA Region 9
•  EPA suspended another laboratory for improper analyses.  This
   laboratory did work at five DOD sites in two EPA regions.  One of
   these sites was Hunters Point Naval Shipyard, where $2.5 million
   of data from this laboratory and another laboratory was deemed
   unusable and the cleanup was delayed 2 years.

•  DOE had problems with laboratory analyses at its Hanford and
   Fernald Superfund sites.  Fraudulent laboratory analyses were
   alleged at Hanford, one of the nation's largest environmental
   cleanup sites.  Further, approximately $240,000 of laboratory
   analyses were rejected at its Fernald site.

                           H Reactor
              Hanford Nuclear Reservation, Washington

•  Additional laboratory analyses, costing about $3.2 million,
   could not be used for their intended purpose at Rocky
   Mountain Arsenal, Luke Air Force Base, Travis Air Force

-------
                              Base, and Sacramento Army Depot because of laboratory
                              quality issues. Moreover, the cleanup of one operable unit at
                               Travis Air Force Base was delayed more than 2½ years.

                           (Additional discussion on some of these data quality problems is
                           provided in Appendix D.)

                           A total of nine Federal facility Superfund sites, covering EPA
                           Regions 8, 9, and 10, were reviewed in our audit. As discussed
                            above, $11 million was spent on rejected analyses, resampling,
                            and associated costs.  Further, the cleanups at these sites were
                            delayed up to 2½ years.

                           We believe that the full extent of the data quality problems had not
                           been identified because:

                           •   QAPPs, the primary means for controlling laboratory quality,
                              were not adequately designed to prevent and detect
                              inappropriate quality data;

                           •   Oversight of laboratory data quality needed to be increased to
                              ensure QAPPs were followed;

                           •   EPA had not assessed the adequacy of other Federal agencies'
                              quality systems for environmental data; and,

                           •   There was no Federal  system to share laboratory evaluations
                              between Federal agencies.

                            We believe one of the primary reasons these problems existed was
                            that EPA's oversight role at Federal facility Superfund cleanups
                            was unclear.

                           We also observed that EPA needed to  improve its management
                           control system over laboratory data quality by documenting its data
                           quality system and establishing performance measures for
                           environmental data quality.
QUALITY ASSURANCE PROJECT PLANS NOT WELL DESIGNED

   "The QAPP is an important part of the EPA Quality System,
   and is required for all data collection activities that
   generate data for use by EPA."
                                                   -EPA QA/G-4

One of the major reasons for data quality problems was that QAPPs
were not designed to prevent and detect inappropriate quality data.

-------
We reviewed 19 QAPPs at nine Federal facilities in Regions 8, 9,
and 10.  These nine sites included two or more operable units.
QAPPs are usually prepared for each operable unit of a Superfund
site.  Additionally, sometimes there are site-wide QAPPs.
Superfund sites are frequently divided into operable units to make
sites more manageable.

Our evaluation of 19 QAPPs found that:

•   Data quality objectives were either not defined or not
    adequately defined in 14 of the 19 QAPPs used at the nine sites;

•   A QAPP was not used for the collection of critical risk
    assessment data at one site; and

•   QAPPs did not make appropriate use of three quality assurance
    activities which have been found to be effective in detecting
    unacceptable quality data.

Consequently, the QAPPs we reviewed were not adequately
designed for collecting data of appropriate quantity and quality to
support the decision-making process. We believe these QAPPs
were representative of the typical Federal facility QAPP in these
regions.

Data Quality Objectives Were Deficient

Our review of data quality objectives (DQOs) for nine Federal
facilities showed that objectives were either not established or
not adequately defined for 14 of 19 QAPPs.  As a result, it was
difficult to determine whether data of appropriate quality and
quantity was collected to support decision making at the sites.

The DQO process is a series of planning steps based on the
scientific method that is designed to ensure that the type, quantity,
and quality of environmental data used in decision making is
appropriate for the intended application. The process allows
decision makers to define their data requirements and acceptable
levels of decision errors before they collect data. The outcome of
the process, data quality objectives, should be the driving
component of the QAPP. EPA's document, QA/G-4, provides
guidance for the DQO process.

We found that satisfactory DQOs had been established for 5 of the
19 QAPPs reviewed. For the other 14 QAPPs, DQOs were either
not defined or not adequately defined as shown below:

-------
                       Weaknesses Found With DQOs

                                                            QAPPs With
  Weakness                                                   Weakness

  DQOs not defined                                               7

  Objectives not defined for each data collection
  activity                                                       3

  Objectives not defined for each collection
  activity and analytical levels* not defined for
  each data use                                                  3

  Analytical levels* defined by objectives not
  accurately incorporated into the QAPP for
  some data uses                                                 1

  Total                                                         14

  *The OSWER directive defining analytical levels was rescinded in 1993 after
  OSWER Directive 9355.9-01 was issued.  OSWER Directive 9355.9-01 revised
  the DQO process and replaced analytical levels (along with other elements) with
  acceptable decision errors and data categories.

Poor DQOs Cause Problems

 When DQOs are not defined, the project runs the risk of collecting
 inappropriate quality data or expending too much for sampling. In
 this regard, we found that the initial sampling costs were much
 higher than the resampling costs, possibly indicating that initial
 DQOs may have been inadequate or incomplete.

 For example, we believe the lack of sound DQOs increased
 sampling and analysis costs at Hunters Point Naval Shipyard.
 DQOs at Hunters Point did not establish acceptable error rates or
 confidence requirements for determining the sample size.
 Nonetheless, the Navy collected over 1,200 samples during the fall
 of 1990; the cost was about $2.5 million. After major data
 problems were encountered, the Navy was forced to resample.
 However, costs were $1 million, less than half the initial costs of
 $2.5 million.  We believe this lower resampling cost indicates the
 initial sampling effort was excessive.
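
For context, the link between stated error tolerances and sample size can be
shown with a standard textbook formula for estimating a mean concentration.
This is only an illustration; the confidence level, variability, and margin of
error below are hypothetical and are not taken from the Hunters Point QAPP:

    n = (z × s / E)²

    where z is the normal deviate for the chosen confidence level, s is the
    expected standard deviation of the measurements, and E is the tolerable
    margin of error.  For 95 percent confidence (z = 1.96), s = 10 mg/kg, and
    E = 2 mg/kg:

    n = (1.96 × 10 / 2)² = 96.04, or about 97 samples

When a QAPP documents this kind of calculation, the planned number of samples
can be defended after the fact; without it, there is no basis for judging
whether the 1,200 samples collected at Hunters Point were too many or too few.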

 Problems with DQOs occurred because the Federal agencies had
not effectively used the DQO process to establish QAPP
requirements. Key decision makers and technical experts were
oftentimes not participating  in  the process.  Further, cleanup
managers believed the process  needed more structure and specific
guidance documentation.

-------
QAPP Not Used For Critical Data

We noted that a QAPP was not used to collect data to fill critical
gaps in support of Fort Wainwright's postwide risk assessment.
Additional field sampling was conducted at Fort Wainwright in
1995 to fill critical data gaps for the risk assessment.
                                                         "Thepostwide risk assessment
                                                         scheduled for completion in 1996,
                                                         is intended to provide a
                                                         comprehensive evaluation of
                                                         potential human health and
                                                         ecological risks across the post."

                                                               - Final Postwide Field Sampling
                                                             Plan for Fort Wainwright, Alaska
We were told the additional field sampling was conducted under
Operable Unit (OU) 5's remedial investigation and feasibility study
QAPP.  However, this QAPP only established DQOs and quality
assurance activities for OU 5.  It did not establish requirements
for the collection of critical data needed for the postwide risk
assessment.  Consequently, DQOs and quality assurance requirements
had not been established for critical risk assessment data.

Three Effective Data Quality Activities

Our review found that three data quality activities were particularly
effective in detecting inappropriate quality data:

•  Independent data validation, using EPA functional guidelines
   or their equivalent;

•  An independent laboratory audit before work starts and
   periodically throughout the project; and,

•  A requirement to provide magnetic media of raw data, when
   needed.

While 8 of the 19 QAPPs reviewed required at least one of these
activities, the other 11 QAPPs did not require any of the three
quality assurance activities we found effective.

As shown in the following chart, EPA or the Federal facility used
these three activities to find data problems at seven of the nine
Federal facilities we reviewed.

-------
                       Quality Assurance Activities Used
                         To Identify Unacceptable Data

                                Data       Laboratory     Magnetic
  Site                        Validation     Audits      Tape Audits

  March Air Force Base                          •             •
  Hunters Point Naval
    Shipyard                      •
  Travis Air Force Base                         •
  Sacramento Army
    Depot                         •
  Luke Air Force Base             •                           •
  Rocky Mountain
    Arsenal                                     •
  Fort Wainwright                 •
 (Quality assurance activities are defined at Appendix E.)

Data Validation Found Effective

 Data validation identified data quality problems at four of the sites
 we reviewed. Data validation is used to ensure that laboratory
 data is of known and documented quality.  It involves reviewing
 data against a set of criteria to provide assurance that data is
 adequate for its intended use.  It is absolutely essential at key
 decision points, such as determining the boundaries of
 groundwater contamination.

 EPA has data validation guidelines, known as National
 Functional Guidelines, for its own contract laboratory program.
 Generally, the QAPPs we reviewed called  for data validation that
corresponded with EPA data validation guidelines.  According to
EPA guidelines, data validation includes a  review of
documentation such as raw data, instrument printouts, chain of
custody records, and instrument calibration logs.

For example, data validation was effective in finding data
problems at Sacramento Army Depot. Twenty percent of the data
for the Depot's Burn Pits Operable Unit was required to be
validated; however, prior to our review, the  data had not been val-
idated. Subsequently, we requested that Region  9 validate critical
data  from the March 1991 sampling round, resulting in the
                                         10

-------
                                                         "The sample results [for volatile
                                                         organic compounds] are rejected
                                                         due to serious deficiencies in the
                                                         ability to analyze the sample and
                                                         meet quality control criteria.  The
                                                         presence or absence of the analyte
                                                         cannot be verified."

                                                            -Region 9 Quality Assurance Section
                                                                               May 19, 1995
rejection of volatile organic
compound analyses. Further,
our Engineering and Science
Staff determined that all
samples taken in March 1991
should be rejected because of
a defect in the sampling
technique.

These rejected samples were
critical because they were
used in the public health risk
assessment, remedial investigation, feasibility study, and record of
decision. This data was also used to determine the contaminants of
concern, determine the cleanup levels for the contaminants, and
select the cleanup remedy.

Laboratory Audits Resulted in Data Being Rejected

Laboratory audits identified inappropriate quality data at three of
the seven sites with data quality problems. On-site laboratory
audits are designed to identify technical areas which may cause
laboratories to improperly identify or quantitate chemicals. Audits
normally evaluate a laboratory's technical expertise, standard
operating procedures, facility and equipment sufficiency, and
possible sources of sample contamination.

One example of the effective use of laboratory audits was at Rocky
Mountain Arsenal in EPA Region 8.  The Arsenal effectively used
laboratory audits of their contract laboratories to find poor data
and to avoid using laboratories with problem performance.

   "Eureka Laboratory is not in compliance with the requirements of
   the...[Chemical Quality Assurance Plan], the contract under which
   they are performing work and good laboratory practices.  Data
   produced for certification has been found to be altered and may
   be required to be repeated prior to acceptance by [the Program
   Manager for Rocky Mountain Arsenal]."
                                        -Rocky Mountain Arsenal
                          August 1993 Audit Report on Eureka Laboratories

To illustrate, in June 1995, the Rocky Mountain Arsenal conducted an
audit of a Texas laboratory.  This audit found that the laboratory
was performing poorly and had not shown the expected improvements.
Also, changes made at the laboratory did not effectively identify or
eliminate the problems.  As a result, the
                                            11

-------
                            Arsenal stopped sending samples to the laboratory, and did not use
                            data previously analyzed by the laboratory. The Arsenal paid the
                            laboratory about $485,000 for the unused data analyses.

                         South Plants Area
        On-Post Operable Unit, Rocky Mountain Arsenal, Colorado

We believe that on-site laboratory audits are one of the better
quality assurance activities if they are performed: (i) before
samples are sent to the laboratory; and (ii) periodically throughout
the sampling process. We also believe QAPPs should allow
unannounced audits so that laboratory performance at the time of
the audit is representative of routine operations.

Magnetic Tapes Should be Available

Magnetic tape audits were used to verify the extent of data quality
issues at two of the seven sites with data quality problems.  Tape
audits are routinely conducted by EPA in monitoring its contract
laboratories. At  a minimum, we believe magnetic data should be
available so that  tape audits can be done when warranted.

Magnetic tape audits can identify poor laboratory practices but
have limited usefulness. These audits, in conjunction with data
audits, are used to assess the authenticity of the data generated and
the implementation of good automated laboratory practices.
However, magnetic tape audits generally can only be used for data
generated by gas chromatography and mass spectrometry
laboratory equipment.
                                            12

-------
                                                       "These audits [magnetic tape
                                                       audits] are used to assess the
                                                       authenticity of the data generated
                                                       and assess the implementation of
                                                       good automated laboratory
                                                      practices."
                                                                 -AFCEE Quality Assurance
                                                                Project Plan, February 1996
For example, Region 9
used magnetic tape audits
at March Air Force Base to
detect major data quality
problems. The Region
used this technique after
double-blind performance
evaluation samples
identified data problems.
The tape audits found
deficiencies and pervasive fraudulent work. This led to Eureka
Laboratories pleading guilty to falsifying test results and two of its
chemists being convicted of fraud in May 1995.

We believe it is critical that the requirement for the availability of
magnetic tapes be written into QAPPs and laboratory contracts.  To
illustrate, Travis Air Force Base found potential problems with
data from one laboratory. However, we were advised that the
laboratory refused to provide magnetic tapes of raw data for audit
because contract specifications did not require availability of
magnetic data. Thus, Region 9 could not determine whether the
data was of appropriate quality for its intended purpose. Resultant
laboratory data problems delayed the cleanup more than 2½ years.

In summary, we believe magnetic tape audits should be performed
if major deficiencies are found by other methods, such as data
validation or performance evaluation samples.  However, in order
to do so, Federal agencies must be able to obtain magnetic data.
This means including the requirement for magnetic data
availability in QAPPs and laboratory contracts.

Lack of Guidance

EPA's guidance document for the preparation of quality assurance
project plans, QAMS 005/80, did not require data validation,
laboratory audits, and magnetic tape availability.  To ensure these
quality assurance items are addressed in QAPPs, and required when
appropriate, EPA guidance should be modified to require their
inclusion when DQOs require high-quality data.

   "EPA believes that the appropriate content and level of detail in
   the QAPP may be best achieved by having the QAPP requirements
   reviewed and confirmed by the EPA project manager with the
   assistance and approval of the EPA QA Manager."
                                                      -EPA QA/R-5
                                          13

-------
Assistance from Regional Quality Assurance Staff

 EPA's regional quality assurance staffs are a good source of
 expertise to improve QAPPs. However, we found 13 of 19 QAPPs
 were not reviewed and approved by the regional quality assurance
 staffs. QAMS 005/80 did not require QA staff to review QAPPs.
 However, interim final QA/R-5, EPA Requirements for Quality
  Assurance Project Plans for Environmental Data Operations,
 August 1994, includes a requirement for EPA QA managers to
 assist EPA project officers in reviewing and approving QAPPs.
 Even though QA/R-5 is not expected to be finalized until late
 1997, it has been embraced EPA-wide, with the exception of
 Region 10.

Best Practices

 During our audit we found "best practices" for developing DQOs,
 validating data, tracking laboratories, contracting with laboratories,
 and reporting quality assurance findings. These best practices
 merit inclusion in Federal quality assurance programs whenever
 possible.

DQO Development

 The environmental restoration contractor at Hanford developed an
 effective planning procedure for defining DQOs.  This procedure,
 shown at Appendix F, involves key decision makers, including
 EPA, in the development of objectives. The outcome of this
 procedure was a set of DQOs with the level of detail and
 information needed by the QAPPs and field sampling plans. EPA
 should refine its process to include many of the aspects of this
 procedure.

Computerized Data Validation

 Another best practice is the use of computerized data validation.
 EPA has developed two computerized data validation programs to
 verify laboratory performance.  They are called Computer-Aided
 Data Review and Evaluation (CADRE) and electronic Data
 Transfer and Validation System (e-Data).

 Region 9 has tested both systems for use at Federal facilities.  The
 Region found that  CADRE not only identified the same problems
 that manual data validation did, but was more objective and
 consistent. A drawback to CADRE was that, as a computer
program, it could not visually inspect raw data to identify
anomalies.

Region 9 also did a study of e-Data with the Navy at Pearl Harbor.
Among other things, the study found that e-Data:
                                        14

-------
                          •   Was able to quickly load, process and identify outlying quality
                              control data much more efficiently than manual procedures;

                          •   Did not make any transcription or recording errors; and

                          •   Reduced the effort required to distribute and process data.

                          Neither program was able to accommodate deviations from the
                          prescribed agency standard format for electronic data deliverables.
                          Another drawback is that the laboratories needed computer systems
                          that produce data in the proper format.

                          Although there are some problems with electronic data validation,
                          tests have shown that computerized data validation can be much
                          more efficient than regular data validation as shown in the
                          following table:

                     Comparison of Data Validation Methods

                                        CADRE          Manual

          Time to Validate              4 hours        35 hours
          Turn Around Time              1 week         1 month
          Cost to Validate              $150           $1,200

          Source: Region 9 study.
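
To make the idea of computerized data validation concrete, the sketch below
shows, in Python, the kind of mechanical check such a program automates:
comparing reported quality control results against acceptance limits and
flagging exceptions.  It is only an illustration under assumed inputs; it is
not an implementation of CADRE or e-Data, and the analytes, limits, and field
names are hypothetical.

    # Illustrative sketch only: a toy pass over laboratory results against
    # quality control acceptance limits, in the spirit of automated data
    # validation.  This is NOT an implementation of CADRE or e-Data; the
    # analytes, limits, and field names are hypothetical.

    QC_LIMITS = {
        # analyte: acceptable surrogate recovery window (min %, max %)
        "trichloroethene": (70.0, 130.0),
        "benzene": (75.0, 125.0),
    }

    results = [
        {"sample": "SB-01", "analyte": "trichloroethene", "recovery_pct": 62.0},
        {"sample": "SB-02", "analyte": "benzene", "recovery_pct": 98.5},
    ]

    def within_limits(record):
        """Return True when the surrogate recovery falls inside the QC window."""
        low, high = QC_LIMITS[record["analyte"]]
        return low <= record["recovery_pct"] <= high

    for r in results:
        status = "acceptable" if within_limits(r) else "FLAG: outside QC limits"
        print(f"{r['sample']} {r['analyte']}: {r['recovery_pct']}% -> {status}")

A fuller system would layer additional checks (for example, holding times,
calibration and blank results) across much larger electronic deliverables; the
time and cost comparison in the table above reflects that kind of scale.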

Army Validation and Tracking System

The U.S. Army Corps of Engineers (the Corps) implemented a
laboratory validation and tracking system.  This system required
laboratories to be "validated" before contracts were awarded.  The
validation process typically included an on-site audit and
performance evaluation samples.  (Performance evaluation
samples are samples spiked with known quantities of contaminants
used to measure a laboratory's analytical performance.)

The tracking system monitored laboratory validation information
related to Corps' contracts. This system tracked laboratory
information such as a business profile, performance evaluation
sample results, and laboratory validation status.
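
As a rough illustration of the kind of record such a tracking system holds,
the Python sketch below defines a minimal laboratory profile with fields of
the sort described above.  The structure and field names are hypothetical and
are not drawn from the Corps' actual system.

    # Hypothetical sketch of a laboratory tracking record; the fields are
    # illustrative and are not taken from the Corps' system.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class LabRecord:
        name: str                    # business profile
        validation_status: str       # e.g., "validated" or "suspended"
        validation_expires: date     # supports an 18-month revalidation cycle
        pe_sample_scores: list = field(default_factory=list)  # PE sample results

    lab = LabRecord(
        name="Example Analytical Services",
        validation_status="validated",
        validation_expires=date(1996, 5, 1),
        pe_sample_scores=[92.0, 105.5],
    )

    # A simple query a contracting office might run before awarding new work.
    if lab.validation_status != "validated" or lab.validation_expires < date(1996, 3, 1):
        print(f"{lab.name}: do not award new work until revalidated")
    else:
        print(f"{lab.name}: eligible for contract award")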

Contracting Directly with Laboratories

Another best practice that increased control over environmental
data was directly contracting with laboratories, instead of
subcontracting through environmental engineering firms. Rocky
Mountain Arsenal contracted directly with its laboratories.  This
                                           15

-------
allowed the Arsenal to have more control over the laboratories.
The Arsenal also included many "best practices" in its laboratory
contracts, including laboratory audits and performance evaluation
samples. Further, it included a clause that said no more work
would be sent to the laboratory if it did not meet the minimum
requirements for operational and documentation requirements.

Meaningful Quality Assurance Report

The Travis Air Force Base QAPP established an effective format
for the quality assurance report.  The report showed the results of
the PE samples, laboratory audits, and data validation.  The report
included information on the findings, corrective actions required,
and the effects on data quality assurance.  The report was included
with the remedial investigation reports for EPA's review.  An
example of one of these reports is included in Appendix G.

                     Storm Water Treatment Plant
                    Travis Air Force Base, California

EPA OVERSIGHT INSUFFICIENT

EPA regions were not providing sufficient oversight on Federal
facilities' implementation of QAPP requirements.

Region 8

For example, we found that Region 8 did not have a copy of the
current QAPP for Rocky Mountain Arsenal at the time of our audit
in June 1996. The original QAPP was implemented during 1989
and had substantially changed since that date. Without the current
                                          16

-------
    "... Regional Administrators
   shall .'...Ensure that all projects
   and tasks involving
   environmentally related
   measurements are covered by
   an acceptable QA project plan
   and that the plan is
   implemented...."
                  -EPA Order 5360.1
QAPP, the Region was unable to
adequately oversee the Arsenal's
compliance with quality
assurance requirements.

Region 9

Region 9 did not monitor the
data validation requirement
specified in the QAPP for the
Sacramento Army Depot's Burn
Pits Operable Unit.  The QAPP
required 20 percent of the data
to be validated according to EPA national functional guidelines.
However, we found that no data was validated.

Subsequent data validation resulted in the rejection of critical
analyses.  After we determined data was not validated, we
requested that Region 9 validate critical data from the March 1991
sampling round. This validation resulted in the rejection of
volatile organic compound analyses.

These rejected samples were critical because they were used in the
public health risk assessment, remedial investigation, feasibility
study, and record of decision. This data was also used to
determine the contaminants of concern, determine the cleanup
levels for the contaminants, and to select the cleanup remedy.

Region 10

Region 10 did not monitor compliance with the laboratory audit
requirement specified in the QAPP for Fort Wainwright Operable
Unit (OU) 2. The QAPP required laboratories to be validated by
the Army Corps prior to their use and every 18 months thereafter.
This validation process included laboratory audits.  We found that
the Army had not complied with this requirement.

   "Based on the PE sample results and the information gathered
   during the on-site inspection, National Environmental Testing -
   Santa Rosa Division is not considered to be qualified to perform
   chemical analyses for the U.S. Army Corps of Engineers at this
   time."
                          -Army Corps of Engineers' April 1996
                            Audit Report of NET - Santa Rosa

The Army was almost a year late performing an audit of one of the
laboratories for OU 2.  The Army should have audited the laboratory
in May 1995, when the laboratory's validation from the Army expired.
However, the Army extended its validation to May 1996 without
conducting an audit.  When the Army conducted the audit during March
1996, it found significant performance
                                            17

-------
deficiencies and concluded that the laboratory was not qualified to perform
                           analyses for the Army. Region 10 was unaware of the untimely
                           audit because copies of the relevant audit reports had not been
                           obtained and reviewed.

                           Unfortunately, the laboratory was used to analyze samples
                           collected in October 1995 for Fort Wainwright's postwide risk
                           assessment; this assessment was used to more completely define
                           contamination in the Chena River at Fort Wainwright.
                         Chena River
                    Fort Wainwright, Alaska

We believe the laboratory's analysis of these samples was
questionable because of the audit's conclusion about the
laboratory.

Cause

EPA oversight was insufficient because regional remedial project
managers were generally relying on the Federal facilities to ensure
that QAPP requirements were met.  To ensure that data of
appropriate quality is obtained, regional remedial project managers
must monitor compliance with QAPP requirements.  In addition,
EPA quality assurance staff should assist the project managers with
this oversight in order to make sure that significant data quality
issues are identified and addressed.
                                             18

-------
FEDERAL
QUALITY
SYSTEMS NOT
EVALUATED
EPA had not fully assessed DOD's or DOE's environmental data
quality systems on a department-wide basis. We believe the extent
of EPA's oversight should be based on the adequacy of DOD's and
DOE's data quality systems. We found weaknesses in both DOD's
and DOE's data quality systems that substantiate the need for
increased EPA oversight.
              [Chart: Federal Facilities on or Pending the NPL, showing
               DOD 81 percent, DOE 12 percent, and Other 8 percent]
DOD and DOE are
responsible for most
of the Federal sites on
the Superfund
National Priorities
List. Under the
National Contingency
Plan, DOD and DOE
have unique
investigative and
cleanup responsibilities for NPL cleanups.  As lead agencies, they
are responsible for ensuring data quality.

EPA is responsible for reviewing DOD and DOE remedial
investigations and feasibility studies and must agree with their
cleanup remedies. In order to make an informed judgement of
remedy, EPA must ensure the data supporting environmental
decisions is of known quality. We believe the degree of EPA
oversight should also depend on the effectiveness of a Federal
department's data quality system.
Except for some efforts made by Region 9, EPA had not evaluated
DOD's or DOE's data quality systems.  Our review indicated such
an evaluation would identify significant deficiencies in their data
quality systems. These deficiencies have allowed poor-performing
laboratories to analyze samples at Federal facility Superfund sites,
as discussed in the following paragraphs.

DOD Not Tracking Laboratory Performance

DOD did not have a system for tracking laboratory performance.
Although it had established a Tri-Service Chemical Quality
Assurance Work Group to enhance communication among the
military services, DOD had not established a system to share
laboratory audit results. Consequently, laboratory audits that
found serious problems were not always shared with other military
services or Federal agencies. For example, an Air Force-contracted
evaluation of Eureka Laboratories found serious deficiencies with
                                         19

-------
                           laboratory performance and procedures.  However, the evaluation
                           was not shared with the other services or Federal agencies.

                           Moreover, the DOD Inspector General found that DOD facilities
                           were not using effective quality assurance activities for their
                           laboratory support services.

DOE Has Problems with Quality Assurance

                       Deficiencies in DOE's Commercial
                              Laboratory Quality

                    >   Some laboratories failed to qualify or were
                        suspended from work for one site but
                        continued to test samples at other sites.

                    >   Some laboratories were not evaluated to
                        determine their ability to provide analytical
                        services.

                    >   Methods used to perform evaluations and
                        report results varied among contractors.

                        -Audit of DOE's Commercial Laboratory Quality
                        Assurance Evaluation Program, DOE Inspector
                                              General, June 1995

The DOE Office of Inspector General found problems with DOE's
commercial laboratory quality assurance evaluation program.  In its
June 1995 audit report, the OIG found that:

  "Both
  Department and
  contractor
  officials stated
   that some laboratories failed to qualify or were suspended from
  work for one site but continued to test samples for other sites.
  These officials told us that even when they learned of these
 failures or suspensions, they did not notify other known
  laboratory customers."

Because of problems with Federal quality assurance systems, it is
our opinion that laboratory analyses conducted to date at DOD and
DOE Superfund sites cannot be presumed to be of appropriate
quality for cleanup decision making. This should be a national
concern since DOD and DOE have over 90 percent of the Federal
facility Superfund sites on or pending inclusion on the National
Priorities List.

LABORATORY EVALUATIONS NOT SHARED

One reason that the extent of data quality problems was not
identified was that neither EPA nor any other component of the
Federal government had an effective forum or system for sharing
laboratory evaluations.  Laboratory evaluations, such as audits, are
one of the most useful tools for judging the technical capabilities of
a laboratory. On-site audits typically evaluate a laboratory's
                                          20

-------
technical expertise, standard operating procedures, and facility and
equipment sufficiency.

No Federal System

There was no system within the Federal government to share
laboratory evaluations. Such a system could avoid the use of
incompetent laboratories and could also help cut costs by
preventing duplicate audits.

For example, if the Air Force had shared audit results with the
Army, it is likely that $3.8 million in rejected data and associated
costs could have been avoided.  The Army, Navy, and Air Force
paid for five audits of Eureka Laboratories between January 1991
and October 1992.  The first Air Force audit, done in January 1991,
found major problems with Eureka Laboratories, which was used
to analyze samples at 28 DOD installations.

   "If I had known about the AFCEE [Air Force Center for
   Environmental Excellence] audit, I would not have used Eureka
   Laboratories."
                                       -Army Representative
                                    Rocky Mountain Arsenal
                                             February 1996

An Army manager at Rocky Mountain Arsenal told us that Eureka
Laboratories would not have been used if he had been aware of the
Air Force audit.  Rocky Mountain Arsenal ultimately rejected samples
analyzed by Eureka Laboratories, at a cost of about $3.8 million.
This rejection also set back the Arsenal's water monitoring program
about 1 year.

We were told EPA and Federal departments have shared laboratory
information in the past to identify poor quality analyses. For
example, EPA told us it provided information to DOE which led to
allegations of laboratory fraud at Hanford.

Need to Track Performance Data

EPA and other Federal agencies have recognized the need to track
laboratory performance data.  For example, in 1986 OSWER
Directive 9240.0-2 established a system for tracking all Superfund
analytical services, including Federal facility laboratories.
OSWER later rescinded this requirement for Federal facilities,
although these facilities accounted for nearly 60 percent of the
Superfund priority sites undergoing investigation or study in 1995.

Some of the items the system might include are laboratory audits,
accreditation status, and performance evaluation samples. In our
                                          21

-------
 opinion, EPA is the logical proponent for such a system because of
 its experience in tracking laboratory performance and its oversight
  role at Superfund cleanups.  We recognize there could be legal
 limitations on the type of information that could be shared and who
 it can be shared with.

EPA Notification
Procedures
Lacking

There were no procedures for exchanging laboratory performance
information between EPA's contract laboratory program and other
EPA laboratory programs, such as those for Federal facilities.
EPA also lacked procedures for ensuring that fraudulent or poor
quality data was not used at Federal facility Superfund sites.  For
example, after Eureka Laboratories pleaded guilty to fraud in May
1995, EPA did not request its regions to evaluate the impacts of
this laboratory's work at Federal facilities, although the laboratory
was used at 28 DOD installations.
EPA OVERSIGHT
ROLE NOT
DEFINED

We believe one of the primary reasons for weaknesses in data
quality was that EPA's oversight role at Federal facility Superfund
sites was not well defined, especially at sites where DOD and DOE
were involved.  This condition was due to ambiguous legal
authorities under CERCLA section 120 and Executive Order
12580.

Under CERCLA section 120, enacted in 1986, EPA was to review
remedial investigations and feasibility studies (RI/FS) prepared by
other Federal agencies.  The extent of this review was not defined
by CERCLA.

Executive Order 12580, issued in 1987, gave DOD and DOE
cleanup responsibilities at their National Priorities List sites.  The
National Contingency Plan (40 CFR 300) further defines these
responsibilities. However, it did not describe EPA's oversight
responsibilities for these
cleanups.
In 1991, the Office of Solid Waste and Emergency Response
(OSWER) issued Directive 9835.1(c), Guidance on Oversight of
Potentially Responsible Party Remedial Investigations and
Feasibility Studies.  According to this directive,

                                          22
-------
EPA should oversee Federal facility cleanups to the same degree as
private industry cleanups.  However, this directive did not address
how EPA's oversight responsibilities are impacted by DOD's and
DOE's authorities under Executive Order 12580, nor how site-
specific data quality activities should be coordinated.

EPA Oversight
During the RI/FS

Because of ambiguous legal authorities, EPA's authority to oversee
data quality during the RI/FS process may be questioned in the
absence of a Federal facility agreement. In fact, on a national
basis, there were 31 Federal facility Superfund sites without a
Federal facility agreement. For example, EPA did not have an
agreement with the Concord Naval Weapons Station. As a result,
EPA's comments on the work plan for the remedial investigation
had to be provided to the State of California for inclusion in the
state's comments.
We believe it is imperative that EPA become involved in overseeing
data quality during the RI/FS process because:

•  Federal agencies must be viewed as having an inherent conflict
   of interest between their desire to have sites removed from the
   NPL and their desire to do so at the lowest cost.

•  Most of the environmental data used in determining the
   remedial action is collected during the RI/FS.  EPA must agree
   with the remedial action.

•  EPA cannot determine if a remedial action is appropriate
   without evaluating the quality of the underlying data.

•  Joint partnerships between EPA and the Federal agency during
   the RI/FS process allow all parties to focus on key issues that
   are critical.  Such partnerships support the targeting of
   oversight activities to the priority sites, and provide a means to
   resolve substantive issues prior to action.

                            Reasons for EPA Oversight
                                of Data During RI/FS

                           •  Federal agencies are not
                              independent; they are responsible for
                              pollution and must pay for cleanup.

                           •  Most environmental data is collected
                              during the RI/FS.

                           •  EPA must rely on data to make an
                              informed judgement of the remedy.

                           •  EPA and Federal agency
                              partnerships during the RI/FS allow
                              parties to focus on critical issues.
                                          23

-------
To clarify EPA's oversight responsibilities at Federal facility
Superfund cleanups, it is our opinion that Executive Order 12580
requires modification.  The modification should identify EPA's
oversight responsibilities for RI/FS activities, including
environmental data quality.  Further, EPA's RI/FS guidance, such
as OSWER Directive 9835.1(c), needs to be revised to define
EPA's specific oversight responsibilities at Federal facilities and
how site-specific data quality activities should be coordinated.

ENVIRONMENTAL
DATA QUALITY A
MATERIAL
WEAKNESS
 EPA had not fully documented its quality systems.  In our opinion,
 documenting these systems will help ensure sufficient quality
 assurance systems are in place, including those for data quality at
 Federal facilities.

 Since 1992, environmental data quality has been a material
 weakness in EPA's management control system.  The weakness
 was reported because many EPA activities had not documented
 their quality systems in acceptable quality management plans.

 A quality management plan describes the entire organization-wide
 quality management system. Quality management plans are
 required by American National Standard ANSI/ASQC E4-1994
 and EPA Order 5360.1.

 A quality management plan had not been developed by EPA's
 Federal Facilities Restoration and Reuse Office.  This office has
 responsibility for guidance and policy for Superfund cleanups at
 Federal sites. In addition,  as of September 30, 1996, EPA's Office
 of Research and Development had not approved quality
 management plans for three EPA regions, although most of these
 regions had plans approved in the past.  However, as of February
 21, 1997, 9 of the 10 regions had approved plans.  The other
 regional plan was under development.
PERFORMANCE
MEASURES NOT
ESTABLISHED

EPA had not established performance measures for environmental
data quality at Superfund sites.  Because environmental data
quality is at the heart of EPA decision-making, we believe
performance measures should be developed that measure how well
the data quality process was planned and carried out.

    "...the active involvement of agencies' top officials in
    setting goals, measuring performance, and using
    performance information is critical..."

                              -General Accounting Office
                                   testimony, June 1995
                                         24

-------
                            "Developing effective performance
                            indicators is the heart of the
                            process."
                                             -KPAfG Peat Marwick
                                               Government Services
The Government Perfor-
mance and Results Act,
enacted in 1993, requires
Federal agencies to mea-
sure performance. The
Act seeks to fundamen-
tally change the focus of
Federal management and accountability from a preoccupation with
inputs to a greater focus on the outcomes that are being achieved.
Under this Act, agencies are to set strategic goals by 1997, and
measure performance and report on the degree to which goals are
met by 2000.  The Office of Management and Budget has recog-
nized the  need to reach consensus on outcome-oriented goals and
has been strongly encouraging agencies to begin implementing the
Act's requirements well before 1997.
EPA should develop performance measures for the quality of
environmental data.  Environmental data form the basis for most
policy, technical, and regulatory actions at EPA.  Thus, it is critical
that the collected data are of the type, quantity, and quality needed
to make decisions with the desired degree of confidence.
Management should be confident that data do not lead to an
incorrect decision and that the data can withstand scientific and
litigative scrutiny.
[Figure: "Steps to Measure Performance" -- a GAO diagram showing three
steps and their associated practices: define mission and desired outcomes
(involve stakeholders; assess the environment; align activities, core
processes, and resources), measure performance (produce measures at each
organizational level that demonstrate results, are limited to the vital few,
respond to multiple priorities, and link to responsible programs; collect
data), and use performance information (identify performance gaps; report
information; use information).  Source: GAO report.]
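To make the idea of a data quality performance measure concrete, the sketch
below computes two simple indicators -- the share of reported results that
were validated and the share rejected -- from hypothetical project totals.
The indicator names, formulas, and example figures are illustrative only and
are not an EPA measurement standard.

    def data_quality_measures(results_reported: int,
                              results_validated: int,
                              results_rejected: int) -> dict:
        """Compute two illustrative data quality performance indicators,
        each expressed as a percentage of the results reported."""
        if results_reported == 0:
            raise ValueError("no results reported")
        return {
            "percent_validated": 100.0 * results_validated / results_reported,
            "percent_rejected": 100.0 * results_rejected / results_reported,
        }

    # Hypothetical example: a project reports 10,000 results, validates 2,500
    # of them, and rejects 400 after validation.
    measures = data_quality_measures(10_000, 2_500, 400)
    print(measures)   # {'percent_validated': 25.0, 'percent_rejected': 4.0}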
                 25

-------
RECOMMENDATIONS     We recommend that:
                           The Assistant Administrator for Solid Waste and Emergency
                           Response

                           1.    Work with Regions to ensure that Federal facility
                                 Superfund QAPPs:

                                 a.  Include QAPP requirements that are based on well-
                                    defined data quality objectives.

                                 b.  Are prepared for each data collection activity that is
                                    used for decision making.

                           2.    Ensure that regional quality assurance personnel are
                                 involved in the entire QAPP process, from development of
                                 the QAPP to compliance with the QAPP.

                           3.     Issue guidance that specifies regional oversight
                                 responsibilities for Federal facility Superfund cleanups.
                                 Ensure this guidance addresses the oversight of laboratory
                                 data quality and includes a requirement for site-specific
                                 plans that discuss the nature, frequency, and responsibility
                                 for data quality oversight activities.

                           4.     Assess the adequacy of DOD's and DOE's environmental
                                 data management systems.

                           5.     Establish procedures for ensuring fraudulent or poor quality
                                 data is not used at Federal facility cleanups.

                           6.     Fully identify the impacts of the Eureka Laboratories
                                 fraudulent and poor laboratory practices on Federal facility
                                cleanups.

                           7.    Develop a national  quality management plan.

                           8.    Develop performance measures for the environmental data
                                quality system, and compare actual performance of the
                                system to these measures.
                                         26

-------
9.   Issue program-specific QAPP guidance to ensure that the
     following quality assurance measures are included when
     high-quality data is required by data quality objectives:

     a.  The use of EPA national functional guidelines or their
         equivalent for data validation. The data validation
         should represent all matrices, analysis types, and
         laboratory decision points, and be based on the data
         quality objectives.

     b.  Data validation performed by a party independent of
         both the laboratory and its parent company.

     c.  On-site laboratory audits before work is started and
         periodically throughout the project. Also,  the
         guidance should specify that the audits will be
         conducted by an activity independent of the laboratory
         and will include both announced and unannounced
         audits.

     d.  Magnetic  data maintained and made available to
         regions. In addition, magnetic tape audits should be
         required if major deficiencies are found by other
         quality assurance methods, such as data validation or
         performance evaluation  samples.

10.  Continue the development of electronic data validation,
     expand its capabilities, and encourage its use.

11.  Create a forum for sharing environmental laboratory
     evaluations, such as laboratory audits, among Federal
     agencies.

12.  Publicize best practices used in Federal facility agreements
     QAPPs, and laboratory contracts to make EPA regions and
     other Federal facilities aware of them.

The Assistant Administrator for Research  and Development

13.  Refine the data quality objectives process by:

     a.  Ensuring the early involvement of key decision makers.

     b.  Using checklists to identify all necessary activities.
               27

-------
                                c.  Identifying specific documentation requirements.

                                d.  Using the model developed at the Hanford site as a
                                    guide.

                           14.  Work with the Federal Facilities Restoration and Reuse
                                Office and regions to develop acceptable quality
                                management plans.

                           The Assistant Administrator for Enforcement and
                           Compliance Assurance

                           15.  Request that Executive Order 12580 be modified to
                                expressly identify EPA's oversight role for environmental
                                data quality.
AGENCY
COMMENTS

The Offices of Solid Waste and Emergency Response, Research
and Development, and Enforcement and Compliance Assurance
generally agreed with the findings and recommendations. Their
complete responses are at Appendix A.

OSWER Response

OSWER agreed that improvements needed to be made in the current
quality assurance oversight process. However, it cautioned that
EPA must not undermine recent initiatives to streamline the
Superfund process.  It also stated "... We believe that having sound
information to base cleanup decisions is critical, but we also must
recognize the responsibilities delegated under Executive Order
12580 to other Federal agencies.  While we must improve our
efforts, so too must other Federal agencies improve their
accountability..."

OSWER agreed to coordinate with EPA regions and the
Departments of Defense and Energy to assess the adequacy of
DOD's and DOE's environmental data quality systems by
November 30, 1997. OSWER, EPA regions, DOD, and DOE will
also develop a framework for the minimum quality assurance
program that the Federal facilities should have in place. This
framework will incorporate currently available quality assurance
guidance and information.  EPA will complete the design of the
quality assurance framework and the initial implementation by
May 31, 1998.

Each EPA region will be responsible for verifying that its
Superfund Federal facilities have established and are maintaining

               28

-------
the quality assurance program. OSWER also agreed to develop a
quality management plan, encourage the use of electronic data
validation, develop a mechanism to share laboratory audit
information, and encourage the dissemination of quality assurance
best practices.

ORD Response

The Office of Research and Development agreed to amend its
DQO guidance and work with OSWER's Federal Facility
Restoration and Reuse Office and regions as they develop and
implement quality management plans. ORD will also provide
training in the Agency's quality management system.

ORD pointed out that "...There is no system, no matter how well
conceived and documented, that cannot be circumvented by
unexpected environmental conditions, unintentional mistakes by
staff, or intentional malfeasance."

OECA Response

The Office of Enforcement and Compliance Assurance and
OSWER, working with EPA regions and other Federal
departments, will undertake a program to improve the quality of
RI/FS work the Federal departments conduct.  "Consistent with
our long-held 'enforcement first' principles, we applaud the IG for
supporting the need for strong EPA oversight."

OECA viewed the best approach to improving data quality at
Federal facilities as the cooperative, yet aggressive, approach
detailed in OSWER's comments. OECA said  it remains ready to
pursue amending the Executive Order if EPA fails to secure the
improvements OSWER actions seek.
                                         29

-------
AUDIT SCOPE

 This section describes the audit scope and methodology, including
 our review of the 1995 Integrity Act Report to the President and
 Congress and prior audit coverage.

Scope and
Methodology

 We performed our audit in accordance with Government Auditing
 Standards issued by the Comptroller General. Our field work was
 conducted between December 5, 1995 and July 31, 1996.  The
 audit covered management procedures in effect as of
 September 30, 1995.

 We interviewed officials in EPA's Offices of

 •   Solid Waste and Emergency Response;

 •   Enforcement and Compliance Assurance; and

 •   Research and Development.

 We obtained and reviewed EPA oversight guidance and reports, and
 analyzed the resultant data.  We also contacted the DOD and DOE
 Offices of Inspector General to identify audits of these
 departments' laboratory quality assurance systems.

 We selected Regions 8, 9, and 10 for review because they oversee
 about 40 percent of the Federal facility Superfund sites, including:

 •  DOE's Hanford Nuclear Reservation, one of the two largest
   DOE cleanups; and

 •  Rocky Mountain Arsenal, one of the largest cleanups in DOD.

 Seven of the sites included in the audit were selected because of
 known or possible problems with laboratory data quality.
 A complete list of the entities contacted in the audit is shown in
 Appendix H.

 We interviewed responsible regional and Federal facility officials.
 We also reviewed the internal controls associated with regional
 oversight of laboratory data quality, including Federal facility
agreements, QAPPs, and quality assurance reports. The internal
control weaknesses we found are described in this report, along
with recommendations for corrective actions.
                                        30

-------
Federal Managers'
Financial Integrity
Act
In planning our audit, we reviewed EPA's 1995 Integrity Act
Report to the President and Congress, which reports compliance
with the Federal Managers' Financial Integrity Act. EPA reported
that environmental data quality has been a material weakness since
1992. As detailed in this report, we believe internal controls over
laboratory data quality at Federal facility Superfund sites could be
improved by correcting this weakness.

Prior Audit
Coverage

We issued an audit report titled Environmental Data Quality at
DOD Superfund Sites in Region 9 in September 1995.  The results
of this audit are included in this report. Region 9 had taken actions
to implement the audit report recommendations. In this respect,
Region 9 initiated a memorandum of understanding between its
Federal Facilities Cleanup and Quality Assurance Management
offices to establish responsibilities and time frames for
implementing the report recommendations.

Other departments had also audited laboratory analyses for
environmental data.  For example, the Department of Energy
Office of Inspector General addressed laboratory quality assurance
in two reports: Audit of the Department of Energy's Commercial
Laboratory Quality Assurance Evaluation System, issued in June
1995, and Audit of Testing Laboratory Support to the
Environmental Survey Program, issued in December  1990. Both
of these reports identified weaknesses in the Department's
laboratory quality assurance system.

The DOD Inspector General  issued  a report called Laboratory
Support Services for Environmental Testing on February 21, 1997.
The report identified problems with environmental laboratory
services.

In 1996, the Army Audit Agency reported that the Army paid more
than necessary for its laboratory analyses.  The Air Force Audit Agency
issued two reports on environmental contract quality assurance in
1994 and 1995.  These reports showed that contractor oversight
was inadequate at Air Force cleanups.
                                         31

-------
(This page intentionally left blank.)
                32

-------
                              APPENDIX A

               Program Office Responses to Draft Report
Attached are the responses from the program offices to our draft report.  We issued the draft
report on October 28, 1996.  We received initial responses from the program offices during
December 1996.  After meeting with program officials from OSWER and ORD on January 28
and 29, 1997, we revised our draft report.  The OSWER and ORD responses were subsequently
resubmitted in February 1997, based on our revised draft.

Program Office Responses to Draft Report
   OSWER	 35
   ORD	45
   OECA	49

It should be noted that the program offices' comments relative to specific sections of our report
may no longer be relevant due to changes we made between the draft and final report. Many of
these changes were made as a result of the program offices' comments.
                                     33

-------
(This page intentionally left blank.)
              34

-------
               UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                          WASHINGTON, D.C. 20460
                           FEB 24 1997                            OFFICE OF
                                                SOLID WASTE AND EMERGENCY RESPONSE
MEMORANDUM

SUBJECT:  OSWER Response to Audit Report E1SKB6-09-0041:
          Draft Audit of Laboratory Data Quality Oversight at
          Federal Facility Superfund Sites
FROM:     Timothy Fields, Jr.
          Acting Assistant Administrator

To:       Michael Simmons
          Deputy Assistant Inspector General  for Internal Audit
          Office of Inspector General

     The purpose of this memorandum is to  transmit the Office of
Solid Waste and Emergency Response  (OSWER)  comments on the
findings and recommendations contained in  the revised Office of
the Inspector General  (OIG) Draft Audit Report EISKB6-09-0041
dated January 29, 1997. We provide more specific comments
following our general comments below.  We  appreciate the
opportunity to comment on this revised Draft  Report and we are
looking forward to continuing to work with the OIG on this audit.

GENERAL COMMENTS ON THE DRAFT REPORT

     OSWER generally agrees with the recommendations in the
revised Draft Report.  Many of our comments to the original Draft
Report are no longer appropriate due to the changes in the
revised Draft Report.  We appreciate the opportunity to discuss
our earlier comments with the OIG and feel that the revised
report is a better report than the earlier version.  Many of our
original comments have been omitted, as they  have been addressed,
or changed in this response.

     The Draft Report calls for additional oversight of other
Federal agencies' cleanup programs by the Environmental Protection
Agency.  Although it appears that improvements are needed in
                                 35

-------
 quality assurance/quality control (QA/QC) oversight, we need to
 be careful not to undermine recent initiatives to streamline
 the Superfund process.  We believe that having sound information
 to  base cleanup decisions  is  critical,  but we  also must  recognize
 the responsibilities  delegated  under Executive Order 12580  to
 other  Federal  agencies.  While  we  must improve our efforts,  so
 too must other Federal agencies improve their  accountability.
 The very nature of  the relationship that EPA has  with other
 Federal agencies, such as  Department of Defense (DOD)  and
 Department  of  Energy  (DOE), as  defined by  law,  and Executive
 Order,  limits  the scope and authority  that EPA has to dictate
 QA/QC  procedures to other  Federal  agencies.

   We agree with the Office of Enforcement and Compliance
 Assurance's response  that  we  should continue to explore  options
 for addressing the  recommendation, including possible future
 amendments  to  Executive Order 12580.   Prior to taking such  an
 approach, however,  we recommend that EPA first try to correct the
 problems through our  own efforts,  in conjunction  with other
 Federal  agencies. If  these prove ineffectual,  then we should
 consider amending the Executive Order.

 COMMENTS ON THE DRAFT REPORT

     1) Page 9:  "In this regard, we found that the initial
 sampling costs were much higher than the resampling  costs,
 possibly indicating that DQOs may be inadequate."

     Comment:  It would be helpful if more information were
 provided with  this example to show what  the connection is between
 the sampling costs and the DQOs since there are many  reasons why
 the resampling costs would be lower the  second  time around.

     2) Page 10:  "Our review found that the following four
 (three)  data quality activities were particularly effective  in
 detecting inappropriate quality data:"

     Comment:   The following are general comments concerning the
 three data quality activities that are recommended.

     a)   Independent data validations,  in accordance with EPA
 functional guidelines or their equivalent.   We assume that
 "independent"  means that the validators are working independently
 from the data generators.   Adequate quality assurance programs
normally require independent data validation.   We agree that


                                36

-------
there should be documented appropriate guidelines for validating
data.

     b)  Independent laboratory audits before the work starts and
periodically throughout the project.  Usually, on-site audits are
viewed as a unique opportunity to evaluate the laboratory in-
person based on their personnel, procedures, documentation, and
facility.  They are useful in providing an insight into the
laboratory's capabilities to perform specific analyses.

     It is often beneficial to perform an on-site audit when the
laboratory is being considered for work (or has been awarded the
work) for which it does not have a performance history with the
Federal facility and/or Department.  The Contract Laboratory
Program (CLP) normally performs a laboratory audit before
contract award and once a year during the contract (CLP contracts
usually last two to three years).   When evaluating a lab for a
new contract, if the laboratory has a good performance history
under a similar current contract the EPA program office uses its
discretion on whether to perform a pre-award on-site audit.

     c)  Provide magnetic media of raw data, when needed.  Audits
of magnetic media (commonly referred to as tape audits) are used
to detect manual changes in the electronic copy of the raw data
and inconsistencies between the electronic copy and hard copy.
The inconsistencies indicate poor laboratory practices or
possibly laboratory fraud.  Tape audits are usually limited to
GC/MS data (i.e., volatile and semi-volatile organics) that are
generated by systems that are capable of taping.  Tape audits are
not currently available for inorganic data  (e.g., metals,
anions), or radionuclides.  They are not generally performed for
GC data (e.g.,  pesticides/PCBs). The auditor is also required to
have access to systems capable of reprocessing the tape.
Requiring magnetic records of the raw data is a potentially
costly requirement for all program participants.  The CLP
routinely audits two tapes annually from each laboratory with a
current contract for organics analysis.  These audits cost
approximately $5000 each.  This cost does not include laboratory
resources to generate and ship the tape or EPA's resources to
manage the contract and review audit results.
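To picture the comparison a tape audit performs -- matching values recovered
from the magnetic media against the values reported in the hard-copy data
package -- the sketch below gives a deliberately simplified illustration.
The record layout, field names, and tolerance are hypothetical; an actual
tape audit reprocesses the instrument data and is far more involved.

    # Hedged sketch: flag discrepancies between raw values recovered from
    # magnetic media and the values reported in the data package, which could
    # indicate manual edits.  The 1 percent tolerance is illustrative only.
    TOLERANCE = 0.01   # relative difference treated as a match

    def tape_audit(raw_results, reported_results):
        """Yield (sample_id, analyte, raw_value, reported_value) for mismatches."""
        reported = {(r["sample_id"], r["analyte"]): r["value"] for r in reported_results}
        for r in raw_results:
            key = (r["sample_id"], r["analyte"])
            if key not in reported:
                yield (*key, r["value"], None)          # value missing from the report
                continue
            raw_val, rep_val = r["value"], reported[key]
            if abs(raw_val - rep_val) > TOLERANCE * max(abs(raw_val), 1e-9):
                yield (*key, raw_val, rep_val)          # values disagree

    raw = [{"sample_id": "SB-07", "analyte": "PCE", "value": 120.0}]
    hard_copy = [{"sample_id": "SB-07", "analyte": "PCE", "value": 12.0}]
    for mismatch in tape_audit(raw, hard_copy):
        print("Discrepancy:", mismatch)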

     3) Page 15:  Magnetic tape audits were effective in
detecting major data quality problems at March AFB.
                                37

-------
      Comment:   It  is  not  clear from the  OIG audit  text  that  tape
 audits are performed  in conjunction with data  audits  to
 reconstruct an  analytical run.   It  is  only after an analytical
 run is compared with  the  data  package  that a true  evaluation of
 the data  deliverable  can  be  made.

     4) Page 16:  The chart titled "QA Requirements in QAPPs".

     Comment:  The first column is labeled "No QA Requirements".
 This should be  changed  to indicate  that  none of the three  QA
 activities specified  were included,  not  that the quality
 assurance project plan  (QAPP)  contained  no QA  requirements.
 (This  has been  agreed to  but was not changed due to time
 constraints.)

     5) Page 25:  The OIG notes that there are no procedures for
 exchanging laboratory performance information.

      Comment:  While  that  is true,  it  should be noted that the
 DOE  investigation which led  to  allegations of  laboratory fraud  at
 Hanford was started because  EPA informed DOE that  EPA was
 investigating the lab for  fraud.

 RESPONSE  TO RECOMMENDATIONS

     OIG  Recommendations  (page  26) :  We  recommend  that  the
 Assistant Administrator for  the  Office of  Solid Waste and
 Emergency Response:

     OIG Recommendation #5:  Assess  the  adequacy of DOD's  and
DOE's environmental data management systems.  (Recommendation
 previously  omitted.)

     OSWER Response:  The results of the assessment of  DOD's and
 DOE's environmental data management  systems will assist EPA  in
 responding to the other recommendations contained  in  this  audit.
 It is important to determine the adequacy of DOD's and  DOE's
 current environmental  data management systems and  to  use this
 information to establish the baseline for the environmental data
management systems.   From this assessment, EPA  should be able to
determine what in the  systems should be changed, the  level of
effort from EPA, DOD,  and DOE to effect the change, and a
reasonable time frame  in which to design and implement  the
changes.  The  baseline assessment can then be used to measure
 improvement in the systems.


                                38

-------
     EPA HQ in coordination with the EPA Regions and DOD and DOE
will assess the adequacy of DOD's and DOE's environmental data
management systems.  The assessment will be completed by November
30, 1997.

     OIG Recommendation #1:  Make sure that the Regions have QAPP
requirements that are based on well-defined data quality
objectives.

     PIG Recommendation &2 :  Ensure that QAPPs are prepared for
each data collection activity that is used for decision-making.

     OIG Recommendation #3:  Make sure that the quality assurance
personnel are involved in the entire QAPP process, from
development of the QAPP to compliance with the QAPP.

     OIG Recommendation #4:  Issue guidance that specifies
regional oversight responsibilities for Federal facility
Superfund cleanups.  Ensure this guidance addresses the oversight
of laboratory data quality and includes a requirement for site-
specific quality management plans that discuss the nature,
frequency, and responsibility for data quality oversight
activities.

     OIG Recommendation #6:  Establish procedures for ensuring
fraudulent or poor quality data is not used at Federal facility
cleanups.  In this respect, the impacts of the Eureka
Laboratories fraudulent and poor laboratory practices on Federal
facility cleanups should be fully identified.

     QIG Recommendation &7 ;  Develop_ a national quality
management plan.

     OIG Recommendation #8:  Develop performance measures for the
environmental data quality system and compare actual performance
of the system to these measures.

     OIG Recommendation #9:  Issue program-specific QAPP guidance
to ensure that the following quality assurance measures are
included when high-quality data is required by data quality
objectives:

     9a:  The use of EPA national functional guidelines or their
equivalent for data validation.  The data validation should
                                39

-------
 represent all matrices,  analysis types,  and laboratory decision
 points,  and be based on  the data quality objectives.

     9b:  The data validation performed by a party independent
 of both  the laboratory and its parent  company.

      9c:   On-site  laboratory audits  before  work is  started and
 periodically throughout  the project.   Also,  the guidance should
 specify  that the audits  will be conducted by an activity
 independent of the laboratory and will include both announced and
 unannounced audits.

      9d:   Magnetic data  maintained and made available to regions.
 In addition,  magnetic  tape audits should be required if major
 deficiencies are found by  other quality assurance methods,  such
 as data validation or  performance evaluation samples.

      OSWER Response:   We agree that  these recommendations are
 generally appropriate.   OSWER's response to Recommendation #4
 provides  the core  to the response to recommendations  #1,  2,  3,  6,
 8,  and 9.   In coordination with the EPA Regions and DOE and DOD,
 OSWER will  develop a framework for the minimum  quality assurance
 program that  the Federal facilities should  have in  place.   This
 framework will  incorporate currently available  QA guidances/
 information.   This will  avoid  the generation of new QA guidances/
 information when the existing  documents  are  sufficient and in
 many  cases  are  already being used.

     Each EPA Region will be responsible for verifying that its
NPL Federal facilities have established and are maintaining the
Federal facilities' QA program.  The QA program will specify
that: 1) QAPP requirements are based on well-defined data quality
objectives; 2) QAPPs are prepared for each data collection
activity used for decision-making; 3) the quality assurance
personnel are involved in the entire QAPP process; 4) procedures
are used to minimize the production and use of fraudulent data;
5) performance measures are used to evaluate the environmental
data quality system; 6) data validation is performed where
appropriate and is based on the DQOs; 7) the data validation is
performed by a party independent of both the laboratory and its
parent company; 8) on-site audits are performed; 9) magnetic data
is maintained and available to the Federal facility and to the
Regions; 10) magnetic tape audits are required where appropriate.
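For illustration only, the sketch below shows one way a region might record
whether a facility's QA program documents each of the elements listed above.
The element names paraphrase this response; the data format and code are
hypothetical and do not represent an EPA procedure.

    # Hedged sketch: check which required QA program elements a facility's
    # documentation addresses.  Element keys and input format are illustrative.
    REQUIRED_ELEMENTS = [
        "dqo_based_qapp_requirements",
        "qapp_for_each_data_collection_activity",
        "qa_personnel_involved_throughout_qapp_process",
        "fraud_minimization_procedures",
        "performance_measures",
        "data_validation_based_on_dqos",
        "independent_data_validation",
        "onsite_laboratory_audits",
        "magnetic_data_maintained_and_available",
        "tape_audits_where_appropriate",
    ]

    def missing_elements(facility_program: dict) -> list:
        """Return the required elements the facility's QA program does not document."""
        return [e for e in REQUIRED_ELEMENTS if not facility_program.get(e, False)]

    example_program = {e: True for e in REQUIRED_ELEMENTS}
    example_program["tape_audits_where_appropriate"] = False
    print(missing_elements(example_program))  # ['tape_audits_where_appropriate']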
                                40

-------
     Although not suitable for all types of analytical data,  we
believe magnetic tape audits should be performed if major
deficiencies are found by other methods, such as data validation
or performance evaluation samples.  However,  in order to do so,
Federal agencies must be able to obtain the magnetic data.  This
means including the requirement in the QAPP and the laboratory
contract.  To this end, EPA will work with DOD and DOE to include
this requirement in their laboratory contracts.

     It is apparent that the most effective way to obtain
performance of the delineated QA tools is for the Quality
Assurance Programs of the DOD and DOE to adopt and require their
implementation.  Therefore, a revision of their QA programs is
necessary and will provide a firm commitment by these Federal
agencies to the specific QA requirements stated.  In addition,
the Regions can use the Federal Facility Agreements as an
opportunity to include the QA/QC requirements in the Federal
facility programs.  EPA will complete the design of the QA
framework and the initial implementation by May 31, 1998.

     Recommendation #6 recommends the establishment of procedures
for ensuring fraudulent or poor quality data is not used at
Federal facility cleanups. A quality assurance program should
ensure that the data is the quality that is needed for decision-
making. One goal of a QA program is to minimize the occurrence
and use of fraudulent or otherwise inappropriate data.  All
incidences of fraud cannot be detected even by the most effective
quality assurance program. There are quality assurance tools
discussed in this report that can be used to minimize the
production and use of fraudulent and poor quality data.  OSWER
intends to pursue the implementation of these tools when they are
useful, cost-effective, and available.

     The second part of Recommendation #6 states that the
impacts of the Eureka Laboratories fraudulent and poor laboratory
practices on Federal facility cleanups should be fully
identified.  After Eureka Laboratories was suspended for the
fraudulent and poor laboratory practices, the director of Office
of Emergency and Remedial Response (OERR) sent a letter, dated
July 1995, to all the Regions.  This letter notified the Regions
of Eureka Laboratories' fraudulent activities and asked the
Regions to identify the impacts on their data.  The letter did
not specifically mention that the impact on Federal facility data
should be identified.  OSWER will send a new letter to the
Regions asking them to identify the impact of Eureka Laboratories'


                                41

-------
 fraudulent activities on Federal facility data.   The responses
 from the Regions will be due by October 31,  1998.

     OIG Recommendation #7:  Develop a national quality
 management plan.

     OSWER Response:  OSWER's Federal Facilities Restoration and
Reuse Office (FFRRO) will develop a Quality Management Plan (QMP)
 that will ensure data of appropriate quality is  generated for
 FFRRO.   FFRRO,  working with  OERR and its Regions,  plans to
complete the QMP by April 1997, then submit it to the Office of
 Research and Development (ORD)  for approval.

     OIG Recommendation #10:  Continue electronic data validation
 developments, expand  its capabilities,  and encourage its use.

      OSWER Response:   Electronic  data validation has a lot of
 potential  for saving  resources, making the validation faster,
 cheaper  and more consistent.  Unfortunately,  the  development and
 implementation  of electronic  data validation  is  very resource
 intensive.  Electronic  data validation continues  to be an
 important  area  of focus  in Superfund.  The  Contract Laboratory
 Program  continues to  support  this project  and funding of the
 project  continues,  albeit at  a reduced rate  due  to budget
 cutbacks.   EPA  will continue  to fund development of electronic
 data validation.  In  addition,  we will encourage its use through
 technical  support and  guidance. This is a  continuous effort that
 has  no completion date.
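As a purely illustrative sketch of the kind of rule electronic data
validation software automates, the example below applies a single check --
qualifying results whose surrogate recovery falls outside an acceptance
window.  The window, field names, and qualifier are hypothetical and are not
drawn from the national functional guidelines or from any EPA program.

    # Minimal sketch of one automated validation rule: flag sample results
    # whose surrogate recovery falls outside the acceptance window assumed in
    # the QAPP.  The 70-130 percent window and record layout are hypothetical.
    SURROGATE_LIMITS = (70.0, 130.0)   # percent recovery, lower and upper bounds

    def validate_surrogate_recoveries(results):
        """Return each result with a qualifier based on its surrogate recovery."""
        low, high = SURROGATE_LIMITS
        qualified = []
        for r in results:
            recovery = r["surrogate_recovery_pct"]
            flag = "OK" if low <= recovery <= high else "ESTIMATED (J)"
            qualified.append({**r, "qualifier": flag})
        return qualified

    samples = [
        {"sample_id": "MW-01", "analyte": "tetrachloroethene", "surrogate_recovery_pct": 95.0},
        {"sample_id": "MW-02", "analyte": "tetrachloroethene", "surrogate_recovery_pct": 41.0},
    ]
    for row in validate_surrogate_recoveries(samples):
        print(row["sample_id"], row["qualifier"])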

     OIG Recommendation #11:  Sponsor a forum for sharing
environmental laboratory evaluations among Federal agencies, such
as laboratory audits.

     OSWER Response:  Any exchange of laboratory performance
 information has  to be considered  in  the context  of what
 information EPA may legally release.  Recognizing  the  limitations
 on laboratory information that can be exchanged  between  Federal
 agencies, OSWER will evaluate the advantages, disadvantages, and
 the logistics of exchanging laboratory audits between  the  Federal
 agencies.  OSWER will evaluate our options for developing  a
 standard audit form that can be used by EPA,  DOD,  and DOE  to
 facilitate the sharing of audits.  One option is  to adopt  the
draft standardized audit that is being developed  for the National
Environmental Laboratory Accreditation Conference  (NELAC). OSWER
                                42

-------
will develop a mechanism to share some laboratory audit
information by April 30, 1998.

     OSWER Directive #9240.0-29 dated November 8, 1995
specifically addresses for the CLP what information may be
released concerning laboratories under investigation for alleged
fraud.  For exchanging information among the Federal agencies
when laboratories are under investigation, OSWER plans on
adopting the same basic approach as the CLP.  OSWER will adopt
this approach by May 31, 1997.

     OIG Recommendation #12:  Publicize best practices used in
Federal Facility Agreements, QAPPs, and laboratory contracts to
make EPA regions and other Federal facilities aware of them.

     OSWER Response:  OSWER will encourage the dissemination of
information concerning best practices used in Federal Facility
Agreements, QAPPs, and laboratory contracts.  OSWER will send out
a request semiannually to each EPA Region asking for information
on the best practices used in Federal Facility Agreements, QAPPs,
and laboratory contracts by the Region and/or the Federal
Facilities in their Region.   The information will then be
disseminated through various forums including the Federal
Facility Leadership Council, Federal Facility Forum, and quality
assurance conferences.  OSWER will begin this process by April
30, 1997.

CONCLUSION

     The revised Draft Report makes valid points as to the
potential improvements to the current QA/QC oversight process,
including defining EPA's and the Federal facilities' role in
ensuring appropriate data are generated and used for making
decisions affecting cleanup. OSWER agrees to initiate some
changes and to address the problems identified and will do so in
a timely manner.   OSWER will actively try to enlist DOD and DOE
in this effort to improve the data quality systems at the Federal
facilities.  We look forward to continuing to work with the OIG
and other EPA Offices as we  move to respond to the
recommendations.
                                43

-------
     If you have any questions about our response, please contact
Marianne Lynch at (202)  260-5686.

cc:  Steven Herman,  OECA
     Robert Huggett,  ORD
     Stephen Luftig,  OSWER/OERR
     Jim Woolford,  OSWER/FFRRO
     Craig Hooks,  OECA/FFEO
     Hans Crump-Wiesner,  OSWER/OERR
     Nancy Wentworth,  ORD
     Johnsie Webster,  OSWER/OPM
     Federal Facilities  Leadership Council
                               44

-------
                   UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                                  WASHINGTON, D.C.  20460
                                          27 1997
                                                                           OFFICE OF
                                                                    RESEARCH AND DEVELOPMENT
MEMORANDUM

SUBJECT:     Response to Draft OIG Report - Laboratory Data Quality at Federal Facility
             Superfund Sites (E1SKB6-09-0041)

FROM:        Robert J. Huggett, Ph.D.
             Assistant Administrator
               for Research and Development (8101)

TO:          Michael D. Simmons
             Deputy Assistant Inspector General
               for Internal Audit (2421)
Purpose:

      This memorandum responds to the Office of Inspector General's Draft Report on
Laboratory Data Quality at Federal Facility Superfund Sites (Report No. E1SKB6-09-0041),
received on January 31, 1997.

Discussion:

      We reviewed the draft report and generally concurred with the findings and
recommendations.  We do, however, have comments which will improve the quality and accuracy
of the report. First, we have provided an overview segment which comments on the focus of the
report.  Second, in our review of the findings, we found certain inaccuracies which we noted in
detail.  We believe remedying these inaccuracies will greatly improve the report.  You will be
receiving separate responses to the draft report from the Offices of Enforcement and Compliance
Assurance, and Solid Waste and Emergency Response. ORD is committed to working with these
other Offices to address the recommendations of the draft report.
                                        45

-------
       We appreciate your support for ORD's efforts to respond to this draft report. If you have
any questions about the details of the response, please contact Nancy Wentworth, Director,
Quality Assurance Division, at 202 260-5763, or Arnold Bloom, ORD OIG liaison, at 202 260-
9496.

Attachment

cc:     Elliott P. Laws (5101)
       Steven A. Herman (2201 A)
       William Samuel (2421)
       Katherine Thompson, OIG-Sacramento
                                        46

-------
   ORD Comments on Draft Laboratory Data Quality at Federal Facility Superfund Sites
                               (Report No. E1SKB6-09-0041)

 Introduction:

       We appreciate the opportunity to review the draft report on Laboratory Data Quality at
 Federal Facility Superfund Sites.  We have several comments and concerns about the content of
 the report.  Our comments are organized into three sections: 1) Overview, 2) Clarifications and
 Errors, and 3) Recommendations. We concur with the stated recommendations, except as noted,
 and appreciate the OIG's attention to considering our thoughts to clarify and enhance the report's
 content and message.

 Overview:

       The report leads the reader to believe that quality assurance documentation, particularly
 the quality assurance project plan (QAPP), is intended to provide absolute assurance that data of
 the appropriate type and quality will be collected and used in decision making.  Development and
 approval of a QAPP is one thing; proper implementation of the approved QAPP is quite another.
 Both are needed to confirm the success of the project. The  QAPP, including relevant site-specific
 data quality objectives (DQO), is an effective tool for defining site-specific data collection
 activities and the controls needed to give reasonable and documented assurance that the activities
 occurred as planned. There is no system, no matter how well conceived and documented, that
 cannot be circumvented by unexpected environmental conditions, unintentional mistakes by staff,
 or intentional malfeasance.

 Clarifications, Inconsistencies and Errors:

 Page 5, Quality Assurance Division: QAD is "overseeing implementation of the Agency-wide
 mandatory policy for QA...", not "implementing" the program.  The Program Office and Regions
 are responsible for implementation.

 Page 7, QAPPs: The statements imply a deficiency in the design of the QAPP, while the text
 describes flaws in the application of the QAPP.  We believe that a well written and correctly
 implemented QAPP will address the deficiencies noted in the report.

Page 8-9, DQOs were deficient, paragraph below box: "possibly indicating that initial DQOs may
be inadequate or incomplete."

Page 10, Effective Data Quality Activities Identified: Three activities, not four.
                                          47

-------
 Page 11, Data Validation Found Effective, para 1: In the second sentence, data should be "of
 known and documented quality."

 Page D:  Standard operating procedures, not standing operating procedures.

 Page 15: Chart should be revised to delete PEs.

 Page 15: Lack of guidance: QA R-5 is expected to be issued in Summer, 1997.

 Page 24, Data Quality a Material Weakness: As of October 31, 1996, 7 Regions had approved
 Quality Management Plans. As of February 28, 1997, 9 Regions have approved QMPs; Region
 7's plan is still under development.

 Recommendations:

 The Assistant Administrator for Research and Development

 13.    Refine the data quality objectives process by:

        a.     Ensuring the early involvement of key decision makers.

       b.     Using checklists to identify necessary activities.

       c.     Identifying specific documentation requirements.

       d.     Using the model developed at the Hanford site as a guide.

       Response: The Quality Assurance Division's (QAD) data quality objectives (DQO)
       guidance was not intended to be a static document, but was designed to be updated on a
       periodic basis. QAD will review the guidance and material noted in the report that was
       prepared for the Hanford, Washington, Department of Energy facility, and will issue
        appropriate addenda to its DQO guidance. QAD will complete this activity by August 1,
       1997.

14.     Work with the Federal Facilities Restoration and Reuse Office (FFRRO) and regions to
       develop acceptable quality management plans (QMPs).

       Response: QAD will continue its support to FFRRO and the regions as they develop and
       implement QMPs. QAD provides training in the Agency's quality management system
        and invites all organizations to participate in the training.
                                         48

-------
              UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
                            WASHINGTON, D.C. 20460
                                      131996
     OFFICE OF
  ENFORCEMENT AND
COMPLIANCE ASSURANCE
MEMORANDUM

SUBJECT:          OECA Response to the EPA Office of Inspector General (IG)
                    Draft Audit Report, Laboratory Data Quality at Federal
                    Facility Superfund Sites (Oct 1996) (the Draft Report)

FROM:              Steven A. Herman
                     Assistant Administrator
                     Office of Enforcement and Compliance Assurance

TO:                Michael Simmons
                    Deputy Assistant Inspector General for Internal Audit
                    Office of Inspector  General
       The Draft Report offers thirteen recommendations: twelve addressed to OSWER
and ORD and one to OECA.  While finding inadequate EPA's oversight of environmental
data quality at federal facilities, the IG acknowledged EPA's legal authority to conduct this
oversight is "ambiguous."  Draft Report, p. 22.  To cure any inadequacy due to unclear
oversight authority, the Draft Report recommends we endeavor to modify Executive
Order 12580 to expressly identify EPA's oversight responsibilities for RI/FS activities,
including environmental data quality.  Draft Report, pp. 23, 28. This memorandum
provides our response to this recommendation.

Legal Background: CERCLA and Executive Order 12580 fail to address whether
EPA has Clear Authority to Oversee RI/FS at Federal Facilities

       Executive Order 12580 (the EO) delegates certain functions CERCLA vests in the
President to the heads of federal departments and agencies for "releases or threatened
releases where either the release is on or the sole source of the release is from any facility
or vessel under the jurisdiction, custody or control" of their departments and agencies.
Although in several instances, the EO tailors its delegations based on the department or
agency receiving the authority and if the site is on the NPL, it makes no distinction when
delegating the President's RI/FS authorities.  The EO delegates the President's 104(b)(1)

                                     49

-------
 investigation and study (RI/FS) authority, regardless of the department or agency or if the
 site is on the NPL.  With few exceptions, the Executive Order's framework allocates the
 President's authorities in an exclusive manner, with EPA not delegated any shared
 authority over the functions delegated to another department or agency.  (A significant
 exception to this exclusive delegation framework is the new amendments to EO 12580
 providing the federal natural resource trustee agencies with section 106 and 122 enforcement
 authorities. These new amendments supplement the authorities of federal agencies, while
 not diminishing EPA's authorities.)

       As noted above, in receiving the President's section 104(b)(l) authorities, the
 heads of the federal departments acquire exclusive authority to conduct the RI/FS,
 including the data gathering and quality assurance activities.  Section 120(e)(2) modifies
 this unfettered authority by requiring EPA to review the "results" of the RI/FS and enter
into an Interagency Agreement (IAG) providing for the implementation of the remedial
 action (emphasis added).  Section 120(e)(4) provides EPA with the ultimate authority to
 select the remedial action, should the head of the department and the EPA Administrator
 disagree.  However, it is unclear whether EPA has the ultimate authority to alter the
 RI/FS, including the environmental data on which it relies.

       Although section 120 does not require EPA and the other departments or agencies
 to enter into an IAG until the time for remedy selection (time of ROD), typically EPA
 negotiates federal facility agreements earlier in the process. This allows EPA to review
 and potentially dispute certain pre-ROD documents and studies.  These generally include,
 for example, draft RI/FS reports and data from which the RI/FS reports are compiled.
 The Quality Assurance Project Plan (QAPP) is an example of a document over which EPA
 has no clear oversight authorities, but may review, based on the provisions of the federal
 facility agreement. However, if the environmental data are used to support a removal, or
 a remedial action at a non-NPL  site, the federal department or agency need not enter into
 an IAG with EPA. Thus, EPA would have little leverage to gain review and comment
 authority over the environmental data or studies.

       To summarize, the statute and the Executive Order fail to specifically delegate
 oversight to EPA, which weakens our ability to compel the federal agency to  modify any
 pre-ROD document. EPA negotiates for pre-ROD oversight, based on the possibility that
 unless EPA reviews the RI/FS, EPA may not have sufficient information on which to
 approve the federal agency's ultimate remedy selection decision.
OECA and OSWER Are Taking Action to Clarify EPA's Oversight Authority for
Federal Facility RI/FS Activities, including Data Collection and Quality Control
Measures

      In its responses to the Draft Report, OSWER provides details of their efforts to
aggressively respond to the Draft Report's findings and recommendations. In particular,
                                        50

-------
OSWER responds to recommendation 4 (issue guidance specifying regional oversight
responsibilities and other related topics) by committing to develop, in coordination with
the EPA Regions and DOE and DOD, a framework for the minimum quality assurance
program that the federal facilities should have in place. OECA and OSWER, working
with the regions and the other federal departments and agencies, will undertake a program
to improve the quality of the RI/FS work the federal departments and agencies conduct.
Consistent with our long-held "enforcement first" principles, we applaud the IG for
supporting the need for strong EPA oversight.  At this time, OECA and OSWER view the
best approach to improving the data quality supporting federal facility response actions as the
cooperative, yet aggressive, approach detailed in OSWER's comments.  However, OECA
remains ready to pursue amending the executive order if we fail to secure the improvements
the OSWER actions seek.  OECA currently is reviewing several options for amendments to
EO 12580, should that become necessary.  Further, OECA is considering whether a
memorandum of understanding or similar agreement may offer the best vehicle for clarifying
EPA's oversight authorities.

CONCLUSION

       As the above demonstrates, in response to the IG's Draft Report, OECA, in
cooperation with OSWER, EPA regions and federal departments and agencies, is actively
pursuing methods to clarify EPA's authority over federal departments and agencies
conducting RI/FS activities, including environmental data collection and quality assurance
measures.
cc:    Elliott P. Laws
      Robert J. Huggett
                                       51

-------
(This page intentionally left blank.)
               52

-------
APPENDIX B
  Acronyms
Acronym     Name
AFCEE       Air Force Center for Environmental Excellence
CADRE       Computer Assisted Data Review and Evaluation (Program)
CERCLA      Comprehensive Environmental Response, Compensation, and Liability Act
CLP         EPA's Contract Laboratory Program
DQO         Data quality objectives
DOD         U.S. Department of Defense
DOE         U.S. Department of Energy
e-Data      Electronic Data Transfer and Validation System
FS          Feasibility study
NPL         National Priorities List
OECA        Office of Enforcement and Compliance Assurance
OU          Operable unit
ORD         Office of Research and Development
OSWER       Office of Solid Waste and Emergency Response
PE          Performance evaluation (samples)
QA          Quality assurance
QAPP        Quality assurance project plan
QC          Quality control
RI          Remedial investigation
     53

-------
(This page intentionally left blank.)
              54

-------
                          APPENDIX C
      How Federal Facilities on the NPL are Cleaned Up
National Priorities List
Facilities judged by EPA to present serious risks to human health and the
environment are placed on this list.

Federal Facility Agreement Negotiated
EPA's policy is to sign a Federal facility agreement (FFA) before the
Remedial Investigation/Feasibility Study (RI/FS) stage.

Remedial Investigation
The responsible agency uses sampling and other analytical activities to
determine the nature, extent, and significance of the contamination.

Feasibility Study
The responsible agency conducts feasibility studies to evaluate cleanup
alternatives for the sites to determine which would provide the protection
required.

EPA Reviews RI/FS
EPA is required to review the RI/FS and enter into an FFA with the federal
agency, if not already done.
                                  55

-------
Record of Decision
The responsible Federal agency selects a cleanup method and, in the record
of decision, documents the analysis that led to the selection.  From the
record of decision, the process leads either to "EPA Agrees to Remedial
Action" or to "No Further Action Required."

Remedial Design/Remedial Action
Detailed design plans are chosen and the cleanup option is implemented by
the responsible agency.
                                        56

-------
                             APPENDIX  D

                         Data Quality Problems
 Installation
                              Causes
March Air
Force Base
Region 9's Request for Suspension of Eureka Laboratories dated November
1993 identified: "Fraudulent under-reporting of tetrachloroethene
concentration....Fraudulent reporting that method tuning criteria were
met....Fraudulent reporting that initial calibration criteria were
met....Fraudulent reporting that continuing calibration criteria were
met....Fraudulent reporting that surrogate recovery criteria were
met....Removing 'M Flags' from the paper trail...Pervasive unwarranted
manipulation of calibration and sample quality control data."
Hunters Point
Naval Shipyard
The Executive Summary for Rejected Laboratory Data for CTO 0057,
Hunters Point Annex RI/FS stated: "However, full validation process
identified gross methodology errors such as improper calibration
procedures, improper procedures in violation of CLP, and numerous other
laboratory QA problems. In addition, the full data packages were
incomplete."
Fernald
Environmental
Management
Project
According to the U.S. Department of Energy's Office of Inspector General
Report on Fernald Environmental Management Project Remedial
Investigation and Feasibility Study (DOE/IG-0326) dated April 1993:
"Samples had been assigned duplicate identification numbers.... The
laboratory did not analyze some samples within EPA prescribed time limits,
it lost other samples, and it questioned the integrity of the sample data."
Rocky
Mountain
Arsenal
According to Rocky Mountain Arsenal's report on its audit of Eureka
Laboratories on August 12 and 13, 1993: "The laboratory had obviously
manipulated the instrument output, which immediately brings all GC/MS
data output under question....A summary letter says that dilutions are to be
made based on conductivity, but no conductivity measurements were found
in the data package....There was not evidence of an initial calibration in
data package....Several samples in this lot had numbers entered on the
analyst worksheet that were incorrectly reported in the transfer file. "
Luke Air Force
Base
According to Region 9's Memorandum of August 4, 1995, Subject:
Confirmation of Manipulated Data at Luke Air Force Base: "Manipulations
of calibrations and surrogate recoveries for the semivolatile analyses were
documented."
                                      57

-------
  Installation
                               Causes
Travis Air
Force Base
According to an Air Force Center for Environmental Excellence
contractor's report dated August 23, 1993:  "Weston's QA system for
identifying out-of-control analytical data was in place but failed to prevent
the reporting of such data for Travis Air Force Base....The specified
calibration acceptance procedures were not being used.... The required
number of surrogates was not being used to QC samples for organic
methods....Matrix spikes were not being used to control accuracy for
organic methods....Laboratory-established control limits were not being
used to QC the methods. "
Sacramento
Army Depot
Region 9's data validation for the Burn Pits Operable Unit found that the
sample results for volatile organic compounds were rejected due to serious
deficiencies (a defect in sampling technique) in the ability to analyze the
samples and meet quality criteria.

Region 9's data validation for the Groundwater Operable Unit found
missing holding times and unacceptable calibrations impacting numerous
compound analyses.
Fort
Wainwright
The contractor's data validation reports said that pesticide peaks were not
consistent and there were potential false positives and negatives.
                                       58

-------
                              APPENDIX E

              Definitions of Quality Assurance Activities
Computerized Data Validation

Computerized data validation is a relatively new quality assurance measure
that is more efficient than traditional manual data validation.  EPA has
developed two automated data validation programs:  Computer-Aided Data
Review and Evaluation (CADRE) and e-Data.

Data Validation

Data validation is a method for ensuring laboratory data is of known
quality.  It involves reviewing data against a set of criteria to provide
assurance that data is adequate for its intended use.

EPA has data validation guidelines, known as national functional
guidelines, for its own contract lab program.  According to EPA guidelines,
data validation includes a review of documentation such as raw data,
instrument printouts, chain of custody records, and instrument calibration
logs.
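To illustrate the kind of criteria-based review described above, the
following Python sketch flags results whose holding time, calibration
recovery, or chain-of-custody documentation falls outside assumed limits.
The record layout and the numeric limits are illustrative assumptions only;
they are not taken from EPA's national functional guidelines.

    # Minimal, hypothetical sketch of a criteria-based data validation check.
    # The field names and limits below are illustrative assumptions, not
    # values from EPA's national functional guidelines.

    MAX_HOLDING_TIME_DAYS = 14                   # assumed holding-time limit
    CALIBRATION_RECOVERY_RANGE = (80.0, 120.0)   # assumed percent-recovery window

    def validate_result(result):
        """Return a list of data qualifiers for one laboratory result."""
        qualifiers = []
        if result["holding_time_days"] > MAX_HOLDING_TIME_DAYS:
            qualifiers.append("holding time exceeded")
        low, high = CALIBRATION_RECOVERY_RANGE
        if not low <= result["calibration_recovery_pct"] <= high:
            qualifiers.append("calibration outside acceptance limits")
        if not result["chain_of_custody_complete"]:
            qualifiers.append("chain of custody incomplete")
        return qualifiers

    # Example: one illustrative result record
    sample = {
        "sample_id": "GW-001",
        "holding_time_days": 16,
        "calibration_recovery_pct": 92.5,
        "chain_of_custody_complete": True,
    }
    print(sample["sample_id"], validate_result(sample) or ["acceptable"])
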

Laboratory Audits

Laboratory audits are on-site audits designed to identify technical areas
which may cause laboratories to improperly identify or quantitate
chemicals.  Audits normally evaluate a laboratory's technical expertise,
standard operating procedures, facility and equipment sufficiency, and
possible sources of sample contamination.

On-site audits are frequently viewed as a unique opportunity to evaluate
the laboratory in person.  They are useful in providing insight into the
laboratory's capabilities to perform specific analyses.  It is often
beneficial to perform an on-site audit when the laboratory is being
considered for work for which it does not have a performance history with
the Federal facility or department.

                                        59

-------
Magnetic Tape Audits

Audits of magnetic media are used to detect manual changes in the
electronic copy of the raw data and inconsistencies between the electronic
copy and the paper copy.  These audits are done in conjunction with data
audits to reconstruct an analytical run.

Electronic data, often in the form of magnetic tapes, are an output of
laboratory analyses.  By obtaining magnetic tapes (or other electronic
data) from a laboratory, audits can be conducted to help determine:

      •  If the laboratory is complying with its contract;

      •  The integrity of the laboratory's computer systems; and,

      •  The appropriateness of any software editing.

Electronic tape audits are usually limited to GC/MS data that are generated
by systems that are capable of taping.  Tape audits are not currently
available for inorganic data or radionuclides.
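The comparison at the heart of a tape audit can be sketched in a few lines
of Python.  The sketch below compares an electronic copy of reported
results against the values transcribed in the hardcopy data package and
lists any disagreements; the record keys, values, and tolerance are
hypothetical.

    # Hypothetical sketch: compare electronic (tape) results with the values
    # reported in the hardcopy data package and flag discrepancies that
    # could indicate manual changes to the electronic raw data.

    electronic_results = {"GW-001/benzene": 5.2, "GW-001/toluene": 12.0}
    hardcopy_results   = {"GW-001/benzene": 5.2, "GW-001/toluene": 8.7}

    def find_discrepancies(electronic, hardcopy, tolerance=0.01):
        """Return (key, electronic value, hardcopy value) for each disagreement."""
        flagged = []
        for key in sorted(set(electronic) | set(hardcopy)):
            e, h = electronic.get(key), hardcopy.get(key)
            if e is None or h is None or abs(e - h) > tolerance:
                flagged.append((key, e, h))
        return flagged

    for key, e, h in find_discrepancies(electronic_results, hardcopy_results):
        print(f"{key}: electronic={e}, hardcopy={h}")
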
Performance Evaluation Samples

Performance evaluation (PE) samples are prepared by "spiking" a known
concentration of chemicals into a contaminant-free medium, such as water or
soil.  PE samples can be administered by two methods: "blind" or
"double-blind."  When a PE sample is blind, the laboratory is aware the
sample is a PE, but does not know the chemical concentration levels.

When a sample is double-blind, the PE sample is submitted as part of a
field sample shipment, so that the laboratory is not only unaware of the
concentration levels, it is also unaware that the sample is a PE.  A
laboratory's analysis of PE samples is used to evaluate its ability to
produce accurate results.
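Because a PE sample's spike concentrations are known in advance, the
laboratory's reported values can be scored directly.  The Python sketch
below computes percent recovery for each analyte and compares it with an
assumed acceptance window; the analytes, concentrations, and limits are
hypothetical.

    # Hypothetical sketch: score a laboratory's PE sample results by computing
    # percent recovery against the known (spiked) concentrations.

    ACCEPTANCE_WINDOW = (75.0, 125.0)    # assumed percent-recovery limits

    spiked   = {"trichloroethene": 50.0, "benzene": 20.0}   # known values, ug/L
    reported = {"trichloroethene": 47.5, "benzene": 31.0}   # laboratory results, ug/L

    def percent_recovery(reported_value, true_value):
        """Reported concentration as a percentage of the spiked concentration."""
        return 100.0 * reported_value / true_value

    for analyte, true_value in spiked.items():
        recovery = percent_recovery(reported.get(analyte, 0.0), true_value)
        low, high = ACCEPTANCE_WINDOW
        verdict = "within limits" if low <= recovery <= high else "outside limits"
        print(f"{analyte}: {recovery:.1f}% recovery ({verdict})")
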
                                       60

-------
                          APPENDIX F

     Planning Procedure for Defining Data Quality Objectives
The following chart was prepared by Hanford Nuclear Reservation's environmental
restoration contractor (ERC), Bechtel Hanford, Inc.
                                 61

-------
                           ERC DQO PROCESS

[Flowchart not reproduced.  Its elements include: Program; Develop List of
Work Projects; Decision Makers DQO Checklist; Interview Decision Makers
(DQO Checklist); Facilitator/Decision Makers Prep; Scoping Report; Global
Issues Meeting; Internal DQO Process, Steps 1-7; External DQO Process,
Steps 1-7; Draft DQO Report; Decision Makers Review & Prep; Final DQO
Report; and Planning Documents (FSP; QAPjP; DOW).]
                                    62

-------
                        APPENDIX G
  Example of Travis Air Force Base's Quality Assurance Report
            Summary of Performance Evaluation Samples

Analysis:       SW8260 Volatiles by GC/MS (water)
Laboratory:     RAS
Analysis Date:  May 1994

Problems Noted: All volatile organic compounds were correctly identified.
Of 19 PE analytes, only o-xylene and m,p-xylenes were outside the QAPP
(LCS) and PE vendor acceptance criteria.
Comments:       No analytical anomalies found.  A second PE sample was
submitted.
Project Impact: Sufficient data quality for all analytes.  O-xylene was
correctly identified and quantitated during the second PE sample analysis.

Problems Noted: Six analytes were detected above the detection limit, but
below the MRL, which were false positives: acetone, 2-butanone, chloroform,
chloromethane, dibromoethane, and 1,1-dichloroethene.
Comments:       The majority of these low-level detections were qualified
as nondetects during data evaluations due to low-level blank contamination.
Project Impact: No impact.

Analysis:       SW8260
Laboratory:     RAS
Analysis Date:  August 1994

Problems Noted: All 19 PE analytes were correctly identified and all except
benzene met QAPP and PE vendor QC criteria.
Comments:       No analytical anomalies found.
Project Impact: Sufficient data quality.  Benzene was in control in 2 other
PE sample analyses.
        Source: North Operable Unit RI Report, Travis Air Force Base, February 1995
The quality assurance report is discussed on page 16 of this report.
                                63

-------
(This page intentionally left blank.)
               64

-------
        APPENDIX H
Activities Contacted During the Audit
Activity                                                    Location

Environmental Protection Agency, Headquarters
•  Office of Solid Waste and Emergency Response,
   Federal Facilities Restoration and Reuse Office          Washington, DC
•  Office of Research and Development, National Center
   for Environmental Research and Quality Assurance         Washington, DC
•  Office of Enforcement and Compliance Assurance,
   Federal Facilities Enforcement Office                    Washington, DC

Environmental Protection Agency
•  Region 8                                                 Denver, CO
•  Region 9                                                 San Francisco, CA
•  Region 10                                                Seattle, WA

Department of Defense
•  Office of the Assistant Deputy Under Secretary of
   Defense (Environmental Cleanup)                          Washington, DC
•  Office of Inspector General                              Alexandria, VA
•  Tri-Service Chemical Quality Assurance Work Group        Omaha, NE
•  U.S. Army Corps of Engineers' Hazardous, Toxic, and
   Radioactive Waste Center                                 Omaha, NE
•  U.S. Army Corps of Engineers, Alaska District            Anchorage, AK
•  Air Force Center for Environmental Excellence            San Antonio, TX
•  Naval Facilities Engineering Service Center              Port Hueneme, CA
•  Concord Naval Weapons Station                            Concord, CA
•  Fort Wainwright                                          Fairbanks, AK
•  Hunters Point Naval Shipyard                             San Francisco, CA
•  Luke Air Force Base                                      Glendale, AZ
•  March Air Force Base                                     Riverside, CA
•  Rocky Mountain Arsenal                                   Commerce City, CO
•  Sacramento Army Depot                                    Sacramento, CA
•  Tooele Army Depot                                        Tooele, UT
              65

-------
Activity                                                    Location

•  Travis Air Force Base                                    Fairfield, CA
•  Air Force Audit Agency                                   Washington, DC /
                                                            March AFB
•  Army Audit Agency                                        Alexandria, VA

Department of Energy
•  Office of Inspector General                              Germantown, MD
•  Hanford Nuclear Reservation                              Richland, WA

Interagency Steering Committee for Quality Assurance
for Environmental Measurements                              Los Alamos, NM
66

-------
  APPENDIX I
Report Distribution
Distribution          Individual or Activity

Office of Inspector   •  Acting Inspector General (2410)
General

EPA Headquarters      •  Assistant Administrator for Administration and Resources
                         Management (3101)
                      •  Director, Office of Research Program Management (8102)
                      •  Acting Associate Administrator for Regional Operations and
                         State/Local Relations (1501)
                      •  Associate Administrator for Congressional and Legislative
                         Affairs (1301)
                      •  Associate Administrator for Communication, Education, and
                         Public Affairs (1701)
                      •  Headquarters Library (3401)
                      •  Director, Federal Facilities Restoration and Reuse Office (5101)
                      •  Director, Federal Facilities Enforcement Office (2261A)
                      •  Director, National Center for Environmental Research and
                         Quality Assurance (8201)
                      •  Agency Followup Official, Attn: Director, Resource Management
                         Division (3304)

Regional Offices      •  Regional Administrators, Regions 1 through 10

External              •  General Accounting Office
                      •  DOD Inspector General
                      •  DOE Inspector General
       67

-------
(This page intentionally left blank.)
              68

-------