Should you or your staff have any questions about this report,
please contact Truman R. Beeler, Divisional Inspector General for
Audit, Western Division, or Paul Jalbert, Audit Manager at (415)
744-2445.
Attachment
cc: Don R. Clay, Assistant Administrator for Solid Waste and
Emergency Response (w/attachment)
Christian R. Holmes, Assistant Administrator for
Administration and Resources Management (w/attachment)
EXECUTIVE SUMMARY
The Environmental Monitoring System Laboratory-Las Vegas (EMSL)
provides Quality Assurance/Quality Control (QA/QC) support to the
EPA Superfund Contract Laboratory Program (CLP). Because the
analytical data produced through the CLP is critical to the
effective accomplishment of Superfund goals, the Office of the
Inspector General (OIG) initiated a review of the effectiveness
of EMSL's QA/QC program. The objectives of the review included
determining whether: (i) EMSL's controls were adequate to
evaluate laboratory performance; (ii) EMSL was reporting all CLP
deficiencies timely and accurately; and (iii) appropriate
follow-up actions were initiated against poor performing laboratories.
BACKGROUND
Superfund was created to protect the public health and the
environment from the release, or threat of release, of hazardous
substances. This protection ranges from removal actions for the
immediate control of emergency situations to more permanent
remedial actions which address long term activities at
contaminated sites. The CLP was established to provide
analytical services to support Superfund site investigations and
clean-ups. The primary objective of the CLP is to provide
analytical data of known and documented quality for a high volume
of samples at an acceptable price. This is accomplished through
the award of EPA contracts to private laboratories. The
contracts include strict analytical protocols, QA/QC
requirements, and specific reporting requirements and
deliverables.
The CLP is directed by the Analytical Operations Branch (AOB) in
the Office of Emergency and Remedial Response. While overall
management responsibility for the CLP rests with AOB, various EPA
organizational elements assist AOB in the administration and
monitoring of CLP activities. Administrative project officers,
located in AOB, provide contract administration, laboratory
monitoring, payment approval and other support activities to the
program. The technical project officers, located in the regional
offices, are responsible for a variety of functions including
technical monitoring, participation in on-site audits, primary
liaison with laboratories in their geographic area and evaluating
corrective action measures. The AOB has full control and
responsibility for its internal QA program. However, at the
specific request of the AOB, EMSL provides QA/QC support to the
CLP through the accomplishment of various audits of CLP
laboratories, their analytical data and through the evaluation of
analytical methods.
RESULTS-IN-BRIEF
Our review disclosed that EMSL can take a number of actions to
improve the effectiveness of its CLP QA/QC program. In general,
we found that EMSL had attempted to direct and improve its QA/QC
audit services while responding to changing program priorities.
While EMSL's efforts are recognized, they were not fully
effective in improving administrative controls for the attainment
of QA/QC goals.
We found that EMSL's controls over its QA/QC program were not
complete nor fully effective for evaluations of individual
laboratory performance. In addition, we concluded that EMSL's
reporting systems should be improved to provide more
comprehensive data to program and regional personnel. With
regard to timely and appropriate follow-up actions against poor
performing laboratories, we concluded that EMSL only has an
advisory role in this area. The responsibility to initiate
actions against laboratories rests with EPA's program and
contracts offices.
Because of AOB's influence over the CLP and its overall
responsibility for managing the program, the OIG initiated a
separate review of AOB's management effectiveness. The
results of that review will be reported to EPA
Headquarters at a later date.
PRINCIPAL FINDINGS
The conditions disclosed during our review are summarized in the
following paragraphs and discussed in detail in the report.
EMSL'S QA/QC AUDIT COVERAGE NEEDS INCREASED ATTENTION
While significant improvements have been made in its QA/QC audit
coverage, EMSL did not conduct a sufficient number of
gas chromatography/mass spectrometry (GC/MS) tape audits, data
audits, on-site audits, or remedial performance evaluation audits
to achieve its CLP audit frequency goals. Except for the routine
quarterly blinds (QBs), EMSL did not achieve its established
QA/QC audit goal frequencies. Frequent and timely QA/QC audits
help identify performance deficiencies so that timely follow-up
action can be initiated. An effective QA/QC program increases
assurance that laboratories are producing data of known and
acceptable quality. EMSL personnel
identified funding and limited availability of trained personnel
as reasons for not conducting the targeted number of QA/QC
audits. While limited resources may have contributed to the
condition, we believe that EMSL also needs to increase its audit
management planning, controls, and oversight.
TRACKING PROCEDURES AND SYSTEM FOR QA/QC AUDIT RESULTS NEEDED
EMSL has not established effective tracking procedures and
systems for evaluating QA/QC historical audit performance by
laboratory. This condition diminished EMSL's ability to monitor
individual laboratory QA/QC performance trends for the purpose of
identifying poor performing laboratories. As a result, we
believe that additional QA/QC audits and any needed
administrative or contractual actions that should have been taken
against laboratories were significantly delayed or not taken.
This condition was first reported to EMSL management in 1983;
however, it remained uncorrected through fiscal 1991. At that
time EMSL initiated work on developing a system; however, the
system is not projected for full implementation until early
fiscal 1995. EMSL's QA/QC efforts have emphasized the development
and monitoring of contractual protocols and methodologies, the
tracking of overall program performance, and some comparisons
between laboratories. The delay in establishing a
laboratory-based tracking system for QA/QC audit results may also
be attributable to the fact that such a system was never a
priority of the National Program Office (NPO).
It is our opinion that the timely development of an integrated
tracking system would be very beneficial in improving the overall
quality of CLP analytical data.
EMSL'S REPORTING SYSTEMS NEED IMPROVEMENT
A number of weaknesses in EMSL's reporting of laboratory contract
noncompliance and operating deficiencies were found. Because of
the reporting weaknesses, contract compliance deficiencies were
not always reported, nor were laboratories with recurring
deficiencies highlighted. In addition, EMSL does not have
adequate systems for rating the overall performance of
laboratories. Consequently, the NPO was unable to take timely
and appropriate actions against contractors. We attribute this
condition to a need for increased management oversight over
reporting of CLP QA/QC results. An additional contributing cause
may be the NPO encouraging the importance of working with CLP
contractors instead of holding them responsible for contract
performance. Incomplete reporting systems provide opportunities
for poor performing laboratories to continue non-compliant
activities and make it difficult for program managers to fulfill
their responsibilities to monitor and take appropriate actions.
Improvement in EMSL's reporting systems could significantly help
the Agency in its administration of CLP contracts. This in turn
would lead to increased assurance that the Superfund Program is
producing analytical data of known and acceptable quality.
We are recommending that the Assistant Administrator for Research
and Development direct the Director of EMSL to initiate several
actions to improve EMSL's management of the CLP QA/QC program.
These recommendations include:
1. Establishing clear goals for performing QA/QC audits on
each laboratory in the program, with increased emphasis on poor
performing laboratories.
2. Instituting a laboratory-based tracking system to
accumulate historical QA/QC performance data so that QA/QC
efforts can be directed at potentially vulnerable areas.
3. Developing and implementing procedures for identifying
and reporting repeat deficiencies and laboratory non-compliance
with contract deliverable requirements to the appropriate program
offices.
AGENCY COMMENTS
A draft audit report was transmitted to Agency officials for
comment on March 4, 1992. The Assistant Administrator (AA) for
the Office of Research and Development (ORD) responded to the
draft audit report on May 6, 1992. An exit conference was held
with ORD representatives on June 3, 1992.
ORD's response indicated general agreement with the audit
findings on those issues it considered to fall specifically
within ORD's areas of responsibility. ORD's
comments to individual findings and recommendations are
incorporated into each chapter in this report. Where applicable,
additional auditor comments follow the ORD comments. ORD's
general comments are summarized as follows:
ORD has been a participant in the CLP for over a decade,
primarily through the activities of EMSL-Las Vegas... I am
satisfied that many of the recommendations in the audit
report are both reasonable and achievable. In fact, the
most complex technical recommendation, the development of an
automated database for evaluation of laboratory performance
trends, is well on its way towards a successful completion.
EMSL-Las Vegas fulfills a key role within the CLP by
monitoring analytical method performance, providing an
unbiased quality assurance data review, and resolving
unsettled technical issues. These are functions that are
consistent with the overall mission of ORD.
The CLP quality assurance program has always been a shared
responsibility between the NPO, EMSL-Las Vegas, the Regions,
and the contract management organizations...with each
component pursuing its individual roles and
responsibilities.
EMSL-Las Vegas is described as not initiating or completing
certain actions, such as laboratory audits, when the
responsibility for the action resides with another
organization.
EMSL-Las Vegas is a participant in on-site audits. The
audits are conducted by and for the Contracting Officer,
or their designee. The actual scheduling and ranking of
audits is the responsibility of the Contracting Officer,
the Regional Technical Project Officers (TPO), or the NPO.
Each year, EMSL-Las Vegas works with the NPO to develop its
plan for quality assurance activities and considers the
needs of the program and anticipated levels of funding.
Levels of funding are designed to cover normal activities.
Special projects requiring resolution arise, however, and consume
resources targeted for the base program. Some of the
special projects include data audits, tape audits and
document reviews in support of the high priority Office of
the Inspector General activities.
EMSL-Las Vegas is prepared to take whatever steps are
necessary to achieve a successful outcome to the
Laboratory Performance Database (LPD) system.
Finally, there seems to be some confusion concerning the
Quality Assurance Program Plan (QAPP). Data quality
objectives are a user-driven function, dependent on the
intended use of the data. It is incumbent on the user, in
this case the NPO, to develop a QAPP that outlines the
overall QA program framework.
In addition to the above comments on the report findings, ORD
provided the following comments on the summary audit
recommendations included above.
1. EMSL-Las Vegas will work with the National Program
Office to develop a mutually supported process for targeting
QA/QC audits. This process will consider laboratory performance,
routine laboratory frequency goals and critical external needs,
such as requests from the Inspector General.
2. The LPD system is being developed with an initial
version scheduled for completion this year. A scientific peer
review is scheduled for this year to evaluate the system.
3. The LPD system will allow users to better identify and
report repeat deficiencies. EMSL-Las Vegas will work with the
NPO to better identify and improve reporting procedures to ensure
that responses are both timely and thorough.
ORD's complete response is included as Appendix II to this
report.
OIG EVALUATION OF AGENCY GENERAL COMMENTS
We agree with ORD's description of EMSL as playing a "key role"
within the CLP, and with its long history of assisting the
program. Our review of program documentation showed that it is
EMSL's responsibility to provide adequate QA/QC program coverage
for data audits, quarterly blinds and GC/MS tape audits. Annual
work plans for EMSL, and its support contractor, included
expected performance levels and direction of resources. It is
our opinion that EMSL is accountable for the performance of an
effective QA/QC program and the assurance of appropriate coverage
of CLP laboratory activities consistent with program goals agreed
to with the NPO. EMSL's work plans include extensive discussions
on performance levels and intended direction of QA/QC audit
activities. With regard to ORD's comments on the summary audit
recommendations, they were generally responsive and indicate
potentially positive program improvements if implemented.
In summary, it is our opinion that the audit findings and
recommendations provide the program with opportunities to improve
the effectiveness of the CLP QA/QC program. This in turn will
result in increased assurance that the analytical data being
derived from the program are accurate and supportable.
Audit Report No. E1SKFO-09-0137-2100624
TABLE OF CONTENTS
CHAPTERS Page
I INTRODUCTION 1
PURPOSE 1
BACKGROUND 1
SCOPE AND METHODOLOGY 5
PRIOR AUDIT COVERAGE 6
OTHER MATTERS 6
II EMSL'S QA/QC AUDIT COVERAGE NEEDS
INCREASED ATTENTION 8
III TRACKING PROCEDURES AND SYSTEM FOR QA/QC
AUDIT RESULTS NEEDED 22
IV EMSL'S REPORTING SYSTEMS NEED IMPROVEMENT 31
APPENDIXES
APPENDIX I: ABBREVIATIONS 40
APPENDIX II: AGENCY COMMENTS 42
APPENDIX III: DISTRIBUTION LIST 54
CHAPTER I
INTRODUCTION
PURPOSE
The goal of EPA's Contract Laboratory Program (CLP) is to provide
laboratory analytical services to support Superfund Program
technical decisions and related site enforcement actions. To
achieve this goal, the CLP was established with stringent quality
assurance/quality control (QA/QC) requirements, both in the
program and its contracts. The Environmental Monitoring Systems
Laboratory-Las Vegas (EMSL) plays a key role in the Agency's
monitoring of laboratory analytical performance for compliance
with contractual requirements.
The Office of the Inspector General (OIG) initiated a review of
EMSL's CLP QA/QC program to determine whether it was effective in
ensuring that data analyses received in support of the Superfund
program are of high quality and whether the Agency was taking
effective corrective action when instances of poor or
unacceptable laboratory performance were disclosed. Our specific
audit objectives were to determine whether:
1. EMSL's QA/QC controls are adequate to evaluate
laboratory performance and whether these controls are being
effectively implemented;
2. EMSL is reporting all CLP laboratory deficiencies timely
and accurately; and
3. Appropriate follow-up actions are initiated timely
against poor performing CLP contractors resulting in corrective
actions.
BACKGROUND
The CLP was established in 1980 to provide analytical services to
support Superfund site investigations and clean-ups. The primary
purpose of the CLP is to provide data of known and documented
quality for a high volume of samples at an acceptable price. To
accomplish this, EPA contracts with private laboratories for
routine analytical services (RAS) for organic, inorganic,
volatile organic and dioxin compound identification. The
contracts include strict QA/QC protocols, specific reporting
requirements and deliverables. Inspection and acceptance clauses
are included in the contracts that allow EPA to monitor
contractor compliance with the terms and conditions of the
contract.
During the period of our review, EPA contracted with over 100
laboratories to analyze over 100,000 samples per year. These
laboratories are paid a fixed price per sample. The program
spends between $3 million and $5 million a month for these
analytical services. EMSL's costs for conducting its QA/QC
function are approximately $3 million per year. Agency cleanup
decisions, which use CLP data as a basis, result in expenditures
of hundreds of millions of dollars.
QA/QC Definitions
Quality Assurance (QA) and Quality Control (QC) are integral
parts of the CLP. Existing program guidance defines QA and QC as
follows:
Quality Assurance is a process of management review and oversight
at the planning, implementation, and completion stages of
environmental data collection activities. The goal of the QA
process is to assure that the data provided to its users and
decision makers are of the quality needed and claimed.
Management's involvement in the QA process is vital.
Quality Control focuses on the detailed technical activities
(calibrations, split samples, demonstration of instrument
sensitivity and range, interference and contamination checks,
demonstration of method and laboratory accuracy and precision)
needed to achieve a specified data quality. QC activities
associated with sample analysis are intended to demonstrate and
document laboratory method performance.
A complete QA/QC program includes internal laboratory QC criteria
that must be met to achieve acceptable levels of performance.
These performance levels are determined by QA review. According
to EMSL's Fiscal Year 1990 Annual Report of QA in Support of
Superfund, EMSL has a major QA oversight and research mission
associated with the CLP.
Organizational Responsibilities
The National Program Office's (NPO) management and administration
of the CLP are aided by the contractor-operated Sample Management
Office (SMO).
CLP Administrative Project Officers (POs) are located in the NPO.
Administrative POs are responsible for contract administration
activities, such as monitoring contract laboratory performance,
approving invoice payments, and recommending contract sanctions
against laboratories that do not perform satisfactorily.
Technical Project Officers (TPOs) reside in the regional offices.
TPOs are responsible for a variety of functions including
resolving technical issues with CLP contract laboratories in
their regions, monitoring laboratory performance, participating
in on-site laboratory evaluations, and evaluating corrective
action measures.
At the direction of OSWER, EMSL is responsible for providing CLP
QA/QC support to the NPO. This is accomplished by assisting in
the evaluation of CLP laboratory performance, evaluating
analytical methods and testing the analytical data being
generated by the laboratories.
Regional offices are the primary clients of the CLP. In addition
to being the end user of the analytical data generated by the
CLP, the regional offices also provide technical support to the
Superfund program. This includes conducting data reviews and
usability assessments, arranging for sample analysis, providing
technical guidance to the regional staff and monitoring of
laboratories by regional TPOs.
EMSL's CLP QA/QC Program
The objectives of EMSL's QA/QC Program are to:
- define the quality of data produced;
- improve the quality of data;
- monitor technical compliance with contractual QA
requirements;
- assess laboratory capacity; and
- evaluate laboratory performance.
According to EMSL's Quality Assurance Program Plan (QAPP), "The
heart of the QA support effort is the evaluation and monitoring
of laboratory performance." EMSL's laboratory evaluation and
performance monitoring can be categorized into two major
activities:
- Preaward assessment of a laboratory's ability to (i) meet
all contract requirements and (ii) deliver the number of samples
bid upon. These are accomplished by requiring laboratories to
pass a performance evaluation (PE) sample and by EMSL conducting
an on-site audit; and
- Postaward performance monitoring, data acquisition, and
problem identification. These are accomplished by conducting
quarterly blind (QB) PEs, data audits, Gas Chromatography/Mass
Spectrometry (GC/MS) tape audits and on-site audits. These
activities are summarized below.
Data Package Audits. These are in-depth reviews, selected
statistically, of laboratory routine sample analysis. These
audits are designed to evaluate the technical performance of the
laboratory by thoroughly reviewing its raw data for method
compliance, transcription and calculation errors, and personnel
review to determine if the reporting forms accurately reflect the
contents of the raw data.
GC/MS Tape Audits. These are audits of organic labs' GC/MS
tape files to assess their adherence to contractual requirements
as well as the quality of the data. Tape audits involve
inspection of the raw data recorded on nine-track magnetic
computer tape or data cassettes. The tape audits determine the
consistency of data that is reported on the hard copy and floppy
diskettes.
PEs. These are homogeneous test samples sent to the
laboratories for analysis. They measure contractor and method
performance on the same set of samples and verify the
laboratory's ability to produce analytical data from a known
sample. EMSL sends out preaward PEs to prospective laboratories
bidding on new contracts. After contract award, EMSL sends out
QBs for all of the laboratories to analyze.
On-site audits. These visits to CLP contractor facilities
are used to determine if the contractor has the equipment,
personnel and procedures in place to meet the contractual
requirements. Preaward on-site audits are conducted to assist in
the determination of contract awards. Postaward on-site audits
are designed to incorporate findings from the other audits in
order to resolve the problems.
EMSL's four QA/QC audit programs (i.e. data audits, GC/MS tape
audits, PEs and on-site audits) are directed at providing: (i)
performance data by laboratory; (ii) performance data for the CLP
program as a whole; (iii) methods performance; and (iv) trend
analyses. EMSL's QA/QC audits are not intended to make specific
determinations about the usability of site specific data. The
usability determinations of CLP analytical data are made by the
data reviewers who are located in the regional offices.
SCOPE AND METHODOLOGY
Our review included QA/QC audits performed and reported by EMSL
during fiscal years 1988 through 1990. Our review focused on the
EMSL controls and effectiveness of audits done on laboratories
conducting RAS work for organics and inorganics analyses.
We conducted our audit fieldwork between March 1990 and April
1991. We reviewed policies, procedures, and controls used by
EMSL and its support contractor to carry out QA/QC audit
functions in the CLP Program. We interviewed responsible
officials at EMSL and at the support contractor to obtain
information on their required procedures and objectives as well
as what actually was done. We also reviewed EMSL and support
contractor documents to determine program objectives and to
determine audits conducted and reported. Additional fieldwork
was conducted at three regional offices, the NPO, and Contracts
Management Division (CMD), Research Triangle Park, to confirm the
Agency's utilization of EMSL's work. In addition, we evaluated
EMSL's audits in nine performance areas, focusing on whether
their audits were performed within established timeframes,
particularly for poor performing laboratories. We reviewed
selected CLP contracts to determine QA/QC deliverable
requirements. Finally, we evaluated EMSL's reporting of QA/QC
problems and trends regarding each laboratory. As no QA/QC
audits are done on Special Analytical Services (SAS) analyses, we
did not review that area.
We judgmentally selected 16 organic laboratories (4 of which also
conducted inorganic analyses) to evaluate the effectiveness of
EMSL's program to identify and report performance problems to
appropriate program officials. These laboratories represented
approximately 16 percent of the contract laboratories through
most of the audited period. We selected a cross section of
laboratories with large and small value contracts and whose QA/QC
results indicated poor performance. Laboratories with
indications of poor performance provided us an opportunity to
assess Agency follow-up actions. We also consider such
laboratories as providing the most vulnerability to the CLP
program's integrity. While our sample is not statistically
valid, we consider our work sufficient to address our audit
objectives.
We performed our audit in accordance with Government Auditing
Standards (1988 revision), issued by the Comptroller
General. Our audit included tests of management controls
specifically associated with the audit objectives. We did not
evaluate the technical aspects of EMSL's QA/QC program dealing
with improving contract methodologies and protocols or the
techniques and materials needed to support laboratory monitoring.
Also, the review did not include an evaluation of the internal
controls associated with the input and processing of information
into automated records systems, although we did utilize certain
information contained in such systems. Finally, nothing else
came to our attention that warranted the expansion of the scope
of our audit.
This audit report contains findings that describe problems the
OIG identified and the corrective actions the OIG recommends.
This audit report represents the opinion of the OIG, and the
findings contained in this audit report do not necessarily
represent the final EPA position. Final determinations on
matters in this audit report will be made by EPA managers in
accordance with the EPA established audit resolution procedures.
In this particular audit, the OIG did not measure the audited
offices' performance against any of the standards established by
the National Contingency Plan (NCP). The findings contained in
this audit report are not binding in any enforcement proceeding
brought by EPA or the Department of Justice under Section 107 of
the Comprehensive Environmental Response, Compensation, and
Liability Act to recover costs incurred not inconsistent with the
NCP.
PRIOR AUDIT COVERAGE
No prior OIG audits have been performed on these specific aspects
of the CLP program.
OTHER MATTERS
On March 20, 1991, the EPA Deputy Administrator established a
task force to conduct a Federal Managers' Financial Integrity Act
(FMFIA) review of the CLP. The purposes of the review were to
assess the management controls of the CLP and determine whether
significant weaknesses existed and whether user needs were being
met. A Task Force Report on Management Review of the CLP was
published on October 18, 1991. The Task Force concluded that the
"Contract Laboratory Program is highly vulnerable to waste and
fraud." The report also stated that "weaknesses and inaccuracies
in the information and analyses provided through the Contract
Laboratory Program could have a damaging effect on the entire
Superfund Program...." The Task Force further concluded that
"material weaknesses exist in the Contract Laboratory Program."
The report went on to identify several actions already underway
to remedy some of the identified problems. In addition, several
recommendations to improve the organization and management of the
program were included in the report. Based on the results of the
FMFIA review, the CLP was declared a material weakness in the
Agency's 1991 assurance letter to the President.
The implementation of the Task Force recommendations provides an
opportunity for the Agency to take proactive action to reduce CLP
program vulnerabilities. These actions should be accomplished
timely and within the requirements set forth in FMFIA.
CHAPTER II
EMSL'S QA/QC AUDIT COVERAGE NEEDS INCREASED ATTENTION
While some significant improvements have been made in its QA/QC
audit coverage, EMSL did not conduct a sufficient number of GC/MS
tape audits, data audits, on-site audits, or remedial performance
evaluation (PE) audits to achieve its CLP audit frequency goals.
This condition was evident for both organic and inorganic
contract laboratories. Except for the routine quarterly blinds
(QBs), EMSL did not achieve its established QA/QC audit goal
frequencies. Frequent and timely QA/QC audits help identify
performance deficiencies so that timely follow-up action can be
initiated. An effective QA/QC program increases assurance that
laboratories are producing data of known and acceptable quality.
EMSL personnel identified funding and
limited availability of trained personnel as reasons for not
conducting the targeted number of QA/QC audits. While limited
resources may have contributed to the condition, we believe that
EMSL also needs to increase its audit management planning,
controls, and oversight. In order for the program to gain
assurance that its contract laboratories are performing within
the required contract protocols, we recommend that EMSL increase
its QA/QC audit coverage, especially on poor performing
laboratories.
BACKGROUND
The importance of QA/QC monitoring of CLP laboratory performance
from the preaward stage through contract completion cannot be
overstated. Effective remedial design and resulting cleanup
actions at Superfund sites depend upon analytical data of known
and acceptable quality.
According to EMSL's QAPP for the CLP, the heart of the QA support
effort to the Superfund program is the evaluation and monitoring
of laboratory performance. Evaluation and monitoring consists of
two major parts:
1. Preaward: To assess a laboratory's qualifications,
capability to meet Program requirements, and analysis capacity.
2. Postaward: To monitor and evaluate the laboratory's
technical performance and contract compliance, and to identify
and correct problems.
EMSL implements these activities by the interaction of four types
of reviews: (i) GC/MS tape audits; (ii) data audits; (iii) PE
studies; and (iv) on-site audits. By addressing QA/QC
contractual criteria through these reviews, it is possible to
detect the existence of a problem, characterize it, and trace its
source. A problem may or may not be laboratory specific. A
problem may be identified as a weakness in the analytical method,
an ambiguity in the protocol language or a problem with a
particular analyte. These reviews can help the program assure
not only contract compliance but also data integrity for the
program as a whole.
GC/MS TAPE AUDITS
EMSL's annual task directives to its support contractor included
the performance goal that each organic laboratory have a GC/MS
tape audit performed on one of its cases each year. We found
that 26 percent of all organic laboratories did not have any tape
audits performed during the three-year period of our review.
Also, for the organic laboratories included in our sample, EMSL
conducted annual tape audits only 29 percent of the time.
Each laboratory is required, by the terms of its contract, to:
Retain all raw GC/MS data acquired under this contract on
magnetic tape in appropriate instrument manufacturer's
format...for 365 days after data submission. During that
time, the Contractor shall submit tapes and logbook within 7
days of request, as specified in the Contract
Performance/Delivery Schedule.
EMSL's audits of these GC/MS tapes include a detailed inspection
of the raw data recorded on the magnetic computer tape generated
by the GC/MS equipment. GC/MS tape audits are capable of
detecting deviations from standard operating procedures and
proper analytical protocols. Comparison of tape audit results to
data package submissions by the laboratory can disclose errors or
irregularities which would otherwise not be evident from a review
of the data package by itself.
We found that no tape audits were performed on 11 of the 42 (26
percent) organic laboratories in the program during our review
period. For the 16 organic laboratories included in our audit
sample, we found that EMSL only conducted annual tape audits for
12 of the 41 contract years, or 29 percent of the time.
We reviewed the status of tape audits for the period January 1,
1990 through September 10, 1990 to determine reasons why tape
audits were not performed. For this period, 54 percent of the
GC/MS tape files received from organic laboratories were not
audited. Available information led us to the following
conclusions:
- the laboratory's contract was soon due to expire or the
file was considered too old for 22 percent of the files;
- tapes received from the laboratories were incomplete or
unreadable for 21 percent of the files; and
- EMSL had equipment or trained personnel shortages
for 11 percent of the files.
In addition to the above, we believe another contributing cause
for this condition is that EMSL had not prepared a Quality
Assurance Project Plan (QAPjP) detailing the procedures, controls
and oversight necessary to implement an effective GC/MS tape
audit program. The requirement for a QAPjP is included in EPA
Order 5360.1. The development of the QAPjP would also be
consistent with the treatment of other EMSL CLP QAPjPs. The
NPO's CLP QAPP, dated May 20, 1988, specifically refers to EMSL's
other three QAPjPs. The development of the QAPjP is part of the
implementation of the Mandatory Quality Assurance Program. Not
having a QAPjP reduces management's assurance that the tape
auditing function accomplishes its objective of fully and timely
reporting QA/QC deficiencies.
According to EMSL officials, another reason for not being able to
conduct tape audits on each laboratory was limited resources
coupled with the requirements placed on them by the NPO and OIG
investigators to conduct "special" tape audits, which are very
labor intensive. We recognize that limited funding impacted
EMSL's ability to perform tape audits, especially considering the
other priority requests.
DATA AUDITS
During the period of our review, EMSL made significant
improvements in the frequency of audits of laboratory data
package cases. Also, some improvement in performing data audits
of poor performing laboratories occurred. Additional improvement
is necessary, however, to meet its goals in this audit area.
Data audits provide an in-depth look at the technical performance
of a laboratory on a routine set of sample analyses. A data
audit is a thorough review of the laboratory's data for method
compliance and for transcription and calculation errors, to
determine whether the reporting forms accurately reflect the
contents of the raw data.
EMSL selects data packages to audit on a statistical basis. If
conducted appropriately, data audits can highlight errors that
indicate the need for corrective action. Routine data audits
monitor a laboratory's progress in complying with contract
requirements and provide an important independent measurement of
the quality of the laboratory's performance.
EMSL established the following general goals for conducting data
audits on individual laboratories for the period of our review:
- For fiscal years 1988 and 1989, five percent of each
laboratory's total cases were to be audited. For fiscal 1990,
ten percent of each laboratory's total cases were to be audited.
These goals were established to ensure that a "representative"
number of each laboratory's data package cases would be audited;
- For fiscal 1990, EMSL's SOP stipulated that each
laboratory would have a data audit performed on one of its cases
quarterly; and
- EMSL work plans stipulated that data audit levels should
be adjusted in response to changes in data quality. "In general,
laboratories demonstrating poor performance on quarterly
performance evaluation studies, contract compliance screens,
regional data reviews, or LESC data and tape audits will be
audited more frequently."
Frequency of data audits. During the period of our review, EMSL
agreed that, while the frequency of data audits had improved
significantly, the goal of performing a data audit on 5 to 10
percent of each laboratory's cases had not been attained. EMSL
advised that the goal was unattainable because of a lack of funds
and personnel.
In fiscal 1988, EMSL achieved their goal of auditing five percent
of each laboratory's total cases for 65 percent of the
laboratories. In fiscal years 1989 and 1990, this increased to
82 and 81 percent of the laboratories, respectively. While
EMSL's efforts have resulted in most laboratories' cases
receiving data audits more frequently, we believe that EMSL
should direct additional attention to some of the laboratories.
For example, one of the largest CLP organic laboratories analyzed
368 cases during the three year period of our review. However,
only three of its cases were audited by EMSL. This represented
less than one percent of the laboratory's cases. Considering the
magnitude of CLP organic analyses being performed by this one
contractor, it is our opinion that additional data audits should
have been performed on this laboratory's activities.
EMSL's focus appeared to be more on achieving the stipulated
percentage level for the program as a whole, rather than for
individual laboratories. EMSL reported their results in the
Annual Summary based on the organics and inorganics programs as a
whole instead of by individual laboratory.
Data audits on poor performing laboratories needed. For the
cases reviewed, we concluded that EMSL only conducted timely data
audits on about one-half of the poor performing laboratory cases
that they should have. Not conducting timely data audits on poor
performing laboratories contributes to QA/QC problems not being
identified and reported timely, corrective actions not being
taken timely, and actions against poor performing laboratories
being delayed. While the criteria for determining a poor
performing laboratory were not clearly defined by EMSL, we
identified laboratories that had failed more than one QB, or had
received poor or serious contractual deficiency scores on two or
more GC/MS tape audits. With regard to the issue of timely
action, we considered any data audit performed within three
months of the poor performance period as a timely action on the
part of EMSL. Using these criteria, we found that:
- For poor performing inorganic laboratories, EMSL
conducted a timely data audit in only 20 percent of the cases;
and
- For poor performing organic laboratories, EMSL conducted
a timely data audit in only 52 percent of the cases.
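The review's two working rules, the poor-performer definition and the three-month timeliness window, can be sketched as simple checks. This is a hypothetical illustration; the function names, inputs, and the 90-day approximation of "three months" are our assumptions, not part of EMSL's actual procedures.

```python
from datetime import date, timedelta

def is_poor_performer(failed_qbs: int, deficient_tape_audits: int) -> bool:
    # Per the review's criteria: more than one failed QB, or poor/serious
    # contractual deficiency scores on two or more GC/MS tape audits.
    return failed_qbs > 1 or deficient_tape_audits >= 2

def data_audit_timely(poor_period_end: date, audit_date: date) -> bool:
    # Per the review's criteria: a data audit counts as timely if performed
    # within three months (approximated here as 90 days) of the end of the
    # poor performance period.
    return poor_period_end <= audit_date <= poor_period_end + timedelta(days=90)
```

For example, a laboratory with two failed QBs would be flagged, and a data audit conducted five months after the poor performance period would not count as timely.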
We noted that EMSL significantly improved its data auditing of
poor performing organic laboratories between fiscal years 1988
and 1990. During this time period, EMSL improved from not
performing any timely data audits in fiscal 1988 to performing
timely data audits on 83 percent of the laboratories.
Discussion with EMSL personnel disclosed that nonattainment of
their goals was attributed to funding limitations. In addition
to that cause, we also believe that the lack of a QA/QC data base on
individual laboratory performance precluded EMSL from
specifically targeting poor performing laboratories for
additional data audits. Additional discussion of this area is
presented in Chapter 3 of this report.
PERFORMANCE EVALUATION (PE) STUDIES
We found that EMSL generally conducted preaward PEs and routine
QBs in an acceptable manner. Exceptions, however, were noted in
the performance of remedial PEs on laboratories that failed their
routine QBs and for QBs that were to be sent to laboratories
during the first performance quarter under new contracts.
According to records maintained by EMSL and its support
contractor, remedial PEs were sent to contractors that failed a
QB only 20 percent of the time. In addition, routine QBs often
were not sent to contractors during the first quarter of new
contracts.
The importance of performing PEs was addressed in EMSL's FY 1989
Annual Summary as follows:
Since performance on a PE sample is considered to be an
indication of how well a laboratory is capable of performing
under the best of circumstances, poor performance on a
routine quarterly blind is a strong indication of technical
problems.
OERR's "Guidelines for Effective Management of the CLP" provides
that following a laboratory's failure of a QB, a remedial PE
sample should be sent to the laboratory. In addition, when a
laboratory fails a QB, it is to be placed on "Hold" status.
During this time, the laboratory should not receive samples until
the PO deems they have successfully corrected their problems and
passed a remedial PE sample. A November 1986 memorandum from the
NPO stated similarly:
The project officer should place a lab on hold if it fails
the quarterly performance evaluation test and require it to
analyze a second, different test sample. If the lab
performs unacceptably on the second test sample, the project
officer should recommend contractual action.
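The QB, hold status, and remedial PE sequence in the guidance quoted above amounts to a small decision rule, sketched below. This is purely illustrative; the function name, inputs, and action strings are our assumptions, not the NPO's or EMSL's actual procedure.

```python
from typing import Optional

def next_action(passed_qb: bool, passed_remedial_pe: Optional[bool] = None) -> str:
    # passed_remedial_pe stays None until a remedial PE sample has been
    # analyzed and scored.
    if passed_qb:
        return "continue receiving samples"
    if passed_remedial_pe is None:
        # Failed a routine QB: hold status plus a remedial PE sample.
        return "place on hold and send remedial PE sample"
    if passed_remedial_pe:
        return "release hold and resume samples"
    # Failed the remedial PE as well.
    return "recommend contractual action"
```

Under this rule, a laboratory that fails a QB never receives samples again until it passes the remedial PE, which is the control the report found was applied in only a minority of cases.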
Types of problems identified could include a laboratory's failure
to identify what analytes EMSL spiked the sample with, reporting
false positives, contamination, and systematic calculation or
quantification errors. In addition, CLP contracts stipulate that
EMSL will conduct these postaward PEs quarterly on each
laboratory. Discussions of EMSL's performance of PE studies are
provided in the following paragraphs.
Remedial PEs not done on poor performing laboratories. We found
EMSL sent remedial PEs in only 10 of 49 (20 percent) instances of
failed QBs for laboratories in our sample. We expanded our
analysis to include all laboratories in the program and found
that the rate was only slightly better. For all laboratories
with failed QBs, only 31 percent were sent remedial PEs. The
condition existed for all three years of our review, and was
almost evenly distributed between organic and inorganic
laboratories.
EMSL attributed this condition to their reliance on the NPO to
request remedial PEs. EMSL also indicated that by the time the
remedial score was obtained, the scores for the next QB would be
available. As a result, EMSL considered it more effective to
wait for the next QB score rather than conduct the remedial test.
This approach is inconsistent with the criteria for PE studies,
and in our opinion, results in an inconsistent policy toward poor
performing laboratories. By not conducting remedial PEs on poor
performing laboratories, effective monitoring of contractor
adherence to protocols is diminished. Whether a laboratory
passes or fails a QB, they are required to pass the next QB.
However, when a laboratory fails a QB, the additional requirement
of passing a remedial PE sample has been clearly established in
the program. This additional program control is an effective
QA/QC tool which should have been used by the program.
QBs not always performed on new contracts. EMSL did not always
send the first quarter QBs to laboratories under new contracts.
For the laboratories included in our sample, we found that first
quarter QBs were not sent to the contractors in 8 out of 19 (42
percent) instances. The CLP Statement of Work (SOW) for Organics
Analysis, dated February 1988, Exhibit E Section V, states that a
PE sample set "will be sent to a participating laboratory on a
quarterly basis to verify the laboratory's continuing ability to
produce acceptable analytical results." The absence of such QBs
creates gaps in historical performance data on those laboratories
that did not conduct the QB analysis.
EMSL stated that they depend on the NPO to inform them which
laboratories should be sent a QB sample. According to EMSL, when
the NPO deems new labs to be in the "introductory stages" of the
Program, the NPO essentially exempts the laboratory from this
monitoring activity. Considering the SOW provisions specifying
that the contractor is required to pass QBs, and the increased
vulnerability associated with performance under new contract
awards, it is our opinion that first quarter QBs should be
performed on all contracts.
ON-SITE AUDITS
We concluded that EMSL was not conducting preaward and postaward
on-site audits in a manner that focused such audits on poor
performing laboratories. EMSL conducts on-site audits to
determine if a laboratory has the equipment, personnel and
procedures in place to meet the CLP contractual requirements.
EMSL's QAPP and OERR's CLP guidance identify the performance of
on-site audits as an important QA/QC activity in management of
the CLP.
Preaward on-site audits, together with the preaward PE sample
results, are critical factors in the determination of contract
awards. OERR's "Guidelines for Effective Management of the CLP-
Part II" states:
To become part of the CLP, laboratories must meet stringent
requirements and standards for equipment, personnel,
laboratory practices, analytical operations, and quality
control operations... Before a contract is awarded, low
priced bidders must successfully analyze PE samples and pass
a preaward laboratory audit.
Preaward audits are also used to make determinations on a
laboratory's analytical capacity. Government contract guidelines
also attest to the importance of confirming a contractor's
ability, past as well as future, to perform prior to the award of
the contract. Under the CLP program, preaward on-site audits are
a valuable tool in this evaluation process.
The postaward on-site audit, according to OERR guidance, is used
to monitor a laboratory's ability to meet all terms and
conditions in the contract; identify laboratory problems; and
verify the adequacy and maintenance of instrumentation and the
continuity of personnel, as required by the contract. CLP
guidance provides for the incorporation of findings from other
audits (QBs, data, tape, Regional reviews, CCS) to detect and
identify problems and ensure their resolution. If serious
problems are encountered, recommendations should be made to the
NPO for appropriate actions.
Preaward on-site audits. OERR CLP guidance requires that before
a CLP contract is awarded, the low bidders must "successfully
pass a preaward laboratory audit." This OERR requirement appears
to apply to all CLP contract awards as it does not differentiate
between new contractors and repeat contractors in the program.
This is also consistent with the Federal Acquisition Regulations
and EPA's contract guidance which require the verification of a
contractor's ability to perform prior to the award of contracts.
In the case of the CLP program, repeat contractors have a
performance history from prior or existing contracts that should
be considered in assessing a bidder's ability to perform. Poor
performing laboratories and new laboratories to the CLP, in our
opinion, are more susceptible to errors which would increase the
CLP program's vulnerability. Accordingly, we determined whether
EMSL conducted preaward on-site audits on poor performing and new
laboratories in our sample.
We found that EMSL did not conduct preaward on-site audits on
poor performing laboratories (i.e. those with poor performance
results on QBs, data and/or tape audits) in three out of eight
instances. While evidence indicated prior performance problems
with these laboratories, the Agency awarded the contracts to
those laboratories without performing an on-site audit. Further,
we noted that two of the three laboratories continued to
experience poor performance after contract award. Both of these
laboratories had failed QBs or had other performance problems
identified prior to the award of subsequent CLP contracts.
According to EMSL personnel, the procurement office does not
involve EMSL in the decisions to perform preaward on-site audits.
As such, information available on prior contract performance is
generally not considered in the award of new contracts. It is
our opinion that the above contract examples demonstrate that
EMSL should be actively involved in the decision to perform
preaward on-site audits. Since EMSL actively participates in the
evaluation of preaward PE samples, they are aware of potential
contract awards. With this knowledge, they should proactively
advise the procurement office when potential contractors have a
history of performance problems so that preaward on-site audits
can be conducted.
Postaward on-site audits. The goal of EMSL's on-site review
program is to conduct the visits on a regular basis "to address
the resolution of problems identified in the data audit process."
OERR's guidance for the CLP states on-sites are "conducted on a
regularly scheduled basis or at a frequency dictated by a
laboratory's performance and sample load." To accomplish this,
EMSL tasked its support contractor to develop a list of
recommended on-site audits based on data audits, tape audits,
regional audits, CCS reports, exception/trend reports and QB
performance results. While the system of recommending on-site
audits is an improvement over past performance, we found that the
current practices could be improved. In this regard, EMSL's
system does not consider the laboratory's complete performance
history. Also, the ranking process does not include
consideration of voids in audit coverage, such as no history of
GC/MS tape audits. Finally, EMSL, during the period October 1987
to March 1989, had a goal of performing an annual on-site audit
for each laboratory.
While program goals emphasize performance of postaward on-site
audits on poor performing laboratories, we concluded that EMSL
had not been fully successful in this aspect of its QA/QC
program. By not performing postaward on-site audits, laboratory
operating deficiencies can go undetected for extended periods of
time without timely corrective action. On-site audits assist
program managers in the performance of their contract monitoring
responsibilities.
Since existing program guidance did not include a clear
definition of "poor performance", we treated the following as
indications of a need for an on-site audit: (i) two instances of
failed QBs; or (ii) poor results on data or GC/MS tape audits.
Using these criteria, we found that EMSL
adequately performed on-site audits on inorganic laboratories
with poor performance. However, on-site audits on poor
performing organic contractors were only accomplished 50 percent
of the time. In addition, with regard to EMSL's goal of annual
on-site audits on each laboratory, EMSL and its support
contractor indicated that they had not been able to meet this
goal. Discussions with EMSL personnel indicated that they were
unable to perform the expected level of on-site audits because of
limited resources.
In addition to the resource limitation issue, it appears that
other factors may have impacted on EMSL's ability to target poor
performing laboratories for on-site audits. One such factor
could be that EMSL had not instituted tracking procedures or
systems to provide management with historical performance
information by laboratory. (See Chapter 3 for a discussion of
this issue.) Without a system for identifying poor performing
laboratories, EMSL was unable to assure direction of its
resources to the laboratories in most need of review. In
addition, we also noted that EMSL had not updated its QAPP and
QAPjPs, as discussed in Chapter 4 of this report. The QAPP and
QAPjP provide program personnel with overall direction for the
quality assurance program and aid in consistency of performance.
AGENCY COMMENTS
ORD comments on the above audit finding are summarized as
follows:
Each year, EMSL-Las Vegas works with the NPO to rank QA
activities within the needs of the program and the
anticipated level of funding. Levels of funding are usually
sufficient for satisfactory coverage of base operations.
However, unplanned special projects, including support for
high priority OIG activities, consume resources targeted to
satisfy the base program. EMSL has been forced to remain
flexible to support these projects and has had to use
remaining resources for higher priority activities
negotiated with the program.
OIG EVALUATION
ORD's response did not indicate any disagreement with the factual
accuracy of the audit finding. They did, however, provide
commentary on one of the potential causes for non-attainment of
QA/QC performance goals. In this regard, EMSL management was
responsible for providing earlier notification to CLP program
officials of the limitations being experienced in accomplishing
the expected levels of QA/QC audit coverage. Based on ORD's
comments, it appears that these events will be appropriately
considered in the future.
RECOMMENDATIONS
We recommend the Assistant Administrator for Research and
Development direct that the Director, EMSL:
1. Initiate controls to assure that all laboratories receive at
least one GC/MS tape audit per year.
2. Establish clear goals for performance of QA/QC audits on each
laboratory. These goals should include an emphasis on conducting
timely audits on poor performing laboratories.
3. Update the EMSL CLP QAPP and associated QAPjPs to reflect
revised procedures and controls. In this regard, EMSL should
prepare a QAPjP for the GC/MS tape auditing program.
4. Establish procedures to assure that:
a. Poor performing laboratories are subjected to increased
QA/QC audit emphasis following identified periods of poor
performance;
b. Remedial PEs are processed timely on laboratories that
fail QBs;
c. First quarter QBs are sent to all new contract
laboratories for analysis; and
d. Poor performing laboratories are identified to
procurement officials prior to the award of any new CLP
contracts.
AGENCY COMMENTS
ORD's comments to the above recommendations are summarized below
and referenced to the corresponding recommendation number above.
1. ORD will work with the NPO to establish necessary controls
and to assure that appropriate resources are available for
accomplishing this goal.
2. The decision to emphasize audits on poor performing
laboratories rests with the NPO. ORD will work with the NPO to
clearly identify program needs for audits and to develop
alternative mechanisms to handle unscheduled support requiring
program resources.
3. We recognize the deficiencies in existing documentation, some
of which are not under our immediate control. We will inform the
NPO of these deficiencies and will work with them to update and
improve all appropriate CLP quality assurance documentation.
4.a. The decision to emphasize audits on poor performing
laboratories rests with the NPO. We will bring this
recommendation to their attention and work with them to modify
procedures and to identify resources to achieve this goal.
4.b. & 4.c. The decision to schedule and initiate remedial PEs
and QBs rests with the NPO. We will bring this recommendation to
their attention and work with them to establish controls to
satisfy this goal.
4.d. The Laboratory Performance Database system will allow users
to better identify poor performing laboratories. We will bring
this recommendation to the attention of the Contracting Officer
and the NPO to satisfy this goal.
OIG EVALUATION
Our evaluation of ORD's responses to the recommendations is
provided below and referenced to the recommendation number.
1. ORD's response to this audit recommendation appears adequate.
However, ORD should be cautious of changing NPO QA/QC goals in
implementing their own work plan priorities.
2. It was noted during the review that EMSL's work plans
emphasized increased audit coverage on poor performing
laboratories. As such, it remains our opinion that EMSL should
address the proactive application of its QA resources to
identified poor performing laboratories instead of deferring
responsibility to the NPO. Our recommendation remains as stated
in the draft audit report.
3. While ORD responded positively to this recommendation, the
response was again directed at correcting the deficiency through
the NPO. The audit recommendation was directed at the published
EMSL QAPP and QAPjPs. It remains our opinion that EMSL's QAPP
and QAPjPs should be updated in accordance with Agency guidance
as soon as possible.
4.a. ORD responded in a generally positive manner to the
recommendation relating to increased QA/QC audit emphasis on poor
performing laboratories. While a coordinated effort is needed to
assure accomplishment of program objectives, EMSL should strive
to direct its resources to those laboratories with identified
performance problems. This type of directed effort could result
in improved assurance of high quality data from the contracted
laboratory analyses.
4.b. & 4.c. We agree that certain activities require action by
the NPO such as ordering remedial PEs. However, it remains our
opinion that EMSL should assume responsibility to proactively
notify program officials to assure sufficient QA audit coverage
on poor performing laboratories.
4.d. The intent of the recommendation, regarding notifying
procurement officials on poor performing laboratories prior to
award, was that EMSL, as the primary repository of CLP QA/QC
data, take a proactive role in advising procurement officials.
In this regard, we recommended that laboratories with poor
performance histories be specifically identified to procurement
officials, prior to contract award, so that procurement officials
can properly consider the performance history of the contractor
during the contract award process. Our recommendation remains as
stated.
CHAPTER 3
TRACKING PROCEDURES AND SYSTEM FOR QA/QC AUDIT RESULTS NEEDED
EMSL has not established effective tracking procedures and
systems for evaluating QA/QC historical audit performance by
laboratory. This condition diminished EMSL's ability to monitor
individual laboratory QA/QC performance trends for the purpose of
identifying poor performing laboratories. As a result, we
believe that additional QA/QC audits and any needed
administrative or contractual actions that should have been taken
against laboratories were significantly delayed or not taken.
This condition was first identified to EMSL management in 1983;
however, it remained uncorrected through fiscal 1991. At that
time EMSL initiated work on developing a system; however, the
system is not projected for full implementation until early
fiscal 1995. EMSL's QA/QC efforts have emphasized developing and
monitoring of contractual protocols and methodologies, tracking
of overall program performance and some comparisons between
laboratories. The delay in establishing a laboratory based
tracking system for QA/QC audit results may also be attributable
to the fact that such a system was never a priority of the NPO.
It is our opinion that the timely development of an integrated
tracking system would be very beneficial in improving the overall
quality of CLP analytical data.
Background
EMSL's quality assurance mission for the CLP consists of
independently evaluating approximately 60 contract laboratories,
the analytical methods they use, and the resulting environmental
data. Completing this mission entails a variety of processes
including auditing hardcopy data packages, auditing electronic
raw data, inspecting contract laboratory facilities pre- and
postaward, evaluating laboratory performance in the analysis of pre-
award and routine performance evaluation samples, developing data
review procedures, and developing data bases that make available
the results of analyses of environmental and quality control
samples. Each of these QA activities provides a picture of a
laboratory's performance at a point in time, each reporting
somewhat different performance data. The audits on each
laboratory are conducted and reported at varying intervals
throughout the year; some quarterly, some annually, others at no
set frequency. Managing and interpreting this mass of
information is complex. The manual generation and maintenance of
reported performance results by laboratory makes it very
difficult to comprehend trends and overall laboratory quality and
performance.
As a part of its program, EMSL is the recipient of a variety of
interrelated quality control information. This complex array of
information was identified by EMSL as being used to:
- determine the quality of data produced in the CLP;
- determine the performance of each participating
laboratory;
- identify noncompliant laboratories and appropriate
corrective actions; and
- determine appropriate improvements to analytical methods
and quality control procedures.
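The laboratory-based integration of these information sources, which this chapter concludes was missing, could be sketched as a minimal in-memory tracker. This is purely illustrative; the class name, audit-type labels, and the two-failure threshold are our assumptions, not the design of EMSL's later Laboratory Performance Database.

```python
from collections import defaultdict

class LabPerformanceTracker:
    """Minimal sketch of a laboratory-based QA/QC results data base."""

    def __init__(self):
        # lab_id -> audit_type -> list of {"period", "outcome"} records
        self.results = defaultdict(lambda: defaultdict(list))

    def record(self, lab_id, audit_type, period, outcome):
        # audit_type might be "qb", "data_audit", "tape_audit", or "on_site";
        # outcome is "pass" or "fail".
        self.results[lab_id][audit_type].append(
            {"period": period, "outcome": outcome}
        )

    def failures(self, lab_id, audit_type):
        return [r for r in self.results[lab_id][audit_type]
                if r["outcome"] == "fail"]

    def poor_performers(self):
        # Flag laboratories with two or more failures in any one audit type.
        return sorted(
            lab_id
            for lab_id, by_type in self.results.items()
            if any(len([r for r in recs if r["outcome"] == "fail"]) >= 2
                   for recs in by_type.values())
        )
```

Even a structure this simple would let program officials query one laboratory's history across audit types, which is exactly the trend visibility the manual records could not provide.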
OERR's publication titled, "Guidelines for Effective Management
of the CLP, Part Two" states, "The purpose of ...QA audits ...is
to monitor performance of the laboratories so problems can be
identified and corrected in order to maintain the integrity of
sample analysis." Thus, the effective monitoring of each lab's
performance is a prerequisite for maintaining the integrity of
the CLP's sample analyses. Without effective contract
performance monitoring, assurance that CLP protocols are
followed does not exist.
EMSL'S 1984 QUALITY ASSURANCE PROGRAM PLAN
In 1984, EMSL prepared a QAPP for the CLP program. The QAPP laid
the foundation for EMSL's QA support efforts and included
comments on the need for a data base of actual laboratory
performance to aid program management. The QAPP described the
QA Data Base as:
The focal point for QA/QC data storage... From the Data
Base, control charts can be produced and relative laboratory
performance can be evaluated. Performance trends and
defects can then be monitored within a given laboratory or
between laboratories. Performance criteria (e.g.,
acceptance windows for matrix spike recoveries) can be
evaluated and updated. The data base ties together the
various aspects of laboratory performance monitoring - data
audits, on-site evaluations, and performance evaluations.
The importance of a comprehensive and cohesive data base of QA/QC
performance data was also discussed in a 1983 peer review of
EMSL's organic analysis QA programs. The peer review report
cited several potential benefits from a computerized data
management system as well as several desirable features for the
system. The peer review concluded that EMSL needed to become
smarter at data management. At the time of the peer review, the
CLP was processing about 26,000 samples per year, one-fourth of
current CLP volume.
DEVELOPMENT OF A DATA BASE POSTPONED
EMSL has attempted, on several occasions over the years, to
accumulate performance information on the CLP program. While we
recognize EMSL's efforts in this area, none of its information
systems was directed at compiling a laboratory-based system of
QA/QC results. This, however, changed on September 18, 1989, when
EMSL proposed the establishment of the "Environmental Data
Integration System" (EDIS). EDIS was proposed as a software
system that would integrate the diverse laboratory information
sources to produce comprehensive automated reports quickly and
efficiently. The proposal stated:
Indications obtained from electronic raw data analysis are
that as much as 15% of data being generated in sample
analysis is suspected of being manipulated or otherwise
unusable; this is resulting in $9 million spent on
questionable data in FY 1990 alone (and rising to 12.5
Million by FY1996)...if EDIS is developed and implemented,
the amount and cost of manipulated data will be dramatically
reduced.
The proposal went on to state that:
The existing system of information integration is
continuously improvised, informal, and almost completely
manual. Such a system is too inconsistent and time
consuming to ensure continued adequate enforcement of
contractual specifications for the rapidly growing CLP.
Substantial, even severe problems can be occasionally
overlooked or intentionally concealed by negligent
laboratories, resulting in environmental data bases
contaminated with data of suspect quality.
EMSL personnel advised us that the 1989 proposal was never funded
by the Agency.
In its Annual Summary Report for FY 1989, prepared for and sent
to the NPO, EMSL stated it "remains committed to supporting the
OERR with a proactive research program to improve quality control
and QA for data generated in response to Agency regulatory
needs." Yet, in their own words, the absence of the data base
was a significant vulnerability precluding a proactive stance.
However, up to the time of our fieldwork, there was no evidence
that EMSL adequately identified the vulnerability to its ORD
superiors or to the NPO.
In 1990, EMSL again identified the need for a CLP data base using
various QA performance parameters. In an April 17, 1990
memorandum, the proposed system was identified as the "Laboratory
Performance Database" (LPD). In 1991, EMSL was able to reprogram
some of its existing funds to begin development of the LPD. The
stated goal for the system was "to be able to monitor laboratory
performance of a given laboratory while looking at all relevant
pieces of information." It was expected that the system would
"save resources by early prevention of laboratory problems as
well as streamlining and coordinating our own audit systems."
However, work on this project appears to be progressing slowly;
EMSL's planned full implementation is not expected until October
1, 1995. The fiscal 1991 funding level of $283,500 appears
inadequate when compared to the LPD's total estimated cost of
$2.5 million for the system. Thus, it appears to us that EMSL
management needs to place additional emphasis on this area and
solicit the funds needed to complete the system in a timely
manner.
PROGRAM IMPACTS
Delays in implementation of a comprehensive QA/QC laboratory
based tracking system could seriously impede the implementation
of the Superfund clean-up program as a result of questions being
raised on the overall quality of CLP program analyses. In our
opinion, the timely implementation of a tracking system could
benefit the CLP program as discussed below.
Problem laboratories identified timely and audited. EMSL's
tracking and monitoring of laboratory performance would be
beneficial for the timely identification of poor performing
laboratories and in identifying trends so that additional
emphasis can be directed in those areas in most need. As
discussed in Chapter 2, we noted that EMSL was not conducting
remedial or follow-up audits on poor performing laboratories.
Also, we found no evidence that EMSL was targeting poor
performing laboratories for additional audits. The timely
conduct of audits is necessary to ensure that significant
problems do not continue and, if they do continue, that Agency
decision-makers are apprised of the fact. We also believe that
the lack of a data base has contributed to the Agency being in a
reactive rather than proactive position when addressing
laboratories with performance problems. The fact that over 20
CLP laboratories have been under scrutiny for their improper
practices by the OIG Office of Investigations could be considered
a result of EMSL's lack of adequate monitoring systems.
Needed audit coverage identified. We found that EMSL did not
attain its goals for required GC/MS tape audits during fiscal
years 1988 through 1990. During that period, 71 percent of the
required annual tape audits were not conducted. Further, we
found that about 26 percent of the organic laboratories had not
had tape audits conducted on any of their analyses. A
comprehensive QA/QC data base, which includes results by
laboratory, would assist EMSL and program management in
identifying those QA/QC audit areas in need of additional
coverage.
Administrative and contractual actions on poor performing
laboratories. We found that the Agency, during fiscal years 1988
to 1990, often did not take appropriate actions when poor
performance was identified. For the 16 laboratories we reviewed,
82 percent of the instances of poor performance did not have
timely and appropriate corrective actions taken. A data base
system would provide the Agency with information on laboratories
that need to take timely corrective actions for identified
operating deficiencies. A comprehensive data base would also aid
the Agency in its pre-award identification of poor performing
laboratories. In this regard, we found that in 6 out of 6
instances, awards were made to laboratories with documented poor
QA/QC performance. Further, four of the six laboratories
continued to experience problems after contract award.
Regional data validation efforts could benefit. Regional CLP
data reviewers should be considering laboratory performance
history when determining what level of validation is appropriate
on data packages. The importance of "laboratory considerations"
(laboratory performance) on Regional data review levels was
highlighted in an August 18, 1988 OSWER Directive:
The level of data review required may vary across projects
and within projects based on the decisions to which the data
will be applied, site characteristics, laboratory
considerations and the nature of the data itself.
Laboratory considerations should encompass, in our opinion, a
laboratory's propensity for errors or irregularities. It is
reasonable to expect that regional data reviewers would exercise
greater care during data reviews on laboratories with documented
performance deficiencies. The maintenance of a QA/QC data base
by laboratory would, in our opinion, be beneficial to regional
data reviewers by providing them with comprehensive information
on laboratory performance on QA/QC tests.
AGENCY EMPHASIS NOT ON INDIVIDUAL LABORATORY PERFORMANCE
One reason a data base of laboratory performance was not
aggressively pursued may be the Agency's emphasis on having EMSL
track overall CLP program performance for the purpose of updating
contractual protocols. This activity is
critical to the Agency's preparation of Invitations for Bids
(IFBs) to award CLP contracts and maintain overall CLP
available capacity. According to various personnel in the QA/QC
program at EMSL and at the NPO, the CLP program has been in a
constant state of evolution. Until recently, tracking and
updating contractual protocols and methodologies has been a prime
objective.
CONCLUSION
The need for a data base for QA/QC performance results, by
laboratory, has been documented for several years. EMSL
recognized that the lack of such a data base has "forced QA
support in a reactive, rather than the desirable proactive role".
However, EMSL's efforts to develop and implement the data base
have not been effective. EMSL's 1989 proposal came six years
after the deficiency was first reported. With the reprogramming
of fiscal 1991 operating funds, it does not appear that the
funding and work efforts will be sufficient for timely
implementation of the full data base system. Further, the data
base does not appear to be intended for use by POs who need this
information for effective contract monitoring and enforcement.
NPO emphasis to date has been on tracking overall program
performance and not individual laboratory performance. This
appears to have contributed to an attitude that individual
contract enforcement will not be aggressively pursued. We
believe that this may have also contributed to delays in
implementing the data base.
We believe that EMSL should expedite implementation of the data
base to facilitate tracking and monitoring of laboratory QA/QC
performance as well as providing timely and comprehensive
information to program managers. Without the data base, EMSL and
the program's QA efforts will be hampered.
AGENCY COMMENTS
ORD comments on the above audit finding are summarized as
follows:
EMSL-Las Vegas has considerable experience developing
computerized systems for reviewing large databases, like
that of the CLP. An initial version of the LPD will be
operational this year. To minimize uncertainties in its
development, a peer review is scheduled later this year to
evaluate the science and the likelihood of success of the
system.
Using LPD as the primary mechanism, EMSL-Las Vegas and the
NPO will better identify poor performance and will generate
appropriate historical trend reports that satisfy the needs
of the program.
OIG EVALUATION
The development and implementation of the LPD system will provide
both EMSL-Las Vegas and other CLP program personnel with
historical laboratory performance information. Since the system
was still in the design and implementation stage, we did not
attempt to evaluate the overall effectiveness of the system. We
commend ORD's plan to perform a peer review of the LPD system.
If properly performed, the peer review should provide EMSL-Las
Vegas, ORD and the NPO with an independent evaluation of the
system's ability to provide needed information to its intended
users.
RECOMMENDATIONS
We recommend that the Assistant Administrator for Research and
Development require that the Director, EMSL:
1. Immediately pursue additional funding for the development of
the LPD System to minimize delays in its completion and
implementation.
2. Develop interim procedures to compile laboratory QA/QC
histories for the purpose of identifying QA/QC audit coverage
deficiencies so that resources can be directed at those
laboratories with significant gaps in coverage.
3. Initiate a design review of the LPD system to determine
whether the current system is sufficient to provide comprehensive
historical performance data, by laboratory, needed by EMSL to
fulfill its QA/QC responsibilities.
4. Coordinate the development of the LPD system with NPO
management so as to prevent the possible duplication of like data
base systems.
AGENCY COMMENTS
ORD's comments in response to the above recommendations are
summarized below and referenced to the numbered recommendations.
1. An initial version of the LPD is scheduled for completion
this year. A scientific peer review of the project has been
scheduled for this year that will assist us in defining
additional resource needs.
2. Procedures currently exist that can be modified to provide
the information requested. We will bring this recommendation to
the attention of the NPO and work with them to modify existing
procedures to achieve this goal.
3. A scientific peer review of the LPD has been scheduled for
this year and will satisfy the recommendation.
4. The NPO has been integrally involved in the planning of the
LPD system. We will continue to seek out their needs as the
system implementation continues.
OIG EVALUATION
Our evaluation of ORD's responses to the recommendations is
provided below and referenced to the recommendation number.
1. ORD's comments appear responsive to the recommendation.
2. The response did not identify what procedures existed which
could be modified to provide laboratory performance histories.
As such, we are continuing to recommend that interim procedures
be established to compile laboratory performance histories. This
data should be used to direct EMSL's QA/QC audit coverage at
those laboratories with significant gaps in audit coverage.
3. The scientific peer review of the LPD should satisfy the
intent of the recommendation assuming that the peer review
includes an evaluation of program needs for QA/QC results.
4. ORD's response indicated general agreement with the
recommendation and assured continued coordination between NPO and
ORD during the development of the LPD. ORD's response, however,
did not identify how ORD was coordinating this effort. Our
recommendation remains as stated.
CHAPTER 4
EMSL'S REPORTING SYSTEMS NEED IMPROVEMENT
A number of weaknesses in EMSL's reporting of laboratory contract
noncompliance and operating deficiencies were found. Because of
the reporting weaknesses, contract compliance deficiencies were
not always reported, nor were laboratories with recurring
deficiencies highlighted. In addition, EMSL does not have
adequate systems for rating the overall performance of
laboratories. Consequently, the NPO was unable to take timely
and appropriate actions against contractors. We attribute this
condition to a need for increased management oversight over
reporting of CLP QA/QC results. An additional contributing cause
may be the NPO's emphasis on working with CLP contractors
instead of holding them responsible for contract
performance. Incomplete reporting systems provide opportunities
for poor performing laboratories to continue noncompliant
activities and make it difficult for program managers to fulfill
their responsibilities to monitor and take appropriate actions.
Improvement in EMSL's reporting systems could significantly help
the Agency in its administration of CLP contracts. This in turn
would lead to increased assurance that the Superfund Program is
producing analytical data of known and acceptable quality. We
are therefore recommending that EMSL management take several
actions to improve its policies and procedures relating to CLP
contract reporting.
Background
Assuring that contractors adequately perform the specified
analytical protocols and produce data of known and acceptable
quality "requires timely, consistent, and thorough contract
management by CLP experts, and constant contract monitoring for
technical performance by Agency-wide experts." This is
accomplished by the performance of various QA/QC audits of CLP
contractors or their products. EMSL contributes to the QA
oversight of the CLP program contract laboratories.
The results of EMSL's QA/QC audits are intended to assist POs in
their monitoring and enforcing data integrity, laboratory
performance and compliance with contract requirements. As a
result, the QA/QC audits conducted by EMSL must be thorough and
consistent in their evaluation. The timely and comprehensive
reporting of the QA/QC results to the NPO, and other responsible
offices, is critical if effective contract monitoring is to take
place. EMSL's clients, the Regional TPOs, the Administrative POs
in the NPO, and the CO, should be relying on these QA/QC
evaluations in monitoring and assessing each laboratory's
performance.
In reviewing the effectiveness of EMSL's reporting of QA/QC
results, it is important to evaluate it in light of established
program internal controls. The concept of internal controls has
been defined by the General Accounting Office (GAO) as:
The plan of organization and methods and procedures adopted
by management to ensure that resource use is consistent with
laws, regulations, and policies; that resources are
safeguarded against waste, loss, and misuse; and that
reliable data are obtained, maintained and fairly disclosed
in reports.
The GAO definition of internal controls applies to EMSL's role in
the CLP program. In addition, these controls have been
incorporated, to a degree, in OERR's QAPP and EMSL's QAPjPs.
These guidance documents identify a reporting responsibility
associated with various QA/QC activities. For example, the OERR
QAPP requires that EMSL prepare detailed reports on the results
of data audits and provide these reports to the NPO and Regional
clients. It is important to remember that EMSL's QA/QC audits
are not intended to make specific determinations about the
usability of the data. Usability determinations are made by the
technical data reviewers in the Regional offices. However, the
QAPP and QAPjPs indicate that the intent of the program is to
assure data integrity. As such, EMSL's reporting systems must
able to provide accurate, consistent and timely information to
program personnel if the QA/QC program is to be successful.
VIOLATIONS OF CONTRACT DELIVERABLE REQUIREMENTS NOT REPORTED
During our review, we found that some CLP laboratories were not
submitting contract required deliverables to EMSL in a timely or
complete manner. In addition, there was evidence to indicate
that EMSL was not receiving some contract deliverables from
certain laboratories. While the CLP contracts clearly delineated
the submission requirements, EMSL was not reporting these
noncompliant laboratories to the NPO or the CO. EMSL's reporting
deficiencies pertained to two different contract deliverables
during fiscal years 1988-1990. Areas of contract noncompliance
are discussed below.
GC/MS Tape Requests. CLP organics contracts require that
laboratories submit GC/MS data tape files to EMSL within 7 days
of receipt of request. Our review found that about one half of
all data tape requests, during the period covered by our review,
either received no response or were responded to late. Of 155
data tape requests, 8 received no response from the laboratories
and 70 were responded to late, with delays ranging from 6 to 52
days. EMSL, as a general rule, did not report the condition to
the NPO prior to the fourth quarter of fiscal 1990. Beginning in
the fourth quarter, EMSL usually notified the NPO and TPOs, by
copies of the letters to the laboratories, that the tapes had
been requested. When a laboratory did not respond to EMSL's
request, a second request letter was sent to the laboratory with
a copy to the NPO. In our opinion, the use of courtesy copies as
a form of notification to the NPO and TPOs has some value, but it
does not fulfill EMSL's responsibility to notify appropriate
officials of these noncompliant contractors. Not reporting
instances such as these to the NPO, TPOs and possibly the CO,
increases the program's vulnerability to errors or irregularities
occurring and going undetected.
Standard Operating Procedures (SOPs) and Quality Assurance Plans
(QAPs). CLP contract solicitations require that laboratories
submit their preaward SOPs and QAPs to EMSL for review and
approval. In addition, after contract award the laboratory is
required to submit updated SOPs within 45 days. This is an
important contract deliverable designed to assure that the
laboratory has good, working, acceptable laboratory procedures.
However, we found that EMSL did not have procedures for reporting
noncompliance with this contract requirement. The importance of
such reporting is illustrated by an event that occurred in
January 1991.
In January 1991, the NPO requested that EMSL advise them of the
status of submissions of SOPs and QAPs associated with the
September 1990 contract solicitations that 15 laboratories had
responded to. EMSL reported the status of these contract
deliverables as follows:
- 3 laboratories did not submit any SOPs and QAPs;
- 6 laboratories submitted incomplete SOPs and QAPs; and
- 4 laboratories did not submit updated SOPs and QAPs.
Thus, 13 laboratories (or 87 percent) did not comply with the
solicitation and contract provisions.
Because laboratory non-compliance with QA deliverable
requirements is not regularly reported, POs are not informed
timely of performance problems and trends in contract non-compliance. In
addition, appropriate administrative or contractual actions to
bring a laboratory into compliance cannot be taken. The
conditions are attributable to a need for increased EMSL
management planning and oversight regarding its reporting of QA
deficiencies. This oversight should be incorporated into EMSL's
CLP QAPP and related QAPjPs which are required under Office of
Research and Development guidelines on preparing QAPPs. In
addition, we believe that a lack of emphasis on some contract
compliance issues contributed to EMSL not sensing the importance
of reporting contractor non-compliance with deliverable
requirements.
RECURRING DEFICIENCIES NOT REPORTED
EMSL's reporting practices did not differentiate between new and
recurring laboratory deficiencies. EMSL's report formats for
data, GC/MS tape and on-site audits did not require auditors to
report recurring problems. While EMSL's SOPs for on-site, tape
and organic data audits require that the auditors follow-up on
prior problems, there was no reporting of prior problems to
facilitate compliance with the SOP requirements. We did note
that EMSL reported organic QB results which included prior
scores. However, prior failures or problems were not highlighted
in the report. Inorganic QB reporting of historical performance
was not being done at the time of our review.
During the period of our review, we compiled information on
individual laboratory performance which indicated that poor
performance trends were evident from EMSL's records. EMSL audit
reports, however, did not highlight the repeat problems nor did
they indicate their significance over time. Discussions with
EMSL and program office personnel indicated that several factors
may have influenced the reporting of laboratory performance. Of
particular interest were comments relating to the NPO's
preference that EPA work with CLP laboratories instead of holding
them strictly responsible for contract performance. It is our
opinion that EMSL management had a responsibility to report trend
information which could impact on the overall quality of the
program. Because this information was not brought to the
attention of program management, the Agency was precluded from
taking appropriate and timely action against poor performing
laboratories.
QA AUDIT RATING SYSTEMS
During the period we reviewed, EMSL either lacked systems for
measuring a laboratory's QA/QC overall performance for on-site,
data and GC/MS tape audits, and QBs, or the systems used were not
consistently applied. Consistent and accurate rating of QA
results to the NPO and TPOs provide program officials with the
information needed to identify performance trends and areas of
weakness requiring attention. Areas of concern identified during
our review are discussed below.
On-site audits. EMSL did not utilize any system for providing an
overall rating on laboratories during either preaward or
postaward on-site audits. EMSL only reported individual
contractual deficiencies. We believe that the usefulness of the
on-site audits would be enhanced by a system that rated the
overall performance of the laboratories. Such a system could
assist the NPO and TPO in the performance of their contract
monitoring duties and aid in overall program management. Overall
performance ratings could assist both the NPO and the TPOs in
readily assessing the degree to which a laboratory's overall
performance warrants attention. It could increase the ability to
compare performance results from period to period or between
laboratories. Finally, performance trends would be easier to
identify. EMSL personnel stated that one reason for not
providing overall ratings for on-site audits is that each
laboratory's problems are unique and it would be difficult to
develop such a system. While EMSL's concerns are noted, we
believe that the potential benefits from such a system warrant
consideration by EMSL.
Data and GC/MS tape audits. EMSL's system for rating (scoring)
data and GC/MS tape audits has changed frequently over the last
three years. This makes performance trend analyses and
comparisons of historical performance almost impossible.
EMSL's system has evolved from only presenting a narrative
listing of problems, to the present system of numerical scores
along with a narrative description of defects. In the early part
of the period included in our review, EMSL was only tabulating
the number of contractual deficiencies and providing narrative
descriptions. Later, EMSL began scoring each laboratory's
overall data audits on a scale of "1 to 5". This overall
numerical score was developed in a manner that the score
considered a laboratory's performance in relation to other
laboratories' performance. This system of scoring "on the curve"
produced a "relative" score rather than reflecting how well each
laboratory performed on its own. In the third quarter of fiscal
1989 EMSL developed another scoring system in which each data
package for a laboratory was measured on the number of major
defects. This scoring system reported a "Data Quality" score on
each laboratory's own data package or GC/MS tape.
We were informed by EMSL's support contractor that the scoring
system for data and GC/MS tape audits was changed again in the
second half of fiscal 1991. Thus, in a period of about three
years, four different scoring systems for data and tape audits
have been in effect. While the current system may provide the
NPO and TPOs with clear opinions on the acceptability of
laboratory products, the changes in scoring systems make it
difficult to identify trends in performance or to compare current
and prior results. It is our opinion that EMSL should use
consistent numerical scoring systems for data and GC/MS tape
audits to facilitate trend analysis and evaluation of contractor
performance.
Quarterly Blind (QB) scores. EMSL's QAPjP required that
laboratories submitting late QB data packages sustain a one
percent penalty, per day late, in their performance score.
During the period of our review, we noted that EMSL's QB scores
did not reflect the late submission penalties. By not penalizing
the performance score for late submission, EMSL's QB scores
result in inequitable treatment of those laboratories submitting
their data analysis in accordance with contract criteria. In
essence, noncompliant laboratories are given an unfair advantage
when allowed extra time to submit their analysis. Considering
the high percentage of QB data packages that are received late by
EMSL, it is our opinion that the QAPjP penalty should be
reflected in the performance scores.
CONCLUSIONS
In our opinion, the reporting system weaknesses discussed in this
finding represent a vulnerability to the CLP program.
Ineffective reporting systems provide opportunities for poor
performing laboratories to continue noncompliant activities and
make it difficult for program managers to fulfill their
responsibilities to monitor laboratory performance and take
appropriate action when needed. We attribute the weaknesses to a
number of factors including: (i) a need for increased management
planning and oversight; (ii) a need to prepare and update QAPPs
and QAPjPs; (iii) an emphasis on development and tracking of
analytical protocols rather than QA/QC activities; and (iv) the
NPO encouraging the Agency to work with CLP laboratories instead
of holding them responsible for contract performance.
It is our opinion that improvements in EMSL's reporting systems
could significantly improve the Agency administration of CLP
contracts. This in turn would lead to increased assurance that
the Superfund Program is producing analytical data of known and
acceptable quality. We are therefore recommending that EMSL
management take several actions to improve its operating policies
and reporting procedures.
AGENCY COMMENTS
ORD provided the following summary comments on the above audit
finding.
It is important to recognize that EMSL-Las Vegas does not
have sole responsibility for the CLP QA program. CLP
quality assurance activities have always been a shared
responsibility between the NPO, EMSL-Las Vegas, and the
Regions, with each component pursuing its individual role
and responsibilities.
OIG EVALUATION
ORD's comments to the above audit finding were not fully
responsive to the issues discussed. Evidence contained in EMSL-
Las Vegas' records clearly showed instances of laboratory non-
compliance with contractual requirements. These matters should
have been reported to appropriate program and contracts officials
so that actions could be taken against the laboratory. There was
no evidence that EMSL pursued the reporting of these instances
with recommendations for appropriate actions. We consider it to
be a fundamental responsibility of management to report instances
of contract non-compliance to appropriate officials for action.
As such, we are continuing to recommend that EMSL-Las Vegas take
necessary action to establish reporting systems for those areas
within their purview.
RECOMMENDATIONS
We recommend that Assistant Administrator for Research and
Development assure that the Director, EMSL:
1. Develop procedures for the identification and reporting of
laboratory non-compliance with contract deliverables to the NPO,
TPOs and CO. Deliverables to be considered in the reporting
requirements should include, as a minimum, the submission of
GC/MS tapes, and SOPs/QAPs.
2. Establish procedures and controls to ensure repeat problems
in audits are reported to the NPO and TPOs timely. In this
regard, the Director should be aware of the need for a tracking
system which can provide the information needed to identify
repeat performance problems.
3. Establish procedures and controls whereby consistent and
accurate rating systems are used for on-site, data and GC/MS tape
audits.
4. Update EMSL's CLP QAPP and related QAPjPs to reflect the
revised procedures and controls instituted by EMSL. (See Chapter
2 for related discussions on the QAPP and QAPjPs.)
AGENCY COMMENTS
ORD's comments in response to the above recommendations are
summarized below and referenced to the numbered recommendations.
1. Procedures currently exist that can be modified to provide
the information requested. We will bring this recommendation to
the attention of the NPO, TPOs and CO and work with them to
modify existing procedures to achieve this goal.
2. The LPD system will allow users to identify habitual
problems. We will bring this recommendation to the attention of
the NPO and TPOs and work with them to establish procedures and
controls to satisfy this goal.
3. The LPD will archive all QA/QC data and will allow side-by-
side comparisons of rating systems. While scoring systems will
evolve as our understanding of data trends matures, the LPD will
provide more consistency and accuracy to rating systems. We will
work with the NPO to satisfy this goal.
4. We recognize the deficiencies of existing documentation, some
of which are not under our immediate control. We will inform the
NPO of these and work with them to update and improve all
appropriate CLP quality assurance documentation.
OIG EVALUATION
Our evaluation of ORD's responses to the recommendations is
provided below and referenced to the recommendation number.
1. ORD's response indicated that procedures existed which, when
modified, would be able to provide the desired information to
appropriate parties. While the response indicates agreement with
the recommendation, no formalized procedures were provided on
which we could evaluate the adequacy of the response. As such,
we are continuing to recommend that EMSL establish specific
written procedures for the reporting of contract non-compliance
with deliverable requirements.
2. ORD's response is generally responsive to the audit
recommendation, assuming that the LPD database contains
sufficient historical performance information to make repeat
problem identification possible.
38
-------
Audit Report No. E1SKFO-09-0137-2100624
3. We agree that the LPD will provide the program with more
accessible historical rating information. The
intent of the recommendation, however, was to bring to
management's attention the history of changes in rating methods,
and, in the case of on-site audits, the lack of a rating system,
which make a historical performance comparison difficult if not
impossible. While we recognize the need for some program
changes, we believe that some stability in rating methodologies
would greatly benefit the program.
4. ORD's comments on advising the NPO of existing documentation
deficiencies are not fully responsive to our recommendation.
Based on EMSL's prior preparation of a QAPP and QAPjPs, the CLP
QA/QC program can benefit greatly by maintaining these documents
on a current basis and preparing new QAPjPs as appropriate. We
continue to recommend that ORD examine this area to determine
whether QAPjPs should be prepared as discussed in the finding and
related recommendation.
39
-------
Audit Report No. E1SKFO-09-0137-2100624
APPENDIX I
Page 1 of 2
ABBREVIATIONS
ACS - American Chemical Society
AOB - Analytical Operations Branch (synonymous with NPO)
APO - Administrative Project Officer
CCS - Contract Compliance Screening
CLP - Contract Laboratory Program
CMD - Contracts Management Division
CO - Contracting Officer
EDIS - Environmental Data Integration System
EMSL - Environmental Monitoring Services Laboratory located
at Las Vegas
GC/MS - Gas Chromatography/Mass Spectrometry
IFB - Invitation for Bid
LESC - Lockheed Engineering and Sciences Company
LPD - Laboratory Performance Database
MSR - Management Systems Review
NCP - National Contingency Plan
NEIC - National Enforcement and Investigation Center
NPO - National Program Office (synonymous with AOB)
OERR - Office of Emergency and Remedial Response
ORD - Office of Research and Development
OSWER - Office of Solid Waste and Emergency Response
PE - Performance Evaluation
40
-------
Audit Report No. E1SKFO-09-0137-2100624
APPENDIX I
Page 2 of 2
ABBREVIATIONS
PO - Project Officer
QA - Quality Assurance
QA/QC - Quality Assurance/Quality Control
QAP - Quality Assurance Plan
QAPjP - Quality Assurance Project Plan
QAPP - Quality Assurance Program Plan
QB - Quarterly Blind
QC - Quality Control
RAS - Routine Analytical Services
RTP - Research Triangle Park
SAS - Special Analytical Services
SMO - Sample Management Office
SOP - Standard Operating Procedure
TPO - Technical Project Officer (formerly DPO)
41
-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
APPENDIX II
Page 1 of 12
MAY 08 1992
OFFICE OF
RESEARCH AND DEVELOPMENT
MEMORANDUM
SUBJECT: ORD Comments on Draft Audit Report E1SKFO-09-0137
"Review of EMSL's CLP Quality Assurance/Quality
Control Program"
FROM: Erich W. Bretthauer
Assistant Administrator
for Research and Development (RD-672)
TO: Kenneth A. Konz
Assistant Inspector General for Audits (A-109)
In accordance with EPA Directive 2750, I am providing you
with this written response to the subject audit report which was
dated March 4, 1992. This memorandum includes a summary of the
Office of Research and Development's (ORD) evaluation of the
report and a discussion of the report's findings and
recommendations. It is my understanding that the EMSL-Las Vegas
audit is one portion of a larger audit of the entire Superfund
Contract Laboratory Program (CLP), and that this report is the
first of a number of documents that will result from these
audits. I am requesting that a final exit interview occur
between the audit team, laboratory staff and the Director of
our Office of Modeling, Monitoring Systems and Quality Assurance
(OMMSQA) to resolve any remaining differences relative to the
report. I understand that this response will be included in your
final report.
Summary
ORD has been a participant in the CLP for over a decade,
primarily through the activities of EMSL-Las Vegas. Due to the
evolution of the CLP and its continuously changing participants,
it was both timely and appropriate that an audit be conducted by
an independent, unbiased group of professionals. I am satisfied
that many of the recommendations in the audit report are both
42
Printed on Recycled Paper
-------
APPENDIX II
Page 2 of 12
reasonable and achievable. In fact, the most complex technical
recommendation, the development of an automated database for
evaluation of laboratory performance trends, is well on its way
towards a successful completion.
EMSL-Las Vegas fulfills a key role within the CLP by
monitoring analytical method performance, providing an unbiased
quality assurance data review, and resolving unsettled technical
issues. These are functions that are consistent with the overall
mission of ORD. In fact, this application of unique ORD
expertise and its research products by the CLP gives meaning and
credibility to our technical efforts.
The CLP quality assurance program has always been a shared
responsibility between the National Program Office, EMSL-Las
Vegas, the Regions, and the contract management organizations
within the Office of Administration, with each component pursuing
its individual role and responsibilities. This reality, that
EMSL-Las Vegas is not the entire CLP quality assurance program,
is recognized within the Executive Summary of the report when it
is stated, "The responsibility to initiate actions against
laboratories rests with EPA's program and contracts offices."
However, that conclusion is not noted within the main report.
Instead, EMSL-Las Vegas is described as not initiating or
completing certain actions, such as laboratory audits, when the
responsibility for the action resides with another organization.
The report concludes that EMSL-Las Vegas was not conducting
preaward and postaward on-site audits in a manner that focused
such audits on poor performing laboratories (On-Site Audits, page
14). EMSL-Las Vegas is a participant in these on-site audits,
along with the appropriate Region and the NEIC, and serves as a
technical advisor. The audits are conducted by and for the
Contracting Officer or their designee, usually the Regional
Technical Project Officer. As a participant, EMSL-Las Vegas
reviews the laboratory's technical capabilities and documents the
findings to the responsible official. Any conclusions that are
made from these audits are made by the Contracting Officer or
designee. The actual scheduling and ranking of audits is the
responsibility of the Contracting Officer, the Regional Technical
Project Officer, or the National Program Office. It is clear
that we have not effectively communicated and documented ORD's
roles and responsibilities in the CLP to the IG. EMSL-Las Vegas
will work with the National Program Office to satisfy this need.
43
-------
APPENDIX II
Page 3 of 12
Each year, EMSL-Las Vegas works with the National Program
Office to develop its plan for quality assurance activities and
considers the needs of the program and the anticipated level of
funding. Levels of funding are designed to cover normal
operations. Historically, however, special projects arise
throughout the year that require resolution, thereby consuming
resources targeted to satisfy the base program. These changes in
priorities are directed by the National Program Office. Some of
these involve data audits, tape audits and document reviews
carried out by EMSL-Las Vegas staff in support of high priority
audits for the Office of Inspector General. Clearly, alternative
strategies must be developed to handle special projects, so that
the greater program does not suffer from the lack of resources.
EMSL-Las Vegas will work with the National Program Office to
clearly identify the resource levels that will assure that
critical activities are completed.
A 1983 peer review recommended the development of an
automated database to facilitate the review and interpretation of
CLP quality assurance data. The draft audit report concludes,
"EMSL has not established effective tracking procedures and
systems for evaluating QA/QC historical audit performance by
laboratory . . . This condition was first identified to EMSL
management in 1983, however, it remained uncorrected through
fiscal 1991." EMSL-Las Vegas has developed effective automated
tracking systems that responded to the recommendation of the 1983
peer review. In fact, the ADROIT system was developed and
implemented prior to the 1983 review but was never presented at
that early peer review. Another automated tracking system, the
CARD database, was developed and transferred to the National
Program Office in 1987. A third automated system, CADRE, has
recently been completed and is being transferred to the Regions.
Finally, it was with full support of the National Program Office
and the ORD Research Committee that EMSL-Las Vegas delayed or
canceled other ORD Superfund research projects in 1990 in order
to fund the implementation of the latest, state-of-the-art
automated database system, the Laboratory Performance Database
(LPD). I appreciate the support given to the LPD in the report
for resolving current perceived problems in CLP, but we must be
cautious regarding our expectations for LPD's ability to address
all the tracking and trend analysis concerns.
To enhance the likelihood of success of the LPD, I have
scheduled a peer review of the LPD to be held later this year.
This review by a group of preeminent, unbiased scientists will go
far in clarifying the full scope of the system and in addressing
44
-------
appendix II
Page 4 of 12
the most appropriate focus for additional resources. EMSL-Las
Vegas is prepared to take whatever steps are necessary to achieve
a successful outcome to the LPD. Once again, I believe that it
would be fruitful if the auditors and EMSL-Las Vegas staff meet
to address and correct any misunderstandings concerning the
history of these types of automated tracking efforts.
Finally, there seems to be some confusion concerning the
Quality Assurance Program Plan (QAPP). Data Quality Objectives
are a user-driven function, with the quality of the data
dependent on the intended use of the data. It is incumbent on
the user, in this case the National Program Office, to develop a
QAPP that outlines the overall QA program framework. We
recognize that the OERR QA Program Plan has not been fully
integrated into the EMSL-Las Vegas QA program. We will work with
OERR to define the QA program requirements to meet the overall
program management responsibilities.
DISCUSSION OF DRAFT REPORT EXECUTIVE SUMMARY
The executive summary of the draft report identifies three
areas of deficiency in EMSL-Las Vegas' quality assurance support:
1) QA/QC audit coverage needs increased attention; 2) tracking
procedures and systems for QA/QC audit results are needed; and 3)
reporting systems need improvement. A synopsis of our response
follows.
A. Finding: EMSL-Las Vegas did not conduct a sufficient number
of audits to meet stated goals. Frequent and
timely audits help identify performance
deficiencies for corrective action which
positively impacts assurance that data can be of
known and acceptable quality.
Response:
Each year, EMSL-Las Vegas works with the National Program
Office to rank the quality assurance activities within the needs
of the program and the anticipated level of funding. Levels of
funding are usually sufficient for satisfactory coverage of base
operations. However, unplanned special projects, including
support for high priority audits for the Office of Inspector
General, surface throughout the year, thereby consuming resources
targeted to satisfy the base program. EMSL-Las Vegas has been
forced to remain flexible in supporting unanticipated projects,
and has had to use the remaining resources for the higher
priority activities negotiated with the program.
45
-------
APPENDIX II
Page 5 of 12
Recommendation:
1. Establish clear goals for performing QA/QC audits on
each laboratory in the program, with increased emphasis
on poor performing laboratories.
Response:
EMSL-Las Vegas will work with the National Program Office to
develop a mutually supported process for targeting QA/QC audits.
This process will consider laboratory performance, routine
laboratory frequency goals and critical external needs, such as
requests from the Inspector General.
B. Finding: EMSL-Las Vegas is not positioned to provide
effective historical tracking of laboratory QA
performance which results in a diminished ability
to identify poor performing laboratories.
Response:
EMSL-Las Vegas has considerable experience developing
computerized systems for reviewing large databases, like that of
the CLP. An initial version of the Laboratory Performance
Database will be operational this year. To minimize
uncertainties in its development, a peer review is scheduled
later this year to evaluate the science and the likelihood of
success of the system.
Using LPD as the primary mechanism, EMSL-Las Vegas and the
National Program Office will better identify poor performance and
will generate appropriate historical trend reports that satisfy
the needs of the program. As the database grows with time,
definitions and trend reports are expected to evolve as the
understanding of the process matures.
Recommendation:
2. Institute a laboratory based tracking system to
accumulate historical QA/QC performance data so that
QA/QC efforts can be directed at potentially vulnerable
areas.
Response:
The Laboratory Performance Database system is being
developed, with an initial version scheduled for completion this
year. A scientific peer review is scheduled for this year to
evaluate the system.
46
-------
APPENDIX II
Page 6 of 12
Recommendation:
3. Develop and implement procedures for identifying and
reporting repeat deficiencies and laboratory non-
compliance for contract deliverables to the appropriate
CLP officials.
Response:
The Laboratory Performance Database system will allow users
to better identify and report repeat deficiencies. EMSL-Las
Vegas will work with the National Program Office to better
identify and improve reporting procedures to ensure that
responses are both timely and thorough.
C. Finding: EMSL-Las Vegas' reporting systems need
improvement.
Response:
It is important to recognize that EMSL-Las Vegas does not
have sole responsibility for the CLP quality assurance program.
The CLP quality assurance activities have always been a shared
responsibility between the National Program Office, EMSL-Las
Vegas, and the Regions, with each component pursuing its
individual role and responsibilities.
Recommendation:
4. Update EMSL-Las Vegas' CLP Quality Assurance Program
Plan to reflect program changes since its issuance.
Related to this area, EMSL-Las Vegas' Quality Assurance
Project Plans should be updated, or created as needed, to
reflect current program requirements.
Response:
Preparation of a Quality Assurance Program Plan is the joint
responsibility of all the affected organizations. EMSL-Las Vegas
recognizes the limitations to existing documentation, and will
work with the National Program Office to update and improve all
CLP quality assurance documentation.
47
-------
APPENDIX II
Page 7 of 12
I have attached a detailed response to the recommendations
that were contained in the body of the draft report. As you will
note, resolution of many of the recommendations will require
discussions with the National Program Office. This draft report
has identified a number of issues that require better
documentation and communication.
If you have any questions on our response, please contact
H. Matthew Bills, Acting Director, Office of Modeling, Monitoring
Systems and Quality Assurance (OMMSQA) at 202-260-5767.
Attachment
48
-------
APPENDIX II
Page 8 of 12
ATTACHMENT
DISCUSSION OF DRAFT REPORT TEXT
CHAPTER 2: EMSL'S QA/QC AUDIT COVERAGE NEEDS INCREASED ATTENTION
Recommendations:
1. Initiate controls to assure that all laboratories receive at
least one GC/MS tape audit per year.
Response:
We will work with the National Program Office to establish
necessary controls and to assure that appropriate resources are
available for accomplishing this goal.
2. Establish clear goals for performance of QA/QC audits on
each laboratory. These goals should include an emphasis on
conducting timely audits on poor performing laboratories.
Response:
The decision to emphasize audits on poor performing
laboratories rests with the National Program Office. We will
work with the National Program Office to clearly identify program
needs for audits and to develop alternative mechanisms to handle
unscheduled support requiring program resources.
3. Update the EMSL CLP QAPP and associated QAPjPs to reflect
revised procedures and controls. In this regard, EMSL
should prepare a QAPjP for the GC/MS tape auditing program.
Response:
We recognize the deficiencies in existing documentation,
some of which are not under our immediate control. We will inform
the National Program Office of these deficiencies and will work
with them to update and improve all appropriate CLP quality
assurance documentation.
4. Establish procedures to assure that poor performing
laboratories are subjected to increased QA/QC audit emphasis
following identified periods of poor performance.
49
-------
APPENDIX II
Page 9 of 12
Response:
The decision to emphasize audits on poor performing
laboratories rests with the National Program Office. We will
bring this recommendation to their attention and work with them
to modify procedures and to identify resources to achieve this
goal.
5. Establish controls to assure that remedial PEs are processed
timely on laboratories that fail QBs.
Response:
The decision to schedule and initiate remedial PEs rests
with the National Program Office. We will bring this
recommendation to their attention and work with them to establish
controls to satisfy this goal.
6. Establish procedures to assure that first quarter QBs are
sent to new contract laboratories for analysis.
Response:
The decision to schedule and initiate QBs rests with the
National Program Office. We will bring this recommendation to
their attention and work with them to establish procedures to
satisfy this goal.
7. Establish procedures to assure that poor performing
laboratories are identified to procurement officials prior
to the award of new CLP contracts.
Response:
The Laboratory Performance Database system will allow users
to better identify poor performing laboratories. We will bring
this recommendation to the attention of the Contract Officer and
the National Program Office to satisfy this goal.
50
-------
APPENDIX II
Page 10 of 12
CHAPTER 3 TRACKING PROCEDURES AND SYSTEMS FOR QA/QC AUDIT RESULTS
NEEDED
Recommendations:
1. Immediately pursue additional funding for the development of
the Laboratory Performance Database (LPD) System to minimize
delays in completion and implementation.
Response:
An initial version of the LPD is scheduled for completion
this year. A scientific peer review of the project has been
scheduled for this year that will assist us in defining
additional resource needs.
2. Develop interim procedures to compile laboratory QA/QC
histories for the purpose of identifying QA/QC audit
coverage deficiencies so that resources can be directed at
those laboratories with significant gaps in coverage.
Response:
Procedures currently exist that can be modified to provide
the information requested. We will bring this recommendation to
the attention of the National Program Office and work with them
to modify existing procedures to achieve this goal.
3. Initiate a design review of the LPD system to determine
whether the current system is sufficient to provide
comprehensive historical performance data, by laboratory,
needed by EMSL to fulfill its QA/QC responsibilities.
Response:
A scientific peer review of the LPD system has been
scheduled for this year and will satisfy the requirements of this
recommendation.
4. Coordinate the development of the LPD system with NPO
management so as to prevent the possible duplication of like
data base systems.
Response:
The National Program Office has been integrally involved in
the planning of the LPD system. We will continue to seek out
their needs as the system implementation continues.
51
-------
APPENDIX II
Page 11 of 12
CHAPTER 4 EMSL'S REPORTING SYSTEMS NEED IMPROVEMENT
Recommendations:
1. Develop procedures for the identification and reporting of
laboratory non-compliance with contract deliverables to the
NPO, TPOs and CO. Deliverables to be considered in the
reporting requirements should include, as a minimum, the
submission of GC/MS tapes, and SOPs/QA project plans.
Response:
Procedures currently exist that can be modified to provide
the information requested. We will bring this recommendation to
the attention of the NPO, TPOs, and CO and work with them to
modify existing procedures to achieve this goal.
2. Establish procedures and controls to ensure repeat problems
in audits are reported to the NPO and TPOs timely. In this
regard, the Director should be aware of the need for a
tracking system which can provide the information needed to
identify repeat performance problems.
Response:
The Laboratory Performance Database system will allow users
to identify habitual problems. We will bring this recommendation
to the attention of the National Program Office and the TPOs and
work with them to establish procedures and controls to satisfy
this goal.
3. Establish procedures and controls whereby consistent and
accurate rating systems are used for on-site, data and GC/MS
tape audits.
Response:
The Laboratory Performance Database will archive all QA/QC
data and will allow side-by-side comparisons of rating systems.
While scoring systems will evolve as our understanding of data
trends matures, the LPD will provide more consistency and
accuracy to rating systems. We will work with the National
Program Office to develop appropriate procedures and controls to
satisfy this goal.
52
-------
APPENDIX II
Page 12 of 12
4. Update EMSL's CLP QAPP and related QAPjPs to reflect the
revised procedures and controls instituted by EMSL.
Response:
We recognize the deficiencies of existing documentation,
some of which are not under our immediate control. We will
inform the National Program Office of these deficiencies and will
work with them to update and improve all appropriate CLP quality
assurance documentation.
53
-------
Audit Report No. E1SKFO-09-0137-2100624
APPENDIX III
REPORT DISTRIBUTION
Recipient
Action Official
Assistant Administrator for
Research and Development (RD-672)
Office of the Inspector General
Inspector General (A-109)
Headquarters Offices
Agency Followup Official (PM-225), Attn: Director,
Resource Management Division
Agency Followup Official (PM-208)
Office of Congressional Liaison (A-103)
Office of Public Affairs (A-107)
54
-------