OFFICE OF INSPECTOR GENERAL
Catalyst for Improving the Environment
Evaluation Report
Promising Techniques Identified to
Improve Drinking Water Laboratory
Integrity and Reduce Public Health Risks
Report No. 2006-P-00036
September 21, 2006

-------
Report Contributors:
Jill Ferguson
Jeff Fencil
Ira Brass
Dan Engelberg
Anthony Chirigotis
Abbreviations
DoD	Department of Defense
EPA	U.S. Environmental Protection Agency
NELAC	National Environmental Laboratory Accreditation Conference
OCEFT	Office of Criminal Enforcement, Forensics, and Training
OEI	Office of Environmental Information
OGWDW	Office of Ground Water and Drinking Water
OIG	Office of Inspector General
PT	Proficiency Testing
QA/QC	Quality Assurance/Quality Control
SOP	Standard Operating Procedures
Cover photo: Microbiological analysis of a water sample (EPA New England Regional
Laboratory photo).

-------
U.S. Environmental Protection Agency
Office of Inspector General
At a Glance
2006-P-00036
September 21, 2006
Catalyst for Improving the Environment
Why We Did This Review
Between Fiscal Years 2000
and 2003, our Office of
Investigations laboratory fraud
unit saw an increase in cases.
If drinking water samples are not
appropriately analyzed, the risk of
public exposure to harmful
contaminants increases. We conducted
this review to identify
vulnerabilities in the drinking
water sample analysis process
and promising techniques to
improve laboratory integrity.
Background
The Safe Drinking Water Act
of 1974 provides that a
laboratory must obtain
approval by the U.S.
Environmental Protection
Agency (EPA) or a State
before analyzing public
drinking water samples for
compliance with health-based
standards. EPA certification
and National Environmental
Laboratory Accreditation
Conference accreditation
programs provide oversight of
drinking water laboratories.
For further information,
contact our Office of
Congressional and Public
Liaison at (202) 566-2391.
To view the full report,
click on the following link:
www.epa.gov/oig/reports/2006/20060921-2006-P-00036.pdf
Promising Techniques Identified to
Improve Drinking Water Laboratory Integrity
and Reduce Public Health Risks
What We Found
Within the drinking water sample analysis process, we identified hundreds of
vulnerabilities that EPA's oversight process does not address. These vulnerabilities can
compromise the integrity of the analysis process and the quality of data produced.
Many of these vulnerabilities were identified by the Office of Inspector General in
1999 and by the Agency's own review in 2002, yet the Agency has taken no action.
Moreover, States that have implemented new techniques to detect laboratory
integrity problems have found additional deficiencies, inappropriate procedures,
and even cases of fraud. Their findings and those of our own investigators show
integrity can be, and has been, compromised. However, without any national
studies of water quality data that include examining the integrity of laboratories,
the full extent of the problem remains unassessed.
Through our work with States, laboratory organizations, and other Federal
agencies, we identified promising techniques to help improve oversight and
protect against inappropriate procedures and fraud in the drinking water analysis
process. This report contains details on those promising techniques.
What We Recommend
Given the potential impact of poor quality data on human health, we recommend
that EPA assess drinking water laboratory integrity and incorporate promising
techniques to better identify inappropriate procedures and fraud into the
laboratory oversight process. Our specific recommendations include reforms to
laboratory oversight processes, policy, guidance, and training. In addition, the
Office of Ground Water and Drinking Water should improve awareness of the
vulnerabilities and realities of fraud and inappropriate procedures affecting
drinking water data quality. The Office of Environmental Information should
develop a mechanism to identify, and a policy to address, data in Agency
databases from laboratories under investigation, indictment, and/or conviction.
EPA suggested modifications to several of our recommendations, preferring to
encourage rather than require the use of promising techniques. We made changes
where appropriate.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
INSPECTOR GENERAL
September 21, 2006
MEMORANDUM
SUBJECT:	Promising Techniques Identified to Improve Drinking Water
Laboratory Integrity and Reduce Public Health Risks
Report No. 2006-P-00036
TO:	Benjamin Grumbles
Assistant Administrator for Water
Linda A. Travers
Acting Assistant Administrator for Environmental Information
and Chief Information Officer
This is our report on the subject evaluation conducted by the Office of Inspector General (OIG)
of the U.S. Environmental Protection Agency (EPA). This report contains findings that describe
the problems the OIG has identified and corrective actions the OIG recommends. This report
represents the opinion of the OIG and does not necessarily represent the final EPA position.
Final determinations on matters in this report will be made by EPA managers in accordance with
established resolution procedures.
The estimated cost of this report - calculated by multiplying the project's staff days by the
applicable daily full cost billing rates in effect at the time - is $764,803.
Action Required
In accordance with EPA Manual 2750, you are required to provide a written response to this
report within 90 calendar days. You should include corrective action plans for agreed upon
actions, including milestone dates. We have no objections to further release of this report to the
public. The report will be available at http://www.epa.gov/oig.
If you or your staff have any questions regarding this report, please contact me at (202) 566-0847
or roderick.bill@epa.gov, or Dan Engelberg, Product Line Director for Water Issues, at
(202) 566-0830 or engelberg.dan@epa.gov.
Sincerely,
Bill Roderick
Acting Inspector General

-------
Promising Techniques Identified to Improve Drinking Water Laboratory Integrity
and Reduce Public Health Risks
Table of Contents
Chapters
1	Introduction		1
Purpose		1
Background		1
Scope and Methodology		7
2	Vulnerabilities Compromise Laboratory and Data Integrity,
Increase Public Risk 		8
Sample Analysis Process		8
Multiple Vulnerabilities Identified		9
Expert Panel Members Collectively Identify Hundreds
of Vulnerabilities		11
EPA Procedures to Address Data for Detected Instances of
Inappropriate Procedures and Fraud Limited		12
Various Factors Contribute to Vulnerabilities		13
Vulnerabilities Hinder Ability to Ensure Safe Drinking Water		15
3	Opportunities Exist to Provide Additional Protection Against
Inappropriate and Fraudulent Procedures		16
Promising Techniques Identified to Better Protect Against
Inappropriate and Fraudulent Procedures		16
Techniques Not Required by OGWDW Implemented by Others		17
4	Conclusions and Recommendations		21
Promote Better Training and Use of Promising Techniques		21
Actively Discourage Fraud and Inappropriate Procedures		23
Reduce Risk to Agency Systems and Decision Making		23
Agency Comments and OIG Evaluation		24
Status of Recommendations and Potential Monetary Benefits		27
- continued -

-------
Promising Techniques Identified to Improve Drinking Water Laboratory Integrity
and Reduce Public Health Risks
Table of Contents (continued)
Appendices
A	OIG Laboratory Fraud Cases		29
B	Details on Scope and Methodology		31
C	Laboratory Problems Identified by OCEFT Workgroup		35
D	Vulnerabilities Identified by EPA Team		37
E	Vulnerabilities Identified by Expert Panel		39
F	Promising Techniques		48
G	Office of Water Response		55
H	Office of Environmental Information Response		68
I	Distribution		71

-------
Chapter 1
Introduction
Purpose
The safety of America's drinking water rests on a system of standards,
monitoring, and compliance determinations. To ensure health-based standards are
met, the U.S. Environmental Protection Agency (EPA), under the Safe Drinking
Water Act, requires periodic sampling and analysis of drinking water provided by
public water systems. Testing laboratories play a critical role, alerting water
system managers when health-based standards are not met and providing
information for making public health decisions. Accurate and reliable data from
certified drinking water laboratories are also needed to report EPA performance
information to Congress and the public.
We conducted this evaluation to identify:
•	Vulnerabilities in the drinking water sample analysis process,
•	Techniques to mitigate those vulnerabilities, and
•	Opportunities to further safeguard human health.
Background
Laboratory integrity is crucial to EPA's strategy for providing the public with safe
drinking water. Public water systems test for over 80 contaminants1 on a periodic
basis. EPA has determined that these regulated contaminants can pose serious
health risks ranging from diarrheal episodes to nervous system, kidney, and liver
problems; an increased risk of cancer; and, in some cases, death. Every year,
through the use of Consumer Confidence Reports, water systems notify their
customers of the level of contaminants in drinking water. Between these reports,
if the level of contaminants exceeds health-based standards, a public notice is
issued to customers.
Public Health Considerations
False or inaccurate reporting of drinking water sample results could result in an
increased level of risk - in some extreme examples, where water is severely
contaminated, disability or death. EPA has not yet conducted a national review of
laboratory performance or analyzed State data on laboratory deficiencies to
determine the extent to which public health may or may not be at risk from
1 Primary drinking water regulations include tests for 7 Microorganisms, 4 Disinfection By-products, 3 Disinfectants,
16 Inorganic Chemicals, 53 Organic Chemicals, and 4 Radionuclides. A list of contaminants regulated, maximum
contaminant levels, and potential health effects is at http://www.epa.gov/safewater/mcl.html#mcls.
1

-------
inappropriate or fraudulent laboratory procedures. Investigations by Federal and
State agents have turned up a number of cases of inappropriate procedures and
fraud that have the potential to affect human health. Because of the complexities
of evaluating the effects of data manipulations and falsifications that occurred in
several of the laboratory fraud cases, the EPA Office of Inspector General (OIG)
investigators have been unable to determine the actual health risks and magnitude
of population affected.
An OIG fraud case involving even a single laboratory can have a significant
impact, affecting more than a million people, as demonstrated by the example in
Figure 1.1.
Figure 1.1 - Widespread Potential Impact of an OIG Laboratory Fraud Case
City and County Residents
1.8 million people
School Districts and Individual Schools Served
129
Hospitals Served
12
Bottled Water Companies Served
104
Source: EPA OIG Office of Investigations analysis
Of interest in this example is the number of school districts and hospitals with
particularly vulnerable populations that could be affected. The completed OIG
investigation found machine calibrations associated with volatile and semi-
volatile organic analyses were altered. Several of the standards and samples had
surrogates2 manipulated. While the surrogates are not on the list of national
primary drinking water standards, their manipulation calls into question the
accuracy of the results of all the contaminants for which the surrogates are
monitors. Additional information on OIG laboratory fraud cases is in
Appendix A.
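To make concrete how surrogates function as a quality control check, and why manipulating them undermines confidence in the associated results, the following is a minimal sketch in Python; the surrogate name, spike amounts, and the 70 to 130 percent acceptance window are illustrative assumptions, not values from the investigation.

```python
# Hypothetical illustration: flag sample runs whose surrogate recovery falls
# outside an assumed laboratory acceptance window. All values are invented.

ACCEPT_LOW, ACCEPT_HIGH = 70.0, 130.0  # percent recovery; assumed limits

runs = [
    # (sample id, surrogate compound, amount spiked, amount measured)
    ("DW-1041", "surrogate-A", 10.0, 9.6),
    ("DW-1042", "surrogate-A", 10.0, 5.1),   # low recovery: possible method problem
    ("DW-1043", "surrogate-A", 10.0, 13.9),  # high recovery: possible "juicing"
]

for sample_id, surrogate, spiked, measured in runs:
    recovery = 100.0 * measured / spiked
    status = "OK" if ACCEPT_LOW <= recovery <= ACCEPT_HIGH else "FLAG for review"
    print(f"{sample_id} {surrogate}: {recovery:.0f}% recovery -> {status}")
```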
No waterborne disease outbreaks or documented cases of illness related to
drinking water have been directly tied to cases of inappropriate laboratory
procedures or fraud in the United States. However, according to the U.S. Centers
for Disease Control and Prevention, the incidents of illness related to drinking
water contaminants can go unreported. When outbreaks are reported,
epidemiological investigations generally do not include an assessment of
laboratory procedures or a review of the quality of laboratory data reported.
2 Surrogates are compounds added to each sample that monitor method performance with each sample. Typically
one surrogate is the indicator for 10-20 compounds in a given sample analysis. Some of the compounds are
regulated, some are not.
2

-------
EPA Roles in Drinking Water Laboratory Certification
The Safe Drinking Water Act requires all laboratories to obtain an EPA or State
certification before analyzing public drinking water samples. EPA suggests
certified laboratories: (1) analyze proficiency test samples, (2) use EPA-approved
analytical methods, and (3) successfully pass periodic on-site audits. Public water
systems as well as the general public served by these systems rely on these audits
and EPA's certification process to ensure drinking water quality information
provided by laboratories is reliable and accurate. The Safe Drinking Water Act
does not specify the nature of the audit, although EPA has developed audit
training and offers guidance through a laboratory certification manual. As
discussed in greater detail in Chapter 2, EPA has taken the audit requirements to
be a best case assessment of laboratories' capability to perform EPA methods, not
an assessment of actual performance by laboratories under day-to-day conditions.
Public water system regulations for those laboratories testing drinking water
supplies for lead contamination include an additional provision that the EPA
Administrator shall assure that programs for the certification of those testing
laboratories certify only laboratories that provide reliable, accurate testing.3
The certification program for all drinking water laboratories (those testing for lead
as well as other contaminants) is managed and operated from the Office of
Ground Water and Drinking Water (OGWDW) Technical Support Center in
Cincinnati, Ohio. A division of EPA's Office of Water, OGWDW oversees
certification activities in the EPA regions and is responsible for training
certification officers, providing guidance on laboratory certification, and
maintaining a database of laboratory IDs. Regional certification program
managers and regional certification authorities oversee State Principal
Laboratories and State certification programs. In addition:
•	OGWDW has also accepted National Environmental Laboratory
Accreditation Conference (NELAC) accreditation4 as an alternative to
laboratory certification. EPA region, State, and commercial labs that are
NELAC accredited are under the oversight of that program, which was
developed with the support of EPA's Office of Research and
Development. NELAC-accredited laboratories must still meet OGWDW
certification requirements.
•	The Office of Environmental Information (OEI) provides guidance and
training for use by the environmental laboratory community.
•	The Office of Criminal Enforcement, Forensics, and Training (OCEFT),
as part of the Office of Enforcement and Compliance Assurance, performs
enforcement actions through its Criminal Investigations Division.
3	Title 42, U.S. Code, Section 300j-26
4	Additional information on NELAC as well as the NELAC Standards document which describes the guidelines for
laboratory accreditation is available at www.epa.gov/nelac
3

-------
•	The OIG's Office of Investigations investigates fraud, waste, and abuse in
laboratories; and provides training in fraud detection when requested.
OIG laboratory fraud cases are initiated through referrals from laboratory
employees, or State and local inspectors.
State Roles Related to Public and Private Laboratories
It is the States, for the most part, that actually issue public and private laboratory
certification or accreditation status for the analysis of public drinking water
samples and have direct oversight responsibility for those laboratories. State
certification officers visit laboratories and provide information to the State
certification program manager, which is used to determine the certification status
of the laboratory. These individuals are encouraged to attend and pass the EPA
certification officers training course, but it is not required. The content of this
course and exam vary between microbiology and chemistry, but both are based on
EPA testing methods.
Guidance on methods to audit certified laboratories is provided to States via
OGWDW's Laboratory Certification Manual, but there are no legal requirements
on techniques to use. EPA relies on quality control measures built into drinking
water analytical methods, proficiency testing, certification and accreditation
audits, and any other measures implemented by States to control drinking water
laboratory integrity and data quality.
OGWDW has historically not offered a specific radiochemistry course or exam,
although it recently partnered with a provider of such training for a September
2006 course. Laboratories analyzing drinking water samples for radiochemical
contaminants exist in 20 States and are fewer in number than those analyzing
samples for microbiological or chemical contaminants. State radiochemistry
Certification Officers are currently encouraged to participate in the certification
training course for chemistry and pass the exam for inorganic chemistry.
Occurrence of Fraud and Inappropriate Procedures
Four key areas of concern in this report are inappropriate procedures, laboratory
fraud, data quality, and laboratory integrity, which we define as follows:
•	Inappropriate procedure: A scientifically unsound or technically
unjustified omission, manipulation, or alteration of procedures or data that
bypasses the required quality control parameters, making the results
appear acceptable.
•	Laboratory fraud: The deliberate falsification during reporting of
analytical and quality assurance results that failed method and contractual
requirements to make them appear to have passed requirements.
4

-------
•	Data quality: The degree of acceptability or utility of data for a
particular purpose - in this case, reporting public drinking water sample
information.
•	Laboratory integrity: The laboratory's meeting general standards of
objectivity, data quality, and ethical behavior, thus reporting accurate,
complete, and valid information.
Over the past 6 years, the number of laboratory fraud cases reported to the EPA
OIG Office of Investigations has increased steadily (Figure 1.2). Laboratories
responsible for analyzing public drinking water samples represented over
35 percent of the 44 OIG laboratory fraud cases in 2004 and just fewer than
30 percent of the 58 cases in 2005 (see Appendix A for additional details).
EPA Office of Water reports that there are approximately 6,000 laboratories
certified for drinking water. The total percentage of certified laboratories under
investigation or convicted of fraudulent procedures is currently unknown since no
national database tracks certified drinking water laboratories by these parameters.
Figure 1.2: Number of EPA OIG Laboratory Fraud Investigations
[Bar chart of laboratory fraud investigations by fiscal year, FY 2000 through FY 2005, showing a steady increase; legible data labels include 32 and 44 cases.]
Source: EPA OIG Office of Investigations
A 1999 OIG memo cited problems with laboratory data integrity and provided
suggestions for improvement. In 2001, the OIG issued an open letter to the
environmental analytical laboratory community to draw attention to inappropriate
laboratory procedures and fraud. In 2002, OCEFT issued a laboratory fraud
workgroup report with the Department of Justice acknowledging the severity of
problems in environmental laboratories. Although these reports do not explicitly
refer to laboratories analyzing drinking water samples, it is reasonable to conclude
that the problems cited in these documents would be applicable to all types of
laboratories, including those analyzing drinking water samples.
More Detailed Audits Look for and Find Problems
Arizona uses more advanced and aggressive techniques than the minimal EPA
requirements, and no other State has found the same magnitude of
5

-------
problems as this State. Arizona identified 20 cases of what OIG considers to be
severe inappropriate procedures, including fraud, following certification audits of
over 140 laboratories seeking certification from the State (about 1 in 7
laboratories). Arizona representatives also reported that of the six largest
laboratories operating in the State, five have gone out of business after incidents
involving falsification or inappropriate procedures.
In our evaluation, we did not find any evidence to suggest that the quality of
laboratories analyzing drinking water samples for residents of Arizona was any
different from that of laboratories in operation throughout the United States. In fact,
while Arizona auditors have found severe problems, including fraudulent
procedures during drinking water laboratory certification audits, other States,
performing EPA (OGWDW) certification audits as well as NELAC accreditation
audits at the same laboratories, have issued reports finding minimal deficiencies.
Figure 1.3 provides examples of what Arizona Certification Officers found at
laboratories both in-State and out-of-State, as well as both NELAC-accredited and
not NELAC-accredited. Of the 106 laboratories currently certified by Arizona to
test drinking water, 49 are located outside the State. Details on Arizona's audit
methods are in Chapter 3.
Figure 1.3 - Inappropriate or Fraudulent Procedures Found in Laboratories Seeking
Arizona Certification
•	Falsified drinking water reports
•	Contaminated samples not reported to water system operators
•	Data falsified to make it appear testing was done correctly
•	Testing falsified and, when discovered by lab director, not redone
•	Lab director tore up drinking water result showing coliform at request of system
operator and allowed the operator to submit new sample
•	Lab director substituted purified water for sample when sample lost and testing dates
falsified
•	Inconsistencies between lab records and results reported to Department of
Environmental Quality
•	No peer review of analytical/electronic data
•	30 percent of samples in one laboratory were not analyzed at all
•	Analyst told to fill in past calibration dates while Arizona auditors were on-site
•	Lab director altered time of analyses on data requested by Arizona auditors
•	Laboratory reported several analytical methods that were not actually used
Source: Arizona Department of Health Services, Office of Laboratory, Licensure,
Certification and Training
A survey of EPA regions and discussions with some State certification officers
suggest that those individuals believe fraud and inappropriate procedures occur
infrequently in drinking water laboratories and the impact is low (see OIG
Supplemental Regional Survey Report for further details). However, no national
studies have gauged the actual extent of laboratory fraud, although problems have
been documented and reported to EPA regions for several years. We cannot, with
any accuracy or reliability, quantify the extent to which this is a problem without
6

-------
the use of accepted techniques to identify both inappropriate procedures and
fraud. We do, however, note an association between identified cases of
inappropriate procedures and laboratory fraud, and the use of on-site auditing
methods additional to those required by EPA.
Scope and Methodology
We conducted our evaluation from August 2004 through February 2006 in
accordance with Government Auditing Standards, issued by the Comptroller
General of the United States. We evaluated drinking water laboratory procedures
by identifying vulnerabilities in the sample analysis process and examining
techniques used by EPA, States, and other Federal agencies to identify and
address inappropriate and fraudulent laboratory procedures. Further, we
identified potential promising techniques by organizing a five-member expert
panel to provide input. The panel consisted of representatives from a certification
program, an accreditation program, a Federal agency, an environmental and data
quality consulting organization, and a large commercial laboratory.
We reviewed EPA headquarters, regional, and selected State guidance for the
certification and accreditation of laboratories analyzing drinking water samples.
We compared EPA OGWDW guidance and regulations to those used by other
Federal agencies, interviewing managers and staff. We also interviewed staff and
managers from all relevant EPA program offices regarding training, guidance, and
enforcement activities. Every EPA regional certification authority or their
designees were surveyed and interviewed. State certification or accreditation
program managers and staff were interviewed in Arizona, Pennsylvania,
Kentucky, and Utah. We also interviewed public health experts.
Appendix B provides further details on scope and methodology.
7

-------
Chapter 2
Vulnerabilities Compromise Laboratory and Data
Integrity, Increase Public Risk
We compiled lists of more than one hundred vulnerabilities in the drinking
water sample analysis process that could lead to fraud and inappropriate
procedures. The Agency itself (OCEFT) identified many vulnerabilities in its
own review in 2002, but these vulnerabilities have not been adequately addressed.
Economic conditions in the laboratory testing industry, combined with limited
oversight controls, increase the likelihood an analyst or manager will exploit an
existing vulnerability. When inappropriate or fraudulent laboratory procedures
occur, the true quality of drinking water is unknown and health risks for
consumers are increased.
Sample Analysis Process
To evaluate the integrity of
drinking water laboratories and the
process used to analyze and report
drinking water sample data, we
first constructed and examined the
drinking water sample analysis
process and determined which
steps would be most prone or
vulnerable to inappropriate or
fraudulent procedures.
The process is divided into
13 steps (Figure 2.1). An
inappropriate or fraudulent
procedure used in one step will
affect subsequent steps and,
ultimately, the final determination
of drinking water sample quality.
For example, an instrument that is
not calibrated properly results in
inaccurate and unreliable
measurements from the time the
calibration is put into use.
Figure 2.1 - Steps in Drinking Water Sample Analysis Process
a. Sample Collection*
b. Sample Tracking and Recording
c. Adherence to Standard Operating Procedures (SOPs) for Analytical Methods
d. Preparation of Samples and Standard Solutions
e. Instrument Performance
f. Instrument Maintenance
g. Instrument Calibration
h. Lab Technician Performance
i. Adherence to Quality Assurance/Quality Control (QA/QC) Plan
j. Data Validation and Verification
k. Data Handling and Maintenance
l. Data Reporting
m. Data Security and Backup
* Step occurs mainly outside the control of testing laboratories, although a laboratory may choose to reject a sample arriving in poor condition.
Source: EPA OIG analysis with input from OGWDW and consultation with OIG expert panel
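As a worked illustration of the calibration example above, the sketch below (Python, with invented concentrations and instrument responses rather than data from any laboratory in this evaluation) shows how a single altered calibration standard biases every sample result quantified against that curve until the calibration is replaced.

```python
# Minimal sketch with assumed, illustrative numbers: how one bad calibration
# propagates into every result reported while that calibration is in use.
import numpy as np

# Calibration standards: concentration (ug/L) vs. instrument response
conc = np.array([0.0, 2.0, 5.0, 10.0])
response = np.array([0.0, 4.1, 10.2, 19.8])

# Proper calibration: least-squares line through the standards
slope, intercept = np.polyfit(conc, response, 1)

# Improper calibration: suppose the 10 ug/L standard was manipulated to read high
bad_response = response.copy()
bad_response[-1] = 25.0
bad_slope, bad_intercept = np.polyfit(conc, bad_response, 1)

# Every subsequent sample is quantified against whichever curve is in use
sample_responses = np.array([6.0, 12.0, 18.0])
good = (sample_responses - intercept) / slope
bad = (sample_responses - bad_intercept) / bad_slope
for r, g, b in zip(sample_responses, good, bad):
    print(f"response {r:5.1f}: proper cal -> {g:4.1f} ug/L, biased cal -> {b:4.1f} ug/L")
```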
8

-------
Multiple Vulnerabilities Identified
EPA program offices, States, and
members of the OIG expert panel
provided multiple examples of
vulnerabilities within the drinking water
sample analysis process. Although there
were unique vulnerabilities listed for
each group, several were similar.
OCEFT Noted Problems, Urged Improved Integrity of Laboratory Data
In 2002, the OCEFT Lab Fraud Workgroup noted in a report an increasing trend
in laboratory fraud cases and the potential for laboratory fraud to "undermine the
foundation of EPA's regulatory programs." The report provides several examples
of common types of laboratory fraud, and notes 73 "laboratory problems" or
vulnerabilities. The workgroup cites 29 of the 73 problems (40 percent) as items
that might not require detailed technical knowledge for detection. Appendix C
provides the full list; selected common examples of laboratory fraud follow:
•	Pencil whipping - Changing data or records (now often through computer
manipulations) without a legitimate reason.
•	Juicing - Adding or diluting analyte5 in the sample, calibration standard, or quality
control samples to change results or make reported results appear acceptable.
•	Peak dialing - Adjusting the instrument dials, resistors, attenuators, other controls
or computer outputs to achieve the desired output for the sample or calibration.
•	Time travel or time warping - Changing times and dates to make documentation
requirements appear acceptable.6
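The "time travel" example above is aimed at defeating checks like the holding-time comparison sketched below (a hypothetical Python example; the 14-day holding time, sample identifiers, and dates are assumptions for illustration). Auditors can compare such recorded dates against instrument audit trails to detect falsification.

```python
# Hypothetical holding-time check of the kind "time travel" is meant to defeat.
# The 14-day holding time and all dates/identifiers below are assumptions.
from datetime import date

HOLDING_DAYS = 14  # assumed holding time for a volatile organics method

samples = [
    # (sample id, collection date, analysis date recorded on the bench sheet)
    ("VOC-201", date(2005, 6, 1), date(2005, 6, 10)),
    ("VOC-202", date(2005, 6, 1), date(2005, 6, 20)),
]

for sid, collected, analyzed in samples:
    elapsed = (analyzed - collected).days
    if elapsed <= HOLDING_DAYS:
        print(f"{sid}: analyzed after {elapsed} days, within holding time")
    else:
        print(f"{sid}: analyzed after {elapsed} days, holding time EXCEEDED;"
              " cross-check against the instrument audit trail")
```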
EPA Team Acknowledged Sample Analysis Process Vulnerabilities
An EPA team composed of OGWDW Technical Support Center staff and two
staff members from the Office of Research and Development7 with experience in
drinking water laboratory certification, responded to an OIG questionnaire on
laboratory procedures. Vulnerabilities listed in the questionnaire completed by
the EPA team indicate an awareness of shortcomings associated with the analysis
process. Acknowledging that their expertise is with the analytical methods (and
with the certification process established to evaluate laboratory capability to
properly use the methods), that their field expertise with commercial laboratories
was quite limited, and they are not trained in fraud detection, certification team
5	The sample constituent that is sought or intended to be measured.
6	For example, volatile organic samples may degrade rapidly with time or lack of refrigeration; therefore, there is an
incentive to analyze samples within prescribed holding times or make it appear as though they had.
7	The two individuals were the only Office of Research and Development representatives asked for input by
OGWDW.
Vulnerability, for the purposes of this
evaluation, is defined by the OIG as
any weakness, deficiency or feature of
the current system, which, if exploited
(intentionally or unintentionally) by a
laboratory analyst or manager would
compromise: (1) the integrity of the
drinking water sample analysis process
or (2) the quality of data produced.
Source: EPA OIG evaluation team
9

-------
members offered their opinions regarding the potential for problems in the various
sample analysis process areas. Using the list of process steps provided by OIG
(Figure 2.1), the EPA team identified severe vulnerabilities in every step of the
process, for a total of 26 (see Appendix D). Of these 26 vulnerabilities, 4 were
categorized as unintentional only (Figure 2.2).
The most serious vulnerabilities listed
were:
•	falsification of data by a trained
analyst (step h),
•	falsification or failure to perform
quality control data (step i), and
•	failure to flag data outside
acceptance criteria (step j).
Figure 2.2 - Vulnerability Error Type (EPA Team)
[Pie chart categorizing the 26 vulnerabilities as Intentional, Both, or Unintentional]
Source: EPA Team
These three vulnerabilities, and seven others, were categorized by the EPA team
as resulting from intentional errors. Several of these vulnerabilities listed in
response to our 2005 request were similar to problems listed 3 years earlier in the
OCEFT report (see Appendices C and D).
States Note Vulnerabilities, Inappropriate Procedures, and Fraud
Although State certification officers interviewed had mixed views as to what parts
of the process would be more vulnerable than others, almost all steps - excluding
instrument maintenance (step f) and data security and backup (step m) - were
noted by at least one State as vulnerable or prone to fraud and inappropriate
procedures. All States agreed that sample collection - the first step in the
drinking water sample analysis process - is highly prone to inappropriate and
fraudulent procedures. Specific concerns are that the sample may not actually end
up in the laboratory, may be from the wrong location, may be collected by an
individual with limited or improper training, or may be improperly processed or
decanted at the collection site. We did not request State certification officers to
identify vulnerabilities in each step of the process as we did for OGWDW and the
expert panel.
10

-------
Expert Panel Members Collectively Identify Hundreds of
Vulnerabilities
To further evaluate the existence and severity of vulnerabilities in the drinking
water sample analysis process, we convened a five-member panel of experts from
the drinking water laboratory community (see Appendix B for additional
information on selection methodology). In
all, the panel members came up with 272
vulnerabilities in the drinking water sample
analysis process prior to the meeting
(Appendix E). Of the 272 vulnerabilities,
64 were categorized as intentional errors on
the part of a laboratory analyst or manager,
175 as either intentional or unintentional
errors (both), and 31 as unintentional errors
or mistakes; 2 were not categorized
(see Figure 2.3).
After deliberating on 30 vulnerabilities categorized by various members as having
the most severe impact to the integrity of the drinking water sample analysis
process, the panel agreed on a shortened list of 20 (see Figure 2.4).
Figure 2.3 - Vulnerability Error Type (Expert Panel)
[Pie chart categorizing the 272 vulnerabilities as Intentional (64), Both (175), Unintentional (31), or Not Categorized (2)]
Source: EPA OIG expert panel
Figure 2.4 - Most Severe Vulnerabilities Identified by Expert Panel
•	Censoring of information based on reporting limits
•	Data manipulation
•	Failure to follow SOPs/reference methods
•	Falsifying existing data
•	Improper calibration
•	Inappropriate manual integrations
•	Overwriting files: peak shaving, juicing/peak enhancing, deleting
•	Inadequate training
•	Inappropriate collection process
•	Incomplete record keeping
•	Mislabeled sample
•	No demonstration of competency
•	No requirement for collector
•	Reporting data for samples not analyzed ("dry labbing")
•	Retention times not assured
•	Sample integrity unknown
•	Selective use of QC data
•	Sequencing analysis
•	Spiking samples after preparation
•	Time travel (changing times and dates)
Source: EPA OIG expert panel
11

-------
The panel rated the steps in the drinking water sample analysis process from most
to least prone to inappropriate or fraudulent procedures (see Figure 2.5). The
panel agreed that the initial step - Sample Collection (step a) - is the step most
prone to the occurrence of inappropriate and fraudulent procedures.
Figure 2.5 - Areas of the Drinking Water Sample Analysis Process
Most Prone Area: Sample Collection (a)
Highly Prone Areas (in order from most to least):
•	Data Validation and Verification (j)
•	Instrument Calibration (g)
•	Lab Technician Performance (h)
•	Preparation of Samples and Standard Solutions (d)
•	Data Security and Backup (m)
Source: EPA OIG expert panel
EPA Procedures to Address Data for Detected Instances of
Inappropriate Procedures and Fraud Limited
When we reviewed actions taken by EPA (Agency organizations and regions)
when inappropriate and fraudulent procedures are detected, we found that EPA
lacked standardized methods and guidance on how affected data would be
handled. OEI oversees implementation of the Agency's Quality System, which,
in part, assures the quality of data is known and documented. Agency
organizations are responsible for implementing the policy specific to their
activities. While OEI has developed training to deter and detect improper
laboratory practices, fraud detection and reporting are outside the scope of the
existing Quality System policy.
There are no Agency processes to address data produced by public and private
laboratories, whether drinking water or other laboratories, that have used
inappropriate or fraudulent procedures. Although this evaluation was limited in scope to
drinking water laboratory procedures and data produced by those laboratories, we
found no mechanisms to identify data in Agency databases originating from
laboratories using inappropriate or fraudulent procedures; no Agency policy exists
on how to handle data from laboratories under investigation, indictment, or with
convictions.
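The report does not specify what such a mechanism or policy should look like; the sketch below is only one hypothetical way a flag could work, joining reported results against a watch list of laboratory identifiers (all identifiers, statuses, and field names are invented for illustration).

```python
# Hypothetical sketch of the kind of mechanism the report recommends:
# flag records whose originating laboratory appears on a list of labs
# under investigation, indictment, or conviction, so the data are
# marked for review rather than used silently.

lab_status = {
    "LAB-00123": "under investigation",
    "LAB-00456": "convicted",
}

results = [
    {"pwsid": "AZ0400001", "lab_id": "LAB-00123", "analyte": "total coliform", "result": "absent"},
    {"pwsid": "AZ0400002", "lab_id": "LAB-00789", "analyte": "nitrate", "result": "4.2 mg/L"},
]

for record in results:
    status = lab_status.get(record["lab_id"])
    if status:
        record["integrity_flag"] = status  # mark the record for follow-up review
    print(record)
```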
We did not find any plans to require certified laboratories to abide by ethics
programs or certification officers to acquire training in additional auditing
techniques, data integrity concepts, fraud detection, or reporting. No standard
procedure or written guidance on the reporting of inappropriate laboratory
procedures or fraud is issued to certification and accreditation officers. Since we
began this evaluation, OGWDW has agreed to encourage certification officers to
participate in fraud/data auditing courses offered by others and has included
presentations on such in their own training course. A button on EPA's main
12

-------
Website is used to promote reporting, but does not include a category applicable
to laboratory fraud.
Various Factors Contribute to Vulnerabilities
In addition to those instances where people deliberately seek to cheat the system,
OIG expert panel members, EPA program office staff, and State certification
officers offered two theories as to why they believe laboratory fraud and
inappropriate procedures may occur. Economic pressures create incentives for
the laboratory industry to cut corners. Also, controls over the integrity of
laboratories are often limited. Additional causes relate to time constraints, as well
as expectations that no contaminants exceed maximum levels.
Economic Pressures Provide Incentive to Cut Corners
Economic pressures in the laboratory industry, including profit-driven operations
and a loss of expertise, contribute to sample analysis vulnerabilities and may
work against the integrity of the industry. According to OCEFT, additional
tests to show calibration accuracy, reproducibility, and methodology validity can
add an extra 10 to 20 percent to the cost of analysis. Meanwhile, low profit
margins and under bidding dictate that resources be spent on increasing the
volume of samples analyzed rather than ensuring the quality of work. Thus,
people may cut corners. In addition, because prices at labs have dropped, salaries
may not support having qualified people at the analysis level of testing.
Lack of Laboratory Integrity Controls
OGWDW noted that the Laboratory Certification Program was not designed to
prevent or detect fraud, but to establish the technical capability of a laboratory to
conduct analytical measurements required by the Safe Drinking Water Act.
OGWDW interprets fraudulent procedures to fall outside the scope of laboratory
certification audits. Although OGWDW considers the certification program able to
detect and deter inappropriate procedures, we found this to be true only for some
inappropriate procedures.
In our survey of EPA regional certification officers, most regional respondents
stated that the intent of the on-site audit is to determine laboratory compliance
with EPA methods and not the occurrence of fraud. Respondents explained that
auditors do not look for fraud because it is outside the scope of the laboratory
certification audit. In addition, some respondents noted that the on-site audit was
also not designed to focus on inappropriate procedures (see OIG Supplemental
Regional Survey Report for further details).
EPA requires the use of very specific methods when drinking water is tested for
contaminants. Although guidance is available in the form of certification officer
training and the Laboratory Certification Manual, no Federal regulations exist to
13

-------
prescribe or require the use of any specific techniques in the oversight and audit
process used to certify drinking water laboratories. Once an individual becomes a
certification officer, there are suggestions to take refresher courses every 5 years
but no continuing education requirements. There is no Federal requirement for
the submission of quality control data, no policy on manual integration or
calibration, and no requirement that certification officers review laboratory data.
While proficiency tests are encouraged, shortcomings have been noted, and there
are no other competency requirements. Four of the five expert panel members
and State certification officers noted current proficiency test methods could be
more effective. Using a different analyst, process, conditions, or alternate QA/QC
for proficiency test samples, as well as the sharing of proficiency test data
between laboratories before submission, were cited as problems that could occur
with the current proficiency test sample regimen.
New techniques to identify inappropriate and fraudulent procedures are emerging
along with new technologies. The 1999 OIG memo report and the 2002 OCEFT
Laboratory Fraud Workgroup report both suggest the implementation of accepted
processes to detect and deter laboratory fraud and inappropriate procedures.
OGWDW noted that it communicates with regions in monthly conference calls
and during program reviews and certification training. OGWDW also works with
the drinking water laboratory community to develop and update the guidance
manual for laboratory certification periodically. However, there have been no
changes to provide guidance on determining laboratory or process integrity, data
quality, ethical laboratory practices, fraud detection, or fraud reporting.
OGWDW acknowledges that it has not provided regional staff or State
certification officers - the individuals who conduct on-site laboratory audits and
determine whether a lab is qualified to analyze water samples - a plan to address
inappropriate and fraudulent procedures when identified in laboratories.
Of the State certification programs where we interviewed staff, some programs
are, on their own, actively looking for inappropriate procedures and fraud using
many highly effective, promising techniques. These techniques, although
determined effective, go beyond EPA requirements, and States do not receive
additional funding or resources to conduct these types of audits.
Additional Causes Relate to Time and Expectations
Environmental samples, including drinking water samples, can change over time,
which can result in laboratory time constraints. According to OCEFT, if a
problem occurs and the process cannot be performed in the specified time, there is
an incentive to falsify the time of analysis rather than obtain a new sample and
re-run the necessary test. Also according to OCEFT, clients may motivate
laboratories to commit fraud or inappropriate procedures. In addition, a
laboratory may be concerned that test results the client will find unacceptable can
result in losing that client's business, as well as the business of other clients.
14

-------
Vulnerabilities Hinder Ability to Ensure Safe Drinking Water
Vulnerabilities in the drinking water sample analysis process impact EPA's
overall strategy for providing the public with safe drinking water, thus increasing
the health risks for consumers. To the extent that adequate processes are not in
place to detect fraudulent or inappropriate procedures, decision makers will not
have full assurance that drinking water data are accurate. Decision makers range
from water treatment system managers and operators to EPA and State regulators,
researchers, and even members of the public considering whether to install a
filter or purchase bottled water.
Vulnerabilities in the sample analysis process increase the risk of public exposure
to drinking water contaminants. According to EPA, there are a number of threats
to drinking water: improperly disposed of chemicals, animal and human wastes,
pesticides, wastes injected deep underground, and naturally-occurring substances.
When contaminants are microbial (i.e., viruses, bacteria), the health effect is
usually acute, while chemical contaminants most likely would impact consumer
health over time. EPA and States use a multi-barrier approach to protect public
health, but most efforts are linked in some way to the sample analysis process that
occurs in laboratories and the data produced.
If a sample is inappropriately or fraudulently analyzed, an operator may miss the
opportunity to use a treatment technique to address the amount of contaminant
present, or may use a treatment technique that is not sufficient. In addition,
individuals may make decisions to purchase bottled water, install a water filter, or
simply drink water from the tap based on this data. The integrity of drinking
water laboratories can also affect the future of drinking water regulations and
research efforts since EPA monitors reported drinking water data to determine if
new rules or regulations need to be issued.
Finally, without adequately assuring data from laboratories are accurate and
reliable, as well as verifying the integrity of the analysis process, EPA efforts to
improve drinking water data in the EPA Safe Drinking Water Information
System-Federal Version database may be limited. Certified laboratories are the
starting point for data that enter this system. The information in the system is
used in reports to Congress and the American people on the percent of population
served by community water systems meeting health-based drinking water
standards. In addition, it supports OGWDW's performance measurement and
management processes.
* * *
Overall conclusions and recommendations are in Chapter 4.
15

-------
Chapter 3
Opportunities Exist to Provide Additional Protection
Against Inappropriate and Fraudulent Procedures
EPA can apply numerous promising techniques to better prevent, detect, or
correct inappropriate and fraudulent laboratory procedures. An expert panel, as
well as officials from EPA and States, agreed that many of the techniques we
compiled could prove beneficial. These techniques relate to developing policy,
training, and guidance, as well as oversight and enforcement practices. Further,
some States and other Federal agencies already do more than EPA requires to
assure the integrity of laboratory information, and EPA could consider
encouraging some of those techniques. In addition, we noted some techniques
used by EPA offices other than OGWDW that OGWDW should also consider
using. The potentially promising techniques noted could be used to better protect
against inappropriate and fraudulent procedures at laboratories that analyze
drinking water samples, thus enabling EPA to better protect public health.
Promising Techniques Identified to Better Protect Against
Inappropriate and Fraudulent Procedures
We conducted a literature search and interviews to identify promising techniques
that EPA could use to better protect against inappropriate and fraudulent
laboratory procedures. A full list of the promising techniques is provided in
Appendix F. We provided EPA (OGWDW and OEI), selected States, and our
expert panel with a questionnaire on these promising techniques. We asked the
expert panel and EPA OGWDW to rate each technique for its effectiveness using
a scale of 1 to 5, with 5 being most effective. We considered techniques to be
highly effective if they received an average score of 3.5 to 5.0. We only asked
States and EPA OEI to review the list of techniques and indicate whether they
considered them as highly effective. Those techniques rated highly effective are
shown in Figure 3.1. We divided the highly rated techniques into the three areas:
•	Policy, Training, and Guidance
•	Laboratory Oversight Practices
•	Enforcement Practices
These techniques are intended to be used by EPA, States, and/or public and
private laboratories, and we make appropriate recommendations in Chapter 4
applicable to the highly rated techniques.
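The rating rule described above is simple to express; the following sketch applies it (the scores shown are invented for illustration, while the 1-to-5 scale and the 3.5 cutoff come from the text).

```python
# Apply the report's rating rule: a technique is "highly effective" if its
# average score, on a 1-to-5 scale, is 3.5 or higher. Scores are invented.
ratings = {
    "Develop and/or use available guidance on fraud awareness": [4, 5, 3, 4, 5],
    "Include split sample analysis": [3, 3, 4, 2, 3],
}

for technique, scores in ratings.items():
    average = sum(scores) / len(scores)
    verdict = "highly effective" if average >= 3.5 else "below the 3.5 cutoff"
    print(f"{technique}: average {average:.1f} -> {verdict}")
```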
16

-------
Figure 3.1 - Techniques Predicted to be Highly Effective
(Groups that rated or identified each technique as highly effective are shown in parentheses: Expert Panel, EPA OGWDW, EPA OEI, States*)

Policy, Training, and Guidance
•	Develop a training and education program on fraud (EPA OGWDW, EPA OEI, States)
•	Develop and/or use available guidance on fraud awareness (Expert Panel, EPA OGWDW, States)
•	Develop examples of prohibited practices as guidance (EPA OEI, States)
•	Implement an ethics policy/program (EPA OGWDW, States)
•	Implement a fraud detection policy/program (Expert Panel, States)
•	Implement a fraud deterrence policy/program (Expert Panel, States)

Laboratory Oversight Practices
•	Perform on-site and followup audits (Expert Panel, EPA OGWDW, States)
•	Include double blind proficiency testing samples (Expert Panel, States)
•	Include split sample analysis (EPA OGWDW, States)
•	Conduct data accuracy reviews (Expert Panel, States)
•	Use data validation and verification techniques (Expert Panel)
•	Review raw electronic data and use electronic data analysis/tape audits (Expert Panel, EPA OEI, States)
•	Use analyst notation and sign-off on manual integration changes to data (Expert Panel, EPA OGWDW, States)
•	Review inventory of laboratory supplies** (States)

Enforcement Practices
•	Establish a fraud hotline (EPA OGWDW)

* States included Arizona, Kentucky, Pennsylvania, and Utah only.
** Technique was listed as an additional one by States and not listed on questionnaire provided.
Source: EPA OIG analysis
Techniques Not Required by OGWDW Implemented by Others
Through our research of EPA offices other than OGWDW, as well as of other
Federal agencies and States, we found that these groups already implement
several of the promising techniques identified in Figure 3.1 and
some additional techniques. Although some of these groups deal with different
types and aspects of laboratory testing (e.g., clinical, environmental), some of
these techniques could be incorporated within the EPA certification program to
better ensure laboratory integrity and further reduce public health risks.
17

-------
Other EPA Offices
EPA's Office of Research and Development provided initial support for the
NELAC accreditation process, which applies to environmental laboratories as
well as all Agency and most EPA regional laboratories. The NELAC
accreditation process follows OGWDW certification requirements for drinking
water laboratories, but has incorporated some promising techniques into the
assessment process. The program requires laboratories to provide data integrity
training to employees that includes providing specific examples of breaches of
ethical behavior, written ethics agreements, and examples of improper practices.
While some NELAC program elements could strengthen oversight of drinking
water laboratories, the program is voluntary and OGWDW provides no incentives for
laboratories or States that choose to take on this additional accreditation.
OEI has developed training courses to address the integrity of laboratory
processes and the detection of improper laboratory practices. Specific OEI
activities include:
•	A course on detecting improper laboratory practices.
•	A Website to provide best practices for laboratory quality systems.
•	Guidance and training on environmental data verification and validation.
•	A presentation on EPA quality staff ethics and data integrity activities.
The training courses and other tools are available on the EPA Quality System
Website (http://www.epa.gov/quality/bestlabs.html) and are available for use by
both EPA and non-EPA laboratories. OGWDW and OEI have not communicated
about the use of these tools. Although not specific to drinking water laboratories,
the OEI-developed tools could be adapted by OGWDW to improve laboratory
integrity.
Other Federal Agencies
The Department of Defense (DoD) uses several promising techniques. DoD
requires laboratories to have a program to detect and prevent improper, unethical,
or illegal actions. DoD also requires laboratories to have an ethics policy in place
with annual training requirements, and DoD may use double blind proficiency
testing samples if problems are suspected. In addition, DoD provides guidance on
inappropriate acts and is working on a procurement policy for laboratory services
that specifies a list of prohibited laboratory practices.
While the U.S. Department of Health and Human Services Clinical Laboratory
Improvement Amendments program has no specific processes or guidelines to
protect against inappropriate or fraudulent procedures, the Department believes
there are some Clinical Laboratory Improvement Amendments program activities
that discourage analysts from performing inappropriate or fraudulent acts.
Inspections every 2 years by experienced surveyors, review of documentation
18

-------
dating as far back as 2 years, and steep fines for those who may use inappropriate
or fraudulent procedures may be deterrents for clinical laboratories that consider
the use of inappropriate procedures or fraud. The U.S. Department of Health and
Human Services also noted as a strength regular communication between
surveyors at the regional and national level. The promising technique of double
blind test samples is used in some cases to verify the accuracy of laboratory
results for unregulated analytes.
States
States take a variety of approaches to identify and address inappropriate and
fraudulent procedures, dependent on available resources, technical expertise, and
management preferences. For three of the four States visited, the programs
operated solely on certification fees received from laboratories. In the fourth
State, the majority of the program was funded by State dollars.
The regional survey showed two thirds of States were using data validation and
verification techniques, and a small percentage (six States) were using electronic
data review (see OIG Supplemental Regional Survey Report). These techniques
are currently not required by OGWDW for the oversight of drinking water
laboratories. The States we reviewed that were using promising techniques to
identify inappropriate and fraudulent procedures have found a multitude of
deficiencies, inappropriate procedures, and in some cases fraud.
We found Arizona and Pennsylvania to have been particularly aggressive in
examining laboratory integrity, using on-site audits to look for inappropriate
procedures, and spending additional time on data reviews. Using electronic data
analysis software costing $15,000 and trained technical auditors, Arizona has
identified during certification audits various problems, such as manual integrations
and overwritten calibration curves, that would have gone undetected in an audit
using only EPA-required methods. Arizona's laboratory certification
requirements include an additional requirement for "scientifically valid and
defensible" testing. Pennsylvania issued new laboratory accreditation guidelines
in August 2005 that include a paragraph on the potential impact of laboratory
fraud and the undermining of public confidence in data from laboratories.
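The report does not identify the software Arizona used; the sketch below is a generic, hypothetical illustration of one kind of electronic data review, scanning an instrument audit trail for events, such as manual integrations or overwritten calibrations, that warrant a closer look (event names and records are invented).

```python
# Generic, hypothetical illustration of one electronic data review check:
# surface results whose audit trail shows manual integration or reprocessing
# after the original acquisition. Field names and events are assumptions;
# the software Arizona actually used is not identified in this report.

audit_trail = [
    {"sample": "DW-55", "analyte": "benzene", "event": "acquired", "user": "inst"},
    {"sample": "DW-55", "analyte": "benzene", "event": "manual integration", "user": "analyst1"},
    {"sample": "DW-61", "analyte": "benzene", "event": "acquired", "user": "inst"},
    {"sample": "CAL-07", "analyte": "benzene", "event": "calibration overwritten", "user": "analyst2"},
]

suspect_events = {"manual integration", "calibration overwritten"}
for entry in audit_trail:
    if entry["event"] in suspect_events:
        print(f"REVIEW: {entry['sample']} {entry['analyte']} - {entry['event']} by {entry['user']}")
```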
When microbiology certification officers in Kentucky decided to quiz laboratory
analysts on testing methods during the on-site certification audit - a technique
suggested by EPA, but not required - they found several laboratory analysts were
unable to differentiate between the presence and absence of total coliforms. Total
coliforms include fecal coliform and E. coli, contaminants used to indicate
whether other potentially harmful bacteria may be present.
To enhance communication and share potential promising techniques, 26 State
certification and accreditation programs have formed a discussion group. The
group, initiated in 2004, holds conference calls every other month. Any
19

-------
environmental or drinking water auditor is welcome to participate in the call.
Until recently, EPA had not participated in calls or communicated regularly with
the group. The group provides an outlet for questions concerning EPA standard
methods and specific applications in drinking water laboratories. The group may
also take action and make suggestions to EPA. For example, the group is working
on a guidance document to provide additional details on calibration protocol.
With recent communication between EPA and the group, the group could
potentially be used to assist OGWDW in developing drinking water laboratory
certification process and oversight guidelines, including the use of promising
practices already implemented by several State members.
* * *
Overall conclusions and recommendations are in Chapter 4.
20

-------
Chapter 4
Conclusions and Recommendations
Significant vulnerabilities in the drinking water sample analysis process
compromise laboratory integrity and data quality, and increase the risk of public
exposure to contaminants. The Agency must take steps to better address the root
causes leading to the existence of vulnerabilities, including limited laboratory
controls and economic pressures. In particular, EPA needs to:
•	Enhance guidance and further encourage EPA and State laboratory
certification officers to use promising techniques, and reduce uncertainty by
monitoring and assessing laboratory and certification program conditions.
•	Review procurement policy and promote ethical practices.
•	Create a policy and mechanism to identify affected data.
These actions will help EPA ensure data reliability and better protect public health
by improving controls over laboratories and diminishing the reasons that can lead
to laboratories engaging in inappropriate procedures and fraud. Further, decisions
vital to public health can be made based on better laboratory data.
Promote Better Training and Use of Promising Techniques
Current requirements for laboratories that analyze drinking water samples, as well
as those individuals who grant the certifications, should be enhanced to better
ensure integrity and protect public health. Techniques used by certification
officers may not assess severe vulnerabilities, impacting the Agency's ability to
ensure safe drinking water. Techniques exist that, if incorporated into the
drinking water laboratory certification program, could better detect and deter
inappropriate and fraudulent procedures, improving the integrity of laboratory
operations. By taking advantage of these techniques, EPA can make significant
gains. Further, EPA can better monitor changes and identify emerging challenges
in the laboratory testing environment.
We recommend that the Assistant Administrator for Water:
1. Prepare laboratory certification officers for the conditions and challenges
they will face in testing laboratories associated with fraud by applying the
following promising techniques (modified from those highlighted in
Chapter 3):
a)	Promote Training and Education Regarding Fraud
b)	Integrate Fraud Awareness into Laboratory Certification Training
21

-------
2.	Ensure that all individuals within OGWDW, regions, and States who have
oversight responsibility for laboratories analyzing drinking water samples
are educated and proficient in the proper procedures to follow should a
laboratory be suspected of inappropriate or fraudulent procedures.
Specifically:
a)	Distribute written guidance and appropriate contacts at the suggested
course for State certification officers; copies of the guidance should also
be distributed to OGWDW regional and Technical Support Center staff.
b)	Establish the use of the EPA fraud hotline for environmental testing
laboratories; certified and accredited laboratories should be provided
with appropriate Office of Enforcement and Compliance Assurance or
OIG contacts to report possible misconduct.
c)	Work with the Office of Enforcement and Compliance Assurance to
determine if the form connected to the on-line violation reporting tool on
EPA's Website could be used for laboratory fraud.
3.	Create and use a training course, exam, and standard methods for the
certification of laboratories analyzing drinking water samples for
radiochemical contaminants.
4.	Encourage certification officers to use the following promising techniques,
as noted in Chapter 3, already developed by other groups in laboratory
oversight. In addition, encourage certified or accredited laboratories to
engage in techniques b and c:
a)	Enhance On-Site and Follow up Audits to Include Techniques to Identify
and Deter Inappropriate Procedures and Fraud
b)	Use Data Validation and Verification Techniques
c)	Use Analyst Notation and Sign-off on Manual Integration Changes to
Data
d)	Review Raw Electronic Data and Use Electronic Data Analysis/Tape
Audits
e)	Review Inventory of Laboratory Supplies
f)	Include Double Blind Proficiency Testing Samples Reform
(or a combination of Double Blind and Split Sample Analysis)
g)	Conduct Data Accuracy Reviews
5.	Reduce uncertainty associated with the integrity of drinking water
laboratories as well as the occurrence of inappropriate procedures and fraud.
At least every 3 years, perform a periodic assessment to:
a)	Review the drinking water sample analysis process for the existence of
vulnerabilities.
b)	Assess the extent to which inappropriate and fraudulent procedures are
occurring (using techniques described in Recommendation 4).
22

-------
c) Assess the laboratory certification program as well as specific protection
processes and techniques for effectiveness. Explore incentives to
encourage States and laboratories to adopt innovative practices.
As part of this periodic assessment, consider adjusting laboratory and
certification method requirements and resource allocations if needed.
6. Set up a workgroup - including representatives from regions, States, and
laboratories - to review the sample collection requirements and seek
opportunities to minimize vulnerabilities.
Actively Discourage Fraud and Inappropriate Procedures
Economic pressures can provide incentives for analysts and managers to utilize
inappropriate and fraudulent procedures, while limited laboratory oversight
provides an opportunity for exploitation. Additional techniques available to
protect against vulnerabilities and the occurrence of fraudulent and inappropriate
procedures will better address the underlying causes that contribute to their
existence. By addressing economic pressures and a lack of oversight, the Agency
can discourage inappropriate procedures and fraud.
We recommend that the Assistant Administrator for Water:
7.	Meet with Agency contract officers and the Office of Policy, Economics,
and Innovation to determine if appropriate procurement guidance for EPA,
States, and public water systems (including language similar to what is
under development by DoD) specifying a list of prohibited practices and
possible incentives for laboratories or analysts that meet higher integrity
standards can be developed to offset economic pressures to cut corners.
8.	Provide the following training programs and guidance information for
laboratories, as noted in Chapter 3, that analyze drinking water samples:
a)	All Certified Laboratories Should Have an Ethics Policy/Program
b)	Encourage Certified Laboratories to Implement a Fraud Detection and
Deterrence Policy/Program
Reduce Risk to Agency Systems and Decision Making
There is no standard means by which EPA identifies, reports, or handles data
from suspect laboratories in any media, including those that analyze public
drinking water samples. The Agency faces additional costs and unnecessary
delays when it has to identify and assess the impact of questionable data and
undertake additional sampling. With a policy and method to identify altered data,
the Agency can better ensure that environmental and public health decisions,
23

-------
including the development of new environmental rules and regulations, are made
using quality data.
We recommend that the Assistant Administrator for Environmental Information,
as the national program manager for quality:
9. Create a mechanism to identify data in Agency databases originating from
laboratories under investigation, indictment, and/or conviction.
10. Develop an Agency-wide policy on how data originating from laboratories
under investigation, indictment, and/or conviction will be handled.
Agency Comments and OIG Evaluation
The Office of Water and OEI provided written comments on a draft of this report.
In its response the Office of Water pointed out that we had not indicated the
extent of fraud in the nation's drinking water laboratories, and suggested that
without a more precise estimate of the extent of the problem it could not support
expanding the resources of the Laboratory Certification program to address the
vulnerabilities we had identified. The Office of Water also pointed out that many
of the vulnerabilities we had identified are already addressed in some manner in
its current programs, and that other practices, such as those related to sample
collection, are not germane to the issue. The Office of Water also indicated that
while the OIG characterized drinking water laboratory fraud as a
substantial/growing problem, we have not brought these cases to its attention.
In response, we acknowledge there is a great deal of uncertainty about the extent
of inappropriate practices and fraud. We believe that we correctly portrayed both
the uncertainty in the extent of these practices, and the extent of EPA's activities
to detect, prevent, and correct them. The Office of Water's response focuses on
the limited number of fraud cases rather than on the existence of vulnerabilities
that it has limited controls in place to address. It also fails to consider that
several EPA regions stated the laboratory audits do not look for fraud and that
cases of waterborne disease outbreaks generally do not include a review of
laboratory procedures and data produced. Further, many States are not using
techniques that, according to the OIG expert panel and others, could offer more
evidence as to whether there is a problem. In addition, the Office of Water has
not systematically collected data on the extent to which inappropriate procedures
are found in laboratories. While it suggests that employing additional techniques
would impose a resource burden on States, the Office of Water has performed no
analysis demonstrating that the manner in which resources have been used to
date was effective, nor is there analysis that would allow us to say it was not.
We continue to believe that the intense and growing economic pressures on
laboratories cannot be ignored. EPA needs to be assured that it and delegated
States are taking all reasonable precautions to ensure the quality of the
24

-------
information on which regulatory and compliance decisions are based in this
program. Concerning OIG not communicating fraud occurrences, the Office of
Investigations, in addition to States interviewed, has contacted regional offices as
far back as 2000 to alert them to cases located in their geographical jurisdiction.
In addition, specific cases are included in the OIG Semiannual Report to
Congress. As the steward of drinking water data and the responsible party for the
certification of laboratories that analyze public drinking water samples, the Office
of Water should maintain an awareness of issues in all laboratories and consider
the relevance to drinking water laboratories.
With respect to taking future steps, the Office of Water indicated that although we
had not gauged the precise extent of the problem, it is prepared to "play a greater
role" in preventing and detecting these practices. In response to several of our
draft recommendations that had suggested that EPA require States to take
particular actions to address the potential for inappropriate practices and fraud, the
Office of Water replied that it would be more effective for it to encourage States
to take these steps. It argued that it would be difficult to justify a Federal
regulation given the uncertain extent of the problem and the resource burden on
States. It stated that in carrying out its lab certification program, it has
successfully trained and mentored hundreds of certification officers and is
committed to working with States and other offices within EPA to identify best
practices to ensure continuous improvement for the program. The Office of
Water believed, however, that recommendations directed at the development of
ethics and fraud detection/deterrence programs would be more appropriately
directed at offices that manage those types of programs for the Agency.
The Office of Water noted that expanding the portfolio of the drinking water lab
certification program to encompass fraud-related activities could represent a
significant burden because such work requires specific expertise that the Office of
Water currently does not maintain. As the Office of Water works to address
OIG's recommendations, it would be interested in working with the OIG; OEI;
Office of Policy, Economics, and Innovation; and Office of Enforcement and
Compliance Assurance to more clearly define the roles and responsibilities the
respective offices can and should play with respect to fraud prevention, detection,
and response.
We have agreed to many of the suggested recommendation modifications since
they are consistent with our desire to address the root causes of the cited
vulnerabilities. We were not seeking to regulate change, but to establish
reasonable and effective approaches and techniques to address potential
fraudulent and inappropriate procedures. More importantly, we recognize that
there is an absence of Agency data available to determine the extent of the
problem. There is a need to create a system to track, assess, and measure the
reasons and effects of flawed analytical data. We therefore added language to the
revised recommendations to emphasize this.
25

-------
We are encouraged by the Office of Water's interest in working with other EPA
offices and the States to use available training courses and applications and to
establish better coordination and defined roles and responsibilities in the areas of
fraud prevention, detection, and response. We believe this will benefit all parties
and strengthen the drinking water laboratory program. With regard to the Office
of Water's comment on ethics programs, we believe the immediate need is an
ethics policy/program for drinking water labs. Those labs hold an EPA
certification and should be required to attest to the ethical nature of their operations.
Though the scope of our evaluation did not include all environmental labs, we do
agree that the development of ethics guidance would be a good idea.
OEI stated that it is committed to the quality of data in Agency databases. To that
end, it has an active corrective action strategy to address the data quality weakness
identified under the Federal Managers' Financial Integrity Act. In addition, OEI
will submit the OIG's recommendations to the Agency's Quality and Information
Council Steering Committee for action. This committee provides a mechanism to
address enterprise-wide issues and to develop Agency policies to guide EPA
decision makers in the area of information technology/information management
and related issues within the framework of OEI. We are pleased that OEI will be
addressing our recommendations.
The full texts of the responses from the Office of Water and OEI are in
Appendices G and H, respectively.
26

-------
Status of Recommendations and
Potential Monetary Benefits
RECOMMENDATIONS	POTENTIAL MONETARY BENEFITS (In $000s)2
(Columns: Rec. No. | Page No. | Subject | Status1 | Action Official | Planned Completion Date3 | Claimed Amount | Agreed To Amount)
Rec. 1, Page 21
Subject: Prepare laboratory certification officers for the conditions and challenges they will face in
testing laboratories associated with fraud by applying the following promising techniques (modified
from those highlighted in Chapter 3): (a) Promote Training and Education Regarding Fraud; and
(b) Integrate Fraud Awareness into Laboratory Certification Training.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
Rec. 2, Page 22
Subject: Ensure that all individuals within OGWDW, regions, and States who have oversight
responsibility for laboratories analyzing drinking water samples are educated and proficient in the
proper procedures to follow should a laboratory be suspected of inappropriate or fraudulent
procedures. Specifically: (a) distribute written guidance and appropriate contacts at the suggested
course for State certification officers; copies of the guidance should also be distributed to OGWDW
regional and Technical Support Center staff; (b) establish the use of the EPA fraud hotline for
environmental testing laboratories; certified and accredited laboratories should be provided with
appropriate Office of Enforcement and Compliance Assurance or OIG contacts to report possible
misconduct; and (c) work with the Office of Enforcement and Compliance Assurance to determine if
the form connected to the on-line violation reporting tool on EPA's Website could be used for
laboratory fraud.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
Rec. 3, Page 22
Subject: Create and use a training course, exam, and standard methods for the certification of
laboratories analyzing drinking water samples for radiochemical contaminants.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
Rec. 4, Page 22
Subject: Encourage certification officers to use the following promising techniques, as noted in
Chapter 3, already developed by other groups in laboratory oversight. In addition, encourage
certified or accredited laboratories to engage in techniques b and c: (a) Enhance On-Site and
Follow up Audits to Include Techniques to Identify and Deter Inappropriate Procedures and Fraud;
(b) Use Data Validation and Verification Techniques; (c) Use Analyst Notation and Sign-off on
Manual Integration Changes to Data; (d) Review Raw Electronic Data and Use Electronic Data
Analysis/Tape Audits; (e) Review Inventory of Laboratory Supplies; (f) Include Double Blind
Proficiency Testing Samples Reform (or a combination of Double Blind and Split Sample Analysis);
and (g) Conduct Data Accuracy Reviews.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
Rec. 5, Page 22
Subject: Reduce uncertainty associated with the integrity of drinking water laboratories as well as
the occurrence of inappropriate procedures and fraud. At least every 3 years, perform a periodic
assessment to: (a) review the drinking water sample analysis process for the existence of
vulnerabilities; (b) assess the extent to which inappropriate and fraudulent procedures are occurring
(using techniques described in Recommendation 4); and (c) assess the laboratory certification
program as well as specific protection processes and techniques for effectiveness. Explore
incentives to encourage States and laboratories to adopt innovative practices. As part of this
periodic assessment, consider adjusting laboratory and certification method requirements and
resource allocations if needed.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
Rec. 6, Page 23
Subject: Set up a workgroup - including representatives from regions, States, and laboratories - to
review the sample collection requirements and seek opportunities to minimize vulnerabilities.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
27

-------
RECOMMENDATIONS (continued)	POTENTIAL MONETARY BENEFITS (In $000s)2
Rec. 7, Page 23
Subject: Meet with Agency contract officers and the Office of Policy, Economics, and Innovation to
determine if appropriate procurement guidance for EPA, States, and public water systems (including
language similar to what is under development by DoD) specifying a list of prohibited practices and
possible incentives for laboratories or analysts that meet higher integrity standards can be developed
to offset economic pressures to cut corners.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
Rec. 8, Page 23
Subject: Provide the following training programs and guidance information for laboratories, as noted
in Chapter 3, that analyze drinking water samples: (a) All Certified Laboratories Should Have an
Ethics Policy/Program; and (b) Encourage Certified Laboratories to Implement a Fraud Detection
and Deterrence Policy/Program.
Action Official: Assistant Administrator for Water	Planned Completion Date: TBD
Rec. 9, Page 24
Subject: Create a mechanism to identify data in Agency databases originating from laboratories
under investigation, indictment, and/or conviction.
Action Official: Assistant Administrator for Environmental Information	Planned Completion Date: TBD
Rec. 10, Page 24
Subject: Develop an Agency-wide policy on how data originating from laboratories under
investigation, indictment, and/or conviction will be handled.
Action Official: Assistant Administrator for Environmental Information	Planned Completion Date: TBD
1	O = recommendation is open with agreed-to corrective actions pending
C = recommendation is closed with all agreed-to actions completed
U = recommendation is undecided with resolution efforts in progress
2	Identification of potential monetary benefits was not an objective of this evaluation.
3	In accordance with EPA Manual 2750, the Agency is required to provide a written response to this report within 90 calendar days that will include a
corrective action plan for agreed-upon actions, including milestone dates.
28

-------
Appendix A
OIG Laboratory Fraud Cases
The OIG Office of Investigations has identified a fundamental shift that has occurred with
laboratory fraud investigations over the last 3 years. Historically, the Superfund program had the
greatest number of laboratory fraud investigations. However, wastewater and drinking water-
related testing now comprises over 52 percent of the current OIG laboratory fraud investigations.
Several of these investigations involved fraudulent laboratory analyses or monitoring reports,
used to determine the compliance of public water supplies with Federal drinking water standards.
This is of particular concern because the water programs are delegated to the States and they, in
conjunction with EPA, may not be exercising due diligence in correcting and reporting these
problems. The shift was related to an increase in case referrals for these laboratories and not
initiated by OIG actions.
OIG Office of Investigations drinking water laboratory fraud cases increased by more than
40 percent from Fiscal Years 2000 to 2003. Figure A.1 shows the total number of OIG
laboratory fraud cases opened or under investigation. Numbers show the total cases as well as
those involving water and those specific to drinking water only.
Figure A.1 - OIG Laboratory Fraud Cases 1998-2005 (as reported June 2006)
Cases Opened and Under Investigation Involving Allegations Related to Water

                                      1998  1999  2000  2001  2002  2003  2004  2005
Total Number of Cases Opened            4     5     3     5     8    12    23    28
  Number Involving Drinking Water       3     2     2     1     1     3     6     8
  Number Involving Water                0     3     0     1     1     2     9     9
Total Cases Under Investigation         4     9    12    17    25    32    44    58
  Number Involving Drinking Water       3     5     7     8     9    12    13    17
  Number Involving Water                0     3     3     4     5     6    13    15
Source: EPA OIG Office of Investigations
The case numbers (including those involving drinking water) for each fiscal year are end-of-year
numbers. The prior year's balance, plus the drinking water cases opened, minus any cases closed
during the fiscal year, gives the total number of drinking water cases under investigation at year
end. In the table above, no drinking water cases were closed in Fiscal Years 1998-
2003. Five drinking water cases were closed in Fiscal Year 2004 and 4 cases were closed in
Fiscal Year 2005.
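To illustrate the running-balance arithmetic described above, the short sketch below (our illustration, not part of the report's methodology; Python is assumed) recomputes the end-of-year drinking water case balance from the opened and closed counts and checks it against the "under investigation" figures in Figure A.1.

    # Illustration only: verify the end-of-year drinking water case balances in Figure A.1.
    opened = {1998: 3, 1999: 2, 2000: 2, 2001: 1, 2002: 1, 2003: 3, 2004: 6, 2005: 8}
    closed = {1998: 0, 1999: 0, 2000: 0, 2001: 0, 2002: 0, 2003: 0, 2004: 5, 2005: 4}
    under_investigation = {1998: 3, 1999: 5, 2000: 7, 2001: 8, 2002: 9,
                           2003: 12, 2004: 13, 2005: 17}

    balance = 0
    for year in sorted(opened):
        # prior balance + cases opened - cases closed = end-of-year balance
        balance += opened[year] - closed[year]
        assert balance == under_investigation[year], (year, balance)
        print(year, balance)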
Background on Number of Cases Proven
There are few cases of laboratory fraud when compared to the total number of testing
laboratories. Cases do not have to result in an indictment or conviction in order for fraud to have
29

-------
been proven. Cases can be rejected for many reasons. One is that not enough money is
involved, which is the situation with most laboratory fraud cases. Other cases get settled before
trial and the agreements are structured such that the laboratory does not admit wrongdoing.
30

-------
Appendix B
Details on Scope and Methodology
We conducted our evaluation from August 2004 through February 2006 in accordance with
Government Auditing Standards, issued by the Comptroller General of the United States. This
evaluation focused on identifying vulnerabilities in the drinking water sample analysis process
and we did not examine internal controls. We evaluated EPA's efforts to address identified
vulnerabilities to ensure the integrity of drinking water laboratories and the sample analysis
process. Although laboratory and analyst integrity can be a concern in all environmental
laboratories, for the purpose of this evaluation, we limited our review to laboratories responsible
for the analysis of drinking water samples. We used the following methods to gather information
concerning five specific questions:
•	What is the process used to protect against inappropriate or fraudulent procedures within
laboratories?
•	What parts of the drinking water sample analysis process are most prone to inappropriate
or fraudulent procedures?
•	How do EPA and/or States identify inappropriate or fraudulent procedures in the analysis
of drinking water samples, and what actions are taken when they occur?
•	What is the most effective way for EPA and States to detect and deter the occurrence of
inappropriate or fraudulent procedures in laboratories that analyze drinking water
samples?
•	What are the implications to human health and environmental quality resulting from the
occurrence of inappropriate or fraudulent drinking water laboratory procedures?
Literature and Document Reviews
We conducted a literature review and reviewed documents pertinent to drinking water laboratory
certification and the occurrence of inappropriate and fraudulent procedures within laboratories.
This included a review of EPA documents, other Federal agency documents, State documents,
academic literature, and news articles including but not limited to environmental laboratories.
A list of the documents we reviewed and used as sources is in Appendix F, Figure F.1.
EPA Regional Survey
Through a survey database sent on November 19, 2004, to respondents from EPA regions, we
collected information on all regional and State programs and methods of laboratory oversight.
All 10 EPA regions responded by December 3, 2004. The survey was sent to the regional
certification authority, and that authority, as well as regional certification program managers and
certification officers, responded to the surveys. To verify the contents and accuracy of
information recorded, we interviewed all respondents in March 2005. Representatives of
OGWDW and NELAC commented on the content and terminology of the survey questions, but
the OIG made all decisions on the survey's final content. Additional details on survey
methodology are in the OIG Supplemental Report on the Regional Survey.
31

-------
State Selection and Visits
The evaluation team visited four State drinking water laboratory certification programs during
field work. We sought to have a range of geographic locations. To develop a paired
comparison, we selected two States that were proactive in their approach to inappropriate and
fraudulent procedures (e.g., that use electronic data analysis or data validation and have reported
cases of fraud) and two States that were less active in their approach (e.g., that do not use
electronic data analysis and data verification, and have not reported cases of fraud). We selected
Arizona and Pennsylvania as the proactive States and Kentucky and Utah as the less active ones.
Selection was based on input from EPA, NELAC, and OIG analysis.
Structured interviews were conducted with each State and information collected included:
•	Background and history of the certification/accreditation program.
•	Processes used for laboratory certification/accreditation (e.g., EPA, NELAC).
•	Details on inappropriate or fraudulent laboratory procedures identified, including review
of audit files (if necessary).
•	Techniques used by the State to protect against inappropriate and fraudulent procedures.
•	Viewpoints as to vulnerabilities within the sample analysis process.
•	Technical assistance and training programs offered.
•	Recommendations for improvement to the EPA certification program.
Following the State interviews, we sent representatives a copy of the promising techniques list to
verify which techniques the certification officers viewed as highly effective. We also requested
the implementation status for each of these techniques.
Expert Panel
As part of our field work, we convened an expert panel for review of the current drinking water
sample analysis process, laboratory procedures, oversight, effectiveness, and the occurrence of
fraudulent or inappropriate procedures in drinking water laboratories. To assemble the panel, we
first requested nominations from individuals and organizations identified as major
stakeholders in the area of drinking water laboratory procedures. Criteria used to select panelists
included scientific and technical expertise in the drinking water analysis process, laboratory
certification and accreditation, laboratory management, and laboratory quality assurance and
quality control. It was our intent to select panel members outside of the Agency. Thirty
individuals were nominated, and five were selected for the expert panel.
1)	Wisconsin State Certification Officer - Alfredo Sotomayor
2)	National Institute of Standards and Technology Federal Research Chemist - Reenie Parris
3)	General Manager of Severn Trent Laboratories - Robert Wyeth
4)	Technical Director of Chemistry for Environmental Standards - Rock Vitale
5)	North Carolina State Certification Officer - Mike King
A 2-day expert panel meeting was held November 2-3, 2005, to collect information from
panelists. We gave panel members pre-meeting questionnaires to rate the techniques listed in
32

-------
Appendix F for their effectiveness in protecting against inappropriate procedures and fraudulent
procedures, as well as their effect on data quality. We also asked panel members to identify
vulnerabilities within the sample analysis process that could impact the integrity of the sample
analysis process as well as data integrity. In addition to rating the level of severity, we asked
them to determine whether they thought errors in these vulnerability areas were intentional or
unintentional. In addition, panelists discussed management challenges and laboratory data
quality. The laboratory management challenges discussion included vulnerabilities in laboratory
management, potential solutions, and recommendations for EPA's role in this area. The
laboratory data quality discussion included a consideration of the assurance level needed for
decision making and methods to generate quality data considering costs.
Comparison of Similar Laboratory Certification Programs
As part of field work, we reviewed other laboratory certification programs to determine if
techniques used by these agencies/programs could have application and utility for EPA's
oversight of drinking water laboratories. The laboratory programs reviewed were:
•	National Environmental Laboratory Accreditation Conference
•	Department of Defense
•	Department of Health and Human Services, Clinical Laboratory Improvement
Amendments program
The review included interviews with each of the agencies and review of program documentation.
EPA Program Staff Interviews
We conducted interviews with EPA program offices considered stakeholders in drinking water
sample analysis and laboratory certification. These offices included:
•	OGWDW Technical Support Center: We conducted several interviews with staff in
Cincinnati, Ohio, to collect background information on the certification program, obtain
expert opinions as to our evaluation questions, and provide progress updates.
•	Office of Research and Development/NELAC: We conducted interviews with the
NELAC Director to collect background information on the accreditation program, obtain
expert opinions as to our evaluation questions, and provide progress updates.
•	Office of Enforcement and Compliance Assurance's OCEFT: We obtained further
information regarding the June 2002 Laboratory Fraud Workgroup Report.
•	OEI: We interviewed Quality Staff members to determine steps underway or planned by
OEI to identify and address inappropriate or fraudulent procedures. We also obtained
OEI's viewpoints as to the most effective ways for States and EPA to detect and deter the
occurrence of inappropriate and fraudulent procedures.
Interviews with Health Experts
We interviewed three experts involved with health-related effects from contaminated drinking
water not meeting public health standards set by EPA. The following three individuals selected
33

-------
provided a good cross-section of professionals with experience dealing with health-related
concerns from contaminated water, analytical techniques, and data quality concerns:
•	Dr. Dennis Juranek, Senior Scientist and Epidemiologist, Centers for Disease Control
and Prevention, Atlanta, Georgia
•	Dr. Rolf Halden, Assistant Professor, Johns Hopkins School of Public Health, Center for
Water and Health, Baltimore, Maryland
•	Dr. Rebecca Parkin, Associate Dean for Research and Public Health Practice, School of
Public Health and Health Services, George Washington University, Washington, DC
Technical Assistance Provided by OIG Office of Investigations
The OIG Office of Investigations Laboratory Fraud Directorate staff provided technical
assistance and information during the evaluation. This included data on relevant cases
concerning laboratories found to commit fraudulent acts, and guidance on the technical aspects
of analyzing drinking water samples.
34

-------
Appendix C
Laboratory Problems Identified by OCEFT Workgroup
Examples of Findings Indicating Fraud
•	Substitution of previously acceptable QC (quality control) or calibration computer files for bad ones
to make the run appear acceptable
•	Full sample containers after data reported*
•	Materially misrepresenting actual laboratory methods and practices to clients
Suspicious Condition or Practice
•	Reported results without supporting records of analyses
•	Lack of required equipment to perform required analysis*
•	Lack of required chemicals, reagents, other raw ingredients to perform the analyses
•	Instruments or other equipment in poor or non-operational condition
•	No log books (paperless lab?)*
•	Entries in log books missing*
•	Missing data
•	More samples analyzed than reported (indicating that results are illegitimately altered by selection)
•	Discrepancies in times between various stages of sample handling and analyses*
•	Discrepancies between values reported in raw, intermediate, and final results
•	Illegitimately selecting calibration points or data to meet method criteria
•	Illegitimately adjusting, altering, or improperly selecting peak heights or counts and ratios during
tuning or calibration
•	Illegitimately selectively picking scan data to achieve tune criteria in GC/MS methods
•	Different print styles in reports*
•	Other suspicious anomalies in appearance of report or data outputs*
•	No QC failures over a significant period of time
•	Tough or "ridiculous" QC requirements (e.g., inviting alteration to pass)
•	Deliberately omitting QC steps such as method blanks and control samples to avoid unfavorable
QC results
•	Use of affiliate labs to analyze performance samples for tests normally done in-house*
•	Altering QC performance summaries
•	Removing statistical outliers to improve reported detection limits
•	Illegitimate use of manual integration in chromatographic techniques
•	Inappropriate "averaging" to achieve calibration or performance criteria or to stay in compliance
•	Suspicious computer calculation subroutines or macros
•	Unexplained editing of electronic files
•	Unexplained erasures, white-outs*
•	Altered or forged signatures on report*
•	Report signature not of the author (e.g., supervisor or manager signing analyst's lab report)*
•	Extraordinary lab results (ultra low detection limits, impossible productivity, etc.)
•	Special phone logs on positive or other results*
•	Other suspect "special" files*
•	Failing to tell the "whole" truth
•	Abnormal directives to employees, oral or written; directives to change results*
•	Intimidated employees*
•	Abnormal pressure to produce results*
•	Employee(s) out of town when analyses performed (travel and charge card records)*
35

-------
Questionable Condition or Practice
•	No maintenance records on instruments
•	Log books missing from series*
•	No original or primary records kept*
•	Data on scraps of paper*
•	Missing reports from files*
•	Missing computer files
•	Incomplete data packages
•	Unexpected or abrupt change in lab practices*, procedures, conditions
•	Problem found in raw data
•	High, arbitrary or unjustified detection limits; detection limits that exceed regulatory limits
•	Results differing using two different methods
•	Deviations from required methodology
•	Numerous "stupid" errors indicative of sloppy work
•	Differing SOPs without explanation
•	Unexpected high or low analyte recoveries
•	Not following required or stated procedures
•	Lax QA/QC or no QA/QC* (May directly evidence fraud if a government contract requirement)
•	Not following laboratory QA requirements
•	Large number of QC failures
•	Adding surrogates after sample extraction rather than prior to sample extraction; reporting
pre-digested spikes or duplicates as post-digested spikes or duplicates
•	Closer than expected agreement on PE samples between affiliate labs, sister labs, industry
organization laboratories
•	Data too consistent
•	Owner or supervisor acting suspiciously (e.g., wants to do all the work himself)*
•	Discrepancies between sample identifications in log-in and chain-of-custody sheets versus samples
tallied in final reports*
•	Faulty data parameter correlations (anion/cation balance, mass balance, COD vs. BOD, etc.)
•	No receipts for instruments, chemicals, equipment in their absence*; no billings for maintenance*
•	Other*
Sampling Related
•	Samples not taken*
•	Samples purposely biased through selection, sampling the "good batch" or the "good portion";
avoiding the "hot spots" or avoiding sampling during certain batch dumps; sampling at certain times
versus others, etc.
•	Fraudulent location*
•	Fraudulent time*
•	Samples purposely switched or corrupted (e.g., cyanide samples left in the sun; volatile samples left
open or caused to aerate)
•	Samples purposefully subsampled incorrectly
•	Continuous monitor probe placed in static sample
* = Items that might not require detailed technical knowledge for discernment.
NOTE: Data presented are specifically what was provided by OCEFT Workgroup. Items are listed by
significance, most significant first; sampling problems indicating fraud are listed thereafter.
Source: Laboratory Fraud Workgroup: EPA OCEFT, Department of Justice
36

-------
Appendix D
Vulnerabilities Identified by
EPA Team
Sequenced based on the 13 steps in the drinking water sample analysis process (see Figure 2.1)
and ordered within each step from most to least severe.
Error Type: U= Unintentional	Severity Rating: 5= Most Severe
I = Intentional	1 = Least Severe
B = Both
Description of Vulnerability	Error Type	Severity Rating
a) Sample Collection
• Sample is mislabeled	B	4
• Sample not preserved/or no dechlorinating agent/adulteration of sample	B	4
b) Sample Tracking and Recording
• Holding time/temp. exceeded	B	4
c) Adherence to Standard Operating Procedures (SOPs) for Analytical Methods
• Adherence to SOP	B	4
• QA manager/lab mgmt. not knowledgeable about approved methodology	B	4
• Untrained/inexperienced analysts	B	4
d) Preparation of Samples and Standard Solutions
• Incorrect preparations/inappropriate standards (i.e., no traceability; contaminated/expired)	B	4
e) Instrument Performance
• Instrument response/sensitivity-needs documentation	B	4
f) Instrument Maintenance
• Analyst/QA officer doesn't understand repair needs	U	4
• No repairs-maintenance log maintenance	U	4
• Repaired incorrectly	U	4
g) Instrument Calibration
• Calibration curve incorrect-data biased high or low	B	4
• Calibration verification not performed	I	4
• Out of date reference materials	B	3
h) Lab Technician Performance
• Trained analysts can falsify data	I	5
i) Adherence to Quality Assurance/Quality Control (QA/QC) Plan
• Analysts can falsify/not perform QA/QC data	I	5
j) Data Validation and Verification
• Not flagging data outside of acceptance criteria	I	5
• Selection of inappropriate QC acceptance criteria	I	4
37

-------

Description of Vulnerability	Error Type	Severity Rating
k) Data Handling and Maintenance
• Data transcription errors not detected	U	4
• Falsifying raw data/no verification	I	4
• Miscalculations	B	4
l) Data Reporting
• Data adjusted to meet predetermined levels	I	4
• Not all data reported	I	4
m) Data Security and Backup
• Data tampering by unauthorized user	I	4
• No back ups	I	4
• Store data back ups off site	B	4
NOTE: Data presented are specifically what was provided by EPA Team (OGWDW and Office of
Research and Development staff).
Source: EPA Team
38

-------
Appendix E
Vulnerabilities Identified by Expert Panel
Sequenced based on the 13 steps in the drinking water sample analysis process (see Figure 2.1)
and ordered within each step from most to least severe.
CS = Case Specific
NR = No Response
a)	Sample Collection	
•	Appropriate training (and periodic monitoring by lab or other assessor is critical
for this component. It's hard to design appropriate "practical" exam.
Vulnerabilities include sampling of wrong site, under inappropriate conditions, use
of wrong sample container, contamination of container, cross contamination,
inappropriate storage, etc.	B	5
•	Collection at wrong location	B	5
•	Mislabel sample identity	B	5
•	No requirements for collector	B	5
•	Not sure of sample collection site	B	5
•	Not sure of sample integrity	B	5
•	Vague or lack of sample collection instructions (collector not lab employee)	B	5
•	Chain of custody errors	B	4
•	Improper preservation	B	4
•	Improper sampling procedures	B	4
•	Lack of enforcement of requirement to reject improperly preserved samples
arriving at the laboratory	B	4
•	Omitting dechlorinating agent for volatile organic compound (VOC) collection	B	4
•	Error in field data	B	3
•	Improper sampling equipment	B	3
•	Inappropriate sampling equipment	B	3
•	Lack of attention from collector that correct sampling techniques have been used	B	3
•	Lack of training in legal chain of custody procedures	U	3
•	No temperature controls	B	3
•	Uncalibrated field equipment	B	3
•	Presumption that laboratory is always in control of sampling event	U	2
•	Inadequate chain of custody	U	1
b)	Sample Tracking and Recording
•	Mislabeling sample identification	B	5
•	Misplacing sample	U	5
39

-------

Description of Vulnerability	Error Type	Severity Rating
• Unless double-blind PT samples are sent along with other "real" samples,


this step is difficult to assess for its routine practice by either PT or on-site


assessment. Training and on-going monitoring are critical.
U
5
• Dates logging
1
4
• Lack of uniformity for recording condition of samples on receipt
U
4
• Need to ensure that samples received are stored under required conditions.


Require assessors to check at all on-sites. Requirements for bar-coding and


tracking from receipt through storage, preparation and analysis with time stamp


would make this less of a vulnerability. Vulnerabilities: errors in transcription, loss


of sample, "time-shifting" of samples progress in process, etc.
B
4
• Vague requirements for Laboratory Information Management Systems (LIMS)


security
B
4
• Failure to check custody seals
B
3
• Failure to note air bubbles in volatile organic analysis (VOA) vials
B
3
• Improper handling during log-in
B
3
• Improper pH checks
B
3
• Inadequate preservation
B
3
• Inappropriate storage
B
3
• Receipt condition notation
B
3
• Uncertain requirements of sample treatment or storage after arriving at lab but


before logged into system
U
3
• Violating sample integrity - e.g., opening volatile organic analysis (VOA) vials,


etc.
U
3
• Manual obliteration of records or data
B
2
• Not all information recorded at collection
U
2
• Sample condition not properly checked (e.g., temperature)
B
2
• Sometimes more than one person deals with samples and none are considered


sample custodians
B
2
• Segregated storage
B
1
c) Adherence to Standard Operating Procedures (SOPs) for Analytical Methods
• Failure to follow SOP
B
5
• Assessors focus on reviewing SOP, not assessing its proper execution
B
4
• Focus on format and not content of SOP
B
4
• Lack of uniformity of records such as bench stats for documenting adherence to


SOPs
B
4
• Presumption that having an SOP means laboratory is following it
B
4
• SOP inconsistent with required method
B
4
• SOP with gaps and ambiguities (no corrective action report (CAR) for failing QC)
B
4
• Failure to update SOPs consistent with required method
B
3
• Inadequate analyst training
B
3
• Irregular review of SOPs for accuracy and conformance with methods
B
3
• Lab not using SOP
B
3
• Lack of review of SOPs for correspondence with approved methods
B
3
• Poor mechanisms for demonstrating proficiency using SOPs (initial


demonstration of competencies (IDCs) are weak indicators)
U
3
40

-------


Description of Vulnerability	Error Type	Severity Rating
•
Use of wrong method or version
B
3
•
Lab not using most recent version of method or withdrawn method
B
1
•
Following wrong method (Using 624 for 524.2)
B
0
•
SOP not following method exactly (3pt calibration curve instead of 5pt)
B
0
•
Vulnerability: Lack of adherence to SOP not being documented, reported by lab



staff. Training, appropriate oversight of lab staff is critical - hard to detect after
B
CS

analyses completed.


•
Vulnerability: Use of inappropriate SOP for specific type of sample
B
CS

(analyte/matrix/program) should be noted during internal review
d) Preparation of Samples and Standard Solutions
•
Spiking samples after preparation
I
5
•
Adjustment of spikes to ensure compliance
I
4
•
Backdating for holding time compliance
I
4
•
Double spiking solution concentration (e.g., semi volatiles)
I
4
•
Failure to keep accurate log of all standards and/or solutions
B
4
•
Failure to use or calibrate equipment used to make standards
B
4
•
Measurement errors (samples, solvents, etc.)
B
4
•
Record keeping
B
4
•
Segregation of QC sample glassware
I
4
•
Spiking at inappropriate step of process
B
4
•
Use of inappropriate/unverified source of QC/calib. solution components
U
4
•
Use of wrong or outdated materials
B
4
•
Volume errors of concentration or other process steps
B
4
•
Absence of clear requirements to verify spiking solution before they are used
B
3
•
Failure to maintain secure and/or temperature controlled storage
B
3
•
Inconsistencies in the sources used for preparing integrity check values (ICVs)



and continuing calibration verifications (CCVs)
B
3
•
Lab not performing required verifications of pipette volume, balance mass, etc.



critical in value-assigning QC and calibration solutions
B
3
•
Lab preparing "multiple" sets of calibrants, method blanks, etc. and discarding the
results of those not consistent with requirements but then having appropriate
number of results to report
I
3
•
Lack of emphasis on reviewing sample prep methods
B
3
•
Lack of uniformity for verification of accuracy of prepared standards



(e.g., integrity check values (ICVs); standard reference materials (SRMs))
B
3
•
Not enough emphasis on tracking preparation of reagents
B
3
•
Overheating to speed up process
B
3
•
Reagent water quality - contamination
U
3
•
Separate blank process
I
3
•
Solutions made incorrectly
U
3
•
Use of single-source references for everything (e.g., calibration=continuing



calibration verification (CCV)=spike)
B
3
•
Using "old" calibration solutions, QC checks without assessing current validity
(often in conjunction - see prep of "new" solutions just prior to analysis of PT or



reference material
B
3
41

-------
Description of Vulnerability	Error Type	Severity Rating
•
Improper traceability of reference preparation
B
0
•
Solutions expired
B
0
e) Instrument Performance
•
Improper calibration (e.g., single-point, peak shaving, historical library)
1
5
•
Retention time/retention window monitoring critical for appropriate results as to ID



of sample constituent or reporting as "non-detected"
U
5
•
Consistent Laboratory. Use of laboratory control samples (LCSs), other

4

monitoring QC samples at "high" end of analyte concentration range - need to

(more
severe for
reported
results

monitor performance over entire range - if not with each sample set - then ensure


that range is covered adequately over a series of sets of analyses. And - use of
1

LCSs in appropriate matrix, spike samples, etc.

near
quantitation
limits)
•
Inadequate training/experience to recognize instrument problems



(e.g., sensitivity loss)
U
4
•
Indiscriminate use of software to compensate for poor instrument response
B
4
•
Lack of clear guidance on keeping and maintaining chronological run logs
B
4
•
Mislabeling sample (dry labbing)
1
4
•
Vulnerability: unless a specific criteria, etc. are available and checked - or staff
knowledgeable and experienced in the specific analyses to monitor output of



instruments - need for unscheduled maintenance repair may be missed
U
4
•
Emphasis on analytical instrument performance and not much on support



equipment
B
3
•
Excessive deferral to instrument manufacturer instructions which at times conflict



with approved methods
B
3
•
Failure to run performance checks
B
3
•
Inappropriate adjustment or performance check maintenance
1
3
•
Instrumental adjustment to match performance check response
1
3
•
Insufficient QC (e.g., not running blanks)
B
3
•
Not routine
B
3
•
Record keeping
B
3
•
Repeating QC and calibration without re-running samples (overwriting files)
B
3
f) Instrument Maintenance
•
Failure to repair as needed
1
3
•
Poor upkeep of maintenance logs - reliance on analysts to complete them



and of assessors to review them
B
3
•
Record keeping
1
3
•
Vagueness about when maintenance triggers performing another MDL



(Method Detection Limit) or IDC (Initial Demonstration of Competence) study
B
3
•
Failure to keep determined maintenance schedule
B
2
•
Lack of prep guidance or maintenance of support equipment
B
2
•
Source cleaning, column changing, etc.
B
1
42

-------

Description of Vulnerability	Error Type	Severity Rating
g) Instrument Calibration
•
Failure to assure retention times and mass spectra
B
5
•
•
Overwriting/deleting files/peak shaving/juicing
Allowance in some procedures for checking the entire calibration analytes with
selected few (this is not much of a concern with Safe Drinking Water Act)
1
B
5
4
•
Allowance in some procedures for reporting analytes out of calibration
B
4
•
Analysis with non compliant curve
1
4
•
Calibrations performed correctly and verified
B
4
•
Day to day file switching
1
4
•
•
For some tests, especially excessively broad calibration acceptance criteria
Frequency of calibration and review of results as appropriate for specific method
not adhering to requirement
B
1
4
4
•
Improperly made standards
B
4
•
Inappropriate manual integration
1
4
•
Initial calibration (e.g., cherry picking, dropping points, repeating points)
1
4
•
Lack of a defensible calibration primer
B
4
•
Manipulation of injection volumes to get appropriate response
1
4
•
Prepared mixes not verified
1
4
•
Record keeping
B
4
•
Repeating continuous calibration and tweaking until passed
1
4
•
•
Running many batches of QC samples right after ICAL (instrument calibration)
Use of "manual" mode for many of calibration solution integration, peak detection
and/or peak area/height/baselines versus few for "real" samples
1
B
4
4
•
Use of intentionally mis-made standards and solutions
1
4
•
Reference material not accurate
B
3.5
•
Emphasis on empirical demonstration of a calibration's effectiveness -
i.e., dispersal or detection expected or learned behavior?
B
3
•
Failure to complete chain of custody as required
1
3
•
Improperly stored standards
B
3
•
Inappropriate instrument adjustments
1
3
•
Lack of clarity of use and sources of integrity check value (ICV) and continuing
calibration verification (CCV)
B
3
•
Over counting of files
1
3
•
•
Time clock adjustments
Use of calibration solutions inappropriately. Value assigned with non-verified
identification of analytes
1
NR
3
NR
h) Lab Technician Performance
•
•
"Dry lab"
Always vulnerable to intentional fabrication of data - as in "dry lab" - not analyzing
sample at all
1
1
5
5
•
•
Inadequate training and demonstration of competency
Lab tech with insufficient training/expertise to recognize problems with analyses
and with little supervision/monitoring
B
U
5
4
•
Lab technician's ethics, results falsified
1
4
43

-------

Description of Vulnerability	Error Type	Severity Rating
• Manual integration of criteria materials
1
4
• No "testing" on read and understand SOP and reference method
B
4
• Volume or instrument adjustment to insure compliance
1
4
• Date and time accuracy
B
3
• Failure to follow method/SOP
B
3
• Few guidelines on conducting internal audits or master sample runs (MSRs)
B
3
• Inadequate education
1
3
• Inadequate oversight and review of data
U
3
• Misuse of dilutions
B
3
• Noncompliance /short cuts in method
1
3
• Supervisor not knowledgeable
U
3
• Supervisor only "bean-counting" and not periodically reviewing raw instrument


outputs, results, etc., for signs of problems with analyses
U
3
• Training records and method proficiency
B
3
• Transcription errors
B
3
• Use of wrong or manipulated samples
B
3
• Weak requirements for experience and education of analytical personnel
B
3
• Few requirements for ongoing training of analysts (e.g., refresher courses)
B
2
• Record keeping
B
2
• Stress in the laboratory and management philosophy
B
2
• Verify credentials
B
2
• Not analyzing sample within required time frame and modifying date of analysis
1
varies
i) Adherence to Quality Assurance/Quality Control (QA/QC) Plan
• QA/QC data fabricated
1
5
• Complete and accurate reporting
1
4
• Limited diagnostics for addressing quality of a result at the individual sample level
B
4
• Manual integration for method compliance
1
4
• QA plan not reflective of actual practice
B
4
• Analysts not "tested" on QA plan contents
U
3
• Approval of inappropriate SOP
B
3
• Failure to follow QAPP when required
B
3
• Failure to ID proper constituents
B
3
• Failure to insure proper review
B
3
• Inappropriate choice of QC samples for "real" samples could provide "valid" data


not defensible for a given sample (e.g., poor matrix match, higher concentrations,


etc.)
B
3
• Inappropriately established control limits
B
3
• Incomplete review process
B
3
• Inconsistency of focus between what is assistance problem or a localized


problem
B
3
• Lack of support for root cause analysis
B
3
• More focus on the QA/QC plan and not on its correct execution
B
3
44

-------

Description of Vulnerability	Error Type	Severity Rating
•
QA plan out of date or not followed
B
3
•
QA plan with gaps and ambiguities
B
3
•
Record keeping
B
3
•
Variable corrective actions to failing QC samples
1
3
•
Failure to follow up on client complaints
B
2
•
Failure to follow up on PT failures
B
2
•
Failure to complete required internal audits
B
2
j) Data Validation and Verification
•
Data manipulated to pass QC
1
5
•
Failing QC data ignored/manipulated
1
5
•
Inappropriate manual integration
1
5
•
Selective use/discarding of various QC results to enhance compliance criteria
1
5
•
Time travel
1
5
•
Calibration data manipulation
1
4
•
Comparing of results as "average or mean" across numerous analytes with
criteria can lead to not recognizing/reporting lack of compliance by specific
analytes - aggregate masks individual results
B
4
•
File transfers
1
4
•
Sequence adjustments
1
4
•
Wide acceptance criteria allowed by some procedures
B
4
•
Application of inappropriate criteria
B
3
•
Failure to appropriately comment or qualify data
B
3
•
File overwriting
1
3
•
Improper use of data review checklists
B
3
•
Inadequate senior review of data
B
3
•
Not sufficient review in data
1
3
•
Omission of non-compliant data
B
3
•
Rate evaluation of selected diagnostics against set criteria
B
3
•
Record keeping
B
3
•
Reliance on hierarchical decision making
B
3
•
Dependence on automated software review
1
2
•
QC acceptance criteria not properly generated/updated
U
2
k) Data Handling and Maintenance
•
Cursory spot checking by second party when procedures do not require
secondary review
B
4
•
Failure to provide adequate data review and approval
B
4
•
Manual entry of data
B
4
•
Transcription errors
B
4
•
File corruption
B
3
•
Lack of clear expectations on what constitutes raw data
B
3
•
Lack of standards on specific documents to maintain and their storage under



(what goes with what)
B
3
45

-------

Description of Vulnerability	Error Type	Severity Rating
•
Losing ability to regenerate records stored electronically
B
3
•
Overwriting or not recording appropriate information
B
3
•
Verification of calculations, algorithms, and software
B
3
•
General record keeping including disposal
B
2
•
Inappropriate record or sample (extract, digestive) retention
B
2
•
Process change without training/notification
U
2
•
Secure/backup files
B
2
•
Timely down loading and storage to prevent data loss
B
2
•
Uncontrolled logbooks and loose paper used and discarded
1
2
•
Validity of software assumed
U
2
•
Could always be intentionally falsified or safeguards not followed
B
varies
1) Data Reporting
•
Falsification of data - intentional change or "generation" of results for sample
not analyzed
1
5
•
Reporting data for samples not analyzed (dry labbing)
1
5
•
Not flagging data where results from analyses with documented (or not)
nonconformities with SOPs, reporting criteria, appropriate QA/QC results, etc.
B
4
•
Transcription errors in government-required report forms
B
4
•
Ethics agreement failures
1
3
•
Failure to initiate data recalls when necessary
1
3
•
Lack of clear requirements for data deliverables
B
3
•
Lack of communication between regulator and regulated entity
B
3
•
Lack of uniform flagging or qualifying corrections
B
3
•
Reporting data without appropriate qualifiers (e.g., sample received at room
temperature)
B
3
•
Reporting of "draft" data prior to compliance reporting
1
3
•
Failure to note and report method issues and concerns
B
2
•
Late or incomplete reports submitted
B
2
•
Reporting data without proper authorizing signature
B
1
•
Transcription errors
U
0
m) Data Security and Backup
•
Data cannot be found nor retrieved from computer
B
4
•
Lack of clear insistence on audit trails for electronic transactions
B
4
•
Little review of correspondence of electronic and hard copy records
B
4
•
Overwriting tapes
1
4
•
Perceived and real case of altering electronic records
B
4
•
Security system compromised in electronic system with other parts of software
updated - need to verify with each update
U
4
•
Appropriate electronic data retention
B
3
•
Backups in same location as original
U
3
•
Backups not timely, tapes unsecure and not off-site
B
3
•
Failure to provide security of database
B
3
46

-------

• No audit trails and/or audit trails not reviewed (Error Type: B; Severity: 3)
• No single user/passwords for systems (Error Type: B; Severity: 3)
• Obsolete GALP (Good Automated Lab Practices) (Error Type: B; Severity: 3)
• Appropriate intellectual safeguard and access control (Error Type: B; Severity: 2)
• Ensure adequate capacity for all backups (Error Type: B; Severity: 2)
Other
• "Different" conditions, process, QA/QC, analyst, etc., used for PT samples versus "real samples" (Error Type: I; Severity: 5)
• "Sharing" of PT data prior to submission (Error Type: I; Severity: 5)
• For wet chemistry methods, need for some monitoring of checks of reagents, apparatus, etc., as focused on in the questionnaire with regard to instrumentation (Error Type: B; Severity: 5)
• Using library of historical consumer confidence reports (CCRs) to overwrite failing files (Error Type: I; Severity: 4)
• Dosing extracts with additional surrogate (Error Type: I; Severity: 3)
• Equating technical expertise with ability to assess technique (Error Type: U; Severity: 3)
• Lack of training for assessors on how to properly assess (Error Type: U; Severity: 3)
• Poor training of laboratory assessors in data review practices (Error Type: U; Severity: 3)
• Inexperienced/unqualified management training staff in inappropriate procedures (Error Type: B; Severity: 2)
• Lack of meaningful ethics training and testing (Error Type: U; Severity: 2)
NOTE: Data presented are specifically what was provided by the expert panel.
Source: EPA OIG expert panel members
47

-------
Appendix F
Promising Techniques
Comprehensive List of Techniques Based on Literature Search
To address our question, "What is the process used to protect against inappropriate or
fraudulent procedures within laboratories?" we conducted a literature search and
interviews to identify available techniques. The sources used are in Figure F.1.
Figure F.1 - Sources for Promising Techniques
1.	Best Practices for the Detection and Deterrence of Laboratory Fraud: California Military
Environmental Coordination Committee and Chemical Data Quality/Cost Reduction Process
Action Team; Version 1.0; March 1997.
2.	Memorandum - Laboratory Fraud: Deterrence and Detection; U.S. EPA, Office of Inspector
General; June 25, 1999.
3.	Report of the Laboratory Fraud Work Group; U.S. EPA, Office of Criminal Enforcement,
Forensics, and Training; June 2002.
4.	Best Practices for Data Quality Oversight of Environmental Sampling and Testing Activities:
Department of Defense, Environmental Data Quality Workgroup, Department of the Navy, Lead
Service; May 1999.
5.	Quality Systems Manual for Environmental Laboratories; Department of Defense, Environmental
Data Quality Workgroup, Department of the Navy, Lead Service; Final Version 2, June 2002.
6.	Fraud Control in the Health Care Industry: Assessing the State of the Art; U.S. Department of
Justice, Office of Justice Programs; National Institute of Justice, Research in Brief; December
1998.
7.	Publication of OIG Compliance Program Guidance for Clinical Laboratories; Department of
Health and Human Services, Office of Inspector General; Federal Register/Vol. 63, No. 163/
Monday, August 24, 1998/Notices.
8.	Responding to Allegations of Scientific Misconduct: The Procedure at the French National Medical
and Health Research Institute; Science and Engineering Ethics, Volume 6, Issue 1, Pg. 41-48;
2000.
9.	Prohibited Practices (involving Environmental Sampling and Testing Activities). Procurement
Policy for DoD Departments, Attachment 4a.
10.	Information obtained from an interview conducted with the Department of Health and Human
Services-Centers for Medicare and Medicaid-Clinical Laboratory Improvement Amendments
(CLIA) Regional Program Staff, August 2, 2005.
11.	Information obtained from an interview conducted with the Department of Defense-Laboratory
Quality Accreditation Office Program Staff, August 18, 2005.
Source: EPA OIG analysis of literature reviewed
We divided these techniques, recommended and/or used by Federal agencies and
laboratory organizations, into three areas: (1) Policy, Training, and Guidance;
(2) Laboratory Oversight Practices; and (3) Enforcement Practices. The complete listing
is provided in Figure F.2.
48

-------

Figure F.2 - Comprehensive List of Techniques Based on Literature Search
(Bracketed source numbers refer to the sources listed in Figure F.1.)
Policy, Training, and Guidance
•
Develop a Training and Education Program on Fraud: Develop training
for Agency or State on-site auditors/inspectors specifically focused on
fraud and best practices for detection and deterrence.
[2]
•
Develop and/or Use Available Guidance Documents on Fraud
Awareness: Update and enforce guidance for oversight officials to
incorporate fraud awareness techniques.
[2]
•
Develop Guidance and/or Training for QA/QC: Implement written
policies, procedures, and standards of conduct. Develop or improve
guidance and training specific to the planning process to assist data users
in determining laboratory QA/QC necessary and appropriate for the
intended use of the data.
[2]
•
Develop Guidance on Prohibited Practices: Create an example list of
prohibited practices that could potentially occur within drinking water
sampling and testing activities.
[9, 11]
•
Implement an Ethics Policy/Program: Promote ethics in laboratories
through outreach and training. Promote an ethics policy for laboratories
that is read and signed by all personnel.
[1]
•
Implement a Fraud Detection and Deterrence Policy/Program: Develop
SOPs for detecting, deterring, and reporting; present fraud awareness
workshops; develop no-fault policy that encourages lab personnel to
come forward and report fraudulent activities, etc.
[5]
Laboratory Oversight Practices
•
Perform On-site and Followup Audits: Conduct internal and monitoring
audits on a routine basis. Conduct followup audits if initial on-site audits
reveal significant lab deficiencies. Ensure that corrective measures are
taking place to sufficiently address the deficiencies. Ensure data quality
requirements are being met.
[1]
•
Include Double Blind Proficiency Testing Samples: Use proficiency
testing samples where concentration and identity are not known by
laboratory (i.e., known only to parties submitting the proficiency testing to
the laboratory). The labeling, packaging, and chemical composition of double
blind proficiency testing samples should mimic those of routine samples.
[1]
•
Include Split Sample Analysis: Send duplicate field samples to multiple
laboratories. Send one of the duplicate samples to a second laboratory
while the corresponding sample is submitted to the primary laboratory.
Different labs that provide similar results confirm reliable data and
minimize loss if a fraud problem should surface.
[1]
•
Use a Systematic Planning Process for Data Collection Activities:
Ensure that the requisite type, quality, and quantity of data are obtained.
[4]
•
Conduct Data Accuracy Reviews: Ensure information systems used to
track laboratory data are current and complete. Review the feasibility of
implementing such systems in programs that do not currently use them.
[2]
•
Use Data Validation and Verification Techniques: Review a body of data
against a pre-established set of QC acceptance criteria to determine whether it
falls within the criteria windows and thus assess the quality of the data
(a simple illustrative sketch follows this figure).
[1]
49

-------
•	Review Raw Electronic Data and Use Electronic Data Analysis/Tape Audits: Review raw electronic data at the bench level when conducting audits. Use automated data screening tools that look for patterns in a data set that may not be predictable or observable by conventional data review techniques. Use statistical algorithms to discover patterns in data (i.e., mining tools). [2]
•	Use Analyst Notation and Sign-off on Manual Integration Changes to Data. [5]
•	Use a Laboratory/Research Notebook: Rigorous maintenance of a bound laboratory notebook by analysts. [8]
•	Conduct Impact Assessments: Respond promptly to detected offenses and develop corrective actions. [7]
•	Conduct Laboratory Staff Reviews: Review qualifications (e.g., education, training) of analysts to ensure that laboratories are adequately staffed. [10]
•	Institute an Accreditation Program: Maintain a secondary recognized laboratory accreditation to support primary certification and/or accreditation (e.g., EPA, NELAC). [2]
Enforcement Practices
•	Share Laboratory Performance Data and Histories (Interagency): Agencies that use environmental laboratories should share performance data. [1]
•	Involve Regulators: Involve the regulators at such junctures as developing data quality objectives and incorporating the use of innovative monitoring and analytical capabilities. [4]
•	Appoint a QA Officer: This officer provides independent review and oversight of data collection. Same as appointing a compliance officer or a scientific integrity officer. [4]
•	Create a Special Investigations Unit: Create this unit to deal with the occurrence of inappropriate or fraudulent procedures. [6]
•	Appoint a Fraud Control Officer: Designate responsibility for fraud control (as separate from investigations) so that various contributory functions can be integrated into a strategy designed to reduce the level of fraud. [6]
•	Establish a Fraud Hotline: Pursue and publicize all means of providing individuals performing environmental testing with appropriate contacts to report possible misconduct. [2]
•	Use a Fraud Profile Checklist: Use this checklist to prompt on-site auditors to look for indicators of potential fraud (e.g., high personnel turnover rates or stretching acceptance limits). [2]
•	Include Anti-Fraud Language in Subcontracts. [5]
•	Establish Self-Policing within Laboratories: e.g., the American Council of Independent Laboratories' Environmental Laboratory Data Integrity Initiative, which calls for a systems approach to ensuring that data is of known and documented quality. [11]
Source: EPA OIG analysis of literature reviewed
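The data validation and verification entry in Figure F.2 describes reviewing a body of data against a pre-established set of QC acceptance criteria. The following Python sketch is an illustration only, not a method from this report: the analyte names, acceptance windows, and function names are hypothetical placeholders chosen for demonstration.

```python
# Illustrative sketch only: checks QC sample percent recoveries against
# pre-established acceptance windows and flags results outside them.
# The analytes and limits below are hypothetical, not EPA criteria.
QC_ACCEPTANCE_WINDOWS = {
    "nitrate": (90.0, 110.0),   # assumed lower/upper % recovery limits
    "atrazine": (70.0, 130.0),  # assumed lower/upper % recovery limits
}

def flag_qc_failures(qc_results):
    """Return QC results whose percent recovery is outside its acceptance window.

    qc_results: iterable of (analyte, percent_recovery) tuples.
    """
    failures = []
    for analyte, recovery in qc_results:
        low, high = QC_ACCEPTANCE_WINDOWS.get(analyte, (None, None))
        if low is None:
            failures.append((analyte, recovery, "no acceptance window on file"))
        elif not (low <= recovery <= high):
            failures.append((analyte, recovery, f"outside {low}-{high}% window"))
    return failures

if __name__ == "__main__":
    batch = [("nitrate", 104.2), ("atrazine", 62.5), ("lead", 98.0)]
    for analyte, recovery, reason in flag_qc_failures(batch):
        print(f"REVIEW: {analyte} recovery {recovery}% - {reason}")
```

In practice, a reviewer would apply the laboratory's own documented acceptance criteria rather than the placeholder values shown here.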
50

-------
Techniques Implemented and Recommended by OEI
EPA OEI quality staff have worked with Agency and non-Agency organizations to
develop guidance and best practices for deterring improper laboratory data quality
practices. Figure F.3 notes specific activities OEI is already implementing as well as
activities it recommends. The recommendations are not intended as potential actions by OEI,
as several are outside its (and in some cases, EPA's) mission and expertise.
Figure F.3 - Techniques Implemented and Recommended by OEI
Techniques Already Being Implemented by OEI
•	Detecting Improper Laboratory Practices Course: This course describes "red flags" and
provides instruction on how an assessor or auditor should proceed if fraud or inappropriate
procedures are suspected.
•	Website to Provide Best Practices for Laboratory Quality Systems: The website offers
References, Training, as well as Examples and Other On-line Resources
(http://www.epa.gov/quality/bestlabs.html).
•	Guidance on Environmental Data Verification and Data Validation EPA QA/G-8:
Includes information on Data Verification, Data Validation, Data Integrity, Tools and
Techniques for Data Verification and Validation, and Data Suitability. The chapter on
Data Integrity will help the verifiers, validators, and data users detect improper practices.
•	EPA Quality Staff Ethics and Data Integrity Activities: This presentation was given at the
National Environmental Monitoring Conference in July 2003.
Techniques Recommended
•	Develop a training program on fraud for both Agency and State on-site auditors/inspectors
that discusses best practices for detection and deterrence of fraud. This should be done
using examples of prohibited practices as guidance.
•	Require the review of raw electronic data and the use of electronic data analysis/tape audits
when conducting bench-level audits. Where possible, use automated data screening tools that
look for patterns in a data set that may not be predictable or observable by conventional
data review techniques, and use statistical algorithms to discover patterns in data
(a simple illustrative sketch follows this figure).
•	Require the inclusion of anti-fraud language in contracts and sub-contracts.
•	Require laboratory auditors to be fully capable of performing the analyses (parameters) they
are responsible for inspecting, fully capable of performing auditing procedures, and aware
that fraudulent procedures may occur.
•	Require subject matter experts to be trained in conducting audits and aware of fraud.
•	Develop a program for laboratory personnel to obtain credentials for their mastery of
specific analytical procedures.
•	Require certification or credentials for laboratory analysts, to assure they are qualified to
analyze drinking water samples.
Source: EPA OEI quality staff
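Figure F.3 recommends automated data screening tools and statistical algorithms that can surface patterns conventional review may miss. The Python sketch below is only an illustration of that idea under stated assumptions: the two screens shown (runs of identical reported values, and results clustered just below a limit) and all thresholds are arbitrary placeholders, not criteria from this report or from EPA guidance.

```python
# Illustrative sketch only: a simple automated screen for two "red flag"
# patterns -- long runs of identical reported values, and an unusually large
# share of results just below a regulatory limit. Thresholds are placeholders.
def screen_results(values, limit, run_length=5, near_limit_fraction=0.5):
    flags = []

    # Pattern 1: the same value reported many times in a row.
    longest, current = 1, 1
    for prev, cur in zip(values, values[1:]):
        current = current + 1 if cur == prev else 1
        longest = max(longest, current)
    if longest >= run_length:
        flags.append(f"{longest} consecutive identical results")

    # Pattern 2: a large share of results within 10 percent below the limit.
    near_limit = sum(1 for v in values if 0.9 * limit <= v < limit)
    if values and near_limit / len(values) >= near_limit_fraction:
        flags.append(f"{near_limit} of {len(values)} results within 10% below the limit")

    return flags

if __name__ == "__main__":
    reported = [4.9, 4.9, 4.9, 4.9, 4.9, 4.8, 4.95, 3.1, 4.97, 4.99]
    for flag in screen_results(reported, limit=5.0):
        print("REVIEW:", flag)
```

A flag from a screen like this would only prompt closer human review of the raw data; it is not by itself evidence of an inappropriate or fraudulent procedure.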
51

-------
Techniques Considered as Highly Effective by States Reviewed
Figure F.4 highlights techniques we presented to State certification program officials that
they considered highly effective in protecting against inappropriate or fraudulent
procedures. States included Arizona, Kentucky, Pennsylvania, and Utah only.

Figure F.4 - Techniques Considered as Highly Effective by States
Policy, Training, and Guidance
• Develop a Training and Education Program on Fraud - AZ*, KY1 (Micro)*, KY1 (Chem), PA, UT
• Develop and/or Use Available Guidance on Fraud Awareness - AZ*, KY1 (Micro)*, UT
• Develop Guidance and/or Training for QA/QC - AZ2, UT
• Develop Examples of Prohibited Practices as Guidance - PA, UT*
• Implement an Ethics Policy/Program - KY1 (Chem), KY1 (Micro)*, PA*, UT*
• Implement a Fraud Detection Policy/Program - KY1 (Chem), KY1 (Micro)*, PA, UT
• Implement a Fraud Deterrence Policy/Program - KY1 (Micro)*, UT
Laboratory Oversight Practices
• Perform On-Site and Followup Audits - AZ*, KY1 (Micro)*, PA*, UT*
• Include Double Blind Proficiency Testing Samples - KY1 (Micro)*, UT
• Include Split Sample Analysis - UT
• Use Systematic Planning Process for Data Collection - AZ*, KY1 (Micro)*, UT
• Conduct Data Accuracy Reviews - [State entry not legible in source]
• Review Raw Electronic Data and Use Electronic Data Analysis/Tape Audits - AZ*, KY1 (Chem), PA, UT
• Use Analyst Notation and Sign-off on Manual Integration Changes to Data - UT
• Conduct Impact Assessments - UT
• Conduct Laboratory Staff Reviews - KY1 (Micro)*, PA*, UT
• Institute an Accreditation Program - UT
Enforcement Practices
• Share Laboratory Performance Data and Histories - PA
• Involve Regulators - UT
• Appoint a QA Officer - UT*
• Create a Special Investigations Unit - AZ*, UT
• Establish a Fraud Hotline - KY1 (Chem), KY1 (Micro)*, UT
• Use a Fraud Profile Checklist/Data Checklists - AZ*, KY1 (Micro)*, PA*, UT
• Include Anti-Fraud Language in Subcontracts - UT
• Establish Self-Policing within Laboratories - PA3
* State is currently implementing technique as part of its certification program.
1 Kentucky does not have a specific office or division for laboratory certification. Chemistry certification ("Chem") is completed by State chemists; microbiology certification ("Micro") is contracted out.
2 Arizona indicated that developing guidance and/or training for QA/QC potentially could have some effect if set up right, but is not sure it would be "Highly Effective."
3 Encouraged in laboratories accredited by Pennsylvania, mandated for those seeking NELAC accreditation.
Source: EPA OIG analysis of State interviews
52

-------
Additional Techniques Implemented and/or Recommended by States
Figure F.5 highlights additional techniques used or recommended by the State
certification programs, beyond the techniques we provided.

Figure F.5 - Additional Techniques Recommended by States with the Intent of
Protecting Against Inappropriate and Fraudulent Procedures
• Standardize the process/run a certification program similar to the Clinical Laboratory Improvement Amendments or the Food and Drug Administration: Across-the-board regulations for all laboratories (State, public, private) that analyze drinking water samples. (AZ)
• Review inventory of laboratory supplies: Review the supply inventory of the labs to ensure that they are running the required tests. Check the quantity of supplies used versus the number of tests run to determine whether tests reported to the State were actually run (a simple illustrative sketch follows this figure). (AZ, KY1 (Micro)*)
• Identify "Red Flags" when reviewing data: Look for specific red flags when reviewing data (e.g., lack of QC, repeat manual integrations, not reporting positive results). (PA)
• Expand the EPA Assessors Training Course to include guidance on fraud/ethics. (KY1 (Micro), PA, UT)
• Conduct historical review of Public Water System labs: Consider a historical review of Public Water System labs and determine whether a Public Water System has switched to labs that report non-detects. (PA)
• Focus on repeat laboratory deficiencies: Focus on repeat deficiencies to make sure that labs have a process in place to address and correct them. (UT)
• Promote good quality systems in labs: Ensure that labs have good quality systems in place. (UT)
• Audit labs for each step in the sample analysis process: Review the entire sample analysis process during the audits. (KY1 (Chem))
* State is currently implementing technique as part of its certification program.
1 Kentucky does not have a specific office or division for laboratory certification. Chemistry certification ("Chem") is completed by State chemists; microbiology certification ("Micro") is contracted out.
Source: EPA OIG analysis of State interviews
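The supply-inventory review in Figure F.5 compares consumables used with the number of tests a laboratory reported. The Python sketch below is a minimal illustration of that reconciliation; the consumable names, per-test usage rates, and tolerance are hypothetical assumptions, not figures from this report.

```python
# Illustrative sketch only: reconciles consumable usage against the number of
# tests a laboratory reported to the State. Usage rates are assumed placeholders.
PER_TEST_USAGE = {
    "membrane_filters": 1,     # units assumed to be consumed per test
    "growth_medium_ml": 2.0,   # mL of medium assumed to be used per test
}

def reconcile(tests_reported, consumables_used, tolerance=0.2):
    """Flag consumables whose recorded usage falls well below expectation."""
    findings = []
    for item, per_test in PER_TEST_USAGE.items():
        expected = tests_reported * per_test
        used = consumables_used.get(item, 0)
        if expected and used < expected * (1 - tolerance):
            findings.append(
                f"{item}: used {used}, expected about {expected} "
                f"for {tests_reported} reported tests"
            )
    return findings

if __name__ == "__main__":
    for finding in reconcile(tests_reported=400,
                             consumables_used={"membrane_filters": 180,
                                               "growth_medium_ml": 900.0}):
        print("REVIEW:", finding)
```

A shortfall flagged this way would prompt follow-up questions during an audit rather than serve as proof that reported tests were not run.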
53

-------
Techniques Recommended by Expert Panel
The expert panel members recommended five techniques to protect against inappropriate
and fraudulent procedures in addition to what was in the list we provided them. These
five techniques are in Figure F.6; the first two were rated as highly effective:
Figure F.6 - Additional Techniques Recommended by Expert Panel
•	Have regulators solicit sample data at sporadic but frequent intervals from lab. Review data
packages for compliance with requirements.
•	Procurement Reform Based on Qualification-Based Selection: Build awareness of the realistic
costs of analysis properly/appropriately conducted. Awarding a contract to a "bidder" with costs
much lower than those of other bidders may be a sign of a lab run very efficiently, or may
indicate the lab will "cut corners."
•	Post articles of discovered fraud and protection as deterrent.
•	As part of training, require an ethics exam to ascertain analysts' knowledge of what is
acceptable and what is not.
•	Rotate duties of analytical personnel temporarily to gauge ruggedness of quality system and
detect inappropriate practices.
Source: EPA OIG expert panel members
54

-------
Appendix G
Office of Water Response
MEMORANDUM
SUBJECT: Promising Techniques Identified to Improve Drinking Water Laboratory Integrity
and Reduce Public Health Risks, Assignment No. 2004-1400, Draft Report
FROM: Benjamin H. Grumbles
Assistant Administrator
TO:	Dan Engelberg
Director of Program Evaluation
Office of the Inspector General
Thank you for the opportunity to comment on your Office's draft report, Promising
Techniques Identified to Improve Drinking Water Laboratory Integrity and Reduce Public
Health Risks. I will respond to the overall findings with more detailed responses to your
recommendations and technical comments attached. My staff has provided additional technical
comments on the text of the report under separate cover. Our comments also reflect feedback
from staff in Regional offices to whom, with your Office's concurrence, we provided the draft
report.
The Office of Water (OW) appreciates the attention that the Office of Inspector General
(OIG) has brought to the potential ramifications of fraudulent laboratory activity on ensuring the
safety of the nation's drinking water. The report includes some suggested activities that we
believe can help to improve the quality of data, and OW is prepared to play a greater role in
preventing and detecting fraudulent activity. However, we have significant concerns with some
of the findings on the part of the OIG related to the role that OW has played to date in dealing
with such activity.
Extent of the Problem
While the OIG report provides extensive information on potential vulnerabilities that may
exist in laboratory procedures, the report acknowledges that the OIG "cannot, with any accuracy
or reliability, quantify the extent to which this [fraud] is a problem." Further, as noted in the
report "No waterborne disease outbreaks or documented cases of illness related to drinking water
have been directly tied to cases of inappropriate lab procedures or fraud." OW does not deny the
serious implications that could result from fraudulent activity. However, we are concerned that
the report does not adequately distinguish between possibilities and likelihoods and, in not doing
so, may be presenting an unnecessarily alarming picture to the American public. Given that the
report includes recommendations that would require significant investments on the part of EPA
and states, it is also critical to demonstrate more specific evidence of the problem. While OIG
55

-------
characterizes drinking water laboratory fraud as a substantial/growing problem, we note that the
OIG has not brought drinking water laboratory fraud cases to the attention of OW.
Inappropriate Procedures vs. Fraudulent Activity
The Drinking Water Laboratory Certification (Lab Cert) program is focused on ensuring
that laboratories have the technical capability to conduct analytical measurements required by
drinking water regulations issued under the Safe Drinking Water Act. The program is the first
and still the only Agency program to offer laboratory certification criteria, guidance, and
training. The Laboratory Certification Manual recently underwent an extensive update and
continues to represent the most significant laboratory guidance provided by any EPA program.
Laboratories may produce flawed analytical data as the result of inappropriate procedures
or fraudulent activity. The report discusses outcomes of each of these potential causes, but does
not adequately distinguish between them. We believe it is critical to do this because our Lab
Cert program does, in fact, address many inappropriate procedures. The report notes that an
expert panel identified 272 vulnerabilities8 in the drinking water sample analysis process and
narrowed the list down to 20 that they deemed most severe. Many of these vulnerabilities are
currently addressed by the Lab Cert program and others could be addressed through program
modifications. Additionally, some of the identified vulnerabilities were in the Sample
Collection area, which is generally not within the purview of the laboratories because most
samples are collected by public water systems.
Drinking water systems and their associated laboratories analyze hundreds of thousands
of samples a year. In considering drinking water analysis, we concur with Regions and States
who have suggested that the greater vulnerability is not due to fraud but the ability to perform the
analysis and get an accurate result. The focus of the Lab Cert program has been to make sure
accepted methods are used, analysts have proper training and develop appropriate skills, and
laboratory procedures are correct and yield accurate results. As others have stated, we believe
this has been an appropriate emphasis for the program over the years.
OW acknowledges that it plays a role in preventing inappropriate procedures through our
existing Lab Cert program and we are committed to making improvements in that area. While
OW has never had the understanding that fraud detection should be included as a primary focus
of the drinking water Lab Cert program, we are committed to working with OECA, OIG, OEI
and others to better address fraud.
The report notes a 1999 OIG memo to the Deputy Administrator and a 2002 report from
OCEFT, and faults OW for not following up on recommendations made in both documents.
The 1999 OIG memo, while citing other programs, made no mention of drinking water. While
OARM, OECA, OPPTS, ORD, OSWER, and 4 Regions were contacted as part of that OIG
review, there was apparently no contact with, nor any distribution to, OW. As with the 1999
memo, the 2002 OCEFT report is written at a very general level, with no specific mention of
8 Although the report describes 272 vulnerabilities, a review of Appendix D indicates that there is considerable
overlap which, if addressed, would decrease the total number. For example, under Sample Collection, the table lists
"improper sample equipment" and "inappropriate sample equipment" as discrete vulnerabilities.
56

-------
drinking water. Moreover, the only transmittal memo that OIG was able to locate for the report
was from the Director of Office of Criminal Enforcement, Forensics and Training to the AA for
OECA. To the extent that it was distributed beyond OECA (which remains unclear), the Office
of Ground Water and Drinking Water has no recollection of receiving it.
It is important to note, however, that expanding the portfolio of the drinking water Lab
Cert program to encompass fraud-related activities could represent a significant burden because
such work requires specific expertise that the office currently does not maintain. As we work to
address the OIG's recommendations, we would be interested in working with the OIG, OEI,
OPEI, and OECA to more clearly define the roles and responsibilities the respective offices can
and should play with respect to fraud prevention, detection, and response.
With respect to the specific recommendations proposed in the draft report, we have
suggested modifications to several of them in an effort to improve their feasibility. We believe
some of the recommendations directed at development of ethics and fraud detection/deterrence
programs would be more appropriately directed to offices that manage those types of programs
for the Agency. Additionally, many of the recommendations asked that EPA "require" states
and/or laboratories to undertake specific activities. While OW can encourage the activities, we
note that EPA would only be able to require activities if they were included in a formal
regulation, which would require a high burden of proof. Because the OIG does not document the
extent of fraud related to drinking water samples, we believe a better approach would be to
encourage the use of best practices to minimize the potential for fraud.
In carrying out our Lab Cert program, we have successfully trained and mentored
hundreds of certification officers and we are committed to working with states and other offices
within EPA to identify best practices to ensure continuous improvement of the program for the
future. Thank you again for the opportunity to comment on the draft report. If you have
questions regarding our comments, please contact Cynthia C. Dougherty, Director, Office of
Ground Water and Drinking Water, at (202) 564-3750.
Attachments
57

-------
Attachment 1
OGWDW Response to Recommendations in IG Report on Laboratory Practices - August 28, 2006
Recommendation
EPA Response
Revised Recommendation
Recommendation 1: Prepare laboratory
certification officers for the conditions and
challenges they will face in testing
laboratories associated with fraud by
applying the following promising
techniques: a) Develop a Mandatory
Training and Education Program on Fraud;
b) Develop and/or use Available Guidance
on Fraud Awareness.
OW is prepared to strongly recommend
attendance at such training; however,
making this mandatory would require
regulation. We intend to continue to
integrate fraud discussion into the
Certification Officer training course,
expanding on the approach we have
recently employed. OW could also
incorporate more detailed training by OIG
into our on-site regional/state Certification
Officer meetings and could encourage
participation in fraud training offered by
others (e.g., Arizona). OW would
coordinate with OEI, OIG and OECA (i.e.,
those with the expertise) regarding the
development and presentation of any new
training. We would also work with the
Regions that have done presentations on
fraud to make their materials available to
others.
OW suggests that the recommendation
should be changed as follows:
Prepare laboratory certification officers
for the conditions and challenges they will
face in testing laboratories associated with
fraud by applying the following promising
techniques: Promote Training and
Education regarding Fraud and Integrate
Fraud Awareness into Laboratory
Certification Training.
Recommendation 2: Ensure that all individuals within OGWDW, regions, and States who have
oversight responsibility for laboratories analyzing drinking water samples are educated and
proficient in the proper procedures to follow should a laboratory be suspected of inappropriate
or fraudulent procedures. Specifically: a) Distribute written guidance and appropriate contacts
at the required course for State certification officers; copies of the guidance should also be
distributed to OGWDW regional and Technical Support Center staff. b) Establish the use of the
EPA fraud hotline for environmental testing laboratories; certified and accredited laboratories
should be provided with appropriate Office of Enforcement and Compliance Assurance or OIG
contacts to report possible misconduct. c) Work with the Office of Enforcement and Compliance
Assurance to determine if the form connected to the on-line violation reporting tool on EPA's
Web site could be used for laboratory fraud.
OW agrees that more can be done to raise awareness of proper procedures for reporting
suspected fraud and concurs with the recommendation.
No change


Recommendation 3: Create and require
a training course, exam, and standard
methods for the certification of
laboratories analyzing drinking water
samples for radiochemical contaminants.
OW is currently working with Minnesota
to assist in the presentation of
radiochemistry training in September
2006. Key aspects of the Laboratory
Certification Manual are being integrated
into the training. This training would
supplement the inorganic chemistry
training already presented to Certification
Officers. OW is also prepared to
encourage radiochemistry COs to
participate in training offered by the
Radiochemistry Society and similar
organizations.
OW suggests that the recommendation
should be changed as follows:
Create and use a training course, exam,
and standard methods for the certification
of laboratories analyzing drinking water
samples for radiochemical contaminants.
59

-------
Recommendation 4: Require all
certification officers use the following
promising techniques, already developed
by other groups in laboratory oversight. In
addition, require certified or accredited
laboratories to engage in techniques b and
c: a) Enhance On-Site and Follow-up
Audits to Include Techniques to Identify
and Deter Inappropriate Procedures and
Fraud b) Use Data Validation and
Verification Techniques c) Use Analyst
Notation and Sign-off on Manual
Integration Changes to Data.
OW is prepared to encourage these
techniques (highlighting them as part of
the Certification Officer training) and to
encourage participation in the Arizona
training course, which promotes the use of
these techniques. Given the State and
Regional concerns that have been
expressed regarding the resource burden
associated with the techniques vis-a-vis the
uncertain effectiveness of the techniques,
we would use our annual survey of the
Regions to assess effectiveness and adjust
our recommendations accordingly. Please
note that the substantive aspects of
Recommendation 5 have been integrated
into the revised recommendation.
OW suggests that the recommendation
should be changed as follows:
Encourage certification officers to use the
following promising techniques, already
developed by other groups in laboratory
oversight, to the extent that they are not
already employed. In addition, encourage
certified or accredited laboratories to
engage in techniques b and c: a) Enhance
On-Site and Follow-up Audits to Include
Techniques to Identify and Deter
Inappropriate Procedures and Fraud; b)
Use Data Validation and Verification
Techniques; c) Use Analyst Notation and
Sign-off on Manual Integration Changes to
Data; d) Review Raw Electronic Data and
use Electronic Data Analysis Tape Audits;
e) Review Inventory of Laboratory
Supplies; f) Include Double Blind
Proficiency Testing Samples Reform (or a
combination of Double Blind and Split
Sample Analysis); and g) Conduct Data
Accuracy Reviews.
Recommendation 5: Require the following techniques on a trial basis in a subset of laboratories,
using certification officers or third party auditors trained and fully proficient in their use: use
each technique in at least 25 percent of drinking water laboratories, and provide incentives for
States. a) Review Raw Electronic Data and use Electronic Data Analysis/Tape Audits; b) Review
Inventory of Laboratory Supplies; c) Include Double Blind Proficiency Testing Samples Reform
(or a combination of Double Blind and Split Sample Analysis); d) Conduct Data Accuracy
Reviews.
OW suggests that Recommendation 5 be deleted, since the substantive aspects have been
integrated into our suggested revisions for Recommendation 4.
Delete


Recommendation 6: At least every 3
years, perform a periodic assessment to: a)
Review the drinking water sample analysis
process for the existence of vulnerabilities,
b) Assess the extent to which inappropriate
and fraudulent procedures are occurring
(using techniques described in
Recommendation 5). c) Assess protection
processes and techniques for effectiveness.
As part of this periodic assessment, review
State certification and accreditation
programs, provide incentives for States and
laboratories that identify new protection
techniques, and adjust laboratory and
certification method requirements as
needed.
OW is prepared to incorporate the
identification of new vulnerabilities into
our regional questionnaire. OW is also
prepared to work with OIG to review and
assess available information regarding the
number/nature of fraud allegations,
investigations, and findings (to the extent
that such information can be made
available to OW). Regarding the
assessment of techniques for effectiveness,
OW will encourage Regions to assess State
certification and accreditation programs for
the effectiveness of techniques; see also
OW's comments regarding
Recommendation 4. OW will continue to
incorporate appropriate QC into methods,
adjusting as need be, and to update the
guidance provided via the Laboratory
Certification Manual. OW welcomes OIG
suggestions regarding potential State and
laboratory incentives.
OW suggests that the recommendation
should be changed as follows:
At least every 3 years, perform a periodic
assessment to: a) Review the drinking
water sample analysis process for the
existence of vulnerabilities. b) Assess the
extent to which inappropriate and
fraudulent procedures are occurring, c)
Assess protection processes and techniques
for effectiveness. Explore incentives to
encourage states and laboratories to adopt
innovative practices (including, but not
limited to, those included under
Recommendation #4).
Recommendation 7: Set up a workgroup - including representatives from regions, States and
laboratories - to review the sample collection requirements and determine if vulnerabilities can
be minimized through sample collector requirements, accreditation or licensing.
OW provides substantial training and guidance associated with sample collection and will
continue to look for opportunities to improve such. OW has monthly calls with regional
certification officers to discuss drinking water laboratory issues and has also begun to
participate in the state workgroup described in the report that meets periodically to discuss
environmental laboratory issues.
OW suggests that the recommendation should be changed as follows:
Continue to work with regions, states, sample collectors and laboratories to review requirements
and guidance associated with sample collection and seek opportunities to minimize
vulnerabilities.
Recommendation 8: Work with Agency
contract officers and the Office of Policy,
Economics and Innovation to determine if
a procurement policy for States and public
water systems, including language similar
to what is under development by DoD
specifying a list of prohibited practices and
possible incentives for laboratories or
analysts that meet higher integrity
standards, can be developed to offset
economic pressures to cut corners in the
environmental testing laboratories that
analyze drinking water.
OW suggests that this recommendation be
deleted. If this recommendation is
maintained in the final report, OW
suggests that it be changed to apply to
users of all environmental laboratories and
that the recommendation be directed to
OPEI. Perhaps guidance, based on DoD
language and any lessons-learned from
OSWER, ORD and/or others, could be
developed and distributed Agency-wide
and externally.
OW suggests that the recommendation
should be changed as follows:
OPEI should work with Agency contract
officers and program offices to provide
appropriate procurement guidance for
EPA, States and environmental labs,
(including language similar to what is
under development by DoD) specifying a
list of prohibited practices and possible
incentives for laboratories or analysts that
meet higher integrity standards.
Recommendation 9: Provide the following training programs and guidance information for
laboratories that analyze drinking water samples: a) Require an Ethics Policy/Program for All
Certified Laboratories; b) Implement a Fraud Detection and Deterrence Policy/Program.
OW recommends that the recommendation be revised to apply to all environmental laboratories
and that the recommendation be directed to OEI, who could develop guidance and distribute it
Agency-wide. OW would then add an appendix to the Laboratory Certification Manual
encouraging the development of ethics policy/programs and fraud detection/deterrence
programs by laboratories.
OW suggests that the recommendation should be changed as follows:
OEI should provide guidance regarding the use of Ethics Policy/Programs and Fraud Detection
and Deterrence Policy/Programs in laboratories. OW should include relevant guidance as an
appendix to the Laboratory Certification Manual to encourage laboratories to adopt ethics
policy/programs and fraud detection/deterrence programs.
Recommendations 10 and 11: Create a
mechanism to identify data in Agency
databases originating from laboratories
under investigation, indictment and/or
conviction and develop an Agency-wide
policy on how data originating from
laboratories under investigation,
indictment, and/or conviction will be
handled.
OW understands that these
recommendations have been directed to
OEI and will defer to OEI to address. We
note, however, that we would anticipate
issues associated with adverse actions
against a laboratory that is not yet
convicted of a crime.

63

-------
Attachment 2
Technical Comments on Draft Report
Chapter 1
•	Page 1, first paragraph under Public Health Considerations. Based on information provided
in the supplemental report, we believe the second sentence should be edited to read "EPA has
not yet conducted a national review of laboratory performance or analyzed State data on
laboratory deficiencies to determine the extent to which public health may or may not be at
risk from inappropriate or fraudulent laboratory procedures, although a survey of EPA
Regions and discussions with a number of States suggest that fraud and inappropriate
procedures occur infrequently in drinking water laboratories and the impact is low."
•	Page 2, paragraph below Figure 1.1. The report indicates that "The completed OIG
investigation found machine calibrations associated with volatile and semi-volatile organic
analyses were altered; tests for contaminants including methyl tertiary butyl ether (MTBE),
nitrobenzene, tert-butyl alcohol, and dichlorodifluoromethane were affected." We believe the
report should also note that the contaminants cited are not regulated as national primary
drinking water standards.
•	Page 2, third paragraph below Figure 1.1. The paragraph implies that the case of increased
lead in the drinking water for the District of Columbia was due to fraud. We are not aware of
any such findings, and as such, do not believe it is appropriate to include this as an example.
We recommend that the paragraph be deleted.
•	Page 3, first paragraph under EPA Roles in Laboratory Certification. We believe the fourth
sentence should be edited to reflect EPA efforts as follows: "SDWA does not specify the
nature of the audit, although EPA has developed audit training and offers substantial
guidance through a laboratory certification manual."
•	Page 4, first paragraph under State Roles Related to Public and Private Laboratories. The
report should clearly identify the role of states by changing the first sentence as follows: "It
is the States, for the most part, that actually issue public and private laboratory certification
or accreditation status for the analysis of public drinking water samples and have direct
oversight responsibility for those laboratories."
•	Page 4, second paragraph under State Roles Related to Public and Private Laboratories. We
believe the first sentence should be edited to reflect EPA efforts as follows: "OGWDW
has historically not offered a specific radiochemistry course or exam, although it recently
partnered with a provider of such a training for a September 2006 course."
•	Page 5, second paragraph under section on Occurrence of Fraud and Inappropriate
Procedures. We recommend that the report clearly identify the number of cases and address
the comments provided below with the excerpted text.
"Over the past 6 years, the number of laboratory fraud cases reported to the EPA OIG
Office of Investigations has increased steadily (Figure 1.2). Drinking water cases
(i.e., allegations of fraud) increased by more than 40 percent from Fiscal Years 2000
to 2003, from (#) to (#) (Comment: What are the figures for "convictions" vs.
"cases"?). Laboratories responsible for analyzing public drinking water samples
represented over 35 percent of the 44 OIG laboratory fraud cases in 2004 and just
fewer than 30 percent of the 58 cases in 2005. (Comment: Many of those
laboratories likely do other non-drinking water work. Were the investigations to
64

-------
which OIG refers specifically related to drinking water analyses by those labs?) The
total percentage of certified laboratories under investigation or convicted of
fraudulent procedures is currently unknown, since no national database tracks
certified drinking water laboratories by name. (Comment: There are approximately
6,000 laboratories certified for drinking water, so this likely represents a very small
percentage.)"
•	Page 5, third paragraph under section on Occurrence of Fraud and Inappropriate
Procedures. Given the scope of this investigation, the figures presented in the third sentence
should be specific to drinking water; they should not be coupled with wastewater. The fourth
sentence indicates that "Several of these investigations involved fraudulent laboratory
analyses or monitoring reports, used to determine the compliance of public water supplies
with Federal drinking water standards." Does the term "monitoring report" refer to the
laboratory report or reports provided to the state from the public water system? Given the
scope of this investigation, we believe it should only apply to laboratory analysis reports.
Please clarify if appropriate.
•	Page 5, last paragraph. The report should clarify that the focus of the 1999 report was not on
drinking water. We believe the first sentence should be edited as follows: "A 1999 OIG
memo, based on a review of programs with OECA, OPPTS, ORD, and OSWER, cited
problems with environmental laboratory data integrity and provided suggestions for
improvement."
•	Page 6, paragraph which begins "In our evaluation..." We believe it is relevant to indicate
the source of the respective audits in the second sentence. "In fact, while Arizona auditors
found severe problems, including fraudulent procedures during a drinking water laboratory
certification audit, another State, performing a NELAC accreditation audit for the same
laboratory around the same time period, issued a report of no findings."
•	Page 6, first paragraph under Figure 1.3. The example provided for Kentucky is an example
of an "inappropriate" practice that the laboratory certification program is designed to detect,
and did, in fact, detect using techniques taught in our training course. We recommend that
this paragraph be deleted, or presented in a different context (i.e. in the context of the
certification program being effective at addressing inappropriate procedures).
Chapter 2
•	Page 9, in section titled OGWDW Acknowledged Sample Analysis Process Vulnerabilities.
Rather than addressing in footnote #6, we believe that this title and other relevant text be
changed to clearly indicate that "OGWDW and ORD laboratory certification team members"
responded to the questionnaire and offered opinions on potential vulnerabilities.
•	Page 12, last paragraph beginning "We did not find any plans...". We believe the first
sentence should be edited to reflect EPA efforts as follows: "We did not find any plans to
require certified laboratories to abide by ethics programs or certification officers to acquire
training in additional auditing techniques, data integrity concepts, fraud detection or
reporting, although OGWDW has recently encouraged certification officers to participate in
fraud/data auditing courses offered by others and has recently included presentations on such
in their own training courses."
•	Page 13, first paragraph under Lack of Laboratory Integrity Controls. We believe this
paragraph should be edited to reflect OGWDW's view that its program does address
65

-------
inappropriate procedures. We recommend the following edits to the second and third
sentences: "OGWDW interprets inappropriate and fraudulent laboratory procedures to fall
outside the scope of laboratory certification audits. When asked about inappropriate
procedures and fraud, a majority of EPA regions surveyed stated that the on-site certification
audit does not include, and is not currently designed to include, a review of the laboratory to
determine whether fraudulent or inappropriate procedures may be taking place (see OIG
Supplemental Regional Survey Report for further details)."
•	Page 13, second paragraph under Lack of Laboratory Integrity Controls. If the OIG intends
to include the numbers cited in the paragraph, they should provide additional details,
including analysis/explanation regarding the reasons for the delayed Regional reviews.
•	Page 14, paragraph beginning "EPA requires the use...". We believe the second sentence
should be edited to reflect EPA efforts as follows: "Although guidance is provided in the
form of certification officer training and the Laboratory Certification Manual, no Federal
regulations exist to prescribe or require the use of any specific techniques in the oversight
and audit process used to certify drinking water laboratories."
•	Page 14, paragraph beginning "New techniques to identify..." We recommend the last
sentence be deleted because it suggests that OW has been remiss in communicating with the
laboratory certification community. Historically, the Lab Cert Bulletin has been developed
on an as-needed basis. Since it was last issued, we have communicated with Regions in our
monthly conference calls and with Regions and States during program reviews and
certification training. We also worked extensively with the community to develop and
publish an updated Laboratory Certification Manual that consolidates guidance that would
otherwise be covered in bulletins and other forms of communication.
•	Page 14, paragraph beginning "OGWDW does not provide...". As noted earlier, we believe
the certification program does address inappropriate procedures. We also believe the last
sentence should be deleted because our Lab Cert manual does recommend continuing
education ["Periodic training for both laboratory auditors and analysts should be provided
by the Regions. Certification officers should attend refresher training programs every five
years to keep their knowledge current"]. We believe the paragraph should be edited as
follows:
OGWDW acknowledges that it has not provided regional staff or State certification
officers - the individuals who conduct on-site laboratory audits and determine whether a
lab is qualified to analyze water samples - a plan to address inappropriate and fraudulent
procedures when identified in laboratories. The certification officer requirements also do
not prepare the individuals who will be entering drinking water laboratories to identify
inappropriate or fraudulent procedures. Identifying and addressing inappropriate or
fraudulent procedures are not within the scope of the certification officers' training or
exam. Also, once an individual becomes a certification officer, that person is qualified
for life; there are no continuing education requirements despite constant technological
advancements."
Chapter 3
•	Page 19, first paragraph under Other EPA Offices. We recommend that the OIG delete the
last sentence. OW notes that the certification program was rated by the Regions as more
effective than the NELAP program in every measure included in the Regional survey
66

-------
described in the Supplemental Report. To the extent that OIG is suggesting that OW consider
adding some NELAP elements to the certification program, OW concurs and is examining
this.
•	Page 19, third paragraph under Other EPA Offices. We believe the second sentence should
be edited as follows: "OGWDW and OEI have not communicated about the use of these
tools and OGWDW continues to operate the drinking water laboratory certification program
without the requirement or inclusion of methods to identify or address inappropriate
procedures and fraud." We have provided comments explaining how our program prepares
individuals to identify and/or address inappropriate procedures. Even so, the issue does not
seem relevant to this paragraph.
•	Page 20, fourth paragraph under States. The example provided for the State of Kentucky
does not really represent an innovative approach. OW's laboratory certification manual
states that positive/negative controls should be run for all methods performed. The manual
also outlines training criteria for analysts. We recommend that this fact be noted or the
paragraph be deleted.
•	Page 20, fifth paragraph under States. To accurately reflect current conditions and to
recognize that calibration procedures are addressed in Section 10 of the promulgated methods
that are used for compliance monitoring, we suggest the following edits to the second
through fourth sentences. "Any environmental or drinking water auditor is welcome to
participate in the call, although EPA has only recently begun to participate in calls and
communicate regularly with the group. The group provides an outlet for questions
concerning EPA standard methods and specific applications in drinking water
laboratories. For example, there are no guidelines for specific calibration procedures, so the
group is currently in the process of building a calibration/manual integration policy for
submission."
67

-------
Appendix H
Office of Environmental Information Response
MEMORANDUM
SUBJECT: OEI Response to OIG's Draft Report: "Promising Techniques Identified to
Improve Drinking Water Laboratory Integrity and Reduce Public Health Risks"
FROM: Linda A. Travers
Acting Assistant Administrator and Chief Information Officer
TO:	Dan Engelberg
Director of Program Evaluation, Water Issues
Office of Inspector General
Thank you for the opportunity to review and comment on the draft report
"Promising Techniques Identified to Improve Drinking Water Laboratory Integrity and Reduce
Public Health Risks." We appreciate your efforts to ensure the clarity of your findings.
This memorandum responds to the specific findings and recommendations for the Office
of Environmental Information. Please contact Reggie Cheatham, Director, Quality Staff, at 202-
564-7713 if you have any questions or need additional information.
cc: Benjamin Grumbles, Assistant Administrator for Water
Reggie Cheatham, Director, OEI Quality Staff
Charles Cavanaugh, Special Assistant, OEI
68

-------
OEI Response to Draft Report
"Promising Techniques Identified to Improve Drinking Water Laboratory Integrity and
Reduce Public Health Risks"
August 19, 2006
Summary of Quality Staff's review of the May 2006 discussion draft report:
OEI submitted comments on a review draft of "Promising Techniques Identified to Improve
Drinking Water Laboratory Integrity and Reduce Public Health Risks" on May 19, 2006.
Throughout the document, our specific corrections to statements about OEI responsibilities,
resources provided to the laboratory community, and activities were accepted verbatim. In
addition, with the following exception, our requested changes and comments, intended to
improve clarity and understanding, were addressed.
We commented that three of the "most severe vulnerabilities" identified in Chapter 2, Figure 2.4
(on page 11 of this version) are related to sample collection, and thus are beyond the control of
the laboratory, which typically receives drinking water samples by courier or delivery service
(and has no role in their collection). OIG declined to mention this point in Chapter 2 of its
revised version and in Appendices B, C, and D, which display expanded lists of sample
collection activities that are vulnerable to fraud. We consider that this shortcoming detracts from
the utility and objectivity of the report, but does not impact the recommended actions for OEI.
Recommendation:
In its draft report, OIG recommends that EPA assess drinking water laboratory integrity and
incorporate promising techniques to identify inappropriate procedures and fraud into the required
elements of the laboratory oversight process. Specifically, OIG recommends that OEI, as the
national program manager for quality:
-	Create a mechanism to identify data in Agency databases originating from laboratories under
investigation, indictment, and/or conviction, and
-	Develop an Agency-wide policy on how data originating from laboratories under
investigation, indictment, and/or conviction will be handled.
OEI Response:
OEI is committed to the quality of data in Agency databases. To this end, we have an active
corrective action strategy for the Data Quality weakness identified under the Federal Managers'
Financial Integrity Act (FMFIA). In response to FMFIA, EPA has developed an effective Data
Standards Program which now requires continuous monitoring as discussed in Attachment 2.
Organizations, such as States and Tribes, are working together with OEI to develop data
standards for the exchange of environmental data. OEI has developed a number of tools,
processes, and guidance documents to facilitate the implementation of data standards by, e.g.:
-	Providing technical assistance to National Program Managers, Regions, and States; and
69

-------
- Establishing the technical and business guidelines for the use of standard data elements. This
process is used to assist in data integration, improve reliability, and enable the rapid
aggregation of data for emergency response. For more information, see the Environmental
Data Standards Council Web site at http://www.envdatastandards.net/.
In 1999, EPA established the Quality and Information Council (QIC) with the intention that the
QIC provide a mechanism to address enterprise-wide issues and to develop Agency policies to
guide EPA decision makers in the area of information technology/information management and
related issues within the framework of OEI. In order to address OIG's recommendations, OEI
will submit these to the Agency's QIC Steering Committee (SC) for action at the first QIC SC
quarterly meeting following the issuance of the final report. The primary role of the QIC SC is
to assist the QIC in the development of the IT/IM and related policy agenda. Additionally, the
QIC SC is charged with resolving issues that are not appropriate for QIC action or that are
specifically delegated to the QIC SC (such as establishment of procedural or guidance
documents in support of a given policy). The QIC SC members will determine how to address
the OIG recommendations on an Agency-wide basis.
More information about the QIC is available at
http://intranet.epa.gov/oeiintra/imitpolicy/qic/index.htm
Current activities related to Recommendation:
It should be noted that the Agency and OEI have several activities in progress that are intended
to capture "metadata," e.g., identifying the laboratories generating data, as well as the quality
control and other parameters associated with those data. Please review the metadata information
lists recommended by the National Water Quality Monitoring Council and the Methods and Data
Comparability Board located at: http://acwi.gov/methods/data products/index.html
In addition, the final Environmental Sampling, Analysis and Results data standard information is
available on: http://www.envdatastandards.net/section/standards and
http://www.envdatastandards.net/content/article/detail/649
70

-------
Appendix I
Distribution
Office of the Administrator
Assistant Administrator, Office of Water
Assistant Administrator, Office of Environmental Information
Assistant Administrator, Office of Research and Development
Assistant Administrator, Office of Enforcement and Compliance Assurance
Director, Office of Ground Water and Drinking Water
Director, Quality Systems, Office of Environmental Information
Agency Followup Official (the CFO)
Agency Followup Coordinator
Audit Liaison, Office of Water
Audit Liaison, Office of Environmental Information
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for Public Affairs
General Counsel
Acting Inspector General
71

-------