March 1992
TRAINING COURSE FOR ON-SITE
LABORATORY EVALUATIONS
Presented by
Quality Assurance Department
Lockheed Engineering & Sciences Company
1050 E. Flamingo Road
Las Vegas, Nevada 89119
and
Environmental Monitoring Systems Laboratory
U.S. Environmental Protection Agency
944 E. Harmon
Las Vegas, Nevada 89119
-------
SECTION A
COLLECTING DATA, IDENTIFYING LABORATORIES,
AND PREPARING FOR EVALUATIONS
OVERVIEW OF THE ON-SITE LABORATORY EVALUATION PROCESS
In the U.S. Environmental Protection Agency's (EPA's) Contract Laboratory Program (CLP),
the first step in preparing for an on-site visit is to collect the available analytical and quality
control information from the environmental laboratory that is to be visited. This information
comes in the form of reports or data that are generated by or supplied to the Environmental
Monitoring Systems Laboratory in Las Vegas, NV (EMSL-LV). This information includes
data reviews (audits), quarterly blind reports, regional performance evaluation reports, Sample
Management Office (SMO) Contract Compliance Screening (CCS) and timeliness reports,
regional reviews, and reviews of standard operating procedures (SOPs) and resumes. All of
this material helps an auditor evaluate the quality of work performed by a laboratory. The
auditor uses the information to prepare for a routine on-site visit or for a visit to a laboratory
that had a large number of defects in data submitted. All the information is kept on file so
that a laboratory's performance can be tracked over time.
After visiting the laboratory, the auditor prepares an on-site evaluation report that details
problems found in the laboratory. The report also details any defects noted in the submitted
data. The completed report is forwarded to the laboratory, which has 14 days to reply to any
criticisms and to rectify defects, if possible.
COURSE STRUCTURE
The course is divided into four modules. Section A consists of classroom instruction. All of
the topics mentioned above are covered in as much detail as participants require (time will be
scheduled for questions and discussion). Special emphasis is placed on defining each type of
data that the laboratory supplies and on the uses of each data type for evaluation purposes.
Section B includes instruction on conducting a laboratory on-site visit. Section C covers a
simulated visit to a laboratory called W.B. Goode Laboratories. This
simulation may be presented as an audio tape and slide show or as an actual walk-through of
a laboratory. Section D includes instruction in preparing the evaluation report. Ample time
will be allowed for discussion and questions.
A-1
-------
COLLECTING LABORATORY MONITORING DATA
MONITORING LABORATORY PERFORMANCE
Numerous data types can serve as valuable tools when evaluating laboratory performance.
Among them are data reviews (or audits); quarterly blind scores; regional performance
evaluation (PE) standards results; Sample Management Office CCS and timeliness reports;
regional reviews; on-site reports; laboratory responses to audits and on-site reports; and SOPs and
resumes.
DATA REVIEWS (AUDITS)
Data reviews are performed at EMSL-LV to assess the technical quality of the analytical data
and to evaluate overall laboratory performance. Data packages are randomly selected from
cases received or by criteria specified in this section. The packages are audited for
completeness, data quality, and contractual compliance by using standardized procedures
based on the total number of defects. Data packages for the inorganic and organic CLP
contracts are routinely examined on a quarterly basis. A standardized data review form,
which contains a checklist, is used to prepare uniform reports. The overall technical data
quality of each package is scored on a scale of 0 to 100, with 100 being the best score.
These scores provide a mechanism to track data quality of individual laboratories over time.
In addition to random selection, laboratory data packages are selected for auditing by using
one or more of the following criteria:
Follow-up reviews are done for laboratories exhibiting poor performance on
previous data reviews, gas chromatography/mass spectrometry (GC/MS) tape
reviews, on-site evaluations, or quarterly blind (QB) performance evaluations.
Special requests are received from the Analytical Operations Branch (AOB),
the regional offices, the Office of the Inspector General (OIG), and from other
EPA offices.
The first data package submitted by a laboratory, either at the start of a new
contract or after a relocation of the laboratory, is audited to ensure contract
compliance.
A data audit is performed on a certain percentage of the data packages received
at a minimum of once per quarter for each laboratory.
Each data review comprises a set of forms that the auditor completes during the data package
review. (See Appendix B for the list of raw data that is reviewed.) The overall technical
quality is scored according to the number of errors found in the data package. In a summary
section of the audit, these errors and scores are explained in detail for perusal by the
A-2
-------
laboratory that was audited. The summary covers such items as raw data that were not
contained in the case, technical areas where requirements were not met, unexplained
discrepancies between sample results and raw data, and analytical problems.
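The defect-based scoring described above can be illustrated with a small sketch. The manual does not give the actual EMSL-LV scoring algorithm, so the defect categories and penalty weights below are purely hypothetical placeholders:

```python
# Hypothetical sketch of a defect-weighted audit score. The real
# EMSL-LV algorithm and weights are not specified in this manual;
# the categories and penalties here are illustrative only.

PENALTIES = {
    "missing_raw_data": 5,              # raw data not contained in the case
    "technical_requirement_not_met": 3, # technical areas out of spec
    "result_discrepancy": 2,            # sample results vs. raw data
    "analytical_problem": 4,            # other analytical problems
}

def audit_score(defects):
    """defects: dict mapping category -> count of defects found.
    Start at 100 and subtract weighted defects, flooring at 0."""
    total = sum(PENALTIES[cat] * n for cat, n in defects.items())
    return max(0, 100 - total)

print(audit_score({"missing_raw_data": 2, "result_discrepancy": 3}))  # 84
```

A package with no defects scores 100; heavily defective packages bottom out at 0, matching the 0-to-100 scale used for tracking laboratories over time.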
Inorganic Audit Process
The inorganic data review report is divided into several sections, covering the various aspects
of the data package that the auditor examines in detail.
Technical Review In this section the raw data are examined for technical quality and to
ensure that quality control (QC) data were acceptable. This process involves a detailed check
of raw data from analysis by Inductively Coupled Plasma spectroscopy (ICP), graphite
furnace atomic absorption spectroscopy (GFAA), mercury cold vapor atomic absorption
spectroscopy, and colorimetric spectroscopy (cyanide). The review of this information is
performed to ensure that the laboratory is meeting all of the technical requirements of the
contract. This process, in turn, helps maintain consistency within the program.
The following results and required quality control should be present in the raw data:
Analytical sample results.
Initial and continuing calibration verification (ICV and CCV) standards.
Low-concentration contract-required detection limit (CRDL) standards.
Blanks.
ICP interference check sample.
Pre-digest spike.
Post-digest spike (if required).
Duplicates.
Laboratory control sample.
Method of standard addition (MSA, if required).
ICP serial dilution.
ICP raw data:
Raw data must be a direct, real-time readout containing time analyzed,
instrument identification, and calibration standard preparation date.
All samples and QC should be identified properly.
Samples should be diluted when over 5% of the established linear range.
Multiple readings and the averages should be present for each of the
standardization and sample results in the raw data.
No deviations should be observed in the analytical methods.
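Two of the ICP raw-data checks above lend themselves to a short sketch. The function and field names are assumptions (a real audit reads these values from the instrument printout), and the 5% dilution rule follows one plausible reading of the text, namely a result more than 5% above the upper limit of the linear range:

```python
# Sketch of two ICP raw-data checks; names and the 5% reading
# are assumptions, not the contractual definitions.

def average_matches(readings, reported_avg, tol=0.01):
    """Multiple readings must be present, and their mean must agree
    with the reported average within a small relative tolerance."""
    if len(readings) < 2:
        return False
    mean = sum(readings) / len(readings)
    return abs(mean - reported_avg) <= tol * max(abs(reported_avg), 1e-9)

def needs_dilution(result, linear_range_upper):
    """Flag a sample for dilution when the result is more than 5%
    above the established linear range (assumed interpretation)."""
    return result > 1.05 * linear_range_upper
```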
A-3
-------
GFAA raw data:
The raw data must be a direct, real-time readout containing the time each
measurement was performed, instrument identification, and calibration standard
preparation date.
A four-point calibration curve must be used.
Calibration standards that are not contained within the four-point calibration
curve (due to instrument limitations) should be analyzed after calibration, and
they must be within ±5% of the true value. This does not apply to the CRDL
standard.
GFAA analysis should be reported according to the analysis scheme in the
Statement of Work (SOW).
All GFAA analyses should be clearly and sequentially identified in the raw
data.
No deviations should be observed in the contract-required analytical methods.
Mercury cold vapor raw data:
The raw data must be a direct, real-time readout containing the time each
measurement was performed, instrument identification, and calibration
preparation date.
All samples and QC should be identified properly in the raw data.
A five-point calibration curve must be used.
No deviations should be observed in the contractually-required analytical
method.
Cyanide raw data:
The raw data must be a direct, real-time readout, containing the time each
measurement was performed, instrument identification, and calibration
preparation date.
All samples and QC should be identified properly in the raw data.
A four-point calibration curve must be used.
No deviations should be observed in the contractually-required analytical
method.
The following sample receipt and preparation items should be submitted with the data
package:
SMO sample traffic reports.
Percent solids log.
Digestion and distillation logs showing that the weights and volumes given in
the SOW have been followed.
Documentation showing that the cyanide initial calibration verification standard
A-4
-------
was distilled with the samples.
Documentation showing that both mercury and cyanide holding times were
met.
Review of Quarterly Verification Submissions, Personnel Qualifications, and SOPs This
section of the audit covers all deliverables (data packages, SOPs, etc.) that are contractually
required to be submitted to the EPA, in conjunction with the preaward data packages, and as
they are updated. The following items are examined for those deliverables:
Verify that the raw data have been submitted with the quarterly report of
instrument performance and that the instrument detection limits (IDLs) have
been calculated correctly.
Confirm that the laboratory personnel meet the educational and experience
requirements in the SOW.
Verify that the written SOPs from the laboratory cover all areas required in the
SOW.
Quality Assurance and Quality Control (QA/QC) This section of the audit is an overview
of the QC data that have been submitted with a case. Approximately 15% of the raw data is
checked for transcription or calculation errors. All QC values that are out of control or that
are flagged incorrectly are listed in this section of the audit. The following QC items are
checked in this section:
Calculation errors
Sample results are checked for accuracy to ensure that they were transcribed
correctly.
Initial and Continuing Calibration, and CRDL Standards
Verify that all calibration and CRDL standard percent recoveries are calculated
correctly.
Confirm that all initial and continuing calibration percent recoveries meet the
EPA-required control limits.
Verify that all CRI values (CRDL standard for ICP) are within 80 to 120% of
the true value. This range is not contract-required, but considered by EMSL-
LV to be a good laboratory practice.
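The recovery checks above reduce to one calculation applied with different windows. A minimal sketch (function names are illustrative):

```python
# Percent recovery check, as applied to ICV/CCV and CRDL (CRI)
# standards. The 80-120% window shown is the CRI practice noted
# above; contractual windows for other standards differ.

def percent_recovery(found, true_value):
    """%R = 100 * found / true value."""
    return 100.0 * found / true_value

def within_limits(found, true_value, lower=80.0, upper=120.0):
    """Check a recovery against a control window."""
    return lower <= percent_recovery(found, true_value) <= upper

print(percent_recovery(9.5, 10.0))  # 95.0
print(within_limits(7.9, 10.0))     # False (79% recovery)
```

The same helper checks the 75-125% pre- and post-digest spike window and the ±20% LCS window by passing different limits.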
Blanks
Verify that all initial and continuing calibration blanks, as well as all
preparation blanks, are within ± CRDL.
A-5
-------
ICP Interference Check Sample
Confirm that all percentage values for ICP interference check sample solution
AB (ICSAB) are calculated correctly and are within ± 20% of the true value.
Make sure that all analytical values for ICP interference check sample solution
A (ICSA) are within ± CRDL.
Pre-Digested and Post-Digested Spike Results
Verify that all spike recoveries are calculated correctly and are within 75 to
125%.
Duplicates
Confirm that all duplicate relative percent difference (RPD) values are
calculated correctly and are within the 20% acceptance window for sample
values greater than five times the CRDL, or ± CRDL for sample values less
than five times the CRDL.
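The dual duplicate window above can be sketched as follows. Which of the two results governs the five-times-CRDL comparison is an assumption here (the SOW defines it precisely):

```python
# Duplicate acceptance check: RPD <= 20% for results above 5x CRDL,
# otherwise absolute difference within +/- CRDL. Using the original
# sample result to select the window is an assumption.

def rpd(a, b):
    """Relative percent difference between duplicate results."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

def duplicates_acceptable(sample, duplicate, crdl):
    if sample > 5 * crdl:
        return rpd(sample, duplicate) <= 20.0
    return abs(sample - duplicate) <= crdl
```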
Laboratory Control Sample (LCS)
Verify that all LCS percent recovery values are calculated correctly and are
within the acceptance window of ± 20% of the true value.
Matrix Effect Correction
List all MSA analyses that have not achieved correlation coefficients of >
0.995.
List all serial dilution results that do not fall within ± 10% of the original
determination.
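Both matrix-effect checks above are simple numeric tests. A sketch, with the MSA correlation computed as a Pearson coefficient over the standard-addition regression points (the helper names are illustrative):

```python
import math

def correlation_coefficient(xs, ys):
    """Pearson r for an MSA regression (e.g., absorbance vs. added
    concentration); MSA analyses with r < 0.995 are listed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def serial_dilution_ok(original, dilution_corrected):
    """The dilution result, corrected for the dilution factor, must
    fall within +/- 10% of the original determination."""
    return abs(dilution_corrected - original) <= 0.10 * original
```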
When the laboratory data package has been reviewed, a final report is generated and reviewed
internally for completeness. The report is then sent to EPA Headquarters and the Region.
The laboratory receives the report from the Region and should take corrective action, if
required.
Organic Audit Process
The organic data review is also divided into several sections. The data package from the
laboratory is carefully checked for the contractual quality control requirements listed in each
of those sections. As with the inorganic data review, this review is performed to ensure that
the laboratory is meeting all of the technical requirements of the contract. This process, in
turn, helps maintain consistency within the program.
A-6
-------
QC Forms and Results - QC forms and results are checked for the following:
System Monitoring Compound Percent Recovery
The system monitoring compound (SMC), or surrogate, must be spiked into each
sample, matrix spike, matrix spike duplicate, and blank prior to purging or extraction.
SMCs are evaluated by determining whether the percent recovery falls inside the
contract-required limits.
Matrix Spike and Matrix Spike Duplicate Percent Recovery
The percent recovery of each matrix spike (MS) and matrix spike duplicate (MSD) is
used to evaluate the degree of matrix effect of the sample upon the analytical
methodology. Matrix spike and matrix spike duplicate recoveries are evaluated to
ensure that they meet contract-required limits.
Instrument Performance Check Solution
BFB (4-bromofluorobenzene) and DFTPP (decafluorotriphenylphosphine) must be used
to determine whether or not the GC/MS system meets the mass spectral ion abundance
criteria. The criteria are designed to control and monitor instrument performance. No
samples, blanks, or calibration standards should be analyzed prior to tuning with BFB
and DFTPP. The instrument performance check solutions (BFB and DFTPP) are
contractually valid for a maximum of 12 hours.
Calibration
Initial and continuing calibrations must be evaluated to ensure that an adequate system
response was achieved, as measured by relative response factors (RRF) for all
compounds analyzed by the method. Initial calibrations must be linear over a range of
concentrations. The continuing calibration is checked for precision against its
corresponding initial calibration through the evaluation of percent differences (%D).
Internal Standard Areas
Internal standard areas are used to check instrument stability.
Pesticide Linearity Requirements
The GC system is initially checked for linearity of response.
A-7
-------
Raw Data - Raw data are checked for the following requirements:
Calculation of Sample Results
The reported results should accurately reflect the results found in the raw data. If not,
check for calculation and transcription errors. Discrepancies can be an indication of
poor training or review procedures in the laboratory.
Chromatographic Interpretation
Chromatograms are checked to ensure that technical specifications were met for the
following:
Is the chromatogram normalized to the largest non-solvent peak?
Does the chromatogram indicate saturation or non-resolution of peaks?
Are all peaks labeled and identified on the chromatogram?
Do the chromatogram and quantitation report agree?
Spectral Interpretation
Mass spectra are reviewed to ensure that contract requirements were met for the
following:
Are the required mass spectra submitted?
Do the mass spectral interpretations indicate an analyst review?
Do the mass spectra support the laboratory identification of target compound
list (TCL) and tentatively identified compounds (TIC)?
Quantitation Reports
Quantitation reports are checked to ensure that requirements were met for the
following:
Was manual integration versus automated integration used to determine peak
areas?
Is the correct quantitation ion used?
Is the relative retention time of the compound in close agreement to relative
retention time found in the continuing calibration?
Do the laboratory-reported TCL/TIC concentrations agree with those generated
by the auditor?
Are all compounds given in the correct elution order?
Are correct concentrations of internal standards and system monitoring
compound used by the laboratory?
A- 8
-------
Reanalysis Requirement - The following items are reviewed to assess if reanalyses are
required:
Do the system monitoring compounds percent recoveries meet contract criteria?
Do the retention times for all internal standards change by less than 30
seconds?
Does each internal standard area change by less than a factor of two?
Does each method blank contain less than the maximum allowable level of
contamination?
Do all calibrations meet linearity requirements?
Are endrin and 4,4'-DDT (dichlorodiphenyltrichloroethane) percent breakdowns
less than 20% in the pesticide fraction?
Are all sample analyses performed inside the required time interval?
Are all sample analyses performed under valid instrument performance checks
and calibrations?
Are all analyses performed in the required sequence?
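Three of the numeric reanalysis triggers above can be sketched as predicates (function names are illustrative; the limits are the ones stated in the list):

```python
# Reanalysis-trigger checks drawn from the list above.

def is_area_ok(sample_area, calibration_area):
    """Internal standard area must change by less than a factor of
    two from the daily calibration standard."""
    return calibration_area / 2.0 < sample_area < calibration_area * 2.0

def is_rt_ok(sample_rt_s, calibration_rt_s):
    """Internal standard retention time must shift by less than 30
    seconds (times in seconds)."""
    return abs(sample_rt_s - calibration_rt_s) < 30.0

def breakdown_ok(percent_breakdown):
    """Endrin / 4,4'-DDT breakdown must be below 20% in the
    pesticide fraction."""
    return percent_breakdown < 20.0
```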
Review and Distribution
To ensure completeness and technical accuracy of the data audit, a thorough review is
performed when the audit is completed. The final audit report is sent to the EPA for review
and distribution.
GC/MS Raw Data Audit Process (Magnetic Tape Audits)
Raw data audits are performed on cases selected for hard copy data audits. In the past, two
separate reports were generated for hard-copy and tape audits. The audits are now combined
as a single organic data review. The following is a description of the parameters reviewed
during the tape audit portion of the organic data review:
Entire files are provided and complete.
BFB and DFTPP instrument performance checks are present and complete.
Initial and continuing calibration files for volatile and semivolatiles are
complete.
Chromatograms for calibrations and samples are submitted.
All chromatograms indicate adequate compound resolution.
All compounds are correctly identified.
Quantitation files are complete.
A-9
-------
GC/MS Raw Data Evaluation Procedure - Reviews of GC/MS raw data are performed to
assess the quality of the data provided by laboratories and to evaluate laboratory performance.
The raw data are evaluated for adequacy of method analyses and QC criteria as specified in
the contract. GC/MS tuning and mass resolution files and system parameters are examined
for the following:
Correct ion abundances and ratios were reported for each BFB and DFTPP
tune.
The percent relative abundances of the tuning compounds are determined. A
standard background subtraction technique has not been established in the
contract. If an averaging technique is used, one representative background ion
spectrum is used for subtraction. For summing techniques, several background
scans equaling the number of summed tuning compound scans are used for
subtraction.
The spectra of the subtracted background are checked to ensure that the ions
contained therein are not those of the tuning compound.
The system parameters are checked to ensure that the laboratory is using the
contract-required mass scan range and the required scan time. The electron
multiplier voltage is checked to ensure that it has not been altered in the course
of a 12-hour calibration period.
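An ion-abundance tune check of the kind described above can be sketched as a table lookup. The criteria values below are placeholders only; the contractually required BFB and DFTPP criteria are specified in the SOW:

```python
# Sketch of a tune ion-abundance check. EXAMPLE_CRITERIA values are
# illustrative placeholders, NOT the contractual BFB/DFTPP criteria.

EXAMPLE_CRITERIA = {
    # m/z: (min %, max %) relative abundance vs. the base peak
    95: (100.0, 100.0),
    96: (5.0, 9.0),
    174: (50.0, 100.0),
}

def tune_passes(abundances, criteria=EXAMPLE_CRITERIA):
    """abundances: dict of m/z -> % relative abundance from the
    tune; every criterion ion must fall inside its window."""
    for mz, (lo, hi) in criteria.items():
        if not (lo <= abundances.get(mz, 0.0) <= hi):
            return False
    return True
```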
Calibration of the GC/MS System - The initial and continuing calibrations are regenerated by
the auditor and are compared to the laboratory-submitted hard copies and checked for the
following:
Ensure that internal standards, surrogates, and calibrations are at the contract-
required concentrations.
Ensure that the primary quantitation ion was used.
Verify that the reported percent relative standard deviation (% RSD) and
percent difference (%D) were correctly calculated and reported.
Compare the quantitation ion areas generated by the laboratory with those
generated by the auditor.
Quantitation reports and laboratory-generated library searches are checked to
ensure that compounds are identified at the correct retention time.
Reconstructed ion chromatograms (RICs) are examined to ensure no problems
were found in the data for the following:
- Evidence of detector saturation.
- Occurrence of peak-splitting severe enough to affect automated peak area
integration.
- Severe instances of peak tailing indicating chromatographic system problems.
- The presence of unknown contaminants or artifacts.
- The signal-to-noise ratio is checked to assess baseline zeroing.
A-10
-------
Sample Analyses - The samples are checked to ensure that the values reported are accurate.
This is performed to verify that all systems (computer and manual) are correctly translating
the results from the raw data to the report forms. The sample results are verified for the
following:
Quantitation reports are auditor-generated and compared with laboratory-
submitted hard copies.
Internal standard quantitation ion peak areas are checked to determine if they
fall within a factor of 2 from the internal standard areas in the daily calibration
standard.
A check is made to see if the retention time of the internal standard shifts by
less than 30 seconds when compared to the daily calibration standard.
Data files are quantitated and the auditor-generated quantitation reports are
compared with the laboratory-supplied hard copies.
A check is made to ensure correct identification of compounds.
Method Blank Analysis - The method blanks are examined to determine the type and extent
of any potential sample contamination.
Quality Control on Data Review Report Generation - To ensure completeness, consistency,
and technical accuracy, the data review is subjected to senior review, editorial review, and a
supervisory review at Lockheed Engineering & Sciences Company (LESC), contractor to
EMSL-LV.
EPA Review and Distribution - The review report is then sent to the EPA for final review
and distribution.
Use Of Data Reviews
Data reviews are used by the CLP as an independent check of the technical quality of the
data packages and the technical performance of the laboratories in the program. The data
reviews are provided by EMSL-LV to the CLP and subsequently to the regional technical
project officers (TPOs) for resolution of any problems noted in the reports. The TPO for the
region where a laboratory is located is that laboratory's primary contact for resolution of
problems with the contract. The reports are also used by EMSL-LV laboratory evaluators
when preparing for and conducting on-site laboratory visits. For examples of data reviews
from EMSL-LV, refer to the attachments to the final on-site reports in Section D of this
manual.
A-11
-------
QUARTERLY BLIND SCORES
Quarterly blind performance evaluation studies are conducted by using sample sets of known
matrices and compositions to evaluate laboratory analytical performance for identification,
quantitation, and contamination of target and nontarget organic compounds and target
inorganic elements. QB sample sets are shipped to CLP and some non-CLP laboratories once
each quarter. The sample sets can vary in sample design, composition, and concentrations.
Organic QB sample sets usually consist of a water matrix spiked with various TCL
compounds at varying concentrations. In the past, these sample sets have varied, consisting
of water and soil matrices, and full-volume or ampulated sample sets.
Inorganic QB sample sets usually consist of water and soil matrices spiked with target
elements of varying concentrations. Recently, inorganic sample sets have consisted of two
water samples and one soil sample.
Sample composition and sample concentrations vary from quarter to quarter. Organic TCL
compounds and inorganic target elements are to be spiked at least once per fiscal year (FY).
The spike concentration may vary from near the Contract Required Quantitation Limit
(CRQL) for organic compounds or the Contract Required Detection Limit (CRDL) for
inorganic elements to several times above the CRQL or the CRDL.
In addition to their role as a laboratory performance evaluation tool, organic
QB studies have been used to gather analytical information (retention time, reactions to TCL
compounds, recovery using CLP analytical methods, etc.) on specific compounds. For
example, two semivolatile compounds, benzophenone and carbazole, have been spiked for this
purpose during the past two fiscal years. Carbazole has been added to the TCL.
Sample Preparation and Shipment
Organic QB sample sets are prepared and shipped by ICF-Kaiser Engineering (ICF),
contractor to the U.S. EPA in Washington, D.C., but located in Las Vegas, Nevada.
Inorganic QB sample sets are prepared and shipped by Lockheed Engineering & Sciences
Company (LESC), contractor to the U.S. EPA, EMSL-LV.
The sample recipe is determined by the PE program manager with advice from ICF and
LESC chemists. The spiked TCL compounds and target elements and their respective
concentrations are determined by asking these and other questions:
Has the TCL compound or target element been spiked at least once during this
fiscal year?
Should method performance at or near the CRQL be checked?
Should analysis of isomeric TCL compounds (i.e., dichloropropene isomers,
A- 12
-------
dichlorobenzene isomers, benzofluoranthene isomers) be checked?
Should analysis of late eluting TCL compounds be checked?
Should laboratories be checked for following the instructions enclosed with
sample sets (for ampulated sample sets)?
For non-TCL compounds, should information on a particular compound be
checked?
Once the recipe is determined and the samples are prepared, they are scheduled for shipment
to the laboratories.
Two shipment methods can be used for the sample set: double blind or single blind
shipment. Double-blind shipment requires that a second party (usually a region) actually
ships the sample sets to the laboratory. The laboratory does not know which samples are
QBs or when the samples are to arrive. The sample sets are prepared (either full volume or
ampulated) and are shipped to the regions by ICF from Las Vegas, NV. The Region then
"disguises" the sample sets by creating traffic reports, chain-of-custody reports, and site
information for those samples before shipping the samples to the laboratory. The laboratories
then analyze the samples as "real-world samples" and send copies of the data packages to
SMO, EMSL-LV, and the Region for their review. The advantage of double-blind samples is
that they produce a better measure of the laboratory's day-to-day performance because the
identity of the samples is unknown to the laboratory.
A single-blind shipment means that the laboratory knows the samples are QBs and knows
when the samples will arrive. A separate set of instructions is shipped with the samples. The
sample sets are prepared (either full-volume or ampulated) and are shipped to the laboratories
by ICF. The laboratories then analyze the samples and send copies of the data packages to
SMO and EMSL-LV. The advantage of the single-blind study is that it minimizes cost. It
can be assumed that single-blind results measure a laboratory's best performance, since the
laboratory knows the identity of the samples.
Calculations
The final score for both the inorganic and organic QBs is reported in a range from 1 to
100, with 100 showing the best performance. The score is determined from an algorithm that
emphasizes identification and quantitation. The confidence intervals used to determine
identification and quantitation are calculated by using a biweight outlier test. This scoring
method creates windows based on the population of the reported results rather than basing the
windows around a true value.
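The biweight approach can be illustrated with Tukey's biweight location estimate, a robust center of the reported results that down-weights outliers. The constants and iteration scheme below are conventional textbook choices, not the program's actual implementation:

```python
import statistics

def biweight_location(values, c=6.0, tol=1e-6, max_iter=50):
    """Tukey's biweight estimate of the center of a set of reported
    results; outliers get weight zero instead of pulling the mean.
    Constants are conventional, not the QB program's own."""
    t = statistics.median(values)
    for _ in range(max_iter):
        mad = statistics.median([abs(v - t) for v in values]) or 1e-12
        u = [(v - t) / (c * mad) for v in values]
        w = [(1 - ui**2) ** 2 if abs(ui) < 1 else 0.0 for ui in u]
        new_t = sum(wi * v for wi, v in zip(w, values)) / sum(w)
        if abs(new_t - t) < tol:
            return new_t
        t = new_t
    return t
```

With a gross outlier present (for example, results near 10 plus one value of 100), the biweight center stays near 10 while the ordinary mean would be pulled to roughly 25, which is why windows built this way track the population of reported results rather than a single true value.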
Software (Programs) and Reports
The software used to generate the QB reports was developed by LESC for EMSL-
A-13
-------
LV. The software is called Contract Laboratory Automated Scoring System (CLASS) and it
provides semi-automated trends analysis, method performance statistics, and semi-automated
reports. Examples of the reports are listed below.
QB Individual Laboratory Summary report
QB Program Summary report
QB Summary of Scores
Trend Individual Laboratory Summary report
Trend Program Summary report
These reports are reviewed by EMSL-LV and are submitted to the program office and
subsequently to the regional TPOs. If poor performance is noted based on the QB score, a
remedial sample will be sent to the laboratory where corrective action is required. EPA may
decide to put the laboratory on sample hold until corrective action is completed.
An auditor preparing for an on-site visit should review the QB Individual Laboratory Summary
report and the QB Summary of Scores. Examples of these reports are on pages A-15 through
A-18.
Uses of QB Scores
The organic QB studies are used to evaluate laboratory performance for (1) TCL and non-
TCL compound identification, (2) TCL compound quantification, and (3) TCL and non-TCL
contamination. Organic QBs are also used to determine if any of these are problem areas for
the laboratory.
The inorganic QB studies are used to evaluate laboratory performance in the analysis of the
target metals in soil and water samples.
A-14
-------
ALL LABORATORIES
DECODED SUMMARY OF LABORATORY SCORES
QB 3 FY 89
06/30/89

                            #TCL     #TCL       #TCL     #TCL    #Non-TCL  #Non-TCL  #Non-TCL
Laboratory  Code  % Score   Not ID   Misquant   Contam   Cpds    Not ID    Contam    Cpds
EFGH        K3    96.8      0        0          1        47      0         0         4
ABCL        O5    80.8      0        3          3        47      0         0         4
EFGW        P2    53.1      1        11         1        47      0         1         4
JEKL        K1    16.8      2        19         1        47      1         2         4
-------
ORGANIC PERFORMANCE EVALUATION SAMPLE
INDIVIDUAL LABORATORY SUMMARY REPORT
FOR QB 3 FY 89

LABORATORY: ABC Laboratory (ABCL)
% SCORE: 60.5
PERFORMANCE: UNACCEPTABLE - Response Explaining Deficiency(ies) Required
RANK: Above = 60  Same = 0  Below = 9
REPORT DATE: 06/30/89
MATRIX: WATER

                            CONFIDENCE INTERVALS            LAB DATA    NUMBER OF LABORATORIES
                            WARNING        ACTION                                                TOTAL
COMPOUND                    LOWER  UPPER   LOWER  UPPER     CONC   Q    MIS-QNT  NOT ID  ID CPD  #LABS
TCL VOLATILE
Chloroethane                12     22      10     24        9      $    9        1       69      70
Carbon disulfide            26     46      23     49        14     X    15       0       70      70
Chloroform                  26     37      25     39        21     X    12       0       70      70
Vinyl acetate               NU     NU      NU     NU        10     U    0        60      10      70
1,2-dichloropropane         26     34      25     34        21     X    16       0       70      70
Trans-1,3-dichloropropene   10     20      8      22        14          8        0       70      70
Bromoform                   13     21      12     22        17          15       0       70      70
1,1,2,2-tetrachloroethane   8      12      7      13        10          9        1       69      70
Toluene                     8      12      8      12        6      X    11       0       70      70
Styrene                     12     29      10     32        7      X    12       1       69      70
Xylenes (Total)             24     37      22     39        18     X    10       4       66      70
# OF TCL COMPOUNDS NOT-IDENTIFIED: 1
# OF TCL COMPOUNDS MIS-QUANTIFIED: 9
# OF TCL CONTAMINANTS: 0
# OF NON-TCL COMPOUNDS NOT-IDENTIFIED: 0
# OF NON-TCL CONTAMINANTS: 2
NOTE: The full QB summary report contains data on TCL volatiles, semivolatiles, pesticides, and contaminants, and Non-TCL compounds.
-------
INORGANIC PERFORMANCE EVALUATION SAMPLE
INDIVIDUAL LABORATORY SUMMARY REPORT
FOR QB 2 FY 89 - PAGE 1

LABORATORY NAME: ABC LABORATORY (ABCL)
PERFORMANCE LEVEL: UNACCEPTABLE, Corrective Actions Mandatory
LABORATORY RANK: Above = 38  Same = 0  Below = 3
% SCORE: 48.7
REPORT DATE: 6/23/89
MATRIX: SOIL

            Tolerance Intervals                  Laboratory       Program Data
            Warning          Action              Reported         #Labs   #Labs     #Labs      #Labs     #Labs    Total
Element     Lower   Upper    Lower   Upper       Value      Q     Not ID  Misquant  False Pos  Mspk Out  Dup Out  #Labs
Aluminum    3800    8960     3250    9510        7650             1       2         0          0         1        42
Antimony    12.0    29.6     12.0    32.8        9.7        U     13      5         0          34        1        42
Arsenic     297     538      271     564         278        S     0       0         0          0         1        42
Barium      400     1200     40.0    1330        321              1       1         0          24        8        42
Beryllium   d       d        d       d           3.9        #     0       0         4          0         0        42
Cadmium     35.1    46.2     33.9    47.4        40.9             1       11        0          1         0        42
Calcium     12400   15600    12000   15900       13900            1       1         0          0         0        42
Chromium    25.0    39.5     23.4    41.1        93.4       X     1       5         0          1         0        42
Cobalt      10.0    16.3     10.0    18.0        5.3        B     4       3         0          0         0        42
Copper      922     1030     910     1040        997              1       8         0          0         0        42
Iron        116000  144000   113000  147000      108000     EX    1       4         0          0         0        42
Lead        5530    7320     5340    7510        6350             0       1         0          1         0        42
-------
INORGANIC PERFORMANCE EVALUATION SAMPLE
INDIVIDUAL LABORATORY SUMMARY REPORT
FOR QB 2 FY 89 - PAGE 2

LABORATORY NAME: ABC LABORATORY (ABCL)
PERFORMANCE LEVEL: UNACCEPTABLE, Corrective Actions Mandatory
LABORATORY RANK: Above = 38  Same = 0  Below = 3
% SCORE: 48.7
REPORT DATE: 6/23/89
MATRIX: SOIL

            Tolerance Intervals                  Laboratory       Program Data
            Warning          Action              Reported         #Labs   #Labs     #Labs      #Labs     #Labs    Total
Element     Lower   Upper    Lower   Upper       Value      Q     Not ID  Misquant  False Pos  Mspk Out  Dup Out  #Labs
Magnesium   6240    7310     6120    7430        7400       $     1       2         0          0         0        42
Manganese   8080    10600    7820    10800       10600            1       1         0          0         0        42
Mercury     0.59    1.1      0.53    1.2         0.81             1       4         0          6         3        42
Nickel      8.0     9.9      8.0     10.4        6          B     5       6         0          0         0        42
Potassium   1500    2960     1350    3120        2940             1       3         0          0         1        42
Sodium      d       d        d       d           650              0       0         1          0         0        42
Thallium    d       d        d       d           1.4              0       0         2          8         0        42
Vanadium    35.6    69.9     31.9    73.3        59.4             1       2         0          0         0        42
Zinc        5180    7370     4940    7610        5960       e     1       3         0          0         0        42
# OF ELEMENTS NOT IDENTIFIED: 1
# OF ELEMENTS MISQUANTIFIED: 2
# OF FALSE POSITIVES: 1
# OF MATRIX SPIKES OUT: 5
SOIL: Sb, Ba, Cr, Se, Ag
# OF DUPLICATES OUT: 0
SOIL:
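The pass/fail logic behind a summary report like this can be sketched in a few lines. This is only a model of the scoring idea described by the report (warning and action tolerance intervals), not the actual CLP scoring software; the example values are taken from the aluminum and chromium rows above.

```python
# Sketch of the tolerance-interval check behind the PE summary report.
# This models the scoring idea only; the real CLP software is not public.

def classify_result(reported, warn_low, warn_high, act_low, act_high,
                    identified=True):
    """Classify one element's result against its tolerance intervals."""
    if not identified:
        return "not identified"
    if not (act_low <= reported <= act_high):
        return "misquantified"            # outside the action interval
    if not (warn_low <= reported <= warn_high):
        return "warning"                  # inside action, outside warning
    return "in control"

# Aluminum: warning 3800-8960, action 3250-9510, reported 7650
print(classify_result(7650, 3800, 8960, 3250, 9510))   # in control
# Chromium: warning 25.0-39.5, action 23.4-41.1, reported 93.4
print(classify_result(93.4, 25.0, 39.5, 23.4, 41.1))   # misquantified
```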
-------
REGIONAL PERFORMANCE EVALUATION STANDARDS RESULTS
Regional performance evaluation samples are sample sets sent to laboratories by the regions
to better evaluate laboratory accuracy and precision. PE samples are used as supplementary
information to quarterly blinds and are sent out as often as necessary. The PE results are
examined critically for sources of bias, particularly calibration problems.
An Office of Solid Waste and Emergency Response (OSWER) training course on PE
materials is available. This course gives all PE procedures in detail. The course,
"Instructional Course on the Use of Performance Evaluation Materials," is presented by
EMSL-LV.
Results from regional performance evaluation standards are tracked for each laboratory.
Individual scores may or may not be awarded; however, the information is used to evaluate
an individual data package. The on-site auditor studies these evaluations when preparing for
a laboratory visit and compares them against other available information on the laboratory's
performance.
SMO CCS AND TIMELINESS REPORTS
The Sample Management Office compiles Contract Compliance Screening reports from data
submitted by the laboratories (both data package and diskette). (Note: The CLP requires that
all report forms be submitted on floppy disk.) The CCS is a rigorously objective assessment
of laboratory data. The CCS is used to perform rapid assessment of deliverables in terms of
completeness and technical compliance with contract requirements. The CCS reports help
improve data quality by quickly and clearly identifying problems with laboratory non-
compliance. The primary CCS functions are (1) to detect data defects and facilitate their
correction or resolution, and (2) to facilitate determination of payment to laboratories. CCS
procedures are used to screen 100% of submitted case data.
Three CCS procedures are commonly used, one for each type of routine analytical services
(RAS) contract: organic, inorganic, and dioxin. The automated CCS reports received by the
laboratory from SMO detail the problems with the data package that require resolution before
the data package is deemed compliant. Copies of screening results are sent to the laboratory,
to the region that sent the samples, and to EMSL-LV.
Laboratories are given 10 days to reply to notification of defects detected during the CCS.
The laboratories respond with the additional information or modified report forms requested
by the CCS report. Any response to the CCS is then reprocessed through the full CCS
assessment system.
A-19
-------
Data Assessed
The CCS is an examination of the information in the hard copy data package and diskette
deliverable as required by the SOW. The data is examined for appropriate completion of
paperwork (traffic reports), and for submission of appropriate spectra or chromatograms to
support the "hits" reported by the laboratory. Other uses are to check the header information
(i.e., date, time, instrument ID) on spectra, chromatograms, quantitation reports, etc. The
CCS also notes information in the case narrative.
The Assessment Process
The majority of the CCS assessment process is an automated, rigorously objective
(mainframe) analysis of the data. The CCS program assesses the presence or absence of
required data and the adherence of reported data to SOW requirements. Where possible, the
program checks the internal consistency of the reported data.
The CCS produces a report that summarizes the compliance of data deliverables, at the
analytical fraction level, to SOW specifications. The CCS report identifies defects in two
ways. The fraction is flagged at a summary criterion level and the exact nature of the defect
is reported along with a reference to a section of the SOW. Results from the automated CCS
assessment are generally available on the EPA mainframe within 24 hours of receipt from
SMO; therefore, assessments are available to an auditor for on-site visit preparation.
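A toy version of such a screen, checking the presence of required deliverables and emitting defect records with a contract reference, might look like the following. The deliverable names and SOW section numbers here are invented for illustration; the real CCS mainframe logic is far more extensive.

```python
# Minimal contract-compliance screen: flag each missing deliverable with a
# (hypothetical) SOW reference. Item names and references are placeholders.
REQUIRED = {
    "cover_page":  "SOW Ex. B, 1.1",
    "data_sheets": "SOW Ex. B, 2.3",
    "calibration": "SOW Ex. B, 3.2",
    "blanks":      "SOW Ex. B, 3.5",
}

def screen_fraction(deliverables):
    """Return a defect record for every required item that is absent."""
    return [{"defect": f"missing {item}", "ref": ref}
            for item, ref in REQUIRED.items() if item not in deliverables]

package = {"cover_page": "...", "data_sheets": "...", "calibration": "..."}
for d in screen_fraction(package):
    print(d["defect"], "-", d["ref"])   # missing blanks - SOW Ex. B, 3.5
```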
Laboratory data is assessed to determine the analytical sequence used by the laboratory.
During this timelining process, all quality control forms and sample results are checked.
Uses of CCS Reports in the On-Site Evaluation Process
Laboratory performance can be monitored through several reports related to CCS: overall
summaries, case summaries, and laboratory responses. At EMSL-LV, summaries of overall
laboratory contract compliance and timeliness performance based on CCS are received as
electronic reports from SMO. These reports are used with other information to determine a
profile of laboratory performance and as additional information during an on-site visit. An
example of such a report is shown on page A-21. The reports show the laboratory's
performance in data package assembly in terms of completeness, technical compliance, and
overall compliance. The units are in percent completeness or compliance. The higher the
percentage, the better the laboratory's performance.
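Each figure in such a report is simply the share of screened items that were found compliant; a minimal sketch (the counts are invented for illustration):

```python
# Percent completeness/compliance for one criterion:
# items found compliant divided by items screened.
def percent_compliance(passed, screened):
    if screened == 0:
        return 100.0        # nothing screened, nothing found non-compliant
    return round(100.0 * passed / screened, 1)

print(percent_compliance(73, 100))   # 73.0
print(percent_compliance(40, 55))    # 72.7
```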
Reports similar to the overall reports can be obtained electronically from SMO and are used
by EMSL-LV when evaluating a laboratory's performance on an individual case.
EMSL-LV often receives the laboratory responses to the CCS reports as well. By reviewing
A-20
-------
SAMPLE MANAGEMENT OFFICE
CONTRACT COMPLIANCE SCREENING
FROM: ORIGINAL DATA
FROM 02/01/91 - 07/31/91
Lab = WBGD   Type = Percent Completeness

CRITERIA               FEB   MAR   APR   MAY   JUN   JUL   WBGD AVG   CLP AVG   ABOVE   SAME   BELOW
Cover page             100   100   100   100   100   100   100        100       0       17     3
Data sheets            98    100   95    92    100   100   97         95        12      1      7
Calibration            60    62    63    70    100   100   73         88        12      1      7
CRA and CRI Stds       100   100   100   93    100   100   98         98        13      2      5
Blanks                 73    87    91    60    100   100   80         85        10      2      8
ICS                    100   100   100   90    100   100   97         96        12      1      7
Matrix Spike           98    99    98    99    100   100   99         99        7       7      6
Post digestion spike   100   99    99    99    100   100   99         100       14      3      3
Duplicate              100   100   %     99    99    100   99         99        7       4      9
LCS                    100   100   100   78    100   100   93         94        12      1      7
MSA                    96    99    100   99    100   100   99         98        4       6      10
Serial Dilution        100   100   96    90    100   100   96         96        9       3      8
Cyanide Holding Time   100   100   100   100   100   100   100        100       0       20     0
Merc. Holding Time     100   100   100   100   100   100   100        100       0       20     0
IDL                    100   100   100   90    100   100   97         94        11      1      8
IEC                    100   100   100   100   100   100   100        100       0       17     3
Linear Range           100   100   100   100   100   100   100        97        0       12     8
Raw Data               60    73    95    58    79    100   73         79        8       1      11
Traffic Reports        100   100   100   100   100   100   100        100       0       20     0
Samples                55    157   109   233   117   35    118        93        4       1      15
A-21
-------
the responses, the auditor can get some indication of the problems the laboratory is having
and of its ability to correct them. By knowing what corrective action the laboratory
has already attempted, the auditor is better prepared for the on-site visit.
A-22
-------
REGIONAL REVIEWS AND DATA VALIDATION REPORTS
The EPA regions or their contractors are responsible for evaluating each data package
compiled at the laboratory to determine the overall data quality for hazardous waste site
assessment. This evaluation of organic and inorganic laboratory data determines the actual
usefulness of the data for its intended purpose (e.g., remediation or assessment).
Much of the analytical data collected is used to determine liability for anticipated site
cleanups. Additional concerns, such as economic feasibility and health risks, provide EPA
with the incentive to furnish quality analytical data to site personnel. This process of
ensuring data quality is called data validation.
The laboratory provides a data package containing analytical results with all the underlying
and supporting QA/QC documentation to EMSL-LV, SMO, and the region. Contractual
deviations are checked by EMSL-LV to ascertain contract compliance. Data validation
involves a level of QA different from that contained in the SOW. The validation report is
used by On-Site Coordinators (OSCs) in planning and decision making at hazardous waste
sites.
Regional Data Validation vs. EMSL-LV Data Review
Data validation is generally designed to avoid potential decision making errors at the
hazardous waste site. For example, if surrogate recoveries are low, this could suggest to the
data reviewer that the reported analytical concentrations are biased low. A concentration of
85 ug/L with an associated surrogate recovery of 75% may introduce uncertainty into
decision making if the action level is 120 ug/L. Data quality is of utmost importance to
OSCs in deciding, for example, whether to send backhoe operators and site personnel onto the
site with or without respirators.
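The reviewer's concern can be made concrete. If results are treated as biased low in proportion to surrogate recovery (a simplification used here only for illustration; CLP results are not recovery-corrected), the example above sits uncomfortably close to the action level:

```python
# Illustrative only: shows why a low surrogate recovery creates doubt when a
# result is near an action level. CLP data are not recovery-corrected.
def possibly_exceeds(reported_ug_l, surrogate_recovery, action_ug_l):
    """True if the recovery-adjusted estimate reaches the action level."""
    adjusted = reported_ug_l / surrogate_recovery   # crude upper-bound estimate
    return adjusted >= action_ug_l

# 85 ug/L reported, 75% surrogate recovery, 120 ug/L action level:
print(possibly_exceeds(85, 0.75, 120))   # False (adjusted 113.3 ug/L -- close)
print(possibly_exceeds(95, 0.75, 120))   # True  (adjusted 126.7 ug/L)
```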
Data reviews are designed to directly address analytical problems in the laboratory. If
surrogate recoveries are low, the issue of concern is in the sample preparation area of the
laboratory where the analysis occurred. Data reviews report whether the analytical procedures
are being followed correctly, whether the laboratory personnel are properly trained to follow
the SOW, and whether the data is sufficiently reviewed prior to leaving the laboratory.
Consistently poor performance on EMSL-LV data reviews is a direct reflection of laboratory
performance. Data quality (high or low) is viewed as a laboratory responsibility. Exceptions
to contractual requirements are only accepted when problems are deemed matrix specific and
out of the control of the laboratory.
A-23
-------
Supplementing On-Site Evaluations with Data Validation Reports
In regard to on-site evaluations, the usefulness of the data validation reports varies. An
EMSL-LV data review may yield contract violations in the method blank area that go
unrecognized in the regional data validation. By using both the EMSL-LV data audit and the
regional data validation to supplement on-site evaluations, a more complete review of
laboratory performance is possible. In the laboratory monitoring process, regional reviews,
data reviews, and tape audits complement each other as useful tools.
The Regional Review Summary
Because of the tremendous volume of regional reviews (data validations) received at EMSL-
LV, a mechanism for summarizing the information for each laboratory was needed. A
spreadsheet was developed to summarize the performance information over time. The figure
on page A-25 is an example of an inorganic form that is used by EMSL-LV to summarize
that information. The summary presents the information in a format that allows observation
of those analytes that show evidence of consistently poor performance. Entries under each of
the QC headers consist of the analyte(s) and the corresponding QC result for analytes that
failed the data quality objectives in the validation. Analytes that are out of control in case
after case are singled out as problem areas to be concentrated upon in a future on-site
evaluation.
The inorganic QC items that are evaluated in the regional reviews are listed on page A-25,
along with some of the possible laboratory problem areas that may require concentration.
The organic QA areas examined are similar to those of the inorganic areas:
Holding Time - If samples containing certain analytes are not analyzed until several days
after the contract-specified holding times (e.g., mercury and cyanide), an evaluator might
assume that a laboratory is having difficulty because of personnel shortages, poorly
documented procedures, or instrumentation problems. For example, one laboratory was
consistently late with cyanide analyses because the laboratory had space for only six, rather
than the required 12 distillation apparatuses.
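A holding-time exceedance check of this kind is straightforward to automate. The limits below are assumptions for the example; the governing SOW, not this table, defines the actual values.

```python
from datetime import date

# Holding times in days from sample receipt; illustrative values only --
# consult the contract SOW for the real limits.
HOLDING_DAYS = {"mercury": 26, "cyanide": 12}

def days_exceeded(analyte, received, analyzed):
    """Days past the contract holding time (0 if analyzed in time)."""
    elapsed = (analyzed - received).days
    return max(0, elapsed - HOLDING_DAYS[analyte])

print(days_exceeded("mercury", date(1991, 8, 1), date(1991, 9, 4)))   # 8
print(days_exceeded("cyanide", date(1991, 8, 1), date(1991, 8, 10)))  # 0
```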
Initial and Continuing Calibration Verification - Laboratories rarely demonstrate problems
with calibration verifications. This is because out-of-control results are crossed out and
reanalyzed, or a recalibration is performed. If problems are seen in this area, poor calibration
standards or calibration standard preparation techniques may be involved. The worst possible
problem might be severe instrument drift over a short period. For cyanide, the distillation
process could be an inherent cause of difficulty.
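The verification itself is a simple percent-recovery test against contract windows. In this sketch the 90-110% window is an assumption for illustration, not a quoted contract limit:

```python
# ICV/CCV check: percent recovery of a known standard against a control
# window. The 90-110% window is illustrative, not quoted from the SOW.
def verification_ok(found, true_value, low=90.0, high=110.0):
    """Return (in_control, percent_recovery) for one verification standard."""
    recovery = round(100.0 * found / true_value, 1)
    return low <= recovery <= high, recovery

print(verification_ok(10.4, 10.0))   # (True, 104.0)
print(verification_ok(8.0, 10.0))    # (False, 80.0) -> recalibrate and rerun
```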
Preparation Blanks - Contamination is the chief source of problems observed in blanks.
Reagents, glassware cleaning processes, and instrument uptake systems are suggested as
potential problem areas deserving some attention during an on-site evaluation. At one
A-24
-------
REGIONAL REVIEW SUMMARY
Laboratory: WBGD
[Example summary spreadsheet. Columns: Case Number, Date of Review, Holding Time, ICV, CCV, PB, ICS, LCS, DUP, SPK, Serial Dil. Entries for cases 12345 (08/21/91) and 45678 (09/04/91) include: Holding Time - Hg exceeded by 6-8 days; All met. ICV/CCV - Be 80%, Pb 82%, Cr 88%. PB - Fe 169 ug/l, Mn 3.7 ug/l. ICS - Pb 123%. DUP - Pb 50 RPD, As 200 RPD, Cu 56 RPD. SPK - Cd 74 %R, Se 28 %R, As 138 %R, Cu 43 %R. Serial Dil - Fe 13 %D, Mg 17 %D, Mn 12 %D, Ca 12 %D.]
A-25
-------
laboratory, high copper results in the preparation blanks suggested that perhaps part of the
water system was copper-alloy pipe. It was discovered that one small section of the
plumbing had been replaced and that it was not the glass pipe material used in the remainder
of the laboratory. The section of plumbing in question was copper.
Interference Check Sample - Results for the ICS that are abnormally high or low may
indicate that interelement correction factors (IECs) are not adequate. The laboratory needs to
ensure that out-of-control results for low concentration analytes are not caused by elements
found at high concentrations in the samples.
Laboratory Control Sample - LCS results are not usually found to be out of control.
Laboratories reanalyze an LCS if it is outside the contractual control limits and, if it fails, the
group of accepted samples is normally redigested and reanalyzed. If an LCS result is found
to be outside the validation data quality objectives, the cause may be analyte loss (physical or
chemical), or contamination.
Duplicates - Duplicate precision is the parameter most often affected by the sample matrix.
Consistently poor precision from case to case may suggest inadequate sample preparation
techniques or inadequate instrument operation.
Matrix Spikes - Consistently poor recovery for antimony and silver is to be expected. If poor
recoveries are seen for other analytes over a period of time, degraded spiking standards or
incorrectly prepared spiking standards may be the cause.
Serial Dilutions - Serial dilution results that are out of control can effectively tie specific
analytes, on a per-case basis, to the other matrix-dependent QC measures. If duplicate results
are poor for an analyte, the serial dilution results may be poor as well. If serial dilution
results are out of control for a specific analyte on a consistent basis, poor dilution technique
by the analyst may be suspected. Automatic pipet calibration problems may also be a cause
for systematic error.
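The duplicate and serial-dilution statistics referred to above (RPD and %D) are simple calculations; a sketch, with control limits omitted because they are matrix- and contract-dependent:

```python
# Relative percent difference (duplicates) and percent difference for a
# serial dilution (diluted result scaled back up by the dilution factor).
def rpd(sample, duplicate):
    mean = (sample + duplicate) / 2.0
    return 0.0 if mean == 0 else round(100.0 * abs(sample - duplicate) / mean, 1)

def serial_dilution_pd(original, diluted, factor=5.0):
    corrected = diluted * factor
    return round(100.0 * abs(original - corrected) / original, 1)

print(rpd(100.0, 60.0))                 # 50.0 RPD
print(serial_dilution_pd(100.0, 17.4))  # 13.0 %D
```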
A-26
-------
On-Site Laboratory
Evaluations
presented by
Environmental Monitoring Systems Laboratory
Las Vegas, Nevada
QAB 1304 SLM
-------
Course of Events for Sample
Analysis in the CLP
EPA Headquarters
SMO
Regions
EMSL-LV
Laboratories
-------
Course of Events for Sample Analysis in the CLP
Sequence of Events
1. Region requests Laboratory Services through Sample
Management Office (SMO).
2. SMO identifies CLP Laboratory for Region.
3. Region sends Samples to Laboratory.
4. Laboratory completes Analysis and sends a Full Data Package
to each of three Reviewers: the Region, SMO, and EMSL-LV.
5. Each Reviewer Performs an Evaluation of Data Packages.
-------
Reviewers Evaluate Data to Produce Report
A CLP laboratory completes a full data package for each
case for each of three data review organizations.
Organization               Data Use                      Report Produced by Organization
EPA Region                 Site Investigations           Regional Review
                           and Studies
Sample Management Office   Contract Compliance and       Contract Compliance
                           Payment to the Laboratory     Screen (CCS)
EMSL-LV                    Independent Technical         Data Review
                           QA Review
-------
Overview of On-Site
Laboratory Evaluation Process
Identification of Laboratories
Requiring an On-Site Visit
On-Site Laboratory Visit
Measurement of
Laboratory Performance
Preparation for Visit
Debrief Laboratory and Report
Results of Evaluation
-------
Course Structure
Section A
Section B
Section C
Section D
Review of tools available to measure laboratory
performance. Identification of laboratories requiring
on-site visits. Instruction on preparation for an
on-site visit.
Discussion of organic and inorganic laboratory
walk-throughs
Laboratory walk-through demonstrations
On-site report writing, Comments
-------
Section A Outline
Auditor's tools
Audits
Quarterly Blind Scores
Regional Performance
Evaluations
SMO, CCS Timeliness
Reports
Regional Reviews
On-Site Reports
Laboratory
Responses
On-Site Evaluations
SOPs and Resumes
Identify laboratories with performance problems
Prepare for on-site visit
-------
Environmental Laboratories
EMSL-LV
Regions
Organic & Inorganic Data Reviews (Audits)
What are they? Tools for assaying the technical quality of
the data.
Use: Information used to assess a laboratory's
performance.
-------
Use of Audits
Perform Technical Review of Data Package for
Compliance with Contract Methods.
Verify that Problems Identified in the Previous Audit
have been Corrected.
Verify that Corrections cited in an On-site Report
have been made by the Laboratory.
-------
Inorganic Audit Process
Technical Review
QC Review
Review of Quarterly Deliverables,
SOPs, and Resumes
-------
Inorganic Audit Process
Technical Review
Verify that Results Reported on Forms
are Present in the Raw Data.
Examine the Raw Data for Technical
Contract Compliance.
Verify that Support Documentation is
included in Data Package.
-------
Inorganic Audit Process
QC Review
Review for Calculation and Transcription Errors from the
Raw Data to the Report Forms.
Review Raw Data for Compliance with Required QC
Windows.
Review of Spikes, Duplicates, and Interference Checks
for Compliance with Recommended Windows.
-------
Inorganic Audit Process
Review of Quarterly Deliverables, SOPs and Resumes
Verify that Laboratories Have Met the Following Requirements
for Quarterly Deliverables:
- For reporting instrument performance checks
- For detection limit, linear range
- Interference correction
Update the Review of SOPs
Update the Review of Laboratory Personnel Qualifications
-------
GC/MS Raw Data Audit Procedure
A raw data tape is requested and received
The paper data package is thoroughly reviewed
The completeness of the raw data files is verified
All compounds are independently quantitated
Instrument settings and performance are examined
Compound quantitations and identifications are examined
Chromatography and contamination are examined
-------
Organic Data Audit Process
QC Forms Review
Raw Data Review
Reanalysis Requirements
-------
Organic Data Audit Process
QC Forms Review
System Monitoring Compounds
Matrix Spike and Duplicate
Instrument Performance Check Compounds
Calibration Standards
Internal Standards
-------
Organic Data Audit Process
Raw Data Review
Calculation of Sample Results
Transcription Errors
Chromatograms
Mass Spectra
Quantitation Reports
-------
Organic Data Audit Process
Reanalysis Requirements
System Monitoring Compounds
Method Blank Contamination
Internal Standards
-------
Organic Tape Audit Process
GC/MS Raw Data Evaluation
GC/MS Calibration
Sample Analysis
Method Blank Analysis
-------
Organic Tape Audit Process
GC/MS Raw Data Evaluation
BFB and DFTPP Tunes, Ion Abundances and
Ratios, and Background Subtraction Techniques
Mass Scan Range
Scan Times
Electron Multiplier Voltage
-------
Organic Tape Audit Process
GC/MS Calibration
Concentration of Surrogates, Calibration
Standards, and Internal Standards
Quantitation Ions
Mass Spectral Library Search
Reconstructed Ion Chromatograms
-------
Organic Tape Audit Process
Sample Analysis
Quantitation Reports
Internal Standards
Compound Identification
-------
Environmental Laboratories
EMSL-LV
Quarterly Blind Scores
What are they? Sample sets used to assess laboratory
analytical performance.
Uses: Evaluate laboratory performance for
identification, quantitation and contamination.
-------
W. B. Goode Laboratories
[Chart: laboratory scores plotted by quarter, 3/89 to 4/90.]
-------
Environmental Laboratories
Regions
Regional P.E. Standards Results
What are they? Help gauge a laboratory's routine
performance.
Uses: Detect false negatives, contamination or
spectral interferences.
-------
Environmental Laboratories
Headquarters SMO
Sample Management Office/Contract Compliance
Screening and Timeliness Reports
What are they? An assessment of administrative
performance.
Uses: Identifies reporting defects in a
laboratory's data.
-------
Environmental Laboratories
Regions
Regional Reviews
What are they? Reports evaluating overall data quality for the user of
the sample results.
Uses: Site Managers: Report is used to determine if data
meets site DQOs.
Auditor: Report is used as background
information in evaluating
laboratory performance.
-------
Environmental Laboratories
EMSL-LV
Regions
On-Site Reports
What are they? A report of observations and recommendations
resulting from an on-site laboratory visit by an
EMSL-LV regional representative.
Uses: Reports items requiring correction for the benefit of the
laboratory and the EPA. Verifies that past
recommendations have been met by the laboratory.
-------
Environmental Laboratories
EMSL-LV
Regions
Laboratory Responses
What are they? Responses of environmental laboratories to
EMSL-LV evaluations.
Uses: Indicate what actions the laboratory has taken or
will take to eliminate problem areas.
-------
Environmental Laboratories
EMSL-LV
Review of SOPs and Resumes
What are they? Evaluations of laboratory standard operating
procedures and resumes against contractual
requirements.
Uses: Provide information concerning a laboratory's
ability to meet requirements for personnel
qualifications and procedures documentation.
-------
Documentation:
Quality Assurance Manuals
Standard Operating Procedures
-------
Organic
Standard Operating Procedures
A Set of Project-Specific SOPs Must be Available
and must Adequately Address the Following 12 Areas:
Evidentiary SOPs
Glassware Cleaning
Sample Receipt and Storage
Calibration of Balances
Sample Preparation
-------
Organic
Standard Operating Procedures
Analytical Procedures (for each Analytical System)
Maintenance Activities (for each Analytical System)
Analytical Standards
Data Reduction Procedures
Documentation Policy and Procedures
Data Validation/Self-Inspection Procedures
Data Management and Handling
-------
Inorganic
Standard Operating Procedures
SOPs must be Available and
must Adequately Address the Following:
Receipt and Storage
Security
Standards Preparation
Sample Preparation
-------
Inorganic
Standard Operating Procedures
Glassware Cleaning
Analytical Methods
Data Package Preparation
Technical Review of Data
Instrument Maintenance
-------
Organic
Quality Assurance Manual
Seven Required Sections
must be Addressed in Sufficient Detail:
Organization and Policy
Facilities and Equipment
Document Control
Analytical Methodology
Data Generation
Quality Control
Quality Assurance
-------
Inorganic
Quality Assurance Manual
The Manual Must Be Satisfactory for the Following:
Personnel
Preventive Maintenance
Reliability of Data
Facilities and Equipment
Documentation of Procedures
Operation of Instruments
Feedback and Corrective Action
-------
Organization and
Personnel: Resumes
Education
Experience
Course
Work
-------
Identify Laboratories that have
Performance Problems
Performance scoring triggers an on-site visit
when a laboratory shows:
Poor QB scores
Poor audit scores
Poor CCS scores
Poor timeliness scores (chronic lateness)
Poor responses
Poor overall scores
-------
Laboratory Performance Reports
Regional Review Summary
Performance Evaluation Sample Summary
Sample Management Office Contract
Compliance Screen
-------
Effect of On-Site on Audit and QB Scores
[Chart: audit and QB scores (50 to 100) plotted against fiscal year quarters. Poor laboratory performance prompts an on-site visit; declining data quality turns to improving data quality, with a lag between the on-site visit and the data quality improvement.]
-------
On-Site Visit Preparation
Thorough review of SOPs and resumes
Review data audits, QBs, CCS and regional
reviews to determine what problems the
laboratory is having and where to concentrate
-------
Checklists to Use in Preparation
for an On-Site Evaluation
SOP/QAP Checklist
Resume Checklist
Data Review Checklist for GC/MS
Non-Conformance Memo for Problem
Data on GC/MS
-------
Laboratory Evaluation Checklist
Organization and personnel
- Identification of key
personnel
Documentation
- QA manual available and
satisfactory?
- Written SOPs available
and satisfactory?
Laboratory tour
- Sample receipt and storage
area
- Preparation area
- Analytical methods
- Data handling and review
-------
Quality Control Troubleshooting Guide
Common QC Errors                     Potential Problem Areas
Calculation Errors                   Data Entry; Data Review
Transcription Errors                 Data Entry; Data Review
Form-Generation Errors               Form-Generating Software
Unresponsive to Tape Audit Request   Storage; Retrieval of Electronic Data (GC/MS)
Method Blank Contamination           Source of Organic-Free Water; Reagent Purity;
                                     Glassware Cleaning; Holding Blanks; Ventilation
-------
Quality Control
Troubleshooting Guide
Common QC Errors                      Potential Problem Areas
Out-of-Control ICVs/CCVs              Instrument Drift
Out-of-Control Surrogate Recoveries   Sample Preparation; Data Review
Out-of-Control CRI Values             Instrument Instability at or near the IDL
Out-of-Control ICBs/CCBs              Instrument Instability at or near the IDL
Out-of-Control PBs                    Blank Contamination
-------
Section B Outline
Conducting an On-Site Laboratory Visit
I. Orientation Meeting
II. Laboratory Walk-through
III. Laboratory Debriefing
-------
I. Orientation Meeting
A. Organization and Personnel
B. Documentation
1. QA Manual
2. SOPs
-------
II. Laboratory Walk-through
A. Sample Receipt and Storage Area
B. Preparation Area*
C. Analytical Methods*
D. Data Handling and Review
*Organic and Inorganic Requirements Differ Significantly
-------
Sample Receipt and Storage Area
Are SOPs available and acceptable?
Are adequate storage facilities available and is the
cold storage properly monitored?
Are records kept in a manner consistent with GLP?
-------
Preparation Area
Are sufficient and adequate facilities and equipment
provided? Sample preparation?
Are written SOPs available and consistent with the
contract? Are the SOPs followed by laboratory
personnel?
Are calibration and reference standards properly
tracked in a log?
Are the records consistent with GLP?
-------
Standards Preparation Label
[Figure: standards preparation label with fields for Standards Name, Date of Preparation, and Analyst.]
-------
Sample Preparation
Individual graduated cylinders should be used to
transfer each sample to the digestion beakers.
-------
Analytical Methods
Is sufficient and adequate instrumentation provided to
adequately support the laboratory's commitments?
Are written SOPs available and consistent with the
contract? Are the SOPs followed by the laboratory
personnel?
Is the instrumentation maintained properly? Are service
records kept? Are the instrument run logs adequate and
up to date?
-------
Data Handling and Review
Are there at least two layers of review?
Are SOPs available and adequate?
Are sufficient computer resources
available for diskette deliverables?
-------
Items to Look for in the
Document Control Area
Is a Full Set of Current SOPs Located in the
Document Control Area?
Are Personnel Dedicated to 100%
Document Control Duties?
Is an Organized System Used for Tracking
and Filing Data and Reports?
-------
Items to Look for in the
Document Control Area
Are Methodical Procedures Used to
Assemble and Verify Complete Reports?
Are the Procedures Clear for Processing
Requests for Resubmittals?
Is the Accuracy of Software Packages Used
for Assembly of Reports Verified?
-------
QA Officer Responsibilities
Performs Systems Checks
Submits In-house Unknown PE Materials
to Laboratory
Updates QAP
Maintains Independence of Laboratory
Operations
-------
III. Laboratory Debriefing
A. Team caucus
- Discussion of on-site team findings
- Laboratory personnel are not present
B. Laboratory debriefing
- Problems that need resolution are listed
- Items from debriefing will be incorporated
into on-site report
-------
Section C
Laboratory Walk-through
Demonstration
-------
Section D
Writing the On-site Report
-------
Use of On-Site Evaluation Report
Monitor Laboratory Performance
Assure Corrective Action
Prepare for Future Visits
-------
Basic Data for Cover Sheet
Name, Address, Telephone Number of Laboratory
Type of Evaluation (On-Site, Organic, Inorganic)
Date of Visit
Title and Numbers of Contract and Solicitation
Names and Titles of Evaluation Team Members
-------
On-Site Report Outline
Summary of Recommendations
List of Personnel Present During the On-Site Visit
Documentation Review
Evaluation and Discussion of Laboratory Personnel
Qualifications
Evaluation and Discussion of Analytical Procedures
and Equipment
Evaluation and Discussion of Data Review and Report
Generation Procedures
-------
Summary of Findings
States Auditor's Findings (from Checklist)
Recommends Corrective Actions or Good
Laboratory Practice
Gives Number of Points Deducted for the
Problem
Refers to Applicable Statement of Work or
Program Requirements
-------
Tips on Preparation
Take Good Notes
Exercise Selectivity
Collect All Background Information
Before Writing
Write Direct, Clear, Concise Sentences
Be Tactful but Frank
Review and Edit Report
-------
Body of a Typical Report
Name and Affiliation of Author of Report
Date Report Prepared
Summary of Recommendations and
Observations (usually as numbered list)
Laboratory Evaluation Checklist
Attachments
- List of Personnel
- Data Reviews
- Equipment
- Other
------- |