&EPA
Guidance on the Documentation
and Evaluation of Trace Metals Data
Collected for Clean Water Act
Compliance Monitoring
Printed on Recycled Paper
Acknowledgements
This guidance was prepared under the direction of William A. Telliard of the Engineering and
Analysis Division (EAD) within the U.S. Environmental Protection Agency's Office of Water. The
guidance was prepared under EPA contract by the DynCorp Environmental Programs Division with
assistance from Interface, Inc.
Disclaimer
This document has been reviewed and approved for publication by the Analytical Methods Staff within
the Engineering and Analysis Division of the EPA Office of Water. Mention of trade names or
commercial products does not constitute endorsement or recommendation for use.
Further Information
For further information, contact
William A. Telliard, Chief
Analytical Methods Staff
Engineering and Analysis Division
U.S. Environmental Protection Agency
401 M Street
Washington, DC 20460
Phone: 202-260-7134
Fax: 202-260-7185
Requests for additional copies should be directed to:
US EPA NCEPI
11029 Kenwood Road
Cincinnati, OH 45242
513-489-8190
Chapter 1
Introduction
Numerous organizations, such as state pollution control agencies, health departments, local
government agencies, industrial dischargers, research facilities, and federal agencies (e.g., EPA)
collect data on effluent and ambient metal concentrations for use in a variety of applications, including
determining attainment status for water quality standards, discerning trends in ambient water and
effluent concentrations and variability, estimating background loads for total maximum daily loads
(TMDLs), assessing permit compliance, and conducting research. The quality of the data used is an
essential element in each of these applications. The quality of trace level metals data may be compromised due to
problems during sample collection, preparation, storage, and analysis. In fact, one of the greatest obstacles faced
by laboratories attempting trace metals determinations is the potential for contamination of samples during
the collection and analysis processes. Trace metals are ubiquitous in the environment, and samples can
readily become contaminated by numerous sources, including: metallic or metal-containing labware,
metal-containing reagents, or metallic sampling equipment; improperly cleaned and stored equipment; and
atmospheric inputs such as dirt, dust, or other particulates from exhaust or corroded structures.
The measurement of trace metals at EPA water quality criteria (WQC) levels has been spurred by
EPA's increased emphasis on a water quality-based approach to the control of toxic pollutants. Current ambient
WQC levels for trace metals require measurement capabilities at levels as much as 280 times lower than
those levels required to support technology-based controls or achievable by routine analyses at
environmental laboratories. Also, recent USGS and EPA studies strongly indicate that rigorous steps must
be taken in order to preclude contamination during the collection and analysis of samples.
To ensure that data collected for trace metals determinations at ambient water quality
criteria levels are valid and not a result of contamination, rigorous quality control (QC) must be applied
to all sample collection, preparation, and analysis activities. EPA has published analytical methods (1983,
1991) for monitoring metals in waters and wastewaters, but these methods are inadequate for the
determination of trace concentrations of metals in ambient waters due to the lack of some or all of the
essential quality control and handling criteria. This prompted the Engineering and Analysis Division
(EAD) to develop new sampling and analytical methods that include the rigorous sample handling and
quality control procedures necessary to deliver verifiable data at WQC levels. The new sampling method
is entitled Method 1669: Sampling Ambient Water for Determination of Trace Metals at EPA Water
Quality Criteria Levels. The new analytical methods include Methods 1631, 1632, 1636, 1637, 1638,
1639, and 1640 (the "1600 Series Analysis Methods"). Many of these analysis methods were based on
existing EPA methods that have been rewritten with additional quality control and sample handling
requirements; others are new methods that are based on newly developed analytical procedures.
Appropriate quality assurance (QA) and quality control (QC) procedures are the key to producing
precise and accurate data unbiased by contamination. Examination of trace metals data without data from
blanks and other QC analyses yields little or no information on whether sample data are reliable. Data
quality must be documented through the use of blanks (both field and laboratory blanks), standards, matrix
spikes, matrix spike duplicates, and field duplicates, as well as other QC analyses. The results of all QC
procedures must be included in the data reporting package along with the sample results if data quality
is to be known.
The remainder of this document contains guidance that is intended to aid in the review of trace
metals data submitted for compliance monitoring purposes under the National Pollutant Discharge
Elimination System (NPDES) when these data are collected in accordance with Method 1669 and analyzed
by the 1600 Series Analysis Methods. Chapter 2 of this document outlines the data elements that must
be reported by laboratories and permittees so that EPA reviewers can validate the data. Chapter 3
provides guidance concerning the review of data collected and reported in accordance with Chapter 2.
Chapter 4 provides a Data Inspection Checklist that can be used to standardize procedures for documenting
the findings of each data inspection.
The guidance provided in these chapters is similar in principle to the data reporting and review
guidance provided in EPA's Guidance on Evaluation, Resolution, and Documentation of Analytical
Problems Associated with Compliance Monitoring (EPA 821-B-93-001), but has been specifically adapted
to reflect particular concerns related to the evaluation of data for trace metals.
This guidance is applicable to the examination of recently gathered trace metals data and to
historical data in existing EPA databases. It should be noted, however, that some qualification of historical
data may be required before these data can be included in current databases. A draft user's guide
prepared by EPA's Environmental Monitoring Systems Laboratory in Las Vegas (EMSL-LV) provides
guidance that may be used to qualify data for inclusion into current databases. The
EMSL-LV guidance stipulates that at least some form of QA/QC must be associated with the historical
data for evaluation. This QA/QC may be in the form of various types of blanks (method, field, etc.),
replicates (field, analytical, etc.), spikes (matrix, surrogate, internal standard, etc.), and PE samples (certified
reference materials, QC check samples, etc.). A scoring mechanism is applied to these QA/QC data, and
the usability of the sample data is based on the resulting score.
Chapter 2
Checklist of Laboratory Data Required
to Support Compliance Monitoring for Trace Metals
Determined in Accordance with Method 1669
and the 1600 Series Analysis Methods
This chapter describes the data elements necessary to support the evaluation of trace metals data
collected using the Method for Sampling Ambient Water for Determination of Trace Metals at EPA Water
Quality Criteria Levels (Method 1669) and the 1600 Series Analysis Methods. These data elements should
accompany the data output in the complete data reporting package.
1. Method Number
In recognition of advances that are occurring in analytical technology, the 1600 Series Analysis
Methods are performance-based. That is, an alternate procedure or technique may be used provided that
the purpose of the modification is to improve method performance on the sample being analyzed. At no
time may a modification be used that degrades the performance of the method. The number of the EPA
method used must be reported in the data package, along with a description of any modifications to the
method.
2. Detailed Narrative

The complete data reporting package must include a detailed narrative describing any problems
encountered during sample collection or analysis, the corrective actions taken, and any modifications made,
so that the data user can understand the reason(s) for acceptance/rejection of the data or any changes to
the reference method.
3. Data Reporting Forms
The complete data reporting package must include data reporting forms that identify the samples
analyzed, the metals and metal species determined, and the concentrations found. Analytes detected in
field samples at concentrations below the minimum level (ML) must be reported as non-detects, and all
concentrations detected in blank samples must be reported, regardless of the level. Results must be
reported for each sample analyzed, including any dilutions and reanalyses. Metals should be listed
by name and CAS Registry number.
The ML is the quantitation level as defined by the EPA 1600 series method used for sample
analysis. The laboratory is required to determine the MDL for each analyte in accordance with the
procedures described in 40 CFR Part 136, Appendix B (Definition and Procedure for Determination of
the Method Detection Limit - Revision 1.11). The MDL multiplied by 3.18 must be less than or equal to the
ML given in the EPA 1600 Series Analysis Method.
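For illustration only, the check can be sketched as follows (Python); the MDL and ML values shown
are placeholders rather than values taken from Table 1 or any specific 1600 Series method.

    # Minimal sketch (hypothetical values): confirm that the laboratory MDL,
    # multiplied by 3.18, does not exceed the ML given in the analysis method.
    def mdl_supports_ml(lab_mdl_ugl, method_ml_ugl, factor=3.18):
        """Return True if the laboratory MDL x 3.18 is <= the method ML (ug/L)."""
        return lab_mdl_ugl * factor <= method_ml_ugl

    print(mdl_supports_ml(0.005, 0.02))   # True:  0.005 x 3.18 = 0.0159 <= 0.02
    print(mdl_supports_ml(0.010, 0.02))   # False: 0.010 x 3.18 = 0.0318 >  0.02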
The use of data qualifiers or flags by the laboratory is discouraged. Rather, laboratories should
describe any problems and their resolution in the narrative. If data qualifiers or flags are used, each
qualifier must be defined in the data reporting package whenever it is used.
4. Summary of Quality Control Results
Results for all quality control analyses required by the reference EPA method must be presented
in the complete data reporting package. If more than one method was used or if more than one set of
samples was analyzed, it must be clearly evident which QC corresponds to a given method and set of
samples.
Results for QC procedures that must be provided include, but are not limited to, the following
(where applicable):
Instrument tuning
Calibration
Calibration verification (initial and following every 10 analytical samples)
Initial precision and recovery
Ongoing precision and recovery
Blanks
Laboratory (method) blanks
Field blanks
Calibration blanks
Equipment blanks
Matrix spike/matrix spike duplicates
Field duplicates
Method of standard additions (MSA) results
Spectral interference checks
Serial dilutions
Internal standard recoveries
Method detection limits
Quality control charts and limits
Table 2 lists the required frequency and purpose of the QC procedures.
5. Raw Data
Raw data for all analyses must be kept on file at the laboratory (Chapter 2, Section 7) and
provided to the data reviewer upon request. The instrument output (emission intensity, peak height,
area, or other signal intensity) must be traceable from the raw data to the final result reported.
The raw data must be provided for not only the analysis of each field sample but also for all calibrations,
blanks, and QC analyses. Raw data are instrument specific and may include, but are not limited to, the
following:
Sample numbers and other identifiers
Digestion/preparation or extraction dates
Analysis dates and times
Analysis sequence/run chronology
Sample weight or volume
Volume prior to each extraction/concentration step
Volume after each extraction/concentration step
Final volume prior to analysis
Injection volume
Matrix modifiers
Dilution data, differentiating between dilution of a sample or an extract
Instrument (make, model, revision, modifications)
Sample introduction system (ultrasonic nebulizer, hydride generator, flow injection system, etc.)
Column (manufacturer, length, diameter, chelating or ion exchange resin, etc.)
Plasma conditions (incident power, flow rates, etc.)
Detector (type, wavelength, slit, analytical mass monitored, etc.)
Background correction scheme
Quantitation reports, data system outputs, and other data to link the raw data to the results
Direct instrument readouts (e.g., strip charts, mass spectra, printer tapes, and other recordings of
raw data) and other data to support the final results
Lab bench sheets and copies of all pertinent logbook pages for all field and QC sample
preparation and cleanup steps, and for all other parts of the determinations
6. Example Calculations
Example calculations that will allow an independent reviewer to determine how the laboratory used
the raw data to arrive at a final result must be provided in the data reporting package if any adjustments
were made to the equations included in the method. Useful examples include both detected and undetected
compounds. If the laboratory or the method employs a standardized reporting level for undetected
compounds, this should be made clear in the example calculation. Adjustments made for sample volume,
dilution, internal standardization, etc. should be evident.
7. Archiving Data on Magnetic Media
It is not necessary for the laboratory or responsible organization to submit digitized, binary,
hexadecimal, or other raw signal recordings with the data package. However, the laboratory that performs
the analysis should archive these data so that the raw and reduced data can be reconstructed, and the
laboratory or organization responsible for reporting the data should be prepared to submit raw data on
magnetic media upon request by EPA. Magnetic media may be required for automated data review, for diagnosis
of data reduction problems, or for establishment of an analytical database.
8. Names, Titles, Addresses, and Telephone Numbers of Analysts and QC Officer
The names, titles, addresses, and telephone numbers of the analysts who performed the
determinations and the quality control officer who verified the results must be included in the data
reporting package. If the data package is being submitted by a person or organization other than the
analytical laboratory, it is that person or organization's responsibility to ensure that the laboratory provides
all the data listed above and that all method requirements are met. For example, with regard to effluent
or ambient monitoring data submitted by an NPDES permittee on a Discharge Monitoring Report (DMR),
the task of collecting and reporting quality control data falls to the permittee.
In addition, the names, titles, addresses, and telephone numbers (if different from the
laboratory that analyzed the field samples) of the facility that cleaned and shipped the sampling equipment
and generated the equipment blanks, the laboratory (if different) that analyzed the equipment blanks, and
the facility responsible for the collection, filtration, and transport of the field samples to the laboratory
must be obtained and included in the data reporting package.
Table 1
Method Numbers, Analytical Techniques, Method Detection Limits, and Minimum Levels

Method  Technique                      Metal                  MDL (µg/L)1   ML (µg/L)2   Lowest EPA Water Quality Criterion (µg/L)3
1631    Oxidation/Purge & Trap/CVAFS   Mercury                0.000054      0.0002       0.012
1632    Hydride AA                     Arsenic                0.002         0.005        0.018
1636    Ion Chromatography             Hexavalent Chromium    0.23          0.5          10
1637    C/STGFAA                       Cadmium                0.0075        0.02         0.37
                                       Lead                   0.036         0.1          0.54
1638    ICP/MS                         Antimony               0.0097        0.02         14
                                       Cadmium                0.013         0.1          0.37
                                       Copper                 0.087         0.2          2.4
                                       Lead                   0.015         0.05         0.54
                                       Nickel                 0.33          1            8.2
                                       Selenium               0.45          1            5
                                       Silver                 0.029         0.1          0.32
                                       Thallium               0.0079        0.02         1.7
                                       Zinc                   0.14          0.5          32
1639    STGFAA                         Antimony               1.9           5            14
                                       Cadmium                0.023         0.05         0.37
                                       Trivalent Chromium     0.10          0.2          57
                                       Nickel                 0.65          2            8.2
                                       Selenium               0.83          2            5
                                       Zinc                   0.14          0.5          32
1640    C/ICP/MS                       Cadmium                0.0024        0.01         0.37
                                       Copper                 0.024         0.1          2.4
                                       Lead                   0.0081        0.02         0.54
                                       Nickel                 0.029         0.1          8.2

1 Method Detection Limit as determined by 40 CFR Part 136, Appendix B
2 Minimum Level of quantitation as given in the 1600 Series Analysis Method
3 Lowest of the EPA water quality criteria listed in Appendix A
Chapter 3
Guidance for Reviewing Data
from the Analysis of Trace Metals Using
Method 1669 and the 1600 Series Analysis Methods
Use of the guidelines provided below, or of similarly developed standardized protocols, is
recommended as a tool with which Regional and State permitting authorities can standardize their data
inspection and acceptance procedures and minimize differences that might otherwise result between data
reviewers and the permittees responsible for submitting data. A Data Inspection Checklist has been
provided in the following chapter. This checklist provides a standardized format for documenting the
findings of each data inspection.
1. Purity and Traceability of Reference Standards
The accuracy of any non-absolute empirical measurement is dependent on the reference for that
measurement. In determining pollutants in water or other sample matrices, the analytical instrument and
system must be calibrated with a known reference material of documented purity and traceability.
This information need not be provided with every sample analysis. Rather, it should be
maintained on file at the laboratory and provided upon request. When analyses are conducted under
contract, this information should be provided to the permittee at the time that the laboratory is
employed for specific analyses and updated as needed.
2. Number of Calibration Points
The 1600 Series Analysis Methods specify that a minimum of three concentrations are to be used
when calibrating the instrument. One of these points must be the Minimum Level (ML, see Item 5) and
one must be near the upper end of the calibration range. Calibration must be performed before
any samples are analyzed. The use of the ML as a point on the calibration curve provides a
means by which to assure that measurements made at this quantitation level are reliable.
The data reviewer should review the points used by the laboratory to calibrate the instrument and
make certain that the calibration range encompasses the Minimum Level and that all sample and QC
measurements are within the calibration range. Samples that produced results above the calibration range
should have been diluted and reanalyzed in accordance with the specifications detailed in the
1600 Series Analysis Method used by the laboratory. The diluted sample results need only
apply to those analytes that exceeded the calibration range of the instrument; in other words, it is
acceptable to use data for different analytes from different dilution levels within the same sample.

Some flexibility is warranted when data from analysis of the diluted sample are not provided,
because limited use can still be made of the data that are above the calibration range. The response of the
analytical instrument to increasing concentrations of analytes will eventually level off at concentrations
above the calibration range. Although it is not possible to specify the concentration at which this will
occur, it is generally safe to assume that a result reported above the calibrated range is a lower limit of the
actual concentration. Therefore, if the concentration above the calibration range is also above a regulatory
limit, it is a virtual certainty that the actual concentration would also be above that limit.
3. Linearity of Calibration
The relationship between the response of an analytical instrument to the concentration or amount
of an analyte introduced into the instrument is referred to as the "calibration curve". An analytical
instrument can be said to be calibrated in any instance in which an instrumental response can be related
to the concentration of an analyte. The response factor (RF, calculated for external standard calibration)
or relative response factor (RRF, calculated for internal standard calibration) is the ratio of the response
of the instrument to the concentration of the analyte introduced into the instrument. Equations for
calculating RFs and RRFs are provided in the 1600 Series Analysis Methods.
While the shape of calibration curves can be modeled by quadratic equations or higher order
polynomial functions, most analytical methods focus on a calibration range in which the instrument response
is essentially a linear function of the concentration of the analyte. The advantage of the linear
calibration is that the RF or RRF represents the slope of the calibration curve, simplifying calculations and
interpretation. The 1600 Series Analysis Methods contain specific criteria for determining the
linearity of calibration curves determined by either an internal or external standard technique. When the
linearity criterion is met, the calibration curve is sufficiently linear to permit the laboratory to use an
averaged RF or RRF, and it is assumed that the calibration curve is a straight line that passes through the
zero/zero calibration point. Linearity is determined by calculating the relative standard deviation (RSD)
of the RF or RRF for each analyte and comparing this RSD to the specified limit. The specific acceptance
criteria are listed in the Data Inspection Checklist (Chapter 4, Item 12) and in the 1600 Series Analysis
Methods. These methods also include alternative procedures to be used in the event the linearity criteria
fail specifications.
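The following Python sketch illustrates the RSD check described above, using the 15% (RRF) and 25%
(RF) limits cited in the Chapter 4, Item 12 footnote; the response factor values themselves are
illustrative only.

    # Minimal sketch of the linearity check: compute the RSD of the response
    # factors from the calibration points and compare it to the method limit.
    import statistics

    def rsd_percent(factors):
        """Relative standard deviation (%) of a set of RFs or RRFs."""
        return 100.0 * statistics.stdev(factors) / statistics.mean(factors)

    rrfs = [0.101, 0.098, 0.105]   # one RRF per calibration point (illustrative)
    if rsd_percent(rrfs) <= 15.0:  # 15% applies to RRFs; 25% applies to RFs
        print("Averaged RRF may be used for this analyte")
    else:
        print("Linearity criterion not met; use the method's alternative procedure")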
The laboratory must provide the RSD results by which an independent reviewer can judge
linearity, even in instances in which the laboratory is using a calibration curve. In these instances, the data
reviewer should review each calibration point to assure that the response increases as the concentration
increases. If it does not, the instrument is not operating properly, and the data should not be considered
valid.
4. Calibration Verification
Calibration verification involves the analysis of a single standard, typically in the middle of the
calibration range, at the beginning (and, in some cases, at the end) of each analytical shift. The
concentration of each analyte in a reference standard is determined using the initial calibration data and
compared to specifications in the method. If results are within the specifications, the laboratory may
proceed with analysis without recalibrating. The initial calibration data are then used to quantify sample
results. Specific criteria for acceptance of calibration verifications are provided in the Data Inspection
Checklist (Chapter 4, Item 17) and the 1600 Series Analysis Methods.
Calibration verification, which is used in the 1600 Series Analysis Methods, differs in concept and
practice from the continuing calibration employed in some other EPA methods. In continuing calibration,
new response factors are calculated each shift and, if they show acceptable agreement with the
initial calibration, all subsequent sample analyses are conducted using the new response factors. The only
requirement is acceptable agreement between the old and new response factors, so this practice amounts
to a single-point calibration.

The 1600 Series Analysis Methods require calibration verification after every ten samples.
Calibration verification is performed by analyzing an aliquot of the mid-point calibration standard and
obtaining results that meet the specifications contained in the methods. These specifications are given for
each analyte in the 1600 Series Analysis Methods.
5. Method Detection Limit and Minimum Level
The 1600 Series Analysis Methods require the laboratory to perform a method detection
limit (MDL) study for each analyte in accordance with the procedures given in 40 CFR Part 136,
Appendix B. The MDL studies are conducted to demonstrate that the laboratory is capable of achieving
the MDLs listed in the methods. MDL studies were conducted by at least one laboratory for each method,
and the resulting MDLs are listed in Table 1.
If sample results are reported below the ML, the data reviewer should require the responsible party
to correct and resubmit the data. If this course of action is not possible, the reviewer should determine the
sample-specific ML and consider results below that level to be non-detects for regulatory purposes.
If sample results are reported above the ML, but are below the facility's regulatory compliance
level, the reviewer should examine the blank results in order to determine if the level of pollutant detected
may be attributable to contamination.
Although sample results are to be reported only if they exceed the ML, all blank results are to be
reported, regardless of the level. This reporting requirement allows data reviewers the opportunity to
assess the impact of any blank contamination on sample results that are reported above the ML.
It is important to remember that if a change that will affect the MDL is made to a method, the
MDL study must be repeated to demonstrate that the modified procedures were in control
and were capable of producing the desired MDLs.
The procedures given in this document are for evaluation of results for determination of regulatory
compliance and not for assessment of trends, for triggering additional monitoring, or for other purposes.
Reporting of all results, whether negative, zero, below the MDL, or above the MDL but below
the ML, may be of value and may be required by the permitting authority as necessary
to enforce in a particular circumstance. Dealing with the multiplicity of consequences presented by such
results, either singly or in combination, is beyond the present scope of this document.
6. Initial Precision and Recovery
The laboratory is required to demonstrate its ability to generate acceptable precision and accuracy
before analyzing field samples. Experience has shown that laboratories that have difficulty passing the
start-up test have such marginal performance that they will have difficulty in the routine practice of the
method.
The test consists of spiking four aliquots of reagent water with the metals of interest at 2 - 3 times
the ML as listed in the method and analyzing these four aliquots. The mean concentration
(x) and the standard deviation (s) are then calculated for each analyte and compared to the specifications
in the method. If the mean and the standard deviation are within the limits, the laboratory can use the
method to analyze field samples.
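A minimal Python sketch of this start-up check follows; the four results, the spike level, and the
acceptance limits are illustrative placeholders and are not taken from any 1600 Series method.

    # Minimal sketch of the IPR check: mean recovery and standard deviation
    # of four spiked reagent water aliquots, compared to acceptance limits.
    import statistics

    results_ugl = [0.048, 0.052, 0.050, 0.047]   # four IPR aliquot results
    spike_ugl = 0.050                            # spike level (2 - 3 times the ML)

    mean_x = statistics.mean(results_ugl)
    std_s = statistics.stdev(results_ugl)
    recovery_pct = 100.0 * mean_x / spike_ugl

    # Hypothetical limits for illustration only; use the limits in the method.
    if 75.0 <= recovery_pct <= 125.0 and std_s <= 0.010:
        print("IPR passes; the laboratory may analyze field samples")
    else:
        print("IPR fails; correct the problem and repeat the start-up test")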
If the laboratory fails to meet the specifications in the method, none of the data produced
by the laboratory can be considered to be valid. If the laboratory did not perform the start-up tests, the
data cannot be valid, unless all other QC criteria have been met and the laboratory has submitted IPR and
associated instrument QC data that were generated after-the-fact by the same analyst on the same
instrument. If these conditions are met, then the data reviewer may consider the data to be acceptable for
most purposes. NOTE: The inclusion of this alternative should not in any way be construed to condone
the practice of performing IPR analyses after the analysis of field samples. Rather, EPA believes that
demonstration of laboratory capability prior to sample analysis is an essential QC practice. This
alternative is provided only as a tool to permitting authorities when data have already been collected
without the required start-up analyses. Once the problem has been identified, the responsible party is
expected to implement corrective action necessary to ensure that it is not repeated.
It is important to remember that if a change is made to a method, the IPR procedure must be
repeated using the modified procedure. If the start-up test is not repeated when these steps are modified
or added, any data produced by the modified methods cannot be considered to be valid.
7. Analysis of Blanks
Because trace metals are ubiquitous in the environment, the precautions necessary to preclude
contamination are more extensive than those required to preclude contamination when synthetic organic
compounds are determined at trace levels. EPA has found that several types of blanks (equipment, field,
laboratory, and calibration blanks) are needed to identify and control the sources of contamination.

The permittee should obtain equipment blank results
from its cleaning facility, maintain these results on file, and provide them to the permitting authority upon
request. The data reviewer should evaluate equipment blank results only if it is necessary to identify
potential sources of contamination present in field blanks.
Controlling laboratory contamination is an important aspect of the quality assurance plan for the
equipment-cleaning facility, laboratory, and field team. Each party should maintain records regarding
blank contamination. Typically, these records take the form of a paper trail for each piece of equipment
and control charts, and they should be used to prompt corrective action by the party responsible for the
contamination. For example, if records at a single site suggest that equipment blanks, laboratory blanks,
and calibration blanks are consistently clean but that field blanks show consistent levels of contamination,
then the field sampling team should re-evaluate their sample handling procedures, identify the problem,
and institute corrective actions before collecting additional samples. Similarly, equipment cleaning
facilities and laboratories should utilize the results of blank analyses to identify and correct problems in
their processes.
Unfortunately, it is often too late for corrective action if data are received that suggest the presence
of uncontrolled contamination that adversely affects the associated data. The exception to this rule is the
case in which the field and equipment blanks show no discernible levels of contamination, contamination
is detected in the laboratory or calibration blanks, sample holding times have not expired, and sufficient
sample volume remains to allow the laboratory to identify and eliminate the source of contamination and
reanalyze the associated sample(s). In all other cases, the reviewer must exercise one of several options
listed below when making use of the data.
If a contaminant is present in a blank but is not present in a sample, then there is little need for
concern about the sample result. (It may be useful, however, to occasionally review *e raw data
for samples without the contaminant to ensure that the laboratory did not edit the results for this
compound.)
If the sample contains the contaminant at levels of at least 10 times that in the blank, then the
likely contribution of the blank contamination to the sample result is at most 10%. Since most
of the methods in question are no more accurate than that level, the possible contamination is
negligible, and the data can be considered to be of acceptable quality.
If the sample contains the contaminant at levels of at least 5 times but less than 10 times the blank
result, the numerical result in the sample should be considered an upper limit of the true
concentration, and data users should be cautioned when using such data for enforcement purposes.
If the sample contains the contaminant at levels below 5 times the level in the blank, the sample
data are suspect unless there are sufficient data from analyses of multiple blanks to perform a
statistical analysis proving the significance of the analytical result. Such statistical analyses are
beyond the scope of this guidance.
If blank contamination is found in some types of QC samples but not others (e.g., only in the
laboratory blank but not in the field blank), the data user should apply the guidelines listed above,
but may also use this information to identify the source of contamination and take corrective
actions to prevent future recurrences.
There are two difficulties in evaluating sample results relative to blank contamination. First, the
reviewer must be able to associate the samples with the correct blanks. Field blanks are associated with
each group of field samples collected from the same site. Calibration blanks are associated with samples
by the date and time of analysis on a specific instrument. Laboratory (method) blanks are associated with
each batch of 10 samples prepared and digested in accordance with a particular method during a single
shift. If the reviewer cannot associate a batch of samples with a given blank, the reviewer should request
this association from the laboratory so that the results for the samples can be validated.
The second difficulty involves samples that have been diluted. The dilution of the sample with
reagent water represents an additional potential source of contamination that will not be reflected in the
results for the blank unless the blank was similarly diluted. Therefore, in applying the 10-times rule stated
above, the concentration of the sample is compared to the blank results multiplied by the dilution factor
of the sample. For instance, if 1.2 ppb of a contaminant is found in the blank, and the associated sample
was diluted by a factor of six relative to the extract from the blank prior to analysis, then the diluted
sample result would have to be greater than 1.2 x 6 x 10 or 72 ppb to be acceptable. Diluted sample
results between 36 and 72 ppb would be considered an upper limit of the actual concentration, and diluted
sample results that were less than 36 ppb would be considered unacceptable in the absence of sufficient
blank data to statistically prove the significance of the result.
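The comparison described above, including the dilution-factor adjustment, can be sketched as follows
(Python); the 10-times and 5-times thresholds follow the guidelines given earlier in this section, and
the concentrations echo the worked example.

    # Minimal sketch of the blank-comparison guidelines, with the blank result
    # scaled by the dilution factor applied to the sample.
    def evaluate_against_blank(sample_ppb, blank_ppb, dilution_factor=1):
        """Classify a diluted sample result relative to its associated blank."""
        adjusted_blank = blank_ppb * dilution_factor
        if sample_ppb >= 10 * adjusted_blank:
            return "acceptable (blank contribution is negligible)"
        if sample_ppb >= 5 * adjusted_blank:
            return "upper limit of the true concentration; use with caution"
        return "suspect without a statistical evaluation of multiple blanks"

    # Worked example from the text: 1.2 ppb in the blank, sample diluted 6x.
    print(evaluate_against_blank(80.0, 1.2, 6))   # acceptable (above 72 ppb)
    print(evaluate_against_blank(50.0, 1.2, 6))   # upper limit (36 - 72 ppb)
    print(evaluate_against_blank(30.0, 1.2, 6))   # suspect (below 36 ppb)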
In most cases, the practice of subtracting the concentration reported in the blank from the
concentration in the sample is not recommended as a tool to evaluate sample results associated with blank
data. One of the most common problems with this approach is that blank concentrations are sometimes
higher than one or more associated sample results, yielding negative results.
Nearly all of the 1600 Series Analysis Methods are capable of producing MDLs that are at least
10 times lower than the lowest water quality criteria (WQC) published in the National Toxics Rule Since
most discharge permits require monitoring at levels that are comparable to or higher than the WQC
published in the National Toxics Rule, EPA believes that, in nearly all cases, laboratories should be
capable of producing blank data that are at least 10 times less than the regulatory compliance level It
should also be noted that laboratories cannot be held accountable for contamination that is present in field
blanks but not present in laboratory blanks; in such cases the sampling crew should take corrective
measures to eliminate the source of contamination during their sample collection and handling steps.
8. Ongoing Precision and Recovery
The 1600 Series Analysis Methods require laboratories to prepare and analyze an "ongoing
precision and recovery" (OPR) sample with each batch of up to 10 samples started through the extraction
process on the same twelve hour shift. This OPR sample is identical to the aliquots used in the IPR
analyses (see Item 6), and the results of the OPR are used to ensure that laboratory performance is in
control during the analysis of the associated batch of field samples.
The data reviewer must verify that the OPR sample has been run with each sample batch and that
the applicable recovery criteria in the analytical method have been met. If the recovery criteria have not
been met, the reviewer may use the following guidelines when making use of the data:
If the concentration of the OPR is above method specifications but that analyte is not detected in
an associated sample, then it is unlikely that the sample result is affected by the failure in the OPR.
If the concentration of the OPR is above method specifications and that analyte is detected in the
sample, then the numerical sample result may represent an upper limit of the true concentration,
and data users should be cautioned when using the data for enforcement purposes.
If the concentration of the OPR is below method specification but that analyte is detected in an
associated sample, then the sample result may represent the lower limit of the true concentration
for that analyte.
If the concentration of the OPR is below method specification and that analyte is not detected in
an associated sample, then the sample data are suspect and cannot be considered valid for
regulatory compliance purposes.
If the OPR standard has not been run, there is no way to verify that the laboratory processes were
in control. In such cases, a data reviewer may be able to utilize the field sample data by examining the
matrix spike recovery results (see item 9), the IPR results, OPR results from previous and subsequent
batches and any available historical data from both the laboratory and the sample site. If the matrix spike
results associated with the sample batch do not meet the performance criteria in the methods, then the
results for that set of samples cannot be considered valid. If the laboratory's IPR results and the matrix
spike results associated with the sample batch in question meet all applicable performance criteria in
the methods, then the data reviewer may be reasonably confident that laboratory performance was in
control during field sample analysis. This level of confidence may be further increased if there is a strong
history of both laboratory performance with the method and method performance with the sample matrix
in question, as indicated by additional OPR and matrix spike data collected from the laboratory and
samples from the same site.
9. Precision and Recovery of Matrix Spike and Matrix Spike Duplicate Compounds
The 1600 Series Analysis Methods require that laboratories spike the analytes of interest into
duplicate aliquots of at least one sample from each group of ten samples collected from a single site. The
first of these spiked sample aliquots is known as the matrix spike sample; the second is known as the
matrix spike duplicate. These spiked sample aliquots are used to determine if the method is applicable
to the sample matrix in question. The 1600 Series Analysis Methods are applicable to the determination
of metals at concentrations typically found in ambient water samples and certain treated effluents (e.g.,
the part-per-trillion to low part-per-billion range). These methods may not be applicable to marine
samples and many effluent and in-process samples collected from industrial dischargers. Therefore, it is
important to evaluate method performance in the sample matrix of interest.
In evaluating matrix spike sample results, it is important to examine both the precision and
accuracy of the duplicate analyses. Precision is assessed by examining the relative percent difference
(RPD) of the concentrations found in the matrix spike and matrix spike duplicate samples, and comparing
the RPD to the acceptance criteria specified in the analytical method. If the RPD of a matrix spike/matrix
spike duplicate pair exceeds the applicable criterion, then the method cannot be considered to be applicable
to the sample matrix, and none of the associated sample data can be accepted for regulatory compliance
purposes.
If RPD criteria are met, the method is considered to be capable of producing precise data in these
samples, and the data reviewer must then verify that the method is capable of producing accurate data.
Accuracy is assessed by examining the recovery of compounds in the matrix spike and matrix spike
duplicate samples. If the recovery of the matrix spike and duplicate are within the method-specified limits,
then the method is judged to be applicable to that sample matrix. If, however, the recovery of the spike
is not within the recovery range specified, either the method does not work on the sample, or the sample
preparation process is out of control.
If the method is not appropriate for the sample matrix, then changes to the method are required.
Matrix spike results are necessary in evaluating the modified method. If the analytical process is out of
control, the laboratory must take immediate corrective action before any more samples are analyzed.
To separate indications of method performance from those of laboratory performance, the
laboratory should prepare and analyze calibration verification standards and OPR samples. If the results
for either of these analyses are not within the specified range, then the analytical system or process must
be corrected. After the performance of the analytical system and processes have been verified (through
the successful analysis of CCV and OPR samples), the spike sample analysis should be repeated. If the
recovery of the matrix spike and duplicate are within the method-specified range, then the method and
laboratory performance can be considered acceptable. If, however, the recovery of the matrix spike does
not meet the specified range, the laboratory should attempt to further isolate the metal and repeat the test.
If recovery of the metal remains outside the acceptance criteria, the data reviewer may apply the following
guidelines when attempting to make use of the data:
If the recovery of the matrix spike and duplicate are above method specifications but that metal
is not detected in an associated sample or is detected below the regulatory compliance limit, then
it is unlikely that the sample result is affected by the failure in the matrix spike.
If the recovery of the matrix spike and duplicate are above method specifications and that metal
is detected in an associated sample above the regulatory compliance level, then the sample result
may represent the upper limit of the true concentration, and the data should not be considered
valid for regulatory compliance purposes.
If the concentration of the matrix spike and duplicate are below method specifications but that
metal is detected in an associated sample, then the sample result may represent the lower limit of
the true concentration for that metal. If the metal was detected in the sample at a concentration
higher than the regulatory compliance limit, then it is unlikely that the sample result is adversely
affected by the matrix. If, however, the metal was detected below the regulatory compliance limit,
the data should not be considered valid for regulatory compliance purposes.
10. Statements of Data Quality for Spiked Sample Results
The 1600 Series Analysis Methods specify that after the analysis of five spiked samples of a given
matrix type, a statement of data quality is constructed for each analyte. The statement of data quality for
each analyte is computed as the mean percent recovery plus and minus two times the standard deviation
of the percent recovery for the analyte. The statements of data quality should then be updated by the
laboratory after each five to ten subsequent spiked sample analyses.
The statement of data quality can be used to estimate the true value of a reported result and to
construct confidence bounds around the result. For example, if the result reported for analysis of selenium
is 10 ppb, and the statement of data quality for selenium is 84% ± 25% (i.e., the mean recovery is 84%
and the standard deviation of the recovery is 25%), then the true value for selenium will be in the range
of 9.4 - 14.4 ppb, with 95% confidence. This range is derived as follows:
Lower Limit = [(10 ÷ 0.84) - (10 x 0.25)] = [11.9 - 2.5] = 9.4 ppb
Upper Limit = [(10 ÷ 0.84) + (10 x 0.25)] = [11.9 + 2.5] = 14.4 ppb
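A small Python sketch of this bound calculation, using the selenium values from the example above:

    # Minimal sketch: confidence bounds around a reported result, following
    # the calculation shown above (result corrected for mean recovery, plus
    # or minus the result multiplied by the standard deviation of recovery).
    def data_quality_bounds(result_ppb, mean_recovery, sd_recovery):
        """Return (lower, upper) bounds around a recovery-corrected result."""
        corrected = result_ppb / mean_recovery    # 10 / 0.84 = 11.9 ppb
        half_width = result_ppb * sd_recovery     # 10 x 0.25 = 2.5 ppb
        return corrected - half_width, corrected + half_width

    lower, upper = data_quality_bounds(10.0, 0.84, 0.25)
    print(round(lower, 1), round(upper, 1))       # 9.4 14.4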
Many laboratories do not provide the data quality statements with the sample results, in which case
the data reviewer must determine if the data quality statements are being maintained for each analyte and
may need to obtain the data. If necessary, the reviewer can construct the data quality statement from the
individual data points. The lack of a data quality statement does not invalidate results but makes some
compliance decisions more difficult. If statements of data quality are not being maintained by the
laboratory, there may be increased concern about both specific sample results and the laboratory's overall
quality assurance program.
11. Statements of Data Quality for Spiked Reagent Water Results
In addition to statements of data quality for results of analyses of the compounds spiked into field
samples, the 1600 Series Analysis Methods require that statements of data quality be constructed from the
initial and ongoing precision and recovery data. The purpose of these statements is to assess laboratory
performance in the practice of the method, as compared to the assessment of method performance made
from the results of spiked field samples. Ideally, the two statements of data quality would be the same.
Any difference could be attributable to either random error or sample matrix effects.
12. Field Duplicates
Method 1669 requires the collection of at least one field duplicate for each batch of field samples
collected from the same site. The field duplicate provides an indication of the overall precision associated
with the entire data gathering effort, including sample collection, preservation, transportation, storage, and
analysis procedures. The data reviewer should examine field duplicate results and use the following
equation to calculate the relative percent difference between the duplicate and its associated samples.
RPD = |D1 - D2| / [(D1 + D2) / 2] x 100

where:

D1 = concentration of the analyte in the field sample
D2 = concentration of the analyte in the duplicate field sample
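The calculation can be sketched as follows (Python); the two concentrations are illustrative.

    # Minimal sketch of the field duplicate RPD calculation shown above.
    def relative_percent_difference(d1, d2):
        """RPD = |D1 - D2| / ((D1 + D2) / 2) x 100, in percent."""
        if d1 == 0 and d2 == 0:
            return 0.0            # analyte not detected in either replicate
        return abs(d1 - d2) / ((d1 + d2) / 2.0) * 100.0

    print(round(relative_percent_difference(4.2, 3.8), 1))   # 10.0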
If the analyte of interest was not detected in either replicate of the field sample, then the RPD will
be zero. If the analyte was detected in each field sample replicate, but the results are highly disparate
(indicated by a large RPD), the reviewer should apply the following guidelines when making use of the
data:
If the analyte was detected in each replicate and at similarly variable concentrations in the blank
samples, then the field sample variability may be attributable to variable contamination, and the
data may not be valid for regulatory compliance purposes.
If the analyte was detected in each replicate at a concentration well above the regulatory
compliance level, but was not detected in the associated blank samples, then it is likely that the
sample results are not adversely affected.
Ideally, the RPD between field duplicates and MS/MSD samples will be identical. Any difference
between the two is attributable to variability associated with the field sampling process.
Chapter 4
Data Inspection Checklist
The following pages contain a data inspection checklist that may be used by data reviewers,
laboratory personnel, and other parties to document the results of each data inspection in a standardized
format.
Data Inspection Checklist

Summary Information

1. Name of Reviewer:                                        Title:

   Required Samples                                         Sample Results Provided
   Sample Location or Sample ID       Analyte(s)            Sample Location or Sample ID       Analyte(s)
2. Method Used:

3. Total No. of analytical shifts per instrument (determined from analysis run log):
   Instrument                         No. of Shifts

4. Total No. of CCVs Required:                              Total No. of CCVs Reported:
   (one for each 10 samples after the first 10 samples on each instrument)

5. Total No. of CCBs Required:                              Total No. of CCBs Reported:
   (one for each CCV)

6. Total No. of Field Blanks Required:                      Total No. of Field Blanks Reported:
   (one per site or per 10 samples, whichever is more frequent)

7. Total No. of Lab Blanks Required:                        Total No. of Lab Blanks Reported:
   (one per batch per method/instrument)

8. Total No. of OPR Analyses Required:                      Total No. of OPR Analyses Reported:
   (one per batch per method/instrument)

9. Total No. of MS/MSD Samples Required:                    Total No. of MS/MSD Samples Reported:
   (one per 10% per matrix per site)

10. Total No. of Field Duplicates Required:                 Total No. of Field Duplicates Reported:
    (one per 10 samples per site)

11. Total No. of MDL Results Required:                      Total No. of MDL Results Reported:
    (one per method and per analyte)
12. Initial Calibration
a. Was a multiple point initial calibration performed*?                          [ ] yes  [ ] no
b. Were all sample concentrations reported within the calibration range?         [ ] yes  [ ] no
c. If no, list method and analytes for which initial calibration was not performed or which exceeded
   the calibration range.
   Analyte                 No ICAL (Y/N)                 Exceeded ICAL Range
d. Did the initial calibration meet linearity criteria?                           [ ] yes  [ ] no
e. If no, was a calibration curve used to calculate sample concentrations?        [ ] yes  [ ] no

* A multiple point calibration (minimum of three standards) should be performed for each analyte; if the RSD of the mean RRF is less than 15% or if
the RSD of the mean RF is less than 25%, then the averaged RRF or RF, respectively, may be used for that analyte.
13. Method Detection Limit (MDL)/Minimum Level (ML)
a. Did the laboratory demonstrate their ability to achieve the required MDL?      [ ] yes  [ ] no
b. Did the initial calibration range encompass the ML?                            [ ] yes  [ ] no
c. Were all field samples detected below the ML reported as non-detects?          [ ] yes  [ ] no
d. If the answer to item a, b, or c above was "no", describe problem:
14. Initial Calibration Verification (ICV)/Initial Calibration Blanks (ICB)
a. Was an ICV run prior to field samples?                                         [ ] yes  [ ] no
b. Were ICV results within the specified windows?                                 [ ] yes  [ ] no
c. Was the ICV followed by an ICB?                                                [ ] yes  [ ] no
d. Was the ICB free from contamination?                                           [ ] yes  [ ] no
e. If any item in a - d above was answered "no", list problems below:
   Failed ICV Recovery            Concentration Detected in ICB            Affected Samples
15. Initial Precision and Recovery (IPR)
a. Were IPR data reported for each analyte?                                       [ ] yes  [ ] no
b. Did all IPR aliquots meet required recovery criteria (x)?                      [ ] yes  [ ] no
c. Did the standard deviation (s) of each IPR series meet the required criterion? [ ] yes  [ ] no
d. If any item in a - c above was answered "no", document problem below.
   Analyte            Ave. Result Reported (x)            RSD Reported            Affected Samples
16. Ongoing Precision and Recovery (OPR)
a. Were OPR data reported for each analyte, instrument, and batch?                [ ] yes  [ ] no
b. Did all OPR samples meet required recovery criteria (x)?                       [ ] yes  [ ] no
c. If item a or b above was answered "no", document problem below.
   Analyte            OPR Recovery (x) Reported            Shifts Missing OPR            Affected Samples
17. Continuing Calibration Verification (CCV)/Continuing Calibration Blank (CCB)
a. Were CCVs run prior to each batch of 10 samples on each instrument?            [ ] yes  [ ] no
b. Were all CCV results within the specified windows?                             [ ] yes  [ ] no
c. Was each CCV followed by a CCB?                                                [ ] yes  [ ] no
d. Was each CCB free from contamination?                                          [ ] yes  [ ] no
e. If any item in a - d above was answered "no", list problems below:
   Analyte            Affected Samples            Shift Missing CCV/CCB            Failed CCV/CCB ID
18. Laboratory (Method) Blanks
a. Was a method blank analyzed for each instrument & sample batch?                [ ] yes  [ ] no
b. Was each method blank demonstrated to be free from contamination?              [ ] yes  [ ] no
c. If the answer to item a or b was "no", document problems below.
   Analyte            Affected Samples            Blank Concentration Reported            Shift Missing MB
19. Field Blanks
a. Was a field blank analyzed for each 10 samples per site?                       [ ] yes  [ ] no
b. Was each field blank demonstrated to be free from contamination?               [ ] yes  [ ] no
c. If the answer to item a or b was "no", document problems below.
   Analyte            Affected Samples            Blank Concentration Reported            Shift Missing FB
20. MS/MSD Results
a. Was the appropriate number of MS/MSD pairs analyzed?                           [ ] yes  [ ] no
b. Were all MS/MSD recoveries within specified windows?                           [ ] yes  [ ] no
c. Were all RPDs within the specified window?                                     [ ] yes  [ ] no
d. Was appropriate corrective action (e.g., MSA for GFAA, serial dilution
   for ICP) employed on affected samples?                                         [ ] yes  [ ] no
e. If the answer was "no" to items a - d above, document affected samples:
   Analyte            MS % R            MSD % R            MS/MSD RPD            Affected Samples
21. Additional Information
a. Were instrument tune data provided?                                            [ ] yes  [ ] no
b. Were equipment blanks demonstrated to be free from contamination?              [ ] yes  [ ] no
c. Were statements of data quality provided?                                      [ ] yes  [ ] no
d. Did field duplicates demonstrate acceptable precision?                         [ ] yes  [ ] no
Glossary
Accuracy: The degree of agreement between a measured value and the true or expected value of the
quantity of concern.
Calibration Blank: A sample of reagent water analyzed after the calibration verification standard to
check for contamination attributable to the analytical system.
Calibration Range (Calibration Curve): A graphical relationship between the known values for a series
of calibration standards and instrument responses, specifically the linear portion of this relationship
between calibration standards.
Dissolved Metals: The concentration of metal(s) that will pass through a 0.45 micron filter assembly,
prior to acidification of the sample.
Equipment Blank: An aliquot of reagent water that is subjected in the laboratory to all aspects of sample
collection and analysis, including contact with all sampling devices and apparatus. The purpose of the
equipment blank is to determine if the sampling devices and apparatus for sample collection have been
adequately cleaned prior to shipment to the field site. An acceptable equipment blank must be achieved
before the sampling devices and apparatus are used for sample collection.
Field Blank: An aliquot of reagent water that is placed in a sample container in the laboratory, shipped
to the sampling site, and treated as a sample in all respects, including contact with the sampling devices
and exposure to sampling site conditions, storage, preservation, and all analytical procedures, which may
include filtration. The field blank is used to determine if field sample handling processes, sample
transport, and sampling site environment have caused sample contamination.
Field Duplicates: Two identical aliquots of a sample collected in separate sample containers at the same
time and place under identical circumstances and sample collection techniques, and handled in exactly the
same manner as other samples. Field duplicates are used as a measure of the precision associated with
sample handling, preservation, and storage as well as laboratory handling, preparation, and analytical
procedures.
Initial Precision and Recovery (IPR): A series of four consecutively analyzed aliquots of reagent water
containing the analyte(s) of interest at 2 - 3 times the ML. IPRs are performed prior to the first time a
method is used and any time the method or instrumentation is modified. The IPR is used to demonstrate
the analyst/laboratory ability to generate acceptable precision and accuracy through the calculated mean
(x) and standard deviation (s) for each analyte.
Laboratory Blank: An aliquot of reagent water that is treated exactly as a sample including exposure
to all glassware, equipment, solvents, reagents, internal standards, and surrogates that are used with
samples. The laboratory blank is used to determine if analytes or interferences are present in the laboratory
environment, reagents, or the apparatus.
Magnetic Media: A storage medium on which all instrumentally acquired raw data may be retained.
Matrix Spike (MS) and Matrix Spike Duplicate (MSD): Aliquots of an environmental sample to which
known quantities of analytes are added in the laboratory. The MS and MSD are analyzed under the same
conditions as other samples and are used to quantify the bias and precision associated with the sample
matrix. The background concentration of the analytes in the sample are determined and subtracted from
the MS and MSD results.
Method Blank: See "laboratory blank".
Method Detection Limit (MDL): The minimum concentration of an analyte that, in a given matrix and
with a specified method, has a 99% probability of being identified, qualitatively or quantitatively
measured, and reported to be greater than zero.
Minimum Level (ML): The lowest level at which the entire analytical system gives a recognizable signal
and acceptable calibration point.
Ongoing Precision and Recovery (OPR): An aliquot of reagent water containing the analyte(s) of
interest. The OPR is used to demonstrate continuing ability of the analyst/laboratory to generate
acceptable results based on target and standard recoveries.
Quality Assurance (QA): An integrated system of activities involving planning, quality control, quality
assessment, reporting, and quality improvement to ensure that a product or service meets defined standards
of quality with a stated level of confidence.
Quality Control (QC): The overall system of technical activities designed to measure and control the
quality of a product or service so that it meets the needs of users. The aim is to provide quality that is
satisfactory, adequate, dependable, and economical.
Precision: The degree of mutual agreement characteristic of independent measurements as the result of
repeated applications of the process under specified conditions.
Reagent Water: Water demonstrated to be free from the metal(s) of interest at the method detection limit
(MDL) of the analytical method to be used for determination of the metal(s) of interest.
Reference Standards: A material or substance, one or more properties of which are sufficiently well
established to be used for the calibration of analytical apparatus, the assessment of a measurement method,
or assigning of values to materials.
Trace Metals: Concentrations of metals found at or near their established water quality criteria levels.
Appendix A
EPA Water Quality Criteria for
Priority Pollutant Metals and Metals Species
The table provided on the following page provides the freshwater, marine, and human health water quality
criteria published by EPA for priority pollutant metals and metals species. Human health criteria reflect
values published by EPA in the National Toxics Rule at 57 FR 60848. Aquatic criteria reflect values
published by EPA in the National Toxics Rule and in the Stay of Federal Water Quality Criteria for
Metals (60 FR 22228). This table includes criteria for both total recoverable metals and dissolved metals
In addition, the table includes freshwater criteria that are based on a hardness of 100 mg/L In order to
provide a worst-case scenario, the table also includes criteria that are based on a hardness of 25 mg/L
<"»™ Calculations for deriving these values were published by EPA at 60 FR 22228.
[Table: EPA Water Quality Criteria for Priority Pollutant Metals and Metals Species]