United States
Environmental Protection
Agency

Office of Research and
Development
Washington, D.C. 20460

January 2002

EPA Environmental Technology
Verification Program

Verification Test Plan

Evaluation of Field Portable
Measurement Technologies for
Lead in Dust Wipes

Oak Ridge National Laboratory

-------
January 2002

Environmental Technology
Verification Program

Verification Test Plan

Evaluation of Field Portable
Measurement Technologies for Lead in
Dust Wipes

By

Oak Ridge National Laboratory
Oak Ridge, Tennessee 37831-6120

and

U.S. Environmental Protection Agency

Environmental Sciences Division
National Exposure Research Laboratory
Las Vegas, Nevada 89193-3478


-------
APPROVAL SIGNATURES

This document is intended to ensure that all aspects of the verification are documented and scientifically sound, and that operational procedures are conducted within quality assurance/quality control specifications and health and safety regulations.

The signatures of the individuals below indicate concurrence with, and agreement to operate in compliance with, the procedures specified in this document.

U. S. ENVIRONMENTAL PROTECTION AGENCY

Project Manager: 			

Eric Koglin	Date

ESD Quality Manager: 			

George Brilis	Date

OAK RIDGE NATIONAL LABORATORY

Project Manager: 			

Roger Jenkins	Date

Technical Lead: 			

Amy Dindal	Date

QA Specialist:				

Janet Wagner	Date

Statistician:				

Charles Bayne	Date

U. S. DEPARTMENT OF ENERGY, Oak Ridge Operations Office

Program Manager: 			

Regina Chung	Date

i


-------
Environmental Safety & Health Review

CSD ES&H Coordinator: 	

Fred Smith

TECHNOLOGY VENDORS

KeyMaster Technologies: 	

Therese Howe

ii


-------
TABLE OF CONTENTS

EXECUTIVE SUMMARY	v

ABBREVIATIONS AND ACRONYMS 	vi

1	INTRODUCTION 	1

1.1	Verification Objectives 	1

1.2	What is the Environmental Technology Verification Program?	1

1.3	Technology Verification Process	2

1.3.1	Needs Identification and Technology Selection	2

1.3.2	Verification Planning and Implementation	2

1.3.3	Report Preparation 	3

1.3.4	Information Distribution	3

1.4	Purpose of this Verification Test Plan	3

2	VERIFICATION RESPONSIBILITIES AND COMMUNICATION 	3

2.1	Verification Organization and Participants	3

2.2	Organization	4

2.3	Responsibilities 	5

3	TECHNOLOGY DESCRIPTIONS 	6

3.1 KeyMaster Technologies	6

3.1.1	General Description 	6

3.1.2	Sample Preparation	6

3.1.3	Sample Analysis	7

4	VERIFICATION TEST DESIGN	7

4.1	Drivers and Objectives of the Verification Test 	7

4.2	Summary of the Experimental Design 	7

4.2.1	ELPAT and Blank Sample Description 	7

4.2.2	University of Cincinnati Sample Description	8

4.2.3	Distribution and Number of Samples	9

4.3	Comparison of Field Technology Results to an NLLAP-Recognized Laboratory's Results 	10

4.3.1	Laboratory Selection	10

4.3.2	Description of Method 	10

5	EXECUTION OF THE VERIFICATION TEST	10

5.1	Summary of Verification Activities	10

5.2	Sample Distribution	11

5.3	Submission of Results 	11

5.4	Verification Performance Factors 	11

6	QUALITY ASSURANCE PROJECT PLAN (QAPP)	11

6.1	Purpose and Scope	11

6.2	Quality Assurance Responsibilities	11

6.3	Field Operations	12

6.3.1	Site Training	12

6.3.2	Communication and Documentation 	12

6.4	Performance and System Audits 	12

6.4.1 Technical Systems Audit 	12

iii


-------
6.4.2	Data quality audit of the laboratory 	12

6.4.3	Surveillance of Technology Performance 	12

6.5	Quality Assurance Reports 	12

6.5.1	QC Reports of Sample Preparation	12

6.5.2	QAS Surveillance Report	13

6.5.3	Status Reports	13

6.5.4	Audit Reports 	13

6.6	Corrective Actions	13

6.7	Laboratory Quality Control Checks	13

6.8	Data Management 	13

6.9	Data Reporting, Validation, and Analysis	13

6.9.1	Data Reporting 	14

6.9.2	Data Validation	14

6.9.2.1	Completeness of Laboratory Records 	14

6.9.2.2	Holding Times	14

6.9.2.3	Correctness of Data	14

6.9.2.4	Correlation Between Samples within a Concentration Set	14

6.9.2.5	Evaluation of QC Results 	14

6.9.2.6	Evaluation of Spiked Sample Data 	14

6.9.3	Data Analysis for Verification Factors	15

6.9.3.1	Precision 	15

6.9.3.2	Accuracy 	16

6.9.3.3	Detectable Blanks	16

6.9.3.4	False Positive/False Negative Results	16

6.9.3.5	Comparability 	16

6.9.3.6	Completeness	17

7 HEALTH AND SAFETY PLAN	17

7.1	Contact Information	17

7.2	Health and Safety Plan Enforcement	17

7.3	Site Access	17

7.4	Waste Generation	18

7.5	Hazard Evaluation	18

7.6	Personal Protection 	18

7.7	Physical Hazards 	18

7.8	Fire	18

7.9	Mechanical, Electrical, Noise Hazards 	18

7.10	Medical Support	19

7.11	Environmental Surveillance	19

7.12	Safe Work Practices	19

7.13	Complaints 	19

REFERENCES	20

iv


-------
EXECUTIVE SUMMARY

EPA created the Environmental Technology Verification (ETV) Program to facilitate the deployment
of innovative technologies through performance verification and information dissemination. The goal of the
ETV Program is to further environmental protection by substantially accelerating the acceptance and use of
improved and cost-effective technologies. The ETV Program is intended to assist and inform those involved
in the design, distribution, permitting, and purchase of environmental technologies. The verification study
described in this test plan will be conducted by the Advanced Monitoring Technology Center (AMT), one of
six Centers of the ETV program. The AMT Center is administered by the EPA's National Exposure Research
Laboratory. The Oak Ridge National Laboratory (ORNL) will serve as the verification organization for the
test.

In the verification test, a vendor of a commercially available portable technology capable of measuring lead in dust wipe samples will operate its x-ray fluorescence (XRF) instrument in a field setting. This will be the second round of testing for lead in dust wipe measurement technologies; in November 2001, four technologies were tested in Hartford, CT. The vendor will blindly analyze 160 dust wipe samples containing known amounts of lead, ranging in concentration from <2 to 1,500 µg/wipe. The experimental design is built around the relevant clearance levels identified in the Code of Federal Regulations: 40, 250, and 400 µg/ft². The samples will include newly prepared and archived wipes from the Environmental Lead Proficiency Analytical Testing Program (ELPAT). These samples have been or will be prepared from dust collected in households in North Carolina and Wisconsin. Newly prepared samples will also be acquired from the University of Cincinnati; these dust wipe samples will be prepared from National Institute of Standards and Technology (NIST) Standard Reference Materials (SRMs).

v


-------
ABBREVIATIONS AND ACRONYMS

AIHA

American Industrial Hygiene Association

AMT

Advanced Monitoring Technology Center, ETV

ASTM

American Society for Testing and Materials

CDC

Centers for Disease Control and Prevention

ELPAT

Environmental Lead Proficiency Analytical Testing program

EPA

U. S. Environmental Protection Agency

ESD-LV

Environmental Science Division-Las Vegas

ESH&Q

Environmental Safety, Health, and Quality

ETV

Environmental Technology Verification Program

ETVR

Environmental Technology Verification Report

fn

false negative result

fp

false positive result

HASP

Health and Safety Plan

ICP-AES

Inductively coupled plasma-atomic emission spectrometry

MTI

Monitoring Technologies International

NIST

National Institute of Standards & Technology

NLLAP

National Lead Laboratory Accreditation Program, U.S. EPA

OPPT

Office of Pollution Prevention and Toxics, U.S. EPA

ORNL

Oak Ridge National Laboratory

PPE

personal protective equipment

QA

quality assurance

QAPP

Quality Assurance Project Plan

QAS

ORNL Quality Assurance Specialist

QC

quality control

RSD

relative standard deviation

RTI

Research Triangle Institute

SRM

Standard Reference Material

UC

University of Cincinnati

XRF

x-ray fluorescence instrument

vi


-------
1 INTRODUCTION

This chapter discusses the purpose of the verification and the verification test plan, describes the
elements of the verification test plan, and provides an overview of the Environmental Technology
Verification (ETV) Program and the technology verification process.

1.1	Verification Objectives

The purpose of this verification test is to evaluate the performance of commercially available field
analytical technologies for analyzing dust wipe samples for lead. Specifically, this plan defines the following
elements of the verification test:

Roles and responsibilities of verification test participants;

Procedures governing verification test activities such as sample collection,
preparation, analysis, data collection, and interpretation;

Experimental design of the verification test;

Quality assurance (QA) and quality control (QC) procedures for conducting the
verification and for assessing the quality of the data generated from the verification;
and,

Health and safety requirements for performing the verification test.

1.2	What is the Environmental Technology Verification Program?

The U.S. Environmental Protection Agency (EPA) created the Environmental Technology
Verification Program (ETV) to facilitate the deployment of innovative or improved environmental
technologies through performance verification and dissemination of information. The goal of the ETV
Program is to further environmental protection by substantially accelerating the acceptance and use of
improved and cost-effective technologies. ETV seeks to achieve this goal by providing high-quality, peer-
reviewed data on technology performance to those involved in the design, distribution, financing, permitting,
purchase, and use of environmental technologies.

ETV works in partnership with recognized standards and testing organizations and stakeholder
groups consisting of regulators, buyers, and vendor organizations, with the full participation of individual
technology vendors. The program evaluates the performance of innovative technologies by developing
verification test plans that are responsive to the needs of stakeholders, conducting field or laboratory tests (as
appropriate), collecting and analyzing data, and preparing peer-reviewed reports. All evaluations are
conducted in accordance with rigorous quality assurance (QA) protocols to ensure that data of known and
adequate quality are generated and that the results are defensible.

ETV is a voluntary program that seeks to provide objective performance information to all of the
participants in the environmental marketplace and to assist them in making informed technology decisions.
ETV does not rank technologies or compare their performance, label or list technologies as acceptable or
unacceptable, seek to determine "best available technology," or approve or disapprove technologies. The
program does not evaluate technologies at the bench or pilot scale and does not conduct or support research.
Rather, it conducts and reports on testing designed to describe the performance of technologies under a range
of environmental conditions and matrices.

The program now operates six Centers covering a broad range of environmental areas. ETV began
with a 5-year pilot phase (1995-2000) to test a wide range of partner and procedural alternatives in various
pilot areas, as well as the true market demand for and response to such a program. In the Centers, EPA
utilizes the expertise of partner "verification organizations" to design efficient processes for conducting
performance tests of innovative technologies. These expert partners are both public and private organizations,
including federal laboratories, states, industry consortia, and private sector entities. Verification
organizations oversee and report verification activities based on testing and QA protocols developed with
input from all major stakeholder/customer groups associated with the technology area. The verification test
described in this plan will be administered by the Advanced Monitoring Technology (AMT) Center, with
Oak Ridge National Laboratory (ORNL) serving as the verification organization. (To learn more about ETV,
visit ETV's Web site at www.epa.gov/etv and ORNL's web site at www.ornl.gov/ctv). The AMT Center is
administered by EPA's National Exposure Research Laboratory (NERL).

1


-------
1.3 Technology Verification Process

The technology verification process is intended to serve as a template for conducting technology
verifications that will generate high quality data which can be used to verify technology performance. Four
key steps are inherent in the process:

Needs identification and technology selection;

Verification test planning and implementation;

Report preparation;

Information distribution.

1.3.1	Needs Identification and Technology Selection

The first step in the technology verification process is to determine the technology needs of the user community (typically state and Federal regulators and the regulated community). Each Center utilizes
stakeholder groups. Members of the stakeholder groups come from EPA, the Departments of Energy and
Defense, industry, and state regulatory agencies. The stakeholders are invited to identify technology needs
and to assist in finding technology vendors with commercially available technologies that meet the needs.
Once a technology need is established, a search is conducted to identify suitable technologies. The
technology search and identification process consists of reviewing responses to Commerce Business Daily
announcements, searches of industry and trade publications, attendance at related conferences, and leads
from technology vendors. The following criteria are used to determine whether a technology is a good
candidate for the verification:

Meets user needs

May be used in the field or in a mobile laboratory
Applicable to a variety of environmentally impacted sites

High potential for resolving problems for which current methods are unsatisfactory
Costs are competitive with current methods

Performance is better than current methods in areas such as data quality, sample preparation, or
analytical turnaround

Uses techniques that are easier and safer than current methods
Is commercially available and field-ready.

For this verification test of lead measurement technologies, ORNL has assembled a technical panel
of the nation's experts in this field. The technical panel includes representation from the U.S. Department of
Housing and Urban Development, the National Institute for Occupational Safety and Health, the National
Institute of Standards and Technology, Research Triangle Institute, the American Industrial Hygiene
Association, the Massachusetts Childhood Lead Poisoning and Prevention Program, and several EPA offices,
including the Office of Pollution Prevention and Toxics (OPPT).

1.3.2	Verification Planning and Implementation

After a vendor agrees to participate, EPA, the Verification Organization, and the vendor meet to discuss each participant's responsibilities in the verification process. In addition, the following issues are
addressed:

Site selection: identifying sites that will provide the appropriate physical or chemical environment, including contaminated media

Determining logistical and support requirements (for example, field equipment, power and water
sources, mobile laboratory, communications network)

Arranging analytical and sampling support

Preparing and implementing a verification test plan that addresses the experimental design, sampling
design, QA/QC, health and safety considerations, scheduling of field and laboratory operations, data
analysis procedures, and reporting requirements

2


-------
1.3.3	Report Preparation

Innovative technologies are evaluated independently and, when possible, against conventional technologies. The technologies being verified are operated by the vendors in the presence of independent observers. The observers are EPA staff, technical panel members, and staff from an independent third-party organization. The data generated during the verification test are used to evaluate the capabilities, limitations, and field applications of each technology. A data summary and detailed evaluation of each technology are published in an Environmental Technology Verification Report (ETVR). The original complete data set is available upon request.

An important component of the ETVR is the Verification Statement, a three- to five-page summary based on the performance data contained in the report. Verification Statements are issued by EPA and appear on the ETV Internet Web page. The Verification Statement is signed by representatives of EPA and ORNL.

1.3.4	Information Distribution

Producing the ETVR and the Verification Statement represents a first step in the ETV outreach
efforts. ETV gets involved in many activities to showcase the technologies that have gone through the
verification process. The Program is represented at many environmentally-related technical conferences and
exhibitions. ETV representatives also participate in panel sessions at major technical conferences. ETV
maintains a traveling exhibit that describes the program, displays the names of the companies that have had
technologies verified, and provides literature and reports.

ETV also takes advantage of the Web by making the ETVRs available for downloading to anyone interested. The ETVRs and the Verification Statements are available in Portable Document Format (.pdf) on the ETV Web site (http://www.epa.gov/etv).

1.4 Purpose of this Verification Test Plan

The purpose of the verification test plan is to describe the procedures that will be used to verify the performance goals of the technologies participating in this verification. This document incorporates the QA/QC elements needed to provide data of sufficient quality to reach a credible position regarding performance. This is not a method validation study, nor does it represent every environmental situation which may be appropriate for these technologies. It will, however, provide data of sufficient quality to support a judgment about the application of the technology under conditions similar to those normally encountered in the field.

This test plan was developed based on the first round of testing which occurred in November 2001 in
Hartford, CT. Four technologies were evaluated during that test.

2 VERIFICATION RESPONSIBILITIES AND COMMUNICATION

This section identifies the organizations involved in this verification test and describes the primary
responsibilities of each organization. It also describes the methods and frequency of communication that will
be used in coordinating the verification activities.

2.1 Verification Organization and Participants

Participants in this verification are listed in Table 2-1. The specific responsibilities of each verification participant are discussed in Section 2.3. This verification test is being coordinated by the Oak Ridge National Laboratory (ORNL) under the direction of the U.S. Environmental Protection Agency's (EPA) Office of Research and Development, National Exposure Research Laboratory, Environmental Sciences Division - Las Vegas, Nevada (ESD-LV). ESD-LV's role is to administer the verification program. ORNL's role is to provide technical and administrative leadership and support in conducting the verification.

3


-------
Table 2-1. Verification Participants in the Lead in Dust Field Analytical Technology Verification Test

Oak Ridge National Laboratory (role: verification organization)
P.O. Box 2008, Bethel Valley Road, Bldg. 4500S, MS-6120, Oak Ridge, TN 37831-6120
Project Manager: Roger Jenkins; phone: (865) 576-8594; fax: (865) 576-7956; jenkinsra@ornl.gov
Technical Lead: Amy Dindal; phone: (865) 574-4863; fax: (865) 576-7956; dindalab@ornl.gov

U.S. EPA, National Exposure Research Laboratory, Environmental Science Division (role: EPA project management)
P.O. Box 93478, Las Vegas, NV 89193-3478
Project Officer: Eric Koglin; phone: (702) 798-2432; fax: (702) 798-2291; koglin.eric@epa.gov

U.S. DOE, ORNL Site Office (role: DOE/ORO project management)
P.O. Box 2008, Bldg. 4500N, MS-6269, Oak Ridge, TN 37831-6269
Program Coordinator: Regina Chung; phone: (865) 576-9902; fax: (865) 574-9275; chungr@ornl.gov

KeyMaster Technologies (role: technology vendor)
415 N. Quay, Kennewick, WA 99336
Contact: Therese Howe; phone: (509) 783-9850; fax: (509) 735-9696; thowe@keymaster-tech.com

DataChem (role: NLLAP-recognized laboratory)
4388 Glendale-Milford Road, Cincinnati, Ohio 45242
Contact: Chris Gibson; phone: (513) 733-5336, x304; fax: (513) 733-5347

2.2 Organization

Figure 2-1 presents an organizational chart depicting the lines of communication for the verification.

4


-------
Figure 2-1. Organizational Chart for the verification test.

2.3 Responsibilities

The following is a delineation of each participant's responsibilities for the verification test. In this
section, the term "vendor" applies to KeyMaster Technologies.

The Vendor, in consultation with ORNL and EPA, is responsible for the following elements of this
verification test:

Contribute to the design and preparation of the verification test plan;

Provide detailed procedures for using the technology;

Prepare field-ready technology for verification;

Operate the technology during the verification test;

Document the methodology and operation of the technology during the verification;

Furnish data in a format that can be compared to laboratory values;

Provide logistical and other support, as required.

ORNL has responsibilities for:

Preparing the verification test plan;

Developing a quality assurance project plan (QAPP) (Section 6 of the verification
test plan);

Preparing a health and safety plan (HASP) (Section 7 of the verification test plan)
for the verification activities;

Acquiring the necessary laboratory analysis data;

Performing sample preparation activities (including purchasing, labeling, and
distributing).

ORNL and EPA have coordination and oversight responsibilities for:

5


-------
Providing needed logistical support, establishing a communication network, and
scheduling and coordinating the activities of all verification participants, including
the technical panel;

Auditing the on-site sampling activities;

Managing, evaluating, interpreting, and reporting on data generated by the
verification;

Evaluating and reporting on the performance of the technologies;

Other logistical information and support needed to coordinate access to the site for
the field portion of the verification, such as waste disposal.

3 TECHNOLOGY DESCRIPTIONS

This section provides a description of the technology participating in the verification test. The
description was provided by KeyMaster, with minimal editing by ORNL.

3.1 KeyMaster Technologies

3.1.1	General Description

X-ray fluorescence spectroscopy utilizes gamma or x-ray energy to produce
photons, which can be measured in an elemental analysis. When a sample is
exposed to a low energy gamma source, X-rays are produced with energies
that are characteristic of the elements in the target sample. The basis for X-
ray Fluorescence (XRF) technology is that each atomic element has its own
unique energy signature. XRF instruments measure each energy signature to
determine presence and concentration of various elements in a matrix based
upon this unique signature. Portable XRF instruments were first developed
for the mining industry to verify ore quality and are now used in a variety of
applications ranging from US EPA soil analysis to alloy sorting. In fundamental terms, the XRF instrument
has a gamma source, which emits a known energy and excites the atoms of a target element. The resulting
energy is called X-ray fluorescence, which can be detected or "read" by the detector in the XRF instrument.
Figure 3-1 is a picture of KeyMaster Technologies' MAP-5™ field portable XRF instrument.

3.1.2	Sample Preparation

Fold the wipe in fourths, according to Figure 3-2 below, so that the dust is folded to the inside of the
wipe. Then put the folded wipe in the red plastic holder. The dust wipe is now ready to test. Do not re-use the
XRF film in order to reduce contamination of subsequent wipes.

Figure 3-1. MAP-5 instrument

Figure 3-2. Steps for folding wipe into fourths.

6


-------
3.1.3 Sample Analysis

Turn on the MAP-5 at least 10 minutes prior to taking the first measurement. From the Setup menu, select "Test" precision. Place a dust wipe standard in the dust wipe fixture on the MAP-5, take five "Test" precision assays, and average the results. The average should be within the parameters provided by the manufacturer. After checking the MAP-5 calibration, begin testing. Take a measurement, positioning the red dust wipe holder carefully on the MAP-5's sample holder (Figure 3-3). This procedure assures that the entire area of the folded dust wipe is measured by the analyzer. The results are displayed after each test. When the measurement is complete, the MAP-5 automatically displays the results in µg/cm².

Figure 3-3. Sample holder
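The daily precision check described above reduces to a simple acceptance test. The short Python sketch below is illustrative only; it is not vendor software, and the acceptance limits shown are placeholders rather than KeyMaster specifications. It averages five precision assays of the dust wipe standard and checks that the average falls within the manufacturer's window.

```python
# Minimal sketch of the five-assay calibration check described above.
# The acceptance limits and readings are hypothetical placeholders.

def precision_check(assays_ug_cm2, low_limit, high_limit):
    """Return (passed, mean) for the five-assay precision check."""
    if len(assays_ug_cm2) != 5:
        raise ValueError("The check calls for exactly five precision assays.")
    mean_value = sum(assays_ug_cm2) / len(assays_ug_cm2)
    return low_limit <= mean_value <= high_limit, mean_value

# Example with hypothetical readings and limits:
ok, mean_value = precision_check([98.2, 101.5, 99.8, 100.9, 97.6], 95.0, 105.0)
print(f"mean = {mean_value:.1f} ug/cm2, calibration check {'passed' if ok else 'failed'}")
```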

4 VERIFICATION TEST DESIGN

This section discusses the objectives and design of the verification test, factors that must be
considered to meet the performance objectives, and the information that ORNL and EPA will use to evaluate
the results of the verification.

4.1	Drivers and Objectives of the Verification Test

The purpose of this test is to evaluate the performance of field analytical technologies that are
capable of analyzing dust wipe samples for lead contamination. This test will provide information on the
potential applicability of field technologies for clearance testing, as the experimental design is built around
the three clearance levels of 40 µg/ft² for floors, 250 µg/ft² for window sills, and 400 µg/ft² for window
troughs, which are outlined in a recent rule amendment to the Code of Federal Regulations [1].

The primary objectives of this verification are to evaluate the field analytical technologies in the
following areas: (1) how well each performs relative to a conventional, fixed-site, analytical method for the
analysis of dust wipe samples for lead; (2) how well each performs relative to results generated in previous
rounds of ELPAT testing; and (3) the logistical and economic resources necessary to operate the technology.
Secondary objectives for this verification are to evaluate the field analytical technology in terms of its
reliability, ruggedness, cost, range of usefulness, sample throughput, data quality, and ease of use. The
planning for this verification test follows the guidelines established in the data quality objectives process.

4.2	Summary of the Experimental Design

All of the samples analyzed in this verification test were prepared gravimetrically and are of known
quantity. All of the wipes utilized in this test (PaceWipe and Aramsco Lead Wipe) meet the American Society
for Testing and Materials specifications [2]. Initial consideration was given to conducting the test in a
real-world situation, where the technologies would have been deployed in a housing unit that had been
evacuated due to high levels of lead contamination. In addition to the safety concern of subjecting
participants to lead exposure, the spatial variability of adjacent samples would have been much larger than
the expected variability of this type of technology, making it difficult to separate instrument/method
variability from sampling variability. The availability of well-characterized samples derived from
"real-world" situations made the use of proficiency testing samples (so-called "ELPAT" samples) and other
prepared samples an attractive alternative.

4.2.1 ELPA T and Blank Sample Description

In 1992, the American Industrial Hygiene Association (AIHA) established the Environmental Lead
Proficiency Analytical Testing (ELPAT) program. The ELPAT Program is a cooperative effort of the
American Industrial Hygiene Association (AIHA), and researchers at the Centers for Disease Control and
Prevention (CDC), National Institute for Occupational Safety and Health (NIOSH), and the EPA Office of
Pollution Prevention and Toxics (OPPT). Participation and proficiency in ELPAT are AIHA requirements
for laboratories who wish to seek accreditation and recognition by EPA's National Lead Laboratory
Accreditation Program (NLLAP). The ELPAT program is designed to assist laboratories in improving their

7


-------
analytical performance, and therefore does not specify use of a particular analytical method. Participating
laboratories are blindly sent samples to analyze on a quarterly basis. The reported values must fall within a
range of acceptable values in order for the laboratory to be deemed proficient for that quarter.

Research Triangle Institute (RTI) in Research Triangle Park, NC, is contracted to prepare and
distribute the lead-containing paint, soil, and dust wipe ELPAT samples. For the rounds of testing which
have occurred since 1992, archived samples are available for purchase. These are the samples that will be
used in this verification test. Because the samples have already been tested by hundreds of laboratories, a
certified concentration value is supplied with the sample. This certified value represents a pooled
measurement of all of the results submitted, with outliers excluded from the calculation.

The following description, taken from an internal RTI report, briefly outlines how the samples are
prepared. RTI developed a repository of real-world housedust, collected from multiple homes in the
Raleigh/Durham/Chapel Hill area, as well as from an intervention project in Wisconsin. After collection, the
dust was sterilized by gamma irradiation and sieved to 150 µm. A PaceWipe™ was prepared for receiving
the dust by opening the foil pouch, removing the wet folded wipe and squeezing the excess moisture out by
hand over a trash can. The wipe was then unfolded and briefly set on a Kimwipe to soak up excess moisture.
The PaceWipe™ was then transferred to a flat plastic board to await the dust. The weighing paper
containing the pre-weighed dust was then removed from the balance, and 0.1000 ± 0.0005 g portions of dust
were gently tapped out onto the PaceWipe™. The wipe was then folded and placed in a plastic vial, which
was then capped. All vials containing the spiked wipes were stored in a cold room as a secondary means of
retarding mold growth until shipment.

Before use in the ELPAT program, RTI performs a series of analyses to confirm that the samples
were prepared within the quality guidelines established for the program. Ten samples are analyzed by RTI
and nine samples are sent off-site to an independent laboratory for confirmatory analysis. The relative
standard deviation of the 10 samples analyzed by RTI must be 10% or less, indicating that the samples were
prepared in a homogeneous fashion. The measured concentrations must be within 20% of the target value
that RTI intended to prepare. Additionally, the off-site results must be within ± 20% of the RTI results
in order for the samples to be acceptable.
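For illustration, the three acceptance criteria above can be expressed as a short calculation. The Python sketch below is a hypothetical illustration, not an RTI procedure; the function name and the example values are assumptions used only to show the 10% RSD, ±20%-of-target, and ±20% off-site agreement checks.

```python
# Illustrative sketch of the ELPAT batch acceptance checks described above.
from statistics import mean, stdev

def elpat_batch_acceptable(rti_results, offsite_results, target_ug_per_wipe):
    rti_mean = mean(rti_results)
    rsd_ok = (stdev(rti_results) / rti_mean) * 100 <= 10          # homogeneity: RSD <= 10%
    target_ok = abs(rti_mean - target_ug_per_wipe) / target_ug_per_wipe <= 0.20
    offsite_ok = abs(mean(offsite_results) - rti_mean) / rti_mean <= 0.20
    return rsd_ok and target_ok and offsite_ok

# Hypothetical example: ten RTI results, nine off-site results, 250 ug/wipe target.
print(elpat_batch_acceptable([248, 255, 244, 251, 260, 249, 253, 247, 256, 250],
                             [242, 258, 251, 246, 254, 249, 257, 244, 252], 250))
```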

RTI prepared the blank samples using the same preparation method as the ELPAT samples, but the
concentration of lead in the dust on the wipe will be below the reporting limits of the participants (< 2 µg/wipe).

4.2.2 University of Cincinnati Sample Description

As described above, the ELPAT samples consist of dust mounded in the center of a Pacewipe. The
University of Cincinnati (UC) prepares "field QC samples" where the dust is spread over the wipe, similar to
how a wipe would look when a dust wipe sample is collected in the field. The sample is prepared
gravimetrically, so the concentrations can be estimated. In a typical scenario, these control samples would be
sent to a laboratory along with actual field-collected samples as a quality check of the laboratory operations.
Because the samples are visually indistinguishable from the actual field sample and are prepared on the same
wipe and are shipped in the same packaging, the laboratory blindly analyzes the control samples, providing
the user with an independent assessment of the quality of the laboratory's data.

Clusters of twenty UC-prepared samples at each of the key clearance levels were added to the experimental
design to augment the robustness of the test. The UC wipe samples will be prepared from National Institute of
Standards & Technology (NIST) Standard Reference Materials (SRMs). The UC samples were prepared
on Aramsco Lead Wipes™ (Lakeland, FL).1 To document the variability of the preparation process, UC
analyzed approximately 5% of the total number ordered. The results indicated that the samples were prepared
accurately, relative to the target concentration. Additionally, randomly-selected samples were analyzed by an
independent organization (EPA Region 1) as a quality control check of the accuracy and precision of UC's
sample preparation procedure.

1 PaceWipes would have been used to prepare the UC samples, but the PaceWipes were unavailable at the
time of the study due to a problem with the manufacturer.

8


-------
4.2.3 Distribution and Number of Samples

Figure 4-1 shows the distribution of the sample concentrations to be analyzed in this
study. A total of 160 samples will be analyzed in the verification test. For the ELPAT samples, four samples
will be analyzed at each of 20 test levels (20 test levels x 4 samples each = 80 samples total). While each set
of four samples has been or will be prepared using homogeneous source materials and an identical preparation
procedure, they cannot be considered true "replicates" because each sample is prepared individually.
However, these samples represent four samples prepared similarly at a specified target concentration.
Twenty samples were prepared near each of the three clearance levels (3 test levels x 20 samples = 60
samples total) by the University of Cincinnati. Twenty blanks, prepared by Research Triangle Institute at lead
concentrations < 2 µg, will also be analyzed. In Figure 4-1, the clearance levels are denoted as horizontal
lines.
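The sample counts above can be checked with simple arithmetic. The short Python fragment below is purely illustrative and only restates the counts given in this paragraph (80 ELPAT samples, 60 UC samples, and 20 blanks, for 160 total).

```python
# Quick arithmetic check of the sample inventory described above.
elpat_samples = 20 * 4        # 20 ELPAT test levels, 4 similarly prepared samples each
uc_samples = 3 * 20           # 20 UC samples near each of the 3 clearance levels
blank_samples = 20            # RTI-prepared blanks at < 2 ug lead/wipe
total = elpat_samples + uc_samples + blank_samples
assert total == 160           # matches the 160 samples to be analyzed in the test
print(elpat_samples, uc_samples, blank_samples, total)   # 80 60 20 160
```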

Figure 4-1. Distribution of sample concentrations to be analyzed in the verification test.
-------
4.3 Comparison of Field Technology Results to an NLLAP-Recognized Laboratory's Results

Current EPA regulations for clearance testing of facilities that have been abated for lead
contamination stipulate that the laboratory performing the testing must be recognized under NLLAP [3].
Currently, only fixed-site analytical laboratories are recognized under NLLAP. Mobile laboratories and
testing firms using field portable equipment may be recognized under NLLAP, although no such laboratories or
firms are on the current NLLAP list. In order to assess whether the field portable technologies participating
in this verification test produce results that are comparable to NLLAP-recognized data, an NLLAP-recognized
laboratory was selected to analyze samples concurrently with the field testing.

4.3.1	Laboratory Selection

NLLAP was established by the EPA OPPT under the legislative directive of Title X, the Lead-Based
Paint Hazard Reduction Act of 1992. In order for laboratories to be recognized under the NLLAP they must
successfully participate in the ELPAT Program and undergo a systems audit.

The acceptable range for the ELPAT test samples is based upon consensus values from participating
laboratories. A laboratory's performance for each matrix is rated as proficient if its ELPAT results are
within three standard deviations of the determined acceptable range for 75 percent of the ELPAT test
samples.

The NLLAP required systems audit must include an on-site evaluation by a private or public
laboratory accreditation organization recognized by NLLAP. Some of the areas evaluated in the systems
audit include laboratory personnel qualifications and training, analytical instrumentation, analytical methods,
quality assurance procedures and record keeping procedures.

The list of recognized laboratories is updated monthly. ORNL obtained the list of accredited
laboratories in July 2001. The list consisted of approximately 130 laboratories. Those laboratories which did
not accept commercial samples and those located on the west coast were automatically eliminated as
potential candidates. ORNL interviewed approximately ten randomly selected laboratories and solicited information
regarding cost, typical turnaround time, and data packaging. Based on these interviews and discussions with
technical panel members who had personal experience with the potential laboratories, ORNL selected
DataChem (Cincinnati, OH) as the fixed-site laboratory. As a final qualifying step, DataChem analyzed 16
samples (8 ELPAT and 8 prepared by UC) in a pre-test study, which demonstrated that the laboratory was
proficient in analyzing these types of samples.

4.3.2	Description of Method

The laboratory method used in this study was hot plate/nitric acid digestion, followed by inductively
coupled plasma-atomic emission spectrometry (ICP-AES) analysis. The preparation and analytical
procedures, as supplied by DataChem, can be found in Appendix A. DataChem's procedures are modifications
of Methods 3050B and 6010B of the EPA SW-846 method compendium for the preparation and analysis of
metals in environmental matrices [4,5]. Other specific references for the preparation and analysis of dust
wipes are available from the American Society for Testing and Materials (ASTM) [6].

5 EXECUTION OF THE VERIFICATION TEST

5.1 Summary of Verification Activities

This verification test will be conducted at ORNL in an office environment from January 7 through
11, 2002. The vendor, who will operate its own equipment, must analyze all 160 samples on-site and
submit results prior to departure in order to complete the verification test. The samples evaluated during the
verification will consist of (1) archived ELPAT samples prepared from housedust collected from multiple
homes in North Carolina and Wisconsin, ranging in concentration from 15 to 1,500 µg/wipe; (2) UC-prepared
samples from NIST SRMs on Aramsco Lead Wipes, at the three clearance levels of 40, 250, and
400 µg/wipe; and (3) low-level samples called "detectable blanks", with concentrations (< 2 µg lead/wipe)
below typical detection levels for field technologies, prepared by RTI using the same procedure as the
ELPAT samples.

10


-------
5.2	Sample Distribution

ORNL will be responsible for sample distribution. The samples will be packaged in 20-mL plastic
scintillation vials and labeled with a sample identifier. The vendor will receive the suite of samples in a
randomized order. All samples will be prepared for distribution at the start of the verification. The vendor
will go to a sample distribution table to pick up the samples. The samples will be distributed in batches of 16.
Completion of chains-of-custody forms will document sample transfer.

5.3	Submission of Results

The vendor will provide the results to ORNL. The vendor will be responsible for reducing the raw
data into a presentation format consistent with the evaluation requirements. At the end of the verification
test, the vendor will submit all final results and raw data to ORNL. After the conclusion of the field activities,
the vendor will have one week to review their data and make revisions to their results. These revisions will
not involve re-analysis of any sample. The revisions will be limited to correcting for calculation and
transcription errors.

5.4	Verification Performance Factors

The following are the logistical and technical performance verification factors that will be verified
for each technology.

Accuracy: closeness of the technology's result to an estimated known value (i.e., the ELPAT
certificate value);

Precision: reproducibility of the technology's results for a set of four samples prepared at a
specific concentration level;

Comparability: performance relative to the NLLAP-recognized laboratory;

Detectable blanks: number of samples where lead is reported above reporting limits for
samples which are prepared at low levels (< 2 µg/wipe);

Probability of false positive results: relative to all three clearance levels of 40, 250, and 400
µg/ft². For example, the number of samples where the field technology reports a result as > 40
µg and the concentration is actually less than 40 µg;

Probability of false negative results: relative to all three clearance levels of 40, 250, and 400
µg/ft². For example, the number of samples where the field technology reports a result as < 40
µg and the concentration is actually > 40 µg;

Sample throughput: number of samples/hour/number of analysts;

Ease of use: user friendliness of the technology; amount of training required to operate
independently.

These factors and the anticipated statistical analyses are further discussed in Section 6.

6 QUALITY ASSURANCE PROJECT PLAN (QAPP)

The QAPP for this verification test specifies procedures that will be used to ensure data quality and
integrity. Careful adherence to these procedures will ensure that data generated from the verification will
meet the desired performance objectives and will provide sound analytical results.

6.1	Purpose and Scope

The primary purpose of this section is to outline steps that will be taken to ensure that data resulting
from this verification are of known quality and that a sufficient number of critical measurements are taken.

This section is written in compliance with ORNL's ETV Quality Management Plan [7].

6.2	Quality Assurance Responsibilities

The implementation of the verification test plan must be consistent with the requirements of the
study and routine operation of the technology. The ORNL technical lead is responsible for coordinating the
preparation of the QAPP for this verification and for its approval by EPA and ORNL. The ORNL project
manager will ensure that the QAPP is implemented during all verification activities. ORNL's QA specialist
(QAS) will review and approve the QAPP and will provide QA oversight of the verification activities. The

11


-------
ORNL technical lead will be responsible for the laboratory data validation. The ORNL statistician will
primarily be responsible for the reduction of the vendor and laboratory data. The EPA project manager and
EPA QA manager will review and approve this plan.

6.3	Field Operations

6.3.1	Site Training

Preliminary site training will be provided to the vendor on the first day of testing. This will be
required before initiation of the field study. This training will be conducted by the ORNL project manager or
his designee. It will entail an overview of the test site, safety information, emergency procedures, and
logistical information regarding the verification test.

6.3.2	Communication and Documentation

Successful field operations require detailed planning and extensive communication. ORNL will
communicate regularly with the verification participants to coordinate all field activities associated with this
verification and to resolve any logistical, technical, or QA issues that may arise as the verification progresses.
Pertinent vendor and ORNL field activities will be thoroughly documented. Field documentation will
include field logbooks, photographs, field data sheets, and chain-of-custody forms.

The ORNL technical lead will be responsible for maintaining all field documentation. Field notes
will be kept in a bound logbook. Each page will be sequentially numbered and labeled with the project name
and number. Completed pages will be signed and dated by the individual responsible for the entries. Errors
will have one line drawn through them, and this line will be initialed and dated. Any deviations from the
approved final verification test plan will be thoroughly documented in the field logbook and provided to
ORNL. Photographs will be taken with a digital camera.

6.4	Performance and System Audits

The following audits will be performed during this verification.

6.4.1	Technical Systems Audit

The ORNL QAS will perform an on-site surveillance during the test and prepare a report on her
findings.

6.4.2	Data quality audit of the laboratory

One of the requirements to become an NLLAP-recognized laboratory is routine quality audits. ORNL
audited the laboratory during the analyses of the samples and found that the lab was proficient in following
its procedures.

6.4.3	Surveillance of Technology Performance

During verification testing, ORNL staff will observe the operation of the field technology, such as
observing the vendor operations, photo-documenting the test site activities, surveying calibration procedures,
and reviewing sample data. The observations will be documented in a laboratory notebook. The verification
report will contain the exact protocols used by the vendors during testing.

6.5	Quality Assurance Reports

QA reports provide the necessary information to monitor data quality effectively. It is anticipated that
the following types of QA reports will be prepared as part of this verification.

6.5.1 QC Reports of Sample Preparation

As described in Sections 4.2.1 and 4.2.2, both RTI and UC analyze a portion of the prepared samples
to confirm the accuracy and precision of the sample preparation. These data have been made available to
ORNL. Additionally, ORNL distributed 5% of the UC samples to an independent laboratory (EPA Region 1)
for confirmation analyses. The concentrations of the samples prepared by RTI have already been through
independent confirmation through the ELPAT proficiency testing process.

12


-------
6.5.2	QAS Surveillance Report

The QAS will prepare a comprehensive report of the verification activities, including those
performed in Hartford, CT, and in the second round of testing at ORNL.

6.5.3	Status Reports

ORNL will regularly inform the EPA project manager of the status of the verification. Project
progress, problems and associated corrective actions, and future scheduled activities associated with the
verification test will be discussed. When problems occur, the vendor and ORNL will discuss them, estimate
the type and degree of impact, describe the corrective actions taken to mitigate the impact and to prevent a
recurrence of the problems, and discuss with EPA, as necessary. Major problems will be documented in the
field logbook.

6.5.4	Audit Reports

Any additional QA audits or inspections, such as those conducted by interested visitors, that take
place while the verification test is being conducted will be formally reported by the auditors to the ORNL
technical lead, who will forward them to the EPA project manager. Informal audit results will be
reported immediately to EPA through a phone call, personal communication, or email.

6.6	Corrective Actions

Routine corrective action may result from common monitoring activities, such as:

Performance evaluation audits
Technical systems audits
Calibration procedures

If the problem identified is technical in nature, the individual vendors will be responsible for seeing that the
problem is resolved. If the issue is one that is identified by ORNL or EPA, the identifying party will be
responsible for seeing that the issue is properly resolved. All corrective actions will be documented. Any
occurrence that causes discrepancies from the verification test plan will be noted in the technology
verification report.

6.7	Laboratory Quality Control Checks

Internal quality control (QC) samples were analyzed by DataChem to indicate whether or not the
samples were analyzed properly. The QC samples included initial calibration, continuing calibration
verification, and analysis of known samples. These data were reviewed by ORNL as part of the data
validation process. No discrepancies were noted in the data validation records.

6.8	Data Management

The vendor, ORNL, and EPA each have distinct responsibilities for managing and analyzing
verification data. The vendor is responsible for obtaining, reducing, interpreting, validating, and reporting the
data associated with its technology's performance. These data should be reported on the chain-of-custody
form. Vendor results will be due to ORNL at the conclusion of a day's field activities. The vendor's final
report will be due to ORNL one week after the verification. Any discrepancies between the originally
reported result and the final result must be described. ORNL is responsible for managing all the data and
information generated during the verification test. EPA and ORNL are responsible for analysis and
verification of the data.

6.9	Data Reporting, Validation, and Analysis

To maintain good data quality, specific procedures will be followed during data reduction, review,
and reporting. These procedures are detailed below.

13


-------
6.9.1	Data Reporting

Data reduction refers to the process of converting the raw results into a concentration which will be
used for evaluation of performance. The procedures to be used will be technology dependent, but the
following is required for data reporting:

The concentration unit will be µg of lead/wipe.

If no lead is detected, the concentration will be reported as less than the reporting limit of
the technology, with the reporting limit stated (e.g., < 20 µg/wipe). A result reported as "0"
will not be accepted.
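As a simple illustration of the reporting rule above (this helper is hypothetical and not part of the test plan), the following Python fragment reports results in µg of lead per wipe and converts non-detects to a "less than reporting limit" string rather than zero.

```python
# Illustrative sketch of the data reporting rule: report in ug of lead per wipe and
# report non-detects as "< reporting limit", never as "0".
def format_result(measured_ug_per_wipe, reporting_limit_ug_per_wipe):
    if measured_ug_per_wipe < reporting_limit_ug_per_wipe:
        return f"< {reporting_limit_ug_per_wipe:g} ug/wipe"   # non-detect
    return f"{measured_ug_per_wipe:g} ug/wipe"

print(format_result(0.0, 20))     # "< 20 ug/wipe"
print(format_result(185.4, 20))   # "185.4 ug/wipe"
```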

6.9.2	Data Validation

Validation determines the quality of the results relative to the end use of the data. ORNL was
responsible for validating the laboratory data. (Note that the vendor is responsible for validating its own data
prior to final submission.) Several aspects of the data (listed below) were reviewed. The findings of the
review are documented in the validation records.

6.9.2.1	Completeness of Laboratory Records

This qualitative review ensures that all of the samples that were sent to the laboratory were analyzed,
and that all of the applicable records and relevant results are included in the data package.

6.9.2.2	Holding Times

The dust wipe samples will not require refrigeration or other preservation techniques. The method
requirement is that the samples be prepared within 6 months of collection, which was met.

6.9.2.3	Correctness of Data

So as not to bias the assessment of the technology's performance, errors in the laboratory data will be
corrected as necessary. Corrections may be made to data that has transcription errors, calculation errors, and
interpretation errors. These changes will be made conservatively, and will be based on the guidelines
provided in the method used. The changes will be justified and documented in the validation records. No
changes were made to the laboratory data.

6.9.2.4	Correlation Between Samples within a Concentration Set

Normally, one would not know if a single sample result was "suspect" unless (a) the sample was a
spiked sample, where the concentration is known, or (b) a result was reported and flagged by the laboratory
as suspect for some obvious reason (e.g., no quantitative result was determined). The experimental design
implemented in this verification study provides an additional indication of abnormal data
through inspection of the set of four results for samples prepared at a specific concentration. Criteria have
been established to determine whether data are suspect. A data set will be considered suspect if the percent relative
standard deviation for a set of four similarly prepared samples is greater than 50%, because this
would indicate imprecision. These data would be flagged so as not to bias the assessment of the technology's
performance. Precision and accuracy evaluations may be made with and without these suspect values to
represent the best and worst case scenarios. If both the laboratory and the vendor(s) report erratic results, the
data may be discarded if it is suspected that the erratic results are due to a sample preparation error.
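The 50% RSD screen described above can be illustrated with a short calculation. The Python sketch below is an illustration only (the example values are hypothetical, not verification data); it flags a set of four similarly prepared samples whose percent relative standard deviation exceeds 50%.

```python
# Illustrative suspect-data screen: flag a set of four similarly prepared samples
# when the percent relative standard deviation exceeds 50%.
from statistics import mean, stdev

def is_suspect(four_results_ug_per_wipe, rsd_limit_percent=50.0):
    avg = mean(four_results_ug_per_wipe)
    rsd = (stdev(four_results_ug_per_wipe) / avg) * 100 if avg else float("inf")
    return rsd > rsd_limit_percent

print(is_suspect([38, 41, 44, 39]))     # False: tight agreement
print(is_suspect([38, 41, 120, 5]))     # True: erratic set would be flagged
```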

6.9.2.5	Evaluation of QC Results

QC samples were analyzed by the NLLAP-laboratory with every batch of samples to indicate
whether or not the samples were analyzed properly. Performance on these samples was reviewed and no
major findings were noted in the validation records.

6.9.2.6	Evaluation of Spiked Sample Data

Spiked samples are samples containing known concentrations of analyte(s). For this verification test,
all of the samples are considered spiked samples.

14


-------
6.9.3 Data Analysis for Verification Factors

This section contains a list of the six primary performance verification factors to be evaluated for
both the field technology and the NLLAP-recognized laboratory.

6.9.3.1 Precision

Precision, in general, refers to the degree of mutual agreement among measurements of the same
materials and contaminants. Environmental applications often involve situations where "measurements of the
same materials" can take on a number of interpretations. In environmental applications, precision is often
best specified as a percentage of contaminant concentration. The following lists several possible
interpretations of precision for environmental applications.

1)	The precision involved in repeated measurements of the same sample without adjusting the
test equipment.

2)	The precision involved in repeated measurements of the same sample after reset,
repositioning, or re-calibration of the test equipment or when using different equipment of
the same technology.

3)	The precision of measurements due to spatial variability of dust samples from adjacent
locations.

4)	The precision characteristics of a specific technology in determining contamination at a
specific site or at an arbitrary site.

In general, users of the technology will want to be assured that measurement variability in 1) and 2) is small.
Measurement variability due to spatial variability described in 3) is likely to be site specific and is minimized
in this verification by using samples prepared under homogeneous conditions. The measurement variability
discussed in 4) is perhaps of most interest because it includes variability resulting from possible differences
in the design activities, from environmental conditions such as temperature that vary from one site
characterization to another, and from site- and technology-specific sources.

The strength of this verification's experimental design is that since an equal number of similar
samples will be selected from a homogeneous population at every concentration level, an equal number of
precision comparisons can be made.

Precision for this verification will be estimated by the variance, or standard deviation, from the
measured data. If n lead concentration measurements are represented by Y1, Y2, ..., Yn, the estimated
variance about their average value Ȳ is calculated by:

S² = [ Σ (Yk - Ȳ)² ] / (n - 1), where the sum is taken over k = 1, ..., n.

The standard deviation is the square root of S² and will be analyzed to see if the precision values are a
function of lead concentration levels. The estimated S² values will also be compared by F-tests to those
values reported on the ELPAT certificate and by UC. To express the reproducibility relative to the average
lead concentration, percent relative standard deviation (RSD) is used to quantify precision, according to the
following equation:

RSD = (standard deviation / average concentration) x 100%

Standard deviations estimated at each concentration level can be used to establish the relationship between
the uncertainty and the average lead concentration.
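For illustration, the precision statistics defined above can be computed directly from a set of four results. The Python fragment below uses hypothetical values, not verification data; it computes S², the standard deviation, and the percent RSD for one target concentration level.

```python
# Illustrative precision calculation for one set of four similarly prepared samples.
# The measurement values below are hypothetical, not verification data.
from statistics import mean, variance, stdev

y = [37.5, 41.2, 39.8, 43.1]     # lead results, ug/wipe, for one target level
s2 = variance(y)                 # S^2 = sum((Yk - Ybar)^2) / (n - 1)
s = stdev(y)                     # standard deviation, square root of S^2
rsd = (s / mean(y)) * 100        # percent relative standard deviation
print(f"S^2 = {s2:.2f}, S = {s:.2f}, RSD = {rsd:.1f}%")
```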

15


-------
6.9.3.2	Accuracy

Accuracy is a measure of how close the measured lead concentrations are to estimated values of the
true concentration. The estimated values for the ELPAT samples will be the certificate values that are
reported on the certificate of analysis sheet (see Appendix B for an example sheet). The ELPAT certificate
values represent an average concentration determined by the hundreds of laboratories that participated in
previous rounds of ELPAT testing. The UC concentration values will be reported by UC for individual
samples, calculated from the amount of NIST-traceable material loaded on the dust wipes. The accuracy and
precision of the UC values will be assessed by three independent laboratories analyzing randomly
selected QC samples. Each of the three labs will analyze 5% of the total number of samples prepared by UC
at each of the three concentration levels and confirm that the process used to prepare the samples was in
control.

Accuracy of the vendor measurements will be statistically tested using t-tests or non-parametric tests
and will also be quantified by computing the percent recovery for four similar samples or a single sample
using the equation:

percent recovery = [measured amount(s)/estimated value] x 100%

The optimum percent recovery value is 100%. Percent recovery values greater than 100% indicate results
that are biased high, and values less than 100% indicate results that are biased low.
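The percent recovery calculation, together with a one-sample t-test of the kind mentioned above, is sketched
below for illustration; the measured results and the estimated (certificate) value are hypothetical.

    import numpy as np
    from scipy import stats

    measured = np.array([236.0, 248.5, 241.2, 252.3])   # hypothetical results, µg/wipe
    estimated_value = 250.0                              # hypothetical certificate value

    recovery = 100.0 * measured.mean() / estimated_value        # percent recovery
    t_stat, p_value = stats.ttest_1samp(measured, popmean=estimated_value)

    print(f"percent recovery = {recovery:.1f}%")
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")   # small p suggests a statistically significant bias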

Inaccuracies, or biases, are the result of systematic differences between measured and known
values. These biases may be due to a limited calibration range, systematic instrument errors, or errors in
standards preparation. Consequently, every effort will be made by ORNL, the technology vendors, and the
laboratory to identify specific sources of inaccuracy. The verification includes blanks, replicates, and spiked
samples that should provide substantiating evidence to support this partitioning of sources of bias when
results become available.

6.9.3.3	Detectable Blanks

Twenty samples in the study were prepared at <2 µg/wipe, below the anticipated reporting limits of
both the field technologies and the laboratory. Any reported lead for these samples will be considered a
"detectable blank".

6.9.3.4	False Positive/False Negative Results

A false positive (fp) result is one in which the technology detects lead in the sample above a
clearance level when the sample actually contains lead below the clearance level [8]. A false negative (fn)
result is one in which the technology indicates that lead concentrations are less than the clearance level when
the sample actually contains lead above the clearance level [8], For example, if the technology reports the
sample concentration to be 35 |ig/wipe, and the true concentration of the sample is 45 |ig/wipe, the
technology's result would be considered a fn. Accordingly, if the technology reports the result as 45 |ig/wipe
and the true concentration is 35 |ig/wipe, the technology's result would be a fp.

A primary objective for this verification test is to assess the performance of the technology at each of
the three clearance levels of 40, 250, and 400 µg/wipe, and estimate the probability of the field technology
reporting a fp or fn result. Measurement uncertainty (that is, method bias and variability) causes the
technology to report fp and fn results. Recall from the experimental design that 20 UC samples (at
concentrations +/- 10% of each clearance level) and 16 ELPAT samples (at concentrations +/- 25% of each
clearance level) will be analyzed. The data generated for these samples will be used to model the
technology's uncertainty. These uncertainties will be used in a normal probability distribution model to
calculate probabilities of fp and fn results. Additionally, the required number of samples for specified false
acceptance and false rejection rates on decisions about remediation of lead contamination will be examined.
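As a hedged illustration of the normal probability distribution model described above, the probabilities of fp and
fn results near the 40 µg/wipe clearance level could be estimated as follows; the bias and standard deviation are
hypothetical values, not modeled results.

    from scipy.stats import norm

    clearance = 40.0     # µg/wipe clearance level
    bias = 2.0           # hypothetical method bias, µg/wipe
    sigma = 4.0          # hypothetical method standard deviation, µg/wipe

    # Reported values are modeled as Normal(true concentration + bias, sigma).
    # False positive: sample truly 10% below clearance, but reported above it.
    true_below = 36.0
    p_fp = norm.sf(clearance, loc=true_below + bias, scale=sigma)

    # False negative: sample truly 10% above clearance, but reported below it.
    true_above = 44.0
    p_fn = norm.cdf(clearance, loc=true_above + bias, scale=sigma)

    print(f"P(false positive) = {p_fp:.3f}, P(false negative) = {p_fn:.3f}")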

6.9.3.5	Comparability

Comparability refers to how well the field technology and the NLLAP-recognized laboratory data
agree. The difference between accuracy and comparability is that accuracy is judged relative to a known

16


-------
value, whereas comparability is judged relative to the results of a laboratory procedure, which may or may not
report the results accurately. Averages from similar samples measured by the technology will be compared
with the corresponding averages measured by the laboratory at all target concentration levels.

A correlation coefficient quantifies the linear relationship between two measurements [9]. The
correlation coefficient is denoted by the letter r; its value ranges from -1 to +1, where 0 indicates the absence
of any linear relationship. The value r = -1 indicates a perfect negative linear relation (one measurement
decreases as the second measurement increases); the value r = +1 indicates a perfect positive linear relation
(one measurement increases as the second measurement increases). The slope of the linear regression line,
denoted by the letter m, is related to r. Whereas r represents the linear association between the vendor and
laboratory concentrations, m quantifies the amount of change in the vendor's measurements relative to the
laboratory's measurements. A value of +1 for the slope indicates perfect agreement. Values greater than 1
indicate that the vendor results are generally higher than the laboratory, while values less than 1 indicate that
the vendor results are usually lower than the laboratory.
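For illustration, r and the regression slope m could be computed from paired vendor and laboratory results as in
the sketch below; the paired values are hypothetical.

    import numpy as np
    from scipy import stats

    # Hypothetical paired results, µg/wipe
    lab = np.array([38.0, 245.0, 410.0, 42.0, 255.0, 395.0])
    vendor = np.array([35.5, 238.0, 430.0, 44.0, 262.0, 402.0])

    r = np.corrcoef(lab, vendor)[0, 1]        # correlation coefficient
    m, b, *_ = stats.linregress(lab, vendor)  # slope m and intercept b

    print(f"r = {r:.3f}, slope m = {m:.3f}, intercept = {b:.1f}")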

6.9.3.6 Completeness

Completeness refers to the amount of data collected from a measurement process expressed as a
percentage of the data that would be obtained using an ideal process under ideal conditions. The
completeness objective for data generated during this verification is 95% or better.
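As a simple illustration of this objective (with hypothetical counts):

    planned_analyses = 264     # hypothetical number of planned sample analyses
    valid_results = 258        # hypothetical number of valid results obtained

    completeness = 100.0 * valid_results / planned_analyses
    print(f"completeness = {completeness:.1f}% (objective: 95% or better)")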

There are many instances which might cause the sample analysis to be incomplete. Some of these are:

•	Instrument failure;

•	Calibration requirements not being met;

•	Elevated analyte levels in the method blank.

7 HEALTH AND SAFETY PLAN

This section describes the specific health and safety procedures that will be used during the field
work at Oak Ridge National Laboratory, in Oak Ridge, TN.

7.1	Contact Information

The ORNL project manager will be Roger Jenkins, (865) 574-4871.

The ORNL technical lead will be Amy Dindal, (865) 574-4863.

The ORNL project statistician will be Chuck Bayne, (865) 574-3134.

The ES&H Coordinator will be Fred Smith, (865) 574-4945.

The ORNL Quality Assurance Specialist (QAS) will be Janet Wagner, (865) 576-8335.
The Environmental Protection Officer (EPO) will be Kim Jeskie, (865) 574-4947.

In case of emergency, dial 9-1-1.

7.2	Health and Safety Plan Enforcement

The ORNL project manager, the ORNL technical lead, the ES&H Coordinator, and the EPO developed this
health and safety plan. The ORNL project manager will ultimately be responsible for ensuring that all
verification participants understand and abide by the requirements of this HASP. The ORNL technical lead
will oversee and direct field activities and is also responsible for ensuring compliance with this HASP.

7.3	Site Access

Site training will be provided to the vendor prior to testing. The training will include a review of this
health and safety plan. Because the test will be conducted on a community college campus, there will be
public access to the facility. Visitors will follow standard safety and health practices (e.g., wearing safety
glasses, as necessary).

17


-------
7.4	Waste Generation

The EPO will be responsible for ensuring that the chemical waste generated during the test is handled
properly. Because this is an x-ray fluorescence technology which does not require the use of chemicals, the
only expected waste to be generated is the used dust wipe samples.

7.5	Hazard Evaluation

The technology vendors must provide their own personal protective equipment (PPE), based on the
hazards associated with the operation of their technology. Although unlikely to be necessary, visitors will be
provided with PPE if warranted. The hazard information provided below was gathered from the ORNL
Material Safety Data Sheet (MSDS) web page and serves as a general guideline for the hazards likely to be
encountered during this field test.

Lead will be the most prevalent chemical hazard at the verification test. Exposure to lead can cause
eye, skin, and gastrointestinal irritation. If inhaled, it may cause a respiratory tract irritation. The highest
concentration of lead in the dust samples will be 1,500 µg, and most of the sample concentrations will be
well below that level.

7.6	Personal Protection

PPE is used to protect against known and potential health hazards encountered during routine
operation of the technology systems. For this verification, Level D PPE is required. Level D provides
minimal protection against chemical hazards and will be supplied by the individual technology vendor. It
consists only of a work uniform, with gloves worn where necessary. The only requirement for this
verification test is appropriate work clothes, with no shorts or open-toed shoes. ORNL will provide visitors
with PPE if necessary. If site conditions indicate that additional hazards are present, ORNL may recommend
different or additional PPE to the vendors.

7.7	Physical Hazards

Physical hazards associated with field activities present a potential threat to on-site personnel.
Dangers are posed by unseen obstacles, noise, and poor illumination. Injuries may result from the following:

•	Accidents due to slipping, tripping, or falling

•	Improper lifting techniques

•	Moving or rotating equipment

•	Improperly maintained equipment

Injuries resulting from physical hazards can be avoided by adopting safe work practices and by using caution
when working with machinery.

7.8	Fire

The following specific actions will be taken to reduce the potential for fire during site activities:

•	No smoking in the building.

•	Fire extinguishers will be maintained on-site.

•	All personnel will be trained on the location and operation of the portable fire
extinguishers.

•	All personnel will be trained on the location of the phones and the number to call the fire
department.

7.9	Mechanical, Electrical, Noise Hazards

Some technology-specific hazards may be identified once the vendors set up their equipment. Proper
hazard controls (e.g., guarding or markings) or PPE (e.g., ear plugs for noise hazards) will be implemented as
necessary.

Electrical cables represent a potential tripping hazard. When practical, cables will be placed in
areas of low pedestrian travel. If necessary, in high pedestrian travel areas, covers will be installed over
cables.

18


-------
7.10	Medical Support

A complete medical facility is located in Building 4500N. In case of a medical or fire emergency, call 9-1-1
or pull a red fire alarm pull box. A phone will be available for use at all times. (Note that cellular phones will
not work on most of the Oak Ridge Reservation.)

7.11	Environmental Surveillance

The ORNL project manager and ORNL technical lead will be responsible for surveying the site
before, during, and after the verification test. Appropriate personnel (e.g., ES&H Coordinator, EPO, etc.) will
be contacted to assist with any health or safety concerns.

7.12	Safe Work Practices

The vendor will provide the required training and equipment for their personnel to meet safe
operating practices and procedures. The individual technology vendor and their company are ultimately
responsible for the safety of their workers.

The following safe work practices will be implemented at the site for worker safety:

•	Eating, drinking, chewing tobacco, and smoking will be permitted only in designated areas;

•	Wash facilities will be utilized by all personnel before eating, drinking, or toilet facility use;

•	PPE requirements (see Section 7.6) will be followed.

7.13	Complaints

All complaints should be filed with the ORNL technical lead. All complaints will be treated on an
individual basis and investigated accordingly.

19


-------
REFERENCES

[1]	Code of Federal Regulations. 2001. "Identification of Dangerous Levels of Lead", Final Rule, 40
CFR 745.65. January.

[2]	American Society for Testing and Materials. 1996. "Specification E1792-96a: Standard
Specification for Wipe Sampling Materials for Lead in Surface Dust" in ASTM Standards on Lead
Hazards Associated with Buildings. West Conshohocken, PA.

[3]	Title X, the Lead-Based Paint Hazard Reduction Act of 1992.

[4]	EPA (U.S. Environmental Protection Agency). 1996. "Method 3050B-1: Acid Digestion of Sediment,
Sludge, and Soils." In Test Methods for Evaluating Solid Waste: Physical/Chemical Methods,
Update II. SW-846. U.S. Environmental Protection Agency, Washington, D.C., December.

[5]	EPA (U.S. Environmental Protection Agency). 1996. "Method 6010B-1: Inductively Coupled
Plasma-Atomic Emission Spectrometry." In Test Methods for Evaluating Solid Waste: Physical/
Chemical Methods, Update II. SW-846. U.S. Environmental Protection Agency, Washington, D.C.,
December.

[6]	American Society for Testing and Materials. 1998. "Practice E1644: Standard Practice for Hot Plate
Digestion of Dust Wipe Samples for the Determination of Lead" in ASTM Standards on Lead
Hazards Associated with Buildings. West Conshohocken, PA.

[7]	ORNL (Oak Ridge National Laboratory). 1998. Quality Management Plan for the Environmental
Technology Verification Program's Site Characterization and Monitoring Technologies Pilot.
QMP-X-98-CASD-001, Rev. 0. Oak Ridge National Laboratory, Oak Ridge, Tenn., November.

[8]	Keith, L.H., G. L. Patton, D.L. Lewis and P.G. Edwards. 1996. Chapter 1: Determining What Kinds
of Samples and How Many Samples to Analyze, pp. 19. In Principles of Environmental Sampling.
Second Edition. Edited by L. H. Keith, ACS Professional Reference Book, American Chemical
Society, Washington, DC.

[9]	Draper, N. R., and H. Smith. 1981. Applied Regression Analysis. 2nd ed. John Wiley & Sons, New
York.

20


-------
APPENDIX A

LABORATORY STANDARD OPERATING PROCEDURES

Supplied by: DataChem (Cincinnati, Ohio)


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001

Page: 	1 of	10.

STANDARD OPERATING PROCEDURE

FOR THE ACID DIGESTION OF SEDIMENT, SLUDGE, AND SOIL FOR
ANALYSIS BY AA OR ICP SPECTROSCOPY
BY EPA METHOD 3050B

1.0 SCOPE AND APPLICATION

1.1 The EPA method as written provides two separate digestion procedures, one for the
preparation of sediments, sludge, and soil samples for analysis by flame atomic
absorption spectrometry (FLAA) or inductively coupled plasma atomic emission
spectrometry (ICP-AES) and one for the preparation of sediments, sludge, and soil
samples for analysis by Graphite Furnace AA (GFAA) or inductively coupled plasma
mass spectrometry (ICP-MS). The extracts from these two procedures are not
interchangeable and should only be used with the analytical determinations outlined in
this section. Samples prepared by using this procedure may be analyzed by ICP-AES or
GFAA for all the listed metals as long as the detection limits are adequate for the required
end-use of the data. Alternative determinative techniques may be used if they are
scientifically valid and the QC criteria of the method, including those dealing with
interferences, can be achieved. Other elements and matrices may be analyzed by this
method if performance is demonstrated for the analytes of interest, in the matrices of
interest, at the concentration levels of interest. The recommended determinative
techniques for each element are listed below:

FLAA or ICP-AES:	Aluminum, Antimony, Barium, Beryllium, Cadmium, Calcium, Chromium, Cobalt,
Copper, Iron, Lead, Magnesium, Manganese, Molybdenum, Nickel, Potassium, Silver, Sodium, Thallium,
Vanadium, Zinc

GFAA or ICP-MS:	Arsenic, Beryllium, Cadmium, Chromium, Cobalt, Iron, Lead, Molybdenum, Selenium,
Thallium





1.2 This method is not a total digestion technique for most samples. It is a very strong acid
digestion that will dissolve almost all elements that could become "environmentally
available." By design, elements bound in silicate structures are not normally dissolved by
this procedure, as they are not usually mobile in the environment.

2.0 SUMMARY OF METHOD

2.1	For the digestion of samples, a representative sample is digested with repeated additions
of nitric acid (HNO3) and hydrogen peroxide (H2O2).


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001	

Page: 	2 of	10

2.2	For ICP-AES or FLAA analyses, hydrochloric acid (HCl) is added to the initial digestate
and the sample is refluxed. In an optional step to increase the solubility of some metals,
this digestate is filtered and the filter paper and residues are rinsed, first with hot HCl and
then hot reagent water. Filter paper and residue are returned to the digestion flask,
refluxed with additional HCl, and then filtered again. The digestate is then diluted to a
final volume of 100 mL.

2.3	If required, a separate sample aliquot shall be dried for a total percent solids
determination.

3.0 SAFETY PRECAUTIONS

3.1	The toxicity or carcinogenicity of each reagent used in this method has not been precisely
defined. However, each chemical compound should be treated as a potential health
hazard. From this viewpoint, exposure to these chemicals must be reduced to the lowest
possible level by whatever means available. The laboratory is responsible for maintaining
a current awareness file of OSHA regulations regarding the safe handling of the chemicals
specified in this method. A reference file of material data-handling sheets should also be
made available to all personnel involved in the chemical analysis.

3.2	Proper precautions such as the use of safety glasses and lab coats are mandatory when dealing with
these samples.

3.2.2 Additional protection given by gloves may also be indicated.

NOTE: Any gloves used must undergo prior testing to ensure that no method target compounds
can be leached from the gloves when contacted by acid in liquid or vapor form.

4.0 SAMPLE HANDLING AND PRESERVATION

4.1	All samples must have been collected using a sampling plan that addresses the
considerations discussed in Chapter Nine of "Test Methods for Evaluating Solid Waste
Physical/Chemical Methods," SW-846 current revision. DataChem Laboratories does not
participate in sample collection activities.

4.2	All glassware is washed with a non-phosphate detergent in hot water and rinsed with tap water.
The glassware is then soaked in a 1:1 nitric acid bath and rinsed with tap water. Finally, the
glassware is soaked in a 1:1 hydrochloric acid bath, rinsed with tap water and distilled water then
hung upside down to dry on a peg board. After air-drying, all glassware is stored in cabinets to
minimize contamination due to airborne particulate. Immediately prior to use, the glassware is
rinsed with deionized water.

4.3	Non-aqueous samples shall be maintained at 4°C ± 2°C from immediately after sampling until
just prior to digestion. Samples for this procedure have a holding time of 6 months after sampling.

4.4	Plastic or glass containers may be used to store the samples. In the determination of trace
metals, sample containers have the potential of introducing positive or negative errors in
the measurement by (a) contributing contaminants through leaching or surface desorption,
and (b) depleting analyte concentrations through adsorption. Consequently, the collection
and treatment of the samples prior to analysis requires particular attention. The following
cleaning treatment sequence has been determined to be adequate in minimizing
contamination in sample bottles, whether borosilicate glass, linear polyethylene,


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001	

Page: 	3 of	10

polypropylene, or Teflon: detergent, tap water, 2% nitric acid, tap water, and Type II
water.

Note: Chromic acid should not be used to clean glassware, especially if chromium
is one of the analytes. Commercial, no-chromate products (e.g., Nochromix)
may be used in place of chromic acid if a more rigorous cleaning procedure is
required. (Chromic acid should also not be used with plastic bottles.)

5.0 DETECTION LIMITS

5.1 Detection limits are discussed in the appropriate analytical method.

6.0 INTERFERENCES

6.1 Sludge samples can contain diverse matrix types, each of which may present its own
analytical challenge. Spiked samples and any relevant standard reference material should
be processed to aid in determining whether this method is applicable to a given waste.

7.0 APPARATUS

7.1	Digestion Vessels - 250 mL.

7.2	Watch glasses.

7.3	Drying oven - able to maintain 105°C ± 4°C.

7.4	Thermometer - capable of measuring the range of 0-200°C.

7.5	Filter paper - Whatman No. 41 or equivalent.

7.6	Heating source - adjustable and able to maintain a temperature of 90-95°C.

7.7	Variable pipetters (1-10 mL capacity).

7.8	50-mL screw top plastic sample containers.

7.9	Balance capable of weighing to 0.01 g.

7.10	Funnel, or equivalent.

8.0 REAGENTS

8.1 Reagent grade chemicals shall be used in all tests. Unless otherwise indicated, it is
intended that all reagents shall conform to the specifications of the Committee on
Analytical Reagents of the American Chemical Society, where such specifications are
available. Other grades may be used, provided it is first ascertained that the reagent is of
sufficiently high purity to permit its use without lessening the accuracy of the
determination.


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001

Page: 	4 of	10.

8.2	ASTM Type II Water [ASTM D1193-77 (1983)]. All references to water in the method
refer to ASTM Type II unless otherwise specified.

8.3	Nitric acid (concentrated), HNO3. Acid should be analyzed to determine levels of
impurities. If method blank is < MDL, the acid can be used.

8.4	Hydrochloric acid (concentrated), HCl. Acid should be analyzed to determine level of
impurities. If method blank is < MDL, the acid can be used.

8.5	Hydrogen peroxide (30%), H2O2. Oxidant should be analyzed to determine level of
impurities.

9.0 CALIBRATIONS

9.1 Calibrations are discussed in the appropriate analytical method.

10.0 SAMPLE PREPARATION

10.1

See Section 12.0 Procedure


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001

Page: 	5 of	10.

11.0 DIAGRAMS OR TABLES

11.1 LCS AND MS SPIKING INFORMATION

Method	Analyte	*Concentration	**Soil Amount Spiked (mL)

6010A	As	100	0.5 mL

Analyte	*Concentration (µg/mL)

Ag	100
Al	100
As	100
B	100
Ba	100
Be	100
Ca	100
Cd	100
Co	100
Cr	100
Cu	100
Fe	100
K	1000
Mg	100
Mn	100
Mo	100
Na	100
Ni	100
Pb	100
Sb	100
Se	100
Si	50
Tl	100
Ti	100
V	100
Zn	100

* Spiking solution is purchased at above listed concentration from vendor.

**Target concentrations and analytes may be altered to better satisfy client project requirements.

12.0 PROCEDURE

12.1 Mix the sample thoroughly to achieve homogeneity. Weigh 0.5 to 2.0 grams ± 0.05
grams and transfer to a digestion vessel.

Quality control samples for each batch of up to 20 samples of the same matrix are
prepared as follows:

12.1.1	Prepare a preparation blank by following all steps and reagent additions
as used for the samples.

12.1.2	Weigh 0.50 grams of the solid blank material for the preparation of the
LCS sample. Spike the solid blank with 0.5 mL of the appropriate
spiking solution. (Some analytes may require blank correction for LCS
concentration.)


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001

Page: 	6 of	10.

12.1.3	Prepare 2 replicates of a client submitted sample for a matrix
spike/matrix spike duplicate pair. Spike the matrix spike pair with 0.5
mL of the appropriate spiking solution prior to the addition of any acid.

12.1.4	These samples are digested and analyzed using the same procedure as
client submitted samples.

Note: All steps requiring the use of acids should be conducted under a fume hood by
properly trained personnel using appropriate laboratory safety equipment.

12.2	Add 10 mL of 1:1 HNO3, mix the slurry, and cover with a watch glass. Heat the sample
to 95°C ± 5°C and reflux for 10 to 15 minutes without boiling. Allow the sample to cool,
add 5 mL of concentrated HNO3, replace the watch glass, and reflux for 30 minutes.
Repeat this last step as many times as necessary until no brown fumes are given off by the
sample upon the addition of acid, indicating complete oxidation. Using a ribbed watch
glass, allow the solution to evaporate to 5 mL (or heat for two hours) without boiling,
while maintaining a covering of solution over the bottom of the vessel.

12.3	After Step 12.2 has been completed and the sample has cooled, add 2 mL of water and 3
mL of 30% H2O2. Cover the digestion vessel with a watch glass and return the covered
beaker to the heating source for warming and to start the peroxide reaction. Care must be
taken to ensure that losses do not occur due to excessively vigorous effervescence. Heat
until effervescence subsides and cool the vessel.

12.4	Continue to add 30% H2O2 in 1-mL aliquots with warming until the effervescence is
minimal or until the general sample appearance is unchanged.

NOTE: Do not add more than a total of 10 mL 30% H2O2.

12.5	Cover the sample with a ribbed watch glass and continue heating the acid-peroxide
digestate until the volume has been reduced to approximately 5 mL, or heat at 95°C ± 5°C
without boiling for two hours. Maintain a covering of solution over the bottom of the
vessel at all times.

12.6	After cooling, dilute to 100 mL with water. Particulates in the digestate that may clog the
nebulizer should be removed by filtration, by centrifugation, or by allowing the sample to
settle.

12.6.1	Filtration - Filter through Whatman No. 41 filter paper (or equivalent) and dilute
to 100 mL with water.

12.6.2	The diluted sample has an approximate acid concentration of 5.0% (v/v) HCl
and 5.0% (v/v) HNO3.

12.7	For the analysis of samples for FLAA or ICP-AES, add 10 mL concentrated HCl to the
sample and cover with a watch glass. Place the sample on the heating source and reflux at
95°C ± 5°C for 15 minutes.

12.8	Filter the digestate through Whatman No. 41 filter paper (or equivalent) and collect
filtrate in a 100-mL volumetric flask. Adjust to final volume if needed with ASTM Type
II water. The sample is now ready for analysis by FLAA or ICP-AES.


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001	

Page: 	7 of	10

12.9	Make a record of the sample preparation in the analyst laboratory notebook. Include the
client identification, preparation procedure, determinative procedure, set and sample IDs,
quality control preparations, analyst's name, and the date of preparation. Any special
circumstances or notations regarding the preparation or the samples should be included if
the analyst deems them necessary for the analysis.

12.10	To improve the solubility and recoveries of antimony, barium, lead, and silver the
following procedure may be necessary. These steps are optional and not required on a
routine basis.

12.10.1	Add 2.5 mL conc. HNO3 and 2.5 mL conc. HCl to a 0.5 gram sample and cover
with a watch glass. Place the sample on the heating source and reflux for 15
minutes.

12.10.2	Filter the digestate through Whatman No. 41 filter paper, or equivalent, and
collect the filtrate in a 50-mL volumetric flask. Wash the filter paper, while still
in the funnel, with no more than 5 mL of hot (≈95°C) HCl, then with 20 mL of
hot (≈95°C) reagent water. Collect the washings in the same 50-mL volumetric
flask.

12.10.3	Remove the filter and residue from the funnel, and place them back in the vessel.
Add 5 mL of conc. HCl, place the vessel back on the heating source, and heat at
95°C ± 5°C until the filter paper dissolves. Remove the vessel from the heating source
and wash the cover and sides with reagent water. Filter the residue and collect the filtrate
in the same 50-mL volumetric flask. Allow the filtrate to cool, then dilute to volume with
reagent water.

Note: High concentrations of metal salts with temperature-sensitive solubility can
result in the formation of precipitates upon cooling of primary and/or
secondary filtrates. If precipitation occurs in the flask upon cooling, do not
dilute to volume. Add up to 10 mL of conc. HC1 to dissolve the precipitate.
After the precipitate dissolves, dilute to volume with reagent water and the
extract is ready for analysis.

12.11	Hotblock digestion procedure.

12.11.1	Mix the sample thoroughly to achieve homogeneity. For the digestion
procedure, weigh to the nearest 0.01 g and transfer to a disposable Hotblock
digestion vessel.

12.11.2	Quality control samples for each batch of up to 20 samples of the same matrix are
prepared as in Sections 12.1.1 through 12.1.4.

12.11.3	To each digestion vessel prepared, add 1 mL Type II water followed by 1 mL
concentrated HNO3. Cap with screw-top caps and place in the Hotblock for 15
minutes at 95°C ± 5°C.

12.11.4	Carefully remove the vessels from the Hotblock and allow to cool. Add an
additional 1.5 mL HNO3, reseal with screw caps, and return to the Hotblock for
30 minutes at 95°C ± 5°C.

12.11.5	Remove the samples from the Hotblock and allow to cool. Add 2.5 mL
concentrated HCl and reseal with screw caps. Return to the Hotblock and heat at
95°C ± 5°C for 30 minutes.


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001

Page: 	8 of	10.

12.11.6	Remove the digestion vessels and allow digestates to cool. Adjust final volume
to 50 mL. Tighten screw caps and gently shake samples. Filtration may be
completed prior to analysis, unless Thallium is a requested analyte. (Note:
Filtering samples for Thallium analysis reduces analyte concentration present in
the samples and QC.) Samples may be filtered using an Acrodisk filter attached
to a disposable syringe just prior to analysis.

12.11.7	Acid concentrations and sample size may be adjusted to better suit project
requirements.

12.11.8	For accurate analysis and quantitation, it is important that sample acid
concentrations match the acid concentration of the analytical standards. The
final acid concentration of the digestates using this procedure is 5% HNO3 and
5% HCl by volume.

13.0 CALCULATIONS

13.1	The concentrations determined are to be reported on the basis of the actual weight of the
sample. If a dry weight analysis is desired, then the percent solids of the sample must also
be provided.

13.2	If a percent solid is desired, a separate determination of percent solids must be performed
on a homogeneous aliquot of the sample.

13.3	Additional calculations are discussed in the appropriate analytical method.
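As a hedged illustration of the dry-weight reporting described in Sections 13.1 and 13.2 (the conversion shown
is the conventional percent-solids correction, which this SOP does not spell out; the values are hypothetical):

    def dry_weight_result(wet_basis_result: float, percent_solids: float) -> float:
        # Convert a result reported on an as-received (wet) basis to a dry-weight basis.
        return wet_basis_result * 100.0 / percent_solids

    print(dry_weight_result(120.0, 80.0))   # 120 mg/kg as received at 80% solids -> 150 mg/kg dry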

14.0 QUALITY ASSURANCE PROVISIONS

14.1	All specific quality control samples described in the analytical procedure should be
followed. Refer to the appropriate SOP of the analytical procedure for detailed
instructions.

14.2	For each analytical batch of samples processed, reagent blanks should be carried
throughout the entire sample-preparation and analytical process at a frequency of one per
analytical batch or every 20 samples, whichever is greater. These blanks will be useful in
determining if samples are contaminated during the preparation process.

14.3	A matrix spike/matrix spike duplicate (MS/MSD) pair should be processed on a routine
basis and whenever a new sample matrix is being processed. An MS/MSD pair is
duplicate aliquots of one of the samples, spiked with known amounts of analytes (see
Section 11.1), and brought through the entire sample preparation and analytical process.
MS/MSD pairs should be processed with each analytical batch or every 20 samples,
whichever is greater. MS/MSD samples will be used to determine precision.

14.4	A laboratory control sample (LCS) is a spiked blank sample or standard reference
material of a known concentration processed on a routine basis and whenever a new
sample matrix is being prepared. An LCS should be processed with each analytical batch
or every 20 samples, whichever is greater. The results of the LCS should be employed to


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001	

Page: 	9 of	10

determine the effects of the sample matrix and to determine preparation and analytical
accuracy.

14.5 Limitations for the FLAA and ICP-AES optional digestion procedure. Analysts should be
aware that the upper linear range for silver, barium, lead and antimony may be exceeded
with some samples. If there is a reasonable possibility that this range may be exceeded,
or if a sample's analytical result exceeds this upper limit, a smaller sample size should be
taken through the entire procedure and re-analyzed to determine if the linear range has
been exceeded. The approximate linear ranges for a 0.5-gram sample size are:

Optional Digestion Procedure Linear Range Limitation (mg/kg)

Ag	200,000		Mo	1,000,000
As	1,000,000	Ni	1,000,000
Ba	2,500		Pb	200,000
Be	1,000,000	Sb	200,000
Cd	1,000,000	Se	1,000,000
Co	1,000,000	Tl	1,000,000
Cr	1,000,000	V	1,000,000
Cu	1,000,000	Zn	1,000,000

These ranges will vary with sample matrix, molecular form, and size.

14.6 Responsibility for Inspection

14.6.1	The Section Manager, or designee, is responsible for inspecting the work
performed by the analysts to verify completeness and data quality.

14.6.2	The analysts performing this procedure shall have the responsibility to inspect
notebooks and worksheets for accuracy and completeness, samples for proper
volume/size, labels, forms, and tags for accuracy, and equipment for proper
maintenance and operation.

15.0 REPORTING RESULTS

15.1 The process of reporting results is discussed in the appropriate analytical method.


-------
ENV - 3050B

Revision No.:	5	

Date: 	March 19. 2001

Page:	10 of	10

16.0 PREVENTIVE MAINTENANCE

16.1 Preventive maintenance should be performed according to the equipment manufacturer's
recommendations. All service and maintenance performed is to be recorded in the
appropriate equipment service logbook.

17.0 REFERENCES

17.1	Rohrbough, W.G.; et al. Reagent Chemicals, American Chemical Society Specifications, 7th Ed.;
American Chemical Society: Washington, D.C., 1986.

17.2	1985 Annual Book of ASTM Standard, Vol. 11.01; "Standard Specification for Reagent
Water," ASTM: Philadelphia, PA, 1985; D1193-77.

17.3	"Test Methods for Evaluating Solid Waste Physical/Chemical Methods," Version 2,
USEPA SW-846, December 1997.


-------
Addendum: Preparation procedure for wipes for 6010B lead analysis

1.0	Place the wipe in a hotblock digestion vessel.

2.0	Acid addition

2.1	Ghost wipe or equivalent.

2.1.1	Add 2 mL of concentrated HNO3 to the digestion vessel containing the wipe
sample.

2.1.2	Allow the reaction to subside.

2.1.3	Loosely attach the screw cap onto the vessel.

2.2	Other wipes, including gauze, baby wipes, etc.

2.2.1	Add the appropriate volume of concentrated HNO3 to the sample. (Note: Wipes
larger than Ghost wipes typically require 5 mL of concentrated HNO3 for digestion. The
acid concentration may be adjusted to adequately digest the wipe material used.)

3.0	Heat on the hotblock for 1 hour at 95°C.

4.0	Remove from the hotblock apparatus and allow to cool.

5.0	Adjust to the required final volume with Type II DI water.

5.1	The final volume must allow a nitric acid concentration of 10%. (i.e., 2 mL nitric acid
used for ghost wipe digestion requires a final volume adjustment to 20 mL with Type II
DI water.)
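The final-volume adjustment in 5.1 can be illustrated with a small sketch (assuming the 10% (v/v) nitric acid
target stated above):

    def final_volume_ml(acid_volume_ml: float, target_fraction: float = 0.10) -> float:
        # Final volume (mL) that yields the target nitric acid concentration (v/v).
        return acid_volume_ml / target_fraction

    print(final_volume_ml(2.0))   # Ghost wipe: 2 mL HNO3 -> 20 mL final volume
    print(final_volume_ml(5.0))   # larger wipes: 5 mL HNO3 -> 50 mL final volume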


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	1 of 33

STANDARD OPERATING PROCEDURE

FOR THE DETERMINATION OF TRACE METALS IN
SOLUTION BY INDUCTIVELY COUPLED PLASMA-ATOMIC
EMISSION SPECTROSCOPY BY EPA METHOD 6010B

1.0 SCOPE AND APPLICATION

1.1	Inductively coupled plasma-atomic emission spectrometry (ICP-AES) determines trace
elements, including metals, in solution. This method is applicable to all of the elements
listed in Table 2. All matrices, excluding filtered groundwater samples, but including
ground water, aqueous samples, TCLP and EP extracts, industrial and organic wastes,
soil, sludge, sediment, and other solid wastes, require digestion prior to analysis.
Groundwater samples that have been pre-filtered and acidified will not require acid
digestion. Samples, which are not digested, must either use an internal standard or be
matrix-matched with the standards.

1.2	Table 2 lists the elements for which this method is applicable. Detection limits,
sensitivity, and the optimum and linear concentration ranges of the elements can vary with
the wavelength, spectrometer, matrix and operating conditions. Table 2 also lists the
recommended analytical wavelengths and estimated instrumental detection limits for the
elements in clean aqueous matrices. The instrument detection limit data may be used to
estimate instrument and method performance for other sample matrices. Elements and
matrices other than those listed in Table 2 may be analyzed by this method if performance
at the concentration levels of interest is demonstrated.

1.3	Users of the method should state the data quality objectives prior to analysis and must
document and have on file the required initial demonstration performance data described
in the following sections prior to using the method for analysis.

1.4	Use of this method is restricted to spectroscopists who are knowledgeable in the
correction of spectral, chemical and physical interferences described in this method.

2.0 SUMMARY OF METHOD

2.1	Prior to analysis, samples must be solubilized or digested using appropriate sample
preparation methods. When analyzing groundwater samples for dissolved constituents,
acid digestion is not necessary if the samples are filtered and acidified prior to analysis.

2.2	This method describes multielemental determinations by ICP-AES using sequential or
simultaneous optical systems and axial or radial viewing of the plasma. The instrument
measures characteristic emission spectra by optical spectrometry. Samples are nebulized
and the resulting aerosol is transported to the plasma torch. Element-specific emission
spectra are produced by a radio-frequency inductively coupled plasma. The spectra are
dispersed by a grating spectrometer and the intensities of the emission lines are monitored
by photosensitive devices. Background correction is required for trace element
determination. Background must be measured adjacent to analyte lines on samples during
analysis. The position selected for the background-intensity measurement, on either or
both sides of the analytical line, will be determined by the complexity of the spectrum
adjacent to the analyte line. In one mode of analysis, the position used should be as free
as possible from spectral interference and should reflect the same change in background


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	2	of 33

intensity as occurs at the analyte wavelength measured. Background correction is not
required in cases of line broadening where a background correction measurement would
actually degrade the analytical result. The possibility of additional interferences named in
Section 7.0 should also be recognized and appropriate correction made; tests for their
presence are described in Section 13.9. Alternatively, users may choose multivariate
calibration methods. In this case, point selections for background correction are
superfluous since whole spectral regions are processed.

3.0 SAFETY PRECAUTIONS

3.1	The toxicity or carcinogenicity of each reagent used in this method has not been precisely
defined. However, each chemical compound should be treated as a potential health
hazard. From this viewpoint, exposure to these chemicals must be reduced to the lowest
possible level by whatever means available. The laboratory maintains a current
awareness file of OSHA regulations regarding the safe handling of the chemicals
specified in this method. A reference file of material data-handling sheets is also
available to all laboratory personnel. The MSDS file is kept in the top drawer of the
Health and Safety Officer's filing cabinet. Additional references to laboratory safety are
available. They are:

1.	"OSHA Safety and Health Standards, General Industry," (29 CFR 1910),
Occupational Safety and Health Administration, OSHA 2206, revised January
1976.

2.	"Prudent Practices for Handling Hazardous Chemicals in Laboratories."
Committee on Hazardous Substances in the Laboratory. Assembly of
Mathematical and Physical Sciences. National Research Council, 1987.

3.2	Proper precautions such as the use of safety glasses and lab coats are mandatory when
dealing with these samples.

3.2.2 Additional protection given by gloves may also be indicated.

NOTE: Any gloves used must undergo prior testing to ensure that no method target
compounds can be leached from the gloves when contacted by acid in liquid or
vapor form.

4.0 SAMPLE HANDLING AND PRESERVATION

4.1	All samples must have been collected using a sampling plan that addresses the
considerations discussed in Chapter Nine of "Test Methods for Evaluating Solid Waste
Physical/Chemical Methods," SW-846 current revision. DataChem Laboratories does not
participate in sample collection activities.

4.2	All glassware is washed with a non-phosphate detergent in hot water and rinsed with tap
water. The glassware is then soaked in a 1:1 nitric acid bath and rinsed with tap water.
Finally, the glassware is soaked in a 1:1 hydrochloric acid bath, rinsed with tap water and
then distilled water. After air-drying, all glassware is stored in cabinets to minimize
contamination due to airborne particulate. Immediately prior to use, the glassware is
rinsed with deionized water.

4.3	Aqueous samples should be acidified to a pH of < 2 with HNO3.


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	3 of 33

4.4	Nonaqueous samples shall be maintained at 4°C ± 2°C from immediately after sampling
until just prior to digestion.

4.5	Sample holding times, preservation requirements, and suggested collection volumes are
listed in Table 1. The volumes listed in Table 1 are adequate for ICP analysis. If the
performance of any additional methods is required, a larger sample volume may be
necessary. Also, if the other test methods are to be performed requiring different sample
preservation, separate aliquots of the same sample must be collected and preserved
appropriately.

TABLE 1

RECOMMENDED COLLECTION VOLUMES FOR METAL DETERMINATIONS

Measurement	Digestion Vol. Req.a (mL)	Collection Volume (mL)b	Preservative	Holding Time

Metals (except hexavalent chromium and mercury):

Total Recoverable	45	250	HNO3 to pH <2	6 months

Dissolved	45	250	Filter on site; HNO3 to pH <2	6 months

Suspended	45	250	Filter on site	6 months

Total	45	250	HNO3 to pH <2	6 months

aSolid samples must be at least 50 g and usually require no preservation other than storing at 4°C ± 2°C
until digested.

bEither plastic or glass containers may be used.

4.6 In the determination of trace metals, sample containers have the potential of introducing
positive or negative errors in the measurement by (a) contributing contaminants through
leaching or surface desorption, and (b) depleting analyte concentrations through
adsorption. Consequently, the collection and treatment of the samples prior to analysis
requires particular attention. The following cleaning treatment sequence has been
determined to be adequate in minimizing contamination in sample bottles, whether
borosilicate glass, linear polyethylene, polypropylene, or Teflon: detergent, tap water,
2% nitric acid, tap water, and Type II water.

Note: Chromic acid should not be used to clean glassware, especially if chromium
is one of the analytes. Commercial, no-chromate products (e.g., Nochromix)
may be used in place of chromic acid if a more rigorous cleaning procedure is
required. (Chromic acid should also not be used with plastic bottles.)


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	4	of 33

5.0 SAMPLE PREPARATION

5.1	Prior to analysis, samples must be digested using appropriate sample preparation
methods. A summary of ICP sample preparation methods is listed below. Refer to the
applicable SOP for complete sample preparation information.

5.1.1	3015: (or current revision) Describes the microwave induced digestion of
aqueous samples for total recoverable or dissolved metals. This method is
applicable to ground water, surface water, drinking water, and EP and TCLP
extracts.

5.1.2	3051: (or current revision) Describes the microwave induced digestion of solid
samples. This method is applicable to soils, sludges, and solid waste samples.

5.1.3	3050B: Describes the hotplate assisted acid digestion of solid samples. This
method is applicable to soils, sludges, and solid waste samples.

5.1.4	3010A: Describes the hotplate assisted acid digestion of aqueous samples for
total recoverable or dissolved metals. This method is applicable to ground
water, surface water, drinking water, and EP and TCLP extracts.

5.1.5	ENV-3005A: Describes the hotplate assisted acid digestion of aqueous samples
for total recoverable or dissolved metals. This method is applicable to ground
water, surface water, drinking water, and EP and TCLP extracts.

6.0 DETECTION LIMITS

6.1	Method detection limits must be determined annually for the actual instrument to be used
as detailed in the laboratory Standard Operating Procedure GEN-012, "Method
Detection Limits". Table 2 lists the estimated method detection limits.

6.2	Instrument detection limits will be determined by the laboratory semi-annually using the
procedure found in Section 13.12.

6.3 Linear dynamic range verification will be conducted semi-annually using the procedure
found in Section 13.13.


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	5 of	33

TABLE 2

RECOMMENDED WAVELENGTHS AND ESTIMATED METHOD DETECTION LIMITS

Element	Wavelengtha (nm)	Method Detection Limitb (µg/L)

Aluminum	308.215	7.0
Antimony	206.833	5.5
Arsenic	188.979	6.1
Barium	233.527	0.0
Beryllium	313.042	0.
Boron	182.527	20.
Cadmium	214.438	0.5
Calcium	315.887	11.2
Chromium	205.552	1.2
Cobalt	228.616	0.3
Copper	324.754	2.7
Iron	273.955	12.8
Lead	220.353	4.0
Lithium	610.364	3.4
Magnesium	279.079	21.9
Manganese	257.610	1.0
Molybdenum	202.030	1.0
Nickel	232.003	0.9
Phosphorus	177.428	30.8
Potassium	766.491	7.3
Selenium	203.985	5.4
Silicon	221.667	2.2
Silver	328.068	0.0
Sodium	589.592	17.5
Strontium	460.733	1.5
Thallium	190.800	5.3
Tin	189.933	54.0
Titanium	368.520	2.8
Vanadium	292.402	0.5
Zinc	213.856	3.4

aThe wavelengths listed are recommended because of their sensitivity and overall acceptance. Other
wavelengths may be substituted if they can provide the needed sensitivity and can be properly corrected for
any spectral interferences (see Step 6.1). In time, other elements may be added to this list.

bHighly dependent on operating conditions and plasma position.


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	6	of 33

7.0 INTERFERENCES

7.1 Spectral interferences are caused by: (1) overlap of a spectral line from another element;
(2) unresolved overlap of molecular band spectra; (3) background contribution from
continuous or recombination phenomena; and (4) stray light from the line emission of
high-concentration elements.

7.1.1	Background emission and stray light can usually be compensated for by
subtracting the background emission determined by measurements adjacent to
the analyte wavelength peak. Spectral scans of samples or single element
solutions in the analyte regions may indicate when alternate wavelengths are
desirable due to severe spectral interference. These scans will also show
whether the most appropriate estimate of background emission is provided by an
interpolation from measurements on both sides of the wavelength peak or by
measured emission on only one side. The locations selected for the
measurement of background intensity will be determined by the complexity of
the spectrum adjacent to the wavelength peak. The locations used for routine
measurement must be free of off-line spectral interference (interelement or
molecular) or adequately corrected to reflect the same change in background
intensity as occurs at the wavelength peak. For multivariate methods using
whole spectral regions, background scans should be included in the correction
algorithm. Off-line spectral interferences are handled by including spectra on
interfering species in the algorithm.

7.1.2	To determine the appropriate location for off-line background correction, the
user must scan the area on either side adjacent to the wavelength and record the
apparent emission intensity from all other method analytes. This spectral
information must be documented and kept on file. The location selected for
background correction must be either free of off-line interelement spectral
interference or a computer routine must be used for automatic correction on all
determinations. If a wavelength other than the recommended wavelength is
used, the analyst must determine and document both the overlapping and nearby
spectral interference effects from all method analytes and common elements and
provide for their automatic correction on all analyses. Tests to determine
spectral interference must be done using analyte concentrations that will
adequately describe the interference. Normally, 100 mg/L single element
solutions are sufficient; however, for analytes such as iron that may be found at
high concentration, a more appropriate test would be to use a concentration near
the upper analytical range limit.

7.1.3	Spectral overlaps may be avoided by using an alternate wavelength or can be
compensated by equations that correct for interelement contributions.
Instruments that use equations for interelement correction require the interfering
elements be analyzed at the same time as the element of interest. When
operative and uncorrected, interferences will produce false positive
determinations and be reported as analyte concentrations. More extensive
information on interferant effects at various wavelengths and resolutions is
available in reference wavelength tables and books. Users may apply
interelement correction equations determined on their instruments with tested
concentration ranges to compensate (off line or on line) for the effects of
interfering elements. Some potential spectral interferences observed for the
recommended wavelengths are given in Table 3. For multivariate methods using


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	7	of 33

whole spectral regions, spectral interferences are handled by including spectra of
the interfering elements in the algorithm. The interferences listed are only those
that occur between method analytes. Only interferences of a direct overlap
nature are listed. The overlaps were observed with a single instrument having a
working resolution of 0.035nm.

7.1.4	When using interelement correction equations, the interference may be expressed
as analyte concentration equivalents (i.e. false analyte concentrations) arising
from 100 mg/L of the interference element. For example, assume that As is to be
determined (at 193.696 nm) in a sample containing approximately 10 mg/L of
Al. According to Table 3, 100 mg/L of Al would yield a false signal of As
equivalent to approximately 1.3 mg/L. Therefore, the presence of 10 mg/L of Al
would result in a false signal for As equivalent to approximately 0.13 mg/L. The
user is cautioned that other instruments may exhibit somewhat different levels of
interference than those shown in Table 3. The interference effects must be
evaluated for each individual instrument since the intensities will vary.
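As a purely illustrative sketch of the arithmetic in 7.1.4 (the 1.3 mg/L analyte-concentration
equivalent is taken from the worked example above; the other values are hypothetical):

    # False As signal (mg/L) produced by 100 mg/L of Al, per the example above
    equivalent_per_100 = 1.3

    def false_as_signal(al_conc_mg_per_l: float) -> float:
        # Apparent (false) As concentration caused by Al in the sample.
        return equivalent_per_100 * al_conc_mg_per_l / 100.0

    apparent_as = 0.45                    # hypothetical raw As reading, mg/L
    al_in_sample = 10.0                   # measured Al, mg/L
    corrected_as = apparent_as - false_as_signal(al_in_sample)   # subtracts 0.13 mg/L
    print(f"corrected As = {corrected_as:.2f} mg/L")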

7.1.5	Interelement corrections will vary for the same emission line among instruments
because of differences in resolution, as determined by the grating, the entrance
and exit slit widths, and by the order of dispersion. Interelement corrections will
also vary depending upon the choice of background correction points. Selecting
a background correction point where an interfering emission line may appear
should be avoided when practical. An interelement correction that constitutes a
major portion of an emission signal may not yield accurate data. Users should
not forget that some samples may contain uncommon elements that could
contribute spectral interferences.

7.1.6	The interference effects must be evaluated for each individual instrument
whether configured as a sequential or simultaneous instrument. For each
instrument, intensities will vary not only with optical resolution but also with
operating conditions (such as power, viewing height and argon flow rate). When
using the recommended wavelengths, the analyst is required to determine and
document for each wavelength the effect from referenced interferences (Table 3)
as well as any other suspected interferences that may be specific to the
instrument or matrix. The analyst is encouraged to utilize a computer routine for
automatic correction on all analyses.

7.1.7	Users of sequential instruments must verify the absence of spectral interferences
by scanning over a range of 0.5 nm centered on the wavelength of interest for
several samples. The range for lead, for example, would be 220.6 to 220.1 nm.
This procedure must be repeated whenever a new matrix is analyzed and when a
new calibration curve using different instrumental conditions is to be prepared.
Samples that show an elevated background emission across the range may be
background corrected by applying a correction factor equal to the emission
adjacent to the line or at two points on either side of the line and interpolating
between them. An alternate wavelength that does not exhibit a background shift
or spectral overlap may also be used.

7.1.8	If the correction routine is operating properly, the determined apparent analyte(s)
concentration from analysis of each interference solution should fall within a
specific concentration range around the calibration blank. The concentration
range is calculated by multiplying the concentration of the interfering element by
the value of the correction factor being tested and dividing by 10. If after the


-------
ENV-6010B

Revision No.:	5	

Date: 	September 13. 2000

Page:	8	of 33	

subtraction of the calibration blank the apparent analyte concentration falls
outside of this range in either a positive or negative direction, a change in the
correction factor of more than 10% should be suspected. The cause of the
change should be determined and corrected and the correction factor updated.
The interference check solutions should be analyzed more than once to confirm a
change has occurred. Adequate rinse time between solutions and before analysis
of the calibration blank will assist in the confirmation.
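For illustration only, the acceptance range described in 7.1.8 might be checked as follows; the
interferent concentration, correction factor, and measured values are hypothetical.

    # Hypothetical check of an interelement correction factor (see 7.1.8)
    interferent_conc = 100.0     # mg/L of the interfering element in the check solution
    correction_factor = 0.013    # hypothetical correction factor being tested

    # Acceptance half-width: interferent concentration x correction factor / 10
    half_width = interferent_conc * correction_factor / 10.0

    calibration_blank = 0.002    # hypothetical calibration blank result, mg/L
    apparent_analyte = 0.015     # hypothetical apparent analyte result, mg/L

    within_range = abs(apparent_analyte - calibration_blank) <= half_width
    print(f"half-width = {half_width:.4f} mg/L, within range: {within_range}")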

7.1.9	When interelement corrections are applied, their accuracy should be verified
daily, by analyzing spectral interference check solutions. If the correction
factors or multivariate correction matrices tested on a daily basis are found to be
within the 20% criteria for 5 consecutive days, the required verification
frequency of those factors in compliance may be extended to a weekly basis.
Also, if the nature of the samples analyzed is such that they do not contain
concentrations of the interfering elements at ± one reporting limit from zero,
daily verification is not required. All interelement spectral correction factors or
multivariate correction matrices must be verified and updated every six months
or when a change in instrumentation, such as in the torch, nebulizer, injector, or
plasma conditions occurs. Standard solutions should be inspected to ensure that
there is no contamination that may be perceived as a spectral interference.

7.1.10	When interelement corrections are not used, verification of absence of
interferences is required.

7.1.10.1	One method is to use a computer software routine for comparing the
determinative data to limits files for notifying the analyst when an
interfering element is detected in the sample at a concentration that will
produce either an apparent false positive concentration (i.e., greater than the
analyte instrument detection limit) or a false negative analyte concentration
(i.e., less than the lower control limit of the calibration blank defined for a
99% confidence interval).

7.1.10.2	Another method is to analyze an Interference Check Solution(s) which
contains similar concentrations of the major components of the samples
(>10mg/L) on a continuing basis to verify the absence of effects at the
wavelengths selected. These data must be kept on file with the sample
analysis data. If the check solution confirms an operative interference
that is >20% of the analyte concentration, the analyte must be
determined using (1) analytical and background correction wavelengths
(or spectral regions) free of the interference, (2) by an alternative
wavelength, or (3) by another documented test procedure.

7.2	Physical interferences are effects associated with the sample nebulization and transport
processes. Changes in viscosity and surface tension can cause significant inaccuracies,
especially in samples containing high dissolved solids or high acid concentrations. If
physical interferences are present, they must be reduced by diluting the sample, by using a
peristaltic pump, by using an internal standard or by using a high solids nebulizer.
Another problem that can occur with high dissolved solids is salt buildup at the tip of the
nebulizer, affecting aerosol flow rate and causing instrumental drift. The problem can be
controlled by wetting the argon prior to nebulization, using a tip washer, using a high
solids nebulizer or diluting the sample. Also, it has been reported that better control of
the argon flow rate, especially to the nebulizer, improves instrument performance. This
may be accomplished with the use of mass flow controllers. The test in Section 13.9 will
help determine if a physical interference is present.

7.3	Chemical interferences include molecular compound formation, ionization effects, and
solute vaporization effects. Normally, these effects are not significant with the ICP
technique, but if observed, can be minimized by careful selection of operating conditions
(incident power, observation position, and so forth), by buffering of the sample, by matrix
matching, and by standard addition procedures. Chemical interferences are highly
dependent on matrix type and the specific analyte.

7.4	Memory interferences result when analytes in a previous sample contribute to the signals
measured in a new sample. Memory effects can result from sample deposition on the
uptake tubing to the nebulizer and from the build up of sample material in the plasma
torch and spray chamber. The site where these effects occur is dependent on the element
and can be minimized by flushing the system with a rinse blank between samples. The
possibility of memory interferences should be recognized within an analytical run and
suitable rinse times should be used to reduce them. The rinse times necessary for a
particular element must be estimated prior to analysis. This may be achieved by
aspirating a standard containing elements at a concentration ten times the usual amount or
at the top of the linear dynamic range. The aspiration time for this sample should be the
same as a normal sample analysis period, followed by analysis of the rinse blank at
designated intervals. The length of time required to reduce analyte signals to within a
factor of two of the method detection limit should be noted. Until the required rinse time
is established, this method suggests a rinse period of at least 60 seconds between samples
and standards. If memory interference is suspected, the sample must be re-analyzed after
a rinse period of sufficient length. Alternate rinse times may be established by the
analyst, based upon their data quality objectives.

7.5	Users are advised that high salt concentrations can cause analyte signal suppressions and
confuse interference tests. If the instrument does not display negative values, fortify the
interference check solution with the elements of interest at 0.5 to 1 mg/L and measure the
added standard concentration accordingly. Concentrations should be within 20% of the
true spiked concentration or dilution of the samples will be necessary. In the absence of
measurable analyte, over-correction could go undetected if a negative value is reported as
zero.

7.6	The dashes in Table 3 indicate that no measurable interferences were observed even at
higher interferent concentrations. Generally, interferences were discernible if they
produced peaks, or background shifts, corresponding to 2 to 5% of the peaks generated by
the analyte concentrations.



TABLE 3

EXAMPLE OF POTENTIAL INTERFERENCES:
ANALYTE CONCENTRATION EQUIVALENTS ARISING FROM INTERFERENTS AT THE 100 mg/L LEVEL

Interferents: Al, Ca, Cr, Cu, Fe, Mg, Mn, Ni, Tl, V, Zn

Analyte        Wavelength (nm)    Observed interference equivalents (mg/L)
Aluminum       396.153            1.6
Antimony       206.833            0.44   .002   .003
Arsenic        188.979            --
Barium         233.527            --
Beryllium      313.042            .24
Cadmium        214.438            --
Calcium        315.887            .001
Chromium       205.552            --
Cobalt         228.616            .009   .008
Copper         324.754            --
Iron           273.955            .009   .02    .008   1.7
Lead           220.353            .006
Magnesium      279.079            --
Manganese      257.610            --
Molybdenum     202.030            --
Nickel         232.003            4.1    .01    .02    .01
Selenium       196.026            .01    .05
Sodium         589.592            --
Thallium       190.800            .03
Vanadium       292.402            --
Zinc           213.856            .26    .006   .71

Dashes indicate that no interference was observed at the interferent concentrations used to
generate this table. The concentrations used are listed below:

Al - 100 mg/L      Fe - 100 mg/L      Tl - 100 mg/L
Ca - 100 mg/L      Mg - 100 mg/L      V  - 100 mg/L
Cr - 100 mg/L      Mn - 100 mg/L      Zn - 100 mg/L
Cu - 100 mg/L



8.0 APPARATUS

8.1	Inductively coupled argon plasma emission spectrometer, Thermo Jarrell Ash Model 61E
or Perkin-Elmer Model 3000XL purged spectrometer:

8.1.1	Computer-controlled emission spectrometer with background correction.

8.1.2	Radio frequency generator compliant with FCC regulations.

8.1.3	ICP torch and load coil assembly.

8.1.4	Nebulizer and spray chamber.

8.1.5	Peristaltic pump.

8.1.6	Mass flow controller.

8.1.7	Autosampler

8.1.8	Water chiller (if necessary)

8.1.9	Drain assembly

8.1.10	Ventilation system

8.2	Argon gas supply - Welding grade or better.

8.3	Nitrogen gas supply - Welding grade or better.

8.4	Sample uptake tubing.

8.5	Variable and fixed volumetric pipetters (100-1000 µL, 1-10 mL).

8.6	Analytical balance capable of weighing 0.01 g.

8.7	Volumetric flasks (1 L).

8.8	Plastic screw top sample containers.

8.9	16mm x 125mm Plastic disposable culture tubes for Autosampler.

9.0 REAGENTS AND STANDARDS

9.1 Reagent or trace metal grade chemicals shall be used in all tests. Unless otherwise
indicated, it is intended that all reagents shall conform to the specifications of the
Committee on Analytical Reagents of the American Chemical Society, where such
specifications are available. Other grades may be used, provided it is first ascertained that
the reagent is of sufficiently high purity to permit its use without lessening the accuracy of
the determination. If the purity of a reagent is in question, analyze the reagent for
contamination. If the concentration of the contamination is less than the MDL then the
reagent is acceptable for use.



9.1.1	Concentrated Nitric acid (HNO3). DataChem Laboratories, Cincinnati
currently uses Mallinckrodt reagent grade nitric acid.

9.1.2	Concentrated Hydrochloric acid (HCl). DataChem Laboratories, Cincinnati
currently uses Mallinckrodt reagent grade hydrochloric acid.

9.2	ASTM Type II Water [ASTM D1193-77 (1983)]. All references to water in the method
refer to ASTM Type II unless otherwise specified.

9.3	Standard stock solutions may be purchased. DataChem Laboratories, Cincinnati is
currently purchasing High-Purity Standards certified 1000 ppm stock solutions. These
standards are NIST traceable. The shelf-life of all stock solutions is one year from the
day received. Alternatively, stock solutions may be prepared from ultra-high purity grade
chemicals or metals (99.99 to 99.999% pure). All salts must be dried for 1 hour at 105°C,
unless otherwise specified.

CAUTION: Many metal salts are extremely toxic if inhaled or swallowed. Wash
hands thoroughly after handling.

Typical stock solution preparation procedures follow. Concentrations are calculated
based upon the weight of pure metal added, or upon the mole fraction and the weight of
the metal salt added.

Pure metals:

    Metal Concentration (ppm) = weight (mg) / volume (L)

Metal salts:

    Concentration (ppm) = [weight (mg) x mole fraction] / volume (L)
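
As an illustration only (not part of the laboratory procedure), the short Python sketch below
applies the two relationships above. The function names and the example quantities are
assumptions chosen for the example; the lead values are taken from Section 9.3.12.

    def stock_ppm_pure_metal(weight_mg, volume_l):
        """Stock concentration (ppm = mg/L) prepared from a pure metal."""
        return weight_mg / volume_l

    def stock_ppm_metal_salt(weight_mg, mole_fraction, volume_l):
        """Stock concentration (ppm) prepared from a metal salt.

        mole_fraction is the analyte fraction of the salt as tabulated in
        Section 9.3 (e.g., 0.6256 for Pb in Pb(NO3)2).
        """
        return (weight_mg * mole_fraction) / volume_l

    # Example: 0.16 g Pb(NO3)2 diluted to 1,000 mL (values from Section 9.3.12)
    print(stock_ppm_metal_salt(weight_mg=160.0, mole_fraction=0.6256, volume_l=1.0))
    # -> approximately 100 ppm, i.e., 1 mL = 100 µg Pb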

9.3.1	Aluminum solution, stock, 1 mL = 100 µg Al: Dissolve 0.10 g of aluminum
metal, weighed accurately to at least four significant figures, in an acid mixture
of 4 mL of (1:1) HCl and 1 mL of concentrated HNO3 in a beaker. Warm gently
to effect solution. When solution is complete, transfer quantitatively to a liter
flask, add an additional 10 mL of (1:1) HCl and dilute to 1,000 mL with water.

9.3.2	Antimony solution, stock, 1 mL = 100 µg Sb: Dissolve 0.27 g K(SbO)C4H4O6
(mole fraction Sb = 0.3749), weighed accurately to at least four significant
figures, in water, add 10 mL (1:1) HCl, and dilute to 1,000 mL with water.

9.3.3	Arsenic solution, stock, 1 mL = 100 µg As: Dissolve 0.13 g of As2O3 (mole
fraction As = 0.7574), weighed accurately to at least four significant figures, in
100 mL of water containing 0.4 g NaOH. Acidify the solution with 2 mL
concentrated HNO3 and dilute to 1,000 mL with water.

9.3.4	Barium solution, stock, 1 mL = 100 µg Ba: Dissolve 0.15 g BaCl2 (mole
fraction Ba = 0.6595), dried at 250°C for 2 hours, weighed accurately to at least
four significant figures, in 10 mL water with 1 mL (1:1) HCl. Add 10.0 mL
(1:1) HCl and dilute to 1,000 mL with water.

9.3.5	Beryllium solution, stock, 1 mL = 100 µg Be: Do not dry. Dissolve 1.97 g
BeSO4·4H2O (mole fraction Be = 0.0509), weighed accurately to at least four
significant figures, in water, add 10.0 mL concentrated HNO3, and dilute to
1,000 mL with water.

9.3.6	Cadmium solution, stock, 1 mL = 100 ug Cd: Dissolve 0.11 g CdO (mole
fraction Cd = 0.8754), weighed accurately to at least four significant figures, in a
minimum amount of (1:1) HNO3. Heat to increase rate of dissolution. Add 10.0
mL concentrated HNO3 and dilute to 1,000 mL with water.

9.3.7	Calcium solution, stock, 1 mL = 100 µg Ca: Suspend 0.25 g CaCO3 (mole
fraction Ca = 0.4005), dried at 180°C for 1 hour before weighing, weighed
accurately to at least four significant figures, in water and dissolve cautiously
with a minimum amount of (1:1) HNO3. Add 10.0 mL concentrated HNO3 and
dilute to 1,000 mL with water.

9.3.8	Chromium solution, stock, 1 mL = 100 µg Cr: Dissolve 0.19 g CrO3 (mole
fraction Cr = 0.5200), weighed accurately to at least four significant figures in
water. When dissolution is complete, acidify with 10 mL concentrated HNO3
and dilute to 1,000 mL with water.

9.3.9	Cobalt solution, stock 1 mL = 100 ug Co: Dissolve 0.100 g of cobalt metal,
weighed accurately to at least four significant figures, in a minimum of (1:1)
HNO3. Add 10.0 mL (1:1) HCl and dilute to 1,000 mL with water.

9.3.10	Copper solution, stock, 1 mL = 100 ug Cu: Dissolve 0.13 g CuO (mole fraction
Cu = 0.7989), weighed accurately to at least four significant figures, in a
minimum amount of (1:1) HNO3. Add 10.0 mL concentrated HNO3 and dilute
to 1,000 mL with water.

9.3.11	Iron solution, stock, 1 mL = 100 µg Fe: Dissolve 0.14 g Fe2O3 (mole fraction
Fe = 0.6994), weighed accurately to at least four significant figures, in a warm
mixture of 20 mL (1:1) HCl and 2 mL of concentrated HNO3. Cool, add an
additional 5.0 mL of concentrated HNO3, and dilute to 1,000 mL with water.

9.3.12	Lead solution, stock, 1 mL = 100 µg Pb: Dissolve 0.16 g Pb(NO3)2 (mole
fraction Pb = 0.6256), weighed accurately to at least four significant figures, in a
minimum amount of (1:1) HNO3. Add 10 mL (1:1) HNO3 and dilute to 1,000
mL with water.

9.3.13	Lithium solution, stock, 1 mL = 100 ug Li: Dissolve 0.5324 g lithium carbonate
(mole fraction Li = 0.1878), weighed accurately to at least four significant
figures, in a minimum amount of (1:1) HCl and dilute to 1,000 mL with water.

9.3.14	Magnesium solution, stock, 1 mL = 100 ug Mg: Dissolve 0.17 g MgO (mole
fraction Mg = 0.6030), weighed accurately to at least four significant figures, in
a minimum amount of (1:1) HNO3. Add 10.0 mL (1:1) concentrated HNO3 and
dilute to 1,000 mL with water.

9.3.15	Manganese solution, stock, 1 mL = 100 ug Mn: Dissolve 0.100 g of manganese
metal, weighed accurately to at least four significant figures, in acid mixture (10
mL concentrated HCl and 1 mL concentrated HNO3) and dilute to 1,000 mL
with water.



9.3.16	Molybdenum solution, stock, 1 mL = 100 ug Mo: Dissolve 0.200 g
(NH4)6Mo7O24·4H2O (mole fraction Mo = 0.5772), weighed accurately to at
least four significant figures, in water and dilute to 1,000 mL with water.

9.3.17	Nickel solution, stock, 1 mL = 100 ug Ni: Dissolve 0.100 g of nickel metal,
weighed accurately to at least four significant figures, in 10.0 mL hot
concentrated HNO3, cool and dilute to 1,000 mL with water.

9.3.18	Phosphate solution, stock, 1 mL = 100 ug P: Dissolve 0.4393 g anhydrous
KH2PO4 (mole fraction P = 0.2276), weighed accurately to at least four
significant figures, in water. Dilute to 1,000 mL.

9.3.19	Potassium solution, stock, 1 mL = 100 µg K: Dissolve 0.19 g KCl (mole fraction
K = 0.5244) dried at 110°C, weighed accurately to at least four significant
figures, in water and dilute to 1,000 mL.

9.3.20	Selenium solution, stock, 1 mL = 100 ug Se: Do not dry. Dissolve 0.17 g
H2SeO3 (mole fraction Se = 0.6123), weighed accurately to at least four
significant figures, in water and dilute to 1,000 mL.

9.3.21	Silver solution, stock, 1 mL = 100 µg Ag: Dissolve 0.16 g AgNO3 (mole
fraction Ag = 0.6350), weighed accurately to at least four significant figures, in
water and 10 mL concentrated HNO3. Dilute to 1,000 mL with water.

9.3.22	Sodium solution, stock, 1 mL = 100 ug Na: Dissolve 0.25 g NaCl (mole fraction
Na = 0.3934), weighed accurately to at least four significant figures, in water.
Add 10.0 mL concentrated HNO3 and dilute to 1,000 mL with water.

9.3.23	Strontium solution, stock, 1 mL = 100 ug Sr: Dissolve 0.2415 g of strontium
nitrate [Sr(NO3)2] (mole fraction Sr = 0.4140), weighed accurately to at least four
significant figures, in a 1-liter flask containing 10 mL of concentrated HCl and
700 mL of water. Dilute to 1000 mL with water.

9.3.24	Thallium solution, stock, 1 mL = 100 µg Tl: Dissolve 0.13 g TlNO3 (mole
fraction Tl = 0.7672), weighed accurately to at least four significant figures, in
water. Add 10.0 mL concentrated HNO3 and dilute to 1,000 mL with water.

9.3.25	Vanadium solution, stock, 1 mL = 100 µg V: Dissolve 0.23 g NH4VO3 (mole
fraction V = 0.4356), weighed accurately to at least four significant figures, in a
minimum amount of concentrated HNO3. Heat to increase rate of dissolution.
Add 10.0 mL concentrated HNO3 and dilute to 1,000 mL with water.

9.3.26	Zinc solution, stock, 1 mL = 100 ug Zn: Dissolve 0.12 g ZnO (mole fraction Zn
= 0.8034), weighed accurately to at least four significant figures, in a minimum
amount of dilute HNO3. Add 10 mL concentrated HNO3 and dilute to 1,000
mL with water.

9.4	Mixed calibration standard solutions - Prepare mixed calibration standard solutions by
combining appropriate volumes of the stock solutions with 50 mL of concentrated HNO3
and 50 mL of concentrated HCl in 1000 mL volumetric flasks (see Table 4). Dilute to
1000 mL with water. Add the appropriate types and volumes of acids so that the
standards are matrix matched with the sample digestates. Prior to preparing the mixed
standards, each stock solution should be analyzed separately to check for possible spectral
interferences and/or the presence of impurities. Care should be taken when preparing the
mixed standards to ensure that the elements are compatible and stable together. Transfer
the mixed standard solutions to FEP fluorocarbon or previously unused polyethylene or
polypropylene bottles for storage. Fresh mixed standards should be prepared, as needed,
with the realization that concentration can change on aging. Record all standard
preparation information in the working standard (WS) logbook.

Note: If the addition of silver results in an initial precipitation, add 15 mL of water and
warm the flask until the solution clears. Cool and dilute to 1000 mL with water.
For this acid combination, the silver concentration should be limited to 2 mg/L.
Silver under these conditions is stable in a tap-water matrix for 30 days. Higher
concentrations of silver require additional HC1.

TABLE 4

CALIBRATION AND ICV* STANDARD CONCENTRATIONS (µg/mL)

ELEMENT    Calibration    Calibration    Calibration    Calibration
           Standard #1    Standard #2    Standard #3    Standard #4

Al         0.10           0.50           1.00
Sb         0.10           0.50           1.00
As         0.10           0.50           1.00
Ba         0.10           0.50           1.00
Be         0.10           0.50           1.00
Cd         0.10           0.50           1.00
Ca         0.10           0.50           1.00
Cr         0.10           0.50           1.00
Co         0.10           0.50           1.00
Cu         0.10           0.50           1.00
Fe         0.10           0.50           1.00
Pb         0.10           0.50           1.00
Li         0.10           0.50           1.00
Mg         0.10           0.50           1.00
Mn         0.10           0.50           1.00
Mo         0.10           0.50           1.00
Ni         0.10           0.50           1.00
K          1.00           5.00           10.0
Se         0.10           0.50           1.00
Ag         0.10           0.50           1.00
Na         0.10           0.50           1.00
Sr         0.10           0.50           1.00
Tl         0.10           0.50           1.00
V          0.10           0.50           1.00
Zn         0.10           0.50           1.00
                                                        200.

*ICV concentrations are spiked at levels similar to the calibration standards.

9.5 Two types of blanks are required. The calibration blank is used to establish and verify the
calibrations. The reagent, or preparation, blank is analyzed to check for possible sample
contamination. Potential sources of contamination include: the reagents used in the
sample preparation, and/or contaminated equipment used in the sample preparation
process.



9.5.1	The calibration blank is prepared by diluting 50 mL of concentrated HNO3 and
50 mL of concentrated HCl to 1000 mL with water. This solution should also
be used to flush the system between each standard and sample analysis. The
calibration blank will also be used for all initial and continuing calibration blank
determinations.

9.5.2	The reagent blank must contain all the reagents, and at the same volumes, as
used in the processing of the samples. The reagent blank must be carried
through the entire sample digestion procedure at the same time that the samples
are prepared and contain the same acid concentration in the final solution as the
sample solution used for analysis.

9.6	The initial calibration verification (ICV) check standard must be purchased, or prepared
from stock solutions which are independent of those used for the preparation of the
calibration standards. The acid content in the ICV should be the same as in the
calibration standards and samples. The concentrations of analytes in the ICV must be
different than those used for calibration, but within the linear working range of the
instrument. The ICV check standard concentration is described in Table 4.

9.7	The interference check samples (ICSA and ICSAB) contain known concentrations of
interferents that will provide an adequate test of the correction factors. They are analyzed
to verify the validity of the inter-element correction (IEC) factors. These solutions may
be purchased, or prepared by spiking a blank with the elements of interest, particularly
those with known interferences at 0.5 to 1 mg/L. The acid content in these two solutions
should be the same as in the calibration standards and samples.

9.8	The continuing calibration verification (CCV) check sample should be prepared in the
same acid matrix as the calibration standards and samples with concentrations near the
mid-range of calibration. The CCV may be prepared from the stock solutions used for the
preparation of the calibration standards.

10.0 CALIBRATIONS

10.1	Preliminary treatment of most matrices is necessary because of the complexity and
variability of sample matrices. Groundwater samples that have been prefiltered and
acidified will not need acid digestion. Samples that are not digested must either use
an internal standard or be matrix-matched with the standards. Solubilization and
digestion procedures are described in Section 5.0.

10.2	Operating conditions.

10.2.1 The analyst must follow the instructions in section 11.0, Procedure, or the
manufacturer's recommended conditions, whichever is applicable. When
analyzing samples in organic solvents, solvent-resistant tubing, increased plasma
(coolant) argon flow, decreased nebulizer flow, and increased RF power are
recommended to obtain stable operation and precise measurements. Sensitivity,
instrumental detection limits, precision, linear dynamic ranges, and interference
effects must be established for each individual analyte line on each particular
instrument. All measurements must be within the instrument linear range where
correction factors are valid. The analyst must (1) verify that the instrument
configuration and operating conditions satisfy the analytical requirements and
(2) maintain control data confirming instrument performance and analytical
results.

10.2 Set up the instrument using proper operating parameters as discussed in section 11.0. The
instrument must be allowed to become thermally stable before beginning (usually requires
at least 30 minutes of operation prior to calibration).

10.2.1	Before using this procedure to analyze samples, there must be data available
documenting initial demonstration of performance. The required data document
the selection criteria of background correction points; analytical dynamic ranges,
the applicable equations, and the upper limits of those ranges; the method and
instrument detection limits; and the determination and verification of
interelement correction equations or other routines for correcting spectral
interferences. This data must be generated using the same instrument, operating
conditions and calibration routine to be used for sample analysis. This
documented data must be kept on file and available for review by the data user
or auditor.

10.2.2	Specific wavelengths are listed in Table 2. Other wavelengths may be
substituted if they can provide the needed sensitivity and are corrected for
spectral interference. The instrument and operating conditions utilized for
determination must be capable of providing data of acceptable quality to the
program and data user. The analyst should follow the instruction provided by
the instrument manufacturer unless other conditions provide similar or better
performance for the task. Operating conditions for aqueous solutions usually
vary from 1100 to 1200 watts forward power, 14 to 18 mm viewing height, 15 to
19 liters/min argon coolant flow, 0.6 to 1.5 L/min. argon nebulizer flow, 1 to 1.8
mL/min. sample pumping rate with a 1 minute preflush time and measurement
time near 1 second per wavelength peak for sequential instruments and 10
seconds per sample for simultaneous instruments. For an axial plasma, the
conditions will usually vary from 1100 to 1500 watts forward power, 15 to 19
liters/min. argon coolant flow, 0.6 to 1.5 L/min. argon nebulizer flow, 1 to 1.8
mL/min. sample pumping rate with a 1 minute preflush time and measurement
time near 1 second per wavelength peak for sequential instruments and 10
seconds per sample for simultaneous instruments. Reproduction of the Cu/Mn
intensity ratio at 324.754 nm and 257.610 nm respectively, by adjusting the
argon aerosol flow has been recommended as a way to achieve repeatable
interference correction factors.

10.2.3	The plasma operating conditions need to be optimized prior to use of the
instrument. This routine is not required on a daily basis, but only when first
setting up a new instrument or following a change in operating conditions.
Follow the manufacturer's recommendations or the following procedure. The
purpose of plasma optimization is to provide a maximum signal to background
ratio for some of the least sensitive elements in the analytical array. The use of a
mass flow controller to regulate the nebulizer gas flow or source optimization
software greatly facilitates the procedure.

10.2.3.1 Ignite the radial plasma and select an appropriate incident RF power.
Allow the instrument to become thermally stable before beginning,
about 30 to 60 minutes of operation. While aspirating a 1000 µg/L
solution of yttrium, follow the instrument manufacturer's instructions
and adjust the aerosol carrier gas flow rate through the nebulizer so a
definitive blue emission region of the plasma extends approximately
from 5 to 20 mm above the top of the coil. Record the nebulizer gas
flow rate or pressure setting for future reference. The yttrium solution
can also be used for coarse optical alignment of the torch by observing
the overlay of the blue light over the entrance slit to the optical system.

10.2.3.2	After establishing the nebulizer gas flow rate, determine the solution
uptake rate of the nebulizer in mL/min. by aspirating a known volume
of calibration blank for a period of at least three minutes. Divide the
volume aspirated by the time in minutes and record the uptake rate; set
the peristaltic pump to deliver the rate in a steady even flow.

10.2.3.3	Profile the instrument to align it optically as it will be used during
analysis. The following procedure can be used for both horizontal and
vertical optimization in the radial mode, but is written for vertical.
Aspirate a solution containing 10 µg/L of several selected elements.
These elements can be As, Se, Tl, or Pb as the least sensitive of the
elements and most needing to be optimized or others representing
analytical judgement (V, Cr, Cu, Li and Mn are also used with success).
Collect intensity data at the wavelength peak for each analyte at 1 mm
intervals from 14 to 18 mm above the load coil. (This region of the
plasma is referred to as the analytical zone.) Repeat the process using
the calibration blank. Determine the net signal to blank intensity ratio
for each analyte for each viewing height setting. Choose the height for
viewing the plasma that provides the best net intensity ratios for the
elements analyzed or the highest intensity ratio for the least sensitive
element. For optimization in the axial mode, follow the instrument
manufacturer's instructions.
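
As a rough, hypothetical illustration of the selection rule in this step (the intensity values
and function names below are invented, not measured data), the sketch computes the net
signal-to-blank intensity ratio at each viewing height and selects the height giving the best
ratio for the least sensitive element:

    # Hypothetical intensity data: {height_mm: {element: (analyte_counts, blank_counts)}}
    intensities = {
        14: {"As": (5200.0, 4800.0), "Pb": (6100.0, 5200.0)},
        15: {"As": (5600.0, 4700.0), "Pb": (6500.0, 5100.0)},
        16: {"As": (6000.0, 4600.0), "Pb": (6900.0, 5000.0)},
        17: {"As": (5900.0, 4650.0), "Pb": (6800.0, 5050.0)},
        18: {"As": (5500.0, 4750.0), "Pb": (6400.0, 5150.0)},
    }

    def net_to_blank_ratio(analyte_counts, blank_counts):
        """Net signal (analyte minus blank) divided by the blank intensity."""
        return (analyte_counts - blank_counts) / blank_counts

    # Choose the viewing height that maximizes the ratio for the least
    # sensitive element being optimized (arsenic in this example).
    best_height = max(intensities, key=lambda h: net_to_blank_ratio(*intensities[h]["As"]))
    print(f"Selected viewing height: {best_height} mm above the load coil")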

10.2.3.4	The instrument operating condition finally selected as being optimum
should provide the lowest reliable instrument detection limits and
method detection limits.

10.2.3.5	If either the instrument operating conditions, such as incident power or
nebulizer gas flow rate are changed, or a new torch injector type with a
different orifice internal diameter is installed, the plasma and viewing
height should be re-optimized.

10.2.3.6	After completing the initial optimization of operating conditions, but
before analyzing samples, the laboratory must establish and initially
verify an interelement spectral interference correction routine to be used
during sample analysis. A general description concerning spectral
interference and the analytical requirements for background correction
in particular are discussed in Section 5.0. The criterion for determining an
interelement spectral interference is an apparent positive or negative
concentration for the analyte that falls within ± one reporting limit from
zero. The upper control limit is the analyte instrument detection limit.
Once established the entire routine must be periodically verified every
six months. Only a portion of the correction routine must be verified
more frequently or on a daily basis. Initial and periodic verification of
the routine should be kept on file.



10.2.3.7 Before daily calibration and after the instrument warm-up period, the
nebulizer gas flow rate must be reset to the determined optimized flow.
If a mass flow controller is being used, it should be set to the recorded
optimized flow rate, in order to maintain valid spectral interelement
correction routines. The nebulizer gas flow rate should be the same
(<2% change) from day to day.

10.2.4	For operation with organic solvents, use of the auxiliary argon inlet is
recommended, as are solvent-resistant tubing, increased plasma (coolant) argon
flow, decreased nebulizer flow, and increased RF power to obtain stable
operation and precise measurements.

10.2.5	Sensitivity, instrumental detection limit, precision, linear dynamic range and
interference effects must be established for each individual analyte line on each
particular instrument. All measurements must be within the instrument linear
range where the correction equations are valid.

10.2.5.1	Method detection limits must be established at least annually for all
wavelengths utilized for each type of matrix commonly analyzed. The
matrix used for the MDL calculation must contain analytes of known
concentrations within 3 to 5 times the anticipated detection limit. The
soil MDL concentration will be calculated from the water MDL data
due to lack of a suitable matrix.

10.2.5.2	Determination of limits using reagent water represents a best-case
situation and does not reflect possible matrix effects of real-world
samples.

10.2.5.3	If additional confirmation is desired, re-analyze the seven replicate
aliquots on two more non-consecutive days and again calculate the
method detection limit values for each day. An average of the three
values for each analyte may provide for a more appropriate estimate.
Successful analysis of samples with added analytes or using the method
of standard additions can give confidence in the method detection limit
values determined in reagent water.
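
A minimal sketch of this confirmation, assuming the conventional single-laboratory MDL
formula (MDL = Student's t at 99% confidence times the standard deviation of seven
replicates, t = 3.143 for six degrees of freedom); the replicate results below are hypothetical:

    from statistics import stdev

    T_99_7_REPLICATES = 3.143  # one-sided 99% Student's t for n = 7 (6 degrees of freedom)

    def daily_mdl(replicates):
        """MDL estimate from one day's seven replicate low-level measurements."""
        return T_99_7_REPLICATES * stdev(replicates)

    # Hypothetical lead results (mg/L), seven replicates on three non-consecutive days
    day_results = [
        [0.011, 0.013, 0.010, 0.012, 0.014, 0.011, 0.012],
        [0.012, 0.010, 0.013, 0.011, 0.012, 0.013, 0.011],
        [0.010, 0.012, 0.011, 0.013, 0.012, 0.011, 0.012],
    ]

    daily_values = [daily_mdl(day) for day in day_results]
    print("Daily MDLs:", [round(v, 4) for v in daily_values])
    print("Averaged MDL estimate:", round(sum(daily_values) / len(daily_values), 4))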

10.2.5.4	The upper limit of the linear dynamic range must be established for
each wavelength utilized by determining the signal responses from a
minimum of three, and preferably five, different concentration
standards across the range. One of these should be near the upper limit
of the range. The ranges, which may be used for the analysis of
samples, should be judged by the analyst from the resulting data. The
data, calculations and rationale for the choice of range made should be
documented and kept on file. The upper range limit should be an
observed signal no more than 10 % below the level extrapolated from
lower standards. Determined analyte concentrations that are above the
upper range limit must be diluted and re-analyzed. The analyst should
also be aware that if an interelement correction from an analyte above
the linear range exists, a second analyte where the interelement
correction has been applied may be inaccurately reported. New
dynamic ranges should be determined whenever there is a significant
change in instrument response. For those analytes that periodically
approach the upper limit, the range should be checked every six
months. For those analytes that are known interferences, and are
present at or above the linear range, the analyst should ensure that the
interelement correction has not been inaccurately applied.

Note: Many of the alkali and alkaline earth metals have non-linear
response curves due to ionization and self-absorption effects.
These curves may be used if the instrument allows; however, the
effective range must be checked and the second order curve fit
should have a correlation coefficient of 0.995 or better. Third
order fits are not acceptable. These non-linear response curves
should be re-validated and re-calculated every six months.
These curves are much more sensitive to changes in operating
conditions than the linear lines and should be checked whenever
there have been moderate equipment changes.

10.2.5.5 The analyst must (1) verify that the instrument configuration and
operating conditions satisfy the analytical requirements and (2)
maintain quality control data confirming instrument performance and
analytical results.

10.3	Profile and calibrate the instrument daily, according to the instrument manufacturer's
recommended procedures using the typical mixed calibration standard solutions described
in Section 9.4. Flush the system with the calibration blank (Step 9.5.1) between each
standard and sample. (Report the average intensity of multiple exposures for both
standardization and sample analysis to reduce random error.) The calibration curve must
consist of a minimum of a blank and a standard.

10.4	For all analytes and determinations, the laboratory must analyze an ICV (Section 9.6), a
calibration blank (Section 9.5.1), and a continuing calibration verification (CCV) (Section
9.8) immediately following daily calibration. A calibration blank and either a CCV or an
ICV must be analyzed after every tenth sample and at the end of the sample run. Analysis
of the check standard and calibration verification must verify that the instrument is within
±10% of the calibration with relative standard deviation <5% from replicate
two) integrations. If the calibration cannot be verified within the specified limits, the
sample analysis must be discontinued, the cause determined and the instrument re-
calibrated. All samples following the last acceptable ICV, CCV, or check standard must
be re-analyzed. The analysis data of the calibration blank, check standard, and ICV or
CCV must be kept on file with the sample analysis data.
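
The acceptance test in this step reduces to two numeric checks. The following sketch is
illustrative only (the function name, default limits, and readings are assumptions): it flags a
check standard whose mean is outside ±10% of the true value or whose relative standard
deviation from replicate integrations is not below 5%.

    from statistics import mean, stdev

    def verification_ok(replicate_readings, true_value,
                        recovery_window=0.10, max_rsd_percent=5.0):
        """Return True if the check standard meets the Section 10.4 criteria."""
        avg = mean(replicate_readings)
        rsd = 100.0 * stdev(replicate_readings) / avg if len(replicate_readings) > 1 else 0.0
        within_recovery = abs(avg - true_value) <= recovery_window * true_value
        return within_recovery and rsd < max_rsd_percent

    # Example: duplicate integrations of a 0.50 µg/mL CCV
    print(verification_ok([0.52, 0.51], true_value=0.50))  # True  -> continue the run
    print(verification_ok([0.58, 0.57], true_value=0.50))  # False -> recalibrate, re-analyze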

10.5	Rinse the system with the calibration blank solution (Section 9.5.1) before the analysis of
each sample. The rinse time will be one minute. Each laboratory may establish a
reduction in this rinse time through a suitable demonstration.

10.6	The MSA should be used if an interference is suspected or a new matrix is encountered.
When the method of standard additions is used, standards are added at one or more levels
to portions of a prepared sample. This technique compensates for enhancement or
depression of an analyte signal by a matrix. It will not correct for additive interferences,
such as contamination, interelement interferences or baseline shifts. This technique is
valid in the linear range when the interference effect is constant over the range, the added
analyte responds the same as the endogenous analyte, and the signal is corrected for
additive interferences. The simplest version of this technique is the single addition
method. This procedure calls for two identical aliquots of the sample solution to be
taken. To the first aliquot, a small volume of standard is added; while to the second
aliquot, a volume of acid blank is added equal to the standard addition. The sample
concentration is calculated by: multiplying the intensity value for the unfortified aliquot
by the volume (liters) and concentration (mg/L or mg/kg) of the standard addition to make
the numerator; the difference in intensities for the fortified sample and unfortified sample
is multiplied by the volume (liters) of the sample aliquot for the denominator. The
quotient is the sample concentration. For more than one fortified portion of the prepared
sample, linear regression analysis can be applied using the computer software program to
obtain the concentration of the sample solution.
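
For the single-addition case described above, the following sketch (hypothetical variable
names and values) carries out the stated calculation: the unfortified-aliquot intensity times
the spike volume and spike concentration forms the numerator, and the intensity difference
between the fortified and unfortified aliquots times the sample aliquot volume forms the
denominator.

    def msa_single_addition(intensity_unspiked, intensity_spiked,
                            spike_volume_l, spike_conc, sample_volume_l):
        """Sample concentration by the single standard-addition method (Section 10.6).

        spike_conc is in mg/L (or mg/kg); the result carries the same units.
        """
        numerator = intensity_unspiked * spike_volume_l * spike_conc
        denominator = (intensity_spiked - intensity_unspiked) * sample_volume_l
        return numerator / denominator

    # Hypothetical example: 0.100 L aliquots and 0.005 L of a 10 mg/L standard addition
    print(msa_single_addition(intensity_unspiked=1500.0, intensity_spiked=2250.0,
                              spike_volume_l=0.005, spike_conc=10.0,
                              sample_volume_l=0.100))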

11.0 EXAMPLE OF THE ANALYTICAL PROCEDURE

11.1	NOTE: Inexperienced analysts should not attempt to operate the ICP without the
supervision of a trained analyst. Many components of the instrument, especially the
sample introduction system and torch assembly, are easily damaged. Improper use of the
instrument may result in very costly repairs and extended down-time.

SUMMARY: The optical spectrometer measures element-emitted light. Samples are
nebulized and the resulting aerosol is transported to the plasma torch. Element-specific
atomic-line emission spectra are induced by a radio-frequency inductively coupled
plasma. The spectra are dispersed by a grating spectrometer, and the intensities of the
lines are monitored by photomultiplier tubes. Background correction is required for trace
element determination. Background must be measured adjacent to analyte lines during
sample analysis. The position selected for the background-intensity measurement, on
either or both sides of the analytical line, will be determined by the complexity of the
spectrum adjacent to the analyte line. The position used must be free of spectral
interference and reflect the same change in background intensity as occurs at the analyte
wavelength measured. Background correction is not required in cases of line broadening
where a background correction measurement would actually degrade the analytical
results. The possibility of additional interferences should also be investigated, and
appropriate corrections made.

11.2	Instrument Startup. To turn the instrument on, locate the surge protector underneath the
computer table, and turn it on. This surge protector turns on the computer, monitor and
printer. The spectrometer and radio frequency (RF) generator should already be in
standby mode. ThermoSPEC, which is a version of the ICP operations software, should
automatically load and display the main menu bar. If an error message is displayed,
notify the Section Manager before proceeding. Continuing without first correcting the
error could result in the loss of data.

11.3	Inspect the spray chamber and baffle for any residue left from previous analyses. If
residue is observed, disassemble the spray chamber and clean the components with soap
and water. Perform a final rinse with deionized water. Assemble the spray chamber and
install it on the instrument.

11.4	The nebulizer should be cleaned every other day. A second nebulizer may be found in
the spectroscopy prep lab. This nebulizer is sitting in a 400 mL beaker, which contains a
2 % nitric acid solution. Placing the nebulizer in a dilute nitric acid solution for a few
days safely and effectively leaches any built up residue, which would otherwise degrade
instrument performance. The beaker is in the fume hood. Take the nebulizer presently
sitting in the beaker, rinse it with deionized water and install it on the ICP. Place the
other nebulizer in the acid solution and leave it there until it is needed.



11.5	The peristaltic pump tubing should also be inspected. Any tubing that has "yellowed" or
become crimped should be replaced to ensure a consistent sample uptake rate.

11.6	If the autosampler is to be used, make sure that the rinse solution bottle is full and the
waste bottle is empty.

11.7	Record all instrument maintenance in the maintenance log.

11.8	After ensuring that the sample introduction system is in good working order, the plasma
may be ignited. A macro program has been set up on the computer to ignite the plasma,
set the proper analysis parameters, and to turn on the peristaltic pump.

11.9	Place the sampling probe into the calibration blank or cadmium profile solution.

11.10	Press and hold the Ctrl key and then press the F3 key. This will display the macro
command line.

11.11	Type On and then press the Enter key. The plasma will ignite in approximately two
minutes. If an error message is displayed, notify the Section Manager before proceeding.
Do not attempt manual ignition as serious damage may result if done improperly.

11.12	Check the spray chamber for an even sample aerosol flow. If a sporadic mist or no mist is
observed, check for improperly connected tubing. Also, inspect the pump tubing for
excessive wear, and check the pump clamp for proper tension. If the problem is not
obvious, ask the Section Manager for assistance.

11.13	It is important to maintain an aerosol flow through the torch at all times. Make sure that
the rinse solution bottle does not run dry. Do not allow the sample introduction system to
aspirate air, except during the time it takes to move the sampling probe from one solution
to the next.

11.14	Once the plasma has ignited and the sample introduction system is performing properly,
allow the instrument to warm up and stabilize for thirty minutes. If the plasma does not
ignite, ask the Section Manager for assistance.

11.15	Press the SELECT button on the printer. Press the TYPE STYLE button a few times
until the Draft Gothic LED is illuminated. Press the SELECT button again to bring the
printer back on-line.

11.16	After the instrument has stabilized it must be profiled. Profiling aligns the spectrometer
optics with respect to the detector array. Alignment ensures that the correct analyte
spectral line is received by each detector and that instrument sensitivity is optimized.

11.17	Place the sampling probe into the profile solution.

11.18	Highlight the Analysis prompt on the main menu and press the Enter key. This will
display the method command line.

11.19	Type the appropriate method and press Enter. The method screen will be displayed,
which contains an options menu in the lower right corner.

11.20	Press F5 which will select the profile option. Then press F3 to perform an automatic
profile and F1 to start the scan.



11.21	Once scanning is completed, a peak profile and its peak position will be displayed. The
peak profile should be a symmetric bell shape. If a non-symmetric peak is observed, a
problem with the sample introduction system should be suspected. Repeat step 11.12,
and then step 11.20.

11.22	Once a proper bell-shaped peak is observed, peak profiling may be performed. The
spectrometer is considered properly profiled if the peak position is within +/- 0.05
nanometers of the profile line position.

11.23	If the peak position is within +/- 0.05 nanometers of the profile line, proceed to step
11.24. Otherwise, adjust the vernier position and repeat step 11.20. The vernier position
should be adjusted to a higher value if the peak position is negative, and should be
adjusted to a lower value if the peak position is positive. The vernier position adjustment
and automatic profile may have to be done several times before achieving a peak position
within +/- 0.05 nanometers of the profile line. For best results, change the vernier
position a little at a time.

11.24	Once an acceptable peak position is attained, press the F9 key.

11.25	The instrument is now ready to be calibrated. Calibration can be done manually or using
the autosampler. For manual calibration press the F3 key.

11.26	Place the sample probe into the blank solution. Using the arrow keys on the lower-right
corner of the keypad, highlight STD1-Blank and press F1 to begin the scan.

11.27	When the analysis is completed, the results will be displayed on the screen. Press F9 to
accept and print the results. This will automatically return you to the screen where the
standards are displayed.

11.28	Using the arrow keys, select the next standard to be analyzed, if applicable. Place the
sample probe in the standard solution and press F1 again to begin the scan, and then F9 to
accept and print the results.

11.29	After analyzing all of the applicable standards, press F9 to accept the calibration and
return to the method screen. Check the relative standard deviation of the standard
intensity measurements. Generally, 1% RSD is observed. If significantly larger RSDs
are observed, problems in the sample introduction system should be suspected. Repeat
step 11.12, and then recalibrate the instrument by repeating steps 11.24 through 11.27.

11.30	Once an acceptable calibration is obtained, record the standard intensities for copper and
silver in the calibration log. Place the sample probe in the blank solution. If the
intensities are significantly different (> +/- 15 %) than the values obtained over the last
several days, report this observation to the Section Manager before proceeding to step
11.31.

11.31	The instrument is now ready for sample analysis. Sample analysis can be done manually
or using the autosampler. The autosampler is recommended for large groups of samples
when the matrices are not expected to be complicated. For small groups of samples or
samples with suspected difficult matrices, manual analysis is recommended. Place the
sample probe in the solution to be analyzed.



11.32	If the solution to be analyzed is a sample, select Fl. If the solution is a QC standard,
press F2.

11.33	A sample information page will be displayed on which the sample name, analyst's initials,
and correction factor, if applicable, should be entered.

11.34	For all quality control standards a check table has been created which will automatically
validate the QC standard data against the established control limits. Press F2 to recall the
QC check table screen. Enter the appropriate check table name opposite the QC CHK
TABLE prompt. If necessary, consult with the Team Leader or Section Manager to
obtain a listing of the check table names. Press F9.

11.35	Press Fl to begin the scan. When the analysis is complete, the results will automatically
be displayed on screen and printed.

11.36	The samples and QC standards must be run in the proper sequence and at the proper
frequency as defined in Sections 10.0 (Calibrations) and 13.0 (Quality Assurance Provisions). If any
QC results fall outside of the control limits for the ICV/ICB, CCV/CCB, Prep Blank, or
ICSAB analyses, the run must be terminated, the problem must be corrected, and any
samples not bracketed by valid QC results must be re-analyzed.

11.37	After completing all analyses for the day the instrument must be shut down. A macro has
been set up to turn off the plasma, printer and the peristaltic pump. Press and hold the
Ctrl key and then press the F3 key. This will display the macro command line. Type
OFF and then press the Enter key. The instrument will shut down in approximately 30
seconds.

11.38	Once the plasma is off press the ESC key to return to the Main Menu.

11.39	Use the right arrow key until the Exit prompt is highlighted. Use the down arrow key to
highlight the Quit to DOS prompt. Press the Enter key.

11.40	When the screen goes black turn off the surge protector under the table.

11.41	Open the torch box door and release the peristaltic pump tubing clamps. This will extend
the life of the tubes.

11.42	All QC results are entered in the QC database.

12.0 CALCULATIONS

12.1 Results from the instrument are reported in µg/mL (equal to mg/L) of the prepared solution.
To obtain the analyte concentrations in the original sample, preparation weights and
dilution volumes must be taken into consideration, where applicable. Examples of proper
dilution correction and units conversion are shown below.

For solid samples:

    Sample Concentration (µg/g) = [(µg/mL) from instrument] x [final volume (mL)] / [sample weight (g)]

For aqueous samples:

    Sample Concentration (µg/L) = [(µg/mL) from instrument] x [final volume (mL)] x [1000 (mL/L)] / [sample aliquot (mL)]



For filter and wipe samples:

    Sample Concentration (µg/sample) = [(µg/mL) from instrument] x [final volume (mL)]
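
A minimal sketch of the three dilution corrections above (function names and the example
values are illustrative, not part of the method):

    def solid_conc_ug_per_g(instrument_ug_per_ml, final_volume_ml, sample_weight_g):
        """Solid samples: µg/g in the original sample."""
        return instrument_ug_per_ml * final_volume_ml / sample_weight_g

    def aqueous_conc_ug_per_l(instrument_ug_per_ml, final_volume_ml, aliquot_ml):
        """Aqueous samples: µg/L in the original sample."""
        return instrument_ug_per_ml * final_volume_ml * 1000.0 / aliquot_ml

    def wipe_conc_ug_per_sample(instrument_ug_per_ml, final_volume_ml):
        """Filter and wipe samples: total µg per sample."""
        return instrument_ug_per_ml * final_volume_ml

    # Example: a dust wipe digested to a 50 mL final volume reading 2.4 µg/mL
    print(wipe_conc_ug_per_sample(2.4, 50.0))  # 120 µg of lead per wipe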

12.2	Determination of dry weight fraction.

12.2.1	Weigh 5 to 10 grams of sample onto a preweighed watch glass. Dry in oven
overnight at 105 degrees Celsius. Cool in a desiccator before final weighing. To
determine the dry weight concentration of analyte in the sample, divide the
sample results by the dry weight fraction.

    % Dry Weight = 100 x [weight of dry sample] / [weight of sample]

12.3	Percent recovery calculation:

    Percent Recovery = 100 x (Measured Value / Target Value)

    Matrix Spike Percent Recovery = 100 x (Spiked Sample Value - Unspiked Sample Value) / Spike Amount

12.4 Precision calculation:

    Relative Percent Difference = 100 x |V1 - V2| / [(V1 + V2) / 2]

    where V1, V2 = found concentrations
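
The remaining Section 12 formulas can be expressed the same way; the sketch below
(illustrative names and hypothetical values) implements the dry-weight, percent-recovery,
matrix spike recovery, and relative percent difference calculations exactly as written above.

    def percent_dry_weight(dry_weight_g, wet_weight_g):
        return 100.0 * dry_weight_g / wet_weight_g

    def percent_recovery(measured, target):
        return 100.0 * measured / target

    def matrix_spike_recovery(spiked_result, unspiked_result, spike_amount):
        return 100.0 * (spiked_result - unspiked_result) / spike_amount

    def relative_percent_difference(v1, v2):
        return 100.0 * abs(v1 - v2) / ((v1 + v2) / 2.0)

    # Examples with hypothetical values
    print(percent_dry_weight(8.2, 10.0))            # 82.0 % dry weight
    print(percent_recovery(0.48, 0.50))             # 96.0 % recovery of a check standard
    print(matrix_spike_recovery(1.45, 0.95, 0.50))  # 100.0 % matrix spike recovery
    print(relative_percent_difference(0.52, 0.48))  # 8.0 % RPD between duplicates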

13.0 QUALITY ASSURANCE PROVISIONS

13.1	All quality control data must be maintained and readily available for reference and for
auditing purposes.

13.2	Check the instrument standardization by analyzing appropriate check standards as
follows:

13.2.1 Verify the instrument calibration by analyzing a high standard (1 ppm) as a
sample. Results obtained from this analysis must agree within +/- 5% of the true
value for each reported analyte. If this criterion is not satisfied, terminate the
analysis, correct the problem, and re-calibrate the instrument.



13.2.2	Verify the instrument calibrations using the ICV and ICB standards described in
Sections 9.5 and 9.6. Results obtained from the analysis of the ICV must be
within +/-10 % of the true values for all analytes. If not, terminate the analysis,
correct the problem, and re-calibrate the instrument. The results of the ICB are
to agree within three standard deviations of the mean blank value. If not, repeat
the analysis two more times and average the results. If the average is not within
three standard deviations of the background mean, terminate the analysis, correct
the problem, and recalibrate.

13.2.3	Verify the stability of the calibration every 10 samples and at the end of the
analytical run, using the CCB (Section 9.5.1) and the CCV (Section 9.8)
standards. The results of the CCV analyses must agree to within ±10% of the
true values. If not, terminate the analysis, correct the problem, re-calibrate the
instrument and re-analyze the previous 10 samples. The results of the CCB are
to agree within three standard deviations of the mean blank value. If not, repeat
the analysis two more times and average the results. If the average is not within
three standard deviations of the background mean, terminate the analysis, correct
the problem, recalibrate, and re-analyze the previous 10 samples.

13.3	Verify the interelement and background correction factors at the beginning of the
analytical run or twice during every 8-hour work shift, whichever is more frequent, using
the ICSA and ICSAB standards. The results obtained for all analytes in ICSAB must
agree to within +/- 20 % of the true value. If not, terminate the analysis, correct the
problem, re-calibrate the instrument, and re-analyze all samples since the last valid
ICSAB analysis.

13.4	Employ a minimum of one reagent blank per sample digestion batch to determine if
contamination or memory effects are occurring. A reagent blank is a volume of reagent
water acidified with the same amounts of acid as were added to the standards and samples
(Section 9.5.2).

The reagent blank control limits and corrective actions are as follows:

Control limits: Less than the highest of either:

(1)	The method detection limit,

(2)	Five percent of the regulatory limit for that analyte, or

(3)	Five percent of the measured concentration in the
sample.

Corrective Actions:

1)	Check for calculation errors, instrument performance

2)	Re-analyze blank and samples

3)	Re-prepare and re-analyze samples

4)	Flag data

13.5	Prepare and analyze one laboratory control sample (LCS) per sample batch or per new
matrix type, whichever is more frequent. A solid LCS is a reference standard of similar
matrix as the samples. An aqueous LCS should contain the same reagents used to prepare
aqueous samples and known amounts of certified stock standards (see Table 5). The ICV
solution may be used for the aqueous LCS. The results obtained for solid and aqueous
LCSs must be within the laboratory-specified control limits. If there is insufficient data to
generate control limits (minimum 20 analyses), the results must be within +/- 20 % of the
true value for aqueous LCS's, and within the vendor specified control limits for solid
LCS's. If any reported analytes fall outside of the control limits for the LCS, the problem
must be corrected, and the associated samples must be re-prepped and re-analyzed for
those analytes.

13.6 Analyze one pair of matrix spike samples (MS and MSD) per analytical batch or per new
matrix type, whichever is more frequent. An MS/MSD pair consists of duplicate aliquots of one of
the samples, spiked with known amounts of analytes (see Table 5), and brought through
the entire sample preparation and analysis process. Recoveries should be within the
established control limits, when the sample result does not exceed 4x the spike added. If
there is insufficient data to generate control limits (minimum 20 analyses), advisory limits
of +/- 25% will be used. If any reported analytes fall outside of the control limits, a matrix
effect should be suspected and a post-digestion spike should be performed as described in
section 13.8.2. At the client's request, flag all samples associated with the out of control
matrix spike results. The relative percent difference (RPD) of the duplicate analyses,
when both the matrix spike and duplicate results are greater than or equal to 10 times the
detection limit, should be less than the laboratory established control limits. If there is
insufficient data to generate control limits (minimum 20 analyses), advisory limits of
<20% will be used. If the duplicate analysis exceeds the control limit for any analytes,
then a heterogeneous sample should be suspected. At the client's request, flag all samples
associated with the out of control duplicate results.

RPD	=	| (D1 - D2) / ((D1 + D2) / 2) | x 100

where:

RPD	=	relative percent difference

D1	=	first sample value

D2	=	second sample value

13.8	Dilute and re-analyze samples that exceed the linear calibration range or use an alternate,
less sensitive line for which quality control data is already established.

13.9	It is recommended that whenever a new or unusual sample matrix is encountered, a series
of tests be performed prior to reporting concentration data for analyte elements. These
tests, as outlined in steps 13.9.1 and 13.9.2, will assure the analyst that neither positive
nor negative interferences are operating on any of the analyte elements to distort the
accuracy of the reported value.

13.9.1 Serial dilution: If the analyte concentration is sufficiently high (minimally, a
factor of 10 above the instrumental detection limit after dilution), an analysis of a
1:4 dilution should agree within +/- 10% of the original determination. If not, a
chemical or physical interference effect should be suspected. At the client's
request, flag all samples associated with the out of control serial dilution results.

13.9.2 A post-digestion matrix spike should be performed for any analytes (exception:
Ag) for which the pre-digestion matrix spike recovery did not fall within control
limits, and the sample result did not exceed 4x the spike added. An analyte spike
added to a portion of a prepared sample, or its dilution, when applicable, should
be recovered to within 75% to 125% of the true value. The concentration of the
spike addition should be between 10 and 100 times the detection limit. If
the spike is not recovered within the specified limits, a matrix effect should be
suspected and flagged as such.



CAUTION: If spectral overlap is suspected, use of computerized compensation,
an alternate wavelength, or comparison with an alternate method is
recommended.

13.10 For dust wipe and filter samples only, analyze a laboratory control spike sample (LCS)
and duplicate (LCSDup.). Prepare the LCS/LCSDup. by aliquoting equal amounts of
standard solution onto blank sample collection media. Analyze the pair with the
frequency of one pair per batch of samples using the control limits established by the
laboratory.

13.12	An IDL study must be performed semi-annually, or every time the instrument is adjusted
in a way which may affect the IDL's, whichever is more frequent. The IDL's are
determined by first creating a standard which contains all of the analytes at concentrations
between 3x and 5x the instrument manufacturer's suggested IDL's. This standard is then
analyzed, under normal operating conditions, seven consecutive times per day, on three
non-consecutive days. Each analysis must be performed in the same manner as typical
analytical samples are measured, including rinsing between analyses with the reagent
blank. The standard deviations obtained from the three sets of seven analyses, for each
analyte, are averaged. The IDL's are obtained by multiplying by three the average of the
three standard deviations for each analyte.
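
A short sketch of the arithmetic in this step (the replicate results are hypothetical): the
standard deviation of each day's seven analyses is computed, the three standard deviations
are averaged, and the average is multiplied by three.

    from statistics import stdev

    def instrument_detection_limit(three_days_of_seven):
        """IDL per Section 13.12: 3 x the average of three daily standard deviations."""
        daily_sds = [stdev(day) for day in three_days_of_seven]
        return 3.0 * (sum(daily_sds) / len(daily_sds))

    # Hypothetical low-level lead results (µg/mL), seven runs/day on three non-consecutive days
    lead_runs = [
        [0.021, 0.023, 0.020, 0.022, 0.024, 0.021, 0.022],
        [0.022, 0.020, 0.023, 0.021, 0.022, 0.023, 0.021],
        [0.020, 0.022, 0.021, 0.023, 0.022, 0.021, 0.022],
    ]
    print(round(instrument_detection_limit(lead_runs), 4), "µg/mL")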

13.13	On a semi-annual basis, the linear range of each analyte must be confirmed or every time
the instrument is adjusted in a way which may affect the linear range, whichever is more
frequent. This is accomplished by analyzing a linear range verification check standard
during a routine analytical run. The results obtained for all analytes must be within +/- 5
% of the true value. Otherwise, the problem must be corrected and the standard re-
analyzed. The concentrations of each analyte in this check standard are the highest
concentrations which can be reported in samples or QC standards. When results are
obtained which exceed these values, the sample or QC standard must be diluted and re-
analyzed.

13.14	Inter-element correction (IEC) factors, which compensate for spectral interferences on
analyte wavelengths, must be determined semi-annually, or every time the instrument is
adjusted in a way that would affect the correction factors, whichever is more frequent.
The validity of the IECs is verified by analyzing the ICSAB solution described in section
13.3; the results obtained for all analytes in this sample must be within +/- 20% of the
true value. When out-of-control results are obtained and the bracketing ICV/ICB and/or
CCV/CCB results are within control limits, erroneous IECs are the probable cause. The
analysis must be terminated, the problem corrected, and any samples not bracketed by
valid ICSAB results re-analyzed.
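
For illustration only (not part of the method), a minimal sketch of how inter-element
correction factors of the kind discussed in 13.14 are commonly applied: the apparent analyte
concentration is reduced by each interferent's concentration multiplied by its correction
factor. The correction form, the function name, and the numeric values in the example are
assumptions, not values from this SOP.

    def apply_iec(apparent_conc, interferent_concs, iec_factors):
        """interferent_concs and iec_factors are dicts keyed by interferent element."""
        correction = sum(iec_factors[el] * conc for el, conc in interferent_concs.items())
        return apparent_conc - correction

    # Hypothetical example: correcting an apparent Pb result for an Al contribution.
    pb_corrected = apply_iec(0.120, {"Al": 250.0}, {"Al": 0.0002})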

13.15	Responsibility for inspection.

13.15.1	The Section Manager, or designee, is responsible for inspecting the work
performed by the analysts to verify completeness, accuracy, and compliance with
the referenced methods. The analysts are responsible for maintaining complete
and detailed logbooks. The Section Manager is responsible for reviewing,
signing, and dating all completed logbook pages.

13.15.2	The analysts performing these procedures have the responsibility of inspecting
the sample containers for damage and for proper sample labeling. Any
nonconformances must be documented on a Nonconformance/Corrective
Action form as described in the standard operating procedure for
nonconformances. The Section Manager must be notified for further instructions
and for client notification.

14.0 REPORTING RESULTS

14.1	Results should be reported in the units and format specified by the client or contract.

14.2	It is the responsibility of the Section Manager, or designee, to check the final report for
transcription errors, proper rounding of numbers and correct number of significant
figures, compliance with the method, and compliance with the requirements listed in
section 14.1.

14.3	All validated reports must be signed by the reviewer.

15.0 PREVENTIVE MAINTENANCE

15.1 Preventive maintenance should be performed in accordance with the instrument
manufacturer's recommendations. All service and maintenance performed is to be
recorded in the appropriate equipment maintenance logbook. Refer to the preventive
maintenance standard operating procedure for specifics.

16.0 DIAGRAMS OR TABLES

16.1	Table 1:	Recommended Sampling Volumes for Metals Analysis

16.2	Table 2:	Recommended Wavelengths and Estimated Instrument Detection Limits

16.3	Table 3:	Example of Potential Interferences

16.4	Table 4:	Calibration and ICV Standard Concentrations

16.5	Table 5:	LCS and MS Spiking Information

16.6	Table 6:	Method 6010B LCS and MS/MSD Recovery Limits

16.7	Table 7:	Reporting Limits

17.0 REFERENCES

17.1	Winge, R.K.; Peterson, V.J.; Fassel, V.A. Inductively Coupled Plasma-Atomic Emission
Spectroscopy: Prominent Lines (Final Report, March 1977-February 1978); EPA-600/4-
79-017, Environmental Research Laboratory, Athens, GA, March 1979; Ames
Laboratory: Ames, IA.

17.2	Test Methods: Methods for Organic Chemical Analysis of Municipal and Industrial
Wastewater, U.S. Environmental Protection Agency. Office of Research and
Development. Environmental Monitoring and Support Laboratory. ORD Publication
Offices of Center for Environmental Research Information: Cincinnati, OH, 1982; EPA-
600/4-82-057.



17.3	Patel, B.K.; Raab, G.A.; et al. Report on a Single Laboratory Evaluation of Inductively
Coupled Optical Emission Method 6010; EPA Contract No. 68-03-3050, December
1984.

17.4	Sampling and Analysis Methods for Hazardous Waste Combustion; U.S. Environmental
Protection Agency, Air and Energy Engineering Research Laboratory, Office of Research
and Development: Research Triangle Park, NC, 1986; Prepared by Arthur D. Little, Inc.

17.5	Boumans, P.W.J.M. Line Coincidence Tables for Inductively Coupled Plasma Atomic
Emission Spectrometry, 2nd ed.; Pergamon: 1984.

17.6	Rohrbough, W.G.; et al. Reagent Chemicals, American Chemical Society Specifications,
7th ed.; American Chemical Society: Washington, DC, 1986.

17.7	1985 Annual Book of ASTM Standards, Vol. 11.01; "Standard Specification for Reagent
Water;" ASTM: Philadelphia, PA, 1985; D1193-77.

17.8	"Test Methods for Evaluating Solid Waste Physical/Chemical Methods," Version 2,
USEPA SW-846, December 1997.



TABLE 5

LCS AND MS SPIKING INFORMATION

Method: 6010B          Soil Amount Spiked**: 0.5 mL          Water Amount Spiked: 0.1 mL

Analyte         Concentration (µg/mL)

Al              200
Cd              5
Pb              50
V               50
Sb              50
Cr              20
Mn              50
Zn              50
As              200
Co              50
Ni              50
Ba              200
Cu              25
Se              200
Be              5
Fe              100
Tl              200
Ag              5

**Soil LCS is purchased pre-spiked by the vendor.

Spiking solution is purchased at the above-listed concentrations from the vendor.



TABLE 6

METHOD 6010B LCS AND MS/MSD RECOVERY LIMITS

Analyte         WATER                                      SOIL
                % LCS        % MS/MSD     RPD              % LCS        % MS/MSD     RPD
                Recovery     Recovery     Limits           Recovery     Recovery     Limits

Al              82-136       80-136       63               57-124       --           63
Sb              85-115       56-148       25               17-138       32-120       25
As              76-138       89-125       65               79-115       40-147       34
Ba              71-147       69-134       68               56-135       24-166       49
Be              86-111       85-128       75               78-112       11-180       25
Cd              62-152       81-125       41               68-103       47-129       69
Ca              --           --           --               73-102       --           50
Cr              84-125       70-135       82               74-109       23-162       46
Co              86-110       83-124       40               63-88        19-169       39
Cu              84-116       83-136       52               83-113       34-166       69
Fe              51-154       74-127       38               74-150       --           66
Pb              82-136       73-136       30               68-108       10-186       52
Mg              --           --           --               83-114       --           42
Mn              75-136       57-145       14               81-107       --           55
Mo              --           --           --               76-102       --           57
Ni              79-114       81-125       65               78-119       18-180       31
K               --           --           --               58-162       --           83
Se              59-163       86-126       67               73-106       23-153       25
Ag              62-145       73-126       85               84-123       24-166       96
Na              --           --           --               71-128       --           65
Sr              --           --           --               95-197       --           30
Tl              40-140       54-138       77               37-131       23-157       25
V               65-124       78-135       55               78-114       36-193       51
Zn              73-142       55-151       65               74-103       30-194       54

-- indicates that the compound is not spiked.

Note: Control limits are subject to revision annually as per laboratory requirements.
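
For illustration only (not part of the SOP), a minimal sketch of how laboratory results might
be screened against the Table 6 limits. The two dictionary entries shown are copied from the
WATER columns of Table 6; the function names and the screening logic are assumptions.

    WATER_LIMITS = {  # analyte: (LCS recovery range, MS/MSD recovery range, RPD limit)
        "Pb": ((82, 136), (73, 136), 30),
        "Cd": ((62, 152), (81, 125), 41),
    }

    def rpd(ms, msd):
        """Relative percent difference between MS and MSD results."""
        return abs(ms - msd) / ((ms + msd) / 2.0) * 100.0

    def ms_msd_in_control(analyte, ms_rec, msd_rec, ms, msd, limits=WATER_LIMITS):
        """True if both recoveries and the RPD fall within the tabulated limits."""
        _, (lo, hi), rpd_limit = limits[analyte]
        return lo <= ms_rec <= hi and lo <= msd_rec <= hi and rpd(ms, msd) <= rpd_limit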



TABLE 7

METHOD 6010B REPORTING LIMITS

Analyte         WATER (µg/L)         SOIL (mg/kg)

Al              200.                 20.
Sb              30.                  3.0
As              50.                  5.0
Ba              200.                 10.
Be              4.0                  0.40
Cd              5.0                  0.50
Ca              200.                 20.
Cr              10.                  1.0
Co              50.                  5.0
Cu              25.                  2.5
Fe              100.                 10.
Pb              15.                  1.5
Mg              200.                 20.
Mn              15.                  1.5
Mo              50.                  5.0
Ni              40.                  4.0
K               200.                 20.
Se              30.                  3.0
Ag              10.                  1.0
Na              200.                 20.
Sr              50.                  5.0
Tl              30.                  3.0
V               50.                  5.0
Zn              50.                  5.0

The reporting limits listed are routinely used by the laboratory. Other reporting limits may be used
to fulfill individual project requirements but must be supported by the laboratory's verified method
detection limit study.


-------
APPENDIX B

ELPAT CERTIFICATE OF ANALYSIS SHEET

Supplied by: American Industrial Hygiene Association


-------
ELPAT ROUND 36

ENVIRONMENTAL LEAD PROFICIENCY ANALYTICAL TESTING PROGRAM

CERTIFICATE OF ANALYSIS



Sample       Reference
Number       Value          STD        RSD%       Lower Limit    Upper Limit

PAINT CHIPS (%)
1            1.5576         0.094      6.0        1.2763         1.8389
2            3.2953         0.219      6.6        2.6385         3.9521
3            0.0598         0.006      9.4        0.0429         0.0767
4            0.2851         0.016      5.6        0.2373         0.3329

SOIL (mg/kg)
1            113.1          12.3       10.8       76.3           150
2            141.9          12.6       8.9        104.1          179.8
3            791.7          47.9       6.1        647.9          935.5
4            289.5          24.6       8.5        215.7          363.3

DUST WIPES (µg)
1            162.3          14.3       8.8        119.2          205.3
2            17.6           3.39       19.3       7.3            27.9
3            418.1          30.7       7.3        326            510.3
4            49             5.88       12.0       31.3           66.7




-------