United States Environmental Protection Agency

Verification of Portable Ion Mobility
Spectrometers for Detection of Chemicals
and Chemical Agents in Buildings

Office of Research and Development
National Homeland Security
Research Center


TEST/QA PLAN

for

VERIFICATION OF PORTABLE ION MOBILITY SPECTROMETERS
FOR DETECTION OF CHEMICALS AND CHEMICAL AGENTS

IN BUILDINGS

Prepared by

Battelle
Columbus, Ohio

GSA Contract Number GS-23F-0011L-BPA-2
Task Order Number 1102

EPA Task Order Project Officer
Eric Koglin

July 2003


TABLE OF CONTENTS

Page

List of Figures 		iv

List of Tables 		iv

List of Acronyms		v

1.0 INTRODUCTION		1

1.1	Test Description		1

1.2	Test Objective		2

1.3	Organization and Responsibilities		2

1.3.1	Battelle		4

1.3.2	Vendors		6

1.3.3	EPA		6

1.3.4	Test Facility		7

2.0 APPLICABILITY		8

2.1	Subject		8

2.2	Scope		8

3.0 SITE DESCRIPTION		12

3.1	General Site Description		12

3.2	Site Operations		13

4.0 EXPERIMENTAL DESIGN		15

4.1	General Test Design		15

4.2	Performance Parameters		16

4.2.1	Response Time		17

4.2.2	Response Threshold		18

4.2.3	Repeatability		18

4.2.4	Accuracy		19

4.2.5	Recovery Time		19

4.2.6	Temperature and Humidity Effects		20

4.2.7	Interference Effects		21

4.2.8	Cold/Hot Start Behavior		22

4.2.9	Battery Life		22

4.3	Operational Characteristics		23

4.4	Chemical Test Compounds		24

4.5	Test Matrix		24

4.6	Test Schedule		25

4.7	Reference Methods		29

5.0 MATERIALS AND EQUIPMENT		31

5.1	Agents, Simulants, and TICs		31

5.2	Vapor Delivery Equipment		31

5.3	Temperature/Humidity Control		32

5.4	Reference Methods		32


5.5 Performance Evaluation Audit		33

6.0 TEST PROCEDURES		34

6.1	Response Time		37

6.2	Response Threshold		39

6.3	Repeatability		40

6.4	Accuracy		40

6.5	Recovery Time		41

6.6	Temperature and Humidity Effects		41

6.7	Interference Effects		41

6.8	Cold/Hot Start Behavior		42

6.9	Battery Life		43

7.0 QUALITY ASSURANCE/QUALITY CONTROL		45

7.1	Equipment Calibrations		45

7.1.1	Reference Methods		45

7.1.2	IMS Instruments Checks		45

7.1.2.1	GB Agent Detection		45

7.1.2.2	HD Agent Detection		46

7.2	Assessment and Audits		46

7.2.1	Technical Systems Audits		46

7.2.2	Performance Evaluation Audit		46

7.2.3	Data Quality Audit		47

7.2.4	Assessment Reports		48

7.2.5	Corrective Action		48

8.0 DATA ANALYSIS AND REPORTING		49

8.1	Data Acquisition		49

8.1.1	IMS Data Acquisition		49

8.1.2	Laboratory Data Acquisition		51

8.1.3	Confidentiality		51

8.2	Data Review		52

8.3	Data Evaluation		52

8.3.1	Multivariate Analyses		52

8.3.1.1	Evaluation of Multiple Performance Parameters		53

8.3.1.2	False Positives and False Negatives 		54

8.3.1.3	Support Tools		54

8.3.2	Single-Variable Analyses		55

8.3.2.1	Response Time		55

8.3.2.2	Response Threshold 		56

8.3.2.3	Repeatability 		56

8.3.2.4	Accuracy 		57

8.3.2.5	Recovery Time		58

8.3.2.6	Temperature and Humidity Effects		58


8.3.2.7	Interference Effects		59

8.3.2.8	Cold/Hot Start Behavior 		60

8.3.2.9	Battery Life 		60

8.4 Reporting		61

9.0 HEALTH AND SAFETY		62

9.1	Access		62

9.2	Potential Hazards		62

9.3	Training		62

9.4	Safe Work Practices		63

9.5	Equipment Disposition		63

10.0 REFERENCES		64

LIST OF FIGURES

Figure 1. Organization Chart for the IMS Detection Technology Verification Test		3

Figure 2. Test Sequence and Logic for TIC IMS Instrument Verification		27

Figure 3. Test Sequence and Logic for CW Agent IMS Instrument Verification		28

Figure 4. Test System Schematic		36

LIST OF TABLES

Table 1. Battelle Facilities for Testing of Portable IMS Instruments		14

Table 2. Temperature and Relative Humidity Conditions for

Portable IMS Instrument Testing		21

Table 3. Summary of Evaluations to Be Conducted in

Portable IMS Verification Test		25

Table 4. Planned Reference Methods for Target TICs and CW Agents		30

Table 5. Target Challenge Concentrations		39

Table 6. Summary of PE Audits		47

Table 7. Summary of Data Recording Process for the Verification Test		50

LIST OF ACRONYMS

AC	hydrogen cyanide
APT	Aerosol and Process Technologies
AS	Atmospheric Sciences
CET	Chemical and Environmental Technologies
CG	phosgene
CK	cyanogen chloride
Cl2	chlorine
CRDEC	Chemical Research, Development and Engineering Center
CSM	Chemical Surety Material
CW	Chemical Warfare
CWA	Chemical Warfare Agent
DEAE	N,N-diethylaminoethanol
DOD	Department of Defense
DOE	Department of Energy
EPA	U.S. Environmental Protection Agency
ETV	Environmental Technology Verification
FID	flame ionization detector
FPD	flame photometric detector
ft	foot
FTIR	Fourier Transform Infrared
GB	Sarin
GC	gas chromatography
GD	Soman
HD	sulfur mustard
HML	Hazardous Materials Laboratory
HMRC	Hazardous Materials Research Center
IDLH	Immediately Dangerous to Life and Health
IMS	Ion Mobility Spectrometers
L	Lewisite
LITF	Large Item Test Facility
min	minute
MREF	Medical Research and Evaluation Facility
MSD	mass selective detector
NHSRC	National Homeland Security Research Center
PE	Performance Evaluation
PPE	Personal Protective Equipment
ppm	parts per million
QA	Quality Assurance
QC	quality control
QMP	Quality Management Plan
RDS	Research Dilute Solution
RH	Relative Humidity
RSD	Relative Standard Deviation
SA	arsine (AsH3)
SBCCOM	U.S. Army Soldier Biological and Chemical Command
SOP	Standard Operating Procedure
TIC	Toxic Industrial Chemical
TOPO	Task Order Project Officer
TSA	Technical Systems Audit
Y/N	Yes/No

DISTRIBUTION LIST

Dr. Thomas J. Kelly
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Ms. Karen Riggs
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Mr. Zachary Willenberg
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Mr. Kent Hofacre
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Mr. Dale Folsom
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Ms. Tricia Derringer
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Mr. Peter Larkowski
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Mr. Gary Stickel
Battelle
505 King Avenue
Columbus, Ohio 43201-2693

Mr. Eric Koglin
USEPA National Exposure Research Laboratory
Environmental Sciences Division/ORD
P.O. Box 93478
Las Vegas, NV 89193-3478

Mr. Frank Thibodeau
Bruker Daltonics, Inc.
40 Manning Road
Billerica, MA 01821

Vendor Approval of EPA/ETV Test/QA Plan for

Verification of Portable Ion Mobility Spectrometers for
Detection of Chemicals and Chemical Agents in Buildings

July 2003

Name 	 Signature 	

Company	

Date 	


1.0 INTRODUCTION
1.1 Test Description

The U.S. Environmental Protection Agency (EPA) has the responsibility to help protect
the public in workplaces and other buildings that may be subject to attack using chemical or
biological agents. That responsibility includes identifying methods and equipment for detecting
or monitoring for chemical and biological contaminants in indoor environments. In January
2003, EPA established the National Homeland Security Research Center (NHSRC) to manage,
coordinate, and support a wide variety of homeland security research and technical assistance
efforts. Through the Safe Buildings Program, a key research component of the NHSRC, EPA is
verifying the performance of products, methods, and equipment that can detect chemical or
biological agents on indoor surfaces or in indoor air. EPA's goal is to generate objective
performance data so building and facility managers, first responders, and other technology
buyers and users can make informed purchase and application decisions.

To meet this goal, EPA is using the process established in its Environmental Technology
Verification (ETV) Program. The ETV process, which has been used since 1997 to verify the
performance of over 200 environmental technologies, includes developing a test/quality
assurance plan (with input from stakeholders and vendors), applying high-quality test procedures
according to that plan, and publicizing separate performance reports for each technology
verified. The purpose of ETV is to provide objective and quality-assured performance data on
environmental technologies, so that users, developers, regulators, and consultants have an
independent and credible assessment of what they are buying and recommending. The ETV
process does not rank, select, or approve technologies, but instead provides credible performance
data to potential users and buyers. Other information about the program is available at the ETV
web site (http://www.epa.gov/etv) and through the NHSRC web site (www.epa.gov/nhsrc).

This test/quality assurance (QA) plan provides procedures for verification of
commercially available portable ion mobility spectrometers (IMS) that can rapidly detect



individual chemicals and chemical agents in indoor air. The verification test will be conducted
in accordance with the ETV process and will be conducted by Battelle, of Columbus, OH, under
the direction of the EPA. In performing this verification test, Battelle will follow the procedures
specified in this test/QA plan and will comply with quality requirements in the ETV Quality and
Management Plan (QMP)(1).

1.2 Test Objective

The objective of the verification test is to assess the performance of commercial portable
IMS technologies by challenging them with a variety of toxic industrial chemicals (TICs),
chemical warfare (CW) agents, and simulants, under a range of conditions and practices that
mimic the real-world use of these IMS instruments. This verification is focused on the use of
portable IMS instruments by first responders to identify contaminants and guide emergency
response activities after chemical contamination of a building. The performance characteristics
to be evaluated include the ability to detect and identify target agents and chemicals under both
ideal and realistic operating conditions. The response time, response threshold, accuracy,
recovery time, temperature and humidity effects, interference effects, and battery life of the IMS
instruments will be assessed. Operational factors such as cold/hot start behavior, cost, ease of
use, and data output capability also will be evaluated.

1.3 Organization and Responsibilities

The verification test will be performed by Battelle under the direction of EPA, with input
from the vendors whose IMS instruments will be verified. The organization chart in Figure 1
shows the individuals from Battelle, the vendor companies, and EPA who will have
responsibilities in the verification test. The specific responsibilities of these individuals are
detailed in the following paragraphs.



Figure 1. Organization Chart for the IMS Detection Technology Verification Test



1.3.1 Battelle

Mr. Kent Hofacre is Battelle's Verification Test Coordinator for this verification test.
In that role, Mr. Hofacre will direct the verification testing of portable IMS detection
technologies. His responsibilities are to:

•	Select the appropriate laboratory or location for the test.

•	Coordinate with vendor representatives to facilitate the performance of testing.

•	Prepare the draft test/QA plan, verification reports, and verification statements.

•	Arrange for use of the test facility and establish a test schedule.

•	Arrange for the availability of qualified staff to conduct the test.

•	Assure that testing is conducted according to this test/QA plan.

•	Revise the test/QA plan, verification reports, and verification statements in response
to reviewers' comments.

•	Keep the Battelle Program Manager and Verification Testing Leader informed of
progress and difficulties in planning and conducting the test.

•	Coordinate with the Battelle Quality Manager for the performance of technical and
performance audits as required by Battelle or EPA Quality Management staff.

•	Guide the Battelle/EPA/vendor team in performing the verification test in accordance
with this test/QA plan.

•	Have overall responsibility for ensuring that this test/QA plan is followed.

•	Respond to any issues raised in assessment reports and audits, including instituting
corrective action as necessary.

•	Establish a budget and schedule for the verification test and direct the effort to ensure
that budget and schedule are met.

•	Coordinate distribution of final test/QA plan, verification reports, and statements.

Dr. Thomas J. Kelly is the Verification Testing Leader in this program. In this role, Dr.
Kelly will support Mr. Hofacre by:

•	Ensuring that ETV program procedures are being followed.

•	Providing a technical review of the draft test/QA plan, verification reports, and
verification statements.

•	Serving as backup Verification Test Coordinator in Mr. Hofacre's absence.

Ms. Karen Riggs is Battelle's Program Manager for this program. As such, Ms. Riggs will:



•	Maintain communication with EPA's Task Order Project Officer (TOPO) on all
aspects of the program.

•	Monitor adherence to budgets and schedules in this work.

•	Provide the TOPO with monthly technical and financial progress reports.

•	Review the draft test/QA plan.

•	Review the draft verification reports and statements.

•	Ensure that necessary Battelle resources, including staff and facilities, are committed
to the verification test.

•	Ensure that vendor confidentiality is maintained.

•	Support Mr. Hofacre in responding to any issues raised in assessment reports and
audits.

Mr. Zachary Willenberg is Battelle's Quality Manager for this program. As such, Mr.
Willenberg will:

•	Review the draft test/QA plan.

•	Maintain communication with EPA Quality Management staff for this program.

•	Conduct a technical systems audit (TSA) once during the verification test.

•	Review results of performance evaluation (PE) audit(s) specified in this test/QA plan.

•	Audit at least 10% of the verification data.

•	Prepare and distribute an assessment report for each audit.

•	Verify implementation of any necessary corrective action.

•	Issue a stop work order if internal audits indicate that data quality is being
compromised; notify Battelle's Program Manager and Verification Test Coordinator
if such an order is issued.

•	Provide a summary of the QA and quality control (QC) activities and results for the
verification reports.

•	Review the draft verification reports and statements.

•	Ensure that all quality procedures specified in this test/QA plan and in the QMP(1) are
followed.

Battelle technical staff will support Mr. Hofacre in planning and conducting the
verification test. These staff will:

•	Assist in planning and scheduling the verification test.

•	Become familiar with the use of the IMS detection technologies to be tested.

•	Carry out the test procedures specified in this test/QA plan.

•	Assure that test procedures and data acquisition are conducted according to this
test/QA plan.



1.3.2	Vendors

Vendors of portable IMS detection technologies will:

•	Provide input for preparation of the draft test/QA plan

•	Review the draft test/QA plan, and approve the final version

•	Sign a Vendor Agreement specifying the respective responsibilities of the vendor and
of Battelle in the verification test

•	Provide information on the quantitative response of their portable IMS instruments
(e.g., programmed alarm levels; concentrations triggering transition between
low/medium/high readings) to aid in planning of the verification test

•	Provide two units of their portable IMS detection technology for use in the
verification test

•	Train Battelle and/or test facility staff in the operation of their portable IMS
instruments

•	Provide support, if needed, in use of the IMS instruments during testing

•	Review their respective draft verification report and verification statement.

1.3.3	EPA

Mr. Eric Koglin is EPA's TOPO for this program. As such, Mr. Koglin will:

•	Have overall responsibility for directing the verification process

•	Review the draft test/QA plan

•	Approve the final test/QA plan

•	Review the draft verification reports and statements

•	Oversee the EPA review process on the draft test/QA plan, reports, and verification
statements

•	Coordinate the submission of verification reports and statements for final EPA
approval.

The EPA Quality Manager for this program will:

•	Review the draft test/QA plan

•	Perform, at her option, one external TSA during the verification test

•	Notify Battelle's Quality Manager to facilitate a stop work order if an external audit
indicates that data quality is being compromised

•	Prepare and distribute an assessment report summarizing the results of the external
audit, if one is performed

•	Review the draft verification reports and statements.



1.3.4 Test Facility

The location for the verification test described here will be Battelle's laboratories in
Columbus and West Jefferson, Ohio. The Columbus facilities to be used are chemical
laboratories equipped for safe handling of volatile TICs. The West Jefferson facilities are
chemical surety laboratories certified for use of CW agents. Other test facilities could be used
depending on the availability and capability of the facilities. In general, the responsibilities of
the technical staff in these test facilities will be:

•	Ensure that the facility is fully functional prior to the times/dates needed in the
verification test

•	Provide requisite technical staff during the verification test

•	Provide any safety training needed by Battelle, vendor, or EPA staff

•	Review and approve all data and records related to facility operation

•	Review the draft test/QA plan

•	Adhere to the requirements of the test/QA plan and the QMP(1) in carrying out the
verification test

•	Provide input on facility procedures for the verification test report

•	Support Mr. Hofacre in responding to any issues raised in assessment reports and
audits related to facility operation.



2.0 APPLICABILITY
2.1 Subject

This test/QA plan focuses on the verification testing of commercially available portable
IMS instruments for detection of toxic chemicals or chemical agents in indoor air. This plan is
specifically focused on detection in the building environment, in the context of use of the IMS
instruments by first responders arriving at a potential contamination event. In this target
scenario, there is need for immediate and accurate identification of chemicals, by first responders
who are wearing extensive personal protective equipment (PPE), regardless of the weather or
environmental conditions at the time. These needs are the basis for the test procedures stated in
this plan.

The chemicals and chemical agents that may pose a threat in the building environment
may include TICs and CW agents. Chemical agents having relatively low vapor pressures are of
interest in this test, because of their persistence in the building environment. However, highly
volatile TICs and CW agents are also included in testing under this plan; although they can be
readily removed from the building by ventilation, they may be present at the time that first
responders arrive at the scene.

Verification testing requires a basis for establishing the quantitative performance of the
tested technologies. For this verification, quantitative performance is assessed primarily in terms
of the detection of the chemicals, agents, or simulants. Standard test methods are used to confirm
the contaminant concentrations sampled by the IMS instruments.

2.2 Scope

The overall objective of the test described in this plan is to verify the performance of the
portable IMS technologies with selected chemicals and chemical agents under a realistically
broad range of indoor conditions and procedures of use. Testing will be conducted over ranges
of temperature and relative humidity (RH) representing those that might be encountered in an



emergency response situation in a building environment. The rigorous nature of actual use by
first responders will be simulated by testing for cold and hot start operation, battery life, and
interferences. In all testing, two units of each IMS instrument will be tested simultaneously, to
assure complete coverage of all test procedures in the event of a failure of one unit. The test data
sets from the two units will be compiled and reported as independent measures of the IMS
performance.

The performance parameters on which the portable IMS instruments will be evaluated
under this plan include:

•	Response time

•	Response threshold

•	Repeatability

•	Accuracy

•	Recovery time

•	Temperature and humidity effects

•	Interference effects

•	Cold/hot start behavior

•	Battery life

•	Ease of use

•	Data output

•	Cost.

The response time, recovery time, accuracy, and repeatability will be evaluated by
challenging the IMS instruments with known vapor concentrations of target chemicals and CW
agents. Performance of such tests with low target analyte concentrations will evaluate the
response threshold of the IMS instruments. Similar tests conducted over a range of temperature
and RH will be used to establish the effects of these factors on detection capabilities. The effects
of potential interferences in an emergency situation will be assessed by sampling those
interferences both with and without the target TICs and CW agents present. Testing the IMS
instruments after a cold start (i.e., without the usual warm-up period) and after hot storage will
evaluate the delay time before IMS readings can be obtained, and the response speed and
accuracy of the IMS instruments once readings are obtained. Readings of a target TIC will be
obtained with each IMS instrument operated on AC power, and subsequently on battery power,



to assess any differences. Battery life will be determined as the time until IMS performance
degrades as battery power is exhausted, in continuous operation. Operational factors such as
ease of use, data output, and cost will be assessed by observations of the test personnel and
through inquiries to the technology vendors.

The testing to be conducted under this plan is limited to detection of chemicals in the
vapor phase, because that mode of application is most relevant to the stated target scenario,
i.e., use by first responders. Some of the IMS instruments may be capable of analyzing surface
wipe samples, or heating a sample surface to promote vaporization of chemical agents. Such
capabilities could be addressed by a modification of this test/QA plan. However, those
capabilities are unlikely to be used by first responders at a scene of building contamination, and
so are not addressed in this verification. Testing will be conducted in two phases: the first will
address detection of TICs and will be conducted in a non-surety laboratory; the second will
address detection of CW agents and will be conducted in a certified surety facility.

Because of the nature of the test activities under this test/QA plan, the IMS instruments
will be operated by Battelle staff in all testing. However, each IMS vendor will be required to
provide the appropriate instructions or operator's manuals for their IMS instrument, and to train
Battelle staff in the correct use of the IMS instrument. Battelle testing staff will review all
written instructions and manuals before receiving training from the vendor. The Battelle testing
staff will note the clarity, completeness, and adequacy of the written documentation provided.
When each IMS vendor is satisfied that it has fully trained Battelle staff in operating the IMS
instrument, the vendor will be required to attest in writing that the Battelle staff are authorized to
operate the IMS instrument for the purpose of this verification test.

The portable IMS instruments to be tested provide different types of data outputs that
must be addressed under this test/QA plan. Although some IMS instruments may provide
quantitative indication of the concentration of the target CW agent or TIC, many provide only
qualitative (e.g., an audible or visual alarm indicating the presence of the compound) or semi-
quantitative (low/medium/high reading, numbered bar graph, etc.) indications. To achieve the
most effective verification test, the IMS vendors will be asked to provide the nominal
concentrations of target compounds that correspond to the qualitative detection ranges,



thresholds, or transition points of their IMS instruments. For example, the vendor of an IMS
instrument that provides low/medium/high indications will be asked to provide the nominal
concentrations of selected agents and TICs that are programmed to cause a transition in reading
from low to medium, and medium to high. These nominal levels will be factored into the test
procedures, to assure that relevant information on IMS performance is obtained.



3.0 SITE DESCRIPTION

These tests are expected to be conducted at Battelle facilities in Columbus and West
Jefferson, Ohio. Those facilities are described below. Alternative facilities could also be used,
provided those facilities meet all the requirements for safety, security, and testing capability
established by this plan.

3.1 General Site Description

Battelle has two primary campuses in or near Columbus, Ohio that will be used to
conduct the verification tests. The main chemistry laboratories for non-chemical surety materiel
testing are located in a new King Avenue laboratory. Testing with the non-surety materiel -
TICs and interferents - will be conducted in the King Avenue laboratory. These facilities have
the dedicated vapor generation, collection, and analysis equipment needed to conduct the tests
described in this plan. The King Avenue laboratory has been used previously to conduct IMS
instrument and filter tests using phosgene (CG), hydrogen cyanide (AC), cyanogen chloride
(CK), and chlorine (Cl2) under controlled environmental conditions.

Battelle's West Jefferson facility is an 1,800-acre research campus located within a tract
of Battelle-owned land in a rural area approximately 17 miles west of downtown Columbus,
Ohio. Testing with CW agents under this test/QA plan will use the Hazardous Materials
Research Center (HMRC) at West Jefferson. If necessary to meet schedule constraints, the
Medical Research and Evaluation Facility, which is a second laboratory-scale facility conducting
research with both chemical surety material (CSM) and biological materials, may also be used.

Battelle's HMRC is an ISO 9001 certified facility that will be used for testing of IMS
instruments with CW agents. The HMRC provides a broad range of materials testing, system
and component evaluation, research and development, and analytical chemistry services that
require the safe use and storage of highly toxic substances. Since its initial certification by the
Chemical Research, Development and Engineering Center in 1981, the facility has functioned as
both a research and a technology development laboratory in support of DoD chemical programs.



The HMRC can safely store and handle BZ, tabun (GA), sarin (GB), soman (GD), thickened GD
(TGD), sulfur mustard (HD), thickened HD (THD), Lewisite (L), mustard-Lewisite mixtures
(HL), V-agent (VX), and other hazardous materials and toxins, such as arsine (AsH3) (SA),
cyanogen chloride (CK), hydrogen cyanide (AC), phosgene (CG), perfluoroisobutylene (PFIB),
as well as agent simulants, Class A poisons, and toxins (e.g., T-2 toxin).

The HMRC complex consists of approximately 10,000 ft2 which includes the Hazardous
Materials Laboratory and the Large Item Test Facility (LITF), which provide approximately
2,000 ft2 of laboratory space and 100 linear ft of CSM-approved filtered hoods for working with
neat (pure) CSM; about 630 ft2 of research dilute solution (RDS, i.e., diluted chemical agent)
laboratory space, including four fume hoods; and approximately 2,100 ft2 of laboratory support
areas, including environmental monitoring, emergency power supplies, and air filter systems.
The LITF, which occupies approximately 540 ft2 of the HMRC, was designed and is operated for
test and evaluation of items and systems too large to fit into standard laboratory fume hoods.

3.2 Site Operations

Battelle operates its certified chemical surety facilities in compliance with all applicable
Federal, state, and local laws and regulations, including Army Regulations. Battelle's facilities
are certified through inspection by personnel from the appropriate government agency. Battelle
is certified to work with CSM through a Bailment Agreement by the U.S. Army Soldier
Biological and Chemical Command (SBCCOM). SBCCOM will terminate its Bailment
Agreements on September 1, 2003, so Battelle has already begun the process to transition to an
AR 50-6 surety facility. In this transition, Battelle will demonstrate, through inspections by the
appropriate government personnel, that its facilities meet all Federal, state, and local laws and
regulations, including Army Regulations. Battelle operates its certified biological facilities in
compliance with requirements contained in 32 CFR 626 and 627, Biological Defense Programs.
Our chemical and biological facilities and attendant certifications are listed in Table 1.



Table 1. Battelle Facilities for Testing of Portable IMS Instruments

Hazardous Materials Research Center
    Materials: CW Agents
    Level: Chemical Surety Materiel (CSM) (Neat); RDT&E (Dilute)
    Certification: Bailment Agreement No. DAAD13-H-00-0002

Analytical Chemistry Laboratory
    Materials: CW Agents
    Level: RDT&E (Dilute)
    Certification: Bailment Agreement No. DAAD13-H-00-0002

Medical Research and Evaluation Facility
    Materials: CW Agents
    Level: Chemical Surety Materiel (CSM) (Neat); RDT&E (Dilute)
    Certification: U.S. Army Medical Research and Materiel Command (USAMRMC) No. G472501



4.0 EXPERIMENTAL DESIGN
4.1 General Test Design

The performance parameters to be verified and the rationale for their inclusion in this test
program are defined and summarized in Sections 4.2 and 4.3 below. Greater detail on the test
procedures is given in Section 6 of this test/QA plan.

The Safe Buildings Program of EPA's NHSRC addresses a relatively broad scope of
chemical vapor detection applications. Three main use-concepts can be envisioned: (1) detect-
to-warn, (2) detect-to-respond, and (3) detect-to-restore. These different use concepts have
different requirements, and thus, permit potentially different technologies (or configurations of a
single detection technology) to be considered for each application. For example, detect-to-warn
would require permanently installed, continuously operating systems that are integrated into the
building's infrastructure and utilities. IMS instruments used by a first responder, however, need
to be fast-responding and portable (i.e., light in weight, battery-powered) and are used on
demand. IMS instruments used in restoration (i.e., post-decontamination) need be neither fast
nor portable, but would need to have low detection limits to determine whether an area is clean.
Similarly, the range of environmental operating conditions can be different in these different use
scenarios.

The use scenario of detect-to-respond was chosen as the focus of this test/QA plan for
portable IMS technologies. The performance parameters to be verified and the test conditions
are therefore intended to be relevant to use by a first responder, or other personnel needing rapid,
real-time indication of an immediate hazard.

The general test design is to first benchmark IMS instrument performance when
operated according to the manufacturers' instructions. This will include following
manufacturers' recommendations for calibration, warm-up time, and operating conditions (e.g.,
ambient temperature range). The challenge vapor concentration most relevant to a first
responder is the immediately dangerous to life and health (IDLH) level, and consequently
concentrations approaching this level will be used in benchmark experiments with a variety of



chemicals and chemical agents. Normal indoor air temperature and RH will be established for
these benchmark experiments. In addition to the benchmark experiments to establish response
time and characterize IMS instrument performance, test conditions will be varied to explore the
IMS response threshold, and to assess the impact on IMS instrument response of realistic stresses
or ranges of conditions likely to be encountered during actual field use. For example, cold-start
operation (not allowing proper warm-up time), startup after hot storage, differing temperature
and humidity conditions, and the introduction of potentially interfering compounds, are all
included in the test matrix.

A description of the performance parameters to be characterized and the rationale for
their inclusion is provided in Sections 4.2 and 4.3. The chemicals of interest that will be used for
the vapor challenges are discussed in Section 4.4. The test matrix and schedule are discussed in
Sections 4.5 and 4.6, respectively, and the reference methods to be used are introduced in
Section 4.7.

4.2 Performance Parameters

The key performance parameters to be evaluated in this verification test are:

•	Response time

•	Response threshold

•	Repeatability

•	Accuracy

•	Recovery time

•	Temperature and humidity effects

•	Interference effects

•	Cold/hot start behavior

•	Battery life

All of these performance parameters will be evaluated with TICs as the target analytes.
All of these performance parameters except cold/hot start behavior and battery life also will be
evaluated with CW agents. These performance parameters are defined, and general test
procedures are outlined, in Sections 4.2.1 to 4.2.9. Specific test procedures to evaluate these
parameters are in Sections 6.1 to 6.9. In addition to these key performance parameters,



operational characteristics of the units will be recorded. These operational characteristics
include:

•	Ease of use

•	Signal/data output

•	Cost.

These characteristics will be evaluated based on operator observations and available
information on the IMS instruments.

4.2.1 Response Time

The determination of IMS response time will accommodate the wide variety of responses
and displays provided by commercial IMS instruments. For IMS instruments that provide a
quantitative continuous reading of concentrations, response time will be defined as the time
required for the IMS instrument to reach 90% of its final response, or indicated concentration,
after the introduction of a step change in the concentration of target chemical. In the case of IMS
instruments that provide a relative scale reading, e.g., "low/medium/high," or a status bar
display, the time to reach a stable (i.e., unchanging) reading will be recorded as the response
time. For IMS instruments that do not provide a quantitative measure, but rather an audio or
visual alarm, the time to alarm will be recorded as the response time. If multiple forms of
response (e.g., an alarm and a scale reading) are outputs of the device, then both will be recorded
to determine response time. The response time will be measured from the start of a fixed
challenge vapor concentration, after the IMS instrument has been stabilized by sampling a clean
air stream.

The response time is to be verified because a rapid indication of chemical concentration
will be needed by first responders to assess the potential hazard.



4.2.2	Response Threshold

The IMS instrument's response threshold is defined as the approximate concentration that
causes the instrument to indicate a response above the baseline reading obtained when sampling
clean air at the target test conditions. For instruments that provide a continuous quantitative
reading, the response threshold will be the minimum concentration that produces readings
uniformly above the zero level. For IMS instruments that provide a relative measure of response
such as a status bar or "low/medium/high," the response threshold will be defined as that
concentration required to indicate the next highest reading above the baseline. The response
threshold for IMS instruments that provide an audible or visual alarm will be that minimum
concentration required to cause the audible or visual alarm.

The response threshold is being assessed to determine whether the IMS instruments have
adequate sensitivity to chemicals of interest. A precise determination of the response threshold
is not needed because the first responder will be using the IMS instrument to determine an
immediate hazard, rather than an exact concentration. Therefore testing that brackets the
response threshold within an approximate range is considered sufficient.
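As an illustration of that bracketing approach, the sketch below steps the challenge concentration downward by a fixed factor until the instrument no longer responds, and reports the interval within which the response threshold lies. The function and the yes/no response model are hypothetical, not part of the plan's procedures.

    # Illustrative sketch: bracketing the response threshold. 'instrument_responds'
    # stands in for the observed yes/no response of the IMS instrument at a given
    # delivered concentration (ppm).
    def bracket_threshold(start_conc, instrument_responds, step_factor=2.0, min_conc=0.001):
        conc = start_conc
        last_responding = None
        while conc >= min_conc:
            if instrument_responds(conc):
                last_responding = conc
                conc /= step_factor         # step down and challenge again
            else:
                # threshold lies between the first non-responding level and the
                # lowest level that still produced a response
                return (conc, last_responding)
        return (min_conc, last_responding)

    # Hypothetical instrument that responds above 0.7 ppm:
    print(bracket_threshold(10.0, lambda c: c > 0.7))   # -> (0.625, 1.25)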

4.2.3	Repeatability

Repeatability is defined as the consistency of the IMS instrument's indicated response to
a constant vapor challenge concentration. Repeatability defined in this way applies to IMS
instruments that output a concentration reading in the form of an analog or digital signal, status
bars, or a qualitative audible or visual alarm.
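For quantitative outputs, one common way to summarize such consistency is the relative standard deviation (RSD) of repeated readings at a constant challenge concentration; the brief example below is illustrative only, and the plan's actual statistical treatment is given in Section 8.

    # Illustrative only: RSD of repeated readings at a constant challenge concentration.
    import statistics

    readings = [4.8, 5.1, 5.0, 4.9, 5.2]    # hypothetical repeated readings
    rsd = 100.0 * statistics.stdev(readings) / statistics.mean(readings)
    print(f"RSD = {rsd:.1f}%")               # -> RSD = 3.2%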

Repeatability is being assessed to provide the prospective IMS user with information on
the consistency of response at constant vapor concentrations.



4.2.4	Accuracy

Accuracy is defined as the degree of agreement between the chemical concentration
indicated by an IMS instrument and that measured by a reference method. Accuracy will be
verified by direct comparison of reference and IMS data only for those IMS instruments that
output a quantitative response as an analog or digital signal. For IMS instruments that output
only audible or visual alarms, accuracy will be determined relative to the response threshold in
terms of correct (or false) positive and negative indications of the presence of the target
chemical. IMS instruments that identify the chemical being sensed will also be evaluated
relative to accurate identification of the chemical.

The accuracy of IMS instruments that indicate a relative concentration by status bar or
low/medium/high indicators will be determined based on the correlation of indicator reading to
concentration provided by the vendor. For example, if the transition to a "high" reading is
programmed to occur at concentration X, then the IMS will be credited with an accurate reading
whenever it reports a "high" response at an analyte concentration equal to or greater than X.
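For illustration only, the sketch below shows both forms of comparison described above: a percent difference from the reference result for a quantitative output, and crediting a relative ("high") reading against a vendor-supplied transition concentration. All values, including the transition level X, are hypothetical.

    # Quantitative output: percent difference from the reference method result.
    ims_reading = 9.2        # ppm reported by the IMS instrument (hypothetical)
    reference = 10.0         # ppm measured by the reference method (hypothetical)
    percent_difference = 100.0 * abs(ims_reading - reference) / reference
    print(f"percent difference = {percent_difference:.0f}%")   # -> 8%

    # Relative output: credit a "high" reading as accurate whenever the reference
    # concentration is at or above the vendor-programmed transition concentration X.
    X = 25.0                 # ppm, hypothetical low-to-high transition level
    reference_conc = 30.0    # ppm, hypothetical reference result
    indicated = "high"
    print(indicated == "high" and reference_conc >= X)         # -> True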

Accuracy is being assessed to demonstrate that the indicated response is a true indication
of the actual vapor challenge concentration.

4.2.5	Recovery Time

Recovery time (or clear-down time) is defined as the time for the IMS instrument to
return to its baseline reading (established prior to exposure to the challenge vapor), after it has
reached stable readings while sampling the challenge vapor. This performance parameter will be
verified for devices that provide a quantitative output, as well as for those that only produce a
qualitative or semi-quantitative output. For quantitative IMS instruments, the same 90%
response criterion applied in establishing response time will be applied in establishing
recovery time. For IMS instruments that
provide only an audible or visual alarm, recovery time will be determined as the time between
removing the challenge vapor concentration and stopping the alarm.



Recovery time is being verified to illustrate how much time the IMS instrument requires
to clear before it is ready to provide an accurate reading in another sampling event. This factor
would be relevant when a first responder enters an area that causes an alarm. The IMS
instrument would have to clear before it could be used reliably in another area in the building.

4.2.6 Temperature and Humidity Effects

The effect that the temperature and RH of the sampled atmosphere have on IMS
instrument response will be evaluated. In all cases, the IMS instrument undergoing testing will
be maintained at the same temperature as the challenge air stream. The challenge air stream also
will be maintained at the specified RH.

The temperature and RH conditions to be used in testing were selected based on those
likely to be experienced in an indoor environment in actual use by first responders. In the event
of a chemical release it is possible that the windows of a building could be opened to flush out
the contaminant. Conversely, safe building protocols also may require closing a building to
prevent infiltration of outside vapor hazards, to minimize exposure of the surrounding populace,
or to minimize convective transport of contaminants throughout a building. Overall, it is
unlikely that the indoor building conditions encountered by a first responder would range over
the full extremes of potential outdoor conditions. Consequently, a narrower range of temperature
and RH is considered appropriate for this verification test, as indicated in Table 2. Each "X" in a
cell in Table 2 indicates a condition of temperature and RH that will be used as a test condition
in this verification test.

Temperature and RH effects are being assessed to establish whether IMS readings are
influenced by environmental conditions during use.



Table 2. Temperature and Relative Humidity Conditions
for Portable IMS Instrument Testing

                      Temperature (°C)
    RH (%)        5 ± 3      22 ± 3      35 ± 3
    <20             --          X          --
    50 ± 5           X          X           X
    80 ± 5          --          X           X

4.2.7 Interference Effects

The effect of potentially interfering compounds present in the indoor building atmosphere
will be assessed. Potentially interfering compounds have been selected to achieve a diverse set
of chemicals that could be ubiquitous in buildings under a first-response scenario, and whose
presence is not seasonally dependent. A representative set of potentially interfering compounds
was identified for use in testing, as follows: (1) ammonia-based cleaner, (2) latex paint fumes,
(3) gasoline vehicle exhaust, (4) air freshener vapors, and (5) N,N-diethylaminoethanol (DEAE),
a common additive in building boiler systems that can be a ubiquitous indoor contaminant.
These potential interferents will be tested both with and without the target TICs and CW agents
present.

The effect of potentially interfering compounds is being assessed because such
compounds can potentially produce two types of errors with IMS instruments: (1) erroneous
reporting of the presence of a chemical or chemical agent when none is present (false positives)
or (2) reduction in sensitivity or masking of target analytes of interest (false negatives). False
positives will be assessed by alternately sampling clean air and air containing the interferent, in
the absence of any target chemical or agent. False negatives will be assessed by alternate
sampling of clean air and air containing both the interferent and a target chemical or agent. Both
types of tests will be conducted with all the interferent species and all the target chemicals and
agents.
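For illustration only, the sketch below tallies false positives and false negatives from the alternating sampling design described above; the trial records are hypothetical.

    # Each trial is recorded as (target_present, instrument_alarmed) for one interferent.
    trials = [
        (False, False), (False, True), (False, False),   # interferent alone
        (True, True), (True, False), (True, True),        # interferent plus target
    ]
    false_pos = sum(1 for present, alarmed in trials if alarmed and not present)
    false_neg = sum(1 for present, alarmed in trials if present and not alarmed)
    print(f"false positives: {false_pos}, false negatives: {false_neg}")   # -> 1, 1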



4.2.8	Cold/Hot Start Behavior

The test of cold start behavior will assess how the IMS response to a target challenge
concentration at baseline environmental conditions is affected when the IMS instrument is not
permitted adequate warm-up time per the manufacturer's instructions. The performance of the
IMS devices will be evaluated without any warm up period, to simulate the effect of immediate
use that could be required in an emergency. The time delay between turning the IMS instrument
on and when the IMS instrument is ready to begin giving any reading at all will be a primary
factor determined in this test. In addition, as appropriate for the IMS instrument being tested, the
response time to a vapor challenge, and the accuracy of readings relative to the challenge
concentration will be evaluated. The cold start behavior will be evaluated with both a cold start
from room temperature and from reduced temperature (i.e., after storage of the IMS instruments
overnight in a refrigerated environment at 5 to 8°C). Conversely, a hot soak followed by startup
is also of interest, because IMS instruments may be stored or transported in vehicles parked in the
sun. Such heat exposure may affect performance, especially that of the electronics. Note that the "hot
start" evaluation means that the IMS instrument is taken from storage in a hot environment and
then started; it is not "hot" in the sense of having been running previously. The hot soak will
consist of storing the IMS instruments overnight at a temperature of 40 ± 3°C before testing. As
in the cold soak tests, the response time and accuracy of readings will be assessed.

Vendors have indicated that recommended use conditions and operating parameters are not, and
cannot always be, followed by emergency responders. Therefore, IMS instruments may be
used in a fashion that is not ideal. The need for immediate readings upon arrival at an
emergency is the motivation for testing cold/hot start behavior.

4.2.9	Battery Life

Portable IMS instruments will be battery operated, and thus their performance will depend on
the condition of the batteries. Battery life is defined as the amount of time
the IMS instrument can operate on fully charged or new batteries. A one-time test will be



conducted to determine how long the IMS instrument will run on a single, full charge or one set
of new, disposable batteries.

4.3 Operational Characteristics

Key operational characteristics of the IMS instruments will be evaluated by means of the
observations of test operators, and by inquiry to the IMS vendors.

Ease of use will be assessed by operator observations, with particular attention to the
conditions of use by first responders. For example, the use of PPE (e.g., gloves, respirator) may
make it difficult to turn the IMS instrument on or off, operate it, or read the display. These
factors will be assessed by outfitting an operator with such PPE, and noting any difficulties in
operating the IMS instrument. This assessment will be done separately from any test of the other
performance parameters with TICs or CW agents.

Signal or data output capabilities of the IMS instruments will be assessed by observations
of the testing personnel who operate the instruments during testing. The type of data output will
be noted (e.g., audio or visual alarm, bar graph, low/med/high indication, quantitative measure of
concentration, etc.). In addition, the clarity and readability of the output will be noted, especially
in low light conditions or when holding the IMS instrument while walking, as in use by a first
responder. The availability of multiple forms of data output or display also will be assessed, e.g.,
the availability of both a visual display and an analog voltage output for recording purposes.

Costs for each IMS instrument will be assessed by asking the vendor for the purchase and
operational costs of the instrument as tested in this program. This verification test will not be of
sufficient duration to test long-term maintenance or operational costs of the IMS instruments.
Estimates for key maintenance items will be requested from the vendors to address those costs.
Costs will be those at the time the IMS instruments are made available for testing.



4.4 Chemical Test Compounds

This test/QA plan cannot consider all the chemicals that a first responder could
potentially encounter when responding to a possible vapor hazard in a building. An emergency
response may be necessary due to an accidental spill of a relatively innocuous chemical, or to a
purposeful release of a hazardous chemical. One focus of chemical selection in this plan is on a
set of TICs commonly considered by the DoD community as potential hazards. Initial
experiments will challenge the IMS instruments with selected TICs. After completing TIC
experiments, the IMS instruments will be challenged with CW agents. The TICs selected for use
in IMS verification are (agent designation or chemical formula in parenthesis): cyanogen
chloride (CK), hydrogen cyanide (AC), phosgene (CG), chlorine (Cl2), and arsine (AsH3) (SA).
The CW agents selected for use in testing are GB and HD.

4.5 Test Matrix

Table 3 summarizes the evaluations to be conducted in this verification test. As Table 3
indicates, except for cold/hot start behavior, battery life, and the assessment of false positive
interference effects (i.e., sampling the interferent alone), all performance parameters will be
evaluated with both five TICs and two CW agents.



Table 3. Summary of Evaluations to Be Conducted in Portable IMS Verification Test

Response Time
    Objective: Determine rise time of IMS response
    Comparison based on: IMS readings with step rise in analyte concentration

Response Threshold
    Objective: Estimate minimum concentration that produces IMS response
    Comparison based on: Reference method results

Repeatability
    Objective: Characterize consistency of IMS readings with constant analyte concentration
    Comparison based on: IMS readings with constant input

Accuracy
    Objective: Characterize agreement of IMS with reference results
    Comparison based on: Reference method results

Recovery Time
    Objective: Determine fall time of IMS response
    Comparison based on: IMS readings with step decrease in analyte concentration

T and RH Effects
    Objective: Evaluate effect of T and RH on IMS performance
    Comparison based on: Repeat above evaluations with different T and RH

Interferent Effects
    Objective: Evaluate effect of building contaminants that may interfere with IMS performance
    Comparison based on: Sample interferents and TICs/CW agents together (and interferents alone(a))

Cold Start
    Objective: Characterize startup performance of IMS
    Comparison based on: Repeat tests with no warmup(a)

Hot Start
    Objective: Characterize performance after hot storage
    Comparison based on: Repeat tests with no warmup(a)

Battery Operation
    Objective: Characterize battery life and performance
    Comparison based on: Compare IMS results on battery vs. AC power(a)

(a) Indicates this part of the test is not performed with CW agents.

4.6 Test Schedule

Testing under this test/QA plan is expected to begin in July, 2003. It is anticipated that
three to four weeks will be required to complete the TIC experiments for a single IMS
technology. This schedule is predicated on the vendors providing two of their respective IMS
instruments for testing by May 15, 2003. Because effort and resources are required to construct
test fixtures for controlled challenge atmosphere generation, a test apparatus will be constructed
for testing one chemical at a time. Testing will then consist of sequencing through the TICs at
Battelle's King Avenue laboratories, followed by the CW agents testing at the HMRC. Testing



the TICs first allows for the most rapid and cost-effective means to conduct tests. If any
equipment (reference instrument or test fixture) maintenance or modification is required, it will
be easiest to do it prior to CW agent exposure. Testing with TICs will initially emphasize the
baseline environmental conditions of 22 ± 3°C and 50 ± 5% RH. The procedures for
temperature and RH effects and the interferent tests will be conducted following the initial
benchmark experiments.

Figures 2 and 3 illustrate the planned stepwise progression of procedures in TIC and CW
agent experiments, respectively. These figures show that most procedures are conducted both
with TICs and with CW agents. However, some procedures are not repeated with CW agents,
including the cold and hot start tests, battery operation and battery life, and the sampling of
interferents in the absence of target analytes. Repetition of these tests with CW agents is
unnecessary.

Sections 6.1 through 6.9 of this plan describe how each of the procedures in
Figures 2 and 3 will be performed.



Figure 2. Test Sequence and Logic for TIC IMS Instrument Verification



Figure 3. Test Sequence and Logic for CW Agent IMS Instrument Verification



4.7 Reference Methods

Table 4 summarizes the reference methods to be used for determining the challenge
concentrations of the target TICs and CW agents in the test. Listed in the table are the target
TICs and CW agents, the sampling and analysis methods to be used for each compound, and the
applicable concentration range of each method. For the TICs cyanogen chloride, hydrogen
cyanide, and phosgene, low concentration samples will be injected directly for determination by
gas chromatography (GC) with flame ionization detection (FID). High concentration samples
will first be diluted with zero air in a Tedlar gas sampling bag before injection into the GC/FID.
Chlorine will be determined by a continuous electrochemical analyzer with a chlorine-specific
sensor, to allow rapid determination of chlorine levels delivered to the IMS instruments during
testing. Arsine will be determined by a gas chromatographic method with a capillary column
and mass selective detection (MSD), using samples collected by syringe from the test apparatus.
A retention time of about seven minutes is expected for arsine, allowing repeated analysis within
each test procedure.
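As a simple illustration of the bag-dilution step, the arithmetic below back-calculates the undiluted concentration from the diluted measurement; the volumes and measured value are hypothetical.

    sample_volume = 0.1      # L of challenge air placed in the gas sampling bag
    zero_air_volume = 0.9    # L of zero air added for dilution
    measured_conc = 8.5      # ppm measured in the diluted bag contents by GC/FID

    dilution_factor = (sample_volume + zero_air_volume) / sample_volume
    print(measured_conc * dilution_factor)   # -> 85.0 ppm in the undiluted challenge air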

The CW agents GB and HD will be collected in impingers containing an organic solvent,
or on solid sorbent cartridge, and determined by GC with flame photometric detection (FPD).
Determination of the CW agents will be conducted according to the HMRC Standard Operating
Procedure (SOP) HMRC-IV-056-06, "Operation and Maintenance of Gas Chromatography
Analysis of GA, GB, GD, GF, HD, and VX." The procedures of this method for gas
chromatographic analysis will also be adapted for the analysis of TICs by GC.



Table 4. Planned Reference Methods for Target TICs and CW Agents

Cyanogen chloride (CK)
    Concentration range: 2 to 100 ppm
    Sampling method: Low conc.: air sample injected directly; high conc.: air sample diluted in gas bag and then injected
    Analysis method: GC/FID

Hydrogen cyanide (AC)
    Concentration range: 0.05 to 100 ppm
    Sampling method: Low conc.: air sample injected directly; high conc.: air sample diluted in gas bag and then injected
    Analysis method: GC/FID

Phosgene (CG)
    Concentration range: 1 to 100 ppm
    Sampling method: Low conc.: air sample injected directly; high conc.: air sample diluted in gas bag and then injected
    Analysis method: GC/FID

Chlorine (Cl2)
    Concentration range: 0.1 to 100 ppm
    Sampling method: Continuous electrochemical detector with chlorine-specific sensor
    Analysis method: Continuous detection

Arsine (SA)
    Concentration range: 0.05 to 100 ppm
    Sampling method: Capillary gas chromatography with direct injection
    Analysis method: Mass selective detector (MSD)

GB
    Concentration range: 0.01 to 100 ppm
    Sampling method: Air sample collected with solvent impinger or solid sorbent
    Analysis method: GC/FPD(a)

HD
    Concentration range: 0.01 to 100 ppm
    Sampling method: Air sample collected with solvent impinger or solid sorbent
    Analysis method: GC/FPD(a)

(a) These measurements are governed by HMRC SOP HMRC-IV-056-06.



5.0 MATERIALS AND EQUIPMENT

Note: The assembly and preparation of equipment for this verification test is taking
place simultaneously with review of this draft test/QA plan. Consequently, the materials and
equipment to be used in this verification can be described in general, but individual items of
equipment cannot be specified.

5.1	Agents, Simulants, and TICs

As noted in Section 4.4, the chemical TICs to be used in this verification test will include:
cyanogen chloride (CK), hydrogen cyanide (AC), phosgene (CG), chlorine, and arsine (SA).
These gases are relatively common and readily available materials that could be used by
terrorists to attack a building. Chlorine is also a common, high-volume industrial chemical that
might be found at the scene of an industrial accident or transportation spill. All TICs except
cyanogen chloride will be purchased as dilute compressed gas mixtures from commercial
vendors, with a balance of nitrogen. The concentrations of those mixtures will be determined
based on the required challenge target concentrations. For cyanogen chloride, a compressed gas
standard will be prepared in Battelle's laboratories, using neat cyanogen chloride as the starting
material.

The CW agents planned for use in the verification test include GB and HD. These agents
are reasonable potential threats, and have been used in previous tests of CW agent IMS
instruments for military applications, thereby providing a possible link between this verification
test and previous testing. The CW agents will be obtained from the U.S. Army, under the
bailment agreement noted in Section 3.2.

5.2	Vapor Delivery Equipment

Different vapor delivery equipment will be used depending on the TIC or CW agents to
be tested. Compressed gas cylinders will be used as the vapor delivery source for all the TICs: cyanogen chloride, hydrogen cyanide, phosgene, chlorine, and arsine. For the less volatile CW
agents GB and HD, a sparging system or diffusion cell will be used depending on the challenge
concentration, with the sparging system providing a high vapor generation rate and a diffusion
cell providing a low vapor generation rate. A temperature controlled water bath will be installed
to control the temperature of the sparging system and the diffusion cell, to maintain a stable and
controllable vapor generation rate. A two-way valve will be included in the flow path
downstream of the vapor generation source, so that the dilution and test equipment can be totally
isolated from the source if necessary. A schematic of the entire vapor generation, dilution and
delivery system is shown in Section 6.0.

5.3 Temperature/Humidity Control

The IMS instruments will be evaluated at temperatures specified in Table 2, Section 4.2.6. Both
the delivered air temperature and the IMS instruments will be maintained within the specified
temperature range. For testing at 35°C, the vapor delivery system will be warmed with
heat-traced line, using an electronic temperature controller. For testing at 5°C, the dilution and
delivery system will be enclosed in a cooled chamber, to provide approximate temperature
control. For all tests, thermocouples will be installed in both the clean air plenum
(see Section 6.0) and the challenge plenum to provide real-time temperature monitoring.

A commercial Nafion® humidifier (Perma Pure, Inc.) will be used to generate controlled
high humidity air (50 to 100% RH), which will then be mixed with dry dilution air and the target
vapor stream to obtain the target RH (< 20% to 80%) in the challenge air.

5.4 Reference Methods

The planned reference methods were summarized in Section 4.7. The media used will
depend on the analyte and concentration range of interest. In summary, gas samples for CK, AC,
and CG will be collected directly in a syringe or diluted in Tedlar® bags, and direct injection via
sample loop or syringe will be used for subsequent analysis by GC/FID. For arsine, direct injection via syringe will be used, for analysis by GC/MSD. Chlorine will be determined
continuously by a chlorine-specific electrochemical sensor. For the CW agents, samples will be
collected into organic solvents in sampling impingers, or onto commercially available solid
sorbent cartridges, and subsequently injected for GC/FPD analysis.

5.5 Performance Evaluation Audit

The equipment needed for conducting the performance evaluation audit will consist of
independent standards used to check the reference methods against which IMS responses are
compared. These independent standards will be liquid or gaseous standards of the target TICs or
CW agents, prepared or obtained from different suppliers than those providing the standards used
for reference method calibrations. Description of the schedule and procedures for the PE audit is
provided in Section 7.2.2.



6.0 TEST PROCEDURES

The schematic of the test system is illustrated in Figure 4. The test system consists of a
vapor generation system, a Nafion® humidifier, two challenge plenums, a clean air plenum, RH
sensors, thermocouples, and mass flow meters. The challenge vapor or gas is generated by the
vapor generation system. The appropriate vapor generator, such as a sparging system, diffusion
cell, or compressed gas cylinder, will be selected depending on the compound of interest and the
concentration range to be tested. The challenge vapor from the vapor generation system will
then mix with the humid dilution air and flow into the challenge plenum.

The RH and target concentration of the challenge vapor will be obtained by adjusting the
mixing ratio of the humid air (from the Nafion® humidifier) to the dry dilution air, and the
mixing ratio of the vapor generation stream to the humid dilution air, respectively. To avoid
potential corrosion or malfunction of the relative humidity sensor from exposure to the challenge
vapor, the RH meter will be installed upstream of the inlet of the vapor stream. The RH of the
challenge vapor stream will be calculated based on the measured RH of the humid dilution air,
and the mixing ratio of the vapor generation stream to the humid dilution air.
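
For illustration only, the flow-weighted calculation described above can be expressed as the short sketch below (Python; the function names, flow rates, and concentrations are hypothetical and are not part of the test procedures). In the test itself, the delivered concentration is established by the reference method measurements rather than by calculation alone.

    def challenge_rh(rh_humid_air, flow_humid_air, flow_vapor_stream):
        """Estimate the challenge-stream RH from the measured RH of the humid
        dilution air, assuming the vapor generation stream is dry and the two
        flows mix ideally."""
        total_flow = flow_humid_air + flow_vapor_stream
        return rh_humid_air * flow_humid_air / total_flow

    def challenge_concentration(source_conc_ppm, flow_vapor_stream, flow_humid_air):
        """Estimate the delivered concentration (ppm) after the vapor generation
        stream is diluted into the humidified dilution air."""
        total_flow = flow_vapor_stream + flow_humid_air
        return source_conc_ppm * flow_vapor_stream / total_flow

    # Illustrative example: 0.2 L/min from a 1000 ppm cylinder standard mixed
    # into 9.8 L/min of humid dilution air at 55% RH.
    print(challenge_rh(55.0, 9.8, 0.2))               # about 53.9% RH
    print(challenge_concentration(1000.0, 0.2, 9.8))  # 20 ppm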

To establish the background readings of the two IMS units being tested, a clean air
plenum will be installed. Part of the humid dilution air will be introduced directly into the clean
air plenum. When establishing the IMS instrument background, the four-way valves connected
to the two IMS units will be switched to the clean air plenum to collect baseline data.

After the background measurement, the four-way valves connected to the two IMS units
will be switched to the challenge plenum to allow the IMS instruments to sample the challenge
mixture. Switching between the challenge and clean air plenums will be rapid, and the residence
time of gas in the test system will be short, to allow determination of the response and recovery
times of the IMS instruments. The use of two challenge plenums allows an assessment of the
recovery of IMS response, as when the user moves from one contaminated area to another area
of different contaminant concentration. Note that multiple IMS instruments can be tested
simultaneously using the test setup, although only two units of one IMS instrument are illustrated
in the schematic. The reference methods described in Section 4.7 will be used to quantify the gas concentrations in the clean air plenum and the challenge plenum to provide a cross-check of
the concentrations measured.

The test system depicted in Figure 4 is the basic system that will be used to assess the
response and performance of IMS instruments to challenge vapors of the selected chemicals.
The specific components and methods will depend, in part, on the type of evaluation and
chemical challenge. For example, the vapor system method will draw a known flow of the
chemical from a compressed gas cylinder, when testing with a volatile chemical such as the
TICs, or use a sparging system or diffusion cell, for less volatile compounds such as the CW
agents. Similarly, the test system will also incorporate an interferent generator (not shown in
Figure 4) as needed in the test for evaluation of interference effects. The interferent generator
will be a simple but realistic vapor source, for delivery of paint fumes, ammonia cleaners, and air
fresheners. For these interferents, a flow of approximately 100 cm3/min of clean air will be
passed through a sealed glass vessel containing a stirred aliquot of the interferent material. The
vapor picked up by the air stream will be diluted in the air flow to the test plenum, to achieve the
target interferent concentration. For delivery of vehicle exhaust, the interferent source will be a
small flow of exhaust drawn from the total exhaust of a gasoline engine. Testing with DEAE
will use a compressed gas mixture of DEAE in nitrogen, prepared in Battelle's laboratories. The
same interferent sources will be used in all tests.

The test system will be constructed so that a dedicated clean air and challenge air stream
can be sampled. The dedicated streams are needed to properly establish the system response to
clean air prior to an experiment. This is critical when testing a parameter such as response time,
so that the time constant of the test system can be uncoupled from that of the IMS instrument. A
single stream system would require too much time to change from clean air to challenge air,
preventing the actual response time of the IMS instrument from being properly measured.



Figure 4. Schematic of the Test System

6.1 Response Time

To evaluate IMS response time, the environmental conditions will be established at the
target conditions of 22 ± 3°C and 50 ± 5% RH. Initially 10 L/min of the clean humidified air will
pass through the clean air plenum. The IMS instruments will sample the clean air for a minimum
of 5 minutes, or until a stable reading has been indicated, but not to exceed 10 min, to obtain a
baseline for the IMS instrument. A stable reading is defined as one that does not change when
all system conditions are unchanged. For IMS instruments that do not provide an analog or
digital signal, but rather a status indicator such as a meter bar or relative measure
(e.g., low/medium/high), IMS readings will be considered stable when there is no change in the
reading over a 1-minute period. If the IMS instrument has a digital or analog signal, readings
that fluctuate by less than ± 20% and show no apparent trend over a 1-minute period will be
considered stable. Simultaneously with the sampling of the clean air by the IMS instruments, the
clean air plenum will also be sampled with the appropriate reference method. This sampling will
take place over at least a 5-minute period, after IMS readings have been stabilized.
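
For IMS instruments with a numeric output, the stability criterion above (readings within ± 20% and no apparent trend over a 1-minute period) could be screened as in the following sketch; the once-per-second logging rate and the numerical trend threshold are assumptions made for illustration only and are not part of this plan.

    import statistics

    def is_stable(readings, max_fluctuation=0.20, max_trend_fraction=0.05):
        """Check one minute of numeric IMS readings against the stability
        criterion: every reading within +/- 20% of the mean, and no apparent
        trend (approximated here as a small difference between the means of
        the first and second halves of the window)."""
        mean = statistics.mean(readings)
        if mean == 0:
            return all(r == 0 for r in readings)
        if any(abs(r - mean) / abs(mean) > max_fluctuation for r in readings):
            return False
        half = len(readings) // 2
        drift = abs(statistics.mean(readings[half:]) - statistics.mean(readings[:half]))
        return drift / abs(mean) <= max_trend_fraction

    # Illustrative check on 60 once-per-second readings
    print(is_stable([10.1, 9.8, 10.3, 10.0] * 15))   # True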

Concurrent with the background measurements will be the establishment and
demonstration of the target challenge concentration in the high challenge plenum. The high
challenge concentration will be generated at the target environmental conditions. Adjustments
will be made to the generator operating conditions and the dilution flow as needed to establish a
challenge concentration within ± 20% of the IDLH target, with a stability characterized by a
percent relative standard deviation of 10% or less in successive reference measurements.
Reference samples will be collected and analyzed immediately to establish the challenge
concentration and demonstrate stability prior to testing. A challenge concentration will be
considered stable if it can be maintained within the target challenge bounds based on three
consecutive reference sample measurements over a minimum of 5 min of continuous operation
prior to the test.

After a stable reading is obtained from the IMS instruments on background air, and the
challenge mixture is stable and at the target concentration, the four-way valve at the IMS
instrument's inlet will be switched to sample from the high challenge plenum. The response of the IMS instruments will be recorded and the time to reach a stable response will be determined.
The high challenge vapor concentration will also be determined by reference method sample
every 10 to 15 min during the procedure. A stable IMS response in sampling the challenge
mixture is defined in the same manner as above for baselining with clean air. The IMS will
sample from the challenge plenum for a minimum of 5 min, up to a maximum of 10 min.

After the challenge sampling has concluded, the sample inlet four-way valve will be
switched to again sample from the clean air plenum. The time required for the IMS instruments
to clear, i.e., the time to return to their starting baseline or non-alarm reading, will be recorded as
the recovery time. A minimum of 5 min will be permitted to allow the IMS instrument response
to return to baseline. After a maximum of 10 min, regardless of whether the IMS instrument has
returned to baseline, subsequent cycles of alternating challenge/clean air sampling will be carried
out, controlled by the four-way valve. A total of five such challenge/clean air cycles will be
completed.

In the case of an IMS instrument that enters a "backflush" mode or otherwise interrupts
sampling upon detection of the target chemical, a different approach will be used from that
outlined above. Upon interruption of sampling due to detection of the chemical, the IMS will
immediately be switched back to sampling from the clean air plenum. That is, the requirement
for a minimum 5 minute sampling period will be removed. Once the interruption or "backflush"
has ended, the baseline measurement will be taken and the process repeated.

Following the five challenge/clean air cycles, a corresponding set of five cycles will be
conducted, in which the IMS instruments alternate sampling from the high and low challenge
plenums. The high challenge plenum will provide the IDLH or comparable concentration, and
the low challenge plenum a concentration of approximately 0.1 times that level. Clean air will
be sampled before the first of the five high/low challenge cycles, and after the last of those
cycles. This procedure will simulate use of the IMS instruments in locations having different
degrees of contamination. If necessary, the alternate procedure described above for instruments
that interrupt sampling or go into a "backflush" mode will be used in this test as well.

The same sampling procedure will be carried out at different temperature and RH
conditions or challenge concentration to evaluate temperature and RH effects and response thresholds. The initial test will be conducted at a concentration equal to the target chemical's
IDLH level. If the chemical does not have an IDLH, then another concentration of significant
health impact will be targeted. The temperature and humidity effects will similarly be assessed
using the IDLH or other significant concentration.

If the instrument does not respond to the IDLH or other significant concentration
selected, then all subsequent tests planned for that chemical will be eliminated. Otherwise,
testing will proceed as described.

Table 5. Target Challenge Concentrations

Chemical                | Concentration              | Type of Level
Cyanogen chloride (CK)  | 20 ppm (50 mg/m3)          | Estimated based on IDLH for HCN
Hydrogen cyanide (AC)   | 50 ppm (50 mg/m3)          | IDLH(a)
Phosgene (CG)           | 2 ppm (8 mg/m3)            | IDLH
Chlorine (Cl2)          | 30 ppm (90 mg/m3)          | IDLH
Arsine (SA)             | 3 ppm (10 mg/m3)           | IDLH
GB                      | 0.015 ppm (0.087 mg/m3)    | AEGL-2(b)
HD                      | 0.09 ppm (0.6 mg/m3)       | AEGL-2

a: IDLH = Immediately dangerous to life and health.
b: AEGL = Acute Exposure Guideline Level; AEGL-2 levels are those expected to produce a serious hindrance
to efforts to escape in the general population.(2) The values shown assume a 10-minute exposure.

6.2 Response Threshold

The response threshold of each IMS instrument will be evaluated by repeating the
procedure of Section 6.1 at successively lower (or, if necessary, higher) concentrations, to define
the instrument's response threshold. The response threshold will be determined at the baseline
environmental condition of 22 ± 3°C and 50% RH, in the absence of any interfering chemicals.
The manufacturer's reported detection limit (± 50%) will be used as the starting concentration.
If no detection limit is reported by the manufacturer, then a concentration at least 10 times lower
than the IDLH or other target concentration will be used as a starting concentration. If there is
no response at the starting test concentration, then the concentration of the challenge will be
increased by a factor of two. Similarly, if the IMS instrument responds to the starting
concentration, then the challenge concentration will be decreased by a factor of two. The increase or decrease in concentration will be continued accordingly, until the response threshold
has been bracketed. The minimum concentration producing an IMS response will be denoted as
the response threshold.
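
The halving/doubling search described above can be summarized as the sketch below; `instrument_responds` is a stand-in for an actual challenge run with a reference measurement, and the concentration bounds are illustrative assumptions rather than requirements of this plan.

    def find_response_threshold(start_conc, instrument_responds,
                                min_conc=1e-4, max_conc=1e3):
        """Bracket the response threshold by halving the challenge concentration
        while the instrument responds, or doubling it while it does not.
        Returns the lowest concentration that produced a response."""
        conc = start_conc
        lowest_with_response = None
        while min_conc <= conc <= max_conc:
            if instrument_responds(conc):
                lowest_with_response = conc
                conc /= 2.0       # responded: try a lower concentration
            else:
                if lowest_with_response is not None:
                    break         # threshold bracketed between conc and 2 * conc
                conc *= 2.0       # no response yet: raise the concentration
        return lowest_with_response

    # Illustrative use with a simulated instrument that alarms at or above 0.4 ppm
    print(find_response_threshold(2.0, lambda c: c >= 0.4))   # 0.5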

The duplicate IMS instruments tested simultaneously may produce different instrument
responses. In that case, the concentrations will be varied as needed to assess the response
thresholds of the individual IMS instruments.

6.3 Repeatability

Repeatability will be assessed using data obtained from the five repeated clean
air/challenge or high/low challenge cycles, in the various tests conducted, such as the response
time tests. The five repeated test results at the same environmental and concentration conditions
will be reported, to demonstrate the repeatability of the measurements. No additional tests
specific to this parameter will be performed.

6.4 Accuracy

The accuracy of the IMS instruments will likewise not require any additional tests. In all
the response threshold and response time tests, the challenge concentration will be measured
using a reference method or monitor. Reference samples will be collected prior to, during, and
after IMS testing to ensure that a stable concentration is maintained. The reference samples will
be the ground truth samples used to assess accuracy for those IMS instruments that give a
quantitative concentration reading. For IMS instruments that give only a relative indication of
concentration, such as indicator bars, accuracy will be assessed based on manufacturer-supplied
data on the relationship of instrument readings to analyte concentration. It is assumed that
manufacturers have correlated such readings to absolute concentrations during development. If
those data are not proprietary and are provided, they will be used to assess accuracy. Alarm
readings, initiation of backflush mode, and other IMS responses will be used to assess accuracy
as described in Section 8.3.2.4.



6.5 Recovery Time

The time for the IMS instrument to return to its baseline reading or non-alarm state after
removing a challenge concentration will be measured as described under Response Time,
Section 6.1. No additional tests are planned beyond those conducted in Section 6.1.

6.6 Temperature and Humidity Effects

The tests described under Response Time in Section 6.1 will be repeated at the IDLH or
other selected target concentration of significant health concern, over the range of environmental
conditions shown in Table 2 (Section 4.2.6). Five repeat runs will be performed at each set of
test conditions, for each target TIC or CW agent. The same procedure used in Section 6.1 will
be used. The data at different temperature and RH conditions will be used to infer whether these
conditions affect the detection (i.e., accuracy, repeatability, response threshold) of the IMS
instrument for the target chemical. The effect on response time and recovery time will also be
assessed.

6.7 Interference Effects

To evaluate interference effects, the test system shown in Figure 4 will be modified with
the addition of an interferent vapor generator. The output from this source will be directed as
needed to mix with the humidified air flowing to the challenge plenum. The test chemical
generation can be independently controlled such that the interferent will be generated in the
absence or presence of the test chemical. This will allow interference effects to be evaluated
with the interferent alone, and with the interferent and TIC or CW agent together. Testing with
the interferent alone will allow evaluation of false positive responses, and testing with the
interferent and chemical together will allow evaluation of false negatives. The test procedures will also allow observation of interferent effects on the response time and recovery time of the
IMS instruments.

Interferent testing will involve only one interferent vapor at a time. Testing will be done
by alternately sampling clean air and the interferent mixture, for a total of up to five times each,
in a procedure analogous to that described in Section 6.1. However, if no interferent effect is
observed after three such test cycles, the test will be truncated at that point. Testing with
interferents alone will involve alternately sampling from the air plenum, and then from the
challenge plenum, to which the interferent in clean air is delivered. The same process will be
used for testing with interferents and TICs or agents together, with the two compounds diluted
together in humidified air delivered to the challenge plenum. The same TIC and CW agent
concentrations used in the initial testing under Section 6.1 will be used in this test, i.e., the IDLH
level or comparable. A response from the IMS instrument with the interferent alone will be
recorded as a false positive, and the absence of a response, or a reduced response, to the TIC or
agent in the presence of the interferent will be recorded as a false negative.

The replicate test runs conducted with the interferent plus TIC or agent will also allow
the response time and recovery time of the IMS instruments to be assessed with interferents
present. Differences in response and recovery times, relative to those in previous tests with only
the TIC or agent present, will be attributed to the effect of the interferent vapor.

6.8 Cold/Hot Start Behavior

The cold/hot start tests will be conducted in a manner similar to the Response Time test
in Section 6.1. Prior to these tests, however, the IMS instruments will not be given the warm-up
period recommended by the manufacturer.

The cold start test will be conducted both with the IMS at room temperature, and
subsequently at reduced temperature, prior to startup. In the former test, the IMS instruments
will be stored at 22 ± 3°C for at least 12 hours prior to testing. The cold start effect will be
assessed using an IDLH challenge concentration of one TIC, at the baseline conditions of
22 ± 3°C and 50% RH. The time from powering up the IMS instruments to their first readiness
to provide readings will be determined as the startup delay time. The response time - as defined
in Section 6.1 - will be measured, followed by the recovery time. Repeatability and accuracy in
five replicate clean air/challenge cycles also will be noted. For the reduced temperature cold
start, at the end of the test day the IMS instruments will be placed in a refrigerated enclosure
(5 to 8°C) for at least 12 hours overnight. At the start of the next test day, the cold start test will be
repeated, using the same baseline conditions (22 ± 3°C and 50% RH) and again recording the
startup delay time, and other performance parameters.

For the hot start test, the instruments will be placed in a heated enclosure at 40 ± 3°C for
at least 12 hours overnight. At the start of the next test day, the hot start test will be conducted in
the same fashion as the cold start test, at the baseline test conditions (22 ± 3°C and 50% RH).
Only one cold/hot start test will be performed per day, so that the IMS can equilibrate to storage
conditions prior to the test.

The cold/hot start test procedures will be to connect the IMS instruments to the clean air
manifold, and switch the instruments on. The time between switching the IMS instruments on
and when the instruments indicate they are ready to begin providing readings will be recorded as
the delay or standby time for each IMS unit being tested. Then the IMS instruments will be
connected (by the four-way valve in Figure 4) to the challenge plenum, which is supplied with
the IDLH level of the target TIC. The response time, stable reading, and recovery time of each
IMS unit will be recorded, for each of five successive periods of alternating clean air and
challenge mixture. The recorded data will be used to evaluate whether response and recovery
time, repeatability, and accuracy are affected by a cold or hot start relative to normal (i.e., fully
warmed up) operation.

6.9 Battery Life

An evaluation of battery life will be made by assessing the degradation of performance
with extended continuous operation. New batteries will be installed, or the IMS batteries will be
fully charged. The IMS then will be turned on and allowed to warm up, and an initial response time test will be performed per the procedure and baseline environmental conditions of
Section 6.1. A single TIC will be used in this evaluation. The indicated concentration signal
from the IMS will be recorded. The IMS will then sample clean air for 30 min, then the TIC
mixture will be sampled again. This procedure will be repeated until the response time doubles,
or until the IMS no longer responds to the presence of the target TIC. The total time of operation
will be recorded as the measure of battery life.



7.0 QUALITY ASSURANCE/QUALITY CONTROL
7.1 Equipment Calibrations

7.1.1	Reference Methods

The reference methods to be used for the determination of TICs and chemical agents are
described in Section 4.7. The analytical equipment needed for these methods will be calibrated,
maintained and operated according to the quality requirements of the methods indicated in
Section 4.7, and the normal documentation of the test facility.

7.1.2	IMS Instruments Checks

The IMS instruments will be operated and maintained according to the vendor's
instructions throughout the verification test. Vendors will be required to provide such
instructions before testing. Maintenance will be performed only according to a preset schedule
or in response to predefined IMS instrument diagnostics. Daily operational check procedures for
the IMS instruments are described below.

7.1.2.1 GB Agent Detection

On any test day on which the IMS instruments will be challenged with GB, each
instrument's functionality will be checked with a vendor-supplied simulant tube. In the event
that the vendor of an IMS instrument did not supply such a tube, Battelle staff will conduct the
operational check tests with dipropylene glycol monomethyl ether (DPGME) or dimethyl
methylphosphonate (DMMP). The results of all such checks will be recorded in laboratory
notebooks as part of the verification test records.



7.1.2.2 HD Agent Detection

On any test day on which the IMS instruments will be challenged with HD, each
instrument's functionality will be checked with a vendor-supplied simulant tube. In the event
that the vendor of an IMS instrument did not supply such a tube, Battelle staff will conduct the
check with the simulant methyl salicylate. The results of all such checks will be recorded in
laboratory notebooks as part of the verification test records.

7.2 Assessment and Audits

7.2.1	Technical Systems Audits

Battelle's Quality Manager will perform a TSA once during the performance of this
verification test. The purpose of this TSA is to ensure that the verification test is being
performed in accordance with this test/QA plan and that all QA/QC procedures are being
implemented. In this audit, the Quality Manager may review the reference sampling and analysis
methods used, compare actual test procedures to those specified in this plan, and review data
acquisition and handling procedures. The Quality Manager will prepare a TSA report, the
findings of which must be addressed either by modifications of test procedures or by
documentation in the test records and report.

At EPA's discretion, EPA QA staff may also conduct an independent on-site TSA during
the verification test. The TSA findings will be communicated to testing staff at the time of the
audit, and documented in a TSA report.

7.2.2	Performance Evaluation Audit

A PE audit will be conducted to assess the quality of the measurements made in this
verification test. This audit addresses only those reference measurements that factor into the data
used for verification, i.e., the IMS detection technologies are not the subject of the PE audit.

This audit will be performed once during the verification test, by analyzing a standard that is
independent of the standards used during the testing. Table 6 summarizes the PE audits that will
be done. These audits will be the responsibility of Battelle and test facility staff, and Table 6
indicates the acceptance criteria for the PE audit. These
criteria apply to each target TIC or chemical agent in the PE audit. In the event that results of
analysis of the PE audit standard do not meet the acceptance criteria, then the reference analysis
method will be recalibrated with the laboratory standards, as described in Section 7.1.1, and then
the PE audit standard will be reanalyzed. Continued failure to meet the PE audit criteria will
result in the pertinent data being flagged, and the purchase of new standards for repetition of the
PE audit. Battelle's Quality Manager will assess PE audit results.

Table 6. Summary of PE Audits

Parameter                | Audit Procedure                | Expected Tolerance
TIC Concentrations       | Analyze independent standards  | ± 20%
CW Agent Concentrations  | Analyze independent standards  | ± 30%

7.2.3 Data Quality Audit

Battelle's Quality Manager will audit at least 10% of the verification data acquired in the
verification test. The Quality Manager will trace the data from initial acquisition, through
reduction and statistical comparisons, and to final reporting. All calculations performed on the
data undergoing audit will be checked.



7.2.4	Assessment Reports

Each assessment and audit will be documented in accordance with the ETV
QMP.(1) Assessment reports will include the following:

•	Identification of any adverse findings or potential problems

•	Space for response to adverse findings or potential problems

•	Possible recommendations for resolving problems

•	Citation of any noteworthy practices that may be of use to others

•	Confirmation that solutions have been implemented and are effective.

7.2.5	Corrective Action

During the course of any assessment or audit, the Quality Manager will identify to the
technical staff performing experimental activities any immediate corrective action that should be
taken. If serious quality problems exist, the Quality Manager is authorized to stop work. Once
the assessment report has been prepared, the Verification Test Coordinator will ensure that a
response is provided for each adverse finding or potential problem, and will implement any
necessary follow-up corrective action. The Quality Manager will ensure that follow-up
corrective action has been taken.



8.0 DATA ANALYSIS AND REPORTING
8.1 Data Acquisition

Data acquisition in this verification test includes proper recording of the procedures used
in testing, to assure consistency in testing and adherence to this plan; documentation of sampling
conditions and analytical results for the reference methods; recording of the readings of the IMS
instruments in each portion of the test; and recording of observations about ease of use, cost, etc.
These forms of data acquisition will be carried out by the testing staff, in the form of test
notebooks, analytical data records, and data recording forms.

Table 7 summarizes the types of data to be recorded, how the data will be recorded, and
how often the data will be recorded. All data will be recorded by Battelle staff. The general
approach is to record all test information immediately and in a consistent format throughout all
tests. Identical file formats will be used to make quantitative evaluations of the data from all
technologies tested, to assure uniformity of data treatment. This process of data recording and
compiling will be overseen by the Battelle Verification Test Coordinator and Quality Manager.

8.1.1 IMS Data Acquisition

The acquisition of data from the IMS instruments will be tailored to the data output
capabilities of those instruments. It is expected that a visual display of readings, coupled with an
audible or visual alarm, will be the data output of most portable IMS instruments. For those IMS
instruments, data will be recorded manually by the testing staff, on data forms prepared before
the verification test. Separate forms will be prepared for distinct parts of the test, and each form
will require entries that assure complete recording of all test data. Note: These data forms will
be used in trial runs of the test procedure, and will be revised as necessary before IMS testing
begins. The final test data forms will be appended to the final version of this test/QA plan,
when the plan is distributed prior to the start of testing.



Some IMS instruments may have on-board data logging capabilities, or may provide an
electronic output signal. In such cases, data acquisition will be conducted electronically, using
the IMS instrument's own software or a personal computer-based data acquisition system in the
test facility.

Table 7. Summary of Data Recording Process for the Verification Test

Data to be Recorded | Where Recorded | How Often Recorded | Disposition of Data(a)
Dates, times of test events | Laboratory record books, data forms | Start/end of test, and at each change of a test parameter | Used to organize/check test results; manually incorporated in data spreadsheets as necessary
Test parameters (agent/surrogate identities and concentrations, temperature and relative humidity, gas flows, etc.) | Laboratory record books, data forms | When set or changed, or as needed to document the sequence of tests | Used to organize/check test results; manually incorporated in data spreadsheets as necessary
Reference method sampling data (identification of sampling media, sampling flows, etc.) | Laboratory record books, data forms | At least at start/end of reference sample, and at each change of a test parameter | Used to organize/check test results; manually incorporated in data spreadsheets as necessary
Reference method sample analysis, chain of custody, and results | Laboratory record books, data sheets, or data acquisition system, as appropriate | Throughout sample handling and analysis process | Transferred to spreadsheets
IMS instrument readings and diagnostic displays | Electronically if possible; prepared data forms otherwise | When stable at each new clean air, interferent, or challenge concentration; whenever updated in recovery and response time tests | Transferred to spreadsheets

(a) All activities subsequent to data recording are carried out by Battelle.



Whether collected manually or electronically, all IMS data will be entered into electronic
spreadsheets, set up to organize the IMS, reference method, and test condition (e.g., temperature,
RH, interferent concentration) data for each part of the test procedure. Organization of the data
in this way will allow evaluation of the various performance parameters clearly and consistently.
The accuracy of entering manually-recorded data into the spreadsheets will be checked at the
time the data are entered, and a portion of the data will also be checked by the Battelle Quality
Manager as part of the Data Quality Audit (Section 7.2.3). A separate spreadsheet will be set up
for each IMS instrument tested, and no intermingling or intercomparison of data from different
instruments will take place.

8.1.2	Laboratory Data Acquisition

Laboratory analytical data (e.g., reference method results quantifying the TICs or CW
agents used) may be produced electronically from, for example, gas chromatographic or electrochemical
instruments. For laboratory instruments that do not provide an electronic output, data will be recorded
manually in laboratory notebooks or on data forms prepared prior to the test. These records will
be reviewed on a daily basis to identify and resolve any inconsistencies. All written records
must be in ink. Any corrections to notebook entries, or changes in recorded data, must be made
with a single line through the original entry. The correction is then to be entered, initialed and
dated by the person making the correction.

8.1.3	Confidentiality

In all cases, strict confidentiality of test data for each vendor's technology, and strict
separation of data from different technologies, will be maintained. Separate files (including
manual records, printouts, and electronic data files) will be kept for each technology. At no time
during verification testing will Battelle staff engage in any comparison of different technologies
undergoing testing.



8.2 Data Review

Records generated in the verification test will receive a one-over-one review within two
weeks after generation, before these records are used to calculate, evaluate, or report verification
results. These records will include laboratory record books, completed data forms, electronic
spreadsheets or data files, and reference method analytical results. This review will be
performed by the Battelle Verification Test Coordinator or his designate, but in any case
someone other than the person who originally generated the record. Testing staff will be
consulted as needed to clarify any issues about the data records. The review will be documented
by the person performing the review by adding his/her initials and date to a hard copy of the
record being reviewed.

8.3 Data Evaluation

In order to extract the most information about IMS performance from the test procedures,
a multivariate statistical analysis of the test results will be performed whenever feasible. Such an
analysis will use all available data to explore the impact of test parameters on IMS performance.
However, a limitation in this approach is that the IMS instruments to be tested under this test/QA
plan provide primarily qualitative responses. That is, they indicate the presence or absence, and
in some cases the relative concentration, of a target TIC or CW agent, rather than a quantitative
concentration. As a result, for some IMS instruments the data produced in this test may not lend
themselves to multivariate analysis. To address this limitation, a multivariate analysis is
planned, but is backed up by single-variable analyses that will be employed as needed. Section
8.3.1 below describes the multivariate approach, and Section 8.3.2 describes the single variable
analyses.

8.3.1 Multivariate Analyses

The multivariate analyses focus on the following IMS performance parameters:
• Response Time

•	Recovery Time

•	Repeatability

•	Accuracy

•	False positives/False negatives,

by considering the following explanatory effects:

•	Identity of the target TIC or CW agent

•	Temperature

•	Humidity

•	IMS Start State (i.e., warmed up, cold start, etc.)

•	Identity and presence or absence of interferent

The performance parameters of response threshold and battery life do not lend themselves to
a multivariate analysis based on the planned test procedures, and will be addressed using a
single-variable approach (Sections 8.3.2.2 and 8.3.2.9).

8.3.1.1 Evaluation of Multiple Performance Parameters

For each IMS instrument, response and recovery time, repeatability, and accuracy will be
measured with each target TIC and CW agent, at varying conditions of four environmental
variables: temperature, humidity, start state, and interferent. At least five measures of the
performance parameters will be taken for each combination of TIC/agent and environmental
variables. Furthermore, since two units of each IMS instrument will be tested simultaneously, up
to ten measures of each performance parameter will be available for each combination. Thus, for
example, because three temperature levels will be assessed (5, 20, and 35°C) at a fixed humidity
(50% RH) and start state (warmed up), at least five measures of the performance parameters
will be available for each TIC/CW agent and temperature combination.

A multivariate analysis of variance (MANOVA) will be performed to quantify IMS
performance and to understand how IMS performance relates to TIC/CW agent identity and the
values of the environmental variables. Given the experimental design, it is not anticipated that it will be possible to uncover interactions between temperature, humidity, and the other variables.
For example, the design is limited to recording IMS response as temperature varies at one level
of humidity, and recording IMS response as humidity varies at one level of temperature. For
reasons of experimental practicality, the design does not include simultaneously high values of
temperature and humidity. However, the data analysis will consider environmental interactions
and the degree to which available data do in fact allow for their exploration.
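
The plan specifies SAS for these analyses (Section 8.3.1.3). Purely to illustrate the type of model intended, a minimal MANOVA sketch in Python (statsmodels) with entirely synthetic, illustrative data might look like the following; none of the values represent test results.

    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    # Synthetic example only: replicate response and recovery times (seconds)
    # for one TIC at several temperature/RH conditions.
    df = pd.DataFrame({
        "resp_time": [12, 11, 13, 15, 14, 16, 18, 17, 19, 12, 13, 12],
        "rec_time":  [30, 32, 31, 35, 36, 34, 40, 41, 39, 31, 30, 32],
        "temp_C":    [22, 22, 22, 35, 35, 35,  5,  5,  5, 22, 22, 22],
        "rh_pct":    [50, 50, 50, 50, 50, 50, 50, 50, 50, 80, 80, 80],
    })

    # Two performance parameters modeled jointly against the environmental factors.
    fit = MANOVA.from_formula("resp_time + rec_time ~ temp_C + rh_pct", data=df)
    print(fit.mv_test())   # Wilks' lambda, Pillai's trace, etc. for each factor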

8.3.1.2	False Positives and False Negatives

A representative set of potentially interfering compounds will be added to air samples,
both with and without a target TIC or CW agent present in the samples. Some IMS instruments
may provide only a binary (yes/no) response indicating the detection or non-detection of the
target TIC/CW agent. At least five such binary responses will be collected for each
interferent/zero air and interferent/TIC or agent combination. The false positive and negative
rates of the IMS will be modeled in such cases using logistic regression, a technique that relates
the chance of an event (for example, the chance of a positive reading when no TIC/CW agent is
present) to explanatory variables (for example, interferent). The focus of the analyses will be to
understand the relationship between false positive rate and interferent; and false negative rate
and interferent/TIC or agent combination. For IMS instruments that provide a quantitative
measure of the TIC or CW agent concentration, an analysis will be conducted to assess whether
significant differences in response result from the presence of the interferent. Both types of
analyses will use data from tests conducted with the interferent species, and corresponding data
from other parts of the test procedures in which no interferent was present.
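
Again for illustration only (the formal analyses will be conducted in SAS), a logistic regression of false positive responses on interferent identity could be sketched as follows, using synthetic binary data that are not drawn from any test.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic example only: binary IMS readings from interferent-only challenges
    # (alarm = 1 means the instrument gave a positive response with no TIC present).
    df = pd.DataFrame({
        "interferent": ["paint"] * 5 + ["ammonia_cleaner"] * 5,
        "alarm":       [0, 0, 1, 0, 0, 1, 1, 1, 0, 1],
    })

    # Model the chance of a false positive as a function of interferent identity.
    fit = smf.logit("alarm ~ C(interferent)", data=df).fit()
    print(fit.summary())
    # Predicted false positive probability for one interferent (illustrative)
    print(fit.predict(pd.DataFrame({"interferent": ["ammonia_cleaner"]})))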

8.3.1.3	Support Tools

All data analyses will be conducted using the statistical analysis software, SAS. The SAS
software provides extensive analytical capabilities, handling a wide range of statistical analyses,
including analysis of variance, regression, categorical data analysis, multivariate analysis,
survival analysis, cluster analysis, and nonparametric analysis. As indicated, the analyses
described above will rely primarily on SAS' support of multivariate analysis of variance and logistic regression. SAS tools will also be used for data summarization, including visualization
of data with high-resolution graphics.

8.3.2 Single-Variable Analyses

8.3.2.1 Response Time

The data collected to evaluate response time will be the measured time periods (in
seconds) between the start of IMS sampling from the challenge plenum and the achievement of
stable IMS readings, an alarm state, or a switch to the backflush mode, on the challenge gas.
These data will be recorded in sets of five, as a result of alternately sampling clean air and the
challenge gas five successive times. Five replicate response time measurements will be recorded
in all tests in which the IMS instruments are challenged with a test mixture, whether that mixture
is of a TIC, a CW agent, or an interferent. The only exception is that if no effect is observed
from an interferent after three replicates, the final two replicates will not be conducted. Different
types of response times may be recorded for a single IMS instrument. For example, an
instrument may provide an audible alarm and a visual display of qualitative readings. In that
case, both the time to alarm, and the time to achieve stable qualitative response will be recorded
in each test.

The recorded response time data will be tabulated in the verification report, and will be
summarized in terms of the mean and range of response times observed. Data analysis will
include comparison of the observed means and ranges of response times under different test
conditions. For example, response time may vary as a function of the target analyte
concentration, so the response times will be compared graphically (linear regression) and/or
statistically (comparison of means) to determine whether there is a significant dependence of
response time on concentration. Linear regression analysis will focus on whether a statistically
significant slope and correlation result from the regression of IMS results against reference
method concentration data. Comparison of means will assess whether the mean response time at
one concentration differs from that at another concentration. Corresponding comparisons will be
made to assess the effect of temperature, RH, and the presence of interferents on response times. These comparisons will be carried out using data for each TIC and CW agent tested, and
consequently the response time will be assessed separately for each such target chemical.
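
For illustration, the comparison-of-means and regression checks described above might take the following form (Python with scipy; all response time and concentration values are hypothetical, and the formal analyses will use SAS).

    import numpy as np
    from scipy import stats

    # Hypothetical replicate response times (seconds) for one TIC at two
    # challenge concentrations.
    t_high = np.array([12.0, 11.5, 13.0, 12.2, 11.8])   # at 20 ppm
    t_low  = np.array([18.5, 20.1, 19.0, 21.3, 19.8])   # at 2 ppm

    # Comparison of means (two-sample t-test, unequal variances assumed)
    t_stat, p_val = stats.ttest_ind(t_high, t_low, equal_var=False)
    print(p_val)   # a small p-value suggests response time depends on concentration

    # Linear regression of response time on reference concentration
    conc = np.array([20.0] * 5 + [2.0] * 5)
    times = np.concatenate([t_high, t_low])
    slope, intercept, r_value, p_slope, stderr = stats.linregress(conc, times)
    print(slope, p_slope)   # a significant slope indicates a concentration effect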

8.3.2.2	Response Threshold

The data used to evaluate the response threshold will be the five replicate IMS readings
obtained at each succeeding target analyte concentration, in the procedure described in
Section 6.2. These data will be tabulated, along with the corresponding reference method data
that establish the challenge concentration. The response threshold will be determined by
inspection as the lowest reference method concentration that produces a positive IMS response in
all five replicate runs. In this evaluation, the consistency of the IMS readings is not an issue, e.g., an
IMS response of "low" is equivalent to a response of "medium" or "high" in terms of the
response threshold evaluation.

8.3.2.3	Repeatability

Repeatability will be assessed by means of the stable IMS readings recorded in the
successive periods of sampling from the challenge plenum, at each concentration of TIC or CW
agent. Each set of five replicate readings will be tabulated, and the consistency of readings will
be noted as a function of the identity and concentration of the target analyte, the temperature and
RH, and the presence of an interferent. In the case of IMS instruments that provide only alarms
or qualitative responses, the evaluation of repeatability will be necessarily qualitative. That
evaluation will be conducted by noting, for example, whether all five readings in a test run were
the same, or only four out of five were the same, etc. The exact nature of this qualitative evaluation
will depend on the nature of the data output provided by the IMS instrument. In the simplest
form, the evaluation of repeatability may involve only the consistency of providing an alarm or
switching into a backflush when the TIC or agent is present.

For IMS instruments that provide a quantitative data output, repeatability will be assessed
in terms of the percent relative standard deviation (%RSD) of the five readings from each test,
i.e.,

%RSD = (SD/Mean) x 100

where SD is the standard deviation of the five readings in a test, and Mean is the arithmetic
average of the five readings.
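
A direct rendering of this calculation, with illustrative readings only, is:

    import statistics

    def percent_rsd(readings):
        """Percent relative standard deviation of a set of replicate readings."""
        return 100.0 * statistics.stdev(readings) / statistics.mean(readings)

    # Five hypothetical quantitative IMS readings from one repeatability test
    print(round(percent_rsd([9.8, 10.2, 10.0, 9.9, 10.1]), 2))   # about 1.6 %RSD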

The %RSD results will be evaluated by inspection, and apparent differences in
repeatability will be tested for significance by a comparison of means test (Student's t or
similar).

8.3.2.4 Accuracy

Accuracy will be assessed by comparing the IMS readings with the reference method
results, for each TIC and agent tested. The comparison will be conducted differently for
quantitative IMS results relative to qualitative results.

For IMS instruments that provide quantitative data, accuracy will be assessed by a linear
regression of IMS data against reference method data. This comparison will be conducted
separately for each TIC and agent tested, and will use all test results. Results from tests at the
baseline conditions (22°C and 50% RH) with no interferent present will be segregated from
those at other test conditions, or with interferents present, but the same comparisons will be
conducted on all data sets. The comparison will assess whether the slope of the regression line is
significantly different from 1.0 and whether the intercept of the regression line is significantly
different from zero.
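
For illustration only, the slope and intercept tests described above could be carried out as in the sketch below (Python with statsmodels; the paired values are hypothetical, and SAS will be used for the formal analysis).

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical paired results for one TIC: reference concentrations (ppm)
    # and the corresponding quantitative IMS readings.
    reference = np.array([2.0, 5.0, 10.0, 20.0, 30.0])
    ims       = np.array([2.3, 4.6,  9.1, 21.5, 27.8])

    X = sm.add_constant(reference)        # model: ims = intercept + slope * reference
    fit = sm.OLS(ims, X).fit()
    intercept, slope = fit.params
    se_intercept, se_slope = fit.bse

    # t statistics: is the intercept different from 0, and the slope different from 1?
    print("t (intercept vs 0):", intercept / se_intercept)
    print("t (slope vs 1):", (slope - 1.0) / se_slope)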

For IMS instruments that provide qualitative data output, the assessment of accuracy will
depend on information provided by each IMS vendor on the correspondence of qualitative
readings to quantitative values. Accuracy will then be assessed by comparing the reference
method data with the ranges of concentration indicated by qualitative IMS readings. This
comparison will result in a Yes/No (Y/N) assessment of accuracy for each reference/IMS data
set. For example, an IMS vendor whose instrument provides a low/medium/high indication
reports that the "medium" response range for a particular chemical agent corresponds to
concentrations of 5 to 10 (arbitrary units for example only). Then any IMS reading of "medium"
that corresponds with a reference method result of 5 to 10 units will be designated as accurate (Yes); "medium" readings that correspond to reference values outside the 5 to 10 range will be
designated as inaccurate (No). The results will be tabulated and the Y/N results will be
reviewed. As with the quantitative data, qualitative accuracy will be assessed for each TIC and
agent, using all test data.
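
As a simple sketch of this Yes/No comparison (the low/medium/high ranges below are the arbitrary illustrative units used in the example above, not vendor data):

    # Hypothetical vendor-supplied mapping of qualitative readings to
    # concentration ranges (arbitrary units, for illustration only).
    RANGES = {"low": (0.0, 5.0), "medium": (5.0, 10.0), "high": (10.0, float("inf"))}

    def reading_is_accurate(qualitative_reading, reference_value):
        """Y/N accuracy check: does the reference method result fall inside the
        range the vendor assigns to the qualitative IMS reading?"""
        low, high = RANGES[qualitative_reading]
        return low <= reference_value <= high

    print(reading_is_accurate("medium", 7.2))    # True  -> accurate (Yes)
    print(reading_is_accurate("medium", 12.4))   # False -> inaccurate (No)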

For IMS instruments that provide only an alarm, or that switch into a backflush mode and
stop sampling upon detection of the target species, accuracy will be assessed only in terms of
false positives and false negatives. For this evaluation, a positive IMS response in the absence of
the TIC or CW agent will be deemed a false positive, and the absence of IMS
response at any concentration above the response threshold for the target species will be deemed
a false negative.

8.3.2.5	Recovery Time

Recovery time will be evaluated in the same manner as described above for response time
in Section 8.3.2.1, except that the data points will be the time from switching the IMS sampling
point to the clean air plenum until baseline IMS response, the absence of an alarm, or a return
from backflush mode is achieved. As is the case for response time, recovery time will be
evaluated for all test runs, for all TICs and agents tested, by means of the mean and range of the
values found in each test.

8.3.2.6	Temperature and Humidity Effects

Temperature and humidity effects will be assessed by direct comparison of test results
under baseline conditions (22°C and 50% RH) to those under other conditions. Temperature or
RH effects will be examined relative to each of the performance parameters being tested, i.e.,
response time, recovery time, accuracy, etc. Thus assessment of temperature or RH effects
involves comparison of results for those performance parameters under different temperature and
RH conditions.

These effects will be evaluated by tabulation of the results obtained for the various
performance parameters, under each set of temperature and RH conditions. Identification of
temperature or RH effects will begin by inspecting the data for apparent differences that may be a function of temperature or RH. Any suspected differences will then be investigated by
appropriate means, such as linear regression or comparison of means. The effect of temperature
will be assessed by comparing data from the tests conducted at 10 to 30°C with constant
50 (± 5)% RH; the effect of RH will be assessed by comparing data from the tests at < 20 to 80%
RH at constant 22°C temperature. These evaluations will be done separately for each TIC and
CW agent tested.

8.3.2.7 Interference Effects

The impact of interferences on IMS response will be assessed by comparison of response
with a potential interferent present to that in the absence of interferent, under the same test
conditions. Response will consist of the readings of the IMS instrument in tests both with and
without the interferent. Comparison of these responses may conveniently be done graphically, to
illustrate the difference or similarity of the responses. All response readings with the interferent
present must be the same as those without the interferent present, or an interferent effect will be
inferred. For example, three positive and two negative responses in the presence of the
interferent will be judged as different from two positive and three negative responses in the
absence of the interferent.

The interference data will be evaluated in two ways. Data from the tests with interferent
present alone will be used to assess false positive readings, i.e., comparison of IMS readings with
interferent and clean air will assess whether the IMS instruments give a positive indication of a
TIC or agent due to the presence of interferent. Data from the tests with both interferent and a
TIC or agent will be used to assess false negatives, i.e., the absence of a response to the TIC or
agent when the interferent is present. A reduced or enhanced response to the TIC or agent when
the interferent is present, relative to that without the interferent, will be taken as indication of a
partial masking or interference in the IMS response.

This evaluation will be conducted by matching (in the data spreadsheets) the results from
tests with interferents present with those at the same conditions without interferents. This
organization of the data will be done separately for each TIC or agent tested, so that interferent
effects are assessed separately for each TIC or agent.



8.3.2.8	Cold/Hot Start Behavior

A primary evaluation of cold/hot start behavior will use the measured time between the
startup of the IMS instrument and when it is ready to provide data. Three values of this result
will be tabulated: one resulting from a cold start from room temperature, another from a cold
start at reduced temperature (5 to 8°C), and the third from startup after an extended period of
storage in a hot environment. These three measured delay times will be reported without any
additional data analysis.

Additional evaluation of cold/hot start behavior will result from the determination of
response time, repeatability, and recovery time in the tests that immediately follow the cold and
hot starts. These data, which will result from the determination of these performance parameters
as described elsewhere in Section 8.3, will be compared to those from tests under the same
baseline conditions with full warmup prior to testing. Differences in performance between
cold/hot start and warmed up operation will be investigated by comparing the mean values and
ranges of the results.

8.3.2.9	Battery Life

Both battery life and the effectiveness of battery operation will be assessed. Battery life
will be reported as the time (in minutes) from startup to battery exhaustion when an IMS
instrument is warmed up and operated solely on battery power at room temperature and 50% RH.
This time will be measured from initial startup of the instrument to the point in time when the
IMS instrument no longer responds to a challenge mixture of a selected TIC in air.

The effectiveness of battery operation will be assessed by comparing the triplicate test
results for a single TIC with the IMS instrument operated on AC power, to the corresponding
results when the same test is immediately repeated using IMS battery power. The results for
response time, recovery time, accuracy, and repeatability will be compared to assess whether any
substantial differences result from use of battery power.


8.4 Reporting

The data comparisons described in Section 8.3 will be conducted separately for each IMS
instrument undergoing verification. Separate verification reports will then be prepared, each
addressing one IMS technology. Each verification report will present the test data, as well as the
results of the evaluation of those data. The verification report will briefly describe the ETV
program, and will present the procedures used in verification testing. These sections will be
common to each verification report resulting from this verification test. The results of the
verification test will then be stated quantitatively, without comparison to any other technology
tested, or comment on the acceptability of the technology's performance. The preparation of
draft verification reports, their review by vendors and others, and the revision, final approval, and
distribution of the reports will be conducted as stated in the ETV QMP(1).
Preparation, approval, and use of Verification Statements summarizing the results of this test
also will be subject to the requirements of that same document.


9.0 HEALTH AND SAFETY

All participants in this verification test (i.e., Battelle, EPA, and vendor staff) will adhere
to the security, health, and safety requirements of the Battelle facility in which testing will be
performed. Vendor staff will train Battelle testing staff in the use of their portable IMS
instruments, but will not be the technology users during the testing. To the extent allowed by the
test facility, vendor staff may observe, but may not conduct, any of the verification testing
activities identified in this test/QA plan.

9.1 Access

Access to restricted areas of the test facility will be limited to staff who have met all the
necessary training and security requirements. The existing access restrictions of the test facility
will be followed, i.e., no departure from standard procedures will be needed for this test.

9.2 Potential Hazards

This verification involves, in part, the use of extremely hazardous chemical materials.
Verification testing involving those materials must be conducted only in properly certified surety
facilities that are capable of handling such materials safely.

In addition, simulant and TIC materials used in this verification may be toxic, and must
be used with appropriate attention to good laboratory safety practices.

9.3 Training

Because of the hazardous materials involved in this verification test, documentation of
proper training and certification of the test personnel is mandatory before testing takes place.
The Battelle Quality Manager, or a designee, must ensure that documentation of such training is
in place for all test personnel before allowing testing to proceed.


9.4 Safe Work Practices

All visiting staff at the test facility will be given a site-specific safety briefing prior to the
start of any test activities. This briefing will include a description of emergency operating
procedures, and the identification, location, and operation of safety equipment (e.g., fire
alarms, fire extinguishers, eye washes, exits). Testing procedures must follow all safety
practices of the test facility at all times. Any report of unsafe practices in this test, by those
involved in the test or by other observers, shall be grounds for stopping the test until the Quality
Manager and testing personnel are satisfied that unsafe practices have been corrected.

9.5 Equipment Disposition

Tests conducted according to this plan will require that all equipment that has been
exposed to chemical surety materiel be decontaminated and/or disposed of. Although efforts will
be made to remove any contaminated parts of the IMS instruments after testing, there is no
guarantee that this will be feasible. Consequently, it is not certain that IMS instruments
undergoing testing will be returned to the vendor at the completion of the tests.


10.0 REFERENCES

1.	Environmental Technology Verification Program Quality and Management Plan (QMP) for
the EPA Pilot Period, 600/R-98/064, U.S. Environmental Protection Agency, Washington,
D.C., May 1998.

2.	Proposed Acute Exposure Guideline Levels (AEGLs), Nerve Agents GA, GB, GD, GF,
U.S. EPA, Office of Pollution Prevention and Toxics, Public Draft, October 2000. Federal
Register (www.access.gpo.gov/su_docs/aces/acesl40.html).

