June 2003

Environmental Technology
Verification Report

Clean Air Technologies
International, Inc.

REMOTE

On-Board Emissions Monitor

Prepared by
Battelle

. . . Putting Technology To Work

Under a cooperative agreement with

U.S. Environmental Protection Agency


-------
June 2003

Environmental Technology Verification

Report

ETV Advanced Monitoring Systems Center

Clean Air Technologies
International, Inc.

REMOTE
On-Board Emissions Monitor

by

Jeffrey Myers
Thomas Kelly
Amy Dindal
Zachary Willenberg
Karen Riggs

Battelle
Columbus, Ohio 43201


-------
Notice

The U.S. Environmental Protection Agency (EPA), through its Office of Research and
Development, has financially supported and collaborated in the extramural program described here.
This document has been peer reviewed by the Agency and recommended for public release. Mention
of trade names or commercial products does not constitute endorsement or recommendation by the
EPA for use.



-------
Foreword

The U.S. Environmental Protection Agency (EPA) is charged by Congress with protecting the
nation's air, water, and land resources. Under a mandate of national environmental laws, the
Agency strives to formulate and implement actions leading to a compatible balance between
human activities and the ability of natural systems to support and nurture life. To meet this
mandate, the EPA's Office of Research and Development provides data and science support that
can be used to solve environmental problems and to build the scientific knowledge base needed
to manage our ecological resources wisely, to understand how pollutants affect our health, and to
prevent or reduce environmental risks.

The Environmental Technology Verification (ETV) Program has been established by the EPA to
verify the performance characteristics of innovative environmental technology across all media
and to report this objective information to permitters, buyers, and users of the technology, thus
substantially accelerating the entrance of new environmental technologies into the marketplace.
Verification organizations oversee and report verification activities based on testing and quality
assurance protocols developed with input from major stakeholders and customer groups
associated with the technology area. ETV consists of seven environmental technology centers.
Information about each of these centers can be found on the Internet at http://www.epa.gov/etv/.

Effective verifications of monitoring technologies are needed to assess environmental quality
and to supply cost and performance data to select the most appropriate technology for that
assessment. In 1997, through a competitive cooperative agreement, Battelle was awarded EPA
funding and support to plan, coordinate, and conduct such verification tests for "Advanced
Monitoring Systems for Air, Water, and Soil" and report the results to the community at large.
Information concerning this specific environmental technology area can be found on the Internet
at http://www.epa.gov/etv/centers/center1.html.



-------
Acknowledgments

The authors wish to acknowledge the support of all those who helped plan and conduct the
verification test, analyze the data, and prepare this report. In particular we would like to thank
ETV stakeholders Jeff Cook, Clifford Glowacki, and Donald Stedman.



-------
Contents

Page

Notice	ii

Foreword	 iii

Acknowledgments	iv

List of Abbreviations	 viii

1	Background	 1

2	Technology Description	2

3	Test Design and Procedures	 3

3.1	Introduction 		3

3.2	Test Design	3

3.3	Reference Methods		5

3.4	OEM Installation 		5

3.5	Test Schedule		5

3.6	On-Road Testing	7

4	Quality Assurance/Quality Control 	 8

4.1	Reference Method Calibrations and Checks	 8

4.2	Audits	 8

4.2.1	Pre-Test Facility Audit 	 8

4.2.2	Performance Evaluation Audit 	 9

4.2.3	Technical Systems Audit	 9

4.2.4	Audit of Data Quality	 9

4.3	QA/QC Reporting 	 10

4.4	Data Review	 10

5	Statistical Methods 	 11

5.1	Bias 	 11

5.2	Precision	 12

5.3	Other Factors 	 13

6	Test Results	 14

6.1	Bias 	 16

6.2	Unit-to-Unit Precision 	 19

6.3	Other Factors 	20

6.3.1	Reliability and Ease of Use	20

6.3.2	Other Unit-to-Reference Method Comparisons	21

6.3.3	Temperature Effect	21

v


-------
7	Performance Summary 	29

8	References 		30

Figures

Figure 2-1. Clean Air Technologies REMOTE On-Board Emissions Monitor	2

Figure 3-1. Chevy Tahoe in Dynamometer Cell 	6

Figure 3-2. External View of Chevy Tahoe Before On-Road Testing	 7

Figure 3-3. View of the Duplicate REMOTE OEMs in the Chevy Tahoe Before On-Road Testing	 7

Figure 6-1a. Integrated Sample Comparison for Hydrocarbons		14

Figure 6-1b. Integrated Sample Comparison for Carbon Monoxide		15

Figure 6-1c. Integrated Sample Comparison for Nitrogen Oxides		15

Figure 6-1d. Integrated Sample Comparison for Carbon Dioxide		16

Figure 6-2a. FTP Individual Bag Comparison for Hydrocarbons		17

Figure 6-2b. FTP Individual Bag Comparison for Carbon Monoxide		17

Figure 6-2c. FTP Individual Bag Comparison for Nitrogen Oxides		18

Figure 6-2d. FTP Individual Bag Comparison for Carbon Dioxide		18

Figure 6-3a. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for Hydrocarbons During an FTP Cycle	22

Figure 6-3b. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for Carbon Monoxide During an FTP Cycle	23

Figure 6-3c. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for Nitrogen Oxides During an FTP Cycle	24

Figure 6-3d. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for Carbon Dioxide During an FTP Cycle	25

Figure 6-4a. Linear Regression Comparison Between Reference Method and REMOTE OEM for Hydrocarbons	26



-------
Figure 6-4b. Linear Regression Comparison Between Reference Method and REMOTE OEM for Carbon Monoxide	26

Figure 6-4c. Linear Regression Comparison Between Reference Method and REMOTE OEM for Nitrogen Oxides	27

Figure 6-4d. Linear Regression Comparison Between Reference Method and REMOTE OEM for Carbon Dioxide	27

Tables

Table 3-1. Summary of Chassis Dynamometer Test Cycles	4

Table 3-2. Schedule for Chassis Dynamometer Test Cycles	 6

Table 4-1. Results of Performance Audit	 9

Table 6-1. Percent Bias Values and Confidence Intervals for REMOTE OEM	 19

Table 6-2. Unit-to-Unit Precision Results and Confidence Intervals for REMOTE OEM	 20

Table 6-3. Temperature Effect Results	28



-------
List of Abbreviations

AMS	Advanced Monitoring Systems

ATL	Automotive Testing Lab

CL	chemiluminescence

CO	carbon monoxide

CO2	carbon dioxide

CV	coefficient of variation

EPA	U.S. Environmental Protection Agency

ETV	Environmental Technology Verification

FID	flame ionization detector

FTP	federal test procedure

g	gram

GC	gas chromatography

HC	hydrocarbon

L	liter

LL	lower limit

mi	mile

NDIR	non-dispersive infrared

NIST	National Institute of Standards and Technology

NOx	nitrogen oxides

O2	oxygen

OBD	on-board diagnostic

OEM	on-board emissions monitor

PE	performance evaluation

QA/QC	quality assurance/quality control

QA	quality assurance

QMP	Quality Management Plan

REMOTE	real-world emissions monitoring on-board testing equipment

TSA	technical systems audit

UL	upper limit



-------
Chapter 1
Background

The U.S. Environmental Protection Agency (EPA) supports the Environmental Technology
Verification (ETV) Program to facilitate the deployment of innovative environmental tech-
nologies through performance verification and dissemination of information. The goal of the
ETV Program is to further environmental protection by substantially accelerating the acceptance
and use of improved and cost-effective technologies. ETV seeks to achieve this goal by provid-
ing high-quality, peer-reviewed data on technology performance to those involved in the design,
distribution, financing, permitting, purchase, and use of environmental technologies.

ETV works in partnership with recognized testing organizations; with stakeholder groups
consisting of buyers, vendor organizations, and permitters; and with the full participation of
individual technology developers. The program evaluates the performance of innovative tech-
nologies by developing test plans that are responsive to the needs of stakeholders, conducting
field or laboratory tests (as appropriate), collecting and analyzing data, and preparing peer-
reviewed reports. All evaluations are conducted in accordance with rigorous quality assurance
(QA) protocols to ensure that data of known and adequate quality are generated and that the
results are defensible.

The EPA's National Exposure Research Laboratory and its verification organization partner,
Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center
evaluated the performance of the Clean Air Technologies International, Inc., REMOTE (Real-
world Emissions Monitoring On-board Testing Equipment) on-board emissions monitor (OEM)
in May of 2001. A delay by the vendor postponed the preparation of this verification report until
early 2003.



-------
Chapter 2
Technology Description

The objective of the ETV AMS Center is to verify the performance characteristics of
environmental monitoring technologies for air, water, and soil. This verification report provides
results for the verification testing of the Clean Air Technologies International, Inc., REMOTE
OEM. Following is a description of the REMOTE OEM, based on information provided by the
vendor. The information provided below was not verified in this test.

The REMOTE OEM is capable of measuring exhaust emissions from electronically controlled
light-duty passenger vehicles and light trucks of model year 1996 and newer with on-board
diagnostics (OBD) ports. The REMOTE OEM, using infrared techniques to measure carbon
monoxide (CO), carbon dioxide (CO2), and total hydrocarbons (HC) and electrochemical
techniques to measure nitrogen oxides (NOx), is designed to provide real-time on-road emissions
measurements and to derive test- and bag-averaged emissions during standard vehicle test cycles,
as used in vehicle dynamometer testing. The REMOTE OEM provides second-by-second total
HC, CO, NOx, CO2, and oxygen (O2) readings and total mass emissions summaries for
individual test cycles. It includes a touch-screen computer and comes standard in a
powder-coated aluminum housing.

Figure 2-1. Clean Air Technologies REMOTE On-Board Emissions Monitor

The REMOTE OEM is installed in the passenger seat of the vehicle and connects to the vehicle
in three locations. The cigarette lighter provides the power in the majority of installations
(auxiliary battery optional), the OBD port under the dashboard provides the engine data stream,
and the sample exhaust probe is inserted into the tailpipe.



-------
Chapter 3
Test Design and Procedures

3.1 Introduction

This verification test was conducted according to procedures specified in the AMS Center Test/QA
Plan for Verification of On-Board Vehicle Emission Monitors.(1) The purpose of the test was to
evaluate the performance of the Clean Air Technologies REMOTE OEM under realistic operating
conditions.

The REMOTE OEM was tested in multiple vehicles to assess its overall accuracy (bias and
precision) relative to emission measurements made by standard emission test equipment with a
chassis dynamometer. The REMOTE OEM was also operated in on-road driving, to observe its
performance under real-world conditions. Reliability and ease of use also were assessed.

3.2 Test Design

The verification test used the facilities of Automotive Testing Laboratory (ATL) in East Liberty,
Ohio. The three test vehicles, all automatic-transmission, multi-port fuel-injection, gasoline-powered
rental cars obtained by the testing facility, were used for both chassis dynamometer testing and
road testing:

■	Chevrolet Cavalier (1998, 2.2 L, 4 cylinder, 22,697 miles)

■	Chevrolet Tahoe (1997, 5.7 L, 8 cylinder, 63,857 miles)

■	Ford Taurus (1998, 3.0 L, 6 cylinder, 33,981 miles).

To establish intra-method precision (i.e., unit-to-unit relative error), duplicate REMOTE OEMs
were operated side-by-side throughout all portions of the verification test.

In the first phase of the verification test, the duplicate REMOTE OEMs were operated on a
vehicle running on a chassis dynamometer, and vehicle emissions were monitored by the
REMOTE OEMs being verified and by standard emission testing equipment. Three dynamometer
test runs were performed with each of the three vehicles on each of two test cycles: the Federal
Test Procedure (FTP)(2) and the US06(2) cycle (Table 3-1). The FTP cycle consists of three
segments, in each of which an integrated sample is collected in a gas sampling bag. The overall
emissions from the cycle are based on all three bags. For the US06 cycle, one bag sample is
collected over the duration of the cycle. These bag samples were analyzed using the emission
testing equipment in place at ATL. The reference method emission results were compared with the



-------
REMOTE OEM results on a second-by-second basis, on a test cycle basis for both the FTP and
US06 cycles, and on a bag-by-bag basis for the FTP cycles.

Table 3-1. Summary of Chassis Dynamometer Test Cycles

Test cycle     Cavalier   Tahoe   Taurus   Total
FTP(a)            3         3        3       9
US06              3         3        3       9
Total             6         6        6      18

(a) Each FTP test cycle produced three bags, so 27 observations were obtained for the bag-level comparisons.

Three replicates of each test cycle were conducted with each vehicle to provide information on test
cycle-to-test cycle repeatability and to test whether interactions between vehicle type and test cycle have an impact on observed
bias and precision. For example, levels of bias and/or precision may differ from vehicle to vehicle
only during the FTP cycle, or one vehicle type may show consistent bias and precision during both
test cycles, while the other two do not.

Following the chassis dynamometer test cycles summarized in Table 3-1, four additional US06
cycles were performed on the Cavalier. First, a single US06 cycle was performed with the
Cavalier accessories (air conditioner, radio, lights, etc.) off, at each of three different temperatures
(i.e., 30°F, 75°F, and 100°F). The fourth US06 cycle was performed at 100°F with the Cavalier's
air conditioner operating at maximum capacity, to assess whether using vehicle accessories
influences the performance of the REMOTE OEM.

For all of the test cycles, vehicle emissions were measured by the reference methods described in
Section 3.3. These measurements were used to establish bias of the REMOTE OEM relative to the
reference data. During each test cycle, vehicle emissions were monitored in real time by the
reference methods and by the duplicate REMOTE OEMs for HC, CO, NOx, and CO2. Bias and
REMOTE OEM precision were determined independently for HC, CO, NOx, and CO2.

In the second phase of the verification test, the duplicate REMOTE OEMs were installed in one of
the test vehicles, and the vehicle was then driven over two routes for approximately 15 minutes
each. The two routes consisted of one route that was predominantly stop-and-go traffic, and one
that was predominantly sustained high-speed traffic. While the vehicle was driven over these two
routes, second-by-second data were collected by the duplicate REMOTE OEMs. Results from the
duplicate REMOTE OEMs were compared graphically to assess the unit-to-unit reproducibility of
the OEM in on-road driving. The same on-road procedure was conducted using each of the three
test vehicles.



-------
3.3 Reference Methods

During the verification test, the following methods were implemented through the emission test
equipment in the ATL facilities to measure the concentrations of HC, CO, NOx, and CO2 in
vehicle emissions:

■	HC—flame ionization detector (FID)

■	CO—non-dispersive infrared spectroscopy (NDIR)

■	NOx—chemiluminescence (CL) analyzer

■	CO2—NDIR.

These methods are described in 40 CFR Part 86.(2) These methods served as the basis for
evaluating the bias of the REMOTE OEM. These analyses were performed both in real time and
on collected bag samples.

3.4 OEM Installation

The REMOTE OEMs were installed by a vendor representative, who ensured that each
REMOTE OEM was calibrated and operating properly before testing began each day. The
duplicate REMOTE OEMs were installed with appropriate plumbing to split the exhaust stream
for analysis by both REMOTE OEMs. A leak check was performed before road testing and
before each series of dynamometer runs to ensure the integrity of the exhaust sampling
assembly. During the chassis dynamometer cycles, the vehicle battery was used to power one of
the two REMOTE OEMs, and a secondary supply (independent of the vehicle battery) was used
to power the other REMOTE OEM. During the on-road cycles, the duplicate OEMs were
powered by the vehicle battery.

The installation activities (including on-site calibration, repairs, etc.) were documented by
Battelle staff. Observations regarding installation time and simplicity, ease of use, practicality,
passenger safety, etc., were based on the installation of a single unit.

3.5 Test Schedule

Testing was conducted between May 7 and May 10, 2001. Preparation of this verification report
did not begin until early 2003 due to delays by the vendor. Chassis dynamometer test cycles
were performed according to the schedule shown in Table 3-2 and were conducted with the
vehicle accessories off, except where noted. Test cycle conditions were documented by the test
facility in accordance with 40 CFR Part 86.(2) On-road tests were performed at the end of each
testing day.

Because the test was not designed to determine emission rates for the test vehicles, strict
adherence to the soak and preconditioning procedures described in 40 CFR Part 86(2) was not
necessary. However, conditions were consistent for replicates of each test cycle. After the vehicle
soak (12 to 36 hours), the test vehicle was placed on the dynamometer and prepared for testing.



-------
Table 3-2. Schedule for Chassis Dynamometer Test Cycles

May 7, 2001        May 8, 2001        May 9, 2001        May 10, 2001
Cavalier - FTP     Tahoe - FTP        Tahoe - FTP        Cavalier - US06 @ 30°F
Cavalier - US06    Tahoe - US06       Tahoe - US06       Cavalier - US06 @ 75°F
Tahoe - FTP        Taurus - FTP       Taurus - FTP       Cavalier - US06 @ 100°F
Tahoe - US06       Taurus - US06      Taurus - US06      Cavalier - US06 @ 100°F w/AC
Taurus - FTP       Cavalier - FTP     Cavalier - FTP
Taurus - US06      Cavalier - US06    Cavalier - US06

Testing began with performance of an FTP cycle, followed within 10 minutes by a US06 cycle.
Three FTP and three US06 cycles were thus performed at room temperature (75°F) alternately
on the three test vehicles on each of three test days. On the fourth day of testing, a series of three
US06 cycles was performed, one at each of the following temperatures: 30°F, 75°F,
and 100°F. These test cycles were conducted using the Cavalier, which had mid-range emissions
among the three test vehicles as established by previous testing by the test facility. After this
sequence of temperature tests, an additional US06 cycle was performed at 100°F with the
Cavalier's air conditioner operating at maximum capacity. Figure 3-1 shows the Chevy Tahoe in
the dynamometer cell during one of the test cycles. As shown in this figure, the two REMOTE
OEMs were placed outside the vehicles in the chassis dynamometer cell for all test cycles.


-------
For each test cycle, the exhaust emissions and engine activity data were monitored by both the
reference emission test equipment and the duplicate REMOTE OEMs. The test facility recorded
data on HC, CO, NOx, and CO2 emissions at the test, bag, and second-by-second level. Back-
ground concentrations of the target emissions were not measured. The second-by-second
reference values were integrated over the periods of bag collection and compared with the
corresponding bag values to assess agreement for the reference measurements of HC, CO, NOx,
and CO2. Results of these comparisons are summarized in Section 4.1.

3.6 On-Road Testing

The three test vehicles used in the chassis dynamometer test cycles were driven on two separate
routes over public roads while the duplicate REMOTE OEMs recorded second-by-second data
for HC, CO, NOx, and CO2. Engine data were recorded by either of the REMOTE OEMs being
tested. The vehicles began the on-road testing with a full tank of suitable, locally available,
regular unleaded (87 octane) gasoline and completed the two driving routes in succession (i.e.,
on the same trip). The routes involved

■	Approximately 15 minutes of stop-and-go traffic through a central business district

■	Approximately 15 minutes of sustained high-speed driving on a freeway.

Test routes were consistent from vehicle to vehicle. An effort was made to conduct on-road
testing under similar driving conditions (i.e., time of day, weather conditions). Figures 3-2 and
3-3 show external and internal views of the Chevy Tahoe before on-road testing.

Figure 3-3. View of the Duplicate
REMOTE OEMs in the Chevy Tahoe
Before On-Road Testing

Figure 3-2. External View of Chevy
Tahoe Before On-Road Testing



-------
Chapter 4
Quality Assurance/Quality Control

Quality assurance/quality control (QA/QC) procedures were performed in accordance with the
quality management plan (QMP) for the AMS Center(3) and the test/QA plan for this verification
test.(1)

4.1	Reference Method Calibrations and Checks

The dynamometer and laboratory instrumentation were calibrated by the ATL according to the
standard operating procedures and schedules in place at the facility. These calibration specifica-
tions met or exceeded those described in 40 CFR Part 86.(2) Documentation of the calibrations
was provided to Battelle by the ATL prior to test initiation.

Calibration verifications of specific instrumentation were performed at the request of Battelle
during the verification test, and the results of the calibration verifications were provided to
Battelle. In all cases the calibration verifications were within the specified tolerances, i.e., 5%
for CO and CO2, 10% for NOx, and 15% for HC.

The second-by-second reference data were averaged over the collection periods of the bag
samples in each test cycle, and the averaged data and bag analysis data were compared for con-
sistency. All such comparisons showed agreement within the requisite criteria of 5% for CO and
CO2, 10% for NOx, and 15% for HC.
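
As an illustration of this consistency check, the sketch below averages second-by-second reference data over a bag collection window and compares the result against the bag analysis value for the same species. It is a minimal sketch only; the function name, data layout, and collection-window arguments are assumptions for illustration and are not taken from the ATL procedures.

    # Minimal sketch of the continuous-vs-bag consistency check described above.
    # Assumed inputs: a dict of {elapsed_time_s: concentration} for one species,
    # the bag analysis result for that species, and the bag collection window.

    TOLERANCES = {"CO": 5.0, "CO2": 5.0, "NOx": 10.0, "HC": 15.0}  # percent criteria

    def check_bag_consistency(second_by_second, bag_value, species, t_start, t_end):
        """Average the continuous data over [t_start, t_end] and compare with the bag result."""
        window = [conc for t, conc in second_by_second.items() if t_start <= t <= t_end]
        integrated_avg = sum(window) / len(window)
        percent_diff = 100.0 * (integrated_avg - bag_value) / bag_value
        return percent_diff, abs(percent_diff) <= TOLERANCES[species]

For example, a CO comparison passes if the averaged continuous data agree with the bag value to within 5%.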

4.2	Audits

4.2.1 Pre-Test Facility Audit

Two weeks prior to verification testing, the Battelle Quality Manager conducted an audit of the
ATL to ensure that it had the equipment necessary to perform the verification test and that a
satisfactory QA/QC program was implemented. The audit included a tour of the dynamometer
facilities and a review of appropriate standard operating procedures and calibration records. The
audit also included observations of ongoing dynamometer testing. There were no adverse
findings as a result of this pre-test audit.



-------
4.2.2 Performance Evaluation Audit

A performance evaluation (PE) audit was conducted to assess the quality of the reference
measurements made in this verification test. This audit addressed only the emissions measure-
ments provided by the reference methods. The audit was performed by analyzing National
Institute of Standards and Technology (NIST)-traceable calibration gas standards that were
independent of those used by the ATL during the testing. The acceptance criteria for the results
of this audit were identical to those already in place at the ATL for calibration verification. The
results of the performance audit are shown in Table 4-1, which indicates that all reference
method readings were within 3% of those expected based on the PE standard concentrations.

Table 4-1. Results of Performance Audit

Audited Parameter   Acceptable Error   Actual Error   Passed Audit
HC                  15%                 2.8%          yes
CO                   5%                -1.6%          yes
NOx                 10%                 2.9%          yes
CO2                  5%                 1.6%          yes

4.2.3	Technical Systems Audit

The Battelle Quality Manager conducted a technical systems audit (TSA) on May 8 and 9, 2001,
to ensure that the verification test was performed in accordance with the test/QA plan,(1)
reference methods, standard operating procedures used by the ATL, and the AMS Center
QMP.(3) As part of the audit, the Battelle Quality Manager reviewed the reference methods used
and compared actual test procedures to those specified in the test/QA plan, and reviewed data
acquisition and handling procedures. Observations and findings from this audit were docu-
mented and submitted to the Battelle Verification Test Coordinator for response. No findings
were documented that required any corrective action. The records concerning the TSA are
permanently stored with the Battelle Quality Manager.

4.2.4	Audit of Data Quality

At least 10% of the data acquired during the verification test were audited. Battelle's Quality
Manager traced the data from the initial acquisition, through reduction and statistical analysis, to
final reporting, to ensure the integrity of the reported results. All calculations performed on the
data undergoing the audit were checked.



-------
4.3 QA/QC Reporting

Each assessment and audit was documented in accordance with the QMP in effect for the ETV
AMS Center at the time of testing.(3) No adverse findings or potential problems were found. The
results of the TSA were sent to the EPA.

4.4 Data Review

Records generated in the verification test received a one-over-one review within two weeks of
generation before these records were used to calculate, evaluate, or report verification results.
The review was performed by a Battelle technical staff member involved in the verification test,
but not the staff member that originally generated the record. The person performing the review
added his/her initials and the date to a hard copy of the record being reviewed.



-------
Chapter 5
Statistical Methods

The statistical methods presented in this chapter were used to verify the performance factors
noted in Section 3.1.

5.1 Bias

The bias of the REMOTE OEM was assessed for each emitted species at the test level based on
the percent difference between the average concentration measurements or the grams/mile (g/mi)
emission rates from the REMOTE OEM relative to the reference method. For each individual
dynamometer run, the percent difference, d_i, between the REMOTE OEM and the reference
measurement was calculated as

$$ d_i = \frac{Y_i - X_i}{X_i} \times 100 \qquad (1) $$

where Y_i represents the test level results from the REMOTE OEM and X_i represents the test level
results of the reference method for a given emitted species. The average, D, and standard
deviation, s, of these individual bias results were calculated from

$$ D = \frac{\sum_{i=1}^{n} d_i}{n} \qquad (2) $$

and

$$ s = \sqrt{\frac{1}{n-1} \sum_{i=1}^{n} \left( d_i - D \right)^2} \qquad (3) $$

where n is the total number of chassis dynamometer test cycles. The standard deviation and
average difference were used to calculate the upper (UL) and lower (LL) 95% confidence limits
for the bias of each REMOTE OEM according to

$$ UL = D + t_{0.975} \left( \frac{s}{\sqrt{n}} \right) \qquad (4) $$

and

$$ LL = D - t_{0.975} \left( \frac{s}{\sqrt{n}} \right) \qquad (5) $$

where t_{0.975} is the 0.975 quantile of the Student's t distribution with n-1 degrees of freedom. Bias
was calculated independently for each of the duplicate REMOTE OEMs and each emitted
species. Additionally, bias was calculated independently for each vehicle and for each test cycle
(i.e., FTP, US06). Note that, as the absolute measurement X_i becomes small, the percent bias
can become large, since X_i is in the denominator.
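
A short numerical sketch of Equations (1) through (5) is given below, assuming arrays of matched test-level results from one REMOTE OEM and the reference method for a single emitted species. The variable names and the use of NumPy/SciPy are illustrative assumptions; the report does not state how the calculations were implemented.

    # Minimal sketch of the bias calculation in Equations (1)-(5); not the report's
    # actual implementation. y = REMOTE OEM test-level results, x = reference results.
    import numpy as np
    from scipy import stats

    def percent_bias_with_limits(y, x, alpha=0.05):
        """Return the mean percent bias D and its 95% confidence limits (LL, UL)."""
        y = np.asarray(y, dtype=float)
        x = np.asarray(x, dtype=float)
        d = 100.0 * (y - x) / x              # Equation (1): per-run percent difference
        n = d.size
        D = d.mean()                         # Equation (2): average difference
        s = d.std(ddof=1)                    # Equation (3): sample standard deviation
        t = stats.t.ppf(1.0 - alpha / 2.0, n - 1)
        half_width = t * s / np.sqrt(n)      # Equations (4) and (5)
        return D, D - half_width, D + half_width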

5.2 Precision

Unit-to-unit precision was calculated based on the percent difference in the readings of the
duplicate REMOTE OEMs relative to the mean of the readings, as shown below:

$$ d_i = \frac{Y_i - Y_i'}{\left( Y_i + Y_i' \right)/2} \times 100 \qquad (6) $$

where Y_i and Y_i' are the test level results for a given emitted species from the two duplicate
REMOTE OEMs for each test cycle i. The coefficient of variation, CV_i, for each dynamometer
test cycle and vehicle was calculated according to Equation (7).

$$ CV_i = \frac{|d_i|}{\sqrt{2}} \qquad (7) $$

The individual coefficients of variation for all test cycles and vehicles were pooled according to
Equation (8) to determine the overall precision of the REMOTE OEM.

$$ CV = \frac{1}{n} \sum_{i=1}^{n} CV_i \qquad (8) $$

The UL and LL 90% confidence limits for the REMOTE OEM's CV are given by

$$ UL = CV \sqrt{\frac{n}{\chi^2_{0.05,n}}} \qquad (9) $$

and

$$ LL = CV \sqrt{\frac{n}{\chi^2_{0.95,n}}} \qquad (10) $$

where n is the number of degrees of freedom, and \chi^2_{0.95,n} and \chi^2_{0.05,n} are the 0.95 and 0.05
quantiles, respectively, of the \chi^2 distribution with n degrees of freedom. Precision was assessed
independently for each emitted species, as well as for each vehicle and each test cycle.

Supplemental comparisons were made at the second-by-second level to determine the
instantaneous unit-to-unit reproducibility of the duplicate REMOTE OEMs. As with the test
level results, these comparisons were made based on a percent difference calculation.
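
The unit-to-unit precision calculation of Equations (6) through (10) can be sketched in the same way, assuming arrays of matched test-level results from the two duplicate OEMs for one species. As above, the names and libraries are illustrative assumptions, not the report's implementation.

    # Minimal sketch of Equations (6)-(10); y1 and y2 are matched test-level results
    # from the duplicate REMOTE OEMs for a single emitted species.
    import numpy as np
    from scipy import stats

    def pooled_cv_with_limits(y1, y2):
        """Return the pooled percent CV and its 90% confidence limits (LL, UL)."""
        y1 = np.asarray(y1, dtype=float)
        y2 = np.asarray(y2, dtype=float)
        d = 100.0 * (y1 - y2) / ((y1 + y2) / 2.0)       # Equation (6)
        cv_i = np.abs(d) / np.sqrt(2.0)                 # Equation (7)
        n = cv_i.size                                   # degrees of freedom, per the text
        cv = cv_i.mean()                                # Equation (8): pooled CV
        ul = cv * np.sqrt(n / stats.chi2.ppf(0.05, n))  # Equation (9)
        ll = cv * np.sqrt(n / stats.chi2.ppf(0.95, n))  # Equation (10)
        return cv, ll, ul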

5.3 Other Factors

Second-by-second data from the OBD port and the REMOTE OEM were compared graphically
for the Ford Taurus to illustrate temporal correlations between the vehicle operational
parameters and the measured concentrations of the emitted species in the vehicle exhaust.
Likewise, second-by-second data from the reference monitors were compared visually against
those from the REMOTE OEM to illustrate temporal correlations. No statistical evaluations were
made of these second-by-second comparisons because of differences in the lag times and
response times between the reference monitors and the REMOTE OEM. For the on-road testing,
second-by-second comparisons were made between the results of the duplicate REMOTE
OEMs. Finally, a linear regression comparison was made between the REMOTE OEM and the
reference measurements.
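
The linear regression comparison mentioned above can be sketched as follows, assuming arrays of matched bag-level results from the reference method and one REMOTE OEM for a single species. The use of scipy.stats.linregress is an assumption for illustration; the report does not identify the regression software used.

    # Minimal sketch of the regression comparison: slope, intercept, and r^2 of
    # OEM results (y) regressed against reference results (x).
    import numpy as np
    from scipy import stats

    def compare_by_regression(x, y):
        """Return (slope, intercept, r_squared) for y regressed on x."""
        result = stats.linregress(np.asarray(x, dtype=float), np.asarray(y, dtype=float))
        return result.slope, result.intercept, result.rvalue ** 2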



-------
Chapter 6
Test Results

Two types of data were recorded during this verification test. The first type consisted of second-
by-second data that recorded HC, CO, NOx, and CO2 emission levels. The second type of data
was the integrated sample results from collected bag concentrations during the FTP and US06
cycles. In this case, the emissions from the vehicle were integrated by collection over a time
period of several minutes, and the concentrations of HC, CO, NOx, and CO2 in the collected bag
samples were measured at the end of the collection period with the reference monitors. The
REMOTE OEM did not measure this concentration directly, but rather performed a numeric
integration over the same period to calculate a corresponding bag or integrated sample concen-
tration value. Figures 6-1a-d show the comparisons of the test cycle sample data for the
reference monitors and the corresponding integrated data from the two REMOTE OEMs (A and
B) for each of the test cycles in the order they were conducted. Figures 6-1a-d show results for
HC, CO, NOx, and CO2, respectively. All data in these figures are in terms of grams-per-mile
(g/mi) emissions of the indicated species.
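
As a simple illustration of such a numeric integration (a hedged sketch only; the OEM's internal algorithm is not described in this report), a bag-equivalent g/mi value can be formed by accumulating second-by-second mass emission rates over the collection period and dividing by the distance driven:

    # Illustrative numeric integration of second-by-second mass emission rates
    # (grams per second) into a g/mi value over one bag collection period.
    # This is a sketch of the general idea, not the vendor's algorithm.

    def integrate_to_g_per_mile(mass_rates_g_per_s, distance_mi, dt_s=1.0):
        """Sum g/s readings over the period (dt_s spacing) and normalize by miles driven."""
        total_grams = sum(rate * dt_s for rate in mass_rates_g_per_s)
        return total_grams / distance_mi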

Figure 6-1a. Integrated Sample Comparison for Hydrocarbons


-------
Figure 6-1b. Integrated Sample Comparison for Carbon Monoxide

Figure 6-1c. Integrated Sample Comparison for Nitrogen Oxides


-------


Figure 6-1d. Integrated Sample Comparison for Carbon Dioxide

It can be seen from these figures that the results from the duplicate REMOTE OEMs generally
agreed with one another and also showed agreement with the reference results in most cases.

During each FTP cycle, three individual bags were collected in succession. The bag-by-bag FTP
sample data from the reference measurements and the corresponding integrated data from the
two REMOTE OEMs (A and B) are shown in Figures 6-2a-d, in the order they were collected.
Figures 6-2a-d show results for HC, CO, NOx, and CO2, respectively. These figures again
illustrate the general agreement between the duplicate REMOTE OEMs, and the comparison
with reference results from the individual bags collected during different FTP cycles.

6.1 Bias

Bias results were calculated according to Equations (1-5) in Section 5.1. These bias results
express the average percent difference between the REMOTE OEM results and the reference
results. This calculation was performed for the integrated sample data for the entire test, and also
separately for each vehicle and each test cycle (FTP and US06). Table 6-1 shows the results of
these calculations. For the overall verification test, the smallest relative bias is found for OEM B
measuring NOx, at 1.96 ± 3.90%, and the largest is found for OEM A while measuring HC, at
34.8 ± 9.56%.


-------
Figure 6-2a. FTP Individual Bag Comparison for Hydrocarbons

Figure 6-2b. FTP Individual Bag Comparison for Carbon Monoxide
-------
Figure 6-2c. FTP Individual Bag Comparison for Nitrogen Oxides

Figure 6-2d. FTP Individual Bag Comparison for Carbon Dioxide
-------
Table 6-1. Percent Bias Values and Confidence Intervals for REMOTE OEM

                             OEM A                              OEM B
Bias Pooling       HC      CO      NOx     CO2        HC      CO      NOx     CO2
Total Test
  % Bias           34.8   -7.95   -11.2    16.5       21.5   -2.07    1.96    22.2
  ±                 9.56   1.80     3.35    2.50       4.82   2.84    3.90     3.60
Cavalier
  % Bias           63.9   -0.14    -1.93   29.5       24.9    5.40   16.5     44.7
  ±                15.8    1.19     5.33    3.39       3.56   2.98    6.01     4.11
Tahoe
  % Bias           11.2  -12.9    -10.3     8.09      20.1   -7.79   -3.04     9.53
  ±                 2.22   2.09     1.46    0.73       4.97   3.12    1.64     0.70
Taurus
  % Bias           29.4  -10.8    -21.3    11.8       19.4   -3.80   -7.58    12.3
  ±                 2.42   1.05     1.42    0.39       6.12   1.94    1.32     0.34
FTP cycles
  % Bias           53.9   -7.47   -18.1    19.2       20.6   -1.56   -4.57    25.2
  ±                12.8    1.40     2.05    3.29       3.48   2.63    1.98     3.78
US06 cycles
  % Bias           15.8   -8.42    -4.23   13.7       22.4   -2.57    8.48    19.2
  ±                 2.62   2.22     4.07    1.39       6.11   3.19    5.06     3.57

Note: The rows labeled Cavalier, Tahoe, and Taurus show results by vehicle.

Considering the bias results organized by test vehicle and test cycle, in general, both REMOTE
OEMs exhibited smaller percent biases while measuring NOx and CO, ranging between -0.14
and 21.3% for OEM A and between -1.56 and 16.5% for OEM B. Larger percent biases were
found while measuring HC and CO2, ranging between 8.09 and 63.9% for OEM A and between
9.53 and 44.7% for OEM B. There was no consistent trend in OEM bias relative to the identity
of the test vehicle or the test cycle.

6.2 Unit-to-Unit Precision

To calculate the unit-to-unit precision of the OEM, a CV was determined for each dynamometer
test cycle by vehicle and by test cycle. Table 6-2 shows the results of these calculations.

Unit-to-unit precision was measured by the pooled CVs of results from the duplicate OEMs. In
nearly all cases, the CVs for all the emitted species from all the vehicles were less than 5%. The
largest CV was reported for HC during the Cavalier test, at 8.97 ± 11.6% over a tested range of
0.05 to 0.47 (g/mi) (Figure 6-1a). The smallest CV was seen for CO during the Cavalier test, at
1.11 ± 1.43% over a tested range of 0.70 to 12.0 (g/mi) (Figure 6-1b).


-------
Table 6-2. Unit-to-Unit Precision Results and Confidence Intervals for REMOTE OEM

Precision Pooling    HC      CO      NOx     CO2
Total Test
  % CV               6.04    2.54    4.03    3.17
  ±                  2.66    1.12    1.78    1.40
Cavalier
  % CV               8.97    1.11    4.73    4.89
  ±                 11.6     1.43    6.10    6.30
Taurus
  % CV               4.77    2.32    4.59    1.99
  ±                  6.15    2.99    5.92    2.57
Tahoe
  % CV               2.50    3.57    2.29    1.50
  ±                  3.23    4.60    2.96    1.93
FTP Cycles
  % CV               7.90    2.05    4.4     3.71
  ±                  6.44    1.67    3.58    3.02
US06 Cycles
  % CV               4.77    2.32    4.59    1.99
  ±                  2.65    2.40    2.95    2.05

6.3 Other Factors

6.3.1 Reliability and Ease of Use

All data were collected as expected, and the REMOTE OEMs had no downtime during the tests.
The REMOTE OEMs were installed in the vehicles for on-road testing with no difficulty.
Installation time for a single unit was between 5 and 15 minutes for the on-road portion of the
verification test. No repairs of either of the two OEMs were required during the verification test.
Operation at 30°F and at 100°F had no adverse impact on OEM reliability, and operation over
this range showed no consistent effect of temperature on OEM bias for any of the measured
species.



-------
6.3.2	Other Unit-to-Reference Method Comparisons

Figures 6-3a-d show the second-by-second data from the reference method and REMOTE OEMs
in the May 9, 2001, FTP cycle with the Ford Taurus. The data from the Ford Taurus are presented
because of its mid-range engine size. These data provide a graphical representation of the speed
of response of, and the agreement between, the two OEMs and the reference methods. No statistical
calculations were performed using these data. However, the data illustrate the temporal correla-
tions between the REMOTE OEMs and the reference methods. These figures show general
agreement between the reference monitors and the two REMOTE OEMs on the timing and level
of vehicle emissions. There was some time delay between the reference monitors and the
REMOTE OEMs, and some difference in the height of transient peaks, due to the different lag
times in sampling by the reference monitors. Data from on-road testing with the two OEM units
showed very similar agreement to the FTP second-by-second data shown in Figures 6-3a-d.

Linear regression comparisons of the REMOTE OEM results with FTP bag results are presented
graphically in Figures 6-4a-d for each of the chassis dynamometer test cycles. These figures are
based on the bag sample data presented above in Figures 6-2a-d, i.e., n=27 for each linear
regression shown.

The linear regression results show that, except for the OEM A HC results (r² of 0.54)
(Figure 6-4a), both OEM A and OEM B had coefficients of determination greater than 0.86 for
all four emitted species measured. The slopes of the linear regressions for OEM A and OEM B
relative to FTP bag results were between 0.97 and 1.03 for CO2 over a tested range of 300 to
620 (g/mi). The slopes were between 0.95 and 1.05 for CO over a tested range of 0 to 13 (g/mi)
and between 0.92 and 1.03 for NOx over a tested range of 0 to 1.4 (g/mi). However, the slopes of
the linear regressions for OEM A and OEM B were between 0.62 and 0.79 for HC over a tested
range of 0 to 1 (g/mi). The HC results may reflect the different analytical techniques used
(i.e., infrared absorption in the OEM measurements, FID in the reference measurements).

6.3.3	Temperature Effect

The results from the tests conducted at different cabin temperatures are shown in Table 6-3 as
percent bias relative to the reference measurements. The results from these tests indicate that,
while sometimes large differences occurred between the reference and OEM measurements,
these differences were not consistently greater at elevated (100°F) or reduced (30°F)
temperatures, relative to the 75°F condition. The largest bias values (88.6 to 96.9%) occurred for
NOx with both OEM units at the 100°F condition. It is noteworthy that these biases had an
opposite sign in the two tests at 100°F, i.e., -96.9% in the third test and +88.6% in the fourth test
(Table 6-3). The OEM experienced no observable malfunctions due to the changing testing
temperatures.



-------
Figure 6-3a. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for
Hydrocarbons During an FTP Cycle


-------

Figure 6-3b. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for
Carbon Monoxide During an FTP Cycle


-------
Figure 6-3c. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for
Nitrogen Oxides During an FTP Cycle


-------
Figure 6-3d. Second-by-Second Data from Duplicate REMOTE OEMs and the Reference Method for
Carbon Dioxide During an FTP Cycle


-------
Figure 6-4a. Linear Regression Comparison Between Reference Method and
REMOTE OEM for Hydrocarbons

Figure 6-4b. Linear Regression Comparison Between Reference Method and
REMOTE OEM for Carbon Monoxide



-------
Figure 6-4c. Linear Regression Comparison Between Reference Method and
REMOTE OEM for Nitrogen Oxides


Figure 6-4d. Linear Regression Comparison Between Reference Method and
REMOTE OEM for Carbon Dioxide



-------
Table 6-3. Temperature Effect Results (US06 Cycles)

% Bias
                                    OEM A                             OEM B
Condition                  HC     CO     NOx    CO2        HC     CO     NOx    CO2
30°F, Accessories Off      47.1   19.9   -7.46  -18.6     -16.4    1.49  -7.46  -19.2
75°F, Accessories Off      22.4    7.97  -0.23  -22.5     -32.0   -4.27 -50.3   -23.3
100°F, Accessories Off    -21.9    4.64 -96.9   -19.4     -60.4   -7.64 -96.9   -19.4
100°F, AC Max, Hot        -29.3  -15.7   88.6   -16.0     -70.1  -30.6   88.6   -16.0


-------
Chapter 7
Performance Summary

Duplicate REMOTE OEMs were tested for bias and unit-to-unit precision in FTP and US06
dynamometer test cycles with three vehicles. Considering all tests with all vehicles, OEM B had
the smallest relative bias, for NOx, at 1.96 ± 3.90%, and OEM A had the largest, for HC, at 34.8
± 9.56%. Considering the test results organized by test vehicle and test cycle, both OEMs A and
B exhibited smaller percent biases for NOx and CO, ranging between -0.14 and 21.3% for OEM
A and between 1.56 and 16.5% for OEM B. Both OEMs A and B exhibited larger percent biases
for HC and CO2, ranging between 8.09 and 63.9% for OEM A and between 9.53 and 44.7% for
OEM B.

Unit-to-unit precision was measured by the pooled CVs of results from the duplicate OEMs. In
nearly all cases, the CVs of the duplicate OEMs for all the emitted species with all the vehicles
were less than 5%. The largest CV was reported for HC during the Cavalier test, at 8.97 ± 11.6%
over a tested range of 0.05 to 0.47 (g/mi). The smallest CV was seen for CO during the Cavalier
test, at 1.11 ± 1.43% over a tested range of 0.70 to 12.0 (g/mi).

In assessing reliability and ease of use, all data were collected as expected, and the monitors had
no downtime during the tests. The REMOTE OEMs were installed in the vehicles for on-road
testing with no difficulty. Operation at 30°F and at 100°F had no adverse impact on OEM
reliability, and operation over this range did not show a consistent effect of temperature on OEM
bias for any of the measured species.

The second-by-second data for the reference method and the REMOTE OEMs illustrate close
agreement. A time delay between the reference monitors and the REMOTE OEMs was due to
the different lag times in sampling by the reference monitors.

The linear regression of averaged OEM results against FTP bag results shows that, except for the
OEM A HC results (r² of 0.54), both OEM A and OEM B had coefficients of determination
greater than 0.86 for all four emitted species. The slopes of the linear regressions for OEM A and
OEM B were between 0.97 and 1.03 for CO2 over a tested range of 300 to 620 (g/mi). The slopes
were between 0.95 and 1.05 for CO over a tested range of 0 to 13 (g/mi) and between 0.92 and
1.03 for NOx over a tested range of 0 to 1.4 (g/mi). However, the slopes of the linear regressions
for OEM A and OEM B were between 0.62 and 0.79 for HC over a tested range of 0 to 1 (g/mi).
The HC results may reflect the different analytical techniques used (i.e., infrared
absorption in the OEM measurements, FID in the reference measurements).



-------
Chapter 8
References

1.	Test/QA Plan for Verification of On-Board Vehicle Emissions Monitors, Battelle,
Columbus, Ohio, April 26, 2001.

2.	U.S. Environmental Protection Agency, "Control of Air Pollution from New and In-Use
Motor Vehicles and New and In-Use Motor Vehicle Engines: Certification and Test
Procedures," 40 CFR Part 86.

3.	Quality Management Plan (QMP) for the ETV Advanced Monitoring Systems Pilot,
Version 2.0, U.S. EPA Environmental Technology Verification Program, Battelle,
Columbus, Ohio, October 2000.



-------