Report No. SR02-02-02
Hands-On Audit Training for Mass- and
Concentration-Based IM Programs
prepared for:
U.S. Environmental Protection Agency
Certification and Compliance Division
Under Contract No. 68-C7-0051
Work Assignment No. 3-05
February 2002
prepared by:
Michael J. St. Denis
Richard W. Joy
Garrett D. Torgerson
Sierra Research, Inc.
1801 J Street
Sacramento, CA 95814
(916) 444-6666
DISCLAIMER NO. 1
The mention of commercial products or equipment vendors is for
informational purposes only. Mention of these products or vendors should
not be considered an endorsement by Sierra Research or the United States
Environmental Protection Agency.
DISCLAIMER NO. 2
Although the information described in this report has been funded wholly
or in part by the United States Environmental Protection Agency under
Contract No. 68-C7-0051, it has not been subjected to the Agency's peer
and administrative review and is being released for information purposes
only. It therefore may not necessarily reflect the views of the Agency and
no official endorsement should be inferred.
Table of Contents

1. INTRODUCTION
   Background
   Organization of the Report
2. INSIGHTS ON EQUIPMENT AUDIT FAILURES
   Recommendations
   QA/QC Elements
   Auditing Particular Programs
   Issues of Concern
   Sources of Audit Equipment
   Approximate Cost for Audit Equipment
3. TEST SYSTEM ACCEPTANCE TESTING
   Initial Acceptance and Beta Testing
   ATP Development and Performance
   Beta Testing
   Required Resources and Time
   Testing of Non-BAR97 Equipment
4. AUDITOR TRAINING COURSE
   Classroom Training Description
APPENDIX A. EXAMPLE ACCEPTANCE TESTING PROCEDURE
APPENDIX B. TRAINING COURSE PRESENTATION
1. INTRODUCTION
Under Work Assignment 3-05, Sierra has developed a training course for EPA that can be
used to train auditors of decentralized concentration- and mass-based vehicle inspection
test systems. The training materials (which are included in Appendix B) focus on quality
control issues of all types (acceptance testing, calibration, and audit failures) and include
Sierra's experience with the most common types of failures. This report is a supplement
to the training materials and includes documentation of common equipment issues,
proposed EPA and state actions to improve equipment quality control, an example
acceptance testing procedure (provided in Appendix A), and issues related to auditing test
systems. Also included are Sierra's insights into quality control issues based on
observations from working with several states.
Background
Under the Clean Air Act Amendments of 1990, metropolitan areas with the most serious
air quality problems are required to implement so-called "enhanced" vehicle emissions
inspection and maintenance (I/M) programs. One element of an enhanced program is a
test procedure that is more effective than the simple idle tests used in "basic" I/M
programs. Two different test procedures for exhaust emissions testing in enhanced
programs have been approved by EPA: the "IM240" test and the "Acceleration
Simulation Mode" (ASM) test. The IM240 test and several shortened versions of it (e.g.,
IM147) involve "transient" (i.e., stop-and-go) testing. In contrast, the ASM procedure
involves only steady-state operation, but with a load on the vehicle that represents an
acceleration mode. Both of these procedures have been shown to be capable of
separating vehicles with excessive exhaust emissions from other vehicles; however, the
accuracy of the emissions results depends on ensuring that the test equipment is
accurately calibrated and operating properly.
A number of enhanced I/M programs are implementing one or both of the two common
ASM modes (i.e., the 2525 and the 5015). Most of these programs have implemented
ASM testing on a "decentralized" basis in which existing vehicle repair shops are
licensed to perform such testing. Under this type of system, licensed shops must
purchase and use test systems produced by one or more equipment manufacturers, based
on equipment specifications developed by the program. All specifications for ASM
testing programs are in turn based on equipment specifications developed by the
California Bureau of Automotive Repair (BAR)* and also published by US EPA.** In
addition to spelling out the test procedures to be used in conducting the ASM tests, the
US EPA and BAR97 specifications include required certification, calibration, and audit
procedures. Specifications developed by other states are modified versions of these
procedures.
Both ASM and transient testing programs include new components that were not
previously required for idle testing, including the measurement of NOx, the measurement
of emissions control system status and RPM via the OBDII port on a vehicle, and the use
of dynamometers. Transient programs require a more complicated system for exhaust
measurement than ASM systems, due to the need to measure exhaust flow rate to
calculate emission rates. Under EPA's IM240 procedures, exhaust is diluted to a constant
volume using a constant volume sampler (CVS). Some decentralized transient testing
programs specify "VMAS" flow measurement equipment, which uses a proprietary
technique to measure vehicle exhaust flow at a lower cost than a CVS system. The
availability of this equipment, coupled with the use of BAR97 analyzers and
dynamometers (used in a transient test mode), has enabled several states (i.e., New York,
Massachusetts, and Rhode Island) to implement transient mass emissions testing in a
decentralized network. However, although VMAS systems are already operating in the
field, development of well-designed calibration and other quality assurance/quality
control (QA/QC) requirements has lagged behind the implementation of these systems,
leading to concerns regarding their accuracy during in-use service (in part because they
are not specified by BAR and were introduced after the last revision of the federal test
system QC requirements). OBDII interrogation firmware is also being incorporated into
the design of most decentralized test systems due to Federal requirements related to
OBDII I/M checks, but no quality control procedures have been proposed to date.
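The distinction drawn above between concentration- and mass-based testing rests on one relationship: a mass emission rate is the measured concentration multiplied by the total exhaust flow (which the CVS or VMAS supplies) and the gas density. The sketch below is purely illustrative; the density constant, function name, and sample values are assumptions, not program specifications or the proprietary VMAS algorithm.

```python
# Illustrative sketch only: mass emission rate from measured concentration
# and exhaust flow, the relationship that makes flow measurement (CVS or
# VMAS) necessary for mass-based testing. The density constant and sample
# numbers are assumptions, not program data.

CO2_DENSITY_G_PER_L = 1.83  # approximate density of CO2 at ~20 C, 1 atm

def mass_rate_grams_per_sec(concentration_pct: float,
                            exhaust_flow_l_per_sec: float,
                            density_g_per_l: float) -> float:
    """Mass rate = volume fraction x total exhaust flow x gas density."""
    return (concentration_pct / 100.0) * exhaust_flow_l_per_sec * density_g_per_l

# Example: 14% CO2 measured in 50 L/s of total exhaust flow
rate = mass_rate_grams_per_sec(14.0, 50.0, CO2_DENSITY_G_PER_L)
```

A concentration-only analyzer stops at the first factor; the flow term is exactly what a CVS or VMAS unit adds, which is why its calibration matters as much as the gas bench's.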
The proliferation of these new, increasingly sophisticated decentralized emissions test
components means that this equipment is being used more and more by vehicle inspectors
who are less able to understand how calibration and quality control practices may affect
test accuracy. The combination of this factor with the absence of industry-accepted
QA/QC procedures raises significant concerns regarding the overall accuracy of enhanced
decentralized testing. (Some of these systems are being used in the centralized test
environment as well.)
40 CFR 51.363(c) requires that overt audits of the decentralized test systems
be performed twice a year. To help states develop these programs by increasing the
amount of information available on QA/QC procedures for these new sophisticated test
systems, EPA contracted with Sierra in 2000 under Work Assignment 2-05*** to provide
EPA with emissions test equipment audit procedures. These procedures cover all test
* "BAR-97 Emissions Inspection System Specifications," California Bureau of Automotive Repair, May 1996.
** "Acceleration Simulation Mode Test Procedures, Emissions Standards, Quality Control Requirements, and Equipment Specifications," US EPA, EPA-AA-RSPD-IM-96-2, July 1996.
*** "ASM Audit Guidance," US EPA Contract 68-C7-0051, Work Assignment No. 2-05, September 15, 2000.
system components including BAR97 gas analyzers, dynamometers, weather stations,
engine RPM measurement devices, OBDII interrogation firmware, gas cap and vehicle
pressure testers, and VMAS flow measurement devices.
As I/M programs mature, more and more states are beginning to evaluate
their program performance. As part of this evaluation, they are examining the
performance of their test equipment as required by 40 CFR by implementing auditing
procedures such as those developed under Work Assignment 2-05. Therefore, EPA is
being asked more frequently by states to provide technical support to auditing programs
in the field. For this reason, EPA has developed Work Assignment 3-05 to provide EPA
I/M staff with hands-on training in the audit procedures. This report is part of that Work
Assignment in that it provides information on the initial acceptance testing of
decentralized emissions test systems related to QA/QC (an example is provided in
Appendix A) and discusses the most common types of failures for each audit procedure.
In addition, the instructional materials from the classroom portion of the
training are included in Appendix B and should allow EPA staff to train additional EPA
and state program staff.
Organization of the Report
This report is divided into four sections. Following this Section 1 introduction, Section 2
provides insights on equipment audit failures, and costs and sources of auditing
equipment. Section 3 then discusses test system acceptance testing, and Section 4
describes the training materials presented in Appendix B.
2. INSIGHTS ON EQUIPMENT AUDIT FAILURES
This section of the report includes detailed descriptions of issues that are not obvious by
simple examination of the audit procedures and forms. It is meant to provide completely
candid expert opinion on the state of I/M testing today, in particular on ASM testing,
including the use of the VMAS emission measurement equipment. This section also
provides the names of vendors and approximate costs for audit equipment.
The comments and opinions provided by Sierra are based on our significant firsthand
experience in working with states, contractors, and equipment vendors, and our
knowledge gained by frank discussions with a broad spectrum of I/M stakeholders.
Because many of the statements made in this section of the report are qualitative
assessments of particular areas of the I/M test process (quantitative documented findings
from firsthand involvement in I/M test activities cannot be made due to confidentiality
requirements with Sierra's clients), it is possible that groups within EPA, states,
contractors, and equipment vendors could take issue with specific statements that are
deemed to be overly subjective from an individual's perspective. Despite this, Sierra is of
the opinion that these comments represent an accurate assessment of current issues
related to design, checkout, operation, and audit of ASM and VMAS test systems.
This section is a supplement to the audit training course (covered in Section 4 and
Appendix B) and covers:
1. Recommendations for EPA and the states to improve QA/QC;
2. The relationship between routine program audits and other equipment-related
quality assurance and quality control (QA/QC) activities, and how these
activities fit together;
3. Issues that technical staff should be aware of in performing the equipment
audits; and
4. Recommendations on the equipment needed to perform the audits (e.g.,
portable weather station, RPM and OBDII simulators, etc.), possible sources
for this equipment, and approximate costs for each item.
Each of these elements is covered in the following sections.
Recommendations
The following recommendations are divided into two groups. The first involves
recommended actions for EPA to pursue in addressing existing emissions test system-
related concerns. The second group of recommendations is aimed at the state I/M
programs, and actions they should take to improve the quality of their emissions test
program.
EPA Issues -
1. EPA should make readily available the audit guidance created under Work
Assignment 2-05 so that states can use this as a reference for performing
auditing.
2. EPA should develop a reasonable policy and guidance on how it intends to
work with states to give them the time they need to ensure that the test
equipment is operating properly prior to program rollout (proper time for
acceptance testing and beta testing).
3. EPA should become more proactive in judging the technical adequacy of new
test systems and procedures before they are produced and installed in a
program. This would include holding states to a much higher performance
threshold than currently exists for testing and verifying the operation of new
prototype systems prior to allowing their use.
4. EPA needs to establish some type of approval process for new test systems,
similar to but less cumbersome than the High-Tech Test Committee process
used to develop the initial details of EPA's IM240 guidance.
5. EPA should use the above approval process to ensure that new test systems are
well designed and incorporate adequate QA/QC elements.
6. EPA needs to establish a mechanism to ensure that technical, certification, and
audit information about test systems is shared among the states. EPA should
post information about state audit problems on the EPA website or distribute it
to interested states using some other suitable mechanism.
7. EPA should add calibration and quality control procedures for the use of
VMAS to 40 CFR Part 51, Subpart S, Appendix A.
State Issues -
1. States should allow sufficient time in implementing new program designs or
equipment for a comprehensive acceptance and beta testing process designed to
minimize the occurrence of test system bugs after program rollout.
2. States should ensure that standard audit display screens covering all applicable
test system components are included in the specifications for the equipment,
and verify the functionality of all such screens as part of the acceptance testing
process prior to program rollout.
3. States should use the audit procedures and related equipment recommended in
Work Assignment 2-05 to perform sufficient acceptance testing to verify the
basic functionality of emissions test system hardware prior to program rollout.
4. States should incorporate the software verification elements described below
into the acceptance testing process, in order to verify all required functionality
and identify/resolve any software bugs prior to rollout.
5. If a manufacturer is basing its compliance on a statement of compliance with
other specifications, such as BAR97 or NYTEST, the state(s) should contact
the applicable agency (e.g., the California Bureau of Automotive Repair
[BAR]) during the acceptance testing process to verify that the proposed test
system has in fact been certified for use. The state(s) should keep current with
proposed and required changes for manufacturers to maintain their certification.
6. States should subject all test systems to a well-designed and thorough beta
testing process following the completion of acceptance testing.
7. States should include analysis of the test record database and all data files being
produced by the test systems sent to that database in the acceptance and beta
testing verification process.
QA/QC Elements
The auditing of ASM and other enhanced emissions test equipment is only one
component of what should be an integrated quality assurance/quality control program
aimed at ensuring that the equipment is properly designed and maintained. This QA/QC
program can be divided into three primary elements:
1. Initial acceptance and beta testing of the equipment;
2. Ongoing, automatic calibration checks programmed into the equipment; and
3. Equipment audits that are performed on a routine basis.
All three of the above elements are critical to ensuring that the equipment continues to
work properly and produce accurate test results. Details on acceptance and beta testing
are given in Section 3 of this report, and issues related to calibration and audit elements
are provided below.
Calibration Checks - Calibration checks are the second of the primary QA/QC elements
that are essential to ensuring that emissions test systems are producing accurate test
results. A range of automated calibration checks is incorporated into BAR97 and other
enhanced test systems. Most of the test systems are (or should be) designed to require the
performance of a passing calibration on a specified time interval (e.g., every 3 days,
weekly, etc.). If the check is not performed or the test system fails its calibration, the
software is designed to lock out the unit from further testing until a passing calibration is
performed.
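The interval-and-lockout behavior described above can be sketched as follows. The three-day interval matches the BAR97 gas calibration example; the function and field names are hypothetical, not taken from any actual test system software.

```python
# Hypothetical sketch of the calibration lockout rule described above:
# a unit is barred from further testing if its last calibration failed
# or the required calibration interval has lapsed. The 3-day interval
# mirrors the BAR97 gas calibration example; names are illustrative.
from datetime import datetime, timedelta

CAL_INTERVAL = timedelta(days=3)

def is_locked_out(last_passing_cal: datetime,
                  last_cal_passed: bool,
                  now: datetime) -> bool:
    """Lock out further testing until a passing calibration is current."""
    if not last_cal_passed:
        return True
    return now - last_passing_cal > CAL_INTERVAL
```

The point of verifying this logic during acceptance testing is that a test system with a broken lockout can keep generating official results from an out-of-calibration bench indefinitely.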
Verifying the functionality of the calibration checks during the acceptance testing process
provides assurance that the equipment will continue to operate within allowable limits
and thus produce acceptable test results after the test systems have been rolled out into
the field. The calibration checks also serve to ensure that the equipment is properly
maintained, since otherwise it will not continue to pass calibration. It is therefore
important that the equipment specifications and resulting control software contain a full
range of calibration checks aimed at all hardware components.
For BAR97 components, the calibration checks are relatively straightforward. They
typically involve a required 3-day gas bench calibration, a dynamometer coast-down
check, and calibration of the gas cap tester (if the system is so equipped), as well as a
3-day leak check of the sample system. This latter element is usually performed in
combination with the gas bench calibration. These checks basically ensure that the
equipment is operating within specified parameters (as coded into the control software).
For non-BAR97 components, the situation is complicated by the lack of documented
EPA guidance or other specifications. Neither EPA guidance nor the NYTEST
specifications contain any information related to VMAS calibrations. This has
necessitated a significant effort on the part of Massachusetts and its network contractor
(Keating Technologies), the manufacturer of the VMAS systems (Sensors), and Sierra to
develop and implement the necessary calibration checks for this component. While
significant progress has been made in this area in the Massachusetts program, we are
unaware to what degree similar functionality has been or is being developed in the other
VMAS-based programs (i.e., New York and Rhode Island).
The fact that the VMAS QA/QC functionality is being developed on the fly after program
rollout is distressing and is a potential indicator of more problems to come. As noted
above, additional test systems using new emissions and/or flow measurement technology
are currently under development. Unless EPA handles the approval of such equipment
differently than the NYTEST units, these test systems will have similar, if not worse,
QA/QC problems. EPA needs to begin now to establish some type of approval process
for such systems. Restarting the High-Tech Test Committee process that was used to
develop the initial details of EPA's IM240 guidance is not recommended. However,
some type of similar but less cumbersome method is needed to ensure that new test
equipment is well designed and incorporates adequate QA/QC elements. Currently there
is no mechanism to ensure this.
Equipment Audits - Audits are the last of the three primary QA/QC elements. These
need to be conducted on a regular basis to verify that the test systems are accurately
measuring emissions. They also serve as a check on the ability of the automatic
calibrations to maintain equipment performance within specified parameters.
For the most part, audits should function on a pro forma basis; i.e., they should show that
the equipment is meeting all applicable performance standards. If the test systems are
verified during acceptance testing to be meeting all applicable specifications (including
relevant calibration checks) and continue to pass all their calibration checks, they should
easily pass audits as well. However, this is not the case in almost all enhanced programs
across the country, particularly in the decentralized programs. There are a number of
reasons why audit failures are occurring at high rates in many programs, including those
outlined below.
1. In hindsight, some elements of the BAR97 specifications and related EPA
ASM guidance come relatively close to the performance limits of the available
technology (i.e., gas benches).
2. The prototype systems submitted by the manufacturers and tested by BAR
during the BAR97 certification process were largely hand-picked units whose
performance was superior to the average test systems.
3. A typical BAR97 test system, particularly after it has been in the field for some
time and suffered a commensurate degradation in performance, does not
perform nearly as well as the gold standard units tested and certified by BAR.
4. Certain design features that are required by federal regulation, and which would
address some of the audit problems, are not incorporated into the test systems
nor is EPA requiring them to be. The most obvious example of this is the lack
of any multipoint gas bench calibrations.
5. In some cases, widespread audit failures are seen by the affected states,
manufacturers, and contractors as a problem with the audit procedures rather
than the test systems.
Sierra does not have a readily available solution to the audit problems that are being
encountered in many programs. We also recognize that EPA cannot simply address this
issue by requiring that states adopt more stringent QC procedures. For example, while
implementation of multipoint calibrations might eliminate most BAR97 bench audit
failures, it is clear that EPA cannot just begin telling states to go to such calibrations. The
problem is that the existing test systems have been built within the constraints of existing
EPA guidance and BAR97 specifications; any tightening of the specifications is likely to
lead to the need for a significant redesign of the test system components, which would be
impractical.
Conversely, some if not all the manufacturers appear to also be at fault because they built
systems that could not perform to the guidance/specifications during in-field operation.
This has led BAR to pressure the manufacturers to incorporate various improvements or
enhancements into their systems in an attempt to upgrade performance to allowable
limits. The end result of this effort is unclear at this time; i.e., it is not known whether the
improvements will in fact enable the systems to begin meeting the audit limits in
California. In addition, it is unclear to what extent these improvements will migrate to
other programs. BAR is being very quiet about its discussions with the manufacturers,
and as a result it is unknown to what extent other states are aware of the upgrades that are
currently underway in California. The manufacturers will certainly not volunteer this
information to other states.
This situation further supports the need for some type of mechanism aimed at ensuring
that this type of information gets shared among the states. At present, this happens only
when specific questions are addressed to California (or another state). We also believe
that many states tend to downplay any problems they are encountering to avoid creating
the perception among the public, EPA, or other parties that their programs are in trouble
or are suffering from a loss in effectiveness. For this reason, Sierra recommends that
EPA consider requiring more frequent reporting of audit problems by states. Under this
approach, states that have a component audit failure rate above a certain threshold level
would be required to report that rate on a quarterly basis, along with a detailed
explanation of the reason(s) for the failures and what the state is doing to address the
problem in an expeditious manner. This information would in turn be posted on the EPA
website or distributed to interested states using some other suitable mechanism.
Audit Feedback - Under 40 CFR section 51.363(c), equipment audits are to be conducted
at least twice per year. Test system components specifically described as requiring an
audit in this section include the gas bench, sample system, CVS flow measurement
device, FID, dynamometer, and pressure and purge test devices. The gas cap tester,
VMAS unit, OBDII interrogation link, weather station, and RPM measurement devices
are not explicitly listed in the section but should also be included in the audit. Obviously,
the components to be audited in any particular program will depend on the design of the
applicable test system(s).
Isolated audit failures will occur since even equipment that is well calibrated and
maintained will fail on occasion when subjected to a more detailed audit procedure.
These failures are not considered of significant importance except for ensuring that they
result in the applicable test systems being quickly returned to proper performance.
Of much bigger concern is the problem of repeated audit failures noted above. A critical
link in the QA/QC effort is to have an effective feedback mechanism that can be used to
address repeated failures through changes in equipment design or performance
parameters. For example, such failures may be addressed by redesign of one or more
components, which would then need to be subjected again to some type of certification or
acceptance testing process prior to being rolled out into the field. Another approach
might be to change the automated calibration procedures (e.g., increase their frequency)
to address such audit failures. These examples show the interrelated nature of the three
QA/QC elements. Each depends on the other and none by itself is sufficient to produce
properly designed and operating systems.
Audit Frequency - Another concern is that many states may be performing substantially
fewer equipment audits than the number specified in 40 CFR 51.363; i.e., two per year
per test system. During a presentation at the Clean Air Conference in Colorado, the audit
supervisor for BAR indicated that the agency currently has a total of eight auditors who
are responsible for performing equipment audits. He also indicated this number was a
recent increase from the previous total of four auditors.
There are currently about 7,800 licensed Smog Check stations in California, most of
which are located in the enhanced program areas. The current total of eight auditors is
clearly insufficient to perform two equipment audits per year at each of these stations. If
it is assumed under a best-case scenario that an auditor can perform an average of four
audits per day and there are 250 working days per year, the eight auditors would be
capable of performing a maximum of 8,000 audits per year. This is only about half the
required number. In addition, given the relatively high frequency of failing audits in the
program, the assumption that each auditor can average four audits per day is overly
optimistic. (In the case of a gas bench audit failure, the bench is recalibrated and the gas
audit is then repeated, thus significantly extending the time required to complete the
audit.) For this and other reasons, Sierra believes the actual number of equipment audits
currently being performed in California is much less than 8,000 per year.
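The capacity estimate above can be reproduced directly from the stated assumptions (eight auditors, four audits per day, 250 working days, roughly 7,800 stations, and two audits per station per year):

```python
# Reproduces the best-case audit capacity arithmetic from the text.
auditors = 8
audits_per_day = 4               # assumed best-case average
working_days_per_year = 250
stations = 7800                  # approximate licensed Smog Check stations
required_audits_per_station = 2  # per 40 CFR 51.363

annual_capacity = auditors * audits_per_day * working_days_per_year  # 8,000
annual_requirement = stations * required_audits_per_station          # 15,600
```

Capacity works out to roughly half the requirement, and that is before accounting for repeat visits triggered by audit failures.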
The above example is not intended to single out California per se, but rather to illustrate
the potential magnitude of this problem. The California Smog Check program is widely
regarded as being one of the best managed decentralized I/M programs in the country,
particularly with regard to technical and equipment issues. One demonstration of this
view is the nearly total acceptance by other states of BAR certification as adequate proof
of test system hardware compliance. However, it is clear that insufficient resources are
being allocated to equipment audits in California. BAR's annual budget for
administering the Smog Check program and overseeing other vehicle repair activities in
the state is in excess of $80 million. If BAR, with this budget, is not coming close to
performing enough audits, what is the situation in other states with much more limited
resources?
Auditing Particular Programs
The audit procedures and forms developed under WA 2-05 are organized in a modular fashion, with
each audit module aimed at individual equipment components (i.e., the gas benches,
dynamometer, VMAS flow measurement, etc.). The objective of this approach was to
improve the utility of the materials for use in audits performed by EPA and state I/M
program staff. Those modules to be used in auditing a particular program can be selected
based on which test system components are applicable to the program.
Once the relevant modules have been identified, they can be combined in a logical order
to perform audits of the test systems being used in the program. Typically, the sample
system leak check and gas analyzer audit are performed first. (The leak check is always
performed before the gas bench audit.) The remaining audit elements can be combined in
whatever manner makes the most sense for the test system being audited. For example,
the software menus that must be accessed to perform some functions may be configured
in such a way as to make it easier to perform these audit elements in a certain order.
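The sequencing rule described above (leak check always before the gas bench audit, with the remaining modules in whatever order is convenient) can be sketched as a simple ordering function. The module names here are illustrative, not the WA 2-05 module titles.

```python
# Hypothetical sketch of assembling an audit sequence from the modular
# procedures: the sample system leak check must always precede the gas
# bench audit; remaining modules may follow in any convenient order.
def build_audit_sequence(applicable_modules: list[str]) -> list[str]:
    """Order modules so 'leak_check' always precedes 'gas_bench'."""
    sequence = [m for m in ("leak_check", "gas_bench") if m in applicable_modules]
    sequence += [m for m in applicable_modules if m not in sequence]
    return sequence

# Example: a typical ASM/evaporative test system
modules = ["gas_bench", "asm_dyno", "leak_check", "gas_cap", "rpm"]
sequence = build_audit_sequence(modules)
```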
Table 1 has been developed as a guide to what modules should be used to conduct audits
in each current enhanced I/M program. The information contained in the table represents
Sierra's understanding of the present status of the program, based on a detailed survey we
recently completed of all the programs. Significant changes in program status can and
likely will occur in the future. In particular, many of the programs included in the table
are planning on implementing OBDII checks in the next 6 to 12 months.
Table 1
Modules to Use in Auditing Current Enhanced I/M Programs(1)

Audit modules: Leak Check; Gas Bench; Transient Dyno(2); ASM Dyno(2); CVS Flow(3); VMAS Flow; Gas Cap; Pressure Tester; OBDII; RPM; Weather Station.

Programs covered: AZ/Phoenix; CA/enhanced areas; CO/enhanced areas; CT; DC; DE; GA; IL; IN; MA; MD; ME; MO; NJ; NY/New York City; OH; OR/Portland; PA/Philadelphia; RI; TX; UT/Salt Lake; VA; WA/Spokane; WA/Vancouver; WI.

[Per-program module assignments from the original table are not reproduced here.]

(1) In addition to the individual modules shown in the table, a visual inspection of the test systems would be conducted as part of all equipment audits.
(2) Audit procedures for both types of dynamometers are included in a single module.
(3) CVS audit procedures have not been developed since this work assignment is focused on auditing ASM- and VMAS-based test systems. CVS flow measurement is shown in the table for completeness; see the EPA IM240 guidance for how these units should be audited.
An experienced auditor should be able to perform a full audit of an ASM/evaporative
emissions test system in roughly 45-60 minutes. It is estimated that an additional 30
minutes will be required to audit a transient dynamometer, and another 30 minutes to
audit a VMAS flow measurement device. Less experienced auditors will take longer
(i.e., as much as double the time) to complete each of the audit procedures.
Issues of Concern
As noted in the work plan for this work assignment, Sierra is to highlight issues that EPA
technical staff should be aware of in performing audits of emissions test systems. In
performing acceptance testing and audits of a range of emissions test systems in various
state I/M programs, Sierra has identified a number of problems and concerns with the
equipment. These are sufficiently widespread to lead us to conclude that many existing
test systems in other programs probably also suffer from similar problems. These
concerns are therefore highlighted below to ensure that EPA is aware of and checks for
the type of problems observed in the field by Sierra. Note that some of the highlighted
examples involve software issues that should be checked during acceptance testing, rather
than hardware issues that can be checked during an equipment audit.
Audit Menus - Audit menus are generally an afterthought in the development of most I/M
program specifications. Most auditing programs have difficulty accessing the
information needed to perform all portions of an audit. This is in part because
some of the raw readings are available only through manufacturer field service
representative access, and generally state auditors do not have this type of access because
manufacturers do not want the state personnel to accidentally change setup parameters.
The requirements for audit menus to display all of the required information for
performing an audit need to be clearly defined in equipment specifications and checked
during ATP.
Dynamometers -
1. Proper error checking of dynamometer load is not incorporated into the control
software on some test systems; in some cases, the software may not be
checking to determine if a load is even being applied. For example, Sierra
disconnected the power to the Power Absorbing Unit (PAU) of the
dynamometer and ran a test without it being detected and the test being
stopped.
2. Problems with application of the correct load have been found in both
steady-state and transient uses of BAR97 dynamometers. For steady state
(ASM) testing, the problem is usually related to incorrect lookup of the vehicle
testing parameters (see the next item). For transient testing, although BAR did
conduct some certification of the BAR97 dynamometers to be able to perform
"diagnostic level inertia simulation," inertia simulation is not used in CA.
3. It appears that during certification, at least one manufacturer used software
running on a PC to perform transient feedback control of the dynamometer.
When the dynamometer was certified, the only operation the PC was
performing was controlling the dynamometer. Testing by Sierra using the
Sierra Research Dynamometer Tester discovered, however, that the
manufacturer was using "Virtual Inertia" (i.e., applied load based on assuming
the drive trace was driven perfectly) as opposed to using feedback control from
a computer. When asked to use feedback control, the manufacturer used
feedback control from the PC in the test system. Unfortunately, this PC is also
responsible for updating the drive trace on screen, sampling gas values, and
performing a plethora of other simultaneous tasks. Using this configuration the
manufacturer has not been able to meet the response time criteria, and these
systems are currently in operation in testing programs.
4. There are three problems that have been observed with the use of the
EPA/Sierra Lookup Table (ESLT). First, some vendors are simply not looking
up the vehicle information in the lookup table correctly. This needs to be
thoroughly tested during ATP to prevent this from occurring in the field.
Second, some locations are not updating the ESLT as frequently as needed,
even though important changes are made (even to older model years data)
between updates. The problem with performing updates is due to limitations of
the data systems some states use that do not allow for revisions to the lookup
table to be downloaded. The only option programs have in this case is to have
the manufacturer's field service representatives manually load a revised lookup
table when they visit the test stations. This is a slow, tedious process and it is
difficult to ensure all stations have received the update in a short period of time.
In addition, the recent decision by EPA to no longer fund the ESLT and the
switch to OBDII testing are creating further uncertainty for states as to the need
to update the lookup table data in the field.
Third, many testing programs are not using the "by model year" defaults
contained in the ESLT, but are using the older defaults that were contained in
previous EPA guidance. These are based only on vehicle type and number of
cylinders. Average vehicle weights based on these characteristics have
changed over the years, which led to the development of model year specific
defaults that are more accurate. This problem is exceptionally important if
either of the two previously mentioned problems (incorrect lookup or old ESLT
data) is present because in both of these cases the system may switch to the
defaults if the correct value cannot be found.
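The fallback behavior just described can be sketched as follows. All names, tables, and weights here are invented for illustration and are not taken from the actual ESLT software:

```python
# Hypothetical sketch of ESLT lookup with default fallback; all names,
# tables, and weights are invented, not from the actual ESLT software.
# The point: when the correct record cannot be found, the system silently
# falls back to defaults, so incorrect lookups or stale ESLT data get
# compounded by the older, less accurate type/cylinder defaults.

eslt = {("FORD", "TAURUS", 1999): 3450}       # model-year-specific weights (lb)
by_model_year_defaults = {1999: 3300}         # newer "by model year" defaults
old_defaults = {("car", 6): 3000}             # older type/cylinder defaults

def lookup_test_weight(make, model, year, vtype, cylinders, use_old_defaults):
    """Return (test weight, source) -- falls back to defaults when not found."""
    weight = eslt.get((make, model, year))
    if weight is not None:
        return weight, "eslt"
    if use_old_defaults:                      # program still on old defaults
        return old_defaults[(vtype, cylinders)], "old-default"
    return by_model_year_defaults[year], "my-default"

print(lookup_test_weight("FORD", "TAURUS", 1999, "car", 6, False))
print(lookup_test_weight("FORD", "ESCORT", 1999, "car", 6, True))
```

In this sketch a vehicle missing from the table is silently tested at the old default weight, which is exactly the compounding failure described above.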
One last issue is that the test programs in many locations are not maintaining
the ESLT vehicle characteristic identification (VCID) number in their
databases. Some programs are creating their own record numbers, which makes it
difficult to trace data back to the ESLT. In addition, if someone is attempting
to compare data for specific makes/models of vehicles between states, it is
much more difficult if the VCIDs used by the states are different. If the ESLT
VCID is used and stored in the test record, one can be more certain that the
correct testing parameters were used for testing the vehicle and the data would
be easier to compare between states.
5. No axle weight scales are being used with BAR97 dynamometers in ASM
programs besides California to compute generic tire/roll loss (GTRL) for the
test vehicle, despite the fact that it is a BAR97 requirement. The equipment
manufacturers are instead using a variety of approaches to set GTRL, including
using the values in the ESLT or values based on a hard-coded axle weight or
half of the GVWR for the test vehicle. These latter two approaches are clearly
problematic.
The absence of axle weight scales is also an example of how equipment
manufacturers are claiming test systems are BAR97-certified although the
configurations being used in other programs are not those that were certified by
BAR.
6. Currently the ESLT includes only the generic tire/roll interface losses for 2WD
testing. Some states are now performing testing of AWD vehicles and vehicles
with traction control on AWD dynamometers. Therefore, a method is needed
to determine the proper GTRL for vehicles tested in AWD mode. Sierra has
developed a method that EPA has approved to calculate AWD GTRL. This
method should be included in revised ASM and IM240 guidance to ensure that
these vehicles are loaded properly when tested.
7. Roll bearing temperatures are not being checked to compensate for changes in
parasitic losses with temperature. Thermocouples are installed in the hardware
but the data from the sensors are never used by the control system to adjust the
parasitic losses for temperature. Also, one manufacturer's software had the
ambient temperature and the bearing temperature signals reversed.
8. In some cases parasitic losses were being subtracted twice (at both the VID and
the test system), causing all tests to be conducted with too low of a load. Two
manufacturers were found to have other problems correctly accounting for
parasitic losses. In one case it appeared to be an error in the calculation of
parasitic losses in the software; in the other, it is clear that the manufacturer
deliberately hard-coded the parasitic losses (i.e., so they were fixed for all tests
and ambient conditions).
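A minimal numeric illustration of the double-subtraction error, using made-up load values:

```python
# Made-up load values illustrating the double-subtraction error. The PAU
# should be commanded to the target road load minus the dyno's own
# parasitic losses; subtracting parasitics a second time (e.g., at both
# the VID and the test system) under-loads every test.

target_load_hp = 10.0   # road load the vehicle should experience
parasitic_hp = 1.5      # dyno bearing/windage losses at test speed

pau_correct = target_load_hp - parasitic_hp        # command: 8.5 hp
pau_double = target_load_hp - 2 * parasitic_hp     # command: 7.0 hp

applied_correct = pau_correct + parasitic_hp   # vehicle sees 10.0 hp
applied_double = pau_double + parasitic_hp     # vehicle sees only 8.5 hp
print(applied_correct, applied_double)
```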
9. BAR97-certified dynamometers come equipped with a motor for automating
the spin up of the dynamometer. This motor is used for spinning up the
dynamometer to perform coast down checks and parasitic loss determinations.
Since the dynamometers were originally designed to perform the BAR tests
(the ASM 5015 and ASM 2525), the motor is required to get the dynamometer
only up to just over 30 mph. In New York, these BAR97-certified
dynamometers are used for transient testing up to 56 mph. The dynamometers
are incapable of determining the parasitic losses or of performing a coast down
check at such a high speed. This could be leading to errors in loading at higher
speeds.
10. One dynamometer vendor has made changes in the gearing between the power
absorption unit and the roll set in programs where its dynamometer is used for
transient testing. The change in configuration was never tested by BAR and the
impact on testing accuracy is unknown.
Analyzers -
1. The NOx outlet was being vented in a manner that an unscrupulous inspector
could easily change the reading by partially covering the outlet. This was
changed from the configuration certified at BAR, but was still represented as
"BAR Certified." The vendor said the NOx cell had been moved to the outside
of the analyzer to allow the customer to be able to replace the NOx cell,
without having to have a service technician come out and do it. Obviously if
customers can access the NOx cell, they can tamper with the data from the NOx
cell.
2. Testing in several programs found that several of the test system manufacturers
are not properly time aligning the gas emissions with the drive trace. In one
instance, although the exhaust gas being analyzed passes through the
HC/CO/CO2 bench and then through the NO analyzer, the time alignment was
the same for all gases. This caused the NO emissions to be improperly time
aligned with the trace. This can cause problems in steady state testing
programs and cause more significant errors in transient testing programs. This
is because the second by second exhaust gas concentrations and the second by
second flow rates are multiplied together to achieve the mass emission rate. If
they are not both time aligned correctly, this will lead to incorrect calculation of
mass emission rates.
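The effect of misalignment on the computed mass can be illustrated with synthetic data (all numbers are invented for illustration):

```python
# Synthetic illustration of time alignment: mass rate is concentration
# times flow, summed second by second. Shifting the NO trace against the
# flow trace (as with a bench whose NO channel lags) changes the total.

def total_mass(conc, flow, shift=0):
    """Sum conc[t - shift] * flow[t]; samples shifted out of range drop out."""
    total = 0.0
    for t in range(len(flow)):
        ts = t - shift
        if 0 <= ts < len(conc):
            total += conc[ts] * flow[t]
    return total

# An acceleration: the NO spike should coincide with the flow spike.
conc = [10, 10, 500, 500, 10, 10]        # concentration (illustrative units)
flow = [1.0, 1.0, 3.0, 3.0, 1.0, 1.0]    # flow (illustrative units)

aligned = total_mass(conc, flow, shift=0)     # 3040.0
misaligned = total_mass(conc, flow, shift=2)  # 1060.0 -- badly understated
print(aligned, misaligned)
```

Because the concentration spike no longer multiplies the flow spike, the misaligned total is far too low, which is why the error is much larger in transient testing.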
3. The original test data from New York indicated that proper sample conditioning
(e.g., a chiller) appears necessary to improve emissions measurement accuracy.
However, only one decentralized test equipment manufacturer is using a chiller
in its system and this system is not used in any of the transient testing
programs. No decentralized test equipment manufacturer is using a chiller in
any of the three decentralized transient testing programs (New York,
Massachusetts, and Rhode Island).
4. NOx electrochemical cells are being found to perform very poorly and their
performance is degrading well before the one-year lifetime commonly believed
for these units. Unfortunately, the only inexpensive NO measurement method
to date has been the City Technologies electrochemical cell. This cell has a
relatively slow response time, which is problematic for states using the cells for
transient testing. BAR recently has proposed a change to the allowed T90
response time of the cells from a maximum of 6.5 seconds to a maximum of 9.5 seconds.
steady state testing, as long as the data are time aligned and if one assumes that
the pollutant concentrations will be fairly constant, then this change is not
significant. For transient testing programs, however, this change, which BAR
has allowed, will add more inaccuracy to the calculated emission rates from the
test since the emission levels during a transient test can change rapidly.
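The effect of a slower T90 on a transient signal can be illustrated with a simple first-order lag model (our modeling assumption; the actual cell dynamics may differ):

```python
# Sketch of how a slow NO cell smears a transient signal, modeled as a
# first-order lag (our assumption; real cell dynamics may differ). For a
# first-order system, T90 = tau * ln(10), so a longer T90 means a larger
# time constant and more attenuation of rapid concentration changes.
import math

def lag_response(signal, t90_s, dt=1.0):
    """Apply a discrete first-order lag with the given T90 to a signal."""
    tau = t90_s / math.log(10.0)
    alpha = dt / (tau + dt)
    out, y = [], signal[0]
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

step = [0.0] * 2 + [1000.0] * 5 + [0.0] * 5   # a 5-second NO spike (ppm)
fast = lag_response(step, 6.5)                 # current 6.5 s T90 limit
slow = lag_response(step, 9.5)                 # proposed 9.5 s T90 limit
print(max(fast), max(slow))                    # slower cell reads a lower peak
```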
5. Gas benches on new test systems have been found to fail audits when being
started up for beta testing. Some manufacturers have resorted to doing a "Cal
High, Cal Low" procedure every time a calibration is required, instead of "Cal
High, Check Low." "Cal High, Check Low" provides a check that the
calibration curve fit between the high calibration point and the zero point is
correct. The "Cal High, Cal Low" procedure has no check of the calibration
curve fit and therefore the performance of the bench is never checked during a
calibration and failures will only be detected during audits.
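The difference between the two procedures can be sketched as follows (a single-gas, gain-only linear calibration and a 2% tolerance are simplifying assumptions):

```python
# Simplified single-gas, gain-only calibration to contrast "Cal High,
# Check Low" with "Cal High, Cal Low"; the 2% tolerance and gas values
# are assumptions for illustration.

def cal_high_check_low(read_high, ref_high, read_low, ref_low, tol=0.02):
    """Calibrate on the high gas, then CHECK the low gas against the fit."""
    gain = ref_high / read_high
    predicted_low = gain * read_low
    return abs(predicted_low - ref_low) / ref_low <= tol   # fit verified

def cal_high_cal_low(read_high, ref_high, read_low, ref_low):
    """Two-point recalibration: always 'passes'; the fit is never checked."""
    return True

# A bench that reads 10% low at the bottom of its range:
print(cal_high_check_low(3200, 3200, 180, 200))  # drift detected -> False
print(cal_high_cal_low(3200, 3200, 180, 200))    # drift hidden  -> True
```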
6. Some test system manufacturers are using the old humidity correction factor
and have not yet begun to use the revised humidity correction factor. The
revised humidity correction factor has the advantage of working over a wider
temperature range than the previous version. In other cases, the incorrect
revised formula contained in the 1998 guidance has been used and not
corrected.
7. 40 CFR Part 51, Appendix A to Subpart S (I)(d) states that "high volume"
stations (those performing more than 5,000 inspections per year) should
perform a calibration every four hours as opposed to every three days. In
addition, monthly multi-point calibrations are required per section
(I)(f)(1)(B). The only state that currently reports differentiating between low
and high test volume decentralized stations is North Carolina. North Carolina
also reports a low audit failure rate (less than 5%).
VMAS Units - At the time 40 CFR Part 51, Appendix A to Subpart S was adopted,
VMAS was not being used in the field so no specific quality control checks for VMAS
were specified. Appendix A (II)(b) does include checks for CVS-based flow
measurement, but there are major differences between CVS- and VMAS-based flow
measurement that make it impossible to apply the CVS-based quality control checks to
VMAS. For this reason, states have worked with Sensors (the manufacturer of the
VMAS) to develop quality control procedures for the VMAS. Still, several open issues
remain that can contribute to errors in measurements.
1. The VMAS unit determines the dilution ratio (exhaust gas to supplemental air)
by comparing the oxygen content in the raw vehicle exhaust to the oxygen
content in the diluted exhaust flowing through the VMAS. Unfortunately, the
oxygen sensors in BAR97 analyzers (which measure the raw exhaust) have slow
response times, and BAR specifications did not require the oxygen sensor to
pass a quality control check (there are no pass/fail criteria in the BAR
specifications for the oxygen sensor). Therefore, even if the oxygen sensor in
the BAR97 analyzer is not functioning, the software will still allow testing to
occur. When this happens, the dilution ratio will be incorrectly calculated due
to the poor quality of the raw exhaust oxygen content measurement.
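The oxygen-balance calculation can be sketched as follows (our formulation of the principle described above, not Sensors' actual code):

```python
# Sketch of the oxygen-balance dilution ratio (our formulation of the
# principle above, not Sensors' code). The diluted stream is a
# flow-weighted mix of raw exhaust O2 and ambient O2, so
#   DR = (O2_ambient - O2_raw) / (O2_ambient - O2_diluted)
# where DR = total VMAS flow / raw exhaust flow. A sluggish or failed
# raw-exhaust O2 sensor directly corrupts DR, and with it the mass rates.

O2_AMBIENT = 20.9  # percent O2 in the dilution air

def dilution_ratio(o2_raw_pct, o2_diluted_pct):
    """Total flow divided by raw exhaust flow, from an oxygen balance."""
    return (O2_AMBIENT - o2_raw_pct) / (O2_AMBIENT - o2_diluted_pct)

good = dilution_ratio(1.0, 16.0)    # healthy sensor: ~4.06
stuck = dilution_ratio(5.0, 16.0)   # sensor reading high: ~3.24, biased low
print(round(good, 2), round(stuck, 2))
```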
2. There is currently no EPA-specified method for auditing the oxygen sensor in
the VMAS unit. Sensors has suggested a technique for auditing the sensor, and
it has been included in the audit procedures developed under WA 2-05.
New York and Massachusetts are using this technique on a research basis, but
the technique is not being used in a consistent audit program.
3. There is no routine check the operator can perform to determine if the flow
measurement system is operating properly. Working with Sensors, a weekly
"hose off flow check" has been proposed to ensure the VMAS is still making
accurate measurements of total exhaust/dilution air flow. The principle is that
the VMAS should be pulling a fixed amount of air through it with the hoses off,
and if this value changes, there may be something wrong with the flow
measurement. This has not been implemented yet, and needs to be.
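A sketch of what such a check might look like; the 5% tolerance and baseline value are our assumptions, since no tolerance has been formally adopted:

```python
# Sketch of the proposed weekly "hose off" flow check; the 5% tolerance
# and the baseline value are assumed. With the hoses off, the VMAS
# should pull a repeatable free-flow value; drift from the established
# baseline flags a possible flow-measurement problem.

def hose_off_check(measured_scfm, baseline_scfm, tolerance=0.05):
    """Pass if the free-flow reading is within tolerance of the baseline."""
    return abs(measured_scfm - baseline_scfm) / baseline_scfm <= tolerance

print(hose_off_check(398.0, baseline_scfm=400.0))  # within 5% -> True
print(hose_off_check(360.0, baseline_scfm=400.0))  # 10% low -> False
```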
4. The VMAS unit compares the oxygen content in the diluted exhaust flowing
through the VMAS to the ambient reading to make sure the collection cone has
not accidentally or intentionally been moved away from the exhaust pipe. If
the oxygen content in diluted exhaust flowing through the VMAS approaches
ambient, the analyzer aborts the test because it believes the cone has moved
away from the exhaust pipe. Unfortunately, on large decelerations (such as at
the end of the IM240) many vehicles go into fuel cutoff to improve fuel
economy. When the vehicle goes into fuel cutoff, the engine of the vehicle is
essentially pumping air through the exhaust system. This air combines with the
dilution air entering the VMAS and the oxygen content is nearly the same as
ambient. The VMAS system interprets this as the cone having moved and
aborts the test. The only way to prevent this type of abort is to remove this
check.
Without this check, it is possible that the cone could move away from the
exhaust and it will not be caught by the system. A back-up check using fuel
economy could be used; however, as was previously discovered by EPA when
a fuel economy check was proposed for the IM240 guidance, it is difficult to
make this check stringent enough to be useful.
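The check and its fuel-cutoff failure mode can be sketched as follows (the 0.5 percentage-point threshold is illustrative, not the actual specification):

```python
# Sketch of the cone-displacement check and its failure mode; the 0.5
# percentage-point threshold is illustrative. During deceleration fuel
# cutoff the engine pumps air, so the diluted O2 legitimately approaches
# ambient and this naive check aborts a valid test.

O2_AMBIENT = 20.9  # percent

def cone_abort(o2_diluted_pct, threshold_pct=0.5):
    """Abort when the diluted stream looks like plain ambient air --
    whether the cone really moved or the vehicle is in fuel cutoff."""
    return (O2_AMBIENT - o2_diluted_pct) < threshold_pct

print(cone_abort(16.0))   # normal running: no abort
print(cone_abort(20.7))   # fuel cutoff (or a moved cone): abort
```

The check cannot distinguish the two causes, which is why removing it is currently the only way to prevent the false aborts.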
5. Many stations have mounted the VMAS units in various locations in their
shops to get the unit "off of the floor" or out of the way. Unfortunately, many
of these locations require additional hose to get the exhaust to the VMAS unit.
This distance and the extra resistance in the additional hose could change the
transport time from the tailpipe to the VMAS. In addition, Sensors
recommends that the exhaust exiting the VMAS pass through straight pipe for
some small distance, and many shops are also not following this suggestion.
6. The flow measurement in the VMAS is accomplished by a piezoelectric
speaker on one side of the VMAS tube sending out a signal that is modulated
by eddies coming off of a strut in the air flow and the rate of the eddies is read
by another piezoelectric sensor (microphone) on the other side of the VMAS
tube. If the strut or piezoelectric device gets dirty, the flow reading will not be
accurate. Currently some states require their software to instruct the inspector
to clean the strut once a week, but there is no certain way to confirm that this
has been done. Sensors has suggested that some sort of analysis of the
variation in the flow could be used to determine if the strut was dirty, but to
date, no one has implemented this type of check.
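The vortex-shedding principle can be sketched as follows; the Strouhal number, strut width, and tube diameter are generic illustrative values, not Sensors' specifications:

```python
# Generic vortex-shedding flow calculation matching the description
# above; the Strouhal number, strut width, and tube diameter are
# illustrative, not Sensors' values. Eddies shed off the strut at a
# frequency proportional to velocity (f = St * V / d), so fouling that
# alters the shedding directly biases the inferred flow.
import math

STROUHAL = 0.2          # typical bluff-body value (assumed)
STRUT_WIDTH_M = 0.01    # assumed
TUBE_DIAMETER_M = 0.1   # assumed

def flow_from_shedding(freq_hz):
    """Volumetric flow (m^3/s) inferred from the vortex shedding frequency."""
    velocity = freq_hz * STRUT_WIDTH_M / STROUHAL
    area = math.pi * (TUBE_DIAMETER_M / 2.0) ** 2
    return velocity * area

print(round(flow_from_shedding(100.0), 4))   # 100 Hz -> 5 m/s through the tube
```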
Weather Stations -
1. Ambient temperature sensors were improperly located inside the equipment
cabinet, thus resulting in biased readings. The equipment manufacturers
indicated that Sierra was the first to even ask that the temperature, humidity,
and barometric pressure be displayed so auditors could check the reading
against a standard measurement. Therefore, these cannot be evaluated in some
instances unless the software has since been modified.
2. Sierra was also told that a manufacturer in another program hard coded the
ambient readings (e.g., temperature and humidity) in all its test systems.
manufacturer provided two units for acceptance testing that properly monitored
and accounted for changes in ambient conditions. After completion of
acceptance testing, however, all units subsequently sold in the program area by
the manufacturer had ambient temperature and humidity values hard coded into
the software.
3. The weather stations are supplied with coefficients that are the calibration of
the weather station (such as 1 volt equals 30 degrees). Sierra has witnessed a
manufacturer's field service representative change a weather station in a unit
and not enter the new calibration coefficients. When asked why he did not
enter the new coefficients, he responded that he never changes them when
changing a weather station because "they are all about the same."
Gas Cap Testers and Calibration Caps -
1. There are currently three types of gas cap testers in use: Stant, Waekon, and an
internal version used in ESP analyzers. One of the gas cap testers has been
found to give unreliable results and to have difficulty passing calibration
checks. Stations generally repeat the calibration over and over until the unit
passes the calibration and then use the tester for IM testing, most likely
providing inaccurate results.
2. Some of the gas cap tester calibration procedures used by some vendors do not
include checks of the accuracy of the system, but simply reset the calibration
point. If one of the standards were incorrect (or, for instance, if the fail
standard were left off during the fail portion of the calibration as opposed to
using the fail standard), then the calibration would be incorrect. The
calibrations set the pass and fail leak rates for the tester, so if an incorrect
calibration is performed, then all subsequent tests could have incorrect results.
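The danger of a reset-only calibration can be sketched as follows; the leak rates, the 60 cc/min fail point, and the tolerance are illustrative values, not a vendor's actual specification:

```python
# Sketch of why a reset-only gas cap calibration is dangerous. Leak
# rates, the 60 cc/min fail point, and the tolerance are illustrative,
# not a vendor's actual specification.

FAIL_STD_CCM = 60.0   # nominal leak rate of the fail standard (assumed)

def calibrate(read_pass_ccm, read_fail_ccm):
    """Reset-only: thresholds taken straight from the readings, unchecked."""
    return {"pass_point": read_pass_ccm, "fail_point": read_fail_ccm}

def calibrate_with_check(read_pass_ccm, read_fail_ccm, tol_ccm=5.0):
    """Also verify the fail reading against the known standard."""
    if abs(read_fail_ccm - FAIL_STD_CCM) > tol_ccm:
        raise ValueError("fail standard reading out of tolerance")
    return {"pass_point": read_pass_ccm, "fail_point": read_fail_ccm}

# Fail standard left off during the fail step: tester sees almost no leak.
bad_cal = calibrate(0.0, 2.0)          # silently accepted; threshold wrong
print(bad_cal["fail_point"])
try:
    calibrate_with_check(0.0, 2.0)     # the same mistake is caught
except ValueError as err:
    print("rejected:", err)
```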
3. Fail standards provided by both Waekon and Stant have on occasion been
found to be outside the allowed tolerance. EPA guidance requires that these
standards be checked before use and at six-month intervals. This is generally
not currently done in IM test programs.
4. Providing the stations with an external pass standard leads to stations using the
standard to fake tests. Several states have reported that when they visit stations,
they have observed many are leaving the calibration pass standard attached to
the gas cap tester. This implies that the stations are using this for performing
tests.
Sources of Audit Equipment
Provided below are recommendations on possible sources of the equipment needed to
perform the recommended audit procedures. Cost estimates for the recommended
equipment are also provided in Table 2 at the end of this section. Note that these
estimates are based on information obtained by Sierra and would be expected to change
in the future. They also reflect the prices quoted to Sierra if we were to purchase the
items; it is unknown if EPA or a state would be quoted a different cost.
Audit Van - Depending on which test system components are to be audited, a large
amount of test equipment is required to perform the required audit elements. Due to these
equipment requirements (including the gas cylinders that will need to be secured for
transport), it is not possible to efficiently and safely transport the equipment in a
passenger car. In addition, auditors can spend a great deal of time connecting and
disconnecting the gas cylinder regulators. If gas cylinders are carried in a passenger
vehicle, each time they are used the safety cap must be removed and the regulator must be
connected for every cylinder. In addition, some states have set up the interior of the audit
vans to act as a workspace for the auditors. This includes adding a desk, stool, storage
cabinets, interior lights, and 120 VAC power for powering a laptop computer. Because
compressed CO and NO gases are stored in the vehicle, it may be desirable to have both
CO and NO alarms installed as well.
If audits will be performed on a regular basis, it is recommended that a full-sized van
such as a Ford E-250 be used for transporting the required cylinders of gas and associated
test equipment. This type of vehicle has the advantage that the rear interior space can be
used for a work area and can include a desk, etc. In some cases, it is not possible to reach
the gas analyzer by backing the audit vehicle into the shop and passing the gases through
the rear doors. A van allows the gases to be distributed from either the rear or the side
door of the vehicle, allowing access in different shop layouts.
An alternative arrangement is to use a pick-up truck with a camper shell on the back and
store the gases and equipment in the back. This prevents the auditor from sharing space
with the gases which is safer in the case of a gas leak. However, it does not allow for
work space and the gases can only be accessed from the rear of the vehicle.
Gas Racks - Because gas cylinders are to be moved from location to location in a vehicle,
they need to be housed in safety storage racks. Installation of a gas rack in a van also
allows the cylinders to be permanently attached to panel mounted regulators, saving
significant time for the auditors. Although the use of vans with gas racks installed is
more expensive than using passenger vehicles, it will save costs through reduced time for
the inspectors to perform an audit, and make for much safer transport of the gas cylinders.
Sierra has previously worked with Praxair to design a cylinder storage rack and manifold
gas delivery system that will reduce the amount of time it takes to perform an audit while
storing the cylinders safely for transport. The system includes a storage rack, regulators,
and a manifold that will provide a single point where all the gases can be accessed in
order to flow them during the audit. More information on this system can be obtained
from the following contact:
John Ottosen
Praxair
(800) 860-3531 x6237
Audit Gases - For audit gases, high-pressure cylinders are recommended due to the cost
savings compared to low-pressure cylinders. The part numbers listed below were
provided by Praxair; however, the same gases can be obtained from any gas blender. All
gases should be ordered as 1% analytical accuracy gases, 2% blend tolerance. The specs
on the gases are listed below. Note that the oxygen in nitrogen blends are only needed for
VMAS auditing.
Zero air (Praxair Part No. EV MSE14)*: < 1 ppm THC, < 1 ppm CO, < 200
ppm CO2, < 1 ppm NO, 20.9% oxygen, balance nitrogen
Low-range BAR-97 with NO (Praxair Part No. EV MSE9): 200 ppm propane,
0.50% CO, 6.0% CO2, 300 ppm NO, balance nitrogen
Mid-1 BAR-97 with NO (Praxair Part No. EV MSE38): 960 ppm propane,
2.40% CO, 3.6% CO2, 900 ppm NO, balance nitrogen
* The recommended zero air blend is "BAR97" zero air, which has higher allowed impurity levels than
recommended in the current EPA ASM Guidance Document.
Mid-2 BAR-97 with NO (Praxair Part No. EV MSE39): 1920 ppm propane,
4.80% CO, 7.2% CO2, 1800 ppm NO, balance nitrogen
High-range BAR-97 with NO (Praxair Part No. EV MSE12): 3200 ppm
propane, 8.00% CO, 12.0% CO2, 3000 ppm NO, balance nitrogen
1% oxygen in nitrogen audit gas, AQ- or AS-size cylinder
8% oxygen in nitrogen audit gas, AQ- or AS-size cylinder
15% oxygen in nitrogen audit gas, AQ- or AS-size cylinder
Medium-sized aluminum cylinders (i.e., "AQ" or "AS" size) are recommended. Size
"AQ" cylinders are pressurized to 2000 psig, contain 78 cubic feet of gas, are 7.25 inches
in diameter by 33 inches high, and have a tare weight of 30 pounds. "AS" cylinders are
also pressurized to 2000 psig, contain 145 cubic feet of gas, are 8.00 inches in diameter
by 48 inches high, and have a tare weight of 48 pounds. They should be able to conduct
approximately 200 or 400 audits, respectively. The AQ and AS cylinders are
approximately the same price, but the AS cylinder contains twice the volume. Although
the AS cylinders are 15 inches taller, they should be able to be mounted standing in a
full-size van. For these reasons, we suggest using the AS cylinders. If ease of handling is
a significant issue, the AQ size cylinders would be easier to move.
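The capacity figures above imply a per-audit gas consumption that is roughly consistent between the two cylinder sizes:

```python
# Sanity check of the audits-per-cylinder figures quoted above: both
# cylinder sizes imply roughly 0.36-0.39 cubic feet of gas per audit,
# so the 200/400 estimates are mutually consistent.

AQ_CUFT, AQ_AUDITS = 78.0, 200
AS_CUFT, AS_AUDITS = 145.0, 400

per_audit_aq = AQ_CUFT / AQ_AUDITS   # 0.39 cu ft per audit
per_audit_as = AS_CUFT / AS_AUDITS   # 0.3625 cu ft per audit
print(per_audit_aq, per_audit_as)
```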
Regulators - The gas rack system mentioned above includes regulators as part of the
manifold gas delivery system. Therefore, if that system is used, fewer regulators will be
required. If the cylinders are not going to be permanently mounted with the gas rack as
recommended, more regulators will be required. Since the regulators will not be used to
deliver a constant flow rate for a long period of time, single-stage regulators (which are
much cheaper than dual-stage regulators) can be used. The four audit gases containing
NO must have stainless steel regulators; these require a CGA 660 fitting. The zero air and
the oxygen in nitrogen mixes can use brass regulators; these require a CGA 590 fitting.
Both types of regulators can be obtained from any gas blender.
Three-Way Switching Valve and Air Flow Meter - For VMAS auditing, an additional
three-way switching valve (i.e., containing 3 inlets and 1 outlet) to switch the low-
pressure zero gas and the 8% and 15% oxygen blends is required. This can be
constructed of brass, since no corrosive gases will be used with it. The valve should be
capable of handling low-pressure gas (less than 30 psi). The output of the three-way
valve should flow through a simple floating ball-type flow meter. Praxair manufactures a
three-way valve with a flow meter output. The Praxair unit consists of a stainless steel
panel with three brass inlet quick connectors (containing internal valves), which are piped
to a manual five-way brass valve (the extra inlet ports are plugged). There is one
common outlet, which is piped to a brass flow meter (with a 0-2 liters per minute
capacity) with a metering valve. The outlet of the panel also consists of a quick
connector with an internal valve. The unit can be obtained from:
John Ottosen
Praxair
Model SGE 1215V5-FM-QC
(800) 860-3531 x6237
Tubing and Connectors - The straight tubing is made of Tygon and the connectors are
made of polypropylene or polyethylene. All materials are fairly resistant to the gases
used. These are commonly found items; one source is listed below.
Ryan Herco, (818) 841-1141
Tubing - Item #0010.071, 50-foot roll - Tygon tubing 1/4" ID x 3/8" OD, R3603
Connectors - Item #0717.015, Hose Tee, PP, 1/4"H x 1/4"H x 1/4"H
Barbed regulator connectors - Item #0675.056, Male Connector, PE, 1/4" Barb x 1/4" MPT
For connection to the sample probe, "bubble tubing" that fits over the end of a sample
probe is required. This is not very common; Sierra has found only one source for this
item:
Fisher Scientific, (800) 766-7000.
Item # 14 170 13C, 1 case (50-foot roll) bubble tubing 1/4 in, VCAT 8889 224054
We believe the VCAT number is the actual catalog number, and the "Item #" is the part
number. Although only a few feet of bubble tubing is needed for a single initial setup,
the "bubble" portion will wear out over time and need to be replaced. Therefore,
although 50 feet may seem excessive, it will be used over time.
Rubber Bands and Balloons - The rubber bands are used to attach the balloons to the
"Tee" connectors. Any inexpensive size of rubber band will work. The balloons should
be of the normal oval shape and can be made of any type of normal rubber material.
Portable Temperature and Humidity Weather Station (Psychrometer) - The temperature
and humidity can be obtained accurately using a simple and low cost psychrometer.
Sierra recommends the following battery-operated psychrometer:
Davis Instruments, (800) 368-2516
Catalog number ET87101
Model 22010
Barometer - A high-accuracy barometer is needed for use with the Smooth Approach
Orifice (SAO) that is used for the flow measurement audits. We recommend an
electronic digital barometer, since a mercury barometer would be more difficult and
hazardous to transport. The following is an example of a digital barometer that we
believe is accurate enough for use with the SAO:
Davis Instruments, (800) 368-2516
Catalog number DR639406
Model 740
Reference Gas Caps - Reference pass and fail gas caps can be purchased from either Stant
or Waekon, at the numbers provided below. Past experience has shown that some of the
reference caps are not as accurate as they are purported to be, and it is prudent to
individually flow test each cap on a quarterly basis to ensure it is properly calibrated.
Sierra recommends an audit agency either purchase equipment that will allow the caps to
be flow tested and conduct its own certification of the reference standards, or send the
reference caps out to be flow tested.
Stant-(765) 825-3121
Waekon - (800) 367-9235
RPM Simulator - Two models are available, one with a dial and one with push-buttons.
The push-button model has settings for both 2500 and 700 RPM; the dial model can only
accurately test 2500 RPM. The push-button model is recommended for use in auditing.
A source and contact are listed below.
Leistung Test Products, (414) 783-5537
Contact: Ray Niemeischek
OBDII Simulator - To determine if the OBDII test systems are working properly, a
device that can simulate an OBDII system in a vehicle is needed. The device allows for
setting of diagnostic trouble codes and readiness codes, and simulating RPM. Sierra has
worked with EASE Diagnostics to develop an OBDII audit tool. For further information,
contact the following:
EASE Diagnostics
Catalog Number OVT-2020-SW
Steve Golenski
570-465-9060
stepheng@obd2.com
Computer Laptop - We recommend that auditors use laptops to record the audit data. If
audits are going to be performed on a regular basis, audit software should be developed
and used in the field to reduce the incidence of calculation errors. If the dynamometer
tester is going to be used, the laptop also needs to have a PCMCIA slot for connection to
the tester.
Smooth Approach Orifice (200 to 500 SCFM) - An SAO will be used to check the
accuracy of the CVS or VMAS flow measurement. The following vendor can provide an
SAO to meet the program requirements. Contact the company for further information:
Dick Muns Company, (562) 596-1559
SAO, 200 to 500 SCFM, for connection to a hose with a 4" ID, low-flow restriction
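The flow audit itself reduces to a percent-error comparison between the flow indicated by the CVS or VMAS and the SAO reference flow. A minimal sketch, with an assumed 3% tolerance (the actual tolerance should come from the program's audit guidance):

```python
def flow_audit(indicated_scfm, sao_reference_scfm, tol_pct=3.0):
    """Percent error of the test system's indicated flow relative to the
    SAO reference flow. The 3% tolerance is an illustrative assumption,
    not a value from the audit guidance."""
    pct_error = 100.0 * (indicated_scfm - sao_reference_scfm) / sao_reference_scfm
    return pct_error, abs(pct_error) <= tol_pct

# Example: CVS indicates 410 SCFM while the SAO reference reads 400 SCFM
error, within_tol = flow_audit(410.0, 400.0)
```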
Stainless Steel 4" Air Flow Control Valve - A "blast gate" is commonly used to control
the flow of air in ventilation-type systems. This same type of gate can be used to adjust
the flow of air exiting the VMAS system. This type of device can be purchased from the
following source:
2-20
-------
McMaster-Carr
Item# 1788K12
Model - 4" duct diameter, standard blast gate, full gate style
Dynamometer Tester - A dynamometer tester will be required to ensure that transient
dynamometers are applying the correct load to test vehicles. Sierra has developed a
portable, relatively low-cost dynamometer tester for this purpose. The tester, which can
be integrated with the auditor laptop (i.e., the software for the tester will reside on the
laptop), includes a data acquisition system, optical speed sensor, interface box, data
acquisition card, and software. The cost of Sierra's dynamometer tester is $20,000.
Further information on Sierra's dynamometer tester can be obtained from the following
contact:
Michael St. Denis
Sierra Research
(916) 444-6666
mstdenis@sierraresearch.com
The only other dynamometer tester of which Sierra is aware was developed for the
California BAR by Real Time Dynamometers. It is relatively cumbersome to use and
costs approximately $450,000. Further information can be obtained by contacting:
David Amlin
California Bureau of Automotive Repair
(916)255-1376
Approximate Cost for Audit Equipment
Costs for all recommended audit equipment are provided in Table 2. These reflect the
costs quoted to Sierra, i.e., the prices we would pay if purchasing the items ourselves.
Note that simply summing the costs in the table will not yield the total cost needed to
equip an audit vehicle. Depending on the test systems to be audited, multiple units of certain items (e.g.,
gas regulators) may be required. In addition, installation costs for the initial setup of the
audit vehicle, such as installation of the gas racks and other equipment, are not included.
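The point about multiple units can be made concrete with a small cost rollup. The unit costs below are taken from Table 2, but the quantities are hypothetical and depend on the test systems to be audited.

```python
# Unit costs from Table 2 (subset); quantities are hypothetical and
# depend on the test systems to be audited.
unit_costs = {
    "gas regulator, CGA 660": 425,   # Table 2, item 11
    "audit gas cylinder": 250,       # Table 2, items 3-6
    "laptop computer": 2000,         # Table 2, item 26
}
quantities = {
    "gas regulator, CGA 660": 4,
    "audit gas cylinder": 4,
    "laptop computer": 1,
}

# Total equals the sum of unit cost times quantity for each line item
total = sum(unit_costs[item] * quantities[item] for item in unit_costs)
```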
2-21
-------
Table 2
Audit Equipment Costs

 #   Item                                                              Cost/Item
 1   Audit van                                                         --
 2   Gas cylinder racks with regulators, plumbing and output panel     $10,000
 3   High audit gas, AQ or AS size cylinder                            $250
 4   Mid-1 audit gas, AQ or AS size cylinder                           $250
 5   Mid-2 audit gas, AQ or AS size cylinder                           $250
 6   Low audit gas, AQ or AS size cylinder                             $250
 7   1% oxygen in nitrogen audit gas, AQ or AS size cylinder           $200
 8   10% oxygen in nitrogen audit gas, AQ or AS size cylinder          $200
 9   20% oxygen in nitrogen audit gas, AQ or AS size cylinder          $200
10   Zero air, AQ or AS size cylinder                                  $75
11   Single stage stainless steel gas regulators, CGA 660              $425
12   Single stage brass air regulators, CGA 590                        $175
13   Three-way switching valve and flow meter                          $1,300
14   50-foot roll of 1/4" ID x 3/8" OD Tygon tubing                    $55
15   Hose tees, 1/4" x 1/4" x 1/4"                                     $2
16   Barbed regulator connectors                                       $2
17   50-foot roll of bubble tubing                                     $35
18   Balloons and rubber bands                                         $2
19   Portable temperature and humidity weather station (psychrometer)  $140
20   Barometer                                                         $1,800
21   Reference "pass" fuel cap                                         $100
22   Reference "fail" fuel cap                                         $100
23   Vehicle pressure test audit tool                                  $650
24   RPM simulator                                                     $250
25   OBDII simulator                                                   $3,500
26   Laptop computer                                                   $2,000
27   SAO, 200 to 500 SCFM                                              $5,000
28   4" blast gate valve                                               $15
29   Dynamometer tester                                                $20,000
2-22
-------
3. TEST SYSTEM ACCEPTANCE TESTING
Before a QA/QC program begins, it is necessary to first ensure that the test system
software is functioning properly through the performance of a comprehensive acceptance
test procedure. An example of a required acceptance test is the confirmation that the
parasitic drag in the dynamometer is being calculated properly. In Sierra's experience,
one equipment manufacturer was not calculating dynamometer parasitic drag correctly.
If this had not been found and corrected before the equipment went into the field, the
test systems could have failed audits, and having a technician attempt to recalibrate the
dynamometer would not have fixed the problem.
As can be seen by this example, if the proper system operation is not confirmed during
acceptance testing, the systems will have audit failures that will not easily be repaired.
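One way to automate a check of this kind during acceptance testing is to compare a measured dynamometer coast-down time against the corrected coast-down time (CCDT). A minimal sketch, using the +/- 7% of CCDT tolerance that appears in the example ATP in Appendix A; the timing values are illustrative:

```python
def coastdown_check(measured_sec, ccdt_sec, tol_pct=7.0):
    """Pass if the measured coast-down time is within tol_pct percent of
    the corrected coast-down time (CCDT). The 7% figure follows the
    example ATP in Appendix A; the times used below are illustrative."""
    pct_error = 100.0 * (measured_sec - ccdt_sec) / ccdt_sec
    return abs(pct_error) <= tol_pct

# Example: a 13.4-second measured coast down against a 13.0-second CCDT
within_tol = coastdown_check(13.4, 13.0)
```

A dynamometer with miscalculated parasitic drag would fail this comparison repeatedly, no matter how often a technician recalibrated it, which is why the calculation itself must be verified before rollout.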
Due to the importance of acceptance testing, this overview is provided to show how
acceptance testing relates to QA/QC and auditing of test systems. Sierra has performed
acceptance testing for several states on all brands of decentralized test equipment.
Examples from actual acceptance testing are provided from this experience. In addition,
Appendix A includes an example of an acceptance test procedure for a decentralized
ASM test system.
Initial Acceptance and Beta Testing
Under this element, prototype units of the emissions test system (or systems) should be
subjected to detailed acceptance testing prior to program rollout. This testing is needed to
verify that the equipment as designed will result in proper vehicle tests and accurate
emissions results. If comprehensive acceptance testing is not conducted prior to rollout, a
program is virtually guaranteed to encounter substantial problems in test system operation
that could plague it for months or even years.
Acceptance testing also sets the stage for the latter two QA/QC elements described in this
report (calibration checks and equipment audits). For example, control software for all
required calibration and audit procedures must be verified to be present and fully
functional during acceptance testing. This would seem to be an obvious requirement;
however, as mentioned earlier, audit functions in particular often appear to be added as an
afterthought to many test systems.
Acceptance testing must include both test system hardware and software. A detailed
discussion of how such acceptance testing should be performed is beyond the scope of
this work assignment. In general, however, the testing should address the issues
summarized below.
3-1
-------
Hardware Functionality - The functionality of the individual hardware components as
well as the performance of the integrated system should be tested. To date, most program
agencies have not conducted very thorough hardware testing, largely because they do not
have the experience or resources needed for this effort. This has led many programs to
simply specify that the hardware being provided by the test system manufacturer(s) must
meet BAR97 specifications (or NYTEST in the case of VMAS-based testing). However,
this approach has also proven problematic, primarily for the following reasons:
1. BAR97 and, to a much greater extent, NYTEST specifications are poorly
documented and/or incomplete. Agreements between the manufacturers and
BAR made subsequent to the release of the BAR97 specifications are not fully
documented, making it nearly impossible to determine exactly what is meant by
the term "BAR97 certified." The NYTEST specifications are very incomplete,
with many key elements (e.g., VMAS operation) totally ignored in the written
specifications. This leaves many important technical issues open to
interpretation, making it very difficult to determine whether a particular system
is compliant with the specifications.
2. While the equipment manufacturers claim their systems are BAR97 certified,
Sierra has found that in a number of cases changes have been made in the
systems being produced for other programs. This means that these systems are
in fact not BAR97 certified; however, the program staff are typically unaware
of this. The changes that have been made cover a range of complexity and
significance. However, we have identified several instances in which a
modification from the BAR97-certified configuration may be having a
significant impact on resulting emissions readings. This issue is discussed in
more detail in a subsequent portion of this section and in Section 2.
Sierra recognizes that most agencies will continue to have limited ability to perform their
own hardware verification testing. However, we believe many of the audit procedures
and related audit equipment recommended in Work Assignment 2-05 can also be used
during the acceptance testing phase to verify the basic functionality of the hardware.
We also strongly recommend that any state that relies on a manufacturer's statement of
compliance with other specifications, such as BAR97 or NYTEST, as proof of
compliance for its own program initiate a dialogue with the applicable agency (e.g., BAR,
New York State DEC, etc.) well in advance of acceptance testing. This dialogue should
really begin during the development of the subject program's bid documents and/or
equipment specifications, and should be continued throughout the acceptance testing
process. Manufacturer claims regarding BAR97 or NYTEST certification should be
discussed in detail with the applicable agency, and guidance should be sought on how to
check the prototype test system to confirm it represents the certified configuration.
Photographs and other documentation can also be provided to help the certifying agency
determine if the prototype system matches the certified configuration.
Software Functionality - The full functionality of the software needs to be checked
during acceptance testing. This includes all test procedures incorporated into the
program, calibration and audit procedures, and all other software functions. While this
3-2
-------
testing is relatively straightforward, it can also be time consuming. This is particularly
true for decentralized programs involving multiple test systems built by different
manufacturers and/or a centralized vehicle inspection database (VID) that is integrated
with the test equipment. Care needs to be taken in such cases that both the individual test
systems and the fully integrated system are subjected to adequate testing. When bugs are
found in the system, additional time is also required to troubleshoot and identify the
source of the bugs, since it is often unclear whether they are occurring in the test systems,
the VID, or the interaction between the two parts of the overall network.
As mentioned previously, an important element that needs to be checked during
acceptance testing is the functionality of the menus and display screens needed to audit
each of the individual components of the test system (e.g., analyzers, dynamometers,
VMAS flow unit, weather station, etc.). Sierra's experience is that these menus and
screens are often forgotten or added as an afterthought without much focus on making
them user friendly. This can lead to substantial difficulties in conducting subsequent
audits. It is therefore critical that full audit functionality be incorporated into the
equipment specifications and verified during acceptance testing. Particular emphasis
should be given to ensuring the following:
- All audit menus, functions, and display screens are user friendly and do not
require unusual access (e.g., through an FSR Menu). All audit functions should
be easily accessible through a State/QA Menu.
- All data needed to determine audit results are clearly displayed and readily
available to auditors.
ATP Development and Performance
To ensure that acceptance testing of the software includes all applicable functionality, a
detailed acceptance test procedure (ATP) should be developed prior to the beginning of
the actual testing process. An example of a detailed ATP for an ASM test system is
provided in Appendix A. ATP development requires a thorough review of the equipment
specifications to verify that all required functions will be checked. This review has the
added benefit of identifying any inconsistencies or ambiguities in the specifications.
These items can then be addressed in advance, avoiding a situation in which such
problems are not identified until the test systems are delivered for acceptance testing,
leaving the program and manufacturers scrambling at the last minute to address items
that often carry over into the rollout of the program.
The ATP is typically divided into the following four sections as shown in the example in
Appendix A:
1. Main menu checks,
2. Inspection menu checks,
3. Test procedure scenarios, and
4. Additional checks.
3-3
-------
In addition, sheets with the emissions test conditions and timers are included as checks
that should take place with the scenarios. These checks verify that items such as the
check for sample dilution are working properly and that the test behaves as per the
specifications.
Acceptance testing of the software would begin with an evaluation of the Main menu
elements, and then continue with checks of the Inspection menu elements. After the
menu checks, multiple simulations of the individual test procedures (e.g., ASM, idle, and
2500 RPM emission tests) would be conducted following the applicable procedures
contained in the ATP. These checks should include the use of simulated vehicle
emissions and an RPM simulator, as appropriate. (This latter device was described in
more detail in Section 2.)
The logic of how the test system should react in specific circumstances will be
determined in advance and documented (see example in Appendix A). Defects should be
identified and noted whenever the system response is inconsistent with the proper results
for a particular set of inputs as specified in the ATP. An example bug log is provided in
Appendix A. This allows a quantitative comparison of expected versus actual
performance in each area of functionality. Various test simulations should be
incorporated into the test scenarios included in the ATP in order to fully check the
operation of the software.
Other performance assessments, or additional checks, should be conducted on the test
system on a "random" basis at various points in the ATP. These checks include a variety
of evaluation elements designed to ensure that the test system properly performs all non-
inspection-related functions.
Beta Testing
While a comprehensive acceptance testing process typically results in the elimination of
most software bugs, it is impossible to test all possible vehicle and test procedure
combinations during this process. Therefore, another critical QA/QC pre-program
element is subjecting all test systems to a well-designed and thorough beta testing
process. Typically, this involves rolling the equipment out to a limited number of test
stations, following the successful completion of acceptance testing in a laboratory
setting, for additional testing using actual, privately owned vehicles. Beta testing is usually
conducted by private repair shops whose owners have reached an agreement with the
individual manufacturers to be a beta test site.
The repair shops use the beta version of the manufacturer's equipment, following the
inspection procedures programmed into the equipment, to test vehicles that come to their
facilities for I/M testing. Ideally, such testing would be performed for a sufficient
minimum period of time (e.g., 30 days) to identify any additional bugs. In addition, the
software should meet an established acceptance protocol before being allowed out of beta
testing. This protocol could include such things as a minimum number of tests that are to
be performed, the required correction and retesting of all bugs that are identified during
3-4
-------
beta testing, a certain minimum number of "clean" tests (tests with no bugs in the
software and no data record defects) after correction of all bugs, etc.
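The exit protocol described above can be reduced to a simple decision rule. A minimal sketch, with hypothetical thresholds (the report does not prescribe specific numbers):

```python
def beta_exit_ok(tests_run, open_bugs, clean_tests_since_last_fix,
                 min_tests=500, min_clean=50):
    """Allow exit from beta only when enough tests have been run, no
    identified bugs remain open (all corrected and retested), and a
    minimum run of clean tests has followed the last correction.
    The thresholds are hypothetical, for illustration only."""
    return (tests_run >= min_tests
            and open_bugs == 0
            and clean_tests_since_last_fix >= min_clean)

# Example: 620 tests run, no open bugs, 75 clean tests since the last fix
ready = beta_exit_ok(tests_run=620, open_bugs=0, clean_tests_since_last_fix=75)
```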
Prior to beta testing, procedures should be developed for mechanics at the beta sites to
follow in conducting tests and noting and reporting potential equipment defects. These
procedures should include a standard reporting form to use in documenting the details of
each incident (an example beta testing bug log is provided in Appendix A). The beta-site
mechanics would then be trained in properly following reporting procedures and filling
out the reporting forms. Program staff or their contractor would observe initial testing at
the beta sites, and assist mechanics in understanding the beta systems and reporting
potential bugs.
The equipment manufacturers normally provide primary training to beta test mechanics
on how to operate their test systems. The program staff/contractor's role in this area
would therefore be limited to ensuring full documentation of the manner in which the
equipment was operated that resulted in a potential bug. This should include observing
testing, performing frequent audits of the equipment to ensure that the systems meet all
required specifications during the beta period, collecting and reviewing test data, and
collecting and reviewing any reporting forms completed by the beta sites. Efforts would
also be undertaken if needed to duplicate and better document bugs initially identified by
beta sites using test systems installed at the program's or contractor's laboratory.
Experience has shown this to be needed in many cases because the beta sites cannot
document or remember how a particular bug was generated.
While there are many areas of performance that should be evaluated during beta testing,
one that is often overlooked is analyzing the data files being produced by the test systems
to verify that they comply with all format and content specifications. This may seem like
a fairly minor issue when a program is attempting to roll out a new enhanced test
program. However, if this issue is not addressed at this time, it may well be discovered
months into the program that there are significant problems with the data that are being
generated, making all the data that have been generated to that point in time virtually
useless. This is another instance in which significant effort spent correcting problems
later in the program could have been avoided with a little extra effort at the beginning.
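To illustrate the kind of format and content checking recommended above, the following sketch validates a single test record against a hypothetical field layout. The field names and rules are assumptions for illustration, not the actual EIS.DAT specification:

```python
def validate_record(record):
    """Check one test record against a hypothetical layout: a 17-character
    VIN, a YYYYMMDD test date, and a numeric HC result. Returns a list of
    defects; an empty list means the record passes. The field names and
    rules are illustrative, not from any program's data specification."""
    defects = []
    if len(record.get("vin", "")) != 17:
        defects.append("VIN is not 17 characters")
    date = record.get("test_date", "")
    if not (len(date) == 8 and date.isdigit()):
        defects.append("test date is not in YYYYMMDD format")
    try:
        float(record.get("hc_ppm", ""))
    except ValueError:
        defects.append("HC result is not numeric")
    return defects

# A well-formed record produces an empty defect list
sample = {"vin": "1HGCM82633A004352", "test_date": "20020215", "hc_ppm": "142"}
defects = validate_record(sample)
```

Running every beta-test record through checks of this kind catches format problems while only a handful of stations are generating data, rather than months into the program.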
Required Resources and Time
The above descriptions clearly show that a significant acceptance and beta testing effort is
needed to properly roll out a program. It is anticipated that many programs would
indicate they have neither the resources nor the time to spend on this level of testing prior
to rollout. Unfortunately, recent experience across the country in rolling out enhanced
programs (particularly those with decentralized networks) has shown that if this level of
effort is not spent initially, even more time will need to be spent after the program is
rolled out to correct problems that could have been avoided. Any defects in the test
systems will come out eventually, and it is much easier to deal with these before testing
hundreds of thousands or millions of vehicles with the equipment.
3-5
-------
Sierra sees two basic factors that are driving states to roll out programs before they are
ready. The first is that most states view their SIP obligations and milestone dates (for
starting an enhanced program) as deadlines that must be met no matter what the cost.
One of the best things that EPA could do in this regard would be to develop a reasonable
policy and guidance on how it intends to work with states to give them the time they need
to ensure that the test equipment is operating properly prior to rollout. (Obviously, this
would require states to be making a good faith effort to implement an effective program.)
This would result in the rollout of better operating programs, which would therefore be
subject to less adverse public pressure. It would also help ensure that vehicles are being
properly tested in these programs.
The second factor is that the states have settled for (and EPA has allowed) fairly limited
acceptance testing of equipment in the past, which in hindsight was inadequate to identify
and address all the problems in the test systems. The result has been several programs
that have gone through the motions of acceptance testing while doing little to actually
identify and correct both hardware and software problems. This is not meant to be overly
critical of most states; for the most part, their intentions have been good. However, their
staff have limited experience in understanding and conducting testing on enhanced
emissions inspection systems. In many instances, the states have relied on contractors to
perform such testing; unfortunately, those contractors have done a less than thorough job.
Testing of Non-BAR97 Equipment
While the above description paints a fairly bleak picture of past acceptance testing efforts,
we do not see the situation improving in the future. If anything, it is likely to worsen due
to further penetration of VMAS-based and other similar non-certified test systems into
existing I/M programs. EPA is allowing states to develop and implement such
customized systems with little oversight. As a result, programs are being rolled out that
incorporate test systems that have not been subjected to thorough development and
testing. Consequently, there are I/M programs operating with what we believe are serious
flaws in the enhanced emissions test systems.
We recognize that EPA (like the states themselves) has limited resources to address this
issue. It is believed, however, that the Agency should hold the states to a much higher
performance threshold than currently exists for testing and verifying the operation of new
prototype systems prior to allowing their use. This recommendation fits well with the
previous one regarding the development of guidance on how EPA will work with states
to give them the time they need to ensure that the test equipment is operating properly
prior to rollout. Combining a more stringent acceptance threshold with the possible
granting of additional time to ensure test systems meet that threshold would clearly send
the signal that these systems need to be operating properly prior to the start of a program.
This in turn would give equipment manufacturers and contractors the message that they
need to do a better job in ensuring fully functional systems.
3-6
-------
4. AUDITOR TRAINING COURSE
Audit procedures for all of the components in ASM and VMAS-based decentralized test
systems were described in the audit guidance submitted to EPA under Work Assignment
2-05. For the current Work Assignment, Sierra has prepared a classroom training course
to be used to train auditors. The goal of the training is to familiarize the auditors with the
test systems, the audit procedures, and the types of failures that may be experienced.
Participants in the training course should be provided a copy of the training presentation
materials attached in Appendix B and a copy of the Work Assignment 2-05 audit
procedures and forms.
The ideal training program would be divided into three parts designed to be conducted
over two days. Part 1 would include a four-hour classroom session discussing the test
systems, acceptance testing, audit procedures, audit equipment, and common audit
failures. Training materials for this portion of the training are included in Appendix B.
Part 2 would be a four-hour hands-on training session with an ASM and/or VMAS-based
decentralized test system (as appropriate) in a laboratory environment. This portion of
the training should be conducted using the audit guidance developed under WA 2-05. To
reinforce what was learned in the classroom and practiced in the lab, the last portion of
the training should include actual field experience working with auditors conducting
audits on actual decentralized test systems in a shop environment. This would provide
experience in the practical and logistical issues related to performing equipment audits.
Classroom Training Description
As mentioned above, the presentation materials for Part 1 (the classroom portion) of the
training course are provided in Appendix B. The training materials begin with a general
overview of the issues in QA/QC, ATP, calibration, performing audits in the field, audit
menus on the various test systems, required audit equipment including audit vehicles, a
discussion of record-keeping issues, and issues related to retesting failed test systems.
After the general overview, each test system area to be audited (as per the audit guidance)
is discussed. These include the following:
1. Test system visual inspection,
2. Weather station,
3. Gas benches (NDIR for HC, CO, CO2, and electrochemical NO cells),
4. Tachometer,
5. Dynamometer (steady state and transient testing),
6. Gas cap tester,
7. Pressure tester,
4-1
-------
8. OBDII scan tool software, and
9. VMAS unit.
For each audit procedure, each of the following five areas is discussed:
1. Test system component operation,
2. Audit procedure for the component,
3. Audit tolerances,
4. Equipment required to perform the audit, and
5. Audit failures.
Test System Component Operation - The first item is a brief review of the theory of
operation for each test system component to be audited. The training materials are not
designed to make students experts in how the equipment works, but rather to provide the
background needed to understand problems that are encountered during the audits and to
understand the pass/fail criteria. The benefits and limitations of each test technology are
also discussed.
Audit Procedure for Each Component - The next area covered in the training is the
development of the individual audit procedures for each audit component. The step-by-
step procedures are covered along with the reasoning behind them. Many of the
procedures include subtleties that are stressed in the training because, if not followed,
they could lead to inaccurate audit results.
Audit Tolerances - This area covers how the audit tolerances were developed for each
procedure. The pass/fail criteria contained in the audit materials are based on the contents
of EPA's latest guidance documents on enhanced emissions testing, i.e., the July 1996
version of the ASM test guidance and the April 2000 version of the IM240 and Evap
Technical Guidance. In cases where the EPA documents did not provide guidance
regarding relevant pass/fail criteria, the presentation covers how they were developed
based on other available materials (e.g., the BAR97 specifications) or through discussions
with knowledgeable stakeholders. For example, as noted in the previous section, Sensors
was contacted for its input regarding appropriate audit procedures for the VMAS unit. In
many IM programs, however, there is debate amongst the parties involved (state
government, equipment vendors, and/or contractors) as to the appropriate level of
stringency for audit tolerances. The presentation covers the development of each audit
tolerance in the audit procedures, the basis of each, and the vendors' concerns.
Equipment Required to Perform the Audits - The training will cover the recommended
test equipment for performing the audits. Potential suppliers and approximate costs for
all of the required equipment are included in Section 2 of this report. The training will
review the required equipment and calibration of the audit equipment, and present the
approximate costs.
Audit Failures - The last portion of the training includes a discussion of the common
failures of each test system component. This portion of the training is also covered in
detail in Section 2 of this report. Information from audits performed on operating I/M
4-2
-------
program test systems is presented to demonstrate problems found in the field. Insights
into the types of equipment failures that are the most common, the impact of those types
of failures on test accuracy, and what is being done by states and equipment vendors to
resolve the problems will also be presented.
4-3
-------
APPENDIX A.
EXAMPLE ACCEPTANCE TESTING PROCEDURE
A-1
-------
ANALYZER TOP LEVEL MENUS

MENUS SELECTION / PASS CHECKS

Power on Self Test
  disconnect keyboard, fails POST
  all connected, passes
Calibrations Required?
  requires calibration
  gas bench HC, CO, NO zero, span, mid span
  gas bench calibration lockout set/clear
  leak check
  leak check lockout set/clear
  gas cap tester
  gas cap tester calibration lockout set/clear
  dynamometer warm-up time
  dynamometer warm-up lockout set/clear
  dynamometer coast down
  dynamometer coast down lockout set/clear
  dynamometer load cell calibration
  dynamometer load cell calibration lockout set/clear
Bench cal time
  72 hours
Dyno cal time
  72 hours

TOP LEVEL MENU
Menus Selection
1) Vehicle Inspection Menu (V)
  all items presented
2) Additional Site Specific Software (A)
  single letter chooses menu
3) Diagnostic Functions (D)
  secure access to state menu, not even to technician
4) Station Menu (S)
  secure access to manufacturer's menu
5) NJ State Menu (N)
  secure access, state only
6) Manufacturer's Technician Menu (T)
  secure access, service only

VEHICLE INSPECTION MENU
1) Vehicle Inspection Menu
  1 Vehicle Emissions Inspection
    see separate worksheet
  2 Analyzer Maintenance
    (1) Three Day Gas Calibration
      scan bottle values (dyno coast down, gas cal, leak check)
      manual entry of bottle values
      easy prompts, display results
      gas calibration fail sets lockout
      leak check fail sets lockout
      coast down fail sets lockout (8 to 18 hp, 2 settings)
      clear lockouts once passed
    (2) Leak Check Only
      clear lockouts once passed
    (3) Gas Calibration Only
      clear lockouts once passed
    (4) Status Screen
      all required data is displayed and is correct
    (5) Dynamometer Calibration Only
      clear lockouts once passed, +/- 7% of CCDT
    (6) Pressure Test and Gas Cap Test Calibration
      clear lockouts once passed
    (7) Dynamometer Roll Speed Check
      clear lockouts once passed, +/- 0.2 mph, weekly
    (99) Return to Main Menu
      returns to Inspection menu
  3 Search and Reprint Report/VIR
    search by VIN, date and/or TIN
    review and print selected inspection record
    print "replacement VIR" noting duplicate and date of printing
    returns to Inspection menu
  4 Training Mode
    "Training Mode" on VIR; TIN says "Void" and "VOID" on VIR
    no test results on VIR
    no VID communications, no EIS.DAT generated
    returns to Inspection menu
  99 Return to Main Menu
    returns to Main menu
2) Additional Site Specific Software
  optional access to diagnostic data base, etc. (may not be installed)
  verify no access to State emissions testing program
  verify no access to operating system/file structure
3) Diagnostic Functions
  manual testing and repair functions, not accessible from within official emissions tests
  verify no access to State emissions testing program
  verify no access to operating system/file structure
4) Station Menu
  1 Network Communication Diagnostics
    modem serial port diagnostics
    dial tone check
    network diagnostics
  2 Data File Refresh
    refresh date and time; AUDITOR.DAT, TECH.DAT, FUEL.DAT, LOCKOUT.DAT, INSPCOND.DAT,
    SPD2.DAT, ASMSTDS.DAT, NEWSTDS.DAT, D6FPRAM.DAT, SYSTEM.DAT
    note updated, or cannot access network
  3 Lockout Status Refresh
    contact VID, request LOCKOUT.DAT; show status or cannot access network
  4 Print a Communications Log
    print listing of past 14 days or last 100 sessions
    prompt for number of sessions to be printed
5) NJ State Menu
  scan inspector badge, input access code; after three attempts, back to Main menu
  1 State Analyzer Maintenance
    (1) Three Day Gas Calibration
      scan bottle values (dyno coast down, gas cal, leak check)
      manual entry of bottle values
      easy prompts, display results
      gas calibration
      leak check
      coast down, 8 to 18 hp, random 2 settings
    (2) Leak Check Only
      leak check
    (3) Gas Calibration Only
      gas calibration
    (4) Status Screen
      all required data is displayed and is correct
    (5) Dynamometer Calibration Only
      +/- 7% of CCDT
    (6) Pressure Test and Gas Cap Test Calibration
      test gas cap only
    (7) Dynamometer Roll Speed Check
      +/- 0.2 mph
    (99) Return to State Menu
      returns to State menu
  2 Gas Audit
    no preset values; display in large characters and print results
  3 Update Station Information
    display all information in STATION.DAT and allow it to be changed; log to audit file
  4 Install New Data Disk
    replace secured floppy diskette, format, copy previous records
    when complete, return to State menu
  5 Practical/Demo Test Record
    same as Inspection menu
  6 Lockout Functions
    display lockouts
    allow lockouts to be changed
    transfer lockout changes to VID
  7 Perform Emergency Software Update
    changes software version number; returns to NJ State menu or reboot
  8 Test Record Search and Retrieval
    same as Inspection menu; saves to floppy
  9 Total Number of Inspections Since Last Data Transfer
    display number; prompt to return to NJ State menu
  99 Return to Main Menu
    return to Main menu
6) Manufacturer's Technician Menu (T)
  manufacturer to propose security method; writes to audit file

2/27/2002
MAIN MENUS
Page 1
-------
INSPECTION ITEMS
Reference
Function
NOTES
Pass CERTIFICATION TESTING REQUIRED
Field Name (EIS DAT)
No
VEHICLE INSPECTION MENU
pg 33 3 3 A 1 i Vehicle Inspection
2 Analyzer Maintenance
3 Search and Repnnt ReporWlR
4 Training Mode
99 Return to Main Menu
Select Option Desired, Enter Number
pg 34-35 3 3 B 1 Vehicle Selected Lockouts Set?
State lockout set at machine
Cabinet tampering
Station license has been revoked or suspended
Station license has expired
Failure to pay for communications services
Max offline tests exceeded
Corrupt data file or other problem
pg 35 3 3 B 3 Aborts
pg 35 3 3 B 3 Inspector Access (Manual or Bar-code)
access code required
pg 35 3 3 B 3 Aborts
pg 36 3 3 B 3 C Inspector Exists in INCOMPLT DAT Y/N?
pg 36 3 3 B 4 VIN Entry
pg 37 3 3 B 4 Jurisdiction Entry
pg 38 3 3 B 4 Sticker Expiration Entry
pg 37 3 3 B 4 License Plate Entry
pg 37 3 3 B 5 Select Test Type
number of days from VID
do not write INCOMPLT DAT before ID entered
three tries or back to main menu, double blind if manual
license in TECH DAT
license access code matches
license has not expired
write INCOMPLT DAT after ID entered
if a file for the inspector is not present, create a file
bar code vehicle or manual entry? B/V/M?
2nd manual entry match? Y/N?
no prompts if scanned
show 1st of all 50 states
no prompts if scanned
MM/YYYY formal
do not proceed until entered
no prompts if scanned, if manual then double blind
only manual entry do not proceed until entered
Initial / Reinspection
reinspection ask about repairs
ask for repair info
display cost categones
repair categories and codes mfg interface <9
parts and labor costs no decimals < 9999
all menu items presented
VID cannot set/clear
set at machine, cleared by VID or in person
set by VID, cleared by VID or in person
set by VID, cleared by VID or in person
set by VID, cleared by VID or in person
set by VID, cleared by VID or in person
set at machine, cleared by VID or in person
abort does not write INCOMPLT DAT
manual entry of invalid license, prompt "not a valid license", back to main menu
manual entry, valid license, valid access code
bar code entry, access code valid
license expired, prompt license expired, back to main menu
abort writes INCOMPLT DAT (for each inspector) from this point forward
add inspector to INCOMPLT DAT as required
increment test started field in INCOMPLT DAT
increment test aborted field if test ends prior to record write to EIS DAT on hard drive
date change creates new record, sends old records
print VIR with "Invalid" next to pollutant readings
cannot cut and paste to do double blind
manual entry double blind pass
DMV renewal form bar code entry
read bar code on vehicle
pick from list
pg 39 3 3 B 5 Pink Card Entry
manual entry
blank will not proceed
wrong format, will not accept
manual entry
blank, will not proceed
initial test go to network communications
reinspection, no entry, back to first prompt
reinspection no repair data go to network communications
reinspection repair data
reinspection, repair data, enter repair facility ID
reinspection, repair data, enter repair technician ID
reinspection, repair data, do not enter date, show error message
reinspection, repair data, enter date
reinspection, repair data, enter parts costs >9999, show error message
reinspection, repair data, enter parts costs with cents
reinspection, repair data, enter valid parts cost if cents not accepted
reinspection, repair data, enter labor costs >9999, show error message
reinspection, repair data, enter labor costs with cents
reinspection, repair data, enter valid labor cost if cents not accepted
reinspection, repair data, enter category 27, show error message
reinspection, repair data, enter > 9 valid repair categories and codes
reinspection, repair data, enter < 9 valid repair categories and codes
reinspection, repair data, enter code 69
reinspection, repair data, enter work order number
reinspection, repair data, Y - additional forms, return to prompt, write file
reinspection, repair data, no more forms, go to network communications
request pink card date be entered, enter future date, get error message
enter a valid date (today or older), continue
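The repair-data entry rules exercised by the checklist above (whole-dollar costs up to 9999 with cents rejected, at most 9 repair category codes, and a repair date no later than today) can be sketched as follows. This is an illustrative Python sketch, not the analyzer vendor's code; the function and parameter names are assumptions.

```python
# Illustrative sketch of the repair-data entry rules in the checklist
# (names are hypothetical, not from the spec).
from datetime import date

def validate_repair_entry(parts_cost, labor_cost, categories, repair_date, today):
    errors = []
    for name, cost in (("parts", parts_cost), ("labor", labor_cost)):
        # costs are entered as whole dollars; decimals (cents) are rejected
        if not isinstance(cost, int) or not (0 <= cost <= 9999):
            errors.append("%s cost must be a whole number 0-9999" % name)
    if len(categories) > 9:
        errors.append("no more than 9 repair categories")
    if repair_date > today:
        errors.append("repair date cannot be in the future")
    return errors
```

For example, an entry of $10,000 in parts, ten category codes, or a future pink-card date would each produce an error, matching the error-message cases in the checklist.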
Check in TECH DAT
Software Version
License Number
Input Source
Test Record Number
INCOMPLT DAT
Facility ID
Analyzer Number
Test Date
Test Start Time (24 hour time)
VIN
VIN source
License plate jurisdiction
Old sticker date
Plate no
Plate Source
Test type (initial/retest)
Emission repair facility number
Emission repair technician ID
Repair date
Parts total
Labor total
Repair codes
Work order number
Inspection Allowance Date (IAD)
2
3
37
38
10
12
REPAIR DAT 15
REPAIR DAT 14
REPAIR DAT 11
REPAIR DAT 16
REPAIR DAT 17
REPAIR DAT 12
REPAIR DAT 18
2/27/2002
INSPECTION MENU
Page 2
-------
INSPECTION ITEMS
Pass CERTIFICATION TESTING REQUIRED
Field Name (EIS DAT)
pg 40 3 3 6 6 Network Connection
pg29 2 25d)2
pg 29 2 25 d) 2
pg 29 2 25 d) 2
pg 41 3 3 B 6 B Network Access (transmitted data )
display prompt that system is connecting to network
if no connection prompt proceeding with inspection
Write INCOMPLT DAT
Wnte INCOMPLT DAT
Write INCOMPLT DAT
pg 41 3 3 B 6 C
Network Access (no match)
pg 43 3 3 B 6 D Network Access (received data)
if vehicle match is found no changes allowed to
VIN and license, abort if required
look in EIS DAT or EIS HST
not in EIS files, verify VIN
refer to communications specs
pg 43 3 3 B 6 E Display Data Fields
pg 45 3 3 B 6 F Initial versus Retest Check
pg 45 3 3 B 6 G Recall Notice and TSB Information
pg 40 3 3 B 7 Vehicle Review Screen
pg 22 2 3
display and allow to print
model year not < 1900 or > current + 2
pg47 3 3 B 9 GVWR Entry
pg 47 3 3B8 Odometer Reading
pg 43 3 3 B 8 Dual Exhaust Y/N?
pg 48 3 3 B 9 1 VRT Lookup Row ID
pg 49 3 3 B 10 WRITE PRELIMINARY EIS DAT
manual entry
> 8,500 pounds receives an idle test
calculate if low mileage exemption applies
only Y or N for answer accepted
no-match first try, ask for VIN and plate re-entry, no changes, manual entry
NJ licensed no-match first try, VIN re-entry, change, second no match proceed
NJ licensed no-match first try, VIN re-entry, change to valid VIN
no comm, search EIS DAT and EIS HST for VIN, not found, prompt, N abort
no comm, search EIS DAT and EIS HST for VIN, not found, prompt, Y proceed
no comm, search EIS DAT and EIS HST for VIN, found, proceed
Number of days/tests allowed w/o comm
INCOMPLT DAT
INCOMPLT DAT
INCOMPLT DAT
VID vehicle ID
Previous cycle initial odometer
Previous cycle initial test date
allow for changes
more than 45 days since last failed test
if initial, found previous test, ask if change to retest
receive Vehicle Data
receive TSB and Recall Information
receive System Date/Time Update
receive Lockout Status
receive Inspector and Auditor Data (TECH DAT and AUDITOR DAT)
receive DMV Messages
receive Program Data (SYSTEM DAT)
receive Updates to the Vehicle Reference Table (VRT DAT)
receive Two Speed Idle Emissions Table (SPD2 DAT)
receive updates to ASM Emissions Table (ASMSTDS DAT)
check applicability of NEWSTDS DAT by copy date in SYSTEM DAT; if NEWSTDS DAT exists
and date passed, delete ASMSTDS DAT, rename NEWSTDS DAT to ASMSTDS DAT
receive Horsepower Default Table (DEFPRAM DAT)
display 13 vehicle data fields and allow for changes
prompt that a full reinspection must be performed
type entered is initial, VID sends back retest, prompt for retest (Y/N?), change to retest | Test type (initial or retest)
type entered is initial, VID sends back retest, prompt for retest (Y/N?), keep initial
display recall or TSB info
print recall or TSB info
display 11
prompt to enter missing data
change model year to 1699, show prompt
change model year to 2005, show prompt
prompt to verify fuel code, show list, GASO highlighted
27
33
34
Model year
Make
Model
Trans type
Vehicle type
prompt to enter fuel code, enters wrong code, shows error message, returns to prompt | Body style
prompt to enter fuel code, enters bi-fuel code, prompt to test on gasoline | Cert class
prompt to enter fuel code, enters exempt from testing code, prompt to do safety | Fuel code
prompt to enter cert class, give message and allowed values | Number of cylinders
prompt to check transmission type if E (only A and M allowed) | Engine size
prompt to enter certification class, give message on allowed values and how to determine
no value entered, should not accept
enter 1999 pounds, should not accept, give error message
enter 60001 pounds, should not accept, give error message
decimal point entered
valid reading entered if decimal point not accepted
no value entered, should not accept
decimal point entered
valid reading entered if decimal point not entered
no value entered, should not accept
enter "M", should not accept
valid reading entered if decimal point not entered
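The entry bounds tested above can be sketched in Python. This is an illustrative sketch, not the vendor's code: the model-year rule (not < 1900 or > current + 2) comes from the checklist, while the GVWR range of 2000-60000 pounds is an assumption inferred from the rejected test entries of 1999 and 60001.

```python
# Illustrative sketch of the data-entry bounds in the checklist
# (function names are hypothetical).

def model_year_ok(year, current_year=2002):
    # checklist rule: model year not < 1900 or > current year + 2
    return 1900 <= year <= current_year + 2

def gvwr_ok(pounds):
    # ASSUMED bounds, inferred from the rejected entries 1999 and 60001
    return 2000 <= pounds <= 60000
```

With a current year of 2002, the checklist's test entries 1699 and 2005 are both rejected, and GVWR entries of 1999 and 60001 pounds both draw error messages.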
find VRT match
no VRT match use DEFPARM DAT
GVWR
ETW
ETW Source
Odometer Reading
VRT Record No
ASMSTDS DAT record number
15
24
13
17
18
28
22
23
19
20
21
2B
29
2/27/2002
INSPECTION MENU
Page 3
-------
INSPECTION ITEMS
NOTES
Pass CERTIFICATION TESTING REQUIRED
Field Name (EIS DAT)
pg 51 3 3 B 11 B Credentials Check
pg 52 3 3 B 11 B Steering/Suspension Check
pg 52 3 3 B 11 B Safety Equipment Check
pg 52 3 3 B 11 B Lights Check
pg 52 3 3 B 11 B Brakes Check
pg 53 3 3 B 11 B Exhaust Check
pg 53 3 3 B 11 B Miscellaneous Check
pg 54 3 3 B 11 C Overall Safety Test Result
pg 56 3 3 B 12 Visual Anti-Tampering Inspection
pg 56 3 3 B 13 B Gas Cap Pressure Test
pg 58 3 3 B 13 C Tank Pressure Test
pg 59 3 3 B 14 A Retest?
pg 59 3 3 B 14 B Retest Waiver
pg 60 3 3 B 14 D Emissions Test Selection
pg 61 3 3 B 14 D 3 For ASM, Check Dyno Testable
pg 62 3 3 B 14 D 4 Low Mileage Vehicle Exemption
"Finished?" default is *n"
"P* replaces previous conditions
rouai test conditions an oefauu to P else old result
finished filed defaults to ~tr
"Finished7" default is "N"
T" replaces previous conditions
tmbal tesl conditions aD default lo P else otd resuti
finished filed delaults to "N"
"FwshedT" defauli is "N"
"P" replaces previous conditions
trubal test conditions all defauli to P else old result
finished fded defaults to *N"
"Finished?" default is "N"
"P" replaces previous conditions
initial lest, conditions all default to P else old result
finished filed defaults to "N"
"Finished?" default is "N"
"P" replaces previous conditions
tniuaJ lest conditions all default to P else otd result
finished filed defaults to "N"
"Finished?* defauli is "N"
"P" replaces previous conditions
initial test conditions aD default to P else old result
finished filed defaults to "N"
"Finished?* defauli is "N"
~P~ replaces previous conditions
initial test conditions aD default to P else old result
finished filed defaults to "N"
can only be P or F
can only be P or F
optional idle test
for ASM indicated, check if dyno testable
no pink card or previous odometer
Steering and suspension result
Safety Equipment result
Lights result
enter N or X for each item, see if correct inspection conditions are shown | Credentials result
select no condition, display prompt to return to item, enter "N" stay on prompt
select no condition, display prompt to return to item, enter "Y" return to item screen
check inspection conditions logic for pass, fail with P, N and X, with A, F or 4
at bottom under "Finished?" enter N, should stay on screen
at bottom under "Finished?" enter Y, go to next check
even with previous test, Credentials results always default to blank
enter N or X for each item, see if correct inspection conditions are shown
select no condition, display prompt to return to item, enter "N" stay on prompt
select no condition, display prompt to return to item, enter "Y" return to item screen
check inspection conditions logic for pass, fail with P, N and X, with A, F or 5
at bottom under "Finished?" enter N, should stay on screen
at bottom under "Finished?" enter Y, go to next check
enter N or X for each item, see if correct inspection conditions are shown
select no condition, display prompt to return to item, enter "N" stay on prompt
select no condition, display prompt to return to item, enter "Y" return to item screen
check inspection conditions logic for pass, fail with P, N and X, with A, F or 6
at bottom under "Finished?" enter N, should stay on screen
at bottom under "Finished?" enter Y, go to next check
enter N or X for each item, see if correct inspection conditions are shown
select no condition, display prompt to return to item, enter "N" stay on prompt
select no condition, display prompt to return to item, enter "Y" return to item screen
check inspection conditions logic for pass, fail with P, N and X, with A, F or 7
at bottom under "Finished?" enter N, should stay on screen
at bottom under "Finished?" enter Y, go to next check
enter N or X for each item, see if correct inspection conditions are shown
select no condition, display prompt to return to item, enter "N" stay on prompt
select no condition, display prompt to return to item, enter "Y" return to item screen
check inspection conditions logic for pass, fail with P, N and X, with A, F or 8
at bottom under "Finished?" enter N, should stay on screen
at bottom under "Finished?" enter Y, go to next check
enter N or X for each item, see if correct inspection conditions are shown
select no condition, display prompt to return to item, enter "N" stay on prompt
select no condition, display prompt to return to item, enter "Y" return to item screen
check inspection conditions logic for pass, fail with P, N and X, with A, F or 9
at bottom under "Finished?" enter N, should stay on screen
at bottom under "Finished?" enter Y, go to next check
enter N or X for each item, see if correct inspection conditions are shown | Miscellaneous items result
select no condition, display prompt to return to item, enter "N" stay on prompt | Miscellaneous comment
select no condition, display prompt to return to item, enter "Y" return to item screen | Miscellaneous comment result
check inspection conditions logic for pass, fail with P, N and X, with A, F or 10
at bottom under "Finished?" enter N, should stay on screen
at bottom under "Finished?" enter Y, go to next check
Exhaust system result
displays overall pass/fail result, only P or F
originally equipped? no, exit
originally equipped? yes, prompt to check
cap accessible? No, prompt to fail; answer no to fail, yes to question
can cap be removed? No, prompt to fail; answer no to fail, yes to question
does the gas cap fit an adapter? No, passes
does the gas cap fit an adapter? Yes, proceed with test
test cap, fail, prompt to replace; say no, prompt will fail; answer replace
prompt to replace and repeat test, pass
prompt "More than one fuel tank?"
to be tested at a later date
if retest and passed emissions test previously, display question if to retest (Y/N)
if previously received an exhaust waiver from a CIF, ask if emissions test retest (Y/N)
prior to start up date (DEC 10) for ASM qualified vehicles, ask if want idle test
ask if 4WD or AWD cannot be disengaged, if yes prompt for reason
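The gas cap decision sequence above can be sketched as a small decision function. This is an illustrative Python sketch under assumptions stated in the comments, not the certified analyzer logic; all names are hypothetical.

```python
# Illustrative sketch of the gas cap test flow in the checklist
# (hypothetical names; assumes declining a prompt fails the test,
# per the "answer no to fail" items above).

def gas_cap_outcome(equipped, accessible, removable, fits_adapter,
                    first_result, replaced, retest_result=None):
    if not equipped:
        return "N/A"   # not originally equipped: no gas cap test
    if not accessible or not removable:
        return "F"     # refusing the accessibility/removal prompt fails
    if not fits_adapter:
        return "P"     # no adapter fits: cap passes by default
    if first_result == "P":
        return "P"
    # first test failed: inspector may replace the cap and retest once
    if replaced:
        return retest_result
    return "F"
```

This mirrors the scenario matrix later in this appendix, where a failing cap that is replaced and retested records a passing overall result.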
Overall safety result
Catalytic Converter Inspection Result
Adapter available
Replaced during test
Overall gas cap test results
116
117
118
119
106
57
58
Pressure test result
ask if motorist wants to claim low mileage, yes
ask mileage in last two years
mileage greater than 10,000, prompt that they are ineligible
Reason for non-dyn testable
Dyno Testable
Low mileage exemption
30
31
2/27/2002
INSPECTION MENU
Page 4
-------
INSPECTION ITEMS
Pass CERTIFICATION TESTING REQUIRED
Field Name (EIS DAT)
pg 63 3 3 B 14 E 1 Zero Air
pg 63 3 3 B 14 E 1 Ambient
pg 63 3 3 B 14 E 2 c
pg 63 3 3 B 14 E 2 HC Hang-up Check
pg 76 3 3 B 14 G Request RPM Pickup Type
pg 8 3 3 B 14 H Emissions Test (ASM, 2500 or Idle)
pg 76 3 3 B 14 I
pg 76 3 3 B 14 1
B
should request OBDII if 1996+
pg 35 3 3 B 3
pg 63 3 3 B 15
Aborts
Visible Smoke Check
aborts not allowed after emissions test
anytime during test
rev engine for three seconds and look for smoke
pg 64 3 3 B 16 A Overall Emissions Test Result
pg 64 3 3 B 16 B Overall Safety Test Result
pg 64 3 3 B 16 C Overall Test Result
pg 85 3 3 B 17 Print VIR
pg 126 5 1 (72) F Sticker Number (Passes Only)
pg 152 Appendix E Calculate TIN
pg 79 3 3 B 16 Print VIR
pg 115 5 1
no zero air pressure, fail
zero air not "zero"
flow clean air, pass
ambient air fails (> 15 ppm HC, 0.02% CO or 25 ppm NO)
ambient air passes
heat temperature sensor, warn ambient temperature out of range, check sensor
Relative Humidity
Humidity correction factor
Ambient Temperature
Barometric Pressure
46
49
47
48
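The ambient-air and HC hang-up pass criteria above use the thresholds stated in the checklist (ambient fails above 15 ppm HC, 0.02% CO, or 25 ppm NO; hang-up fails when probe ambient exceeds 7 ppm HC). A minimal sketch, with hypothetical function names:

```python
# Sketch of the pre-test air checks; thresholds are from the checklist,
# function and parameter names are illustrative.

def ambient_air_ok(hc_ppm, co_pct, no_ppm):
    # fails at > 15 ppm HC, > 0.02% CO, or > 25 ppm NO
    return hc_ppm <= 15 and co_pct <= 0.02 and no_ppm <= 25

def hc_hangup_ok(probe_ambient_hc_ppm):
    # fails when probe ambient exceeds 7 ppm HC
    return probe_ambient_hc_ppm <= 7
```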
HC hang-up check fails (probe ambient > 7 ppm HC)
HC hang-up check passes
contact/non-contact/OBDII
RPM unstable, request change method of measurement
RPM unstable three times, allow bypass
RPM stable, okay
RPM bypass
122
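The RPM pickup retry logic above (an unstable reading prompts a change of measurement method; after three unstable attempts the inspector may bypass RPM) can be sketched as follows. The function name and return strings are illustrative, not from the specification.

```python
# Sketch of the RPM pickup retry/bypass rule in the checklist
# (hypothetical name and return values).

def rpm_prompt(unstable_attempts):
    if unstable_attempts == 0:
        return "proceed"        # stable reading, continue the test
    if unstable_attempts < 3:
        return "change method"  # ask inspector to switch pickup method
    return "allow bypass"       # three unstable attempts: bypass allowed
```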
ASM test scripts
if ASM 5015, was tire drying used?
2500 RPM test scripts
idle test scripts
Emissions test type (performed)
Tire drying
Exhaust emission result
40
43
102
attempt to abort, abort not allowed
yes or no if smoked for more than 3 seconds | Visible smoke test result
if no, prompt to increase RPM to above idle for three seconds, prompt if smoked yes or no
103
Overall Emission Result
Overall Safety Result (also given above)
Overall test result
108
119
120
request enter sticker number if result is pass, otherwise do not display
request enter test fee
Sticker number
Test fee
124
123
set flag in SYSTEM DAT to yes, prompt to confirm, change to 2 months invalid
change to 27 months invalid
change to 14 months accept
change flag in SYSTEM DAT to no, prompt to confirm date
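The sticker-date override checks above reject changes to 2 and 27 months out and accept 14 months. A sketch follows; note that the accepted window bounds are an ASSUMPTION (the checklist only pins down those three cases), and the function name is illustrative.

```python
# Sketch of the sticker-date override validity check. The 3-24 month
# window is an ASSUMED bound consistent with the checklist's cases
# (2 months invalid, 14 months valid, 27 months invalid); the actual
# window is defined in the program specification.

def override_months_ok(months_out, lo=3, hi=24):
    return lo <= months_out <= hi
```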
Certificate number
Sticker date Calculated
Sticker date Actual
Sticker date overridden?
125
154
155
126
use formula and example code given in spec (1226 1952)
TIN
44
printer failure (remove cable), correct | Test end time
printer failure unable to correct (prompt to retry to print), prompt to consumer about test
check VIRs for correct information
39
2/27/2002
INSPECTION MENU
Page 5
-------
ASM/2500 RPM/IDLE TEST SCENARIOS
Test Type
Options
1
2
3
4
5
6
7
8
9
10
11
12
13
14
Retest
l/R
X
X
X
X
Waiver return
Y/N
Model year
pre 71, 71-74, 75-80, 81+
pre 68
pre 68
68-70
68-70
71-74
71-74
71-74
71-74
75-80
75-80
75-80
75-80
81 +
81 +
Low mileage exemption
Y/N
Over 8500 GVWR
Y/N
X
X
AWD/4WD non-disengagable
Y/N
Fuel type
GASO, GCNG, PROP
X
X
84 BMW. auto
Y/N
92 Peugeot 505 auto
Y/N
83 Volvo 740, auto
Y/N
ASM
X
?
2500
idle
X
X
X
X
X
?
X
X
?
X
Gas cap
X
?
X
X
X
?
X
X
X
1
Pressure
X
?
Cat inspection
X
?
X
X
X
7
Safety
X
X
X
?
X
?
X
X
X
?
X
X
X
?
Vehicle Identification
VIN entry
099
100
101
101
102
102
103
104
105
105
106A
107
108
108
Jurisdiction
NJ. other
NJ
NJ
NJ
NJ
Guam
Guam
NJ
NJ
NJ
NJ
NJ
CA
NJ
NJ
Plate number
TST1
TST2
TST3
TST3
TST4
TST4
TST5
TST6
TST7
TST7
TST8
TST9
TST10
TST10
Test type (Initial or reinspection)
1, R
I
I
I
R
1
R
I
I
I
R
I
1
I
R
Pink card? Enter date (use as IAD)
MM/YYYY
24-May
4-Jun
3-Jun
Vehicle type
P, T. H
P
T
P
P
P
T
H
P
P
P
H
T
T
T
GVWR
1 (<6K), 2, 3 (>8.5K)
1
2
1
1
2
2
3
1
1
1
3
1
1
1
Cert class
Tier I. LEV,
Model year
55
65
70
70
72
72
73
74
80
80
75
79
81
81
Make
CHEV
FORD
DODG
DODG
FORD
TOY
INTER
FORD
TOY
TOY
HUNK
CHEV
JEEP
JEEP
Model
BEL AIR
F150
CHALL
CHALL
MUST
CORN
BUBBA
TORIN
CRESS
CRESS
FT BOY
PRT VAf
WAGON
WAGON
Body style
1 -6
1
4
1
1
1
4
N/A
2
2
2
N/A
6
4
4
Cylinders
1 -16, R
8
8
8
8
6
4
8
8
6
6
8
8
6
6
Engine size
99.9 L
4.7
5.3
5.9
5.9
3
1.6
6.5
7.5
2.6
2.6
8
5
4.2
4.2
Transmission type
A, M
M
A
A
A
M
M
M
A
A
A
A
M
M
M
Test weight
4000
4000
Fuel code
GASO, GCNG, PROP
GASO
GASO
GASO
GASO
GASO
GASO
GASO
PROP
GASO
GASO
GASO
METH
GASO
GASO
Dual exhaust
Y, N
N
N
N
N
Y
Y
N
N
Y
Y
N
N
N
N
Dyno testable
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Y
Y
Reason for non-dyn testable
1, 2, 3
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Claimed low mileage exemption, estimated mileage
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Current odometer
25000
33333
44444
55555
66666
77777
88888
99999
86666
84444
56666
54444
92222
93333
GVWR
4500
6600
4250
4250
3500
6500
9500
5500
5000
5000
12000
5500
4500
4500
2/27/2002
TEST SCENARIOS
Page 6
-------
ASM/2500 RPM/IDLE TEST SCENARIOS
Test Type
Options
15
16
17
18
19
20
21
22
23
24
25
26
27
28
Retest
l/R
Waiver return
Y/N
X
Model year
pre 71, 71-74, 75-80, 81+
81 +
81 +
81 +
81 +
81 +
81 +
81 +
81 +
81 +
81 +
81 +
81 +
81 +
81 +
Low mileage exemption
Y/N
X
Over 8500 GVWR
Y/N
X
AWD/4WD non-disengagable
Y/N
X
Fuel type
GASO, GCNG. PROP
X
84 BMW, auto
Y/N
X
92 Peugeot 505, auto
Y/N
X
83 Volvo 740. auto
Y/N
X
ASM
X
X
X
X
X
X
2500
X
X
idle
X
X
X
X
Gas cap
?
X
X
X
X
X
X
X
X
X
X
X
X
X
Pressure
?
X
X
X
X
X
X
X
X
X
X
X
X
Cat inspection
?
X
X
X
X
X
X
X
X
X
X
X
X
X
Safety
?
X
X
X
X
X
X
X
X
X
X
X
X
X
Vehicle Identification
These three, choose idle
VIN entry
109
110
111
112
113
114
115
116
117
118
119
120
121
122
Jurisdiction
NJ, other
NJ
NJ
NJ
NJ
NJ
Oregon
NJ
NJ
other
NJ
NJ
NJ
NJ
NJ
Sticker expiration
#NAME?
#NAME?
########
#NAME?
#NAME?
########
#NAME?
#NAME?
#NAME?
########
#NAME?
#NAME?
Plate number
TST11
TST12
TST13
TST14
TST15
TST16
TST17
TST18
TST19
TST20
TST21
TST22
TST23
TST24
Test type i Initial or reinspection)
1. R
I
1
I
I
I
1
I
I
I
I
1
1
I
I
Pink card? Enter date (use as IAD)
MM/YYYY
N
25-May
5/1/1999
16-Apr
2-Jun
16-May
10-May
Vehicle type
P. T, H
P
P
H
T
T
P
P
P
P
P
P
P
T
T
GVWR
1 (<6K), 2, 3 (>8.5K)
1
1
3
2
1
1
1
1
1
1
1
1
2
2
Cert class
Tier I, LEV.
Model year
86
95
89
91
99
87
92
84
91
93
85
1
97
87
Make
AMC
GEO
HEAVY
FORD
DODG
BMW
Peugeot
Volvo
MAZ
BMW
CHEV
FORD
FORI)
CHEV
Model
ALLIAN
METRO
TRUCK
RANG
PICKUP
3 SER
505
740
RX-7
7 SER
BEAST
A/NDSTR
E25C
R20 P/U
Body style
1 -6
1
1
N/A
3
4
1
2
1
1
1
2
5
6
3
Cylinders
1 -16, R
4
3
8
6
8
6
4
4
3R
8
8
6
8
8
Engine size
99.9 L
1.4
1
6.6
3
6
2.7
1.9
2.3
1.3
4
10
3.8
8
5.7
Transmission type
A, M
A
A
M
A
A
A
A
A
A
A
M
A
A
M
Test weight
2000
3625
4125
2625
3125
3125
3250
4250
4000
4125
7000
4500
Fuel code
GASO. GCNG, PROP
GASO
GASO
GASO
GASO
DIES
GASO
GASO
GASO
GASO
GASO
GASO
GASO
GASO
GASO
Dual exhaust
Y, N
Y
Y
Y
N
Y
Y
N
N
Y
N
N
N
N
N
Dyno testable
Y, N
N/A
N/A
N/A
N
N/A
Y
Y
Y
Y
Y
Y
Y
Y
Y
Reason for non-dyn testable
1. 2, 3
N/A
N/A
N/A
1
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Claimed low mileage exemption, estimated mileage
N/A
Y
N/A
N/A
N/A
N
N
N
N/A
N/A
N/A
N/A
N/A
N/A
Current odometer
94444
95555
96666
97777
98888
99999
88888
77777
66666
55555
44444
33333
22222
100000
GVWR
300
2850
20000
5200
5800
3500
5300
5000
3750
3750
6000
5500
7000
6500
2/27/2002
TEST SCENARIOS
Page 7
-------
ASM/2500 RPM/IDLE TEST SCENARIOS
Test Type
Options
1
2
3
4
5
6
7
8
9
10
11
12
13
14
Vehicle Inspection Scenario Information
6/6/1999
VID contact?
Y, N
Y
Y
Y
Y
N
Y
Y
N
Y
N
Y
Y
N
N
RID (outgoing)
########
7/1/1997
########
6/6/1999
6/6/1999
6/3/1999
6/6/1999
6/6/1999
6/6/1999
6/6/1999
8/1/1997
6/3/1999
6/6/1999
6/6/1999
IED (incoming)
MM/DD/YYYY
#NAME?
#NAME?
#NAME?
#NAME?
blank
6/3/1999
#NAME?
blank
#NAME?
blank
#NAME?
6/3/1999
blank
blank
ICD (incoming)
MM/DD/YYYY
#NAME?
#NAME?
#NAME?
#NAME?
blank
#NAME?
blank
#NAME?
blank
#NAME?
6/3/1999
blank
blank
ICD (outgoing)
#NAME?
#NAME?
#NAME?
#NAME?
7/6/1999
########
########
#NAME?
#NAME?
#NAME?
#NAME?
7/6/1999
#NAME?
#NAME?
ICD flag (coming in from VID)
Y
T
Y
N
Y
N
Y
Y
Y
N
Y
F
Y
N
ICD flag (going out)
N, Y. T, R, F
N
N
N
Y
N
N
Y
N
N
Y
N
N
Y
Y
IAD (incoming)
6/7/1999
blank
7/3/1999
blank
blank
7/3/1999
blank
blank
IAD (outgoing)
6/7/1999
blank
7/3/1999
blank
blank
7/3/1999
blank
blank
Credentials Check
Driver license
P. F, A
P
P
P
Z
P
Z
P
P
P
Z
P
P
P
Z
Registration
P, F, A, 4
P
P
P
Z
F
F
P
P
P
Z
P
P
P
Z
Insurance documents
P, F. A, 4
P
P
P
P
F
P
P
P
P
z
P
P
P
z
Plates
P. F, A, 4
P
P
P
z
P
Z
P
P
P
z
P
P
P
z
Other credentials
P, F, A, 4
P
P
F
z
P
Z
P
P
P
z
P
P
P
z
Odometer
P. F, A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Overall
P, F
P
P
F
p
F
F
P
P
P
z
P
P
P
z
Steering/Suspension Check
Steering/suspension
P, F, A, 4
P
P
P
z
P
P
P
F
P
z
P
P
P
z
Overall
P, F
P
P
P
z
P
P
P
F
P
z
P
P
P
z
Safety Equipment Check
Glass
P, F, A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Horn
P, F. A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Wipers
P. F. A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Mirrors
P, F, A. 4
P
P
F
p
F
p
P
P
P
z
P
P
P
z
Tires/wheels
P. F, A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Visual obstruction
P. F, A. 4
P
P
P
z
P
z
P
P
P
z
F
P
P
z
Overall
P. F
P
P
F
p
F
p
P
P
P
z
F
P
P
z
Lights Check
Headlights
P. F, A. 4
P
P
P
z
P
z
P
F
P
z
P
P
P
z
Rear lights
P, F, A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Turn/warning lights
P, F. A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Plate lights
P. F. A, 4
P
F
P
z
P
z
P
P
P
z
P
P
P
z
Other lights
P. F. A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Marker/clear/reflector
P. F. A, 4
P
P
P
z
P
z
P
P
P
z
P
P
P
z
Wiring/switches
P, F. A, 4
P
P
P
z
F
p
P
P
P
z
P
P
P
z
Overall
P. F
P
F
P
z
P
z
P
F
P
z
P
P
P
z
2/27/2002
TEST SCENARIOS
Page 8
-------
ASM/2500 RPM/IDLE TEST SCENARIOS
Test Type
Options
15
16
17
18
19
20
21
22
23
24
25
26
27
28
Vehicle Inspection Scenario Information
6/6/1999
VID contact?
Y, N
Y
Y
Y
Y
Y
Y
Y
N
Y
Y
Y
N
Y
Y
RID (outgoing)
5/1/1997
6/6/1999
########
6/6/1999
6/6/1999
6/6/1999
6/6/1999
11/5/1995
6/9/1997
########
6/6/1999
6/6/1999
########
IED (incoming)
MM/DD/YYYY
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
########
########
blank
6/2/1999
#NAME?
#NAME?
blank
#NAME?
#NAME?
IED (outgoing)
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
6/2/1999
#NAME?
#NAME?
########
#NAME?
#NAME?
ICD (incoming)
MM/DD/YYYY
#NAME?
#NAME?
#NAME?
#NAME?
########
########
blank
6/2/1999
#NAME?
#NAME?
blank
#NAME?
#NAME?
ICD (outgoing)
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
#NAME?
7/6/1999
#NAME?
########
#NAME?
#NAME?
6/20/1999
ICD flag (coming in from VID)
Y
Y
Y
T
T
F
Y
Y
F
Y
T
Y
Y
T
ICD flag (going out)
N, Y, T, R, F
N
Y
N
N
Y
Y
Y
N
N
N
N
N
Y
N
IAD (incoming)
6/8/1999
########
########
blank
7/2/1999
########
blank
5/24/1999
IAD (outgoing)
6/8/1999
########
########
blank
7/2/1999
########
blank
5/24/1999
Credentials Check
Driver license
P, F, A
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Registration
P. F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Insurance documents
P, F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Plates
P, F, A. 4
P
P
P
P
p
P
P
P
P
P
F
P
P
P
Other credentials
P, F, A, 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Odometer
P, F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
F
Overall
P. F
P
P
P
P
p
P
P
P
P
P
F
P
P
F
Steering/Suspension Check
Steering/suspension
P. F, A, 4
P
P
P
P
p
P
P
A
P
P
P
P
P
P
Overall
P. F
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Safety Equipment Check
Glass
P, F, A, 4
P
P
F
P
p
P
P
P
P
P
P
P
P
P
Horn
P. F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Wipers
P, F, A. 4
P
P
P
P
p
P
P
P
P
P
F
P
P
P
Mirrors
P, F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Tires/wheels
P, F, A, 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Visual obstruction
P, F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Overall
P. F
P
P
F
P
p
P
P
P
P
P
F
P
P
P
Lights Check
Headlights
P. F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Rear lights
P. F, A, 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Turn/warning lights
P. F, A, 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Plate lights
P. F, A, 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Other lights
P. F, A, 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Marker/clear/reflector
P. F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Wiring/switches
P, F, A. 4
P
P
P
P
p
P
P
P
P
P
P
P
P
P
Overall
P, F
P
P
P
P
p
P
P
P
P
P
P
P
P
P
2/27/2002
TEST SCENARIOS
Page 9
-------
ASM/2500 RPM/IDLE TEST SCENARIOS
Test Type
Options
1
2
3
4
5
6
7
8
9
10
11
12
13
14
Brakes Check
Service brake
P. F, A, 4
P
P
4
P
P
z
P
P
P
Z
P
P
P
Z
Parking brake
P. F, A. 4
P
P
F
P
P
z
P
P
P
Z
P
P
P
z
Brake equalization
P. F, A, 4
P
P
F
P
P
z
P
P
P
z
P
P
P
z
Overall
P. F
P
P
F
P
P
z
P
P
P
z
P
P
P
z
Exhaust Check
Exhaust system
P, F. A. 4
P
P
P
Z
P
z
P
P
P
z
P
P
4
p
Overall
P. F
P
P
P
Z
P
z
P
P
P
z
P
P
4
p
Miscellaneous Check
Miscellaneous items result
P. F, A, 4
P
P
P
z
F
P
P
P
P
z
P
P
P
z
Miscellaneous comment result
P, F, A, 4
P
P
P
z
4
p
P
P
P
z
P
P
P
z
Overall safety inspection result
P, F, Z. N
P
F
4
p
4
F
P
F
P
z
F
P
4
p
Tampering Check
Catalytic converter
P, F, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
P
p
P
P
P
p
Gas Cap Test
Accessible
Y.N
N/A
N/A
N/A
N/A
Y
Y
Y
N/A
Y
Y
Y
N
Y
Y
Can it be removed
Y. N
N/A
N/A
N/A
N/A
Y
Y
Y
N/A
Y
Y
Y
N/A
Y
Y
Does the cap fit the adapter?
Y, N - mark on VIR
N/A
N/A
N/A
N/A
Y
Y
N
N/A
Y
Y
Y
N/A
Y
Y
First test result
P, F - Ask if replace?
N/A
N/A
N/A
N/A
F
P
N/A
N/A
P
P
P
F
P
P
Replace cap
Y, N - Fail
N/A
N/A
N/A
N/A
N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Retest
P. F
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Overall result
P. F
N/A
N/A
N/A
N/A
F
P
N/A
N/A
P
P
P
F
P
P
Pressure Test
Pressure test result
P,F
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
P
P
Test Preparation
4WD/AWD non-disengagable
Y.N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N
Reason for non-dyn testable
1. 2, 3
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Analyzer zero
P. F
P
P
P
P
P
P
P
P
P
P
P
P
P
P
HC hang-up check
P. F
P
P
P
P
P
P
P
P
P
P
P
P
P
P
RPM pickup device
Cont, non-cont. OBDII
CONT
CONT
NON
CONT
CONT
CONT
CONT
CONT
CONT
CONT
CONT
CONT
CONT
CONT
Unstable RPM, bypass
Y. N
Y
N
N
N
N
N
P
N
N
N
N
N
Y
N
2/27/2002
TEST SCENARIOS
Page 10
-------
ASM/2500 RPM/IDLE TEST SCENARIOS
Test Type
Options
15
16
17
18
19
20
21
22
23
24
25
26
27
28
Brakes Check
Service brake
P, F, A, 4
P
P
P
P
P
P
P
P
P
P
P
P
P
P
Parking brake
P, F. A, 4
P
P
P
P
P
P
P
P
P
F
P
P
P
P
Brake equalization
P. F, A, 4
F
P
P
P
P
P
P
P
P
P
P
P
P
P
Overall
P, F
F
P
P
P
P
P
P
P
P
F
P
P
P
P
Exhaust Check
Exhaust system
P, F. A. 4
P
P
P
P
P
P
P
P
P
P
P
P
P
P
Overall
P, F
P
P
P
P
P
P
P
P
P
P
P
P
P
P
Miscellaneous Check
Miscellaneous items result
P, F. A. 4
P
P
P
P
P
P
P
P
P
P
P
P
P
P
Miscellaneous comment result
P, F. A. 4
P
P
P
P
P
P
P
P
P
P
P
P
P
P
Overall safety inspection result
P, F. Z, N
F
P
F
P
P
P
P
P
P
F
F
P
P
F
Tampering Check
Catalytic converter
P, F. N
P
P
P
P
N/A
P
P
F
P
P
P
P
P
P
Gas Cap Test
Accessible
Y, N
Y
Y
Y
Y
N/A
Y
Y
Y
Y
Y
Y
Y
Y
Y
Can it be removed
Y, N
N
Y
Y
Y
N/A
Y
Y
Y
Y
Y
Y
Y
Y
Y
Does the cap fit the adapter?
Y, N - mark on VIR
N/A
Y
Y
Y
N/A
Y
Y
Y
Y
Y
Y
Y
Y
Y
First test result
P, F - Ask if replace?
F
F
F
P
N/A
F
P
P
F
F
P
F
F
P
Replace cap
Y, N - Fail
N/A
Y
Y
N/A
N/A
Y
N/A
N/A
Y
Y
N/A
Y
Y
N/A
Retest
P. F
N/A
P
P
N/A
N/A
P
N/A
N/A
P
P
N/A
P
P
N/A
Overall result
P. F
F
P
P
P
N/A
P
P
P
P
P
P
P
P
P
Pressure Test
Pressure test result
P,F
P
P
N/A
P
P
P
P
P
P
P
P
F
P
P
Test Preparation
4WD/AWD non-disengagable
Y. N
N/A
N/A
N/A
Y
N/A
N/A
N/A
N/A
N
N
N
N
N
N
Reason for non-dyn testable
1, 2, 3
N/A
N/A
N/A
2
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Analyzer zero
P. F
P
P
P
P
P
P
P
P
P
P
P
P
P
P
HC hang-up check
P. F
P
P
P
P
P
P
P
P
P
P
P
P
P
P
RPM pickup device
Cont, non-cont, OBDII
CONT
CONT
CONT
CONT
N/A
CONT
CONT
CONT
NON
CONT
CONT
CONT
OBDII
CONT
Unstable RPM, bypass
Y. N
N
N
N
N
N
N
N
N
N
N
N
N
N
N
2/27/2002
TEST SCENARIOS
Page 11
-------
ASM/2500 RPM/IDLE TEST SCENARIOS
Test Type
Options
1
2
3
4
5
6
7
8
9
10
11
12
13
14
ASM Test
Test weight source
VID (V), VRT (M), Def (D)
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
M
M
Front wheel drive
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N
Tire drying
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Y
Y
More tire drying
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N
During test
Speed violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N
Acceleration violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Y
N
Dilution violations
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N
RPM violations
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N
Dyno loading violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N
Emissions result
P. F
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
P
Restarted test
Speed violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
Acceleration violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Y
N/A
Dilution violations
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
RPM violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
Dyno loading violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
Emissions result
P. F
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
Restarted lest
Speed violations
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
Acceleration violations
Y, N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
Dilution violations
Y.N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
RPM violations
Y. N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
Dyno loading violations
Y.N
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N
N/A
Emissions result
P. F
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
N/A
F
N/A
-------
ASM/2500 RPM/IDLE TEST SCENARIOS

Test Type | Options | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28
ASM Test
Test weight source | VID (V), VRT (M), Def (D) | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | V | V | V | M | V | V
Front wheel drive | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N | N | N | N | Y
Tire drying | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N | N | Y | N | N
More tire drying | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N | N | Y | N | N
During test
Speed violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | Y | N | N | N | N | N
Acceleration violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N | N | N | N | N
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N | N | N | N | N
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N | N | N | N | Y
Dyno loading violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N | N | N | N | N
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | P | F | P | P | N/A
Restarted test
Speed violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N
Acceleration violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | Y | N/A | N/A | N/A | N/A | N
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N
Dyno loading violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | P
Restarted test
Speed violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A
Acceleration violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | Y | N/A | N/A | N/A | N/A | N/A
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A
Dyno loading violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | F | N/A | N/A | N/A | N/A | N/A
-------
ASM/2500 RPM/IDLE TEST SCENARIOS

Test Type | Options | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14
2500 RPM Test
During test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Second Chance Test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
-------
ASM/2500 RPM/IDLE TEST SCENARIOS

Test Type | Options | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28
2500 RPM Test
During test
Dilution violations | Y, N | N/A | Y | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | Y | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | Y | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | P | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Second Chance Test
Dilution violations | Y, N | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
-------
ASM/2500 RPM/IDLE TEST SCENARIOS

Test Type | Options | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14
Idle Test
During test
Dilution violations | Y, N | N | N | N | Y | N | N | N | N/A | N | N | N | N/A | N/A | N/A
RPM violations | Y, N | N | N | N | N | N | Y | N | N/A | N | N | N | N/A | N/A | N/A
Low flow violation | Y, N | N | N | N | N | Y | N | N | N/A | N | N | N | N/A | N/A | N/A
Emissions result | P, F | F | P | F | N/A | N/A | N/A | P | N/A | F | P | P | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N | N | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N | N | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N | Y | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | P | N/A | P | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | Y | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Second Chance Test
Dilution violations | Y, N | N | N/A | N | N/A | Y | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N | N/A | N | N/A | N | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N | N/A | N | N/A | N | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | F | N/A | P | N/A | N/A | N/A | N/A | N/A | P | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | P | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Visible smoke test | P, F, Z | F | P | P | P | P | P | P | P | P | P | F | P | P | P
OVERALL RESULTS
Overall safety | P, F, Z, 4 | P | F | 4 | P | 4 | F | P | F | P | Z | F | P | 4 | P
Tampering | P, F, Z, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | P | P | P | P | P | P
Pressure | | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | P | P
Gas cap | P, F, Z, N | N/A | N/A | N/A | N/A | F | P | N/A | N/A | P | P | P | F | P | P
Emissions | P, F, W, Z, N | F | P | P | P | P | P | P | N/A | P | P | P | N/A | F | P
Smoke | P, F, Z | F | P | P | P | P | P | P | P | P | P | F | P | P | P
Overall Emissions | | F | P | P | P | P | P | P | P | P | P | F | F | F | P
Overall | P, F | F | F | F | P | F | F | P | F | F | P | F | F | F | P
Test fee | |
Sticker date | | REJ | REJ | REJ | ######## | REJ | REJ | ######## | REJ | REJ | #NAME? | REJ | REJ | REJ | #NAME?
TIN | |
-------
ASM/2500 RPM/IDLE TEST SCENARIOS

Test Type | Options | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 | 28
Idle Test
During test
Dilution violations | Y, N | N/A | N/A | N | N/A | N/A | N | N | N | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N | N/A | N/A | N | N | N | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N | N/A | N/A | N | N | N | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | P | N/A | N/A | F | F | P | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Second Chance Test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N | N | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | F | P | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Restarted test
Dilution violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
RPM violations | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Low flow violation | Y, N | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Emissions result | P, F | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A | N/A
Visible smoke test | P, F, Z | P | P | P | P | P | P | P | P | P | P | P | P | P | P
OVERALL RESULTS
Overall safety | P, F, Z, 4 | F | P | F | P | P | P | P | P | P | A | F | P | P | F
Tampering | P, F, Z, N | P | P | P | P | N/A | P | P | F | P | P | P | P | P | P
Pressure | | P | P | N/A | P | P | P | P | P | P | P | P | F | P | P
Gas cap | P, F, Z, N | F | P | P | P | N/A | P | P | P | P | P | P | P | P | P
Emissions | P, F, W, Z, N | P | P | P | F | N/A | P | P | P | F | P | F | P | P | P
Smoke | P, F, Z | P | P | P | P | P | P | P | P | P | P | P | P | P | P
Overall Emissions | | F | P | P | F | P | P | P | F | F | P | F | F | P | P
Overall | P, F | F | P | F | F | P | P | P | F | F | F | F | F | P | F
Test fee | |
Sticker date | | REJ | #NAME? | REJ | REJ | #NAME? | #NAME? | #NAME? | REJ | REJ | REJ | REJ | REJ | #NAME? | REJ
TIN | |
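The overall-result rows in these matrices are consistent with a simple combination rule, sketched below. This is inferred from the scenario data, not a stated program requirement; the function name and dictionary layout are illustrative.

```python
# Inferred from the scenario matrices: the overall result is P only when
# every applicable (non-N/A) component result is P. This is an inference
# from the data, not a stated rule.

def overall_result(components: dict) -> str:
    """components maps a component name to 'P', 'F', or 'N/A'."""
    applicable = [r for r in components.values() if r != "N/A"]
    return "P" if applicable and all(r == "P" for r in applicable) else "F"
```

For example, a vehicle with safety P, gas cap N/A, emissions P, and smoke P comes out P overall, while any single component F forces an overall F, matching the Overall rows above.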
-------
Additional Checks to be Performed on Test Systems

Item | Condition | Pass
1 | Cabinet open lockout |
2 | Check for UL label on cabinet |
3 | Does training VIR have VOID on it |
4 | Record search, copy to floppy |
5 | Can not hold down a key and fill in fields |
6 | Calibration procedures for analyzer |
7 | Calibration procedures for gas cap tester |
8 | Calibration procedures for dynamometer |
9 | Leak check fail |
10 | Test technician security |
11 | Print screen disabled during inspection |
12 | No key buffering |
13 | State lockout not cleared by VID download |
14 | Single letter chooses a menu item |
15 | If lockout, display message and return to main menu |
16 | Fan prompt or turns on automatically if above 72 degrees |
17 | Find and print duplicate VIR with "duplicate" and date printed on it |
18 | Displays "testing" while sampling |
19 | Check transaction ID number (TIN) algorithm |
20 | Perform gas audit |
21 | Install new data disk |
22 | Set State lockout |
23 | Clear State lockout |
24 | Update station information |
25 | Bar code reader failure limits |
26 | Inspector time limit for manual entry of ID |
27 | Invalid station license |
28 | Communications ANI rejection |
29 | Data corruption check |
30 | Prints messages when downloaded |
31 | Prints recall when downloaded |
32 | Power down before, during, and after an emissions test |
33 | Restore files from diskette |
34 | Disconnect phone line between first and second VID call |
35 | Attempt to back out of test sequence with edit keys |
36 | "esc" key causes abort after inspector enters ID |
37 | Attempt entry to DOS through optional software |
-------
ASM Test Conditions

1. RPM Setup
   If 1996+, prompt to use OBDII port; request type used be entered
2. Front Wheel Drive?
   Front wheel drive prompt: "Front-wheel drive vehicle. Laterally stabilize, restrain and chock"
   Rear wheel drive prompt: "Rear-wheel drive vehicle. Restrain and chock"
3. Vehicle Alignment and Restraints
   Will not allow test if restraints not in place
4. Cooling Fan Prompt if > 72 deg F
   Can be optionally turned on automatically
5. Sample Probe Hookup
6. Gear Selection
   1) automatic transmissions, engine <= 3.0L, 3000 rpm, suggest use of "drive"
   2) automatic transmissions, engine > 3.0L, 2500 rpm, suggest use of "drive"
   1) manual transmission, engine <= 3.0L, 1500 - 3000 rpm, suggest use of "2nd gear"
   2) manual transmission, engine > 3.0L, 1250 - 2500 rpm, suggest use of "2nd gear"
7. Tire Drying (y/n)
   more drying? (y/n)
   1) drying at < 30 mph, increment clock 1X/sec
   2) drying at > 30 mph or 3000 rpm, increment clock 2X/sec
   3) maximum delay = 30 seconds
8. Pretest Conditions (delay start of test)
   1) zero not performed, ambient air check not complete, HC hang-up failed/incomplete
   2) low flow
   3) idle not between 400-1250 RPM
   4) dyno rolls turning (speed > 1 mph)
   5) delay start twice time of high idle or rolls turning, maximum delay = 30 seconds
   6) count down delay unless rollers < 1 mph or RPM < 1250; increase at 1 second per second turning
   7) again count down delay
9. ASM Test Mode
   1) speed, RPM, dyno loading and dilution not within spec within 25 seconds, restart
   2) roll speed, RPM, dyno loading and dilution within spec for 5 seconds, start test timer (if > 10 and < 25, subtract difference from mode timer)
   3) display test speed with limits, test time, engine RPM
   4) at 25 seconds start pass/fail determination
   5) during emissions averaging portion of the test, monitor vehicle acceleration; if > 0.6 mph/sec (or SYSTEM.DAT spec):
      a) display acceleration error prompt
      b) increment acceleration violation counter, go to restart if count = 5 or accumulated acceleration time > 5 seconds
      c) do not use time aligned transient throttle data for emission averaging
      d) delay sampling for 5 seconds after time aligned acceleration violation ends
      e) if less than 10 seconds remain in test after sampling is restarted, use last 10 sec average without acceleration for pass/fail determination
   6) during emissions averaging portion of the test, restart the test if:
      a) dilution for more than 2 seconds
      b) dyno loading out of tolerance 2 consecutive seconds or 5 seconds accumulated
      c) vehicle speed out of tolerance (+/- 1 mph) for 5 seconds at any one time
      d) engine RPM out of tolerance for more than 2 seconds
      e) acceleration violation 5 times or 5 seconds
   7) if average simultaneous readings are all less than or equal to spec and speed has not decreased by 0.5 mph, vehicle passes
   8) if test time expires and pass is not obtained, vehicle fails; log average of last 10 seconds
   9) at end of mode timer, apply augmented braking; warn driver on screen any time augmented braking is applied
10. ASM Restart Procedures
   1) bring rollers to full stop (augmented braking is to be applied)
   2) reset all violations counters to zero
   3) increment number of restarts counter; if > 2, end the test
   4) delay at RPM < 1250, roll speed < 1 mph
   5) delay time = 2X previous running time, maximum delay = 60 seconds
   6) return to ASM test mode
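The ASM restart procedure can be sketched in a few lines. The function names and structure below are illustrative; only the numeric limits (at most 2 restarts, delay of twice the previous running time, 60-second cap) come from the conditions listed above.

```python
# Sketch of the ASM restart procedure (items 1-5 above). Names and
# structure are illustrative; the limits come from the text.

def restart_delay(previous_running_time_s: float) -> float:
    """Item 5: delay = 2X previous running time, maximum 60 seconds."""
    return min(2.0 * previous_running_time_s, 60.0)

def handle_restart(restart_count: int, previous_running_time_s: float):
    """Items 3-5: returns (continue_test, delay_s, new_restart_count)."""
    restart_count += 1
    if restart_count > 2:          # item 3: a third restart ends the test
        return False, 0.0, restart_count
    return True, restart_delay(previous_running_time_s), restart_count
```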
-------
Idle and 2500 RPM Test Conditions

1. Special Test Sequence?
   If so, record in EIS.DAT field 41, provide special prompts
2. RPM Setup / Ignition Type
   If 1996+, prompt to use OBDII port; request type used be entered
3. Sample Probe Hookup
4. Pretest Conditions to Start Testing Period
   1) zero not performed, ambient air check not complete, HC hang-up failed/incomplete
   2) low flow
   3) CO + CO2 must be greater than 6%
   4) RPM meeting limits for test (400-1250 for idle test, 2250-2750 for 2500 RPM test)
5. Emissions Test
   1) vehicle in neutral or park
   2) engine RPM within limits
   3) display test RPM with limits, test time
   4) at 10 seconds start pass/fail determination, 5 second average
   5) maximum elapsed time is 30 seconds for mode
6. Restart Conditions
   1) RPM outside limits for test (400-1250 for idle test, 2250-2750 for 2500 RPM test) for 2 or 10 seconds
   2) low flow
   3) CO + CO2 less than 6%
7. Second Chance Test
   1) 2500 RPM test - idle for 30 seconds, retest
   2) idle test - rev up to 2200-2650 rpm for 30 seconds; if less than 2200 rpm for more than 2 seconds, warn; after revving, retest at idle
   DO NOT rev special test types 3 or 5 (with ZF transmissions)
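The RPM and dilution limits above can be sketched as two small checks; the limits are taken directly from the conditions listed, while the function names are illustrative.

```python
# Sketch of the idle / 2500 rpm pretest and restart limits above.
# The numeric limits come from the text; names are illustrative.

RPM_LIMITS = {"idle": (400, 1250), "2500": (2250, 2750)}

def rpm_ok(test_type: str, rpm: float) -> bool:
    """RPM must be within the limits for the test type."""
    lo, hi = RPM_LIMITS[test_type]
    return lo <= rpm <= hi

def dilution_ok(co_pct: float, co2_pct: float) -> bool:
    """CO + CO2 must be greater than 6%, otherwise the sample is diluted."""
    return co_pct + co2_pct > 6.0
```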
-------
Acceptance Testing Problem Tracking Log

Item # | Date Found | Level (1, 2, 3) | Problem Description | Notes | Fixed OK | Date Fixed
1 | | | | | |
2 | | | | | |
3 | | | | | |
4 | | | | | |
5 | | | | | |
6 | | | | | |
7 | | | | | |
8 | | | | | |
9 | | | | | |
10 | | | | | |
11 | | | | | |
12 | | | | | |
13 | | | | | |
-------
Beta Test Station Problem Tracking Sheet

Facility Name:
Facility Phone Number:
Facility Address:
Software Version #:
Facility Number:
Page #:

Please make descriptions as detailed as possible. Use more than one set of blocks if necessary.

Date / Time | Problem / Issue / Suggestion | Exact Description of Where Problem Occurred in Software | VIN (attach VIR if possible)
-------
APPENDIX B. TRAINING COURSE PRESENTATION
B-1
-------
US EPA I/M TEST SYSTEM
AUDIT TRAINING
Michael St. Denis
Sierra Research, Inc.
Outline
Overview of acceptance testing
Overview of calibrations
Field auditing issues
Auditing equipment and costs
Audit vehicles
Record keeping
Retesting of failed test systems
Component training
Hands-on training
-------
Outline - continued
Components to be audited
Visual inspection
Weather station
Gas benches (NDIR and fuel cell)
Tachometer
Dynamometer
Gas cap tester
Pressure tester
OBDII scan tool
VMAS unit
Outline - Continued
Component Audit Training
For each test system audit, the following
will be covered:
- Component theory of operation
- Required audit equipment
- Audit procedure
- Calculations and record keeping
- Audit tolerances
- Common audit failures and equipment issues
-------
Total QA/QC Program
The auditing is only one component of an
integrated quality assurance/quality control
program aimed at ensuring that the equipment is
properly designed and maintained. This QA/QC
program can be divided into three primary
elements:
- Initial acceptance and beta testing of the equipment
- Ongoing, automatic calibrations and checks
programmed into the equipment
- Equipment audits that are performed on a routine basis
combined with analysis of calibration and audit data to
find trends in performance
Auditing Issues
New hardware or hardware uses have been added
- VMAS
- OBDII scanners
- Transient tests with ASM dynamometers
Previous EPA Work Assignment (2-05) developed audit procedures
for all components
Report provides information about previous failures; there needs to be
more sharing of info between states
Report provides details on required equipment and costs
There is disagreement over some of the audit tolerances
Most audit programs are not meeting EPA requirements, some not
auditing at all; cost is a significant issue
Audit menus are not available on many systems
-------
Acceptance Testing
Quality Control: 40 CFR §51.359
Acceptance testing is important to ensure internal decisions and
calculations that cannot be checked in auditing are functioning properly;
prevents "incurable" audit failures
Hardware generally not tested, BAR97 certification accepted
Some manufacturers have changed hardware on BAR97 certified systems
Beta testing must be conducted, tracked, and the bugs fixed
Appropriate time must be allotted for ATP and beta testing
Special equipment for acceptance testing
- BAR97 high and low gases
- Dynamometer simulators
- RPM and OBDII simulators
- VID connection and VID data management tools
- Possible connection to state motor vehicles department
Acceptance Testing Components
Main menus
Inspection menus
Inspection scenarios (test logic checks)
Test conditions (timers, dilution, restart, RPM/OBDII connection, etc.)
Additional checks (VIRs, lockouts, security, tampering, comm., etc.)
Calculations (emissions, correction factors, sticker dates, etc.)
Calibrations (frequency, lockouts, calculations, etc.)
Data files (proper storage, format, logic)
Audit menus
Lookup table data matching
Gas sample time alignment
Dynamometer parasitics, temperature correction and loading
Humidity correction factor
-------
Calibrations
Differences between high and low volume (4,000 vehicles per year)
requirements, ignored by most states
Gas bench
- Daily leak check
- 3-day two point gas calibration
Dynamometer
- 3-day coast down, if fail
Parasitic determination
Load cell calibration
- Weekly speed checks not conducted
Gas cap tester - Daily
Pressure tester - Daily
VMAS - Weekly hose-off flow check
Issues with calibration discussed in audit issues section
Auditing Objectives
Quality Assurance: 40 CFR §51.363
"An ongoing quality assurance program shall be
implemented to discover, correct and prevent
fraud, waste, and abuse and to determine whether
procedures are being followed, are adequate,
whether equipment is measuring accurately, and
whether other problems might exist which would
impede program performance"
-------
Auditing Requirements
Requirements in §51.363(c) for equipment audits
- Twice a year for each lane or test bay
- (1) + (7) gas audit and check of station gas bottles
- (2) tampering, wear and filters
- (3) + (4) CVS and CFV = VMAS
- (5) FID - not used in ASM equipment
- (6) leak check
- (8) dynamometer tests
- (9) measure background concentrations
- (10)+(11) pressure and purge = gas cap and pressure
tester
Field Auditing Issues
Audit menus
Audit equipment
Interrupting inspections
Lockouts
Retests
Record keeping
-------
Audit Menus
Audit menus not clearly defined or not required in most specifications
Access to all menus should be through state access menu
State access should have ability to lockout the analyzer
Should display
- Weather station data (temperature, humidity, barometric pressure)
- HC (propane), CO, CO2, NO, and O2 concentrations (should have the
ability to zero the analyzer)
- RPM reading (inductive and OBDII)
- Dynamometer speed, load cell reading, coast down, and parasitic
measurement data
- Gas cap tester test function with pass/fail result
- Pressure tester test function with pass/fail result
- OBDII test data (MIL commanded on, readiness status, DTCs)
- VMAS temperature, pressure, flow rate, factory VMAS flow rate and
oxygen concentration (should have ability to calibrate oxygen sensor)
Auditing Equipment and Costs
Depends on which components are audited
Need several of some items
May be alternative methods that are less expensive
Need to be careful to check accuracy of
instruments
Need to ensure instruments are NIST traceable
Need to be able to recalibrate all equipment on a
regular basis to ensure it is accurate
-------
Audit Equipment Costs
a
lem
CcrJi/iani
I
Auaivan
-
2
Gas cy*naซ ractt w AS tea eviroer
S250
7
1% civgtunrwroo^nautagti AQer AStea cytroar
SMO
a
to* oiygtn r\ ndreq*n and* AO<* AS cyfenoer
MX
9
20% oi/gtnfi rwegan uufn AO a AS sifacytroar
S200
10
Zero W AO or AS tSซ Cytndar
ITS
n
SffQto iiaqt ttartais ttM qu ragjaiors CGA 060
W2S
12
siage brass ** ragUatcrt OGA $90
S17S
13
nrMwiyniicnrgvavinllw nnr
s> xc
U
SO loot rci 1M' 0ซ WT CO Tygoniuert;
S5S
IS
1/4 ( 1/4 ฆ 1W"
12
16
Baroed regulator eonnacurt
S2
17
50 rซซi roa ol euoa cuong
SJ5
ia
QaBoom ard ruofear Daras
S2
ie
ftyiatt*iaiTEซrซraandrumaiywaairw Uflor (psycrrorrvlar)
J14C
20
Baron* *f
$1 aoo
21
Wartfc* M9ป fjeJcap
$100
72
Ralarenca *>aT luatcao
(100
23
Vend* crtiivt last ajdl tool
VWI
24
RFMป*ruiปoป
$250
25
06O irfuator
$3 500
29
Lagtoo cornwier
U 000
27
SAO 200 fo 500 SCFM
uooo
28
& Wasl gala vahrt
SIS
29
O/riafTWTWeT latter
120 000
Audit Vehicles
Cargo vans are ideal
Some suggest use of pickups with camper shell to separate
gases from driver
Should be capable of carrying up to 8 gas cylinders in gas
racks
Want to avoid having to reconnect audit gases at each
location, can use split caps
Should have rear and side gas distribution panels
Should have work area
Should have power inverter
Should have vented gas storage area with gas detectors
Partition between driver and equipment
-------
Interior of Audit Van
Audit Van Gas Distribution
-------
Lockouts and Retests
If there is a failure, auditor should be able to lock
out the analyzer
BAR has two sets of audit failure standards:
- Service soon
- Lockout till serviced
Some states have field service reps repeat audit
after service and remove lockout if passed
Some states require state to retest before removing
lockout
Record Keeping / Triggers
Data can be entered onto paper and calculation done by
hand, must be later entered into database
Data can be entered directly into software that performs
calculations, makes pass/fail decisions, must print report
signed by auditor
Copy of report left with station so that field service
representatives will know what to fix if there is a failure
Data transmitted to database for later analysis
Can use calibration data in triggers to target audits, and use
audit data to confirm trigger accuracy and improve triggers
-------
Component Training
1. Visual inspection
2. Weather station
3. Gas benches (NDIR and electrochemical cell)
4. Tachometer
5. Dynamometer
6. Gas cap tester
7. Pressure tester
8. OBDII scan tool
9. VMAS unit
Component Audit Training
For each test system audit, the following
will be covered:
- Component theory of operation
- Required audit equipment
- Audit procedure
- Calculations and record keeping
- Audit tolerances
- Common audit failures and equipment issues
-------
1. Visual Inspection
To determine if the system has been
tampered
Determine if proper preventative
maintenance is being conducted
Ensure proper calibration equipment is
being used and proper values for calibration
gases are entered
Look for signs of overt cheating
Visual Inspection Items to be Checked
All pieces of equipment are properly housed
All pieces of equipment are clean and orderly
There are no signs of tampering
HC filter is clean
Correct calibration gas values are entered
The dynamometer (if used) is connected to the test system and power
Vehicle restraints are present (if dynamometer is used)
Gas cap tester (if used) is connected to the test system
Gas cap standard is not left on tester by shop
The VMAS unit is connected to the test system and power (if used)
Shop does not have an RPM simulator on tachometer pickup
ZAG filters changed at appropriate times
-------
Common Visual Inspection Failures
Not changing the HC filter
Not emptying the water filter bowl
Gas cap tester calibration device connected to tester
Gas cylinder values not updated
Calibration gas cylinders tampered with (such as refilling with shop
air)
Calibration gas cylinder labels not removed
Shop air used to fill zero air cylinder or plumbed into analyzer
(sometimes around zero air generator)
RPM simulator on tachometer pickup
Gas cap tester standard left on tester
Tape covering hole in raw sample hose or signs of kinking the hose to
pass the leak check
2. Weather Station Audit
The weather station reports
- Humidity
- Temperature
- Barometric pressure
Humidity and temperature are used in humidity correction factor for
NOx measurements
Temperature used for lockout if outside allowed operation temperature
window (35 to 110°F)
Barometric pressure reported by analyzer used to adjust emissions
readings
Barometric pressure and temperature in VMAS are used for VMAS
flow corrections to STP
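The humidity correction factor referenced above is, in most I/M programs, the FTP-style NOx correction. The constants below are the usual FTP values and are an assumption here; confirm them against the applicable program specification.

```python
# Hedged sketch: the usual FTP-style NOx humidity correction factor,
# with H the absolute humidity in grains of water per pound of dry air.
# The constants (0.0047, 75 grains reference) are the customary FTP
# values, assumed here rather than taken from this report.

def nox_humidity_cf(h_grains_per_lb: float) -> float:
    return 1.0 / (1.0 - 0.0047 * (h_grains_per_lb - 75.0))
```

At the 75-grain reference humidity the factor is 1.0; more humid air raises it, drier air lowers it.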
-------
Weather Station Audit Equipment
For temperature and humidity use a
psychrometer
Thermometer is to be accurate to less than 1°F
Humidity is to be accurate to less than 1% RH
Works by blowing air over a wet bulb and comparing the dry-bulb
and wet-bulb thermometer readings
For pressure, use a barometer
Accuracy is better than 0.25% of point
Weather Station Audit Tolerances
For temperature
Audit measurement is to be accurate to less than 1°F
Analyzer temperature accuracy is to be 3°F
Audit tolerance is 4°F
For humidity
Audit measurement is to be accurate to less than 1% RH
Analyzer humidity measurement accuracy is to be 3% RH
Audit tolerance is 4% RH
For pressure
Audit measurement is to be accurate to less than 0.25% of point
Analyzer pressure measurement accuracy is to be 3% of point
Audit tolerance is 3% of point
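Applying these tolerances in the field amounts to a simple difference check; the sketch below uses the numbers from this slide (4°F, 4% RH, 3% of point for pressure), with illustrative function names.

```python
# Minimal sketch of a weather station audit check using the tolerances
# above. The dictionary layout and names are illustrative; the numbers
# are from the slide.

TOLERANCES = {"temperature_F": 4.0, "humidity_pct_RH": 4.0}

def reading_ok(audit_value: float, analyzer_value: float, tol: float) -> bool:
    """Absolute-difference check used for temperature and humidity."""
    return abs(analyzer_value - audit_value) <= tol

def pressure_ok(audit_value: float, analyzer_value: float,
                pct_of_point: float = 3.0) -> bool:
    """Percent-of-point check used for barometric pressure."""
    return abs(analyzer_value - audit_value) <= audit_value * pct_of_point / 100.0
```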
-------
Weather Station Audit Issues
Ambient temperature sensors improperly located inside
the equipment cabinet, biasing readings
Weather station data not displayed by some analyzers
Weather stations sensitive to analyzer location
A manufacturer in one program hard coded the
temperature and humidity readings in all its test systems
The weather stations are supplied with coefficients that
are the calibration of the weather station (such as 1 volt
equals 30 degrees), manufacturer field service
representatives change weather stations without
changing calibration coefficients
3. Analyzer Gas Audit
All gas audits should start with a leak check
- Leak check extra important for VMAS systems
Gas audits:
- Four gas
- Five gas
- Five gas with oxygen check for VMAS use
-------
How NDIR Analyzers Function
Transmission Spectroscopy with continuous-flow sampling
[Diagram: light source and light detector on opposite sides of a continuous-flow sample cell, with air inlet]
Analyzer Audit Equipment
Tygon tubing
Bubble tubing (used to adapt the tubing to the analyzer
probe tip)
Tygon hose tee
Balloons
Five or six audit gases, 1% accuracy with 2% blend
tolerance
Single stage gas regulators (brass and stainless steel) with
shut off, barbed output
-------
Analyzer Audit Gases

Gas | O2 (%) | HC (ppm) | CO (%) | CO2 (%) | NO (ppm) | Balance
BAR zero | 20.9 | [illegible] | [illegible] | [illegible] | [illegible] | Bal
EPA zero | 20.7 | [illegible] | [illegible] | [illegible] | [illegible] | Bal
Low | 0 | 200 | 0.5 | 6.0 | 300 | Bal
Low Mid | 0 | 960 | 2.4 | 3.6 | 900 | Bal
High Mid | 0 | 1920 | 4.8 | 7.2 | 1800 | Bal
High | 0 | 3200 | 8.0 | 12.0 | 3000 | Bal
VMAS Oxygen Check | 1.0 | [illegible] | [illegible] | [illegible] | [illegible] | Bal
Analyzer Audit Procedure - Setup
[Diagram: audit gas cylinder and regulator with shut-off valve, tee with balloon near the probe end, and a belled end placed over the EIS sample probe]
Record gas concentrations on audit form
Stainless steel regulators on gases containing NO
Balloon for approximating pressure balanced flow
Balloon near probe end
Audit from low to high concentration
Display bench readings, ensure HC is displayed in ppm propane
-------
Analyzer Audit Procedure
Perform a leak check
- First without a cap to ensure a fail
- Re-check with the cap in place, analyzer should pass
- If the analyzer does not pass, allow shop to repair the analyzer (replace
probe or hose) before continuing the gas audit
Zero analyzer - this may or may not be done automatically when
entering a gas concentration display mode
Connect the belled fitting over the sample probe
Flow zero gas, balloon should barely inflate
Record readings after one minute
Turn off gas and remove sample line, allow analyzer to sample
ambient air until NO reading is below 20 ppm
Repeat procedure with 1% oxygen if using a VMAS, then other gases
Spreadsheet for Calculating ASM Audit Tolerances

Range | Gas | Recommended | Actual Cylinder | Audit Tolerance (%) | Tolerance (as point) | Floor (point) | Applied Tolerance (greater of % or point) | Audit Limit Low | Audit Limit High
High | HC (ppm) | 3200 | 3191 | 4.0% | 128 | 10 | 128 | 3063 | 3319
High | CO (%) | 8.00 | 8.11 | 4.0% | 0.32 | 0.10 | 0.32 | 7.79 | 8.43
High | CO2 (%) | 12.00 | 11.90 | 4.0% | 0.48 | 0.42 | 0.48 | 11.42 | 12.38
High | NO (ppm) | [row truncated in source]
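The arithmetic the spreadsheet performs can be sketched directly: the applied tolerance is the greater of a percent-of-point tolerance and a fixed floor, and the limits bracket the actual cylinder concentration. The function name is illustrative.

```python
# Sketch of the audit-limit calculation the spreadsheet above performs:
# applied tolerance = max(percent of point, fixed floor), and the audit
# limits bracket the actual cylinder concentration.

def audit_limits(cylinder_conc: float, pct_tolerance: float, floor: float):
    """Return (low, high) audit limits around the cylinder concentration."""
    applied = max(cylinder_conc * pct_tolerance / 100.0, floor)
    return cylinder_conc - applied, cylinder_conc + applied
```

For the HC row, audit_limits(3191.0, 4.0, 10.0) reproduces the 3063-3319 ppm limits shown above.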
-------
Analyzer Audit Issues
High accuracy failure rate, especially for HC
Old or incorrect humidity correction factor
Time alignment, even in transient programs, is incorrect
For VMAS systems, original NY data stated sample conditioning
(e.g., a chiller) appears necessary
Gas benches on new test systems have been found to fail
High volume stations: calibration every four hours and monthly
multi-point calibrations
NOx outlet venting changed or mounted exterior to cabinet
Some manufacturers have resorted to doing a "Cal High, Cal Low"
procedure every time a calibration is required
NOx cells response time and life time; "Rev 4" cells now sold, new
analyzers currently being certified
Accessible NOx Cell on ASM Analyzer
19
-------
Two Point Calibration
"Cal High, Check Low"
HC Calibration
[Chart: HC calibration example - analyzer response (0 to 3000 ppm) versus
calibration gas concentration, showing the response curve anchored at the
high calibration point and checked at the low point]
NO Cell / Analyzer Issues
Many NO cells were failing audits with a response time limit of 6.5
seconds; BAR increased the allowable response time limit to a
maximum of 9.5 seconds. This is a significant problem for
programs using the cells to perform transient tests
NO cell lifetime - City Technologies said the cells should last from
3 months to 9 months; many programs are extending their use for up to
two years. Cell cost is approximately $250
Rev "A" cells no longer available, only Rev "4" cells available
BAR wanted City Technologies to certify, but in the absence of that,
asked test system manufacturers to recertify with the Rev 4 cell
New analyzers currently in development or being certified:
- Sensors NDUV
- Horiba NDIR
- Andros NDIR
Costs range from $750 to $1,500
-------
4. RPM Measurement Audit
Check inductive tachometer pickup
Check OBDII RPM reporting ability
"Wand"-type pickups cannot be audited
RPM Inductive Audit Equipment
-------
RPM OBDII Audit Equipment
[Photo: EASE Diagnostics OBDII Verification Tester scan tool, with OFF/ON
and IGN KEY switches]
-------
RPM Measurement Audit Issues
Inductive tool is expensive ($250)
OBDII RPM simulator is expensive ($3,500), may only
be used if OBDII testing is also being conducted
If tachometers fail, they do not impact test results:
they usually do not become inaccurate, they cease to
function, which would prevent tests or allow RPM to be
bypassed
If an OBDII RPM simulator is not available, use a
vehicle with a tachometer and operate the vehicle at idle
and 2500 rpm, recording readings; allowed tolerance is 10%
"Wand"-type RPM pickups cannot easily be audited
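The fallback tachometer comparison reduces to a simple tolerance check, sketched here (the function name is ours):

```python
def rpm_within_tolerance(vehicle_tach_rpm, analyzer_rpm, tolerance_pct=10.0):
    """Compare the analyzer's RPM reading against the vehicle's own
    tachometer; the allowed difference is 10% of the reference reading."""
    return abs(analyzer_rpm - vehicle_tach_rpm) <= tolerance_pct / 100.0 * vehicle_tach_rpm

print(rpm_within_tolerance(2500, 2400))  # True (4% difference)
print(rpm_within_tolerance(750, 900))    # False (20% difference)
```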
5. Dynamometer Audit
Type of audit depends on dynamometer use:
- Steady-state use audit (ASM test)
Only check loaded coast down
Loading accuracy checked at ATP
- Transient use audit (MA31, NYTEST)
Check loaded coast down
Check inertia simulation accuracy during a test
23
-------
Transient Vehicle Loading
The vehicle is supposed to be tested on the dynamometer
at a load simulating driving on-road
This load is Track Road Load Horsepower (TRLHP) plus
inertia force
TRLHP is the sum of frictional forces (aerodynamic drag,
bearing friction, etc.) and is a function of speed
Inertia force comes from accelerating a mass and is therefore a
function of acceleration
TRLHP and inertia weights are provided in the EPA/Sierra
Lookup Table (ESLT)
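To make the load arithmetic concrete, a sketch in Python (the example numbers are ours; in practice the TRLHP value at the current speed comes from the ESLT):

```python
MPH_TO_FPS = 1.46667   # mph to feet per second
G = 32.174             # lbm-ft / lbf-s^2

def inertia_hp(inertia_lbs, speed_mph, accel_mph_per_s):
    """Power needed to accelerate the simulated inertia mass:
    force (lbf) = m * a / g, power (hp) = force * speed / 550."""
    force_lbf = inertia_lbs * accel_mph_per_s * MPH_TO_FPS / G
    return force_lbf * speed_mph * MPH_TO_FPS / 550.0

def total_load_hp(trlhp_at_speed, inertia_lbs, speed_mph, accel_mph_per_s):
    """Total load the dynamometer should apply: TRLHP plus inertia power."""
    return trlhp_at_speed + inertia_hp(inertia_lbs, speed_mph, accel_mph_per_s)

# e.g., a 3875 lb vehicle accelerating at 2 mph/s at 30 mph with 10 TRLHP
print(round(total_load_hp(10.0, 3875, 30, 2.0), 1))  # about 38 hp total
```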
Force Applied by Dynamometer
The force applied by the dynamometer
should be equal to TRLHP + Inertia force
The dynamometer applies force to the
vehicle in three ways:
- The power absorption unit (PAU)
- The resistance in the dynamometer called
"Parasitic Losses" (PL)
- The friction at the tire/roll interface (GTRL)
24
-------
How Dynamometer Forces
are Determined
Power absorber - The dynamometer has a strain
gauge ("load cell") that measures the force applied
Parasitic losses - this is determined by coasting
the dynamometer down and measuring the rate at
which the dynamometer slows
Friction at the tire/roll interface - EPA provides
equations to calculate "Generic" tire/roll losses
(GTRL) for vehicles based on the drive axle
weight
Determination of Parasitic Losses
[Screenshot: parasitic loss determination - "When dynamometer speed begins
to decelerate, press here: Decelerating"]
25
-------
Dynamometer Coast Down Check
A random horsepower setting is chosen
A coast down with that power applied is conducted
The time to coast down through a speed window is
measured and compared to a theoretical time
Dynamometer should compensate for parasitic drag
For ASM dynamometers, 7% error is allowed
For IM240 dynamometers, 1 second from 55 to 45 mph (a 15- to
20-second coastdown) and 6 seconds from 22 to 18 mph (a 22- to
33-second coastdown)
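The theoretical time follows from the kinetic energy change of the simulated inertia divided by the absorbed power; a sketch with standard unit conversions (the example numbers are illustrative):

```python
MPH_TO_FPS = 1.46667   # mph to feet per second
G = 32.174             # lbm-ft / lbf-s^2

def coastdown_time_s(inertia_lbs, v1_mph, v2_mph, total_hp):
    """Time to coast from v1 to v2 when a constant total power (PAU plus
    parasitic and tire/roll losses) decelerates the simulated inertia:
    t = delta_KE / power."""
    v1, v2 = v1_mph * MPH_TO_FPS, v2_mph * MPH_TO_FPS
    delta_ke_ftlb = 0.5 * inertia_lbs / G * (v1 ** 2 - v2 ** 2)
    return delta_ke_ftlb / (total_hp * 550.0)

# 3625 lb inertia, 55 -> 45 mph at ~13 hp: about 17 s, inside the
# 15- to 20-second window quoted above
print(round(coastdown_time_s(3625, 55, 45, 13.0), 1))
```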
Dynamometer Inertia Simulation Tester
Principle of Operation
A dynamometer should simulate the load a
vehicle would see under identical operating
conditions (speed and acceleration) on-road
The theoretical load that a vehicle should
receive is compared to the load the vehicle
actually received on the dynamometer.
Both overall (over the whole test) and
instantaneous power applied are evaluated
26
-------
How is Error Determined?
Inertia + TRLHP = PAU + PL + GTRL
Or
PAU = Inertia + TRLHP - PL - GTRL
PAU power is measured at the load cell; Inertia,
TRLHP, PL, and GTRL are determined from
calibration or calculation based on vehicle
characteristics and measured speed and
acceleration
Two Types of Error Calculated
Overall HP Simulation Error (OHPSE)
- Second-by-second errors are calculated for every
second and averaged over the entire trace (except
during decelerations on eddy current dynamometers)
Maximum Instantaneous HP Simulation Error (IHPSE)
- Error calculated as an average over five second
intervals, highest single average is recorded as
maximum instantaneous error
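The two statistics can be sketched as follows (an illustrative reconstruction: we assume errors are percentages of expected hp, with a floor on the denominator as shown in the tester's configuration screen):

```python
def simulation_errors(expected_hp, achieved_hp, window=5, min_denom=5.0):
    """OHPSE averages the second-by-second percentage errors; IHPSE is the
    largest magnitude among rolling window averages of those errors."""
    errors = [100.0 * (a - e) / max(abs(e), min_denom)
              for e, a in zip(expected_hp, achieved_hp)]
    ohpse = sum(errors) / len(errors)          # overall error
    window_means = [sum(errors[i:i + window]) / window
                    for i in range(len(errors) - window + 1)]
    ihpse = max(abs(m) for m in window_means)  # max instantaneous error
    return ohpse, ihpse

# 10 seconds of data where the dynamometer applies 5% too much power
print(simulation_errors([10.0] * 10, [10.5] * 10))  # (5.0, 5.0)
```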
27
-------
Dynamometer Testers
Dynamometer tester must be capable of measuring speed and force at
0.5-second intervals
Dynamometer testers
- BAR / Real Time - $450,000, mounted on a trailer
  Uses wheel on dynamometer to measure speed
  Uses roll set on dynamometer to simulate a vehicle and measure force
  applied
  Measures force through roll set tires, some error in tire/roll interface
- Sierra Research - $20,000 and briefcase size
  Measures force at the dynamometer's own load cell
  Measures speed with photo tachometer
SR Dynamometer Tester
[Diagram: SR dynamometer tester - a computer with fiber-optic cable
connections to the dynamometer's load cell and a photo tachometer reading
the roll set and power absorber]
28
-------
SR Dynamometer Tester
Program Main Menu
[Screenshot: main menu - "Sierra Research Dynamometer Tester, Patent
Pending," with buttons for Dynamometer Description, Calibrate Load Cell,
Determine Parasitics, Drive Trace, and Print Results]
Configuration Choices
MPH below which instantaneous error disregarded: 5
Instantaneous error averaging period (seconds): 5
Minimum horsepower denominator for error: 5
Minimum MPH for parasitic determination: 10
Black bar width (in): 0.5
Track road load power fraction A: 0.35
Track road load power fraction B: 0.1
Track road load power fraction C: 0.55
Communications COM port number: 1
Augmented braking force limit (lb): -110
Max acceptable % instantaneous error: 5
Max acceptable % overall error: 5
Size of parasitic MPH speed segment: 10
8.65" tire/roll interface loss PF A: 0.76
8.65" tire/roll interface loss PF B: 0.33
8.65" tire/roll interface loss PF C: 0.09
20" tire/roll interface loss PF A: 0.65
[list truncated; Restore Defaults and Close buttons]
29
-------
Vehicles Pre-Loaded
"Test" cars

                    1         2             3        4
VCID                961119    810984        880617   920454
Model Year          1996      1981          1988     1992
Make                Ford      Chevrolet     Yugo     Mercedes
Model               Taurus    K20 Suburban  GV       300TE
Body Style          Sedan     SUV-4         Sedan    Sedan
Cylinders           6         8             4        6
Engine Size         3.4       5.7           1.1      3.0
Transmission        A         E             M        A
Test Weight         3875      5000          2125     4250
TRLHP               13.4      21.0          11.1     14.8
Drive Axle Weight   2345      2711          1164     2000
30
Vehicle List
[Screenshot: Edit Vehicle List - vehicle description "1996 Taurus Sedan
6cyl 3.4L Automatic"; drive axle weight: 2,345 lbs; vehicle inertia: 3,875
lbs; road load at 50 mph (hp); Create, Delete, OK, and Cancel buttons]
-------
Test Data Entry Screen
[Screenshot: data entry - Facility name: Joe's Test Shop; Facility address:
100 Main Street, Anywhere, CA, 90000; System provider: Equipment-R-Us;
System model: Smogger 2002; Dynamometer manufacturer: Speed Roll Inc.;
Dynamometer model: Big Dyne; Equipment serial number: 1234567890abcdefg;
Roll diameter (in): 8.65; Dynamometer base inertia (lbs): 2000]
Load Cell Calibration
[Screenshot: load cell calibration - arm length from center to weight
position (in); total weight to be placed on arm: 50 lbs; number of teeth
on power absorber pulley; number of teeth on roll pulley; lbs per volt:
145; Begin Calibration button]
31
-------
Dynamometer PAU and Load Cell
[Photo: power absorption unit and load cell]
-------
Determine Parasitic Losses
[Screenshot: "When dynamometer speed begins to decelerate, press here:
Decelerating"; the parasitic curve polynomial coefficients A, B, and C are
displayed]
Inertia Simulation Test
[Screenshot: VCID, equipment serial number 1234567890abcdefg, allowed
limits (%), Vehicle: 1981 Chevrolet K20 Suburban, "When ready, press Start
Driving," Overall error: Uncalculated]
33
-------
Inertia Simulation Test
Results Printout
34
-------
Proper Inertia Simulation
Sierra Research
Transient Dynamometer Simulation Accuracy Test
[Chart: total expected hp, total achieved hp, error, and roll speed (mph)
versus time (sec); the expected and achieved hp traces overlay closely]
Inertia Setting Incorrect
Actual 3000, Set at 5000
Sierra Research
Transient Dynamometer Simulation Accuracy Test
[Chart: total expected hp, total achieved hp, error, and roll speed (mph)
versus time (sec); the achieved hp diverges widely from the expected hp]
35
-------
TRLHP Incorrect
Should be 15 HP, Set at 12 HP
Sierra Research
Transient Dynamometer Simulation Accuracy Test
[Chart: total expected hp, total achieved hp, error, and roll speed (mph)
versus time (sec)]
Dynamometer With a Slow
Control System
Sierra Research
Transient Dynamometer Simulation Accuracy Test
J -20.00
m A
k-i V
i ,i \f\r\a A
A':
if .'--"...v.,", . . . ,
11 2i: 3t;41 SI 6t 71 ail 91 101 111,^21 133 1*41 151 161 171 181 181 201 211 221|231
' . Ij
i Total Expected hp
Total Achieved hp
j Error
RoM Speed (mph) ,
Time (sec)
36
-------
TRLHP Incorrect
Should be 15 HP, Set at 12 HP
Sierra Research
Transient Dynamometer Simulation Accuracy Test
[Chart: total expected hp, total achieved hp, error, and roll speed (mph)
versus time (sec)]
Dynamometer Audit Issues
Proper error checking of dynamometer load is not
incorporated into the control software on some test
systems, even with power disconnected
Improper transient loading - "Virtual Inertia"
No axle weight scales are being used with BAR97
dynamometers in ASM programs
Roll bearing temperatures are not being checked to
compensate for changes in parasitic losses
BAR97-certified dynamometers are incapable of
determining parasitic losses at speeds over 30 mph
One dynamometer vendor has made changes in the
gearing between the power absorption unit and the roll
set
37
-------
Dynamometer ATP Issues
Problems observed with the use of the EPA/Sierra
Lookup Table (ESLT)
- Some vendors are simply not looking up the vehicle information
in the lookup table correctly
- Some locations are not updating the ESLT as frequently as
needed; some locations are not capable of updating it
- Many testing programs are not using the "by model year"
defaults contained in the ESLT
- Test programs are not maintaining the ESLT vehicle
characteristic identification (VCID) number in their databases
Currently the ESLT only includes the generic tire/roll
interface losses for 2WD testing
In some cases parasitic losses were being subtracted
twice (at both the VID and the test system)
6. Gas Cap Tester Audit
Check gas cap tester accuracy
Use standard caps (Stant) or wand (Waekon)
Flow check standards at startup and every six
months
Ensure pass standard is not on analyzer when
auditor arrives for audit
38
-------
Gas Cap Tester Audit Procedure
Do not calibrate the gas cap tester, test as found
Put analyzer in gas cap testing mode
Perform a test with no cap on the tester
Perform a test with the fail cap on the tester
Perform a test with the pass cap on the tester
Record results
Gas Cap Tester Audit Issues
Currently four types of gas cap testers in use
- Stant, Waekon, ESP internal, Worldwide internal
Some units give unreliable results
If a gas cap tester fails calibration, stations generally repeat the
calibration until it passes
Some calibration procedures do not include checks of the accuracy
of the system, but simply reset the calibration point
Fail standards provided by both Waekon and Stant have on occasion
been found to be outside the allowed tolerance
EPA guidance requires that standards be checked before use and at six-
month intervals; this is generally not done
Pass standard used for falsely passing tests
-------
7. Pressure Tester Audit
Currently not being used, probably will
never be used
Check flow measurement
Should have volume compensation
Only prototype testers in NJ
Only audit tool is in NJ
Pressure Tester Audit Procedure
Two tank volumes, one for low head space (6.5 L), one
for large head space (19.5 L)
EPA audit pass/fail limit is a drop of 0.4 inches of water
column when pressurized to 28 +/- 1 inches WC
Tester itself will only pressurize to 14" WC; therefore, a
test is needed that can be done at 14"
Use the actual pressure test procedure and look for the correct
pass/fail result with the audit tool set to pass, then fail
Conduct each test at low and high headspace
NJ pressure tester audit tool orifices are 0.0094" and
0.0142", corresponding to flow rates of 74.6% and
119.1% of the 225 cc/min standard
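The audit flow rates implied by those two percentages work out as follows (a quick arithmetic sketch; variable names are ours):

```python
STANDARD_CCM = 225.0   # pass/fail leak flow standard, cc/min

# Orifice flows as percentages of the standard, per the slide
pass_orifice_ccm = 0.746 * STANDARD_CCM   # about 168 cc/min, below the limit
fail_orifice_ccm = 1.191 * STANDARD_CCM   # about 268 cc/min, above the limit
print(round(pass_orifice_ccm, 1), round(fail_orifice_ccm, 1))
```

A correctly working pressure tester should pass the first orifice and fail the second.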
40
-------
Pressure Tester Audit Issues
Only decentralized pressure tester designed by
Waekon
Currently there are no decentralized programs
conducting pressure testing
Pressure testing auditing has not been attempted;
however, the tool was used during certification
testing of the Waekon pressure tester and
worked satisfactorily
41
-------
8. OBDII Tester Audit
Check the ability of the system to obtain the
following over any communication protocol:
- Readiness monitor status
- If the MIL is commanded on
- DTCs set
Use EASE OBDII Simulator
PID is set to "FF" so if unit is used for clean
scanning, it can be identified from the data stream
OBDII Audit Equipment
[Screenshot: OBDII simulator interface - communication protocol selection,
MIL command setting, stored DTC entries, and a "Send Data to Verification
Unit" button]
42
-------
OBDII Tester Audit Procedure
Set the "power on time" and the "time to run" to 15 and
30 seconds
Randomly choose monitor status (complete, not
complete, unsupported); entering "unsupported" for any
of the continuous monitors may cause an error
Choose communication protocol
Choose if the MIL is commanded on
Enter some diagnostic trouble codes (DTCs); they must
start with the letter P and be followed by four digits
Send data to verification unit
Connect tester to OBDII connector on analyzer and have
analyzer poll the unit and report the results
OBDII Tester Audit Calculations
Check that readiness monitors are correct
(complete, not complete, unsupported)
Is the MIL commanded on
Are all DTCs correctly recorded
Some states also require the DTC definitions to
be displayed or printed on the VIR
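The verification step amounts to comparing what was programmed into the simulator against what the test system reports; a sketch (the data layout, field names, and example DTCs are illustrative):

```python
def verify_obdii_audit(entered, reported):
    """Return a list of discrepancies between simulator inputs and the
    test system's reported values; an empty list means the audit passed."""
    problems = []
    if entered["monitors"] != reported["monitors"]:
        problems.append("readiness monitor status mismatch")
    if entered["mil_on"] != reported["mil_on"]:
        problems.append("MIL commanded-on status mismatch")
    if sorted(entered["dtcs"]) != sorted(reported["dtcs"]):
        problems.append("DTCs not correctly recorded")
    return problems

entered = {"monitors": {"catalyst": "complete", "evap": "not complete"},
           "mil_on": True, "dtcs": ["P0134", "P0455"]}
reported = {"monitors": {"catalyst": "complete", "evap": "not complete"},
            "mil_on": True, "dtcs": ["P0455", "P0134"]}
print(verify_obdii_audit(entered, reported))  # [] -> audit passes
```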
43
-------
OBDII Tester Audit Issues
Original simulators did not include Key Word
Protocol (KWP)
Many OBDII test systems did not include the
ability to communicate with KWP
Controller Area Network (CAN) communication
protocol will appear in 2003 model year vehicles;
the current tester does not simulate CAN
Presently there is only one manufacturer of the simulator
9. VMAS Audit
Two measurements to be checked on the
Vehicle Mass Analysis System (VMAS)
unit
- Oxygen measurement (used for calculation of
dilution)
- Exhaust flow
Oxygen sensor on BAR97 analyzer needs to
be audited as well
44
-------
VMAS Unit Operation
[Diagram: VMAS measurements - (1) dilute exhaust flow, (2) dilute oxygen,
(3) ambient oxygen, and (4) raw oxygen; a raw exhaust sample plus ambient
make-up air form the dilute exhaust flow, and the unit communicates with
the BAR97 cabinet (raw gas analyzer) over a serial link]
VMAS Calculations
Mass Emission Rate = Dilute flow rate x Dilution ratio x Concentration

Dilution Ratio = (ambient O2 - dilute O2) / (ambient O2 - raw O2)

Overflow is when the 75% exhaust to 25% make-up air ratio is exceeded:
% exhaust = (calculated exhaust flow / total flow) x 100
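Written out as a sketch (the oxygen readings are illustrative, and unit and density conversion factors are omitted):

```python
def dilution_ratio(ambient_o2, dilute_o2, raw_o2):
    """The slide's dilution ratio: the fraction of the measured dilute
    flow that is raw exhaust, from the three oxygen readings (%)."""
    return (ambient_o2 - dilute_o2) / (ambient_o2 - raw_o2)

def mass_emission_rate(dilute_flow, ambient_o2, dilute_o2, raw_o2, raw_conc):
    """Mass emission rate = dilute flow x dilution ratio x concentration."""
    return dilute_flow * dilution_ratio(ambient_o2, dilute_o2, raw_o2) * raw_conc

# Illustrative readings: 20.9% ambient, 16.7% dilute, 1.0% raw oxygen
dr = dilution_ratio(20.9, 16.7, 1.0)
overflow = dr * 100.0 > 75.0  # lockout when exhaust exceeds 75% of total flow
print(round(dr, 3), overflow)  # 0.211 False
```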
45
-------
VMAS Unit Components
VMAS Unit Flow Measurement
[Diagram: vortices shed from the strut interrupt ultrasonic waves sent by
the Transmit piezo to the Receive piezo; a chart shows flow versus the
vortex shedding rate (Hz)]
46
-------
Technician QC check
"Hose-Off" Flow Check
Compares factory flow rate with hoses removed to
current flow rate with hoses removed
Can be done by technicians weekly
Requires a screwdriver to remove hoses
Turn on, allow to flow for one minute, sample
flow rate
Allowable difference is 10%
% Difference = 100 x (Factory hose-off flow rate - Current hose-off flow rate)
               / Factory hose-off flow rate
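The check reduces to a one-line calculation (a sketch; the function name and example flows are ours):

```python
def hose_off_pct_difference(factory_flow, current_flow):
    """Percent difference between the factory hose-off flow rate and the
    current hose-off flow rate; the allowable difference is 10%."""
    return 100.0 * (factory_flow - current_flow) / factory_flow

print(hose_off_pct_difference(300.0, 270.0))  # 10.0, right at the limit
```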
VMAS Flow Audit Equipment
Smooth Approach Orifice (SAO) with a
calibration range of 200 to 500 SCFM
Barometer
Manometer
Thermometer
Butterfly or gate valve
Screwdriver
47
-------
VMAS Flow Audit Equipment - Setup
[Diagram: Smooth Approach Orifice (SAO) connected to the VMAS inlet
through a butterfly or gate valve, with a manometer measuring the pressure
at the SAO]
48
-------
VMAS Flow Check Audit Procedure
Remove the hoses from the VMAS
Place the analyzer in audit mode and display the VMAS flow rate
and factory default hose off flow rate
Perform the hose off flow check
If the difference between the factory and measured hose-off flow
checks is more than 10%, have the technician clean the strut and
piezos and recheck
Connect the SAO setup to the VMAS unit, ensure the gate valve is
fully open
Measure the barometric pressure and ambient temperature
Turn on the VMAS blower
Record the flow rate displayed at analyzer and pressure at SAO after
one minute
VMAS Flow Check Audit Procedure
Continued
Restrict the flow to 225 CFM indicated on the analyzer
Record the flow rate displayed at analyzer and pressure
at SAO after one minute
Increase flow to half way between 225 CFM and
maximum recorded flow rate
Record the flow rate displayed at analyzer and pressure
at SAO after one minute
Convert pressures at SAO to flow
Calculate differences between SAO- and analyzer-
indicated flow rates
Accuracy tolerance is +/-10%
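Putting the comparison together (a sketch: the square-root orifice relation and the constant k are illustrative stand-ins for the SAO's real calibration curve, which also applies barometric pressure and temperature corrections):

```python
import math

def sao_flow_scfm(delta_p_in_h2o, k=100.0):
    """Hypothetical SAO calibration: flow proportional to the square root
    of the measured pressure drop at the orifice."""
    return k * math.sqrt(delta_p_in_h2o)

def flow_audit_passes(sao_flow, analyzer_flow, tolerance_pct=10.0):
    """Analyzer-indicated flow must agree with the SAO flow within 10%."""
    return abs(analyzer_flow - sao_flow) <= tolerance_pct / 100.0 * sao_flow

print(flow_audit_passes(sao_flow_scfm(9.0), 310.0))  # True (300 vs 310 SCFM)
```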
49
-------
VMAS Dilution Audit Equipment
Zero air
8% and 15% oxygen in nitrogen - 1%
accuracy gas, 2% blend tolerance
Three gas regulators for the gas cylinders
Bubble style flow meter with a 1.5LPM
maximum range
Three position selecting valve with flow
meter
Tygon tubing
VMAS Dilution Audit Equipment - Setup
[Diagram: three gas cylinders - zero air, 8% O2 in N2, and 15% O2 in N2 -
connected through the selector valve and flow meter to the VMAS oxygen
sensor vent]
50
-------
VMAS Dilution Audit Procedure
Check accuracy of the oxygen sensor in the VMAS; the oxygen sensor
in the analyzer was checked in the analyzer audit
Connect audit hardware and connect audit gas to innermost oxygen
sensor vent
Confirm all three gases are set to flow at 1 LPM
Flow zero air (20.9% oxygen) at 1 LPM
Enter VMAS audit page
Turn on the VMAS and zero the oxygen sensor
Switch to 15% oxygen, flow for 1 minute and record three readings
Switch to 8% oxygen, flow for 1 minute and record three readings
Audit tolerances are +/-1.3% at both audit points
VMAS Dilution Audit Tolerances
Spreadsheet for Calculating VMAS Audit Tolerances
[Screenshot: spreadsheet with the same structure as the ASM audit tolerance
sheet - columns for range, gas, recommended and actual cylinder
concentrations, gas accuracy, audit tolerance (% and point), applied
tolerance (greater of % or point), and low/high audit limits. Example
rows: analyzer O2 audited at a recommended 1.00% (actual cylinder 1.01%,
applied tolerance 0.11, limits 0.90-1.12); VMAS O2 audited at 15.0%
(actual 14.90%, applied tolerance 0.45, limits 14.45-15.35) and at 8.0%]
In the "Actual Cylinder" column, enter the gas concentration from the
audit cylinders
The audit low and high tolerances are calculated in the columns on the
right
51
-------
VMAS Audit Issues
No quality control checks for VMAS specified by EPA
Oxygen sensors in BAR97 analyzers are not required to
fail the oxygen sensor quality control check
Leaking raw probes are very important because raw sample
oxygen is not dilution corrected, and extra oxygen in the
raw sample falsely lowers the exhaust flow rate
VMAS unit oxygen sensor audits are failing
Hose-off flow check for routine inspector QC not
implemented; factory values do not exist
Diluted flow oxygen content on decelerations is a problem
VMAS Audit Issues
There are no QC checks for a dirty strut or piezos
VMAS calibration considers ambient oxygen 20.8%, not
20.9%
Oxygen sensor calibrates at a high concentration but
measures at lower concentrations; sensors may not be
linear in response
Units may not be calibrated properly before installation
VMAS mounting could cause low flow. Some stations are
making holes in the hose to avoid low flow lockouts; this
prevents the overflow lockout from working properly
52
-------
VMAS Alternative - TransMass
Test method developed and patented by Gordon-
Darby that allows calculation of mass emissions
scores without flow measurement
IM240 results from Gordon-Darby test lanes
used to establish relationships between exhaust
volume and vehicle characteristics (e.g., weight
and engine size)
Resulting algorithms applied to test vehicles to
compute mass emissions directly from raw
tailpipe measurements
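A heavily simplified sketch of the idea (the coefficients here are invented for illustration; the actual TransMass regressions were fit to Gordon-Darby IM240 lane data and are proprietary):

```python
def estimated_exhaust_cfm(test_weight_lbs, engine_size_l,
                          b0=5.0, b_weight=0.002, b_engine=3.0):
    """Stand-in for the TransMass regression: exhaust volume predicted
    from vehicle characteristics instead of being measured directly."""
    return b0 + b_weight * test_weight_lbs + b_engine * engine_size_l

def mass_rate(raw_concentration, exhaust_cfm, pollutant_density_g_per_cf):
    """Mass emissions computed directly from the raw tailpipe
    concentration and the estimated exhaust volume."""
    return raw_concentration * exhaust_cfm * pollutant_density_g_per_cf

# e.g., a 3875 lb, 3.4 L vehicle under these made-up coefficients
print(round(estimated_exhaust_cfm(3875, 3.4), 2))
```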
TransMass
Benefits include:
- Elimination of flow measurement errors or fraud
- Potential SIP benefits equal to or greater than ASM
(in-progress study)
- Potential for eliminating ASM-related test problems
Sierra working with Gordon-Darby on
TransMass enhancements:
- Incorporating vehicle emissions simulation model to
eliminate total reliance on pre-existing test data
- Expanding method to include other transient drive
traces (e.g., MA31)
Investigating SIP credit issue
53
------- |