                                                            EPA-AA-IMS-80-5-A
                              Technical Report
                               September, 1980
                          Analysis of the Emission
                            Inspection Analyzer

                                     by

                             William B. Clemmens
                                   NOTICE
Technical Reports do not necessarily represent final EPA decisions or
positions.  They are intended to present technical analyses of issues using
data which are currently available.  The purpose in the release of such
reports is to facilitate the exchange of technical information and to inform
the public of technical developments which may form the basis for a final
EPA decision, position, or regulatory action.
                      Inspection and Maintenance Staff
                     Emission Control Technology Division
                Office of Mobile Source Air Pollution Control
                    U.S. Environmental Protection Agency

   Note:  This  report  is presented  in two  sections,  -80-5-A,  and -80-5-B.
          The first section (-A) contains Background, Technical Discussions,
          and  Policy  Information.   The  second section  (-B)  contains the
          Recommended Technical  Specifications.   The sections are available
          separately.

-------
                              Table of Contents

                              EPA-AA-IMS-80-5-A
List of Tables                                                          5
Acknowledgments                                                         6
Executive Summary                                                        7
Technical Report                                                        12

I.   Introduction                                                      13

II.  Overview and Conclusions                                          15

III. Minimum Quality Analyzers — Are They Needed?

     A. Introduction                                                   28
     B. Uses of Test Data                                              29
     C. Consequences of Bad Data                                       29
     D. Current Analyzers                                              32
     E. Alternatives                                                   34
     F. Precedents                                                     35
     G. Conclusions and Recommendations                                36

IV.  The Inspection Analyzer

     A. Introduction                                                   37
     B. Measurement Error Sources                                      37
     C. Acceptable Measurement Error                                   42
     D. Discussion of Current Specification and Alternatives           43
     E. Cost of Alternatives                                           47
     F. Production Variances and Field Audit Testing                   51
     G. Conclusions and Recommendations                                55

V.   Program Implications

     A. Introduction                                                   61
     B. Implementation Considerations                                  61
     C. Implementation Issues                                          61
                              EPA-AA-IMS-80-5-B

                              Table of Contents
                               Acknowledgments

                          Analyzer Specifications                        10
VI.  How to Use The Specifications

     A. Overview                                                        11
     B. Analyzer Technology                                             12
     C. Change Notices                                                  12
     D. Definitions and Abbreviations                                   13

-------
                        Table of Contents (continued)
VII. Recommended State Minimum Specifications                           16

     A. Gases                                                           19
     B. Gas Cylinders                                                   21
     C. Durability Criteria                                             22
     D. Design Requirements                                             23
     E. Analyzer Performance                                            31
     F. Sample System Performance                                       34
     G. Operating Environment                                           36
     H. Fail-Safe Features                                              37
     I. System Correlation to Laboratory Analyzers                      38
     J. Manuals                                                         39

VIII. Additional System Specifications                                  41
      Recommended For Decentralized Inspection Programs

     A. Automatic Zero/Span Check                                       41
     B. Automatic Leak-Check                                            43
     C. Automatic Hang-Up Check                                         44
     D. Automatic Read System                                           45
     E. Dual Tailpipes                                                  45
     F. Automatic Test Sequence                                         46
     G. Printer                                                         48
     H. Vehicle Diagnosis                                               48
     I. Anti-Tampering                                                  48
     J. System Diagnosis Testing                                        49

IX.  Optional Features For The Inspection Analyzer                      50

     A. Automatic Data Collection                                       50
     B. [Deleted]                                                       52
     C. Anti-Dilution                                                   53
     D. Loaded Mode Kit                                                 54
     E. Engine Tachometer                                               55

X.   Future Improvements For The Inspection Analyzer                    56

     A. Introduction                                                    56
     B. Improvements in Water Removal                                   56
     C. Improvements in HC Measurement                                  57

XI.  Evaluation Procedures                                              59

     A. Traceability of Analytical Gases                                60
     B. Gas Cylinder Specifications                                     61
     C. Durability Test Procedures                                      62
          1. Vibration and Shock                                        62
          2. Sample Line Crush                                          64
          3. Sample Handling Temperature Effect                         65
          4. Filter Check and Hang-up                                   67

-------
                   Table of Contents  (continued)
D. Design Requirement Inspection and Test Procedures               70
     1. Useful Life                                                70
     2. Name Plate                                                 70
     3. Sample System                                              70
     4. Sample Pump                                                70
     5. Sample Probe                                               70
     6. Sample Line                                                70
     7. Analyzer Spanning System                                   71
     8. Analyzer Ranges                                            71
     9. System Grounding                                           71
    10. System Vents                                               71
E. Analyzer Performance Test Procedures                            72
     1. Calibration Curve                                          72
     2. Resolution                                                 74
     3. Compensation                                               75
          a) Altitude                                              75
          b) Pressure and Temperature                              78
          c) Non-Compensated Systems                               81
     4. Zero and Span Drift                                        82
     5. Span Drift (see E.4.)                                      85
     6. Noise (see E.8.)                                           85
     7. Sample Cell Temperature                                    86
     8. Gaseous Interference and Noise                             88
     9. Electrical Interference                                    91
    10. Propane to Hexane Conversion Factor                        95
F. Sample System Test Procedure                                    98
     1. Sample Cell Pressure Variation, Low  Flow  and               98
        Response Time
     2. (See F.1.)                                                 102
     3. (See F.1.)                                                 102
     4. (See F.1.)                                                 102
     5. System Leakage                                             103
     6. (See C.4)                                                  104
G. Operating Environment Test Procedure                            105
H. Fail-Safe Systems                                               107
     1. Warm-up Lock-out                                           107
     2. Low Flow                                                   109
I. System Correlation Test Procedures                              110
     1. NDIR Correlation                                           110
     2. FID Correlation                                            114
J. Microprocessor Systems                                          117
     1. Automatic Zero/Span                                        117
     2. Automatic Leak Check                                       117
     3. Automatic Hang-up                                          117
     4. Automatic Read                                             117
     5. Dual Tailpipes                                             118
     6. Automatic Test Sequence                                    118
     7. Printer                                                    118
     8. Vehicle Diagnosis                                          118
     9. Anti-Tampering                                             118

-------
                               List of Tables
Number                        Title

II-1           Estimated Average Retail Prices                          17
II-2           Significant Features of the Technical Recommendations    20
II-3           Comparison of Operator Features                          23
II-4           Comparison of Performance Specifications                 25
II-5           Comparison of Specification Features                     26
II-6           Comparison of EPA Recommendations versus                 27
               207(b) Minimum Specifications

IV-1           Factors Affecting the Factory Calibration Curve          39
IV-2           Factors Affecting Field Spanning                         39
IV-3           Sample Handling System Design Considerations             40
IV-4           Operational Characteristics                              40
IV-5           Factors Affecting Durability                             41
IV-6           Factors Controlled by the Operator                       42
IV-7           Incremental Error Due to Operator Actions                44
IV-8           Inflation Effect on Analyzer Cost                        47
IV-9           Estimated Retail Cost Increase                           50
IV-10          Estimated Retail Cost Increase                           51
IV-11          Comparison of Analyzer Costs                             52
IV-12          Recommended Qualification Program                        53
IV-13          Analyzer Costs                                           57
IV-14          Average Inspection Labor and Costs per Vehicle           57
IV-15          Yearly Costs to Purchase A Computer Analyzer             59
IV-16          Recommended Analyzer Applications                        60

V-1            Recommendation Criteria for Phase-Out/Phase-In           63
               of Analyzers

-------
                              Acknowledgments
The contributions of and stimulating discussions with Merrill Korth and
Gordon Kennedy of EPA during the formative stages of the technical
specifications are greatly appreciated.  The inputs and numerous reviews
provided by the Equipment and Tool Institute Performance Test Group have
greatly aided the preparation of this specification.

-------
                               EXECUTIVE SUMMARY

-------
                              EXECUTIVE SUMMARY
                 EPA Recommended Instrument Specifications

The  instrument  used  to  measure  motor vehicle  exhaust  concentrations  of
hydrocarbon   (HC)  and carbon monoxide  (CO)  is  typically  called  a garage
analyzer.   This  commonly used title has considerable significance  to  the  I/M
program administrator  choosing  a  minimum specification  for  exhaust  ana-
lyzers.  The  word  "garage"  indicates the  location  where  the instrument  is
used.  "Garage" also refers  to  its intended use: to assist in the  diagnosis
and repair of engines and emission control systems.  As those familiar with
diagnoses  and repair using  an exhaust analyzer know,  the  relative  level  of
pollutants,  and the  change  in emission levels  in response to an  adjustment
or repair,  are most important in servicing a vehicle.

The I/M program places a new burden on these instruments: inspection.  An
inspection of the vehicle's exhaust requires an accurate measurement of the
pollutant concentration, not just a relative level.  Whether a vehicle
requires repair depends on this measurement, and accuracy becomes an
important consideration.  The instrument must also provide a repeatable
measurement in order to assure an equitable inspection for each motorist.
Failure to achieve repeatability inevitably results in challenges to the
program's credibility.

The  current  garage  type repair  analyzer  has  been  used  as an  inspection
analyzer  in currently operating I/M programs.  In centralized I/M programs,
its design capability has been  greatly complemented by computer  control  and
very  frequent  calibration  and  maintenance.   It  is continually under  the
watchful  eye of an experienced  instrument  technician, and  its working envi-
ronment is  often carefully controlled.

In decentralized I/M programs,  the repair  analyzer has also  been  used as  an
inspection  tool,  but with  little  consideration  for its  original  intended
use.  The  one concession to this situation  has been periodic  State checks  of
the instrument's calibration.

In the garage environment, the inability of the repair analyzer to provide
accurate and repeatable measurements is well established.  A recent NHTSA
study indicated that 32 percent of in-use exhaust analyzers were reading
more than 15 percent too high or too low.  The study had attempted to use
the industry accuracy standard of ±3%, but found that virtually no analyzer
could meet this requirement.  Even at ±5%, 93 percent of the analyzers were
inaccurate or not repeatable.

In addition, the performance specifications of repair analyzers often are
inappropriate for inspection purposes.  As an example, most instruments are
designed to operate at 0 to 85% relative humidity.  For an inspection
analyzer, this could preclude use on high humidity days.  In many areas of
the country, this specification effectively limits the use of the instrument
for a large portion of the year.  This is of course an impractical
constraint, yet the instrument manufacturers have done nothing to rectify
the problem.

-------
Several attempts  to  improve the inspection analyzer by establishing minimum
specifications have been made.  In 1974, the California Bureau of Automotive
Repair (BAR) published its  first minimum instrument specification applicable
to garages participating in its Blue Shield inspection program.  In 1980 BAR
upgraded  this  specification.   This  year,  the  Equipment  and Tool Institute
(ETI), an industry organization,  published its  own  specification,  and has
widely disseminated it to states preparing I/M programs.  EPA has reviewed
each of these specifications and finds them lacking in two areas: 1) failure
to consider how the operator affects the measured results, and 2) lack of
specific accreditation test procedures.

Based on an error propagation model, EPA determined that the most important
factor in obtaining accurate exhaust measurements is proper operating
procedure.  EPA determined that incorrect gas spanning (calibration) could
affect emission measurements by up to 40%, improper purging by up to 100%,
leaks by up to 100%, and improper meter reading by up to 20%.  Under the
best of conditions, current equipment has a measurement accuracy of 25% to
35%.
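The error propagation model itself is described in Chapter IV; its flavor
can be sketched as follows, assuming for illustration that independent error
sources combine by root-sum-square (the combination rule and the "careful
operation" inputs below are assumptions, not figures from the EPA model):

```python
import math

# Worst-case error bounds quoted above (fraction of reading).
WORST_CASE = {"gas_spanning": 0.40, "purging": 1.00,
              "leaks": 1.00, "meter_reading": 0.20}

# Hypothetical levels for a careful operator, well below worst case;
# these inputs are illustrative assumptions only.
CAREFUL = {"gas_spanning": 0.10, "purging": 0.15,
           "leaks": 0.15, "meter_reading": 0.05}

def rss(errors):
    """Combine independent error sources by root-sum-square."""
    return math.sqrt(sum(e * e for e in errors))

print(f"all sources at worst case:   {rss(WORST_CASE.values()):.0%}")
print(f"careful operation (assumed): {rss(CAREFUL.values()):.0%}")  # ~24%
```

Even the assumed careful-operation case lands near the 25% accuracy cited
above for current equipment under the best of conditions.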

The assessment indicated that minor but important design changes could
improve optimum accuracy to the 10 to 20 percent range.  The improvements
involve  the detector,  sample  cell,  and  signal  conditioning,  and  can  be
incorporated into  existing  designs with relative  ease.  EPA recommends that
all  states  implementing  centralized I/M programs  adopt  the EPA recommended
specification  for  a  manually operated  analyzer.  The specification, includ-
ing  detailed  acceptance procedures,   can  be  found  in  EPA-AA-IMS-80-5-B,
"Recommended Specifications for Emissions Inspection Analyzers".

These improvements do not  address the proper  operation  of the instrument.
In centralized programs,  this  is dealt with  through  use  of inspection per-
sonnel thoroughly familiar with  the instrument,  through  recordkeeping and
frequent  calibration  and maintenance,  and often  through  real time computer
control of the instrument.

In decentralized programs, the station operator or mechanic cannot be
expected to become an instrument technician; the sophistication of the
instrument precludes this.  The cost pressure to complete the inspection as
rapidly as possible encourages skipping proper calibration and leak checks.
In fact, the calibration and maintenance requirements of these analyzers
exceed those of any other garage instrument.  Incompetence and fraudulent
practices are also considerations in a decentralized program because of the
minimal inherent checks on the quality of the operation.

The advent  of  the $15 pocket calculator and the $800 home computer provides
a  practical  solution to  most of  the  problems  of proper operation  of the
inspection  analyzer  in a  decentralized I/M program.  With  the addition  to
each instrument of a small microcomputer, the inspection instrument can take
on most of the calibration and recordkeeping burden.  This computer operated
analyzer will  restrict operation until the unit is fully warm, will provide
for  automated  gas span  and leak  checks,  can  accept  vehicle ID  and  other
information, will  automatically  make  the pass/fail decision, will provide a
hard copy output  (which  can include diagnostic information),  and  can  store
pertinent data on magnetic tape for future state analysis.
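A minimal sketch of the lock-out and pass/fail logic such a computer
operated analyzer might implement is shown below; the names and structure
are hypothetical, not taken from the specification, and the cutpoints shown
are the 207(b) warranty levels cited later in this report:

```python
from dataclasses import dataclass

@dataclass
class AnalyzerStatus:
    warmed_up: bool = False   # warm-up complete
    span_ok: bool = False     # automated gas span check passed
    leak_ok: bool = False     # automated leak check passed

def run_inspection(status, hc_ppm, co_pct, hc_cutpoint, co_cutpoint):
    """Refuse to test until checks pass; then decide pass/fail automatically."""
    if not (status.warmed_up and status.span_ok and status.leak_ok):
        raise RuntimeError("locked out: warm-up, span, or leak check pending")
    # The computer, not the operator, compares readings to the cutpoints.
    return "PASS" if hc_ppm <= hc_cutpoint and co_pct <= co_cutpoint else "FAIL"

status = AnalyzerStatus(warmed_up=True, span_ok=True, leak_ok=True)
print(run_inspection(status, hc_ppm=180, co_pct=0.9,
                     hc_cutpoint=220, co_cutpoint=1.2))  # PASS
```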

-------
                                     10
Such  analyzers  are not  a wishful  dream.   One  repair  analyzer produced by
Hamilton  Test  Systems incorporates  some of  these features  and  is commer-
cially available.  The New Jersey inspection program  is currently evaluating
a computerized  unit  produced by Sun  Electric.   New York State has recently
contracted  for  over  4000 computerized  analyzers  for its decentralized  I/M
program.   Given these  recent  developments,  EPA's  assessment of  9  to 18
months  before  quantities of  units  meeting  its specification are available
may be overly pessimistic.

EPA  strongly recommends  that  each  state  implementing  a decentralized  I/M
program adopt the  EPA specification  for a  computer operated analyzer.   The
complete specification and detailed acceptance procedures can be found in
EPA-AA-IMS-80-5-B   ("Recommended   Specifications  for  Emission  Inspection
Analyzers").   A drawing  of  an existing  prototype  follows this executive
summary.  EPA Report  EPA-AA-IMS-80-5-A  ("Analysis  of  the  Emission Inspection
Analyzer")  provides  considerable  information  on  the need  and  benefits of
adopting the computer operated  analyzer.

Computerized  features will  increase the  cost   of  the  inspection analyzer.
The attached report (EPA-AA-IMS-80-5-A) estimates that the full cost of the
analyzer can be recovered at 2 to 3 dollars per test.  In addition, the
cost  to  the  garage  owner  can be  reduced  through  investment  credits  and
depreciation.   These  considerations are fully  discussed  in  the report.   The
estimated  cost  (1980 dollars)  of   the  various   types of  analyzers is shown
below.

     EPA computerized                          $6195 to 7395
     EPA manual operation                      $4490 to 5690
     BAR 80 certified                          $3750 to 4950
     Current repair analyzer (ETI)             $3000 to 3750
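As a rough check on the 2 to 3 dollar figure, a minimal amortization sketch
(the 5 year useful life appears later in this report; the annual test volume
is an assumed figure for illustration only):

```python
analyzer_cost = 6800.0    # mid-range 1980 dollars, EPA computerized unit
useful_life_years = 5     # estimated useful life (see Chapter V)
tests_per_year = 500      # assumed decentralized station volume

cost_per_test = analyzer_cost / (useful_life_years * tests_per_year)
print(f"${cost_per_test:.2f} per test")  # about $2.72
```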

The computer operated analyzer will allow a reduced frequency of state
audits of licensed decentralized inspection stations.  Because of the
instrument's self-calibration features, quarterly audits will provide
quality assurance equivalent to the otherwise required monthly audits.  This
provision will reduce program administration costs, and should be an
incentive for State adoption of the EPA recommended analyzer.

The impact  on  implementation schedules  of  the  lead  time to procure  instru-
ments  meeting  this specification  is discussed  in a separate memorandum to
EPA's  Regional  Administrators  from  the  Assistant  Administrator  for  Air,
Noise and Radiation.

Finally, the I/M staff at EPA's Ann Arbor facility is available to provide
additional assistance and information as necessary.  You may contact Tom
Cackette, Donald White, or Bill Clemmens at (313) 668-4367.

-------
                     COMPUTER OPERATED EXHAUST ANALYZER

[Drawing of the computer operated exhaust analyzer prototype, with callouts
for: visual output; cutpoint selector switches or decal; operator input
(optional); sealed access for audit or service adjustments; printed output;
computer controlled function switches (gas span, test, etc.); and manual
function switches (on/off, pump, indicators, etc.).]

-------
                                      12

                               TECHNICAL REPORT

-------
                                     13
I.   Introduction

It is doubtful that anyone would disagree with the statement that
automobiles (and trucks) are part of the air pollution problem in this
country.  In the past, Congress has specified, through the Clean Air Act
(and various amendments to the Act), that new cars must meet certain
emission performance standards prior to the introduction of those vehicles
into commerce.  These
performance  standards  have always  been  checked under  closely  controlled
laboratory conditions with  sophisticated equipment.   The required  accuracy
and  performance of   this  equipment has  always  been  specified within  the
context of the laboratory  situation.

Recently, there  has been more emphasis  on checking  the  performance of in-use
vehicles.  This  is occurring through the implementation  of  state inspection
and  maintenance  (I/M) programs as  well  as  the  forthcoming  emission  repair
warranty regulations  (207(b)) authorized  by  the  Clean Air Act.   Further,  the
new  1984 Heavy-Duty   (HD)  Truck Federal Regulations  specify  an  idle standard
as  well  as  a  driving cycle standard.    Both  the 207(b)  warranty  emissions
test for hydrocarbons (HC)  and  carbon monoxide (CO),  and  the Heavy-Duty idle
emission test  for CO are  expected  to  be conducted  on in-use vehicles with
data generated mostly by state-run  I/M  programs.

Practically  all  of the I/M  data  will be  generated by field  emission  inspec-
tion  analyzers  (as   opposed  to  laboratory  equipment)  in both centralized
programs  (i.e.  central inspection  lanes) and decentralized programs  (i.e.
inspection conducted  by independent  service  centers).  This  data will affect
the  consumer  through  required  maintenance,  the  automobile   manufacturer
through  warranty claims,  the  State  through emission  credits,  and  the  EPA
through  its  ability  to  judge  the  effectiveness  of  the  individual I/M pro-
grams.  Obviously, a  fundamental  issue  that  an I/M program must  deal  with is
the  accuracy and validity of the test  data taken under  these programs.   An
inseparable  part  of   that  issue  is the quality of  the equipment used  to
obtain the data.  Various state  and  trade associations have developed  stan-
dards  to control the quality of  the equipment used,  but  as  yet  there are no
nationally accepted minimum standards for inspection  analyzers.   An examina-
tion  of  the data  validity issue should  then  encompass both — Is  the data
generated  under  present conditions sufficiently valid?, and — Is  there a
need for minimum quality standards  for  inspection analyzers?

If  it  is  determined  that  minimum quality data is an objective, and  minimum
quality  standards  are needed,  then  it would be reasonable to investigate
what those minimum standards should be.   The  determination  of minimum stan-
dards  presents  several practical issues  which include — Is such  equipment
available?,  — What is the  economic  impact of  such  technology on society,  on
the  equipment  owner,  and  on the individual  consumer?,  and finally  —  How
could  such  minimum   inspection  analyzer   standards  be  best   implemented?

This report deals with these questions and issues.  Chapter
II provides  a brief summary  of  our  conclusions and  recommendations.  Chapter
III  provides a  discussion of  the need for  the  States to establish  minimum
analyzer  specifications.    (EPA  policy requires  the  State  to  establish a
minimum  instrument specification  4/.)    Chapter  IV  details  the analysis  of

-------
                                    14
the  technical  issues  pertaining  to  the  inspection  analyzer,  provides a
discussion of  several economic issues, and provides overall recommendations
based  on  the  technical  issues.   The  practical  issues  involved  with  State
implementation  of minimum analyzer  specifications are  also  discussed  in
Chapter V.   Chapters  VI through X provide  the specific recommended analyzer
specifications, and Chapter  XI provides specific evaluation test procedures
that may  be used  to  verify  the  performance of  the equipment.  References
used in Chapters I through V may be found at the end of Chapter V.

-------
                                     15
II. Overview and Conclusions

The purpose of this study was to determine  if  an  improved  exhaust  inspection
analyzer was needed  for use in  Inspection  and Maintenance programs,  and  to
recommend  a set of  instrument specifications  that  could  be adopted  by the
State as a minimum instrument requirement.  The purpose  of  this  report is  to
provide the  States  implementing I/M with EPA's recommendation  for a minimum
analyzer specification, and to set forth the benefits and  issues surrounding
State adoption of this  specification.

Before any issues on analyzers could be discussed, it was found necessary
to categorize the various types of field analyzers.  This categorization
process resulted in three distinct types of field analyzers — 1) the
centralized inspection analyzer, 2) the decentralized inspection analyzer,
and 3) the repair analyzer.  It is the two types of inspection analyzers
(central and decentral) that are of concern here.  Therefore, the subsequent
analyses in this document address only the inspection aspects or
capabilities of field analyzers.  The quality of, or minimum specifications
for, repair analyzers is left to market forces or to specifications
determined by trade associations, such as the Equipment and Tool Institute
(ETI).

Data validity and the need for minimum quality inspection analyzer specifi-
cations were the first issues analyzed.  The analysis investigated the basic
functions  for which  the  data will  be used  (i.e.  pass  or fail  vehicles,
etc.), and some consequences that  may result if that data  is incorrect.  The
status  and  condition  of  current  in-use  analyzers  were evaluated,  and the
alternative  of not  specifying minimum requirements  was  discussed.  Finally,
policy  precedents   in  other  fields  of  emission  measurement  programs  were
reviewed for similarities.

Chapter III provides more detail on these analyses.  Briefly, the conclusion
is:  the accuracy of the test results is an important consideration in
operating an effective I/M program.  The consequences of bad data not only
affect consumer protection, but affect the ability to judge the
effectiveness of the programs.  Both EPA and NHTSA studies indicate that
there is substantial variation between in-use analyzers (up to 35%) 1/, and
that a substantial number of analyzers in the field are inaccurate or not
repeatable.  One NHTSA study 2/ indicates that the proportion of analyzers
in the field producing erroneous data is over 90 percent.  From these data,
the problems associated with no uniform specification, and the precedents of
other programs, the staff concluded that some minimum specification on the
quality of emission inspection analyzers was needed.

With  a decision that a  minimum specification was  needed, the staff was faced
with  developing  a  technique that  would allow  a determination  of the factors
that  should be  included  in a  minimum  specification.    An  error propagation
 / References to be found at the end of Chapter V, page 64.

-------
                                     16
model  (described  in Chapter IV) was developed  that  allowed  such  a determina-
tion.   Several current  analyzer  specifications, most  notably the State  of
California  BAR 80 (Bureau of Automotive Repair), and  the Equipment and  Tool
Institute  (ETI)  Recommendations were  evaluated  against this model.  All  of
the  specifications evaluated  (including  the  two  mentioned)  were found  to
lack  critical  guidelines in two key areas:  1) control  of the  analyzer  oper-
ator's  procedures,  and 2) detailed evaluation testing  procedures.  The  lack
of these guidelines, in addition to concerns about many specific technical
requirements, led to the development of an EPA analyzer specification.

A draft of  the EPA specifications was circulated  to the equipment manufac-
turers  for  comment.   Their comments were  reviewed  within the  context of the
considerations presented in this report.  The final EPA analyzer specifica-
tion  recommends  two  distinct inspection analyzer types — a  manually  oper-
ated  model,  and  a computer  (microprocessor) operated model.   The quality of
the data generated by either type would be the same, provided that the
operator of the manual system followed correct procedures and checking
frequencies, and maintained and reviewed log books to identify long term
trends in equipment condition.  These specific technical recommendations are
found in
Chapters  VI through  XI.  Some of the more significant differences  between
these   recommendations  and  other specifications  are  found  in  Table  II-2
located  at  the  end  of  this chapter.   A generalized comparison  between the
EPA  recommended  specifications,   the   BAR 80  specifications, and  the  ETI
recommendations  can be found in Tables II-3,  II-4, and II-5 also at  the end
of  this chapter.   A final  comparison  (Table II-6)  between  the EPA recommen-
dations  and 207(b) requirements follows Table  II-5.

The staff recognized that the cost of the recommended equipment, and the
possible  economic burden on  those  purchasing  such equipment  would be  an
important influence on the final equipment  recommendations.   Sections  E and
F of  Chapter IV  review  this issue in  more detail.   What the staff found was
that  inflation has increased  the cost of current  analyzers  substantially.
Analyzers  that cost  $2000  to  $2500  in 1975, now  cost $3000 to $3800  (see
Table  IV-8).   Using  those analyzers which meet  the State  of  California BAR
74  specifications as  a baseline,  the staff determined  that  an  average incre-
mental  retail  price increase of a BAR  80 analyzer would be  around $975.   The
estimated incremental retail price increase over the  cost of a  BAR  80 ana-
lyzer  would be   approximately  $720  for the EPA manual  system,  and  around
$1700  for the  computer  system  (See Table  IV-11).   Production quality  audit
testing  for the  EPA  system is  expected to  increase the retail  price by an
additional  $20.   These  estimated incremental  price increases result  in the
estimated average retail prices presented  in Table  II-l.

Although some  may claim that  these  equipment  prices  are  prohibitive,  the
staff  identified  that the yearly amortized  cost over  a 5 year period repre-
sented  a very  small  fraction of  a  centralized  contractor  fee or the yearly
gross income for most service centers.  Data taken from a recent DOT report
to Congress 3/ indicate that the estimated retail price of $6850 for a
computer inspection analyzer represented less than 2.5 percent of the yearly
gross income for over 77 percent of the vehicle repair industry (paint and
body shops excluded).  The staff considered this burden on decentralized
inspectors  to  be minor, but even  this minor  burden would  be  reduced rather
substantially  by the  following real  factors:  depreciation credit  on income

-------
                                     17
tax, first  year  investment credit on income tax, allocation  of a portion  of
the  inspection  fee to  defray the  cost  of the  analyzer,  and potential in-
creases  in  business due  to mandatory  I/M vehicle  repairs.   Based on  this
limited economic  study,  the staff found no reason  to alter  the basic tech-
nical recommendations due  to economics.
                                  Table II-1

Specification                           Estimated Average Retail Price
                                                 (1980 dollars)

BAR 74                                            $3400
BAR 80                                            $4350
EPA Manual Operation                              $5120
EPA Computer Operation                            $6820

A factor that may concern some about the EPA recommendations is that one
cannot immediately purchase an inspection analyzer that meets these specifica-
tions.   The  technology to  meet these  specifications  is readily available,
and the  manufacturers have  estimated  a 0 to  9  month  delivery  schedule  for
the manual system,  and 9 to  18  months for the computer  systems.   One major
manufacturer  already  markets a  microprocessor based  analyzer,  and  another
major manufacturer has indicated it will do so shortly.

Practical  issues  such as   the  usefulness  of  analyzers  currently in-use
(grandfathering),  and the availability of  newer technology  equipment domi-
nated the final issue that was evaluated — State adoption  of the EPA speci-
fications.  The staff developed  guidelines which could be used by a State to
assure that  some quality  of grandfathered analyzers is maintained.   Details
of the guidelines  can be found  in  Chapter V.   The staff favors a  phase-in/
phase-out strategy  to bring about an  orderly  transition to  the newer tech-
nology.   The phase-in  portion  of  the  plan  could  be .implemented  through
purchasing policies.   In  this strategy any orders  for new,  replacement,  or
additional analyzers would only  be issued  for  new technology  analyzers.   The
phase-out portion  of the  plan  would  simply  set a  date beyond which older
technology analyzers  could  not  be used  for  inspection purposes.   The date
could be  established  based on the estimated  5 year useful life of the ana-
lyzer (see Chapter V).

The EPA  staff  strongly  recommends the  computer analyzer  for decentralized
programs  (an example  of a computer analyzer which EPA has purchased is shown
in Figure II-l).   This recommendation  is  in harmony with a recent  report by
Booz  Allen to the California Legislature  on  I/M program options  (March  21,
1980).   For  the  decentralized  option,   the  Booz  Allen  report   concluded
"...Foremost  is the  requirement  that  all decentralized inspection stations
be  equipped  with  a  sophisticated analyzer  that  would be  'examiner  proof.'

-------
                     COMPUTER OPERATED EXHAUST ANALYZER

[Drawing of the computer operated exhaust analyzer, with callouts for:
visual output; cutpoint selector switches or decal; operator input
(optional); sealed access for audit or service adjustments; printed output;
computer controlled function switches (gas span, test, etc.); and manual
function switches (on/off, pump, indicators, etc.).]

                                 Figure II-1

-------
                                     19
                                  Table II-2

            Significant Features of  the Technical  Recommendations
A. Manual Operation Analyzers

1. Analyzer Calibration  Curve:   The analyzer calibration  curve requirements
are more statistically sound than other  published  specifications.   Since  the
calibration curve  sets  the basic accuracy  of the analyzer,  it is important
to measure  the  true performance when attempting  to  verify compliance  to  the
specifications.  Additionally  these recommendations  require  bette'r accuracy
than other  specifications at the 207 (b)  emission  warranty levels  of 220  ppm
HC and 1.2% CO, and the Heavy-Duty  idle  standard  of  0.47%  CO.

2.  Analytical Gases:  All analytical gases must be traceable to National
Bureau of Standards Standard Reference Materials (SRMs) or to a standard
approved by the EPA Office of Mobile Source Air Pollution Control.

3. Analyzer Spanning Concepts:   Analyzer spanning is  the  process  of adjust-
ing the calibration curve into the  proper frame of reference.   True spanning
involves  the  use  of a  known concentration  analytical gas  as a reference
point when adjusting the analyzer.
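In other words, the span adjustment scales the analyzer's output so that the
known gas reads at its certified concentration.  A minimal sketch, with
illustrative names and values:

```python
def gas_span_factor(certified_conc, analyzer_reading):
    """Gain correction that maps raw readings back onto the calibration curve."""
    return certified_conc / analyzer_reading

# Example: a 1.2% CO span gas reads 1.14%, so subsequent readings are scaled up.
span = gas_span_factor(certified_conc=1.2, analyzer_reading=1.14)
print(f"span factor {span:.3f}; raw 2.28% CO corrects to {span * 2.28:.2f}% CO")
```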

An  approximation  of  true spanning which is  rather  common on  current field
analyzers  involves the  application  of   a  reference  voltage  level  to  the
electronic  circuitry.   Known as  electrical spanning,  the  reference voltage
is supposed to  be equivalent to  the  electrical signal  that  would be  gener-
ated by  a known  analytical gas  at that reference  level, if  that  gas  were
introduced  to  the  optical  bench  in  the  analyzer.   The problem  with  the
electrical  span is that the voltage  level  used implicitly assumes that  the
relationship  remains  constant  between the  true gas  signal and  the reference
level voltage.  That relationship,  in  fact, does  not remain constant,  and is
influenced by many factors.  Some  of  these factors  include barometric pres-
sure, ambient temperature,  sample  temperature, system leaks,  and  the  physi-
cal age or cleanliness of the optical  bench.

The change in the  relationship between the  electrical span and  true spanning
(or gas spanning) can take  place very  rapidly from vehicle to vehicle, or it
can take place  over a long  period of  time as  the  analyzer  slowly gets  dirty.
To account  for changes in  the analyzer's response to span gas, in a labora-
tory  situation  the analyzer  is spanned  before  and  after each  test point
(approximately  every 5 to 20 minutes).

Gas spanning  every  4 hours  was recommended  for field inspection analyzers as
an  optimum compromise between  the necessary frequency of spanning and  gas
use.  Based on the projected gas use and the size of analytical gas
cylinder recommended, span gas cylinders should last between 8 and 10
months.  The average cost of the span gas works out to a fraction of a
penny per vehicle.  This recommended gas span check frequency compares to a

-------
                                    20
                           Table II-2  (continued)

once-per-week minimum requirement for 207(b) warranty coverage and a
once-per-month proposed EPA policy minimum.  Because the analyzer drift
characteristics are only specified for a 1 hour period, an electrical span
at least once each hour is recommended to cover the time between the 4 hour
gas span checks.
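The fraction-of-a-penny figure is easy to verify.  A rough sketch, in which
only the 8 to 10 month cylinder life comes from this report; the cylinder
cost and lane volume are assumed figures:

```python
cylinder_cost = 50.0        # dollars per span gas cylinder, assumed
cylinder_life_months = 9    # report estimate: 8 to 10 months
vehicles_per_month = 4000   # assumed high-volume inspection lane

cents_per_vehicle = 100 * cylinder_cost / (cylinder_life_months * vehicles_per_month)
print(f"{cents_per_vehicle:.2f} cents of span gas per vehicle")  # ~0.14 cents
```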

Several practical options that would allow less frequent gas spanning, and
hence less use of the analytical gas, are offered.  For those units which
incorporate features allowing the less frequent gas spanning option, it is
recommended that the minimum gas spanning frequency be at least once a week.
Once a week gas spanning is a requirement for 207(b) warranty coverage.

4.  Evaluation Tests:  The recommended specifications provide detailed test
procedures  that can  be  used  to  evaluate  the capabilities  of  a  candidate
analyzer.   Including such test procedures avoids many  of  the problems  asso-
ciated  with interpretation  of  a  set  of  specifications.   The tests are  de-
signed  primarily  for  laboratory  check-out of the candidate equipment.   No
other  published  instrument  specification  provides  sufficiently   detailed
procedures  that can assure unbiased equipment  qualification.

5.  Procedures:  The recommended specifications require that the analysis
system  include all  necessary  equipment  that will  allow  the  operator  to
perform  the  necessary  maintenance and  testing  procedures  correctly.    The
manufacturers  are  allowed the  flexibility to fit  the  procedures   to  their
equipment,  but  specific  functions   that  the  procedures must  address  are
required.

6.  System Leak Checks:  Leaks in the sample system are probably the source
of the largest and most frequent errors that occur in emission measurement
systems.  This is because a leak translates directly into an error (i.e.
a  15%  leak  is  a 15%  error).   Most  laboratories have rigid procedures  for
leak checking of  analysis systems, and  the  process of searching for a leak
can be very  time consuming.
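The one-to-one relationship between leak size and measurement error follows
from simple dilution: air leaking into the sample stream displaces exhaust
before it reaches the detector.  A short sketch, with hypothetical readings:

```python
true_co_pct = 2.00     # actual tailpipe CO concentration (hypothetical)
leak_fraction = 0.15   # 15% of sample flow is leaked-in ambient air

# Ambient CO is essentially zero, so the sample is simply diluted.
measured_co_pct = true_co_pct * (1.0 - leak_fraction)
error = (true_co_pct - measured_co_pct) / true_co_pct
print(f"measured {measured_co_pct:.2f}% CO, reading {error:.0%} low")  # 15% low
```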

I/M  analysis systems  have special problems  that the laboratory systems do
not.   First,  laboratory systems are  generally not  moved  around; I/M systems
in  decentralized  systems  are.  Second,  laboratories have special  equipment
to  identify  leaks,  the expertise to  use  the  equipment,  and  the knowledge to
repair leaks inside the systems; the I/M operator generally does not have
this knowledge available.  Third, the  laboratories generally have  a strong
commitment  to prevent  and repair  leaks,  and   subsequently provide  resources
of time and money for this commitment; the independent  decentralized inspec-
tor may not  have this commitment, and  may  not be  able  to  apply  the  resources
to  it.   Fourth,  the flow  rates used  in  I/M systems  are  so small that even a
10% leak is  difficult  to  measure without  laboratory  equipment,  let  alone a 2
or 3%  leak.

Most  other specifications do  not  provide for a  routine leak check, and if
they  do provide  for  the necessary  equipment,  the equipment  is   usually  a
tapered  tube  flow meter,  or  the equipment  to  perform a  vacuum decay test.  A

-------
                                     21
                            Table  II-2 (continued)
tapered  tube  flow meter is not  practical for field use.   It is a high main-
tenance  item  if built  into  the system,  and tends to stick from a combination
of hydrocarbons and  water.  This is  true in a laboratory  situation even with
an extremely  high filter changing  frequency.  If  the  flow  meter is used to
monitor  system response time  as well,  the capacity of the flow meter is too
large  to read leaks without  a 15-20% error in the reading.   With the vacuum
decay  method,  it is  difficult  to  verify,  in  the field,  the  relationship
between  the  vacuum decay time and  the amount of emission measurement error.
Further,  the  vacuum decay  test  is  somewhat  dependent  on  the gauge location
in  the  system  relative to  the  location  of the leak,  the gauge  damping and
protection,  and  the  condition  of  the particulate  filter(s) (i.e.  a dirty
filter has  more pressure drop than  a clean  one).   Primarily for these rea-
sons, a flowing span gas leak check (through the sample line) is recommended
on a weekly basis.  (Such a check is also a requirement for the 207(b)
warranty.)
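A sketch of the recommended gas comparison check: the same span gas is read
once directly at the analyzer and once through the entire probe and sample
line, and the two readings are compared against the 3% tolerance.  The
function and variable names here are illustrative, not from the
specification:

```python
def passes_leak_check(direct_reading, through_probe_reading, tolerance=0.03):
    """Gas comparison leak check: is loss through the sample line within 3%?"""
    loss = (direct_reading - through_probe_reading) / direct_reading
    return loss <= tolerance

print(passes_leak_check(1.20, 1.17))  # True:  2.5% loss, within tolerance
print(passes_leak_check(1.20, 1.10))  # False: 8.3% loss indicates a leak
```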

B. Computer Operated Analyzers

1. Quality:  Same features as the manually operated analyzer.

2. Fail Safe Systems:  The recommended specifications provide features that
prompt  and  assist  the operator in  performing  the  necessary procedures re-
quired to obtain  valid data.   In many cases, these features  are "one button"
operations with the  analyzer  automatically performing the check.

C.  Future Improvements

1. A discussion  which provides manufacturers with long  term  goals for im-
provement  of  field  analyzer   technology  has been  included   in  this report.
These  recommendations  have  no  impact  on State adoption  of the recommended
specification in  this report,  but  are  provided  to  help   stimulate  major
improvements in exhaust analyzers.  It would be desirable to include these
technology improvements in the basic recommended specifications; they could
be incorporated into the attached recommendations if the technology were
less costly or more widely available.  For more information, see Chapter X.

-------
                                  Table II-3
                        Comparison of Operator Features
                                  (overview)

I. Warm-up
     Purpose:  Stabilize the analyzer performance parameters before use.
     EPA Computer Operation:  Time not specified; based on an analyzer
        parameter, but the analyzer cannot be operated until warm-up is
        achieved.
     EPA Manual Operation:  Same as computer analyzer.
     BAR 80:  15 min.
     ETI:  30 min.

II. Gas Span
  A. Equipment supplied
     Purpose:  Allows operator to gas span.
     EPA Computer Operation:  Yes, integral system.
     EPA Manual Operation:  Yes, integral system.
     BAR 80:  Supplied by MFG; must meet minimum functional requirements.
     ETI:  Supplied by MFG; no requirements.
  B. Procedures
     Purpose:  Tells operator how to gas span.
     EPA Computer Operation:  Automatically controlled by computer.
     EPA Manual Operation:  Yes, manual.
     BAR 80:  Universal manual procedure.
     ETI:  No.
  C. Electrical Span Correction available to the operator
     Purpose:  Adjusts electronic span to track gas span.
     EPA Computer Operation:  Yes, automatic.
     EPA Manual Operation:  Yes.
     BAR 80:  Yes, add on.
     ETI:  Optional add on.
  D. Range Correction available to Auditor
     Purpose:  Allows correlation between ranges.
     EPA Computer Operation:  Yes.
     EPA Manual Operation:  Yes.
     BAR 80:  No.
     ETI:  No.
  E. Responsibility for frequency of check
     Purpose:  Insure proper operation.
     EPA Computer Operation:  Automatically controlled by computer.
     EPA Manual Operation:  Controlled by operator (log book).
     BAR 80:  Controlled by operator (log book).
     ETI:  Not specified.

III. Leak Check
  A. Equipment supplied for gas comparison
     Purpose:  Allows Auditor to quantify a leak.
     EPA Computer Operation:  Yes.
     EPA Manual Operation:  Yes.
     BAR 80:  No.
     ETI:  No.
  B. Procedures
     Purpose:  Tells operator how to leak check.
     EPA Computer Operation:  Check automatically controlled by computer.
     EPA Manual Operation:  Supplied by MFG and must meet minimum functional
        requirements.
     BAR 80:  No.
     ETI:  No.
  C. Responsibility for frequency of field check
     Purpose:  Insure proper operation.
     EPA Computer Operation:  Automatically controlled by computer.
     EPA Manual Operation:  Controlled by operator (log book).
     BAR 80:  No.
     ETI:  No.

IV. Field HC Hang-up Check
     Purpose:  Prevents contamination from altering readings.
     EPA Computer Operation:  Yes, automatically controlled by computer.
     EPA Manual Operation:  Yes, test performed by the operator.
     BAR 80:  No.
     ETI:  No.

V. Test Value
  A. Time Averaged Tailpipe Sample
     Purpose:  Provides more accurate, consistent results; prevents
        operation errors.
     EPA Computer Operation:  Yes.
     EPA Manual Operation:  Yes.
     BAR 80:  No.
     ETI:  No.
  B. Determine Pass/Fail
     Purpose:  Prevents operator errors.
     EPA Computer Operation:  Yes.
     EPA Manual Operation:  No.
     BAR 80:  No.
     ETI:  No.
  C. Results printed
     Purpose:  Provides consumer a receipt.
     EPA Computer Operation:  Yes.
     EPA Manual Operation:  Optional.
     BAR 80:  No.
     ETI:  No.

VI. Anti-Tampering
     Purpose:  Prevents intentional altering of analyzer.
     EPA Computer Operation:  Yes.
     EPA Manual Operation:  No.
     BAR 80:  No.
     ETI:  No.
-------
                                      24

                                  Table II-4
                    Comparison of Performance Specifications
                                  (overview)

0 Accuracy of Calibration Curve
     - High Scale
          EPA Recommendation:  5% Read. @ 90% C.L.
          BAR 80:  3% fs
          ETI:  10% Read./5% Read. (L)
     - Low Scale
          EPA Recommendation:  5% Read. @ 90% C.L.
          BAR 80:  3% fs (L)
          ETI:  10% Read./5% Read. (L)
     - Temp. Range
          EPA Recommendation:  35° - 110°F
          BAR 80:  35° - 110°F
          ETI:  35° - 55°F/55° - 85°F

0 Accuracy of Audit Gases
     EPA Recommendation:  1% Traceable to NBS Standard Reference Material
        (SRM). (MVEL procedure)
     BAR 80:  Traceable to California Standards. (L)
     ETI:  2% Traceable to NBS. (Traceability procedure not specified)

0 Accuracy of Span Gases
     EPA Recommendation:  2% Traceable to NBS SRM. (MVEL procedure)
     BAR 80:  Traceable to California Standards. (L)
     ETI:  2% Traceable to NBS. (Traceability procedure not specified)

0 Drift
     - Zero
          EPA Recommendation:  2% fs L.S. 1 hour
          BAR 80:  3% fs 2 hours
          ETI:  2.4-3% fs L.S. 1 hour
     - Span
          EPA Recommendation:  2% fs L.S. 1 hour
          BAR 80:  3% fs 2 hours
          ETI:  3% Read. 1st hour; 2% Read. 2nd hour (L)

0 HC Hang-Up
     EPA Recommendation:  Less than 20 ppmh (5% fs L.S.) before each test.
     BAR 80:  Less than 200 ppmh (10% fs H.S.) in 15 sec. after 2 min.
        sample.  For evaluation test only. (L)
     ETI:  Less than 200 ppmh (10% fs H.S.) in 30 sec. after 1 min. sample.
        For evaluation test only. (L)

0 Interferences
     - Gaseous
          EPA Recommendation:  3 Items @ 1% each
          BAR 80:  6 Items @ 2.5% each (L)
          ETI:  5 Items @ 2.5% each
     - Electrical
          EPA Recommendation:  6 Items @ 1% each
          BAR 80:  5 Items @ 2.5% ea.
          ETI:  no criteria specified

0 Leaks
     EPA Recommendation:  3% of comparative gas readings, weekly.
     BAR 80:  2.5% of reading for evaluation test; frequency of field check
        not specified. (L)
     ETI:  optional, no value specified. (L)

0 Operating Environment
     - Temperature
          EPA Recommendation:  35° to 110°F
          BAR 80:  35° to 110°F
          ETI:  35° to 105°F
     - Relative Humidity
          EPA Recommendation:  0% to 100% (raining)
          BAR 80:  0 to 85% (L)
          ETI:  0 to 85% (L)

fs = full scale                         (L) = Less stringent than EPA
H.S. = High Scale                             recommendation.
L.S. = Low Scale                        (M) = More stringent than EPA
Read. = Analyzer reading                      recommendation.
MVEL = EPA Motor Vehicle Emission
       Laboratory, Ann Arbor, Mich.

-------
                                      25

                            Table II-4 (continued)
                    Comparison of Performance Specifications
                                  (overview)

0 Probe
     EPA Recommendation:  16 inch with tailpipe extender.
     BAR 80:  12 inch with tailpipe extender. (L)
     ETI:  Length not specified. (L)

0 Propane/Hexane Conversion Factor
     EPA Recommendation:  .48-.56 @ 90% C.L.
     BAR 80:  .500-.540 (M)
     ETI:  .46-.58 (L)

0 Response Time
     EPA Recommendation:  14 seconds to 95% of Reading
     BAR 80:  8 seconds to 90% of Reading (M)
     ETI:  10 seconds to 90% of Reading (L)

0 Vehicle Check
     EPA Recommendation:  Equivalency Test; precision, slope, and mean value
        comparisons.
     BAR 80:  Comparison test, no regression of data. (L)
     ETI:  Test procedure or acceptance criteria not specified. (L)

fs = full scale                         (L) = Less stringent than EPA
H.S. = High Scale                             recommendation.
L.S. = Low Scale                        (M) = More stringent than EPA
Read. = Analyzer reading                      recommendation.
MVEL = EPA Motor Vehicle Emission
       Laboratory, Ann Arbor, Mich.

-------
                                      26

                                  Table II-5
                      Comparison of Specification Features

Item                               EPA Recommendation      BAR 80          ETI

0 Automatic Data                   Yes                     No              No
  Collection Option

0 Anti-tampering                   Yes, computerized       No              No
                                   model

0 Computer Control                 Yes                     No              No
  Specifications

0 Evaluation Test                  Yes                     No              No
  Procedures

0 Loaded Mode Option               Yes                     No              No

0 Production Line                  Yes                     Random audit,   No
  Audit Plan                                               no limit*

* BAR personnel are restricted on out-of-state travel; this travel policy
  negates the audit plan.

-------
                                      27

                                  Table II-6
                       Comparison of EPA Recommendations
                     versus 207(b) Minimum Specifications

0 Accuracy of the calibration curve
     - HC
          EPA Recommendation:  5% of Reading
          207(b):  ±15 ppmh @ 200 ppmh (7.5% of 200 ppmh)
     - CO
          EPA Recommendation:  5% of Reading
          207(b):  ±0.1% @ 1% (10% of 1%)

0 Drift
     - HC
          EPA Recommendation:  2% fs L.S./1 hr.
          207(b):  ±15 ppmh/1 hr. (3.75% fs L.S./1 hr.)
     - CO
          EPA Recommendation:  2% fs L.S./1 hr.
          207(b):  ±0.1%/1 hr. (5% fs L.S./1 hr.)

0 HC Hang-Up Check Frequency
     EPA Recommendation:  Before each test
     207(b):  not required

0 Leak Check
     - Type of test
          EPA Recommendation:  gas comparison
          207(b):  gas comparison
     - Tolerance
          EPA Recommendation:  3% of Reading
          207(b):  3% of Reading
     - Frequency
          EPA Recommendation:  weekly
          207(b):  weekly

0 Response Time
     EPA Recommendation:  14 seconds to 95% of Reading
     207(b):  15 seconds to 95% of Reading

0 Span Check Frequencies
     - Gas Span
          0 Recommended
               EPA Recommendation:  twice daily (4 hr intervals)
          0 Minimum
               EPA Recommendation:  weekly
               207(b):  weekly
     - Electrical Span
          0 Minimum
               EPA Recommendation:  1 hr interval
               207(b):  1 hr interval

-------
                                     28


III.  Minimum Quality Analyzers — Are They Needed?

A.  Introduction

There appear to be many conceptions about what an emission analyzer is and
what it can and cannot do.  Much of what an emission analyzer can do is
related to the function for which it was designed.  For instance, a
laboratory analyzer is generally designed to be used within the controlled
environment of the laboratory building, to be operated by knowledgeable
scientists, engineers, or technicians, and to be maintained on a regular
basis.  Under these conditions the laboratory analyzer can perform the
function for which it was intended — precise emission measurements.

Current field emission analyzers were designed more as vehicle repair
equipment than as emission inspection equipment.  The advent of the I/M
programs along with the concept of centralized and decentralized inspection
systems provides a challenge to the previous design goals for this equip-
ment.  In centralized programs, the function of the emission analyzer is
mainly that of vehicle inspection.  For that purpose, most programs expect a
minimum level of quality of data from the equipment in terms of accuracy,
repeatability, etc., and the design goals for this equipment are less sub-
ject to dispute.  When implementing a decentralized program, however, the
distinction between analyzer functions becomes blurred by the design
philosophy behind the analyzers currently found in service centers (i.e.,
a history of repair, not inspection).

With the implementation of I/M programs, there now emerge three distinct
types of field emission analyzers — 1) the centralized  inspection analyzer,
2)  the  decentralized inspection analyzer, and 3)  the  repair analyzer.  The
quality  of  the inspection  equipment and  subsequent  data should be the same
for  both centralized and decentralized programs.   The repair analyzer will
probably  find  the  greatest use  in centralized  programs.   Although some
independent  vehicle  repair centers may opt for the more accurate decentral-
ized inspection analyzer for repair work, it is suspected that many will use the
lower cost  repair  analyzer.  Estimates suggest there may  be  as many as 5 to
10  repair  analyzers  for  each inspection  analyzer  or  inspection lane.  The
quality  of  these  repair   analyzers  would,  of  course,  be  judged  by the
centralized  inspection analyzer during the inspection or  reinspection  test.
For  the  repair  analyzer,   the  quality  of  the  equipment,   therefore,  is
probably  best  left  to market forces (i.e. number of  retests failed by the
inspection  analyzer),  and   the equipment  specifications  suggested by the ETI
would  be  recommended   for  those analyzers.   However,  because  the driving
function  for maintenance  in an I/M program  is  the  inspection,  this  report
will  focus on  only  the inspection function  and  capabilities of field ana-
lyzers.

This chapter deals with the issue of analyzer quality.  Since
analyzer  quality  is rarely  an issue  in  centralized  programs  (i.e.  most
central  programs   choose  the best quality  analyzer available),  the quality
and  capabilities   of  the  decentralized equipment will  receive the most em-
phasis.   Because  there has  not yet  been  a widespread  acceptance  of the
distinction between the function of a decentralized inspection analyzer and a

-------
                                    29
decentralized  repair  analyzer*,  an  analysis of  the quality  of field  ana-
lyzers must look at the inspection capabilities of equipment  that was  possi-
bly designed for repair functions.

B. Uses of Test Data

In  order  to  assess  the  importance  of  the validity  of  the  data  from  the
inspection analyzer,  knowledge of the end use of that data would be benefi-
cial.   The immediate use  of  the data generated  by the  inspection analyzer
would,  of  course,  be  to determine whether  a vehicle passed  or  failed state
inspection standards.   The  degree of failure would  help determine the  min-
imum level and type of vehicle maintenance required  in order  to  pass,  hence,
the  minimum  cost  to  the  consumer would to an  extent  be determined  by  the
analyzer.  The  absolute value of the failure  could also  initiate  the emis-
sion repair  warranty  provisions of  Section  207(b) of the Clean  Air Act.   If
a Heavy-Duty gasoline-fueled vehicle  I/M program existed,  the  absolute value
of a failure on 1984  and later model  year vehicles  could  form  the basis of a
recall  action.   And  finally,  the number of vehicles failing would provide
input into assessing  the effectiveness of the I/M program.

On a qualitative basis, the uses  of  this data are important.   To some  extent
the  degree of  importance  of  the  data  is  based  on  the  seriousness  of  the
consequences due to incorrect  or bad  data.

C.  Consequences of Bad Data

The  consequences  of bad data  involve risks  to the  consumer,  the automobile
manufacturers,  and the environment.  When  considering consumer protection,
only errors  of commission (reading  high) are normally assumed to be a prob-
lem.  Certainly,  incorrectly high readings  are  a  problem to  the individual
consumer.  Data from a recent NHTSA study 2/ suggests at least 16 percent of
the analyzers currently in the field read high by more than 15 percent.
Near  the lower cutpoint  levels, some  of  the units tested  read high  by as
much as 40 percent.  Translated  to  the  consumer's   risk,  this  data suggests
at  least  16  percent  of the  public  is in danger of  receiving an incorrectly
high assessment  of their  vehicle's  emissions.   The consumer  is hurt  in two
ways  by these  high readings.   First,  his   vehicle  could fail  the I/M test
when it potentially should not have.  The consumer is out the time and
effort involved in repairing a vehicle that potentially did not need repair,
as well as the time and effort required for a retest.  Secondly, the con-
sumer will be  out the money  it cost  for  the potentially unneeded mainte-
nance.

For  the  consumer,  though,  errors   of  omission  (reading low)  are just as
important  as  errors of commission (reading  high).   Data  from the same NHTSA
study shows  a parallel situation for instruments  reading low.  At  least 16
* The Equipment and Tool Institute (ETI) publicly proposed the same three
classes of field emission analyzers  (i.e.  centralized inspection, decentral-
ized  inspection,  and  vehicle  repair)  in  a presentation  given  at an  APCA
meeting in Detroit, Michigan, on  April  23,  1980.

-------
                                     30
percent  of  the analyzers  read low  by  more than  15  percent.   But, in this
case some of the analyzers tested read as much as 50 percent low.   This data
excluded the effects  of leaks in the  systems.   Leaks could easily increase
the magnitude  and  number of low  reading  analyzers.   Low readings  translate
into a potential fuel economy penalty for the consumer.  Even if the con-
sumer's vehicle were to pass the I/M test when it should have failed, the
vehicle still needs repair or maintenance.  Not getting proper repairs or
maintenance when the vehicle needs them costs the consumer and the nation
potential fuel economy savings.

Considering that at least 16 percent of the public is potentially affected
by erroneously high readings, and at least 16 percent of the public is
potentially affected by erroneously low readings, a significant portion of
the public (32 percent) could be adversely affected by incorrect analyzer
readings.  These estimations are based on the data from the NHTSA study 2/
which looked at only one of several variables in emission
measurement  (see   Chapter  IV,  Section  B,  Measurement Error  Sources).   If
anything,  these  estimates on  the omission and  commission  errors should be
conservative.  Even so, some may wish to comment that:  "So what if some
analyzers  read high and some  read  low!   They all balance  out in the end!"
Such statements  assume  that  the consumer only  gets  hurt  by high  readings.
So  they conclude  "Some  people pay  a little more  and  others get by without
paying  what  they  should."  First, that is not  a very equitable  position to
take; it must be remembered that the public consists of individuals.
Generally,  individuals  are willing  to pay  their fair share if they believe
that the costs are  allocated in a  fair  manner.   Individuals can be quite
hostile  if  they believe  that the costs  are unfairly  allocated.  But, the
more important point  is that  it  is not only the high reading  consumers that
are unfairly penalized, but that the low reading consumers are  also penal-
ized.   The  real  case  is  that erroneous  analyzer  readings  cost  everybody.

The  automobile manufacturers  need  not  be concerned  about  incorrectly low
readings.  Because of  the 207(b) warranty repair provisions,  the  automobile
manufacturers  are  concerned  about  errors of  commission  (high  readings).
Since the NHTSA data presented pertains to the analyzer characteristics, the
characteristics would  be  expected  to apply  equally to analyzers  measuring
207(b)  vehicles.   Under  these conditions the  manufacturer might  challenge
the  State  inspection program, or refuse  to  provide warranty  repairs.  The
latter is assumed to be illegal, but the dissension created by the manu-
facturer may   raise public concerns  about  the  quality and  fairness of the
State program.

Such  concerns  may be  exhibited  by  consumer "shopping"  between  inspection
lanes or centers.  For instance, in the Oregon program, there is no
inspection  fee until   the  vehicle  passes.   There  have  been  reports that
consumers  try  many test lanes until they  get the  vehicle to pass instead of
repairing the vehicle.  Or, such concerns could escalate into local TV and
newspaper exposés on the discrepancies in emission results between different
testing  lanes  (or  inspection  centers).

-------
                                    31
In fact, adverse exposés need not be centered only on discrepancies in
results; they could allege improper operation or use of the emission testing
equipment, thus claiming any test results from the claimed improper oper-
ation would be void.  Such a situation would more likely occur in a decen-
tralized program if that program did not have a strong set of technical
minimum specifications.  Take humidity operating limit specifications for
example.  Much of the equipment currently in the field was not designed for
accurate  measurements under high  ambient  humidity conditions.    The  instru-
ment  specifications  were designed   to  prevent warranty problems  at high
humidities,  not  for  accuracy.   Most  specifications have  a  stated  upper
humidity  limit  of  85 percent relative  humidity.   The BAR 80 specifications
also  use  this  limit.  The  implication of this  limit  is that performing  an
inspection  test at  ambient  humidities  higher  than  85  percent  constitutes
improper  operation  of  the  testing  equipment.   The  consequences  of this
situation could  mean  that anytime it was  raining  during an  inspection  test,
the  validity of  these  tests  could  be  challenged  by  the  consumer  or  the
automobile manufacturer.

Clearly, adverse publicity would give the State I/M program some head-
aches.  But without public belief that the program is credible, it is ques-
tionable whether the program will survive as an effective program.  This could
be one of the most severe consequences  of bad data.

-------
                                     32
D.  Current Analyzers

Before making a determination on the necessity (or lack thereof) of minimum
quality  analyzer specifications,  it  is  pertinent  to discuss  the  condition
and  the  capabilities  of  present  analyzers.   In  the  previous  section,  a
recent NHTSA study was referenced. 2/  This study evaluated the accuracy and
repeatability of various pieces  of vehicle repair equipment  (emission  ana-
lyzers,  timing  lights,  wheel balancers,  etc.).   The location  of  the equip-
ment that was evaluated included mass merchandisers, dealerships, indepen-
dent  repair  shops,  service stations,  specialty  repair shops,  and  diagnostic
centers.   The specific  locations  across the  nation  were  selected  on a random
basis  to be  statistically representative  of the auto repair  industry.   The
sites included States with and without in-use I/M programs.

The  study indicated  that only 16  percent of  the emission  analyzers tested
were  judged  to  be accurate.  The word  "judged"  is  used  to indicate that the
NHTSA  study  had to  expand   the  accuracy  criteria to  ±5 percent from  the
industry accepted standard of ±3  percent.   The  3 percent  figure  would  have
failed  significantly more  analyzers,  possibly  preventing  any of the  ana-
lyzers from being considered  accurate.

In  this  study,  accuracy was  defined  as the ability  of  the  analyzer  to  read
correctly  both  a high concentration  (1576 ppmh,  8.05%  CO)  calibration  gas,
and a  low concentration  (307 ppmh, 1.62% CO)  calibration gas.   The  gas was
introduced through  either a  span  port  (if available) or  through  the probe.
When  introducing gas  through the  probe,  precautions were  taken  to  prevent
leaks  from affecting  the accuracy determination.   No  determination of the
number  of systems with leaks was made, even  though  leaks are a significant
source of  measurement error.

The  other  test  performed  by the  study measured  the  repeatability  of  the
analyzers.  In  this  test,  the analyzer's capability to replicate readings of
the  same  high  and  low calibration gases was observed.  If  it  is required
that  those replicate readings be  accurate (±5%)  as well as  repeatable,  only
7 percent of the analyzers tested would be acceptable.

The statistical representativeness  of the survey  suggests that 93 percent of
the  analyzers  currently  in   the  field  would  fail  at least one of  the two
rather  simple tests on accuracy  or repeatability.   The  staff  considers  this
condition serious because, as Chapter IV will point out, there are many
factors  other  than  calibration  accuracy  and  repeatability that  affect the
ability  of the  analyzer to measure exhaust samples  correctly.

The  effects   of  some of these other  factors were highlighted  in  a previous
contract study  on field  analyzers  (work performed  by Olson Laboratories in
1976-1977). 5/  The following selected comments on these other factors from
that  study  suggest that  real world effects may  substantially  affect  the
ability  of current field analyzers to  accurately measure the  true level of
exhaust  emissions.

      1)  "The  operation of some HC/CO analyzers  can  be  severely  impaired by
      exposure to high temperature  environments.   In many cases,  span and/or
      zero adjustment at high temperatures is  insufficient for correct cali-
      bration.  Typical  failures  appear  to be a result of electronic malfunc-
      tion due  to overheating.   Most instruments exhibiting malfunctions at

-------
                               33
high  temperatures  will  recover  to normal when  operated again at  room
temperature.  Low temperature operation tends to cause the same type of
problem but the incidence is much  lower."

2)  "Humidity  variation  affects  the output  of  most HC/CO instruments.
Variations of ±10 percent RH can cause a significant change in output
level.  Operation  at  high altitude causes malfunctions  at high temper-
atures.  The principal detrimental operational effect of high altitude
is  insufficient zero/span  adjustment  to achieve  correct calibration.
Altitude compensation  has apparently not been successfully  implemented
by all manufacturers."

3)  "The  interference  effects of various noninterest  gases can be quite
significant at specific temperatures,  and carbon  monoxide   [ed.  -  pre-
sumed  to  be  dioxide]  are  the most dominant interference gases.   Some
analyzers exhibit  cross-sensitivity between  their HC and CO channels."

4)  "In general,  most  instruments showed response  times of  up  to 30
seconds under extreme  temperature  and humidity conditions."

5)  "In some  cases,  the  response  time  was  a  function  of the specific
design or sampling operation of the instrument.   The general trend in
response time  indicated a  range from slightly  less than 10 seconds to
approximately  30 seconds  for 100 percent  final reading."

6) "Instrument  zero drift at normal operating temperatures for a period
of  4  hours,  in many cases, can exceed ±3 percent  of  full scale.  Temp-
erature extremes  generally  serve  further to degrade  zero drift perfor-
mance."

7)  "Instrument warm up  times at normal room temperature were  generally
less  than 30 minutes.  Most state  specifications require 30-minute  warm
up  times  for  all  temperature and  humidity  conditions.   Both high and
low temperature tests  indicated longer warm up times  than those at  70°F
for most  instruments.   Temperature-related  drift at  the extremes may
account for the apparent  increased warm up times by masking  the warm up
performance with meter drift."

8)  "Prolonged  loaded  steady-state testing caused  early  filter degrada-
tion  on  some  instruments.   Water accumulation  in some lines occurred
during these tests.  The water accumulation  indicated, in general,  that
if an  instrument is to be used for a significant time in loaded steady-
state  or  high  rpm testing, an auxiliary sample conditioning or water
removal system  should be utilized  to achieve maximum  instrument perfor-
mance."

9)  "Instrument durability  in  sampling  service,  demonstrated  in  the
durability  tests,  ranged from early failure  to  completion   of approxi-
mately 1,000 hours of exposure to exhaust gases.  Long-term use under
exhaust conditions generally resulted in  repetitive filter replacement,
sample system degradation and blockage, water accumulation and analyzer

-------
                                    34
degradation resulting in loss of response, increased response times,
     increasing  drift  and instability.  Analyzer  and sample system designs
     were  most accurately  characterized  by  performance  observation  during
     durability tests."

In an effort to control the validity of inspection test data, the State of
California maintains a monthly audit of service center inspection analyzers.
This audit checks only the calibration accuracy at the high end of the
scale (lower concentrations are usually more difficult to measure accurately),
and does not check repeatability.  Consistently, 25 to 30 percent
of the analyzers checked  fail this simple audit test.

EPA's  own  limited  evaluations have also shown  problems.   One check on  auto
exhaust 1/ under nearly ideal conditions showed a 35 percent difference
between  two different  makes of analyzers.  This difference  resulted in  over
10% of the vehicles  passing on one  brand of  analyzer and  failing   on the
other.  Another very limited study 6/ suggested that the HC analyzer reading
could shift in as little as two hours of operation.  Disassembly and cleaning
of the  optical bench would rectify the  problem,  but this  would hardly  be  a
practical solution in the field.

The data  presented would suggest that  the  state  of  current analyzers could
generate a large amount  of invalid test data  in an  I/M program.

E.  Alternatives

Basically the  issue of minimum quality analyzer specifications comes down  to
two alternatives:  a) are they needed, or b) can we get by without them?   If
it is  concluded that some form of minimum quality  is needed, then the mech-
anisms  used to implement that minimum quality specification become a  policy
issue.  Such issues are  discussed in Chapter  V.

The previous  sections  suggest  that there  is a  reasonable case for  deter-
mining  that  some  sort  of minimum quality  is  needed.   But,  what about the
opposite alternative?   Are there advantages that could result if minimum
specifications were not required?

One possible  advantage  of not specifying minimum  requirements could be  that
the States  would  have  more flexibility to determine their own needs.  Al-
though  this option may  sound appealing, the  practical aspects may not be  as
desirable as anticipated.  First, it is possible that the number of indi-
vidual analyzer specifications would equal the number of states partici-
pating  in  I/M programs.  Practically though, there would probably be  suffi-
cient  commonality  to group the multitude of  specifications into maybe  5  or
10 distinct sets of requirements.  This assumption  is based on the fact  that
ancillary requirements of the current New York, New Jersey, and California
specifications  are sufficiently different,  such  that  they would not allow
the interchangeability  of analyzers between these  States.

This  lack of  commonality  of analysis  systems  poses  two  major problems.
First,  for each  set  of  specifications,  the interested manufacturers would
have  to design  a  special instrument  package or  modification  package  to  a

-------
                                     35
basic system.  Because  of  the assumed number of  specifications,  the market
share would  tend  to  be minimal.  Under these conditions, most manufacturers
tend to make  a single production run of those instruments, and then move on
to  the  next  set  of  specifications.   Such is currently  the case  for the
recently purchased New Jersey analyzers.

An immediate problem also occurs when these assumed 5-10 designs must be
designed and  manufactured  all  within  the  same time  frame  in order  to meet
current State implementation dates.  Possibly, the industry can accommodate
such an intensive  effort,  but the multiple  efforts  place  a  large burden on
the  industry.   Consider, for  instance,  that multiple  specifications would
tend to expand the manufacturers' analyzer product line from  one to possibly
10 analyzer  configurations.   That  means 10 different sets of documentation,
10 different  lines to stock in  inventory,  possibly  10  different qualifica-
tion requirements  and testing,  distribution problems,  etc.   The magnifica-
tion of manufacturing  and  distribution problems potentially  affects each
manufacturer  involved in supplying  this equipment.  In  the end,  the user
(State, contractor, or service center) pays for this multiplicity of effort.

Once in the field, a second set of long term problems begins to show up.
Because  of  the   assumed single  production run,  duplicates  or  additional
analyzers  needed  at  a  later date  may force the  customer  (State  or service
center) to wait  for  the next production run.  The  next  production  run may
not  occur  if  there  is  not  sufficient demand.   If a production  run does
occur,   it  would  be reasonable to assume that  the volume would be less than
the original run, possibly causing an increase in the retail  price.

Another  long  term problem  is  the  availability  of spare  parts  for  these
different  systems.   It  is  reasonable to expect that it would be profitable
for  the  manufacturers  to  maintain  spare  parts  for  the larger  volume
versions.   Without other  influences,  it  may  be less  profitable  for the
manufacturers  to maintain  as large an  inventory (or  even one at  all)  of
spare parts for the smaller  volume versions.  The user of the smaller volume
versions  ultimately  will   pay  for  these  factors   through either  forced
inventory  requirements,  longer  repair  times,  or  possibly the inability to
repair  the unit  at  all, thus,  necessitating the purchase  of a  new  unit.

F.  Precedents

Finally, before considering  recommendations on the issue of minimum specifi-
cations, it is always useful to see how others have approached this issue.
For  instance,  some states  have  established minimum  quality  requirements for
inspection  analyzers.   Both  California and  New Jersey  have had  vehicle
inspection  programs  for at  least  5  years.  California's  initial  program
covered only  change-of-ownership inspection, but  even so, California issued
minimum  specifications  known  as  the  "BAR 74  Specifications"  (Bureau  of
Automotive Repair).   The BAR 74  specifications included minimum verification
testing.  California  has exhibited a further need for minimum specifications
by  implementing  a more  stringent  set of specifications known  as "BAR 80".
New  Jersey,  on  the  other  hand, has  not published  a set  of specifications
like either the BAR 74 or the BAR 80; however, New Jersey does publish a list
of acceptable  analysis systems that may be used.  An analyzer is only placed

-------
                                    36
on  the New  Jersey  list after  it  is  tested  and judged  acceptable by  the
State.  Without  extensive  research, it would be  difficult  to know  the  exact
reasons why  these  states found  it  necessary  to  implement  minimum acceptance
criteria.  One  can  suspect  that the reasons have  some relationship to  the
consequences  of  bad  data,  especially  on  the equality  and fairness to  the
vehicle owners, or in a word —  consumer protection.

Precedent  is  also  established in the area of ambient air  monitoring.  Min-
imum  instrument  specifications  have  been  formally  established by  EPA,  and
manufacturers  must qualify  their  equipment through  extensive equivalency
testing.

G.  Conclusions and Recommendations

Probably  the  most  important aspects of the issue of minimum analyzer speci-
fications are consumer protection and the  ability to judge  the  effectiveness
of  the individual I/M  programs  in a  fair  and  equitable manner.    From  the
information available,  the  current analyzers do  not provide sufficient con-
fidence that  the consumer will be protected.  Without  outside  influences,  it
is doubtful  that this situation will improve.   Therefore, one is  left with
the  conclusion  that  some  sort  of minimum  requirement  is justified.    The
staff also recommends that, to be equitable, such minimum requirements should
be uniform.

Adoption  of  minimum quality  analyzer  specifications  presents  some  very
practical  problems.   The  specification  must provide data  that  is suffi-
ciently valid; yet at the same  time,  the specifications must  be achievable
within  the constraints  of technology and  leadtime.  Another practical  prob-
lem  associated  with  new specifications  is that some states  have  in  place
less stringent existing requirements.  How should equipment purchased in
good faith under these existing state guidelines be considered?   Finally, what
are  the  best ways   to  implement  the  specifications recommended  by  EPA?

None  of these practical questions can be answered,   however,  without  first
determining what the  exact EPA  recommended specifications  should be. In  the
succeeding  chapters, the  approach used  to  determine the  exact specifica-
tions,  and the  implications that those specifications may have on  I/M pro-
grams,  will   be  discussed.   Recommendations on  these issues  will  be pre-
sented, followed by  the  exact specifications  and  testing  procedures  that  can
be used to verify  compliance with these specifications.

-------
                                     37
IV.  The Inspection Analyzer

A.  Introduction

The process  of obtaining valid  emission measurements involves  two  critical
elements — the equipment and the operation of the equipment.  Many tech-
nical factors affect  the ability  of  the  equipment  to  measure  accurately even
when the equipment is new.  Other  factors affect the ability  of  the equip-
ment to  maintain  accurate measurements  with  age.   Certain practices in  the
operation of the equipment affect the validity  of  the data as well.

Because  the  I/M emission check  is  less  sophisticated than the  federal  driv-
ing  cycle  compliance checks,  it might be  expected  that the  requirements
necessary  to achieve acceptable  accuracy  and variability with  I/M  analysis
systems would  be  less sophisticated than  those used in  the  laboratory.   To
some extent this statement is true, but it is not totally true.  Consider,
for instance, that laboratory analyzers are operated by trained and skilled
technicians.   These  personnel  can  often  spot  problems  with  the  equipment
even before  they  happen.   Complete engineering departments are  constantly
checking  the  laboratory  emission  values against  past  values and design
goals.    Under  these  conditions,  errors in  measurement  can  be  detected  and
eliminated.   In a  sense,  the human mind is adding  considerable  sophisti-
cation  to  the  equipment.   The  typical  I/M analyzer operator  generally  has
not had  the  training or experience  of  laboratory  technicians and  engineers.
This training  problem is compounded even  further in decentralized  I/M pro-
grams.   Mechanics  in  a repair  shop just  cannot  be  required to understand the
gamut  of measurement problems.   Even  though  mechanics  must   by necessity
become  familiar with  the  I/M equipment,  there is an economic  incentive to
direct their technical skill toward  repairing vehicles.

Another  significant difference between  the  laboratory  analyzer and  the  I/M
inspection analyzer  is  the  environmental operating  conditions.   Laboratory
analyzers  are  never  operated  in hostile  environments.    They  are  generally
operated in heated and air conditioned buildings with humidity  control.  I/M
programs, on  the  other hand,  expect analyzers  to  operate from  California to
Massachusetts,  from  Texas  to  Wisconsin,  and   from  summer  to  winter  in  a
variety  of  enclosures  ranging  from  rain and  wind  shelters   to  permanent
structures.  This  variety  of hostile environmental conditions  places a much
greater  burden on  an I/M analyzer  than a laboratory analyzer  ever encoun-
ters.

The lack  of  measurement savvy by the  I/M analyzer operator, and  the signi-
ficant  variation  in  environmental  operating conditions  are just  two  among
many reasons that  would  suggest  an  I/M  analyzer should  be more  sophisticated
than a laboratory  instrument.

B.  Measurement Error Sources

Identifying  the specific  factors for  both   the equipment and   the  operator
that can influence data quality  involves looking at  the  broad aspects of the
measurement  process,  and  assessing the  interactions between  them.   Figure
IV-1 provides a simplified overview  of  this  process.

-------
[Figure IV-1.  Simplified overview of the measurement process.  Recovered
block labels: Factory Calibration; Field Spanning; Sample Handling System
Design; Operational Characteristics; Durability of System; Field Operation;
grouped as State of New Analyzer, Effect of Analyzer Age on Test Results,
and Effect of Operator on Test Results.  Proper Operation leads to Valid
Data, Improper Operation to Incorrect Data, and the data feed the Pass/Fail
Vehicle decision (Effect on Consumer).]

-------
                                     39
The following tables list some of the specific factors involved in each of
the  generalized  parameters shown in Figure  IV-1.   Tables IV-1  through  IV-4
deal with a new  analyzer,  Table  IV-5 with  durability  factors,  and  Table  IV-6
with those  factors  that are under  control of  the  operator  (or owner)  of the
equipment.  Each of  these  items  listed affects the validity  of the test  data
in some manner.

Mathematically  combining  the  listed   items  with standard statistical  tech-
niques allows  the  estimation of an overall  measurement error.  In order to
evaluate  the  effect of  individual items  listed  in  the following  tables,
various  error percentage  values  can  be  selected for each  item.  In  this
manner,  the  capability to accurately  measure  vehicle  emission levels  can be
determined  for  several levels of  technology or specifications,  such  as  best
available,  current  in-use, BAR  74, ETI, and BAR 80.   The effect on measure-
ment error of typical  failure modes can  also be evaluated.
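
As an illustration of this combination, the short sketch below applies the
root-sum-square rule for independent error sources.  The component names and
percentage values are hypothetical placeholders, chosen only to loosely echo
the "typical" operator errors of Table IV-7; they are not the actual inputs
to the model.

     # Illustrative root-sum-square (RSS) combination of independent
     # error sources.  Component values are hypothetical placeholders.
     import math

     error_sources = {                  # one-sigma error, % of reading
         "calibration curve":  5.0,     # assumed
         "field spanning":     8.0,     # assumed
         "HC hang-up":        20.0,     # assumed
         "leak effects":      10.0,     # assumed
         "reading the meter":  5.0,     # assumed
     }

     total = math.sqrt(sum(e ** 2 for e in error_sources.values()))
     print("Combined measurement error: about +/-%.0f%% of reading" % total)
     # Prints about +/-25%, comparable to the 25 to 35 percent range
     # discussed in Section C.
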
                                 Table  IV-1
               Factors Affecting the  Factory  Calibration Curve
                     1. Accuracy  of

                         a) NBS  Gases
                         b) Calibration Gases.

                     2. Precision (repeatability)  of  the Analyzer

                     3. Hysteresis of  the  Analyzer

                     4. Analyzer  Resolution
                                  Table  IV-2
                       Factors Affecting Field  Spanning
                     1. Accuracy  of  Calibration Curve

                     2. Pressure  in  the  Sample Cell

                          a) Altitude
                          b) Weather Fronts  (barometer)
                          c) Ambient Temperature Changes

                     3. Accuracy  of  the  Span Gas

                     4. Background Levels  (Zero Air)

-------
                    40
                 Table IV-3
Sample Handling System Design Considerations

    1. Sample Probe Design

    2. System Materials

    3. System Flow Path (Sample Cell Pressure Variations)

    4. Design and Function of Water Removal Device

    5. Particulate Filter Design
                 Table IV-4
         Operational Characteristics

    1.  Non-Interest Gas Interferences

         a) CO
          b) H2O
          c) NO2

    2.  Electrical Interferences

         a) RFI
         b) VHF
         c) Induction
         d) Line Frequency and Voltage Variations
         e) Static Electricity
         f) Ground Loops
         g) Basic Analyzer Noise

    3.   Contamination

         a) HC Hang-up
          b) Particulate build-up

    4.  Effects of exhaust gas temperature

    5.  Effects of system leaks

         a) Sample line
         b) Other sample transport components
         c) From filter changing
         d) From water trap

    6.  Effects of Slow Response Time

    7.  Effects of HC Analyzer response to different
       HC compounds in the sample

    8.  Effects of Probe Dilution

    9.  Effects of Vehicle Exhaust System Leaks

-------
                41
             Table IV-5
    Factors Affecting Durability

1. Thermal Cycling Effects on

     a) Sample transport components
     b) Optical Bench components
     c) Electronic components

2. Vibrations and Mechanical Effects on

     a) Optical Bench
     b) Sample Pump
     c) Sample Line
     d) Other Sample System Components
     e) Electronic Components

3. Contamination

     a) Chemical attack on sample transport
        components (short and long term)
     b) Particulate build-up on Optical Bench
        windows.
     c) Long term HC Hang-up

4. Spare Parts availability

5. Quality of Spare Parts

6. Quality of Repair Service

-------
                                    42
                                 Table IV-6
                     Factors Controlled by the Operator

                    1. Procedures

                         a) Warm-up
                         b) Spanning  (gas and electrical)
                         c) Leak Checking
                         d) HC Hang-up check
                         e) Reading Test Value

                    2. Frequency of Checks

                         a) Spanning  (gas and electrical)
                          b) Leak Checking
                          c) HC Hang-up check

                    3. Maintenance

                         a) Water trap
                         b) Filter changes

                    4. Purchasing substandard replacement
                       parts or filter elements

                    5. Tampering

                         a) Improper  repairs
                         b) Intentional

C.  Acceptable Measurement Error

One of the first things that working through the model impresses on one is
that no matter what specifications are chosen, some error in the measure-
ment will exist.  Therefore, some judgment must be made on the amount of
measurement error that is acceptable, and the amount that is not acceptable.
Further, this decision must be tentatively made prior to selecting individ-
ual specifications.  Determining a tentative tolerance allows trade-offs
between the individual specifications at that given level of error, which can
then be evaluated against practicality and cost.  Finally, after the best combi-
nation of individual specifications is evaluated, a judgment must be made  to
ascertain if that level of accuracy is worth the  cost.

Data presented  in  the  previous chapters suggest that the  current error  in
measuring emissions could be around ±35%.  Additional data from the NHTSA
study 2/ shows that one standard deviation of the ability of multiple
analyzers to read calibration gas is around 15 percent.  Loosely translated,
this means that approximately 68% of all analyzers could read calibration
gas  within  ±15% of  its  actual  concentration   (32%  would  be  greater than
±15%).
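
That translation can be checked directly if one assumes analyzer readings of
calibration gas are normally distributed about the true concentration; the
sketch below uses only the standard normal error function.

     # Check of the one-standard-deviation translation, assuming normally
     # distributed readings with a standard deviation of 15 percent.
     import math

     def normal_cdf(x, sigma):
         return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

     sigma = 15.0                                    # percent of true value
     inside = normal_cdf(15.0, sigma) - normal_cdf(-15.0, sigma)
     print("within +/-15%%:  %.0f%%" % (100.0 * inside))          # ~68%
     print("outside +/-15%%: %.0f%%" % (100.0 * (1.0 - inside)))  # ~32%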

The model  (Figure  IV-1)  shows  that the ability to read  calibration gas does
not guarantee an accurate emission result.  Using a 5% calibration curve and
rather  typical  individual specifications (i.e. ETI,  BAR 74, in-use,  etc.),

-------
                                    43
the model  computes measurement  accuracies  in  the 25  to  35 percent  range.
Using best  technology,  it might be possible to obtain better  than ±10 per-
cent measurement accuracy.  Evaluating  failure  modes,  the measurement  errors
could exceed 50 percent.

Considering the data, it was our judgment that a ±10 to ±20 percent error would be a
desirable goal.  The  model predicted that in order to achieve  this goal the
analyzer must  be  able  to read  calibration  gas  to  within  ±5 percent.  The
remainder of the  error  band would be taken  up  by  trade-offs  in other  compo-
nents.
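
Under the same root-sum-square assumption sketched in Section B, the ±5
percent calibration figure can be recovered by working the error budget
backwards.  In the sketch below, the 14 percent figure assumed for the
combined remaining components is a placeholder for illustration only.

     # Working the error budget backwards: given a target overall accuracy
     # and an assumed combined error for all non-calibration components,
     # solve for the allowable calibration-gas reading error.
     import math

     target_total = 15.0    # percent; middle of the +/-10 to +/-20 goal
     other_rss    = 14.0    # percent; assumed RSS of remaining components

     calibration = math.sqrt(target_total ** 2 - other_rss ** 2)
     print("Allowable calibration error: about +/-%.0f%%" % calibration)
     # Prints about +/-5%, matching the model's requirement.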

D.  Discussion of  Current  Specifications and Alternatives

In the  determination  of a set of  specifications, the ability  of other pub-
lished  specifications  to  meet  the  design  measurement goals  must  be  ad-
dressed.  If some  other published specification would be able to achieve the
necessary measurement accuracy, it would probably  be  far easier, or at least
expedient, to adopt those other specifications than to develop completely new
specifications.  Based  on these assumptions  the  more  realistic alternatives
for State minimum  analyzer specifications  are:

     Alternative 1:      Adopt the BAR  80  Specifications
     Alternative 2:      Adopt the ETI  Recommendations
     Alternative 3:      Adopt a modified  version  of  alternatives 1 or 2
     Alternative 4:      Develop a New  Specification

During  evaluation  of  these alternatives,  a  problem  arose  in the ability to
interpret and  compare  existing  specifications such  as  the  BAR 80 and ETI.
The difficulty centered  around  the fact  that  these specifications did not
include  detailed  evaluation and  acceptance  test  procedures.  In all fair-
ness, the BAR  80 specifications do include acceptance  guidelines, but  in our
judgment many  of  the BAR  80 guidelines do not  contain sufficient details to
insure consistent interpretations.  This difficulty has also been high-
lighted  by  manufacturers  attempting  to  certify  analyzers  to the  BAR  80
specifications.

This  confusion in  interpretation affects  the  basic  ability to evaluate the
analysis systems,  and their potential  compliance  with the design goals.  If
one cannot  accurately determine the capability of a  system,  then what value
are  the individual  specifications?   In  EPA's judgment,   the  inclusion of
specific  verification  procedures  is  nearly as  important  as  the specific
parameter values chosen, and it is important that  such procedures be part of
any specification.

Even  though the more  specific evaluation procedures were lacking, an attempt
to evaluate the first two alternatives for sources of error (see Figure
IV-1)  was  made.   This comparison  indicated that  these  specifications ad-
dressed  the condition of new analyzers, and  to  some extent  the  durability of
those analyzers,  but  neither of the specifications dealt to  any degree with
the effects of the operator's actions.  Table  IV-6 lists the  various actions
controlled  by   the operator that  affect measurement  accuracy.   Table IV-7
details  the  potential effects  of some  of  these actions.  Clearly, operator
actions  significantly affect the ability  of the  analyzer  system to achieve
the design goal of 10 to  20 percent measurement accuracy.

-------
                                    44
                                 Table IV-7
                  Incremental Error Due to Operator Actions

                            Normal Operation                    Failure
                                 Acceptable                      Modes
                       Optimum    Maximum      Typical

Field Spanning           ±1%        ±2%        ±6-10%          ±30-40%
HC Hang-Up                0%        -3%       ±10-30%        +100/(-70)%
Leak Checking             0%         --         ±10%            -100%
Reading Test Value        0%        ±2%          --              ±20%
Considering  the  original question of  "can  other specifications achieve  the
design goals  for  measurement accuracy?", our judgment is that without modi-
fications  to  those  documents to control  or  account for operator errors,  it
would be  extremely difficult  for those  specifications  to  assure  that  the
design goals for measurement accuracy  could be maintained.  Modifying either
the BAR 80 specification, or the ETI Recommendations, to include both evalu-
ation test procedures and  operator errors  would be  a  substantial modifi-
cation to  those documents.

Of the alternatives listed, it was determined that  a new specification would
best  integrate  the evaluation test  procedures,  equipment requirements,  and
operator control requirements.  The  first draft  of  these new specifications,
which was  sent  to the analyzer manufacturers for comment, included slightly
more stringent performance specifications than BAR 80, on-board gas
spanning, traceability of gases to NBS, and control of operator practices
(both procedures and frequency).

Comments on the stringency of the performance specifications were not major.
The modifications that have been made  to specific performance specifications
should alleviate most concerns expressed by the manufacturers.  The most
vociferous comments were those on techniques to control the operator's
actions.

In  response  to  those comments  on  control  of   the operator's  actions,  the
causes of analysis error due to the operator were again reviewed.  The two
main causes for these errors are failure  to perform the checks on a frequent
basis, and  failure  to perform the checks properly.  The basic alternatives
to solve  the  operator problem are:  1) make the operator responsible (i.e.,
manual control),  or  2)  make the machine responsible  (i.e.,  computer con-
trol).

Placing responsibility  on  the operator  (or  owner)  generally means that  log
books  for each and  every operation would  be  maintained  to not  only help
insure  compliance  in  frequency  and  procedure,  but  to  allow  the  user  to

-------
                                    45
identify  trends  in  analyzer  operation.  The  maintenance and review of  the
data implies that the owner will  take  the time  to  perform such functions.  A
slightly  less  time  consuming  compromise  (for the owner)  would place  the
burden of searching  for trends  in the  log book  on  a State Auditor.

Placing  the  responsibility  for preventing  failure  modes  on  the  machine
generally requires  an  on-board microprocessor  for internal  self-management.
Such systems are usually called  computer  controlled analyzers.  The  computer
controlled analyzer would not totally relieve the operator of respon-
sibility, but through computer prompting of the operator to perform certain
checks, and  computer analysis of the  results,  the necessity for  maintaining
a log  book  would diminish.   Further,  the computer system  could  be  designed
to be  self-policing, could spot  trends,  and prevent  the  operator from  using
the analyzer if  the  results of  any of  the various  checks  exceeded the speci-
fied limits.

The initial  trade-off  between  these  two  basic  alternatives is operator time
in  performing  the  necessary  checks  including  the  log  book  maintenance,
versus  the  cost  of  the computer based analyzer.   The  two basic alternatives
represent  the  extremes between  manual control  and computer  control.   There
are  at  least  two  alternatives that  bridge  the  gap  between the  extremes.

A step between manual and  automatic control would  deal with the frequency  of
performing  the proper  checks.   Very  simply,  a clock  function and  indicator
lights  could be added  to the  manual system.   Such a passive system  would
remind the operator to perform the necessary function on a proper frequency,
but nothing  else.

A step  up  from the  simple passive indicator  system would be an active  indi-
cator  system that  could  prevent the  analysis  system from  testing  vehicles
(i.e., disable printer,  drive meter  to  full  scale, etc.) when an  indicator
light was activated.
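
The behavior of such an active system can be sketched in a few lines of
logic.  The check names and intervals below are illustrative only (drawn
loosely from the span check frequencies of Table II-6); they are not the
recommended specification itself.

     # Sketch of an active reminder (lock-out) system: each periodic check
     # has a clock; when any check is overdue, vehicle testing is disabled
     # until the operator resets that check's clock.
     import time

     CHECK_INTERVALS = {                # seconds between required checks
         "gas span":        4 * 3600,   # twice daily (4 hr intervals)
         "electrical span": 1 * 3600,   # 1 hr interval
         "leak check":      7 * 86400,  # weekly
     }

     last_done = {name: time.time() for name in CHECK_INTERVALS}

     def overdue_checks():
         now = time.time()
         return [name for name, interval in CHECK_INTERVALS.items()
                 if now - last_done[name] > interval]

     def vehicle_test_permitted():
         due = overdue_checks()
         if due:            # active lock-out: disable testing until reset
             print("Testing disabled; perform and reset:", ", ".join(due))
             return False
         return True

     def reset_check(name):
         # The system cannot verify that the check was actually performed;
         # it only records that the operator reset the clock.
         last_done[name] = time.time()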

Both of  the indicator systems  rely on the  operator to perform the  indicated
function  before  resetting the  clock  functions.  The  responsibility  to  per-
form those functions correctly is still placed on the owner/operator.  To
that extent, neither of the two indicator options offers much more than the
completely manual system.

To recap  the candidates for  adoption  as  the EPA  recommended analyzer speci-
fications:
     Alternative  1.       1980 BAR  Specifications
     Alternative  2.       ETI Recommendations
     Alternative  3.       Modified  BAR  or  ETI
     Alternative  4.       EPA Recommendations
                              a) Manual Operation
                              b) Manual Operation  w/passive  indicator system
                              c) Manual Operation  w/active indicator  system
                              d) Computer Operation

-------
                                    46
Although the previous discussion concluded  that  the  first  three  alternatives
will not achieve the desired measurement accuracy, and that alternatives 4b)
and  4c)  may not  offer that  much,  a  final recommendation must  include  the
considerations presented in the following sections.

-------
                                    47
E.  Cost of Alternatives

In the  previous  section,  four basic alternatives for minimum specifications
were  suggested.   Although one  must view the  total  inspection  costs from a
broad perspective, it is also important to evaluate  the possible  impact  that
the various equipment specifications may have on retail prices of the equip-
ment.

In order  to  equitably judge possible  cost  differences  between the  alterna-
tives, a reference framework was selected.  For this reference, the  1974 BAR
(California Bureau  of Automotive Repair) analyzer specification  was chosen.

Another aspect to consider is the effect of inflation on analyzer costs.   In
1974 most  analyzers  meeting the BAR 74  specifications  cost around  $2000  to
$2500.   Table IV-8  indicates  how  these  costs for a basic  analyzer  have
escalated  with  the  Consumer Price  Index (CPI).  Although  the CPI represents
a  composite  annual   increase,  and  the  analyzer  manufacturing   industries'
retail  prices may deviate from  the CPI  for a given year,  the overall  trend
is reasonably  accurate.  For  instance,  the current price (May  1980)  for a
Sun EPA 75 analyzer  (which meets the BAR 74 specifications)  is around $3400.
This price falls between the two mid-1980 costs shown in Table IV-8.
                                 Table IV-8

                      Inflation Effect on Analyzer Cost
                                   (BAR 74)

Year                               CPI*                          Cost

74                               New Cost                     2000   2500
75                                  7.0                       2140   2675
76                                  4.8                       2243   2803
77                                  6.8                       2395   2994
78                                  9.0                       2611   3263
79                                 13.3                       2958   3698
80 (June)                          (3.8+)                    (3070) (3838)
80 (Dec)                           14.6++                     3390   4237
81                                 10**                       3729   4661
82                                  9**                       4064   5081

 *  Composite Consumer Price Index from Bureau of Labor Statistics
    (Jan. to Dec. values)
**  Estimated annual inflation rate
 +  Estimated 1.25% per month for 6 months
++  Estimated 1.25% per month for 12 months
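
The escalation arithmetic in Table IV-8 can be verified by compounding the
listed annual CPI values over the 1974 base prices.  The sketch below omits
the partial-year June 1980 row and uses the estimated rates for 1981-82.

     # Verify Table IV-8 by compounding the listed CPI values over the
     # 1974 base prices of $2000 and $2500.
     cpi = {75: 7.0, 76: 4.8, 77: 6.8, 78: 9.0, 79: 13.3,
            80: 14.6, 81: 10.0, 82: 9.0}      # percent per year

     low, high = 2000.0, 2500.0               # 1974 base prices
     for year in sorted(cpi):
         factor = 1.0 + cpi[year] / 100.0
         low, high = low * factor, high * factor
         print("19%02d: $%.0f to $%.0f" % (year, low, high))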

-------
                                    48
The  evaluation of  the technical  alternatives starts  with the  alternative
expected to have the least technological impact and progresses  to the alter-
natives expected to have the greatest technological impact.

The ETI recommendations were judged to have the least impact.  In many
respects,  the  ETI recommendations  are similar in stringency  to the BAR  74
specifications even  though  the assumed intent of the ETI specifications was
to provide an alternative to the BAR 80 specifications.  Based  on the strin-
gency of the ETI recommendations, and the current technological  state of the
industry, if the ETI recommendations were adopted as minimum specifications,
only minor changes to current analyzers (BAR 74) resulting  in effectively  no
retail  price  change  (due  to   technology  improvements) would  be  expected.
Even  though  no retail  price increases are foreseen  due to technology, the
cost of such analyzers would increase due to inflation.

The  next specification  in  order  of  increasing stringency  is  the BAR  80
specifications.  These specifications include more stringent requirements
than  the ETI recommendations and provide acceptance  criteria.   BAR 80 does
not,  however,  include  evaluation test procedures  or  specifications to con-
trol the operator's actions.

The  retail cost  of the first  analyzer  to  pass the BAR 80  specifications  is
in the  range of  $4100 to $4300.  If we use 1979 or mid-1980 data from Table
IV-8, a  BAR  74 analyzer costs  between  roughly $3000 and $3750.   Using this
range and  an average price  of $4200 for the  BAR  80  analyzer,   the  improved
technology of the BAR 80 specifications (over BAR 74) represents an  increase
in retail  cost between  $450 and $1200.  Using  this  price differential,  we
would estimate BAR 80 analyzers will be in the $3750 to $4950 price range
(1980 dollars).

Although the third alternative, modification of either the BAR 80 specifica-
tions or the ETI recommendations, could represent an improvement in analyzer
quality, such modifications would effectively be an entirely new specifica-
tion, and thus this option was not evaluated separately.

The  options  investigated under the fourth  alternative (the new specifica-
tion) included manual  operation, manual operation with a passive  reminder,
manual operation with an active  reminder (lock out), and computer operation.
In order to  evaluate these options and  allocate  costs, a  rough idea of the
potential  market  is needed.   Based on  the  number of   States beginning I/M
programs in  the next few years,  it is estimated that somewhere  around 25,000
inspection  analyzers will  be  needed  in  the next  5  years.   If 5  analyzer
manufacturers* actively market inspection analyzers, and equal market
shares are assumed, each manufacturer would be able to spread development
costs across 5000 analyzers.  If fewer manufacturers entered the market, the
development cost per analyzer would be lower.  Further, some manufacturers
* Written  comments  from  an analyzer  manufacturer on  the draft of  the  EPA
specifications  specifically  suggested  only  5  analyzer  manufacturers  may
actively  pursue  the inspection  analyzer market.   It  is  assumed  that  the
remaining manufacturers would continue to market repair  analyzers.

-------
                                    49
might allocate  some of the development costs  to  improvements  in  their  vehi-
cle repair  analyzers.   For the purposes of comparison, however,  we  will  use
the 5000 unit figure.

For the completely manually operated analyzer, we would expect some  develop-
mental and  modification  costs over and above  that  completed  for  the BAR 80
systems.   These modification costs  would be  as  high  as  $100,000**,  but
spread across 5000  units, this cost would only  be $20 per  unit.  Using  a 3
to 1 retail price mark-up, modification costs would increase the  retail cost
of an analyzer by $60 per unit.
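
The per-unit arithmetic used here, and in the cost estimates that follow,
can be restated in one line: a fixed cost is spread across the assumed
5000-unit volume and multiplied by the 3 to 1 retail mark-up.

     # Retail price impact of a fixed cost spread across the assumed
     # production volume, with the assumed 3-to-1 retail mark-up.
     def retail_increase(fixed_cost, units=5000, markup=3.0):
         return fixed_cost / units * markup

     print(retail_increase(100000))   # manual-system modifications: $60
     print(retail_increase(5000))     # added verification testing:   $3
     print(retail_increase(300000))   # computer-system development: $180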

Final accreditation and verification cost increase (due to the improved
verification  testing)  on  three units  represents roughly  a  $5000  increase
over  the BAR  80 procedures   (BAR  80 accreditation  currently costs around
$10,000).   Using the  same  costing procedure,  a more thorough verification
testing process increases the cost of the analyzer by around $3.

Estimated retail hardware costs are listed in Table IV-9.  These improve-
ments not only  improve the basic analyzer  over the  BAR 80  analyzer,  but they
are  also directed  at improving  the  operator's  actions  which  are  not  ad-
dressed  in  the  BAR 80 specifications.  Even  so, the  manual EPA  system will
place the responsibility  for  those actions on  the operator.

The total cost for these improvements is around $720 (Table IV-9) over a
BAR 80  unit.   Based on this  price increase and  the estimated  price  range of
the BAR 80 analyzers, we would estimate the cost of the manually operated
EPA specification analyzer would be in the $4470 to  $5670 price  range  (1980
dollars).  Since these improvements would not require a redesign of the BAR 80
optical  bench,  we  would estimate  the  lead time  for the improved  analyzer to
be around 0 to  9 months.
** Based on  informal  conversations  with  industry  representatives and  comments
   on the draft EPA specifications.

-------
                                    50
                                 Table IV-9

                       Estimated Retail Cost Increase*
                     EPA Recommended Inspection Analyzer
                          Manually Operated Version


Item                                                         Cost Increase

a) Development
b) Improved Verification Testing
c) Gas Spanning System Improvements
d) Detector Improvements
e) Signal Conditioning Improvements
f) Leak Check and HC Hang-up Systems
g) Sample Cell Heater or Electronic Compensation

Total                                                           $723
*  Cost increase over BAR 80 due to EPA improvements.
The next  option  on the basic manual system would be  the  passive  reminder  or
indicator  system.   This  system would  include a clock  function,  indicator
lights, and reset buttons.  No interfacing  (other than the  sample read  to  HC
hang-up light) would  be required.  Therefore,  the passive  system could  be a
stand  alone  system attached to  the basic  analyzer.   We estimate the  retail
cost of this option to be between $300 and $500.  Lead time is expected to be
3 to 9 months.

More sophisticated than the passive indicator system would be the active
reminder system.  Such a system would be similar to the passive system, but
would require substantial interfacing with the analyzer in order to lock out
or prevent vehicle testing when one of the reminder lights is activated.
Resetting the reminder system would allow testing to proceed.  It is assumed
that the operator would perform the necessary functions before resetting the
system, but the system would not be able to verify that those actions actu-
ally had taken place.  The cost of this system would probably be around $300
to $500 over the passive system, or about $600 to $1000 over the basic
manual system.

The final option investigated was the computer operated system.  This option
would place more responsibility on the machine for the operator's actions,
and in many cases would perform the operations automatically.  The develop-
ment of the microprocessor based computer system, including software and
hardware programming, could cost an additional $300,000 over the basic
manual EPA system.  Using the previous costing procedure, the development
costs would increase the retail price by around $180 (Table IV-10).  Final
accreditation testing would be about the same as for the manual system, and
therefore would not result in any additional incremental price increase.

-------
                                    51
Estimated retail costs of the additional computer hardware are listed in
Table IV-10.  As indicated by the incremental price increases, interfacing
electrical signals between the processor and the analyzer is a significant
task.  The total cost of adding the computer to the manual analyzer is
estimated to be around $1700.  This increase in price would place the com-
puter analyzer (without automatic data collection) in the $6175 to $7375
price range.  The lead time to place computer analyzers in the field is
expected to be around 9 to 18 months.  (New York State has contracted with
one manufacturer to provide several thousand similar units by the end of
1980.)
                                Table IV-10

                       Estimated Retail Cost Increase*
                     EPA Recommended Inspection Analyzer
                          Computer Operated Version


Item                                                         Cost Increase

a) Development                                                   $180
b) Microprocessor Board                                          $375
c) Signal Interfacing                                            $900
d) Basic Printer                                                 $250

                                                                 $1705

Options (See Chapter IX)
a) Automatic Data Collection                                     $500
b) Additional Printer Capability                                 $200
c) Anti-Dilution (CO2%)                                          $800
d) Loaded Mode Kit                                               $900
e) Engine Tachometer (RPM)                                       $150
* Increase over EPA manual  system
Table  IV-10  also lists some  optional  features that one may  wish  to  obtain.
The automatic  data collection and  the  expanded printing systems  could  only
be added  to  the  computer  analyzer.   The other  options  (anti-dilution,  loaded
mode kit,  tachometer)  could be  fitted  to  any  of  the manual  systems as well.

Table  IV-11  summarizes the estimated price  ranges for the various alterna-
tives.   Bear in  mind that  these direct costs must be  evaluated in relation
to  the  overall  implementation  costs  before  making  a final  specification
choice.   Further,  recognize  that these estimates are in mid-1980 dollars.
If current inflation trends are maintained,  the  retail price at the  time  of
purchase  will  most likely be higher  than that  indicated.

-------
                                    52
                                Table IV-11

                        Comparison of Analyzer Costs
                             (Mid-1980 dollars)
Specification            Estimated Retail Price Range       Step  Increase

BAR 74 and ETI                   3000 - 3750
BAR 80                           3750 - 4950                      975
EPA (Manual)                     4470 - 5670                      720
EPA (Passive)                    4770 - 6170                      400
EPA (Active)                     5070 - 6670                      400
EPA (Computer)                   6175 - 7375                     1705*
* Increase over EPA  (Manual)
F.  Production Variances and Field Audit Testing

The previous sections in this chapter have dealt with the capability of a
single analyzer.  If all analyzers in a production line and all analyzers
from all manufacturers were identical, testing one analyzer would indicate
the quality of all analyzers.  Obviously all analyzers are not the same.
The intent of a set of specifications is to accommodate the real variances
between analyzers, and yet meet the design goals for measurement accuracy.
The intent of evaluation testing is to verify that the analyzer design truly
meets those specifications.  The intent of field audit testing is to spot
check certain parameters to assure basic operation and analyzer condition.

The problem is two-fold.  One, the audit check assumes that all analyzers
met the complete specification criteria when new.  Two, practical audit
checks are not complete enough to verify that the analyzers are meeting all
criteria (otherwise the audit test could be substituted for the evaluation
procedures).  The missing element is production variation.
There are several avenues to deal with production variation, and overall
quality assurance and quality control (QA/QC).  All of these approaches deal
with testing production analyzers.  The analyzer manufacturers claim that
any full evaluation testing of production analyzers is unwarranted, that it
constitutes re-certification, and that their current production checks are
adequate.

EPA disagrees.  Many of the problems with analyzers attempting to be certi-
fied to the BAR 80 specifications have been production-type problems*, even
* Informal  contacts with  California  BAR  personnel.

-------
                                     53
though   the  certification   analyzers  are  generally  specially  built  pre-
production  analyzers.   If   anything,  the  specially  built analyzers  should
exhibit  better quality  than  production analyzers.

In response to the industry claims that current production checks are ade-
quate: how do they know?  Have they procured analyzers from the field and
performed proper evaluation test procedures on them?  The NHTSA study 2/
suggested that continually repeatable readings tended to lead the operators
to believe the units were accurate, when in fact around 80 percent of the
units were inaccurate.  The fact is that no one is really evaluating the
performance of analyzers in the field.  Therefore, if these analyzers were
producing bad test data (which the NHTSA study suggests), no one would know.

Fully evaluating a few analyzers in the field might be one means of assuring
valid test data.  But suppose it were determined that the design or produc-
tion run truly was bad.  The damage would already have been done.  Hundreds
or thousands of analyzers would already be in the field.

From a practical standpoint, most people would prefer to find out whether
analyzers leaving the production line are of proper quality.  To determine
whether an analyzer meets the specified criteria, it must be tested.  Test-
ing more analyzers means more production costs.  The number of additional
analyzers tested will depend on the sample plan, which will, in turn, deter-
mine how much the QA/QC program will cost the customer.

The QA/QC approach the staff recommends is a compromise which provides
reasonable assurance that both the design and the production practices of
the manufacturer will have been tested.  Several other sample plans, such as
SEA (Selective Enforcement Audit) or full recertification each year, were
investigated.  However, to save cost, and to some extent to overcome indus-
try resistance to continual, periodic production checks, production line
testing is limited to 3 units selected randomly from the first 20 produced.
The recommended QA/QC procedures found in Table IV-12 are expected to in-
crease the retail price around $20 per unit.

To supplement the QA/QC plan (which does not completely assure that pro-
duction problems will not occur after the first 20 units), thorough field
evaluations by State auditors during their periodic audits are recommended.
ETI and EPA are working on guidelines for these audit procedures, which will
be able to detect most production errors, and should also detect random
equipment problems as well as other problems due to wear, abuse, etc.
These guidelines will be published as soon as they are finalized.
                                  Table IV-12

                       Recommended Qualification Program

 I.   Pre-Production

1.  The manufacturer may receive a preliminary accreditation, valid for
six (6) months, by providing a publicly released report which demonstrates
that at least one pre-production unit has passed all evaluation tests.

-------
                                    54
II.  Initial Production QA/QC

1.  The manufacturer  shall  also  select,  in a  random manner,  three of  the
first 20 production units, and all three shall receive all  evaluation tests.

2. If two of the three units pass all evaluation  tests,  the instrument  shall
receive full accreditation  valid for a  period  of three years  from the date
the first unit was produced.

3. If two or more units fail the  evaluation  tests, corrections  to the design
and/or production must  be made,  and  three  additional units selected from a
new or current  production run.   Two  of  these three  must pass  all evaluation
tests.

4. All  units covered  by a preliminary  accreditation and  produced  prior  to
the production run in which full  accreditation is  received  shall  be required
to incorporate the necessary design and/or production fixes.

III. Subsequent Production QA/QC

The accreditation  may be  renewed for  a three year period at any time  by
passing all  evaluation  tests  on  two  of  three units  selected randomly from a
production run of 20.

IV.  QA/QC Testing Criteria

1.  Two (2) of the three (3) production units must pass with no design or
random failures.

2. A design failure is defined as a failure  to meet  the  evaluation procedure
criteria.

3. A random failure is defined as the failure of a standard part in the
system (e.g., pump, electrical resistor, etc.) where improved procurement
specification, assembly technique, or pre-assembly QC on that part would
reduce failures in the field.

4. An infant mortality is defined as the total failure of a part (usually a
computer chip or related components) within a short period of time after the
unit first receives any electrical power.  Infant mortality failures must
have sufficient documentation (i.e., a published report available to regula-
tory bodies) to justify why the failure can be attributed to infant mortal-
ity and not to a minor design failure.  An infant mortality failure is not
classified as an analyzer failure if the failure would be obvious in the
field.  After repairs, those tests that might be affected by the repairs
must be rerun.

5. Random failures must have sufficient documentation (i.e., a published
report available to regulatory bodies) to justify why the failure can be
attributed to a random failure and not to a minor design failure.  Random
failures may be repaired on pre-production units only.  A condition to allow
the repair of the pre-production analyzer is the development of a plan
(where necessary) to prevent the specific type of failure in production
units.  After repairs, those tests that might be affected by the repairs
should be rerun.
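
To illustrate the discriminating power of the "two of three" acceptance
criterion above, the following Python sketch computes the probability that
a randomly selected set of three production units passes, for several
assumed per-unit pass rates (the pass rates are hypothetical and are used
here for illustration only; they are not part of the recommended program):

    # P(at least 2 of 3 independent units pass) for a per-unit pass rate p.
    def accept_prob(p):
        return p**3 + 3 * p**2 * (1 - p)

    for p in (0.5, 0.7, 0.9, 0.95):
        print(p, round(accept_prob(p), 3))   # 0.5, 0.784, 0.972, 0.993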

-------
                                    55
G. Conclusions and Recommendations

This chapter began by focusing on two aspects of measurement, the equipment
and the operation of that equipment; on how those variables affect the
validity of the measured data; and on the fact that some measurement error
must be tolerated.  The amount of total measurement error that can be toler-
ated determines the real bottom line on the technical aspects of any set of
analyzer specifications.  This bottom line also provides a yardstick with
which to compare different specifications.

Technical capability is not the only consideration in the selection of
analyzer specifications.  Both the initial cost and the availability of the
analyzers must also be considered, as well as the overall program cost and
the inspection fee structure.

On the issue of total measurement error, simple calibration curve accuracy
has been noted to be considerably different from total measurement accuracy.
Our technical judgment is that a total measurement error of ±15 to ±20
percent of the true value is probably sufficient, and would be considered
valid data for an inspection program.  Measurement errors of ±10 to ±15
percent of the true value would be much more desirable.  Achievement of
either of these measurement goals would require a calibration curve accuracy
of ±5 percent of true value or better.  The difference between the 5 percent
figure and the 10 to 15 or 15 to 20 percent figures is taken up by errors in
other parts of the system, or by errors due to the operator's actions.

Evaluating current analyzer specifications (BAR 80, ETI, and other State
programs) indicates that these specifications will not be able to assure
even ±20 percent true measurement accuracy.  The BAR 80 specification comes
the closest to the 20 percent figure, and BAR 80 analyzers might even be
able to meet that figure if operated correctly.  But BAR 80 does not in-
clude sufficient specifications to ensure proper operation in the field,
and more fundamentally, BAR 80 lacks sufficient definition of the verifica-
tion procedures necessary to determine whether the 20 percent figure would
truly be met.  Based on an overall evaluation of the other specifications,
EPA has developed a set of recommended analyzer specifications which will
provide measurement accuracies better than 20 percent.  The recommendations
are found in Chapters VI through XI.

Four options on the EPA specifications were evaluated: manual control,
manual control with a passive reminder system, manual control with an active
reminder system, and a computer controlled system.  Based on technical
considerations, the completely manual control is recommended as the minimum
specification for all inspections in centralized inspection programs, and
for those decentralized programs that may have constraints that would pre-
vent them from exercising other options.  A passive reminder system added
to the basic manually operated analyzer is recommended as an option which
the States could select for those decentralized applications using the
manual system.

-------
                                    56
The active reminder system is not recommended on technical grounds because
it does not deal with improvements in the operator's actions, but only with
the frequency of those actions.  Secondly, the complexity and the cost of
the active reminder system approach those of a computer operated system.
Thus, the cost effectiveness of the active reminder system is much less
desirable.

On a technical basis, the computer operated inspection analyzer is recom-
mended for all decentralized programs, and for inspection in centralized
programs where operator training and technical management oversight are
limited by budget or other constraints.  The computer operated analyzer is
also recommended for those centralized programs that may have rather remote
testing lanes with limited support facilities at those remote locations.

These technical recommendations must be reconciled with the other factors
of cost and lead time.  Previous discussions on cost have indicated that
the average mid-1980 price for BAR 74 analyzers is around $3400.  The esti-
mated average price (Table IV-11) for BAR 80 analyzers is around $4350, for
the EPA manual system around $5100, and for the EPA computer system around
$6800.  Production audit testing is expected to add about $20 to the cost
of the systems.

For those states implementing centralized inspection lanes, these analyzer
costs are very minor compared with the overall costs of setting up and
operating the inspection lanes.  However, for those states implementing
decentralized programs, the cost of the analyzer in relation to the overall
program costs must be considered.

A  critical consideration for States  implementing decentralized programs  is
the determination  and allocation  of the inspection fee schedule.    For in-
stance, is  a portion of the  inspection fee  to  be  allocated to  subsidize 100
percent of  the cost of the  analyzer,  50 percent of the  cost of  the  analyzer,
or is  the owner  expected to completely subsidize  the cost  of  the analyzer as
a  "cost  of  doing business"?   The  same questions must  be  asked on  the  sub-
jects  of labor costs, and recurring equipment and maintenance  costs.

Table IV-13 illustrates direct analyzer costs to the owner.  The yearly
amortization represents the owner's average out-of-pocket expenses for
various initial costs.  The yearly amortized cost would also represent the
yearly cost if the I/M fee were not meant to subsidize the purchase of the
analyzer.  The other columns represent the average payback or rebate per
vehicle for two different inspection fee subsidy rates.

A governing factor on whether the inspection fee subsidizes the purchase of
analyzers is whether the service center recoups the inspection labor costs.
Table IV-14 indicates average inspection costs (one free reinspection) at
various labor rates and with various inspection times.  The inspection
labor effort may involve moving the vehicle into the bay, setting up the
equipment, testing the vehicle, and paperwork.  It can be seen from this
table and Table IV-13 that the inspection time factor is probably more
important than the analyzer cost.

-------
                                    57
                                Table IV-13

                               Analyzer Costs*
                                  (5 years)

    Initial           Yearly            Average Cost per Vehicle
 Analyzer Cost     Amortization    50% Fee Allocation  100% Fee Allocation

    $3500           $ 923.29            $ .62                $1.23
    $4500           $1187.09            $ .79                $1.58
    $5500           $1450.89            $ .97                $1.93
    $6500           $1714.68            $1.14                $2.28
    $7500           $1978.48            $1.32                $2.63
    $8500           $2242.28            $1.49                $2.98
    $9500           $2506.08            $1.67                $3.33

* Assumptions
     a) Average Annual Auto Population = 800,000
     b) Number of Inspection Stations = 1064
     c) Annual Interest Rate (INT) = 10%
     d) Program Length (PRL) = 5 years
     e) Amortization Factor = INT(1 + INT)^PRL / [(1 + INT)^PRL - 1] = .2638
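
A minimal Python sketch of these assumptions reproduces the table values
(small differences in the last cent are due to rounding):

    # Capital-recovery amortization at 10 percent over 5 years, and the
    # resulting cost per vehicle for 800,000 vehicles across 1064 stations.
    INT, PRL = 0.10, 5
    factor = INT * (1 + INT)**PRL / ((1 + INT)**PRL - 1)     # = .2638

    vehicles_per_station = 800000 / 1064                     # ~752 per year

    for cost in (3500, 4500, 5500, 6500, 7500, 8500, 9500):
        yearly = cost * factor
        per_vehicle = yearly / vehicles_per_station          # 100% allocation
        print(cost, round(yearly, 2),
              round(per_vehicle / 2, 2), round(per_vehicle, 2))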
                                Table IV-14

                 Average Inspection Labor Cost per Vehicle*
                               (5 year period)

Inspection Time             $20/hr     $25/hr     $30/hr     $35/hr

5 minutes (.083 hours)      $2.63      $3.29      $3.95      $4.61
10 minutes (.167 hours)     $5.30      $6.63      $7.95      $9.28
15 minutes (.25 hours)      $7.94      $9.92     $11.90     $13.89

* Assumptions
     a) Annual Inflation Rate (INF) = 10%
     b) Program Length (PRL) = 5 years
     c) Stringency Factor (STR) = 30%
     d) Inspection and one free Reinspection Labor Costs =
        (time)(hr rate)(1 + STR)
     e) Average Annual Cost Factor w/Inflation =
        [Sum over i = 0 to PRL-1 of (1 + INF)^i] / PRL = 1.221
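
Again, a short Python sketch of the stated assumptions reproduces the
table values:

    # Average labor cost per inspection: time x hourly rate x (1 + STR),
    # averaged over the 5-year program with 10 percent annual inflation.
    INF, PRL, STR = 0.10, 5, 0.30
    inflation = sum((1 + INF)**i for i in range(PRL)) / PRL  # = 1.221

    for hours in (0.083, 0.167, 0.25):
        print(hours, [round(hours * rate * (1 + STR) * inflation, 2)
                      for rate in (20, 25, 30, 35)])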

-------
                                    58
A recent NHTSA study 3/ on small repair centers (those less than $185,000
gross per year) indicates that the small centers account for around 46
percent of all vehicle repairs, and gross an average of $79,000 per year
(1977 dollars).  Such repair facilities would be expected to be affected the
most by the purchase of a new analyzer.  If it is assumed that the inspec-
tion fee were sufficient to cover the inspection labor, incidental equipment
maintenance, and expendable supplies, the largest individual expense for the
small service center would be the cost of the analyzer.  It is also assumed
that these small service centers would need only one analyzer.  Using the
$79,000 average gross per year, a yearly amortized cost of $2500 for the
highest cost ($9500) given in Table IV-13 would represent only around 3.3%
of the yearly gross income.  For an average analyzer retail price of around
$6820 (EPA recommended decentralized computer operated inspection analyzer),
the amortized cost would represent 2.3% of the average gross.  Even if other
factors such as inflation caused the analyzer retail price to rise to $9500,
the net effect on the small owner would be an increase in cost of only 1
percent of the average gross income.  An expense of 3.3% of one's annual
gross is one third of typical inflation rates, and is normally not con-
sidered a major expense.

Therefore, even if a  very expensive  analyzer  were  used with  absolutely no
subsidy from  the inspection fee, the  analyzer  would represent  only a minor
burden on  over 77 percent of the vehicle repair industry  (50 percent of 46%
plus  54%).   Cheaper analyzers and analyzer subsidies  through  the  inspection
fees  would reduce  this minor burden even  more.  Additionally,  the  expected
increase  in   mandatory  I/M  repairs  would  further  offset the cost of  the
analyzer.

Another area that would reduce the expense of the analyzer is income tax
benefits.  For instance, straight-line depreciation is determined as the
cost minus the scrap value, divided by the depreciation schedule (useful
life).  Informal contacts with the IRS indicate that a 5 year depreciation
schedule would be appropriate if the EPA specifications indicate a 5 year
useful life.  However, even if a longer depreciation schedule were used, the
total benefits would be approximately the same.  The scrap value is a little
more difficult to estimate because of the large number of variables in-
volved.  A range of scrap value between 10 and 50 percent of the new cost
seems reasonable.  Using an average scrap value of 30 percent and a 5 year
depreciation schedule, the yearly depreciation would be approximately $955
for a $6820 analyzer, and $1330 for a $9500 analyzer.

The tax saving would depend on the tax rate.  The minimum tax rate could be
zero, but such tax rates are not common.  The maximum tax rate for incor-
porated service centers would be 50 percent.  Yearly tax savings for the
analyzers mentioned would range between $477 and $665 at the 50 percent
rate.  More typical tax rates for the smaller service centers would probably
be between 15 and 25 percent.  These tax rates would result in a tax savings
in the range of $143 to $239 for the $6820 analyzer, and $200 to $332 for
the $9500 analyzer.

During the first year of depreciation, additional tax savings can be ob-
tained by applying an investment credit.  Investment credits for this type
of equipment are generally around 7 percent of the purchase price.  In this

-------
                                    59
case, another $477 to $665 could be subtracted from the tax bill in the
first year.  The sum of the depreciation tax savings (at typical tax rates)
and the investment credit reduces the first year amortized analyzer cost by
34 to 40 percent (total tax savings of $620 to $997).  Table IV-15 shows
typical yearly costs (including tax savings) to purchase a computer ana-
lyzer.
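
The arithmetic behind Table IV-15 can be sketched as follows (the 25 percent
tax rate, 30 percent scrap value, and 7 percent investment credit are the
representative figures assumed above; actual tax treatment would vary from
owner to owner):

    # First-year and subsequent yearly costs for a $6820 computer analyzer.
    cost = 6820
    amortized = 1800                      # $6820 x .2638 factor, rounded
    depreciation = cost * (1 - 0.30) / 5  # straight line, 30% scrap -> ~$955
    dep_saving = 0.25 * depreciation      # at a 25% tax rate        -> ~$239
    inv_credit = 0.07 * cost              # first year only          -> ~$477

    print(round(amortized - dep_saving - inv_credit))  # 1st year  -> $1084
    print(round(amortized - dep_saving))               # 2nd-5th   -> $1561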

Based on this rather limited discussion of economic impacts, it is still
concluded that the cost of an inspection analyzer in a decentralized program
is not a significant factor in the selection of that analyzer.  These eco-
nomic conclusions therefore reinforce the previous technical conclusions
and recommendations.  The final recommendations are restated in Table IV-16.
The exact specifications are listed in Chapters VI through X of this report.
Evaluation procedures are found in Chapter XI.  The issue of availability of
the equipment is not totally addressed in the recommendations given in
Table IV-16.  That issue is more affected by the program recommendations,
and will be discussed in the next chapter.
                                 Table IV-15
                               Yearly Costs to
                         Purchase a Computer Analyzer
                                (1980 Dollars)

                                              1st year        2nd-5th year

Analyzer Cost                                   $6820
Amortized Cost                                  $1800            $1800
Tax Savings
     - 1st year investment credit               $ 477              -
     - Average Depreciation (25% tax rate)      $ 239            $ 239

Yearly Cost                                     $1084            $1561
(Maintenance and expendable
supplies not included)

-------
                                    60
                                Table IV-16

                      Recommended Analyzer Applications


CENTRALIZED Inspection Programs

     Recommendation:     a) EPA specifications, manual operation.

                         b) For programs with limited operator training and
                         technical management oversight, or with remote
                         facility locations and limited support facilities,
                         EPA specifications, computer operation.

DECENTRALIZED Inspection Programs

     Recommendation:     a) EPA specifications, computer operation.

                         b) For programs where lead time constraints cannot
                         be accommodated, or other constraints prevent
                         selection of the computer operated system, EPA
                         specifications, manual operation.

-------
                                    61
V.  Program Implications

A.  Introduction

In the previous chapters, the need for minimum quality analyzer guidelines
was discussed.  The recommendation that some form of minimum quality speci-
fication is needed is consistent with the Hawkins policy memo of July 17,
1978. 4/  That policy memo stated that States enacting I/M programs must
include minimum analyzer specifications as part of those programs.  This
report provides the specific technical details that the staff recommends
the States adopt as their analyzer standards.

If a  State  adopts the EPA specifications,  the  State may face  certain  imple-
mentation problems.  This chapter deals  with those  issues,  and  provides  some
recommendations.

B.  Implementation Considerations

This report recommends state adoption of the EPA instrument specification
because it is the only specification which can assure instrumentation that
will provide accurate and repeatable emission measurements.  For centralized
programs, the 0 to 9 month lead time to procure a manually controlled
instrument meeting these specifications is compatible with the required
program start date of no later than December 31, 1982.
For decentralized programs, EPA strongly  recommends  that states adopt the
EPA computerized  analyzer as  their  minimum specification.  This  instrument
will  greatly  enhance  the quality of  inspections, and  provide  the  consumer
and state with  convenient data  on the inspection  and  effectiveness  of  their
program.  The computer operated  analyzer will also allow a reduced frequency
of  state  audits   of  licensed  decentralized  inspections.   Because  of  the
instrument's self-calibration  feature, quarterly audits  will provide quality
assurance equivalent to the  otherwise required  monthly audits.

Additional  lead   time  to  procure instruments  meeting the  computer  operated
instrument  specification  may  be required.   The impact  of lead time on SIP
I/M implementation  schedules is discussed  in a memorandum from EPA's  Assis-
tant  Administrator of  Air,  Noise,  and  Radiation   to  the  Regional  Adminis-
trators.

C.  Implementation Issues

Adoption  of  a  minimum instrument specification, as is required by EPA poli-
cy, raises the issue of how  to deal with garages that  have already purchased
an  instrument  which  may  not  meet  the minimum  specification.   This  issue
largely applies  to decentralized I/M programs, since  the  impact of  substan-
dard  instruments  in repair facilities in centralized programs  is less  criti-
cal (inaccurate garage readings  will conflict with the more accurate inspec-
tion  lane instrument, putting  pressure on the  repair  facility  to correct the
deficiency).   The staff  has developed  a  suggested timetable  to deal  with
instruments  which do  not meet  specifications.   To an  extent, the current
status  of  the  I/M program  (operating versus beginning to  implement) affects
our suggested approach.

-------
                                    62
For operating decentralized programs, EPA recommends adoption of the EPA
recommended instrument specification discussed in this report as soon as
practicable.  Upon adoption, all new instrument purchases should be required
to meet this specification.  The state should also adopt a plan to phase out
old instruments.  The recommended approach would require that instruments
not meeting the new minimum specification be taken out of inspection service
on a date 5 years after the majority of instruments were originally pur-
chased.  Five years is the estimated useful life of an I/M instrument.  This
approach assures that most licensed station operators obtain a fair return
on their original investment.  If original instrument purchases have been
spread out over a period of time, an alternate approach would be to require
that no original instrument failing to meet the new specification be used
after it is 5 years old.  This could be verified at the time of the periodic
state audit.

These same approaches could be used by states which are not yet operating,
but have recently adopted other (non-EPA) instrument specifications.  This
will allow gradual upgrading of their programs.  If the upgrade is to the
computerized analyzer, fewer resources would need to be spent on audits.

Table V-1 presents criteria we recommend for the phase-out of old equipment
in favor of the EPA specification.

-------
                                    63
                                 Table V-1

                               Recommendation
                                    for
                       Phase-Out/Phase-In of Analyzers
Recommended Criteria

     1. The emission analyzer was in use prior to the recommended order
     cut-off date, or the emission analyzer was not ordered after the
     recommended order cut-off date.

     2.  The  emission  analyzer  manufacturer does  not  make a retro-fit
     kit  that  would bring  the  emission  analyzer into compliance with
     the recommended EPA specifications.

     3.  The  emission  analyzer meets  minimum equipment specifications.
     These are:

          a) The make (i.e., manufacturer) and exact model of emission
          analyzer must have undergone performance testing and received
          accreditation under the State of California BAR 74 specifica-
          tions.

          b)  Inclusion of  BAR  74  emission  analyzer  components  in a
          larger vehicle  diagnostic  tester would be  permissible pro-
          vided those  components were certified under the BAR 74 speci-
          fications.   Documentation should  be  required to  verify the
          similarity of components  to the accredited model.

          c) Accreditation under the BAR 80 specification would automati-
          cally be considered as meeting the BAR 74 specifications.

          d)  If  an  emission  analyzer by  make  and model  has not been
          accredited under  BAR  74,  then  that  analyzer make and model
          should undergo  evaluation testing  prior to  use  as a grand-
          fathered  inspection analyzer.   A minimum  of three analyzers
          shall be  evaluated.   All  three shall meet  or exceed the BAR
          74 specifications or the  State's  published minimum specifica-
          tions.

-------
                                    64
                                 References

1/ "Analyzer Comparison; I/M Demonstration on September 6, 1979", EPA Memo,
   M. Rosenfeld to T. Cackette, December 17, 1979.

2/ "Results of Field Equipment Evaluation", contractor progress report and
   draft report submitted to NHTSA, February 1980.

3/ "Evaluation of Diagnostic Analysis and Test Equipment for Small Automotive
   Repair Establishments", A Report to the Congress, July 1978, DOT  HS-803  536.

4/ "Inspection/Maintenance Policy", Memo from David G. Hawkins, Assistant
   Administrator for Air and Waste Management to Regional Administrators,
   Region I-X, July 17, 1978.

5/ "Vehicle Exhaust Emission Instruments  Evaluation", EPA-460/3-77-014,
   July 1977.

6/ "Instrument Drift of XXX Analyzer", EPA Memo, E.A. Barth to R.C. Stahman,
   July 20, 1979.

7/ 40 CFR 86 Subpart K and Appendix X, Federal Register,  Volume 45,  No.  14,
   Monday, January 21, 1980.

8/ "Decentralized Private Garage I/M Program Cost  Calculation Worksheet",
   EPA Technical Report IMS-006/CS-2, August 1979.

-------