EPA-AA-IMG-84-2
                  Technical  Report
                Quality  Assurance  in

          Inspection/Maintenance Programs
                         By

               John M. Cabaniss, Jr.

                     April  1984
                       NOTICE

Technical Reports do not  necessarily  represent final EPA
decisions or  positions.   They are  intended  to  present
technical  analysis  of   issues   using   data  which  are
currently available.   The purpose  in  the  release  of such
reports  is  to  facilitate  the  exchange  of  technical
information  and  to   inform   the  public  of  technical
developments which  may form  the  basis  for  a  final  EPA
decision, position or  regulatory action.
              Technical Support Staff
        Emission  Control  Technology Division
              Office  of Mobile  Sources
            Office of Air and Radiation
        U.S.  Environmental  Protection Agency

    Quality Assurance in Inspection/Maintenance Programs*

 Introduction

 The term  "quality  assurance"  is often defined and interpreted
 in  many ways.   Therefore,  it  is appropriate  to begin  this
 discussion  of  quality  assurance   in  inspection/maintenance
 (I/M)   programs  with a  description of  how  the term  is being
 used here.   In  this report, quality  assurance  (QA)  refers to
 the formal  system  of  activities undertaken  by the  State  or
 locality  to determine  whether  the   program  is operating  as
 intended  and whether  overall  program  objectives are  being
 achieved.   In  this context, QA  involves  a  continuous pattern
 of  information  feedback  to  the I/M  program  managers  in order
 to  allow those  midcourse  corrections  or modifications  that
 may be  needed to keep  the program on track  toward  meeting the
 desired objectives.  QA efforts, as discussed here, give I/M
 managers one means of managing their programs effectively.

 A good  synonym  for  QA  might be "program analysis", if part of
 the "analysis"  includes  a determination and  implementation of
 any  needed   corrective  actions.   Also,  it   is important  to
 recognize   that  QA  is  a  continuous,  cyclic   process  of
 management   control  and   that   the  focus   of   QA   is   on
 problem-solving.

 The Quality Assurance Process

 The QA  process  is  basically a  simple management  system.   The
 first step  is to  set  expectations  (goals and  objectives)  for
 each major  element of  design  of  the I/M program  (i.e.,  for
 test   procedures,   emissions   standards,   record   keeping,
 analyzer  quality   control,   enforcement,  waivers,   mechanic
 training, repair costs, etc.).

 For example, the goals for emissions standards might  include:

     1.    To achieve  significant emissions  reductions  while
           not    overburdening     repair     facilities    or
           jeopardizing public acceptance.

     2.    To provide  for  similar or equitable failure  rates
           across vehicle design or age categories.

     3.    To provide compatibility with the federal 207(b)
           emissions warranty requirements for 1981 and newer
           model year vehicles.

*This report was presented to the Tenth North American Motor
Vehicle Emission Control Conference sponsored by the State
and Territorial Air Pollution Program Administrators and EPA
in New York, New York, on April 2, 1984.

These goals might be translated into the following objectives:

     1.    For  pre-1981  vehicles,   to  achieve  an  overall
           failure  rate  of  20  percent  with  no  individual
           model year's  failure rate  being below  10  percent
           or above 30 percent.

     2.    For  1981+ vehicles,  to  use  the  207(b)  warranty
           cutpoints   in   order   to   maximize   both   the
           identification of  gross  emitters  and  the  utility
           of  the  warranty  (the  expected  failure rate  for
           these vehicles would  be 3-7 percent  using  an  idle
            test).

These  objectives would  then  be   used  to   guide  the  State's
choice  of emissions  standards  for  each   class   of  vehicles
being inspected.

The second step in  the  QA  process  is to  establish a  way  to
determine  whether   the   objectives  are  being   met.    This
involves  collecting  and  analyzing   data.    In   the   simple
example above,  inspection data would  be  analyzed  to determine
the actual failure  rate being  experienced  by model year  and
overall to see  if the objectives are being  achieved.
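
As an illustration (not part of the original analysis), the sketch
below shows, in Python, how this second step might be carried out
for the example objectives above; the record layout, field names,
and tolerances are hypothetical assumptions.

    # Illustrative sketch only; field names are hypothetical.
    def failure_rate(records):
        """Fraction of inspection records marked as failures."""
        failed = sum(1 for r in records if r["result"] == "fail")
        return failed / len(records) if records else 0.0

    def check_pre1981_objectives(records, overall_target=0.20,
                                 year_low=0.10, year_high=0.30):
        """Return apparent problem areas for further investigation."""
        pre81 = [r for r in records if r["model_year"] < 1981]
        problems = []
        overall = failure_rate(pre81)
        if abs(overall - overall_target) > 0.05:   # tolerance is arbitrary
            problems.append("overall pre-1981 failure rate %.1f%% differs "
                            "notably from the %.0f%% objective"
                            % (100 * overall, 100 * overall_target))
        for year in sorted({r["model_year"] for r in pre81}):
            rate = failure_rate([r for r in pre81 if r["model_year"] == year])
            if not year_low <= rate <= year_high:
                problems.append("model year %d: failure rate %.1f%% is "
                                "outside the %.0f-%.0f%% range"
                                % (year, 100 * rate, 100 * year_low,
                                   100 * year_high))
        return problems

    # Example record (hypothetical field names):
    # {"model_year": 1978, "result": "fail"}

Any items returned by such a check would then become candidates for
the investigation described in the third step.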

The  third step in  the  QA  process  is  to  investigate  any
apparent problem areas to determine  the cause  of  each  problem
and the  corrective  action  needed  to  resolve  it.   This is  by
far the hardest step in  the process  because many  problems  are
often  related  to  several  different  aspects  of  a  program.
Therefore,  it  is  often  difficult  to  pinpoint  the  exact
problem and  solution.   In some cases,  the  real  solution  may
be a  combination of  corrective measures.   Because of  these
complications,  when  a problem is  identified,   it  is important
to   carefully   investigate   it   before    trying   to   reach
conclusions about  its cause and solution.   The  investigation
of problem  areas  usually   involves  an  in-depth   analysis  of
existing  data  and   often   involves  collecting  new  data   or
conducting special studies.

Continuing the  previous  example,   if  it  were determined  that
the  failure   rate  among  pre-1981  vehicles   was  5  percent
instead  of  the 20  percent  objective, the  problem would  not
necessarily be  that the cutpoints were  too  lenient,  as  one
might  think   initially.   Other  factors,  such  as  inadequate
analyzer leak checks, could cause  the  low failure  rate, or  it
could be that the data themselves are not accurate.  These
situations have to be investigated in order to determine the
real cause of the problem.  If the cutpoints were simply
tightened instead, the failure rate might not rise
correspondingly, because the real problem would remain
unresolved.

The fourth and  final step  in  the QA process  is  to implement
the corrective  actions  and then  begin  the  QA cycle  again  to
make sure that the problem  is resolved.

Current QA Issues in I/M Programs

Recent EPA evaluations  have identified three  primary problem
areas in the operating I/M programs:

     1.    High  levels  of  non-compliance  (in  excess  of  20
           percent)  among vehicle owners  in  some regionalized
           programs with sticker enforcement systems.

     2.    Low  reported  failure  rates   (less  than  half  of
           design) in some decentralized programs.

     3.    Seemingly  excessive  waiver rates  in  some  programs
           (waiver  rates  greater  than  10  percent of  failed
            vehicles).

High Non-Compliance:

EPA  has  always  maintained  that  the   most  effective   I/M
enforcement   system   is   provided   by   denying   vehicle
registration to  noncomplying  vehicles.   However,  States  have
been allowed to use alternate enforcement methods,  as  long  as
they  were as   effective  as  a   registration  denial   system.
Because of  their popularity  in  safety   inspection programs,
many States have  opted  to  use sticker enforcement  systems  in
their   I/M  programs.  In  regionalized I/M programs,  however,
sticker  systems  have not  worked  well  because  of   several
factors:

     1.    In   regionalized  I/M   programs,   there  are   many
           unstickered,  excluded  vehicles on local  streets.
           Therefore, it is difficult for police  to determine
           whether  an  unstickered   vehicle   is   actually   a
           violator.

     2.    Stickers themselves are sometimes not designed so
           that violators can be easily identified.

     3.    Stickers are usually on the windshield, whereas the
           license plate (which may identify county or month
           of inspection) is usually on the rear of the
           vehicle, making it difficult to compare the two in
           order to judge compliance.

     4.    Police  are  often  prohibited   from  citing  parked
           vehicles;  therefore,  the  only  vehicles  closely
           examined  for  compliance  are  those  stopped  for
           other reasons.

     5.    Police  often  do  not give  as  much  priority  to
           enforcement of inspection stickers as they could.

One State which has experienced this problem is Colorado.
Upon investigation, Colorado concluded that its problem was
due primarily to the lack of an incentive for local police to
enforce the program (since all I/M fines went to the State
treasury) and to the fact that only moving violators could be
cited.  Therefore, last year the State I/M law was amended to
allow localities to adopt their own ordinances directing I/M
fines to the local treasury rather than the State treasury.
Another amendment allowed parked vehicles to be cited.  The
response to these changes, which took effect in January, has
been dramatic.  In January alone, the City of Denver collected
over $100,000 in I/M fines.  State officials report that the
compliance rate has definitely improved, even though not all
localities have yet adopted local ordinances.

Of course, EPA feels that the ultimate solution to a sticker
enforcement problem would be to enforce the I/M program
through vehicle registration.  But, as the Colorado case
illustrates, other solutions are possible.

Low Reported Failure Rates:

Some decentralized  I/M programs  are  experiencing  low reported
failure rates.  In evaluating I/M programs, EPA has found
that the reported failure rates in decentralized I/M programs
are often lower than those in comparable centralized
programs.  Part
of this phenomenon  may be explained  by  pre-inspection repairs
or  tune-ups.    Another   explanation  may  be  that  inspection
personnel take  shortcuts  in recording  inspection data and do
not  always  report   initial  emissions  failures   which   they
repair  and  retest  immediately.  EPA  believes that  each  of
these arguments is  valid to  an  extent,  but  that  these factors
should not dramatically lower the reported failure rates.

Other more serious  situations which  could also result in  low
reported failure rates would be:

     1.    Cheating    or    incompetence   among    inspectors
           resulting  in  broadscale  improper  record  keeping,
           improper  testing, etc.

     2.    Inadequate  analyzer quality  control  resulting  in
           excessive leaks  and,  thus, low  readings  and fewer
           failures.

Some possible solutions to this kind of problem would include:

     1.    Retraining  of  inspectors  on  the  importance  of
           proper data collection and other procedures.

     2.    Tighter surveillance on analyzers and data.

     3.    Improved  inspection form  or  reporting  mechanisms
           (e.g.,  this  could  include  changing  from  a  manual
           data system to automatic data collection).

     4.    Tighter I/M cutpoints.

While  the  latter  (tighter cutpoints) is a  possible  solution,
especially  in  programs  that have  been  operating  for a  few
years,  EPA  believes  that   most  current  I/M  programs  have
adopted reasonable I/M cutpoints, except that a  number  of the
programs are  using less  stringent  standards than the  207(b)
cutpoints  for  1981  and  newer  vehicles.   EPA  believes  that
there are ample data available from both EPA and State
testing to show that the 207(b) cutpoints are effective in
identifying gross emitters and do not yield excessive failure
rates.  Current testing has shown that, typically, fewer than
10 percent of 1981 and newer vehicles fail an idle-mode short
test using these cutpoints.  In addition, of course, most
1981 and newer vehicles  failing  an  I/M test are  eligible  for
warranty protection.  Therefore, EPA  feels  that  all  State  and
local I/M programs should use the 207(b) cutpoints for 1981
and newer vehicles.

Excessive Waiver  Rates:

Some States are  reporting seemingly high waiver  rates.  High
waiver  rates  are  a  concern  because,   in general,   lower
emissions reductions are  obtained from waived vehicles.  High
waiver  rates  are  often  symptomatic  of  other problems.   For
instance, high waiver  rates can sometimes  indicate  a problem
with the competence  of mechanics and, thus,  a  need  for  more
mechanic training.   Poor analyzer  quality  control  practices
in  repair  garages  can also  cause   waiver  rates  to  be  high
because a  mechanic, relying on an  inaccurate  analyzer,  may
inadequately  repair  the  vehicle  before  it  is  submitted  for
retesting.   There  may  also  be  problems,  in some  cases,  with
the procedures used  in  issuing waivers or  in the  criteria  on
which waivers are based.  Sometimes,  for instance, the waiver
repair  cost  limit  may be  too  low,  or  there  may  not be  a
provision to prevent tampered vehicles from receiving waivers.

In at least one State, there has been an apparent problem
with high average repair costs, which in turn have caused the
waiver rate to be high.  Such information provides further
evidence of poor mechanic skills, which lead to unnecessary
repairs and higher costs.

Some possible  solutions to correct a  high waiver rate  would
include:

     1.    Stricter waiver criteria and procedures.

           a.    Require  tampering  to  be  repaired  before  a
                 vehicle is eligible for a waiver.

           b.    Review  receipts   or  work  orders  to  verify
                 that repairs are appropriate for  the  type  of
                 vehicle and  the type  of I/M failure.

           c.    Disallow waivers  for  vehicles covered  under
                 warranty or  prepaid maintenance  agreements.

           d.    Increase the  repair  cost ceiling or  require
                 specific  minimum repairs in lieu  of a  repair
                 cost ceiling.

     2.    Better  mechanics training.

           a.    Start or  expand formal training.
           b.    Create incentives for  mechanic participation.
           c.    Promote the  benefits  of the training.

     3.    Better  quality control procedures for analyzers  in
           repair    facilities   (especially   in    centralized
           programs where the repair facilities are generally
           under  less scrutiny  by  the  State).

     4.    Better  surveillance  of repair  facilities including
           monitoring of waiver  rates  by facility.

     5.    Better   consumer  awareness  about  the  kinds   of
           repairs and  their approximate  costs for different
           types  of I/M failures.

     6.    Better   monitoring  of   repair  costs  by  garage
           (perhaps publish a list of  average costs by garage
           for  public information).

Other Problem Areas:

In addition  to the  three  primary problem areas  noted  above,
EPA has also  noted  several lesser problems  in  evaluating  the
operating  I/M  programs.   In some cases,  minor problems  have
been  noted  with  analyzer  maintenance  and  quality  control
procedures, especially  in  the area  of finding and  repairing
system  leaks.   There have  also  been  some  reported  problems
with  data  collection,   both   in  manual  and  automatic  data
collection systems.

Importance of Data Collection

Data collection and analysis is the key element of the QA
process.  Its importance cannot be overstated.  Without
accurate   data,   it   is   impossible   to   ensure   a   properly
operating program.  Without accurate data, problems  cannot  be
identified for resolution,  and,   just  as  important,  successes
cannot be verified.

The following data are important  in the QA process:

     1.    Test data on  vehicles.

     2.    Summaries  by  inspection  station  (or by lane in  a
           centralized system).

           a.     Failure rate.
           b.     Waiver  rate.

     3.    Repair  data.

           a.     Types of repair.
           b.     Cost of repairs.

     4.    Surveillance  data.

           a.     Audit results.
           b.     Results of undercover  operations.
           c.     Results of roadside or independent checks.
           d.     Results  of   challenge   tests  or   complaint
                 investigations.
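
One possible way to organize these four categories of data for
analysis is sketched below; the class and field names are
illustrative assumptions only, not a description of any existing
program's data system.

    # Illustrative sketch only; class and field names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class TestRecord:                 # 1. Test data on vehicles
        station_id: str
        model_year: int
        hc_ppm: float                 # idle-test hydrocarbon reading
        co_percent: float             # idle-test carbon monoxide reading
        result: str                   # "pass", "fail", or "waived"

    @dataclass
    class StationSummary:             # 2. Summaries by station (or lane)
        station_id: str
        inspected: int
        failed: int
        waived: int

    @dataclass
    class RepairRecord:               # 3. Repair data
        repair_types: list = field(default_factory=list)
        cost: float = 0.0

    @dataclass
    class SurveillanceRecord:         # 4. Surveillance data
        station_id: str
        kind: str                     # "audit", "undercover", "roadside", ...
        finding: str = ""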

The  methods  of  collecting   these  data  vary  considerably
according   to   the   type   of   program   (centralized    or
decentralized),   the    type    of   analyzers    (manual    or
computerized), and  the  type of data.  Almost all centralized
I/M  programs   use computers   to  automatically  collect test
data.  However, there is more variety among decentralized
programs in collecting test data.  Some use computerized
analyzers which have automatic data collection, a few use
machine-readable forms, but most collect test data manually.
In the first two cases, the test data can easily be analyzed
electronically.  Most States which collect test data manually,
however, rely on analyzing random samples of the test data by
transferring the sample data to computer media.  Random
sampling is usually adequate for overall statistics, as long
as the sample is large enough to be representative.
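
As a rough illustration of what "large enough to be representative"
can mean, the standard normal-approximation formula for estimating
a proportion gives the required sample size; the expected rate and
tolerance used below are illustrative assumptions.

    # Illustrative sketch only: n = z^2 * p * (1 - p) / E^2
    import math

    def required_sample_size(expected_rate, margin, z=1.96):
        """Records needed to estimate a proportion to within +/- margin
        at roughly 95 percent confidence (simple random sample)."""
        return math.ceil(z ** 2 * expected_rate * (1 - expected_rate)
                         / margin ** 2)

    # Estimating a failure rate expected to be near 20 percent to within
    # +/- 2 percentage points calls for roughly 1,500 sampled records.
    print(required_sample_size(0.20, 0.02))   # 1537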

Station-by-station statistics can easily be derived in those
cases where the test data are collected by computer or on
machine-readable forms.  In the case of manual data
collection, station summaries can easily be obtained through
monthly (or  other periodic) activity  reports.   In such cases,
simple  tally sheets  can  be used  to  report  the   number  of
vehicles  inspected,  passed,  failed, and  waived during  the
month (or other period).  This type of report requires
minimal time for station personnel to complete, yet gives the
program manager an easy and quick way to track failure rates
and waiver rates by inspection station and overall.
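
A minimal sketch of how such tally-sheet counts might be turned
into station summaries follows; the station names and counts are
invented for illustration, and the waiver rate is expressed as a
share of failed vehicles, consistent with the earlier discussion.

    # Illustrative sketch only; station names and counts are invented.
    # Monthly tally: number inspected, passed, failed, and waived.
    tallies = {
        "Station 01": {"inspected": 420, "passed": 336, "failed": 84, "waived": 6},
        "Station 02": {"inspected": 510, "passed": 465, "failed": 45, "waived": 9},
    }

    for station, counts in sorted(tallies.items()):
        failure_rate = counts["failed"] / counts["inspected"]
        waiver_rate = counts["waived"] / counts["failed"] if counts["failed"] else 0.0
        print("%s: failure rate %.1f%%, waiver rate %.1f%% of failed vehicles"
              % (station, 100 * failure_rate, 100 * waiver_rate))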

Repair  data  should  be  included on  the  test  form whenever
possible.    This   is   often  done by  having  a  checklist  of
repairs which  can be  checked  off as  performed.   When  such
lists are  used,  it   is  important to periodically revise that
part  of the  form to  add new items  or  delete  those that  are
never  used.    In  many  cases,  the  repair  data  sections  on
current I/M  forms are incomplete or  out-of-date  because they
do  not  include  categories  applicable  to  1981  and  newer
vehicles.

Surveillance data  should  be collected  through  formal  audit
reports and  other reports on surveillance activities.   These
reports must be  reviewed manually,  but the  information they
provide is essential  for tracking station performance.

Summary

Regardless of  the type  of  I/M program,  QA is  an  essential
function.    QA  provides   the  means  for  documenting  program
successes  and for addressing the  following  questions:

     1.    Is the program operating as intended?

     2.    Is it meeting its objectives?

     3.    Are there special problems that need to be
           addressed?

     4.    What is the proper way to solve these problems?

     5.    Once implemented, did the solution really work?

Data collection  and  analysis  is  the  key element  of the  QA
process.  Various  methods  are  available to  the I/M  program
manager to obtain and analyze the needed data.

As all  program managers know,  regardless  of how good  things
are, they  could  always be  better.  The  purpose  of  the  QA
process  is   to help  managers  identify ways  to  keep  their
programs functioning optimally.
