United States
Environmental Protection
Agency

Office of Air Quality
Planning and Standards
Research Triangle Park, NC 27711

EPA-450/2-88-002
February 1988

Air

National Air Audit System
Guidance Manual for
FY 1988 - FY 1989

-------
                              Table of Contents

Chapter  Section                                                          Page
  1.     Introduction................................................     1-1

            Goals and Objectives of the National Audit System........     1-2
            Audit Protocol...........................................     1-4
            Audit Reports and Use of Audit Data......................     1-6

  2.     Air Quality Planning and SIP Activities.....................     2-1

         Introduction................................................     2-1

         Emission Inventories........................................     2-3

            Use of Criteria Pollutant Emission Inventory.............     2-3
            VOC Inventories in 03 Nonattainment Areas................     2-6
            Demonstration of Reasonable Further Progress.............     2-9
            CO Inventories in CO Nonattainment Areas.................     2-12
            Mobile Source Inventories in 03 and CO
               Nonattainment Areas...................................     2-13
            PM10 Inventories in Group I and II Areas.................     2-16

         Modeling....................................................     2-18
            Experience and Training..................................     2-18
            Model Availability.......................................     2-19
            Alternative Modeling Techniques..........................     2-21
            Modeling Analyses........................................     2-23

  3.     New Source Review...........................................     3-1

         Introduction................................................     3-1

         FY 1988-1989 NSR Audit Procedures...........................     3-1

         Completing the Audit Report.................................     3-4

         Suggested Worksheets........................................     3-5

         New Developments in New Source Review.......................     3-6

         Description of the Permit File Questionnaire................     3-10

            Source Information.......................................     3-10
            Public Participation and Notification....................     3-11
            Applicability Determinations.............................     3-12
            Control Technology.......................................     3-16
            Air Quality Monitoring Data (PSD)........................     3-17
            PSD Increment Analysis...................................     3-18
            NAAQS Protection.........................................     3-19
            Emission Offset Requirements.............................     3-20

-------
Chapter  Section                                                          Page

  4.     Compliance Assurance........................................     4-1

         Introduction................................................     4-1
         Periodic Review and Assessment of Source Data...............     4-1
         Asbestos Demolition and Renovation (D&R)....................     4-4
         File Review.................................................     4-6
         Overview Inspections........................................     4-8
         Compliance Assurance Report Format..........................     4-9

  5.     Air Monitoring..............................................     5-1

         Introduction................................................     5-1
         Regulatory Authority to Perform a System Audit..............     5-3
         Preliminary Assessment and Systems Audit Planning...........     5-9
         Guidelines for Conducting Systems Audits of
            State and Local Agencies.................................     5-15
         Criteria for the Evaluation of State and Local
            Agency Performance.......................................     5-26
         Systems Audit Questionnaire - Short Form....................     SF-1
         Systems Audit Questionnaire - Long Form.....................     LF-1

  6.     Motor Vehicle Emissions Inspection Programs

         Introduction and Purpose....................................     6-1
         Overview of the I/M Audit...................................     6-2
            Monitoring Program Effectiveness.........................     6-2
            Initial I/M Audits.......................................     6-3
            Follow-up Audits.........................................     6-4
            Corrective Action........................................     6-5
         Initial Program Audit Process...............................     6-5
            Advance Preparation......................................     6-5
            Site Visit...............................................     6-7
            Audit Report.............................................     6-13
         Follow-Up Audit Process.....................................     6-14
            Follow-Up Audit Activities...............................     6-14
            Follow-Up Audit Reports..................................     6-15
            Corrective Action........................................     6-15

-------
Chapter  Section                                                          Page

  6.     (Continued)

         APPENDICES

         A.   Method for Determining Required Emission Reduction
                 Compliance in Operating I/M Programs................     A-1
         B.   Descriptions of I/M Program Elements...................     B-1
         C.   I/M Program Audit Questionnaire........................     C-1
         D.   Descriptions of On-Site Audit Activities...............     D-1
         E.   Procedures Used in OMS Tampering Surveys...............     E-1
         F.   Instructions and Forms for Audit Activities............     F-1
         G.   Checklist for Completing the I/M Audit.................     G-1

-------
                             1.  INTRODUCTION


     The National  Air Audit System (NAAS)  was  first  implemented  in  1984
as a result of a joint STAPPA/ALAPCO and EPA effort.   This  FY  1988-1989
audit manual  has been based on the FY 1986-1987  manual,  but has  been  revised
to include new emphases and to upgrade the quality of  the  audit  effort over
the last cycle.

     In the past, EPA set forth overall policy for the NAAS, based on
agreement with STAPPA/ALAPCO and the Regional  Offices.  The portions  of
that policy that will continue to apply this year, together with new
emphases, follow:

     o  Program Coverage - Coverage will consist of the same five topics as
        before (air quality planning and SIP activities,* new source review,
        air monitoring, compliance assurance, and vehicle inspection/maintenance).

     o  Audit Teams - EPA Regional Offices will select the composition of
        the audit teams.  A crossover team approach (State/local and outside
        Regional Office representatives) is possible for FY 1988-1989 if
        Regions and affected States so desire.  All audit team members should
        have sufficient knowledge of the programs they audit; the audit is
        not intended as an on-the-job training program for inexperienced
        personnel.  Also, as a minimum, middle or senior Regional management
        should personally participate in the exit interview with the agency
        director.

     o  Audit Coverage - As was the case last year, each of the four following
        audit topics should be covered at each State agency: (a) air  quality
        planning and SIP activities; (b) new source  review; (c)  compliance
        assurance; and (d) air monitoring.  Only those agencies  with  I/M
        programs that have been on-going for one year  or more  should  be
        considered for I/M audits.
     o  On-site Visits and Pre-preparation - Regions should continue to
        conduct on-site audit visits, but should maximize use of telephone
        interviews and "on-hand" information to prepare for the visit and
        to complete significant portions of the questionnaire.  Question-
        naires should be exchanged and results studied and evaluated
        prior to each on-site visit.
        *The air quality planning and SIP activities chapter has been shortened
         and will include only emission inventory and diffusion modeling activities.
                                     1-1

-------
     o  Schedule - To even out audit workloads on the State and local
        agencies and the EPA auditors, STAPPA/ALAPCO and EPA agreed that
        the NAAS effort will be spread over two years.  Each EPA Region
        will be responsible for ensuring that all State agencies and
        selected local agencies are audited within the two year period.
        It is recommended that approximately half the agencies be audited
        the first year and the remaining agencies the second.  Regions
        may begin their FY 1988-1989 audit as early as they wish, so long
        as the audited agency is agreeable and the previous item ("On-
        site Visits and Pre-preparation") is accomplished prior to the on-
        site visit.  The Regional Office must forward a final audit
        report to OAQPS within 180 days after the audit is completed.

     o  Corrective Actions - Actions to implement the needed improvements
        identified in the audits will be initiated through existing
        mechanisms, i.e., 105 grants, State/EPA agreements, etc.

     o  National Report - At the end of the two year audit cycle, the
        OAQPS will prepare a national report based upon the results of
        all Regional audits.  The national report will not rank agencies
        or focus on deficiencies of specific agencies.  Before they are
        finally issued, drafts of the national report will be reviewed by
        EPA Regions and representatives of STAPPA and ALAPCO.

     o  Replacement for Other Audits - The NAAS has replaced Regional
        Office audit activities carried out prior to initiation of the
        NAAS.  It does not, however, replace all portions of the 105
        grant evaluations that are specified in the grant regulations.

     o  Other Oversight - The NAAS is intended to eliminate the need for
        much of the "item by item" Regional Office oversight on certain
        State and local agency programs.  The NAAS, however, will not be
        a substitute for the necessary flow of communications between
        State/local agencies and EPA Regional Offices.
GOALS AND OBJECTIVES OF THE NATIONAL AIR  AUDIT SYSTEM
     The purpose of developing a national audit manual is to establish
standardized criteria for the EPA Regions' audit of State and local air
program activities.  The primary goals of this program are to determine
the obstacles (if any) which are preventing the State or local air
pollution control agencies from being as effective as possible in their
air quality management efforts and to provide EPA with quantitative
information on how to define more effective and meaningful national
programs.  States are playing a larger role than ever in the planning and
implementation of complex, and often controversial, air pollution control
strategies.  EPA oversight of these and related activities is necessary
for ensuring national consistency and for assisting the States in resolving
identified problems.
                                   1-2

-------
     The EPA and States can also use these audit results to ensure that
available resources are being focused toward identified needs (e.g.,
attainment and maintenance of standards, adoption of regulations,
implementation of regulations and technical  analyses to support  control
strategy development).

     The EPA also hopes to share the results of these audits in  a manner
that permits the "cross-fertilization" of innovative approaches  and
systems across States and Regions.  Only through this national  exchange
can we hope to benefit from the invaluable experiences gained to date by
control agencies in carrying out the requirements of the Clean Air Act.

     This audit guideline outlines a program which EPA, State,  and local
air pollution control agencies can jointly use to--

     o  Meet statutory requirements;

     o  Assist in developing at least a minimally acceptable level of
        program quality;

     o  Allow an accounting to be made to the Congress and the public of
        the achievements and needs of air pollution control programs;

     o  Enable EPA, States, and local agencies to agree on needed technical
        support and areas where program improvements (including regulatory
        reform) should be made; this includes improvements to both EPA
        and State/local programs;

     o  Maximize and effectively manage available resources within the
        State and local agencies and EPA, resulting in attainment and
        maintenance of ambient air quality standards as soon as possible;
        and

     o  Promote a better understanding of the problems facing air pollution
        control agencies, thereby fostering mutual respect among EPA,
        State, and local agency staff.

     State, local, and EPA Regional Offices, working together, may identify
items in addition to those of the national program that are worthy of
further audit attention.  In identifying these, the EPA Regional Office
and the State/local agency should understand in advance what the reasons
are and what the objectives and expected result(s) of this expanded review
will be.  Also, the NAAS is not intended to preclude EPA Regions from
dealing, on a case-by-case basis, with significant deficiencies which are
identified during the course of the audit.

     The EPA, State, and local agencies should keep in mind that the
audit is intended to improve the overall quality of air pollution control
programs.  This intent of improving overall performance needs to be
clearly understood.  The standards of performance outlined by these
guidelines are not so rigid that they eliminate the flexibility afforded
by the Clean Air Act.  Also, these guidelines should not be construed to
                                   1-3

-------
establish performance standards which must absolutely be achieved in
practice.  Moreover, while participating agencies will  use the audit to
point out where opportunities exist for State or local  improvements, it
is not expected that the audit will address every problem.  The EPA,
however, will continuously search for and disseminate information about
better ways of consistently, effectively, and efficiently implementing a
comprehensive air pollution control program.  This includes possible
reforms of EPA's requirements where feedback from the audits suggests
that certain requirements detract from program effectiveness.

AUDIT PROTOCOL

     Each Regional  Office must tailor the structure of the audit according
to the particular characteristics of the State and local  agencies in the
Region and its own operating procedures.  Certain elements and procedural
steps, however, appear necessary or useful in FY 1988-1989 based on
previous experience.  These are discussed below.

Advance Preparation

     The EPA should send a letter to the control agency well in advance
of the audit.  The letter should confirm the date and time of the audit
and describe what resources the State is expected to provide, such as
office space and staff time.  This letter should also identify the name
and title of each EPA individual who will participate in the audit.

     With the exception of file audits and similar questionnaires, the EPA
Region will provide the control agency with the nationally prepared
questionnaire.  These should be sent to the State or local agency 6 weeks
in advance of the audit and, thus, will allow the agency to better prepare
for the audit.  The State or local agency should fill out the specified
parts of the questionnaires and return a copy of the completed questionnaire
to the Regional Office 2 weeks before the on-site visit.   The Regional
Office subject experts should review the completed questionnaire and use
it to prepare the audit team for the on-site visit.  Returning the
completed questionnaire to the Regional Office before the visit should
serve to minimize wasted effort and time reviewing the questionnaire
during the on-site visit and to prepare the audit team for discussions
that focus on any problems uncovered in the questionnaire.  Because of
time limitations, however, it may not be possible in all cases to return
the completed questionnaire to the Regional Office before the on-site
visit; in these cases, the State or local agency will have to make the
completed questionnaires available to the audit team during the on-site
visit.

     The chapter for each of the audit topics presents the specific
protocol for that audit topic.  This includes procedures  such as advance
tailoring of the questionnaires for the air quality planning and SIP
activities audit, selection of files for new source review audits, and
instructions for use of the individual questionnaires.
                                   1-4

-------
On-site Visit

     The primary purposes of the on-site visit are to--

     o  engage in a broad discussion with agency staff to gain insight
        into any recent changes in the structure of the organization,
        discuss specific problem areas of the agency, and become acquainted
        with the staff in order to better open up channels of communication;

     o  discuss and clarify answers to the questionnaire and complete
        any questions not answered by the audited agency;

     o  review on-site documents that are too cumbersome to transmit, such
        as permits, modeling runs, and supporting files; and

     o  audit by observing the agency's daily operations of programs for
        air monitoring, compliance assurance, new source review, planning
        and SIP activities, and (where appropriate) vehicle inspection
        and maintenance.

     Typically, the on-site audit is conducted in four phases:

     o  The EPA auditors for all programs meet with the State agency
        director and top staff to discuss the goals of the audit and to
        "break the ice."  This meeting usually sets a  cooperative tone
        for the visit.

     o  The EPA auditors conduct a discussion of the questionnaire with
        the person(s) in charge of each of the activities to be audited.

     o  The EPA auditors will review appropriate files; this will usually
        be necessary for each of the five audit areas  to varying extents.

     o  The exit interview is held as a wrap-up session to inform agency
        management of the preliminary results of the audit.  This promotes
        harmony between EPA and the State by giving immediate feedback of
        the results in a face-to-face meeting between the people actually
        performing the audit and those responsible for the programs being
        audited.  The EPA middle or upper management from the Regional
        Office should participate in at least this portion of the on-site
        visit.

     The time which the audit team spends to complete the various phases
of the on-site visit should usually not exceed 3 days.  This general
rule, however, will be difficult to adhere to in certain instances, such
as when satellite facilities of the agency must also be visited.  In any
event, the duration of the on-site visit should be mutually agreed upon
in order not to create an undue burden to the agency being audited.

     A recent survey of the Regions indicated that many Regions preferred to
conduct the four or five parts of the audit separately rather than all at
the same time as generally recommended in the above guidance.  This decision
is left to each Region, with the understanding that the States should be
fully informed as to the intentions of the Region in this regard.


                                   1-5

-------
AUDIT REPORTS AND USE OF AUDIT DATA

     Each State or local agency audit report should contain the findings
for each of the four or five audited areas.  The audit report must include
a copy of the completed questionnaire(s) (except file audit or similar
questionnaires—see specific instructions in appropriate chapters of this
manual) for each of the audit topics to enable national  compilation of
audit results.  Since the questionnaires will  be included, the audit
report should not merely reiterate the answers on the questionnaires, but
present the Regional Office's overall findings.  The Region should give
the State or local agency an opportunity to comment on a draft of the
report before it is released outside of EPA.  This allows misunderstandings
and errors to be discovered before the report is made final.

     The audit report should contain an executive summary that provides
the Regional Office's overall assessment of the audited  agency's program
after reviewing all the questions as a whole.

     Major deficiencies identified during the audit should also be high-
lighted in the executive summary.  This enables the Region to detail all
the findings of the audit without causing the reader to  confuse minor
points with major problems.  It also identifies to the audited agency
those deficiencies considered most serious.

     Where an agency disagrees with the conclusions of the audit, it
should provide to EPA written comments outlining its perspective.  These
will be incorporated as an appendix into the final report.  The report
should also highlight outstanding and/or innovative program procedures
that are identified.

     The audit would be of limited use without some mechanism for rectifying
identified deficiencies.  Therefore, it is important that the report recom-
mend measures or steps to treat the causes determined to be responsible for
these inadequacies.  Lead agencies responsible for implementing these recom-
mendations and anticipated resource requirements should  also be considered.

     Each question should be answered on the questionnaire itself;
attachments should be avoided unless a question specifically requests
them.  If attachments are requested and included, the attachments should
be forwarded to OAQPS along with the questionnaire.

     A recommended format for preparation of the audit report is given in
Table 1-1.  Following a standardized format will not only enable reviewers
to easily find material in the text, but will  also facilitate compilation
of  information into the national report.

     A national report compiling the findings of the audits conducted by
the EPA Regions will be prepared by EPA Headquarters.  This analysis will
be based on the reports prepared by the EPA Regions discussed above.  It
will not rank agencies or focus on specific deficiencies in individual
programs.  While it will address areas of conflict between EPA guidance
and actual implementation experience, it will not be a forum for
addressing unresolved issues between audited agencies and States.

                                   1-6

-------
     The NAAS initiative is designed as a partnership effort to help EPA
and State and local  agencies each  do their respective  jobs  better.  It  is
our hope that it can become the foundation which  all  involved  can use to
make solid progress  in protecting  and enhancing the quality of our  Nation's
air.                   '
                                   1-7

-------
                             •   Table 1-1


                   RECOMMENDED  FORMAT FOR  AUDIT  REPORTS
Introduction—Purpose of audit, for benefit of potential layman readers;
identify persons on EPA audit team and persons interviewed in audit visit.

Executive Summary—Outline major findings, major  deficiencies, and major
recommendations.  As the name implies,  it  is  a  summary  designed  for  the
chief executive—the director—of the audited agency.   The  summary should
cover all audit topics:

     Air Quality Planning and SIP Activities
     New Source Review
     Compliance Assurance
     Air Monitoring
     Vehicle Inspection and Maintenance (where  audit  is conducted)

Air Quality Planning and SIP Activities—Follow recommended  format
in Chapter II of Audit Manual

New Source Review—Follow recommended format  in Chapter III  of Audit Manual

Compliance Assurance—Follow recommended format in  Chapter  IV  of Audit
Manual

Air Monitoring—Follow recommended format  in  Chapter  V  of Audit  Manual

Vehicle Inspection and Maintenance—Follow recommended  format  in Chapter VI
of Audit Manual

Appendices

     Completed Air Quality Planning and SIP  Activities  Questionnaire
     Completed New Source Review Questionnaires (except for  permit
     file questionnaire)
     Completed Compliance Assurance Questionnaire
     Completed Air Monitoring Questionnaire
     Completed Vehicle Inspection and Maintenance Questionnaire (where
     audit is conducted)
     Audited Agency's comments on draft report  (for final report)
                                   1-8

-------
                                   Chapter 2

               AIR QUALITY PLANNING AND SIP ACTIVITIES GUIDELINES

                                  FY 1988-1989



Section    Title	   Page

            Introduction..............................................   2-1

 B          Emission Inventories......................................   2-3
 B.1           Use of Criteria Pollutant Emission Inventory...........   2-3
 B.2           VOC Inventories in 03 Nonattainment Areas..............   2-6
 B.3           Demonstration of Reasonable Further Progress (RFP).....   2-9
 B.4           CO Emission Inventories in CO Nonattainment Areas......   2-12
 B.5           Mobile Source Inventories in 03 and CO
                  Nonattainment Areas.................................   2-13
 B.6           PM10 Inventories in Group I and II Areas...............   2-16

 C          Modeling..................................................   2-18
 C.1           Experience and Training................................   2-18
 C.2           Model Availability.....................................   2-19
 C.3           Alternative Modeling Techniques........................   2-21
 C.4           Modeling Analyses......................................   2-23

-------
                2.  AIR QUALITY PLANNING AND SIP ACTIVITY
INTRODUCTION

     The FY 1988-1989 audit guidance for air quality planning and SIP
activity has been revised from the guidance used in FY 1986-1987.  Two
sections were removed from the chapter for this round of audits.   The
air quality evaluation section was removed because this baseline data has
been collected for three audit cycles.  SIP implementation questions were
not included because the agencies will be involved in developing PM10 and
ozone SIPs during the current audit cycle.

     Many of the comments received on past guidance expressed concern
that the Air Quality Planning and SIP Activity chapter was more of a
survey questionnaire than an audit questionnaire.  We are trying to make
the transition to an audit type questionnaire.  However, due to the
impending need to get the FY 88-89 audit cycle underway, complete transition
will not be possible until the upcoming FY 90-91 audit.

     Audit criteria are provided below for each of the program areas to
be audited.  Each topic is prefaced by a brief discussion of what activities
it encompasses and what we generally hope to accomplish through the audit.

PROTOCOL

     Procedurally, the Regional Offices will, upon receiving this
questionnaire from OAQPS, fill in Section C.4.a on modeling analysis before
sending it to the State and local agencies.  These agencies will then
complete the questionnaire and return it to the Regional Office at least
2 weeks before the on-site visit.  Regional EPA staff would then review
the State/local agency responses prior to the on-site audit.  Regional
Office auditors should discuss all questionnaire items with the appropriate
agency personnel during the audit.

     In compiling,the data for the FY 1986-1987 national report,  it was
noted that a number of the questionnaires for the Air Quality Planning
and SIP Activities chapter were  incomplete.  Whether or not the agencies
return the questionnaire with all of the questions answered, the  auditors
must verify' the answers provided and complete all parts of the questionnaire
that are left blank.  It should be remembered that this is an EPA audit
of the agencies' programs; therefore, the answers in the questionnaire
must reflect the findings of the auditors, even if the agencies are not
in complete agreement with these answers.  The differences are to be
discussed with the agencies during the audit and between the issuance
of the draft report and the final report.  If the differences cannot be
resolved, the agencies' comments are to be attached to the final  report.
The Regional Offices are expected to ensure the quality of the reports
and see that all the questions are answered in the questionnaire.
                                      2-1

-------
     The draft report from the Regional Office should consist of an overall
executive summary and highlights of the emissions inventories and modeling
sections.  The questionnaire would then follow this narrative audit
summary.


EMISSION INVENTORIES

     The emission inventory provides information concerning source
emissions and defines the location, magnitude, frequency, duration, and
relative contribution of these emissions.  An inventory is useful in
designing air sampling networks, predicting ambient air quality, designing
control strategies, and interpreting changes in monitored air quality
data.  Plans for attaining and maintaining NAAQS are dependent on a
complete and accurate emission inventory.  The FY 1988-1989 guidance has
focused on the major nonattainment problem of ozone by directing questions
toward the completeness and quality of the agencies' VOC emission inventories.
In addition, questions on PM10 have been added to the questionnaire.

     In the implementation of a nationwide program of air quality
management, consistent methods of inventory compilation are essential.
An adequate emissions inventory must be accurate, complete, and up-to-
date, and provide for consistency in planning between metropolitan areas,
States, and Regions.

MODELING

     Air quality models are being used more extensively  in the conduct
of  day-to-day activities in the planning and SIP program area.  These
activities include such things as attainment demonstrations, major source
compliance determinations, new source review, evaluations of "bubbles,"
and assessing attainment status.  Most State agencies should have the EPA
reference models on-line that are available for use in these and other
types of applications.  The modeling  audit is intended to gather information
regarding the agency's demonstrated expertise and capability to perform
necessary air quality modeling analyses consistent with accepted EPA procedures.

     This guidance reviews the various kinds of modeling applications
performed or evaluated at the State/local program.  Because the Region
will have already reviewed certain site-specific modeling analyses which
the State/local agency has submitted (such as bubbles, new source permits,
 etc.), the questionnaire asks the Regional Office to list the results of
 these evaluations.
                                      2-2

-------
B.  Emission Inventories

B.1.  Uses of Criteria Pollutant Emission Inventories


Emission inventories are used in a number of applications  by  air  pollution  control
agencies, including the development of SIPs an
-------
     b.  Quality Assurance

Quality (or validity) assurance for emission inventories  involves  checks  of  the
procedures, emission factors,  calculations,  etc.,  that  were  used during compil-
ation as well  as checks for missing sources  and edit  checks  for reasonableness.
Specify below the statement that best  describes the quality  assurance
measures that are conducted on your State's  emission  inventories.

       1.	  Formal, rigorous, regular checks are implemented

       2. 	  Less formal, spot checks are  made,  on  an irregular  basis

       3. 	  No quality assurance measures are implemented

Are other efforts made to ensure all existing sources are included in the
inventory?  (describe) _______________________________________________________

Are other efforts made to verify that source and emissions information is
accurate and up-to-date?  (describe) __________________________________________
     c.  What should EPA do to help you  make your criteria  pollutant  inventories
more comprehensive, accurate and current?   (Check "x" where appropriate.)

                                        Very             Adequate    No  Strong
                                     Important    Useful     As  Is       Opinion

       1. Provide better guidance on....

           -Point sources               	       	       	         	

           -Area sources     ,           		       	         	

           -Highway vehicle             	       	       	         	

           -Locating sources            	       	       	         	

           -Questionnaire design        	       	       	         	

           -Quality assurance           	       	       	         	

           -Data handling               	       	       	         	

           -Reflecting SIP regs         	       	       	         	
             in projection inventory

           -Other(specify)	



                                     2-4

-------
                                        Very              Adequate   No Strong
                                     Important    Useful    As Is      Opinion
       2. Improve emission factors
           in AP-42
       3. Provide computerized
           systems having better
           data handling capabilities   	
       4. Other(specify)
     d.  Do agency personnel engaged in emission inventory development have,
or have access to, all current guidance on emission factors and emission
inventory preparation?  (RO personnel performing the audit should obtain
a current listing of existing guidance from the Criteria Emissions Section
at OAQPS prior to the audit.  It should be used as a checklist on this
question in conducting the audit.)


EPA Regional  Office Response  (Confirmation or Comment)  	       	
                                     2-5

-------
B.2.  VOC Inventories in 03 Nonattainment Areas

Current guidance requires that VOC inventories in 03 nonattainment areas be
adjusted in various ways to reflect reactive emissions occurring during the 03
season.  The following questions address the adjustments made in your agency's
VOC inventory and apply to all components of the inventory, i.e., point, area
and highway vehicle sources.  (NOTE:  If your State contains no nonattainment
areas for 03, check here [    ] and go to B.3.)

State/local agency response

     a.  EPA guidance specifies that  methane, ethane, methylene  chloride,
methyl  chloroform, trifluoromethane and 6 chlorofluorocarbons  should be
excluded from 03 SIP inventories as nonreactive.  Indicate below your  agency's
exclusion of nonreactive VOC compounds from its 03 SIP  inventory.
(Check "x" where appropriate.)

1. _____ No VOC compounds have been excluded as nonreactive (if checked,
       go to question b.)

Check the compounds that are excluded from the 03 SIP as  nonreactive.

2.	Methane                    3.	Ethane

4. 	Methylene chloride         5.  	Methyl chloroform

6. 	 Trifluoromethane           7.  	 Chlorofluorocarbons CFC-11, CFC-12,
                                         CFC-22, CFC-113, CFC-114, CFC-115
8. 	 Others - specify compounds 	


9. 	 The agency excludes the following compounds based  on  vapor pressure
       cutpoints: ____________________________________________________________
     b.  What technical  basis does your agency use to identify  and  quantify
nonreactive VOC?  (Check "x" where appropriate.)

1. _____ Nonreactive VOC not excluded, so question not applicable

2. _____ Use EPA's VOC Species Data Manual, as revised 1988.

3. 	Use MOBILE4 option to generate nonmethane  VOC  emission  factors  for
       highway vehicles.

4. 	Use general species profiles from the  literature.

5. 	 Sources are asked to list nonreactive  compounds  in their VOC emissions,

6. _____ Vapor pressure cutpoint of __________.

7.	Other (specify) 	

                                     2-6

-------
     c.  The highest levels of ozone formation generally  occur  on  weekdays
during the summer months.  Current guidance requires  that VOC inventories
represent typical weekday emissions during the summer ozone  season.   Have your
VOC totals been adjusted for conditions  representative of the 03 season  such as
higher temperatures, lower Reid Vapor Pressure (RVP)  of gasoline,  etc.?

1. 	Yes   	No

If yes, check the appropriate statement(s) below:

2. _____ Higher 03 season temperatures have been considered in
         generating highway vehicle emission factors

3. _____ Higher 03 season temperatures have been considered in estimating
         evaporative losses from petroleum product (including gasoline) storage
         and handling.

4. _____ Lower summertime RVP's have been considered in estimating evaporative
         losses from gasoline storage and handling.


     d.  A number of source categories have recently  been identified  as  being
potentially significant VOC emitters that have not  traditionally been  included
in VOC inventories, especially those relating to fugitive and/or waste treat-
ment processes.  Have the following sources been included in your  agency's  VOC
inventory?  (Specify "yes" or "no," or "N/A" if no  such sources are  located in
your area.)

1. _____ POTW's (Publicly Owned Treatment Works, i.e., sewage treatment plants)

2. _____ TSDF's (Treatment, Storage and Disposal Facilities for hazardous wastes,
         including landfills, surface impoundments, waste piles, storage and
         treatment tanks, hazardous waste incinerators, and injection wells)

3. _____ Municipal landfills (domestic garbage, rubbish, etc.)

4. _____ Fugitive leaks from valves, pump seals, flanges, compressors, sampling
         lines, etc., in organic chemical manufacturing facilities (esp. SOCMI)

5. _____ Leaks from underground gasoline storage tanks
                                     2-7

-------
     e.  In question B.1.d, we asked how EPA could help you on your criteria
pollutant inventory.  Indicate below  where  your  agency specifically  feels
better information or guidance is  needed to improve its VOC  inventory.   (Check
"x" where appropriate.)

                                                      Current Data
                                   Very                Or  Guidance    No  Strong
                                 Important     Useful     Adequate     Opinion

1. Excluding nonreactive VOC       	          	       	           	
2. 03 season adjustment of
   VOC totals

3. Emission factors for
   sewage treatment plants
   and hazardous waste
   treatment, storage, and
   disposal facilities

4. Other(specify)	
EPA Regional  Office Response (Confirmation  or  Comment)
                                     2-8

-------
B.3.  Demonstration of Reasonable Further Progress  (RFP)

If your State contains no 03 nonattainment areas, check here  [	]  and  go  to' B.4.

State/Local Agency Response

     a.  For 03 nonattainment  areas,  the Clean Air  Act  requires  SIPs  to provide
for tracking of VOC emission reductions to ensure RFP.  Through CY 1987 did your
agency actually track changes in VOC emissions or emission reductions with
projected changes in emissions or emission reductions given in the SIP RFP
curves?¹  Check yes or no for each nonattainment area listed below.  If no,
insert the letter code best representing the reason RFP was not  tracked.

Areas where 03 RFP should be tracked:                Yes    No     Reason
(Regional Office to provide list)                                (use codes)

1. _____________________________________            ____   ____   ________

2. _____________________________________            ____   ____   ________

3. _____________________________________            ____   ____   ________

4. _____________________________________            ____   ____   ________
  CODE
  A.   Alternative tracking mechanisms used (e.g.,  air quality  data),  not
       directly involving emission inventories.
  B.   RFP tracking not considered a priority  task.
  C.   Insufficient resources available to track  RFP  by maintaining  up-to-date
       emission inventory data.
  D.   Insufficient guidance available on how to do RFP tracking.
  E.   Other (specify) __


     b.  RFP Report:  If "yes," did your agency, in the past year:

       •  1. Prepare a report on RFP?  _ yes  _ no

         2. Submit the report to EPA?  _ yes _  no

         3. Make the  report available for public  comment?  _  yes _ no
  ¹This question applies only to those areas specified by the Regional Office
   as 03 extension areas and areas where EPA called for SIP revisions.  (EPA
   Regional Offices can provide a list of these areas.)  Note that RFP tracking
   means compiling a realistic estimate of an individual year's emissions
   or emission reductions and comparing this to the appropriate year in the SIP
   RFP curve.  Merely compiling air quality data trends as an alternative to
   compiling emissions data is not accepted as RFP tracking.
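
     As an illustration of the comparison described in this footnote, the
sketch below reads the allowed emissions for a given year off a straight-line
RFP curve and checks an updated inventory estimate against it.  It is only a
sketch: the linear-curve assumption, the function name, and every number shown
are hypothetical; an agency's actual curve and inventory values come from its
SIP and its updated emission inventory.

     def rfp_target(base_year, base_tons, attain_year, attain_tons, year):
         """Allowed emissions for `year`, interpolated on a linear SIP RFP curve."""
         fraction = (year - base_year) / (attain_year - base_year)
         return base_tons + fraction * (attain_tons - base_tons)

     # Hypothetical example: a 1982 base year of 100,000 tons VOC, a 60,000 ton
     # attainment target in 1987, and an updated 1986 estimate of 70,500 tons.
     target = rfp_target(1982, 100_000, 1987, 60_000, 1986)   # 68,000 tons allowed
     actual = 70_500
     print("RFP met" if actual <= target else "RFP not met")  # prints "RFP not met"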

                                     2-9

-------
For EPA Regional  Office response
         4. Has the RO received the above reports?  _____ yes  _____ no
         5. If "yes," has the RO commented or otherwise responded to the
            State or local agency on the report?  _____ yes  _____ no

         6. Are you assured that any progress indicated is the result of
            real  emission reductions rather than  the  result of changes  in
            methodologies, emission  factors, etc.? 	yes	no

     c.  RFP Emissions Inventory Update:  For each of the areas where 03 RFP
         should have been tracked in the past year, what approximate percent
         of the VOC emissions in the inventory was updated for the major,
         minor, and mobile source categories that year?  The areas are the
         same as those listed by the Regional Office in question a.

         Use one of the following percent ranges in the table.
         0-19%,   20-39%,   40-59%,   60-79%,   80-100%
                          Stationary Sources                          Mobile
Areas where 03       Major          Minor           Minor            Sources
RFP should be                       Regulated       Unregulated
tracked.  See       %    RO*       %    RO*        %    RO*         %    RO*
question a.

1.                ____  ____     ____  ____      ____  ____       ____  ____
2.                ____  ____     ____  ____      ____  ____       ____  ____
3.                ____  ____     ____  ____      ____  ____       ____  ____
4.                ____  ____     ____  ____      ____  ____       ____  ____
5.                ____  ____     ____  ____      ____  ____       ____  ____
*The Regional Office auditor should ask the agency for documentation on the
 extent of updating the RFP inventory and initial in this column if
 documentation appears adequate.
                                    2-10

-------
     d.  New RFP tracking guidance entitled Revised Guidance For Tracking
Reasonable Further Progress (RFP) In Ozone Control Programs, September 1986,
was published by EPA to apply to post-1987 03 SIPs.

         Is your agency aware of this guidance?

              Yes _____   No _____

         (If no, contact the Regional Office for more information.)

         If yes, is your agency aware of its requirements?

              Yes _____   No _____

         Does your agency plan to implement these requirements?

              Yes _____   No _____

         If no, why? __________________________________________________________
                                     2-11

-------
B.4.  CO Emission Inventories In CO Nonattainment Areas


If your State contains  no CO  nonattainment  areas, check here  [	] and go to B.5.


State/Local  Agency Response


     a. Indicate which  response(s) below  describe the  geographic coverage and
focus of your agency's  CO inventory.   (Check  "x" where appropriate.)

1. 	  Major emphasis  is on  maintaining  a  CO  inventory for highway vehicle
        sources for certain traffic  areas such  as Central Business Districts,
        intersections,  or specific nonattainment areas

2. _____ Areawide or countywide CO inventory is maintained, covering major CO
        point sources,  area sources  and highway vehicles

3. 	  CO inventory is not currently  maintained



     b.   Are woodstoves included in your CO  emission  inventory?  	Yes	No

     c.  Is the highway vehicle inventory or transportation/traffic data used
         to locate potential CO hot spots?
         _____ Yes   _____ No



EPA Regional Office Response  (Confirmation  or  Comment)  	
                                    2-12.

-------
B.5.  Mobile Source Inventories in 03 and CO Nonattainment  Areas

Mobile source emissions inventories for highway vehicles are often compiled by
the air pollution control  agency acting in concert with the local  planning
agency or transportation department.  In some instances, the local MPO or DOT
will compile the inventory independently as the lead responsible  agency.   In
general, mobile source emissions are calculated by applying mobile source
emission factors to transportation data such as vehicle miles traveled (VMT),
trip ends, etc.  Mobile source emission factors are available for various
vehicle types and conditions from an EPA emission factor model  entitled MOBILE4
(or from earlier versions).  Important conditions affecting emissions  are
vehicle age and mix, speed, temperature, and cold start operation.
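
As a simple illustration of the arithmetic described above, the sketch below
multiplies an emission factor (grams per mile) by daily VMT and converts the
result to tons per day.  The factor and VMT values shown are hypothetical
stand-ins for output of a MOBILE-series model and data from the local
transportation planning process.

     GRAMS_PER_TON = 907_184.74   # grams in one short ton

     def tons_per_day(ef_grams_per_mile, daily_vmt):
         """Daily emissions (tons) for one pollutant and one vehicle/road grouping."""
         return ef_grams_per_mile * daily_vmt / GRAMS_PER_TON

     # Hypothetical ozone-season weekday for one nonattainment county: an assumed
     # fleet-average VOC factor of 2.8 g/mile applied to 4,500,000 VMT per day.
     print(round(tons_per_day(2.8, 4_500_000), 1), "tons of VOC per day")  # ~13.9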

If your State contains no 03 or CO nonattainment areas, check here [    ] and
go to B.6.

State/local agency response.

The State or local agency should answer the following questions even  if a
transportation or planning agency is responsible for the mobile source
inventory.

     a.  Which agency maintains the highway vehicle emission inventory
     for the 03 and/or CO nonattainment areas? (Check "x" where appropriate.)

          1.  _____ Air pollution agency (State or local)
          2.  _____ Local planning organization (MPO, COG, RPC, etc.)
          3.  _____ State or local transportation department (DOT)
          4.  _____ Other (specify) _______________________________
          5.  _____ None is maintained
          6.  _____ Unsure

     b.  If an agency other than the air agency maintains the mobile source
     inventory, indicate what difficulties (if any)  result.   (Check  "x"
     where appropriate.)
          1. _____ No significant difficulties are evident

          2. _____ Scheduling and coordination of activities are negatively affected

          3. _____ The air agency loses control of the design and format of the
                   inventory

         4. 	 The responsible agency has not been  adequately funded  to  be
                responsive

         5. 	Additional technical guidance is needed for  effective  communi-
                cation of program needs to another agency

         6. 	Other (specify)	
                                    2-13

-------
     c. Conversely, indicate what benefits (if any) accrue from having another
agency responsible for the mobile source inventory.  (Check "x" where
appropriate.)

         1. 	No significant benefits result

         2. 	 Less resource drain on the air agency

         3. 	The air agency doesn't have to develop transportation planning
                expertise

         4. 	A better product results

          5. _____ Other (specify) _______________________________
     d.  Which emission factor model  (MOBILE 2,  2.5,  3,  or 4)  was used to
generate the mobile source emission factors for  the most recently
developed or maintained inventory? _____ (Indicate number or "U" if unsure.)

     e.  Were the mobile source emission factors in the  model  tailored to
your area to account for the following parameters?  (Indicate  "Yes," "No" or if
unsure, specify "U.")


         1.  	Vehicle mix
         2.  	 Vehicle age
         3.  	Speed
         4.  	Ozone season temperature
          5.  _____ Cold/hot start operating modes


     f.  Were data from the local transportation planning process used to
compile the most recently-developed or maintained mobile source inventory?
(e.g., VMT, street locations, traffic volumes,  growth patterns, etc.)
Yes _____   No _____   Unsure _____


     g.  If not, were gross areawide estimates of VMT or gasoline sales used
to compute emissions?   Yes _____   No _____   Unsure _____


     h.  An important component of travel  sometimes overlooked in mobile
source inventories is VMT associated with minor roads and connectors, often
called "local" or "off network" travel.  Was local travel included in
your most recently-developed or maintained mobile source inventory?
Yes _____   No _____   Unsure _____
                                    2-14

-------
     i.  The results of an earlier audit  indicated  that  significantly fewer
mobile source inventories  contained NOX emissions than VOC emissions.
Indicate if this is still so and why it is so for your agency.  (Check "x" where
appropriate.)

         1. 	Not so.  Our mobile source  inventory  contains both NOx and VOC.

         2. 	 NOx inventory not perceived  as  needed because NOx reductions are
                not required for 03 control

         3. 	NOx inventory perceived as  needed for 03 but it  was not included
                because of resource limitations

         4. 	Other (specify)  	
     j.  An earlier year's audit asked  each  agency  to  specify the base year of
the mobile source inventory in the SIP,  which  gave  a limited idea of how well
these inventories have been maintained  to  the  present.  What is the latest year
of record for which your agency's  highway  vehicle inventory has been updated?
When was this done?

         1.  19	 (latest year of record)

         2.  19	 (year when latest update was performed)


EPA Regional Office Response (Should confirm the answers in B.5) ______________
                                    2-15

-------
B.6.  PM10 Inventories in Group I and II Areas

Current guidance requires that PM10 emission inventories be developed for Group
I and Group II areas, which have moderate to high probabilities for nonattainment
of the PM10 ambient standard levels.  The following questions address the
availability of information/guidance needed to compile the PM10 inventories and
areas of difficulty encountered by State/local agencies in compiling these
inventories.  (NOTE:  If your State contains no Group I or II areas or if your
local agency responding to this audit survey is not in a Group I or II area,
check here [    ] and do not complete this section.)

State/local agency response

     a.  EPA has issued guidance and requirements for compiling PM10 emission
inventories.  This information was presented at PM10 Workshops held in August
1987 at four U.S. locations and published in PM10 SIP Development Guidance, as
supplemented.  To what extent has your agency understood this guidance and
requirements in initiating your PM10 inventory efforts?
(Check "x" where appropriate.)

       1.  	The guidance and  requirements  were readily understood.   Inventory
efforts are proceeding.

       2.  	The guidance and  requirements  were moderately difficult  to
interpret and further guidance was/is/will  be sought from EPA.  Inventory
efforts are proceeding even while  awaiting  some clarification.

       3.  	The guidance was  very difficult to understand.   Clearer  guidance
and requirements are needed before proceeding with PM10 inventory efforts.

       4.  List all problem areas:
     b.  EPA has published PM10 emission factors in Compilation of Air Pollutant
Emission Factors, AP-42 (as supplemented).  Have these been adequate to develop
the PM10 inventory?  If gaps in your inventory have been encountered or are
anticipated, due to lack of PM10 emission factors in particular source categories,
what percent of your inventory is (or would be) affected by such gaps (check "x"
where appropriate)?
          0-10%

         11-20%

         21-40%
         over 40%
                                    2-16

-------
     c.  List specific areas in your PM10 inventory where gaps exist:
     d.  EPA recently established an Emission Factor Clearinghouse focusing on
PM10.  A Clearinghouse Contact has been designated in each EPA Regional
Office.  The purpose of the Clearinghouse is to facilitate filling of PM10
emission factor gaps.  The Clearinghouse responds to State/local requests and
tries to fill gaps by technology transfer and other quick techniques where
emission factors are needed for particulate source categories; it also
evaluates agencies' own proposals for filling these gaps.  Where possible, data
used to develop new emission factors will be made available.  To what extent
will your agency use the Clearinghouse?  (Check "x" where appropriate.)
         _____ little or not at all (have nearly all the PM10 emission
              factors that are needed)

        __ to a moderate extent (will be helpful  in completing the
              inventory)

        _ extensively (will  rely on it heavily since many gaps
              exist or are anticipated)

     e.  PM10 inventories cover point, area and mobile sources.  Does your
agency have sufficient guidance for all three types of sources to complete the
inventory?  (Mark "x" where appropriate.)

         Yes _

         No
If no, for which type(s) of sources do you lack sufficient guidance?

         Point 	

         Area	

         Mobi1e
                                     2-17

-------
C.   Modeling

C.1.  Experience and Training

The ability of an agency to effectively deal  with modeling problems  depends
upon the personnel  resources available to the agency.   Competent  and experienced
personnel are essential  to the successful application  of dispersion  models.
The need for specialists is critical  when more sophisticated  models  are  used  or
the area being studied has complicated meteorological  or topographic features.
Please summarize your agency's staff  levels  and modeling experience.
State/Local Agency Response:

a.   Please complete the following table indicating the number of persons  in
     each of the training and experience categories:


                                  	Experience	
    Training	                0-2 yrs    2-5  yrs     5-10  yrs    >10  yrs .

1.  Meteorologist                 		

2.  Engineer/Scientist            	  		
    with modeling training

3.  Engineer/Scientist	
    without modeling training

4.  Other educational  background  	  	   	  	
    with modeling training

5.  Other educational  background  ________  ________   ________  ________
    without modeling training



EPA/Regional Office Response (confirmation or comment):
                                    2-18

-------
C.2.  Model Availability

     The ability of an agency to effectively deal  with modeling problems  that
     may arise also depends on the facilities (hardware and software)  available
     to the staff to perform or confirm modeling analyses.   Please summarize
     below your agency's accessibility, expertise and usage of computer-based
     air quality models.

State/Local  Agency Response:

a.   To which air quality models does your staff have access?  Also indicate
     whether your staff is capable of running the model and the approximate
     number of applications during the last fiscal year.
     1.  Circle UNAMAP version number:   ______     Other     Don't know

     2.  Specific Models

                                       Access        In-house        Yearly Usage
                                                  Expertise to Use (Estimated Number
                                    (yes or no)     (yes or no)    of Applications)

EPA Guide-      BLP                 ___________   ______________   ______________
line            CALINE3             ___________   ______________   ______________
Models(1)       CDM 2.0             ___________   ______________   ______________
                CRSTER              ___________   ______________   ______________
                ISCLT               ___________   ______________   ______________
                ISCST               ___________   ______________   ______________
                MPTER               ___________   ______________   ______________
                RAM                 ___________   ______________   ______________
                UAM                 ___________   ______________   ______________
                OCD                 ___________   ______________   ______________

Other           APRAC-3             ___________   ______________   ______________
Models(2)       AQDM                ___________   ______________   ______________
                PAL-2               ___________   ______________   ______________
                PLUVUE II           ___________   ______________   ______________

Screening       PTMAX               ___________   ______________   ______________
Techniques(3)   PTDIS               ___________   ______________   ______________
                PTMTP               ___________   ______________   ______________
                PTPLU-2             ___________   ______________   ______________
                VALLEY              ___________   ______________   ______________
                COMPLEX             ___________   ______________   ______________
                LONGZ/SHORTZ        ___________   ______________   ______________
                EKMA                ___________   ______________   ______________
                                    2-19

-------
b.   Is access generally by:

     1.  	Telephone line  to State/local  agency  mainframe  computer?
     2.  	Telephone line  to private  or  subscription  computer?
     3.  	 In-house dedicated computer?
     4.  	 Telephone line  to EPA computer?
     5.  	Personal Computer?

c.   Does your staff have the capability to modify  software for  the
     above models? 	 Yes 	 No
     If yes, which models have been modified?	
     Where modified guideline models  have  been  used,  State/local  agency
     should answer appropriate part of  question C.3.

Footnotes:

     (1)  Models recommended in the "Guideline  on  Air Quality  Models
          (Revised)"  (1986)  and Supplement  A  (1987).

     (2)  In addition to the examples given,  list  the nonguideline models
          available to you  and indicate whether any have  been  used on  a
          case-by-case basis.  Include  long range  transport models,
          photochemical  models, complex terrain models  and any other models
           for situations where EPA has not provided guidance.

     (3)  In addition to the EPA screening  techniques listed,  indicate
          the accessibility  and usage of any  other screening techniques
          available to you.

EPA/Regional Office Response (confirmation  or comment):
                                     2-20

-------
C.3.  Alternative Modeling Techniques

     EPA modeling guidance recommends specific models  and  data  bases  to  be
     used in regulatory modeling.   However,  the guidance also indicates  that
     an alternative model  or data  base may be used  in  an individual case if
     it can be demonstrated that the alternative technique is more  appropriate
     than the recommended technique.  Describe the number of, and circumstances
     related to, modeling analyses where it was necessary to use techniques other
     than those specifically recommended in EPA guidance.

State/Local Agency Response:

a.  In approximately what  number of the modeling analyses  performed by your
    agency in the last fiscal  year was it necessary to use techniques not
    specifically recommended in EPA guidance?

          1.  _____ times out of _____ modeling analyses performed.

b.  If an alternative model was used, indicate the  reason(s) for its  usage,
    using the list below:   (alternative data bases  are covered  in part C of
    this question)
                                                Number  of    Reason(s)
                                                  Cases     for  Use*
          1. Use of nonguideline model

          2. Modification of guideline model

          3. Use of nonrecommended option in
             a guideline model

          4. Use of guideline model outside
             its stated limitation

          5. Other (describe:                    )
*Reasons for use:  (List one or more of the following  codes  as  applicable
 in the space above)


     CODES
     A.    The alternative technique was judged, for technical reasons,
           to be more appropriate for the situation.
     B.    Lack of access to the guideline model  recommended for  the  situation.
     C.    The alternative model  was judged,  through  a  performance  evaluation,
           to be more appropriate for the situation.
     D.    No EPA guidance applies to the situation.
     E.    Other (specify below).
                                    2-21

-------
c.  If there are cases where the selection/usage of data bases for
    models is different from those recommended in EPA guidance, please
    indicate the number and circumstances surrounding each case.
                                 Number         Reason  (brief  Statement)
                                 of Cases
    1. Use of less than 5 years
       of off-site or less than
       1 year of on-site
       meteorological  data.

    2. Use of techniques other
       than those contained  in
        the "Guideline on Air
       Quality Models" for
       determining background

    3. Use of techniques other
       than those contained  in
       EPA policy on treatment
       of calms

    4. Use of techniques other
       than those contained  in
       the EPA policy on design
       of receptor network
EPA Regional  Office Response (confirmation  or  comment):
                                    2-22

-------
C.4.  Modeling Analyses

a.   State and local  agencies will  normally conduct/review and  submit  to
     EPA modeling analyses to support certain  actions.   EPA will  review
     these analyses case-by-case and approve or disapprove them in  the
     Federal  Register.

TO BE COMPLETED BY THE  EPA REGIONAL OFFICE:

     Please indicate  in column A approximately how many  modeling  analyses
described above were  submitted to EPA during the last  fiscal  year in each  of
the program areas listed below.  In column B,  indicate the number of analyses
where EPA has required  the State/local  agency  to revise  the analysis.   In  column
C, use the code letters provided below to indicate the technical  areas contributing
to the answer given in  column B.

                               A                      B                    C
                         # of analyses          # of analyses        Contributing
                            reviewed          requiring revision       factors*

   1. Bubble (emission    _________             _________             _________
      trades)

   2. Section 107         _________             _________             _________
      redesignations

   3. New source review   _________             _________             _________
      (including PSD)

   4. Nonattainment area  _________             _________             _________
      SIP analyses

   5. Lead SIP's          _________             _________             _________

   6. Other SIP modeling  _________             _________             _________
      *Indicate by code(s) which of the following technical  areas  were  either
       inadequate or deviated from EPA Guidance.

          CODES
          A.    Use of inappropriate guideline model
          B.    Use of nonguideline model  without a  performance  evaluation  to
                demonstrate acceptability  of the  model
          C.    Urban/rural dispersion coefficients
          D.    Emission inventory and operating  design  parameters
          E.    Meteorological  data base
          F.    Receptor network design
          G.    Complex terrain considerations
          H.    Downwash consideration
          I.    Comparison with acceptable air quality  levels
          J.    Technical  documentation of modeling  analysis
          K.    Other 	

State/Local  Agency Response (confirmation  or comment)

                                     2-23

-------
b.    In those instances where the modeling analysis supporting a particular
      action is performed by industry and/or another governmental entity, the
      agency will review and approve or disapprove the modeling analyses.
TO BE COMPLETED BY THE STATE/LOCAL AGENCY:

     Indicate in column A approximately  how many  modeling  analyses described
above were reviewed during the last fiscal  year in each of the  indicated
program areas.  In column B, indicate the number of analyses where the agency
required the responsible party to revise the analysis.  In column C, use the code
letters provided below to describe the technical  area contributing to the  answer
given in column B.

                               A                      B                    C
                         # of analyses          # of analyses        Contributing
                            reviewed          requiring revision       factors*

   1. Bubble (emission    _________             _________             _________
      trades)

   2. Section 107         _________             _________             _________
      redesignations

   3. New source review   _________             _________             _________
      (including PSD)

   4. Nonattainment area  _________             _________             _________
      SIP analyses

   5. Lead SIP's          _________             _________             _________

   6. Other SIP modeling  _________             _________             _________
      *Indicate by code(s)  which of the following  technical  areas  were  either
       inadequate or deviated from EPA Guidance.

            CODES
            A.    Use of inappropriate guideline model
            B.    Use of nonguideline model without a performance evaluation to
                  demonstrate acceptability of the model
            C.    Urban/rural dispersion coefficients
            D.    Emission inventory and operating design parameters
           E.    Meteorological  data base
           F.    Receptor network design
           G.    Complex terrain considerations
           H.    Downwash consideration
           I.    Comparison with acceptable air quality  levels
           J.    Technical  documentation of modeling analysis
           K.    Other
EPA Regional Office Response (confirmation  or  comment)


                                    2-24

-------
-------
                                 Chapter 3

                     New Source Review Audit Guidelines.

                                FY 1988-1989

Section  Title                                                            Page

1.0      Introduction  . . . . . . . . . . . . . . . . . . . . . . . .    3-1

2.0      FY 1988-1989 NSR Audit Procedures . . . . . . . . . . . . . .    3-1

3.0      Completing the Audit Report . . . . . . . . . . . . . . . . .    3-4

4.0      Suggested Worksheets  . . . . . . . . . . . . . . . . . . . .    3-5

5.0      New Developments in New Source Review . . . . . . . . . . . .    3-6

6.0      Description of the Permit File Questionnaire  . . . . . . . .    3-10

6.1         Source Information . . . . . . . . . . . . . . . . . . . .    3-10
6.2         Public Participation and Notification  . . . . . . . . . .    3-11
6.3         Applicability Determinations . . . . . . . . . . . . . . .    3-12
6.3.1          Definition of Source  . . . . . . . . . . . . . . . . .    3-12
6.3.2          Fugitive Emissions  . . . . . . . . . . . . . . . . . .    3-13
6.3.3          Potential to Emit . . . . . . . . . . . . . . . . . . .    3-13
6.3.4          Emission Netting  . . . . . . . . . . . . . . . . . . .    3-13
6.3.5          Emission Limits . . . . . . . . . . . . . . . . . . . .    3-15
6.4         Control Technology . . . . . . . . . . . . . . . . . . . .    3-15
6.4.1          NSR/PSD Sources . . . . . . . . . . . . . . . . . . . .    3-15
6.4.2          Non-NSR/PSD Sources . . . . . . . . . . . . . . . . . .    3-17
6.5         Air Quality Monitoring Data (PSD)  . . . . . . . . . . . .    3-17
6.6         PSD Increment Analysis . . . . . . . . . . . . . . . . . .    3-18
6.7         NAAQS Protection . . . . . . . . . . . . . . . . . . . . .    3-19
6.8         Emission Offset Requirements . . . . . . . . . . . . . . .    3-20

Fig. 1.   Major NSR Audit Topics and Associated Questions  . . . . . .    3-7

Table 1.  Reference Tables for Use with FY 1988-1989 NSR Audit
          Questionnaires . . . . . . . . . . . . . . . . . . . . . . .    3-22

Form 1.   NSR Permit Summary Questionnaire

Form 2.   NSR Audit Summary Questionnaire  . . . . . . . . . . . . . .    3-25

Form 3.   Permit File Questionnaire for Major Sources Subject
          to PSD or Part D . . . . . . . . . . . . . . . . . . . . . .    3-29

Form 4.   Permit File Questionnaire for Major Sources Not Subject
          to PSD or Part D . . . . . . . . . . . . . . . . . . . . . .    3-43

Form 5.   Worksheet for Audit  . . . . . . . . . . . . . . . . . . . .    3-60

-------
                           3.  NEW SOURCE REVIEW
1.0  INTRODUCTION

     The procedures for carrying out the FY 1988-1989 NSR audit will  remain
largely the same as they were for FY 1985-1987.   That is, the same four
questionnaires will be used, and the onsite audit will  continue to focus on
the examination of current permit files.  Perhaps the most significant
change is the overall switch to a two-year audit cycle.  As a result,
individual NSR programs will be audited every two years.   Some changes  have
been made with respect to the audit forms, primarily for  the purpose  of
clarification, but also in some cases to modify  or expand the type of
information that will need to be collected.  Specific changes are included
in a later section which describes each of the questionnaires.  The four
questionnaires are:

     1.  NSR Permit Summary Questionnaire (Form  1)

     2.  NSR Audit Summary Questionnaire (Form 2)

     3.  Permit File Questionnaire for Major Sources Subject to PSD or
Part D (Offsets) (Form 3)

     4.  Permit File Questionnaire for Sources not Subject to PSD or  Part D
(Offsets) (Form 4)

     Some of the audit subjects covered in this  section continue to involve,
in whole or in part, issues that could be affected by proposed EPA rulemaking
or ongoing litigation [e.g., CMA agreement rulemaking proposed on August 25,
1983 (48 FR 38742)].  These particular items are potentially impacted  by
regulatory amendment.  Should changes to the affected requirements be
promulgated, EPA will issue revised guidance as to how the audit should
handle them.  Until such time that the existing  Federal requirements  and
the State rules developed pursuant to these 40 CFR Part 51 provisions  can
be changed, this guideline will assume that all  rules will continue to  be
implemented under the EPA requirements presently in effect.

2.0  FY 1988-1989 NSR AUDIT PROCEDURES

     At least 30 days before the scheduled onsite audit, the Regional
Office should send a copy of Form 1 to the appropriate agency.  Audited
agencies should be asked to complete the questionnaire before the onsite
audit so that it can be returned to the audit team before or during their
visit.  A set of instructions for completing Form 1 should accompany  the
questionnaire when it is sent to an agency.  The instructions provide  important
information which will help to ensure that the agency responses will be made
in a reasonably consistent format.

     The file review is a very important aspect  of the NSR audit process.
For the new source review audit, permit files should be selected on the
basis of permit action type, source type and size, source location, public
concern, and other factors geared to ensuring review of a variety of  permitting
actions and decisions by the agency.  Criteria to consider include:


                                    3-1

-------
          o  Review both large (major) and small (minor) sources;

          o  Review both new plants and plant modifications;

          o  Review a PSD source for which preconstruction monitoring data
             requirements apply;

          o  Review a PSD source near a Class I area;

          o  Review sources that avoided PSD or Part D review because of
             restrictions on their operation or capacity;

          o  Review some sources in nonattainment and sanctioned areas,
             if applicable;

          o  Review some of the most common source types in that State
             (for example, boilers and asphalt plants), but also review
             a variety of other source types;

          o  Review a PSD source with toxic "unregulated" pollutants, to
             ascertain whether the BACT determination appropriately addresses
             them;

          o  Review a controversial permit, a permit of high public
             interest, or one that would be of particular interest for
             reasons other than those described above; and

          o  Tabulate the total number of permits and the categories of these
             permits.

     By combining several of these factors in one permitting action, it
may be possible to satisfy the criteria above with relatively few
permits.  Generally, however, in order to obtain a reasonable sampling of
permits, the auditor should randomly select at least five PSD/Part D and
ten other permits issued since the last audit.  If this random selection
does not seem to represent the variety of criteria indicated above, the
auditor should note this and specifically select additional permits for
review which do reflect the missing criteria.
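
     The selection logic above can also be pictured as a short script.  The
sketch below is only an illustration of the sampling step, not part of the
audit protocol; the record fields (permit_type, near_class_i) are hypothetical
stand-ins for whatever indexing an agency's permit log actually provides.

     import random

     def select_permit_files(permits, n_major=5, n_other=10, seed=None):
         """Randomly draw at least five PSD/Part D permits and ten other
         permits issued since the last audit, then note any selection
         criteria the random draw failed to cover."""
         rng = random.Random(seed)
         major = [p for p in permits if p["permit_type"] in ("PSD", "Part D")]
         other = [p for p in permits if p["permit_type"] not in ("PSD", "Part D")]
         sample = (rng.sample(major, min(n_major, len(major))) +
                   rng.sample(other, min(n_other, len(other))))
         # Example criterion check: if no sampled permit is near a Class I
         # area, the auditor would add one by hand.
         if not any(p.get("near_class_i") for p in sample):
             extras = [p for p in permits if p.get("near_class_i") and p not in sample]
             sample += extras[:1]
         return sample

     # Hypothetical permit log.
     log = [{"id": i, "permit_type": t, "near_class_i": (i % 9 == 0)}
            for i, t in enumerate(["PSD", "Part D", "minor", "minor",
                                   "new", "modification"] * 8)]
     print(len(select_permit_files(log, seed=1)))
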
     For the 1988-89 NSR audit, the auditor is  NOT required  to fill out the
file audit questionnaires for every permit file selected  and examined.
Instead, after selecting the permit files that  will be examined,  in accordance
with the criteria described above,  audit questionnaires  should be  filled
out for a minimum of ONE PSD, ONE Part D (offsets) and THREE other (non-PSD/
Part D) permits.  The following procedures should be  used:

1.  For completion of the PSD/Part D permit questionnaire (Form 3)--

     Select the most recent PSD permit and the  most recent Part D  (offset)
permit and fill out the major source questionnaire, to the extent  applicable,
for each one.  If the agency did not issue at least one of each of
these types of permits, select and review at  least two of the other type,
preferably the most recent new plant and the  most recent  modified  plant.


                                    3-2

-------
2.  For completion of the non-PSD/Part D questionnaire (Form 4)--

     Beginning with the most recently issued permits, select at least three
permit files that represent a variety of permit review situations as described
in the criteria above.  A file should be rejected only if it too closely
resembles a file already selected for evaluation using the questionnaire,
except that the auditor should try to include at least one permit where
major review was avoided through restrictions to the source's operation or
capacity.

3.  All remaining major and minor source permits should be examined as time
allows using the appropriate questionnaire as a guide to ensure that the
applicable audit topics are adequately addressed.  The auditor may complete
the questionnaire for any or all of the remaining permit files, but he or
she is not required to do so.  It is understood that permit file review
time may vary greatly.  Auditors are encouraged to review as many of the
selected files as possible, but, should time run short, they are advised to
conduct only a limited review of these remaining files, concentrating on
problems identified by the completed questionnaires to determine whether
the problem is common to several permits or only to an isolated case.  Of
course, if time is available, auditors should conduct a more detailed
review of the remaining permits (as well as others not originally selected)
to see if any additional observations can be made.

     It cannot be overemphasized—auditors must not spend all of their time
examining permit files at the cost of meaningful communications with audited
agencies.  In conducting the onsite audit, auditors are advised to strike a
reasonable balance between examining the selected permit files and maintaining
meaningful dialogue with the appropriate agency personnel.  Feedback from
the FY 1986-87 audit indicated that, although auditors have increased their
dialogue from FY 1985, more emphasis is still needed.  The lack of dialogue
was generally cited as a problem by the auditors and the audited agencies
alike.  An example of how such dialogue could be achieved is to have an
agency representative—preferably the permitting engineer most familiar
with the selected permit file—present during the examination of one or
more permit files.*

     The presence of the agency representative would enable him or her to
observe how the file audit is actually conducted.  More importantly, such an
individual would be able to help expedite the file search by indicating at
the appropriate times during the file audit where (and whether) specific
information is to be found.  At the same time, he or she is likely to
become more appreciative of the need for carefully organized and well
documented files as specific information is being sought from the file to
complete the appropriate questionnaire.  It is also important that an
*An alternative approach might be to meet with some of the permitting engineers
after having completed several file audits to explain how the file audit
was conducted and to explain how certain conclusions were reached in responding
to the file audit questionnaires.  During this time, the auditor may also
want to ask certain questions of the engineers with regard to any unanswered
or unclear issues.
                                    3-3

-------
opportunity be afforded for the auditor to explain applicable new source
review requirements and procedures where a lack of understanding may be
apparent.  In all, both the auditor(s) and the agency representative(s)
stand to benefit from dialogue occurring in conjunction with the file
audit.

     Auditors will find it useful to take with them a calculator and a
few basic documents:

     (1) Copies of 40 CFR 51.18 and 51.24 [now recodified under 40 CFR 51,
Subpart I],

     (2) AP-42, for checking whether all emission units were included, and

     (3) A copy of the reference sheet included in this guidance (Table 1).

3.0  COMPLETING THE AUDIT REPORT

     At the close of each audit, the auditor(s) should have (a) approximately
5 completed NSR questionnaires—2 for major sources and 3 for minor (or not
subject to PSD or offsets) sources; (b) notes (or additional questionnaires)
on approximately 10 additional permit files; and (c) a subjective impression
of the audited agency.  This information must now be organized and reported
so that it can be incorporated into the State or local agency audit report.

     The NSR audit report should be written in a narrative form, organized
as closely as possible according to the NSR audit topics and specific areas
of concern (questions) under each topic.  Figure 1 identifies the seven NSR
audit topics and associated audit questions.  These topics and questions
were selected for auditing by the new source review audit committee comprised
of State, local, and EPA representatives.  The audit topics and questions,
originally defined for the FY 1984 audit, serve as the basis for development
of the permit file questionnaires.  For FY 1988-89, the questions are to be
answered primarily on the basis of information found in the audited files,
but discussions with agency personnel will also be useful.

     The information that will be available to each auditor consists of
both background information (such as the number of sources proposing to
locate within 100 km of a Class I area) and evaluative information (such
as whether an agency is incorrectly using actual emissions, rather than
potential to emit, to determine applicability to PSD or offsets).  In
preparing the NSR audit narrative, the auditor should try to use both types
of information to formulate the findings and recommendations.

     For each problem identified from the permit files, .the following
considerations should be taken into account and discussed in the narrative
where possible:

     a.  The number of cases where the problem was identified versus the
total number of permit files examined for that particular problem.

     b.  Whether the problem is a new one or is a carryover of a situation
that had been identified by previous audits.

-------
     c.  The likely reasons for the occurrence of the problem,  e.g., inadequate
procedures, failure to adhere to existing procedures, insufficient training
or resources, lack of EPA guidance, etc.

     The auditor's subjective judgment can affect conclusions in two ways:
by deciding how serious the problem itself is, and by deciding  whether
there are special  circumstances which affect the seriousness of the problem.
It is important to make certain that perceived problems are discussed with
the audited agency in order to more clearly understand the true nature of the
problem or to determine whether a problem actually exists.  Before a problem
is actually listed in an audit report, the auditor should discuss the issue
with the appropriate personnel in the affected agency.  Often, a lack of
information is just a difference in filing systems.  Give the audited agency
an opportunity to provide solutions before writing up an issue as a problem.
Where a problem is suspected, the auditor's assessment and the  basis for that
assessment should be clearly documented in the narrative report.  When problems
are identified to agencies, the reviewer should distinguish between very
serious problems and problems that would just be "nice" to resolve.

     If there are fewer than five permits in a data base, it may be difficult
to draw conclusions on how widespread a problem is.  However, subjective
impressions often offer valuable insight and are encouraged in  these situations.

     Recommendations should reflect the potential seriousness of the problem.
The factors outlined above should also be considered in developing recommen-
dations for resolving each identified problem.  Potential solutions to problems
found during the audit should be discussed with the audited agency during the
exit interview.  Tentative solutions to some problems may be negotiated at
the time and described in the audit report.

     The NSR audit reports (narrative plus appropriate number of copies of
Forms 1 through 4) should be incorporated into the State or local agency
audit report and forwarded to EPA (CPDO) in accordance with existing guidance.
It is important that all of the completed forms that are submitted as part  of
the audit report be neat and legible.  If originals are not sent to Headquarters,
then the proper care should be taken to ensure that all copied  material  can
be easily read.

     Auditors should take special note and report on any innovative or
alternative permit conditions that agencies have used that meet Federal
requirements but which lead to more effective compliance determinations or
may ensure better operation and fewer violations.  The EPA will describe such
permit conditions in its national report, allowing other agencies to learn
better methods of writing NSR permits.

4.0  SUGGESTED WORKSHEETS

     Feedback from the 1986/87 audits indicates that the information requested
on Forms 3 and 4 is not always presented in the same order in a permit as in
the form.  Therefore, a new worksheet has been developed (see Form 5 on page
3-57) and may be used when reviewing selected permit files.  Use of the work-
sheet may make it easier to find the information requested on the audit form.
                                    3-5

-------
5.0  NEW DEVELOPMENTS IN NEW SOURCE REVIEW

     As with all other things, new source review changes over the years.
There have been several modifications to NSR policy that affect permitting,
as well as potential and actual changes in the regulations.  The auditors
should discuss these with the agency being reviewed.  These include:

     1.  There is renewed emphasis on good quality BACT determinations.
The EPA expects agencies to do a top-down analysis for BACT determinations;
this means that the presumptive BACT is LAER.  The EPA also wants agencies
to consider toxic implications while doing BACT determinations.

     2.  An important aspect of BACT, brought out by the Administrator's
North County Remand (PSD Appeal No. 85-2), is that all pollutants, including
those not directly regulated by the Act, are to be considered in making the
BACT determination.  The BACT review should reflect this consideration of
"unregulated" toxic emissions and the BACT limit tightened if appropriate.
(For the purposes of the audit, this class of pollutants is referred to as
"air toxics" or "toxic air pollutants.")

     3.  The PM10 NAAQS has been promulgated; this means that PSD SIP's
must eventually do analyses for PM10 as well as TSP.  The audit forms have
been modified to include PM10.  The EPA Regional Offices should be aware of
each State's SIP with regard to PM10 and make sure that PM10 is reviewed
where applicable.

     4.  Rulemaking pursuant to CMA may be promulgated this year, but no
one may use regulations revised pursuant to CMA until  the regulations are
incorporated in their SIP.  This will probably not occur during the
FY 1988-89 audit timeframe.

     5.  Rulemaking pursuant to a suit from the Sierra Club is being
conducted by EPA.  This rulemaking may require an increment analysis for
NOX emissions.  This rulemaking will  not become effective until October
1989, which is after the FY 1988-89 audit timeframe.

     Finally, this audit cycle may be the last to analyze NSR permits
after they have been issued.  It is obvious that catching mistakes after
the permit has been issued is not extremely useful in improving the quality
of audited permits.  Although it is true that when agencies are notified of
errors in their permits they do attempt to avoid making the same errors
again, it is also better to avoid ever making the errors.  To this end, EPA
is developing a program to ensure that EPA will review most significant NSR/PSD
permits before they are issued.  This will allow EPA to comment on permits
early enough in the permit process to allow full incorporation in the
resulting SIP.  All members of the NSR community will be kept informed as
this policy is developed.
                                    3-6

-------
     Figure 1.  MAJOR NSR AUDIT TOPICS AND ASSOCIATED AUDIT QUESTIONS

I.  PUBLIC PARTICIPATION REQUIREMENTS

     1.  For which new or modified sources was the public afforded an
opportunity to comment on proposed permits?  Is the State meeting its SIP
requirements for public comment?

     2.  Do the public notices routinely provide adequate information?

     3.  Were other State and local air pollution control agencies and
other officials whose jurisdictions might be affected by the proposed new
or modified source notified of the proposed action?

II.  APPLICABILITY DETERMINATIONS

     1.  Does the agency apply the proper source definition(s) and exemption
provisions?  What definition of "source" is the agency using (plantwide, dual
source, or something else)?

     2.  Does the agency typically use the best available emission projections
and federally enforceable restrictions[*] in defining a new source's (or
unit's) "potential to emit"?

     3.  Does the agency routinely use an existing source's "potential to
emit" to determine major source status for proposed modifications?

     4.  Does the agency use as its netting baseline actual emissions
expressed in TPY?[*]

     5.  Verify that the agency does not allow for "double counting" of
emission decreases used for netting purposes.

     6.  Does the agency adequately address fugitive emissions[*] in
calculating the "potential to emit" and the "net emission increase"?

     7.  Does the agency properly apply the §107 area designations when
determining what type of preconstruction review will  be required of major
construction?

     8.  Verify that the agency does not approve major construction
projects in designated nonattainment areas under an EPA-imposed construction
moratorium.

III.  CONTROL TECHNOLOGY

     1.  Does the review agency check the applicants' selection of the
appropriate control technology?  Does the review agency do a thorough,
substantive review of these determinations?

     2.  Does the BACT analysis consider each regulated pollutant emitted
in significant amounts?
                                    3-7

-------
      3.   Does the BACT review reflect consideration of toxic air pollutants
 in choosing the level/type of controls for the regulated pollutants?

      4.   Does the review agency require the consideration of more than
 one control alternative?  To what extent are economic, energy, and non-air
 environmental impacts considered in the BACT analysis?

      5.   What tendency is there for the agency's BACT/LAER determinations
 to conform exactly to minimum EPA requirements?

      6.   Does the agency adequately review non-NSR/PSD sources for
 applicability to the NSPS and NESHAP requirements?

      7.   Does the agency do a top-down BACT analysis?  When using top-
 down analysis, what is the basis for selecting the most stringent (top)
 control alternative (i.e., LAER, BACT/LAER clearinghouse, or EPA
 BACT/LAER policy determination)?

      8.   Will the final selected control technology result in emissions
 less than BACT?  Why was the stricter technology chosen?  Were the NAAQS or
 air quality increments in jeopardy?

 IV.  AIR QUALITY MONITORING DATA—PSD

      1.   Does the agency follow the correct procedures  to  exempt applicants
 from  the preconstruction monitoring requirements?

      2.   Does the agency adequately ensure that existing data meet Federal
 criteria for representative air quality data when applicants are not required
 to conduct new monitoring?  What is the EPA Regional Office interpretation
 of the monitoring requirements for each source?

      3.   Do the source monitoring data adhere to PSD quality assurance
 requirements?

 V.   AMBIENT AIR QUALITY  IMPACT

   a.   PSD  Increment Consumption

      1.   Does the agency adequately consider the baseline concentration?
 Has a system been developed to track increment consumption?  Has this
 system been used?

      2.   Are long- and short-term PSD increments being  given  adequate
 consideration as part of tne ambient impact analysis?

      3.   Does the agency make an adequate assessment of new sources  and
 modifications on the Class I area increments?

   b.  NAAQS Protection

      1.   What emission baseline does the agency require to be used to
 evaluate the impact on the NAAQS of new and modified sources?

-------
     2.  Does the agency routinely evaluate the ambient impact of minor
source construction?

     3.  Does the agency's ambient impact analysis provide adequate protection
against the development of "hot spots"?

  c.  Dispersion Models

     1.  Does the agency use adequate models and model options to carry out
the ambient impact analyses for screening analyses?  for more refined analyses?

     2.  Does the agency perform an independent, internal review of the
modeling analyses contained in the permit application?

VI.  EMISSION OFFSET REQUIREMENTS

     1.  Does the agency require that all offsets be Federally enforceable?

     2.  Does the agency routinely ensure that the emission offsets are
not otherwise needed to show RFP or attainment?

     3.  Does the agency require that the emission baseline for offsets
be expressed in the same manner as for RFP?

     4.  Does the agency's offset requirement cover other emission
increases since the last offset review?  If not, does the agency track
minor source growth and account for minor source emissions increases in the
attainment strategy?

     5.  Does the agency require that offsets occur on or before the time
of new source operation?

     6.  Does the agency allow offsets resulting from early source shutdowns
or production curtailments?*

VII.  PERMIT SPECIFICITY AND CLARITY

     1.  Does the agency identify all emission units and their allowable
emissions in the final permit(s)?

     2.  Are the allowable emission rates stated or referenced in the
permit conditions?

*This audit question could be affected by proposed EPA rulemaking or by
ongoing litigation [e.g., CMA agreement rulemaking proposed on August 25,
1983 (48 FR 38742)].  Should changes to the affected requirements be
promulgated, EPA will issue revised guidance as to how the audit should
handle them.  Until such time that the existing Federal requirements and
the State rules developed pursuant to these 40 CFR Part 51 provisions can
be changed, this guideline will assume that all rules will continue to be
implemented under the requirements presently in effect.
                                     3-9

-------
     3.  Are the compliance test methods stated or referenced in the
permit terms and conditions?  Does the permit specify when initial  compliance
is to be demonstrated?

     4.  If a source's calculated potential to emit is based on less than
full design capacity and continuous, year-round operation, are all  limiting
restrictions clearly identified in the permit?

     5.  Does the permit specify the averaging time of each emission
limitation?  Is the averaging time consistent with the averaging time of
the applicable NAAQS and PSD increments?

     6.  Does the permit specify the method and frequency of reporting
continuous compliance?  Are the methods consistent with the averaging time
of the standard?

     7.  Are excess emissions defined in terms consistent with the applicable
emission standards and averaging times?

     8.  If the above items are not found in a permit, has the State
been having problems enforcing permits?

6.0  DESCRIPTION OF THE PERMIT FILE QUESTIONNAIRES

     The auditor will gather data primarily from selected permit files.
Depending on the permit file selected, the auditor will use either Form 3
or Form 4 in accordance with the procedures described in the previous
section.  The choice of questionnaires should be based on the type of
preconstruction review that the reviewing agency actually carried out in
each case.  Form 3 is designed to be used to evaluate permit files for
which the reviewing agency considered the proposed source subject to PSD
or Part D (nonattainment area/offset) requirements.

     Form 4 was designed to evaluate permit files where the reviewing
agency determined that the proposed source was not subject to PSD or Part D
(offset) requirements.  This would include cases where a major source
underwent a modification involving insignificant emission increases, as
well as sources that were allowed to avoid PSD or Part D review by restricting
their potential to emit.  This questionnaire includes questions that will
help to determine whether the agency followed the correct procedures in
subjecting a source to a non-PSD/Part D source review rather than a PSD/Part D
source review.

6.1  Source Information—(Section I)

     The basic data needed to identify the permit file reviewed are requested
in Section I of both questionnaires (Forms 3 and 4).  The questions pertain
to the overall source—not the particular configuration of emission units
which may be the subject of the current permit review.  Thus, "Source
Category" refers to one of the 28 listed PSD sources or any other category
which best describes the overall source.
                                    3-10

-------
     "Location" refers to a geographical identifier that will help the
auditor to identify the specific source under review.  The identifier is
primarily for the auditor's benefit and may be expressed as a complete
address, or simply in terms of the city or county of location.

     "Region" refers to the two-digit Arabic number, such as 01 or 10;
and "State" is the appropriate two-letter code, such as AL or AZ.

     "Type of Review" refers to the status of the proposed source action
relative to the overall source.  Thus, the addition of a new boiler to an
existing source would be a modification rather than a new source.  (Note
that the status of individual emission units should be designated in
Section III.)

6.2  Public Participation and Notification—(Section II)

     Public participation requirements for review of new and modified
sources are set forth under 40 CFR 51.161 and 51.166(q) [formerly numbered
51.18(h) and 51.24(q)].  These requirements call for the issuance of a
public notice which informs the public of a pending permit action and of
the opportunity for public comment or hearing prior to final agency action
on a source application.

     Previous audit results indicate that some agencies require public
notification for all permits issued, but many agencies do not.  This year's
audit seeks further information as to what specific sources the public was
notified of, and how adequate the notification was.

     Both questionnaires ask for the same information that was requested
last year.  Because of concerns pertaining to the usefulness of public
notices versus the costs of providing such notices, the FY 1988-1989 audit
continues to ask what it costs to issue a public notice.  The answer should
indicate the amount charged by the newspaper or other media to publish the
notice.  If this information is not available in the file (e.g., a copy of
the receipt), the auditor may wish to determine an approximate cost from
the audited agency, but it is not recommended that too much time be spent
trying to obtain this cost.
     The public notice should inform the public of the availability for
their inspection of the application submitted by the source, the estimated
impact of the source on ambient air quality, and the agency's proposed
action to approve or disapprove the permit.  The notice should also indicate
the nature of the analysis of air toxics, consistent with EPA guidance.
Instructions for submitting comments, as well as the opportunity for a
public hearing, should also be addressed.  The auditor should verify that
notices issued by the agency adequately inform the public of the permit
being considered and of their opportunities to provide input to the final
determination.

     In addition to providing adequate notice to the public in general,
certain parties are to receive specific notification of proposed permit
actions where those parties would be directly affected by the proposed
source.  The auditor should verify that the agency has, and uses, a mechanism
                                    3-11

-------
for notifying the appropriate government officials when the proposed
source may affect their jurisdiction.  The auditor should particularly
note, in the case of PSD sources, whether and at what point in the process
the Federal Land Manager (FLM) is notified of any pending agency action
on a source locating within 100 km of a Federal  Class I area.   In addition,
the auditor should identify, for information gathering purposes, any
other criteria used to trigger notification of the FLM.

6.3  Applicability Determinations--(Section III)

     State and local governments are expected to regulate not  only PSD and
Part D sources but also construction of other air pollution sources.  The
agencies are, however, particularly expected to strive for the level of
consistency needed to satisfy the minimum Federal requirements for subjecting
new and modified PSD and Part D sources to preconstruction.review.

     6.3.1  Definition of Source

     The auditor should verify, through the review of selected permit
files, that the appropriate levels and detail of review are being made.
The listing of emission units provides a basis for determining the answers
to several questions, so it should be as complete as possible.

     Agencies must use, as a minimum, the appropriate Federal  definitions
of "source" to make applicability determinations.  The number  of definitions
used by any particular agency will depend upon the specific Federal
preconstruction review requirements being implemented by the audited agency
under an approved SIP or delegated authority.  The auditor should be familiar
with the following situations:

     For PSD, the agency should use a reasonable grouping of emission
units as one stationary source, classified according to its primary  activity,
i.e., same two-digit SIC code.  The industrial grouping will determine
the applicable emission threshold (100/250 TPY) governing major source
status, and therefore whether PSD applies.

     For nonattainment areas, including areas where the construction ban
(40 CFR 52.24) is in effect, one of several definitions of source may apply.
The possibilities include the plantwide definition, as described for PSD
above, the dual definition which considers a "source" to be both the plant
and each of its individual pieces of process equipment, or another definition
based on previous EPA requirements preceding the Alabama Power court decision.
The auditor must know which definition is actually being used  by the agency
in order to determine that it is being correctly applied.

     For NSPS and NESHAPS, the applicable "source" is defined  by various
subparts'of 40 CFR Parts 60 and 61, respectively.  The auditor should verify
that the NSPS/NESHAP applicability determinations are made independently of
the PSD or Part D (offset) determinations.  This is particularly important
where the PSD or Part D requirements do not apply, e.g., "minor" sources,
major sources which have de minimis net emission increases for the pollutant
of concern, or sources where exemptions from the PSD or Part D requirements
are otherwise granted by the agency.

                                    3-12

-------
     6.3.2  Fugitive Emissions

     Fugitive emissions, to the extent they are quantifiable and emitted by
any of the listed source categories, should be included in the emission
calculations for determining whether a source is major and subject to PSD
or Part D review.  For the auditors' convenience, the listed source categories
have been included in the Reference Table (see Table 1) which is to be used
with the FY 1988-89 audit questionnaires.  For other source categories, i.e.,
those not listed, the source must first be evaluated as to whether it is
major without using fugitive emissions.  However, fugitive emissions should
be included in the ambient impact analysis and other review requirements
whether the source is major or minor.  The auditor should verify that the
emission factors used to calculate fugitive emissions are documented and
reviewed by the agency independently from any use of such factors by the
applicant.
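
     A minimal sketch of the rule just described, assuming hypothetical
function and variable names:  quantifiable fugitive emissions count toward
the major-source test only for the listed source categories, but they are
always carried into the ambient impact analysis.

     def tpy_for_major_source_test(stack_tpy, fugitive_tpy, listed_category):
         # Fugitives are added to the applicability total only when the source
         # belongs to one of the listed categories in Table 1.
         return stack_tpy + (fugitive_tpy if listed_category else 0.0)

     def tpy_for_impact_analysis(stack_tpy, fugitive_tpy):
         # Fugitives are modeled for ambient impact whether or not the source
         # is major.
         return stack_tpy + fugitive_tpy

     # Hypothetical unlisted source: 90 TPY stack, 40 TPY quantifiable fugitives.
     print(tpy_for_major_source_test(90.0, 40.0, listed_category=False))  # 90.0
     print(tpy_for_impact_analysis(90.0, 40.0))                           # 130.0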

     6.3.3  Potential to Emit

     The status (PSD/Part D or non-PSD/Part D) of new or modified sources
must be determined on the basis of the source's potential to emit.  "Potential
to emit" is a source's maximum capacity to emit a pollutant under its
physical and operational design.  In order for any physical or operational
limitations to be considered as part of the source's design (to restrict
the maximum capacity of the source), the limitations must be made an enforceable
part of the permit.  Moreover, the limitations must be Federally enforceable,
which requires that the permit condition(s) be identified in the construction
permit or an operating permit that has been specifically incorporated in
the approved SIP.

     The auditor must determine whether the audited agency correctly applies
the concept of "potential to emit" when making applicability determinations.
Both questionnaires ask questions concerning the use of acceptable, well-
documented emission factors as well as the use of special limitations to
define a new source's potential to emit.  The auditor should determine
whether restrictions to a source's potential to emit are properly applied,
particularly when they are used to allow the source to avoid PSD or Part D
review.

     For modified sources, it is important to note that major source status
in terms of potential emissions of the existing source must be taken into
account.  This involves the existing source's maximum capacity, which may
take into account all control equipment and operating restrictions that are
Federally enforceable.  Previous audits have shown that there may be a
tendency on the part of some air pollution control agencies to overlook the
potential to emit of the existing source, particularly when actual emissions
are significantly less than the applicable major source cutoff size.  The
auditor, by completing Form 4, should be able to determine whether any
problems exist with this aspect of the audited agency's applicability
procedures.
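
     The arithmetic behind these applicability determinations can be sketched
briefly.  The numbers below are hypothetical, the emission rate would in
practice come from AP-42 or source tests, and the applicable major-source
cutoff (e.g., 100 or 250 TPY for PSD) depends on the source category.

     def potential_to_emit_tpy(max_hourly_lb, hours_per_year=8760,
                                federally_enforceable_cap_tpy=None):
         # Potential to emit: maximum capacity under the physical and
         # operational design, reduced only by restrictions that are
         # federally enforceable permit conditions.
         pte = max_hourly_lb * hours_per_year / 2000.0      # lb/yr -> tons/yr
         if federally_enforceable_cap_tpy is not None:
             pte = min(pte, federally_enforceable_cap_tpy)
         return pte

     # Hypothetical boiler emitting 30 lb/hr of SO2 at maximum capacity.
     unrestricted = potential_to_emit_tpy(30.0)                                    # about 131 TPY
     restricted = potential_to_emit_tpy(30.0, federally_enforceable_cap_tpy=90.0)  # 90 TPY

     # Against a 100 TPY cutoff, only the federally enforceable restriction
     # keeps the source below major-source status; a limit enforceable only
     # by the State would not be counted.
     print(unrestricted >= 100.0, restricted >= 100.0)   # True False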

     6.3.4  Emission Netting

     For modifications to existing sources, once the major or minor status
of the existing source has been affirmed, the applicability review of

                                    3-13

-------
proposed modifications should be based on the net change in actual emissions
on a tons-per-year basis.  For example, emission changes occurring from
retiring equipment or other methods of emission reduction generally will
be credited on the basis of the difference in the emission unit's actual
emissions before and after the reduction.  Actual emission estimates
generally should be based on either:  (1) reasonable engineering assumptions
regarding actual emission levels and representative facility operation
over a two-year period, or (2) permitted allowable emissions determined on
a site-specific, case-by-case basis so as to be representative of actual
source emissions.  Where an emission unit has not begun normal operations,
the potential to emit of the unit should be used.

     Any net change in actual emissions that would result in a significant
emission increase at an existing major stationary source must generally be
reviewed as a major modification.  However, for this to be true in a nonattain-
ment area, the existing source must also have the potential to emit in
major amounts the nonattainment pollutant(s) for which a significant net
increase would occur.  For proposed new major sources subject to PSD, PSD
review applies to all criteria and noncriteria pollutants that would be
emitted in significant amounts.

     For the auditors' convenience, the EPA-defined significant emission
rates for criteria and noncriteria pollutants regulated under the Clean Air
Act have been included in the Reference Table (Table 1) attached for use
with the file audit questionnaires.  The auditor should check all applic-
ability determinations carefully with respect to significant emissions.
Some agencies do not appear to use the EPA significance values to trigger
review of major modifications.  Instead, they may be using some uniform
cutoff point that tends to be more restrictive than the required significance
values for some pollutants but less restrictive for other pollutants.

     The worksheet provided in Section III.D of both questionnaires should
be used to determine the net change in emissions.  It should be noted
that EPA policy requires the emission changes resulting from the proposed
modification itself to be significant before considering other contemporaneous
emission increases and decreases that may have occurred before the proposed
modification.  If the proposed modification does not result in a significant
emission increase, then a major modification is said not to occur regardless
of how previous contemporaneous emission changes would alter the net
emission change.  State and local agencies may implement a more stringent
policy if they wish to do so.  Where this is the case, the auditor should
note such policy and evaluate the permit in accordance with the more
stringent policy.
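
     The two-step test described in the last two paragraphs can be sketched
as follows.  The 40 TPY figure is the EPA significance level for SO2 (see
Table 1); the emission numbers themselves are hypothetical.

     def is_major_modification(proposed_increase_tpy, contemporaneous_changes_tpy,
                                significance_tpy):
         # Step 1: the proposed modification by itself must be significant;
         # if it is not, no major modification occurs and netting is moot.
         if proposed_increase_tpy < significance_tpy:
             return False
         # Step 2: net the proposal against creditable contemporaneous
         # increases (positive) and decreases (negative).
         net_change = proposed_increase_tpy + sum(contemporaneous_changes_tpy)
         return net_change >= significance_tpy

     # 55 TPY SO2 increase netted against a creditable 30 TPY decrease:
     # net change of 25 TPY is below the 40 TPY significance level.
     print(is_major_modification(55.0, [-30.0], 40.0))   # False
     # A 35 TPY increase is below significance, so prior contemporaneous
     # changes are not even considered.
     print(is_major_modification(35.0, [20.0], 40.0))    # False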

     Adequate safeguards should be taken by the agency to prevent the use
of contemporaneous decreases in actual emissions if the decreases are not
creditable.  The auditor must know how "contemporaneous" is defined by
each audited agency.  Contemporaneous emission decreases should be surplus
and should not be credited more than once.  No decrease previously relied on
by a PSD source can be considered again in determining the net change of
a current or future modification.  For nonattainment areas, any required
emission reduction that has occurred or is scheduled to occur pursuant to
the attainment date contained in and required by the SIP control strategy
cannot be counted for netting purposes.
                                    3-14

-------
     Finally, in nonattainment situations, no reduction relied on previously
to meet the reasonable further progress requirement of Part D of the Clean
Air Act can be used for calculating emissions.  If necessary, the auditor
should inquire about the agency's policy and procedure for preventing
double counting, but documentation in the file which specifically states
that the decrease was not relied on or counted elsewhere is preferable
and should be encouraged.  Should documentation not be readily available,
the auditor should so indicate.

     6.3.5  Emission Limits

     Agencies may vary in the number of permits that they issue to a source
having more than one emission unit...  No Federal requirements exist to
govern the number of permits which may apply to any source.  Uhat is important,
however, is that each emission unit is identified clearly, along with its
allowable emission rates, or design, equipment, work practice or operational
standards, as may be appropriate to address each pollutant emitted.  "Appro-
priate" often means having more than one limit for each pollutant.  For
example, there may be limits for the same pollutant to ensure compliance with
(a), an NSPS (e.g., Ib/million btu, rolling 30-day average), (b) 3-hour, 3-hour,
or 24-hour NAAQS or PSD increments, (e.g., Ib/hour), (c) a restriction on
capacity or operating hours (e.g., Ib/day), and (d) an applicability determin-
ation or an annual NAAQS (e.g., ton/year).  As a minimum, the permit should
contain sufficient emission limits to ensure adequate control of all regulated
pollutants which the source has the potential to emit in significant amounts.
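
     One way to picture this point is as a simple record for each emission
unit, with one entry per limit.  The unit name, numbers, and formats below
are hypothetical; they simply mirror the (a) through (d) examples above.

     # Hypothetical permit record showing several limits for the same pollutant,
     # each tied to its own averaging time and the requirement it protects.
     boiler_no_2_limits = [
         {"pollutant": "SO2", "limit": "1.2 lb/million Btu",
          "averaging_time": "rolling 30-day", "basis": "NSPS"},
         {"pollutant": "SO2", "limit": "350 lb/hour",
          "averaging_time": "3-hour", "basis": "short-term NAAQS/PSD increment"},
         {"pollutant": "SO2", "limit": "6,000 lb/day",
          "averaging_time": "24-hour", "basis": "capacity/operating-hour restriction"},
         {"pollutant": "SO2", "limit": "900 tons/year",
          "averaging_time": "annual", "basis": "applicability determination/annual NAAQS"},
     ]

     # A quick audit check: every limit should carry an explicit averaging time.
     assert all(entry["averaging_time"] for entry in boiler_no_2_limits)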

     It is particularly important, when an agency issues one permit to a
large complex, that each emission unit is identified separately, along
with its allowable emission rate, as opposed to a single composite emission
rate for each pollutant.  The auditor should verify that, for each permit
issued, there is separate and clear identification of the affected emission
units and their corresponding allowable emissions.  Also, the auditor should
ensure that the averaging time for each standard is consistent with the
averaging time for each NAAQS or PSD increment which is to be protected.

     In addition to identifying the allowable emissions, equipment or other
standard for each separate emission unit, it is important that such
limitations be addressed adequately in conditions on the permit(s) for a
new or modified source.  The auditor should examine the adequacy of the
conditions in terms of their clarity and enforceability.  The auditor
should pay close attention to the use of clear and precise averaging periods
over which the various pollutant emissions are to be measured.  [Note:  In
many cases, averaging periods may be a part of the required test method and
may not be specifically stated on the permit.  In such cases, auditors should
discuss this with the audited agency and verify that the agency regulations
do require proper averaging periods by reference.  Also, some agencies may
incorporate by reference the test method as well as the averaging period.]
Finally, the emission rates  must be consistent with acceptable measurement
procedures; otherwise, compliance will be difficult if not impossible to
ascertain and the conditions would be unenforceable.

     Test methods used to determine compliance of the source with its
allowable emission rates should be clearly defined or referenced as conditions
                                     3-15

-------
to the final permit.  These compliance tests should be specific to
the individual  emission units to which they apply.  The auditor should
verify the documentation of the compliance test methods and their adequacy
for covering each applicable emission unit for which allowable emission
rates are defined.  Where test methods are not specified in the permit, the
auditor should determine whether the SIP specifications are otherwise
applicable and sufficient.  (This is likely to involve a discussion with
Agency personnel.)

6.4  Control Technology--(Section IV)

     6.4.1  NSR/PSD Sources (Form 3).

     The primary objective for the auditor is to determine whether good,
well-supported BACT/LAER determinations are being made.  Secondary objectives
are to measure the frequency of BACT/LAER determinations set equal to
existing new source performance standards, and to determine the amount of
legitimate attention being given by review agencies to the requirement for
the application of LAER on new and modified major sources constructing in
nonattainment areas.

     Pollutants regulated under the Clean Air Act are subject to a BACT
analysis if they would be emitted in significant amounts by a source whose
construction is subject to PSD.  A pollutant subject to regulation under the
Clean Air Act generally has had a standard of performance under §111 or 112
and/or NAAQS promulgated for it.  The analysis for the subject source
should address both fugitive and nonfugitive emissions.  The auditor should
verify that the BACT analysis considers all significant emission increases
rather than being restricted to criteria pollutants or major emission
changes.  Consistent with the North County Remand, the auditor should
confirm that toxic air pollutants are addressed in the BACT determination.

     In selecting BACT, the applicant generally should be required to
consider more than one control strategy, unless it can be demonstrated that
the single proposed strategy clearly represents the highest degree of contin-
uous emission reduction available.  In all cases, the control strategies
considered should be technically feasible and should address the economic,
energy and environmental impacts of the particular alternative.  Quantifiable
impacts should be identified.  The auditor should verify that adequate
alternative control strategies are'included where appropriate.

     In each case, the BACT analysis submitted by the applicant must be
reviewed independently by the permit agency.  In particular, candidate
control equipment should be assessed to ensure that reasonable performance
claims, including consideration of continuing compliance, are being made.
Atypically high control efficiencies should be examined for their reason-
ableness, particularly where they would result in emission rates that
would enable the applicant to avoid a certain requirement or to meet ambient
constraints.  Where the alternative representing the most stringent emission
reductions is not selected, the permit agency should review carefully the
alternatives to ascertain that the most appropriate one was selected.  The
agency should routinely check to see whether any technically feasible
alternatives were not considered, and why.  The auditor should verify that
the agency performs an adequate independent review of the BACT analysis
submitted by the applicant.

                                    3-16

-------
     For each permit reviewed which was subject to BACT or LAER, the auditor
should note the regulatory baseline assumed by the review agency.  In how
many instances do the agency's BACT/LAER determinations conform exactly to
existing SIP, NSPS, or NESHAP requirements?  The auditor should verify that
adequate documentation is provided for those determinations which simply
meet the minimum requirements.  For cases where LAER determinations conform
exactly to NSPS, the auditor should examine the reasons why LAER was not
determined to be a more stringent limitation.

     There has been a large change in EPA policy concerning BACT determinations.
A memorandum from Craig Potter, dated December 1, 1987, states that EPA now
wants reviewing authorities to use the "top-down" approach for BACT
determinations.  The first step in this approach is to determine, for the
emission source in question, the most stringent control technology available.
If it can be shown that this level of control is technically or economically
infeasible for the source in question, then the next most stringent level
of control is determined and similarly evaluated.  The EPA is expected to
perform this analysis for all of its permitting actions, and States that have the
PSD program by delegation are also expected to incorporate this method
into their control technology evaluations.  Although EPA can only encourage
States with PSD SIP's to use the "top-down" method for BACT determinations,
the December 1 memorandum states that "A final BACT determination which
fails to reflect adequate consideration of the factors that would have been
relevant using a "top-down" type of analysis shall be considered deficient
by EPA."  An auditor who discovers such a deficient permit should notify
the reviewing authority and make note of it in the audit report.
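
     To make the selection logic concrete, the following sketch (in Python)
illustrates the "top-down" sequence described above.  The candidate controls,
efficiencies, and feasibility outcomes are hypothetical examples only; an
actual determination rests on a case-by-case engineering and economic record.

```python
# Illustrative sketch of the "top-down" BACT selection logic described above.
# The candidate list and feasibility results below are hypothetical, not EPA
# data; a real determination is made on a documented, case-by-case basis.

def top_down_bact(candidates):
    """Return the most stringent control option not shown to be infeasible.

    `candidates` is a list of dicts with keys:
      name, efficiency (fractional emission reduction),
      technically_feasible (bool), economically_feasible (bool).
    """
    # Step 1: rank options from most to least stringent.
    ranked = sorted(candidates, key=lambda c: c["efficiency"], reverse=True)

    # Step 2: work down the list until an option survives both feasibility tests.
    for option in ranked:
        if option["technically_feasible"] and option["economically_feasible"]:
            return option
    return None  # no feasible option identified -- the record would be deficient


if __name__ == "__main__":
    # Hypothetical SO2 control options for a boiler.
    options = [
        {"name": "wet scrubber",    "efficiency": 0.90,
         "technically_feasible": True,  "economically_feasible": False},
        {"name": "dry scrubber",    "efficiency": 0.70,
         "technically_feasible": True,  "economically_feasible": True},
        {"name": "low-sulfur fuel", "efficiency": 0.50,
         "technically_feasible": True,  "economically_feasible": True},
    ]
    selected = top_down_bact(options)
    print("BACT selected:", selected["name"] if selected else "none")
```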

     6.4.2  Non-NSR/PSD Sources (Form 4).

     For sources that are not subject to PSD or Part D control technology
requirements, i.e., BACT or LAER, it is still important for the reviewing
agency to address a number of control technology considerations.  Methods
of reducing emissions should be checked for the reasonableness of the
performance claims associated with them.  This is especially true when the
applicant intends to avoid major review by demonstrating that emission
levels will fall below the major source threshold levels.
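
     The arithmetic behind such a demonstration can be checked quickly.  The
sketch below (Python) is a minimal illustration of that check; the emission
rate, hours restriction, and 100 TPY threshold used here are hypothetical,
and the applicable threshold depends on source category and attainment status.

```python
# Minimal arithmetic check of a claim that a restriction keeps a source below
# the major source threshold.  All figures here are illustrative only.

MAJOR_SOURCE_THRESHOLD_TPY = 100.0  # hypothetical; 100 or 250 TPY may apply

def restricted_pte(max_hourly_rate_lb, restricted_hours_per_year):
    """Potential to emit (TPY) under a claimed hours-of-operation restriction."""
    return max_hourly_rate_lb * restricted_hours_per_year / 2000.0  # lb -> tons

# Hypothetical example: a 30 lb/hr unit limited by permit to 6,000 hr/yr.
pte = restricted_pte(30.0, 6000)
print(f"Restricted PTE: {pte:.0f} TPY")
print("Below major threshold" if pte < MAJOR_SOURCE_THRESHOLD_TPY
      else "Still major -- restriction does not avoid review")
```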

     Sources that do not qualify for major review under PSD or Part D may
still be subject to NSPS or NESHAP requirements for certain pollutants.
Auditors should determine whether each source was adequately reviewed for
applicability of these requirements.


6.5  Air Quality Monitoring Data (PSD)--(Section V.  Form 3 only)

     Every PSD source with the potential to emit significant amounts of a
particular criteria pollutant, where both the existing air quality and the
estimated impact of the source or modification are significant, must meet
the requirements for preconstruction air quality monitoring data, unless
exempted under provisions for temporary emissions or compliance with the
Offset Policy.  In the latter case, which applies only to VOC emissions, if
the source satisfies all conditions of the Offset Policy, postapproval monitoring
may be provided in lieu of preconstruction data [40 CFR 51.24(m)(1)(v)].
Only PSD sources are required to submit such data.  For PSD sources not


                                    3-17

-------
required to submit ambient data, the applicable exemption should be clearly
stated in the preliminary determination.
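
     The applicability test described above reduces to a simple set of
conditions, which the worksheet in Form 3, Section V also follows.  The
Python sketch below is an illustration only; the significance findings are
inputs that come from Table 1 and the agency's modeling, not from this code.

```python
# Sketch of the preconstruction monitoring applicability test described above:
# data must be addressed for a pollutant when potential emissions are
# significant and both the modeled impact and the existing air quality are
# significant, unless an exemption (e.g., temporary emissions or the VOC
# Offset Policy provision) applies.

def monitoring_required(emissions_significant, modeled_impact_significant,
                        existing_air_quality_significant, exempt=False):
    """Return True when preconstruction monitoring data must be addressed."""
    if exempt:  # temporary emissions or the VOC offset-policy exemption
        return False
    return (emissions_significant
            and modeled_impact_significant
            and existing_air_quality_significant)

# Hypothetical SO2 example: significant emissions and impact, but existing
# air quality shown to be de minimis -> no monitoring data needed.
print(monitoring_required(True, True, False))   # False
print(monitoring_required(True, True, True))    # True
```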

     The requirement for ambient air quality monitoring data may be met in
one of two ways.  First, the permitting agency may require that the PSD
applicant establish a monitoring network designed to collect the appropriate
air quality data.  Second, if existing air quality data is representative
of the air quality in the area where the source would have an impact, then
such representative data may be provided in the place of monitoring data
collected by the applicant.

     The Ambient Monitoring Guidelines for PSD contain minimum quality
assurance requirements that must be met by the applicant when monitoring
must be performed.  The detailed criteria for quality assurance generally
should not be audited by the new source review auditors.  Instead, the
Regional ambient monitoring staff is usually better able to audit the quality
assurance procedures.  It is important that the two groups discuss in
advance the division of responsibility of audited areas, to avoid overlap
or omissions.  The new source review auditor should determine:  (1) whether
a monitoring plan was submitted by  the source and evaluated by the permitting
agency; (2) whether a quality assurance plan was submitted by the applicant;
and (3) whether the permitting agency evaluated the data for compliance
with 40 CFR 58, Appendix B.

     Use of representative data is  restricted by the criteria described in
EPA's "Ambient Monitoring Guidelines for Prevention of Significant Deteri-
oration (PSD)," EPA-450/4-80-012, Revised February 1981.  Generally, only
new sources in remote areas may use existing data gathered at sites greater
than 10 km away.  For all sources in flat terrain, monitors within 10 km
are acceptable.  For complex terrain, the guidelines are very difficult to
meet, and new data are almost always required.  In addition to the monitor
location criteria, there are also restrictions concerning data currentness
and quality.  The auditor should be familiar with the guidelines concerning
representative data and verify that the audited agency is following them.
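
     As a rough aid, the distance and terrain screens summarized above can be
expressed as a short decision rule.  The Python sketch below covers only
those screens; the data-quality and currentness criteria in EPA-450/4-80-012
are omitted for brevity and must still be checked against the guideline.

```python
# Rough decision sketch of the representative-data criteria summarized above
# (monitor distance and terrain only; data quality and currentness omitted).

def existing_data_acceptable(distance_km, terrain, remote_area=False):
    """Coarse screen: can existing monitor data stand in for new monitoring?"""
    if terrain == "complex":
        return False                    # new data are almost always required
    if terrain == "flat":
        if distance_km <= 10.0:
            return True                 # monitors within 10 km acceptable
        return remote_area              # >10 km only for remote new sources
    raise ValueError("terrain must be 'flat' or 'complex'")

print(existing_data_acceptable(8.0, "flat"))           # True
print(existing_data_acceptable(25.0, "flat", True))    # True (remote area)
print(existing_data_acceptable(5.0, "complex"))        # False
```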

6.6  PSD Increment Analysis--(Section VI.  Form 3 only)

     Before a permit is granted, the permit agency must determine that no
national ambient air quality standards will be violated.  In the special
case of a PSD permit, the agency must further verify that no allowable
PSD increment will be exceeded by the source under review.  In all cases,
the ambient impact analysis must be reviewed carefully by the permit agency
responsible for managing the ambient air quality.  The auditor should
determine the adequacy of the ambient air analysis performed as part of the
preconstruction review.  In most cases the auditor for these sections should
be the regional meteorologist or a modeling expert.

     Allowable PSD increments exist only for S02 and TSP at the present
time.  There are a number of important considerations that the permit
agency must routinely take into account in order to ensure that the maximum
allowable increments are not exceeded.  The permit agency must give the
proper attention to such things as  the baseline concentration, baseline
area and baseline date(s); the appropriate emission changes for increment
                                    3-18

-------
 consumption purposes;  long and  short-term increment  averaging  periods;  and
 special  Class I  area impacts.

      The baseline concentration generally reflects actual  emissions  occurring
 at the time of receipt of the first  complete  PSD  application in  the  §107
 attainment or unclassifiable area.   This  ambient  concentration is  adjusted
 to include projected emissions  of major sources commencing construction
 before January 6, 1975, but not in  operation  as of the  baseline  date, and
 to exclude the impacts of actual emission changes, resulting from construction
 at a major stationary  source commencing after January 6,  1975.

     Changes in emissions contributing to the baseline concentration from
any source subsequent to the baseline date and from any major source construction
commenced after January 6, 1975, can either consume or expand the PSD
increment.  Where actual emissions cannot be used, e.g., the source has not
yet begun to operate or sufficient operating data is not available, then
allowable emissions must be used.  The auditor should verify that the
agency considers the appropriate emission changes relative to the baseline
concentration.
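
     The bookkeeping implied by this paragraph is sketched below in Python.
The source names and tonnages are hypothetical, and an actual increment
analysis is performed in concentration terms through dispersion modeling;
the sketch only illustrates how actual versus allowable emissions are chosen
for post-baseline changes.

```python
# Simplified bookkeeping sketch of increment consumption as discussed above:
# post-baseline emission changes consume (positive) or expand (negative) the
# increment, with allowable emissions substituted where actual operating data
# are not available.  All names and numbers are hypothetical.

def emission_change(source):
    """TPY change counted against the increment for one source."""
    if source["actual_tpy"] is not None:          # operating data available
        return source["actual_tpy"] - source["baseline_tpy"]
    return source["allowable_tpy"] - source["baseline_tpy"]  # not yet operating

post_baseline_sources = [
    {"name": "new boiler",       "baseline_tpy": 0.0,
     "actual_tpy": None,  "allowable_tpy": 250.0},   # permitted, not yet operating
    {"name": "existing smelter", "baseline_tpy": 400.0,
     "actual_tpy": 310.0, "allowable_tpy": 500.0},   # actual decrease expands increment
]

net_change_tpy = sum(emission_change(s) for s in post_baseline_sources)
print(f"Net post-baseline emission change: {net_change_tpy:+.0f} TPY")
```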

      The date of receipt of the first complete application for a major  new
 source or major  modification subject to PSD becomes  the baseline date,  and
 the area in which the  baseline  is  triggered is known as the baseline area.
 The analysis can become somewhat complicated  when the baseline area  for the
 proposed source  includes more than  one Section 107 attainment  or unclassified
 area,  particularly if  the baseline  date has already  been  triggered in some,
 but not  all, of  the Section 107 areas within  that baseline area.   Auditors
 should be familiar with the PSD increment analysis process to help alleviate
 some of  the potential  confusion that could occur  during the review of the
 PSD permit files.  A good presentation of the increment analysis is  contained
 in the EPA PSD Workshop Manual  (EPA-450/2-80-081, October 1980).

     Both TSP and SO2 have long- and short-term averaging periods for which
PSD increments have been established.  These maximum allowable increases are
not to be exceeded more than once per year for other than an annual averaging
period.  The auditor should verify that each PSD application considers all
appropriate averaging periods with complete documentation in the permit file.

     For sources proposing to locate near a Class I area, an increment
analysis may be required under conditions that would not trigger an analysis
in other locations.  Any emissions from a proposed source should be
considered significant when the source would locate within 10 km of the Class I
area and cause an ambient impact equal to or greater than 1 ug/m3 (24-hour
average).  Generally, sources locating within 100 km of a Class I area should
be screened to determine their impact on the Class I area.  All Class I
analyses, of course, should also include any impacts on visibility.
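
     The proximity rules stated above can be written as a short screening
test.  The Python sketch below is illustrative only; the distances and the
1 ug/m3 (24-hour) criterion come from the paragraph above, and the resulting
disposition strings are this manual's paraphrase, not regulatory language.

```python
# Sketch of the Class I screening rule stated above: within 10 km of a
# Class I area, any impact of 1 ug/m3 (24-hour average) or more is treated
# as significant; sources within 100 km should at least be screened.

def class_i_screening(distance_km, impact_24hr_ugm3):
    """Return a short screening disposition for a proposed source."""
    if distance_km <= 10.0 and impact_24hr_ugm3 >= 1.0:
        return "significant impact -- full Class I increment/visibility analysis"
    if distance_km <= 100.0:
        return "screen impact on the Class I area (including visibility)"
    return "no Class I screening triggered by distance alone"

print(class_i_screening(8.0, 1.4))
print(class_i_screening(60.0, 0.3))
```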

 6.7  NAAQS Protection--(Section V.  Form 4; Section VII.  Form 3)

      States may differ as to  the emission  baseline used to protect the
 NAAQS.  In some cases, the  allowable emissions  (or  some other "representative"
 emission estimates) from all  major sources  are  used for modeling  air quality.
 In other cases, the modeled allowable emissions from  the  proposed source  or
                                     3-19

-------
modification are added to the background air quality which is based solely
on monitoring data.  The auditor should identify the emission baseline
required by the agency and gain an understanding of the specific approach
utilized to estimate the impact of a new or modified source.  This information
will be used to assess current practices and for consideration of future
policy development.

     In evaluating NAAQS protection, the ambient impact analysis should determine
the maximum long-term and short-term impacts of the proposed new source
or modification.  However, the maximum ambient impact may actually occur at
other locations when the impacts of other sources and background data are
taken into account.  Hot spots may also occur where growth resulting from
minor sources, or from sources otherwise exempted from detailed permit review,
is not subjected to a rigorous ambient analysis.  The auditor should verify
that the agency examines a source's ambient impact beyond the areas of
maximum impact of the source alone.
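
     The basic comparison underlying this check is additive: the proposed
source's modeled contribution plus other nearby sources plus background,
compared against the standard.  The Python sketch below illustrates that
arithmetic only; the concentrations and the standard shown are hypothetical
placeholders, and actual analyses are receptor-by-receptor modeling exercises.

```python
# Sketch of the "NAAQS protection" comparison described above: the modeled
# contribution of the proposed source is combined with other nearby sources
# and background before comparison with the standard.  Values are placeholders.

def naaqs_check(source_impact, other_source_impacts, background, standard):
    """Return (total concentration, True if the standard is protected)."""
    total = source_impact + sum(other_source_impacts) + background
    return total, total <= standard

total, ok = naaqs_check(source_impact=45.0,
                        other_source_impacts=[60.0, 25.0],
                        background=180.0,
                        standard=365.0)   # e.g., a short-term SO2 standard, ug/m3
print(f"Total = {total} ug/m3 -> {'protected' if ok else 'violation projected'}")
```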

     EPA has recommended the use of a number of models for specific types
of applications and has stated its preference for certain new models for
analyzing the impact of sources on ambient air quality.  However, utilization
of any particular model should be consistent with the design and intent
of the model itself.  Some models are very specific as to terrain and
applicability.  The auditor should verify that impact analyses are being
performed with the appropriate models, and that the permit agency conducts
its own independent review of the source's analysis (including the replication
of modeling results when appropriate) to ensure conformance to accepted
procedures.  EPA guidance is provided in "Guideline on Air Quality Models,"
EPA-450/2-78-027, April 1978.  This report is currently undergoing revision.
Additional guidance is also provided in "Regional Workshops on Air Quality
Modeling: Summary Report," OAQPS, April 1981 and "Guideline for Use of
City-Specific EKMA in Preparing Ozone SIP's,"  EPA-450/4-80-027, March 1981.
EPA's "Guideline on Air Quality Models" includes, among other things,
guidance on the selection of air quality dispersion models.

6.8  Emission Offset Requirements--(Section VIII.  Form 3 only)

     Part D of the Clean Air Act intends that certain stringent requirements
be met by major sources approved for construction in nonattainment areas.
One such requirement calls for the proposed source or modification to obtain
emission reductions (offsets) from existing sources in the area such that
there will be reasonable further progress toward attainment of the applicable
NAAQS.  The specific audit objectives are: (1) to assure that reviewing
agencies are requiring, where appropriate, adequate emission offsets as a
condition to authorizing major construction in designated nonattainment
areas; and (2) to assure that emission offsets are being obtained in a
manner consistent with RFP.

     All emission reductions used to offset proposed new emissions must
be made enforceable.  This is true whether the offsets are obtained from
another source owned by the applicant or from a source not under common
ownership.  In either case the offsets should be fully agreed upon and docu-
mented, preferably within the permit of the source from which the offset is
obtained.  In addition, Federal enforceability requires that an external
                                    3-20

-------
offset be made a part of the applicable SIP.   This would require a specific
SIP revision if the offset is not made part of a permit issued pursuant to
the State's construction permit requirements approved pursuant to 40 CFR
51.18 or 51.24 (now numbered 40 CFR Part 51,  Subpart I).  Conditions to State
or local operating permits are not always considered to be part of the appli-
cable SIP(s).  The auditor should verify that all  offsets are documented by
means of well-defined emission limits pertaining to the emission offset.

     The proposed emissions offset cannot be  otherwise needed to show RFP
toward attaining the NAAQS.  To use the same  emission offset for two
different purposes would result in "double counting" those emissions with
the net result being subsequent deterioration of air quality.  The auditor
should seek assurance from the agency that compliance with annual RFP
increments is independent of the offsets being obtained from proposed new
or modified sources.  In addition, the permit file should be checked to
determine whether any documentation is provided to address this issue.  All
findings should be recorded in Form 3.

     In order for the system for getting offsets to be consistent with the
demonstration of reasonable further progress, both should be expressed in
the same emission terms, i.e., actual or allowable emissions.  Section
173(1)(A) of the Clean Air Act sets the emission offset baseline as the
"allowable" emissions of the source, but also requires that the offsets must
be sufficient to represent RFP.  Consequently, where the RFP demonstration
is based on an inventory of actual emissions, EPA requires that offsets to be
obtained by a proposed new or modified source also be based on actual
emissions.  Form 3 requires that the auditor  determine whether there is
consistency in the emission baseline for offsets and the RFP demonstration.
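
     The consistency point above can be illustrated with a small example.
The Python sketch below uses hypothetical figures to show why the creditable
reduction must be computed on the same basis (actual or allowable emissions)
as the RFP demonstration; it is not a prescribed calculation method.

```python
# Sketch of the baseline-consistency test described above: the offset credit
# must be computed in the same terms (actual vs. allowable emissions) as the
# RFP demonstration.  All figures are hypothetical.

def offset_credit(offsetting_source, basis):
    """Creditable reduction (TPY) from a source, on an 'actual' or 'allowable' basis."""
    before = offsetting_source[f"{basis}_before_tpy"]
    after = offsetting_source[f"{basis}_after_tpy"]
    return max(before - after, 0.0)

existing_source = {
    "actual_before_tpy": 180.0,    "actual_after_tpy": 60.0,     # real change
    "allowable_before_tpy": 400.0, "allowable_after_tpy": 60.0,  # paper reduction
}

rfp_basis = "actual"   # in this hypothetical SIP the RFP demonstration uses actuals
credit = offset_credit(existing_source, rfp_basis)
print(f"Creditable offset ({rfp_basis} basis): {credit:.0f} TPY")
# Using the allowable basis here would overstate the credit (340 vs. 120 TPY).
```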

     In order to comply with the Act requirement that emission offsets
must be sufficient to represent RFP, any increases in area and minor source
growth not considered in the approved RFP demonstration must be covered by
offsets required of the proposed new or modified source.  Failure to account
for these emission increases would result in  air quality deterioration
just as in the case of "double counting."  The auditor should verify that
area and minor source growth considerations are made in order to establish
the offset level, particularly when more than one year has passed since the
last offset.
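
     The sizing implication of this paragraph is simple addition.  The Python
sketch below is illustrative only; the tonnages and the offset ratio are
hypothetical, and the ratio actually required is set by the applicable SIP.

```python
# Sketch of the offset-sizing point made above: area and minor source growth
# not already covered in the approved RFP demonstration must be added to the
# new source's emissions when setting the required offset.  Numbers are
# hypothetical; a greater-than-1:1 ratio may also apply under the SIP.

def required_offset_tpy(new_source_tpy, uncovered_growth_tpy, offset_ratio=1.0):
    """Minimum emission reductions needed for the proposed permit."""
    return (new_source_tpy + uncovered_growth_tpy) * offset_ratio

print(required_offset_tpy(new_source_tpy=120.0,
                          uncovered_growth_tpy=35.0,
                          offset_ratio=1.2))   # 186.0 TPY in this example
```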

     Section 173(1)(A) of the Clean Air Act requires that offsets be obtained
and in effect "by the time the [new or modified] source is to commence
operation." No specific guidance is available to identify when a source iiau
officially "commenced" operation.  Some agencies may allow a shakedown period
similar to the shakedown provision allowed for net emission increases in
40 CFR 51.18(j)(l)(vii)(f) [now numbered 51.165(a)(l)(vii)(f)].  The auditor
should focus primarily on whether offsets were sought to be in effect in a
timely manner, which may include, for replacement facilities, a shakedown
period not to exceed 180 days.  The auditor should also determine whether the
effective date for the offsets is documented  in the permit file(s).
                                    3-21

-------
                                  TABLE 1

                              REFERENCE TABLES
             FOR USE WITH FY 1988-1989 NSR AUDIT QUESTIONNAIRES
I.  Questionnaire abbreviations:

     °  CBD = cannot be determined from information available in permit file
     °  NA = not applicable
     °  PSD = prevention of significant deterioration
     °  Part D = nonattainment area provisions applying to sources which emit
        a nonattainment pollutant and locate within that nonattainment area.

II.  Pollutant Criteria

                                                 Significant   Significant Air Quality
                                    Use this     Emission      Concentrations (for Monitoring
   Pollutant                        Abbreviation Levels, TPY   Determinations), ug/m3

 CRITERIA
   Carbon monoxide                  CO           100           575, 8-hr avg
   Nitrogen oxides                  NOX          40            14, annual avg
   Sulfur dioxide                   SO2          40            13, 24-hr avg
   Particulate matter:              TSP (or PM)  25            10, 24-hr avg
                                    PM10         15            10, 24-hr avg
   Ozone (as volatile
     organic compounds)             VOC          40            (100 TPY of VOC)
   Lead                             PB           0.6           0.1, 3-month avg

 REGULATED (non-criteria)
   Asbestos                         AB           0.007         No monitoring required
   Beryllium                        BE           0.0004        0.001, 24-hr avg
   Mercury                          HG           0.1           0.25, 24-hr avg
   Vinyl chloride                   VC           1.0           15, 24-hr avg
   Fluorides                        FL           3             0.25, 24-hr avg
   Sulfuric acid mist               SAM          7             No monitoring required
   Hydrogen sulfide                 H2S          10            0.2, 1-hr avg
   Total reduced sulfur             TRS          10            No monitoring required
   Reduced sulfur compounds         RSC          10            No monitoring required
   Radionuclides                    RN           *             *
   Benzene                          BZ           *             *
   Arsenic                          AS           *             *

NOTE:  For each regulated pollutant, any emission rate is significant that
causes an air impact of 1 ug/m3 (24-hr) or greater in any Class I area
located within 10 km of the source.  Air toxics emitted in sufficient
amounts to be of concern should also be indicated, even though not directly
regulated.

    * These values have not been determined as of the time this audit guidance
was written.
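
     The significance levels in Table 1 lend themselves to a simple lookup.
The Python sketch below encodes a subset of the TPY values from the table
for illustration; it is not itself part of the audit procedure.

```python
# Lookup sketch using a few of the significant emission levels from Table 1
# (TPY).  Only a subset of pollutants is included here for illustration.

SIGNIFICANT_TPY = {
    "CO": 100, "NOX": 40, "SO2": 40, "TSP": 25, "PM10": 15, "VOC": 40,
    "PB": 0.6, "BE": 0.0004, "HG": 0.1, "FL": 3, "SAM": 7, "H2S": 10,
}

def is_significant(pollutant, net_increase_tpy):
    """True if the net emissions increase equals or exceeds the Table 1 level."""
    return net_increase_tpy >= SIGNIFICANT_TPY[pollutant]

print(is_significant("SO2", 38))    # False -- below the 40 TPY level
print(is_significant("PM10", 16))   # True
```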
                                    3-22

-------
III.  The following source categories are major if > 100 TPY, including fugitive
emissions.  (One exception exists; see note for last source category  in list.)

Coal cleaning plants (with thermal dryers)
Kraft pulp mills
Portland cement plants
Primary zinc smelters
Iron and steel mills
Primary aluminum ore reduction plants
Primary copper smelters
Municipal incinerators (> 250 tons of refuse per day)
Hydrofluoric, sulfuric, or nitric acid plants
Petroleum refineries
Lime Plants
Phosphate rock processing plants
Coke oven batteries
Sulfur recovery plants
Carbon black plants (furnace process)
Primary lead smelters
Fuel conversion plants
Sintering plants
Secondary metal production plants
Chemical process plants
Fossil-fuel boilers (or combination thereof)
  totaling > 250 million BTU/hr heat input
Petroleum storage & transfer units with total
 . storage capacity > 300,000 bbls
Taconite ore processing plants
Glass fiber processing plants
Charcoal production
Fossil fuel-fired steam electric plants
  > 250 million Btu/hr heat input
Any other NSPS or NESHAP source as of
  August 7, 1980 [Note:  for PSD, major
  source status based on emissions >250 TPY.]
                                     3-23

-------
                                  Form 1

                            NSR PERMIT SUMMARY
                              QUESTIONNAIRE

I.  GENERAL INFORMATION                       AUDIT PERIOD:  _/_ to _/_
                                                            Mo.Yr.   Mo.Yr.
REGION:

STATE:

[ ]  State      [ ]  Local Agency

AGENCY NAME:

     Please answer the questions below for the specified audit period based
on the number of construction permits that you (the above-named State or
local agency) issued to sources (major and minor) in your jurisdiction.
Use the accompanying list of instructions to formulate your  responses.

II.  PERMIT SUMMARY

     1.  PSD and Part D (Offset) Construction "Permits"

         a.      Prevention of significant deterioration (>= 100 or 250 TPY)

         b.      Part D major sources in nonattainment areas (>= 100 TPY)

         c.  	Combination (i.e., PSD and Part D)

         d.  	TOTAL (a + b + c)

     2.  Other Source Construction "Permits"

         a.      Non-PSD permits (>100 TPY) in attainment/unclassified areas

         b.      Minor sources (<100 TPY)

             i.  	Minor sources undergoing ambient impact analysis

             ii.      Sources avoiding major source review via restrictions
                        not otherwise required but imposed to lower the source's
                        potential to emit.

         c.  	TOTAL (a + b)


III.  PRECONSTRUCTION MONITORING FOR PSD

     	No. of PSD sources subject to preconstruction monitoring requirements.

          	No. of PSD sources actually required to collect data via monitoring.

          	No. of PSD sources allowed to use existing representative data.
                                     3-24

-------
                          INSTRUCTIONS, FOR FORM 1

INSTRUCTIONS

I.  GENERAL INFORMATION

     °  This section should be filled out by the EPA Regional Office
before forwarding the questionnaire to the audited agency.   The audit
period represents the period from the time of the last audit.

II.  PERMIT SUMMARY

     °  Major source permits:  enter "N/A" if you do not have
program authority; "0" if you  have authority, but no permits of a particular
type were issued.

     0  "Permit" should be defined in terms.of the entire source or project
for which a particular construction approval (for a new source or modification)
was requested.  Consequently,  one application should generally be regarded
as a "permit" regardless of the number of agency permits (for  individual
emission units) actually issued.  For cases where an application would
qualify for two permit groupings (e.g., major source review for both PSD
and offsets), the permit should be listed under II.1.c.

     0  All "permit" numbers reported should pertain to new construction
(which may involve a completely new plant or a modificatin  to  an existing
one) or a new method of operation for. which a permit analysis  was required.
Permit extensions, minor revisions, etc., should not be included.  If  the
exact number of "permits" is not known, please provide your most reasonable
estimate and place an "(E)" after the value provided.

     °  If EPA performs the application review and issues a PSD permit
(i.e., the State or local agency does not have either a SIP approved PSD
program or delegated authority), do not include such "permit" in line II.1.a.
However, if a permit is required by the State or local agency  in addition
to EPA's PSD review, then that permit should be included in line II.2.a.
Also, use line II.2.a. to account for permits issued to major (>= 100 tpy)
new or modified sources which are not subject to PSD because their emissions
are less than the 250 tpy cutoff for unlisted PSD sources.

     °  Sources may avoid major source review by agreeing to limitations
which would restrict their potential emissions to an amount below the  100
or 250 tpy threshold.  This is often accomplished by limiting  the source's
hours of operation, fuel use,  or operating rate via Federally  enforceable
permit conditions.  If this occurs, it should be noted in line II.2.b.ii.

III.  PRECONSTRUCTION MONITORING FOR PSD

     For sources subject to PSD monitoring, indicate number of sources for
which (a) ambient monitoring was required, or (b) the use of existing  data
was allowed.  For cases where a PSD source is required to monitor for  one
or more pollutants, but is also allowed to use existing representative data
for another pollutant, the source should be counted once for each event.
Therefore, sources may be double counted under III.1.a. and b.


                                    3-25

-------
                                   FORM 2

                      NSR AUDIT SUMMARY QUESTIONNAIRE

  I.  GENERAL INFORMATION:                   AUDIT PERIOD:  _/_ to _/_
                                                           Mo.Yr.     Mo.Yr.
      REGION:

      STATE:                         [ ] State Agency       [ ] Local Agency

 II.  NUMBER  OF PERMITS AUDITED

      Indicate the number of permit files audited  (including those for which
      a questionnaire was not completed) for  each  of  the  following types
      of permits:

      a.	PSD only   b.	Part  D only   c.	PSD/Part D .  d.	all  other


III.  TIME REQUIRED TO AUDIT PERMITS

      Indicate the amount of time, in  hours,  spent auditing the total  number
      of permits specified above, as well as the range in time needed for
      auditing individual permits for which a questionnaire was completed.
      (Times  should be stated to the nearest  half  hour.)

      a.	Hours for total  audit of files.

      b.	Hours for maximum single file  audit.

      c.	Hours for minimum single file  audit.

 IV.  PERMIT  SELECTION

      a.  The audited agency [ ] was [ ] was  not told prior to the audit
          which permits would be examined.

      b.  The audited agency [ ] did [ ] did not participate in the selection
          of  permits which were audited.

      Comments -
                                      3-26

-------
 Form 2 (continued)

  V.   CONDITION OF PERMIT FILES

      For the two categories specified below, mark the response that best
      describes the condition of the audited agency's permit files:

      a.  Organization -  [ ] Information in each file well organized.

                          [ ] Information available but not well organized;
                              did/did not (circle one) significantly lengthen
                              the time required to audit files.

                          [ ] Information not contained in a central file,
                              but maintained in separate files; did/did not
                              (circle one) have opportunity to examine all
                              pertinent information.

      Comments -
      b.  Documentation - [ ] All files reviewed contained necessary
                              documentation.

                          [ ] Some (____%) files reviewed contained necessary
                              documentation.

                          [ ] Files reviewed typically lacked necessary
                              documentation.

      Comments -
VI.   SIGNIFICANT PROBLEMS

     a. List the five (or fewer) most significant problems found as a result

     of the NSR audit.  Start with the most significant problem and continue

     listing in descending order.  [Each problem listed should be supported

     by discussion contained in the audit narrative.]

        i.	

       ii.	

      i i i.	.	'_	

       iv.
                                    3-27

-------
   Form 2 (continued)

        b.  For each problem identified on the previous page, select the reason(s)
        which you believe may contribute to the particular problem:

                                                  (i)   (ii)   (iii)   (iv)   (v)

           ° Inadequate agency procedures         [ ]   [ ]    [ ]     [ ]    [ ]

           ° Failure of agency to follow its own
             procedures                           [ ]   [ ]    [ ]     [ ]    [ ]

           ° Inadequate agency rules/regulations  [ ]   [ ]    [ ]     [ ]    [ ]

           ° Inadequate agency resources/
             organization                         [ ]   [ ]    [ ]     [ ]    [ ]

           ° Need for EPA policy or guidance      [ ]   [ ]    [ ]     [ ]    [ ]

           ° Other:  Specify                      [ ]   [ ]    [ ]     [ ]    [ ]

      Comments -


VII.  PROGRAM IMPROVEMENTS

      Briefly describe below program improvements that have occurred since the
      last NSR audit.  These should also be discussed in full detail in the
      narrative report.  The improvements generally should relate to specific
      audit findings identified during previous audits.
                                      3-28

-------
                                  FORM 3
                        PERMIT FILE QUESTIONNAIRE
           FOR MAJOR SOURCES SUBJECT TO PSD OR PART D (OFFSETS)

[NOTE:  Unless otherwise indicated place an "X" in the box beside each
statement or response which applies.  Many of the questions will allow
more than one response.]
SECTION I.  SOURCE INFORMATION

A.1.  Company/Source Name:
  2.  Source Type/Category and Capacity:
  3.  Location:

B.1.  Region | |                   2.  State | |
  3.  Permitting Agency            a. [ ] State     b. [ ] Local:
  4.  Auditor
  5.  Permit #
  6.  Type of Review:              a. [ ] New Major Source
                                   b. [ ] Major Modification

C.  Date Application               D.  Date Permit to
    Considered Complete:               Construct Issued:
    |  |  |  |  |  |  |                |  |  |  |  |  |  |
      mo   day   yr                      mo   day   yr

E.  This permit was reviewed for (list pollutants):
  1.  Attainment area pollutants
      [ ] PSD for:
  2.  Nonattainment area pollutants
      [ ] Offsets for:
      [ ] Growth allowance for:
  3.  Toxic air pollutants:

F.1. [ ] This source is located within 10 km of a Class I area.
  2. [ ] This source is located within 100 km of a Class I area.
  3. [ ] Source is in an attainment area and significantly impacts a
         designated nonattainment area or any area where a NAAQS violation
         exists.
  4. [ ] Construction ban for some pollutants.
  5. [ ] None of the above.

SECTION II.  PUBLIC PARTICIPATION REQUIREMENTS

A.  Public Notice:                                              YES   NO    CBD
      was published in a newspaper (approx. cost:  $______)     [ ]   [ ]   [ ]
      provided opportunity for public hearing  . . . . . . .    [ ]   [ ]   [ ]
      provided opportunity for written comment . . . . . . .    [ ]   [ ]   [ ]
      described agency's preliminary determination . . . . .    [ ]   [ ]   [ ]
      included estimated ambient impact  . . . . . . . . . .    [ ]   [ ]   [ ]
      indicated addt'l info. available for inspection  . . .    [ ]   [ ]   [ ]
                                    3-29

-------
Form 3 (continued)

B.  The following other affected government agencies were     YES   NO    CBD
    notified:
  1.  other agencies and officials within the State . . . .   [ ]   [ ]   [ ]
  2.  other States  . . . . . . . . . . . . . . . . . . . .   [ ]   [ ]   [ ]
  3.  Federal Land Manager  . . . . . . . . . . . . . . . .   [ ]   [ ]   [ ]
  4.  EPA . . . . . . . . . . . . . . . . . . . . . . . . .   [ ]   [ ]   [ ]

C.  Documentation for parts A and B consists of:
                                                                Notification
                                                                  of Other
                                                Public Notice     Agencies
  1. copy of notice/correspondence in file         a.[ ]           b.[ ]
  2. indication on processing checklist            a.[ ]           b.[ ]
  3. no documentation provided                     a.[ ]           b.[ ]
  4. other, explain:
SECTION III.  APPLICABILITY DETERMINATIONS

A.  Definition of Source.

  1.a.  The new source or modification for which this permit
        was made was considered by the reviewing agency to consist of the
        following new and modified pollutant-emitting activities associated
        with the same industrial grouping, located on contiguous or adjacent
        sites, and under common control or ownership (if more than 5, list
        only 5 largest):
        [ ]  using a plantwide definition
        [ ]  using a dual source definition
        [ ]  using another definition

 New  Modif.           Emission Unit/Size                 Pollutant
 [ ]  [ ]
 [ ]  [ ]
 [ ]  [ ]

    b.  [ ] CBD
                                   3-30

-------
Form 3 (continued)

2.  Were any new or modified pollutant-emitting activities (other than
    fugitive emissions) omitted which should have been included:

       [ ]  NO.       [ ]  CBD.       [ ]  YES, as follows:

                                                   Reason given by agency
     Emission       Emission                       for not considering as
     Unit/Size      Pollutant   TPY   New  Modif.  part of source

                                      [ ]  [ ]
                                      [ ]  [ ]

3.  If yes was checked for Section III.A.2.c. above, did this new source
    or modification escape any PSD or Part D analyses for significant
    emissions as a result of the omission of the listed activities?

    [ ] Yes   [ ] No

B.  Fugitive emissions.

1.  This source or modification:

  a.  [ ]  Does not have any quantifiable fugitive emissions.  (GO TO
           Section III.C.)

  b.  [ ]  Has too little documentation in the file to determine whether
           quantifiable fugitive emissions were included in the emission
           estimates.

  c.  Has quantifiable fugitive emissions which were:

    i.   [ ] Included in determining whether the source/modification was
             major for the following pollutants:  ________________________

    ii.  [ ] Not included in determining whether the source was major for
             the following pollutants:  ___________________________________

             because the source is neither one of the 28 PSD source
             categories nor regulated under Section 111 or 112 of the Act.

    iii. [ ] Not included in determining whether the source was major for
             the following pollutants:  __________________________________,
             although they should have been.
                                    3-31

-------
Form 3 (continued)

C.  Potential to Emit (PTE).  The determination of whether a source is
    major should be based on PTE which, because most sources do not
    operate 8,760 hours per year at 100% capacity, can differ greatly
    from actual emissions.

  1.  The emissions of this source or modification were determined:

    a. Using emission rates based on emission factors which were:

       i.  [ ]  Well established (e.g., AP-42) or well documented in the
                file.

       ii. [ ]  Not well established and lacking adequate documentation for
                the following units and pollutants:
    b. [ ]  CBD

    c. [ ]  Using another method.  Explain:
  2.  The emissions of this source were determined on the basis of:

    a. [ ]  CBD; GO TO C.3.

    b. [ ]  Maximum capacity to emit at full physical and operational design;
            GO TO C.3.
            GO TO C.3.
                                   3-32

-------
Form 3 (continued)

  c. [ ]  Limited capacity based on control equipment, or physical,
          operational or emission limitations, not otherwise required
          (e.g., NSPS), as follows:

                                          If used, was limitation identified on
                                           Constr. Permit    Operating Permit
          Limitation         Pollutants    Yes    No         Yes     No    NA

    i.   [ ] Control equipment* ________   [ ]    [ ]        [ ]    [ ]   [ ]

    ii.  [ ] Emission limit*    ________   [ ]    [ ]        [ ]    [ ]   [ ]

    iii. [ ] Operating hours    ________   [ ]    [ ]        [ ]    [ ]   [ ]

    iv.  [ ] Operating rate     ________   [ ]    [ ]        [ ]    [ ]   [ ]

    v.   [ ] Fuel/material      ________   [ ]    [ ]        [ ]    [ ]   [ ]
             restriction

    vi.  [ ] ________________   ________   [ ]    [ ]        [ ]    [ ]   [ ]

*Not otherwise required, i.e., main purpose of limitation is to reduce
potential emissions.

If Section III.C.2.c. above is marked, please describe the limitations:
3.  Do the calculated emissions correctly represent the source's potential
    emissions?

  a.  [ ] YES

  b.  [ ] CBD

            Explain: _____________________________________________________

  c.  NO, because:

    i.   [ ] calculations based on control equipment, or physical,
             operational or emission limitations that are not Federally
             enforceable

    ii.  [ ] emission factors not acceptable

    iii. [ ] did not adequately address fugitive emissions
                                    3-33

-------
Form 3 (continued)

    iv.  [ ] did not address all pollutant-emitting activities (other than
             fugitives)
                Explain:
    v.   [ ] other.  Explain:  ____________________________________________

D.  Emission Netting.

  1.  Check the appropriate box for this permit action:

    a.  [ ] New source; emission netting not applicable.  GO TO Section III.E.

    b.  [ ] Major modification

  2.  The determination of significant emissions can be a complex process
      when emission netting occurs.  The following worksheet should be
      used to determine whether the proper procedure was followed:
                                     Other creditable,
                  Proposed Emission  contemporaneous          Overall
   Pollutant      Changes, TPY       emissions changes, TPY   Net Change, TPY

  a.  TSP         ________           ________                 ________

  b.  ______      ________           ________                 ________

  c.  SO2         ________           ________                 ________

  d.  NOX         ________           ________                 ________

  e.  O3 (VOC)    ________           ________                 ________

  f.  CO          ________           ________                 ________
3.  Did agency correctly identify whether emissions were significant?

    [ ] Yes  [ ] CBD  [ ] No; Explain _____________________________________
4.  Did analysis consider all pollutants for which a net increase in
    emissions occurred?

    [ ] Yes

    [ ] No, failed to address one or more pollutants.  Explain: __________
    [ ] CBD; Explain
                                    3-34

-------
Form 3 (continued)

***COMPLETE THE FOLLOWING APPLICABLE STATEMENTS.***

   5. a.  Emission netting was based on actual emissions.

         [ ] CBD; GO TO Question 7.  [ ] No; GO TO Question 6.  [ ] Yes

      b.  Indicate whether the calculation of actual emissions properly
          considered the following criteria:

                                                         Yes   No    CBD

         i.  Representative of normal unit operation     [ ]   [ ]   [ ]

        ii.  Based on a two-year average                 [ ]   [ ]   [ ]

       iii.  Expressed in TPY                            [ ]   [ ]   [ ]

  6.  Emission netting was based on another approach.

    a. [ ] No.

    b. [ ] Yes, and approach was acceptable.  Explain: ___________________
    c. [ ] Yes, but approach was incorrect.  Explain:
  7.  Emission decreases were considered.

     a. [ ] No; GO TO E.

     b. [ ] Yes, and the decreases:

       Were   Were not  CBD   N/A

       [ ]      [ ]     [ ]         a.  Contemporaneous with the proposed
                                        modification.

       [ ]      [ ]     [ ]         b.  Used in calculating a net emission
                                        change.

       [ ]      [ ]     [ ]   [ ]   c.  Previously counted as part of the
                                        SIP attainment strategy (Part D
                                        source only).

       [ ]      [ ]     [ ]   [ ]   d.  Previously relied on to meet the
                                        "reasonable further progress"
                                        requirement of Part D (Part D
                                        source only).

       [ ]      [ ]     [ ]         e.  Made Federally enforceable as
                                        permit conditions.

                                    3-35

-------
Form 3 (continued)

E.  Emission limits.

1.  Did the agency identify the appropriate allowable emission rates with
    respect to each emission unit?

[ ] Yes  [ ] No; explain: _________________________________________________
2.  The number of limitations actually appearing or referenced in the
    construction permit is ______ (see worksheet for emission units).

3.  How many of the limitations:                            Number
                                                      Yes     No     CBD
  a.  Include clear and concise averaging periods
      compatible with appropriate requirements
      (e.g., NSPS, short-term NAAQS)?

  b.  Are compatible with acceptable measurement
      techniques?
  c.  Consist of design, equipment, work practice,
      or operational standards?

  d.  Appear Federally enforceable?

  e.  Include stated or referenced compliance test
      methods?
SECTION IV.  BACT/LAER DETERMINATIONS  ([ ] If not applicable, mark here,
             THEN GO TO SECTION V.)
A.  BACT Analysis.
                                                TSP PM10 SO2 VOC NOX CO
   1. Pollutant emitted in significant amounts. [ ] [ ]  [ ] [ ] [ ] [ ]

   2. BACT/LAER analysis made (indicate w/"B"   [ ] [ ]  [ ] [ ] [ ] [ ]
      or "L").

   3. BACT/LAER was specified in permit.        [ ] [ ]  [ ] [ ] [ ] [ ]

                                                           YES   NO   CBD
   4. a. Did the application address more than one
         control option for BACT?                          [ ]   [ ]  [ ]

      b. If NO, was the selected BACT clearly acceptable?  [ ]   [ ]  [ ]

         Comments: _______________________________________________________


      c. If YES, did each option address the economic,
         energy, and environmental impacts associated
         with that option?                                 [ ]   [ ]  [ ]
                                    3-36

-------
Form 3 (continued)
                                                           YES   NO
   5. Does the file contain documentation to show that
      the reviewing agency verified the applicant's        [ ]   [ ]
      calculations and assumptions for BACT/LAER?

B.  BACT/LAER Stringency.  (Use the appropriate symbol(s) below to answer
    this part.)

   1.  a.  Was "top-down" used in determining BACT? [ ] YES [ ] NO [ ] CBD

       b.  Was the final control technology chosen
           more stringent than BACT?                [ ] YES [ ] NO [ ] CBD

       c.  Comments:
   2. Is the source (or modification) one for which NSPS or NESHAP has
      been established?             [ ] No; GO TO Section V.  [ ] Yes.

   3. The Agency's BACT/LAER determination          BACT      LAER
      compared to NSPS/NESHAP is:
                                          1.  TSP   [ ]       [ ]

     "A" -- more stringent                2.  PM10  [ ]       [ ]

     "B" -- equal                         3.  SO2   [ ]       [ ]

     "C" -- less stringent                4.  HC    [ ]       [ ]

     "D" -- did not address,              5.  NOX   [ ]       [ ]

            but should have               6.  CO    [ ]       [ ]

     "NA" -- not applicable               7. ____   [ ]       [ ]

                                          8. ____   [ ]       [ ]

   4.  Were air toxics considered in
       determining BACT?                    [ ] YES [ ] NO [ ] CBD
SECTION V.  AIR QUALITY MONITORING DATA -- PSD  ([ ] If not subject to PSD,
mark here, THEN GO TO Section VI)

A.  Air Quality Monitoring Worksheet:

                                For each "yes" in (a), complete the following:

                 (a) Are potential    (b) Are modeled        (c) Is existing air
                     emissions            concentrations         quality
                     significant?         significant?           significant?
     Pollutant       YES    NO            YES   NO   CBD         YES   NO   CBD

  1.  TSP            [ ]    [ ]           [ ]   [ ]  [ ]          [ ]   [ ]  [ ]
  2.  PM10           [ ]    [ ]           [ ]   [ ]  [ ]          [ ]   [ ]  [ ]
  3.  SO2            [ ]    [ ]           [ ]   [ ]  [ ]          [ ]   [ ]  [ ]
  4.  CO             [ ]    [ ]           [ ]   [ ]  [ ]          [ ]   [ ]  [ ]
  5.  NOX            [ ]    [ ]           [ ]   [ ]  [ ]          [ ]   [ ]  [ ]
  6.  VOC/O3         [ ]    [ ]           [ ]   [ ]  [ ]          [ ]   [ ]  [ ]
  7.  ______         [ ]    [ ]           [ ]   [ ]  [ ]          [ ]   [ ]  [ ]

                                    3-37

-------
Form 3 (continued)

B.  Applicability.

    1.  Was source required to address PSD air quality monitoring data
        requirements (either source monitoring or use of existing data)?

    a. [ ]  Yes, required to address air quality monitoring data
            requirements for at least one pollutant.

    b. [ ]  No, existing air quality for all pollutants was determined to
            be de minimis (GO TO SECTION VI).

    c. [ ]  No, proposed ambient concentration increases for all pollutants
            were demonstrated to be de minimis (GO TO SECTION VI).

    d. [ ]  No, for the following reason(s):
                  (GO TO SECTION VI)

  C.  Ambient Monitoring.

      1.  Was ambient monitoring required of applicant?

               [ ] No; GO TO D, below   [ ] Yes

      2.  Did the applicant submit a monitoring plan, including quality
          assurance (QA) procedures?
            a. [ ] YES, for ____________________
            b. [ ] NO, for  ____________________     GO TO Question 4
            c. [ ] CBD, for ____________________     GO TO Question 4

     3.  Is the monitoring plan in the permit file?
            a. [ ] YES, for 	
            b. [ ] NO, for 	

     4.  For how long did the monitors collect air quality data?
            a. [ ] 12 months or more for: 	.	
            b. [ ] 4 to 12 months for:	

            c. [ ] less than 4 months for: 	
               If less than 12 months of data were submitted, summarize
               explanation:	
  D.  Representative Data

     1.  Was the use of existing data allowed?
               [ ] No, GO TO Section VI.     [ ] Yes
                                    3-38

-------
Form 3 (continued)

     2.  Is the basis for allowing the use of existing data documented
         in the permit file?       .

            a. [ ] YES, for 	

            b. [ ] NO, for 	
     3.  a.  Did the agency's determination of "representative" adequately
             consider:

                                               YES for:  NO for:  C8D for:
            i.  Location of existing monitors  [ ]	[ ] 	  [ ] 	
           ii.  Quality of the existing air
                 quality data.                 [ ] 	  [ ] 	  [ ]	
          iii.  Currentness of existing air
                 quality data                  [ ] ___  [ ] ___  [ ] ___
       b.  If "NO" for any pollutant, please explain:
SECTION VI.  PSD INCREMENT ANALYSIS
  A.  Modeling Analysis

Note:  It is important that an auditor knowledgeable in modeling techniques
       and required procedures participate in this portion of the audit.

                                                     CLASS I     CLASS II
1.  Was a PSD increment analysis performed?           TSP  SO2    TSP  SO2

    [ ] NO, GO TO VI.D.        [ ] YES, as follows:   [ ]  [ ]    [ ]  [ ]
2.  How was the analysis performed?

    [ ] By the applicant with adequate review
        (including replication of results, if
        appropriate) by the agency for _________      [ ]  [ ]    [ ]  [ ]

    [ ] By the applicant, without adequate agency
        review for _________                          [ ]  [ ]    [ ]  [ ]

    [ ] By the reviewing agency for _________         [ ]  [ ]    [ ]  [ ]

    [ ] Not applicable for _________                  [ ]  [ ]    [ ]  [ ]
                                    3-39

-------
Form 3 (continued)

3.  Identify the dispersion model(s) used to. perform the increment analysis:

 Model Used              Pollutant/Area Classification      Averaging Times
                                                            3hr 24hr Annual
(Identify one
model  per line)
                      [ ] SO2 for [ ] Class I [ ] Class II  [ ] [ ]   [ ]
  a. _______________  [ ] TSP for [ ] Class I [ ] Class II      [ ]   [ ]

                      [ ] SO2 for [ ] Class I [ ] Class II  [ ] [ ]   [ ]
  b. _______________  [ ] TSP for [ ] Class I [ ] Class II      [ ]   [ ]

                      [ ] SO2 for [ ] Class I [ ] Class II  [ ] [ ]   [ ]
  c. _______________  [ ] TSP for [ ] Class I [ ] Class II      [ ]   [ ]


  d. [ ] CBD

4.    Did the agency select appropriate model(s)?             FOR MODEL
                                                            (see A.3. above)

                                                            3.a  3.b  3.c
    [ ] a. Yes, and documentation supports use of each      [ ]  [ ]  [ ]
           model as being appropriate.
    [ ] b. Yes, model was appropriate, but inadequate       [ ]  [ ]  [ ]
           documentation was available to explain its
           selection.
    [ ] c. Cannot be determined.  Documentation not         [ ]  [ ]  [ ]
           provided to justify model selection.
    [ ] d. No, documentation failed to address              [ ]  [ ]  [ ]
           appropriate considerations.  Explain: __________________________
  5.  Did the agency exercise the appropriate model options (urban/rural,
      receptor network design, wind speed profiles, building wake effects,
      final/gradual plume rise, etc.)?
                                                              FOR MODEL

                                                            3.a  3.b  3.c
    [ ] a. Yes, and documentation supports use of options   [ ]  [ ]  [ ]
           as being appropriate.
    [ ] b. Yes, options were appropriate, but inadequate    [ ]  [ ]  [ ]
           documentation was available to explain their
           selection.
    [ ] c. Cannot be determined.  Documentation not         [ ]  [ ]  [ ]
           provided to justify options selection.
    [ ] d. No.  Explain: ________________________________   [ ]  [ ]  [ ]
                                    3-40

-------
Form 3 (continued)

6.  Did analysis consider the appropriate meteorological  data?

    [ ] 1. Yes, five consecutive years of the most recent representative
           sequential hourly National  Weather Service data.
    [ ] 2. Yes, one year of NWS data (if 5 not available) + use of
           highest modeled results.
    [ ] 3. Yes, five years of on site  data subjected to quality assurance
           procedures.
    [ ] 4. Yes, at least one year of hourly sequential  on site data,
           including worst-case conditions and subjected  to  quality
           assurance procedures.
    [ ] 5. Yes, screening data were used to obtain conservative results.
    [ ] 6. No.  Explain: 	
    [ ] 7. CBD                                    .         .

B.  Baseline Area.

    The baseline area for either TSP or S02 (or both) is defined as one
or more designated attainment/unclassified (§107) areas and will include
the §107 area the source will  locate in, plus  any other §107 areas where
the pollutant impact exceeds 1 ug/m3 annual average (1 ug/m3, 24-hour
average, for Class I areas).

  1.  Were §107 areas properly applied?                        TSP   SO2

     [ ] a. Yes, the baseline  area consists of all  portions
            of the designated  attainment/unclassified areas
            as listed in 40 CFR 81 Subpart C.                   [ ]   [ ]
     [ ] b. No, the baseline area consists of  only portions
            of the designated  attainment/unclassified areas.   [ ]    [ ]
            Explain:  	

     [ ] c. Cannot be determined from available information.   [ ]    [ ]

  2.  Did the baseline area include any other  areas  besides the area in
      which the source would construct?

     [ ] a. No, file documentation demonstrated no significant impact
            beyond area where  source would locate.
     [ ] b. No, but documentation was not provided to indicate whether
            other areas should have been included.
     [ ] c. Yes, for some of the [ ] TSP [ ] S02 attainment/unclassified
            areas in baseline  area.
     [ ] d. CBD.
                                    3-41

-------
Form 3 (continued)

  3.  Did the source trigger the baseline date?

      [ ] a. Yes, for entire baseline area.
      [ ] b. Yes, for some of the [ ] TSP [  ] S02 attainment/unclassified
             areas.
      [ ] c. No, baseline date(s) for all areas within baseline area
             previously triggered.
      [ ] d. CBD.

C.  Increment Consumption

  1.  Did the analysis include, where appropriate (or explain why not):

                                                              TSP    SO2
    a. emissions from major sources commencing construction
       after 1/6/75 in determining increment consumed (PSD
       Workshop Manual, Pt I, Sec. C.2)?

             (i) Yes, for _________________________________   [ ]    [ ]
            (ii) No, for __________________________________   [ ]    [ ]
           (iii) No prior major source emissions consumed
                 increment, for ___________________________   [ ]    [ ]
            (iv) CBD, for _________________________________   [ ]    [ ]

    b. emissions from minor sources occurring after the
       applicable baseline date(s) within the impact area
       in determining increment consumed?

             (i) Yes, for _________________________________   [ ]    [ ]
            (ii) No.  Explain: ____________________________   [ ]    [ ]
           (iii) Not applicable.  Source triggered baseline
                 date _____________________________________   [ ]    [ ]
            (iv) CBD, for _________________________________   [ ]    [ ]

  2.  What impact concentrations were used for the short-term increments?

                                 Concentration Used
                                    Highest of the
       Pollutant        Highest     2nd highest        Other (explain)

          TSP             [ ]           [ ]            [ ] ________________
          SO2             [ ]           [ ]            [ ] ________________

D.  Is there any reason to believe that an increment analysis should
    have been performed but was not?

          [ ] No.                                 Increment
          [ ] Yes, as follows:   [ ] TSP   [ ] Class I    [ ] Class II
                                 [ ] SO2   [ ] Class I    [ ] Class II
   Explain each "yes": ____________________________________________________
                                    3-42

-------
Form 3 (continued)
SECTION VII.  NAAQS PROTECTION

1.  Was a NAAQS analysis performed?

    [ ] NO, GO TO Question 8
    [ ] YES, as follows:
                              Identify model used in the blanks below.

    a. [ ] TSP or [ ] PM10                [ ]24hr: ______  [ ]annual: ______
    b. [ ] SO2          [ ]3hr:  ______   [ ]24hr: ______  [ ]annual: ______
    c. [ ] NOX                                             [ ]annual: ______
    d. [ ] CO           [ ]1hr:  ______   [ ]8 hr: ______
    e. [ ] O3           [ ]1hr:  ______
    f. [ ] Pb                                              [ ]3mos:   ______
2.  For the pollutants checked above, how was the analysis performed?

                                                   FOR THESE POLLUTANTS:
  [ ] a. by the applicant with adequate review
         (including replication of results, if
         appropriate) by the agency.               [ ] all   [ ]: ______

  [ ] b. by the applicant without adequate review
         (including replication of results, if
         appropriate) by the agency.               [ ] all   [ ]: ______
         Explain:
  [ ] c. by the agency.                            [ ] all   [ ]: ______

3.  Did the applicant/agency use the appropriate model(s) to complete the
    analysis?

                                                     FOR THESE MODELS:
  [ ] a. YES, and documentation supports use
         of the model (s).                         C ] all   [ ]:  __

  [ ] b. YES, but inadequate documentation was
         available to explain its use.            [ ] all   [ ]:  _

  [ ] c. CANNOT BE DETERMINED, documentation not
         provided to justify model  selection.     [ ] all   [ ]:  _

  [ ] d. NO, documentation failed to address
         appropriate considerations.              [ ] all   [ ]: ______
         Explain:  ________________________________________________________
                                    3-43

-------
Form 3 (continued)

4.  Did the applicant/agency use the appropriate model options (urban/
    rural, receptor network design, wind speed profiles, building wake
    effects, final/gradual plume rise)?
                                                    FOR THESE MODELS:
[ ] a.  YES, and documentation suoports use of
        the option(s).                           [ ] all  [ ]: 	

[ ] b.  YES, but inadequate documentation was    [ ] all  [ ]: 	
        available to explain their use.

[ ] c.  CANNOT BE DETERMINED, documentation not  [ ] all  [ ]: 	
        provided to justify model option
        selection.
                                                     i
[ ] d.  NO, documentation failed to address      [ ] all  [ ]: ______
        appropriate considerations.
        Explain:  	

5.  Did the analysis consider appropriate meteorological data?

  [ ] a.  YES, five consecutive years of the most recent, representative
          sequential hourly National Weather Service data.
  [ ] b.  YES, five years of on site data subjected to quality assurance
          procedures.
  [ ] c.  YES, at least one year of hourly sequential on site data,
          including worst-case conditions and subjected to quality
          assurance procedures.
  [ ] d.  YES, one year of NWS data if 5 not available + use of highest
          modeled results.
  [ ] e.  YES, screening data were used to obtain conservative results.
  [ ] f.  NO.  Explain: ___________________________________________________

  [ ] g.  CBD

6.  Is there sufficient information in the file to verify that emissions
    from the following stationary sources (including sources with permits,
    but not yet in operation) were adequately considered when appropriate?

                                                       YES    NO    CBD
  a. Existing major stationary sources.
     Explain: _________________________________        [ ]    [ ]   [ ]

  b. Existing minor and area stationary sources.
     Explain: _________________________________        [ ]    [ ]   [ ]

-------
Forr 3 (continued)

7.  Did the analysis provide adequate consideration of multi-source
    pollutant interactions?

 [ ] a.  YES, analysis adequately defined points of maximum impact
         determined from consideration of all sources in the vicinity
         rather than the maximum impact of the proposed source alone, and
         the modeling exercise followed the guidance contained in the
         current Guideline on Air Quality Modeling.

 [ ] b.  NO, analysis ignored significant emissions from other sources
         in the vicinity.  Explain:  ______


 [ ] c.  CBD.

8.  Is there any reason to believe that one or more NAAQS analyses should
    have been performed but were not?

    [ ] CBD.
    [ ] NO.
    [ ] YES; explain. ______
SECTION VIII.  EMISSION OFFSET REQUIREMENTS
A.  Emission offsets:

   [ ] 1. were not applicable to this source.  (GO TO SECTION IX.)

   [ ] 2. were applied to the following pollutants: ______
   [ ] 3. should have been applied to
          the following pollutants: ______
   [ ] 4. use cannot be determined from information in the file.
       Explain: ______

-------
Form 3 (continued)

B.  For the six questions below, identify the applicable pollutants, then
    use:   "Y" for yes, "N" for no, "NA" for not applicable, or "CBD" in
    spaces below.  (Note:  You may need to consult the file for the
    source(s) from which the offsets are being obtained to be able to
    respond to the following questions.)

                                      Information obtained from:
                              ______(Check appropriate box, below)______
                                                        Offset
                                 Pollutant       This   Source    Other
                              (____)(____)(____) Permit Permit  (Explain)
    1.  Emission offsets
    obtained by this source
    are expressed in the same
    terms (i.e., actual or
    allowable) as are those
    emissions used in the     ______  ______  ______   [ ]    [ ]  [ ]______
    RFP demonstration.

    2.  Minor/area source
    growth was taken into
    account in determining
    the amount of emission
    offsets needed.           ______  ______  ______   [ ]    [ ]  [ ]______

    3.  Offsets are surplus,
    i.e., would not interfere
    with RFP.                 ______  ______  ______   [ ]    [ ]  [ ]______

    4.  Offsets are
    Federally enforceable.    ______  ______  ______   [ ]    [ ]  [ ]______

    5.  Offsets were required
    to occur on or before the
    date of start-up of the
    new or modified source.   ______  ______  ______   [ ]    [ ]  [ ]______

    6.  Offsets were not
    utilized from early source
    shutdowns or production
    curtailments, except for
    replacements.             ______  ______  ______   [ ]    [ ]  [ ]______

    Comments:
                                    3-46

-------
Form 3 (continued)
SECTION IX.  COMMENTS, NOTES
                                    3-47

-------
                                  FORM 4

                        PERMIT FILE QUESTIONNAIRE
            FOR SOURCES NOT SUBJECT TO PSD OR PART D (OFFSETS)

[NOTE:  Unless otherwise indicated, place an "X" in the box beside each
 statement or response which applies.  Many of the questions will allow
 more than one response.]

SECTION I.  SOURCE INFORMATION

A.1.  Company/Source Name:
  2.  Source/Type Category and Capacity:

  3.  Address:

B.  1. Region:
    2. State:
    3. Permitting Agency:
       a. [ ] State
       b. [ ] Local
    4. Auditor:
    5. Permit #:
    6. Type of Review:
       a. [ ] New Source
       b. [ ] Modification

C.  Date Complete              D.  Date Permit to
    Application rec'd:             Construct Issued:

    |___|___|___|                  |___|___|___|
     mo  day  yr                    mo  day  yr

E.  Source Location

1. [ ] Attainment/unclassified for
       all criteria pollutants

2. [ ] Nonattainment without construction
       ban for: ______

3. [ ] Nonattainment area subject to a construction ban for: ______

4. [ ] Within 10 km of a Class I area

SECTION  II.  PUBLIC PARTICIPATION REQUIREMENTS

A.  Public Notice:
                                                              YES  NO  CBD
1.  [ ] was not issued because exempted by agency rules.
    GO TO II.B.
2.  [ ] was not issued, but agency rules do not exempt.
    GO TO II.B.
3.  was published in a newspaper (approximate cost $ ______). [ ]  [ ]  [ ]
4.  provided opportunity for public hearing . . . . . . . .  [ ]  [ ]  [ ]
5.  provided opportunity for written comment  . . . . . . .  [ ]  [ ]  [ ]
6.  described agency's preliminary determination  . . . . .  [ ]  [ ]  [ ]
7.  included estimated ambient impact . . . . . . . . . . .  [ ]  [ ]  [ ]
8.  indicated availability of additional information for
    public inspection . . . . . . . . . . . . . . . . . . .  [ ]  [ ]  [ ]
9.  resulted in ______ comments ("0" if notice produced no
    comments)
                                 3-48

-------
 Form 4 (continued)

 B.  The following other affected government agencies were notified:

   1.  other agencies and officials within the State . . . .  [ ]  [ ]  [ ]
   2.  other States  . . . . . . . . . . . . . . . . . . . .  [ ]  [ ]  [ ]
   3.  Federal Land Manager  . . . . . . . . . . . . . . . .  [ ]  [ ]  [ ]
   4.  EPA . . . . . . . . . . . . . . . . . . . . . . . . .  [ ]  [ ]  [ ]

 C.  Documentation for Section II parts A and B consists of:
                                                             Notification of
                                             Public Notice   Other Agencies
   1.  copy of notice/correspondence in file      a.[ ]           b.[ ]
   2.  indication on processing checklist*        a.[ ]           b.[ ]
   3.  no documentation provided                  a.[ ]           b.[ ]
   4.  other, explain: ______

  *  i.e., no copies, but some official indication in file that notice was
     provided

 SECTION III.  APPLICABILITY DETERMINATIONS

 A.  Definition of Source.  The source or modification for which this
     permit application was made was considered by the reviewing agency to
     consist of:

   1.a. [ ] The following new and modified pollutant-emitting activities
       associated with the same industrial grouping, located on contiguous
       or adjacent sites, and under common control or ownership (if more
       than 5, list only the 5 largest):

          New   Modif.   Emission Unit/Size          TPY/Pollutant

    i.    [ ]    [ ]     ____________________        _____________
   ii.    [ ]    [ ]     ____________________        _____________
  iii.    [ ]    [ ]     ____________________        _____________
   iv.    [ ]    [ ]     ____________________        _____________
     b.  [ ] CBD

   2.  Were any new and modified pollutant-emitting activities (other than
       fugitive emissions) omitted which should have been included:

         [ ] No       [ ] CBD     [ ] Yes, as follows:
                                    3-49

-------
Form 4 (continued)

                                                    Reason given by agency
         Emission Unit      Emissions                 for not considering
        or Activity/Size  Pollutant TPY  New  Modif.    as part of source

     a. ________________  _________ ___  [ ]   [ ]   ______________________

     b. ________________  _________ ___  [ ]   [ ]   ______________________

     c. ________________  _________ ___  [ ]   [ ]   ______________________

     d. ________________  _________ ___  [ ]   [ ]   ______________________

   3.  If Section III.A.2.c. above was marked, did this new source or modifi-
       cation escape PSD or Part D (major source locating in a nonattainment
       area) review as a result of the omission of the activities?

        [ ] Yes      [ ] No      [ ] CBD

B.  Fugitive Emissions.

  1.  This source or modification:

    a.  [ ]  Does not have any quantifiable fugitive emissions.
             (GO TO Section III.C.)

    b.  [ ]  Has too little documentation in the file to determine whether
             quantifiable fugitive emissions occur or were considered.

    c.  Has quantifiable fugitive emissions which were:

      i. [ ]  Included in determining whether the source/modification was
              major for the following pollutants:  	

      ii. [ ]  Not included in determining whether the source was major for
               the following pollutants: ______
               because the source is neither one of the 28 PSD source
               categories nor regulated under Sections 111 or 112 of the Act.

     iii. [ ]  Not included in determining whether the source was major
               for the following pollutants: ______
               although they should have been.

  2.  Did this source escape PSD or Part D review as a result of the
      omission of fugitive emissions?

      [ ] Yes    [ ] No     [ ] CBD

C.  Potential to Emit (PTE).  Determination of whether a source is major
    should be based on PTE rather than actual emissions.
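
[NOTE:  The following Python sketch is illustrative only and is not part of
 the questionnaire.  It shows the usual PTE arithmetic (maximum design rate
 over 8,760 hr/yr, reduced only by Federally enforceable limits); the
 function name, the treatment of a control requirement as a simple
 multiplier, and the example numbers are assumptions made for illustration.]

    HOURS_PER_YEAR = 8760.0

    def potential_to_emit(max_hourly_rate_lb, enforceable_hours=None,
                          control_efficiency=0.0):
        """Annual PTE in tons/yr from a maximum hourly rate in lb/hr.

        Only Federally enforceable limits (e.g., permit conditions on hours
        of operation or required control equipment) may reduce PTE below
        full design capacity.
        """
        hours = enforceable_hours if enforceable_hours is not None else HOURS_PER_YEAR
        lb_per_year = max_hourly_rate_lb * hours * (1.0 - control_efficiency)
        return lb_per_year / 2000.0   # pounds -> short tons

    # 60 lb/hr at full design with no enforceable limits: 262.8 TPY
    print(round(potential_to_emit(60.0), 1))
    # Same unit with a Federally enforceable 90% control requirement: 26.3 TPY
    print(round(potential_to_emit(60.0, control_efficiency=0.90), 1))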

  1.  The emissions of this source or modification were determined:

    a.  Using emission rates based on emission factors which were:

                                   3-50

-------
Form 4 (continued)

      i.   [ ]  Well established (e.g., AP-42) or well documented in the
                file.
     ii.   [ ]  Not well established and lacking adequate documentation
                for the following units and pollutants: ______
    b.  [ ]  CBD

    c.  [ ]  Using another method.  Explain: ______
  2.  The emissions of this source were determined on the basis of:

    a.  [ ]  CBD, Go to C.3.

    b.  [ ]  Maximum capacity to emit at full physical and operational
             design; GO TO C.3.
    c.  [ ]  Limited capacity based on control equipment, physical,
             operational or emission limitations, not otherwise required
             (e.g., NSPS), as follows:

                                     If used, was limitation identified on:
                                          Preconst. Permit   Operating Permit
          Limitation       Pollutants       Yes      No       Yes   No   NA

   i.  [ ] Control
           equipment*      __________       [ ]     [ ]       [ ]  [ ]  [ ]

  ii.  [ ] Emission
           limit*          __________       [ ]     [ ]       [ ]  [ ]  [ ]

 iii.  [ ] Operating
           hours           __________       [ ]     [ ]       [ ]  [ ]  [ ]

  iv.  [ ] Operating
           rate            __________       [ ]     [ ]       [ ]  [ ]  [ ]

   v.  [ ] Fuel/
           material
           restriction     __________       [ ]     [ ]       [ ]  [ ]  [ ]

  vi.  [ ] ____________    __________       [ ]     [ ]       [ ]  [ ]  [ ]
   *Not otherwise required (i.e., main purpose of limitation is to reduce
    potential emissions).

If Section III.C.2.c. above is marked, please describe the limitations:
                                    3-51

-------
Form 4 (continued)

  3.  Do the calculated emissions correctly represent the source's
      potential emissions?

    a.  [ ]  YES

    b.  [ ]  CBD.  Explain: ______

    c.  No, because:

      i.  [ ] calculations based on control equipment, physical,
              operational or emission limitations that are not Federally
              enforceable

     ii.  [ ] emission factors not acceptable

    iii.  [ ] did not adequately address fugitive emissions

     iv.  [ ] did not address all pollutant-emitting activities (other
              than fugitives) Explain:  ______
      v.  [ ] other.  Explain: ______
D.  Emission Netting.

  1.  Check the appropriate box for this permit action:

    a. [ ] New source; emission netting not applicable.
           GO TO Section III.E.

    b. [ ] Source was not required by agency to determine any net change
           in emissions, but should have been.  EXPLAIN:  ______
                                                     GO TO Section III.E.
    c. [ ] Source review included a determination of a net change in
           emissions.

  2.  The determination of significant emissions can be a complex process
      when emission netting occurs.  The following worksheet should be
      used to determine whether the proper procedure was followed (see the
      illustrative sketch following the worksheet):

                                        Other creditable,
                  Proposed emission    contemporaneous       Overall net
                    changes, TPY     emission changes, TPY     change,
      Pollutant      (+)    (-)           (+)    (-)            TPY

      a.  TSP        ____   ____          ____   ____          ____
                                    3-52

-------
Form 4 (continued)
                                        Other creditable,
                  Proposed emission    contemporaneous       Overall net
                    changes, TPY     emission changes, TPY     change,
      Pollutant      (+)    (-)           (+)    (-)            TPY

      b.  PM10       ____   ____          ____   ____          ____

      c.  SO2        ____   ____          ____   ____          ____

      d.  NOx        ____   ____          ____   ____          ____

      e.  O3 (VOC)   ____   ____          ____   ____          ____

      f.  CO         ____   ____          ____   ____          ____

      g.  ________   ____   ____          ____   ____          ____

      h.  ________   ____   ____          ____   ____          ____
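
[NOTE:  The following Python sketch is illustrative only and is not part of
 the questionnaire.  It shows the netting arithmetic behind the worksheet;
 the significance rates listed are examples and should be confirmed against
 the regulations applicable to the permit under review.]

    SIGNIFICANCE_TPY = {"TSP": 25.0, "PM10": 15.0, "SO2": 40.0,
                        "NOx": 40.0, "VOC": 40.0, "CO": 100.0}

    def overall_net_change(proposed_increase, proposed_decrease,
                           other_creditable_increase, other_creditable_decrease):
        """Overall net emission change in TPY; enter decreases as positive numbers."""
        return (proposed_increase - proposed_decrease
                + other_creditable_increase - other_creditable_decrease)

    def is_significant(pollutant, net_change_tpy):
        """True if the net increase meets or exceeds the pollutant's significance rate."""
        return net_change_tpy >= SIGNIFICANCE_TPY.get(pollutant, float("inf"))

    # Example SO2 row:  a proposed increase of 55 TPY and a creditable
    # contemporaneous decrease of 20 TPY give a net change of 35 TPY,
    # below the 40 TPY example rate.
    net = overall_net_change(55.0, 0.0, 0.0, 20.0)
    print(net, is_significant("SO2", net))   # 35.0 False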
  3.  Did the agency correctly identify whether emissions were significant?

      [ ] Yes    [ ] CBD   [ ] No, explain ______



  4.  Did the analysis consider all pollutants for which a net increase in
      emissions occurred?

      [ ] Yes.

      [ ] No, failed to address one or more pollutants.  Explain: ______
      [ ] CBD.  Explain: ______
***COMPLETE THE FOLLOWING APPLICABLE STATEMENTS.***

  5.a.  Emission netting was based on actual emissions.

         [ ] CBD; GO TO Question 7.   [ ] No; GO TO Question 6.   [ ] Yes

    b.  Indicate whether the calculation of actual emissions properly
        considered the following criteria:
                                                      Yes     No     CBD

         i.  Representative of normal unit operation  [ ]     [ ]    [ ]

        ii.  Based on a two-year average              [ ]     [ ]    [ ]

       iii.  Expressed in TPY                         [ ]     [ ]    [ ]
                                    3-53

-------
Form 4 (continued)

  6.  Emission netting was based on another approach.

    a. [ ] No.

    b. [ ] Yes, and approach was acceptable.  Explain:
    c. [ ] Yes, but approach was incorrect.  Explain:
  7.  Emission decreases were considered.

    a. [ ] No; GO TO E.

    b. [ ] Yes, and the decreases:

       Were   Were not  CBD   N/A

       [ ]      [ ]     [ ]         i.  Contemporaneous with the proposed
                                        modification.

       [ ]      [ ]     [ ]        ii.  Previously relied on to determine
                                        a net emission change.

       [ ]      [ ]     [ ]   [ ] iii.  Previously counted as part of the
                                        SIP attainment strategy (Part D
                                        source only).

       [ ]      [ ]     [ ]   [ ]  iv.  Previously relied on to meet the
                                        "reasonable further progress"
                                        requirement of Part D (Part D
                                        source only).

       [ ]      [ ]     [ ]          v.  Made Federally enforceable as
                                         permit conditions.

E.  Definition of Major.  This source was NOT subjected to PSD or Part D
    review because:
-------
Form 4 (continued)

  2.  [ ] Emission increases resulting from modification were not
      significant.

  3.  [ ] Source eligible for exemption; describe:	
  4.  Review agency erred and source should have been subject to:

    a.  [ ] PSD review; explain: ______
    b.  [ ] Part D review; explain: ______

  5.  [ ] CBD

F.  Emission limits.

  1.  Did the agency identify the appropriate allowable emission rates
      with respect to each emission unit in the construction permit?

      [ ] Yes.  [ ] No; Explain: ______

  2.  The number of limitations actually appearing in the preconstruction
      permit(s) is ______.

  3.  How many of the limitations:
                                                           Number
                                                  Yes    No   CBD  Total
    a.  Include clear and concise averaging
        periods compatible with appropriate
        requirements (e.g., NSPS, short-term
        NAAQS)?                                   ____  ____  ____  ____

    b.  Are consistent with acceptable measure-
        ment techniques?                          ____  ____  ____  ____

    c.  Consist of design, equipment, work
        practice, or operational standards?       ____  ____  ____  ____

    d.  Are Federally enforceable?                ____  ____  ____  ____

    e.  Include stated or referenced compliance
        test methods?                             ____  ____  ____  ____

G.  Applicability Summary.  Is there any reason to believe that this
    application should have been subject to PSD or Part D provisions?

  1.  [ ] NO, GO TO Section IV.

  2.  [ ] YES.  Explain: ______
                                    3-55

-------
Form 4 (continued)
SECTION IV.  CONTROL TECHNOLOGY
  1.  Does the file contain documentation to show that the reviewing
      agency verified the applicant's calculations and assumptions
      pertaining to the selected control technology?

      [ ] Yes   [ ] No

      Comments:
  2.  Does file documentation show that the reviewing agency ascertained
      compliance of estimated emissions with applicable SIP limits?

      [ ] Yes   [ ] No

      Comments:
  3.  Is the source subject to NSPS or NESHAP requirements?

      [ ] Yes   [ ] No, GO TO Section V.

  4.  Was the source identified by the agency as being subject to:

    a.  NSPS: [ ] Yes, for ______.

        [ ] No, because no NSPS apply.

        [ ] No, but should have been.  Explain: ______
    b.  NESHAP:  [ ] Yes, for ______.

        [ ] No, because no NESHAP apply.

        [ ] No, but should have been.  Explain: ______
SECTION V.  AMBIENT AIR QUALITY ANALYSIS (NAAQS Protection)

  1.  Was an ambient impact analysis performed?

      [ ] No, GO TO Question 8.
                                    3-56

-------
Form 4 (continued)

    [ ] YES, as follows:

        Pollutant   Identify model used in the appropriate blanks below.

    a.  [ ] TSP or                    [ ] 24 hr: ______   [ ] annual: ______
        [ ] PM10

    b.  [ ] SO2     [ ] 3 hr: ______  [ ] 24 hr: ______   [ ] annual: ______

    c.  [ ] NOx                                           [ ] annual: ______

    d.  [ ] CO      [ ] 1 hr: ______  [ ] 8 hr: ______

    e.  [ ] O3      [ ] 1 hr: ______

    f.  [ ] Pb                                            [ ] 3 mos: ______
  2.  For the pollutants checked above, how was the analysis performed:

                                                     FOR THESE POLLUTANTS:
    [ ] a. by the applicant with adequate review
           (including replication of results, if
           appropriate) by the agency.               [ ] all   [ ]: ______
    [ ] b. by the applicant without adequate review
           by the agency.  Explain: ______
                                                     [ ] all   [ ]: ______

    [ ] c. by the agency.                            [ ] all   [ ]: ______
  3.  Did the applicant/agency use the appropriate model(s) to complete
      the analysis?

                                                  FOR THESE MODELS:
    [ ] a. YES, and documentation supports use
           of the model(s).                       [ ] all   [ ]: ______

    [ ] b. YES, but inadequate documentation was
           available to explain its use.          [ ] all   [ ]: ______

    [ ] c. CBD, documentation not
           provided to justify model selection.   [ ] all   [ ]: ______

    [ ] d. NO, documentation failed to address    [ ] all   [ ]: ______
           appropriate considerations.

        Explain: ______
                                    3-57

-------
Form 4 (continued)

  4.  Did the agency exercise the appropriate model options (urban/rural,
      receptor network design, wind speed profiles, building wake effects,
      final/gradual plume rise, etc.)?
                                                    FOR THESE MODELS:

    [ ] a.  Yes, and documentation supports use of  [ ] all [ ]: ______
            each option as being appropriate.

    [ ] b.  Yes, each option was appropriate, but   [ ] all [ ]: ______
            inadequate documentation was available
            to explain its selection.

    [ ] c.  Cannot be determined.  Documentation    [ ] all [ ]: ______
            not provided to justify option
            selection.

    [ ] d.  No, documentation failed to address     [ ] all [ ]: ______
            appropriate considerations.
            Explain:  ______

  5.  Did the analysis consider appropriate meteorological data?

    [ ] a. YES, five consecutive years of the most recent representative
           sequential hourly National Weather Service data.

    [ ] b. YES, one year of NWS data (if five years are not available) plus
           use of the highest modeled results.

    [ ] c. YES, five years of on site data subjected to quality assurance
           procedures.

    [ ] d. YES, at least one year of hourly sequential on site data,
           including worst-case conditions and subjected to quality
           assurance procedures.

    [ ] e. NO; Explain: ______

    [ ] f. CBD

  6.  Is there sufficient information in the file to verify that emissions
      from the following stationary sources (including sources with permits,
      but not yet in operation) were adequately considered when appropriate?

                                                        YES     NO     NA
    a. existing major stationary sources.
         If no, explain: ______                         [ ]     [ ]    [ ]

    b. existing minor and area stationary sources.
         If no, explain: ______                         [ ]     [ ]    [ ]
                                    3-58

-------
Form 4 (continued)

  7.  Did the analysis provide adequate consideration of multi-source
      pollutant interactions?

    a.  [ ]  YES, analysis adequately defined points of maximum impact
             determined from consideration of all sources in the vicinity
             rather than the maximum impact of the proposed source alone.

    b.  [ ]  NO, analysis ignored significant emissions from other sources
             in the vicinity.  Explain:  ______

    c.  [ ]  CBD from information available in file.

  8.  Is there any reason to believe that the proposed project should have
      been subjected to an ambient impact analysis that was not performed?

    a.  [ ]  YES, source was within 10 km of a Class I area but its
             ambient impact was not considered.

    b.  [ ]  YES; Explain: ______

    c.  [ ]  NO.

    d.  [ ]  CBD.
SECTION VI.  ADDITIONAL REVIEW
  1.  Was this source subject to any of the following additional reviews?

        Yes   No

    a.  [ ]   [ ]  BACT

    b.  [ ]   [ ]  LAER

    c.  [ ]   [ ]  PSD/Increment analysis

    Comments:
                                   3-59

-------
                                                                    Work Sheet

Emissions Unit ID ________   Modified [ ]   Unit/Size ________   One of 28 categories [ ]

Column headings (to be completed for each pollutant listed below):

    Geographic Applicability:  Attain. / Nonattain.
    Pollutant Applicability
    Pollution Level Before Change (TPY)(1)
    Other Creditable Increases
    Creditable Decreases
    Net Emission Increase
    Fugitives Counted
    Major or Signif.
    Bases for Determination(2)
    BACT/LAER Done?
    Emission Limitations/Test Methods

Pollutant rows:

    PM10, TSP, NOx, CO, VOC, SO2,
    Pb (N/A), Asbestos (N/A), Beryllium (N/A), Mercury (N/A),
    Vinyl chloride (N/A), Fluorides (N/A), Sulfuric acid mist (N/A),
    Hydrogen sulfide (N/A), Total reduced sulfur (N/A),
    Reduced sulfur compounds (N/A), Air Toxics

(1)  Measured as actual emissions
(2)  Actual, allowable, PTE


Emission Limitations Summary:
                                              Yes       No      Total

  clear and concise                         ______    ______    ______

  acceptable measurement technique          ______    ______    ______

  consist of design, equipment, work
  practice, or operational standard         ______    ______    ______

  Federally enforceable                     ______    ______    ______

  include stated or referenced
  compliance test methods                   ______    ______    ______

-------
-------
                                Chapter 4
                 Compliance Assurance Audit Guidelines
                              FY 1988-1989
Section      Title                                                    Page
  A          Introduction............................................  4-1
  B          Periodic Review and Assessment of Source Data...........  4-1
  C          Asbestos Demolition and Renovation (D&R)................  4-4
  D          File Review.............................................  4-6
  E          Overview Inspections....................................  4-8
  F          Compliance Assurance Audit Report.......................  4-9

-------
                          Chapter 4

            Compliance Assurance Audit Guidelines

                           FY 88-89

A.  INTRODUCTION

    The major parts of the compliance assurance element in
the FY 88-89 audit period will be periodic review and assessment
of source data, asbestos demolition and renovation, file
reviews, and overview inspections.  There will be continued
emphasis on volatile organic compound (VOC) sources in
states with ozone nonattainment areas.

    The questions which follow were developed for use  by  all
ten regions to ensure consistency in the National Air  Audit
System (NAAS) effort, and provide an accurate basis  for
national comparison of state compliance programs.  All questions
must be answered and procedures followed for each audit.

    The time period to be covered by the audits is the most
recent twelve months preceding the on-site visit, with the
exception of CDS data.  For CDS data, the most recent fiscal
year should be used (i.e., FY 87 or FY 88) or a state's fiscal
year if that is more appropriate.

B.  PERIODIC REVIEW AND ASSESSMENT OF SOURCE DATA

    To assess the adequacy of state compliance programs in
meeting Clean Air Act requirements, the EPA regional offices
continually review source compliance status and inspection
information submitted by the state for the SIP, NSPS, and
NESHAP programs.  This information is contained in the CDS.
In preparation for each audit, the regions are to obtain CDS
retrievals for operating Class A state implementation plan
sources (SIP, including NSR and PSD), new source performance
standards sources (NSPS), and nontransitory NESHAP sources.
These retrievals should include information on inspection
frequency, compliance rates, and enforcement activity for the
most recent fiscal year.  These data must then be analyzed by
answering the following questions to show the status of a
state's compliance program.

    1)  What percentage of sources in the state received the
required inspections as specified in the Section 105 grant
agreement?  Prepare an inspection summary for each state as
follows (an illustrative tally follows the table):
                            4-1

-------
Progress in Meeting Inspection Commitments  in  the  State  Grant

                          Class A1 SIP    NSPS*     NESHAP*

Total percentage of
sources committed
to for Fiscal Year

Percentage of Sources
actually inspected
during Fiscal Year

  *Assumes program delegated to state
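
As an illustration only, the Python sketch below tallies the inspection
summary above from a CDS retrieval.  The record layout shown (a program name
plus committed/inspected flags) is hypothetical and does not represent an
actual CDS retrieval format.

    def inspection_summary(records):
        """Percent of committed sources actually inspected, by program."""
        totals = {}
        for rec in records:
            prog = totals.setdefault(rec["program"],
                                     {"committed": 0, "inspected": 0})
            if rec["committed"]:
                prog["committed"] += 1
                if rec["inspected"]:
                    prog["inspected"] += 1
        return {p: round(100.0 * v["inspected"] / v["committed"], 1)
                for p, v in totals.items() if v["committed"]}

    sources = [
        {"program": "Class A1 SIP", "committed": True, "inspected": True},
        {"program": "Class A1 SIP", "committed": True, "inspected": False},
        {"program": "NSPS",         "committed": True, "inspected": True},
    ]
    print(inspection_summary(sources))   # {'Class A1 SIP': 50.0, 'NSPS': 100.0}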

    2)  What is the compliance status breakdown of sources in
each air program?  Prepare compliance chart for each state as
follows:

                 Compliance Status of Sources

                In         Meeting       In
Program      Compliance    Schedule    Violation   Unknown   Total

Class A SIP

Class A1 SIP

Class A1 VOC

NSPS

NESHAP

    3)  Based on CDS, what percentage of Class A1 SIP, NSPS,
and NESHAP long-term complying sources (defined as being in
compliance two consecutive quarters or more) have recently
(within the past year) had state compliance inspections?
Specify results for each program separately, including the
numbers used to derive the percentages.
                            4-2

-------
    4)  Regarding "Timely "and Appropriate" (T&A) response" to
violators:

           For the current fiscal year, what are the numbers
           of violators in each category II (A)-(D) of the
           "timely and appropriate" guidance?

           What procedures have been established for reporting
           of data by the state?  How are violator's subject
           to the guidance reported to the region?  Have .
           these procedures been followed by the state?
           (This should be answered only if procedures have
           changed from previous years).

           Give specifics of state actions and results in
           cases where rJPA deferred action beyond day 120.

           Has 'the state always satisfied penalty requirements
           where applicable under the "Timely and Appropriate"
           guidance?

           What, procedures have been established for L-JESHAP
           sources subject to T&A?  Are the penalty, data
           transfer, and consultation requirements being
           satisfied?

           Overall, has the state followed all of the "Timely
           and Appropriate" procedures it agreed to?

    5)  How many long-term violators (shown in CDS as in
violation two consecutive quarters or more) have not been
subject to enforcement activity or additional surveillance
activity?  Provide a source by source listing of the state's
Class A SIP, NSPS, and NESHAP long-term violators including
type and date of most recent surveillance or enforcement
activity.

    6)  What is the region's overall assessment of the state
compliance program?

    Following the investigation into each of these questions,
findings are to be outlined for each state program under the
heading of "Pre-visit Assessment of State Compliance Program".
These findings should include a clear and concise statement
on what CDS reflects about the program and how T&A is being
implemented.  Conclusions should then be drawn on the condition
of the state's compliance program, and summarized in paragraph
form.
                            4-3

-------
    The pre-visit assessment should be sent to the state
prior to the audit.  The region should include in this
transmittal questions it wants to follow up on and any other
air compliance related items requiring discussion during the
audit, such as findings from the overview inspections.

C.  ASBESTOS"DEMOLITION AND RENOVATION (D&R)
    Because EPA has established compliance with the various
NESHAP regulations as a high priority, the Agency in April 1934
issued an Asbestos Strategy Document to Regional Air Division
Directors and Regional Counsels.  The document's purpose was
to aid in the goal of ensuring that sources violating asbestos
NESHAP regulations for D&R are identified and corrected and
that appropriate enforcement action is taken.

    The essential elements of a successful asbestos compliance
program are:

    - strategy for identification of non-notifiers and
      violators of applicable NESHAP regulations;

    - inspection strategy to ensure that asbestos operations
      and activities are performed properly;

    - prompt initiation of enforcement actions whenever
      asbestos violations are found;

    - assessment of penalties for violations, and prompt
      collection of those penalties;

    - use of proper safety equipment during inspections
      including appropriate training for inspectors.

    The Agency believes many asbestos problems are area specific
and best resolved through state and local action.  The follow-
ing questions will help evaluate the effectiveness of a
particular state or local program where enforcement authority
has been delegated.

    1.  How does the delegated agency ensure compliance with
        the demolition and renovation notification requirements?
        What is the delegated agency's strategy for learning
        about non-notifiers (i.e., publicity campaigns to
        promote whistleblowing or private citizen reporting,
        etc.)?
                             4-4

-------
    2.  What procedures are followed when violations are
        identified?  Are violations from notifiers and non-
        notifiers treated differently?

    3.  Are penalties routinely assessed and collected
        according to EPA's Asbestos D&R Penalty Policy (or the
        state's penalty policy)?

    4.  Is a contractor certification program for asbestos
        sampling and analysis in place?  Please describe.

    5.  Is a manifest system to keep track of removal, hauling,
        and disposal of asbestos material in place?

    6.  Do you know locations and disposal requirements of
        approved landfills in each state?

    Following the discussion of these six questions, the
audit team should conduct a file review devoted entirely to
NESHAP Demolition and Renovation (D&R) compliance.  The file
review should determine compliance with requirements in
40 CFR 61, Subpart M, for applicability, notification and
control procedures for asbestos D&R projects.

    This file review is separate from the file review in
Section D and should include a representative sample of D&R
files.  The following questions should form the basis of each
file reviewed.

1.  Is the project demolition or renovation?

2.  Is the applicability portion of the regulations adequately
    addressed?

3.  Is the notification portion of the regulations adequately
    addressed?

4.  Is compliance with required control procedures (wetting
    and removal) adequately addressed?

5.  If there is an inspection report, does it contain the
    following:

    a.  Name and location of source, date of inspection,

    b.  Applicable regulation,

    c.  Sample and analysis information,
                              4-5

-------
    d.  Proper chain of custody procedures, and

    e.  Evidence of compliance with applicable safety
        procedures?

6.  From the information in the file, can the reviewer
    determine the compliance status of the project?

    The D&R section of the compliance assurance report should
include answers to the six D&R program questions, as well as
a summary of all files reviewed for each of the six D&R file
review questions.

D.  FILE REVIEW

    An effective state and local compliance program must have
a well documented file on each source.  This file should be
available for use by management and field personnel.  The
structure and location of files are optional as long as any
needed data can be supplied upon request.  The files should
contain information supporting the compliance status of each
source.

    The audit team should review a representative sample of
files from the three air programs (SIP, NSPS, non-D&R NESHAP)
in each state or local agency.  In states with ozone nonattain-
ment areas, the sample should concentrate on VOC sources.  In
most cases, each state audit file review should consist of
15-23 files.  Selection of sources for file review should be
based on such factors as duration of violation, NSPS sources
with CEM requirements, recently reported compliance changes,
citizen or congressional inquiries, problems surfaced in the
CDS previsit program analysis, personal knowledge of the
source, VOC sources in ozone nonattainment areas, or non-D&R
NESHAP sources.

    For each file reviewed, the following questions must be
answered.  The purpose of these seventeen questions is to
gather the information necessary to answer the three file
review summary questions for the audit report.

    1.  Can the reviewer, from information available in the
file, determine the programs to which the source is subject?
If not, why?  The various programs are SIP, PSD, NSPS and
non-D&R NESHAPs.
*For the purposes of this report,  the term "source" is
 synonymous with facility and consists of one or more emission
 points or processes.
                            4-6

-------
    2.  From the information available in the file, can the
source's compliance status be determined for all regulations
to which it is subject?

    3.  Does the file contain documentation supporting the
source's compliance status?  (As a minimum, the file should
contain:  (a) documentation that the source was inspected and
that the regulated emission points and pollutants were evaluated,
and (b) a determination of the compliance status of the source
and documentation of the basis for that determination.)

    4.  Are all major emission points identified (i.e., in an
inspection report, operating permit, etc.) and each point's
compliance status indicated?

    5.  Does the file identify which emission points are
subject to NSR, NSPS, PSD, and non-D&R NESHAP requirements?
If yes, are regulated continuous emission monitoring (CEM)
requirements or permit conditions shown to be in compliance
and documented?  Are required start-up performance tests
included?  Are dates for the test specified?

    6.  Does the  file identify special  reporting  requirements
to which a source may be subject (i.e., excess emission
reports from malfunction or CEM requirements)  and are any
such  reports found in the file?

    7.  Does the  file include technical reviews,  source  tests,
CEM performance specification tests,  permit applications,
correspondence to and from the company, and other supporting
documentation?

    8.  What methods of compliance documentation are used
(e.g., source test, CEM, fuel sampling and analysis, inspection,
certification, engineering analysis, asbestos analysis, etc.)?

    9.  Was the method used to ascertain compliance the most
appropriate one for the type of source being documented?  Is
the method prescribed by NSPS, NESHAPs or SIP?  If not,
explain.

   10. If the documentation includes  an inspection, does the
inspection report contain control equipment parameters  observed
during the inspection (pressure drops,  flow rates, voltages,
opacities)?  Were observed control equipment operating
parameters or CEM emission levels compared to  permit  conditions,
design parameters, or baseline observations?  Were plant
operating parameters recorded?
                              4-7

-------
11. If documentation includes a stack test, were visible
emission observations or CEM emissions levels and operating
parameters recorded during the test?  Were they required?  Was
there a quality assurance procedure used with the stack test?
Who conducted, observed, and reviewed the test?

12. Are enforcement actions contained  in the file?

13. Are actions to bring about compliance taken in a timely
manner?  Do any take longer than 30 days from the time the
violation is discovered?  If yes, how long?

14. What are the types of documentation in the file to support
the enforcement action?

15. What are the types of documentation in the file to show
follow-up to the enforcement action (reinspection, letter,
etc.)?

16. Regarding citizen complaints:  a) are they documented in
the file?  b) are the investigation and follow-up procedures
adequate?

17. What action does the Agency take with respect to reports
of excess  emissions?

    The review team should summarize their  findings following
the file review by answering the  following  three  questions
and including the responses in the NAAS report.

    1.  Do all files reflect a reasonable profile of the
source  (meaning that the files contain  inspection reports,
stack test reports, CEM data, enforcement actions,  etc.)?
If not, explain.
    2.  Do all files contain adequate written documentation
to support the compliance status reported to EPA?  If not,
explain.

    3.  Are violations documented and pursued to return the
source to compliance expeditiously?  Explain.

E.  OVERVIEW INSPECTIONS

    To provide quality assurance for compliance data in state
or local files furnished to EPA, and to promote effective
working relationships between EPA and state or local agencies,
EPA should continue the overview inspection program begun in
FY 84.  It is envisioned that the regions will continue to
inspect 2-3% of the Class A SIP, NSPS, and NESHAP sources
in the CDS inventory each fiscal year.  The FY 88-89 overview
portion of the audit should focus on the overview inspections
                             4-8

-------
performed during the most recent fiscal year.  As with file
reviews, the overview inspections should include a representa-
tive portion of VOC sources with special emphasis on those
source types most environmentally significant or troublesome
based on impact on ozone nonattainment areas, and NESHAP
sources including D&R.

    EPA should notify the state and local agencies of its
intent at least 30 days before each inspection is to take
place to encourage their participation (this may not be
possible for NESHAP D&R sources but as much advance notifica-
tion as possible should be given).  Each inspection should be
an independent verification of the source's compliance status
at minimum, and should review the state and local inspector's
procedures for determining compliance if the inspection is
jointly performed.
                                                           i
    To promote uniformity, the following questions must be
answered for the overview inspection effort:

    1)  How were sources selected by the region for the
        overview inspections?

    2)  How many inspections were performed?

    3)  Generally, what did the inspections consist of?
        Specify inspection procedures used as well as the
        degree and extent of involvement of state personnel.

    4)  What was their purpose (that is, to independently
        verify state reported compliance, to observe state
        inspection practices, or .some combination of these)?
        Other purposes?

    5)  Generally, what were the results of the inspections?
        Answer should relate to purpose stated in item 4.

    6)  Discuss the important points overall of the overview
        inspection findings.  Give recommendations for
        resolution of any problems discovered during the
        overview inspections.

F.  COMPLIANCE ASSURANCE REPORT FORMAT

    For each audit performed, the region must prepare a
compliance assurance report that includes a complete summary
of all audit activities and answers to the six asbestos D&R
questions and file review questions on pages 4 through 6, the
three file review questions on page eight and the six overview
                              4-9

-------
questions on page nine.  In addition, the report should
include the conclusions reached on the previsit assessment
(see page three).  The main body of the report should follow
the questionnaire exactly in each of the four areas (Pre-Visit
Program Assessment, Asbestos D&R, File Review, and Overview
Inspections).  In addition, each report must include an
overall summary of findings for each state program including
positive and negative points, and recommendations for resolu-
tion.  This summary of findings should be at the beginning of
the compliance assurance report and/or contained in the
report's executive summary.  Each report should be reviewed
by the audited agency to help eliminate misconceptions or
misunderstandings and to ensure factual accuracy before it is
finalized.
                             4-10

-------
-------
                                  CHAPTER 5

                       Air Monitoring Audit Guidelines

                                 FY 1988-1989



Section    Title	    Page

11.1      Introduction	,	  5-1

11.2      Regulatory Authority to Perform a System Audit	  5-3

11.3      Preliminary Assessment and Systems Audit Planning	  5-9

11.4      Guidelines for Conducting Systems Audits of
          State and Local Agencies	  5-15

11.5      Criteria for the Evaluation of State and Local
          Agency Performance.........................................  5-26

11.6      Systems Audit Questionnaire - Short Form...................  SF-1

11.7      Systems Audit Questionnaire - Long Form....................  LF-1
                                      5-i

-------
                 SUMMARY OF CHANGES TO THE NATIONAL AIR AUDIT
                      SYSTEM GUIDANCE FOR AIR MONITORING


     During the FY 1985 and FY 1986 to 1987 National Air Audit Cycles, the
audit guidance utilized for those audits was identical to the guidance for
systems audits provided in Section 2.0.11 of the Quality Assurance for Air
Pollution Measurement Systems Handbook, Volume II, EPA-600/4-77-027a.  This
guidance will again be used for the FY 1988 to 1989 National Air Audit Cycle.
However, because this material does not include PM10, the short and long form
questionnaires of Sections 11.6 and 11.7 have been modified to include informa-
tion concerning PM10 monitoring for the FY 1988 to 1989 audit.  These are
temporary revisions necessary to accomplish the goals of the FY 1988 to 1989
National Audit program.  The Environmental Monitoring Systems Laboratory
(EMSL) will revise Section 2.0.11 of the handbook, according to established
procedures for handbook revision, at a future date which has yet to be
determined.

     It is assumed that, as in the past biennial audit cycle, approximately
50 percent of the agencies will be audited each year of the FY 1988 to 1989
cycle.

     The Air Quality Management Division of the Office of Air Quality Planning
and Standards will develop a schedule for submission of the regionally prepared
audit reports.
                                     5-ii

-------
11.0  SYSTEMS  AUDIT  CRITERIA  AND  PROCEDURES  FOR  AMBIENT  AIR  MONITORING
      PROGRAMS
11.1  Introduction
11.1.1  General  - A systems audit is an on-site review  and  inspection  of  a
state  or  local   agency's  ambient  air  monitoring  program  to  assess  its
compliance with established regulations governing  the  collection,  analysis,
validation,  and   reporting  of  ambient air quality data.  A systems audit of
each state or autonomous agency within an EPA Region is performed biennially
by a member of the Regional Quality Assurance (QA) staff.
     The purpose of the guidance included here is to  provide  the  regulatory
background and appropriate technical criteria which form the basis for the air
program evaluation by the Regional  Audit Team.  To promote national uniformity
in  the evaluation of state and local  agency monitoring programs and agencies'
performance, all  EPA Regional  Offices are required to use at least  the  short
form  questionnaire  (Section  11.6), corrective action implementation request
(CAIR) (Section 11.4.2), and  the  systems  audit  reporting  format  (Section
11.4.4)  each year.  Use of sections of the long form questionnaire is left to
the discretion of the Regional QA Coordinator, with  the  concurrence  of  the
State  or  local  agency.  The short form questionnaire is essentially the same
as the monitoring audit questionnaire used in FY-84.  No substantive changes
have been made; however, the questionnaire has been reorganized to improve
the information received and facilitate its completion.  In addition, requests
for resubmission of data already possessed by EPA have been deleted.
     "T'ne scope of a systems audit is of major concern to both EPA Regions  ?nd
the agency to be evaluated.  A systems audit as defined in the context of this
iocumenc is seen to include in   lonraisal  of  "He  following  orouri::;  .ir^as:
•^cwor1''   iannijemenr., rieui operations, ' aborar.ory operations, jaca  lanriuanen" ,
quality assurance1 and reporting.  The guidance provided concerning topics  for
discussion  during  an  on-site interview have been organized around these key
program areas (Section 11.5).  The depth of coverage within these areas may be
increased  or  decreased  by  using  one  or  more  sections  of the longrform
questionnaire (Section 11.7) in conjunction with the short-form  questionnaire
(Section 11.6).  Besides the on-site interviews, the evaluation should include
                                     5-1

-------
the review of  some  representative  ambient  air  monitoring  sites  and  the
monitoring  data processing procedure from field acquisition through reporting
into the Aerometric Information Retrieval  System (AIRS) computer system.
     The systems audit results should present a clear, complete and accurate
picture of the agency's acquisition of ambient air monitoring data.

11.1.2  Road Map to Using this Section - This section contains sufficient
information for conducting a systems audit of an agency responsible for
operating ambient air monitoring sites as part of the State and Local Air
Monitoring Stations (SLAMS) network, and for reporting the results in a
uniform manner.  The following topics are covered in the subsections below:
o A brief sketch of the regulatory requirements  which  dictate  that  systems
  audits  be  performed,  indicating  the   regulatory  uses to which the audit
  results may be put (Section 11.2);
o A discussion of
    1)   the requirements on the agency operating the SLAMS network;
    2)   program facets to be evaluated by the audit;  and
    3)   additional criteria to assist in determining the required extent of
         the forthcoming audit (Section 11.3);
o A recommended audit protocol for use by the Regional Audit Team, followed by
  a detailed discussion of audit results reporting (Section 11.4);
o Criteria for the evaluation of State and local agency performance including
  suggested topics for discussion during the on-site interviews (Section
  11.5);
o A short-form questionnaire, based on the National Air Monitoring Audit
  Questionnaire prepared by the STAPPA/ALAPCO Ad Hoc Air Monitoring Audit
  Committee (Section 11.6);
o A long-form questionnaire, organized around the six key program areas to be
  evaluated (Section 11.7);  and
o A bibliography of EPA guideline documents, which provides additional
  technical background for the different program areas under audit (Section
  11.8).
The guidance provided in this section is addressed primarily to  EPA  Regional
                                    5-2

-------
QA Coordinators and members of the Regional audit teams to guide them in
developing and implementing an effective and nationally uniform yearly audit
program.  However, the criteria presented can also prove useful to agencies
under audit to provide them with descriptions of the program areas to be
evaluated.
     Clarification of certain sections, special agency circumstances, and
regulation or guideline changes may require additional discussion or
information.  For these reasons, a list of contact names and telephone numbers
is given in Table 11-1.

11.2  Regulatory Authority to Perform a Systems Audit
11.2.1  General Regulatory Authority - The authority to perform systems audits
is derived from the Code of Federal Regulations (Title 40).  Specifically:
40 CFR Part 35, which discusses agency grants and grant conditions, and 40 CFR
Part 58, which deals specifically with the installation, operation and quality
assurance of the SLAMS/NAMS networks.
     The regulations contained in 40 CFR Part 35 mandate the performance of
yearly audits of agency air monitoring programs by the Regional Administrators
or their designees.  Pertinent regulatory citations are summarized in Table
11-2.  All citations are quoted directly from the regulations and are intended
as an indication of the context within which systems audits are performed and
the impact that audit results may have on a given agency.  Even though this is
the regulatory authority to conduct such audits, for the SLAMS network, the
specific authority is derived from 40 CFR Part 58.  Three specific citations
from 40 CFR Part 58 are also quoted in Table 11-2.
     In addition to the regulations presented in Table 11-2, a further
requirement is imposed on reporting organizations submitting data summary
reports to the National Aerometric Data Bank (NADB) through the AIRS
computer system.  AIRS acceptance criteria call for at least 75% data
completeness, which has been accepted as a data quality objective for state
and local agencies' monitoring operations.  The Regional QA Coordinator may
wish to use this requirement together with information obtained by accessing
the AIRS AMP computer programs, discussed in Section 11.3.  The percent data
completeness may be effectively used as an indicator of whether a rigorous
                                   5-3

-------
          TABLE 11-1.  LIST OF KEY CONTACTS AND TELEPHONE NUMBERS

 Assistance Area                           Telephone
Office/Laboratory         Name               Number        EPA Location


Laboratory          William J. Mitchell   (919) 541-2769   EMSL/QAD/PER
 Areas and NPAP                            FTS 629-2769

General QA          William F. Barnard    (919) 541-2205   EMSL/QAD/PER
 Guidance                                  FTS 629-2205

Monitoring          Stanley Sleva         (919) 541-5651   OAQPS/MDAD/MRB
 Objectives/Siting

PARS System         Gardner Evans         (919) 541-3887   EMSL/MAD/DRB

SAROAD              Jake Summers          (919) 541-5694   OAQPS/MDAD/NADB
System/NADB
NPAP = National Performance Audit Program

PARS = Precision and Accuracy Reporting System

NADB = National Aerometric Data Bank
                                  5-4

-------
    TABLE 11-2.  SUMMARY OF REGULATORY AUTHORITY TO CONDUCT SYSTEM AUDITS

A.  Highlights of 40 CFR 35
Section Number
and Description
                      Text
35.510-2
Grant Amount
"In determining the amount of support for a control
agency, the Regional Administrator will  consider

A. The functions, duties and obligations assigned to
   the agency by an applicable implementation
   plan,

B. the feasibility of the program in view of the resources
   to be made available to achieve or maintain EPA
   priorities and goals,

C. the probable or estimated total cost of the program in
   relation to its expected accomplishments,

D. the extent of the actual or potential pollution problem,

E. the population served within the agency's jurisdiction,

F. the financial need, and

G. the evaluation of the agency's performance."
35.510-3          "If the Regional Administrator's annual performance
                  evaluation reveals that the grantee will fail or has failed
Reduction in      to achieve the expected outputs described in his approved
Grant Amount      program, the grant amount shall be reduced...."

35.520            "No grant may be awarded to any intermunicipal or
                  interstate air pollution control agency unless the
Criteria for      applicant provides assurance satisfactory to the Regional
(Grant) Award     Administrator that the agency provides for adequate
                  representation of appropriate State, interstate, local and
                  (when appropriate) international interests in the air
                  quality control region, and further that the agency has the
                  capability of developing and implementing a comprehensive
                  air quality plan for the air quality control region."
                                   5-5

-------
    TABLE 11-2.  SUMMARY OF REGULATORY AUTHORITY TO CONDUCT SYSTEM AUDITS

A.  Highlights of 40 CFR 35 (Cont'd)
Section Number
and Description
                      Text
35.520            "No grant may be awarded unless the Regional Administrator
                  has determined that (1) the agency has the capability, or
Criteria for      will develop the capability, to achieve the objectives and
(Grant) Award     outputs described in its EPA-approved program, and (2) the
                  agency has considered and incorporated as appropriate the
                  recommendations of the latest EPA performance evaluation in
                  its program."

35.530            "In addition to any other requirements herein, each air
                  pollution control grant shall be subject to the following
Grant Conditions  conditions:

                  A. Direct cost expenditures for the purchase of....

                  B. The sum of non-Federal recurrent expenditures....

                  C. The grantee shall provide such information as the Regional
                     Administrator may from time to time require to carry out
                     his functions.  Such information may contain, but is not
                     limited to:  Air quality data, emission inventory data,
                     data describing progress toward compliance with regulations
                     by specific sources, data on variances granted, quality
                     assurance information related to data collection and
                     analysis and similar regulatory actions, source reduction
                     plans and procedures, real time air quality and control
                     activities, other data related to air pollution emergency
                     episodes, and similar regulatory actions."

35.538-1          "Agency evaluation....should be continuous throughout the
                  budget period.  It is EPA policy to limit an evaluation
Agency            to responsible management of regional and national efforts
Evaluation        to control air pollution.  The Regional Administrator shall
                  conduct an agency performance evaluation annually in
                  accordance with 35.410."
                                   5-6

-------
    TABLE 11-2.  SUMMARY OF REGULATORY AUTHORITY TO CONDUCT SYSTEM AUDITS

A.  Highlights of 40 CFR 35 (Cont'd)

Section Number
and Description                        Text


35.410           "A performance evaluation shall be conducted at least
                 annually by the Regional Administrator and the grantee
Evaluation of    to provide a basis for measuring progress toward achievement
Agency           of the approved objectives and outputs described in the work
Performance      program.  The evaluation shall be consistent with the
                 requirements of 35.538 for air pollution control agencies...."
                                  5-7

-------
    TABLE 11-2.  SUMMARY OF REGULATORY AUTHORITY TO CONDUCT SYSTEM AUDITS

B.  Highlights 'of 40 CFR 58

Section Number
and Description                        Text


58.20            "By January 1, 1980 the State shall adopt and submit to
                 the Administrator a revision to the plan which will:
Air Quality
Surveillance     A. Provide for the....
Plan
Content (SLAMS)  B. Provide for meeting the requirements of Appendices A, C, D,
                    and E, to this part....

                 C. Provide for the operation of....

                 D. Provide for the review of the air quality surveillance
                    system on an annual basis to determine if the system
                    meets the monitoring objectives defined in Appendix D to
                    this part.  Such review must...."
58.23            "By January 1, 1983:

Monitoring       A. Each station in the SLAMS network must be in operation,
Network             be sited in accordance with the criteria in Appendix E to
Completion           this part, and be located as described on the station's
                     SAROAD site identification form, and

                 B. The quality assurance requirements of Appendix A to this
                    part must be fully implemented."

58.34            "By January 1, 1981:

NAMS Network     A. Each NAMS must be in operation....
Completion
                 B. The quality assurance requirements of Appendix A to this
                    part must be fully implemented for all NAMS...."

Appendix A       "Agencies operating all or a portion of a SLAMS network are
Section 2.4      required to participate in EPA's national performance audit
                 program and to permit an annual EPA systems audit of their
National         ambient air monitoring program....  For additional information
Performance and  about these programs, agencies should contact either the
Systems Audit    appropriate EPA Regional Quality Control Coordinator or the
                 Quality Assurance Branch, EMSL/RTP...for instructions for
                 participation."
                                  5-8

-------
rigorous systems audit, using the long form questionnaire, might or might not
be needed.
11.2.2  Specific Regulatory Guidance - The specific regulatory requirements of
an EPA-acceptable quality assurance program are to be found in Appendix A to
40 CFR Part 58.  Section 2.2 of Appendix A details the operations for which an
agency must have written procedures.  The exact format and organization of
such procedures is not indicated, however.  Thus, many approaches to
appropriate documentation have been suggested by EPA, local agencies and other
groups.
     One such approach is reflected in Table 11-3.
-------
TABLE 11-3. SPECIFIC REGULATORY REQUIREMENTS TO BE EVALUATED IN A SYSTEMS AUDIT

          REQUIREMENT                  PERTINENT SECTION      PERTINENT SECTION
    (40 CFR 58, Appendix A)            OF QAMS DOCUMENT       OF QUESTIONNAIRE
                                            005/80                 (11.7)

(1) Selection of Methods and           Project Description    Planning
    Analyzers
                                       Organization &         Planning
                                       Responsibility

                                       QA Objectives          Planning

(1) Selection of Methods,              Sampling               Field Operations
    Analyzers                          Procedures

(11) Documentation of Quality          Sample Custody         Field/Lab Operations
     Control Information

(2) Installation of Equipment          Calibration            Field/Lab Operations
(3) Calibration                        Procedures and
(7) Calibration and Zero/Span Checks   Frequency
    for Multiple Range Analyzers

    (Only applicable if other than
    automated analyzers are used and
    analyses are being performed on
    filters - e.g., NO2 or lead and
    TSP)

(10) Recording and Validating Data     Analytical             Lab Operations
                                       Procedures

                                       Data Reduction,        Data Management
                                       Validation and
                                       Reporting
                                  5-10

-------
         TABLE 11-3.   SPECIFIC REGULATORY REQUIREMENTS TO BE EVALUATED
                       IN A SYSTEMS AUDIT (cont'd)

          REQUIREMENT                  PERTINENT SECTION      PERTINENT SECTION
    (40 CFR 58, Appendix A)            OF QAMS DOCUMENT       OF QUESTIONNAIRE
                                            005/80                 (11.7)

(4) Zero/span checks and adjustments   Internal Quality       Field/Lab Operations
    of automated analyzers             Control Checks
(5) Control Checks and their                                  QA/QC
    frequency
(6) Control Limits for Zero/Span
(7) Calibration and Zero/Span for
    Multiple Range Analyzers
(9) Quality control checks for air
    pollution episode monitoring

Appendix A - Sections 2.0, 3.0         Performance and        QA/QC
and 4.0                                Systems Audits

(8) Preventive and Remedial            Preventive             Field/Lab Operations
    Maintenance                        Maintenance

Appendix A - Section 4.0               Specific Routine       QA/QC
(10) Recording and Validating Data     Procedures used
                                       to Assess Data         Data Management
                                       Precision,
                                       Accuracy and
                                       Completeness

(4) Zero/Span checks and adjustments   Corrective             Field/Lab Operations
    of automated analyzers             Action
(5) Control Limits and Corrective
    Action

(11) Documentation of Quality          Quality                Reporting
     Control Information               Assurance
(10) Data Recording and Validation     Reports to
                                       Management
                                  5-11

-------
11.3.1  Frequency of Audits - The EPA Regional Office retains the regulatory
responsibility to evaluate agency performance annually.  Regional Offices are
urged to use the short-form questionnaire (Section 11.6), the CAIR (Fig.
11-4), and the audit reporting format (Section 11.4.4).  Utilizing the above
to provide OAQPS with this audit information will establish a uniform basis
for audit reporting throughout the country.  For many well-established
agencies, an extensive systems audit and rigorous inspection may not be
necessary every year.  The determination of the extent of the systems audit
and its rigor is left completely to EPA Regional Office discretion.
Therefore, the option is provided here that extensive inspections and
evaluations may be accomplished using the short-form questionnaire (Section
11.6), and appropriate section(s) of the long-form questionnaire (Section
11.7).  It is suggested that a complete systems audit using the long-form
questionnaire be performed at least once every three years.  Yearly reports
must still, however, include the short form, CAIR, and the report completed
according to Section 11.4.4.
     The primary screening tools to aid the EPA Regional QA Audit Team in
determining which type of audit to conduct and its required extent are:
A.  National Performance Audit Program (NPAP) Data--which provide detailed
    information on the ability of participants to certify transfer standards
    and/or calibrate monitoring instrumentation.  Audit data summaries provide
    a relative performance ranking for each participating agency when compared
    to the other participants for a particular pollutant.  These data could be
    used as a preliminary assessment of laboratory operations at the different
    local agencies.
B.  Precision and Accuracy Reporting System (PARS) Data--which provide detailed
    information on precision and accuracy checks for each local agency and each
    pollutant, on a quarterly basis.  These data summaries could be used to
    identify out-of-control conditions at different local agencies, for certain
    pollutants.
C.  National Aerometric Data Bank (NADB) AMP430 Data Summaries--which provide a
    numerical count of monitors meeting and those not meeting specifications on
                                    5-12

-------
    monitoring data completeness on a quarterly basis, together with an
    associated summary of precision and accuracy probability limits.  An
    additional program, AMP430, will provide data summaries indicating the
    percent of data by site and/or by state for each pollutant.
11.3.2  Selection of Monitoring Sites for Evaluation - It is suggested that
approximately five percent (5%) of the sites of each local agency included in
the reporting organization be inspected during a systems audit.  Many
reporting organizations contain a large number of monitoring agencies, while
in other cases, a monitoring agency is its own reporting organization.  For
smaller local agencies, no fewer than two (2) sites should be inspected.  To
insure that the selected sites represent a fair cross-section of agency
operations, one half of the sites to be evaluated should be selected by the
agency itself, while the other half should be selected by the Regional QA
Audit Team.
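     As a purely illustrative aid, and not part of the protocol above, the short
Python sketch below applies the selection arithmetic just described: roughly
five percent of each local agency's sites, a floor of two sites, and an even
split between agency-selected and Region-selected sites.  The agency names and
site counts used are hypothetical.

    # Illustrative sketch only -- applies the 5% guideline, the two-site
    # minimum, and the agency/Region split described above.  Agency names
    # and site counts are hypothetical.
    import math

    def sites_to_inspect(site_count, fraction=0.05, minimum=2):
        """Return (total, agency_selected, region_selected) for one local agency."""
        total = max(minimum, math.ceil(site_count * fraction))
        agency_selected = total // 2
        region_selected = total - agency_selected
        return total, agency_selected, region_selected

    for agency, count in {"Local Agency A": 64, "Local Agency B": 12}.items():
        total, by_agency, by_region = sites_to_inspect(count)
        print(f"{agency}: inspect {total} of {count} sites "
              f"({by_agency} selected by the agency, {by_region} by the Region)")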
     The audit team should use both the Precision and Accuracy Reporting
System (PARS) and the AIRS computer databases in deciding on specific sites
to be evaluated.  High flexibility exists in the outputs obtainable from the
NADB AMP430 computer program; data completeness can be assessed by pollutant,
site, agency, time period and season.  These data summaries would assist the
Regional audit team in spotting potentially persistent operational problems in
need of more complete on-site evaluation.  At least one site showing poor data
completeness, as defined by AIRS, must be included in those selected to be
evaluated.
     If the reporting organization under audit operates many sites and/or its
structure is complicated and perhaps inhomogeneous, then an additional number
of sites above the initial 5% level should be inspected so that an accurate
picture of the state and local agency's ability to conduct its monitoring
activities can be obtained.  At the completion of the site evaluations, the
Regional audit team is expected to have established the adequacy of the
operating procedures and the flow of data from the sites, and to be able to
provide support for conclusions about the homogeneity of the reporting
organization.
                                    5-13

-------
11.3.3  Data Audits - With the implementation by many agencies of automated
data acquisition systems, the data management function has, for the most part,
become increasingly complex.  Therefore, a complete systems audit must include
a review of the data processing and reporting procedures starting at the
acquisition stage and terminating at the point of data entry into the SAROAD
computer system.  The process of auditing the data processing trail will be
dependent on the size and organizational characteristics of the reporting
organization, the volume of data processed, and the data acquisition system's
characteristics.  The details of performing a data processing audit are left,
therefore, to Regional and reporting organization personnel working together
to establish a data processing audit trail appropriate for a given agency.
     Besides establishing and documenting processing trails, data processing
audit procedures must involve a certain amount of manual recomputation of raw
data.  The preliminary guidance provided here, for the number of data to be
manually recalculated, should be considered a minimum enabling only the
detection of gross data mishandling:

     (a)  For continuous monitoring of criteria pollutants, the Regional QA
          Coordinator should choose two 24-hour periods from the high
          and low seasons for that particular pollutant per local agency
          per year.  (In most cases the seasons of choice will be winter
          and summer.)  The pollutant and time interval choices are left
          to the Regional auditor's discretion.

     (b)  For manual monitoring, four 24-hour periods per local agency
          per year should be recomputed.

     The Regional QA Coordinator should choose the periods for the data
processing audit while planning the systems audit and inspecting the
completeness records provided by the NADB AMP430 system.  The recommended
acceptance limits for the differences between the data input into SAROAD and
that recalculated during the on-site phase of the systems audit are given in
Table 11-4.
                                  5-14

-------
              TABLE 11-4.   ACCEPTANCE CRITERIA FOR DATA AUDITS

Data Acquisition                      Measurement            Tolerance
Mode                Pollutants        Range (ppm)(a)         Limits

Automatic Data      SO2, O3, NO2      0-0.5, or 0-1.0        ±3 ppb
Retrieval           CO                0-20, or 0-50          ±0.3 ppm

Stripchart          SO2, O3, NO2      0-0.5, or 0-1.0        ±20 ppb
Records             CO                0-20, or 0-50          ±1 ppm

Manual              TSP               ---                    ±2 µg/m3 (b)
Reduction           Pb                ---                    ±0.1 µg/m3

 (a)  Appropriate scaling should be used for higher measurement ranges.
 (b)  Specified at 760 mm Hg and 25°C.
      Systems audits conducted on large reporting organizations (e.g., four
 local agencies) require recomputation of eight 24-hour periods for each of the
 criteria pollutants monitored continuously.  This results from two 24-hour
 periods being recomputed for each local agency, for each pollutant monitored,
 during a given year.  For manual methods, sixteen 24-hour periods are
 recomputed, consisting of four periods per local agency, per year.
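      The tolerance comparison implied by Table 11-4 can be sketched in a few
 lines of Python.  This is offered only as an illustration of the arithmetic,
 not as part of the guidance; the limits are transcribed from the table and the
 sample values are hypothetical.

    # Illustrative sketch only: checks a manually recalculated 24-hour value
    # against the value entered into SAROAD using the Table 11-4 limits.
    TOLERANCE = {
        # (data acquisition mode, pollutant): absolute tolerance
        ("automatic", "SO2"): 0.003,    # ppm (3 ppb); same limit for O3 and NO2
        ("automatic", "CO"): 0.3,       # ppm
        ("stripchart", "SO2"): 0.020,   # ppm (20 ppb); same limit for O3 and NO2
        ("stripchart", "CO"): 1.0,      # ppm
        ("manual", "TSP"): 2.0,         # ug/m3 at 760 mm Hg and 25 C
        ("manual", "Pb"): 0.1,          # ug/m3
    }

    def within_tolerance(mode, pollutant, saroad_value, recalculated_value):
        """True if the recalculated value agrees with SAROAD within Table 11-4 limits."""
        return abs(saroad_value - recalculated_value) <= TOLERANCE[(mode, pollutant)]

    # Hypothetical automated SO2 value recomputed during the on-site audit:
    # the two values differ by 4 ppb, exceeding the 3 ppb limit.
    print(within_tolerance("automatic", "SO2", 0.034, 0.038))   # prints False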
 11.4  Guidelines for Conducting Systems Audits of State and Local Agencies
      A systems audit should consist of three separate phases:
        o  Pre-Audit Activities
        o  On-Site Audit Activities
        o  Post-Audit Activities

      Summary activity flow diagrams have been included as Figures 11-1, 11-2
 and 11-3, respectively.  The reader may find it useful to refer to these
 diagrams while reading this protocol.
 11.4.1  Pre-Audit Activities - At the beginning of each fiscal year, the
 Regional QA Coordinator, or a designated member of the Regional QA Audit Team,
 should establish a tentative schedule for on-site systems audits of the
 agencies within their region.
                                     5-15

-------
                          DEVELOP AUDIT SCHEDULE

                     CONTACT REPORTING ORGANIZATIONS
                          TO SET TENTATIVE DATES

                       REVISE SCHEDULE AS NECESSARY

                  CONTACT REPORTING ORGANIZATION TO
                       DISCUSS AUDIT PROCEDURE

                    FIRM DATES FOR ON-SITE VISITS

                        INITIATE TRAVEL PLANS

                   SEND QUESTIONNAIRE AND REQUEST
                    PRELIMINARY SUPPORT MATERIAL

                    REVIEW MATERIAL, DISCUSS WITH
                  REPORTING ORGANIZATION QA OFFICER

               FINALIZE TRAVEL PLANS WITH INFORMATION
                PROVIDED BY REPORTING ORGANIZATION

                    DEVELOP CHECKLIST OF POINTS
                          FOR DISCUSSION

                   CONTACT AGENCY TO SET SPECIFIC
                 INTERVIEW AND SITE INSPECTION TIMES

                          TRAVEL ON-SITE

     Figure 11-1.  PRE-AUDIT ACTIVITIES
                                 5-16

-------
       AUDIT TEAM INITIAL INTERVIEW OF REPORTING ORGANIZATION DIRECTOR

                         INTERVIEW WITH KEY PERSONNEL

 AUDIT GROUP 1                               AUDIT GROUP 2

   INTERVIEW PLANNING MANAGER                  INTERVIEW FIELD
                                               OPERATIONS MANAGER
   INTERVIEW LABORATORY DIRECTOR
                                               VISIT SITES (AGENCY SELECTED)
   VISIT LABORATORY
   WITNESS OPERATIONS                          VISIT SITES (REGION SELECTED)

   REVIEW SAMPLE RECEIVING AND                 VISIT AUDIT AND CALIBRATION
   CUSTODY                                     FACILITY

   SELECT PORTION OF DATA                      SELECT PORTION OF DATA
   INITIATE AUDIT TRAIL                        INITIATE AUDIT TRAIL

   ESTABLISH DATA AUDIT TRAIL                  ESTABLISH TRAIL THROUGH FIELD
   THROUGH LABORATORY OPERATIONS               OPERATIONS TO DATA MANAGEMENT
   TO DATA MANAGEMENT FUNCTION

                          MEET TO DISCUSS FINDINGS

                 FINALIZE AUDIT TRAILS AND COMPLETE DATA AUDIT

                      PREPARE AUDIT RESULTS SUMMARY OF
               (a) overall operations     (b) data audit findings
               (c) laboratory operations  (d) field operations

           INITIATE CORRECTIVE ACTION IMPLEMENTATION REQUESTS

              DISCUSS FINDINGS WITH KEY PERSONNEL, QA OFFICER

         EXIT INTERVIEW WITH REPORTING ORGANIZATION DIRECTOR TO OBTAIN
                             SIGNATURES ON CAIR

                            ON-SITE AUDIT COMPLETE

 FIGURE 11-2.  ON-SITE AUDIT ACTIVITIES
                                  5-17

-------
        TRAVEL BACK TO REGIONAL HEADQUARTERS

     AUDIT TEAM WORKS TOGETHER TO PREPARE REPORT

      INTERNAL REVIEW AT REGIONAL HEADQUARTERS

      INCORPORATE COMMENTS AND REVISE DOCUMENTS

  ISSUE COPIES TO REPORTING ORGANIZATION DIRECTOR
       FOR DISTRIBUTION AND WRITTEN COMMENT

        INCORPORATE WRITTEN COMMENTS RECEIVED
             FROM REPORTING ORGANIZATION

            SUBMIT FINAL DRAFT REPORT FOR
              INTERNAL REGIONAL REVIEW

       REVISE REPORT AND INCORPORATE COMMENTS
                    AS NECESSARY

                 PREPARE FINAL COPIES

        DISTRIBUTE TO REPORTING ORGANIZATION
              DIRECTOR, OAQPS AND REGION

Figure 11-3.   POST-AUDIT ACTIVITIES
                      5-18

-------
     Six (6) weeks prior to the audit, the Regional QA Coordinator should
contact the Quality Assurance Officer (QAO) of the reporting organization to
be audited to coordinate specific dates and schedules for the on-site audit
visit.  During this initial contact, the Regional QA Coordinator should
arrange a tentative schedule for meetings with key personnel as well as for
inspection of selected ambient air quality monitoring and measurement
operations.  At the same time, a schedule should be set for the exit interview
used to debrief the agency Director, or his designee, on the systems audit
outcome.  As a part of this scheduling, the Regional QA Coordinator should
indicate any special requirements such as access to specific areas or
activities.  The Regional QA Coordinator should inform the agency QAO that he
will receive a questionnaire, precision and accuracy data, and completeness
data from NADB programs AMP240 and AMP430, which are to be reviewed or completed.
He should emphasize that the completed questionnaire is to be returned to the
EPA Region within one (1) month of receipt.  The additional information called
for within the questionnaire is considered a minimum, and both the Region
and the agency under audit should feel free to include additional information.
     The Regional Audit Team may use this initial contact or subsequent
conversations to obtain appropriate travel information, pertinent data on
monitoring sites to be visited, and assistance in coordinating meeting times.
     Once the completed questionnaire has been received, it should be reviewed
and compared with the criteria and information discussed in Section 11.2 and
with those documents and regulations included by reference in Section 11.5.
The Regional QA Audit Team should also use the PARS and NADB AMP240 and
AMP430 to augment the documentation received from the reporting organization
under audit.  This preliminary evaluation will be instrumental in selecting
the sites to be evaluated and in the decision on the extent of the on-site
data audit.  The Regional Audit Team should then prepare a checklist
detailing specific points for discussion with agency personnel.
     The Regional Audit Team could be made up of several members to offer a
wide variety of backgrounds and expertise.  This team may then divide into
groups once on-site, so that both audit coverage and time utilization can be
                                  5-19

-------
optimized.  A possible division may be that one group assesses the support
laboratory and headquarters operations while another evaluates sites and
subsequently assesses audit and calibration information.  The team leader
should reconfirm the proposed audit schedule with the reporting organization
immediately prior to travelling to the site.
11.4.2  On-Site Activities - The Regional QA Audit Team should meet initially
with the agency's Director or his designee to discuss the scope, duration, and
activities involved with the audit.  This should be followed by a meeting with
key personnel identified from the completed questionnaire, or indicated by the
agency QAO.  Key personnel to be interviewed during the audit are those
individuals with responsibilities for:  planning, field operations, laboratory
operations, QA/QC, data management, and reporting.  At the conclusion of these
introductory meetings, the Regional Audit Team may begin work as two or more
independent groups.  A suggested auditing method is outlined in Figure 11-2.
     To increase uniformity of site inspections, it is suggested that a site
checklist be developed and used.
     The importance of the data processing systems audit cannot be overstated.
Thus, sufficient time and effort should be devoted to this activity so that
the audit team has a clear understanding and complete documentation of data
flow.  Its importance stems from the need to have documentation on the quality
of ambient air monitoring data for all the criteria pollutants for which the
agency has monitoring requirements.  The data processing systems audit will
serve as an effective framework for organizing the extensive amount of
information gathered during the audit of laboratory, field monitoring, and
support functions within the agency.
     The Regional Audit Team should prepare a brief written summary of findings
organized into the following areas:  planning, field operations, laboratory
operations, quality assurance/quality control, data management, and reporting.
Problems with specific areas should be discussed and an attempt made to rank
them in order of their potential impact on data quality.  For the more serious
of these problems, Corrective Action Implementation Request (CAIR) forms
should be initiated.  An example form is provided in Figure 11-4.  The forms
have been designed such that one is filled out for each major deficiency noted
that requires formal corrective action.
                                    5-20

-------
                CORRECTIVE ACTION IMPLEMENTATION REQUEST (CAIR)
Reporting Organization



State or Local Agency
Deficiency Noted:
Agreed-upon Corrective Action:
Schedule for Corrective Action Implementation:
Signed 	   Director 	 Date



                                QA Officer                    Date
                                Audit Team Member             Date
Corrective Action Implementation Report:
Signed 	                        Director                       Date
Signed 	   QA, Officer                    Date
                                    5-21

-------
     The format, content, and intended use of CAIRs is fully discussed in
Section 11.4.5 of this document.  Briefly, they are request forms for specific
corrective actions.  They are initiated by the Regional QA Audit Team and
signed upon mutual agreement by the agency's Director or his designee during
the exit interview.
     The audit is now completed by having the Regional Audit Team members meet
once again with key personnel, the QAO and finally with the agency's Director
or his designee to present their findings.  This is also the opportunity for
the agency to present their disagreements.  The audit team should simply state
the audit results including an indication of the potential data quality
impact.  During these meetings the audit team should also discuss the systems
audit reporting schedule and notify agency personnel that they will be given a
chance to comment in writing, within a certain time period, on the prepared
audit report in advance of any formal distribution.
11.4.3  Post-Audit Activities - The major post-audit activity is the
preparation of the Systems Audit Report.  The report format is presented in
Section 11.4.4.
     To prepare the report, the audit team should meet and compare
observations with collected documents and results of interviews and
discussions with key personnel.  Expected QA Project Plan implementation is
compared with observed accomplishments and deficiencies, and the audit findings
are reviewed in detail.  Within thirty (30) calendar days of the completion of
the field work, the audit report should be prepared and submitted.
     The Systems Audit Report is submitted to the audited agency together with
a letter thanking agency personnel for their assistance, time and cooperation.
It is suggested that the body of the letter be used to reiterate the fact that
the audit report is being provided for review and written comment.  The letter
should also indicate that, should no written comments be received by the
Regional QA Coordinator within thirty (30) calendar days from the report date,
it will be assumed acceptable to the agency in its current form, and will be
formally distributed without further changes.
                                  5-22

-------
     If the agency has written comments or questions concerning the audit
report, the Regional Audit Team should review and incorporate them as
appropriate, and subsequently prepare and resubmit a report in final form
within thirty (30) days of receipt of the written comments.  Copies of this
report should be sent to the agency Director or his designee for his internal
distribution.  The transmittal letter for the amended report should indicate
official distribution and again draw attention to the agreed-upon schedule for
Corrective Action Implementation.
11.4.4  Audit Reporting - The Systems Audit Report format discussed in this
section has been prepared to be consistent with guidance offered by the
STAPPA/ALAPCO Ad Hoc Air Monitoring Audit Committee.  The format is considered
acceptable for annual systems audit reports submitted to the OAQPS.
Regional Audit Team members shall use this framework as a starting point and
include additional material, comments, and information provided by the agency
during the audit to present an accurate and complete picture of its operations
and performance evaluation.
     At a minimum, the systems audit report should include the following six
sections:
     Executive Summary -- summarizes the overall performance of the agency's
monitoring program.  It should highlight problem areas needing additional
attention and should describe any significant conclusions and/or broad
recommendations.
     Introduction -- describes the purpose and scope of the audit and
identifies both the Regional Audit Team members, key agency personnel, and
other section or area leaders who were interviewed.  It should also indicate
the agency's facilities and monitoring sites which were visited and inspected,
together with the dates and times of the on-site audit visit.  Acknowledgment
of the cooperation and assistance of the Director and the QAO should also be
considered for inclusion.
     Audit Results -- presents sufficient technical detail to allow a complete
understanding of the agency operations.  The information obtained during the
audit should be organized using the recommended subjects and the specific
instructions given below.  It will be noted that the report format follows the
four-area organization of the short-form questionnaire.
                                   5-23

-------
A. "Network Design and Siting

    1)   Network Size — Provide an overview of  the "network  size  and  the
         number  of  local   agencies  responsible  to  the  state  for network
         operation.

    2)   .Network Design and Siting -'-- Describe any  deficiencies  in  network
         design  or  probe  siting discovered during the audit.  Indicate what
         corrective actions are planned to deal with these deficiencies.

    3)   Network Audit— Briefly discuss the conclusions of the last network
         annual  audit and outline any planned network revision resulting from
         that audit.

    4)   Non-criteria Pollutants — Briefly discuss the  agency's  monitoring
         and quality assurance activities related to non-criteria pollutants.

B.  Resources and Facilities

    1)   Instruments and Methods — Describe any  instrument  non-conformance
         with  the  requirements  of  40  CFR  50,  51,   53,  and 58.  Briefly
       •  summarize agency needs for  instrument  replacement  over  and  above
         non-conforming instruments.

    2)   Staff ajid  Facil ities — Comment  on>  staff  training,  adequacy  of
         facilities  and  availability of NBS-traceable  standard materials and
         equipment necessary for the agency to properly  conduct the , bi-v/eekly
         precision  checks and quarterly accuracy audits required under 40 CFR
         Part 58, Appendix A.

    3)   Laboratory  Facilities— Discuss  any  deficiencies  of  laboratory
         procedures, Staffing and facilities to conduct  the tests and analyses
         needed to implement the' SLAMS/NAMS monitoring the  Quality  Assurance
         pians.

C.  Data and Data Management'

    1)   Data Processing and Submittal  — Comment  on ,  the  adequacy  of  the
         agency's  staff  and  facilities  to  proce'ss :  and  submit 3AROAO ji'"
         .iuaiity  aaca  as  specified - in  •-!•() 7FR -30..;^   inn   --'e    ••-roor~ : -in
         requirements of 40 CFR 53, Appendices A ana F.   Inclune an i noi CaC '•. on
       '  of the timeliness of data submission by indicating  the  fraction  of
         data which are submitted more than forty-five (45) days late.

    2)   Data Review — A brief discussion of  the  agency's  performance  in
         meeting  the  75%  criteria  for  data  completeness.   Additionally,
         discuss any remedial actions necessary to improve data reporting.
                                 5-24

-------
    3)   Data Correction -- Discuss the adequacy and documentation of
         corrections and/or deletions made to preliminary ambient air data,
         and their consistency with both the agency's QA Manual and Standard
         Operating Procedures, and any revised protocols.

    4)   Annual Report -- Comment on the completeness, adequacy and
         timeliness of submission of the SLAMS Annual Report which is required
         under 40 CFR 58.26.

D.  Quality Assurance/Quality Control

    1)   Status of Quality Assurance Manual -- Discuss the status of the
         Agency's Quality Assurance Plan.  Include an indication of its
         approval status, the approval status of recent changes and a general
         discussion of the consistency, determined during the systems audit,
         between the Agency Standard Operating Procedures and the Quality
         Assurance Plan.

    2)   Audit Participation -- Indicate frequency of participation in an
         audit program.  Include, as necessary, the agency's participation in
         the National Performance Audit Program (NPAP) as required by 40 CFR
         Part 58.  Comment on audit results and any corrective actions taken.

    3)   Accuracy and Precision -- As a goal, the 95 percent probability
         limits for precision (all pollutants) and TSP accuracy should be less
         than ±15 percent.  At 95 percent probability limits, the accuracy for
         all other pollutants should be less than ±20 percent.  Using a short
         narrative and a summary table, compare the reporting organization's
         performance against these goals over the last two years.  Explain any
         deviations.  (An illustrative tally of these goals is sketched below.)
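     The goal check referred to in item 3) above can be illustrated with a
brief Python sketch.  It is offered only as a worked example of the ±15/±20
percent comparison; the probability limits shown are hypothetical and are not
actual PARS output.

    # Illustrative sketch only: compares 95 percent probability limits against
    # the precision and accuracy goals stated in item 3) above.
    def meets_goal(lower_limit, upper_limit, goal_percent):
        """True if both 95% probability limits fall within +/- goal_percent."""
        return abs(lower_limit) <= goal_percent and abs(upper_limit) <= goal_percent

    def accuracy_goal(pollutant):
        # TSP accuracy goal is 15 percent; all other pollutants, 20 percent.
        return 15.0 if pollutant == "TSP" else 20.0

    precision_limits = {"O3": (-9.8, 11.2), "TSP": (-16.3, 8.1)}   # hypothetical, percent
    for pollutant, (low, high) in precision_limits.items():
        print(pollutant, "precision goal met:", meets_goal(low, high, 15.0))

    accuracy_limits = {"O3": (-18.0, 12.5), "TSP": (-10.0, 9.0)}   # hypothetical, percent
    for pollutant, (low, high) in accuracy_limits.items():
        print(pollutant, "accuracy goal met:",
              meets_goal(low, high, accuracy_goal(pollutant)))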

     Discussion -- includes a narrative of the way in which the audit results
above are being interpreted.  It should clearly identify the derivation of
audit results which affect both data quality and overall agency operations,
and should outline the basis in regulations and guideline documents for the
specific, mutually agreed upon, corrective action recommendations.
     Conclusions and Recommendations -- should center around the overall
performance of the agency's monitoring program.  Major problem areas should be
highlighted.  The salient facts of mutually agreed upon corrective action
agreements should be included in this section.  An equally important aspect to
be considered in the conclusion is a determination of the homogeneity of the
agency's reporting organizations and the appropriateness of pooling the
Precision and Accuracy data within the reporting organizations.  The checklist
                                  5-25

-------
in Figure 11-5 should be included and submitted with the supporting
documentation.
     Appendix of Supporting Documentation -- contains a clean and legible copy
of the completed short-form questionnaire and any Corrective Action
Implementation Request (CAIR) forms.  Additional documentation may be included
if it contributes significantly to a clearer understanding of audit results.
11.4.5  Follow-up and Corrective Action Requirements - An effective corrective
action procedure for use by the Regional QA Audit Team follows.  As a means of
requesting corrective actions identified during the on-site audit, the auditor
completes one copy of the form, shown in Figure 11-4, for each major
deficiency noted.  These CAIR forms are presented to, and discussed with, the
agency's Director or his designee, and its QAO during the exit interview.
Once agreement has been reached, both the auditor and the Director sign the
form.  The original is given to the agency Director or his designee and a copy
is retained by the auditor.  A photocopy of the completed CAIR is included in
the audit report.  It is the responsibility of the agency to comply with
agreed-upon corrective action requests in the specified time frame.
11.5  Criteria for the Evaluation of State and Local Agency Performance
     This section is designed to assist the Regional Audit Team in
interpretation of the completed questionnaire received back from the agency
prior to the on-site interviews.  It also provides the necessary guidance for
topics to be further developed during the on-site interviews.
     This section is organized such that the specific topics to be covered and
the appropriate technical guidance are keyed to the major subject areas of the
long-form questionnaire (Section 11.7).  The left-hand side of the list
itemizes the discussion topics and the right-hand side provides citations to
specific regulations and guideline documents which establish the technical
background necessary for the evaluation of agency performance.  A more
complete bibliography of EPA guideline documents is presented in 11.3.
                                  5-26

-------
                  REPORTING ORGANIZATION HOMOGENEITY CHECKLIST


                                                           Yes     No

1.  Field operations, for all  local agencies, conducted
    by a common team of field operators?                   	    	

2.  Common calibration facilities are used for all
    local agencies?                                        	    	

3.  Precision checks performed by common staff for
    all local agencies?                                    	
4.  Accuracy checks performed by common staff for
    all local agencies?

5.  Data handling follows uniform procedures for
    all local agencies?

6.  Central data processing facilities used for
    all reporting?

7.  Traceability of all standards established by
    one central  support laboratory?

8.  One central  analytical laboratory handles all
    analyses for manual methods?
          Figure 11-5.  Example of Reporting Organization
                        Homogeneity Checklist
                                  5-27

-------
11.5.1   Planning -
  Topics for Discussion

  o General  information on
    reporting organization and
    status of Air Program, QA
    Plan and availability of SOPs

  o Conformance of network design
    with regulation, and
    completeness of network
    documentation
  o Organization staffing and
    adequacy of educational
    background and training of key
    personnel
  o Adequacy of current facilities
    and proposed modifications

11.5.2   Field Operations -

  Topics for Discussion
  o Routine operational practices
    for SLAMS network, and
    conformance with regulations
Background Documents

o State Implementation Plan
o U.S. EPA QAMS 005/80
o Previous Systems Audit
  report

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II - Ambient
  Air Specific Methods,
  Section 2.0.1.
o 40 CFR 58 Appendices D
  and E
o OAQPS Siting Documents
  (available by pollutant)

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. I -
  Principles, Section 1.4;
  Vol. II - Ambient Air
  Specific Methods, Section
  2.0.5
  o Types of analyzers and samplers
    used for SLAMS network
Background Documents

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.9
o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II
o 40 CFR 50 plus appendices
  A through K

o 40 CFR 58 Appendix C -
  Requirements  for SLAMS
  analyzers
                                  5-28

-------
  Topics  for  Discussion
Background Documents
  o Adequacy of field procedures,
    standards used and field
    documentation employed for
    SLAMS network

  o Frequency of zero/span checks,
    calibrations and credibility
    of calibration equipment used

  o Traceability of monitoring and
    calibration standards

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II
o Instruction Manuals for
  Designated analyzers

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II - Ambient
  Air Specific Methods,
  Section 2.0.9

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II - Ambient
  Air Specific Methods,
  Section 2.0.7
o 40 CFR 58 Appendix A
  Section 2.3

  o Preventive maintenance system
    including spare parts, tools
    and service contracts for major
    equipment

  o Record keeping to include
    inspection of some site log
    books and chain-of-custody
    procedures

  o Data acquisition and handling
    system establishing a data
    audit trail from the site to
    the central data processing
    facility

11.5.3.  Laboratory Operations -

  Topics for Discussion

  o Routine operational practices
    for manual methods used in
    SLAMS network to include
    quality of chemicals and
    storage times

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.6

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II - Ambient
  Air Specific Methods,
  Sections 2.0.3 and 2.1.9

Background Documents

o 40 CFR 50 Appendices A-K,
  and QA Handbook, Vol. II
                                   5-29

-------
Topics for Discussion
Background Documents
o List of analytical  methods
  used for criteria pollutants
  and adherence to reference
  method protocols

o Additional  analyses performed
  to satisfy  regional, state
  or local requirements
o Laboratory quality control
  including the regular usage
  of duplicates, blanks, spikes
  and multi-point calibrations
o Participation in EPA NPAP and
  method for inclusion of audit
  materials in analytical  run
o Documentation and traceability
  of laboratory measurements
  such as weighing, humidity and
  temperature determinations

o Preventive maintenance in the
  laboratory to include service
  contracts on major pieces of
  instrumentation

o Laboratory record keeping and
  chain-of-custody procedures
  to include inspection of
  logbooks used
o 40 CFR 58 Appendix C; "List
  of Designated Reference
  and Equivalent Methods"
o Refer to locally available
  protocols for analysis of
  aldehydes, sulfate,
  nitrate, pollens, hydro-
  carbons, or other toxic
  air contaminants

o U.S. EPA APTD-1132
  "Quality Control Practices
  in Processing Air
  Pollution Samples"
o 40 CFR 58 Appendix C; "List
  of Designated Reference
  and Equivalent Methods"

o 40 CFR 58 Appendix A
  Section 2.4
o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.10
o 40 CFR 58 Appendix C; "List
  of Designated Reference
  and Equivalent Methods"

o 40 CFR 58 Appendix C; "List
  of Designated Reference
  and Equivalent Methods"
o 40 CFR 58 Appendix C; "List
  of Designated Reference
  and Equivalent Methods"
o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.6
                                5-30

-------
  Topics for Discussion
Background Documents
  o Adequacy of Laboratory
    facilities, Health and Safety
    practices and disposal of
    wastes

  o Data acquisition,  handling
    and manipulation system
    establishing data  flow in
    the laboratory, data back-up
    system and data reduction
    steps.

  o Data validation procedures,
    establishing an audit trail
    for the laboratory to the
    central  data processing
    facility
11.5.4.  Data Management -

  Topics for Discussion

  o Data flow from field and
    laboratory activities to
    central  data processing
    facility

  o Extent of computerization of
    data management system and
    verification of media changes,
    transcriptions and manual
    data entry

  o Software used for processing
    and its documentation; to
    include functional description
    of software, test cases and
    configuration control for
    subsequent revisions

  o System back-up and recovery
    capabilities
o Handbook for Analytical
  Quality Control in Water
  and Wastewater Laboratories
o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Sections 2.0.3 and 2.0.9
o Annual Book of ASTM
  Standards, Part 41, 1978.
  Standard Recommended
  Practice for Dealing with
  Outlying Observations
  (E 178-75)

Background Documents

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.3

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.9
o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Sections 2.0.3 and 2.0.9
                                   5-31
-------
  Topics for Discussion
Background Documents
  o Data screening, flagging and
    validation
  o Data correction procedures
    and key personnel  allowed to
    correct ambient air data
  o Reports generated for in-house
    distribution and for submittal
    to EPA

  o Responsibility for preparing
    data for entry into the SAROAD
    and PARS systems and for
    responsibility for its final
    validation prior to submission

11.5.5  QA/QC Program -

  Topics for Discussion

  o Status of QA Program and its
    implementation

  o Documentation of audit
    procedures, integrity of
    audit devices and acceptance
    criteria for audit results

  o Participation in the National
    Performance Audit Program
    for what pollutants and
    ranking of results

  o Additional internal audits
    such as document reviews or
    data processing audits
o Validation of Air
  Monitoring Data, EPA-
  600/4-80-030
o Screening Procedures for
  Ambient Air Quality Data,
  EPA-450/2-78-037

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.9
o Aeros Manual Series,
  Vol. II, Aeros User's
  Manual, EPA-450/2-76-029

Background Documents

o 40 CFR 58 Appendix A
  and QAMS 005/80

o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Sections 2.0.11 and 2.0.12

o 40 CFR 58 Appendix A
o QA Handbook for Air
  Pollution Measurement
  Systems, Vol. II,
  Section 2.0.10
                                  5-32

-------
  Topics for Discussion

  o Procedure and implementation
    of corrective action

  o Frequency of performance and
    concentration levels for
    precision checks for each
    criteria pollutant

11.5.6.  Reporting -

  Topics for Discussion

  o Preparation of precision and
    accuracy summaries for the
    PARS system

  o Other internal reports used
    to track performance and
    corrective action
    implementation

  o Summary air data reports
    required by regulations

  o Completeness, legibility and
    validity of P & A data on
    Form 1
Background Documents
o 40 CFR 58 Appendix A
Background Documents

o PARS User's Manual
  (in preparation)
o 40 CFR 58 Appendix A
o 40 CFR 58 Appendices
  F and G

o 40 CFR 58 Appendix A
11.6  Systems Audit Questionnaire  (Short-Form)

     The short-form questionnaire has been designed specifically for use in
annually reviewing state and local agencies' air monitoring programs.  If the
Regional QA Coordinator decides that a more rigorous systems audit and site
inspections are necessary, he can utilize appropriate section(s) of the
Long-Form Questionnaire (Section 11.7).  This questionnaire has been designed
around the format recommended by STAPPA/ALAPCO in the National Air Audit System
Monitoring Questionnaire and is organized around four (4) major topics
consistent with the reporting format outlined in Section 11.4.4.  They are:

     A. Network Design and Siting

     B. Resources and Facilities

     C. Data Management, and

     D. Quality Assurance and Quality Control
                                   5-33

-------
                     NATIONAL AIR MONITORING SYSTEMS AUDIT
                                 QUESTIONNAIRE

                                  (SHORT FORM)

Agency 	
Address 	




Telephone Number (Area Code)  	 Number 	


Reporting Period (beginning-ending dates)  	


Organization Director 	


Air Program Supervisor	
Data Management Supervisor


Quality Assurance Officer_

                  i
Questionnaire Completed
                             (date)-(by)
On-Site Visit
Date: 	                  Audit Team Members: 	
Affiliation of Audit Team 	
                                      SF-1

-------
                            SHORT FORM QUESTIONNAIRE

                               TABLE OF CONTENTS



                                                        PAGE NO

A.   NETWORK DESIGN AND SITING

     1.  Network Size                                     SF-3
     2.  Network Design and Siting                        SF-5
     3.  Network Review                                   SF-6
     4.  Non-Criteria Pollutants                          SF-7



B.   RESOURCES AND FACILITIES

     1.  Instruments and Methods                          SF-8
     2.  Staff and Facilities                             SF-9
     3.  Laboratory Operations and Facilities             SF-10
     4.  Standards and Traceability                       SF-11



C.   DATA AND DATA MANAGEMENT

     1.  Timeliness of Data                               SF-13
     2.  Data Review                                      SF-14
     3.  Data Correction                                  SF-15
     4.  Annual Report                                    SF-16



D.   QUALITY ASSURANCE/QUALITY CONTROL

     1.  Status of Quality Assurance Program              SF-17
     2.  Audit Participation                              SF-18
     3.  Precision and Accuracy Goals                     SF-19
                                      SF-2

-------
                           A.  NETWORK DESIGN AND SITING

1.   NETWORK SIZE

(a)  Complete the table below for each of  the  criteria  pollutants monitored  as
    part of your air monitoring  network.   Include  only  those  sites  that are
    presently operating and those which are  temporarily inoperative (off  line
    less than 30 days).  Do not  include additional monitors which are
    collocated or index sites.

                          Number of Monitors

            SO2      NO2      CO       O3       TSP      PM10

NAMS

SLAMS
(excluding NAMS)

SPM

TOTAL
(b)  SLAMS Network Description

  1. What is the date of the most current official SLAMS Network
     Description?  	

  2. Where is it available for public inspection? 	
                                      SF-3

-------
  3. Does it include for each site the following?

                                           YES       NO

     AIRS Site ID#                         	     	

     Location                          .    	     	

     Sampling and Analysis Method          	     	

     Operative Schedule                    	     	

     Monitoring Objective and Scale
     of Representativeness                 	     	
     Any Proposed Changes
(c) For each of the criteria pollutants,  how many modifications (SLAMS
    including NAMS) have been made since  the last systems  audit?  (List
    the total SLAMS and NAMS)

    Date of last systems audit 	
                                Number £f_ Monitors

Pollutant             Added          Deleted        Relocated

Sulfur Dioxide      ________       ________       ________

Nitrogen Dioxide    ________       ________       ________

Carbon Monoxide     ________       ________       ________

Ozone               ________       ________       ________

Total Suspended
Particulates        ________       ________       ________

Lead	       	       	


PM10                	       	       	 '

(d) Briefly discuss changes to the Air Monitoring Network  planned for the
    next  audit period.  (Equipment is  discussed  in Part  B).
                                      SF-4

-------
2. NETWORK DESIGN AND SITING

Indicate by AIRS Number any non-conformance with the requirements  of 40
CFR 58, Appendices D and E.
                Site ID
Monitor         (AIRS)                 Reason for Non-Conformance
  S02
  03
  CO
  N02
  TSP
  Pb
                                      SF-5

-------
3. NETWORK REVIEW
Please provide the following information on your previous internal Network
Review required by 40 CFR 58.20(d).
Review performed on:    Date


Performed by:     .	
Location and Title of Review Document:
Briefly discuss all problems uncovered by this review.
                                      SF-6

-------
4. NON-CRITERIA POLLUTANTS
Does your agency monitor and/or analyze for non-criteria and/or toxic air
pollutants?   Yes 	   No 	

If yes, please complete the form below.
                      Monitoring                SOP Available
Pollutant          Method/Instrument              Yes/No
                                      SF-7

-------
                          B.  RESOURCES-AND FACILITIES

1. INSTRUMENTS AND METHODS

(a) Please complete the table below  to  indicate which analyzers do not
    conform with the  requirements of 40 CFR 53 for NAMS, SLAMS, or SIP
    related SPM's.
                                        Site              Comment on
Pollutant    Number    Make/Model       Identification    Variances
CO


S02


N02


03


TSP
Pb
(b) Please comment briefly  on your currently identified equipment needs..
                                      SF-8

-------
2. STAFF AND FACILITIES

(a) Please indicate the number of people available to each of the following
    program areas:

                                            Comment on Need for
Program Area            Number              Additional Personnel

Network Design
and Siting

Resources and
Facilities

Data and Data
Management

QA/QC

(b) Comment on your agency's needs for additional physical space
    (laboratory, office, storage, etc.)

                                      SF-9

-------
3. LABORATORY OPERATION AND  FACILITIES
(a) Is the documentation of Laboratory Standard Operating Procedures
    complete?    Yes 	    No 	

    .Please complete the table  below.
  Analysis                            Date of  Last Revision
  TSP
  Pb

  S04

  NO3

  S02
         (bubblers)
  N02

  Others (list by pollutant)
(b) Is sufficient  instrumentation  available  to conduct your  laboratory
    analyses?   Yes	  No 	

    If no, please  indicate instrumentation needs  in  the  table  below.
Instrument         ,                New or             Year  of
  Needed          Analysis         Replacement        Acquisition
                                      SF-10

-------
4. STANDARDS AND TRACEABILITY

(a) Please complete the table for your agency's laboratory standards,
                     Primary         Secondary      Recertification
Parameter            Standard        Standard             Date
CO
N02
S02
03
Weights
Temperature
Moisture
Barometric
Pressure
Flow
Sulfate
Nitrate
Other (specify)
                                     SF-11

-------
(b) Please complete the table below for your agency's site standards (up to
    10% of the sites, not to exceed 20 sites).
                       Primary         Secondary       Recertification
Parameter              Standard        Standard             Date
CO


N02



S02



03
                                     SF-12

-------
                          C.  DATA  AND  DATA  MANAGEMENT
1. TIMELINESS OF  DATA
For the current calendar year or portion  thereof  which  ended  at  least  136
calendar days prior to the receipt of  this  questionnaire,  please  provide
the following percentages for required data submitted.
                           % Submitted on Time*

Monitoring
   Qtr.           S02    CO    03    N02    TSP    PM10    Pb

1 (Jan. 1-March 31)

2 (Apr. 1-June 30)

3 (July 1-Sept. 30)

4 (Oct. 1-Dec. 31)

  *"On-Time" = within 135 calendar days after the end of the quarter in
   which the data were collected.
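  As a purely illustrative aid, and not part of the questionnaire itself, the
  short Python sketch below shows how an on-time percentage could be tallied
  under the 135-day rule in the footnote above; the quarter-end and submission
  dates used are hypothetical.

    # Illustrative sketch only: percent of data batches submitted within
    # 135 calendar days of the end of the monitoring quarter.
    from datetime import date, timedelta

    def percent_on_time(quarter_end, submission_dates, allowed_days=135):
        deadline = quarter_end + timedelta(days=allowed_days)
        on_time = sum(1 for d in submission_dates if d <= deadline)
        return 100.0 * on_time / len(submission_dates)

    quarter_end = date(1987, 3, 31)                       # end of first monitoring quarter
    submissions = [date(1987, 6, 30), date(1987, 9, 1)]   # hypothetical submission dates
    print(f"{percent_on_time(quarter_end, submissions):.0f}% submitted on time")   # 50%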
                                     SF-13

-------
2. DATA REVIEW
What fraction of the SLAMS sites (by pollutant) reported less than 75% of
the data (adjusted for seasonal monitoring and site start-ups and
terminations)?

Calendar Year
                        Percent of Sites with <75% Data Recovery

Pollutant              1st          2nd          3rd          4th
                     Quarter      Quarter      Quarter      Quarter

Ozone

Nitrogen Dioxide

Sulfur Dioxide

Carbon Monoxide

Total Suspended
Particulates

Lead
                                      SF-14

-------
3.  DATA CORRECTION

(a)  Are changes to submitted data documented  in a  permanent  file?

    Yes 	  No	


If  no, why not? 	
(b)  Are changes performed according  to a documented  Standard  Operating
    Procedure or your Agency  Quality  Assurance  Project  Plan?

    Yes   -    No
   If not according to the QA Project Plan,  please attach a  copy  of your
   current Standard Operating Procedure.
(c)  Who has signature authority for approving  corrections?
           (name)                      (Program Function)
                                     SF-15

-------
4. ANNUAL REPORT

(a)  Please provide the dates  annual  reports  have  been  submitted  in  the  last
    two years.
(b) Does the agency's annual report (as required in 40 CFR 58.26)
    include the following?

                                                       YES      NO

    1. Data summary  required  in  Appendix F.  .           	  	

    2. Location, date, pollution source and duration
       of all  episodes reaching  the  significant harm
       levels.                                         	  	

    3. Certification by a senior officer in  the State
       or his  designee.
(c) Describe any deficiencies which cause the answer to part (b) of this
    question to be No.
                                     SF-16

-------
                     D.   QUALITY ASSURANCE/QUALITY CONTROL






1.   STATUS OF  QUALITY  ASSURANCE PROGRAM



(a)  Does the agency have an EPA-approved  quality  assurance program plan?*



    Yes 	   No 	



    If yes, have changes to the plan  been approved by  the EPA?



    Yes         No
    Please provide:



    Date of Original  Approval



    Date of Last Revision
    Date of Latest Approval
(b)  Do you have any revisions  to  your  QA  Program Plan still pending?



    Yes         No
  If answer is No,  give a brief  summary  of  the deficiencies
                                     SF-17

-------
2. AUDIT PARTICIPATION

(a) Date last systems audit was conducted?

    By whom? 	
(b) Does the agency participate in the National Performance Audit
    Program (NPAP) as required under 40 CFR 58 Appendix A?*

    Yes         No
(c) Please complete the table below.
Parameter Audited                         Date of  Last  NPAP  Audit



S02 (Continuous)

CO

Pb

ReF Device

S02 (bubbler)
                             i
N02 (bubbler)
  If No, give a brief summary of deficiencies.
                                      SF-18

-------
3.  PRECISION AND ACCURACY GOALS

     As a goal, the 95 percent probability limits for precision (all pollutants)
and TSP and PM10 accuracy should be less than ±15 percent.  At 95 percent
probability limits, the accuracy for all other pollutants should be less than
±20 percent.*  Using a short narrative and a summary table, compare the
reporting organization's performance against these goals over the last year.
Explain any deviations.

     Precision and accuracy are based on reporting organizations; therefore,
this question concerns those reporting organizations that are the responsibility
of the agency.  A copy of a computer printout has been provided which contains
the precision and accuracy data submitted to EMSL for each of the agency's
reporting organizations.  The printout, containing at least the last four
completed calendar quarters of precision and accuracy data, was obtained using
the NADB program AMP240.  These data should be verified using agency records.
If found in error, please initiate corrections.  Based on the data provided
or corrections thereto, complete the table in part (a) below indicating the
number of reporting organizations meeting the goals stated above for each
pollutant by quarter.
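
As a point of reference, the probability limits referred to above can be
computed from the individual percent differences of a quarter's precision
checks or accuracy audits.  The following is a minimal sketch only, assuming
the familiar d-bar ± 1.96·S form of the 40 CFR 58 Appendix A statistics; the
function names and example data are illustrative, not agency data:

    import statistics

    def probability_limits(percent_differences):
        """95 percent probability limits (d_bar +/- 1.96*S) for a set of
        percent differences from precision checks or accuracy audits."""
        d_bar = statistics.mean(percent_differences)
        s = statistics.stdev(percent_differences)
        return d_bar - 1.96 * s, d_bar + 1.96 * s

    def meets_goal(percent_differences, goal=15.0):
        """True when both probability limits fall within +/- goal percent."""
        lower, upper = probability_limits(percent_differences)
        return abs(lower) <= goal and abs(upper) <= goal

    # Example: one quarter of biweekly S02 precision checks (percent differences).
    checks = [3.1, -2.4, 5.0, 1.2, -0.8, 4.4, 2.0, -1.5, 3.3, 0.6, -2.1, 1.9]
    print(probability_limits(checks))
    print(meets_goal(checks, goal=15.0))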
(a) Precision Goals
                # of Reporting                    Precision
Pollutant       Organizations         Qtr/Yr   Qtr/Yr   Qtr/Yr   Qtr/Yr

03

N02

S02

CO

TSP

PM10

Pb
*While the accuracy goals are important for all  audit levels for the gaseous
 pollutants, the principal concerns are the audit levels that include the
 ambient standard or the levels just below and just above the standard.
                                       SF-19

-------
(b) Accuracy Goals

                # of Reporting                    Accuracy
Pollutant       Organizations         Qtr/Yr   Qtr/Yr   Qtr/Yr   Qtr/Yr
03

N02

S02

CO

TSP
Pb
(c) To the extent possible, describe problems preventing the meeting of
    precision and accuracy goals.
                                     SF-20

-------
 11.7  Systems Audit Questionnaire (Long-Form)

 The long-form systems audit questionnaire which follows is intended to provide a
 complete picture of agency ambient air monitoring operations and quality
 assurance implementation.  The following instructions might prove helpful in
 completing this survey questionnaire.

 1. For ease in completing the questionnaire, it is not necessary to type.
    Filling it out legibly in black ink is acceptable.

 2. Feel free to elaborate on any point or question in the form.  Use additional
   pages as necessary to give a complete response.

 3. When necessary, include copies of documents which will  aid  in  understanding
   your response.

 4. Please   pay  careful  attention  in  completing   the   questionnaire.    The
   information  supplied will  have a direct bearing  on  the  conclusions  drawn  and
   recommendations  made  concerning  the  evaluation  of  your    organization's
   program.

 5. The Regional Quality Assurance Coordinator or a member of his   staff  may   be
   contacted for assistance in completing the questionnaire.

-------
        5.  AIR MONITORING
  This material is similar to:
          SECTION 2.0.11
     SYSTEMS AUDIT CRITERIA
        AND PROCEDURES FOR
 AMBIENT AIR MONITORING PROGRAMS
              of the
  QUALITY ASSURANCE HANDBOOK FOR
AIR POLLUTION MEASUREMENT SYSTEMS,
  VOLUME II, EPA-600/4-77-027a
              5-iii

-------
                     SYSTEMS AUDIT QUESTIONNAIRE  (LONG FORM)
                               GENERAL INFORMATION
Questionnaire completion date
On-site systems audit date ____________
Reporting period 	
Agency name and address
Mailing address (if different from above)
Telephone number (FTS) 	
Commercial (     )      -
Agency Director ____________________
Agency QA Officer	
Reporting organizations making up this agency
Systems audit conducted by
Affiliation of audit team
Key Personnel:           Completed Questionnaire        Interviewed
Planning ____________________
Field Operations 	
Laboratory Operations 	
QA/QC 	
Data Management 	
Reporting	

Persons Present during exit interview
                                      LF-1

-------
                            LONG FORM QUESTIONNAIRE

                               TABLE OF CONTENTS

                                                        PAGE NO.
A.   NETWORK MANAGEMENT

     1.  General                                          LF-3
     2.  Network Design and Siting                        LF-8
     3.  Organization, Staffing and Training              LF-12
     4.  Facilities                                       LF-15

B.   FIELD OPERATIONS

     1.  Routine Operations                               LF-16
     2.  Quality Control                                  LF-20
     3.  Preventive Maintenance                           LF-25
     4.  Record Keeping                                   LF-27
     5.  Data Acquisition and Handling                    LF-28

C.   LABORATORY OPERATIONS

     1.  Routine Operations                               LF-30
     2.  Quality Control                                  LF-32
     3.  Preventive Maintenance                           LF-37
     4.  Record Keeping                                   LF-38
     5.  Data Acquisition and Handling                    LF-40
     6.  Specific Pollutants
           TSP                                            LF-42
           PM10                                           LF-44a
           Lead                                           LF-45

D.   DATA AND DATA MANAGEMENT

     1.  Data Handling                                    LF-48
     2.  Software Documentation                           LF-51
     3.  Data Validation and Correction                   LF-54
     4.  Data Processing                                  LF-57
     5.  Internal Reporting                               LF-63
     6.  External Reporting                               LF-65

E.   QUALITY ASSURANCE/QUALITY CONTROL

     1.  Status of Quality Assurance Program              LF-68
     2.  Audits and Audit System Traceability             LF-69
     3.  National Performance Audit Program (NPAP)
         and Additional Audits                            LF-73
     4.  Documentation and Data Processing Review         LF-75
     5.  Corrective Action System                         LF-77
     6.  Audit Result Acceptance Criteria                 LF-79

-------
                             A.  NETWORK MANAGEMENT

1. GENERAL

(a) Provide an organization chart clearly showing  the agency's structure and
    its reporting organizations. (Attach sheet(s)  as  necessary.)

(b) What is the basis for the current structure of the agency's reporting
    organizations?
                                                               Yes    No

    Field operations for all  local  agencies,  conducted         	  	
    by a common team of field operators?

    Common calibration facilities are used  for all
    local agencies?                                            		

    Precision checks performed by common staff for
    all local agencies?                                        ____  ____

    Accuracy checks performed by common staff for
    all local agencies?          .                              	  	

    Data Handling follows uniform procedures  for
    all local agencies?                                        	  	

    Central  data processing facilities used  for all
    reporting?                                                 	  	

    Traceability of all standards established by
    one central support laboratory?                            	  	
          I
    One central analytical  laboratory handles all
    analyses  for manual methods?                               	  	
(c) Does the agency feel that the data for the reporting organizations it
    contains can be pooled?

    Yes ____   No ____   Please comment on either answer
(d) Briefly describe any changes which will  be made within the agency's
    monitoring program the next calendar year. 	
                                      LF-3

-------
(e) Complete the table below for each of the criteria pollutants monitored
    as part of your air monitoring network.
                                                                        /
                         Number of Monitors
           S02      N02      CO       03       TSP      PM10     Pb
NAMS
SLAMS
(excluding NAMS)
SPM
TOTAL
(f) What is the date of the most current official  SLAMS Network

    Description? ________________


   I.  Where is it available for public inspection?  	

   II. Does it include for each site the following?
                                             YES       NO

       AIRS Site ID #                        ____     ____

       Location                              	     	

       Sampling and Analysis Method          	     	

       Operative Schedule
       Monitoring Objective and Scale
        of Representativeness

       Any Proposed Changes


                                      LF-4

-------
(g)  For each of the criteria  pollutants,  how many modifications (SLAMS
    including NAMS) have been made since the last systems audit?  (List
    the total SLAMS and NAMS)

    Date of last systems audit 	
                                   Number of Monitors

Pollutant                Added           Deleted  '      Relocated

Sulfur Dioxide         	      	      '	

Nitrogen Dioxide       	      	       	

Carbon Monoxide        	      	       	

Ozone
Total Suspended
Particulates
Lead
(h) Briefly discuss changes  to  the  Air  Monitoring Network planned for the
    next audit period.  (Discuss  equipment  needs  in Section B.3.g)
                                      LF-5

-------
(i)  Does an overall  SLAMS/NAMS Monitoring Plan exist?

    Yes 	  No 	

(j)  Has the agency prepared and implemented Standard Operating Procedures
    for all facets of agency operation? Yes	 No	

    It" no,  list subject of  any missing SOPs	
(k)  Do the Standard Operating Procedures adequately address  at least the
    fourteen (14)  item quality control  program required  by Appendix A to 40
    CFR 58?  Yes      No     Comment
(l) Clearly identify by section number and/or document title, major changes
    made to documents since the last on-site review.

           Title/Section #            .       Pollutant(s)  Affected
                                      LF-6

-------
(m) Does the agency have an implemented plan for operations during emergency
    episodes? Yes ____  No ____  Indicate latest revision, approval date and
    current location of this plan.

    Document Title ________________

    Revision Date	

    Approved 	
(n) During episodes, are communications sufficient so that regulatory actions
    are based  on  real-time data?

    Yes         No
(o)  Identify the section  of  the  emergency episode plan where quality control
    procedures  can  be  found.
                                      LF-7

-------
2. NETWORK DESIGN AND SITING

(a) Indicate by AIRS Number any non-conformance with the requirements of
  •  40 CFR 58, Appendices D and E.
                Site ID
Monitor         (AIRS)                 Reason for Non-Conformance
  S02
  03
  CO
  N02
  TSP
  PM10
  Pb
(b) Please provide the following information on your previous Network Review
    required by 40 CFR 58.20(d).

Review performed on:    Date	

Performed by: 	
Location and Title of Review Document:
Briefly discuss all  problems uncovered by this review.
                                      LF-8

-------
(c) Have NAMS Hard Copy Information Reports (NHCIRs) been prepared and
    submitted  for all monitoring sites within the network?
    Yes	'   No  	

(d)  Does each site  have  the  required  information including:

                                                          • YES    NO

    SAROAD identification number?
    Photographs/slides to the four cardinal compass
    points?                  •                               	  	

    Startup  and  shutdown dates?  •                           	  	

    Documentation of  instrumentation?                       	  	

    Reasons for periods of missing data?                    ____  ____


(e)  Who  has  custody of  the current  network documentation?


               (Name)                          (Title)

(f)  Does the current  level  of  monitoring  effort, site placement,
    instrumentation, etc., meet requirements imposed by current grant
    conditions?  Yes        -No        Comment
(g)  How often  is  the  network design and siting reviewed?
    Date of last  review
                                      LF-9

-------
(h) Please provide a summary of the monitoring activities conducted at the
    SLAMS/NAMS network by the agency as follows:

    I.   Monitoring is seasonal for (indicate pollutant and month of high
        and low concentrations).

                                     Month(s)

                           High                Low
       Pollutant        Concentration    Concentration   Collocated

    	     	    	      Y/N

    	     	          .	      Y/N

    	     	. •   	      Y/N

    	     	    	      Y/N

    	     	    	      Y/N

                                                            Y/N
    II.  Monitoring  is  year-round  for  (indicate pollutant)

                     Pollutant           Collocated

               '   	            Y/N

                  	            Y/N

                  	      •      Y/N

                                           Y/N
                                      LF-10

-------
(i) Does the number of collocated monitoring sites meet the requirements of
    40 CFR 58 Appendix A?

    Yes        No        Comment
(j) Does your agency monitor and/or analyze for non-criteria air and/or toxic
    air pollutants?   Yes 	   No	

    If yes, please complete the form below.
                           Monitoring                SOP Available
Pollutant               Method/Instrument              Yes/No
                                     LF-11

-------
3. ORGANIZATION, STAFFING AND TRAINING


(a) Please indicate the key individuals responsible for the following


    Agency Director 	

    SLAMS Network Manager 	
    Quality Assurance Officer
    Field Operations Supervisor

    Laboratory Supervisor 	
    Data Management Supervisor

    SLAMS Reporting Supervisor
(b) Please indicate the number of people available to each of the following
    program areas:
                                            Comment on Need for
Program Area          Number                Additional Personnel
Network Design
and Siting

Resources and
Facilities

Data and Data
Management

QA/QC
                                     LF-12

-------
(c)  Does the agency  have an  established  training program?

    Yes  	  No	

    I.   Where is this  documented?
             (rev  date)

   II.   Does it make use  of seminars, courses, EPA sponsored college level
        courses?  Yes 	  No	

  III.   Indicate below  the three  (3) most  recent  training events and
        identify the personnel participating in them?

          Event            Dates                  Participant(s)
                                     LF-13

-------
(d)  Does the agency subscribe  to  recognized  publications?  Please provide a
    list of  periodicals.   Are  periodicals available to all personnel?
              Periodical  Title
Distribution
                                     LF-14

-------
4. FACILITIES
(a) Identify the principal facilities where the work is performed which is
    related to the SLAMS/NAMS network.  (Do not include monitoring
    sites but do include any work which is performed by contract or other
    arrangements.)

         Facility       Location       Main SLAMS/NAMS Function
(b)  Please review the entries  on  the  above  table.  Are there any areas of
    facilities  which  you  believe  should be  upgraded?  Please identify by
    location.
    Are there any significant changes which are likely to be implemented to
    agency facilities before the next systems audit?  Comment on your
    agency's needs for additional physical space (laboratory, office,
    storage, etc.)

         Facility        Function       Proposed Change - Date
                                     LF-15

-------
                              B.   FIELD OPERATIONS


1. ROUTINE  OPERATIONS

(a) Is the documentation of Monitoring Standard Operating Procedures complete?

    Yes 	  No	


    Please  complete the table below.
    Pollutant  '
    Monitored                            Date  of  Last  Revision
      TSP

      PM10

      Pb

      S02
             (continuous)
      N02

      S02
             (bubblers)
      N02

      03                                               .

      CO

      Others (list by pollutant)


(b) Are such procedures  available  to  all  field operations personnel?

    Yes        No        Comment
(c) Are standard operating procedures  prepared  and  available  to  field
    personnel  which detail  operations  during  episode monitoring?
    Yes        No        Comment
                                     IF-16

-------
(d)  For what does each reporting  organization within the agency monitor?
    Provide  the  list  requested below.

    Reporting Organization     #  of Sites     Pollutants
(e)  On  the average,  how often  are most of your sites visited by a field
    operator?   	 per	

(f)  Is  this visit frequency  consistent for all reporting organizations within
    your agency?   Yes 	  No	

    If  no, document  exceptions	
(g) On the average, how many sites does a single site operator have
    responsibility for? ____________

(h) How many of the sites of your SLAMS/NAMS network are equipped with
    manifold(s)?

    I. Briefly describe most common manifold type. ____________
   II.  Are manifolds  cleaned  periodically?  Yes 	  No

        If yes, how often?  	  per	
                                     LF-17

-------
    III.  If  the manifold is  cleaned,  what  is  used?
     IV.  Are manifold(s)  equipped  with  a  blower?   Yes 	  No
      V.  Is  there sufficient air  flow through  the manifold  at  all  times?
         Yes 	  No	
         Approximate air flow is
                                   (flow  units)

     VI.  Is there a conditioning period for the manifold after cleaning?
         Briefly  comment  on  the length of  time the conditioning  is performed.
(i)  What material  is  used  for  instrument  lines?
(j)  Has  the agency obtained  necessary waiver  provisions  to operate equipment
    which does not meet the  effective reference  and  equivalency requirements?
    Yes  	  No 	

    Comment on agency use of approved/non-approved instrumentation. ________
                                     LF-18

-------
(k) Please complete the table below to  indicate which analyzers do not conform
    with the requirements of 40 CFR 53 for NAMS, SLAMS, or SIP-related SPMs.
                                              Site            Comment on
Pollutant      Number      Make/Model       Identification      Variances
CO


S02


N02


03


TSP


PM10


Pb
(l) Please comment briefly and prioritize your currently identified
    instrument needs.
                                     LF-19

-------
2. QUALITY  CONTROL

(a)  Are field calibration  procedures  included  in the documented Standard
    Operating Procedures?   Yes 	  No	

    Comment on location  (site, lab, office) of  such procedures	
(b)  Are multipoint calibrations  performed?  Indicate both the frequency and
    pollutant.

    Reporting Organization         Pollutant         Frequency
(c) Are calibrations  performed  in  keeping with the guidance offered in
    Section 2.0.9, Vol. II of the Quality Assurance Handbook for Air Pollution
    Measurement  Systems?   Yes 	  No 	

    If no, why not? 	
(d)  Are calibration  procedures  consistent with the operational requirements
    of appendices to 40 CFR 50 or to analyzer operation/instruction manuals?
    Yes 	 'No 	

    If no, briefly explain deviations	
                                     LF-20

-------
(e)  Have changes been made to  calibration methods based on manufacturer's
    suggestions  for  a particular  instrument?  Yes 	  No	
    Are these also documented?  Yes	  No 	

(f) Do standard materials used for calibrations meet the requirements of the
    appendices to 40 CFR 50 (EPA reference methods) and Appendix A to 40 CFR
    58 (traceability of materials to NBS-SRMs or CRMs)?  Yes ____  No ____
    Comment on deviations
(g)  Are all  flow-measurement devices checked  and certified?
    Yes       No       Comment
(h)  What are the  authoritative standards used for each type of flow
    measurement?   Please list them  in  the  table below, indicate the frequency
    of calibration standards  to  maintain field material/device credibility.

    Flow Devices         Primary  Standard       Frequency of Calibration
(i)  Where do  field  operations  personnel obtain gaseous standards?
       Are those standards certified by:

       The agency laboratory?

       EPA/EMSL/RTP standards  laboratory?
       A laboratory  separate  from  this  agency but
       part of the same reporting  organization?

       The vendor?

       NBS?
                                     LF-21

-------
(j) Does the documentation include expiration date of certification?
    Yes 	  No	

    Reference to primary standard used?     Yes 	 No 	

    What traceability protocol  is used? 	
    Please attach an example of recent documentation of traceability (tag,
    label, log sheet). ____________

(k) Is calibration equipment maintained at each site?   Yes ____  No ____

    For what pollutants?
(l) How is the functional integrity of this equipment documented?
(m) Please complete the table below for  your  agency's  site  standards  (up to  7%
    of the sites, not to exceed 20 sites).
                       Primary          Secondary       Recertification
Parameter              Standard        Standard              Date
CO


N02


S02


03
                                     LF-22

-------
(n)  Are level  1  zero  and  span  (z/s) calibrations (or calibration checks)  made
    for all  continuous monitoring equipment and flow checks made for TSP
    samplers?    Yes	  No 	
    Please complete the table below:

                                            Span Conc.
    I.   Continuous analyzers   Pollutant     (ppm)      Frequency


    II.  TSP Samplers           Flow Rate                Frequency


    III. PM10 Samplers          Flow Rate                Frequency
                                     LF-23

-------
(o) Does the agency have acceptance criteria for zero/span checks?
    Yes         No          Comment
    I.    Are  these  criteria  known  to  the  field operations personnel?
         Yes	   No 	

    II.   Are  they documented in  standard  operating  procedures?
         Yes  	   No	

         If  not,  indicate  document and  section where  they can be  found
    III.  Do the documents discussed in (II) above indicate when zero/span
          adjustments should and should not be made?   Yes ____  No ____
          Indicate  an  example	
    IV.    Are  zero  and  span  check  control charts maintained? Yes 	 No
(p) In keeping with 40 CFR 58 regulations, are any necessary zero and span
    adjustments made after  precision checks?  Yes	  No 	

    If  no,  comment on why not	'<	
(q) Are precision check control charts maintained?  Yes ____  No ____


(r) Who has the responsibility for performing zero/span checks?
(s)  Are precision checks  routinely  performed  within concentration  ranges and
    with a  frequency  which meet  or  exceed  the requirements of 40 CFR 58,
    Appendix A?    Yes	   No 	

    Please  comment on any discrepancies. 	
                                      LF-24

-------
(t)  Please identify person(s)  with  the responsibility  for  performance  of
    precision checks on continuous  analyzers?

    Person(s) 	

    Title
3.  PREVENTIVE MAINTENANCE

(a) Has the field operator been given any special training in performing
    preventive maintenance?   Briefly comment on  background  and/or  courses
(b) Is this training routinely reinforced?   Yes ____  No ____

    If no, why not? ____________
(c) If preventive maintenance i s  MINOR,  it  is  performed  at  (check one or
    more):   field site	,  headquarters  facilities	,  equipment  is sent
    to manufacturer
(d)  If preventive maintenance is MAJOR ,  it  is  performed  at  (check one  or
    more):   field site	,  headquarters facilities 	,  equipment is sent
    to manufacturer   •  .                   <
(e) Does the agency have service contracts or agreements in place with
    instrument manufacturers?  Indicate below or attach additional pages to
    show which instrumentation is covered.
                                      LF-25

-------
(f)  Comment briefly  on the  adequacy  and  availability of  the  supply of  spare
    parts,  tools  and manuals  available to  the field operator to perform any
    necessary maintenance activities.  Do  you feel  that  this is adequate  to
    prevent any  significant data  loss? 	
(g) Is the agency currently experiencing any recurring problem with equipment
    or manufacturer(s)?  If so, please identify the equipment and/or
    manufacturer, and comment on steps taken to remedy the problem.
                                     LF-26

-------
4. RECORDKEEPING

(a) Is a log book(s) maintained at each site to document site visits,
    preventive maintenance  and resolution of  site operational problems and
    corrective actions taken?  Yes       No       Other uses
(b) Is the logbook maintained currently and reviewed periodically?
    Yes ____  No ____     Frequency of Review ____________
(c) Once entries are made and all pages filled, is the logbook sent to the

    laboratory for archiving?   Yes 	   No	

    If no, is it stored at other location (specify)
(d)  What other records  are  used?                       YES      NO

    Zero/span record?                                  	    	

    Gas usage log?                                     	    	

    Maintenance log?                                   ____    ____

    Log of precision checks?                           ____    ____

    Control charts?                                    	    	

    A record of audits?
    Please describe the use and storage of these documents.
(e) Are calibration records or at least calibration constants available to
    field operators?  Yes	   No 	    Please  attach  an  example  field
    calibration record  sheet to this  questionnaire.
                                     LF-27

-------
5.  DATA ACQUISITION  AND  HANDLING

(a) With the exception of TSP, are instrument outputs (that is, data) recorded
    to (a) stripcharts, (b) a magnetic tape acquisition system, or (c) digitized
    and telemetered directly to agency headquarters?  Please complete the
    table below for each of the reporting organizations, or agencies within
    the overall R.O.
                                                   Data  Acquisition  Media
    Reporting Organization        Pollutants        (a, b,  c  or  combination)
(b) Is there stripchart backup  for  all  continuous  analyzers?  Yes	 No _

(c) Where is the flow of high-volume samplers  recorded  at  the site?

    For samplers with flow controllers?  Log sheet	,  Dixon chart 	,
    Other	 (specify)

    On high-volume samplers without flow controllers?   Log sheet ____,
    Dixon chart ____,  Other ____  (specify)

(d) What kind of recovery capabilities for data acquisition equipment are
    available to the field operator after power outages, storms, etc.?
    Briefly describe below.
                                     LF-28

-------
(e) Using a summary flow diagram, indicate below all data handling steps
    performed at the air monitoring site.  Identify the format, frequency and
    contents of data submittals to the data processing section.  Clearly
    indicate  points at  which  flow  path differs for different criteria
    pollutants.  Be sure  to  include all calibration, zero/span and precision
    check  data flow paths.   How  is  the integrity of the data handling system
    verified?
                                      LF-29

-------
                           C.  LABORATORY  OPERATIONS


1. ROUTINE  OPERATIONS

(a) What analytical methods are employed in support of your air monitoring
    network?


        Analysis     '                                Methods


   TSP
   Pb

   S04

   N03

   S02
          (bubblers)
   N02

   Others (list by pollutant)
(b) Are bubblers used  for any criteria  pollutants  in  any  agencies?
    Yes _  No _  If yes, attach a table which indicates  the number of
    sites where bubblers are used, the agency and pollutant(s).

(c) Do any laboratory procedures deviate from the reference, equivalent, or
    approved methods?   Yes _  No _  If yes, are the deviations for lead
    analysis _ ,  TSP filter conditioning _ ,  or other _  (specify below)
(d)  Have the procedures and/or any changes been approved  by EPA?   Yes
    No 	   Date of Approval	
                                      LF-30

-------
(e) Is the documentation of Laboratory Standard Operating Procedures complete?
    Yes _  No _ .  Please complete the table below.

        Analysis                         "           Methods


   TSP
   Pb

   S04

   N03

   S02
          (bubblers)
   N02

   Others (list by pollutant)
(f) Is  sufficient instrumentation  available to conduct your laboratory
    analyses?   Yes _  No _ .   If  no, please indicate instrumentation
    needs  in  the  table  below.

Instrument                              New or                Year of
  Needed            Analysis           Replacement           Acquisition
                                      LF-31

-------
2.  QUALITY CONTROL

(a) Please complete the table for your agency's laboratory standards.
                     Primary          Secondary       Recertification
Parameter            Standard         Standard             Date
CO
N02
S02
03
Weights
Temperature
Moisture
Barometric
Pressure
Flow


Lead


Sulfate


Nitrate


VOC



                                      LF-32

-------
(b)  Are all  chemicals and  solutions  clearly marked with an  indication of shelf
    life?  Yes        No                              •
(c)  Are chemicals removed  and  properly disposed of when  shelf life expires?
    Yes ____   No ____
(d) Are only ACS chemicals used by the laboratory?  Yes ____  No ____
(e) Comment on the traceability of chemicals used in the preparation of
    calibration standards.
(f)  Does  the laboratory:

    purchase standard solutions  such  as  those  for  use with lead or other
    AA analysis?  Yes 	  No	

    make  them themselves?   Yes        No
    If the laboratory staff routinely make their own standard solutions,
    are procedures for such  available?  Yes 	  No	   Where?
    Attach an example.
                                     LF-33

-------
(g)  Are all  calibration  procedures documented?  Yes	  No 	

    Where?  ________________       ________________
                   (title)                    (revision)

    Unless fully  documented,  attach a brief description of a calibration
    procedure.

(h) Are at least one duplicate, one blank, and one standard or spike included
    with a given analytical batch?  Yes ____  No ____   Identify analyses for
    which this is routine operation.
(i) Briefly describe the laboratory's use of data derived from blank analyses.
    Do criteria exist which determine acceptable/non-acceptable blank data?
    Please complete the table below.

       Pol lutant                       Blank Acceptance Criteria

          S02                       _

          N02                       _

          S04

          N03

          Pb                                                         '
          TSP

          VOC

          Other
                                     LF-34

-------
(j)  How frequently  and  at  what  concentration ranges does the lab perform
    duplicate  analysis?  What constitutes acceptable agreement?  Please
    complete the table  below.

        Pollutant                    Frequency         Acceptance Criteria

          S02    Bubblers

          N02    Bubblers            _

          S04                       _

          N03

          Pb                    '
          TSP

          VOC

          Other
(k) How does the lab use data from spiked samples?  Please indicate what may
    be considered acceptable percentage recovery by analysis.  Please
    complete the table below.

     Pollutant                          % Recovery Acceptance Criteria

          S02   Bubblers            _

          N02   Bubblers

          S04                      _ .

          N03

          Pb
          TSP

          VOC

          Other
                                     LF-35

-------
(l) Does the laboratory routinely include samples of reference material
    obtained from  EPA  within  an  analytical batch?  Yes 	  No 	

    If yes, indicate frequency,  level, and material used. 	
(m) Are mid-range standards included in analytical batches?  Yes ____  No ____
    If yes, are such standards included as a QC check (span check) on
    analytical stability?  Please indicate the frequency, level and compound
    used in the space provided below.  ____________
(n) Do criteria  exist  for "real-time" quality control based on the results
    obtained for the mid-range  standards discussed above?  Yes	  No 	
    If yes,  briefly discuss  them below or indicate the document in which they
    can be found.
(o)  Are appropriate acceptance  criteria documented for each type of analysis
    conducted?   Yes 	  No 	 Are they known to at least the analysts
    working with respective instruments?
                                     LF-36

-------
3.  PREVENTIVE MAINTENANCE
(a)  For laboratory equipment,  who  has  responsibility  for major  and/or minor
    preventive maintenance?

    Person                             Title
(b) Is most maintenance  performed:

    in the lab?   Yes       No
    in the instrument repair facility?   Yes ____  No ____

    at the manufacturer's facility?   Yes	 No
(c) Is a maintenance log  maintained  for  each major laboratory instrument?

    Yes       No        Comment
(d)  Are service contracts  in place for  the  following  analytical  instruments:

                                                     YES     NO

    Analytical  Balance                              	   	

    Atomic Absorption Spectrometer                  	   	

    Ion Chromatograph                               ____   ____

    Automated Colorimeter                           ____   ____
                                     LF-37

-------
4.  RECORDKEEPING


(a)  Are all  samples that are received  by  the  laboratory:

    logged in?  Yes ____  No ____

    assigned a unique laboratory  sample number?  Yes	  No
    routed to the appropriate analytical section?  Yes ____  No ____

    Discuss sample routing and special needs for analysis (or attach a copy
    of the  latest  SOP  which covers  this).  Attach a flow chart if possible.
(b) Are logbooks  kept  for  all  analytical laboratory instruments?
    Yes       No
(c) Do these logbooks indicate:
                                                         YES    NO

    analytical  batches  processed?                         	  	

    quality control  data?                                 	  	

    calibration data?
    results of blanks,  spikes  and  duplicates?
            i    i
                i
 I   in i i;1 al s J f analyst?
                                     LF-38

-------
(d)  Is there a logbook which indicates  the  checks made  on:

    weights?  Yes 	  No 	

    humidity indicators?   Yes	  No 	

    balances?  Yes '      No 	

    thermometer(s)?  Yes ____  No ____
(e) Are logbooks maintained to track the preparation of filters for the field?
    Yes ____  No ____

    Are they current?  Yes	  No  	
    Do they indicate  proper  use of  conditioning?   Yes 	  No

    Weighings?   Yes	  No 	

    Stamping  and numbering?   Yes 	  No 	
(f)  Are logbooks kept which track filters  returning  from  the  field  for
    analysis?  Yes 	  No	
(g) How are data records from the laboratory archived?

    Where?	

    Who has the responsibility?   Person	

    Title
    How long are records kept? ____________
                                     LF-39
-------
5.  DATA ACQUISITION  AND  HANDLING

(a) Identify those laboratory instruments which make use of computer
    interfaces directly to record data.  Which ones use stripcharts?
    Integrators?
(b) Are QC data readily available to the analyst during a given analytical
    run?   Yes ____   No ____
(c)  For those instruments  which are computer  interfaced, indicate which are
    backed  up by stripcharts? 	
(d)  What is the laboratory's  capability with regard to data recovery?  In
    case of problems,  can  they  recapture data or are they dependent on
    computer operations?   Discuss  briefly.
(e) Has a user's manual  been  prepared  for the automated data acquisition
    instrumentation?   Yes        No         Comment
    Is it in the analyst's  or  user's  possession?  Yes	  No

    Is it current?   Yes ____   No ____
                                     LF-40

-------
(f) Please provide below a data flow diagram which establishes, by a short
    summary flowchart, the transcriptions, validations, and reporting format
    changes the data goes through before being released to the data
    management group.  Attach additional pages as necessary.
                                    LF-41

-------
6.  SPECIFIC POLLUTANTS:  TSP, PM10, AND LEAD

    TSP

(a)  Are filters supplied  by EPA used  at  SLAMS  sites?  Yes	  No 	

    Comment 	

(b)  Do filters meet the specifications  in the  Federal Register 40 CFR  50?

    Yes       No        Comment
(c) Are filters checked  for  surface  alkalinity?   Yes	  No

    Indicate frequency	•
(d) Are filters visually  inspected  via  strong  light  from  a view  box  for
    pinholes and other imperfections?   Yes	 No	
    If no, comment on the way imperfections are determined. ____________
(e) Are filters  permanently marked  with  a  serial number?   Yes 	  No

    Indicate when and how this is  accomplished: 	
(f) Are unexposed filters equilibrated in a controlled conditioning environment
    which meets or exceeds the requirements of 40 CFR 50?   Yes ____  No ____
    If no, why not?
(g) Is the conditioning environment monitored?   Yes ____  No ____
    Indicate frequency ____________
    Are the monitors properly calibrated?  Yes	 No
    Indicate frequency	
                                     LF-42

-------
(h)  Is the balance checked with  Class  "S"  weights  each  day  it  is  used?
    Yes 	  No   _   If no,  indicate  frequency  of  such checks
(i) Is the balance check information placed in the QC logbook?  Yes ____ No ____
    If no, where is it recorded? ____________
(j)  Is the filter weighed to the nearest milligram?  Yes 	 No
    If not, what mass increment? ____________
(k)  Are filter-serial  numbers and tare weights  permanently  recorded  in  a
    bound notebook?  Yes 	._  No 	

    If no, indicate where	
(1) Are filters  packaged  for  protection  while  transporting  to and  from  the
    monitoring sites?   Yes	  No	

(m) How often are filter  samples  collected?   (Indicate  average  lapse  time
    (hrs.)  between end of sampling  and  laboratory  receipt.)
(n) Are field measurements recorded in a logbook or on the filter folder?
(o) Are exposed filters reconditioned for at least 24 hrs in the same
    conditioning environment as for unexposed filters?  Yes ____  No ____
    If no, why not?
(p) Are exposed  filters  removed  from  folders,  etc., before conditioning?
    Yes       No
                                     LF-43

-------
(q) Is the exposed filter weighed to the nearest milligram?   Yes ____  No ____

(r)  Are exposed  filters  archived?   Yes  	  No	  When? 	

    Where? ____________

    Indicate retention  period  	
(s) Are blank filters  reweighed?   Yes 	  No 	  If  no,  explain  why  not
    If yes, how frequently?
(t)  Are analyses  performed  on  filters?   Yes 	  No	.   Indicate  analyses
    other than Pb and mass  which are routinely  performed. 	
(u)  Are sample  weights  and  collection data  recorded  in  a  bound  laboratory
    logbook?   Yes	  No  __	  On  data forms?   Yes	 No 	

(v) Are measured air volumes corrected to reference conditions as given in
    CFR regulations (Qstd of 760 mm Hg and 25°C) prior to calculating
    the Pb concentration?    Yes ____  No ____

    If not, indicate conditions routinely employed for both internal and
    external reporting ____________
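
As a point of reference, the volume correction asked about in item (v) is the
usual ideal-gas adjustment to the EPA reference conditions of 760 mm Hg and
25°C (298.15 K).  The sketch below is illustrative only; the function names
and the example sampling conditions are hypothetical, not taken from this
manual:

    # Correct a sampled air volume to reference conditions (760 mm Hg, 25 C)
    # before dividing the filter Pb mass by the volume to get ug/m3.
    def volume_at_reference(volume_m3, pressure_mmHg, temperature_C):
        return volume_m3 * (pressure_mmHg / 760.0) * (298.15 / (temperature_C + 273.15))

    def pb_concentration(ug_pb_on_filter, volume_m3, pressure_mmHg, temperature_C):
        v_std = volume_at_reference(volume_m3, pressure_mmHg, temperature_C)
        return ug_pb_on_filter / v_std   # ug/m3 at reference conditions

    # Example: 2000 m3 sampled at 740 mm Hg and 10 C, with 3000 ug Pb recovered.
    print(round(pb_concentration(3000.0, 2000.0, 740.0, 10.0), 3))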
                                     LF-44

-------
    PM10

(a) Are filters supplied by EPA .used at SLAMS sites?   Yes	  No  	

    Comment 	

(b) Do filters meet the specifications in the Federal Register 40 CFR 50?

    Yes       No        Comment
(c) Are filters checked for surface alkalinity?  Yes ____  No ____

    Indicate frequency ____________
(d) Are filters visually inspected via strong  light from a view box  for
    pinholes and other imperfections?  Yes ____  No ____
    If no, comment on the way imperfections are determined. ____________
(e) Are filters permanently marked with a serial  number?  Yes 	  No

    Indicate when and how this is accomplished:  	
(f) Are unexposed filters equilibrated in a controlled conditioning environment
    which meets or exceeds the requirements of 40 CFR 50?  Yes 	  No 	
    If no, why not?
(g) Is the conditioning environment monitored?  Yes ____  No ____
    Indicate frequency ____________
    Are the monitors properly calibrated?  Yes ____  No ____
    Indicate frequency ____________
                                     LF-44a

-------
(h) Is the balance checked with Class "S" weights each day it is used?
    Yes ____  No ____   If no, indicate frequency of such checks
(i) Is the balance check information placed in the QC logbook?  Yes ____ No ____
    If  no, where is it recorded? 	
(j) Is the filter weighed to the nearest milligram?  Yes ____  No ____
    If not, what mass increment? ____________
(k) Are filter serial numbers and tare weights permanently recorded in a
    bound  notebook?  Yes  	  No 	

    If no, indicate where	
(1) Are filters  packaged  for  protection while  transporting  to and from the
    monitoring sites?   Yes	  No 	

(m) How often are filter  samples  collected?   (Indicate average lapse time
    (hrs.)  between end of sampling  and  laboratory  receipt.)
(n) Are field measurements recorded in a logbook or on the filter folder?
(o) Are exposed filters reconditioned for at least 24 hrs in the same
    conditioning environment as for unexposed filters?  Yes ____  No ____
    If no, why not?
(p) Are exposed  filters  removed  from  folders, etc., before conditioning?
    Yes   '    No
                                     LF-44b

-------
(q) Is the exposed filter weighed to the nearest milligram?  Yes ____  No ____

(r) Are exposed filters archived?   Yes ____  No ____  When? ____________

    Where?

    Indicate retention  period  	
(s) Are blank filters reweighed?   Yes ____  No ____   If no, explain why not.
    If yes, how frequently?
(t) Are analyses  performed  on  filters?   Yes 	  No	.   Indicate analyses
    other than Pb and mass  which are routinely  performed. 	
(u) Are sample weights  and  collection data  recorded  in a bound laboratory
    logbook?  Yes ____  No ____   On data forms?  Yes ____  No ____

(v) Are measured  air volumes  corrected  to reference  conditions as given in
    CFR regulations (Qstd of 760 mm Hg and 25°C) prior to calculating
    the Pb concentration?     Yes  	  No 	

    If not, indicate conditions routinely employed  for both  internal  and
    external reporting
                                     LF-44C

-------
    LEAD

(a) Is analysis for lead being conducted using atomic absorption spectrometry
    with an air-acetylene flame?   Yes ____  No ____
    If not, has the agency received an equivalency designation for its
    procedure?
(b) Is either the hot acid or ultrasonic extraction procedure being followed
    precisely?  Yes 	  No	   Which? 	
(c) Is  Class  A borosilicate glassware  used  throughout  the  analysis?
    Yes	  No 	

(d) Is all glassware scrupulously cleaned with detergent, soaked and rinsed
    three times  with distilled-deionized water?   Yes 	   No	
    If  not, briefly describe or  attach procedure.
(e) If extracted samples are stored, are linear polyethylene bottles used?
    Yes       No  .      Comment
(f) Are all  batches  of  glass  fiber  filters  tested  for background  lead content?
    Yes ____  No ____  At a rate of 20 to 30 random filters per batch of 500
    or greater?  Yes ____  No ____  Indicate rate ____________
(g) Are ACS reagent grade HN03 and HCl used in the analysis?   Yes ____
    No ____   If not, indicate grade used ____________
                                     LF-45

-------
(h) Is a calibration curve available having concentrations that cover the
    linear absorption range of the atomic absorption instrumentation?
    Yes	  No  	   Briefly describe	
(i) Is the stability of the calibration curve checked by alternately
    remeasuring every 10th sample at concentrations of approximately 1 ug Pb/ml
    and 10 ug Pb/ml?   Yes ____   No ____  If not, indicate frequency.
(j) Are measured" air  volumes  corrected  to  reference conditions as yiven in
    CFR reyulations (Qstd  of  76U  mm Hy  and 25oC) prior to calculating the
    Pb concentration?    Yes  	  No 	   If not, indicate conditions
    routinely employed for both-internal  and  external reporting.
(k) In either the hot or ultrasonic extraction procedure, is there always a
    30-min H2O soaking period to allow HN03 trapped in the filter to
    diffuse into the rinse water?   Yes ____  No ____   Comment
                                     LF-46

-------
(l) Is a quality control program in effect that includes periodic
    quantification of (1) lead in 3/4" x 8" glass fiber filter strips
    containing 100-300 ug Pb/strip, and/or (2) a similar strip with 600-1000
    ug Pb/strip, and (3) blank filter strips with zero Pb content to determine
    if the method, as being used, has any bias?  Yes ____  No ____
    Comment on  lead  QC  program  or attach applicable SOP.  	
(m) Are blank  Pb values  subtracted from Pb samples assayed? Yes 	  No

    If  not, explain why.	
                                    LF-47

-------
                          D.   DATA AND DATA MANAGEMENT

1.  DATA HANDLING

(a) Is there a procedure, description, or a chart which shows a complete data
    sequence from  point of acquisition to  point of submission of data to EPA?
    Yes-        No
    Please provide below a  data  flow diagram  indicating both the data
    flow within  the reporting  organization and the data received from the
    various local  agencies.
                                     LF-48

-------
(b) Are data handling and data reduction procedures documented?

    For data from continuous analyzers?    Yes 	  No 	

    For data from non-continuous methods?   Yes       No
(c) In what format and medium are data submitted to the data processing section?
    Please provide separate entry for each reporting organization.


            Reporting
           Organization        Data Medium      Format
(d) How often are data received at the processing  center  from  the  field  sites
    and 1aboratory?

       at least once a week? 	

       every 1-2 weeks?	

       once a month? 	


(e) Is there documentation accompanying the data regarding any media changes,
    transcriptions, and/or flags which have been placed into the data before
    data are released to agency internal data processing?   Describe.
                                     LF-49

-------
(f) How are the data actually entered into the computer system?  Digitization
    of stripcharts?   Manual  or computerized transcriptions?  Other?
(g) Is a double-key  entry  system  used  for data at the processing center?  Are
    duplicate card decks  prepared?  Yes  	  No 	  If no, why not?
(h)  Have special  data handling  procedures  been  adopted  for  air  pollution
    episodes?  Yes  	  No 	  If yes,  provide brief description.
                                     LF-50

-------
2.  SOFTWARE DOCUMENTATION


(a)  Does the agency have available a copy of  the AIRS  Users  Manual?

    Yes       No       Comment
(b) Does the agency have the PARS user's  guide available?   Yes 	  No

    Comment (provide guide #)  	
(c) Does the Data Management Section  have  complete  software  documentation?
    Yes        No    '    Comment
    If yes, indicate the implementation  date  and  latest  revision  dates  for
    such documentation.
(d) Do the documentation standards follow the guidance offered by the
    Software Documentation Protocols?   Yes 	  No 	

    If no, what protocols are they based on?      	
                                     LF-51

-------
(e)  What is the origin of the software used to process air monitoring data
    prior to its release into the AIRS/NADB database?

    I.  Purchased.?  Yes 	 No 	;  Supplier	

        Date of latest version
   II.  Written in-house? Yes 	 No 	;  Latest version

        Date
  III.  Purchased with modifications in-house?   Yes 	  No

        Latest version 	  Date 	

   IV.  Other (specify) 	
(f)  Is a user's manual  available to data management personnel for all software
    currently in use at the agency for processing SLAMS/NAMS data?

    Yes       No        Comment
(g)  Is there a functional  description either:

    included in the user's manual?  Yes       No
    separate from it and available to the users?  Yes 	  No
(h) Are the computer system contents, including ambient air monitoring data,
    backed up regularly?  Briefly describe, indicating at least the media,
    frequency, and backup-media storage location.
                                     LF-52

-------
(i)  What is  the  recovery capability (how much time and data would be lost)  in
    the event of a significant computer problem? ____________
(j)  Are test  data  available  to  evaluate the integrity of the software?

    Yes 	   No •	   Is it properly documented?  Yes 	  No	
                                    LF-53

-------
                                                            Section No. 2.0.11
                                                            April 30, 1985
                                                            Page 138

Reference                        Report Title

EPA-600/4-77-027a                QA Handbook for Air Pollution Measurement
May 1977                         Systems, Vol. II - Ambient Air Specific
                                 Methods

EPA-450/3-77-013                 Optimum Site Exposure Criteria for S02
April 1977                       Monitoring

EPA-450/2-76-029                 Aeros Manual Series, Vol. II - Aeros
December 1976                    User's Manual

EPA-450/2-76-005                 Aeros Manual Series, Vol. V - Aeros
April 1976                       Manual of Codes

EPA-600/9-76-005                 QA Handbook for Air Pollution Measurement
March 1976                       Systems, Vol. I - Principles

EPA-450/2-76-001                 Aeros Manual Series, Vol. I - Aeros
February 1976                    Overview

EPA-450/3-75-077                 Selecting Sites for Carbon Monoxide
September 1975                   Monitoring

APTD-1132                        Quality Control Practices in Processing
March 1973                       Air Pollution Samples

47 FR 54912, Dec. 6, 1982;       Amendments to reference methods for S02,
48 FR 17355, Apr. 22, 1983       TSP and CO in 40 CFR Part 50 Appendices A,
                                 B, and C

Proposed amendments to 40 CFR Part 58 are pending.
Proposed revisions (Handbook, Vol. II, Sections 2.0.7 and 2.0.9) are pending.

-------
3.  DATA VALIDATION  AND CORRECTION

(a) Have validation criteria, applicable to all pollutant data processed by
    the reporting organization, been established and documented?  Yes ____
    No 	

    If yes, indicate document  where such  criteria can be found (title,
    revision date).  	
(b) Does documentation exist on the identification and applicability of flags
    (i.e., identification of suspect values) within the data as recorded with
    the data in the computer files?  Yes ____  No ____

(c) Do documented data validation criteria employed address limits on and for
    the following:

    I.  Operational parameters, such as flow rate measurements or flow rate
        changes.	
   II.  Calibration  raw data, calibration validation and calibration
        equipment tests. 	
  III.  All  special  checks  unique  to  a measurement system
   IV.  Tests for outliers  in  routine data  as part of screening process
    V.  Manual checks such as hand calculation of concentrations and their
        comparison with  computer-calculated data  	
                                     LF-54

-------
(d)  Are changes to data submitted  to  NADB documented  in  a  permanent file?

    Yes 	  No 	    If  no,  why not? 	
(e) Are changes performed according to a documented Standard Operating
    Procedure or your Agency Quality Assurance Project Plan? Yes ____ No ____
    If not according  to  the  QA  Project  Plan,  please attach a copy of your
    current Standard  Operating  Procedure.

(f) Who has signature authority for approving corrections?
              (name)            •(Program Function)

(g)  Are data validation summaries  prepared  at  each critical point  in the
    measurement process or information flow and forwarded with the applicable
    block of data to  the next level  of  validation?  Yes 	  No 	

    Please indicate the points  where  such summaries are performed.
(h) What criteria are applied for data to be deleted?  Discuss briefly.
                                     LF-55

-------
(i) What criteria are applied to cause data to be reprocessed?  Discuss.

(j) Is the group supplying data provided an opportunity to review data and
    correct erroneous entries?  Yes _____  No _____  If yes, how?

(k) Are corrected data resubmitted to the issuing group for cross-checking
    prior to release?  Yes _____  No _____
                                     LF-56

-------
4.  DATA PROCESSING

(a) Does the agency generate data summary reports?        Yes _____  No _____

    Are the data used for in-house distribution and use?  Yes _____  No _____

    Publication?  Yes _____  No _____

    Other (specify) _______________

(b) Please list at least three (3) reports routinely generated, providing the
    information requested below.

         Report Title           Distribution        Period Covered

(c) Have special procedures been instituted for pollution index reporting?
    Yes _____  No _____  If yes, provide a brief description.

(d) Who at the agency has the responsibility for submitting data to SAROAD/
    NADB?  (name) _______________   (title) _______________

    Is the data reviewed and approved by an officer of the agency prior to
    submittal?  Yes _____  No _____

    (name) _______________   (title) _______________
                                     LF-57

-------
(e) Are those persons different from the individuals who submit data to PARS?
    Yes _____  No _____  If yes, provide the name and title of the individual
    responsible for PARS data submittal.

    (name) _______________   (title) _______________

    PARS data review and approval  (name) _______________

    (title) _______________


(f)  How often are data submitted to:

    AIRS? 	

    PARS? 	



(g)  How and/or in what form are data submitted?

    TO AIRS?

    TO PARS?
(h) Are the recommendations and requirements for data coding and submittal
    in the AIRS User's Manual followed closely for AIRS?  Yes _____  No _____
    Comment on any routine deviations in coding procedures.

(i) Are the recommendations and requirements for data coding and submittal
    in the PARS User's Guide followed closely?  Yes _____  No _____   Comment
    on any routine deviations in coding and/or computational procedures.
                                     LF-58

-------
(j) Does the agency routinely request a hard copy printback on submitted data:

    from AIRS/NADB?  Yes _____  No _____

    from PARS?       Yes _____  No _____
(k)  Are records kept for at least 3 years by the agency in an orderly,
  -  accessible form?  Yes 	  No 	

    If yes, does this include raw data	,  calculation 	,  QC data 	, and
    reports 	?  If no, please comment.
(l) In what format are data received at the data processing center?  (Specify
    appropriate pollutant.)

    (a) concentration units _____ (b) % chart _____ (c) voltages _____ (d) other:

(m)  Do field data include the following documentation?

    Site ID?  Yes 	  No 	

    Pollutant type?  Yes	  No 	

    Date received at the center?  Yes      No
    Collection data (flow, time, date)?  Yes 	  No 	

    Date of Laboratory Analysis (if applicable)  Yes 	  No 	

    Operator/Analyst?  Yes _____  No _____

(n) Are the appropriate calibration equations submitted with the data to the
    processing center?  Yes _____  No _____   If not, explain.
                                     LF-59

-------
(o) Provide a brief description of the procedures and appropriate formulae
    used to convert field data to concentrations prior to input into the data
    bank.

    SO2
    NO2
    CO
    O3
    TSP
    CH4/THC
                                    LF-60

-------
    Pb
    PM10
    Other

(p) Are all concentrations corrected to EPA standard temperature and pressure
    conditions (298 K, 760 mm Hg) before input to SAROAD?

    Yes _____  No _____   If no, specify conditions used _______________
(q) Are data reduction  audits  performed on a routine basis? Yes 	 No
    If yes,
       at what frequency?
       are they done by  an  independent group?
(r) Are there special procedures available for handling and processing
    precision, accuracy, calibration and span checks?  Yes _____  No _____
    If no, comment.
    If yes, provide a brief description:

    Span check data _______________
    Calibration data _______________
                                      LF-61

-------
    Precision data __

    Accuracy  data
(s) Are precision and accuracy data checked each time they are recorded,
    calculated or transcribed to ensure that incorrect values are not
    submitted to EPA?  Yes _____  No _____  Please comment and/or provide a
    brief description of checks performed. _______________
(t)  Is  a  final data  processing check performed prior to submission of any
    data?   Yes	   No 	

    If  yes, document procedure briefly	
    If no,  explain
                                     LF-62

-------
5.  INTERNAL REPORTING

(a) What reports are prepared and submitted as a result of the audits required
    under 40 CFR 58, Appendix A?

                  Report Title                  Frequency
    (Please include an example audit report and, by attaching a coversheet,
    identify the distribution such reports are given within  the agency.)


(b) What internal reports are prepared and submitted as a result of precision
    checks also required under 40 CFR 58, Appendix A?

                  Report Title                  Frequency
    (Please include an example of such a report.)
-------
(d) Does the agency prepare Precision and Accuracy summaries other than the
    forms for Precision and Accuracy included in Appendix A of 40 CFR 58?
    Yes _____  No _____   If so, please attach examples of recent summaries,
    including a recent Precision and Accuracy summary.

(e) Who has the responsibility for the calculation and preparation of data
    summaries?  To whom are such P and A summaries delivered?

          Name             Title      Type of Report      Recipient

(f) Identify the individual within the agency who receives the results of the
    agency's participation in the NPAP and the internal distribution of the
    results once received.

    Principal Contact for NPAP is (name, title) _______________
    Distribution is
              (name)                      (title)
                                     LF-64

-------
6.  EXTERNAL REPORTING

(a)   For the current calendar year or portion  thereof  which  ended  at  least
     135 calendar days prior  to  the receipt of  this  questionnaire,
     please provide the following  percentages  for  required data  submitted.

                         % Submitted on Time*

    Monitoring Qtr.            SO2    CO    O3    NO2    TSP    PM10    Pb

    1 (Jan. 1 - March 31)
    2 (Apr. 1 - June 30)
    3 (July 1 - Sept. 30)
    4 (Oct. 1 - Dec. 31)
 *"On-Time" = within  135  calendar  days  after  the  end of  the quarter  in which
  the data were collected.
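
 The on-time test above is simple calendar arithmetic; the sketch below is
 illustrative only (the dates are example values, and the 135-day window comes
 from the definition above).

    # Illustrative only: checks a submission date against the 135-calendar-day
    # window defined above.  Dates are example values.
    from datetime import date, timedelta

    def on_time(quarter_end, submitted, window_days=135):
        """True if the data were submitted within window_days after quarter end."""
        return submitted <= quarter_end + timedelta(days=window_days)

    print(on_time(date(1988, 3, 31), date(1988, 8, 10)))  # True: within 135 days
    print(on_time(date(1988, 3, 31), date(1988, 8, 20)))  # False: past Aug. 13 deadline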

 (b)   Identify the individual  within the  agency with the responsibility  for
      preparing the required 40  CFR  58  Appendix F and  G  reporting  inputs.

 Name                                      Title
 (c)  Identify the individual within the agency with the responsibility for
      reviewing and releasing the data.

 Name                                      Title
                                LF-65

-------
 (d) Does the agency regularly report the Pollutant Standards Index (PSI)?
     Briefly describe the media, coverage, and frequency of such reporting.

 (e) What fraction of the SLAMS sites (by pollutant) reported less than 75% of
     the data (adjusted for seasonal monitoring and site start-ups and
     terminations)?

                                FY _____

                                   Percent of Sites with <75% Data Recovery
     Pollutant                  1st Quarter  2nd Quarter  3rd Quarter  4th Quarter

     Ozone
     Nitrogen Dioxide
     Sulfur Dioxide
     Carbon Monoxide
     Total Suspended Particulates
     Lead
                                     LF-66

-------
(f) Does the agency's annual report (as required in 40 CFR 58.26) include
    the following?

                                                       YES     NO

    Data summary required in Appendix F.               ____    ____

    Location, date, pollution source and duration
    of all episodes reaching the significant harm
    levels.                                            ____    ____

    Certification by a senior officer in the State
    or his designee.                                   ____    ____
(g)  Please provide  the  dates  at  which  the  annual  reports have been submitted
    for the last 2  years.

-------
                     E.  QUALITY ASSURANCE/QUALITY CONTROL

                                     /
1. STATUS OF QUALITY ASSURANCE PROGRAM

(a)  Does the agency have an EPA-approved quality assurance program plan?

    Yes 	   No	

    If yes, have changes to the plan been approved by the EPA?  Yes _____  No _____

    Please provide:

    Date of Original Approval  	

    Date of Last Revision 	

    Date of Latest Approval 	
(b) Do you have any revisions  to your QA Program Plan still  pending?

    Yes	   No 	

(c) Is the QA Plan fully implemented?  Yes	  No 	  Comment:
(d) Are copies of the QA Plan or pertinent sections available to agency
    personnel?

    Yes _____  No _____    If no, why not? _______________


(e) Which individuals routinely receive updates to the QA Plan?
                                     LF-68

-------
2.  AUDITS AND AUDIT  SYSTEM TRACEABILITY
(a)  Does the agency maintain  a  separate  audit/calibration support facility
    laboratory?    Yes  	  No	
(b)  Has the agency documented  and  implemented  specific audit procedures?

    Yes       No
(c) Have audit procedures been prepared in keeping with the requirements of
    Appendix A to 40 CFR 58?    Yes _____   No _____

    If no, comment on any EPA-approved deviations.
(d) Do the procedures meet  the  specific requirements for independent standards
    and the suggestions regarding  personnel  and  equipment?  Yes	  No 	
    Comment:
(e) Are SRM or CRM materials  used  to  routinely certify audit materials?

    Yes 	  No 	


(f) Does the agency routinely use NBS-SRM or CRM materials?  Yes _____  No _____
    For audits only? _____  For calibrations only? _____  For both? _____  For
    neither; secondary standards are employed _____
                                     LF-69

-------
(g) Please complete the following table to summarize the auditing method for
    CO, SO2, NO2, and O3 analyzers and high-volume samplers.

                                                               Audit
   Pollutants                     Audit Method                 Standard

      CO

      O3

      NO2 (continuous)

      SO2 (continuous)

      NO2 (bubbler)

      SO2 (bubbler)

      TSP
                                     LF-70

-------
(h) Are SRM or CRM materials used to establish traceability of calibration
    and zero/span check materials provided to field operations personnel?

    Yes _____  No _____

(i) Specifically for gaseous standards, how is the traceability of audit
    system standard materials established?  Are they:

    purchased certified by the vendor? _____
    certified by the QA support laboratory which is part of this agency? _____
    Other?  (Please comment briefly below.)

(j) Are all agency traceability and standardization methods used documented?

    Yes _____  No _____   Indicate the document where such methods can be found.
                                     LF-71

-------
(k) Do the traceability and standardization methods conform with the guidance
    of Section 2.0.7, Vol. II, of the Handbook for Air Pollution Measurement
    Systems?

    For permeation devices?  Yes _____  No _____

    For cylinder gases?      Yes _____  No _____


(l) Does the agency have identifiable auditing equipment (intended solely for
    audit use)?

    Yes _____  No _____  If yes, provide specific identification.

(m) How often is auditing equipment certified for accuracy against standards
    and equipment of higher authority?

(n) As a result of the audit equipment checks performed, have pass/fail
    (acceptance) criteria been decided for this equipment?  Indicate what
    these criteria are with respect to each pollutant.  Where are such
    criteria documented?

               Pollutant                    Criteria
                                    LF-72

-------
3.  NATIONAL PERFORMANCE  AUDIT  PROGRAM (NPAP) AND ADDITIONAL AUDITS

(a) Identify the individual  with  primary  responsibility  for  the  required
    participation in the  National  Performance Audit  Program.

    For gaseous materials?  (name, title) _______________
    For laboratory materials?  (name, title) _______________

(b) Does the agency currently have in place any contracts or similar
    agreements, either with another agency or an outside contractor, to
    perform any of the audits required by 40 CFR 58?

    Yes _____  No _____   Comment: _______________

    If yes, has the agency included QA requirements with this agreement?
    Yes _____  No _____

    Is the agency adequately familiar with their QA program?
    Yes _____  No _____

    Date last systems audit was conducted? _______________

    By whom? _______________
                                     LF-73

-------
(d) Please complete the table below.

    Parameter Audited                              Date of Last NPAP

    SO2 (continuous)

    CO

    Pb

    ReF Device

    SO2 (bubbler)

    NO2 (bubbler)

(e) Does the agency participate in the National Performance Audit Program
    (NPAP) as required under 40 CFR 58, Appendix A?   Yes _____  No _____

    If no, why not?  Summarize below.
                                     LF-74

-------
4.  DOCUMENTATION AND DATA PROCESSING  REVIEW

(a) Does  the agency periodically review its  record-keeping  activities?

    Yes	  No 	

    Please list below areas routinely covered by this review, the date of the
    last review, and changes made as a direct result of the review.

    Area/Function     Date of Review     Changes?     Discuss  Changes

    _____	     	       Y/N        	

    	.	   '    Y/N        	

    	     	       Y/N        	

    	     	       Y/N        	

    	     	       Y/N	


(b) Are data audits (specific re-reductions  of stripcharts  or  similar
    activities) routinely performed for criteria pollutant  data reported  by
    the agency?  Yes 	  No	

    If no, please explain. 	
(c) Are procedures for such data audits documented?  Yes _____  No _____
                                     LF-75

-------
(d)  Are they consistent  with  the  recommendations of Sections 2.3-2.9 of
    Vol.  II  of  the  QA  Handbook for Air Pollution Measurement Systems?

    Yes	 No 	  If  no,  why  not?   •	
(e)  What is  the  frequency and  level (as a percentage of data processed)  of
    these audits?

    Poll.   Audit  Freq.   Period of Data Audited    % of Data Rechecked
(f) Identify the criteria for acceptable/nonacceptable results from a data
    processing audit for each pollutant, as appropriate.

    Pollutant        Acceptance Criteria      Data Concentration Level
(g)  Are procedures  documented  and  implemented for corrective actions based on
    results  of data audits which fall outside the established limits?
    Yes	 No 	

    If  yes,  where are such corrective action procedures documented?
                                    LF-76

-------
5.  CORRECTIVE  ACTION SYSTEM
(a)  Does the agency  have  a  comprehensive Corrective Action program in place
    and  operational?   Yes 	  No 	


(b)  Have the procedures been documented?  Yes	  No 	  As  a part of the
    agency  QA Plan?  Yes 	  No	  As a separate Standard Operating
    Procedure?  Yes 	  No 	  Briefly describe it or attach a copy.
(c)  How is responsibility  for  implementing corrective actions on the basis  of
    audits, calibration  problems, zero/span checks, etc. assigned?
    Briefly discuss.
(d) How does the agency follow up on implemented corrective actions?
                                    LF-77

-------
(e) Briefly describe two (2) recent examples of the ways in which the above
    corrective action system was employed to remove a problem area with:

    I.  Audit Results:

   II.  Data Management:
                                     LF-78

-------
6.   AUDIT RESULT  ACCEPTANCE  CRITERIA

(a) Has the agency established and has it documented criteria to define
    agency-acceptable audit results?  Yes _____  No _____

    Please complete the table below with the pollutant, monitor and
    acceptance criteria.
         Pollutant                    Audit Result Acceptance Criteria

           CO

           O3

           NO2 (continuous)

           SO2 (continuous)

           NO2 (bubbler)

           SO2 (bubbler)

           PM10

           TSP
(b) Were these audit criteria based on, or derived from, the guidance found
    in Vol. II of the QA Handbook for Air Pollution Measurement Systems,
    Section 2.0.12?  Yes _____  No _____

    If no, please explain.
    If yes, please explain any changes or assumptions made in the derivation.
                                     LF-79

-------
(c) What corrective action may be taken if criteria are exceeded?  If
    possible, indicate two examples of corrective actions taken within the
    period since the previous systems audit which are based directly on
    the criteria discussed above?

    Corrective Action #1
    Corrective Action #2
(d) As a goal, the 95 percent probability limits for precision (all pollutants)
    and for TSP and PM10 accuracy should be less than ±15 percent.  At 95
    percent probability limits, the accuracy for all other pollutants should be
    less than ±20 percent.*  Using a short narrative and a summary table,
    compare the reporting organization's performance against these goals over
    the last year.  Explain any deviations.

NOTE:  Precision and accuracy are based on reporting organizations; therefore
this question concerns the reporting organizations that are the responsibility
of the agency.  A copy of a computer printout has been provided which contains
the precision and accuracy data submitted to EMSL for each of the agency's
reporting organizations.  The printout, containing at least the last four
completed calendar quarters of precision and accuracy data, was obtained from
the NADB program AMP240.  These data should be verified using agency records.
If found in error, please initiate corrections.  Based on the data, indicate
the number of reporting organizations meeting the goals stated above for each
pollutant by quarter.

     (Report level 2 checks unless otherwise directed by Regional  Office.)


*While the accuracy goals for all audit levels are important, the principal
 concerns are the audit levels that include the ambient standard or the
 levels just above the standard and just below the standard.
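
For orientation only, the probability-limit construction behind question (d)
can be sketched as follows.  This is a simplified illustration, not the
Appendix A procedure itself: it assumes the basic mean ± 1.96 × standard
deviation form applied to percent differences, and it omits the aggregation
and pairing factors that 40 CFR 58, Appendix A and the AMP240 printout
actually apply.

    # Illustrative only.  The governing formulas are those of 40 CFR 58,
    # Appendix A; this sketch assumes a plain mean +/- 1.96*std-dev
    # construction on percent differences and uses made-up example values.
    from statistics import mean, stdev

    def percent_differences(indicated, actual):
        """d_i = 100 * (indicated - actual) / actual for each check or audit."""
        return [100.0 * (y - x) / x for y, x in zip(indicated, actual)]

    def probability_limits(diffs):
        """Approximate 95 percent probability limits about the mean difference."""
        d_bar, s = mean(diffs), stdev(diffs)
        return d_bar - 1.96 * s, d_bar + 1.96 * s

    # Example: five quarterly audit results for one hypothetical analyzer (ppm).
    indicated = [0.095, 0.102, 0.110, 0.089, 0.098]
    actual    = [0.100, 0.100, 0.110, 0.090, 0.100]
    lower, upper = probability_limits(percent_differences(indicated, actual))
    print(round(lower, 1), round(upper, 1))  # compare against the +/-15 or +/-20 percent goals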
                                     LF-80

-------
   I. Precision Goals

                      # of Reporting           Precision
      Pollutant       Organizations    Qtr/Yr   Qtr/Yr   Qtr/Yr   Qtr/Yr

      O3

      NO2

      SO2

      CO

      TSP

      PM10

      Pb

  II. Accuracy Goals

                      # of Reporting           Accuracy
      Pollutant       Organizations    Qtr/Yr   Qtr/Yr   Qtr/Yr   Qtr/Yr

      O3

      NO2

      SO2

      CO

      Pb
                                      LF-81

-------
(e) To the extent possible, describe problems preventing the meeting of
    precision and accuracy goals.
                                     LF-82

-------
                                                            Section No. 2.0.11
                                                            April 30, 1985
                                                            Page 137
11.8  Bibliography

Guideline documents for the SLAMS Air Program, arranged in descending
chronological order, with the most recent listed first.
Reference                        Report Title

EPA-600/4-83-023                 Guideline on the Meaning and Use of
June 1983                        Precision and Accuracy Data Required
                                 by 40 CFR Part 58, Appendices A and B

EPA-600/7-81-010                 A Procedure for Establishing Traceability
May 1981                         of Gas Mixtures to Certified National
                                 Bureau of Standards SRMs

EPA-QAMS-005/80                  Interim Guidelines and Specifications
December 1980                    for Preparing Quality Assurance Project
                                 Plans

EPA-600/4-80-030                 Validation of Air Monitoring Data
June 1980

EPA-600/4-79-056                 Transfer Standards for Calibration of Air
September 1979                   Monitoring Analyzers for Ozone

EPA-600/4-79-057                 Technical Assistance Document for the
September 1979                   Calibration of Ambient Ozone Monitors

EPA-600/4-79-019                 Handbook for Analytical Quality Control
March 1979                       in Water and Wastewater Laboratories

EPA-450/4-79-007                 Guidance for Selecting TSP Episode
February 1979                    Monitoring Methods

EPA-600/4-78-047                 Investigation of Flow Rate Calibration
August 1978                      Procedures Associated with the High
                                 Volume Method for Determination of
                                 Suspended Particulates

EPA-450/2-78-037                 Screening Procedures for Ambient Air
July 1978                        Quality Data

EPA-450/3-78-013                 Site Selection for the Monitoring of
April 1978                       Photochemical Air Pollutants

EPA-450/3-77-018                 Selecting Sites for Monitoring Total
December 1977                    Suspended Particulates

-------
                CORRECTIVE ACTION IMPLEMENTATION REQUEST (CAIR)
Reporting Organization



State or Local Agency
Deficiency Noted:







Agreed-upon Corrective Action:







Schedule for Corrective Action Implementation:








Signed 	   Director 	 Date



       	   QA Officer	Date



       	   Audit Team Member 	 Date







Corrective Action Implementation Report:
Signed 	   Director 	 Date



Signed 	   QA Officer 	 Date

-------
                                    Chapter 6

           Motor Vehicle Emissions Inspection Program Audit Guidelines

                                  FY  1988-1989
Section
Title
Page
1.0         Introduction and Purpose	    6-1

2.0         Overview of the I/M Audit	    6-2
2.1            Monitoring Program Effectiveness	    6-2
2.2            Initial I/M Audits	    6-3
2.3            Follow-up Audits	    6-4
2.4            Corrective Action		    6-5

3.0         Initial Program Audit Process	    6-5
3.1            Advance Preparation	    6-5
3.1.1             Documentation Assembly	    6-5
3.1.2             Review Operating Data	    6-6
3.1.3             Notice to Program Officials	    6-6
3.1.4             Program Questionnaire	    6-7
3.1.5             Conference Call	    6-7
3.2            Site Visit	    6-7
3.2.1             Interviews	    6-9
3.2.2             Records Review	    6-10
3.2.3             Inspection Station Visits	    6-11
3.2.4             Special Surveys	    6-12
3.2.5             Exit Meeting	    6-13
3.3            Audit Report	    6-13

4.0         Follow-Up Audit Process	    6-14
4.1            Follow-Up Audit Activities	    6-14
4.1.1             Site Visits	    6-14
4.1.2             Station Visits	    6-15
4.1.3             Review and Analysis	    6-15
4.2            Follow-Up Audit Reports	    6-15
4.3            Corrective Action	    6-15
Appendices
A.          Method for Determining Required Emission Reduction
               Compliance in Operating I/M Programs	    A-1
B.          Descriptions of I/M Program Elements	    B-1
C.          I/M Program Audit Questionnaire	    C-1
D.          Descriptions of On-Site Audit Activities	    D-1
E.          Procedures Used in OMS Tampering Surveys	    E-1
F.          Instructions and Forms for Audit Activities	    F-1
G.          Checklist for Completing the I/M Audit	    G-1

-------
              Guidelines for Auditing Motor Vehicle

                   Emissions Inspection Programs

                           FY 1988-1989
1.0  INTRODUCTION AND PURPOSE

     Motor   vehicle   emissions   inspection/maintenance   (I/M)
programs are currently  required  in  over 30 States  or  localities
under Part D of  the Clean Air Act.   At present, I/M programs are
operating  in  59  urban  areas  in 31  States  across the  country
affecting  over   45  million  vehicles.   All  but  a  few  of  these
programs  measure   the   tailpipe  emissions   concentrations  of
subject  vehicles;   many  of  them also  inspect  for evidence  of
emission  control system  tampering   or misfueling.   EPA  has  a
regulatory responsibility to  review  these  programs  in each State
for  overall  effectiveness  and  conformance  to  EPA  policy  and
State Implementation Plan (SIP) commitments.

     EPA's  auditing  of  I/M  programs  is  accomplished  through
routine monitoring  of  program operating  data,  periodic  on-site
program audits  (or  program evaluations),  and targeted follow-up
audits.   The primary purpose  of  this  report  is  to present I/M
auditing guidelines  for use by  EPA  personnel  involved  in these
monitoring and  evaluation  efforts.   Both  emissions  inspections
and  tampering/misfueling inspection elements  are covered by the
guidelines.   Since  all  but  four  of  the  operating I/M  programs
have undergone full  initial  on-site  audits, most of EPA's future
evaluative efforts  will  emphasize  monitoring  of  programs  with
operating problems.   The  audit guidelines  are designed to ensure
that  EPA's  audits  of  the  I/M  programs  are comprehensive  and
consistent.

     The primary  purpose  of EPA  I/M auditing  is  to  ensure that
each State or  locality  is effectively  implementing and enforcing
its  I/M program  in  a  manner  consistent  with  its SIP.   As  a
result  of  the  audit, both  EPA and  the State or  local  agencies
involved  will  be  able  to  determine  what,  if  any,  program
improvements  may be  required  to  allow  SIP  goals  and  commitments
to be met.  EPA  reviews the design  and operation of I/M programs
across  the  country,  and  has the   opportunity   to  observe  the
strengths and weaknesses  of the various programs.  As  a result,
EPA  may   be    able  to   suggest   administrative   or   program
modifications  which  would increase program effectiveness.
                           6-1

-------
 2.0  OVERVIEW OF THE I/M AUDIT

     EPA must determine whether the program is being managed
 effectively to produce the required emissions reductions,
 determine whether adequate steps are being taken to enforce the
 program, and ensure that vehicles are being inspected properly
 and effectively repaired.  Each approved SIP commits to a
 certain level of I/M program effectiveness which meets or
 exceeds EPA's minimum emission reduction requirement.  A number
 of SIP design factors affect the ability of a program to be
 effective  in reducing  emissions  including:   vehicle  coverage,
 model year  coverage,  geographic coverage,  stringency,  frequency
 of  inspections,  waivers,   enforcement,  and  others.   Apparent
 deviations  from  the   SIP  will  be investigated and  documented
 through monitoring of  operating  data and on-site program visits.


 2.1  Monitoring Program Effectiveness

     The first  years  of  experience auditing  I/M programs have
 led EPA to develop indicators of program effectiveness that can
 be  used  to  monitor  progress  in  achieving  and  maintaining
 operating goals.  These indicators of program effectiveness are
 listed below (an illustrative calculation sketch follows the list):

     1)     Compliance  Rate  - The percentage of  vehicles  subject
           to the I/M  program that complete  the test process.

     2)     Initial  Failure  Rate -  The  percentage   of  vehicles
           failing the initial test.

     3)     Retest Failure   Rate   -  The  percentage  of  vehicles
            failing one or  more retests.

     4)     Waiver Rate  -   The  percentage of  vehicles  that  fail
           the initial test and  receive a waiver.

     5)     Emission  Reductions  From Waived  Vehicles  - The  mean
           initial  emission  scores  for  vehicles  receiving  a
           waiver minus the  final  test mean emission  scores  for
           these vehicles.

     6)     Repair Effectiveness   Measure  -  Mean  idle  emission
            test   scores  for  vehicles   failing  the  initial  test
            minus  the  final  test  mean  idle  emission  scores  of
            these vehicles.

     7)     Overall.  Emission   Reduction  -   An   analysis   that
           incorporates compliance  rate,  initial  failure  rate,
           waiver   rate,    and   adjusts  the    design   emission
           reductions  to reflect actual program performance.   A
           detailed  description of  this analysis  is  included in
           Appendix  A.
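
     The first five indicators are straightforward ratios and score
differences computed from the semi-annual counts listed later in this
section.  The following minimal sketch is illustrative only: the field
names and example counts are assumptions, the denominators noted in the
comments are assumptions, and indicator 7 is deliberately omitted because
it follows the Appendix A method.

    # Illustrative sketch only; not EPA's prescribed analysis.  Field names
    # and example counts are assumptions, and indicator 7 (overall emission
    # reduction) is omitted -- it follows the Appendix A method.

    def im_indicators(subject, completed, initial_fail, retest_fail, waived,
                      mean_initial_waived, mean_final_waived):
        """Indicators 1-5 as percentages and an emission-score difference."""
        return {
            "compliance_rate_pct":       100.0 * completed / subject,
            "initial_failure_rate_pct":  100.0 * initial_fail / completed,
            # denominator assumed: vehicles that failed the initial test
            "retest_failure_rate_pct":   100.0 * retest_fail / initial_fail,
            # denominator assumed: vehicles that failed the initial test
            "waiver_rate_pct":           100.0 * waived / initial_fail,
            "waived_emission_reduction": mean_initial_waived - mean_final_waived,
        }

    # Example with made-up counts and mean idle scores for one reporting period.
    print(im_indicators(subject=500_000, completed=470_000, initial_fail=94_000,
                        retest_fail=14_000, waived=6_000,
                        mean_initial_waived=4.2, mean_final_waived=3.1))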

     Tracking these  seven   indicators  on  a  routine  basis allows
early identification of program operating problems.   In order to
                           6-2

-------
accomplish this  tracking,  I/M programs will need  to  provide EPA
with the following data on a semi-annual basis:

     1)    Number of vehicles required to be inspected.
     2)    Number  of  vehicles  receiving  initial  inspection  by
           model year.
     3)    Number of  vehicles  failing the  initial emission test
           by model year.
     4)    Number of vehicles failing each retest by model year.
     5)    Number of  vehicles  passing,  failing,  or not  equipped
           for each  tampering  component for which SIP  credit is
           claimed by model year.
     6)    Number of vehicles receiving waivers  by model year.
     7)    Mean scores  for CO and  HC of  passing, failing,  and
           waived vehicles by initial test and final retest.
     8)    Number  and  percentage  of  analyzers  malfunctioning
           (out of calibration, leaks, etc.) during audits.

     In  addition,   decentralized  programs  must  provide  the
following summary statistics on a  semi-annual basis:

     9)    Number of  vehicles  inspected  by station and  by model
           year.
     10)   Number of  vehicles  failing the  initial  emission test
           by station and model year.
     11)   Number of  vehicles  failing the  retest  by  station and
           by model year.
     12)   Number of  initially failed vehicles  receiving waivers
           by station and by model year.
     13)   Number of  vehicles  passing, failing,  or not  equipped
           for each  tampering  component for which SIP  credit is
           claimed by model year  and station.
     14)   Number of station audits conducted.
     15)   Number  and types  of   covert  surveillance  activities
           conducted.
     16)   Number  of  facility  licenses  outstanding  at  end  of
           reporting period.
     17)   ID numbers of  stations warned,   suspended, or revoked
           for violation of rules  by type of violation.

     Headquarters  staff  is responsible  for analyzing  the  I/M
operating  data  and  identifying  potential  operating  problems.
Regional  offices  are  responsible  for  obtaining  semi-annual
reports and planning  and  implementing appropriate  actions  (e.g.,
assessing problems  through an on-site  visit;  requesting further
information;  calling  for  special  studies,  surveys or  corrective
action).


2.2  Initial  I/M Audits

     The initial I/M audit is conducted after a  program has been
in  operation  for  at  least one year.  In  addition  to  reviewing
program operating  statistics,  all design  parameters  of  the I/M
program  which  affect  the   program's   ability  to  meet  legal
                           6-3

-------
requirements  of  the  SIP  and  the  Clean  Air  Act  should  be
reviewed.   Comparing the  design  of  the program in the SIP to its
actual implementation allows  EPA to evaluate  whether  applicable
laws,   regulations,   and  procedures   are  being   administered
properly  and  are  resulting   in  expected  emissions  reductions.
Program parameters include:

     1)    Inspection test procedures,
     2)    Emission standards (cutpoints),
     3)    Inspection station licensing requirements,
     4)    Analyzer specifications,
     5)    Maintenance/calibration requirements,
     6)    Quality control procedures,
     7)    Audit/surveillance procedures,
     8)    Internal control  systems  (quality assurance),
     9)    Enforcement  procedures,
     10)    Vehicle coverage  considerations,
     11)    Waiver procedures,
     12)    Consumer assistance and protection,
     13)    Mechanics training, and
     14)    Data  recording and analysis  procedures.

     Detailed  descriptions   of  each   I/M   design   element  are
provided  in  Appendix B.  As well  as   affecting  overall  program
operations,  many  of  these  design   parameters   influence  the
availability  of   the   emissions   performance  warranty   in  a
particular State or locality.

     It is critical during an initial  audit  to determine  whether
the various elements of  the  I/M  program, as designed and approved
in the SIP, are  actually  being carried out.   At a  minimum,  the
initial evaluation must  determine:

     1)    Whether  the   program   is  being  adequately  enforced
           (i.e.,  whether   all   subject   vehicles   are   being
           inspected).
     2)    Whether inspection standards and  vehicle coverage are
           adequate.
     3)    Whether vehicles  are  being   inspected  properly (i.e.,
            according to  established procedures,  using  the proper
            cutpoints,  etc.).
     4)    Whether  vehicles   identified  for  repair  are  being
            repaired effectively.


2.3  Follow-up Audits

     The  purpose of a follow-up  audit  is  to assess  progress made
in correcting operating  problems identified  during the  initial
program audit or  through monitoring of program indicators.   The
form of the follow-up audit  will depend on the type of  problems
identified for  correction.    The follow-up  audit  may include  a
site  visit,   analysis   of   reported   data,  review  of   amended
procedures   and   regulations,   or   other   activities.    EPA
headquarters  staff  and  Regional  staff will  determine the  form
and  substance  of  follow-up  audits   and  audit  documentation
(reports)  on a case-by-case  basis.

                            6-4

-------
2.4  Corrective Action

     Design   flaws   or   implementation  deficiencies   that   are
serious  enough to  cause  emission  reduction  benefits  to  fall
below the minimum  requirements  will  trigger a formal EPA process
to   bring  about resolution  of  the problems.   Once  an  emission
reduction  deficiency  is identified,  the  Regional  Office  will
notify the Governor of the need for  a  corrective  plan.   The  plan
should specify the  steps the State plans to take  to correct the
deficiencies.   To  be  acceptable,   the  plan  should  include  an
expeditious  implementation   schedule,  contain specific  measures
to   address  the problems  causing  the  shortfall,  and  have  a
reasonable  chance  for success.  The Regional Offices  will  have
the   lead   in  obtaining   corrections  from   the  States   and
localities.  The Office  of Mobile  Sources  will  provide technical
and policy  support.   Failure to comply  with the request  for
corrective action, or  failure  to  resolve operating problems,
could  result  in  initiation  of sanctions  available  under  the
Clean Air Act.
3.0  INITIAL PROGRAM AUDIT PROCESS

     The  initial  program  audit  is  comprised  of  four  basic
elements:    advance  preparation,  audit  visit,  audit  report,  and
follow-up actions.

3.1  Advance Preparation

     Preparation   for   the   audit   allows   the   auditors   to
familiarize  themselves  with  the  design  and  operations  of  the
program under review and to identify  those particular aspects of
the  program which  may need  special emphasis  during  the audit
visit.   Proper preparation will  allow the auditors to  use their
time more  efficiently  and will reduce the disruption of the I/M
program during the audit.

     The   auditors  assigned    to    perform   the   audit   must
collectively  possess  as  much  knowledge  as  possible  about  the
operations  of  I/M  programs  in  general  and  about the  specific
details  of  the   program  under  review.    The  goal  of  audit
preparation  is   to  determine   the   potential   strengths   and
weaknesses  of  the programs  so  that the audit can  be focused and
efficiently conducted.

3.1.1      Documentation Assembly

     The first  step in preparing  for the  audit  visit  includes
acquisition  and  assembly  of  current  versions  of  the  basic
documents  associated  with  the  'program.   This   includes  EPA
documents  such   as  SIPs,   letters   and   memoranda,   and  State
documents such as  legislation,  rules  and  procedures  manuals.
The  types  of documents  that  the Regional Office  needs to obtain
from the State for this phase of the audit include:
                           6-5

-------
     1)    Rules and regulations.
     2)    Analyzer specifications.
     3)    Quality control procedures and forms.
     4)    Quality assurance procedures and forms.
     5)    Test forms,  waiver forms,  repair forms.
     6)    Enforcement  procedures.
     7)    Mechanic and inspector  training materials.
     8)    Inspector and station licensing requirements.
     9)    Current contracts.
     10)   Public awareness materials.

Copies  of  these  items  will  be  needed  for  each  EPA  office
involved in the audit.

     Other  sources  of   information  in  preparing  for  an  audit
visit may include:  periodic operating reports  produced  for EPA,
documentation    of   previous    audits    or    investigations,
correspondence,  formal  reports to  agency  heads,  governors,  or
legislatures.   Correspondence   to  and  from  program  officials,
citizens,  and  other interested parties  should  be  reviewed  to
determine  what,  if  any,  issues   may  already  have  been
identified  or   addressed.    It  is  particularly  important  that
auditors  try  to be  aware  of  any  special  sensitivities   in  a
specific State  or locality revealed in previous correspondence.


3.1.2      Review Operating Data

     The next  step  in  audit  preparation  is to review the program
operating  data  that  should  be provided  to  the   agency  on  a
semi-annual basis.  OMS  will conduct  an  analysis of the  emission
reduction  benefits  from  the   program.    This  will  allow  the
auditors  to  identify  in advance  areas  of concern  on which the
audit should be focused.
3.1.3      Notice to Program Officials'

     After  audit  preparation has  progressed  to the  point where
an audit visit can  be  scheduled,  the EPA Regional  Office should
send  a  formal written  notice of  the  audit  to the  appropriate
State  and   local  officials.  The  written  notice  should  allow
ample  lead time  (about  60 days  should be  sufficient  in  most
cases) to schedule a mutually convenient time  for  the site visit
and  to  complete  pre-visit  preparations.    The  formal  notice
should  also  specify, whenever  possible, those individuals  who
will   comprise   EPA's   audit    team   and   what   State/local
organizations  should be represented.  Finally,  the formal notice
should  identify  any  special issues  raised  during  the advance
preparation.

     When  agencies  in  addition  to  the  air  planning  agency  must
be included in the  audit,  the Regional  Office will determine how
to  secure  their   involvement and  cooperation,  and  will  notify

                            6-6

-------
each  of them of  the  upcoming site  visit.   In  addition,  the Regional
Office  will provide  notice to  the Office of  Mobile Sources of
scheduled audit visits at  least  60 days in advance.


3.1.4       Program Questionnaire

     To    facilitate    audit   preparation,    an   I/M   program
questionnaire  has   been   developed   (see  Appendix   C).    This
document  summarizes  relevant design and operating aspects of  the
I/M program.   The questionnaire addresses design  aspects  of  the
program  and actual  operating experience.   It is recommended that
the Regional Office  send  a  blank copy of  the  questionnaire  for
the State or local officials to  complete  in  advance of the site
visit    along   with   the   formal    notification.    Once    the
questionnaire  has   been   completed  and  submitted  to  EPA,   a
conference  call  could be  held  with  Regional Office  staff,  OMS
staff  and the  State or local agency  to  discuss any answers that
require  further   discussion.  This process  can be  continued or
completed at the  entrance  interview.


3.1.5       Conference Call

     The  conference  call  should  take  place  about   two  weeks
before  the  audit.   The  purpose of  the  call  is  to  make final
arrangements,   assure  understanding   of   plans,   resolve   any
outstanding questions  and obtain any  additional information.  In
some  cases, EPA  may  request certain  information  not covered by
the questionnaire or  other  information submitted  for  review in
order to  complete audit preparations.


3.2  Site Visit

     The  site  visit  is  for the purpose  of  investigating   and
documenting whether  the  program  is being  properly  administered
and  enforced,  according   to  established  laws,  regulations,   and
procedural  requirements in the SIP.

     Field  observations may  be necessary to determine whether:

     1.     Vehicles  are  being  tested properly  and  the  results
            are being  reported correctly.
     2.     Emission standards are being properly applied.
     3.     Licensing  requirements are  being met.
     4.     Analyzers  are  being calibrated and maintained.
     5.     Quality control procedures  are being followed.
     6.     Inspection and other  records are being  kept  properly.
     7.     Data analysis  is  being used to manage the  program.
     8.     Inspection     stations    are    receiving    adequate
            surveillance and  supervision.
     9.     Repair waivers  are being processed properly.
     10.    Owners of  non-complying  vehicles  are being  identified
            and prosecuted.

                             6-7

-------
     11.   Failed vehicles are being repaired effectively.
     12.   Consumer  assistance  and  protection  provisions  are
           being administered properly.
     13.   Mechanics training is being conducted appropriately.
     14.   Repair   information   (manuals),    newsletters    and
           brochures are available for owners and mechanics.
     15.   Complaint mechanisms are  available for consumers.

     It is also  a  means  for  identifying causes of  and  solutions
to  operating  problems  indicated  either   by  data  analysis  or
review  of  materials  submitted  prior  to  the  site visit.   The
purpose of  the  audit  visit  is  to  verify information  already
available  and to gather new information as  needed  to  satisfy the
objectives of  the  audit.  One  objective  should  always   be  to
identify  those  areas  where  EPA  can  provide   assistance  to
strengthen I/M  programs.   Sometimes such  assistance  may involve
specific aid to  a particular  State,  and at other  times it  may
involve more  general  assistance  aimed at  resolving an  overall
technical  issue.

     The audit  visit should be adequately planned  to  ensure that
all needed activities  are conducted within  the  time constraints
involved.   Generally,  a  two-  or   three-person  EPA  audit  team
should  be  able  to  complete  the on-site visit  in  three or four
days,  at times working  independently,  depending on the  size and
complexity of  the  program.  These days need not  be consecutive,
and  the Regional  Office  may  find  it  desirable  to   separate
special surveys,  records  review, inspection station  visits,  and
interviews with  officials.    The  audit  must  be  planned  and
coordinated with  State  and local agencies  in order  to  minimize
the  level  of   intrusion  and   disruption  of   normal   program
activities.

     In States  with I/M programs  operating  in multiple  urbanized
areas,  it  will usually  be necessary to visit several I/M  areas.
This is because  each  urbanized  area will usually have a separate
manager, potentially different enforcement  agency  practices,  and
potentially different  repair  industry competence  and  enthusiasm
for the I/M  program.   Careful  planning  will again  be  necessary
to  schedule  multiple  city  visits,  maximize  the  audit  team's
efficiency,   and  minimize   the   impact   on   State  or   local
operations.   It  may also  be possible  to  utilize  contractor  or
other third-party support  for  certain  types of  audit  activities
such  as  roadside  surveys,   if   acceptable to  State  or  local
officials.

     A formal  record  of  the  auditors'  work  should be  compiled
for the audit  file.   Each auditor should keep  detailed  notes  on
persons contacted  or  interviewed and  information  received,  its
source, and  when  received.    Sample  forms  for  such  notes  are
included in Appendix F.

     Auditors  should be alert  to  situations which  do not  appear
to be  in  keeping with  program  procedures  or  regulations  in the
SIP; such  situations   should  be  investigated  during  the  audit

                           6-8

-------
visit to  the  extent  possible.   Auditors should also be  alert  to
situations that  could be indicative of  fraud,  abuse,  or illegal
acts;  these   situations  may  be  reported  to  program  officials
rather  than  being  investigated  during  the  audit.    Finally,
auditors  should  be  alert  to  signs  that  their  visit  may  have
resulted  in  observed  behavior  not  typical  of  normal  program
operations.   Auditors  should  select  specific  facilities  and
personnel  for observation  without advance  notification to  the
State or local agencies.  This  is often  done  in conjunction with
the  entrance  interview during  which  the audit team chooses  the
stations to be visited in. cooperation with program officials.

     An  important  part of the  audit  visit  is  an evaluation  of
the   internal  control   system   applicable   to  the   program,
organization(s),   and  activities  under  review.   Each  State  or
locality  operating  an  I/M  program  should  have an  established
methodology and capability  for  evaluating program operations  in
order  to  assess  program   results.    Through  this  system  of
administrative   controls,   the   State   or   locality   should
continuously  or  periodically compare  actual  program  operations
to  intended   design  and  overall  goals.   The  emphasis  in  this
system  should be  to  identify  problems  and  to  take  corrective
action.    Internal  audits are  usually  an  integral   part  of  the
internal control  system.

     In  the  following  sections,  specific activities of  the  I/M
audit  visit   are   listed   and  discussed.    A  more   detailed
discussion of these  activities  is  found  in  Appendix  D.   The
purpose  of each  activity is  to  collect  information  related  to
one  or   more   of   the  audit  topics  listed  in  Section  2.0  and
discussed in detail in Appendix B.

3.2.1      Interviews

     The  following State or local officials  or  their  designees
must  be  interviewed   during  the  audit  visit:  the  air  planning
agency officials  with  responsibility for  mobile sources,  the  I/M
manager  (whether  employed by the air agency  or another  agency),
and  operations  personnel   with   close  knowledge  of   current
practices  and experiences in  the areas of  enforcement,  quality
control,   repair  waivers,  data  analysis,  and  mechanic  training.
The  I/M  manager  may  be  able to  discuss  all  of  these areas,  or
additional staff  may  need to be included.

3.2.1.1  Entrance Interview

     The starting point for  the audit visit  should be  an initial
meeting  with  State and  local officials  involved in air  quality
planning  for  mobile   sources   and,   if  different,   officials
involved  in  the program.   The  agenda for  this  interview  should  be
finalized at  the  pre-visit   conference  call  and  should  be  based
on  the  questionnaire and the findings  of  any on-site  activities
which have preceded  the  site visit.   In  addition to  discussions
of  program design  and organization,  this meeting should focus on
any  problem  areas in  the  program which State/local  officials

                           6-9

-------
have  alrea.dy  identified  and  their  plans  for  resolving  them.
Also  any   planned  program  modifications,   improvements,   and
expansions  should  be discussed.   Finally,  the  interview  should
answer any outstanding questions posed prior to the site visit.

     Discussions with  the I/M manager and other  program operations
personnel  should  focus  on  operational  aspects  of  the  program
including:   compliance   rates;   failure   rates;   waiver  rates;
repair costs;  mechanic  training efforts;  complaints  (types,  how
resolved, etc.); internal control  efforts  (surveillance,  results
of station  audits,  data  analyses,  investigations  using  unmarked
vehicles,  etc.);  and  enforcement  aspects  (procedures,  results,
etc.).  Again  known problems and  planned  resolutions should  be
discussed.   In some cases,  EPA may  request  the  State  or  local
officials to make a formal presentation, or have any contractors
make  a  presentation,  at  the  entrance interview on a particular
topic or on the program as a whole.


3.2.1.2   Other Interviews

     In   addition   to  interviewing  agency  officials  with  direct
involvement in the I/M program,  it  may  be  desirable to interview
non-program  individuals  with knowledge of the  I/M  program and
differing perspectives on  its operation.   Useful  interviews can
be  held with  contractor representatives,  auto  club officials,
service  industry association  officials, auto  dealers association
officials,  consumer  agency  officials,  instructors  of mechanics,
and  persons  in  other  related  roles.   These  people  can  be
interviewed by telephone or at a time other than the site visit.


3.2.2      Records Review

     Another important phase  of  the site  visit is  the  review of
records   relevant   to   the  I/M  program.   These   records  include
inspection  records,  waiver  records,  audit records,   and  covert
surveillance  records.    To  the extent possible  and  practical,
copies of  these  records  should be obtained  for  careful  review
before or after the on-site visit.

     Inspection  records   should be  reviewed  to  determine  what
data are collected  and how the  data  are  used.   When inspection
records  are  kept  manually,  it is  important to  review samples of
the records for completeness, legibility,  accuracy of inspection
standards,  reasonableness  of test scores,  and  accuracy of  the
pass/fail decision.

     Waiver records should be reviewed  to  determine that waivers
are  being  processed  in  compliance with  waiver  criteria.   The
review  should  also determine how  many waiver  applications  are
being  received,   approved,   and  denied;   the  extent  to  which
waivers  are denied because of inappropriateness  of repairs;  and,
the  extent  to which  waiver  transactions  are tracked  by repair
facility (where appropriate).

                           6-10

-------
     Audit records should  be  reviewed  to determine that thorough
audits are  being  conducted and  that  problems are  resolved  when
found.  The  review should determine whether  the  audit frequency
and the procedures used  are  in accordance with SIP commitments.
This  may  include  reviewing   records  kept  by station  auditors,
records  kept  in   the  stations,  and/or  records   maintained  by
program management.

     Covert surveillance records  should  be  reviewed to determine
the nature  of  surveillance efforts,  the  information  collected,
and  the  types  of  actions   taken  for  various   findings.    An
assessment should be made to  determine:

     1)    If   enough   surveillance   is   being    performed   to
           reasonably identify problems.
     2)    The methods and standards  used to  gather  information
            and  determine  the need  for  enforcement  action  are
           adequate to insure correction of  operating problems.
     3)    The   process   and    procedures    for   implementing
           enforcement  are  timely  and   effective  and' are,  in
           fact, being applied  (i.e. suspensions  and revocations
           are occurring).                       •


3.2.3      Inspection Station Visits

     During the site visit,  all  types  of inspection stations and
other  licensed  facilities  should be  visited, including  regular
inspection stations, fleet stations, referee  or  waiver stations,
reinspection stations,  or  any other type of  station  conducting
initial inspections  or  retests.   The  emphasis,  however,  should
be  on  the types of  stations  that  inspect  the  majority of  the
vehicles.   Depending  on  the   type of  station, any  or  all  of the
following activities may be part of the visit:

     1.    Conduct or  observe  an  audit  of   emission  analyzers,
           using span gas.
   •  2.    Observe     exhaust    emission     inspections     and
           anti-tampering inspections,  as appropriate.
     3.    Observe waiver processing.
     4.    Check  inspection,  enforcement, calibration  and  audit
            records.
     5.    Interview station  personnel.

     It is  preferable  to  begin  each  station  visit  by observing
the normal  practice  of  the  State auditor.    Appendix  F includes
forms  for observing  the  State  auditor  and  inspection  station
performance.   Once  the  State  auditor  is  finished,  the  EPA
auditors  should  begin  their  audit.    Forms  are  provided  in
Appendix  F  for EPA activities  as well.  Decentralized stations
are  generally  required  to keep inspection records  and analyzer
calibration records; these records  should be  reviewed  during the
station visits.
                            6-11

-------
     The number  of station  visits will  vary  according  to  the
type and size of  the  I/M program under evaluation.   In  the  case
of  a  centralized  program  with only  a  few   (less  than  five)
inspection stations,  it  may  be  possible to visit all or  most  of
the  facilities.    In   larger   centralized  programs   and   in
decentralized programs,  only a fraction of  the stations can  be
visited.  The  following guidelines  should be  used  in  deciding
how many and which stations to visit.

In centralized programs:

     1.     Visit at least  three  stations;  more  should be visited
           if inconsistencies are found, or when the  program has
           an extremely large number of centralized facilities.
     2.     Choose stations  with a high volume of inspections.
     3.     Choose  stations  that   represent  a  reasonable  cross
            section of  types of economic strata.
     4.     Choose at  least one station, if possible, that  has  a
            past  record  of  possible  quality  control  problems;
            i.e.,  low  failure  rates, high  level of  analyzer  audit
            failures,  etc. (based on records review).

In decentralized programs:

     1.     Visit at  least  10  stations, with  more depending  on
           the time available and the  size of the program.
     2.     Choose stations  which  represent  a  cross  section  of
           types of economic strata.
     3.     Choose  stations  which  are   the   responsibility  of
           different State/local agency field investigators.
     4.     Choose stations  which  represent  a  cross  section  of
           different  types   of  businesses   (service  stations,
            independent  garages,  auto  dealers,  chain   service
           centers,  etc.)
     5.     Choose stations  which  represent  a  cross  section  of
           types/makes of analyzers.

     In   decentralized   programs,    the   EPA   auditors   should
accompany  at  least  two State/local   field  auditors.   The  EPA
auditors may  choose  which  State/local auditors are  accompanied
but must be careful to get  the right mix of types of stations.


3.2.4       Special  Surveys

     In  some  cases there  may be a need  for special surveys  as
part of  the audit visit  (or  at  some other time).   In areas  that
use sticker  enforcement,  a  sticker  compliance  survey  must  be
conducted.   Such a survey  involves  visiting  a  number of  parking
lots  and  walking  along  streets  of  parked  cars   to observe  a
sample  of  vehicles and  collect data  on  sticker   compliance  or
noncompliance.
                           6-12

-------
     Other   potentially  useful   data   gathering   surveys   are
emissions  testing surveys  and tampering  surveys.   Such surveys
involve collecting emissions short test data and/or tampering
data on  a  representative group of local vehicles.  Tampering and
emission surveys  are typically conducted  by EPA  in cooperation
with State  and local officials  as part of  a mandatory roadside
pullover program.  OMS is interested in working with State and
local officials to design and conduct such surveys.  A
description of the procedures currently used in OMS tampering
surveys is included in Appendix E.


3.2.5      Exit Meeting

     The exit  meeting  is  usually the  last  scheduled  activity
during  the  site  visit.   The purpose of  the  exit  interview is to
inform agency officials of the preliminary findings of the
audit.  The audit team should meet prior to the exit interview
to discuss the observations, opinions, and conclusions of
individual auditors.  The limited time available during the
audit visit does not, however, allow all information and data to
be fully evaluated prior to the exit interview.  In some cases,
aspects of the program may necessitate computer modeling
analyses before a final evaluation can be made.

     Despite  limitations,  the exit  interview should be  used to
convey as much information  as  possible  to  program officials.  At
a  minimum,  they  should be briefed  on the  activities  that  were
conducted during the audit visit and  on any  follow-up activities
that will  be needed to  supplement the  audit  visit.  If definite
problems  are  identified  during  the   audit,   they  should  be
discussed during the exit interview.  Discussion should also
cover suspected but unverified problems, and how they will be
further evaluated during follow-up activities.  State and local
officials should also be given a projection of when the draft
audit report will be submitted to the State for review.

3.3  Audit Report

     The EPA Regional Office will prepare an audit report which
will   document   the   findings   of    the   audit,   present   EPA
conclusions,  and  suggest  improvements.   The Office of  Mobile
Sources will  be  given   an  opportunity  to  review and comment on
the Region's  draft audit  report  prior to it  being  sent  to the
State  and  local   agencies for  review.   All official  comments
received from  the  State or local   agencies should be  appended to
the report.

     The audit  report must properly  and objectively reflect the
findings of  the  audit,   both  positive  and negative.   The audit
report should specifically:
                           6-13

-------
     1.    Summarize program operating data.
     2.    Identify the strengths of the program.
     3.    Identify   SIP   deficiencies    for    which    program
           improvements must be implemented.
     4.    Make   recommendations   for   correction   of   these
           deficiencies.
     5.    Identify aspects of the program which need further
           study, which may be potential problems, or where EPA can
           suggest minor modifications which would improve
           program effectiveness or efficiency.

     The  audits  should  be of   sufficient  scope and  adequately
documented  and reported  in  such  a  way  that   the  record  will
adequately  support  EPA  follow-up  action  in  the form of  a  call
for SIP revisions  or a  finding of  non-implementation  of  the  SIP,
should  that  be  necessary.   In  some  cases,  follow-up  may  be
necessary to collect data to support such actions.  A checklist
is provided in Appendix G for  completing the I/M audit.


4.0  FOLLOW-UP AUDIT PROCESS

     The follow-up  audit process,  while  similar to  the  initial
audit process, has by necessity a flexible structure.  The
activities pursued  in  a follow-up  audit  should be  designed  to
address the particular  problems  identified in previous  audits or
problems  that have arisen  since.   In  particular,   the  audit
should  determine   what  progress   the   program  has   made   in
implementing   recommendations   and  correcting    problems.    The
following guidelines attempt to cover the major problem
groups often encountered in I/M programs, but each particular
program may require some unique attention.


4.1  Follow-up Audit Activities

     Follow-up audit activities may include any or all of the
activities required in the initial audit but generally will be
narrower in scope.  There are three basic activities which may
be involved in a follow-up audit:  site visits, station visits,
and review and analysis.
4.1.1      Site Visits

     Site  visits  are  needed  when  the program  has  instituted
significant  changes   that   require  on-site   verification   and
assessment.  Specific  activities  could  include station  visits,
record review, covert surveillance activities, sticker surveys,
and the like.
                          6-14

-------
4.1.2      Station Visits

     Station  visits  are  needed when  improper  testing,  quality
control or quality assurance  problems  exist.  If  problems  exist
with  waiver  processing  by  stations,   this   too  might  require
station visits.
4.1.3      Review and Analysis

     Site  and  station visits  can  be  omitted  in  many  cases,
especially when the problems relate to data processing and
reporting.  Also,  review  of changes  to  rules or  procedures  can
generally be  completed without a  site  visit.  Conference  calls
and  correspondence   can  be   used   to  provide  the   detailed
information needed to satisfy the follow-up audit goals.


4.2  Follow-up Audit Reports
     The  report  of   activities  and  findings  from  a  follow-up
audit should be tailored to the specific  situation.  There  is no
need to   address  all  of  the  elements  addressed  in an  initial
audit,  only those that were the subject  of the follow-up audit.


4.3  Corrective Action

     As with the initial audit, design flaws or implementation
deficiencies that are serious enough to cause emission reduction
benefits to fall below the minimum requirements will trigger a
formal EPA process to bring about a call for a corrective plan
as described in Section 2.4.
                           6-15

-------
                           APPENDIX A

             METHOD FOR DETERMINING REQUIRED EMISSION
        REDUCTION COMPLIANCE IN OPERATING I/M PROGRAMS

      APPENDIX A WILL NOT BE USED DURING THIS AUDIT CYCLE.
                               A-1

-------
                           APPENDIX B

               DESCRIPTIONS OF I/M PROGRAM ELEMENTS

 1.   Test Procedures
 2.   Emission Standards
 3.   Inspection Station Licensing Requirements
 4.   Analyzer Specifications
 5.   Quality Control Procedures
 6.   Quality Assurance
 7.   Enforcement Procedures
 8.   Vehicle Coverage
 9.   Waiver Procedures
10.   Consumer Assistance and Protection
11.   Mechanics Training
                               B-1

-------
               DESCRIPTIONS OF I/M PROGRAM ELEMENTS


1.   Test Procedures

     There  are  two  basic  types  of  tests  conducted  in  I/M
programs: tailpipe  emission tests and  a  check for  the presence
and/or  function  of  emission  control devices.   The goal  of  the
emission test  is to  provide a uniform,  reliable, simple  test  to
identify high emitters.  The goal of the emission control device
check is to identify vehicles with missing or modified emission
control devices.  Retests are required to ensure that failed
vehicles receive sufficient maintenance to reduce emissions, or
proper repairs or replacement of modified or missing emission
control components.  Emission test procedures vary from state to
state but  usually  consist  of measurement  of emissions  at  idle
and  in  some programs  at  2500 rpm,  as  well.   In  many programs
different  tests  are  conducted  on  different  model   years  of
vehicles, especially  in the case of  checks for  emission control
components.

     An  additional  variation among  State/local  programs  is  the
degree of automation involved with the test procedure.  Some
programs have  a fully automated  procedure  while  others  have a
completely manual system.

     Another important element of the test procedure is the
adherence to the requirements of 40 CFR, Part 85, Subpart W.
This section of the Code of Federal Regulations specifies the
requirements for the emission performance warranty provided in
§207(b) of the Clean Air Act Amendments of 1977.  This warranty
generally applies to 1981 and newer (1982 and newer at high
altitude) model year light-duty vehicles and light-duty trucks
and has specific requirements for test procedures which must be
followed in order for motorists to be eligible for warranty
coverage.


2.   Emission Standards

     Emission  standards  are  used to   determine  which  vehicles
pass or  fail  the emission  test.   The  State  Implementation Plan
for  each  program includes  a  target  design stringency,  which  is
the percentage of pre-1981 vehicles failing the test in the
first year of the program.  Evaluation of standards should focus
on overall failure rate and the degree to which SIP commitments
are being achieved.

-------
                                -2-
 3.    Inspection Station Licensing Requirements

     In order to achieve uniform and accurate testing, it is
necessary to assure that all inspections are conducted by
properly trained and equipped inspectors and to provide a
mechanism for accountability of inspection facilities.  In
centralized programs, both contractor-run and government-run,
where inspectors are under more or less direct control, there
may be only employee training requirements rather than
licensing requirements.  In decentralized programs, and in
fleet stations in centralized programs, EPA policy requires
that there be licensing requirements which ensure that:

      1.    All  stations  employ trained inspectors.

      2.    All stations have approved analyzers.

      3.    All    stations    keep   necessary   records     on
           inspections,   calibrations,  and   maintenance and
           agree to  make these records available to the  State
           or local  agency.

 4.    Analyzer Specifications

      Each   program   must    adopt   and    enforce  equipment
 requirements which will  provide   for  accurate and  consistent
 emission measurements.   Equipment  specifications  cover the
basic technical requirements of the analyzer (i.e., accuracy,
repeatability, drift, etc.) as well as other basic
requirements (e.g., throughput capabilities, software
requirements, durability, etc.).

      Centralized I/M  programs typically  use  analyzers   that
 are computer-controlled and feature automatic data  collection
 and decision-making.   These   systems  usually  have elaborate
 maintenance  and  calibration   networks associated  with  them.
 As  one would expect, decentralized I/M programs  have  the most
 variety  with respect to analyzer  requirements.   Some  programs
 require   computerized  analyzers,  and  others   require   only
 manual analyzers.   Some  programs have established  a  list  of
 approved  analyzers,   while    others   accept   any   analyzer
certified by the manufacturer to meet certain specifications.

      The emission  performance  warranty   regulations   include
 requirements  for   analyzer  accuracy  and  quality   control.
 These requirements  must   be  met  in  order  for  motorists
 involved in  I/M programs  to be  eligible for warranty repairs.

     Regardless of the type of analyzer or the type of
program, appropriate maintenance and calibration procedures
are needed to ensure that the analyzers yield accurate and
repeatable measurements.  These requirements involve the
introduction of span gas to check and adjust calibration and
check for leaks (some programs use vacuum decay instead),
periodic cleaning of tape drives and other analyzer parts,
and replacement of filters and other consumables.

-------
                                -3-
5.   Quality Control Procedures

     Quality control procedures should be prescribed to
ensure that analyzers are calibrated and maintained, that
inspections are conducted properly, and that inspection and
calibration records are completed properly.  In computerized
systems, some of these checks can be automated.

A comprehensive quality control system should address:

     1.     Analyzers
                 Periodic calibration checks
                 Periodic leak checks
                 Regular preventive maintenance
                 Accurately named calibration gases

     2.     Inspections
           -     Assurance that analyzer  is ready for testing
                 (warmed up, zeroed, and spanned)
                 Assurance that vehicle is warmed up
                 Assurance that there  is  no excessive exhaust
                 system leakage
                 Assurance  that  proper probe  insertion depth
                 is achieved

     3.     Pass/Fail Determinations
                 Proper vehicle and cutpoint identification
                 Safeguards to prevent mistake, fraud or abuse

6.    Quality Assurance

     Quality  assurance  procedures  are  necessary  to  assure
that prescribed regulations  and  procedures are  followed and
that  the program  is  achieving  its  purpose.   The  internal
control   system  is  totally  dependent  on  data  to  serve  as
feedback  on how  well  the  program  and  parts  thereof  are
working.   Therefore,   the   first   requirement   of  internal
control   is  to  have  an   adequate  and  functioning   data
collection  system.   To  have maximum  benefit, data  analysis
must be accurate, reliable, complete, and timely.  Data
analyses should be capable of identifying the level of
noncompliance among vehicle owners, the failure rate among
inspected vehicles of different age groups (initial
inspections separately from retests), the waiver rates, and
the quality of repairs (through comparison of initial and
retest emissions levels of failed vehicles).  It is highly
desirable that failure rates and waiver rates be calculated
and periodically reviewed at the level of the individual
inspection station or repair facility.
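
     The data analyses described above can be illustrated with a short
Python sketch.  The record layout and field names below are
hypothetical, not a prescribed reporting format, and the waiver rate
shown is computed as waivers per initial failure, which is only one
possible definition.

      from collections import defaultdict

      # Hypothetical inspection records; field names are illustrative only.
      records = [
          {"station": "S001", "test": "initial", "result": "fail", "waived": False},
          {"station": "S001", "test": "initial", "result": "pass", "waived": False},
          {"station": "S001", "test": "retest",  "result": "pass", "waived": False},
          {"station": "S002", "test": "initial", "result": "fail", "waived": True},
      ]

      def station_rates(records):
          """Compute initial-test failure rate and waiver rate by station."""
          counts = defaultdict(lambda: {"initial": 0, "fail": 0, "waived": 0})
          for r in records:
              c = counts[r["station"]]
              if r["test"] == "initial":
                  c["initial"] += 1
                  if r["result"] == "fail":
                      c["fail"] += 1
              if r["waived"]:
                  c["waived"] += 1
          rates = {}
          for station, c in counts.items():
              rates[station] = {
                  "failure_rate": c["fail"] / c["initial"] if c["initial"] else None,
                  "waiver_rate": c["waived"] / c["fail"] if c["fail"] else None,
              }
          return rates

      for station, r in sorted(station_rates(records).items()):
          print(station, r)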

-------
                                -4-
     An integral part of the internal control system is the
auditing of inspection facilities.  These audits should focus
on analyzer calibration and leak checks as well as record
checks, especially when manual records are kept.  In
decentralized programs,  investigations  with  unmarked vehicles
set to  fail  the emission and/or  tampering  test  are essential
for monitoring inspector performance.

     In those programs which rely on windshield stickers or
certificates of compliance for enforcement, an accountability
system is needed.  Each inspector or inspection station
should be required to show that the number of used stickers
or certificates corresponds to the number of inspection
passes for the period in question.  This applies to both
centralized and decentralized programs, since improper
diversion of stickers or certificates is possible in both.
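
     A minimal sketch of the accountability check described above,
assuming hypothetical per-station counts of stickers used and recorded
inspection passes for one reporting period:

      # Hypothetical per-station data for one reporting period.
      stickers_used = {"S001": 412, "S002": 238, "S003": 305}
      inspection_passes = {"S001": 412, "S002": 231, "S003": 305}

      def reconcile(stickers_used, inspection_passes, tolerance=0):
          """Flag stations whose used stickers do not match recorded passes."""
          discrepancies = {}
          for station in sorted(set(stickers_used) | set(inspection_passes)):
              diff = stickers_used.get(station, 0) - inspection_passes.get(station, 0)
              if abs(diff) > tolerance:
                  discrepancies[station] = diff
          return discrepancies

      print(reconcile(stickers_used, inspection_passes))   # {'S002': 7}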

     Another integral element of internal control is
corrective  action,  or   at   least  the  means  for   corrective
action.   In some cases,  this  would  involve  penalties  for
infractions  by  inspection stations  or  individual  inspectors.
In  other  cases,  corrective  action  might  involve  making  an
administrative change in the way records  are  kept or a change
in  the  forms   themselves.    The  ultimate  test  of  internal
control is whether problems can be identified and resolved.

7.   Enforcement Procedures

     The  goal  is  to  assure  100%   participation  of  subject
vehicles  in  the  I/M  program.   There  are  three  systems  of
enforcement currently used  in  I/M programs  (some areas  use a
combination of  these):

     1.    Registration denial systems.

     2.    Sticker based systems.

     3.    Registration data link systems.

     Registration denial systems have historically provided
the most effective and efficient means to enforce I/M
requirements.  In this system, vehicle registration is denied
unless a vehicle has complied with the I/M requirement.
Existing  penalties  for  operating  an  unregistered  vehicle
serve to  deter  noncompliance,  and  the  State  or  locality has
an  incentive  (recovery  of   lost   registration  revenue)  for
enforcing compliance.

-------
                                -5-
     Sticker enforcement systems are another widely used
enforcement method.  In this system, vehicles are issued a
window sticker as evidence of compliance.  Vehicles with
expired stickers or without stickers are subject to citation
by police followed by some sort of penalty.  Sticker
enforcement programs often suffer from a lack of police
priority and from difficulty in readily distinguishing subject
vehicles (especially in regionalized programs).

     The registration data link system is used by
several regionalized programs.  In the data-link system,
vehicles are  identified and scheduled for  inspection on  the
basis of registration  data.   Inspection  data is then reviewed
by comparing the list of vehicles scheduled  for  inspection in
a  particular   period   to   the  list of   vehicles   actually
inspected and passed.   Through this  comparison,  non-complying
vehicles and their  owners can be identified for enforcement
action.  The chief drawbacks  to this  system are  the  lag time
in data  analysis  and  the level  of  resources  which must  be
devoted to  establish the  data link,  operate  it, and  pursue
prosecution of non-complying owners.
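
     As an illustration only, the comparison described above amounts to
a set difference between the vehicles scheduled for inspection in a
period and those actually inspected and passed; the vehicle identifiers
below are hypothetical.

      # Hypothetical identifiers drawn from registration and inspection data.
      scheduled = {"ABC123", "DEF456", "GHI789", "JKL012"}
      inspected_and_passed = {"ABC123", "GHI789"}

      # Vehicles identified for possible enforcement follow-up.
      non_complying = sorted(scheduled - inspected_and_passed)
      print(non_complying)   # ['DEF456', 'JKL012']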

8.    Vehicle Coverage

     This design  area  covers  those  factors which  affect  the
fraction of total vehicles being inspected in the I/M
program.   These factors  include:  1)  geographic  coverage area,
2)  weight class or  use  exemptions,  3)  exemptions  related to
fuel type,  and 4) model year  exemptions.

     Vehicle coverage factors should be chosen such that all
or most of the vehicles that operate in the non-attainment
area would be subject to program requirements.  The
geographic area should include the commuting area for the
urbanized area(s) in question.  Similarly, weight class and
use exemptions can be set to cover the majority of gasoline-
powered vehicles.  Most I/M programs include vehicles up to
8500 pounds GVW, which covers light-duty vehicles and
light-duty trucks, both of which are used almost exclusively
for personal transportation.  Many programs require
inspections for higher weight class vehicles.

     Model year coverage is another factor
affecting program effectiveness and the distribution of the
program's impact.   Many I/M programs  inspect  all  vehicles or
all  1968  and  newer  model  years.   Other   programs  limit
coverage  to  fewer model  years.   Some programs  vary coverage
depending on the test  type,  often requiring  emission control
component checks on fewer model  years than the emission test.

     One   important    consideration   when    reviewing   all
exemptions is whether there are  loopholes in  the way they are
administered.   It   is  desirable,   for   instance,   to  have
provisions  to  prohibit,  and procedures  to  prevent,  owners
from  registering  their vehicles outside  the  I/M area.  Also,
where a weight limit exists, it is important to define the
exact basis  for the  limit (gross  vehicle weight rating, empty

-------
                               -6-
weight, or  other),  the way  the  official weight  of  a vehicle
is determined (reference book, examination, or owner
testimony), and whether evasion  through misreported weight is
occurring.    Similarly,    use   exemptions   and   fuel   type
exemptions should be examined for loopholes.

9.    Waiver Procedures

     Many  I/M programs  include waiver  provisions which  are
intended to limit  the amount of expense an owner would have
to incur as a result of failing the I/M test.  Most I/M
waivers  are  tied  to  repair  cost  ceilings.    Repair  cost
ceilings between  $50  and  $100  are  the most  common,  but both
lower  and  higher  limits  are used  in  some  programs.   Since
waived    vehicles    represent     reductions    in    program
effectiveness,   high   waiver   rates   can   be   particularly
troublesome.  Therefore,  it  is  important to  review the number
of waivers being issued as well  as the criteria' for  granting
waivers and the procedures  used in processing  waivers.   All
SIP's  were  approved  under  the   assumption  that waiver  rates
would   be   low   and  emission   reductions   would  not   be
significantly affected.

     In programs  without   emission  control   component checks,
criteria  should be  set   to  prevent  tampered  vehicles  from
getting waivers.  All programs should ensure that owners take
advantage   of   the   emission    performance   warranty,   if
available.   Procedures for processing  waivers should  focus on
verifying that  all waiver  criteria have been  met,  including
verification  that  all   repairs  claimed  toward  the  repair
ceiling  were  appropriate   and   actually   done.    Another
desirable safeguard is  to have  the  ability to track  waiver
rates by repair  facility  in  order  to be able to identify both
abuses and simple lack of  repair expertise.

     Particular attention should be given to waiver criteria
for, and waiver rates among, 1981 and newer vehicles.  These
vehicles are particularly susceptible to high waiver rates
because the repair industry is relatively unfamiliar with
them.  They are becoming the majority of the fleet, and good
repair practices should be encouraged before bad habits
become entrenched.

10.   Consumer Assistance  and Protection

Other  consumer  assistance  and  protection  aspects  of  I/M
programs,  in addition  to  waivers, include the following:

     1.    Referee  test   -  This  is  an  EPA requirement  for
           decentralized    programs   but  may   be  found   in
           centralized programs,  too.

     2.    Complaint   handling    -   Procedures   should   be
           established for investigating complaints.

-------
                                -7-
     3.    Repair  information  - Owners  of  failed  vehicles
           should receive a  brief  discussion of  the  possible
           reasons for vehicle  failure;  this  information will
           guide  owners  toward  obtaining proper  repair  at  a
           reasonable cost  and  may serve to  reduce  abuses by
           mechanics.  I/M programs can  also  go as far  as to
           publish repair  effectiveness statistics of  repair
           facilities.

11.   Mechanics Training

The  goal  of  mechanics  training   is  to   improve   program
effectiveness  and enhance   consumer  protection  by  having  a
supply  of  mechanics  trained   in  proper   emission   related
repairs.   Cost  savings  to   the  public  from more  efficient
repairs can more than offset the cost  of delivering training.

The following aspects of mechanics training are important:

     1.    Course curriculum - What is the course content?

     2.    Course distribution - How is the course delivered?

     3.    Course  promotion  -  What   is  done   to   promote
           interest and participation  by mechanics?

     4.    Course followup -  What  is  done to foster continued
           support to trained mechanics?

     Course  content   must  address  proper  analyzer  use  and
calibration,    emission  test   procedures,   procedures   for
detecting tampering  and  misfueling,  basic information  on the
types  of  I/M failures, and diagnosis  and  repair  of  excessive
hydrocarbon   and  carbon   monoxide   emissions.     Mechanics
training courses  are  generally  offered through one or more of
the following ways:

     1.    Community colleges, vocational/technical schools,
            high schools, etc.

     2.    Independent  training  agents  (private  individuals
            or firms licensed or certified to offer courses).

     3.    In-house training personnel.

Often  the  cost  and  geographic   availability  of  the  training
have a  bearing on how many  mechanics  participate.  It must be
realized that  if repair cost waivers  are readily available,
mechanics  are  not forced by customer satisfaction to  become
competent  in  emission  repairs.   I/M  programs  with  waiver
provisions,  therefore,  carry  a greater  burden  to  encourage
participation in training.

-------
                               -8-
     Training can be promoted  through mailings to garages and
contacts with garage associations, service station dealers
associations, auto clubs  and the like.   A program to identify
incompetent  and  problem  mechanics  should  be accompanied  by
efforts to get these mechanics to receive training.  Finally,
legislation to allow only trained and certified  mechanics  to
qualify  vehicles  for   repair  cost  waivers  exists  in  some
States and should be considered by all.

     Follow-up with trained mechanics can  be useful to  keep
them  interested   and   informed  of  new  issues or  additional
training opportunities.  Some ways to do this include:

     1.    Periodic newsletter.

     2.    Repair information hotline.

     3.    General mailings.

-------
          APPENDIX C
I/M PROGRAM AUDIT QUESTIONNAIRE
          C-l

-------
                     I/M AUDIT QUESTIONNAIRE
                            Page One

STATE ____________________          PROGRAM ____________________

VEHICLE COVERAGE
     Enter model year coverage for each class:   LDV    LDT1    LDT2

     Emission Test

     Anti-Tampering Test
          Catalyst
          Fuel inlet restrictor
          Tailpipe lead check
          PCV
          Evaporative canister
          Air pump

     Exemptions
          Motorcycles
          Diesels
          Other fuels
          New vehicles

     Describe any exceptions or qualifications to above answers:

GEOGRAPHIC COVERAGE
     List areas and number of vehicles requiring testing:

-------
                          INSTRUCTIONS
                            Page One
VEHICLE COVERAGE
     This section is requesting detailed  information on vehicle
coverage in the  program.   There  are three columns, one for each
vehicle weight class, defined as:
     LDV  = light duty vehicles up to 6000 lbs. GVW
     LDT1 = light duty trucks up to 6000 lbs. GVW
     LDT2 = light duty trucks from 6000 to 8500 lbs. GVW
In these columns, please enter the model  year  ranges applicable
to the  emission test and each component  of  the tampering test.
Also,   indicate  whether   the   vehicle   types   listed   under
exemptions are,  in  fact, exempt.  Enter  the model  year  ranges
of those that are not exempt.

GEOGRAPHIC COVERAGE

     This section  is requesting  information  on  the geographic
locations of  the program.    List  the  major urban  areas covered
by  the  program and  provide  an  estimate  of  the  number  of
vehicles required  to participate  in  the  program.    If possible,
break down the estimate by urban area.

-------
                     I/M AUDIT QUESTIONNAIRE
                            Page Two

TEST PROCEDURES                      YES/NO    DESCRIBE (if and when each item
                                               is used, how it is done, and
                                               any exceptions)
Emission Test
     Electric zero/span
     HC hang-up check
     Preconditioning
     Tachometer used
     Idle test
     2500 RPM test
     Loaded test
     CO2 cutpoint
     Restart test

Describe other emission test related features, problems or comments:

-------
                          INSTRUCTIONS
                            Page Two
EMISSION TEST PROCEDURES
     The  intent  of  this  section   is  to  describe  the  precise
nature  of  the  emission  test  procedures.    Enter  yes   in  the
column  labeled  "Yes/No"  if the  activity listed  is  used during
the  test  procedure.  Use  the  describe  column  to  explain  the
activity further.

-------
                     I/M AUDIT QUESTIONNAIRE
                           Page Three

TEST PROCEDURES                      YES/NO    DESCRIBE (if and when each item
                                               is used, how it is done, and
                                               any exceptions)
Anti-Tampering Test

Equipment Requirements
     Lead test paper
     Fuel inlet gauge
     Emission control component manual
     Other equipment

Repair Requirements
     Grace period
     Catalyst replacement requirement
     Other part replacement requirements
     Catalyst replaced on inlet failure and on lead test failure
     Describe verification procedures for repairs

Other Requirements (describe requirements)

-------
                          INSTRUCTIONS
                           Page Three

ANTI-TAMPERING TEST PROCEDURES

     The  intent  of  this section  is  to  describe  the  precise
nature of the anti-tampering test procedures.

Equipment Requirements

     Enter  yes  in  the  column   labeled  "Yes/No"  if  the   item
listed  is  used  during  the   test  procedure.   Use  the  describe
column  to  explain how  and  when  the  equipment is  used  and the
specifications  for  the  equipment  (e.g.,  size  and  material
requirements for fuel inlet  gauge).

Repair Requirements

     Enter  yes  in  the  column   labeled  "Yes/No"  if  the   item
listed  is  a  program  requirement.   Use  the describe  column to
explain how  and  when the requirement  applies  and other related
details.   Indicate whether  specific  parts (e.g.,   OEM  parts)
must be used for replacements.

-------
                          I/M AUDIT QUESTIONNAIRE
                                Page Four
ANALYZER SPECIFICATIONS              YES/NO    DESCRIBE (spec, how it operates,
                                               and frequency)
Computerized

Lockouts
     Warm-up time
     Leak check
     Calibration check
     HC hang-up

Automatic Features
     Cutpoint selection
     Pass/Fail decision
     Calibration adjustment
     Test data collection
     Data loss problems (indicate magnitude and problems if known)
     Q/C data collection

Manual Analyzers
     Percent of data computerized for analysis
     BAR-74 or BAR-80 specification

Other Details, Comments, or Problems:

-------
                          INSTRUCTIONS
                            Page  Four
ANALYZER SPECIFICATIONS
     The  intent of  this  section is to describe  the  features of
the emission analyzers used  in the program.  There are separate
sections for computerized equipment and manual analyzers.

Lockouts

     Indicate  whether the  analyzers  in use  prevent  official
inspections  until  the items  listed have been  satisfied.   This
applies  to  computerized  analyzers  only.   Also,  indicate  how
often leak checks and calibration checks are required, the
warm-up period and how and when HC hang-up is monitored.

Automatic Features
     Indicate whether the  items listed are automated  or manual
(at the  inspector  level).   Describe  the criteria used  for  the
first  three  items.   Describe  the  type  of  data  collected,
whether data loss  has  been a problem,   the  magnitude  of  data
loss and the reasons for  it (if known).

Manual  Analyzers

     Indicate  the  percent  of  manually  collected  data  that is
keypunched.  Note the equipment specification  requirement  and,
if possible, provide a list of  approved analyzers.

-------
                     I/M AUDIT QUESTIONNAIRE
                            Page Five

ANALYZER QC AND QA                             DESCRIBE
     Service agreement requirements
     Gas naming of calibration & audit gases

Station Gas Accuracy                 PROPANE    CO    CO2
     Concentration of station gases
     Cal check tolerances
     Station gas span frequency

Audit Gas Accuracy                   PROPANE    CO    CO2
     Concentration of audit gases
     Audit tolerances
     Audit gas span frequency

Additional features or comments:

-------
                          INSTRUCTIONS
                            Page  Five

ANALYZER QUALITY CONTROL AND QUALITY ASSURANCE

     The  intent of  this section  is  to  describe the  quality
control and  quality  assurance practices  as  they  relate  to the
emission analyzers used  in  the  program.   Provide as much detail
as  possible  about the  types of  gases,   their  accuracy,  blend
tolerance,    naming   protocol   and  about   the   tolerances  for
analyzer checks, and  frequency of audits and spanning.

-------
                         I/M AUDIT QUESTIONNAIRE
                                Page Six	
 INSPECTOR  LICENSING
YES/NO
DESCRIBE
Training  requirements:
       Analyzer use covered

   Emission  related  repairs

            Quality  control
    Periodic recertification
    or other recertification
                  required
         Number of licensed
          inspectors  (date)
         Number of licensed
            stations  (date)
   AUDIT  PRACTICES
                DESCRIBE
for past 12 months or most recent year available
         Number of auditors
     (full-time  equivalents)
           Audit frequency
Total number of overt audits
 	(give time period)
     Total number of covert
   audits  (give time period)
  Number and model years of
   undercover vehicles used
    Are undercover cars  set
       to  fail   (which tests)
Describe covert and overt audit practices:

-------
                          INSTRUCTIONS
                            Page Six
QUALITY ASSURANCE
     The  intent of  this  section  is  to  describe  the  quality
assurance practices used to license and monitor stations.

Inspector Licensing

     Describe   the   features   of   the   courses   required   of
inspectors,  including  the  source  of  the  curricula  used  in
training.  Also, describe  the  recertification  requirements both
regular and otherwise (e.g., after suspension for violation).

     Provide statistics  for number  of  stations  and inspectors
(and the date for which these numbers are valid).


Audit Practices

     In  addition  to  the   listed  questions,  briefly  describe
other details and procedures  for  both overt and  covert  audits.
Attach a copy of any related procedures manuals,  if available.

-------
                     I/M AUDIT QUESTIONNAIRE
                           Page Seven

WAIVER PRACTICES                               YES/NO    DESCRIBE
     Cost limits (dollar amounts)
     Specific repairs required (list)
     Minimum emission reduction required (percentage)
     Tampering checks conducted (list)
     Warranty eligible vehicles excluded
     Repair documentation required

Describe other waiver features:

-------
                          INSTRUCTIONS
                           Page Seven
WAIVER PROCESSING
     The intent of this section is to describe how and when
waivers  are  issued  to  failed  vehicles.    In  describing  the
listed items, indicate what the dollar  amounts  are,  which items
must  be   repaired,   what  percentage   emission  reduction  is
required,   which  emission  control  devices must  be  present  and
unaltered,  how  warranty   issues   are   dealt  with,  and  what
documentation is necessary.

-------
                     I/M AUDIT QUESTIONNAIRE
                           Page Eight

ENFORCEMENT                                    YES/NO    DESCRIBE

Sticker Enforcement
     Are all subject vehicles identifiable visually
     Is there a fine for driving without a valid sticker
     Is a grace period allowed after citation
     Is a court appearance required
     Is compliance required before case is closed
     List police agencies with enforcement authority
     Are sticker surveys conducted
     Are sticker numbers recorded on the test form
     Are different stickers used for new or exempt vehicles
     Can parked vehicles be ticketed if in violation
     Are roadside pullovers done to check stickers

Comments:

-------
                          INSTRUCTIONS
                           Page Eight
STICKER ENFORCEMENT
     The  intent  of  this  section   is  to  describe  the  sticker
enforcement mechanism,  if  used by  the  program.   Describe  when
the listed items apply, how much the fines are, how long the
grace periods are, and other details associated with each
question.  Use the space below if additional room is needed.

-------
                     I/M AUDIT QUESTIONNAIRE
                            Page Nine

ENFORCEMENT                                    YES/NO    DESCRIBE

Registration Enforcement
     Is guidance given to registrars on requirements
     Are audits conducted on registration documents
     How many subject vehicles register in non-I/M areas
     How many subject vehicles have expired registrations

Data-Link Enforcement
     How many notices of violation have been sent
     How many vehicles responded to the notice
     How many enforcement actions have been taken
     How many vehicles responded to enforcement actions taken
     Time frame for results

Additional Comments:

-------
                          INSTRUCTIONS
                            Page Nine
REGISTRATION ENFORCEMENT
     The intent of this section  is  to describe the registration
enforcement mechanism,  if used by  the  program.   In addition to
the  questions  listed,  describe  any other  problems,  studies,
data, or information relevant to the enforcement effort.

DATA LINK ENFORCEMENT

     The intent of  this  section is to describe the data link or
computer  matching  enforcement   mechanism,    if  used   by  the
program.   In  addition  to  the  questions listed, describe  in as
much detail as possible the enforcement process, step by step.
Also, discuss  problems,  studies, data,  or information relevant
to the enforcement effort.

-------
                     I/M AUDIT QUESTIONNAIRE
                            Page Ten

EMISSION INSPECTION STATISTICS

MODEL      Number Vehicles      Number Failing      Number      Average
YEAR       Initially Tested     Initial Test        Waived      Repair Cost

(one row for each model year, 1968 through 1987, plus a TOTAL row)

-------
                          INSTRUCTIONS
                            Page Ten

EMISSION INSPECTION STATISTICS

     For  each  model   year  covered  by  the  program,   list  the
combined number  of  light-duty cars  and  light-duty  trucks  that
were  initially  tested during  the reporting  period,  the number
failing the initial  emission test (only),  the number waived and
the  average  repair cost  for  failed  vehicles.   If  repair  cost
can  be  broken  out   for  waived  vehicles   vs.   passes  after
maintenance,  report these separately.
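
     Where inspection records are available in machine-readable form,
the Page Ten entries can be tallied with a short script.  The sketch
below is illustrative only; the record layout and field names are
hypothetical.

      from collections import defaultdict

      # Hypothetical inspection records; field names are illustrative only.
      records = [
          {"model_year": 1981, "test": "initial", "result": "fail", "waived": False, "repair_cost": 62.50},
          {"model_year": 1981, "test": "initial", "result": "pass", "waived": False, "repair_cost": None},
          {"model_year": 1984, "test": "initial", "result": "fail", "waived": True,  "repair_cost": 75.00},
      ]

      def page_ten_summary(records):
          """Tally vehicles tested, failed, and waived, plus average repair
          cost, by model year (initial tests only)."""
          rows = defaultdict(lambda: {"tested": 0, "failed": 0, "waived": 0, "costs": []})
          for r in records:
              if r["test"] != "initial":
                  continue
              row = rows[r["model_year"]]
              row["tested"] += 1
              if r["result"] == "fail":
                  row["failed"] += 1
                  if r["repair_cost"] is not None:
                      row["costs"].append(r["repair_cost"])
              if r["waived"]:
                  row["waived"] += 1
          summary = {}
          for year, row in sorted(rows.items()):
              avg = sum(row["costs"]) / len(row["costs"]) if row["costs"] else None
              summary[year] = (row["tested"], row["failed"], row["waived"], avg)
          return summary

      print(page_ten_summary(records))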

-------
                     I/M AUDIT QUESTIONNAIRE
                           Page Eleven

ANTI-TAMPERING INSPECTION STATISTICS

MODEL    CATALYST     FUEL INLET    LEAD TEST    PCV VALVE    EVAP CANISTER    AIR SYSTEM/PUMP
YEAR     Failures     Failures      Failures     Failures     Failures         Failures

(for each component, enter # PASS, # FAIL, and # NA; one row for each
model year, 1975 through 1988)

Comments:

-------
                          INSTRUCTIONS
                          Page Eleven

ANTI-TAMPERING INSPECTION STATISTICS

     For  each model  year  covered  by  the  program  (1975  and
later),   list  the number of  vehicles  that passed  the component
checks,   the  number that  failed  and  the number  not originally
equipped  (or  NA).   If  rates  are  not  available  by  model  year,
use whatever  model year groupings  are available  (e.g.,  1981+,
pre-1981;  or  all  model  years).   Similarly,  if  rates are  not
available  in  the  component  breakouts  provided,   use component
groupings  available  (e.g.,  catalyst/misfueling,   underhood).
Note  in  the  comments  section  what  the data  listed  include.
Finally, if data  cannot be  broken  into  pass/fail/NA categories,
explain in the comments section what the data provided include
(e.g.,  pass includes NA).

-------
                     I/M AUDIT QUESTIONNAIRE
                           Page Twelve

OVERT AUDIT STATISTICS       Number of      Number of    Number of      Number of      Comments
                             Overt Audits   Warnings     Suspensions    Revocations
     Stations
     Inspectors

Equipment Findings
     Analyzers checked with audit gas
     Analyzers failing span check
     Leaks identified
     Stations missing anti-tampering equipment

COVERT AUDIT STATISTICS      Number of      Number of    Number of      Number of      Comments
                             Covert Audits  Warnings     Suspensions    Revocations
     Stations
     Inspectors

ENFORCEMENT DATA                            Number of vehicles
     Required to be inspected in the reporting period
     Not complying with requirements
     Receiving citations or other enforcement

Other Comments:

-------
                          INSTRUCTIONS
                          Page  Twelve
AUDIT STATISTICS
     List  the  number  of  overt   and  covert  audits  conducted
during the audit period.  As a result of  those  audits,  list the
number of warnings, suspensions, and revocations issued.  If
station and inspector performance are not always checked or
warned or penalized together, list the statistics separately.

     For  overt   audits,   list  the  results  of  analyzer  audits
(number checked and number failing).  Also, for ATP programs,
list  the  number   of   stations   that   were  missing  equipment
required  for  the anti-tampering  check  (e.g., inlet  gauge,  lead
test paper, etc.).

ENFORCEMENT DATA

     List  the  number  of  vehicles  required  to  be  inspected
during the reporting period.  Estimate the  number  not complying
with requirements as best you can.  Also list the number of other
enforcement actions taken.  Describe how these statistics were
determined.

-------
                     I/M AUDIT QUESTIONNAIRE
                          Page Thirteen

COST DATA
     Test fees
          Emission/Tampering check only
          Safety test fee
          Combined fee

     Program Budget                            Annual Dollar Amount
          Air agency
          I/M operations agency
          Enforcement agency
          Other government agencies involved

     Government Staffing                       Full-time Equivalents
          Number of station (QA) auditors
          Number of consumer assistance staff
          Number of administrative staff
          Number of enforcement staff
          Number of other staff

     Station/Program Staffing                  Full-time Equivalents
          Number of inspection stations
          Number of inspectors
          Number of licensed mechanics
          Others (list)

Comments:

-------
                          INSTRUCTIONS
                          Page Thirteen
COST DATA
     List the test  fees  in effect during the  reporting period.
For each  agency  involved  in  the inspection program,  list  the
dollar amount allocated or actually spent on I/M-related
activities.

     Enter  the   number  of  people  employed  by the  government
agencies  involved  in  the  program  for  the various  activities
listed.   If  an   auditor,  for example, only spends half  his or
her time on the   inspection program,  count that as 0.5 persons.

     For decentralized programs,  list  the  numbers  of  stations,
inspectors  &  mechanics  involved  in  the  program  during  the
reporting period.  For centralized programs, list the number of
government-run stations, the number of government-employed
inspectors, mechanics, or other staff not listed in the
government staffing section.

-------
                           APPENDIX D
               DESCRIPTIONS OF ON-SITE ACTIVITIES
1.    Surveys
2.    Records Review
3.    Procedures Observations
                            D-l

-------
             DESCRIPTIONS OF ON-SITE AUDIT ACTIVITIES
1. Surveys
     The  activities  denoted  as surveys  consist of  information
collection activities  that  involve the  examination of  vehicles
and equipment.  State  or  local  cooperation and participation are
generally necessary.

Enforcement Survey

     In I/M  programs  with sticker  enforcement  (or  with  another
form of enforcement but  accompanied by a sticker which indicates
compliance), a sticker survey must be performed.  The purpose of
the survey  is to determine what  percentage of  subject  vehicles
are complying with the inspection  requirement  as indicated  by  a
valid sticker, versus the percentage of vehicles without a
sticker or with an expired sticker.

     The  survey  should  include  a  sample  of   at  least  1000
randomly   selected   vehicles    in   each  urbanized   area.    For
practicality,  it  is  acceptable to survey  vehicles  which  are
parked on-street,  in  paid off-street  parking, or in  free public
lots such as  at  shopping centers.  At  least five  widely spaced
locations  of several  types  in each  urbanized   area  should  be
surveyed to get the sample of 1000 vehicles, to ensure a
reasonable cross-section.  At   least  one  location  should be  in
the central business  district.
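
     As a hedged illustration, the survey tallies can be turned into a
compliance-rate estimate along the following lines; the counts are
hypothetical and the normal-approximation margin of error is only a
rough guide for a sample of this size.

      import math

      # Hypothetical tallies from one urbanized area's sticker survey.
      subject_vehicles = 1000      # subject vehicles observed
      valid_stickers = 912         # of those, displaying a current sticker

      rate = valid_stickers / subject_vehicles
      # Approximate 95 percent margin of error for a simple random sample.
      margin = 1.96 * math.sqrt(rate * (1 - rate) / subject_vehicles)
      print(f"Estimated compliance rate: {rate:.1%} +/- {margin:.1%}")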

     The sticker survey offers  a convenient  opportunity  to  get  a
rough   measure   of    the  influence   of   non-subject   (e.g.,
out-of-county) vehicles,  since  non-subject vehicles will  have  to
be identified in  the  survey to  avoid  bias to the compliance rate
estimate.   Consequently,  non-subject  vehicles  should  be  counted
and  recorded,  rather  than  just  passed  by.    If  there are  a
significant number of  non-subject  vehicles  operating   in  the I/M
area,  EPA may in  the  audit  report  recommend expansion of program
boundaries or vehicle coverage.

     The  sticker  survey  is  not  intended  to  identify  specific
vehicles or owners for adverse action, so vehicle identifiers
should  not  be   recorded.   The  only  exception   would  be  when
subject  vehicles   cannot  be  identified  without  a  registration
crosscheck (such as in a computer matching system)  in  which case
license plates should be recorded for  unstickered vehicles only.

     A comprehensive  investigation into compliance rates is not
required  in  registration-enforced  programs  unless   there  is
reason to suspect that registrations  are being processed without
required  inspection  documentation  or  program  area  vehicles are
being  registered  elsewhere  to  avoid  inspection.  At  a  minimum,
registration data should be compared with  inspection  volume data
to determine whether  a significant  gap exists.

-------
     Data-linked enforcement  programs  should be investigated via
 records  review  to  determine how many vehicles  are  at  each stage
 of the enforcement sequence.

 Tampering Survey

     In programs with required inspections of some or all
 vehicles  for  tampering  and  misfueling  and  where  additional
 emissions  reduction  credits are claimed  for them  in  the SIP,  a
 tampering  survey must  be  performed.   EPA  has been  conducting
 such surveys  at various  locations  around the  country  each year
and every effort has been made to conduct surveys in areas that
 are due  for an  audit.   Tampering  surveys are  the only reliable
 method for  determining the  effectiveness of  anti-tampering  and
 misfueling programs.

 Analyzer Audit, Centralized Programs

     In centralized programs all of the active analyzers in at
 least three inspection stations should  be audited.   These audits
 may be  performed by EPA  personnel  or  by I/M program personnel.
 I/M program span gases can  be used for analyzer checks,  as long
 as it is verified that the gas is named properly.

     In  centralized  programs with multiple  urbanized  areas,  the
 site visit can  be  limited to one urbanized  area  (assuming there
 are  at  least   three  stations  there),  as   long  as one  of  the
 following conditions is met:

     1.    There are independent  routine  audits of  all  stations
           in the  other  urbanized areas  conducted  by  an outside
           group or  agency  other  than  the  one which  performs
           routine  calibration  and maintenance,  the results  of
           these audits can be reviewed through records,  and  the
           EPA auditors have  observed  at  least one audit by this
           outside  group or  agency,  or

     2.    The  same  State/local  personnel  perform the  routine
           calibration and maintenance in  all urbanized areas.

Otherwise the EPA auditors must audit analyzers in additional
urbanized areas, but not necessarily all of them.

     An analyzer audit consists of a calibration check through
the probe and a low flow indicator check.  State/local and
contractor cooperation will be needed to audit analyzers that
are in service in open inspection lanes.  Instructions and a
recording form are found in Appendix F.

 2. Records Review

     In association with each of  the  following  records  reviews,
 the EPA  auditors should  seek an  understanding of how the records
are generated and  handled by the  I/M  program.  Where  it  seems
useful  and  practical,  copies of  records  should   be  requested,
especially before the audit  to allow in-office review.

-------
Vehicle Records

     Recent   documentation  (inspection  forms,   retest   forms,
repair  forms  or  receipts,  and waiver  forms)  must  be  reviewed
from  at  least 500  vehicles.   In a  decentralized program, these
records  must  come  from  at  least  ten  different   inspection
stations.    In   centralized   programs   with    computer-printed
inspection  forms,  test records for  passing  vehicles  need  not be
reviewed and  the number of vehicles may  be  reduced accordingly.
Care  should be  taken  that the forms  are from typical cases  and
that  they  have  not  been   pre-screened  before being  provided to
EPA.

     The  auditors  should  examine   the  forms  for  completeness,
legibility, accurate application of inspection standards,
reasonableness   of    the   test    scores,    correct   pass/fail
determination,   appropriateness   of   repairs,   reductions   in
emission  levels  from  repairs,  and  adequacy  of documentation  for
a waiver  if one  was  given.   If severe  deficiencies  or  repeated
errors  are  noted   for  a   licensed   inspection  station,  the   EPA
auditors should ask to be allowed to review the records of past
audits  and of past and  ongoing  corrective  action towards  that
station.  Due  to the limited amount of  time  available during an
audit,  bulk   record  reviews  are  more  effective  if  conducted
before the  audit.   This way more time can be spent studying  the
records  and  assessing problems.   The  results   of  an in-office
review could  lead  the  audit  team to  put more emphasis  in  some
areas and less in others.

Station Audit Records, Centralized

     EPA  auditors  should  review  the  records   created  by   the
routine  State or   local  audit of   a  few inspection   lanes,  to
familiarize  themselves with the  procedures  used  by the  auditors
and the data available from their activities.

Station Audit Records, Decentralized
                                     j
     EPA auditors should review the audit records for each
decentralized station visited during the audit.  EPA auditors
should look for audit completeness, adherence to procedures, and
indications that inadequate performance by licensed stations is
routinely identified and corrected.  Instructions and a form are
found in Appendix F.  It is also worthwhile to review the
records of stations that have been suspended or revoked to
determine the incidence and causes of suspension or revocation.

Data Summaries

     If  the I/M  program  generates  periodic data  summaries  not
previously  made  available,  these should be  reviewed  on site  for
the  last   few  reporting  periods.   The  manner  in   which   the
summaries are produced and the meaning of all entries should be
understood.

-------
Licensing/Suspension Records

     Files  relating  to  the  disciplining of  licensed inspection
stations or  fleets which do not  adhere to procedures  should be
examined  to  determine  the  general  nature  of  the State's  or
locality's practices in such cases.

Consumer Inquiries and Complaints

     If the  I/M  program  keeps  such records, they can  be scanned
to  determine the nature of such  interactions with  the public.
These  activities should  be considered  a  low  priority  unless
other audit findings suggest a  need to review these records.

Enforcement Records

     Statistics  on  recent   and  current  enforcement  activities
should have been obtained during the audit preparation.  While
on-site, the EPA auditors should verify the enforcement
procedures and general level of activity by reviewing records.

Other Records

     Unique  program   features  or  earlier  findings  during  the
audit may suggest other records which should be reviewed.


3.  Procedures Observation

     Much  of  the on-site  visit will  consist  of   observing  I/M
officials or  licensed  inspectors  perform  their regular functions
to  determine  if  actual  operations  are  consistent  with  the
documented program design, questionnaire  answers  supplied by the
I/M  program,  and  good  engineering  and  management  practice.
Suspicions raised by  record  reviews,  surveys,  and interviews may
make  it  advisable to intensify  the  observation   of  procedures
compared to the minimums suggested here.

Audits of Inspection Stations, Centralized

     An audit of one centralized inspection station by program
personnel should be observed.  If no such audits are scheduled
during the site visit, at least one special audit should be
requested.  The audit records of all auditing agencies should be
reviewed for consistency of findings.

Audits of Inspection Stations,  Decentralized

     EPA  auditors  should accompany  State  or local  officials  as
they visit licensed  inspection  stations  on  regular  audits.   The
EPA auditors should observe how the State/local employees
conduct  the  audit:   whether  written  procedures  are  followed  by
the auditor,  whether the auditor has the  expertise to correctly
respond to questions from the station personnel,  and whether and
how  the  auditor   reacts   to   equipment  defects  or  inspector

-------
performance problems.  Auditors should observe at least one
inspection  at each  station  (requesting  one  if necessary)  and
examine any  records  being  maintained by the station.   Forms  for
observing  audits  are  found  in Appendix  F.   A  total of  10-20
station  audits  divided  among  at  least  three  program  auditors
should be observed by EPA.

     In  centralized  programs  with  authorized  self-inspecting
fleets which  together  account  for  5 percent  or  more  of  annual
inspections,  at  least 3  fleet audits  should be observed  using
the same procedures and form as for decentralized programs.

Inspections, Centralized

     EPA  staff  should  observe  at  least   30   inspections   by
centralized   inspectors,   not   all  at   one   station.    These
observations  may  be conveniently  combined with  the  surveys  of
centralized  analyzers.   If heavy-duty vehicles  are  inspected at
a separate location, several  such  inspections should be observed
if time permits.  Observations of inspections should include
retests and waiver processing.

Waiver Processing

     Where waivers  are granted separately  from  the  retest,  EPA
auditors  should  observe  waivers being  processed.   To the extent
practical, waiver processing  should  be  observed  at  each location
where it occurs.

Spot Checks Using Unmarked Cars

     Most  decentralized  I/M  programs  conduct spot  checks  using
unmarked cars at  licensed  inspection stations.   Such  checks,  if
they  use  vehicles  adjusted  to  fail   standards  or  component
checks, can  be  a very  important  part  of  the program's  quality
assurance efforts.  An EPA auditor  should observe the procedures
used  first  hand  by  accompanying  a  program official  on a  spot
check.  EPA  should  also  determine what actions  are  taken when a
station "fails" a  spot check  and  how stations  are  selected  for
surveillance.

Other Procedures

     Other activities should be observed as necessary.  For
example, where questionnaire answers or records review indicates
a shortfall of vehicle inspections in a registration-enforced
system, it is recommended that the registration renewal process
be observed.

-------
               APPENDIX E
PROCEDURES USED IN OMS TAMPERING SURVEYS
              E-1

-------
                     PROCUREMENT ABSTRACT

     Motor vehicles and motor vehicle engines sold in the
United States are required to be covered by a Certificate of
Conformity which is issued to manufacturers who have demon-
strated that their vehicles and engines can meet the emissions
standards established under the Clean Air Act (Act).  Most
new vehicles are certified to use unleaded gasoline to protect
emissions control systems.  Section 203(a)(3) of the Act
prohibits manufacturers, dealers, fleet operators, or anyone
in the business of selling, servicing, leasing, repairing, or
trading motor vehicles from tampering with emissions control
devices and systems.  Regulations promulgated pursuant to the
Act prohibit the introduction of leaded fuel into vehicles
requiring unleaded fuel.  EPA is aware that tampering and
fuel switching do occur.  The Field Operations and Support
Division of EPA is responsible for enforcement of these laws.

     Each year EPA conducts national tampering and fuel
switching surveys.  These surveys are used by EPA for measuring
the rate of tampering and fuel switching.  The results of the
surveys are used to direct policy actions and to determine
the effectiveness of ongoing programs.  One such program is
the field office operation of the Field Operations and Support
Division.  These offices investigate and prosecute acts of
tampering and fuel switching.  Additionally, State governments
utilize these data and results in order to evaluate, develop,
and implement State and local antitampering and anti-fuel
switching programs.

     For the 1988 survey EPA wishes to have a contractor
coordinate, collect, and compile the data for EPA's 1988 report.
The period of performance for completing work shall be one
year with options for two additional years.  The contractor
must have personnel with extensive experience in automotive
emission controls and in detecting emission control tampering
and fuel switching, be familiar with past surveys since the
methodology must be consistent with those surveys, and have
the ability to enter and edit the data in a machine readable
format.  The contractor must not have any interests which
could affect the impartiality of the data.

-------
3.  The Contractor shall conduct the underhood examination,
    fuel sampling, emissions test,  plumbtesmo test, exhaust
    system examination, and fuel inlet restrictor examination,
    and record the required information for each vehicle.

9.  Contractor shall label all fuel samples so as to be
    identified with a particular vehicle,  pack samples as
    required by applicable D.O.T. Federal regulations pertinent
    to shipping of gasoline samples, and ship the samples in
    strict accordance with the Technical Proposal Instructions
    by a method approved by the EPA Project Officer no later
    than one week after sampling to EPA's laboratory at the
    Motor Vehicle Emissions Laboratory (MVEL)  in Ann Arbor,
    Michigan, for testing by EPA.  The results will be supplied
    to the contractor within sixty  (60) days from the receipt
    of the samples at MVEL.

10.  Contractor shall edit the data  from all vehicles surveyed
    and compile the raw data in an account specified by the
    Project Officer on EPA's IBM computer  system.  The raw data
    shall be accessible on EPA's computer system no later  than
    2 months after the end of the survey and the delivery of
    the fuel sample results to the contractor.  The contractor
    shall also deliver to the EPA copies of the daily calibra-
    tion logs for the exhaust gas analyzer(s)  and copies of
    the data forms for every vehicle surveyed within thirty
    (30)  days after each survey site is completed.

11.  Duplicate fuel samples shall be taken every twenty samples
    and shipped with original samples to EPA's laboratory (MVEL)
    for analysis.  Samples of the gas used to flush the fuel
    pump and line will also be taken whenever new wash gas is
    obtained.

-------
     EPA representatives will also do a background report for
     each site which  will include the exact situation in which
     vehicles were procured,  a geographical description of the
     site,  weather,  and  other circumstances that might affect
     refusals, who the inspectors are on a particular site, how
     many and which  observations  were made at each site, and
     other circumstances that might bear upon the representa-
     tiveness of the  data.   The contractor personnel will
     perform the actual  vehicle inspections, and be respon-
     sible for filling out  the raw data forms.

II.   Suggested Equipment

     2       -  HC-CO gas analyzers with sample lines, water trap and
                tailpipe probe

     1       -  Calibration Gas, ± 2% of listed nominal concentration:

                8% CO
                1560 ppm HC (Hexane equivalent)

                1.6% CO
                320 ppm HC (Hexane equivalent)

     1       -  Field kit for testing lead in gasoline

     3       -  Inspection Mirrors

     1       -  Large long-handled mirror for exhaust system
                inspection

     2       -  Flashlights

     2       -  Vacuum Pumps

     2       -  Fender Covers

     2       -  Fuel Sampling Pumps with 4 ft hoses

   500       -  Sample Bottles per site

     1       -  Gasoline-powered generator for sites without power

     1  pair -  Battery Jumper Cables

     2       -  Leaded nozzles

-------
            DATA COLLECTION AND RECORDING PROCEDURES


     The forms on the following pages (Figures A-1 and A-2)
will be used to record the survey data in the field.  Minor
revisions may be made to these forms by the Project Officer
prior to the start of the surveys.  The forms are forced choice
to ensure coding consistency, and are designed to facilitate
direct data entry.  The following codes will be used to record
data for the major system components on the data sheets:

         0 - Not originally equipped

         1 - Functioning properly

         2 - Electrical disconnect

         3 - Vacuum disconnect

         4 - Mechanical disconnect

         5 - Incorrectly routed hose

         6 - Disconnect/Modification

         7 - Missing item

         8 - Misadjusted item

         9 - Malfunctioning

         A - Stock equipment

         B - Non-stock

         D - Add-on equipment

         Y - Yes

         N - No

     Additional codes can be used for those components which
could not be classified into the above categories.  A brief
description of each data entry follows.
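
     As an illustration only (no such software is required by this
work statement), the coding scheme above can be carried as a simple
lookup table when the data sheets are keyed or checked.  The Python
sketch below uses hypothetical names; the codes themselves are the
standard ones listed above, and additional, non-standard codes are
flagged rather than rejected.

     # Illustrative lookup of the component condition codes.
     COMPONENT_CODES = {
         "0": "Not originally equipped",
         "1": "Functioning properly",
         "2": "Electrical disconnect",
         "3": "Vacuum disconnect",
         "4": "Mechanical disconnect",
         "5": "Incorrectly routed hose",
         "6": "Disconnect/Modification",
         "7": "Missing item",
         "8": "Misadjusted item",
         "9": "Malfunctioning",
         "A": "Stock equipment",
         "B": "Non-stock",
         "D": "Add-on equipment",
         "Y": "Yes",
         "N": "No",
     }

     def decode(entry):
         # Additional codes are allowed for unclassified components,
         # so unknown entries are flagged rather than treated as errors.
         return COMPONENT_CODES.get(entry.upper(),
                                    "Non-standard code - see site notes")

     print(decode("8"))    # Misadjusted item
     print(decode("X"))    # Non-standard code - see site notes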

-------
[Reproduction of the 1986 Tampering Survey data form, Part B (Rear).
Numbered fields include:  ID Number; Make (write out); Model (write
out); Vehicle Type (C - Car, T - Truck, includes vans); License Plate
(State); Idle HC (ppm); Idle CO; Odometer (thousands); Dash Label
(0 - not orig. equipped, 1 - funct. properly/present, 7 - missing
item); Catalytic Converter; Tank Label (0, 1, 7); and Filler Neck
Restrictor.]
-------
     4 - Mechanical disconnect - When the stovepipe is
         disconnected or  deteriorated.   Also when the air
         cleaner has been unsealed, i.e., inverted air cleaner
         lid, oversized filter element,  or  holes punched into
         air cleaner.

     7 - Missing item - Missing stovepipe hose.

     9 - Malfunctioning item - Problems  with the vacuum override
         motor.

     B - Non-stock equipment - Custom air cleaner.

j.   Positive Crankcase Ventilation (PCV) system - A typical
     configuration for a  v-8 engine consists of  the PCV valve
     connected to a valve cover and then connected to the
     carburetor  by vacuum line.  The other  part  of the system
     has a fresh air tube running from the  air cleaner to the
     other valve cover.  The PCV will be coded as follows:

     1 - Functioning properly

     3 - Vacuum  disconnect - When the line  between the PCV and
         the carburetor is disconnected.

     4 - Mechanical disconnect - When the fresh  air tube between
         the valve cover and the air cleaner is disconnected or
         removed.

     7 - Missing item - When the entire  system has been removed.

     9 - Malfunctioning item - When the line between the PCV
         and the carburetor is cracked or collapsed.

     B - Non-stock - When a fuel economy device is installed
         in the PCV line.

k.   Turbocharger - Will be coded '0', 'A', 'B', or 'D'.

1.   Evaporative Control System  (ECS) -   Controls vapors  from
     the fuel tank and carburetor.  Some systems have two
     lines, one  from the fuel  tank to the canister, and one
     from the canister to the carburetor or air cleaner, to
     purge the canister.  The ECS will be coded as follows:

-------
n.   Air Pump Belt

     0 - Not originally equipped (if an aspirated system or none)

     1 - Functioning properly

     7 - Missing item

     8 - Misadjusted item - Loose pump belt

o.   Air Pump

     0 - Not originally equipped (if an aspirated system or none)

     1 - Functioning properly

     4 - Mechanical disconnect (other than belt removal)

     7 - Missing item

     9 - Malfunctioning - Frozen pump

p.   Exhaust Manifold - will be coded 'A' (stock) or 'B'
     (non-stock).

q.   Oxygen Sensor - Controls the air-fuel mixture going into
     the engine of vehicles equipped with  three-way catalytic
     converters.  The sensor will be coded '0', '1', '2', '4'
     (sensor unscrewed), or '7'.

r.   Carburetor Type - An 'A' is used to indicate that the
     carburetor is a production unit (non-sealed original
     equipment).  If the carburetor is a sealed unit (without
     limiter caps), a 'B' is recorded.  If fuel injection is
     used, then an 'F' is recorded.  If the carburetor has been
     replaced with a non-stock unit, then a 'B' is recorded.

s.   Limiter Caps - Plastic caps on idle mixture screws
     designed to limit carburetor adjustments.  Sealed plugs
     are also considered a type of limiter cap.  Limiter caps
     will be coded as follows:

     0 - Not originally equipped (fuel injected vehicle)

     1 - Functioning properly

     4 - Mechanical disconnect - Tab broken or bent

-------
Form B - Rear


a.  ID Number - Same as on Form A.

b.  Make

c.  Model

d.  Vehicle Type - coded as follows:  C = car, T = truck

e.  License Plate - State abbreviation

f.  Exhaust gas HC concentration (in ppm) at curb idle.

g.  Exhaust gas CO concentration (in percent)  at curb idle.

h.  Odometer - record mileage in thousands

i.  Dash Label - displays the fuel required and shall be
    coded '0', '1', or '7'.

j.  Catalytic Converter - oxidizes the HC and CO to water
    and CO2 in the exhaust gas.  Later model catalysts also
    reduce oxides of nitrogen.  The converter shall be coded
    '0', '1', or '7' (entire catalyst canister removed).

k.  Exhaust System - if as originally equipped an 'A' shall
    be coded.  If non-stock a 'B' shall be coded.

l.  Exhaust System Integrity - the condition of the exhaust
    system shall be coded '1' (no obvious leaks) or '9' (leaks
    evident).  An exhaust system with apparent leaks will
    invalidate the idle emissions readings.

m.  Tank Cap - seals the fuel tank during normal operating
    conditions and shall be coded '1', '7', or '9' (loose cap).

n.  Tank Label - displays required fuel and is coded '0',
    '1', or '7'.

o.  Filler Neck Inlet Restrictor - The restrictor is designed
    to prevent the introduction of leaded fuel into a vehicle
    requiring unleaded fuel.  It shall be coded '0' (leaded
    vehicle only), '1', '4' (widened or cheater device present),
-------
          FUEL SAMPLE COLLECTION AND LABELING PROCEDURES

A fuel sample shall be taken from each vehicle requiring unleaded
fuel.  These samples shall be collected in 4 ounce glass bottles
with a hand fuel pump.  Once the sample is drawn, the fuel shall
be replaced with an equivalent amount of unleaded fuel if the
driver requests, and the pump shall be flushed with unleaded fuel.

Each bottle shall be identified with a stick-on label that has
the vehicle identifying survey number on it.  The vehicle
identifying survey number is the first entry on the data forms
described in Attachment A.

Prior to shipment from the field, a sample tag with the same
identifying number shall be attached to each bottle.  The
bottles will be packaged, labeled, and shipped to the Chemistry
Laboratory at EPA's Motor Vehicle Emissions Laboratory in Ann
Arbor, Michigan, according to the shipper's requirements.  The
contractor shall use screw-on caps on all sample bottles,
having either Teflon or polyethylene cap liners.  The contractor
shall assure that all sample bottles are capped securely to
prevent any leakage and/or contamination.

-------
                EMISSIONS SAMPLING OF HC AND CO

Vehicles are tested in as-received condition with the engine
at normal operating temperature.  With the engine idling and trans-
mission in neutral, the sample probe is inserted into the
tailpipe.  Exhaust concentrations are recorded after stabilized
readings are obtained or at the end of 30 seconds, whichever
occurs first.  The process is repeated as necessary for multiple
exhaust pipes.  However, multiple readings are not necessary
for exhaust originating from a common point.  Results from
multiple exhaust pipes are to be numerically averaged.  Results
are then recorded on the form for the vehicle being sampled.
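
As an illustrative sketch only (not part of the survey procedure),
the recording rule above can be expressed in a few lines of Python.
The stabilization criterion, the reading interval, and the sample
values are hypothetical; only the 30-second limit and the numerical
averaging of multiple pipes come from the text above.

     # Record each pipe's reading once it stabilizes or at 30 seconds,
     # whichever occurs first, then average across exhaust pipes.
     def recorded_value(readings, stable_delta=2.0, interval_s=5, limit_s=30):
         # readings: successive analyzer values taken every interval_s seconds
         elapsed = 0
         for prev, curr in zip(readings, readings[1:]):
             elapsed += interval_s
             if abs(curr - prev) <= stable_delta or elapsed >= limit_s:
                 return curr
         return readings[-1]          # fall back to the last value taken

     def vehicle_result(per_pipe_readings):
         values = [recorded_value(r) for r in per_pipe_readings]
         return sum(values) / len(values)

     # Hypothetical dual-exhaust idle HC readings (ppm):
     print(vehicle_result([[250, 180, 150, 148], [300, 210, 205, 204]]))   # 176.0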

-------
                APPENDIX F
INSTRUCTIONS AND FORMS FOR AUDIT ACTIVITIES
              F-1

-------
AUDITOR PERFORMANCE EVALUATION FORM

Program Auditor ______________________

RECORD REVIEW (YES/NO)
     Test records reviewed
     Sticker inventory made
     QC records reviewed
     Problems found by auditor
     Was feedback given

EQUIPMENT INSPECTION (YES/NO)
     Gas bottle checked
     Gas audit conducted
     Tolerances applied
     Inlet gauge measured
     Lead test paper checked
     Required manuals checked
     Problems found by auditor
     Feedback given

VEHICLE INSPECTION (YES/NO)
     Test observed
     Rating form used
     Problems found by auditor
     Feedback given

COMMENTS

ADDITIONAL OBSERVATIONS
-------
              OBSERVING  AUDITS OF  INSPECTION  STATIONS

Background

The  purpose  of  the  audit  observation  is  to determine  whether
actual audits are  consistent with requirements and based on good
engineering and  management  practices.   In  centralized  programs,
at  least one  audit  by  a  State/local  program  auditor must  be
observed.  In decentralized programs, most station visits should
allow  for  observation  of  the  regular   audit   practice.   EPA
auditors should  first observe  the State/local personnel as they
are  performing  their  audits   with  as   little  interference  as
possible.  After the audit  is completed,  or  while  activities not
necessary  to  observe  are  underway,  EPA  auditors may  commence
their audit functions.


Instructions

This  form  covers  three  basic  audit  functions   that  should  be
completed by  the program  auditor.   Each  function  has  activities
associated with  it and the EPA auditor should determine whether
the program auditor accomplished  these  activities and  how well.
In  particular,   the  EPA  auditors should  note  how  the  program
auditor deals with problems found.   In  centralized programs, the
record   review  and   vehicle   inspection   sections  may  not  be
applicable.   Be  sure  to  note  the name  of  the  auditor and the
station  in which the audit occurred.

-------
CENTRALIZED STATION CALIBRATION CHECK RESULTS

Auditor ______________________          Date ______________

[Blank worksheet.  One row is completed for each analyzer checked,
under the following column headings:

     STATION, LANE, ANALYZER, PEF, HC HANGUP,
     LOW FLOW (Before / After), READINGS (HC / CO),
     HC RANGE (+5% / -7%), P/F, CO P/F]
-------
             ANALYZER  AUDITS  OF  CENTRALIZED  FACILITIES

Instructions

The  objective  of  analyzer  audits  is  to  determine  whether
accurate  readings  are  being obtained  in  normal  testing  (i.e.,
through   the  probe).    Therefore,   analyzer  audits   involve
introducing span gases  of  known concentration  into  the  analyzer
through the probe in order to simulate an actual I/M test.

The  report  for  the analyzer  audits should summarize  the  number
of analyzers which  were audited,  the number passing  all  checks,
and  the  number  failing  to meet tolerances and  why.   The  report
should also  indicate  what  action  was taken by  program officials
for problem analyzers,  i.e.,  no action,  taken out of service, or
repaired  on  the spot.   If  the latter,   the   repaired  analyzers
should  be rechecked  during  the audit  to  verify  their  accuracy
after repair.

Equipment Needed

1)   Span gases - Low range span gas (nominally 1.6% CO, 600 ppm
     propane, balance N2); all span gases must be traceable
     ±1% to NBS standards; gas analysis must be performed by EPA
     or using an EPA-approved protocol.

2)   Cylinder gauges and flow regulator.

3)   Hardware to  flow  gas  through  the probe,  commonly  referred
     to as a "tailpipe simulator."

4)   Calculator;  balloons;  hand tools.


Audit Procedure

1)   Analyzers must be warmed up and ready for testing.
2)   Record station,  lane,  analyzer  number,  and PEF.
3)   If not automatic, check/adjust  zero and electric span.
4)   Check  the  hangup;  purge  until   less  than  20  ppm;  record
     final HC hangup value; recheck  zero and electrical span.
5)   Insert  probe    into  tailpipe  simulator   for   low   flow
     indication;  if passed, record "ok" and proceed.
6)   Flow span gas through probe; enter HC and CO readings when
     stabilized (i.e., obtain maximum values); close valve.
7)   Verify final low flow; record "ok" if passed; remove probe.
8)   Adjust HC reading for HC hangup.  Adjust propane span gas
     concentration using PEF and calculate the acceptable HC
     range.  Enter the results and compare with the adjusted HC
     reading.  If within the range (or ±15 ppm HC, whichever is
     greater), the HC channel passes.  Indicate HC P/F.
9)   Enter the allowable CO range in the top of the last column
     (+5% and -7% of the CO span gas value).  If the CO reading
     is within the range (or ±0.1% CO, whichever is greater),
     the CO channel passes.  Indicate CO P/F.
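
The arithmetic in steps 8 and 9 is illustrated by the Python sketch
below.  It is a minimal sketch only, assuming that the PEF is applied
as a multiplicative propane-to-hexane conversion and that the 15 ppm
HC and 0.1% CO allowances widen each side of the window; the
numerical inputs in the example are hypothetical.

     def hc_channel_passes(hc_reading_ppm, hc_hangup_ppm, propane_span_ppm, pef):
         # Step 8: adjust the observed reading for hangup, convert the
         # propane span value to hexane-equivalent ppm with the PEF,
         # then apply the +5% / -7% window (or 15 ppm, if greater).
         adjusted = hc_reading_ppm - hc_hangup_ppm
         expected = propane_span_ppm * pef
         upper = expected + max(0.05 * expected, 15.0)
         lower = expected - max(0.07 * expected, 15.0)
         return lower <= adjusted <= upper

     def co_channel_passes(co_reading_pct, co_span_pct):
         # Step 9: +5% / -7% of the CO span value (or 0.1% CO, if greater).
         upper = co_span_pct + max(0.05 * co_span_pct, 0.1)
         lower = co_span_pct - max(0.07 * co_span_pct, 0.1)
         return lower <= co_reading_pct <= upper

     # Hypothetical audit: 600 ppm propane span, PEF of 0.52, 12 ppm hangup.
     print(hc_channel_passes(318, 12, 600, 0.52))   # True (306 ppm vs 290-328 window)
     print(co_channel_passes(1.55, 1.6))            # True (window roughly 1.49-1.70)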

-------
INSPECTION OBSERVATION FORM

Inspector ______________________    Station ID ______________

Vehicle Identification Info ____________________________________

INSPECTION AUDIT

EMISSION TEST (YES/NO, with Comments)
     Checked for exhaust leaks
     Checked for vehicle warm
     Turned accessories off
     Probe insertion okay
     RPM limits maintained
     Preconditioning done
     Preconditioning time (# seconds)
     Test time (# seconds)
     Test results observed (HC / CO)
     Test results recorded (HC / CO)

TAMPERING CHECK (P/F/NA)
     Catalyst
     Fuel inlet restrictor
     Inlet gauge used
     Lead test
     PCV
     Air pump
     Evaporative canister
     Others (list)
-------
                      INSPECTION OBSERVATION

Background

The  purpose  of  observing  inspections  is  to determine  whether
they are  being conducted  according  to  procedure.  At  least  one
inspection  should  be  observed   in  each  decentralized  station
visited and as time allows  in centralized stations.

Instructions

The form covers both the emission test and emission control
component checks.  Most of the form has the format of a
questionnaire; indicate yes or no answers as appropriate for
each item that applies, or not applicable for those that do not.
Use the comments section to describe deviations from procedure
or other problems observed.  Use a watch that shows time in
seconds and monitor test and preconditioning times.  When
manual analyzers are in use, observe emission readings and
record the results.  Also enter the readings recorded by the
inspector.

     Ideally, observe a re-inspection of a vehicle that is at
the station and was inspected earlier in the day.  If such a
vehicle is not available, observe an inspection of a vehicle in
for an official inspection.  As a last resort, request an
unofficial inspection of any vehicle available.  For the
emission control component inspection, request that the
inspector verbally and physically identify underhood components.

-------
ANTI-TAMPERING STATION AND INSPECTION OBSERVATION FORM

Inspector ______________________    Station ID ______________

Vehicle Identification Info ____________________________________

EQUIPMENT CHECK (YES/NO, with Comments)
     Lead Test
          Paper Active
          Paper Plentiful
          Spray Bottle
          Degreasing Rag
     Fuel Inlet Gauge
          Present
          Proper Size
     Underhood Checks
          Manuals Present
          Other Equipment
     Other Requirements
          Stickers/Forms
          Log Book
          Sign
          Lighting

TAMPERING CHECK (P/F/NA)
     Catalyst
     Fuel inlet restrictor
     Inlet gauge used
     Lead test
     PCV
     Air pump
     Evaporative canister
     Others (list)
-------
            ANTI-TAMPERING ONLY INSPECTION OBSERVATION

Background

The  purpose of  observing  inspections  is  to determine  whether
they are  being conducted according  to  procedure.  At  least  one
inspection  should  be observed   in  each  decentralized  station
visited and as time allows in centralized stations.

Instructions

The form covers only the emission control component checks.
Most of the form has the format of a questionnaire; indicate yes
or no answers as appropriate for each item that applies, or not
applicable for those that do not.  Use the comments section to
describe deviations from procedure or other problems observed.
Be sure to note the results that the inspector derives for each
component checked.

     Ideally, observe a re-inspection of a vehicle that is at
the station and was inspected earlier in the day.  If such a
vehicle is not available, observe an inspection of a vehicle in
for an official inspection.  As a last resort, request an
unofficial inspection of any vehicle available.  For the
emission control component inspection, request that the
inspector verbally and physically identify underhood components.

-------
                          RECORDS REVIEW

Location ______________________    Program Auditor ______________

INSPECTION RECORDS (rate each criterion Poor / Fair / Good / Excel;
add Comments as needed)
     Completion
     Legibility
     Accuracy
     Form Adequacy
     Component Choice
     Cutpoint Selection
     Pass/Fail Decision

WAIVER RECORDS (rate each criterion Poor / Fair / Good / Excel;
add Comments as needed)
     Completion
     Legibility
     Accuracy
     Form Adequacy
     Criteria Met
     Documentation

QUALITY CONTROL RECORDS (rate the first four criteria Poor / Fair /
Good / Excel; rate the QC/QA frequency items R / S / U / A; add
Comments as needed)
     Completion
     Legibility
     Accuracy
     Form Adequacy
     QC/QA Frequency
          Weekly cal check
          Cal check "pass"
          Tolerances correct
          Weekly leak check
          Leak check "pass"
          QA check period
          Audit cal "pass"
          Audit leak "pass"

Key:  R = Rarely, S = Sometimes, U = Usually, A = Always
-------
                          RECORD REVIEW

Background

The primary purpose of reviewing records is to determine whether
program data are being collected properly and completely.  This
form provides for review of three basic types of records:
inspection records, waiver records, and quality control records.

General Instructions

Review records keeping in mind the criteria listed on the form;
common to all record review is the need to assess whether the
forms are filled in completely, legibly, and accurately, and
whether the form itself is adequate.  There is space available
for additional criteria pertinent to the particular program being
audited.  Make notes in the comments section on problems found
and afterwards make overall judgments on each of the criteria.

Depending on the systems used in the program, some records may
be collected through the use of computers, and it may be
difficult or unnecessary to make judgments on some of the
criteria (e.g., legibility).  In decentralized programs, records
review should take place in each station visited, and may also
be done in bulk before or after the audit.

Inspection Records

Inspection records should be reviewed to determine whether test
procedures are being properly followed.  In particular, a
determination should be made as to whether inspectors are
applying the correct emission standards, checking for the
applicable emission control components, and filling out forms
correctly.

Waiver Records

Waiver records should be reviewed to determine whether waivers
are being properly issued.  A determination should be made as to
whether waived vehicles meet all applicable criteria and whether
sufficient documentation is included to verify this.

Quality Control Records

In addition to the basic review, quality control records should
be reviewed to determine the frequency and results of various
quality assurance and quality control actions.  Generally,
weekly calibration and leak checks must be done and, in
decentralized programs, either monthly or quarterly audits are
required.

-------
             DECENTRALIZED STATION ANALYZER AUDIT FORM
Background
The  objective  of  analyzer   audits   is  to  determine  whether
accurate  readings  are being  obtained  in  normal   testing  (i.e.,
through   the  probe).    Therefore,    analyzer   audits   involve
introducing span gases of  known  concentration into  the  analyzer
through the probe in order to simulate an actual  I/M test.

Instructions

This form provides  space  for  the results of the  regular analyzer
audit conducted by the station owner or  the  program auditor,  and
for  the  results of  the  EPA  analyzer  audit.   In  observing  the
station  inspector  or   the  program  auditor  check   the  analyzer,
note the  procedure  used  and the results.   In addition,  the  form
provides space for noting the condition of  the analyzer  and  type
and concentration of  calibration gases in the station.   The  form
also has space to note the effectiveness of lockouts.  A simple
procedure to check the lockouts on computerized analyzers is as
follows:

     1)    Conduct an official test and sample room air to
            trigger the CO2 lockout.  The CO2 lockout should
            result in an invalid test.

     2)    Conduct a  leak  check  without  capping  the  probe,  then
           attempt  to  conduct  an   official   inspection.    The
           lockout should prevent an official inspection.

     3)    Disconnect the calibration gas line and conduct a
            "weekly" calibration check using room air, then
            attempt an official inspection.  The lockout should
            prevent an official inspection.

-------
DECENTRALIZED INSPECTION STATION
ANALYZER AUDIT FORM

Program Auditor ______________    Inspector Name ______________
Station Name ______________       Station Number ______________

STATION SPAN (YES/NO, with Comments)
     Hangup checked
     Zero/span done
     Leak check done
     Data entered properly
     Bottle values
     Span results (HC deviation / CO deviation)

VISUAL INSPECTION (GOOD/BAD)
     Sample Line/Probe
     Filter/Water trap
     General Condition
     CO2 Lockout
     Leak Lockout
     Calibration Lockout

EPA GAS SPAN
     HC Hangup (ppm)
     Probe (HC deviation / CO deviation)
     Other (HC deviation / CO deviation)

STATION GAS DATA
     Blend
     Tolerance
     Accuracy Spec
     Name of Gas Supplier

COMMENTS

ADDITIONAL COMMENTS
-------
               APPENDIX G
CHECKLIST FOR COMPLETING THE I/M AUDIT
             G-1

-------
              CHECKLIST FOR COMPLETING THE I/M AUDIT
I.    Advance Preparation

     A.   Documentation Assembly
                Review applicable portions  of  SIP
                	   Enabling legislation.
                	   Program rules  and regulations
                	   Other  technical  or procedural information

                Review other  SIP related information
                	   EPA rulemakings
                	 '  EPA Technical  Support Documents
                	   EPA Tampering  Surveys

                Review other  program  documents
                	  -Operating contracts
                	   Procedures  manuals   (testing,    quality
                       control,  etc.)
                	   Analyzer specifications
                	   Quality assurance plan

                Review available reports on program operations
                	   Periodic reports published by'I/M agency
                	   Reports on previous  audits
                	   Reports on special surveys or projects
                	   Data summaries obtained from I/M agency

                Review recent correspondence
     B.   Program Questionnaire
                Review program design
                	   Vehicle coverage
                	   Cutpoints
                	   Test  procedures
                	   Analyzer specifications
                	   Analyzer maintenance and calibration
                	   Station and inspector licensing
                _____  Record keeping at time of inspection
                _____  Audit/surveillance activities
                 ,      Challenge mechanism
                	   Repair waivers
                	   Enforcement mechanism
                	   Mechanics  training and other interface
                	   Consumer issues
                	   Self  assessment  through  data analysis
                	   Future plans

-------
                                -2-
           _____  Review operating experiences
                 _____   Operating statistics
                 _____   Quality control statistics
                 _____   Data analyses
                 _____   Consumer protection
                 _____   Repair waivers
                 _____   Enforcement
                 _____   Mechanics training and other interface
                 _____   Self assessment through data analysis

     C.   Notice to Program Officials and Other Affected Parties

           _____  Send formal notice to State/local agencies 60
                 days in advance of site visit
           _____  Send blank questionnaire to program officials
                 for completion and return before site visit
           _____  Notify all concerned EPA offices of audit

II.   On-Site Audit Visit

     A.   Initial interviews

           _____  Air planning agency officials
           _____  I/M operating agency officials

     B.   Review of program records

           _____  Inspection records
           _____  Enforcement records
           _____  Waiver records
           _____  Audit/surveillance records
           _____  Repair records

     C.   Inspection station visits

           _____  Observation of program auditors
           _____  Analyzer checks
           _____  Observation of waiver processing
           _____  Interviews of station personnel
           _____  Record checks
           _____  Observation of inspections

     D.   Special surveys and interviews

          	  Enforcement surveys
          	  Idle test surveys
          	  Tampering/misfueling surveys
          	  Interviews  of  non-program  representatives  with
                special knowledge/experiences

-------
                                 -3-

     E.   Exit interview

           _____  Review of audit activities
           _____  Feedback on audit
           _____  Discussion of preliminary audit findings
           _____  Plans for audit report
           _____  Requests for additional materials needed by EPA
           _____  Follow-up activities by EPA

III.  Audit Report

           _____  Receive FOSD trip report
           _____  Receive ECTD trip report
           _____  Assemble draft audit report
           _____  Send draft to FOSD and ECTD for review
           _____  Incorporate FOSD and ECTD comments
           _____  Send final draft to program agencies for comment
           _____  Finalize audit report; final (initial) audit
                  report shall contain:
                 _____   Background description of program
                 _____   Review of audit activities
                 _____   Discussion of program strengths
                 _____   Discussion of program weaknesses
                 _____   EPA recommendations for correcting
                         problems
                 _____   Description of any follow-up activities,
                         if needed
                 _____   Discussion of State/local commitments
                         subsequent to the audit to resolve any
                         identified problems
                 _____   State/local comments on draft report
                         shall be appended to the final report
                 _____   Completed audit questionnaire shall be
                         appended to the final report
-------
                         TECHNICAL REPORT DATA
          (Please read Instructions on the reverse before completing)

  1. REPORT NO.
     EPA-450/2-88-002

  4. TITLE AND SUBTITLE
     National Air Audit System Guidance Manual for
     FY 1988 - FY 1989

  5. REPORT DATE
     February 1988

  9. PERFORMING ORGANIZATION NAME AND ADDRESS
     Office of Air Quality Planning and Standards
     U.S. Environmental Protection Agency
     Research Triangle Park, North Carolina  27711

 11. CONTRACT/GRANT NO.
     68-02-4396

 12. SPONSORING AGENCY NAME AND ADDRESS
     DDA for Air Quality Planning and Standards
     Office of Air and Radiation
     U.S. Environmental Protection Agency
     Research Triangle Park, North Carolina  27711

 14. SPONSORING AGENCY CODE
     EPA/200/04
-------