United States        Office of Air Quality         EPA-450/2-84-009
Environmental Protection   Planning and Standards       December 1984
Agency           Research Triangle Park NC 27711

Air
National Air Audit System
FY 1984 National Report

-------
This report has been reviewed by the Office of Air Quality Planning
and Standards, EPA, and approved for publication.  Mention of trade
names or commercial products is not intended to constitute endorsement
or recommendation for use.  Copies of this report are available through
the Library Services Office (MD-35), U.S. Environmental Protection
Agency, Research Triangle Park, North Carolina 27711; or, for a fee,
from the National Technical Information Service, 5285 Port Royal
Road, Springfield, Virginia 22161.
                  Publication No. EPA-450/2-84-009

-------
                                CONTENTS

  I.  EXECUTIVE SUMMARY                                      I-1

 II.  INTRODUCTION                                          II-1

III.  AIR QUALITY PLANNING AND SIP ACTIVITIES
      A.  Introduction                                     III-1
      B.  Major Findings and Conclusions                   III-1
      C.  Air Quality Evaluation                           III-10
      D.  Emissions Inventories                            III-15
      E.  Modeling                                         III-24
      F.  SIP Evaluation and Implementation                III-30

 IV.  NEW SOURCE REVIEW
      A.  Introduction                                      IV-1
      B.  Major Findings and Conclusions                    IV-1
      C.  Administrative Procedures                         IV-11
      D.  Applicability Determinations                      IV-14
      E.  BACT/LAER Determinations                          IV-19
      F.  Ambient Monitoring                                IV-21
      G.  Ambient Impact Analysis                           IV-23
      H.  Emissions Offset Requirements                     IV-27
      I.  Permit Specificity and Clarity                    IV-29

  V.  COMPLIANCE ASSURANCE
      A.  Introduction                                       V-1
      B.  Major Findings and Conclusions                     V-2
      C.  Periodic Review and
          Assessment of Source Data                          V-3
      D.  File Review                                        V-4
      E.  Overview Inspections                               V-5

 VI.  AIR MONITORING
      A.  Introduction                                      VI-1
      B.  Major Findings and Conclusions                    VI-2
      C.  Quality Assurance                                 VI-4
      D.  Network Design and Siting                         VI-9
      E.  Data Handling                                     VI-11
      F.  Facilities, Staff, and Equipment                  VI-14

VII.  EVALUATION OF THE FY 1984 AIR AUDIT PROGRAM
      A.  Introduction                                     VII-1
      B.  Air Quality Planning and SIP Activities          VII-1
      C.  New Source Review                                VII-2
      D.  Compliance Assurance                             VII-4
      E.  Air Monitoring                                   VII-5
                                  iii

-------
                          I.  EXECUTIVE SUMMARY
A.   INTRODUCTION

     The National Air Audit System (NAAS) was developed as a joint effort
by EPA, the State and Territorial Air Pollution Program Administrators
(STAPPA), and the Association of Local  Air Pollution Control Officials
(ALAPCO).  The need for the NAAS arose as State and local air pollution
control agencies assumed responsibility for an increasing number of
programs under the Clean Air Act over the years.
The primary goals of the NAAS are to identify any obstacles that are
preventing State and local air pollution control  agencies from implement-
ing effective air quality management programs and to provide EPA with
quantitative information for use in defining more effective and meaningful
national programs.  The NAAS was implemented initially in FY 1984 and a
total of 68 State and local air pollution control agencies were audited
by teams composed primarily of EPA Regional  Office personnel.  The four
program areas selected for the FY 1984 audit were air quality planning
and State implementation plan activity, new source review, compliance
assurance, and air monitoring.

     The audits were conducted on-site at the State and local agency offices
and generally included a preliminary meeting with key agency staff to discuss
audit goals and procedures, discussion of the individual  audit questionnaires
with personnel in each of the four audit areas, a review of agency files
to verify implementation and documentation,  and an exit interview with agency
management to discuss the preliminary audit results.  Individual agency audit
reports were drafted by EPA Regional Office staff after the on-site visits.
These draft reports were reviewed by the audited  agency and then issued
by the Regional Office in final form to the audited agency.  The individual
audit reports were the bases from which the final FY 1984 national report
was prepared.

     The following sections summarize the findings in each of the four major
audited areas.

B.   AIR QUALITY PLANNING AND SIP ACTIVITIES

     Four major program components within the air quality planning and
State implementation plan (SIP) activities area were evaluated in the FY  1984
audit.  These areas were air quality evaluation,  emission inventories,
modeling, and SIP evaluation and implementation.

     The FY 1984 audit revealed that the majority of the audited agencies have
sound programs in most of these components,  but some agencies lag behind  in
basic planning functions.  Major findings from each of the components are
described below.
                                   I-1

-------
Air Quality Evaluation

     This portion of the audit surveyed how State and local  agencies use
air quality data in evaluating source impacts and planning activities.
This section covered four elements: air quality reports, Section 107
designations/redesignations, use of special studies monitoring data, and
air management/planning.  As highlighted below,  the audit  results clearly
indicate that the majority of the audited agencies are performing a basic
level of service in air quality evaluation.  Major findings  were:

     0  The activities of publishing annual air  quality reports and
evaluating attainment and nonattainment area  designations  based on those
data are being carried out by at least 75 percent of the audited  agencies.
Designations to attainment are more prevalent among agencies than are
designations to nonattainment.  Several agencies saw no incentive for
designating new nonattainment areas.

     0  Monitoring data, and to a lesser extent, modeling  results and
special studies, are being used by most audited  agencies to  evaluate and
investigate source impacts.  Relating ambient data to source impacts is
normally done on a case-by-case basis and is  usually not governed by
documented agency criteria.  Also, these activities are apparently not
systematically integrated into program evaluation and management
priorities.

Emission Inventories

     This portion of the audit surveyed the data maintained  in  the State
or local agency emission inventories.  It covered several  aspects of the
inventory including sources, emissions, and other data maintained; frequency
of updates; documentation, methodology, and quality assurance checks; and
computerization, report formats, and submittals to the National Emissions
Data System (NEDS).  The majority of the questions included in the audit
of this area were survey questions (i.e., agencies were asked only to
state the activities performed, and little verification of the responses
was required by the Regional Office).  A few questions focused on
the quality of the inventory.  Major findings were:

     0  Nationally, the audit results indicate that emission inventories
for almost all agencies contain data for major point sources.  However,
fewer agencies maintain special  inventories for  sources subject to new
source performance standards (NSPS) or national  emission standards for
hazardous air pollutants (NESHAP), minor unregulated point sources, and
area or mobile sources.

     0  The methodology and factors used to compute inventory data are
reportedly consistent with national guidance, and in most  cases,  adequately
documented.

     0  Most agencies reported some quality assurance or cross-checking
of the emission data for accuracy and completeness.  However, as  many as
18 percent of the audited agencies apparently do not employ  at  least a
basic cross-check at the agency level.
                                   I-2

-------
     0  Most inventories are computerized, but many control  agencies are
not currently capable of producing reports which summarize the impacts of
source controls or process changes, and some cannot aggregate emission
sources by category or by geography.  Absence of these capabilities might
impair their flexibility to track changes in emission trends or reasonable
further progress (RFP).  However, more investigation will  be necessary to
determine if this has serious national implications.
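
     To illustrate the reporting capability at issue, the following is a
minimal sketch (in Python) of aggregating a point source inventory by
source category and by county.  The records, field names, and figures
are hypothetical, not drawn from any agency's actual inventory system.

     # Minimal sketch: aggregate annual emissions (tons/year) by source
     # category and by geographic unit.  All records are hypothetical.
     from collections import defaultdict

     inventory = [
         {"source": "Plant A", "county": "Adams", "category": "utility boiler", "tpy": 412.0},
         {"source": "Plant B", "county": "Adams", "category": "surface coating", "tpy": 88.5},
         {"source": "Plant C", "county": "Baker", "category": "utility boiler", "tpy": 230.0},
     ]

     def aggregate(records, key):
         """Sum annual emissions grouped by the given field."""
         totals = defaultdict(float)
         for rec in records:
             totals[rec[key]] += rec["tpy"]
         return dict(totals)

     print(aggregate(inventory, "category"))  # emissions by source category
     print(aggregate(inventory, "county"))    # emissions by geographic unit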

     0  With respect to emission inventories, point source inventories are
reportedly adequate starting points for SIP development and  related modeling
analyses.  However, area source and mobile source inventories are receiving
less attention in frequency of updates and periodic reevaluation.  Most
agencies have historically emphasized point sources and deemphasized area
sources.  However, by 1987, for example, area source emissions of volatile
organic compounds (VOC's) will exceed point source emissions in most
urban nonattainment areas, and agencies will have to give  more attention
to area sources.  Area and mobile source emission inventories will therefore
receive additional treatment in the FY 1985 audit questionnaire.

Modeling

     This section of the audit gathered information on the modeling abilities
of the State and local agencies.  The objectives were to determine if
there was consistency in using air quality models, in following EPA
guidance, and in communicating with EPA Regional Offices.  The three topics
covered were the staff's knowledge and capabilities with regard to modeling,
documentation that nonreference in-house analyses are coordinated with
EPA, and agency review of outside modeling analyses performed by  sources
or their contractors.  The audit results indicate that:

     0  The majority of the audited agencies are apparently  knowledgeable
and capable of performing and reviewing most routine modeling analyses.
However, some of the agencies need improvement to attain a basic  level  of
knowledge.

     0  When agencies use nonreference procedures, EPA is  usually contacted.
However, EPA contact or approval is not always documented.

     0  Most agencies reported providing guidance to responsible  parties
prior to modeling by sources.  Subsequently, most agencies reviewed the
modeling analyses performed by the sources.  These reviews varied among
agencies.  Less than a third of the agencies verify all source modeling.
Whether verification is appropriate for all source modeling  analyses could
not be determined from the audit results.

SIP Evaluation and Implementation

     This section of the audit was a review of the periodic  evaluation
and implementation of SIP's at the State and local levels.  The FY 1984
audit focused on several  broad areas—timeliness of studies  and regulation
development, familiarity with EPA policy on site-specific  SIP revisions,
inspection and maintenance (I/M) and transportation control  measures
(TCM) implementation problems, and SIP coordination and validation.
                                   I-3

-------
     In general, the majority of the audited  agencies  are  making some
progress in submitting required SIP rules and in  completing  required studies.
However, more than half of the SIP commitments and  required  regulations
had either not yet been submitted or were in  progress  but  late.   State
and local agencies cited resource problems and lengthy review procedures
as causes of delay.  It should be noted  that  Regional  Offices lack
remedies to induce State and local agencies to make punctual  submittals.
The audit did not indicate major problems on  a national  scale in communi-
cation among the State and local agencies and the Regional Offices.
Isolated disagreements between State/local agencies and  EPA  Regional
Offices do exist, but this is apparently not  a national  problem.

     Other findings were:

     0  Few of the audited agencies have I/M programs that are known to
be achieving the emission reductions claimed in the SIP, but many programs
have not yet been in operation for any length of time.  A need for the more
detailed I/M audit planned in FY 1985 is apparent.

     0  Transportation control measures  (TCM's) are being  implemented  at
38 agencies.  In tracking TCM implementation, the method most used by  air
pollution control agencies is that of receiving reports  from implementing
units such as the State transportation agencies.  Apparently, only a few
agencies directly participate in urban transportation  planning.   Potentially,
TCM implementation could be improved if  a larger  number  of air pollution
control agencies were more actively involved  in the transportation planning
process.

     0  Two-thirds of the audited agencies indicated that  they could show
that the SIP reductions were being achieved in practice  for  point sources.
This has been accomplished primarily by  relying on  routine agency permitting
and inspection activities.  However, with regard to tracking of reasonable
further progress (RFP) toward attainment in ozone (O3) and carbon monoxide
(CO) extension areas, only about a dozen agencies appear able to document
the emission reductions claimed in their ozone and CO RFP demonstrations.
(RFP is a SIP element under the Clean Air Act that requires progressive
yearly emission reductions in all O3/CO extension areas.)  The
absence of effective tracking programs for RFP in agencies where it is
required indicates that most agencies either  give RFP  a  low  priority or
simply lack the ability to track diverse emission sources  in a nationally
consistent manner.  EPA should reexamine its  RFP  guidance  and determine
whether more guidance to the States will be necessary  to promote a
satisfactory level of tracking.
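
     As a simple illustration of what such tracking involves, the following
minimal sketch (in Python, with hypothetical figures) compares cumulative
documented reductions against the reductions claimed in an RFP
demonstration:

     # Minimal sketch of an RFP tracking check: compare cumulative
     # documented VOC reductions against the progressive yearly
     # reductions claimed in the SIP demonstration.  All figures
     # are hypothetical.

     rfp_claimed = {1983: 5000.0, 1984: 10000.0, 1985: 15000.0}  # tons/yr, cumulative
     documented  = {1983: 5200.0, 1984: 9100.0,  1985: 14800.0}  # tons/yr, cumulative

     for year in sorted(rfp_claimed):
         claimed = rfp_claimed[year]
         achieved = documented.get(year, 0.0)
         status = "on track" if achieved >= claimed else "shortfall"
         print(f"{year}: claimed {claimed:,.0f}, documented {achieved:,.0f} -- {status}")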

     0  Most agencies believe that the point and area source growth
projections included in the SIP adequately reflect current growth
rates.  Only about a fifth of the agencies periodically review
their SIP projections against growth data. This  level  of  activity may be
appropriate for most areas; however, it  may not be  adequate  in those
areas where high rates of growth are occurring.
                                    I-4

-------
C.  NEW SOURCE REVIEW

     A specific objective of the FY 1984 new source review (NSR) audit was
to promote reasonable consistency and quality in the way that State and
local agency NSR programs are carried out.  In order to meet this objective,
the FY 1984 audit examined the ways that State and local agencies require
and actually implement the preconstruction review of new and modified
stationary sources.  The findings contained in this report are based upon
the responses of 64 air pollution control  programs, including 49 States*,
the District of Columbia, Puerto Rico, the Virgin Islands, and 12 local
agencies.

     In summary, the FY 1984 audit was successful as an initial  assessment
of the present national NSR framework, as  well as a valuable feedback
mechanism to identify where EPA should emphasize future policy development.
The audit verified that the 64 audited State and local  agencies  are generally
familiar with and have strong support for  the preconstruction review process.
Most often, these agencies were found to have comprehensive programs and to
function well in accomplishing the overall goals of the national NSR program.
EPA now has a much better national data base that documents many of the
NSR policies and procedures used by these  agencies.

     Various problems were encountered during this first national  audit
effort that limited its ability to provide definitive conclusions regarding
the way that many of the program details actually function.  Three particular
types of problems that were experienced were the inconsistent quality of the
individual audit reports submitted to EPA  Headquarters  for analysis, the
poor condition of a significant number of  permit files  that were selected
for auditing, and the limited number of major source permits that were
issued during the period covered by the audit.  Some of these problems,
particularly those concerning the quality  of the audit  reports,  will be
corrected in next year's audit.

     The FY 1984 NSR audit was divided into seven major topics.   The major
findings and conclusions from this initial national audit are described
below.

Administrative Procedures

     The audited agencies generally performed adequately in terms  of the
way that permit applications are checked for completeness, tracked during
the review process, and (in nonattainment  areas) verified for Statewide
source compliance.  However, only 61 percent of the agencies routinely
provide the public with an opportunity to  comment on each proposed major
source, and only 31 percent routinely provide for public comment on all
sources (major and minor) required to obtain permits.  Based on  this finding,
EPA needs to reassess its public participation requirements, particularly
with respect to minor source coverage.
*The State of California does not have authority to issue permits  and does
 not implement a new source review program.   These activities  are  performed
 by the local air pollution control  agencies.

                                   I-5

-------
Applicability Determination

     With only a few exceptions, agencies appear to use the appropriate
geographic applicability criteria, adhere to SIP-approved definitions  of
"source," and address fugitive emissions in calculating a source's potential
to emit.  They also appear to use actual emissions (as required)  to determine
if a "net significant emissions increase" would occur for a source modifi-
cation.  A significant problem was revealed, however, in that  approximately
47 percent of the agencies seem to misuse the concept of "potential to
emit" when determining a source's applicability to major source review for
offsets in nonattainment areas and prevention of significant deterioration
(PSD) determinations.  Poor file documentation, inconsistent permit condi-
tioning practices, and problems with the issue of Federal  enforceability
contribute to the overall  problem.
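
     For illustration, the following minimal sketch (in Python) shows the
applicability arithmetic at issue: potential to emit is computed at maximum
design capacity and year-round operation unless federally enforceable
permit conditions restrict it.  The emission rate, operating limit, and the
100 tons-per-year threshold used here are illustrative only; actual
thresholds depend on the pollutant, the area's attainment status, and the
source category.

     # Minimal sketch of a "potential to emit" (PTE) applicability check.
     # The threshold and source figures are hypothetical/illustrative.
     HOURS_PER_YEAR = 8760
     MAJOR_SOURCE_THRESHOLD_TPY = 100.0

     def potential_to_emit(max_rate_lb_per_hr, enforceable_hours=None):
         """Tons/year at maximum capacity, limited only by federally
         enforceable permit conditions (if any)."""
         hours = enforceable_hours if enforceable_hours is not None else HOURS_PER_YEAR
         return max_rate_lb_per_hr * hours / 2000.0  # 2000 lb per ton

     # Without an enforceable operating limit, this source is major:
     pte = potential_to_emit(30.0)
     print(pte, pte > MAJOR_SOURCE_THRESHOLD_TPY)            # 131.4 tpy -> major
     # With a federally enforceable 4000 hr/yr limit, it is minor:
     pte_limited = potential_to_emit(30.0, 4000)
     print(pte_limited, pte_limited > MAJOR_SOURCE_THRESHOLD_TPY)  # 60.0 tpy -> minor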

BACT/LAER Determinations

     Agency policy and procedures appear to be adequate in most cases to
address the requirements for best available control technology (BACT)
in PSD areas and lowest achievable emission rates (LAER) in nonattainment
areas.  However, the tendency of at least 17 agencies to set BACT at or
near the minimum acceptable level is a basis for further examination of the
way that agencies establish BACT on a case-by-case basis.

Ambient Monitoring (PSD)

     Based on a limited number of examples, agencies appear to be ensuring
that PSD applicants provide adequate preconstruction ambient air  quality
data.  Some of the audit responses suggest that there is a substantial
reliance on the use of existing representative air quality data instead of
data collected from source-operated monitoring networks.  Consequently,
next year's audit will focus more upon the agencies' approval  criteria for
the use of existing representative data.

Ambient Impact Analysis

     In most cases, agencies appear to give adequate procedural consideration
to the protection of the national ambient air quality standards,  PSD increment
consumption, and Class I areas with respect to major sources.   This includes
requiring applicants to use EPA reference models to  carry  out  the required
ambient impact analyses.  However, the use of actual (rather than allowable)
emissions and other technical aspects of modeling by some  agencies needs
further study before their analyses can be considered fully satisfactory.
Also, the audits indicated that approximately 40 percent of the agencies may
rely too heavily on the applicant's analyses without adequate  review of the
modeling procedures and results.  With respect to minor sources, auditors
recommended that 20 percent of the agencies develop a tracking system
for minor source emissions in areas where the PSD baseline has been
triggered.  EPA guidance is needed, particularly with respect to PSD increment
consumption during short-term averaging periods.
                                    I-6

-------
Emissions Offset Requirements

     No meaningful  major findings could be identified in this area from
this initial national  audit.  This was due largely to the limited opportunities
that agencies had to issue permits to major construction projects in non-
attainment areas.  Responses in some audit reports did indicate, however,
that some agencies may not be fully aware of existing offset criteria.  EPA
should seriously consider the need to clarify and increase its guidance
regarding the appropriate use and timing of emissions offsets.

Permit Specificity and Clarity

     The new source review audit results raised some important questions
concerning the enforceability of permits issued by many agencies.  Agencies
were found to use a variety of techniques to establish enforceable source
limitations for such things as defining allowable emissions, designating
applicable compliance testing procedures, and restricting the operation and
production capacity of sources to qualify them as minor.  EPA needs to
carefully examine the issue of permit enforceability, including clarification
of the minimum criteria for Federal enforceability of permit conditions.
Only after such careful  examination can EPA fairly assess the adequacy of
current State and local  agency permit issuance practices.

D.   COMPLIANCE ASSURANCE

     Many States and locals showed one or more strong points characteristic
of a successful air compliance program, such as high source compliance
rates supported by high inspection frequency rates, performance of all
required NSPS source tests, expeditious resolution of violators, and few
long-term violators.  These activities were adequately reflected and vali-
dated by the national  compliance data system.  Other States had source
files that were for the most part well organized, up-to-date, and complete,
reflecting a reasonable profile of each source.

     However, the compliance audits also revealed that several States and
locals, to a varying extent, have weaknesses in three areas vital to a
strong and effective compliance program.  First, files generally do not
contain strong and verifiable information reflecting a reasonable profile
of each source.  Second, many inspection reports are of poor quality (no
mention of operating or emission parameters or pollutants emitted).  It is
unclear whether the problem is with the inspections themselves or just the
reports.  Third, the reviewed agencies' enforcement efforts are not always
effective in reducing the number of long-term violators by expeditiously
returning documented violators to compliance.

E.   AIR MONITORING

     The Federal monitoring regulations which were promulgated in 1979
require audits as part of the quality assurance program.  In particular,
State and local agencies are required to participate in EPA's national per-
formance audit program and to permit an annual  EPA system audit of their
ambient air monitoring program.  Consequently,  in 1980 EPA issued compre-
hensive guidance for conducting system audits of State or local agencies.
Because of the lack of national consistency in  conducting the annual
                                    I-7

-------
system audit, the air monitoring audit committee developed in 1983, as part
of the NAAS effort, an interim monitoring questionnaire for use in FY 1984.
The interim questionnaire, which was to be revised for use in FY 1985,
was intended to provide an overview and summary assessment of the ambient
air monitoring program.

     Based on the audit results and periodic  status reports on the  State
and Local Air Monitoring Stations (SLAMS) and National Air Monitoring
Stations (NAMS) networks, it can be concluded that  nationwide, State and
local  agencies have done a  commendable job in their efforts to establish
and operate criteria pollutant ambient air monitoring networks.   About  94
percent of the agencies audited were  found to have  all of  their  monitors
in full compliance with network design and siting criteria.   This finding
compares closely with EPA's annual SLAMS status report, which indicated that
about 97 percent of the 4888 SLAMS* were operating and complying with the
monitoring and reporting regulations.  These figures indicate a most satis-
factory performance overall.  Results of the audits conducted in FY 1984,
however, did identify several monitoring activities that need further
attention, as well as confirm previous knowledge of other problem areas.  These
items include late data submissions,  failure  to  attain 75  percent data
capture, the need to replace old or worn out  equipment,  incomplete  or
outdated standard operating procedures, and inadequate precision and accuracy
data submittals.

     Approximately 32 percent of the  audited  agencies typically  submit
ambient NAMS data later than required  by regulation (90  days  after  the  end
of each quarter).  Several  reasons  are cited  for this situation  including
staff shortages, inadequate computer  capabilities,  length  of  laboratory
analysis time for lead, and the need for additional  data validation time.
Two actions are now underway to reduce the magnitude of the problem.
First, several States have initiated steps to upgrade their computer capabil-
ities.  Second, a regulatory change to the Part 58 monitoring regulations
is being developed that will extend the deadline for submitting data
from 90 to 120 days after the end of the calendar quarter in which the
data were collected, to better reflect EPA's data needs.
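
     The deadline rule itself is simple date arithmetic.  The following
minimal sketch (in Python) computes the submittal deadline for a sample
collected on a given date, under both the 90-day rule and the proposed
120-day rule; the dates are illustrative.

     # Minimal sketch of the quarterly submittal deadline: data are due
     # a fixed number of days after the end of the calendar quarter in
     # which they were collected.
     from datetime import date, timedelta

     def submittal_deadline(sample_date, grace_days=90):
         """Deadline for data collected on sample_date."""
         end_month = ((sample_date.month - 1) // 3 + 1) * 3
         if end_month == 12:
             quarter_end = date(sample_date.year, 12, 31)
         else:
             # First day of the following month, minus one day.
             quarter_end = date(sample_date.year, end_month + 1, 1) - timedelta(days=1)
         return quarter_end + timedelta(days=grace_days)

     print(submittal_deadline(date(1984, 2, 15)))        # 90-day rule: 1984-06-29
     print(submittal_deadline(date(1984, 2, 15), 120))   # proposed rule: 1984-07-29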

     Maintaining a 75 percent data  capture rate  was a problem for roughly
34 percent of the audited agencies.  Further  study  is needed  to  determine
the major cause of the problem.  In reviewing the data capture results  for
the 66 percent of the agencies without significant  problems meeting the 75
percent goal, it was determined that 90 percent  of  their sites met  the  75
percent completeness criteria.  While  the goal is to have  100 percent  of
the sites meeting completeness criteria, it is unlikely  from  a practical
viewpoint that more than 85 to 90 percent is  achievable.  Data are  lost due
to both scheduled events such as preventive maintenance  tasks and quality
control functions (calibration/zero and span/precision checks) and  unsched-
uled events such as repairs, vandalism, site  maintenance or construction,
power failure, loss of lease agreement or the like.
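
     A data capture check of the kind discussed above reduces to a simple
completeness ratio.  The following minimal sketch (in Python, with a
hypothetical count of valid hourly values) tests one site against the
75 percent goal:

     # Minimal sketch of a data capture (completeness) check for one
     # monitoring site over a calendar quarter.  Counts are hypothetical.

     def data_capture(valid_observations, possible_observations):
         """Percent of scheduled observations actually captured."""
         return 100.0 * valid_observations / possible_observations

     hours_in_quarter = 91 * 24   # e.g., a 91-day quarter of hourly sampling
     valid_hours = 1750           # hypothetical count of valid hourly values

     pct = data_capture(valid_hours, hours_in_quarter)
     print(f"{pct:.1f}% capture -- {'meets' if pct >= 75.0 else 'fails'} the 75% goal")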

     Results of the air monitoring audit also revealed that some agencies
were experiencing failures of old or worn-out equipment.
These findings were consistent with a 1983 STAPPA/ALAPCO air monitoring
*1362 of these are National Air Monitoring Stations (NAMS).

                                    I-8

-------
equipment survey.  In response to these needs, EPA has recommended that a
portion of the Section 105 air grants be allocated for replacement of
ambient instrumentation.

     All of the State Quality Assurance Plans have received formal approval
by the EPA Regional  Offices.  However, since the formal  approval  of these
plans, Federal reference method procedures have changed and, in some cases,
better operational  procedures have been devised.  The audit results indicate
that several State and local agencies need to modify their Quality Assurance
Plans to address these changes.  The necessary modifications are currently
in progress in some  agencies and will be made by others  in the near future.

     No significant  deficiencies were identified with respect to the annual
network reviews or the annual SLAMS Air Quality Data Report.  The few minor
problems discovered  will be easily corrected.

     The precision and accuracy audit results initially suggested a substantial
problem, both in the adequacy of responses to the question and in the failure
to attain precision and accuracy goals.  However, it is believed that the
largest part of this problem can be attributed to the particular question
asked, because only 48 percent of the respondents provided an adequate response.
Based on available data in EPA's data base, it can be concluded that most
agencies are providing the required precision and accuracy data and
have been improving the data quality over the last three years.  It appears the
majority of the agencies should meet the precision and accuracy goals
established in the survey (plus or minus 15 percent for precision, plus
or minus 20 percent for accuracy) within the next few years.  An effort to
remove the ambiguity of this question and other problem  questions contained
in the audit questionnaire is underway.  The new questionnaire will attempt
to minimize the resubmission of large blocks of information already
in EPA's possession and to limit each question to a single subject.
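
     For illustration, the following minimal sketch (in Python) checks a
site's reported precision and accuracy results against the goals cited
above.  The reported probability limits are hypothetical, and the simple
comparison shown is only a stand-in for the full regulatory assessment
procedure.

     # Minimal sketch: compare reported precision and accuracy limits
     # (percent) against the survey goals.  Reported values hypothetical.
     PRECISION_GOAL = 15.0   # plus or minus, percent
     ACCURACY_GOAL = 20.0    # plus or minus, percent

     def meets_goal(limits, goal):
         """True if both reported limits fall within plus or minus goal."""
         low, high = limits
         return abs(low) <= goal and abs(high) <= goal

     precision_limits = (-9.4, 11.2)   # hypothetical reported limits
     accuracy_limits = (-17.8, 6.3)

     print("precision:", "meets goal" if meets_goal(precision_limits, PRECISION_GOAL) else "outside goal")
     print("accuracy: ", "meets goal" if meets_goal(accuracy_limits, ACCURACY_GOAL) else "outside goal")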

F.   EVALUATION OF THE FY 1984 AIR AUDIT EFFORT

     The FY 1984 NAAS has proven to be a valuable tool to focus attention
upon various aspects of the national air quality management program that
need improvement.  In some cases, it has shown the need  for improved
guidance from Headquarters and in other cases it indicated areas  where
State and local control agencies need to remedy deficiencies.  These
conclusions are apparent even though in some cases the audit responses
were inconsistent with regard to comprehensiveness, level of detail, and
other factors.  In some instances, ambiguous audit questions led  to
difficulties in obtaining clear responses.

     Adjustments to the audit program in FY 1985 have been made in order to
make it even more useful.  These adjustments include redesigning the
audit questionnaires to make the questions clearer and more uniform,
streamlining the questionnaires by removing unnecessary material, and
developing more specific guidelines for the FY 1985 audit program.
                                   I-9

-------
                            II.  INTRODUCTION


     Auditing of State and local agencies is not a new activity, as the
EPA Regional Offices have been conducting some form of evaluation and
audit program for many years.  However, the frequency, depth, and procedures
for conducting these audits have varied greatly from Region to Region
within EPA and even from State to State within a Region.  This inconsistency
was reflected in a report issued by EPA's Office of Air Quality Planning and
Standards (OAQPS) on the EPA Regional  Office air audit programs (Survey
of Regional  Air Audit Programs, June 1983).  Even before this survey, the
State and local agencies had begun expressing concern about the inconsistency
of the Regional oversight programs.  The general tenor of the comments was
that EPA should develop uniform evaluation criteria that could be applied
nationwide.   This concern led to the development of the National  Air
Audit System (NAAS), a joint effort of the State and Territorial  Air
Pollution Program Administrators (STAPPA), the Association of Local  Air
Pollution Control Officials (ALAPCO),  and EPA.

     The need for the NAAS evolved as  State and local air pollution
control agencies assumed responsibility under the Clean Air Act for an
increasing number of programs.  The EPA responded to the concerns of the
STAPPA and ALAPCO members by agreeing to participate in a STAPPA/ALAPCO/EPA
workgroup charged with developing and directing the implementation of an
auditing system that would ensure the  desired national consistency and
would confirm that State and local air pollution control programs were
operating in such a manner as to satisfy the national requirements of the
Clean Air Act.

     The workgroup decided that the primary goals of the NAAS should be
to identify any obstacles preventing State and local agencies
from implementing an effective air quality management program and to
provide EPA with information which could be used to develop more  effective
and meaningful  national programs.  The NAAS would provide audit guidelines
that EPA and State and local agencies  could use (1) to meet statutory
requirements; (2) to assist in developing an acceptable level  of  program
quality; (3) to account for the achievements, shortcomings, and needs of
various air programs; (4) to identify  programs needing further technical
support or other assistance; and (5) to manage available Federal, State
and local resources effectively so that the national ambient air  quality
standards are attained and maintained  as expeditiously as possible.

     In late 1982, the STAPPA/ALAPCO/EPA workgroup reached an understanding
on the development of a national air audit program.  In April  1983,  the
group identified the audit topics and  appointed subcommittees to  write
the audit guidelines.  The four program areas selected by the workgroup
for which guidelines would be written  were air quality planning and  State
implementation plan (SIP) activity, new source review, compliance assurance,
and air monitoring.  Standardized audit guidelines for each program  area
were written by the subcommittees.  The subcommittees were chaired by a
                                   II-1

-------
State agency person, with an EPA staff person  serving  as  coordinator.
Local agencies and the EPA Regional  offices were  also  represented  on each
subcommittee.  In October 1983,  the  workgroup  developed the protocol for
implementing these audit guidelines.

     The guidelines were used by EPA Regional  Offices  in  FY 1984 to audit
68 State and local air pollution control  programs,  including all States
except California, plus Puerto Rico, the  Virgin  Islands,  and the District
of Columbia.  The local agencies audited  were:


               Albuquerque, NM                  Cook County, IL
               Allegheny County, PA              Dayton, OH
               Birmingham, AL                   Louisville, KY
               Charlotte, NC                    Philadelphia, PA
               Chattanooga, TN                  Puget  Sound, WA
               Chicago, IL                      Spokane County,  WA
               Cincinnati, OH                   Ventura County,  CA
               Cleveland, OH                    Wayne  County, MI

     The California State Agency was not  audited  because  the local  district
agencies there are responsible for implementing  the various air  quality
management programs.  Not all program areas were audited in each agency
because not all agencies performed all four of the activities selected
for audit.

     The involvement of State/local  personnel  as  members  of the  audit
teams was encouraged by the workgroup, but  only  a few  agencies were able
to participate this first year.   Out-of-state  travel restrictions,  scheduling,
and travel costs were factors which  prevented  greater  participation.

     The level of involvement of the Regional  Office management  staff  in
the audit visits varied from Region  to Region.   The initial meetings and
exit interviews with the agencies were led  by  Regional Office staff
ranging from Division Directors  to Section  Chiefs.  One Regional Office
held a public meeting with each  State agency a day  before the site visits.

     The audit teams varied in size, and not all four of the program areas
were always audited at the same time.  The number of auditors in an
agency at any one time rarely exceeded five.

     EPA Headquarters personnel observed 13 audits.  This was done to
provide a national overview of the audits and was part of the quality
assurance program to which STAPPA, ALAPCO, and EPA had agreed.

     The protocol used by the EPA Regional Offices in conducting the
audits included advance preparation before the site visit, an initial
meeting with the agency director, discussions  with  agency staff, review
of the agency files, and an exit interview.
                                   II-2

-------
     The advance preparation involved, among other things, sending a
letter to the agency well in advance of the audit to confirm the date
and time and to identify the individuals performing the audit.  The
guidelines and questionnaires were also provided to the agencies with a
request to complete the questionnaire before the visit and, in some cases,
to return them to the EPA Regional Offices.

     The site visits were conducted generally in four phases:

     0 The audit team met with the agency director and key staff to
discuss the goals and the procedures to be followed.

     0 The auditors discussed the questionnaire with the personnel in
charge of each of the four audited activities.

     0 The agency files on compliance, permits, air monitoring, and SIP
documents were reviewed to verify the implementation and documentation of
required activities.

     0 An exit interview was held to inform agency management  of the
preliminary results of the audit.

     The Regional Offices drafted a report after each site visit and
requested that the audited agency review it before it was made final.
The individual agency audit reports were used by EPA to compile and write
the final FY 1984 national report.

     The national report was extensively reviewed in draft form before it
was made final.  Several persons who commented on the draft of the report
suggested that it should contain recommendations for addressing the
deficiencies that were uncovered.  Although some recommendations appear
in the report, more recommendations were not included for the  following
reasons:

     0 This is the first experience with the national air audit system.
The primary purpose of this initial audit was to establish a baseline of
knowledge about activities of the State and local agencies.

     0 State-specific deficiencies uncovered by the Regional Office audit
teams in individual agencies are being addressed through the Section 105
grants and other administrative mechanisms that involve interaction
between the Regional Offices and the State and local agencies.

     0 EPA will address the national implications of many of the more
prevalent deficiencies identified in the FY 1984 audit when EPA revises
its FY 1985 operating plan and develops its FY 1986 program plan and its
FY 1987 budget.

     0  A number of the deficiencies cited in the FY 1984 national report
may have been a result of vagueness in the questionnaires used in the
audits.  For example, some responses to a number of questions  could not
be understood because the respondents were apparently confused by the
wording of the questions.  Where this happened, EPA has significantly
revised questions for the FY 1985 audits to minimize the vagueness.


                                   II-3

-------
     0 EPA expects to address  more  recommendations  in the  FY  1985
national  report.  The audit  for FY  1985  has  been  redesigned in many
respects to learn more about the nature  of some of  the deficiencies
uncovered in the FY 1984 audit.

     The implementation of the NAAS in FY 1984 was  the first  step of an
evolving process.  The FY 1984 audits provided an opportunity for EPA
to identify differences between EPA policy,  as reflected in the  audit
guidelines, and the implementation  of policy by State and  local  agencies.
The EPA will use the findings  of the FY  1984 audits to establish a baseline
of information on State and  local air pollution control programs.  The
progress agencies are making to improve  their program operations will
then be measured in future EPA audit activities.

     For State and local agencies,  the National Air Audit  System is an
important step in ensuring national  consistency in  the air programs and
in providing another opportunity to exchange ideas  with and to learn from
other air pollution control  agencies.  Above all, the NAAS should result
in a national air quality program that will  more  effectively  and
expeditiously achieve the goals of  the Clean Air  Act.
                                   II-4

-------
              III.  AIR QUALITY PLANNING AND SIP ACTIVITIES


A.   INTRODUCTION

     Fifty-two State and nine local air pollution control agencies1 were
audited for air quality planning and SIP activities.  This was the first
time a national audit of this type had been made of their performance in
this area.  The 41 questions covered in this chapter were developed from
guidelines prepared jointly by U.S. EPA and State and local agencies.

     After receiving the drafts of the individual agency audit reports,
EPA classified the responses from each agency into 60 separate answers
corresponding to each question and its parts.  Each agency's response and
comments were then tabulated onto a worksheet from which national summaries
were compiled.
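
     The tabulation step can be pictured as a simple tally across agencies.
The following minimal sketch (in Python, with hypothetical agency names,
question identifiers, and answers) illustrates the kind of per-question
tally from which such national summaries can be compiled:

     # Minimal sketch: count responses per question across agencies.
     # All identifiers and answers are hypothetical.
     from collections import Counter, defaultdict

     responses = {
         "Agency 1": {"Q1": "yes", "Q2": "no"},
         "Agency 2": {"Q1": "yes", "Q2": "no response"},
         "Agency 3": {"Q1": "no",  "Q2": "no"},
     }

     summary = defaultdict(Counter)
     for agency, answers in responses.items():
         for question, answer in answers.items():
             summary[question][answer] += 1

     for question in sorted(summary):
         print(question, dict(summary[question]))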

     This chapter summarizes the 1984 audit findings of the Air Quality
Planning and SIP Activities chapter in each of its four major areas: air
quality evaluation, emissions inventories, modeling,  and SIP evaluation/
implementation.  The major findings and conclusions of each area are
presented first, followed by results of individual audit questions in
each area.  A few selected questions are illustrated with bar charts.

B.   MAJOR FINDINGS AND CONCLUSIONS

     Air Quality Evaluation

     It is clear that a majority of the 61 audited agencies which are
covered in this chapter are performing a basic level  of service in air
quality evaluations.

     1.  Air Quality Reports

     Nationally, the results indicate that the activities of publishing
air quality data and evaluating attainment and nonattainment area designa-
tions based on that data are being carried out by most agencies.

         0 Agencies appear to be doing a good job disseminating air quality
           information.  Eighty-three percent of the State and 78 percent
           of the local control agencies published data annually or more
           often (see Figure 1).
1. For the purposes of this chapter, the 52 "State" agencies include: 49
    States (all except California), the District of Columbia, Puerto
    Rico, and the Virgin Islands.  The 9 local agencies include: Albuquerque,
    New Mexico; Birmingham, Alabama; Charlotte, North Carolina; Chattanooga,
    Tennessee; Louisville, Kentucky; Philadelphia, Pennsylvania; Pittsburgh,
    Pennsylvania; Puget Sound, Washington; and Ventura County, California.
    In addition, Chicago and Cook County, Illinois, are included in the
    section on emissions inventories.
                                  III-1

-------
               FIGURE 1.  HOW OFTEN DOES YOUR AGENCY
                      PUBLISH AIR QUALITY DATA?
     [Bar chart: 61 agencies audited (52 States, 9 locals); response
      categories: annually, quarterly, no publication.]
     2.   Section 107 Redesignations

         0 Although 75 percent  of  the control agencies reviewed Section 107
           redesignations  during 1983 and either have submitted or are in
           the process of  submitting these  to EPA, it should be noted that
           several  agencies  see no incentive for designating new nonattainment
           areas.

     3.   Special Studies

     Special  studies are used by most agencies to evaluate existing point
source limits.  These activities sometimes  result in revised SIP emissions
limitations.

         0 Seventy  percent of the  audited agencies report that they
           reenter  the planning process and revise SIP or permit limits
           when there are  conflicts between special study data and existing
           data.

     4.   Air Management/Planning

     Most agencies  use monitoring  data and  modeling results to focus on
source evaluation and investigation.  However, the results also indicate
that the process of relating ambient data to source impacts is usually
done as needed and  is not  defined  by a formal plan or by established
criteria documented within the  agency.
                                  III-2

-------
         0 Only 34 percent have a formally documented way of using exceedance
           data to help determine program priorities on a geographic
           basis.

         0 Only 43 percent use internally documented criteria or procedures
           for relating source impacts to ambient data.

     These results indicated that air quality data are periodically
reviewed and used on an as-needed basis for source analyses.  On the
other hand, in the majority of agencies, there is an absence of a documented
program to review and evaluate air quality data and make the appropriate
SIP planning corrections.  States nationally have the capability to use
air quality data effectively to meet specific needs, but apparently do
not have documented procedures to integrate these data into the planning
program.

     Emissions Inventories

     1. Sources, Emissions, and Other Data Maintained in Emissions Inventory

     Nationally, the audit indicates that emissions inventories for almost
all agencies contain data for major point sources.  However,  many do not
cover NSPS, NESHAP, unregulated minor point sources, and area sources.

         0 Ninety-seven percent (61 out of 63 agencies) maintain emissions
           and other data for major point sources.

         0 Sixty-seven percent maintain both actual and allowable emissions
           rates.

         0 Sixty percent maintain NSPS and 56 percent maintain NESHAP
           emissions data.

         0 Fifty-two percent maintain emissions and other data on unregulated
           minor sources and area sources.

     2.  Update Frequency

         0 Sixty-five percent of agencies update the criteria pollutant
           emissions inventory annually; this is usually performed through a
           combination of permitting, inspecting, and source reporting.
           Forty-three percent indicate that they update inventories continually
           or as changes occur.

     3.  Methodology, Documentation, Quality Assurance

     The methodology and factors used to compute inventory data are
apparently consistent with national  guidance and in most cases, adequately
documented.  There appears to be some kind of cross-checking of the data
for accuracy and completeness; however, these are predominately error
checks.  Comprehensive quality assurance of emissions inventories is not
normally performed by most agencies.
                                  III-3

-------
         0 Eighty-four percent of the  agencies  believe  that  the  inventory  is
           adequately documented.   Only  one  agency  reported  having  inadequate
           documentation of its inventory.

         0 Eighty-four percent of the agencies perform some type of validity
           check, 76 percent check for missing sources, and 71 percent
           compare the inventory to other records such as permits or
           enforcement files (a minimal sketch of such checks follows
           this list).

         0 Sixty-seven percent use mobile  source  data consistent with DOT/MPO
           transportation data or Section  208 water quality  projections.
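
     The following minimal sketch (in Python) illustrates two of the basic
cross-checks reported: a validity (range) check on emission values and a
missing-source check against permit records.  All records and figures are
hypothetical.

     # Minimal sketch of two basic inventory cross-checks: a validity
     # (range) check and a missing-source check against permit records.

     inventory = {"Plant A": 412.0, "Plant B": 88.5, "Plant C": -3.0}  # tons/yr
     permitted_sources = {"Plant A", "Plant B", "Plant C", "Plant D"}

     # Validity check: flag emission values outside a plausible range.
     for source, tpy in inventory.items():
         if not 0.0 <= tpy <= 100000.0:
             print(f"validity: {source} has an implausible value ({tpy} tpy)")

     # Missing-source check: permitted sources absent from the inventory.
     for source in sorted(permitted_sources - inventory.keys()):
         print(f"missing: {source} holds a permit but is not in the inventory")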

     4.  Computerization, Report Formats, NEDS Submittals

     Most inventories are computerized,  but  many  agencies cannot produce
reports summarizing impacts of source  controls  or process changes,  and
some agencies cannot aggregate sources by  category  or by geography.  The
inventories, as described by most agencies,  are apparently adequate as
starting points for control  strategy evaluations.

         0 Eighty-two percent say the  inventory has adequate temporal and
           spatial resolution for SIP  analyses  and  related modeling
           activities.  However,  most  agencies  did  not  indicate  if  this
           degree  of resolution extends  to projected year inventories or
           just the baseline inventory.   It  was also not clear if mobile
           and area source inventories are capable  of the same resolution
           as point sources.

         0 Eighty-one percent of all agencies maintain  their inventories on a
           computer.

         0 Seventy-six percent can aggregate emissions  sources by source
           category, and 74 percent can  aggregate sources by geographic
           units such as counties or census  tracts.

         0 Seventy-three percent of the  agencies  say they submit NEDS data
           annually to the EPA Regional  Office.

     Modeling

     1.  Knowledge and Capabilities

         0 The majority of agencies are  apparently  knowledgeable and capable
           of performing and reviewing most  routine modeling analyses.
            Sixty-nine percent of the State and local agencies audited have a
            basic knowledge or better with respect to the use of EPA models
            and guidance; however, 14 agencies (23 percent) either do not
            have the ability to use EPA models and guidance or can run
            only screening models (see Figure 2).

         0 Apparently, about 40 percent of modeling is done within the agency
            (in-house), 30 percent is done outside by a source or contractor, and
                                        III-4

-------
      the other 30 percent could not be determined.   Generally,  the
      audit results were not specific enough,  for most  agencies,  to
      determine what types (new or existing source,  PSD,  etc.) of modeling
      these figures cover.

     0 Eighty-three percent of the control agencies have in-house models
       and data bases or computer links to them (at least 23 agencies
       have EPA UNAMAP versions 4 and 5 models on-line).

    0 Eighty-two percent believe that their staff keep  abreast of
      current modeling practice.

    0 Seventy percent of agency staffs are,  or claim to be,  familiar
      with modeling for bubbles, redesignations,  new source  review
      (NSR), or nonattainment areas.
             FIGURE 2.  WHAT IS STAFF KNOWLEDGE ON
                     EPA MODELING GUIDANCE?
     [Bar chart: 61 agencies audited (52 States, 9 locals); knowledge
      categories: advanced, basic, no response.]
2.  Documentation and Guidance

     0 When agencies use nonreference modeling procedures, EPA is usually
       contacted; however, the EPA contact or approval is not always
       documented.

    0 Although 32 agencies claim  to have  sought EPA's approval on
      nonreference procedures,  only 19 documented the request.
                             III-5

-------
         0 Fifty-three percent (32 agencies)  said  EPA  was  routinely  contacted
           for approval  of nonreference  procedures,  although  it  is not
           clear that prior approval  was routinely sought.  Thirty-nine
           percent apparently never used nonreference  procedures and five
           percent said they did not contact  EPA when  using them.

         0 Seventy-one percent of the control  agencies indicated that
           documentation is available to show that EPA guidelines and
           procedures are followed.  Only four agencies (seven percent)
           reported that deviation from  EPA procedures had not been
           documented.

     3.  Control Agency Review of "Outside" Modeling Analyses

         0 Eighty-five percent of the agencies provide guidance  to sources
           or contractors prior to model  application.

         0 Seventy-four percent reviewed outside modeling  by  a source or
           contractor in 1983.  In reviewing  modeling  analyses by sources
           or contractors, agencies use  varying levels of  sophistication
           from complete replication to  simplified screening  models.
           Eighteen agencies (29 percent) replicated or verified all
           outside modeling.  Another 35 percent performed some  type
           of verification, such as a review  using screening  models,  on
           at least some of these outside analyses.  Twenty-one  percent
           of the agencies performed no  verification.

     SIP Evaluation and Implementation

     1.  Timeliness of Studies and Regulation Development

     In general, agencies were making some progress in submitting required
rules and in completing required studies.  The audit did not  indicate
major problems, on a national scale,  in  communication  among the  State and
local agencies and EPA Regional Offices.
         0 Where formal schedules were approved in the SIP, only 25 percent
           of the agencies reported that all required regulations had been
           developed and submitted to EPA.  While another 54 percent of the
           agencies reported that they were making progress, submittals were
           still either missing or late (see Figure 3).  Available resources
           and lengthy State procedures were cited as problems.

         0 Thirty-three agencies said additional studies (for CO hotspots,
           TCM's, TSP, etc.) were required in the SIP, and of those, 27 said
           the intent of such schedules had been carried out.

         0 Thirty-four percent (21 agencies) said the SIP contained
           schedules for adoption and implementation of Part D rules, but
           only three agencies identified slippages in the schedules.  It was
           not evident from the responses how most agencies track schedules
           or who is responsible for this.  It was also not clear whether
           affirmative responses refer to formally submitted schedules or
           subsequently agreed-to schedules.
                                  III-6

-------
        FIGURE 3.  HAVE ALL REQUIRED SIP ELEMENTS BEEN SUBMITTED?
     [Bar chart: 61 agencies audited (52 States, 9 locals); response
      categories include submitted, progress, no response, not
      applicable.]

        FIGURE 4.  ARE SITE-SPECIFIC SIP REVISIONS CONSISTENT
                         WITH EPA GUIDANCE?
     [Bar chart: 61 agencies audited (52 States, 9 locals); response
      categories include yes, no response, not applicable.]
                        III-7

-------
         0 Fifty-four percent (33 agencies)  indicated progress  was  being
           made on SIP deficiencies  identified  by  the EPA  Regional  Office.
           Two agencies indicated progress was  not being made and ten
           agencies either did not respond or their answers  were  inconclusive.

     2.  Familiarity with EPA Policy on Site-Specific SIP  Revisions

         0 Fifty-one percent (31 of 61 agencies)  said that they had
           processed site-specific SIP revisions  (see Figure 4).  Bubbles
           and emissions trading were the most  commonly reported  actions.
           While most agencies indicated that their revisions were  consistent
           with EPA policy,  ten agencies indicated that some of their
           bubbles and variances were not.   The predominant  mechanism
           used by States to assure consistency was allowing the  EPA
           Regional Office to review and comment  on each revision.

     3.  I/M, TCM Implementation

     The audit indicated few agencies have I/M programs that are known to
be achieving the reductions claimed in the SIP.  However, many programs have
not yet been in operation for any length of time.

         0 At the time of the audit, of the 32  agencies required  to have
           I/M programs, only 7 agencies could  show that I/M was  fully
           implemented consistent with the credit  taken in the  SIP.  Five
           agencies said the program was inconsistent with the  SIP  (low
           failure rates, poor participation, etc.).   The  other 17  agencies
           required to have  I/M programs could  not yet make  this  determi-
           nation.  Most of  these programs either  had not  yet started by
           the end of the audit or had been in  operation for less than 1
           year.  Only three agencies did not respond to the questions.
         0 Of the 38 agencies implementing TCM's, only 21 indicated
           responsibility for tracking or implementing them.   It appears
           that only eight agencies are actively involved  in  TCM tracking
           through either membership on a Metropolitan Planning Organiza-
           tion (MPO) transportation committee or  periodic review of
           the transportation improvement program  (TIP).   Nineteen agencies
           track TCM's primarily through reports from the  implementing
           agencies.  The remaining 11 agencies are apparently not actively
           involved (see Figure 5).

     4.  SIP Coordination and Validation

     Forty agencies (66 percent) said they were able to show  that the
emissions reductions claimed in the SIP were achieved in practice, while
11 agencies (18 percent) said they could not do this.  Most (35 agencies)
rely on their permitting and inspection activities as the  principal
verification method.  It appears that the majority of agencies do not
include tracking implementation as a central element of their program.
                                  III-8

-------
     In the reasonable further progress (RFP) area, only a dozen or so
agencies appear to be making much effort to document the emissions reduc-
tions claimed in their ozone and CO RFP demonstrations (a simple tracking
sketch follows the list below).

         0 Twelve of the 26 agencies with 1987 extension areas  for  ozone have
           a formal  system for documenting RFP emissions reductions for VOC.

         0 Eleven of the 34 agencies with extension  areas for CO have  a
           system for tracking CO RFP reductions.

         0 Thirteen  agencies with extension areas  for ozone claimed they
           could assure that the RFP data were consistent with  agency
           enforcement and emissions inventory data, but seven  agencies
           could not.

         0 For CO, only 12 agencies could assure that the RFP CO emissions
           reductions were consistent with agency  emissions inventory
           data, and 11 agencies could not.

         0 Only 18 percent (11 agencies) reported  performing a  periodic
           review or evaluation of the SIP growth  projections.   Fifty-nine
           percent (36 agencies) either did not periodically evaluate  growth
           or thought it was unnecessary to do so.   Twenty-five agencies
           thought their SIP's projections to  be adequate for current  growth
           in both point and area sources.
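
     As a minimal sketch only -- assuming a straight-line RFP demonstration
and using invented numbers, since the audit responses did not include this
level of detail -- the RFP bookkeeping discussed above reduces to comparing
documented reductions against the progress line (in Python):

    # Illustrative RFP tracking sketch; every value below is invented.
    base_year, attain_year = 1980, 1987
    base_emissions = 500.0    # tons/day VOC in the base-year inventory
    target_emissions = 350.0  # tons/day in the attainment demonstration

    # documented VOC reductions credited so far, by year
    credited = {1981: 15.0, 1982: 22.0, 1983: 18.0}

    def rfp_allowed(year):
        # emissions allowed in a year under a straight-line demonstration
        frac = (year - base_year) / (attain_year - base_year)
        return base_emissions - frac * (base_emissions - target_emissions)

    current = base_emissions - sum(credited.values())
    year = max(credited)
    status = "on" if current <= rfp_allowed(year) else "behind"
    print(year, round(current, 1), round(rfp_allowed(year), 1), status)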
               [FIGURE 5. HOW IS TCM IMPLEMENTATION ASSURED?  (Bar chart;
               61 agencies audited: 52 States, 9 locals.  Bars show how
               TCM's are tracked (T = TIP review or MPO membership) for
               TCM's implemented where required, being implemented where
               required, and not applicable or no response.)]
                                        III-9

-------
C.   AIR QUALITY EVALUATION

     This section contains the detailed audit information on how air
quality data were used by State and local  agencies in the evaluation of
source impacts and planning activities.  Four main areas are covered:  air
quality reports, Section 107 designations/redesignations, special  studies,
and air management/planning.

     In the air quality reports area, the audit asked how frequently
agencies published air quality monitoring data (i.e., provided the data to
the public).  Although not a part of the question, the lag time from
retrieval to publication of the data was recorded if the agency gave it
in its response.

     The second area evaluated whether Section 107 redesignations  were
being carried out.  A list of submitted redesignations and a description
of the agency's review process was requested.

     The third area concerned whether agencies used results of special
monitoring studies in periodic planning or SIP activities.  Agencies were
requested to describe the extent to which air quality data from special
studies are compared to the SIP air quality data base.

     The fourth area covered the use of air quality data for program
planning and management.  Agencies were asked to describe their use of
monitoring results or modeling studies to focus on source compliance.
They were also asked if they had a formal  way to use exceedance data to
define geographic priorities, and if they had established criteria or a
plan to relate source data to ambient impacts.

     Responses to Individual Questions

     1. Air Quality Reports (Question Al)

     Question Al asked if agencies regularly publish data from their
ambient air quality monitoring stations.  The intent was to determine if
these data are made available to the general public.  Auditors were to
review the agency's most recent public reports on air quality data.

     This is an activity that 83 percent of the State and 78 percent of
the local agencies perform.  Most States publish the data in an annual
air quality report that includes data tables and an analysis of current
statewide air quality.  Usually, the reports are distributed to requestors
and others on a mailing list, including libraries, media, and legislators.
Copies of the report were attached to some of the audit questionnaires.
A couple of State agencies that did not publish annual reports indicated
their intention to begin this activity in the future.

     The question on lag time was not asked as part of question Al;
however, the agency responses, if given, were summarized.  About one-third
of the responding agencies reported their lag time.  Of these, 30 percent
                                  III-10

-------
published the report within 5 months from the end of the collection period
and 90 percent published within a year from the end of the collection
period.

                           Air Quality Reports

                                           Quarterly  Annually   No

Al Does Agency publish air          Agencies  10         40      11
monitoring data with comparisons              16%        66%     18% *
to the NAAQS? (61 responses)

                                              0-5        5-12  12 or more
                                             Months     Months   Months

Al What is the lag time between retrieval
and publishing of air quality data?           30%        60%     10%
(20 responses)

     2. Section 107 Redesignations (Question A2)

     Question A2 asked whether agencies periodically review Section 107
designation status.  The intent was to determine if State and local
agencies are responsive to changes in attainment status and if they use
the Section 107 redesignation process in an appropriate manner.   Auditors
were to discuss the control agency's designation procedures and pending
submittals.

     Seventy-five percent of the audited agencies indicated that, from time
to time, they did review how air quality data affected Section 107 attain-
ment status, although not all of these agencies had actually submitted
requests for redesignations during the review period.   Although  this review
covers redesignations to nonattainment from attainment and vice versa for
both primary and secondary standards, several agencies saw no incentive for
designating new nonattainment areas.

     Some control agencies sought EPA Regional Office technical  review
prior to submitting a redesignation request; however, some agencies
preferred to resolve redesignation problems (especially changes from
attainment to nonattainment) on the State or local  level with minimum EPA
involvement.  Four agencies indicated an annual review while one State
agency indicated a quarterly review.  The one State agency indicating the
question was not applicable said they had attained the standards and
therefore periodic review was unnecessary.
*Percentages in this chapter may not add to 100 percent due to rounding.
                                  III-11

-------
                        Section 107 Redesignations

                                                 Yes      No     NA2    CD2

A2 Does agency perform periodic review of
Section 107 designations and submit proposed
changes to EPA?                                  75%      20%   2%    3%
(61 responses)

     3. Special Studies (Question A3)

     Question A3 asked if the control  agencies  used the results of  special
monitoring studies in their planning activities and whether  these studies
resulted in periodic updates of the SIP.   The intent  was to  determine if
the agency's program was capable of integrating air monitoring  and  planning
functions and of adjusting the SIP data base to include results of  new
data.  Auditors were to discuss who in the agency reviewed data from
monitors other than the NAMS/SLAMS network, and to  identify  the extent to
which special monitoring data is compared to data that the SIP  is based
upon.  A "yes" response indicated that special  studies were  evaluated
against the current information and some  action was taken.   Evaluation of
some special studies may have resulted in revisions to SIP limits via
permits or may have led to specific SIP revisions.   Twenty-eight agencies
reported taking some form of action on special  study  data when  there was a
conflict with the existing SIP data base.  The  three  agencies reporting
not taking action on conflicting data commented that  special  studies data
are not normally evaluated.

     Several States said that special  monitoring data are reviewed  and
acted on in the same way as data from the NAMS/SLAMS  network.   Some
agencies commented that special purpose monitoring  is done to help  deter-
mine attainment status in an area not covered by existing monitors  (e.g.,
localized problem spots for CO or TSP).  In some agencies, new  source
modeling was performed that showed exceedances  from existing sources.
This sometimes resulted in more stringent emissions limits in the permits
for the existing sources.

     Although these examples and others indicated that action on the State
level occurred as a result of special  study monitoring, it could not be
determined in all cases whether EPA was involved through submittal  of a SIP
revision or if such action was confined to emissions  limit modifications
on the local or State level.
2 NA means "not applicable."  CD means "cannot determine,"  i.e.,  the
explanation provided in the response was inadequate to determine  a
category for the answer.
                                  III-12

-------
                             Special  Studies

                                               Yes    No     NA     CD

A3 Does agency update the SIP when
special studies conflict with         Agencies  28     3     24      6
current information?                            46%    5%    39%    10%

     4. Air Management/Planning (Questions A4, A4a, and A4b)

     Question A4 asked how the control  agency used  air quality data  in
focusing its resources and establishing program priorities.   The intent
was to determine how the agency used air quality data (and modeling
results) for program planning and management.  Auditors were to review
the agency's capabilities and experience in the use and evaluation of air
quality data.  In particular, agencies were to describe how they use data
to: (1) direct and focus their compliance efforts, e.g., for sources not
meeting emissions limits; (2) determine program priorities, e.g., on a
geographic basis; or (3) identify regulations that are inadequate to
maintain ambient standards.  Formal  methods, plans, and criteria used by
the control agencies in these activities were to be identified.

     Question A4 can be broken into four parts.   For the first part,
results indicate that 84 percent of the agencies review and  evaluate
monitoring data to focus on sources or source categories. These monitoring
activities are varied.  They range from analysis of existing network data
such as microscopy filter analysis for metals from  suspected sources or
identification of fugitive TSP sources, to special  purpose monitoring,
such as CO or NOX monitoring around airports and other congested areas.
Many agencies evaluate existing monitoring data in  support of compliance
efforts or complaint investigations.

     The second part of question A4 asked how modeling studies are used
to focus on sources.  Modeling studies are used by  several agencies  to site
monitoring stations (e.g., SO2 and lead monitors) or to assess the impact
of proposed stationary and indirect sources in areas without monitors.
Examples included: modeling impacts from nontraditional TSP  sources  such
as road dust or woodburning, review of CO impacts from indirect sources
such as shopping centers or parking lots, and modeling SO2 impacts from
power plants or industrial fuel-burning sources. Although this activity
was performed by 66 percent of the audited agencies, there are a signifi-
cant number of agency responses that could not be determined one way or
the other.

     The third part (Question A4a) was intended to determine if an agency
used information on monitored exceedances in any formal way  to revise
program priorities, particularly on a geographic basis (i.e., to identify
and control sources adjacent to ambient monitors).   The question also
asked agencies to give examples where this had been done and to describe
how the results were used.

     Many agencies cited examples in which they review exceedance data,
or relate ambient trends to emissions trends.  This type of  activity is
                                  III-13

-------
most frequent in areas with air quality levels at or slightly  above the
standards and which have a potential  for or have received a  recent SIP
call.

     However, the key word in the question was "formal,"  i.e., did the
agency have some documented method or set procedure to govern  program
priorities or was the exceedance data handled informally? A majority of
agencies do not use any formal  procedure for this activity since only 34
percent answered yes.  This is the only question in the group  for which
there are more negative than affirmative responses.  There are also a
significant number (20 percent) of "cannot determine" responses.  Even
where responses were unclear, it appears that most agencies set program
priorities based on evaluated air quality data, but they do so on an
informal or as-needed basis.
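
     One form such a formal, documented procedure might take -- a sketch
only, with an invented standard shorthand, sites, and readings, not a
method drawn from any audited agency -- is to rank monitoring sites by the
frequency and severity of exceedances:

    # Rank sites by exceedance frequency, weighted by severity (invented data).
    STANDARD = 9.0  # e.g., the 8-hour CO standard, ppm

    sites = {
        "downtown":   [10.2, 9.5, 11.0, 8.7],
        "suburban":   [8.1, 8.9, 9.2],
        "industrial": [9.1, 10.8],
    }

    def priority(readings):
        exceed = [r for r in readings if r > STANDARD]
        if not exceed:
            return 0.0
        # severity: average fractional amount above the standard
        severity = sum(r / STANDARD - 1.0 for r in exceed) / len(exceed)
        return len(exceed) * (1.0 + severity)

    ranked = sorted(sites, key=lambda s: priority(sites[s]), reverse=True)
    print(ranked)  # highest-priority location first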

     The fourth part (Question A4b) was intended to determine if some
criteria or plan is established that  directs agency attention  to indivi-
dual emissions sources or to specific source categories.   Although it may
seem to overlap with the first two parts of Question A4,  it  contains two
distinguishing words: "criteria" and  "plan."  Activities  under an affirma-
tive answer could include: source/receptor analyses, such as analysis of
atmospheric aerosols and the identification of their sources; source-
oriented monitors, which are used to verify dispersion models that predict
emissions reductions necessary to attain a standard; and  any systematic
review or program evaluation of air quality data.  These  activities were
reported as occurring in only a few of the audited agencies.

     Twenty-six agencies (43 percent) reportedly use some kind of plan  or
have established some criteria for relating ambient monitoring data to
source activities.  Almost as many, 24 agencies (39 percent),  said that
there were no documented criteria or  plan established in  their program.
Two agencies (3 percent) indicated the question did not apply.  Nine
agencies (15 percent) gave unclear responses or did not answer the question.

     Many agencies cited examples of  special monitoring studies or reiterated
studies given elsewhere in the audit.  In deciding whether the responses
were "yes" or "no", a certain amount  of judgment was required  to determine
if an agency had established criteria or a plan for this  activity.  An
affirmative answer was coded if the weight of the response indicated that
there was probably a plan or criteria; otherwise the answer  was coded
"cannot determine" and lumped with those agencies which did  not answer
the question.  A negative response meant either the agency did not relate
source impacts to ambient data or that there probably was no plan or
criteria involved.

     Although the results indicate that most agencies do, in some way,
relate source impacts to ambient data, there is in most agencies an
absence of a formal program, criteria, or procedure to review air quality
data and make the appropriate SIP planning corrections.
                                  III-14

-------
                          Air Management/Planning

                                               Yes      No      NA     CD

A4 Does the agency use monitored data to
focus attention on emissions sources or        84%      10%     1%     6%
source categories?

Does the agency use modeling studies to
focus attention on emissions sources or       66%      11%     2%     21%
source categories?

A4a Does the agency have a formal way to
determine geographic program priorities
using statistics on the frequency and
severity of NAAQS exceedances?                34%      43%     3%     20%

A4b Does the agency have a criteria/plan
for relating source impacts to ambient data?  43%      39%     3%     15%

D.   EMISSIONS INVENTORIES

     This section contains detailed audit information on how the State and
local control agencies maintain emissions inventories for point, area, and
mobile sources.  It covers several aspects of the inventory including:
sources, emissions and other data maintained (Questions B1 thru B4 and
B9); frequency of updates (B5); documentation, methodology and quality
assurance checks (B6, B7, B8, B10); and computerization, report formats
and NEDS submittals (B11 thru B15).  The majority are survey questions, but
some (such as B10) are designed to help determine the quality of the
inventory.

     Sixty-three agencies provided audit responses to this section of  the
Air Quality Planning and SIP Activities chapter, as opposed to 61 agencies
that responded to the other sections.

     Responses to Individual Questions

     1. Source, Emissions and Other Data Maintained in Inventory
        (Questions B1 thru B4 and B9)

     Question B1 asked agencies to identify specific data categories (such
as stack parameters, process data, etc.) maintained in their emissions
inventories, and to describe their definitions of major and regulated minor
sources.
This question, in combination with questions B2 and B3, was intended to
determine whether components of the inventory include all  significant
source contributors to the pollutant burden.  Except for one State and one
Territory, all agencies reported maintaining criteria pollutant inventories
for major point sources of State/local  interest.  There were variations
among programs in the definition of a major source; most used 100 tons
per year (TPY) potential but four reported using 100 TPY actual.   Some
agencies inventory only one or two pollutants that are of interest to  them.
                                  III-15

-------
State agencies with no urban areas usually inventory only TSP and SO2
emissions sources.

     Some agencies store their inventory in compliance,  inspection,  or permit
files.  This storage method is apparently used most often  for  NSPS or
NESHAP sources.  The NESHAP inventory for many agencies  is not computerized
and may contain only survey or compliance data and sometimes covers  only
asbestos sources.  One agency reported a cutoff level of 100 TPY for
NESHAP sources (it could not be determined whether the emissions were
actual or allowable).  In general, most agencies maintain an inventory of emissions
and related data for major sources, but coverage could be  improved.

     Question B2 covers the unregulated minor sources and  area source
(including mobile sources) inventory in nonattainment areas.   In general,
the audit revealed that States that are nonattainment for  ozone maintain
area source inventories for VOC since these are major contributors as a
group.  On the other hand, States with mostly rural (and few nonattainment)
areas do not maintain area source inventories.  The results showed that
the mostly rural western States did not normally maintain  area source
inventories whereas the industrialized midwestern and northeastern States
did.  However, a populous State in the northeast did not maintain area
source inventories, because it claimed that its resources  were inadequate
to inventory the large number of area sources.  Altogether about half (52
percent) of the agencies maintain emissions inventories  for area sources
and unregulated minor point sources.

     Question B3 was asked to determine how agencies handled small point
sources, such as those below a set cutoff level.  The purpose  of the
question was to determine if agencies treated these small  point sources
as individual sources or if they aggregated them into area sources,  and
if so, how?  The intent of the question was to determine if all sources
are accounted for and to investigate potential double counting of small
point sources and area sources.

     This question was apparently confusing.   Twenty-one percent of  the
agencies did not respond or answer clearly.  Although fifty-two percent
said that they maintained unregulated minor source and/or  area source
inventories (Question B2), only thirty-five percent said in this question
that they aggregated small point sources as area sources.   It  could  not
be determined from the responses how those agencies that do not aggregate
handle small point sources.

     Generally, those agencies that aggregated used a two-part approach.
If the source category was subject to agency regulation (e.g., power
plants or industrial facilities), the agency aggregated sources below a
cutoff and treated them as area sources.  On the other hand, if the
source category was unregulated (e.g., commercial surface  coating or
small gas/oil boilers) and emissions were derived from activity indicators
such as population or fuel use surveys, then the agency subtracted any
large sources above a cutoff from the total to obtain a remainder, which
it treated as area source emissions.  Many agencies with a negative
response simply said that their cutoff level was low and that their
inventory therefore included all small sources in the point source
emissions inventory (EI).
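
     The two-part approach can be illustrated with a short sketch (the
cutoff level, source names, and emissions values are all invented; no
particular agency's procedure is implied):

    # Two-part aggregation of small point sources (illustrative only).
    CUTOFF_TPY = 25.0

    # Regulated category: aggregate sources below the cutoff as area sources.
    regulated = [("boiler A", 120.0), ("boiler B", 8.0), ("kiln C", 14.0)]
    point_part = [(name, tpy) for name, tpy in regulated if tpy >= CUTOFF_TPY]
    area_part = sum(tpy for _, tpy in regulated if tpy < CUTOFF_TPY)

    # Unregulated category: estimate the total from an activity indicator
    # (e.g., a per-capita factor x population), then subtract inventoried
    # large sources so the area-source remainder is not double counted.
    category_total = 300.0    # TPY from the activity indicator
    large_inventoried = 90.0  # TPY already carried as point sources
    area_remainder = category_total - large_inventoried

    print(point_part, area_part, area_remainder)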
                                  III-16

-------
     Question B4 was intended to discover:  (1)  who was responsible for
maintaining the mobile source inventory,  and (2)  what methods  and  models
were used.  It was hoped that the responses would reveal  whether mobile
source inventories could be used with a reasonable level  of confidence
for planning purposes.

     There were some problems with this question, however.   Most (52 of
63) of the audited agencies were not always responsible for mobile source
inventories.  Many States left development of mobile source inventories
to either the State Department of Transportation  (DOT) or to the Metro-
politan Planning Organization (MPO)  for urban areas within  the State.
Therefore, one-third of the agencies said this question did not apply  to
them since they were not the responsible  agency.   This may  present an
accountability problem since such inventories are critical  to  any  urban
CO or 03 control strategy.

     Mobile source inventories are conceptually calculated  from the product
of two numbers: vehicle miles traveled (VMT) and  a composite mobile
emissions factor (grams per mile).  Each of these is calculated from
separate and usually complex models: VMT  from a transportation planning
model; and the composite emissions factor from an EPA modal model  (such as
Mobile 2 or Mobile 3).  Almost all VMT numbers are generated via DOT's or
MPO's, and many of these agencies also run the modal  models.
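
     The conceptual calculation can be shown with invented numbers (only
the grams-per-short-ton conversion factor is fixed; everything else below
is illustrative):

    # emissions = vehicle miles traveled x composite emissions factor
    GRAMS_PER_TON = 907_184.74

    vmt_per_day = 2_000_000.0  # from a transportation planning model
    co_g_per_mile = 45.0       # composite CO factor from a mobile model

    co_tons_per_day = vmt_per_day * co_g_per_mile / GRAMS_PER_TON
    print(round(co_tons_per_day, 1))  # about 99.2 tons of CO per day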

     Since the audit question asked: "How does your agency  inventory
mobile source emissions?" and many State  agencies are not responsible
for this, the results were unclear as to  how mobile source  inventories
are done.  However, half of the agencies  did respond, even  though  the
mobile inventory may have been done  by another agency.  From these
responses, it was clear that the current  or most  recently used modal
model3 was Mobile 2, followed by Mobile 2.5.  One or two agencies also
employed Mobile 1 and Mobile 3.

     Question B9 was a survey question intended to determine if an agency
is capable of computing the difference between actual and allowable
emissions rates.  Two out of three agencies say they maintain  both actual
and allowable rates.
3 Mobile 1 was the EPA modal  model  developed in 1978  for CO,  NMHC,
and NOX emissions estimates from mobile sources.   It  was updated  as  follows:
Mobile 2-1980, Mobile 2.5-1981, Mobile 3-1984.
                                  III-17

-------
                        Data Maintained in Inventory

                                                   Percentages of Agencies

                                                       Yes   No    NA    CD

B1 Maintain emissions inventory for:

    - major point sources                               97    3     0     0

    - NSPS sources                                      60   27    13     0

    - NESHAP sources                                    56   35     6     3

B2 Maintain emissions inventory--unregulated
minor/area sources                                      52   43     2     3

B3 Aggregate small point sources
as area sources                                         35   33    11    21

                                          Mobile  Mobile  Mobile  Mobile
                                             1       2      2.5     3    NA    CD

B4 Modal model last used to develop
mobile source inventory                      3      25      16      3    33    18

                                                       Yes   No    NA    CD

B9 Does EI have both actual and
allowable emissions data?                               67   30     0

     2. Frequency of Updates (Question  B5)

     Question B5 is a survey question on  how  often  the  inventory  data  are
updated for criteria pollutants  in  attainment and nonattainment areas  and
for NESHAP sources.  Agencies were  asked  to describe  their  procedures  for
updating the inventory and to state the update frequency  in three areas:
criteria pollutant sources in nonattainment areas,  criteria pollutant
sources in attainment areas, and NESHAP sources.

     Almost all data for regulated point sources are updated through the
source permitting, registration, or certification process,  i.e.,  the
inventory is updated when the source permit changes or  expires.   Sixty-
five percent of the agencies mentioned  an annual  update for criteria
pollutant sources in attainment  and nonattainment areas.  NESHAP  sources
were updated annually by the majority of  agencies that  maintain these
data.

     Many agencies (43 percent) also update the EI either continually or
as changes occur, i.e., more frequently than  once a year.   Some mentioned
a "threshold" change value, such as 15  percent increase or  decrease  in
emissions levels, which would prompt the  agency to  update the  inventory.
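
     As an illustration (the 15 percent figure comes from the agency
comments above; the function and values are otherwise invented), such a
threshold rule reduces to a simple check:

    # Flag a source for an inventory update when reported emissions have
    # changed by 15 percent or more since the last inventoried value.
    THRESHOLD = 0.15

    def needs_update(old_tpy, new_tpy):
        if old_tpy == 0:
            return new_tpy != 0
        return abs(new_tpy - old_tpy) / old_tpy >= THRESHOLD

    print(needs_update(100.0, 118.0))  # True  (18 percent increase)
    print(needs_update(100.0, 92.0))   # False (8 percent decrease)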
                                  III-18

-------
     Area and mobile source inventories are updated much less frequently
than point source inventories.  Generally the most recent update for area
and mobile sources was the last SIP baseline year:  e.g., 1976 for the
1979 SIP; 1979 or 1980 for the 1982 SIP.   However, some urban areas are
updating these inventories annually or biannually.  Other agencies indicated
their intent to increase the update frequency for mobile and area sources.

            Frequency of Updating the Inventory (Question B5)

                                                  Percentages of Agencies

                                            1 yr           3 yrs
                                             or      2      or
                                            less    yrs    more    NA    CD

   Update frequency--nonattainment areas     65     19      3      11     2

   Update frequency--attainment areas        65     26      3       3     3

   Update frequency--NESHAP                  42     14      3      27    14

                                            Yes     No             NA    CD

   Update as change occurs                   43      3              3    51

     3. Documentation, Methodology, and Quality Assurance (Questions B6,
        B7, B8, and B10)

     Question B6 asked agencies to describe the methodology behind the
emissions estimates contained in the point and area source EI's.  The
intent was to determine if there was completeness and uniformity  among
agencies in the calculation of emissions data.

     Ninety-one percent of the agencies said they used EPA guidance to
develop emissions inventories and that their source data are compiled in
accordance with the most recent guidance.  However, very little detail  in
support of these claims was provided by either the control  agencies or in
the EPA Regional Offices review.  A typical response either simply said
yes or indicated that AP-42 emissions factors were used.  Although a few
States reported there were no AP-42 emissions factors for some processes  in
their State (e.g., certain chemical operations, or the wood products
industry), some States reported developing their own emissions factors for
some of these sources.

     Question B7 attempted to determine if transportation and population
baseline and projection data used in the SIP (such as VMT and population)
are consistent with DOT and MPO planning data.  Altogether, two-thirds of
the agencies said the figures were consistent with other transportation
data.  Some of the agencies reported that population growth rates did not
comport with Section 208 water quality projections but were consistent with
                                  III-19

-------
the 1980 census and projections (to 1987  for example)  based on  that
census.  Since MPO's and DOT's are often  responsible  for  generating these
data, there usually are no inconsistencies between these data and other
transportation data.

     Question B8 sought to determine whether the basic input data (such
as fuel use, production rates, emissions  test data, traffic counts,  popula-
tion, etc.) and the calculation methods (such as emissions  factors,  control
efficiencies, VOC reactivity conversion,  etc.)  were sufficiently  documented
and available.  The intent was to find out if the accuracy  and  appropriate-
ness of the agency's approach could be verified.  Agencies  were expected
to produce calculation sheets, test reports, traffic  counts, source
reports, and other documents.  It is not  surprising that  84 percent said
there was adequate documentation; however, only one agency  (2 percent)
said there was insufficient documentation.  For seven  agencies  (11 percent),
it could not be determined if adequate documentation  existed.   Combining
the latter two groups indicates that 13 percent of the agencies either do
not document or could not show adequate documentation.

     Question B10 was a survey on the basic quality assurance (QA)
methods used to check the accuracy of the inventory.   Auditors  were to
determine what kind of QA measures were used and to review  QA manuals
and other documents.  In practice, the question asked  agencies  to indicate
if they use three procedures in particular: validity checks, checks for
missing sources, and a comparison of inventory  data to enforcement,
planning, NSR, or other permit records.  The audit asked  the agency to
briefly explain its validity checks and missing source checks.

     Eighty-four percent of the agencies  perform validity checks  but the
type of checks varied.  Agency comments indicated that validity checks
included: computerized gross limit checks for extreme  data  entries (such
as an outrageous emissions rate of 10,000 TPY for a small boiler); comparisons
to previous year records for the same source (e.g., has the data  changed
significantly and if so, what is the reason?);  and some types of  review or
spot-checks by staff or supervisor.  Almost any kind of crosscheck was
included in the "yes" category.  This being the case,  it  is noteworthy that
at least 10 percent (six agencies) did not report using any type  of validity
checks.
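
     Illustrative versions of the two checks most often described -- a
gross limit check and a previous-year comparison -- might look as follows
(the limits and the 50 percent change flag are invented, not audit
findings):

    GROSS_LIMITS_TPY = {"small boiler": 500.0}  # hypothetical category caps
    CHANGE_FLAG = 0.50                          # flag >50% year-to-year change

    def validity_flags(category, last_year_tpy, this_year_tpy):
        flags = []
        limit = GROSS_LIMITS_TPY.get(category)
        if limit is not None and this_year_tpy > limit:
            flags.append("exceeds gross limit for category")
        if last_year_tpy and abs(this_year_tpy - last_year_tpy) / last_year_tpy > CHANGE_FLAG:
            flags.append("large change from previous year -- verify reason")
        return flags

    # The 10,000 TPY small boiler cited above would fail both checks:
    print(validity_flags("small boiler", 450.0, 10_000.0))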

     Checks for missing sources are performed by 76 percent of  the agencies.
These include: comparison of sources in the inventory with industrial
listings, RACT source category lists, and the telephone yellow pages;
discovery of sources, especially those with visible plumes, through field
surveillance and complaint investigations; and cross-checking building
permit logs or zoning dockets.  Even though most agencies check for
missing sources, many apparently believe their inventory  is comprehensive
(due to a low cutoff level or extensive source  registration) and  therefore
the inventory has a low probability of missing sources.
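
     In its simplest form, a missing-source check is a comparison of two
lists (the source names below are invented):

    # Sources in an outside listing (industrial directory, RACT category
    # list, yellow pages) that are absent from the point source inventory.
    inventory = {"Acme Coatings", "Valley Power", "Riverside Foundry"}
    listing = {"Acme Coatings", "Valley Power", "Riverside Foundry",
               "Eastside Dry Cleaners", "Hilltop Asphalt"}

    candidates = sorted(listing - inventory)
    print(candidates)  # follow up by questionnaire or field inspection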

     Comparison of inventory data to enforcement, planning, new source
review (NSR), or permit records is done by 71 percent  of  the agencies.
Among these four areas of comparison, most agencies commented that permits
                                  III-20

-------
or enforcement were the usual areas of comparison.   This is not surprising
since many agencies update the inventory on or near the expiration date
of the permit.  However, 21 percent said that no comparison was performed.

            Documentation, Methodology, and Quality Assurance

                                                    Percentages of Agencies

                                                       Yes  No   NA   CD

B6 EI consistent with EPA guidance                     91    3    3    3

B7 Transportation data consistent with DOT or MPO
data, or with Section 208 water quality projections    67    3   22    8

B8 EI adequately documented                            84    2    3   11

B10 EI has some validity checks                        84   10    3    3

B10 EI checked for missing sources                     76   18    3    3

B10 EI compared with enforcement, planning,
    NSR, and permits                                   71   21    3    5

     4. Computerization, Report Formats, and NEDS Submittals (Questions
        B11 to B15)

     Question B11 is a survey on whether the emissions inventory is stored on
computer.  Auditors were to review printouts of the inventory and other
documents.  Eighty-one percent of the agencies indicated their inventories
were computerized; 18 percent said they were not computerized.  While
some agencies have created individual inventory systems, many use the EPA
EIS system.

     Question B12 is a survey on what kinds of inventory reports are avail-
able, specifically in four areas: aggregation by various source categories,
aggregation by various geographic units, effects of changes in control
measures or process modifications on the inventory, and annual submittal
of inventory in NEDS format, to meet EPA reporting  requirements.  This
survey question is intended to determine the extent of use of emissions
inventory data in plan evaluation, resource allocation, and control
strategy development.

     In the first area, 76 percent of all  audited agencies can or do
aggregate the inventory by source category; 15 percent cannot or do not
aggregate by category.  Normally, emissions inventory systems are able to
sort by standard category codes such as standard industrial  classification
(SIC) or source classification code (SCC).  The SIC's are general manufac-
turing categories, but SCC's are more useful for inventories, being more
specific to air pollution sources.  A typical SCC,  for example, may
indicate the size and fuel type of an industrial boiler.  Although most
agencies' inventory systems are generally able to aggregate and rank actual
                                  III-21

-------
emissions in each SIC or SCC category,  more elaborate systems  could
report aggregated data such as allowable emissions,  production rates  and
fuel use.
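
     Aggregation and ranking by category code can be sketched briefly (the
records below are invented; real SCC's are eight-digit codes identifying
the process and fuel):

    from collections import defaultdict

    # (source, SCC, actual emissions in TPY) -- illustrative records
    records = [
        ("plant 1 boiler", "10200501", 140.0),
        ("plant 2 boiler", "10200501", 95.0),
        ("plant 3 kiln",   "30500606", 210.0),
    ]

    totals = defaultdict(float)
    for _, scc, tpy in records:
        totals[scc] += tpy

    # rank categories by aggregated actual emissions, largest first
    for scc, total in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(scc, total)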

     In the second area, 74 percent of  all  agencies  audited indicated an
ability to aggregate by geographical unit (i.e., by county or perhaps
census tract).  Fifteen percent said they could not  do this.   Most agencies
provided little detailed explanation beyond a  simple "yes"  answer.
Therefore, it was not clear exactly what kind  of aggregation was  available.

     In the third area, agencies were asked if they  could produce a
report summarizing the effects of source controls  or process changes  which
might be useful for regulation evaluation or RFP tracking.   Forty-one
percent of the agencies indicated such  a report could be generated, and
the same number reported they could not generate such a report.   This
report implies the ability of the agencies to  compare current  year emissions
records to a previous year and to subtract year from year for  individual
points where changes in emissions control devices  or pollutant generating
processes have occurred.  It is doubtful that all of these agencies
produce such a report by computer; more likely, the report is done by
hand, and only for special purposes.  Although 41 percent
said this type of report was not available, most agencies appear  to
maintain sufficient point source data on types of  control  equipment,
efficiencies, process descriptions, etc.

     The last part of the question asked if the national  emissions  data
system (NEDS) submittals to EPA were made annually.   Seventy-three percent
of the agencies said NEDS were submitted annually, 13 percent  said they
were not, and 10 percent did not answer the question.  Of the  three agencies
indicating that this question was not applicable,  two were  local  agencies
not responsible for submitting NEDS (the State agency does  this for the
local agency in most cases).  Readers are referred to question B15  for
the reported date of the last NEDS submittal to EPA.

     Question B13 asked if the inventory provided  temporal  (daily or
seasonal variation) and spatial (county, census tract, gridded area,
etc.) resolution for use in SIP analyses and related modeling  activities.
Auditors were asked to review output sheets and other planning documents.
If the inventory cannot provide information in this  area,  it is considered
inadequate.  For eighty-two percent of  the agencies, inventories  were
reported to be adequate for these purposes. For 11  percent, the  inventories
were reportedly inadequate; for 2 percent,  the question was not applicable;
and in 5 percent, the response could not be determined.  Most  agencies
apparently do not keep a separate file of modeling data; however, they
generate it by hand when needed (e.g., input data to the EKMA/OZIPP
day-specific ozone model).  Typical inventory  report results are  in annual
emissions rates.  For VOC emissions, the results are usually in total VOC
instead of reactive VOC.  Therefore, conversion to daily reactive units
for VOC, and to daily or hourly units for TSP and SO2, is usually
necessary for modeling activities.
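
     The conversion can be sketched with assumed factors (the reactive
fraction, operating days, and seasonal adjustment below are invented; an
actual SIP analysis would derive them from the inventory and guidance):

    TONS_TO_KG = 907.18474

    annual_total_voc_tpy = 1_200.0  # annual total VOC from the inventory
    reactive_fraction = 0.85        # excludes nonreactive species (assumed)
    operating_days = 260            # weekday-only operation (assumed)
    summer_adjustment = 1.10        # seasonal activity factor (assumed)

    daily_reactive_kg = (annual_total_voc_tpy * TONS_TO_KG / operating_days
                         * reactive_fraction * summer_adjustment)
    print(round(daily_reactive_kg))  # kg reactive VOC per summer weekday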
                                  III-22

-------
     Several agencies indicated an ability to produce summary  reports  of
not only emissions rates, but throughputs (production rates) and workshifts
on a quarterly, weekly, or daily basis.   The majority of agencies,  however,
do not record this level  of detail,  at least not on  a routine  basis.

     Question B14 asked whether the inventory was compatible with requirements
to track and demonstrate reasonable further progress (RFP).   It was expected
that RFP methods would vary widely,  but the emissions inventory maintained
by the agency should have sufficient capabilities, especially  with  respect
to periodic updating, to support RFP efforts.  Auditors were to review
the RFP reports and inventory outputs.

     A key element for RFP tracking is periodic updating of the inventory.
Sixty-five percent of the agencies reported annual (and another 19 percent
reported biannual) updating of the emissions inventory in nonattainment
areas (Question B5).  Therefore, although the majority of agencies  seem
to be updating the point source inventory, it is unclear whether the area
and mobile source inventories, which are major contributors to ozone and
CO, are also being updated as frequently.  Although  75 percent of the
agencies indicated that their inventories were compatible with RFP  needs,
a few States did recognize that a lack of annual emissions updating was a
problem for the purposes of tracking RFP.  Other States commented that
RFP tracking was done manually, particularly for area sources, or that
they felt their system was capable of tracking RFP but that this had not been
proven.  It is noteworthy that only 8 percent of the agencies  apparently
felt that their inventories were not compatible with RFP.

     The last question, B15 (which asked when the last NEDS submittal  was
made), was an afterthought to Question B12 on whether annual NEDS submittals
were made.  Eleven percent indicated 1984 and 55 percent said  1983  was
the most recent submittal.  Several  States indicated either 1982 or 1981
and prior submittal dates.  States with old submittal dates even reported
that they submitted NEDS annually.

           Computerization, Report Formats, and NEDS Submittals

                                                    Percentages of Agencies

                                                       Yes  No   NA   CD

B11 EI on computer                                     81   18    2    0

B12-Aggregate by source category                       76   15    3    6

   -Aggregate by geographic unit                       74   15    5    6

   -Summarize control/change impacts                   41   41    2   16

   -Submit NEDS annually                               73   13    5   10

B13 Inventory adequate for SIP modeling                82   11    2    5

B14 Inventory compatible with RFP                      75    8   11    6


                                  III-23

-------
B15 Date of last NEDS submittal

                                       84   83   82   81 and   NA   CD
                                                       prior

                                       11   55   10     10     10

E.   MODELING

     This section contains detailed audit information on the air quality
modeling abilities of State and local  agencies.   The intent of the  audit was
to determine the degree of consistency among control agencies with  respect
to using models, following EPA guidance,  and communicating  with EPA
Regional Offices.

     Three areas are covered:   (1)  the staff's  knowledge and capabilities
with regard to modeling (Questions C1, C3, C6, C9, and C11); (2) documen-
tation and guidance (Questions C4, C5, and C10); and (3) agency review of
outside modeling analyses (Questions C2,  C7, and  C8).

     In the first area, the agencies are  audited  on their staff's modeling
experience and training, the kinds  of models and  data bases available,
the use of current modeling practices, and  the  staff's familiarity  with
the Clean Air Act requirements and  EPA policy.

     The second area covers whether or not  the  control  agency deviates
from the EPA Modeling Guideline, and if so,  whether such deviation  is  docu-
mented.  The audit also attempted to determine  if:  the agency contacts
EPA prior to using nonreference procedures  and  whether it documents the
communication; and whether the agency provides  modeling guidance to
sources or consultants prior to their application of models.

     The last area covers the agency's review of  outside modeling analyses
(by sources or contractors), if it  is replicated  by the agency staff,  and
if the agency has participated in site-specific model  evaluation studies.

     Responses to Individual Questions

     1.  Knowledge and Capabilities (Questions C1, C3, C6, C9, and C11)

     Question C1 was asked to determine the staff's level of experience
with using air quality models (particularly  EPA guideline models).   The
intent was to find out if the agency was  capable  of conducting modeling
analyses and reviewing those done by others. Auditors were to review  the
agency's modeling staff with respect to:  meteorological  background,
adequate staffing, and computer training.  State  and local  agencies were
to describe their modeling staff and their  experience with  EPA reference
models.

     The responses were classified in three categories of knowledge and
capability.  Advanced knowledge meant the agency  had extensive experience
in the application of EPA models and an ability to  evaluate nonguideline
models.  Basic knowledge implies that the agency  has the capability of
                                  III-24

-------
using most EPA models but is not able to evaluate nonguideline models.
Agencies that can use only screening models or do not have basic  knowledge
are classified as "limited or none" (i.e., not familiar with all  or most
EPA models).

     Five agencies (8 percent) indicated levels of modeling experience
associated with an advanced knowledge capability.  These agencies are
capable of performing sophisticated, nonreference modeling and, in some
cases, have developed their own models.   Thirty-seven agencies (61 percent)
indicated a basic level of modeling knowledge; 14 agencies (23 percent)
indicated that their level of modeling capability was limited to  EPA
screening models or that expertise on using the full  range of EPA models
was limited or did not exist.  The answer for another five agencies (8
percent) could not be clearly determined.

     A further breakdown of the 14 agencies with less than basic  knowledge
was interesting.  Their responses to questions C2, C3, and C7 revealed  that
nine of these agencies reported having in-house access to models  and data
bases, nine reviewed source or contractor's modeling  analyses in  1983,  three
verified or replicated analyses performed by sources or contractors, and two
reported performing most or all of their own modeling.  In addition, five
agencies reported contacting EPA before  using any nonreference procedures,
but only one agency said that they maintained documentation of such contact,
and another of these agencies reported neither contacting EPA nor documenting
the fact.

     The second part of question C1 asked agencies to identify the percent-
ages of different types of modeling analyses performed "in-house" versus
"outside" by source or contractor.  In general, the responses did not
indicate in sufficient detail what types of modeling  were performed by
whom, nor was it possible in all cases to determine whether an overall
percentage of modeling was performed in-house or by sources other than
the agency.  The results, therefore, are classified only as to whether
most modeling is done by the agency or by a source or contractor.  The
modeling mix was 40 percent in-house, 30 percent outside, and 30  percent
unknown or no response.

     Question C3 asked agencies to describe their access to EPA and other
models, meteorological data, and emissions data bases.  The intent was  to
determine if the agency had adequate resources, including computer capa-
bilities, to conduct or review modeling  studies.    Agencies were  to
provide a list of models, data bases and other data requested by  the
auditors who were to evaluate the available resources with respect to the
demands of the required studies.

     Fifty-one agencies (83 percent) responded that they maintain in-house
access to models and data bases, seven agencies (12 percent)  do not,  and
three agencies (5 percent) did not respond.  Several agencies indicated
that their in-house models included only screening types.
                                  III-25

-------
     Of the 28 agencies indicating which version of the EPA UNAMAP model
series was available to them, 15 reported using version 5, nine version
4, three version 3, and one version 1.  A few agencies commented that
version 5 of the UNAMAP model  series was on order.   The remaining 33
agencies did not give enough detail to allow a determination of which
version they used.

     Question C6 asked whether  the agency kept abreast of changing  modeling
guidance.  Fifty agencies (82 percent) responded  that  they kept abreast
of the latest guidance, but gave few details in their  comments to this
question.  Seven agencies reported not keeping up with current guidance.

     Question C9 asked if the agency modeling staff was familiar with
Section 110, Section 107, and Part D requirements of the Clean Air  Act
(for SIP's, redesignations, and ozone/CO extension  areas, respectively)
which prompt modeling analyses.

     Forty-seven agencies (77 percent) responded  that  their modeling staffs
were familiar with most of these requirements. Eight  agencies (13  percent)
indicated they were not familiar with one or more of these requirements.
Four agencies (7 percent) reported the question was not applicable  to
them and two agencies did not respond.  Four of the eight agencies  reporting
that they were not familiar with some of these requirements were State or
local agencies with Part D SIP's.

     Question C11 asked if the agency staff was familiar with modeling
and control strategy requirements for bubbles, redesignations, new  source
review (NSR), and nonattainment areas.  There was some overlap of this
question with question C9 with respect to Section 107 redesignation
requirements.

     Forty-three agencies (70 percent) indicated  that  their staff was
familiar with most of these requirements.   Fourteen agencies (23 percent)
reported that their staff was either partly or totally not familiar with
one or more of these requirements.  Two agencies  reported these requirements
did not apply (one was a local  agency that deferred to its State agency
and the other was a rural State), and two agencies  did not respond.

                   Modeling Knowledge and Capabilities

                                        Advanced  Basic     Limited
                                        Knowledge Knowledge Or None  CD

C1 Does staff have experience       Agencies 5      37         14      5
and knowledge using EPA                      8%     61%        23%     8%
air quality models?

                                                  In-House  Outside  CD

C1 Are most modeling analyses                       40%        30%    30%
performed by agency (in-house)
or by contractor/source (outside)?
                                  III-26

-------
                                                      Yes  No   CD

C3. Does agency have in-house                Agencies  51    7    3
access to EPA models and                               83%  12%   5%
other data?

                                                   UNAMAP Version

                                                   5    4    3    1   CD

C3. Which version of EPA UNAMAP?          Agencies 15    9    3    1   33

                                                      Yes  No   NA   CD

C6. Does agency staff keep                   Agencies  50    7    3    1
abreast of changing                                    82%  11%   5%   2%
modeling guidance?

C9. Is the agency staff familiar             Agencies  47    8    4    2
with Sections 110, 107, and Part D                     77%  13%   7%   3%
modeling requirements?

C11. Is agency staff familiar with           Agencies  43   14    2    2
modeling requirements for bubbles,                     70%  23%   3%   3%
redesignations, NSR, and nonattainment areas?
     2.  Documentation and Procedures (Questions C4, C5, and C10)

     The first part of question C4 asked if a review of the documentation
supporting the modeling analyses performed by the agency could easily
verify that EPA procedures were followed.  The intent was to determine if
modeling analyses performed by State and local agencies were consistent
with national guidance.  Auditors were to review several of the agencies'
modeling analyses or reviews of modeling by others and to evaluate the
thoroughness and consistency of the supporting documentation.  Auditors
were to note systematic differences between EPA recommendations and agency
practices.

     Forty-three agencies (71 percent) reported that they have documentation
available showing that EPA modeling guidelines and procedures are routinely
followed.  Four agencies (7 percent) indicated that such documentation was
not routinely available.  Six agencies said the question was not applicable
to them either because they do no modeling or there is never any deviation
from EPA procedures, thereby implying that documentation of such nondevia-
tion would be unnecessary.  The remaining eight agencies did not clearly
answer the question.

     The second part of question C4 asked agencies if any deviations from
EPA modeling procedures or guidelines were clearly documented and supported.
Twenty-three agencies (38 percent) replied that they did have clear documen-
tation and support where they deviated from EPA procedures and guidelines  on
models.  Almost as many, 22 agencies (36 percent), reported that the question
                                  III-27

-------
did not apply since they did not deviate from EPA guidance.   Four agencies
(7 percent) indicated that not all  cases were clearly documented.  The
remaining 12 agencies (20 percent)  did not clearly respond either way to
the question.

     Question C5 explored the area  of communication between  modeling
staffs of State and local agencies  and the EPA Regional  Offices.   The
question asked whether the agency routinely contacts EPA for approval
prior to implementing nonreference  procedures and whether such  contact is
documented.  Auditors were to discuss with the agency modeling  staff
several modeling analyses and to evaluate the agency's early coordination
with EPA and any documentation where nonreference procedures were used.

     Thirty-two agencies (53 percent) indicated that EPA was routinely
contacted for approval prior to using nonreference modeling  procedures.
Three agencies (5 percent) reported not contacting EPA either routinely
or prior to the fact.  Twenty-four  agencies (39 percent) responded that
they never use nonreference procedures and therefore the question was not
applicable, and two agencies did not answer the question.

     A second part of the question  asked if the above contact of  EPA by
the agency prior to using nonreference procedures was documented.  Of the
32 agencies that reported making such contact, 19 indicated  that  they kept
documentation of it, 7 did not, and 6 did not respond clearly either way.

     Question C10 asked if the agency provided modeling guidance to sources
or consultants prior to their performing analyses.  The intent  of the
question was to determine if guidance was being distributed  before the
modeling began and to survey whether agencies generally are  providing
guidance to sources or contractors.

     Fifty-two agencies (85 percent) reported that guidance  was provided
prior to modeling.  Only two agencies indicated that guidance was not
provided.  Four agencies responded  that the question was not applicable,
presumably because there was no outside modeling performed,  and three
agencies did not answer the question.

                       Documentation and Procedures

                                                       Yes  No   NA   CD

C4 Is documentation available to              Agencies 43    4     6    8
show that agency routinely follows                     71%   7%  10%  13%
EPA modeling guidelines and procedures?

C4 Are deviations from EPA procedures         Agencies 23    4    22   12
and guidelines clearly documented and                  38%   7%  36%  20%
supported?

C5 Does agency contact EPA for approval       Agencies 32    3    24    2
prior to using nonreference procedures?                53%   5%  39%   3%

C5 Does agency keep documentation of          Agencies 19    7    23   12
such contact with EPA?                                 31%  11%  38%  20%


                                  III-28

-------
                                                       Yes  No   NA   CD

C10 Does agency provide guidance to           Agencies 52    2    4    3
sources or contractors prior to                        85%   3%   6%   5%
initiation of modeling?
     3.  Agency Review of "Outside" Modeling Analyses (Questions C2, C7, and C8)
     Question C2 asked agencies to provide a list or an estimate of the
types and number of modeling analyses performed by sources or contractors
that were reviewed by the agency during FY 1983.  The intent was to determine
if the staff is capable of performing all  required analyses in a timely
manner.  Auditors were to evaluate whether all  SIP requirements were met
and if the agency staff allowed enough time for a thorough review of
these analyses.

     Forty-five agencies (74 percent) reported that some modeling analyses
performed by a source or its contractor were reviewed by the agency in FY
83.  Ten agencies (16 percent) did not report reviewing any such analyses and six
agencies (10 percent) did not answer the question.  Altogether, 48 agencies
reported reviewing about 500 modeling analyses by sources or contractors
in FY 83; however, the responses did not include enough detail to allow
the mix of these to be determined.

     In a related area, question C7 inquired if the agency verified or
replicated modeling analyses performed by  sources or their contractors,
i.e., how thorough was the review performed by the agency on outside
analyses?  Perhaps because of the way in which it was asked, it did not
elicit a significant amount of detail from the State and local agencies.
For example, it did not ask what kinds of  modeling analyses the agencies
verified nor did it try to determine the depth of verification employed
by the reviewing agency.

     In general, 39 agencies (64 percent)  responded that some outside analyses
did receive some kind of verification by their modeling staff.  From their
comments, the agencies indicated that 18 of the 39 replicated all outside
modeling, 9 verified only if the modeling analysis was marginal or had a
potential for an exceedance of the standard, 8 used only screening models
in their verification, and 4 were not specific as to how they performed
this verification.  Thirteen agencies (21  percent) reported that they performed
no verifications.  Five agencies said there was no outside modeling
performed that required verification, and  4 agencies did not answer the
question.

     Question C8 was a survey on whether agencies had participated in
site-specific model evaluation studies.

     Seventeen agencies (28 percent) reported participating in site-specific
modeling evaluation studies, seven of which occurred within the last
year.  Four agencies (7 percent) did not participate in such studies when they
occurred.  Twenty-two agencies (44 percent) indicated that no such studies had
                                  III-29

-------
been done, and 13 agencies (21 percent)  did not clearly  answer the question.
This large number of unclear responses indicates that the question may
not have been understood by a significant number of agencies.

               Agency Review of "Outside" Modeling Analyses

                                                          Yes   No   NA   CD

C2 Did the agency review source or contractor-  Agencies  45   10         6
performed modeling in FY 83?                              74%  16%       10%

C7 Does the agency verify or replicate modeling Agencies  39   13    5    4
analyses performed by sources or contractors?             64%  21%   8%   7%

C8 Has agency participated in site-specific     Agencies  17    4   22   13
model evaluation studies?                                 28%   7%  44%  21%

F.   SIP EVALUATION AND IMPLEMENTATION

     This section contains detailed audit information on the  periodic
evaluation and implementation activities related to the  State  implementation
plan (SIP) at the State and local levels.  The 1984 audit focused  on
several broad areas:  timeliness of studies and regulation development
(Questions D1, D2, D6, and D10); familiarity with EPA policy on site-
specific SIP revisions (Question D3); inspection and maintenance (I/M)
and transportation control measures (TCM's) implementation (Questions D4
and D5); and SIP coordination and validation (Questions  D7, D8, and D9).

     The first area, timeliness of studies and regulation development,  is
intended to discover any obstacles in the development of major regulations
such as reasonably available control  technology (RACT) for volatile organic
compounds (VOC's) or Section 111(d) rules and whether or not major study
efforts or schedules for rule adoption were included  in  the SIP.  Auditors
were to identify any generic problems or innovative approaches and to
discuss progress on schedules or studies due during FY 83. There  is also a
review of progress on outstanding SIP deficiencies identified by the
Regional Office.

     The second area covers site-specific SIP  revisions  such  as variances,
bubbles and emissions trading, and whether or  not the State or local agency's
handling of these is consistent with EPA policy.  Auditors were to review
agency files and reference documents in this area.

     The third area discusses I/M and TCM implementation problems  and
asks agencies to identify their tracking processes and responsibilities.
A brief discussion of I/M program elements is  also covered.

     The last area, SIP coordination and validation,  covers RFP tracking,
substantiation of emissions reductions claimed in the SIP, and growth
estimate reevaluation.
                                  III-30

-------
     1.  Timeliness of Studies and Regulation Development (Questions D1,
D2, D6, and D10)

     Question D1 asked whether all required regulations had been adopted
and submitted to EPA or if progress toward submittal  was being  made.   The
intent was to identify any obstacles to the development of major  regula-
tions, such as VOC RACT rules required under Part D or Section 111(d)
rules.  Auditors were to identify and discuss any submittals due  during
the review period.

     Fifteen agencies (25 percent) reported that all  regulations  had  been
submitted.  Another 33 agencies (54 percent) indicated that not all had
been submitted but were in progress.  Four agencies (7 percent) said  that
progress had not been made on required submittals--mostly the submittals
were incomplete, but in one case, a 111(d) plan was long overdue.  Responses
of the remaining 9 agencies (14 percent) were either not applicable,
unclear, or not given.

     In the category of overdue submittals in progress, the most  typical
types of rules were (in descending order):  111(d) plans; CTG rules (RACT,
Stage I, floating roof regulations, etc.); lead SIP's; TSP and  secondary
TSP rules (including wood burning); and I/M rules.

     The time frames of each of these categories are unclear from the
responses, i.e., it is not possible to determine which of the required
rules have longer submittal times than others.   A tentative conclusion,
based on a few States' comments, is that resources are the limiting factor.
Yet a large southwestern State agency had no submittals outstanding in FY
1984.  Several States indicated that lengthy legislative oversight  require-
ments delayed submittal of rules.

     Question D2 dealt with whether studies required in the SIP had been
completed.  The intent was to cover major study efforts such as nontradi-
tional TSP sources or CO hotspot studies.  Auditors were to review  and
discuss a list of major studies due during FY 1983 and evaluate their
status.

     Generally, the agencies have either completed their required studies
(27 agencies or 44 percent have), or there were no studies required or due
during the FY 1983 reporting period (26 agencies, or 43 percent,  were  in this
category).  Six agencies were not doing studies where required; however,
five of these involved TSP or nontraditional TSP studies which the agencies
had placed on hold due to EPA's reevaluation of the PM10 standard.  The
other agency commented that an I/M and tampering study was delayed  due to
inaction by an outside agency.  Two agencies did not reply to the question.

     Most of the completed studies were for one or more of the  following:
CO hotspot analyses, TCM's or nontraditional TSP (e.g., street-sweeping,
wood burning, fugitive dust, etc.).

     Question D6 asked what agencies do to assure that schedules for
adoption and implementation of rules are satisfied.   This question  was
                                  III-31

-------
intended to see if agencies attempt to:   minimize  unnecessary  slippages
in the schedule; maintain reasonable further progress  (RFP); and
adopt and implement required rules on a  schedule consistent with the
attainment deadline.   Auditors were to review outstanding  Part D condi-
tions, SIP approvals  containing "with the understanding" clauses and  any
schedules for rule adoption or implementation contained in the 1982
ozone/carbon monoxide SIP's.  Agencies were to identify who, in-house,
was responsible for tracking schedules and the major causes for any
slippage in the schedules.

     Twenty-one of the 61 agencies indicated that  schedules were contained
in the Part D SIP's.   Twenty agencies said there were  no SIP schedules,
11 agencies had no Part D SIP's,  and answers for the remaining nine agencies
were unclear or not given.   Therefore, the question currently  applies to
about one-third of the 61 audited agencies.  One of these  agencies  identi-
fied delays in adopting non-CTG RACT rules because of  problems in identify-
ing sources subject to the rules.   Another identified  slippages in VOC
RACT schedules such as automobile and light truck  surface  coating, miscel-
laneous metal parts and paper, vinyl, and fabrics  facilities.   However,
there were only three agencies that indicated any  slippages in schedules.
Most agencies did not indicate who is responsible or how they track
schedules.  The few that did indicated incorporating the schedules as
Section 105 grant conditions, including them in a work plan, or setting
up an "event schedule" for tracking purposes.

     Question D10 asked EPA Regional  Offices to identify SIP deficiencies
for which States and  local  agencies would indicate whether submittals
were in progress.  The intent was to determine if differences between
State/local agencies and EPA Regional Offices could be resolved more
promptly or with less friction.  Auditors were to determine if
priority problems were receiving appropriate attention or  if agencies had
a plan for resolution of identified deficiencies.

     Twenty-seven agencies (44%)  indicated some kind of SIP deficiency
existed and that progress toward resolution was being  made. Five types
of deficiencies were  listed for five or  more agencies. These  covered the
following areas: I/M; TSP (iron and steel sources, fugitive sources,
woodburning, etc.); administrative (permit fees, board composition,
malfunction reporting, etc.); O3 SIP-related (attainment demonstration,
rule adoption); and new source review (PSD, offset rules,  etc.). A few
agencies (3 or less)  identified other deficient areas, such as:   CO
SIP's, SO2 rules, lead SIP's, and stack height regulations.

     Three agencies replied that resolution of deficiencies was not in
progress at the time  of the audit.  One  agency was deficient in new
source review (NSR) for which work was in progress but submittal was
delayed.  The same agency reported a SIP modeling  guideline was late  and
there was no set schedule for its submittal.  Another  agency reported
differences with the  Regional Office on  the VOC reduction  target and
asked EPA for clarification.  The third  agency must adopt  locally enforce-
able rules and attributed not having done so to personnel  shortages.
                                  III-32

-------
     Twenty-two agencies (36 percent)  reported that no deficiencies were
identified and nine agencies (15%) did not give a clear response.

             Timeliness of Studies and Regulation Development

                                               All sub-  In
                                               mitted    progress   No   NA   CD

D1 Have SIP regulations or emissions  Agencies    15        33       4    7    2
limits been submitted or, if not,                25%       54%      7%  11%   3%
is progress being made?

                                               Yes   No   NA   CD

D2 Has agency carried out additional  Agencies   27    6   26    2
studies where required by the SIP?              44%  10%  43%   3%

D6 Does the agency have schedules     Agencies   21   20   11    9
in the SIP for adoption or                      34%  33%  18%  15%
implementation of Part D rules?

D10 Is agency making progress on      Agencies   27    3   22    9
identified SIP deficiencies?                    44%   5%  36%  15%

     2.  Familiarity with EPA Policy on Site-Specific SIP Revisions (Question D3)

     Question D3 explores the area of variances, bubbles, emissions trading,
and other site-specific SIP revisions.  The question was intended to
discover if the agency understood and followed EPA policies  on  these SIP
revision topics.  Auditors were to review agency  notebooks and  files to
see if they were complete and up to date, and discuss any  pending site-
specific SIP revisions.  If problems existed, auditors were  to  try  to
find their origin.

     Thirty-one agencies (51 percent) reported that they are able to show
that their site-specific SIP revisions have been  consistent  with  EPA
policy.  About half of these (15 agencies) said the mechanism used  to
assure consistency was review and comment by the  EPA Regional Office.
Others used permits, generic rules, or reviewed "guideline notebooks,"
etc., kept at the agency.  Among the 31 agencies,  bubbles  and emissions
trades are the most commonly occurring site-specific SIP revisions.

     Ten agencies (16 percent) reported, or the Regional  Office said, that
some of their bubbles or variances were inconsistent with  EPA policy.   Of
these, three agencies were calculating VOC allowable emissions  on averaging
times contrary to EPA policy, two agencies had bubbles that  were  found  to
be inconsistent with EPA policy, two agencies had  variances  not approved
by EPA, and three other agencies'  inconsistencies  fell into  miscellaneous
categories.

     The remaining 20 agencies, not in the above categories, had either
never employed bubbles, trades, or variances or did not provide a clear
response to the question.
                                  III-33

-------
        Familiarity With EPA Policy on Site-Specific SIP Revisions

                                                       Yes   No   NA   CD

D3 Can agency assure that
site-specific SIP revisions are                        51%  16%  13%  20%
consistent with EPA criteria?

                                                     EPA
D3 How does agency assure            Permit  Generic Review/  Other  NA   CD
site-specific SIP revisions                  Rule    Comment
are consistent with EPA criteria?
                                      11%     13%     30%      5%   18%  23%

     3.  Implementation of Inspection/Maintenance (I/M) Programs and
         Transportation Control Measures (TCM) (Questions D4 and D5)

     Question D4 deals with whether  EPA-approved TCM's (i.e.,  contained
in an approved SIP) are being implemented.   The intent is to discover if
the State or local agency, DOT,  or MPO  is responsible for tracking TCM's.
Auditors were to review any SIP commitments to  implement  TCM's.

     Thirty-eight agencies (63 percent)  reported that TCM's  were being
implemented, but 17 indicated that responsibility for implementing and/or
tracking these lies with other agencies, usually DOT's or MPO's.  Even so,
19 agencies indicated tracking TCM's by  receiving  reports from various
implementing agencies (e.g., DOT, transit company,  city traffic  department,
Chamber of Commerce for ridesharing, etc.); eight  agencies track TCM's via
periodic review of the consistency of the transportation  improvement  plan
(TIP) with the SIP (four of these agencies also monitor implementation by
being members on an MPO transportation committee) and one agency relies
mainly on on-site inspection of roadway  improvements.  It could  not be
determined from their responses how  the  other 10 agencies are  involved in
tracking TCM's.

     In addition to the above 38 agencies, two  agencies indicated that
required TCM's were not being implemented.  These  and a few  other agencies
cited problems with enforceability of TCM's and the failure  to meet
stated goals, such as a certain percentage increase in transit ridership
or improvement in ridesharing participation.

     Of the remaining 21 agencies (34 percent), 19  are not required to
implement TCM's and two did not clearly  respond to  the question.

     Question D5 covered whether the SIP credits for  the  I/M program  are
being achieved.  Auditors were to review operating  I/M programs  with
respect to the ten elements required by  EPA policy  published in  the January  22,
1981, Federal Register.

     I/M programs were required in 32 of the 61 audited agencies.  At
the time of audit (January to June 1984), less  than half  (14)  of these
programs had been in operation for 1 year or more  and the others had
either been in operation for less than  a year or had  not  yet started.


                                  III-34

-------
Therefore, of the 32 required programs,  only 7 agencies indicated that
I/M was consistent with the SIP credits  claimed and another five indicated
having found problems of inconsistency with the I/M credits in the SIP
(low failure rates, noncompliance/low participation rates, or poor reporting).
Three other agencies with I/M programs in operation for a year or longer
gave unclear responses.  For the remaining 17 agencies, the results
indicated that they could not yet make this determination.  (One of the
three agencies had an I/M program in operation for more than 8 years.
However, it could not be clearly determined from the audit response if
the program was consistent with the SIP  credits.)

     In eight of the 32 areas, an agency other than the State or local
control program, such as the State DOT,  had primary responsibility for
administering the I/M program.
                          I/M, TCM Implementation

                                                  Yes   No   NA   CD

D4 Where required, are                 Agencies    38    2   19    2
TCM's being implemented?                          63%   3%  31%   3%

                         Reporting by  Agency member  Other (TIP
                         implementing  of MPO transp. review, on-site
                         agency        committee      inspec., etc.)   NA   CD

D4 For TCM's,  Agencies       15            4                7         25   10
how does agency              25%           7%              11%        41%  16%
assure
implementation?

                                                  Yes   No   NA   CD

D5 For I/M, is implementation          Agencies     7    5   29   20
consistent with SIP credits?                      11%   8%  48%  33%

                         Less than    1 year    Not yet
                         1 year       or more   in operation    NA

D5 Length of   Agencies       6          14          12         29
I/M operation               10%         23%         20%        48%

     4.  SIP Coordination and Validation (Questions D7, D8, and D9)

     Question D7 attempted to find out if and how agencies track RFP for
ozone and carbon monoxide in extension areas.  This  was  basically  a survey
of the tracking approach used by State and local  agencies.   Auditors
were to review the agencies'  annual  RFP report and to try to determine:
(1) if the claimed reductions were adequately documented, (2) who  was
responsible for RFP tracking, and (3)  how the RFP report was used  in
agency evaluation and planning activities.
                                  III-35

-------
     Of the 26 agencies with ozone extension areas,  12 indicated a system
was available to document VOC emissions reductions claimed in  the ozone RFP
demonstration.  Seven agencies said they did not have a system to do this
and the remaining seven did not give a clear answer  either way.   On the
second part of the question, 13 agencies reported having the ability to
assure that the VOC reductions (being tracked in the annual  RFP reports)
were consistent with current emissions inventory or  enforcement data,  but
7 said they could not do this and 6 did not answer the question either
way.

     Of the 34 agencies with extension areas for CO, 11 indicated that a
system was available to document CO emissions reductions claimed in the CO
RFP demonstration.  However, 12 agencies said they did not have a system
to do this, and the rest did not clearly answer.  One State said it had
not yet been required to submit RFP reports for its  CO extension areas.
Responses to the second part of the question were similar to the ozone
part of question D7: 11 said they could assure consistency between CO emis-
sion reductions and inventory or enforcement data, 11 could not, and 12
did not respond clearly.

     In general, RFP tracking for VOC point sources  is based on the
periodic updates of the VOC emissions inventory.  If the inventory is
updated annually (the audit revealed that 65 percent of agencies in
nonattainment areas performed annual updates), current actual emissions
can be compared to the RFP line to see if the reductions called for in
the SIP strategies have been attained.  However, most State or local
agencies do not update area and mobile source emissions as frequently as
point sources.  Therefore, only part of the RFP reductions (the point
sources) may be tracked for VOC.  Only four State agencies reported
annually tracking RFP for point, area, and mobile sources.  A couple of
these also plot air quality trends for ozone and compare to the emissions
trend.  A few States track air quality data instead  of emissions.  Others,
citing insufficient resources, use surrogate numbers for the baseline
data (e.g., using traffic count growth, instead of VMT growth  generated
from a transportation model).  Several States requested additional guidance
from EPA on RFP tracking, and a few were apparently  delaying any kind  of
RFP tracking until EPA approved their 1982 O3/CO SIP.  One Regional
Office suggested that RFP reports of some States had left out  emissions
from upsets, maintenance, and malfunctions, and had  not included emissions
reductions from shutdown or compliance by a source in advance  of the
scheduled due date.
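
     As a simplified illustration of the comparison described above (the
figures and the linear form of the RFP line are assumed for this example
and are not drawn from any audited agency), suppose a 1980 base-year point
source inventory of 100 tons per day (TPD) of VOC and a 1987 attainment
target of 65 TPD:

    \[
    \mathrm{RFP}(t) = E_{\mathrm{base}} -
    \left(E_{\mathrm{base}} - E_{\mathrm{target}}\right)
    \frac{t - t_{\mathrm{base}}}{t_{\mathrm{attain}} - t_{\mathrm{base}}},
    \qquad
    \mathrm{RFP}(1984) = 100 - 35 \times \tfrac{4}{7} = 80\ \mathrm{TPD}.
    \]

An annually updated point source inventory showing 1984 actual emissions
above 80 TPD would indicate that the area is behind its RFP schedule.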

     The RFP picture is about the same on the CO side.  The erratic
updating of the mobile source inventories affects CO more than O3.  Not
more than five agencies indicated that they documented changes in traffic
patterns and adjusted VMT accordingly.  The other five agencies claiming to
use a "system" to document RFP reductions either track air quality data
or monitor TCM implementation, but do not actually document mobile source
emissions reductions.

     Question D8 asked how the agencies are able to  substantiate that the
emissions reductions claimed in the SIP control strategy are actually
                                  III-36

-------
achieved.  The intent is to find out when and how initial  compliance is
determined by the agency for stationary source rules (such as Part D RACT
for solvent coatings) and TCM implementation.  Auditors were to coordinate
this activity with the compliance audit.  Current and planned activities
were to be summarized.

     Sixty-six percent (40 agencies) said that they could substantiate
their SIP emissions reductions, 18 percent (11 agencies) said they could
not, and the remaining responses were "not applicable" (no nonattainment
areas) or "could not be determined."  In keeping with the responses on
the previous questions on RFP, the vast majority (34 of the 40 agencies)
responded that stationary source activities,  such as inspection,  permitting,
and/or testing (as opposed to area and mobile source activities)  were the
principal means of substantiating compliance  with the control strategies.
Several agencies also verified or monitored TCM implementation.  The four
agencies in the "other" category rely on some kind of source reporting
when compliance schedules come due.  In general, the level of detail in
the responses was not sufficient to identify  the quality associated with
any of the three categories of verification methods.

     One agency indicated that it required major TSP, S02, and VOC sources
to perform stack tests every two years.  Some agencies claimed to limit
source growth to the allowance in the latest  SIP via permit conditions.
One Regional Office recommended that a State  agency improve its documenta-
tion of SIP reductions and improve its verification of information submitted
by the source.  Few States reported making any attempt at verifying
source reported compliance data.  Although agencies may be doing  such
verification via routine source inspection and permitting, the audit could
not confirm this.

     Question D9 asked when and how the agency was planning to address
changes in the SIP growth projections.  The intent was to determine if
there was any periodic review of the growth projections against current
data.  Auditors were to discuss data sources  and any process used to update
the projections and to identify the parties responsible for evaluating any
changes in the data.  As background to this question, Table 5 also gives
responses from the audit where agencies indicated if they believed that
the SIP growth projections were adequate for  point source or area source
growth and, therefore, periodic evaluation was thought to be unnecessary.

     The results indicated that only 11 agencies (18 percent) performed a
periodic review of growth projections.  Most  (36 agencies or 59 percent)
did not.  Two agencies indicated the question was not applicable  and 12
did not respond clearly one way or the other.  Several comments from
agencies giving a negative response indicated that, generally, growth is
not currently perceived as a problem or that  their SIP's contain  conserva-
tive projections (i.e., relatively large projected percentage increases
in emissions) that are not likely to be exceeded.  In addition, 25 agencies
indicated that point source growth was adequately projected in the SIP;
only one agency had concerns that additional  VOC reductions from  point
sources may be necessary.  Similarly, 24 agencies indicated that  the SIP
projections were seen as adequate for the currently expected growth in
                                  III-37

-------
area sources.  The one agency mentioned previously also indicated that
further reductions in VOC from area sources might be necessary.

     Some agencies commented that growth is checked by a review of ambient
data or of the emissions inventory.  A comprehensive review of growth data
is not done by these agencies.
                     SIP Coordination and Validation

                          (Number of Agencies)

                                                Yes   No   NA   CD  Total

D7 Does agency have a system         ozone       12    7    0    7    26
to document emissions reductions     CO          11   12    2    9    34
claimed in RFP demonstration?

D7 Can agency assure RFP emissions   ozone       13    7    0    6    26
reductions are consistent with       CO          11   11    1   11    34
inventory or enforcement data?

D8 Is agency able to show SIP        agencies    40   11    2    8
emissions reductions claimed are                66%  18%   3%  13%
achieved in practice?

                                     Source
                                     Inspec-
                                     tion,     TCM
                                     Permit,   Veri-
                                     or Test   fication  Other   NA    CD

D8 Principal verification  agencies    34         2        4     13     8
method employed for above.            56%        3%       7%    21%   13%

                                                Yes   No   NA   CD

D9 Does agency perform a periodic    agencies    11   36    2   12
review of whether SIP growth                    18%  59%   3%  20%
projections have actually occurred?

D9 Agency believes SIP growth        agencies    25    1         35
projections are adequate for                    41%   2%        57%
point source growth.

D9 Agency believes SIP growth        agencies    24    1         36
projections are adequate for                    39%   2%        59%
area source growth.

                                  III-38

-------
                           IV.  NEW SOURCE REVIEW

A.  INTRODUCTION

     The new source review (NSR) audit examined the ways that State and
local  agencies require and actually carry out the preconstruction review of
new and modified stationary sources of air pollution.  The audit was designed
to address all sizes of sources but placed an emphasis on sources defined
as "major" according to the Federal Clean Air Act.  This is consistent with
EPA's  primary focus in its preconstruction review regulations to establish
minimum requirements for major stationary sources.  It should be recognized,
however, that State and local  air pollution control  agencies typically
focus  much of their time and resources on the more numerous population of
minor  sources that must also undergo preconstruction review before a permit
to construct can be issued.

     A specific goal of the NSR audit was to provide an overall  snapshot of
how State and local preconstruction review permit programs, including preven-
tion of significant deterioration (PSD) and NSR programs specifically for
major  sources, are now operating.  Additional goals were to assess State
and local programs in terms of regional and national consistency, provide
information feedback concerning certain agency practices, identify potential
problems, particularly where Clean Air Act requirements are not  being met,
reach conclusions as to the implications of the audit findings, and
identify areas where EPA may need to emphasize future policy development.
This report attempts to convey the FY 1984 audit results relative to all  of
these goals.

     The audit committee selected seven basic audit topics in developing
the NSR audit questionnaire:  (1) Administrative Procedures, (2) Applicability
Determinations, (3) BACT/LAER Determinations, (4) Ambient Monitoring (PSD),
(5) Ambient Impact Analysis, (6) Emissions Offset Requirements,  and (7)
Permit Specificity and Clarity.  A number of questions were established
under  each audit topic to examine whether and how agencies were  carrying
out specific preconstruction review responsibilities.  Typically, the
responses were to be provided in the form of "yes" or "no" answers, but
special comments could be provided wherever necessary.

     The NSR audit considered responses from 64 air pollution control
programs.  This included 49 States,* the District of Columbia, Puerto Rico,
the Virgin Islands, and 12 local agencies.  Where a State program involved
one or more district offices and these offices were audited, they were
considered part of the State program.  Local agencies were considered
separately, even though there was a dependency on the State for  certain
program operations in some cases.

B.   MAJOR FINDINGS AND CONCLUSIONS

     This section summarizes the findings of the 1984 NSR audit.  Where
appropriate, conclusions have been drawn to clarify and put into proper
perspective the specific findings.  In addition, findings that are determined
to have national implications are discussed in greater detail than are
problems which appear to be more isolated in nature.  For a better under-
standing of how the major findings were derived, the reader is referred to
*The State of California does not have authority to issue permits and does
not implement an NSR program.  These activities are performed by the
local air pollution control agencies.

                                    IV-1

-------
Sections IV.C. through IV.I., where a breakdown of the individual  audit
questions and answers is provided.

1.  Administrative Procedures

    Major Findings

Public Participation and Notification--

     o  Sixty-one percent of the 64 audited agencies are reported  to routinely
        provide the public with an opportunity to comment on each  proposed
        major source permit.  The rate drops to 31 percent when both major
        and minor source permits are considered.  The major sources
        commonly not addressed are non-PSD sources locating in attainment
        or unclassified areas.  Figure 1 below illustrates the percentage
        of agencies for which public comment is routinely required for
        specified source types.

     o  When the public is notified, agencies appear to provide sufficient
        information concerning proposed permit actions, and adequately
        notify other affected government agencies and officials, including
        Federal Land Managers in Class I areas, of the proposed determination.
                            FIGURE 1
                 PUBLIC PARTICIPATION REQUIREMENTS
         [Bar chart (not reproduced):  percent of agencies
         routinely requiring public comment, by source type.]
                                    IV-2

-------
Application Documentation and Determination of Completeness--

     o  More than 80 percent of the agencies audited state that they notify
        PSD permit applicants concerning the finding of completeness/incom-
        pleteness.  All applicants are reportedly notified if the application
        is determined to be incomplete.

     o  Almost 90 percent of the agencies indicate that a record is kept
        that tracks when agency actions were taken with respect to permit
        applications.

Prerequisite Statewide Source Compliance--

     o  With only one exception, all  audited agencies try to ensure that
        when a major source or major  modification applies for a permit in  a
        nonattainment area, all existing major sources owned or operated by
        the applicant throughout the  State are in compliance (or on a
        compliance schedule).

     o  Seventy percent of the agencies claiming to ensure Statewide source
        compliance report that documentation of compliance is provided in
        the permit file.  This finding implies no specific wrongdoing on
        the part of those agencies who do not provide documentation in the
        permit file.  However, the practice is clearly recommended as a
        means of verifying within the permit process that the source compliance
        prerequisite is being checked.

     Conclusions

     Overall, the initial assessment  of how the administrative procedures
have been implemented provided generally acceptable findings with the caveat
that this first audit was not supported by a significant amount of major
source and major modification permit  data.  However, the failure of some
agencies to require public notification for all major sources is a particular
concern.

     The public participation requirement under 40 CFR 51.18(h) is general
in its coverage in that it does not specify what sources and modifications
must be addressed.  However, when taken literally, all new and modified
sources would fall under its coverage.  Some States for years have maintained
that the public notification process  is expensive and often has little
impact on the outcome of the final permit.  Consequently, public comment is
oftentimes sought on a discretionary  basis, particularly for minor source
permits and permits issued to major non-PSD sources locating in attainment
and unclassified areas.  EPA does not now have a policy which supports such
limited coverage, but EPA should examine its own regulations to reassess the
need to have them apply to all permitted sources.  Until such reassessment
is actually done, States are encouraged to examine their current procedures
and to consider the expeditious implementation of the public participation
process for at least major sources and major modifications, regardless of
the applicable type of review.
                                    IV-3

-------
2.  Applicability Determinations

    Major Findings

     o  Almost half of the audited agencies may be inadequately making
       applicability determinations  for major new  sources  and major  modifi-
       cations because of a  misunderstanding or misuse  of  the concept  of
       potential  to emit (PTE).   (See Figure 2. The  inset chart,  "PTE
       Problems," illustrates  proportionately the  various  types  of problems
       associated with potential to  emit that the  audits identified.)

     o  Most agencies indicate that they use an acceptable concept of actual
       emissions  to determine  if a "net significant emissions increase"
       would occur for a source modification.  Fewer than 10 percent were
       found to have incorrectly used allowable emissions as their netting
       baseline.   (See Figure  2.)

     o  Most agencies appear to adequately address fugitive emissions, to the
       extent that such emissions  are  reasonably quantified, in  calculating
       "potential  to emit" and "net  emissions increase."   Problems were
       identified in less than 10  percent of the agencies  audited.   (See
       Figure 2.)

     o  With only one exception, agencies report using their State implementation
       plan (SIP)  approved definitions  of "source" for PSD, NSR, new source
       performance standard  (NSPS),  and  national emission standard for hazardous
       air  pollutants  (NESHAP) applicability.   (See Figure 2.)  However, some
       States have definitions that  are  known to differ from EPA's 1980 amendments
       and  may require  updating.
                                      FIGURE 2
                        SOURCE APPLICABILITY DETERMINATIONS
         [Bar chart (not reproduced).  Legend:  agencies with problems
         found, out of 64 total agencies audited.]
                                   IV-4

-------
     o  With only one exception, agencies appear to correctly apply the
       §107 area designations when making NSR applicability determinations
       for major sources.

     o  The audited agencies issue construction permits to many types of minor
        sources.  Agencies differ, however, in the criteria used to determine
        which minor construction projects are subject to their permit requirements.

     o  The potential for a source to divide its construction plans into a
        series of applications in order to avoid major source review was
        identified as a concern in fewer than 10 percent of the audited
        agencies, and no cases were found where such circumvention actually
        occurred.

     o  Less than 5 percent of the agencies indicated that substantive
       conditions once issued within construction permits could be overruled
       by a board or commission without requiring new analyses of the
       affected provisions or public notification.

    Conclusions

     The misapplication of the EPA "PTE" definition in approximately
40 percent of the audited agencies is a problem that EPA considers to be
significant.  Several types of problems need to be addressed.  Some permits
could be improved simply by providing the documentation necessary to show
how the applicability determination was made.  For others, there is a need
to establish Federally enforceable permit conditions as required to legally
restrict the potential to emit to some level below maximum design capacity.
In many cases, such conditions are already in an operating permit which the
State and local agencies consider enforceable, but EPA does not.  In others,
however, the operational and design limitations used to restrict the source's
potential to emit are not included in any permits.

     All agencies should examine their policy and procedures to verify,
first, that they are using EPA's definition of "PTE" for determining source
applicability to major source requirements, and second, that they are using
the concept properly.  Proper use of the definition requires that special
permit conditions be established when necessary to ensure Federal  enforce-
ability.  Sources should not be allowed to avoid major source review without
the appropriate Federally enforceable limitations.  (See additional  comments
concerning the issue of Federally enforceable conditions under the conclusions
for Permit Specificity and Clarity, page IV-10.)
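
     A hypothetical example may help illustrate the arithmetic at issue
(the emission rates and operating hours below are assumed for illustration
and are not taken from any audit).  Absent enforceable restrictions,
potential to emit is computed at maximum design capacity over continuous
operation:

    \[
    \mathrm{PTE} = 25\ \tfrac{\mathrm{lb}}{\mathrm{hr}} \times
    8760\ \tfrac{\mathrm{hr}}{\mathrm{yr}} \div
    2000\ \tfrac{\mathrm{lb}}{\mathrm{ton}} \approx 110\ \mathrm{TPY}.
    \]

Such a source would be major against a 100 TPY threshold.  A Federally
enforceable permit condition limiting operation to 3,000 hours per year
would yield 25 x 3,000 / 2,000 = 37.5 TPY; if that limitation appears only
in a permit EPA does not consider enforceable, the source would remain
major for applicability purposes.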

3.  BACT/LAER Determinations

    Major Findings
     o  With only one exception, all agencies report that they require each
        best available control technology (BACT) analysis to address each
       regulated pollutant potentially emitted by the PSD project  in significant
       amounts.  Isolated discrepancies were  revealed, however,  in several
       PSD permits.
                                    IV-5

-------
     o  Most agencies claim to require the consideration of more than one
       BACT control  strategy, but  this was often  done  on a  case-by-case
       basis rather than routinely.

     o  Most agencies reportedly perform an independent evaluation of the
       applicant's BACT analysis,  or review the applicant's analysis.  However,
       audits of selected PSD permits identified  instances  where the  agency's
       own evaluation was not very apparent.

     o  A tendency for BACT to conform with NSPS/NESHAP has been found to
        occur in at least 27 percent of the audited agencies for at least
        one particular pollutant or source category.  The tendency for lowest
        achievable emission rate (LAER) to conform with NSPS/NESHAP appears
        to be significantly lower.

     o  Eighty-eight percent of the agencies indicate that they use EPA's
       BACT/LAER Clearinghouse as  part of their BACT/LAER determination
       procedure.  Seventy-eight percent stated that the clearinghouse
       provides useful  information,  but many also cited  certain  shortcomings.

    Conclusions

     Agency policy and procedures  appear to be adequate  in  most  cases.
However, there were relatively few permits issued from which the agencies'
practices could be fairly evaluated.  The tendency  on  the part of  some
agencies to set BACT at levels conforming to NSPS is of  some concern  but
should not necessarily be interpreted as a deficiency  at this time.  What
it does suggest is that the BACT requirement is not typically being used as
a technology-driving mechanism as  Congress had intended. As greater  amounts
of the available PSD increments  are  consumed, noticeable changes are  likely
to be observed in the use of BACT  to establish tighter levels of control.
In any event, EPA will  continue  to audit the BACT determinations being made
by State and local agencies to identify where and when BACT effectively
provides better control than that  provided by NSPS.

4.  Ambient Monitoring

    Major Findings

     o  The vast majority of State and local agencies report that they
        adequately--

       (a) require applicable PSD  sources to submit preconstruction ambient
           monitoring data,

       (b) determine when an applicant may use representative existing air
           quality data rather than  conduct new monitoring, and

       (c) ensure that monitoring  is performed in accordance with  PSD  quality
           assurance requirements.

     o  When a PSD source is exempted from the preconstruction ambient air
       quality data requirement, agencies, with few exceptions,  indicate
       the basis for the exemption in the preliminary  determination.

                                    IV-6

-------
    Conclusions

     Preliminary indications are that few problems exist with respect to
the PSD requirements for preconstruction monitoring data.  In retrospect,
it would have been useful  to have included in the audit a survey question
to identify the number of PSD applicants required to conduct preconstruction
monitoring.  Information that was volunteered in a portion of this year's
audit reports suggests that a large amount of the monitoring data being
collected to meet the PSD requirements is being derived from existing
monitors, rather than from new monitoring by the applicant.  This general
assumption is based on the fact that for nine audits from which  the infor-
mation could be determined, 15 out of 19 PSD applicants were allowed to use
existing representative data.  If this information is indicative of the
national trend, then additional attention should be given to the adequacy
of EPA's current guidance and to how it is being used by State and local
agencies to approve representative air quality monitoring sites.  The
questions in this year's audit were somewhat general and did not specifically
address the criteria set forth by EPA in its guideline on PSD monitoring.

5.  Ambient Impact Analysis

    Major Findings

PSD Increment Consumption--

     o  Agencies typically respond that they give adequate consideration to
       PSD increment consumption from major projects.  However,  auditors
       believe that almost 20 percent of the audited programs need to
       devise a means of tracking minor source emissions where the PSD
       baseline has already been triggered.

     o  Most agencies have no specific plans to periodically assess PSD
       increment consumption beyond the analysis accompanying the PSD
       application for major projects.  There is a general  sense of uncertainty
       as to how such periodic assessments are to be done.

     o  Agencies reportedly give adequate consideration to both the long-
       and short-term PSD increments.

     o  With only one exception, agencies report that they implement criteria
       meeting or exceeding the special significance criteria for assessing
       major source impacts on Class I area increments.

NAAQS Protection—

     o  The emissions baseline most commonly used to evaluate the impact on
       the national ambient air quality standards (NAAQS)  of new and modified
       sources consists of the modeled allowable emissions  from  the proposed
        source plus monitored background air quality.  However, a significant
       amount of modeling using allowable emissions from existing major
       sources also appears to occur.  These findings were  derived from a
       survey question from which EPA will  evaluate current practices and
       consider the need for policy development.


                                    IV-7

-------
     o  Approximately 45 percent of the audited agencies claim to evaluate,
       on a routine basis, the ambient impact of minor source construction.
       Many other agencies do so on a case-by-case basis when they  suspect
       that an ambient air quality problem could result.
Dispersion Models--
     o  While most agencies use or require the use of EPA reference models to
       carry out the ambient impact analysis, procedural and other problems
       were revealed in approximately 40 percent of the audited agencies.
       Also, 40 percent of the audited agencies do not always check the
       applicants' modeling analyses or do not check them as fully as  the
       auditors deemed necessary.  (See Figure 3.)
                                   FIGURE 3
                   DISPERSION MODELING FOR NEW SOURCE REVIEW
         [Charts (not reproduced):  model adequacy and check of results,
         including improper use of models, needed modeling not required,
         and inadequate documentation.]
    Conclusions

     The fact that many agencies do not routinely model the ambient  impact
of minor sources does not generally appear to be a problem at this time.
Audit reports do show that agencies tend to model minor source  impact
(oftentimes using screening models) when the possibility  for ambient  problems
is suspected, usually through an awareness of existing ambient  air quality
concentrations in the area surrounding the proposed source.  However,  PSD
increments are more difficult to account for because existing air quality
                                     IV-8

-------
data often cannot be used to determine the extent to which increment
consumption is taking place.  For this reason, it becomes important to
begin tracking all  emissions that contribute to the PSD increments once the
baseline has been triggered.  In fairness to those agencies that were cited
for not tracking minor source emissions, it is believed that most agencies
do not.  Consideration of the effects caused by both new and existing minor
sources in PSD areas becomes increasingly more important as the amount of
available increment is reduced.  The absence of EPA guidance in terms of
developing increment tracking procedures, particularly with respect to
short-term averaging periods, needs to be addressed before State and local
agencies can be expected to do much developmental work of their own in this
area.
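
     Conceptually, increment tracking amounts to maintaining a running
ledger of modeled concentration contributions against the applicable
increment.  As a hypothetical illustration (the source impacts are assumed;
20 ug/m3 is the annual Class II SO2 increment), an agency that has permitted
three increment-consuming sources with modeled annual impacts of 3.2, 4.5,
and 1.8 ug/m3 at the controlling receptor would compute

    \[
    20 - (3.2 + 4.5 + 1.8) = 10.5\ \mu\mathrm{g/m^3}
    \]

of increment remaining at that receptor.  The same accounting must be
repeated at each receptor for each averaging time, which is part of what
makes the procedure burdensome without further EPA guidance.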

     With regard to dispersion modeling practices, the use of actual (rather
than allowable) emissions and other technical  aspects of modeling by some
agencies needs further study before their analyses can be considered fully
satisfactory.  In addition, agencies should generally be giving more attention
to their own technical documentation of the modeling analyses provided by
permit applicants.   For some agencies, this issue may be difficult to
address in the near term because of in-house limitations on resources and
expertise.  EPA technical assistance may be the best near-term solution
while these agencies seek to establish or upgrade their modeling capabilities.
In other cases, it  appears that there is a need for agencies to review
their procedures in order to ensure that modeling analyses receive the
appropriate in-house attention.

6.  Emissions Offset Requirements

    Major Findings

     Little permitting activity involving emissions offsets occurred
during the audit period.  As a result, a large number of agencies
answered "not applicable" to the question asked in this portion of the
audit.  Responses in some audit reports did indicate, however, that some
agencies are not fully aware of Federal  criteria for crediting offsets due
to early source shutdowns and production curtailments, as well as other
issues involving the proper use of emissions offsets.

    Conclusions

     While isolated permitting problems were found in some audits, these
problems should not be taken to imply that a problem exists nationally.  EPA
should consider the need to clarify and even expand its guidance regarding
the appropriate use and timing of emissions offsets.  This would assist agencies
in the future to ensure that major construction in nonattainment areas meets
the requirements of the Clean Air Act before a permit can be issued.
                                    IV-9

-------
7.  Permit Specificity and Clarity

    Major Findings

    o  Approximately 75 percent of the audited agencies said that they
     routinely identify all  emissions units  and their  allowable  emissions
     rates on the final permit.

    o  When the allowable emissions rates are stated in the permits, agencies
     claim that they are typically clear, concise and  enforceable.   EPA
     auditors did find, however, that allowable emissions rates  are not
     always identified in the permits.

    o  Approximately half of the audited agencies reportedly do not routinely
     state or reference applicable compliance test methods in  the permit
     terms and conditions.  Many agencies believe that it is not necessary
     to address compliance test  methods  as permit conditions.

    o  Approximately one-third of the audited agencies demonstrated inconsistent
       policy in terms of identifying special operational and design limitations
       as permit conditions when needed to restrict a source's potential to
       emit.

    o  Most agencies believe that the information contained in a source's
     application for a permit is an enforceable part of the permit  itself.
     Only 10 percent of the agencies indicated that the information contained
     in permit applications submitted to them is not an enforceable part of
     the permit.

    Conclusions

     The audit raised some important questions concerning the  enforceability
of the permits being issued by many State and local  agencies.  In some
cases, the lack of adequate documentation in the permit file made it
very difficult for auditors to determine whether special  permit  conditions
should have been included in the permit.  In other cases, however,  it
appears that important limitations affecting a source's operation or produc-
tion capacity were obviously not addressed.   In the absence of the  necessary
permit conditions, such presumed limitations upon a source might prove
difficult to identify or enforce.

     Further complicating the issue of enforceability  are (1)  the variety
of techniques that agencies currently use to establish "enforceable"
limitations upon sources for such things as  defining allowable emissions,
designating applicable compliance testing procedures,  and restricting the
operation and production capability of certain sources;  and (2)  the fact
that many agencies typically rely upon operating permits  rather  than
construction permits to establish limitations, regardless of the specific
technique used.  EPA traditionally has held  that conditions contained
in most operating permits are generally  not  Federally  enforceable,  even
though State and local agencies  themselves are often able to enforce them.
                                   IV-10

-------
     EPA needs to examine carefully the whole issue of permit enforceability,
including clarification of the minimum criteria required for establishing
Federally enforceable permit conditions.  Only after this is accomplished
can EPA fairly assess the adequacy of current State and local agency permit
issuance practices.

C.  ADMINISTRATIVE PROCEDURES

     The FY 1984 audit examined the adequacy of State and local  agency procedures
that govern the way permit applications are checked for completeness of
information, tracked during the entire review process, and made  available
for public review once a preliminary determination has been made.   In addition,
the audit sought to determine whether agencies are adequately ensuring
compliance of existing sources when the owner/operator applies for a new
major source permit in a designated nonattainment area.  This compliance
provision is required under Part D of the Clean Air Act.

Public Participation and Notification—

     Almost 40 percent of the audited programs revealed inconsistencies in
at least one of the three areas examined for providing public notice of a
proposed permit action.  These areas include the types of sources  for which
public notice must be issued, the adequacy of information contained in the
public notice, and the procedures used to notify government officials of
the pending permit action.

     1.  For which new or modified sources was the public afforded an
opportunity to comment on proposed permits?

                                                         YES    NO

     a. all PSD (100/250 TPY) sources                    91%    9%

     b. all 100 TPY nonattainment NSR sources            81%    19%

     c. all other 100 TPY sources                        63%    37%

     d. other sources                                    75%    25%

     The summary of responses illustrates the fact that the public is not
always being afforded a formal opportunity to comment on the construction
permits being issued by State and local agencies.  Of the six agencies that
do not require public notification for PSD sources, none has PSD program
authority.  Consequently, the PSD program, including the requirement for
public notification, is being carried out by EPA in these States.   Even so,
sources subject to a Federal PSD program are still  required to obtain a
State or local agency permit to construct and should also be required to
undergo public comment before a permit is issued by the State or local
agency.  The audit responses do not suggest that this is the case,  however.

     Concerning the 12 agencies that do not require public notification for
major sources in nonattainment areas, a problem only appears to  exist in
two.  Of the remaining 10 agencies, one agency indicated that it had no


                                   IV-11

-------
nonattainment areas;  three agencies  are under a  Federally-imposed construction
ban; one (local) agency does not review major sources;  and  five agencies
reportedly agreed, just prior to the 1984 audits,  to begin  notifying the
public when a major source sought a  permit in a  nonattainment  area.   (The
effects of these agreements will be  observed during  the 1985 national  audit.)

     Most agencies that fail to adhere to the public participation requirements
do so for major non-PSD sources locating in attainment or unclassified areas.
Twenty-three agencies fall under this category.   Next year's audit will attempt
to examine the number of permits that this problem may  actually involve.

     Finally, 48 agencies appear to  afford the public an opportunity to
comment concerning some major and minor source permits  being issued, but
not always on a routine basis.

     Taking all these factors into account, the  responses can  then be
collectively summarized as follows:

     Thirty-nine agencies (61%)  routinely provide  the public with an opportunity
to comment concerning all major  source permits issued by those agencies.
In addition, 20 of the 39 agencies also provide  similar opportunity  for
public comment concerning essentially all  minor  sources to  which permits are
issued.

                                                               Yes     No

     2.  Do the public notices routinely provide adequate
information to satisfy the agency's  public participation       72%     28%
requirements?

     All but one of the audited  agencies reportedly  issue public notices
offering the public an opportunity to comment on at  least some proposed
permits.  However, the notices issued by 18 agencies reportedly do not
routinely provide sufficient information.  The most  common  finding involved
the lack of a statement identifying  for the public the  estimated ambient
impact of the source, including  the  degree of increment consumed by  PSD
sources.  This omission was identified in notices  issued by 12 agencies.
Some agencies commented that information pertaining  to  the  source's  ambient
impact is required only for PSD.  EPA acknowledges that the inclusion of
such information is specifically called for only in  the PSD regulations,
but believes that it  is important to convey the  results when an ambient
impact analysis is performed.  It could not be determined in all  cases
whether PSD or non-PSD permits were  involved when  the omission was identified,
but in at least 4 of the 12 agencies, the finding  did pertain  only to
non-PSD permits.

     Five audits found other omissions, including failure to provide a
preliminary determination, failure to require an opportunity for public
hearing, or failure to make adequate source and analysis information
available for public inspection.
                                                                 Yes    No
     3.  Were other State and local  air pollution  control
agencies and other officials whose jurisdictions might  be        92%    8%
affected by the proposed new or  modified source  adequately
notified of the proposed action?


     Most programs reportedly provide adequate notice to government agencies
and officials.  Only five agencies were found to have inadequate notification
procedures to ensure that the appropriate agencies and officials, particularly
Federal Land Managers, are routinely notified of pending permit actions.

Application Documentation and Determination of Completeness—

                                                           Yes       No

     4.  Is there a review to see if each permit
application is complete?

         a.  Agency reviews applications for              100%
             completeness

         b.  applicant is informed concerning the
             finding of completeness/incompleteness        83%      17%

     State and local agencies are required under 40 CFR 51.24 to notify all
PSD permit applicants concerning the status of their applications, i.e.,
complete or incomplete.  Some agencies appear to provide such notification
to all applicants or at least to all applicants for major sources.  However,
at least 11 agencies with responsibility for carrying out the PSD program
appear to notify the PSD applicant only if the application is found to be
incomplete.

                                                           Yes     No

     5.  Is a formal record kept on file which documents
agency action with respect to a permit application?        89%     11%

     Only seven audits identified agencies that did not maintain a status
record of actions taken on a particular permit.  It should be noted that,
with one exception, these agencies handle relatively few permits, and
particularly few permits for major sources or major modifications.  While
there is no Federal requirement to maintain a permit tracking system, such
a system is believed to improve the efficiency of the overall permit process,
including the timely issuance of permits.  The high percentage of agencies
that maintain such a system
appears to attest to this.  Information from the audit reports indicates
that the existing systems range from informal to computerized tracking
systems.
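
     To illustrate, the following is a minimal sketch in Python of the kind
of status record such a tracking system might keep.  The field names,
actions, and dates are hypothetical and are not drawn from any audited
agency's system.

     # Hypothetical sketch of a simple permit-tracking record; the fields
     # and logged actions are illustrative, not taken from any agency system.
     from dataclasses import dataclass, field
     from datetime import date

     @dataclass
     class PermitApplication:
         permit_no: str
         source_name: str
         received: date
         actions: list = field(default_factory=list)   # (date, action) history

         def log(self, when: date, action: str) -> None:
             """Record an agency action taken on the application."""
             self.actions.append((when, action))

     app = PermitApplication("84-001", "Example Source Co.", date(1984, 3, 1))
     app.log(date(1984, 3, 15), "determined complete; applicant notified")
     app.log(date(1984, 5, 1), "preliminary determination; public notice published")
     app.log(date(1984, 6, 10), "final permit issued")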

Statewide Source Compliance--
                                                           Yes    No   NA*
     6.  For major new or modified sources which are
     subject to Part D of the Clean Air Act (nonattain-    72%    3%   25%
     ment area major sources), has the agency ensured
     that other sources owned by the applicant are
     in compliance, or on a compliance schedule,
     with applicable SIP requirements?
*"Not applicable"

     With the exception of one program,  reported data indicated that all
applicable agencies routinely seek to ensure that when a major source
applies for a permit in a nonattainment  area, all  existing  major sources
owned or operated by the applicant throughout the State are in compliance
(or on a compliance schedule) with SIP requirements.   Differences did
exist, however, in the way that such compliance was  ensured,  and whether
such compliance was documented in the appropriate permit file.

     Although it could not be determined in all  cases, some agencies
require the applicant to certify the compliance of existing sources, while
others simply run an in-house compliance check.   A few agencies indicated
that a specific question regarding Statewide compliance is  provided on the
permit application; other agencies reportedly are considering doing the
same.
     7.  Does the agency document Statewide             Yes   No   NA   NR*
source compliance in the permit file?                   52%   23%  23%   2%
*No response
     Fifteen agencies indicated that they did not provide any documentation
of Statewide source compliance in the permit  file.   While no specific
requirement for such documentation exists, agencies  should have some record
readily available to show that this requirement  is  being met as a permit
consideration, as required by the Clean Air Act.

D.  APPLICABILITY DETERMINATIONS

     This portion of the new source review audit sought to evaluate how
well agencies apply the SIP criteria that determine whether a project is
subject to review and, for projects that are, how well they apply the
requirements leading to approval or denial of a construction permit.
The task of making correct applicability determinations  depends upon the
existence of adequate regulations containing  the proper  definitions and
applicability criteria, plus the in-house expertise  to properly apply them
to each potential source.

     This part of the audit provided questions designed  to address directly
the adequacy of the applicability determination  process, including the various
definitions of "source" in use, the use of "potential  emissions" to determine
a project's status for possible review as major  new  and  modified sources,
and the consideration of fugitive emissions in emissions calculations.

     Another group of questions examined the  proper  application of §107 area
designations and how agencies deal with major sources that seek to locate
in areas where a construction ban has been imposed.

     Two questions were asked to determine the potential for sources to
improperly circumvent the applicability and/or depth of  an agency's permit
process.  Finally, several questions were asked  to  gain  a better understanding
of agency requirements for permitting minor source construction and the use
of registration systems to identify sources for which permits may not be
required.

Source Discovery System—

    1.  What are the mechanisms used by the agency to ensure that applicable
sources submit construction plans for formal  review and approval  prior to
beginning their proposed construction?

     Agencies typically responded by identifying more than one source
discovery mechanism.  Those most commonly identified, along with  the frequency
of such response, are as follows:

          64%  Inspection/surveillance

          55%  Other government agencies (Building Department, Department
               of Commerce, Planning Commission, etc.), and

          23%  Newspapers/trade association journals.

     Audit reports occasionally included comments about source discovery
problems.  Evidence was found in at least four audits that sources had been
constructed without first receiving a permit.  In one of these agencies, it
is common practice to issue such sources only operating permits.   The
question of Federal enforceability was raised with respect to those sources.
In a fifth program, the audit questioned whether all  appropriate  air permit
applications were being received by the air program office because the appli-
cations were first being screened by a central office with little or no air
expertise.  In yet another case, sources that received construction permits
were found to commence operation without first notifying the agency.

Review Applicability--
                                                              Yes    No    NA   NR

     2.  Does the agency apply the proper Federal defini-
tion(s) of "source" (i.e., those now in effect in Federal
NSR/PSD regulations)?

     a.  PSD                                                  87%    —   13%  —

     b.  NSR (Part D requirements)                            90%    2%    8%  --

     c.  NSPS                                                 97%    —    3%  —

     d.  NESHAP                                               86%    --   11%  3%

     Audit reports indicate that agencies apply the current Federal  definition
of source in most cases.  A problem was identified in one agency where the
audit indicated that, although the agency had adopted the correct definition
of source, that definition was not being implemented properly. No further
explanation of this reported misapplication was provided.  Two agencies have
approved definitions (one for PSD, one for NSR) whose overall stringency may be
considered equivalent to the Federal  definition,  even  though  certain  aspects
of the definitions are, in fact, less stringent.   No permits  were found,  in
either case, to suggest that any difference in  applicability  determinations
resulted from the use of those definitions of source.
     3.  Does the agency typically use the best available
emissions projections and Federally enforceable restrictions      Yes   No
in defining a new major source's (or unit's) "potential to
emit"?                                                            61%   39%
     Permits issued by at least 25 agencies  were  found  to have  conceptual
problems concerning the use of the potential  to emit  definition for new
major sources.  The PTE is a source's maximum capacity  to emit  a pollutant
under its physical  and operational design.   In order  for any physical  and
operational limitations to be considered part of  the  source's design (to
restrict the maximum capacity of the source), they must be Federally enforce-
able.  The major status of new sources must  be determined on the basis of
their potential  to  emit.
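
     The arithmetic can be illustrated with a short sketch in Python.  The
emissions rate, the hours restriction, and the 100-TPY major-source cutoff
below are illustrative assumptions, not audit data.

     # Illustrative sketch of the potential-to-emit (PTE) logic described
     # above; all numbers are assumptions.
     def potential_to_emit(max_lb_per_hr, hours_limit=None,
                           limit_federally_enforceable=False):
         """Return PTE in tons per year.  An hours restriction reduces PTE
         only when it is Federally enforceable; otherwise continuous,
         year-round operation (8,760 hours) is assumed."""
         hours = 8760
         if hours_limit and limit_federally_enforceable:
             hours = hours_limit
         return max_lb_per_hr * hours / 2000.0    # pounds -> tons

     # A 30 lb/hr unit limited to 4,000 hr/yr by a State-only condition
     # remains major at a 100-TPY cutoff (131.4 TPY); the same limit, made
     # Federally enforceable, would reduce PTE to 60 TPY.
     print(potential_to_emit(30.0, 4000, limit_federally_enforceable=False))
     print(potential_to_emit(30.0, 4000, limit_federally_enforceable=True))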

     In many cases, agencies did not establish permit conditions to identify
the limitations  restricting potential to emit.  In other cases, the conditions
were contained in operating permits which were not recognized as being
Federally enforceable.  Finally, four agencies did not  provide  sufficient
documentation in permit files for the EPA auditors to be able to determine
whether potential to emit was properly considered.
                                                                  Yes   No
     4.  Does the agency routinely use an existing source's
"potential to emit" to determine major source status for          81%   19%
proposed modifications?
     Audits of six agencies revealed applicability determinations where the
major source status of the existing source was  not considered.   In each
case, agency policy dictated that major modifications  were to be determined
only on the basis of the proposed change in emissions.  Some agencies
required that the emissions increase itself must be major (i.e., 100 TPY).
Others set lower limits such as 50 TPY.  In three audits, it was reported
that there was not sufficient information in the individual  permit files to
demonstrate whether a source's potential  to emit was adequately considered.
In at least two cases, emissions reduction credits were used to avoid  a
significant net emissions increase without making the  emissions reductions
Federally enforceable.  In all, at least 12 agencies are believed to apply
improperly EPA's current concept of potential to emit for major modifications.
     5.  Does the agency use as its netting baseline        Yes   No   NA   NR
actual TPY emissions?                                       73%   11%   6%  10%
     Seven agencies reportedly base the determination of a "net emissions
increase" on allowable emissions.  In another agency, a single permit  was
questioned because there was insufficient documentation provided to determine
whether emissions estimates were representative of actual emissions for a
normal, 2-year period.  Current EPA rules require that the netting baseline
use actual emissions.
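
     A worked example may make the distinction concrete; the figures in this
Python sketch are hypothetical.

     # Hypothetical illustration of the netting baseline: actual emissions,
     # averaged over a representative 2-year period, not allowable emissions.
     def netting_baseline(actual_tpy_two_years):
         """Average actual TPY emissions over a normal 2-year period."""
         assert len(actual_tpy_two_years) == 2
         return sum(actual_tpy_two_years) / 2.0

     allowable = 95.0                            # permitted allowable, TPY
     actual = netting_baseline([42.0, 48.0])     # 45.0 TPY
     proposed = 80.0                             # emissions after the change

     # An allowable-emissions baseline would show no increase (80 - 95 < 0);
     # the required actual-emissions baseline shows a 35-TPY increase.
     print(proposed - allowable, proposed - actual)    # -15.0 35.0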

                                                             Yes   No   NA   NR

     6.  Does the agency check applications for proper use   77%   6%   14%  3%
of contemporaneous emissions changes to prevent the "double
counting" of emissions decreases for netting purposes?

     Four agencies were found to have inadequate systems to track emissions
changes and, therefore, were likely to experience difficulty ensuring that
"double counting" of some emissions decreases did not occur.  However, no
specific permit was identified for which double counting had actually
occurred.  Nine agencies answered "not applicable," presumably because the
use of netting is not allowed in those agencies as a means of gaining exclusion
from the permit process.

     It is important also to note that double counting of emissions reductions
may occur inadvertently where agencies do not adequately record the emissions
reductions (used for netting out of review) as permit conditions.  This
problem was found to occur in at least two cases (see question 2.4).
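
     One safeguard is a ledger that marks each contemporaneous decrease when
it is first credited, as in the hypothetical Python sketch below; the scheme
is illustrative and does not describe any audited system.

     # Hypothetical ledger preventing "double counting": a contemporaneous
     # emissions decrease may be credited toward netting only once.
     class NettingLedger:
         def __init__(self):
             self._decreases = {}    # change_id -> [tons, already_credited]

         def record_decrease(self, change_id, tons):
             self._decreases[change_id] = [tons, False]

         def credit(self, change_id):
             tons, used = self._decreases[change_id]
             if used:
                 raise ValueError(f"decrease {change_id} already credited")
             self._decreases[change_id][1] = True
             return tons

     ledger = NettingLedger()
     ledger.record_decrease("boiler-3 shutdown", 60.0)
     ledger.credit("boiler-3 shutdown")     # first netting use: allowed
     # A second credit("boiler-3 shutdown") would raise an error.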

                                                                      YES  NO  NA
     7.  Does the agency adequately address fugitive emissions in
calculating "PTE" and "net emissions increase"?                       86%  11%  3%

     The consideration of fugitive emissions presented a potential problem
in at least seven agencies.  Responses were not always specific other than
to indicate that a problem was found.  In one case, fugitive emissions were
not addressed for a specific source category.  Another indicated that
netting calculations sometimes failed to consider fugitives.

                                                             Yes   No   NA   NR

     8.  Does the agency properly apply the §107 area        94%   2%   2%   2%
designations when determining what type of preconstruction
review will be required of major construction?

     Only one agency was determined to have problems with regard to geographic
issues.  That agency was identified for its lack of adequate consideration
of major sources locating in nonattainment areas.  One local agency answered
"not applicable," presumably because all  major source reviews are conducted
by the State agency.

                                                              Yes   No    NR

     9.  Where a construction ban is in effect, would the     73%   12%   15%
agency deny a permit to a major source construction project
in the ban area?

     Twenty-six agencies indicated that construction bans were in effect within
their jurisdiction.  Of these, three agencies indicated that they would not
deny a permit to a major source in the ban area.   Two of the agencies noted
that the permits issued would contain the condition that construction could
not commence until the ban is lifted.  In the third case, the agency indicated
that it had no authority to deny a permit solely  on the basis of a "Federal"
ban.  However, the respondent added that EPA would be notified that a State
permit had in fact been issued in a ban area.

     Four other agencies with bans in effect did  not answer the question other
than to provide comment.  Two indicated that no affected applications had
been received, while one indicated simply that it would consult with EPA.
The fourth respondent provided a detailed comment suggesting that in certain
instances a permit might be issued to a major source in the ban area.

    10.  Does the agency issue permits for "minor" construction?

          a.  62%   All minor sources.

          b.  38%   Certain minor sources as described by agency rule or
                    regulation.

          c.   0%   Only major sources (100 TPY).

     While all audited agencies indicated that permits were issued to minor
sources, differences existed as to what minor sources were actually regulated
and the degree of review that was performed.  Forty agencies reported that
all minor sources greater than a specified cutoff size receive permits.  A
number of different cutoff sizes were identified; values of two TPY or less
were mentioned most often.  The remaining 24 agencies indicated that only
certain minor sources, as described by agency rule or regulation, are
required to receive permits.
                                                                 Yes     No

     11.  For major sources exempted from both PSD and           100%
nonattainment NSR requirements, does the agency continue to
apply other permit requirements?

     All agencies require that all major sources get some type of permit.
Major sources are subjected, as a minimum, to the review applied to minor
sources, i.e., compliance with emissions regulations and case-by-case
ambient air quality screening.  One agency responded that no major sources
are exempted from PSD review.

                                                       Yes      No        NA

     12.  Are all exempted sources subject to some     42%      53%       5%
type of registration system?

     Twenty-seven of the audited agencies require some type of registration
system.  Of those who do not (33), seven stated that very few sources were
exempt, therefore making such a system impractical.  One agency commented
that there were too many exempted sources, thus making a system impractical.
Five agencies stated that some exempt sources were further exempted from a
registration system pursuant to various State rules.

E.  BACT/LAER DETERMINATIONS

     The audit examined several aspects of the BACT/LAER control technology
requirements that are generally applicable to PSD sources and major new and
modified sources in nonattainment areas.  However, more weight was given
to the BACT analysis for this year's audit.  With respect to BACT, emphasis
was put on whether agencies were requiring a BACT analysis for each regulated
pollutant emitted in significant amounts.  Prescribed significance thresholds
applicable to each pollutant are defined by the PSD regulations.
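
     In effect, the applicability test is a threshold comparison for each
regulated pollutant, as in the Python sketch below.  The threshold values
shown are illustrative stand-ins; the governing figures are those prescribed
in the PSD regulations.

     # Sketch of the pollutant-applicability test described above; the
     # significance thresholds shown are illustrative values only.
     SIGNIFICANCE_TPY = {"CO": 100, "NOx": 40, "SO2": 40, "PM": 25, "VOC": 40}

     def pollutants_requiring_bact(emissions_tpy):
         """Return the pollutants whose emissions are significant."""
         return [p for p, tons in emissions_tpy.items()
                 if tons >= SIGNIFICANCE_TPY.get(p, float("inf"))]

     print(pollutants_requiring_bact({"SO2": 55.0, "NOx": 12.0, "CO": 150.0}))
     # ['SO2', 'CO'] -- NOx falls below its (illustrative) threshold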

     In order to get a better idea of how thoroughly the BACT analysis is
being carried out, additional questions were asked to determine whether the
analysis routinely considered more than one possible control technology,
and whether the agency routinely performed an independent evaluation of the
analysis submitted by the applicant.

     The audit also sought to determine the ability of the BACT/LAER
requirements to function as technology-forcing requirements.  This was done
simply by asking what the tendency was for existing BACT/LAER determinations
to conform exactly to NSPS or NESHAP--the minimum control requirements
legally allowed for BACT and LAER.

     Finally, the audit involved two questions designed to assist EPA in
learning about the participation of State and local agencies in using the
BACT/LAER Clearinghouse and the usefulness of this clearinghouse's information
for setting BACT/LAER.

Pollutant Applicability--
                                                         Yes    No   NA
     1.  Does the BACT analysis consider each
regulated pollutant emitted in significant amounts?      80%    5%   15%

     Only one agency was found to not apply the BACT requirement to all
regulated pollutants when they are emitted in significant amounts.  In
this case, it was reported that the agency did not recognize all PSD applicable
pollutants.  Consequently, no BACT analysis was required for those pollutants.

     While no other agencies were identified as failing to consider each
regulated pollutant, in two agencies, specific PSD permits were found that
did not contain BACT analyses for any pollutants.  No explanation was offered
for this omission, other than that this was not believed to be symptomatic
of all PSD permits issued by these agencies.

     Two agencies reportedly also required BACT for pollutants other than
those regulated by EPA.  One agency identified chromium as a regulated
pollutant for which BACT applied; the other identified hydrochloric acid.

Control Strategy Alternatives for BACT—
                                                         Yes    No   NA
     2.  Does the review agency require the
consideration of more than one control  alternative?      71%    14%  15%

     Nine agencies reportedly do not  require  the  consideration  of  other
control  alternatives as part of the BACT analysis,  according  to responses
recorded on the audit questionnaires.   In further examining the audit
reports, however, it appears that three of these  agencies  may only require
other control  alternatives  to be considered on  a  case-by-case basis.   That
is, if a control  technique  is "obvious" or common for a  particular source
category, then no control alternatives  are formally required  to be examined.
On the other hand, applicants proposing nontraditional control techniques
or unusual source types would likely be required to consider several control
alternatives as part of the BACT analysis.

     Three other agencies that indicated they did not require
consideration  of different  control  alternatives also commented  that their
BACT procedures included a  preapplication meeting with the applicant to
discuss and determine BACT  requirements.  They  claimed that this effectively
eliminated the need for considering more than the control  technology in the
application itself.

     3.  To what extent are economic, energy, and nonair environmental
impacts considered in the applicant's BACT analysis?

     Thirteen  out of 62 agencies failed to respond  to this question.   It
could be inferred, perhaps, that those  agencies do  not give serious attention
to these analyses, possibly because of  poor understanding  of  the requirements.
Six agencies relied on experience,  the  clearinghouse, or the  information
submitted by the applicant  in order to  make a determination.  Of the agencies
considering all the criteria, eight weighed economic factors  most  heavily,
and three did not consider nonair environmental factors in their determinations.
Six agencies stated that the question was either  not applicable, or that the
answer could not be determined.

                                                           Yes   No   NA
     4.  Does  the agency perform an independent
evaluation of  applicant's BACT analysis?                   81%   4%   15%

     Only two  agencies indicated that they did  not  perform an independent
evaluation of  the applicant's BACT analysis.  These agencies  merely reviewed
the applicant's analysis.   In both cases, the agencies indicated in a
preceding question that they routinely  required the consideration  of control
alternatives as part of the applicant's BACT analysis.  The audit results
did not usually provide sufficient information to determine whether the other
agencies, which responded affirmatively to the question, actually went
beyond a review of the applicant's analysis, or to what extent.

     5.  What  tendency is there for the reviewing agency's determinations
to conform exactly to minimum NSPS/NESHAP requirements?

     Responses to this question were somewhat imprecise  (e.g.,  "yes,"  "strong,"
"moderate," "somewhat lower") but the  responses appear to  indicate that BACT
tends to be set at or close to NSPS in  at least 17  agencies.  In some  cases,
however, the tendency may apply only to a particular pollutant. In two
agencies, for example, the  tendency appeared  to be  limited to particulate
matter (opacity determinations); in one case, SO2; and in another agency,
VOC.  Often it was not clear as to the number of permits the "tendency" was
based on, but in at least one case only one PSD permit had been issued by
the agency.  It is believed that further evaluation of this issue in subse-
quent audits will  provide a better indication of the use of BACT as a
technology-forcing mechanism.

     With respect to LAER, the tendency to conform with NSPS/NESHAP appeared
in only five agencies.  In two agencies, auditors referred to the tendency
to conform to NSPS as being "strong."  In the others, the tendency was less
significant.

                                                           Yes   No   NA

     6.  Does the agency make use of the BACT/LAER         88%   6%   6%
Clearinghouse?

     Fifty-six agencies indicated that they used the BACT/LAER Clearinghouse
to some extent.  Only four agencies indicated they did not.  Of those four
agencies, two stated that there had not yet been an opportunity to use the
clearinghouse.

                                                           Yes   No   NR

     7.  Has the BACT/LAER Clearinghouse been found to     78%   13%  9%
provide useful information?

     With respect to the usefulness of the clearinghouse, 50 agencies said
that the clearinghouse was useful, while eight responded that it was not.
In addition, some of the affirmative respondents indicated some problems
with the clearinghouse.  Comments in general  ranged from "very helpful" to
"too lenient."

F.  AMBIENT MONITORING

     This portion of the audit examined the PSD requirement which provides
in general that PSD sources must gather air quality data as part of their
application for a construction permit.  The PSD regulations contain, for
each pollutant, specific criteria that are to be used to determine when a
source does not need to report ambient air quality data as part of a complete
PSD application.  For those pollutants for which ambient data must ultimately
be reported, EPA guidelines set forth procedures whereby a source may be
relieved of the responsibility to conduct monitoring if existing ambient
data is available and it is "representative"  of the air quality in the area
where the proposed source would locate.

     The audit asked two narrative questions  concerning the criteria an
agency follows in implementing the PSD monitoring requirements, including
the use of representative existing data.  Additional questions were asked
in order to determine whether agencies (a) formally record their justification
for excluding a source from gathering monitoring data, and (b) routinely
ensure that EPA-prescribed quality assurance  procedures are followed when
monitoring is required.

Data Submittal Criteria--

     1.  Under what circumstances is a source required to submit preconstruction
ambient monitoring data?

     The ambiguity of this question resulted in a large number of inappropriate
responses.  As explained in the audit guideline, every PSD source with
"significant" emissions of a particular pollutant, where both  the existing
air quality and the ambient impact of the proposed source or modification
would also be "significant," must submit ambient monitoring data.  The
question was designed to determine whether each agency adhered to the tests
for significant emissions and ambient impact in requiring preconstruction
monitoring.  Only 12 agencies provided responses that indicated an understanding
of what the question had intended.  This question will  be addressed more
clearly in the planned FY 1985 audit.
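
     The intended test can be restated as a simple conjunction, as sketched
below in Python.  The 13 ug/m3 de minimis monitoring concentration used in
the example is an assumed value; the actual levels are those in the PSD
regulations.

     # Sketch of the two-part test paraphrased above: monitoring data are
     # required when emissions are significant and both the existing air
     # quality and the proposed impact exceed the pollutant's de minimis
     # monitoring concentration.  All values shown are assumptions.
     def monitoring_required(emissions_significant, impact_ugm3,
                             existing_ugm3, de_minimis_ugm3):
         return (emissions_significant
                 and impact_ugm3 >= de_minimis_ugm3
                 and existing_ugm3 >= de_minimis_ugm3)

     # Significant emissions, but existing air quality below the assumed
     # de minimis level: the source would be exempt from monitoring.
     print(monitoring_required(True, impact_ugm3=20.0,
                               existing_ugm3=9.0, de_minimis_ugm3=13.0))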

                                                                  Yes    No   NA

     2.  For sources not required to conduct monitoring,
is the basis for the exemption provided in the preliminary        72%    6%   22%
determination?

     Three agencies indicated that the basis for the exemption is not
provided in the preliminary determination, and a fourth agency was found to
provide no explanation at all in at least two permit files where monitoring
was not required of the applicant.  Fourteen respondents indicated that  the
question was "not applicable," either because they did not have PSD authority
or because they had never issued a PSD permit.  It should be noted that
there is no specific requirement that the basis for the monitoring exemption
be included in the preliminary determination.  However, it may be desirable
to do so because all determinations pertaining to the PSD monitoring require-
ments should be documented, and the preliminary determination affords the
public an opportunity to know how the requirement was addressed.

Representative Data vs. Source Monitoring—

     3.  Under what circumstances may a source submit representative existing
data, rather than conduct new monitoring?

     Agency responses were typically brief and imprecise, thus preventing  a
fair assessment of the agencies' performance in this area.  Responses such
as "EPA guidelines" and "representative data" were provided most often.  Two
audits did, however, identify agencies that do not consistently follow EPA's
guidelines for accepting representative data.  In two additional audits, a
single PSD permit in each was questioned because of inadequate justification
for using representative data.  Next year's audit will  be designed to examine
more carefully the use of representative data.

Quality Assurance of PSD Monitoring Data--
                                                               Yes    No    NA
     4.  Do the monitoring data gathered by the source
adhere to PSD quality assurance requirements?                  78%    2%    20%

     The audits identified only a single problem concerning the quality
assurance of PSD data.  This occurred in one PSD case where it was revealed
that the source was not entirely meeting the quality assurance requirements
under 40 CFR 58, Appendix B.

     Thirteen respondents answered "not applicable"  either because they did
not have PSD authority or because they had no experience with PSD permits.

G.  AMBIENT IMPACT ANALYSIS

     The impact that a new source or modification will  have on air quality
is one of the most important determinations made during the new source  review
process.  The audit examined three areas of concern  pertaining to the way
agencies assess the ambient impact of new and modified sources.  Primary
concern tended to focus on major source activities,  but some minor source
considerations were addressed as well.

     The first area of concern examined was PSD increment consumption.
Increments are consumed by PSD sources commencing construction after
January 6, 1975, but are only consumed by other emissions increases
occurring after the date of the first PSD application in a particular area.
Agencies were evaluated to determine whether:  PSD increment consumption
was adequately considered, efforts to periodically assess increment consump-
tion were planned, impacts on both long- and short-term increments were
adequately considered, and impacts of sources on Class I area increments
were adequately assessed.

     The second area of concern involved protection  of the NAAQS.  Questions
in this section examined the methods used to perform the NAAQS analysis,
the extent to which minor source impact is considered,  and the adequacy of
the agencies'  efforts to protect against ambient "hot spots."

     Finally, two general questions concerning the use of air quality dispersion
models were asked.  These questions were designed to evaluate the agency's
choice and use of models for the situation at hand,  and to determine whether
agencies routinely evaluate the modeling analyses submitted by the permit
applicants.

PSD Increment Consumption--
                                                         Yes    No   NA

     1.  Does the agency adequately consider the         50%    17%  33%
baseline concentration and emissions changes that
affect increment consumption?

     The reported data suggest that most agencies perform PSD increment analyses
adequately.  However, problems were found in three agencies concerning  their
overall ability to account for emissions changes which consume PSD increment.
Also, the EPA auditors commented in eight programs that there was a need to
improve the agencies' consideration of minor source  emissions, even though
the agency itself believed the evaluation to be adequate.  Twenty-one agencies
responded "not applicable" because they did not have PSD authority or had
no experience with PSD sources.

     It should also be noted that two agencies  were considered  exemplary
in their efforts to track minor source emissions  to protect  PSD increments.

                                                Yes    No    NA    NR

     2. Does the agency perform or have
plans to perform periodic assessments of        49%    27%    21%   3%
PSD increment consumed?

     A significant number of respondents  indicated  that either  plans or
actual  efforts were underway to periodically  assess PSD increments.  Two
agencies have actually performed assessments  as part of their efforts  to
track minor source emissions as mentioned above.  Most  respondents  answering
affirmatively, however, did not identify  a specific plan or  schedule for
periodically assessing increments.  Several agencies responding with a
negative answer indicated minimal  growth  occurring  in their  jurisdictions
and could find no basis for periodic  assessments, other than those  performed
by each PSD applicant.  The smaller number of "not  applicable"  responses
was due to the fact that some agencies without  PSD  authority or without PSD
permit experience indicated that they still "planned" to assess increment
periodically.

     Four agencies (two provided no response) specifically  requested EPA
guidance as to how an increment assessment should be performed.

                                                       Yes        No     NA

     3.  Are long- and short-term PSD increments
being given adequate consideration as part of the      76%        —      24%
ambient impact analysis?

     Applicable agencies overwhelmingly responded that  both  the long-term
and short-term increments were adequately considered.  EPA  auditors did not
identify any examples of problems in  the  limited  number of  PSD  permits
examined.  Some respondents did, however, indicate that evaluations of
short-term impacts were difficult to make.

                                                       Yes       No     NA
     4.  Does the agency make adequate assessment
of the ambient impact from new sources and modifi-      71%       2%     27%
cations on the Class I area increments?

     All but one of the applicable agencies claim adherence  to  EPA  criteria
when considering  the impact of a PSD source  on a Class I area.  These
criteria include (1) treating a pollutant concentration which exceeds
1 ug/m3 (24-hr average) as a significant  ambient  impact if  the  source  locates
within 10 km of a Class I area, and (2) screening major sources locating
within 100 km of a Class I area.  Only one agency was found  that did not
follow the significant impact criteria for Class  I  areas.  However, no
specific permits involving misuse of  this criterion were identified.
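
     Expressed procedurally, the two criteria work as in the Python sketch
below; the distances and the 1 ug/m3 figure are those stated above, while
the decision labels are illustrative.

     # Sketch of the Class I screening criteria described above.
     def class_i_screening(distance_km, impact_24hr_ugm3, is_major):
         if distance_km <= 10 and impact_24hr_ugm3 > 1.0:
             return "significant ambient impact on the Class I area"
         if distance_km <= 100 and is_major:
             return "screen the source for Class I increment impact"
         return "no Class I screening criterion triggered"

     print(class_i_screening(8.0, 1.4, is_major=True))
     print(class_i_screening(60.0, 0.5, is_major=True))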

NAAQS Protection--

     5.  What emissions baseline does the agency require to be used to
evaluate the impact on the NAAQS of new and modified sources?

     a.   2%  Allowable emissions modeled exclusively

     b.  47%  Modeled allowable emissions from the proposed source, with
monitored background air quality added

     c.  45%  Other

     d.   6%  Not Applicable

     Almost half of the audited agencies chose "b" to describe the emissions
baseline that they use for NAAQS analyses.  Many agencies chose a combination
of the answers in order to describe more closely the procedure that they use.
When combinations were chosen, it was assumed that "c" was the appropriate
answer.  When "c" was chosen or interpreted to be the appropriate response,
it typically meant that modeling was used to estimate not only the proposed
source but existing sources as well.  Then, monitored background data were
added to the modeled results.  Concerning the modeling of existing sources,
the respondents were fairly evenly divided between using a source's allowable
emissions or some form of an actual emissions estimate (e.g., "annual
average," "worst case actual," "actual operating conditions").

     Two agencies identified "a" as their response; one was without comment.
Comments from the other, however, suggested that "c" was the proper response
because monitored background was added to modeled allowable emissions from
the proposed source and existing sources.  Four agencies answered "not
applicable" because ambient impact analyses were typically handled by
another agency.
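
     The "c"-type procedure reduces to a simple sum, sketched below in
Python with hypothetical concentrations and a hypothetical standard.

     # Illustrative sketch of a "c"-type NAAQS analysis: modeled impacts of
     # the proposed and existing sources are added to monitored background
     # and compared to the standard.  All values are hypothetical.
     def naaqs_check(proposed_ugm3, existing_ugm3, background_ugm3, naaqs_ugm3):
         total = proposed_ugm3 + existing_ugm3 + background_ugm3
         return total, total <= naaqs_ugm3

     total, protected = naaqs_check(proposed_ugm3=40.0, existing_ugm3=110.0,
                                    background_ugm3=60.0, naaqs_ugm3=365.0)
     print(total, protected)    # 210.0 True -- the NAAQS is protected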

                                                             Yes      No
     6.  Does the agency routinely evaluate the impact of
minor source construction?                                   44%      56%

     Although a larger number of agencies do not routinely evaluate minor
source impact than those who do, many of the negative respondents indicated
that minor sources are reviewed for ambient effects on a case-by-case basis
when they are suspected of causing a problem.  However, in at least five
agencies, it was suggested that minor sources rarely undergo air quality
review.

     Three agencies were noted for their exemplary programs for minor source
review to protect the NAAQS.

                                               Yes     No     NA       NR

     7.  Does the agency's ambient impact
analysis provide adequate protection against   73%     15%    6%       6%
the development of "hot spots?"


     Preliminary indications suggest that 45 agencies provide adequate
protection against ambient "hot spots," although two of these agencies were
advised to alter or increase their receptor sites to assure that high
concentrations are being adequately modeled.  Five agencies were found to
have deficiencies based generally on a failure to consider nearby sources
in addition to the proposed source being modeled.

     Of the four that did not respond to the question,  three indicated that
they did not understand what a "hot spot" was, despite  the explanations
contained in the FY 1984 NAAS guideline.  It therefore  appears  that these
respondents either did not receive the guideline or failed to adequately
review it.  This problem appears to have occurred in a  significant number of
instances throughout the audit.

Dispersion Models—
                                                       Yes      No

     8.  Does the agency use adequate models to
carry out the ambient impact analysis?                 61%     39%

     This question is particularly difficult to assess  because  of the
variety of narrative responses contained in the audit reports.   The results
suggest that most agencies use, or require the use of,  EPA reference models
in most modeling experiences.  However,  the audits indicate problems that
go beyond simply the use of adequate models.  Consequently, the various
problems identified are factored into the summary of responses  above.

     In addition to several cases where  an inappropriate model  was applied,
other potential problems were identified.  These include inadequate documen-
tation to determine whether the proper procedures are followed, failure to
demonstrate equivalency of models being  used, failure to routinely require
a modeling analysis when deemed necessary, inability to run refined models
as needed, inadequate consideration of receptor sites,  and failure to
identify worst case meteorological conditions.

                                                       Yes      No

     9.  Does the agency perform an independent,
internal review of the modeling analysis contained     60%      40%
in the permit analysis?

     The responses to this question suggest that while  most agencies use
the appropriate models, there is a significant portion  that does not
adequately review the modeling analysis  submitted by the applicant.  Four
general types of problems have been identified in this  area: seven
agencies do not adequately review the analyses; six agencies may review the
applicants' analyses but either cannot or do not replicate the  results as
needed; eleven agencies either cannot or do not review the analyses as
needed; and, in one agency, the results  of the ambient  impact analyses are
not always adequately considered in the  final permit restrictions.

H.  EMISSIONS OFFSET REQUIREMENTS

     Emissions offset requirements apply to all major construction that
occurs in nonattainment areas.  Agencies that implement the new source
review program in nonattainment areas are expected to adhere to criteria
consistent with the requirements under Part D of the Clean Air Act.  The
objectives of this portion of the audit are (1) to assure that agencies are
requiring, where appropriate, adequate emissions offsets as a prerequisite
to major construction in designated nonattainment areas; and (2)  to assure
that offsets are being obtained in a manner consistent with approved
demonstrations of NAAQS attainment and interim reasonable further progress
(RFP).
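
     Taken together, these objectives imply a test of the kind sketched
below in Python; the function is an illustrative simplification of the
Part D offset criteria, not a statement of them.

     # Hypothetical simplification of the offset tests discussed in this
     # section: offsets must at least equal the proposed increase, must be
     # Federally enforceable, and must not already be relied on to show
     # RFP or attainment.
     def offsets_acceptable(increase_tpy, offset_tpy,
                            federally_enforceable, needed_for_rfp):
         return (offset_tpy >= increase_tpy
                 and federally_enforceable
                 and not needed_for_rfp)

     print(offsets_acceptable(120.0, 130.0,
                              federally_enforceable=True,
                              needed_for_rfp=False))     # True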

     Two additional questions concerning the performance of emissions
offsets were asked so as to gain information on an issue that is  not
clearly covered under the Clean Air Act requirements or EPA regulations.

Enforceability—
                                                   Yes    No    NA    NR

     1.  Does the agency require that all offsets
be Federally enforceable?                           70%    6%    19%   5%

     Forty-five agencies indicated that they do require Federally enforceable
offsets, while two agencies responded only that their offsets must be
"enforceable."  There is some question as to what this means, however, because
both agencies are known to have approved SIP's that require Federally
enforceable emissions offsets.  On the other hand, two agencies responding
affirmatively to the question were found to have emissions banking systems
containing emissions credits that are not Federally enforceable.

     Some of the answers suggest that the question was not clearly understood
by some agencies.  In at least one case, an agency with no NSR responsibility
answered "yes," thinking that the question pertained to PSD netting situations.
Several others who responded "not applicable" did have NSR programs but
have not issued any offset permits.  One agency answered "not applicable"
because it has no nonattainment areas, but the agency is in fact known
to have a nonattainment area.

Consistency with RFP--
                                                     Yes     No     NA     NR

     2.  Does the agency routinely ensure that
emissions offsets are not otherwise needed to        62%     5%     30%    3%
show RFP or attainment?

     Four respondents answered "no," but one response was changed to "not
applicable" because the respondent clearly misunderstood the intent of the
question.  A greater number of respondents answered "not applicable" to
this question than the preceding one for a variety of reasons, including
some which indicated that no offset permits had been issued.  In  such
situations, it would have been preferred that those respondents indicate
their policy regarding this issue.  As a result, EPA does not feel  that
adequate feedback was given on this question.


     3.  Does the agency's offset             Yes    No    NA    NR
requirement cover other emissions
increases since the last offset               50%    14%   31%   5%
review?
     Thirty-two of the audited programs address area and minor source growth
either through emissions offsets, required before major source construction
is approved, or through an accommodative SIP.  A high percentage answered
"not applicable," giving reasons such as:   area subject to ban, no offset
regulations, no nonattainment areas, and no offset permits issued.  Two
agencies that claimed no nonattainment areas are known to have them.

Timing of Offsets--

     4.  Does the agency require that         Yes    No    NA    NR
offsets occur on or before the time of        72%    1%    23%   4%
new source operation?
     According to the responses, all  emissions offsets must occur before
the new sources commence operation.  However, one audit revealed problems
with an agency's regulations with respect to the timing of offsets.  No
further comments were provided to explain what the specific problem was.
Four affirmative responses were changed to "not applicable" because of the
existence of construction bans within their jurisdictions.
     5.  Is the effective date of the         Yes    No    NA    NR
offset documented in the permit file?         58%    3%    31%   8%
     Two agencies answered "no."  One indicated that the date of the offsets
could not always be documented in the permit file because the offsets were
to occur in the future.  The other did not document the date in the permit
file but documentation was provided in a separate file.  Details of the
procedure were not provided.  The number of agencies answering "not applicable"
increased over the previous question mainly because some agencies indicated
that no offset permits had been issued.  Some agencies answered "yes" even
though they had issued no offset permits.  These answers were left unchanged
because they were taken as an expression of the agencies' intent in the
absence of actual experience.
     6.  Does the agency allow offsets        Yes    No    NA    NR
resulting from early source shutdowns         63%    9%    19%   9%
or production curtailments?
     Thirty-nine responded "yes" to allowing offsets that result from early
source shutdowns or production curtailments; one agency allows offsets only
for production curtailments.  Only nine agencies indicated that no such
offset credits were allowed.  In general, shutdowns and production curtailments
that occur before the date of the new source application should not be allowed.
The findings suggest that there may be a problem which should be further
explored.
Offset credit is allowed, however, in accordance with specific criteria in
cases where the new source (unit) is a replacement.  This is the focus of the
next audit question.

                                             Yes     No     NA      NR
     7.  Are offsets granted in
accordance with EPA shutdown criteria?       55%     9%    27%     9%

     Of the 39 agencies allowing offsets resulting from early source shutdowns
or production curtailments, 88 percent report that they follow EPA's shutdown
criteria.  However, 15 percent of these agencies report that they do not adhere
to EPA shutdown criteria.  No reasons or justifications for this discrepancy
were discernible from the audit reports.  However, these responses do
indicate a need for further investigation in this area.  Most agencies
responding "not applicable" related lack of permitting experience or absence
of offset regulations as the basis for their response.

Permanence of Offsets—

     8.  Under what conditions would emissions offsets not be considered
permanent?

           39% - Under no conditions
            5% - If nonattainment area becomes attainment
           30% - Other
           17% - Not applicable
            9% - No response/No policy

     9.  What type of permit review would be required pursuant to a SIP
relaxation involving emissions offsets?

          9% - Not allowed
         53% - Other (Some type of review generally stated)
         17% - No response/No policy
         21% - Not applicable

I.  PERMIT SPECIFICITY AND CLARITY

     The final phase of the new source review audit focused on the policies
and practices involved with the actual issuance of permits by State and
local agencies.  The objectives of this part are to determine whether the
permits being issued have conditions which are clear, concise, and enforceable,
and to determine the extent to which special conditions are being used.

Source Identification--
                                                           Yes   No    NA

     1.  Does the agency identify all emissions units and
their allowable emissions on the final permit(s)?          74%   23%   3%

     This question was intended to help EPA determine whether agencies made
an effort to identify an allowable emissions limit for each affected emissions
unit, rather than "bubbling" the source emissions for a particular pollutant
under a single or composite emissions limitation.   This  latter practice  is
of questionable enforceability.  Unfortunately,  the question  was  ambiguously
worded, and some respondents based their answer  on whether the emissions
limitations were actually contained in the final  permit, which is one  of the
concerns of audit question 3 below.  In fact,  sometimes  opposite  answers
were accompanied by the exact same explanation.

     In an attempt to properly interpret the responses,  nine  were revised
in order to place the emphasis on the agencies'  intentions to specify
emissions limitations for individual  emissions units.  That is, if it
appeared that an agency did identify  all of the  affected emissions units
and their corresponding allowable emissions, then  no attempt  was  made  to
assess whether the information was identified  in  the permit,  incorporated
by reference from the regulations, or contained  in the permit application
itself.  Based on the revision of some responses,  at least 14 agencies are
believed not to routinely identify all  emissions  units and their  allowable
emissions.

Permit Conditions--
                                                          Yes    No     NA
     2.  Does the agency have an established format
detailing the compilation of special  terms and             77%   18%     5%
conditions?

     Forty-eight agencies indicated the use of some type of format for
compiling special terms and conditions.  It is assumed that such terms and
conditions are typically added to the standard permit conditions as appro-
priate for each permitted source.  Where comments were provided, it was
clear that the question was being answered in two different ways:  some
agencies identified methods used to incorporate conditions within the permit,
e.g., a permit issuance letter, a permit process memo, or an appendix to the
operating permit.  Others identified formats used to maintain a list of
conditions in-house, from which appropriate conditions could be selected for
different permits.  One effective technique being used successfully by at
least two agencies is to store the conditions on a word processor; the permit
engineer then simply identifies the appropriate conditions for a particular
permit rather than having to write them out in their entirety.
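
     The stored-conditions technique can be sketched as follows in Python;
the library entries and keys are invented for illustration (the agencies
cited used word processors rather than a program).

     # Sketch of a stored library of standard special conditions from which
     # a permit writer selects by key; entries are hypothetical.
     CONDITION_LIBRARY = {
         "hours": "Operation shall not exceed {hours} hours per calendar year.",
         "fuel":  "Only {fuel} shall be fired in the affected unit.",
         "test":  "A performance test shall be conducted within 180 days of startup.",
     }

     def compile_conditions(selection, **params):
         """Assemble permit conditions from the library by key."""
         return [CONDITION_LIBRARY[key].format(**params) for key in selection]

     for line in compile_conditions(["hours", "fuel"],
                                    hours=4000, fuel="natural gas"):
         print(line)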

     3.  Are the allowable emissions rates stated or
referenced adequately in the permit conditions?                Yes   No   NA

     a.  Clear and precise averaging periods                   88%   9%   3%

     b.  Emissions rates consistent with acceptable
measurement procedures                                         89%   8%   3%

     c.  Design, equipment, work practice, or
operational standards used where appropriate                   91%   6%   3%

     Where stated or referenced in the permit, allowable emissions rates
are generally expressed in an adequate manner  in  terms of clarity and
enforceability.  Seven agencies were found to have problems with emissions
rates that lacked specific averaging periods, and five agencies do not
always establish emissions rates that are consistent with acceptable compliance
measurement methods.

     A number of agencies did not always identify the allowable emissions
rate in the permit.  In at least eleven agencies, there was a tendency to
omit emissions rates for certain pollutants.  In one agency, for example,
the audit found that pollutants other than TSP and SO2 were generally not
limited in the permit.

                                                              Yes    No    NA
     4.  Are the compliance test methods stated or
referenced in the permit terms and conditions?                52%   45%    3%

     The audit results indicate that there is no strong tendency to state
or reference compliance test methods on the permit.  Twenty-eight agencies
do not routinely do so.  In at least 12 cases, where the respondent answered
that compliance test methods were stated or referenced on the permit, EPA
auditors found examples where this was not accomplished.

     The implications of this finding are not clear.  Agencies that answered
negatively often indicated that their regulations included requirements
for compliance testing to be done.  In one case, for example, the respondent
referred to a 30-day intent to test requirement, indicating that this was
"adequate to address any questions concerning test methods and procedures."
Other responses indicated that, while not done routinely, compliance test
methods are stated in the permit as needed—usually only for major sources.

                                                              Yes    No    NA
     5.  If a source's calculated potential to emit is
based on less than full design capacity and continuous,
year-round operation, are all limiting restrictions           63%    34%   3%
clearly identified in the permit?

     Based only on the answers provided in the questionnaires, 88% responded
affirmatively to this question.  However, numerous answers had to be reinter-
preted as a result of the EPA audit findings that revealed numerous problems
with actual permits issued.  Permits issued in at least 22 agencies do not
consistently include permit conditions identifying operational and design
limitations that are necessary to restrict a source's potential  to emit.
The frequency of occurrence varied from one agency to another, but oftentimes
the sources involved had been able to avoid major source review because of
their lower potential to emit.  Without the appropriate permit conditions
ensuring that such lower potential is adhered to, these sources should have
undergone major source review.  In two agencies, such permits were issued
in areas subject to a construction moratorium.  More often, however, PSD
review was affected.
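
     To illustrate the arithmetic behind this finding (a hedged sketch; the
emission rate, hours of operation, and 100 ton-per-year major source threshold
below are hypothetical, not audit data), a source's potential to emit is
computed from its maximum capacity, and only an enforceable permit restriction
justifies assuming fewer than the full 8,760 hours per year:

        # Sketch: how operational restrictions lower a source's calculated
        # potential to emit (PTE).  All figures are hypothetical.

        HOURS_PER_YEAR = 8760          # continuous, year-round operation

        def potential_to_emit(lb_per_hour, hours=HOURS_PER_YEAR):
            """Annual PTE in tons, from a maximum hourly emission rate."""
            return lb_per_hour * hours / 2000.0    # 2000 lb per ton

        unrestricted = potential_to_emit(30.0)               # full design capacity
        restricted = potential_to_emit(30.0, hours=2000)     # permit limits hours

        print(f"Unrestricted PTE: {unrestricted:.0f} tons/yr")   # 131 tons/yr
        print(f"Restricted PTE: {restricted:.0f} tons/yr")       # 30 tons/yr

        # Against a hypothetical 100 ton/yr major source threshold, only the
        # restricted figure avoids major source review -- hence the need for
        # the limiting restriction to appear as an enforceable permit condition.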

                                                           Yes    No     NA

     6.  Is the information accompanying the permit
application considered to be enforceable?                  87%    10%    3%
                                   IV-31

-------
     Most agencies believe that the information submitted by the applicant
is an enforceable part of the permit.  Six agencies  answered that such
information is not enforceable.  Based on some of the comments  provided,  it
would appear that the existing regulations that govern State and local
agency permitting procedures vary in terms of their  clarity on  this  issue.
One particularly interesting comment indicated that, with respect to the
agency responding, the information contained in a permit  application is
"generally enforceable as a description of the operation  covered by  the
permit."  The comment emphasized-, however, that emissions data  and represent-
ative operating data are not considered enforceable  unless they are  reiterated
as permit conditions.
                                   IV-32

-------
                          V.  COMPLIANCE ASSURANCE
A.  INTRODUCTION

    The compliance assurance element of the FY 1984 National  Air Audit
System (NAAS) was designed to examine State and local programs responsible
for ensuring the compliance of sources subject to requirements of State
implementation plans (SIP's) and, where delegated, new source performance
standards (NSPS) established under Section 111 of the Clean Air Act
and national emission standards for hazardous air pollutants (NESHAP) estab-
lished under Section 112.  Of the several hundred thousand regulated
stationary sources in the nation, there are approximately 30,000 sources in
these categories for which EPA and State and local air pollution control
agencies share a concern about compliance status and associated enforcement
activities.  Compliance activities directed to these sources formed the
primary basis on which each audit was conducted.

    The major parts of the compliance assurance element include performing a
pre-visit assessment by examining source data reported to EPA by State and
local agencies, reviewing source files, and conducting overview inspections.
According to the NAAS guidance, the EPA Regional Offices were to prepare
for the compliance assurance element of each audit by obtaining Compliance
Data System (CDS) retrievals on inspection frequency,  compliance rates, and
enforcement activity.  The Regions were then to analyze the CDS data  for
progress in meeting inspection commitments, source compliance status,
identification of long-term violators and associated compliance activity,
identification of long-term compliers and associated surveillance activity,
and identification of operating NSPS sources without the required 180-day
performance test.  Finally, based on this CDS analysis, the Regions were to
prepare a summary of each compliance program and send it to the State or
local agency before the visit.  The analysis could have taken the form of a
questionnaire for that agency or could have been a statement of findings to
be discussed for completeness and accuracy during the visit.   The pre-visit
assessment was also designed to help in identifying the files to be reviewed
during the on-site visit.

    The next major part of each audit was the on-site visit.   This visit
centered on a discussion of the findings in the pre-visit assessment  and on
review of 15-20 source files.  The files to be reviewed were to consist of
a mixture of the three air programs (SIP's, NSPS and NESHAP,  assuming the
latter two programs were delegated).  A file review checklist was developed
to ensure consistency in how the file reviews were to be implemented.  The
goals were to see if the files contained a reasonable profile of the  source,
contained written documentation to support the compliance status reported
to EPA, and contained documentation to show that violators are expeditiously
returned to compliance.  The State and local audit reports were envisioned
to include a discussion of both the pre-visit assessment and the status of
the files.

    The final component of the compliance audit was to be a program of
overview inspections conducted by EPA of 2-3 percent of the sources in the
CDS inventory.  The purpose was to verify the compliance status of a  source


                                    V-l

-------
as reported to EPA and to observe inspection practices  to see if there were
areas where EPA could increase performance through technical  assistance.
Due to the timing of this year's initial  audits,  implementation of this
program was expected to occur after the on-site visit.   Where these overview
inspections occurred before the drafting  of the audit  reports, it was
expected that the process would be described and  the results  discussed with
recommendations, where appropriate.  More complete results of the overview
inspections component will be discussed in the FY 1985  audit  reports.

    The compliance assurance audit included 65 State and local audits
(3 local agency audits did not cover compliance).  Since the  compliance
assurance element of the NAAS did not have a standardized questionnaire,
five questions were developed as a guide for summarizing the
findings in the State and local audit reports.  These questions represent
the key elements of the compliance portion of the audit, and  provide a
uniform basis to do a national assessment of the  compliance and enforcement
programs.

B.  MAJOR FINDINGS AND CONCLUSIONS

    Many States and locals showed one or  more strong points characteristic
of a successful air compliance program, such as high source compliance
rates supported by high inspection frequency rates, performance of all
required NSPS source tests, expeditious resolution of violators, and few
long-term violators.  These activities were adequately  reflected and vali-
dated by the national CDS.  Other States  had source files that were for the
most part well organized, up-to-date, and complete, reflecting a reasonable
profile of each source.

    However, the compliance audits also revealed  that  several  States and
locals, to a varying extent, have weaknesses in three  areas vital to a
strong and effective compliance program.   First,  files  generally do not
contain strong and verifiable information reflecting a  reasonable profile
of each source.  Second, many inspection  reports  are of poor  quality (no
mention of operating or emission parameters or pollutants emitted).  It is
unclear whether the problem is with the inspections themselves or just the
reports.  Third, the reviewed agencies' enforcement efforts are not always
effective in reducing the number of long-term violators by expeditiously
returning documented violators to compliance.

     State and local agencies have already addressed many of the problems
found during the audit; this progress should be reflected in the FY 1985 audit.
EPA, working with the State and Territorial Air Pollution Program Admini-
strators (STAPPA) and the Association of Local Air Pollution Control
Officials (ALAPCO), has recently defined performance criteria for taking
expeditious action against violators in a "timely and appropriate" guidance
document.  EPA, STAPPA, and ALAPCO are also studying means to improve the
inspection program for quality and coverage and to assure that the national
CDS has current valid source information.

    The remainder of this chapter addresses these findings in more detail.
The aforementioned five questions, which  represent the  key elements of this
compliance audit, are discussed in each appropriate part.


                                    V-2

-------
C.  PERIODIC REVIEW AND ASSESSMENT OF SOURCE DATA

    To assess the adequacy of State and local  compliance programs,  the  EPA
Regional Offices continually review source compliance status and inspection
information submitted by the audited agencies  and reflected in CDS  for  the
SIP, NSPS, and NESHAP programs.  The attached  Figures 1-6 provide a compli-
ance snapshot of all reviewed State and local  air compliance programs as  of
March 31, 1984 (the time when most audits were being carried out by the
Regional Offices).

    As shown in the three pie charts in Figures 1-3, the national compliance
picture for the three air programs is very respectable.  The subsequent
three bar charts depict, by air program, the compliance range, inspection
range, and the range of the number of long-term violators for all States
and locals audited.  As shown, compliance rates for SIP, NSPS, and NESHAP
sources range from a low of 50 percent in one jurisdiction to a high of 100
percent in another, with median figures in the mid-90 percent range.
Inspection rates range from 0 percent to 100 percent, with median figures
between 62 percent and 80 percent for the three programs.  The number of
long-term violators (defined for this audit as two consecutive quarters or
more) in each jurisdiction was largest for Class Al SIP sources, ranging
from a low of one in one agency to a high of 82 in another, with a  median
figure of three to four sources per jurisdiction.  Several  of the audited
agencies have expressed concern about the accuracy of the data currently  in
CDS, and EPA and the State and local agencies have committed to ensuring
that CDS contains the most valid information.
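
     As a simple illustration of how such range and median figures are
derived (a hedged sketch; the agency names and compliance flags below are
hypothetical, not CDS data):

        # Sketch: summary statistics over CDS-style compliance records.
        from statistics import median

        # agency -> list of source compliance flags (hypothetical data)
        agencies = {
            "agency_A": [True] * 45 + [False] * 5,    # 90 percent
            "agency_B": [True] * 19 + [False] * 1,    # 95 percent
            "agency_C": [True] * 10 + [False] * 10,   # 50 percent
        }

        rates = [100 * sum(flags) / len(flags) for flags in agencies.values()]
        print(f"Range: {min(rates):.0f}% to {max(rates):.0f}%")   # 50% to 95%
        print(f"Median: {median(rates):.0f}%")                    # 90%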

    The following question and discussion is the first of the five  questions
developed as a guide for summarizing the findings in the audit reports.

    (1)  Did the audit report discuss the findings of the pre-visit CDS
program assessment (including the agency's reaction), and provide an overall
statement on the condition of the air compliance program revealed by that
CDS analysis?

    A review of the 65 audit reports shows that some form of pre-visit
assessment was done by the Regions for at least 55 State and local  programs
prior to each audit (the other ten reports did not discuss whether  a pre-visit
CDS program assessment was performed or give an overall statement on the
condition of the air compliance program).  Thirty-four of these reports
contained an overall statement about the particular compliance program  based
on the CDS analysis:

    -  Four air programs were considered very  good.

    -  Twenty-six air programs were considered adequate
       (meeting most Clean Air Act requirements).

    -  Four air programs were termed seriously deficient.
                                    V-3

-------
    The remaining 21 audit reports made no definitive statement
on the air program based on the CDS assessment,  but positive comments  were
made in 8 of these reports, such as "inspection  rates are very
good" and "compliance rates are good to excellent."  For another  9,
negative comments were made such as "90-day time lag in  reporting violations
to EPA" and "numerous long-term violators."

    Careful study of the audit reports for the four agencies with "very
good" air compliance programs shows several elements contributing to the
success of each compliance program.  In general, these agencies,  based on
the CDS analysis:

    -  routinely complete nearly all the required inspections for SIP
       sources, and NSPS and NESHAP sources where delegated,

    -  have compliance levels for Class A SIP  sources consistently above 90
       percent with recent inspections to support this level, and

    -  address, in a timely manner, sources found to be  in violation of
       applicable emission limits or permitting  requirements resulting in
       few, if any, long-term violators (greater than 180 days).

It seems likely that other States have compliance programs as good as  these
four, but this was not readily discernible from the description of the
programs in the audit reports.

    To summarize, 38 (58 percent) of the 65 State and local  compliance
programs were found by the Regions to be either  adequate or very  good
according to the CDS pre-visit analysis, and 13  (20 percent) were judged
either less than adequate or seriously deficient in at least some element
of the program.  It was not possible to assign an overall  description  of
programs from the other 14 (22 percent) reports.  This initial effort
identified many good programs and pointed out  other areas where the States
and locals and EPA should work together to improve their compliance programs.

D.  FILE REVIEW

    The following questions and discussions pertain to the portion of  the
audit that addressed file reviews:

    (2)  Did the source files reflect a reasonable profile of the sources?

    Of the 65 audit reports reviewed, 63 contained file  review information.
Thirty-eight (38) of these indicated that each file reflected a reasonable
profile of the source, which means it contained  the following information:
source compliance status based on recent inspection or source test, an
identification of all air program regulations  to which the source is subject
and, within the inspection report, operating parameters, point sources, and
                                    V-4

-------
pollutants emitted by the facility.  Some common reasons cited in the 25
audit reports where the files were considered deficient were:  inability to
determine source compliance status from file contents,  no indication  of
the air program regulations to which the source was subject (SIP, NSPS,
NESHAP), missing inspection report, or poor quality inspection report (no
mention of operating or emission parameters, point sources, or pollutants
emitted by facility).

     (3)  Did the files contain adequate written documentation to support
the CDS compliance status?

     Thirty-nine (62 percent) of the 63 audit reports indicated that  files
reviewed contained some written documentation of compliance status to
support CDS.  The other 24 (38 percent) audit reports either cited a  lack
of any compliance information in the files or showed information in the
files which conflicted with CDS.  Most of these 24 reports were for the
agencies which were identified as having deficient files in the preceding
question.

     (4)  Are violations documented and pursued by the  Agency  to expedi-
tiously return a source to compliance?

     Forty-four (70 percent) of the 63 audit reports indicated that violations
are documented and pursued to expeditiously return a source to compliance.
The other 19 reports indicated that some sources were not being expeditiously
returned to compliance, leading to a number of long-term violators (greater
than 180 days) or untimely, protracted enforcement actions.

E.  OVERVIEW INSPECTIONS

    The following question and discussion pertains to the portion of  the audit
that addressed overview inspections:

     (5)  Did the Region conduct an overview inspection program in each
agency?  Summarize the procedure including how sources  were selected,
inspection procedures (State or local  participation), and inspection  results.

    As noted earlier, the short time available for NAAS implementation  in
FY 1984 limited the opportunity for Regional Offices to perform and evaluate
this aspect of the audit before the audit summary reports were written.
Some Regions carried out an overview inspection program in a few of their
jurisdictions which involved joint EPA-State inspections.  However, the
audit reports have no details about source selection, specific inspection
procedures, or inspection results.  This will  be covered more  completely in the
FY 1985 compliance assurance audit program.
                                    V-5

-------
                FIGURE 1
COMPLIANCE BREAKDOWN OF SIP PROGRAMS
          CLASS A1 SIP SOURCES
     (Pie chart: Unknown 4%; In Violation 4%;
      Meeting Schedule 3%)
                   V-6

-------
                 FIGURE 2
COMPLIANCE BREAKDOWN OF NSPS PROGRAMS
              NSPS SOURCES
     (Pie chart: Unknown 9%; In Violation 4%;
      Meeting Schedule 2%)
                    V-7

-------
                  FIGURE 3
COMPLIANCE BREAKDOWN OF NESHAP PROGRAMS
              NESHAP SOURCES
     (Pie chart: Unknown 3%; In Violation 2%;
      Meeting Schedule 1%)
                    V-8

-------
                          FIGURE 4
           STATE AIR COMPLIANCE STATISTICS
                 CLASS A1 SIP SOURCES
     (Bar charts: State compliance rates as of 4/1/84;
     State inspection rates, 4/1/83-3/31/84, median 79-80%;
     and number of long-term violators (2 qtrs or more)
     per State as of 4/1/84)
                            V-9

-------
                          FIGURE 5
           STATE AIR COMPLIANCE STATISTICS
                     NSPS SOURCES
     (Bar charts: State compliance rates as of 4/1/84;
     State inspection rates, 4/1/83-3/31/84, median 71%;
     and number of long-term violators (2 qtrs or more)
     per State as of 4/1/84)
   DATA FOR THIS BAR CHART DO NOT ALLOW CALCULATIONS OF MEDIAN FIGURE.
                            V-10

-------
                          FIGURE 6
           STATE AIR COMPLIANCE STATISTICS
                    NESHAP SOURCES
     (Bar charts: State compliance rates as of 4/1/84;
     State inspection rates, 4/1/83-3/31/84, median 92%;
     and number of long-term violators (2 qtrs or more)
     per State as of 4/1/84)
   DATA FOR THIS BAR CHART DO NOT ALLOW CALCULATIONS OF MEDIAN FIGURE.
                            V-11

-------
                            VI.  AIR MONITORING
A.   INTRODUCTION
     In May 1979, EPA promulgated air monitoring and reporting regulations
for State implementation plan (SIP) purposes which significantly revised
the previous 1971 regulations.  The major elements of the revised regulations
are as follows:

     o  Provides for fixed and movable monitoring sites.

     o  Establishes monitoring network design and siting criteria.

     o  Requires that reference or equivalent methods be used.

     o  Imposes an annual network review.

     o  Requires an approved quality assurance program.

     o  Provides for the quarterly reporting of all National  Air
        Monitoring Stations (NAMS) data and the annual  summary reporting of
        State and Local  Air Monitoring Stations (SLAMS) data.

     An effective air monitoring program includes audits as  an integral
part of its overall activities.  Consequently, the May 1979  Federal  monitor-
ing regulations required State or local agencies operating SIP networks  to
participate in EPA's national performance audit program and  to permit an
annual EPA system audit  of their ambient air monitoring program.  In view
of this requirement, EPA in July 1980 issued detailed and lengthy guidance
for Regional Office use  on conducting system audits of State and local
agencies.  The guidance  was issued in the EPA report, "Quality Assurance
Handbook for Air Pollution Measurement Systems, Volume II -  Ambient  Air
Specific Methods,"  (EPA 600/4-77-027a.)

     In late 1982 and early 1983, as part of the National Air Audit  System
work group efforts, the  Regional Offices were surveyed to compile information
on how each Region conducted audits of State and local  air pollution control
programs.  Specific aspects of audits covered in the survey  included selection
of topics, format, depth of audit, use of audit results, and tracking of
corrective actions.  With regard to air monitoring, EPA found that the
monitoring regulations in 40 CFR Part 58 did form the basis  for all  Regional
systems audits.  Of the  ten Regions, four indicated they used the criteria
found in the Quality Assurance Handbook Volume II cited above.  Other
Regional Offices developed their own criteria and indicated  that some of
the criteria in the Quality Assurance Handbook were not entirely relevant
to their States' situations.  Because most Regions were auditing air moni-
toring programs based on their own criteria, it was decided jointly by repre-
sentatives of the State  and Territorial Air Pollution Program Administrators
(STAPPA), the Association of Local Air Pollution Control Officials (ALAPCO),
and EPA that air monitoring would be one of the four major areas to  be
included in the FY 1984  audit program.
                                    VI-1

-------
     Several  major problems were encountered in  preparing  an  adequate  and
timely air monitoring questionnaire for use in FY  1984.  One  problem was
that the existing EPA system audit guidance in the Quality Assurance Handbook
was in need of major revision, but because  of its  complexity  and  length  it
could not be completely revised for use in  FY 1984.  Another  problem was that
the purpose of the FY 1984 audit questionnaires, the  criteria for preparation,
and the use of the audit results were not  sharply  focused  at  the  time  of
preparation.  Faced with these difficulties, it  was recognized from the
outset that the audit questionnaire developed by the  air monitoring audit
committee would only serve as an interim document  for conducting  system
audits in FY 1984.

     The air monitoring questionnaire developed  for use  in FY 1984 was
intended to provide an overview and summary assessment of an ambient air
monitoring program, not to replace the detailed system audit guidance
contained in the Quality Assurance Handbook.
During FY 1984, EPA proceeded to develop a  new questionnaire  and  a systems
audit protocol for the Quality Assurance Handbook.  These  materials will be
reviewed and approved by the air monitoring audit  committee as well as by
State and local air pollution control officials.  It  is  expected  that  the
new questionnaire and systems audit protocol will  be  completed in time for
use in FY 1985.

     Fourteen major questions were asked during the air monitoring systems
audit.  For ease of review, these 14 questions have been separated into the
four categories of quality assurance; network design and siting; data
handling; and facilities, staff, and equipment.  Each of these areas is
described in detail below following the discussion of major findings and
conclusions.

B.   MAJOR FINDINGS AND CONCLUSIONS

     The 63 air monitoring audit questionnaires  received by EPA covered 48
States, the District of Columbia, 2 U.S. Territories, and  12  local agencies.
Based on the monitoring audit results and  periodic SLAMS/NAMS status reports
issued by EPA, it is concluded that State  and local agencies  have successfully
established and maintained their respective SLAMS/NAMS networks of approxi-
mately 4888 pollutant monitors.  Overall,  at any particular time, about 97
percent of these monitors are operating and in compliance  with the 40  CFR  58
requirements for network design, siting, and instrumentation.  This compares
favorably with the audit findings that 94  percent  of  the agencies are
meeting these regulations.  The remaining 3 percent are monitors that have
been discontinued or are being relocated because of lost leases, instrument
repairs, construction projects, or the like.

     The audit reports clearly show that timely  data  submission (within 90
days of the end of each quarter) is a problem for  approximately 32 percent
of the agencies involved.  The reasons appear to be related to staff short-
ages, the need to upgrade computer capabilities, the  length of analytical
time required to perform lead analyses, and the need  for additional data
validation time.  Several of the States are in the process of upgrading
                                    VI-2

-------
their computer capabilities which will lead to more timely data submittal.
Another action currently in progress that will affect the timeliness of
data submittal is a proposed regulatory change.  This proposal  will  increase
the time limit for submittal from 90 to 120 days after the end  of the
calendar quarter in which the data were collected.  This change is being
proposed because EPA has found that a 120-day period is more reflective of
national data needs.

     Approximately 34 percent of the agencies had trouble maintaining a 75
percent data capture rate for their SLAMS/NAMS sites.  It was not possible
to determine from the survey whether the missing values were due to  instrument
problems or data submission problems.  This problem does need additional
study.  On the positive side, 66 percent of the agencies indicated that at
least 90 percent of their sites are meeting the 75 percent completeness
criteria.

     The survey indicated that many agencies were experiencing  problems
related to old or worn out monitoring equipment.  This need was independently
confirmed by a STAPPA/ALAPCO equipment survey in 1983.  EPA has recommended
that a portion of the Section 105 air grants be allocated to replacement of
some ambient instrumentation.

     Based on regularly occurring surveys performed by EPA, all States have
an approved Quality Assurance Plan.  The national audit survey  indicates
that many States and local agencies are in the midst of modifying and
upgrading their Quality Assurance Plans due to revised Federal  reference
procedures or because some operational procedures were not complete  or
entirely adequate.

     With respect to annual network reviews and the annual SLAMS Air Quality
Data Report, the survey indicates only minor problems and these should be
easily remedied administratively.

     The precision and accuracy audit results at first appeared to indicate
a substantial problem.  However, it is believed that the largest part of the
problem can be attributed to the wording of the question, because only 48
percent of the respondents provided an appropriate response.  Based on available
data, it can be concluded that most agencies are providing the  required
precision and accuracy data and over the last 3 years are improving  the data
quality.  It appears the majority of the agencies should meet the precision
and accuracy goals established in the survey in the next few years.   An
effort to remove the ambiguity of this question and other problem questions
contained in the audit questionnaire is underway.  The new questionnaire
will attempt to minimize the resubmission of massive blocks of information
already within EPA's possession and to limit each question to a single subject area.
                                    VI-3

-------
C.   QUALITY ASSURANCE

     Quality assurance consists of two distinct and equally important
functions.  One function is the assessment of the  quality  of the  monitoring
data by estimating their precision and accuracy.  The other function is the
control and improvement of the quality of the monitoring data by  implemen-
tation of quality control policies, procedures, and corrective actions.
These two functions form a control  loop in that when the assessment function
indicates that the data quality is inadequate, the control  effort must be
increased until the data quality is acceptable.
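
     The following sketch restates the control loop just described in program
form; the data source and corrective action are hypothetical placeholders,
not part of any EPA procedure:

        # Sketch of the assessment/control feedback loop for data quality.

        PRECISION_GOAL = 15.0   # plus or minus percent
        ACCURACY_GOAL = 20.0    # plus or minus percent

        def assess(precision_limit, accuracy_limit):
            """Assessment function: do the estimates meet the goals?"""
            return (abs(precision_limit) <= PRECISION_GOAL
                    and abs(accuracy_limit) <= ACCURACY_GOAL)

        def control_loop(get_limits, take_corrective_action, max_rounds=5):
            """Control function: act until the assessment passes."""
            for _ in range(max_rounds):
                precision, accuracy = get_limits()
                if assess(precision, accuracy):
                    return True            # data quality acceptable
                take_corrective_action()   # e.g., recalibrate, revise procedures
            return False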

     With this in mind, the following questions were asked regarding quality
assurance:

     Quality Assessment

     o  How do the precision and accuracy of the Agency's  instruments compare
        with the goal  of plus or minus 15 percent  for precision and plus or
        minus 20 percent for accuracy?

     o  Does the Agency participate in interagency audits?

     o  Does the Agency participate in the National Performance and System
        Audit Program required under 40 CFR 58 Appendix A?

     Quality Control

     o  Does the Agency have a quality assurance manual?

     o  Are the Agency's monitoring practices consistent with the quality
        assurance manual?

     o  Are corrections/deletions to preliminary ambient air data performed
        according to the quality assurance manual?

     o  Is the basis for revising final ambient air data formally documented?

     o  Does the Agency provide quality assurance  measures for noncriteria
        pollutants?

     A total of 63 audit questionnaires were received by EPA which addressed
the topic of quality assurance.  Responses were obtained from 48 States, the
District of Columbia, 2 U.S. Territories, and 12 local agencies.   The
District of Columbia and the 2 territories are counted as  States  in this
section of the report.
                                    VI-4

-------
     In response to the first question regarding how the Agency's instruments
compare to the 95 percent probability limits of plus or minus 15 percent
for precision and plus or minus 20 percent for accuracy (plus or minus 15
percent for hi-vol accuracy), less than half of the responses (30/63 or 48
percent) reported probability limits for all pollutants for both years
(1981 and 1982).  In some cases, minor omissions occurred, such as excluding
carbon monoxide (CO) accuracy or only reporting lead values for one year.
In other cases, data were not reported at all, percentage values were
reported, or only the highest probability limits in the 2-year period
were reported.

     There was little difference between State and local agency reporting.
Fifty percent of local agencies (6/12) reported precision
and accuracy data for 2 years, while 47 percent (24/51) of State agencies
reported precision and accuracy data for 2 years.

     For those 30 agencies where precision and accuracy could be quantified,
only 13 agencies were within the 95 percent probability limit goals for
precision and accuracy for all pollutants for the 2 years requested.  For
those agencies which did not achieve the goal of plus or minus 15 percent
for precision and plus or minus 20 percent for accuracy, precision was
exceeded most often for sulfur dioxide (SO2) (22 percent of all reported
exceedances) while accuracy was exceeded most often for nitrogen dioxide
(NO2) (30 percent of all reported exceedances).  Reported exceedances by
percent for precision and accuracy are shown in Figures VI-1 and VI-2.  The
fewest reported exceedances for precision were for CO, while the fewest
reported exceedances for accuracy were for total suspended particulate (TSP).
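
     A minimal sketch of the goal check applied in this comparison follows;
the pollutant limits shown are hypothetical examples, not audit results:

        # Sketch: flag 95 percent probability limits that exceed the goals.

        PRECISION_GOAL = 15   # plus or minus percent
        ACCURACY_GOAL = 20    # plus or minus percent

        # pollutant -> {kind: (lower, upper) limits, percent} (hypothetical)
        limits = {
            "SO2": {"precision": (-22, 18), "accuracy": (-12, 10)},
            "NO2": {"precision": (-14, 13), "accuracy": (-25, 21)},
        }

        for pollutant, kinds in limits.items():
            for kind, (lower, upper) in kinds.items():
                goal = PRECISION_GOAL if kind == "precision" else ACCURACY_GOAL
                if abs(lower) > goal or abs(upper) > goal:
                    print(f"{pollutant} {kind} limits ({lower}, {upper}) "
                          f"exceed the +/-{goal} percent goal")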

     The precision and accuracy results presented above do not truly depict
the actual precision and accuracy data submitted to EPA's data bank.  The
discrepancy between the precision and accuracy results reported on the
audit questionnaire and those contained in the data bank can be attributed
to the newness of the audit questionnaire, confusion over how to complete
the questions, and reluctance on the part of local and State agencies to
resubmit data previously reported to EPA.  In view of these
problems and the incompleteness of the questionnaire results, we have
included precision and accuracy data summary statistics based on data
contained in EPA's data bank.  Table VI-1 depicts by pollutant the national
percent completeness of precision and accuracy data.  For example, in 1983
the precision and accuracy data reported ranged from 80 percent complete
for NO2 to 96 percent for TSP.  Table VI-2 shows the 90th percentile lower
and upper precision and accuracy 95 percent probability limits for CO for
the years 1981-1983.  For CO precision in 1983, there were 374 quarters of
precision data submitted by State and local agency reporting organizations.
Of the total, 90 percent were within upper and lower probability limits of
minus 13 percent and plus 12 percent or meeting the goals of plus or minus
15 percent.  For the other criteria pollutants, 65 to 75 percent of the
data met the precision goals of plus or minus 15 percent and 80 to 90
percent met the accuracy goals of plus or minus 20 percent.
                                    VI-5

-------
     Concerning the question of an approved quality assurance manual, 92
percent of all State agencies and 58 percent of all local agencies surveyed
reported having fully EPA-approved and up-to-date quality assurance manuals.
In general, the quality assurance manuals were acceptable except for one or
two deficiencies.  The deficiencies noted included lack  of  ozone transfer
standard recertification procedures and the need for hi-vol  sampler  cali-
bration procedures to be written in conformance with the December 12,  1982,
Federal Register changes to the TSP reference method.
                                    VI-6

-------
                          FIGURES VI-1 AND VI-2
       REPORTED EXCEEDANCES FOR PRECISION AND ACCURACY BY POLLUTANT
                                    VI-7
-------
                                 TABLE VI-1

PERCENT COMPLETENESS OF QUALITY ASSURANCE DATA (PRECISION AND ACCURACY)

       Pollutant          1981      1982      1983

          CO                77        89        87
          SO2               82        93        87
          NO2               56        72        80
          O3                83        89        88
          TSP               94        97        96
                                 TABLE VI-2

               CARBON MONOXIDE PRECISION AND ACCURACY LIMITS
           MET BY 90 PERCENT OF THE SLAMS REPORTING ORGANIZATIONS

                                           Probability Limits
                                                (Percent)
                        Year     Number     Lower     Upper

     Precision          1981       296       -18       +17
                        1982       346       -15       +16
                        1983       374       -13       +12

     Accuracy Level 2   1981       197       -18       +16
                        1982       256       -14       +13
                        1983       295       -13       +13
                                    VI-8

-------
     However, not all agencies which had an approved quality assurance manual
followed the procedures outlined in the manual.  Twenty-nine percent of all
agencies did not follow monitoring practices which were consistent with their
approved quality assurance plan.  Problems ranged from procedural  inconsistencies
to the lack of quality assurance for certain sites or criteria pollutants.

     Concerning data corrections, 89 percent of all State and local  agencies
corrected preliminary ambient air data according to the quality assurance
manual, while 88 percent of all agencies documented and maintained their
basis for revising their final ambient air data.  Generally, agencies which
were deficient in the area of data correction did have documented  procedures;
however, they were not listed in the quality assurance manual.

     Another important aspect of quality assurance is participation  in audits
on an independent or interagency basis.  Of the 51 State agencies  which were
audited, 92 percent (47/51) participated in independent and/or interagency
audits, while all local agencies participated.  One Region reported  that the
States within the Region participated in field audits, but did not participate
in lab audits.  Similar to the independent or interagency audit participation,
92 percent of all State agencies participated in the National  Performance
and System Audit Programs required in Appendix A of 40 CFR Part 58.   It
should be noted that those agencies which did not participate in National
Performance and System Audit Programs did participate in independent and/or
interagency audits.

     The final quality assurance question asked if quality assurance measures
were provided for noncriteria pollutants.  This question was very  ambiguous,
and results varied from State to State and Region to Region.  Many agencies
answered "yes" without describing which noncriteria pollutants were being
monitored; other States understood the question to deal with air toxics;
and still others described whether the quality assurance measures employed
were for nonmethane organic compounds (NMOC's), sulfates and nitrates, or
acid rain.  Forty-six agencies did provide quality assurance
measures for noncriteria pollutants (73 percent), nine agencies reported
they did not monitor for any noncriteria pollutants (14 percent),  while 8
agencies (13 percent) did monitor for noncriteria pollutants without suffi-
cient quality assurance measures.

D.   NETWORK DESIGN AND SITING

     Appendices D and E of 40 CFR Part 58 describe network design  and
siting criteria for State and Local Air Monitoring Stations (SLAMS)  and
National Air Monitoring Stations (NAMS), a subset of SLAMS.  Stations which
comprise SLAMS should be designed to meet one of four principal  monitoring
objectives:

     o  To determine the highest concentration of a given pollutant  expected
        to occur in the area covered by the network;
                                        VI-9

-------
     o  To determine representative pollutant  concentrations in areas of
        high concentration and high population density;

     o  To determine the impact of significant sources  on  ambient air
        quality; and

     o  To determine general  background pollutant concentration levels.

In addition to the above monitoring objectives from Appendix D, each
site must comply with specific siting requirements of Appendix E.

     With regard to network design and siting, three questions were asked:

     o  How many SLAMS (including NAMS) were in operation  for each criteria
        pollutant?

     o  Was network design and siting in conformance with  40 CFR 58 Appendix
        D and E, and the Quality Assurance Handbook?

     o  Was the SLAMS network reviewed annually with each  station being
assigned a number in accordance with EPA's Storage and Retrieval of
        Aerometric Data (SAROAD) system, an operating schedule, a spatial
        scale, and a monitoring objective?

     The first question was meant to serve as  a cross-check  to the SLAMS
Status Report produced annually by EPA.  In every case but one, the number
of sites was reported.  One State agency stated that monitoring was conduc-
ted on an individual district basis, and they  did not keep records on the
number of monitoring stations operated by local districts.  Of those which
responded, 84 percent corresponded closely with the 1983 SLAMS Status Report.
It should be understood that  the numbers do not remain  stable because through-
out the year a small number of sites are added, discontinued, or replaced
for valid reasons (lost lease, building construction, or the like).  However,
there was a large discrepancy between the 1983 SLAMS report  (see Table
VI-3) and the air audit report for eight agencies.  Of  the eight agencies,
six reported fewer sites (some significantly)  while two  reported more
sites.  It is believed that the problem could  have been  resolved if the
question explicitly stated that a State was to include  all SLAMS sites
operated by local agencies within the States.

     From the questionnaires, it was determined that 94  percent of all the
agencies surveyed (59/63) reported that air monitoring  network design and
siting were in conformance with 40 CFR 58 Appendices D  and E, and EPA's
Quality Assurance Handbook.  One questionnaire responded "unknown," while
three agencies were not in conformance with design and  siting criteria.
The audit indicates that there are relatively  few sites  not  in compliance
with these requirements and all three agencies noted that  corrective actions
are in progress for those sites.  The audit questionnaire  compares closely
with EPA's annual SLAMS status report, which indicated that about 97 percent
of the 4888 SLAMS sites (including NAMS) were operating and complying with
the monitoring and reporting regulations.
                                   VI-10

-------
     Similarly, 94 percent of all agencies surveyed reviewed their SLAMS
network annually and ensured that each station was assigned a SAROAD number,
an operating schedule, a spatial scale, and a monitoring objective.  Four
State agencies failed to do this.   One agency reported no review since
1981, while another was scheduled to begin reviews in 1984.   No  comments
were received about the other two States.
                                 TABLE VI-3

             NATIONAL SUMMARY OF AIR MONITORING STATIONS (1983)

      POLLUTANT                                  SLAMS*          NAMS

      TSP                                         2574            644
      SO2                                          583            222
      CO                                           450            115
      NO2                                          298             58
      O3                                           612            216
      Lead                                         371            107

      TOTAL                                       4888           1362

      *Includes NAMS



E.   DATA HANDLING

     States are required to submit to EPA an annual summary report of all
ambient air quality monitoring data from all SLAMS monitoring stations, as
specified in 40 CFR 58.26.  In addition, 40 CFR 58.35 requires States to
submit quarterly reports of all ambient air quality monitoring data from
all NAMS sites to EPA.  Appendix F to 40 CFR 58 describes how these
data are to be submitted.

     In order to determine if data were being submitted according to  40 CFR
Part 58, five questions were asked:

     o  Does the Agency have staff and data processing facilities adequate
        to process and submit to SAROAD air quality data as specified in 40
        CFR 58.35 and Appendix F?

     o  What fraction of the data is more than 45 days late?

     o  What fraction of the SLAMS sites reported less than 75 percent of
        the data?
                                   VI-11

-------
     o  Are the Agency's  SLAMS instruments  designated  as  reference  or
        equivalent methods  by EPA?

     o  Does the Agency submit an annual  summary  report as  specified in  40
        CFR 58.26?

     With regard to adequate staff and processing facilities needed to
submit SAROAD data, 90 percent of the agencies  had  sufficient  staff and
facilities necessary to submit SAROAD air quality data as specified in 40
CFR 58.35 and Appendix F.  As for those  which did not  have  adequate staff
or facilities, problems were mainly attributable  to computer services and
turn-around time on lead  analyses.   Twenty  State  and/or local  agencies did
submit at least a portion of their  data  more than 45 days late.   The percent-
age of sites by agency which submitted their data late are  shown  in Figure
VI-3.  The figure shows that of the 20 agencies reporting late data, 2
agencies reported 100 percent of their data late, 8 agencies reported at
least 30 percent of their data late, and 10 agencies reported  up  to 20
percent of their data late.  The questionnaires indicated that some of the
late reporting agencies are in the  process  of upgrading their  computer
services (new software and  addition of some personal computers) which
should improve this item.  In addition to those improvements,  EPA is pro-
posing revisions to 40 CFR  Part 58  to increase  the  reporting period from 90
days after the end of the quarter to 120 days,  which should help  this
problem.

     Only 13 agencies (21 percent)  reported that  all of their  SLAMS sites
achieved data recovery greater than 75 percent.  Of those agencies  which did
not attain at least 75 percent data recovery, the fraction  of  SLAMS sites
reporting less than 75 percent of data ranged from  less than 1 percent to
41 percent.  Figure VI-4 shows that 18 agencies reported that  less  than  5
percent of their sites had  less than 75  percent data recovery, 11 agencies
reported between 5 and 10 percent of their  sites  with  less  than 75  percent
data recovery, 1 agency reported between 10 and 15  percent  of  its sites
with less than 75 percent data recovery,  and 6  agencies reported  between 15
and 20 percent of their sites with  less  than 75 percent data recovery.
These data show that 42 of  the 63 agencies  audited  (66 percent) have a 75
percent or better rate for  data capture  for at  least 90 percent of  their
SLAMS sites.  Independent verification of these data is a tedious hand
calculation problem; therefore, it  has not  been attempted.  A  computer
program to perform this task is currently under development.
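
     The verification described would amount to something like the following
sketch (the site identifiers and counts are hypothetical; the actual program
under development may differ):

        # Sketch: fraction of SLAMS sites below the 75 percent
        # data capture criterion for one agency.

        expected = 365        # scheduled observations in the period
        sites = {             # site -> valid observations recovered
            "site_001": 351,
            "site_002": 190,
            "site_003": 299,
        }

        below = [s for s, n in sites.items() if n / expected < 0.75]
        print(f"Sites below 75% capture: {below}")                 # ['site_002']
        print(f"Fraction of sites below: {len(below) / len(sites):.0%}")  # 33%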

     Responses from the audit questionnaire indicate that with only two
exceptions (3 percent), all other agencies  use  monitoring instruments that
are designated as either reference  methods  or equivalent methods  by EPA.
The non-equivalent instruments utilized  in  these  two States are limited  to
lead sites in one State and a few TSP locations in  the other.  All  State
agencies except two (4 percent) submit an annual  SLAMS data report  as
required by 40 CFR 58.26.
                                   VI-12

-------
                       FIGURE VI-3
     PERCENT OF SITES BY AGENCY REPORTING DATA
                    >45 DAYS LATE
     (Bar chart: number of agencies by percent of
      sites with late data)

                       FIGURE VI-4
       PERCENT OF SITES REPORTING <75% OF DATA
     (Bar chart: number of agencies by percent of
      sites reporting less than 75 percent of data)
                         VI-13
-------
F.   FACILITIES, STAFF, AND EQUIPMENT

     In order to conduct bi-weekly precision  checks  and  quarterly
performance (accuracy)  audits as specified in Appendix A to 40  CFR  58,
trained staff, adequate facilities, and  access to  the National  Bureau of
Standards (NBS) traceable gas and flow standards are necessary.  Of those
responding to the audit questionnaire, 84 percent  of all  agencies did have
sufficient staff, facilities, and access to NBS traceable standards to conduct
bi-weekly precision checks and quarterly audits.   Two States and one local
agency did not operate  continuous monitors; thus,  bi-weekly precision
checks were not necessary and the question therefore was not applicable.
Nine agencies (16 percent) were not adequately trained or equipped  to conduct
bi-weekly precision checks and quarterly audits.   Comments included inadequate
training budgets, additional  funding required for  expanding networks or
replacing monitors, and problems with obtaining up-to-date calibration
gases for precision checks.  An independent equipment needs survey has been
conducted which confirms the need to replace  old monitoring equipment.

     The air monitoring audit also questioned whether agencies  had  adequate
laboratory procedures,  staff, and facilities  to conduct  the tests and analyses
needed to implement the Agency's SLAMS monitoring  and quality assurance
plans.  Of the 61 agencies which answered the question,  56 (92  percent) did
have adequate laboratory facilities.  Of the  five  negative responses,
quality assurance documentation was inadequate at  one agency, laboratory
space was inadequate at another, lead procedures needed  updating at a
third, a new atomic absorption spectrophotometer was needed at  the  fourth,
and inadequate (too few) staff were reported  at the  fifth.  Two agencies
did not respond to the  question.
                                   VI-14

-------
             VII.  EVALUATION OF THE FY 1984 AIR AUDIT PROGRAM

A.  INTRODUCTION

     Although FY 1984 was the initial year for the National Air Audit
System (NAAS), and a number of first-year implementation problems were
experienced, the audit provided a reasonable assessment of the health of
the national air quality management program.  The fact that the EPA Regions
and the State and local air pollution control agencies were able to
accomplish the complex and resource-intensive job of completing the audit
of 68 agencies is a major feat in itself.  This first audit was
of necessity a learning experience in which EPA came to understand how to
improve successive audits.  An evaluation of what was learned during the
FY 1984 audits of the four individual program areas follows.

B.  AIR QUALITY PLANNING AND SIP ACTIVITIES

    The air quality planning and SIP activities section of the FY 1984
NAAS attempted to examine what agencies do and to verify how they do it,
i.e., how effectively they use their resources and how good their outputs
are.  Although the "what" questions seemed to fare better in the audit,
they are also the easiest to ask and to answer.  The "how" questions will
need more work in the FY 1985 audit effort.  Due to the lack of specificity
in some of the questions, and the lack of detailed responses from many
agencies, the FY 1984 audit could not, in some instances, comprehensively
and accurately verify how effectively most agencies are doing their jobs.
Nevertheless, the audit effort was successful in
identifying certain areas of concern as well as those areas where the
air quality planning process is being implemented successfully.

     Of the 68 agencies that participated in the FY 1984 audit, 61  programs
(49 States, the District of Columbia, 2 Territories, and 9 local  agencies)
were audited for all four sections of the air quality planning and SIP
activities chapter. (The State of California was not audited except for
the local Ventura County agency.)  In addition, two local agencies (Chicago
and Cook Co., IL) supplied questionnaires only for the  section of the
audit dealing with emission inventories.  About half of the audit question-
naires were completed by the State, and about one-third was completed  by
the Regional Offices.  Three questionnaires were filled out jointly by
both the State and the Regional Office, two to four were unusable in
various parts, and three were not supplied.  In all  but one Region, the
Regional Office prepared an executive summary.  The Regions'  evaluation
and discussion of the State or local agency's audits ranged from almost
nothing to a detailed 14-page analysis.

     Considerable inconsistency existed among EPA Regions in the compre-
hensiveness and level of detail provided in the audit activities.  For
example, some audited States did not complete questionnaires or submitted
unusable ones.  Some Regions chose to fill  out the questionnaires for  all
or almost all of their States while in other Regions the States alone
answered the questions.  One Region's evaluation consisted of a one or
two page summary for the entire chapter, while other Regions provided
several pages on each section of the chapter and a dozen or so pages in
all.
                                  VII-1

-------
     The majority of State and local  responses lacked sufficient detail  or
explanation of their answers.  A one-word "yes" or "no" response was
typical of many responses even where further detail was required by the
audit guidelines.  Only a dozen or so agencies provided a sufficient level
of detail in response to most of the audit questions.

     A significant number of the audit questions resulted in responses
from which no definite conclusions could be drawn.  For the most part, this
is probably the fault of the wording  of the question, and not the responding
agency.

     Consideration of the problems encountered in implementing the FY 1984
audit effort has led to a realization that certain changes need to be
made in the audit program for FY 1985.  In particular, the FY 1985 audit
of air quality planning and SIP activities will emphasize the following:

     o  Streamlining and clarifying questions to eliminate any confusing
        jargon so as to guide the responses of State and local agencies
        toward an appropriate level of detail.

     o  Standardizing the activities of the EPA Regional Offices in reviewing
        the State and local audits and promoting national consistency in
        the performance of the audit.

     o  Having State and local agencies complete the questionnaires and,
        if time permits, send them to the Regional Offices.  Regional
        Offices will then review the answers provided and comment in
        the appropriate space before the on-site visit.

     o  Encouraging EPA Regions to spend more time with the State and
        local agencies to make sure that all audit questions are clearly
        understood and adequately addressed.  Both sides of any unresolved
        disagreements between control agencies and EPA Regions should be
        clearly presented on the questionnaire.

C.  NEW SOURCE REVIEW

     The FY 1984 new source review (NSR)  audit was successful  as an
initial assessment of the present NSR framework.  The audit verified that
64 State/local agencies are generally familiar with and have strong
support for the preconstruction review process.  The basic State/local
framework appears to be in place for the  review of projects before they
commence construction.  The FY 1984 audit was also successful  in establish-
ing a valuable feedback mechanism to identify where EPA should emphasize
future policy development.

     Various problems were encountered, however, during the FY 1984 NSR
audit that limited its ability to provide definitive answers regarding
the way that many of the program details  actually function.  Three types
                                  VII-2

-------
of problems are worth noting.  These include the inconsistent quality of
the individual  audit reports, the poor condition of some permit files
that were selected for auditing, and the limited number of major source
permits issued during the period covered by the audit.

     Concerning the individual audit reports, there were several contributing
factors.  First, there was some confusion as to what certain audit questions
meant and how they were to be answered.  This appeared to be caused in large
part by either late or no advance circulation of the audit guidance that
provided an explanation of each question asked in the new source review audit
section.  Consequently, some questions were left unanswered, or were improperly
answered (these were reinterpreted where possible).  Second, perhaps also as a
result of the unavailability of guidance, answers to the questionnaire tended
to be very brief and imprecise, leaving much uncertainty as to how they should
be interpreted.  Finally, the completed questionnaires were not consistent
in terms of whether the answers had been provided by the audited agency or by
the EPA Regional Office.   Some Regions submitted the questionnaire containing
the audited agency's answers accompanied by a narrative audit report that
sometimes suggested contradictory conclusions to particular questions.   This
also resulted in an effort within EPA headquarters to attempt a reinterpretation
of the questionnaire responses.

     Regional auditors attempting to inspect individual permit files often
were confronted by files  that were fragmented, disorganized, and inadequately
documented.  Such conditions were reported in almost half of the audited
programs.  This contributed to slowing down the audit process as well  as
making it difficult to determine whether the appropriate procedures were
actually followed in issuing the permit.  The lack of documentation
caused problems in evaluating the applicability determinations, the ambient
impact analyses, and best available control technology (BACT) determinations,
among other things.

     The third problem, which is totally external to the audit development
process itself, involved  the limited number of permits that had been issued
to major sources during the audit period.  Due to the limited number of
major source reviews available to audit, it was difficult to satisfactorily
evaluate agency performance in the key areas of prevention of significant
deterioration (PSD) and emission offsets in nonattainment areas.

     Recognizing these problems, the NSR audit committee chose certain
corrective actions for FY 1985.  The first step has been to take the question-
naire and make revisions  to it where appropriate.  Having established a
national data base that identifies the national framework for the new
source review program in  FY 1984, the FY 1985 audit will focus more attention
on agency performance, i.e., permits issued.  In order to accomplish
this, the questionnaire is being revised so that it can be used to audit
permit files in a comprehensive manner.  All seven of the original  NSR
audit topics will be retained to some extent.  Questions will be revised
and supplemented, however, to encourage more precise and consistent
responses, to eliminate potentially confusing terminology, and to facilitate
the final processing and  analysis of the responses.

     The NSR audit committee is also developing more specific guidance to
be used in the selection  of permit files.  This will  help to ensure more
uniformity in the types and numbers of permits that are audited.  The


                                  VII-3

-------
questionnaire itself will ensure that the depth of each file audit is
relatively consistent.  Meanwhile, each audited agency will be asked to
summarize its permit issuing activities for the audit period (or a reasonably
similar time frame) so that the auditors can judge how representative the
findings from the audited permits are of the agency's permitting activity as
a whole.
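
     For illustration only, the following minimal sketch (in Python) shows one
way such a comparison might be made.  The report specifies no such tool, and
the permit categories and field names used here are hypothetical.

    # Illustrative sketch only; not part of the audit protocol.
    from collections import Counter

    def permit_mix(permits):
        """Return each permit category's share of the given (nonempty) list."""
        counts = Counter(p["category"] for p in permits)
        total = sum(counts.values())
        return {cat: n / total for cat, n in counts.items()}

    def representativeness(audited, issued):
        """Audited share minus issued share, per category.

        Values near zero suggest the audited sample mirrors the agency's
        overall permitting activity for that category.
        """
        audited_mix = permit_mix(audited)
        issued_mix = permit_mix(issued)
        categories = set(audited_mix) | set(issued_mix)
        return {cat: audited_mix.get(cat, 0.0) - issued_mix.get(cat, 0.0)
                for cat in categories}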

     Assuming that such corrective actions succeed in alleviating many of
the problems experienced during the FY 1984 audit, one remaining problem
needs to be considered:  permit documentation.  The overall success of a
permit file audit depends to a large degree on the information available
within each permit file.  No drastic changes in work practices or
administrative procedures are being suggested.  However, complete,
well-documented files are needed because they allow the auditor to complete
the auditing task more expeditiously and, perhaps more importantly, they
provide a clear trail through the agency's preconstruction review process.
Furthermore, understanding how and why particular determinations and
decisions were made is not only an important part of the audit process, but
is also essential in the event of enforcement actions, which may arise for
any number of reasons, such as citizen complaints, lawsuits, and compliance
actions.

D.  COMPLIANCE ASSURANCE

     The major parts of the FY 1984 compliance assurance audit were
envisioned to be performance of a pre-visit assessment of each State air
compliance program, review of State source files, and performance of
overview inspections.  For the pre-visit assessment, the EPA Regions were to
analyze the compliance data system (CDS) data for State progress in
meeting compliance goals, and then summarize the findings in a report to
the State prior to the audit.  The actual State audit visit was to concentrate
on the findings in the pre-visit assessment as well as file reviews of 15-20
sources from the three air programs dealing with State implementation plans
(SIP's), new source performance standards (NSPS), and national emission
standards for hazardous air pollutants (NESHAP).   Finally,  overview inspec-
tions were to be conducted by EPA of 2-3 percent of the CDS source inventory
(these could be done jointly with State personnel).  The Regions were then
to write audit reports for each State and include a discussion of the
pre-visit assessment, file status, and overview inspections.
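
     For illustration only, the following minimal sketch (in Python) shows the
two simple computations this protocol implies:  summarizing compliance status
from a CDS extract and sizing the overview inspection sample at the 2-3
percent rate.  The CDS field names are hypothetical, and no such program is
specified in the audit guidance.

    import math

    # Hypothetical CDS extract:  one record per source in the inventory.
    cds_sources = [
        {"id": "NC0001", "program": "SIP",    "in_compliance": True},
        {"id": "NC0002", "program": "NSPS",   "in_compliance": False},
        {"id": "NC0003", "program": "NESHAP", "in_compliance": True},
    ]

    def compliance_rate(sources):
        """Fraction of the source inventory reported in compliance."""
        return sum(s["in_compliance"] for s in sources) / len(sources)

    def overview_sample_size(inventory_size, fraction=0.02):
        """Sources to inspect at the 2-3 percent overview rate (2% here)."""
        return math.ceil(inventory_size * fraction)

    print(f"Compliance rate: {compliance_rate(cds_sources):.0%}")
    print(f"Overview inspections for a 1,500-source inventory: "
          f"{overview_sample_size(1500)}")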

     Because detailed instructions for the compliance assurance portion of
the audit were minimal, the EPA Regions had the flexibility to develop the
exact make-up of each audit within the confines of general guidance on the
subject.  Consequently, there was little consistency among, or even within,
some Regions in the content and format of the audit reports.  As noted in
Chapter V of this report, deficiencies such as the lack of a pre-visit
assessment summary and the lack of any discussion of overview inspections,
along with these inconsistencies, made it difficult to write the national
summary on compliance assurance.  Other deficiencies also made it difficult
in some cases to determine exactly what an audit consisted of.

     It is clear from the results of the FY 1984 audit that a more detailed
format is essential for the compliance assurance program in FY 1985.  The
EPA/STAPPA/ALAPCO air compliance committee will address this need.  The new
format will cover essentially the same areas as found in the FY 1984


                                  VII-4

-------
audits and will include specific, "bottom line" questions on each Region's
opinion of the State's air compliance program, based on the facts gathered
during the audit.  A uniform set of questions to be used by all ten Regions
will ensure consistency in the NAAS effort and provide an accurate basis for
national comparison of State compliance programs.

     Regarding overview inspections, the Regions should be performing
them now so the results can be included in next year's reports.  A summary
of the overview inspection effort with results is required in every FY 1985
State or local compliance assurance audit report.

E.   AIR MONITORING

     The air monitoring portion of the FY 1984 audit effort was intended
to provide an overview and summary assessment of the ambient air monitoring
program.  The interim monitoring questionnaire was not intended to replace
the systems audits required by the monitoring regulations, but rather to
impose a greater degree of Regional consistency in the performance of audits
than had occurred in the past.  We believe that these objectives have been
met in this audit.  Forty-eight States and some local agencies were audited,
and a high percentage of those responding provided useful information,
although in varying degrees.

     This initial  audit served as an appropriate independent confirmation
of the status of the major aspects of the ambient air monitoring program
carried out by a majority of the State and local agencies.  The results
of the audit correspond very well with previous EPA evaluations of air
monitoring network design and siting, data submission, and quality assurance.
In addition to the goals stated above, the audit process appeared to  be a
good, although not a fully utilized, mechanism for documentation of specific
program needs.  For example, several agencies used the questionnaire  to
express specific needs such as replacement of worn-out or obsolete equipment.

     The principal problems encountered with the FY 1984 air monitoring
audit program appear to be related to the ambiguity or inappropriateness
of some of the questions in the audit questionnaire.  For example, one of
the questions was stated as follows:  "Are the Agency's SLAMS instruments
designated as reference or equivalent methods by EPA and operated in
accordance with 40 CFR 50, 53, and 58?"  The problem with this question
comes when trying to interpret what an unelaborated "no" response means.
Does it mean the instruments are not equivalent?  If so, how many:  all,
some, or 1 out of 100?  The agencies were requested to elaborate on all
negative responses to audit questions; however, there were a significant
number of negative responses to one or more questions with no elaboration.
Another example of a problem question was one that requested that
the precision and accuracy values be submitted for the last 2 years.
This request amounts to a substantial resubmission of data already in EPA's
possession.
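
     For illustration only, here is a minimal sketch (in Python) of the kind
of screening implied above, flagging negative responses that lack the
requested elaboration.  The record layout is hypothetical; no automated
screening was part of the FY 1984 audit.

    # Hypothetical questionnaire records:  answer plus optional elaboration.
    responses = [
        {"question": "Q12", "answer": "no",  "elaboration": ""},
        {"question": "Q13", "answer": "no",  "elaboration": "2 of 40 units"},
        {"question": "Q14", "answer": "yes", "elaboration": ""},
    ]

    def unelaborated_negatives(responses):
        """Return questions answered 'no' with no elaboration provided."""
        return [r["question"] for r in responses
                if r["answer"].lower() == "no"
                and not r["elaboration"].strip()]

    print(unelaborated_negatives(responses))   # prints ['Q12']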

     To improve the air monitoring audit process for FY 1985, the questionnaire
has been redesigned to improve the information received and facilitate its
completion.  Many agencies objected to the FY 1984 questions that required
                                  VII-5

-------
resubmission of data that EPA already had in its possession.  Such requests
will be replaced by a process for confirming the data EPA already holds.

     An additional area of improvement in the FY 1985 audit process is the
inclusion of a "corrective action agreement" in the air monitoring audit.
A new audit protocol being developed for the EPA Quality Assurance Handbook
will contain a "corrective action agreement" for use in systems audits.
This agreement will summarize deficiencies discovered during an audit and
provide for an agreed-upon course of action to correct them.  It should
serve as a very useful tool to begin correcting deficiencies discovered
during the audit, especially minor problems.
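
     As a purely illustrative sketch (in Python), a corrective action
agreement might be recorded as a simple list of deficiency/action pairs.
The handbook protocol defines no data format; every field name here is
hypothetical.

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class CorrectiveAction:
        """One deficiency found in a systems audit and the agreed remedy."""
        deficiency: str            # what the audit found
        agreed_action: str         # what the agency agreed to do
        responsible_party: str     # who will carry it out
        target_date: date          # when it is due
        completed: bool = False

    @dataclass
    class CorrectiveActionAgreement:
        """Summary of audit deficiencies and the agreed course of action."""
        agency: str
        audit_date: date
        actions: list = field(default_factory=list)

        def open_items(self):
            """Deficiencies not yet corrected."""
            return [a for a in self.actions if not a.completed]

    # Hypothetical example entry:
    agreement = CorrectiveActionAgreement("Example State Agency",
                                          date(1985, 3, 1))
    agreement.actions.append(CorrectiveAction(
        deficiency="SLAMS analyzer overdue for calibration",
        agreed_action="Recalibrate and document per the QA Handbook",
        responsible_party="Agency QA coordinator",
        target_date=date(1985, 6, 1)))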
                                  VII-6

-------
                          TECHNICAL REPORT DATA

 1. REPORT NO.:  EPA-450/2-84-009
 2.
 3. RECIPIENT'S ACCESSION NO.:
 4. TITLE AND SUBTITLE:  National Air Audit System - FY 1984 National Report
 5. REPORT DATE:  December 1984
 6. PERFORMING ORGANIZATION CODE:
 7. AUTHOR(S):
 8. PERFORMING ORGANIZATION REPORT NO.:
 9. PERFORMING ORGANIZATION NAME AND ADDRESS:
      Office of Air Quality Planning and Standards
      U.S. Environmental Protection Agency
      Research Triangle Park, North Carolina 27711
10. PROGRAM ELEMENT NO.:  13A2A
11. CONTRACT/GRANT NO.:  68-02-3892
12. SPONSORING AGENCY NAME AND ADDRESS:
      Director, Office of Air Quality Planning and Standards
      Office of Air and Radiation
      U.S. Environmental Protection Agency
      Research Triangle Park, North Carolina 27711
13. TYPE OF REPORT AND PERIOD COVERED:  Final - FY 1984
14. SPONSORING AGENCY CODE:  EPA/200/04
15. SUPPLEMENTARY NOTES:
16. ABSTRACT:

       The National Air Audit System, which was jointly developed by EPA and
   representatives of State and local air pollution control agencies, was
   implemented for the first time in FY 1984.  The system audited air
   pollution control activities in 68 State and local agencies in the areas
   of air quality planning and State implementation plan activity, new source
   review, compliance assurance, and air monitoring.  The goals of the audit
   system are to identify obstacles that are preventing State and local
   agencies from implementing effective air quality management programs and
   to provide EPA with quantitative information for use in defining more
   effective and meaningful national programs.  The report for FY 1984
   indicated that, for the most part, State and local agencies have sound
   programs in each of the four audited areas.  Areas of possible improvement
   were found, however, which will be the focus of various remedial
   activities.

17. KEY WORDS AND DOCUMENT ANALYSIS
    a. DESCRIPTORS:  Air pollution; Air audit; Air quality planning;
       New source review; Compliance assurance; Air monitoring
    b. IDENTIFIERS/OPEN-ENDED TERMS:  Air Pollution Control
    c. COSATI Field/Group:  13B
18. DISTRIBUTION STATEMENT:  Release unlimited.  Available through NTIS.
19. SECURITY CLASS (This Report):  Unclassified
20. SECURITY CLASS (This page):  Unclassified
21. NO. OF PAGES:  117
22. PRICE:

EPA Form 2220-1 (Rev. 4-77)                 PREVIOUS EDITION IS OBSOLETE

-------