EPA-450/2-85-009
National Air Audit System
 FY 1985 National Report
     Control Programs Development Division
  U.S. ENVIRONMENTAL PROTECTION AGENCY
          Office of Air and Radiation
    Office of Air Quality Planning and Standards
    Research Triangle Park, North Carolina 27711

             December 1985

-------
This report has been reviewed by the Office of Air Quality Planning
and Standards, EPA, and approved for publication.   Mention of trade
names or commercial products is not intended to constitute endorsement
or recommendation for use.  Copies of this report are available through
the Library Services Office (MD-35), U.S. Environmental Protection
Agency, Research Triangle Park, North Carolina 27711; or,  for a fee,
from the National Technical Information  Service, 5285 Port Royal Road,
Springfield, Virginia 22161.
                     Publication No. EPA-450/2-85-009

-------
                                    CONTENTS
   I.  OVERVIEW OF AUDIT FINDINGS 	      I-1

  II.  INTRODUCTION	      II-1

 III.  AIR QUALITY PLANNING AND SIP ACTIVITIES
           Executive Summary	      III-1
       A.  Air Quality Evaluation	      III-4
       B.  Emissions Inventory	      III-10
       C.  Modeling	      III-20
       D.  SIP Evaluation and Implementation	      III-32

 IV.   NEW SOURCE REVIEW
           Executive Summary	      IV-1
       A.  Introduction	      IV-4
       B.  Summary of Major Findings	      IV-6
       C.  Public Notification Procedures 	      IV-12
       D.  Applicability Determinations 	      IV-14
       E.  BACT/LAER Determinations 	      IV-18
       F.  Ambient Monitoring 	      IV-21
       G.  Ambient Air Quality Analysis	      IV-22
       H.  Emissions Offset Requirements	      IV-27
       I.  Permit Specificity and Clarity 	      IV-28

  V.   COMPLIANCE ASSURANCE
           Executive Summary	      V-1
       A.  Introduction	      V-2
       B.  Major Findings and Conclusions 	      V-3
       C.  Periodic Review and Assessment of Source Data	      V-4
       D.  File Review	      V-6
       E.  Overview Inspections 	      V-7

 VI.   AIR MONITORING
           Executive Summary	      VI-1
       A.  Introduction	      VI-1
       B.  Major Findings and Conclusions 	      VI-2
       C.  Network Design and Siting	      VI-3
       D.  Resources and Facilities	      VI-5
       E.  Data and Data Management	      VI-7
       F.  Quality Assurance/Quality Control	      VI-9

VII.   VEHICLE INSPECTION/MAINTENANCE
           Executive Summary	      VII-1
       A.  Introduction	      VII-2
       B.  Major Findings and Conclusions 	      VII-3
       C.  Enforcement	      VII-5
       D.  Reported Failure Rate	      VII-6
       E.  Waiver Rates	      VII-6
       F.  Analyzer Quality Assurance 	      VII-7
       G.  Data Analysis	      VII-8
       H.  Quality of I/M Repairs	      VII-8
       I.  Evaluation of the FY 1985 Air Audit Effort	      VII-9

                                        iii

-------
                       I.  OVERVIEW OF AUDIT FINDINGS
     The air quality management objectives of the Clean Air Act  (CAA)  are
to attain the national  ambient air quality standards (NAAQS)  as  expedi-
tiously as practicable, maintain them thereafter, and prevent significant
air quality deterioration.  The CAA gives most of the responsibility for
attaining these objectives to the States.  Therefore, the overall  goal of
the National Air Audit System (NAAS) is to determine if State and  local
air pollution control programs are achieving the CAA's air quality manage-
ment objectives.  In spite of some minor shortcomings, the FY 1985 audit
effort provided a good assessment of the health of the national  air quality
management program.

     In FY 1985, the State and local air pollution control agencies earned
high marks in some areas, particularly considering the difficulty and
magnitude of the air quality management task and the limited resources and
tools available to do the job.  Their overall performance in meeting all
the literal objectives of the CAA, however, shows room for improvement.
Some of the deficiencies are minor.  In other cases,
programs are under way at the State or national level to correct the
deficiencies.  There are, however, some deficiencies that, if left
unattended, could threaten the overall health of some air quality
management programs.  To support this conclusion, it is necessary to
identify the activities that make up an air quality management program,
specify how those activities are to be conducted under Environmental
Protection Agency (EPA) regulations and guidance or by recognized good
practice, and then compare those activities and practices to what was
found during the FY 1985 audit program.

Air Quality Data

     The starting point for air quality management is ambient air  quality
data.  Is the air quality good and, thus, in need of protection  against
significant deterioration by potential  new sources or is it poor,  dictating
the need for additional control of existing sources and offsetting emission
reductions for new sources?  In order to make these determinations, it is
necessary to have a sufficient quantity of high quality ambient  air
monitoring data available in a timely fashion.  The FY 1985 audit  showed
that States have done a commendable job in establishing and operating
ambient monitoring networks for criteria pollutants and that good  quality
data are generally available.  The audit indicates that State and  local
agencies have continued their successful  performance in operating  and
maintaining their State and local  air monitoring stations (SLAMS)  and
national air monitoring stations (NAMS).  No significant or widespread
problems were discovered concerning network design or monitoring siting.

     The audit showed that timeliness of data submittal remains a problem
for many agencies, particularly for submission of lead (Pb) data, which are
late about 25 percent of the time.  Another problem in the data management
section of the audit was the failure of 24 percent of the agencies to
submit the required annual SLAMS report.  Corrective


                                    I-1

-------
actions are being taken to resolve these problems through early
identification of late data or deficient annual SLAMS reports and prompt
notification of the appropriate Regional Office for follow-up.

     The audit indicates that most of the State and local  agencies are
doing a good job of maintaining adequate and up-to-date quality assurance
plans.  The only significant problems with respect to the achievement of
quality assurance goals involve accuracy for nitrogen dioxide (NO2) and
precision for Pb.  The low values for NO2 accuracy and Pb precision are
believed to be related to the complexity of the NO2 measurement method
and the low ambient Pb levels.

     The audit indicated that 82 percent of the audited agencies needed new
or replacement air monitoring or laboratory equipment.  Total equipment
needs were approximately $4.6 million, with $3.6 million required for
monitoring equipment and $1 million for laboratory equipment.  This
finding indicates that data completeness and reliability problems will be
encountered if this area receives no attention.  Therefore, $2.4 million
has been included in FY 1986 section 105 grant funds to replace carbon
monoxide (CO) and ozone (O3) monitors and to procure size-selective
particulate matter monitors.  Similar funding levels are anticipated for
FY 1987.

Air Quality Evaluation

     Air quality data are of value only if they trigger appropriate action
via the air quality management process.  Once the data are collected, they
must be evaluated to determine the appropriate control action.  The FY 1985
audit showed that
over 90 percent of the agencies published annual air quality reports and over
80 percent evaluated attainment and nonattainment area designations based on
those data.  The audit and a subsequent survey also showed that action has
been initiated by the audited agencies on all  relevant new violations  of the
NAAQS.  Action, however, generally does not include redesignations under
section 107 of the CAA from attainment to nonattainment.   Thus, the viability
of the section 107 designation program as a continuing and comprehensive
planning tool is seriously in question.

Emission Inventory

     In order to develop a control strategy, it is  necessary to relate air
quality to source emissions using such tools as an  air quality model.   A
good emission inventory is an absolute prerequisite for such an analysis.
The audit indicated that almost all agencies have inventories for  major
point sources, but only 55 percent of the agencies  maintain an area source
inventory.  Point source inventories are relatively current—and agencies
plan to keep them current via the new source and operating permit  programs—
but few agencies routinely modify their inventories to reflect sources'
compliance with regulations.  The audit suggests that area source
inventories are not as comprehensive as they should be; some categories
are missing altogether.  Mobile source inventories seem to be almost
neglected by many agencies that do not have direct responsibility for
this inventory.

                                     I-2

-------
Furthermore, only two-fifths of the agencies indicate that they use the
most current factors to update their mobile source emission inventory.

     A number of State and local  agencies have begun, or are beginning, to
develop inventories of potentially toxic substances, but these are primarily
for point sources and cover groups of sources and pollutants that differ
greatly from agency to agency.

     While many point source inventories appear adequate as a starting
point for State implementation plan (SIP) modeling and development of
control strategies, most area and mobile source inventories receive less
frequent updates and periodic reevaluation.  This lack of attention may
create problems in dealing with urban ozone, where area source emissions
of volatile organic compounds (VOC) may exceed point source emissions.
This lack of emphasis on area and mobile source
emissions is indicative of a generic transition problem that air pollution
control agencies seem to be experiencing: a shift from a program focused
on controlling stack emissions from a few big sources with well understood
control technology to one of controlling many relatively small, poorly
recognized sources with control technology that has not previously been
widely used, as well as nontraditional nonstack sources.  Accordingly,
because of the lesser quality
of some mobile and area source inventories, "reasonable further progress"
tracking is not as precise a tool for monitoring progress toward attainment
of the NAAQS as EPA had originally intended.

Modeling

     As noted previously, air quality models are used to relate emissions
to ambient air quality and, hence, to determine the necessary control
measures.  The FY 1985 audit indicated that most agencies are knowledgeable
and capable of performing and reviewing most routine modeling analyses.
However, the State/local agencies generally do not follow EPA procedures in
using nonguideline models.  The agencies also appear to have problems in
implementing EPA modeling guidance for sophisticated models, as approxi-
mately one-third of all modeling analyses submitted to EPA had to be
returned for revisions.  Modeling analyses for emission trades ("bubbles")
fared worse than the average, with over four-fifths being returned to the
States.  It is interesting to note that modeling analyses (all  types)
performed by industry and submitted to the State/local  agencies for concur-
rence had a similar revision rate of approximately one-third; bubble modeling
had a higher return rate of two-thirds.

     It is anticipated that the issuance of the revised EPA modeling
guideline, and the increasing availability of updated modeling training
courses, will ease the overall implementation problems currently affecting
State/local agencies.  The modeling efforts associated with bubble analyses
appear incomplete, however.  This indicates that more advance interaction
is needed among sources, States, and EPA before any modeling provisions of
a "generic" bubble rule can be made workable.
                                    I-3

-------
SIP Revisions

     The CAA clearly envisions that control agencies will periodically
evaluate source/receptor relationships,  i.e., the  relationship between
air quality data and source emissions  data,  and  revise  their SIP control
strategies and resultant emission  regulations accordingly.  If the  State
does not act in these situations,  the  CAA requires  EPA  to initiate  State
action by calling for revisions to the SIP.  The FY 1985 audit found the
following with respect to State and  local  development and implementation of
their SIP's:

     1.  While the majority of  agencies  are  making  progress in submitting
the required rules, 44 percent  of  the  SIP revisions that were either newly
due or overdue in 1984 had not  been  completed by the agencies and,  of those
completed, 27 percent had not been submitted to  EPA.

     2.  Almost half of the additional studies that the agencies had
committed to complete by the end of  1984 as  part of their SIP control
strategies had not been completed  and  64 percent of these studies were
behind schedule.

     3.  Agencies regularly consult  with EPA concerning bubbles and other
source-specific SIP revisions.   However, the audit  and  a subsequent survey
of EPA Regional Offices revealed problems with "generic" bubble rules
(i.e., rules in which emission  trades  are granted  by the States directly
without the need for EPA approval).   Information available revealed that
some States operated generic bubble  rules without  prior EPA approval.  This
practice places sources that have  received such  bubbles in jeopardy of
potential EPA action to enforce the  EPA-approved SIP limit.  Also,  the
limited information available indicated  problems with many generic  bubbles
that had been issued.

     4.  While most agencies took  action (SIP revisions, studies, monitoring,
enforcement) when discrepancies were found between  actual source activity
(i.e., increases or decreases in growth) and past  projections, many of the
agencies performed these evaluations at  intervals  greater than 5 years.


     It was clear from the audit that a lack of resources and technical
capabilities contributes to some of the SIP problems just mentioned.  It was
also clear, however, that in some  instances  the  problems were attributed to
too low a priority being given  to  SIP  development  and tracking by some
States.  Upcoming efforts to develop and implement  the  new stack height
regulations, a post-1987 O3 policy, and a revised particulate matter standard
should significantly stimulate  the SIP development  efforts in most  States.

I/M Programs

     The CAA required that SIP's for O3 and CO nonattainment areas with
attainment date extensions to 1987 provide for vehicle  inspection and
maintenance (I/M).  The results of the 16 I/M program audits conducted  in
FY 1984 and FY 1985 indicate that  a  number of serious problems exist in


                                    I-4

-------
some I/M programs.  These problems must be addressed  if I/M programs  are
going to achieve the emission reduction targets  committed  to in  the SIP's.
Four of the six problem areas identified—enforcement problems,  low failure
rates, high waiver rates, and poor quality repairs—have a direct  bearing on
the emission reductions achieved by I/M programs.   The other two problem
areas—analyzer quality assurance and data analyses—have  more  indirect
effects; however, they reflect to an extent the  States'  ability  (or inability)
to assess program performance and improve it when  necessary.

     Twelve of the 16 programs audited were identified as  having critical or
serious operating problems.  Nine of these 12 are  decentralized  programs and
the other three are government-run, centralized programs.  The most chronic
and challenging problem facing EPA is low failure rates, especially in
decentralized programs.

     The programs audited represent a good cross-section of the  types  and
sizes of the 33 I/M programs currently operating nationwide.  It is,  there-
fore, reasonable to expect that many of the programs  which have  not yet been
audited are experiencing many of these same problems.  This position  is
supported by current operating data available from some of the other
programs.

     The audit results clearly identify the need for  EPA to continue  its
efforts to work with each State or local  I/M program  to address  problems
identified in audits.  In addition, EPA should give a high priority to
conducting I/M audits in the remaining programs  as soon as possible.

New Source Review

     The CAA anticipates that the review of new  sources by the States  will
be one of the main mechanisms by which they attain and maintain  the NAAQS
and prevent significant air quality deterioration. A State's new  source
review (NSR) program must be designed and implemented to prevent a new
source from aggravating an existing air quality  problem or creating a new
problem where one does not already exist.  The FY  1985 audit verified  that
State and local agencies are generally familiar  with  and have strong  support
for the preconstruction review process.

     This year's audit findings support and often amplify the findings
from the FY 1984 NSR audit.  The findings indicate that most agencies  perform
their overall NSR program responsibilities reasonably well, although,  in a
number of State and local agencies, problems were  identified with  respect
to the consistent and adequate application of certain specific program
requirements.  In addition, auditors occasionally  mentioned that noticeable
improvements had already been observed in the performance  of some  agencies
where specific deficiencies had previously been  found.  This is  certainly
to the credit of these agencies in their efforts to improve in-house  perfor-
mance and to strive for a greater level of national consistency.  Overall,
however, EPA auditors often cited the lack of adequate file documentation
as a hindrance to a complete evaluation of the agencies'  permit  review pro-
cedures.  With respect to specific audit topics, EPA  continued to  find
significant problems with the way that many agencies  carried out their
source applicability determination procedures, with the lack of  thorough

                                    I-5

-------
ambient air quality impact analyses in some instances, and  with the way
that different methods were used to require operating  limitations  on new
and modified sources.  The audit once again noted the  overall  tendency of
agencies to rely on New Source Performance Standards  (NSPS) in defining
best available control technology (BACT).  The audit also confirmed that
agencies are typically willing to allow prevention of significant
deterioration (PSD) applicants to use existing data in lieu of new
monitoring data, but raised a new concern when EPA found that the basis
for such actions was not always well documented or in complete conformance
with existing EPA criteria for representative data.

     Yet, despite the assortment of problems that the audit identified, it
is still fair to say that most State and local agencies function competently
and generally have their own individual strong points.
For example, some agencies routinely extend the requirement for BACT to non-
PSD sources, or require BACT for all pollutants once  it is  determined to be
required for any pollutant.  The focus on permit files demonstrated that
most agencies are usually more prone to internal inconsistencies than to
routine malpractice.  While not always true, auditors  were  usually able to
find good examples along with the bad whenever selected permit files were
examined.  The EPA hopes that through the audit process many of these
inconsistencies can be identified and then satisfactorily addressed by
agencies within their existing technical and legal  frameworks.  Where
appropriate, EPA will seek to provide the necessary guidance and training
to support State and local agencies in their NSR efforts.

Compliance

     The ultimate success of a State air quality management program relies
on its ability and will  to enforce its regulations. Many States and locals
showed one or more strong points characteristic of a successful  air
compliance program, such as high source compliance rates supported by high
inspection frequency rates, performance of all required NSPS source tests,
expeditious resolution of violators, and few long-term violators.   Other
States had source files that were for the most part well  organized,
up-to-date, and complete, reflecting a reasonable profile of each  source.
These positive points show that most States are fulfilling  compliance and
enforcement responsibilities under the CAA.

     Inspection rates for Class A1* SIP sources generally increased over
those reported in last year's audit, although rates in four States are
still unacceptably low (less than 60 percent).  Compliance rates for
Class A1 SIP sources remained roughly the same as last year.  The NSPS
national average for both inspection and compliance rates rose, even
though some individual State rates declined slightly.  The NSPS inspection
rates for two States, at 33 percent, are still seriously deficient.
Overall, inspection rates for National Emissions Standards for
Hazardous Air Pollutants (NESHAP) remained steady while compliance rates
* Class A1 includes sources with actual or potential controlled emissions
greater than or equal  to 100 tons per year.
                                    I-6

-------
fell slightly.  Fourteen States still  have NESHAP inspection rates  at
or below 55 percent.  This information shows that performance did  not
change significantly from last year's  audit.  In  the coming  fiscal  year,  the
States and locals will be working to further improve compliance of  sources.

     The audits also revealed that several  State  and local  agencies, to a
varying extent, still  have weaknesses  in three areas vital  to a strong  and
effective compliance program.  First,  some source files maintained  by State
and local agencies do  not contain verifiable information reflecting a
reasonable profile of each source.  However, there has been some improve-
ment since last year's audits in the condition of State files, where the
percentage of those reviewed that reflected a reasonable profile of the
sources increased from 58 percent to 72 percent.   Second, some inspection
reports still are of poor quality (no mention of  operating  or emission
parameters or pollutants emitted).  For some agencies, there was a  notice-
able improvement in the quality of inspection reports since the last review,
but there remain significant deficiencies in this area.  Third, some of the
reviewed agencies' enforcement efforts are not always effective in  reducing
the number of long-term violators by expeditiously returning documented
violators to compliance, although there was a slight drop in the percentage
of reports that indicated sources were not being  expeditiously returned to
compliance (from 30 percent down to 26 percent).

     Thus, while there are improvements in all  of these critical  areas,
some States and locals need to heighten efforts in the aforementioned three
areas to further strengthen their compliance programs.  Success in these
three areas is vital to the establishment and maintenance of State and
local credibility with EPA and the public.

Next Steps

     The EPA Regional Offices are now working with the State and local
agencies to correct the deficiencies identified in the FY 1985 audit.
They will continue this effort into FY 1986.  In
addition, EPA intends  to use the results of the FY 1985 audit in its program
planning and budgeting cycle in order  to assure that resources are  directed
to areas of highest need.  After the FY 1984 audit,  EPA convened a  symposium
to develop specific recommendations for future improvements  to the  air
program based upon what was learned in the audit.  The symposium included
representatives from EPA and the State and local  air pollution control
agencies.  A report on the implementation of these recommendations  will be
distributed by EPA early in 1986.  A similar report  is planned by EPA based
upon the results of the FY 1985 audit.
                                    I-7

-------
                            II.  INTRODUCTION
The NAAS was developed in 1983 through a joint effort of the State and
Territorial Air Pollution Program Administrators (STAPPA), the Association
of Local Air Pollution Control Officials (ALAPCO), and EPA.  The NAAS
provides uniform national criteria for evaluating (auditing) State and
local  air pollution control  programs.  Such nationally applicable criteria
minimize inconsistency among program audits carried out by EPA's ten Regional
Offices.

     The need for the NAAS evolved as State and local air pollution
control  agencies assumed responsibility under the Clean Air Act (CAA) for
an increasing number of programs.  The EPA responded to the concerns of
the STAPPA and ALAPCO members by agreeing to participate in a STAPPA/
ALAPCO/EPA workgroup.  The workgroup set forth to develop and direct the
implementation of an auditing system that would ensure the desired national
consistency and would confirm that State and local air pollution control
programs were operating in such a manner as to satisfy the national
requirements of the CAA.

     The workgroup decided that the primary goals of the NAAS should be
to identify any obstacles that are preventing State and local agencies
from implementing an effective air quality management program and to
provide EPA with information which can be used to develop more effective
and meaningful  national programs.  The NAAS should provide audit guidelines
that EPA and State and local agencies can use (1) to meet statutory
requirements; (2) to assist in developing an acceptable level of program
quality; (3) to account for the achievements, shortcomings, and needs of
various air programs; (4) to identify programs needing further technical
support or other assistance; and (5) to manage available Federal, State,
and local  resources effectively so that the national ambient air quality
standards (NAAQS) are attained and maintained as expeditiously as possible.

     The first audit covered four program areas selected by the workgroup:
air quality planning and SIP activity, new source review, compliance
assurance, and air monitoring.  Standardized audit guidelines for each
program area were written by subcommittees appointed by the workgroup.
The subcommittees were chaired by a State agency person with an EPA staff
person serving as a coordinator.  Local agencies and the EPA Regional
Offices were also represented on each subcommittee.  The workgroup also
developed the protocol for implementing the audit guidelines.

     These guidelines were used for conducting the first NAAS audits in
FY 1984 (ending September 30, 1984).  A national report (EPA-450/2-84-009)
that summarized the results of the 68 audits performed the first year was
issued in December 1984.  The audit guidelines were revised for FY 1985
and vehicle inspection/maintenance was added as a fifth program audit
area.
                                   II-1

-------
     The guidelines were used by EPA Regional  Offices  in  FY  1985  to audit
66 State and local air pollution control programs, including all States
except California, as well as Puerto Rico, the Virgin Islands, and the
District of Columbia.  The California State agency was not audited because
the local district agencies in California are  responsible for implementing
the various air quality management programs.   The local  agencies  audited
were:*

         Allegheny County, PA            Philadelphia, PA
         Asheville, NC                   South Coast AQMD, CA
         Fresno County, CA               Southwest  APCA,  WA
         Jacksonville, FL                St. Louis, MO
         Lane County,  OR                 Tampa, FL
         Nashville, TN                   Toledo,  OH
         Northwest APA, WA               Wayne County, MI


     The STAPPA, ALAPCO, and EPA encouraged State/local  personnel  from one
agency to serve as members of the audit team for  another  agency.   Four
States participated in this activity in FY 1985.  The  agencies that
participated in the audit exchanges  listed the following  benefits of the
program:

     It provides an opportunity to compare their  agencies' programs with
     the host State's  program to see what improvements can be "transplanted."

     It fosters communication with other agencies at the  working  level so
     that common problems can be shared.

     It provides general insight into what other  agencies are doing.

All of the participating agencies indicated that  the exchanges were
beneficial and that they would continue their  participation  in the  future
if resources allow.  Such audit team exchanges apparently offer an  excellent
opportunity to learn firsthand how other States operate  their programs.

     The audit teams varied in size; the number of auditors in an agency
at any one time rarely exceeded five.  Generally, not all five program
areas were audited at the same time.  Nor were all program areas audited
in each agency, because not all agencies performed the five activities
selected for audit.

     EPA Headquarters personnel observed ten audits.  This served to
*Additional local agencies were included in the air monitoring audits
 because of the delegated responsibility of operating local  air monitoring
 networks.
                                   II-2

-------
provide a national overview of the audits and was part of the quality
assurance program to which STAPPA/ALAPCO and EPA had agreed.

     The protocol followed by the Regional  Office in conducting the
audits included advance preparation prior to the on-site visit, an initial
meeting with the agency director, discussions with agency staff, review
of the agency files, and an exit interview.

     The advance preparation involved, among other things, sending a
letter to the agency well in advance of the audit to confirm the date and
time and to identify the individuals performing the audit.  The guidelines
and questionnaires were also provided to the agencies with a request to
complete portions of the questionnaire and  to return them to the EPA
Regional Offices at least 2 weeks before the scheduled visit.

     The site visits were conducted generally in four phases:

     o  The audit team met with the agency director and key staff to
discuss the audit goals and procedures to be followed.

     o  The auditors discussed the questionnaire with the personnel in
charge of each of the five audited activities.

     o  The agency files were reviewed to verify the implementation and
documentation of required activities.

     o  An exit interview was held to inform agency management of the
preliminary results of the audit.

     The Regional Offices drafted an audit  report after each site visit
and requested that each audited agency review it.  The individual  agency
audit reports were used by EPA to compile and write this FY 1985 national
report.
                                   II-3

-------
                III.  AIR QUALITY PLANNING AND SIP  ACTIVITIES
                              EXECUTIVE SUMMARY


     Four major program components within  the air  quality  planning  and  SIP
activities area were evaluated in the FY 1985 audit.   These  components  were
air quality evaluation, emission inventories, modeling,  and  SIP  evaluation.

     This section of the audit again took the form of survey questions that
were, for the most part, answered by the audited agencies.  Generally, an
actual "audit" was not conducted, in that file reviews were generally not
used to confirm the answers given.  A large part of this section does not
lend itself as well to file checks as the other sections (new source review,
air quality monitoring, vehicle inspection/maintenance, compliance assurance),
but some parts might.  Air quality evaluation, emission inventories, and
modeling activities could be verified by selected file reviews.  In the
future, this section of the audit should probably include at least a minimum
amount of file verification to be a meaningful part of the NAAS.

     The FY 1985 audit revealed that the majority  of  the audited agencies
have sound programs in most of these components, but  there are gaps that need
to be filled.  Major findings from each of the components  are described below.

Air Quality Evaluation

     This portion of the audit covers how  air quality data are used by  State
and local agencies for the purpose of section 107  redesignations, handling
newly measured violations of the NAAQS, and  keeping the  public informed.  The
vast majority of agencies make their air quality data available  to  the  public
in a timely fashion.

     Relating ambient data to source impacts is  one important area  where the
audit revealed improvements.  The audit indicated  that 90  percent of the
new violations of the NAAQS were investigated to determine the cause or
identify corrective actions.  A later survey of  the EPA  Regional  Offices
showed that the agencies were investigating  the  remaining  10 percent.

     The majority of the agencies systematically review  section  107 designations
and submit proposed changes to EPA.  Emphasis is strongly  in the direction of
attainment, however, as less than 1 percent  of the requests  for  redesignations
are from attainment to nonattainment, even though  new violations continue to
occur.  Most agencies reported that they review  their attainment status at
least on a yearly basis.

Emission Inventories

     The audit indicated that almost all  agencies  have inventories  for  major
point sources, but only one-half of the agencies maintain  an area source
inventory.  Point source inventories are relatively current, and agencies plan
to keep them current.  Area source inventories are not as comprehensive as
they could be.  The majority of agencies use a fully automated  or partially
automated storage and retrieval system.

     Many State and local  agencies have taken the lead in inventorying sources
of toxic substances.  There is, however, a lack of national  consistency in
the sources and specific toxics being inventoried.  It also  appears that
area sources of toxic substances are not being included in the toxic
inventories.
     This portion of the audit was conducted again in the form  of survey
questions.  Perhaps next year's audit should use the file review method to
verify the condition of the inventories.

Modeling

Air quality models are used to relate emissions to ambient air quality and
hence to determine the necessary control measures.  The FY 1985 audit indicated
that most agencies are knowledgeable and capable of performing  and reviewing
most routine modeling analyses.  However, the State/local agencies generally
do not follow EPA procedures when using non-guideline models.   The agencies
also appear to have problems in implementing EPA modeling guidance for sophis-
ticated models as approximately one-third of all modeling analyses submitted
to EPA had to be returned for revisions.  Modeling analyses for emission
trades ("bubbles") fared worse than average, with over four-fifths being
returned to the States for revisions.  It is interesting to note that general
modeling analyses performed by industry and submitted to the State/local
agencies for concurrence had a similar return rate of approximately one-third.
Bubble modeling performed by industry had an even higher return rate of
two-thirds.


SIP Evaluation

     The CAA clearly envisioned that control agencies would  periodically
evaluate the relationship between air quality data and source emission data
and, where appropriate, submit timely revisions of their SIP control strate-
gies.  The FY 1985 audit found that 44 percent of the due or overdue SIP
revisions had not been completed by the agencies and of those completed, 27
percent had not been submitted to EPA.  The audit also showed that only half
of the additional studies that the agencies had committed to complete by the
end of 1984 as part of their SIP control strategies had been completed, and
64 percent of the remaining studies were behind schedule. The  audit found
that many agencies are not comparing their current ozone or  carbon monoxide
emission inventory with their 1982 SIP projections for the current year.
Continuous emission monitoring is being completely implemented  by agencies in
only 59 percent of all cases.

     The audit and a subsequent survey of the EPA Regional Offices also
revealed problems with "generic" bubble rules (i.e., rules in which emission
trades are handled at the State level without the need for EPA rulemaking).
                                    III-2

-------
Information available revealed that some States operated generic  bubble rules
without ever having received formal EPA approval.   This  practice  places
sources that have received individual  bubbles under these State programs  in
jeopardy of potential EPA action to enforce the EPA-approved  SIP  limits.
                                    III-3

-------
A.   AIR QUALITY EVALUATION

Introduction

     This section contains detailed audit information  on  how air  quality data
are used by State and local agencies for the purpose of section 107
redesignations, trend analyses, prioritization of air  program activities, and
public information.  Three main areas are covered:   air quality reports,
section 107 designations, and new violations.

     In the air quality reports area, the audit asked  how frequently  agencies
published air quality monitoring data, the contents  of the air quality reports,
and the time frame between data acquisition and publication.

     The second area evaluated whether section 107  attainment status
designations were being reviewed and redesignated,  if  applicable.  The number
of redesignations and methods used to review attainment status were  requested.

     The third area was intended to determine the number  of air monitors that
revealed any new violations and the type of action  taken  for each  violation.

Major Findings and Conclusions

     The air quality evaluation part of the audit showed  that 92  percent of
all State agencies and 90 percent of all  local  agencies made air  quality
reports available to the public.  This is an increase  over the FY  1984 audit,
where 83 percent of all State agencies and 78 percent  of  all  local agencies
made air quality reports available.  Eighty percent  of the agencies  reported
that they make their air quality data available to  the public within  6 months
of data acquisition.

     On a weighted average basis, the time between  data collection and report
publication for all agencies that submit reports to  the public is  between
4 and 5 months.  The longest delay reported by any  one agency was  greater
than 18 months.

     Most agencies reported that they use air quality  data in reviewing
attainment/nonattainment designations and prioritizing resources.  Eighty-three
percent of all State and local agencies systematically review section 107
attainment status designations and submit proposed  changes to EPA.  Ninety-one
percent of these agencies review attainment/nonattainment status  at  least on
a yearly basis.  During FY 1984, 106 primary and 166 secondary redesignations
were initiated, while 81 primary and 166 secondary  redesignations  were completed
and submitted to EPA for review.  The redesignation  efforts are generally uni-
directional; less than 1 percent of all initiated actions and less than
2 percent of all completed and submitted actions were  for redesignations from
attainment or unclassifiable to nonattainment.  Thus,  the viability  of the
section 107 designation program as a continuing and  comprehensive  planning
tool is seriously in question.

     Air quality monitoring detected 58 new NAAQS violations in FY 1984.  Of
these, investigative action was initiated at 40 sites  to  determine the cause
of the exceedance, or to initiate enforcement action.  Twelve of  the remaining
sites had total suspended particulate (TSP) violations that control agencies
believe are covered under EPA's rural fugitive dust policy and do not consti-
tute violations of the NAAQS for attainment purposes.   No action  was  taken  by
State and local agencies on six violations.

Response to Individual Questions

Air Quality Reports (A.I.)

     The purpose of this audit question is to determine:  (1) if State and
local  agencies make air quality reports available to the  public;  (2)  the
contents of the air quality reports;  and (3) the time frame between data
acquisition and report publication.  Specific questions regarding the report's
contents are summarized in  Table A-l.   Based on a review  of individual  agency
responses, the same agencies consistently omitted information  listed  in Table
A-l from their air quality  reports.

     The audit determined that 46 out  of 50 State agencies (92  percent) and 9
of 10 local agencies (90 percent) made air quality reports available  to the
public.  This is an increase over the  FY 1984 audit, where 83  percent of all
State agencies and 78 percent of all  local  agencies made  air quality  reports
available.

     The time lag between data collection and publication was  calculated to
average between 4 and 5 months.  All  reporting agencies,  except one,
published air quality reports within  12 months of data acquisition.  Seventeen
agencies (28 percent) reported a 0 to  2 month lag time, 31 agencies
(52 percent) reported a 3 to 6 month  lag time, and 11  agencies  (18 percent)
reported a 7 to 12 month lag time.
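The 4-to-5-month average can be reproduced from the reported lag-time bins.  The sketch below uses bin midpoints, which is an assumption, since agencies reported only ranges:

```python
# Reported lag-time bins (months) and the number of agencies in each bin.
bins = {(0, 2): 17, (3, 6): 31, (7, 12): 11}

# Assume each agency sits at its bin's midpoint (only ranges were reported).
total_agencies = sum(bins.values())
weighted = sum(((lo + hi) / 2) * n for (lo, hi), n in bins.items())
avg_lag = weighted / total_agencies

print(round(avg_lag, 1))  # 4.4 -- i.e., between 4 and 5 months
```

The midpoint assumption is the simplest defensible one; any within-bin distribution would still place the average between 4 and 5 months.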


Section 107 Attainment Status Designation (A.2.)

     The audit determined that 50 State and local  agencies (83  percent)
systematically reviewed section 107 primary and secondary attainment  status
designations and submitted  proposed changes to EPA.  Seven agencies
(12 percent) did not review, and three local agencies  (5  percent)  reported
not applicable because of systematic  review at the State  level.

     Twenty-nine agencies reported that 253 section 107 reviews were  completed
in FY 1984 that did not result in requests for redesignations.  Of these reviews,
177 came from one State agency.  At the same time, 272 primary  or secondary
redesignation actions were  initiated  and 247 actions were completed and
submitted to EPA.  The results are presented in Table  A-2.  Less  than
1 percent of all initiated  actions and less than 2 percent of  all  completed
and submitted actions were  for redesignations from attainment  or  unclassifiable
to nonattainment.  This appears low in view of the fact that 18 agencies
reported that 58 sites recorded new violations of the  NAAQS in  FY 1984.
                                    III-5

-------














       TABLE A-1.  [Summary of agency responses on the contents of air
                    quality reports; table not legible in source.]
                                    III-6

-------
       TABLE A-2.  NUMBER OF REDESIGNATIONS BY STATE AND LOCAL AGENCIES

                                          Initiated         Completed & Submitted
Redesignation                        Primary  Secondary      Primary  Secondary

Attainment or unclassifiable
  to nonattainment                       2        0              4        0

Unclassifiable to attainment            17        2              7        1

Nonattainment to attainment
  or unclassifiable                     87      164             70      165

Total                                  106      166             81      166
     The audit then attempted to determine how agencies review attainment
status.  Five options were listed:  (1) staff notification of violations  to
agency management, (2) consideration of air quality data for the past 2 to 3
years, (3) consideration of modeled exceedances, (4) consideration of control
and emission changes, and (5) investigation into cause of violations.
Responses are shown in Table A-3.  From Table A-3 it can be seen that all
five methods are used by over half of the audited agencies.  The method most
used is the use of air quality data, while modeling is the least used method.
          TABLE A-3.  METHODS USED TO REVIEW ATTAINMENT STATUS

                                                   Number of Agencies
           Method                               Yes       No        N/A

Staff notifications of violations
  to agency management                           38       10         12

Consideration of air quality data
  for the past 2-3 years                         50        1          9

Consideration of modeled
  exceedances                                    28       18         14

Consideration of control and
  emission changes                               32       15         13

Investigation into causes of violations          36       10         14
     Figure A-l shows the frequency of attainment status reviews by pollutant.
Most agencies (92 percent) perform attainment status reviews at least on a
yearly basis.
                                    III-7
                                    III-7

-------
        [Bar chart not legible in source.]

        Figure A-l.  Frequency of agency attainment status
                     reviews by pollutant.
                                 III-8

-------
New Violations (A.3.)

     The purpose of this audit question was to determine the number of air
quality monitors that  revealed any new violations  and  the type of action
taken for each violation.  Eighteen agencies (30 percent)  reported 58 new
violations were detected at air quality monitoring sites in FY 1984.   Table A-4
lists the type of action taken for each violation.  Of the 58 violations
detected, action was initiated on 40 sites (69 percent).  Twelve of the remaining
18 sites had TSP violations that control  agencies  believe are covered under
EPA's rural  fugitive dust policy and are not considered violations of the
NAAQS for attainment purposes.  No action was taken on 6 sites;  however,  a
follow up survey of the EPA Regional Offices indicated that the agencies  are
investigating these violations.
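The disposition of the 58 violations can be tallied directly from the figures above; in this sketch the category labels are paraphrased from the text:

```python
# Disposition of the 58 new NAAQS violations reported for FY 1984.
violations = {"action initiated": 40,
              "TSP / rural fugitive dust policy": 12,
              "no action taken": 6}

total = sum(violations.values())
assert total == 58  # all reported violations are accounted for

pct_acted = 100 * violations["action initiated"] / total
print(f"{pct_acted:.0f}%")  # 69%, matching the audit figure
```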


    TABLE A-4.  ACTIONS TAKEN FOLLOWING AIR QUALITY MONITORING VIOLATIONS

                                Total                                      Action
                                Actions     Action          Action         submitted
        Action                  Taken       initiated       completed      to EPA


Enforcement action based on       3             3               0             0
  existing SIP regulations

Change in an individual           1             1               0             0
  source permit

Source-specific, local,           7             4               1             2
  county, or areawide SIP
  revisions

Additional monitoring             6             6               0             0

Verification by modeling          1             1               0             0
  studies

Data ignored:  a documented       9             6               1             2
  exceptional event

Microscopic examination           7             7               0             0
  of filters

Other                             6             6               0             0

Total                            40            34               2             4
                                    III-9

-------
B.  Emission Inventory

Introduction

    Fifty-nine State and local agencies were interviewed to determine the
completeness, timeliness, and procedures they use to develop, update, and
document their emission inventories.  Agencies were asked to provide infor-
mation on the status of their point source (both major and minor), area
source, and mobile source inventories.  Pollutants of primary interest were
particulate matter (PM), sulfur dioxide (SO2), CO, VOC, oxides of nitrogen
(NOX), and Pb.  Information on inventory activities associated with potentially
toxic air pollutants, including sources affected by NESHAP, was also obtained.

Major Findings and Conclusions

    About 90-95 percent of the 59 agencies maintain  an  emission inventory
for most criteria pollutants; however, only 75  percent  apparently maintain an
inventory for Pb.  Point source emissions were  the major  focus  of these
inventories.  A smaller number of agencies (55  percent) maintain an  area
source inventory.

    Most of the inventories are fairly current.  Approximately  half  of the
(point source) inventories were updated in 1984 or later. About three-quarters
have been updated since 1983.  Moreover, about  80 percent of  the agencies
indicate that they plan to update their inventory within  the  next year.
Nevertheless, a few inventories have apparently not  been  updated in  6-10
years.

    Some inventories are not as comprehensive as perhaps they could be.
For example, some agencies do not include certain sources in  their area
source inventory (e.g., fugitive PM  emissions from open dust  sources,  VOC
emissions from miscellaneous solvent  use, Pb emissions  from  waste combustion).

    About 30 percent of the agencies  have an inventory  of point sources which
emit potentially toxic substances.  The number  of pollutants, as well  as the
specific toxic pollutants inventoried, vary considerably  from agency to
agency.  In almost every case, area  source emissions of toxic substances are
not inventoried.

    The majority of agencies use a fully automated or partially automated
data system for storage and retrieval  of emissions inventory data.   Agencies
using totally manual data processing  procedures are  mostly agencies  that have
a small number of point sources to track.
                                    111-10

-------
    Agencies reported using various  procedures  to  check  the  inventory.  About
two-thirds prepare reports documenting  the  procedures  and  assumptions they
use to compile the inventory, and about half of these  agencies  have  the
report externally reviewed.  Most agencies  also have established various
levels of internal review of the inventory  data.

Response to Individual  Questions

    Approximately 90-95 percent of the  agencies* indicated they maintain  an
emission inventory for PM, S02, CO,  VOC and NOX.   Seventy-five  percent indicated
they maintain an inventory for Pb.  There are,  however,  many differences  in
the scope and content of the inventories from agency to  agency. For example,
while 90+ percent maintain an inventory of  point source  emissions, only
55 percent maintain an area source inventory.  This  is not too  surprising,
since there is no general requirement to maintain  area source inventories on
a routine basis (unless the agency is reporting on Reasonable Further Progress
(RFP) in a nonattainment area).  For mobile sources, most  agencies indicated
that an inventory is available for their particular  area;  however, only 50
percent of these agencies stated it  is  maintained  by their agency.   In many
cases, the inventory is maintained by either the local planning or transpor-
tation agency.

    Table 1 summarizes the types of  inventories maintained by agencies on a
pollutant-by-pollutant basis.
   *Although 59 agencies were interviewed, a lesser number (50 to 55) responded
    to any particular question.  Unless indicated otherwise, the percents listed
    in this report are based on the number of agencies interviewed (i.e., 59).
    To determine the number of agencies responding to a particular question,
    simply multiply the percentage by 59.
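The conversion described in this footnote can be sketched as follows; the `agencies_from_percent` helper and its rounding are illustrative, not part of the report:

```python
def agencies_from_percent(pct, interviewed=59):
    """Convert a reported percentage back to an approximate agency count."""
    return round(pct / 100 * interviewed)

# e.g., the 75 percent reported for Pb inventories corresponds to:
print(agencies_from_percent(75))  # 44 agencies
```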
                                    III-11

-------
   Table 1:  Number Of Agencies Maintaining Various Types Of Emission Inventories

                                              Pollutant
                        PM         S02        CO         VOC        NOX        Pb

Number With Point
Source Inventory     56 (93%)   56 (93%)   54 (90%)   54 (90%)   56 (93%)   44 (75%)

Number With Area
Source Inventory     37 (62%)   28 (47%)   32 (53%)   38 (63%)   31 (52%)   18 (30%)

Number With Mobile
Source Inventory       (a)        (a)      45 (75%)   40 (68%)   29 (43%)     (a)

Number Of Agencies
Interviewed             59         59         59         59         59         59

(  )  Number in parentheses is the percent of agencies, based on the number
      of agencies interviewed (i.e., 59).

(a)  Some, though not all, agencies indicated that they inventory PM,
     Pb and S02 emissions from mobile sources.

    A few observations are  worth noting.   First, with the  exception of Pb,
there are generally only slight variations in  the percents  reported among
pollutants.  This is a relatively  consistent  finding that  will be seen in
other comparisons.  Also of interest is the finding that only 43 percent of
the agencies maintain an NOX inventory of  mobile sources.   While it is true
there are only a few nonattainment areas for  NOX, an NOX inventory may be
useful  for modeling analyses in ozone  nonattainment areas.  As such, a similar
number of VOC and NOX mobile source  inventories would have been expected.

Timeliness Of The Inventory

    The audit results provide an indication of how current  the inventories
are.  Information was obtained with  respect to when the  "complete" inventory
was last updated and Table  2 presents  this information.  About half of the
agencies report that the complete  inventory was updated  in  1984, or later,
while about three quarters  indicate  an update  since the  beginning of 1983.
                                    111-12

-------
            Table 2:  Number Of Agencies Indicating When The
                      Complete Emission Inventory Was Updated

                                        Pollutant
Year Of Update      PM         S02        CO         VOC        NOX        Pb

1984 (or later)   31 (53%)   31 (53%)   29 (49%)   30 (51%)   30 (51%)   18 (31%)
1983              13 (22%)   13 (22%)   12 (20%)   15 (25%)   14 (23%)    9 (15%)
1982               7 (12%)    7 (12%)    8 (13%)    5 ( 9%)    8 (13%)    7 (11%)
1981               1 ( 3%)    1 ( 3%)    1 ( 3%)    1 ( 2%)    1 ( 3%)    1 ( 2%)
1976-1980          4 ( 6%)    3 ( 4%)    3 ( 5%)    3 ( 5%)    3 ( 5%)    9 (15%)
1975 (or before)   0 ( 0%)    0 ( 0%)    1 ( 3%)    0 ( 0%)    0 ( 0%)    0 ( 0%)
No Response        3 ( 5%)    4 ( 7%)    5 ( 8%)    5 ( 8%)    3 ( 5%)   15 (26%)
Mobile Source Emission Factors Used  in  SIP's

    Table 3 provides information  with  respect  to  the mobile source emission
factors used in the base year mobile source  inventory of the current SIP.
              Table 3:  Mobile Source Emission Factors
                        Used In Current SIP's

                    Mobile 1    Mobile 2 or 2.5    Mobile 3    No Response
Number of SIP's:    15 (25%)       26 (44%)         7 (12%)     11 (19%)
    As can be seen, 41 agencies  (69  percent)  indicated that either Mobile 1,
Mobile 2, or Mobile 2.5 was used  to  estimate  emissions in the current SIP.
This suggests that many mobile source  inventories  are outdated, since Mobile 3
is now available and revises emission  estimates  upward considerably, particu-
larly for VOC.  Also, only 24 agencies (41  percent)  indicated that they now
use Mobile 3 to update the inventory.   It  is  not clear why more agencies
would not use Mobile 3 to update  their mobile source inventory.
                                    111-13

-------
Point Source Inventories

    The audit obtained information concerning the emission rate cutoff
level which agencies use to define a point source.  The survey revealed
that roughly 50 percent of the agencies use 25 tons/year, or less, to define
a point source.  Pollutant-by-pollutant information concerning cutoff levels
is summarized in Table 4.
               Table 4:  Number Of Agencies Using Different
                         Emission Rates To Define A Point Source

                                        Pollutant
Cutoff Level
(Tons/Year)         PM         S02        CO         VOC        NOX        Pb

1-5               15 (26%)   15 (26%)   13 (22%)   12 (21%)   13 (22%)   25 (43%)
6-10               3 ( 5%)    4 ( 7%)    3 ( 6%)    4 ( 7%)    3 ( 6%)    2 ( 3%)
11-25             15 (25%)   11 (18%)    9 (15%)    9 (15%)    9 (15%)    4 ( 7%)
26-50              1 ( 2%)    4 ( 7%)    1 ( 2%)    5 ( 8%)    5 ( 8%)    0 ( 0%)
51-100             8 (13%)    7 (12%)   11 (18%)    9 (15%)    9 (15%)    0 ( 0%)
> 100              0 ( 0%)    0 ( 0%)    2 ( 3%)    0 ( 0%)    1 ( 2%)    0 ( 0%)
No response       17 (25%)   18 (31%)   20 (34%)   20 (34%)   19 (32%)   28 (47%)
    Similar information was obtained  to  determine  the cutoff  level  agencies
use to distinguish between a "major"  point  source  and a  "minor" point source.
With the exception of Pb, most  agencies  (approximately 75 percent)  indicated
that they considered sources emitting 100 tons/year, or  greater, as major
point sources.  Most agencies use 5 tons/year,  or  greater, to define a "major"
source of Pb.  The significance of being considered  a "major" or "minor"
source is that less attention is generally  devoted to minor sources with
respect to obtaining, updating  and quality  assuring  emissions data.
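The major/minor distinction most agencies reported can be expressed as a simple classification rule.  This sketch assumes the 100 tons/year cutoff (5 tons/year for Pb) that most agencies cited; individual agency practice varies:

```python
def classify_source(pollutant, tons_per_year):
    """Classify a point source as 'major' or 'minor' using the cutoffs
    most audited agencies reported: 5 tons/year for Pb, 100 tons/year
    (or greater) for the other pollutants."""
    cutoff = 5 if pollutant == "Pb" else 100
    return "major" if tons_per_year >= cutoff else "minor"

print(classify_source("SO2", 250))  # major
print(classify_source("Pb", 3))     # minor
```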

Area Sources

     Responses to a number of audit questions provide some perspective with
respect to agency activities regarding area source inventories.  Of interest
was whether agencies include certain  sources in their area source  inventory
(e.g., fugitive particulate emissions, nonreactive VOC's).  For example,
of the agencies who reported to have  an  area source  inventory:

    27 agencies (73 percent) reportedly  include fugitive open sources
    (e.g., paved and unpaved roads) of particulate matter.

    24 agencies (63 percent) apparently  exclude nonreactive emissions of VOC.

    29 agencies (76 percent) said that they include  VOC  emissions  from
    miscellaneous solvent use.

    7 agencies (39 percent) stated that  their area source inventories include
    lead emissions from waste oil combustion.
                                    111-14

-------
    While many agencies include these categories,  as they  should, others
do not.  The lack of consideration  of these  source categories could have
considerable impact if the inventories were  used for control strategy
development purposes in nonattainment areas.

Updating Of Emission Inventory

    A number of questions obtained  information  about agency  plans and procedures
for updating inventories.  Roughly  80 percent of the agencies stated that
they either plan to update the inventory (at least for major point sources)
during the next year or that they routinely  update their inventory on a
continual basis.  About 40 percent  plan to update  minor sources, and 20 percent
plan to do so for area sources over the same time  period.  Table 5 illustrates
the reduced level of updating activity for area and minor  sources compared to
point sources.
    Table 5:  Number Of Agencies Planning To Update Specified Portions Of
              Their Inventory, Either On A Continual Basis Or Within One Year

                                          Pollutant
  Portion Of
  Inventory            PM         S02        CO         VOC        NOX        Pb

Major Point Sources  43 (72%)   44 (73%)   46 (77%)   44 (74%)   43 (72%)   21 (52%)

Minor Point Sources  23 (38%)   22 (37%)   25 (42%)   25 (42%)   24 (40%)   16 (27%)

Area Sources         13 (22%)   13 (22%)   12 (20%)   11 (18%)   11 (18%)    4 ( 7%)
                                    111-15

-------
         Agencies were asked to identify the type  of  procedures  they  use  to
update their inventories.   Agency responses  concerning  the  use of  specific
procedures are provided below.
    (1)  Number of agencies which update point source inventories based on:

                                               Most Of                            No
                                      Always   The Time   Sometimes   Never   Response

Source Inspection                       14        14          26         4        2
Source Permit Expiration                17         9          13        13        8
Source Reporting                        26        13          16         3        2
Source Test                             24        14          17         3        2
Source Permit Application               28        14           5        11        2
Continuous Emission Monitor Report       6         3          21        22        8
    (2)  Number of agencies which update point sources based on:

                                               Most Of                            No
                                      Always   The Time   Sometimes   Never   Response

Changes in Emission Controls            42        11           3         0        4
Shutdown of Sources                     45         6           5         1        3
New Sources                             47         7           3         1        2
Malfunctions                             7         3          20        25        5
Change in Source Activity or
  Production                            26        17          12         2        3
New Inspection or Enforcement
  Information                           22        10          23         2        3
    (3)  Number of agencies which update area and mobile sources based on:

                                               Most Of                            No
                                      Always   The Time   Sometimes   Never   Response

Change in Census or
  Population Estimates                  20         9           9         8       14
Change in Employment Estimates           9         4          15        16       16
Change in Fuel Use Estimates            14         7          12        12       15
Change in VMT Estimates                 29         2           5         4       20
                                    111-16

-------
    It is interesting to note that continuous emission monitoring  data are
apparently not used in many cases to update the inventory.   It is  not clear
why this may be so since continuous monitors conceptually provide  the best
estimate of emissions from a source.  Of further interest is the fact that
not all agencies reported making changes to the inventory,  even in cases  when
a point source shuts down.

Inventory of Non-criteria Pollutants

    The audit obtained information concerning inventory activities associ-
ated with non-criteria pollutants, including sources emitting (a)  potentially
toxic substances; (b) hazardous or toxic pollutants regulated by NESHAP
(mercury, beryllium, asbestos, vinyl chloride, and benzene); and (c)  those
pollutants regulated under section 111(d) of the Clean Air Act.  Section
111(d) pollutants are non-criteria pollutants for which NSPS have been set.

    Approximately 30 percent of the agencies compile inventories for  various
potentially toxic substances.  Virtually all of the inventories focus exclusively
on point sources (i.e., area sources were not inventoried).  There is
considerable variation in the number of toxic substances inventoried.  For
example, some agencies reported that they inventory around 6-10 pollutants,
while others inventory as many as 800-1,000 pollutants.

     About 40-50 percent of those agencies which have been  delegated  authority
to regulate NESHAP sources maintain an inventory of those sources.  Roughly
30 percent of the agencies indicate that they also inventory sources  emitting
pollutants affected by section 111(d) of the Act.  This response may be misleading,
since sources of 111(d) pollutants (fertilizer plants, sulfuric acid plants,
kraft pulp mills, and primary aluminum plants) may not be located  within  each
of the jurisdictions administered by the agencies.

Data Checks

    The audit focused on procedures that agencies use to check their  inventory.
Many agencies said that they perform a number of activities with respect  to
documenting and reviewing the inventory data.

    For example, about 75-80 percent of the agencies reportedly perform some
type of edit checks (e.g., extreme value checks) of their point source inventory
data.  Similar results were obtained when agencies were asked about  consistency
checks of emission factors among source types.  About 60 percent of  the agencies
indicated these checks are performed on point sources, while 20-25 percent of the
agencies perform these checks on area and mobile sources.  Inventory  data are
internally reviewed by various staff personnel (e.g., enforcement, planning,
permit staff), as well as by the supervisor in about 60-70  percent of the
agencies.
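The extreme value edit check described above can be illustrated with a brief
sketch.  (This is hypothetical; the record layout and the plausible emission
ranges shown are illustrative assumptions, not drawn from any audited
agency's actual system.)

```python
# Hypothetical sketch of an "extreme value" edit check on point source
# inventory records.  Each record carries a source name, pollutant, and an
# annual emissions estimate in tons; values outside the plausible range for
# that pollutant are flagged for staff review.

# Illustrative (minimum, maximum) plausible annual tons for a point source.
PLAUSIBLE_RANGES = {
    "SO2": (0.0, 200_000.0),
    "PM":  (0.0, 50_000.0),
}

def edit_check(records):
    """Return the records whose emissions fall outside the plausible range."""
    flagged = []
    for rec in records:
        low, high = PLAUSIBLE_RANGES.get(rec["pollutant"], (0.0, float("inf")))
        if not (low <= rec["tons_per_year"] <= high):
            flagged.append(rec)
    return flagged

records = [
    {"source": "Plant A", "pollutant": "SO2", "tons_per_year": 12_500.0},
    {"source": "Plant B", "pollutant": "SO2", "tons_per_year": 950_000.0},  # extreme value
    {"source": "Plant C", "pollutant": "PM",  "tons_per_year": -40.0},      # negative value
]

for rec in edit_check(records):
    print(rec["source"], rec["pollutant"], rec["tons_per_year"])
```

Plants B and C would be flagged for review; a consistency check of emission
factors among source types would operate similarly, comparing the factor
implied by each record against the typical factor for its source category.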

    In addition, about two-thirds of the agencies indicated that they
prepared reports describing the inventory, assumptions and  procedures, with
about half of these agencies having the reports reviewed by external  groups.
                                    III-17

-------
Computerization and Report Formats

    The majority of agencies interviewed  indicated  that  they  use  a  fully
automated or partially automated data  system for  storage and  retrieval of
emission inventory data.  Agencies  using  totally  manual  data  processing
procedures are mostly agencies that have  a  small  number  of  point  sources.
Most States that have automated systems use them  for  storage  of point  source
data.  Four of the States that do not  have  an automated  system for  storing
point source data indicate that they have a need  for  such a system.  Only
about half of the agencies use an automated system  for storage of area source
data.  Some of these use their system  only  for limited applications, such as
for storage of data for certain urban  areas or for  "minor"  point  sources.
Nine agencies indicated that an area source system  is not available but  is
needed.

    In nearly all cases, computerized data handling systems are capable of
editing data and of producing an annual National Emissions Data System (NEDS)
report.  Most systems operate in batch
mode for storage and retrieval of data.   This appears to be a satisfactory
situation for most States.  Only two States indicate  no  need  for  a  batch
retrieval  capability, saying instead that they use  interactive retrievals
exclusively.  Interactive retrieval capability is available to 21 agencies.
Eleven others indicated that they do not have this capability but would find
it desirable.  A few agencies commented that interactive retrievals, although
desirable, are not used because of their high cost.

    About 80 percent of responses indicate that the systems are or should be
sufficiently "user-friendly" to permit non-ADP personnel to use the system.
Roughly the same proportion of responses indicates that systems are, or
should be, able to store both current and historical data for tracking
reasonable further progress (RFP) and should be capable of storing emissions
and enforcement/compliance data in the same file.  These are some of the
design criteria for the new Aerometric Information and Retrieval System
(AIRS) facility data system to be developed by EPA.  The States' responses
indicate that if the AIRS system is to meet State needs, it should be
"user-friendly," have the capability to store both emissions and enforcement
data, and be capable of tracking emission trends.  The AIRS should provide
both interactive and batch access capabilities.  In order for States to make
use of the interactive capabilities, the expense to the States should not be
excessive.

    The survey results do not reveal a great deal about  the States' level
of satisfaction with their current  systems.  The  greatest needs appear to
be for more "user-friendly" access  capabilities,  more efficient interactive
retrievals, and merging of emissions with enforcement data  in those States
that have not already accomplished  this.
                                    III-18

-------
What Conclusions Can Be Drawn From The Audit  Responses?

    There are a number of observations that can  be drawn  from  the audit.
However, it is important to consider two points  while interpreting the
responses to the audit questions.

    First, similar questions did not always elicit consistent responses.
For this and other reasons, it is recommended that the numbers and
percentages presented in this analysis be viewed more as qualitative
indicators than as absolute values.

    Secondly, the audit sought general information on emission inventory
procedures that are used by 59 different agencies.  It is important to
realize that there is no uniform inventory requirement that applies to  all
areas.  For example, more complete, comprehensive, and up-to-date inventories
for point, area, and mobile sources are typically required in nonattainment
areas than in attainment areas.  As a result, it is not always possible to
conclude from the information provided whether agencies are adequately (or
inadequately) compiling inventories which meet specific requirements.  The
audit results should, therefore, be viewed more as indicators of general
trends in various inventory activities than as a means of identifying
specific gaps or limitations in ongoing programs.  With these caveats in
mind, the following conclusions are presented.

    Almost every agency maintains  emission inventories for criteria pollutants.
These inventories focus primarily  on point sources, although mobile and area
source inventories are included to a lesser degree.  Point source inventories
are relatively current, and agencies'  plans to keep them  current  are noteworthy.
Most agencies appear to make reasonable efforts  to compile a comprehensive
inventory of point sources.  Many  agencies use various means to estimate
emissions for a source, even if an emission factor is not readily available.
Most agencies that are required to track RFP believe that their inventory is
useful for performing this function.  Further, agencies  have  incorporated a
number of internal and external checks into the  emission  inventory system,
which give some confidence that credible emission  inventory data  bases,
particularly for point sources, are being collected.

    On the other hand, some information from the audit suggests that area
source inventories may not be as comprehensive as they perhaps should be.
Certain sources may not be included, and some area source inventories may
be outdated.  Mobile source inventories may suffer, since, in  some areas,
the agency responsible for air pollution control does not have direct
responsibility for the mobile source inventory.  Accordingly,  the RFP concept
is not as precise a tool for tracking progress toward attainment  of the NAAQS
as EPA originally intended.

    With regard to toxics, agencies should be applauded for taking a leadership
role in inventorying sources of toxic substances.  Judging from the responses,
however, there is an apparent lack of consistency in both the specific
pollutants and the number of pollutants being inventoried.  Also, it appears that
agencies may be ignoring area sources of toxic pollutants which have been
found to contribute a considerable portion of the  total  risk within a particular
area.  Some effort to foster a greater degree of consistency and  completeness
among air toxic inventories appears to be needed.

                                    III-19

-------
C.  MODELING

Introduction

     This section summarizes the detailed  audit  information  on  the  air  quality
modeling capabilities of State and local  agencies.   The  intent  of the audit  was
to determine the technical  adequacy of modeling  analyses and the degree of
nationwide consistency with EPA modeling  guidance.   In order to make this
determination, the State/local  agencies and  the  EPA  Regional  Offices were asked
to respond to questions in  five areas:  (1)  training and experience of  State/local
agency modeling staff; (2)  availability of computer  hardware and software for
performing modeling analyses; (3)  deviations from  recommended EPA modeling
techniques; (4) Regional  Office review of  State/local modeling  analyses; and
(5) State/local agency review of industry  modeling  analyses.

     A key area for assessing the State/local  agency's ability  to perform
technically adequate analyses is the adequacy of the education  and  experience
level of the agency's modeling staff.  The agencies  were asked  to describe their
staff's training and experience in meteorology and/or modeling, and to  identify
the staff's experience with both simple and  complex  modeling issues.

     A second area for determining the ability of  the agency to deal effectively
with modeling problems is the availability of the  physical resources (hardware
and software) necessary for performing the analyses. The agencies  were asked
to indicate whether they had access to UNAMAP models, and if so, whether they
had the "in-house" expertise to use the models.

     The agencies were asked to discuss deviations  from  EPA  recommended models
and data bases that were used for regulatory modeling analyses. Reasons were
solicited for using alternative models or  data bases not specifically recommended
in EPA modeling guidance.

     The questions in the final two areas  concerned  the  EPA  review  of State/local
agency modeling analyses and State/local  agency  review of industry  modeling
analyses.  The intent of the questions in  these  two  areas was to determine the
degree of consistency with  EPA modeling guidance and the technical  adequacy  of
modeling analyses.

Major Findings and Conclusions

     Some general observations about the degree  of  consistency  and  technical
adequacy in modeling done by the agencies  can be made based  upon the responses
to the audit.  It is important to realize  that the resources required by an
agency for modeling analyses depend on a number  of  factors and  vary widely
from agency to agency.  Such factors as the  number  of modeling  problems,
complexity of suitable models, level of detail required  for  input data  bases,
meteorological and topographic features,  and other  technical  issues, determine
an agency's ability to effectively deal with modeling problems. These  factors,
along with the case-by-case requirements  of  many modeling analyses, make it
difficult to assess the specific difficulties that  the agencies have in
implementing EPA modeling guidance.  For these reasons, the audit responses
should be viewed more as indicators of the general problems experienced by
the agencies than as a means of determining specific limitations of modeling
programs.

                                    III-20

-------
     Most of the State/local  agency personnel  responsible for air quality
modeling appear to have adequate training and  experience in air pollution
meteorology.  About one-third of those performing modeling have a degree in
meteorology, and another 50 percent have an engineering/technical  degree
with formal training in air pollution meteorology.

     Based upon the audit results, it was not  possible to determine whether
the agencies had sufficient personnel resources to deal  with their modeling
problems.  A review of the complexity of each  modeling analysis on a case-by-
case basis would be necessary to make that assessment.  One important
observation about the agencies' personnel resources is that about one-fourth
of the agencies lack a single person with modeling training or experience.  Although
these agencies reported very few regulatory modeling applications during
FY 1984, there is a potential for future problems as new modeling issues
arise.

     Most agencies have the physical  facilities (hardware and software)
available to the staff to perform modeling analyses.  Most of the agencies
have access to some of the UNAMAP models, but  only 55 percent of the agencies
reported having access to the most current version 5 of UNAMAP.  To ensure
consistency, the agencies should all  be accessing the most current version of
UNAMAP models.

     Although the majority of analyses performed by the State/local
agencies follow EPA modeling guidance, one half of the agencies found it
necessary to use techniques not specifically recommended in EPA guidance at
least once during FY 1984.  Twenty-two agencies reported using a non-recommended
model, primarily because the alternative model  was judged to be technically
more appropriate for the situation than the recommended model.  However,
these agencies rarely employed a performance evaluation to judge the superiority
of the alternative model.

     Sixteen out of 59 agencies reported cases where EPA's recommendation for
using 5 years of off-site or 1 year of on-site meteorological  data was not
followed.  These agencies cited unavailability of data or cost of running
5 years of data as reasons for not following EPA guidance.

     Based upon response by the EPA Regional Offices, the State/local
agencies appear to have problems implementing  EPA guidance regarding regulatory
modeling.  Thirty-two percent of all  modeling  analyses submitted for review
to the EPA Regional Office by the State/local agency were returned for revision.
These analyses required revision because the agency did  not know,  understand
or follow EPA guidance, or because the analysis was considered to be technically
inadequate.  Perhaps some mechanism should be  provided to promote a better
understanding among the State/local agencies concerning  the requirements for
regulatory modeling.
                                    III-21

-------
Responses to Individual  Questions

Training and Experience (Questions C.1a-f)

     Question C.1a was asked to determine both the number and the educational
background of staff who normally use air quality models.   The  intent  of  the
question was to determine whether the personnel  who routinely  perform modeling
at the agency had adequate training in the use  of air  quality  models.

     The responses to this question show that of the 167 agency  personnel
that perform modeling, 55 (34 percent) have a college  degree in  either
meteorology or atmospheric science, 82 (48 percent)  have a college degree  in
a technical field and also have had some formal  meteorological or  dispersion
modeling training, and 27 (18 percent) have a college  degree in  a  technical
field but have no formal  training in meteorology or dispersion modeling.

     Of the 60 agencies responding to this question, 12 agencies (20  percent)
did not have any personnel with either an atmospheric  science  degree  or  formal
training in dispersion modeling.  Most of these agencies responded to later
questions that they had few modeling applications,  used only screening models,
or had used contractors to perform the analyses. The  remainder  of the agen-
cies had at least one person with modeling training.

     Question C.1b asked for the experience level of the modeling staff.  The
results were as follows:   25 percent have less  than 2  years of modeling
experience, 22 percent have 2-5 years experience, 37 percent have  6-10 years
experience, and 16 percent have more than 10 years  experience.

     Part c of Question C.1 concerned the agency's capabilities to execute
EPA models.  A majority of the agencies (86 percent) had the ability  to  run
EPA screening models in-house.  Most agencies (80 percent)  could also run
refined EPA models in simple circumstances. About  two-thirds  of the  agencies
reported the ability to run refined EPA models  in-house in all types  of
circumstances.

     Question C.1d asked the agency to identify the most complex air
quality model that had been used for regulatory applications and the  models
most commonly used by the staff.  Nearly half of the agencies  listed  either
the ISCLT or ISCST models as the most complex that  they had used.   These
agencies typically used the ISC models for PSD  review, SIP-related modeling,
and downwash analysis.  A smaller number of agencies reported  the  use of
models such as MPTER or CRSTER as the most complex  model used  for  regulatory
applications.  Seven agencies reported using city-specific EKMA  for urban
ozone applications, and one agency had used the Urban  Airshed  Model for  a  SIP
analysis.  Two agencies listed MESOPUFF as their most  complex  modeling appli-
cation for determining Class I PSD increment consumption and for acid deposition
studies.  The models most commonly used were screening models  such as PTPLU
and PTMAX.

     Question C.1e was asked to determine if the agency staff has used EPA's
"Interim Procedures for Evaluating Air Quality  Models" for evaluating non-
guideline models.  Thirteen agencies responded  positively. They employed
the procedures for evaluating nonguideline modeling techniques relating  to
urban/rural dispersion coefficients, complex terrain representations, and
                                    III-22

-------
averaging times.  Each of these 13 agencies  employed  at  least  one  staff
person with at least 5 years of modeling  experience.

     Question C.1f concerned development of nonguideline models.  Nineteen
agencies (32 percent)  had modified guideline models or had developed  new
nonguideline models.  There were a variety of reasons for the  circumstances
that required use of a nonguideline model.   Some  of the  changes to guideline
models were modifications to the existing model's algorithm  for averaging
times, downwash and plume rise computations, dispersion  coefficients, and
terrain representations.  New nonguideline models were developed for  applica-
tions such as area source emissions from  toxic waste  sites and techniques
for chemical mass balance receptor analyses.

     Table C-1 contains the information from Questions C.1a, b, c, e, and f.
                                    III-23

-------
                                 TABLE C-1

                      Modeling Training and Experience

                                Meteorology   Technical Degree    Technical Degree
                                  Degree            with               without
                                              Modeling Training   Modeling Training
C1a  What is the training of
     the staff that performs        34%              48%                 18%
     modeling?

                                 < 2 Years    2-5 Years   6-10 Years   > 10 Years
C1b  What is the experience
     level of the modeling          25%          22%         37%          16%
     staff?

                                    Yes          No          NA
C1c* Can staff run EPA:
        screening models?           86%          7%          7%
        refined models in simple
          circumstances?            80%         13%          7%
        refined models in all
          circumstances?            63%         27%         10%

C1e* Has staff used EPA's
     procedures for evaluating      22%         68%         10%
     nonguideline models?

C1f* Has staff developed
     nonguideline models            32%         58%         10%
     in-house?

*Percentages are based on the number of agencies audited.
                                    III-24

-------
Access, Expertise, and Use (Questions C.2a-c)

     The first part of Question C.2a asked for the UNAMAP version  number  used
by the agency.  Fifty-five percent of the agencies accessed  Version 5  (most
current), 17 percent accessed Version 4,  3 percent used Version  3, and  25
percent failed to respond.

     The second part of Question C.2a asked the agency if they have access to
and expertise in using 22 UNAMAP models,  and if so, how many FY  1984 applications
did they have with each model.  The models were assigned to  four categories:
(1) models recommended in the "Guideline  on Air Quality Models (1978)";
(2) models recommended in the "Regional Workshops  on Air Quality Modeling:  A
Summary Report (1981)"; (3) EPA and other screening techniques;  and (4) other
nonguideline models.

     Models in the first category included APRAC-1A, AQDM, CDM, RAM, CRSTER,
TCM, TEM, and HIWAY.  Most agencies (78 percent)  had access  to these models,
and 68 percent had in-house expertise in  the use of these models.   There  were
about 430 applications of these models in FY 1984, with the CDM, RAM, and CRSTER
models having the most applications.

     The second group of models included  ISCLT, ISCST, MPTER, and  CDMQC.  Again,
most agencies had both access to and expertise in  the use of these models.
There were about 542 applications of these models, with the  ISCST  being used
the most.

     The third group included the screening models PTMAX, PTDIS, PTMTP, PTPLU,
VALLEY, and COMPLEX I, as well as agency-specific  screening  techniques.
Eighty-eight percent of the agencies had  access to and 83 percent  had  expertise
in the use of screening models.  Nationwide in FY  1984 there were  about 1,900
uses of either the EPA screening models or agency-specified  screening  tech-
niques.  A majority of these applications used the PT-type models  or the
VALLEY model.  A few agencies have developed their own algorithms  for  screening
analysis.

     The last category included EKMA, Urban Airshed, PAL, and PLUVUE,  along
with other agency-specified models.  There were about 300 uses of  models  in
this category, most of which employed EKMA, CALINE-3, or an  agency-specific
model.

     The number of modeling applications  varied considerably from  agency  to
agency.  The average number of uses per agency was about 52  in FY  1984, with a
range of 0-349.  It was not possible to determine the complexity of each
modeling application.  From this portion  of the audit, therefore,  it could not
be determined whether the agency had adequate staff to conduct the number of
applications that were reported.

     Question C.2b asked the agencies to  identify  the method of  access  to dis-
persion modeling programs.  One-half of the agencies had access  to an  in-
house dedicated computer, 37 percent used telephone lines to access a  nonagency
state/local computer, 27 percent used telephone lines to communicate with EPA's
computer, and 7 percent used a private firm's computer.  Several agencies had
access to models on more than one system.

                                     III-25

-------
     Question C.2c was asked  to  determine  if the agencies had the expertise to
modify the software for UNAMAP models,  and, if  so, to identify which models had
been modified.  Two-thirds of the  agencies  reported that their staff has the
capability to modify the algorithms  of  the models.  Many of the software modifi-
cations that have been made can  be grouped  into two general categories: (1) changes
to the input/output format in order  to  make data entry simpler or report output
easier to read; and (2) changes  that were  necessary to get the model to compile
and run on the agency's computer.  The agencies  involved were convinced that
neither of these types of modifications should  cause the model to compute
results that are different from  the  guideline model result.  However, a few
agencies made modifications to UNAMAP programs  that would significantly change
the results of the model.

     Table C-2 contains the information from the questions discussed above.
                                     III-26

-------
                                 TABLE C-2

                 Accessibility, Expertise, and Use of Models

                                      Version 5   Version 4   Version 3   No Response
C2a1  Which version of UNAMAP does
      staff access?                      55%         17%          3%          25%

                                                Expertise   # of FY 1984   % of FY 1984
                                       Access    to Run     Applications   Applications
C2a2  Summarize general capabilities
      with the following types of
      models:

        EPA Guideline Models             78%       68%           430            14%
        Models recommended in EPA
          Regional Workshop Report       78%       73%           542            18%
        EPA Screening Techniques         88%       83%          1792            58%
        Other Models                     68%       60%           300            10%

                                      In-House   State/Local     EPA        Private
                                      Computer    Computer     Computer     Computer
C2b  How are models accessed?            41%         30%          22%           7%

                                         Yes          No          NA
C2c  Does staff have ability
     to modify software for              67%         25%          8%
     models?
                                    III-27

-------
Nonrecommended Modeling Techniques  (Questions  C.3a-d)

     Question C.3a asked the agencies  to  report  the  number of  regulatory modeling
analyses performed by the agency  in FY 1984 where  it was  necessary to  use techniques
not specifically recommended in EPA guidance.   (Note:   One State  reported a
very high number of modeling analyses. These  analyses  were  not included in the
statistics in this section in order to keep the  discussion representative of
the other 59 agencies that responded.) The responses  indicated that 352 out of
1,734 modeling analyses used some nonguideline technique.  The reasons  for
using the nonguideline technique  were  solicited  in Questions C.3b and  C.3c,
relating to eight specific nonguideline techniques.

     The use of a nonguideline model was  reported  in 213  cases.   For most of
these analyses, the agencies reported  that the nonguideline model was  judged
to be technically more appropriate  for the given situation than the guideline
model.  However, in only a small  percentage of these cases was the non-
guideline model chosen to be more appropriate  because  of  a detailed performance
analysis of the guideline versus  nonguideline  model.

     There were 16 analyses performed  using a  guideline model  that had  been
modified to use a nonguideline technique, 20 cases of  using  a  nonrecommended
option of a guideline model, and  15 cases of using a guideline model outside of
its stated limitations.  There were many  reasons why nonguideline modeling
techniques were used, but there was not enough detail  given  to determine if
there were any specific areas where the agencies had problems  with the  EPA
guidelines.

     The agencies were asked to indicate  the number  of modeling analyses that
did not use recommended EPA guidance in four areas related to  meteorological
data, background concentrations,  treatment of  calms, and  design of receptor
network.

     There were 135 cases where the agencies did not follow  EPA's recommended
use of 5 years of off-site or 1 year of on-site  meteorological data.   The two
reasons most often cited for not  following the EPA's recommendation were that
5 years of off-site data or 1 year of on-site data were not available to the agency, or
that the agency had determined a  "worst case year" to  be  used  instead  of the
full 5-year period.  One agency cited  the cost of  running the  models as the
reason for not following EPA guidance.

     There were many fewer cases  of not following  EPA  guidance in the  other
three areas.  Agencies reported 32  cases  related to  the background air  quality
determination, 13 cases related to  treatment of  calm wind conditions,  and 18
cases related to receptor network design.  There was not  sufficient detail in
the responses to determine specific problems that  the  agencies had with EPA
guidance.

     The final question in this area concerned the degree of concurrence by the
EPA Regional Office with the nonguideline techniques employed  by  the State/
local agency.  Fifty-two of the nonguideline modeling analyses were approved
by EPA without major reanalysis by  the agency, and only 10 were not approved by
EPA.  EPA concurred with three analyses only after major  revisions to  the
analyses were made by the agency.  Only four analyses  involving nonguideline
techniques are still awaiting a decision  by EPA.  Finally, 115 of the  analyses
were never submitted or did not require EPA concurrence.

     Table C-3 contains the information discussed  above.

                                    III-28

-------
                                  TABLE C-3

                        Use of Nonrecommended Models

                                            Recommended      Nonrecommended
                                            Techniques         Techniques
C3a  How often were nonguideline
     modeling techniques used in            1,449 (80%)        352 (20%)
     FY 1984?

C3b  When nonrecommended models        213   Use of nonguideline model
     were used, indicate how often      16   Modification of guideline model
     the following alternatives         20   Use of nonrecommended option of a
     were used:                                guideline model
                                        15   Use of guideline model outside its
                                               stated limitations
                                        16   Other

C3c  When nonrecommended data          135   Use of less than 5 years of
     bases were used, indicate                 off-site or less than 1 year of
     how often the following                   on-site meteorological data
     aspects of the modeling            32   Use of nonguideline techniques for
     analyses were important:                  determining background
                                        13   Use of nonguideline techniques for
                                               treatment of calms
                                        18   Use of nonguideline techniques for
                                               design of receptor network

C3d  When nonrecommended               28%   EPA concurred without reanalysis
     techniques were used,              2%   EPA concurred after major reanalysis
     indicate how often EPA             5%   EPA did not concur
     ultimately concurred               2%   Awaiting EPA decision
     with the analysis:                63%   EPA concurrence not sought
                                    III-29

-------
EPA Review of Agency Analyses (Question C.4)

     The intent of this question was to determine the frequency  of approval/
disapproval of State/local  agency modeling analyses  by EPA when  approval  or
action by EPA is required.   The EPA Regional  Office  was asked  to provide  the
number of analyses reviewed by EPA, the number of analyses requiring revision,
and the factors that contributed to requiring a revised analysis.   The responses
were grouped into six regulatory areas:  (1)  bubble  (emission  trades);  (2)  sec-
tion 107 redesignations; (3) NSR (including PSD); (4) nonattainment area  SIP
analyses; (5) Pb SIP's; and (6) other SIP modeling.

     Thirty-two percent of all modeling analyses submitted for review to the
EPA Regional Offices were returned to the State/local agency for revision.
For bubble applications, 13 of 15 analyses reviewed  by the EPA Regional
Office required a revision  to the original  analysis.   From the responses, it
appears that there is not a particular factor that contributed to  requiring a
revised analysis.  Rather,  it appears that the State/local  agency  had general
problems understanding and  responding to EPA guidance related  to modeling for
emission trades.

     Twenty-eight percent of the analyses in  the other five areas  were sent
back to the State/local agency for revision.   The primary factor contributing
to requiring a revised analysis was that the  original analysis was judged to
be technically inadequate by EPA.  There was  insufficient detail  available from
the survey to identify specific problem areas that required reworking of  the
State/local agency analysis.

     Table C-4 shows the types of modeling applications and the  number that
required revision by the agencies.

                                   TABLE C-4

                         EPA Review of Agency Analyses

C4  Indicate the number of  analyses requiring EPA approval  and the results of
    EPA's review in the following areas:

                                             # of Analyses       # of Analyses
                                                Reviewed      Requiring  Revision

    Bubble (emission trades)                      15                 13
    Section 107 redesignations                    68                 25
    New source review (including PSD)            104                 11
    Nonattainment area SIP analyses               21                 12
    Lead SIP's                                    19                  8
    Other SIP modeling                            33                 13
    Totals                                       260                 82

    What were the contributing factors that required  a revised analysis?

       9%   Agency did not  know or ask for EPA guidance
      20%   Agency misinterpreted or misunderstood EPA guidance
      18%   Agency misapplied or did not follow EPA  guidance
       0%   No guidance available from EPA
      36%   The analysis was judged technically inadequate by  EPA
      18%   Other
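The overall and non-bubble revision rates quoted above follow directly from the Table C-4 counts. As an editorial arithmetic check (figures transcribed from the table, not part of the original report):

```python
# Analyses reviewed and requiring revision, transcribed from Table C-4.
counts = {
    "Bubble (emission trades)":        (15, 13),
    "Section 107 redesignations":      (68, 25),
    "New source review (incl. PSD)":  (104, 11),
    "Nonattainment area SIP analyses": (21, 12),
    "Lead SIP's":                      (19,  8),
    "Other SIP modeling":              (33, 13),
}

reviewed = sum(n for n, _ in counts.values())        # 260
revised  = sum(r for _, r in counts.values())        #  82
overall  = round(100 * revised / reviewed)           #  32 percent

# The "other five areas," with bubbles excluded:
non_bubble = [(n, r) for k, (n, r) in counts.items() if "Bubble" not in k]
rate_other = round(100 * sum(r for _, r in non_bubble)
                   / sum(n for n, _ in non_bubble))  # 28 percent
```

Both figures reproduce the percentages cited in the text (32 percent overall, 28 percent excluding bubbles).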

                                     III-30

-------
Agency Review of Industry Analyses (Question C.5)

     This question asked the State/local  agencies  to identify the number of
analyses submitted by industry to the agency for review and  the result  of that
review.  The agencies were to list the contributing factors  that required a
revised analysis.  The responses were grouped into five categories:   (1)  bub-
bles (emission trades); (2)  section 107 redesignations; (3)  NSR, including PSD;
(4) Pb SIP's; and (5) other  source-specific SIP modeling.

     State/local agency review of modeling analyses performed by industry
resulted in 32 percent (124 of 393) of the analyses being sent back to the
responsible party for revision.  Every area except Pb SIP's had a high percentage
of the analyses returned for revision.  Thirty-six percent of those analyses
requiring revision were considered to be technically inadequate by the  reviewing
agency.  Forty-eight percent required revision because the responsible  party
either was unaware, did not  understand, or did not follow  agency guidance for
the particular analysis.

     Table C-5 shows the types of modeling applications and  the number  returned
to industry by the agencies.

                                   TABLE C-5

                       Agency Review of Industry Analyses

C5  Indicate the number of analyses performed by industry  requiring  concur-
    rence by the control agency and the results of the agency's review  in the
    following areas:
                                            # of Analyses      # of Analyses
                                              Reviewed      Requiring Revision

       Bubble (emission trades)                 12                  8
       Section 107 redesignations               13                 11
       New source review (including PSD)       268                 85
       Lead SIP's                               15                  1
       Other source-specific SIP modeling       85                 19
       Totals                                  393                124

    What were the contributing factors that required a revised analysis?

       11%   Responsible party did not know or ask for agency's guidance
       13%   Responsible party misinterpreted or misunderstood agency's
             guidance
       23%   Responsible party misapplied or did not follow  agency's
             guidance
        2%   No guidance available from State or local  agency
       36%   The analysis was judged technically inadequate  by agency
       15%   Other
                                     III-31

-------
D.  SIP EVALUATION AND IMPLEMENTATION

Introduction

    An evaluation of SIP development and implementation  activities  was  designed
to assess whether State plans for attainment  are being reasonably carried  out.
The evaluation was also designed to identify  needs  of State and  local  agencies
in developing and updating SIP's.  This  section  of  the report  summarizes major
findings in these areas and presents detailed information  in each of the audit
areas.  Audit questions covered by the SIP Evaluation and  Implementation
section of the Air Quality Planning and  SIP Activity audit include:

        Timeliness of Regulatory Development,

        Timeliness of Studies,

        Regional Consultation,

        Transportation Control Measures,

        Reasonable Further Progress,

        Source Activity,

        Continuous Emission Monitoring,  and

        Generic Bubbles.


Major Findings and Conclusions

    The SIP evaluation and implementation audit  identified that  44  percent of
the SIP revisions and strategies due or  overdue  by  the end of  1984  had  not been
completed, and of those completed, 27 percent had not been submitted.   In
addition, the audit found that 43 percent of the incomplete revisions and
strategies were behind schedule.

    Attempts were made to determine causes for delays to SIP revisions  and
strategies.  Survey results were inconclusive.  Questions  asked  included if
there was a tracking system, and if governmental  oversight caused delays.
No correlation was found between those agencies  that did not have a for-
malized tracking system and regulatory delays.  Likewise,  no additional
delay was found for those agencies that  reported governmental  oversight
problems vs. agencies that did not have  governmental oversight.   For those
agencies reporting governmental oversight problems, State  laws and  State
procedures would have to be revised to alleviate the problem.

    The audit also determined that a very significant number (48 percent)
of additional studies (such as nontraditional TSP or CO  hotspots) due by
the end of 1984 had not been completed,  and that 64 percent of these were
behind schedule.

    The audit determined that agencies regularly consulted with  EPA Regional
Offices concerning bubbles and other source-specific revisions.   Regional

                                   III-32

-------
Offices, however, were only consulted in slightly over half the cases
involving variances.  Agency staff personnel  responsible for source-specific
revisions stated that they had access to current EPA policy and criteria
concerning SIP revisions.

    The audited agencies indicated that an unusually high number (87 percent)
of all transportation control  measures required  by SIP's were being  imple-
mented, while 6 percent of the required TCM's were not being implemented.
Seven percent of the agencies  did not respond.  Additional  review of TCM
implementation and effectiveness may be in order in future audits.

    It was found that many agencies are not comparing changes in emission
inventories with projected emission changes given in the O3 or CO RFP curve
in the SIP.  For O3, 72 percent of all cases were tracked to ensure reason-
able further progress (RFP).  For CO, only 37 percent of all  cases were
tracked to ensure RFP.  As previously noted under the Emissions Inventory
section, RFP as originally contemplated for long-term tracking and planning
may not be a viable tool.

    Continuous emission monitoring required under 40 CFR Part 51.19 was being
completely implemented by State and local  agencies in 59 percent of  all
cases, partially implemented in 13 percent of all  cases, and not implemented
28 percent of the time.

     Based upon the data available from this  audit and a subsequent  follow-
up survey of EPA Regional Offices, it appears that the issue of EPA-approved
generic bubble rules must be clarified.  States  that are operating generic
programs without EPA approval  are placing  the "impacted" sources in  jeopardy
of potential EPA actions to enforce the original  SIP limits.  Also,  the
submission of approved generic bubble actions to EPA by States should  be
increased.

Responses to Individual Questions

Timeliness of Regulatory Development (D.1.)

    One of the purposes of the audit was to determine how effective  State
and local agencies are in revising and adopting  State implementation plans
and 111(d) plans.  Questions asked in this regard were intended to identify
if there are delays in regulatory development, and if so, to determine the
type of delays so corrective action could  be  implemented.  EPA Regional
Offices provided to each agency a list of  SIP revisions or strategies  that were
due or overdue by the end of 1984.  The list  included Part D plans,  non-Part
D plans, and other SIP requirements.  States  were requested to complete a
table showing whether action had been completed  and submitted, completed
and not submitted, not completed but on schedule, or not completed and not
on schedule.  National summaries to this question are shown in Table D-1.
Of the 256 revisions or strategies due or  overdue by the end of 1984,
States and local agencies responded to approximately 99 percent.  Overall,
141 out of 256 revisions or strategies had been  completed (55 percent),
while 112 had not been completed (44 percent).  Of those not completed, 64
of 112 (57 percent) were on schedule, while 48 (43 percent) were not on
schedule.
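The percentages in this paragraph can be reproduced from the raw counts. As an editorial sketch (counts taken from the text above):

```python
# Counts reported for Question D.1 (SIP revisions/strategies due by end of 1984).
due = 256
completed, not_completed = 141, 112
on_schedule, behind = 64, 48            # split of the 112 not completed

assert on_schedule + behind == not_completed

response_rate  = round(100 * (completed + not_completed) / due)  # ~99 percent
pct_completed  = round(100 * completed / due)                    #  55 percent
pct_incomplete = round(100 * not_completed / due)                #  44 percent
pct_behind     = round(100 * behind / not_completed)             #  43 percent
```

Note that the completed and not-completed counts sum to 253, not 256; the difference is the roughly 1 percent of revisions to which agencies did not respond.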

                                   III-33

-------
             


                                   TABLE D-1

        [National summary table of SIP revisions and strategies (completed
        and submitted; completed, not submitted; not completed, on schedule;
        not completed, not on schedule) -- the table body is not legible in
        this copy of the report.]

                                  III-34

-------
    The audit then asked if agencies tracked  implementation  of  these  activities
using a management system with key dates  and  periodic  reports,  or whether a
staff person was assigned to track all  dates  on  a  periodic basis.   Seventeen
of 61 agencies (28 percent) reported that they used  a  management system; 23
agencies (38 percent)  assigned a staff  member; and 6 agencies (10 percent)
did both.  Eight agencies (13 percent)  either did  not  track  dates,  had no
formal procedures, or used other mechanisms to assure  implementation.  Eleven
agencies (18 percent)  did not respond to  the  question.   No correlation was
observed between those agencies that did  not  have  a  formalized  tracking system
and regulatory delays.

    The audit also questioned whether delays  in  State  action were due to
required approval at a level above the agency, e.g., legislative or executive
branch oversight,  and  if so, the length of such  delays.   Twenty-three of
61 agencies (38 percent) reported delays  in State  action  due to oversight,
while 31 agencies  (51  percent) reported no oversight delays. For seven
agencies (11 percent), the question did not apply.  Of those responding
positively, nine agencies (39 percent)  stated that legislative  oversight
caused delays, while eight agencies (35 percent) stated executive oversight
(including environmental and pollution control boards) caused delays.  The
remainder of the responses did not fix  blame  with  one  branch of government
or another.

    Figure D-l shows the length of delays for those  agencies responding
affirmatively.  The weighted average length of such delays was  9 months.  One
agency reported a  2 year delay.  Again, responses  to this question  were
compared with regulatory delays to determine  whether a correlation  existed
between delays in  regulations and legislative or executive oversight.  No
correlation was observed.
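A weighted average delay of the kind cited here is typically computed from the binned responses using bin midpoints. The sketch below is illustrative only; the bin counts are hypothetical, not the audit's actual tallies (the report gives only the 9-month result and the Figure D-1 bins):

```python
# Hypothetical counts per delay bin (months).  The open-ended >12-month bin
# is capped at 24 months here, consistent with the one reported 2-year delay.
bins = {(0, 2): 5, (3, 6): 8, (7, 12): 6, (13, 24): 4}

total_agencies = sum(bins.values())
weighted_avg = sum((lo + hi) / 2 * n
                   for (lo, hi), n in bins.items()) / total_agencies
```

With the audit's real counts in place of the hypothetical ones, this midpoint calculation would yield the 9-month average reported above.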

    For those agencies that reported delays due  to governmental oversight, the
audit questionnaire asked which of the following actions would reduce delays:

        EPA discussion with appropriate State officials,

        Revision to State's internal  procedures,

        Revision to State regulation,

        Revision to State law,

        Revision to State constitution, or

        Other changes.


    Of the 23 agencies that reported delays due  to government oversight, 18
agencies responded to  the question.  Because  these questions are not mutually
exclusive, many States responded more than once.   Of the 30  suggested actions
received, 3 (10 percent) stated EPA discussion with  State officials would help,
8 (27 percent) would need to revise State internal procedures,  1 (3 percent)
would need to revise State regulations, 13 (43 percent)  would need  to revise
State laws, and 1  (3 percent) would need  to revise the State constitution.
There were four other  responses, such as  no action, or EPA should change its
policy.  Figure D-2 graphically presents  the  responses to the question.

                                   III-35

-------
        [Figure D-1 is a bar chart showing the number of agencies reporting
        oversight delays of 0-2 mos., 3-6 mos., 7-12 mos., and >12 mos.]

                  Figure D-1.  Length of delays due to oversight.

                                  III-36

-------
        [Figure D-2 is a bar chart of the percentage of agencies suggesting
        each of the following actions:]

         A.  EPA discussions with State officials
         B.  Revise State internal procedures
         C.  Revise State regulations
         D.  Revise State laws
         E.  Revise State constitutions
         F.  Other actions

                Figure D-2.  Actions to reduce government oversight.

                                  III-37

-------
Timeliness of Studies (D.2.)

    EPA Regional  Offices provided to  State and  local  agencies  a  list  of
additional studies, such as nontraditional  TSP  or  CO  hot-spots,  that  were due
by the end of 1984.  As in "Timeliness of Regulatory Development," agencies
were requested to complete a  table showing whether action had  been completed
and submitted, completed and  not submitted, not completed but  on schedule, or
not completed and not on schedule. Of the 46 cases listed by  the Regional
Offices, 24 had been completed (52 percent) while  22  (48  percent) were not
completed.  Of those completed, 6 cases (25 percent)  had  not been submitted
to EPA.  Of those not completed, 14 cases  (64 percent) were behind schedule.

Regional Consultation (D.3.)

    EPA Regional Offices were asked how often State and local agencies consulted
with them formally before submitting  SIP revisions.  According to Regional
Office responses, State  and local  agencies consulted  on 18 variances  and
submitted 31, consulted  on 38 bubbles and  submitted 38, and consulted on 41
other source-specific actions and submitted 41.  Although the reported number
of consultations  and submittals are the same for bubbles  and for other
actions, this does not mean that State and local agencies consulted with
EPA prior to each submittal.   In 34 percent of  these  cases, agencies  either
consulted with EPA and did not submit revisions, or submitted  revisions
without consultation.

    State and local agency officials  were  then  asked  if staff  personnel
responsible for source-specific revisions  had access  to all current EPA policy
and criteria concerning  SIP revisions.  Forty-seven of 61 agencies said yes
(77 percent), one agency said no (2 percent), and  13  agencies  either  did
not respond, or the question  was not  applicable (21 percent).

    Agency officials were then asked  how guidance  was disseminated within
each agency.  The agencies that answered affirmatively to the question above
distributed copies of EPA guidance to the  appropriate personnel.  Of  these,
25 agencies indicated that they also  used  other methods of disseminating
information, such as using a  policy manual, a summary memo, or periodic
staff meetings.

Transportation Control Measures (D.4.)

    EPA Regional Offices supplied State and local agencies with a list of
transportation control measures (TCM's) that were  to  be implemented in
accordance with the SIP.  Agencies were then asked whether the TCM's  were
being implemented.  Of the 294 measures listed  by  the Regional Offices, the
agencies stated that 255 measures (87 percent)  were being implemented, and
17 measures (6 percent)  were  not being implemented.  This positive
implementation response  seems extraordinarily high.  No answer was given
for 22 measures.

    State and local air  pollution control  officials were  then  asked to
indicate how the air pollution control  agency tracked TCM implementation.
                                   III-38

-------
Of 33 agencies required to implement TCM's  in accordance with
their SIPs, 28 (85 percent) tracked TCM implementation  by receiving  reports
from the implementing agencies.  Eighteen air pollution control  agencies,
some of whom also received reports, had membership on the Metropolitan
Planning Organization (MPO) committee that  enabled them to track TCM imple-
mentation.

Reasonable Further Progress (D.5.)

    Section 172 of the CAA requires SIP's to provide for revision
and resubmission of emission inventories to ensure RFP  toward  attainment.
EPA Regional Offices provided a list of O3 and CO extension areas to
State and local agencies and asked  if the agency compared changes in the
emission inventory with projected emission changes given in the O3 or CO
RFP curve in the SIP.  Of the 78 O3 areas listed by the Regional Offices,
agencies reported that 56 areas (72 percent) were tracked to ensure  RFP, 15
areas (19 percent) were not tracked, and 7  areas (9 percent) had no  response.
Of the 83 CO areas listed by the Regional Offices, agencies reported that
only 31 areas (37 percent) were tracked to  ensure RFP,  46 areas (55  percent)
were not tracked, and 6 areas (7 percent) had no response.  Figures  D-3 and
D-4 show the O3 and CO responses.
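The Figure D-3 and D-4 percentages follow directly from the area counts in this paragraph. As an editorial check (counts transcribed from the text):

```python
def shares(tracked, not_tracked, no_response):
    """Percentage breakdown of RFP tracking responses, to one decimal place."""
    total = tracked + not_tracked + no_response
    return tuple(round(100 * x / total, 1)
                 for x in (tracked, not_tracked, no_response))

o3 = shares(56, 15, 7)    # ozone areas (78 total) -- Figure D-3
co = shares(31, 46, 6)    # CO areas (83 total)    -- Figure D-4
```

These reproduce the figure labels: (71.8, 19.2, 9.0) for ozone and (37.3, 55.4, 7.2) for carbon monoxide.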

    Agencies that answered affirmatively to O3 and/or CO tracking were then
asked if they prepared a report of  the comparisons, if  the report had been
submitted to EPA, and if the report was made available  to the  public.
Twenty-two agencies indicated that  they prepared a report of the comparisons,
18 agencies stated that they submitted the report to EPA, and  8 agencies
made the report available to the public.  Based  on the  response to this
question, the RFP program cannot be considered as viable an effort as
originally contemplated by EPA.

Source Activity (D.6.)

    The audit questionnaire asked how often State and local  agencies
periodically compared actual source activity (i.e., increases  or decreases
in growth) with past projections.  Forty-five of 61 agencies responded (74
percent).  Of the respondents, 21 agencies  compared actual  source activity
every 1 to 2 years (47 percent), 9  agencies (20  percent)  compared actual
source activity every 2 to 5 years, while 15 agencies  (33 percent) compared
actual source activity at intervals greater than 5 years.

Continuous Emission Monitoring (D.7.)

    Part 51.19 of EPA's regulations on SIPs requires agencies  to adopt
continuous emission monitoring (CEM) for selected categories of existing
stationary sources.  These categories include steam generators, nitric acid
(HNO3) plants, sulfuric acid (H2SO4) plants, and fluidized bed catalytic
cracking units.  Agencies were to respond if they required CEM completely,
CEM partially, or no CEM.  These responses  are summarized nationally in
Table D-2.  For those agencies responding that had sources within their
jurisdiction, 59 percent completely required CEM, 13 percent partially
required CEM, and 29 percent did not require CEM.
                                   III-39

-------
        [Figure D-3 is a pie chart:  RFP tracked (71.8%), RFP not
        tracked (19.2%), no response (9.0%).]

                      Figure D-3.  RFP tracking for ozone.

        [Figure D-4 is a pie chart:  RFP tracked (37.3%), RFP not
        tracked (55.4%), no response (7.2%).]

                 Figure D-4.  RFP tracking for carbon monoxide.

                                  III-40

-------








        TABLE D-2.  SUMMARY OF CONTINUOUS EMISSION MONITORING REQUIREMENTS

        [The body of this table, summarizing agency CEM requirements by
        source category, is not legible in this copy of the report.]

                                  III-41

-------
Generic Bubble Rule (D.8.)

    States and local  agencies were asked  in  the audit  questionnaire  if  they
had a generic emission trading (bubble)  rule,  if EPA had  approved  the rule,
and the number of bubbles submitted to the EPA Regional Offices  in FY 1984.
Also, in preparing this report,  the Office of  Air Quality Planning and
Standards has included clarifying  information  obtained  from  the  EPA  Regional
Offices after the individual  audit reports were completed.

     Nineteen States  indicated in  the questionnaire that  they  have generic
bubble rules.  Of the 19, only 8 have generic  rules which have been  formally
approved by EPA.  To  have operable programs, the remaining States  should
either have these overall  rules  approved  by  EPA rulemaking or  submit indi-
vidual bubble actions to EPA as  SIP revisions.  If States do not,  the
source may be subject to EPA enforcement  action (because  the source  is  not
in compliance with the EPA-approved SIP).

Bubbles Issued Under  EPA-Approved  Generic Rules

     State and local  agencies were asked  in  the questionnaire  to indicate
the total number of bubbles ever issued  under  their generic  rules.   Six of
the agencies with EPA-approved generic rules have issued  only  four bubbles
(total) under these rules.   Two  of these  four  individual  bubbles were
submitted to EPA for  approval anyway.

     The audit results indicated that the other control agencies with
approved generic rules have issued about  31  individual  bubbles.  Fifteen of
these 31 were granted by one State.  Many of these 15 generic  bubble actions
have problems.  After consultation with  the  appropriate EPA  Regional Office,
it was determined that most, if not all, have averaging periods longer than
24 hours, although the State does  not have the authority  under its EPA-
approved generic rule to issue bubbles with  averaging times  in excess of 24
hours.

     Consultation with the Regional  Offices  revealed that some of  these
same 15 bubbles may also have the  following  problems:

--Not all VOC bubbles for surface  coating operations were calculated on a
  solids-applied basis.  (This has the effect  of making the  State  emission
  limit less stringent than the  approved  SIP.)

--Some of these bubbles were granted for  sources in differing  nonattainment
  demonstration areas.  (This error probably adds emissions  to a control
  strategy demonstration with no corresponding deduction  in  that same
  area--i.e., there is no bubble.)

The Region has written to this State identifying these  problems.

     Of the 16 remaining bubbles issued under generic rules by the other
State, two of these were explicitly reviewed by the EPA Region and found
to have problems.  The difficulties uncovered were--
                                   III-42

-------
  --revoking the SIP limit on a source (increasing emission)  while failing
    to tighten the limit on other sources (decreasing  emission);  and

  --failing to identify the sources providing credit  and establishing
    no source-specific limits on sources providing credit.

The quality of the other 14 bubbles in this State is  unknown.

Bubbles Issued under Generic Rules Not Approved by EPA

     Forty-eight bubbles were approved by States under generic  rules
that were not approved by EPA.  All these bubbles should have been sub-
mitted to EPA for formal approval.  Three of the bubbles approved by one
State were submitted to EPA for approval.  One State  refused  to submit any
of its 15 bubbles.  A second State would not submit any of  its  18 bubbles
because it did not see a need to.  A third State has  been informed that it
needs to submit its 12 bubbles, but has only submitted 2 since  the audit.
Lack of formal EPA regulatory action on these individual bubbles  again
places the sources in question in jeopardy of EPA enforcement action based
on the approved SIP limits.

     The audit could not determine if there are other States  with generic
rules that were not reported.
                                   III-43

-------
                           IV.  NEW SOURCE REVIEW

                             EXECUTIVE SUMMARY


     EPA made a notable change in its method for gathering information for
the FY 1985 new source review (NSR) audit.  This change involved the shift
from last year's comprehensive interview of agency staff to an almost exclusive
focus on the examination of agency permit files.  This change added a new
dimension to the audit in that, while EPA could continue to evaluate the
performance level of the State and local agencies, it was also able to gain
a better perspective of how the problems identified affect the collective
population of permits being issued.

     For FY 1985, the NSR audit once again examined the seven major audit
topics originally selected by the NSR audit committee for the FY 1984 audit.
The seven audit topics and an overview of the findings for each topic are
provided below.  EPA conducted on-site audits of 64 air pollution control
agencies, including 48 States,1 the District of Columbia, Puerto Rico, the
Virgin Islands, and 13 local  government programs.

     Altogether, the audited agencies processed approximately 17,150 permits
for new and modified sources of all types and sizes during the time period
upon which this audit report is based.  In turn, EPA auditors spent slightly
less than 950 hours examining to varying degrees a total of 748 permit
files.  Auditors carried out a comprehensive examination of about 60 percent
of these files by completing detailed questionnaires in accordance with
the FY 1985 NAAS guidance.  These questionnaires were then forwarded to EPA
Headquarters, keyed into a computerized audit data base, and analyzed to
prepare this national audit report.

     This year's audit findings support many of the findings from the FY 1984
NSR audit.  That is, the findings again indicate that most agencies perform
their overall new source review program responsibilities reasonably well,
although, in a number of State and local agencies,  problems were identified
with respect to the consistent and adequate application of certain specific
program requirements.  In addition, auditors occasionally mentioned that
noticeable improvements had already been observed in the performance of
some agencies where specific deficiencies had previously been found.  This
is certainly to the credit of these agencies in their efforts to improve
in-house performance and to strive for a greater level of national consistency.
1 The State of California does not have authority to issue permits and does
not implement a preconstruction review program.  Instead, all source permitting
activities in California are performed by local air pollution control
districts.  A new source review audit was not done for the State of Hawaii
during FY 1985 because of an EPA Region IX decision not to conduct an
on-site agency visit.
                                    IV-1

-------
Public Notification Procedures

     Overall, agencies generally complied with the public participation
requirements as applied to major sources, although EPA found some incon-
sistencies in the informational content of the public notices.  However,
while most proposed permits for NSR/PSD sources were announced to the
public for an opportunity to review and comment, the majority of other
permits were not.  This was demonstrated by the fact that EPA found evidence
of public notification in only 38 percent of the non-NSR/PSD permits which
auditors examined.  Moreover, more than 40 percent of the agencies reportedly
did not issue public notices for any of the non-NSR/PSD permits which EPA
examined.

     Many agencies are reluctant to go through the formal notification
process for each and every permit that they review.  EPA agreed, as a result
of last year's similar findings, to reassess the SIP requirement which, when
literally interpreted, requires public notification for all permits.  While
this reassessment is being made, EPA plans to continue evaluating and
reporting on each agency's public notification procedures for all permits.
However, EPA will be more critical in its assessment of how well agencies
apply their procedures to major new and modified sources.

Applicability Determinations

     A significant number of agencies continued to experience difficulties
with the way that they carry out the source applicability process.  EPA
believes that approximately 15 percent of the audited permits not reviewed
under PSD or NSR requirements probably should have been.  In addition, the
lack of sufficient file documentation often precluded EPA's ability to
adequately evaluate the agencies' applicability determinations.

     EPA identified various types of problems, but most pertain to the way
that agencies account for a new or modified source's emissions in order to
determine whether a major review would be required under either PSD or
nonattainment area regulations.  One particular problem is the misuse by
numerous agencies of the concept of "potential to emit," which involves
the use of Federally enforceable permit conditions to properly restrict
a source's potential emissions (as is often attempted in order to enable
a source to avoid major source review).  EPA intends to provide agencies
with guidance on how to correctly apply the concept of "potential to
emit."  One way in which this will be done is through training courses
beginning in early 1986.

BACT/LAER Determinations

     Agencies generally did a good job of applying the BACT requirements to
PSD sources and applicable pollutants.  However, EPA concluded that the
quality of the analysis performed to select the level of control defining
BACT on a case-by-case basis could be improved in some instances.  As was
the case last year, EPA also found that when a PSD source was subject to
NSPS, agencies had a strong tendency to accept the use of the applicable
NSPS to define BACT.  Even though examples of BACT determinations more
stringent than NSPS were identified in PSD permits issued by 12 of 24 agencies,
BACT was established at levels required by NSPS for approximately 80 percent
of the pollutant determinations.

                                    IV-2

-------
     Agencies showed far less tendency to use NSPS for LAER determinations
in nonattainment areas.  Slightly over 50 percent of the time, agencies
required pollutants to be controlled at levels more stringent than the
applicable NSPS.  Nevertheless, these findings suggest that BACT and LAER
requirements do not yet have the technology-forcing effect that Congress
had envisioned when it established the requirements, and that EPA needs to
provide more explicit guidance for making BACT/LAER determinations.

Ambient Monitoring

     This year's audit substantiated a finding that EPA made last year.  That
is, agencies commonly allow PSD applicants to comply with the preconstruction
monitoring data requirements by relying upon existing ambient air quality data
instead of new data collected from a special source-operated monitoring
network.  EPA accepts the use of such existing data if it can meet certain
criteria for representativeness.  Of concern to EPA is the finding that in
almost half of the cases where agencies accepted existing data, the permit
files (a) offered no documented basis for allowing its use, or (b) contained
some description of the data but failed to adequately address or meet all
of the EPA criteria.

Ambient Impact Analyses

     EPA verified that agencies generally required ambient impact analyses
to be conducted where needed to ensure protection of the increments and
NAAQS.  In addition, most agencies generally used or required the use of
the appropriate models and model options to complete the NAAQS analyses.
However, certain questions arose concerning the quality of these analyses
in a number of cases.  With respect to both PSD increment analyses and
NAAQS analyses, agencies did not appear to consistently give thorough
attention to significant emissions from existing sources (major and minor)
located within the impact area of the proposed source.  Another key problem
was the lack of sufficient documentation of the details of the analyses.
This prevented EPA from adequately evaluating the analyses contained in
a significant number of files.

Emission Offset Requirements

     Only 20 percent of the audited agencies issued NSR permits to major
sources in nonattainment areas during the audit period.  EPA's examination
of the NSR permits provided no indication of any nationally significant
problems.  As with other phases of the audit, however, EPA encountered
problems with inadequate file documentation and this hindered an adequate
evaluation of the full creditability of some of the emission offsets that
agencies required.

Permit Specificity and Clarity

     This year's audit raises the same concern that was originally described
last year about the enforceability, and more specifically the Federal
enforceability, of some of the permits which agencies are issuing.  The
acceptable use of physical  and operational limitations to restrict the year-
round operation and production capacity of a source hinges upon the Federal
enforceability of the limitations.   EPA auditors questioned the enforceability
of such presumed limitations in a number of cases where sources had been
allowed to avoid major source review.

                                    IV-3

-------
A.  INTRODUCTION

     For FY 1985, the new source review (NSR) audit examined the ways that
the State and local agencies implement programs for the preconstruction
review of new and modified stationary sources.  As was the case for the
previous year, the FY 1985 audit examined the seven basic topics which were
selected by the NSR audit committee.   These topics included:  (1) Public
Notification Procedures (formerly Administrative Procedures), (2) Applicability
Determinations, (3) BACT/LAER Determinations, (4) Ambient Monitoring (PSD),
(5) Ambient Impact Analysis, (6) Emission Offset Requirements, and (7) Permit
Specificity and Clarity.

     While the audit continued to examine the same selected NSR topics, the
approach differed notably in that the FY 1985 audit focused primarily upon
the inspection of State and local agency permit files, instead of gaining
information through the interview of  agency staff.  Thus, the audit findings
described herein pertain largely to contents of the permit files in contrast
to the focus on agency rules and procedures as described in the FY 1984
audit report.

     EPA obtained the information for this audit report through the use of
several types of questionnaires:  (1) a permit summary questionnaire which
each audited agency was asked to complete, (2) an NSR audit summary question-
naire which the EPA auditors completed following the actual onsite audit,
and (3) two kinds of permit file questionnaires which were completed by the
EPA auditors during the  audit.   These questionnaires were used to collect
information found in a selected portion of the total number of State and
local agency permit files which the auditors examined.  (Each of the question-
naires was explained in the FY 85 audit guidance manual where copies of
each can be found.)

     The NSR audit considered information obtained from 64 air pollution
control agencies, including 48 States*, the District of Columbia, Puerto
Rico, the Virgin Islands, and 13 local government programs.  Where a State
program was carried out by one or more offices (i.e., headquarters or
central office plus district offices) and more than one of the offices was
audited, EPA considered them all as part of one State program (agency).
Local agencies were considered separately, even though there may have been
a dependency on the State agency for  certain program operations in some
cases.

     For the time period upon which this report is based, the audited agencies
processed approximately 17,150 permits, including 115 PSD and 57 NSR permits.
EPA, in turn, examined 87 percent of the PSD permits, 42 percent of the NSR
permits, and about 4 percent (624) of the other (non-NSR/PSD) permits.
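As a hypothetical aside (not part of the original audit), the examination rates quoted above can be cross-checked with simple arithmetic; the sketch below confirms that the stated counts and percentages are mutually consistent.

```python
# Back-of-the-envelope consistency check of the permit counts quoted
# above.  All inputs come from the report text; the calculation itself
# is illustrative only.
total_permits = 17150
psd_permits = 115
nsr_permits = 57
other_permits = total_permits - psd_permits - nsr_permits  # non-NSR/PSD

psd_examined = round(0.87 * psd_permits)  # "87 percent of the PSD permits"
nsr_examined = round(0.42 * nsr_permits)  # "42 percent of the NSR permits"
other_examined = 624                      # stated directly in the text

print(psd_examined)                                 # 100
print(nsr_examined)                                 # 24
print(round(100 * other_examined / other_permits))  # 4, "about 4 percent"
```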
1 The State of California does not have authority to issue permits and does
not implement a preconstruction review program.  Instead, all source permitting
activities in California are performed by local air pollution control
districts.  A new source review audit was not done for the State of Hawaii
during FY 1985 because of an EPA Region IX decision not to conduct an
on-site agency visit.

                                    IV-4

-------
     While each auditor was encouraged to examine as many permit files as
he or she could, the NSR audit guidance did not require the completion of
a questionnaire for each permit file that was examined during the audit.
Instead, a certain minimum number of PSD, NSR, and non-NSR/PSD source
questionnaires was prescribed.  The total number of permit file questionnaires
submitted to EPA Headquarters, and subsequently keyed into a computerized
data base and analyzed to prepare the enclosed audit findings, represents
about 60 percent of the number of files actually examined.  Reference to a
percentage of the "total number of audited files" as used throughout this
report refers only to those permits which were included in the EPA Headquarters
data base, as follows:  59 PSD permits, 14 NSR permits, 5 permits involving
both PSD and NSR, and 360 non-NSR/PSD permits.

     Some findings of general  interest are:

     ° For 53 of the 64 agencies that were audited, EPA examined all of the
NSR/PSD permits that had been processed during the audit period.

     ° No NSR/PSD permits at all were issued by 23 of the audited agencies,
while two (2) agencies each issued the highest number (16) of PSD permits.

     ° Agencies issued permits to an estimated 1,673 sources whose potential
emissions ranged from 100 to 249 tons per year, but whose preconstruction
review is typically categorized as a "minor" source review.  This is because
these sources, locating in attainment or unclassified areas, are not among
the source categories listed under 40 CFR 51.24(b)(1)(i).  Consequently,
for them to be classified as "major" PSD sources, each would have to emit
at least 250 tpy of any regulated pollutant.

     ° EPA auditors spent approximately 945 hours primarily examining agency
permit files.  On the average, auditors spent 1-1/4 hours on each permit file
and nearly 15 hours per agency audit.  Actual review time for individual
files ranged from 15 minutes to an unusually high 11-1/2 hours.
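The workload averages above also imply a file count consistent with the rest of this section; the sketch below is a hypothetical reconstruction, not a figure the report states directly.

```python
# Illustrative check of the auditor workload averages quoted above.
# The inputs are taken from the text; the implied file count is an
# inference for illustration only.
total_hours = 945
agencies_audited = 64
hours_per_file = 1.25          # "1-1/4 hours on each permit file"

implied_files = total_hours / hours_per_file   # rough number of files
hours_per_agency = total_hours / agencies_audited

print(int(implied_files))            # 756
print(round(hours_per_agency, 1))    # 14.8, i.e., "nearly 15 hours"
```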

     Finally, auditors were asked, on the audit summary form, to identify
from their own overall perspective, the 5 most significant problems encountered
for each agency audit.  From the lists of problems submitted, 4 specific
problem areas stood out as being the most often mentioned.  They are:

      Problem Area                        Frequency of occurrence   Percent of total

 ° Applicability determinations                      37                    21

 ° Permit conditions                                 29                    17

 ° Documentation                                     25                    14

 ° Public participation requirements                 21                    12
                                                    ---                   ---
                                                    112                    64

Four other problem areas each  constituted 5 percent of the total responses:
BACT and LAER determinations,  PSD increment analyses, dispersion modeling,
and NAAQS protection.

                                    IV-5

-------
B.  SUMMARY OF MAJOR FINDINGS

     This section summarizes the major findings of the 1985 NSR audit.  The
findings that are determined to have national implications are discussed in
greater detail than are problems which appear to be more isolated in nature.
For a better understanding of how the major findings were derived, the
reader is referred to Sections IV.C.  through IV.I., where a breakdown of
the individual audit questions and responses is provided.

1.  Public Notification Procedures

     ° While most proposed permits for NSR/PSD sources are announced to the
public for review and comment, most other permits  are not.   EPA found
evidence of public notification in only 38 percent of the non-NSR/PSD source
permits that were examined.  Agency rules excluded approximately half of the
minor source permits from the public notification  process.   (See Figure 1.)

     ° Forty-two (42) percent of the audited agencies did not issue public
notices for any non-NSR/PSD permits which EPA reviewed.  One third of the
agencies issued public notices for all permits reviewed, while 25 percent
issued notices for some, but not all, of their audited permits.  (See Figure 1.)

     ° Sixty (60) percent of the agencies that issued PSD permits routinely
included the information required by regulation for PSD public notices.
Only half of the public notices for PSD sources included all of the information
required by the PSD regulations.  The information  most frequently omitted
was the description of the source's estimated ambient impact, including the
amount of PSD increment being consumed.

     ° Federal Land Managers were not always notified of PSD construction
which might adversely affect Class I areas.  Sixteen of 22 PSD permits involving
construction within 100 km of a Class I area were  brought to the attention
of the appropriate Federal Land Manager.   No record of notification was
apparent in the remaining 6 permit files -- each issued by a different
agency.

2.  Applicability Determinations

     ° A significant number of agencies are experiencing difficulties in
adequately carrying out the source applicability process.  EPA believes that
approximately 15 percent of the audited non-NSR/PSD permit files involved
sources that probably should have been reviewed as major sources.  A lack
of adequate file documentation often precluded EPA's ability to adequately
evaluate the agencies' applicability determinations.

     ° EPA found examples where 13 agencies either failed to consider certain
pollutant-emitting activities at a source, or improperly interpreted an
exemption provision.  Only in the latter case, however, were sources enabled
to at least partially avoid major source review.  (See Figure 2.)

     ° Agencies continue to have problems properly defining a new or modified
source's "potential to emit."  Thirty-eight (38) agencies had permit files
which EPA considered to be deficient in some respect concerning the calculation
or use of potential emissions for source applicability purposes.  Sometimes,

                                    IV-6

-------
                                     Figure 1
                      PUBLIC NOTIFICATION REQUIREMENTS
                   Agency Performance and Permits Issued

     [Two pie charts, based only on non-NSR/PSD permits.  "Did Agency
     Issue P/N's?" (% agencies):  No - 42%; Yes, routinely - 33%;
     Sometimes - 25%.  "How Were Permits Issued?" (% permits):  Without
     P/N - 48%; With P/N - 38%; Undetermined - 14%.]

                                     Figure 2
                        APPLICABILITY DETERMINATIONS
                              Agency Performance

     [Bar chart, all audited agencies = 100%:  "netting" problems - 31%;
     source problems - 20%.  Percentages are mutually exclusive.]

                                          IV-7

-------
but certainly not always, these problems resulted in agencies not subjecting
proposed sources to the correct preconstruction review requirements.   (See
Figure 2.)

     ° EPA identified 20 agencies who either did not require the proper
"netting" procedures to be followed to calculate the change in emissions at
a modified source, or did not provide sufficient file documentation to
enable adequate evaluation of the agency's procedures.  In these agencies,
EPA found at least 35 files with procedural or documentation problems.
(See Figure 2.)

     ° EPA found no evidence of the improper "double counting" of emission
reduction credits used for netting purposes.   Some agencies need to be
more careful about ensuring that each emission reduction credit is made
Federally enforceable--an important criterion for properly using emission
reductions for netting purposes.

3.  BACT/LAER Determinations

     ° Most agencies routinely complied with the PSD requirement for applying
BACT to each regulated pollutant emitted in significant amounts.   EPA found
exceptions in a total of 8 permits issued by 5 agencies.  In only 2 agencies
did the problem occur in more than 1 permit.

     ° Approximately 85 percent of the agencies who conducted PSD reviews
last year required the evaluation of alternative control strategies as part
of the BACT determination process for at least some of the PSD reviews
which they conducted.  However, one-third of the reviews where alternatives
were formally considered failed to fully address the impacts of each alternative
in order to properly demonstrate the rationale for selection of a particular
control technique.

     ° Twenty-five (25) percent of the agencies who conducted PSD reviews
last year did not appear to consistently check the applicants' BACT analyses
to verify their accuracy.  EPA concluded that in some cases little or no
independent agency review was likely to have occurred,  while in other
instances it appeared to be more a question of whether the agencies failed
to include documentation of their own analyses.

     ° Collectively, agencies showed a strong tendency to accept the use
of the applicable NSPS to define BACT for PSD sources.   Even though examples
of BACT determinations more stringent than NSPS were found in PSD permits
issued by 12 of 24 agencies, BACT was established at levels required by
NSPS for approximately 80 percent of the pollutant determinations for BACT
(see Figure 3).

     ° Agencies showed far less tendency to use NSPS for LAER determinations
than for BACT.  Agencies required emissions limits more stringent than NSPS
to establish LAER in 6 of 11 pollutant determinations, affecting 9 major
nonattainment area sources otherwise subject to NSPS.
                                    IV-8

-------
                                     Figure 3
                              BACT DETERMINATIONS
                  Relative Stringency of BACT Determinations

     [Bar chart of pollutant determinations (10 to 50) by number of
     agencies, distinguishing controls more stringent than NSPS from
     controls equal to NSPS.  Only pollutant determinations for which
     NSPS applied were considered.]

                                     Figure 4
                        AMBIENT AIR QUALITY ANALYSIS
                              Agency Performance

     [Two pie charts (% agencies).  PSD Increment:  adequate analyses
     (19%), inadequate analyses (16%), insufficient documentation, N/A.
     NAAQS Protection:  typically adequate (22%), inconsistent (13%),
     inadequate (22%), insufficient documentation, N/A.]

                                          IV-9

-------
4.  Ambient Monitoring

     ° With only a few possible exceptions, agencies typically required PSD
applicants to address the preconstruction monitoring requirements where applicable.
Where agencies did exempt PSD applicants from the requirements, permit files
usually provided an adequate demonstration that the proposed sources' impacts
were de minimis.  However, 4 agencies failed to explain why a total of 5 sources
were totally exempted from the monitoring requirements.

     ° Twenty (20) agencies required a total of 34 applicants to comply with
the preconstruction monitoring requirements.  Thirty (30) PSD applicants
were allowed to use only existing ambient air quality data.  Four (4)
agencies each required 1 PSD applicant to monitor for 1 or more pollutants.

     ° In approximately half of the cases where agencies accepted the use of
existing data, the permit files (a) offered no documented basis for allowing
its use, or (b) contained some description of the data but failed to adequately
address or meet all of the EPA criteria for representative data.

5.  Ambient Air Quality Analysis

PSD Increment Analysis—

     ° Twenty-nine (29) agencies required 46 PSD applicants to meet either
the TSP or SO2 increments or both.  EPA found no PSD reviews for which an
increment analysis should have been done but was not.  However, in more than
half of the affected agencies, EPA found that either:  (a) the analyses
did not adequately address existing major and minor source emissions which
also consumed increment, or (b) the permit files did not provide sufficient
information to enable auditors to evaluate the analyses.  (See Figure  4.)

     ° With only one exception, agencies typically gave adequate consideration
to both long- and short-term PSD increments.

     ° None of the 16 PSD permits for which agencies required a Class I
increment analysis revealed any problems related to special Class I area
considerations.

NAAQS Protection--

     ° EPA did not identify any NSR/PSD permits for which a NAAQS analysis
was completely omitted but should have been required.  However, 5 agencies
were found to have (a) incorrectly omitted certain pollutants from analysis,
or (b) lacked a sufficiently comprehensive review of some pollutants.

     ° Most agencies appear to scrutinize non-NSR/PSD source permit applications
individually to determine whether a NAAQS analysis should be done.   However,
18 agencies were found to have issued permits to sources who probably  should
have been subjected to NAAQS analyses but were not.  Five (5) of these agencies
may not require a NAAQS analysis for any non-NSR/PSD source construction.

     ° Thirty-one (31) percent of the agencies had files which typically
(a) lacked sufficient documentation to enable EPA to determine whether and
to what degree source interactions had been considered in the NAAQS analysis,

                                    IV-10

-------
or (b) omitted significant emissions from other sources in the vicinity of
the proposed source.  (See Figure 4.)

Dispersion Models--

     ° Most agencies generally used or required applicants to use the
appropriate models and model options in the NAAQS analyses performed.
However, the lack of sufficient documentation to fully describe the rationale
for the use of particular models and the methods used was a hindrance to
the auditors in many instances.

     ° Apparently, most agencies do not often require minor sources to
perform the modeling analysis.  Eighty (80) percent of the time,  the minor
source analyses were performed by the agencies themselves; but in the 15
cases where the applicants did submit an analysis, about half of  the analyses
did not appear to be adequately checked by the responsible agency.

6.  Emission Offset Requirements

     ° Thirteen (13) agencies issued NSR permits to major sources in
nonattainment areas.  EPA found a few examples of areas where agencies
experienced specific problems but found no examples of emissions  offsets
that were not Federally enforceable.  Also, agencies typically required
offsets to occur on or before the time of new source operation and to be
expressed in the same manner  as emissions used for the demonstration of
reasonable further progress.

     ° Four (4) agencies did not provide sufficient information in their
NSR files to enable EPA to adequately evaluate the full creditability of
the emission offsets.

     ° Three (3) agencies did not account for area and minor source growth
that had occurred since the last NSR permit and should have been  offset
by the proposed source.

7.  Permit Specificity and Clarity

     ° Twelve (12) agencies did not appear to routinely state or reference
each source's allowable emissions in the applicable permit.  The omission
appeared to be a common occurrence, at least for non-NSR/PSD sources, in
8 of these agencies.

     ° At least 15 agencies did not appear to routinely identify each emission
unit along with its allowable emission limit in the permits.  In some cases,
it may be agency policy to do so primarily for PSD permits or where needed to
avoid NSR/PSD review.

     ° Nineteen (19) agencies reportedly either did not state or reference
compliance test methods in the permits at all, or did not do so consistently.
Where the practice is not followed consistently, it is not clear  what
criteria, if any, agencies use to determine whether the methods are to be
stated or referenced in the permits.
                                    IV-11

-------
C.  PUBLIC NOTIFICATION PROCEDURES

     The audit examined State and local agency procedures for notifying the
public of proposed permit actions.  These procedures were reviewed with
specific concern for (1) the types of sources for which notices were issued,
(2) the adequacy of the information contained in the notices, and (3) the
extent to which other agencies and officials are informed of pending permit
actions which could affect their jurisdictions.

1.  For which new or modified sources was the public afforded an opportunity
to comment on proposed permits?

     The FY 1985 audit findings indicate that while most proposed permits
for NSR/PSD sources are announced to the public for comment, most other
proposed permits are not.  Approximately 90 percent of the NSR/PSD permits
examined by the auditors were announced through a public notice, while only
38 percent of the audited non-NSR/PSD permits included public notices.  In
terms of the agencies who issue the notices for proposed permits, the findings
support the FY 84 audit results which indicated that approximately one-third
of the agencies routinely notify the public of their proposed permits.  This
year's findings reveal that:

     ° 33 percent of the agencies issued public notices for all of the permits
that were examined;

     ° 25 percent issued public notices for some, but not all, audited permits;

     ° 42 percent did not issue public notices for any of the audited non-NSR/PSD
permits.

     Of the 77 NSR/PSD permit files examined by the auditors, 3 PSD files
and 4 NSR files contained no evidence that the proposed permits were announced
to the public for review and comment.  The three PSD permits were issued by
three separate agencies.  In one case, the omission of public notification
was a processing error apparently resulting from the fact that the source
was initially reviewed as a minor source.  The second involved a PSD source
whose permit was modified and the State did not consider a public notice
appropriate (although the EPA Regional Office disagreed).  No explanation
was provided for the omission of the public notice in the third case.  Each
of the identified agencies issued at least one other PSD permit during 1984
and for each the public was properly notified.

     Four of the 19 audited permit files involving major review in a nonattainment
area contained no evidence that public notification had been provided.  Three
of the four NSR permits were issued by the same agency.  Auditors did find
that other permits requiring emission offsets were issued by that agency and
public notices were issued for those other permits.

     With respect to the non-NSR/PSD files, auditors found many permits for
which there was no evidence of public notification.  Only 38 percent of the
audited permit files clearly indicated that public notification occurred.
Of the remainder, auditors concluded that 46 percent had been exempted from
notification by agency rules, and 2 percent were excluded from notification
as a result of a "processing error."  The remaining 14 percent of the

                                   IV-12

-------
questionnaires contained no response at all to this question primarily because
of the lack of documentation in the files.

     In order to help assess the general value of public notification for
non-NSR/PSD permits, auditors were asked to indicate whether the public
notices resulted in any comments.  Of those minor permits for which a
public notice was issued, more than half (55 percent) contained evidence
in the files that public comments had been submitted.  Most responses did
not indicate how many comments had been received, nor were the auditors asked
to evaluate the quality of the comments.

2.  Do the public notices routinely provide adequate information?

     Auditors were asked to determine whether the following items of information,
required under the PSD regulations, were included in the public notices
issued by State and local agencies:

     a.  Opportunity for written comment;

     b.  Opportunity for a public hearing;

     c.  Description of the agency's preliminary determination to approve
or disapprove the permit;

     d.  Description of the source's estimated ambient impact;

     e.  Statement of the availability of additional information for public
inspection.

     Of the 38 agencies issuing PSD permits during the FY 1985 audit period,
60 percent routinely addressed the information requirements in an adequate
manner.  The remaining 40 percent of the agencies were inconsistent at
best.  Wherever information was omitted from a notice, a description of the
source's estimated ambient impact was always missing.  Half the time, other
required information was missing as well.  Overall, only half of the public
notices for PSD permits were found to contain all of the required information.

     Other types of permits, including those subject to major review in a
nonattainment area typically did not contain all the information items listed
above.  The informational content of non-PSD permits is not as clearly
delineated by regulation, so the following is provided primarily for comparative
purposes.  Frequently omitted from non-PSD permits was the description of
the source's estimated impact.  Eight agencies did address source impact in
some of their notices.  Less frequently omitted was the agency's preliminary
determination, yet it was not found in 43 percent of the notices.

3.  Were other State and local air pollution control agencies and other
officials whose jurisdictions might be affected by the proposed new or
modified source adequately notified of the proposed action?

     Auditors identified 13 agencies where it did not appear that this part
of the notification procedure was being adequately carried out.  In some
cases it was not apparent that outside agencies or officials were notified.
In a few cases neighboring States were not informed of a proposed source's
impact when appropriate.  Approximately 30 percent of the NSR/PSD permit

                                   IV-13

-------
files contained no evidence of efforts to notify other agencies and officials,
including EPA.

     EPA policy calls for the notification of the appropriate Federal  Land
Manager (FLM) when a PSD source would propose construction within 100  km of
a Class I area.  Twenty-two (22) PSD permits issued by 14 agencies involved
construction within such range.  Auditors verified that 16 of the 22 PSD
permits were brought to the attention of the appropriate FLM.  No record of
notification was apparent in the remaining 6 files--each issued by a different
agency.

D.  APPLICABILITY DETERMINATIONS

     The specific types of requirements which are to apply to a proposed
new source or modification are generally based on the size of the new  source
or modification, expressed in terms of its "potential to emit," and the
geographic location where the proposed construction would occur (attainment
vs. nonattainment area).  The task of making the appropriate applicability
determinations depends upon the existence of adequate regulations containing
the proper definitions and applicability criteria, plus the in-house expertise
to correctly apply them to each incoming application for a permit.

     EPA auditors examined the selected permit files to evaluate each
agency's ability to adhere to the approved definitions of "source" and
"potential to emit," and how well each agency verified and corrected,  where
necessary, the emissions estimates provided by the applicants.   As was the
case in last year's audit, the overall findings pertaining to applicability
determinations suggest that a significant number of State and local agencies
are experiencing difficulties in adequately carrying out the source
applicability process.  Overall, EPA found that:

     o Approximately 15 percent of the audited non-NSR/PSD permit files
involved sources that, in the auditors' judgment, should have been reviewed
as major sources.

     o Another 5 percent of the audited non-NSR/PSD files did not contain
sufficient information about the sources' emissions to enable the auditors
to indicate whether the correct applicability determinations had been made.

     Described below are the findings as they relate to the various aspects
of the applicability determination process.

1.  Does the agency properly apply its approved definition(s) of "source"?

     EPA found, in 13 agencies, 20 non-NSR/PSD permits for which certain
pollutant-emitting activities had not been considered in defining the
subject source.  However, EPA concluded that none of these sources would
have been required to undergo PSD or NSR review.

     EPA did, however, identify other problems that, while not related to
the definition of source, involved other source-related issues.  These
source-related problems, affecting 13 agencies, kept sources from being
properly regulated under the agencies' permit requirements.  No one problem
was widespread, and correction of each would appear to require greater
attention on the part of each agency to correctly interpret its applicable
regulations.  EPA identified the following types of problems:

     a.  Failure to identify the proper SIC classification for PSD applicability;

     b.  Exemption of certain pieces of equipment when determining source
emissions increases and decreases;

     c.  Improper exemption of temporary, portable, and stand-by sources
from PSD review;

     d.  Omission from PSD review of pollutants emitted in significant
amounts, resulting from misinterpretation of "major modification" at a source; and

     e.  Failure to document all or part of the applicability process, thus
preventing a judgment of whether "source" was adequately addressed.

2.  Does the agency typically use the best available emissions projections
and Federally enforceable limitations in defining a source's "potential  to
emit"?

     A source's potential to emit (PTE) is its maximum capacity to emit a
pollutant under its physical and operational design.  For any physical or
operational limitation (e.g., less than 24-hour, year-round operation, or a
fuel usage restriction) to be considered part of the source's design, thereby
restricting the source's maximum pollutant-emitting capacity, the limitation
must be Federally enforceable.  The major status of new or modified sources
must be determined on the basis of their potential emissions.

     Thirty-eight (38) agencies were found to have problems with their
procedures for establishing a source's PTE.  Sometimes,
but certainly not always, these problems appear to have resulted in incorrect
applicability determinations.  Problems related to the agencies' determinations
of PTE can be broken down as follows:

     a.  Failure to ensure the Federal enforceability of all physical and
operational limitations used in the PTE calculations;

     b.  Failure to address the major source status (PTE) of the existing
facility at which a modification is being proposed;

     c.  Use of emissions factors that are not well-established or well-
documented; and

     d.  Failure to include quantifiable fugitive emissions where applicable.

     For any one or more of these reasons, EPA considered the PTE determination
in approximately 30 percent of the audited non-NSR/PSD source files to be
deficient.  More importantly, at least one-fifth of the files where EPA
found deficiencies reportedly should have been reviewed as major sources.

     In 36 agencies, EPA found permit files for which the agencies (a) did
not properly ensure the Federal enforceability of all physical and operational
limitations upon which emission estimates were calculated, or (b) did not
adequately consider the potential emissions of existing facilities where a
modification was being proposed.
     EPA identified at least 28 permits where the agencies simply did not
establish permit conditions defining the necessary limitations upon which
the sources' estimated emissions were based.   In other permits, some necessary
limitations were either not addressed at all  or were inadequately restricted.
Sometimes the limitations were specified in operating permits which EPA
generally does not regard as being Federally enforceable, but which are
usually enforceable by State and local agencies.

     It is also important to point out that some agencies consider the
limitations to be enforceable if they are contained in the permit application.
Apparently some agencies include a general condition in permits which
links the applicants' plans and specifications to the permits.  It is not
clear when and how often auditors took this into account when evaluating the
Federal enforceability of the limitations.

     In cases where a permit involved a modification to an existing source,
EPA sometimes found that no determination of the existing source's PTE was
made.  While it is true that the existing source's PTE is irrelevant for
the immediate applicability determination,  when the proposed emission
increases would not exceed prescribed significance levels, it is nevertheless
important to know what the source's cumulative PTE is for consideration in
subsequent modification proposals by that source.  Files, in some cases,
did not appear to contain any documentation of the existing source's PTE or
of cumulative emissions for future reference.

     Eleven (11) agencies issued 21 permits (6 percent of the audited non-NSR/PSD
source files) that did not adequately address fugitive emissions.  Reportedly,
one source would likely have been required to undergo major source review
if the quantifiable fugitive emissions had been included in the emission
calculations.  Twice as many non-NSR/PSD source files, i.e., 12 percent, did
not provide sufficient documentation of the emission calculations to enable
EPA to verify whether fugitive emissions were properly considered.

     Finally, in 24 separate agencies, EPA identified 40 non-NSR/PSD source
permits (11 percent of the minor source permit files audited) for which auditors
concluded that well-established or well-documented emissions factors were
not used to estimate source emissions.  An additional 32 files did not
provide enough information to allow a good understanding of how the emission
estimates were calculated.

3.  Does the agency use as its netting baseline actual emissions expressed
in tons per year?

     No specific problems involving the NSR/PSD permit files were found.
Auditors did indicate, however, that insufficient documentation prevented
an affirmative conclusion from being drawn  in just a few cases.

     With respect to the non-NSR/PSD files, EPA identified 20 agencies that
either did not require the proper procedures to be used to calculate a net
change in emissions, or did not provide enough information to enable the
auditors to determine whether actual emissions were correctly calculated.
In these agencies, EPA found at least 35 examples of specific procedural
problems or documentation problems.  The findings indicate that:


     o Ten (10) agencies allowed proposed modifications to determine their
net change in emissions on the basis of potential or allowable emissions
rather than actual emissions.  At least 5 of the agencies permitted new
replacement units, of equal capacity to units being shut down, without
considering the net change in actual emissions.

     o Three (3) agencies did not properly determine actual emissions changes:
2 failed to use a tons-per-year emission baseline, while 1 did not use emissions
that were representative of normal source operation.

     o Thirteen (13) agencies did not provide sufficient information in
some files to enable the auditors to determine how the emission changes were
calculated.
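The replacement-unit problem noted in the first finding can be illustrated with a hedged numeric sketch (all figures are hypothetical): netting on allowable rather than actual emissions can make a real emissions increase look like a decrease.

```python
# Hypothetical netting sketch (all figures illustrative, tons per year).
# A new unit replaces an existing unit being shut down.

old_unit_allowable = 250.0   # permit limit of the unit being retired
old_unit_actual = 90.0       # its actual emissions during normal operation
new_unit_potential = 240.0   # potential to emit of the replacement unit

# Incorrect baseline: crediting the retired unit's allowable emissions
# makes the project look like a net decrease.
net_using_allowables = new_unit_potential - old_unit_allowable   # -10.0 tons/yr

# Correct baseline: actual emissions in tons per year show a 150 ton/yr
# increase, which is what the applicability determination must use.
net_using_actuals = new_unit_potential - old_unit_actual         # 150.0 tons/yr
```

With the allowable-emissions baseline the project appears to net out of review entirely, even though actual emissions to the atmosphere would rise substantially.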

4.  Does the agency check applications for proper use of contemporaneous
emission changes to prevent the "double counting"  of emission decreases
for netting purposes?

     No evidence of double counting was found in the audited permit files.
Of some concern, however, is (1) the lack of documentation to verify that
double counting has not occurred, and (2) the apparent failure to make the
emission reduction credits Federally enforceable.

     The lack of documentation was indicated in seven agencies as the reason
why auditors could not verify that double counting had not occurred.  But
it should be noted that there were no suggestions that any problems were
suspected.

     EPA requirements stipulate that emission reductions must be made
Federally enforceable.  Reportedly, this was not done in one permit at each
of 10 agencies.  In 7 of these agencies, emission reduction credits
were not addressed at all in permit conditions, thus raising the question
of whether the reductions are enforceable even by the affected agencies.
Making the emission reductions enforceable conditions of the permit also
helps to ensure that subsequent double counting of such emissions will not
occur inadvertently.

5.  Does the agency properly apply the §107 area designations when determining
what type of preconstruction review will be required of major construction?

     No clear examples of the misapplication of §107 area designations were
identified.  One agency apparently continues to provide for a "clean spot"
exemption in its permit requirements for nonattainment areas.  To date,
however, this apparently has not resulted in any applicability determinations
that are inconsistent with EPA requirements.

6.  Verify that the agency does not approve major construction projects in
designated nonattainment areas under an EPA-imposed construction moratorium.

     The audit findings produced no problems of national significance.
However, auditors questioned two minor source permits issued to sources
locating in nonattainment areas where EPA had imposed construction bans.
One source actually operated at levels below significant emissions rates,
but was a major source in terms of its potential to emit.  No corrective
action was recommended because the source had shut down.  In the other case,
the source was approved as a minor source apparently because an allowable
(rather than actual) emission baseline was used for the netting calculations.
A recommendation was made in this case to re-evaluate the permit.

E.  BACT/LAER DETERMINATIONS

     In this section, the audit examined several aspects of the BACT/LAER
control technology requirements that are generally applicable to PSD sources
and major new and modified sources in nonattainment areas.  However, more
weight was given to the BACT analysis for this year's audit.  With respect
to BACT, emphasis was put on whether agencies were requiring an adequate
analysis of each regulated pollutant emitted in significant amounts.
Prescribed significance thresholds applicable to each pollutant are defined
by the PSD regulations.

     In order to get a better idea of how thoroughly the BACT analyses are
being carried out, additional questions were asked to determine whether the
analyses routinely considered more than one possible control technology,
and whether the agency routinely took it upon itself to verify the analyses
submitted by the applicants.

     The audit also sought to determine the extent to which the BACT/LAER
requirements are functioning as technology-forcing requirements.  This was
accomplished by asking the auditors to determine the relative stringency
of the BACT/LAER determination for each major source audited on the basis
of applicable NSPS and NESHAP standards, which serve as the minimum control
requirements legally allowed for BACT and LAER.

1.  Does the BACT analysis consider each regulated pollutant emitted in
significant amounts?

     Of the 32 agencies issuing PSD permits, most appear to be complying
with this PSD requirement.  The auditors found exceptions, however, in a
total of 8 PSD permits issued by 5 agencies.  In only 2 agencies did the
problem occur in more than one permit.

     Some pollutants were not considered for BACT because of an apparent
failure on the part of the audited agencies to address potential emissions,
i.e., actual emissions were incorrectly used.  It is presumed that when this
practice is corrected, any pollutants calculated to be potentially emitted
in significant amounts will be properly considered for BACT.

2.  Does the review agency require consideration of more than one BACT
control technology?  If so, to what extent are economic, energy, and
non-air environmental impacts considered?

     Last year's audit indicated that most agencies appear to require PSD
applicants to analyze more than one control technique as part of their BACT
selection process.  Some agencies noted, however, that this requirement
was not always implemented if a particular control technique was regarded
as "obvious" or common for a particular source.  A few agencies claimed
that a preapplication meeting with the applicant was used to determine BACT;
therefore, an analysis of alternatives did not need to be contained in the
PSD application.

     This year's audit findings generally support the agencies'  claims in
that most (84 percent) of the audited PSD files did address, to some degree,
consideration of alternative control techniques.  However, the overall
quality of the BACT analyses was questioned in a significant number of cases.
Specifically, auditors found that:

     o Fifty-three (53) percent of the agencies where PSD permit files were
examined had files in which control alternatives were routinely considered;

     o Thirty-one (31) percent had some files which addressed alternatives
while other files did not;

     o Sixteen (16) percent of the agencies had no PSD files which addressed
alternative controls for BACT; and

     o One-third of the PSD permit files where control alternatives for BACT
were considered failed to adequately address the impacts of each alternative
in order to demonstrate the rationale for selection of a particular control
technique.

     Eighteen (18) PSD permits (approximately 30 percent of the PSD permits
audited), issued by 13 agencies, did not address alternative controls at all.
Yet, it could not be determined in every case that the control technique
used to define BACT was always the most "obvious" choice.  Auditors noted
that in some cases the applicant claimed the best control(s) had been selected,
but this was rarely confirmed by the auditor.  In some cases, auditors noted
that only NSPS were considered.  Thus, it would appear that the omission of
other control techniques from consideration may not always be acceptable.
EPA intends to examine this in greater detail in future audits.

     Finally, where agencies claim to conduct a preapplication meeting
with the applicant in order to review candidate control options in advance,
EPA recommends that each meeting be carefully documented, including a
description of the alternatives considered and the basis for their
elimination.  Agencies should retain this documentation in the appropriate
PSD files as a formal record of the BACT selection process.

3.  What checks does the review agency employ to confirm the applicant's
BACT analysis?

     Auditors were asked to determine whether each audited PSD file contained
sufficient documentation to show that the reviewing agency had verified the
applicant's calculations and assumptions for BACT.  Auditors did not respond
to the question in a few cases, but the overall findings show that:

     o Sixty (60) percent of the agencies consistently verified the applicants'
BACT analyses;

     o Sixteen (16) percent were inconsistent in that some files demonstrated
the agencies' verification efforts while other files did not; and
     o Nineteen (19) percent provided no evidence in their files that they
had verified the BACT analyses submitted by the applicants.

     Auditors found no apparent agency verification of the applicants' BACT
analyses in 18 PSD permit files.  This finding was mixed between situations
where only one control technology was considered and others where several
alternatives were considered by the applicant.  The auditors concluded in
some cases that little independent analysis was likely to have occurred
because of the questionable nature of the BACT selections.  In other instances,
however, it appeared to be more a question of whether the agencies had
failed to adequately document their own analyses.

4.  What tendency is there for the agencies' BACT/LAER determinations to
conform exactly to the minimum requirements, i.e., NSPS or NESHAP standards
where applicable?

     For this question, applicable PSD files were examined for the application
of BACT, and files for major nonattainment area sources were examined for
LAER.  The findings are based on 39 PSD files from 24 agencies and 9 major
nonattainment area source files from 8 agencies.  Only sources to which
NSPS standards applied were considered.

     a.  BACT

     There is a strong overall tendency for agencies to accept the use of
the applicable NSPS to define BACT for PSD sources.  Even though examples
of BACT determinations more stringent than NSPS were found in 12 of 24 affected
agencies, BACT was defined as the applicable NSPS for approximately 80 percent
of the pollutant determinations which agencies made for PSD sources subject
to NSPS.

     The audit findings show that:

     o Twelve (12) agencies accepted the applicable NSPS for all BACT
determinations.  These agencies issued 18 permits for which 32 pollutant
determinations were made.

     o Seven (7) agencies defined BACT more stringently than the applicable
NSPS for at least 1 pollutant but typically accepted NSPS for most pollutant
determinations.  In the 15 permits that these agencies issued, BACT was set
at levels more stringent than NSPS for 9 pollutants, while NSPS was applied
to 34 pollutant determinations.

     o Five (5) agencies defined BACT more stringently than the applicable
NSPS for all of their BACT determinations.  These agencies issued 6 PSD
permits for which 9 pollutants were controlled beyond NSPS.

     o Of 39 PSD sources subject to NSPS, 25 were allowed to use NSPS for
all affected pollutants, while 8 were required to meet control requirements
more stringent than NSPS for all affected pollutants.
     b.  LAER

     As might be expected, agencies showed a significantly greater tendency
to define LAER beyond the applicable NSPS than was the case for BACT
determinations.  For the 9 permits issued, LAER was defined to be more
stringent than NSPS for 6 pollutants, while the control of 5 pollutants
was set equal to NSPS.

     Three (3) agencies allowed 4 sources to meet the applicable NSPS to
satisfy the LAER requirement for a single pollutant in each case.  The
other 4 agencies required LAER to be set at levels more stringent than NSPS
for 5 sources for all but 1 pollutant.

F.  AMBIENT MONITORING (PSD)

     This portion of the audit examined the PSD requirement which provides
that PSD sources must collect air quality data and submit it as part of
their application for a construction permit.  The PSD regulations contain,
for each pollutant, specific de minimis ambient concentrations that are to
be used to determine when a PSD applicant does or does not need to gather
ambient air quality data.  For those pollutants for which ambient data must
ultimately be reported, EPA guidelines set forth procedures whereby a
source must either (1) establish and operate an ambient monitoring network
and collect data for 12 months or less, or (2) analyze existing ambient
data which are "representative" (in accordance with specific EPA criteria)
of the air quality in the impact area of the proposed source.
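The de minimis screening step described above can be sketched as a simple threshold comparison.  The numeric thresholds and names below are illustrative placeholders, not the regulatory values; the actual de minimis concentrations are fixed, pollutant by pollutant, in the PSD regulations.

```python
# Hedged sketch of the de minimis screen for PSD ambient monitoring data.
# Threshold values are hypothetical stand-ins for the regulatory figures.

DE_MINIMIS_UG_M3 = {"TSP": 10.0, "SO2": 13.0}   # illustrative, ug/m3

def must_submit_ambient_data(pollutant, modeled_impact_ug_m3):
    """A PSD applicant must supply ambient data (monitored or existing
    representative data) for a pollutant when the modeled impact is at
    or above the de minimis concentration; below it, an exemption applies."""
    return modeled_impact_ug_m3 >= DE_MINIMIS_UG_M3[pollutant]

exempt = not must_submit_ambient_data("SO2", 5.0)    # below threshold
required = must_submit_ambient_data("TSP", 12.0)     # above threshold
```

The audit findings below turn on whether agencies documented this comparison well enough for the exemption decisions to be verified.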

1.  Under what circumstances is a source required to submit preconstruction
ambient monitoring data?

     The auditors examined PSD files to determine whether agencies had
followed the correct procedures for requiring applicants to submit ambient
air quality data, either from source-operated monitors or from existing
representative data.  Thirty-four (34) sources were required to submit
ambient data.  Another 20 sources were correctly exempted in accordance
with the criteria for de minimis situations, while 5 files did not address
the data requirements at all.

     Because of inadequate documentation, auditors were unable to ascertain
whether the exemption of 5 sources, allowed by 4 agencies, had been handled
properly.  For at least 2 of the exemptions, made by 1 agency, the auditors
believed that the sources' impacts were de minimis and ambient data would
not be needed.  For one, however, the auditor believed that the source
should have been required to submit air quality data.

     In the 20 agencies requiring that the data requirements be addressed,
most applicants were allowed to use existing air quality data rather than
having to establish a monitoring network to collect new data.  The findings
indicated that:

     o 30 sources were allowed to use only existing data for a total of
55 pollutants; and
     o 4 sources (involving 4 separate agencies) were required to monitor
a total of 7 pollutants (but agencies allowed 2 of the sources to use
representative data for 1 pollutant each).

2.  Under what circumstances may a source submit existing data, rather than
conduct new monitoring?

     Where PSD sources were allowed to use existing data to meet the air
quality data requirement, auditors examined the files to determine whether
agencies followed Federal criteria to ascertain that the existing data was
representative of the area of source impact.   The air quality data was
checked for adequate consideration of the location of existing monitors, as
well as the quality and currentness of the existing data.

     Seven (7) files from 5 agencies (1 agency had 3 affected files) offered
no documented basis for allowing the use of existing data.   Seven (7) other
files involving 6 agencies (including 2 of the 5 already mentioned) contained
some description of the data used but failed to adequately consider or meet
all of the criteria for representative data.

     For the 32 PSD sources allowed to use existing data for at least 1 pollutant,
auditor responses to the Federal criteria for representative data are
as follows:

                                                              YES  NO   CBD

     a.  Adequate consideration of monitoring site location   61%  16%  23%

     b.  Adequate consideration of data's quality             61%  10%  29%

     c.  Adequate consideration of data's currentness         63%  10%  26%

3.  Do the source monitoring data adhere to PSD quality assurance requirements?

     In the 4 agencies requiring source monitoring, EPA auditors checked
the 4 PSD files and found that only 1 file contained a monitoring plan.
This file contained, among other things, the applicant's quality assurance
procedures that would be followed for the duration of the monitoring effort.
One auditor indicated that a monitoring plan had been submitted for one
source but the plan was not in the permit file.  The remaining 2 files did
not provide any evidence that a monitoring plan had been submitted.

     In all 4 cases, applicants conducted the monitoring for 12 months as
generally required by the PSD regulations.

G.  AMBIENT AIR QUALITY ANALYSIS

     For this section, auditors were asked to examine three main areas of
concern.  The first, PSD increment analysis, looked at how well agencies
evaluated PSD permit applications to determine the amount of PSD increment
that would be consumed by the proposed source or modification.  Auditors
focused on whether the increment analyses (1) addressed the appropriate
emission changes which affect available increments, (2) considered both
long- and short-term increment averaging periods, and (3) gave adequate
attention to Class I area increments.

     The second area of concern pertains to agency procedures for providing
adequate NAAQS protection.  Auditors were asked to determine, for all major
and minor source permits, whether and to what extent each source underwent
an analysis to ensure that the national  standards (NAAQS) would not be
violated.

     Finally, the auditors evaluated the adequacy of the agencies'  models
and modeling procedures.  Agencies are expected to use models which have
been approved for use by EPA, but also of importance is that the appropriate
model (and model options) is selected for a particular set of modeling
conditions.

PSD Increment Analysis

     The audit findings indicate that 29 agencies required PSD applicants
to perform increment analyses.  In these agencies, 46 PSD permits included
analyses of either the TSP or SO2 increments, or both.  These analyses totaled
43 for SO2 and 31 for TSP.  Auditors did not find any PSD files for which an
increment analysis should have been done but was not.

1.  Does the agency consider the baseline concentration and emission changes
which affect increment consumption?

     Seventeen (17) agencies were affected by the auditors' findings that
existing major and minor source emissions are not always being adequately
addressed as part of the required increment analysis.  In a few cases, it
was apparent that no emissions other than those emissions resulting from
the proposed PSD source were being addressed.  Occasionally, only emissions
from other PSD sources were included in  the analysis.  The key findings
show that:

     o Approximately 33 percent of the PSD applicants required to analyze
PSD increment consumption failed to adequately address existing major and
minor source emissions which contribute to the amount of increment consumed.

     o Approximately 15 percent of the files did not provide enough information
to enable the auditors to evaluate the increment analysis that was done.

     o Minor source growth was not adequately addressed in 30 percent of
the SO2 analyses and 20 percent of the TSP analyses.

     o Existing emissions from major sources were inadequately addressed in
10 percent of the analyses for SO2 and TSP.

2.  Are both long- and short-term PSD increments being given adequate consideration
as part of the increment assessment?

     The audit findings indicate that agencies adequately consider both the
long- and short-term increments for SO2 and TSP.  In only one case did an
agency fail to adequately address both averaging periods.  The auditor's
remarks indicated that this was a unique circumstance which was not indicative
of the affected agency's typical performance.
     It is interesting to note that agencies tended to be conservative in
their use of modeling results to determine the amount of increment consumed.
Whereas EPA recommends using the highest of the second-highest receptor
site concentrations, agencies used the highest concentration in 65 percent
of the TSP analyses and 51 percent of the SO2 analyses.
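The difference between the two statistics can be sketched with hypothetical receptor data: EPA's recommended value is the highest of the second-highest short-term concentrations at each receptor, which can never exceed the overall highest value that many agencies used.

```python
# Sketch of selecting a design concentration from modeled receptor data.
# Each receptor has a list of short-term modeled concentrations (ug/m3);
# all numbers are hypothetical.

receptors = {
    "R1": [48.0, 44.0, 39.0],
    "R2": [52.0, 41.0, 40.0],
    "R3": [47.0, 46.0, 30.0],
}

# EPA-recommended statistic: highest of the second-highest values.
high_second_high = max(sorted(vals, reverse=True)[1]
                       for vals in receptors.values())

# More conservative statistic many audited agencies used: overall maximum.
overall_high = max(max(vals) for vals in receptors.values())

assert high_second_high <= overall_high   # 46.0 vs 52.0 in this sketch
```

Using the overall maximum overstates increment consumption relative to the recommended statistic, which is why the agencies' practice is described as conservative.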

3.  Does the agency make an adequate assessment of the new sources and
modifications affecting the Class I increments?

     None of the 16 PSD permits for which a Class I increment analysis was
required revealed any specific problems related to special Class I area
considerations.  Seven (7) of the permits were included among those found
to be deficient in the consideration of other emissions changes contributing
to increment consumption.  It could not be determined whether, or to what
degree, the emissions not addressed by the applicants may have affected the
analysis of Class I increments as opposed to the Class II increments also
analyzed for those permits.

     In two cases, auditors made general comments indicating that 2 agencies
needed to provide better protection of Class I areas within their own
jurisdictions.  No specific deficiency was identified in either case, however.

NAAQS Protection

1.  Does the agency routinely evaluate the ambient impact of minor source
construction?

     This year's audit information supports last year's finding that most
agencies do not routinely evaluate minor source construction for air quality
effects.  Fewer than 25 percent (81) of the audited minor sources were
required to undergo a NAAQS analysis.  Most agencies did, however, appear to
scrutinize minor source applications individually to determine whether an
ambient impact analysis should be done.  But some agencies appear to provide
little, if any, review of the ambient effects of minor sources.

     Specifically, the audit results indicate that:

     o Twenty-six (26) agencies did not require NAAQS analyses for any audited
non-NSR/PSD source permits.  In 5 of these agencies, auditors found sources
which they believed should have undergone analysis but did not.

     o Eighteen (18) agencies were found to have issued permits to non-NSR/PSD
sources which, in the auditors' judgment, probably should have been subjected
to an ambient impact analysis but were not.  EPA identified a total of
40 permits for which this omission occurred.  Forty (40) percent of these
permits were issued by only 3 agencies.

     o Overall, approximately 10 percent of the audited non-NSR/PSD source
permits which did not consider ambient effects probably should have because
of the sources' potentially significant air quality impacts.

     o In 23 percent of the questionnaires, auditors did not respond when
asked whether a NAAQS analysis should have been done but was not.  This may
suggest that information in the files was insufficient to determine the need
for a NAAQS analysis.
     The auditors also examined the NSR/PSD permit files to determine whether
and how well the NAAQS analyses were performed for major sources.   The
findings show that most NSR/PSD sources underwent NAAQS analyses where appro-
priate; 61 files (80 percent of the files audited) included a NAAQS analysis.
The findings also indicate that:

     0 Auditors did not identify any NSR/PSD permits for which a NAAQS
analysis was completely omitted but, in the auditors' judgment,  should have
been required.

     0 Five (5) agencies,  each having a PSD file that included a NAAQS
analysis, should have required additional analyses, either for omitted
pollutants or for more comprehensive review of considered pollutants.

     o Auditors did not indicate whether a NAAQS analysis was performed for
2 PSD permits involving criteria pollutants.  Nor did the auditors indicate
that a NAAQS analysis should have been performed.  It is assumed that the
auditors did not have sufficient information available to respond
appropriately to the applicable questions.

2.  Does the agency's ambient impact analysis provide adequate protection
against the development of "hot spots"?

     Adequate NAAQS protection requires that the reviewing agency give
consideration to the interaction of proposed new emissions with  emissions
from sources already in existence (including sources which may have already
received a permit to construct but are not yet operating) and to points of
projected maximum ambient concentrations resulting from multi-source inter-
actions rather than just points of maximum concentrations from the proposed
source alone.

     Auditors found that many agencies generally provide adequate NAAQS
protection.  Oftentimes, however, a lack of file documentation prevented
the auditors from making a determination.  The NSR/PSD source files tended
to contain better documentation than did the non-NSR/PSD source files.   The
audit findings reveal the following:

     o Thirty-four (34) percent of the audited agencies were judged to
provide good or acceptable protection of the NAAQS most, if not all, of the
time.

     o Twenty-two (22) percent of the agencies had files which typically
suffered from insufficient documentation.  In these cases, auditors could
not determine whether and to what degree source interactions had been
considered in the NAAQS analysis.

     o Thirteen (13) percent of the agencies were found to be inconsistent
in that some analyses adequately considered source interactions  but others
did not.

     o Nine (9) percent of the agencies typically omitted significant
emissions from other sources in the vicinity of the proposed source.
                                   IV-25

-------
     o Almost 60 percent of the non-NSR/PSD source permits reviewed for NAAQS
protection failed to include sufficient documentation to determine the adequacy
of the ambient impact analysis.  Fifteen (15) percent were judged inadequate
in terms of considering multi-source interactions.

     o For NSR/PSD permits, 32 percent of the files reviewed for NAAQS
protection had insufficient file documentation.   Thirteen (13) percent were
judged to have inadequate analyses for full NAAQS protection.

Dispersion Models

1.  Does the Agency use adequate models to carry out the ambient impact
analysis?

     EPA examined the modeling techniques used or accepted by agencies to
analyze PSD increment consumption and potential  source impact on the NAAQS.
The audit results indicate that agencies generally used or required applicants
to use the appropriate models and model options.  In many instances, however,
the files lacked sufficient documentation to fully describe the rationale for
the particular models and methods used, which hindered the auditors' review.

     Specifically, the audit findings show that:

     o Five (5) agencies used or allowed the use of inappropriate models for
the ambient impact analyses contained in 8 permit files.  In only 2 of the
agencies did the problem occur more than once.

     o The use of inappropriate models was identified in 2 PSD increment
analyses and in 6 NAAQS analyses involving the review of minor sources.  In
at least half of these situations, the models used were inappropriate for
the existing terrain features.

     o Permits found in 34 agencies (almost 70 percent of the agencies where
ambient impact analyses were included in the files) did not contain sufficient
information to support the use of the models and model options used.  However,
in over 40 percent of these situations, auditors concluded that the appropriate
modeling techniques had been applied.

2.  Does the agency perform an independent, internal review of the modeling
analyses contained in the permit application?

     Most agencies were found to adequately review the applicants'  modeling
analyses; but inadequate reviews were identified in 8 agencies for a total
of 10 permit files.  Seven (7) of these files pertained to non-NSR/PSD source
permits; the other 3 were for PSD sources.  In only 1 agency did the finding
relate to more than 1 file.

     Apparently, most agencies do not often require non-NSR/PSD source
applicants to perform the modeling analyses.  Eighty (80) percent of the
time, the source analyses were performed by the agencies themselves.
According to the responses provided by auditors, the applicants were required
to submit a modeling analysis for only 15 of the files that were examined.
Thus, the audit results show that in about half those situations the analyses
were not adequately checked by the agency responsible for the permit review.

                                   IV-26

-------
H.  EMISSION OFFSET REQUIREMENTS

     When a major new or modified source is allowed to construct in an area
designated as nonattainment, emission reductions are generally required to
offset the new emissions.  These emission reductions or offsets must meet
specific criteria set forth under Part D of the Clean Air Act in order for
the offsets to be creditable.  Auditors examined selected files involving
sources subject to the emission offset requirements to determine whether
they met such criteria as described below.

     EPA examined, in 13 agencies, 19 major source permit files involving
construction in nonattainment areas.  Of these 19 files, 1 permit was denied,
and 2 were issued under policies which excluded the applicants from having
to obtain emission offsets.  The remaining files involved sources that should
have obtained offsets, although some apparently did not.

     With regard to the examples where offsets were not obtained, the audit
results indicated that:

     o One agency allowed a source (a resource recovery facility) to postpone
the acquisition of offsets because none were available at the time.  This
action complied with the agency's rules as contained in its Federally-approved
SIP, but the SIP does not meet the current requirements based on Part D of
the Clean Air Act.

     o Two agencies should have required emission offsets, but apparently
failed to do so.  The auditors could not explain why 2 permits failed to
address emission offsets for several nonattainment pollutants.

     In examining the emission offsets on the basis of the specific criteria
with which they must comply, EPA found only a few examples of areas where
agencies experienced specific problems.  EPA identified no examples of
emission offsets that were not Federally enforceable.  Similarly, agencies
typically required offsets to occur on or before the time of new source
operation (with the exception of the one case where offsets were allowed to
be postponed) and to be expressed in the same manner, i.e., actual  or
allowable emissions, as for the demonstration of RFP.

     Some problems did surface in that:

     o EPA auditors had difficulty determining whether offsets were not
otherwise needed to demonstrate attainment or RFP.  Agencies often did not
provide documentation ensuring that this criterion had been met, even though
they may have actually done so.

     o Four (4) agencies did not provide sufficient information in their
files to enable EPA to conduct an adequate evaluation of the full creditability
of the emission offsets.

     o Three (3) agencies did not account for area and minor source growth
that had occurred since the last permit.  These emissions are to be offset
along with the new emissions from the proposed source or modification.
                                   IV-27

-------
I.  PERMIT SPECIFICITY AND CLARITY

     This final section of the new source review audit provides the results
of EPA's examination of information contained in the permits issued to new
and modified sources.  Specifically, auditors were asked to examine how,
and whether, permit conditions defining limitations applicable to the approved
source or modifications are being established.  Such limitations typically
become the enforceable measures by which a source's construction and operation
is regulated, and the means by which ongoing compliance is determined.

1.  Does the agency adequately state or reference allowable emissions rates
in each permit?

     EPA identified 12 agencies that did not routinely state or reference
each source's allowable emissions in the permits.  The omission appeared to
be a common occurrence, at least for non-NSR/PSD permits, in 8 of these
programs.

     An analysis of the individual permit file questionnaires shows that
approximately 30 percent of the non-PSD/NSR construction permits in the
audit data base did not specify the allowable limits.   This finding must be
qualified, however, because of the ambiguity of the instructions provided
to the auditors by the questionnaire.  Those instructions easily could have
been interpreted by auditors as asking them to provide only the number of
emissions limits actually specified in the permits, and not cases where a
reference was made to a regulation containing the required limit.  It is
not clear how many of the responses took referenced limits into account (as
was intended), but it is known that some did not.

     The same qualification must be given for the results pertaining to
NSR/PSD permits as well, although the ambiguity did not appear to have much
effect on the findings.  There were no permits for which auditors specifically
said no limits were found, but auditors did not respond to the question in
5 cases.

     Where limits were specified in the permits, auditors were asked to
evaluate them in terms of their clarity, consistency with measurement techniques,
and Federal enforceability.  In most cases, the permits contained more than
one emission limit.  Where at least one of the limits was determined to be
inadequate with respect to any of the variables considered, that file was
rated inadequate as a whole.  The percentages, as shown below, are based on
the total number of permits which contained emission limits.
                                          PSD/NSR Permits    Non-PSD/NSR

                                          YES   NO   CBD    YES   NO   CBD

     a.  Clear and precise averaging
         periods                          75%   18%   7%    76%   18%   6%

     b.  Emissions rates consistent
         with acceptable measurement
         techniques                       78%   11%  11%    72%   17%  11%

     c.  Federally enforceable            88%    6%  11%    78%   12%  10%
                                   IV-28

-------
2.  Does the agency identify all emission units and their allowable emissions
in the permits?

     Auditors reported that at least 15 agencies did not routinely identify
each emission unit along with its allowable emission limit in the permits.
The responses indicated that over half of the agencies "do not" or "generally
do not" appropriately address each emissions unit.  In two of these agencies,
it was noted that the emissions units and the emission rates applicable to
those units were identified primarily for PSD permits or where needed to
avoid NSR/PSD review.

     For the non-NSR/PSD permit files audited, EPA found 138 (38 percent)
permits that did not address each unit and its allowable emissions.  It
would appear that in some cases emissions were "bubbled" under a single or
composite emission limitation.  This would make it difficult to enforce the
limit with respect to the emissions coming from any particular unit.   Agencies
are advised to avoid any such practice because of the questionable enforceability
of such composite limits.

3.  Are the compliance test methods stated or referenced in the terms and
conditions of the permits?

     Nineteen (19) agencies reportedly either did not state or reference
compliance test methods in the permits at all, or did not do so consistently.
Where the practice is not followed consistently, it is not clear what criteria,
if any, agencies use to determine whether such information is to be included
in the permit.  Compliance test methods are commonly defined in the State
or local agencies' rules and regulations, and many agencies indicated in
last year's audit that specific mention of the test methods in each permit
is not required to enable the agency to use them for compliance determination
purposes.

     Of the 252 non-NSR/PSD permits which specified emission limits,  42 percent
stated or referenced all or some compliance test methods; 33 percent  did
not.  For the remaining permits, it could not be determined from the  auditors'
responses how compliance test methods were addressed.

     Agencies appeared more consistent in stating or referencing the  compliance
test methods in NSR/PSD permits.  Sixty-five (65) percent of the permits
included stated or referenced compliance test requirements; only 12 percent
did not.  Again, auditors did not respond to the question in a significant
number of cases.
                                   IV-29

-------
                         V. COMPLIANCE ASSURANCE
                            EXECUTIVE SUMMARY

    As was the case in last year's National  Air Audit System  (NAAS)  effort,
many States and locals showed one or more strong points  characteristic  of
a successful  air compliance program, such as high source compliance  rates
supported by high inspection frequency rates, performance of  all  required
new source performance standards (NSPS)  source tests, expeditious resolution
of violators, and few long-term violators.   These activities  were adequately
reflected and validated by the national  Compliance Data  System  (CDS).
Other States had source files that were, for the most part, well  organized,
up-to-date, and complete, reflecting a reasonable profile of  each source.

    A State-by-State analysis of compliance  statistics shows  that inspection
rates for Class Al* State implementation plan (SIP) sources generally
increased over those reported in last year's audit, although  four States
were still unacceptably low, with inspection rates of less than  60 percent.
Compliance rates for Class Al SIP sources remained roughly the  same  as
last year.  The NSPS national average for both inspection and compliance
rates rose, even though some individual  State rates declined  slightly.  The
NSPS inspection rates for two States are still  seriously deficient,  with
figures of 33 percent.  Overall  national  emission standard for  hazardous
air pollutants (NESHAP) inspection rates remained steady while  compliance
rates fell slightly.  Fourteen States still  have NESHAP  inspection rates
at or below 55 percent.

    The compliance audits also revealed  that several  State and  local
agencies, to a varying extent, still have weaknesses  in  three areas  vital
to a strong and effective compliance program.  First, some source files
maintained by State and local agencies do not contain verifiable information
reflecting a reasonable profile of each  source.  However, there has  been
some improvement since last year's audits in the condition of State  files,
where the percentage of those reviewed that  reflected a  reasonable profile
of the sources increased from 58 percent to  72 percent.   Second,  some
inspection reports still are of poor quality (no mention of operating or
emission parameters or pollutants emitted).   For some agencies,  there was
a noticeable improvement in the quality  of inspection reports since  the last
review, but there remain significant deficiencies in  this area.   Third,
some of the reviewed agencies' enforcement efforts are not always effective
in reducing the number of long-term violators by expeditiously  returning
documented violators to compliance, although there was a slight  drop in
the percentage of reports that indicated sources were not being  expeditiously
returned to compliance (from 30 percent  down to 26 percent).
*Class Al includes sources with actual or potential controlled emissions
 greater than or equal to 100 tons per year.
                                   V-l

-------
    Thus, while there are improvements  in  all  of  these critical  areas,
some States and locals need  to  heighten  efforts on the aforementioned
three areas to further strengthen  their  compliance programs.

A.  INTRODUCTION

    As in FY 1984, the compliance  assurance  element  of the  FY  1985  NAAS
was designed to examine State and  local  programs  which are  responsible
for the compliance of sources subject to requirements of  SIP's  and, where
delegated, standards for new stationary  sources (section  111)  and national
emission standards for hazardous air pollutants (section  112).   Of  the
several  hundred thousand regulated stationary  sources in  the nation,
there are approximately 30,000  sources  in  these categories  for  which EPA
and State/local agencies share  a concern about compliance status and
associated enforcement activities.  Compliance activities focusing  on
these sources formed the primary basis  on  which each audit  was  conducted.

    There are three major parts of the  compliance assurance audit.  The
first is a pre-visit assessment of the  State or local agency performed by
examining source data reported  to  EPA by the agency. For FY 1985,  this
included an assessment of how the  newly-implemented  "timely and appropriate"
guidance was working.  The other parts  of  the  element are reviewing
selected State source files  and conducting overview  inspections.

    In accordance with the NAAS guidance,  the  EPA Regional  Offices  were
to conduct the pre-visit assessment by  obtaining  Compliance Data System
(CDS) retrievals for FY 1984 on inspection frequency, compliance rates, and
enforcement activity.  The Regions were then to analyze the CDS data for
source compliance status, progress in meeting  inspection  commitments,
identification of long-term  violators and  associated compliance activity,
adherence to "timely and appropriate" guidance, identification  of long-term
compliers and associated surveillance activity, and identification of
operating NSPS sources without  the required  180-day  performance test.
Finally, based on this CDS analysis, the Regions  were to  prepare a  summary
of each compliance program and  send it  to  the  State  or local agency
before the visit.  The analysis could have taken  the form of a  questionnaire
for the agency or could have been  a statement  of  findings to be discussed
for completeness and accuracy during the visit.   The pre-visit  assessment
was also designed to help in identifying the source  files to be reviewed
during the on-site visit.

    The next major part of each audit was  the  on-site visit.   The visit
centered on a discussion of the findings in  the pre-visit assessment and
on review of 15-20 source files.  The files  to be reviewed  were to  consist
of a mixture of SIP, NSPS, and  NESHAP sources. A file review  checklist
was developed to assure consistency in  how the file  reviews were implemented.
The goals were to see if the files contained a reasonable profile of the
source, contained written documentation to support the compliance status
reported to EPA, and contained  documentation to show that violators are
expeditiously returned to compliance.   The State  and local  audit reports
were envisioned to include a discussion of both the  pre-visit  assessment
and the status of the files.
                                   V-2

-------
    The final  component of the compliance audit  was  to  be  a  program  of
overview inspections conducted by EPA of  2-3  percent  of the  sources  in
the CDS inventory (Class A SIP, NSPS, and NESHAP).   The purpose was  to
verify the compliance status  of a source  as reported  to EPA  as well  as
review State or local agency  inspection practices to  see if  there were
areas where EPA could increase performance through technical  assistance
to the State and local  agencies.

     This report covers 65 State and local audits.  (No report was received
on the compliance program for Hawaii.)  Ten questions were developed to
guide the summary of the findings in the State and local
audit reports.  These questions represent the key elements of the compliance
portion of the audit, and provide a uniform basis to do a  national assessment
of the compliance and enforcement programs.

B.  MAJOR FINDINGS AND CONCLUSIONS

    As was the case in last year's NAAS effort,  many  States  and locals
showed one or more strong points characteristic  of a  successful air
compliance program, such as high source compliance rates supported by
high inspection frequency rates, performance  of  all  required NSPS source
tests, expeditious resolution of violators, and  few  long-term violators.
These activities were adequately reflected and validated by  the national
CDS.  Other States had source files that  were, for the  most  part, well
organized, up-to-date, and complete, reflecting  a reasonable profile of
each source.

    A State-by-State analysis of compliance statistics  shows that inspection
rates for Class Al SIP sources generally  increased over those reported in
last year's audit, although four States are still unacceptably low,  with
inspection rates of less than 60 percent. Compliance rates  for Class Al
SIP sources remained roughly  the same as  last year.   The NSPS national
average for both inspection and compliance rates rose,  even  though some
individual State rates declined slightly. The NSPS  inspection rates for two
States are still seriously deficient, with figures of 33 percent.  Overall
NESHAP inspection rates remained steady while compliance rates fell
slightly.  Fourteen States still have NESHAP  inspection rates at or  below
55 percent.

    The compliance audits also revealed that  several  State and local
agencies, to a varying extent, still  have weaknesses  in three areas  vital
to a strong and effective compliance program. First, some source files
maintained by State and local agencies do not contain verifiable information
reflecting a reasonable profile of each source.  However,  there has  been
some improvement since last year's audits in  the condition of State
files, where the percentage of those reviewed that reflected a reasonable
profile of the sources increased from 58  percent to  72  percent.  Second,
some inspection reports still are of poor quality (no mention of operating
or emission parameters or pollutants emitted).  For  some agencies, there
was a noticeable improvement  in the quality of inspection  reports since
                                   V-3

-------
the last review, but there remain  significant deficiencies in this area.
Third, some of the reviewed agencies'  enforcement efforts are not always
effective in reducing the number  of long-term violators by expeditiously
returning documented violators  to  compliance, although there was a slight
drop in the percentage of reports  that indicated sources were not being
expeditiously returned to compliance (from  30 percent down to 26 percent).**

    Thus, while there are improvements in all of these critical areas,
some States and locals need to  heighten  efforts on the aforementioned
three areas to further strengthen  their  compliance programs.

     The remainder of this report addresses these findings in more detail.
It is organized by the three parts of the audit:  pre-visit assessment,
file review, and overview inspections.  The aforementioned ten questions,
which represent the key elements  of this compliance  audit, are discussed
in each appropriate part.

C.  PERIODIC REVIEW AND ASSESSMENT OF SOURCE DATA

    To assess the adequacy of State and  local compliance programs, the
EPA Regional Offices continually  review  source compliance status and
inspection information submitted  by the  audited agencies and reflected in
CDS for the SIP, NSPS, and NESHAP  programs. The attached Figures 1-8
provide a compliance snapshot of  all  reviewed State  and local air compliance
programs as of September 30, 1984  (the time used for the CDS information
in the FY 1985 audits).

    As shown in the four pie charts in Figures 1-4,  the national compliance
picture is very respectable.  Compliance rates have  improved since last
year's audit for Class Al SIP and  NSPS sources and declined slightly for
NESHAP sources.  The bar charts in Figures  5-8 depict, for each aspect
of the air program, the inspection range, compliance range, and number of
long-term violators range for all  State  and local agencies audited.  As
shown, inspection rates for SIP,  NSPS, and  NESHAP sources range from
0 percent to 100 percent, with  median figures between 67 percent and
89 percent (compared to 62 percent and 80 percent in last year's report).
Compliance rates for SIP, NSPS, and NESHAP  sources range from a low of
0 percent in one jurisdiction to  a high  of  100 percent in another, with
median figures near 95 percent.  The number of long-term violators (defined
for this audit as two consecutive quarters  or more)  in each jurisdiction
was largest for Class A SIP sources, ranging from a low of 0 in some agencies
to a high of 124 in one, with a median figure of 6 sources per jurisdiction.
**"Long-term violators" means sources in violation  for  two  continuous
  quarters or more.
                                   V-4

-------
    The following question is the first  one of  the  ten  developed  as
a guide for summarizing the findings  in  the audit reports.

    (1)  Based on the findings of the pre-visit program analysis,  what
         was the Region's overall  assessment of the condition of the air
         compliance program?

    A review of the 65 audit reports  shows  that some form of pre-visit
assessment was done by the Regions for all  but  three State  and  four local
programs.  Thirty-one of these reports contained an overall statement
about the particular compliance program  based on the CDS analysis:

    - Ten (10) air programs were considered very good.

    - Twenty-one (21) air programs were  considered  adequate
      (meeting most Clean Air Act requirements).

    - None of the air programs were termed  seriously deficient.

    The remaining 27 audit reports (where a pre-visit assessment was
done) made no definitive statement on the air program based on  the CDS
assessment, but positive comments were made in  17 of these  reports, such
as "inspection rates are very good" and  "compliance rates are good to
excellent."  It was not possible to determine anything  of substance
relative to the pre-visit assessment  from the remaining ten reports.

    Careful study of the audit reports for  the  ten  agencies with "very
good" air compliance programs shows several  elements contributing to the
success of each compliance program.  In  general, these  agencies:

    - routinely complete nearly all the  required inspections for SIP
      sources, and NSPS and NESHAP sources  where delegated;

    - have compliance levels for Class A SIP, NSPS, and NESHAP  sources
      consistently above 90 percent with recent inspections to  support
      this level;

    - address, in a timely manner (and according to "timely and appropriate"
      guidelines), sources found to be in violation of  applicable  emission
      limits or permitting requirements  resulting in few, if any, long-term
      violators (greater than 180 days).

It seems likely that other States have compliance programs  as good as
these ten but this was not readily discernible  from the description of
the programs in the audit reports.

    (2)  What is the Region's overall assessment of how the newly-implemented
         "timely and appropriate" guidance  is working in the State or
         local agency?
                                   V-5

-------
     Twenty-nine audit reports  indicated  that  the  guidance  is being followed
and the program is working well,  while  five  reports  stated  that the guidance
was not being followed (meaning few,  if any, violators were resolved
according to the guidelines).  Of the remaining reports, 20 had no
conclusions on the "timely and appropriate" guidance because either: (1)
agreements reflecting the guidance were not  reached  with the States until
late FY 1984, preventing assessment until  later  in FY 1985  (11 reports), or
(2) there were no violators subject to  the guidance  (9 reports).  The
other 11 reports did not discuss  the  guidance  in any detail.

     To summarize the CDS based pre-visit  assessment, 48 (74 percent) of
the 65 State and local compliance programs were  found by the Regions to
be either adequate or very good,  and  no programs were judged seriously
deficient based on that assessment.  It was  not  possible to assign an
overall description of programs from  the  other 17  (26 percent) reports.
This initial effort identified  many good  programs  and pointed out other
areas where the State and local agencies  and EPA should continue to work
together to improve compliance  programs.

D.   FILE REVIEW

     (3)  Did the source files  reflect  a  reasonable  profile of the sources?

     All 65 audit reports contained file  review  information.  Forty-seven
(72 percent) of these indicated that  the  files reviewed reflected a reasonable
profile of the source, which means they contained  the following information:
source compliance status based  on recent  inspection  or source test; an
identification of all air program regulations  the  source is subject to,
and, within the inspection report, operating parameters, point sources,
and pollutants emitted by the facility.  Some  common reasons cited in the
13 audit reports (20 percent) where the files  were considered deficient
were: inability to determine source compliance status from  file contents,
no indication of which air program regulations the source was subject to
(SIP, NSPS, NESHAP), and missing  inspection  reports  or poor quality inspection
reports (no mention of operating  or emission parameters, point sources, or
pollutants emitted by facility).   The remaining  five reports (8 percent)
did not contain a conclusive statement  on this question.

     (4)  Did the files contain adequate  written documentation
to support the CDS compliance status?

    Thirty-seven (57 percent) of the  65 audit  reports indicated that
files reviewed contained some written documentation  of compliance status
to support CDS.  This represents  a drop of 5 percent from last year's
documentation rate of 62 percent.  Twenty-six  (40  percent)  of the audit
reports (up from 24 last year)  either cited  a  lack of any compliance
information in the files, or showed information  in the files which conflicted
with CDS.  The other two reports did  not  contain sufficient information
to answer this question.
                                   V-6

-------
     To facilitate a more consistent evaluation on this point, a further
explanation of "adequate written documentation" was agreed upon at a meeting
with State and local agency representatives in April and will be included in
the FY 1986/1987 NAAS guidelines.  Because this elaboration was agreed upon
so late, it was not consistently used in this year's evaluation.  As a
minimum, a file should
contain:  (a)  documentation that the source was  inspected  and that  the
regulated emission points and pollutants were evaluated, and (b)  a  deter-
mination of the compliance status of the source  and  documentation of the
basis for that determination.  The compliance status in  the file  should
agree with the compliance status shown  in CDS.

     (5)  Are violations documented and pursued  by  agencies to  return  a
          source to compliance expeditiously?

     Forty (62 percent) of the 65 audit reports  indicated  that  violations
are documented and pursued to return a  source to compliance expeditiously.
Seventeen reports (26 percent) indicated that some sources were not being
expeditiously returned to compliance, in some cases leading to a number of
long-term violators (in violation for more than 180 days) or to untimely,
protracted enforcement actions.  Eight reports (12 percent) lacked a definitive
response to this question.

E.  OVERVIEW INSPECTIONS

     Thirteen of the 65 audit reports reviewed did  not  contain  any
information on overview inspections. Therefore, the following questions
cover the remaining 52 reports:

     (6)  How many inspections were performed?

     The number of EPA overview inspections performed ranged from a low of
2 to a high of 48.  The total number of inspections  for  all 52  reports
was 725, which is acceptable compared to the  600 to  900  inspections
projected for this effort (2-3 percent  of the Class  A SIP,  NSPS,  and
NESHAP sources in CDS).
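The projection above can be checked with simple arithmetic.  The sketch below (illustrative only; the Class A source count is an assumption back-calculated from the report's own figures, not a number stated in the text) shows how a 2-3 percent inspection target translates into the 600 to 900 inspection range:

```python
# Illustrative check of the projection above: 600 to 900 overview
# inspections correspond to 2-3 percent of the Class A SIP, NSPS, and
# NESHAP sources in CDS.  The source count below is an assumption,
# back-calculated from the report's figures (600 / 0.02 = 30,000).
class_a_sources = 30_000  # assumed, not stated in the report

low_target = round(0.02 * class_a_sources)   # 2 percent of sources
high_target = round(0.03 * class_a_sources)  # 3 percent of sources
print(low_target, high_target)
```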

     (7)  How were sources selected by  the Region for the  overview  inspections?

     By far, the most common criteria used by the Regions  to select sources
subject to overview inspections were some combination of source size
(preference to Class Al), type of program (to ensure a  representative
sample of SIP, NSPS, NESHAP sources), location (primary  attention to
impact on nonattainment areas as well as geographic  spread), source com-
pliance history, and pollutants.  Most  of the differences  between the
Regions' selection approaches were found in the  amount  of  relative  emphasis
placed on each criterion.
                                   V-7

-------
     (8)  What did inspections  consist  of?

     Almost all  of the overview inspections  performed were a joint effort
between EPA and the States.   Most  began with a  review of State source
files and progressed to an on-site visit to  the source, with both EPA and
State inspectors conducting  separate  evaluations of  source compliance
status, after which separate reports  were written and compared.  A summary
of the 52 audit reports with answers  to this question appears below:

    Joint Inspections - State Lead      30   (406 inspections)
    Joint Inspections - EPA Lead        11   (132 inspections)
    Joint Inspections - Dual Lead        9   (148 inspections)
    Independent EPA Inspections          2   ( 39 inspections)
             TOTAL                      52   (725 inspections)

     (9)  What was the purpose  of  the overview  inspections (that is, to
          verify independently  State  reported compliance, to observe State
          inspection practices, or some combination of these)?

     Forty-six (90 percent)  of  the 51 reports stated that the purpose was
a combination of independently  verifying State  compliance and observing/
critiquing State inspection  procedures  (including State inspector qualifi-
cations).  Of the five other reports, four mentioned simple verification
of compliance status as the only inspection goal, and one explicitly cited
the use of overview inspections to train State and local inspectors.

     (10)  What were the overall results of  the overview inspection effort,
           including recommendations  for resolution  of any problems discovered
           during the effort?

     Thirty-nine of the 45  responses  to this question (87 percent) showed
both the expertise of State  inspectors  and State reported compliance
status to be adequate, which means the  State inspectors were experienced
enough to conduct a thorough inspection and  determine the compliance
status of a source.  Six reports (13  percent) indicated that overview
inspection results did not  agree with the compliance status in the State
files or CDS.

    Regarding recommendations,  five reports  suggested that more training
for State inspectors would improve the quality of inspections and result
in more accurate inspection data being reported to EPA.
                                   V-8

-------
     [Pages V-9 through V-16 contained charts summarizing the overview
inspection and compliance results, including maximum and median source
compliance rates; the graphics could not be recovered in this text version.]
-------
                             VI.   AIR MONITORING


                              EXECUTIVE SUMMARY

     The 1985 National Air Audit  System included air monitoring  audits  of  73
agencies.  Four principal  areas within the agencies'  air  monitoring  program
were evaluated.  These areas were network design and siting,  resources  and
facilities, data and data management, and quality assurance/quality  control.
The principal conclusions relating to these areas are highlighted  below.

     The audit reports indicate that State and  local  agencies have continued
their successful  performance in operating and maintaining their  State and
Local Air Monitoring Stations (SLAMS) and National  Air Monitoring  Stations
(NAMS) networks.   About 98 percent of the 4300 monitors operated by  the
audited agencies are meeting the  monitoring regulations.   No  major or wide-
spread siting problems with monitors were discovered.

     The audit reports did disclose that 82 percent of the audited agencies
reported needs for new or replacement of air monitoring or laboratory equipment
totaling $4.6 million.  Of this total, $3.6 million were  needed  for  air
monitoring equipment and $1.0 million for laboratory equipment.

     Similar to last year's audit, the 1985 audit indicated that timeliness of
data submittal is still a problem for many agencies, particularly  for submis-
sion of lead (Pb) data which are  late about 25  percent of the time.   In
general, the data completeness percentage was good  with a low of 84  percent
for nitrogen dioxide (N02) and a  high of 92 percent for total  suspended
particulate (TSP).  Another problem area in the data management section of
the audit was that 24 percent of the agencies were deficient in the submittal
of the required annual SLAMS report.  Corrective actions, however, are being
taken to resolve this problem.

     Quality assurance/quality control aspects  of the audit reports  indicate
that most of the State and local  agencies are doing a good job of  maintaining
adequate and up-to-date quality assurance plans.  Four agencies  need to
substantially revise their plans and 46 had minor revisions pending.  With
respect to the achievement of quality assurance goals for precision
(± 15 percent for all pollutants) and accuracy (± 15 percent for TSP, ± 20
percent for other pollutants), the only significant problems are related
to accuracy for NO2 and precision for Pb.  For NO2, the percent of reporting
organizations meeting the accuracy goal was 46; the same percent was indicated
for Pb precision.  The low values for NO2 accuracy and Pb precision are
believed to be related to the complexity of the NO2 measurement method and the
low ambient Pb levels.

A.  INTRODUCTION

    Ambient air monitoring for State implementation plan  (SIP) purposes is
required by section  110 of the Clean Air Act.   Furthermore, section  319 of
the Act requires  the development  and application of uniform air quality
                                     VI-1

-------
monitoring criteria and methodology,  reporting  of  a  uniform  air quality
index in major urban areas, and  the establishment  of  a national air moni-
toring system which uses uniform monitoring  criteria.  To  comply with these
requirements, the EPA promulgated ambient  air monitoring regulations  (40
CFR 58) in 1979, with further revisions  in subsequent years.   Included in the
Part 58 regulations are requirements  for the auditing of State and local air
monitoring programs.  These provisions  have  served as the  basis for the
national air monitoring audits which  began in 1984.   As a  result of the
findings and recommendations of  the 1984 audits, several changes to the audit
questionnaire were made in  preparation  for the  1985  air monitoring audits.
The modifications made were not  major in content.  Instead,  they were princi-
pally concerned with reorganization,  clarity, and  the reduction in requests
for resubmission of data.  The 1985 guidance did require,  for  national consis-
tency, that all EPA Regional Offices  use at  least  the short  form questionnaire,
the corrective action implementation  request, and  the system audit reporting
format.  Use of the long form questionnaire  was left  up to the discretion of
the Regional Quality Assurance (QA) Coordinator with  the concurrence of the
State or local agency.  The audit team  in  1985  consisted of  EPA Regional
Office personnel and, in some cases,  Headquarters  representatives.  The
audits included interviews, on-site inspections, and  the completion of the
short or long form air monitoring questionnaire.   The questionnaires contained
questions covering the following four important topic areas:   network design
and siting; resources and facilities; data and  data  management; and quality
assurance/quality control.   Each of these  areas and  the associated questions
are described in detail below following  the  discussion of  "Major Findings and
Conclusions."

B.  MAJOR FINDINGS AND CONCLUSIONS

    The air monitoring programs  of 73 agencies  (48 States, 22  locals, the
District of Columbia, and 2 territories) were audited during 1985.  The audit
results indicate that State and  local agencies  continue to successfully
operate and maintain their respective SLAMS/NAMS networks.  About 98 percent
of the 4300 monitors operated by the  audited agencies are  meeting the Part 58
monitoring requirements.  These  results  are  consistent with  the periodic
national SLAMS/NAMS status reports which show 97 percent of  the 4723 monitors
complying with the regulations.

    Concerning the audit findings on  resources  and facilities, most agencies
indicated that they had adequate space  and personnel.  However, the audit
showed that 60 agencies or 82 percent did  have  air monitoring  equipment needs.
These needs include pollutant monitors,  calibration  systems, data processing
equipment, and meteorological equipment.  The projected cost to procure the
monitoring equipment is approximately $3.6 million.   A need  for major labora-
tory equipment to bring the laboratory  support  to  adequate levels was expressed
by 24 agencies.  Total cost for  this  equipment  is  about $1.0 million.

    Similar to last year's audit results, timely data submittal continues to
be a problem for many agencies.  The problem is principally lead data
submittals, which are late approximately 25 percent of the time.  The
percentage of late submittals for the other pollutants ranges from 8 to 14.
                                     VI-2

-------
has received greater attention by EPA and State and local  agencies,  and  a
tracking system has recently been instituted within the EPA's  Monitoring and
Data Analysis Division to monitor the progress being made  in timely  data
submittals.  With respect to meeting the National  Air Data Bank  (NADB) 75
percent data completeness criteria, the results were, in general, good.  The
audit showed a low of 84 percent for NO2 and a high of 92 percent for TSP,
a range of 8 percentage points.  One of the larger problem areas found in the data management
section of the audit was the requirement for submitting the annual SLAMS
report.  The audit showed that of the 51 audited agencies  required to submit
an annual SLAMS report, 12 agencies or 24 percent  were deficient in  one or
more of the 4 required elements of the annual  report.  This problem  should be
easily resolved administratively and efforts are underway  to correct it.

     The QA aspects of the audit reports indicated that 69 of  the 73 agencies
had QA plans that were, in general, acceptable.  However,  4 agencies need to
substantially revise their plans and 46 had minor  revisions to their QA plans
pending.  Seventy-two of the 73 agencies were participants in  the National
Performance Audit Program, an excellent rate of involvement.   The last phase
of the quality assurance program evaluation was the assessment of achievement
of the precision and accuracy goals (precision, ± 15 percent for all pollutants;
accuracy, ± 15 percent for TSP, ± 20 percent for other pollutants) by the
agencies' reporting organizations.  The lowest rate for achievement of the
precision goals (based on one quarter of data) was for Pb, with only 46 percent
of the organizations meeting the ± 15 percent goal.  Based on data submitted
to the Environmental Monitoring and Support Laboratory (EMSL)  for the 1984
annual precision and accuracy report, 65 percent of the reporting organizations
met the Pb precision goal.  This difference of 19 percentage points is thought
to be principally due to the difference between the time periods used.

     The lowest percent of reporting organizations meeting the accuracy goals
was 46, which was for NO2.  The 1984 accuracy achievement level for NO2, based
on the annual data submitted to EMSL, was considerably higher, reaching 70
percent.  The other pollutants compare fairly well between the two data sets.
The reasons for the lower values for NO2 accuracy and Pb precision are believed
to be related to the complexity of the NO2 measurement method and its associated
audit methods and the relatively low ambient levels for lead.  It is also pos-
sible that the wide confidence intervals (CI's) associated with NO2 accuracy
estimates are related to the fact that, for most reporting organizations, there
are actually few NO2 sites relative to site counts for the other parameters.
In a statistical sense, even a few relatively large, but still "acceptable,"
individual audit differences are magnified into large quarterly CI's due to
the small number of points actually comprising the statistic.
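The goal tests described above reduce to a simple comparison of a reported percent difference against a pollutant-dependent limit.  The sketch below is illustrative only (the function names and the sample results are assumptions, and this is not the official 40 CFR Part 58 Appendix A assessment procedure):

```python
# Illustrative sketch of the precision and accuracy goal tests cited in
# the text: precision within +/- 15 percent for all pollutants; accuracy
# within +/- 15 percent for TSP and +/- 20 percent for other pollutants.
PRECISION_GOAL = 15.0  # percent, all pollutants

def accuracy_goal(pollutant: str) -> float:
    """Accuracy goal in percent; TSP is tighter than the other pollutants."""
    return 15.0 if pollutant == "TSP" else 20.0

def meets_goal(percent_difference: float, goal: float) -> bool:
    """True if the reported percent difference lies within +/- goal."""
    return abs(percent_difference) <= goal

# Hypothetical quarterly results for a few reporting organizations.
results = [
    ("TSP", "accuracy", 12.0),
    ("NO2", "accuracy", 22.5),   # misses the +/- 20 percent goal
    ("Pb",  "precision", -14.0),
]
for pollutant, kind, pct in results:
    goal = PRECISION_GOAL if kind == "precision" else accuracy_goal(pollutant)
    print(pollutant, kind, meets_goal(pct, goal))
```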

C.   NETWORK DESIGN AND SITING

     The network design and siting section of the  audit was aimed at assessing
compliance of air monitoring programs with the requirements of Appendices D
and E of 40 CFR Part 58.   To assess this topic, five overall aspects were
reviewed.  They were as follows:
                                     VI-3

-------
     o Network size and written description of the network

     o Network modification during the last year

     o Sites not meeting network design or siting requirements

     o Performance of the annual network review requirements

     o Survey of noncriteria pollutants monitored by agency

     Responses to this section of the audit were intended to serve as a cross-
check and update of existing EPA data on the number of monitors, their
distribution, their conformance  with siting  requirements, and compliance with
the annual network review provisions.   This  section also provides an enhanced
perspective on network stability and the variety of noncriteria pollutants
monitored by State and local  agencies.   In reviewing  the audits, there appeared
to be general agreement between  State and local agencies in terms of complete-
ness of response or compliance with requirements of the regulations.

     The number of sites operated by the 73  agencies  audited was 1290 NAMS,
3019 SLAMS, and 867 special  purpose monitors (SPM)  for a total of 5176 sites.
These totals compare favorably with the 1984 SLAMS  status report which indi-
cated that 1345 NAMS and 3378 SLAMS were operating  at the end of 1984.  The
difference between the 1984  SLAMS report and the audit totals is accounted
for by those sites operated  by State and local agencies which were not audited.
An indication of network stability  is gained by looking at the number of
reported changes to State and local  networks.  There  were 148 new sites
established, 198 sites discontinued, and 101 sites  relocated during 1984 for
a total of 447 modifications. This total affected  approximately 10 percent
of the sites covered by the  audit and occurred in 52  of the 73 agencies
audited.  During 1985, about the same number of agencies are planning network
changes.  Due to the short-term  nature  of SPM monitoring, no attempt has been
made to track these sites on a  regular  basis.

     From the audit reports, it  was determined that over 98 percent of the
4309 SLAMS/NAMS operating monitors  reported  by the  73 agencies audited were
in compliance with 40 CFR 58 Appendices D and E.  More than half of the monitors
not in compliance with Appendices D and E are located in two States which are
taking steps to bring these  sites into  compliance.  The remaining sites are
scattered over 14 State or local agencies.

     Results of the review for compliance with the  SLAMS network description
and annual review requirements of 40 CFR Part 58 show that all agencies
audited maintain a network description, but that the descriptions for 7 agencies
(or 10 percent) are deficient in one or more of the items required for the
description.  With respect to adherence to the annual network review, seven
agencies did not provide the date of their last review, and eight agencies
indicated that the last review occurred prior to 1984.  The remaining 58
agencies (79 percent) indicated that network evaluations were conducted
sometime during 1984 or early 1985.
                                     VI-4

-------
     The audit results pertaining to noncriteria  pollutant monitoring  showed
that 56 of the 73 agencies monitored for  one  or more  noncriteria  pollutants.
This information was collected to provide some initial  knowledge  of  the
variety and magnitude of noncriteria monitoring conducted by  the  audited
agencies.  Based on this information, the most frequently monitored  substances
are organic solvents, metals, acid rain,  and  sulfates/nitrates.   The infor-
mation collected is not specific enough  for a comparison to EPA's priority
toxic pollutant listing.  Future audits  could be  adjusted to  provide additional
information for this purpose.  The listing below  shows  the pollutant monitored
and the number of agencies monitoring for each.


     Metals          17          Fine Particulate     5
     Acid Rain       11          SO4/NO3             12
     Asbestos         5          Sulfur               1
     Solvents        18          Phosphate            1
     Formaldehyde     3          Radiation            1
     Fluoride         4          Pesticides           2
     BaP              4          Freon                1
     H2S              7          Chloride             1
     NMOC             9

D.  RESOURCES AND FACILITIES

     The resource and facility section of the audit  was  developed  to  provide
additional  information about the size of operations, and the  adequacy and
condition of various resources of each audited agency.   Topics  considered  in
this section were as follows:

     o  Number of nonconforming analyzers

     o  Instrument needs

     o  Number of man-years of effort

     o  Documentation of standard operating procedures for laboratories
        and availability of necessary equipment

     o  Availability and traceability of laboratory and field (site)
        standard reference materials

     An analysis of the audit results for the resources  and facilities section
of the audits provides the following  information.  There are  23 nonconforming
pollutant monitors currently in use in the SLAMS network.  This amounts to
less than 1 percent of the reported sites.  The biggest  block of nonconforming
monitors (19 of 23) are hi-volume samplers for TSP or Pb which  do  not conform
to the new shelter standards.
                                     VI-5

-------
     Sixty agencies reported various monitoring  equipment  needs  ranging  from
spare parts to several  new monitors or  calibration  systems.  The equipment
requests have been categorized in  four  areas:  field  equipment which  includes
such items as pollutant monitors,  flow  controllers, and  shelters; calibration
and quality control (QC) equipment, including items like calibration systems,
gas dilution systems, and Roots meters; data processing equipment, covering
such items as personal  computers,  data  loggers,  telemetry  equipment,  etc.;
and meteorological  equipment.   The table  below indicates the broad  categories
of equipment requested, the number of items, and an estimate of  the cost to
acquire all of the equipment.


                             Number of
Equipment Type                 Items                  Cost $(000)

Field
 (a) Monitors                   400                     2,800
 (b) Shelters, flow              23                        97
     controllers

Calibration-QC                   28                       107

Data Processing                  88                       583

Meteorological                   30                        46

        TOTAL                   569                     3,633


    The total cost to purchase all of this  equipment  is  estimated at  $3.6
million.  Twenty-four agencies also requested  laboratory equipment.   This
equipment (a total  of 35 items)  included  spectrophotometers, humidity-
controlled chambers, microbalances, etc.  The  estimated  cost to  acquire  these
items is $1.0 million.   The total  cost  to meet the  monitoring  and laboratory
equipment needs identified by the  audits  is about $4.6 million.  In FY 1986,
$2.4 million in section 105 grant  funds were allocated to  replace carbon
monoxide (CO) and ozone (O3) monitors and procure additional particulate
matter (PM10) samplers.  EPA anticipates continuing funding equipment needs at
approximately the same level in  FY 1987.

    The analysis of agency space and staff  indicates  that  65 percent  of  the
agencies felt they had  adequate  space to  operate their program.  To establish
a better understanding  of the number of people involved  in air monitoring,
the agencies were asked to break down the number of man-years associated with
operating various aspects of their programs.  Sixty-two agencies provided
tabulated responses to this two-part question.  The remaining 11 agencies either
did not respond, provided organizational charts which could not be reduced to
man-years, or simply indicated their space was adequate.  The total number of
man-years assigned to 4 specific areas of air monitoring for these agencies is
shown below.
                                     VI-6

-------
    Program Area                    Number                  Percent

Network Design & Siting             128                     15

Resources & Facilities              347                     41

Data & Data Management              172                     20

QA/QC                               209                     24

  TOTAL                             856                     100


     The audit showed that 63 agencies or 86  percent  of  those audited have
adequate laboratory standard  operating procedures  for air quality measurements.
However, only 67 percent (49  agencies) indicated that they  had sufficient
instrumentation available to  conduct all necessary laboratory analyses.

     The last major area of the facility and  resource section audited concerns
standards and traceability for laboratory and field site use.  The audit indi-
cated 92 percent of the agencies had and could demonstrate  adequate  reference
standards for laboratory use.  This  percentage drops  off to 72 percent for
field site standards.

E.   DATA AND DATA MANAGEMENT

     The principal areas of concern  in the  area of data  and data management
are as follows:

     o  Percentage of data submitted on time

     o  Percentage of sites submitting less than 75 percent of the data

     o  Documentation of changes to submitted data

     o  Data changes performed according to a documented standard operating plan

     o  Completeness of the annual SLAMS report.

     The pertinent findings of the audits for the  data and  data management
portion of the national audit are discussed below.

     Historically, timeliness of data submittal has been a  chronic problem;
therefore, each audited agency was asked to provide an estimate of the per-
centage of data submitted within 135 days after the calendar quarter in which
it was collected.  In general, this  requirement applies  to  NAMS sites; however,
most agencies submit their SLAMS data to the  NADB  and have  included  their
SLAMS in the percentages for data submitted on time.  The percentage of data
submitted on time is a time-consuming and sometimes difficult statistic to
develop retrospectively.  An analysis of the percentages submitted by quarter
                                     VI-7

-------
by pollutant from 68 agencies indicates the quarterly average percent submitted
on time varied from 75 percent for lead to 93 percent for carbon monoxide.
These percentages seem to be similar to  those percentages reported quarterly
in the Strategic Planning and Management System  (SPMS) reports; however, the
audit results are not directly comparable to SPMS numbers because SLAMS sites
are included in the audit report  and not in  the  SPMS.
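The timeliness test described above can be stated precisely: data are "on time" if submitted within 135 days after the end of the calendar quarter in which they were collected.  A minimal sketch of that test follows (the dates in the example are hypothetical):

```python
# Illustrative sketch of the 135-day timeliness test described above.
from datetime import date, timedelta

def quarter_end(d: date) -> date:
    """Last day of the calendar quarter containing d."""
    q_end_month = ((d.month - 1) // 3 + 1) * 3
    if q_end_month == 12:
        return date(d.year, 12, 31)
    return date(d.year, q_end_month + 1, 1) - timedelta(days=1)

def on_time(collected: date, submitted: date) -> bool:
    """True if submitted within 135 days after the quarter's end."""
    return submitted <= quarter_end(collected) + timedelta(days=135)

# Hypothetical example: a sample collected 1985-02-10 falls in a quarter
# ending 1985-03-31, so its submission deadline is 135 days later.
print(on_time(date(1985, 2, 10), date(1985, 8, 1)))
```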

    In addition to timeliness of  data, completeness  of data submittal is of
great concern to EPA; therefore,  the audits  examined the percentage of sites
meeting the NADB data completeness criteria  (in  general, submitting at least
75 percent of the theoretically obtainable data  by quarter by pollutant).
Similar to the response on timeliness  of data, 68 agencies provided data on
this question.  All of the agencies' responses on data completeness were
combined into a national quarterly audit result, which is compared below to
the 1984 percentages from the September 1985 quarterly NAMS status report.
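The NADB completeness criterion described above is a straightforward threshold test.  The following sketch is illustrative only (the function name and the sample monitor counts are assumptions, not EPA's actual tabulation procedure):

```python
# Illustrative sketch of the NADB 75 percent data completeness test: a
# site meets the criterion for a quarter if at least 75 percent of the
# theoretically obtainable observations were submitted.

def meets_completeness(observations_submitted: int,
                       observations_possible: int) -> bool:
    """True if submitted data reach 75 percent of what was obtainable."""
    return observations_submitted >= 0.75 * observations_possible

# Hypothetical example: an hourly monitor over a 90-day quarter could
# produce 2160 hourly values; 1700 submitted values meet the criterion
# (the threshold is 0.75 * 2160 = 1620 values).
print(meets_completeness(1700, 2160))
```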


                    Percent of Sites Meeting NADB Criteria

                                                       Sept 1985
Pollutant              Audit Results             Quarterly NAMS Report

   TSP                      92                             90

   S02                      85                             88

   N02                      84                             76

   CO                       90                             90

   03                       91                             86

   Pb                       86                             75


     A direct comparison of the national audit results with the quarterly  NAMS
report should be done with caution because  the former  represents  a quarterly
average of approximately 4000 sites,  and the NAMS report is an average for
approximately 1300 sites.  The two data sets compare reasonably well, the
differences ranging from 0 (for CO) to 11 percentage points (for Pb).

    The audit indicated that 60 (82 percent) of  the  audited agencies documented,
in a permanent file, any change to air quality data  previously submitted to
EPA.  However, the audit reports  showed  that only 50 agencies (68 percent)
performed these changes according to  a documented standard operating procedure.
It is assumed that the other 10 agencies which said  they documented changes
either ignored their established  procedure  or had no standard procedure for
documenting changes to data.
                                     VI-8

-------
     The last area considered under the data management section of the audit
was the requirement to submit an annual SLAMS report to EPA.  This requirement
is applicable to States, the District of Columbia, and United States territories;
therefore, local agencies are not considered here.  Four elements are needed
for a complete SLAMS annual report: a data summary, annual precision and
accuracy information, air pollution episode information, and certification of
the report.  Of the 51 agencies required to produce a SLAMS annual report,
the audits showed that 39 (76 percent) included all required elements, while
the remaining 12 were deficient in one or more elements.  These results are
comparable to findings based on the reports received by the Office of Air
Quality Planning and Standards.

F.   QUALITY ASSURANCE/QUALITY CONTROL

     The Quality Assurance/Quality Control (QA/QC)  Section  of  the national
audit program is the last major topic considered  in the  audits.   Consideration
was given to the following portions of the audited agencies' QA/QC programs:

     °  EPA approved QA plan

     °  Pending revisions to QA plans

     °  Agency participation in the National Performance Audit Program (NPAP)

     °  Attainment of precision and accuracy goals

     Information provided in the QA/QC Section of the audit reports is  summarized
and discussed below.

     Several years ago, EPA approved all of the various  agency Quality  Assurance
Program plans.  However, it was not intended that either a  QA  program or  its
approval would be a static one-time affair.  As a result of program reviews
and the changing state of the art, some agency QA programs  are outdated or
need modification.  This is evidenced in the audit  results  that  show  4  out of
the 73 audited QA plans need to be revised and approved, and 46  (63 percent)
of the audited plans have formal revision proposals pending approval  actions.
The number of revisions to the QA program plans is  a  good indication  that
EPA's concern over quality assurance is being taken seriously.  Similarly,
the high level of participation in the NPAP, 72 of  the 73 audited agencies,
demonstrates the deep interest of State and local agencies  in  the data  quality
of their air monitoring programs.

     The last consideration of the QA/QC Section of the national audit was a
measure of the audited agencies' reporting organizations' ability to achieve
the precision and accuracy goals specified in the 1985 audit guidance (precision,
± 15 percent for all pollutants; accuracy, ± 15 percent for TSP, ± 20 percent
for all other pollutants).
                                     VI-9

-------
     Precision and accuracy data are measures  of  data  quality  and  are  based  on
"reporting organizations."   Each State must  define  at  least one  or more  reporting
organization for each pollutant, and each  reporting organization should  be
defined such that the precision  and  accuracy data reported  among all stations are
reasonably homogeneous.  Nationally, there are approximately 150 organizations.

     Usable data (in terms  of responses to the questionnaire)  for  precision
were received from 59 agencies,  while accuracy data were  supplied  by 58
agencies.  Some of the agencies  that did not provide usable data for this
section did provide some indication  of success in achieving or failing to
achieve the goals.  The 1985 audit guidance  directed that achievement  of
goals was to be based on each of the last  4  complete calendar  quarters prior
to the audit for which precision and accuracy  data  were available; therefore,
the data received covered a period between July 1983 and  December  1984.  To
simplify the analysis, a common  quarter for  all agencies  submitting data was
chosen.  All of the data for the second quarter of  1984 were reduced to  pro-
duce precision goal achievements (Table 1) and accuracy goal achievements
(Table 2).
                                 TABLE 1
                           FY 1985 AUDIT RESULTS
                                PRECISION

                  # Reporting        # of Rpt.  Org.'s      Percent  Rpt.  Org.'s
Pollutant        Organizations        Meeting Goals           Meeting  Goals
                  (Rpt. Org.)

    03                85                    66                     78

   N02                53                    31                     58

   S02                82                    60                     73

   CO                 76                    67                     88

   TSP               112                    83                     74

   Pb                 59                    27                     46
                                     VI-10

-------
                                 TABLE  2

                           FY 1985 AUDIT  RESULTS
                                 ACCURACY

                  # Reporting        #  of Rpt.  Org.'s      Percent  Rpt.  Org.'s
Pollutant        Organizations       Meeting Goals           Meeting Goals

    03                84                    65                     77

   N02                52                    24                     46

   S02                81                    57                     70

   CO                 75                    58                     77

   TSP               111                   104                     94

   Pb                 58                    47                     81

     Achievement of precision goals by reporting organizations as shown in
Table 1 varies from 46 percent for lead to a high of 88 percent for CO.
The precision of both lead and N02 is noticeably lower than the level of the
remaining pollutants.  Similarly, the achievement of accuracy shown in Table 2
ranges from 46 percent for N02 to 94 percent for TSP.  The accuracy goals
set in the audit guidance are stringent because they were applied to all four
accuracy audit levels as opposed to meeting only one of the levels.  It is
evident that N02 accuracy falls considerably below the level achieved by
other pollutants; this has been attributed to the complexity of the NO/N02/NOx
analyzer.  It is also possible that the wide confidence intervals (CI's)
associated with N02 accuracy estimates are related to the fact that, for most
reporting organizations, there are actually few N02 sites relative to site
counts for the other parameters.  In a statistical sense, the presence of even
a few relatively large, but still "acceptable," individual audit differences
is magnified into large quarterly CI's by the small number of points actually
comprising the statistic.
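The small-sample effect described above can be illustrated with the general probability-limit form, mean ± 1.96 times the standard deviation of the individual audit percent differences.  This is a simplified sketch, not the full regulatory procedure, and the data values are invented:

```python
import statistics

def probability_limits(pct_diffs):
    """Quarterly probability limits of the general form mean ± 1.96*s,
    computed from individual audit percent differences.  (A simplified
    sketch of the usual precision/accuracy formulation.)"""
    m = statistics.mean(pct_diffs)
    s = statistics.stdev(pct_diffs)
    return m - 1.96 * s, m + 1.96 * s

# One "acceptable" 14-percent difference among only three N02 audits
# dominates the spread ...
few = [2.0, -1.0, 14.0]
# ... while the same value among fifteen audits barely moves it.
many = [2.0, -1.0, 14.0] + [1.0, -2.0, 3.0, 0.0, -1.5, 2.5, 1.0, -0.5,
                            2.0, -1.0, 0.5, 1.5]
print([round(x, 1) for x in probability_limits(few)])
print([round(x, 1) for x in probability_limits(many)])
```

With few points, a single large but individually acceptable difference inflates the standard deviation, and hence the limits, far more than it would in a larger sample.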

    To provide further perspective  on precision  and accuracy performance
nationally, Table 3 for Precision and Table 4  for Accuracy  have been assembled
from data submitted to EMSL for calendar  year  1984.
                                    VI-11

-------
                                  TABLE 3

                          1984 DATA SUBMITTED TO EPA
                                  PRECISION

                                Probability Limits (%)    Percent of Rpt. Org.'s
Pollutant     # Rpt. Org.'s      Lower        Upper         Meeting Goal ± 15%

   03              104            -16          +15                  85

  N02               66            -20          +21                  75

  S02              103            -20          +15                  85

  CO                96            -14          +12                  90

  TSP              138            -19          +12                  75

  Pb                72            -25          +27                  65
                                   TABLE 4

                          1984 DATA SUBMITTED TO EPA
                           ACCURACY (AUDIT LEVEL 1)

                                 Probability Limits (%)
Pollutant       # Rpt. Org.'s    Lower	Upper
   03                84

  N02                43

  S02               103

  CO                 96

  TSP*              138

  Pb                 72

*TSP is based on + 15
-21
-34
-24
-20
-12
-15
+20
+20
+20
+18
+12
+18
Percent of Rpt. Org.'s
   Meeting Goal + 20%*

          85

          70

          85

          90

          95

          85
     Tables 3 and 4 utilize the entire 1984  data  base.   The  upper  and lower
probability values were selected so  that  the range would  include 90 percent
of the reporting organizations for precision and  accuracy  (audit Level 1).
The last column reflects an estimate of the  percent of the reporting organiza-
tions meeting the audit goals for the 1984 precision  and  accuracy  data submitted
to EMSL.  The percentage of reporting organizations meeting the audit goals
of ± 15 percent for precision ranges from 65 percent for Pb to 90 percent for
CO.
                                    VI-12

-------
The goals for accuracy were ± 15 percent for TSP and ± 20 percent for all
other pollutants.  The range of percentages  for reporting  organizations
achieving the accuracy goals in Table 4  is  from 70  percent for  N02 to  95
percent for TSP.   The 1984 values shown  in  Tables 3 and 4  are about the same
as the 1983 levels.
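The selection of limits "so that the range would include 90 percent of the reporting organizations" amounts to taking the 5th and 95th percentiles of the per-organization values.  A minimal sketch; the percentile method and the data values are assumptions for illustration:

```python
import statistics

def central_90_limits(org_values):
    """Lower and upper limits chosen so the interval covers roughly the
    central 90 percent of reporting organizations, i.e. the 5th and 95th
    percentiles.  (A simplified reading of how Tables 3 and 4 were built.)"""
    cuts = statistics.quantiles(org_values, n=20)   # 19 cut points
    return cuts[0], cuts[-1]                        # 5th and 95th percentiles

# Hypothetical per-organization precision values (percent difference):
orgs = [-25, -19, -14, -12, -8, -5, -3, -1, 0, 1,
        2, 4, 6, 9, 12, 15, 18, 21, 23, 27]
lo, hi = central_90_limits(orgs)
inside = sum(lo <= v <= hi for v in orgs) / len(orgs)
print(lo, hi, inside)
```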

     Tables 1 and 3, or 2 and 4, are not directly comparable because of the
differences in period of record and number of reporting organizations used.
However, it is evident that goal achievement is higher for all pollutants
based on the 1984 annual data submitted to EMSL.  Furthermore, the extent of
the range (the percent of reporting organizations meeting the precision and
accuracy goals) in Tables 3 and 4 is smaller than in the corresponding Tables
1 and 2.

     Based on the two sets of tables, N02 shows lower values than the other
pollutants for both precision and accuracy.  This may be related to the
complexity of the N02 instrument and the related instrument audit procedure,
and to the limited number of N02 instruments operated by the various reporting
organizations.  The low level of achievement for lead may be related to the
procedure, which analyzes two strips from the same filter.  In general, ambient
lead levels are low.  This would magnify any very small differences in the
analytical results between the two strips into rather large percentage differences.
                                   VI-13

-------
                   VII.  VEHICLE INSPECTION/MAINTENANCE
                            EXECUTIVE SUMMARY

     While the audit process for inspection/maintenance programs  did  not
become part of the National Air Audit System (NAAS) until FY 1985, EPA
actually started inspection/maintenance (I/M) audits in FY 1984.  Eight I/M
programs were audited in FY  1984 and another eight I/M programs were  audited
in FY 1985.  The results of  these sixteen audits  are included in  this
report.

     Enforcement is a problem in some programs that use sticker-based
enforcement.  Five of the 16 programs audited had enforcement problems.

     Low reported failure rates are a problem in  most decentralized
programs, especially in those using manual  analyzers and in some  centralized
government-run programs.  Nine of the 16 programs audited  were experiencing
lower than expected failure  rates.

     High waiver rates are a problem in some programs, both centralized
and decentralized.  In total, 4 of the 16 programs audited had apparent
excessive waiver rates for at least some vehicle  categories.

    Analyzer quality assurance ranges from excellent in centralized
contractor programs to marginal in both decentralized programs with manual
analyzers and some centralized government-run programs.  Ten  of the 16
programs audited need improvements in this area.

     Data analyses are not being effectively used in most programs to
monitor and improve program performance and the performance of individual
inspection stations.  Thirteen of the 16 programs need to improve their
data management programs.

     The quality of I/M repairs is a problem, to some extent, in every
program audited.

     The EPA believes that the resolution of these problems generally rests
with each State/local I/M program developing an overall I/M quality
assurance program to ensure  that problems are identified and  resolved in
a timely manner.  This overall system needs to monitor and assure adequate
enforcement, adherence to procedures for testing  and record keeping,  and
proper diagnosis and repair  of failed vehicles.
                                  VII-1

-------
     Identifying operating  problems  is  an  important first step in assuring
quality I/M programs.   The  results of completed audits are being used by
State/local agencies and EPA to  improve I/M programs.  The EPA believes
that the I/M audit system and guidelines will continue to be a dynamic
and vital  process for  achieving  environmental results.

A.   INTRODUCTION

     Auditing of State/local  motor vehicle I/M programs was added to the
NAAS in FY 1985.  The  EPA actually started the I/M audit program in FY 1984
when eight pilot audits were conducted  in  the spring and summer of 1984.
The results of these FY 1984 audits  are being included in this report.
Eight additional audits occurred in  FY  1985.  Table 7-1 lists the I/M
programs audited in FY 1984 and  FY 1985, the dates of each audit, and the
type of I/M program in each State.   Table  7-2 (on p. VII-11) summarizes the
operating  details of the 16 programs.


                                Table 7-1

                           FY 1984/85 I/M  Audits

                                                       Program
          Location               Dates                 Type*

     Connecticut              5/14  -   5/16/84             A
     Massachusetts            5/16  -   5/18/84             D
     Colorado                 5/21  -   5/23/84             C
     Arizona                  5/23  -   5/25/84             A
     District of Columbia     6/04  -   6/06/84             B
     Virginia                 6/06  -   6/08/84             C
     Memphis, TN              6/27  -   6/29/84             B
     New Jersey               7/10  -   7/13/84             BC
     Nevada                   10/15  - 10/19/84             C
     New York                 12/10  - 12/14/84             D
     Georgia                  1/22  -   1/25/85             C
     Missouri                 3/04  -   3/08/85             C
     Delaware                 3/07  -   3/08/85             B
     North Carolina           3/18  -   3/22/85             C
     Texas                    3/26  -   3/27/85             C
                              4/02  -   4/04/85
     Oregon                   4/15  -   4/19/85             B

     *A =  centralized, contractor
      B =  centralized, government-run
      C =  decentralized
      D =  decentralized with computerized  analyzers
                                  VII-2

-------
     The primary purpose of the I/M audit  is  to  allow  EPA  to  ensure  that
each State or locality is implementing and enforcing  its  I/M  program in a
manner consistent with its State implementation  plan  (SIP).   Another
objective of the audit is to identify areas where EPA  can  provide assistance
to strengthen I/M programs.  This includes either specific aid  to a
particular State or more general  assistance aimed at  resolving  an overall
technical issue.

     The I/M audit questionnaire and the audit visit  are  structured  to
allow EPA and the State to determine what, if any,  program improvements
may be required to enable SIP goals and commitments to be  met.   The  I/M
questionnaire is in two sections—one dealing with  design  and intended
operating aspects and the other with actual recent  operating  experiences.

     The on-site audit visit includes various activities designed to
provide an in-depth analysis of the I/M program.   These activities include
records reviews, inspection station visits, interviews with program
officials, and special surveys.

B.   MAJOR FINDINGS AND CONCLUSIONS

     1.    Major Findings

     The following major findings resulted from  the completed I/M audits:

     a.    Enforcement - EPA auditors found that  the  rate  of  compliance
           among vehicle owners ranged from greater than 95 percent  in
           some States to less than 50 percent in other States.   The
           problems noted were in sticker-based  enforcement programs.

     b.    Reported failure rates - EPA auditors  found reported  failure
           rates as high as 35 percent and as low as  2 percent.   Very low
           reported failure rates indicate inspection  error,  or  at least
           data reporting errors.  These problems were prevalent in  most
           decentralized programs and in some government-run, centralized
           programs.

     c.    Waiver rates - EPA auditors found  that waiver  rates  varied
           considerably among the programs audited.  A few programs  do
           not allow waivers.  In some States, the waiver  rates  appeared
           excessive.

     d.    Analyzer quality assurance - EPA auditors  found quite a variation
           in analyzer quality assurance among the audited programs.  In
           the centralized contractor programs,  analyzer  quality assurance
           was excellent.  In decentralized programs  with  manual  analyzers
           and in some centralized government-run programs, analyzer
           quality assurance was marginal.
                                  VII-3

-------
     e.    Data analyses - EPA auditors  found  that,  with  only  a  few
           exceptions, I/M programs  are  failing  to  effectively use available
           program data to monitor and take steps to improve program
           performance and performance of individual  inspection  stations.

     f.    Quality of I/M repairs -  EPA  auditors found  that  in every
           program, to some extent,  a problem  exists with,respect to  the
           quality of I/M repairs.

     2.    Conclusions

     EPA believes that the resolution to the problems in  operating I/M
programs generally rests with each State/local  I/M  program developing an
overall I/M quality assurance program to ensure  that problems  are identified
and resolved in a timely manner.   Through such systems, program  managers
need to track:

     a.    The level of noncompliance among vehicle owners.

           This is particularly important in nonregistration enforcement
           systems.  However, even registration  enforcement  is not necessarily
           exempt from problems.

     b.    The performance of inspection stations to make sure that
           vehicles are receiving fair,  equitable,  and  accurate  inspections.

           This involves inspection  station audit and surveillance activities
           as well as tracking performance through  data analysis.  Obviously,
           the latter is possible only if accurate  data are  collected.

     c.    The performance of the program itself.

           Program data need to be summarized  and analyzed to  ensure  that
           cutpoints, failure rates, waiver rates,  and  other program
           statistics are within acceptable limits.

     d.    The quality of repairs.

           Quality repairs are the backbone of I/M.  Program data need  to
           be reviewed to ensure that vehicles are  not  being  improperly
           or incorrectly repaired.  Retest failure rates and  comparisons
           of before- and after-repair emissions levels can  be useful
           indicators to assess the quality of I/M  repairs.
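The two indicators named above, retest failure rates and before/after-repair emissions comparisons, can be computed from simple per-vehicle records.  A sketch with hypothetical readings and field layout:

```python
def repair_quality_indicators(failed_vehicles):
    """Two simple repair-quality indicators: the retest failure rate and
    the average before/after-repair emissions drop.  Each record is
    (before_reading, after_reading, passed_retest); readings might be,
    say, idle CO in percent.  (Illustrative structure only.)"""
    n = len(failed_vehicles)
    retest_fail_rate = sum(not p for _, _, p in failed_vehicles) / n
    avg_drop = sum(b - a for b, a, _ in failed_vehicles) / n
    return retest_fail_rate, avg_drop

# Four hypothetical failed-and-repaired vehicles; the second shows almost
# no emissions drop and fails its retest -- a sign of a poor repair.
fleet = [(6.0, 0.8, True), (5.5, 5.4, False), (4.0, 1.0, True), (7.2, 0.9, True)]
rate, drop = repair_quality_indicators(fleet)
print(rate, round(drop, 2))  # → 0.25 3.65
```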
                                  VII-4

-------
C.   ENFORCEMENT

     Enforcement - EPA auditors found that the rate of compliance among
vehicle owners ranged from greater than 95 percent in  some States to  less
than 50 percent in other States.  Five of the 16 programs  audited had
enforcement problems.

     Three enforcement methods are currently used in I/M programs:

           1. Registration based enforcement.

           2. Sticker enforcement.

           3. Data-link enforcement.

     The I/M audits have not identified any problems in registration
enforcement programs, although these systems have potential  problems  with
motorists improperly registering their vehicles outside the I/M area  to
avoid the program, with police failing to ticket vehicles  for  expired
license plates, and with clerical  errors in the registration process
which might allow uninspected vehicles to be registered.  No audits have
yet been conducted for programs which use data-link enforcement.

     The enforcement problems that have been identified in the I/M audits
are with sticker-based enforcement.  Of the first eight sticker enforcement
programs audited, four had serious levels of noncompliance (in excess of
20 percent of vehicles in noncompliance).

     The problems in sticker-based enforcement programs tend to be caused
by the indifference of police officers to sticker violations and  by the
failure of police departments to devote the necessary  resources or to
accept the risk to public goodwill.  In some cases, there  are  also problems
because of an inability of police officers to distinguish  subject vehicles
from exempt vehicles and to distinguish expired stickers from  valid ones.
Limitations are also imposed in some cases because police  are not authorized
to ticket parked vehicles.

     Another problem area in some programs was sticker accountability.
The EPA auditors found that some programs have very thorough sticker  account-
ability procedures while others do not.  In order to ensure the proper
disposition of stickers, State/local agencies need to  confirm  that each
approval sticker has a matching inspection record showing  passing results.
(This same confirmation is needed in nonsticker programs for approval
certificates, except where the approval certificates and official  inspection
reports are printed automatically by machine.)  This confirmation is
accomplished in some current programs by correlating sticker serial
numbers to inspection reports and then reviewing inspection records,
sticker records, and sticker supplies during audits.  In programs with
                                  VII-5

-------
automated data collection, sticker serial  numbers  can  be  easily  recorded
in the inspection report and reviewed  through  routine  data  analyses.
Because of the potential for data loss in  some of  these systems,  State/local
agencies need to be cautious about totally relying on  the automatic
records.  Serial numbers for which no  passing  test data can  be found
should be checked against paper records  kept at the  inspection station
or centrally.  Alternatively, all  serial number/passing test verification
can be done manually.
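The serial-number confirmation described above amounts to matching each issued approval sticker against an inspection record with a passing result.  A minimal sketch, with hypothetical serial numbers and field names:

```python
def unmatched_stickers(issued_serials, inspection_records):
    """Return approval-sticker serial numbers that have no matching
    inspection record showing a passing result.  `inspection_records`
    maps serial number -> result string.  (Illustrative field names.)"""
    return [s for s in issued_serials
            if inspection_records.get(s) != "PASS"]

issued = ["A1001", "A1002", "A1003", "A1004"]
records = {"A1001": "PASS", "A1002": "FAIL", "A1004": "PASS"}
print(unmatched_stickers(issued, records))  # → ['A1002', 'A1003']
```

Any serials the check returns would then be traced to paper records kept at the station or centrally, as the text recommends.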

D.   REPORTED FAILURE RATES

     Reported failure rates - EPA auditors found reported failure rates
as high as 35 percent and as low as 2  percent.   Nine of the  16 programs
audited were experiencing lower than expected  failure  rates.

     Reported failure rates were consistently  much lower  than (less than
half) the designed failure rates in decentralized  programs.   (One exception
was a decentralized program with computerized  analyzers.)  Also,  there
were low failure rates reported in one government  operated,  centralized
program.  All contractor operated centralized  programs had  reported
failure rates in the designed range.

     There are several reasons for the low failure rates.  In a  few
cases, the failure rates are low because the I/M cutpoints  are too lenient.
However, most of the problems with low reported failure rates, especially
in decentralized programs, are caused  by either pre-inspection repairs,
mistakes or cheating by inspectors, or some combination of  these  three
factors.  The EPA believes that a strong inspection  station  surveillance
program is needed to ensure proper station performance.   This surveillance
program should include regular station audits,  spot  checks  with  unmarked
vehicles, and the ability to gauge and track station performance  through
data analyses.  Spot checks with unmarked  vehicles set to fail inspection
should be considered an indispensable  part of  the  oversight  function  in a
decentralized program, particularly a  program  with manual analyzers.

E.   WAIVER RATES

     Waiver rates - EPA auditors found that waiver rates  varied  considerably
among the programs audited.  A few programs do  not allow  waivers.  In
some States, the waiver rates appeared excessive (greater than 10 percent
of failed vehicles receiving waivers).  In one State,  the overall  waiver
rate was about 13 percent of failed vehicles.   In  another case,  approximately
50 percent of the 1981 and newer vehicles  which failed were  receiving
waivers.  In total, 4 of the 16 programs audited had apparent excessive
waiver rates for at least some vehicle categories.
                                  VII-6

-------
     The reasons for the excessive waivers vary to some extent with the
type of program.  In centralized programs and other programs where the
State/local agencies process waiver applications, the problem tends to be
a failure on the part of the agency to adhere to strict processing procedures.
In such cases, vehicles which receive improper or irrelevant repairs are
granted waivers as long as the repair cost ceiling, and other criteria if
any, are met.  If repair facilities have learned that owners never have to
return for a better repair, they have much less incentive to perform only
relevant repairs and to perform them correctly.  To date, program officials
in centralized programs have not spent the effort to create other motivations
for good repairs and to find the repair facilities most in need of improvement.

     In decentralized programs where the inspection  stations  have the
authority to grant waivers, high waiver rates  tend to be caused primarily
by a lack of close scrutiny by the State/local  agency.  In these  cases,
the agencies do not track waiver rates by  inspection  station  and  do not
investigate questionable waiver transactions.   Waiver rates seem  to be
particularly high in those decentralized programs  which  use mandatory
repair sequences (i.e., all or some failed vehicles must have dwell,
timing, air/fuel ratio, and idle speed adjustments only) rather than cost
limits.

     In decentralized programs where the State agencies process all
waivers, the waiver rates are surprisingly low.  In fact, the waiver
rates in some cases are so low that they reinforce the suspicion that
vehicles are not being failed correctly at reinspection.

     In some cases, a secondary problem contributing  to  high  waiver rates
in both centralized and decentralized programs is  the use of  relatively
low waiver cost ceilings.  Over half of the current  operating programs
have waiver cost ceilings of $55 or less.

F.   ANALYZER QUALITY ASSURANCE

     Analyzer quality assurance - EPA auditors found  quite a  variation in
analyzer quality assurance among the audited programs.  In the centralized,
contractor programs, analyzer quality assurance was  excellent.  However,
in decentralized programs with manual  analyzers and  in some centralized,
government-run programs, analyzer quality  assurance was  marginal.   This
group included 10 of the 16 programs audited.   These  problems were  caused
by several factors:

     1.    Lack of a comprehensive program of  preventive and  corrective
           maintenance.

     2.    Lack of thorough audit/surveillance activities for inspection
           stations.

     3.    In some cases, existing analyzers are rather  old.
                                  VII-7

-------
G.   DATA ANALYSES

     Data analyses - EPA auditors found  that,  with  only  a  few  exceptions,
I/M programs are failing to effectively  use available  program  data to
monitor and take steps to improve program  performance  and  performance of
individual inspection stations.  Of the 16  programs  audited,  13 should
improve their data management programs.

     In cases where inspection data are  collected manually,  there are a
number of problems which prevent, or at  least  limit, the collection of
accurate data.  In some cases, records are illegible and therefore unusable.
A more serious problem, and one  that is  more difficult to  resolve, is
that in many cases inspectors do not correctly record  data.  The EPA
auditors found that manually collected data records often  contain easily
identified patterns of record keeping (or  other) abuses.  In most cases,
however, the State/local agencies tend to  categorize the problem of poor
record keeping as inevitable rather than attempting to resolve it.
Abuses which are limited to record keeping may not, of themselves, be
serious threats to air quality objectives.  However, there at least needs
to be a way to analyze the data to distinguish between record keeping
errors and more serious infractions, such as falsification of test results
in order to improperly pass a vehicle with high emissions  or to avoid
inspecting a vehicle at all.  By screening inspection  data,  agency field
investigators should be able to  identify questionable  transactions or
problem stations.  Therefore, EPA considers emphasis on  collecting and
analyzing valid data to be a high priority.
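One simple screen of the kind recommended here is to flag stations whose reported failure rates fall far below the programwide rate, so field investigators know where to look first.  The station counts and the flagging threshold below are illustrative assumptions, not audit criteria:

```python
def flag_stations(station_stats, program_rate, tolerance=0.5):
    """Flag inspection stations whose reported failure rate is less than
    `tolerance` times the programwide rate.  `station_stats` maps
    station id -> (failed_count, tested_count).  (Thresholds and data
    are illustrative only.)"""
    flagged = []
    for station, (failed, tested) in station_stats.items():
        if tested and failed / tested < tolerance * program_rate:
            flagged.append(station)
    return sorted(flagged)

stats = {"S-01": (30, 200),   # 15 percent -- near the designed rate
         "S-02": (4, 190),    # ~2 percent -- worth a field visit
         "S-03": (0, 150)}    # 0 percent -- almost certainly a problem
print(flag_stations(stats, program_rate=0.15))  # → ['S-02', 'S-03']
```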

     In cases where data are collected automatically,  the  data are generally
available to program officials,  although there have been problems with
data loss in some cases.  The problem in these cases tends to  be an
inability to use the data effectively.  In some cases, the State/local
agencies have placed little priority on  developing  computer  programs to
analyze data.  In other cases, the data  are over-analyzed  with many useless
reports being generated.  This latter situation often  leaves the program
managers overwhelmed by the data and confounds their ability to focus on
the useful reports.

H.   QUALITY OF I/M REPAIRS

     Quality of I/M repairs - EPA auditors found that  in every program,
to some extent, a problem exists with respect  to the quality of I/M
repairs.

     State and local agencies are giving minimal attention to assuring the
quality of repairs to failed vehicles.  All too often it appears that
vehicles are adjusted to meet I/M cutpoints with a  small margin of safety
rather than being adjusted near  manufacturer specifications.  This results
in much lower emissions reductions being achieved.
                                  VII-8

-------
I.  EVALUATION OF THE FY 1985 AIR AUDIT EFFORT

     The eight FY 1984 I/M program audits  were  intended  to  test  the  auditing
concepts planned for the FY 1985 audit  guidelines.   The  eight  FY 1985  I/M
audits were more comprehensive in scope and  were  able  to rely  on the
established guidelines and questionnaires.   The EPA  believes that the
FY 1984 and FY 1985 I/M audits were successful  in establishing a long  term
process for evaluating and improving operating  I/M programs.

     One measure of the success of the  audit program can be obtained by
reviewing feedback from the State/local  officials who  participated in  the
audits.  To obtain this feedback, EPA surveyed  officials in the  eight  I/M
programs audited in FY 1985.   In summary,  the major  comments were generally
as follows:

1.   On questionnaires, most  commenters indicated that they would have
     preferred getting the questionnaires  sooner  than  they  did.   They
     felt, in some cases, that EPA should  have  done  a  better job of
     filling out the questionnaire before  it was  sent  to the State.  A
     few commenters indicated a preference for  a  State-specific  questionnaire
     or a separate questionnaire for decentralized and centralized programs;
     however, they agreed that better completion  and screening of questions
     by EPA would have minimized their problems with the current questionnaire.

2.   Most commenters indicated that draft reports should have been sent
     to the States sooner or, at a minimum, that EPA should formally follow up
     with the State on the major findings  of the  audit in a timely manner.
     Some States indicated that they had no  formal contact  with  EPA  on
     the audit between the audit exit meeting and the  receipt  of the
     draft audit report.  In  some cases, this represented a time gap of
     up to 6 months.  Most commenters felt  that better follow-up was
     needed in order to focus on problems,  to develop  reasonable and
     timely resolutions, and  to set priorities  and schedules.  Such  audit
     follow-up activities should be discussed in  the audit  report to make
     the document current at  the time of its release.

3.   Some commenters indicated that EPA should  have  had  better pre-planning
     for site visit activities and better  coordination with the  State/local
     agencies prior to the audit visit.  In  a few cases, site  visit
     activities were not planned until  the  audit  team  arrived  on site.
     Some commenters felt that EPA should  improve communications and
     coordination efforts with the nonair  State/local  agencies that  are
     involved with the I/M programs. In some cases, all  coordination
     efforts with these agencies were inappropriately  delegated  to the
     State/local air agencies.
                                  VII-9

-------
4.   One commenter suggested and others  agreed  that  EPA  should  place more
     emphasis on involving officials from other State/local  programs on
     the audit team.  Participating officials from other States/localities
     should be from States/localities with I/M  programs  similar to  the
     one being audited.

5.   Another suggestion was to involve EPA officials from an EPA Regional
     Office other than the one handling  the audit.   This would  help to
     ensure consistency and to overcome  any biases which may have developed
     within the host Region.  It was again stressed  that the visiting
     personnel be familiar with programs similar to  the  one  being audited.

6.   A few commenters felt that having the State/local officials "evaluate"
     the audits was a good way of getting constructive feedback on the
     audit process and that EPA should solicit such evaluations as part
     of the audit process.  It was felt that these evaluations should
     occur soon after the audit visit and follow-up activities are
     conducted.

     In view of the comments received, there was no  apparent need for
major revisions to the I/M audit guidelines or  questionnaires for FY 1986
and FY 1987.  The major need is simply to improve the administrative
process of implementing the guidelines.   The EPA believes that  the  main
reason for the logistical problems experienced  in FY 1985 was that  the  I/M
audits were a new part of the National Air Audit System.  This  first
year's learning experience by itself will yield improvements for later
years.

     Even though no major revisions were apparently  needed in the guidelines
or questionnaires, two areas were identified for minor changes  in the
FY 1986/87 I/M audit guidelines.  One change was to  add  a short discussion
to the guidelines on audit visit exit meetings.  This change was made to
clarify that exit meetings occur at a point in  the audit process when
only tentative conclusions can be drawn, since  much  information and data
cannot be fully evaluated during the audit visit itself.

     The other area for minor change was in the instructions for the
audit questionnaires.  To avoid some of  the logistical problems with the
questionnaires, the instructions were modified  to clarify that  EPA  personnel
are to initially complete as much as possible  of the questionnaire  and  to
clearly mark those questions which do not apply to a particular State/local
program.

     Identifying operating problems is an important  first step  in providing
quality I/M programs.  The results of completed audits are already  being
used by State/local agencies and EPA to  improve I/M  programs.  The  EPA
believes that the I/M audit system and guidelines will continue to  be a
dynamic process for achieving environmental results.
                                  VII-10

-------
                                   Table 7-2
                          Description of I/M Programs
                            Audited in FY 1984/85

Region I

CT   Areas: Statewide                     Start: 1/83    Type: CC, SE, A, I
     Tamper test: at waiver (P,A,S,E,C,I,R,T)
     Waiver: C 40; E                      Fee: $10
     Vehicles: 1968+ to 10,000 lbs.       Exemptions: M, D
     Light duty cutpoints (CO %, HC ppm):
         1968-1969  7.5  750              1973-1974  6.0  425
         1970       7.0  650              1975-1979  3.0  300
         1971       6.0  650              1980       2.5  275
         1972       6.0  575              1981+      1.2  220

MA   Areas: Statewide                     Start: 4/83    Type: D, SE, A, I, S
     Tamper test: always I; 1980+ C; all at waiver
     Waiver: C 100 or 10% of value, and L; E            Fee: $10
     Vehicles: last 15 years to 8500 lbs. Exemptions: M, D
     Light duty cutpoints (CO %, HC ppm):
         1970-1974  7.0  800              1980       2.7  300
         1975-1979  4.0  400              1981+      1.2  220

Region II

NJ   Areas: Statewide                     Start: 2/74    Type: H, SE, A, I, S
     Tamper test: none                    Waiver: none
     Fee: $2.50° central, $12° decentral
     Vehicles: all years to 6000 lbs.     Exemptions: M, D
     Light duty cutpoints (CO %, HC ppm):
         Pre-1968   8.5  1400             1975-1980  3.0  300
         1968-1970  7.0   700             1981+      1.2  220
         1971-1974  5.0   500

NY   Areas: NYC metro: Bronx, Kings, Nassau, New York, Putnam, Queens,
            Richmond, Rockland, Suffolk, Westchester
     Start: 1/82                          Type: D, SE, A, I, S
     Tamper test: always, 1984+ (P,A,E,C,I,R,T)
     Waiver: L                            Fee: $6.50
     Vehicles: all years to 8500 lbs.     Exemptions: M, D
     Light duty cutpoints (CO %, HC ppm):
         Pre-1975   6.5  800              1979       3.0  400
         1975-1977  5.7  700              1980       2.7  330
         1978       4.3  500              1981+      1.2  220

Program Type Key                          Test Mode
D  = decentralized                        I = idle
CL = central local-run                    L = loaded
CC = central contractor                   R = two speed idle
CS = central state-run                    S = safety
H  = cntrl/dcntrl hybrid
RE = registration-enforced                Tamper Test Key
SE = sticker-enforced                     P = PCV            C = catalyst
RS = registration & sticker               A = air injection  I = inlet
A  = annual inspection                    S = spark system   T = air intake
B  = biennial inspection                  E = evap system    R = EGR

Waiver Key                                Exemption Key
C = cost waiver/$                         M = motorcycles
R = reduction/%                           D = diesels
L = stated repairs
E = tamper repair costs excluded

° Includes safety inspection fee.
    VII-11

-------
                               Table 7-2 (cont.)

Region III

DC   Areas: city-wide                     Start: 1/83    Type: CL, SE, A, I, S
     Tamper test: none                    Waiver: none   Fee: $5°
     Vehicles: all years to 6000 lbs.     Exemptions: M, D
     Light duty cutpoints (CO %, HC ppm):
         Pre-1968   12.5  2000            1975-1979  6.5  600
         1968-1970  11.0  1250            1980+      1.5  300
         1971-1974   9.0  1200

DE   Areas: Wilmington: New Castle        Start: 1/83    Type: CS, RE, A, I, S
     Tamper test: none                    Waiver: C 75   Fee: none
     Vehicles: 1968+ to 8500 lbs.         Exemptions: M, D
     Light duty cutpoints (HC ppm only):
         1968-1970  1100                  1980   275
         1971-1974   800                  1981+  220
         1975-1979   500

VA   Areas: DC suburbs: Arlington, Fairfax Co., Prince William, Fairfax,
            Alexandria, Falls Church, Manassas, Manassas Park
     Start: 12/81                         Type: D, RS, A, I, S
     Tamper test: always, all devices     Waiver: C 75 or L; E   Fee: $5
     Vehicles: last 8 years to 6000 lbs.  Exemptions: M, D, A
     Light duty cutpoints (CO %, HC ppm):
         1977-1979  4.0  400              1981+  1.2  220
         1980       2.0  220

Region IV

GA   Areas: Atlanta: Cobb, DeKalb, Fulton, Gwinnett (1/86)
     Start: 4/82                          Type: D, SE, A, I
     Tamper test: always (P,A,S,E,C,I,R)  Waiver: C 50; E   Fee: $3
     Vehicles: last 10 years to 6000 lbs. Exemptions: M, D
     Light duty cutpoints (CO %, HC ppm):
         1975-1979  4.0  400              1980+  2.5  250

NC   Areas: Charlotte: Mecklenburg        Start: 12/82   Type: D, SE, A, I, S
     Tamper test: always (P,A,C,I,R,T)    Waiver: C 50; E   Fee: $10° max
     Vehicles: last 12 years, all vehicles   Exemptions: M, D
     Light duty cutpoints (CO % only):
         1973-1974  7.0                   1979-1980  3.0
         1975-1978  5.0                   1981+      1.5

TN   Areas: Memphis: Shelby               Start: 8/83    Type: CL, RE, A, I, S
     Tamper test: at waiver (C,I)         Waiver: C 50 or L; E   Fee: none
     Vehicles: all years to 8500 lbs.     Exemptions: M, D
     Light duty cutpoints (CO %, HC ppm):
         Pre-1972   9.9  1990             1975-1979  8.5  1990
         1972-1974  9.0  1990             1980       6.5  1990
                                          1981+      3.0  1990

Program Type Key                          Test Mode
D  = decentralized                        I = idle
CL = central local-run                    L = loaded
CC = central contractor                   R = two speed idle
CS = central state-run                    S = safety
RE = registration-enforced
SE = sticker-enforced                     Tampering Key
RS = registration & sticker               P = PCV            C = catalyst
A  = annual inspection                    A = air injection  I = inlet
                                          S = spark system   R = EGR
Waiver Key                                E = evap system    T = air intake
C = cost waiver/$
R = reduction/%                           Exemption Key
L = stated repairs                        M = motorcycles
E = tamper repair costs excluded          D = diesels
                                          A = air-cooled

° Includes safety inspection fee.
    VII-12

-------
                               Table 7-2 (cont.)

Region VI

TX   Areas: Houston: Harris               Start: 7/84    Type: D, SE, A, T, S
     Tamper test: always (P,E,A,R,T); 1980+ (C,I,L,X); 1984+ (O)
     Waiver: none                         Fee: $2.75
     Vehicles: 1968+ to 8500 lbs.         Exemptions: M, D, O
     Light duty cutpoints: none

Region VII

MO   Areas: St. Louis: St. Charles, St. Louis, St. Louis City, Jefferson
     Start: 1/84                          Type: D, RE, A, I, S
     Tamper test: always (P,A,E,R); 1981+ (C,I)
     Waiver: L                            Fee: $4.50 max
     Vehicles: 1971+ to 6000 lbs. licensed weight   Exemptions: M, D, O
     Light duty cutpoints (CO %, HC ppm):
         1971-1974  7.0  700              1980   3.0  300
         1975-1979  6.0  600              1981+  1.2  220

Region VIII

CO   Areas: Denver: Adams, Arapahoe, Boulder, Denver, Douglas, Jefferson;
            Colorado Springs: El Paso; Fort Collins: Larimer
     Start: 1/82                          Type: D, SE, A, I; R 81+
     Tamper test: always, 1982+ (A,C,I)
     Waiver: 1968-80: L, 15; 1981+: C 100, E
     Fee: $10 max
     Vehicles: 1968+ to 10,000 lbs.       Exemptions: M, D, O
     Light duty cutpoints (CO %, HC ppm):
         1968-1971  6.0  1200             1977-1978  3.4  500
         1972-1974  5.4  1200             1979       2.0  400
         1975-1976  5.0   800             1980+      1.5  400

Program Type Key                          Test Mode
D  = decentralized                        I = idle
CL = central local-run                    R = two speed idle
CC = central contractor                   L = loaded
CS = central state-run                    T = tampering
RE = registration-enforced                S = safety
SE = sticker-enforced
RS = registration & sticker               Tampering Key
CM = computer matching                    P = PCV            O = oxygen sensor
A  = annual inspection                    A = air injection  T = air intake
                                          R = EGR            L = Plumbtesmo
Waiver Key                                E = evap system    X = replace cat if
C = cost waiver/$                         C = catalyst           inlet tampered
R = reduction/%                           I = inlet
L = stated repairs
E = tamper repair costs excluded          Exemption Key
                                          M = motorcycles
                                          D = diesels
                                          O = other fuels
                                          VII-13

-------
                               Table 7-2 (cont.)

Region IX

AZ   Areas: Phoenix: Maricopa (1/77); Tucson: Pima
     Type: CC, RE, A, I
     Tamper test: none                    Waiver: C 50, R 40%   Fee: $5.44
     Vehicles: last 13 years, all vehicles   Exemptions: D
     Light duty cutpoints (CO %, HC ppm; 4-cyl / 6-8-cyl):
         1972-1974  6.0  450  /  5.5  400
         1975-1980  2.5  250  /  2.2  250
         1981+      1.5  250  /  1.5  250

NV   Areas: Las Vegas: Clark (10/83); Reno: Washoe
     Type: D, RE, A, R
     Tamper test: at waiver, 1975+ (C)
     Waiver: pre-82: L, 14; 1982+: C 100, E          Fee: $8
     Vehicles: 1965+ to 5000 lbs. curb wt.   Exemptions: M, D, O
     Light duty cutpoints (CO % only):
         1965-1967  7.5                   1970-1974  4.0
         1968-1969  5.0                   1975+      3.0

Region X

OR   Areas: Portland: Multnomah, Clackamas, Washington
     Start: 7/75                          Type: CS, RE, B; R 81+, I pre-81
     Tamper test: always; 1975+ (P,A,S,E,C,I,R,T,M,O); 1970-74 (P,A,E)
     Waiver: none                         Fee: $7
     Vehicles: last 20 years              Exemptions: M, D, over 8500 lbs.
     Cutpoints established by model year and make; a detailed list is
     available.  Most 1975+ vehicles: 1.0% CO, 225 ppm HC.

Program Type Key                          Test Mode
D  = decentralized                        I = idle
CL = central local-run                    R = two speed idle
CC = central contractor                   L = loaded
CS = central state-run                    S = safety
RE = registration-enforced                T = tampering
SE = sticker-enforced
RS = registration & sticker               Tampering Key
CM = computer matching                    P = PCV            C = catalyst
A  = annual inspection                    A = air injection  I = inlet
B  = biennial inspection                  R = EGR            M = computer module
                                          E = evap system    T = air intake
Waiver Key                                S = spark system   O = oxygen sensor
C = cost waiver/$
R = reduction/%                           Exemption Key
L = stated repairs                        M = motorcycles
E = tamper repair costs excluded          D = diesel fuels
                                          O = other fuels
This table summarizes characteristics of the 16 I/M programs audited in FY 1984/85.  These
characteristics have been derived from statutes and/or rules and regulations promulgated by the
State or locality.  The list includes the names of counties, cities, and States implementing
I/M; however, in some areas only part of the county listed is involved, not the entire county.
The date listed under Program Start is the actual start date of the mandatory I/M program; in
some cases, voluntary programs started earlier.  The Program Type column indicates whether the
program is centralized or decentralized, what type of enforcement mechanism is being used, the
test type, and frequency.  The Tamper Test column indicates when tampering inspections are
conducted and which components are checked.  A key to abbreviations is provided at the bottom
of each page.  Test fees may include a safety inspection, which is indicated by a degree symbol.
The cutpoints are for light duty vehicles only.
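As one illustration of applying the table, the New Jersey cutpoint rows can be expressed as a model-year lookup. The cutpoint values are taken from Table 7-2; the function and data layout themselves are illustrative only, not part of any State's rules.

```python
# Light duty cutpoints from the New Jersey entry in Table 7-2
# (model-year range, CO in percent, HC in ppm).
NJ_CUTPOINTS = [
    (0,    1967, 8.5, 1400),   # pre-1968
    (1968, 1970, 7.0,  700),
    (1971, 1974, 5.0,  500),
    (1975, 1980, 3.0,  300),
    (1981, 9999, 1.2,  220),
]

def cutpoints(model_year):
    """Return the (CO %, HC ppm) cutpoints for a given model year."""
    for first, last, co, hc in NJ_CUTPOINTS:
        if first <= model_year <= last:
            return co, hc
    raise ValueError(model_year)

print(cutpoints(1979))  # (3.0, 300)
```

The same structure extends naturally to the other programs in the table, with per-program vehicle coverage and exemptions checked before the cutpoint lookup.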
                                          VII-14

-------
                             TECHNICAL REPORT DATA
              (Please read Instructions on the reverse before completing)

 1. REPORT NO.                           EPA-450/2-85-009
 3. RECIPIENT'S ACCESSION NO.
 4. TITLE AND SUBTITLE                   National Air Audit System -
                                         FY 1985 National Report
 5. REPORT DATE                          December 1985
 6. PERFORMING ORGANIZATION CODE
 7. AUTHOR(S)
 8. PERFORMING ORGANIZATION REPORT NO.
 9. PERFORMING ORGANIZATION NAME AND ADDRESS
        Office of Air Quality Planning and Standards
        U.S. Environmental Protection Agency
        Research Triangle Park, North Carolina 27711
10. PROGRAM ELEMENT NO.                  13A2A
11. CONTRACT/GRANT NO.                   68-02-3892
12. SPONSORING AGENCY NAME AND ADDRESS
        Director, Office of Air Quality Planning & Standards
        Office of Air and Radiation
        U.S. Environmental Protection Agency
        Research Triangle Park, North Carolina 27711
13. TYPE OF REPORT AND PERIOD COVERED    Final - FY 1985
14. SPONSORING AGENCY CODE               EPA/200/04
15. SUPPLEMENTARY NOTES

16. ABSTRACT
       The National Air Audit System, which was jointly developed by EPA and
   representatives of State and local air pollution control agencies, was implemented
   for the first time in FY 1984.  In FY 1985, the system audited air pollution control
   activities in 68 State and local agencies in the areas of air quality planning and
   State implementation plan activity, new source review, compliance assurance, air
   monitoring, and inspection and maintenance.  The goals of the audit system are to
   identify obstacles that are preventing State and local agencies from implementing
   effective air quality management programs and to provide EPA with quantitative
   information for use in defining more effective and meaningful national programs.
   The report for FY 1985 indicated that, for the most part, State and local agencies
   have sound programs in each of the four audited areas.  Areas of possible
   improvement were found, however, which will be the focus of various remedial
   activities.

17. KEY WORDS AND DOCUMENT ANALYSIS
 a. DESCRIPTORS                   b. IDENTIFIERS/OPEN ENDED TERMS   c. COSATI Field/Group
    Air pollution                    Air Pollution Control             13B
    Air audit
    Air quality planning
    New source review
    Compliance assurance
    Air monitoring
    Inspection and maintenance

18. DISTRIBUTION STATEMENT        19. SECURITY CLASS (This Report)   21. NO. OF PAGES
    Release unlimited.                Unclassified                       129
    Available through NTIS.       20. SECURITY CLASS (This page)     22. PRICE
                                      Unclassified

EPA Form 2220-1 (9-73)

-------