EPA
               United States
               Environmental Protection
               Agency
Office of Air Quality
Planning and Standards
Research Triangle Park NC 27711
EPA-450/2-89-003
January 1989
               Air
               National Air Audit System
                      FY 1986-1987
                     National Report

-------
                                 EPA-450/2-89-003
   National Air Audit  System
FY  1986-1987  National  Report
          Air Quality Management Division
     U.S. ENVIRONMENTAL PROTECTION AGENCY
            Office of Air and Radiation
      Office of Air Quality Planning and Standards
     Research Triangle Park, North Carolina 27711

                January 1989

-------
This report has been reviewed by the Office of Air Quality Planning and Standards,
EPA,  and approved for publication.  Mention of trade names or commercial
products is not intended to constitute endorsement or recommendation for use.
Copies of this report are available through the Library Services Office (MD-35),
U.S. Environmental  Protection Agency, Research Triangle Park, North Carolina
27711; or, for a fee, from the National Technical  Information Service, 5285 Port
Royal Road, Springfield, Virginia  22161.
                   Publication No. EPA-450/2-89-003

-------
                            TABLE OF CONTENTS
1.  OVERVIEW OF AUDIT FINDINGS 	  1-1

2.  INTRODUCTION	2-1

3.  AIR QUALITY PLANNING AND SIP ACTIVITIES

    3.1  Executive Summary 	  3-1
    3.2  Air Quality Evaluation	3-4
    3.3  Emission Inventory  	  3-10
    3.4  Modeling	3-28
    3.5  SIP Evaluation and Implementation	3-39

4.  NEW SOURCE REVIEW

    4.1  Executive Summary 	  4-1
    4.2  Introduction	4-4
    4.3  Summary of Major Findings 	  4-6
    4.4  Public Notification Procedures  	  4-15
    4.5  Applicability Determinations  	  4-17
    4.6  BACT/LAER Determinations  	  4-21
    4.7  Ambient Monitoring (PSD)  	  4-24
    4.8  Ambient Air Quality Analysis	4-25
    4.9  Emission Offset Requirements  	  4-29
    4.10 Permit Specificity and Clarity	4-30

5.  COMPLIANCE ASSURANCE

    5.1  Executive Summary 	  5-1
    5.2  Introduction	5-2
    5.3  Major Findings and Conclusions  	  5-3
    5.4  Periodic Review and Assessment of Source Data 	  5-4
    5.5  File Review	5-14
    5.6  Overview Inspections	5-15

6.  AIR MONITORING

    6.1  Executive Summary	6-1
    6.2  Introduction	6-2
    6.3  Major Findings and Conclusions  	  6-2
    6.4  Network Design and Siting	6-4
    6.5  Resources and Facilities	6-5
    6.6  Data and Data Management	6-7
    6.7  Quality Assurance/Quality Control 	  6-9

                                   iii

-------
7.  MOTOR VEHICLE INSPECTION/MAINTENANCE

    7.1  Executive Summary 	  7-1
    7.2  Introduction	7-2
    7.3  Major Findings and Conclusions	7-2
    7.4  Reported Failure Rates/Improper Testing 	  7-6
    7.5  Enforcement	7-8
    7.6  Waivers	7-9
    7.7  Quality Control	7-11
    7.8  Quality Assurance 	  7-11
    7.9  Data Analysis	7-12
                          iv

-------
                              LIST OF TABLES
Table                                                               Page

3.1       Specific Questions Regarding Air Quality Reports           3-6
3.2       Area Redesignations by State and Local  Agencies            3-8
3.3       Methods Used to Review Attainment Status                   3-9
3.4       Actions Taken Following Air Quality Monitoring
            Violations                                               3-11
3.5       Summary of Experience and Training of Modeling
            Staff                                                    3-31
3.6       Summary of Model  Availability                              3-33
3.7       Summary of Alternate Modeling Techniques                   3-37
3.8       Summary of EPA Reviews of Modeling Analyses                3-38
3.9       Timeliness of Regulatory Development                       3-41
3.10      Reasons for Delays in the Submittal  of SIP Revisions
            and Strategies                                            3-43
3.11      Reevaluations of  Growth Projections                        3-47

6.1       FY 1985 Audit Results for Data Precision                   6-11
6.2       FY 1985 Audit Results for Data Accuracy                    6-11
6.3       Precision of 1986 Data Submitted to EPA                    6-12
6.4       Accuracy of 1986  Data Submitted to EPA
            (Audit Level  2)                                          6-12

7.1       FY 1986 I/M Audits                                         7-3
7.2       FY 1987 I/M Audits                                         7-3
7.3       I/M Data Collected During FY 1986-87 Audits                7-4

-------
                             LIST OF FIGURES

Figure                                                              Page

3.1       Reasons for Delays in SIP Responses                        3-44

4.1       Agency Performance on Public Notification                  4-7
4.2       Applicability Determinations - Agency Performance          4-8
4.3       BACT Determinations - Agency Consideration
            of Control Alternatives                                  4-11
4.4       Relative Stringency of BACT Determinations                 4-11
4.5       Agency Performance on Ambient Air Quality Analysis         4-12
4.6       Condition of Issued Permits                                4-14
5.1       Compliance Breakout of Class A SIP Sources                 5-5
5.2       Compliance Breakout of Class A1 SIP Sources                5-6
5.3       Compliance Breakout of NSPS Sources                        5-7
5.4       Compliance Breakout of NESHAPs Sources                     5-8
5.5       Class A SIP Performance Statistics                         5-9
5.6       Class A1 SIP Performance Statistics                        5-10
5.7       NSPS Performance Statistics                                5-11
5.8       NESHAPs Performance Statistics                             5-12

-------
                                CHAPTER 1

                        OVERVIEW OF AUDIT FINDINGS
     Among the major air quality management objectives of the Clean Air
Act (CAA) is the requirement to attain the national ambient air quality
standards (NAAQS) as expeditiously as practicable, maintain them thereafter,
and prevent significant deterioration of air quality that is better than
the NAAQS.  The CAA imposes the primary responsibility for attaining these
objectives on the States.  The Environmental Protection Agency (EPA) has a
responsibility to overview the State activities and ensure proper program
direction.  Therefore, the overall goal  of the National Air Audit System
(NAAS) is to determine if State and local  air pollution control programs
are achieving the CAA's air quality management objectives.

     As was the case with the audit conducted in FY 1985, the FY 1986-87
audit indicated that State and local air pollution control agencies receive
high marks in some areas, particularly when considering the difficulty and
magnitude of the air quality management task, the limited resources, and
the technological limit of the tools available to do the job.  The overall
assessment, in actually meeting the literal objectives of the CAA, however,
shows room for improvement.  Some of the identified deficiencies are
minor, and in some cases programs already exist at the State or national
level to correct them.  On the other hand, if left unattended, deficiencies
such as (1) insufficient resources, (2) the absence of a formal emission
inventory quality assurance program for ozone extension and SIP call
areas, (3) limited access to and familiarity with ozone models such as
EKMA and the Urban Airshed Model, and (4) inadequate enforcement against
long-term violators could threaten the overall effectiveness of
some air quality management programs.  If the various State and local
programs are to improve and become more effective, it is necessary for EPA
to (1) identify those activities that make up an air quality management
program, (2) specify how those activities  are to be conducted pursuant
to recognized good practice and within the current resource constraints,
(3) compare those activities and practices with what was found during the
FY 1986-87 audit program, (4) continue to  provide guidance and support on
those activities, and (5) seek new sources of revenue to ensure that
agencies are properly funded to carry out  the objectives of the Clean Air
Act.

AIR QUALITY DATA

     The starting point for air quality management is collection and
analysis of information on the quality of  the ambient air.  To obtain
this information, it is necessary to establish a monitoring network

                              1-1

-------
sufficient to collect data representative of the monitored area.   The
data must then be analyzed to determine whether the air quality is within
the prescribed limits of national  ambient air quality standards or is in
violation of those standards.  These steps must be done in an accurate
and timely fashion, to determine the need either for protection against
significant deterioration by possible new sources or for additional
control of existing sources and offsetting emission reductions for new
sources.
     The FY 1986-87 audit of air quality monitoring data is essentially
a repetition of the audit conducted in 1985.  Both audits evaluate network
design and siting, resources and facilities, data and data management,
and quality control.  The FY 1986-87 audit showed once again that States
have done a commendable job in establishing and operating ambient monitoring
networks for criteria pollutants,  and that quality data are generally
available.  The audit indicates that State and local  agencies have continued
their successful performance in operating and maintaining the State and
local  air monitoring stations (SLAMS) and the national air monitoring
stations (NAMS).  Approximately 99 percent of the monitors operated  by
the audited agencies are meeting the design and siting regulations for
ambient air monitoring networks.  This closely matches the 98
percent compliance rate in FY 1985.

     The audit results also indicate that 85 percent of the audited  agen-
cies need either new or replacement air monitoring or laboratory  equipment.
This is an increase of 3 percentage points over the share of agencies needing
new or replacement equipment in FY 1985.  The total cost of equipment needed
is approximately $2.7 million, with $1.7 million required for monitoring
equipment and $1 million for laboratory equipment.  Even though total
equipment need has decreased by $1.9 million since FY 1985, the amount
needed for laboratory equipment remains the same.  This finding
suggests that data completeness and reliability problems may continue to
be encountered if this area does not continue to receive attention.

     Similar to prior audits, the FY 1986-87 audit showed that timeliness
of data submittal remains a problem for many agencies, particularly  for
submission of lead (Pb) data.  During this audit cycle, approximately 29
percent of the agencies were late with their lead data submittals, compared
to 25 percent in FY 1985.  The percentages of late submittals for all
pollutants ranged from 25 percent  for Pb to 10 percent for TSP.  Perfor-
mance in meeting the National Aerometric Air Data Bank (NAADB) 75 percent
data completeness criterion showed a range of 16 percent, with a  low of
78 percent for NO2 and a high of 94 percent for lead.  In FY 1985, the
low was 84 percent for NO2 and the high was 92 percent for TSP.  Another
problem area which continues to resurface during each successive  audit of
agencies' data management is the requirement for submitting the annual
SLAMS report.  The audit showed that, of the 45 audited agencies  required
to submit an annual SLAMS report,  9 agencies, or 20 percent, were deficient
in one or more of the four elements required in the annual report.
Efforts to resolve this problem administratively are continuing,  with a
slight improvement occurring between the 1985 and 1986-87 audits.
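The 75 percent completeness criterion mentioned above reduces to a simple
ratio test.  The sketch below is illustrative only: the function names are
invented, and NAADB applied additional data validation rules not shown here.

```python
# Illustrative sketch of the NAADB 75 percent data completeness criterion.
def completeness(valid_samples: int, scheduled_samples: int) -> float:
    """Fraction of scheduled samples that produced valid data."""
    if scheduled_samples == 0:
        return 0.0
    return valid_samples / scheduled_samples

def meets_naadb_criterion(valid_samples: int, scheduled_samples: int,
                          threshold: float = 0.75) -> bool:
    """At least 75 percent of scheduled samples must yield valid data."""
    return completeness(valid_samples, scheduled_samples) >= threshold

# A sampler on a 1-in-6-day schedule yields about 61 scheduled runs per year.
print(meets_naadb_criterion(50, 61))   # 50/61 is about 82 percent -> True
print(meets_naadb_criterion(40, 61))   # 40/61 is about 66 percent -> False
```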

     Once again in FY 1986-87, the quality assurance/quality control portion
of the audit reports continues to demonstrate good overall performance.
Three agencies needed major changes to their quality assurance plans, and

                               1-2

-------
32 had minor revisions pending.  With respect to achieving quality
assurance goals for data precision (± 15% for all pollutants) and accuracy
(± 20% for all criteria pollutants), the only significant problem is with
the precision for Pb, for which only 55 percent of the reporting organizations
achieved the goal.  These figures are nearly identical to those reported in
FY 1985, except there is an increase of 9 percent for Pb accuracy for FY
1986-87.  The continued inability to meet precision and accuracy goals for
Pb over other pollutants is believed to be related to the analysis procedure.
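The goal test described above can be expressed as a simple comparison.  The
sketch below is illustrative only: the function names and the sample values
are invented, and the actual assessment used the probability limits computed
under EPA's quality assurance procedures, which are not reproduced here.

```python
# Illustrative check of precision/accuracy results against the goals
# cited above (± 15% precision for all pollutants, ± 20% accuracy).
PRECISION_GOAL = 15.0   # percent
ACCURACY_GOAL = 20.0    # percent

def within_goal(limit_pct: float, goal_pct: float) -> bool:
    """True if a reported percent-difference limit falls inside ± goal."""
    return abs(limit_pct) <= goal_pct

# Hypothetical Pb precision limits (percent) for four reporting organizations:
pb_precision_limits = [12.0, -18.0, 14.9, 21.3]
passing = sum(within_goal(x, PRECISION_GOAL) for x in pb_precision_limits)
print(f"{passing} of {len(pb_precision_limits)} organizations met the Pb precision goal")
```

In this invented example, two of the four organizations meet the ± 15 percent
goal; the audit's 55 percent figure for Pb was computed the same way across
all reporting organizations.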

AIR QUALITY EVALUATION

     Once air quality data are collected and analyzed, they serve as the
foundation for air quality management planning.  Due to this important
function, the FY 1986-87 audit was once again concerned with the States'
ability to perform air quality evaluations.  As was the case in FY 1985,
the 1986-87 audit investigated (1) States' ability to consider available
air quality data systematically for the purpose of disseminating informa-
tion to the public, (2) States' ability to assess Section 107  redesignations,
and (3) States' ability to analyze new violations.

     The air quality evaluation portion of the audit shows that 93 percent
of States and 75 percent of local agencies made air quality reports avail-
able to the public.  This is a decrease from the FY 1985 audit, which showed
that 98 percent of local agencies made reports available to the public.
Sixty (60) percent of the agencies publish annual reports, while 3.5 and
13 percent of the agencies publish semiannual and quarterly reports,
respectively.  Forty-eight percent of the agencies reported that they publish
reports within 6 months of acquiring the data.  This is also down from the
FY 1985 audit, in which 80 percent of the agencies were found  to publish
reports within 6 months of acquiring the data.

     Seventy-three (73) percent of the State and local agencies reviewed
Section 107 primary and secondary attainment status, as opposed to 83
percent of the agencies reported in the FY 1985 audit.  Eighty-two Section
107 reviews were completed that did not result in a request for redesigna-
tion, as opposed to 253 reviews in the FY 1985 audit.

     Forty-six redesignation actions were initiated (272 initiations in FY
1985) and, of these, 40 actions were completed and submitted to EPA (247
completions and submissions reported in the FY 1985 audit). Approximately
27 percent of these completed and submitted actions resulted in a redesig-
nation from attainment to nonattainment.

     Air quality monitoring detected new NAAQS violations at a total of
60 monitoring sites.  Of these violations, 41 were at existing monitoring
sites and 19 were at newly established sites.

EMISSION INVENTORY

     In order to develop a control strategy, it is necessary to relate air
quality to source emissions, using such tools as an air quality model.
This relationship is possible only with a good emission inventory.  Since
nonattainment of the ozone and carbon monoxide standards is a  significant
problem which EPA will have to deal  with over the next several years, the

                              1-3

-------
FY 1986-87 audit for emission inventories concentrated on securing
information on the status and maintenance of, and problems associated
with, inventories for these pollutants.  Fifty-nine control agencies were
interviewed and asked to complete a questionnaire on four aspects of
their emission inventory program: (1) uses of criteria pollutant emission
inventories, (2) volatile organic compound (VOC) inventories in ozone
(O3) nonattainment areas, (3) carbon monoxide (CO) inventories in CO
nonattainment areas, and (4) contribution of highway vehicle emissions in
O3 and CO nonattainment areas.

     Questions on uses of criteria pollutant emission inventories were
designed to determine agency compliance with Clean Air Act requirements
to maintain emission inventories to ensure reasonable further progress
(RFP) in O3 nonattainment areas.  When looking at these requirements, we
find that approximately 75 percent of the RFP areas are being tracked by
about 60 percent of the responsible agencies.  Similarly, nearly 70 per-
cent of VOC emission inventories have been updated during the last year.

     These two facts indicate that there is room for improvement in RFP
management.  Approximately 30 percent of the local agencies currently
have a formal quality assurance (QA) program.  This strongly suggests
the need for additional guidance and improved emission factors from
Headquarters and the Regional Offices for some aspects of the inventory
program.

     When looking at CO emission inventories to determine geographic
coverage, we find that adequate CO inventories are being maintained in
85 percent of the CO nonattainment areas.  We further find that 37 percent
of the agencies include woodstoves in their inventories, and  approximately
74 percent of the agencies are using mobile source inventories  to locate
potential CO hot spots.  Most agencies also are utilizing one or more of
the recommended techniques (nonreactive VOC, O3 seasonal temperature) to
adjust their VOC inventories.

     Investigation of highway vehicle inventories indicates that 53
percent of the inventories are compiled and maintained by other local
agencies or by transportation departments.  This arrangement  appears to
be satisfactory, in that most agencies are utilizing approved techniques
to adjust their inventories to local conditions and to update their
highway inventories.

MODELING

     As noted previously, air quality models are used to relate emissions
to ambient air quality and, hence, to determine necessary control measures.
Because of this relationship, the FY 1986-87 audit focused on the modeling
experience and training of agency personnel, and their access to various
air quality models.  In FY 1986-87, State and local agencies  were queried
concerning (1) experience and training of agency personnel who  conduct
modeling; (2) availability of various guideline models, other EPA
recommended models, nonguideline models, screening models, and  other
models; (3) uses of non-EPA recommended modeling techniques;  and (4) EPA
Regional Office review and revision of modeling applications  conducted  by
the State and local agencies.

                              1-4

-------
     Overall, the responses indicate that agency modeling personnel have
the educational background needed to conduct modeling.  Even those who do
not have degrees in meteorology, engineering, or science, or who lack
formal training in modeling, generally have practical modeling experience.
Given this combination of education, training, and experience, State and
local agency staffs appear capable of performing adequate modeling
analyses.

     State and local  agencies generally have access to a wide variety of
dispersion models.  Most agencies indicate that they are most comfortable
with models that predict concentrations of particulate matter, sulfur
dioxide, and other nonreactive pollutants from sources located in simple
terrain.  The agencies tend to have less access to and familiarity with
models used to determine the effect of mobile sources and with ozone models
such as EKMA and the Urban Airshed Model.  Since these models will be the
primary ones used in analyses for the post-1987 ozone and CO SIPs, the lack
of access to and familiarity with these models may pose a problem for some
agencies.

     The FY 1986-87 audit indicated that State and local agencies used
nonguideline modeling techniques in only about 10 percent of modeling
analyses, compared to 20 percent in FY 1985.  The most common
reason given for using a nonguideline technique was the technical superiority
of that method over the one recommended in the guideline.  There were
also 321 instances of guideline recommendations on the use of data bases
not being followed.  Of these, 93 percent failed to use 5 years of offsite
meteorological data or 1 year of onsite data.  These instances may be due
to model applications in areas for which there are little or no meteorological
data.
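The data-base recommendation cited above (5 years of offsite meteorological
data or 1 year of onsite data) can be expressed as a simple check.  The
function name and inputs below are invented for illustration; the modeling
guideline itself states additional conditions not captured here.

```python
# Illustrative sketch of the meteorological data-base recommendation:
# at least 1 year of onsite data, or 5 years of offsite (e.g., nearby
# National Weather Service) data.
def met_data_meets_guideline(years_onsite: float, years_offsite: float) -> bool:
    """True if the data base satisfies either branch of the recommendation."""
    return years_onsite >= 1.0 or years_offsite >= 5.0

print(met_data_meets_guideline(0.0, 5.0))  # True: 5 years of offsite data
print(met_data_meets_guideline(0.5, 3.0))  # False: neither threshold met
```

The 93 percent of flagged instances noted above are precisely those for
which a check of this kind would return False.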

     There are few cases in which the EPA Regional Office review of  the
State or local agency modeling analyses led to requiring a revised analy-
sis.  It is unclear whether the small number of reviews and revisions was
due to the expertise of the agencies or to the fact that few analyses
were submitted to the Regional Offices for review.

SIP EVALUATION AND IMPLEMENTATION

     The CAA clearly envisions that control agencies would periodically
evaluate source/receptor relationships, i.e., the relationships between
air quality and source emissions, and would revise their SIP control
strategies and resultant emission regulations accordingly.  If a State
does not act in those situations where the ambient air quality standards
are exceeded and the Administrator determines the SIP to be substantially
inadequate, the CAA requires EPA to initiate State action by calling for
revisions to the SIP.  To this end, the FY 1986-87 SIP evaluation and
implementation audit was designed to assess whether State plans for  attain-
ment are being reasonably carried out and to identify agencies' needs in
developing and updating their SIPs.  Questions asked during this audit
included inquiries on:  (1) timeliness of regulatory development, (2)
timeliness of studies, (3) transportation control measures, (4) resource
activity, and (5) approved generic bubble rules.
                              1-5

-------
     While the majority of agencies have made progress in submitting
required rules, 29 percent of the SIP revisions approved as  part of the
SIP control  strategy had not been completed and, of those completed, 24
percent were not on schedule.  This is a substantial  reduction from the
44 percent of SIPs due or overdue and the 27 percent  reported  on schedule
in FY 1985.   The audit identified three major causes  for these delays:
overly optimistic schedules, insufficient resources,  and other reasons
that were specific to each revision.

     One-fourth of the additional studies that the agencies  had committed
to complete  as part of their SIP control strategies had not  been completed
and, of these studies, 56 percent were behind schedule.  These figures  show
a considerable improvement over FY 1985, when only 50 percent  of additional
studies were completed and 64 percent were behind schedule.

     For transportation control measures (TCMs) to control VOC and CO,
it was found that 74 percent (27 percent in 1985) had been implemented  for
traffic flow improvements, 55 percent for mass transit, 80 percent for
carpooling,  89 percent for vehicle inspection/maintenance, and 68 percent
for other measures such as bicycle lanes and parking  controls.

     The audit revealed differences in values between agencies'  latest
projections  of certain air pollution indicators for use in revising SIPs,
and the actual levels that had occurred.  The absolute differences were 1
percent per  year for population growth (in 22 metropolitan areas), 1.8
percent per  year for employment projections, 1.8 percent per year for
vehicle miles traveled, and 1.4 percent per year for  major source emissions.
Many respondents contemplated, after finding significant differences
between their projections and the actual growth figures, making revisions
to upcoming  RFP reports or to the SIPs themselves.
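The "percent per year" figures above can be interpreted as the average
annual absolute difference between a projection and the level actually
observed.  The sketch below is illustrative only; it assumes this
interpretation of how the statistic was computed, and the sample numbers
are invented.

```python
# Illustrative computation of an annualized projection difference.
def annual_abs_difference_pct(projected: float, actual: float,
                              years: float) -> float:
    """Absolute projection error as percent of the actual value per year."""
    return abs(projected - actual) / actual / years * 100.0

# Hypothetical: population projected at 1.05 million versus 1.00 million
# actual, five years after the projection was made.
print(round(annual_abs_difference_pct(1.05e6, 1.00e6, 5), 2))  # 1.0
```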

     Based upon data from the audit, only 18 percent  of the  States have
bubble rules formally approved by EPA.  Source emissions in  these States
dropped overall to rates at or below allowable emission standards after
the bubbles  were implemented.  For the 11 States with bubble rules not
formally approved, source emission rates increased, but 67 percent were
at rates at  or below the allowable emission standards once the bubbles
were implemented.  However, these States are placing  the "impacted"
sources in jeopardy of EPA actions to enforce the original SIP limits.

I/M PROGRAMS

     The primary purpose of the I/M audit is to allow EPA to ensure that
each State or locality is implementing and enforcing  its I/M program in
a manner consistent with its State implementation plan.  Another objective
of the audit is to identify areas where EPA can provide assistance to
strengthen I/M programs.

     During  FY 1986-87, EPA conducted 33 program audits.  The  results of
these audits essentially confirmed the validity of findings  of the audits
performed during FY 1985 and provided further evidence that  I/M is a
reasonable and effective strategy for reducing motor vehicle tailpipe
emissions.  In fact, many of the lessons learned from auditing operating
I/M programs serve as the basis for EPA's proposed Post-1987 Ozone and
Carbon Monoxide Policy.
                              1-6

-------
     The FY 1986-87 audit provided convincing evidence that the most
effective I/M program design is the centralized design, while decentra-
lized programs with manual analyzers are the weakest.  In fact, centralized
programs were found to be so superior in identifying failing vehicles and
achieving emission reductions required by EPA that EPA requested corrective
action from the Governors of seven States having decentralized programs.
Other problems that resurfaced in the FY 1986-87 audit were associated
with high waiver rates, ineffective use of program data for program
management, lack of quality control and consistency of testing in
decentralized programs, enforcement in sticker enforced programs, and the
use of license or registration suspension as a measure of enforcement.

     As reported in the FY 1985 audit, the resolution of certain problems
generally rests with each I/M program.  Current audit results clearly
indicate, however, that differently designed programs do not have equal
benefit potential and that resolution of some problems requires changing
program design.  It is unlikely that additional monitoring or administrative
control will be sufficient to resolve problems associated with decentralized
manual inspections.  The audit results also indicated that decentralized
tampering inspections may lack the impartiality, consistency, and accuracy
needed to achieve sufficient benefits.  Given the strength of the audit
findings, EPA favors the centralized program design and believes that
registration denial enforcement is more effective than sticker-based
systems.

     The audit results clearly identify the need for EPA to continue to
work with each State or local I/M program to address problems identified
during the audits.  In addition, EPA should give a high priority to
conducting I/M audits on the remaining programs as soon as possible.

NEW SOURCE REVIEW

     The CAA anticipates that review of new sources by the States will
be one of the main mechanisms by which States attain and maintain the NAAQS
and prevent significant air quality deterioration.  A State's new source
review (NSR) program must be designed and implemented to prevent a new
source from aggravating an existing air quality problem or creating a
new problem where one does not already exist.  The FY 1986-87 audit verified
that State and local agencies are generally familiar with, and strongly
support, the preconstruction review process.

     These audit findings support, and oftentimes amplify, the findings
from the FY 1985 NSR audit.  The findings indicate that most agencies per-
form their overall NSR program responsibilities reasonably well, although,
in a number of State and local agencies, problems were identified with
respect to the consistent and adequate application of certain specific
program requirements.  Nonetheless, auditors occasionally mentioned that
improvements were observed in the performance of some agencies where
specific deficiencies had previously been found.  This is certainly to
the credit of these agencies' efforts to improve in-house performance and
to contribute to a greater level of consistency nationwide.  Overall, how-
ever, EPA auditors often cited the lack of adequate file documentation as a
hindrance to a complete evaluation of the agencies' permit review procedures.
With respect to specific audit topics, EPA continued to find significant

                              1-7

-------
problems with the way many agencies carried out their source applicability
determination procedures and, in some instances, with the lack of thorough
ambient air quality impact analyses and with inconsistent methods for
imposing operating limitations on new and modified sources.  The audit
once again noted the overall tendency of agencies to rely on New Source
Performance Standards (NSPS) in defining best available control technology
(BACT).  The audit also confirmed that agencies typically are willing to
allow prevention of significant deterioration (PSD)  applicants to use
existing data in lieu of new monitoring data, but it raised a new concern
when EPA found that the basis for such actions was not always well  documented
or in complete conformance with existing EPA criteria for representative
data.

     Yet, despite the assortment of problems identified in the audits,
it is still fair to say that most State and local agencies function
in a competent manner, with each agency having its own individual  strong
points.  For example, some agencies routinely extend the requirement for
BACT to non-PSD sources, or require BACT for all pollutants once it is
determined to be required for any pollutant.  The focus on permit files
demonstrated that most agencies are more prone to internal inconsistencies
than to routine malpractice.  While not always true, auditors were usually
able to find good examples along with the bad whenever selected permit
files were examined.

COMPLIANCE

     Ultimately, the success of a State's air quality management program
relies on its ability and will to enforce its regulations.  Many State
and local agencies showed one or more strong points characteristic of a
successful air compliance program, such as high source compliance rates
supported by high inspection frequency rates, performance of all required
NSPS source tests, expeditious resolution of violations, and few long-term
violators.  Other States had source files that were for the most part
well organized, up-to-date, and complete, reflecting a reasonable profile
of each source.  These positive points show that most States are fulfilling
compliance and enforcement responsibilities under the CAA.

     Inspection rates for Class A1* SIP sources decreased from those
reported in the FY 1985 audit (87 to 84 percent), and five States had
unacceptably low inspection rates of less than 60 percent.  Compliance
rates for Class A1 SIP sources remained roughly the same as in FY 1985
(91 to 92 percent).  The NSPS national figures for both inspection and
compliance rates rose (88 to 90 percent and 90 to 93 percent, respectively),
even though the rates in some States declined.  The NSPS inspection rates
for nine States are unacceptably low, with figures of less than 60 percent.
National Emission Standards for Hazardous Air Pollutants (NESHAP) inspec-
tion rates showed an increase (from 67 to 96 percent), while compliance
rates fell slightly (from 93 to 90 percent).  Twelve States have NESHAP
inspection rates at or below 60 percent.  The audit results show that
performance did not change significantly from the 1985 audit.  In the
coming fiscal year, it is expected that State and local agencies will
continue to work toward further improvement in the compliance of
sources.

* Class A1 includes sources with actual or potential controlled emissions
  greater than or equal to 100 tons per year.

                              1-8

-------

     The FY 1986-87 audit also revealed that several State and local
agencies, to a varying extent, still have weaknesses in three areas vital
to a strong and effective compliance program.  First, source files main-
tained by some agencies do not contain verifiable information reflecting
a reasonable profile of each source.  This is evidenced by the 7 percent
reduction in the number of source files having adequate verifiable infor-
mation since FY 1985.  Even so, some audit reports cited improvements
since the FY 1985 audits in the condition of State files.  Second, some
inspection reports still are of poor quality (no mention of operating or
emission parameters, or pollutants emitted), and this is a significant con-
cern.  Third, although overall there was a slight increase in the percentage
of audit reports indicating that sources were being expeditiously returned
to compliance (from 74 to 77 percent), some of the reviewed agencies'
enforcement efforts are not always effective in reducing the number of
long-term violators by expeditiously returning documented violators to
compliance.

     Thus, while there have been improvements in the aforementioned three
areas, some State and local agencies need to heighten efforts in these
critical areas.  Success in these areas is vital to the establishment and
maintenance of State and local agency credibility with EPA and the public.

NEXT STEPS

     The EPA Regional Offices are now in the process of working with the
State and local agencies to correct those deficiencies that were identified
in the FY 1985 audit.  They will continue this effort into FY 1989.  In
addition, EPA intends to use the results of the FY 1986-87 audit in its
program planning and budgeting cycle to assure that resources are directed
to areas of highest need.

     The National Air Audit System is intended to focus on State and
local air quality management programs, identifying program deficiencies
and defining actions necessary to bolster the programs to make them more
responsive to environmental needs.  Past audit reviews indicate that
there are some inherent weaknesses in the NAAS, both in how audits are
conducted and in how audit results are utilized.  This is especially true
for the Air Quality Evaluation and SIP Evaluation and Implementation
sections of the NAAS.  Consequently, a task force is being formed to
assess the current NAAS and make recommendations for changes.  Special
emphasis will be placed on making the system more audit-oriented and less
survey-oriented.  The task force will also attempt to make the audit
results more useful to audit participants.
                                   1-9

-------
                                CHAPTER 2

                               INTRODUCTION
     The National Air Audit System (NAAS) was developed in 1983 through a
joint effort of the State and Territorial Air Pollution Program Administrators
(STAPPA), the Association of Local  Air Pollution Control  Officials (ALAPCO),
and EPA.  The NAAS provides uniform national  criteria for evaluating
(auditing) State and local  air pollution control programs.  Such nationally
applicable criteria minimize inconsistency in program audits carried out
by EPA's 10 Regional Offices.

     The need for the NAAS evolved as State and local air pollution
control agencies assumed responsibility under the Clean Air Act (CAA) for
an increasing number of programs.  The EPA responded to the concerns of
STAPPA and ALAPCO members by agreeing to participate in a STAPPA/ALAPCO/
EPA workgroup.  The workgroup set forth to develop and direct the imple-
mentation of an auditing system that would ensure the desired national
consistency and confirm that State and local  air pollution control
programs were operating in such a manner as to satisfy the national
requirements of the CAA.

     The workgroup decided that the primary goals of the NAAS should be
to identify any obstacles that are preventing State and local agencies
from implementing an effective air quality management program and to
provide EPA with information that can be used to develop more effective
and meaningful national programs.  The NAAS should provide audit guidelines
that EPA and State and local agencies can use to (1) meet statutory
requirements; (2) assist in developing an acceptable level of program
quality; (3) account for the achievements, shortcomings, and needs of
various air programs; (4) identify programs needing further technical
support or other assistance; and (5) manage available Federal, State,
and local resources effectively so that the national ambient air quality
standards (NAAQS) are attained and maintained as expeditiously as possible.

     The first audit covered four program areas selected by the workgroup:
air quality planning and SIP activity, new source review, compliance
assurance, and air monitoring.  Standardized  audit guidelines for each
program area were written by subcommittees appointed by the workgroup.
Each subcommittee was chaired by a State agency representative, with an
EPA staff person serving as coordinator.  Local agencies and the EPA
Regional Offices were also represented on each subcommittee.  The workgroup
developed the protocol for implementing the audit guidelines written by
subcommittees.
                             2-1

-------
     These guidelines were used for conducting the first NAAS audits in
FY 1984 (ending September 30, 1984).  A national  report (EPA-450/2-84-009)
that summarized the results of the 68 audits performed the first year was
issued in December 1984.  The audit guidelines were revised for FY 1985
and vehicle inspection/maintenance was added as a fifth program audit
area.

     The guidelines were used by EPA Regional  Offices in FY 1985 to audit
66 State and local air pollution control  programs, including all States
except California.  In addition, Puerto Rico,  the Virgin Islands, and the
District of Columbia were included in the audit.   The California State
agency was not audited because the local  district agencies in California
are responsible for implementing the various air quality management
programs.  The local agencies audited were:*

         Allegheny County, PA            Philadelphia, PA
         Asheville, NC                   South Coast AQMD, CA
         Fresno County, CA               Southwest APCA, WA
         Jacksonville, FL                St. Louis, MO
         Lane County, OR                 Tampa, FL
         Nashville, TN                   Toledo,  OH
         Northwest APA, WA               Wayne County, MI


     The STAPPA, ALAPCO, and EPA encouraged State/local personnel from one
agency to serve as members of the audit team for another agency.  Four States
participated in this activity in FY 1985.  The agencies that participated
in the audit exchanges listed the following benefits of the program:

     0  It provides an opportunity to compare their agencies' programs with
        the host State's program to see what improvements can be "transplanted."

     0  It fosters communication with other agencies at the working level so
        that common problems can be shared.

     0  It provides general insight into what other agencies are doing.

All of the participating agencies indicated that the exchanges were
beneficial and that they would continue their  participation in the future
if resources allow.  Such audit team exchanges apparently offer an excellent
opportunity to learn firsthand how other States operate their programs.

     The audit teams varied in size; the number of auditors in an agency
at any one time rarely exceeded five.  Generally, the five program areas
were not all audited at the same time.  Also, not all program areas were
audited in each agency, because the five activities selected for audit
were not performed by all agencies.
*Additional  local agencies were included in the air monitoring audits
 because of the delegated responsibility of operating local  air monitoring
 networks.

                             2-2

-------
     EPA Headquarters personnel observed 10 audits in FY 1985.  This
served to provide a national overview of the audits and was part of the
quality assurance program to which STAPPA/ALAPCO and EPA had agreed.

     The protocol followed by the Regional Offices in conducting the
audits included advance preparation prior to the on-site visit, an initial
meeting with the agency director, discussions with agency staff, review
of the agency files, and an exit interview.

     The advance preparation involved, among other things, sending a
letter to the agency well in advance of the audit to confirm the date and
time and to identify the individuals performing the audit.  The guidelines
and questionnaires were also provided to the agencies with a request to
complete portions thereof and return them to the EPA Regional  Offices at
least 2 weeks before the scheduled visit.

     The site visits generally were conducted in four phases:

     0 The audit team met with the agency director and key staff to
       discuss the audit goals and procedures to be followed.

     0 The auditors discussed the questionnaire with the personnel in
       charge of each of the five audited activities.

     0 The agency files were reviewed to verify the implementation and
       documentation of required activities.

     0 An exit interview was held to inform agency management  of the
       preliminary results of the audit.

     The Regional Offices drafted an audit report after each site visit
and requested that each audited agency review it.  The individual agency
audit reports are used by EPA to compile and write each audit cycle report.

     With only minor deviations, the FY 1986-87 audit continued the NAAS
objective of investigating the ability of State and local air  pollution
control programs to achieve their air quality management objectives.  The
FY 1986-87 audit contained a number of improvements over previous audits.
Among the FY 1986-87 improvements were:

     0  A curtailment on audit topics for local agencies and agencies
        with I/M programs.  For local agencies, only those topics for
        which the specific implementing authority rests with the local
        agency received an audit.  Only those agencies with I/M programs
        that had been ongoing for 1 year or more were considered for an
        I/M audit.

     0  To even out audit workloads on the State and local agencies and
        the EPA auditors, STAPPA/ALAPCO and EPA agreed that the NAAS effort
        would be conducted biennially.  Each EPA Region would be responsible
        for ensuring that all State agencies and selected local agencies
        are audited within the 2-year period.  Instead of designated
        dates for the completion of an audit, the FY 1986-87 audit was
        tied to the Strategic Planning and Management System (SPMS).

                             2-3

-------
        Under SPMS, the Regional Office must forward a final audit report
        to OAQPS within 180 days after the audit is completed.

     0  Questions other than those contained in the audit manual could be
        appended only with the advance consent (at least 30 days) of the
        audited agency.  It was also made clear that this restraint did
        not preclude a Regional Office from inquiring about deficiencies
        that surfaced during the audit.
                           2-4

-------
                                CHAPTER 3

                 AIR QUALITY PLANNING AND SIP ACTIVITIES


3.1  EXECUTIVE SUMMARY

     Four major program components within the air quality planning and  SIP
activities area were evaluated in the FY 1986-87 audit.   These components
were air quality evaluation, emission inventories, modeling,  and SIP
evaluation.

     This section of the audit again took the form of survey questions
presented to the audited agencies prior to the on-site visit.  In some
instances, the questionnaire responses were verified by review of selected
program files.

     In summary, the FY 1986-87 audit revealed that the  majority of the
audited agencies have sound programs in most of these components,  but
there is still room for improvement.

Air Quality Evaluation

     This portion of the audit covers how air quality data are used by  State
and local agencies for the purpose of Section 107 redesignations,  trends
analyses, prioritization of air program activities, and  public information.
Three main areas are covered:  (1) air quality reports,  (2)  Section 107
redesignations, and (3) new violations.

     Ninety-three (93) percent of States and 75 percent  of local agencies
made air quality reports available to the public.  This  is a  decrease  from
the FY 1985 audit, in which 98 percent of local agencies made reports
available to the public.  Sixty (60) percent of the agencies publish
annual reports, while 3 and 13 percent publish semiannual and quarterly
reports, respectively.  Forty-eight (48) percent of the agencies
reported that they publish reports within 6 months of acquiring air quality
data.  This number is down from the FY 1985 audit, when  80 percent of  the
agencies published their reports within 6 months of acquiring the  data.

     Forty-seven redesignation actions were initiated (272 initiations in FY
1985) and of these, 40 actions were completed and submitted to EPA (247
completions and submissions reported in FY 1985 audit).   Approximately
27 percent of these completed and submitted actions resulted  in a redesig-
nation from attainment to nonattainment.

     Air quality monitoring detected a total of 60 new violations.  Of  these,
41 were at existing monitoring sites and 19 were at newly established  sites.
                                   3-1

-------
Emission Inventory

     Fifty-nine control agencies were interviewed and asked  to  complete a
questionnaire on four aspects of their emission inventory program:   (1) uses
of criteria pollutant emission inventories,  (2) volatile organic  compound
(VOC) inventories in ozone (03) nonattainment areas, (3) carbon monoxide
(CO) inventories in CO nonattainment areas, and (4) highway vehicle
inventories in 03 and CO nonattainment areas.

     A review of criteria pollutant emission inventories shows that
approximately 75 percent of the reasonable further progress (RFP) areas
are being tracked by about 60 percent of the responsible agencies.
Furthermore, it appears that the majority of VOC emission inventories
have been updated during the last year.

     These two findings indicate room for improvement in RFP management.
Only approximately 30 percent of the local agencies currently  have  a  formal
quality assurance (QA) program.   Audit results strongly suggest the need
for additional guidance and improved emission factors for certain aspects
of the inventory program.

     Adequate CO emission inventories are being maintained in 85 percent
of the CO nonattainment areas.  Most agencies are utilizing one or more
of the recommended techniques (nonreactive VOC, 03 seasonal  temperature)
to adjust their VOC inventories.  Thirty-seven (37) percent  of  the  agencies
include woodstoves in their inventories, and approximately 74  percent of
the agencies are using mobile source inventories to locate potential  CO
hot spots.

     Investigation of highway vehicle inventories indicates  that  53 per-
cent of the inventories are compiled and maintained by local agencies or
transportation departments not usually involved in air pollution  control.
This arrangement appears satisfactory, since most agencies use  approved
techniques to adjust their inventories to local conditions and  to update
their highway inventories.

Modeling

     For FY 1986-87, State and local agencies were queried concerning (1)
experience and training of agency personnel  who conduct modeling, (2)
availability of various guideline models, other EPA recommended models,
nonguideline models, screening models, and other models, (3) uses of non-
EPA recommended modeling techniques, and (4) EPA Regional Office  review
and revision of modeling applications conducted by the State and  local
agencies.

     The audit indicated that most agencies are capable of performing
and reviewing routine modeling analyses.  Even personnel who lack degrees
in meteorology, engineering, or science, or who have no formal training
in modeling, generally have practical experience in modeling.

     State and local agencies generally have access to a wide  variety
of dispersion models and feel most comfortable with those that  predict

                                   3-2

-------
concentrations of particulate matter, sulfur dioxide, and other nonreactive
pollutants from sources located in simple terrain.  Agencies  tend to have
less access to, and familiarity with, models used to determine impacts
from mobile sources and ozone models such as EKMA and the Urban Airshed
Model.

     State and local agencies reported using nonguideline modeling
techniques in only about 10 percent of modeling analyses, compared to 20
percent in FY 1985.  The most common reason given for using a nonguideline
technique was the technical superiority of that method over the one
recommended in the guideline.  There were also 321 instances
of guideline recommendations on the use of data bases not being followed.
Of these, 93 percent failed to use 5 years of offsite meteorological  data
or 1 year of onsite data.  There are few cases in which the EPA Regional
Office review of State or local agency analyses revealed the  need for a
revised analysis.

SIP Evaluation

     The FY 1986-87 SIP evaluation portion of the audit was designed to
assess whether State plans for attainment are being reasonably carried
out and to identify agency needs in developing and updating their SIPs.
Questions asked during this audit included inquiries on:  (1) timeliness
of regulatory development, (2) timeliness of studies, (3) transportation
control measures, (4) source activity, and (5) approved generic bubble rules.

     While the majority of agencies have made progress in submitting
required rules, 29 percent of the SIP revisions approved as part of the
SIP control strategy had not been completed, and of those completed, 24
percent were not on schedule.  This is a substantial reduction from the
44 percent of SIPs uncompleted and the 43 percent not on schedule in FY
1985.  The audit identified three major causes for these delays:  overly
optimistic schedules, insufficient resources, and other reasons specific
to particular revisions.

     One-fourth of the additional studies that the agencies had committed
to complete as part of their SIP control strategies had not been completed
and, of these studies, 56 percent were behind schedule.  These figures
show a considerable improvement over FY 1985 when only 52 percent of
additional studies were completed and 64 percent of the remaining studies
were behind schedule.

     For transportation control measures (TCMs) to control VOC and CO,
it was found that 74 percent (27 percent in 1985) of the traffic flow
improvements, 55 percent of mass transit, 80 percent of carpooling, 89
percent of vehicle inspection/maintenance, and 68 percent of  the remaining
measures had been implemented.

     The audit revealed differences in values between growth  projections
for SIP planning and the actual growth that occurred.  The absolute
differences were 1 percent per year for population growth (in 22 metropo-
litan areas), 1.8 percent per year for employment projections, 1.8 percent
per year for vehicle miles traveled, and 1.4 percent per year for major
source emissions.  Many respondents, after finding significant differences

                                   3-3

-------
between their projections and the actual  growth figures,  contemplated
making revisions to upcoming RFP reports or to the SIPs themselves.

     Based upon data from the audit, only 18 percent of the States  have
bubble rules formally approved by EPA.  Source emissions  in these  States
dropped overall to rates at or below allowable emission standards  after
the bubbles were implemented.  For the 11 States with bubble rules  not
formally approved, source emission rates increased, but 67 percent  were at
rates at or below the allowable emission standards once the bubbles  were
implemented.  Nonetheless, these States are placing the "impacted"  sources
in jeopardy of EPA actions to enforce the original SIP limits.

3.2  AIR QUALITY EVALUATION

Introduction

     The National Air Audit is one of EPA's mechanisms for measuring the
effectiveness of State and local air pollution control programs  and
identifying the need for Federal assistance to support such programs.
The audit program is divided into five categories:  Air
Monitoring, Air Quality Planning and SIP Activities, New Source  Review,
Compliance Assurance, and Vehicle Inspection/Maintenance.  At the  conclu-
sion of each audit period, EPA analyzes the audit data gathered  by  EPA
Regional  "auditors" and publishes a document that summarizes the findings,
identifies the major issues, and recommends solutions to specific  problems.

     This section contains detailed information on how air quality  data
are used by State and local agencies for the purpose of Section  107
redesignations, trends analyses, prioritization of air program activities,
and public information.  Three main areas are covered:  air quality
reports, Section 107 designations, and new NAAQS violations.  The  total
number of agency respondents was 61.

     In the air quality reports area, the audit determined how frequently
agencies published air quality monitoring data, the contents of  the  air
quality reports, and the time lag between data acquisition and publication.

     The second area focused on how Section 107 attainment status  desig-
nations were being reviewed and changed.  The number of redesignations
and methods used to review attainment status were also requested in  the
questionnaire.

     The intention of the third area was to determine the number of  air
monitors that revealed any new violations and the type of action taken  for
each violation.

Major Findings and Conclusions

     Air Quality Reports

     The air quality evaluation portion of the audit showed that 93  percent
of State agencies and 75 percent of local agencies made air quality reports
available to the public.  This indicates a decrease from the FY  1985
audit, in which 98 percent of local agencies made reports available  to  the
                                   3-4

-------
public.  Sixty (60) percent of the agencies publish annual  reports,  while 3
and 13 percent of the agencies publish semi-annual and quarterly reports,
respectively.

     Forty-eight (48) percent of the agencies reported that they publish
reports within 6 months of acquiring air quality data.  This is also down
from FY 1985, when 80 percent of the agencies published reports within 6
months of acquiring the data.

     Section 107 Designations

     Seventy-three (73) percent of the State and local agencies reviewed
Section 107 primary and secondary attainment status designations,  versus
83 percent reported in the FY 1985 audit.  Eighty-two Section 107  reviews
were completed that did not result in a request for redesignation,  as
opposed to 253 reviews in the FY 1985 audit.

     Forty-seven redesignation actions were initiated (272  in FY 1985) and,
of these, 40 actions were completed and submitted to EPA (247 completions
and submissions reported in the FY 1985 audit).  Approximately 27  percent
of these completed and submitted actions resulted in a redesignation from
attainment to nonattainment.

     New Violations

     Air quality monitoring detected a total of 60 new NAAQS violations.
Of these, 41 were at existing monitoring sites and 19 were  at newly
established sites.  The most common action taken in these cases was  a
microscopic examination of filters.

Responses to Individual Questions

     Air Quality Reports

     The purpose of this audit section was to determine:  (1) the  extent
to which State and local agencies make air quality reports  available to
the public, (2) the contents of the air quality reports, and (3) the time
lag between data acquisition and report publication.  The specific  questions
asked are summarized in Table 3.1.

     The audit determined that 37 of the 40 reporting State agencies (93
percent) and 15 of the 20 local agencies (75 percent) make  air quality
reports available to the public.  In comparison to the FY 1985 audit, the
percentage of State agencies providing air quality reports  to the  public
remains constant; however, the number of local agencies that provide
reports to the public was down 15 percent from the previous audit.

     While there is no specific requirement or commitment for agencies to
issue air quality reports in a timely manner, EPA sought to determine in
the audit the frequency of such reports and the time lag between data
acquisition and the release of the reports.  Of the 60 reporting agencies,
36 agencies (60 percent) publish reports annually, 2 agencies (3 percent)
publish reports semi-annually, 8 agencies (13 percent) publish quarterly
reports, and 4 agencies (7 percent) publish reports on a monthly basis.
                                   3-5

-------
                                 TABLE 3.1

             SPECIFIC QUESTIONS REGARDING AIR QUALITY REPORTS

                                                      Yes              No          Not Answered
Question                                         Agencies Percent Agencies Percent Agencies Percent

Did the report indicate whether data shown
have been quality assured according to EPA
guidelines?                                         35      58      15      25      10      17

Did the report cover all criteria pollutants?       46      77       5       8       9      15

Did the report cover all monitors within the
agency's jurisdiction?                              50      83       1       2       9      15

Did the report describe the completeness of
the data collected during the reporting
period?                                             39      65      10      17      11      18

Did the report indicate comparisons to NAAQS
on an appropriate basis (i.e., expected
exceedances for ozone and highest 2nd high
for CO, etc.)?                                      48      80       3       5       9      15

Did the report summarize historical air
quality data by year since the start of
the monitoring program in that area?                31      52      19      32      10      16

Did the report present maps of the
nonattainment areas and maps depicting
the locations of the ambient monitors?              33      55      15      25      12      20

Did the report describe each of the
pollutants?                                         36      60      14      23      10      17

Did the report briefly describe each of
the sampling techniques?                            39      65      11      18      10      17

                                   3-6

-------
     All but three of the reporting agencies publish air quality reports
within 12 months of acquiring the data.  Three of the agencies (5 percent)
reported a 0 to 2 month lag time, 26 agencies (43 percent) reported a 3
to 6 month lag time, and 19 agencies (32 percent) reported a 7 to 12
month lag time.  Nine of the agencies did not respond to the question.

     Section 107 Attainment Status Designation

     The audit determined that 44 State and local agencies  (73 percent)
systematically reviewed Section 107 primary and secondary attainment
status designations and submitted proposed changes to EPA.   This figure
is somewhat lower than the 83 percent determined in the FY  1985 audit.
Fifteen of the 60 agencies (25 percent) did not review Section 107 attainment
status designations and one of the agencies (2 percent) did not respond
to the question.

     The audit showed that 82 Section 107 reviews were completed that did
not result in a request for redesignation.  This is considerably lower
than the 253 reviews reported in the FY 1985 audit.

     Forty-seven redesignation actions were initiated and,  of these, 40
actions were completed and submitted to EPA.  A breakdown of the results
is presented in Table 3.2.  Again, these numbers are considerably lower
than the 272 initiations and the 247 completions and submissions reported
in the FY 1985 audit.  Of the 40 submitted actions, EPA has published
final  rulemaking for all redesignations to nonattainment, 20 redesignations
from nonattainment to attainment or unclassifiable, and 4 redesignations
from unclassifiable to attainment.

     The methods used by the agencies to review attainment  status are
shown in Table 3.3.  Five methods were considered:  (1) staff notification
of violations to agency management, (2) consideration of air quality data
for the last 2 or 3 years, (3) consideration of modeled exceedances, (4)
consideration of method control and emission changes, and (5) investi-
gation into causes of violations.  The method most commonly used was
consideration of recent air quality data (used by 87 percent of the
reporting agencies).  Consideration of modeled exceedances  was used the
least of the five methods (used by 48 percent of the reporting agencies).

     The audit showed that there were 10 attainment or unclassified
areas under a "call for SIP revision".  Three agencies intended to submit
a request for redesignation to nonattainment for these areas.

     New Violations

     The purpose of this section of the audit was to determine the number
of new violations of the NAAQS that occurred during the 1985 to 1986
period, and how the agencies dealt with these violations.  A new violation
is defined as a violation at a monitoring site that had been violation-free
for the previous 3 years.  The following are actions that were taken by
the agencies in response to a violation:
                                   3-7

-------
                                 TABLE 3.2

              AREA REDESIGNATIONS BY STATE AND LOCAL AGENCIES

Type of Redesignation                          Initiated    Completed & Submitted

Attainment to nonattainment*                       4                  4

Nonattainment to attainment or
unclassifiable                                    25                 20

Primary nonattainment to secondary
nonattainment                                     10                 10

Secondary nonattainment to primary
nonattainment                                      3                  2

Unclassifiable to attainment                       5                  4

Unclassifiable to nonattainment
(e.g., PM10, SO2, & NOx)                           0                  0

Total                                             47                 40

* For 03 or CO, an "attainment" designation also includes "unclassifiable."

                                   3-8

-------
                                 TABLE 3.3

                 METHODS USED TO REVIEW ATTAINMENT STATUS

                                            Yes              No          Not Answered
Method                                 Agencies Percent Agencies Percent Agencies Percent

Staff notification of violations
to agency management                      45      75       4       7       11      18

Consideration of air quality
data for the last 2 to 3 years            52      87       0       0        8      13

Consideration of modeled
exceedances                               29      48      19      32       12      20

Consideration of method control
and emission changes                      38      63      11      18       11      18

Investigation into causes
of violations                             41      68       8      13       11      18

                                   3-9

-------
     Microscopic laboratory examination of filters
     Additional monitoring
     Data ignored (documented exceptional  event)
     Enforcement action based on existing SIP regulation
     Source-specific, local, county, or areawide SIP revision
     Verification by modeling studies
     Change in an individual source permit
     Other action

If no response was given by a reporting agency,  it was assumed  that no
action was taken.

     The results of this audit section are presented in Table 3.4.   A
total of 41 new violations at existing air monitoring sites were reported
in this audit.  In the majority of instances, these violations  were dealt
with by the agency in the following ways:   by a  microscopic examination  of
filters (22 percent), data were considered to be the result of  a documented
exceptional event and were ignored (22 percent), or by some other means
(24 percent).

     Nineteen violations were reported at newly established sites.  Of
these, 37 percent were dealt with by microscopic examination of filters,
and 32 percent were followed up with additional monitoring.

3.3  EMISSION INVENTORY

Introduction

     Fifty-nine control agencies were interviewed and asked to  complete  a
questionnaire on four aspects of their emission  inventory programs:

          1.  Uses of criteria pollutant emission inventories

          2.  Volatile organic compound (VOC) inventories in ozone  (03)
              nonattainment areas

          3.  Carbon monoxide (CO) emission inventories in CO nonattain-
              ment areas

          4.  Highway vehicle inventories in 03  and CO nonattainment areas.

There are 63 State implementation plans (SIPs),  but 4 agencies  do not
operate significant inventory programs.  Hence,  44 State, 13 local, and  2
territorial programs were involved in this audit.

     It is not coincidental that all four of the interview subjects deal
at least in part with 03 and/or CO nonattainment.  Nonattainment of
ambient air quality standards for these two pollutants is a significant
problem that EPA must expect to deal with for the next several  years. A
complete, current emission inventory is an essential starting point in
developing programs and strategies for reducing  the ambient concentrations
of 03 and CO in nonattainment areas.  Therefore, the 1986-87 National Air
Audit System program for emission inventories has been directed at
obtaining information on the status, maintenance, and associated problems
                                   3-10

-------
                                TABLE 3.4

        ACTIONS TAKEN FOLLOWING AIR QUALITY MONITORING VIOLATIONS
                                  New Violations      Violations at Newly
	Action	at Existing Sites	Established Sites

Microscopic examination
of filters                              9                       7

Additional monitoring                   4                       6

Data ignored due to a documented
exceptional event                       9                       0

Enforcement action based on
existing SIP regulations                6                       1

Source-specific, local,
county, or areawide SIP
revisions                               2                       1

Verification by modeling
studies                                 0                       2

Change in an individual
source permit                           1                       0

Other                                  10                       2
   Total                               41                      19
                                   3-11

-------
of 03 and CO nonattainment emission inventories.  It should be pointed out
that approximately 50 percent of the agencies participating in this
program were audited in each of the last 2 years.  The audit does not
represent a status of the program at one point in time, so some types of
comparisons may be inappropriate.

Major Findings and Conclusions

     The reasonable further progress (RFP) tracking process for 03
nonattainment areas is being carried out for approximately 75 percent of
the RFP areas by about 60 percent of the agencies responsible.  Similarly,
it appears that 60 to 70 percent of VOC emission inventories have been
updated during the last year.

     These two facts suggest that improvement is needed in the RFP manage-
ment area.  Only about 30 percent of the local agencies have a formal
quality assurance (QA) program, and so this area should receive additional
attention from both Headquarters and the Regional Offices.  Clearly,  a
need is indicated for better guidance and improved emission factors for
some aspects of the inventory program.  Most agencies are utilizing the
EPA-recommended techniques (e.g., nonreactive VOC, 03 seasonal  temperature)
to adjust their VOC inventories.

     CO emission inventories are maintained, to one degree or another, for
85 percent of the CO nonattainment areas.  In addition to inventories of
CO emissions, 74 percent of the agencies are using traffic inventories to
locate potential CO hot spots.

     The audit questionnaire results indicate that 53 percent of the  high-
way vehicle inventories are compiled and maintained by a local  agency or
transportation department not usually involved in air pollution control.
Most of the air pollution agencies that do not maintain the highway
inventory themselves have found this arrangement acceptable, generally
because resources or expertise are lacking within the air agency itself.
Most agencies are utilizing approved techniques to adjust their inventories
to local area conditions.  An analysis of the responses indicates that at
least 60 percent of the highway inventories (transportation data, vehicle
miles traveled, etc.) have been updated within the last 2 years.

Responses to Individual Questions

     To facilitate discussion of the results and conclusions of the audit,
a composite of the questionnaires is included for each of the four aspects
of the inventory section.  The composite questionnaire is highlighted by
double indentation and a vertical line along the left margin throughout
the report.  In general, the number of agencies responding to each of the
four sections was dependent on their attainment status.  This number  is
in parentheses directly below the section title.

     There has been no attempt in the composite questionnaire to summarize
all of the "other" responses.  Also, Regional Office comments are not re-
flected on the composite form.  Nearly all Regional Office comments
either confirmed the agency response or elaborated on needs the agency
identified.
                                  3-12

-------
Section B.1.  Uses of Criteria Pollutant Emission Inventories

     This section was intended to determine agency compliance with the
Clean Air Act requirements to maintain emission inventory data to ensure
RFP in 03 nonattainment areas.  In addition to the information  concerning
RFP areas, this section obtained information about other uses of inventories,
quality assurance, and what EPA can do to improve emission inventories.

     B.1.  Uses of Criteria Pollutant Emission Inventories
           (28 agencies)

           Emission inventories are used in a number of applications by air
     pollution control agencies, including the demonstration of reasonable
     further progress (RFP) in 03 nonattainment areas.  The following
     questions deal with the uses of emission inventories by your agency.

     State/local  Agency Response

          a.  RFP Toward Attainment of 03 Standards

              The Clean Air Act requires SIP's to provide for maintenance
     of emission  inventories to ensure RFP.  In the past year,  did your
     agency actually track and compare changes in the VOC emission inven-
     tories with  projected changes in emissions given in the respective
     03 curves in the SIP's?  Check yes or no for each nonattainment area
     listed below.  If no, insert the letter code best representing the
     reason RFP is not tracked.
     Areas where 03 RFP should be tracked:             Yes     No

       (Regional Office to provide list)

       [28 agencies responsible for 43 RFP areas]

       Agencies:                                        19      9

       RFP areas:                                       31     12

       Reasons listed by agencies:  A-2, B-2, C-4, D-l, E-5
     1 This question applies only to those areas specified by the Regional
       Office as 03 extension areas and areas where EPA has called for SIP
       revisions.  (EPA Regional Offices must provide a list of these
        areas.)  Note that RFP tracking means compiling a realistic estimate
        of an individual year's emissions and comparing this to the appro-
       priate year in the SIP RFP curve.  Merely compiling air quality data
       trends as an alternative to compiling emissions data is not accepted
       as RFP tracking.
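The tracking comparison defined in this footnote can be sketched as a short calculation.  The base-year inventory, attainment target, and tracked-year estimate below are hypothetical figures, and the linear RFP curve is an illustrative assumption (actual SIP curves need not be linear):

```python
# Sketch of RFP tracking as defined above: compare a compiled estimate of a
# year's actual VOC emissions against the allowable level on the SIP RFP
# curve for that year.  All figures are hypothetical.

def rfp_allowable(base_year, base_emissions, attainment_year,
                  target_emissions, year):
    """Allowable emissions for `year`, linearly interpolated along the
    SIP RFP curve from the base-year inventory to the attainment target."""
    fraction = (year - base_year) / (attainment_year - base_year)
    return base_emissions + fraction * (target_emissions - base_emissions)

# Hypothetical area: 1980 base inventory of 100 tons/day VOC, 1987
# attainment target of 65 tons/day, tracking year 1985.
allowable = rfp_allowable(1980, 100.0, 1987, 65.0, 1985)
actual = 78.0  # compiled estimate for 1985 (hypothetical)

print(f"1985 allowable: {allowable:.1f} tons/day")
print(f"1985 actual:    {actual:.1f} tons/day")
print("RFP demonstrated" if actual <= allowable else "RFP shortfall")
```

Merely tracking air quality trends, as the footnote notes, would skip this emissions-to-curve comparison entirely.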
                                 3-13

-------
CODE

A.  Alternative tracking mechanisms are used (e.g., air quality
    data), not directly involving emission inventories

B.  RFP tracking is not considered a priority task

C.  Insufficient resources available to track RFP by maintaining
    an up-to-date emission inventory

D.  Insufficient guidance available on how to do RFP tracking

E.  Other (specify)	
    b.  RFP Report:  If "yes", did your agency, in the past year:

        1.  Prepare a report on RFP?  16  yes 12  no
            (26 reports, prepared by 16 agencies)

        2.  Submit the report to EPA?  16  yes 12   no

        3.  Make the report available for public comment?
            10  yes  6  no

        For EPA Regional Office response:

        4.  Has the RO received the above reports?
            26  yes ___ no

        5.  If "yes", has the RO commented or otherwise responded
            to the State or local agency?
            18  yes  8  no (reports)

        6.  Are you assured that any progress indicated is  the
            result of real emission reductions rather than  changes
            in methodologies, emission factors, etc.?
            21  yes  5  no RFP areas
    c.  RFP Emission Inventory Update:  For each of the areas
        where 03 RFP should have been tracked in the past year,
        what approximate percentage of VOC emissions in the
        inventory was updated for the major, minor, and mobile
        source categories that year?  The areas are the same as
        those listed by the Regional Office in Question a.  Use
        one of the following percent ranges in the table:

            0-19 percent,   20-39 percent,   40-59 percent,
           60-79 percent,   80-100 percent
                        3-14

-------
                             Number of RFP Areas

     Areas where 03 RFP should be tracked (see question a.)

     Percentage                     Stationary Sources
     Range of VOC                        Minor         Minor        Mobile
     Inventory Updated     Major       Regulated    Unregulated    Sources
                           % RO*        % RO*         % RO*         % RO*

     1.   0-19               4            5             10             9
     2.  20-39               0            6              9             0
     3.  40-59               1            2              1             1
     4.  60-79               1            1              0             0
     5.  80-100             30           19             11            26

     No data reported        7           10             12             7
* The Regional  Office auditor should ask the agency for
  documentation on the extent of updating the RFP inventory and
  initial  in this column if documentation appears adequate.
    d.  Other Uses of Emission Inventories

        Indicate below other uses that were made of your agency's
criteria pollutant emission inventories in the past year, not
necessarily just in nonattainment areas.  (Check "x" where
appropriate.)

                                     PM  SO2  NOX  VOC  CO  Pb

        1.  Used for developing
            and evaluating area-
            wide control strategies  25   24   17   27  20  11

        2.  Used as input to
            dispersion and other
            air quality models       39   43   28   18  20  14

        3.  Used to project
            possible areas of high
            pollutant concentrations
            to help place ambient
            monitors                      26   11       16  12

        4.  Used for source permits
            or inspections, includ-
            ing for assessing permit,
            operating, and inspec-
            tion fees                42   41   37   35  35  22

        5.  Used for responding to
            information requests     48   49   44   46  40  24
                              3-15

-------
                                     PM  SO2  NOX  VOC  CO  Pb

        6.  Used indirectly for
            general program          37   34   30   38  31  22
            planning

        7.  Other uses (specify)  13 - a variety of special
            projects, enforcement/compliance, and toxic evaluations.
    e.  Quality Assurance
        (59 agencies)

        Quality (or validity) assurance for emission inventories
involves checks of the procedures, emission factors, calculations,
etc., that were used during compilation as well as checks for
missing sources and edit checks for reasonableness.  Specify below
the statement that best describes the quality assurance measures
that are conducted on your State's emission inventories.

        1.  20   Formal, rigorous, regular checks are implemented

        2.  36   Less formal, spot checks are made on an
                 irregular basis

        3.   0   No quality assurance measures are implemented

        4.   3   No response
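The compilation and edit checks described above can be illustrated with a minimal sketch.  The record fields, bounds, and sample data are hypothetical, not drawn from any agency's actual system:

```python
# Minimal sketch of emission inventory edit checks: flag records with
# missing sources or implausible values.  Field names and bounds are
# hypothetical, not from any EPA or agency system.

def edit_check(record):
    """Return a list of reasonableness problems found in one record."""
    problems = []
    if not record.get("source_id"):
        problems.append("missing source ID")
    voc = record.get("voc_tons_yr")
    if voc is None:
        problems.append("missing VOC estimate")
    elif voc < 0:
        problems.append("negative emissions")
    elif voc > 10000:
        problems.append("implausibly large emissions; verify units")
    return problems

inventory = [
    {"source_id": "S-001", "voc_tons_yr": 42.5},
    {"source_id": "",      "voc_tons_yr": 12.0},
    {"source_id": "S-003", "voc_tons_yr": 1.2e6},  # lb likely entered as tons
]

for rec in inventory:
    for problem in edit_check(rec):
        print(rec.get("source_id") or "(blank)", "->", problem)
```

Checks for missing sources (e.g., comparing against permit files) would sit alongside such record-level edits.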
    f.  What should EPA do to help you make your criteria
pollutant inventories more comprehensive, accurate, and current?
(Check "x" where appropriate.)

                                        Current Data
                        Very            and Guidance  No Strong
                     Important  Useful  Adequate      Opinion

1.  Provide better
    guidance on...
    -Point sources      12        12        26            6
    -Area sources       20        19         9
    -Highway vehicle     8        14        18           16
    -Locating sources    3         8        26
    -Questionnaire
       design            3         7        30
    -Quality assurance   6        18        20
    -Data handling      16         6        24
                               3-16

-------
                                              Current Data
                              Very            and Guidance  No Strong
                           Important  Useful  Adequate      Opinion

          -Reflecting SIP
           pegs in projec-
           tion inventory      4        11         16          23

          -Other (specify) 11 - a variety of guidances and emission
                           factors are needed

      2.  Improve emission
          factors in AP-42    33        15          5           2

      3.  Provide
          computerized
          systems having
          better data
          handling            27        10         11
          capabilities

      4.  Other (specify) 16

      EPA Regional Office Response (Confirmation or Comment)
     The composite questionnaire indicated that RFP is tracked for approxi-
mately 75 percent of the required areas by 19 of the 28 responsible agen-
cies.  Resource constraints and incomplete SIPs appear to be the primary
reasons that RFP has not been tracked in the remaining areas.  In general,
updated emission estimates for more than half of the VOC sources have been
obtained within the last year, ranging from 60 percent of the RFP areas
for regulated minor sources to 75 percent for major point sources; the
exception is unregulated minor sources, which were updated for approximately
48 percent of the areas.

     The composite form shows that most of the 59 responding agencies use
criteria pollutant inventories for a wide variety of purposes.  The most
frequent uses are related to permit and inspection programs, response to
information requests, and general program planning.  There appears to be
some indication that additional guidance, and perhaps resources, should
be devoted to an emission inventory quality assurance program, since 60
percent of the agencies characterized their inventory quality assurance
as "less formal, spot checks on an irregular basis."  However, about a
third of the agencies characterized their quality assurance as "formal,
rigorous, regular checks." The questionnaire did not define what formal or
rigorous meant; therefore, the response to the questionnaire was left to
the discretion of the auditor.  Additionally, the audited agencies clearly
indicated a need for better guidance with respect to area sources and
quality assurance, and improved emission factors.
                                 3-17

-------
Section B.2.  VOC Inventories in 03 Nonattainment Areas

     The questions asked in this section were designed  to determine the
methods, extent, and basis for VOC emission inventory adjustments.   Other
aspects of this section were intended to obtain information about  the
inclusion of recently identified VOC sources, and to determine  the use  of
an EPA per-capita emission factor for miscellaneous solvents.  Similar to
the inventory uses section, agencies were asked how EPA could help  to
improve their VOC emission inventories with respect to  adjustments,
exclusions, and additional or better emission factors.

     B.2.  VOC Inventories in 03 Nonattainment Areas
           (29 agencies)

          Current guidance requires that VOC inventories in 03
     nonattainment areas be adjusted in various ways to reflect reactive
     emissions occurring during the 03 season.  The following questions
     address the adjustments made in your agency's VOC  inventory and
     apply to all components of the inventory; i.e., point, area,  and
     highway vehicle sources.  (NOTE:  if your State contains no
     nonattainment areas for 03, check here [ 30 ] and  go to B.3.)

     State/local Agency Response

          a.  EPA guidance specifies that methane, ethane, methylene
     chloride, methyl  chloroform, trifluoromethane and  six chloro-
     fluorocarbons should be excluded from 03 SIP inventories as
     nonreactive.  Indicate below your agency's exclusion of nonreactive
     VOC compounds from its 03 SIP inventory.
     (Check "x" where appropriate.)


              1.   2   No VOC compounds have been excluded as non-
                       reactive (if checked, go to question b.)

     Check the compounds that are excluded from the 03  SIP as
     nonreactive.

              2.  27   Methane

              3.  23   Ethane

              4.  21   Methylene chloride

              5.  24   Methyl chloroform

              6.  21   Trifluoromethane

              7.  21   Chlorofluorocarbons CFC-11, CFC-12,
                       CFC-22, CFC-113, CFC-114, CFC-115

              8.   3   Others - specify compounds  CO, CO2, benzene,
                       carbonic acid, metal carbides
                              3-18

-------
          9.  13   The agency excludes the following compounds
                   based on vapor pressure cutpoints:  10-0.1 mmHg;
                   other cutpoints (0.002 lb/in2, 0.01 kPa, 2.0 kPa)
                   used by some agencies for specific facilities


     b.  What technical  basis does your agency use to  identify and  quantify
nonreactive VOC?  (Check "x" where appropriate.)


         1.   2   Nonreactive VOC not excluded,  so question  not  applicable

         2.  12   Use EPA's VOC Species Data Manual

          3.  16   Use MOBILE3 option to generate nonmethane VOC emission
                  factors for highway vehicles

         4.   7   Use general species profiles from the literature

         5.  11   Sources are asked to list nonreactive compounds in
                  their VOC emissions

         6.   6   Other (specify) 	
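The speciation-based exclusion reflected in the responses above can be sketched as follows.  The compound list follows the EPA guidance quoted in question a, but the species mass fractions are hypothetical illustrations, not values from the VOC Species Data Manual:

```python
# Sketch of excluding nonreactive compounds from a total-VOC estimate using
# a species profile.  The profile fractions below are hypothetical, not
# actual values from EPA's VOC Species Data Manual.

NONREACTIVE = {"methane", "ethane", "methylene chloride", "methyl chloroform",
               "trifluoromethane", "CFC-11", "CFC-12", "CFC-22",
               "CFC-113", "CFC-114", "CFC-115"}

def reactive_voc(total_voc_tons, species_profile):
    """Subtract the nonreactive mass fraction from a total-VOC estimate.
    `species_profile` maps compound name -> mass fraction of total VOC."""
    nonreactive_fraction = sum(frac for name, frac in species_profile.items()
                               if name in NONREACTIVE)
    return total_voc_tons * (1.0 - nonreactive_fraction)

# Hypothetical source: 100 tons/yr total VOC, 15% methane, 5% ethane.
profile = {"methane": 0.15, "ethane": 0.05, "toluene": 0.80}
print(reactive_voc(100.0, profile))  # 80.0 tons/yr reactive VOC
```

Compounds outside the guidance list (e.g., benzene, noted later in this section) would not be subtracted by such a routine.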
     c.  Higher summertime temperatures can have a marked  impact  on the
levels of evaporative VOC from several important source categories.
Likewise, lower summertime volatilities of gasolines,  as reflected by lower
Reid Vapor Pressures (RVP), can also have an impact.  Have your VOC totals
been adjusted for conditions representative of the 03  season?

         1.  19_ Yes  10  No

If yes, check the appropriate statement(s) below:

         2.  18   Higher 03 season temperatures have been  considered in
                  generating highway vehicle emission  factors

         3.  15   Higher 03 season temperatures have been  considered in
                  estimating evaporative losses from petroleum product
                  (including gasoline) storage and handling.

         4.   7   Lower summertime RVP's have been considered in  estimating
                  evaporative losses from gasoline storage and handling


     d.  A number of source categories have recently been identified as
being potentially significant VOC emitters that have not traditionally
been included in VOC inventories, especially those related to fugitive
and/or waste treatment processes.  Have the following  sources been included
in your agency's VOC inventory?  (Specify "yes" or "no," or "N/A" if no
such sources are located in your area.)
                                 3-19

-------
                                                           Yes   No   NA

          1.  POTW's (Publicly Owned Treatment Works,
              i.e., sewage treatment plants)                 3   26    0

          2.  TSDF's (Treatment, Storage and Disposal
              Facilities for hazardous wastes, including
              landfills, surface impoundments, waste
              piles, storage and treatment tanks, hazardous
              waste incinerators, and injection wells)       9   20    0

          3.  Fugitive leaks from valves, pump seals,
              flanges, compressors, sampling lines,
              etc., in organic chemical manufacturing
              facilities (esp. SOCMI)                       15    8
     e.  EPA has recommended an emission factor of 6.3 lb/capita/year to
account for miscellaneous commercial/consumer solvent use in urban areas.
Check below the one response that best describes your agency's handling of
this source category.

          1.  25   EPA per-capita factor used both in base year and projected
                   attainment year inventories

          2.   1   EPA factor used in base year but a lower factor used in
                   attainment year inventories, projecting lower commercial/
                   consumer solvent use in future

          3.   3   Non-EPA factors used in both base year and attainment
                   year inventories

          4.   0   This source category not considered in VOC inventory
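Applying the per-capita factor is a single multiplication.  The population figure below is hypothetical; only the 6.3 lb/capita/year factor comes from the question above:

```python
# Sketch of applying the EPA-recommended per-capita factor for miscellaneous
# commercial/consumer solvents (6.3 lb/capita/year, as cited above).
# The urban-area population is a hypothetical example value.

LB_PER_CAPITA_YR = 6.3
LB_PER_TON = 2000

population = 500_000  # hypothetical urban-area population
tons_per_year = population * LB_PER_CAPITA_YR / LB_PER_TON
print(f"Miscellaneous solvent VOC: {tons_per_year:.0f} tons/yr")  # 1575 tons/yr
```

A projected attainment-year estimate would repeat the calculation with the forecast population (and, per response 2, possibly a lower factor).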
     f.  In question B.1.f, we asked how EPA could help you on your
criteria pollutant inventory.   Indicate below where your  agency  specifically
feels better information or  guidance is needed to improve  its VOC  inventory.
(Check "x" where appropriate.)

                                             Current
                                             Data or
                            Very             Guidance  No  Strong
                          Important  Useful   Adequate  Opinion
     1.  Excluding
         nonreactive VOC                  11        20

     2.  03 season
         adjustment of
         VOC totals                       10
                                 3-20

-------
                                             Current
                                             Data or
                            Very             Guidance   No Strong
                          Important  Useful  Adequate   Opinion

     3.  Emission factors for
         sewage treatment
         plants and hazardous
         waste treatment,
         storage, and disposal
         facilities               11      11       1        2

     4.  Updated commercial/
         consumer solvent factors 13      10       2        4

     5.  Other (specify)  12 - need additional guidance and new or improved
                          emission factors

     The agency responses shown on the composite questionnaire indicate
that many agencies (72 to 93 percent)  are excluding one or more of  the
EPA specified nonreactive hydrocarbon  compounds from their 03 nonattainment
SIP inventories.  This range of percentages  covers the percentage exclusion
for each of the listed nonreactive pollutants.   In general,  this exclusion
is based on the VOC Species Data Manual, MOBILE3 options, or source
listings of nonreactive compounds.  Some  of  this last  group, particularly
benzene, are not excludable according  to  EPA guidance.

     Responses to questions dealing with  the effects of higher summertime
temperatures on VOC losses show that 65 percent, or 19 agencies, adjust
their inventories to account for these differences. Specifically,  18
agencies consider temperature when generating their highway  emission
factors.  Similarly, 15 agencies adjust values  for evaporative loss from
storage and handling facilities, and 7 utilize lower summertime RVPs
to adjust losses from these facilities.

     At the time of this audit, the majority of the agencies were not
including VOC emissions from newly identified potentially  significant VOC
sources.  Three such potential  sources, POTWs,  TSDFs,  and  fugitive  leaks
from valves, pumps, etc., were addressed  in  the questionnaire.  Only
about 30 percent of the inventories included emissions from  TSDFs and a
little over 50 percent of the inventories addressed fugitive leaks.

     The questionnaire response shows  that most agencies,  84 percent, are
using the EPA recommended per-capita emission factor to account for miscel-
laneous commercial/consumer solvents in urban areas.  In response to  the
question, "How can EPA help to  improve inventories," it appears that  most
agencies feel that the per-capita solvent factor needs to  be updated  and
that new emission factors need to be developed  for POTWs and TSDFs.
                                 3-21

-------
Section B.3.  CO Emission Inventories

     The CO nonattainment area emission inventory section was  designed to
answer questions concerning geographic coverage of CO inventories.  Addi-
tionally, agencies were asked if they included woodstoves in their inventory
and if they attempted to locate potential CO hot spots.
     B.3.  CO Emission Inventories in CO Nonattainment Areas
           (35 agencies)

     If your State contains no CO nonattainment areas, check here
     [ 24] and go to B.4.

     State/Local  Agency Response

          a.  Indicate which response(s) below describe the geographic
     coverage and focus of your agency's CO inventory.
     (Check "x"  where appropriate.)
              1.  16   Major emphasis is on maintaining a  CO
                       inventory for highway vehicle sources for
                       certain traffic areas such as Central
                       Business Districts, intersections,  or
                       specific nonattainment areas

              2.  24   Areawide or countywide CO inventory is
                       maintained, covering major CO point sources,
                       area sources, and highway vehicles

              3.   5   CO inventory is not currently maintained
          b.  Are woodstoves included in your CO emission  inventory?

              12   Yes  22  No


          c.  Is the highway vehicle inventory or transportation/
              traffic data used to locate potential  CO hot spots?

              26   Yes  10  No
     Thirty-five agencies responded to questions in the  CO  nonattainment
area section.  Their responses show that 44 percent of the  agencies
characterize their geographic inventory coverage as limited to  highway
vehicle sources in certain limited areas, such as central business districts
or specific nonattainment areas.  Approximately 65 percent  of the  agencies
indicated a much broader inventory coverage as areawide, including major
sources, area sources, and highway vehicles.  Seven agencies, or 19  percent,
indicated both of the above, and 5 agencies said they did not currently
maintain a CO emission inventory.

                                 3-22

-------
     With respect to their CO  inventories,  63  percent  of  the  agencies do  not
include woodstoves.  However,  72 percent of the agencies  are  using their
highway inventory or transportation/traffic data to  locate  potential CO hot
spots.
                                   3-23

-------
Section B.4.  Highway Vehicle Inventories in 03 and CO Nonattainment Areas

     The 1985 audit of emission inventories indicated that a  substantial
number of air pollution control agencies compile their highway vehicle
inventories cooperatively with a planning agency or a transportation
department.  In these cases, the planning agency or transportation depart-
ment is the lead agency.  This section was designed to develop a better
understanding of who compiles the highway emission inventories and what
factors are involved.

     B.4.  Highway Vehicle Inventories in 03 and CO Nonattainment Areas
           (45 agencies)

           Highway vehicle emissions are often compiled by the air
     pollution control agency acting in concert with the local planning
     agency or transportation department.  In some instances,  the local
     Metropolitan Planning Organization (MPO) or Department of
     Transportation (DOT) will compile the inventory independently as
     the lead responsible agency.  In general, highway vehicle emissions
     are calculated by applying mobile source emission factors to
     transportation data such as vehicle miles traveled (VMT), trip
     ends, etc.  Mobile source emission factors are available  for
     various vehicle types and conditions from an EPA emission factor
      model entitled MOBILE3 (or from earlier versions).  Important
     conditions affecting emissions are vehicle age and mix,  speed,
     temperature, and cold start operation.
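The calculation described above, mobile source emission factors applied to transportation data, can be sketched as follows.  The vehicle classes, emission factors, and VMT figures are hypothetical illustrations, not output from MOBILE3:

```python
# Sketch of a highway vehicle emission estimate: apply mobile-source
# emission factors (grams/mile) to vehicle miles traveled, summed over
# vehicle types.  Factors and VMT below are hypothetical, not values
# produced by EPA's MOBILE3 model.

GRAMS_PER_TON = 907_184.74  # grams per short ton

def highway_emissions_tons(vmt_by_type, ef_g_per_mile):
    """Daily emissions (tons) = sum over vehicle types of VMT x factor."""
    grams = sum(vmt_by_type[v] * ef_g_per_mile[v] for v in vmt_by_type)
    return grams / GRAMS_PER_TON

# Hypothetical daily VMT and CO emission factors for two vehicle classes.
vmt = {"light_duty_gas": 2_000_000, "heavy_duty_diesel": 150_000}
ef_co = {"light_duty_gas": 20.0, "heavy_duty_diesel": 12.0}  # g/mile

print(f"{highway_emissions_tons(vmt, ef_co):.1f} tons CO/day")
```

In practice the factors themselves vary with the conditions listed above (vehicle age and mix, speed, temperature, cold starts), which is what the MOBILE model accounts for.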

           If your State contains no 03 or CO nonattainment areas,
     check here [ 15 ] and go to B.5.

     State/local Agency Response

           The State or local agency should answer the following
     questions even if a transportation or planning agency is
     responsible for the highway vehicle inventory.


           a.  Which agency maintains the highway vehicle emission
               inventory for the 03 and/or CO nonattainment areas?
               (Check "x" where appropriate.)


               1.   21   Air pollution agency (State or local)

               2.   20   Local planning organization (MPO, COG, RPC,
                         etc.)

               3.   11   State or local transportation department (DOT)

               4.    1   Other (specify) ______________

               5.    1   None  is maintained
                                 3-24

-------
         6.    1   Unsure
     b.  If an agency other than the air agency maintains the
         highway vehicle inventory, indicate what difficulties
         (if any) result.  (Check "x" where appropriate.)
         1.   21  No significant difficulties are evident

         2.  	2  Scheduling and coordination of activities are
                  negatively affected

         3.  	1_  The air agency loses control of the design
                  and format of the inventory

         4.  	3  The responsible agency has not been adequately
                  funded to be responsive

         5.  	4  Additional technical guidance is needed for
                  effective communication of program needs to
                  another agency

         6.  	2  Other (specify)  cold starts, local does not
                  receive results of update.	
     c.  Conversely, indicate what benefits (if any)  accrue from
having another agency responsible for the highway vehicle
inventory.  (Check "x" where appropriate.)
         1.  	4  No significant benefits result

         2.   25  Less resource drain on the air agency

         3.   21  The air agency doesn't have to develop
                  transportation planning expertise

         4.  	9  A better product results

          5.  	3  Other (specify)  better working relation-
                   ships leading to more influence in trans-
                   portation network design, use of computer
                   system

     d.  Which emission factor model (MOBILE1, 2, 2.5, or 3)
was used to generate the highway vehicle emission factors for
the most recently developed or maintained inventory?

     MOBILE1 - 0,  MOBILE2 - 5,  MOBILE2.5 - 3,  MOBILE3 - 31,  Unsure - 6

(Indicate number or "U" if unsure.)
                          3-25

-------
     e.  Were the highway vehicle emission factors in the model
tailored to your area to account for the following parameters?
(Indicate "Yes," "No" or if unsure, specify "U.")

                                             Yes     No    Unsure

          1.  Vehicle mix                     29     11       5

          2.  Vehicle age                     26     12       7

          3.  Speed                           39      3       3

          4.  Ozone season                    26     12       7
              temperature

          5.  Cold/hot start                  24     12       9
              operating modes

     f.  Were data from the local  transportation planning process
used to compile the most recently developed or maintained
highway vehicle inventory? (e.g., VMT, street locations,  traffic
volumes, growth patterns, etc.)

Yes  34    No  4     Unsure  7
     g.  If not, were gross areawide estimates of VMT or
gasoline sales used to compute emissions?

Yes  6     No   5    Unsure  6
     h.  An important component of travel  sometimes overlooked
in highway vehicle inventories is VMT associated with minor
roads and connectors, often called "local" or "off network"
travel.  Was local travel included in your most recently
developed or maintained highway vehicle inventory?

Yes  27    No   5    Unsure  13
     i.  The results of last year's audit indicated that
significantly fewer highway vehicle inventories contained NOX
emissions than VOC emissions.  Indicate if and why this is so
for your agency.  (Check "x" where appropriate.)
         1.   28  Not so.  Our highway vehicle inventory contains
                  both NOX and VOC.

         2.    6   NOX inventory not perceived as needed because
                   NOX reductions are not required for O3 control
                           3-26

-------
              3.    2   NOX inventory perceived as needed for O3 but it was
                        not included because of resource limitations

              4.    3   Other (specify)  ______________________

              5.    6   No response


           j.  Last year's audit asked  each agency to specify the  base
     year of the highway vehicle inventory in the SIP, which gave  a  limited
     idea of how well these inventories have been maintained to the
     present.  What is the latest year  of record for which your agency's
     highway vehicle inventory has been updated?  When was this done?

                                    Latest year         No. of years
                Latest year         update was          from last update
                of record           performed           through 1987

                1977   2            1982    6           0    1
                1979   1            1983    1           1   11
                1980   4            1984    6           2   15
                1982   6            1985   15           3    6
                1983   7            1986   11           4    3
                1984   9            1987    1           5    6
                1985   8
                1986   3
                1987   1
                No response 4       No response 5


     An analysis of the responses on who compiles and maintains the  highway
emission inventory indicates that about 47 percent of the inventories  are
principally an air pollution agency function and the remaining inventories,
53 percent, are maintained by some other agency, usually a local planning
agency or State/local transportation department.  Comments from the  air
pollution agencies with highway vehicle emission inventories maintained
by another agency indicate that 85 percent of them have no significant
difficulties with this arrangement.  Those who do have problems with this
arrangement cite scheduling, loss of control, adequacy of funding  (other
agency), and need for technical guidance as significant problems.  Nearly
all of the audited agencies whose inventories are maintained by another
agency cited, as a principal benefit, not having to develop transportation
planning expertise.
Almost 25 percent of the agencies believed a better inventory was  produced
in cooperative efforts than the air agency alone would have been able  to
produce.

     Each agency was asked which emission factor model was used to generate
the highway emission factors for its inventory.  Five agencies are using
MOBILE2, three are using MOBILE2.5, and 31 are using MOBILE3.  Additionally,
six agencies are unsure about which version is being used.  In addition to
information about which version was used, each agency was asked to give the
various user defined parameters they employed, as opposed to model default
values.  The variables are vehicle mix, vehicle age, speed, ozone  season
temperature, and cold/hot starts.  The  agencies' use of these parameters

                                 3-27

-------
to tailor the model  to individual  areas ranged from 53 to 86 percent.
The most frequently involved variable was speed, with cold/hot  start
information being least frequently applied.  Thirty-four agencies,  75
percent, indicated that data from the local transportation planning
process were used to generate their most recent emission inventory. Six
agencies said that gross areawide estimates of either vehicle miles
traveled (VMT) or gasoline sales were used to compute emissions.  An
important component of highway vehicle inventories is the VMT associated
with "local" or "off network" travel.  Twenty-seven agencies included
"off network" travel in their inventory, and 18 agencies either did not
use it or were unsure if this component was included.
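The tailoring range quoted above (53 to 86 percent) follows directly from the question e tallies. The sketch below is purely illustrative; the `pct` helper and the variable names are mine, not the report's, and it assumes the report truncates rather than rounds its percentages.

```python
# Illustrative recomputation of the model-tailoring percentages reported
# above.  Counts are the "Yes" tallies from question e; each of the 45
# audited agencies answered Yes, No, or Unsure.
def pct(part, whole):
    # The report's percentages appear to be truncated, not rounded.
    return int(100 * part / whole)

tailored_yes = {
    "vehicle mix": 29,
    "vehicle age": 26,
    "speed": 39,
    "ozone season temperature": 26,
    "cold/hot start modes": 24,
}
n_replies = 45

rates = {k: pct(v, n_replies) for k, v in tailored_yes.items()}
print(rates["speed"], rates["cold/hot start modes"])  # 86 53 -- the cited range
```

Speed comes out highest (86 percent) and cold/hot starts lowest (53 percent), matching the text.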

     The 1985 audit found that many highway inventories did not include
NOX emissions.  Analysis of the audit questionnaire shows that  62 percent
of the agency inventories include both VOC and NOX emissions.  Eight agencies
do not include NOX in their inventories, because they do not perceive  a
need to reduce NOX as a means of reducing O3 or because of resource problems.
The remaining nine agencies either did not respond to this issue  or
stated they had problems only with CO.

     The last question in this section requested the latest year  of record
used in the highway vehicle inventory, and the year in which the  inventory
was most recently updated.  The years of record ranged from 1977  to 1987,
while the most recent updates were performed between 1982 and 1987.
Through the end of 1987, 27 agencies (60 percent) had updated their
traffic data within the previous 2 years.
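The 60-percent figure can be checked against the item j tallies. The sketch below is illustrative; the variable names are mine, and it assumes all 45 audited agencies (including non-respondents) as the denominator.

```python
# Agencies by number of years from last inventory update through 1987,
# as tallied in item j above (non-respondents excluded from the counts).
years_since_update = {0: 1, 1: 11, 2: 15, 3: 6, 4: 3, 5: 6}
n_audited = 45  # assumed denominator, including non-respondents

recent = sum(n for yrs, n in years_since_update.items() if yrs <= 2)
print(recent, 100 * recent // n_audited)  # 27 agencies, 60 percent
```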

3.4  MODELING

Introduction

     In the National Air Audit, the State and local agencies were queried
concerning the following aspects of dispersion modeling:

     •  Experience and training of agency personnel conducting  modeling

     •  Availability of various guideline, other EPA recommended, non-
        guideline, screening, and other models

     •  Uses of non-EPA recommended modeling techniques

     •  EPA Regional Office review and revision of modeling applications
        conducted by the State and local agencies.

The purpose of these questions was to discover any widespread problems
that State and local agencies may be having in conducting dispersion
modeling analyses that they are required to perform.

     In this section, the results of the questionnaire on dispersion
modeling are presented.  Major findings and conclusions that arose  from
the results of the audit are presented.  Then, further detail is  provided
on the responses to individual questions with summary statistics.
                                 3-28

-------
Major Findings and Conclusions

     Although the purpose of the audit's questions on modeling was  to
determine the adequacy of the State and local  agencies'  modeling staff to
perform or review modeling analyses in support of the agencies'  SIP or
permitting activities, the responses can only measure the general  adequacy
of the agencies'  staffs.  The adequacy of a particular modeling analysis is
dependent upon such factors as the purpose of the analysis,  complexity
and availability of suitable models, level  of detail  required by the
model chosen, and availability of meteorological, source, topographic,
and other data.  The responses to the audit's questions can be
useful in identifying potential problems in the agencies'  abilities to
conduct routine analyses.

     Overall, the responses indicate that agency modeling personnel have
the educational background to conduct modeling.  Even those who do not
have a degree in meteorology, engineering, or science, or formal
training in modeling, generally have experience in modeling.  As a
result of this combination of education,
training, and experience, the State and local  agencies'  staffs appear to
be capable of performing modeling analyses.

     The State and local agencies have access to a wide variety of
dispersion models.  However, it appears that many may be using outdated
versions of UNAMAP.  Because the major revision of UNAMAP occurred in July
1986, agencies audited in FY 1986 could not have been using the new
version, and the decision was made not to change the question for FY
1987.  The apparent failure to update may therefore be an artifact of
asking the same question in both fiscal years.  It should also be
noted that the Guideline on Air Quality Models was significantly revised
near the midpoint of the biennial audit (September 1986).  Several  models
were added to the Guideline and a few were deleted.  Again,  the questions
regarding use of guideline and non-guideline models were not changed in
midstream.

     The responses of the agencies indicate that they are most comfortable
with models that predict concentrations of particulate matter, sulfur
dioxide, and  other non-reactive pollutants from sources located in  simple
terrain.  They tend to have less access to and familiarity with models
used to determine impacts from mobile sources.  Access to and familiarity
with ozone models such as EKMA and the Urban Airshed Model is even  less
common.  Since these models will be the primary ones used in analyses for
the post-1987 ozone SIPs, the lack of access to and familiarity with
these models  may pose a problem for some agencies.

     State and local agencies reported the use of non-guideline modeling
techniques in only about 10 percent of their modeling analyses.  The most
common reason for using a non-guideline technique was the presumed  technical
superiority of the non-guideline method over that recommended in the
Guideline.  There was no indication as to the performance evaluation used
to justify the use of a non-guideline technique.  These instances may be
indicative of the agencies using unapproved techniques in situations such
as complex terrain for which the recommended models may not be refined.
In addition,  there were 321 instances where the Guideline recommendations

                                   3-29

-------
 on  the  use  of  data  bases were  not  followed.  The vast majority of these
 were  cases  in  which  5 years  of off-site or 1 year of on-site meteorological
 data  were  not  used.  These  instances may be due to model applications in
 areas for which there were  little  or no meteorological data.

      There  were few  cases in which the EPA Regional Offices reviewed the
 State or local agencies' analyses  and required a revised analysis.  It is
 unclear whether the  small number of reviews and revisions was due to the
 expertise of the agencies or to the fact that few analyses were submitted
 to  the  Regional Offices for  review.

 Responses to Individual Questions

      Experience and  Training

      In order  for a  State or local  agency to have the ability to conduct
 modeling analyses adequately,  it should have an experienced staff of
 modelers who have some educational  background in modeling.  Table 3.5
 summarizes  the responses to these questions.  The agencies reported a
 total of 346 personnel  that are involved in modeling.  The majority of
 these,  i.e., 203 or  58 percent, are engineers or scientists.  Another 77,
 or  22 percent, are meteorologists.   Only 66, or 19 percent, were listed
 in  the  "other" category.

     These categories were further subdivided into those with and without
training in modeling.  Of the engineers and scientists, 119 (34 percent
of all personnel involved in modeling) have training in dispersion modeling.
Of the "others", only six (2 percent of the total personnel involved in
modeling) have training.  A total of 60 personnel, or 17 percent of the
total, do not have a degree in meteorology, engineering, or related science,
and do not have training in modeling.

      The second aspect  of this  question was the number of years  of
 experience  in modeling  that State and local  agency personnel  have.  Three-
 fourths of the modeling personnel  have over 2 years of experience in
modeling.  The breakdown by range of  years of experience was:   25 percent
with 0 to 2 years of experience,  14 percent with  from 2  to 5 years,  26
 percent with from 5 to  10 years,  and  34 percent with over 10 years of
 experience.

     Since the real  picture of  the  experience and  training of State  and
 local  agency modeling personnel is  a  combination  of the  two,  a  further
analysis of the two categories  of expertise is  of  interest.   Of  all
the modeling personnel  reported,  only 12,  or 3  percent,  were  categorized
as  having less than 2 years  of  experience  and not  having a degree in
meteorology, engineering, or a  science,  or training in modeling.   All  of
these personnel work at agencies  where they are supervised by more
experienced modelers.  Thus, it appears  that  the modeling personnel  at
the State and local  agencies do have  sufficient  education,  training, or
experience  to conduct modeling  analyses.
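The staff-composition arithmetic above can be retraced from the reported headcounts. The sketch below is illustrative; the dictionary names are mine, and it assumes the report truncates its percentages.

```python
# Illustrative recomputation of the staff-composition figures quoted
# above (percentages truncated, as the report appears to do).
staff = {"meteorologist": 77, "engineer/scientist": 203, "other": 66}
total = sum(staff.values())                       # 346 personnel
shares = {k: 100 * n // total for k, n in staff.items()}

# Distribution by years of experience, from the text (percent).
experience = {"0-2": 25, "2-5": 14, "5-10": 26, ">10": 34}
over_two = sum(v for k, v in experience.items() if k != "0-2")

print(total, shares, over_two)  # 346 ... 74 (about three-fourths)
```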
                                   3-30

-------
                                TABLE 3.5

           SUMMARY OF EXPERIENCE AND TRAINING OF MODELING STAFF

                                                      Total     Total
Professional Qualifications                            (%)      Number

Meteorologist                                           22        77

Engineer/Scientist with Modeling Training               34       119

Engineer/Scientist without Modeling Training            24        84

Other with Modeling Training                             2         6

Other without Modeling Training                         17        60

                                            Total Number         346

Percentage of Staff with Years of Experience:
     0-2 years: 25     2-5 years: 14     5-10 years: 26     >10 years: 34

Note:  The percentages may not add up to 100 percent because of rounding.
                                   3-31

-------
     Model Availability

     Three questions were asked to determine the availability and  existence
of in-house expertise in a wide range of models, the type of access  to
models available to the State and local  agencies, and the capability to
modify model software.  In conjunction with the previous questions con-
cerning experience and training, these questions are important in  evaluat-
ing the capability of the agencies to conduct modeling analyses.   If they
do not have access to and in-house expertise in the use of a variety of
models, no amount of education and training will be sufficient to  provide
the capability to conduct extraordinary analyses.  Responses to individual
questions on model availability are summarized in Table 3.6.

     The agencies were first asked to identify the Version Number  of UNAMAP
that they are currently using.  Although the current version of UNAMAP is
No. 6, the question did not include that version as an option since
that version became available in July 1986 and the question was not
revised between FY 1986 and FY 1987.  Thus, although 26 agencies or  43
percent of the respondents indicated that they are using Version 4,  this
may have been a response in FY 1986, and Version 6 may be the one  that
these agencies now use.  Similarly, of the respondents, 35 percent or 21
agencies are using Version 5 of UNAMAP.  Other versions of the models are
being used by 18 percent of the respondents, or 11 agencies.  It is unknown
whether any of these began using the current Version 6 by the end  of FY
1987.  The remaining respondents (3 percent or 2 agencies) indicated that
they did not know which version they are using.

     A second part of the question addressed the availability of and in-
house expertise in the use of specific models.  This part of the question
was divided into four subparts based upon the following classification of
models:

     •  EPA guideline models including APRAC-1A, AQDM, CDM, RAM, CRSTER,
        TCM, TEM, and HIWAY

     •  Other EPA recommended models such as ISCLT, ISCST, MPTER,  and
        CDMQC

     •  Other models such as EKMA, the Urban Airshed Model, PAL, PLUVUE,
        and others

     •  Screening techniques including PTMAX, PTDIS, PTMTP, PTPLU, VALLEY,
        COMPLEX I, and others

This division of models is logical from the standpoint of the discussion
of models in the Guideline on Air Quality Models (1978) then in effect.
The Guideline on Air Quality Models (Revised) was promulgated in September
1986.  The ISCST, ISCLT, CDMQC, MPTER, and Urban Airshed Model  became EPA
guideline models, while the APRAC-1A, AQDM, TCM, and TEM were deleted.
While the Guideline was undergoing revision, other refined models,
including these latter four models, were recommended for use in the
Regional Workshops on Air Quality Modeling, A Summary Report (1983).
                                   3-32

-------
                                TABLE 3.6

                      SUMMARY OF MODEL AVAILABILITY
                                                        Positive
Item                                                Response (%)

UNAMAP Version Number:
     Version No. 5                                           35
     Version No. 4                                           43
     Other                                                   18
     Don't Know                                               3

Access to EPA Guideline Models:
     APRAC-1A                                                74
     AQDM                                                    78
     CDM                                                     90
     RAM                                                     91
     CRSTER                                                  94
     TCM                                                     74
     TEM                                                     77
     HIWAY                                                   89

In-House Expertise in the Use of EPA Guideline Models*:
     APRAC-1A                                                37
     AQDM                                                    55
     CDM                                                     78
     RAM                                                     79
     CRSTER                                                  90
     TCM                                                     53
     TEM                                                     51
     HIWAY                                                   63

Access to Other EPA Recommended Models*:
     ISCLT                                                   96
     ISCST                                                   92
     MPTER                                                   96
     CDMQC                                                   91

In-House Expertise in the Use of Other EPA
Recommended Models*:
     ISCLT                                                   82
     ISCST                                                   94
     MPTER                                                   82
     CDMQC                                                   79

Access to Other Models*:
     EKMA                                                    77
     AIRSHED                                                 40
     PAL                                                     89
     PLUVUE                                                  50
                                   3-33

-------
                          TABLE 3.6 — Concluded
                                                        Positive
Item                                                Response (%)
In-House Expertise in the Use of Other Models*:
     EKMA                                                    55
     AIRSHED                                                 30
     PAL                                                     70
     PLUVUE                                                  38

Access to Screening Techniques:
     PTMAX                                                   96
     PTDIS                                                   88
     PTMTP                                                   85
     PTPLU                                                   98
     VALLEY                                                  92
     COMPLEX I                                               83

In-House Expertise in the Use of Screening Techniques:
     PTMAX                                                   91
     PTDIS                                                   87
     PTMTP                                                   87
     PTPLU                                                   90
     VALLEY                                                  90
     COMPLEX I                                               72

Model Access Obtained by:
     Telephone line to State/local  agency mainframe
       computer                                              34
     Telephone line to private or subscription computer       4
     In-house dedicated computer                             49
     Telephone line to EPA computer                          13

Ability of staff to modify software for models               81
*  Note that the Guideline on Air Quality Models was revised in September
   1986, and the regulatory status of several models changed as indicated
   in the text.
                                   3-34

-------
     The percentages given in Table 3.6 for each model are based upon the
number of responses for that specific model.  In general, agencies have
access to and expertise in the use of traditional  Gaussian models used to
predict concentrations resulting from emissions of sulfur dioxide and
particulate matter from stationary sources.  The highest number of positive
responses in terms of the number and percentage of agencies responding
that they have access and in-house expertise were for the PTPLU, PTMAX,
ISCLT, MPTER, CRSTER, VALLEY, ISCST, CDMQC, RAM, and CDM models.  (Over
90 percent of the agencies that responded to the question concerning
these particular models indicated that they had access  to them.  A high
percentage, i.e., generally greater than 80 percent, indicated that they
have in-house expertise in these models.)

     The positive response rate for other models was less.  The lowest
rate was for the Urban Airshed Model.  Only 40 percent  of the agencies
responding, or 16 agencies, indicated that they had access to that model.
Only 11 agencies indicated that they had in-house expertise in the use of
the Urban Airshed Model.  PLUVUE was the model with the next lowest
rates of access and in-house expertise.

     The lack of access to and in-house expertise in the use of these two
models is not surprising.  The Urban Airshed Model is a complex numerical
model for analyzing ozone, and the amounts of data and resources (computer
time and manpower) needed to run it are substantial.  As a result,
many agencies have used EKMA in the past to analyze ozone nonattainment
problems.  The access to and in-house expertise in the  use of EKMA is
more widespread.  Of the agencies responding to this portion of the
question, 77 percent or 34 agencies have access to EKMA, and 23 agencies
or 55 percent of the respondents have in-house expertise in the use of
EKMA.  Due to the emphasis that will be placed upon ozone nonattainment
problems in the future, increased access to and in-house expertise in the
use of the Urban Airshed Model and EKMA may be necessary.  PLUVUE is a
complex model used to analyze source impacts on visibility.  It is unlikely
that there are many agencies that have had to conduct such visibility
analyses.
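Because the Table 3.6 percentages are computed per question, the number of respondents varies from model to model and can be back-calculated from any count/percentage pair quoted above. A sketch, purely illustrative (the `respondents` helper is mine, not the report's):

```python
# Back-calculate a per-question respondent count from a quoted
# count and percentage pair (illustrative helper, not from the report).
def respondents(count, percent):
    return round(count / (percent / 100))

print(respondents(16, 40))  # Urban Airshed access: about 40 respondents
print(respondents(34, 77))  # EKMA access: about 44 respondents
```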

     Most State and local agencies have access to the models through the
use of in-house dedicated computers.  The questionnaire did not ask
whether these in-house computers were mainframes, minicomputers, or
personal computers (PCs).  However, PCs have become popular with State
and local agencies, and many of the Gaussian models for predicting
concentrations of particulate matter and sulfur dioxide are available for
PCs.  The next highest percentage of agencies, i.e., 34 percent, have
access via a telephone line to a State or local agency  mainframe computer.
Seven of the agencies, or 13 percent, use a telephone line to EPA's
computer.  Only two agencies, or 4 percent, use a telephone line to a
private or subscription computer.

     The last part of this question concerned the ability of the State
and local agencies to modify model software.  Of the respondents to this
part of the question, 81 percent indicated that they had that capability.
A total of 38 agencies reported that they had modified  model software.


                                   3-35

-------
     Alternate Modeling Techniques

     The third question in the questionnaire for modeling dealt with the
use of non-guideline methods and models.  Responses to this question are
summarized in Table 3.7.  The responses from State and local  agencies
indicated that in 256, or 10 percent, of the total of 2,499 modeling
analyses performed, the agencies had used techniques that are not recom-
mended by EPA.  Over half, i.e., 56 percent, of these involved the use of
a non-guideline model.  Of course, this number could include models such
as ISCST and ISCLT, because the Guideline was revised during the period
of the survey.  The next highest percentage of cases (16 percent) was
reported because a guideline model had been modified.
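The 10-percent figure above is a straightforward ratio; the sketch below (illustrative; variable names are mine) recomputes it with the truncation the report appears to use.

```python
# Illustrative recomputation of the non-guideline-technique rate above.
total_analyses = 2499   # modeling analyses performed by the agencies
non_recommended = 256   # analyses using techniques not recommended by EPA

rate = 100 * non_recommended // total_analyses
print(rate)  # 10 percent
```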

     Although the agencies were asked to provide a reason for using a non-
approved modeling technique, few responded.  Of those that responded, the
most common reason provided was that the alternative technique was judged
to be technically superior to the recommended one.

     The agencies also reported 321 instances when EPA guidance on
data bases to be used for modeling was not followed.  Of these instances,
93 percent or 309 entailed a failure to use 5 years of off-site meteoro-
logical  data or 1 year of on-site data.  There were few instances of the
use of other types of non-approved databases.

     Regional Office Review and Revision of Modeling Analyses

     In the fourth set of questions which are summarized in Table 3.8, the
Regional Offices and the State and local agencies were asked to identify
the number of modeling analyses that had been submitted to the Regional
Offices for review and the number that required revision because they were
technically inadequate or deviated from EPA guidance.  The numbers reported
by the Regional Offices and the State and local agencies were identical.
A total  of 18 analyses were submitted for review.  Of these, 10 were
conducted for new source review, 5 for SIP modeling, and 3 for "bubble"
policy applications.  Four of the new source review analyses had to be
revised, and one each of the other types of analyses had to be revised.
Reasons provided for requiring that the analyses be revised included the
following:

     •  Complex terrain considerations (3)

     •  Use of non-guideline model without a performance evaluation (2)

     •  Receptor network design (2)

     •  Emission inventory and operating parameters (1)

     •  Meteorological data (1)

     •  Comparison with acceptable air quality levels (1)

The number of reasons given exceeds the number of revisions required
because a revision may have been required for several different reasons.


                                   3-36

-------
                                TABLE 3.7

                 SUMMARY OF ALTERNATE MODELING TECHNIQUES
                                                       Percentage of
Item                                                    Cases (%)

Analyses Using Non-EPA Recommended Techniques*               10

Types of Non-EPA Recommended Techniques:
     Use of a non-guideline model*                           56
     Modification of a Guideline model                       16
     Use of a non-recommended option in a
       Guideline model                                        4
     Use of a Guideline model outside its
       stated limitation                                     11
     Other                                                   13

Selection/Use of Database for Models That
  Does Not Conform to EPA Guidance:
     Other than 5 yrs of off-site or 1  yr of on-site
       meteorological data                                   93
     Techniques other than those from the Guideline
       on Air Quality Models                                  5
     Techniques other than those in EPA policy on
       calms                                                  0
     Techniques other than those in EPA policy on
       design of receptor network                             2
*  Note that the Guideline on Air Quality Models was revised in September
   1986, and the regulatory status of several models changed as indicated
   in the text.
                                   3-37

-------
                                TABLE 3.8

               SUMMARY OF EPA REVIEWS OF MODELING ANALYSES
Item                                                         Number

State and Local  Agency Modeling Analyses Reviewed by
EPA Regional Offices, by Type of Regulatory Program:
     Bubble                                                   3
     Section 107 redesignations                               0
     New source review                                       10
     Nonattainment area SIP analyses                          0
     Lead SIPs                                                0
     Other SIP modeling                                       5

           Total                                             18

State and Local  Agency Modeling Analyses Reviewed by EPA
Regional Offices That Had to be Revised, by Type of
Regulatory Program:
     Bubble                                                   1
     Section 107 redesignations                               0
     New source review                                        4
     Nonattainment area SIP analyses                          0
     Lead SIPs                                                0
     Other SIP modeling                                       1

           Total                                              6

Reasons for Revision of Modeling Analyses:
     Use of an inappropriate guideline model                   0
     Use of non-guideline model without a performance
       evaluation                                             2
     Urban/rural dispersion coefficients                      0
     Emission inventory and operating design parameters       1
     Meteorological data                                      1
     Receptor network design                                  2
     Complex terrain considerations                           3
     Downwash considerations                                  0
     Comparison with acceptable air quality levels            1
     Technical documentation                                  0
     Other                                                    0
                                   3-38

-------
     The responses to this question do not suggest that there is a major
problem in the application of guideline methods or models.   The fact  that
only 18 analyses were referred to the Regional  Offices indicates that the
vast majority of analyses were routine and non-controversial.  Since  the
analyses that were submitted were ones that entailed non-routine or
controversial model applications, it is understandable that one-third of
those reviewed would have to be revised.

3.5  SIP EVALUATION AND IMPLEMENTATION

Introduction

     An evaluation of SIP development and implementation activities was
designed to assess whether State plans for attainment are being reasonably
carried out.  The evaluation was also designed  to identify  needs of State
and local  agencies in developing and updating SIPs.  This section of  the
report summarizes major findings in these areas and presents detailed
information in each of the audit areas.  Audit  questions covered by the
SIP Evaluation and Implementation section of the Air Quality Planning and
SIP Activity audit include:

     • Timeliness of Regulatory Development
     • Timeliness of Studies
     • Transportation Control Measures
     • Source Activity
     • Generic Bubble Rules.

Major Findings and Conclusions

     Timeliness of Regulatory Development

     The SIP evaluation and implementation audit identified that 29
percent of the revisions and strategies related to SIPs or Section 111(d)
plans had not been completed and, of those completed, 24 percent had not
been submitted to EPA.  The audit found three major causes for these SIP
submittal delays:  an overly optimistic schedule, insufficient resources,
and other reasons specific to each revision.

     Timeliness of Studies

     The audit also revealed that 24 percent of additional studies (such
as nontraditional TSP or CO hot spots) had not been completed, and that
56 percent of these were behind schedule.  Reasons cited for delays
included an overly optimistic schedule, a change in guidance or scope of
work from the original plan, a schedule contingent on another action that
did not occur, insufficient resources, and unclear or unassigned
responsibilities.

     Transportation Control Measures

     It was found that for the transportation control measures (TCMs)
that agencies had instituted to control VOC and CO, the implementation
rate was 74 percent for traffic flow improvement, 55 percent for mass
                                   3-39

-------
transit, 80 percent for carpooling, 89 percent for vehicle inspection/
maintenance programs, and 68 percent for other measures such as bicycle
lanes and parking control.

     Source Activity

     The audit revealed differences in the values assumed for SIP
revisions and the actual  growth that occurred.  The absolute differences
were 1 percent per year for population growth (in 22 metropolitan  areas),
1.8 percent per year for employment projections, 1.8 percent per year for
vehicle miles travelled, and 1.4 percent per year for major source emis-
sions.  Many respondents contemplated a form of action after finding
significant differences between their projections and the actual  growth
figures, including revisions to upcoming reasonable further progress
(RFP) reports or to the SIP itself.

     Generic Bubble Rules

     Based upon data from this audit, only 18 percent of the States have
bubble rules formally approved by EPA.  The source emissions in these
States experienced an overall drop to rates at or below allowable  emission
standards after the bubbles were implemented.  Of the States with  bubble
rules not formally approved, the source emission rates increased,  but 67
percent were at rates at or below the allowable emission standards once
the bubbles were implemented.  However, through their failure to obtain
EPA approval, these States are placing the "impacted" sources in jeopardy
of potential EPA action to enforce the original  SIP limits.

Responses to Individual Questions

     Timeliness of Regulatory Development

     One of the purposes of the audit was to determine the effectiveness
with which State and local agencies are revising and adopting SIPs and
Section 111(d) plans.  Questions asked were intended to identify whether
strategies related to SIPs or Section 111(d) plans were being developed
on time and, if not, to determine the necessity for corrective action.
The EPA Regional Offices provided to each agency a list of SIP revisions
or strategies, which included Part D plans and non-Part D plans.  The
Regions requested that States complete a table showing whether action had
been completed and submitted, completed and not submitted, not completed
but on schedule for completion, or not completed and not on schedule for
completion.  Table 3.9 presents a summary of responses to the Regional
requests.  Of a total of 256 revisions or strategies listed, State and
local agencies responded to approximately 90 percent.  Overall, 152 of
256 revisions or strategies had been completed (59 percent), while 74 had
not been completed (29 percent).  Of those completed, 115 of 152 (76
percent) had been submitted to EPA, while 37 (24 percent) had not been
submitted.
Of those not completed, 30 of 74 (40 percent) were on schedule, while 44
of 74 (60 percent) were not on schedule.
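The percentages quoted above follow directly from the response counts in
Table 3.9.  The arithmetic can be sketched as follows (counts taken from the
Total row of the table; the Python rendering is illustrative only and is not
part of the original audit):

```python
# Response counts from Table 3.9, Total row (256 revisions or strategies).
completed_submitted = 115        # completed and submitted
completed_not_submitted = 37     # completed but not submitted
not_completed_on_schedule = 30   # not completed, on schedule
not_completed_off_schedule = 44  # not completed, not on schedule
no_response = 30
total = 256

completed = completed_submitted + completed_not_submitted               # 152
not_completed = not_completed_on_schedule + not_completed_off_schedule  # 74

def pct(part, whole):
    """Share of `part` in `whole`, rounded to the nearest whole percent."""
    return round(100 * part / whole)

print(pct(completed, total))                # 59 (percent completed)
print(pct(not_completed, total))            # 29 (percent not completed)
print(pct(completed_submitted, completed))  # 76 (percent of completed submitted)
```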

     The audit then asked if agencies experienced any delays in the sub-
mission of revisions or strategies, and the causes of such delays.  A list
of potential causes provided to the agencies is given below:
                                   3-40

-------
                                 TABLE 3.9

                    TIMELINESS OF REGULATORY DEVELOPMENT

                                       Completed           Not Completed
 Type of Revision    Number                 Not           On     Not on      No
 or Strategy        of Cases  Submitted  Submitted  Schedule  Schedule  Response

 Part D Plans          155        64         30         20        25        16
 Non-Part D Plans      101        51          7         10        19        14

 Total                 256       115         37         30        44        30
                                    3-41

-------
     Code           Reason for Delay

      A       Unclear or unassigned responsibilities
      B       No formal  tracking system
      C       Schedule was overly optimistic
      D       Change in guidance or scope of work from original  plan
      E       Executive or legislative oversight
      F       Insufficient resources
      G       Schedule contingent on other action that did not occur
      H       Other

Table 3.10 presents a summary of responses to this question, by EPA Region.
The reason for delay cited most frequently was an overly optimistic
schedule.  This was followed by:  insufficient resources, other reasons
(specific to that revision), and a change in guidance or scope of work.
Figure 3.1 graphically shows the percentage of respondents citing each
cause for delay of submission.

     Timeliness of Studies

     EPA Regional Offices provided to State and local agencies a list of
additional studies, such as nontraditional TSP or CO hot-spot modeling
studies, that had been approved as part of SIP control strategies.  As in
"Timeliness of Regulatory Development," agencies were requested to complete
a table showing whether action had been completed and submitted, completed
and not submitted, not completed but on schedule, or not completed and
not on schedule.  Of the 68 cases listed by the Regional  Offices, 32 had
been completed (47 percent), 16 had not been completed (24 percent), and
responses were incomplete for 15 cases (22 percent).  Five cases (7
percent) had been cancelled or were ongoing.  Of those cases completed, 5
(16 percent) had not been submitted to EPA.  Of those not completed, 9
cases (56 percent) were behind schedule.

     Fourteen agencies responded with reasons for study delays.  Since
the listed reasons were not mutually exclusive, many States gave more
than one reason for a delay.  The agencies listed 36 causes of delay for
31 studies.

     The following causes were cited by the agencies that responded:

     •  Schedule was overly optimistic       11 responses (30 percent)

     •  Change in guidance or scope of
        work from original plan              9 responses (25 percent)

     •  Schedule contingent on other
        action that did not occur            8 responses (22 percent)

     •  Insufficient resources               5 responses (14 percent)

     •  Unclear or unassigned
        responsibilities                     1 response (3 percent)

     •  Other                                2 responses (6 percent)


                                   3-42

-------
                                 TABLE 3.10

                    REASONS FOR DELAYS IN THE SUBMITTAL
                      OF SIP REVISIONS AND STRATEGIES

              Percentage of Respondents Citing Cause of Delay(b)

     EPA Region(a)     A     B     C     D     E     F     G     H

          I            0     2     9     2     0     5     0     1
          II           0     0     0     1     1     0     0     0
          III          2     1     2     2     1     0     1     1
          IV           2     0     4     7     0     5     2     3
          V            0     1     6     4     0    11     1    12
          VI           0     0     0     0     0     1     0     0
          VII          0     0     0     0     0     0     1     0
          VIII         0     0     5     1     0     3     3     4

        Total          4     4    26    17     2    25     8    21

a  No responses were received from Region IX.

b  The cause of delay code is explained on page 3-42.
                                     3-43

-------
[Bar chart:  percentage of respondents (0 to 40 percent) citing each cause
of delay:  unclear/unassigned responsibilities, no formal tracking system,
schedule overly optimistic, guidance/scope change from original plan,
executive/legislative oversight, insufficient resources, schedule
contingent on other action, and other.]

FIGURE 3.1   REASONS FOR DELAYS IN SIP RESPONSES
                            3-44

-------
     Transportation Control Measures

     This part of the questionnaire was intended to determine the  degree
to which transportation control measures (TCMs)  to control  VOC and CO
were being implemented.  State and local agency officials were asked  to
list three or more principal TCM categories after consultation with the
Regional Office, and give the VOC and CO emission reduction credits
assumed for each one in the SIP.  Most of the TCMs listed (94 percent)
can be categorized under traffic flow improvements (41 percent), mass
transit (26 percent), carpooling (20 percent), and vehicle  inspection/
maintenance (7 percent).  The remaining TCMs involved primarily alternate
transportation (such as bicycle lanes) and parking restrictions.   Emission
reductions of VOC and/or CO were provided for 49 percent of the 166
specific TCMs listed.
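As a small illustrative check of the category breakdown above (shares taken
from the text; this sketch is not part of the original audit), the four
principal categories account for 94 percent of the 166 listed TCMs:

```python
# Shares (percent) of the 166 listed TCMs by principal category,
# as reported in the audit text.
tcm_shares = {
    "traffic flow improvements": 41,
    "mass transit": 26,
    "carpooling": 20,
    "vehicle inspection/maintenance": 7,
}

principal_share = sum(tcm_shares.values())
other_share = 100 - principal_share  # alternate transportation, parking, etc.

print(principal_share)  # 94
print(other_share)      # 6
```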

     Agency officials were then asked to name the agency responsible  for
implementing each measure, the extent to which the measure  had been fully
implemented, and the basis for determining the extent of implementation.
The average percent of implementation reported by the agencies was 74
percent for traffic flow improvements, 55 percent for mass  transit, 80
percent for carpooling, 89 percent for vehicle inspection/maintenance,  and
68 percent for the remaining measures.  Responsible agencies included
primarily State and local agencies such as transportation,  highway, and
motor vehicle departments, and various planning commissions.  Estimates
of implementation were derived from RFP reports, annual  reports, construc-
tion schedules, bus utilization and purchase records, and agency estimates.

     Source Activity

     This section of the questionnaire was concerned with projections of
the growth of source activity that agencies make as a part  of their SIP
development.  The questions were to determine (a) if verification  checks
on these projections are made, (b) the accuracy of the projections, and
(c) whether any action is contemplated to account for significant
differences.  Four projection categories considered indicators of  emissions
were included:  population, employment, vehicle miles traveled, and major
stationary source emissions (for cases where a composite growth rate  was
assumed in the SIP).  Responses were requested for each  metropolitan  area
that received an attainment date extension to 1987 or for which EPA called
for a SIP revision for CO or ozone.

     State and local agencies were first asked to list the  growth  projec-
tions assumed in the latest SIP revision, the actual growth that has
occurred, and the difference between these values.  Projections of popula-
tion growth ranged from 0.5 percent per year lower, to 3.4  percent higher,
than the actual changes (data for 22 metropolitan areas).  The average
absolute difference (disregarding the sign of the differences) was 1.0
percent per year.  Employment projections ranged from 3.9 percent  lower,
to 3.7 percent higher, than reality (13 data values).  The  average absolute
difference was 1.8 percent per year.  The greatest discrepancies occurred
in projections of vehicle miles traveled, which ranged from 10.4 percent
lower, to 6.7 percent higher, than true growth figures (18  values).  For
major source emissions, projections ranged from 5.8 percent lower, to 2.8
percent higher, than the actual figures (11 values).  The average  absolute
difference was 1.4 percent.
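The "average absolute difference" used above is the mean of the magnitudes
of the projection-minus-actual differences, disregarding sign.  A minimal
sketch follows; the sample values are hypothetical, since the report's
underlying data are not reproduced here:

```python
def mean_abs_difference(diffs):
    """Mean of |projected - actual| annual growth rates, percent per year.

    The sign of each difference (over- versus under-projection) is
    ignored, matching the "average absolute difference" in the text.
    """
    return sum(abs(d) for d in diffs) / len(diffs)

# Hypothetical projection-minus-actual differences (percent per year);
# these values are illustrative only.
sample_diffs = [-0.5, 1.2, 3.4, -0.8, 0.9]
print(round(mean_abs_difference(sample_diffs), 2))  # 1.36
```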

                                   3-45

-------
     Agencies were next asked how recently these projections had been
reevaluated using more current information, whether significant differences
were found, and what action was contemplated.  Table 3.11 summarizes the
responses made to these questions.  Most projections (over 80 percent)
had been reevaluated within the 1 to 2 year period prior to the audit.
In performing these reevaluations, agencies found significant differences
from the projections made in the latest SIP revision nearly 50 percent of
the time.

     Many respondents contemplated some form of action as a result of
finding significant differences between their projections and actual growth
figures.  Several of them said that the figures would be revised in their
upcoming RFP reports, while about the same number said that the SIP would
be revised.  A number said that no action was contemplated, with some of
these providing the reason that no action was necessary because air
quality was not impacted by the projection errors.

     Generic Bubble Rules

     State and local agencies were asked in the audit questionnaire if
they had a generic emission trading (bubble) rule, and the number of
bubbles submitted to the EPA Regional  Offices in FY 1987.  If they did
have a bubble rule, a second questionnaire was filled out by the audit
team for each source involved.

     Twenty States indicated in the questionnaire that they have generic
bubble rules.  Of the 20, only 9 have generic rules that have been formally
approved by EPA.  To have operable programs, the remaining States should
either have these overall rules approved by EPA rulemaking or submit
individual  bubble actions to EPA as SIP revisions.  If States do not, the
source may be subject to EPA enforcement action (because the source is
not in compliance with the EPA-approved SIP).

Bubbles Issued Under EPA-Approved Generic Rules

     State and local agencies were asked in the questionnaire to indicate
the total number of bubbles ever issued under their generic rules.  Six
of the agencies with EPA-approved generic rules stated that a total of 35
individual  bubbles have been issued.

     The audit results of the sources revealed that before the bubble, 70
percent of emissions exceeded the allowable emission standard.  Once these
bubbles were issued, an overall drop in source emissions occurred to rates
less than or equal to allowable emission standards.  However, no banked
emissions resulted from the bubble rules for any source, half of the PSD
baseline dates had been triggered, and 27 percent of net baselines increased.

     The audit questionnaire revealed failures by State and local agencies
to document or to communicate with EPA on the following points:

     •  Generic rule procedures followed - 18 percent

     •  Bubble emission limits interfered with attainment and maintenance
        of NAAQS - 8 percent
                                   3-46

-------
                                 TABLE 3.11

                 AGENCY REEVALUATIONS OF GROWTH PROJECTIONS

                                                           Significant
                      Years Since Last Reevaluation     Differences Found

                No. of                                  No. of   Percent Yes
 Category      Responses   0     1     2     3     4   Responses  Responses

 Population       21      10%   43%   24%   14%   9%      28         57%
 Employment       13      --    38%   54%   --    8%      17         41%
 Vehicle Miles
   Traveled       19      11%   68%   16%   --    5%      25         48%
 Source
   Emissions      14       7%   57%   29%   --    7%      14         43%
                                    3-47

-------
     •  EPA modeling guidance in approving the bubble - 20 percent

     •  Bubble subjected to EPA's notice and comment procedures before
        approval  - 50 percent

     •  Bubble application copies sent to EPA - all

     •  Permit copy sent to EPA - 67 percent.

Bubbles Issued Under Generic Rules Not Approved by EPA

     Twenty-two bubbles were approved by States under generic rules that
were not approved by EPA.  All of the bubbles had been submitted to EPA
for formal approval, but none had received this approval at the time of
the audit.

     The audit results for these bubbles revealed that before the bubble
rule, all  of the source emissions met the allowable emission standards.
After the bubble rule was in place, increases in actual emission rates
occurred,  but 67 percent of the sources' emissions were less than or
equal to the allowable emission rate.  However, the bubbles did not
result in  banked emissions and all  of the PSD baseline dates had been
triggered.

     The audit questionnaire also showed that State and local agencies
had not documented or communicated with EPA on the following points:

     •  Generic rule procedures followed - all

     •  New emission limits violate PSD increments - all.
                                   3-48

-------
                                CHAPTER 4

                            NEW SOURCE REVIEW
4.1  EXECUTIVE SUMMARY

     For the FY 1986-87 audit cycle, EPA retained the same NSR audit pro-
cedure as the one followed in 1985.  This procedure involved the examina-
tion of State and local agency permit files for all kinds of stationary
sources.  EPA auditors were instructed to review both major and minor
source permit files and to complete a file questionnaire for at least a
minimum number of the files that were actually reviewed.  EPA then took
the completed questionnaires and entered them into a computerized data
base from which further analyses were performed.

     This audit report chapter covers seven major topics.  These topics
are the same ones addressed during the last two national audits and are
the topics originally selected by the NSR audit committee in FY 1983.
This chapter presents the major audit findings under each of the seven
NSR topics.

     EPA conducted onsite NSR audits of 57 air pollution control agencies,
including 44 States,1 Puerto Rico, the Virgin Islands, and 11 local
government programs.  Altogether, these audited agencies processed approx-
imately 13,115 permits for new and modified sources of all types and
sizes during the approximate period upon which this audit report is based.
EPA auditors spent approximately 1,140 hours examining 602 permit files.
Auditors carried out a comprehensive examination of about 56 percent of
these files by completing detailed questionnaires in accordance with the
FY 1986-87 NAAS guidance.  These questionnaires were then forwarded to
EPA Headquarters, keyed into a computerized audit data base, and analyzed
to prepare this national  audit report.

     This year's audit findings support many of the findings from the FY 1985
NSR audit.  That is, the findings again indicate that most agencies perform
their overall new source review program responsibilities reasonably well,
although, in a number of State and local agencies, problems were iden-
tified with respect to the consistent and adequate application of certain
specific program requirements.  In addition, auditors occasionally mentioned
1  The State of California does not have the authority to issue permits
   and does not implement a preconstruction review program.  Instead, all
   source permitting activities in California are performed by local  air
   pollution control districts.
                                    4-1

-------
that noticeable improvements had already been observed in the performance
of some agencies where specific deficiencies had previously been found.
This is certainly to the credit of these agencies in their efforts to
improve in-house performance and to strive for a greater level  of national
consistency.

Public Notification Procedures

     Overall, agencies generally complied with the public participation
requirements as applied to major sources, although EPA found some incon-
sistencies in the informational content of the public notices.   However,
while most proposed permits for NSR/PSD sources were announced  to the
public for an opportunity to review and comment, the majority of other
permits were not.  This was demonstrated by the fact that EPA found
evidence of public notification in only 36 percent of the non-NSR/PSD
permits that auditors examined.  Moreover, 29 percent of the agencies
reportedly did not issue public notices for any of the non-NSR/PSD permits
that were examined (compared to the FY 85 result of 42 percent).

Applicability Determinations

     A significant number of agencies continued to experience difficulties
with the way that they carry out the source applicability process.  EPA
believes that approximately 17 percent of the audited permits not reviewed
under PSD or NSR requirements probably should have been.  In addition, the
lack of sufficient file documentation often precluded EPA from  adequately
evaluating the agencies' applicability determinations.

     EPA identified various types of problems, but most pertain to the
way that agencies account for a new or modified source's emissions in
order to determine whether a major review would be required under either
PSD or nonattainment area regulations.  One particular problem pertains
to the misuse by numerous agencies of the concept of "potential to emit",
which involves the use of Federally enforceable permit conditions to
properly restrict a source's potential emissions (as is often attempted
in order to enable a source to avoid major source review).  EPA has
initiated training courses to provide agency guidance in the determination
of applicability, including the concept of "potential to emit".

BACT/LAER Determinations

     Agencies generally did a good job of applying the BACT requirements
to PSD sources and applicable pollutants.  However, EPA concluded that the
quality of the analysis performed to select the level of control defining
BACT on a case-by-case basis could be improved in some instances.  As was
the case in FY 85, EPA also found that when a PSD source was subject to
NSPS, agencies had a tendency to accept the use of the applicable NSPS to
define BACT.  Even though examples of BACT determinations more  stringent
than NSPS were identified in PSD permits issued by 11 of 22 agencies,
BACT was established at levels required by NSPS for approximately 60
percent of the pollutant determinations.

     Agencies showed far less tendency to use NSPS for LAER determinations
in nonattainment areas.  Approximately 25 percent of the time,  agencies
                                    4-2

-------
required pollutants to be controlled at levels more stringent than the
applicable NSPS.  This is down from the 50 percent figure obtained during
FY 85.  These findings suggest that BACT and LAER requirements do not yet
have the technology-forcing effect that Congress envisioned when it
established the requirements, and that EPA needs to provide more explicit
guidance for making BACT/LAER determinations.

Ambient Monitoring

     This year's audit substantiated a finding identified during the FY
1985 audit.  That is, agencies commonly allow PSD applicants to comply
with the preconstruction monitoring data requirements by relying upon
existing ambient air quality data instead of new data collected from a
special source-operated monitoring network.  EPA accepts the use of such
existing data if it can meet certain criteria for representativeness.
Due to inadequate file documentation, auditors were not able to ascertain
whether 7 PSD sources (out of 51 audited PSD permits) were correctly
exempted from addressing any ambient air monitoring requirements.

Ambient Impact Analyses

     EPA verified that agencies generally required ambient impact analyses
to be conducted where needed to ensure protection of the increments and
NAAQS.  In addition, most agencies generally used or required the use of
the appropriate models and model  options to complete the NAAQS analyses.
However, certain questions arose concerning the quality of these analyses
in a number of cases.  With respect to both PSD increment analyses and
NAAQS analyses, agencies did not appear to consistently give thorough
attention to significant emissions from existing sources (major and
minor) located within the impact area of the proposed source.  Another
key problem was the lack of sufficient documentation of the details of
the analyses.  This prevented EPA from being able to properly evaluate
the adequacy of the analyses contained in a significant number of files.

Emission Offset Requirements

     Only 19 percent of the audited agencies issued NSR permits to major
sources in nonattainment areas during the audit period.  EPA's examination
of the NSR permits provided no indication of any nationally significant
problems.  As with other phases of the audit, however, EPA encountered
problems with inadequate file documentation and this precluded an adequate
evaluation of the full creditability of some of the emission offsets that
agencies required.

Permit Specificity and Clarity

     This year's audit raises the same concern that was identified in the
FY 1985 audit about the enforceability, and more specifically the Federal
enforceability, of some of the permits that agencies are issuing.  The
acceptable use of physical and operational  limitations to restrict the
year-round operation and production capacity of a source hinges upon the
Federal enforceability of the limitations.  EPA auditors questioned the
enforceability of such presumed limitations in a number of cases where
sources had been allowed to avoid major source review.
                                    4-3

-------
4.2  INTRODUCTION

     For the 1986-87 audit period, EPA continued its focus on the
examination of State and local agency permit files.  EPA guidance for
the NSR audit called for an onsite review of both major and minor source
permit files, with the review addressing seven major audit topics.
These audit topics, selected by the NSR audit committee, included:
(1) Public Notification Procedures, (2) Applicability Determinations,
(3) BACT/LAER Determinations, (4) Ambient Monitoring (PSD), (5) Ambient
Impact Analysis, (6) Emission Offset Requirements, and (7) Permit
Specificity and Clarity.  The results of the 1986-87 NSR audit pertaining
to each of these selected topics are reported in the remainder of this
chapter.

     The information used to develop the NSR audit findings contained
herein was collected by EPA Regional Office audit teams using the same
questionnaires as were used for the FY 1985 NSR audit, although some
revisions to the questionnaires were made to improve their utilization
and the quality of data being reported.  Four questionnaires were employed
as follows:  (1) a permit summary questionnaire (Form 1) which was completed
by each audited agency in order to summarize the permitting activities
during the applicable period covered by the audit, (2) an NSR audit summary
questionnaire (Form 2) which was completed by the EPA audit team following
the actual  onsite audit, and (3) two kinds of permit file questionnaires
which were completed onsite to collect information found in a selected
portion of the permits issued by the audited agency.  Copies of these
questionnaires, along with the necessary guidance for completing them,
can be found in the FY 1986-87 audit guidance manual prepared by EPA.

     The NSR audit considered information obtained from 57 air pollution
control agencies, including 44 States,1 Puerto Rico, the Virgin Islands,
and 11 local  government programs.  Where a State program was carried out by
one or more offices (i.e., headquarters or central office plus district
offices) and more than one of the offices was audited, they were all
considered as part of one State program (agency).  Local agencies were
considered separately, even though there may in some cases have been a
dependency on the State agency for certain program operations.

     For the time period upon which this report is based, the audited
agencies processed approximately 13,115 permits, including 101 PSD and
42 NSR permits.  EPA, in turn, examined 80 percent of the PSD permits, 24
percent of the NSR permits, and about 4 percent (504) of the other (non-NSR/
PSD) permits.

     While each auditor was encouraged to examine as many permit files as
he or she could, the NAAS audit guidance did not require the completion
of a questionnaire for each permit file that was examined during the
audit.  Instead, a certain minimum number of PSD, NSR, and non-NSR/PSD
1  The State of California does not have the authority to issue permits
   and does not implement a preconstruction review program.   Instead, all
   source permitting activities in California are performed  by local  air
   pollution control districts.

                                   4-4

-------
source questionnaires was prescribed.  The total number of permit file
questionnaires submitted to EPA Headquarters, and subsequently keyed into
a computerized data base and analyzed to prepare the enclosed audit find-
ings, represents about 56 percent of the number of files actually examined.
Reference to a percentage of the "total  number of audited files"  as used
throughout this report refers only to those permits that were included
in the EPA Headquarters data base, as follows:  51 PSD permits, 17 NSR
permits, 6 permits involving both PSD and NSR, and 273 non-NSR/PSD permits.

     Some findings of general interest are:

     o  Twenty-eight (28) agencies issued NSR/PSD permits, 17 of the
        audited agencies issued no NSR/PSD permits at all, and one
        agency accounted for greater than 20 percent (21 permits) of
        the PSD permits.

     o  Agencies issued permits to an estimated 756 sources whose
        potential  emissions ranged from 100 to 249 tons per year  (TPY),
        but whose  preconstruction review is typically categorized as a
        "minor" source review.  This is because these sources, locating
        in attainment or unclassified areas, are not subject to Federal
        PSD requirements, i.e., they are not listed under 40 CFR  51.24(b)
        (1)(i)(2).  Consequently, for them to be classified as "major"
        PSD sources, each would have to emit at least 250 TPY of  any
        regulated  pollutant.

     o  EPA auditors spent approximately 1,140 hours primarily examining
        agency permit files.  On the average, auditors spent 20 hours per
        agency audit and nearly 2 hours on each permit file.  Actual  review
        time for individual files ranged from 30 minutes to an unusually
        high 25 hours.

     Finally, auditors were asked, on the audit summary form, to  identify
from their own overall perspective, the five most significant problems
encountered for each agency audit.  From the lists of problems submitted,
five specific problem areas stood out as being the most often mentioned.
They are:

                                           Frequency of      Percent of
               Problem Area                 Occurrence          Total

     o  Applicability Determinations            32               26
     o  Permit Conditions                       31               26
     o  Impact Analysis                         16               13
     o  BACT/LAER Determinations                12               10
     o  Public Notification Procedures          10                8
                                     Total     101               83

Two other problem areas collectively constituted 5 percent of the total
responses:  ambient monitoring and offset requirements.
                                    4-5

-------
4.3  SUMMARY OF MAJOR FINDINGS

     This section summarizes the major findings of the FY 1986-87 NSR
audit.  These findings are presented for each of the seven audit topics
subject to review.  Where appropriate, this year's audit findings are
compared to the findings from the previous audit period to identify where
improving or worsening performance trends may be occurring.  For a better
understanding of how the major findings were derived, the reader is
referred to sections 4.4 through 4.10, where a breakdown of the individual
audit questions and responses is provided.

Public Notification Procedures

     •  While most proposed permits for NSR/PSD sources are announced to
the public for review and comment, most other permits are not.  EPA found
evidence of public notification in only 36 percent of the non-NSR/PSD source
permits that were examined.  Agency rules excluded approximately half (53
percent) of the minor source permits from the public notification process.
(See Figure 4.1.)

     •  Approximately one-third of the audited agencies issued public
notices for all permits reviewed, and one-third issued notices for some,
but not all, of their audited permits.  Twenty-nine (29) percent of the
audited agencies did not issue public notices for any non-NSR/PSD permits
that EPA reviewed.  (See Figure 4.1.)

     •  Sixty-four (64) percent of the agencies that issued PSD permits
routinely included the information required by regulation for public notices.
Sixty (60) percent of the public notices for PSD sources included all  of
the information required by the PSD regulations.  The information most
frequently omitted was the description of the source's estimated ambient
impact, including the amount of PSD increment that would be consumed.

     •  Federal Land Managers were not always notified of PSD construction
that might adversely affect Class I areas.  Eleven (11) of 16 PSD permits
involving construction within 100 km of a Class I area were brought to the
attention of the appropriate Federal Land Manager.  No record of notification
was apparent in the remaining five permit files -- each issued by a different
agency.

Applicability Determinations

     •  A significant number of agencies are experiencing difficulties in
adequately carrying out the source applicability process.  EPA believes
that approximately 17 percent of the audited non-NSR/PSD permit files
involved sources that probably should have been reviewed as major sources.
A lack of adequate file documentation often prevented EPA from adequately
evaluating the agencies' applicability determinations.

     •  EPA found examples where seven agencies (approximately one-half of
the FY 85 total) either failed to consider certain pollutant-emitting
activities at a source, or improperly interpreted an exemption provision.
Only in the latter case, however, were sources enabled to avoid major
source review.  (See Figure 4.2.)

                                    4-6

-------
             Did Agency Issue Public Notifications?
                    Percent of Agencies

                    Yes, Routinely      No      Sometimes
   FY 1985               33%            42%        25%
   FY 1986-87            37%            29%        34%

                  How Were Permits Issued?
                    Percent of Permits

   FY 1986-87
      With Public Notice - 36%
      Without Public Notice - 53%
      Undetermined - 11%

Above information based only on non-NSR/PSD permits

 FIGURE 4.1.  AGENCY PERFORMANCE ON PUBLIC NOTIFICATION
                           4-7

-------
               Percentage of Audited Agencies

    [Bar chart: "Netting" Problems and "Source" Problems,
     FY 1985 vs. FY 1986-87]

  Percentages are mutually exclusive and based on 64 total
   audited agencies during FY 1985, and 57 total audited
                 agencies during FY 1986-87.

Figure 4.2.  APPLICABILITY DETERMINATIONS - AGENCY PERFORMANCE
                              4-8

-------
     •  Agencies continue to have problems properly defining a new or
modified source's "potential to emit."  Twenty-five agencies had permit
files that EPA considered to be deficient in some respect concerning the
calculation or use of potential emissions for source applicability purposes.
Sometimes, but certainly not always, these problems resulted in agencies
not subjecting proposed sources to the correct preconstruction review
requirements.  As Figure 4.2 illustrates, this represents a significant
decrease from the FY 85 audit findings.

     •  EPA identified 17 agencies that either did not require the proper
"netting" procedures to be followed to calculate the change in emissions
at a modified source, or did not provide sufficient file documentation to
enable adequate evaluation of the agency's procedures.  In these agencies,
EPA found at least 49 files with procedural  or documentation problems.
(See Figure 4.2.)

     •  EPA found no evidence of the improper "double counting" of emission
reduction credits used for netting purposes.  Some agencies need to be
more careful about ensuring that each emission reduction credit is made
Federally enforceable—an important criterion for properly using emission
reductions for netting purposes.

BACT/LAER Determinations

     •  Most agencies routinely complied with the PSD requirement for
applying BACT to each regulated pollutant emitted in significant amounts.
EPA found exceptions in a total of 14 permits issued by 11 agencies.  In only
three agencies did the problem occur in more than one permit.

     This year's audit findings generally support the agencies' claims:
most (65 percent) of the audited PSD files (at 86 percent of the
agencies) did address, to some degree, consideration of alternative
control techniques (Figure 4.3).  Specifically, auditors found that:

     .  Fifty-seven (57) percent of the agencies where PSD permit files
        were examined had files in which control alternatives were
        routinely considered;

     .  Twenty-nine (29) percent had some files that addressed alternatives
        and other files that did not;

     .  Fourteen (14) percent of the agencies had no PSD files that
        addressed alternative controls for BACT; and

     .  Sixteen (16) percent of the PSD files where control alternatives for
        BACT were considered failed to adequately address the impacts for
        each alternative in order to demonstrate the rationale for
        selection of a particular control technique.

     •  Collectively, agencies showed a tendency to accept the use of the
applicable NSPS to define BACT for PSD sources.  Even though examples of
BACT determinations more stringent than NSPS were found in PSD permits
issued by 11 of 22 agencies, BACT was established at levels required by


                                    4-9

-------
NSPS for approximately 60 percent of the pollutant determinations  for BACT
(Figure 4.4.)

     •  Agencies showed far less tendency to use the NSPS level  for LAER
determinations than for BACT.  Agencies required emission limits more
stringent than NSPS to establish LAER in six of seven pollutant  determina-
tions, affecting five major nonattainment area sources otherwise subject
to NSPS.

Ambient Monitoring

     •  With only a few possible exceptions, agencies typically  required
PSD applicants to address the preconstruction monitoring requirements where
applicable.  Where agencies did exempt PSD applicants from the requirements,
permit files usually provided an adequate demonstration that the proposed
sources' impacts were de minimis.  For 14 approvals made by 9 agencies,
however, the auditors believed that the sources should have been required
to collect ambient monitoring data rather than rely on existing
representative data.

     •  Nineteen (19) agencies required a total of 28 applicants to comply
with the preconstruction monitoring requirements.  Of those, 26 PSD
applicants were allowed to rely solely on existing ambient air quality
data, while two agencies each required one PSD applicant to monitor for
one or more pollutants.

        In approximately 20 percent of the cases where agencies  accepted
the use of existing data, the permit files (a) offered no documented
basis for allowing its use, or (b) contained some description of the data
but failed to adequately address or meet all of the EPA criteria for
representative data.

Ambient Air Quality Analysis

PSD Increment Analysis—

     •  Twenty-three (23) agencies required 32 PSD applicants to meet either
the TSP or SO2 increments, or both.  In only one case did an auditor find
that an increment analysis (Class II for SO2) should have been performed
but was not.
However, in more than half of the affected agencies, EPA found that either:
(a) the analyses did not adequately address existing major and minor source
emissions which also consumed increment, or (b) the permit files did not
provide sufficient information to enable auditors to evaluate the  analyses.
(See Figure 4.5.)

     •  Agencies typically gave adequate consideration to both long- and
short-term PSD increments and tended to be conservative in their use of
modeling results.

NAAQS Protection--

     •  EPA auditors identified three NSR/PSD permits for which  an NAAQS
analysis was completely omitted but should have been required.  However,
five agencies were found to have (a) incorrectly omitted certain pollutants
                                    4-10

-------
    [Bar chart: percent of agencies -- Alternatives Routinely Considered,
     Alternatives Considered in Some Cases, Alternatives Not Considered]

           Statistics based on agencies issuing PSD permits

FIGURE 4.3.  BACT DETERMINATIONS:  Consideration of Control Alternatives


    [Bar chart: pollutant determinations by number of agencies --
     control more stringent than NSPS vs. control equal to NSPS]

         FIGURE 4.4.  RELATIVE STRINGENCY OF BACT DETERMINATIONS


                                       4-11

-------
               PSD Increment - Percent of Agencies

     Adequate Analysis 17%          Inadequate Analysis 15%
     Insufficient Documentation 8%  Not Applicable 60%

               NAAQS Protection - Percent of Agencies

     Typically Adequate 20%         Inconsistent 13%
     Inadequate 12%                 Insufficient Documentation 20%
     Not Applicable 35%

FIGURE 4.5.  AGENCY PERFORMANCE ON AMBIENT AIR QUALITY ANALYSIS
                                  4-12

-------
from the analysis, or (b) conducted an insufficiently comprehensive review
of some pollutants.

     •  Four agencies, each having a PSD file that included an NAAQS
analysis, should have required additional  analyses, either for omitted
pollutants or for more comprehensive review of considered pollutants.

     •  Most agencies appear to scrutinize non-NSR/PSD source permit
applications individually to determine whether an NAAQS analysis should be
done.  However, 16 agencies were found to have issued permits to sources
that probably should have been subjected to NAAQS analyses, but were not.
EPA identified a total of 16 permits for which this omission occurred.

     •  Thirty-two (32) percent of the agencies typically had files that
either:  (a) lacked sufficient documentation to enable EPA to determine
whether and to what degree source interactions had been considered in  the
NAAQS analysis, or (b) omitted significant emissions from other sources
in the vicinity of the proposed source.  (See Figure 4.5.)

Dispersion Models--

     •  Most agencies generally used or required applicants to use the
appropriate models and model options in the NAAQS analyses performed.
However, in many instances the lack of documentation fully describing the
rationale for the particular models and methods used hindered the
auditors' evaluation.

     •  Apparently, most agencies do not often require minor sources to
perform the modeling analysis.  Seventy (70) percent of the time, the  minor
source analyses were performed by the agencies themselves; but in the  18
cases where the applicants did submit the analyses, adequate checks by
the responsible agency occurred 80 percent of the time.

Emission Offset Requirements

     •  Eleven agencies issued NSR permits to major sources in nonattainment
areas.  EPA found a few examples of areas where agencies experienced specific
problems and one case where the Federal enforceability of an offset
granted was questionable.  Agencies typically required offsets to occur
on or before the time of new source operation and to be expressed in the
same manner as emissions used for the demonstration of reasonable further
progress.

Permit Specificity and Clarity

     Where limits were specified in the permits, auditors were asked to
evaluate them in terms of their (1) clarity, (2) consistency with measurement
techniques, and (3) Federal enforceability.  The results are presented in
Figure 4.6.  With regard to these three criteria, the FY 86-87 results show
almost universal improvement over the results of FY 85.

     •  Thirty-five agencies did not appear to routinely state or
reference each source's allowable emissions in the applicable permit.  The
                                   4-13

-------
                                    PSD/NSR Permits     Non-PSD/NSR Permits
                                    FY 85   FY 86-87    FY 85   FY 86-87

 Clear and concise
 averaging periods                   75%      87%        76%      78%

 Emission rates consistent
 with acceptable measurement
 techniques                          78%      83%        72%      88%

 Federally enforceable               83%      88%        78%      87%

    *Based on total number of permits containing emission limits.

            FIGURE 4.6.  CONDITION OF ISSUED PERMITS
                            4-14

-------
omission appeared to be a common occurrence, at least for non-NSR/PSD
sources, in 13 of these agencies.

     •  At least 13 agencies did not appear to identify routinely each
emission unit along with its allowable emission limit in the permits.  In
some cases, it may be agency policy to do so primarily for PSD permits or
where needed to avoid NSR/PSD review.

     •  Eighteen agencies reportedly either did not at all or did not
consistently state or reference compliance test methods in the permits.
Where the practice is not followed consistently, it is not clear what
criteria, if any, agencies use to determine whether the methods are to be
stated or referenced in the permits.

4.4  PUBLIC NOTIFICATION PROCEDURES

     The audit examined State and local  agency procedures for notifying
the public of proposed permit actions.  The procedures were reviewed with
specific concern for (1) the type of sources for which notices were
issued, (2) the adequacy of the information contained in the notices, and
(3) the extent to which other agencies and officials are informed of
pending permit actions that could affect the air quality in their
jurisdictions.

     1.  For which new or modified sources was the public afforded an
         opportunity to comment on proposed permits?

     The FY 1986-87 NSR audit findings indicate that, while most proposed
permits for NSR/PSD sources are announced to the public for comment, most
other proposed permits are not.  Approximately 90 percent of the NSR/PSD
permits examined by the auditors were announced through a public notice.
This percentage represents no change from the FY 85 results.

     In contrast to the results for NSR/PSD permits, auditors found many
non-NSR/PSD permits for which there was  no evidence of public notification;
only 36 percent of the audited non-NSR/PSD permit files included public
notices.  Auditors concluded that 53 percent had been exempted from
notification by agency rules, while 5 percent of the permits that received
no public notification should have been noticed under agency rules.
Fifteen (15) percent of the questionnaires contained no response at all to
this question, primarily because of the lack of documentation in the
files.  None of the percentages cited represents a significant change
from the FY 85 audit results.

     EPA auditors recorded responses on public notice issuance procedures
for 40 of the 57 agencies audited.  Of the agencies for which responses
were recorded, this year's findings reveal that:

     o  37 percent of the agencies issued public notices for all of the
        permits that EPA examined

     o  34 percent issued public notices for some, but not all, audited
        permits (compared to 25 percent  for FY 85)

                                    4-15

-------
     o  29 percent did not issue public notices for any of the  audited
        non-NSR/PSD permits (compared to the FY 85 result of 42 percent).

     In order to help assess the general  value of public notification for
non-NSR/PSD permits, auditors were asked to indicate whether the public
notices resulted in any comments.  Of those minor permits for which  a
public notice was issued, the files contained no evidence that  public
comments had been submitted.

     2.  Do the public notices routinely provide adequate information?

     Auditors were asked to determine whether the following items of
information, required under the PSD regulations, were included  in the
public notices issued by State and local  agencies:

     o  Opportunity for written comment

     o  Opportunity for a public hearing

     o  Description of the agency's preliminary determination to approve
        or disapprove the permit

     o  Description of the source's estimated ambient impact

     o  Statement of the availability of additional information for  public
        inspection.

     Of the 28 agencies issuing PSD permits during the FY 1986-87 audit
period, 64 percent routinely addressed the information requirements  in an
adequate manner.  Specifically, 60 percent of the public notices issued
for PSD permits by these agencies included all  of the required  informa-
tion.  The remaining 36 percent of the agencies were inconsistent at
best.  The most frequently omitted information was the description of the
source's estimated ambient impact, including the amount of PSD  increment
that would be consumed.  This omission continues a trend from the previous
audit results.  Overall, approximately one-half of the public notices for
PSD permits were found to contain all of the required information outlined
above.

     Other types of permits, including those subject to major review in  a
nonattainment area, typically did not contain all the items of  information
listed above.  For example, the description of the source's estimated
impact was frequently omitted.

     3.  Were other State and local air pollution control agencies, and
         other officials whose jurisdictions might be affected  by the
         proposed new or modified source, adequately notified of the
         proposed action?

     Auditors identified 11 agencies where it did not appear that this
part of the notification procedure was being adequately carried out.  In
some cases, it was not apparent that outside agencies or officials had been
notified.   With regard to the NSR/PSD files, it appeared that  EPA and other
agencies within the State were notified when appropriate over 90 percent
                                    4-16

-------
of the time.  This percentage fell to less than 50 percent for the non-
NSR/PSD permits.  In most cases, the NSR/PSD permit files contained
evidence of notification.  This documentation was missing from the non-
NSR/PSD files more than half of the time.

     EPA policy calls for notification of the appropriate Federal  Land
Manager (FLM) when a PSD source proposes construction within 100 km of a
Class I area.  Sixteen PSD permits met this criterion.  Auditors verified
that 11 of the 16 PSD permits were brought to the attention of the appro-
priate FLM.  No record of notification was apparent in the remaining
files, each issued by a different agency.

4.5  APPLICABILITY DETERMINATIONS

     The specific types of requirements that apply to a proposed new
source or modification are generally based on the size of the new source
or modification, expressed in terms of its "potential to emit," and on
the geographic location where the proposed construction would occur
(attainment vs. nonattainment area).  Making the appropriate
applicability determination depends upon the existence of adequate
regulations containing the proper definitions and applicability criteria,
plus the in-house expertise to apply them correctly to each permit
application.

     EPA auditors examined the selected permit files to evaluate each
agency's ability to adhere to the approved definitions of "source" and
"potential to emit," and how well each agency verified and corrected,
where necessary, the emissions estimates provided by the applicants.  As
was the case in the FY 85 audit, the overall findings pertaining to
applicability determinations suggest that a significant number of  State
and local  agencies are experiencing difficulties in adequately carrying
out the source applicability process.  Overall, EPA found that:

     •  Approximately 17 percent (40 permits) of the audited non-NSR/PSD
permit files involved sources that, in the auditors' judgment, should have
been reviewed as major sources.  This represents a slight increase over
the FY 85 results.

     •  Another 5 percent of the audited non-NSR/PSD files did not contain
sufficient information about the sources' emissions to enable the  auditors
to indicate whether the correct applicability determinations had been
made.

     •  Twenty-one (21) agencies had at least one source that should have
been reviewed as major.

     Described below are the findings as they relate to the various
aspects of the applicability determination process.

     1.  Does the agency properly apply its approved definition(s) of
"source"?

     EPA found, in seven agencies, nine non-NSR/PSD permits for which
certain pollutant-emitting activities had not been considered in defining
the subject source.  Four sources, in four agencies, escaped review

                                   4-17

-------
because new or modified pollutant-emitting activities were not included
in the definition of source.  Twelve (12) permits at 10 agencies did not
have sufficient information to determine if the source escaped review due
to the omission of a pollutant-emitting activity.

     EPA did, however, identify other problems that, while not related to
the definition of source, involved other source-related issues.  These
source-related problems kept sources from being properly regulated under
the agencies' permit requirements.  No one problem was widespread, and
correction of each would appear to require greater attention on the part
of each agency to correctly interpret its applicable regulations.  EPA
identified the following types of problems:

     •  incorrect application of the 250 TPY source category criterion;

     •  the use of external  offsets to net out of review;  and

     •  lack of Federally enforceable limits.

     2.  Does the agency typically use the best available  emissions
     projections and Federally enforceable limitations in  defining a
     source's "potential to emit (PTE)"?

     The PTE is a source's maximum capacity to emit a pollutant under its
physical and operational design.  In order for any physical or operational
limitation (e.g., a restriction to less than 24-hour-per-day, year-round
operation, or a fuel usage restriction) to be considered part of the
source's design (thereby restricting the maximum pollutant-emitting
capacity of the source), the limitation must be Federally enforceable.
The major status of new or modified sources must be determined on the
basis of their potential emissions.
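     As an illustration only, the PTE logic described above can be sketched
in a few lines.  The function name, the emission-rate units, and the
treatment of each limit as a (fraction-of-capacity, enforceable) pair are
simplifying assumptions made for this sketch; they are not part of the
regulations.

```python
# Illustrative sketch of a "potential to emit" (PTE) calculation.
# Assumption: each operational limit is expressed as a fraction of
# full design capacity plus a flag for Federal enforceability.
# Only Federally enforceable limits may restrict design capacity.

HOURS_PER_YEAR = 8760          # continuous, year-round operation
LB_PER_TON = 2000.0

def potential_to_emit(max_rate_lb_per_hr, limits):
    """Return PTE in tons per year (TPY)."""
    capacity_fraction = 1.0
    for fraction, federally_enforceable in limits:
        if federally_enforceable:
            capacity_fraction = min(capacity_fraction, fraction)
        # Non-enforceable limits (e.g., in a State operating permit)
        # are ignored: the source keeps its full design capacity.
    return max_rate_lb_per_hr * HOURS_PER_YEAR * capacity_fraction / LB_PER_TON

# A 60 lb/hr source limited to half capacity only by a State permit:
pte = potential_to_emit(60.0, [(0.5, False)])   # 262.8 TPY
is_major = pte >= 250.0                         # True -> "major" PSD source
```

     Under this sketch, the same source with a Federally enforceable
50-percent restriction would have a PTE of 131.4 TPY and would fall below
the 250 TPY major-source threshold -- precisely the distinction the audit
found agencies handling inconsistently.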

     Twenty-five (25) agencies (44 percent of all audited  agencies)
were found to have a problem with their procedures for establishing a
source's "potential  to emit" (PTE).  This represents a significant decrease
from the FY 85 audit total of 59 percent of all audited agencies.  Sometimes,
but certainly not always, these problems appear to have resulted in
incorrect applicability determinations.  Problems related  to agencies'
determinations of PTE can be broken down as follows:

     •  failure of 28 permits (17 agencies) to ensure the  Federal enforce-
ability of all physical and operational limitations used in the PTE calcu-
lations;

     •  use of emission factors in four permits (four agencies) that are
not well-established or well-documented; and

     •  failure to include quantifiable fugitive emissions in five permits
from five agencies.  However, no source escaped major review because
fugitive emissions that should have been included were not.

     •  Twenty-nine (29) permits in 18 agencies were found by the auditors
to lack sufficient documentation for a determination of whether PTE was
correctly calculated.
                                   4-18

-------
     For any one or more of these reasons, EPA considered the PTE
determination in approximately 25 percent of the audited non-NSR/PSD
source files to be deficient.  More importantly, at least 18 percent of
the files where EPA found deficiencies reportedly should have been reviewed
as major sources.  This figure represents a slight decrease from the
20 percent figure obtained from the FY 85 audit.

     In 38 agencies, EPA found permit files for which the agencies (a)  did
not properly ensure the Federal enforceability of all physical  and opera-
tional  limitations upon which emission estimates were calculated, or (b)
did not adequately consider the potential emissions of existing facilities
where a modification was being proposed.

     EPA identified at least 14 permits where the agencies simply did not
establish permit conditions defining the necessary limitations  upon which
the sources' estimated emissions were based.  In other permits, some
necessary limitations were either not addressed at all or were  inadequately
restricted.  Sometimes the limitations were specified in operating permits
which EPA generally does not regard as being Federally enforceable, but
which are usually enforceable by State and local agencies.

     It is also important to point out that some agencies consider the
limitations to be enforceable if they are contained in the permit
application.  Apparently, some agencies include in permits a general
condition that links the applicants' plans and specifications to the
permits.  It is not clear when and how often auditors took this into
account when evaluating the Federal  enforceability of the limitations.

     In cases where a permit involved a modification to an existing
source, EPA sometimes found that no determination of the existing source's
PTE was made.  While the existing source's PTE is irrelevant to the
immediate applicability determination when the proposed emission increases
would not exceed prescribed significance levels, it is nevertheless
important to know the source's cumulative PTE for consideration in
subsequent modification proposals by that source.  As was the case in the
FY 85 audit, some files did not appear to contain any documentation of
the existing source's PTE or of cumulative emissions for future reference.

     Nine (9) agencies issued 11 permits (4 percent of the audited non-
NSR/PSD source files) that did not adequately address fugitive  emissions;
however, no source escaped major review because emissions which should
have been included were not.  Fourteen (14) percent of the non-NSR/PSD
source files did not provide sufficient documentation of the emission
calculations to enable EPA to verify whether fugitive emissions were
properly considered.

     3.  Does the agency use as its netting baseline actual emissions
     expressed in tons per year?

     No specific problems involving the NSR/PSD permit files were found.
Auditors did indicate, however, that insufficient documentation prevented
an affirmative conclusion from being drawn in just a few cases.
                                   4-19

-------
     With respect to the non-NSR/PSD files, EPA identified 17 agencies
that either did not require the proper procedures to be used to calculate
a net change in emissions, or did not provide enough information to enable
the auditors to determine whether actual  emissions were correctly
calculated.  In these agencies, EPA found at least 49 examples of specific
procedural or documentation problems.  The findings indicate that:

     •  Five agencies allowed proposed modifications to determine their
net change in emissions on the basis of potential or allowable emissions,
rather than actual emissions.  At least two of the agencies permitted new
replacement units, of equal capacity to units being shut down, without
considering the net change in actual emissions.

     •  Six agencies did not properly determine actual  emissions changes—
one failed to use a tons-per-year emission baseline, while five did not use
emissions that were representative of normal source operation.

     •  Twelve agencies did not provide sufficient information in some
files to enable the auditors to determine how the emission changes  were
calculated.
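     The baseline question above reduces to simple arithmetic.  As a
minimal sketch (the function shape and the enforceability flag are
illustrative assumptions, not the regulatory procedure itself), netting
against an actual-emissions baseline in tons per year might look like:

```python
# Illustrative sketch of an emissions "netting" calculation.
# Assumption: contemporaneous changes arrive as (delta_tpy, enforceable)
# pairs, where each delta is a change in ACTUAL emissions (tons per
# year, representative of normal source operation), not a potential or
# allowable rate.

def net_emissions_change(proposed_increase_tpy, contemporaneous_changes):
    """Net change in actual emissions (TPY) for applicability netting."""
    net = proposed_increase_tpy
    for delta_tpy, federally_enforceable in contemporaneous_changes:
        if delta_tpy < 0 and not federally_enforceable:
            # A reduction credit that is not Federally enforceable
            # cannot be used to net out of review.
            continue
        net += delta_tpy
    return net

# A replacement unit: +100 TPY new unit, -90 TPY enforceable shutdown.
net = net_emissions_change(100.0, [(-90.0, True)])   # 10.0 TPY net increase
```

     Note that in this sketch a non-enforceable shutdown is simply
excluded, so the same replacement would show the full 100 TPY increase --
mirroring the Federal enforceability criterion discussed under question 4
below.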

     4.  Does the agency check applications for proper use of contemporaneous
     emission changes to prevent the "double counting" of emission
     decreases for netting purposes?

     No evidence of double counting was found in the audited permit files.
Of some concern, however, is (1) the lack of documentation verifying that
double counting has not occurred, and (2) the apparent failure to make the
emission reduction credits Federally enforceable (eight permits).

     The lack of documentation was indicated in six agencies as the
reason why auditors could not verify that double counting had not occurred,
but it should be noted that there were no suggestions that any problems
were suspected.

     EPA requirements stipulate that emission reductions must be made
Federally enforceable.  This was reportedly not done in single permits
found in seven agencies.  In three of these agencies, emission reduction
credits were not addressed at all in permit conditions, thus raising the
question of whether the reductions are enforceable even by the affected
agencies.  Making the emission reductions enforceable conditions of the
permit also helps to ensure that subsequent double counting of such
emissions will not occur inadvertently.

     5.  Does the agency properly apply the §107 area designations  when
     determining what type of preconstruction review will be required of
     major construction?

     No clear examples of the misapplication of §107 area designations
were identified by EPA auditors.

     6.  Verify that the agency does not approve major construction
     projects in designated nonattainment areas under an EPA-imposed
     construction moratorium.
                                   4-20

-------
     The findings indicate that none of the audited agencies had issued
permits to sources locating in nonattainment areas where EPA had imposed
construction bans.

4.6  BACT/LAER DETERMINATIONS

     In this section, the audit examined several  aspects of the BACT/LAER
control technology requirements that are generally applicable to PSD
sources and major new and modified sources in nonattainment areas.
With respect to BACT, emphasis was put on whether agencies were requiring
an adequate analysis of each regulated pollutant  emitted in significant
amounts.  Prescribed significance thresholds applicable to each pollutant
are defined by the PSD regulations.

     In order to get a better idea of how thoroughly the BACT analyses
are being carried out, questions were asked to determine whether the
analyses routinely considered more than one possible control technology,
and whether the agency routinely took it upon itself to verify the  analyses
submitted by the applicants.

     The audit also sought to determine the extent to which the BACT/LAER
requirements are functioning as technology-forcing requirements. This
was accomplished by asking the auditors to determine the relative stringency
of the BACT/LAER determination for each major source audited on the basis
of applicable NSPS and NESHAP standards, which serve as the minimum
control requirements legally allowed for BACT and LAER.

     1.  Does the BACT analysis consider each regulated pollutant emitted
     in significant amounts?

     Most of the 28 agencies that issued PSD permits appear to be complying
with this PSD requirement.  The auditors found exceptions, however, in a
total  of 14 PSD permits issued by 11 agencies.  These figures represent a
higher percentage of the audited PSD permits than was found in the  FY 85
audit.  In only three agencies did the problem occur in more than one
permit.  In almost all cases where a BACT analysis was conducted, BACT
was specified in the permit.

     Some pollutants were not considered for BACT because of an apparent
failure on the part of the audited agencies to address potential emissions;
i.e., actual emissions were incorrectly used (see Section 4.5, Applicability
Determinations).  It is presumed that in such instances when this practice
is corrected, any pollutants calculated to be potentially emitted in
significant amounts will be properly considered for BACT.
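
The distinction between actual and potential emissions in this finding can be sketched as follows. The thresholds shown are illustrative values only; the current PSD regulations should be consulted for the authoritative significant emission rates:

```python
# Illustrative PSD significant emission rates (tons/yr); consult the
# PSD regulations for the authoritative values for each pollutant.
SIGNIFICANCE_TPY = {"SO2": 40.0, "NOx": 40.0, "CO": 100.0, "PM": 25.0}

def pollutants_needing_bact(potential_tpy):
    """Return the regulated pollutants whose *potential* emissions
    (tons/yr) meet or exceed the significance thresholds.  Using
    actual emissions here is the error described in the text."""
    return sorted(p for p, tons in potential_tpy.items()
                  if p in SIGNIFICANCE_TPY and tons >= SIGNIFICANCE_TPY[p])

# Potential, not actual, emissions drive the determination:
needs_bact = pollutants_needing_bact({"SO2": 55.0, "CO": 80.0, "PM": 30.0})
```

In this sketch SO2 and PM would require a BACT analysis, while CO, below its threshold, would not.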

     2.  Does the review agency require consideration of more than  one BACT
     control technology?  If so, to what extent are economic, energy, and
     non-air environmental impacts considered?

     Previous audits indicated that most agencies appear to require PSD
applicants to analyze more than one control technique as part of their
BACT selection process.  Some agencies have noted, however, that this
requirement was not always implemented if a particular control technique
was regarded as "obvious" or common for a particular source.  A few
                                   4-21

-------
agencies have claimed that a preapplication meeting with the applicant
was used to determine BACT; therefore, an analysis of alternatives was
not needed for the PSD application.  Where agencies claim to conduct a
preapplication meeting with the applicant in order to review candidate
control options in advance, EPA recommends that each meeting be carefully
documented, including a description of the alternatives considered and
the basis for eliminating them.  Agencies should retain this documentation
in the appropriate PSD files as a formal record of the BACT selection
process.

     This year's audit findings generally support the agencies' claims in
that most (65 percent) of the audited PSD files (at 86 percent of the
agencies) did address, to some degree, consideration of alternative control
techniques.  Specifically, auditors found that:

     .  Fifty-seven (57) percent of the agencies where PSD permit files were
        examined had files in which control alternatives were routinely
        considered;

     .  Twenty-nine (29) percent had some files that addressed alternatives
        and other files that did not;

     .  Fourteen (14) percent of the agencies had no PSD files that
        addressed alternative controls for BACT; and

     .  Sixteen (16) percent of the PSD files where control alternatives
        for BACT were considered failed to adequately address the impacts
        of each alternative in order to demonstrate the rationale for
        selection of a particular control technique.

     Seventeen (17) PSD permits (33 percent of the PSD permits audited),
issued by 13 agencies, did not address alternative controls  at all.
Auditors noted that in some cases the applicant claimed the  best control(s)
had been selected.  For nine cases from seven agencies, the  auditors
found that even though only one option was reviewed, the option was found
by the auditor to be acceptable.  In some cases where the source was
subject to NSPS, auditors noted that no other control technology was
considered for BACT.  Thus, it would appear that the omission of other
control techniques from consideration may not always be acceptable.

     3.  What checks does the review agency employ to confirm the applicant's
     BACT analysis?

     Auditors were asked to determine whether each audited PSD file
contained sufficient documentation to show that the reviewing agency had
verified the applicant's calculations and assumptions for BACT.  The
findings show that:

     .  Sixty (60) percent of the agencies consistently verified the
        applicants' BACT analyses;

     .  Twenty (20) percent were inconsistent in that some files demonstrated
        the agencies' verification efforts while other files did not; and


                                   4-22

-------
     .  Twenty (20) percent provided no evidence in their files that they
        had verified the BACT analyses submitted by the applicants.

     Auditors found no apparent agency verification of the applicants'
BACT analyses in 15 PSD permit files.  This finding was mixed between
situations where only one control  technology was considered and others
where several alternatives were considered by the applicant.  The auditors
concluded in some cases that little independent analysis was likely to
have occurred because of the questionable nature of the BACT selections.
In other instances, however, it appeared  to be more a question of whether
the agencies had failed to adequately document their own analyses.

     4.  What tendency is there for the agencies' BACT/LAER determinations
     to conform exactly to the minimum requirements, i.e., NSPS or NESHAP
     standards where applicable?

     For this question, applicable PSD files were examined for the
application of BACT, and files for major  nonattainment area sources were
examined for LAER.  The findings are based on 30 PSD files from 22 agencies
and 5 major nonattainment area source files from 4 agencies.

     a.  BACT

     Some improvement was found over the  FY 85 audit; however, there is
still a tendency for agencies to accept the use of the applicable NSPS  to
define BACT for PSD sources.  Even though examples of BACT determinations
more stringent than NSPS were found in 11 of 22 affected agencies, BACT
was defined as the applicable NSPS for approximately 60 percent of the
pollutant determinations that agencies made for PSD sources subject to  NSPS.

     The audit findings show that:

     .  Nine agencies accepted the applicable NSPS for all BACT determina-
        tions.  These agencies issued 10  permits, for which 26 pollutant
        determinations were made.

     .  Five agencies defined BACT more stringently than the applicable
        NSPS for at least one pollutant,  but typically accepted NSPS for
        most pollutant determinations.  In the nine permits that these
        agencies issued, BACT was  set at  levels more stringent than NSPS
        for seven pollutants, while the NSPS level  was applied to 20
        pollutant determinations.

     .  Six agencies defined BACT  more stringently than the applicable
        NSPS for all of their BACT determinations.  These agencies issued
        seven PSD permits, for which eight pollutants were controlled
        beyond NSPS.

     .  Of 30 PSD sources subject  to NSPS, 12 were allowed to use NSPS
        for all affected pollutants, while 9 were required to meet control
        requirements more stringent than NSPS for all affected pollutants;
        however, only 1 of these sources  had more than 1 pollutant subject
        to BACT.
                                   4-23

-------
     .   EPA auditors did not address the stringency of BACT  relative  to
        the applicable NSPS in three permits from three agencies.   However,
        all other BACT determinations at two of the agencies were  more
        stringent than the applicable NSPS.   The third permit contained  a
        BACT determination less stringent than the applicable NSPS for
        one pollutant and a "not determined" for another subject pollutant.

     b.  LAER

     Agencies showed a significantly greater tendency to define LAER
beyond the applicable NSPS than was the case for BACT determinations.
Though a smaller sample of nonattainment NSR permits was taken, an even
greater percentage of LAER determinations exceeded the applicable  NSPS
than in the FY 85 Audit.  For the five permits issued, LAER  was defined
to be more stringent than NSPS for six pollutants, while the control  of
only one pollutant was set equal  to NSPS.

     Only one agency allowed one source to meet the applicable NSPS to
satisfy the LAER requirement for a single pollutant.  The other three
agencies required LAER to be set at levels more stringent than NSPS
for four sources.

4.7  AMBIENT MONITORING (PSD)

     The PSD regulations contain specific de minimis ambient concentrations
for each pollutant, indicating when a PSD applicant needs to gather ambient
air quality data as part of the permit-to-construct application process.
For those pollutants which require reporting of ambient data, EPA  guidelines
set forth procedures whereby a source must either:  (1) establish  and
operate an ambient monitoring network and collect data for 12 months  or
less, or (2) analyze existing ambient data that are "representative"  (in
accordance with specific EPA criteria) of the air quality in the impact
area of the proposed source.
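
The decision logic just described can be sketched as follows. The function is hypothetical, and the de minimis concentration is supplied by the caller rather than drawn from the regulations:

```python
def ambient_data_requirement(predicted_impact, de_minimis,
                             representative_data):
    """Decide how a PSD applicant satisfies the ambient data requirement
    for one pollutant (sketch; thresholds supplied by the caller).

    predicted_impact, de_minimis: concentrations in the same units
    representative_data: True if existing data meet EPA's criteria for
        representativeness of the source's impact area
    """
    if predicted_impact < de_minimis:
        return "exempt"              # below the de minimis concentration
    if representative_data:
        return "use existing data"   # analyze existing representative data
    return "monitor"                 # operate a network for up to 12 months

# An impact above de minimis with no representative data on hand:
decision = ambient_data_requirement(15.0, 13.0, representative_data=False)
```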

     1.  Under what circumstances is a source required to submit
     preconstruction ambient monitoring data?

     The auditors examined 61 PSD files to determine whether agencies had
followed the correct procedures for requiring applicants to  submit ambient
air quality data, either from source-operated monitors or from existing
representative data.  Twenty-eight (28) sources were required to submit
ambient data.  Another 26 sources were correctly exempted in accordance
with the criteria for de minimis situations, but 7 did not address the
data requirements.

     Seven sources were exempted from preconstruction monitoring require-
ments; however, because of inadequate documentation in the permit  files,
auditors were unable to ascertain whether the exemptions, allowed  by  six
agencies, were appropriate.

     In the 19 agencies requiring that the data requirements be addressed,
most applicants were allowed to use existing air quality data rather  than
having to establish a monitoring network to collect new data.  The findings
indicated that:

                                   4-24

-------
     .  Twenty-six (26) sources were allowed to use only existing data.
        For four of these sources, the existing data were not  in  the
        files.

     .  Two sources (involving two different agencies)  were required to
        measure ambient air quality.  Neither the monitoring plans nor
        the monitoring results were available for either source.

     2.  Under what circumstances may a source submit existing data,
     rather than conduct new monitoring?

     Where PSD sources were allowed to use existing data to meet  the air
quality data requirement, auditors examined the files to determine whether
agencies followed Federal criteria to ascertain that the existing data
were representative of the area of source impact.  The  air quality data
were checked for adequate consideration of the location of existing monitors,
as well as the quality and currentness of the existing  data.

     Four files from four different agencies offered no documented basis
for allowing the use of existing data.  One other file  involving  one
agency contained some description of the data used but  failed  to  adequately
consider, or meet, all of the criteria for representative data.

     For the 26 PSD sources allowed to use existing data for at least one
pollutant, supporting data were in the files for 23 sources.  For these,
auditor responses addressing the Federal criteria for representative data
are as follows:

                                                              YES   NO  CBD
     a.  Adequate consideration of monitoring site location    79    4   17

     b.  Adequate consideration of data quality                79    0   21

     c.  Adequate consideration of data currentness            80    0   20


     3.  Do the source monitoring data adhere to PSD quality assurance
     requirements?

     In the two agencies requiring source monitoring, EPA auditors checked
the two PSD permit packages but found no monitoring plans.  One auditor
indicated that a monitoring plan had been submitted for one source, but
the plan was not in the permit file.  This source conducted monitoring
for 12 months, as generally required in the PSD regulations.

4.8  AMBIENT AIR QUALITY ANALYSIS

     Auditors were asked to examine three main areas of concern regarding
ambient air quality analysis.  The first area, PSD increment analysis,
looked at how well agencies evaluated PSD permit applications to determine
the amount of PSD increment that would be consumed by the proposed source
or modification.  The second area of concern pertains to agency procedures
for providing adequate NAAQS protection.  Auditors were asked to determine,

                                   4-25

-------
for all major and minor source permits, whether and to what extent each
source underwent an analysis to ensure that the national  standards (NAAQS)
would not be violated.  Finally, the auditors evaluated the adequacy of
the agencies' models and modeling procedures.  Agencies are expected to
use models that have been approved for use by EPA, but also of importance
is that the model (and model options)  selected be appropriately applied
to a particular set of modeling conditions.

PSD Increment Analysis

     Auditors focused on whether the PSD increment analyses (1) addressed
the appropriate emission changes that  affect available increments, (2)
considered both long- and short-term increment averaging periods,  and (3)
gave adequate attention to Class I area increments.

     The audit findings indicate that  23 agencies required PSD applicants
to perform increment analyses.  In these agencies, 32 PSD permits  included
analyses of either the TSP or S02 increments, or both.  In only one case
did an auditor find that an increment  analysis (Class II for S02)  should
have been performed, but was not.

     1.  Does the agency consider the  baseline concentration and emission
     changes that affect increment consumption?

     •  In 6 of the 23 agencies, auditors found that applicants did not
address all applicable increment-consuming emissions from existing sources.
In all  but one case, the permit involved minor source emissions.

     •  In 11 of the 23 agencies, files contained insufficient information
to enable the auditors to satisfactorily evaluate the adequacy of  the
required increment analyses.  In some  cases, the increment analysis
conducted may have been adequate; however, the available documentation
would not allow a determination.

     2.  Are both long- and short-term PSD increments being given  adequate
     consideration as part of the increment assessment?

     The audit findings indicate that  agencies adequately consider both
the long- and short-term increments for S02 and TSP.  It is interesting
to note that agencies tended to be conservative in their use of modeling
results to determine the amount of increment consumed.  Whereas EPA
recommends using the highest of the second highest receptor site
concentrations, agencies used the highest concentration in 67 percent
of the TSP analyses and 48 percent of the S02 analyses.
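
The difference between the two statistics can be seen in a short calculation; the receptor data below are invented for illustration:

```python
def highest_second_highest(receptor_series):
    """EPA-recommended statistic for short-term increment consumption:
    at each receptor, take the second-highest modeled concentration,
    then take the highest of those values across all receptors."""
    return max(sorted(series)[-2] for series in receptor_series)

def highest(receptor_series):
    """The more conservative statistic many audited agencies used:
    the single highest modeled concentration at any receptor."""
    return max(max(series) for series in receptor_series)

# Two receptors, each with modeled 24-hour concentrations (ug/m3):
series = [[34.0, 61.0, 48.0], [52.0, 40.0, 55.0]]
# highest(series) gives 61.0; highest_second_highest(series) gives
# 52.0, so using the single highest value consumes more increment
# on paper than the recommended statistic.
```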

NAAQS Protection

     1.  Does the agency routinely evaluate the ambient impact of  minor
     source construction?

     This year's audit information supports the findings of the previous
audit in that most agencies do not routinely evaluate minor source
construction for air quality effects.  Fewer than 25 percent (60) of the
audited minor sources were required to undergo an NAAQS analysis.

                                   4-26

-------
Most agencies did, however, appear to scrutinize minor source applications
individually to determine whether an ambient impact analysis  should  be
done.  Some agencies, however, appear to provide little,  if any,  review
of the ambient effects of minor sources.

     Specifically, the audit results indicate that:

     •  Twenty-five agencies did not require NAAQS analyses for any
audited non-NSR/PSD source permits.  In four of these agencies, auditors
found sources which they believed should have undergone analysis but did
not.

     •  Sixteen agencies were found to have issued permits to non-
NSR/PSD sources which, in the auditors'  judgment, probably should have
been subjected to an ambient impact analysis but were not.  EPA identified
a total of 16 permits for which this omission occurred.

     •  Approximately 6 percent of the audited non-NSR/PSD source permits
that did not consider ambient effects probably should have, because  of
the sources' potentially significant air quality impacts.

     •  In 22 percent of the questionnaires, auditors did  not respond to
the question asking whether an NAAQS analysis should have  been done, but
was not.  One possible inference is that information in  the files was
insufficient to determine the need for an NAAQS analysis.

     The auditors also examined the NSR/PSD permit files to determine
whether and how well the NAAQS analyses were performed for major sources.
The findings show that most NSR/PSD sources underwent NAAQS analyses
where appropriate; 46 files (75 percent of the files audited) included  an
NAAQS analysis.  The findings also indicate that:

     •  Auditors identified three NSR/PSD permits for which an NAAQS
analysis was completely omitted but, in the auditors' judgment, should
have been required.

     •  Four agencies, each having a PSD file that included an NAAQS
analysis, should have required additional analyses, either for omitted
pollutants or for more comprehensive review of considered  pollutants.

     2.  Does the agency's ambient impact analysis provide adequate
     protection against the development of "hot spots"?

     Adequate NAAQS protection requires that the reviewing agency give
consideration to the interaction of proposed new emissions with emissions
from sources already in existence (including sources which may have
already received a permit to construct but are not yet operating) and to
points of projected maximum ambient concentrations resulting from multi-
source interactions, rather than just points of maximum concentrations
from the proposed source alone.

     Auditors found that many agencies generally provide adequate NAAQS
protection.  Oftentimes, however, a lack of file documentation prevented
the auditors from making a determination.  The NSR/PSD source files  tended

                                   4-27

-------
to contain better documentation than did the non-NSR/PSD source files.
The audit findings reveal the following:

     •  Twenty (20) percent of the audited agencies were judged to provide
good or acceptable protection of the NAAQS most, if not all, of the time.
This figure represents a significant decrease from the 34 percent determined
for FY 85.

     •  Twenty (20) percent of the agencies had files that suffered from
insufficient documentation.  In these cases, auditors could not determine
whether and to what degree source interactions had been considered in the
NAAQS analysis.

     •  Thirteen (13) percent of the agencies were found to be inconsistent
in that some analyses adequately considered source interactions, but others
did not.

     •  Twelve (12) percent of the agencies typically omitted significant
emissions from other sources in the vicinity of the proposed source.

     •  Thirty-four (34) percent of the non-NSR/PSD source permits reviewed
for NAAQS protection failed to include sufficient documentation to determine
the adequacy of the ambient impact analysis.  Fifteen (15) percent were
judged inadequate in terms of considering multi-source interactions.

     •  For NSR/PSD permits, 23 percent of the files reviewed for NAAQS
protection had insufficient file documentation.  Less than 10 percent
were judged to have inadequate analyses for full NAAQS protection.

Dispersion Models

     1.  Does the Agency use adequate models to carry out the ambient
     impact analysis?

     EPA examined the modeling techniques used or accepted by agencies  to
analyze PSD increment consumption and potential source impact on the
NAAQS.  The audit results indicate that agencies generally used or required
applicants to use the appropriate models and model options.  However, in
many instances there appeared to be a lack of sufficient documentation
regarding the rationale for the use of particular models and methods.

     Specifically, the audit findings show that:

     •  Twenty-three agencies required PSD applicants to perform increment
analyses.  In 10 of the 23 agencies, auditors found problems with some
aspect of the required modeling procedures.  These problems included use
of an inappropriate model, use of inappropriate model options (such as
failure to account for terrain elevation or building downwash), and use
of insufficient meteorological data.

     •  The use of inappropriate models was identified in three NAAQS
analyses involving the review of minor sources.  The major problem seemed
to be that the models used were inappropriate for the existing terrain
features.

                                   4-28

-------
     •  Permits found in 22 agencies (almost 70 percent of the agencies
where ambient impact analyses were included in the files) did not contain
sufficient information to support the choice of models and model options
used.

     2.  Does the Agency perform an independent, internal  review of the
     modeling analyses contained in the permit application?

     Apparently, most agencies do not often require non-NSR/PSD source
applicants to perform modeling analyses.   Seventy (70)  percent of the
time, the source analyses were performed  by the agencies themselves.
Most agencies were found to adequately review the applicants'  modeling
analyses, but inadequate reviews were identified in four agencies.
One of these agency reviews involved a non-NSR/PSD source; the other
three were PSD sources.

     According to the responses provided  by auditors,  the  applicants were
required to submit a modeling analysis for only 18 of  the  files that were
examined.  Therefore, in cases where modeling analyses were submitted by
the applicant, the analyses were not adequately checked by the responsible
agency approximately 20 percent of the time.

4.9  EMISSION OFFSET REQUIREMENTS

     When a major new or modified source  is allowed to construct in a
designated nonattainment area, emission reductions are generally required
to offset the new emissions.  These emission reductions or offsets must
meet specific criteria set forth under Part D of the Clean Air Act in
order for the offsets to be creditable.  Auditors examined selected files
involving sources subject to the emission offset requirements to determine
whether they met such criteria.
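
The basic offset test can be sketched as follows. The function and the minimum offset ratio are illustrative assumptions, since the creditable ratio depends on the applicable Part D requirements:

```python
def offsets_sufficient(new_increase_tpy, offset_credits_tpy, ratio=1.0):
    """Check that creditable emission reductions at least offset the
    proposed increase (sketch).  Part D requires a net air quality
    benefit; the minimum ratio is a caller-supplied assumption."""
    return sum(offset_credits_tpy) >= ratio * new_increase_tpy

# A 100 ton/yr VOC increase offset by reductions of 70 and 45 tons/yr
# at nearby units, tested against a greater-than-one offset ratio:
ok = offsets_sufficient(100.0, [70.0, 45.0], ratio=1.1)
```

A check of this kind presumes that each credited reduction is itself creditable (surplus, permanent, quantifiable, and Federally enforceable), which is precisely what the audited files often failed to document.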

     EPA examined, in 11 agencies, 17 major source permit  files involving
construction in nonattainment areas.  Of  these 17 files, 3 permits were
issued under programs which excluded the  applicants from having to obtain
emission offsets.  The remaining files involved sources that should have
obtained offsets; however, three sources  apparently did not obtain offsets
and the enforceability of an additional case was called into question.

     With regard to the examples where offsets were not obtained, the
audit results indicated the following:

     .  One agency should have required emission offsets for two different
        audited sources, but apparently failed to do so.  The offset
        required for each of the facilities concerned  the  same nonattain-
        ment pollutant.  There was insufficient documentation in each of
        the files for the auditors to determine why this issue was not
        addressed.

     .  The file review at one agency identified a source modification
        that resulted in a significant increase in VOC emissions,
        qualifying the facility as a major source in an ozone nonattain-
        ment area.  The offset requirements were not
        clearly defined and the auditor had to review the  entire set of

                                   4-29

-------
        documents to determine where adjustments were made.  Apparently,
        no analysis was done for ozone.  The application and engineering
        analysis stated that the operation of existing equipment would be
        limited to 50 percent of annual production.   This limitation was
        not stated in the permit.

     The Federal enforceability of offsets issued in one case was questioned
by the auditor.  The permit to construct was deemed  Federally enforceable;
however, the permit to operate did not contain specific emission limits
for offsetting units.  Apparently, certain allowable limits were lowered
based on good housekeeping and engineering practices rather than pollution
control  equipment.

     Generally, agencies required offsets to occur on or before the time
of new source operation, and to be expressed in the  same manner as for the
demonstration of RFP (i.e., actual vs. allowable emissions).  EPA auditors
had difficulty determining compliance with other specific criteria with
which the offsets must comply, such as consideration of minor source
growth and use of credit from shutdowns or production curtailment.
Agencies often did not provide documentation ensuring that these criteria
had been met, even though they actually might have been.

4.10  PERMIT SPECIFICITY AND CLARITY

     This final section of the new source review audit provides the
results of EPA's examination of information contained in the permits
issued to new and modified sources.  Specifically, auditors were asked to
examine how, and whether, permit conditions defining limitations applicable
to the approved source or modifications are being established.  Such
limitations typically become the enforceable measures by which a source's
construction and operation is regulated, and the means by which ongoing
compliance is determined.

     1.  Does the agency adequately state or reference allowable emission
     rates in each permit?

     Ten (10) agencies did not routinely state or reference each source's
allowable emissions in the PSD/NSR permits, and 25 agencies did not
always do so for non-PSD/NSR permits.  The omission  appeared to be a
common occurrence--at least for non-NSR/PSD permits; at 13 agencies this
omission occurred in over 35 percent of the permits.

     An analysis of the individual permit file questionnaires shows that
approximately 34 percent of the non-PSD/NSR construction permits in the
audit data base did not specify the allowable limits.  This finding must
be qualified, however, because of the ambiguity of the instructions
provided to the auditors by the questionnaire.  Those instructions easily
could have been misinterpreted by the auditors as asking them to provide
only the number of emission limits actually specified in the permits,
and not cases where a reference was made to a regulation containing the
required limit.  It is not clear how many of the responses took referenced
limits into account (as was intended), but it is known that some did not.
                                   4-30

-------
     The same qualification must be given for the results  pertaining  to
NSR/PSD permits as well, although the ambiguity did not  appear  to  have
much effect on the findings.  There was one permit  for which  an  auditor
specifically stated that no limits were found, but  auditors did  not
respond to the question in three cases.  There were also 10 permits  for
which the questionnaire was filled out incorrectly, and  4  others that listed
inappropriate limits.  These considerations will  be taken  into  account
during revision of the questionnaire for use in the next air  audit.

     Where limits were specified in the permits,  auditors  were  asked  to
evaluate them in terms of their clarity, consistency with  measurement
techniques, and Federal enforceability.  In most  cases,  the permits
contained more than one emission limit.  Where at least  one of  the limits
was determined to be inadequate with respect to any of the variables
considered, that file was rated inadequate as a whole.  The percentages,
as shown below, are based on the total number of  permits that contained
emission limits.

                                 PSD/NSR Permits      Non-PSD/NSR Permits

                                 YES   NO   CBD         YES   NO   CBD
                                  (%)   (%)  (%)         (%)   (%)  (%)
     a. Clear and precise        87    6     7          78    12      4
        averaging periods

     b. Emission rates           83    1    16          88     7      1
        consistent with
        acceptable measurement
        techniques

     c. Federally enforceable    83    7    10          87     1      2


     2.  Does the agency identify all  emission units  and their allowable
     emissions in the permits?

     Auditors reported that the PSD/NSR permits from  at least 13  agencies
did not routinely identify each emission unit along with its  allowable
emission limit.  For non-PSD/NSR permits, this problem was  evident  at 31
agencies.  Overall, the responses indicated that over half  of the agencies
"do not" or "generally do not" adequately address each emission unit in
their issued permits.  In one of these agencies, it was noted that  the
emission units and the emission rates  applicable to those units were
identified primarily to avoid PSD/NSR  review.

     EPA found that 17 (28 percent) of the PSD/NSR permits  and 142  (52
percent) of the non-PSD/NSR permits did not address each unit and its
allowable emissions.  It would appear  that in some cases emissions  were
"bubbled" under a single or composite  emission limitation.  This  would
make it difficult to enforce the limit with respect to the  emissions
originating at any particular unit. Agencies are advised to  avoid  any
such practice because of the questionable enforceability of such  composite
limits.

                                   4-31

-------
     3.  Are compliance test methods stated or referenced  in  the  terms
     and conditions of the permits?

     Eighteen agencies reportedly failed, either entirely or in some
permits, to state or reference compliance test methods in their PSD and
Part D permits.  Similar results were found for the non-PSD/NSR permits of
29 agencies.  Where the practice is not followed consistently, it is not
clear what criteria, if any, agencies use to determine whether  such
information is to be included in the permit.  Compliance test methods  are
commonly defined in the State or local  agencies' rules and regulations,
and many agencies have indicated in past audits that specific mention  of
the test methods in each permit is not  required to enable  the agency  to
use them for compliance determination purposes.

     Of the 180 non-NSR/PSD permits that specified emission limits, 52
percent stated or referenced all or some compliance test methods; 31
percent did not.  For the remainder, it could not be determined from  the
auditors'  responses how compliance test methods were addressed.

     Agencies appeared no more consistent in stating or  referencing the
compliance test methods in PSD/NSR permits.  Fifty-four  (54)  percent  of
the evaluated permits stated or referenced compliance test requirements
for some pollutants; only 11 percent did not.  Again, auditors  did  not
respond to the question in a significant number of permit  reviews (16
percent).  Based on the number of pollutants, 59 percent of the limits
had adequate test requirements, 27 percent did not, and  the adequacy  of
14 percent could not be determined.
                                   4-32

-------
                                CHAPTER 5

                           COMPLIANCE ASSURANCE
5.1  EXECUTIVE SUMMARY

     As was the case in the FY 85 National  Air Audit System (NAAS)  effort,
many States and locals showed one or more strong points characteristic of a
successful air compliance program, such as high source compliance rates
supported by high inspection frequency rates, performance of all  required
New Source Performance Standard (NSPS) source tests, expeditious  resolution
of violations, and few long-term violators.  These activities were  adequately
reflected and validated by the national Compliance Data System (CDS).
Other States had source files that were, for the most part, well  organized,
up-to-date, and complete, reflecting a reasonable profile of each source.

     A State-by-State analysis of inspection rates shows that inspections
of Class Al* State implementation plan (SIP) sources generally decreased
over those reported in the FY 85 audit (from 89 to 84 percent), and five
States were still unacceptably low, with inspection rates of less than 60
percent.  Compliance rates for Class Al SIP sources remained roughly the
same as FY 85 (91 to 92 percent).  The NSPS national figures for  both
inspection and compliance rates rose (88 to 90 percent and 90 to  93
percent, respectively), even though some individual  State rates declined.
The NSPS inspection rates for nine States are unacceptably low, with
figures of less than 60 percent.  National  Emission Standards for Hazardous
Air Pollutants (NESHAP) inspection rates showed an increase (from 67 to
96 percent), while compliance rates fell slightly (from 93 to 90  percent).
Twelve States have NESHAP inspection rates at or below 60 percent.

     The compliance audits also revealed that several State and local
agencies, to a varying extent, still have weaknesses in three areas vital
to a strong and effective compliance program.  First, source files  maintained
by some State and local agencies do not contain verifiable information
reflecting a reasonable profile of each source.  In fact, the percentage
of those reviewed that reflected a reasonable profile of the sources
decreased from 72 percent in FY 85 to 68 percent in FY 1986-87.  Even so,
some audit reports cited improvement since the FY 85 audits in the  condition
of State files.  Second, some inspection reports are still of poor quality
(no mention of operating or emission parameters, or pollutants emitted),
and this is a significant concern.  For some agencies, however, there has
been noticeable improvement in the quality of inspection reports since
FY 85.

* Class Al includes sources with actual or potential controlled emissions
  greater than or equal to 100 tons per year.

                                   5-1

-------
Third, some of the reviewed agencies' enforcement efforts are not
always effective in reducing the number of long-term violators by expeditiously
returning documented violators to compliance, although overall there was
a slight increase in the percentage of audit reports that indicated
sources were being expeditiously returned to compliance (from 74 to 77
percent).

     Thus, while there have been improvements in all of these critical
areas, some State and local agencies need to heighten efforts in the
aforementioned three areas to continue strengthening their compliance
programs.

5.2  INTRODUCTION

     As in FY 1985, the compliance assurance element of the FY 1986-87 NAAS
was designed to examine State and local programs that are responsible for
the compliance of sources subject to requirements of SIPs and, where
delegated, NSPS (Section 111) and NESHAP (Section 112).  Of the several
hundred thousand regulated stationary sources in the nation, there are
approximately 30,000 sources in these categories for which EPA and State/local
agencies share a concern about compliance status and associated enforcement
activities.  Compliance activities focusing on these sources formed the
primary basis on which each audit was conducted.

     There are three major parts of the compliance assurance audit.  The
first is a pre-visit assessment of the State or local agency performed by
examining source data reported to EPA by the agency.  For FY 1986-87, this
once again included an assessment of how the "timely and appropriate"
guidance was working.  The other two parts of the audit consisted of
doing an on-site review of State source files and conducting overview
inspections.

     In accordance with the NAAS guidance, the EPA Regional Offices were
to conduct the pre-visit assessment by obtaining CDS retrievals for the
most recent fiscal  year on inspection frequency, compliance rates, and
enforcement activity.  The Regions were then to analyze the CDS data for
source compliance status, progress in meeting inspection commitments,
identification of long-term violators and associated compliance activity,
adherence to "timely and appropriate" guidance, identification of long-
term compliers and associated surveillance activity, and identification
of operating NSPS sources without the required 180-day performance test.
Finally, based on this CDS analysis, the Regions were to prepare a summary
of each compliance program and send it to the State or local agency before
the visit.  The analysis could have taken the form of a questionnaire for
the agency or could have been a statement of findings to be discussed for
completeness and accuracy during the visit.  The pre-visit assessment was
also designed to help in identifying the source files to be reviewed
during the on-site visit.
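The pre-visit screening described above amounts to summarizing one fiscal year of per-source records into the rates the Regions reviewed. A minimal sketch (field names are hypothetical; CDS itself was a mainframe system with its own retrieval formats):

```python
# Sketch of the pre-visit CDS screening: summarize one fiscal year of
# per-source records into inspection and compliance rates.  Field names
# are hypothetical, not actual CDS fields.

def previsit_summary(sources):
    """Each source is a dict with 'inspected' and 'in_compliance' booleans."""
    total = len(sources)
    inspected = sum(1 for s in sources if s["inspected"])
    compliant = sum(1 for s in sources if s["in_compliance"])
    return {
        "inspection_rate": round(100 * inspected / total),
        "compliance_rate": round(100 * compliant / total),
    }

sources = [
    {"inspected": True,  "in_compliance": True},
    {"inspected": True,  "in_compliance": False},
    {"inspected": False, "in_compliance": True},
    {"inspected": True,  "in_compliance": True},
]
print(previsit_summary(sources))  # {'inspection_rate': 75, 'compliance_rate': 75}
```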

     The next major part of each audit was the on-site visit.  The visit
centered on a discussion of the findings in the pre-visit assessment and
on review of 15 to 20 source files.  The files to be reviewed consisted
                                   5-2

-------
of a mixture of SIP, NSPS, and NESHAP sources.  A file review checklist
was developed to ensure consistency in how the file reviews were implemented.
The goals were to see if the files contained a reasonable profile of the
source, written documentation to support the compliance status reported
to EPA, and documentation to show that violators are expeditiously returned
to compliance.  The State and local audit reports were envisioned to
include a discussion of both the pre-visit assessment and the status of
the files.

     The final component of the compliance audit was to be a program of
overview inspections conducted by EPA of 2 to 3 percent of the sources in
the CDS inventory (Class A SIP, NSPS, and NESHAP).  The purpose was to
verify the compliance status of a source as reported to EPA, as well as
review State or local agency inspection practices to see if there were
areas where EPA could increase performance through technical assistance
to the State and local  agencies.

     This national  summary report covers 60 State and local audits.  The
10 questions in Sections 5.4, 5.5, and 5.6 which follow were condensed
from the FY 1986-87 audit questions.  These questions represent the key
elements of the compliance portion of the audit, and provide a uniform
basis to do a national  assessment of the 60 compliance audit reports.

5.3  MAJOR FINDINGS AND CONCLUSIONS

     As was the case in the FY 85 NAAS effort, many States and locals
showed one or more strong points characteristic of a successful air
compliance program, such as high source compliance rates supported by high
inspection frequency rates, performance of all required NSPS source tests,
expeditious resolution of violations, and few long-term violators.  These
activities were adequately reflected and validated by the national CDS.
Other States had source files that were, for the most part, well organized,
up-to-date, and complete, reflecting a reasonable profile of each source.

     A State-by-State analysis of surveillance statistics shows that
inspection rates for Class Al SIP sources generally decreased over those
reported in the FY 85 audit, and five States are still unacceptably low,
with inspection rates of less than 60 percent.  Compliance rates for
Class Al SIP sources remained roughly the same as in FY 85.  The NSPS
national figures for both inspection and compliance rates rose, even
though some individual  State rates declined.  The NSPS inspection rates
for nine States are still unacceptably low, with figures of less than 60
percent.  NESHAP inspection rates showed an increase, while compliance rates
fell slightly.  Twelve States have NESHAP inspection rates below 60 percent.

     The compliance audits also revealed that several State and local
agencies, to a varying extent, still have weaknesses in three areas vital
to a strong and effective compliance program.  First, source files
maintained by some State and local agencies do not contain verifiable
information reflecting a reasonable profile of each source.  In fact, the
percentage of those reviewed that reflected a reasonable profile of the
sources decreased from 72 percent in FY 85 to 68 percent in FY 1986-87.
However, several audit reports mentioned some improvement since FY 85
audits in the condition of State files.  Second, some inspection reports

                                   5-3

-------
are still of poor quality (no mention of operating or emission  parameters,
or pollutants emitted), and this is a significant concern.   For some
agencies, there was, however, a noticeable improvement in  the quality  of
inspection reports since the last review.  Third, some of  the reviewed
agencies' enforcement efforts are not always  effective in  reducing  the
number of long-term violators* by expeditiously returning  documented
violators to compliance, although there was a slight increase in the
percentage of reports that indicated sources  were being expeditiously
returned to compliance.

     Thus, while there are improvements in all  of these critical  areas,
some State and local agencies need to heighten  efforts in  the aforementioned
three areas to continue strengthening their compliance programs.

     The remainder of this report addresses these findings  in more  detail.
It is organized by the three parts of the audit:   pre-visit assessment,
file review, and overview inspections.  The aforementioned  10 questions,
which represent the key elements of this compliance audit,  are  discussed
in each appropriate part.

5.4  PERIODIC REVIEW AND ASSESSMENT OF SOURCE DATA

     To assess the adequacy of State and local  compliance  programs, the
EPA Regional Offices continuously review source compliance  status and
inspection information submitted by the audited agencies and reflected in
CDS for the SIP, NSPS, and NESHAP programs.

     As shown in the four pie charts in Figures 5.1 through 5.4, the
national compliance picture is very respectable.   Compliance rates  have
improved since the FY 85 audits for Class Al  SIP  and NSPS  sources,  and
have declined slightly for NESHAP sources.  The bar charts  in Figures
5.5 through 5.8 depict, for each aspect of the  air program, the inspection
range, compliance range, and range of long-term violators  for all State
and local agencies audited.  As shown, inspection rates for Class A SIP
sources range from 28 to 97 percent, while NSPS and NESHAP  sources  range
from zero to 100 percent.  Median figures for these three  programs  are
between 73 and 96 percent (compared to 67 percent and 88 percent in the
FY 85 report).  Compliance rates for Class A  SIP, NSPS, and NESHAP  sources
range from a low of 40 percent in one jurisdiction to a high of 100
percent in another, with median figures between 95 and 99  percent.  The
number of long-term violators (defined for this audit as two consecutive
quarters or more) in each jurisdiction was largest for Class A  SIP  sources,
ranging from a low of zero in some agencies to  a  high of 88 in  another,
with a median figure of 8 sources per jurisdiction.
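The long-term violator count rests on a simple test over each source's quarterly compliance history. A minimal sketch, with a hypothetical encoding of the quarterly status codes:

```python
# Sketch of the long-term violator test used in this audit: a source is a
# long-term violator if it has been in violation for two or more
# consecutive quarters.  The status strings are a hypothetical encoding.

def is_long_term_violator(quarterly_statuses):
    run = 0
    for status in quarterly_statuses:
        run = run + 1 if status == "VIOLATION" else 0
        if run >= 2:
            return True
    return False

print(is_long_term_violator(["COMPLIANCE", "VIOLATION", "VIOLATION"]))  # True
print(is_long_term_violator(["VIOLATION", "COMPLIANCE", "VIOLATION"]))  # False
```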

     The following question is the first of 10  developed as a guide for
summarizing the findings from each of the audit reports.
* "Long-term violators" means sources in violation for two continuous
   quarters or more.

                                   5-4

-------
     [Pie chart; segments: IN COMPLIANCE, IN VIOLATION, MEETING SCHEDULE]

    FIGURE 5.1  COMPLIANCE BREAKOUT OF CLASS A SIP SOURCES
Data Source: FY 87 CDS

-------
     [Pie chart; segments: IN COMPLIANCE, IN VIOLATION, MEETING SCHEDULE]

       FIGURE 5.2  COMPLIANCE BREAKOUT OF CLASS Al SIP SOURCES
   Data Source: FY 87 CDS

-------
     [Pie chart; segments: IN COMPLIANCE, IN VIOLATION, MEETING SCHEDULE]

           FIGURE 5.3  COMPLIANCE BREAKOUT OF NSPS SOURCES
   Data Source: FY 87 CDS

-------
     [Pie chart; segments: IN COMPLIANCE, IN VIOLATION, MEETING SCHEDULE]

         FIGURE  5.4  COMPLIANCE BREAKOUT OF NESHAPS SOURCES
   Data Source: FY 87 CDS

-------
     [Bar chart.  Inspection rates: maximum 97%, median 73%, minimum 28%;
     compliance rates: maximum 100%, median 95%, minimum 77%;
     long-term violators: maximum 88, median 8.]

             FIGURE 5.5  CLASS A SIP PERFORMANCE STATISTICS
  Data Source: FY 87 CDS

-------
     [Bar chart.  Inspection rates: maximum 100%, median 84%, minimum 30%;
     compliance rates: maximum 100%, median 93%, minimum 46%;
     long-term violators: maximum 80, median 7.]

          FIGURE 5.6  CLASS Al SIP PERFORMANCE STATISTICS
   Data Source: FY 87 CDS

-------
     [Bar chart.  Inspection rates: maximum 100%, median 90%, minimum 0%;
     compliance rates: maximum 100%.]

              FIGURE 5.7  NSPS PERFORMANCE STATISTICS
   Data Source: FY 87 CDS
-------
     [Bar chart.  Inspection rates: maximum 100%, median 96%, minimum 0%;
     compliance rates: maximum 100%, median 99%, minimum 40%;
     long-term violators: maximum 12, median 1.]

             FIGURE 5.8  NESHAPS PERFORMANCE STATISTICS
   Data Source: FY 87 CDS

-------
     (1)  Based on the findings of the pre-visit program analysis,  what
          was the Region's overall assessment of the condition of the air
          compliance program?

     A review of the 60 audit reports shows that some form of pre-visit
assessment was done by the Regions for all  but three States and one local
program.  Thirty-four of these reports contained an overall statement
about the particular compliance program based on the CDS analysis:

     - Six air compliance programs were considered very good.

     - Twenty-seven air compliance programs were considered adequate
       (meeting most Clean Air Act requirements).

     - One air compliance program was termed deficient.

     The remaining 22 audit reports (where a pre-visit assessment was
done) made no definitive statement on the air compliance program based on
the CDS assessment, but positive comments were made in 10 of these  reports,
such as "inspection rates are very good" and "compliance rates are good
to excellent."  It was not possible to determine anything of substance
relative to the pre-visit assessment from the other 12 reports.

     Careful study of the audit reports for the six agencies with "very
good" air compliance programs shows several elements contributing to the
success of each compliance program.  In general, these agencies:

     - routinely complete nearly all the required inspections for SIP,
       NSPS, and NESHAP sources where delegated;

     - have compliance levels for Class A SIP, NSPS, and NESHAP sources
       consistently above 90 percent, with recent inspections to support
        this level;

     - address, in a timely manner (and according to "timely and appropriate"
       guidelines), sources found to be in violation of applicable  emission
       limits or permitting requirements, resulting in few, if any, long-
       term violators (greater than two consecutive quarters).

     It seems likely that other States have compliance programs as  good
as these six, but this was not readily discernible from the description
of the programs in the audit reports.

     (2)  What is the Region's overall assessment of how the "timely and
          appropriate" guidance is working in the State or local agency?

     Forty-two audit reports indicated that the guidance is being followed
and the program is working well, while nine reports stated that the
guidance was not being followed (meaning few, if any, violations were
resolved according to the guidelines).  Of the remaining reports, four
had no conclusions on the "timely and appropriate" guidance because there
were no violators subject to the guidance.   The other five reports  did
not discuss the guidance in any detail.
                                   5-13

-------
     To summarize the CDS-based pre-visit assessment, 33 (55 percent) of
the 60 State and local  compliance programs were found by the Regions to be
either adequate or very good, and only one program was judged deficient
based on that assessment.  It was not possible to assign an overall
description of the programs from the other 26 (43 percent)  reports but,
as noted earlier, 10 of these had positive comments about the air compliance
program.  This initial  effort identified many good programs and pointed
out areas where the State and local  agencies and EPA should continue to
work together to improve compliance  programs.

5.5  FILE REVIEW

     (3)  Did the source files reflect a reasonable profile of the sources?

     All but one of the 60 audit reports contained file review information.
Forty-one (68 percent) of these indicated that the files reviewed reflected
a reasonable profile of the source,  which means they contained the following
information:  source compliance status based on recent inspection or
source test, an identification of all air program regulations the source
is subject to, and, within the inspection reports, operating parameters,
point sources, and pollutants emitted by the facility.  Some common
reasons cited in the 18 (30 percent) audit reports where the files were
considered deficient were:  inability to determine source compliance
status from file contents, no indication of which air program regulations
the source was subject to (SIP, NSPS, NESHAP), and missing or poor quality
inspection reports (no mention of operating or emission parameters,  point
sources, or pollutants emitted by facility).  The remaining report did
not contain a conclusive statement on this question.

     (4)  Did the files contain adequate written documentation to support
          the CDS compliance status?

     Thirty (50 percent) of the 60 audit reports indicated that reviewed
files contained some written documentation of compliance status to support
CDS.  This represents a drop of 7 percent from the FY 85 documentation
rate of 57 percent.  Twenty-seven (45 percent) of the audit reports  either
cited a lack of any compliance information in the files or showed
information in the files that conflicted with CDS.  The other three
reports did not contain sufficient information to answer this question.

     (5)"  Are violations documented  and pursued by agencies to return  a
          source to compliance expeditiously?

     Thirty-seven (62 percent) of the 60 audit reports indicated that
violations are documented and pursued to return a source to compliance
expeditiously.  Fourteen reports (23 percent) indicated that some sources
were not being expeditiously returned to compliance, in some cases leading
to a number of long-term violators (greater than 180 days)  or untimely,
protracted enforcement actions.  Nine reports (15 percent)  lacked a
definitive response to this question.
                                   5-14

-------
5.6  OVERVIEW INSPECTIONS

     (6)  How many inspections were performed?

     The number of EPA overview inspections performed ranged from a low
of one in one State to a high of 54 in another.  The total number of
inspections for all 45 reports was 772.

     (7)  How were sources selected by the Region for the overview
          inspections?

     By far, the most common criteria used by the Regions to select
sources subject to overview inspections were some combination of the type
of program (to ensure a representative sample of SIP, NSPS,  and NESHAP
sources), location (primary attention to impact on nonattainment areas as
well as geographic spread), source compliance history, and pollutants.
Most of the differences between the Regions' selection approaches were
found in the amount of relative emphasis placed on each criterion.

     (8)  What did each inspection consist of in terms of procedures
          followed and involvement of State personnel?

     Almost all  of the overview inspections performed were a joint effort
between EPA and the States.  Most began with a review of State source
files and progressed to an on-site visit to the source, with both EPA and
State inspectors conducting separate evaluations of source compliance
status, after which separate reports were written and compared.  A summary
of the 45 audit reports with answers to this question appears below:
                                         Reports     Inspections

     Joint Inspections - State Lead         13           223
     Joint Inspections - EPA Lead           21           236
     Joint Inspections - Dual Lead           8           149
     Independent EPA Inspections             3            36

         Total                              45           644
     The other 128 inspections were from audit reports that did not
specify what each inspection consisted of.

     (9)  What was the purpose of the overview inspections (that is,  to
          independently verify State reported compliance, to observe
          State inspection practices, or some combination of these)?

     Twenty-one (35 percent) of the 60 reports stated that the purpose
was a combination of independently verifying State compliance and observing/
critiquing State inspection procedures (including State inspector quali-
fications).  Of the other 26 reports, 24 mentioned simple verification of
compliance status as the primary inspection goal, and 2 cited observation
of State practices as the goal.  The remaining 13 reports had no information
on this subject.
                                   5-15

-------
     (10)  What were the overall  results of the overview inspection
           effort, including recommendations for resolution  of any  problems
           discovered during the  effort?

     Forty-three (43) of the 44 responses to this question showed  both
the expertise of State inspectors and State reported compliance status to
be adequate,  which means the State inspectors were experienced enough  to
conduct a thorough inspection and determine the compliance status  of a
source.  Only one report indicated that overview inspection  results did
not agree with the compliance status in the State files or CDS.

     Regarding recommendations, four reports suggested that  more training
for State inspectors would improve the quality of inspections  and  result
in more accurate inspection data  being reported to EPA.
                                   5-16

-------
                                CHAPTER 6

                              AIR MONITORING

6.1  EXECUTIVE SUMMARY

     The 1986-87 National  Air Audit System included audits of air monitoring
in 75 agencies.  Four principal areas within the agencies'  programs were
evaluated.  These areas were network design and siting, resources and
facilities, data and data management, and quality assurance/quality
control.  The principal conclusions from the audits are discussed below.

     State and local agencies have continued to operate and maintain
successfully their State and local air monitoring station  (SLAMS) and
national air monitoring station (NAMS) networks.  About 99 percent of the
3,500 monitors operated by the audited agencies are meeting the design
and siting regulations for ambient air monitoring networks.

     The audit reports did disclose that 85 percent of the audited agencies
reported a need for either new or replacement monitoring or laboratory
equipment, which would cost $2.7 million.  Of this total,  $1.7 million was
needed for air monitoring equipment and $1 million for laboratory equipment.
A more comprehensive State/local agency equipment survey conducted in 1987
estimates that $4 million is needed for monitors and $3 million for support
equipment.  This survey is more recent and should be used  for planning
purposes, rather than estimates from the audit program.

     As in previous audits, the FY 1986-87 audit indicated that
timeliness of data submittal is still a problem for many agencies, particu-
larly submissions of lead (Pb) data, which were late in 29 percent of the
agencies audited.  Concerning data completeness, the overall  percentage
was good, with a low of 78 percent for nitrogen dioxide (N02) and a high
of 94 percent for lead.  The data management section of the audit revealed
that 20 percent of the agencies failed to submit at least  one of the
essential elements of the annual SLAMS report, and corrective actions are
being taken with regard to this problem.

     The quality assurance/quality control portions of the audit reports
continue to demonstrate good overall performance.  Three agencies needed
major changes to their quality assurance plans, and 32 had minor revisions
pending.  Concerning the achievement of quality assurance goals for data
precision (±15% for all pollutants) and accuracy (±15% for TSP, ±20% for
other pollutants), the only significant problem is with precision for Pb,
for which only 55 percent of the reporting organizations achieved the
goal.  This is believed to be related to the analysis procedure.
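The precision goal can be read as a bound on the probability limits of the percent differences obtained from routine precision checks. A simplified sketch follows; the actual calculations are prescribed in 40 CFR Part 58, Appendix A, and differ in detail:

```python
# Simplified sketch of a precision-goal check: test whether the
# 95-percent probability limits of the percent differences from routine
# precision checks fall within a +/-15 percent goal.  This is an
# illustration, not the Appendix A procedure itself.

import statistics

def meets_precision_goal(pct_diffs, goal=15.0):
    mean = statistics.mean(pct_diffs)
    sd = statistics.stdev(pct_diffs)
    upper = mean + 1.96 * sd
    lower = mean - 1.96 * sd
    return lower >= -goal and upper <= goal

print(meets_precision_goal([2.0, -3.0, 1.5, 0.5, -1.0]))  # True
```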


                               6-1

-------
6.2  INTRODUCTION

     Ambient air monitoring for State implementation plan (SIP)  purposes
is required by Section 110 of the Clean Air Act.   Furthermore,  Section
319 of the Act requires the development and application of uniform air
quality monitoring criteria and methodology, the  reporting of a  uniform
air quality index in major urban areas, and the establishment of a national
air monitoring system that uses uniform monitoring criteria.   To comply
with these requirements, the Agency promulgated ambient air monitoring
regulations (40 CFR 58) in 1979, with further revisions in subsequent
years.  The most recent changes were for particulate matter of  aerodynamic
diameter equal to or less than a nominal 10 micrometers (PM10).  Included
in the CFR Part 58 regulations are requirements that State and  local  air
monitoring programs be audited.  These provisions have served as the
basis for the national air monitoring audits which began in 1984.  The
findings and recommendations of the 1984 audits led to several  changes  in
the questionnaire for the 1985 air monitoring audits.  The 1985  guidance
did require, for national consistency, that all EPA Regional  Offices  use
at least the short form questionnaire, the corrective action  implementation
request (CAIR), and the system audit reporting format.  Use of  the long
form questionnaire was left to the discretion of  the Regional Quality
Assurance Coordinator, with the concurrence of the State or local agency
involved.  Following the FY 1985 audit program, the National  Air Audit
System changed from annual to biennial cycles, to conserve resources and
to allow State and local agencies a reasonable period to implement changes
indicated by the prior audit.  The team performing the 1986-87  audits
consisted of EPA Regional Office personnel and, in some cases,  Headquarters
representatives.  The audits included interviews, site inspections, and
completion of the short or long form air monitoring questionnaires.

6.3  MAJOR FINDINGS AND CONCLUSIONS

     The air monitoring programs of 75 agencies (46 States, 26  locals,
the District of Columbia, and 2 territories) were audited during the  FY
1986-87 cycle.  However, we have received only the text portion  of the
audits for four State and four local agencies.  Since the audit  questionnaires
for these eight agencies are not included, the tabulations of this report
are based on information from 67 agencies.  The audit results represent
all EPA Regions except Region IX, which did not conduct any State or
local air monitoring system audits during the cycle.  The audit  results
continue to demonstrate that, overall, State and  local agencies  are
successfully operating and maintaining their respective SLAMS/NAMS networks.
About 99 percent of the 3,500 monitors operated by the audited  agencies
are meeting the CFR Part 58 monitoring requirements.

     Concerning the audit findings on resources and facilities,  most
agencies indicated that they had adequate space and personnel.   However,
73 percent of the agencies did have air monitoring equipment  needs,
including pollutant monitors, calibration systems, data processing equipment,
and meteorological equipment.  The estimated cost to procure  the needed
equipment is approximately $1.7 million.  Also, 20 agencies reported  a
need for major laboratory equipment, the cost of  which is about  $1 million.
                               6-2

-------
     Timely data submission remains a problem for many agencies.   As
reported in the 1985 audit results, data submittals for the pollutant
lead (Pb) continue to be the most deficient.  During the audit cycle,
approximately 29 percent of the agencies were late with their Pb  data
submittals.  The percentages of late submittals for all pollutants range
from 29 for lead to 10 for TSP.  Performance in meeting the National
Aerometric Data Bank (NADB) 75 percent data completeness criterion showed
a range of 16 percent, the lowest being 78 percent for N02 and the highest
being 94 percent for lead.  One of the larger problem areas found in data
management was the requirement for submitting the annual  SLAMS report.
The audit showed that, of the 45 audited agencies required to submit an
annual  SLAMS report, 9 agencies (20 percent) were deficient in one or two
of the four elements required in the annual  report.  Efforts to resolve
this problem administratively are continuing, with a slight improvement
occurring between the FY 1985 and FY 1986-87 audits.

     The quality assurance (QA) aspects of the audit reports indicated
that 64 of the 67 agencies had QA plans that were, in general, acceptable.
However, three agencies need to revise their plans substantially, and 32
had minor revisions to their QA plans pending.  Sixty-four of the 67
agencies were also participants in the National Performance Audit Program,
an excellent rate of involvement.  The last phase of the quality  assurance
program evaluation was an assessment of agencies'  achievement of  the
precision and accuracy goals.  The lowest achievement rate for the precision
goals (based on four consecutive quarters of data between October 1984
and March 1987) was for Pb, for which only 55 percent of the organizations
met the goal.  Based on data submitted to the Environmental  Monitoring
Systems Laboratory (EMSL) in 1986, 66 percent of the reporting organizations
met the Pb precision goal.  This difference of 11 percent is thought to
be principally because of the different time periods used.  Goal  achievement
for Pb precision was noticeably lower than for other pollutants,  in both
the audit results and the 1986 EMSL data.  The lowest percentage of
accuracy goals met was 65 percent, for NO2.  However, the 1986
NO2 accuracy achievement level based on the annual data submitted to EMSL
was considerably higher, at 94 percent.  There is no apparent reason for
this large difference, since the two data sets for other pollutants
compare fairly well.  The lower values for NO2 accuracy and Pb precision are
believed to be related to the complexity of the respective measurement
methods.  It is also possible that the wide confidence intervals (CI)
associated with NO2 accuracy estimates are related to the fact that, for
most reporting organizations, there are actually few NO2 sites relative to
the number of sites for the other pollutants.  In a statistical sense,
the presence of even a few relatively large, but still "acceptable",
individual audit differences is magnified into large quarterly CIs due
to the small number of observations actually composing the statistic.

     The FY 1986-87 audit cycle is different from previous audits under the
National Air Audit System in that it was carried out over a 2-year period,
with approximately 50 percent of the agencies being audited in each year.
The approach has both benefits and disadvantages.  Overall, the advantages
of the biennial cycle (reduced annual resource demands and added time for
agencies to implement corrective actions between audits) outweigh its
principal disadvantage: it is difficult to make


                               6-3

-------
comparisons of the 2-year audit results to other independent measures of
performance, which typically are annual compilations.

     Analysis of the various audit reports has indicated that the current
program does not provide a good mechanism for tracking the implementation
of corrective actions indicated by individual audits at the national  level.
Therefore, the Technical Support Division (TSD) will explore possible
mechanisms to track these actions and implement tracking during the next
audit cycle.

6.4  NETWORK DESIGN AND SITING

     The network design and siting section of the audit was aimed at
assessing air monitoring program compliance with the requirements of Appen-
dices D and E of 40 CFR Part 58.  To assess this topic, five overall  aspects
were reviewed:

          o Network size and written description of the network

          o Network modification during the previous year

          o Sites not meeting network design or siting requirements

          o Performance of the annual network review requirements

          o Survey of noncriteria pollutants monitored by each agency.

     Responses to this section of the audit were intended to provide a
cross-check and update of existing EPA data on the number of monitors, their
distribution, conformance with siting requirements, and compliance with
the annual network review provisions.  This section also provides an enhanced
perspective on network stability and on the variety of noncriteria pollutants
monitored by State and local agencies.

     The 67 audited agencies operated 1,103 national air monitoring stations
(NAMS), 2,360 State and local air monitoring stations (SLAMS), and 705 special
purpose monitoring stations (SPM), a total of 4,168 sites.  An indication of
network stability is evident in the number of reported changes to State and
local networks.  There were 144 new sites established, 293 sites discontinued,
and 88 sites relocated during the 1986-87 audit period, a total  of 525
modifications.  This total affected approximately 12 percent of the sites
covered by the audit and occurred in 61 of the 67 agencies audited.  During
the next audit period, nearly all agencies are planning significant network
changes to align their networks with the PM10 standards and monitoring
requirements promulgated on July 1, 1987.  Due to the short-term nature of SPM
monitoring, no attempt has been made to track these sites regularly.
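The network-change arithmetic above can be reproduced with a short script (the figures are those reported in this section; the code itself is only an illustrative cross-check):

```python
# Network figures reported for the FY 1986-87 audit period.
sites = {"NAMS": 1103, "SLAMS": 2360, "SPM": 705}
changes = {"new": 144, "discontinued": 293, "relocated": 88}

total_sites = sum(sites.values())        # 4,168 sites in all
total_changes = sum(changes.values())    # 525 modifications

# Share of the audited network affected by a modification.
pct_modified = 100 * total_changes / total_sites

print(total_sites, total_changes, round(pct_modified, 1))  # 4168 525 12.6
```

The 12.6 percent figure matches the "approximately 12 percent" cited in the text.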

     From the audit reports, it was determined that over 99 percent of the
3,463 SLAMS/NAMS operating monitors were in compliance with 40 CFR 58 Appen-
dices D and E.  There were 24 monitors listed in 14 States that did not
comply with probe siting criteria of Appendix E.  The most frequent cause
of noncompliance was obstruction of the sampling probe by trees.
                               6-4

-------
     Results of the review of SLAMS network compliance and the annual review
requirements of 40 CFR Part 58 show that all audited agencies maintain a
network description, although the descriptions for seven agencies (or 10
percent) are deficient in one or more of the required items.  These seven
States are repeats from the 1985 audit, and the respective Regional Offices
have been alerted to take corrective action.  Four agencies did not provide
the date of their last annual network review.  Six of the 67 agencies audited
indicated that network review is a continuing process covered in their
annual Section 105 grant negotiations.

     Audits of noncriteria pollutant monitoring showed that  38 of  the 67
agencies monitored for one or more such pollutants.  The most  frequently
monitored substances are organic solvents, metals, acid rain,  and  sulfates/
nitrates.  The list below shows the pollutants monitored and the number
of agencies monitoring for each.

                             Metals             13
                             Acid Rain           9
                             Asbestos            4
                             Solvents           17
                             Formaldehyde        1
                             Fluoride            4
                             BaP                 2
                             H2S                 5
                             NMOC                9
                             Fine Particulate    2
                             SO4/NO3            12
                             Sulfur              1
                             Phosphate           1
                             Radiation           1
                             Pesticides          1
                             Chloride            1
                             NH3                 1

6.5  RESOURCES AND FACILITIES

     The resources and facilities section of the audit includes information
about the magnitude of agency operations, the adequacy of resources,  and  the
condition of monitoring and support equipment.  Topics considered  in  this
section were:

          o  Number of nonconforming analyzers

          o  Instrument needs

          o  Number of work-years of effort

          o  Documentation of standard operating procedures  for
             laboratories and availability of necessary equipment

          o  Availability and traceability of laboratory and field (site)
             standard reference materials.
                                   6-5

-------
     Audit results for the resources and facilities section disclosed that
there are only four nonconforming pollutant monitors currently in use in
the SLAMS network, less than 0.1 percent of the sites reported.  All of
the nonconforming monitors are high-volume samplers for TSP, which do not
conform to the new shelter standards.  These instruments are being phased
out as part of the conversion to PM10 monitoring.

     Thirty-eight agencies reported specific monitoring equipment needs,
ranging from spare parts to new monitors or calibration systems.  Eleven
agencies indicated needs for monitors to meet the PM10 regulations,
which were promulgated after these audits.  Several agencies indicated
that equipment replacement is an annual budgeted item; therefore, they
did not list replacement items as needs.  The equipment needs have been
put in four categories:  field equipment, which includes such items as
pollutant monitors, flow controllers, and shelters; calibration and
quality control (QC) equipment, including items such as calibration
systems, gas dilution systems, and Roots meters; data processing
equipment, covering such items as personal computers, data loggers, and
telemetry equipment; and meteorological equipment.  The table below
indicates the broad categories of equipment requested, the number of
items, and the estimated costs of acquiring the equipment.  The estimated
total cost to purchase all of this equipment is $1.7 million.

                                    Number of
          Equipment Type              Items          $Cost (000)

          Field
            Monitors                   200              1,500
            Shelters, flow
              controllers               18                 85

          Calibration (QC)              21                 82

          Data Processing                6                 60

          Meteorological                 3                 10

                                                        1,737
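As a cross-check, the item and cost columns above can be totaled directly (values taken from the table; the script is illustrative only):

```python
# Equipment requests and estimated costs in thousands of dollars,
# as tabulated in the audit summary.
requests = {
    "field monitors":              (200, 1500),
    "shelters, flow controllers":  (18, 85),
    "calibration (QC)":            (21, 82),
    "data processing":             (6, 60),
    "meteorological":              (3, 10),
}

total_items = sum(n for n, _ in requests.values())
total_cost = sum(cost for _, cost in requests.values())
print(total_items, total_cost)  # 248 1737, i.e. ~$1.7 million
```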

     Twenty agencies also requested laboratory equipment.  This equipment
(a total of 30 items) included spectrophotometers, humidity-controlled
chambers, and microbalances.  The estimated cost to acquire these items is
$1 million.  Therefore, the total cost to meet the monitoring and laboratory
equipment needs identified by the audits is about $2.7 million.  In March
1988, the Agency published a comprehensive report on the monitoring equipment
needs of air pollution monitoring agencies, titled Final Results of the
1987 STAPPA/ALAPCO Survey of the Condition of State and Local Agency Air
Monitoring Equipment.  This report included responses from 45
State agencies and 150 local agencies.  Approximately 73 percent of
State/local agencies participated in the survey, representing areas  with
approximately 95 percent of the NAMS and SLAMS monitors currently in use.
                               6-6

-------
     The report rated the condition of air monitoring analyzers and
support related equipment in terms of good, fair, and poor.  "Poor"
equipment is that for which repair is economically infeasible and replace-
ment is urged.  The total estimated costs of replacing equipment identified
as in poor condition are $4 million for continuous analyzers and $3
million for support equipment.

     Seventy-five (75) percent of the agencies felt they had adequate
space to operate their program.  For a better estimation of the number of
people involved in air monitoring, the agencies were asked to give the
number of work-years associated with operation of their monitoring programs.
Sixty agencies provided tabulated responses to this question, while the
remaining seven agencies either did not respond or provided organization
charts that could not be related to work-years.  Twenty agencies indicated
a need for additional personnel.  The total number of work-years assigned
to four specific areas of air monitoring for the 60 agencies is shown
below.
                                       Number of
          Program Area                 Work Years     Percent

          Network Design & Siting         114            16
          Resources & Facilities          271            37
          Data & Data Management          167            23
          QA/QC                           173            24

                              TOTAL       725           100
     The audit showed that 57 agencies, or 83 percent of those audited, have
adequate laboratory standard operating procedures for air quality measurements.
However, only 72 percent (48 agencies) indicated that sufficient instrumentation
was available to conduct all necessary laboratory analyses.  There were five
questionnaires with no response to this portion.

     The last topic of the facility and resource section concerns reference
standards and their traceability for laboratory and field (site) use.  The
audit indicated that 90 percent of the agencies had, and could demonstrate,
adequate reference standards for laboratory use, and 72 percent could
demonstrate standards for field use.

6.6  DATA AND DATA MANAGEMENT

     The principal areas of concern in data and data management are:

          o  Percentage of data submitted on time

          o  Percentage of sites submitting less than 75 percent of
             the data
                                   6-7

-------
          o  Documentation of changes to submitted data

          o  Data changes performed according to a documented standard
             operating plan

          o  Completeness of the annual  SLAMS report.

     Each agency was asked to estimate the percentage of data submitted
within 120 days after the calendar quarter in which they were collected.
Although this is a requirement for NAMS sites, most agencies submit their
SLAMS data to the National Aerometric Data Bank (NADB) and have included
these in calculating the percentages for data submitted on time.  The cal-
culations, by pollutant, from 64 agencies show that the quarterly average
percentage submitted on time varied from 71 percent for lead to 90 percent
for TSP.  These percentages are in general  agreement with those reported
quarterly in the July 1987 NAMS Air Monitoring Station Network Report.

     The audits also examined the percentage of sites meeting the NADB
data completeness criteria (in general, submitting at least 75 percent of
the possible data by quarter by pollutant).  The 66 agencies' responses
on data completeness were combined as a national quarterly average data
completeness and were compared to the 1985 annual  percentages from the
July 1987 Quarterly NAMS Status Report.

                    Percent of Sites Meeting NADB Criteria

          Pollutant          Audit Results          1985 Annual

            TSP                   93                     93
            SO2                   86                     92
            NO2                   73                     78
            CO                    93                     90
            O3                    94                     88
            Pb                    86                     94

     Considering that the national audit results reflect a quarterly
average of approximately 4,000 sites, and the NAMS report is an average
for approximately 1,300 sites, the two data sets compare reasonably well
in that the differences range from 0 (for TSP) to 8 percentage points (for Pb).
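The NADB completeness criterion applied above (at least 75 percent of the possible observations per quarter, per pollutant) can be sketched as a simple test; the function name and the 1-in-6-day sampling example are illustrative assumptions, not the actual NADB software:

```python
def meets_completeness(reported: int, possible: int,
                       threshold: float = 0.75) -> bool:
    """Return True if a site reported at least `threshold` of the
    possible observations for a pollutant in a calendar quarter."""
    if possible <= 0:
        return False
    return reported / possible >= threshold

# Example: a TSP sampler on a 1-in-6 day schedule has about 15 possible
# observations per quarter; 12 reported values (80 percent) passes,
# while 10 (about 67 percent) does not.
print(meets_completeness(12, 15))  # True
print(meets_completeness(10, 15))  # False
```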

     The audit indicated that 58 (85 percent) of the audited agencies
documented permanently any change to air quality data previously submitted
to EPA, and that 55 agencies (82 percent) performed these changes according
to a documented standard operating procedure.
                               6-8

-------
      The last area considered under the data management section of the
audit was the requirement to submit an annual SLAMS report to EPA.  This
requirement applies to States, the District of Columbia, and U.S.
territories, so local agencies are not considered here.  A complete SLAMS
annual report must include four elements: a data summary, annual precision
and accuracy information, air pollution episode information, and
certification of the report.  Of the 45 audited agencies that are required
to produce a SLAMS annual report, 36 (80 percent) included all required
elements, while the remaining 9 were deficient in one or two elements of
the report.  The most frequently missing items were episode reporting and
precision and accuracy data.  The requirement to submit precision and
accuracy data in the annual SLAMS report was not removed from the
regulations until May 1986, so its removal was not a significant factor in
these audits.  These results are comparable to findings based on those
annual reports received by the Office of Air Quality Planning and
Standards (OAQPS).

6.7  QUALITY ASSURANCE/QUALITY CONTROL

     The Quality Assurance/Quality Control (QA/QC) section of the National
Air Audit System is the last major topic addressed in the audits.  The
following portions of the agencies' QA/QC programs were examined:

          o  EPA approved QA plan

          o  Pending revisions to QA plans

          o  Agency participation in the National Performance Audit
             Program (NPAP)

          o  Attainment of precision and accuracy goals.

     Several years ago, EPA completed its approval of State Quality
Assurance Program plans.  As a result of program reviews, and of the
changing state of the art, some agency QA programs are outdated or need
modification.  This is evidenced in the audit results showing that 3 out
of the 67 audited QA plans need major revisions and 32 (48 percent) have
initiated formal revision proposals concerning minor matters.  At the
time of the audits, two of the three needing major revisions were scheduled
for the next fiscal year or were otherwise in progress.  Participation in
the NPAP was very good, with 64 of the 67 audited agencies participating.

     The last consideration of the QA/QC section of the national audit
was an evaluation of the ability of the agencies' reporting organizations
to meet the precision and accuracy goals specified in the FY 1986-87
audit guidance (precision, +15% for all pollutants; accuracy, jvL5% for TSP,
+_2Q% for all other pollutants, using audit level  2 for accuracy).

     Precision and accuracy data are measures of data quality and are
based on "reporting organizations".  Each State must define one or more
reporting organizations for each pollutant, and each reporting
organization should be defined such that the precision and accuracy data
reported among all of its stations are reasonably homogeneous.  Nationally,
there are approximately 150 reporting organizations.

                               6-9

-------
     Sixty-three (63) agencies provided responses on precision performance
in a comparable format, while accuracy data were supplied  by 62 agencies.
Some of the agencies that did not provide comparable responses for this
section did provide some indications of achieving or failing to achieve
the goals.  The FY 1986-87 audit guidance directed that achievement of
goals was to be determined on each of the last four complete calendar
quarters for which precision and accuracy data were available, covering
the period October 1984 through December 1986.  Since no common time
period was available for all agencies, the four quarters during which
each reporting organization provided data on precision goals were summed.
Similarly, a summation of the accuracy goals achievement was prepared.
The results of this tabulation, by pollutant, are presented in Table 6.1
for precision and in Table 6.2 for accuracy.
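The tabulation rule described above (four quarters per reporting organization, with goal achievement expressed as the share of organization-quarters meeting the goal) can be sketched as follows; the organizations and their quarterly pass/fail values are hypothetical:

```python
# Hypothetical per-quarter pass/fail records for three reporting
# organizations; True means the precision goal was met that quarter.
quarters = {
    "org-A": [True, True, True, False],
    "org-B": [True, False, False, True],
    "org-C": [True, True, True, True],
}

# Total organization-quarters, and the number meeting the goal.
org_quarters = sum(len(q) for q in quarters.values())      # 12
quarters_meeting = sum(sum(q) for q in quarters.values())  # 9

pct_success = round(100 * quarters_meeting / org_quarters)
print(org_quarters, quarters_meeting, pct_success)  # 12 9 75
```

This is the same percent-success statistic reported by pollutant in Tables 6.1 and 6.2.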

     Achievement of precision goals by reporting organizations, as shown
in Table 6.1, varied from 55 percent for lead to a high of 88 percent for
CO.  Similarly, the achievement of accuracy goals shown in Table 6.2 ranges
from 65 percent for NO2 to 84 percent for Pb.  It is evident that NO2
accuracy falls considerably below the level achieved for other pollutants;
this has been attributed to the complexity of the NO/NO2/NOx analyzer.
It is also possible that the wide confidence intervals associated with NO2
accuracy estimates are related to the fact that, for most reporting
organizations, there are few NO2 sites relative to site counts for the
other pollutants.  In a statistical sense, the presence of even a few
relatively large, but still "acceptable", individual audit differences
is magnified into large quarterly confidence intervals because
of the small number of points actually composing the statistic.

     To provide further perspective on precision and accuracy performance
nationally, Table 6.3 for precision and Table 6.4 for accuracy have been
assembled from data submitted to EMSL for calendar year 1986.
                              6-10

-------
                                 TABLE 6.1

                FY 1986-87 AUDIT RESULTS FOR DATA PRECISION

                          Reporting        Rpt. Orgs.
                        Organizations   Quarters Meeting   Percent Success at
          Pollutant     (Rpt. Orgs.)         Goals           Meeting Goals

            O3               83               287                 87
            NO2              50               131                 66
            SO2              85               241                 73
            CO               74               260                 88
            TSP             109               294                 67
            Pb               66               146                 55


                                 TABLE 6.2

                FY 1986-87 AUDIT RESULTS FOR DATA ACCURACY

                          Reporting        Rpt. Orgs.
                        Organizations   Quarters Meeting   Percent Success at
          Pollutant     (Rpt. Orgs.)         Goals           Meeting Goals

            O3               82               266                 81
            NO2              49               127                 65
            SO2              84               240                 71
            CO               73               231                 79
            TSP             108               355                 82
            Pb               65               217                 84
                               6-11

-------
                                 TABLE 6.3

                   PRECISION OF 1986 DATA SUBMITTED TO EPA

                            Probability Limits (%)    Percent of Rpt. Orgs.
     Pollutant  Rpt. Orgs.    Lower       Upper         Meeting Goal ±15%

        O3          75         -14         +12                 93
        NO2         59         -19         +20                 80
        SO2        103         -19         +14                 86
        CO          90         -12         +13                 95
        TSP        136         -17         +19                 80
        Pb          68         -28         +30                 66


                                 TABLE 6.4

          ACCURACY OF 1986 DATA SUBMITTED TO EPA (AUDIT LEVEL 2)

                            Probability Limits (%)    Percent of Rpt. Orgs.
     Pollutant  Rpt. Orgs.    Lower       Upper         Meeting Goal ±20%*

        O3          98         -14         +15                 96
        NO2         40         -18         +16                 94
        SO2         82         -19         +18                 93
        CO          68         -12         +12                 98
        TSP*       127         -11         +12                 95
        Pb          68         -12         +10                 98

*TSP is based on ±15%.
                               6-12

-------
     Tables 6.3 and 6.4 utilize the entire 1986 data base.  The upper and
lower probability limits were selected so that the range would include 90
percent of the reporting organizations for precision and accuracy (audit
level 2).  The last column reflects estimates of the percentage of reporting
organizations meeting the audit goals for 1986 precision and accuracy
data submitted to EMSL.  The percentage of reporting organizations meeting
the audit goal of ±15 percent for precision ranges from 66 percent for Pb
to 95 percent for CO.  The goals for accuracy were ±15 percent for TSP
and ±20 percent for all other pollutants.  The range of reporting organi-
zations achieving the accuracy goals in Table 6.4 is from 93 percent for
SO2 to 98 percent for CO and Pb.
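A minimal sketch of how such probability limits can be computed, assuming the per-organization percent-difference values are available as a list (the data shown are hypothetical, and this is not the procedure EMSL actually used):

```python
import statistics

def central_90_limits(values):
    """Return bounds enclosing the central 90 percent of the values,
    i.e. the 5th and 95th percentiles."""
    qs = statistics.quantiles(values, n=20, method="inclusive")
    return qs[0], qs[-1]  # first cut point is the 5th percentile,
                          # last is the 95th

# Hypothetical per-organization percent differences for one pollutant.
diffs = [-18, -12, -9, -5, -2, 0, 1, 3, 6, 10, 14, 19]
lower, upper = central_90_limits(diffs)
print(lower, upper)
```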

     Tables 6.1 and 6.3, or 6.2 and 6.4, are not directly comparable with
each other, because the period of record for the agencies audited could
not be equated to a single quarter or year and because the number of
reporting organizations used is not the same.  However, based on 1986
annual data submitted to EMSL, the goal  achievement rate is higher for
all pollutants for precision and accuracy than that indicated by the
agencies audited.  Furthermore, the extent of the range (the percent of
reporting organizations meeting the precision and accuracy goals) in
Tables 6.3 and 6.4 is smaller than in Tables 6.1 and 6.2.

     Since the 1985 audit reports, significant national improvements in
the NO2 precision and accuracy summary measurements (Tables 6.3 and 6.4)
have been observed.  The percentage of reporting organizations meeting
the NO2 precision goal has increased from 75 to 80 percent, while the
accuracy percentage has improved by 24 percentage points, from 70 to 94
percent.

     The reasons for the rather low percentage of audited agencies meeting
the precision and accuracy goals for NO2 (Tables 6.1 and 6.2) are not
entirely clear, but the low achievement level for lead precision is believed
to be related to the analysis procedure, which analyzes two strips from the
same filter.  Differences may arise from variable Pb content across the
filter, handling losses, or inconsistency in cutting filter strips.  A
review of Pb precision data from 1982 through 1986 shows little improvement
over the period.  Consequently, it appears that the selected precision
goal of ±15 percent is too rigorous, and that some relaxation of this goal
should be considered.  This matter has been referred to the appropriate
groups in OAQPS and EMSL.  Preliminary findings indicate that the Agency
will be adjusting the precision and accuracy goals for lead.
                                  6-13

-------
                                CHAPTER 7

                   MOTOR VEHICLE INSPECTION/MAINTENANCE
7.1  EXECUTIVE SUMMARY

     By the end of FY 1987, there were inspection and maintenance (I/M)
programs in 33 states.  During FY 1986 and FY 1987, EPA conducted 33
program audits.  The audit findings summarized in this report confirm the
validity of findings of audits performed during FY 1985 and provide
further evidence that I/M is a reasonable and effective strategy for
reducing tailpipe emissions of vehicles in use.

     The audits also provided convincing evidence that the most effective
I/M program design is the centralized design, while the weakest is the
decentralized program with manual analyzers.  In fact, audits showed that
programs operating with a decentralized design and manual analyzers were
so significantly inferior in identifying failing vehicles and in achieving
the minimum emission reductions required by EPA that the Agency requested
corrective action from the Governors of seven States with this type of
program.

     Other problems documented during the first round of auditing were
encountered again in the audits in FY 1986 and FY 1987.  High waiver
rates, ineffective use of program data for program management, lack of
quality control and consistency of testing in decentralized programs, and
enforcement problems in sticker-enforced programs were identified during
both the first and second rounds of audits.  EPA also identified some
problems with using license or registration suspension as a means of
enforcement.

     Certain problems were found to be widespread among programs and EPA
has conducted several special studies to develop reasonable solutions  to
technical  problems identified in operating programs.  EPA has also, based
on audit results, developed and revised policies to assist State and
local  governments in resolving problems and to prevent problems from
occurring.   Much of EPA's proposed Post-1987 Ozone and Carbon Monoxide
Policy as it relates to I/M is based on lessons learned from auditing
operating I/M programs.

     While EPA still believes that the resolution of certain problems
generally rests with each I/M program, audit results make it clear that
differently designed programs do not realistically have equal benefit
potential.  Resolution of certain problems rests with changing program
design.  It is not likely that additional monitoring or administrative

                                  7-1

-------
control will be sufficient to resolve problems associated with decentralized
manual inspections.  The audit results also indicate that decentralized
tampering inspections may lack the impartiality, consistency, and accuracy
needed to achieve sufficient benefits.  Given the strength of the audit
findings, EPA favors the centralized program design over the decentralized
design.  In addition, EPA believes that registration denial enforcement
is more effective than sticker-based systems.

7.2  INTRODUCTION

     The primary purpose of I/M audits is to ensure that each  State or
locality is implementing and enforcing its I/M program in a manner consis-
tent with its State implementation plan (SIP).  A second objective of the
audit process is to identify areas where EPA can provide assistance to
strengthen I/M programs.  The I/M audit questionnaire and audit visit are
structured to obtain operating data and qualitative information to allow
EPA to determine if SIP goals and commitments are being met.

     In FY 1986, the Technical Support Staff developed and computerized a
methodology to assess program performance in terms of emission benefits
against SIP design and minimum requirements.  Using this methodology, EPA
calculates the expected program benefit assuming no operating problems and
disaggregates that benefit into its component parts: tampering deterrence,
emission reductions from tailpipe checks, and emission reductions from the
tampering check.  The benefits are then adjusted to reflect improper
testing, waivers, and enforcement problems.  If the analysis shows that
the benefit from an operating program falls below the required minimum
emission reduction, EPA formally requests a plan of corrective action from
the State.  The corrective plan requires States to resolve program problems
within 9 months to a year or face funding restrictions as required by the
Clean Air Act.  Based on the FY 1986-87 audits, EPA has requested
corrective plans from 12 I/M programs.
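A rough sketch of the adjustment chain described above; the parameter names, discount factors, and figures are illustrative assumptions, not EPA's actual model:

```python
def adjusted_benefit(tailpipe, tamper_check, deterrence,
                     test_quality=1.0, waiver_loss=0.0, enforcement=1.0):
    """Expected emission-reduction benefit (percent): the ideal benefit,
    disaggregated into its components, then discounted for improper
    testing, waivers, and enforcement problems."""
    ideal = tailpipe + tamper_check + deterrence
    return ideal * test_quality * (1 - waiver_loss) * enforcement

# Hypothetical program: a 25% ideal benefit eroded by poor testing,
# a high waiver rate, and weak enforcement.
benefit = adjusted_benefit(tailpipe=18, tamper_check=4, deterrence=3,
                           test_quality=0.6, waiver_loss=0.2,
                           enforcement=0.9)
required_minimum = 25.0  # hypothetical SIP minimum
print(round(benefit, 1), benefit < required_minimum)  # 10.8 True
```

In the audit methodology, a result like this (adjusted benefit below the required minimum) is what triggers a formal request for a corrective action plan.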

     Sixteen (16) audits were conducted in FY 1986 and 17 in FY 1987.
Eight of these audits were follow-up audits of programs that had received
initial audits in FY 1985.  Three programs, in Indiana, Georgia, and Davis
County, Utah, were audited twice in the FY 1986-87 timeframe.    Twenty-
three initial audits were conducted so that by the end of FY 1987, all
programs that had been operating for 1 year or more had received an
initial program audit.  Tables 7.1 and 7.2 list the I/M programs audited
in 1986-87 along with the dates of each audit.

7.3  MAJOR FINDINGS AND CONCLUSIONS

Audit Results

     Detailed results of each program audit are contained in full  audit
reports written jointly by EPA Regional and Headquarters staff. Table 7.3
summarizes the audit findings by program type.  The principal  findings can be
described as follows:
                                  7-2

-------
                                 TABLE 7.1

                             FY 1986 I/M AUDITS

            Location                                    Dates

     Utah (Davis and Salt Lake Counties)            10-7  to 10-10
     Nashville, Tennessee                           11-5  to 11-7
     Maryland                                       11-19 to 11-21
     Georgia (follow-up)                            1-29  to 1-30
     California                                     1-27  to 2-6
     Seattle, Washington                            2-24  to 2-28
     Louisville, Kentucky                           3-04  to 3-07
     Pennsylvania                                   3-10  to 3-13
     Indiana                                        3-18  to 3-21
     Houston, Texas (follow-up)                     4-28  to 5-02
     New York (follow-up)                           6-11
     Georgia (follow-up)                            6-11  to 6-12
     Memphis, Tennessee                             6-16  to 6-18
     Massachusetts                                  7-07  to 7-11
     Fairbanks, Alaska                              8-20  to 8-22
     Anchorage, Alaska                              8-25  to 8-27
     Connecticut                                    9-23  to 9-25
                                 TABLE 7.2

                             FY 1987 I/M AUDITS

            Location                                    Dates

     New Jersey (follow-up)                         11-17 to 11-21
     Louisiana                                      12-01 to 12-05
     Indiana (follow-up)                            1-12  to 1-14
     Provo, Utah (preliminary evaluation)           1-15  to 1-16
     Tulsa, Oklahoma                                2-09  to 2-12
     Michigan                                       3-10  to 3-12
     District of Columbia (follow-up)               3-22
     Nevada (follow-up)                             3-27
     Dallas, Texas                                  3-30  to 4-03
     Missouri (follow-up)                           5-04  to 5-07
     Illinois                                       5-11  to 5-14
     El Paso, Texas                                 6-01  to 6-05
     Provo, Utah                                    8-10  to 8-12
     Davis County, Utah (follow-up)                 8-12  to 8-14
     Spokane, Washington                            9-21  to 9-24
                                  7-3

-------
                                 TABLE 7.3

                I/M DATA COLLECTED DURING FY 1986-87 AUDITS

                          Failure  Compliance  Waiver  Tamper Test  Quality Assurance
Program                   Rate %     Rate %    Rate %   Problems       Problems

DECENTRAL MANUAL
Georgia (86)                 4         63         ?        Yes            Yes
Georgia (87)                 7         93        22        Yes            Yes
Boise, Idaho                 9         98         3        Yes            Yes
Missouri                     9         95        23        Yes            Yes
Nevada                      11         95        40        NA             Yes
New York                     5         98         ?        Yes            Yes
Davis County, Utah          10         95        25        Yes            Yes
Provo, Utah                 18         95        10        No             No
Salt Lake City, Utah        10         95        20        Yes            No

DECENTRAL COMPUTERIZED
California                  25         95        19        Yes            Yes
Pennsylvania                18         94         4        NA             No
Fairbanks, Alaska           19         95        20        Yes            Yes
Anchorage, Alaska           15         95        10        Yes            Yes
Michigan                    22         95        13        NA             Yes
El Paso, Texas               ?          ?         ?        Yes            Yes

CENTRAL GOVERNMENT
District of Columbia         ?         18       100        NA             Yes
Memphis, Tennessee           8         66         0        NA             Yes
New Jersey                  20         95        NA        Yes            No

CENTRAL CONTRACTOR
Connecticut                 12         93        14        NA             No
Illinois                    21         73        10        NA             No
Indiana                     24         50        20        NA             No
Louisville, Kentucky        16        100        19        NA             No
Maryland                    15         99        20        NA             No
Nashville, Tennessee        24         95        NA        NA             No
Seattle, Washington         17         94        17        NA             No
Spokane, Washington         18         78        14        NA             Yes

TAMPERING ONLY
Houston, Texas               ?         95        NA        Yes            Yes
Louisiana                    ?         85        NA        Yes            Yes
Tulsa, Oklahoma              ?         95        NA        Yes            Yes
Dallas/Ft. Worth, TX         ?         98        NA        Yes            Yes

NA   Not applicable; waivers are not issued or tamper checks are not
     performed as part of the initial test.
 ?   Unknown, usually due to lack of data analysis on the part of the
     program.

                                   7-4

-------
1)   Improper Testing - All  manual  decentralized programs had much
     lower than expected failure rates as a result of improper testing.
     Audit results showed that failing vehicles were not being
     properly identified in  these programs, due to inspector malfeasance
     or incompetence.  Serious emission reduction shortfalls resulted
     in all  but two areas with programs using this design.

     Tampering inspections in decentralized programs also suffered
     from improper inspection.  Auditors found that inspectors often
     failed to perform tampering inspections correctly or neglected
     them entirely.  Roadside surveys conducted by EPA confirmed
     these findings.

2)   Enforcement - As in the earlier audits, EPA found a range of
     compliance rates from 50 percent to greater than 95 percent
     among programs.  The lowest rates were in sticker enforced
     programs and in some programs  with driver's license or registration
     suspension.

3)   Waiver rates - EPA auditors found that waiver rates varied among
     programs.  Excessive waiver rates were found in programs that
     did not have adequately administered waiver processes or had
     too low a cost requirement for the waiver.

4)   Quality control and quality assurance - FY 1986-87 audits showed
     that analyzer and inspection quality control were excellent in
     the centralized, contractor-run programs.  The decentralized
     programs had more problems with both QC and QA in general, and
     these problems were quite severe in most decentralized manual
     programs and decentralized tampering programs.

5)   Data analysis - The centralized, contractor-run programs for the
     most part had the best  data and data management capabilities.
     Audits showed that a few of the more sophisticated decentralized
     programs captured and used program data for management purposes,
     but most decentralized  programs lacked reliable data with which
     to effectively evaluate and manage program operations.  The
     audits also revealed that the  data from the decentralized
     manual  programs and decentralized tampering programs were in
     most cases completely unreliable.

6)   Quality of I/M Repairs - Most programs lack the data needed to
     seriously evaluate the quality of I/M repairs.  EPA did find,
     however, that assuring quality repairs was beyond the scope of
     most programs.

     The high waiver rates in many  programs and the small reductions
     obtained from waived vehicles  indicated the failure of vehicle
     repair.  Repair studies conducted under EPA contract showed
     that vehicles failing I/M programs can be successfully repaired
     at higher rates than EPA audits indicated was the case in the
     field.
                              7-5

-------
National Program Management

      EPA has made an effort to integrate audit findings into policies
that will result in more effective I/M programs.  The first step in this
process was requiring programs with shortfalls to correct program defi-
ciencies.  The second step was using the knowledge gained through the
audits to design the Agency's Post-1987 Ozone and Carbon Monoxide Policy,
which was proposed in the Federal  Register on November 23, 1987.  EPA
also revised the National Air Audit Guidance to reflect the FY 1986-87
audit experience.

Correcting Operating Problems

     Program design is the basis for improper testing problems and some
enforcement shortfalls.  EPA believes that certain problems can be resolved
through tightening of administrative procedures; however, problems serious
enough to cause emission reduction shortfalls may require program design
changes.  Programs with a decentralized design using manual analyzers
have so many problems in the field that EPA no longer finds this design
approvable.  While using computerized analyzers mitigates some of the
problems of the decentralized design, it does not solve all of the problems.
The decentralized approach does not allow for consistency in either
tampering or emissions checks, quality control is more difficult than in
the centralized network, and data collection is less reliable.  Audit
results lead EPA to conclude that registration denial is more effective
than either registration/license suspension or sticker enforcement.

Future I/M Audit Policy

     The consolidation of audit results from FY 1985 to FY 1987 is reflected
in the Agency's proposed Post-1987 Ozone Policy.  Under this policy, the
minimum emission reduction requirement for areas with severe air quality
problems is based on a centralized program design and assumes 100 percent
enforcement.  In the future, if a waiver provision is included in the
program, the SIP will have to include a commitment to a waiver rate,
which will be factored into the program benefit.  Previous program operat-
ing results, including enforcement levels and reported failure rates,
will also be taken into consideration in determining the program benefit.
To encourage better and more uniform data management and application, EPA
will require programs to report certain statistics on a biannual basis.
EPA will continue to audit programs on a regular basis, incorporate audit
findings into policy, and work through the audit process to ensure that
SIP commitments are met.

7.4  REPORTED FAILURE RATES/IMPROPER TESTING

     EPA auditors found that failure rates varied with program design.
The FY 1985 audits had shown reported failure rates in decentralized
manual programs, such as Virginia and North Carolina, to be consistently
lower than the design failure rates.  Eight decentralized manual programs
were audited in FY 1986-87.  All but three of these programs (the
exceptions being Boise, Provo, and Salt Lake City) had failure rates so
low that emission reduction levels fell below the minimum emission
reduction requirement.


                                   7-6

-------
     The primary reason for the low failure rates was found  to be improper
testing.  In some cases, vehicles may have been undergoing pre-inspection
repair.  However, in most cases, inspectors were making mistakes  in
testing, failing to test the vehicle, or intentionally passing a vehicle
that should fail.  The overwhelming evidence accumulated through the
audit process has led EPA to be skeptical about its previous position
that manual  decentralized programs can be successful under a rigorous
program of surveillance.  Regular station audits, extensive undercover
auditing, and a data analysis capability that allows the tracking of
station performance are essential for even marginal  success  in a  decentra-
lized program using manual analyzers.  Size is another important  factor
in the success of decentralized programs.  The larger programs cannot
maintain the personal  contact with stations exhibited in the smaller,
more successful programs.  The Boise, Provo, and Salt Lake City programs
are small, well managed programs that meet the minimum emission reduction
requirement but still  fall below the expected results, given the  program
coverage and cutpoints.  EPA believes that the basic problems  with the
decentralized manual approach are so intractable that no future programs
with this design will  be approved for SIP credit.
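     The station-performance tracking described above is, at bottom, a data
analysis exercise.  The following sketch is illustrative only (the data
layout, station names, and 50-percent cutoff are assumptions for the
example, not EPA policy): it flags stations whose reported failure rates
fall far below the design failure rate, the pattern the audits associate
with improper testing.

```python
# Illustrative sketch, not drawn from any audited program: flag stations
# whose reported failure rate falls far below the design failure rate.

def flag_suspect_stations(tests_by_station, design_failure_rate, cutoff=0.5):
    """Return stations whose failure rate is below cutoff * design rate."""
    suspect = []
    for station, results in tests_by_station.items():
        failure_rate = sum(1 for r in results if r == "fail") / len(results)
        if failure_rate < design_failure_rate * cutoff:
            suspect.append(station)
    return sorted(suspect)

tests = {
    "Station A": ["pass"] * 90 + ["fail"] * 10,   # 10 percent failure rate
    "Station B": ["pass"] * 99 + ["fail"],        # 1 percent failure rate
}
print(flag_suspect_stations(tests, design_failure_rate=0.12))
# -> ['Station B']
```

A real oversight agency would follow such a flag with overt and covert
audits of the station rather than treating the statistic alone as proof.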

     Auditors found that to some degree the use of computerized analyzers
mitigated the effects of the decentralized design on improper  testing.
Failure rates in decentralized programs using computerized analyzers were
usually close to, although generally slightly below, the design failure
rate range,  with a few notable exceptions.  The New Jersey program,  which
has both centralized,  State-run testing facilities and licensed decentra-
lized stations using computerized analyzers, reported a failure rate of
7.3 percent in the decentralized stations.  This compares to a reported
failure rate of 20.3 percent in the
centralized  stations.   (These reported figures were not well documented,
and further investigation may find valid reasons for some of the  disparity.)
Improper testing in computerized programs requires greater effort on the
part of the inspector, but is possible.  Most computerized programs  do
not do all the data analysis or covert auditing they could to  detect,
quantify, and pursue suspected cases of improper testing.  The proposed
Post-1987 Ozone and CO Policy addressed these issues.

     Some of the problems that could mask improper testing in  decentralized
stations with computerized analyzers were reported.   It is possible  to
produce computer-generated test records showing a pass for a failing
vehicle by entering into the analyzer the proper identifying information
for the vehicle but then testing another that is a known pass. Some
programs experienced data loss as a result of the use of cassette recording
media in computerized analyzers.  Part of this problem could stem from
deliberate tampering with cassettes or the recording media to  avoid
detection of improper  issuance of a certificate of compliance.

     EPA found that all centralized contractor-run programs  had failure
rates within the range of expected results.  Two centralized,  locally  run
programs, the District of Columbia and Memphis, were audited and  both were
found to have low failure rates.  The District addressed this  problem by
installing computerized analyzers.  Auditors found that the  cutpoints  in the
Memphis program were very loose and the inspectors in some cases  were not

                                   7-7

-------
inspecting vehicles properly.  The standards have now been tightened and
two new stations with modern equipment have been installed.

7.5  ENFORCEMENT

     As in earlier audits, EPA auditors found that the rate of compliance
varied from program to program, with rates ranging from greater than 95
percent to a low of 50 percent.  Of the 30 programs audited, 7 programs
had serious enforcement shortfalls (see Table 7.3).

     Three enforcement methods are currently used in I/M programs:
sticker enforcement, registration denial, and data-link enforcement.  The
problems with sticker based enforcement programs found in the FY 1984-85
audits were found again in the FY 1986-87 audits.  These problems include
inadequate sticker accountability, inability to easily identify subject
vehicles, inability to distinguish expired stickers from valid ones, and
failure of police departments to devote the necessary resources to  enforce-
ment of the testing requirement.

     A sticker enforced program must have complete sticker accountability
procedures that are rigorously followed in order to be effective.  To
ensure the proper disposition of stickers, government agencies must
confirm that each sticker has a matching inspection record showing  a
passing score.  In decentralized programs, this process of accountability
is very resource intensive, requiring correlation of sticker serial
numbers to inspection reports and manual  review of inspection records,
sticker records, and sticker supplies during audits.  It also requires a
process of sticker distribution that allows auditors to track (by serial
number) inspection stickers issued to each inspection facility.  EPA
found that in most programs, sticker accountability systems are well
designed and implemented.  On the other hand, the New York program  had as
many as 800,000 stickers for which no test record was found.  Part  of
this problem stems from data loss due to use of cassettes, but EPA  found
that the New York auditors did not systematically check for sticker
problems during audits and stickers were being issued without tests.
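     The reconciliation step described above amounts to matching issued
sticker serial numbers against passing inspection records.  A minimal
sketch (the field names and serial formats are hypothetical, not drawn
from any state's system):

```python
# Hypothetical sketch of sticker accountability: every issued sticker
# should match an inspection record with a passing result.

def unaccounted_stickers(issued_serials, inspection_records):
    """Return serials of issued stickers with no matching passing test."""
    passing = {rec["sticker_serial"] for rec in inspection_records
               if rec["result"] == "pass"}
    return sorted(s for s in issued_serials if s not in passing)

records = [
    {"sticker_serial": "0001", "result": "pass"},
    {"sticker_serial": "0002", "result": "fail"},  # failed, yet sticker issued
]
print(unaccounted_stickers(["0001", "0002", "0003"], records))
# -> ['0002', '0003']
```

In a decentralized program this match must also be tied to the serial
ranges distributed to each station, which is what makes the procedure so
resource intensive.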

     In programs where registration denial is used as the mechanism for
enforcing the program, few problems have been identified.  EPA audited 16
programs that use registration denial enforcement in FY 1986-87.  In some
cases, the estimates for subject vehicles were rough, making accurate
calculation of compliance rates difficult.  However, all but three  programs
appeared to have compliance rates in excess of 94 percent.  The complex
geographic coverage of the I/M area and problems with the decentralized
and largely unsupervised registration system resulted in low compliance
in the Spokane I/M program.  The Memphis and Atlanta programs also had
low compliance rates, which appeared to be related to loopholes during
the phase-in period of new registration denial systems.

     During FY 1986-87, EPA conducted the first audits of programs  using
the computer matching method of enforcement.  These programs differ in
design details, but all rely on matching test records with registration
data on subject vehicles.  When noncomplying vehicles are identified,
vehicle owners are notified.  If vehicles are not subsequently tested,


                                  7-8

-------
then vehicle registrations are suspended or the vehicle owner receives  a
citation to appear in court.  Illinois is different in that it suspends
the non-complying vehicle owner's driver's license.
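     The core matching logic described above reduces to a set difference
between registered subject vehicles and vehicles with test records.  The
sketch below is illustrative only; as the audit findings make clear, the
hard part in practice is keeping the registration data base current
(vehicle sales, moves out of the area), not the match itself.

```python
# Illustrative sketch of computer-matching enforcement: identify subject
# vehicles with a current registration but no test record.

def vehicles_to_notify(registered_vins, tested_vins):
    """Return registered subject vehicles that lack a test record."""
    return sorted(set(registered_vins) - set(tested_vins))

registered = ["VIN100", "VIN200", "VIN300"]   # registration data base
tested = ["VIN100", "VIN300"]                 # matched test records
print(vehicles_to_notify(registered, tested))
# -> ['VIN200']
```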

     Of the four programs that rely solely on computer matching, the
Maryland and Louisville, Kentucky programs worked well.  In both programs,
compliance rates were near 99 percent.  Non-complying vehicles were
usually brought into compliance within 3 months of notice of violation.

     In Illinois and Indiana, auditors found severe problems with the
computer matching enforcement.  The computer matching system of enforce-
ment requires inter-agency cooperation, an accurate up-to-date registra-
tion data base, and substantial  computer-related resources.  The Illinois
system is complicated by the addition of driver's license data.  The
notification and suspension process has taken much longer than the sche-
duled 7 months.  Often, vehicles that are found to be out of compliance
have been sold, the registration data base does not reflect the sale, and
driver's licenses are improperly suspended.  The Indiana program had the
lowest compliance rate in the country in 1986-87 and continued to experience
difficulties with system design, dedication of resources, and inadequate
inter-agency cooperation in enforcing the program.

     After 4 years of auditing I/M programs, EPA has found that the
sticker enforcement program design is less effective and more resource
intensive (if conscientiously executed) than the registration denial
method of enforcement.  Of all the sticker enforced programs audited,
almost half had problems.  Sticker enforcement is less difficult to
implement in terms of identifying subject vehicles where the I/M program
is statewide.  However, problems with sticker accountability, inadequate
police resources, and low priority for enforcing the program remain.  It
is clear that computer matching enforcement can be as effective as regis-
tration denial, but the resources required to implement such a complex
system make it less favorable than simple registration denial.


7.6  WAIVERS

     EPA auditors found that waiver rates varied considerably among the
programs audited.  The highest waiver rate was 40 percent and the lowest
was 2 percent.  Unless there was a carefully administered waiver system,
waivers tended to be a weakness in all programs that allow them.

     To some extent, excessive waivers varied with program design.  In
programs where the administering agency processes all waiver applications
(many centralized programs, a few decentralized), the reason for high
waiver rates tended to be lenient requirements.  Vehicles receiving
improper or poorly-performed repairs were granted waivers as long as the
repair cost limit was reached.  It is not unusual for retest scores on
failed vehicles to remain the same or increase as a result of such repairs.
Repair cost limits were often inadequate to ensure that vehicles received
the basic repairs needed to bring the vehicle into compliance.  In addition,
vehicles eligible for warranty coverage could get waivers without ever
having sought a free warranty repair, and owners of failing vehicles were


                                  7-9

-------
allowed to do their own repairs and receive a waiver even if the repairs were
inadequate.  Repairs done by vehicle owners were often ineffective and,
in one program that had data available, about one-third of the waivers
were from this group, a disproportionately large percentage.  Finally,
not all commercial repairs were appropriate for the cause of the I/M
failure.

     In decentralized programs, where the inspection stations usually
have the authority to grant waivers, high waiver rates are also caused by
a lack of control over the waiver process.  In these cases, State or
local agencies did not track waiver rates by inspection station and did
not investigate questionable waiver transactions.  In some decentralized
programs, the waiver rates were surprisingly low.  In some cases, the
rates were so low (less than 2 percent) that it reinforced the suspicion
that vehicles are not being correctly failed at reinspection.  As in
centralized programs, a low waiver cost limit usually meant a higher
waiver rate.

     EPA has run several  test programs in which failing vehicles have
been recruited from I/M programs.  Results of these studies indicate that
most vehicles can be repaired successfully for reasonable costs.  Based
on these findings, EPA recommends that cost waiver limits be increased to
$200 for 1981 and later vehicles, and to $75 for pre-1981 vehicles. The
typical cost waiver limit found in I/M programs is $50 or $75.  However,
anti-tampering programs typically do not allow cost waivers for the
repair or replacement of tampered components.  In addition to raising the
cost limit, other factors must be considered in controlling waivers.
Proper administrative procedures must be followed in granting waivers and
no waiver should be granted where there is no decrease in emissions (EPA
recommends that at least a 20 percent reduction be required).  Bringing
pressure on the repair industry to perform proper repairs is difficult,
but some agencies are dealing with the repair industry where repairs from
a facility are consistently ineffective.  EPA believes that the waiver
process and monitoring of repairs is most easily managed in the centralized
program where the government oversight agency is responsible for approving
waivers.
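     The two waiver checks discussed above, reaching the repair cost limit
and showing the recommended minimum emission reduction, can be sketched as
follows.  This is an illustration, not a prescribed procedure; a real
program would also check warranty eligibility and who performed the
repairs.

```python
# Hedged sketch of waiver eligibility: the repair cost limit must be
# reached, and the retest must show at least the 20 percent emission
# reduction EPA recommends.  Readings here are generic concentrations.

def waiver_eligible(repair_cost, cost_limit, before_reading, after_reading):
    """Apply the cost-limit and minimum-reduction checks to one vehicle."""
    if repair_cost < cost_limit:
        return False                  # cost limit not reached
    reduction = (before_reading - after_reading) / before_reading
    return reduction >= 0.20

print(waiver_eligible(200, 200, 6.0, 4.0))   # 33 percent drop -> True
print(waiver_eligible(200, 200, 6.0, 5.5))   # ~8 percent drop -> False
```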

     Centralized waiver processing provides the opportunity for intervention
in the repair process.  For example, Louisville requires all owners of
failed vehicles to present a completed repair form at the time of the
retest.  The data required include the name of the repair facility,
repairs conducted, and the cost.  Program officials periodically publish
data on each repair facility involved in I/M repair.  The data published
show the number of vehicles repaired, the number that pass and fail the
retest, and the number that experience no emission improvement at all.
Program officials report that this has had a very positive impact on the
repair industry.  The approach used in the Milwaukee, WI program (audited
in FY 1985) is to track repair facility performance.  When a facility is
returning too many vehicles that fail the retest, program officials visit
the facility, encourage the mechanic(s) to take advantage of training
available through local colleges, and provide assistance in analyzer
calibration and testing.   Another control used in some centralized programs
is to physically check the vehicle to see if repairs were actually performed
and also to check for tampering with emission controls.  These strategies

                                  7-10

-------
are effective in preventing abuse of the waiver system and ensuring that
emission reductions are not lost without justification.

     In the past, EPA has calculated SIP credits under the assumption
that all vehicles failing an I/M test are successfully repaired.  Since
overall average emission reductions do not usually occur among vehicles
given waivers, EPA now considers waiver rate in analyzing current program
success and evaluating future program designs.   In the future, EPA will
ask programs to commit to a design waiver rate  for the purpose of calculating
SIP credit.
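     One simple way to read the adjustment described above (an
illustration of the idea only, not EPA's published crediting formula) is
that waived vehicles contribute essentially no reduction, so the
committed waiver rate directly discounts the program benefit:

```python
# Hedged illustration, not EPA's published formula: discount the design
# emission reduction by the committed waiver rate, since waived vehicles
# are failing vehicles that yield little or no reduction.

def waiver_adjusted_benefit(design_reduction, waiver_rate):
    """Scale the design emission reduction by the non-waived share."""
    return design_reduction * (1.0 - waiver_rate)

# A program designed for a 25 percent fleet reduction that commits to a
# 10 percent waiver rate would be credited with roughly a 22.5 percent
# reduction under this simple reading.
benefit = waiver_adjusted_benefit(0.25, 0.10)
```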

7.7  QUALITY CONTROL

     Quality control (QC) is used here to refer to the quality enhancing
practices of the person or organization doing vehicle inspections.  The
centralized, contractor-run programs for the most part had excellent
quality control.  The most common problem found in these programs was
misnamed calibration gases (cylinder concentrations differing from the
labeled value), but the degree of error was always small
(within a few percentage points).  Quality control tended to vary consi-
derably among centralized, government-run programs.  In addition to
problems with calibration gases, these programs in some cases had poor QC
practices, especially as they relate to the frequency of checks.  Quality
control in decentralized manual programs tended to be extremely poor.
Often, over 50 percent of the analyzers checked were found out of calibra-
tion in these programs.  It was found that quality control procedures are
not typically followed by inspection stations.   In decentralized programs
with computerized analyzers, QC was found to be much better than manual
programs, but not up to the level expected.  Typically, about 25 percent
of the analyzers checked in computerized programs failed the calibration
test.  Some of this was due to station gas quality but, in most cases,
this was not the cause.  EPA is concerned at this point that weekly
calibrations, which are required in all decentralized programs, are not
frequent enough to ensure reasonable quality control.  This issue will be
pursued further to assess the causes and possible mitigations.

7.8  QUALITY ASSURANCE

     Quality assurance (QA) refers to the practices of the persons,
agency, or organization one level removed from  the actual conduct of
inspections.  Quality assurance is fundamentally different in centralized
and decentralized programs.  Centralized, contractor-run programs generally
required only minimal quality assurance on the  part of the oversight
agency.  Generally, the QA done involved weekly or monthly visits to
inspection sites to observe testing, to check analyzer calibration, and
the like.  As stated above, the level of quality control in these programs
was exemplary and this is due in part to the vigilance of oversight
agencies.  Also, in most such areas there is a  degree of hostility by
some members of the public and legislators toward the contractor, and so
the contractor is careful to avoid incurring legitimate negative criticism.
In centralized, government-run programs, both the level of quality control
and the effectiveness of quality assurance were uneven and, in one or
more cases, very poor.  In most cases, these programs have split
                                    7-11

-------
responsibilities between the air agency and the motor vehicle  agency,
with the latter doing the testing and implementing quality control  procedures
and the former responsible for quality assurance.  In some cases,  little
or no quality assurance was being conducted by the air agency  and  this
was usually reflected in the results of EPA's analyzer audits.  In  some
cases, the QA procedures were not entirely adequate.   For example,  loose
audit tolerances were found in some programs (e.g., ±10% as opposed to
the +5/-7% recommended by EPA).  Also, other QA practices were not  used
in some cases, such as observing inspectors testing vehicles and processing
forms.  Even when the air agency is following good quality assurance
practices, its ability to obtain quality control  corrections by the motor
vehicle agency may be low, since the only official with authority  over
both agencies may be the Governor.
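     The +5/-7 percent audit tolerance mentioned above is an asymmetric
check on the analyzer's reading of an audit gas against the gas's named
value.  A minimal sketch (the function and its argument names are
illustrative):

```python
# Illustrative sketch of an analyzer audit check against the asymmetric
# +5/-7 percent tolerance the text attributes to EPA's recommendation.

def within_audit_tolerance(measured, named_value, upper=0.05, lower=0.07):
    """True if the reading deviates no more than +5%/-7% from the gas."""
    deviation = (measured - named_value) / named_value
    return -lower <= deviation <= upper

print(within_audit_tolerance(104.0, 100.0))  # +4 percent -> True
print(within_audit_tolerance(106.0, 100.0))  # +6 percent -> False
print(within_audit_tolerance(94.0, 100.0))   # -6 percent -> True
```

A looser symmetric ±10 percent tolerance would pass readings that this
check correctly rejects on the high side.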

     In decentralized programs, quality assurance ranged from  poor to
excellent.  The major problems included inadequate procedures  and  guidelines
for auditors, failing to follow through once problems were found,  and
poor tracking of station performance.  Audit procedures often  departed
from EPA's basic policy requirement that auditors check to see that all
program rules and regulations are being followed.  The most important
feature of this is to observe an inspector actually doing an inspection.
When this important practice was followed, problems were often found.
More importantly, most programs imposed lenient penalties or only  gave
out warnings for infractions.  In a few cases, the administrative  process
or legal barriers discouraged or prevented more effective penalties but,
more often, the problem was a lack of will on the part of the  oversight
agency to suspend or revoke station and inspector licenses.  Even when
licenses were suspended, the suspension period was often too short to be
a meaningful deterrent.  Good QA programs generally included very
thorough overt audits conducted quarterly (or monthly in manual  analyzer
programs) and periodic covert audits using vehicles set to fail  the
emission and anti-tampering inspection.  Another positive feature  of good
quality assurance programs was tracking of station performance through
the use of very organized, often computerized, recordkeeping.

7.9  DATA ANALYSIS

     EPA auditors found that with a few exceptions decentralized programs,
especially programs collecting data manually, were failing to  effectively
use program data to monitor program performance.  Centralized  programs
were doing a better job where the programs were managed by a contractor
who made use of data and provided reliable data to the oversight agency.
In all of the centralized programs, the data were at  least available to
construct meaningful program statistics.  This was not the case in  most
decentralized programs audited, where data loss was found to be a  common
problem.

     Audits showed manual data collection in decentralized programs, both
tailpipe and tampering, to be so unreliable that nothing meaningful can
be extrapolated from the data.  Records are often illegible and therefore
unusable.  Many inspectors also do not record the data correctly.   EPA
found, as it did in earlier audits, that manually collected data sometimes
contained easily identifiable patterns of record falsification.
                                  7-12

-------
     EPA found that the data collected varied from program to  program,  as
did definitions and calculation of reported statistics.   This  lack of
uniformity and the inability of programs to use available data illustrate
the need for stronger direction from EPA for future reporting requirements.
                                  7-13

-------
                                     TECHNICAL REPORT DATA
                     (Please read Instructions on the reverse before completing)

 1. REPORT NO.                                     3. RECIPIENT'S ACCESSION NO.
    EPA-450/2-89-003

 4. TITLE AND SUBTITLE                             5. REPORT DATE
    National Air Audit System -                       January 1989
    FY 1986-1987 National Report                   6. PERFORMING ORGANIZATION CODE

 7. AUTHOR(S)                                      8. PERFORMING ORGANIZATION REPORT NO.

 9. PERFORMING ORGANIZATION NAME AND ADDRESS       10. PROGRAM ELEMENT NO.
    Office of Air Quality Planning and Standards
    U.S. Environmental Protection Agency           11. CONTRACT/GRANT NO.
    Research Triangle Park, North Carolina 27711      68-02-4393

12. SPONSORING AGENCY NAME AND ADDRESS             13. TYPE OF REPORT AND PERIOD COVERED
    Director, Office of Air Quality Planning          Final - FY 1986-87
      and Standards
    Office of Air and Radiation                    14. SPONSORING AGENCY CODE
    U.S. Environmental Protection Agency              EPA/200/04
    Research Triangle Park, North Carolina 27711

15. SUPPLEMENTARY NOTES

16. ABSTRACT
       The National Air Audit System (NAAS), which was developed jointly by EPA and
    representatives of State and local air pollution control agencies, was implemented
    for the first time in FY 1984, and audits were again conducted in FY 1985.  The
    audits covered by this report were performed at State and local agencies over the
    2-year period of FY 1986-87, in the areas of air quality planning and State
    implementation plan activity, new source review, compliance assurance, air
    monitoring, and inspection and maintenance.

       The report for FY 1986-1987 indicates that State and local agencies generally
    have sound programs in each of the audited areas and the audited agencies show
    improvement over the previous audits.  There are still areas where improvement is
    needed, however, and various remedial actions have been initiated.  A task force is
    also being formed to assess how to improve the function of the NAAS, to make the
    audit results more comprehensive and useful.

17.                              KEY WORDS AND DOCUMENT ANALYSIS

 a. DESCRIPTORS                        b. IDENTIFIERS/OPEN ENDED TERMS    c. COSATI Field/Group

    Air audit                             Air Pollution Control              13B
    Air monitoring
    Air pollution
    Air quality planning
    Compliance assurance
    Inspection and maintenance
    New source review

18. DISTRIBUTION STATEMENT             19. SECURITY CLASS (This Report)     21. NO. OF PAGES
                                          Unclassified                         142
    Release unlimited.  Available      20. SECURITY CLASS (This page)       22. PRICE
    through NTIS.                         Unclassified

EPA Form 2220-1 (Rev. 4-77)   PREVIOUS EDITION IS OBSOLETE

-------