EPA-450/3-75-055
April 1975
SYSTEM FOR TABULATING
SELECTED MEASURES
OF STATE AIR PROGRAMS
STATUS
U.S. ENVIRONMENTAL PROTECTION AGENCY
Office of Air and Waste Management
Office of Air Quality Planning and Standards
Research Triangle Park, North Carolina 27711
-------
EPA450/3-75-055
SYSTEM FOR TABULATING
SELECTED MEASURES
OF STATE AIR PROGRAMS STATUS
by
Marsha N. Allgeier and Barry Levene
System Sciences, Inc.
P.O. Box 2345
Chapel Hill, N. C. 27514
Contract No. 68-02-1420
Project No. IIB5
Program Element No. 2AH137
EPA Project Officer: Norman L. Dunfee
Prepared for
ENVIRONMENTAL PROTECTION AGENCY
Office of Air and Waste Management
Office of Air Quality Planning and Standards
Research Triangle Park, N. C. 27711
April 1975
-------
This report is issued by the Environmental Protection Agency to report
technical data of interest to a limited number of readers. Copies are
available free of charge to Federal employees, current contractors
and grantees, and nonprofit organizations - as supplies permit - from
the Air Pollution Technical Information Center, Environmental Protection
Agency, Research Triangle Park, North Carolina 27711; or, for a
fee, from the National Technical Information Service, 5285 Port Royal
Road, Springfield, Virginia 22161.
This report was furnished to the Environmental Protection Agency by
System Sciences, Inc. . Chapel Hill, N. C. 27514, in fulfillment of Contract
No. 68-02-1420. The contents of this report are reproduced herein as
received from System Sciences, Inc. The opinions, findings, and conclu-
sions expressed are those of the author and not necessarily those of
the Environmental Protection Agency. Mention of company or product
names is not to be considered as an endorsement by the Environmental
Protection Agency.
Publication No. EPA-450/3-75-055
-------
ACKNOWLEDGMENT
The comments and assistance of Dr. Bernard Steigerwald, and
Mssrs. Jean Scheuneman and Norman Dunfee are gratefully acknowledged,
as are the contributions and efforts of various personnel in the
Monitoring and Data Analysis Division and other divisions in the Office
of Air Quality Planning and Standards in Durham, North Carolina.
The EPA regional offices were of invaluable help, particularly
Region IV personnel through Mr. Gregory Glahn, Region V personnel through
Mr. Thomas Mateer, Mr. Henry Brubaker of Region III, Mr. Leo Stander of
Region VIII, and Mr. Wayne Blackard of Region IX.
Finally, the authors would like to thank Mr. Tom Pace and the
Project Monitor Mr. Daniel DeRoeck for their assistance and patience.
iii
-------
SUMMARY
The System for Tabulating Selected Measures of State Air Programs
Status provides a method for consolidating, organizing, summarizing,
and presenting within a coherent framework air programs data from
existing reporting systems available to EPA headquarters. It is pre-
sented as an independent, objective system applicable to state and
territorial air pollution control agencies in determining their progress,
efficiency, and overall performance in achieving the national ambient
air quality standards.
The system was developed within the constraint of using only
existing data available to EPA headquarters. It does not purport to
be a comprehensive evaluation or priority ranking system of state air
pollution control programs. However, the system does provide an
overall view of state control performance and need, and makes explicit
the relative importance of the various program areas and aspects con-
sidered. Existing data permit presentation of a broad picture of
national status and trends, and identification of geographic and
programmatic problem areas.
The system consists of a framework of measures concerning selected aspects
of state air programs for which data are readily available, a methodology
for computing values and scores for these measures, and alternative
formats for summarizing and presenting values and scores. Comparative
analysis is facilitated.
iv
-------
Measures are organized within a four-level structure. At the
lowest level of aggregation, sub-indicators are composed of combinations
of individual data items drawn from existing data systems. One or more
sub-indicators comprise an indicator, one or more indicators comprise
a sub-index, and at the highest level of aggregation an index is composed
of one or more sub-indices.
The five indices that make up the system measure state performance
and need in relation to long-term goals and objectives (ambient air
quality standards and emissions reductions), as well as more immediate
operational objectives necessary to the accomplishment of goals, specifically:
1. source compliance and enforcement actions,
2. monitoring and reporting air quality and emissions, and
3. completing plans and plan revisions.
Values are computed for each measure. The values can be presented
for each index on the first of the suggested output formats. An example
of this format is presented below, with states listed alphabetically.
Computed values can be converted to scores, which in turn are weighted
according to the relative importance of the measures, and combined with
the weighted scores of other components to yield a score for the measure
at the next level of aggregation. The second output format can be used
to present scores at any or all levels of aggregation. Finally, the
third output format can be used to present a frequency distribution
of the number of states within ranges of computed values or scores for
a given measure.
In addition to the three suggested output formats, there are many
possible ways of organizing and presenting the results of the system,
-------
Fig. II-8a. Output Format Pi, State Values for Index 1
STATE
Alabama
Alaska
A. S.
Arizona
Arkansas
Calif.
Colorado
Conn.
Delaware
D.C.
Florida
Georgia
1. GOAL ATTAINMENT
1.1. TSP
1.1.1.
AAQI
.l.l.l.(a)>Annual
rH
rH
H
H
to
1
A
rH
rH
xi
Of
rH
.1.2. Em. Reduction
T3
M
rH
(8
CO 01
0? 5
rH Cfi
£2 O
rH
H
1.2. S02
1.2.1.
AAQI
1. (a)>Annual
rH
CM
rH
fl}
X1
rH
rH
N.
M
1
I
i-H
Annual
1
X1
.5.2. Em. Reduction
01
N
I
ag
3»
en o
CO
m
rH
-------
depending on the use to which system results will be put and the specific
area of interest. Automation of the system would enable presentation
of results in a wide variety of ways, in regard to both states and
measures of interest.
In developing the system, the goal was flexibility in output
format as well as in assignment of weights and selection of specific
measures and levels of aggregation of interest to the user.
A trial run of the system was conducted for fifty-five state
and territorial control programs to demonstrate the manual application
of the system. It was concluded that a periodic manual application of
the system is feasible, but very time-consuming and subject to errors
in calculation. The feasibility of automating the system depends on
the extent of system usage and the degree of stability of data items
and measures. Partial automation of the system specifically,
automated computation of the values for selected measures, each
drawn from a single data source was considered to be the best alter-
native at this time, subject to a detailed cost feasibility study.
Understanding two additional points is essential to the proper
use of the results generated by the system:
1) The system is only as good as the data from which values are
calculated. Although there are problems with data validity,
completeness, and timeliness, data are expected to improve in
quality and quantity in the future. In the meantime, system
results should be used with the limitations of the data and
information systems in mind.
vii
-------
2) Inherent in any objective system relying solely on quan-
titative data is the lack of qualitative judgment necessary to
interpret and put into proper perspective the quantitative
results. Data inaccuracies and unique problems faced by each
state constrain the usefulness of these quantitative results,
which make up only one of many inputs to EPA's decisionmaking
processes.
With these limitations in mind, however, system results can be
useful in subjecting comparative analysis and resource priority
allocation judgments to the discipline of available data.
viii
-------
INTRODUCTION
Volume I presents a chronological description of the development
of the system and an overview of the individual project tasks. Included
in this overview are a description of what was done at each step, the
rationale for excluding and including individual parameters and
measures, problems encountered and their solutions. Basically, it
traces system development from its inception, through changes which
occurred during its implementation up to its current form.
A detailed description of the final system that resulted is
presented in volume II, including alternative output formats, possible
uses of the system, and limitations and difficulties in using the
system. Volume II has been written so that it can be used separately
as a self-contained description of the system and a reference manual
on its application and use..
ix
-------
A SYSTEM FOR TABULATING
SELECTED MEASURES OF
STATE AIR PROGRAMS STATUS
Vol. I: Development of the System
-------
TABLE OF CONTENTS
LIST OF FIGURES
Introduction 1-1
A. Task 1: Parameter Identification 1-2
B. Task 2: Parameter Analysis, Review, and
Selection 1-3
1. Culling of Parameters 1-3
2. Feasibility of Automation 1-7
C. Task 3: System Development 1-7
1. Indicator Construction 1-7
2. Conceptual Framework 1-8
3. Weighting System 1-17
4. Output Formats 1-19
D. Task 4: System Implementation 1-20
Appendix I-A: Personal Contacts
Appendix I-B: Comprehensive List of Information Sources
For Use as Input to SIP Objective
Evaluation System
Appendix I-C: State Objective Evaluation System
Ques t ionnaire
Number
1-1. Categorization Scheme #1 1-9
1-2. Categorization Scheme #2 1-10
1-3. Categorization Scheme #3 1-12
1-4. Categorization Scheme #4 1-13
1-5. Categorization Scheme #5 1-16
1-6. Categorization Scheme #6 1-18
1-7. Final Categorization Scheme (#7) 1-22
xi
-------
Introduction
The purpose of the project as stated in the work plan was to develop
an independent, objective evaluation system to be applied to control
agencies in determining their progress, efficiency, and overall performance
in achieving the national ambient air quality standards. Parameters would
be identified, evaluation measures would be structured from these para-
meters, and an evaluation system developed.
An important constraint on the system was that evaluation parameters
had to be drawn from existing data sources and reporting systems accessible
to EPA headquarters. For that reason it was decided that an inductive,
rather than deductive approach would be most productive. In other words,
instead of first developing an elaborate abstract evaluation framework and
then investigating sources of data for the evaluation (deductive approach),
parameters would be identified from existing data sources, culled, and
grouped into categories; on the basis of these parameters, an evaluation
system would be designed (inductive approach).
While such an approach probably does not lead, as would the deductive
approach, to a comprehensive system for evaluating all possible aspects of
air pollution control programs and activities, it does avoid the situation
where significant portions of a more ideal evaluation system might not be
practical because of extensive data gaps or difficulties in collecting data
for all states and territories. The purpose of the project was to provide
to EPA's Control Programs Development Division (CPDD) an evaluation system
that was (1) applicable to all states and territories, (2) of immediate use
*
Throughout the report, the term "parameter" is used to refer to an indi-
vidual data item which describes some aspect of status of activities, while
the term "measure" refers to some combination of these parameters by which
states are evaluated.
1-1
-------
to EPA, and (3) capable of being implemented on a regular basis (annually,
or for whatever period was desired) with a minimum of effort.
Data trends resulting from periodic implementation of the system would
also provide increasing administrative and technical insights. The
inductive approach, by designing an evaluation system based on data derived
from existing nationwide sources, fills these needs.
Another consideration that influenced the entire process of developing
the evaluation system was the need and desire to involve as many persons as
possible who will be users and/or who will be affected in the conceptuali-
zation, development, and review of the system. Such extensive participation
was considered necessary not only to ensure consideration of all facts
relevant to the substance of the system, but also to facilitate implementa-
tion of the system.
A. Task 1: Parameter Identification
All potential parameter inputs to the system, their sources, and
associated time delay in obtaining the data were identified during Task 1.
EPA personnel with knowledge of data banks, reporting systems, published
reports, and other data bases currently in existence or projected for the
near future were contacted and interviewed. An outline of existing and
projected data sources identified in the course of this task is presented
below. Published documents are underlined.
EPA Data Sources;
Aerometric and Emissions Reporting System (AEROS) maintained by
the Monitoring and Data Analysis Division, Durham, N.C.
Storage and Retrieval of Aerometrie Data (SAROAD)
National Emissions Data System (NEDS)
National Emissions Report
Monitoring and Air Quality Trends Report
1-2
-------
Compliance Data System (CDS)
Management-By-Objectives (MBO) Outputs Reporting System
Air Pollution Training Institute (APTI)
Air Programs Manpower Model
Plan Revision Management System (PRMS)
State Air Pollution Implementation Plan Progress Reports
State Implementation Plans
Non-EPA Data Sources;
U.S. Bureau of the Census, Department of Commerce
U.S. Decennial Census and other special purpose censuses
Statistical Abstract of the U.S.
U.S. County-City Data Book
Climatological Data, National Weather Service
Dun and Bradstreet, Dun Market Identifiers (DMI) File
OBERS Projections (1972), Bureau of Economic Analysis (Dept. of
Commerce) and Economic Research Service (Dept. of Agriculture)
Appendix I-A presents a list of EPA personnel interviewed in the
course of investigating data sources and information derivable from these
sources. (The list also includes persons who were asked as a part of
tasks 2 and 3, for their input into the development and review of the
parameters, the evaluation system and its component indicators,)
The end product of task 1 was a comprehensive list of information
bits, data sources, and estimated time delay in obtaining each information
bit. This list is included as Appendix I-B.
B. Task 2: Parameter Analysis, Review, and Selection
1. Culling of Parameters
The parameters identified in task 1 were evaluated in the light
of the following criteria:
1-3
-------
(1) Validity: are the reported data accurate? do they reflect the
true state of affairs? are they reliable, e.g., do they reveal
logical trends and variations or are there unreasonable
fluctuations that indicate inconsistency in data collection
procedures or definitions in terms?
(2) Accessibility: are the data reported to a central data collection
unit? are the data automated? how difficult, in terms of cost,
programming effort, time, requesting procedures, is it to obtain
the data? in what formats can the data be retrieved?
(3) Completeness: is sufficient information reported to adequately
draw conclusions about a particular aspect of status or activities?
is there much data that is collected but not reported to the
collection center?
(4) Timeliness: what are the deadlines for reporting data, and are
they usually met? what is the extent of time delays expected?
(5) Stability: is the data now collected likely to be collected in
approximately the same format and with some regularity in the
foreseeable future? What data items or data systems are likely
to be added or deleted?
EPA personnel familiar with the parameters, data systems, and data
collection processes were questioned in regard to the parameters
(see Appendix I-A for list of personal contacts). General findings in
regard to each of the criteria as applied to the major EPA data systems
are presented below:
(1) Validity: There was a wide range of opinion with regard to the
validity of the various data systems. Many reservations were
expressed about the validity of the SAROAD data because of problems
1-4
-------
with air quality measurement methods and procedures, quality
assurance, and site representativeness. However, all agreed
that ambient air quality (AAQ) data were essential and that
validity of this data would improve as EPA guidelines were
issued, quality assurance programs were established, and state
and local control agencies became more experienced.
Some regional personnel expressed serious doubts about the MBO
data because of problems with definitions of terms and the guess-
work involved in state commitments and reporting of outputs. The
validity of the MBO data, it was recognized, can be improved in
the future with an accurate and efficient CDS in the regional
offices and comparable systems (such as EMS) on the state and
local level.
Opinions on the validity of the PRMS analysis varied widely
making any firm conclusions about the use of PRMS parameters
impossible at this point.
Generally, because of the wide variation in opinions on the
validity of the various data systems and because of the probability
of improvement of the data over time, no parameters were eliminated
on the basis of the validity criterion.
(2) Accessibility: Some parameters, specifically those derived from
CDS, the manpower model, and air quality monitoring quality
assurance systems, were put aside temporarily until projected
data or reporting systems were made operational or were equally
operational in all regions. Control agency expenditures broken
down by functional areas or activities were found to be unavailable
without additional data collection efforts (total expenditures per
1-5
-------
state are available). MBO outputs #2 through #8 are not required
to be reported by state; however, these parameters were left in,
pending determination of whether state breakdowns could be ob-
tained from regional offices with a minimum of extra effort.
(3) Completeness: Because there is a significant amount of air
pollution-related training that is not conducted through the Air
Pollution Training Institute, it was felt that;training data from
V
the APTI could not be used to evaluate states on the amount of
personnel training taking place.
Major reservations were expressed about the completeness of
emissions data reported to NEDS. There was general agreement
that the completeness of state's emissions inventories varied a
great deal, as did the extent of updating NEDS files (new sources
or changes in existing sources). Regional personnel generally
agreed that a state's own files contained much more information
than was submitted to NEDS. However, it was felt that emissions
data could not be completely eliminated, that NEDS data would
improve over time, and that existing NEDS data to some extent
reflected actual changes in emissions. One major problem was
the inability to determine to what extent changes in total emissions
reported to NEDS were due to changes in emissions of existing
sources or to changes in the number of sources on NEDS. Current
efforts by NADB should resolve this problem in the near future.
There were also questions as to the relative completeness of
AAQ data reported to SAROAD, given the wide range of completeness
of monitoring networks in the AQCRs and states. This issue is
addressed further in Volume II of this report.
1-6
-------
(4) Timeliness: There were problems of varying seriousness with
timeliness in relation to all data systems; however, no parameter
was eliminated on the basis of this criterion alone. This
problem is discussed further in Volume II.
(5) Stability: Generally no significant deletions of data were
planned, and personnel contacted could not anticipate what
changes might occur in form. Additional data were anticipated
with the completion of CDS and the manpower model both of which
contain parameters which were included in the list of parameters.
2. Feasibility Study of Automation of the System
A preliminary feasibility study was conducted to determine the
advisability of automating all or a portion of the system. Three
approaches to system implementation were analyzed: (1) complete
automation, (2) partial automation, and (3) manual data preparation
and reduction. All the major factors which might be considerations
in the practicability of automation were examined. These factors
included (1) frequency of update, (2) system utility-user access
requirements, (3) need for flexibility, (4) probability of system
modification, (5) linkages among existing automated systems, (6)
accuracy of calculations, (7) lag time in data flow, (8) needed
manpower, and (9) hardware and software needed.
The results of this study and the recommendations emerging from
it are included in Volume II, Section E.
Task 3; System Development
1. Indicator Construction
Culled parameters were combined to build measures of some aspect
of control agency status and performance. These measures were normalized,
1-7
-------
i.e., related to a norm or standard, so that one state could be
compared meaningfully to another. The basic premise was that a state
should be evaluated in terms of national goals or objectives, or in
terms of its own objectives (not inconsistent with national goals).
For example, an air quality improvement measure examines not merely
3
absolute improvements (in Ug/m , etc.), but also improvements relative
to achievement of the national ambient air quality standards.
2. Conceptual Framework
The next step was to organize the measures into a logical cate-
gorization scheme that was to serve as the conceptual framework for
the system. Several categorization outlines were developed during
the system design stage.
Initial categories followed the lines of control agency functions
and activities (Figure 1-1). However, the problem of significant
overlapping of measures into more than one category combined with the
rigid framework revealed this method to be unworkable as an overall
view of control agency status. Therefore a new framework was developed
based on evaluation schemes used in social research (see Figure 1-2).
This categorization distinguishes between the goals of air quality
improvement and emissions reductions, and operational objectives of
meeting commitments, ensuring source compliance, reporting AQ and
emissions data, and completing plans and plan revisions. A measure
of need, reflecting the magnitude of the air pollution problem apart
See, for example, Edward Suchman's Evaluative Research (1967), and
Carol Weiss!s Evaluation Research (1971), both published by the Russell
Sage Foundation.
1-8
-------
Suggested Categories and Subcategories
Outputs
1. Administrative
2. Enforcement
3. Engineering
4. Technical Services
Monitoring
Enforcement
Administrative
Resources
Overall Performance
1. Emissions Reductions
2. Air Quality Improvements
Figure 1-1, Categorization Scheme #1
1-9
-------
Categorization of Indices
1. Need: How great is the present air pollution problem?
a. Actual ambient air quality problem
b. Emissions and emission sources
c. Environmental conditions that exacerbate air pollution
2. Effort: What resources are being expended, what actions
are being taken in relation to the need?
a. Resources expended
b. Surveillance/enforcement actions
c. Manpower training
3. Performance; How well is the agency operating in relation
to operational objectives?
a. Performance in meeting MBO commitments
b. Compliance performance
c. Performance in reporting AQ/emissions data
d. Performance in completing plans and plan revisions
4. Adequacy; What is the agency progress in accomplishing
air quality goals (adjusted for population/economic growth,
data changes, new sources, meteorological conditions, etc.)?
a. Emissions reductions
b. AQ improvement
5. Efficiency; Cost-effectiveness
a. Cost per action taken
b. Cost per unit of emissions reduction/AQ improvement
6. Process; Tie between effort and result, assumptions made,
limitations of evaluation system, lead into the need for
subjective evaluation.
Figure 1-2. Categorization Scheme #2
1-10
-------
from the extent of state efforts and progress, was also added. The
six major categories of measures, termed indices, were to be the
overall measures of control agency status.
Additional categorization schemes attempted to refine the basic
framework. Figure 1-3 illustrates how the indices fit into the
evaluation process. Emphasis here is on the point in time or period
of time for which an index and its component indicators are relevant.
Two types of indices were suggested: (1) indices of need, and (2)
indices of performance. Further discussions with EPA personnel pointed
out the need to distinguish between progress during the most recent
period of evaluation (such as the past year), improvement in progress
from the previous period to the present period, and long-term cumulative
achievement, resulting in eight indices and twenty-four sub-indices
(see Figure 1-4),
At this point a questionnaire (see Appendix I-C) was sent out
to all EPA regional offices and certain headquarters offices asking
for comment on the general categories of measures then under consi-
deration. Respondents were asked to weight the relative importance
of the measures and to add any additional indicators they felt were
relevant. Twenty-three responses were received: 18 from 6 regional
offices, 4 from headquarters offices, and 1 from a state control
agency. The questionnaire was intended to indicate general consensus
as to relevant categories of measures of state performance and need;
there was never any intention to use the results in any statistically
rigorous manner. In fact, for the majority of categories, there was
a great deal of variation in the weights provided by the respondents
for any one category and little variation between categories in any
1-11
-------
Outline of Steps of State Evaluation System
A. Determine State's Initial Need (at the beginning of the period of analysis)
1, Goal: Attainment of Ambient Air Quality Standards
PROBLEM: Magnitude of air quality problem that must be solved
Ambient air quality problem
Emissions and emission sources
Environmental conditions that exacerbate air pollution
2. Objectives: Source Compliance
Minimum required reporting of AQ and emissions
Completion of plans and revisions
STATUS: Discrepancy between objectives and actual conditions
Source compliance status
Status in reporting AQ/emissions data
Status in completing plans and revisions
B. Rate State Performance (during the period of analysis)
1. EFFORT: Resources expended during the period in relation to
problem at the beginning of the period
1.1. Resources Expended
1.2. Manpower Training (APTI)
2. ACHIEVEMENT: Operational Accomplishments (incremental, cumulative,
and rate of accomplishment) in relation to objectives
or deficiency at the beginning
2.1. Achievement in Meeting MBO Commitments
2.2. Compliance Achievement
2.3. Achievement in Reporting AQ/Emissions Data
2.4. Achievement in Completing Plans and Revisions
3. GOAL ATTAINMENT: Changes in AQ and emissions during the period in
relation to problem at the beginning
3.1. Ambient Air Quality Improvement
3.2. Emissions Reductions
3.3. PRMS
4. EFFICIENCY: Cost-effectiveness of resources expended during the
period
4.1, AQ improvement/emissions reduction per dollar of resources
expended
C. Rate State's Present Need (at end of period of analysis)
5. PROBLEM: Magnitude of air quality problem that must be solved
5.1. Ambient Air Quality Problem
5.2. Emissions and Emission Sources
5.3. Environmental Conditions that Exacerbate Air Pollution
6. STATUS: Discrepancy between objectives and actual conditions
6.1. Source Compliance Status
6.2. Status in Reporting AQ/Emissions Data
6,3. Status in Completing Plans and Revisions
Figure 1-3. Categorization Scheme #3
1-12
-------
I. State Performance Indices
A. Goal: Air Quality Improvement, Emissions Reduction
1. EFFORT: Resources expended during the period in relation
to the air quality problem at the beginning of the period
1.1. Total Expenditures
2. GOAL ATTAINMENT: Changes in air quality and emissions
during the period in relation to the air quality problem
at the beginning of the period
2.1. Ambient Air Quality Improvement
2.2. Emissions Reduction
2.3. PRMS
3. EFFICIENCY: Cost-Effectiveness
3,1. Air Quality Improvement Per Dollar of Resources Expended
3.2. Emissions Reduction Per Dollar of Resources Expended
B. Operational Objectives: Meeting Commitments, Source Compliance,
Enforcement Actions, Monitoring and Reporting Air Quality and
Emissions, Completing Plans and Revisions
4. PROGRESS: Operational accomplishments during the period in
relation to operational objectives or deficiencies at the
beginning of the period
4,1. Meeting MBO Commitments
4.2. Source Compliance
4.3. Enforcement Actions
4.4. Monitoring and Reporting Air Quality and Emissions
4.5. Completing Plans and Revisions
5. IMPROVEMENT: Progress during the present period in relation
to progress during the previous period
5.1. Meeting MBO Commitments
5.2. Source Compliance
5.3. Enforcement Actions
5,4. Monitoring and Reporting Air Quality and Emissions
5.5. Completing Plans and Revisions
6. ACHIEVEMENT: Cumulative operational accomplishments at the
end of the period in relation to long-term operational
objectives
6.1. Source Compliance
6.2. Monitoring and Reporting Air Quality and Emissions
6.3. Completing Plans and Revisions
Figure 1-4. Categorization Scheme #4
1-13
-------
II. State Need Indices
A. Goal: Air Quality Improvement, Emissions Reduction
7. PROBLEM: Status at the end of the period in relation
to air quality and emissions goals
7.1. Ambient Air Quality
7.2. Emissions and Emission Sources
B. Operational Objectives: Source Compliance and Enforcement,
Monitoring and Reporting Air Quality and Emissions, Completing
Plans and Revisions
8. DEFICIENCY: Status at the end of the period in relation
to long-term operational objectives
8.1. Source Compliance and Enforcement
8.2. Monitoring and Reporting Air Quality and Emissions
8.3. Completing Plans and Revisions
Figure 1-4. Categorization Scheme #4
(Continued)
1-14
-------
summary statistic (mean, median, mode) of the weights for these cate-
gories. The responses, however, did point out a few measures for
which there appeared to be a consensus for elimination (such as total
land area) or inclusion (such as urban population).
During this same period visits to two EPA Regional Offices were
made to discuss the system and the individual measures. Meetings were
also held with personnel from two additional Regional Offices and
relevant headquarters offices (see Appendix I-A).
On the basis of the questionnaire responses and continuing dis-
cussions with EPA personnel, the system was trimmed to five indices
and sixteen sub-indices (see Figure 1-5). The index "Improvement"
compared progress during the present period of evaluation with progress
during the previous period. Since for this first demonstration of the
system there would be no "previous period of evaluation," the index
was dropped temporarily; the index can be reinserted for the second
application of the system if desired. The indices "Effort" and
"Efficiency" were eliminated because of the questionable validity
of using total state expenditures in relation to specific activities
or changes in specific aspects.
Many of the indicators eliminated from the original list (such
as program expenditures), along with some additional data, were put
into a separate section of State Background Information. This section
is intended to provide some perspective on state status without serving
as a basis for evaluating state performance or need in regard to air
pollution control.
Finally, because of the expressed desire of EPA personnel to look
separately at state performance and need relative to each of the
1-15
-------
A. State Performance Indices
1. GOAL ATTAINMENT: Changes in air quality and emissions during
the period in relation to the air quality and emissions pro-
blems at the beginning of the period
1.1. Ambient Air Quality Improvement
1.2, Emissions Reduction
1.3. PRMS
2, PROGRESS: Operational accomplishments during the period in
relation to operational objectives or deficiencies at the
beginning of the period
2.1. Meeting MBO Commitments
2.2. Source Compliance
2.3. Enforcement Actions
2.4. Monitoring and Reporting Air Quality and Emissions
2.5. Completing Plans and Revision
3, ACHIEVEMENT: Cumulative operational accomplishments at the
end of the period in relation to long-term operational objectives
3.1. Source Compliance
3.2. Monitoring and Reporting Air Quality and Emissions
3.3. Completing Plans and Revisions
B. State Need Indices
4. PROBLEM: Status at the end of the period in relation to air
quality and emissions goals
4.1. Ambient Air Quality
4.2. Emissions and Emission Sources
5. DEFICIENCY: Status at the end of the period in relation to
long-term operational objectives
5.1. Source Compliance and Enforcement
5.2. Monitoring and Reporting Air Quality and Emissions
5.3, Completing Plans and Revisions
Figure 1-5. Categorization Scheme #5
1-16
-------
criteria pollutants, the sub-indices were reorganized to feature the
pollutants at the highest possible level of aggregation. Thus, the
index "Goal Attainment," which originally was composed of the sub-indices,
ambient air quality improvement, emissions reductions, and PRMS flags,
each of which was composed of indicators for all the pollutants, was
reorganized so that the sub-indices became goal attainment for the
pollutants. Each sub-index was composed of indicators of ambient air
quality improvement, emissions reduction, and PRMS flags for that
pollutant. This categorization scheme is shown in Figure 1-6.
3. Weighting System
After the conceptual framework was developed and component sub-
indices, indicators, and sub-indicators were constructed, weights
were assigned at each level of aggregation according to the relative
importance of each component within the whole.
An initial set of weights was developed partly on the basis of
responses to the questionnaire (Appendix I-C) and partly based on
subjective judgment derived from discussions with various EPA
personnel. The system of measures and weights was sent to EPA
Regional Offices and to the State and Territorial Air Pollution Pro-
gram Administrators (STAPPA) for further review and comment.
Respondents were asked to substitute their own weights if the
weights provided proved unsatisfactory.
Comments were received from four states, all of which dealt
with the overall system or the validity of specific measures. No
alternative weights were suggested. Therefore the initial set of
weights was retained for use in the test run of the system.
However, it is well understood that assignment of weights depends
a great deal on the use to which the results of the system will be put.
1-17
-------
A. State Performance Indices
1. GOAL ATTAINMENT: Changes in air quality and emissions during
the period in relation to the air quality and emissions problems
at the beginning of the period
1.1. TSP Goal Attainment
1.2. S02 Goal Attainment
1,3. CO Goal Attainment
1.4. Ox Goal Attainment
1.5, N02 Goal Attainment
2. PROGRESS: Operational accomplishments during the period in
relation to operational objectives or requirements at the
beginning of the period
2.1. Meeting MBO Commitments
2,2. Source Compliance
2.3. Surveillance and Enforcement Actions
2.4. Monitoring and Reporting Air Quality (Pollutant-Specific)
2.5. Monitoring and Reporting Air Quality and Emissions (General)
2.6. Completing Plans and Revision
3. ACHIEVEMENT: Cumulative operational accomplishments at the end
of the period in relation to long-term operational objectives
3.1. Source Compliance
3.2. Monitoring and Reporting Air Quality (Pollutant-Specific)
3.3. Monitoring and Reporting Air Quality and Emissions (General)
3,4. Completing Plans and Revisions
B. State Need Indices
4. PROBLEM: Need at the end of the period in relation to air quality
and emissions goals
4.1. Ambient Air Quality (Pollutant-Specific)
4.2. Emissions Sources (General)
4.3. Emission Reduction Needed (Pollutant-Specific)
5, OPERATIONAL REQUIREMENTS: Need at the end of the period in relation
to long-term operational objectives
5.1, Source Compliance and Enforcement
5.2. Monitoring and Reporting Air Quality (Pollutant Specific)
5.3. Monitoring and Reporting Air Quality and Emissions (General)
5.4. Completing Plans and Revisions
Figure 1-6. Categorization Scheme #6
1-18
-------
In addition, a good case can be made that any evaluation of state
performance must be related to the extent and nature of the problem
with which a state is faced. For example, an adequate oxidant
monitoring network is much more important than a complete S02 net-
work in a state in which ambient 0 levels exceeds standards and S09
X £
levels are below standards. Such an argument suggests the need for
separate weighting schemes for each state or region, and perhaps
weights that vary from one application of the system to the next.
For these reasons, it was decided that while the initial set of
weights would be used for the initial demonstration of the system,
emphasis would be placed on making explicit what weights were used
and making simple the recalculation of scores using alternative weights.
A user of the system need not utilize the weights used in the trial
run, but may substitute his own. The whole issue of system flexibility
is discussed further in Volume II, Section E of this report.
4. Output Formats
The output format depends to a large degree on the use to which
system results will be put. Two alternative formats were developed
and are described in Volume II, Section D.
Variations within each basic format were developed when interest
was expressed in using certain levels of aggregation. Thus format //I
can be used to show all scores for all measures under a given index,
a summary of scores for all indices and sub-indices, or computed values
(not scores) for measures at the lowest level of aggregation. Simi-
larly format #2 can be used to depict values or scores for any measure
at any level of aggregation.
1-19
-------
D. Task 4: System Implementation
An initial application of the system was made as a demonstration in
order to test the availability of data and make some judgments about the
validity of the measures, Data were collected from the various sources.
Calendar year 1973 was chosen as the present period of evaluation of those
measures using SAROAD and NEDS data. This was the latest complete year
for which SAROAD data were available in published form; use of the published
reports facilitated comparison with a previous period, calendar year 1972.
NEDS emission data generally representative of 1973 were available from
NEDS printouts.
Because the MBO system and its outputs were established beginning in
FY 75, a different period of evaluation for indicators utilizing MBO data
was necessary. MBO commitments and achievements for all states as of the
second quarter of FY 75 (ending December 31, 1974) were utilized. This
was considered acceptable because MBO data are not combined with other
data in any measure, so that there is no inconsistency of time period
within any one measure.
The trial run produced some changes in the framework of measures.
Some data, such as the MBO outputs #2 through #8, proved unavailable, so
that many measures had to be dropped. As a consequence, two sub-indices
under each of three indices, 2.4. and 2.5,, 3.2. and 3.3., and 5.2. and
5.3, were combined. Sub-index 2.6., Progress in Completing Plans and
Revisions, was dropped because it was felt that the completion of plans
and revisions was a more long-term process than could be measured for a
single period of evaluation. Sub-index 3.4., Achievement in Completing
Plans and Revisions, was retained. Finally, sub-index 4.2. Emissions
Sources (General) was expanded to include current emissions of each of the
1-20
-------
pollutants, and was renamed Emissions and Emission Sources. The final out-
line of the five indices and eighteen sub-indices is presented in Figure 1-7,
Description of the trial run procedures is given in Volume II, Section
C; the results of the trial run, presented in various formats, are shown in
Volume II, Section D.
1-21
-------
A, State Performance Indices
1. GOAL ATTAINMENT: Changes in air quality and emissions during
the period in relation to the air quality and emissions pro-
blems at the beginning of the period
1.1. TSP Goal Attainment
1.2. S02 Goal Attainment
1.3. CO Goal Attainment
1.4. Ox Goal Attainment
1.5. NO- Goal Attainment
£*
2, PROGRESS: Operational accomplishments during the period in
relation to operational objectives or requirements at the
beginning of the period
2.1. Meeting MBO Commitments
2.2. Source Compliance
2.3. Surveillance and Enforcement Actions
2.4. Monitoring and Reporting Air Quality and Emissions
3, ACHIEVEMENT: Cumulative operational accomplishments at the
end of the period in relation to long-term operational objectives
3.1. Source Compliance
3.2. Monitoring and Reporting Air Quality and Emissions
3.3. Completing Plans and Revisions
B, State Need Indices
4. PROBLEM: Need at the end of the period in relation to air quality
and emissions goals
4.1. Ambient Air Quality
4.2. Emissions and Emission Sources
4.3, Emission Reduction Needed
5, OPERATIONAL REQUIREMENTS: Need at the end of the period in
relation to long-term operational objectives
5.1. Source Compliance and Enforcement
5.2. Monitoring and Reporting Air Quality and Emissions
5.3. Completing Plans and Revisions
Figure 1-7. Final Categorization Scheme (#7)
1-22
-------
APPENDICES TO VOLUME I
Appendix I-A: Personal Contacts
Appendix I-B: Comprehensive List of Information Sources
For Use as Input to SIP Objective
Evaluation System
Appendix I-C: State Ojective Evaluation System Questionnaire
-------
APPENDIX I-A
Personal Contacts
-------
APPENDIX I-A
Personal Contacts
Meetings were held with the following persons in order to obtain information
about data sources with which they were familiar, and/or input into and
review of the evaluation system and component indicators.
EPA-Washington
Grants Administration:
Water Programs Operations:
Office of Planning and Evaluation:
Resource Management, Program
Reporting Division:
Stationary Source Enforcement
Division:
Land Use Planning:
Intergovernmental Affairs:
Mr. Joe Rausher
Mr. Ed Richards
Mr. James R. Janis
Mr. Frank Blair
Mr. Barry Korb
Mr. Dario Monti
Mr. Robert Duprey
Mr. Jack Siegel
Mr..Michael Merrick
Dr, David Morrell
Mr. Marvin B. Fast
EPA-RTP/Durham (Air Programs)
National Air Data Branch:
Monitoring and Data Analysis
Division:
Control Programs Development
Division:
Dr. James R. Hammerle
Mr. Gerald J. Nehls
Mr. James H. Southerland
Mr. Thomas B. McMullen
Mr. Alan J. Hoffman
Mr. Jon R. Clark
Mr. William M. Cox
Mr. William Hunter
Mr. Norman G. Edmisten
Mr, David R. Dunbar
Mr. Walter H. Stephenson
Mr. Joseph J. Sableski
Mr. John I. Eagles
-------
APPENDIX I-A (Continued)
Data Services Division:
Meteorology Division:
Stationary Source Enforcement
Division:
Air Pollution Training Institute;
EPA Regional Offices
Region III:
Region IV:
Planning and Operations:
Air Enforcement:
Air Programs:
Region V:
AHMD Division:
S & A Division:
Central Regional Laboratory:
Enforcement Division:
Region VIII:
Region IX:
Ms. Maureen M. Johnson
Mr. Gerald A. DeMarrais
Mr. Kirk E. Foster
Mr. Charles D. Pratt
Mr, Henry Brubaker
Mr. Dwight Brown
Mr. James Wilburn
Mr. Thomas A. Gibbs
Mr. Gregory Glahn
Mr. Thomas Strickland
Mr. Bryan Beal
Mr. Mike DeBusschere
Mr. Winston Smith
Mr. Thomas Mateer
Mr. Roger Gorski
Mr. J. Clesceri
Mr. Ron Van Mersbergen
Mr. Charles Miller
Mr. Eugene Moran
Mr. John Logsoon
Ms. Carol Foglesong
Mr. Leo Stander
Mr. Wayne Blackard
-------
APPENDIX I-B
Comprehensive List of Information Sources
For Use as Input to
SIP Objective Evaluation System
-------
APPENDIX I-B
Comprehensive List of Information Sources
For Use as Input to
SIP Objective Evaluation System
Information Bit
Estimated
Time Delay
Comments
tout Units
. Number of identified point sources determined
to be in final compliance with emission require
ments (by State)
Number of identified point sources of unknown
compliance status with final emission require-
ments (by State)
, Number of identified point sources out of
compliance with final emission requirements
which are not on schedule (by State)
Number of identified point sources determined
to be in compliance with scheduled increments
(by State)
Number of point sources determined to be over-
due in meeting increments of progress in
schedules (by State)
Number of point sources of unknown status
regarding compliance with scheduled increments
(by State)
Number of field surveillance actions taken to-
determine source compliance status by:
(1) each State
(2) EPA in each State
Number of enforcement actions undertaken by:
(1) each State
(2) EPA in each State
Number of revisions to regulatory portion of
SIP (by State)
Number of TCP and TCP revisions (by State)
States with indirect source control plans
States with all AQMA Plans completed
States which have been delegated enforcement
of NESHAPS
States which have been.delegated enforcement
of NSPS
Number of field tests to be conducted (fuel
additives)
Number of increments of progress that must be
met to ensure compliance with all TCP's
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
6-9 months
6-9 months
6-9 months
L2-15 month
3-6 months
3-6 months
3-6 months
3-6 months
From:
Management by Objectives
FY 75 Operating Guidance
Type of action is important
here
-------
Information Bit
Estimated
Time Delay
Comments
Output Units (Continued)
5b. Number of parking facility construction permit
applications to be reviewed (by State)
6a. States with complete required network for
criteria pollutants
6b. States and local Quality Assurance Programs
established
la. Percent of sources subject to NESHAPS (including
spraying and demolition operations) which are in
compliance with schedules of emission standards
8a. Percent of sources subject to NSPS determined to
be in compliance in each Region (by State)
3-6 months
3-6 months
6-9 months
3-6 months
3-6 months
-------
Comprehensive List of Information Sources
For Use as Input to
SIP Objective Evaluation System
Information Bit
Estimated
Time Delay
Comments
Activity Indicators
Number of formal inquiries sent to all sources
to determine compliance status by:
(1) each State
(2) EPA in each State (Sec. 114 letters)
Number of source tests conducted or observed
to determine compliance status by:
(1) EPA
(2) all States in the Region
Number of notices of violation issued by:
(1) each State
(2) EPA in each State
Number of abatement orders issued by:
(1) each State
(2) EPA in each State
le. Number of civil/criminal proceedings initiated
by:
(1) each State
(2) EPA in each State
la.
Ib.
le.
Id.
4a.
4b.
4c.
Number of laboratory tests performed
Number of stop-sale orders
Number of fines assessed
Manpower Model^
1. Manpower required for monitoring activities
2. Manpower required for source control activities
3. Manpower required for overhead
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
3-6 months
V
none
From:
Management by Objectives
FY 75 Operating Guidance
Unleaded gas program
Based on parameters such as
emission sources, monitoring
equipment, population, etc.
-------
Information Bit
Estimated
Time Delay
Comments
NADB
1. SAROAD status score
2. Number of sites with valid year of data
3. Quality assurance grade
4. Number and types of data items missing from NEDS
form \
5. Number of sources discovered in verification file
which are not in the original inventory
6. Number of new sources sent into NEDS
7. Percentage of miscalculated emission rates
3-6 months
3-6 months
3-12 months
none
none
none
none
Can be obtained for any
period (e.g., over
previous four quarters)
Not fully completed
Questionable availabilitj
PRMS
1. Potential deficiencies (projected values exceed
actual by more than a pre-specified amount)
2. Percent reduction in emissions necessary to
achieve primary and secondary standards
3. Correction factor needed to bring projection to
standard at final compliance date.
4. Number of observations (%)
(a) > proj ected
(b) > primary standard
(c.) > secondary standard
5.. Total number of. observations
6-12 months
6-12 months
6-12 months
According to proportional
model
6-12 month:
-------
Information Bit
Estimated
Time Delay
Comments
See Output 1 of Guidance Package
Percentage of total sources at each increment
of progress which are on time, late, overdue,
or in the future
Number of sources on CDS
i
Percentage of sources on CDS
Visible emissions observations
3-6 months
In the early stages of
implementation. Not all
sources are included.
Pollution Training Institute
Number of people trained from state/local
agency per year
Number of student-days of training
Number of personnel trained vs. size of agency
Number of personnel trained vs. years of
experience
Many other types of training information will
be available on completion of system automation
none
In the planning stages of
automation
-------
Information Bit
Estimated
Time Delay
Comments
Trends Report
1. Minimum number of stations required in state
(by pollutant)
2. Number of stations reporting in state (by
pollutant)
3. Number of stations required and not reporting
in state (by pollutant)
4. Number of ACQR's in state reporting <1/2 M.R.,
1/2 to M.R., and >_ M.R. ~~
5. Same information as above for each AQCR
6. Number of stations exceeding standards in each
AQCR (by pollutant)
6-12 months
6-12 months
5-12 month
5-12 months
5-12 months
5-12 months
Need definition of
"required"
Need definition of
"reporting"
Based on stations in
state
National Air Monitoring Program - A.Q. and
Emissions Trends Annual Report
1. Number of monitoring stations required,
proposed, and existing in each AQCR
(by pollutant)
2. Trends in A.Q. at NASN Station (by pollutant)
3-12 months
3-12 months
Down, up, or no change
-------
Information Bit
Estimated
Time Delay
Comments
SIP Progress Report
State and local support broken down by
(a) TCP development
(b) SIP revision
(c) SIP secondary standards development
(d) NEDS and air quality data reporting
(e) Industrial source 10-year maintenance
(f) Demonstration grant
(g) Smelter study
Status of SIP's
(a) Public availability of data
(b) Require source record-keeping and
reporting
(c) Review of new sources and modifications
(d) Compliance schedules
(e) TCP's
(f) Emission limitations
so2
TSP
HC
N02
(g) Air quality surveillance
(h) Periodic testing and inspection
(i) Emergency episode plan
(j) Resources
(k) Intergovernmental cooperation
TCP acceptability (20 subprograms)
3-6 months
1-3 months
State or EPA promulgation,
proposed, or deficient
1-3 months
State or EPA promulgation
-------
Other Possible Sources of Information Bits
1. F. W. Dodge Co. Reportsan automated system which provides data
on new construction to compare with NEDS file and information on
new sources for NSPS.
2. Census informationdata on population, population density, housing,
etc., to help in normalization of information bits and permit state
versus state comparisons.
3. Dun and Bradstreetcheck on NEDS file. Includes information on
SIC, size, and operations involved within a facility.
4. R. L. Polk filesto obtain information on number of vehicles
registered within a state or other geographical area.
5. Federal Power Commission Form 67 tapesdata on steam-electric
generating plants above a certain capacity reported annually to
FPC; tapes and summary file available at EPA.
-------
APPENDIX I-C
State Objective Evaluation System Questionnaire
-------
APPENDIX I-C
STATE OBJECTIVE EVALUATION SYSTEM QUESTIONNAIRE
PART I.
Please weight the indicators within each of the following categories according
to (1) their validity as measures of State performance or need, and (2) if valid,
their importance relative to the other indicators in the category.
The following weighting system should be used:
0 = indicator is invalid or useless; should not be given any weight
1 = unimportant indicator; should be given little weight
2 = fair indicator
3 = good indicator
4 = very important indicator; should be weighted heavily
These weights should reflect differences in magnitude; two indicators or cate-
gories may have the same weight if they are of equal validity and importance.
No criteria for weighting the indicators are given. Your overall judgment based
on whatever criteria you consider relevant and important is desired. Space is
provided for any comments you may have regarding the indicators or any additional
indicators you consider relevant.
A. Weight the following items according to their importance as indirect
indicators of the magnitude of a State's air quality problem:
Weight (0-4)
1. Total population
2. Urban population (no. of persons in
urban areas)
3. Total land area (square miles)
4. Urbanized land area (square miles)
5. No. of AQMAs
6. Population in AQMAs (no. of persons)
7. Total fuel consumption
8. Projected population/economic growth
9. Average no. of days of air stagnation
10. Average no. of heating degree-days
11. Other (specify)
12. :
Comments:
-------
B. Weight the following Items according to their importance as direct
indicators of the magnitude of a State's air quality problem:
Weight (0-4)
1. Total emission reduction needed (current
emissions minus 1975 allowable emissions)
2. Stationary point* source emission reduction
needed (current minus 1975 allowable emissions)
3. Deviation of measured TSP annual means from the
TSP annual standards
4. Deviation of measured TSP concentrations from
the TSP 24-hour standard
5. No. of sites with potential deficiencies
flagged by PRMS
6. Other (specify)
7.
Comments:
C. Weight the relative importance of direct and indirect indicators of the
magnitude of a State's air quality problem (as listed in A. and B. above);
Weight (0-4)
1. Direct indicators
2. Indirect indicators
Comments:
D. Weight the following MBO Outputs according to their importance as indicators
of State performance:
Weight (0-4)
1. Output # 1 - Source compliance and
enforcement
2. Output #2 - SIP revisions & completions
3. Output #3 - NESHAPS and NSPS delegation
4. Output #5 - Transportation control plans
5. Output #6 - Completion of air monitoring
networks
6. Outputs # 7 & 8 - NESHAPS and NSPS
compliance
Comments:
-------
E. Weight the following items according to their importance as indicators of
the quality of a State's monitoring & reporting of air quality & emissions:
Weight (0-4)
1. No. of reporting stations in relation to
minimum required no. of stations
2. No. of reporting stations in relation to
no. of stations proposed in SIP
3. Percent of AQCRs with the minimum re-
quired or proposed no. of stations
4. Average percent of minimum required or
proposed no. of stations reporting in all
the AQCRs in any one State
5. Use of reference or equivalent pollutant-
methods
6. Validity and sufficiency of data sub-
mitted, to SAROAD
7. Completeness of list of sources on NEDS
relative to regional goals
8. Completeness of data submitted to NEDS
9. Completeness of list of sources on CDS
relative to regional goals
10. Quality Assurance status (existence of
QA program, QA grade)
11. Other (specify)
12.
Comments:
-------
PART II.
Please weight the indicators within each of the following categories according
to their importance, relative to the other indicators in the' category, in terms
of (1) regional priorities, and (2) the amount of resources needed to achieve
them.
As weights, use any positive number(whole number or fraction)that expresses
the magnitude of difference in importance or resources required. These
weights are not ordinal rankings; two indicators may have the same weight if
they are of equal importance.
Example A. Weight the following pollutants:
A complete (minimum required no. of stations) SO,
monitoring network is:
1/2 times as important as a complete TSP monitoring
network
2 times as important as a complete CO monitoring
network
4 times as important as a complete Ox monitoring
network
10 times as important as a complete NOX monitoring
network
A. Weight the following pollutants:
A complete(minimum required no. of stations) S02 monitoring network is:
times as important as a complete TSP monitoring network
times as important as a complete CO monitoring network
times as important as a complete Ox monitoring network
.- times as important as a complete Nox monitoring network
Comments:
B. Weight the following field surveillance and enforcement actions:
1 source (stack) test is equivalent to:
process (plant) inspections(s)
opacity observation(s)
\_ notice(s) of violation
abatement order(s)
civil/criminal proceeding(s)
other(specify)
Comments;
-------
C. Weight the following types of source compliance status:
1 source whose compliance status is known is equivalent to:
source(s) in final compliance with emission requirements
source(s) on compliance schedule
source(s) in compliance with scheduled increments of progress
other(specify)
Comments:
D. Weight the following components of a SAROAD reporting score:
1 station-quarter of valid & timely data is equivalent to:
station-quarter(s) of valid and late data
station-quarter(s) of invalid and timely data
station-quarter(s) of invalid and late data
Comments:
E. Weight the following SIP plans and revisions:
1 regulatory portion of the SIP completed is equivalent to:
'_ TCP or TCP revision (s) completed
indirect source plan(s) completed
AQMA analysis & plan(s) completed
NESHAPS procedures completed
NSPS procedures completed
other (specify)
Comments:
-------
A SYSTEM FOR TABULATING
SELECTED MEASURES OF
STATE AIR PROGRAMS STATUS
Vol. II: Manual for the System
-------
A.
B.
C.
D.
TABLE OF CONTENTS
Page
Description of Measures
1.
2.
3.
Performance Indices
a. Index 1. Goal Attainment
c. Index 3. Achievement
Need Indices
State Background Information
Methodology
1.
2.
Performance Indices
a. Index 1. Goal Attainment
b. Index 2. Progress ,
c. Index 3. Achievement
Need Indices
a. Index 4. Problem ... 1
Trial Run
1.
2.
Performance Indices
a. Index 1. Goal Attainment
b. Index 2. Progress
c. Index 3. Achievement
Need Indices
a. Index 4. Problem
Formats for System Results
1.
2.
3.
Format #1
Format #2
Format #3
. . . II-3
. . . II-7
. . . II-7
. . . 11-10
. . . 11-20
. . . 11-27
. . . 11-28
. . . 11-34
. . . 11-38
. . . 11-41
. . . 11-44
. . . 11-44
. . . 11-50
. . . 11-59
. . . 11-66
. . . 11-66
11-72
. . . 11-79
. . 11-82
11-82
. . . II 86
11-90
. . 11-94
. . . 11-94
. 11-100
. 11-106
. . 11-106
. . . 11-106
. . . II-H9
(Continued)
-------
TABLE OF CONTENTS
(Continued)
Page
E. Issues and Problems 11-127
1. Uses of the System 11-127
2. Data Base 11-129
3. Weighting System 11-131
4. System Flexibility 11-134
5. Feasibility of Automation 11-135
Appendix II-A:
Appendix II-B:
Appendix II-C:
Appendix II-D:
Appendix II-E:
Data Sources
1972/1973 Pollutant-Method-Stations Summary
Summary of National Ambient Air Quality Standards
State Background Information
Workbook
LIST OF FIGURES
Number
Page
II-l. Index Structure . . .
II-2. Outline of Indices . .
II-3. Flowchart for Index 1.
II-4. Flowchart for Index 2.
II-5. Flowchart for Index 3.
II-6. Flowchart for Index 4.
II-7. Flowchart for Index 5.
II-8a. Output Format #1, State
II-8b. Output Format #1, State
II-8c. Output Format #1, State
II-8d. Output Format #1, State
II-8e. Output Format #1, State
II-9a. Output Format #2, State
II-9b. Output Format #2, State
II-9c. Output Format #2, State
II-9d. Output Format #2, State
II-9e. Output Format #2, State
Goal Attainment
Progress . . .
Achievement . .
Problem ....
Operational Requirements
II-4
II-6
II-8
11-11
11-21
11-29
11-35
Values for Index 1 11-107
Values for Index 2 11-108
Values for Index 3 11-109
Values for Index 4 11-110
Values for Index 5 11-111
Scores for Index 1 11-112
Scores for Index 2 11-113
Scores for Index 3 11-115
Scores for Index 4 11-116
Scores for Index 5 11-117
(Continued)
-------
LIST OF FIGURES
(Continued)
Number
Page
11-10. Output Format #2, State Scores for
All Indices and Sub-Indices 11-118
II-lla. Format #3, Frequency Distribution 11-120
II-llb. Format #3, Frequency Distribution 11-121
II-llc. Format #3, Frequency Distribution 11-122
II-lld. Format #3, Frequency Distribution 11-123
II-lle. Format #3, Frequency Distribution 11-124
II-llf. Format #3, Frequency Distribution 11-125
II-llg. Format #3, Frequency Distribution 11-126
LIST OF TABLES
Number
Page
II-l. Converting Values to Scores:
Index 1. Goal Attainment
II-2. Scoring and Weighting:
Index 1. Goal Attainment
II-3. Converting Values to Scores:
Index 2. Progress
II-4. Scoring and Weighting:
Index 2. Progress
II-5. Converting Values to Scores:
Index 3. Achievement
II-6. Scoring and Weighting:
Index 3. Achievement
II-7. Converting Values to Scores:
Index 4. Problem
H-8. Scoring and Weighting:
Index 4. Problem
II-9. Converting Values to Scores:
Index 5. Operational Requirements
H-10. Scoring and Weighting:
Index 5. Operational Requirements 11-74
11-46
11-48
11-52
11-54
11-60
11-61
11-68
11-69
11-73
(Continued)
-------
LIST OF TABLES
(Continued)
Number
11-11. Converting Values to Scores:
Index 1. Goal Attainment 11-87
11-12. Scoring and Weighting:
Index 1. Goal Attainment 11-88
11-13. Converting Values to Scores:
Index 2. Progress 11-91
11-14. Scoring and Weighting:
Index 2. Progress 11-93
11-15. Converting Values to Scores:
Index 3. Achievement . 11-95
11-16. Scoring and Weighting:
Index 3. Achievement 11-96
11-17. Converting Values to Scores:
Index 4. Problem 11-101
11-18. Scoring and Weighting:
Index 4. Problem 11-102
11-19. Converting Values to Scores:
Index 5. Operational Requirements 11-104
11-20. Scoring and Weighting:
Index 5. Operational Requirements 11-105
-------
Introduction
The system described herein provides a method for organizing and pre-
senting information on the status of selected aspects of the state air
pollution control situation in the U.S., using currently available data
from existing reporting systems. It consists of a framework of measures
of selected aspects of state air programs, and a methodology for computing
and presenting values and comparative scores for these measures.
The purpose of the system is to consolidate and organize into a
coherent framework, data routinely reported to EPA headquarters as well
as data drawn from standard national data sources. The system requires
no reporting by states to EPA beyond what is already required nor has it
been a factor in the current reporting requirements.
The system meets the four major constraints imposed on it at its
conception. These constraints were:
(1) to use data currently available or projected to be available
in the near future, drawn from existing data banks and
reporting systems,
(2) to be applicable to all states and territories,
(3) to be of immediate use to EPA once it was developed and tested,
and
(4) to be capable of being implemented on a regular basis using
updated data.
Possible uses of the system as well as its limitations are discussed
in Section E. The system is not a comprehensive evaluation system because
of the constraint against requiring new data. It is a way of organizing
data into a coherent framework within which state air pollution control
status can be examined.
II-l
-------
It should also be noted that the state (or territory) is the unit of
analysis of interest in this system, requiring generalizations from com-
ponent AQCRs, While there is some doubt that one can generalize about the
ambient air quality of an entire state, the state is considered the most
meaningful unit in relation to control activities and need, because the
state has ultimate responsibility for air pollution control under the
Clean Air Act and federal regulations.
The following sections discuss:
the framework of measures, including descriptions of each;
the methodology for computing values and scores for the indicators,
including the sources of data used;
the trial run of the system conducted to verify the availability
of data and to test the validity of indicators;
alternative formats for presenting results, and
issues and problems in implementing the system, including possible
uses and limitations and the feasibility of automating the system.
Readers interested only in the general outlines of the system can
limit their reading to sections A, D, and E. Sections B and C are written
for those who are interested in actually implementing the system. In
addition, reference is made to the Workbook (Appendix II-E). The Workbook
includes worksheets and tables needed to compute values and scores, and
output formats for presenting values and scores.
II-2
-------
SECTION A. DESCRIPTION OF MEASURES
1.
II-7
a. Index 1.
b . Index 2 .
c. Index 3.
Need Indices
a. Index 4.
b. Index 5.
Operational Reauirements .
.... II-7
.... 11-10
.... 11-20
.... 11-27
.... 11-28
.... 11-34
3. State Background Information 11-38
-------
A. Description of Measures
Measures are organized Into a four-level structure (see Figure II-l).
At the lowest level of aggregation (level 4), sub-indicators are composed
of combinations of individual data items drawn from existing data systems.
In some cases, there are no sub-indicators and the indicator (level 3) would
then be composed of data items.
Appendix II-A presents a list and summary description of all sources
of data used in the system.
In constructing indicators or sub-indicators from data items, data is
used in as many combinations as possible that reveal some aspect of control
agency status or activities for which a measure might be valid and useful.
Measures are normalized, i.e., related to a norm or standard, so that one
state could be compared meaningful to another. The basic premise is that
a state should be evaluated in terms of national goals or objectives or in
terms of its own objectives (not inconsistent with national goals). For
example, an air quality improvement indicator measures not merely absolute
3
improvements (in yg/m , etc.), but also improvements relative to achievement
of the national ambient air quality standards.
One or more sub-indicators comprise an indicator (level 3). At the
next level of aggregation, a sub-index (level 2) is composed of one or
more indicators. Finally at the highest level of aggregation, an index
(level 1) is composed of one or more sub-indices.
An index represents an aspect of state status in relation to overall
goals or operational objectives, within a specified time frame and encom-
passing the range of program activities of an air pollution control agency,
for which an overall score was thought to be meaningful.
II-3
-------
Level 1
Level 2
Level 3
Index
Sub-Index
Sub-Index
Indicator
Indicator Indicator
Indicator
Level 4 J Sub- Sub-
Indicator Indicator
Sub-
Indicator
Sub- Sub-
Indicator Indicator
Figure II-l, Index Structure
II-4
-------
There are five Indices that make up the system. Three indices measure
state performances in relation to goals and objectives. One of these three
indices (Goal Attainment) measures performance in relation to long-term
goals of AQ improvement and emissions reductions. The other two (Progress
and Achievement) measure performance in relation to more immediate operational
objectives that are necessary to accomplishing goals. These objectives are
meeting (MBO) commitments, ensuring source compliance, carrying out sur-
veillance and enforcement actions, monitoring and reporting to EPA air
quality and emissions, and completing SIP plans and plan revisions. Progress
measures performance in a given period of evaluation, such as a calendar
year, while Achievement measures cumulative performance up to a given
point in time (usually the end of the period of evaluation).
Two indices measure state need. Problem measures need in relation to
air quality and emission goals, while Operational Requirements measures
need in relation to operational objectives.
The five indices are broken down into 18 sub-indices. An outline and
brief description of the indices and sub-indices are presented in Figure II-2.
Each of the indices, sub-indices, indicators, and sub-indicators as
outlined in Figure II-2 is described below. A flowchart is presented for
each index showing the four-level structure of the index. The "present
period of evaluation" refers to the period of time for which the data used
in the computation of indicators are relevant (such as calendar year 1973);
"previous period of evaluation" means the period of time of equal length
that immediately preceded the present period (if calendar year 1973 is the
present period, calendar year 1972 is the previous period).
II-5
-------
A, State Performance Indices
1. GOAL ATTAINMENT: Changes in air quality and emissions during
the period in relation to the air quality and emissions pro-
blems at the beginning of the period
1.1. TSP Goal Attainment
1.2. S02 Goal Attainment
1.3. CO Goal Attainment
1.4. Ox Goal Attainment
1.5. NO- Goal Attainment
2. PROGRESS: Operational accomplishments during the period in
relation to operational objectives or requirements at the
beginning of the period
2.1. Meeting MBO Commitments
2.2. Source Compliance
2.3. Surveillance and Enforcement Actions
2,4. Monitoring and Reporting Air Quality and Emissions
3, ACHIEVEMENT: Cumulative operational accomplishments at the
end of the period in relation to long-term operational objectives
3.1. Source Compliance
3.2. Monitoring and Reporting Air Quality and Emissions
3.3. Completing Plans and Revisions
B, State Need Indices
4. PROBLEM: Need at the end of the period in relation to air quality
and emissions goals
4.1. Ambient Air Quality
4.2. Emissions and Emission Sources ;
4.3. Emission Reduction Needed
5. OPERATIONAL REQUIREMENTS: Need at the end of the period in
relation to long-term operational objectives
5.1, Source Compliance and Enforcement
5.2. Monitoring and Reporting Air Quality and Emissions
5.3. Completing Plans and Revisions
Figure II-2, Outline of Indices
II-6
-------
1. Performance Indices; Measure state efforts and accomplishments in
relation to national or state goals and objectives.
a. Index 1. GOAL ATTAINMENT (see flowchart, Figure II-3, p. II-8)
Index 1 measures changes in air quality and emissions in a state
from the previous period of evaluation to the present period, in
relation to the air quality and emissions problems during the pre-
vious period. Each sub-index measures goal attainment for a particular
pollutant:
1.1, TSP Goal Attainment
1.2, SO- Goal Attainment
1.3. CO Goal Attainment
1,4. Ox Goal Attainment
1.5. N02 Goal Attainment.
The goal attainment sub-index for each pollutant is composed of
3 indicators: (1) ambient air quality improvement, (2) emission
reduction, and (3) PBMS flags.
(1) The ambient air quality improvement indicator measures
changes in air quality in relation to long-term and short-term
primary standards, as applicable, for each pollutant (i.e., TSP
annual and 24-hour primary standards, S0» annual and 24-hour
primary standards, CO 8-hour and 1-hour primary standards, 0
A
1-hour primary standard, and NO- annual primary standard). Using
a set of monitoring stations that reported data to SAROAD in both
the present and previous periods, (a) changes in the sum of
percentage deviations above standards (air quality worse than
standards), for states that exceeded the standard at any station
in both years, or (b) changes in the sum of all observations,
for states that did not exceed the standard in either year, are
calculated.
II-7
-------
Index
Sub-Index
FLOWCHART FOR INDEX 1. GOAL ATTAINMENT
Indicator Sub-Indicator
-1.2. S02 Goal
Attainment
1. GOAL -
ATTAINMENT
1.1. TSP Goal -
Attainment
L.3. CO Goal
Attainment
1.4. Ox Goal
Attainment
1.5. N02 Goal -
Attainment
r-1.1.1. TSP ambient air
quality improvement
-1.1.2. Z of needed TSP
emission reduction
attainad
-1.1.3. No. of TSP PRMS flags
pl.2.1. S02 ambient air -
quality improve-
ment
-1.2.2. I of needed S02
emission reduction
attained
--1.2.3. No. of S02 PRMS flags
1.3.1. CO ambient air
quality improvement
-1.3.2. Z of needed CO
emission reduction
attained
-1.3.3. No. of CO PRMS flags
1.4.1. Ox ambient air
quality improvement
1.4.2. Z of needed RC
emission reduction
attained
1.4.3. No. of Ox PRMS flags
1.5.1. N02 ambient air
quality improvement
-1.5.2. Z of needed N02
emission reduction
attained
-1.5.3. No. of N02 PRMS flags
C.I.1.1.
.1.1.2.
TSP annual std.
TSP 24-hour std.
C'.2.1.1. SO2 annual std.
.2.1.2.
S02 24-hour std.
C.3.1.1. CO 8-1
.3.1.2. CO 1-i
8-hour std.
hour std.
-1.4.1.1. Ox 1-hour std.
-1.5.1.1. N02 annual std.
Fig. II-3
II-8
-------
(2) The emission reduction Indicator measures the percentage
of the needed emission reduction for each pollutant (emission
goal minus actual emissions) in the previous period, that was
actually attained from the previous to the present period.
Total emissions for each pollutant from all point and area
sources in a state reported to NEDS are used.
(3) The PRMS flags indicator is equal to the number of monitoring
sites flagged as having "potential deficiencies" by the Plan
Revision Management System. This represents the number of
sites in a state that appear not to be meeting schedules for
ambient air quality improvement. The higher the number of flags,
the lower the extent of goal attainment. However, the number of
flags depends on the number of sites with sufficient data for
analysis, PRMS requires readings for four consecutive quarters
at each site analyzed. Some states may have many sites that
are reporting enough valid data to enable a PRMS analysis.
Other states may have relatively fewer sites. For this
reason, along with the number of sites flagged, the number
of stations that were analyzed by PRMS is also presented in
order to put the number of flags in perspective.
A correction factor similar to that used in the AQDI (see p. 11-30)
was considered, but rejected. It was thought unworkable in
this case because the number to be corrected is in most cases
a small integer. Indeed the number is often 0 and such a
correction factor, no matter how large, would not increase 0
to a larger number.
II-9
-------
b. Index 2. PROGRESS (see flowchart, Figure II-4, on p. 11-11)
Index 2 measures operational accomplishments during the present
period in relation to operational objectives at the beginning of the
period. Each sub-index measures accomplishments in relation to a
particular objective:
2.1. Meeting MBO Commitments
2.2. Source Compliance
2.3. Surveillance and Enforcement Actions
2.4. Monitoring and Reporting Air Quality and Emissions
(1) Sub-Index 2.1. Meeting MBO Commitments represents the
percentage of output commitments that was actually met by a
state as reported through the EPA Formal Reporting System and
summarized in the State Activity Report.
At the beginning of a fiscal year states are required, as
a result of negotiations with EPA Regional Offices, to estimate
the tasks that will be performed during the upcoming year for
each output specified by EPA. During the ensuing year states
periodically report their actual accomplishments for each output.
Because the only output category that Regional Offices are
required to report broken down by state is Category #1, source
compliance outputs, this was the only output category that could
be used to measure performance for this sub-index. Thus there
is only 1 indicator under sub-index 2.1. (Any changes in output
reporting requirements, of course, need to be reflected in the
/
make-up of this sub-index. Significant changes in output format
will make it difficult to compare performance from one period to
the next.) Indicator 2.1.1. measures the degree of accomplishment
of outputs in output category 1, and is composed of 8 sub-
indicators, each measuring the percentage of commitments that
were actually accomplished for each output or combination of outputs.
11-10
-------
Index
Sub-Index
FLOWCHART FOR INDEX 2. PROGRESS
Indicator
2. PROGRESS -
Sub-Indicator
1-2.1.1.1.
Commitments
2.1.1.2.
2.1.1.3.
2.1.1.5.
-2.1.1.6.
-2.1.1.7.
*-2. 1.1.8.
C. 2.1. Z non-compl. sources brought
into coropl. w/ final em. req.
, , _ ,
.2.2. Z unknown sources whose
compliance was determined
p2.3.1. No. process insp. & opacity
2.3. Surveillance
& Enforcement
Actions
obs. in relation to no. of
sources
2.3.2. No. stack tests in relation
to no. of sources
-2.3.3. Z of field surveillance
actions taken by state
-2.3.4. No. of enforcement actions
in relation to no. of
non-complying sources
2.3.4.1.
2.3.4.2.
-2.3.4.3.
(-2.4.1.1.
.
2.4. Monitoring &
Reporting Air
Quality &
Emissions
1-2.4.1.3.
pt. 4.2.1.
.
-J 'i ? SO Monitoring
-2.4.2.2.
U.4.2.3.
r-2.4.3.1.
-2.4.3.2.
-2.4.3.3.
[2.4.4.1.
-2.4.4.2.
2.4.4.3.
[.4.5.1.
-2.4.5.2.
.4,5.3.
Output la
Output Ib
Output Ic
Output Id
Output le
Output If
Output lg(l)
Output lh(l)
No. NOV in relation
to no. non-compl. sources
No. AO in relation
to no. non-complying
sources
No. court proceedings
in relation to no. non-
conplying sources
AQCR avg. Z needed
stns . added
Z needed AQCRs w/ req.
network added
SAROAD suff. score
AQCR avg. Z of needed
stns. added
Z needed AQCRs w. req.
network added
SAROAD suff. score
AQCR avg. Z of needed
stns. added
Z of needed AQCRs w/ req.
network added
SAROAD suff. score
AQCR avg. Z of needed
stns. added
Z of needed AQCRs w/ req.
network added
SAROAD suff. score
AQCR avg. Z of needed
stns . added
Z of needed AQCRs w/ req.
network added
SAROAD Suff. score
Z of NEDS missing data
items .completed
Fig. II-4
11-11
-------
Sub-Indicators 2,1,1.1,, 2.1,1,2, and 2,1,1.3, measure the
percentages of commitments for the number of sources in final
compliance with emission requirements (la), the number of sources
whose final compliance status is unknown (Ib), and the number of
non-complying sources not on compliance schedules (Ic), respectively
that were actually accomplished, as reported by the states.
Sub-indicator 2.1.1.4. measures the percentage of the
committed number of sources in final compliance or in compliance
with scheduled increments of progress (outputs la + Id) that was
accomplished. Because a source on a compliance schedule and in
compliance with scheduled increments of progress, that attained
final compliance with emission requirements, is no longer counted
in output Id, it would not be valid to interpret a decrease in
output Id as poor performance on the part of a state. Many states'
commitments for Id increase for part of the year as sources are put
on compliance schedules and brought into compliance with increments
of progress, and then decrease as sources achieve final compliance
with emission requirements. In such cases, it is impossible to
determine on the basis of the numbers alone whether a lower
number of output Id accomplishment is due to fewer sources
brought into compliance with increments or more sources achieving
final compliance. Therefore, outputs la (sources in final com-
pliance) and Id (sources in compliance with increments of progress)
were combined to account for the movement of sources from Id to la-
The same argument can be made for output le, sources overdue
in meeting scheduled increments of progress. Many states' commit-
ments for le increase for part of the year as more sources are
11-12
-------
put on compliance schedules and become overdue in meeting incre-
ments, and then decrease as sources are brought into compliance
with increments and with final emission requirements. In such
cases, a lower number of output le accomplishment may be due to
fewer sources being put on compliance schedules or more sources
kept in compliance with increments of progress. Therefore,
sub-indicator 2.1.1.5. measures the percentage of outputs la +
Id 4- le + If, sources in final compliance or on compliance
schedules, that was accomplished.
Sub-indicator 2.1.1.6. measures the percentage accomplishment
of the committed number of sources on compliance schedules whose
compliance status with regard to increments of progress was unknown
(output If). Sub-indicator 2.1.1.7. measures the percentage
accomplishment of state commitments for the number of state
field surveillance actions (output lg(l)), which includes pro-
cess inspections, opacity observations, and stack tests. Sub-
indicator 2.1.1.8. measures accomplishment of commitments for
the number of state enforcement actions (output (lh(l)), including
notices of violation, abatement orders, and court proceedings.
Because the MBO output numbers do not by themselves reveal
the movement of sources from one status to another, it is hoped
that the CDS, once it is fully operational in all regions, will
be able to fill in the gaps and provide more detailed analysis
of the accomplishment of MBO commitments.
(2) Sub-Index 2.2. Source Compliance measures the accomplishment
of source compliance objectives during the present period. The
11-13
-------
source of the data used is the EPA Formal Reporting System State
Activity Report.
Indicator 2.2,1. represents the percentage of non-complying
sources that was brought into compliance with final emission
requirements or with scheduled increments of progress during the
present period. Indicator 2.2.2. is the percentage of sources
with unknown compliance status, both in regard to final compliance
and compliance with scheduled increments, whose status was deter-
mined during the present period.
The same reservations expressed about the MBO outputs under
sub-index 2,1. apply here. Thus indicator 2.2.1. combines outputs
so as to avoid difficulties with movement of sources from one
status to another. As in sub-index 2.1., CDS should be able to
provide additional and more detailed information once it is fully
operational in all regions,
(3) Sub-Index 2.3. Surveillance and Enforcement Actions is the
sub-index that measures the number of field surveillance and
enforcement actions taken during the period in relation to the
number of sources requiring action. Almost all data is from the
EPA Reporting System State Activity Report.
Indicator 2.3.1. looks at the number of process inspections
and opacity observations conducted by the state in relation the
total number of sources requiring field surveillance. Three
alternatives for the number of sources in a state requiring
field surveillance were considered: the number of sources in
CDS, the number of sources in NEDS, and the number of manufacturing
11-14
-------
facilities listed by Dun & Bradstreet. The number of sources in
CDS was not considered appropriate since these sources are usually
the major sources requiring compliance monitoring (although this
varies from region to region); it was felt that often many more
sources than those on CDS should be the object of some kind of
field surveillance. The completeness of the state NEDS inven-
tories, it was agreed, varied considerably, and use of the number
of sources on NEDS might penalize states that wanted to enter
into NEDS as many sources as possible, as opposed to states that
wished to concentrate only on major emitters. Therefore, the
number of manufacturing facilities listed in the Dun & Bradstreet
DMI (Dun Market Identifiers) file was chosen. While it is recog-
nized that the number from the DMI file may be misleading in some
cases (e.g., a facility may be listed for tax purposes as more
than one plant), it is felt that the DMI file provides the number
most consistent from state to state.
It is also recognized that the number of process inspections
and opacity observations does not indicate the quality of the
inspection or observation, and therefore may penalize the state
that takes more time and does a better job for each. However,
there exists no way of measuring the quality of the field surveillance
action from data routinely reported to Headquarters. Results of this
indicator, therefore, should be looked at with this caveat in mind.
The number of stack tests conducted by the state in relation
to the total number of sources requiring field surveillance is
measured by indicator 2.3.2. Stack tests were examined separately
from other field surveillance actions because of the amount of
11-15
-------
time stack tests take. The same caveat concerning the quality
of the action taken discussed above applies here.
Indicator 2.3.3. measures the percentage of field surveillance
actions done by both the state and EPA, that was done by the
state. This is meant to indicate whether the state is performing
an adequate number of surveillance actions, or whether EPA has
had to step in to significantly supplement state actions. However,
this may penalize states that happen to be in a region where EPA,
for reasons other than the adequacy of state surveillance, has
undertaken a significant federal surveillance program.
Indicator 2.3,4, looks at the number of enforcement actions
taken by a state and by EPA in that state, in relation to the
number of non-complying sources. Because many states have arrange
ments with EPA concerning enforcement actions (i.e., a state may
ask EPA to take a particular action because of limitations in
state law or administrative regulations), all enforcement actions,
whether state or federal, are counted. However, there may be
cases when a regional office may not want to include EPA actions
in their states, because the EPA actions do indicate failure on
the states' part to take needed actions. Non-complying sources
are sources that are not in final compliance with emissions require-
ments and that (a) are not on compliance schedules, or (b) are on
compliance schedules but are not in compliance with increments
of progress. The indicator is composed of 3 sub-indicators, each
referring to a different type of enforcement action.
Sub-indicator 2.3.4.1. deals with the number of notices of
violation issued by a state and EPA in relation to the number of
non-complying sources. There are 2 factors that may affect the
validity of this sub-indicator. First, some states do not have
11-16
-------
an exact equivalent to the EPA section 113 notice of violation.
Thus the number of notices of violation issued that is reported
to EPA may refer to a somewhat different action. Second, a
separate notice of violation may, depending on state regulations
and procedures, be issued for each pollutant emitted from each
point within a source. Therefore, not only can the number of
notices vary significantly from the number of sources, but in
addition the degree of variance depends on the average number of
points within a source in each state as well as the administrative
regulations and procedures of that state.
These problems also affect, perhaps less drastically, sub-
indicator 2.3,4.2., the number of abatement orders issued by a
state and EPA in relation to the number of non-complying sources.
Finally, sub-indicator 2.3,4.3. looks at the number of civil
or criminal court proceedings initiated by the state and EPA in
relation to the number of non-complying sources. It is recog-
nized that many, if not most, states try to avoid court proceedings,
preferring to rely on administrative actions to ensure source
compliance. Nevertheless it was decided that this sub-indicator
still provides a valid measure by which states could be compared.
(4) Sub-Index 2.4. Monitoring and Reporting Air Quality and
Emissions measures performance in relation to operational
objectives of monitoring and reporting ambient air quality and
emissions of the criteria pollutants. Five of the indicators
relate to ambient air quality monitoring and reporting to SAROAD
for the five criteria pollutants:
11-17
-------
2.4.1. TSP Monitoring
2.A.2. S02 Monitoring
2.4.3, CO Monitoring
2.4.4. Ox Monitoring
2.4.5. N02 Monitoring
The last indicator relates to reporting emissions to NEDS:
2.4.6. Emissions Reporting.
Each of the ambient air quality (AAQ) monitoring indicators
(2.4.1.-2.4.5.) is composed of 3 sub-indicators: (a) AQCR
average percentage of needed monitoring stations added, (b)
percentage of AQCRs needing stations that achieved complete
monitoring networks, and (c) SAROAD sufficiency score.
(a) The AQCR average % of needed stations added is equal
to the ratio of: the number of stations in an AQCR using
an acceptable pollutant method to monitor a given pollutant
and reporting sufficient data for at least 1 quarter per
year to SAROAD added during the present period, to the
number of such stations that needed to be added during the
previous period to complete the federally required minimum
network, averaged over all AQCRs in the state. In other
words, this is the average percentage of the number of
stations that needed to be added that were actually added
during the present period. The minimum numbers of stations
required by EPA are set forth in 40CFR51. Alternatively,
the numbers of stations proposed in the State Implementation
Plans may be used instead of the federal minimum.
For each AQCR the maximum percent of needed stations
added is 100%; no credit is given for adding more than the
minimum needed number of stations.
Thus the maximum value for each sub-indicator, which
is the average % of all AQCRs in a state, is also 100%.
11-18
-------
The reason for this limit is to avoid the situation where
an AQCR that adds more stations than needed would balance
out another AQCR that added fewer than the number needed.
If no stations were needed in any AQCR in the state in
the previous period, no value is computed for this sub-
indicator. The purpose of this sub-indicator is to measure
how much of the distance from a completed network was covered
in the AQCRs during the present period.
(b) The percentage of AQCRs needing stations that achieved
complete federally required minimum networks is equal to the
ratio of: the number of AQCRs in a state that attained the
minimum required network during the present period (number
of AQCRs in present period with complete network minus
number in previous period), to the number of AQCRs that had
less than the minimum required networks in the previous
period. If all the AQCRs in the state had at least the
minimum required network during the previous period, no
value is computed for this sub-indicator. This sub-indicator
together with the previous sub-indicator accounts for states
that attempted to add some stations to each AQCR with incom-
plete networks as well as states that concentrated their new
monitors in certain AQCRs.
(c) The SAROAD sufficiency score is the percentage of
station-quarters reported to SAROAD during the present period
that met SAROAD sufficiency criteria. While the other two
sub-indicators measure network completion for each pollutant,
11-19
-------
this sub-indicator provides some measure of the sufficiency
of the data submitted. Sufficiency, as used here by SAROAD,
refers to the number of observations reported during a
quarter and the distribution of observations throughout a
quarter (minimum sufficiency for 24-hour integrating samples
is 5 values per quarter distributed over at least 2 of the
3 months of the quarters with at least 2 samples in each of
the 2 months if there is no sample in the third month, and
for continuous instruments 75% of the possible hourly values
for annual summaries).
Indicator 2,4.6. measures progress in completing emissions info
tion reported to NEDS and has one sub-indicator. Sub-indicator 2.4.6
measures the number of missing necessary NEDS data items that were a
during the present period in relation to the number of necessary
data items that were missing at the beginning of the period. A
necessary NEDS data item is a data bit requested for every point,
source, or plant in NEDS that is considered most important to
meet the basic purposes of NEDS. The selection of the necessary
data items for the trial run was made after consultation with
various EPA personnel, and include all items on the point source
form except: city, contact, plume height, compliance schedule
year and month, compliance status update year, month and day, and
ECAP.
c. Index 3. ACHIEVEMENT, (see flowchart, Figure II-5, on p. 11-21)
Index 3 measures the cumulative operational accomplishments of a
state at the end of the present period in relation to long-term
11-20
-------
FLOWCHART FOR INDEX 3. ACHIEVEMENT
Index
Sub-Index
Indicator
Sub-Indicator
-3.1. Source
Compliance
3. ACHIEVEMENTf3.2. Monitoring -
& Rptg. Mr
Quality &
Emissions
3.3. Completing
Flans &
Revisions
3.1.1. Z sources In compli.
w/ emission requirements
or w/ sch. increments
-3.1.2. Z non-complying sources
on compliance schedule
3.2.1. TSP Monitoring -
-3.2.2. S02 Monitoring-
3.2.3. 00 Monitoring
3.2.4. Oz Monitoring-
3.2.5. N02 Monitoring
Emissions Reporting
-3.3.1. Z required SIF
portions completed
1-3.2.1.1.
3.2.1.2.
L3.2.1.3.
P3.2.2.1.
3.2.2.2.
L-3.2.2.3.
f3.2.3.1.
3.2.3.2.
1-3.2.3.3.
r3.2.*.l.
3.2.4.2.
-3.2.4.3.
ting
-3.2.5.1.
3.2.5.3.
3.2.6.1.
-3.2.6.2.
AQCR avg. Z of required
network
Z AQCRs with required
network
Z pollutant-methods that
are not unacceptable
AQCR avg. Z of required
network
Z AOCRs with required
network
Z pollutant-methods that
are not unacceptable
AQCR avg. Z of required
network
Z AQCRs with required
network
Z pollutant-methods that
are not unacceptable
AQCR avg. "of required
network
Z AOCRs with required
network
Z pollutant-methods that
are not unacceptable
AQCR avg. Z of required
network
Z AQCRs with required
network
Z pollutant-methods that
are not unacceptable
Z total possible sources
(Including NEDS verifica-
tion file) on NEDS
Z nee. NEDS data items
missing
Fig. I1-5
11-21
-------
operational objectives. While Index 2. PROGRESS measures accomplish-
ments during a defined present period, such as a year for an annual
application of the system, in relation to objectives for that period,
Index 3. ACHIEVEMENT measures all accomplishments up to a point in
time, usually the end of the present period, in relation to final
objectives. For example, an Index 2 measure looks at the % of stations
needed at the beginning of the period that were added during a period,
while the comparable Index 3 measure looks at the % of stations in place
at the end of the period.
The operational objectives generally follow the lines of those
for Index 2: 100% source compliance, and completion of minimum
required monitoring networks reporting to SAROAD and emissions data
reported to NEDS. There are no long-range objectives for the number
of surveillance and enforcement actions, however. And one additional
objective is added; completion.of all SIP plans and plan revisions
required as of the end of the present period. The sub-indices measure
achievement of these objectives:
3.1. Source Compliance
3.2. Monitoring and Reporting AQ and Emissions
3.3. Completing Plans and Revisions.
(1) Sub^Index 3.1. Source Compliance measures the cumulative
accomplishments of a state in relation to the objective of
ensuring source compliance.
Indicator 3.1.1. measures the percentage of all point
sources reported in the EPA Formal Reporting System that are
in compliance with final emission requirements or are meeting
scheduled increments of progress. This indicator shows how well
11-22
-------
a state has done In bringing about compliance with emission
limitations or in keeping sources on schedule in meeting
increments of progress.
Indicator 3.1.2. measures the percentage of sources not
in compliance with emission requirements that are on compliance
schedules. This indicates how well a state has done in putting
non-complying sources on compliance schedules.
(2) Sub-Index 3.2. Monitoring and Reporting Air Quality and
Emissions measures the cumulative accomplishments of a state in
relation to the objective of monitoring and reporting ambient
air concentrations and emissions of the criteria pollutants.
The first five indicators measure achievement in relation to
ambient air quality (AAQ) monitoring and reporting to SAROAD for
the five criteria pollutants:
3.2.1. TSP Monitoring
3.2,2. S02 Monitoring
3.2.3. CO Monitoring
3.2.4. Ox Monitoring
3.2.5. N02 Monitoring.
The last indicator measures achievement in relation to
reporting emissions information to NEDS:
3.2.6, Emissions Reporting.
Each of the AAQ monitoring indicators (3.2.1.-3.2.5.) is
composed of 3 sub-indicators: (a) AQCR average percentage of
minimum network, (b) percentage of AQCRs with complete minimum
network, and (c) percentage of pollutant-methods reported that
are not unacceptable.
Ca) The AQCR average % of the federally required minimum
network is equal to the ratio of: the number of stations in
11-23
-------
an AQCR using an acceptable pollutant-method to monitor a
given pollutant that reported sufficient data for one
quarter per year to SAROAD during the present period, to
the minimum required number of stations in that AQCR,
averaged over all AQCRs in the state. In other words, this
is the average percentage of the minimum required network in
place.
It should be noted that this sub-indicator differs from
the comparable sub-indicator under sub-index 2.4., AQCR
average % of needed stations added (see p. 11-18), in the time
period of concern. The latter sub-indicator deals with the
percentage of stations needed at the beginning of the present
period that were actually added during the period, while the
former deals with the status of the monitoring network at the
end of the period in relation to the objective of a complete
monitoring network.
Like the first sub-indicator under each pollutant indi-
cator of sub-index 2.4., the maximum percentage of the minimum
required network for each AQCR is 100%; no credit is given an
AQCR for having more than the minimum required number of
stations. Thus the maximum computed value for this indicator
is 100%, and there is no possibility that an AQCR with more
than the minimum required network would make up for another
AQCR with less than the minimum required network. The pur-
pose of this sub-indicator is to show how far the AQCRs in
the state are from complete networks.
11-24
-------
(b) The percentage of AQCRs with complete minimum network
is equal to the ratio of; the number of AQCRs that have at
least the federal minimum required number of stations, to
the number of AQCRs in the state. This sub-indicator
together with the previous sub-indicator accounts for states
that located some stations in each AQCR as well as states
that concentrated their stations in certain AQCRs.
Again, this sub-indicator is similar to the second
sub-indicator of each of the first five indicators of
sub-index 2,4., the percentage of AQCRs needing stations
that achieved complete networks, except for the time period
of concern. The former deals with the percentage of all
AQGRs in the state that have attained the minimum network
as of the end of the present period, while the latter deals
with the percentage of AQCRs with less than the required
network at the beginning of the present period that attained
the minimum network during the period.
(c) The percentage of pollutant-methods reported that are
not unacceptable is equal to the ratio of: the number of
monitoring sites reporting to SAROAD and using pollutant-
methods classified by SAROAD as acceptable (federal reference
method or equivalent) or unapproved (equivalency not yet
determined), to the total number of monitoring sites re-
porting to SAROAD (including those using pollutant-methods
that have been declared unacceptable). A list of pollutant-
methods and their acceptability classification is given in
Appendix II-B.
11-25
-------
Indicator 3.2,6. looks at reporting of emissions information
to NEDS and is composed of 2 sub-indicators:
(a) Sub-indicator 3.2.6,1. is a measure of the percentage
of sources in a state that are not in NEDS but should
possibly be in NEDS. It is equal to the ratio of: the
number of sources in NEDS, to the number in NEDS plus the
number of sources in the NEDS verification file. The NEDS
verification file is a list compiled from various non-EPA
sources, of point source facilities not in NEDS that need
to be investigated vis-a-vis the necessity of putting them
in NEDS.
(b) Sub-indicator 3.2.6.2. measures the percentage of
necessary NEDS data items for all sources in a state that
are missing as of the end of the present period. Necessary
NEDS data items, selected after consultation with EPA
personnel, are those pieces of information requested on
NEDS point source forms that are considered the most
important to meet the basic purposes of NEDS. All items
requested on the NEDS point source form are considered
necessary except for: city, contact, plume height, compliance
schedule year and month, compliance status update year,
month and day, and ECAP.
(3) Sub-Index 3.3. Completing Plans and Revisions is composed
of 1 measure, indicator 3,3.1., which measures the percentage of
the number of SIP portions due to be completed by a state by the
end of the present period, that was actually completed by that time.
11-26
-------
An SIP portion is a regulatory or non-regulatory part of a
statewide plan or plan for any distinct area within a state
(e.g., AQCR, AQMA, TCP area), as categorized in the SIP Progress
Report. As this categorization changes, the definition of a
portion may change accordingly.
For the sake of simplicity it is assumed that every AQCR in
a state needs one of the SIP portions named in the SIP Progress
Report. The number of completed portions is then equal to:
the total possible number of required SIP portions (number of
AQCRs times the number of portions) minus the number of portions
declared deficient by EPA, including portions EPA has promulgated
in the absence of state completion and adoption of an acceptable
portion. In effect, a portion that is in fact not required for
a state or for AQCRs within a state (such as a TCP) is considered
to have been completed. On the other hand, a deficiency in a
portion of a statewide plan is considered a deficiency for all
AQCRs in the state.
2. Need Indices: Measure the need of a state at the end of the present
period in relation to national goals and objectives. The performance
indices measure state status and activities in terms of one state's acti-
vities or status in relation to its own situation, and computed values are
usually percentages (e.g., AAQI in relation to AAQ problem at the beginning,
number of sources not in compliance in relation to total number of sources).
The need indices, in contrast, measure state status in actual or absolute
terms (e.g., total AAQ deviation, number of sources not in compliance).
There are two need indices, one relating to AAQ and emissions goals and
the other to operational objectives.
11-27
-------
a. Index 4. PROBLEM (.see flowchart, Figure II«6, on p. 11-29)
Index 4 measures the need of a state at the end of the present
period in relation to air quality and emission goals. It consists
of three sub-indices;
4.1. Ambient Air Quality Problem
4.2. Emissions and Emission Sources
4.3. Emission Reduction Needed.
(1) Sub-Index 4.1. Ambient Air Quality Problem measures the
need of a state at the end of the present period in relation to
the national primary ambient air quality standards and the popu-
lation exposed to ambient air quality exceeding standards. The
sub"index is composed of five indicators, each measuring the
ambient air quality (AAQ) problem in relation to one of the five
criteria pollutants:
4.1.1. TSP AAQ Problem
4.1.2, S02 AAQ Problem
4.1.3. CO AAQ Problem
4,1.4. Ox AAQ Problem
4.1.5. N02 AAQ Problem.
Each of the indicators is composed of 2 types of sub-indicators:
(a) air quality deviation indication (AQDI) for each primary
pollutant-standard (2 standards each for TSP, S0», and CO, and
1 standard each for 0 and N0_), and (b) population in AQCRs with
X £,
positive AQDI for each primary pollutant-standard.
(a) The air quality deviation indication for a particular
pollutant-standard is equal to the sum of the percentage
deviations of measured air quality above (worse than) the
\
standard. This sum accounts for the magnitude of deviations
11-28
-------
FLOWCHART FOR INDEX 4. PROBLEM
Index
Sub-Index
Indicator
Sub-Indicator
ft.1.1. TSP ambient air
quality problem
-4.1.2. S02 ambient air
quality problem
1-4.1. Ambient Air-
Quality
Problem
4. PROBLEM-
1.2.
Emission*
Emission
Sources
U.3.
Emission -
Reduction
Needed
-4.1.3. CO ambient air
quality problem
Ox ambient air
quality problem
U.I.5. N02 ambient air
quality problem
-4.2.1. Urban population
.2.2. Urbanized land area
.2.3. Projected population
growth rate
-4.2.4. Projected
manufacturing
growth rate
-4.2.5. Total enisslons-
r-4.3.1. TSP emission
reduction needed
4.3.2. SO- emission
reduction needed
4.3.3. CO emission
reduction needed
4.3.4. HC emission
reduction needed
-4.3.5. NO? emission
reduction needed
-4.1.1.1. TSP annual AQDI
4.1.1.2. TSP 24-hour AQDI
4.1.1.3. Pop. in AQCRs with
positive annual AQDI
4.1.1.4. Pop. in AQCRs with
positive annual AQDI
4.1.2.1. S02 annual AQDI
4.1.2.2. S02 24-hour AQDI
4.1.2.3. Pop. in AQCRs with
positive annual AQDI
14.1.2.4. Pop. in AQCRs with
positive 24-hr. AQDI
4.1.3.1. CO 8-hour AQDI
4.1.3.2. CO 1-hour AQDI
4.1.3.3. Pop. in AQCRs with
positive 1-hr. AQDI
U.I.3.4. Pop. in AQCRs with
positive 1-hr. AQDI
.1.4.1. Ox 1-hour AQDI
.1.4.2. Pop. in AQCRs with
positive 1-hr. AQDI
4.I.S.I. N02 annual AQDI
.1.5.2. Pop. In AQCRs with
positive annual AQDI
c
4.2.5.1. TSP emissions
4.2.5.2. 502 emissions
4.2.5.3. CO emissions
-4.2.5.4. HC emissions
-4.2.5.5. N02 emissions
Fig. II-6
11-29
-------
measured above a standard, and the number of air quality
values registering a deviation above a standard.
The sum can be corrected to account for the completeness
of a state's monitoring network, by dividing the sum by the
ratio of: the number of stations reporting, to the federally
required minimum number of stations (see Section B for more
detailed computation instructions). The assumption behind
\
such a correction factor is that AQ measured by the monitoring
network in place and reporting to SAROAD is in direct proportion
to what AQ would be measured by a "complete (i.e., minimum
required)" network. The sum of deviations above a standard
is thus proportionately increased or decreased according to
the percentage of the complete sampling that was reported to
SAROAD. Another standard for a "complete" network, such as
the number of stations proposed in the SIP could be used in
place of the federally required minimum.
Several drawbacks to the AQDI can be mentioned:
(1) No judgment can be made on the proper spatial distri-
bution of the monitoring stations within each AQCR,
and therefore it must be assumed that measured air
quality is truly representative of air quality in each
AQCR;
(2) For a state that measured no deviation above a standard,
the AQDI could never be increased above 0 by the
correction factor, even if the state had less than a
"representative" network (however, because stations
probably were initially located in areas of expected
11-30
-------
maximun) concentrations, understating the air quality
deviation should not be a problem);
(3) the AQDI using the correction factor may overstate air
quality deviation in AQCRs with less than the minimum
required network if these stations are measuring the
heaviest concentration of a pollutant;
(4) As presently constructed, the AQDI does not distinguish
between source-oriented and population-oriented monitoring
sites. Sites coded as source-oriented were not eliminated,
because it was felt that the reliability of such coding
varied greatly from state to state, However, such a
distinction can easily be built into the measure if
desired.
It should be noted that because the state is the geo-
graphic unit of interest.here, the AQDI for all AQCRs in a
state are summed to yield the AQDI for the state. This state
AQDI masks the distribution and relative local severity of
ambient air quality problems, so that a state with 1 AQCR
with a severe problem might have an AQDI equal to another
state in which all the AQCRs have a slight problem. Like all
measures utilized in the system, interpretation of the meaning
of an AQDI value requires more detailed investigation and
explanation than is intended here.
Finally, calculation of the AQDI requires data that is not
always available. Many states may not report sufficient data to
SAROAD to calculate an AQDI for any of all of their AQCRs; many
states are not required by the federal minimum required network
11-31
-------
to report any data, especially for CO and 0 . For these states
X
AQDI can be listed (as opposed to states with sufficient
data but with an AQDI equal to 0) and any comparison of
state values must be among those with some data.
(b) Population in AQCRs with positive AQDI for a particular
pollutant-standard measures the extent of population exposure
to AAQ exceeding a standard. As for the AQDI, there are 2
population sub-indicators each for TSP, S02 and CO (one
for each of 2 standards), and one each for 0 and NO .
A £*
(2) Sub-Index 4.2. Emissions and Emission Sources looks at
current and projected levels of factors that are associated with
emissions, in the first four indicators:
4.2.1. Urban Population
4.2.2. .Urbanized Land Area
4.2.3. Projected Population Growth Rate
4,2.4. Projected Manufacturing Growth Rate.
The last indicator deals directly with current emissions:
, 4.2.5. Total Emissions.
Indicator 4.2.1. gives the urban population in the state and
is an indication of the number of sources of emissions coming from
urban activity. The Census Bureau definition of urbanized area
population is used: population of an area containing a city of
50,000 or more population plus the surrounding closely settled
incorporated and unincorporated areas which meet Census Bureau
criteria of population size or density (urbanized areas differ from
SMSAs chiefly in excluding the rural portions of SMSA counties and
11-32
-------
those places separated by rural territory from densely populated
fringe around the central city).
Indicator 4.2.2. shows the amount of land over which urban
emission-generating activities take place and like 4.2.1. is an
indication of urban sources of emissions. Urbanized land area is
equivalent to the total area in all SMSAs in the state, and as
such it is not necessarily equivalent to the area on which the urban
population resides. Another measure of land area, such as the area
of Census defined urbanized areas (the more densely settled parts
of SMSAs), can be substituted if data is available.
Indicators 4.2.3. and 4.2.4. attempt to measure future
emissions problems by estimating the growth rate of two major
types of emissions sources, population (area sources) and manu-
facturing activity (point sources). The population growth rate
and the rate of growth of manufacturing activity are estimated for
a given period in the future, such as 5 or 10 years. Choice of the
time period depends on data availability, reliability of various
forecasts, and air program-related deadlines such as air quality
maintenance analysis and planning (see Section B, p. 11-76, for
possible sources of estimates).
Indicator 4.2.5. looks at current levels of emissions (tons
per year) for the criteria pollutants and is composed of 5 sub-
indicators, one for each pollutant. Although it is discussed in
Section.B, Methodology, it should be pointed out at this point also that
the actual values of emissions for the pollutant sub-indicators are not
summed to yield the value for the indicator since it cannot be assumed
11-33
-------
that a ton of TSP is equivalent to a ton of CO (see Section B for
discussion of scoring),
(3) Sub-Index 4.3. Emission Reduction Needed looks at emissions
at the end of the period in relation to emission goals (the level
of emissions needed to attain ambient air quality standards). There
are 5 indicators, one each for emission reduction needed for a
criteria pollutant:
4.3.1, TSP Emission Reduction Needed
4,3.2. S(>2 Emission Reduction Needed
4.3.3. CO Emission Reduction Needed
4.3,4. HC Emission Reduction Needed
4.3.5, NO- Emission Reduction Needed,
b. Index 5, OPERATIONAL REQUIREMENTS (see flowchart, Figure II-7,
on p, 11-35)
Index 5 measures in actual terms the need of a state at the end
of the present period in relation to operational objectives. It
consists of 3 sub-indices, each measuring need in relation to an
operational objective:
5.1. Source Compliance and Enforcement
5.2. Monitoring and Reporting Air Quality and Emissions
5.3. Completing Plans and Revisions.
(1) Sub-Index 5.1. Source Compliance and Enforcement measures
the need of a state at the end of the present period in relation
to source compliance and enforcement objectives. The purpose of
the sub-index is to indicate the relative amounts of time and
effort that will be required of state and local agencies in order
to ensure source compliance. There are 4 indicators that comprise
the sub-index:
11-34
-------
Index
Sub-Index
FLOWCHART FOR INDEX S. OPERATIONAL REQUIREMENTS
Indicator Sob-Indicator
r5.1. Source
Compliance &
Enforcement
5. OPERATIONAL -
REQUIREMENTS
i.2. Monitoring -
& Reporting
Air Quality
& Emissions
Completing
L5.3. Plans &
Revisions
rS.1.1. No. of sources with
unknown compliance status
1.2. No. of non-complying
sources not on
compliance schedule
4.1.3. No. of sources overdue
in meeting scheduled
increments of progress
1.4. No. of sources that re-
quire field surveillance
r-5.2.1. TSP Monitoring
5.2.2. S02 Monitoring
6.2.3. CO Monitoring
-5.2.4. Ox Monitoring
-5.2.5. N02 Monitoring
5.2.6. Emissions Rptg
-5.2.1.1.
5212
-5.2.1.3.
-5.2.2.1.
-5.2.2.3.
-5.2.3.1.
-5.2.3.3.
-5.2.4.1.
-5.2.4.3.
-5.2.5.1.
-5.2.5.3.
-5.2.6.1.
-5.2.6.2.
No. rptg. stns. that
need to be added
No. AQCRs w/ less than
the req. network
Improvement in SAROAO
suff. score needed
No. rptg. stns. that
need to be added
No. AQCRs w/ less than
the req. network
Improvement In SAROAD
suff. score needed
No. rptg. stns. that
need to be added
No. AQCRs w/ less than
the req. network
Improvement in SAROAD
suff. score needed
No. rptg. atns. that
need to be added
No. AQCRs w/ less than
the req. network
Improvement in SAROAD
suff. score needed
No. rptg. stns. that
need to be added
No. AQCRs w/ less than
the req. network
Improvement in SAROAD
suff. score needed
No. sources on NEDS
verification file
No. nee. NEDS data
-5.3.1. No. req. SIP portions
that need to be completed
Fie. II-7
11-35
-------
5.1,1, No. of sources with, unknown compliance status
5.1.2. No, of non~complying sources not on compliance
schedules
5,1.3. No. of sources overdue in meeting scheduled
increments of progress
5,1.4, No. of sources that require field surveillance.
Indicator 5,1.1. counts the number of major point sources
whose compliance status with regard to final emission requirements
or to meeting scheduled increments of progress is unknown, as an
indication of the^ number of sources whose status will have to be
investigated and determined.
Indicator 5.1.2. counts major point sources not in compliance
with emission requirements and not on compliance schedules. These
sources must be placed on compliance schedules.
Indicator 5.1,3. counts major point sources that have com-
pliance schedules but which have not met a scheduled increment of
progress. These sources must be brought into compliance with the
missed increment or the schedules must be revised.
Indicator 5.1.4. is an estimate of the total number of sources
that may require field surveillance of some kind. The number of
manufacturing facilities in the state in the Dun & Bradstreet DMI
file was used (see p. 11-14 for discussion of alternatives rejected),
(2) Sub-Index 5.2. Monitoring and Reporting Air Quality and
Emissions measures the need of a state in relation to the objectives
of monitoring and reporting ambient air quality (AAQ) data to SAROAD
and reporting emissions data to NEDS. The purpose of the sub-index
is to indicate the relative amounts of time and resources that
will be required of state and local agencies in order to complete
the federally required minimum AAQ monitoring network and the NEDS
11-36
-------
file. The first five indicators refer to monitoring and
reporting MQ for the five criteria pollutants:
5,2.1, TSP Monitoring
5.2.2, S02 Monitoring
5,2.3, CO Monitoring
5.2.4, Ox Monitoring
5.2.5, NO, Monitoring.
The last indicator refers to reporting of source and emissions
information to NEDS:
5.2.6, Emissions Reporting.
Each of the first five indicators relating to AAQ monitoring
for a criteria pollutant is composed of 3 sub-indicators: (a)
number of stations needed to complete the minimum network, (b)
number of AQCRs with less than the minimum network reporting, and
(c) improvement needed in SAROAD sufficiency score.
(a) The number of stations needed to complete the minimum
network for a given pollutant is equal to the minimum re-
quired number of stations in each AQCR minus the number of
stations reporting sufficient data for at least 1 quarter
per year to SAROAD, if this latter number is less than the
minimum required number, summed over all AQCRs in the state.
An AQCR with more than the minimum required number of stations
does not make up for another AQCR with less than the minimum
required number,
(b) The number of AQCRs with less than the minimum required
network for each pollutant is equal to the number of station-
quarters of insufficient data reported to SAROAD (see p.11-20
for definition of sufficiency). While the other two sub-indicators
11-37
-------
deal with need relative to network completion, this sub-
indicator is concerned with sufficiency of data reported
from stations in place.
Indicator 5.2.6, measures the need for reporting of source
and emissions information to NEDS, and is composed of 2 sub-
indicators. Sub-indicator 5.2.6.1. counts the number of sources
that are on the NEDS verification file (see p. 11-26 for definition)
and may thus need to be put in NEDS. Sub-indicator 5.2.6.1.
counts the number of necessary data items (see p. 11-26 for
definition) that need to be completed in order to have all
necessary information for sources in NEDS.
\
(3) Sub-Index 5.3. Completing Plans and Revisions has 1 sub-
indicator, 5.3.1., which measures the need of a state at the end
of the present period in relation to completing all necessary SIP
portions and revisions. This includes SIP portions that should
have been completed by the end of the present period as well as
those portions that will need to be completed and approved during
the next period (see p. 11-27 for definition of "SIP portion").
3. State Background Information; In addition to measures of performance
and need, demographic and expenditure information that provides some perspective
on the states is collected for each state. A list of the information collected
is presented on the following page. Actual figures for the states are given
in Appendix II-B. These data are not used as the basis for any scoring, except
for urbanized area population and SMSA land area (index 4).
11-38
-------
STATE BACKGROUND INFORMATION
1. Total population: (a) Civilian (County-City Data Book)
(1000) (b) Including military (Statistical Abstract of the U.S.)
2. Projected population, 1980: (a) Series C (Two Census Bureau population
projections, based on different
(1000) (b) Series E. birth rate assumptions)
3. Urban population (1000) = pop. in urbanized areas + places of 2500 and more.
4. Percentage of population that is urban.
5. SMSA population (1000).
6. Percentage of population that is in SMSAs.
7. Urbanized area population = pop. of densely settled areas of SMSAs (1000).
8. Total land area (sq. mi.).
9. SMSA land area (sq. mi.).
10. Overall density = total pop. /total land area.
11. SMSA density = SMSA pop. /SMSA land area.
12. # of AQCRs of priority I (sum over all pollutants).
13. Population in AQCRs of priority I (sum over all pollutants) (1000).
14. Total air pollution control agency expenditures ($1000).
15. Total expenditures/total population.
16. Total expenditures/urban population.
17. Total expenditures/SMSA population.
18. Total expenditures /UA population.
19. Total expenditures/total land area.
20. Total expenditures/SMSA land area.
21. Total expenditures/overall land density.
22. Total expenditures/SMSA density.
23. Total expenditures/population in AQCRs of priority I (sum over all pollutants)
24. Percentage deviation of 1973 heating degree-days from 30-yr. Normal (
(averaged over all weather stations in state) .
/+ * higher heating degree-days * colder \
- « lower heating degree-days = warmer than normal
11-39
-------
Total air pollution control agency expenditure is equal to the annual
control program budgets for all state and local control agencies in a state,
which is the sum of federal and non-federal funds (including equivalent
value of EPA assignees) plus special contract support funds and demonstration
grants.
11-40
-------
SECTION B. METHODOLOGY
1.
2.
a. Index 1.
b. Index 2.
c. Index 3.
Need Indices
a. Index 4.
b. Index 5.
Goal Attainment . . .
Operational Requireme
, 11-44
, 11-50
, 11-59
, 11-66
, 11-66
jnts .... 11-72
-------
B. Methodology
This section describes the steps involved in calculating values and
scores for the measures described in the previous section. Briefly the
steps are:
(1) The states that will be analyzed are chosen.
(2) The weighting scheme is established. At each level of aggregation
weights for each measure are assigned according to the relative contri-
bution of component sub-indicators to an indicator, of component indi-
cators to a sub-index, and of component sub-indices to an index. For
any given index, sub-index, or indicator, weights for all its components
should sum to 1.0 (or 100%).
(3) The desired number of scoring intervals and the score to be assigned
to each interval are chosen. The number of intervals can range from
two intervals (above and below a mean), or four intervals representing
quartiles, or ten intervals representing deciles, to any other number.
The intervals can be assigned any logical progression of numbers as
scores (e.g., quartiles can be assigned scores 1, 2, 3, and 4; or if
it is desired that each succeeding quartile have a score twice the
score of the last quartile, quartiles can be assigned scores 1, 2, 4,
and 8, etc.). The same number of scoring intervals and the same scores
are used for all measures and all states to ensure comparability from
measure to measure.
(4) For each measure that is made up of individual data items (i.e.,
sub-indicators, or, if there are no sub-indicators for a particular
indicator, indicators), the values for all states being analyzed are
computed from the component data items (discussion of the individual
sub-indicators and indicators is given later in this section, with
reference to worksheets and detailed instructions in the Workbook).
11-41
-------
(5) For each measure the computed values are converted into scores in
the following manner:
Ca) The range of values for which scores will be given is established,
Where possible, this is the range of all possible values, for
example, 0 to 100% final source compliance. If no such range
of possible values is evident, for example, for population or
percent improvement, the range of actual values is determined
for all states for which comparison of results is desired, such
as all states in the nation for a national perspective.
(b) The scale by which the range of values will be divided into the
desired number of intervals, such as an arithmetic scale or
geometric scale, is chosen. The important point to remember
here is that the scale should make it possible to make meaningful
distinctions among states according to the values computed for
the sub-indicator or indicator. Thus a scale should be avoided
that results in a situation in which all or most units being
compared are grouped together at one end of the scale and thus
a large proportion of the units receive the same score. One way
'to decide what scale to use is to list all computed values, look
for groupings of states and natural breakpoints between groupings,
and then try out an arithmetic or geometric scale. The decision
as to whether a given spread of values is satisfactory is subjective,
(c) Using the chosen scale, the range of values for each scoring
interval is determined.
Cd) The scoring intervals into which fall the values for all the
states being compared are determined, and the appropriate scores
are assigned.
11-42
-------
(6) Each score for a component is multiplied by its weight to yield a
weighted score. If a particular value and score for a given state
cannot be computed because of insufficient data, the component weights
for that state are reallocated among those components for which values
and scores can be computed, in the same proportion as was originally
assigned (see Section E, p. 11-132 for further discussion).
(7) Steps (4) computing values, (5) converting values to scores, and (6)
weighting scores, are repeated for all components of a given measure.
(8) The weighted scores for all components of a given measure are summed
to obtain the score of the measure.
(9) Steps C7) computing weighted scores of all components, and (8) summing
weighted scores of all components are repeated until the desired
highest level of aggregation is reached.
A simplified diagram showing the application of weights and summing
of weighted scores at each level is presented below.
Index
5bl"K 5b2=. 5C.J+. 25c2+.
. 25d2+, 125d3+.
Sub" Index
CWt«b2=50%)
, 5c-K
« 5d
-.25d,
Indicator
C;fd2
c ~,5d +.5d.
100%) (Wt.d2=100%) (Wt.d3y50%) (Wt.d4=50%)
11-43
-------
The remainder of this section describes the calculation procedures
(steps (4) through (9)) for the measures within each index, The discussion
follows the outline of indices and components used in Section A. The
sources of the data from which values are computed are also given; additional
information concerning these sources are given in Appendix II-A.
1. Performance Indices
a. Index 1. GOAL ATTAINMENT (see flowchart, Figure II-3, p. II-8)
Each sub-index measures goal attainment for one of five criteria
pollutants, and is composed of 3 indicators: (1) AAQ improvement,
(2) emission reduction, and (3) PBMS flags.
(1) Ambient Air Quality Improvement (AAQI) (1.1.1. TSP, 1.2.1.
SO , 1.3.1. CO, 1.4.1. 0 ,1.5.1. NO ): Following detailed
£ 2t £*
instructions and worksheet #1 in the .Workbook (Appendix II-E), the
sum of all observed values (H) and the sum of all percent
deviations above the standard (I) for both the previous and
present period of evaluation are computed for each pollutant-
standard, using only stations which reported data in both periods.
The data for both (H) and (I) can be obtained from the Monitoring
and Trends Report or from SAROAD printouts. (It should be noted
that the sum of percent deviations for the present period calcu-
lated for Index 1 and the AQDI for the present period computed
for Index 4 and described on page 11-66 will differ because of the
need in the former to compare only values for stations that
reported in both periods,)
For each state with some deviation above a standard (I>0)
in both periods, the sums of deviations (I) for both periods,
11-44
-------
summed over all AQCRs in the state, are compared to yield
the value for each AAQI sub-indicator:
1.1.1.1.(a) TSP annual 1.3.1.1.(a) CO 8-hour
1.1.1.2.(a) TSP 24-hour 1.3.1.2.(a) CO 1-hour
1,2.1.1.(a) S02 annual 1.4.1.1.(a) Ox 1-hour
1.2.1.2.(a) S02 24-hour 1.5.1.1.(a) N02 annual.
The value of each sub-indicator is computed by means of the
following formula:
Previous Period (I) - Present Period (I) -_
Previous Period (I)
For each state with no deviation (l£0) in either period,
the sum of all observed values (H) for both periods are compared
to yield the value for each AAQI sub-indicator:
l.l.l.l.(b) TSP annual 1.3.1.1.(b) CO 8-hour
1.1.1.2.(b) TSP 24-hour 1.3.1.2.(b) CO 1-hour
1.2.1.1. (b) SO2 annual 1.4.1.1.(b) Ox 1-hour
1.2.1.2.(b) 80224-hour 1.5.1.1.(b) N02 annual.
The value of each of these sub-indicators is computed by
means of the following equation:
Previous Period (H) - Present Period (H) .
Previous Period (H)
It should be noted that for any given state and any given
pollutant-standard, a value is computed for either improvement
in air quality deviation above the standard (a) or improvement
in air quality not exceeding the standard (b).
The range of values to be used for each AAQI sub-indicator
is listed on Table II-l. Values for the AAQI sub-indicators can
range from negative improvement (worsening of AQ) to positive
improvement and can conceivably be any real number. Because there
is no predetermined range of possible scores, the ranges of actual
11-45
-------
Table II-l . Converting Values to Scores
Index 1. GOAL ATTAINMENT
Measures
AQ Deviation
Improvement
Sub-Indicators
1.1.1.1. (a)TSP
1.1.1.2. (a)TSP
1.2.1.1. (a)S02
1.2.1.2.(a)S02
1.3.1.1. (a)CO
1.3.1.2.(a)CO
1.4.1.1. (a)0x
1.5.1.1. (a)N02
AQ Improvement
Sub-Indicators
1.1.1.1. (b)TSP
1.1.1.2. (b)TSP
1.2.1.1. (b)S02
1.2.1.2.(b)S02
1.3.1.1. (b)CO
1.3.1.2.(b)CO
1.4.1.1. (b)0x
1.5.1.1. (b)N02
Emission Reduc-
tion
Indicators :
1.1.2. TSP
1.2.2. SO.
1.3.2. CO
1.4.2. HC
1.5.2. N02
PRMS
Indicators :
1.1.3. TSP
1.2.3. S02
1.3.3. CO
1.4.3. Ox
1.5.3. N02
Range of Values Used
Low
-
High
No. of
States
Scale
A-Arith.
G-Geom.
Value Ranges for Scoring Intervals
Tscore- )
Low to:
(Score-. )
to:
Score" )
to:
Score* ^J
to:
11-46
-------
values computed for all states is determined. Also on Table II-l,
the scales to be used and the ranges of values for the scoring
intervals are determined (spaces are provided on the form for
four intervals but any number can be used).
In accordance with the ranges of values for the scoring
intervals, the computed values for the AAQI sub-indicators are
converted to scores on Table II-2 (one per state), the sub-indicators
for each pollutant are weighted and summed to yield the score for
the AAQI indicator for each pollutant.
(2) Emission Reduction (1.1.2. TSP, 1.2.2. S02> 1.3.2. CO,
1.4.2. 0 , 1.5.2. NO-): Using worksheet #2 in the Workbook (Appendix II-E),
X ^
the percent of needed emission reduction attained for each
pollutant from the previous period to the present period is
computed as follows:
/Previous Period Total\ /Present Period Total \
VEmission Rates (T/yr.)j ~ VEmission Rates (T/yr. )j
/Previous Period Total \ ~ /Emission *V
^Emission Rates (T/yr.)y ~ ^Goal (T/yr.)/
Emission rates can be obtained from the National Emissions
Report or NEDS printouts; emission goals can be gotten from the
SIP automated information system.
The resultant value can be negative if present period emissions
reported to NEDS are greater than previous period emissions.
Currently it is not possible to eliminate emissions from new
sources added to NEDS from the previous to the present period.
Thus it is possible that increased emission levels in the present
period reflect new sources added to NEDS, rather than increased
II-4?
-------
STATE:
Table II- 2. Scoring and Weighting
Index 1. GOAL ATTAINMENT
REGION:
Measure
1.1.1.1. (a)
1.1.1.1. (b)
1.1.1.2. (a)
1.1.1.2. (b)
l.l.l.AAQI
I.I.2.E.R.
1.1.3.PRMS
l.l.TSP
1.2.1.1. (a)
1.2. 1.1. (b)
1.2. 1.2. (a)
1.2.1.2. (b)
1.2.1.AAQI
I.2.2.E.R.
1.2.3.PRMS
1.2. SO?
1.3. 1.1. (a)
1.3.1.1.(b)
1.3.1. 2. (a)
1.3. 1.2. (b)
1.3.1.AAQI
I.3.2.E.R.
1.3.3.PRMS
1.3. CO
1.4.1.1. (a)
1.4.1.1. (b)
1.4.1.AAQI
I.4.2.E.R.
1.4.3.PRMS
1.4.0,,/HC
1.5.1.1. (a)
1.5. 1.1. (b)
1.5.1.
1.5.2.
1.5.3.
1.5.N09
l.GOAL
ATTAINMENT
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
Sub- Index
Score Wt. Wtd.
Score
Index
Score
11-48
-------
emissions from the previous period sources. However, ongoing
efforts by NADB staff may soon make it possible to follow
changes in emissions of a given set of sources. Until that is
possible, the number of sources in NEDS in the previous and present
periods can give an indication of the significance of new sources
in accounting for changes in total emissions.
If needed emission reduction is zero or less, that is, if
previous period emissions are less than the emission goals, no
value is computed for the indicator.
Values for the emission reduction indicator for each pollutant
probably would not exceed 100% of needed reduction attained.
However, it is possible that the emission goal could be exceeded
and the value would be more than 100%. On the other end of the
range, there can be negative reduction (increase in emissions)
especially since new sources can be added to NEDS. The range of
computed values or a range from the lowest computed value to 100%
if no computed values exceeds 100%, is listed on Table II-l. In
addition, the range of values for each scoring interval is deter-
mined. On Table II-2, the computed indicator values for each state
are converted to scores.
(3) PBMS Flags a.1.3. TSP, 1.2.3. S00, 1.3.3. CO, 1.4.3. 0 ,
_«Tf_.. ^ X
1.5.3. NO-): Using worksheet #3 in the Workbook (Appendix II-E),
the number of flags for each pollutant-standard in the present
period is calculated. This is the number of monitoring sites
with four consecutive quarters, ending in the present period,
of data reported to SAROAD for which the PRMS analysis indicated
11-49
-------
a "potential deficiency." Data can be obtained from the
PRMS Analytical Summary Report.
For each pollutant the indicator value is the total number
of flags for all standards for that pollutant. Because the
number of flags depends greatly on the number of sites for which
data were reported to SAROAD, the total number of sites
reporting sufficient data for four consecutive quarters ending
in the present period is also calculated.
The PRMS flags indicators can have values ranging from zero to
any whole positive number; therefore the range of values for each
pollutant is from zero to the largest number computed for a state and
listed on Table II-l. The range of values for each scoring
interval is determined, and on Table II-2 computed values are
converted to scores.
Once the scores have been determined for all the indicators for
each pollutant sub-index, the indicator scores are weighted and summed
on Table II-2 to obtain sub-index scores for each state. The sub-index
scores for the pollutants are in turn weighted and summed for a score
for Index 1 for each state.
b. Index 2, PROGRESS (see flowchart, Figure II-4, p. 11-11)
(1) Sub-Index 2.1. Meeting MBO Commitments
Indicator 2,1.1. Output Category 1 is composed of 8 sub-
indicators whose values are computed according to worksheet #4.
Data are derived from the State Activity Report.
H-50
-------
Computed values for these sub-indicators
range from a negative value to any positive
value; the range of values computed for all units
is determined and listed on Table II-3. The scale and ranges of
values for the scoring intervals are chosen and also listed on
Table II-3. Each sub-indicator value computed for each state is
converted to a score and weighted on Table II-4, and weighted
sub-indicator scores are summed to yield the score for indicator
2.1.1. Because there is only 1 indicator under sub-index 2.1.,
the score for indicator 2.1.1. is also the score for sub-index
2.1.
(2) Sub-Index 2.2. Source Compliance
Values for indicators 2.2.1. and 2.2.2. are computed according
to worksheet #4, with data derived from the State Activity Report.
Computed values for the percentages of non-complying sources
brought into compliance (2.2.1.) and unknown sources whose status
was determined (2.2.2.) usually do not exceed 100% or fall below
0. However, the number of non-complying or unknown sources is the
number as of the beginning of the present period. It is possible
that during the period new sources not originally counted are
added to the non-complying or unknown categories, or sources change
their status to unknown or non-complying. A state thus may have
brought into compliance or determined the compliance status of a
greater number of sources than the original number of non-complying
or unknown sources. If computed values thus exceed 100% or fall
below 0, the range of all actual values is used; if no value is
greater than 100% or less than 0, a range of 0 to 100 is used.
11-51
-------
Table II- 3. Converting Values to Scores
Index 2. PROGRESS
Measures
MBO Commitments
Sub-Indicators :
2.1.1.1.
2.1.1.2.
2.1.1.3.
2.1.1.4.
2.1.1.5.
2.1.1.6.
2.1.1.7.
2.1.1.8.
Source Compli-
ance
Indicators:
2.2.1.
2.2.2.
.Surveillance &
Enforcement
Actions
Indicators :
2.3.1.
2.3.2.
2.3.3.
Enforcement
Sub- Indicators :
2.3.4.1.
2.3.4.2.
2.3.4.3.
Range of Values Used
Low
High
No. of
States
Scale
A-Arith.
OGeom.
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
(Score-* )
to:
(Score= )
to;
11-52
-------
Table II- 3. Converting Values to Scores
(continued) Index 2. PROGRESS
Measures
Monitoring &
Reporting
% of Needed
Stations Added
Sub- Indicators :
2.4.1.1.
2.4.2.1.
2.4.3.1.
2.4.4.1.
2.4.5.1.
Z of Needed
AQCRs Attained
Sub- Indicators :
2.4.1.2.
2.4.2.2.
2.4.3.2.
2.4.4.2.
2.4.5.2.
SAROAD Suffi-
ciency Score
Sub- Indicators :
2.4.1.3.
2.4.2.3.
2.4.3.3.
2.4.4.3.
2.4.5.3.
Emissions Rptg.
Sub- Indicator:
2.4.6.1.
Range of Values Used
Low
High
No. of
States
Scale
A-Arith.
G«Geom.
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
(Score- )
to:
(Score= )
to:
11-53
-------
Table II- 4. Scoring and Weighting
Index 2. PROGRESS
STATE:
REGION:
Measure
2.1.1.1.
2.1.1.2.
2.1.1.3.
2.1.1.4.
2.1.1.5.
2.1.1.6.
2.1.1.7.
2.1.1.8.
2.1.1.
2.1. Meeting Com.
2.2.1.
2.2.2.
2.2. Source Compl.
2.3.1.
2.3.2.
2.3.3.
2.3.4.1.
2.3.4.2.
2.3.4.3.
2.3.4.
2.3.Surv. & Enf.
2.4.1.1.
2.4.1.2.
2.4.1.3.
2.4.1.TSP
2.4.2.1.
2.4.2.2.
2.4.2.3.
2. 4. 2. SO,
2.4.3.1.
2.4.3.2.
2.4.3.3.
2.4.3.CO
2.4.4.1.
2.4.4.2.
2.4.4.3.
2.4.4.0,
2.4.5.1.
2.4.5.2.
2.4.5.3.
2.4.5.NO?
2.4.6.1.
2. 4. 6. Em.
2. 4. Monitoring
2. PROGRESS
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
Sub- Index
Score Wt. Wtd.
Score
Index
Score
11-54
-------
The ranges of values to be used are listed on Table II-3, as
are the scales and scoring intervals. The computed indicator
values are converted to scores and weighted on Table II-4, and
weighted indicator scores are summed to yield the score for sub-
index 2.2.
(3) Sub-Index 2.3. Surveillance and Enforcement Actions
The numbers of field surveillance and enforcement actions and
the number of non-complying sources are taken from the State
Activity Report; the total number of sources is taken from a
printout or summary list of the number of manufacturing facilities
in each state derived from the Dun & Bradstreet DMI file, to which
EPA subscribes. Indicators 2.3.1., 2.3.2., 2.3.3., and sub-
indicators 2.3.4.1,, 2.3.4.2,, and 2.3.4.3. are computed on
worksheet #4.
The ranges of computed values for indicators 2.3.1. and 2.3.2.,
and sub-indicators 2.3.4.1., 2.3.4.2., and 2.3.4.3. are listed on
Table II-3. Computed values of indicator 2.3.3., percentage of
field surveillance actions taken by the state, do not exceed 100%,
so a range of 0 to 100% is used and listed on Table II-3. Scales
and scoring intervals are determined.
Computed values of the sub-indicators are converted to scores,
weighted and summed to obtain a score for indicator 2.3.4. on
Table II-4. Computed values for indicators 2.3.1., 2.3.2., and
2.3.3, are converted to scores and weighted and then combined with
the weighted score for indicator 2.3.4. to obtain the sub-index
score.
11-55
-------
(4) Sub-Index 2.4. Monitoring and Reporting Air Quality and
Emissions
The first five indicators under sub-index 2.4. measure
monitoring for the five criteria pollutants, and each is composed
of 3 sub-indicators: (a) AQCR average percent of needed stations
added, (b) percent of needed AQCRs with complete network added,
and (c) SAROAD sufficiency score.
(a) The percentage of stations needed in the previous period
to complete the federally required network, that was added
in the present period, is computed for each state using
worksheet #5a and detailed instructions in the Workbook.
Only stations that do not use unacceptable pollutant-methods
and that reported at least one quarter per year of sufficient
data to SAROAD are counted. If no stations are needed in
all AQCRs in a state, no value for the sub-indicator is com-
puted for the state.
Negative values, resulting from fewer stations reporting
in the present period than the previous period, are possible.
Data are drawn from the published Monitoring and Trends Report
or if more recent data are needed, from SAROAD printouts.
A top limit is set on this sub-indicator (see discussion on
P- H-18) so that values can range from a negative number to
100% (100% means that all needed stations were added). There-
fore the range of values used listed on Table II-3 for each
sub-indicator is the lowest computed value (or 0 if there are
no negative values) to 100%, and the value ranges for the
scoring intervals are determined. The computed value is
11-56
-------
converted to a score for each state and each sub-indicator,
and listed on Table II-4.
(b) The percentage of AQCRs in a state with less than a
complete network for each pollutant in the previous period
that attained a complete network in the present period is
also computed for each state using worksheet #5a and detailed
instructions in the Workbook. If there are no AQCRs in a
state that have less than a complete network in the previous
period, no value for the sub-indicator is computed for the
state. A larger number of AQCRs with less than a complete
network in the present period than in the previous period would
result in a negative computed value.
Data are drawn from the Monitoring and Trends Report or
SAROAD printouts for more recent data.
The highest value that can be computed is 100% since it
is not possible that more than all of the AQCRs that needed
stations achieved complete networks. Thus on Table II-3 the
range of values used for each sub-indicator is the lowest
computed value (or 0 is there are no negative values) to 100%.
Ranges for values for each scoring interval are also listed.
The computed value for each sub-indicator and each state is
converted to a score on Table II-4.
(c) The SAROAD sufficiency score for the present period,
which is the percentage of station-quarters of data sent to
SAROAD during the present period that met sufficiency criteria
11-57
-------
(see p. 11-20 for discussion of sufficiency criteria), is com-
puted on worksheet #5b. Data are from the AEROS Status Report
or from a SAROAD printout.
The range of possible values is 0 to 100% and is listed
on Table II-3. Ranges for scoring intervals are determined
using the selected scale and listed on Table II-3. On
Table II-4 the computed value for each sub-indicator and each
\
state is converted to a score.
Once the scores for all three sub-indicators under each pollu-
tant indicator are computed, the sub-indicator scores are weighted
and summed on Table II-4 to yield the indicator score for each state,
The last indicator, 2.4,5. Emissions Reporting, is made up of
one sub-indicator, 2.4.5.1. percent of missing NEDS necessary data
items completed during the period. The values for this sub-
indicator are computed for all states using worksheet #6 in
the Workbook. Data are from the AEROS Status Report or NEDS
printout of the Missing Data Items Report.
- Possible values for the sub-indicator range from a negative
number (when more items were missing in the present period than
the previous period) to a maximum of 100% (at best all missing
data were completed and no new items were missing). The lowest
computed value and 100% (or the highest computed value if all state
values are much less than 100%) are listed on Table II-3 as the
lowest and highest values of the range of values used. Ranges of
the scoring intervals are also listed on Table II-3.
II-58
-------
The sub-Indicator value for each state is listed on Table II-4
and converted to a score. Because there is only 1 sub-indicator
for the indicator, the score for indicator 2.4.5. for each state
is the same as the sub-indicator score.
Once all six indicator scores have been determined, the indi-
cator scores for each state are weighted and summed for the score
for sub-index 2.4.
Once scores for all four sub-indices under index 2 have been
determined, the sub-index scores are weighted and summed on Table II-4
to obtain the score for index 2.
c. Index 3. ACHIEVEMENT (see flowchart, Figure II-5, p. H-21)
(1) Sub-Index 3.1. Source Compliance
Indicators 3.1.1. percent of sources in compliance and 3.1.2.
percent of non-complying sources on compliance schedules are
computed using worksheet #4 in the Workbook. Data are taken from
the State Activity Report.
Computed values can range from 0 to 100%, and this range of
all values and the ranges of values for scoring intervals are
listed on Table II-5. Using these scoring interval ranges,
computed values for each state are converted to scores on Table
II-6. Indicator scores are weighted and summed to obtain the
score for sub-index 3.1. for each state.
(2) Sub-Index 3.2. Monitoring and Reporting Air Quality and
Emissions
The first five indicators deal with monitoring and reporting
of AAQ of the five criteria pollutants. Each pollutant-indicator
11-59
-------
Table II- 5. Converting Values to Scores
Index 3. ACHIEVEMENT
Measures
Source Complianc
Indicators :
3.1.1.
3.1.2.
Monitoring &
Reporting
% of Required
stations
Sub- Indicators :
3.2.1.1.
3.2.2.1.
3.2.3.1.
3.2.4.1.
3.2.5.1.
Z of AQCRs
Sub- Indicators :
3.2.1.2.
3.2.2.2.
3.2.3.2.
3.2.4.2.
3.2.5.2.
Pollutant-Methods
Sub- Indicators :
3.2.1.3.
3.2.2.3.
3.2.3.3.
3.2.4.3.
3.2.5.3.
Emissions Rptg.
Sub-Indicators :
3.2.6.1.
3.2.6.2.
Completing Plans
Indicator:
3.3.1.
Range of Values Use
Low
High
No. o
State
Scale
A-Arith
G-Geora.
Value Ranges for Scoring Intervals
(Score-
Low to:
(Score-
to:
(Score- )
to;
to:
11-60
-------
Table II-6
Scoring and Weighting
Index 3. ACHIEVEMENT
STATE:
REGION:
Measure
3.1.1.
3.1.2.
3.1. Source Cotnpl.
3.2.1.1.
3.2.1.2.
3.2.1.3.
3.2.1.TSP
3.2.2.1.
3.2.2.2.
3.2.2.3.
3.2.2,80,
3.2.3.1.
3.2.3.2.
3.2.3.3.
3.2. 3. CO
3.2.4.1.
3.2.4.2.
3.2.4.3.
3. 2. 4. Ox
3.2.5.1.
3.2.5.2.
3.2.5.3.
3.2.5.N02
3.2.6.1.
3.2.6.2.
3.Z.&..
3. 2. Monitoring^
3.3.1.
3. 3. Completing
Flans
3. ACHIEVEMENT
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
Sub- Index
Score Wt. Wtd.
Score
Index
Score
n-61
-------
is composed of 3 sub-indicators: (a) AQCR average percent of
required network, (b) percent of AQCRs with required network,
and (c) percent of pollutant-methods used that are not unacceptable.
All data for these pollutant indicators are drawn from the
Monitoring and Trends Report, Air Quality Data-Annual Statistics.
or from SAROAD printouts for more recent data.
(a) The percentage of the federally required network
reporting at least 1 quarter per year of sufficient data
to SAROAD in the present period, averaged over all AQCRs
in a state, is computed using worksheet #5a in the Workbook.
There is a top limit of 100% on the computed value of
this sub-indicator for a state (see discussion on p. 11-18),
so that the range of all possible values is 0 to 100%
(100% means that all AQCRs in a state have the federal
minimum required number of stations). For an AQCR that is
not required to have any stations for a given pollutant,
a value of 100% is given to that AQCR, regardless of the
number of stations actually reporting to SAROAD.
The 0 to 100% range of possible values, and the ranges
of the scoring intervals are listed on Table II-5. The
computed value for each sub-indicator for each state is
converted to a score on Table II-6.
(b) The percentage of AQCRs in a state with the federal
minimum required network reporting to SAROAD in the present
period is also computed on worksheet #5a.
11-62
-------
The range of all possible values, which is 0 to 100%
in this case, is listed on Table II-5, and the ranges of
values of the scoring intervals are determined. Again, an
AQCR that is not required to have any stations is considered
to have the minimum required network regardless of the number
of stations actually reporting. Each computed value for each
state is converted to a score on Table II-6.
(c) The percentage of stations reporting data to SAROAD
using pollutant-methods that are not unacceptable ("not
unacceptable" is discussed on p.11-25) is computed using
worksheet #5a.
For TSP, no unacceptable methods were reported to SAROAD
in 1972 and 1973 because of the prevalent use of the federal
reference method. Until this situation changes, this sub-
indicator for TSP (3.2.1.3.) should probably be given a weight
of 0. There were relatively few unacceptable methods for S0»
and CO reported to SAROAD in 1972 and 1973, and these numbers
may decrease in succeeding years; accordingly the sub-indicators
for CO (3.2.3.3.) and SO- (3.2.2.3.) can be given low weights.
The range of possible values is 0 to 100%. This range
and the ranges of values for the scoring intervals are listed
on Table II-5. Each computed sub-indicator value for each
state is converted to a score on Table II-6.
Once the sub-indicator scores for each pollutant indicator
are determined, the indicator score is obtained by weighting and
summing the component sub-indicator scores.
11-63
-------
The last indicator under sub-index 3.2. is 3,2.6. Emission
Reporting, which consists of 2 sub-indicators. Sub-indicator
3.2.6.1. percent of total possible sources on NEDS, is computed on
worksheet #6, using data on the NEDS verification file. Using the
NEDS Missing Data Report in the AEROS Status Report or more recent
NEDS printouts, values for sub-indicator 3.2.6.2. percent of
necessary NEDS data items that are missing, are also computed on
worksheet #6 in the Workbook.
Possible values for sub-indicator 3.2.6.1. range from 0
(which is improbable since it implies no sources on NEDS) to
100%. On Table 11-5 this range and the ranges of values for the
scoring intervals are listed. Computed values are then converted
to scores on Table II-6.
It should be noted that for sub-indicator 3.2.6.2., a low
value means relatively few Items missing and a high value means
a large proportion of items missing. In contrast to most other
measures, the higher the value, the lower the extent of achieve-
ment. Therefore the range of values used for scoring g«j@s from
the highest to the lowest sub-indicator values.
Also regarding sub-indicator 3.2.6.2., a minimum number of
the necessary data items is required in order to get a source
into NEDS. Therefore, the largest possible value for 3.2.6.2.
is the percent of necessary NEDS data items that are missing when
all the possible # of data items that could be missing, without the
point source being rejected by NEDS, are actually missing. For a
given state, this is equal to:
11-64
-------
/ Minimum # of necessary A
i _ V data items required /
Cotal # of necessary data items\
for all sources in the state /
The upper limit of the range of possible values is 0 (no
data items missing). The range of possible values and the ranges
of scoring intervals are listed on Table II-5. For each state the
computed sub-indicator score is converted to a score on Table II-6.
The scores for sub-indicators 3.2.6.1. and 3.2.6.2. for each
state are weighted and summed on Table II-6 to yield a score for
indicator 3.2.6.
After all scores for indicators 3.2.1. to 3.2.6. for each
state are calculated, the score for sub-index 3.2. is calculated
on Table II-6 by weighting and summing the indicator scores.
(3) Sub-Index 3.3. Completing Plans and Revisions
Using data from the SIP Progress Report covering the 6-month
period at the end of the present period of evaluation (e.g., June-
December 1973 report for calendar year 1973), the value for the
only indicator under sub-index 3.3. is computed. The indicator
3.3.1., percent of required SIP portions completed, uses the
categorization of SIP portions used in the SIP Progress Report,
and is equal to:
1 T
of SIP portions found by EPA \
to be deficient, including those I
proposed or promulgated by EPA /
/ Total possible # of SIP portions \
^required by the end of the present period^
The total possible # is equal to the number of SIP portions
outlined in the SIP Progress Report times the number of AQCRs in
H-65
-------
the state. An SIP portion of a statewide plan found by EPA to
be deficient is considered to be deficient for all AQCRs in the
state and thus the number of deficient portions is equal to the
number of AQCRs. Worksheet #7 in the Workbook is used to calcu-
late indicator values.
The range of possible values for the indicator are 0
(improbable because no state has had all of the possible number
of SIP portions declared deficient) to 100% (no deficiencies).
This range and the ranges of the scoring intervals are entered on
Table II-5, and the indicator value for each state is converted
to a score on Table II-6. Because this is the only indicator
under sub-index 3.3., the sub-index score is the same as the indi-
cator score.
After all sub-index scores (3.1., 3.2., 3.3.) are calculated for
a state, the scores are weighted and summed to obtain the score for
index 3.
2. Need Indices
a. Index 4. PROBLEM (see flowchart, Figure II-6, p. 11-29)
(1) Sub-Index 4.1. Ambient Air Quality (AAQ) Problem
Each indicator measures the AAQ problem for one of the five
criteria pollutants, and is composed of two types of sub-indicators:
(a) air quality deviation indication (AQDI) and (b) population
exposed to air quality worse than standards.
(a) The air quality deviation indication sub-indicator for
a given pollutant and a given primary pollutant standard is
II- 66
-------
equal to;
VES
(Pollutant value) - (Pollutant standard)
Pollutant standard
VES
where J = sum over all values in an AQCR with a
pollutant value exceeding the standard
AQCR
and £ = sum over all AQCRs in a state.
The AQDI can be corrected to account for the complete-
ness of a state's monitoring network. The corrected AQDI for
a given pollutant and a given primary pollutant standard is
equal to:
VES
AOPR Y (Pollutant value) - (Pollutant standard)
y _^_ _ Pollutant standard _
^ No. of stations reporting pollutant value
Minimum required no. of stations
The values for both the uncorrected and corrected AQDI
can be computed using worksheet #8 and instructions in
the Workbook. Data are drawn from the Monitoring and Trends
Report for annual means and SAROAD printouts for other values
and also for more recent data for all values.
The possible values for the AQDI, both uncorrected and
corrected, range from 0 to any positive number. The range of
computed values for all states being analyzed, as well as
ranges of scoring intervals are listed on Table II-7.
Computed values for each state are then converted to scores
on Table II-8.
(b) The exposed population sub-indicator for each pollutant
and pollutant standard is the population in those AQCRs
11-67
-------
Table II- 7 . CdBTcrtlnf Value* to Scon*
Index 4. PROBLEM
Measure*
AAQ Problem
AQDt
Sub-Indicators:
4.1.1.1. (a)TSP
4.1.1.1.(b)TSP
4.1.1.2.(a)TSP
4.1.1.2.(b)TSP
4.1.M.(a)S02
4.1.2.1.(b)S02
4.1.2.2.(a)S02
4.1.2.2. (b)S02
4.1.3.1. (a)CO
4.1.3.1.(b)CO
4.1.3.2.(«)CO
4.1.3.2.(b)CO
4.1.4.1. (a)0x
4.1.4.1.01)0,
4.1.5.1. (a)K02
4.1.5.1. (b)K02
Population (1000)
Sub-Indicator*:
4.1.1.3. TSP
4.1.1.4.
4.1.2.3. S02
4.1.2.4.
4.1.3.3. CO
4.1.3.4.
4.1.4.2. 0,
4.1.5.2- W2
Erlsslor.s &
E>. Source*
Indicators:
4.2.1. Pop.
4.2.2. Land
4.2. 3^ Pop. Or.
4.2.4. Hanu.Rr.
tula* Ions (1000T/
Sub- Indicators:
4.2.5.1. ISP
4.2.5.2. S02
4.2.5.3. CO
4.2.5.4. HC
4.2.5.5. NOX
Eaiaalon Reduc-
tion Needed .
Indicators :
4. 3.1. TSP
4.3.2. S02
4. 3. 3. CO
4.3.4.RC
4.3.5.NO,
Range of Value* Used
Low
n)
High
Ko. of
State*
.
Scale
A-Arith.
OGeon.
Value Range* for ?corlnf Interval*
(Score- )
low to:
(Score- )
to:
(Score- )
to:
(Score- )
tt\ '
() Corrected
(b) Uncorreeted
11-68
-------
Table II- 8. Scoring and Weighting
Index 4. PROBLEM
STATE:
REGION:
Measure
4.1.1.1.
4.1.1.2.
4.1.1.3.
4.1.1.4.
4.1.1.TSP
4.1.2.1.
4.1.2.2.
4.1.2.3.
4.1.2.4.
4. 1.2. SO?
4.1.3.1.
4.1.3.2.
4.1.3.3.
4.1.3.4.
4. 1.3. CO
4.1.4.1.
4.1.4.2.
4.1.4.0,.
4.1.5.1.
4.1.5.2.
4. 1.5. NO?
4.1.AAQ Problem
4.2.1.
4.2.2.
4.2.3.
4.2.4.
4.2.5.1.
4.2.5.2.
4.2.5.3.
4.2.5.4.
4.2.5.5.
4.2.5.
4. 2. Em & Em.
Sources
4.3.1.
4.3.2.
4.3.3.
4.3.4.
4.3.5.
4. 3. Em. Reduc.
Needed
4. PROBLEM
Sub- Indicator
Value Score Wt. Wtd.
Score
,
Indicator
Value Score Wt. Wtd.
Score
Sub- Index
Score Wt. Wtd.
Score
Index
Score
11-69
-------
(and state portions of interstate AQCRs) for which a positive
AQDI is computed. The sub-indicators are computed using
worksheet //9 and instructions in the Workbook. Data is
taken from census population figures as presented in the
OBERS extension to AQCRs (see Appendix II-A for description).
State portions of interstate AQCRs are derived from an NADB
printout of the population of state-AQCR combinations.
The range of computed values for each exposed population
sub-indicator is listed on Table II-7, and the ranges of the
scoring intervals are determined. Computed values are con-
verted into scores on Table II-8.
Scores for the AQDI and population exposed sub-indicators for
each pollutant indicator are weighted and summed to calculate
scores for each pollutant indicator.
Indicator scores for each pollutant are then weighted and
summed to calculate a score for sub-index 4.1. for each state.
(2) Sub-Index 4.2. Emissions and Emission Sources
Values for the urbanized area population indicator (4.2.1.)
are taken from the census or any of the statistical abstracts
based on the decennial census. The Census Bureau also publishes
annual population estimates so that more current population figures
are available. Urbanized (SMSA) land area values (indicator 4.2.2.)
are from the Statistical Abstract of the U.S.
Values for the projected population (1970-1980) and manu-
facturing (1969-1980) growth rate indicators (4.2.3. and 4.2.4.)
II- 70
-------
are from the OBERS projections. An alternative source of popu-
lation growth rates is the Census Bureau Series C or Series E
estimates (which served as a basis for the OBERS rates). The
OBERS projections of changes from 1969 to 1980 in a production
index for all manufacturing industries are used to calculate
manufacturing growth rate. The index is considered an estimate
of gross product. Alternatively, OBERS projections of total
earnings for manufacturing industries, which are not adjusted to
account for differential gross product-earnings ratios among
industries as are the production indexes, can be used to calculate
rates of growth of manufacturing activity.
The ranges of computed values for these indicators, computed
using worksheet #10 in the Workbook are listed on Table II-7,
as are the ranges for the scoring intervals. Values for each
state are converted to scores on Table II-8.
The last indicator, 4.2.5. Total Emissions, is composed of
five sub-indicators, one for each criteria pollutant emitted.
Total emissions from all point and area sources in a state are
taken from the National Emissions Report or from NEDS printouts
for more recent data, and are listed on worksheet #2. The ranges
of values for each sub-indicator for all states are listed on
Table II-7, followed by ranges of scoring intervals. Scores
for the computed values of the five sub-indicators for each state
are listed on Table II-8, weighted and summed for the score for
indicator 4.2.5.
Indicator scores are weighted and summed to yield the score
for index 4,2.
11-71
-------
(3) Sub-Index 4.3. Emission Reduction Needed
There are five indicators, one for each criteria pollutant
emitted. Each indicator is equal to:
(Emission Goal) - (Emissions for Present Period)
and is computed on worksheet #2 in the Workbook.
Emissions for the present period are obtained from the
National Emissions Report covering the present period and are
the total emissions of a pollutant from all point and area
sources in a state. The emission goals for each pollutant for
each state are projected to be available from the NADB's SIP
automated information system in the near future.
The ranges of values for the five pollutant indicators are
listed on Table II-7, as well as the ranges for each scoring
interval for each pollutant. -Individual state values are con-
verted to scores on Table II-8 and scores for the five indicators
are weighted and summed to calculate the score for sub-index 4.3.
b. Index 5. OPERATIONAL REQUIREMENTS (see flowchart, Figure II-7,
p. 11-35)
(1) Sub-Index 5.1. Source Compliance and Enforcement
The four source compliance and enforcement indicators are
computed on worksheet #4 in the Workbook. Data are derived from
the State Activity Report and the Dun and Bradstreet DMI file.
Ranges of computed values for all states and ranges of the
scoring intervals for each indicator are listed on Table II-9,
values for each state are converted to scores on Table 11-10
11-72
-------
Table II- 9. Converting Values to Scores
Index 5. OPERATIONAL REQUIREMENTS
Measure*
Source Compliance
k Enforcement
Indicators :
5.1.1.
5.1.2.
5.1.3.
5.1.4.
Monitoring &
Reporting
No. of Needed
Stations
Sub- Indicators :
5.2.1.1.
5.2.2.1.
5.2.3.1.
5.2.4.1.
5.2.5.1.
No. of AQCRs
Sub-Indicators :
5.2.1.2.
5.2.2.2.
5.2.3.2.
5.2.4.2.
5.2.5.2.
Improvement in
SAROAD score
Sub- Indicators :
5.2.1.3.
5.2.2.3.
5.2.3.3.
5.2.4.3.
5,2.5.3.
Emissions Rptg.
Sub-Indicators :
5.2.6.1.
5.2.6.2.
Completing Flans
Indicator:
5.3.1.
Range of Values Used
Low
Hieh
No. of
States
Scale
A-Arith.
G-Geom.
.
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
(Score- )
to:
[Scor'e« )
to:
11-73
-------
STATE:
Table 11-10* Scoring and Weighting
Index 5. OPERATIONAL REQUIREMENTS
REGION:
Measure
5.1.1.
5.1.2.
5.1.3.
5.1.4.
5.1. Source Compl.
5.2.1.1.
5.2.1.2.
5.2.1.3.
5.2.1.TSP
5.2.2.1.
5.2.2.2.
5.2.2.3.
5.2.2.S02
5.2.3.1.
5.2.3.2.
5.2.3.3.
5.2. 3. CO
5.2.4.1.
5.2.4.2.
5.2.4.3.
5.2.4.0,,
5.2.5.1.
5.2.5.2.
5.2.5.3.
5.2.5.N03
5.2.6.1.
5.2.6.2.
5. 2. 6. Em.
5. 2. Monitor ing
5.3.1.
5. 3. Completing
Plans
5. OPERATIONAL
REQUIREMENTS
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
Sub-Index
Score Wt. Wtd.
Score
Index
Score
11-74
-------
and scores are weighted and summed to obtain the score for
sub-Index 5,1, for each state,
(2) Sub-Index 5.2. Monitoring and Reporting Air Quality and
Emissions
The first five indicators are for monitoring ambient air
quality levels of the five criteria pollutants. Each pollutant
indicator is composed of three sub-indicators: (a) number of
stations that need to be added, (b) number of AQCRs with less
than the required network, and (c) improvement in SAROAD
sufficiency score needed.
(a) The number of stations for each pollutant that need
to be added in a state at the end of the present period
is equal to:
AQCR
~ in-Limnum reqmteu it\ ireporteu uauti aunngji.
V nf a t- a t- { nin ts M % t"V»o r»-»-/3O£iTl * I-VOT**! r\e\ ft
each AQCR
AQCR
where £ = sum over all AQCRs in the state.
Only stations that do not use unacceptable pollutant-
methods (see Appendix II-B) and that reported at least one
quarter per year of sufficient data to SAROAD are counted.
Only AQCRs that needed stations to complete the federally
required network are counted; no negative values are included
in the state total. Worksheet #5a in the Workbook is used
to compute the values for the sub-indicators.
Data are taken from the Monitoring and Trends Report or
SAROAD printout covering the present period.
11-75
-------
Possible values for each pollutant range from 0 (no
stations required in any AQCR in a state) to the total number
of minimum required stations in a state. The range of values
listed on Table II-9 is 0 to the largest number computed for
a state. Ranges for the scoring intervals are also listed
on Table II-9. Computed values for each state are converted
to scores on Table 11-10.
(b) The number of AQCRs in a state with less than the
federally required network for each pollutant is taken from
the Monitoring and Trends Report or SAROAD printout covering
the present period, and is entered on worksheet #5a. Only
stations that do not use unacceptable pollutant-methods and
that reported at least one quarter per year of sufficient
data to SAROAD are counted toward the minimum required number
of stations.
Possible values range from 0 (all AQCRs have the minimum
required number of stations reporting to SAROAD) to the
total # of AQCRs in a state. On Table H-9, a range of
0 to the largest computed state value, and the ranges of the
scoring intervals are listed. Scores are calculated on
Table II-9 by converting computed state values for the
pollutants.
(c) Improvement needed in the SAROAD sufficiency score for
each pollutant and each state is equal to:
(100) - (Sufficiency score for present period).
11-76
-------
The sufficiency score for the present period was computed
for index 2, and the improvement needed is calculated on
worksheet #5b in the Workbook. Data are from the AEROS
Status Report or from a SAROAD printout.
Possible scores range from 0 to 100 and the range of
all possible scores and ranges for the scoring intervals
are listed on Table II-9. The values for the sub-indicators
for the various pollutants for each state are converted to
scores on Table 11-10.
.*»
When all three sub-indicator scores for each pollutant are
determined, the scores are weighted and summed on Table 11-10
for the score for each of the pollutant indicators (5.2.1. to
5.2.5.).
The last indicator, 5.2.6. Emissions Reporting, has two
component sub-indicators. Sub-indicator 5.2.6.1. counts the
number of sources that are on the NEDS verification file and
thus may need to be added to NEDS. Sub-indicator 5.2.6.2.
counts the number of necessary NEDS data items that are missing
and need to be completed. Values for both sub-indicators are
entered on worksheet #6 in the Workbook. Sources for the data
are the NEDS list of sources on the verification file by state,
and the NEDS Missing Data Items Report (in the AEROS Status Report)
The scores for all six indicators for each state are weighted
and summed on Table 11-10 to calculate the score for sub-index 5.2.
11-77
-------
(3) Sub-Index 5V3, Completing Plans and Revisions
Indicator 5,3.1, counts the number of SIP portions which
require state action, completion, or adoption. This is assumed
to be equal to the number of portions of statewide or area plans
declared deficient by EPA, including non-regulatory portions
that require state submittal, regulatory portions proposed or
promulgated by EPA in the absence of approved state action, and
deficiencies in legal authority. The source of data is the SIP
Progress Report covering the end of the present period. Work-
sheet #7 is used to enter the value of this indicator.
The range of computed values for all states is listed on
Table II-9, and the ranges of values for the scoring intervals is
determined. The computed value for each state is converted to a
score on Table 11-10. Because this is the only component indi-
cator under this sub-index, the indicator score is the same as
the score for sub-index 5.3.
Scores for each state calculated for the three sub-indices are
weighted and summed on Table 11-10, and the score for index 5 is entered,
11-78
-------
SECTION C. TRIAL RUN
1. Performance Indices 11-82
a. Index 1. Goal Attainment 11-82
b. Index 2. Progress 11-86
c. Index 3. Achievement 11-90
2. Need Indices 11-94
a. Index 4. Problem 11-94
b. Index 5. Operational Requirements .... 11-100
-------
,, Trial Run
u
A trial run of the system using existing data for all 55 states and
territories was conducted to serve four purposes:
(1) To test the availability and accessibility of data for all states
and territories;
(2) To bring out any problems involved in using the data and calculating
values and scores;
(3) To provide actual values and scores on the basis of which some
assessment of the validity of the measures can be made;
(4) To provide an estimate of the amount of time and effort involved
in implementing the system.
Data availability was a problem throughout the trial run for three
reasons:
(a) Data banks in the Research Triangle Park, N.C., are currently
being converted to the UNIVAC 1110 computer. Consequently there
have been some problems having programs run and output produced;
e.g. short-term ambient air quality values from SAROAD could not
be obtained. Because these difficulties are considered to be
temporary, no measures or procedures described in sections B and
C were changed for this reason. To complete the trial run in
spite of these difficulties substitute values were used for cer-
tain measures, and certain other measures were not computed.
(b) Infrequently, no data for a particular parameter was available
for an individual state either because of nonsubmission of data
by a state or because of a programming error. For example, NEDS
11-79
-------
1973 emissions for Nebraska were missing from the printout and
lack of time prevented going back and getting them.
(c) Some data not currently available at EPA headquarters were
included, nevertheless, in constructing the measures because
they are expected to be available in the near future. The most
prominent example is the state emissions goals necessary to
compute emissions reductions needed, which are expected to be
incorporated in the SIP automated information system. Another
example is N0_ monitoring and ambient air quality data; although
this data was not used in the trial run because of the uncertainty
of measurement methods, the situation can be expected to be clari-
fied in the near future. For those measures dependent on these
data, either substitute values or measures were used instead, or
the measures were not computed.
When data were not available and substitutes were used, this is
explained fully for each measure affected. Where values for measures
were not computed at all, this is explained in the discussion of the
measure and noted in the formatted results, and weights of 0 were assigned
to these measures before scores were aggregated.
Based on the trial run of the system, it is estimated that once the
data are available, computation of index scores and presentation of the
results in the suggested formats for the fifty-five states and territories
takes approximately 38 person-days. Using the instruction in section B
and the Workbook (Appendix II-E), only very basic mathematical skills are
required. Collecting and computing the state background information takes
another three person-days; some of the information (such as land data)
does not have to be collected a second time.
11-80
-------
A breakdown by index of the time that was required in the trial
run is presented below:
Index 1: 10 person-days Index 4: 9.75 person-days
Index 2: 10.25 person-days Index 5: 3.25 person-days
Index 3: 4.75 person-days
State Background Information: 3 person-days
It would appear that annual application of the system on a manual
basis is feasible in terms of time and effort required. The length of
time could be shortened significantly by automating computation of the
AQDI and AAQI measures as suggested in Section E of this report. On the
other hand, computation of the short-term AQDI and AAQI measures, which
was not done in the trial run because of unavailability of data, would add
considerably more time unless the computation procedures were computerized.
For the trial run, a weighting system developed partly on the basis
of the results of a questionnaire sent to the Regional Offices was used.
The weight for each component measure is given on the appropriate table
before the component scores are weighted and summed to calculate scores
on the next level of aggregation. Four scoring intervals, representing
quartiles, with scores of 1, 2, 3, and 4 (the larger the score, the greater
the performance or need) were chosen.
The rest of this section is devoted to discussion of each of the
measures, according to the outline of indices used in Sections A and B.
For each measure, the following points are discussed:
(1) Data sources used;
(2) Time periods to which the measures are relevant for the trial run;
(3) Calculation procedures used in the trial run insofar as they differed
from those set forth in Section B;
11-81
-------
(4) Any difficulties with, definitions of terms or computation of values
or scores encountered;
(5) Actual values and scores computed on the tables described in Section B.
1. Performance Indices
a. Index 1. GOAL ATTAINMENT
(1) Ambient Air Quality Improvement (AAQI)
Data for short-term values could not be obtained from
SAROAD because of computer difficulties. Therefore the AAQI
sub-indicators for the TSP and SO 24-hour standards and the
CO 1-hour standards were not computed (no N0_ values were used).
Because many states did not have sufficient data to compute
annual means for TSP and S0_, it was decided to substitute the
50th percentile value of the frequency distribution of all values
reported to SAROAD for the TSP annual geometric mean, and the
70th percentile value for the SO. annual arithmetic mean. Such
2.
substitutions are recognized to be rough estimations at best,
and it is expected that the sufficiency of data reported to
SAROAD will improve, enabling the use of annual means in the
near future.
The 50th and 70th percentile values for TSP and S02 were
taken from the 1972 and 1973 Air Quality Data-Annual Statistics.
Stations with less than 15 observations (an arbitrary number
equivalent to a minimum of 5 values per quarter for a minimum
of 3 quarters per year, although no assessment of the distri-
bution of values within a quarter was made) were eliminated. For
11-82
-------
the remaining stations, station code numbers reporting in 1972
and 1973 were matched to obtain a set of stations that reported
data in both years.
Using these stations, the sums of all percent deviations
above the annual TSP and SO standards (I) for 1972 and 1973
were computed:
/TSP 50th or SoA _ /TSP or SO primary^
AQCR SES I 70th % value I [ . .. . I
v y > *_ \ annual standard /
I L (TSp or so primary annual standard)
SES
where \ = sum over all stations with TSP 50th or S02
70th % value exceeding standard
AQCR
and £ = sum over all AQCRs in a state.
Because short-term values were not available, substitutes
were used for the CO 8-hour and 0 1-hour AAQI sub-indicators.
X
From the 1972 and 1973 Monitoring and Trends Report, stations
that reported CO and 0 data in both years were identified. For
these stations air quality deviation for the state in each year
was computed. For 0 , AQ deviation (I) was equal to:
AQCR SES
I I
"
ft 1-hr. values\ x
\exceeding std./
total # valid 1-hr, values
/2nd highestX ( .
U-hr. value] ~ (Std-}
Std.
SES
where ^ = sum over all stations in an AQCR with 2 or more
valid values exceeding the standard
AQCR
and £ = sum over all AQCRs in a state.
(Note: The number of values above the standard is multiplied by
100 only to avoid numbers with a large number of decimal
places.)
11-83
-------
CO was eqi
AQCR SES
I I
lal to:
/# 8-hr. averages^
V exceeding std ./X 100
C:otaL # valid
Averages
8-hr\
The air quality deviation CD for the CO 8-hour standard
was computed in the same way except that, because the second
highest 8-hour average is not yet printed in the Monitoring and
Trends Report, the highest 8-hour average was used. Thus (I) for
(highest 8-hr, average)-(Std.j
Std.
SES
where J. = sunj over all stations in an AQCR with 2 or more
valid values exceeding the standard,
AQCR
and £ = sum over all AQGRs in a state.
For each pollutant, if the air quality deviation (I) for
each state was greater than 0 (indicating values greater than
standards) for both 1972 and 1973, the difference between the
numbers for the 2 years was computed to yield the percent
improvement in air quality deviation:
1972(1) - 1973(1)
1972
x 100
If (I) for a state was 0 or less for either 1972 or 1973,
the sums of all observed values (H) for both years were computed
and the percent improvement in air quality was calculated:
1972(H) - 1973(H)
1972(H)
x 100
1.2.1.1.(b)SO
2
2J
The values thus computed represent air quality improvement
from 1972 to 1973.
11-84
-------
C2) Emission Reduction
Emission goals for each, state Cemigslon levels needed to
attain AAQS) were not available, although they are projected to
be available In the near future from the SIP automated information
system. For the trial run, percentage emission reductions were
computed and used as the values for the emission reduction
sub-indicators.
Total 1972 and 1973 emissions from all point and area
sources in a state were taken from the 1972 National Emissions
Report and a NEDS printout emission summary by state dated
January 1975 that is generally representative of 1973 emissions.
For each pollutant, emission reduction from 1972 to 1973 was
computed:
1972 emissions - 1973 emissions
1972 emissions
x 100
1.1.1.2.TSP
1.2.1.2.SO
1.3.1.2.CO
1.4.1.2.HC
1.5.1.2.NO.,
(3) PRMS Flags
Those stations in the PRMS Analytical Summary Report (Analysis
No. 3, October 1974) which had a quarter in calendar year 1973
as the last quarter for which data was available, were identified.
Of these stations, the number with a potential deficiency in
either magnitude or frequency flagged for each pollutant-standard
in the state was counted. The number of sites in each AQCR with
sufficient data for analysis for each pollutant was also calculated.
It should be noted that the number of sites flagged is counted
for each pollutant-standard analyzed; TSP, S0~, and CO each had
11-85
-------
two standards for which the analysis was done, while 0 had
one.
For each sub-Indicator, the range of values and scoring intervals
for all states, and computed values, scores and weights for all measures
under index 1 for a sample state are shown on Tables 11-11 and 11-12,
as instructed in Section B.
b. Index 2. PROGRESS
(1) Sub-Index 2.1. Meeting MBO Commitments
MBO commitments and outputs for the second quarter of FY75
were taken from the State Activity Report for the period ending
December 31, 1974. Calculation of sub-indicator values was done
as instructed in Section B.
(2) Sub-Index 2.2. Source Compliance
Data was drawn from the State Activity Report for the period
ending December 31, 1974. Progress from the beginning of FY75
(July 1, 1974) to the end of the second quarter of FY75 (December 31,
1974) was measured, as instructed in Section B.
(3) Sub-Index 2.3. Surveillance and Enforcement Actions
The number of field surveillance and enforcement actions
taken during the first two quarters of FY75 was taken from the
State Activity Report for the period ending December 31, 1974.
This was compared to the number of non-complying sources reported
at the beginning of the period (July 1, 1974). However, the
number of stack tests was not included in the State Activity
11-86
-------
Table 11-11. Converting Values to Scores
Index 1. GOAL ATTAINMENT
Measures
AQ Deviation
Improvement
Sub- Indicators
1.1.1.1. (a)TSP
1.1.1.2.(a)TSP
1.2.1.1. (a)S02
1.2.1.2. (a)S02
1.3.1.1. (a)CO
1.3.1.2.(a)CO
1.4.1.1. (a)0x
1.5.1.1. (a)N02
AQ Improvement
Sub- Indicators
1.1.1.1. (b)TSP
1.1.1.2. (b)TSP
1.2.1.1. (b)S02
1.2.1.2.(b)S02
1.3.1.1. (b)CO
1.3.1.2.(b)CO
1.4.1.1. (b)0x
1.5.1.1. (b)N02
Emission Reduc-
tion
Indicators :
1.1.2. TSP
1.2.2. S02
1.3.2. CO
1.4.2. HC
1.5.2. N02
PRMS
Indicators :
1.1.3. TSP
1.2.3. S02
1.3.3. CO
1.4.3. Ox
1.5.3. N02
Range of Values Used
Low
-225.0
Not comp
-152.0
Not comp
-521.3
Not comp
-525.0
Not comp
-59.5
Not comp
-71.1
Not comp
-20.4
Not comp
+24.1
Not comp
-63.3
-949.6
-471.6
-90.3
-233.3
+49
+ 4
+12
+ 5
(no PRM!
HiRh
+82.4
ited
+75.2
ited
+98.0
ited
+99.3
ited
+19.0
ited
+80.4
ited
ited
+45.2
uted "
+69.0
+41.8
+75.3
+63.3
+64.5
0
0
0
0
analysis)
No. of
States
36
6
21
7
5
28
1
2
55
55
55
55
55
47
45
21
8
Scale
A-Arith.
G-Geom.
G
G
G
G
A
A
I 0-mld-
/ -point;
break-
1 points
G
A
G
G
Value Ranges for Scoring Intervals
(Score- i)
Low to:
-71.30
r38.40
-211.65
-212.85
-39.88
-33.23
Only 1 vali
Only 2 vali
-20.00
-25.00
-25.00
-20.00
-20.00
+24.5
+ 3.1
+ 6.1
+ 2.1
(Score- 2)
to :
+5.55
+18.40
-56.83
-56.78
-20.25
+4.65
.
e (Md)
.
es (Va. , P«
0.00
0.00
0.00
0.00
0.00
+12.3
+ 1.1
+ 2.1
+ 1.1
(Score- 3 )
to:
+44:00 '
+46.80
+20.59
+21.26
-.63
+42.53
+20.00
+20.00
+20.00
+20.00
+20.00
+3.1
+0.1
+0.1
+0.1
(Score- 4)
fn;
+82.40
+75.20
+98.00
+99.30
+19.00
+80.40
+69.00
+41.80
+75.30
+63.30
+64.50
0
0
0
0
11-87
-------
STATE: Sample
Table 11-12. Scoring and Weighting
Index 1. GOAL ATTAINMENT
REGION:
Xeacure
1.1.1.1. (a)
1.1.1.1. (b)
1.1.1.2. (a)
1. 1.1.2. (b)
l.l.l.AAQI
I.I.2.E.R.
1.1.3.PRMS
1.1. TSP
1.2.1.1. (a)
1.2.1.1.(b)
1.2. 1.2. (a)
1.2. 1.2. (b)
1.2.1.AAQI
I.2.2.E.R.
1.2.3.PRMS
1.2.SO?
1.3.1.1. (a)
1.3.1.1. (b)
1.3.1.2. (a)
1.3.1.2.(b)
1.3.1.AAQI
I.3.2.E.R.
1.3.3.PRMS
1.3. CO
1.4.1.1. (a)
1.4.1.1.(b)
1.4.1.AAQI
1.4.'2.E.R.
1.4.3.PRMS
1.4.0./HC
1.5.1.1. (a)
1.5. 1.1. Cb)
1.5.1.
1.5.2.
1.5.3.
1.5. NO?
l.GOAL
ATTAINMENT
Sub- Indicator
Value Score We. Wtd.
Score
24.4 3 1.0 3.00
nc 0
nc 0
-1.3 2 1.0 2.00
nc 0
nc 0
87.8 4 1.0 4.0C
nc 0
nc 0
85.0 -4 1.0 4.00
nc
Indicator
Value Score Vt. Wtd.
Score
3.00 .40 1.20
-2.8 2 .30 .60
32(107) 1 .30 .30
2.00 .40 .80
-2.9 2 .30 .60
2(30) 2 .30 .60
4.00 .40 1.60
-5.4 2 .30 .60
3(6)' 2 .30 .60
4.00 .57 2.28
-7.7 2 .43 .86
0
nc
nc
nc
Sub-Index
Score Wt. Wtd.
Score
2.10 .25 .53
2.00 .25 .50
2.80 .25 .70
3.14 .25 .79
nc 0
Index
Score
2.52
11-88
-------
Report. Thus indicators 2,3,1, and 2,3,2, could not be computed.
The other indicators and sub^-indlcators were computed as instructed
in Section B.
(4) Sub-Index 2.4. Monitoring and Reporting Air Quality and
Emissions
For each pollutant indicator (2.4.1. to 2.4.4.; NO was
not computed):
(a) AQCR average percent of needed stations added sub-
indicators were computed as instructed in Section B. Data
was taken from the 1972 Monitoring and Trends Report for
the number of stations needed to complete the monitoring
networks, and from the 1973 trends report for the number
of stations added from 1972 to 1973.
(b) Percent of AQCRs needed stations that attained complete
networks sub-indicators were computed as instructed in
Section B. 1972 data for AQCRs needing stations and 1973
data for number of AQCRs that attained complete networks in
1973 were taken from the Monitoring and Trends Report for
1972 and 1973.
(c) The SAROAD sufficiency score for 1973, computed as
instructed in Section B, was based on data taken from the
May 1974 AEROS Status Report.
The sub-indicator for emissions reporting (2.4.6.1.) was not
computed because the missing data items report for the previous
11-89
-------
period (1972) could not be obtained, and thus the number of missing
items completed from 1972 to 1973 could not be computed.
The ranges of all values and values of scoring intervals for all
computed sub-indicators or indicators under index 2 are shown on Table
11-13. For a sample state the computed values, scores, weights and
weighted scores for index 2 measures are shown on Table 11-14.
c. Index 3. ACHIEVEMENT
(1) Sub-Index 3.1. Source Compliance
Data for source compliance status at the end of the second
quarter of FY75 is taken from the State Activity Report. Indi-
cators are computed per instructions in Section B.
(2) Sub-Index 3.2. Monitoring and Reporting Air Quality and
Emissions
For each pollutant indicator (3.2.1. to 3.2.4.; NO- was
not computed):
(a) AQCR average percent of required network sub-indicators
was computed as instructed in Section B, using the 1973
Monitoring and Trends Report.
(b) Percent of AQCRs with required network sub-indicators
was also computed using the 1973 trends report, per
instructions in Section B.
(c) Percent of pollutant-methods that are not unacceptable
was computed per instructions using the 1973 trends report.
No sub-indicator for TSP was computed, because the federal
reference method was used for all stations that reported to
II-90
-------
Table II- 13. Converting Values to Scores
Index 2. PROGRESS
Measures
HBO Commitments
Sub- Indicators :
2.1.1.1.
2.1.1.2.
2.1.1.3.
2.1.1.4.
2.1.1.5.
2.1.1.6.
2.1.1.7.
2.1.1.8.
Source Compli-r
ance
Indicators :
2.2.1.
2.2.2.
Surveillance &
Enforcement
Actions
Indicators :
2.3.1.
2.3.2.
2.3.3.
Enforcement
Sub- Indicators :
2.3.4.1.
2.3.4.2.
2.3.4.3.
Range of Values Usec
Low
-870
-2,100
-17,100
-1,900
-590
-3,300
0
0
-630
-1100
ot compu
ot compu
0
0
0
0
Hijjh
+4,000
+400
+200
+560
+12,450
+400
+21,100
+16,200
+430
+310
co-
ed
+100
+826.0
+36.1
+474.0
No. o:
States
55
55
48
48
48
Scale
A-Arith.
G-Geom.
k
00 +
0*9
100 -» 19
200 * +»
o» '+ 0
0 + 49
50 -» 99
100 + +">
G
G
G
G
Value Ranges for Scoring Intervals
-------
Table 11-13. Converting Values Co Scores
(continued) Index 2. PROGRESS
Measures
Monitoring &
Reporting
X of Needed
Stations Added
Sub- Indicators :
2.4.1.1.
2.4.2.1.
2.4.3.1.
2.4.4.1.
2.4.5.1.
Z of Needed
AQCRs Attained
Sub- Indicators :
2.4.1.2.
2.4.2.2.
2.4.3.2.
2.4.4.2.
2.4.5.2.
SAROAD Suffi-
ciency Score
Sub- Indicators :
2.4.1.3.
2.4.2.3.
2.4.3.3.
2.4.4.3.
2.4.5.3.
Emissions Rptg.
Sub- Indicator:
2.4.6.1.
Range of Values Used
Low High
-200
0
-25
-400
Not compv
-100
-100
-100
-100
Not compi
+33.3
0
0
0
Not compi
Not cpmpj
\l
+100
+100
+100
+100
ted
+100
+100
+100
+100
ted
+100
+100
+100
+100
ted
ted
No. of
States
29
43
21
3*
29
43
21
34
54
52
33
24
Scale
A-Arith.
G-Geom.
A
A .
A
A
A
A
A
A-
G
G
A
A
Value Ranges for Scoring Intervals
(Score- 1.)
Low to:
>
'
;
v
'
;
+83.33
,+49.99
+24.99
+24.99
(Score--2 )
to:
+49.9
+49.9
+91.67
+74.99
+49.99
+49.99
(Score- 3)
to:
+99.9
+99.9
+95.Q4
+87.49
+74.99
+74.99
(Score* 4)
tfit
+100.0
fiob.o
uoo.oo
noo.oo
noo.oo
HOO.OON
11-92
-------
T«bl« 11-14. Scoring and Weighting
Index 2. PROGRESS
STATE: Sample
REGION!
Meaaure
2.1.1.1.
2.1.1.2.
2.1.1.3.
2.1.1.4.
2.1.1.5.
2.1.1.6.
2.1.1.7.
2.1.1.8.
2.1.1.
2.1. Meeting Com.
2.2.1.
2.2.2.
2.2. Source Conpl.
2.3.1.
2.3.2.
2.3.3.
2. 3. A.I.
2.3.4.2.
2.3.4.3.
2.3.4.
2.3.Surv. & Enf.
2.4.1.1.
2.4.1.2.
2.4.1.3.
2.4.1.TSP
2.4.2.1.
2.4.2.2.
J.4.2.3.
2. 4. 2. SO,
2.4.3.1.
2.4.3.2.
2.4.3.3.
2.4.3.CO
2.4.4.1.
2.4.4.2.
2.4.4.3.
2.4.4.0,
2.4.5.1.
2.4.5.2.
2.4.5.3.
2.4.5.N07
2.4.6.1.
2. 4. 6. Em.
2. 4. Monitor In a
2. PROGRESS
Sub- Indicator
Value Score Wt.
39
27
-55
38
39
-200
62
130
2 .1
2 .15
1 .15
2 -15
2 -15
1 .1
2 .1
3 .1
Wtd.
Score
.2
.3
.15
.3
.3
.1
.2
.3
60
0
5
2 -2
1 -4
2 -4
.4
.4
.8
100
100
77.6
4 .35
4 .35
1 .30
1.4
1.4
..3
80
60
. 80.2
3 .35
3 .35
3 .3
1.05
1.05
.9
100
100
12
4 .35
4 .35
1 .30
1.4
1.4
.3
0
0
0
1 .35
1 .35
1 .30
.35
.35
.30
nc
nc
ne
nc
Indicator
Value Score Wt. Wtd.
Score
1.85 1.0 1.85
24 2 .51
17 2 .51
nc -0
nc .0
100 4 .2 .8
1.6 .8 1.28
3.1 .25 .78
3 .25 .75
3.1 .25 .76
1.0 .25 .25
nc 0
nc 0
Sub-Index
Score Wt. Wtd.
Score
1.85 .25 .62
2.00 .25 .50
2.08 .25 .52
2.56 .25 .64
Index
Score
2.28
11-93
-------
SARDAD. Only one station using an unacceptable method for
CO and only two states for SO- were reported.
Emissions reporting sub-indicators (3.2.6.1. and 3.2.6.2.)
were computed using a list of the number of sources on the NEDS
verification file as of May 1974, and the Missing Data Items
Report in the May 1974 AEROS Status Report.
(3) Sub-Index 3.3. Completing Plans and Revisions
The percentage of required SIP portions completed was com-
puted per instructions in Section B using information from the
latest available SIP Progress Report (January 1 to June 30, 1974).
Table 11-15 shows ranges of values and scoring intervals for
all index 3 sub-indicators or indicators, while Table 11-16 shows
computed values, scores, weights and weighted scores for a sample
state.
2. Need Indices
a. Index 4. PROBLEM
(1) Sub-Index 4.1. Ambient Air Quality Problem
For each pollutant indicator (4.1.1. to 4.1.4.; N0? was
not computed):
(a) Air Quality Deviation Indication (AQDI) sub-indicators
for the TSP and S0« primary annual standards were computed
as. instructed in Section B, except that in place of the TSP
annual geometric mean the 50th percentile values of the
frequency distribution for TSP stations were used, and in
11-94
-------
Table II-15. Converting Values to Score*
Index 3. ACHIEVEMENT
Measures
Source Complianc
Indicators :
3.1.1.
3.1.2,
Monitoring &
Reporting
% of Required
stations
Sub- Indicators :
3.2.1.1.
3.2.2.1.
3.2.3.1.
3.2.4.1.
3.2.5.1.
Z of AQCRs
Sub- Indicators :
3.2.1.2.
3.2.2.2.
3.2.3.2.
3.2.4.2.
3.2.5.2.
Pollutant-Methods
Sub- Indicators :
3.2.1.3.
3.2.2.3.
3.2.3.3.
3.2.4.3.
3.2.5.3.
Emissions Rptg.
Sub- Indicators :
3.2.6.1.
3.2.6.2.
Completing Plans
Indicator :
3.3.1.
Range of Values Used
Low
i 0
0
0
0
0
0
Not comp
0
0
0
0
Not corapi
(All meth
(Only 2 s
(Only 1 n
0
Not comp'
0
+64.2
+55.2
Hieh
+100
+100
+100
+100
+100
+100
ited
+ 100
+ 100
+ 100
+ 100
ed
da rep or tec
ates used v
thod report
+100
ted
+100
0
+100.0
No. of
States
55
55
55
55
55
55
55
55
55
55
were Fe
accepta
d was i
55
54
53
55
Scale
A-Arith.
G«Geom.
G
A
eral 'Ref e
le method
acceptabl
A
G
A
A
Value Ranges for Scoring Intervals
(Score-i )
Low to:
+49.9
+24.9
)
L +49.9
}
)
? +49.9
/
ence Methoc
, Fla. & tt
)
+49.9
+49.90
+24.09
+66.39 .
(Score- 2)
.^o:
+74.9
+49.9
+74.9
+74.9
)
nn.)
+74.9
+74.90
+16.06
+77.59
(Score- 3)
to;
+87.4
+74.9
+99.9
+99.9
+99.9
+87.50
+8.03
+88.79
(Score- 4 )
to:
+100.0
+100.0
+100.0
+100.0
+100.0
+100.0
0
+100.00
11-95
-------
Table 11-16. Scoring and Weighting
Index 3. ACHIEVEMENT
STATE: Sample
REGION:
Haaeure
3.1.1,
3.1.2.
3.1. Source Compl.
3.2.1.1.
3.2.1.2.
3.2.1.3.
3.2.1.TSP
3.2.2.1.
3.2.2.2.
3.2.2.3.
3.2.2rSO,
3.2.3.1.
3.2.3.2.
3.2.3.3.
3. 2. 3. CO
3.2.4.1.
3.2.4.2.
3.2.4.3.
3.2. 4. Ox
3.2.5.1.
3.2.5.2.
3.2.5.3.
3.2.5.N02
3.2.6.1.
3.2.6.2.
3.2.6.
3. 2. Monitoring
3.3.1.
3. 3. Completing
Plans
3. ACHIEVEMENT
Sub- Indicator
Value Score We. Wtd.
Score
100 4 .50 2.00
100 4 .50 2.00
nc 0
67 2 .50 1.00
67 2 .50 1.00
nc
100 4 .50 2.00
100 4 .50 2.00
nc
100 4 .35 1.40
100 4 .35 1.40
100 4 .30 1.20
nc
nc
nc
95 4 .50 2.00
3,3 4 .50 2/00
Indicator
Value Score Vt. Wtd.
Score
69r . 2 .50 1.00
47 2 .50 1.00
4.00 .175 .70
2.00 .175 .35
4.00 .175 .70
4.00 .175 .70
nc 0
4.00 .30 1.20
86.7 3 1.00 3.00
Sub- Index
Score Wt. Wtd.
Score
2. 00 .40 .80
3.65 .30 1.09
3.00 .30 .90
Index
Score
2.79
11-96
-------
place of the SO^ annual arithmetic mean the 70th percentile
values of the frequency distribution for SO stations were
used.
Frequency distributions were obtained from the preliminary
1973 Air Quality Data-Annual Statistics. Stations with less
than 15 observations during 1973 were eliminated; once ade-
quate data is available and annual means computed by SAROAD
on the basis of stations meeting SAROAD sufficiency criteria
are used, such an arbitrary elimination of stations will not
be needed.
The TSP and SO- annual AQDI were equal to:
/TSP 50th or SoA /TSP or SO primaryN
AQCR SES VOth % value I ~ \ , _ , , ]
r y _ \ annual standard /
^ ^ (TSP or SO primary annual standard)
SES
where \ = sum over all stations in an AQCR with TSP 50th
or S02 70th % value exceeding standard
AQCR
and £ = sum over all AQCRs in a state.
Note that this AQDI is the same as the air quality
deviation computed for AAQ improvement under index 1,
except that for the latter a given set of stations that
reported in both 1972 and 1973 was used, whereas in the former
all stations reporting in 1973 were used.
Short-term values were not available from SAROAD.
Therefore, the AQDI for the short-term standards (TSP and
SO- 24-hour, and CO 1-hour) were not computed. Also, substi-
tutions were necessary to compute the AQDI for the CO 8-hour
11-97
-------
(# of 1-hr, values^ X10Q
V exceeding std. /
total # of valid 1-hr.
values
/2nd highes£\
\l-hr.value /
(Std.)
Std.
and 0 1-hour standard. For these a product of the percent
X
of values exceeding standards and the megnitude of deviation
of the second highest value (since the standards are worded
in terms of exceeding standards more than once a year) is
used. Data was from the 1973 Monitoring and Trends Report.
The AQDI for the 0 primary 1-hour standard was equal to:
X
AQCR SES
I I
SES
where £ = sum over all stations in an AQCR with 2 or more
valid values exceeding the standard
AQCR
and £ = sum over all AQCRs in a state.
(Note: The number of values exceeding a standard is multi-
plied by 100 only to avoid numbers with a large
number of decimal places.)
The AQDI for the CO 8-hour standard is computed in the
same way. However, the second highest 8-hour average was
not accessible for the 1973 trends report. Thus the highest
8-hour average was used:
AQCR SES
I I
/# of 8-hr
\exceeding
total # of
av*s^\ x 100
std. /
8-hr. avgs.
/highest 8-hr\
V average /
Std.
(Std.)
^ .
SES
where £ = sum over all stations in an AQCR with 2 or
more 8-hr, averages exceeding the standard
AQCR
and £ = sum over all AQCRs in a state.
The AQDI for 'each pollutant can be corrected to account
for the percentage completion of the monitoring network in
each AQCR, by dividing the AQDI for each AQCR by the percent
11-98
-------
of the minimum required number of stations in the AQCR that
reported to SAROAD (see Section A, p. 11-30 for further
discussion of the correction factor). Both the uncorrected
and corrected AQDIs were computed for the trial run.
(b) Population in AQCRs with positive AQDI was computed as
instructed in Section B for the pollutant-standards for
which an AQDI was computed, namely TSP and S0_ annual,
CO 8-hour, and 0 1-hour. Population figures for state-
X
AQCR combinations were given in an NADB printout; however,
the population figures were for different years. Therefore,
the 1970 population of AQCRs, derived from the 1970 Census,
and printed in the OBERS projections for the AQCRs was used.
To derive state portions of the population of interstate
AQCRs, the percentage share of total AQCR population was
derived from the NADB printout and applied to the OBERS
figures.
(2) Sub-Index 4.2. Emissions and Emission Sources
Urbanized area population (1970) and urbanized (SMSA) land
area were derived from the 1972 Statistical Abstract of the U.S.
Projected population and manufacturing growth rates were computed
for the period 1970 to 1980 and 1969 to 1980, respectively, on
the basis of OBERS projections. Total emissions were drawn from
a NEDS printout of emission summary by state dated January 1975,
considered generally representative of 1973 emissions.
11-99
-------
(3) Sub-Index 4.3. Emission Reduction Needed
Because state emission goals were not yet available from the
SIP automated information system, the values for the indicators
under sub-index 4.3. were not computed.
In accordance with the instructions in Section B, the ranges of
computed values for measures under index 4 and the ranges of the scoring
intervals are listed on Table 11-17. Table 11-18 lists computed
values, scores, weights, and weighted scores for a sample state.
b. Index 5. OPERATIONAL REQUIREMENTS
(1) Sub-Index 5.1. Source Compliance and Enforcement
Indicator values were computed per instructions in Section B
from data in the State Activity Report for the period ending
December 31, 1974, and from the Dun and Bradstreet DMI file.
(2) Sub-Index 5.2. Monitoring and Reporting Air Quality and
Emissions
For each pollutant indicator (5.2.1. to 5.2.4.; NO- values
were not computed):
(a) Number of stations that need to be added to complete the
monitoring network was based on information from the 1973
Monitoring and Trends Report. Values were computed as
instructed in Section B.
(b) Number of AQCRs with less than the minimum required
network was computed to indicate the number of AQCRs whose
monitoring networks had to be completed. Data from the 1973
11-100
-------
Table 11-17. Converting Values to Score*
Index 4. PROBLEM
Meuure*
AAQ Problem
AQDI
Sub- Indicator! :
4.1.1.1. (a)TSP
4.1.1.1. (b)TSP
4.1.1.2. (a)TSP
4.1.1.2. (b)TSP
4.1.2.1. («)S02
4.1.2.1. (b)S02
4.1.2.2.(*)S02
4.1.2.2. (b)S02
4.1.3.1. («)CO
4.1.3.1. (b)CO
4.1.3.2. («)CO
4.1.3.2. (b)CO
4.1.4.1. (a)0j
4.1.4.1. (b)0x
4.1.5.1. ) Docorrected
11-101
-------
Table 11-18. Scoring and Weighting
Index 4. PROBLEM
STATE: Sample
EZGION:
Measure
4.1.1.1.
4.1.1.2.
4.1.1.3.
4.1.1.4.
4.1.1.TSP
4.1.2.1.
4.1.2.2.
4.1.2.3.
4.1.2.4.
4. 1.2. SO?
4.1.3.1.(
4.1.3.2.
4.1.3.3.
4.1.3.4.
4. 1.3. CO
4.1.4. l.(
4.1.4.2.
4.1.4.0^
4.1.5.1.
4.1.5.2.
4.1.5.N02
4.1.AAO Problem
4.2.1.
4.2.2.
4.2.3.
4.2.4.
4.2.5.1.
4.2.5.2.
4.2.5.3.
4.2.5.4.
4.2.5.5.
4.2.5.
4. 2. Em & Em.
Sources
4.3.1.
4.3.2.
4.3.3.
4.3.4.
4.3.5.
4. 3. Em. Reduc.
Needed
4. PROBLEM
Sub- Indicator
Value Score Wt. Wtd.
Score
a) 3.14 3
nc
18,934 4
nc
.50 1.5
0
.50 2.0
0
a) 4.03 3
nc
14,533 4
nc
.50 1.5
0
.50 2.0
0
a) 351. 06 4
nc
.50 2.0
0
16.861 4 .50 2.0
nc
0
a) 14.90 3
17.232 4
.50 1.5
.50 2.0
nc
nc
0
0
268.6 3
1026.9 3
5149.5 4
1273.2 4
990.3 4
.20 .60
.20 .60
.20 .80
.20 .80
.20 .80
Indicator
Value Score Wt. Wtd.
Score
3.5 .25 .88
3.5 .25 .88
4.0 .25 1.00
3.5 .25 .88
nc 0
14,267 4 .30 1.20
15,408 4 .10 .40
14.7 3 .10 ,30
49 1 .10 .10
3.6 .40 1.44
nc
nc
nc
nc
nc
Sub- Index
Score We. Wtd.
Score
3.64 .50 1.82
3.44 .50 1.72
nc 0
Index
Score
3.54
11-102
-------
trends report was used to compute values according to
instructions in Section B.
(c) Improvement needed in SAROAD sufficiency score was
computed per Section B instructions using the sufficiency
score computed for index 2 from the AEROS Status Report
(May 1974).
Sub-indicators for the emissions reporting indicator were
computed as instructed in Section B from a list of the number of
sources on the NEDS verification file as of May 1974, and from
the Missing Data Items Report in the May 1974 AEROS Status Report.
(3) Sub-Index 5.3. Completing Plans and Revisions
The number of SIP portions that need to be completed was
the number of portions declared deficient by EPA, including
those for which EPA had promulgated regulations, as reported
in the January 1 to June 30, 1974 SIP Progress Report.
For all computed measures under index 5, the range of values
and the ranges of the scoring intervals are listed on Table 11-19.
Values, scores, weights, and weighted scores for all computed measures
for a sample state are listed on Table 11-20.
State background information for the 5Q states and the District of
Columbia is presented in Appendix II-D. Population figures are from the
1970 Census and published reports based on the decennial census, unless
otherwise specified.
11-103
-------
Table IX-19. Converting Values to Score*
Index 5. OPERATIONAL REQUIREMENTS
Measures
Source Compliance
& Enforcement
Indicators :
5.1.1.
5.1.2.
5.1.3.
5.1.4.
Monitoring &
Reporting
No. of Needed
Stations
Sub- Indicators :
5.2.1.1.
5.2.2.1.
5.2.3.1.
5.2.4.1.
5.2.5.1.
No. of AQCRs
Sub- Indicators :
5.2.1.2.
5.2.2.2.
5.2.3.2.
5.2.4.2.
5.2.5.2.
Improvement in
SAROAD score
Sub-Indicators :
5.2.1.3.
5.2.2.3.
5.2.3.3.
5.2.4.3.
5.2.5.3.
Emissions Rptg.
Sub- Indicators :
5.2.6.1.
5.2.6.2.
Completing Plans
Indicator:
5.3.1.
Range of Values Used
Low
0
0
0
+390
0
0
0
0
lot compu
0
0
0
0
Hot compu
0
0
0
0
Not compi
0.
+344
0
High
J-727
+146
+185
+52,529
+26
+28
+10
+13
:ed
+10
+8
+4
+6
ted
. +66.7
+100.0
+100.0
+100.0
ted
+1,420
+46,921
+24
No. of
States
55
55
55
51
55
55
55
55
55
55
55
55
54
54
33
24
55
53
55
Scale
A-Arith.
G-Geom.
G
G
G
G
\ 0
)"
\ 0
/ 1
) 2
/ 3+
G
G
A
A '
G
G
G
Value Ranges for Scoring Intervals
(Score- l)
Low to:
+5.7
+2.3
+2.9
H204.7
0
0
0
0
0
0
0
0
+4.17
+12.50
+25.00
'+25.00
+22.2
+1,799.5
0
(Score- 2 )
to:
+22.7
+9.1
+11.6
+3648.7
+2.9
+2.9
+2.9
+2.9
+1.9
+1.9
+1.9
+1.9
+8.34
+25.00
+50.00
+50.00
+88.8
+5,166.1
+2.9
(Score- 3 )
to:
+90.9
+36.5
+46.3
+13,424.8
+6.9
+6.9
+6.9
+6.9
+2.9,
+2.9
+2.9
+2.9
+16.68
+50.00
+75.00
+75.00
+355.0
+23,632.5
+6.9
to;
+727.0
+146.0
+185.0
+52, 529. C
+26.0
+28.0
+10.0
+13.0
+10.0
+8.0
+4.0
+6.0
+66.70
+100.00
+100.00
+100.00
+1,420.0
+46,921.0
+24.0
11-104
-------
STATE: Sample
Table 11-20. Scoring and Weighting
Index 5. OPERATIONAL REQUIREMENTS
REGION:
Measure
5.1.1.
5.1.2.
5.1.3.
5.1.4.
5.1. Source Compl.
5.2.1.1.
5.2.1.2.
5.2.1.3.
5.2.1.TSP
5.2.2.1.
5.2.2.2.
5.2.2.3.
5. 2. 2. SO,
5.2.3.1.
5.2.3.2.
5.2.3.3.
5.2.3.CO
5.2.4.1.
5.2.4.2.
5.2.4.3.
5.2.4.0,
5.2.5.1.
5.2.5.2.
5.2.5.3.
5.2.5.N02
5.2.6.1.
5.2.6.2.
5.2. 6. Em.
5. 2. Monitoring
5.3.1.
5. 3. Completing
Plans
5. OPERATIONAL
REQUIREMENTS
Sub- Indicator
Value Score Wt. Wtd.
Score
0 1 .30 .30
0 1 .30 .30
18.8 4 .40 1.60
2 2 .30 .60
1 2 .30 .60
37.1 3 .40 1.20
0 1 .30 .30
0 1 .30 .30
28.6 2 .40 .80
1 2 .30 .60
1 2 .30 .60
25.0 1 .40 .40
nc
nc
nc
105 3 .50 1.50
3918 2 .50 1.00
Indicator
Value Score Wt. Wtd.
Score
174 4 .20 . .80
8 2 .20 .40
9 2 .30 .60
2317 2 .30 .60
2.50 .175 .44
2.40 .175 .42
1.40 .175 .24
1.60 .175 .28
nc 6
2.50 .30 .75
13 4 1.0 4.00
Sub-Index
Score Vt. Wtd.
Score
2.40 .40 .96
2.13 .30 .64
4.00 .30 1.20
Index
Score
2.80
11-105
-------
SECTION D. FORMATS FOR SYSTEM RESULTS
Page
1. Output Format #1 11-106
2. Output Format #2 11-106
3. Output Format #3 11-119
-------
D. Formats for System Results
The way the results of the system are organized and the degree of
i
summarization of the results depend on the uses to which the results will
be put. Three alternative formats for organizing system results are pre-
sented in this section.
(1) Output format #1 presents the computed values of the measures at
the lowest levels of aggregation for which values were calculated
(sub-indicators or indicators) for a state. Figures II-8a to II-8e
illustrate output format #1 for component values of indices 1 to 5,
respectively.
(2) Output format #2 allows examination of a state's scores for all
measures as well as all states' scores for a particular measure.
Figures II-9a to II-9e illustrate output format #2 for the scores
for indices 1 to 5, respectively, each containing all the components
of the index. Each column represents a component sub-indicator,
indicator, or sub-index of the index, or the index itself (see
Section A for discussion of each measure). The weight of each com-
ponent, expressed as a percentage of 100, is given in the parentheses
at the top of each column. The index score for each state is filled
in on line (1), each of the sub-index scores on line (2), each of
the indicator scores on line (3) and each sub-indicator score on
line (4).
Figure 11-10 represents a summary for all states of output
format #2 for scores for the 5 indices and 18 sub-indices. Each
column is a sub-index or index with the weights of the sub-indices
11-106
-------
Fig. II-8a. Output Format II, State Values for Index 1
M
H
I
O
STATE
1. GOAL ATTAIKMENT
1.1. TSP
1.1.1.
AAQI
l.l.l.(a)>Annual
3
H
I
A
£
H
14
I
X1
l-l
1.2. Em. Reduction
o
M
!
8 a
13
IX
ffl O
en
1.2. S02
1.2.1.
AAQI
2.1.1.(a)>Annual
2.1.1. (b)£ Annual
I
M
2.2. En. Reduction
!
X
i
« o
1.3. CO
1.3.1.
AAQI
3.1.1.(a>fl-Hour
i
00
XI
e
i-J
M
i
3
A
3
X'
3.2. En. Reduction
5
N
i
H W
M 0
1.4. Ox
1.4.1.
AAQI
tH
A
,
( 1
|
tH
1
4J
1
a
,
1
V
N
m oa
oo c
tfl 4J
M-t
a o
1.5. N02
1.5.1.
AAQI
5.1.1. (a)> Annual
1
X'
5.2. En. Reduction
o
V
N
!
M «
oo es
5 °
-------
Fig. II-8b. Output Format II, State Values for Index 2
STATE
2. PKOCHKSfl
2.1. MeetlnH Cororoltiafrnty
2.1.1. Source Compliance
«
w
3
a.
3
O
r *
X>
w
3
C.
1
r*
f*
u
u
3
&
*J
£
t*>
fs|
«
*
«
U
3
0.
4J
<§
V
r*
w
t
0
9
W
3
&
u
3
*n
IS
Compliance
r-*
2. Unknown Sources Whose
Status vas Determined
C4
2.3. Surveillance and
Enforcement
1. Process Inspection,
Opacity Observation
M
V)
*J
tf)
iS
8
w
V)
C4
3. Field Surveillance
by State
«N
2.3.4. Enf.
Actions
4.1. Notices of Violation
r*
4.2. Abatecent Orders
M
4.3. Court Proceedings
r4
2.4. Monitoring and Reporting Air Duality anJ Emissions
2.4.1. TSP
1.1. Z of Needed
Stations Added
es
1.2. Z of Needed
AQCRs Attained
f*4
1.3. SAROAD Sufficiency
Score
CN
2.4.2. S02
2.1. Z of Needed
Stations Added
M
2.2. Z of Needed
AQCRs Attained
CM
2.3. SAROAD Sufficiency
Score
r*
2.4.3. CO
3.1. Z of Needed
Stations Added
r>4
3.2. X of Kec-ded
AQCRs Attained
u i
°£
M <
M
t.3. EAROAD Sufficiency
Score
«N
2.4.5. N02
5.1. Z of Needed
Stall or.q Added
<*
5.2. Z of liccdcd
AQCRA Attained
»-^
X
u
c
«
u
*4
3
U1
9.
O t*
fX o
< o
L/l M
r4
2.4.6,
Ea9.
2.4.6.1. Z of Miming NRHS
Data Item* Conplcted
o
co
-------
Fig. II-8c. Output Format II, state Valuea for Index 3
STATt
3. ACHIEVEMENT
3.1. Source
Compliance
3.1.1. Z of Source* in Compli-
ance
i
3.1.2. Z of Non-Coeplylng
Sources on Schedule
3.2. Monitoring and Reporting Air Quality and Emissions
3.2.1. TSP
Monitoring
3
V)
9
O
.
! *^
CJ »
3 «
»! U
f-J
3.2.1.3. Z of Pollutant-
Methods Not t'n-
wrrontaSlp
.
3.2.2. S02
Mnnirorlne
S
CA
O
1^
3.2.2.2. Z of AQCRs with
Coaplete Xetvork
3.2.2.3. Z of Pollutant-
Methods Not Un-
3.2.3. CO
Monitoring
S
AJ
(A
IM
O
M
r?
3.2.3.2. Z of AOCPs with
Coeplete Network
3.2.3.3. Z of Pollutant-
Methods .Vot Un-
acceptable
3.2.4. Ox
Monitorlnft
S
n
£
O
M
2
2
y. *
< ** >
^ tn e
N
. Z of Pollutant
Method N'ot
2
3.2.S. N02
Monitoring
S
u
M
O
M
m
. 2 of AQCRs w/
Corplete
Network
»rt
IS?
s^t
0. T) t
-It
O *> i
s
3.2.6. Emt««ion*
Report inu
. Z of Source*
oa BEDS
£
2 t* C
IM
t R«vlilon«
3.3.1. Z of SIP
portion*
Coapl"."*
O
VO
-------
Fig. II-8d. Output Format II, State Values for Index 4
STATE
4. PROBLEM
4.1. Ambient Air Quality Problem
4.1.1. TSP
M
M
O
1
t
It
M
O
3
i
,-t
1*
X
-»
k<
O
o
w
3
M
t
S.
3
1
n.
ft.
i
0
a.
u
a
o
i
T
n.
o
CU
*?
o
o.
4.1.2. SOj
a
t:
O
'J
*-(
3
1
O
U
u
3
n
i
C-l
n)
O
C
p
t*
I
n
1
a
o
3
O
33
1
O.
O
p-
TJ
V)
1
UJ
4.1.3. CO
8
3
O
CO
(0
-, Uncorr.
1
00
a
3
3
, Uncorr.
3
.
i
s
o
I
4
3
O
S
i
a
0)
o
t
4.1.4.0,
0
1
(4
W
O
I
g
03
1
Jl
M
1
t-t
O.
£
1
O
*
4.1.5.NO,
U
8
1
1
. 5.1. (b) Annual, Uncorr.
1
&
O
V
o
4.2. Ealaslons t Em. Source*
A*
1
-
2. Urbanized Land Area
-
3
1
U
U
8-
0)
1
4.2
a,
t-i
.5.
£
Bmlsslom
8
(A
4.3. Rcduc. N«*d«d
H
8
U
EC
-
-------
Fig. II-8e. Output Format fl, State Values for Index 5
H
H
STATE
5.1. Source
Compliance t
Enforcement
5.1.1. Cnknmm Sources
5.1.2. Non-Complying Sources
Hot en Coepl. Schedule
5.1.3. Overdue Sources
5.1.4. Sources that Require
Field Surveillance
5. OPERATIONAL REQUIREMENTS
5.2. Monitoring and Reporting Air Quality and Rral onions
5.2.1. TSP
5.2.1.1. f of Stations Needed
5.2.1.2. 1 of AQCRs Kith Less
Than Required Network
5.2.1.3. Improvement Needed
In Sufficiency Score
5.2.2. S02
5.2.2.1. t of Stations Needed
5.2.2.2. 1 of AQCRs With Less
Than Required Network
5.2.2-3. Improvenent Needed
In Sufficiency Score
5.2.3. CO
5.2.3.1. 1 of Stations Needed
5.2.3.2. * of AQCRs With Less
Than Required Network
5.2.3.3. Improvement Needed
In Sufficiency Score
5.2.4. 0
5.2.4.1. I of Stations Needed
. f of AQCRs With Less
Than Required Network
.
ol
m
. Improvement Needed
in Sufficiency Score
.
tft
5.2.5. N(>2
L. t of Stations Needed
3
.. f of AQCRs With Less
Than Required Network
.
m
I. Improvement Needed
in Sufficiency Score
.
CM
5.2.6. Km. Rptg.
5.2.6.1. Sources on NEDS
Verification File
a
IA
g
fa
X M
r*
-------
fig. Il-la. Output PuriMt It, Rial* Bco»* for Index 1
M
M
M
M
NJ
(1)
(3)
(4)
(W.ijht.)
STATZ
(1)
(2)
(J)
(4)
<1)
(2)
(3)
(4)
(1)
(2)
O)
(4)
(1)
(2)
(3)
(4)
Index
Sub- Indices .
v!
\
n
**
^4
In.
S
lu
|^
3
r»
H
^
-
r,
<
-*
;
11
ub
U
>
<
V
3
T
4
ca
i.
*
1
^
.1.1. TSP AAQ Iti,-rovcf=ent (40) j
§
g
H
U
3
£
r*
s
u
M
tT- C
-*
'/*!-
trt
CM
C
£
a
H
ft
A
H
Indicators
Su
Indie
o
o
1
3 3
I 1
S!
A
**?
***
03 >(?)!
7
fa-
ll CO
j
V4
3
O
31
1
n
V.
A
r-i
H
r
-s
3
O
t*
£
"^
1
TM
U
M
I
»
1
I
C
O
H
£
8
**!
o
B
W
M
n
« c
a «
a »
*"!
^>
IA
a
u
1
«j
^
J
sT
Indicators
I
g
3:
00
S
A
flj
w*
Su
nd
N
D
3
H
#
t-i
1
OO
O
u
*J
Ul
3.1.2.(a)>CO 1-Hour fM ^ T
?
o
*i
o
3
_£
.3.1. CO AAQ Ic?rovcr.ent (40)
3.2. CO Ea. Reduction (30)
0
«
N
K I
0
3. CO Coal Attainment (25)
Indicators
'
^
1
|
o"
A
[
»
H
.S
o"
V
3
iH
.4.1. Ox AAQ Icprovec«nt (40)
o
c
o
u
1
&
o
N
i ^
c »
1 f
tfl
N
jj
n
01
y
ra
fct
*J
<
.H
ffl
s
o"
Indicator*
'
3
i
CM
i
*-4
,5.1.1.(bXN02 Annual
4J
C.
V
II
b
M
n.
11
cr
f
.5.2. K02 Ea. Reduction
w
9 r:
tfl 4J
$ -
8*5
.5. KOj Coal Attainment (0)
COAL ATTAINHEHT
.
-------
II-9b. Output Form.it 12, State Score* for Index 2
(1)
(2)
O>
(4)
OtaifJtt*)
STATE
U)
(2)
(3)
(4)
(1)
(2)
O>
(4)
(1)
(2)
(J)
(4)
(1)
(2)
(J)
(4)
(1)
(2)
0)
(4)
(1)
(2)
0)
<»>
Ind»x
Sub- Ind leas
Indicators
Sub-
dO)
*4
1
I
»4
»
Indlc
(IS)
^
u
9
O.
c?
rl
n
ators
(15)
S
i
&
IN
(15)
o
+
*H
S
O.
i!
IN
(15)
. *M
4
a
i
a.
3
O
M
(10)
*H
f-l
W
9
Cb
^
N
(10)
H
W
M
»H
W
9
O.
S
W
(10)
iH
^
5
S
S
(B
M
(100)
U
9
T*
1
Z
h
1
r4
L. Meeting NBO ^
Commitment* »
«s
1.1. Hon-Conplylng Sources ^ ^*
Brought Into Cornel. S £
s
3.1. Process Insp., ^ _
Opacity Oba. 3 a.
r4
cato
(0)
w
5
s
u
w
*M
(N
r«
(20)
^
0 2
S"
&£
F>
M
J.4.1. Notices of 0 _
Violation S g.E
r>4
J.4.2. Abateaent "
Orders 3
N
rs
C.O)
0
M «
Ou. BO
r:
t-l T-4
u t)
3 W
O V
u u
r*\
vf
tN
(.4. Enforcement ^
Actions o
(N
*
-------
Fig. II-9b. Output; For=at *2, State Scores for Index 2 (Continued)
H
M
I
(1)
(2)
(3)
(4)
(Weights)
STATE
tt)
(2)
(4)
(1)
(2)
(3)
(1)
(2)
(3)
M
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
a;
(2;
a>
Index
Sub-Indices
Indicators
Sub-
fSS)
tj -c
.
< 2
^
rsi
(25)
90
e
o
i
s
1
CM
Sub-
Inil
(35)
w a
O C
X W
*
-
2. X of Needed Q 't!
AQCRfi Attained »o
*
«
,
(30)
M >>
tf. U
W «M
*
~
Of Monitoring g
*
C4
Sub-
Indi
(35)
11
".
~
cacor
(35)
s|
a <
1
~
a
(30)
«
h*
O
1 O
*M U9
3
CA >»
11
CO %«
"I
«
s
2.4.5. N02 Monitoring ^
o
9
iil
o|]
0
C4
g
2.4.6. Emissions Rptg. S
2.4. Monitoring & Reporting *J
A() t Enlaalon* Gj
2. PROCKESS
-------
ll-9c. Output Format 12. State Scorea for Index 3
(1)
(2)
(3)
(Weight*)
STATE
(1)
(2)
(3)
(1)
(2)
(3)
(1)
(2)
(3)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(*)
Index
Sub-Indict*
3.1.1. t of Sources g |[
In Compliance ^ o"
3.1.2. t of (ion-Complying g o
Sources on Schedule "^
3.1. Source Compliance *
N^
Sub-
OS)
c
9.
O
M
3.2.1.2. X of AQCRa with 2 JL
Cornlete Network *-" K"
3.2.1.3. : of Pollutant £ »
Kethois Not 3 o
Unacceptable - *
00
c
M
M
O
D«
VI
1-
4
*N
f*>
Indicators
Sub-
OS)
a
u
U)
o
*H
Indie
(35)
$5
>l U
M
rn
3.2.2.3. X of Pollutant ^
Methods Not g »
Unacceptable **
(17.5)
M
a
0
o*
1-4
Sub-
OS)
8
«
Si
«M
O
M
3.2.3.2. X of AqCRs with g oL
Corplete Netvork ** 5f
3.2.3.3. X of Pollutant ^ R
Methods Not g 8
Unacceptable -' *
3.2.3. CO Monitoring £
***
Sub-I
(35)
S
J
in
9.
o
in
3.2.4.2. X of AQCRs with u ^
Complete Netvork > £
3.2.4.2. X of Pollutar.t *-. S
Methods Not § 5
Unacceptable ""
3.2.4. 0^ Monitoring £
U1
Sub-
OS)
S
tn
o
4
CS
cn
Indie
(35)
C »-"
^ o
a1"
U u
S'S
M U
in
3.2.5.3. X of Pollutant £ S
Methot Not o °
Ur.ac:eptable ~
3.2.5. ND2 Mcnltorlng S
Sub-
(50)
U
Jl/l
R
o
Ind.
(50)
*lona
3.3.1. X of SIP Portions 5 E 1
Cwnpl«t«) S ' 1 1
3.3. Completing Plan* t ^ 1
kevlslcn* »» |
3. ACMIeVEMlW
Ul
-------
Fig. II-9d. Output Fornat 12, State Score* for Index 4
M
M
M
M
O\
u>
(3)
STA7K
(U
(2)
O)
(4)
(2)
(3)
(4)
(1)
(2)
(4)
(U
(2)
(4)
Jndux
S.
b- Indices
Indicator*
Si
1
.
b-
S
t.
"
Indl.
1
S
p
*.
TJ>;
f
-
-
y
-
-T
»
O
o
r
i.
r
-t
r
t:
r.
T*
O
X
-ff
0
X
t
-T
r»
f.
-
c.
y.
ut
§
&.
W)
Suh-Indl.
g
;
S
"
^.J
j:
c:
1
,'
-
o
u
»
>
j;
.t
-
-
0
V.
t-.
»'
j
r-j
-
O
3
r:
r.
..,
o
Ul
-
o
1
fl
.,
o
n
X
U)
-
(M
«N
Sub-Indl.
ol
*
t.
o
1
to
M
O
k-
"
|;
^
O
k-
t"
o
I
r-4
rt
o
u
-
('
c
0
o
u
o
t/i
g
u
3
£
lad
S"
ICi
c
._..
lor
u
K
M
fN
O
X
: !:
-
O
V
n
tu
tl
x.
G
O
J
u
3
T)
-.1
PKOULM
-------
Fig. II-9.. Output Format 12, Stmtf Scorn* for Index 5
M
M
(1)
(2)
(3)
(4)
(W.lght.)
STATE
(1)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(1)
(2)
(3)
(4)
Index
Sub-Indices
i.l.l. I'r.knovn Sources (20) 7
dlcat
0
tt
«
o
41
s
.**
I
(Q
VI
O
fM
N
0
m M
m p
Ol
-c y,
u b
»*-( C
O cfl
(M
2.3. Icproveoeac Seeded in -,_.
Sufficier.cv Score l U;
N
,
m
c
(M
O
(A
CM
«M
Sub-Indl.
0
o
ai
01
41
y,
tn
SH
kl
V)
O
»N
.0
(ft lJ
(A O
JS?.
>u
£ ^,
t Tl
:< 4>
U U
O1 01
< a-.
CM
3.3. Irreverent Needed in f,M
Sufficier.cv Score l u'
~
IA
8
Sub-indi.
t;
v
01
0>
IT",
cn
c
n
VI
«*.
o
"*.-
9
J*
M W
5 3
H T)
:* o>
tH
cy
-------
Fig. 11-10. Output Format 12, State Scores for All Indices and Sub-Indices
V
oo
(1)
(2)
(1)
(2)
(I)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
Indices
Sub- Indices
(25)
o.
tfl
H
*<
(25)
s"
"
(25)
o
u
«
(25)
0M
«
(0)
^
«
B
COAL ATTAISM
Sub- Indices
(30)
n
u
d
MBO Coamicme
r*
(20)
u
V
u
§
V)
n
(20)
0
Surveillance
Enf. Actions
m
(30)
afi
H
O
-
(A
£
Sub- Indices
(40)
1
1
fH
(30)
M
U
U
1
«
(30)
1
Completing P
«n
ACHIEVEMENT
Sub-Indices
(35)
AAQ Problem
-
(30)
Emissions &
Em. Sources
N
(35)
ii
Needed
L.-Spcci
. .-4
-;
S
DO
O
ft.
Sub-Indices
(40)
5
U C
i:
j
(40)
to
u
1
N
(30)
3
Coop let ing P
«n
OPERATIONAL
REQU1REMESTS
10
-------
within each index in parentheses at the top of the column. The index
scores for each state are given on line (1), and sub-index scores
on line (2).
(3) Output format #3 shows a frequency distribution of the number of states
that had computed values or scores within designated intervals, for
any given sub-indicator, indicator, sub-index, or index. Within
this distribution states falling into each interval can be identified.
Figures II-lla to II-llg give the frequency distributions of all
states in the trial run for a sample of measures (sub-indicators
1.1.1.1.(a) and 1.1.1.1.(b), indicators 1.1.1., 1.1.2., 1.1.3.,
sub-index 1.1., and index 1).
11-119
-------
24 -
20 -
16 -
12 -
Number
of
States
(Arranged
Alphabetically
Within 4
Intervals)
8 -
Alas.
D.C.
'. Hi-
Mass.
' Minn.
Neb.
Colo.
Del.
Id.
111.
lo.
Kan.
Mo.
N.J.
N.C.
S.C.
Tex.
Va.
Ala.
Ariz.
Ark.
Ky.
Md.
Mich.
Nev.
N.Y.
Oh.
Okla.
Tenn.
Wash.
Fla.
Ga.
Ind.
La.
N.M.
Ore.
-225.00
-71.30
+5.55
+44.00
+82.40
Improvement in Air Quality Deviation
Values for Sub-Indicator 1.1.1.1.(a) TSP AQ Deviation Improvement
(>Annual Standard)
(Geometric Scale)
Fig. II-lla. Format #3, Frequency Distribution
II- 120
-------
24 -
20 -
16 -
12 -
Number
of 8 -
States
(Arranged
Alphabetically
Within 4 4
Intervals)
Me:
-59.50
N.D.
N.H.
Conn.
R.I.
-39.88 720.25 -0.63
% Improvement in Air Quality
+19.00
Values for, Sub-Indicator l.l.l.l.(b) TSP AQ Improvement
(
-------
24 --
20 --
16 -
12 -
Number
of
States
(Arranged
Alphabetically
Within 4
Intervals)
8 -
Alas.
B.C.
Hi.
Me.
Mass.
Minn.
Neb.
Colo.
Del.
Id.
111.
lo.
Kan.
Mo.
N.H.
N.J.
N.C.
N.D.
S.C.
Tex.
Va.
Ala.
Ariz.
Ark.
Ky.
Md.
Mich.
N.Y.
Oh.
Okla.
Tenn.
Wash.
Conn.
Fla.
Ga.
Ind.
La.
Nev.
N.M.
Ore.
R.I.
Scores for Indicator 1.1.1. TSP AAQ Improvement
(The higher the % AAQ improvement, the higher the score)
Fig. II-llc. Format #3, Frequency Distribution
11-122
-------
Number
of
States
(Arranged
Alphabetically
Within
Intervals)
24 -
20 -
16 -
12 -
-
4_
Alas.
Ariz.
Id.
N.Y.
P.R.
V.I.
Ark.
Guam
Hi.
lo.
Ky.
La.
Miss.
Mo.
Mont.
N.M.
Oh.
Okla.
S.D.
Wash.
Ala.
Colo.
Conn.
Del.
Fla.
Ga.
111.
Ind.
Me.
Mass.
Mich.
Minn.
Nev.
N.H.
N.C.
N.D.
R.I.
S.C.
Tenn.
Tex.
Ut.
Vt.
Wise.
Wyo.
Cal.
D.C.
Kan.
Md.
N.J.
Ore.
Pa.
Va.
-63.30
-20.00
+20.00
+69.00
% Emission Reduction
Values for Indicator 1.1.2. TSP % Emission Reduction
Fig. II-lld. Format #3, Frequency Distribution
11-123
-------
24 -
20 --
16 -
12 -
Number
of
States
(Arranged
Alphabetically
Within
Intervals)
Ala. (103)*
Colo. (108)
' 111. (104)
Ind. (136)
Mich. (199)
' N.Y. (449)
' Okla. (107)
Tex. (112)
Id. (27)
Ky. (134)
Md. (110)
Minn. (105)
Mo. (62)
Oh. (213)
Va. (73)
Ark. (43)
Ariz. (32)
Del. (28)
Ga. (31)
lo. (10)
Kan. (62)
Neb. (39)
Nev. (31)
N.J. (142)
N.M. (26)
N.C. (64)
Pa. (25)
P.R. (10)
S.C. (60)
Alas. (10)
Cal. (26)
Conn. (55)
B.C. (4)
Fla. (45)
Hi. (26)
La. (12)
Mass. (65)
Miss. (2)
N.H. (6)
Ore. (3)
R.I. (15)
Tenn. (32)
Ut. (2)
Vt. (2)
Wash. (30)
W.V. (2)
Wise. (5)
4 -
+49.0
+24.5
+12.3
+3.1
Number of PRMS Flags (Sites with Potential Deficiency)
Values for Indicator 1.1.3. TSP PRMS Flags
(Annual and 24-Hour Standards)
(Geometric Scale)
* Number of stations in each state with sufficient
data for PRMS analysis are in parentheses.
Fig. II-lle. Format #3, Frequency Distribution
11-124
-------
24 4
20 4
16 4
12 4
Number
of 84
States
(Arranged
Alphabetically
Within 4
Intervals)
1.0
Alas.
Id.
Minn.
Neb.
NY
Ariz.
Colo.
Guam
Hi.
111.
lo.
Ky.
Me.
Mich.
Mo.
Mont.
N.D.
Oh.
Okla.-
S.D.
Tex.
Ala.
Ark.
Del.
D.C.
Ind.
Kan.
Mass.
N.H.
N.J.
N.C.
S.C.
Va.
Ga.
La.
Md.
Miss.
Nev.
N.M.
Tenn.
Wash.
W.V.
Cal.
Conn.
Fla.
Ore.
Pa.
P.R.
R.I.
Ut.
Vt.
Wise.
1.5
2.0
2.5
3.0
3.5
4.0
Scores for Index 1, Goal Attainment
(The higher the extent of goal attainment, the higher the score)
Fig. II-llf. Format #3, Frequency Distribution
11-125
-------
Number
of
States
(Arranged
Alphabetically
Within
Intervals)
-
24 -
-
-
-
20 -
"
16 -
""
12 -
8 ,-
4 -
.
Ark.
Id.
Ala.
Guam
111.
lo.
Mich.
Mo.
Mont.
N.Y.-
N.D.
Wyo.
Ark.
Colo.
Del.
D.C.
Hi.
. Ind.
La.
Kan.
Ky.
Me.
Mass.
Minn.
Miss.
Nev.
N.J.
N.C.
Okla.
Oh.
S.D.
Tenn.
Tex.
Ut.
Va.
Wash.
W.V.
Wise.
Ariz.
Gal.
Conn.
Fla.
Ga.
Md.
Neb.
N.H.
N.M.
Ore.
P.R.
R.I.
S.C.
Vt.
Pa.
1.0
1.5
2.0
2.5
3.0
3.5
4.0
Scores for Sub-Index 1.1. TSP Goal Attainment
(The higher the extent of TSP goal attainment, the higher the score)
Fig. II-llg. Format #3, Frequency Distribution
11-126
-------
SECTION E. ISSUES AND PROBLEMS
1. Uses of the System 11-127
2. Data Base 11-129
3. Weighting System 11-131
4. System Flexibility 11-134
5. Feasibility of Automation 11-135
-------
E. Issues and Problems
1. Uses of the System
An important question regarding the system is the use(s) to
which the results of the system are put. Possible uses to which system
results can and cannot be put are discussed here.
A word first on what the system is not intended to be. Because the
measures used in the system were developed on the basis of data from
existing reporting systems currently or soon to be available to EPA
headquarters, the system is not intended to be a comprehensive evaluation
of any state. In fact, it is well recognized that such a comprehensive
evaluation needs two things which this system lacks:
(1) Additional and more detailed information than is available
from existing data systems at EPA headquarters. Because of
the constraint of using existing data systems available to
EPA headquarters, there are important aspects of control
activities or need that are not treated simply because there
are no data routinely available at headquarters.
(2) Judgment by EPA staff at the regional office le-«el
familiar with the particular problems and unique circumstances
facing an individual state, needed to interpret and put into
perspective the numbers generated by the system. Numbers alone
can be misleading because of errors in data collection and
processing, because of extenuating or unique circumstances,
because of unforeseeable difficulties or any of a number of
other reasons. Indeed, the results of the system, whatever
the uses to which they are put, should be used in light of
11-127
-------
interpretive comments and explanations from knowledgeable
regional personnel.
On the other hand, numbers can shed light on the air pollution control
situation and can indicate trends and problem areas. The system described
in this report is one way of consolidating, organizing, summarizing, and
presenting in a coherent framework the enormous amounts of data routinely
reported to EPA headquarters. It provides an overall view of state control
performance and need, and makes explicit the relative importance of the
various program areas and aspects considered.
The system can be implemented by headquarters or regional office
personnel for any group of states (all states in the country, all states in
a region, states sharing certain characteristics, etc.). The individual
measures and the evaluation framework can be used for an individual state,
although the scoring, weighting, and aggregation procedures were designed
to enable some comparisons among states. The system can be implemented at
regular time intervals to allow assessment of trends, rates of improvement,
and changes in relative standings.
The system can be used as a:
(a) Method of painting a broad picture of national status and trends
in selected air pollution control program areas, pointing out
problem areas, and quantifying deficiencies. Results can be
one input into setting priorities for program planning, and
allocating resources among program areas.
(b) Method of ascertaining state status within a national picture
and indicating the geographical distribution of problems. Results
11-128
-------
can be one input into setting priorities and allocating resources
among geographical areas.
(c) Flagging mechanism to point out to EPA headquarters possible
program areas needing additional investigation with regard to
nationwide air pollution control performance and need, and to
EPA regional offices and states with regard to state progress
within national efforts and trends.
(d) Method for feedback conerning the validity and efficiency of EPA
information systems and reporting requirements, and problems
regarding data collection procedures, definitions of terms, and
data flows.
2. Data Base
The results of the system are only as good as the data from which they
are derived. During the development of the system data items and data
systems were evaluated as to the validity, accessibility, completeness,
timeliness, and stability of the data. These criteria and general findings
are discussed in Volume 1, Section B.
With the development of the system and the trial run completed, several
questions and problems relating to the data base remain, and deserve
discussion:
(a) Data Validity; There was general agreement among all EPA personnel
involved that data: (1) were not as "hard" as was desired, (2)
would improve in validity in the future with additional EPA
clarification and guidance, and increased state and local
experience, (3) had to be used because there were no other data
with a higher degree of validity, and (4) should be used carefully
with their validity limitations firmly in mind.
11-129
-------
(b) Data Availability and Completeness; Varying amounts of data are
available from the states. Since calculation of values of the
measures depends on the existence of data, this means that for
many measures there will be some states for which values cannot
be computed because of the lack of data. For these states
weights for other component measures must be reallocated. This
situation can be expected to improve in regard to data required
of all states.
In addition, even where there is some data available and
values can be computed, the varying amounts of data available
for the states affect the computed values and scores. Thus a
state that submits little air quality or emissions data relative
to what should be submitted will show less air pollution or
emissions than should actually be shown. This problem is sometimes
addressed to some degree, e.g., by the use of a correction factor
for air quality deviation or by showing how many sites in relation
to minimum required numbers were analyzed by PRMS. Also, scores
for any air quality deviation or emissions measure for a state
should be looked at in light of the state monitoring and reporting
score.
(c) Data Timeliness; Some data, such as data in NEDS and SAROAD, are
subject to considerable time delay. Even if states and regions
meet all reporting deadlines, there is a significant lag between
the time for which data are relevant and the time data are available
at headquarters. Thus areas recognized as problems as a result of
implementing the system may have already been improved in the
11-130
-------
intervening time between data collection and implementation of
the system. (This is less of a problem for MBO outputs data,)
Some improvement on this point is possible (witness the
proposal for the establishment of AAQ trend stations for which
data would be available much more quickly). However, this will
probably continue to be a problem, making it even more important
to obtain comments from regional personnel familiar with the
current situation in a state.
(d) Varying Time Periods; The time lag between data collection and
availability varies for different data systems and data items.
For example, NEDS data currently available from NADB are generally
representative of calendar year 1973 and early 1974, while MBO
data for the second quarter of FY75 are available. This is not
considered a significant problem as long as data applicable to
different time periods are not combined within an individual
measure. Also, if two or more states are to be compared, the
same time periods should be used for all states.
3. Weighting System
The system is designed to facilitate assignment of weights by the user
to accommodate the user's priorities and subjective judgment. In fact, the
second step of the methodology involves setting the weights.
There are two reasons that weights may vary. First, it may be
desirable to weight some measures, especially pollutant-related measures
such as monitoring and reporting for each pollutant, according to the
severity of the problem for each pollutant. Thus for one state with a
severe oxidant problem and less severe problems with the other pollutants,
11-131
-------
measures for the monitoring of oxidants may be weighted much more heavily
than the measures for monitoring the other pollutants. Weights for HC
emissions, population exposed to 0 air quality deviation, and population
X
growth rate may also be relatively larger than those for other emissions,
population exposed to other pollutants, and manufacturing growth rate,
respectively. In this case, certain weights may vary from state to state
according to a predetermined scale relating severity of problem to weights
of measures.
Second, weights for certain measures, especially those related to
separate program areas, vary according to priorities. For example, a
regional office may assign weights for source compliance, monitoring,
and completing plans and revisions according to the relative priority of
these program areas. However, if the regional office is implementing the
system for all its states and is interested in comparing its states, weights
relating to relative priorities should not vary from state to state. If
priorities change and the system is implemented another time, these weights
can be altered accordingly.
One additional problem in weighting measures relates to a point
mentioned earlier in discussing data availability. If the value of a
particular measure cannot be computed for a particular state either (a)
the state must be given a score of 0 with the same weight for that
component as is applied for the other states being compared, or (b) the
weights for that state must be reassigned to the remaining components for
which values can be computed, in the same proportions as the original weights-
The consequences of each alternative are illustrated below:
A given indicator is composed of three sub-indicators, each
with a weight of 1/3, State A has sufficient data for
11-132
-------
all three sub-indicators and scores of 2, 3, and 1 are
computed, while for state B values for only two of the three
sub-indicators can be computed and the scores for these are
the same as for state A, i.e., 2 and 3. Under alternative
(a) the scores for the indicator are:
State A: (2) (1/3) + (3) (1/3) + (1) (1/3) = 2
State B: (2) (1/3) + (3) (1/3) + (0) (1/3) = 1 2/3
Under alternative (b) the scores are:
State A: (2) (1/3) + (3) (1/3) + (1) (1/3) = 2
State B: (2)(1/2) + (3)(1/2) = 2 1/2.
There are two reasons that values for a particular measure for a state
cannot be computed:
(1) For some measures, no value need be computed. For example,
progress in adding needed stations is not computed if no
stations are needed. Or, ambient air quality values for a
particular pollutant are not reported if a state is not re-
required by federal regulations to maintain a station for
that pollutant.
(2) Data that is required of a state is not reported to EPA
headquarters. For example, a state required to have a
number of stations for a given pollutant reports no ambient air
quality data.
It was felt that a state should not be penalized if a value was not
computed for the first reason. Therefore alternative (a) was appropriate
11-133
-------
in these cases. If a value for a particular measure could not be com-
puted for the second reason, it was felt that no conclusion could be
made about state status in regard to the measure. Moreover, the
failure to report required data would be reflected in a state's scores
for the reporting measures. Once again alternative (a) was considered
appropriate. Thus, it was decided that in all cases where a value for
a measure could not be computed, weights would be reallocated among the
remaining components.maintaining the same proportions among the weights
of these remaining components as existed originally.
4. System Flexibility
A concerted attempt was made to make the system flexible in order
to meet as many user needs as possible; It is recognized that different
users of the system will be interested in different indicators and different
levels of aggregation. Therefore the system is structured to allow
presentation of results at various levels of aggregation and in different
formats. It is possible to use the framework of measures to compute
values for sub-indicators and indicators without converting values to
scores and weighting and aggregating scores to obtain scores at higher
levels of aggregation.
11-134
-------
5. Feasibility of Automation
Due to the nature of this objective system and its total reliance
on information systems already automated for other purposes, the potential
exists for expansion to a 100% automated, quick turnaround system complete
with periodic updates, generation of reports, and other features of com-
puterized management information systems. The purposes of this section
are to point out the numerous factors which determine the costs and benefits
of such a system as well as its manual counterpart and other alternative
approaches, and to make recommendations based on the importance of these
factors to the system objectives.
This study was not intended to be a complete, in-depth analysis of
the costs and benefits, or the impact of the system on its users. It is
solely a preliminary study which, though not totally unconcerned with
detail, does concentrate on major points and can be of benefit in narrowing
down the possible approaches and in deciding upon the final method for
long term system use.
Some of the factors which are considered in the cost-benefit analysis
of the approaches are the following:
frequency of update
system utility-user access requirements
need for flexibility
probability of major system modification
linkages among existing automated systems
accuracy of calculations
lag time in data flow
manpower needed-developmental and operational
hardware and software needed.
11-135
-------
These factors are used as a basis for analyzing three alternative
approaches: (1) continuation of manual preparation, (2) complete auto-
mation, and (3) partial automation. Manual preparation is defined as any
method of data reduction and analysis using no computer support other than
a desk model calculator or programmable calculator. Complete automation
is defined as the method which minimizes human contact with the data by
allowing all current information systems to feed into another system which
automatically calculates the necessary indices and generates the correct
user-requested report. Partially automated refers to the automation of
specific aspects of the system which may be more amenable to automation
than others.
The three approaches to automation are discussed below.
(a) Complete Automation; The complete automation of the system
would require a large scale effort, one for which the hardware
is not totally available at the present time (linkage between
CDS and AEROS computer facilities is planned but not currently
functional). Information must be assembled from a variety of
existing systems, and indices must be calculated. Two distinct
computer facilities are involved, one at OSI in Bethesda, Md.,
and the other at Research Triangle Park, N.C. The coordination
and retrievals from the various data files may be an expensive
and time-consuming task.
The data bases involved in the evaluation system include
SAROAD,'NEDS, CDS, and EPA Formal Reporting System, as well as
supplementary bases such as PRMS. The assemblage of these data
for use in an automated system would require the creation of special
11-136
-------
tapes from each system with the required information and subsequent
processing of the data. This process would have to take place
at appropriate update times.
The advantages of automation are its accuracy, speed, and
quick turnaround. Operational costs are minimized if program
modifications are not extensive. Users therefore do not have to
wait, and reports are generated upon request. The quick response
allows for some experimentation with varied weighting factors
and their bearing on the final outcome. It also means that the
data used are the most recent. It is not necessary to wait for
data tabulations or reports to be published.
The primary drawbacks of a completely automated system are
its lack of flexibility and its high initial cost in program
development. If the system is fairly stable over time, and it is
to be used for many years in a similar way, the high initial costs
are amortized and the system will begin to pay for itself. In
the case of the state status system, with expected yearly modi-
fications to system parameters, constants, and possibly even
reports, operational costs of program modification would greatly
extend the amortization period.
(b) Manual Preparation; The method currently used to prepare the
data is 100% manual. Data are collected in hard copy forms
from reports generated by existing automated systems, required
reports (SIP Progress Report, Trends Report) and other reports
intended for other purposes (Quarterly AEROS Status Report).
11-137
-------
Data needed for index calculation are extracted and, using
appropriate weighting factors, indices are calculated with
the assistance of preprinted calculation forms. The test run
of this method for all states required approximately 328 hours
of calculation. Future number of hours necessary to complete
one update will depend upon expansion or contraction of numbers
of indices and the potential availability of data which are
currently required but are not currently being reported.
There are a number of distinct advantages in this approach.
The main one is its adaptability to changing indices, changing
weights, or variations in methods of calculation. The method
can be varied in any manner to suit the analyst's requirements
without a need for additional updating expense. Also, since it
uses existing hard-copy output, no expense is incurred by having
to create data tapes, coordinate their acquisition, or write
programs to read them into the computer. Another advantage is
that highly trained manpower is not needed to operate the system.
With detailed instructions and calculation sheets, the data
preparation phase can be completed by anyone knowledgeable in
arithmetic and the use of a calculator.
The manual method also has its disadvantages. It is time-
consuming. If an analyst would like to determine the effect of
changing some weights or input parameters, he or she must wait
the time necessary to recalculate indicators or indices. Sacri-
fice in accuracy is a major disadvantage of any manual, repetitive
11-138
-------
data reduction operation. The effects can be ameliorated to
some extent by establishing a relatively modest quality control
or sampling procedure looking particularly at expected ranges
of values or recalculating a random sample of necessary calcu-
lations .
There is an additional problem of lag time in data collection
because some period of time must elapse before some reports are
published. At the present time, this would mean that two-year
old data are being analyzed, and the index calculated is historical
rather than current. For example, the 1972 Trends Report
was published in mid-1974.
(c) Partial Automation; This method would result in specific aspects
of the process being automated, those aspects which are easiest
and least costly to automate. Selection of the items or sets
of items will necessarily be those which require little develop-
mental programming or coordination of data bases. In fact,
those portions deemed advisable to be automated might include
only the calculations, or it might include only those items which
are derivable from one existing system, such as CDS-based indices.
This method would also combine to some extent the benefits
of both the approaches previously mentioned as well as minimizing
costs. Less time would be required to obtain output, resulting
in increased user access. Flexibility would be retained so that
minor system modifications would be easily assimilated and major
modifications would not be excessively burdensome in terms of
manpower.
11-139
-------
The problem of establishing linkages among existing systems
would be avoided and high accuracy would be maintained. The
problems of lag time would not be avoided, except for the cases
in which the automated items include those which have the longest
lag time. For example, if SAROAD data are automatically entered
into the system, then the lag time associated with waiting for
the most recent Trends Report will be eliminated. However,
since the evaluation is being done for one period in time, that
period must correspond to the one for which most data are
available. Thus having up-to-the-minute data will not be useful
if the majority of data are not up-to-date.
The dependence on the computer and some software will require
the updates to be done by someone knowledgeable in electronic
data processing, particularly if any modifications need to be
made. The shorter time required for the update calculation and
the increased accuracy will, however, most likely balance out
the need for higher quality manpower.
Conclusions concerning the alternatives are:
(1) Complete automation should not be considered at this time.
The problems of complete automation are too extensive at the
present time to recommend this approach. The resource expenditures
necessary would not be justified particularly due to the planned
infrequent updates and input parameter instability. A major
reprogramming effort might be necessary after each period to
adjust the input parameters according to air programs needs.
11-140
-------
If, at some later date, the system is used more frequently,
stability is more prevalent, and systems coordination are en-
hanced, a further study into its feasibility would be warranted.
At that time specific cost elements could be detailed and
weighed against system benefits. Until such time, one of the
other alternatives should be considered as currently more cost-
beneficial,
(2) A manual system should be used if the degree of system usage
is low and/or the degree of expected modification is high.
To minimize costs, no attempt should be made to automate the
system if one or both of the two most important considerations
do not favor it. The two considerations are:
(a) The intended usage rate of the system: Since the developed
system has only been used for one trial run and further
experiments, trial runs, and other testing efforts are
needed before full implementation of the system is possible,
it is not now known how widespread the application of the
system will be. If the system has only a limited appeal and
is rejected by many potential users then the costs of its
development and operations should be low. If, on the other
hand, support for its use are widespread and frequent queries
are made of it, then extra costs would be warranted. The
measure of future applicability is then the most important
consideration in the determination of the extent of system
automation.
11-141
-------
(b) The uncertainty of system parameter stability: Air programs
in the various federal regions and states within regions are
necessarily dynamic. The data outputs from states must
reflect this changing situation and changing priority scales.
Data items to be reported may thereby vary from one year to
the next, the constant ones being emissions and air quality
data. Thus, it is impossible to assume that any one data
item will be reported year after year. Since automation
requires some consistency in data input, system stability
and resistance to major modification is a very important
consideration in choosing an approach to systems update.
(3) The compromise solution of partial automation is the logical
approach if the factors do not weigh heavily in one direction
or the other. Those portions of the system most readily automated
could make up the semi-automated system. Further study would have
to be done to determine which aspects are most easily automated.
Preliminary investigation reveals that the aspects to be likely
candidates are those which are derivable from the SAROAD, NEDS,
or CDS files. Calculations would then be done on data from any
one file to put them in the correct reporting format. For example,
a simple program could be written to scan the SAROAD data base and
calculate the air quality deviation indication and air quality
improvement measure for all states, without having to rely on hard
copy reports.
11-142
-------
APPENDICES TO VOLUME II
Appendix II-A: Data Sources
Appendix II-B: 1972/1973 Pollutant-Method-Stations Summary
Appendix II-C: Summary of National Ambient Air Quality Standards
Appendix II-D: State Background Information
Appendix II-E: Workbook
-------
APPENDIX II-A
Data Sources
-------
APPENDIX II-A. DATA SOURCES
A list and brief description of the sources of data from which
indicator values are computed are presented below.
I. EPA Data Sources
A. Aerometric and Emissions Reporting System (AEROS): data bank
maintained by the National Air Data Branch (NADB) in Durham, N.C.
1. Storage and Retrieval of Aerometric Data (SAROAD):
ambient air quality portion of AEROS records all measurements
of ambient air concentrations of the criteria pollutants
submitted by state and local agencies. States are required
to submit data to EPA Regional Offices quarterly within 45
days of the end of quarter; the data is supposed to be in
SAROAD within 75 days of the end of the quarter. Computer
printouts can be obtained at any time (subject to specific requesting
procedures) of specified data in SAROAD (raw data, frequency
report, standards report, parameter file, summary file).
Regular publications based on SAROAD are:
a. Monitoring and Trends Report (annual) discusses
trends in AAQ and completion of state monitoring networks,
and includes summary data as compared with NAAQS for all
criteria pollutants, states, AQCRs, and monitoring sites
reporting data meeting minimum SAROAD sufficiency criteria
(minimum sufficiency for 24-hour integrating samples is 5
values per quarter distributed over at least 2 of the 3
months of the quarters with at least 2 samples in each
-------
of the 2 months if there is no sample in the third month,
and for continuous instruments 75% of the possible hourly
values for annual summaries).
b. Air Quality Data (quarterly and annual) shows frequency
distributions for all data reported to SAROAD (not subject
to sufficiency criteria) by criteria pollutant, state,
AQCR, and monitoring site.
2. National Emissions Data System (NEDS): emissions portion
of AEROS contains information on emissions of criteria pollu-
tants, emission factors, fuel consumption, and point and area
sources. States are required to submit to EPA Regional Offices
semi-annually information on new sources and certain changes in
existing sources within 45 days of the end of the semi-annual
report period; the data is supposed to be in NEDS within 75
days of the end of the period. In addition to computer printouts
obtainable at any time, NEDS publishes: National Emissions
Report (annual) which lists emissions, totals and by emission
source categories, for the country, every state and every AQCR.
3. AEROS Status Report contains reports on missing data
items of the NEDS point source inventory, status of AEROS/NEDS
validation efforts, status of emission factors improvements,
summary of SAROAD monitoring activity, and summary of valid
data reported to SAROAD. (Report is currently not being pub-
lished regularly, but program is available to generate reports
on request.)
-------
B. Management-by-Objactives (MBO) System: EPA's Formal Planning
and Reporting System which, starting in FY75 provided for a system
of negotiation of output commitments with states in all media
programs and periodic reporting of output achievements. MBO Air
Programs Outputs #2 through #8 do not require breakdowns of commit-
ments and achievements by states. Output #1, dealing with source
compliance, is reported quarterly by state, to EPA's Division of
Stationary Source Enforcement (DSSE) in Washington, D.C. and
summarized in a State Activity Report.
C. Plan Revision Management System (PRMS): system developed by
the EPA Office of AQPS to assist regional offices in making evaluation
of plan adequacy; identifies AQCR's with potentially deficient SIPs,
by comparing measured AQ values at each monitoring site with predicted
AQ values for that site projected from applicable SIP regulations,
expected growth, source compliance status, TCPs and automotive
emission standards to determine whether adequate progress has been
made toward attainment of standards. PRMS has made 3 analyses thus
far, expanding its latest analysis to 117 AQCRs (approximately 6000
sites) and all criteria pollutants except NO^. It is hoped that
eventually PRMS analyses will be made after every quarterly SAROAD
update.
The PRMS Analytical Summary Report consists of 11 volumes, 1 volume
for each region and a summary volume, and includes an analysis for
each site found to be potentially deficient, a summary of analytical
results for all sites, and a map of sites found to be potentially
deficient.
-------
D. State Air Pollution Implementation Plan Progress Report (semi-
annual): report put out by OAQPS and OE that assess the progress
made by states in implementing the Clean Air Act.
E. State Implementation Plan Automated Information System:
recently automated data bank currently containing all regulations
which are part of the SIP's. This system, developed for NADB,
may eventually include other portions of the SIP's.
The following data sources are currently in the process of being
made completely operational, and should be the source of additional
information that may be useful to the system.
F. Compliance Data System (CDS): a Regional Office computerized
enforcement management system designed to track source compliance
schedule status, in various stages of completion in the Regional
Offices. When operational in all regions, CDS can fill gaps in
MBO source compliance information.
F. Manpower Model: a computer model being developed for OAQPS
that will project manpower needs for various aspects of control
agency activities. Used in conjunction with current manpower and
budget information (totals are available in the semi-annual SIP
Progress Report, but breakdowns by type of activity are not now
available), such information can provide some measure of how well
state agencies are meeting resource needs.
-------
II. Non-EPA Data Sources
A. U.S. Bureau of the Census, Department of Commerce
1. U.S. Decennial Census, Population Report (1970).
2. Statistical Abstract of the U.S. (annual): abstract of
information derived from the Census and other sources.
3. U.S. County and City Data Book (annual): selected
information for all U.S. cities and counties derived from
the census and other sources.
B. National Weather Service, National Oceanic and Atmospheric
Agency, Department of Commerce, Climatological Data; monthly and
annual summaries of selected climatological information for the
nation, states and possessions, divisions of states, and individual
weather stations.
C. The 1972 PEERS Projections; Economic Activity in the U.S.;
historical and projected (1929-2020) data by BEA economic area,
water resources region and subarea, states, SMSAs, and AQCRs, pre-
pared by the Bureau of Economic Analysis (Commerce Department) and
Economic Research Service (Dept. of Agriculture); information is
given on population, employment, personal income, earnings, and
indexes of production by industry categories.
D. Dun and Bradstreet, Dun's Market Identifiers; computerized
file of industrial facilities, to which EPA subscribes for an
annual update, that includes summaries of the number of establishments
by category and by state and other geographical units.
-------
APPENDIX II-B
1972/1973 Pollutant-Method-Stations Summary
-------
APPENDIX II-B
1972/1973 Pollutant-Method-Stations Summary
1972 1972 1973
No. of Percent *. No. of
Pollutant Code
TSP 11101 91
CO 42101 11
12
21
SO* 42401 11
' 13
14
15
16
31
33
91
92
93
NO. 42602 11
Z. 12
13
14
71
72
84
91
94
95
96
Photoctanlcal
Ox 44101 11
(Oxone) 13
14
15
51
81
82
44201 11
13
Method*
HI-Vol (FRM)*
_NDIR (FRM)
Coulometrlc
Flame lonLzatlon
Colorlmetric
Conducti metric
Coulonetric
Autometerc
Flane Photometric
Hydrogen Peroxide0
Sequential Conduct! me trie
West-Gaeke-Sulfamic Acid (FRM)
Hest-Gaeke Bubbler
Conduct1metr1c Bubbler
Colorlmetric
Colorlmetric
Coulometrlc
Cheni luminescence
J-H Bubbler (orifice)
Saltzman
Sodium Arsenlte (orifice)
0-H Bubbler (frit)
Sodium Arsenlte (frit)
TEA
TGS
Alkaline KI Instrumental
Coulometrlc <>
Neut KI Colorlmetric
Coulometric
Phenol phthal in
Alkaline KI Bubbler
Ferrous Oxidation
Cheml luminescence (FRM)
Coulome tried
Stations
"5SZ5
223
1
2
-226"
68
80
76
1
12
38
3
1040
45
2
T555
no
15
5
36
11
n
5
816
28
1B3T"
49
10
75
13
S
64
85
62
35T
Of Total
^100
99
0
- -1
W
.' 5
7
6
0
0
3
0
76
3
0
TOT
12
1
0
3
1
1
0
79
3 '
. w-
13
3
21
4
1
18
23
17
lJh
Stations
: 3602
278
2
10
~590~
89
108
172
1
29
38
6
1510
11
0
T553
136
14
10
8
14
5
26
995
456
TB54-
10
10
89
22
3
79
91
131
418
of Tote
TOT
96
0
4
w~
5
6
9
0
1
2
0
77
0
0
TOT""
8
1
1
0
1
0
1
60
28
TCO
2
2
21
5
1
18
21
30
0
To5~
1973
Percent
of Total Approved Unapproved Unacceptable
FRM Federal Reference Method.
See Appendix B for an explanation of why these methods are unacceptable.
These methods should be reported under method code 42401 13.
These Mthods should be under method code 44101 15.
-------
APPENDIX II-C
Summary of National Ambient Air Quality Standards
-------
APPENDIX II-C
SUMMARY OF-NATIONAL AMBIENT AIR QUALITY STANDARDS
POLLUTANT
PARTICULATE
MATTER
SULFUR
OXIDES
CO
N02
PHOTOCHEMICAL
OXIDANTS
HYDROCARBONS
(Non-Methane)
AVERAGING
TIME
Annual
(Geometric Mean)
24 - Hour*
Annual
(Arithmetic Mean)
24 - Hour*
3 - Hour*
8 - Hour*
1 - Hour*
Annual
(Arithmetic Mean)
1 - Hour*
3 - Hour*
(6 to 9 a.m.)
PRIMARY
STANDARDS
75 ug/m3
260 ug/m3
80 ug/m3 (0.03ppm)
365 ug/m3 (0.14ppm)
10 mg/m3 (9ppm)
40 mg/m3 (35ppm)
100 ug/m3 (0.05ppm)
160 yg/m3 (O.OSppm)
160 yg/m3 (0.24ppm)
"^SECONDARY
STANDARDS
60. ug/m3
150 u9/m3
1300 u,g/m3 (O.Sppm)
(Same as
Primary)
(Same as
Primary)
(Same as
Primary)
(Same as
Primary)
FEDERAL
REFERENCE
METHOD (FRM)
Hi -Volume
Sampler
Pararosaniline
Non-Dispersive
Infrared
Spectrometry
Jacobs-
Hochheiser
(Rescinded)
Chemi lumines-
cence
Flame
lonization
COMMENTS
The secondary annual standard (60ug/m3)
is a guide for assessing SIPs-to
achieve the 24-hour secondary standard.
The continuous Saltzman, Sodium
Arsenite (Christie), TGS, and Chemi lum-
inescence have been proposed as replace
ments for the J-H method. New FRM
to be decided upon by Jan. 1975.
The FRM measures 0, (ozone)
The HC standard is a guide to devising
SIPs to achieve the Oxidant standard.
The HC standard does not have to be
met if the oxidant standard is met.
Not to be exceeded more' than once per year.
NOTE: The air quality standards and a description of the reference methods were published on April 30, 1971 in 42 CFR 410, recodified
to 40 CFR 50 on November 25, 1972.
January 30, 1974 - JDC
-------
APPENDIX II-D
State Background Information
-------
APPENDIX II-D
STATE BACKGROUND INFORMATION
1. Total population: (a) Civilian (County-City Data Book)
(1000) (b) Including military (Statistical Abstract of the U.S.)
2. Projected population, 1980: (a) Series C
(1000) (b) Series E.
3. Urban population (1000) = pop. in urbanized areas + places of 2500 and more.
4. Percentage of population that is urban.
5. SMSA population (1000).
6. Percentage of population that is in SMSAs.
7. Urbanized area population = pop. of densely settled areas of SMSAs (1000).
8. Total land area (sq. mi.).
9. SMSA land area (sq. mi.).
10. Overall density = total pop./total land area.
11. SMSA density = SMSA pop./SMSA land area.
12. # of AQCRs of priority I (sum over all pollutants).
13. Population in AQCRs of priority I (sum over all pollutants) (1000).
14. Total air pollution control agency expenditures ($1000), FY 73(SIP Prog. Rpt.)
15. Total expenditures/total population.
16. Total expenditures/urban population.
17. Total expenditures/SMSA population.
18. Total expenditures/UA population.
19. Total expenditures/total land area.
20. Total expenditures/SMSA land area.
21. Total expenditures/overall land density.
22. Total expenditures/SMSA density.
23. Total expenditures/population in AQCRs of priority I (sum over all pollutants).
24. Percentage deviation of 1973 heating degree-days from 30-yr. Normal ( V°'°Jp5sL_)
(averaged over all weather stations in state).
,+ = higher heating degree-days = colder \
- = lower heating degree-days = warmer than normal
-------
1. (a)
(b)
2. (a)
(b)
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
Alabama
3,444
3,452
3,657
3,565
2,012
58.4
1,801
52.3
2,011
50,708
10,194
68
176
8
6,967
1,236
.35
.61
.68
.61
24
121
18,176
7,023
.17
-8.1
Alaska
300
304
365
352
146
48.8
566,432
1
3
283
364
1.21
2.49
1
364,000
1.28
+2.8
Arizona
1,770
1,792
2,228
2,164
1,409
79.5
1,319 .
74.5
1,157
L13,417
9,343
16
141
9
5,981
1,196
.67
.84
.90
1.03
11
128
74,750
8.482
.19
+2.9
Arkansas
1,923
1,929
2,107
2,052
961
50
V 595
30.9
378
51,945
3,379
37
153
2
99
341
.17
.35
.57
.90
7
101
9,216
2,229
3.44
-6.3
California
19,958
20,016
24,865
24,226
18,136
90.9
18,500
92.7
16,147
156,361
47,357
128
356
14
49,143*
22,215
1.11
1.22
1.20
1.37
442
469-
173,555
62,402
.45
-4.5
Colorado
2,207
2,222
2,708
2,636
1,733
78.7
1,1582
71.7
1,424
103,766
6,322
21
250
5
4,283
1,546
.70
.89
.97
1.08
15
245
73,619
6,184 .
.36
+3.7
Conn.
3,031
3,039
3,645
3,551
2,345
77.3
2,505
82.6
2,101
4,862
2,282
624
1,097
4
10,268*
2,301
.75
.98
.91
1.09
473
1,975
3,688
2,098
.22
-9.8
Delaj
y
5!
e;
6!
3!
i:
31
71
3<
1,9!
1,K
r
3:
1,5
4
-
1.
1.
1.
2
I
1,4
1,2
-15
* AQCR not along county lines.
** No counties; made up of townships.
-------
i.(a;
(bj
2. (a)
(b)
3.
4.
5.
6.
7.
8*
Q
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
B.C.
\ 756
755
__
^^
757
100
757
100
756
61
61
12,402
12,402
4
3,024
508
.67
.67
.67
.67
8,328
8,328
41
41
.16
-10.1
Florida
6,789
6,841
8,626
8,280
5,468
80.5
4,657
68.6
4,133
54,090
11,851
126
392
4
5,657
2,078
.30
.38
.44
.50
38
175
16,492
5,301
.36
-7.0
Georgia Hawaii
| 1
4,589
4,603
5,337
5,191
2,768
60.3
2,280
49.7
1,880
58,073
3,608
79
158
a
o
5,671
1,046
.22
.37
.45
.55
18
290
13,241
6,620
.18
-6.5
768
773
895
874
639
83
629
81.9
442
6,425
596
120
947
__
425
55
.66
.67 .
.96
66
713
3,542
449
1-0
0
* AO(TR r>««- al,%«~ -- *. -i _
1 Idaho Illinois Indiana
1
712
738
783
761
385
54.3
I
112
15.8
85
82,677
1^043
9
107
4
648
387
.54
1.00
3.45
4.55
5
371
43,000
3,617
.59
-6.2
1
11,110
11,125
12,591
12,256
9,230
83.1
8,903
80.1
7,874
55,748
12,607
199
706
15
40,343
7,015
.63
.76
.79
.89
126
581
32,250
9,936 .
.17
-9.1
5,194
5,203
5,943
5,782
3,372
64.9
3,214
61.9
2,395
36,097
9,909
144
324
17
9,933
1,610
.31
.48
.50
.67
45
163
11,181
4,970
.16
-12.1
Iowa
1 2,824
2,832
2,985
2,908
1,616
57.2
I
1,006
35.6
842
55,941
2,351
51
427.90
5
2,465
995
.35
.61
.98
1.18
18
423
19,510
2,336
.40
-9.7
»O **W»M,fc.^ 0.0. At. CO
** No counties; made up of townships.
-------
1. (a)
(b)
2. (a)
(b)
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
Kansas
2,246
2,249
2,395
2,334
1,485
66.1
t
949
42.3
785
81,787
2,997
28
181
6
2,879
1,055
.46
.71
1.11
1.34
1.3
352
37 , 688
5,829
.36
-1.5
Kentucky
3,218
3,230
3,462
3,372
1,684
52.4
1,288
40
1,120
39,650
3,203
81
402
6
3,409
1,506
.46
.89
1.16
1.34
38
470
18,593
3,746
.44
-10.1
Louisiana
3,640
3,652
4,092
3,975
2,406
66.1
1,996
54.8
1,703
44,930
6,207
81
321
2
5,602
602
.16
.25
.28
.35
13
97
7,432
1,875
.10
-16.7
Maine
993
997
1,043
1,016
504
50.9
'* 214
21.6
171
30,920
352
32
607
1
327
300
.30
.59
1.40
1.75
10
852
9,375
494
.91
-6.6
Maryland
3,922
3,939
4,916
4,782
3,004
76.6
3,307
84.3
2,588
9,891
3,239
397
1,020
6
8,698
2,814
.71
.93
.85
1.08
285
869
7,088
2,759
.32
-5.3
Mass.
5,689
5,704
6,439
6,277
4,810
84.6
4,818
84.7
4,334
7,826
2,606
727
848
11
16,443*'
1,963
.34
.AP
.40
.45
251
753
2,700
2,315
.11
-7.5
Michigan
8,875
8,899
10,314
10,031
6,554
73.8
6,806
76.7
5,569
58,817
10,664
156
638
6
9,259
3,485
.39
.53
.51
.62
61
327
22,342
5,463
.38
-7.6
Minn.
3,805
3,816
4,367
4,245
2,257
59.3
2,165
56.9
1,902
79,289
9,901
48
219
4
5,953
1,162
.31
.46
.54
.61
15
117
24,217
5,308
.20
-8.2
* AQCR not along county lines.
** No counties; made up of townships.
-------
l.(a)
(b)
2. (a)
(b)
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
Miss.
2,216
2,223
2,308
2,245
987
44.5
393
17.7
320
47,296
2,261
47
173.8
653
.29
.66
1.66
2.04
14
389
13,894
5,780
-10.3
* AQCR not
** No count
Missouri
4,676
4,685
5,201
5,070
3,278
70.1
2,997
64.1
2,576
68,995
6,913
68
433
1
1,764
2,280
.48
.69
.76
.88
33
330
33,529
15,266
1.29
-3.5
Montana
694
698
741
721
371
53.6
169
24.4
142
145,587
5,303
5
32
1
154
567
.81
1.52
3.35
3.99
4
107
113,400
17,719
3.68
-7.0
Nebraska
1,482
1,489
1,614
1,570
913
61.6
634
42.8
588
76,483
2,382
19
266
389
.26
.42
y61
.66
5
163
20,474
1,462
-1.9
Nevada
488
493
693
673
395
80.9
394
80.7
336
109,889
14,249
4
27
1
152
501
1.02
1.26
1.27
1.49
5
35
125,250
18,556
3.29
-2.3
New Hamp.
737
742
902
878
416
56.5
202
- 27.3
174
9,027
174
82
1,160
2
1,258
285
.38
.68
1.41
1.63
32
1,638
3,476
246
.22
-2.9
N. Jersey
7,168
7,197
8,514
8,300
6,373
88.9
5,511
76.9
6,078
7,521
2,472
953
2,229
8
26,311
3,767
.52
.59
.68
.61
501
1,524
3,953
1,690
.14
-11.4
N.Mexico
1,016
1,022
1,124
1,088
709
70
316
31.1
297
121,412,
1,169
8
270
6 '
1,246
755
.74
1.06
2.38
2.54
6
646
94,375
2,796
.60
+2.0
along county lines.
JP.R: made tin of founfihins.
-------
1. (a)
(b)
2. (a)
(b)
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
New York
18,236
18,260
20,275
19,789
15,602
85.6
15,771
86.5
14,267
47,831
15,408
381
1,023
12
56,476
15,075
.82
.96
.95
1.05
318
978
39,567
14,736
.26
-8.9
N. Carolinj
. 5,082
5,096
5,624
5,482
2,285
45
1,896
37.3
1,212
48,798
7,295
104
260
6
3,493
1,855
.36
- .81
.97
1.53
38
254
17,837
7,135
.53
-5.1
N.Dakota
617
620
618
600
273
44.3
74
11.9
53
69,273
1,749
9
42
100
.16
.36
1.35
1.88
1
57
11,111
2,381
-6.4
Ohio
10,652
10,667
11,987
11,675
8,026
75.3
8,273
77.7
6,642
40,975
13,933
260
594
18
21,089
6,072
.57
.76
.73
.91
148
436
23,353
10,222
.29
-11.7
Oklahoma
2,559
2,567
2,858
2,787
1,740
68
1,281
50.1
1,049
68,782
7,011
37
182
4
3,092
754
.29
.43
.58
.71
11
143
20,378
4,143
.24
-2.2
Oregon
2,091
2,102
2,482
2,421
1,403
67
1,281
61.2
984
96,184
10,113
22
126
3
4,422
1,588
75
1.13
.1.23
1.61
17
157
72,182
12,603
.35
-3.8
Pa.
11,793
11,816
12,444
12,157
8,430
71.5
9,366
79.4
6,921
44,966
13,467
262
695
10
29,244
4,740
.40
.56
.50
.68
105
352
18,092
6,820
.16
-7.6
Rhode 1
948
951
1,053
1,02?
825
8)
802
84.7
745
1,049
705
905
1,137
4
3,784
223
.23
.27
.27
.29
213
316
246
196
.05
-9.8
* AQCR not along county lines.
** No cmintien; made un of towns bins.
-------
l.(a)
(b)
2. (a)
(b)
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
S.Carolina
2,590
2,597
2,806
2,731
1,232
47.6
1,017
39.3
649
30,225
4,808
86
211
7
1,915
993
.38
.80
.97
1.53
33
207
11,547
4,706
.51
-4.5
S.Dakota
665
668
677
658
297
44.6
95
14.3
76
75,955
813
9
116
81
.12
.27
.85
1.06
1
100
9,000
698
-8.2
Tennessee
3,923
3,938
4,367
4,259
2,305
58.8
1,918
48.9
1,488
41,328
5,125
95
374
7
5,125
1,802
.45
.78
.93
1.21
44
352
18,968
4,818
.35
-5.9
Texas
11,195
11,241
13,180
12,812
8,921
79.8
8,234
73.5.
6,917
262,134
38,099
43
216
13
15,475
6,087.
.54
.68
.73
.88
23
160
141,558
28,181
.39
+0.8
Utah
1,059
1,066
1,275
1,234
851
80.6
822
77.6
733
82,096
3,656
13
273
4
3,372
428
.40
.50
.52
.58
5
117
32,923
1,568
.12
+3.8
Vermont
444
447
518
504
143
32.2
9,267
48
272
.61
1.90
29
5,667
-6.1
Virginia
4,648
4,660
5,369
5,229
2,935
63.1
2,846
61.2
2,397
39,780
2,563
117
1,110
**
10
9,746
1,707
.36
.58
.59
.71
43
660
14,590
1,538
.17
-6.5
Washington
3,409
3,414
4,061
3,958
2,476
72.6
2,249
66
1,873
66,570
7,663
51
293
9
7,695
2,478 |
.72
1.00
1.10
1.32
37
323
48,588
8,457
.32
-1.6
* AQCR not along county lines.
** No counties; made up of townships.
-------
l.(a)
(b)
2. (a)
(b)
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.
21.
22.
23.
24.
^Virginia
1,744
1,749
1,672
1,634
679
39
545
31.3
679
24,070
1,799
72
302
5
1,027
972
.55
1.43
1.78
1.43
40
540
13,500
3,219
.94
-7.5
Wisconsin
4,418
4,429
5,071
4,930
2,910
65.9
2,543
57.6
2,067
54,464
6,947
81
366
4
3,715
1,479
.33
.51
.58
.72
27
213
18,254
4,040
.40
-10.5
Wyoming
332
334
352
342
201
60.4
97,203
3
125
.37
.62
1
41,667
4-5.0
* AQCR not along county lines.
up of
-------
APPENDIX II-E
Workbook
-------
To facilitate implementation of the system the Workbook consolidates
all tables and figures referenced in the report that are needed to
calculate values and scores for all measures. These tables should
be retained as originals, and duplicated when needed to implement
the system.
The Workbook is organized in the following manner:
1) For all measures, worksheets referenced in the text;
2) For each index,
a) "State Values" output format, on which values computed
on the worksheets are presented (this may be the end
product desired or may be used to facilitate scoring);
b) "Converting Values to Scores" table, on which ranges of
values for each scoring interval are determined;
c) "Scoring and Weighting" table for a single state, on
which values computed for a state on the appropriate work-
sheets are converted to scores according to the ranges
established on the "Converting Values to Scores" table;
d) "State Scores" output format, on which computed scores
are presented;
3) For all measures, "State Scores for All Indices and Sub-Indices"
output format, which summarizes scores for all measures on the
two highest levels of aggregation.
-------
Worksheet #1. AMBIENT AIR QUALITY IMPROVEMENT
Region
State
AQCR #
Poll.-Std:
(G)
Previous
Pd.
(H)
(I)
Present
Pd.
(H)
(I)
(J)
1. .1. .
(a) or (b)
-
Poll Std:
(G)
\
Previous
Pd.
(H)
(I)
Present
Pd.
(H)
(I)
(J)
1. .1. .
(a) or (b)
-------
Instructions for Worksheet #1 Ambient Air Quality Improvement
1. List region number, state name, and AQCR code numbers on worksheet #1.
2. Refer to the Monitoring and Trends Report for the previous and present
periods for each pollutant-standard.
a. For each pollutant standard, determine stations that reported
the appropriate values in both periods by matching station code
numbers; list the number of stations reporting in both years in
column(G).
b. For TSP, S02, and NCL annual standards, compute column(I) sum
of percentage deviations above each primary standard (given in
Appendix II-D) in both periods for the state, which is equal to:
AQCR SES
I I
annual mean - annual standard
annual standard
SES -
where £ = sum over all stations in an AQCR with an annual
mean exceeding the .standard
AQCR
and £ = sum over all AQCRs in the state.
(If there are too few stations with sufficient data to calculate
annual means, the 50th and 70th percentile values are rough
estimates of the TSP annual geometric mean and SO- annual
arithmetic mean, respectively.)
c. For short-term standards (TSP and S07 24-hour, CO 8-hour and
1-hour, and 0 1-hour), col. (I) for each period is equal to:
AQCR SES
I I-
/# of values for each \
\station exceeding std./
x 100
(total # of values)
/2nd highest valueN ,_ , ,.
V for each station /
(Std.)
SES
where J = sum over all stations in an AQCR with 2 or more
values exceeding the standard,
AQCR
and J = sum over all AQCRs in a state.
-------
(Note: The number of values exceeding the standard is multiplied
by 100 only to avoid numbers with a large number of
decimal places.)
d. If there are no deviations above standard in the state in one or
both periods, sum all annual means(for long-term standards) or
2nd highest values(for short-term standards) in the state for col(H),
e. If the state total for column(I) is greater than 0 in both
periods, compute column(J) for percentage improvement in air
quality deviation and label resulting value as (a):
Previous Period (I) - Present Period (I) x ._
Previous Period (I)
f. If the state total for column(I) is less than or equal to 0
in either period, compute column(J) for percentage improvement
in air quality and label resulting value as (b):
Previous Period (H) - Present Period (H) x
Previous Period - (H)
-------
Worksheet #2. EMISSIONS
Pollutant:
Region
State
(1)
Previous Period
Emissions (T/yr)
(from NEDS)
(2)
Present Period
Emissions (T/yr)
(from NEDS)
(4.2.5. .)
(3)
Emission Goal
(T/yr)
(from SIP Automated
Information System)
% of Needed Emissions
Attained
(D-(2) , .,
-(1) X *
(5)
Emission Reduction
Needed (T/yr)
[(3M2)]
(4.3. .)
-------
Worksheet #3. PRMS
(Data from PRMS Analytical Summary Report,
Appendix B, Analytical Site Summary)
Pollutant:
Region
State
Number of Stations with Data Ending in the Present Period that Are:
INC
(Insufficient)
(Adequate)
* (Mag. or Freq.)
(All Deficiencies)
(1. .3.)
Sufficient =
Adequate +
Deficiencies
-------
Worksheet #4. SOURCE COMPLIANCE & ENFORCEMENT
(from State Activity Report on MBO Outputs)
Region:
Measure
Sub- Index 2.1.
2.1.1.1. (la)
2.1.1.3. (Ic)
2. 1.1. 4. (la + d)
2.1.1.5. (la+d+e+f)
2.1.1.6. (If)
2.1.1.8. (lh(l))
Total # Sources:
Start
Com.
Mile.
Last
New
Sub- Index 2.2.
2.2.1.
2.2.2.
Sub- Index 3.1.
3.1.1.
3.1.2.
STATES :
-------
Worksheet #4. SOURCE COMPLIANCE & ENFORCEMENT
(continued)
Region:
Sub- Index 5.1.
5.1.1.
5.1.2.
5.1.3.
Sub- Index 2.3.
2.3.1.
2.3.2.
2.3.3.
2.3.4.1.
2.3.4.2.
2.3.4.3.
Indicator 5.1.4.
STATES :
-------
INSTRUCTIONS FOR WORKSHEET #4 SOURCE COMPLIANCE & ENFORCEMENT
1. List region number (no more than one per page) and states in region on
Worksheet #4.
2. Refer to State Activity Report covering desired period, compute measures,
and fill in for each state:
M»
Start = Start Level
Com. « Commitment for the Year
Mile. = Milestones = Commitment for the Period
Last = Last Output Achievement for the Period
A/I = Activity Indicator
-^- «
Sub- Index 2.1.*
2. 1.1.1. (la, Point Sources In Compliance with Emission Requirements)
Last-Start
Mile. -Start
2.1.1.2.(lb, Pt. Sources of Unknown Compliance Status)
Start-Last
Start-Mile. X 1U°
2.1.1.3.(lc, Pt. Sources Out of Compliance & Not on Schedule)
Start-Last
Start-Mile. x 100
2.1.1.4. (la+d, Pt. Sources In Compliance With Emission Requirement or With
Scheduled Increments of Progress)
Last-Start ,nn = (la Last +ld Last) -(la Start + Id Start) --_
Mile-Start X (la Mile. + Id Mile.)- (la Start +ld Start) x
2.1.1.5. (la+d+e+f, Pt. Sources in Compliance with Emission Requirements or
On Compliance Schedules)
Last-Start , An
Mile-Start x 10°
2. 1.1. 6. (If, Pt. Sources of Unknown Status Regarding Increments of Progress)
Start-Last
Start-mile.
2.1.1.7. (lg(l), Field Surveillance Actions by State)
x 100
Mile.
2.1.1.8. (lh(l), Enforcement Actions by State)
-------
Total # of Sources (la+b+c+d+e+f)
Start
Com.
Mile.
Last
New=Last-Start
Sub-Index 2.2.*
2.2.1. (la Last + Id Last)-(la Start + Id Start)
Ib + Ic + le + If Start
2.2.2. (Ib Start - Ib Last)+(lf Start - If Last)
Ib Start + If Start
Sub-Index 3.1.
3.1.1. la Last + Id Last
Total Last
3.1.2. Id + le + If Last
Total Last-la Last
Sub-Index 5.1.
5.1.1. Ib Last + If Last
5.1.2. le Last
5.1.3. le Last
Sub-Index 2.3.**
231 lg(D-A/I lb(2) Source Tests by State
2.3.2.
2.3.3.
2.3.4.1.
2.3.4.2.
2.3.4.3.
# of D & B Manufacturing Facilities in State(see 5.1.4. below)
A/I lb(2)
# of D & B Manufacturing Facilities in State(see 5.1.4. below)
lg(l) Last
lg(l)+(2) Last
A/I lc(l)+(2)
le Start +le Start
A/I ld(l)+(2)
le Start + le Start
A/I le(l)+(2)
le Start + le Start
-------
3. Refer to Dun & Bradstreet Dun Market Identifiers (DMI) File and fill in for
each state:
5.1.4. Number of Manufacturing facilities in each state.
*
If commitment (denominator) is 0, the measure is assigned a value on the
following basis: 0/0=1.0, 1/0=2.0, 2/0=3.0, etc.
**
If the number of sources (denominator ) is 0, no value is computed for the
measure.
-------
Pollutant:
Worksheet #5a. MONITORING AND REPORTING
Region
STATE
AQCR IF
(D-(2)
= (3)
Tot (4)
rp A. /t;\
lOt (D)
(6)
(6)-(2)
= (7)
If(3)>0,
^-<«
If(3)<0,
(9)
Tot (10)
2.4._.1.=
(9)Tot
Tot (5)
(D-(6)
(T T\
-(.J--U
Tot (11)
5.2._.l.
m._ 4. fl O \
TOt \L£)
Tot (13)
5.2. .2.
Tot (14) Tot (12)
3.2._.2.~ Tot(15)
Tot-Tot
Tot (16) _ (12) (4)
O A 0 Tn^ /^S
(17"o}
/T Q\
^J.o;
Tot (19)
3.2. .1.
Tot
(22)
/on\
\£\J)
(20)+(21)
3.2. .3.
-------
Instructions for Worksheet #5.a Monitoring and Reporting
For each pollutant:
A. Refer to Monitoring and Trends Report for the previous period.
1. Fill in pollutant name, region number, state name, and
AQCR code numbers.
2. For each AQCR:
Column (1), minimum required (MR) # of stations -
column (2) # of stations that reported in the previous period =
column (3) # of stations needed to be added (with + or - sign).
3. For state total:
a. Tot.(4) = # of AQCRs in the state with column (3) value
less than or equal to 0 (i.e., that had complete networks).
b. Tot.(5) = # of AQCRs in the state with (3) greater than 0
(i.e., that had fewer than the MR # of stations).
B. Refer to Monitoring and Trends Report for present period.
1. For each AQCR: :
a. Column (6) = # of stations that reported in the present
period.
b. (6) - (2) = (7) # of stations added from the previous to
the present period (with + or - sign).
c. If the value in column (3) is greater than 0, column (8) %
of needed stations that were needed = (7)/(3) (with + or
- sign).
d. If (3) <. 0, fill in NN,for none needed, in column (8).
-------
e. If column (8) 2. 1, column (9) = 1.00.
f. If (8) < 1, (9) = (8).
g. If (8) is NN, (9) is NN.
2. For state total:
or = NN if all AQCRs in the state have
NN in col.(8).
3. For each AQCR:
Col.(1) - col.(6) = col.(11) # of stations needed in present
period (with + or - sign).
4. For state total:
a. Tot. (ll)(5.2.-.l.) = total // of stations needed in the
state in the present period = sum of positive values
in col.(11) for all AQCRs in the state.
b. Tot.(12) = # of AQCRs in the state with the minimum required
network reporting = # of AQCRs with col.(11) value <. 0.
c. Tot.(13)(5.2.-.2.) = # of AQCRs in the state with less than
the minimum required network reporting = # of AQCRs with
(11) > 0.
d. Tot.(14)(3.2.-.2.) % of AQCRs in the state with the MR
. Tot. (12)
network
Tot.(15) total # of AQCRs in the state '
e. Tot.(16)(2.4.-.2.) % of the state's AQCRs needing stations
in the previous period that attained the complete MR network
Tot.(12) - Tot.(4)
Tot.(5)
(If Tot.(5) = 0, fill in NN.)
-------
5. For each AQCR:
a. Col. (17) % of MR # of stations reporting in the present
period =
(If (1) = 0, (17) = 1.00.)
b. If the value in col. (17) >. 1.00, col. (18) = 1.00.
c. If (17) < 1.00, (18) = (17).
6. For state total:
sum °f c°l- (18) for all AQCRs in the state
now* 9 i ^ s
a. iot.uy;U./.-.i.; -> (15) total // of AQCRs in the state
C. Refer to Air Quality Data-Annual Statistics for the present period
(frequency distributions) . Determine pollutants for which data using
unacceptable pollutant methods (see Appendix II-C) were reported for
any state during the present period. For state total:
Col.(22)(3.2._.3.)
% of methods reported
that were acceptable
(20) # of stations using acceptable (FRM,
equivalent, or unapproved) methods
(20) + (21) # of stations using unacceptable
methods
-------
Worksheet #5b. SAROAD SUFFICIENCY SCORE
(from AEROS Status Report. Summary of Monitoring Activity)
Pollutant:
Region
State
(1)
# of Station-
Quarters That
Do Not Meet
SAROAD
Sufficiency
Criteria
(2)
# of Station
Quarters That
Do Meet
SAROAD
Sufficiency
Criteria
(3)
SAROAD
Sufficiency
Score
[(1H(2) Xl00]
(2.4.1.3.)
(4)
Improvement
Needed in Score
[100 - (3)]
(5.2.1.3.)
-------
Worksheet #6. EMISSIONS REPORTING
Region
State
(1)
# of
Sources
in NEDS
(2)
# of Sources
on NEDS
Verification
File
(5.2.6.1.)
(3)
% of Possi-
ble Sources
in NEDS
(i-/f)') xi°°
(3;2.6.1.)
(4)
Total Possi-
ble # of
Necessary
NEDS Data
Items in
Present Pd.
(5)
# of
Necessary
Data Items
Missing in
Present
Period
(5.2.6.2.)
(6)
% of
Necessary
Data Items
Missing in
Present
Period
-Vr-f xlOO
(4)
(3.2.6.2.)
(7)
# of
Items
Missing
in
Previous
Period
(8)
# Com-
pleted
During
Period
(9)
% of
Missing
Items
Completed
(2.4.5.1.)
-------
Instructions for Worksheet #6 Emissions Reporting
1. Rsfer to list of number of sources in NEDS and on NEDS Verification
File for desired period and fill in columns (1) and (2) on worksheet #6
for each state. Compute column (3):
A - C°lumn w I x 100
(2)/
^ Column (2)^
Refer to AEROS Status Report. NEDS (Section I), Point Source Inventory-
Incomplete Data Items, covering present period.
a. Cross out unnecessary data items in each state (see Section A, p. 11-20
of this report for data items declared necessary)
b. Compute column (4) total possible number of necessary data items
(sum of # of necessary items times # of plants/points/processes):
4 x # of plants
51 x # of points
+ 7 x # of processes
Total #
c. Count # of necessary data items missing in each state and fill in
under column (5) (5.2.6.2.).
d. Compute percentage of necessary data items that are missing:
Column (5)
Column (4) X 10° ; and fil1 in under column (6) (3.2.6.2. ),
3. Refer to AEROS Status Report for previous period.
a. Cross out unnecessary data items in each state.
b. Count # of necessary data items missing in each state and fill
in under column (7).
c. Compute net number of items completed:
Column (7) - Column (5)
and fill in under column (8).
d. Compute % of items missing in previous period that were completed:
100
Column (7)
and fill in under column (9).
-------
Worksheet #7. COMPLETING PLANS AND REVISIONS
(from SIP Progress Report)
Region
State
(1)
# of AQCRs
in State
(2)
# of SIP
Portions
(3)
Total
Possible
# of SIP
Portions
(4)
# Declared Deficient
(a)
State-
wide
Plans
(b)
AQCR
Plans
(c)
Total
(5.3.1.)
(5)
# Counted
as
Deficient
[(4a) x (1)
+ (4b)]
(6)
# Com-
pleted
[(3)- (5)]
(7)
% Completed
[JQ x 100]
(3.3.1.)
-------
Worksheet #8. AIR QUALITY DEVIATION INDICATION
Region
State
AQCR #
Poll.-Std.
(A)
(B)
(C)
(D)
(E)
(4.!.__._.)
Poll.-Std.
(A)
(B)
(C)
(D)
(E)
(4.!._._)
-------
Instructions for Worksheet #8 Air Quality Deviation Indication
1. List region number, state name, and AQCR code numbers on worksheet #8.
2. Refer to Monitoring and Trends Report for present period. For each
AQCR and each pollutant-standard:
a. Column(A) = federal minimum required number of stations.
b. Column(B) = number of stations that reported in the present period.
c. Column(C) = column(A) T column(B).
(For AQCRs with MR # = 0, (C) is computed thus: 0/0 =1.0,
1/0 = 2.0, 2/0 = 3.0, etc.)
3. Column(D) = sum of percentage deviations above standard:
a. For annual standards (TSP, S0~, NO^), refer to Trends Report;
x_v _ v annual mean - annual standard
^ annual standard
SES
where £ = sum over all stations with annual means exceeding
standard.
(If too few annual means are available, the 50th and 70th per-
centile values of frequency distributions can be substituted
for the TSP annual geometric mean and the SO and NO annual
arithmetic means, respectively. Refer to Air Quality Data-
Annual Statistics for frequency distributions.)
b. For short-term standards (TSP, S00, CO, 0 ), (D) =
£m X
SES VES
I I
value - standard
standard
VE'S
where J = sum over all values exceeding the standard
SES
and \ -. sum over all stations with 2 or more values exceeding
the standard.
-------
(If short term values are not available, the following equation
can be substituted:
[(# of values exceeding std.)x 1QQ (2nd highest value) -(std.)
| (total // of values) ~~j | standard ~\
SES
where I = sum over all stations in an AQCR with 2 or more
values exceeding the standard.)
c. For the state total, values in column (D) are summed.
4. Column (E) = corrected sum of percentage deviations above standard:
a. The value of (D) for each AQCR can be corrected to account for
the percentage completion of the federal minimum required
network. For each AQCR: (E)
b. For the state total, values in column (E) are summed.
-------
Worksheet #9. POPULATION EXPOSED TO AIR QUALITY DEVIATION
Region
State
AQCR //
Poll.-Std.:
if
AQDI
>0
Population
State Total
Poll.-Std.:
/ if
AQDI
>0
Population
State Total A
\= 4.1. . ./
Poll.-Std.:
/if
AQDI
>0
Population
§tate Total
4.!...
State %
of Inter-
state AQCR
Population
-------
Instructions for Worksheet #9. Population Exposed to
Air Quality Deviation
1. List region number, state name, and AQCR code numbers on worksheet #9.
Fill in pollutant-standards.
2. Refer to worksheet #8. Air Quality Deviation Indication. If a
particular AQCR has an AQDI (column(D) or (E) on worksheet #8)
greater than 0, check appropriate column on worksheet #9 for that
pollutant-standard and AQCR. If AQDI <. 0, leave blank.
3. Enter population of AQCR with checked AQDI column on worksheet #9.
For AQCR population, refer to NADB printout. If dates on printout
are inconsistent and a more consistent set of figures is desired,
refer to OBERS 1970 population by AQCR. For interstate AQCR, compute
from NADB printout approximate percentage of total AQCR population
in a given state and fill in on worksheet #9. Multiply this
percentage by total AQCR population from OBERS to obtain state-AQCR
population.
4. Total population of checked AQCRs in a state is the population in a
state exposed to air quality deviation of each pollutant-standard.
-------
Worksheet #10. EMISSION SOURCES
Region
State
(1)
Urbanized
Area Pop.
(1000)
(4.2.1.)
(2)
SMSA Land
Area
(Sq. Mi.)
(4.2.2.)
(3)
Total Population (1000)
(a)
Base
Year
(b)
Projection
Year =
(c)
Growth
Rate
(4.2.3.)
(4)
)BERS Prod. Indexes
for All Mfg. (1969=100^
(a)
Projection
Year
(b)
Growth
Rate
(4.2.4.)
-------
STATE VALUES FOR INDEX 1
STATE
1. GOAL ATTAINMENT
1.1. TSP
1.1.1.
AAQI
1.1.1.1. (a) >Annual
rH
CO
!
rH
rH
U
3
O
X
1
sf
CM
A
iH1
rH
H
1
CM
s1
CM
rH
T-H
rH
1.1.2. Em. Reduction
0)
N
rH
CO
to 01
00 C
CO u
rH U5
w o .
PH ^
CO
rH
1.2. S02
1.2.1.
AAQI
L.2.1.1.(a)>Annual
..2.1.1. (b)
CO
CM
1.3. CO
1.3.1.
AAQI
3
O
S3
1
00
A
CO
rH
r-i
CO
U
3
T
CO
.0
rH
CO
rH
3
O
A
'cO
CM
rH
CO
rH
H
1
rH
V
JO
CM
rH
CO
rH
e
o
4J
O
3
O
OJ
w
CO
rH
O
0)
rH
rt
» oi
M (3
CO jj
23 °
CO
CO
rH
1-.4. 0
X
1.4.1.
AAQI
3
O
rH
A
rH
J-H
rH
H
3
O
|
rH
.i-H
Annual
1.5.1. '.. (b)£ Annual
1.5.2. Era. Reduction
,,
c
0)
rH
CO
CCt 4J
UH
c/j o
m
-------
Converting Values to Scores
Index 1. GOAL ATTAINMENT
Measures
AQ Deviation
Improvement
Sub- Indicators
1.1.1.1. (a)TSP
1.1.1.2. (a)TSF
1.2.1.1. (a)S02
1.2.1.2. (a) S02
1.3.1.1. (a)CO
1.3.1.2. (a)CO
1.4.1.1. (a)0x
1.5.1.1. (a)N02
AQ Improvement
Sub- Indicators
1.1.1.1. (b)TSP
1.1.1.2. (b)TSP
1.2.1.1. (b)S02
1.2.1.2. (b)S02
1.3.1.1. (b) CO
1.3.1.2. (b)CO
1.4.1.1. (b)0x
1.5.1.1. (b)N02
Emission Reduc-
tion
Indicators :
1.1.2. TSP
1.2.2. S02
1.3.2. CO
1.4.2. BC
1.5.2. N02
PRMS
Indicators :
1.1.3. TSP
1.2.3. S02
1.3.3. CO
1.4.3. 0^
1.5.3. N02
Range of Values Used
Low
High
No. of
States
Scale
A-Arith.
G»Geom.
Value Ranges for Scoring Intervals
-------
STATE:
Scoring and Weighting
Index 1. GOAL ATTAINMENT
REGION:
Measure
1.1.1.1. (a)
1.1.1.1. (b)
1.1.1.2. (a)
1.1. 1.2. (b)
l.l.l.AAQI
I.I.2.E.R.
1.1.3.PRMS
1.1. TSP
1.2.1.1. (a)
1.2.1.1. (b)
1.2. 1.2. (a)
1.2.1.2.(b)
1.2.1.AAQI
I.2.2.E.R.
1.2.3.PRMS
1.2. SO?
1.3.1.1. (a)
1.3.1.1. (b)
1.3. 1.2. (a)
1.3.1.2.(b)
1.3.1.AAQI
I.3.2.E.R.
1.3.3.PRMS
1.3. CO
1.4.1.1. (a)
1.4.1.1. (b)
1.4.1.AAQI
I.4.2.E.R.
1.4.3.PRMS
1.4.0-,/HC
1.5.1.1. (a)
1.5.1.1. (b)
1.5.1.
1.5.2.
1.5.3.
1.5.N07
l.GOAL
ATTAINMENT
Sub- Indicator
Value Score Wt. Wtd.
Score
'
Indicator
Value Score Wt. Wtd.
Score
-
Sub-Index
Score Wt. Wtd.
Score
Index
Score
-------
STATE SCORES FOR INDEX 1
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
~~
Index
Sub- Indices
Indicators
s
&
s*
rH
J
PH
CO
"cd
i-H
rH
rH
rH
1.1.1. 1. (b)£TSP Annual v . ""s cn.
ib-
li
S
s.
3
CM
PH
CO
H
A
/ed
CM
rH
rH
rH
t
N
^
^
O
CM
PM
CO
|
CM
rH
rH
rH
4-1
|
rl
1
I
PM
H
rH
rH
rH
fr?
G
o
rH
4J
o
PH
CO
H
CM
rH
rH
5-S
-a
0)
N
rH
cd
co q
oo 3
rH .
fe W
a
PH MH
%£
CO
*
rH
rH
TSP Goal Attainment ( %)
rH
i-H
Indicators
Inc
s
*
\.
rH
cd
CM
o
CO
A
X-x
CD
V /
rH
rH
CM
rH
Sul
iic<
«
~s
rH
cd
1
CM
O
CO
X"
rH
i-H
CM
rH
3-
ito:
rl
3
O
st
CM
CM
O
A
U
*~f
CM
rH
CM
rH
r
rl
CM
CM
O
CO
XI
,0
CM
rH
CM
rH
1. S0_ AAQ Improvement ( %)
CM
rH
2. SO- Em. Reduction ( %)
CM
rH
O
N
co ^
cd
«-H
* co
o
CM
0 ».
CO ^
co
CM
rH
1.2. S02 Goal Attainment (
Indicators
]
Ii
/
6
^
rl
3
oo
O
A
Id
i-H
rH
CO
rH
3ul
id:
s
A
X-s
cd
x-x
CM
rH
CO
rH
^
^
-/
rl
1
rH
O
U
v_x
CM
*
rH
CO
rH
4-1
\
0
rl
&
H
I
O
U
rH
*
CO
rH
a
o
H
4-1
U
J
A
o
u
CM
CO
rH
1
H
il
O
8^
en
CO
rH
1.3. CO Goal Attainment ( X) / /
"
--^
-------
.>
LO
to
I-*
*
u>
NJ
I-1
P-
OJ
NJ
l->
JS
w
^
ro
s~'
l-i
^^
W OQ
S. -f>
1.4.1.1. (a)>0 1-Hour w
^
1.4.1.1.(b)<0 1-Hour (100/i) ! '
1.4.1. 0 AAQ Improvement ( %
1.4.2. 0 Em. Reduction ( %]
1.4.3. 0 PRMS Flags
x ( /»^
(# of Stns. Analyzed)
1.4. 0 Goal Attainment
1.5.1.1. (a)>N02 Annual ^w
1.5.1.1. (b)< NO 9 Annual \^«w,0y
1.5.1. N0« AAQ Improvement ( %]
1.5.2. N02 Em. Reduction ( %]
1.5.3. N02 PRMS Flags ( %]
(# of Stns. Analyzed)
1.5. N0? Goal Attainment
1. GOAL ATTAINMENT
u>
M
H.
H-
to
§
>
)
( J
H
H-
. P
H
)
>
C 5
to
Crt
0*
1
K
H-
O
0)
S)
0
M
5<
g.
&
X
t/l
g
w
w
o
CO
§
c
ID
-------
STATE VALUES FOR INDEX 2
2. PROGRESS
2.4. Monitoring and Reporting Air Quality and Emissions
2.4.1. TSP
2.4.1.1. % of Needed
Stations Added
2.4.1.2. % of Needed
AQCRs Attained
2.4.1.3. SAROAD Sufficiency
Score
2.4.2. S02
2.4.2.1. % of Needed
Stations Added-
2.4.2.2. % of Needed
AQCRs Attained
2.4.2.3. SAROAD Sufficiency
Score
2.4.3. CO
2.4.3.1. % of Needed
Stations Added
2.4.3.2. % of Needed
AQCRs Attained
2.4.3.3. SAROAD Sufficiency
Score
.
2.4.4. 0
X
2.4.4.1. % of Needed
Stations Added
-
2.4.4.2. % of Needed
AQCRs Attained
2.4.4.3. SAROAD Sufficiency
Score
2.4.5. N02
2.4.5.1. % of Needed
Stations Added
2.4.5.2. % of Needed
AQCRs Attained
2.4.5.3. SAROAD Sufficiency
Score
2.4.6.
Ems.
2.4.6.1. % of Missing NEDS
Data Items Completed
-------
\
1
1
CO
i
2.1.1.1. Output la
2.1.1.2. Output Ib
2.1.1.3. Output Ic
2.1.1.4. Output la + d
2.1.1.5. Output la, d - f
2.1.1.6. Output If
2.1.1.7. Output lg(l)
2.1.1.8. Output lh(l)
to
M
M
*
W
0
c
H
n
(D
o
o
1
M
H.
§
r>
m
2.2.1. Non-Complying Sources
Brought into Compliance
2.2.2. Unknown Sources Whose
Status was Determined
2.3.1. Process Inspection,
Opacity Observation
2.3.2. Stack Tests
2.3.3. Field Surveillance
by State
2.3.4.1. Notices of Violation
2.3.4.2. Abatement Orders
2.3.4.3. Court Proceedings
1
Iss
*
CO
e-
*
> w
o a
rt HI
S"
3
CO
S3
M
1?
fl>
rt
5"
00
o
o
rt
3
§
rt
Cfl
2. 2. Source
Compl .
2.3. Surveillance and ~
Enforcement
2. PROGRESS
-------
Converting Values to Scores
Index 2. PROGRESS
Measures
HBO Commitments
Sub- Indicators :
2.1.1.1.
2.1.1.2.
2.1.1.3.
2.1.1.4.
2.1.1.5.
2.1.1.6.
2.1.1.7.
2.1.1.8.
Source Compli-r
ance
Indicators :
2.2.1.
2.2.2.
.Surveillance &
Enforcement
Actions
Indicators :
2.3.1.
2.3.2.
2.J. 3.
Enforcement
Sub- Indicators :
2.3.4.1.
2.3.4.2.
2.3.4.3.
Range of Values Usec
Low
Hieh
No. of
States
Scale
A=Arith.
OGeom.
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
(Score- )
to:
(Score- )
to:
-------
Converting Value* to Score*
Index 2. PROGRESS (continued)
Measures
Monitoring &
Reporting
% of Needed
Stations Added
Sub- Indicators :
2.4.1.1.
2.4.2.1.
2.4.3.1.
2.4.4.1.
2.4.5.1.
Z of Needed
AQCRs Attained
Sub- Indicators :
2.4.1.2.
2.4.2.2.
2.4.3.2.
2.4.4.2.
2.4.5.2.
SAROAD Suffi-
ciency Score
Sub- Indicators :
2.4.1.3.
2.4.2.3.
2.4.3.3.
2.4.4.3.
2.4.5.3.
Emissions Rptg.
Sub- Indicator :
2.4.6.1.
Range of Values Used
Low
Hluh
No. of
States
*
Scale
A-Arith.
G-Geom.
-
Value Ranges for Scoring Intervals
(Score- 1 )
Low to:
(Score- 2 )
to:
(Score- 3)
to:
-
'
(Score- 4)
f-n;
-------
Scoring and Weighting
Index 2. PROGRESS
STATE:
REGION:
Measure
2.1.1.1.
2.1.1.2.
2.1.1.3.
2.1.1.4.
2.1.1.5.
2.1.1.6.
2.1.1.7.
2.1.1.8.
2.1.1.
2.1. Meeting Com.
2.2.1.
2.2.2.
2.2. Source Corapl.
2.3.1.
2.3.2.
2.3.3.
2.3.4.1.
2.3.4.2.
2.3.4.3.
2.3.4.
2.3.Surv. & Enf.
2.4.1.1.
2.4.1.2.
2.4.1.3.
2.4.1.TSP
2.4.2.1.
2.4.2.2.
2.4.2.3.
2. 4. 2. SO?
2.4.3.1.
2.4.3.2.
2.4.3.3.
2. 4. 3. CO
2.4.4.1.
2.4.4.2.
2.4.4.3.
2.4.4.0,.
2.4.5.1.
2.4.5.2.
2.4.5.3.
2.4.5.N07
2.4.6.1.
2. 4. 6. Em.
2. 4. Monitoring
2. PROGRESS
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
'-
Sub-Index
Score Wt. Wtd.
Score
Index
Score
-------
STATE SCORES FOR INDEX 2
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
Index
Sub-Indices
/*-x
Indicators
Sub-
S-*i
B-S
V '
cd
p.
o
rH
*
H
*
CM
Indie
/*
B-?
N '
&
3
P.
o
CM
rH
CM
ators
x-s
6-S
N^X
CJ
3
P.
O
en
rH
CN
/ .
B~S
₯
+
cd
P.
O
c
P.
o
r>.
H
*
rH
CM
x \
&^
N«X
X\
H
N«X
^3
a
3
o
oo
H
H
.
CM
/*>»
B-S
O
O
H
*~s
0)
O
8
iH
rH
!
0
CO
0
3
o
w
i-H
i-H
CM
-
2.1. Meeting MBO
Commitments
2. ,2.1. Non-Complying Sources tj1
Brought into Compl. £« fi
ri
2.2.2. Unknown Sources Whose '-> £
Status was Determined °
5s: to
/~»
B-S
\^^
cu
0
§
H
i-H
O
V
O
M
O
c/3
.
CM
CM
-------
STATE SCORES FOR INDEX 2 (Continued)
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
_ (4)
(1)
(2)
(3)
Index
Sub-Indices
Indi
s*^.
6^
s
A
p- .
co co
C f>
H O
CO >>
CO 4-1
Q) -H
U CJ
O CO
M CX
P^ 0
rH
CO
CM
.cator
s~^
s-s
" N '
CO
4-1
CO
M
3 (1)
CO 4J
nj
13 4J
rH CO
,
ptH ,Q
CO
CO
CM
2.3.4.1. Notices of 7\Hco
Violation * g, g.
LJ^ 1
2.3.4.2. Abatement g
Orders ( *> %
s
^^^
e^s
s '
o
M CO
PM 00
4^5
VJ 13
3 *
CM
Sub-
Indi
( %)
T3 t)
(U 0)
13 -O
Q) T3
Q) <$
2:
.
u-i ca
0 fi
4J
B^S CO
rH
CM
-------
STATE SCORES FOR INDEX 2 (Continued)
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
Index
Sub-Indices
Indicators
Sub-
Indi
(U CU
CU T3
CU <
25
M-l CO
o e
i-H
CO
CM
2.4.3.2. % of Needed ~ £
AQCRs Attained s-s o
v^ rf
2.4.3.3. SAROAD Suf- /"N w
ficiency Score ^9
^^
2.4.3. CO Monitoring (
Sub-
Indi
0) 0)
CU T3
M-l CO
fr-S CO
*
r-l
CM
2.4.4.2. % of Needed ~ g
AQCRs Attained ;*> o
^ H
2.4.4.3. SAROAD Suf- ~ U1
ficiency Score ^a
V.'
N^
00
c
H
M
O
4-1
H
c
o
3-
-*
CM
Sub-
Indi
cu cu
TJ TJ
cu Ti
cu
-------
STATE VALUES FOR INDEX 3
STATE
3. ACHIEVEMENT
3.2. Monitoring & Reporting Air Quality & Emissions
(continued)
3.2.4. Ox
Monitoring
a
4J
CO
<4-l
o
6-5
i-l
t-flh1 p
3.2.4. N02
Monitoring
CO
C
4J
CO
M-l
O
B-s
H
m
cs
CO
^
&
CO
Pi
u eu
0- 4J ^
w ts
o
B^S O
H
VO
CN
CO
crt ,B
Q (U
W 4J 60
25 H (3
H
IH tri co
O 4J CO
trj -H
B^s n S
*
CM
VO
CM
CO
3.3. Completing
Plans &
Revisions
3.3.1. % of SIP
Portions
Completed
-------
STATE VALUES FOR INDEX 3 (Continued)
STATE
3. ACHIEVEMENT
3.1. Source
Compliance
3.1.1. % of Sources in Compli-
ance
3.1.2. % of Non-Complying
Sources on Schedule
3.2. Monitoring & Reporting Air Quality & Emissions
3.2.1. TSP
Monitoring
a
4J
CO
<4-(
O
s-s
iH
i-l
CN
CO
3.2.1.2. % of AQCRs with
Complete Network
3.2.1.3. % of Pollutant-
Methods Not Un-
amentahle
3.2.2. .S02
Monitoring
co
c
4J
co
14-1
O
B-S
rH
CN
CM
CO
3.2.2.2. % of AQCRs with
Complete Network
3.2.2.3. % of Pollutant-
Methods Not Un-
a/T>e>nf'ah1 p
3.2.3. CO
Monitoring
CO
a
4J
CO
1
4-1
O
&*
»
H
CO
CM
CO
3.2.3.2. % of AQCRs with
Complete Network
3.2.3.3. % of Pollutant-
Methods Not Un-
acceptable
-------
Converting Values to Scores
Index 3. ACHIEVEMENT
Measures
Source Complian
Indicators :
3.1.1.
3.1.2.
Monitoring &
Reporting
% of Required
stations
Sub-Indicators :
3.2.1.1.
3.2.2.1.
3.2.3.1.
3.2.4.1.,
3.2.5.1.
Z of AQCRs
Sub- Indicators :
3.2.1.2.
3.2.2.2.
3.2.3.2.
3.2.4.2.
3.2.5.2.
Pollutant-Methods
Sub-Indicators :
3.2.1.3.
3.2.2.3.
3.2.3.3.
3.2.4.3.
3.2.5.3.
Emissions Rptg.
Sub- Indicators :
3.2.6.1.
3.2.6.2.
Completing Plans
Indicator :
3.3.1.
Range of Values Use
Low
High
No. o:
State
Scale
,A-Arith
G-Geom.
.
,
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
(Score- )
to:
(Score- )
to-
-------
Scoring and Weighting
Index 3. ACHIEVEMENT
STATE:
REGION:
Measure
3.1.1.
3.1.2.
3.1. Source Compl.
3.2.1.1.
3.2.1.2.
3.2.1.3.
3.2.1.TSP
3.2.2.1.
. 3.2.2.2.
3.2.2.3.
3.2.2,80,,
3.2.3.1.
3.2.3.2.
3.2.3.3.
3. 2. 3. CO
3. 2. A.I.
3.2.4.2.
3.2.4.3.
3.2. 4. Ox
3.2.5.1.
3.2.5.2.
3.2.5.3.
3.2.5.N02
3.2.6.1.
3.2.6.2.
3.2.6..
3. 2. Monitor ing
3.3.1.
3. 3. Completing
Plans
3. ACHIEVEMENT
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
Sub- Index
Score Wt. Wtd.
Score
Index
Score
-------
STATE SCORES FOR INDEX 3
(1)
(2)
(3)
(A)
ghts)
[E
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
. w
(1)
(2)
(3)
.(^)
(1)
(2)
(3)
M
Index
Sub- Indices
>-^
tf
3.1.1. % of Sources / «,vo.
In Compliance ° o"
. ... ..... yj
3.1.2. % of Non-Complying ,. »v«
Sources on Schedule ° en
8-8
^
CU
a
§
H
r-l
P.
U
CU
O
M
O
CO
t-H
CO
Sub-
s\
B-«
^M/
CO
C
4J
CO
U-l
o
&-S
rH
r-l
CM
CO
3.2.1.2. % of AQCRs with , ^g.
Complete Network £
3.2.1.3. % of Pollutant , «v£
Methods Not ^ ''o
Unacceptable w
x*\
6-2
V >
6C
5
!J
O
4-1
1-1
C
PJ
M
H
H
*
CM
CO
Indicators
Sub-
/^
B-5
s»/
CO
c
4J
CO
M-l
O
B>S
rH
CM
CM
CO
3.2.2.2. % of AQCRs with ( %)g^
Complete Network ° K'
3.2.2.3. % of Pollutant / ^x»
Methods Not ° H
Unacceptable m
3.2.2. S02 Monitoring ( %)
Sub-
^N
B-S
S^X
CO
C
4-1
CO
U-(
O
fr*
rH
CO
*.
CM
CO
3.2.3.2. % of AQCRs with . . g.
Complete Network^ ; £
3.2.3.3. % of Pollutant . £
Methods Not C J g
Unacceptable m
s~\
S3
v-x
00
c
H
1-j
O
4-1
H
c
£
o
o
«
CO
CM
*
CO
-------
STATE SCORES FOR INDEX 3 (Continued)
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
Index
Sub-Indices
Indicators
Sub-I
CO
C
4-1
CO
<*-<
O
CM
CO
ndica
W i
5 4-1
CU
CO a
C_5 CU
< cu
IH ft
M!
CM
sfr
CM
CO
3.2.4.2. % of Pollutant £
Methods Not ( %)$
Unacceptable
3.2.4. Q Monitoring ( %)
X
Sub-
ca
4J
CO
M-l
O
rH
CM
Indie
ti o
> 4-1
CU
u cu
0' 4J
-------
STATE VALUES FOR INDEX 4
STATE
4. PROBLEM
4.1.
4.1.1. TSP
M
H
O
U
*,
rH
(0
3
1
X-N
ca
s_x
rH
rH
iH
-*
. (b) Annual, Uncorr.
rH
rH
rH
*
H
VJ
0
O
t
M
O
33
-tf
CN
/ N
ca
CN
rH
rH
<
M
(-1
O
O
£
A
M
3
0
re
^
P.
O
PH
n3
(U
CO
O
fr
W
>3-
CN
rH
d-
4.1.3. CO
M
M
O
O
N
M
3
O
33
oo
/ \
n)
v^/
rH
f>
rH
<3-
(b) 8-Hour, Uncorr.
rH
ro
rH
St
M
M
O
U
*\
M
O
33
rH
x~x
ca
s ^
CM
ro
rH
-*
(b) 1-Hour, Uncorr.
CN
CO
rH
^'
D.
O
P4
o
0)
ca
o
ex
£
CO
CO
rH
\
M
3
o
33
rH
/~*>
cd
v^^-
rH
<±
rH
-3-
(b) 1-Hour, Uncorr.
rH
-------
STATE VALUES FOR INDEX 4 (Continued)
STATE
4. PROBLEM
4.2. Emissions & Em. Sources
ex
o
Pn
§
s
£3
rH
CM
*
n
CM
-»
0
U
CO
IT)
CM
~3-
S
>*
U1
CM
-*
CM
§
IO
m
CM
-*
4.3. Reduc. Needed
f^
CO
H
r-H
CO
-st
CM
O
CO
CM
CO
3-
O
u
CO
CO
-*
0
JC
-*
CO
-a-
CM
§
m
CO
-*
-------
Converting Values to Scores
Index 4. PROBLEM
Measures
AAQ Problem
AQDI
Sub-Indicators :
4.1.1.1. (a)TSP
4.1.1.1. (b)TSP
4.1.1.2. (a)TSP
4.1.1.2. (b>TSP
4.1.2.1. (a)S02
4.1.2.1. (b)S02
4.1.2.2. (a)S02
4.1.2.2. (b)S02
4.1.3.1. (a) CO
4.1.3.1. (b)CO
4.1.3.2.(a)CO
4.1.3.2. (b)CO
4.1.4.1, (a)0x
4.1.4.1. (b)0x
4.1.5.1. (a)N02
4.1.5.1. (b)N02
Population (1000)
Sub- Indicators :
4.1.1.3. TSP
4.1.1.4.
4.1.2.3. S02
4.1.2.4.
4.1.3.3. CO
4.1.3.4.
4.1.4.2. Ox
4.1.5.2- N02
Range of Values Used
Low
High
No. of
States
Scale
A=Arith.
G=Geom.
-. '
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
(Score- )
to:
(Score- )
to
-------
Converting Values to Scores
Index 4. PROBLEM (continued)
Measures
Emissions &
Em. Sources
Indicators :
4.2.1. Pop.
4.2.2. Land
4.2.3. Pop.Gr.
4.2.4. Manu.Gr.
Emissions (10001!
Sub- Indicators :
4.2.5.1. TSP
4.2.5.2, S02
4.2.5.3. CO
4.2.5.4. HC
4.2.5.5.NOX
Emission Reduc-
tion Needed
Indicators :
4.3.1. TSP
4.3.2.S02
4.3.3. CO
4.3.4. HC
4.3.5.NOX
Range of Values Used
Low
/r)
High
No. of
States
Scale
A«Arith.
G-Geom.
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
i
(Score- )
to:
(Score= )
to:
(a) - Corrected
(b) - Uncorrected
-------
Scoring and Weighting
Index 4. PROBLEM
STATE:
REGION:
Measure
4.1.1.1.
4.1.1.2.
4.1.1.3.
4.1.1.4.
4.1.1.TSP
4.1.2.1.
4.1.2.2.
4.1.2.3.
4.1.2.4.
4. 1.2. SO?
4.1.3.1.
4.1.3.2.
4.1.3.3.
4.1.3.4.
4. 1.3. CO
4.1.4.1.
4.1.4.2.
4.1.4.0V
4.1.5.1.
4.1.5.2.
4. 1.5. NO?
4.1.AAQ Problem
4.2.1.
4.2.2.
4.2.3.
4.2.4.
4.2.5.1.
4.2.5.2.
4.2.5.3.
4.2.5.4.
4.2.5.5.
4.2.5.
4. 2. Em & Em.
Sources
4.3.1.
4.3.2.
4.3.3.
4.3.4.
4.3.5.
4. 3. Em. Reduc.
Needed
4. PROBLEM
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
Sub-Index
Score Wt. Wtd.
Score
Index
Score
-------
STATE SCORES FOR INDEX 4
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
Index
Sub- Indices
Indicators
St
/
&
>
*
n
H
O
O
r
i-H
CO
3
1
cfl
rH
rH
iH
*
(b) Annual, Uncorr. v '°' 7
rH
rH
rH
-*
Ir
s
B
>
M
S-i
0
O
*
M
3
O
PC
M
1
-*
CN
s~\
A
CN
rH
rH
M
PC
^»
^>
"?
*
CM
v.^
O
O
PM
T3
0)
co
O
-B
w
-*
CM
rH
S
N« /
rH
CO
|
P
O
P-I
T3
cu
CO
o
p.
A
CM
m
rH
St
*«\
6*2
v-'
CM
§
in
rH
-3-
-------
STATE SCORES FOR INDEX 4. (Continued)
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
Index
Sub
§
rH
§
M
CM
I
rH
-Indices
£s
o
&
rH
CM
*
&«
n)
d)
n
C
cu
N
H
1
CM
CM
*
01
4-1
cd
&
o
O
cx
o
m
CM
*
I
cu
4-1
1
O
M
1
CM
vi-
ndicators
Su
CM
H
rH
m
CM
*
b-I
B~S
CM
O
co
CM
m
CM
*
ndi
o
m
CM
*
cat
u
PC
m
CM
*
ors
CM
O
m
in
CM
"*
5. Emissions ( %)
CM
*
Q
CO
cu
0
3
O
CO
C
o
H
CO
CO
H
1
CM
"*
]
B^S
CM
CO
H
rH
CO
~3'
[ndi
B-S
CM
O
CM
m
"*
.cat
o
0
CO
m
*
:ors
B-S
U
PC
st
ro
-;r
i
CM
O
m
-------
STATE VALUES FOR INDEX 5
STATE
5. OPERATIONAL REQUIREMENTS
5.2. Monitoring and Reporting
Air Quality and Emissions
5.2.4. Ox
5.2.4.1. # of Stations Needed
5.2.4.2. # of AQCRs With Less
Than Required Network
5.2.4.3. Improvement Needed
in Sufficiency Score
5.2.5. N02
5.2.5.1. # of Stations Needed
5.2.5.2. # of AQCRs With Less
Than Required Network
5.2.5.3. Improvement Needed
in Sufficiency Score
5.2.6.
Emissions
Reporting
5.2.6.1. Sources on NEDS
Verification File
5.2.6.2. Missing NEDS Data
Items
5.3. Comp .
Plans &
Revisions
5.3.1. Needed SIP Portions
-------
STATE VALUES FOR INDEX 5 (Continued)
STATE
5. OPERATIONAL REQUIREMENTS
5.1. Source
Compliance &
Enforcement
g
I
j
rH
l-l
*
m
5.1.2. Non-Complying Sources
Not on Compl. Schedule
5.1.3. Overdue Sources
5.1.4. Sources that Require
Field Surveillance
5.2. Monitoring and Reporting
Air Quality and Emissions
5.2
5.2.1.1. # of Stations Needed
-
.1. TSP
5.2.1.2. # of AQCRs With Less
Than Required Network
5.2.1.3. Improvement Needed
in Sufficiency Score
5.
5.2.2.1. # of Stations Needed
2.2. S
5.2.2.2. # of AQCRs With Less
Than Required Network
°2
5. 2.. 2. 3. Improvement Needed
in Sufficiency Score
5.2.3. CO
-------
Converting Values to Scores
Index 5. OPERATIONAL REQUIREMENTS
Measures
Source Compliance
& Enforcement
Indicators :
5.1.1.
5.1.2.
5.1.3.
5.1.4.
Monitoring &
Reporting
No. of Needed
Stations
Sub-Indicators :
5.2.1.1.
5.2.2.1.
5.2.3.1.
5.2.4.1.
5.2.5.1.
No. of AQCRs
Sub- Indicators :
5.2.1.2.
5.2.2.2.
5.2.3.2.
5.2.4.2.
5.2.5.2.
Improvement in
SAROAD score
Sub-Indicators :
5.2.1.3.
5.2.2.3.
5.2.3.3.
5.2.4.3.
5.2.5 3.
Emissions Rptg.
Sub- Indicators I
5.2.6.1.
5.2.6.2.
Completing Flans
Indicator:
5.3.1.
Range of Values Used
Low
Hieh
-
No. of
States
Scale
A-Arith.
G-Geom.
Value Ranges for Scoring Intervals
(Score- )
Low to:
(Score- )
to:
(Score- )
to:
(Scofe- )
to:
-------
STATE:
Scoring and Weighting
Index 5. OPERATIONAL REQUIREMENTS
REGION:
Measure
: 5.1.1.
5.1.2.
5.1.3.
5.1.4.
5.1. Source Compl.
5.2.1.1.
5.2.1.2.
5.2.1.3.
5.2.1.TSP
5.2.2.1.
5.2.2.2.
5.2.2.3.
5.2.2.S02
5.2.3.1.
5.2.3.2.
5.2.3.3.
5. 2. 3. CO
5.2.4.1.
5.2.4.2.
5.2.4.3.
5.2.4.0,,
5.2.5.1.
5.2.5.2.
5.2.5.3.
5.2.5.N02
5.2.6.1.
5.2.6.2.
5.2. 6. Em.
5. 2. Monitor ing
5.3.1.
5. 3. Completing
Plans
5. OPERATIONAL
REQUIREMENTS
Sub- Indicator
Value Score Wt. Wtd.
Score
Indicator
Value Score Wt. Wtd.
Score
Sub-Index
Score Wt. Wtd.
Score
Index
Score
-------
STATE SCORES FOR INDEX 5
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
Index
Sub- Indices
5.1.1. Unknown Sources ( %) M
o.
5.1.2. Non-Complying Sources , . H-
Not on Compl. Schedule ^ ^ ("
5.1.3. Overdue Sources ( %) H
5.1.4. Sources That Require , ,,.
Field Surveillance ^ '
5.1. Source Compliance , .
& Enforcement ^ '
5.2.1.1. # of Stations Needed ( %) to
5.2.1.2. # .of AQCRs With Less < «^ cr
Than Required Network ° M
5.2.1.3. Improvement Needed in/ «\ o.
Sufficiency Score °
-
/ \
B^
^^
fli
CO
H
H
CM
in
Indicators
Si
/\
&«
V /
T3
01
O
-------
STATE SCORES FOR INDEX 5 (Continued)
(1)
(2)
(3)
(4)
(Weights)
STATE
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
(1)
(2)
(3)
(4)
Index
Sub- Indices
Indicator
S
B-S
13
CU
T3
CU
CU
&
CO
C
O
H
4-1
tfl
4-1
W
<4-l
0
=a=
i-H
-*
-------
STATE SCORES FOR ALL INDICES AND SUB-INDICES
(1)
(2)
STATE
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
(2)
(1)
Indices
Sub-Indices
( )
E-i
rH
rH
( )
CM
O
in
CM
rH
( )
O
O
CO
^
( )
o*
-a-
rH
( )
CM
in
rH
^»
J£
H
8
rH
Sub-Indices
( )
CO
C
n
H
O
O
s
,.;
CM
( )
,_!
f
Source Co
CM
CM
( )
i^j
O 0)
0 B
B O
Surveilla
Enf. Acti
t-i
CM
( )
.
00
4-1
&
a
H
B
-a-
CM
PROGRESS
CM
Sub-Indicep
( )
rH
s-
c5
a)
o
1-1
3
O
en
rH
CO
( )
4J
S1
ia
4J
H
1
CM
CO
( )
CO
B
n
cu
00
Completin
ro
CO
g
ACHIEVEME
CO
Sub-Indices
( )
g
rH
|
1
^
*
( )
01
Emissions
Em. Sourc
CM
-^
( )
^ ^
(r-(
H
O rj
01 01
T3 P.
co en
01 1
rH
rH
erf o
CM
CO
-*
PROBLEM
^
Sub-Indices
( }
rH
a
e
3
O 4-1
M C
1"
H
in
( }
*
4J
Konit. &
cs
vn
c >
CO
B
a
rH
PM
oo
Completin
CO
in
CO
d g
OPERATION
REQUIREJffi
in
-------
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO.
EPA-450/3-75-055
2.
3. RECIPIENT'S ACCESSION-NO.
4. TITLE AND SUBTITLE
SYSTEM FOR TABULATING SELECTED MEASURES
OF STATE AIR PROGRAMS STATUS
5. REPORT DATE
April 1975
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
Marsha N. Allgeier, Barry F. Levene
8. PERFORMING ORGANIZATION REPORT NO.
107
9. PERFORMING ORG 'V.NIZATION NAME AND ADDRESS
System Sciences, Inc.
P.O. Box 2345
Chapel Hill, North Carolina 27514
10. PROGRAM ELEMENT NO.
11. CONTRACT/GRANT NO.
68-02-1420
12. SPONSORING AGENCY NAME AND ADDRESS
Environmental Protection Agency
Office of Air Quality Planning and Standards
Control Programs Development Division
Research Triangle Park, N.C. 27711
13. TYPE OF REPORT AND PERIOD COVERED
Final Report
14. SPONSORING AGENCY CODE
15. SUPPLEMENTARY NOTES
16. ABSTRACT
A system for tabulating selected measures of state air programs status was
developed to provide a method for organizing, summarizing, and presenting within
a coherent framework, data from existing reporting systems available to EPA
headquarters. The system consists of a framework of measures of selected aspects
of state air programs for which data is readily available, a methodology for
computing values and scores for these measures, and alternative formats for
summarizing and presenting values and scores. A trial run of the system was
conducted for all fifty-five state and territorial control programs to demonstrate
the manual application of the system. It was concluded that a periodic manual
application of the system is feasible but time-consuming. The feasibility
of automating the system depends on the extent of system usage and the degree
of stability of data items and measures.
17.
KEY WORDS AND DOCUMENT ANALYSIS
DESCRIPTORS
b.lDENTIFIERS/OPEN ENDED TERMS C. COS AT I Field/Group
18. DISTRIBUTION STATEMENT
Release Unlimited
19. SECURITY CLASS (ThisReport)
Unclassified
21. NO. OF PAGES
280
20. SECURITY CLASS (Thispage)
Unclassified
22. PRICE
EPA Form 2220-1 (9-73)
-------
INSTRUCTIONS
1. REPORT NUMBER
Insert the EPA report number as it appears on the cover of the publication.
2. LEAVE BLANK
3. RECIPIENTS ACCESSION NUMBER
Reserved for use by each report recipient.
4. TITLE AND SUBTITLE
Title should indicate clearly and briefly the subject coverage of the report, and be displayed prominently. Set subtitle, if used, in smaller
type or otherwise subordinate it to main title. When a report is prepared in more than one volume, repeat the primary title, add volume
number and include subtitle for the specific title.
5. REPORT DATE
Each report shall carry a date indicating at least month and year. Indicate the basis on which it was selected (e.g., date of issue, date of
approval, date of preparation, etc.).
6. PERFORMING ORGANIZATION CODE
Leave blank.
7. AUTHOR(S)
Give name(s) in conventional order (John R. Doe, J. Robert Doe, etc.). List author's affiliation if it differs from the performing organi-
zation.
3. PERFORMING ORGANIZATION REPORT NUMBER
Insert if performing organization wishes to assign this number.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Give name, street, city, state, and ZIP code. List no more than two levels of an organizational hirearchy.
10. PROGRAM ELEMENT NUMBER
Use the program element number under which the report was prepared. Subordinate numbers may be included in parentheses.
11. CONTRACT/GRANT NUMBER
Insert contract or grant number under which report was prepared.
12. SPONSORING AGENCY NAME AND ADDRESS
Include ZIP code.
13. TYPE OF REPORT AND PERIOD COVERED
Indicate interim final, etc., and if applicable, dates covered.
14. SPONSORING AGENCY CODE
Leave blank.
15. SUPPLEMENTARY NOTES
Enter information not included elsewhere but useful, such as: Prepared in cooperation with, Translation of, Presented at conference of,
To be published in, Supersedes, Supplements, etc.
16. ABSTRACT
Include a brief (200 words or less) factual summary of the most significant information contained in the report. If the report contains a
significant bibliography or literature survey, mention it here.
17. KEY WORDS AND DOCUMENT ANALYSIS
(a) DESCRIPTORS - Select from the Thesaurus of Engineering and Scientific Terms the proper authorized terms that identify the major
concept of the research and are sufficiently specific and precise to be used as index entries for cataloging.
(b) IDENTIFIERS AND OPEN-ENDED TERMS - Use identifiers for project names, code names, equipment designators, etc. Use open-
ended terms written in descriptor form for those subjects for which no descriptor exists.
(c) COSATI FIELD GROUP - Field and group assignments are to be taken from the 1965 COSATI Subject Category List. Since the ma-
jority of documents are multidisciplinary in nature, the Primary Field/Group assignment(s) will be specific discipline, area of human
endeavor, or type of physical object. The application(s) will be cross-referenced with secondary Field/Group assignments that will follow
the primary posting(s).
18. DISTRIBUTION STATEMENT
Denote releasability to the public or limitation for reasons other than security for example "Release Unlimited." Cite any availability to
the public, with address and price. .'
19. &20. SECURITY CLASSIFICATION
DO NOT submit classified reports to the National Technical Information service.
21. NUMBER OF PAGES
Insert the total number of pages, including this one and unnumbered pages, but exclude distribution list, if any.
22. PRICE
Insert the price set by the National Technical Information Service or the Government Printing Office, if known.
EPA Form 2220-1 (9-73) (Reverse)
------- |