EPA
United States
Environmental Protection
Agency
Health Risk and Exposure Assessment
for Ozone
Second External Review Draft
Chapter 4 Appendices
-------
DISCLAIMER
This draft document has been prepared by staff from the Risk and Benefits Group, Health
and Environmental Impacts Division, Office of Air Quality Planning and Standards, U.S.
Environmental Protection Agency. Any findings and conclusions are those of the authors and do
not necessarily reflect the views of the Agency. This draft document is being circulated to
facilitate discussion with the Clean Air Scientific Advisory Committee to inform the EPA's
consideration of the ozone National Ambient Air Quality Standards.
This information is distributed for the purposes of pre-dissemination peer review under
applicable information quality guidelines. It has not been formally disseminated by EPA. It
does not represent and should not be construed to represent any Agency determination or policy.
Questions related to this preliminary draft document should be addressed to Dr. Bryan
Hubbell, U.S. Environmental Protection Agency, Office of Air Quality Planning and Standards,
C539-07, Research Triangle Park, North Carolina 27711 (email: hubbell.bryan@epa.gov).
-------
EPA-452/P-14-004b
February 2014
Health Risk and Exposure Assessment for Ozone
Second External Review Draft
Chapter 4 Appendices
U.S. Environmental Protection Agency
Office of Air and Radiation
Office of Air Quality Planning and Standards
Health and Environmental Impacts Division
Risk and Benefits Group
Research Triangle Park, North Carolina 27711
-------
This page left intentionally blank
-------
Appendix 4-A
Ambient Air Quality Monitoring Data
1. Ambient O3 Monitoring and Air Quality Data 4
1.1 Overview of Ambient O3 Monitoring 4
1.2 Ambient O3 Concentration Data 6
1.3 Data Handling 8
2. Air Quality Inputs for the Exposure and Clinical Risk Assessments 12
2.1 Urban Case Study Areas 12
2.2 Air Quality Inputs to the Air Pollutants Exposure (APEX) Model 13
2.3 Evaluation of Air Quality Spatial Field Techniques 30
3. Air Quality Inputs for the Epidemiology-Based Risk Assessment 34
3.1 Urban Case Study Areas 34
3.2 Air Quality Inputs to the Benefits Mapping and Analysis Program (BenMAP) 36
4. References 39
-------
FIGURES
FIGURE 1: MAP OF U.S. AMBIENT O3 MONITORING SITES IN OPERATION DURING THE 2006-2010 PERIOD 6
FIGURE 2: MAP OF MONITORED 8-HOUR O3 DESIGN VALUES FOR THE 2006-2008 PERIOD 11
FIGURE 3: MAP OF MONITORED 8-HOUR O3 DESIGN VALUES FOR THE 2008-2010 PERIOD 12
FIGURE 4: NUMERICAL EXAMPLE OF THE VORONOI NEIGHBOR AVERAGING (VNA) TECHNIQUE 14
FIGURE 5: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR ATLANTA 16
FIGURE 6: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR BALTIMORE 17
FIGURE 7: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR BOSTON 18
FIGURE 8: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR CHICAGO 19
FIGURE 9: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR CLEVELAND 20
FIGURE 10: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR DALLAS 21
FIGURE 11: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR DENVER 22
FIGURE 12: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR DETROIT 23
FIGURE 13: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR HOUSTON 24
FIGURE 14: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR LOS ANGELES 25
FIGURE 15: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR NEW YORK 26
FIGURE 16: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR PHILADELPHIA 27
FIGURE 17: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR SACRAMENTO 28
FIGURE 18: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR ST. LOUIS 29
FIGURE 19: MAPS OF DESIGN VALUES AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON VNA SPATIAL FIELDS FOR WASHINGTON, D.C. 30
FIGURE 20: MAPS OF MONITORED VALUES (LEFT), NEAREST NEIGHBOR SPATIAL FIELDS (CENTER), AND VNA SPATIAL FIELDS (RIGHT) FOR SELECTED HOURS IN ATLANTA (TOP), AND PHILADELPHIA (BOTTOM) 31
FIGURE 21: DENSITY SCATTER PLOTS AND PERFORMANCE STATISTICS FOR THE CROSS-VALIDATION ANALYSIS OF NEAREST
NEIGHBOR (NN; LEFT) AND VORONOI NEIGHBOR AVERAGING (VNA; RIGHT) SPATIAL FIELD TECHNIQUES APPLIED TO MONITORED HOURLY O3
CONCENTRATIONS IN ATLANTA, 2005 32
-------
FIGURE 22: DENSITY SCATTER PLOTS AND PERFORMANCE STATISTICS FOR THE CROSS-VALIDATION ANALYSIS OF NEAREST
NEIGHBOR (NN; LEFT) AND VORONOI NEIGHBOR AVERAGING (VNA; RIGHT) SPATIAL FIELD TECHNIQUES APPLIED TO MONITORED HOURLY O3
CONCENTRATIONS IN DETROIT, 2005 33
FIGURE 23: DENSITY SCATTER PLOTS AND PERFORMANCE STATISTICS FOR THE CROSS-VALIDATION ANALYSIS OF NEAREST
NEIGHBOR (NN; LEFT) AND VORONOI NEIGHBOR AVERAGING (VNA; RIGHT) SPATIAL FIELD TECHNIQUES APPLIED TO MONITORED HOURLY O3
CONCENTRATIONS IN PHILADELPHIA, 2005 33
TABLES
TABLE 1: INFORMATION ON THE 15 URBAN CASE STUDY AREAS IN THE EXPOSURE AND CLINICAL RISK ASSESSMENTS 13
TABLE 2: INFORMATION ON THE 12 URBAN CASE STUDY AREAS IN THE EPIDEMIOLOGY-BASED RISK ASSESSMENT 35
TABLE 3: SUMMARY OF THE AIR QUALITY INPUTS TO BENMAP 38
-------
1 AMBIENT O3 MONITORING AND AIR QUALITY DATA
This section provides a brief overview of ambient O3 monitoring in the U.S. (Section 1.1), the
ambient O3 concentration data extracted for use in the risk and exposure assessments (Section 1.2), and
the data handling procedures used for determining compliance with the existing and potential alternative
standards as well as some other relevant air quality metrics (Section 1.3).
1.1 OVERVIEW OF AMBIENT O3 MONITORING
The Clean Air Act establishes air quality monitoring requirements to provide information on
ambient concentrations for six criteria pollutants, including O3, and makes provisions for the collection
of other ambient air quality measurements, such as O3 precursors. The federal regulations for ambient
air quality monitoring, including establishment and periodic assessment of local monitoring networks,
approved monitoring methods, operating schedules, and protocols for data reporting, quality assurance,
and certification, are in Part 58 of the Code of Federal Regulations.
There were over 1,300 ambient O3 monitoring sites actively operating in the U.S. in 2010. These
monitoring sites are operated by over 100 federal, state, local, and tribal agencies, and can be grouped
into one of the following networks:
1) State and Local Air Monitoring Stations (SLAMS): Monitoring sites operated by state, local, and
tribal governments for the purposes of determining compliance with the National Ambient Air
Quality Standards (NAAQS), and providing ambient air quality information to help state and
local public health agencies evaluate and implement air quality control programs. There were
over 1,100 SLAMS O3 monitoring sites operating in 2010, making up over 80% of the U.S.
ambient O3 monitoring network. There are two important subcategories of SLAMS monitors:
a. National Multi-pollutant Monitoring Network (NCore): Approximately 80 monitoring sites
(60 urban and 20 rural) operated by state and local agencies. These sites monitor six criteria
pollutants (CO, NO2, O3, SO2, PM10, and PM2.5) and other important parameters for the
purposes of assessing multi-pollutant impacts on public health, and supporting air quality
forecasting.
b. Photochemical Assessment Monitoring Stations (PAMS): Approximately 80 monitoring sites
operated by state and local agencies with EPA funding. These sites monitor O3 and its
precursors, including NO, NO2, total NOx, total reactive nitrogen (NOy), and over 60 volatile
organic compounds (VOCs) for the purposes of understanding O3 chemistry and transport,
aiding photochemical modeling, and evaluating O3 precursor emissions control strategies in
areas designated nonattainment for the O3 NAAQS. Some PAMS monitoring sites are
co-located with NCore monitoring sites.
-------
2) Clean Air Status and Trends Network (CASTNET): Approximately 80 O3 monitoring sites
operated year-round by EPA and the National Park Service (NPS) for the purpose of determining
O3 levels in national parks and other rural areas.
3) Special Purpose Monitoring Stations (SPMS): These monitoring sites are used to support various
air quality monitoring objectives, such as specific public health and welfare impacts studies,
model evaluation, or monitoring network assessments. These monitoring sites are often operated
on a temporary basis (up to 24 months), and are generally not used to determine compliance with
the NAAQS. Some of these monitoring sites may be operated by local industry or other private
interest groups.
SLAMS monitoring sites are required to monitor for O3 only during the required O3 monitoring
season, which is defined for each state in Table D-3 of 40 CFR Part 58. Many states also operate their
O3 monitors outside of the required O3 monitoring season. States that are required to operate some or all
of their O3 monitors on a year-round basis include Arizona, California, Hawaii, Louisiana, New Mexico,
and Texas. EPA regional offices may approve waivers effectively shortening the length of the required
O3 monitoring season for some individual monitoring sites (e.g., rural monitoring sites which may be
inaccessible during the winter months). CASTNET and NCore monitoring sites are typically operated
on a year-round basis.
The Federal Reference Method (FRM) for O3 measurement is the Chemiluminescence Method
(CLM). The first ultraviolet (UV) absorption photometric analyzers were approved as Federal
Equivalent Methods (FEMs) in 1977 and gained rapid acceptance for NAAQS compliance purposes due
to ease of operation, relatively low cost, and reliability. All SLAMS and CASTNET O3 monitoring sites
in the U.S. have been operating UV analyzers since 2005.
Figure 1 shows the locations of the ambient O3 monitoring sites used in the risk and exposure
assessments. Gray dots represent SLAMS monitoring sites, green dots represent CASTNET sites, blue
dots represent NCore and/or PAMS monitoring sites, and black dots represent SPMS and other
monitoring sites for which data were available.
-------
SLAMS
CASTNET
NCORE/PAMS
SPMS/OTHER
Figure 1: Map of U.S. ambient O3 monitoring sites in operation during the 2006-2010 period
1.2 AMBIENT O3 CONCENTRATION DATA
EPA's Air Quality System (AQS) database is a national repository for many types of air quality
and related monitoring data. AQS contains monitoring data for the six criteria pollutants dating back to
the 1970s, as well as more recent additions such as PM2.5 speciation, air toxics, and meteorology data.
As of 2010, over 100 federal, state, local, and tribal agencies submitted hourly O3 concentration data
collected from over 1,300 ambient monitoring sites to AQS.
Air quality monitoring data from 1,468 ambient O3 monitoring sites in the U.S. were extracted
for use in the risk and exposure assessments. The initial dataset consisted of hourly O3 concentrations in
ppb collected from these monitors between 1/1/2006 and 12/31/2010. Data for nearly 1,400 of these
monitors were extracted from the AQS database in October 2012, and the remaining data were extracted
from the CASTNET database at the same time. CASTNET monitors operated by the National Park
Service were included in the data extracted from AQS, but the CASTNET monitors operated by EPA
did not begin reporting data to AQS until 2011. Data collected from these EPA-operated CASTNET
monitors prior to 2011 did not meet EPA's quality assurance requirements for AQS, but were subject to
-------
CASTNET quality assurance criteria, and it is generally agreed that data collected from CASTNET monitors prior
to 2011 are of comparable quality to the regulatory data stored in AQS.
There were a number of subtle, yet noteworthy differences between the air quality data used in
the 1st draft REA and the air quality data used in this draft.
1. In the 1st draft REA, multiple AQS data extractions were used for the air quality inputs to various
parts of the risk and exposure assessments. In this draft, all air quality inputs were derived from
the data extraction described above.
2. In the 1st draft REA, data collected from EPA operated CASTNET sites and other non-regulatory
monitoring sites were not included in the air quality inputs to the risk and exposure assessments.
In this draft, these monitors were included, but were not used to make determinations of meeting
the existing standard or the potential alternative standards.
3. In the 1st draft REA, data collected with O3 analyzers not using federal reference or equivalent
methods were not included in the air quality inputs to the risk and exposure assessments. In this
draft, these monitors were included, but were not used to make determinations of meeting the
existing standard or the potential alternative standards.
4. In the 1st draft REA, reported hourly O3 concentrations lower than the minimum detection limit
(MDL, 5 ppb for most O3 analyzers) were replaced with a value of 1/2 MDL. This is called the
"standard sample value" for criteria pollutant data extracted from AQS. In this draft, the actual
reported sample values were used, effectively allowing concentrations down to 0 ppb.
5. In the 1st draft REA, hourly O3 concentrations flagged by the monitoring agencies and concurred
by EPA as having been affected by exceptional events were removed from the air quality inputs
to the risk and exposure assessments. In this draft, these data were included, but were not used
to make determinations of meeting the existing standard or the potential alternative standards,
which is consistent with EPA's exceptional events policy.
6. In this draft, missing data gaps of 1 or 2 hours in length were filled in using linear interpolation
(a brief sketch of this gap-filling step follows this list). These short gaps often occur at regular
intervals in the ambient data due to an EPA requirement for monitoring agencies to perform
routine quality control checks on their O3 monitors. Quality control checks are typically
performed between midnight and 6:00 AM when O3 concentrations are low. Missing data gaps
of 3 hours or more in length were not replaced, and interpolated values were not used to make
determinations of meeting the existing standard or the potential alternative standards.
7. In this draft, hourly O3 concentrations from 7 monitoring sites where multiple O3 analyzers were
operated simultaneously in the same physical location were combined to create a single hourly
site record using the highest reported hourly concentration in each hour.
8. In some instances, EPA regional offices may approve data combinations for pairs of nearby
O3 monitoring sites affected by short-distance site relocations for the purpose of making NAAQS
-------
determinations. In this draft, hourly O3 concentrations from 12 such pairs of monitoring sites
were combined to create a single hourly site record for each pair.
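To illustrate the gap-filling step in item 6 above, the following is a minimal R sketch, not the code actually used for the REA, of how runs of 1 or 2 consecutive missing hourly values can be filled by linear interpolation while longer gaps are left missing. The function name and example values are hypothetical.

    # Illustrative sketch: fill runs of 1-2 consecutive missing hourly O3 values
    # by linear interpolation; longer gaps are left as NA.  'conc' is a numeric
    # vector of hourly concentrations in time order for a single monitor.
    fill_short_gaps <- function(conc, max_gap = 2) {
      n <- length(conc)
      filled <- conc
      i <- 1
      while (i <= n) {
        if (is.na(filled[i])) {
          j <- i
          while (j <= n && is.na(filled[j])) j <- j + 1   # find the end of the missing run
          gap_len <- j - i
          # interpolate only short gaps that are bounded by valid hours on both sides
          if (gap_len <= max_gap && i > 1 && j <= n) {
            filled[i:(j - 1)] <- approx(x = c(i - 1, j),
                                        y = c(filled[i - 1], filled[j]),
                                        xout = i:(j - 1))$y
          }
          i <- j
        } else {
          i <- i + 1
        }
      }
      filled
    }

    # Example: the 2-hour gap is filled (42, 44); the 3-hour gap stays missing
    fill_short_gaps(c(40, NA, NA, 46, NA, NA, NA, 38))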
1.3 DATA HANDLING
To determine whether or not the NAAQS have been met at an ambient air quality monitoring
site, a statistic commonly referred to as a "design value" must be calculated based on 3 consecutive
years of data collected from that site. The form of the existing O3 NAAQS design value statistic is the
3-year average of the annual 4th highest daily maximum 8-hour O3 concentration in ppb, with all decimal
digits truncated. The existing O3 NAAQS are met at an ambient monitoring site when the design value
is less than or equal to 75 ppb. The data handling protocols for calculating design values for the existing
O3 NAAQS are in 40 CFR Part 50, Appendix P. In counties or other geographic areas with multiple
monitors, the area-wide design value is defined as the design value at the highest individual monitoring
site, and the area is said to have met the NAAQS if all monitors in the area are meeting the NAAQS.
The initial hourly O3 concentration dataset was split into two design value periods, 2006-2008 and 2008-
2010, and subsequent analyses were conducted independently for these two periods.
The following daily summary statistics were calculated from the hourly O3 concentrations:
1. Daily Maximum 8-hour Average (MDA8) Concentration: There are 24 consecutive 8-hour
periods in each day (midnight - 8:00 AM, 1:00 AM - 9:00 AM, ..., 11:00 PM - 7:00 AM).
Rolling 8-hour averages were calculated for each period with 6, 7, or 8 hours of data available,
using 6, 7, or 8 as the divisor, respectively, and 8-hour periods with fewer than 6 hours of data
available were not used. The 8-hour average values were stored in the 1st, or start, hour of the 8-
hour period. The MDA8 value is the highest of the 8-hour average values for each day, and the
MDA8 values for two consecutive days may have some hours in common. MDA8 values were
considered to be valid if there were sufficient data available to calculate at least 18 of 24 possible
8-hour averages, or, if used for design value calculations, if the MDA8 value is greater than the
level of the standard. This is the daily metric used in design values and the Smith et al, 2009
short-term mortality study.
2. Daily 10:00 AM - 6:00 PM Mean Concentration: This is the rolling 8-hour average value as
defined above for the 8-hour period starting at 10:00 AM. This is the daily metric used in the
Zanobetti & Schwartz, 2008 short-term mortality study.
3. Daily Maximum 1-hour Concentration: This is the highest hourly O3 concentration reported
during a given day. Daily Maximum 1-hour values were considered to be valid if there were at
least 18 hourly O3 concentrations reported in a given day. This is the daily metric used in the
Jerrett et al, 2009 long-term mortality study.
4. Daily 24-hour Average Concentration: This is the simple arithmetic average of the hourly O3
concentrations reported during a given day. Daily 24-hour average values were considered to be
-------
valid if there were at least 18 hourly O3 concentrations reported in a given day. This is the daily
metric used in the Bell et al, 2004 short-term mortality study. (A brief sketch illustrating these
four daily metrics follows this list.)
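The sketch below illustrates the four daily metrics for a single monitor-day in R. It is an illustration only: it applies just the data completeness criteria (not the "greater than the level of the standard" exception used for design value calculations), the function name is hypothetical, and hourly values are assumed to be ordered starting from the hour beginning at midnight.

    # Illustrative sketch of the four daily metrics for one monitor-day.
    # 'hrs': 24 hourly O3 values (ppb) for the day, ordered from the hour beginning
    # at midnight; 'next_hrs': the first 7 hours of the following day, since 8-hour
    # periods starting after 4:00 PM extend into the next day.
    daily_metrics <- function(hrs, next_hrs) {
      conc <- c(hrs, next_hrs[1:7])
      # 24 rolling 8-hour averages, each requiring at least 6 of 8 hours of data
      avg8 <- sapply(1:24, function(start) {
        window <- conc[start:(start + 7)]
        if (sum(!is.na(window)) >= 6) mean(window, na.rm = TRUE) else NA_real_
      })
      list(
        mda8          = if (sum(!is.na(avg8)) >= 18) max(avg8, na.rm = TRUE) else NA_real_,  # metric 1
        mean_10am_6pm = avg8[11],                                   # metric 2: 8-hour period starting 10:00 AM
        max_1hr       = if (sum(!is.na(hrs)) >= 18) max(hrs, na.rm = TRUE) else NA_real_,    # metric 3
        avg_24hr      = if (sum(!is.na(hrs)) >= 18) mean(hrs, na.rm = TRUE) else NA_real_    # metric 4
      )
    }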
The daily summary statistics described above were then used to calculate the following annual
summary statistics:
1. Annual 4th Highest MDA8 Concentration: This is the 4th highest valid MDA8 value measured at
a given monitoring site in a given year. The 4th highest MDA8 values were considered to be
valid if there were valid MDA8 values available for at least 75% of the days in the required O3
monitoring season, or, if used for design value calculations, if the 4th highest MDA8 value was
greater than the level of the standard. Data collected outside of the required O3 monitoring
season were used in determining the 4th highest MDA8 value, but were not used in determining
validity.
2. May - September Average MDA8: This is the average of all available valid MDA8 values at a
given monitoring site during the May - September period (153 days). The May - September
average MDA8 values were considered to be valid if valid MDA8 values were available for at
least 114 days (75%) in May - September of a given year. Three-year averages of these values
were calculated for the 2006-2008 period and used to create the May - September average
MDA8 national fused air quality surface described in Appendix 4c. This surface was then used
in the national-scale risk assessment based on Smith et al, 2009 described in Chapter 8. (A brief
sketch of this seasonal-average completeness check follows this list.)
3. June - August Average Daily 10:00 AM - 6:00 PM Mean: This is the average of all available
daily 10:00 AM - 6:00 PM mean values at a given monitoring site during the June - August
period (92 days). The June - August average daily mean values were considered to be valid if
daily 10:00 AM - 6:00 PM mean values were available for at least 70 days (75%) in June -
August of a given year. Three-year averages of these values were calculated for the 2006-2008
period and used to create the June - August average daily 10:00 AM - 6:00 PM mean national
fused air quality surface described in Appendix 4c. This surface was then used in the national-
scale risk assessment based on Zanobetti & Schwartz, 2008 described in Chapter 8.
4. April - September Average Daily Maximum 1-hour Concentration: This is the average of all
available valid daily maximum 1-hour values at a given monitoring site during the April -
September period. The April - September average daily maximum 1-hour values were
considered to be valid if valid daily maximum 1-hour values were available for at least 137 days
(75%) in April - September of a given year. Three-year averages of these values were calculated
for the 2006-2008 period and used to create the April - September average daily maximum 1-
hour national fused air quality surface described in Appendix 4c. This surface was then used in
the national-scale risk assessment based on Jerrett et al, 2009 described in Chapter 8.
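As one concrete example of the completeness criteria applied to these annual statistics, the following is a minimal R sketch of the May - September seasonal average MDA8 (annual statistic 2 above); the function name is hypothetical.

    # Average the valid MDA8 values over the 153-day May - September period,
    # requiring at least 114 valid days (75%); otherwise the annual value is invalid.
    seasonal_avg_mda8 <- function(mda8_may_sep) {
      stopifnot(length(mda8_may_sep) == 153)    # one value per day; NA marks an invalid day
      if (sum(!is.na(mda8_may_sep)) >= 114) mean(mda8_may_sep, na.rm = TRUE) else NA_real_
    }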
The design value statistic for the existing O3 standard is the 3-year average of the annual 4th
highest MDA8 values described above. Design values greater than the level of the existing standard (76
-------
ppb or higher) are automatically valid. Design values less than or equal to the level of the existing
standard must be based on 3 valid 4th highest MDA8 values, with the additional requirement that the
3-year average of the annual data completeness statistics (percent of valid MDA8 values within the
required O3 monitoring season) must be at least 90%. These same criteria were chosen to determine
design values for the potential alternative standards. One implication of this choice is that, in some
cases, a monitoring site may have different design values based on the level of the standard (a sketch
of this design value logic follows the list below). This may occur for one of two reasons:
1. A monitoring site with insufficient data to determine a design value at a higher standard level may
have sufficient data to show a violation at a lower standard level. For example, if a monitoring
site has a 3-year average annual 4th highest MDA8 value of 72 ppb, but does not meet the data
completeness criteria described above, then the site has a design value of 72 ppb for the potential
alternative standards of 70 ppb, 65 ppb, and 60 ppb, but does not have a design value for the
existing standard of 75 ppb.
2. EPA's current practice is to use only "valid" MDA8 values when determining the annual 4th
highest MDA8 value. A valid MDA8 value is either based on at least 18 of 24 possible 8-hour
average values, or it is greater than the level of the standard. This can cause the design value to
change based on the level of the standard, due to whether certain days are considered valid or
not. For example, suppose the five highest MDA8 values at a particular monitoring site for a
given year are 78 ppb, 76 ppb, 75 ppb, 74 ppb, and 70 ppb, but the 4th highest value is based on
only 12 valid 8-hour averages. Then for the existing standard, the 4th highest MDA8 value is 70
ppb, but for the 70 ppb alternative standard, the 4th highest MDA8 value is 74 ppb.
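The two cases above can be made concrete with a short R sketch of the design value logic. This is a simplified illustration of the criteria described in this section, not EPA's official data handling code; the function name is hypothetical and the inputs are assumed to be non-missing.

    # 'fourth_highs': annual 4th highest MDA8 values (ppb) for 3 consecutive years.
    # 'pct_complete': percent of valid MDA8 days in the required O3 monitoring season
    # for each of those years.  'level': the standard level being evaluated.
    design_value <- function(fourth_highs, pct_complete, level = 75) {
      stopifnot(length(fourth_highs) == 3, length(pct_complete) == 3)
      dv <- floor(mean(fourth_highs))          # 3-year average with decimal digits truncated
      if (dv > level) return(dv)               # values above the standard level are automatically valid
      # otherwise require three valid years and a 3-year average completeness of at least 90%
      year_valid <- (pct_complete >= 75) | (fourth_highs > level)
      if (all(year_valid) && mean(pct_complete) >= 90) dv else NA_real_
    }

    # Reason 1 above: a 72 ppb 3-year average with incomplete data has no design value
    # for the 75 ppb standard, but shows a violation of a 70 ppb alternative standard.
    design_value(c(73, 72, 71), c(80, 85, 70), level = 75)   # NA
    design_value(c(73, 72, 71), c(80, 85, 70), level = 70)   # 72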
Figure 2 and Figure 3 show the design values for the existing O3 NAAQS for all regulatory
monitoring sites in the U.S. for the 2006-2008 and 2008-2010 periods, respectively. In general, O3
design values were lower in 2008-2010 than in 2006-2008, especially in the Eastern U.S. There were
518 O3 monitors in the U.S. with design values above the existing standard in 2006-2008, compared to
only 179 in 2008-2010.
10
-------
[Figure 2 map legend] 8-Hour Ozone Design Values, 2006-2008: 33-60 ppb (49 sites); 61-65 ppb (65 sites); 66-70 ppb (140 sites); 71-75 ppb (279 sites); 76-120 ppb (518 sites). Insets show Alaska, Hawaii, and Puerto Rico.
Figure 2: Map of monitored 8-hour O3 design values for the 2006-2008 period
11
-------
[Figure 3 map legend] 8-Hour Ozone Design Values, 2008-2010: 33-60 ppb (79 sites); 61-65 ppb (165 sites); 66-70 ppb (305 sites); 71-75 ppb (300 sites); 76-120 ppb (179 sites).
Figure 3: Map of monitored 8-hour O3 design values for the 2008-2010 period
2. AIR QUALITY INPUTS FOR THE EXPOSURE AND CLINICAL RISK
ASSESSMENTS
This section describes the 15 urban case study areas used in the exposure and clinical risk
assessments described in Chapters 5 and 6 (Section 2.1), the hourly O3 concentration spatial fields used
as inputs to the Air Pollutants Exposure Model (APEX; Section 2.2), and some methods evaluations
supporting the change from nearest neighbor to the Voronoi Neighbor Averaging (VNA) technique for
generating the spatial fields (Section 2.3).
2.1 URBAN CASE STUDY AREAS
The 15 urban case study areas in the exposure (Chapter 5) and clinical risk (Chapter 6)
assessments covered a large spatial extent, with boundaries generally similar to those covered by the
respective Combined Statistical Areas (CSA) defined by the U.S. Census Bureau. Table 1 gives some
basic information about the 15 urban case study areas in the exposure and clinical risk assessments,
12
-------
including the number of counties, number of ambient O3 monitoring sites, the required O3 monitoring
season, and the 2006-2008 and 2008-2010 design values. All 15 urban case study areas had 8-hour O3
design values above the existing standard in 2006-2008, while 13 areas had 8-hour O3 design values
above the existing standard in 2008-2010. Chicago (74 ppb) and Detroit (75 ppb) had design values
meeting the existing standard during the 2008-2010 period. The design values in the 15 urban areas
decreased by an average of 6 ppb between 2006-2008 and 2008-2010, ranging from no change in
Sacramento to a decrease of 15 ppb in Atlanta.
Area Name | # of Counties | # of O3 Monitors | Population (2010) | Required O3 Monitoring Season | 2006-2008 DV (ppb) | 2008-2010 DV (ppb)
Atlanta | 33 | 13 | 5,618,431 | March - October | 95 | 80
Baltimore | 7 | 7 | 2,710,489 | April - October | 91 | 89
Boston | 10 | 14 | 5,723,468 | April - September | 83 | 77
Chicago | 16 | 26 | 9,686,021 | April - October | 78 | 74
Cleveland | 8 | 13 | 2,881,937 | April - October | 82 | 77
Dallas | 11 | 20 | 6,366,542 | January - December | 89 | 86
Denver | 13 | 26 | 3,390,504 | March - September | 86 | 77
Detroit | 9 | 12 | 5,218,852 | April - September | 81 | 75
Houston | 10 | 22 | 5,946,800 | January - December | 91 | 85
Los Angeles | 5 | 54 | 17,877,006 | January - December | 119 | 112
New York | 27 | 31 | 21,056,173 | April - October | 90 | 84
Philadelphia | 15 | 19 | 7,070,622 | April - October | 92 | 83
Sacramento | 7 | 26 | 2,755,972 | January - December | 102 | 102
St. Louis | 17 | 17 | 2,837,592 | April - October | 85 | 77
Washington | 26 | 22 | 5,838,518 | April - October | 87 | 81
Table 1: Information on the 15 Urban Case Study Areas in the Exposure and Clinical Risk
Assessments
2.2 AIR QUALITY INPUTS TO THE AIR POLLUTANTS EXPOSURE (APEX) MODEL
The Air Pollutants Exposure Model (APEX) described in Chapter 5 requires spatial fields of air
quality concentrations with no missing values as inputs. In the 1st draft REA, the spatial fields were
generated using hourly O3 concentrations from the nearest available monitoring site for each census tract
in four of the 15 urban case study areas (Atlanta, Denver, Los Angeles, and New York). In this draft,
the spatial fields were generated with hourly O3 concentrations interpolated using the Voronoi Neighbor
Averaging (VNA; Gold, 1997; Chen et al, 2004) technique described below for each census tract in the
15 urban case study areas. The following paragraphs provide a numerical example of VNA used to
estimate an O3 concentration value for census tract "E" in Figure 4 below.
The first step in the VNA technique is to identify the set of nearest monitors for each census
tract. The left-hand panel of Figure 4 presents a numerical example with nine census tracts (squares)
and seven monitoring sites (stars), with the focus on identifying the set of nearest neighboring sites to
13
-------
census tract "E" in the center of the panel. The Delaunay triangulation algorithm identifies the set of
nearest neighboring monitors by drawing a set of polygons called the "Voronoi diagram" around the
census tract "E" centroid and each of the monitoring sites. Voronoi diagrams have the special property
that each edge of each polygon is equidistant from the two nearest points, as shown in the
right-hand panel of Figure 4.
[Figure 4 diagram] Nine census tracts (squares) and seven monitoring sites (stars), with census tract "E" at the center. The four Voronoi neighbor monitors lie 10 miles (80 ppb), 15 miles (90 ppb), 15 miles (60 ppb), and 20 miles (100 ppb) from the census tract "E" centroid; the right-hand panel shows the corresponding Voronoi diagram.
Figure 4: Numerical example of the Voronoi Neighbor Averaging (VNA) technique
The VNA technique then chooses the monitoring sites whose Voronoi polygons share a boundary with
the polygon surrounding the census tract "E" centroid. These monitors are the "Voronoi neighbors", which are used to estimate the
concentration value for census tract "E". The VNA estimate of the concentration value in census tract
"E" is the inverse distance squared weighted average of the four monitored concentrations. The further
the monitor is from the center of census tract "E", the smaller the weight. For example, the weight for
the monitor in census tract "D" 10 miles from the census tract "E" centroid is calculated as follows:
(1/10^2) / (1/10^2 + 1/15^2 + 1/15^2 + 1/20^2) = 0.4675     (Equation 1)
The weights for the other monitors are calculated in a similar fashion. The final VNA estimate
for census tract "E" is calculated as follows:
VNA(E) = 0.4675 * 80 + 0.2078 * 90 + 0.2078 * 60 + 0.1169 * 100 = 80.3 ppb     (Equation 2)
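The calculation in Equations 1 and 2 can be reproduced with a few lines of R. This is an illustration of the inverse distance squared weighting only, using the distances and concentrations shown in Figure 4; it is not the code used to generate the spatial fields, and the object names are hypothetical.

    # Distances (miles) from the tract "E" centroid to its four Voronoi neighbors and
    # the concentrations (ppb) reported at those monitors; the first entry is the
    # monitor in census tract "D".
    dist_mi <- c(10, 15, 15, 20)
    conc_ppb <- c(80, 90, 60, 100)
    w <- (1 / dist_mi^2) / sum(1 / dist_mi^2)   # Equation 1, applied to all four neighbors
    round(w, 4)                                 # 0.4675 0.2078 0.2078 0.1169
    sum(w * conc_ppb)                           # Equation 2: VNA(E) of about 80.3 ppb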
The observed hourly O3 concentrations in the 15 urban case study areas were used to calculate
VNA estimates for approximately 24,935 census tracts x 43,824 hours, or roughly 1.1 billion values. The actual
number of values was lower than this, because values were not calculated for hours outside the required
14
-------
monitoring season. However, the same VNA procedure was also used to create hourly spatial surfaces
based on air quality adjusted to meet the existing standard of 75 ppb, and air quality adjusted to meet the
potential alternative standards of 70 ppb, 65 ppb, 60 ppb, and 55 ppb, which effectively increased the
total number of VNA estimates by roughly a factor of 5. The computations were executed using the R
statistical computing program (R Core Team, 2012), with the Delaunay triangulation algorithm implemented in the
"deldir" package (Turner, 2012).
Figure 5 through Figure 19 show maps of the 2006-2008 and 2008-2010 design values and May-
September "seasonal" average MDA8 values in the 15 urban case study areas based on the VNA spatial
fields. The top panels in each figure show the design values, while the bottom panels show the seasonal
average values. The left-hand panels in each figure show the 2006-2008 values, while the right-hand
panels show the 2008-2010 values. The colored circles in each panel represent census tract centroids,
and the colored squares represent monitoring sites. In each panel, counties colored pink indicate the
study area boundaries used in the Zanobetti & Schwartz (2008) and/or Smith et al (2009b) epidemiology
studies1, where applicable. Counties colored gray indicate additional counties within the CBSA
boundaries, and counties colored peach indicate any additional counties included in the exposure and
lung function risk assessments. Note that the maps show some monitors outside of the study area
boundaries. This is because the VNA surfaces were generated using data from monitors within a 50 km
radius of the study area boundaries, in addition to the monitors inside the study areas.
1 The Zanobetti and Schwartz (2008) and Smith et al (2009) study area boundaries were identical for 6 of the 12
urban case study areas, and had at least one county in common for all 12 urban case study areas. The counties colored pink
in figures 5 through 19 refer to counties included in either of those two studies.
15
-------
[Figures 5-19 panel color scales: 2006-2008 and 2008-2010 Design Values (ppb), 50 to 100; 2006-2008 and 2008-2010 Seasonal Averages (ppb), 30 to 70]
Figure 5 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Atlanta.
16
-------
Figure 6 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Baltimore.
17
-------
Figure 7 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Boston.
18
-------
Figure 8 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Chicago.
19
-------
Figure 9 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Cleveland.
20
-------
Figure 10 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Dallas.
21
-------
Figure 11 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Denver.
22
-------
Figure 12 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Detroit.
23
-------
Figure 13 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Houston.
24
-------
Figure 14 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Los Angeles.
25
-------
Figure 15 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for New York.
26
-------
Figure 16 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Philadelphia.
27
-------
Figure 17 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Sacramento.
28
-------
Figure 18 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for St. Louis.
29
-------
Figure 19 Maps of design values and May - September average MDA8 values based on VNA
spatial fields for Washington, D.C.
2.3 EVALUATION OF AIR QUALITY SPATIAL FIELD TECHNIQUES
As mentioned previously, in the 1st draft REA the air quality spatial fields used as inputs to
APEX were based on hourly O3 concentrations from the nearest monitoring site, while in this draft the
spatial fields were based on the VNA technique described in the previous section. This section presents
30
-------
an evaluation comparing the relative accuracy of the nearest neighbor (NN) and VNA techniques for
generating spatial fields of hourly O3 concentrations.
The evaluations were conducted over 4 km gridded domains in Atlanta, Detroit, and Philadelphia
using monitored hourly O3 concentrations from 2005. Due to potential discrepancies in the availability
of data, only data collected within the required O3 monitoring seasons for each area (March - October
for Atlanta; April - September for Detroit; April - October for Philadelphia) were considered. Figure
20 below shows maps of the monitored values (AQS, left), NN values (center), and VNA values (right)
in Atlanta (top) and Philadelphia (bottom) for a single selected hour in each area. These maps show an
extreme example of the differences in the NN and VNA spatial fields. The NN fields (center column)
have very sharp breaks between air quality monitors, while the VNA fields (right column) tend to
produce much smoother surfaces. The NN fields also have a tendency to spread concentrations out over
a large area if the monitoring network is sparse. For example, the highest concentration at the monitor
in southern Atlanta is spread out over a large area in the NN fields, but this effect is somewhat mitigated
in the VNA fields.
[Figure 20 panels] AQS (monitored), NN, and VNA values for 07/26/2005 14:00 in Atlanta (top row) and 06/26/2005 14:00 in Philadelphia (bottom row); color scale 0 to 140 ppb.
Figure 20: Maps of monitored values (left), nearest neighbor spatial fields (center), and VNA
spatial fields (right) for selected hours in Atlanta (top) and Philadelphia (bottom)
31
-------
A cross-validation analysis was performed to evaluate the relative accuracy of the estimates from
these two methods. For each hour in the required O3 season, the concentrations from each monitor in
the 4 km gridded domain were withheld (one at a time) from the input dataset, and the concentrations
from the remaining monitors were used to estimate the concentration at the monitoring site that was
withheld using NN and VNA. Additional monitoring sites within a 50 km radius of the 4 km gridded
domain were used in the estimates, but were not included in the set of monitors that were withheld from
the analysis. The estimated values were then compared to the respective monitored concentrations that
were withheld, and the relative accuracy of NN and VNA was assessed using three summary statistics:
the coefficient of determination (R2), the mean bias, and the root mean squared error (RMSE).
Figure 21 (Atlanta), Figure 22 (Detroit), and Figure 23 (Philadelphia) show density scatter plots
and performance statistics for the NN and VNA spatial fields based on the cross-validation analysis
described above for the three urban areas. Each figure shows the monitored hourly O3 concentrations in
ppb (AQS; x-axis) from the monitors which were withheld from the analysis, versus the respective
values estimated using the NN (left panel) and VNA (right panel) techniques. In each panel, the plot
region was split into 2 ppb x 2 ppb squares, with the colors indicating the number of points falling into
each square. The diagonal line in each panel is a one-to-one reference line, and performance statistics
for each method are included in the upper left-hand corner. High R2 values, low mean bias values, and
low RMSE values are indicative of good performance.
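For reference, the three performance statistics can be computed from the withheld observations and the corresponding estimates with a short R function. This sketch treats R-square as the squared Pearson correlation between observations and estimates, which is one common convention but is an assumption here, and the function name is hypothetical.

    # 'obs': withheld monitored hourly concentrations; 'est': the NN or VNA estimates
    # at the withheld sites for the same hours.
    cv_stats <- function(obs, est) {
      ok  <- !is.na(obs) & !is.na(est)               # use only paired, non-missing hours
      obs <- obs[ok]; est <- est[ok]
      c(r_squared = cor(obs, est)^2,
        mean_bias = mean(est - obs),
        rmse      = sqrt(mean((est - obs)^2)))
    }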
[Figure 21 statistics] AQS vs. NN: R-square = 0.77, Mean Bias = -0.08, RMSE = 10.78. AQS vs. VNA: R-square = 0.839, Mean Bias = -0.52, RMSE = 8.63.
Figure 21: Density scatter plots and performance statistics for the cross-validation analysis of
nearest neighbor (NN; left) and Voronoi Neighbor Averaging (VNA; right) spatial
field techniques applied to monitored hourly O3 concentrations in Atlanta, 2005.
32
-------
[Figure 22 statistics] AQS vs. NN: R-square = 0.811, Mean Bias = -0.6, RMSE = 8.85. AQS vs. VNA: R-square = 0.853, Mean Bias = -0.42, RMSE = 7.56.
Figure 22: Density scatter plots and performance statistics for the cross-validation analysis of
nearest neighbor (NN; left) and Voronoi Neighbor Averaging (VNA; right) spatial
field techniques applied to monitored hourly O3 concentrations in Detroit, 2005.
[Figure 23 statistics] AQS vs. NN: R-square = 0.798, Mean Bias = 0.08, RMSE = 9.48. AQS vs. VNA: R-square = 0.868, Mean Bias = -0.01, RMSE = 7.5.
Figure 23: Density scatter plots and performance statistics for the cross-validation analysis of
nearest neighbor (NN; left) and Voronoi Neighbor Averaging (VNA; right) spatial
field techniques applied to monitored hourly O3 concentrations in Philadelphia,
2005.
Both techniques had relatively low mean bias statistics in all three urban areas (< 1 ppb in all
cases). However, the VNA estimates had higher R2 statistics and lower RMSE statistics than the NN
estimates in all three urban areas, both of which indicate that the VNA technique consistently produces
more accurate estimates than NN. This is also reflected by the fact that there is generally less scatter in
33
-------
the density plots based on the VNA estimates than those based on the NN estimates. From a physical
perspective, the VNA technique is also more appealing since it does not produce sharp breakpoints in
the estimated values between adjacent monitors when the reported concentrations are different.
3 AIR QUALITY INPUTS FOR THE EPIDEMIOLOGY-BASED RISK
ASSESSMENT
This section describes the 12 urban case study areas for the epidemiology-based risk assessment
described in Chapter 7 (Section 3.1), and the spatially averaged "composite monitor" values used as
inputs to the Benefits Mapping and Analysis Program (BenMAP; Section 3.2).
3.1 URBAN CASE STUDY AREAS
Three distinct sets of boundaries were considered for the 12 urban case study areas in the
epidemiology-based risk assessment:
1. Core Based Statistical Areas2 ("CBSA" boundaries)
2. Area boundaries defined in the Zanobetti & Schwartz, 2008 study ("Z & S" boundaries)
3. Area boundaries defined in Smith et al, 2009 study ("Smith" boundaries)
In the 1st draft REA, the short-term mortality risk estimates were based on the concentration
response functions estimated in the Zanobetti & Schwartz, 2008 study using the Z & S boundaries. In
this draft, the primary set of short-term and long-term mortality risk estimates described in Chapter 7 is
based on the concentration-response functions estimated in the Smith et al, 2009 study using the CBSA
boundaries. The other two sets of boundaries are used in sensitivity analyses. The first sensitivity
analysis uses the Z&S boundaries to assess the impact of changing from the quadratic rollback method
used to adjust air quality in the 1st draft REA to the HDDM adjustment method described in Chapter 4
and Appendix 4d of this draft. The second sensitivity analysis uses the Smith boundaries to assess the
impact of pairing air quality information based on the CBSA boundaries with the concentration-response
functions which were derived from air quality information based on the Smith boundaries.
Table 2 gives some basic information about the 12 urban case study areas in the epidemiology-
based risk assessment for each set of boundaries. The Z&S and Smith areas were generally
smaller and more focused on the urban population centers than the CBSAs. The Z&S and Smith areas
were identical in 6 of the 12 urban case study areas, and had at least one county in common for all 12
study areas. The CBSAs were typically smaller than the respective study areas described in Section 2.1
for the exposure and clinical risk assessments, with the exceptions of Baltimore, Dallas, and Houston,
2 Core Based Statistical Areas (CBSAs) are used by the Office of Management and Budget (OMB) to group U.S.
counties into urbanized areas. These groupings are updated by OMB every 5 years. The CBSAs used in the epidemiology
based risk assessment are based on the OMB delineations from 2008. For more information see:
http://www.whitehouse.gov/sites/default/files/omb/assets/bulletins/b10-02.pdf
34
-------
where the areas were identical. The final two columns of Table 2 show the annual 4th highest MDA8
values in ppb based on monitors within the three sets of boundaries for 2007 and 2009, the two years
upon which the risk estimates in Chapter 7 are focused.
Area Name | Boundary Definition | # of Counties | # of O3 Monitors | Population (2010) | 2007 4th high (ppb) | 2009 4th high (ppb)
Atlanta | CBSA | 28 | 13 | 5,268,860 | 102 | 77
Atlanta | Z&S | 4 | 5 | 3,105,873 | 98 | 77
Atlanta | Smith | 2 | 3 | 1,612,474 | 98 | 77
Baltimore | CBSA | 7 | 7 | 2,710,489 | 92 | 83
Baltimore | Z&S | 2 | 3 | 1,425,990 | 83 | 71
Baltimore | Smith | 1 | 1 | 620,961 | 73 | 66
Boston | CBSA | 7 | 11 | 4,552,402 | 89 | 75
Boston | Z&S | 3 | 5 | 2,895,958 | 88 | 75
Boston | Smith | 1 | 2 | 722,023 | 72 | 75
Cleveland | CBSA | 5 | 10 | 2,077,240 | 83 | 72
Cleveland | Z&S | 1 | 4 | 1,280,122 | 83 | 71
Cleveland | Smith | 1 | 4 | 1,280,122 | 83 | 71
Denver | CBSA | 10 | 16 | 2,543,482 | 97 | 79
Denver | Z&S | 1 | O | 600,158 | 76 | 63
Denver | Smith | 3 | 6 | 1,613,764 | 76 | 72
Detroit | CBSA | 6 | 8 | 4,296,250 | 93 | 73
Detroit | Z&S | 1 | 4 | 1,820,584 | 92 | 73
Detroit | Smith | 1 | 4 | 1,820,584 | 92 | 73
Houston | CBSA | 10 | 22 | 5,946,800 | 90 | 91
Houston | Z&S | 1 | 17 | 4,092,459 | 90 | 86
Houston | Smith | 1 | 17 | 4,092,459 | 90 | 86
Los Angeles | CBSA | 2 | 21 | 12,828,837 | 105 | 108
Los Angeles | Z&S | 1 | 17 | 9,818,605 | 105 | 108
Los Angeles | Smith | 1 | 17 | 9,818,605 | 105 | 108
New York | CBSA | 23 | 22 | 18,897,109 | 94 | 81
New York | Z&S | 5 | 6 | 8,175,133 | 83 | 78
New York | Smith | 6 | 7 | 9,124,246 | 94 | 78
Philadelphia | CBSA | 11 | 15 | 5,965,343 | 102 | 74
Philadelphia | Z&S | 1 | 4 | 1,526,006 | 95 | 72
Philadelphia | Smith | 1 | 4 | 1,526,006 | 95 | 72
Sacramento | CBSA | 4 | 17 | 2,149,127 | 93 | 96
Sacramento | Z&S | 1 | 8 | 1,418,788 | 90 | 96
Sacramento | Smith | 1 | 8 | 1,418,788 | 90 | 96
St. Louis | CBSA | 16 | 17 | 2,812,896 | 94 | 74
St. Louis | Z&S | 2 | 5 | 1,318,248 | 94 | 70
St. Louis | Smith | 1 | 2 | 319,294 | 91 | 65
Table 2: Information on the 12 Urban Case Study Areas in the Epidemiology-Based Risk
Assessment
35
-------
Since O3 is not emitted directly but formed indirectly through photochemical reactions, precursor
emissions may continue to react and form O3 downwind of emissions sources, and thus the highest O3
concentrations are often measured downwind of the highest concentration of precursor emissions in the
urban population center. This phenomenon is reflected in Table 2, where the highest monitored value in
the CBSA occurs outside of the Smith boundaries in 9 of 12 areas in 2007, and in 7 of 12 areas in 2009.
In addition, there were some instances where the highest monitored O3 concentrations occurred outside
of the CBSAs, but within the respective study areas used in the exposure and clinical risk assessments,
which were designed to always include the monitor associated with the highest area-wide design value.
For example, in Los Angeles, the CBSA includes Los Angeles County, CA and Orange County, CA, but
the highest O3 concentrations are typically measured further downwind in Riverside County, CA and
San Bernardino County, CA.
3.2 AIR QUALITY INPUTS TO THE BENEFITS MAPPING AND ANALYSIS PROGRAM
(BENMAP)
The air quality monitoring data used as inputs to the Benefits Mapping and Analysis Program
(BenMAP) were daily time-series of spatially averaged "composite monitor" values for the 12 urban
case study areas based on monitored air quality data. These composite monitor values were calculated
by taking hour-by-hour spatial averages of the hourly O3 concentrations for all monitors within a given
study area, then calculating the four daily metrics described in Section 1.3 (a brief sketch of this
calculation follows the list below):
1. Daily Maximum 8-hour Average (MDA8) Concentration
2. Daily 10:00 AM - 6:00 PM Mean Concentration
3. Daily Maximum 1-hour Concentration
4. Daily 24-hour Average Concentration
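The following is a minimal R sketch of the composite monitor calculation referenced above; the matrix layout and function names are hypothetical, and the daily 24-hour average is shown as one example of the four metrics.

    # 'conc' is a matrix of hourly O3 (ppb): rows are hours in time order, columns are
    # the monitors within the study area boundary.
    composite_hourly <- function(conc) {
      rowMeans(conc, na.rm = TRUE)                   # hour-by-hour spatial average
    }

    # Daily 24-hour averages of the composite series (metric 4), assuming the series
    # starts at midnight and covers whole days.
    composite_daily_avg24 <- function(conc) {
      hourly <- composite_hourly(conc)
      day    <- rep(seq_len(length(hourly) / 24), each = 24)
      tapply(hourly, day, mean, na.rm = TRUE)
    }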
These four daily metrics were calculated based on each of the three sets of urban case study area
boundaries listed in the previous section, for a total of 12 daily metric/area boundary combinations.
Although these 12 values were provided based on available monitoring data for each day in the 2006-
2010 period, only a subset of these data were used for the air quality inputs to BenMAP. Since the
BenMAP software is designed to run for only one year at a time, we chose to focus on air quality data
from 2007 and 2009. In most cases, the 2007 data were meant to represent a year with high O3 levels,
while 2009 was meant to represent a year with low O3 levels. In some areas, the data were also subset to
a particular "O3 season", either to match the period used in the respective epidemiology study, or to
avoid the potential disparity in data availability which could arise if the air quality data included months
outside of the required O3 monitoring season. Appendix 4d contains "box and whisker" plots showing
the distribution of composite monitor values in each area for current air quality, air quality adjusted to
meet the existing O3 standard, and air quality adjusted to meet the potential alternative standards of 70
36
-------
ppb, 65 ppb, and 60 ppb. These plots are stratified to show the effects of varying spatial extents (CBSAs
vs. Z&S areas), O3 season lengths (June - August vs. April - October), and years (2007 vs. 2009).
Table 3 gives a list of the air quality inputs to the various BenMAP outputs discussed in Chapter
7, including the health endpoints modeled, the epidemiology studies from which the concentration-
response functions were derived, which urban case study areas were modeled, which air quality metrics
were used, which O3 season was modeled, and which set of boundaries were used in calculating the
composite monitor values. Rows shaded pink represent air quality inputs used in primary risk estimates,
while rows shaded blue represent air quality inputs for risk estimates included as sensitivity analyses.
37
-------
Health Endpoint | Epidemiology Study | Urban Case Study Area(s) | Air Quality Metric | O3 Season | Study Area Boundaries
Emergency Room Visits, Respiratory | Darrow et al, 2011 | Atlanta, GA | MDA8 | March - October | CBSA
Asthma Exacerbation, Wheeze | Gent et al, 2003 | Boston, MA | 1-Hour Max | April - September | CBSA
Asthma Exacerbation, Wheeze | Gent et al, 2004 | Boston, MA | MDA8 | April - September | CBSA
Emergency Room Visits, Asthma | Ito et al, 2007 | New York, NY | MDA8 | April - September | CBSA
Mortality, Long-Term (Total and Respiratory) | Jerrett et al, 2009 | all 12 areas | Seasonal Average (see footnote 3) | April - September | CBSA
Hospital Admissions, All Respiratory | Katsouyanni et al, 2009 | Detroit, MI | 1-Hour Max | June - August | CBSA
Hospital Admissions, All Respiratory | Lin et al, 2000 | Los Angeles, CA | 24-Hour Average | June - August | CBSA
Hospital Admissions, Chronic Lung Disease | Lin et al, 2008 | New York, NY | 1-Hour Max | April - October | CBSA
Hospital Admissions, Chronic Lung Disease | Medina-Ramon et al, 2006 | all 12 areas | MDA8 | May - September | CBSA
Hospital Admissions, Asthma | Silverman and Ito, 2010 | New York, NY | MDA8 | April - August | CBSA
Mortality, Non-Accidental | Smith et al, 2009 | all 12 areas | MDA8 | Required O3 season | CBSA
Mortality, Non-Accidental | Smith et al, 2009 | all 12 areas | MDA8 | April - October | CBSA
Mortality, Non-Accidental | Smith et al, 2009 | all 12 areas | MDA8 | June - August | CBSA
Mortality, Non-Accidental | Smith et al, 2009 | all 12 areas | MDA8 | Required O3 season | Smith
Emergency Room Visits, Respiratory | Strickland et al, 2010 | Atlanta, GA | MDA8 | May - October | CBSA
Emergency Room Visits, Respiratory | Tolbert et al, 2007 | Atlanta, GA | MDA8 | March - October | CBSA
Mortality, All Cause | Zanobetti and Schwartz, 2008 | all 12 areas | 10AM-6PM Mean | June - August | CBSA
Mortality, All Cause | Zanobetti and Schwartz, 2008 | all 12 areas | 10AM-6PM Mean | June - August | Z&S
Table 3: Summary of the air quality inputs to BenMAP
3 For the Jerrett et al, 2009 long-term mortality study, the air quality inputs were based on an annual metric instead of a daily
metric. The annual metric was the April - September average of the daily maximum 1-hour concentrations.
38
-------
4 REFERENCES
Abt Associates, Inc. (2010). Environmental Benefits and Mapping Program (Version 4.0). Bethesda,
MD. Prepared for U.S. Environmental Protection Agency, Office of Air Quality Planning and
Standards, Research Triangle Park, NC. Available on the Internet at:
http://www.epa.gov/air/benmap.
Bell, M.L., McDermott, A., Zeger S.L., Samet, J.M., Dominici, F. (2004). Ozone and Short-Term
Mortality in 95 U.S. Urban Communities, 1987-2000. JAMA: The Journal of the American
Medical Association, 292:2372-2378.
Chen, J., Zhao, R., Li, Z. (2004). Voronoi-based k-order Neighbor Relations for Spatial Analysis.
Journal of Photogrammetry and Remote Sensing, 59(1-2), 60-72.
Gold, C. (1997). Voronoi methods in GIS. In: Algorithmic Foundation of Geographic Information
Systems (van Kreveld, M., Nievergelt, J., Roos, T., Widmayer, P., eds). Lecture Notes in
Computer Science, Vol 1340. Berlin: Springer-Verlag, 21-35.
Jerrett, M., Burnett, R.T., Pope III C.A., Ito, K., Thurston, G., Krewski, D., Shi, Y., Calle, E., Thun, M.
(2009). Long-Term Ozone Exposure and Mortality. New England Journal of Medicine, 360:1085-
1095.
R Core Team (2012). R: A language and environment for statistical computing. R Foundation for
Statistical Computing, Vienna, Austria. http://www.R-project.org/.
Smith, R.L.; Xu, B., Switzer, P. (2009). Reassessing the Relationship between Ozone and Short-Term
Mortality in 95 U.S. Urban Communities. Inhalation Toxicology, 21: 37-61.
Turner, R. (2012). deldir: Delaunay Triangulation and Dirichlet (Voronoi) Tessellation. R package
version 0.0-19. http://CRAN.R-project.org/package=deldir
U.S. Environmental Protection Agency. (2012a). Integrated Science Assessment for Ozone and Related
Photochemical Oxidants: Final Report. U.S. Environmental Protection Agency, Research
Triangle Park, NC. EPA/600/R-10/076C. Available on the Internet at:
http://www.epa.gov/ttn/naaqs/standards/ozone/s_o3_2008_isa.html
U.S. Environmental Protection Agency. (2012b). Health Risk and Exposure Assessment for Ozone,
First External Review Draft. U.S. Environmental Protection Agency, Research Triangle Park,
NC. EPA/600/R-10/076C. Available on the Internet at:
http://www.epa.gov/ttn/naaqs/standards/ozone/s_o3_2008_rea.html
39
-------
U.S. Environmental Protection Agency. (2012c). Total Risk Integrated Methodology (TRIM) - Air
Pollutants Exposure Model (APEX) Documentation (TRIM-Expo / APEX, Version 4.4).
Available on the Internet at: http://www.epa.gov/ttn/fera/human_apex.html
Wells, B., Wesson, K., Jenkins, S. (2012). Analysis of Recent U.S. Ozone Air Quality Data to Support
the O3 NAAQS Review and Quadratic Rollback Simulations to Support the First Draft of the
Risk and Exposure Assessment. Available on the Internet at:
http://www.epa.gov/ttn/naaqs/standards/ozone/s_o3td.html
Zanobetti, A., and J. Schwartz (2008). Mortality Displacement in the Association of Ozone with
Mortality: An Analysis of 48 Cities in the United States. American Journal of Respiratory and
Critical Care Medicine, 177:184-189.
40
-------
Appendix 4-B
Modeling Technical Support Document for the 2013 Ozone Risk and
Exposure Assessment
Table of Contents
Model set-up and simulation 1
Model domain 1
Model time period 3
Model inputs: meteorology 3
Model inputs: emissions 5
Model inputs: Boundary conditions 7
Model inputs: Initial conditions 7
Evaluation of modeled ozone concentrations 7
Operational Evaluation in the Northeast U.S 9
Operational Evaluation in the Southeast U.S 27
Operational Evaluation in the Midwest 34
Operational Evaluation in the Central U.S 46
Operational Evaluation in the Western U.S 59
References 72
Table of Figures
FIGURE 1: MAP OF THE CMAQ MODELING DOMAIN 2
FIGURE 2: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE NORTHEASTERN US FOR WINTER MONTHS IN
2007 11
FIGURE 3: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE NORTHEASTERN US FOR SPRING MONTHS IN
2007 12
FIGURE 4: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE NORTHEASTERN US FOR SUMMER MONTHS IN
2007 13
FIGURE 5: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE NORTHEASTERN US FOR FALL MONTHS IN
2007 14
-------
FIGURE 6: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT BOSTON MONITORING SITES
FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 15
FIGURE 7: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BOSTON MONITORING SITES FOR JANUARY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 15
FIGURE 8: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BOSTON MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 16
FIGURE 9: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BOSTON MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 16
FIGURE 10: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BOSTON MONITORING SITES FOR OCTOBER
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 17
FIGURE 11: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT NEW YORK MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 18
FIGURE 12: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT NEW YORK MONITORING SITES FOR
JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 18
FIGURE 13: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT NEW YORK MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 19
FIGURE 14: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT NEW YORK MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 19
FIGURE 15: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT NEW YORK MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 19
FIGURE 16: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT PHILADELPHIA MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 20
FIGURE 17: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT PHILADELPHIA MONITORING SITES FOR
JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 21
FIGURE 18: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT PHILADELPHIA MONITORING SITES FOR
APRIL 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 21
FIGURE 19: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT PHILADELPHIA MONITORING SITES FOR
JULY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 21
FIGURE 20: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT PHILADELPHIA MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 22
FIGURE 21: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT BALTIMORE MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 23
FIGURE 22: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BALTIMORE MONITORING SITES FOR
JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 23
FIGURE 23: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BALTIMORE MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 24
FIGURE 24: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BALTIMORE MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 24
FIGURE 25: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT BALTIMORE MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 25
-------
FIGURE 26: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT WASHINGTON D.C.
MONITORING SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED. . 25
FIGURE 27: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT WASHINGTON D.C. MONITORING SITES
FOR JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 26
FIGURE 28: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT WASHINGTON D.C. MONITORING SITES
FOR APRIL 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 26
FIGURE 29: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT WASHINGTON D.C. MONITORING SITES
FOR JULY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 27
FIGURE 30: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT WASHINGTON D.C. MONITORING SITES
FOR OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 27
FIGURE 31: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE SOUTHEASTERN US FOR WINTER MONTHS IN
2007 29
FIGURE 32: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE SOUTHEASTERN US FOR SPRING MONTHS IN
2007 30
FIGURE 33: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE SOUTHEASTERN US FOR SUMMER MONTHS
IN 2007 31
FIGURE 34: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE SOUTHEASTERN US FOR FALL MONTHS IN
2007 32
FIGURE 35: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT ATLANTA MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 33
FIGURE 36: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT ATLANTA MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 33
FIGURE 37: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT ATLANTA MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 34
FIGURE 38: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT ATLANTA MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 34
FIGURE 39: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE MIDWEST FOR WINTER MONTHS IN 2007.
36
FIGURE 40: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE MIDWEST FOR SPRING MONTHS IN 2007.37
FIGURE 41: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE MIDWEST FOR SUMMER MONTHS IN 2007.
38
FIGURE 42: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE MIDWEST FOR FALL MONTHS IN 2007.... 39
FIGURE 43: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT CHICAGO MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 40
FIGURE 44: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT CHICAGO MONITORING SITES FOR JANUARY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 40
FIGURE 45: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT CHICAGO MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 41
FIGURE 46: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT CHICAGO MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 41
-------
FIGURE 47: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT CHICAGO MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 42
FIGURE 48: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT CLEVELAND MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 42
FIGURE 49: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT CLEVELAND MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 43
FIGURE 50: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT CLEVELAND MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 43
FIGURE 51: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT CLEVELAND MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 44
FIGURE 52: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT DETROIT MONITORING SITES
FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 45
FIGURE 53: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DETROIT MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 45
FIGURE 54: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DETROIT MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 46
FIGURE 55: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE CENTRAL US FOR WINTER MONTHS IN
2007 48
FIGURE 56: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE CENTRAL US FOR SPRING MONTHS IN 2007.
49
FIGURE 57: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE CENTRAL US FOR SUMMER MONTHS IN
2007 50
FIGURE 58: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE CENTRAL US FOR FALL MONTHS IN 2007.51
FIGURE 59: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT SAINT Louis MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 52
FIGURE 60: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SAINT Louis MONITORING SITES FOR
JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 52
FIGURE 61: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SAINT Louis MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 53
FIGURE 62: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SAINT Louis MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 53
FIGURE 63: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SAINT Louis MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 54
FIGURE 64: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT DALLAS MONITORING SITES
FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 55
FIGURE 65: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DALLAS MONITORING SITES FOR JANUARY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 55
FIGURE 66: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DALLAS MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 56
FIGURE 67: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DALLAS MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 56
-------
FIGURE 68: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DALLAS MONITORING SITES FOR OCTOBER
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 57
FIGURE 69: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT HOUSTON MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 57
FIGURE 70: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT HOUSTON MONITORING SITES FOR
JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 58
FIGURE 71: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT HOUSTON MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 58
FIGURE 72: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT HOUSTON MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 59
FIGURE 73: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT HOUSTON MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 59
FIGURE 74: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE WESTERN US FOR WINTER MONTHS IN
2007 61
FIGURE 75: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE WESTERN US FOR SPRING MONTHS IN
2007 62
FIGURE 76: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE WESTERN US FOR SUMMER MONTHS IN
2007 63
FIGURE 77: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS IN THE WESTERN US FOR FALL MONTHS IN 2007.
64
FIGURE 78: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT DENVER MONITORING SITES
FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 65
FIGURE 79: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DENVER MONITORING SITES FOR APRIL
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 65
FIGURE 80: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DENVER MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 66
FIGURE 81: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT DENVER MONITORING SITES FOR OCTOBER
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 66
FIGURE 87: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT SACRAMENTO MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 67
FIGURE 88: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SACRAMENTO MONITORING SITES FOR
JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 67
FIGURE 89: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SACRAMENTO MONITORING SITES FOR
APRIL 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 68
FIGURE 90: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SACRAMENTO MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 68
FIGURE 91: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT SACRAMENTO MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 69
FIGURE 92: TIME SERIES OF MODEL PERFORMANCE FOR 8-HR DAILY MAXIMUM OZONE CONCENTRATIONS AT Los ANGELES MONITORING
SITES FOR APRIL-OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 70
-------
FIGURE 93: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT Los ANGELES MONITORING SITES FOR
JANUARY 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 70
FIGURE 94: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT Los ANGELES MONITORING SITES FOR
APRIL 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 71
FIGURE 95: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT Los ANGELES MONITORING SITES FOR JULY
2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 71
FIGURE 96: TIME SERIES OF MODEL PERFORMANCE FOR HOURLY OZONE CONCENTRATIONS AT Los ANGELES MONITORING SITES FOR
OCTOBER 2007. OBSERVED VALUES SHOWN IN BLACK AND MODELED VALUES SHOWN IN RED 71
FIGURE 97: MAP OF MEAN OBSERVED MDA8 OZONE CONCENTRATIONS AT Los ANGELES MONITORING SITES FOR SUMMER MONTHS
(JUNE, JULY, AUG) 2007 72
FIGURE 98: MAP OF NORMALIZED MEAN BIAS FOR MDA8 OZONE CONCENTRATIONS AT Los ANGELES MONITORING SITES FOR SUMMER
MONTHS (JUNE, JULY, AUG) 2007 72
Table of Tables
TABLE 1: GEOGRAPHIC ELEMENTS OF DOMAIN USED IN THE CMAQ/HDDM MODELING 2
TABLE 2: VERTICAL LAYER STRUCTURE FOR 2007 WRF AND CMAQ SIMULATIONS 4
TABLE 3: SUMMARY OF EMISSIONS TOTALS BY SECTOR FOR THE 12KM EASTERN US DOMAIN 6
TABLE 4: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE NORTHEASTERN US 10
TABLE 5: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE BOSTON AREA 14
TABLE 6: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE NEW YORK AREA 17
TABLE 7: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE PHILADELPHIA AREA 19
TABLE 8: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE BALTIMORE AREA 22
TABLE 9: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE WASHINGTON D.C. AREA 25
TABLE 10: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE SOUTHEASTERN US 28
TABLE 11: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE ATLANTA AREA 32
TABLE 12: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE MIDWEST 35
TABLE 13: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE CHICAGO AREA 39
TABLE 14: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE CLEVELAND AREA 42
TABLE 15: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE DETROIT AREA 44
TABLE 16: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE CENTRAL US 47
TABLE 17: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE SAINT Louis AREA 51
TABLE 18: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE DALLAS AREA 54
TABLE 19: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE HOUSTON AREA 57
TABLE 20: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE WESTERN US 60
TABLE 21: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE DENVER AREA 64
TABLE 23: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE SACRAMENTO AREA 66
TABLE 24: SUMMARY OF CMAQ MODEL PERFORMANCE AT AQS MONITORING SITES IN THE Los ANGELES AREA 69
-------
MODEL SET-UP AND SIMULATION
The air quality modeling underlying the HDDM adjustment methodology described in
Chapter 4 and Appendix 4D was performed using CMAQ version 4.7.1 with HDDM for ozone
(www.cmaq-model.org). A modified version of CMAQ-HDDM-3D was used that tracked
species concentrations through all modeled processes but tracked ozone sensitivities through
the chemistry, transport, and dry deposition subroutines only. Aerosol and cloud processing
were not included in the sensitivity calculations, both because of the uncertainties in these
processes with respect to ozone sensitivity and to conserve computational resources. CMAQ
was run using the carbon bond 2005 (CB05) gas-phase chemical mechanism (Gery et al., 1989;
Yarwood et al., 2005) and the AERO5 aerosol module, which includes ISORROPIA for
gas-particle partitioning of inorganic species (Nenes et al., 1998) and the secondary organic
aerosol treatment described in Carlton et al. (2010).
MODEL DOMAIN
For this analysis, all CMAQ/HDDM runs were performed for a domain that covers the
48 contiguous states and includes portions of southern Canada and northern Mexico at a 12 x 12
km horizontal resolution (Figure 1). The CMAQ simulations were performed with 24 vertical
layers, with a model top at about 17,600 meters, or 50 millibars (mb). Tables 1 and 2 provide
basic geographic information for the CMAQ domain and the vertical layer structure,
respectively. Results from the lowest layer of the model were used for the analyses supporting
the ozone NAAQS REA.
-------
Figure 1: Map of the CMAQ modeling domain (12US2 domain; 396 columns x 246 rows; x origin -2,412,000 m).
Table 1: Geographic elements of domain used in the CMAQ/HDDM modeling

CMAQ Modeling Configuration: National Grid
  Map Projection:    Lambert Conformal Projection
  Grid Resolution:   12 km
  True Latitudes:    33 deg N and 45 deg N
  Dimensions:        396 x 246 x 24
  Vertical extent:   24 layers, surface to 50 millibar level (see Table 2)
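The grid geometry in Table 1 maps a projected coordinate onto a column and row index by simple arithmetic. The short R sketch below (not taken from the REA) illustrates that mapping; the x origin is the value shown in Figure 1, while the y origin and the function name are placeholders introduced here for illustration only.

    # Minimal sketch in R, assuming Lambert conformal (x, y) coordinates in meters
    x0 <- -2412000     # domain x origin (m), from Figure 1
    y0 <- -1620000     # placeholder y origin (m); substitute the actual domain value
    dx <- 12000        # grid resolution (m)

    grid_index <- function(x, y) {
      col <- floor((x - x0) / dx) + 1   # columns run 1..396
      row <- floor((y - y0) / dx) + 1   # rows run 1..246
      c(col = col, row = row)
    }

    # Example: a point 600 km east and 300 km north of the domain origin
    grid_index(x0 + 600000, y0 + 300000)   # col = 51, row = 26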
-------
MODEL TIME PERIOD
The CMAQ/HDDM modeling was performed for January and April-October of 2007.
The simulations included 10-day "ramp-up" periods from December 22-31, 2006 and from
March 22-31, 2007 to minimize the effects of initial conditions. The ramp-up days were not
included in the analysis of the HDDM results.
MODEL INPUTS: METEOROLOGY
CMAQ model simulations require inputs of meteorological fields, emissions, and initial
and boundary conditions. The gridded meteorological data for the entire year of 2007 at the 12
km continental United States scale domain was derived from version 3.1 of the Weather
Research and Forecasting Model (WRF), Advanced Research WRF (ARW) core (Skamarock et
al., 2008). The WRF Model is a next-generation mesoscale numerical weather prediction
system developed for both operational forecasting and atmospheric research applications
(http://wrf-model.org). The 2007 WRF simulation included the following physics options: the
Pleim-Xiu land surface model (LSM), the Asymmetric Convective Model version 2 planetary
boundary layer (PBL) scheme, Morrison double-moment microphysics, the Kain-Fritsch
cumulus parameterization scheme, and the RRTMG long-wave radiation (LWR) scheme
(Gilliam and Pleim, 2010).
The WRF meteorological outputs were processed to create model-ready inputs for
CMAQ using the Meteorology-Chemistry Interface Processor (MCIP) package (Otte et al.,
2010), version 3.6, to derive the specific inputs to CMAQ: horizontal wind components (i.e.,
speed and direction), temperature, moisture, vertical diffusion rates, and rainfall rates for each
grid cell in each vertical layer. The WRF simulation used the same map projection as CMAQ: a
Lambert conformal projection centered at (-97, 40) with true latitudes at 33 and 45 degrees
north. The 12 km WRF domain consisted of 459 by 299 grid cells. The WRF simulation used
34 vertical layers, with a surface layer approximately 38 meters deep. Table 2 shows the
vertical layer structure used in WRF and the layer-collapsing approach used to generate the
CMAQ meteorological inputs. CMAQ resolved the vertical atmosphere with 24 layers,
preserving greater resolution in the PBL.
-------
Table 2: Vertical layer structure for 2007 WRF and CMAQ simulations

WRF vertical structure (34 layers):

WRF Layer   Layer Top Height (m)   Pressure (mb)   Depth (m)
    34            17,145                 50           2,655
    33            14,490                 95           1,896
    32            12,593                140           1,499
    31            11,094                185           1,250
    30             9,844                230           1,078
    29             8,766                275             951
    28             7,815                320             853
    27             6,962                365             775
    26             6,188                410             711
    25             5,477                455             657
    24             4,820                500             612
    23             4,208                545             573
    22             3,635                590             539
    21             3,095                635             509
    20             2,586                680             388
    19             2,198                716             281
    18             1,917                743             273
    17             1,644                770             178
    16             1,466                788             174
    15             1,292                806             171
    14             1,121                824             168
    13               952                842             165
    12               787                860              82
    11               705                869              81
    10               624                878              80
     9               544                887              80
     8               465                896              79
     7               386                905              78
     6               307                914              78
     5               230                923              77
     4               153                932              38
     3               114                937              38
     2                76                941              38
     1                38                946              38

CMAQ vertical structure (24 layers, collapsed from the 34 WRF layers):

CMAQ Layer   Depth (m)
    24          4,552
    23          2,749
    22          2,029
    21          1,627
    20          1,368
    19          1,185
    18            539
    17            509
    16            388
    15            281
    14            273
    13            178
    12            174
    11            171
    10            168
     9            165
     8            163
     7            160
     6            157
     5             78
     4             77
     3             76
     2             38
     1             38
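The layer structure in Table 2 can be cross-checked by accumulating the layer depths upward from the surface: the running sums reproduce the layer-top heights, and the collapsed CMAQ column spans the same total depth as the 34 WRF layers. A minimal R sketch using the depths transcribed from Table 2:

    # WRF layer depths (m), ordered from layer 1 (surface) to layer 34 (top), from Table 2
    wrf_depth <- c(38, 38, 38, 38, 77, 78, 78, 79, 80, 80, 81, 82,
                   165, 168, 171, 174, 178, 273, 281, 388, 509, 539, 573, 612,
                   657, 711, 775, 853, 951, 1078, 1250, 1499, 1896, 2655)
    # CMAQ layer depths (m), ordered from layer 1 (surface) to layer 24 (top), from Table 2
    cmaq_depth <- c(38, 38, 76, 77, 78, 157, 160, 163, 165, 168, 171, 174,
                    178, 273, 281, 388, 509, 539, 1185, 1368, 1627, 2029, 2749, 4552)

    wrf_top  <- cumsum(wrf_depth)     # layer-top heights; last value ~17,145 m (small rounding)
    cmaq_top <- cumsum(cmaq_depth)    # collapsed CMAQ layer tops reach the same model top
    sum(wrf_depth) == sum(cmaq_depth) # TRUE: both columns span the same total depth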
The adequacy of the 2007 WRF meteorological fields for this application was assessed
using a combination of qualitative and quantitative analyses (U.S. EPA, 2011). The qualitative
aspects involved comparisons of the model-estimated synoptic patterns against observed
patterns from historical weather chart archives. Additionally, the evaluations compared spatial
patterns of monthly average rainfall and monthly maximum planetary boundary layer (PBL)
heights. The statistical portion of the evaluation examined the model bias and error for
temperature, water vapor mixing ratio, solar radiation, and wind fields. These statistics were
calculated on a monthly basis.
MODEL INPUTS: EMISSIONS
The emissions data used are based on the 2007 Version 5 emissions modeling platform
developed for the Particulate Matter (PM) NAAQS rule (US EPA 2012a, US EPA 2012b).
Some small updates to the 2007v5 platform are enumerated below. First, we give a general
summary of the emissions processing performed for the PM NAAQS 2007 modeling emissions
inputs (more details are available in US EPA 2012a, 2012b, and in the PM NAAQS section of
http://www.epa.gov/ttn/chief/emch/index.html). The 2008 National Emissions Inventory,
Version 2 (http://www.epa.gov/ttn/chief/net/2008inventory.html) was the starting point for
these emissions, with updates to specific source categories made where necessary to better
represent the year 2007. Emissions were processed to photochemical model inputs with the
SMOKE modeling system version 3.1 (Houyoux et al., 2000). For this analysis, emissions
from wildfires and prescribed burns are estimated based on a multi-year average of data from
2003 through 2010. Electric generating utilities (EGUs) emissions for 2007 are temporalized
based on average temporal profiles from 3 years of data. In addition, US emissions are included
from other point sources, area sources, agricultural sources (ammonia only), anthropogenic
fugitive dust sources, nonroad mobile sources, onroad mobile sources, and biogenic sources.
Onroad mobile source emissions were created using EPA's MOVES2010b model
(www.epa.gov/otaq/models/moves), except that California emissions were adjusted to match
the county total emissions obtained directly from the California Air Resources Board. Biogenic
emissions were estimated using the Biogenic Emissions Inventory System version 3.14
(BEISv3.14) (Pierce et al., 1998). Other North American emissions are based on a 2006
Canadian inventory and a 2008 Mexican inventory. Emissions totals within the 12 km Eastern
domain are summarized in Table 3 for CO, NH3, NOx, PM10, PM2.5, SO2, and VOC.
There are a few differences between the emissions data used for this analysis and the data
documented in the PM NAAQS 2007v5 platform technical support document (TSD). First, the
years used to compute the average fires were 2003-2010, versus 2003-2009 for the PM
NAAQS. Second, point source emissions for South Dakota were updated with more recent data.
-------
Finally, a correction was made to the spatial surrogates used for oil and gas emissions in the
Western Regional Air Partnership states, and updated spatial surrogates for gas stations and dry
cleaners were used.
Table 3: Summary of emissions totals by sector for the 12km Eastern US domain
Sector
Name
afdust
ag
c1c2rail
avefire
nonpt
nonroad
onroad
othar
othon
othpt
ptipm
ptnonipm
c3marine US
c3marine nonUS
beis
total US anthro
total
Sector description
Anthropogenic fugitive dust
Agricultural sources
Locomotive and marine mobile
sources (except C3 marine)
Average year fire emissions
Area sources
Off road equipment
Onroad mobile vehicles
Canada and Mexico area
sources
Canada and Mexico onroad
mobile sources
Canada and Mexico point
sources
Point sources: electric
generation units
Point sources other than electric
generating units
C3 marine vessels within 4
miles of the US coast
C3 marine vessels more than 4
miles off the US coast
Biogenic emissions
Total US anthropogenic
emissions used in HDDM (NOx
and VOC only)
Domain-wide total
Emissions (1000 tons/year)
CO
219
15,598
4,335
17,834
36,757
4,225
5,173
1,331
704
2,934
13
86
8,211
97,420
NH3
3,595
0.6
256
155
1.9
145
671
25
21
25
68
4,964
NOX
1,338
216
1,229
1,878
7,561
918
631
1,280
3,357
2,077
138
1,047
1,931
17,578
23,602
PM10
5,854
44
1,589
768
188
363
1,510
23
241
437
583
12
87
11,699
PM25
825
41
1,347
676
178
277
451
18
159
330
409
11
80
4,802
S02
49
118
402
101
40
154
11
2,504
9,136
1,589
105
646
14,854
VOC
60
2,797
6,671
2,781
3,186
1,815
405
626
43
1,074
5.1
38
48,616
13,819
68,117
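As a quick arithmetic check on Table 3, the sector entries in each pollutant column sum to the domain-wide total in the final row (the total US anthropogenic row, reported for NOx and VOC only, is a subtotal and is not added). A minimal R illustration using the CO column as listed, with blank cells omitted:

    # CO sector totals from Table 3 (1000 tons/year); blank cells omitted
    co_sectors <- c(219, 15598, 4335, 17834, 36757, 4225, 5173,
                    1331, 704, 2934, 13, 86, 8211)
    sum(co_sectors)   # 97420, matching the domain-wide CO total in Table 3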
-------
MODEL INPUTS: BOUNDARY AND INITIAL CONDITIONS
The lateral boundary concentrations for the 12 km US2 domain are provided by a three-
dimensional global atmospheric chemistry model, GEOS-Chem (Yantosca, 2004; standard
version 8-03-02 with version 8-02-03 chemistry). The global GEOS-Chem model simulates
atmospheric chemical and physical processes driven by assimilated meteorological
observations from NASA's Goddard Earth Observing System (GEOS-5). This model was run
for 2007 with a grid resolution of 2.0 degrees x 2.5 degrees (latitude-longitude) and 46 vertical
layers up to 0.01 hPa. The predictions were processed using the GEOS-2-CMAQ tool (Akhtar
et al., 2012; Henderson et al., 2013) and used to provide one-way dynamic boundary conditions
at one-hour intervals. The ozone predictions from these GEOS-Chem runs were evaluated
against satellite vertical profiles and ground-based measurements, and model performance was
found to be acceptable (Akhtar et al., 2012; Henderson et al., 2013).
Initial conditions were extracted from a slightly older model simulation using GEOS-Chem
(Yantosca, 2004) version 8-02-03. The simulation from which the initial conditions were
extracted was also run with a grid resolution of 2.0 degrees x 2.5 degrees (latitude-longitude)
and 47 vertical layers. A GEOS-Chem evaluation was conducted to validate the 2007
GEOS-Chem simulation outputs for their use as inputs to the CMAQ modeling system. This
evaluation included reproducing GEOS-Chem evaluation plots reported in the literature for
previous versions of the model (Lam, 2010).
EVALUATION OF MODELED OZONE CONCENTRATIONS
CMAQ is a peer-reviewed, community air quality model that simulates the formation
and fate of photochemical oxidants, aerosol concentrations, acid deposition, and air toxics over
multiple scales for given input sets of meteorological conditions and emissions. To ensure that
CMAQ is an appropriate tool for estimating the air quality changes expected to result from a
given set of emissions reductions, the model is typically evaluated for each new application.
This evaluation consists of assessments of the model itself and an assessment of this particular
application of the model.
An independent panel of academic and government experts assessed the science within
the CMAQ model version 4.7.1 in September 2011
(http://www.epa.gov/AMD/Reviews/2011_CMAQ_Review_FinalReport.pdf). Among the
conclusions of this peer-review report was the finding that the CMAQ science and evaluation
efforts were of "very high quality" and have provided a "foundation for the more reliable use of
the CMAQ modeling system".
As CMAQ model version 4.7.1 was being developed, a series of incremental diagnostic
tests was performed to assess how model performance varied across a variety of model
improvements. This analysis is summarized in Foley et al. (2010) and Godowitch et al. (2011).
While time- and resource-intensive, this systematic incremental testing shows "the effect of
each scientific improvement on the simulated fields." This evaluation allowed for a clear
comparison with previous model versions and provided assurance that CMAQ v4.7.1 yielded
equivalent or improved performance relative to previous CMAQ versions.
Numerous dynamic evaluations of CMAQ's ability to simulate the change in air quality
resulting from emissions reductions have been conducted and summarized in the peer-reviewed
literature. For instance, Napelenok et al. (2011) concluded that the CMAQ model "is able to
reproduce the observed change in daily maximum 8-hour ozone levels" at the majority of
locations when emissions uncertainty is considered. Other dynamic evaluations (Zhou et al.,
2013; Godowitch et al., 2010; Gilliland et al., 2008; Godowitch et al., 2007) have suggested that
CMAQ may provide a conservative estimate of the air quality improvements resulting from
emissions reductions.
This TSD summarizes the ability of the model to reproduce observed 2007 conditions
when simulated using the specific emissions, meteorological, initial condition, and boundary
condition inputs described above. This operational evaluation shows that the CMAQ model
predictions for 2007 are equivalent to or better than typical regional modeling simulations as
summarized in Simon et al. (2012).
In the following sections we present general model performance statistics and plots for
five regions of the U.S. We compare model predictions of maximum daily 8-hr average
(MDA8) ozone concentrations to measurements reported in EPA's Air Quality System (AQS),
which is a repository of air pollution measurements made by EPA, state, local, and tribal
agencies. For the 2007 model performance evaluation, we ran CMAQ for 2007 using the same
emissions as those used for the 2007 HDDM runs, except that we included actual wildfires
instead of average fires and we used continuous emissions monitoring (CEM) data for hourly
EGU emissions (US EPA, 2012b) instead of the multi-year average temporal method used for
the HDDM runs.
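For reference, the MDA8 metric used throughout this evaluation is the largest 8-hour running mean of hourly ozone within a day. The R sketch below illustrates that calculation on synthetic data for a single monitor; it is a simplified illustration (it uses only the windows fully contained within the calendar day and ignores the data-completeness rules of the regulatory definition), not the code used for this assessment.

    # Synthetic hourly ozone (ppb) for one monitor over three days (illustrative only)
    set.seed(1)
    obs <- data.frame(date = rep(as.Date("2007-07-01") + 0:2, each = 24),
                      hour = rep(0:23, times = 3))
    obs$o3 <- pmax(0, 45 + 25 * sin((obs$hour - 6) / 24 * 2 * pi) + rnorm(nrow(obs), 0, 5))

    mda8_one_day <- function(o3) {                 # o3: 24 hourly values, in hour order
      starts <- 1:(length(o3) - 7)                 # 8-hour windows within the day
      avgs <- sapply(starts, function(i) mean(o3[i:(i + 7)], na.rm = TRUE))
      max(avgs, na.rm = TRUE)
    }

    mda8 <- tapply(obs$o3, obs$date, mda8_one_day) # one MDA8 value (ppb) per day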
The model statistics presented here include mean bias, mean error, normalized mean
bias, and normalized mean error as calculated in Simon et al. (2012). Our analysis focuses on
regional model evaluation statistics from five US regions as well as evaluations of the 15 urban
areas included in the Risk and Exposure Assessment. The five regions are defined as follows:
Northeast (Connecticut, Delaware, District of Columbia, Maine, Maryland, Massachusetts,
-------
New Hampshire, New Jersey, New York, Pennsylvania, Rhode Island, Vermont), Southeast
(Alabama, Florida, Georgia, Kentucky, Mississippi, North Carolina, South Carolina, Tennessee,
Virginia, West Virginia), Midwest (Illinois, Indiana, Michigan, Ohio, Wisconsin), Central
(Arkansas, Iowa, Kansas, Louisiana, Minnesota, Missouri, Nebraska, Oklahoma, Texas), and
West (Arizona, California, Colorado, Idaho, Nevada, New Mexico, Oregon, Utah, Wyoming).
Statistics for model performance in these regions and urban areas are shown by season in Table
4-Table 23 for observed days with MDA8 ozone values >= 60 ppb, observed days with MDA8
ozone < 60 ppb, and for all observed days. Plots are provided to show regional maps of
Normalized Mean Bias by season and time series of modeled and measured ozone
concentrations in each urban area. Time series are provided for MDA8 ozone from April-
October 2007 and for hourly ozone for one month from each season in 2007 (January, April,
July, October) where monitoring data are available. Note that the time series show average
concentrations across all monitors within each urban area, and the number of monitors included
in this average sometimes changes by season since different monitors within each area take
measurements over different periods of the year.
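The statistics in the tables that follow use the conventional definitions: mean bias (MB) is the average of model minus observation, mean error (ME) is the average absolute difference, and the normalized statistics (NMB, NME) divide the summed difference and summed absolute difference by the summed observations. A minimal R sketch of how such season- and threshold-stratified summaries could be tabulated from paired MDA8 values; the data and object names here are synthetic and illustrative, not the actual evaluation code.

    # Synthetic paired MDA8 values (ppb); 'paired' and its columns are illustrative names
    set.seed(2)
    paired <- data.frame(season = rep(c("Winter", "Spring", "Summer", "Fall"), each = 50),
                         obs = runif(200, 20, 90))
    paired$mod <- paired$obs + rnorm(200, 0, 8)

    perf_stats <- function(d) {
      diff <- d$mod - d$obs
      c(n   = nrow(d),
        MB  = mean(diff),                         # mean bias (ppb)
        ME  = mean(abs(diff)),                    # mean error (ppb)
        NMB = 100 * sum(diff) / sum(d$obs),       # normalized mean bias (%)
        NME = 100 * sum(abs(diff)) / sum(d$obs))  # normalized mean error (%)
    }

    paired$bin <- ifelse(paired$obs >= 60, "Days >= 60", "Days < 60")
    groups <- split(paired, list(paired$season, paired$bin))  # add season-only splits for "All Days"
    round(t(sapply(groups, perf_stats)), 2)                   # one row per season/threshold group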
OPERATIONAL EVALUATION IN THE NORTHEAST U.S.
Table 4 shows that in the Northeastern US, model mean bias was generally less than 6
ppb and normalized mean bias was less than 15% in most cases. High ozone days and
wintertime days were more likely to be under-estimated by the model, while low ozone days in
the summer and fall were more likely to be over-estimated. Performance was best in the spring
and fall; the largest normalized errors occurred in the winter. Five of the 15 urban areas
evaluated were in the Northeast: Boston, New York, Philadelphia, Baltimore, and Washington
D.C.
Model performance at Boston area monitoring sites (Table 5) was similar to that at other
Northeastern US sites. The time series plots show that the model has skill at reproducing
measured day-to-day variability in MDA8 ozone concentrations (Figure 6). Hourly daytime
and nighttime ozone concentrations are also well modeled in all seasons, with the exception of a
few 2- to 3-day periods in July in which the model over-estimates daytime and nighttime ozone,
and some over-estimates of daytime peaks in October (Figure 7-Figure 10).1
Bulk performance statistics for MDA8 model estimates in New York (Table 6) also look
equivalent to both the Boston statistics and those of the Northeast as a whole. Again, the 7-
month time series of MDA8 ozone at NY area sites shows that the model captures synoptic
variations in ozone concentrations (Figure 11). Hourly New York area ozone in January is
1 Note that the Y-axis scales for the various time series are not consistent.
-------
generally well captured by the model, although CMAQ does somewhat under-estimate the
daytime peaks (Figure 12). The April time series (Figure 13) shows that the model captures the
range of measured daytime and nighttime ozone values but has a 3-day period of over-estimates
in early April and a week-long period of under-estimates later in the month. July and October
ozone is well captured by the model on both low and high ozone days and at night (Figure 14
and Figure 15).
Bulk model performance statistics for Philadelphia (Table 7) are equivalent to those for
the Northeast as a whole. The time series plots show that variations of MDA8 are well captured
in Philadelphia (Figure 16) and that the model does a reasonable job of estimating day to night
ozone changes in all seasons but has some under-estimates of daytime ozone in January (Figure
17) and some over-estimates of daytime and nighttime ozone in October (Figure 20).
Baltimore MDA8 ozone performance (Table 8) is generally similar to ozone
performance in the rest of the Northeast except for summertime values which are somewhat
more overestimated in Baltimore (8 ppb MB in Baltimore versus 4 ppb MB in the Northeast).
Still, these bias and error statistics are well within the range of state-of-the-science model
performance described by Simon et al. (2012). As shown by the time series in Figure 21, most
model overestimates in this area occur from late July to late August. Outside of that time
period, the MDA8 variations from April-October are well captured by the model. The hourly
time series plots for Baltimore generally show reasonable agreement between the observed and
modeled values although the period of over-estimated daytime ozone concentrations in July is
apparent in Figure 24.
Bulk statistics for Washington D.C. sites (Table 9) are similar to statistics for the rest of
the Northeast region. The period of overestimated MDA8 values in late July through August
which was seen in nearby Baltimore is less pronounced in Washington D.C. (Figure 26).
Hourly time series generally show reasonable performance with some overestimates of
nighttime ozone in October (Figure 30).
Table 4: Summary of CMAQ model performance at AQS monitoring sites in the Northeastern US

Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60         5085      -4.65     -17.0       5.83      21.2
Winter   Days > 60            0         NA        NA         NA        NA
Winter   All Days          5085      -4.65     -17.0       5.83      21.2
Spring   Days < 60        10420      -0.69      -1.4       5.63      13.2
Spring   Days > 60         1856      -6.34      -9.1       8.25      11.8
Spring   All Days         12276      -1.47      -3.1       6.03      12.9
Summer   Days < 60        11859       5.58      13.4       7.99      19.2
Summer   Days > 60         4114      -0.51      -0.7       7.78      10.9
Summer   All Days         15973       4.01       8.2       7.93      16.1
Fall     Days < 60        10036       2.86       8.1       6.23      17.7
Fall     Days > 60         1030      -4.21      -6.1       8.06      11.7
Fall     All Days         11066       2.21       5.8       6.40      16.7
Figure 2: Map of normalized mean bias for MDA8 ozone concentrations in the
Northeastern US for winter months in 2007.
Figure 3: Map of normalized mean bias for MDA8 ozone concentrations in the
Northeastern US for spring months in 2007.
Figure 4: Map of normalized mean bias for MDA8 ozone concentrations in the
Northeastern US for summer months in 2007.
Figure 5: Map of normalized mean bias for MDA8 ozone concentrations in the
Northeastern US for fall months in 2007.
Table 5: Summary of CMAQ model performance at AQS monitoring sites in the Boston area

Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60          265      -3.91     -15.1       5.40      20.9
Winter   Days > 60            0         NA        NA         NA        NA
Winter   All Days           265      -3.91     -15.1       5.40      20.9
Spring   Days < 60          680      -1.32      -3.1       5.73      13.5
Spring   Days > 60           87      -4.95      -7.0       8.46      11.9
Spring   All Days           767      -1.73      -3.8       6.04      13.2
Summer   Days < 60          784       6.71      16.5       8.47      20.8
Summer   Days > 60          210      -2.88      -4.0       8.76      12.1
Summer   All Days           994       4.69       9.9       8.53      18.0
Fall     Days < 60          433       3.69      11.3       6.22      19.0
Fall     Days > 60           62      -6.48      -9.1       9.92      13.9
Fall     All Days           495       2.42       6.4       6.68      17.8
Figure 6: Time series of 8-hr daily maximum ozone concentrations at Boston monitoring
sites for April-October 2007. Observed values shown in black and modeled values
shown in red.
Figure 7: Time series of hourly ozone concentrations at Boston monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 8: Time series of hourly ozone concentrations at Boston monitoring sites for April
2007. Observed values shown in black and modeled values shown in red.
Figure 9: Time series of hourly ozone concentrations at Boston monitoring sites for July
2007. Observed values shown in black and modeled values shown in red.
Figure 10: Time series of hourly ozone concentrations at Boston monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 6: Summary of CMAQ model performance at AQS monitoring sites in the New York area

Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60          916      -4.14     -17.8       5.34      22.9
Winter   Days > 60            0         NA        NA         NA        NA
Winter   All Days           916      -4.14     -17.8       5.34      22.9
Spring   Days < 60         1547      -1.20      -3.0       6.04      15.0
Spring   Days > 60          200      -6.05      -8.5       9.45      13.2
Spring   All Days          1747      -1.75      -4.0       6.43      14.7
Summer   Days < 60         1469       5.16      12.3       8.53      20.4
Summer   Days > 60          635       1.48       2.0       9.25      12.7
Summer   All Days          2104       4.05       7.9       8.75      17.1
Fall     Days < 60         1443       2.01       6.2       6.40      19.7
Fall     Days > 60          108      -5.60      -8.1       8.90      12.8
Fall     All Days          1551       1.48       4.2       6.58      18.7
Figure 11: Time series of 8-hr daily maximum ozone concentrations at New York
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 12: Time series of hourly ozone concentrations at New York monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 13: Time series of hourly ozone concentrations at New York monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 14: Time series of hourly ozone concentrations at New York monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
Figure 15: Time series of hourly ozone concentrations at New York monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 7: Summary of CMAQ model performance at AQS monitoring sites in the Philadelphia area

Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60          684      -3.48     -14.4       5.15      21.3
Winter   Days > 60            0         NA        NA         NA        NA
Winter   All Days           684      -3.48     -14.4       5.15      21.3
Spring   Days < 60          942      -0.80      -1.9       5.93      14.1
Spring   Days > 60          204      -7.01      -9.9       8.6       12.1
Spring   All Days          1146      -1.91      -4.0       6.40      13.5
Summer   Days < 60          778       6.25      14.1       9.44      21.3
Summer   Days > 60          507       1.23       1.7       7.80      10.8
Summer   All Days          1285       4.27       7.7       8.79      15.9
Fall     Days < 60         1005       2.69       7.8       6.01      17.4
Fall     Days > 60          103      -4.51      -6.7       7.68      11.4
Fall     All Days          1108       2.02       5.4       6.16      16.4
Figure 16: Time series of 8-hr daily maximum ozone concentrations at Philadelphia
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
-------
Figure 17: Time series of hourly ozone concentrations at Philadelphia monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 18: Time series of hourly ozone concentrations at Philadelphia monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 19: Time series of hourly ozone concentrations at Philadelphia monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
Figure 20: Time series of hourly ozone concentrations at Philadelphia monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 8: Summary of CMAQ model performance at AQS monitoring sites in the Baltimore area

Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60           78      -1.50      -7.0       4.63      21.5
Winter   Days > 60            0         NA        NA         NA        NA
Winter   All Days            78      -1.50      -7.0       4.63      21.5
Spring   Days < 60          358       4.62      11.1       6.81      16.3
Spring   Days > 60           92      -0.86      -1.2       5.92       8.5
Spring   All Days           450       3.50       7.4       6.62      14.0
Summer   Days < 60          382      10.40      23.2      11.60      25.8
Summer   Days > 60          255       4.18       5.7       8.95      12.3
Summer   All Days           637       7.90      14.1      10.50      18.8
Fall     Days < 60          389       6.63      17.9       8.35      22.5
Fall     Days > 60           51      -1.07      -1.5       7.67      11.0
Fall     All Days           440       5.74      14.1       8.27      20.2
Figure 21: Time series of 8-hr daily maximum ozone concentrations at Baltimore
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 22: Time series of hourly ozone concentrations at Baltimore monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 23: Time series of hourly ozone concentrations at Baltimore monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 24: Time series of hourly ozone concentrations at Baltimore monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
-------
Figure 25: Time series of hourly ozone concentrations at Baltimore monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 9: Summary of CMAQ model performance at AQS monitoring sites in the Washington D.C. area

Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60          565      -4.97     -20.5       6.12      25.2
Winter   Days > 60            0         NA        NA         NA        NA
Winter   All Days           565      -4.97     -20.5       6.12      25.2
Spring   Days < 60         1120      -0.95      -2.2       6.11      14.2
Spring   Days > 60          334      -4.98      -7.3       7.00      10.2
Spring   All Days          1454      -1.88      -3.8       6.31      12.9
Summer   Days < 60         1066       6.55      13.9       8.24      17.5
Summer   Days > 60          819       2.36       3.3       7.13      10.2
Summer   All Days          1885       4.73       8.3       7.76      13.6
Fall     Days < 60         1188       2.07       5.4       7.22      19.0
Fall     Days > 60          106      -2.03      -3.0       8.37      12.4
Fall     All Days          1394       1.46       3.5       7.39      17.4
Figure 26: Time series of 8-hr daily maximum ozone concentrations at Washington D.C.
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 27: Time series of hourly ozone concentrations at Washington D.C. monitoring
sites for January 2007. Observed values shown in black and modeled values shown
in red.
Figure 28: Time series of hourly ozone concentrations at Washington D.C. monitoring
sites for April 2007. Observed values shown in black and modeled values shown in
red.
Figure 29: Time series of hourly ozone concentrations at Washington D.C. monitoring
sites for July 2007. Observed values shown in black and modeled values shown in
red.
Figure 30: Time series of hourly ozone concentrations at Washington D.C. monitoring
sites for October 2007. Observed values shown in black and modeled values shown
in red.
OPERATIONAL EVALUATION IN THE SOUTHEAST U.S.
Model performance in the Southeastern US was in the range of reported performance for
state-of-the-science models (Simon et al., 2012). Mean bias for MDA8 ozone was less than 5
ppb at most sites in the winter and spring and less than 10 ppb at most sites in the summer and
fall. Normalized mean bias was generally less than 10% at most sites in the winter and spring
and less than 25% at most sites in the summer and fall. There were very few days above 60 ppb
in the winter season (4 days at two sites and 0, 1, or 2 days at all other sites), but the model
generally had trouble capturing ozone on those days, with mean biases in the range of -5 to -15
ppb. The higher biases in the summer and fall are most pronounced along the Gulf coast and at
sites in Florida. Atlanta was the only one of the 15 urban areas from the REA located in the
Southeast region.
There were no ozone measurements in the Atlanta area during the winter season. Mean
bias and normalized mean bias at Atlanta sites for the spring, summer, and fall months were
typical of performance throughout the Southeast region. The April to October MDA8 ozone time
series (Figure 35) shows that the model does a good job of capturing the variability between
high and low ozone days. The hourly time series plots for April (Figure 36), July (Figure 37),
and October (Figure 38) show reasonable model performance during daytime hours but some
persistent overestimates of nighttime ozone, especially in April and July.
Table 10: Summary of CMAQ model performance at AQS monitoring sites in the Southeastern US

Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60         6378       0.13       0.4       5.32      15.7
Winter   Days > 60           53     -10.40     -16.1      11.10      17.2
Winter   All Days          6431       0.04       0.1       5.37      15.7
Spring   Days < 60        13326       1.10       2.3       5.22      10.8
Spring   Days > 60         5190      -4.94      -7.4       6.45       9.6
Spring   All Days         18516      -0.60      -1.1       5.57      10.4
Summer   Days < 60        14129      10.3       23.2      11.2       25.3
Summer   Days > 60         6336       0.3        0.5       6.7        9.6
Summer   All Days         20465       7.20      13.8       9.82      18.8
Fall     Days < 60        14105       5.43      13.9       7.67      19.7
Fall     Days > 60         1743      -1.44      -2.1       6.40       9.5
Fall     All Days         15848       4.67      11.1       7.53      17.9
Figure 31: Map of normalized mean bias for MDA8 ozone concentrations in the
Southeastern US for winter months in 2007.
Figure 32: Map of normalized mean bias for MDA8 ozone concentrations in the
Southeastern US for spring months in 2007.
Figure 33: Map of normalized mean bias for MDA8 ozone concentrations in the
Southeastern US for summer months in 2007.
Figure 34: Map of normalized mean bias for MDA8 ozone concentrations in the
Southeastern US for fall months in 2007.
Table 11: Summary of CMAQ model performance at AQS monitoring sites in the Atlanta
area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   0            NA         NA        NA         NA
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    0            NA         NA        NA         NA
Spring   Days < 60   703          0.52       1.1       5.99       12.5
Spring   Days > 60   295          -3.08      -4.6      6.03       9.0
Spring   All Days    998          -0.54      -1.0      6.00       11.2
Summer   Days < 60   510          10.00      21.7      11.50      24.8
Summer   Days > 60   467          3.77       5.2       8.46       11.5
Summer   All Days    977          7.04       11.9      10.00      17.0
Fall     Days < 60   575          5.45       13.7      7.72       19.4
Fall     Days > 60   87           4.01       5.9       6.27       9.2
Fall     All Days    662          5.26       12.1      7.53       17.3
Figure 35: Time series of 8-hr daily maximum ozone concentrations at Atlanta monitoring
sites for April-October 2007. Observed values shown in black and modeled values
shown in red.
Figure 36: Time series of hourly ozone concentrations at Atlanta monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 37: Time series of hourly ozone concentrations at Atlanta monitoring sites for July
2007. Observed values shown in black and modeled values shown in red.
Figure 38: Time series of hourly ozone concentrations at Atlanta monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
OPERATIONAL EVALUATION IN THE MIDWEST
The model performed well compared to observed ozone concentrations in the Midwest.
Mean bias for MDA8 ozone was around 5 ppb or less at most sites in the winter, spring, and fall
and less than 7 ppb at most sites in the summer. Normalized mean bias for MDA8 ozone was
less than 15% at most sites except in the winter when it was somewhat higher (less than 20% at
most sites) due to lower observed ozone concentrations. The model was more likely to be
biased high on low ozone days and biased low on higher ozone days. No distinct spatial trends
are apparent from the maps of normalized mean bias (Figure 39-Figure 42). Three urban areas
in the Midwest were examined more closely for this evaluation: Chicago, Cleveland, and
Detroit.
Chicago performance statistics for MDA8 ozone were similar to those of the rest of the
region. The time series of MDA8 ozone from Apr-Oct 2007 (Figure 43) as well as the hourly
time series for January, April, July, and October (Figure 44-Figure 47) all show that the model
does a good job of capturing synoptic variations in ozone observed in Chicago and reasonably
captures both day and nighttime measured concentrations.
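The MDA8 metric compared in these time series is the daily maximum of the running 8-hour average of hourly ozone. As a hedged sketch (assuming a complete hourly series indexed by timestamp, and not the data-handling code actually used for this evaluation), MDA8 can be derived as follows:

```python
import pandas as pd

def mda8(hourly: pd.Series) -> pd.Series:
    """Daily maximum 8-hour average (MDA8) ozone, in ppb, from an hourly series
    indexed by a DatetimeIndex."""
    # 8-hour running mean, relabeled to the starting hour of each window;
    # min_periods=6 loosely mimics a 75 percent data-completeness requirement.
    avg8 = hourly.rolling(window=8, min_periods=6).mean().shift(-7)
    # Maximum over the 8-hour windows that begin within each calendar day.
    return avg8.resample("D").max()
```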
No measurements were made during the winter season in Cleveland. MB for MDA8
ozone on high days (> 60 ppb) was similar to MB in the rest of the region, but MB on low
ozone days was somewhat higher than was typically seen in the Midwest (4 ppb vs. 1 ppb in the
spring, 11 ppb vs. 6 ppb in the summer, and 4 ppb vs. 3 ppb in the fall). The time series plots
for Cleveland sites show good correlation with observed ozone, although a period of moderate
model over-estimation of daytime ozone is depicted in July (Figure 48 and Figure 50).
Nighttime ozone concentrations are often overestimated by the model in the time series shown
for April, July, and October, except during the latter portion of October 2007.
Detroit area sites did not report any ozone measurements during the winter of 2007.
Detroit performance statistics for MDA8 ozone were similar to those from the rest of the
Midwest; however, under-estimates on high ozone days were more pronounced in Detroit than
in the rest of the region. The time series shows that the model accurately estimates both day
and nighttime hourly ozone in Detroit in April and July and generally captures the variations in
MDA8 ozone across the April-October time period.
Table 12: Summary of CMAQ model performance at AQS monitoring sites in the
Midwest
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   2824         -3.76      -15.1     5.36       21.5
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    2824         -3.76      -15.1     5.36       21.5
Spring   Days < 60   9447         1.11       2.6       5.55       12.8
Spring   Days > 60   2403         -6.19      -8.9      7.38       10.6
Spring   All Days    11850        -0.37      -0.8      5.92       12.2
Summer   Days < 60   12478        6.31       14.2      8.39       18.8
Summer   Days > 60   4114         -1.19      -1.7      7.56       10.8
Summer   All Days    16592        4.45       8.8       8.18       16.1
Fall     Days < 60   8207         2.94       8.2       6.22       17.3
Fall     Days > 60   1382         -4.48      -6.5      7.72       11.1
Fall     All Days    9589         1.87       4.6       6.44       15.8
Figure 39: Map of normalized mean bias for MDA8 ozone concentrations in the Midwest
for winter months in 2007.
Figure 40: Map of normalized mean bias for MDA8 ozone concentrations in the Midwest
for spring months in 2007.
Figure 41: Map of normalized mean bias for MDA8 ozone concentrations in the Midwest
for summer months in 2007.
Figure 42: Map of normalized mean bias for MDA8 ozone concentrations in the Midwest
for fall months in 2007.
Table 13: Summary of CMAQ model performance at AQS monitoring sites in the
Chicago area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   963          -3.93      -17.5     5.13       22.9
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    963          -3.93      -17.5     5.13       22.9
Spring   Days < 60   1510         0.70       1.7       6.02       15.1
Spring   Days > 60   218          -5.55      -8.1      6.62       9.7
Spring   All Days    1728         -0.09      -0.2      6.1        14.0
Summer   Days < 60   1700         8.10       18.8      10.0       23.2
Summer   Days > 60   448          1.70       2.5       8.65       12.5
Summer   All Days    2148         6.76       13.9      9.74       20.0
Fall     Days < 60   1375         1.93       6.1       5.62       17.8
Fall     Days > 60   136          -4.61      -6.7      9.05       13.2
Fall     All Days    1511         1.34       3.8       5.93       17.0
Figure 43: Time series of 8-hr daily maximum ozone concentrations at Chicago
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 44: Time series of hourly ozone concentrations at Chicago monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 45: Time series of hourly ozone concentrations at Chicago monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 46: Time series of hourly ozone concentrations at Chicago monitoring sites for July
2007. Observed values shown in black and modeled values shown in red.
Figure 47: Time series of hourly ozone concentrations at Chicago monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 14: Summary of CMAQ model performance at AQS monitoring sites in the Cleveland area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   0            NA         NA        NA         NA
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    0            NA         NA        NA         NA
Spring   Days < 60   491          4.35       10.5      7.03       17.0
Spring   Days > 60   113          -4.80      -6.7      8.46       11.7
Spring   All Days    604          2.64       5.6       7.30       15.5
Summer   Days < 60   714          11.00      25.8      12.90      30.3
Summer   Days > 60   187          2.25       3.2       9.48       13.6
Summer   All Days    901          9.16       19.0      12.20      25.3
Fall     Days < 60   543          4.41       11.6      7.38       19.4
Fall     Days > 60   60           -4.48      -6.5      8.97       13.1
Fall     All Days    603          3.52       8.6       7.54       18.4
Figure 48: Time series of 8-hr daily maximum ozone concentrations at Cleveland
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 49: Time series of hourly ozone concentrations at Cleveland monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 50: Time series of hourly ozone concentrations at Cleveland monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
Figure 51: Time series of hourly ozone concentrations at Cleveland monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 15: Summary of CMAQ model performance at AQS monitoring sites in the Detroit
area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   0            NA         NA        NA         NA
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    0            NA         NA        NA         NA
Spring   Days < 60   443          -1.04      -2.5      4.93       11.7
Spring   Days > 60   90           -8.61      -11.7     9.15       12.4
Spring   All Days    533          -2.32      -4.9      5.64       11.9
Summer   Days < 60   614          1.65       3.9       7.19       16.7
Summer   Days > 60   191          -7.16      -9.9      10.9       15.1
Summer   All Days    805          -0.44      -0.8      8.07       16.2
Fall     Days < 60   220          -1.01      -2.6      6.65       17.1
Fall     Days > 60   46           -14.70     -21.2     15.6       22.6
Fall     All Days    266          -3.38      -7.7      8.2        18.6
Figure 52: Time series of 8-hr daily maximum ozone concentrations at Detroit monitoring
sites for April-October 2007. Observed values shown in black and modeled values
shown in red.
Figure 53: Time series of hourly ozone concentrations at Detroit monitoring sites for April
2007. Observed values shown in black and modeled values shown in red.
Figure 54: Time series of hourly ozone concentrations at Detroit monitoring sites for July
2007. Observed values shown in black and modeled values shown in red.
OPERATIONAL EVALUATION IN THE CENTRAL U.S.
Mean bias for MDA8 ozone concentrations was in the range of 4 ppb or less at Central
US monitoring sites during the winter and spring and less than 13 ppb and 6 ppb in the summer
and fall, respectively. Normalized mean bias was less than 15% at most sites in winter, spring,
and fall and less than 35% at most sites in the summer. Summertime model overestimates were
primarily located along the Gulf coast of Texas and Louisiana, mostly occurring on days with
MDA8 ozone concentrations less than 60 ppb. This is similar to the summertime overestimates
reported for the Southeastern US, which also were more pronounced along the Gulf coast.
Houston, Dallas, and Saint Louis were the three cities from the 15 REA areas located in the
Central US region.
Saint Louis mean bias for MDA8 was moderately better than mean bias in the rest of the
region for the winter and spring periods and substantially better than the mean bias in the rest of
the Central region in the summer and fall. The time series plots show that the model does a good
job of replicating the observed variations in ozone (both MDA8 and hourly). Nighttime ozone
is generally well simulated, with the exception of a few multi-day periods in July when
nighttime concentrations were overestimated by the model.
Dallas performance statistics for MDA8 ozone were very similar to those presented for
the Central region as a whole, with the exception of the fall season, when Dallas performed
substantially better than the regional average. The time series plot of MDA8 ozone modeled and
measured values (Figure 64) demonstrates a consistent high bias on very low ozone days from
May through early September, but not in April or October. Hourly ozone is fairly well replicated
in January (Figure 65), April (Figure 66), and October (Figure 68) but is overestimated during
July both on low ozone days and at night (Figure 67).
As mentioned above, the largest summertime overestimates in MDA8 ozone in the
Central US occur along the Texas Gulf coast. This is demonstrated by the Houston model
performance statistics. Houston model performance is relatively good and is equivalent to that
in the rest of the Central US during the winter, spring, and fall seasons and on high days during
the summer. However, summertime MDA8 ozone is overestimated by 15 ppb and 48% on
summer days with observed levels less than 60 ppb in the Houston area. This summertime
overestimate of low ozone days is demonstrated in the time series plots in Figure 69 and Figure
72.
Table 16: Summary of CMAQ model performance at AQS monitoring sites in the Central US
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   11190        -0.42      -1.4      5.61       18.7
Winter   Days > 60   33           -13.9      -21.5     14.5       22.4
Winter   All Days    11223        -0.46      -1.5      5.64       18.7
Spring   Days < 60   13077        2.88       6.7       6.35       14.7
Spring   Days > 60   2137         -6.68      -10.0     8.26       12.4
Spring   All Days    15214        1.54       3.3       6.62       14.2
Summer   Days < 60   14142        11.4       29.7      12.6       32.8
Summer   Days > 60   2477         0.60       0.9       7.77       11.1
Summer   All Days    16619        9.82       22.8      11.9       27.5
Fall     Days < 60   13209        4.39       11.8      7.54       20.3
Fall     Days > 60   1359         -3.33      -4.9      7.04       10.4
Fall     All Days    14568        3.67       9.2       7.49       18.7
Figure 55: Map of normalized mean bias for MDA8 ozone concentrations in the Central
US for winter months in 2007.
Figure 56: Map of normalized mean bias for MDA8 ozone concentrations in the Central
US for spring months in 2007.
Figure 57: Map of normalized mean bias for MDA8 ozone concentrations in the Central
US for summer months in 2007.
Figure 58: Map of normalized mean bias for MDA8 ozone concentrations in the Central
US for fall months in 2007.
Table 17: Summary of CMAQ model performance at AQS monitoring sites in the Saint
Louis area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   266          -0.16      -0.7      4.10       18.1
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    266          -0.16      -0.7      4.10       18.1
Spring   Days < 60   812          1.09       2.5       5.31       11.9
Spring   Days > 60   139          -6.93      -10.6     7.85       12.0
Spring   All Days    951          -0.08      -0.2      5.68       11.9
Summer   Days < 60   834          8.71       18.9      9.83       21.4
Summer   Days > 60   447          0.66       0.9       9.34       12.7
Summer   All Days    1281         5.90       10.6      9.66       17.4
Fall     Days < 60   829          2.38       6.6       6.34       17.6
Fall     Days > 60   113          -6.19      -8.8      8.46       12.1
Fall     All Days    942          1.35       3.4       6.60       16.4
Figure 59: Time series of 8-hr daily maximum ozone concentrations at Saint Louis
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 60: Time series of hourly ozone concentrations at Saint Louis monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 61: Time series of hourly ozone concentrations at Saint Louis monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 62: Time series of hourly ozone concentrations at Saint Louis monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
Figure 63: Time series of hourly ozone concentrations at Saint Louis monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 18: Summary of CMAQ model performance at AQS monitoring sites in the Dallas
area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   1483         -2.12      -7.4      5.30       18.6
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    1483         -2.12      -7.4      5.30       18.6
Spring   Days < 60   1360         2.92       7.0       6.77       16.2
Spring   Days > 60   168          -8.10      -12.3     9.38       14.2
Spring   All Days    1528         1.71       3.8       7.05       15.8
Summer   Days < 60   1289         11.10      28.3      12.30      31.5
Summer   Days > 60   260          3.35       4.7       8.59       12.0
Summer   All Days    1549         9.77       21.9      11.70      26.3
Fall     Days < 60   1306         1.67       4.3       6.89       17.8
Fall     Days > 60   223          -2.77      -4.1      6.69       9.8
Fall     All Days    1529         1.02       2.4       6.86       15.9
Figure 64: Time series of 8-hr daily maximum ozone concentrations at Dallas monitoring
sites for April-October 2007. Observed values shown in black and modeled values
shown in red.
Figure 65: Time series of hourly ozone concentrations at Dallas monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 66: Time series of hourly ozone concentrations at Dallas monitoring sites for April
2007. Observed values shown in black and modeled values shown in red.
Figure 67: Time series of hourly ozone concentrations at Dallas monitoring sites for July
2007. Observed values shown in black and modeled values shown in red.
Figure 68: Time series of hourly ozone concentrations at Dallas monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 19: Summary of CMAQ model performance at AQS monitoring sites in the Houston area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   1772         0.48       1.8       5.97       22.2
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    1772         0.48       1.8       5.97       22.2
Spring   Days < 60   1569         5.82       14.5      9.05       22.5
Spring   Days > 60   276          -5.23      -7.7      10.40      15.2
Spring   All Days    1845         4.17       9.4       9.25       20.8
Summer   Days < 60   1706         15.00      48.4      16.8       54.2
Summer   Days > 60   165          1.71       2.5       11.00      15.8
Summer   All Days    1871         13.80      40.2      16.30      47.3
Fall     Days < 60   1608         5.41       14.9      8.73       24.1
Fall     Days > 60   259          -3.38      -4.8      9.05       12.8
Fall     All Days    1867         4.19       10.2      8.78       21.4
Figure 69: Time series of 8-hr daily maximum ozone concentrations at Houston
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 70: Time series of hourly ozone concentrations at Houston monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 71: Time series of hourly ozone concentrations at Houston monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 72: Time series of hourly ozone concentrations at Houston monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
Figure 73: Time series of hourly ozone concentrations at Houston monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
OPERATIONAL EVALUATION IN THE WESTERN U.S.
Model statistics for MDA8 ozone in the Western US are again in the range of what has
been reported for state-of-the-science model performance in the literature (Simon et al., 2012).
Mean bias at most sites was less than 6 ppb in the winter, spring, and fall and less than 10 ppb
in the summer. Normalized mean bias at most monitoring locations was less than 10% in the
winter and spring, less than 20% in the summer, and less than 15% in the fall. Only 3 sites in
the West recorded ozone concentrations equal to or above 60 ppb in winter (a Riverside,
California site reported 13 days, and the other two sites, in Sacramento, CA and Sublette
County, WY, reported only 2 and 3 days above 60 ppb). These high wintertime observations were
substantially underestimated by the model, with an average MB of -34.5 ppb, but likely for
different reasons. The high days in Riverside, California are probably due to traditionally
understood ozone formation that occurs on warm sunny days. The high ozone concentrations in
Wyoming are an example of wintertime ozone formation during cold pool meteorology events
with substantial snow cover and extreme temperature inversions, a phenomenon that is still an
active area of research.
the winter (Figure 74) and in the summer (Figure 76). Wintertime ozone is more likely to be
overestimated on the West Coast and more likely to be underestimated in the Intermountain
West. Summertime model overestimates are greatest along the Southern Coast of California in
locations that generally have low observed ozone concentrations. Three urban areas from the
REA are located in the Western US and are evaluated in this section: Denver, Sacramento, and
Los Angeles.
Denver area model performance was generally comparable to model performance in the
rest of the Western US although summertime overestimates of MDA8 were somewhat greater
in Denver. These summertime overestimates can be seen on many days in Figure 78 and Figure
80. Figure 79 shows that the model generally captures measured hourly ozone concentrations
in the Denver area on mid to high ozone days but often overestimates very low ozone days in
the spring. Figure 81 shows that fall hourly ozone concentrations are reasonably well captured
by the model.
Sacramento area model performance for MDA8 ozone values was reasonably good in
all seasons except for the 2 days with high observed ozone in the winter. The model
underestimated those high wintertime concentrations by 32 ppb. Other than those days, model
bias was generally less than 10% in the Sacramento area. The time series figures of MDA8 and
hourly ozone concentrations show that the model does well at capturing the day to day and day
to night ozone variations in all seasons.
Los Angeles generally had reasonable model performance, with low normalized mean
bias values (0-8%), with two exceptions. The thirteen high wintertime ozone days measured in
central Riverside County were not captured by the model, which had a mean bias for those days
at that site of -36 ppb and -45%. Also, the model tended to overestimate ozone on summertime
days with observed concentrations below 60 ppb (15 ppb mean bias). These low summer
ozone concentrations generally occurred along the coast (Figure 92).
normalized mean bias (Figure 93) clearly shows that the low ozone sites are the locations with
the largest model bias. Monitors away from the coast generally had fairly low bias (0-20%)
with the exception of two sites in western Riverside County. The summer ozone overestimates
shown in the MDA8 time series (Figure 87) are therefore due to performance at those coastal
sites. The hourly time series for January (Figure 88), April (Figure 89), and October (Figure
90) generally show good hourly ozone performance during the day but model overestimates at
night.
Table 20: Summary of CMAQ model performance at AQS monitoring sites in the Western US
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   23890        1.31       4.1       5.79       17.9
Winter   Days > 60   18           -34.5      -45.8     34.5       45.8
Winter   All Days    23908        1.29       4.0       5.81       17.9
Spring   Days < 60   22670        0.61       1.3       5.61       12.1
Spring   Days > 60   5101         -4.84      -7.2      7.30       10.9
Spring   All Days    27771        -0.40      -0.8      5.92       11.8
Summer   Days < 60   21098        7.50       17.0      9.96       22.6
Summer   Days > 60   9708         1.29       2.8       8.94       12.8
Summer   All Days    30806        5.74       11.0      9.64       18.5
Fall     Days < 60   26055        3.40       8.9       7.08       18.4
Fall     Days > 60   1691         -3.32      -4.9      9.81       14.4
Fall     All Days    27746        2.99       7.4       7.25       18.0
Figure 74: Map of normalized mean bias for MDA8 ozone concentrations in the Western
US for winter months in 2007.
Figure 75: Map of normalized mean bias for MDA8 ozone concentrations in the Western
US for spring months in 2007.
Figure 76: Map of normalized mean bias for MDA8 ozone concentrations in the Western
US for summer months in 2007.
Figure 77: Map of normalized mean bias for MDA8 ozone concentrations in the Western
US for fall months in 2007.
Table 21: Summary of CMAQ model performance at AQS monitoring sites in the Denver
area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   1006         -3.84      -11.5     7.23       21.6
Winter   Days > 60   0            NA         NA        NA         NA
Winter   All Days    1006         -3.84      -11.5     7.23       21.6
Spring   Days < 60   893          3.36       7.3       6.24       13.7
Spring   Days > 60   182          -2.42      -3.7      6.06       9.4
Spring   All Days    1075         2.38       4.9       6.21       12.7
Summer   Days < 60   427          10.2       19.4      11.5       21.9
Summer   Days > 60   653          4.94       7.2       8.18       11.9
Summer   All Days    1080         7.02       11.3      9.49       15.2
Fall     Days < 60   993          3.19       8.4       6.47       17.0
Fall     Days > 60   53           -1.59      -2.5      6.38       9.8
Fall     All Days    1046         2.95       7.5       6.46       16.4
Figure 78: Time series of 8-hr daily maximum ozone concentrations at Denver monitoring
sites for April-October 2007. Observed values shown in black and modeled values
shown in red.
Figure 79: Time series of hourly ozone concentrations at Denver monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 80: Time series of hourly ozone concentrations at Denver monitoring sites for July
2007. Observed values shown in black and modeled values shown in red.
Figure 81: Time series of hourly ozone concentrations at Denver monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 22: Summary of CMAQ model performance at AQS monitoring sites in the
Sacramento area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   1374         1.04       3.5       5.41       18.4
Winter   Days > 60   2            -31.80     -49.8     31.80      49.8
Winter   All Days    1376         0.99       3.4       5.45       18.5
Spring   Days < 60   1516         -0.89      -2.0      5.28       11.8
Spring   Days > 60   239          -4.83      -7.1      6.37       9.3
Spring   All Days    1755         -1.42      -3.0      5.43       11.3
Summer   Days < 60   1443         4.10       9.1       6.58       14.6
Summer   Days > 60   619          -1.39      -2.0      7.81       11.2
Summer   All Days    2062         2.45       4.7       6.95       13.2
Fall     Days < 60   1809         1.80       4.8       6.30       16.7
Fall     Days > 60   150          -7.82      -11.1     10.10      14.3
Fall     All Days    1959         1.06       2.6       6.59       16.4
Figure 82: Time series of 8-hr daily maximum ozone concentrations at Sacramento
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 83: Time series of hourly ozone concentrations at Sacramento monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 84: Time series of hourly ozone concentrations at Sacramento monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 85: Time series of hourly ozone concentrations at Sacramento monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
Figure 86: Time series of hourly ozone concentrations at Sacramento monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Table 23: Summary of CMAQ model performance at AQS monitoring sites in the Los
Angeles area
Season   Days        No. of obs   MB (ppb)   NMB (%)   ME (ppb)   NME (%)
Winter   Days < 60   3710         1.40       4.4       5.39       16.9
Winter   Days > 60   13           -35.60     -44.9     35.6       44.9
Winter   All Days    3723         1.27       4.0       5.49       17.2
Spring   Days < 60   2911         1.55       3.3       6.12       13.0
Spring   Days > 60   968          -3.76      -5.4      7.20       10.4
Spring   All Days    3879         0.22       0.4       6.39       12.2
Summer   Days < 60   1972         15.0       32.7      16.2       35.3
Summer   Days > 60   1896         3.35       4.4       10.4       13.7
Summer   All Days    3868         9.27       15.4      13.3       22.1
Fall     Days < 60   3461         2.95       7.7       8.30       21.5
Fall     Days > 60   335          -2.66      -3.8      10.60      15.1
Fall     All Days    3796         2.46       5.9       8.51       20.5
Figure 87: Time series of 8-hr daily maximum ozone concentrations at Los Angeles
monitoring sites for April-October 2007. Observed values shown in black and
modeled values shown in red.
Figure 88: Time series of hourly ozone concentrations at Los Angeles monitoring sites for
January 2007. Observed values shown in black and modeled values shown in red.
Figure 89: Time series of hourly ozone concentrations at Los Angeles monitoring sites for
April 2007. Observed values shown in black and modeled values shown in red.
Figure 90: Time series of hourly ozone concentrations at Los Angeles monitoring sites for
July 2007. Observed values shown in black and modeled values shown in red.
Figure 91: Time series of hourly ozone concentrations at Los Angeles monitoring sites for
October 2007. Observed values shown in black and modeled values shown in red.
Figure 92: Map of mean observed MDA8 ozone concentrations at Los Angeles monitoring
sites for summer months (June, July, Aug) 2007.
Figure 93: Map of normalized mean bias for MDA8 ozone concentrations at Los Angeles
monitoring sites for summer months (June, July, Aug) 2007.
REFERENCES
Akhtar, F., Henderson, B., Appel, W., Napelenok, S., Hutzell, B., Pye, H., Foley, K. (2012). Multiyear Boundary
Conditions for CMAQ 5.0 from GEOS-Chem with Secondary Organic Aerosol Extensions, 11th annual
Community Modeling and Analysis System conference, Chapel Hill, NC, October 2012.
Carlton, A.G., Bhave, P.V., Napelenok, S.L., Edney, E.D., Sarwar, G., Pinder, R.W., Pouliot, G.A., Houyoux, M.
(2010). Model Representation of Secondary Organic Aerosol in CMAQv4.7. Environmental Science &
Technology, 44: 8553-8560.
Foley, K.M., Roselle, S.J., Appel, K.W., Bhave, P.V., Pleim, J.E., Otte, T.L., Mathur, R., Sarwar, G., Young, J.O.,
Gilliam, R.C., Nolte, C.G., Kelly, J.T., Gilliland, A.B., Bash, J.O. (2010). Incremental testing of the
Community Multiscale Air Quality (CMAQ) modeling system version 4.7, Geoscientific Model
Development, 3: 205-226.
Gery, M.W., Whitten, G.Z., Killus, J.P., Dodge, M.C. (1989). A photochemical kinetics mechanism for urban and
regional scale computer modeling. Journal of Geophysical Research-Atmospheres, 94: 12925-12956.
Gilliam, R.C., Pleim, J.E. (2010). Performance Assessment of New Land Surface and Planetary Boundary Layer
Physics in the WRF-ARW. Journal of Applied Meteorology and Climatology, 49: 760-774.
Gilliland, A.B., Hogrefe, C., Pinder, R.W., Godowitch, J.M., Foley, K.L., Rao, S.T. (2008). Dynamic evaluation of
regional air quality models: assessing changes in O3 stemming from changes in emission and
meteorology. Atmospheric Environment, 42: 5110-5123.
Godowitch, J.M., Gilliam, R.C., Rao, S.T. (2011). Diagnostic evaluation of ozone production and horizontal
transport in a regional photochemical air quality modeling system. Atmospheric Environment, 45: 3977-
3987.
Godowitch, J.M., Pouliot, G.A., Rao, S.T. (2010). Assessing multi-year changes in modeled and observed urban
NOx concentrations from a dynamic model evaluation perspective. Atmospheric Environment, 44: 2894-
2901.
Godowitch, J.M., Hogrefe, C., Rao, S.T. (2008). Diagnostic analysis of a regional air quality model: changes in
modeled processes affecting ozone and chemical-transport indicators from NOx point source emissions
reductions. Journal of Geophysical Research-Atmospheres, 113(D19): D19303, DOI:
10.1029/2007JD009537.
Grell, G.A., Dudhia, J., and Stauffer, D.R. (1994). A description of the Fifth-Generation Penn State/NCAR
Mesoscale Model (MM5). NCAR Technical Note NCAR/TN-398+STR. Available at
http://www.mmm.ucar.edu/mm5/doc1.html.
Henderson, B.H., Akhtar, F., Pye, H.O.T., Napelenok, S.L., Hutzell, W.T. (2013) A database and tool for boundary
conditions for regional air quality modeling: description and evaluation, Geoscientific Model
Development Discussions, 6, 4665-4704.
Henze, D.K., J.H. Seinfeld, N.L. Ng, J.H. Kroll, T.-M. Fu, D.J. Jacob, C.L. Heald (2008). Global modeling of
secondary organic aerosol formation from aromatic hydrocarbons: High- vs. low-yield pathways. Atmos.
Chem. Phys., 8: 2405-2420.
Lam, Y.F., Fu, J.S., Jacob, D.J., Jang, C., Dolwick, P. (2010). 2006-2008 GEOS-Chem for CMAQ Initial and
Boundary Conditions. 9th Annual CMAS Conference, October 11-13, 2010, Chapel Hill, NC.
Nenes, A., Pandis, S.N., Pilinis, C. (1998). ISORROPIA: A new thermodynamic equilibrium model for multiphase
multicomponent inorganic aerosols. Aquatic Geochemistry, 4: 123-152.
Napelenok, S.L., Foley, K.M., Kang, D.W., Mathur, R., Pierce, T., Rao, S.T. (2011). Dynamic evaluation of
regional air quality model's response to emission reduction in the presence of uncertain emission
inventories. Atmospheric Environment, 45: 4091-4098.
Otte, T.L., Pleim, J.E. (2010). The Meteorology-Chemistry Interface Processor (MCIP) for the CMAQ modeling
system: updates through v3.4.1. Geoscientific Model Development, 3: 243-256.
Simon, H., Baker, K.R., Phillips, S. (2012). Compilation and interpretation of photochemical model performance
statistics published between 2006 and 2012. Atmospheric Environment, 61: 124-139.
Skamarock, W.C., Klemp, J.B., Dudhia, J., Gill, D.O., Barker, D.M., Duda, M.G., Huang, X., Wang, W., Powers,
J.G. (2008). A Description of the Advanced Research WRF Version 3.
U.S. Environmental Protection Agency (2011). Meteorological Model Performance for Annual 2007 Simulations,
Office of Air Quality Planning and Standards, Research Triangle Park, NC., 27711, EPA-454/R-11-007.
U.S. Environmental Protection Agency (2012a). Air Quality Modeling Technical Support Document for the
Regulatory Impact Analysis for the Revisions of the National Ambient Air Quality Standard for
Particulate Matter. Office of Air Quality Planning and Standards, Research Triangle Park, NC, December
2012. Available at: http://www.epa.gov/ttn/naaqs/standards/pm/data/201212aqm.pdf
U.S. Environmental Protection Agency (2012b). Technical Support Document: Preparation of Emissions
Inventories of the Version 5, 2007-based Platform. Office of Air Quality Planning and Standards,
Research Triangle Park, NC. December 2012.
Yantosca, B. (2004). GEOS-CHEMv7-01-02 User's Guide, Atmospheric Chemistry Modeling Group, Harvard
University, Cambridge, MA, October 15, 2004.
Yarwood, G., Rao, S., Yocke, M., Whitten, G.Z. (2005). Updates to the Carbon Bond chemical mechanism: CB05.
Final Report to the US EPA, RT-0400675, December 8, 2005:
http://www.camx.com/publ/pdfs/CB05_Final_Report_120805.pdf.
Zhou, W., Cohan, D.S., Napelenok, S.L. (2013). Reconciling NOx emissions reductions and ozone trends in the
U.S., 2002-2006, Atmospheric Environment, 70: 236-244.
Appendix 4-C
Air Quality Spatial Fields for the National Mortality Risk Burden Assessment of O3
1. Overview 3
2. Air Quality Spatial Field Techniques 4
2.1 Voronoi Neighbor Averaging (VNA) 4
2.2 Community Multi-scale Air Quality (CMAQ) Model 5
2.3 Enhanced Voronoi Neighbor Averaging (eVNA) 6
2.4 Downscaler (DS) 7
3. Evaluation of Air Quality Spatial Field Techniques 9
3.1 Data 9
3.2 Methods 9
3.3 Results 11
4. Air Quality Inputs to the National Mortality Risk Burden Assessment 14
5. References 18
FIGURES
FIGURE 1 NUMERICAL EXAMPLE OF THE VORONOI NEIGHBOR AVERAGING (VNA) TECHNIQUE APPLIED TO A MODEL GRID DOMAIN .. 5
FIGURE 2 NUMERICAL EXAMPLE OF THE ENHANCED VORONOI NEIGHBOR AVERAGING (EVNA) TECHNIQUE APPLIED TO A MODEL GRID DOMAIN .. 7
FIGURE 3 EXAMPLE OF THE "4-FOLD" CROSS-VALIDATION SCHEME USED IN THE EVALUATION OF THE AIR QUALITY SPATIAL FIELD TECHNIQUES FOR THE SOUTHERN LAKE MICHIGAN AREA .. 11
FIGURE 4 CROSS-VALIDATION RESULTS FOR THE 2007 ANNUAL 4 HIGHEST DAILY MAXIMUM O3 CONCENTRATIONS .. 12
FIGURE 5 CROSS-VALIDATION RESULTS FOR THE 2007 MAY-SEPTEMBER MEAN OF THE DAILY MAXIMUM 8-HOUR O3 CONCENTRATIONS .. 13
FIGURE 6 MAY-SEPTEMBER AVERAGE DAILY MAXIMUM 8-HOUR O3 CONCENTRATIONS IN PPB, BASED ON A DOWNSCALER FUSION OF 2006-2008 AVERAGE MONITORED VALUES WITH A 2007 CMAQ MODEL SIMULATION .. 15
FIGURE 7 JUNE-AUGUST AVERAGE DAILY 10AM-8PM MEAN O3 CONCENTRATIONS IN PPB, BASED ON A DOWNSCALER FUSION OF 2006-2008 AVERAGE MONITORED VALUES WITH A 2007 CMAQ MODEL SIMULATION .. 16
FIGURE 8 APRIL-SEPTEMBER AVERAGE DAILY MAXIMUM 1-HOUR O3 CONCENTRATIONS IN PPB, BASED ON A DOWNSCALER FUSION OF 2006-2008 AVERAGE MONITORED VALUES WITH A 2007 CMAQ MODEL SIMULATION .. 17
TABLES
TABLE 1 SUMMARY OF THE CROSS-VALIDATION PERFORMANCE METRICS FOR THE FOUR AIR QUALITY SPATIAL FIELD TECHNIQUES .. 14
1. OVERVIEW
The need for greater spatial and temporal coverage of air quality concentration estimates
has grown in recent years as epidemiology and exposure studies that link air quality to health
effects have become more robust and as regulatory needs have increased. These health studies
have historically relied upon direct measurements of air quality concentrations, but prohibitive
logistics and costs limit the spatial coverage and temporal resolution of available ambient
monitoring networks. Numerical methods of interpolation, which predict unknown values from
data observed at known locations, have historically been used by researchers to extend the spatial
coverage of these monitoring networks with a high degree of confidence to inform exposure
studies. However, simple kriging approaches such as Voronoi Neighbor Averaging (VNA) do
not take advantage of the greater availability of model predictions of air quality concentrations
that can enhance the predictive capabilities of numerical methods. Such "data fusion" methods
employ both ambient air quality monitoring data and air quality modeling simulation data as
inputs, and therefore take advantage of the measurement data's accuracy at specific locations and
the air quality model's spatial coverage to generate more robust spatial predictions. For
regulatory purposes, enhanced Voronoi Neighbor Averaging (eVNA) has been the preferred
method used by EPA to make spatial predictions of ozone as part of conducting health benefit
assessments (EPA, 2010). Given the interest in and value of these methods, research and
development efforts have focused on improving their predictive capabilities. This includes the
Office of Research and Development's work on the Downscaler (DS) model, which can
potentially provide predictions that improve upon those of eVNA (Berrocal et al., 2011).
This appendix describes the methods, evaluation, and results of four different techniques
for predicting air quality concentrations. These four techniques are:
1) VNA (interpolating the monitoring data),
2) CMAQ (using the absolute modeled air quality concentrations),
3) eVNA, and
4) Downscaler.
EPA used the method with the best performance based on the evaluation to generate national air
quality spatial fields of seasonally averaged O3 concentrations as inputs to the national mortality
risk burden assessment in Chapter 8 of the 2nd draft O3 Health Risk and Exposure Assessment.
Air quality spatial fields are also used in two other applications for the risk and exposure
assessments. In Appendix 4a to the 2nd Draft O3 Health Risk and Exposure Assessment, we
describe the methodology used to create urban-scale spatial fields of hourly O3 concentrations
for use in the exposure modeling. In Appendix 4a to the 2nd Draft Welfare Risk and Exposure
Assessment, we evaluate these same four spatial field techniques for use in creating national-
scale air quality spatial fields for the W126 exposure metric.
2. AIR QUALITY SPATIAL FIELD TECHNIQUES
This section briefly describes the methodology of each of the four techniques considered
for generating air quality spatial fields as inputs to the national mortality risk burden analyses
described in Chapter 8.
2.1 VORONOI NEIGHBOR AVERAGING (VNA)
The Voronoi Neighbor Averaging (VNA; Gold, 1997; Chen et al., 2004) interpolation
technique uses inverse distance squared weighted averages of the ambient concentrations from a
set of nearest neighboring monitors to estimate the concentration at a specified location (in this
case, CMAQ grid cell centers). VNA identifies the nearest neighboring monitors for the center of
each grid cell using a Delaunay triangulation algorithm, then takes the inverse distance squared
weighted average of the hourly O3 concentrations from each neighboring monitor to estimate an
hourly O3 concentration value for the center of the grid cell. The following paragraphs provide a
numerical example of the VNA technique applied to a model grid domain.
The first step in VNA is to identify the set of nearest monitors for each grid cell in the
domain. The left-hand panel of Figure 1 below presents a numerical example with nine model
grid cells and seven monitoring sites, with the focus on identifying the set of nearest neighboring
sites to the center of grid cell "E", the center cell. The Delaunay triangulation algorithm
identifies the set of nearest neighboring monitors by drawing a set of polygons called the
"Voronoi diagram" around the center of grid cell "E" and each of the monitoring sites. Voronoi
diagrams have the special property that each edge of a polygon is equidistant from the two
closest points, as shown in the right-hand panel below.
[Figure 1 depicts nine model grid cells (A-I) with grid cell "E" at the center and seven monitors; the four Voronoi-neighbor monitors of the center of cell "E" report 80 ppb at 10 miles, 90 ppb at 15 miles, 60 ppb at 15 miles, and 100 ppb at 20 miles. The right-hand panel shows the Voronoi diagram drawn around the center of cell "E" and the monitoring sites.]
Figure 1: Numerical example of the Voronoi Neighbor Averaging (VNA) technique applied to a model grid domain
VNA then chooses the monitoring sites that share a boundary with the center of grid cell
"E". These are the nearest neighboring sites, which are used to estimate the concentration value
for grid cell "E". The VNA estimate of the concentration value in grid cell "E" is the inverse
distance squared weighted average of the four monitored concentrations. The further the monitor
is from grid cell "E", the smaller the weight.
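Identifying the monitors whose Voronoi polygons share a boundary with the grid-cell center (its "natural neighbors") can be sketched with a Delaunay triangulation, for example as below; this is an illustrative sketch using scipy rather than the R/deldir implementation noted in Section 2.3, assumes planar (projected) coordinates, and uses hypothetical names.

```python
import numpy as np
from scipy.spatial import Delaunay

def voronoi_neighbors(grid_center, monitor_xy):
    """Indices of the monitors that are Voronoi (natural) neighbors of a
    grid-cell center; monitor_xy is an (n, 2) array of monitor coordinates."""
    pts = np.vstack([np.asarray(monitor_xy, float), np.asarray(grid_center, float)])
    center_idx = len(pts) - 1            # the grid-cell center is the last point
    tri = Delaunay(pts)
    neighbors = set()
    for simplex in tri.simplices:        # triangles of the Delaunay triangulation
        if center_idx in simplex:
            neighbors.update(int(i) for i in simplex if i != center_idx)
    return sorted(neighbors)
```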
For example, the weight for the monitor in grid cell "D", located 10 miles from the center
of grid cell "E", is calculated as follows:
Weight_D = (1/10^2) / (1/10^2 + 1/15^2 + 1/15^2 + 1/20^2) = 0.4675    (Equation 1)
The weights for the other monitors are calculated in a similar fashion. The final VNA
estimate for grid cell "E" is calculated as follows:
VNA(E) = 0.4675 * 80 + 0.2078 * 90 + 0.2078 * 60 + 0.1169 * 100 = 80.3 ppb    (Equation 2)
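The inverse distance squared weighting in Equations 1 and 2 can likewise be reproduced with a few lines; this is a minimal, hypothetical sketch rather than the code used for this assessment.

```python
def vna_estimate(neighbor_conc, neighbor_dist):
    """Inverse-distance-squared VNA estimate at a grid-cell center, given the
    concentrations (ppb) at the Voronoi-neighbor monitors and their distances."""
    raw = [1.0 / d ** 2 for d in neighbor_dist]
    weights = [w / sum(raw) for w in raw]        # weights sum to 1
    return sum(w * c for w, c in zip(weights, neighbor_conc))

# Grid cell "E" in the example above: monitors at 80, 90, 60, and 100 ppb,
# located 10, 15, 15, and 20 miles away, give approximately 80.3 ppb.
print(vna_estimate([80, 90, 60, 100], [10, 15, 15, 20]))
```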
2.2 COMMUNITY MULTI-SCALE AIR QUALITY (CMAQ) MODEL
For more than a decade, the EPA's Community Multi-scale Air Quality (CMAQ; Foley et
al., 2010) model has been a valuable computational tool used by EPA and states to inform air
quality management programs. The CMAQ system simultaneously models multiple air
pollutants, including ozone, particulate matter, and a variety of air toxics, to help regulators
determine the best air quality management scenarios for their communities, states, and countries.
CMAQ is also used by states to assess implementation actions needed to attain National Ambient
Air Quality Standards.
The CMAQ system includes emissions, meteorology, and photochemical modeling
components. Research continues in all of these areas to reduce biases and uncertainties in model
simulations. CMAQ is a multi-scale system that has been applied over continental, national,
regional, and urban modeling domains with progressively finer resolution in a series of nested
grids. The CMAQ modeling community includes researchers, regulators, and forecasters in
academia, government, and the private sector with thousands of users worldwide.
Modeled air quality concentrations from CMAQ simulations have a twofold purpose in
the generation and analysis of air quality spatial fields. First, the modeled concentrations are
"fused" with the ambient measurement data using the eVNA and DS techniques. Second, the
original modeled concentrations are evaluated against the resulting concentration estimates from
the other spatial field techniques, to ensure that those techniques successfully reduce biases in
the modeled air quality fields.
2.3 ENHANCED VORONOI NEIGHBOR AVERAGING (EVNA)
Enhanced Voronoi Neighbor Averaging (eVNA; Timin et al., 2010) is a direct extension
of VNA used to combine monitored and modeled air quality concentration data. Continuing
from the previous numerical example for VNA, suppose the model grid cells containing monitors
are associated with modeled concentrations as shown in Figure 2 below. The modeled
concentrations are used to weight the VNA estimates relative to the modeled concentration
gradient:
eVNA(E) = Σ_i [ Weight_i * Monitor_i * (Model_E / Model_i) ]    (Equation 3)
where Monitor_i represents the monitored concentration for a nearest neighboring monitor,
Weight_i represents the inverse distance squared weight for Monitor_i,
Model_E represents the modeled concentration for grid cell "E", and
Model_i represents the modeled concentration in the grid cell containing Monitor_i.
[Figure 2 repeats the Figure 1 example with modeled concentrations added: the modeled value in grid cell "E" is 85 ppb, and the modeled values in the grid cells containing the 80, 90, 60, and 100 ppb monitors are 95, 100, 80, and 120 ppb, respectively.]
Figure 2: Numerical example of the Enhanced Voronoi Neighbor Averaging (eVNA) technique applied to a model grid domain
Based on the values shown in Figure 2, the eVNA estimate for grid cell "E" is calculated
as follows:
eVNA(E) = (0.4675 * 80 * 85/95) + (0.2078 * 90 * 85/100) + (0.2078 * 60 * 85/80) + (0.1169 * 100 * 85/120) = 70.9 ppb    (Equation 4)
In this example, eVNA adjusts the modeled concentration in grid cell "E" downward to
reflect the tendency for the model to over-predict the monitored concentrations. In general, the
eVNA method attempts to use the monitored concentrations to adjust for model biases, while
preserving local gradients in the modeled concentration fields. The computations for VNA and
eVNA were executed using the R statistical computing program (R, 2012), with the Delaunay
triangulation algorithm implemented in the "deldir" package (Turner, 2012).
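As with the VNA sketch above, the eVNA adjustment in Equations 3 and 4 can be illustrated with a short, hypothetical sketch (not the R/deldir implementation used here):

```python
def evna_estimate(neighbor_conc, neighbor_dist, neighbor_model, model_at_cell):
    """eVNA estimate: the VNA weighting of monitored values, with each term
    scaled by the ratio of the modeled concentration at the target grid cell
    to the modeled concentration at the monitor's grid cell."""
    raw = [1.0 / d ** 2 for d in neighbor_dist]
    weights = [w / sum(raw) for w in raw]
    return sum(w * c * (model_at_cell / m)
               for w, c, m in zip(weights, neighbor_conc, neighbor_model))

# Reproducing Equation 4 for grid cell "E" (modeled value of 85 ppb): ~70.9 ppb
print(evna_estimate([80, 90, 60, 100], [10, 15, 15, 20], [95, 100, 80, 120], 85))
```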
2.4 DOWNSCALER (DS)
The Downscaler (DS) model is EPA's most recently developed "data fusion" method for
spatially predicting air pollution concentrations. DS essentially operates by calibrating CMAQ
data to the observational data, and then uses the resulting relationship to predict "observed"
concentrations at new spatial points in the domain. Although similar in principle to a linear
regression, spatial modeling aspects have been incorporated for improving the model fit, and a
-------
Bayesian1 approach to fitting is used to generate an uncertainty value associated with each
concentration prediction. The uncertainties that DS produces are a major distinguishing feature
from earlier fusion methods previously used by EPA such as the "Hierarchical Bayesian" (HB)
model (McMillan et al, 2009). The term "downscaler" refers to the fact that DS takes grid-
averaged predictions from an air quality model such as CMAQ for input and produces point-
based estimates, thus "scaling down" the area of data representation. Although this allows air
pollution concentration estimates to be made at point locations where no observations exist,
caution is needed when interpreting any within-grid cell spatial gradients generated by DS since
they may not exist in the input datasets. The theory, development, and initial evaluation of DS
can be found in the earlier papers of Berrocal, Gelfand, and Holland (2009, 2010, and 2011).
DS develops a relationship between observed and modeled concentrations, and then uses
that relationship to spatially predict what measurements would be at new locations in the spatial
domain based on the input data. This process is separately applied for each time step (daily in
this work) of data, and for each of the pollutants under study (ozone and PM2.5). In its most
general form, the model can be expressed in an equation similar to that of linear regression:
Y(s,t) = ~β0(s,t) + β1(t) · ~x(s,t) + ε(s,t)    (Equation 1)
where Y(s,t) is the observed concentration at point s and time t,
~x(s,t) is the CMAQ concentration at point s and time t (this value is a weighted average
of both the grid cell containing the monitor and neighboring grid cells),
~β0(s,t) is the intercept, composed of global and local components,
β1(t) is the global slope (local components of the slope are contained in the ~x(s,t) term), and
ε(s,t) is the model residual error.
DS has additional properties that differentiate it from linear regression:
1) Rather than just finding a single optimal solution to Equation 1, DS uses a Bayesian
approach so that uncertainties can be generated along with each concentration prediction. This
involves drawing random samples of model parameters from built-in "prior" distributions and
assessing their fit to the data on the order of thousands of times. After each iteration, properties
of the prior distributions are adjusted to try to improve the fit of the next iteration. The resulting
collections of ~β0 and β1 values at each space-time point are the "posterior" distributions, and their
means and standard deviations are used to predict concentrations and associated
uncertainties at new spatial points.
2) The model is "hierarchical" in structure, meaning that the top-level parameters in
Equation 1 (i.e., ~β0(s,t), β1(t), and ~x(s,t)) are actually defined in terms of further parameters and sub-
parameters in the DS code. For example, the overall slope and intercept are each defined to be the sum
1 Bayesian statistical modeling refers to methods that are based on Bayes' theorem, and model the world in
terms of probabilities based on previously acquired knowledge.
-------
of a global (one value for the entire spatial domain) and local (values specific to each spatial
point) component. This gives more flexibility in fitting a model to the data to optimize the fit
(i.e., minimize ε(s,t)).
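To make the calibration idea concrete, the following is a minimal, non-spatial sketch in R of a Bayesian linear calibration of model output to observations using a simple random-walk Metropolis sampler. It is not the DS code: DS adds spatially varying intercept and slope components and a hierarchical prior structure, and the data, priors, and settings below are made-up illustrations.

# Sketch only: Bayesian calibration of modeled to observed concentrations.
set.seed(1)
x <- runif(200, 30, 80)                    # stand-in "CMAQ" concentrations (ppb)
y <- 5 + 0.85 * x + rnorm(200, sd = 4)     # stand-in "observed" concentrations (ppb)

log_post <- function(b0, b1, sigma) {      # log posterior = log likelihood + log priors
  if (sigma <= 0) return(-Inf)
  sum(dnorm(y, b0 + b1 * x, sigma, log = TRUE)) +
    dnorm(b0, 0, 100, log = TRUE) + dnorm(b1, 0, 100, log = TRUE) +
    dexp(sigma, 0.01, log = TRUE)
}

n_iter <- 5000
draws  <- matrix(NA, n_iter, 3, dimnames = list(NULL, c("b0", "b1", "sigma")))
cur    <- c(0, 1, 5)                       # starting values
for (i in 1:n_iter) {
  prop <- cur + rnorm(3, sd = c(0.5, 0.01, 0.1))       # random-walk proposal
  if (log(runif(1)) < log_post(prop[1], prop[2], prop[3]) -
                      log_post(cur[1], cur[2], cur[3])) cur <- prop
  draws[i, ] <- cur
}
post <- draws[-(1:1000), ]                 # discard burn-in; remaining rows are "posterior" samples

pred <- post[, "b0"] + post[, "b1"] * 60   # predictions at a new model value of 60 ppb
c(mean = mean(pred), sd = sd(pred))        # prediction and its uncertainty

The posterior standard deviation of the prediction plays the role of the uncertainty value that DS attaches to each concentration estimate.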
EPA has recently used DS in other applications, such as providing spatial predictions of
national daily 8-hour ozone and fine particulate matter (PM2.5) concentrations at a census tract
resolution to the Centers for Disease Control (CDC) for the Public Health Air Surveillance
Evaluation (PHASE) project as part of their Environmental Public Health Tracking (EPHT)
program.
3 EVALUATION OF AIR QUALITY SPATIAL FIELD TECHNIQUES
This section describes the data, methods, and results of an evaluation performed to
assess the relative accuracy of the predictions generated by the four air quality spatial field
techniques described in the previous section.
3.1 DATA
The evaluation was designed to assess the relative ability of each spatial field technique
to reproduce monitored concentrations for two annual air quality metrics: 1) the 4th highest daily
maximum 8-hour O3 concentration (henceforth referred to as the "4th max"), and 2) the May-
September mean of the daily maximum 8-hour O3 concentrations (henceforth referred to as the
"seasonal mean"). For the ambient monitoring data, these two metrics were calculated for all
monitors in the contiguous U.S. with complete data for 2007 based on the initial dataset and the
data completeness criteria described in Appendix 4-A. For the air quality modeling data, the two
metrics were calculated from hourly O3 concentrations based on a CMAQ simulation with a 12
km gridded domain covering the contiguous U.S., and 2007 emissions and meteorology inputs
(EPA, 2012b).
3.2 METHODS
Cross-validation is a method commonly used to evaluate the ability of statistical models
to make accurate predictions. In a cross-validation analysis, the data are split into two subsets,
the "calibration" subset, and the "validation" subset. The calibration subset is used to "fit" the
model, usually by estimating parameters which establish a relationship between the variable of
interest and one or more predictor variables. The resulting model fit is then applied to the
predictor variable(s) in the validation subset, and the predictive ability of the model is assessed
by how accurately it is able to reproduce the variable of interest in the validation subset.
-------
The evaluation used a systematic "4-fold" cross-validation scheme based on the CMAQ
modeling domain (i.e., the 12 km x 12 km grid covering the continental U.S.). The CMAQ
modeling domain was divided into four groups, or "folds", so that each 2x2 block of 12 km grid
cells had one member in each fold. Figure 3 shows an example of the resulting four folds with
O3 monitor locations for the area surrounding southern Lake Michigan. Four cross-validations
were performed using VNA, eVNA, and DS, for both the 4th max and seasonal mean metrics.
The calibration subset in the first cross-validation consisted of the monitors in folds 2, 3, and 4 as
shown in Figure 3 (blue dots), while the validation subset consisted of the monitors in fold 1 (red
dots). The remaining cross-validations were performed in a similar manner, with three of the
four folds used as the calibration subset and the final fold used as the validation subset. Thus,
this method resulted in four cross-validation partitions, each with approximately 75% of the
monitoring data used in the calibration subset, and the remaining 25% used in the validation
subset. Each monitor was included in the validation subset exactly once, resulting in a validation
dataset of observed 4th max and seasonal mean values paired with VNA, eVNA, and DS
predictions of those values at monitor locations. The CMAQ predictions were simply the
modeled 4th max and seasonal mean values for the 12 km grid cells containing O3 monitors.
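As an illustration of this scheme, the R sketch below assigns synthetic monitors to folds from the row and column indices of their 12 km grid cells and runs the leave-one-fold-out loop. The interpolation function is only a stand-in for VNA/eVNA/DS (simple inverse distance squared weighting), and all data are made up.

# Sketch only: "4-fold" assignment and cross-validation loop on synthetic monitors.
set.seed(2)
monitors <- data.frame(x = runif(120, 0, 600), y = runif(120, 0, 600))   # toy coordinates (km)
monitors$row  <- floor(monitors$y / 12)                                  # 12 km grid indices
monitors$col  <- floor(monitors$x / 12)
monitors$obs  <- 50 + 0.02 * monitors$x + rnorm(120, sd = 3)             # toy "4th max" values (ppb)
monitors$fold <- (monitors$row %% 2) * 2 + (monitors$col %% 2) + 1       # one fold per cell of each 2x2 block

predict_fun <- function(train, newdata) {       # stand-in spatial predictor
  sapply(seq_len(nrow(newdata)), function(i) {
    d2 <- (train$x - newdata$x[i])^2 + (train$y - newdata$y[i])^2
    w  <- 1 / pmax(d2, 1e-6); w <- w / sum(w)
    sum(w * train$obs)
  })
}

pred <- rep(NA_real_, nrow(monitors))
for (f in 1:4) {                                 # each fold held out exactly once
  test <- monitors$fold == f
  pred[test] <- predict_fun(monitors[!test, ], monitors[test, ])
}

round(c(RMSE = sqrt(mean((pred - monitors$obs)^2)),
        R2   = cor(pred, monitors$obs)^2,
        MB   = mean(pred - monitors$obs)), 2)    # the three performance metrics used below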
-------
[Figure legend: "4-Fold Validation" — fold 1 monitors used for validation; folds 2, 3, and 4 used for calibration.]
Figure 3 Example of the "4-fold" cross-validation scheme used in the evaluation of the
air quality spatial field techniques for the southern Lake Michigan area
3.3 RESULTS
The cross-validation predictions based on the four air quality spatial field techniques
were compared with the observed 4th max and seasonal mean values based on the ambient data.
The comparison focused on three performance metrics: 1) the root mean squared error (RMSE),
2) the coefficient of determination (R2), and 3) the mean bias (MB). The results of these comparisons
are shown in Figure 4 (4th max) and Figure 5 (seasonal mean), and Table 1 contains a summary
of the three performance metrics for each technique.
The cross-validation results clearly showed that VNA, eVNA, and DS more accurately
predicted monitored 4th max and seasonal mean concentrations than the CMAQ model. The scatter
plots and the mean bias statistics indicated that both eVNA and DS were effective at reducing the
amount of bias present in the modeled concentrations. The differences between VNA, eVNA,
and DS were much smaller, but the performance statistics consistently indicated that overall DS
was able to most accurately reproduce the observed values. DS had the highest R2 values, and
-------
the lowest RMSE and MB values of the four techniques for both air quality metrics. In contrast
to eVNA, DS appeared to be an improvement over VNA, which did not make use of the modeled
concentration data. Based on these results, DS was deemed the most appropriate technique for
generating spatial fields of seasonal mean O3 concentrations for the national mortality risk
burden analyses.
[Figure: scatter plots of cross-validation predictions versus observed values, titled "Validation Results - 2007 4th Highest Daily Maximum 8-hour O3 (ppb)". Panel statistics visible include CMAQ (RMSE = 8.36, R2 = 0.501, MB = 3.22) and eVNA (RMSE = 6.13, R2 = 0.645, MB = 0.09); the full set of statistics is given in Table 1.]
Figure 4 Cross-validation results for the 2007 4th highest daily maximum 8-hour O3
concentrations
-------
[Figure: scatter plots of cross-validation predictions versus observed values, titled "Validation Results - 2007 May - September Average Daily Maximum 8-hour O3 (ppb)". Panel statistics visible include RMSE = 3.56, MB = 0.08 and RMSE = 3.39, R2 = 0.826, MB = -0.02; the full set of statistics is given in Table 1.]
Figure 5 Cross-validation results for the 2007 May-September mean of the daily
maximum 8-hour O3 concentrations
-------
Performance Metric   O3 Metric       VNA     CMAQ    eVNA    DS
RMSE (ppb)           4th max         5.34    8.36    6.13    5.01
RMSE (ppb)           seasonal mean   3.56    6.83    3.85    3.39
R2                   4th max         0.702   0.501   0.645   0.736
R2                   seasonal mean   0.808   0.634   0.779   0.826
MB (ppb)             4th max         0.32    3.22    0.09    0.03
MB (ppb)             seasonal mean   0.08    4.71    0.06    -0.02
Table 1 Summary of the cross-validation performance metrics for the four air quality
spatial field techniques
4 AIR QUALITY INPUTS TO THE NATIONAL MORTALITY RISK
BURDEN ASSESSMENT
Three air quality spatial fields were created using DS as inputs to the national mortality
risk burden analyses. These fields were based on three seasonal mean O3 metrics:
1) May-September average daily maximum 8-hour O3 concentration (consistent with the
metric used by Smith et al. 2009);
2) June-August average daily 10am-6pm mean O3 concentration (consistent with the metric
used by Zanobetti and Schwartz 2008); and
3) April-September average daily maximum 1-hour O3 concentration (consistent with the
metric used by Jerrett et al. 2009).
For the ambient monitoring data, the 2006-2008 average of these annual metrics was
calculated for all monitors in the contiguous U.S. with complete data for 2006-2008 based on the
initial dataset and the data completeness criteria described in Appendix 4-A. For the air quality
modeling data, these metrics were calculated from hourly O3 concentrations based on a CMAQ
simulation with a 12 km gridded domain covering the contiguous U.S., and 2007 emissions and
meteorology inputs. This simulation differed from the one used for the evaluations in that the
wildfire and power plant emissions inputs were based on multi-year averages instead of year-
specific estimates. Appendix 4-B contains a complete description of this CMAQ simulation and
provides relevant model evaluation results.
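For clarity, the R sketch below shows one way the three seasonal metrics could be computed from a year of hourly O3 data. The synthetic series and column names are illustrative, "10am-6pm" is assumed to mean the hours beginning 10:00 through 17:00, and the simplified MDA8 ignores 8-hour windows that cross midnight.

# Sketch only: seasonal O3 metrics from synthetic hourly data for 2007.
set.seed(3)
hrs <- seq(as.POSIXct("2007-01-01 00:00", tz = "UTC"),
           as.POSIXct("2007-12-31 23:00", tz = "UTC"), by = "hour")
o3  <- data.frame(date = as.Date(hrs),
                  hour = as.integer(format(hrs, "%H")),
                  ppb  = pmax(0, rnorm(length(hrs), mean = 45, sd = 12)))

mda8 <- tapply(o3$ppb, o3$date, function(x)                       # daily maximum 8-hour average
  max(stats::filter(x, rep(1/8, 8), sides = 1), na.rm = TRUE))
dm1  <- tapply(o3$ppb, o3$date, max)                              # daily maximum 1-hour value
day  <- o3$hour %in% 10:17
m18  <- tapply(o3$ppb[day], o3$date[day], mean)                   # daily 10am-6pm mean

mon <- as.integer(format(as.Date(names(mda8)), "%m"))
round(c(MaySep_avg_MDA8  = mean(mda8[mon %in% 5:9]),     # metric 1 (Smith et al. 2009)
        JunAug_avg_10to6 = mean(m18[mon %in% 6:8]),      # metric 2 (Zanobetti and Schwartz 2008)
        AprSep_avg_DM1   = mean(dm1[mon %in% 4:9])), 1)  # metric 3 (Jerrett et al. 2009)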
Figure 6, Figure 7, and Figure 8 show maps of the DS air quality spatial fields for the
May-September average daily maximum 8-hour O3 concentrations, the June-August average
daily 10am-6pm mean O3 concentrations, and the April-September average daily maximum 1-
hour O3 concentrations, respectively. The overall spatial pattern seen in these three fields is very
similar, with the highest concentration values for each metric occurring in Southern California.
-------
[National map; color scale labeled 60 to 80 ppb.]
Figure 6 May-September average daily maximum 8-hour O3 concentrations in ppb, based on a Downscaler fusion of
2006-2008 average monitored values with a 2007 CMAQ model simulation
-------
[National map; color scale labeled 60 to 80 ppb.]
Figure 7 June-August average daily 10am-6pm mean O3 concentrations in ppb, based on a Downscaler fusion of 2006-
2008 average monitored values with a 2007 CMAQ model simulation
-------
[National map; color scale labeled 70 to 90 ppb.]
Figure 8 April-September average daily maximum 1-hour O3 concentrations in ppb, based on a Downscaler fusion of
2006-2008 average monitored values with a 2007 CMAQ model simulation
-------
5 REFERENCES
Abt Associates, Inc. (2010). Environmental Benefits Mapping and Analysis Program (BenMAP, Version 4.0).
Bethesda, MD. Prepared for U.S. Environmental Protection Agency Office of Air Quality
Planning and Standards. Research Triangle Park, NC. Available on the Internet at
.
Berrocal, V.J., Gelfand, A.E., Holland, D.M. (2009). A Spatio-Temporal Downscaler for
Output from Numerical Models. Journal of Agricultural, Biological, and Environmental
Statistics, 15(2), 176-197.
Berrocal, V.J., Gelfand, A.E., Holland, D.M. (2010). A Bivariate Space-Time Downscaler
Under Space and Time Misalignment. Annals of Applied Statistics, 4(4), 1942-1975.
Berrocal, V.J., Gelfand, A.E., Holland, D.M. (2011). Space-Time Data Fusion Under Error in
Computer Model Output: An Application to Modeling Air Quality. Biometrics, 68(3),
837-848.
Chen, J., Zhao, R., Li, Z. (2004). Voronoi-based k-order neighbor relations for spatial analysis.
ISPRS J Photogrammetry Remote Sensing, 59(1-2), 60-72.
Foley, K.M., Roselle, S.J., Appel, K.W., Pleim, J.E., Otte, T.L., Mathur, R., Sarwar, G., Young,
J.O., Gilliam, R.C., Nolte, C.G., Kelly, J.T., Gilliland, A.B., Bash, J.O. (2010).
Incremental testing of the Community Multiscale Air Quality (CMAQ) modeling system
version 4.7. Geoscientific Model Development, 3, 205-226.
Gold, C. (1997). Voronoi methods in GIS. In: Algorithmic Foundations of Geographic
Information Systems (van Kreveld, M., Nievergelt, J., Roos, T., Widmayer, P., eds).
Lecture Notes in Computer Science, Vol 1340. Berlin: Springer-Verlag, 21-35.
Hall, E., Eyth, A., Phillips, S. (2012). Hierarchical Bayesian Model (HBM)-Derived Estimates
of Air Quality for 2007: Annual Report. EPA/600/R-12/538. Available on the Internet
at: http://www.epa.gov/heasd/sources/projects/CDC/AnnualReports/2007 HBM.pdf
Jerrett, M., R.T. Burnett, C.A. Pope III, K. Ito, G. Thurston, D. Krewski, Y. Shi, E. Calle, M.
Thun. (2009). Long-term O3 exposure and mortality. N. Eng. J. Med., 360:1085-1095.
-------
McMillan, N.J., Holland, D.M., Morara, M., Feng, J. (2009). Combining Numerical Model
Output and Particulate Data using Bayesian Space-Time Modeling. Environmetrics, Vol.
21, 48-65.
R Core Team (2012). R: A language and environment for statistical computing. R Foundation
for Statistical Computing, Vienna, Austria, http://www.R-project.org/.
Smith, R.L., Xu, B., Switzer, P. (2009). Reassessing the relationship between ozone and short-
term mortality in U.S. urban communities. Inhal Toxicol 21: 37-61.
Timin B, Wesson K, Thurman J. (2010). Application of Model and Ambient Data Fusion
Techniques to Predict Current and Future Year PM2.5 Concentrations in Unmonitored
Areas. Pp. 175-179 in Steyn, D.G., Rao, S.T. (eds). Air Pollution Modeling and Its
Application XX. Netherlands: Springer.
Turner, R. (2012). deldir: Delaunay Triangulation and Dirichlet (Voronoi) Tessellation. R
package version 0.0-19. http://CRAN.R-project.org/package=deldir
U.S. Environmental Protection Agency. (2012a). Health Risk and Exposure Assessment for
Ozone, First External Review Draft. Available on the Internet at:
http://www.epa.gov/ttn/naaqs/standards/ozone/s_o3_2008_rea.html
U.S. Environmental Protection Agency. (2012b). Air Quality Modeling Technical Support
Document for the Regulatory Impact Analysis for the Revisions to the National Ambient
Air Quality Standards for Particulate Matter. Available on the Internet at:
http://www.epa.gov/ttn/naaqs/standards/pm/data/201212aqm.pdf
Wells, B., Wesson, K., Jenkins, S. (2012). Analysis of Recent U.S. Ozone Air Quality Data to
Support the O3 NAAQS Review and Quadratic Rollback Simulations to Support the First
Draft of the Risk and Exposure Assessment. Available on the Internet at:
http://www.epa.gov/ttn/naaqs/standards/ozone/s_o3_td.html
Zanobetti, A., and J. Schwartz (2008). Mortality Displacement in the Association of Ozone with
Mortality: An Analysis of 48 Cities in the United States. American Journal of
Respiratory and Critical Care Medicine, 177, 184-189.
-------
Appendix 4-D
Model-based Air Quality Adjustment Using the Higher-order
Decoupled Direct Method (HDDM)
Table of Contents
1. Motivation for a New Technique to Simulate Ozone Concentrations Under
Alternative Standards 1
2. Higher-order Decoupled Direct Method (HDDM) 2
2.1 Capabilities 2
2.2 Limitations 4
3. Applying HDDM/CMAQ to Adjust Ozone to Just Meet Current and
Alternative Standards: Methodology 5
3.1 Conceptual Framework 5
3.2 Application to Measured O3 Concentrations in 15 Urban Areas 6
3.2.1 Multi-step Application of HDDM Sensitivities 7
3.2.2 Relationships between HDDM Sensitivities and Modeled O3
Concentrations 16
3.2.3 Application of Sensitivity Regressions to Ambient Data 24
3.2.4 Alternate Methodologies 25
4. Applying HDDM/CMAQ to Adjust Ozone to Just Meet Current and
Alternative Standards: Results 26
4.1 Emission Reductions Applied to Meet Alternative Standards 26
4.2 Design Values 27
4.3 Distribution of Hourly O3 Concentrations 39
4.4 Standard Errors for Predicted Hourly O3 Concentrations 56
4.5 Air Quality Inputs for the Epidemiology-Based Risk Assessment 59
4.6 Air Quality Inputs for the Exposure and Clinical Risk Assessment.... 66
4.7 Comparing Air Quality Adjustments Based on NOx Reductions Only
to Air Quality Adjustments Based on NOx and VOC Reductions 131
5. References 144
-------
Table of Figures
FIGURE 1: FLOW DIAGRAM DEMONSTRATING DDM MODEL-BASED O3 ADJUSTMENT APPROACH 6
FIGURE 2: CONCEPTUAL PICTURE OF S-STEP APPLICATION OF HDDM SENSITIVITIES 9
FIGURE 3: COMPARISON OF BRUTE FORCE AND S-STEP HDDM O3 ESTIMATES FOR 50% NOx CUT CONDITIONS: ATLANTA, BALTIMORE,
BOSTON, CHICAGO, CLEVELAND, DALLAS, DENVER, AND DETROIT 13
FIGURE 4: COMPARISON OF BRUTE FORCE AND S-STEP HDDM O3 ESTIMATES FOR 50% NOx CUT CONDITIONS: HOUSTON, Los
ANGELES, NEW YORK, PHILADELPHIA, SACRAMENTO, ST. Louis, AND WASHINGTON D.C 14
FIGURE 5: COMPARISON OF BRUTE FORCE AND S-STEP HDDM O3 ESTIMATES FOR 90% NOx CUT CONDITIONS: ATLANTA, BALTIMORE,
BOSTON, CHICAGO, CLEVELAND, DALLAS, DENVER, AND DETROIT 15
FIGURE 6: COMPARISON OF BRUTE FORCE AND S-STEP HDDM O3 ESTIMATES FOR 90% NOx CUT CONDITIONS: HOUSTON, Los
ANGELES, NEW YORK, PHILADELPHIA, SACRAMENTO, ST. Louis, AND WASHINGTON D.C 16
FIGURE 7: RELATIONSHIP BETWEEN SNOx AND HOURLY O3 AT A NOX-LIMITED SITE DOWNWIND OF ATLANTA (SUMMER). RELATIONSHIPS
ARE SHOWN FOR ONE NIGHTTIME HOUR, ONE MORNING RUSH-HOUR HOUR, ONE DAYTIME HOUR, AND ONE EVENING RUSH-HOUR
HOUR. THE SOLID BLUE LINE SHOW THE LINEAR REGRESSION FOR THESE POINTS AND THE DOTTED BLUE LINE SHOWS THE FLOOR
VALUE USED FOR 5WOx BASED ON THE 5 PERCENTILE MODELED VALUE 19
FIGURE 8: RELATIONSHIP BETWEEN SNOx AND HOURLY O3 AT A NOX-SATURATED SITE IN QUEENS COUNTY, NY (AUTUMN).
RELATIONSHIPS ARE SHOWN FOR ONE NIGHTTIME HOUR, ONE MORNING RUSH-HOUR HOUR, ONE DAYTIME HOUR, AND ONE
EVENING RUSH-HOUR HOUR. THE SOLID BLUE LINE SHOW THE LINEAR REGRESSION FOR THESE POINTS AND THE DOTTED BLUE LINE
SHOWS THE FLOOR VALUE USED FOR SNOx BASED ON THE 5 PERCENTILE MODELED VALUE 20
FIGURE 9: RELATIONSHIP BETWEEN SNOx AND HOURLY O3 ATA NOX-SATURATED SITE IN SUFFOLK COUNTY, NY ON LONG ISLAND
(SPRING). RELATIONSHIPS ARE SHOWN FOR ONE NIGHTTIME HOUR, ONE MORNING RUSH-HOUR HOUR, ONE DAYTIME HOUR, AND
ONE EVENING RUSH-HOUR HOUR. THE SOLID BLUE LINE SHOW THE LINEAR REGRESSION FOR THESE POINTS AND THE DOTTED BLUE
LINE SHOWS THE FLOOR VALUE USED FOR SNOx BASED ON THE 5 PERCENTILE MODELED VALUE 21
FIGURE 10: RELATIONSHIP BETWEEN S2NOx AND SNOx AT A NOX-LIMITED SITE DOWNWIND OF ATLANTA (SUMMER). RELATIONSHIPS ARE
SHOWN FOR ONE NIGHTTIME HOUR, ONE MORNING RUSH-HOUR HOUR, ONE DAYTIME HOUR, AND ONE EVENING RUSH-HOUR
HOUR. THE SOLID BLUE LINE SHOWS THE LINEAR REGRESSION FOR THESE POINTS 22
FIGURE 11: RELATIONSHIP BETWEEN S2NOx AND SNOx AT A NOX-SATURATED SITE IN QUEENS COUNTY, NY (AUTUMN). RELATIONSHIPS
ARE SHOWN FOR ONE NIGHTTIME HOUR, ONE MORNING RUSH-HOUR HOUR, ONE DAYTIME HOUR, AND ONE EVENING RUSH-HOUR
HOUR. THE SOLID BLUE LINE SHOWS THE LINEAR REGRESSION FOR THESE POINTS 23
FIGURE 12: RELATIONSHIP BETWEEN S2NOx AND SNOx AT A NOX-SATURATED SITE IN SUFFOLK COUNTY, NY ON LONG ISLAND (SPRING).
RELATIONSHIPS ARE SHOWN FOR ONE NIGHTTIME HOUR, ONE MORNING RUSH-HOUR HOUR, ONE DAYTIME HOUR, AND ONE
EVENING RUSH-HOUR HOUR. THE SOLID BLUE LINE SHOWS THE LINEAR REGRESSION FOR THESE POINTS 24
FIGURE 13: HOURLY O3 DISTRIBUTIONS AT ATLANTA AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 41
FIGURE 14: HOURLY O3 DISTRIBUTIONS AT BALTIMORE AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 42
II
-------
FIGURE 15: HOURLY O3 DISTRIBUTIONS AT BOSTON AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR QUALITY
ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-2010
(RIGHT) 42
FIGURE 16: HOURLY O3 DISTRIBUTIONS AT CHICAGO AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 43
FIGURE 17: HOURLY O3 DISTRIBUTIONS AT CLEVELAND AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 43
FIGURE 18: HOURLY O3 DISTRIBUTIONS AT DALLAS AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR QUALITY
ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-2010
(RIGHT) 44
FIGURE 19: HOURLY O3 DISTRIBUTIONS AT DENVER AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR QUALITY
ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-2010
(RIGHT) 44
FIGURE 20: HOURLY O3 DISTRIBUTIONS AT DETROIT AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 45
FIGURE 21: HOURLY O3 DISTRIBUTIONS AT HOUSTON AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 45
FIGURE 22: HOURLY O3 DISTRIBUTIONS AT Los ANGELES AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 46
FIGURE 23: HOURLY O3 DISTRIBUTIONS AT NEW YORK AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 46
FIGURE 24: HOURLY O3 DISTRIBUTIONS AT PHILADELPHIA AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 47
FIGURE 25: HOURLY O3 DISTRIBUTIONS AT SACRAMENTO AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 47
FIGURE 26: HOURLY O3 DISTRIBUTIONS AT ST. Louis AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 48
FIGURE 27: HOURLY O3 DISTRIBUTIONS AT WASHINGTON, D.C. AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND
AIR QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND
2008-2010 (RIGHT) 48
III
-------
FIGURE 28: MONTHLY O3 DISTRIBUTIONS AT ATLANTA AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 49
FIGURE 29: MONTHLY O3 DISTRIBUTIONS AT BALTIMORE AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 49
FIGURE 30: MONTHLY O3 DISTRIBUTIONS AT BOSTON AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 50
FIGURE 31: MONTHLY O3 DISTRIBUTIONS AT CHICAGO AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 50
FIGURE 32: MONTHLY O3 DISTRIBUTIONS AT CLEVELAND AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 51
FIGURE 33: MONTHLY O3 DISTRIBUTIONS AT DALLAS AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 51
FIGURE 34: MONTHLY O3 DISTRIBUTIONS AT DENVER AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 52
FIGURE 35: MONTHLY O3 DISTRIBUTIONS AT DETROIT AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 52
FIGURE 36: MONTHLY O3 DISTRIBUTIONS AT HOUSTON AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 53
FIGURE 37: MONTHLY O3 DISTRIBUTIONS AT Los ANGELES AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 53
FIGURE 38: MONTHLY O3 DISTRIBUTIONS AT NEW YORK AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 54
FIGURE 39: MONTHLY O3 DISTRIBUTIONS AT PHILADELPHIA AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 54
FIGURE 40: MONTHLY O3 DISTRIBUTIONS AT SACRAMENTO AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 55
IV
-------
FIGURE 41: MONTHLY O3 DISTRIBUTIONS AT ST. Louis AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY, AND AIR
QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT) AND 2008-
2010 (RIGHT) 55
FIGURE 42: MONTHLY O3 DISTRIBUTIONS AT WASHINGTON, D.C. AREA REGULATORY MONITORING SITES FOR OBSERVED AIR QUALITY,
AND AIR QUALITY ADJUSTED TO MEET THE EXISTING (75 PPB) AND ALTERNATIVE (65 PPB) STANDARDS FOR 2006-2008 (LEFT)
AND 2008-2010 (RIGHT) 56
FIGURE 43: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR ATLANTA BASED ON OBSERVED AND ADJUSTED AIR QUALITY.
61
FIGURE 44: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR BALTIMORE BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 61
FIGURE 45: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR BOSTON BASED ON OBSERVED AND ADJUSTED AIR QUALITY.
62
FIGURE 46: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR CLEVELAND BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 62
FIGURE 47: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR DENVER BASED ON OBSERVED AND ADJUSTED AIR QUALITY.
63
FIGURE 48: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR DETROIT BASED ON OBSERVED AND ADJUSTED AIR QUALITY.
63
FIGURE 49: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR HOUSTON BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 64
FIGURE 50: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR Los ANGELES BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 64
FIGURE 51: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR NEW YORK BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 65
FIGURE 52: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR PHILADELPHIA BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 65
FIGURE 53: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR SACRAMENTO BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 66
FIGURE 54: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR ST. Louis BASED ON OBSERVED AND ADJUSTED AIR
QUALITY 66
FIGURE 55: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN ATLANTA 71
FIGURE 56: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN BALTIMORE 72
FIGURE 57: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN BOSTON 73
FIGURE 58: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN CHICAGO 74
FIGURE 59: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN CLEVELAND 75
V
-------
FIGURE 60: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN DALLAS 76
FIGURE 61: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN DENVER 77
FIGURE 62: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN DETROIT 78
FIGURE 63: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN HOUSTON 79
FIGURE 64: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN LOS ANGELES 80
FIGURE 65: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN NEW YORK 81
FIGURE 66: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN PHILADELPHIA 82
FIGURE 67: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN SACRAMENTO 83
FIGURE 68: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN ST. LOUIS 84
FIGURE 69: CHANGE IN VNA ESTIMATES OF THE DAILY MAXIMUM S-HOUR AVERAGE (MDA8) O3 CONCENTRATIONS BASED ON HDDM
ADJUSTMENTS IN WASHINGTON, D.C 85
FIGURE 70: CHANGES IN ANNUAL 4 HIGHEST MDA8 AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON HDDM
ADJUSTMENTS FOR ATLANTA, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 86
FIGURE 71: CHANGES IN ANNUAL 4 HIGHEST MDA8 AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON HDDM
ADJUSTMENTS FOR ATLANTA, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 87
FIGURE 72: CHANGES IN ANNUAL 4 HIGHEST MDA8 AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON HDDM
ADJUSTMENTS FOR BALTIMORE, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 88
FIGURE 73: CHANGES IN ANNUAL 4 HIGHEST MDA8 AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON HDDM
ADJUSTMENTS FOR BALTIMORE, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 89
FIGURE 74: CHANGES IN ANNUAL 4 HIGHEST MDA8 AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON HDDM
ADJUSTMENTS FOR BOSTON, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 90
FIGURE 75: CHANGES IN ANNUAL 4 HIGHEST MDA8 AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON HDDM
ADJUSTMENTS FOR BOSTON, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 91
FIGURE 76: CHANGES IN ANNUAL 4 HIGHEST MDA8 AND MAY-SEPTEMBER AVERAGE MDA8 VALUES BASED ON HDDM
ADJUSTMENTS FOR CHICAGO, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 92
VI
-------
FIGURE 77: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR CHICAGO, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 93
FIGURE 78: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR CLEVELAND, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 94
FIGURE 79: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR CLEVELAND, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 95
FIGURE 80: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR DALLAS, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 96
FIGURE 81: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR DALLAS, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 97
FIGURE 82: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR DENVER, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 98
FIGURE 83: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR DENVER, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 99
FIGURE 84: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR DETROIT, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 100
FIGURE 85: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR DETROIT, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 101
FIGURE 86: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR HOUSTON, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 102
FIGURE 87: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR HOUSTON, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 103
FIGURE 88: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR LOS ANGELES, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 104
FIGURE 89: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR LOS ANGELES, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 105
VII
-------
FIGURE 90: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR NEW YORK, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 106
FIGURE 91: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR NEW YORK, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 107
FIGURE 92: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR PHILADELPHIA, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 108
FIGURE 93: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR PHILADELPHIA, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 109
FIGURE 94: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR SACRAMENTO, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 110
FIGURE 95: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR SACRAMENTO, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES
FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 111
FIGURE 96: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR ST. LOUIS, 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 112
FIGURE 97: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR ST. LOUIS, 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND VALUES FALLING
OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 113
FIGURE 98: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR WASHINGTON, D.C., 2006-2008. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND
VALUES FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 114
FIGURE 99: CHANGES IN ANNUAL 4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS VALUES BASED ON HDDM
ADJUSTMENTS FOR WASHINGTON D.C., 2008-2010. THE POINTS ARE COLORED ACCORDING TO THE CHANGE IN PPB, AND
VALUES FALLING OUTSIDE THE RANGE IN THE COLOR BAR WERE SET TO THE NEAREST VALUE WITHIN THE COLOR BAR 115
FIGURE 100: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR ATLANTA VERSUS POPULATION AND POPULATION DENSITY 116
FIGURE 101: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR BALTIMORE VERSUS POPULATION AND POPULATION DENSITY 117
FIGURE 102: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR BOSTON VERSUS POPULATION AND POPULATION DENSITY 118
FIGURE 103: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR CHICAGO VERSUS POPULATION AND POPULATION DENSITY 119
FIGURE 104: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR CLEVELAND VERSUS POPULATION AND POPULATION DENSITY 120
VIII
-------
FIGURE 105: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDA8AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR DALLAS VERSUS POPULATION AND POPULATION DENSITY 121
FIGURE 106: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR DENVER VERSUS POPULATION AND POPULATION DENSITY 122
FIGURE 107: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR DETROIT VERSUS POPULATION AND POPULATION DENSITY 123
FIGURE 108: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR HOUSTON VERSUS POPULATION AND POPULATION DENSITY 124
FIGURE 109: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR LOS ANGELES VERSUS POPULATION AND POPULATION DENSITY 125
FIGURE 110: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR NEW YORK VERSUS POPULATION AND POPULATION DENSITY 126
FIGURE 111: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR PHILADELPHIA VERSUS POPULATION AND POPULATION DENSITY 127
FIGURE 112: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR SACRAMENTO VERSUS POPULATION AND POPULATION DENSITY 128
FIGURE 113: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR ST. LOUIS VERSUS POPULATION AND POPULATION DENSITY 129
FIGURE 114: CHANGES IN VNA ESTIMATES OF ANNUAL4 HIGHEST MDAS AND MAY-SEPTEMBER AVERAGE MDAS BASED ON
HDDM ADJUSTMENTS FOR WASHINGTON, D.C. VERSUS POPULATION AND POPULATION DENSITY 130
FIGURE 115: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR DENVER BASED ON OBSERVED AND ADJUSTED AIR QUALITY
FOR THE NOX ONLY AND NOX/VOC SCENARIOS 133
FIGURE 116: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR DETROIT BASED ON OBSERVED AND ADJUSTED AIR
QUALITY FOR THE NOX ONLY AND NOX/VOC SCENARIOS 134
FIGURE 117: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR HOUSTON BASED ON OBSERVED AND ADJUSTED AIR
QUALITY FOR THE NOX ONLY AND NOX/VOC SCENARIOS 134
FIGURE 118: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR Los ANGELES BASED ON OBSERVED AND ADJUSTED AIR
QUALITY FOR THE NOX ONLY AND NOX/VOC SCENARIOS 135
FIGURE 119: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR NEW YORK BASED ON OBSERVED AND ADJUSTED AIR
QUALITY FOR THE NOX ONLY AND NOX/VOC SCENARIOS 135
FIGURE 120: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR PHILADELPHIA BASED ON OBSERVED AND ADJUSTED AIR
QUALITY FOR THE NOX ONLY AND NOX/VOC SCENARIOS 136
FIGURE 121: COMPOSITE MONITOR DAILY MAXIMUM S-HOUR O3 VALUES FOR SACRAMENTO BASED ON OBSERVED AND ADJUSTED AIR
QUALITY FOR THE NOX ONLY AND NOX/VOC SCENARIOS 136
FIGURE 122: APRIL-OCTOBER SEASONAL AVERAGE OF DAILY MAXIMUM S-HOUR O3 VALUES AT DENVER AREA MONITOR LOCATIONS FOR
OBSERVED 2006-2008 CONDITIONS (LEFT PANEL), 75 PPB ADJUSTMENT SCENARIOS (MIDDLE PANELS), AND 65 PPB ADJUSTMENT
SCENARIOS (RIGHT PANELS). NOx ONLY ADJUSTMENTS ARE SHOWN IN TOP PANELS, NOx/VOC ADJUSTMENTS ARE SHOWN IN
BOTTOM PANELS 138
FIGURE 123: APRIL-OCTOBER SEASONAL AVERAGE OF DAILY MAXIMUM S-HOUR O3 VALUES AT DETROIT AREA MONITOR LOCATIONS FOR
OBSERVED 2006-2008 CONDITIONS (LEFT PANEL), 75 PPB ADJUSTMENT SCENARIOS (MIDDLE PANELS), AND 65 PPB ADJUSTMENT
IX
-------
SCENARIOS (RIGHT PANELS). NOx ONLY ADJUSTMENTS ARE SHOWN IN TOP PANELS, NOx/VOC ADJUSTMENTS ARE SHOWN IN
BOTTOM PANELS 139
FIGURE 124: APRIL-OCTOBER SEASONAL AVERAGE OF DAILY MAXIMUM S-HOUR O3 VALUES AT HOUSTON AREA MONITOR LOCATIONS
FOR OBSERVED 2006-2008 CONDITIONS (LEFT PANEL), 75 PPB ADJUSTMENT SCENARIOS (MIDDLE PANELS), AND 65 PPB
ADJUSTMENT SCENARIOS (RIGHT PANELS). NOx ONLY ADJUSTMENTS ARE SHOWN IN TOP PANELS, NOx/VOC ADJUSTMENTS ARE
SHOWN IN BOTTOM PANELS 140
FIGURE 125: APRIL-OCTOBER SEASONAL AVERAGE OF DAILY MAXIMUM S-HOUR O3 VALUES AT Los ANGELES AREA MONITOR LOCATIONS
FOR OBSERVED 2006-2008 CONDITIONS (LEFT PANEL), 75 PPB ADJUSTMENT SCENARIOS (MIDDLE PANELS), AND 65 PPB
ADJUSTMENT SCENARIOS (RIGHT PANELS). NOx ONLY ADJUSTMENTS ARE SHOWN IN TOP PANELS, NOx/VOC ADJUSTMENTS ARE
SHOWN IN BOTTOM PANELS 141
FIGURE 126: APRIL-OCTOBER SEASONAL AVERAGE OF DAILY MAXIMUM S-HOUR O3 VALUES AT NEW YORK AREA MONITOR LOCATIONS
FOR OBSERVED 2006-2008 CONDITIONS (LEFT PANEL), 75 PPB ADJUSTMENT SCENARIOS (MIDDLE PANELS), AND 65 PPB
ADJUSTMENT SCENARIOS (RIGHT PANELS). NOx ONLY ADJUSTMENTS ARE SHOWN IN TOP PANELS, NOx/VOC ADJUSTMENTS ARE
SHOWN IN BOTTOM PANELS 142
FIGURE 127: APRIL-OCTOBER SEASONAL AVERAGE OF DAILY MAXIMUM S-HOUR O3 VALUES AT PHILADELPHIA AREA MONITOR
LOCATIONS FOR OBSERVED 2006-2008 CONDITIONS (LEFT PANEL), 75 PPB ADJUSTMENT SCENARIOS (MIDDLE PANELS), AND 65
PPB ADJUSTMENT SCENARIOS (RIGHT PANELS). NOX ONLY ADJUSTMENTS ARE SHOWN IN TOP PANELS, NOX/VOC ADJUSTMENTS
ARE SHOWN IN BOTTOM PANELS 143
FIGURE 128: APRIL-OCTOBER SEASONAL AVERAGE OF DAILY MAXIMUM S-HOUR O3 VALUES AT SACRAMENTO AREA MONITOR LOCATIONS
FOR OBSERVED 2006-2008 CONDITIONS (LEFT PANEL), 75 PPB ADJUSTMENT SCENARIOS (MIDDLE PANELS), AND 65 PPB
ADJUSTMENT SCENARIOS (RIGHT PANELS). NOx ONLY ADJUSTMENTS ARE SHOWN IN TOP PANELS, NOx/VOC ADJUSTMENTS ARE
SHOWN IN BOTTOM PANELS 144
X
-------
Table of Tables
TABLE 1: X AND Y CUTPOINTS USED IN EQUATIONS (4) AND (8). NOTE: THE NOx/VOC SENSITIVITY CASE WAS ONLY PERFORMED IN
TWO CITIES 12
TABLE 2: PERCENT EMISSIONS REDUCTIONS USED FOR EACH URBAN AREA TO OBTAIN EACH STANDARD 26
TABLE 3: DESIGN VALUES FOR THE ATLANTA AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET THE
EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 29
TABLE 4: DESIGN VALUES FOR THE BALTIMORE AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET
THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 29
TABLE 5: DESIGN VALUES FOR THE BOSTON AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET THE
EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 30
TABLE 6: DESIGN VALUES FOR THE CHICAGO AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET THE
EXISTING AND POTENTIAL ALTERNATIVE STANDARDS USING NOX AND VOC EMISSIONS REDUCTIONS IN 2006-2008 AND 2008-
2010 30
TABLE 7: DESIGN VALUES FOR THE CLEVELAND AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET
THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 31
TABLE 8: DESIGN VALUES FOR THE DALLAS AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET THE
EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 31
TABLE 9: DESIGN VALUES FOR THE DENVER AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET THE
EXISTING AND POTENTIAL ALTERNATIVE STANDARDS USING NOX AND VOC EMISSIONS REDUCTIONS IN 2006-2008 AND 2008-
2010 32
TABLE 10: DESIGN VALUES FOR THE DETROIT AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET THE
EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 33
TABLE 11: DESIGN VALUES FOR THE HOUSTON AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET
THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 33
TABLE 12: DESIGN VALUES FOR THE Los ANGELES AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO
MEET THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS USING THE LOWER BOUND OF THE 95 PERCENT CONFIDENCE
INTERVAL OF ESTIMATED HOURLY O3 IN 2006-2008 AND 2008-2010 34
TABLE 13: DESIGN VALUES FOR THE NEW YORK AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET
THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS USING THE LOWER BOUND OF THE 95 PERCENT CONFIDENCE INTERVAL
OF ESTIMATED HOURLY O3 IN 2006-2008 AND 2008-2010 35
TABLE 14: DESIGN VALUES FOR THE PHILADELPHIA AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO
MEET THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 36
TABLE 15: DESIGN VALUES FOR THE SACRAMENTO AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO
MEET THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 37
TABLE 16: DESIGN VALUES FOR THE SAINT Louis AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS TO MEET
THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 38
TABLE 17: DESIGN VALUES FOR THE WASHINGTON D.C. AREA REGULATORY MONITORS FROM OBSERVED DATA AND FOR ADJUSTMENTS
TO MEET THE EXISTING AND POTENTIAL ALTERNATIVE STANDARDS IN 2006-2008 AND 2008-2010 38
TABLE 18: MEAN STANDARD ERROR (PPB) IN ADJUSTED HOURLY O3 CONCENTRATION IN EACH URBAN AREA FOR EACH STANDARD 56
XI
-------
TABLE 19: 95 PERCENTILE STANDARD ERROR (PPB) IN ADJUSTED HOURLY O3 CONCENTRATION IN EACH URBAN AREA FOR EACH
STANDARD 57
TABLE 20: COMPARISON OF NOX-ONLY AND NOx/VOC EMISSION REDUCTIONS APPLIED IN SENSITIVITY ANALYSES FOR SEVEN URBAN
AREAS 132
XII
-------
1 MOTIVATION FOR A NEW TECHNIQUE TO SIMULATE OZONE
CONCENTRATIONS UNDER ALTERNATIVE STANDARDS
As part of the reviews of the National Ambient Air Quality Standards (NAAQS) for O3,
EPA estimates health risks after O3 has been adjusted to just meet the existing standard and
potential alternative standards. The first draft documents for this review rely upon the quadratic
rollback method used in previous reviews to adjust or "roll back" hourly O3 concentrations in
urban areas. Although the quadratic rollback method simulates historical patterns of air quality
changes more closely than some alternative methods (e.g., simply shaving peak concentrations
off at the NAAQS level), its implementation requires some assumptions that may not always
hold true. Specifically, the quadratic rollback method assumes that all monitors in an urban
area will have the same response to emissions changes, and it does not allow that response to
vary by time of day. In addition, it assumes that ozone concentrations never
increase in response to emissions reductions. However, during NOx-saturated (i.e., VOC-
limited) conditions, NOx reductions can result in ozone increases (Seinfeld and Pandis, 1998).
Finally, since the quadratic rollback method is purely a mathematical technique and does not
account for physical and chemical atmospheric processes or the sources of emissions precursors
that lead to O3 formation, a backstop or "floor" must be used to ensure that predicted O3 is not
reduced below "background" concentrations1.
EPA has received comments during past O3 NAAQS reviews and during the January 9-
10, 2012 and September 11-13, 2013 Clean Air Scientific Advisory Committee (CASAC) meetings for
this O3 NAAQS review which encourage the use of alternatives to the quadratic rollback method. In
addition, the National Research Council of the National Academies (NRC, 2008) recommended
that EPA explore how emissions reductions might affect temporal and spatial variations in O3
concentrations, and that EPA include information on how NOx versus VOC control strategies
might affect risk and exposure to O3.
Photochemical modeling can simulate the O3 response to emission reductions while
avoiding the limitations presented by the quadratic rollback method. While there are
uncertainties inherent in any modeling exercise due to uncertainties in inputs and model
parameters, photochemical modeling provides a more representative characterization of the
1 Background O3 has been characterized in previous reviews of the O3 NAAQS as "policy relevant
background" or PRB, defined as O3 concentrations that would exist in the absence of North American
anthropogenic emissions. In the current review, we have refined the concept of background O3 to recognize that
there are several possible definitions of background O3, reflecting both the geographic source of emissions, e.g.
U.S., North American, Global non-U.S., and whether emissions are anthropogenic or natural in origin. In the cases
described in this document, "background" refers to O3 that would exist in the absence of U.S. anthropogenic
emissions.
-------
spatial and temporal responses of O3 to emissions reductions. In this document we present a
model-based methodology for adjusting O3 concentrations to reflect changes in precursor emissions.
This new approach is firmly rooted in the latest atmospheric modeling science and was peer
reviewed in Simon et al. (2012). This analysis uses EPA's Community Multi-scale Air Quality
(CMAQ) photochemical model (www.cmaq-model.org) instrumented with the Higher-order
Decoupled Direct Method (HDDM) - a tool that generates modeled sensitivities of O3 to
emissions changes - to estimate the distribution of O3 concentrations associated with
achievement of the existing O3 standard and alternative standards for multiple urban areas. The
HDDM sensitivities are applied to ambient measurements of O3 to estimate how O3
concentrations would respond to changes in U.S. anthropogenic emissions. We use this
methodology to estimate O3 concentrations meeting the existing and potential alternative
standards in the 2nd drafts of the risk and exposure assessments and the 2nd draft policy
assessment. The photochemical modeling incorporates emissions from non-anthropogenic
sources and anthropogenic emissions from sources in the U.S. and in portions of Canada and
Mexico. Pollution from sources in other locations within and outside of North America is
included as transport into the boundary of the modeling domain. Because the application of the
model-based approach focuses on reductions in U.S. anthropogenic emissions while holding
constant those emissions that influence U.S. background, all changes in O3 will be relative to
U.S. background. This does not mean that background O3 concentrations will be constant
between recent ambient O3 conditions and after just meeting the existing or alternative standard
levels, because of nonlinearities in the formation of ozone.
2. HIGHER-ORDER DECOUPLED DIRECT METHOD (HDDM)
2.1 CAPABILITIES
Chemical transport models, such as CMAQ, simulate the effects of physical and
chemical processes in the atmosphere to predict 3-D gridded pollutant concentrations (Foley et
al., 2010; Appel et al., 2008; Appel et al., 2007; Byun and Schere, 2006). These models account
for the impacts of emissions, transport, chemistry, and deposition on spatially and temporally
varying pollutant concentrations. Required model inputs include time-varying emissions and
meteorology fields, time varying concentrations of pollutants at the boundaries of the model
domain (i.e. boundary conditions), and a characterization of the 3-D field of chemical
concentrations to initialize the model (i.e. initial conditions).
Beyond modeling the concentrations of ambient O3, chemical transport models can be
used to estimate the response of ambient O3 concentrations to changes in emissions. One
-------
technique to simulate the response of O3 to emissions changes, the brute force method, requires
the modeler to explicitly model this response by directly altering the emissions inputs in the
model simulation. This technique provides an estimate of the O3 concentration at the altered
emission level, but often does not provide accurate information regarding the response of O3 to
other levels of emissions since the chemistry of O3 formation is nonlinear. Therefore, when
using only brute force techniques, a new model simulation would need to be performed for
every emissions scenario under consideration.
Other analytical techniques have been developed to estimate the O3 response to
emissions perturbations without performing multiple simulations. One such method is termed
the Decoupled Direct Method (DDM) (Dunker, 1984). DDM solves for sensitivity coefficients,
defined in Equations (2) and (3) as partial derivatives of modeled concentrations with respect to
model inputs, which are obtained by differentiating the atmospheric diffusion equations that
underlie the model calculations.
sy(t) = Equation (2)
Sy (t) = p^M = p^m. EM Equation (3)
'' ' dpj I d(ejPj) dej n v '
Here, Sy(t), the sensitivity, gives the change in model concentration, C;, (for instance Os
concentration) with an incremental change in any input parameter, PJ (in this case emissions).
Equation (3) allows us to normalize the sensitivity coefficient, Sy(t), so that it shows response in
relative terms for the input rather than in absolute units. Therefore, PJ (x,t) is the normalized
input and Sj is a scaling variable (Yang et al, 1997). In general terms, the sensitivity coefficient
tells us how a model output (O3 concentration) will change if a model input (emissions of NOX
or VOC) is perturbed. This first order sensitivity coefficient, Sy(t) is quite suitable for small
perturbations, but gives a linear response which is unlikely to represent the results of large
perturbations in very nonlinear chemical environments. Second (and third) order derivatives
can be calculated to give higher order sensitivity coefficients (Hakami et al, 2003). Higher
order sensitivity coefficients give the curvature and inflection points for the response curve and
can capture the nonlinearities in the response of Os to emissions changes. Using Higher order
DDM (HDDM) allows for the sensitivities to be more appropriately applied over larger
emissions perturbations. Hakami et al. (2003) report that for an application in California,
HDDM gave reasonable approximations of Os changes compared to brute force results for
perturbations of emissions up to 50% using the first three terms of the Taylor series expansion,
Equation (4).
-------
    C(Δε) = C(0) + Δε·S(0) + (Δε²/2!)·S²(0) + ... + (Δεⁿ/n!)·Sⁿ(0) + R_(n+1)       Equation (4)

Here Δε represents the relative change in emissions (for instance, Δε = -0.2 would be equivalent
to reducing emissions by 20%), Sⁿ(0) is the n-th order sensitivity coefficient, C(0) is the
concentration under baseline conditions (no perturbation in emissions), and R_(n+1) is a remainder
term.
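For illustration, the short sketch below evaluates this truncated Taylor series for a single hypothetical grid cell and hour; the function name and sensitivity values are illustrative placeholders, not outputs of the CMAQ/HDDM system described here.

    # Sketch of Equation (4) truncated at second order (hypothetical values).
    def taylor_o3(c0, s1, s2, delta_eps):
        """Estimate O3 after a relative emissions change.

        c0        -- baseline O3 concentration (ppb)
        s1, s2    -- first- and second-order sensitivity coefficients (ppb)
        delta_eps -- relative emissions change (e.g., -0.2 for a 20% reduction)
        """
        return c0 + delta_eps * s1 + (delta_eps ** 2 / 2.0) * s2

    # Example: a 20% NOx reduction applied to an hour with 60 ppb O3,
    # assuming S = 25 ppb and S2 = -10 ppb (hypothetical values).
    print(taylor_o3(60.0, 25.0, -10.0, -0.2))  # 60.0 - 5.0 - 0.2 = 54.8 ppb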
A variant of DDM called DDM-3D has been implemented into several chemical
transport models, including CMAQ, for both O3 and particulate matter (PM) predictions
(Dunker, 1984; Yang et al, 1997; Hakami et al, 2003; Cohan et al, 2005; Napelenok et al, 2006;
Koo et al., 2010; Zhang et al, 2012). These implementations allow the modeler to define the
parameters for which first and higher order sensitivities will be calculated. For instance, the
sensitivity can be calculated for emissions from a specific source type, for emissions in a
specific geographic region, and for emissions of a single O3 precursor or for multiple O3
precursors. In addition, sensitivities can be calculated to boundary conditions, initial
conditions, and various other model inputs. Sensitivities to different sets of parameters can be
calculated in a single model simulation but computation time increases as the number of
sensitivities increases. Outputs from an HDDM simulation consist of time varying 3-D fields
of first and second order sensitivities.
For the purposes of the O3 NAAQS analysis, HDDM provides an improved approach
compared to existing quadratic rollback techniques for several reasons. First, it captures the non-
linearity of the O3 response to emissions changes, representing both increases and decreases in O3
concentrations resulting from emissions reductions. Second, HDDM characterizes different O3
responses at different locations (downtown urban versus downwind suburban) and at different
times of day, allowing us to incorporate temporal and spatial variations in response into the O3
adjustment methodology. Finally, HDDM eliminates the need to use "background" O3 as a
floor for rollback, since predicted sensitivities are based on model formulations that explicitly
account for background sources.
2.2 LIMITATIONS
In addition to the many potential benefits of using HDDM to understand and
characterize O3 response to emissions changes, there are several limitations. First, HDDM
encompasses all of the uncertainties of the base photochemical model formulation and inputs.
So uncertainties in how the physical and chemical processes are treated in the model and in the
model inputs propagate to the HDDM results. Also, HDDM can capture response to larger
emissions perturbations than DDM but it is still most accurate for small perturbations. The
larger the relative change in emissions, the less likely that the HDDM sensitivities will properly
-------
capture the change in O3 that would be predicted by a brute force model simulation. Several
studies have reported reasonable performance of HDDM for O3 for emissions perturbations up to
50% (Hakami et al., 2003; Cohan et al., 2005; Hakami et al., 2004), but the magnitude
of perturbation over which HDDM will give accurate estimates will depend on the specific
modeling episode, size of the model domain, emissions and meteorological inputs, and the size
of the emissions source to which the sensitivity is being calculated. In this work, we applied
sensitivities derived from model simulations done under varying NOx levels (see Section 3.2.1)
and found that using this technique we were able to replicate brute force estimates using
HDDM sensitivities for up to 90% NOx reductions with a mean bias of less than 3 ppb and a
mean error of less than 4 ppb.
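The mean bias and mean error statistics quoted above are simple paired comparisons of hourly values; a minimal sketch of their calculation, using made-up numbers rather than the actual model output, is shown below.

    import numpy as np

    # Hypothetical paired hourly O3 values (ppb): HDDM-adjusted vs. brute force.
    hddm  = np.array([42.0, 55.3, 61.2, 70.8])
    brute = np.array([41.5, 56.0, 60.1, 72.0])

    mean_bias  = np.mean(hddm - brute)          # signed average difference (ppb)
    mean_error = np.mean(np.abs(hddm - brute))  # average absolute difference (ppb)
    print(mean_bias, mean_error)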
3. APPLYING HDDM/CMAQ TO ADJUST OZONE TO JUST MEET
CURRENT AND ALTERNATIVE STANDARDS: METHODOLOGY
3.1 CONCEPTUAL FRAMEWORK
This section outlines the methodology from Simon et al. (2012) in which we apply
CMAQ/HDDM to estimate hourly O3 concentrations that might result from meeting the
existing and potential alternative standards. As part of the methodology, photochemical
modeling results are not used in an absolute sense, but instead are applied to modulate ambient
measurements, thus tying estimated O3 distributions to measured values. The basic steps are
outlined below and in Figure 1. Details are given in Section 3. The inputs, set-up, and
evaluation of the modeling system are described in Appendix 4-B.
Step 1: Run a CMAQ simulation with HDDM to determine hourly O3 sensitivities to
NOx emissions and VOC emissions for the grid cells containing monitoring sites in an
urban area.
Step 2: For each monitoring site, season, and hour of the day, use linear regression to
relate the first order sensitivities to NOx and VOC (SNOx and SVOC) to modeled O3 and
the second order sensitivities to NOx and VOC (S2NOx and S2VOC) to the first order
sensitivities.
Step 3: For each measured hourly O3 value, calculate the first and second order
sensitivities based on the monitoring site-, season-, and hour-specific functions calculated in
Step 2.
Step 4: Adjust measured hourly 2006-2010 O3 concentrations for incrementally
increasing levels of emissions reductions using the assigned sensitivities, and then
recalculate 2006-2008 and 2008-2010 design values until all monitors in an urban area
are in attainment of the existing and potential alternative levels of the standard.
[Figure 1 image (flow diagram): Natural and anthropogenic O3 precursor emissions (U.S., Canada, and Mexico), meteorology, initial and boundary conditions, and other model inputs feed CMAQ HDDM modeling (Step 1a; Jan, Apr-Oct 2007). Gridded hourly O3 concentrations and sensitivities are extracted at the locations of monitoring sites for each modeled hour (Step 1b). Regressions are created between sensitivities and hourly O3 for each location, season, and hour-of-the-day (Step 2). The regressions and observed O3 are used to predict sensitivities paired with hourly observations for 2006-2010 at all monitor locations (Step 3). Selected emissions reductions are applied to adjust hourly O3 values for 2006-2010 at each monitor location to show attainment with the existing and alternate standards (Step 4).]
Figure 1: Flow diagram demonstrating DDM model-based O3 adjustment approach
3.2 APPLICATION TO MEASURED O3 CONCENTRATIONS IN 15 URBAN AREAS
In this analysis, we apply the model-based adjustment approach described above for
attainment of the existing standard and three potential alternative standards. The analysis
covers the 15 urban areas listed in Chapter 4 using photochemical modeling for January, and
April-October of 2007 and ambient data for the years 2006-2010. When running CMAQ with
HDDM, additional information is required to designate model inputs for calculating
sensitivities. In this analysis, HDDM was set up to calculate the sensitivity of O3
-------
concentrations to U.S. anthropogenic NOx and VOC emissions.² U.S. anthropogenic emissions
were defined as all emissions in the following sectors: commercial marine and rail, onroad
mobile, offroad mobile, Electric Generating Unit (power plant) point sources, non-EGU point,
and non-point area. These sectors accounted for 17.6 million of the total domain-wide 23.6
million tons per year of NOX emissions and 13.8 million of the total domain-wide 68.1 million
tons per year of VOC emissions. Sensitivities were not determined for biogenic, fire, Canadian,
or Mexican emissions. In addition, sensitivities were not calculated for any emissions
originating from outside the domain (i.e. entering through the use of boundary concentrations).
3.2.1 Multi-step Application of HDDM Sensitivities
As discussed in Section 2.2 of this appendix, HDDM has been reported to reasonably
replicate brute force emissions reductions up to a 50% change in emissions. For this analysis, it
was desirable to have confidence that the HDDM sensitivities could replicate the entire range of
emissions reductions. Evaluations of the HDDM estimates compared to brute force emissions
reduction model runs confirm that the HDDM estimates of O3 response to NOx reductions are
fairly comparable for a 50% change. However, HDDM and brute force estimates begin to
diverge in comparisons under larger emissions changes (90%). Consequently, four additional
CMAQ/HDDM runs were performed under different levels of NOx and VOC emissions
reductions in order to characterize ozone sensitivities to NOx reductions over a larger range of
emissions perturbations. One CMAQ/HDDM simulation was performed with U.S.
anthropogenic NOX cut by 50%. A second additional simulation was performed with a 90%
NOX reduction. Another set of CMAQ/HDDM simulations were performed for simultaneous
NOX and VOC cuts (50% NOX and 50% VOC; 90% NOX and 90% VOC). Emissions of other
species were not modified from the base case in these four simulations. These additional
HDDM simulations give ozone sensitivities to NOX and VOC under conditions with lower NOX
(or NOX and VOC) emissions in the US. These sensitivities are used in a multistep adjustment
approach.
Figure 2 gives a conceptual picture of the multistep adjustment procedure using first-
order sensitivities. Sensitivities from the base run are used to adjust O3 concentrations for NOx
emissions reductions up to X%. Additional emission reductions beyond X% use sensitivities
from the 50% NOx cut run until reductions exceed (X+Y)%. Finally, sensitivities from the 90%
NOx cut run are applied for any emission reductions beyond (X+Y)%. In order to more closely
approximate the non-linear O3 response to any level of emissions reductions, 2nd order terms are
added to the multistep approximation method in Equations (4)-(7). P represents the percentage
² Sensitivities only tracked U.S. emissions in the contiguous 48 states.
-------
NOx cut for which the ΔO3 values are being calculated, S and S² are the first and second order
O3 sensitivities to U.S. NOx emissions, and X and Y are described above. Alternately, Equation
(8) can be used with Equations (5)-(7) if simultaneous NOx and VOC cuts are being simulated.
Note that the 50% and 90% cut sensitivities used in Equation (8) come from model runs which
cut both VOC and NOx by these percentages, in contrast to the NOx cut only sensitivities that
are used in Equation (4). Consequently, Equation (8) works only for equal percentage cuts in
both NOx and VOC emissions. In Equations (4)-(7), we cap ΔO3 so that O3 never drops below
zero due to emissions changes.
    ΔO3 = -a·S_NOx,base + (a²/2)·S²_NOx,base - b·S_NOx,50%cut + (b²/2)·S²_NOx,50%cut
          - c·S_NOx,90%cut + (c²/2)·S²_NOx,90%cut                                   Equation (4)

    a = P/100 for P ≤ X;  a = X/100 for P > X                                       Equation (5)

    b = 0 for P ≤ X;  b = (P - X)/100 for X < P ≤ X + Y;  b = Y/100 for P > X + Y    Equation (6)

    c = 0 for P ≤ X + Y;  c = (P - X - Y)/100 for P > X + Y                          Equation (7)
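For illustration, the sketch below applies the three-step weighting of Equations (4) through (7) to one observed hourly value; the sensitivity values and dictionary keys are hypothetical placeholders, and the cap that keeps adjusted O3 from dropping below zero is applied to the final concentration.

    def multistep_delta_o3(P, X, Y, sens):
        """Three-step estimate of the O3 change for a P% NOx reduction
        (a sketch of Equations (4)-(7); 'sens' holds hypothetical first- and
        second-order sensitivities from the base, 50% cut, and 90% cut runs, in ppb)."""
        a = min(P, X) / 100.0                                 # Equation (5)
        b = 0.0 if P <= X else min(P - X, Y) / 100.0          # Equation (6)
        c = 0.0 if P <= X + Y else (P - X - Y) / 100.0        # Equation (7)
        return (-a * sens["S_base"]  + a ** 2 / 2.0 * sens["S2_base"]
                - b * sens["S_50cut"] + b ** 2 / 2.0 * sens["S2_50cut"]
                - c * sens["S_90cut"] + c ** 2 / 2.0 * sens["S2_90cut"])  # Equation (4)

    # Hypothetical sensitivities for one site, season, and hour (ppb).
    sens = {"S_base": 20.0, "S2_base": -8.0,
            "S_50cut": 15.0, "S2_50cut": -5.0,
            "S_90cut": 10.0, "S2_90cut": -3.0}
    obs_o3 = 72.0
    adjusted_o3 = max(obs_o3 + multistep_delta_o3(80, 40, 46, sens), 0.0)  # cap at zero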
-------
[Figure 2 image. Axes: O3 concentration versus % reduction in baseline NOx emissions (markers at X, 50, X+Y, 75, P, and 100). Legend: NOx first-order approximation from base DDM simulation; NOx first-order approximation from 50% NOx cut DDM simulation; NOx first-order approximation from 75% NOx cut DDM simulation; theoretical ozone response curve to NOx emission reductions; CMAQ DDM simulations; starting concentration; estimated concentration after P% reduction in baseline NOx emissions.]
Figure 2: Conceptual picture of 3-step application of HDDM sensitivities
The ideal values for the equation transition points, X and Y, are determined by minimizing
the least squares error between the adjusted concentrations using the multistep approach
and modeled concentrations from brute force NOx cut runs. We first determined the value of X
which gave the lowest error compared to brute force estimates at 50% NOx cuts. Then, holding
X constant, we determined the value of Y which gave the lowest error compared to brute force
estimates at 90% NOx cuts. This process was performed independently for each of the 15
urban areas in this analysis.
Error in HDDM estimates of hourly O3 is defined here as the difference between
the HDDM estimates and the brute force O3. Based on Equations (4)-(7), this can be calculated from
Equations (9) and (10) for 50% NOx cuts:
-------
    ε = ΔOzone_HDDM,50 - ΔOzone_BF,50                                              Equation (9)

    ε = -(X/100)·S_NOx,base + (X²/(2·100²))·S²_NOx,base
        - ((50 - X)/100)·S_NOx,50%cut + ((50 - X)²/(2·100²))·S²_NOx,50%cut
        - ΔOzone_BF,50                                                             Equation (10)

Equation (10) can be rearranged to appear in the form AX² + BX + C:

    ε = AX² + BX + C                                                               Equation (11)

    A = (S²_NOx,base + S²_NOx,50%cut) / (2·100²)                                   Equation (12)

    B = (S_NOx,50%cut - S_NOx,base)/100 - (100·S²_NOx,50%cut)/(2·100²)             Equation (13)

    C = -S_NOx,50%cut/2 + S²_NOx,50%cut/8 - ΔOzone_BF,50                           Equation (14)

Next, the error is squared, summed over all points (error can be calculated for each hourly O3
value at each monitoring location), and the derivative is set to 0 to determine the X which gives the
least squares error (Equations (15), (16), and (17)).

    ε² = A²X⁴ + 2ABX³ + (2AC + B²)X² + 2BCX + C²                                   Equation (15)

    Σε² = (ΣA²)X⁴ + (Σ2AB)X³ + (Σ(2AC + B²))X² + (Σ2BC)X + ΣC²                      Equation (16)

    (Σε²)' = (4ΣA²)X³ + (3Σ2AB)X² + (2Σ(2AC + B²))X + (Σ2BC) = 0                    Equation (17)

The value of X that gives the least squares error will occur at one of the 3 roots of the cubic
polynomial in Equation (17) or at 0 or 50. All real roots, 0, and 50 were input into Equation (16) and X was
set to the value which resulted in the lowest error in each city. An analogous procedure was
followed to determine Y using the 90% NOx cut brute force simulation and Equations (18)
through (24).
-------
    ε = -(X/100)·S_NOx,base + (X²/(2·100²))·S²_NOx,base - (Y/100)·S_NOx,50%cut
        + (Y²/(2·100²))·S²_NOx,50%cut - ((90 - X - Y)/100)·S_NOx,90%cut
        + ((90 - X - Y)²/(2·100²))·S²_NOx,90%cut - ΔOzone_BF,90                     Equation (18)

    ε² = A²Y⁴ + 2ABY³ + (2AC + B²)Y² + 2BCY + C²                                    Equation (19)

    A = (S²_NOx,50%cut + S²_NOx,90%cut) / (2·100²)                                  Equation (20)

    B = (S_NOx,90%cut - S_NOx,50%cut)/100 - ((90 - X)·S²_NOx,90%cut)/100²           Equation (21)

    C = -(X/100)·S_NOx,base + (X²/(2·100²))·S²_NOx,base - ((90 - X)/100)·S_NOx,90%cut
        + ((90 - X)²/(2·100²))·S²_NOx,90%cut - ΔOzone_BF,90                         Equation (22)

    Σε² = (ΣA²)Y⁴ + (Σ2AB)Y³ + (Σ(2AC + B²))Y² + (Σ2BC)Y + ΣC²                       Equation (23)

    (Σε²)' = (4ΣA²)Y³ + (3Σ2AB)Y² + (2Σ(2AC + B²))Y + (Σ2BC) = 0                     Equation (24)
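For illustration, the sketch below carries out the least-squares choice of X described by Equations (11) through (17), as reconstructed above; the input arrays are hypothetical hourly sensitivities rather than values extracted from the CMAQ/HDDM runs.

    import numpy as np

    def find_x_cutpoint(S_b, S2_b, S_50, S2_50, d_bf50):
        """Choose the X cutpoint that minimizes the summed squared error between
        the multistep HDDM estimate and the 50% NOx cut brute force O3 change."""
        A = (S2_b + S2_50) / (2 * 100.0 ** 2)                          # Equation (12)
        B = (S_50 - S_b) / 100.0 - 100.0 * S2_50 / (2 * 100.0 ** 2)    # Equation (13)
        C = -S_50 / 2.0 + S2_50 / 8.0 - d_bf50                         # Equation (14)

        # Coefficients of the summed squared error polynomial (Equation (16)).
        p4, p3 = np.sum(A ** 2), np.sum(2 * A * B)
        p2, p1 = np.sum(2 * A * C + B ** 2), np.sum(2 * B * C)
        p0 = np.sum(C ** 2)
        sse = np.poly1d([p4, p3, p2, p1, p0])

        # Real roots of the derivative (Equation (17)) plus the endpoints 0 and 50.
        roots = np.roots([4 * p4, 3 * p3, 2 * p2, p1])
        candidates = [r.real for r in roots
                      if abs(r.imag) < 1e-9 and 0.0 <= r.real <= 50.0] + [0.0, 50.0]
        return min(candidates, key=sse)

    # Hypothetical inputs for a handful of hours at one monitor.
    rng = np.random.default_rng(0)
    S_b, S2_b = rng.uniform(10, 30, 5), rng.uniform(-10, -2, 5)
    S_50, S2_50 = rng.uniform(5, 20, 5), rng.uniform(-8, -1, 5)
    d_bf50 = rng.uniform(-15, -5, 5)
    print(find_x_cutpoint(S_b, S2_b, S_50, S2_50, d_bf50))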
This methodology can also be adjusted to calculate least squares error cutpoints for the
combined NOx and VOC emissions reduction case. The X and Y cutpoints which have the least
squares error in each urban area are shown in Table 1. This 3-step adjustment methodology was
shown to be a robust method for minimizing error in the HDDM applications for larger
percentage changes in emissions by Simon et al. (2012). Figure 3 through Figure 6 are density
scatter plots that compare hourly ozone estimates from brute force with hourly ozone estimates
from the 3-step HDDM adjustments at all monitor locations in each of the 15 urban areas
evaluated in this study. The colors in these plots depict the percentage of points falling at any
one location. Mean error for the 50% and 90% NOx cut cases of the 3-step HDDM adjustment,
compared to brute force results, is less than 1 ppb and 4 ppb, respectively, in all 15 case study
areas.
-------
Table 1: X and Y cutpoints used in Equations (4) and (8). Note: the NOx/VOC sensitivity
case was only performed in two cities.

Urban Area         NOx cuts only      NOx and VOC combined cuts
                     X      Y              X      Y
Atlanta             40     46              -      -
Baltimore           40     46              -      -
Boston              39     45              -      -
Chicago             38     45             40     46
Cleveland           42     45              -      -
Dallas              37     46              -      -
Denver              39     45             43     45
Detroit             38     45              -      -
Houston             37     44              -      -
Los Angeles         38     46              -      -
New York            37     45              -      -
Philadelphia        40     45              -      -
Sacramento          38     45              -      -
Saint Louis         39     45              -      -
Washington D.C.     40     46              -      -
-------
[Figure 3: Comparison of brute force and 3-step HDDM O3 estimates for 50% NOx cut conditions: Atlanta, Baltimore, Boston, Chicago, Cleveland, Dallas, Denver, and Detroit. Each panel is a density scatter plot of the 3-step HDDM estimate against the brute force O3 at the 50% NOx cut and reports the regression fit, mean bias (MB), and mean error (ME).]
-------
[Figure 4 image: per-city density scatter plots of the 3-step HDDM O3 estimates against brute force O3 at the 50% NOx cut for Houston, Los Angeles, New York, Philadelphia, Sacramento, Saint Louis, and Washington D.C. area sites; each panel reports the regression fit, mean bias (MB), and mean error (ME).]
Figure 4: Comparison of brute force and 3-step HDDM Os estimates for 50% NOx cut
conditions: Houston, Los Angeles, New York, Philadelphia, Sacramento, St. Louis,
and Washington D.C.
-------
[Figure 5 image: per-city density scatter plots of the 3-step HDDM O3 estimates against brute force O3 at the 90% NOx cut for Atlanta, Baltimore, Boston, Chicago, Cleveland, Dallas, Denver, and Detroit area sites; each panel reports the regression fit, mean bias (MB), and mean error (ME).]
Figure 5: Comparison of brute force and 3-step HDDM Os estimates for 90% NOx cut
conditions: Atlanta, Baltimore, Boston, Chicago, Cleveland, Dallas, Denver, and
Detroit.
-------
[Figure 6 image: per-city density scatter plots of the 3-step HDDM O3 estimates against brute force O3 at the 90% NOx cut for Houston, Los Angeles, New York, Philadelphia, Sacramento, Saint Louis, and Washington D.C. area sites; each panel reports the regression fit, mean bias (MB), and mean error (ME).]
Figure 6: Comparison of brute force and 3-step HDDM Os estimates for 90% NOx cut
conditions: Houston, Los Angeles, New York, Philadelphia, Sacramento, St. Louis,
and Washington D.C.
3.2.2 Relationships between HDDM Sensitivities and Modeled Os Concentrations
First and second order hourly O3 sensitivities to VOC and NOx were extracted from the
HDDM simulation for model grid cells that contained the O3 monitors in the 15 urban areas.
Extracted data included modeled sensitivities at monitor locations for all modeled hours in
2007. These sensitivities cannot be applied directly to observed values for two reasons: 1) high
modeled O3 days/hours do not always occur concurrently with high observed O3 days/hours and
-------
2) the modeling time period includes 8 months in 2007 but the time period we are analyzing in
this REA includes five full years of ambient data, 2006-2010. As to the first point,
photochemical models are generally used in a relative sense for purposes of projecting design
values to assess attainment with the NAAQS standard. In this manner, model predictions are
"anchored" to measured ambient values. In general, the average response on high modeled
days is used for this purpose. This allows for more confidence in calculated results when "less
than ideal model performance [occurs] on individual days" (US EPA, 2007). Similarly, for this
analysis we believe it is appropriate to account for the fact that the model does not always perfectly
agree with measurements and that sensitivities from a low O3 modeled day would not be
appropriate to apply to a high O3 measured day (and vice-versa) even if they occur on the same
calendar day. For the second point, due to current resource and time constraints we were only
able to model 8 months in 2007. However, the O3 exposure analysis evaluates the effects of O3
decreases for two 3-year periods, 2006-2008 and 2008-2010, most of which falls outside the
modeled time period. For both of these reasons, a method was developed to generalize the
modeled site-, season-, and hour-specific sensitivities so that they could be applied to ambient
data during 2006-2010.³
Simon et al. (2012) describe how first order sensitivities are generally well correlated
with hourly modeled O3 concentrations and second order sensitivities are well correlated with first
order sensitivities. Based on their analysis, we create a separate linear regression for SNOx and
SVOC as functions of hourly O3 (i.e., SNOx = m×O3 + b) for every site, season⁴, and hour-of-the-
day examined in this analysis. For instance, for summer 8:00 a.m. hours at Detroit Site 260990009,
SNOx and O3 values from all 8:00 a.m. hours in June-August 2007 are used to fit this relationship.
Since only 8 months were modeled (January, April-October), regressions for summer season
relationships include more data points (92) than those for winter (31), spring (61), and fall (61).
Similarly, S2NOx and S2VOC were calculated as functions of SNOx and SVOC, respectively, and
S2NOx,VOC was calculated as a linear combination of both SNOx and SVOC. Figure 7, Figure 8, and
Figure 9 show examples of these regressions for first order NOx sensitivities for a NOx-limited
site (summertime, downwind of Atlanta), a NOx-saturated site (autumn, Queens NY), and a
transitional site which switches between chemical regimes (spring, Long Island NY). Example
relationships are shown for four different times of day with different O3 response behavior:
3 The 8 months modeled covered a variety of conditions such that we can use the results from this modeled time
period in conjunction with the ambient data from the longer 5-year period for estimating response and applying
adjustments
4 Here seasons are defined as follows:
For ambient data, Winter = December, January, February; Spring = March, April, May;
Summer = June, July, August; Fall = September, October, November.
For modeled data, Winter = January; Spring = April, May; Summer = June, July, August;
Fall = September, October
-------
nighttime (1:00 LST), mid-afternoon (15:00 LST), morning rush-hour (8:00 LST), and evening
rush-hour (18:00 LST). The Atlanta area NOx-limited site (Figure 7) generally showed positive
sensitivities, which would lead to decreasing ozone with decreasing emissions. However, some
limited O3 increases with NOx reductions (negative sensitivities) occurred at this site
during nighttime and rush-hour periods. The Queens site had negative sensitivities at these four
hours on almost all days, indicating strongly NOx-saturated conditions. The slopes from the
regressions at the Queens site were negative, indicating larger ozone increases at higher
concentrations. Finally, the Long Island site had both positive and negative sensitivities, but the
slope was positive, resulting in ozone increases at low base case concentrations but ozone
decreases at higher base case concentrations. Correlations were strongest at the NOx-limited
site and during rush-hour and nighttime periods for the other sites. Figure 10, Figure 11, and
Figure 12 show these regressions from the same sites and time-periods for second order
sensitivities.
Comparisons between brute force and HDDM O3 estimates, shown in Figure 5 and
Figure 6, demonstrate that for the vast majority of data points, HDDM replicates brute force
with minimal errors. However, these figures show a small number of instances in which
HDDM predicts very high hourly O3 (> 100 ppb) while the brute force emissions simulations
show much lower O3 (< 40 ppb). In such cases, base modeled O3 is low due to NOx titration
and increases occur with reductions of NOx. The HDDM sensitivities for these few points
appear to be too high to be applied over large emissions changes because of strongly nonlinear
chemistry. While the brute force simulations predict modest increases in O3 at these hours, the
HDDM method predicts very large increases in O3 at these hours. To prevent outlier sensitivities
from biasing this analysis, we set a floor value for negative SNOX values (ozone increases) in
each regression so that SNOX never drops below the 5th percentile modeled SNOX (for each site,
hour, season grouping) when negative SNOX values are simulated (see dotted blue lines in Figure
7, Figure 8, and Figure 9). In addition, SNox is prevented from dropping below 0 when all
modeled SNOX values were positive (i.e. no modeled increases in ozone). These floor values
have the most influence when regressions with highly NOx-saturated conditions, such as those
shown for Queens NY, are applied to ambient O3 values that are higher than the modeled range.
For instance in Figure 9, if there were any measured autumn 18:00 O3 concentrations above 30
ppb at that site in 2006-2010, the floor would prevent the extrapolation of O3 increases from the
modeled time period to conditions that do not apply. This floor is necessary for O3 increases
which, if extrapolated, could unrealistically increase O3 without a bound. In contrast, the O3
decreases can never exceed the total measured O3 so positive NOx sensitivities have a built-in
upper bound.
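For illustration, a minimal sketch of one of these site-, season-, and hour-specific regressions and of the floor rule described above is given below; the modeled values are hypothetical, and np.polyfit stands in for whatever regression routine was actually used.

    import numpy as np

    def fit_sensitivity_regression(o3_model, snox_model):
        """Fit SNOx = m*O3 + b for one site/season/hour grouping and return a
        predictor that applies the floor described above (5th percentile of the
        modeled SNOx when negative values were modeled, otherwise zero)."""
        m, b = np.polyfit(o3_model, snox_model, 1)          # linear regression
        floor = np.percentile(snox_model, 5) if np.any(snox_model < 0) else 0.0
        def predict(o3_obs):
            return np.maximum(m * np.asarray(o3_obs) + b, floor)
        return predict

    # Hypothetical modeled hourly O3 (ppb) and SNOx (ppb) for one grouping.
    o3_model = np.array([28.0, 35.0, 41.0, 52.0, 60.0, 44.0])
    snox_model = np.array([-2.0, 1.5, 4.0, 9.0, 12.0, 5.0])
    predict_snox = fit_sensitivity_regression(o3_model, snox_model)
    print(predict_snox([30.0, 75.0]))  # sensitivities assigned to observed hours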
-------
For the 50% and 90% emissions cut CMAQ/HDDM simulations, regressions were
performed for the first order NOx and VOC sensitivities against modeled O3 from the base HDDM
simulation. The regression technique was applied to the first and second order NOx and
VOC sensitivities from the base run and from the 50% emissions cut and 90% emissions cut
simulations. The sensitivities from the emissions cut runs were fitted to hourly O3
concentrations in the base simulation. Simon et al. (2012) found that correlations relating
sensitivities from the NOx cut simulations to base case O3 concentrations were similar to
those obtained using O3 concentrations from the NOx cut runs.
[Figure 7 image: scatter plots of SNOx (ppb) versus hourly O3 (ppb) at Atlanta Site 131510002 in summer for 1:00, 8:00, 15:00, and 18:00, with fitted regression lines and reported correlations.]
Figure 7: Relationship between SNOx and hourly O3 at a NOx-limited site downwind of
Atlanta (summer). Relationships are shown for one nighttime hour, one morning
rush-hour hour, one daytime hour, and one evening rush-hour hour. The solid
blue line shows the linear regression for these points and the dotted blue line shows
the floor value used for SNOx based on the 5th percentile modeled value.
-------
[Figure 8 image: scatter plots of SNOx (ppb) versus hourly O3 (ppb) at NY Site 360810098 in autumn for 1:00, 8:00, 15:00, and 18:00, with fitted regression lines.]
Figure 8: Relationship between SNOx and hourly O3 at a NOx-saturated site in Queens
County, NY (autumn). Relationships are shown for one nighttime hour, one
morning rush-hour hour, one daytime hour, and one evening rush-hour hour. The
solid blue line shows the linear regression for these points and the dotted blue line
shows the floor value used for SNOx based on the 5th percentile modeled value.
-------
[Figure 9 image: scatter plots of SNOx (ppb) versus hourly O3 (ppb) at NY Site 361030004 in spring for 1:00, 8:00, 15:00, and 18:00, with fitted regression lines and reported correlations.]
Figure 9: Relationship between SNOx and hourly O3 at a NOx-saturated site in Suffolk
County, NY on Long Island (spring). Relationships are shown for one nighttime
hour, one morning rush-hour hour, one daytime hour, and one evening rush-hour
hour. The solid blue line shows the linear regression for these points and the dotted
blue line shows the floor value used for SNOx based on the 5th percentile modeled
value.
-------
[Figure 10 image: scatter plots of S2NOx (ppb) versus SNOx (ppb) at Atlanta Site 131510002 in summer for 1:00, 8:00, 15:00, and 18:00, with fitted regression lines and reported correlations.]
Figure 10: Relationship between S2NOx and SNOx at a NOx-limited site downwind of Atlanta
(summer). Relationships are shown for one nighttime hour, one morning rush-
hour hour, one daytime hour, and one evening rush-hour hour. The solid blue line
shows the linear regression for these points.
-------
[Figure 11 image: scatter plots of S2NOx (ppb) versus SNOx (ppb) at NY Site 360810098 in autumn for 1:00, 8:00, 15:00, and 18:00, with fitted regression lines.]
Figure 11: Relationship between S2NOx and SNOx at a NOx-saturated site in Queens County,
NY (autumn). Relationships are shown for one nighttime hour, one morning rush-
hour hour, one daytime hour, and one evening rush-hour hour. The solid blue line
shows the linear regression for these points.
-------
[Figure 12 image: scatter plots of S2NOx (ppb) versus SNOx (ppb) at NY Site 361030004 in spring for 1:00, 8:00, 15:00, and 18:00, with fitted regression lines.]
Figure 12: Relationship between S2NOx and SNOx at a NOx-saturated site in Suffolk County,
NY on Long Island (spring). Relationships are shown for one nighttime hour, one
morning rush-hour hour, one daytime hour, and one evening rush-hour hour. The
solid blue line shows the linear regression for these points.
3.2.3 Application of Sensitivity Regressions to Ambient Data
To apply the HDDM adjustments to observed data, sensitivities must be determined for
each hour from 2006-2010 at each site based on the linear relationship from the modeled data
and the observed O3 concentration. The linear regression model also allows us to quantify the
standard error of each predicted sensitivity value at each hour and site.
Observed hourly O3 from 2006-2010 at each monitor location was adjusted by applying
incrementally increasing NOx reductions (Equations (4)-(8)) and recalculating MDA8 values at
each step until all monitors in an urban area achieved design values at the standard level being
evaluated (75, 70, 65, or 60 ppb). Therefore, all monitors within an urban area were treated as
responding to the same percentage reduction in NOx or NOx/VOC emissions. The standard error
associated with each predicted sensitivity from the linear model can be propagated through
Equations (4) and (8) to quantify the standard error in the final predicted O3 concentration. This
-------
gives a measure of variability in the sensitivities at any given O3 concentration and allows us to
quantify how much our predicted O3 could change given that variability.
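For illustration, the iteration described in this subsection can be sketched as a simple search over the percentage reduction; the design-value function below is a toy stand-in for the actual recalculation of MDA8 values and design values from the adjusted hourly data.

    import numpy as np

    def smallest_reduction_meeting_standard(design_value_fn, level, step=1):
        """Increase the percentage NOx (or NOx/VOC) reduction until every monitor's
        recalculated design value is at or below the target level."""
        for pct in range(0, 101, step):
            if np.all(design_value_fn(pct) <= level):
                return pct
        return None  # the target level could not be reached

    # Toy stand-in: design values (ppb) that decline roughly linearly with the cut.
    base_dvs = np.array([84.0, 78.0, 76.0])
    toy_dv_fn = lambda pct: base_dvs - 0.3 * pct
    print(smallest_reduction_meeting_standard(toy_dv_fn, 75))  # 30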
3.2.4 Alternate Methodologies
The methodology used for most of the 15 urban areas is described above. In most cases
the O3 response was calculated using Equation (4), which represents a NOx only control
strategy. In addition, sensitivities applied to observed data were based on the predicted
sensitivity for that hourly O3 from the site-, season-, and hour-specific regression. However, two
alternate scenarios were used for a few cities as described below.
A combined NOX/VOC control strategy was applied to two cities for which controlling
monitors were sensitive to changes in VOC emissions: Chicago and Denver. These were the
only case study areas in which the addition of VOC reductions allowed attainment of the
targeted standard levels with smaller reductions in NOx emissions than would be required if
VOC emissions were held constant. In these cases, Equation (8) was applied in place of
Equation (4), and the X and Y cutpoints described in Table 1 were derived for the NOx/VOC
control case based on brute force runs that reduced the two O3 precursors by equal percentages.
In two cities, New York and Los Angeles, results were affected by aberrant behavior at
a few monitors which occurred during rush-hour and nighttime hours. At a few highly
urbanized monitors during some seasons, the ozone increases during highly titrated rush-hour
and nighttime hours appear to be overestimated, leading to predictions that when NOx was
reduced ozone would peak during rush-hour periods on the very highest ozone days (compare
outlier dots for 65 ppb standard at 9:00-17:00 to outlier dots for 65 ppb standard at 18:00-8:00
in Figure 23). Even at these monitors, the majority of days did not have this problem as shown
by the more normal diurnal pattern for the boxes (interquartile range) and whiskers (1.5 x
interquartile range) in Figure 23. As mentioned above, for each predicted hourly O3
concentration we calculated a standard error based on the uncertainty in the fitted regressions,
representing the variability in modeled sensitivities which is not explained by hourly O3 levels
alone for each site, season, and hour. So, in addition to the predicted O3, we can create alternate
hourly O3 datasets based on the standard error in these values. To address the aberrant behavior
on the very highest ozone days in the adjustment scenarios in New York and Los Angeles, we
-------
took the lower bound of the 95 percent confidence interval for each hourly O3 prediction and used
that value to determine the NOx reductions required to meet the existing and potential
alternative standards and to calculate the hourly O3 that would occur under that reduced
emission scenario. During most hours and at most monitors, the 95 percent confidence range
was small (less than +/- 2.7 ppb), so the use of the lower bound made little
difference in predicted ozone concentrations (see Table 18 and Table 19). However, using the
lower bound of the 95 percent confidence interval allowed us to dampen the effect of
overpredicted ozone increases during rush-hour times. Since the O3 concentrations for each
standard level were created using a consistent methodology, these O3 datasets can be used to
compare between standards in these cities. In addition, at a single standard level (for instance
75 ppb) these O3 values can be compared against those obtained using the base methodology in
order to quantify uncertainty due to variability in modeled sensitivities. However, O3 values for
one standard obtained using the lower bound of the 95 percent confidence interval are not
contrasted against O3 concentrations for a potential alternative standard which were obtained
using the base methodology, since these datasets are not directly comparable.
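For illustration, the lower-bound values used for New York and Los Angeles can be sketched as below, assuming a normal approximation for the 95 percent confidence interval; the predicted concentrations and standard errors are hypothetical.

    import numpy as np

    # Hypothetical predicted hourly O3 (ppb) and the standard errors propagated
    # through Equations (4) and (8) for those hours.
    o3_pred = np.array([78.2, 64.5, 55.0])
    std_err = np.array([1.2, 0.9, 1.4])

    # Lower bound of a normal-approximation 95 percent confidence interval.
    o3_lower = o3_pred - 1.96 * std_err
    print(o3_lower)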
4. APPLYING HDDM/CMAQ TO ADJUST OZONE TO JUST MEET
CURRENT AND ALTERNATIVE STANDARDS: RESULTS
4.1 EMISSION REDUCTIONS APPLIED TO MEET ALTERNATIVE STANDARDS
Table 2 reports the percentage reduction in domain-wide emissions that were used to
reach the existing and potential alternative standard levels in each urban area. Percentages in
Chicago and Denver represent reductions in anthropogenic NOX and VOC. Percentages in all
other cities represent reductions only in anthropogenic NOX emissions. Percentages in New
York and in Los Angeles were calculated based on air quality estimates at the lower end of the
95 percent confidence interval, as discussed above in Section 3.2.4. Please note that these
reductions are broad nationwide emission cuts and are not intended to represent recommended
control scenarios, since they would not be the most efficient method for achieving the standard
in many localized areas.
Table 2: Percent emissions reductions used for each urban area to obtain each standard.

Urban Area        Years         Standard Level*
                                75 ppb   70 ppb   65 ppb   60 ppb
Atlanta           2006-2008      50%      58%      64%      71%
                  2008-2010      23%      43%      54%      62%
Baltimore         2006-2008      46%      54%      61%      69%
                  2008-2010      44%      52%      60%      67%
Boston            2006-2008      40%      49%      61%      70%
                  2008-2010      13%      40%      53%      65%
Chicago           2006-2008      19%      52%      66%      80%
                  2008-2010      N/A      27%      55%      70%
Cleveland         2006-2008      48%      61%      73%      88%
                  2008-2010      50%      64%      77%      88%
Dallas            2006-2008      50%      57%      65%      72%
                  2008-2010      50%      58%      64%      71%
Denver            2006-2008      51%      65%      78%      87%
                  2008-2010      15%      46%      64%      87%
Detroit           2006-2008      59%      69%      76%      84%
                  2008-2010      N/A      54%      66%      78%
Houston           2006-2008      62%      68%      74%      82%
                  2008-2010      42%      53%      63%      75%
Los Angeles       2006-2008      87.1%    89.3%    91.2%    93.2%
                  2008-2010      87%      89%      91%      93%
New York          2006-2008      64%      74%      92%      N/A
                  2008-2010      52%      67%      89%      N/A
Philadelphia      2006-2008      54%      61%      68%      74%
                  2008-2010      42%      52%      61%      68%
Sacramento        2006-2008      63%      70%      76%      84%⁵
                  2008-2010      64%      71%      77%      84%
Saint Louis       2006-2008      45%      56%      66%      75%
                  2008-2010      10%      34%      50%      63%
Washington D.C.   2006-2008      53%      60%      67%      74%
                  2008-2010      31%      50%      60%      71%
* N/A values for the 75 ppb standard level mean that a particular urban area did not have any
design values above 75 for that 3-year period so no controls were needed. N/A values for the
60 ppb standard level mean that this adjustment methodology was not able to bring design
values down to 60 for that particular city and 3-year period.
4.2 DESIGN VALUES
Table 3 through Table 17 report the design values for sites in each of the urban areas for
recent years (2006-2008 and 2008-2010) and for adjusted O3 levels representing just meeting
standards of 75, 70, 65, and 60 ppb. In each table, the highest design value for each scenario is
5 An error was discovered in the Sacramento adjustment to 60 ppb for the 2006-2008 design value period in which
84% NOx cuts were applied instead of 85% NOx cuts. The 84% NOx cuts thus simulate meeting a standard of 61
ppb for Sacramento for those years. Results for the 85% NOx cut case are not expected to be substantially
different from the 84% NOx cut case that was analyzed.
-------
displayed in bold text. These tables demonstrate that in some urban areas high O3 values at
monitors in different locations have different magnitudes of response to reductions in NOx (and
VOC) emissions. Atlanta monitor 132470001, a downwind monitor, had the highest design
value for 2006-2008. With the NOx reduction scenarios to meet various standard levels, the
two downtown monitors (131210055 and 130890002) switch to being the highest monitors in
the area. These downtown locations which have large amounts of spatially concentrated NOx
emissions are expected to be more VOC limited and therefore less responsive to NOx emission
reductions than surrounding rural and suburban areas. This phenomenon occurs to varying
degrees in each of the 15 urban areas included in this analysis. For instance, the difference in
response between monitors on high days is very pronounced in Cleveland and in New York. In
Cleveland, a downwind monitor located East of the city along Lake Erie was the high monitor
for both 2006-2008 and 2008-2010 (39071001). That monitor responded more dramatically to
NOx cuts than another monitor (390850003) which was closer to the city. As a result of
adjusting air quality to show attainment of a 60 ppb standard at monitor 390850003, the design
value at monitor 390071001 was lowered to 45 ppb for the 2006-2008 data and 44 ppb for the
2008-2010 data. Similarly in the New York area, the highest measured Os values occurred at
downwind sites in Connecticut and on Long Island. Two highly urbanized sites (360050110 in
the Bronx and 360610135 in Manhattan) have lower observed design values but become the
controlling monitors in the adjustment scenarios for 75, 70, and 65 ppb. In contrast, several
cities do not show this change in location of the highest monitor after adjustments to show
attainment with potential alternative standards. For example, the highest monitor for the 2008-
2010 design value period in the Dallas area (484393009, located on the North side of Fort
Worth) remained the highest monitor through all adjustment scenarios for that time period.
Similarly, the controlling monitor in Sacramento (060670012, a downwind suburban site)
remained the highest or second highest site for all adjustment scenarios. However, in the
Sacramento case, another downwind monitor (060610006) which did not have as high of an
observed design value, responded less to emissions changes and essentially "caught up" to the
controlling monitor in the 65 and 60 ppb scenarios.
It is important to note that a model-based adjustment technique is uniquely capable of
capturing this type of spatial heterogeneity in response to emissions reductions. The quadratic
rollback technique used in the first draft REA forces Os at all monitoring sites in an urban area
to respond identically.
-------
Table 3: Design values for the Atlanta area regulatory monitors from observed data and
for adjustments to meet the existing and potential alternative standards in 2006-2008 and
2008-2010.
Monitor        2006-2008                              2008-2010
              obs  75 ppb  70 ppb  65 ppb  60 ppb    obs  75 ppb  70 ppb  65 ppb  60 ppb
132470001      95    72      66      61      55       78    71      64      59      55
132230003      80    62      58      55      51       70    64      59      55      52
131510002      94    72      65      61      55       79    73      66      60      55
131350002      88    71      67      63      58       74    71      66      61      58
131210055      91    75      70      65      59       80    75      70      65      60
131130001      86    67      61      57      53      N/A   N/A     N/A     N/A     N/A
130970004      87    68      63      59      54       75    69      64      59      55
130890002      93    75      70      65      60       79    75      70      64      60
130850001      77    59      55      53      49       71    66      60      56      52
130770002      84    65      60      56      52       68    63      58      54      51
130670003      85    68      63      59      53       76    72      66      62      57
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 4: Design values for the Baltimore area regulatory monitors from observed data
and for adjustments to meet the existing and potential alternative standards in 2006-2008
and 2008-2010.
Monitor        2006-2008                              2008-2010
              obs  75 ppb  70 ppb  65 ppb  60 ppb    obs  75 ppb  70 ppb  65 ppb  60 ppb
240030014      87    72      66      61      56       79    67      63      58      54
240051007      80    68      63      59      53       77    68      64      60      56
240053001      85    74      69      65      60       78    69      65      62      58
240130001      83    68      63      59      53       76    64      60      56      53
240251001      91    75      70      65      59       89    75      70      65      60
240259001      89    75      69      65      59       78    68      64      60      56
245100054      66    61      58      56      52       67    62      59      55      53
The highest DV for each scenario is shown in bold.
-------
Table 5: Design values for the Boston area regulatory monitors from observed data and
for adjustments to meet the existing and potential alternative standards in 2006-2008 and
2008-2010.
Monitor
250010002
250051002
250070001
250092006
250094004
250095005
250170009
250171102
250213003
250250041
250250042
250270015
2006-2008
obs
79
80
83
81
78
79
75
78
82
74
67
82
75ppb
70
70
74
73
72
72
68
70
73
72
66
75
70ppb
67
67
70
69
68
68
64
66
68
69
64
70
65 ppb
62
62
65
62
63
62
58
59
62
63
60
63
60 ppb
58
57
60
57
58
56
52
53
56
58
56
58
2008-2010
obs
73
75
78
74
N/A
71
68
71
73
72
62
76
75 ppb
72
73
75
72
N/A
69
66
69
72
72
63
74
70 ppb
68
67
69
68
N/A
66
62
64
68
70
63
70
65 ppb
63
62
65
63
N/A
62
58
58
62
65
60
65
60 ppb
58
57
60
56
N/A
56
53
53
57
60
57
58
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 6: Design values for the Chicago area regulatory monitors from observed data and
for adjustments to meet the existing and potential alternative standards using NOx and
VOC emissions reductions in 2006-2008 and 2008-2010.
Monitor
170310001
170310032
170310064
170310076
170311003
170311601
170314002
170314007
170314201
170317002
170436001
170890005
2006-2008
obs
76
74
71
73
73
75
62
66
71
70
63
66
75 ppb
73
74
70
72
73
72
63
66
70
68
63
63
70 ppb
65
70
66
67
70
66
61
62
66
63
59
57
65 ppb
59
65
62
62
65
60
58
58
62
58
54
53
60 ppb
53
58
56
56
60
53
54
53
56
53
49
47
2008-2010
obs
69
68
64
67
66
70
65
59
69
63
60
66
75 ppb
69
68
64
67
66
70
65
59
69
63
60
66
70 ppb
66
69
65
65
66
67
65
59
68
62
58
62
65 ppb
60
65
61
60
63
60
62
57
63
58
54
56
60 ppb
54
60
57
56
60
55
58
53
59
53
51
51
-------
170971002
170971007
171110001
171971011
180890022
180890030
180892008
180910005
180910010
181270024
181270026
550590019
71
72
64
66
73
77
73
69
70
74
70
78
69
71
61
63
70
75
71
66
67
72
67
75
63
64
55
55
64
70
66
58
58
65
59
68
58
59
51
50
59
65
62
53
53
60
54
62
54
54
47
46
53
60
56
47
47
56
50
56
64
73
64
62
61
64
67
65
65
67
62
74
64
73
64
62
61
64
67
65
65
67
62
74
61
70
61
59
59
63
65
61
61
64
58
70
56
64
55
52
54
60
62
55
54
59
53
64
52
59
51
49
50
56
57
51
49
55
49
59
The highest DV for each scenario is shown in bold.
Table 7: Design values for the Cleveland area regulatory monitors from observed data
and for adjustments to meet the existing and potential alternative standards in 2006-2008
and 2008-2010.
Monitor        2006-2008                              2008-2010
              obs  75 ppb  70 ppb  65 ppb  60 ppb    obs  75 ppb  70 ppb  65 ppb  60 ppb
390071001      84    69      62      55      45       77    64      58      50      44
390350034      78    74      69      64      57       75    72      67      62      57
390350064      74    67      61      56      47       68    62      57      51      45
390355002      81    74      68      61      50       75    68      62      55      48
390550004      73    61      55      49      42       77    63      55      48      42
390850003      78    75      70      65      60       76    75      70      65      60
390850007      76    72      67      62      55       72    70      66      60      56
390930018      74    65      59      53      44       70    62      56      49      43
391030003      72    61      55      49      41       70    58      52      47      41
391331001      73    62      56      50      42       67    57      51      46      41
391530020      82    69      62      54      44       75    64      57      50      44
The highest DV for each scenario is shown in bold.
Table 8: Design values for the Dallas area regulatory monitors from observed data and
for adjustments to meet the existing and potential alternative standards in 2006-2008 and
2008-2010.
Monitor
480850005
481130069
481130075
481130087
2006-2008
obs
83
74
80
82
75 ppb
70
67
71
65
70 ppb
66
63
68
63
65 ppb
61
59
63
59
60 ppb
57
56
59
55
2008-2010
obs
77
67
78
78
75 ppb
66
63
71
65
70 ppb
62
59
67
61
65 ppb
57
56
63
58
60 ppb
54
54
58
54
-------
481210034
481211032
481390016
481391044
482311006
482510003
482570005
483670081
483970001
484390075
484391002
484392003
484393009
484393011
91
81
75
N/A
70
83
73
84
75
89
83
87
87
79
73
65
62
N/A
56
69
59
66
61
75
71
75
73
64
68
62
59
N/A
54
66
57
62
57
70
68
70
68
61
62
57
55
N/A
51
61
53
57
53
64
64
65
63
58
56
53
51
N/A
47
56
50
52
49
58
60
60
57
54
80
78
72
68
64
80
67
75
74
85
79
86
82
79
67
65
61
56
54
69
56
61
62
74
70
75
71
66
62
60
57
53
50
65
53
57
57
69
66
70
66
61
57
55
54
50
47
59
50
54
53
63
62
65
61
58
53
51
50
47
45
55
47
50
49
58
58
60
55
54
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 9: Design values for the Denver area regulatory monitors from observed data and
for adjustments to meet the existing and potential alternative standards using NOx and
VOC emissions reductions in 2006-2008 and 2008-2010.
Monitor
080013001
080050002
080130011
080310014
080310025
080350004
080590002
080590005
080590006
080590011
080690011
080691004
081230009
2006-2008
obs
71
71
81
73
N/A
82
78
78
86
81
82
71
76
75ppb
69
65
70
70
N/A
72
73
70
75
75
70
61
66
70ppb
66
61
65
67
N/A
67
69
65
70
70
65
57
62
65 ppb
65
58
60
64
N/A
61
64
61
64
65
59
53
57
60 ppb
60
54
55
60
N/A
56
58
56
58
59
55
50
53
2008-2010
obs
70
67
73
68
65
76
73
72
77
72
74
65
71
75 ppb
70
66
71
68
64
74
72
71
75
72
72
63
69
70 ppb
69
63
66
67
62
69
70
67
70
70
65
57
63
65 ppb
65
59
61
64
59
64
65
61
65
65
60
53
59
60 ppb
60
52
53
60
56
54
56
54
55
57
52
48
52
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
-------
Table 10: Design values for the Detroit area regulatory monitors from observed data and
for adjustments to meet the existing and potential alternative standards in 2006-2008 and
2008-2010.
Monitor
260490021
260492001
260990009
260991003
261250001
261470005
261610008
261630001
261630019
2006-2008
obs
74
76
81
80
76
78
74
71
82
75ppb
61
58
69
75
70
63
61
64
74
70ppb
56
54
64
70
65
58
56
59
69
65 ppb
52
50
61
65
60
55
53
56
64
60 ppb
49
46
57
60
55
51
49
51
59
2008-2010
obs
68
68
74
73
72
71
66
66
75
75 ppb
68
68
74
73
72
71
66
66
75
70 ppb
59
56
66
70
69
60
58
62
70
65 ppb
54
51
61
65
64
56
54
58
65
60 ppb
49
47
57
60
57
51
49
53
59
The highest DV for each scenario is shown in bold.
Table 11: Design values for the Houston area regulatory monitors from observed data and
for adjustments to meet the existing and potential alternative standards in 2006-2008 and
2008-2010.
Monitor
480391004
480391016
481671034
482010024
482010026
482010029
482010046
482010047
482010051
482010055
482010062
482010066
482010070
482010075
482010416
482011015
482011034
2006-2008
obs
85
76
N/A
83
80
85
75
76
80
91
81
89
74
76
89
74
80
75 ppb
63
56
N/A
64
61
63
60
61
63
70
62
75
66
66
70
57
63
70 ppb
59
53
N/A
60
57
58
57
58
59
65
58
70
64
64
65
55
60
65 ppb
55
51
N/A
56
54
54
54
54
56
60
55
65
61
62
62
52
57
60 ppb
50
47
N/A
51
50
47
49
49
51
53
50
60
59
59
56
48
53
2008-2010
obs
85
74
75
83
78
81
72
76
77
82
72
75
73
74
77
73
76
75 ppb
73
63
65
75
68
72
66
73
71
75
65
75
71
73
72
65
71
70 ppb
67
60
61
69
63
66
61
68
66
69
62
70
68
69
67
61
66
65 ppb
61
55
57
63
58
59
56
62
61
64
57
65
64
65
62
57
62
60 ppb
54
50
52
55
52
51
51
55
55
57
52
59
60
60
57
52
55
-------
482011035
482011039
482011050
483390078
73
87
80
80
59
63
61
56
56
59
58
52
53
56
55
49
49
51
51
45
75
81
75
71
69
68
67
61
64
65
62
56
59
59
58
50
53
53
53
45
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 12: Design values for the Los Angeles area regulatory monitors from observed data
and for adjustments to meet the existing and potential alternative standards using the
lower bound of the 95 percent confidence interval of estimated hourly O3 in 2006-2008
and 2008-2010.
Monitor
060370002
060370016
060370113
060371002
060371103
060371201
060371301
060371602
060371701
060372005
060374002
060375005
060376012
060379033
060590007
060591003
060592022
060595001
060650004
060650008
060650012
060651010
060651016
060651999
2006-2008
obs
96
107
69
92
73
97
58
78
103
92
59
64
105
94
73
66
87
83
N/A
79
102
N/A
105
N/A
75
ppb
63
64
54
69
68
54
56
71
63
75
54
56
57
55
58
52
55
61
N/A
55
58
N/A
60
N/A
70
ppb
60
60
52
64
64
52
54
67
59
70
52
54
54
52
56
51
52
57
N/A
53
55
N/A
57
N/A
65
ppb
56
57
50
59
61
50
52
63
55
65
52
53
51
49
54
50
50
54
N/A
52
52
N/A
54
N/A
60
ppb
53
53
48
54
56
47
50
59
52
60
50
51
47
47
51
48
48
51
N/A
51
49
N/A
51
N/A
2008-2010
obs
89
103
72
84
70
91
N/A
69
90
87
61
61
97
91
68
66
81
74
97
81
102
78
102
77
75
ppb
63
65
55
69
68
53
N/A
68
62
75
55
56
57
54
58
53
54
60
60
55
58
52
59
52
70
ppb
60
61
53
64
64
51
N/A
64
58
70
54
54
54
52
55
52
52
57
56
54
55
51
56
51
65
ppb
56
57
50
59
60
49
N/A
61
55
65
53
52
50
50
53
51
50
54
53
53
52
49
53
49
60
ppb
52
53
48
54
56
47
N/A
57
51
60
52
50
47
48
51
49
47
51
49
52
49
48
50
48
-------
060652002
060655001
060656001
060658001
060658005
060659001
060659003
060710001
060710005
060710012
060710306
060711004
060711234
060712002
060714001
060714003
060719004
061110007
061110009
061111004
061112002
061112003
061113001
86
97
107
107
N/A
102
63
86
119
96
89
110
80
112
96
116
116
75
80
83
88
64
61
55
56
57
63
N/A
55
48
57
70
60
60
67
56
68
60
68
67
47
47
49
49
48
47
54
54
54
59
N/A
52
47
56
65
57
57
62
55
63
56
63
62
46
46
48
48
47
46
52
52
51
55
N/A
49
46
54
61
56
56
57
54
58
53
59
58
44
44
46
46
46
46
51
50
47
51
N/A
46
45
53
55
54
54
52
53
53
50
54
53
43
43
45
44
46
45
85
95
102
97
93
96
63
80
112
99
87
100
75
101
96
103
102
78
79
79
86
63
63
55
55
55
60
58
53
48
54
68
60
58
65
54
65
60
65
64
48
47
47
49
47
47
53
54
52
56
55
51
47
52
64
57
56
61
53
61
57
61
60
46
46
46
48
46
46
52
52
50
53
51
48
47
51
59
55
53
56
52
56
53
56
55
45
45
45
46
46
45
50
49
47
49
48
46
46
50
55
54
52
52
51
51
50
51
50
44
43
44
44
45
44
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 13: Design values for the New York area regulatory monitors from observed data
and for adjustments to meet the existing and potential alternative standards using the
lower bound of the 95 percent confidence interval of estimated hourly O3 in 2006-2008
and 2008-2010.
Monitor
090010017
090011123
090013007
090019003
090070007
090090027
2006-2008
obs
99
102
102
106
104
90
75ppb
75
60
70
66
62
61
70 ppb
67
50
61
58
53
54
65 ppb
39
36
40
39
37
37
2008-2010
obs
91
91
89
91
86
77
75 ppb
74
64
68
70
63
61
70 ppb
65
54
60
61
54
53
65 ppb
43
37
42
41
40
39
-------
090093002
340030006
340170006
340190001
340210005
340230011
340250005
340273001
340315001
360050110
360050133
360610135
360715001
360790005
360810124
360850067
361030002
361030004
361030009
361192004
108
N/A
92
94
101
100
94
94
86
89
87
N/A
97
91
88
95
98
102
104
104
64
N/A
69
54
59
59
58
51
53
74
70
N/A
49
53
65
61
65
61
65
64
56
N/A
64
45
50
50
51
42
45
70
66
N/A
41
46
58
55
58
53
58
56
40
N/A
60
38
36
37
37
38
37
65
54
N/A
35
35
44
45
40
40
41
37
88
84
91
88
87
86
87
87
83
76
82
80
88
87
85
90
90
84
92
87
64
71
71
59
63
64
65
56
60
70
72
75
54
59
70
66
71
61
70
67
55
63
64
49
53
53
55
46
50
66
65
70
45
50
61
59
61
54
61
57
41
46
60
39
39
39
39
40
37
60
54
65
36
37
46
47
43
42
44
38
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 14: Design values for the Philadelphia area regulatory monitors from observed data
and for adjustments to meet the existing and potential alternative standards in 2006-2008
and 2008-2010.
Monitor | 2006-2008: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb | 2008-2010: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb
100031007 | 80, 60, 57, 53, 50 | 75, 64, 59, 55, 51
100031010 | 83, 65, 60, 56, 51 | 76, 67, 62, 57, 53
100031013 | 78, 64, 60, 56, 53 | 75, 67, 63, 59, 56
240150003 | 90, 68, 63, 58, 52 | 80, 68, 63, 58, 54
340010006 | N/A, N/A, N/A, N/A, N/A | 74, 64, 60, 56, 52
340070003 | 87, 75, 70, 65, 60 | N/A, N/A, N/A, N/A, N/A
340071001 | 86, 65, 60, 55, 52 | 80, 69, 63, 59, 54
340110007 | 81, 61, 57, 52, 49 | 76, 63, 59, 54, 51
340150002 | 87, 70, 65, 60, 55 | 81, 72, 67, 62, 57
340290006 | 87, 67, 62, 57, 53 | 81, 69, 64, 59, 55
420170012 | 92, 75, 70, 64, 58 | 83, 74, 69, 64, 60
420290100 | 82, 63, 59, 54, 50 | 76, 65, 60, 56, 52
420450002 | 83, 68, 64, 59, 55 | 74, 68, 64, 59, 55
420910013 | 84, 67, 62, 57, 53 | 78, 71, 66, 61, 56
421010004 | 67, 59, 56, 53, 50 | 66, 62, 59, 55, 52
421010024 | 89, 74, 69, 63, 58 | 82, 75, 70, 65, 60
The highest DV for each scenario is shown in bold. N/A values indicate that there was not
enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 15: Design values for the Sacramento area regulatory monitors from observed data
and for adjustments to meet the existing and potential alternative standards in 2006-2008
and 2008-2010.
Monitor | 2006-2008: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb | 2008-2010: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb
060170010 | 96, 71, 66, 63, 56 | 90, 67, 63, 59, 54
060170012 | 76, 66, 64, 63, 60 | 71, 61, 60, 59, 57
060170013 | 70, 61, 59, 58, 57 | N/A, N/A, N/A, N/A, N/A
060170020 | 98, 74, 69, 64, 58 | 89, 68, 64, 60, 55
060570005 | 91, 69, 66, 62, 57 | 84, 63, 60, 57, 53
060570007 | 87, 66, 63, 60, 56 | 81, 63, 59, 57, 54
060571001 | 70, 61, 59, 58, 57 | N/A, N/A, N/A, N/A, N/A
060610002 | 90, 69, 65, 61, 56 | 87, 66, 62, 58, 54
060610004 | 89, 67, 63, 60, 55 | 78, 59, 56, 53, 49
060610006 | 90, 73, 68, 64, 59 | 90, 73, 69, 65, 60
060670002 | 78, 64, 61, 58, 54 | 75, 62, 58, 55, 52
060670006 | 87, 71, 66, 62, 56 | 85, 69, 65, 61, 56
060670010 | 79, 65, 62, 59, 55 | 75, 62, 59, 56, 53
060670011 | 82, 65, 61, 58, 53 | 77, 61, 57, 54, 51
060670012 | 99, 75, 70, 65, 59 | 99, 75, 70, 65, 59
060670013 | 78, 64, 61, 59, 54 | N/A, N/A, N/A, N/A, N/A
060670014 | N/A, N/A, N/A, N/A, N/A | 57, 51, 49, 47, 45
060675003 | 95, 72, 68, 64, 58 | 92, 70, 65, 61, 55
060950004 | 60, 57, 56, 55, 53 | 63, 59, 58, 56, 54
060950005 | 68, 57, 55, 52, 49 | 69, 58, 55, 53, 50
060953003 | 75, 59, 55, 53, 49 | 70, 56, 53, 51, 49
061010003 | 72, 59, 56, 53, 48 | 66, 54, 51, 49, 46
061010004 | 85, 69, 66, 64, 61 | 76, 62, 60, 58, 56
061130004 | 76, 59, 55, 52, 48 | 72, 56, 53, 50, 47
061131003 | 76, 59, 56, 53, 49 | 72, 56, 53, 50, 47
The highest DV for each scenario is shown in bold. N/A values indicate that there was not enough ambient data to compute a design value for that monitor during a specific 3-year period.
37
-------
Table 16: Design values for the Saint Louis area regulatory monitors from observed data
and for adjustments to meet the existing and potential alternative standards in 2006-2008
and 2008-2010.
Monitor | 2006-2008: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb | 2008-2010: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb
170831001 | 73, 61, 57, 52, 48 | 69, 67, 62, 57, 52
171170002 | 70, 58, 54, 50, 46 | 66, 64, 60, 55, 51
171190008 | 76, 67, 62, 57, 52 | 71, 70, 67, 61, 56
171191009 | 78, 67, 62, 57, 52 | 72, 70, 66, 61, 56
171193007 | 77, 68, 63, 57, 52 | 68, 67, 63, 59, 53
171630010 | 72, 65, 60, 55, 51 | 68, 67, 64, 59, 55
290990019 | N/A, N/A, N/A, N/A, N/A | 72, 70, 67, 62, 56
291130003 | 81, 67, 62, 56, 51 | 72, 70, 65, 60, 55
291831002 | N/A, N/A, N/A, N/A, N/A | 77, 75, 70, 65, 59
291831004 | N/A, N/A, N/A, N/A, N/A | 74, 72, 66, 61, 54
291890004 | 78, 71, 66, 60, 55 | N/A, N/A, N/A, N/A, N/A
291890005 | 76, 65, 60, 55, 50 | 65, 64, 59, 55, 51
291890014 | 82, 73, 68, 62, 57 | 71, 70, 66, 62, 57
295100085 | N/A, N/A, N/A, N/A, N/A | 68, 68, 69, 65, 60
295100086 | 81, 75, 70, 65, 60 | N/A, N/A, N/A, N/A, N/A
The highest DV for each scenario is shown in bold. N/A values indicate that there was not enough ambient data to compute a design value for that monitor during a specific 3-year period.
Table 17: Design values for the Washington D.C. area regulatory monitors from observed
data and for adjustments to meet the existing and potential alternative standards in 2006-
2008 and 2008-2010.
Monitor | 2006-2008: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb | 2008-2010: obs, 75 ppb, 70 ppb, 65 ppb, 60 ppb
110010025 | 80, 62, 58, 53, 48 | 75, 68, 60, 55, 49
110010041 | 86, 73, 69, 64, 59 | 77, 74, 69, 65, 60
110010043 | 87, 75, 70, 65, 60 | 79, 75, 70, 65, 60
240090011 | 79, 61, 57, 54, 49 | 77, 69, 61, 56, 49
240170010 | 82, 63, 59, 55, 50 | 75, 68, 61, 56, 50
240210037 | 82, 65, 61, 56, 52 | 75, 69, 63, 58, 52
240313001 | 84, 68, 63, 59, 54 | 74, 68, 63, 59, 53
240330030 | 83, 68, 64, 60, 54 | 79, 74, 67, 62, 55
240338003 | 87, 65, 60, 56, 52 | 77, 70, 62, 57, 51
510130020 | 85, 70, 66, 62, 56 | 79, 74, 68, 63, 57
510590005 | 79, 61, 58, 54, 50 | 67, 62, 57, 53, 50
510590018 | 86, 68, 64, 60, 56 | 73, 68, 62, 58, 52
510590030 | 85, 69, 64, 60, 55 | 81, 75, 68, 63, 55
510591005 | 83, 68, 64, 59, 54 | 68, 65, 60, 57, 54
510595001 | 83, 66, 61, 57, 52 | 66, 62, 58, 54, 51
510610002 | 70, 54, 51, 48, 45 | 65, 59, 53, 50, 45
510690010 | 72, 55, 52, 49, 46 | 68, 61, 55, 51, 46
511071005 | 82, 62, 58, 53, 49 | 75, 68, 60, 56, 50
511530009 | 78, 58, 54, 49, 46 | 70, 61, 55, 51, 46
511790001 | 81, 60, 56, 52, 48 | 70, 63, 57, 53, 48
515100009 | 81, 65, 61, 56, 53 | 74, 68, 61, 56, 51
The highest DV for each scenario is shown in bold.
4.3 DISTRIBUTION OF HOURLY O3 CONCENTRATIONS
Figure 13 through Figure 27 display diurnal boxplots of hourly O3 concentrations at
monitor locations in each urban area for observed air quality, air quality adjusted to meet the
existing standard, and an example of air quality adjusted to meet a potential alternative standard
(65 ppb), for 2006-2008 and 2008-2010. Note that these plots include data from multiple
monitoring sites within each urban area, so they generally encompass the overall distribution of
O3 at both the urban core sites and the downwind suburban sites. The hourly plots show a similar
pattern in most cities, in which O3 concentrations during daytime hours decrease from
observed air quality (black) to air quality adjusted to meet the existing standard (red), and
decrease further when adjusted to meet a potential alternative standard of 65 ppb (blue). These
daytime decreases are generally most pronounced on high O3 days, represented by the outlier dots. In some
cities the mid-range O3 days, represented by the 25th-75th percentile boxes, remained fairly
constant (e.g., Boston), while in other cities mid-range O3 days decreased (e.g., Atlanta). Although
daytime O3 decreases, concentrations during the morning rush hour generally increase. These
increases are associated with VOC-limited and NOx titration conditions near NOx sources
during rush-hour periods; reducing NOx under those conditions results in less O3 titration and
thus increases O3 concentrations. Nighttime increases in O3 are also seen, though often to a lesser extent than the
morning rush-hour increases. These phenomena generally lead to a flattening of the diurnal O3
pattern, with smaller differences between daytime and nighttime concentrations as NOx
emissions are reduced. Cases that required more substantial NOx cuts to reach the 75 and 65 ppb
standards generally show more pronounced decreases in daytime O3 and increases in
nighttime O3, leading to a flatter diurnal O3 pattern (e.g., Los Angeles in Figure 22). Two cities,
Houston and New York, do not follow this general pattern. In Houston (Figure 21), mid-range
O3 values increase during daytime hours as the highest O3 concentrations decrease. This pattern
is consistent with NOx-limited conditions on high O3 days but VOC-limited conditions on mid-
range O3 days, or potentially NOx-limited conditions at high O3 sites and VOC-limited
conditions at mid-range O3 sites. Therefore NOx reductions, which are applied on all
39
-------
days, increase O3 on mid-range O3 days. Note that mid-range O3 days in Houston start out
quite low (20-30 ppb) and increase only modestly. The changes in O3 from observed air quality to
the existing standard in New York look similar to the trends shown for other cities, but the
diurnal O3 pattern for the 65 ppb potential alternative standard has some unrealistic features,
which were previously discussed in section 3.2.4 of this appendix. The steep NOx cuts (~90%)
applied to meet a 65 ppb standard in New York result in a very flat diurnal pattern for mid-
range values (see boxes in Figure 23), but the outliers, represented by dots in Figure 23, show
an inverse diurnal pattern with lower O3 concentrations during the day and higher values at
night. This is a result of the modeled sensitivities showing that nighttime O3 does not respond
to NOx emissions changes on high O3 nights, while daytime O3 does respond to NOx emissions
changes during those same days. Thus the daytime values drop substantially while the
nighttime values show no change. These results are not seen when the sensitivities are applied
for the more modest cuts (50-65%) needed to reach 75 ppb. The diurnal pattern in New York for the 65
ppb case on the highest days is clearly unrealistic, as no chemical process is expected to
result in daytime values lower than nighttime values; rather, it appears to be an artifact of applying
this model-based methodology to extremely large emissions reductions in a location where
relationships derived under three discrete conditions (base, 50% NOx cut, 90% NOx cut) cannot
fully characterize sensitivities over the entire range of possible reductions. As discussed
previously in section 3.2.4, the use of the lower bound from the 95th percentile confidence
interval can mitigate some of this behavior, but in the case of New York some unrealistic
outliers still remain. It should be noted, however, that this unrealistic diurnal pattern is only
seen in the outlier points; it is not seen in the boxes and whiskers, which span the interquartile range and up to 1.5 times the
interquartile range (i.e., most of the data).
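These diurnal summaries can be generated directly from hourly monitor data. The following is a minimal sketch of that grouping, assuming a long-format input file and column names that are illustrative only (they are not the files or names used in this assessment): hourly values are pooled across an area's monitors and plotted as one boxplot per hour of day for each scenario.

```python
# Illustrative sketch: diurnal boxplots of hourly O3 by scenario for one urban area.
# Assumes a CSV with columns: datetime, site_id, o3_observed, o3_75ppb, o3_65ppb (all ppb).
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("hourly_o3_adjusted.csv", parse_dates=["datetime"])
df["hour"] = df["datetime"].dt.hour

scenarios = ["o3_observed", "o3_75ppb", "o3_65ppb"]
fig, ax = plt.subplots(figsize=(12, 4))
width = 0.25
for i, scen in enumerate(scenarios):
    # Pool all sites in the area and form one distribution per hour of day.
    data = [df.loc[df["hour"] == h, scen].dropna().values for h in range(24)]
    positions = [h + (i - 1) * width for h in range(24)]
    ax.boxplot(data, positions=positions, widths=width, showfliers=True)
ax.set_xticks(range(24))
ax.set_xticklabels(range(24))
ax.set_xlabel("hour of day")
ax.set_ylabel("O3 (ppb)")
plt.tight_layout()
plt.show()
```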
Figure 28 through Figure 42 display the same information as Figure 13 through Figure
27 but for monthly rather than hourly distributions. Note that missing months in these plots
indicate that no monitor data were available for those months and years. Similar to the
diurnal plots, the seasonal distributions become flatter when meeting the 75 and 65 ppb standard
levels, especially for the highest ozone days. This is because ozone decreases more during
summer months and increases more during winter months. The ozone increases in the winter
are consistent with the understanding that solar insolation rates are lower in the winter, reducing
total photochemical activity and shifting the net effect of NOx emissions on ozone, which can
both create ozone through photochemical pathways and destroy ozone through titration. In
addition, the decreases on the highest ozone days and increases on the lowest ozone days produce
a visible compression of the ozone distribution in these plots, similar to what was seen in the
diurnal plots. The changes for mid-range ozone days also show a pattern of the higher
mid-range ozone shifting earlier in the year. While in most cities the highest interquartile ozone
40
-------
concentrations under recent conditions occur in the summer months (June-August), in many
areas the highest interquartile ozone concentrations shift to the spring months (April-May) for the
adjustment scenarios meeting the 75 and 65 ppb standard levels. This pattern can be seen most
dramatically in Atlanta, Baltimore, Boston, Denver, Los Angeles, New York, Philadelphia,
Sacramento, and Washington, D.C. It is consistent with a higher relative contribution from
background sources (i.e., sources other than U.S. anthropogenic emissions) at lower standard levels
than under recent observed conditions. Many of these background sources, such as stratospheric
intrusions and international transport, have been shown to peak during the spring months, as
discussed in the Integrated Science Assessment (EPA, 2013).
[Figures 13-27: Hourly O3 distributions at each area's regulatory monitoring sites for observed air quality, and air quality adjusted to meet the existing (75 ppb) and alternative (65 ppb) standards, for 2006-2008 (left panel) and 2008-2010 (right panel). Each panel shows boxplots of hourly O3 (ppb) by hour of day (0-23). Per the panel legends, the Chicago and Denver adjustments are from the NOx/VOC reduction scenario, and the 2008-2010 panels for Chicago and Detroit show adjustment to a 70 ppb standard because those areas already met the existing standard in that period. Figure 13: Atlanta; Figure 14: Baltimore; Figure 15: Boston; Figure 16: Chicago; Figure 17: Cleveland; Figure 18: Dallas; Figure 19: Denver; Figure 20: Detroit; Figure 21: Houston; Figure 22: Los Angeles; Figure 23: New York; Figure 24: Philadelphia; Figure 25: Sacramento; Figure 26: St. Louis; Figure 27: Washington, D.C.]
48
-------
[Figures 28-42: Monthly O3 distributions at each area's regulatory monitoring sites for observed air quality, and air quality adjusted to meet the existing (75 ppb) and alternative (65 ppb) standards, for 2006-2008 (left panel) and 2008-2010 (right panel). Each panel shows boxplots of hourly O3 (ppb) grouped by month (1-12); the Chicago and Denver adjustments are from the NOx/VOC reduction scenario. Figure 28: Atlanta; Figure 29: Baltimore; Figure 30: Boston; Figure 31: Chicago; Figure 32: Cleveland; Figure 33: Dallas; Figure 34: Denver; Figure 35: Detroit; Figure 36: Houston; Figure 37: Los Angeles; Figure 38: New York; Figure 39: Philadelphia; Figure 40: Sacramento; Figure 41: St. Louis; Figure 42: Washington, D.C.]
4.4 STANDARD ERRORS FOR PREDICTED HOURLY O3 CONCENTRATIONS
Standard error values for predicted hourly O3 were generally small relative to the total
predicted hourly O3 concentrations. This indicates that even where the regression fits had somewhat
lower correlation coefficients, the resulting uncertainty from the use of the regression line does
not substantially affect predicted O3 concentrations. Table 18 and Table 19 show the mean and
95th percentile standard error for all hourly O3 values (2006-2010) in each urban area for the
existing standard of 75 ppb and the potential alternative standards of 70 ppb, 65 ppb, and 60
ppb.
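Summaries like those in Table 18 and Table 19 can be produced by grouping the per-hour standard errors by area, period, and standard level. The sketch below is illustrative only; the input file and column names are placeholders, not the actual files used in this assessment.

```python
# Illustrative sketch: mean and 95th percentile standard error of adjusted hourly O3,
# grouped by urban area, 3-year period, and standard level (cf. Tables 18 and 19).
import pandas as pd

# Assumed long-format input: urban_area, years, standard_level, std_error (ppb per hour).
se = pd.read_csv("hourly_o3_standard_errors.csv")
summary = (se.groupby(["urban_area", "years", "standard_level"])["std_error"]
             .agg(mean_se="mean", p95_se=lambda s: s.quantile(0.95))
             .round(2))
print(summary)
```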
Table 18: Mean standard error (ppb) in adjusted hourly O3 concentration in each urban area for each standard.
Urban Area | Years | 75 ppb† | 70 ppb | 65 ppb | 60 ppb‡
Atlanta | 2006-2008 | 0.65 | 0.69 | 0.74 | 0.81
Atlanta | 2008-2010 | 0.37 | 0.68 | 0.71 | 0.77
Baltimore | 2006-2008 | 0.62 | 0.65 | 0.70 | 0.78
Baltimore | 2008-2010 | 0.62 | 0.64 | 0.69 | 0.75
Boston | 2006-2008 | 0.51 | 0.53 | 0.59 | 0.69
Boston | 2008-2010 | 0.16 | 0.51 | 0.54 | 0.63
Chicago* | 2006-2008 | 0.29 | 0.66 | 0.77 | 0.99
Chicago* | 2008-2010 | N/A | 0.41 | 0.68 | 0.83
Cleveland | 2006-2008 | 0.60 | 0.65 | 0.76 | 0.96
Cleveland | 2008-2010 | 0.60 | 0.66 | 0.80 | 0.95
Dallas | 2006-2008 | 0.59 | 0.64 | 0.74 | 0.86
Dallas | 2008-2010 | 0.57 | 0.63 | 0.72 | 0.84
Denver* | 2006-2008 | 0.51 | 0.62 | 0.77 | 0.85
Denver* | 2008-2010 | 0.17 | 0.48 | 0.58 | 0.82
Detroit | 2006-2008 | 0.60 | 0.70 | 0.80 | 0.92
Detroit | 2008-2010 | N/A | 0.56 | 0.65 | 0.81
Houston | 2006-2008 | 0.73 | 0.83 | 0.95 | 1.11
Houston | 2008-2010 | 0.52 | 0.55 | 0.62 | 0.76
Los Angeles* | 2006-2008 | 1.15 | 1.17 | 1.18 | 1.21
Los Angeles* | 2008-2010 | 1.14 | 1.15 | 1.17 | 1.19
New York* | 2006-2008 | 0.84 | 1.06 | 1.39 | N/A
New York* | 2008-2010 | 0.69 | 0.92 | 1.34 | N/A
Philadelphia | 2006-2008 | 0.66 | 0.72 | 0.80 | 0.90
Philadelphia | 2008-2010 | 0.62 | 0.65 | 0.71 | 0.79
Sacramento | 2006-2008 | 0.49 | 0.55 | 0.61 | 0.81
Sacramento | 2008-2010 | 0.49 | 0.56 | 0.62 | 0.70
Saint Louis | 2006-2008 | 0.57 | 0.61 | 0.70 | 0.81
Saint Louis | 2008-2010 | 0.13 | 0.48 | 0.58 | 0.66
Washington D.C. | 2006-2008 | 0.67 | 0.71 | 0.78 | 0.87
Washington D.C. | 2008-2010 | 0.49 | 0.66 | 0.72 | 0.84
* Values are from the standard NOx reduction scenario for all cities except Chicago and Denver,
for which values are from the standard NOx/VOC reduction scenario, and New York and Los
Angeles, for which values are based on the reductions required when using the lower bound of the 95th
percentile confidence interval.
† N/A values for the 75 ppb standard level mean that a particular urban area did not have any
design values above 75 ppb for that 3-year period, so no adjustments were made to the ambient data.
‡ N/A values for the 60 ppb standard level mean that this adjustment methodology was not able
to bring design values down to 60 ppb for that particular city and 3-year period.
Table 19: 95th percentile standard error (ppb) in adjusted hourly O3 concentration in each urban area for each standard.
Urban Area | Years | 75 ppb† | 70 ppb | 65 ppb | 60 ppb‡
Atlanta | 2006-2008 | 1.28 | 1.34 | 1.42 | 1.56
Atlanta | 2008-2010 | 0.72 | 1.33 | 1.38 | 1.47
Baltimore | 2006-2008 | 1.22 | 1.26 | 1.34 | 1.49
Baltimore | 2008-2010 | 1.21 | 1.26 | 1.33 | 1.46
Boston | 2006-2008 | 1.02 | 1.05 | 1.19 | 1.41
Boston | 2008-2010 | 0.33 | 1.03 | 1.09 | 1.29
Chicago* | 2006-2008 | 0.54 | 1.22 | 1.43 | 1.89
Chicago* | 2008-2010 | N/A | 0.78 | 1.27 | 1.56
Cleveland | 2006-2008 | 1.13 | 1.19 | 1.35 | 1.69
Cleveland | 2008-2010 | 1.13 | 1.22 | 1.42 | 1.69
Dallas | 2006-2008 | 1.16 | 1.26 | 1.48 | 1.79
Dallas | 2008-2010 | 1.12 | 1.24 | 1.45 | 1.75
Denver* | 2006-2008 | 1.14 | 1.35 | 1.70 | 1.89
Denver* | 2008-2010 | 0.38 | 1.07 | 1.28 | 1.84
Detroit | 2006-2008 | 1.12 | 1.28 | 1.46 | 0.69
Detroit | 2008-2010 | N/A | 1.05 | 1.20 | 1.48
Houston | 2006-2008 | 1.45 | 1.66 | 1.96 | 2.41
Houston | 2008-2010 | 0.98 | 1.04 | 1.15 | 1.39
Los Angeles* | 2006-2008 | 2.84 | 2.86 | 2.89 | 2.93
Los Angeles* | 2008-2010 | 2.78 | 2.80 | 2.83 | 2.86
New York* | 2006-2008 | 1.64 | 2.15 | 2.87 | N/A
New York* | 2008-2010 | 1.34 | 1.87 | 2.98 | N/A
Philadelphia | 2006-2008 | 1.29 | 1.39 | 1.56 | 1.78
Philadelphia | 2008-2010 | 1.25 | 1.29 | 1.41 | 1.58
Sacramento | 2006-2008 | 1.10 | 1.22 | 1.35 | 1.54
Sacramento | 2008-2010 | 1.11 | 1.24 | 1.38 | 1.55
Saint Louis | 2006-2008 | 1.07 | 1.14 | 1.29 | 1.51
Saint Louis | 2008-2010 | 0.26 | 0.90 | 1.07 | 1.21
Washington D.C. | 2006-2008 | 1.29 | 1.35 | 1.46 | 1.63
Washington D.C. | 2008-2010 | 0.94 | 1.27 | 1.36 | 1.57
* Values are from the standard NOx reduction scenario for all cities except Chicago and Denver,
for which values are from the standard NOx/VOC reduction scenario, and New York and Los
Angeles, for which values are based on the reductions required when using the lower bound of the 95th
percentile confidence interval.
† N/A values for the 75 ppb standard level mean that a particular urban area did not have any
design values above 75 ppb for that 3-year period, so no adjustments were made to the ambient data.
‡ N/A values for the 60 ppb standard level mean that this adjustment methodology was not able
to bring design values down to 60 ppb for that particular city and 3-year period.
58
-------
4.5 AIR QUALITY INPUTS FOR THE EPIDEMIOLOGY-BASED RISK
ASSESSMENT
The air quality inputs to the epidemiology-based risk assessment discussed in Chapter 7
were spatially averaged "composite monitor"6 values for 12 of the 15 urban case study areas.
The procedure for calculating these values is described in Appendix 4-A. Figure 43 through
Figure 54 show boxplots of the composite monitor daily maximum 8-hour values for observed
air quality, air quality adjusted to meet the existing standard, and the potential alternative
standards of 70 ppb, 65 ppb, and 60 ppb in the 12 urban areas in the epidemiology-based risk
assessment. There are eight panels in each figure. The panels are designed to show contrasts
based on three factors:
1. Spatial extent of the urban case study area: the top row of panels in each figure is
based on the smaller areas from the Zanobetti & Schwartz, 2008 study (Z & S), while
the bottom row of panels is based on the larger Core Based Statistical Areas (CBSAs).
2. Length of the O3 season: the 1st and 3rd columns of panels in each figure are based on a
shorter June-August O3 season, which was used in Z & S, while the 2nd and 4th columns
of panels are based on a longer April-October O3 season. The Smith et al., 2009 study
was based on the required O3 monitoring season, which varied by area but often
encompassed the April-October period.
3. Year: the two left-hand columns of panels in each figure are based on 2007, while the
two right-hand columns of panels in each figure are based on 2009. The epidemiology-
based risk assessment focused on these two years: 2007, to represent a year with higher
Os concentrations, and 2009, to represent a year with lower Os concentrations.
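The "composite monitor" values used throughout this section are, per footnote 6, simple averages across an area's monitors. The following is a minimal sketch of that averaging for a June-August season; the input file and column names are illustrative assumptions, not the data sets used in this assessment.

```python
# Illustrative sketch: composite monitor daily maximum 8-hour (MDA8) O3 values,
# computed as the average of all monitors in an area on each day (June-August here).
import pandas as pd

# Assumed columns: area, monitor_id, date, mda8_ppb
mda8 = pd.read_csv("daily_mda8_by_monitor.csv", parse_dates=["date"])
summer = mda8[mda8["date"].dt.month.isin([6, 7, 8])]
composite = (summer.groupby(["area", "date"])["mda8_ppb"]
                   .mean()                      # average across monitors = composite monitor value
                   .rename("composite_mda8_ppb")
                   .reset_index())
print(composite.head())
```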
There are a few properties common to nearly all of the figures. First, the highest
composite monitor values represented by the top whiskers decrease when air quality is adjusted
to meet the existing standard, and continue to decrease as air quality is further adjusted to meet
lower alternative standard levels. By contrast, the lowest composite monitor values represented
by the bottom whiskers increase when air quality is adjusted to meet the existing standard, and
continue to increase as air quality is further adjusted to meet lower alternative standard levels.
The behavior of the 25th percentile, median, mean, and 75th percentile values making up the
boxes varies by urban area and across the three contrasting factors.
The spatial extent contrasts show that the base composite monitor values based on the
smaller Z&S study areas tend to be slightly lower than the base composite monitor values based
on the CBSAs. This is because the highest ozone concentrations are often located downwind of
the urban area's population center, and the smaller Z&S areas often do not capture the highest
6 Composite monitor values are calculated as the average of all monitors within each CBSA. See Appendix 4-A
for more details.
59
-------
observed concentrations. Two notable exceptions to this tendency are Atlanta and Sacramento,
where the highest monitored concentrations are located near the population center. The spatial
extent contrasts also show that when air quality is adjusted, the highest composite monitor
values tend to decrease more quickly for the CBS As than for the Z&S areas, and conversely the
lowest composite monitor values tend to increase more slowly for the CBS As than for the Z&S
areas. This is consistent with observed air quality patterns, which show that as NOx emissions
decrease, O3 concentrations decrease more quickly downwind of the urban population center,
and may increase near the population center due to less titration. This phenomenon also affects
the center of the distribution, where we see that the 25th percentile, median, mean, and 75th
percentile of the composite monitor values decrease more quickly for the CBS As than for the
Z&S areas. In cases where we see increases in these values, the values tend to increase more
quickly for the Z&S areas than for the CBSAs.
The seasonal contrasts show that the composite monitor values based on observed air
quality tend to be lower in the spring and fall months than in the summer months. The highest
composite monitor values tend to occur in the June-August period, so the upper tail of the
distribution often does not change, but the center and lower tail values are usually lower for the
April-October period. One notable exception is Houston, where the highest O3 concentrations
are often observed in the spring and fall months. When O3 concentrations are adjusted to meet
the existing standard, there are many cases where net decreases in the 25th percentile, median,
and mean composite monitor values for the June-August period turn into net increases in these
values for the April-October period. When air quality is further adjusted to meet the potential
alternative standards, initial increases in these values for the April-October period often persist,
and in a few cases the increases grow larger as the level of the standard decreases. By contrast,
for the June-August period, initial increases in the 25th percentile, median, and mean composite
monitor values are often reversed when air quality is further adjusted to meet the potential alternative standards.
Finally, the year contrasts are meant to show the effects of changes in emissions and
meteorology on the composite monitor values. Precursor emissions were generally higher in
2007 than in 2009, and meteorological conditions were generally more favorable to ozone
formation in 2007 than in 2009. Thus, the composite monitor values based on observed air
quality are generally higher in 2007 than in 2009, with the exception of Houston, where the
climate regime was the reverse of most of the rest of the U.S. during those two years. When air
quality is adjusted to meet the existing and potential alternative standards, the decreases in the
2007 composite monitor values tend to be larger than in 2009, and there tend to be fewer
increases. However, it is worth noting that since the emissions and meteorological inputs used
in the CMAQ/HDDM modeling for the air quality adjustments were based on 2007 data, we
tend to have more confidence in the adjusted values for 2007 than for 2009.
60
-------
[Figures 43-54: Composite monitor daily maximum 8-hour O3 values based on observed and adjusted air quality (scenarios: base, 75, 70, 65, and 60 ppb). Each figure has eight panels contrasting the Z & S and CBSA study-area definitions (top vs. bottom rows), the June-August and April-October seasons, and the years 2007 and 2009 (columns). Figure 43: Atlanta; Figure 44: Baltimore; Figure 45: Boston; Figure 46: Cleveland; Figure 47: Denver; Figure 48: Detroit; Figure 49: Houston; Figure 50: Los Angeles; Figure 51: New York; Figure 52: Philadelphia; Figure 53: Sacramento; Figure 54: St. Louis.]
4.6 AIR QUALITY INPUTS FOR THE EXPOSURE AND CLINICAL RISK
ASSESSMENT
The air quality inputs for the exposure and clinical risk assessments discussed in
Chapters 5 and 6 include spatial surfaces of hourly O3 concentrations estimated for each census
tract in the 15 urban case study areas using the Voronoi Neighbor Averaging (VNA) technique.
A description of the VNA technique and its application to ambient concentrations is provided
in Appendix 4-A. In this section, we present three types of figures which summarize the data
from the hourly VNA surfaces for observed air quality, air quality adjusted to meet the existing
standard7, and air quality adjusted to meet the potential alternative standard of 65 ppb.
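The VNA surfaces themselves are documented in Appendix 4-A; as a rough illustration of the idea, the sketch below interpolates monitor values to a census-tract centroid by inverse-distance weighting of nearby monitors. The actual technique weights the Voronoi neighbors of each centroid, so the use of the k nearest monitors here, like the coordinates and values, is a simplifying assumption rather than the assessment's implementation.

```python
# Simplified, illustrative interpolation in the spirit of VNA: inverse-distance weighting
# of nearby monitor values at a census-tract centroid. The assessment's VNA implementation
# (Appendix 4-A) selects Voronoi neighbors; the k nearest monitors stand in for them here.
import numpy as np

def idw_estimate(centroid, monitor_xy, monitor_vals, k=5):
    d = np.hypot(monitor_xy[:, 0] - centroid[0], monitor_xy[:, 1] - centroid[1])
    nearest = np.argsort(d)[:k]
    if np.isclose(d[nearest[0]], 0.0):       # centroid coincides with a monitor
        return float(monitor_vals[nearest[0]])
    w = 1.0 / d[nearest]                     # inverse-distance weights
    return float(np.sum(w * monitor_vals[nearest]) / np.sum(w))

# Made-up monitor coordinates (km) and hourly O3 values (ppb), for illustration only:
monitors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 12.0], [8.0, 9.0]])
values = np.array([42.0, 55.0, 48.0, 61.0])
print(idw_estimate((4.0, 5.0), monitors, values, k=3))
```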
The first set of figures (Figure 55 through Figure 69) show density scatter plots of the
change in daily maximum 8-hour average (MDA8) O3 concentrations versus the observed
concentration based on the hourly VNA estimates in each area. In each of these figures, the
left-hand panels show the observed MDA8 values (x-axis) versus the change in those values
that occurs when air quality is adjusted to meet the existing standard (y-axis). The right-hand
panels show the MDA8 values for air quality adjusted to meet the existing standard7 (x-axis)
versus the additional change in those values that occurs when air quality is further adjusted to
meet the alternative standard of 65 ppb (y-axis). The top panels show values based on 2006-
2008, while the bottom panels show values based on 2008-2010. Within each panel, the x and
y values are rounded to the nearest integer and colored to show the relative frequency of each 1
ppb x 1 ppb square within the plot region. Values falling outside of the plot region were set to
the nearest value within the plot region, and frequencies above the range in the color bar were
set to the highest value within the color bar.
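The 1 ppb x 1 ppb frequency binning described above can be expressed compactly. The sketch below is illustrative only: the clipping ranges and the synthetic inputs are assumptions, and it simply rounds each (starting value, change) pair to the nearest ppb and tabulates the percent of points in each square.

```python
# Illustrative sketch of the density scatter plot binning: round the starting MDA8 value
# and its change to the nearest ppb, clip to the plot region, and tabulate relative frequency.
import numpy as np

def density_bins(start_mda8, delta_mda8, x_range=(0, 120), y_range=(-30, 30)):
    x = np.clip(np.rint(start_mda8), *x_range).astype(int)
    y = np.clip(np.rint(delta_mda8), *y_range).astype(int)
    counts = {}
    for xi, yi in zip(x, y):
        counts[(xi, yi)] = counts.get((xi, yi), 0) + 1
    return {square: 100.0 * n / len(x) for square, n in counts.items()}  # percent of points

rng = np.random.default_rng(0)
obs = rng.uniform(20, 90, 1000)                          # synthetic observed MDA8 (ppb)
change = -0.3 * (obs - 45) + rng.normal(0, 2, 1000)      # synthetic change (ppb), illustration only
bins = density_bins(obs, change)
print(len(bins), max(bins.values()))
```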
The second set of figures (Figure 70 through Figure 99) show maps of the changes in
design values (3-year average of the annual 4th highest MDA8 values) and May-September
average MDA8 values based on the ambient data and the hourly VNA surfaces. There are two
figures for each of the 15 urban case study areas, one based on 2006-2008 and one based on
2008-2010. In each figure, the panels on the left show the changes in these values that occur
when air quality is adjusted to meet the existing standard7, and the panels on the right show the
additional changes in these values that occur when air quality is further adjusted to meet the
alternative standard of 65 ppb. The top panels show the changes in the design values, while the
bottom panels show the changes in the May-September average MDA8 values. Within each
panel, squares show values based on observed data at ambient O3 monitoring sites while circles
show values based on VNA estimates at census tract centroids. Regions shaded pink indicate
counties in either the Zanobetti & Schwartz, 2008 or the Smith et al., 2009 studies (for the 12
cities in the epidemiology-based risk assessment). The Zanobetti & Schwartz and Smith et al.
study areas had some counties in common in all 12 cities, and were identical in 6 of the cities.
Regions shaded gray indicate additional counties in the CBSA, and regions shaded peach
7 Chicago and Detroit were already meeting the existing standard in 2008-2010. The 2008-2010 figures
for those areas are based on air quality adjusted to meet the 70 ppb alternative standard instead of the existing
standard.
67
-------
indicate any additional counties in the study areas for the exposure and clinical risk assessment.
The maps also show some monitors which are located outside of the exposure study areas.
These monitors were adjusted along with the other monitors and used in the VNA estimates, but
were not used when determining the emissions reductions necessary to meet the various
standards.
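For reference, the two map metrics can be computed from a daily MDA8 series as sketched below. The input file and column names are assumptions, and the data-completeness requirements that apply to regulatory design values are omitted; this is an illustration, not the procedure used to generate the maps.

```python
# Illustrative sketch: design values (3-year average of the annual 4th-highest MDA8,
# truncated to an integer) and May-September average MDA8 for each monitor.
import pandas as pd

# Assumed columns: monitor_id, date, mda8_ppb
mda8 = pd.read_csv("daily_mda8_by_monitor.csv", parse_dates=["date"])
mda8["year"] = mda8["date"].dt.year

# Annual 4th-highest MDA8 per monitor.
fourth_high = (mda8.groupby(["monitor_id", "year"])["mda8_ppb"]
                   .apply(lambda s: s.nlargest(4).iloc[-1]))

# Design value for 2006-2008: average the three annual 4th-highest values, then truncate.
years = fourth_high.index.get_level_values("year")
dv_2006_2008 = (fourth_high[years.isin([2006, 2007, 2008])]
                .groupby("monitor_id").mean().astype(int))

# May-September "seasonal" average MDA8 per monitor and year.
may_sep = mda8[mda8["date"].dt.month.between(5, 9)]
seasonal_avg = may_sep.groupby(["monitor_id", "year"])["mda8_ppb"].mean()
print(dv_2006_2008.head())
print(seasonal_avg.head())
```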
The third set of figures (Figure 100 through Figure 114) show changes in design values
(3-year average of the annual 4th highest MDA8 values) and May - September average MDA8
values in the 15 urban case study areas versus population and population density. The
population total and population density information for each census tract were obtained from
the U.S. Census Bureau based on the 2010 U.S. Census. Within each figure, the top panels
show histograms of the total population stratified by the change in design value or seasonal
average, while the bottom panels show scatter plots of population density (x-axis) versus
change in design value or seasonal average (y-axis). The left-hand panels are based on 2006-
2008, while the right-hand panels are based on 2008-2010. The first and third rows of panels
show changes in design values, while the second and fourth rows of panels show changes in
May-September average values. Finally, the first and third columns show the changes that
occur when air quality is adjusted to meet the existing standard7, while the second and fourth
columns show the additional changes that occur when air quality is further adjusted to meet the
alternative standard of 65 ppb. Within each panel, values associated with census tracts falling
within the epidemiology study areas as defined previously are colored pink, values associated
with additional census tracts falling within the CBSA are colored gray, and values associated
with any additional census tracts falling within the exposure study areas are colored peach.
Population density values are shown on a logarithmic scale, with values falling outside of the
plot region set to the nearest values within the plot region.
In general, the density scatter plots show that the HDDM adjustment procedure predicts
increases in MDA8 O3 at low ambient concentrations, and decreases in MDA8 O3 at high
concentrations. The vast majority of the increases in MDA8 O3 occur at ambient concentrations
below 50 ppb. The relationship between the starting concentrations and the changes in these
values based on the HDDM adjustments is fairly linear, with a strong negative correlation in all
15 urban areas. In some areas, such as Baltimore and Philadelphia, there is a bimodal pattern
near the center of the distribution, which may be indicative of differing behavior in the urban
population center versus the surrounding suburban areas.
The maps reveal several trends in the spatial pattern of changes in O3 in the urban case
study areas. The design values decreased almost universally when air quality was adjusted to
68
-------
meet the existing standard with a few exceptions8, and continued to decrease when air quality
was further adjusted to meet the 65 ppb alternative standard. The design values also tended to
decrease more quickly in suburban and rural areas than in the urban population centers. The
May-September "seasonal" average MDA8 values also followed this trend to some extent.
However, the decreases in the seasonal average values were nearly universal in the suburban
and rural areas, while the behavior in the urban population centers varied amongst the cities.
The seasonal average values in the urban population centers followed one of three distinct
patterns:
1. The seasonal average values decreased when air quality was adjusted to meet the
existing standard, and continued to decrease when air quality was further adjusted to
meet the 65 ppb alternative standard. (Atlanta, Sacramento, Washington D.C.)
2. The seasonal average values increased or remained constant when air quality was
adjusted to meet the existing standard, then decreased when air quality was adjusted to
meet the 65 ppb alternative standard. (Baltimore, Cleveland, Dallas, Detroit, Los
Angeles, New York, Philadelphia, St. Louis)
3. The seasonal average values increased when air quality was adjusted to meet the
existing standard, and continued to increase or remained constant when air quality was
further adjusted to meet the 65 ppb alternative standard. (Boston, Chicago, Houston,
Denver)
The population plots show a clear and consistent trend regarding the populations living
in areas associated with various changes in design values and seasonal average concentrations.
In almost every figure, there is a positive correlation between population density and change in
design value or seasonal average concentration. This suggests that in almost every scenario,
when NOx emissions were reduced, the suburban areas surrounding the urban population center
experienced larger O3 reductions than the urban population center itself. For the 12 urban areas
examined in the epidemiology-based risk assessment, the vast majority of the locations
associated with increases in the seasonal average concentration occurred within the
epidemiology study area, which tended to be focused on the urban population center.
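The population-density relationship described here can be quantified with a simple correlation. In the sketch below the input file and column names are placeholders, and density is log-transformed as in the figures.

```python
# Illustrative sketch: correlation between census-tract population density and the change
# in design value under an adjustment scenario (cf. Figures 100-114).
import numpy as np
import pandas as pd

tracts = pd.read_csv("tract_changes.csv")        # assumed columns: tract_id, pop_density, dv_change_ppb
tracts = tracts[tracts["pop_density"] > 0]
r = np.corrcoef(np.log10(tracts["pop_density"]), tracts["dv_change_ppb"])[0, 1]
print(f"correlation of log10(population density) with change in design value: {r:.2f}")
```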
These figures show that, using the HDDM adjustment methodology, peak O3
concentrations are reduced in urban areas with large domain-wide reductions in U.S.
anthropogenic NOx emissions. In most cases, seasonal average O3 concentrations also
decreased with large domain-wide reductions in U.S. anthropogenic NOx emissions. However,
8 All design values from the VNA surfaces decreased when going from recent conditions to the 75 ppb
adjustment scenario, with the exceptions of small areas in Boston, where design values increased by up to 1 ppb
(2008-2010); Chicago, where design values increased by up to 2 ppb (2006-2008) and 4 ppb (2008-2010); and New
York, where design values increased by up to 3 ppb (2008-2010). All areas in which design values increased
started with very low design values under recent conditions.
69
-------
there were a few cases, such as Chicago and Houston, where current NOx emissions were high
enough to cause titration in the urban population centers, so that seasonal average O3
concentrations were below regional background levels.
70
-------
[Figures 55-69: Density scatter plots of the change in VNA estimates of daily maximum 8-hour average (MDA8) O3 concentrations based on HDDM adjustments in each area. In each figure, the left panels plot observed MDA8 O3 (ppb) against the change from observed air quality to the 75 ppb standard, and the right panels plot the 75 ppb adjusted MDA8 O3 against the additional change from 75 ppb to 65 ppb; the top panels are for 2006-2008 and the bottom panels for 2008-2010, with color indicating the percent of points in each 1 ppb x 1 ppb square. For Chicago and Detroit, the 2008-2010 panels show changes from observed to 70 ppb and from 70 ppb to 65 ppb. Figure 55: Atlanta; Figure 56: Baltimore; Figure 57: Boston; Figure 58: Chicago; Figure 59: Cleveland; Figure 60: Dallas; Figure 61: Denver; Figure 62: Detroit; Figure 63: Houston; Figure 64: Los Angeles; Figure 65: New York; Figure 66: Philadelphia; Figure 67: Sacramento; Figure 68: St. Louis; Figure 69: Washington, D.C.]
85
-------
[Figures 70-99 show maps of the changes in the annual 4th highest MDA8 and the May-September average MDA8
based on HDDM adjustments, with panels for the change from observed to 75 ppb and the change from 75 ppb to
65 ppb (70 ppb is the intermediate level for Chicago and Detroit in 2008-2010). In each figure the points
are colored according to the change in O3 (ppb), and values falling outside the range of the color bar were
set to the nearest value within the color bar.]
Figures 70-71: Atlanta (2006-2008 and 2008-2010, respectively). Figures 72-73: Baltimore.
Figures 74-75: Boston. Figures 76-77: Chicago. Figures 78-79: Cleveland. Figures 80-81: Dallas.
Figures 82-83: Denver. Figures 84-85: Detroit. Figures 86-87: Houston. Figures 88-89: Los Angeles.
Figures 90-91: New York. Figures 92-93: Philadelphia. Figures 94-95: Sacramento. Figures 96-97: St. Louis.
Figures 98-99: Washington, D.C.
-------
[Figures 100-114 show changes in VNA estimates of the annual 4th highest MDA8 and the May-September average
MDA8 based on HDDM adjustments, plotted against population and population density (people/km^2) for the
2006-2008 and 2008-2010 periods. Panels show the change from observed to 75 ppb and the change from 75 ppb
to 65 ppb (70 ppb is the intermediate level for Chicago and Detroit in 2008-2010), with separate symbols for
the epidemiology study area, the CBSA, and the exposure area. The figures cover, in order: Atlanta
(Figure 100), Baltimore (101), Boston (102), Chicago (103), Cleveland (104), Dallas (105), Denver (106),
Detroit (107), Houston (108), Los Angeles (109), New York (110), Philadelphia (111), Sacramento (112),
St. Louis (113), and Washington, D.C. (114).]
-------
4.7 COMPARING AIR QUALITY ADJUSTMENTS BASED ON NOX REDUCTIONS ONLY
TO AIR QUALITY ADJUSTMENTS BASED ON NOX AND VOC REDUCTIONS
As mentioned in section 3, HDDM-adjustment scenarios could be carried out either by applying
across-the-board reductions in U.S. anthropogenic NOx emissions or by applying across-the-board
reductions in U.S. anthropogenic NOx and VOC emissions (with equal percentage reductions for the two
precursors). The core analysis applied the NOx-only reductions to all urban case study areas except
Chicago and Denver, for which combined NOx and VOC reductions were applied. To address the question of
how the choice of NOx-only emission reductions affects the estimated air quality distributions for
adjustments to the current and alternative standard levels, we performed sensitivity analyses for seven
cities: Denver, Detroit, Houston, Los Angeles, New York, Philadelphia, and Sacramento. For these seven
cities, we ran both NOx-only and NOx/VOC emission reduction scenarios for all standard levels and
compared the resulting air quality. Although in six of these cities the additional VOC reductions did not
result in needing lower percentage NOx reductions, we recognize that some VOC reductions are likely to
occur in future years due to on-the-books mobile source rules; however, the VOC cuts applied here are
larger than the reductions expected from those rules. Several caveats should be noted for this analysis.
First, it is meant as a sensitivity test only, not as a realistic alternative control scenario. Second,
because the methodology described in section 3 restricts the calculations to equal percentage cuts in NOx
and VOC, this sensitivity analysis does not necessarily identify the optimal combination of NOx and VOC
reduction levels. Table 20 shows the percentage reductions that were applied in each scenario for each of
the seven cities.
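As a rough illustration of how reduction percentages of this kind could be determined, the sketch below
searches for the smallest across-the-board cut that brings a design value (the 3-year average of the annual
4th highest MDA8 at the controlling monitor) down to a target standard level. It is only a minimal sketch:
the adjust_mda8 function is a stand-in for the HDDM-based adjustment described in section 3, the data
structures are illustrative assumptions, and the search assumes the design value falls monotonically as the
cut increases.

    import numpy as np

    def design_value(mda8_by_monitor):
        """3-year design value: average of the annual 4th highest MDA8 at the controlling monitor.

        mda8_by_monitor maps a monitor ID to a list of per-year arrays of daily MDA8 values (ppb).
        """
        per_monitor = [np.mean([np.sort(year)[-4] for year in years])
                       for years in mda8_by_monitor.values()]
        return max(per_monitor)  # highest (controlling) monitor in the area

    def find_cut(base_mda8, target_ppb, adjust_mda8, tol=0.1):
        """Bisection search for the smallest percentage emission cut that meets target_ppb.

        adjust_mda8(base_mda8, pct_cut) is a placeholder for the HDDM-based adjustment and must
        return data in the same structure as base_mda8.
        """
        lo, hi = 0.0, 100.0
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if design_value(adjust_mda8(base_mda8, mid)) <= target_ppb:
                hi = mid  # this cut is enough; try a smaller one
            else:
                lo = mid  # not enough; need a deeper cut
        return hi

For the NOx/VOC scenarios, the same percentage would be applied to both precursors, consistent with the
equal-percentage constraint noted above.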
Table 20: Comparison of NOx-only and NOx/VOC emission reductions applied in sensitivity
analyses for seven urban areas. Column headings are the standard levels (ppb); entries are the
percentage reductions in U.S. anthropogenic emissions applied in each scenario.

                                          NOx only                  NOx/VOC
City                   Years         75    70    65    60      75    70    65    60
Denver                 2006-2008    54%   67%   76%   89%     51%   65%   78%   87%
Denver                 2008-2010    24%   53%   70%   89%     15%   46%   64%   87%
Detroit                2006-2008    59%   69%   76%   84%     60%   73%   85%   90%
Detroit                2008-2010    N/A   54%   66%   78%     N/A   53%   69%   85%
Houston                2006-2008    62%   68%   74%   82%     65%   73%   81%   87%
Houston                2008-2010    42%   53%   63%   75%     40%   52%   65%   85%
Los Angeles (95% LB)   2006-2008    87%   89%   91%   93%     95%   96%   98%   99%
Los Angeles (95% LB)   2008-2010    87%   89%   91%   93%     93%   95%   97%   98%
New York (95% LB)      2006-2008    64%   74%   92%   N/A     60%   71%   89%   N/A
New York (95% LB)      2008-2010    52%   67%   89%   N/A     41%   55%   86%   N/A
Philadelphia           2006-2008    54%   61%   68%   74%     57%   65%   71%   79%
Philadelphia           2008-2010    42%   52%   61%   68%     37%   52%   62%   72%
Sacramento             2006-2008    63%   70%   76%   84%     65%   73%   80%   88%
Sacramento             2008-2010    64%   71%   77%   84%     64%   74%   81%   88%
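As a small worked example of reading the table, the sketch below stores the 65 ppb, 2006-2008 entries for
three of the cities and reports how the required cut changes when matching VOC reductions are added. The
dictionary layout is purely illustrative.

    # 65 ppb standard level, 2006-2008 entries from Table 20 (NOx-only cut %, NOx/VOC cut %)
    reductions_65ppb = {
        "Denver":       (76, 78),
        "New York":     (92, 89),
        "Philadelphia": (68, 71),
    }

    for city, (nox_only, nox_voc) in reductions_65ppb.items():
        delta = nox_voc - nox_only
        direction = "larger" if delta > 0 else "smaller"
        print(f"{city}: with VOC cuts the required reduction is {abs(delta)} points {direction}")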
Figure 115 through Figure 121 show boxplots of composite monitor daily maximum 8-hr ozone values
for recent conditions (base) and each of the eight adjustment scenarios (NOx-only and NOx/VOC for the 75,
70, 65, and 60 ppb standard levels) in the seven cities evaluated. A range of results can be seen in the
different urban areas. Denver, Houston, Los Angeles, and New York showed the largest differences
between O3 concentrations in the NOx-only versus NOx/VOC scenarios, while Detroit, Philadelphia, and
Sacramento had relatively less difference between the O3 distributions estimated in the two types of
scenarios. In all cities, the NOx-only and NOx/VOC scenarios had very similar O3 concentrations at the
upper end of the distribution (top whiskers and outlier dots in the boxplots). This is not surprising,
since the adjustment scenarios were implemented to obtain identical 4th highest O3 concentrations. Mid-range
O3 concentrations (25th-75th percentiles) were generally lower in the NOx/VOC adjustment scenarios than in
the NOx-only adjustment scenarios for the same standard level, with the exception of New York. The
reduction of mid-range O3 concentrations in the NOx/VOC scenarios compared to the NOx-only
scenarios tended to be larger in 2007 than in 2009 and tended to be larger at lower alternative standard
levels. In most urban case study areas the reduction in mid-level ozone in the NOx/VOC scenario
compared to the NOx-only scenario was modest, but in Los Angeles it was substantial. The change in
mid-range ozone concentrations between the two sets of scenarios was so dramatic in Los Angeles that
in many cases the 75th percentile concentration in the NOx/VOC scenario was lower than the 25th
percentile concentration in the comparable NOx-only scenario. The most dramatic differences between
the NOx-only and the NOx/VOC scenarios occurred at the low end of the ozone distribution. In all
urban case study areas, there were smaller increases in ozone at low ozone concentrations in the
NOx/VOC scenario when compared to the NOx-only scenario for the same standard level. This is
especially evident for the extreme low concentrations (bottom whiskers for the blue NOx-only scenarios are
much higher than bottom whiskers for the red NOx/VOC scenarios in the boxplots) but can also be seen in
the 25th percentile O3 values, which are represented by the bottom of the boxes. The reductions were
most apparent for Denver, Houston, Los Angeles, New York, and Philadelphia and were more modest in
Detroit and Sacramento.
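The distributional comparisons summarized above can be reproduced from the composite monitor MDA8 series
with a short summary routine. The sketch below is a minimal illustration and assumes those series are
already available as arrays; the function and argument names are illustrative rather than taken from any
project code.

    import numpy as np

    def boxplot_stats(mda8):
        """Minimum, quartiles, and maximum of a composite monitor MDA8 series (ppb)."""
        return np.percentile(np.asarray(mda8, dtype=float), [0, 25, 50, 75, 100])

    def compare_scenarios(base, nox_only, nox_voc, label=""):
        """Print the statistics compared in the boxplots for one standard level and season."""
        for name, series in (("base", base), ("NOx only", nox_only), ("NOx/VOC", nox_voc)):
            p0, p25, p50, p75, p100 = boxplot_stats(series)
            print(f"{label} {name:8s} min={p0:5.1f}  q25={p25:5.1f}  med={p50:5.1f}  "
                  f"q75={p75:5.1f}  max={p100:5.1f}")

    # Example call (the arrays would come from the observed and adjusted air quality files):
    # compare_scenarios(base_jja_2007, nox_only_65_jja_2007, nox_voc_65_jja_2007, "65 ppb, JJA 2007")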
[Figures 115-121 show boxplots of composite monitor daily maximum 8-hour O3 values (ppb) for the base case
and for the 75, 70, 65, and 60 ppb NOx-only and NOx/VOC adjustment scenarios. In each figure, panels cover
the Z & S and MSA composite monitors for June-August and April-October of 2007 and 2009.]
Figure 115: Composite monitor daily maximum 8-hour O3 values for Denver based on observed and adjusted air
quality for the NOx only and NOx/VOC scenarios.
Figure 116: Composite monitor daily maximum 8-hour O3 values for Detroit based on observed and adjusted air
quality for the NOx only and NOx/VOC scenarios.
Figure 117: Composite monitor daily maximum 8-hour O3 values for Houston based on observed and adjusted air
quality for the NOx only and NOx/VOC scenarios.
Figure 118: Composite monitor daily maximum 8-hour O3 values for Los Angeles based on observed and adjusted
air quality for the NOx only and NOx/VOC scenarios.
Figure 119: Composite monitor daily maximum 8-hour O3 values for New York based on observed and adjusted
air quality for the NOx only and NOx/VOC scenarios.
Figure 120: Composite monitor daily maximum 8-hour O3 values for Philadelphia based on observed and adjusted
air quality for the NOx only and NOx/VOC scenarios.
Figure 121: Composite monitor daily maximum 8-hour O3 values for Sacramento based on observed and adjusted
air quality for the NOx only and NOx/VOC scenarios.
Figure 122 through Figure 128 show maps of the 2006-2008 observed and adjusted April-October
seasonal mean of the daily maximum 8-hr ozone values at monitor locations in each of the urban case
study areas. Adjusted values are shown for the 75 ppb and 65 ppb NOx-only and NOx/VOC adjustment
scenarios. As described earlier in this appendix and in Chapter 4 of the main REA document, these
figures show that for observed 2006-2008 O3 values, the seasonal mean of the daily maximum 8-hr
ozone concentration was suppressed in the highly urbanized areas compared to surrounding locations.
In these figures, it is clear that ozone concentrations are lower in the center city areas for all cities
except Sacramento. The NOx-only adjustment scenarios resulted in three types of behavior, as demonstrated
by the top panels of these maps. First, in Denver, Houston, Los Angeles, and New York the NOx-only
adjustment cases resulted in increasing seasonal mean concentrations in center city locations where
ozone was suppressed in the observations and decreasing seasonal mean concentrations in outlying areas
where ozone was higher in the observations. This trend is so dramatic in Los Angeles and New York
that the spatial gradient becomes inverted in the adjustment cases compared to the observed values (i.e.,
highest concentrations in urban core areas and lower concentrations in surrounding areas). The second
type of response occurred in Detroit, where there was little change at the monitor in the center of Wayne
County but seasonal mean ozone at the outlying monitors decreased in the NOx-only adjustment cases.
Finally, in Sacramento and Philadelphia, the NOx-only adjustment cases caused relatively small decreases
in seasonal mean ozone throughout the area. When comparing the NOx-only to the NOx/VOC
adjustment scenarios, three types of patterns emerge. In Denver and Los Angeles, the NOx/VOC
adjustment scenarios led to lower seasonal mean ozone concentrations than the equivalent NOx-only
adjustment scenarios in the urban core areas but looked similar to the NOx-only adjustment scenarios in
the outlying areas. In Houston, Detroit, Philadelphia, and Sacramento, seasonal mean ozone
concentrations in equivalent NOx-only and NOx/VOC adjustment scenarios were very similar at
monitors throughout the areas, although the NOx/VOC adjustment scenarios led to slightly lower
concentrations. Finally, in New York the 75 ppb adjustment scenarios were similar for the NOx/VOC
and NOx-only cases, but for the 65 ppb adjustment scenarios the NOx/VOC case actually led to a higher
seasonal mean ozone concentration in the urban core area than the NOx-only case.

An evaluation of the composite monitor and spatial plot maps leads to several general
conclusions for this sensitivity analysis. First, the NOx/VOC reduction scenarios tended to mitigate the
increases that occurred in the NOx-only scenarios at the lower end of the ozone distribution. Second, the
effect of the NOx/VOC scenarios versus the NOx-only scenarios was less dramatic for mid-range ozone
concentrations and varied from city to city; the NOx/VOC scenarios led to lower mid-range ozone
concentrations than the NOx-only scenarios except in the case of New York. Third, the high-end ozone
concentrations at the various standard levels were similar in the NOx-only and the NOx/VOC scenarios.
Finally, the VOC reductions tended to have more impact in urban core areas and relatively little impact
in outlying areas. The effects of these air quality sensitivity analyses on risk are evaluated and discussed
in Chapter 7 of the main document.
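The quantity mapped in these figures can be computed in a few lines from hourly monitor data. The sketch
below follows the usual MDA8 definition but simplifies the data-completeness rules; the variable names are
illustrative rather than taken from the underlying data files.

    import pandas as pd

    def daily_max_8hr(hourly_o3):
        """MDA8: the highest of the 8-hour running means beginning in each hour of the day.

        hourly_o3 is a pandas Series of hourly O3 (ppb) with a DatetimeIndex.
        """
        run8 = hourly_o3.rolling(window=8, min_periods=6).mean()  # require at least 6 of 8 hours
        run8 = run8.shift(-7)            # label each 8-hour mean by its starting hour
        return run8.resample("D").max()

    def april_october_mean(mda8):
        """April-October seasonal mean of the daily MDA8 values at one monitor."""
        in_season = mda8[(mda8.index.month >= 4) & (mda8.index.month <= 10)]
        return in_season.mean()

Applying the seasonal mean to the observed series and to each adjusted series at every monitor gives the
values plotted at the monitor locations in Figures 122 through 128.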
Figure 122: April-October seasonal average of daily maximum 8-hour O3 values at Denver area
monitor locations for observed 2006-2008 conditions (left panel), 75 ppb adjustment
scenarios (middle panels), and 65 ppb adjustment scenarios (right panels). NOx only
adjustments are shown in top panels, NOx/VOC adjustments are shown in bottom panels.
Figure 123: April-October seasonal average of daily maximum 8-hour O3 values at Detroit area
monitor locations for observed 2006-2008 conditions (left panel), 75 ppb adjustment
scenarios (middle panels), and 65 ppb adjustment scenarios (right panels). NOx only
adjustments are shown in top panels, NOx/VOC adjustments are shown in bottom panels.
Figure 124: April-October seasonal average of daily maximum 8-hour O3 values at Houston area
monitor locations for observed 2006-2008 conditions (left panel), 75 ppb adjustment
scenarios (middle panels), and 65 ppb adjustment scenarios (right panels). NOx only
adjustments are shown in top panels, NOx/VOC adjustments are shown in bottom panels.
Figure 125: April-October seasonal average of daily maximum 8-hour O3 values at Los Angeles
area monitor locations for observed 2006-2008 conditions (left panel), 75 ppb adjustment
scenarios (middle panels), and 65 ppb adjustment scenarios (right panels). NOx only
adjustments are shown in top panels, NOx/VOC adjustments are shown in bottom panels.
[Panels for the next figure in the series: June-August average MDA8, observed 2006-2008 conditions, with
75 ppb and 65 ppb rollback panels for the NOx only (top) and NOx/VOC (bottom) scenarios.]