Technical Support Document for the
        Clear Skies Act 2003
   Air Quality Modeling Analyses
   U.S. Environmental Protection Agency
Office of Air Quality Planning and Standards
 Emissions Analysis and Monitoring Division
     Research Triangle Park, NC 27711
             October 2003

-------
Table of Contents

I.      Introduction	1

II.     Emissions Inventory Estimates	1

III.    Episodic Ozone Modeling	2
       A.     Model Configuration	3
                    1. Episodic Meteorology and Ambient Air Quality	3
                    2. Domain and Grid Configuration	4
                    3. Meteorological and Other Model Inputs	5
       B.     Model Performance Evaluation	7
                    1. Statistical Definitions	7
                    2. Domainwide Model Performance (Eastern U.S.)	8
                    3. Local-scale Model Performance (Eastern U.S.)	10
       C.     Ozone Modeling Results	13
                    1. Projected Future Ozone Design Values	13
                          a. East	13
                          b. West	14
                    2. Ozone Nonattainment Summary	14

IV.    Particulate Matter, Visibility, and Deposition Modeling over the Continental U.S.	17
       A.     REMSAD Model Description	17
                    1. Gas Phase Chemistry	17
                    2. PM Chemistry	18
       B.     REMSAD Modeling Domain	19
       C.     REMSAD Inputs	20
                    1. Meteorological Data	21
                    2. Initial and Boundary Conditions, and Surface Characteristics	23
       D.     Model Performance Evaluation	25
                    1. Statistical Definitions	28
                    2. Results of REMSAD Performance Evaluation	30
                          a. IMPROVE Performance	30
                                 a.1. PM2.5 Performance	30
                                 a.2. Sulfate Performance	32
                                 a.3. Elemental Carbon Performance	34
                                 a.4. Organic Aerosol Performance	35
                                 a.5. Nitrate Performance	37
                                 a.6. PMFINE-Other (crustal) Performance	39
                          b. NADP Wet Deposition Performance	41
                          c. Wet Mercury Deposition	42
                          d. CASTNet Performance	43
                          e. Visibility Performance	44
                    3. Summary of Model Performance	46
       E.     Projected Future PM2.5 Design Values	48
                    1. East	48
                    2. West	49
       F.     PM2.5 Nonattainment Summary	50
       G.     Projected Visibility	53
       H.     Model Output for Benefits Calculations	54

References	55

Appendix A   Annual Total Emissions Summaries and EGU Emissions Summaries.
Appendix B   8-Hour Ozone Design Values for 1999-2001 and Future Year Predicted Design
             Values for 2010 and 2020 Base and Control Cases.
Appendix C   8-Hour Ozone Nonattainment Maps and Design Value Range Maps.
Appendix D   IMPROVE Monitoring Sites used in the REMSAD Model Performance
             Evaluation.
Appendix E   Speciated Modeled Attainment Test (SMAT) Documentation.
Appendix F   Annual Average PM2.5 Design Values for 1999-2001 and Future Year Predicted
             Design Values for 2010 and 2020 Base and Control Cases.
Appendix G   Annual Average PM2.5 Nonattainment Maps and Design Value Range Maps.
Appendix H   Projected Visibility Summaries for 20% Best and 20% Worst Days at IMPROVE
             Monitoring Sites.

-------
I. Introduction

       This document describes the procedures and results of the air quality modeling analyses
used to quantify the impacts and benefits of the Clear Skies Act (CSA) of 2003. The air quality
modeling was conducted to support several components of the legislation including:

       (a) an assessment of the costs and benefits associated with the Act, and
       (c) an assessment of the expected impact of the Act on ozone and PM2.5 levels.

       The air quality  model applications include episodic regional scale ozone modeling for the
Eastern U.S. and annual particulate matter (PM), visibility, and deposition modeling on a
continental scale covering the 48 contiguous States. 1996 Base Year simulations were made to
examine the ability of the modeling systems to replicate observed concentrations  of these
pollutants.  This was followed by simulations using a 2001 "proxy" emissions inventory. The
2001 inventory was created for the purpose of aligning the current year ozone and PM2.5 design
values (1999-2001) to the emissions inventory year.

       The 2001 modeling was followed by simulations for future-year base case scenarios of
2010 and 2020. The future base case scenarios included emissions resulting from growth and
known emissions controls from Federal and State rules and legislation. The results of the future
base case model runs were used to establish a base to compare against the Clear Skies controls.
Additional control case simulations were performed for 2010 and 2020 to quantify the impacts of
the Clear Skies Act controls on air quality. The outputs of the base and control case model runs
were also used to calculate portions of the monetized benefits as part of the cost-benefits
analysis.

       The remainder  of this report includes a description of the ozone and
PM/visibility/deposition modeling systems, the time periods modeled, the Base Year (1996)
model performance evaluations, and the results of the future Base Case and Control Case model
simulations.
II.  Emissions Inventory Estimates

       In order to complete the requisite ozone and PM modeling, it was necessary to first
develop a national mass emissions inventory. This mass emissions inventory was then used as
the basis for developing input files for the air quality modeling. The development and details of
these inventories for each of the scenarios (i.e.,  1996 base year, 2001 base case, 2010 base case,
2010 control case, 2020 base case, and 2020 control case) are described elsewhere (EPA, 2003a)
and (EPA, 2003b).

       The mass inventories were prepared at the county level for on-highway mobile sources,
stationary area sources, and nonroad sources. Emissions for electric generating units (EGUs)

-------
and large industrial sources (non-EGUs) were prepared as individual point sources. These
inventories contain annual and typical summer season day emissions for the following
pollutants: oxides of nitrogen (NOX), volatile organic compounds (VOC), carbon monoxide
(CO), sulfur dioxide (SO2), primary particulate matter with an aerodynamic diameter less than or
equal to 10 micrometers and 2.5 micrometers (PM10 and PM2.5), ammonia (NH3), and mercury
(Hg). The 2010 and 2020 non-EGU and area source base case inventories were prepared by
applying growth and control assumptions to the 1996 base year inventory.  The 2010 and 2020
mobile and nonroad base case inventories were developed by applying the MOBILE5b1 and
NONROAD2002 models, respectively. The EGU base case emissions were developed from the
IPM model (EPA, 2003). The 2010 and 2020 control case inventories are the same as the 2010
and 2020 base case inventories for all sectors except for EGU emissions. The Clear Skies
control case EGU emissions are supplied by IPM.

       The annual and summer day mass emissions inventories for each scenario were processed
using the SMOKE emissions processing system (Houyoux, 2000) to create the appropriate
emissions inputs for the REMSAD and CAMx model runs, respectively.  The emissions
processing produced hourly, gridded, speciated
emissions. For PM modeling the annual emissions for stationary area, point, and nonroad
sources were processed to generate separate sets of emissions representing typical weekday,
Saturday, and Sunday emissions for each season. For ozone modeling the summer day emissions
were processed to generate typical summer weekday, Saturday, and Sunday emissions. On-
highway emissions were obtained in model-ready form from the Heavy-Duty Diesel Rule
modeling exercise.  Hourly biogenic emissions were calculated using the Biogenic Emissions
Inventory System (BEIS3.09) model. Biogenic emissions were not altered for any of the
scenarios modeled. Appendix A contains State by State emissions summaries of total emissions
from all sectors  as well as annual total EGU emissions for each of the modeling scenarios. More
detailed emissions summaries (for all source sectors) can be found at:
ftp://ftp.epa.gov/modelingcenter/Clear_skies/CSA2003/Emissions/Summaries/CSA-2003-Emissions-Summaries.zip
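       For illustration only, the temporal allocation step described above can be sketched as
follows.  This is not SMOKE itself, and the seasonal, day-of-week, and hourly factors shown are
placeholder values rather than EPA temporal profiles; the sketch simply indicates how an annual
county total is apportioned to a typical summer weekday, Saturday, or Sunday and then spread
over 24 hours.

   # Minimal sketch (not SMOKE) of temporal allocation: an annual county total is
   # split into a typical-day value for each season and day type, then spread over
   # 24 hours.  All factors below are illustrative placeholders.
   ANNUAL_NOX_TONS = 3650.0                      # hypothetical annual county total

   season_fraction = {"winter": 0.22, "spring": 0.25, "summer": 0.28, "fall": 0.25}
   daytype_weight = {"weekday": 1.08, "saturday": 0.90, "sunday": 0.70}  # relative to an average day
   diurnal_profile = [1.0] * 24                  # flat hourly profile as a placeholder

   def typical_day_emissions(season, daytype):
       """Return hourly emissions (tons/hr) for a typical day of the given type."""
       days_in_season = 365.0 / 4.0
       average_day = ANNUAL_NOX_TONS * season_fraction[season] / days_in_season
       day_total = average_day * daytype_weight[daytype]
       return [day_total * w / sum(diurnal_profile) for w in diurnal_profile]

   hourly = typical_day_emissions("summer", "weekday")
   print(round(sum(hourly), 2))                  # total tons on a typical summer weekday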


III.  Episodic Ozone Modeling

       Air quality modeling analyses for ozone were conducted with the Comprehensive Air
Quality Model with Extensions (CAMx). CAMx is a non-proprietary computer modeling tool that
can be used to evaluate the impacts of proposed emissions reductions on future air quality levels.
For more information on the CAMx model, please see the model user's guide (Environ, 2002)2.
Version 3.10 of the CAMx model was employed for these analyses.
       1 The mobile source inventories were prepared using MOBILE5b with adjustment factors to
simulate MOBILE6 emissions. These are the same mobile source emissions that were used in the
Nonroad rule proposal and the Heavy Duty Engine and Tier 2 rules.

       2 http://www.camx.com/pdf/CAMx3.UsersGuide.020410.pdf


-------
       The modeling analyses were completed for an Eastern U.S. domain as shown in Figure
III-1.  The domain has nested horizontal grids of 36 and 12 km. The model was applied and
evaluated over three episodes that occurred during the summer of 1995. Subsequently,
episodic ozone model runs were made for the 2001 base year scenario and the 2010  and 2020
base and control case scenarios for all episodes.

       The model output from the 1996 base year scenario was used to evaluate the performance
of the model. The model outputs from the 2001 base year and 2010 and 2020 base and control
cases, combined with current air quality data, were used to project ozone design values to the
future years (2010 and 2020).  The costs, benefits, and expected impacts of the proposed controls
were determined by comparing the model results in the future year control runs against the
baseline simulations of the same year.
A.  Model Configuration

1. Episodic Meteorology and Ambient Air Quality

       There are several considerations involved in selecting episodes for an ozone modeling
analysis (EPA, 1999a).  In general, the goal should be to model several differing sets of
meteorological conditions leading to ambient ozone levels similar to an area's design value.
Warm temperatures, light winds, cloud-free skies, and stable boundary layers are some of the
typical characteristics of ozone episodes.  On a synoptic scale,  these conditions usually result
from a combination of high pressure aloft (e.g., at the 500 millibar pressure level) and at the
surface. Of course at a smaller scale, the conditions that lead to local ozone exceedances can
vary from location to location based on factors such as wind direction, sea/lake breezes, etc. The
meteorological and resultant ozone patterns for the three separate modeling episodes used in this
analysis are listed in Table III-1 and are discussed in more detail in previous technical support
documents  for the Tier-2/Low Sulfur rule (EPA, 1999b) and the Heavy-Duty Engine rule (EPA,
2000).  These previous discussions conclude that the selected episodes contain measured ozone
concentrations that are representative of design values over most of the eastern U.S. The first
three days of each period are  considered ramp-up days and the results from these days were not
used in the  analyses. In all, 30 episode days were modeled.
Table III-1. Dates of CAMx Modeling Episodes.

                 Eastern U.S. Modeling
   Episode 1     June 12-24, 1995
   Episode 2     July 5-15, 1995
   Episode 3     August 7-21, 1995

-------
2. Domain and Grid Configuration
       As with episode selection, there are also several considerations involved in selecting the
domain and grid configuration to be used in the ozone modeling analysis.  The modeling domain
should encompass the area of intended analysis with an additional buffer of grid cells to
minimize the effects of uncertain boundary condition inputs. When possible, grid resolution
should be equivalent to the resolution of the primary model inputs (emissions, winds, etc.) and
equivalent to the scale of the air quality issue being addressed. The CAMx modeling was
performed for the coarse and fine grid domains as defined below.

Table III-2. Details of the CAMx Modeling Domains.

                          Eastern US Domain
                          Coarse Grid                         Fine Grid
   Map Projection         latitude/longitude                  latitude/longitude
   Grid Resolution        1/2° longitude, 1/3° latitude       1/6° longitude, 1/9° latitude
                          (~36 km)                            (~12 km)
   East/West extent       -99 W to -67 W                      -92 W to -69.5 W
   North/South extent     26 N to 47 N                        32 N to 44 N
   Vertical extent        Surface to 4 km                     Surface to 4 km
   Dimensions             64 by 63 by 9                       137 by 110 by 9

-------
Figure III-1. Map of the Eastern U.S. modeling domain.  The outer box denotes the entire
modeling domain (36 km) and the shaded inner box indicates the fine grid location (12 km).
3. Meteorological and Other Model Inputs

       The air quality model requires certain meteorological inputs that, in part, govern the
formation, transport, and destruction of pollutant material. In particular, the CAMx model used
in these analyses requires seven meteorological input files: wind (u- and v-vector wind
components), temperature, water vapor mixing ratio, atmospheric air pressure, cloud cover,
rainfall, and vertical diffusion coefficient.  Fine grid values of wind, pressure, and vertical
diffusivity are used; the other fine grid meteorological inputs are interpolated from the coarse
grid files.

Eastern U.S. Domain:   The gridded meteorological data for the three historical  1995 episodes
were developed by the New York State Department of Environmental Conservation using the
Regional Atmospheric Modeling System (RAMS), version 3b.  RAMS (Pielke et al., 1992) is a
numerical meteorological model that solves the full set of physical and thermodynamic equations
which govern atmospheric motions.  The output data from RAMS, which was run in a polar
stereographic projection and a sigma-p coordinate system, was then mapped to the CAMx grid.

-------
Two separate meteorological CAMx inputs, cloud fractions and rainfall rates, were developed
based on observed data.

       RAMS was run in a nested-grid mode with three levels of resolution: 108 km, 36 km, and
12 km, with 28-34 vertical layers.3  The top of the surface layer was 16.7 m in the 36 and 12 km
grids. The two finer grids were at least as large as their CAMx counterparts. In order to keep
the model results in line with reality, the simulated fields were nudged to a European Centre for
Medium-Range Weather Forecasts analysis field every six hours.  This assimilation data set
was bolstered by special soundings collected every four hours as part of the North
American Research Strategy for Tropospheric Ozone field study in the northeast U.S.

       A limited model performance evaluation (Sistla, 1999) was completed for a portion of the
1995 meteorological modeling (July  12-15).  Observed data not used in the  assimilation
procedure were compared against modeled data at the surface and aloft.  In  general, there were
no widespread biases in temperatures and winds.  Furthermore, the meteorological fields were
compared before and after being processed into CAMx inputs. It was concluded that this
preprocessing did not distort the meteorological fields.

Other Model Inputs:  In addition to the meteorological data, the photochemical grid model
requires several other types of data.  In general, most of these miscellaneous model files were
taken from existing regional modeling applications.  Clean conditions were used to initialize the
model and as lateral and top boundary conditions as in previous regional modeling applications.
The model also requires information regarding land use type and surface albedo for all layer 1
grid cells in the domain.  Existing regional data obtained from OTAG were  used for these  non-
day-specific files. Photolysis rates were developed using the JCALC preprocessor (SAI, 1996).
Turbidity values were set equal to a constant thought to be representative of regional conditions.
       3 The inner nests were modeled with 34 layers while the outer 108 km domain was
modeled with 28 layers.

-------
B.  Model Performance Evaluation

       The goal of the Base Year modeling was to reproduce the atmospheric processes
resulting in high ozone concentrations over the eastern United States during the three 1995
episodes selected for modeling.  Note that the Base Year of the emissions was 1996 while the
eastern U.S. episodes are for 1995.  The effects on model performance of using 1996 Base Year
emissions for the 1995 episodes are not known, but are not expected to be major.

       An operational model performance evaluation for surface ozone for the three episodes
was performed in order to estimate the ability of the modeling system to replicate base year
ozone concentrations.  This evaluation consists principally of statistical assessments of
model-versus-observed pairs. The robustness of an operational evaluation is directly
proportional to the amount and quality of the ambient data available for comparison.

1. Statistical Definitions

       Below are the definitions of those statistics used for the evaluation. The format of all the
statistics is such that negative values indicate model ozone predictions that were less than their
observed counterparts. Positively-valued statistics indicate model overestimation of surface
ozone. Statistics were not generated for the first three days of an episode to avoid the
initialization period. The operational statistics were principally generated on a regional basis in
accordance with the primary purpose of the modeling which is to assess the need for, and
impacts of, a national emissions control program.  However, a local assessment of model
performance was also completed to ensure that the model did not significantly overestimate the
need for controls in individual areas. The  statistics were calculated for (a) the entire domain, (b)
four quadrants (i.e., Midwest, Northeast, Southeast, Southwest), and (c) 47 local areas.  The
statistics calculated for each of these sets of areas  are described below.

Domainwide unpaired peak prediction accuracy: This metric simply compares the peak
concentration modeled anywhere in the selected area against the peak ambient concentration
anywhere in the same area. The difference of the  peaks (model - observed) is then normalized
by the peak observed concentration.

Peak prediction accuracy: This metric averages the paired peak prediction accuracy calculated
for each monitor in the subregion.  It characterizes the ability of the model to replicate peak
(afternoon) ozone over a subregion. The daily peak model versus daily peak observed residuals
are paired in space but not by hour.

Mean normalized bias: This performance statistic  averages the normalized (by observation)
difference (model - observed) over all pairs in which the observed values were greater than 60
ppb. A value of zero would indicate that model overpredictions and model underpredictions
exactly cancel each other out.

Mean normalized gross error: The last metric used to assess the performance is similar to the

-------
above statistic, except in this case it is the absolute value of the residual which is normalized by
the observation, and then averaged over all sites. A zero gross error value would indicate that all
model concentrations (for which the observed counterpart was greater than 60 ppb) exactly
matched the ambient values.
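       For reference, the four statistics defined above can be written compactly as follows.  The
sketch below (in Python) only restates the formulas as defined in this section, including the
60 ppb observation cutoff and the normalization by the observation; it is not the code used to
generate the tables that follow.

   # pairs: hourly (model, observed) ozone values in ppb, matched in space and time.
   # site_peaks: (daily peak prediction near a monitor, daily peak observation) per monitor.

   def unpaired_peak_accuracy(model_field, obs_values):
       """(peak modeled anywhere in the area - peak observed) / peak observed, in percent."""
       peak_mod, peak_obs = max(model_field), max(obs_values)
       return 100.0 * (peak_mod - peak_obs) / peak_obs

   def paired_peak_accuracy(site_peaks):
       """Average over monitors of (peak model - peak obs)/peak obs, paired in space
       but not by hour, in percent."""
       residuals = [100.0 * (m - o) / o for (m, o) in site_peaks]
       return sum(residuals) / len(residuals)

   def mean_normalized_bias(pairs, cutoff=60.0):
       """Average of (model - obs)/obs over all pairs with obs > cutoff, in percent."""
       kept = [(m, o) for (m, o) in pairs if o > cutoff]
       return 100.0 * sum((m - o) / o for m, o in kept) / len(kept)

   def mean_normalized_gross_error(pairs, cutoff=60.0):
       """Average of |model - obs|/obs over all pairs with obs > cutoff, in percent."""
       kept = [(m, o) for (m, o) in pairs if o > cutoff]
       return 100.0 * sum(abs(m - o) / o for m, o in kept) / len(kept)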
2. Domainwide Model Performance

       As with previous regional photochemical modeling studies, the degree that model
predictions replicate observed concentrations varies by day and location over the large eastern
U.S. modeling domain. From a qualitative standpoint, there appears to be considerable
similarity on most days between the observed and simulated ozone patterns.  Additionally, where
possible to discern, the model appears to follow the day-to-day variations in synoptic-scale
ozone fairly closely. More quantitative comparisons of the model predictions and ambient data
are provided below.

       When all hourly observed ozone values (greater than 60 ppb) are compared to their
modeled counterparts for the thirty episode modeling days for the eastern U.S., the mean
normalized bias is -1.1 percent and the mean normalized gross error is 20.5 percent.  As shown in
Table III-3, the model generally underestimates observed ozone values for the June and July
episodes, but predicts higher than observed amounts for the August episode.
Table III-3. Performance statistics for hourly ozone in the Eastern U.S. CAMx simulations.

                    Average Accuracy     Mean Normalized     Mean Normalized
                    of the Peak          Bias                Gross Error
   June 1995            -7.3                 -8.8                19.6
   July 1995            -3.3                 -5.0                19.1
   August 1995           9.6                  8.6                23.3
       Depending on the episode and region, the normalized biases can range from an
underestimation of 18 percent to an overestimation of 16 percent. Gross errors tend to average
between 17 and 25 percent. As shown in Table III-4, when the model domain is subdivided into
four quadrants, it is found that most of the underestimations in the June and July episodes are
driven by the Northeast and Midwest quadrants (i.e., the two northern ones). Conversely, most
of the overestimated ozone in the August episode is due to the Midwest, Southeast and
Southwest quadrants. Hourly ozone is consistently underestimated in the Northeast quadrant.
The model does slightly better in replicating the peak values for each monitoring site than it does
at replicating the mean values, especially in the Northeast where the underpredictions are not as
large for the highest ozone observations.

-------
Table III-4. Regional/Episodic performance statistics for Clear Skies hourly ozone predictions.


                                Whole Grid   Northeast   Midwest   Southeast   Southwest
   Average Accuracy   June        -7.3        -14.7       -7.3       -2.9        -0.9
   of the Peak        July        -3.3         -5.0       -6.2        1.9         1.3
                      August       9.6         -4.3       15.5       15.1         7.0
   Mean Normalized    June        -8.8        -18.4       -8.7       -3.0         0.7
   Bias               July        -5.0         -7.2       -7.2        1.3         3.1
                      August       8.6         -6.0       15.5       14.7        10.3
   Mean Normalized    June        19.6         24.7       18.0       17.4        19.0
   Gross Error        July        19.1         19.1       19.4       19.1        20.0
                      August      23.3         22.6       23.7       24.1        22.6
       At present, there is no generally accepted set of numerical criteria by which one can
judge the adequacy of model performance for regional applications. In view of this, EPA
determined the acceptability of the modeling for Clear Skies by comparison against the
performance results of regional models from previous analyses. For instance, the Nonroad rule
and Heavy-Duty Engine (HDE) simulations were determined to be appropriate for use based on
comparisons to previously accepted modeling analyses (e.g., OTAG and Tier-2). As shown in
Table III-5, model performance in the base year Clear Skies simulations is generally similar to
or better than other regional ozone modeling efforts.  In particular, the gross error metric is almost
universally improved in the more recent Clear Skies modeling. In general, the Clear Skies/CAMx modeling
results are approximately 3-6 ppb higher on average than what was generated in the HDE/UAM-
V modeling.  In some previous regional modeling  applications, there had been a tendency in
some regions for the model to underestimate ozone in the early parts of an episode and then
overestimate ozone at the end of an episode.  However, in general, there does not appear to be
any such bias trend in the Clear Skies base year modeling.
Table III-5. Regional/Episodic performance statistics for HDE hourly ozone predictions.  Bold
numbers indicate HDE statistics that have improved in the Clear Skies simulations (see Table
III-4).


                                Whole Grid   Northeast   Midwest   Southeast   Southwest
   Average Accuracy   June       -10.5        -15.1      -13.1       -5.4         0.2
   of the Peak        July        -5.8         -6.6      -11.1        0.6         3.9
                      August       7.7         -5.2       11.4       14.7         8.8
   Mean Normalized    June       -13.2        -20.3      -15.4       -7.2         1.0
   Bias               July        -9.6        -12.1      -14.2       -2.8         4.9
                      August       5.0         -8.8        9.6       12.1        10.5
   Mean Normalized    June        22.3         27.0       21.6       18.4        21.6
   Gross Error        July        22.3         21.2       23.6       21.0        23.4
                      August      23.6         24.2       22.1       24.6        26.5

-------
       Table III-6 presents the results from the 8-hour ozone evaluation. In general, the
gross error is noticeably lower for the 8-hour modeled-versus-observed ozone comparisons than
for the hourly comparisons.
However, model estimates during the August episode clearly overpredict the observed values in
regions outside the Northeast.
Table III-6. Regional/Episodic performance statistics for Clear Skies 8-hour ozone predictions.


                                Whole Grid   Northeast   Midwest   Southeast   Southwest
   Average Accuracy   June        -3.9        -13.5       -4.0        1.3         5.0
   of the Peak        July         0.9         -2.4       -0.9        5.3         8.2
                      August      13.9         -1.6       20.6       20.5        16.2
   Mean Normalized    June        -5.7        -15.4       -5.8        0.9         3.9
   Bias               July        -2.1         -4.9       -4.4        4.0         3.6
                      August      11.0         -3.8       17.6       18.4        12.4
   Mean Normalized    June        17.5         21.3       16.0       16.4        17.8
   Gross Error        July        16.4         14.6       16.7       17.5        18.1
                      August      22.6         20.8       23.7       24.1        21.1
3. Local-scale Model Performance

       The CAMx modeling results were also evaluated at a "local" level. The purpose of this
analysis was to ensure that areas determined to need the Clear Skies emissions reductions
based on projected exceedances of the ozone standard were not unduly influenced by local
overestimation of ozone in the model Base Year. For this analysis, the modeling domain was
broken up into 51 local subregions as shown in Figure III-3. The primary statistics for each of
the 51 subregions are shown in Table III-7.

       As noted  above, there is no set of established statistical benchmarks to determine the
adequacy of a regional modeling operation evaluation.  However, the performance statistics for
the eastern U.S. modeling were compared to the recommended performance ranges for urban
attainment modeling (EPA, 1991).  The results indicate that model performance for the June
episode was within the recommended ranges for 69% of the local areas examined. For the July
and August episodes, the percent of local areas with performance within the recommended
ranges was 80% and 61%, respectively. This is an improvement from the HDE model
performance where the numbers were 57%, 45%, and 55% for the June, July, and August
episodes, respectively.

       Local-scale model performance is poorest in the southeastern U.S. in the August episode,
where overpredictions occurred.  In fact, ozone in areas along the Gulf Coast (New Orleans,
Beaumont/Port Arthur, Baton Rouge, etc.) tends to be universally overestimated. This is likely
due to the model's tendency to generate large amounts of ozone along coastal areas where low
stability and high emissions densities can coexist.

-------
       With the exception of the July episode, the model tends to underestimate observed ozone
by approximately 15% in the local areas of the Northeast (e.g., New York City, Philadelphia,
Boston). The local 8-hour metrics (not shown) generally do not greatly differ from their hourly
counterparts. There is a slight tendency toward greater overprediction of the 8-hour values.

Table III-7. Local performance statistics for Clear Skies hourly ozone predictions.


                          Average Accuracy of       Mean Normalized           Mean Normalized
                          the Peak                  Bias                      Gross Error
                          June    July   August     June    July   August     June    July   August
  Dallas                  -9.6   -12.3    2.2      -10.6   -11.5    3.2       16.6    18.7   15.7
  Houston/Galveston       -3.0    -5.1    0.3       -3.5    -3.9    2.2       20.8    19.0   25.7
  Beaumont/Port Arthur    14.0    16.7    8.8       16.0    19.3   12.9       20.4    24.5   24.6
  Baton Rouge             15.6    24.7   31.4       22.6    26.6   37.4       26.1    31.0   40.5
  New Orleans             15.6    29.1   42.1       15.9    28.9   48.9       21.9    32.0   50.2
  St. Louis               -0.5    -4.0    8.4       -0.6     0.6   10.5       17.0    18.4   18.2
  Memphis                 -7.7    -4.9   13.7       -5.9    -0.3   13.6       15.5    19.3   22.0
  Alabama                  5.2    -1.7   16.0        6.5     6.7   23.1       14.4    16.6   25.2
  Atlanta                 -3.1     5.4   19.0       -3.4     6.8   26.1       16.7    20.1   31.0
  Nashville               -2.9     7.8   31.5       -2.4     9.1   36.1       18.1    24.7   37.4
  Eastern TN             -14.2   -16.0   -2.7      -21.0   -17.1   -5.9       22.7    20.7   18.3
  Charlotte                8.3    -2.1    6.0        5.8     4.1   14.5       13.0    16.3   18.2
  Greensboro              -1.7    -1.1   17.2       -4.2     1.2   18.2       14.1    15.3   21.7
  Raleigh-Durham         -11.8     1.3   -2.3      -10.7     4.2   -1.9       14.6    13.9   16.9
  Evansville/Owensboro     1.2    -0.9   28.3        4.5     5.4   32.8       15.1    21.2   33.9
  Indianapolis            -8.3   -13.5   15.9       -3.6   -14.4   18.0       13.1    19.3   19.7
  Louisville               2.8     4.2   36.6        4.8     6.1   42.1       14.7    17.9   42.5
  Cincinnati/Dayton       -4.7    -8.5   29.0        0.1    -5.6   32.7       12.8    19.1   33.5
  Columbus                -8.5   -14.5    9.2       -6.2   -11.0   14.2       14.6    17.3   18.7
  West Virginia           -8.8    -5.7   12.7       -7.5    -3.2   13.7       15.7    16.6   24.5
  Chicago                 -9.9    -4.3   10.4      -17.1   -11.1    3.5       24.5    23.5   22.3
  Milwaukee              -14.8   -12.9   21.5      -16.5   -16.9   12.3       19.1    23.3   18.2
  Muskegon/Grand Rapids  -10.8   -12.3    3.1      -11.6   -12.9    1.7       17.7    20.4   16.4
  Gary/South Bend        -13.0   -10.0   11.8      -15.0   -14.5    9.3       19.2    24.4   20.7
  Detroit                -17.2    -5.8    3.9      -20.1   -13.2   -3.2       25.1    22.5   23.4
  Pittsburgh             -10.0    -3.2    9.2       -9.2    -2.1    7.9       23.1    16.1   20.4
  Central PA              -6.0    -7.6    1.0       -8.5    -6.0    1.1       21.9    15.5   18.6
  Norfolk                 -9.0     0.0    8.3      -13.4    -5.6    5.7       19.1    18.6   24.7
  Richmond                -1.2     4.8    2.6       -1.3    10.7    4.5        8.4    18.3   20.3
  Baltimore/Washington    -4.7    -3.1    1.7       -6.8    -5.2    0.7       18.6    15.6   23.4
  Delaware                -6.1    -5.2    2.3       -6.3    -0.2    7.5       12.9    11.6   16.2
  Philadelphia           -14.1    -1.8   -8.7      -22.0   -10.5  -13.9       26.4    19.5   28.9
  New York City          -16.2    -3.9  -12.2      -24.6   -14.1  -17.9       31.3    22.5   29.8
  Hartford               -16.9    -5.0   -9.9      -18.5    -4.0   -7.7       23.6    18.2   20.1
  Boston                 -13.7    -4.7  -15.6      -19.6    -9.2  -19.6       25.9    20.9   26.5
  Maine                  -20.4    -4.7   -6.9      -25.0    -9.4   -6.9       25.3    19.0   15.5
  Longview/Shreveport     -2.1    11.3    7.7        0.8    11.1   11.4       16.2    16.5   17.9
  Kansas City             -8.5    -7.8   -4.3       -7.9    -1.5   -8.3       15.7    13.0   12.4
  Western NY             -23.1   -20.6   -9.0      -25.6   -20.5  -12.1       28.1    23.8   19.0
  Northeast OH            -4.0    -6.5    6.9       -6.6    -6.8    7.7       20.4    15.5   16.5
  South Carolina          -2.5     1.3   11.4       -3.4     1.5   15.7       12.5    17.7   19.4
  Gulf Coast               0.5    23.1   29.3        4.5    30.0   33.7       15.4    31.6   34.9
  FL West Coast           -6.4    22.8   41.2       -7.3    11.9   42.8       11.3    22.7   43.7
  FL East Coast          -15.9    16.2   23.3      -16.8    16.6   26.3       18.0    18.4   29.4
  Jackson                  0.6    10.9   21.0        1.8    10.0   24.0       16.0    16.0   24.9
  Central MI              -6.9   -10.4   12.0       -9.6   -14.8    6.6       18.1    18.7   17.5
  Macon/Columbus          -9.5   -11.1   21.6       -8.8    -5.7   26.4       10.9    13.0   26.9
  Austin/San Antonio     -14.1   -19.6   -1.9      -11.0   -15.5    4.1       14.1    17.2   12.4
  Oklahoma City/Tulsa    -12.3    -5.6   -5.2      -12.9    -3.2   -2.8       17.2    14.6   12.6
  Ft. Wayne/Lima          -9.1   -13.1    3.9       -8.3   -14.1    5.1       16.0    18.2   10.6
  Bangor/Hancock Co.     -17.8    -6.9  -17.7      -24.4    -8.5  -19.9       25.2    15.3   21.0

-------
Figure III-3. Map of the 51 local-scale evaluation zones.
C.  Ozone Modeling Results

       The Clear Skies CAMx modeling output was analyzed to examine the air quality impacts
of the legislation. The procedures and results of the analysis are described below.

1. Projected Future Ozone Design Values

a.) East

       The CAMx simulations were performed for Base Cases in 1996, 2001, 2010, and 2020
considering growth and expected emissions controls that will affect future air quality.  The
effects of the Clear Skies Act emissions reductions (i.e., Control Cases) were modeled for the
two future years (2010 and 2020). As a means of assessing the future levels of air quality with
regard to the ozone NAAQS, future-year estimates of ozone design values were calculated using
relative reduction factors (RRFs) applied to 1999-2001 ozone design values (EPA, 2003b).  The
procedures for determining the RRFs are similar to those in EPA's draft guidance for modeling
for an 8-hour ozone standard (EPA, 1999a).  Hourly model predictions were processed to

-------
determine daily maximum 8-hour concentrations for each grid cell for each non-ramp-up day
modeled.  The RRF for a monitoring site was determined by first calculating the multi-day mean
of the 8-hour daily maximum predictions in the nine grid cells surrounding the site using only
those predictions greater than or equal to 70 ppb, as recommended in the guidance. This
calculation was performed for the base year 2001 scenario and each of the future-year scenarios.
The RRF for a site is the ratio of the mean prediction in the future-year scenario to the mean
prediction in the base year scenario.  RRFs were calculated on a site-by-site basis. The future-
year design value projections were then calculated by county, based on the highest resultant
design values for a site within that county from the RRF application. The  current 8-hour county
maximum ozone design values and future year base and control attainment status is provided in
Appendix B. County populations are also included in this appendix.
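       For illustration, the ratio-of-means form of the RRF and its application to a current design
value can be sketched as follows.  This is only a schematic of the procedure described above and
is not the code used to produce the projections in Appendix B; in particular, taking the highest
of the nine surrounding grid cells and applying the 70 ppb screen to the base year values are
simplifying assumptions.

   # day_max_base[d], day_max_future[d]: for one monitor, the 8-hour daily maximum
   # prediction taken from the nine grid cells surrounding the site on day d, for the
   # 2001 base case and a future-year scenario respectively.

   def rrf(day_max_base, day_max_future, cutoff=70.0):
       """Ratio of the multi-day mean future prediction to the mean base prediction,
       using only days whose base prediction is at least the cutoff (70 ppb)."""
       base_kept, future_kept = [], []
       for b, f in zip(day_max_base, day_max_future):
           if b >= cutoff:
               base_kept.append(b)
               future_kept.append(f)
       return sum(future_kept) / sum(base_kept)   # same as mean(future)/mean(base)

   def projected_design_value(current_dv, day_max_base, day_max_future):
       """Future design value = current (1999-2001) design value x RRF for the site."""
       return current_dv * rrf(day_max_base, day_max_future)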

b.) West

       Western U.S. ozone episodes were not modeled as part of the Clear Skies analysis because
all of the predicted future-year ozone nonattainment areas in the West are in California4.
Clear Skies is predicted to reduce NOX emissions in California by less than 0.2% in both 2010
and 2020 (~1,300 tons per year).  Therefore, it was assumed that Clear Skies would not affect the
attainment status of any counties in California. However, estimated future-year nonattainment
county counts for the West were still needed to portray the estimated nationwide 8-hour ozone
nonattainment problem.  The modeling results from the recently completed Nonroad Land-Based
Diesel Engine (NLDE) proposed rulemaking were used to develop 2010 and 2020 base case
nonattainment county estimates for the West (EPA, 2003c).  Western episodic ozone modeling
was completed as part of the NLDE modeling. The future-year model runs were completed for
2020 and 2030. For the Clear Skies analysis, the 2020 Nonroad modeling Western county
counts were used directly.  The 2010 Clear Skies estimates were derived by linearly interpolating
between the 2020 Nonroad values and the 1999-2001 ambient design values.
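       As a simple illustration of that interpolation: if a county's 1999-2001 ambient design value
is anchored at 2001 (the anchor year is an assumption made here for illustration only) and its
2020 Nonroad modeling value is known, the 2010 estimate falls proportionally between the two.

   # Hypothetical illustration of the linear interpolation; the 2001 anchor year is an
   # assumption, not a statement of the procedure actually used.
   def interpolate_2010_dv(dv_2001, dv_2020_nonroad):
       frac = (2010 - 2001) / (2020 - 2001)
       return dv_2001 + frac * (dv_2020_nonroad - dv_2001)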

2. Ozone Nonattainment Summary

       As shown in Table III-8,  the modeling projects that 59 counties across the country with a
population of 45.4 million people will have design values greater than the  8-hour NAAQS in
2010 without Clear Skies controls. By 2020 that number is expected to fall to 30 counties with a
population of 39.5 million people as a result of projected emissions reductions from existing
control programs.

       Clear Skies emissions reductions are predicted to bring 5 counties into attainment in 2010
and cause 2 counties (Mecklenburg County, NC and Henry County, GA) to go out of attainment5
in 2010.  The net reduction of 3 counties leaves 56 counties with a population of 44.5 million
people in nonattainment for the 8-hour ozone standard in 2010 after Clear Skies controls. In 2020
Clear Skies is expected to bring 3 counties into attainment. That leaves 27 counties with a
population of 33.5 million people in nonattainment for the 8-hour ozone standard in 2020 after
Clear Skies controls. Appendix C contains maps of the base year and projected year 8-hour
ozone nonattainment counties.

       4 The only non-California county in the West that is currently not attaining the 8-hour standard is
Maricopa County, AZ, with a current (1999-2001) design value of 85 ppb. Through the interpolation methods
described above, the predicted 2010 design value for Maricopa County is 82 ppb. Therefore, all of the
predicted 2010 and 2020 ozone nonattainment counties in the West are in California.

       5 It might seem unusual that there is an apparent predicted increase in ozone as a result of Clear
Skies controls in several counties. When low-level NOX emissions are reduced, ozone can increase in
oxidant-limited areas. But that is not what is happening in this case.  The increase in ozone is caused by
local predicted NOX increases in the IPM model from certain power plants. These power plants were
predicted to be controlled under the NOX SIP call trading program (which is assumed in the 2010 Clear
Skies base case). Under the Clear Skies control case, the plants trade under a new Clear Skies trading
program which is year-round and expanded to additional states. The predicted emissions patterns from
IPM are slightly different under the two trading programs.  Therefore, some power plants that were
predicted to put on controls under the NOX SIP call may not be predicted to do so under Clear Skies (and
vice versa). It is important to note that the overall summer utility NOX emissions in the NOX SIP call area
are predicted to be lower under Clear Skies than under the NOX SIP call.  So overall, Clear Skies will
provide regional ozone benefits in the NOX SIP call area.
Table III-8. Lists of counties projected to violate the 8-hour NAAQS in 2010 and 2020 for the
Base Case and Clear Skies Control Case.
2010 Base (59 counties)
   California:     Orange, Kings, Sacramento, Merced, Ventura, El Dorado, Tulare, Los Angeles,
                   Fresno, Kern, Riverside, San Bernardino
   Connecticut:    Fairfield, New Haven, Middlesex
   D.C.:           District of Columbia
   Delaware:       New Castle
   Georgia:        DeKalb, Rockdale, Fulton
   Indiana:        Lake
   Maryland:       Baltimore, Prince George's, Kent, Anne Arundel, Harford, Cecil
   New Jersey:     Hudson, Monmouth, Cumberland, Morris, Hunterdon, Gloucester, Camden,
                   Middlesex, Mercer, Ocean
   New York:       Erie, Westchester, Richmond
   Pennsylvania:   Delaware, Lancaster, Lehigh, Northampton, Montgomery, Bucks
   Rhode Island:   Kent
   Tennessee:      Shelby
   Texas:          Tarrant, Galveston, Collin, Denton, Harris
   Virginia:       Caroline, Fauquier
   Wisconsin:      Door, Kenosha, Ozaukee, Sheboygan

2010 Control (56 counties)
   California:     Orange, Kings, Sacramento, Merced, Ventura, El Dorado, Tulare, Los Angeles,
                   Fresno, Kern, Riverside, San Bernardino
   Connecticut:    Fairfield, New Haven, Middlesex
   D.C.:           District of Columbia
   Delaware:       New Castle
   Georgia:        DeKalb, Rockdale, Fulton, Henry
   Maryland:       Baltimore, Prince George's, Kent, Anne Arundel, Harford, Cecil
   New Jersey:     Hudson, Monmouth, Morris, Hunterdon, Gloucester, Camden, Middlesex,
                   Mercer, Ocean
   New York:       Westchester, Richmond
   North Carolina: Mecklenburg
   Pennsylvania:   Delaware, Lancaster, Lehigh, Northampton, Montgomery, Bucks
   Rhode Island:   Kent
   Tennessee:      Shelby
   Texas:          Tarrant, Galveston, Collin, Denton, Harris
   Virginia:       Fauquier
   Wisconsin:      Kenosha, Ozaukee, Sheboygan

2020 Base (30 counties)
   California:     Orange, Ventura, Los Angeles, Fresno, Kern, Riverside, San Bernardino
   Connecticut:    Fairfield, New Haven, Middlesex
   Illinois:       Cook
   Indiana:        Lake
   Maryland:       Harford
   Michigan:       Macomb, Wayne
   New Jersey:     Hudson, Hunterdon, Gloucester, Camden, Middlesex, Mercer, Ocean
   New York:       Bronx, Westchester, Richmond
   Pennsylvania:   Montgomery, Bucks
   Texas:          Galveston, Harris
   Wisconsin:      Kenosha

2020 Control (27 counties)
   California:     Orange, Ventura, Los Angeles, Fresno, Kern, Riverside, San Bernardino
   Connecticut:    Fairfield, New Haven, Middlesex
   Michigan:       Macomb, Wayne
   New Jersey:     Hudson, Hunterdon, Gloucester, Camden, Middlesex, Mercer, Ocean
   New York:       Bronx, Westchester, Richmond
   Pennsylvania:   Montgomery, Bucks
   Texas:          Galveston, Harris
   Wisconsin:      Kenosha

-------
IV. Particulate Matter, Visibility, and Deposition Modeling over the
Continental U.S.

A.  REMSAD Model Description

       The REgional Modeling System for Aerosols and Deposition (REMSAD) Version 7.06
(ICF Kaiser, 2002) model was used as the tool for simulating base year and future concentrations
of PM, visibility, and deposition in support of the Clear Skies Act air quality assessments. Model
runs were made for the 1996 and 2001 base years as well as for the 2010 and 2020 base and
control scenarios. As described below, each of these emissions scenarios was simulated using
1996 meteorological data in order to provide the PM2.5 concentrations needed for the
nonattainment county analysis and annual mean PM concentrations, nitrogen, sulfur, and
mercury deposition, and estimates of visibility needed for benefits calculations.

       The basis for REMSAD is the atmospheric diffusion equation (also called the species
continuity or advection/diffusion equation).  This equation represents a mass balance in which all
of the relevant emissions, transport, diffusion, chemical reactions, and removal processes are
expressed in mathematical terms. REMSAD employs finite-difference numerical techniques for
the  solution of the advection/diffusion equation.

       REMSAD was run using a latitude/longitude horizontal grid structure in which the
horizontal grids are generally divided into areas of equal latitude and longitude. The vertical
layer structure of REMSAD is defined in terms of sigma-pressure coordinates.  The top and
bottom of the domain are defined as 0 and 1 respectively. The vertical layers are defined as a
percent of the atmospheric pressure between the top and bottom of the domain.  For example, a
vertical layer of 0.50 sigma is exactly halfway between the top and bottom of the domain as
defined by the local atmospheric pressure. Usually, the vertical layers are defined to match the
vertical layer structure of the meteorological model used to generate the REMSAD
meteorological inputs.
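       As a worked illustration of the sigma-pressure definition (using the 100 mb model top that
appears in Table IV-2 and an example surface pressure of 1000 mb):

   # Convert a sigma level to pressure, where sigma = 1 at the surface and 0 at the
   # model top.  The 100 mb top matches Table IV-2; 1000 mb is an example surface value.
   def sigma_to_pressure(sigma, p_surface=1000.0, p_top=100.0):
       return p_top + sigma * (p_surface - p_top)

   # Example: the 0.50 sigma level lies at 100 + 0.5 * (1000 - 100) = 550 mb, i.e.,
   # halfway through the pressure depth of the domain.
   print(sigma_to_pressure(0.50))   # 550.0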

1. Gas Phase Chemistry

       REMSAD simulates gas phase chemistry using a reduced-form version of the Carbon Bond 4
(CB4) chemical mechanism, termed "micro-CB4" (mCB4), which treats fewer VOC species
compared to the full CB4 mechanism. The inorganic and radical parts of the reduced mechanism
are  identical to CB4. In this version of mCB4 the organic portion is based on three primary
species (VOC, ISOP, and TERP) and one primary and secondary carbonyl species (CARB).
The VOC species was incorporated with kinetics representing an average anthropogenic
hydrocarbon species.  The other two primary VOC species represent biogenic emissions of
isoprene and terpenes and are included with kinetic characteristics representing isoprene and
terpenes respectively. The intent of the  mCB4 mechanism is to (a) provide a physically faithful
representation of the linkages between emissions of ozone precursor species and secondary PM
precursors species, (b) treat the oxidizing capacity of the troposphere, represented primarily by
the  concentrations of radicals and hydrogen peroxide, and (c) simulate the rate of oxidation of
the  nitrogen oxide (NOX) and sulfur dioxide (SO2) PM precursors. Box model testing of mCB4
has found that it performs very closely to the full CB4 that is contained in UAM-V (Whitten,


-------
1999).

       REMSAD version 7.06 includes several updates to the mCB4 mechanism relative to
earlier versions of REMSAD. A new treatment for the NO3 and N2O5 species has been
implemented which results in improved agreement with rigorous solvers such as Gear and
eliminates nitrogen mass inconsistencies.  Also, several additional reactions have been added to
the mCB4 mechanism which may be important for regional  scale and annual applications where
wide ranges in temperature, pressure, and concentrations may be encountered. The reactions are
OH + H, OH + NO3, and HO2 + NO3. For the same reason three reactions involving peroxy
nitric acid (PNA), which were included in the original CB4 mechanism, were added to mCB4.

2. PM Chemistry

       Primary PM emissions in REMSAD are treated as inert species.  They are advected and
deposited without any chemical interaction with other species.  Secondary PM species, such as
sulfate and nitrate, are formed through chemical reactions within the model. SO2 is the gas phase
precursor for particulate sulfate, while nitric acid is the gas phase precursor for particulate
nitrate.  Several other gas phase species are also involved in the secondary reactions.

       There are two pathways for sulfate formation: gas phase and aqueous phase.  Aqueous
phase reactions take place within clouds, rain, and/or fog. In-cloud processes can account for the
majority of atmospheric sulfate formation in many areas.  In REMSAD, aqueous SO2 reacts with
hydrogen peroxide (H2O2), ozone (O3), and/or oxygen (O2) to form aerosol sulfate.  REMSAD
version 7 has been upgraded to include all three aqueous phase sulfate reactions. Previous
versions only contained the hydrogen peroxide reaction.  The rate of the aqueous phase reactions
depends on the concentrations of the chemical reactants as well as cloud water content. SO2 also
reacts with OH radicals in the gas phase to form aerosol sulfate. The aqueous phase and gas
phase sulfate are added together to get the total sulfate concentration.

       An equilibrium algorithm is used to calculate particulate nitrate concentrations.
REMSAD version 7.06 uses the MARS-A equilibrium algorithm (Saxena et al., 1986; Kim
et al., 1993). In REMSAD, particulate nitrate is calculated in an equilibrium reaction between
nitric acid, sulfate, and ammonia. Nitric acid is a product of gas phase chemistry and is formed
through the mCB4 reactions. The acids are neutralized by ammonia, with sulfate reacting more
quickly than nitric acid. An equilibrium is established between ammonium sulfate and
ammonium nitrate which strongly favors ammonium sulfate. If the available ammonia exceeds
twice the available sulfate, then particulate nitrate is allowed to form as ammonium nitrate.
Nitrate is then partitioned between particulate nitrate and gas phase nitric acid.  The partitioning
of nitrate depends on the availability of ammonia as well as meteorological factors such as
temperature and relative humidity.
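       A highly simplified sketch of the ammonia screen described above is given below (molar
units assumed).  The actual MARS-A algorithm solves the full equilibrium; this only illustrates
the qualitative rule that ammonia first neutralizes sulfate and that ammonium nitrate can form
only when free ammonia remains.

   # Illustrative only; not the MARS-A equilibrium solution.  Inputs are molar
   # concentrations of total ammonia and particulate sulfate.
   def ammonium_nitrate_allowed(nh3_total, sulfate):
       """Ammonium nitrate may form only if ammonia exceeds twice the sulfate, i.e.,
       ammonia remains after sulfate is fully neutralized as (NH4)2SO4."""
       return nh3_total > 2.0 * sulfate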

       An  additional update to the REMSAD 7.06 code affects the dry deposition velocity of all
gas phase species and in particular ammonia.  Several assumptions contained in the REMSAD
dry deposition code were removed. In previous versions of REMSAD, the surface resistance
(Rc) for ammonia gas was set equal to 30 s/m at all times for the landuse categories of
agriculture, range, and mixed agriculture and range. Additionally, for the landuse types of
deciduous forest, coniferous forest, and mixed forest, the ammonia surface resistance was set


-------
equal to the stomatal resistance only.  Both of these assumptions were removed from the code.
The current version more closely follows the original work of Wesely (Wesely, 1989).

       Organic aerosols can contribute a significant amount to the PM in the atmosphere.
Primary organic aerosols (POA) are treated as a directly emitted species in REMSAD. In
REMSAD version 7, a calculation of the production of secondary organic aerosols (SOA) due to
atmospheric chemistry processes has been added6.  A peer review of the REMSAD model
(Seigneur et al., 1999) recommended an SOA module based on the equilibrium approach of
Pankow (Odum et al., 1997; Griffin et al., 1999). The implementation of the SOA treatment in
version 7 of REMSAD follows the recommendation of the peer review. This includes SOA
formation from anthropogenic and biogenic organic precursors.  For both anthropogenic and
biogenic organics, REMSAD includes gas phase secondary organic species and the
corresponding aerosol phase species.
B.  REMSAD Modeling Domain

       The REMSAD domain used for the Clear Skies modeling is shown in Figure IV-1.  The
geographic characteristics of the domain are as follows:

120 (E-W) X 84 (N-S) grid cells
Cell size (~36 km):
       1/2 degree longitude (0.5)
       1/3 degree latitude (0.3333)
E-W range: 66 degrees W - 126 degrees W
N-S range: 24 degrees N - 52 degrees N
Vertical extent: Ground to 16,200 meters (100 mb) with 12 layers
Figure IV-1. REMSAD Modeling Domain.
       6An error was found in the SOA mechanism of REMSAD v7.01. This has been corrected in the
current version (7.06). The reference temperature from the literature to calculate the partitioning
coefficient (K) was assumed to be 298K when it should have been ~308K.


-------
C.  REMSAD Inputs
       Input data for REMSAD can be classified into six categories:  (1) simulation control, (2)
emissions, (3) initial and boundary concentrations, (4) meteorological, (5) surface
characteristics, and (6) chemical rates.  The REMSAD predictions of pollutant concentrations are
calculated from the emissions, advection, and dispersion processes coupled with the formation
and deposition of secondary PM species within every grid cell of the modeling domain. To
adequately replicate the full three-dimensional structure of the atmosphere, the REMSAD
program uses hourly input data for a number of variables.  Table IV-1 lists the required
REMSAD input files.

Table IV-1.  List of REMSAD input files.
Data type                    Files          Description
Control                      CONTROL        Simulation control information
Emissions                    PT SOURCE      Elevated source emissions
                             EMISSIONS      Surface emissions
Initial and boundary         AIRQUALITY     Initial concentrations
concentrations               BOUNDARY       Lateral boundary concentrations
Meteorological               WIND           X,Y-components of winds
                             TEMPERATURE    3D array of temperature
                             PSURF          2D array of surface pressure
                             H2O            3D array of water vapor
                             VDIFFUSION     3D array of vertical turbulent diffusivity coefficients
                             RAIN           3D array of cloud water mixing ratio
                                            3D array of rain water mixing ratio
                                            2D array of rainfall rates
Surface characteristics      SURFACE        Gridded land use
                             TERRAIN        Terrain heights
Chemical rates               CHEMPARAM      Chemical reaction rates
                             RATES          Photolysis rates file

-------
1. Meteorological Data

       REMSAD requires input of winds (u- and v-vector wind components), temperatures,
surface pressure, specific humidity, vertical diffusion coefficients, and rainfall rates. The
meteorological input files were developed from a 1996 annual MM5 model run that was
developed for previous projects.  MM5 is the Fifth-Generation NCAR / Penn State Mesoscale
Model. MM5 (Grell et al., 1994) is a numerical meteorological model that solves the full set of
physical and thermodynamic equations which govern atmospheric motions. MM5 was run in a
nested-grid mode with two levels of resolution (108 km and 36 km) with 23 vertical sigma
layers extending from the surface to the 100 mb pressure level. The model was simulated in five-
day segments with an eight-hour ramp-up period. The MM5 runs were started at 00Z, which is
7:00 p.m. EST.  The first eight hours of each five day period were removed before being input
into REMSAD. Figure IV-2 shows the MM5 and REMSAD 36km  domain superimposed on
each other. Table IV-2 lists the vertical grid structures for the MM5 and REMSAD domains.
Further detailed information concerning the development and evaluation of the 1996 MM5
datasets can be found in (Olerud, 2000).
Figure IV-2. MM5 36km Domain (solid box) and REMSAD Domain (dashed lines).

-------
Table IV-2. Vertical Grid Structure for 1996 MM5 and Clear Skies REMSAD Domains.  Layer
heights represent the top of each layer.  The first layer is from the ground up to 38 meters.
   REMSAD Layer   MM5 Layer   Sigma    Approximate Height (m)   Pressure (mb)
        0             0       1.000            0.0                 1000.0
        1             1       0.995           38.0                  995.5
        2             2       0.988           91.5                  989.2
                      3       0.980          152.9                  982.0
        3             4       0.970          230.3                  973.0
                      5       0.956          339.5                  960.4
        4             6       0.938          481.6                  944.2
                      7       0.916          658.1                  924.4
        5             8       0.893          845.8                  903.7
                      9       0.868         1053.9                  881.2
        6            10       0.839         1300.7                  855.1
                     11       0.808         1571.4                  827.2
        7            12       0.777         1849.6                  799.3
                     13       0.744         2154.5                  769.6
        8            14       0.702         2556.6                  731.8
                     15       0.648         3099.0                  683.2
        9            16       0.582         3805.8                  623.8
                     17       0.500         4763.7                  550.0
       10            18       0.400         6082.5                  460.0
                     19       0.300         7627.9                  370.0
       11            20       0.200         9510.5                  280.0
                     21       0.120        11465.1                  208.0
                     22       0.052        13750.2                  146.0
       12            23       0.000        16262.4                  100.0
The physical options selected for this configuration of MM5 include the following:
1.  One-way nested grids
2.  Nonhydrostatic dynamics
3.  Four-dimensional data assimilation (FDDA):
       •      Analysis nudging of wind, temperature, and mixing ratios
       •      Nudging coefficients range from 1.0 x 10   s  to 3.0x10  s
4.   Explicit moisture treatment:
       •      3-D predictions of cloud and precipitation fields

-------
       •      Simple ice microphysics (summer) and Mixed ice microphysics (winter)
       •      Cloud effects on surface radiation
       •      Moist vertical diffusion in clouds
       •      Normal evaporative cooling
5.  Boundary conditions:
       •      Time and inflow/outflow relaxation
6.  Cumulus cloud parameterization schemes:
       •      Anthes-Kuo (108-km grid)
       •      Kain-Fritsch (36-km grid)
7.  No shallow convection
8.  Full 3-dimensional Coriolis force
9.  Drag coefficients vary with stability
10. Vertical mixing of momentum in mixed layer
11. Virtual temperature effects
12. PBL process parameterization: MRF scheme
13. Surface layer parameterization:
       •      Fluxes of momentum, sensible and latent heat
       •      Ground temperature prediction using energy balance equation
       •      24 land use categories
14. Atmospheric radiation schemes:
       •      Simple cooling
       •      Long- and short-wave radiation scheme
15. Sea ice treatment:
       •      Forced Great Lakes/Hudson Bay to permanent ice under very cold conditions
       •      36-km treatment keyed by observations of sea ice over the Great Lakes
16. Snow cover:
       •      Assumed no snow cover for July and August
       •      National Centers for Environmental Prediction (NCEP) snow cover for January to
             June, and for September to December

       The MM5 model output cannot be directly input into REMSAD due to differences in the
grid coordinate systems and file formats. A postprocessor called MM5-REMSAD was
developed to convert the MM5 data into REMSAD format. This postprocessor was used to
develop hourly average meteorological input files from the MM5 output. Documentation of the
MM5-REMSAD code and further details on the development of the input files are contained in
(Mansell, 2000).

2.  Initial and Boundary Conditions, and Surface Characteristics

       Application of the REMSAD modeling system requires data files specifying the initial
species concentration fields (AIRQUALITY) and lateral  species concentrations (BOUNDARY).
Due to the extent of the proposed modeling domains and the regional-scale nature of the
REMSAD model, these inputs  were developed based on "clean" background concentration
values. The Clear Skies modeling used temporally and spatially (horizontal) invariant data for
both initial and boundary conditions.  Species concentration values were allowed to decay

-------
vertically for most species.  Table IV-3 summarizes the initial and boundary conditions used in
the Clear Skies REMSAD modeling.
Table IV-3. REMSAD Initial and Boundary Conditions (ppm)

Species    Layers 1-4   Layers 5-6   Layers 7-8   Layer 9      Layer 10     Layer 11     Layer 12
NO         1.00E-12     1.00E-12     1.00E-12     8.44E-13     5.15E-13     1.72E-13     1.72E-13
NO2        1.00E-04     1.00E-04     1.00E-04     8.44E-05     5.15E-05     1.72E-05     1.72E-05
O3         3.50E-02     4.00E-02     5.00E-02     6.00E-02     6.00E-02     6.00E-02     7.00E-02
CO         8.00E-02     8.00E-02     8.00E-02     8.00E-02     8.00E-02     8.00E-02     8.00E-02
SO2        3.00E-04     3.00E-04     3.00E-04     2.53E-04     1.55E-04     5.15E-05     5.15E-05
NH3        1.00E-04     1.00E-04     1.00E-04     7.12E-05     2.66E-05     2.95E-06     2.95E-06
VOC        2.00E-02     2.00E-02     2.00E-02     1.69E-02     1.03E-02     3.44E-03     3.44E-03
CARB       1.00E-07     1.00E-07     1.00E-07     1.00E-07     1.00E-07     1.00E-07     1.00E-07
ISOP       1.00E-09     1.00E-09     1.00E-09     1.00E-09     1.00E-09     1.00E-09     1.00E-09
HNO3       1.00E-05     1.00E-05     1.00E-05     8.44E-06     5.15E-06     1.72E-06     1.72E-06
PNO3       1.00E-05     1.00E-05     1.00E-05     7.12E-06     2.66E-06     2.95E-07     2.95E-07
HGO        1.95E-07     1.95E-07     1.95E-07     1.39E-07     5.18E-08     5.76E-09     5.76E-09
HG2G       9.75E-09     9.75E-09     9.75E-09     6.94E-09     1.59E-09     2.88E-10     2.88E-10
GSO4       1.00E-05     1.00E-05     1.00E-05     7.12E-06     2.66E-06     2.95E-07     2.95E-07
ASO4       1.00E-12     1.00E-12     1.00E-12     7.12E-13     2.66E-13     2.95E-14     2.95E-14
NH4N       1.00E-05     1.00E-05     1.00E-05     7.12E-06     2.66E-06     2.95E-07     2.95E-07
NH4S       1.00E-05     1.00E-05     1.00E-05     7.12E-06     2.66E-06     2.95E-07     2.95E-07
SOA        1.00E-03     1.00E-03     1.00E-03     7.12E-04     2.66E-04     2.95E-05     2.95E-05
POA        1.00E-03     1.00E-03     1.00E-03     7.12E-04     2.66E-04     2.95E-05     2.95E-05
PEC        1.00E-03     1.00E-03     1.00E-03     7.12E-04     2.66E-04     2.95E-05     2.95E-05
PMFINE     1.00E-03     1.00E-03     1.00E-03     7.12E-04     2.66E-04     2.95E-05     2.95E-05
PMCOARS    1.00E-03     1.00E-03     1.00E-03     6.00E-04     1.37E-04     5.07E-06     5.07E-06
HG2P       1.22E-09     1.22E-09     1.22E-09     8.54E-10     3.19E-10     3.54E-11     3.54E-11
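
For illustration only (this is not the REMSAD preprocessor), the sketch below shows how a
layer profile such as those in Table IV-3 can be expanded into a temporally and horizontally
invariant initial-condition field. The grid dimensions are arbitrary, and only two species from
the table are shown.

    # Sketch: expand clean-background layer profiles onto a 3-D grid.
    import numpy as np

    NLAYERS, NY, NX = 12, 60, 90   # assumed grid shape for the example

    # Layer profiles (ppm) taken from Table IV-3
    PROFILES = {
        "O3":  [0.035]*4 + [0.040]*2 + [0.050]*2 + [0.060]*3 + [0.070],
        "SO2": [3.00e-4]*8 + [2.53e-4, 1.55e-4, 5.15e-5, 5.15e-5],
    }

    def make_airquality_field(profile):
        """Replicate a layer profile over the horizontal grid."""
        prof = np.asarray(profile).reshape(NLAYERS, 1, 1)
        return np.broadcast_to(prof, (NLAYERS, NY, NX)).copy()

    o3_init = make_airquality_field(PROFILES["O3"])
    print(o3_init[0].mean(), o3_init[11].mean())   # 0.035 near the surface, 0.070 aloft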
       Application of the REMSAD model requires specification of gridded terrain elevations
(TERRAIN) and landuse characteristics (SURFACE). The SURFACE data files provide the
fraction of each of the 11 landuse categories recognized by REMSAD in each grid cell.  Landuse
characteristics are used in the model for the calculation of deposition parameters. For this task, a
landuse/terrain processor, PROC_LUTERR, was developed based on the MM5 TERRAIN
preprocessor. Landuse data were obtained from the USGS global 30-second vegetation database,
which is the same database used in the 1996 MM5 model runs. This dataset provides 24
landuse categories, including urban.  For the REMSAD application, the 10 min. (1/6 deg.)
dataset was used.  The processor remapped the 24 USGS vegetation categories to those
required for application of REMSAD.  It also aggregated the 10 min. resolution data to the ~36
km horizontal resolution used for this REMSAD application.
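
The following Python sketch illustrates, in simplified form, the two operations described above
for a landuse processor such as PROC_LUTERR: remapping vegetation categories and
aggregating fine-grid cells into coarse-grid class fractions. The category mapping shown is
hypothetical (not the actual USGS-to-REMSAD lookup table), and the grid sizes are arbitrary.

    # Sketch: remap USGS categories and compute coarse-cell landuse fractions.
    import numpy as np

    USGS_TO_REMSAD = {1: 1, 2: 2, 3: 2, 11: 4, 16: 7, 24: 11}   # hypothetical subset

    def remsad_fractions(usgs_grid, block=6, ncat=11):
        """Aggregate a fine USGS category grid into per-cell REMSAD class fractions."""
        ny, nx = usgs_grid.shape
        fractions = np.zeros((ncat, ny // block, nx // block))
        for j in range(ny // block):
            for i in range(nx // block):
                tile = usgs_grid[j*block:(j+1)*block, i*block:(i+1)*block]
                for code in np.unique(tile):
                    cat = USGS_TO_REMSAD.get(int(code), 11)   # default to "other"
                    fractions[cat - 1, j, i] += np.mean(tile == code)
        return fractions

    fine = np.random.choice(list(USGS_TO_REMSAD), size=(36, 36))
    frac = remsad_fractions(fine)
    print(frac.sum(axis=0).round(3))   # each coarse cell's fractions sum to 1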

       For the TERRAIN input data files, a similar global terrain elevation dataset is also
available from NCAR and was used for this task.  While it is possible to use the terrain
elevations obtained from the MM5 model output data files, it was deemed more appropriate to
                                          24

-------
begin with the USGS 10 min. resolution database due to the various map projections and
interpolations involved in developing the required data files for the geodetic coordinates used in
REMSAD.  However, because this application of REMSAD requires zero terrain
elevations, "dummy" terrain files (with all zeroes) were developed and provided as input to
REMSAD.

D.  Model Performance Evaluation

       The goal of the 1996 Base Year modeling was to reproduce the atmospheric processes
resulting in the formation and dispersion of fine particulate matter across the U.S. An operational
model performance evaluation for PM2.5 and its related speciated components (e.g., sulfate,
nitrate, elemental carbon, etc.) for 1996 was performed in order to estimate the ability of the
modeling system to replicate Base Year concentrations.

       This evaluation consists principally of statistical assessments of model versus
observed pairs. The robustness of any evaluation is directly proportional to the amount and
quality of the ambient data available for comparison. Unfortunately, there are few PM2.5
monitoring networks with available data for evaluation of the Clear Skies PM modeling. Critical
limitations of the 1996 databases are a lack of urban monitoring sites with speciated
measurements and poor geographic representation of ambient concentrations in the East. PM2.5
monitoring networks were expanded in 1999 to include more than 1000 Federal Reference
Method (FRM) monitoring sites. The purpose of this network is to monitor PM2.5 mass levels in
urban areas. These monitors measure only total PM2.5 mass and do not measure PM species. In
2001, a new network of ~300 urban-oriented speciation monitoring sites began operation across
the country. These monitors collect the full range of PM2.5 species that are necessary to evaluate
models and to develop PM2.5 control strategies. Future modeling efforts will be able to take
advantage of these newer speciated PM2.5 measurements.

       The evaluation used data from the IMPROVE, CASTNet dry deposition, and NADP
monitoring networks (IMPROVE, 2000; EPA, 2002; NADP, 2003). The IMPROVE and
NADP networks were in full operation during 1996.  The CASTNet dry deposition network was
partially shut down during the first half of the year.  There were 65 CASTNet sites with at least
one season of complete data, and 16 sites with complete annual data. The
CASTNet visibility network was also partially operating in 1996. Data from the 7 visibility sites
is only complete from September through December, which provides a single season (fall) of
complete data. Therefore, the limited data from these sites was not used in the evaluation.  The
Mercury Deposition Network (MDN) was in its first year of operation in 1996. There was not
adequate data to fully evaluate the wet deposition of total mercury.

       The largest available ambient database for 1996 comes from the Interagency Monitoring
of PROtected Visual Environments (IMPROVE) network. IMPROVE is a cooperative visibility
monitoring  effort between EPA, federal land management agencies, and state air agencies. Data
is collected at Class I areas across the United States mostly at National Parks, National
Wilderness  Areas, and other protected pristine areas. There were approximately 60 IMPROVE
sites that had complete annual PM2.5 mass and/or PM2.5 species data for 1996.  Forty-two sites
                                          25

-------
were in the West7 and 18 sites were in the East. Figure IV-3 shows the locations of the
IMPROVE monitoring sites used in this evaluation. IMPROVE data is collected twice weekly
(Wednesday and Saturday). Thus, there is a total of 104 possible samples per year, or 26 samples
per season. For this analysis, a 50% completeness criterion was used8.  That is, in order to be
counted in the statistics, a site had to have > 50% complete data in all 4 seasons. If any season
was missing, an annual average was not calculated for the site. See Appendix D for a list of the
IMPROVE sites used in the evaluation. The observed IMPROVE data used for the performance
evaluation was PM2.5 mass, sulfate ion, nitrate ion, elemental carbon, organic aerosols, and
crustal material (soils). The REMSAD model output species were postprocessed in order to
achieve compatibility with the observation species. The following is the translation of
REMSAD output species into PM2.5 and related species:

Sulfate Ion:                TSO4 = ASO4 + GSO4
Nitrate Ion:                PNO3
Organic aerosols:           TOA = 1.167*POA + SOA1 + SOA2 + SOA3 + SOA4
Elemental Carbon:           PEC
Crustal Material (soils):   PMFINE
PM2.5:                      PM2.5 = PMFINE + ASO4 + GSO4 + NH4S + PNO3 + NH4N +
                                    1.167*POA + PEC + SOA1 + SOA2 + SOA3 + SOA4
where TSO4 is total sulfate ion, ASO4 is aqueous path sulfate, GSO4 is gaseous path sulfate,
NH4S is ammonium associated with sulfate, PNO3 is nitrate ion, NH4N is ammonium
associated with nitrate, TOA is total organic aerosols, POA is primary organic aerosol9, SOA1
and SOA2 are anthropogenic secondary organic aerosol, SOA3 and SOA4 are biogenic
secondary organic aerosol, PEC is primary elemental carbon, and PMFINE is primary fine
particles (other unspeciated primary PM2.5). PM2.5 is defined as the sum of the individual
species.
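
For illustration only (this is not code used in the analysis), the species translation above can be
sketched as follows; the input is assumed to be a dictionary of REMSAD species concentrations,
in µg/m3, for a single grid cell and time period, and the example values are arbitrary.

    # Sketch: combine REMSAD output species into the quantities compared with IMPROVE.
    def aggregate_remsad(c):
        """c: dict of REMSAD species concentrations (ug/m3)."""
        soa = c["SOA1"] + c["SOA2"] + c["SOA3"] + c["SOA4"]
        toa = 1.167 * c["POA"] + soa
        pm25 = (c["PMFINE"] + c["ASO4"] + c["GSO4"] + c["NH4S"]
                + c["PNO3"] + c["NH4N"] + 1.167 * c["POA"] + c["PEC"] + soa)
        return {"TSO4": c["ASO4"] + c["GSO4"], "NO3": c["PNO3"], "TOA": toa,
                "PEC": c["PEC"], "CRUSTAL": c["PMFINE"], "PM25": pm25}

    example = {"ASO4": 2.0, "GSO4": 0.5, "NH4S": 0.8, "PNO3": 0.4, "NH4N": 0.1,
               "POA": 1.2, "SOA1": 0.1, "SOA2": 0.1, "SOA3": 0.2, "SOA4": 0.2,
               "PEC": 0.3, "PMFINE": 0.6}
    print(aggregate_remsad(example)["PM25"])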
       7The dividing line between the West and East was defined as the 100th meridian.

       8The same completeness criterion was used for all of the monitoring networks.

       9For the performance evaluation and the calculation of PM2.5 mass, POA is multiplied by 1.167.
The IMPROVE organic carbon mass is multiplied by a 1.4 factor to account for additional mass attached
to the carbon (this follows standard IMPROVE procedures). In REMSAD, the "additional" mass is
already accounted for in the SOA predictions (by using a molecular weight of 160 g/mole). The POA
emissions have been multiplied by 1.2 prior to processing by the emissions model (the 1.2 factor is applied
to the organic carbon in the PM2.5 speciation profiles). The post-processed POA concentrations are then
multiplied by 1.167 to simulate an equivalent 1.4 factor (1.2 * 1.167 = 1.4).

                                          26

-------
Figure IV-3. Map of 1996 IMPROVE monitoring sites used in the REMSAD model
performance evaluation.
       Model performance was also calculated using data from the CASTNet dry deposition
monitoring network.  The sulfate and total nitrate data was used in the evaluation.  CASTNet
data is collected and reported as weekly average data. The data is collected in filter packs that
sample the ambient air continuously during the week. The sulfate data is of high quality since
sulfate is a very stable compound.  But the particulate nitrate concentration data collected by
CASTNet is subject to volatility due to the length of the sampling period. Therefore, we chose
not to use the CASTNet particulate nitrate data in this evaluation. CASTNet also reports a total
nitrate measurement.  This is the combined total of particulate nitrate and nitric acid. Since the
total nitrate measurement is not affected by the partitioning back and forth between particulate
nitrate and nitric acid, it should be a fairly accurate measurement.

       Wet deposition data from the National Atmospheric Deposition Program (NADP) was also
used in the model evaluation. There were a total of 160 NADP sites with complete annual data in
1996.  Model results were compared to observed values of ammonium, sulfate,  and nitrate wet
deposition.
                                           27

-------
1. Statistical Definitions

       Below are the definitions of the statistics used for the evaluation. The format of all the
statistics is such that negative values indicate model predictions that were less than their
observed counterparts.  Positive statistics indicate model overestimation of observed PM2.5. The
statistics were calculated for the entire REMSAD domain and separately for the East and the West.
The dividing line between East and West is the 100th meridian.

Mean Observation: The mean observed value (in µg/m3) averaged over all monitored days in
the year and then averaged over all sites in the region.

    \overline{OBS} = \frac{1}{N} \sum_{i=1}^{N} Obs_i
Mean REMSAD Prediction: The mean predicted value (in µg/m3) paired in time and space
with the observations and then averaged over all sites in the region.

    \overline{PRED} = \frac{1}{N} \sum_{i=1}^{N} Pred_i
Ratio of the Means: Ratio of the mean predicted value over the mean observed value.  A ratio
greater than 1 indicates an overprediction and a ratio less than 1 indicates an underprediction.

    RATIO = \frac{\overline{PRED}}{\overline{OBS}}
Mean Bias (µg/m3):  This performance statistic averages the difference (model - observed) over
all pairs in which the observed values were greater than zero. A mean bias of zero indicates that
the model overpredictions and model underpredictions exactly cancel each other out. Note that
the model bias is defined such that it is a positive quantity when the model prediction exceeds the
observation, and vice versa. This model performance estimate is used to make statements about
the absolute, or unnormalized, bias in the model simulation.

    BIAS = \frac{1}{N} \sum_{i=1}^{N} (Pred_i - Obs_i)

Mean Fractional Bias (percent): Normalized bias can become very large when a minimum
threshold is not used.  Therefore, fractional bias is used as a substitute.  The fractional bias for
cases with factors of 2 under- and over-prediction are -67 and +67 percent, respectively (as
opposed to -50 and +100 percent when using normalized bias, which is not presented here).
                                           28

-------
Fractional bias is a useful model performance indicator because it has the advantage of equally
weighting positive and negative bias estimates. The single largest disadvantage in this estimate
of model performance is that the estimated concentration (i.e., prediction, Pred) is found in both
the numerator and denominator.

    FBIAS = \frac{2}{N} \sum_{i=1}^{N} \frac{Pred_i - Obs_i}{Pred_i + Obs_i} \times 100
Mean Error (µg/m3): This performance statistic averages the absolute value of the difference
(model - observed) over all pairs in which the observed values were greater than zero.  It is
similar to the mean bias except that the absolute value of the difference is used, so the error is
always positive.

    ERR = \frac{1}{N} \sum_{i=1}^{N} |Pred_i - Obs_i|
Mean Fractional Error (percent): Normalized error can become very large when a minimum
threshold is not used. Therefore, fractional error is used as a substitute. It is similar to the
fractional bias except that the absolute value of the difference is used, so the error is always
positive.

    FERROR = \frac{2}{N} \sum_{i=1}^{N} \frac{|Pred_i - Obs_i|}{Pred_i + Obs_i} \times 100
Correlation Coefficient: This performance statistic measures the degree to which two variables
are linearly related.  A correlation coefficient of 1 indicates a perfect linear relationship,
whereas a correlation coefficient of 0 means that there is no linear relationship between the
variables.

    CORRCOEFF = \frac{\sum_{i=1}^{N} (Pred_i - \overline{PRED})(Obs_i - \overline{OBS})}
                      {\sqrt{\sum_{i=1}^{N} (Pred_i - \overline{PRED})^2 \; \sum_{i=1}^{N} (Obs_i - \overline{OBS})^2}}
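
As an illustration only (this is not the software used for the evaluation), the statistics defined
above can be computed for paired model and observed values as in the following sketch; the
array contents are arbitrary example numbers.

    # Sketch: evaluation statistics for paired predictions and observations (obs > 0).
    import numpy as np

    def evaluation_stats(pred, obs):
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        keep = obs > 0
        p, o = pred[keep], obs[keep]
        n = p.size
        return {
            "mean_obs": o.mean(),
            "mean_pred": p.mean(),
            "ratio_of_means": p.mean() / o.mean(),
            "bias": (p - o).mean(),
            "fractional_bias_pct": 100.0 * (2.0 / n) * np.sum((p - o) / (p + o)),
            "error": np.abs(p - o).mean(),
            "fractional_error_pct": 100.0 * (2.0 / n) * np.sum(np.abs(p - o) / (p + o)),
            "correlation": np.corrcoef(p, o)[0, 1],
        }

    print(evaluation_stats([4.2, 6.0, 10.5], [5.0, 6.5, 11.0]))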
                                           29

-------
2. Results of REMSAD Performance Evaluation

       The statistics described above are presented for the entire domain, the Eastern sites, and
the Western sites. The statistics were calculated in two different ways. The bias, error, and R2
statistics in the tables below were calculated for all days and all sites.  Observations and model
predictions were paired in time and space on a daily basis. These statistics represent the ability
of the model to replicate each day of the year with measurements.

       Following the statistical tables are scatterplots of seasonal and  annual average predictions
at each ambient data site.  These scatterplots represent the ability of the model to represent a
seasonal average or annual average measurement. The correlation coefficients for the
scatterplots represent the correlation  of the site average (seasonal and/or annual) predictions to
the site average measurements.
a.  IMPROVE Performance

       a. 1. PM2.5 Performance

       Table IV-4 lists the performance statistics for PM2.5 at the IMPROVE sites.  For the full
domain, PM2.5 is underpredicted by 18%. Overall, the performance of REMSAD (v7.06) has
improved from underpredicting PM2.5 by 34% in version 7.01.  The ratio of the means is 0.82,
with a bias of -1.10 µg/m3. It can be seen that most of this underprediction is due to the Western
sites. The West is underpredicted by 33% while the East is underpredicted by 2%. The
fractional bias is -9% in the East, while the fractional error is 46%. The fractional bias and error
in the West are -30% and 63%, respectively.  The observed PM2.5 concentrations in the East are
relatively high compared to the West. REMSAD displays an ability to differentiate between
generally high and low PM2.5 areas.
Table IV-4. Annual mean PM2.5 performance at IMPROVE sites.

            No. of   Mean REMSAD    Mean           Ratio of     Bias      Fractional   Error     Fractional   Correlation
            Sites    Predictions    Observations   Means        (µg/m3)   Bias (%)     (µg/m3)   Error (%)    Coefficient
                     (µg/m3)        (µg/m3)        (pred/obs)
 National    54       5.11           6.21           0.82        -1.10     -24.1        3.01       58.2        0.46
 East        15      10.93          11.15           0.98        -0.22      -8.9        4.99       46.1        0.39
 West        39       2.87           4.31           0.67        -1.44     -29.9        2.44       62.8        0.09
                                           30

-------
Figures IV-4 and IV-5 show the annual and seasonal average PM2.5 1996 IMPROVE
observations versus REMSAD predictions, respectively.  The annual and seasonal scatterplots
showed some scatter, but good agreement, with strong correlations (annual: R2 = 0.79; summer:
R2 = 0.69; fall: R2 = 0.62; spring: R2 = 0.60; and winter: R2 = 0.78).
Figure IV-4. Annual average PM2.5 1996 IMPROVE observations versus REMSAD
predictions.
Figure IV-5. Seasonal average PM2.5 1996 IMPROVE observations versus REMSAD
predictions.
                                          31

-------
       a. 2. Sulfate Performance

       Table IV-5 lists the performance statistics for particulate sulfate at the IMPROVE sites.
Domainwide, sulfate is underpredicted by 21%. The annual average sulfate underprediction is
12% in the East and 41% in the West.  The sulfate performance (especially in the East) is better
than that of most of the other PM2.5 species. The fractional error in the East is ~60% and the R2 is
0.51.
Table IV-5. Annual mean sulfate ion performance at IMPROVE sites.

            No. of   Mean REMSAD    Mean           Ratio of     Bias      Fractional   Error     Fractional   Correlation
            Sites    Predictions    Observations   Means        (µg/m3)   Bias (%)     (µg/m3)   Error (%)    Coefficient
                     (µg/m3)        (µg/m3)        (pred/obs)
 National    58       1.25           1.59           0.79        -0.34     -40.7        0.80       69.3        0.66
 East        16       3.47           3.93           0.88        -0.46     -29.8        1.80       60.2        0.51
 West        42       0.41           0.69           0.59        -0.29     -44.8        0.41       72.8        0.13
       Figures IV-6 and IV-7 show the annual and seasonal average sulfate 1996 IMPROVE
observations versus REMSAD predictions respectively. The scatterplots and linear regressions
displayed strong correlations (annual: R2 = 0.96; summer: R2 = 0.92; fall: R2 = 0.91; spring: R2 =
0.90; and winter: R2 = 0.86).

       Overall, the model shows an ability to replicate the annual and seasonal sulfate
concentrations.  This is particularly important for this application of REMSAD.  The Clear Skies
emissions controls mainly reduce SO2 and lead to large predicted sulfate reductions.  It is
important to have good model performance for the species that is being reduced the most.
                                           32

-------
Figure IV-6. Annual average sulfate 1996 IMPROVE observations versus REMSAD
predictions.
Figure IV-7. Seasonal average sulfate 1996 IMPROVE observations versus REMSAD
predictions.
                                          33

-------
       a. 3. Elemental Carbon Performance

       Table IV-6 lists the performance statistics for primary elemental carbon at the IMPROVE
sites.  Elemental carbon concentrations at IMPROVE sites are relatively low, but performance is
generally good. There is a domainwide underprediction of 14% and a western underprediction of
29%.
Table IV-6. Annual mean elemental carbon performance at IMPROVE sites.

            No. of   Mean REMSAD    Mean           Ratio of     Bias      Fractional   Error     Fractional   Correlation
            Sites    Predictions    Observations   Means        (µg/m3)   Bias (%)     (µg/m3)   Error (%)    Coefficient
                     (µg/m3)        (µg/m3)        (pred/obs)
 National    47       0.27           0.32           0.86        -0.05     -13.6        0.17       58.7        0.33
 East        15       0.49           0.48           1.01         0.01       1.78       0.20       41.7        0.47
 West        32       0.17           0.24           0.71        -0.07     -20.9        0.16       66.7        0.07
       Figures IV-8 and IV-9 show scatterplots of annual and seasonal average elemental carbon
1996 IMPROVE observations versus REMSAD predictions, respectively.  The annual scatterplot
and linear regression displayed some scatter but good agreement, with an R2 of 0.53.
Overall, summer and fall linear regressions had relatively good agreement (summer: R2 = 0.63;
fall: R2 = 0.62), whereas spring and winter had the weakest correlations (spring: R2 = 0.49; and
winter: R2 = 0.39).
Figure IV-8. Annual average elemental carbon 1996 IMPROVE observations versus REMSAD
predictions.
                                          34

-------
Figure IV-9. Seasonal average elemental carbon 1996 IMPROVE observations versus
REMSAD predictions.
a. 4. Organic Aerosol Performance

       Table IV-7 lists the performance statistics for organic aerosols at the IMPROVE sites.
Organic aerosol performance is generally good: the nationwide bias and errors are low, but the
correlation coefficient is also low. There is much uncertainty in the predictions of organic
carbon. Several different forms of organic carbon are predicted in the model: primary
organic carbon, secondary biogenic organic carbon, and secondary anthropogenic organic
carbon. Both the model and the ambient data contain a mix of these different types of
organics, which all originate from different sources. Unfortunately, given limitations in
measurement techniques, it is currently not possible to quantify the different types of organic
carbon in the ambient air.

       This latest version of REMSAD (7.06) contains science updates and code fixes that result
in predicted concentrations of secondary  organic carbon that are much higher than in previous
versions of REMSAD.  The model predictions for organics are tempered by the fact that
wildfires (a significant source of organic  carbon) are not included in the current modeling
inventory.  The performance for organics should be viewed relative to the uncertainties in the
measurements and the emissions inventories.
                                           35

-------
Table IV-7. Annual mean organic aerosol performance at IMPROVE sites.

            No. of   Mean REMSAD    Mean           Ratio of     Bias      Fractional   Error     Fractional   Correlation
            Sites    Predictions    Observations   Means        (µg/m3)   Bias (%)     (µg/m3)   Error (%)    Coefficient
                     (µg/m3)        (µg/m3)        (pred/obs)
 National    47       1.76           1.76           1.00         0.004     -5.58       1.13       62.0        0.18
 East        15       2.58           2.49           1.04         0.09     -11.83       1.42       54.7        0.21
 West        32       1.38           1.42           0.97        -0.04      -2.64       1.00       65.4        0.10
Annual and seasonal scatterplots (Figures IV-10 and IV-11) of average organic aerosol for 1996
IMPROVE observations versus REMSAD predictions displayed some scatter, with an annual R2
= 0.40 and seasonal correlations of: summer: R2 = 0.43; fall: R2 = 0.23; spring: R2 = 0.45; and
winter: R2 = 0.45.
Figure IV-10. Annual average organic aerosol 1996 IMPROVE observations versus REMSAD
predictions.
                                          36

-------
Figure IV-11. Seasonal average organic aerosol 1996 IMPROVE observations versus REMSAD
predictions.
a. 5. Nitrate Performance

       Table IV-8 lists the performance statistics for nitrate ion at the IMPROVE sites. Nitrate
is generally overpredicted in the East and underpredicted in the West. Nitrate is overpredicted
by 166% in the east and underpredicted by 31% in the west.  Domainwide there is an
overprediction of 55%.

Table IV-8. Annual mean nitrate ion performance at IMPROVE sites.

            No. of   Mean REMSAD    Mean           Ratio of     Bias      Fractional   Error     Fractional   Correlation
            Sites    Predictions    Observations   Means        (µg/m3)   Bias (%)     (µg/m3)   Error (%)    Coefficient
                     (µg/m3)        (µg/m3)        (pred/obs)
 National    48       0.61           0.39           1.55         0.21     -59.4        0.57      129.8        0.19
 East        15       1.47           0.55           2.66         0.91      13.0        1.11      109.3        0.29
 West        33       0.22           0.32           0.69        -0.10     -91.9        0.32      139.0        0.15
       Likewise, this overprediction is depicted in Figures IV-12 and IV-13, which show the
scatterplots of the annual (R2 = 0.37) and seasonal (summer: R2 = 0.24; fall: R2 = 0.17; spring:
R2 = 0.36; winter: R2 = 0.52) average nitrate ion for 1996 IMPROVE observations versus
REMSAD predictions.
                                          37

-------
       It is important to consider these results in the context that the observed nitrate
concentrations at the IMPROVE sites are very low. The mean nationwide observations are only
0.40 µg/m3.  It is often difficult for models to replicate very low concentrations of secondarily
formed pollutants.  Nitrate is generally  a small percentage of the measured PM25 at almost all of
the IMPROVE sites. Nonetheless, it has been recognized that the current generation of PM air
quality models generally overpredict particulate nitrate. There are numerous ongoing efforts to
improve particulate nitrate model performance through emissions inventory improvements
(ammonia emissions and dry deposition of gaseous precursors) and improvements in the
scientific formulations of the models.

       More recent ambient data has shown that nitrate can be an important contributor to PM2 5
in some urban areas (particularly in California and the upper Midwest) but performance for those
areas could not be assessed due to the lack of urban area speciated nitrate data for 1996.
Figure IV-12. Annual average nitrate ion 1996 IMPROVE observations versus REMSAD
predictions.
                                           38

-------
Figure IV-13. Seasonal average nitrate ion 1996 IMPROVE observations versus REMSAD
predictions.
a. 6. PMFINE-Other (crustal) Performance

       Table IV-9 lists the performance statistics for PMFINE-other, or primary crustal,
emissions.  The observations show crustal PM2.5 to be generally higher in the West than in the
East.  However, REMSAD predicts higher crustal concentrations in the East. Performance
statistics show an underprediction of 19% in the West, with an overprediction nationally of
~33%. The largest categories of PMFINE-other are fugitive dust sources such as paved roads,
unpaved roads, construction, and animal feed lots.

       There is a large uncertainty as to how emissions for such sources should be treated in
grid-based air quality models since a large fraction of the emissions either deposit or are
removed by vegetation within a few meters  of the source. Work is underway to develop
improved methods for estimating emissions from these sources for the purpose of air quality
modeling.
Table IV-9. Annual mean PMFINE (crustal) performance at IMPROVE sites.

            No. of   Mean REMSAD    Mean           Ratio of     Bias      Fractional   Error     Fractional   Correlation
            Sites    Predictions    Observations   Means        (µg/m3)   Bias (%)     (µg/m3)   Error (%)    Coefficient
                     (µg/m3)        (µg/m3)        (pred/obs)
 National    57       0.86           0.64           1.33         0.22      38.8        0.80       93.9        0.003
 East        16       1.64           0.53           3.08         1.10     103.8        1.36      116.1        0.002
 West        41       0.56           0.69           0.81        -0.13      13.5        0.58       85.3        0.00
                                           39

-------
       Figures IV-14 and IV-15 show the annual and seasonal average concentration scatterplots
for PMFINE-other.
Figure IV-14. Annual average PMFINE (crustal) 1996 IMPROVE observations versus
REMSAD predictions.
Figure IV-15. Seasonal average PMFINE (crustal) 1996 IMPROVE observations versus
REMSAD predictions.
                                          40

-------
      b. NADP Wet Deposition Performance

       Figures IV-16, IV-17, and IV-18 show the annual 1996 NADP observations versus REMSAD
predictions for ammonium, nitrate, and sulfate wet deposition, respectively. The scatterplots and
linear regressions show some scatter (e.g., an underprediction bias for nitrate and especially sulfate
wet deposition), but good agreement, with strong correlations (NH4: R2 = 0.65; NO3: R2 = 0.78;
SO4: R2 = 0.78).
Figure IV-16. Annual total ammonium (NH4) wet deposition 1996 NADP observations versus
REMSAD predictions.
Figure IV-17. Annual total nitrate (NO3) wet deposition 1996 NADP observations versus
REMSAD predictions.
                                           41

-------
Figure IV-18. Annual total sulfate (SO4) wet deposition 1996 NADP observations versus
REMSAD predictions.

c. Wet Mercury Deposition

       The Mercury Deposition Network (MDN) was in its first year of operation in 1996 with a
limited number of sites10.  Therefore, there is not enough data available to adequately judge
model performance. At the few sites with ambient data, REMSAD-predicted wet
mercury deposition is underestimated compared to the observed values (mercury deposition
performance is notably poorer than for the other deposited species). It should be noted that
REMSAD generally predicts higher levels of dry deposition of mercury compared to wet
deposition. But there are no existing measurements of dry deposition to compare to the model
results.

       There is a great deal of uncertainty in the modeling of mercury deposition. Mercury
chemistry is not fully understood. There is also uncertainty associated with the global
background of mercury. Estimates of background mercury in terms of the boundary conditions
assumed in the model can be very important to predicting mercury and mercury deposition.
Certain forms of mercury  are long lived and can be transported around the globe.  In view  of the
uncertainty in global transport, different models and model applications have used boundary
conditions for mercury that vary by as much as a factor of 5 at the surface and aloft. Additional
research is needed to be able  to develop representative boundary conditions.  Since temporally
varying boundary conditions  at high altitudes may be important, it may be necessary to use
results of a global mercury model to develop boundary conditions for continental  scale air
quality models such as REMSAD.
       10 There were 8 sites with complete annual wet deposition data.

                                          42

-------
d. CASTNet Performance

       Figures IV-19 and IV-20 show the seasonal 1996 CASTNet observations versus REMSAD
predictions for total sulfate and total nitrate, respectively.  The scatterplot and linear regression
of sulfate showed good agreement, with strong correlations among all seasons (summer: R2 =
0.80; fall: R2 = 0.92; spring: R2 = 0.81; winter: R2 = 0.78). The performance of sulfate at the
CASTNet sites looks better than at the IMPROVE sites. The CASTNet sites measure data on a
weekly average basis as opposed to the IMPROVE twice weekly sampling schedule. There are
also more CASTNet sites in the high sulfate region of the East (e.g. the Ohio Valley). The
CASTNet long term averaging of data seems particularly well suited for comparisons to seasonal
average modeled concentrations.
       The scatterplot and linear regression of total nitrate showed modest agreement, with
weaker correlations within each season (summer: R2 = 0.48; fall: R2 = 0.67; spring: R2 = 0.74;
winter: R2 = 0.51).  There is an indication of an overprediction bias. This is not surprising given
the  overprediction bias of modeled particulate nitrate.  The overprediction of total nitrate
indicates that nitric acid concentrations may be overpredicted. This may be one of the reasons
for the general overprediction of particulate nitrate. Model developers are continuing to examine
the  nitric acid production and destruction pathways.  There are continuing improvements being
made to the daytime and nighttime nitric acid formation reactions.  Dry deposition of nitric  acid
is also being studied as a possible cause of overprediction.
Figure IV-19. Seasonal average sulfate (SO4) 1996 CASTNet observations versus REMSAD
predictions.
                                           43

-------
Figure IV-20. Seasonal average total nitrate (NO3+ HNO3) 1996 CASTNet observations versus
REMSAD predictions.
e. Visibility Performance

       For the purpose of model performance evaluation, visibility was calculated in a manner
similar to recommendations for the Regional Haze rule. For the Regional Haze rule, states must
look at the change in visibility on the 20% best days and the 20% worst days (in units of
deciviews)  at each Class I area. A certain improvement in visibility on the 20% worst days is
needed in the future at each Class I area. Visibility on the 20% best days cannot degrade in the
future.

       EPA has released a draft version of guidance that details the calculation of base period
visibility (EPA, 200la). The 20% best and worst days for the "base period" are to be  calculated
from the  2000-2004 IMPROVE data at each Class I area. The daily average extinction
coefficient (bext) values are calculated using the following formula:

bext = 10.0 + [3.0 * f(RH) * (1.375 * sulfate) + 3.0 * f(RH) * (1.29 * nitrate) +
     4.0 * (organic aerosols) + 10.0 * (elemental carbon) + 1.0 * (crustal) + 0.6 * (coarse PM)]

bext is in units of inverse megameters (Mm-1).  The 10.0 initial value accounts for atmospheric
background (i.e., Rayleigh) scattering. f(RH) refers to the relative humidity correction function
as defined by IMPROVE (2000). The relative humidity correction factor was derived from
historical climatological meteorological data.  There is a published f(RH) value for each month of
the year for each Class I area (SAIC, 2001). The climatological f(RH) values will be used to
calculate bext for the Regional Haze rule.
                                           44

-------
The formula to calculate bext from REMSAD output species is as follows:

bext = 10.0 + [3.0 * f(RH) * (1.375 * (GSO4 + ASO4)) + 3.0 * f(RH) * (1.29 * PNO3) +
     4.0 * (TOA) + 10.0 * (PEC) + 1.0 * (PMFINE) + 0.6 * (PMCOARS)]
The daily average bext values are converted to deciview values using the following formula:

    dv = 10.0 * ln(bext / 10.0 Mm-1)
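
As an illustrative sketch only (not the post-processing code used for this analysis), the extinction
and deciview formulas above can be applied to REMSAD output species as follows; the
concentrations and the f(RH) value are arbitrary example numbers.

    # Sketch: reconstructed extinction (Mm-1) and deciviews from REMSAD species (ug/m3).
    import math

    def extinction(c, f_rh):
        return 10.0 + (3.0 * f_rh * 1.375 * (c["GSO4"] + c["ASO4"])
                       + 3.0 * f_rh * 1.29 * c["PNO3"]
                       + 4.0 * c["TOA"] + 10.0 * c["PEC"]
                       + 1.0 * c["PMFINE"] + 0.6 * c["PMCOARS"])

    def deciview(bext):
        return 10.0 * math.log(bext / 10.0)

    c = {"GSO4": 0.5, "ASO4": 2.0, "PNO3": 0.4, "TOA": 1.8,
         "PEC": 0.3, "PMFINE": 0.6, "PMCOARS": 2.5}
    bext = extinction(c, f_rh=2.4)
    print(round(bext, 1), round(deciview(bext), 1))   # Mm-1 and deciviews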
The 20% best and worst days are identified based on the daily average observed deciview values
at each Class I area. For the purpose of this model performance evaluation, we have calculated
the 20% best and worst days from 1996 (the meteorological year we are using) at each
IMPROVE site with complete data. The following scatterplots show the observed versus predicted
bext values at the IMPROVE sites on the 20% best and worst days.
Figure IV-21. IMPROVE observed versus REMSAD predicted light extinction coefficient
values on the 20% best and worst days in the East.
                                           45

-------
Figure IV-22. IMPROVE observed versus REMSAD predicted light extinction coefficient
values on the 20% best and worst days in the West.
       REMSAD was generally able to predict the highest bext values on the observed worst days
in the East. The 20% worst days in the East show little bias, but a large amount of scatter.  The
20% best days in the East are generally overpredicted.  The 20% worst days in the West are
underpredicted.  REMSAD rarely predicted high bext values in the West. The model predictions
on the 20% best and worst days are similar.
3. Summary of Model Performance

       The purpose of this model performance evaluation was to evaluate the capabilities of the
REMSAD modeling system in reproducing annual average concentrations and deposition at all
IMPROVE, CASTNet, and NADP sites in the contiguous U.S. for fine particulate mass, its
associated speciated components, visibility, and wet deposition.  When considering annual
average statistics (e.g., predicted versus observed), which are computed and aggregated over all
sites and all days, REMSAD underpredicted fine particulate mass (PM2.5) by 18%. PM2.5 in the
Eastern U.S. was underpredicted by 2%, while PM2.5 in the West was underpredicted by 33%.
All PM2.5 component species were underpredicted in the West. In the East, nitrate and crustal
material are overestimated.  Elemental carbon shows neither overprediction nor underprediction
in the East, with a bias near 0%. Eastern sulfate is slightly underpredicted, with a bias of 12%.
Organic aerosols show little or no bias in the East and West.
                                          46

-------
       The comparisons to the CASTNet data show generally good model performance for
particulate sulfate. Comparisons of total nitrate indicate an overestimate, possibly due to
overprediction of nitric acid in the model.

       Performance at the NADP sites for wet deposition of ammonium, sulfate, and nitrate
was reasonably good.  There is an underprediction bias for nitrate, and especially for sulfate, wet
deposition. Total mercury wet deposition at the MDN sites was also
underpredicted.

       Given the  state of the science relative to PM modeling, it is inappropriate to judge PM
model performance using criteria derived for other pollutants, like ozone.  Still, the performance
of the Clear Skies PM modeling is very encouraging, especially considering that the results may
be limited by our current knowledge of PM science and chemistry, by the emissions inventories
for primary PM and secondary PM precursor pollutants, by the relatively sparse ambient data
available for comparisons to model output, and by uncertainties in monitoring techniques. The
model performance for sulfate is quite reasonable, which is key to the Clear Skies analysis due to
the importance of SO2 emissions reductions in the Clear Skies control program.

       It is important to note that there are a number of factors to be considered when
interpreting the results of this performance analysis. First, simulating the formation and fate of
particles, especially secondary organic aerosols and nitrates, is part of an evolving science.  In
this regard, the science in air quality models is continually being updated as new research results
become available.  Also, there are a number of issues associated with the emissions and
meteorological inputs, as well as ambient air quality measurements and how these should be
paired to model predictions that are currently under investigation by EPA and others.  The
process of building consensus within the scientific community on ways for doing PM model
performance evaluations has not yet progressed to the point of having a defined set of common
approaches or criteria for judging model performance.  Unlike ozone, there is a limited database
of past performance statistics against which to measure the performance of the Clear Skies PM
modeling. Thus, the approach used for this analysis may be modified or expanded in future
evaluation analyses.
                                           47

-------
E.  Projected Future PM2.5 Design Values

1. East

       The REMSAD simulations were performed for Base Cases in 1996, 2001, 2010, and
2020 considering growth and expected emissions controls that will affect future air quality.  The
effects of the Clear Skies Act emissions reductions (i.e., Control Cases) were modeled for the
two future years (2010 and 2020). As a means of assessing the future levels of air quality with
regard to the PM2.5 NAAQS, future-year estimates of PM2.5 design values were calculated using
relative reduction factors (RRFs) applied to 1999-2001 PM2.5 design values (EPA, 2003b).  The
procedures for determining the RRFs are similar to those in EPA's draft guidance for modeling
the PM2.5 standard (EPA, 1999a).  The guidance recommends that model predictions be used in
a relative sense to estimate changes expected to occur in each major PM2.5 species. These
species are sulfate, nitrate, organic carbon, elemental carbon, crustal material, and unattributed
mass, which is defined as the difference between measured PM2.5 and the sum of the other five
components. The procedure for calculating future year PM2.5 design values is called the
"Speciated Modeled Attainment Test" (SMAT). EPA used this procedure to estimate the
ambient impact of the Clear Skies Act emissions controls.

       The guidance describes a sequence of key steps that are recommended in processing the
data. The following is a brief summary of those steps (a simplified computational sketch follows
the list):

1.     Derive current quarterly mean concentrations (averaged over three years) for each of the
       six major components of PM2.5.  This is done by multiplying the monitored quarterly
       mean concentration of Federal Reference Method (FRM) derived PM2.5 by the monitored
       fractional composition of PM2.5 species (at speciation monitor sites) for each quarter in
       three consecutive years (e.g., 20% sulfate x 15 µg/m3 PM2.5 = 3 µg/m3 sulfate).

2.     For each quarter, apply an air quality model to estimate current and future concentrations
       for each of the six components of PM2.5. Take the ratio of future to current predictions for
       each component. The result is a component-specific relative reduction factor (RRF)
       (e.g., if the model-predicted sulfate for the base case is 10 µg/m3 and for the future case is
       8 µg/m3, then the RRF for sulfate is 0.8).

3.     For each quarter, multiply the current quarterly mean component concentration (step 1)
       by the component-specific RRF obtained in step 2.  This leads to an estimated future
       quarterly mean concentration for each component (e.g., 3 µg/m3 sulfate x 0.8 = future
       sulfate of 2.4 µg/m3).

4.     Average the four quarterly mean future concentrations to get an estimated future annual
       mean concentration for each component. Sum the annual mean concentrations of the six
       components to obtain an estimated future annual average concentration for PM2.5.
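
The four steps above can be illustrated with the following simplified sketch for a single monitor.
The FRM concentrations, species fractions, and RRFs are made-up example values, not results
from this analysis.

    # Sketch of the SMAT steps for one monitor and the six PM2.5 components.
    COMPONENTS = ["sulfate", "nitrate", "organic_carbon", "elemental_carbon",
                  "crustal", "unattributed"]

    def future_quarterly_means(frm_pm25, species_fractions, rrf):
        """Steps 1-3: speciate the FRM quarterly mean, then apply component RRFs."""
        current = {s: frm_pm25 * species_fractions[s] for s in COMPONENTS}
        return {s: current[s] * rrf[s] for s in COMPONENTS}

    # Step 1 inputs: monitored species fractions of FRM PM2.5
    fractions = {"sulfate": 0.20, "nitrate": 0.10, "organic_carbon": 0.35,
                 "elemental_carbon": 0.05, "crustal": 0.10, "unattributed": 0.20}
    # Step 2 inputs: component-specific relative reduction factors (future/current)
    rrf = {"sulfate": 0.80, "nitrate": 0.95, "organic_carbon": 0.98,
           "elemental_carbon": 0.97, "crustal": 1.00, "unattributed": 1.00}

    quarters = [15.0, 13.2, 16.8, 14.1]   # example quarterly mean FRM PM2.5 (ug/m3)
    future_quarters = [sum(future_quarterly_means(q, fractions, rrf).values())
                       for q in quarters]
    # Step 4: average the quarterly means to get the future annual mean PM2.5
    print(round(sum(future_quarters) / 4.0, 2))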

       EPA will use the Federal Reference Method (FRM) data for nonattainment designations.
Therefore, it is critical that FRM data be used in the speciated modeled attainment test described
above. As can be seen from the list of steps, the modeled attainment test is critically dependent
on the availability of species component mass at FRM sites. There is currently a limited

                                          48

-------
database of urban speciation data from the Speciation Trends Network (STN)11.  Therefore, a
spatial interpolation methodology was developed to estimate component species mass at the
FRM locations. Additional ambient data handling procedures were also developed. Full
documentation of the procedures and assumptions used in the future year design value
calculations is contained in Appendix E.

The SMAT procedure was performed using the base year 2001 scenario and each of the future-
year scenarios.  PM2.5 component species RRFs were calculated on a site-by-site basis.  The
future-year design value projections were then calculated by county, based on the highest
resultant design value for a site within that county. The current and future base and control
annual average PM2.5 design values are provided in Appendix F.  County populations are also
included in this appendix.

2. West

       Western U.S. PM2.5 concentrations were modeled as part of the Clear Skies analysis, but
due to the lack of PM2.5 species spatial fields, the SMAT technique was not applied. Instead, the
relative reduction factors were calculated based on the predicted percentage change in total
PM2.5.  This was the method of calculating future year PM2.5 values used in past analyses. The
future year design values for the West were calculated using the 1999-2001 ambient design
values, the 2001 base year scenario, and the 2010 and 2020 future year scenarios.

       There are no western PM2.5 nonattainment counties outside of California (except for
Lincoln County, MT)12. As stated earlier, the projected NOx emissions reductions in California
from Clear Skies are very small (~1,300 tons/year).  There are also no projected Clear Skies SO2
emissions reductions in California.  Therefore, Clear Skies is expected to have no impact on
PM2.5 attainment status in California, and hence the West.  Other Federal and state emissions
control programs contained in the 2010 and 2020 base cases are predicted to bring several
California counties into attainment for the PM2.5 standard.  The current and future base and
control annual average PM2.5 design values are provided in Appendix F.
       11 Even when the STN is completely deployed, approximately 80% of the FRM monitoring sites
will not have a co-located speciation monitor.

       12 Lincoln County, Montana (Libby) is known to have a local direct PM2.5 problem associated
with its location in a river valley.  It is not thought to be heavily influenced by utility emissions.

                                           49

-------
F.  PM2.5 Nonattainment Summary

       As shown in Table IV-10, the REMSAD modeling projects that 80 counties across the
country, with a population of 53.6 million people, will have design values greater than the annual
PM2.5 NAAQS in 2010 without Clear Skies controls. By 2020, that number is expected to fall to
53 counties, with a population of 45.7 million people, as a result of projected emissions
reductions from existing control programs.

       Clear Skies emissions reductions are predicted to bring 42 counties with a population of
13.7 million people into attainment for the PM2.5 standard in 2010. The reduction of 42 counties
leaves 38 counties, with a population of 39.9 million people, in nonattainment for the PM2.5
standard in 2010 after Clear Skies controls.  In 2020, Clear Skies is expected to bring 35 counties
with a population of 12.4 million people into attainment. That leaves 18 counties, with a population
of 33.2 million people, in nonattainment for the PM2.5 standard in 2020 after Clear Skies controls.
Appendix G contains maps of the base year and projected year PM2.5 nonattainment counties.
Table IV-10. Lists of counties projected to violate the annual PM2.5 NAAQS in 2010 and 2020
for the Base Case and Clear Skies Control Case.
2010 Base Case (80 counties):
       Alabama: Houston, Shelby, DeKalb, Montgomery, Talladega, Russell, Morgan, Jefferson
       California: San Diego, Merced, Stanislaus, Orange, Kern, Fresno, Tulare, San Bernardino, Los Angeles, Riverside
       Connecticut: New Haven
       D.C.: District of Columbia
       Delaware: New Castle
       Georgia: Washington, Chatham, Dougherty, Paulding, Richmond, Hall, Bibb, Wilkinson, Muscogee, Floyd, Cobb, Clarke, Clayton, DeKalb, Fulton
       Illinois: Will, Madison, St. Clair, Cook
       Indiana: Lake, Marion, Clark
       Kentucky: Fayette, Jefferson
       Maryland: Baltimore city
       Michigan: Wayne
       Mississippi: Jones
       Missouri: St. Louis city
       Montana: Lincoln
       New York: New York
       North Carolina: Mecklenburg, Catawba, Davidson
       Ohio: Trumbull, Mahoning, Summit, Butler, Montgomery, Franklin, Stark, Jefferson, Hamilton, Scioto, Cuyahoga
       Pennsylvania: Philadelphia, Lancaster, Allegheny
       South Carolina: Greenville
       Tennessee: Sullivan, Roane, Davidson, Hamilton, Knox
       West Virginia: Marshall, Hancock, Brooke, Wood, Cabell, Kanawha

2010 Control Case (38 counties):
       Alabama: Talladega, Russell, Morgan, Jefferson
       California: San Diego, Merced, Stanislaus, Orange, Kern, Fresno, Tulare, San Bernardino, Los Angeles, Riverside
       Georgia: Bibb, Wilkinson, Muscogee, Floyd, Cobb, Clarke, Clayton, DeKalb, Fulton
       Illinois: Madison, St. Clair, Cook
       Michigan: Wayne
       Montana: Lincoln
       New York: New York
       Ohio: Franklin, Stark, Jefferson, Hamilton, Scioto, Cuyahoga
       Pennsylvania: Allegheny
       Tennessee: Hamilton, Knox

2020 Base Case (53 counties):
       Alabama: Montgomery, Talladega, Russell, Morgan, Jefferson
       California: San Diego, Merced, Stanislaus, Orange, Kern, Fresno, Tulare, San Bernardino, Los Angeles, Riverside
       Georgia: Chatham, Dougherty, Richmond, Bibb, Wilkinson, Muscogee, Floyd, Cobb, Clarke, Clayton, DeKalb, Fulton
       Illinois: Madison, St. Clair, Cook
       Indiana: Marion, Clark
       Maryland: Baltimore city
       Michigan: Wayne
       New York: New York
       Ohio: Summit, Butler, Montgomery, Franklin, Stark, Jefferson, Hamilton, Scioto, Cuyahoga
       Pennsylvania: Philadelphia, Allegheny
       Tennessee: Hamilton, Knox
       West Virginia: Hancock, Brooke, Wood, Cabell, Kanawha

2020 Control Case (18 counties):
       Alabama: Jefferson
       California: San Diego, Merced, Stanislaus, Orange, Kern, Fresno, Tulare, San Bernardino, Los Angeles, Riverside
       Georgia: DeKalb, Fulton
       Illinois: Cook
       Michigan: Wayne
       Ohio: Jefferson, Cuyahoga
       Pennsylvania: Allegheny
52

-------
G. Projected Visibility

       As described previously, visibility was calculated for the 20% best and worst days from
1996 at each IMPROVE site with complete data.  The future year projected visibility was also
calculated for each Clear Skies scenario in 2010 and 2020 using a methodology similar to
SMAT. The draft modeling guidance recommends calculating future year changes in
visibility in a manner similar to the calculation of changes in PM2.5.  The extinction coefficient
and deciview values are made up of individual component species (sulfate, nitrate, organics, etc.).
The predicted change in visibility (on the 20% best and worst days) is calculated as the summed
percent change in the extinction coefficient for each of the PM species (on a daily basis). The
daily average extinction coefficients are converted to deciviews and then averaged (best and
worst days separately). In this way, we can calculate an average change in deciviews from the
base case to a future case at each IMPROVE site.
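
For illustration only, the following sketch applies the approach described above at a single site:
the base-period per-species extinction contributions on selected days are scaled by modeled
percent changes, converted to deciviews, and averaged. All numbers are hypothetical examples,
not values from this analysis.

    # Sketch: average deciview change from per-species extinction scaling.
    import math

    def future_deciview(base_species_bext, pct_change):
        """Scale each species' extinction contribution, then convert to deciviews."""
        bext = 10.0 + sum(b * (1.0 + pct_change[s] / 100.0)
                          for s, b in base_species_bext.items())
        return 10.0 * math.log(bext / 10.0)

    worst_days = [
        {"sulfate": 120.0, "nitrate": 15.0, "organics": 25.0,
         "elemental_carbon": 8.0, "crustal": 3.0, "coarse": 5.0},
        {"sulfate": 95.0, "nitrate": 10.0, "organics": 30.0,
         "elemental_carbon": 6.0, "crustal": 2.0, "coarse": 4.0},
    ]
    pct_change = {"sulfate": -35.0, "nitrate": -5.0, "organics": -2.0,
                  "elemental_carbon": -3.0, "crustal": 0.0, "coarse": 0.0}

    base_dv = sum(10.0 * math.log((10.0 + sum(d.values())) / 10.0) for d in worst_days) / 2
    future_dv = sum(future_deciview(d, pct_change) for d in worst_days) / 2
    print(round(base_dv - future_dv, 2), "deciview improvement on these days")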

       Appendix H contains an example calculation of the predicted improvement in visibility
on the 20% worst days at an IMPROVE site, as well as the predicted reductions in deciviews at
Class I areas on the 20% best visibility days and the 20% worst visibility days.  There is a
separate table for the 20% best days and the 20% worst days. The calculated reductions in
deciviews are based on the model-predicted changes in PM species between the 2001 proxy base
case and the 2010 and/or 2020 model runs. The calculated reductions in deciviews are from a
baseline of 1996 ambient data.  The 1996 ambient data was only used as a starting point to
calculate the deciview reductions.13

       As an example, the expected improvement in visibility at Great Smoky Mountains
National Park (GRSM) on the 20% worst visibility days in 2010 without Clear Skies is 1.38
deciviews.  The expected improvement with Clear Skies controls (in addition to all other
expected controls) is 3.37 deciviews.  The improvement in visibility due only to Clear Skies
controls is 1.99 deciviews.  The expected improvement in visibility in 2020 is even larger. The
visibility improvement without Clear Skies is 2.28 deciviews. The improvement with Clear
Skies is 5.56 deciviews, resulting in an improvement due only to  Clear Skies controls of 3.29
deciviews.

       The Clear Skies modeling predicts smaller improvements in visibility on the 20% best
days, but there are no cases in which visibility is predicted to deteriorate due to Clear Skies or
any other controls. Some Class I areas in the West do not show any improvement in visibility
due to Clear Skies, which is not surprising given the very small emissions reductions in
California and other Western states.
       13 The 1996 data was used because it is coincident with the REMSAD meteorology. The changes
in visibility are representative of emissions changes from 2001 into the future (not 1996). Due to the lack
of complete IMPROVE baseline ambient data, and because 1996 meteorology was used, it was difficult to
replicate the Regional Haze guidance (i.e., the modeling guidance and the procedures for calculating the
baseline 20% best and worst days). The resultant values are believed to be representative of the expected
improvement in visibility.

                                           53

-------
H. Model Outputs for Benefits Calculations

A number of model outputs were provided for the economic and health benefits calculations. The
following outputs were provided for each modeling scenario:

CAMx

Hourly average ozone concentration


REMSAD

Daily average PM2.5 concentrations
Daily average PM10 concentrations
Annual average visibility14
Annual average PM2.5 concentrations
Annual average PM10 concentrations
Annual total nitrogen deposition
Annual total sulfur deposition
Annual total mercury deposition
       14 The daily average modeled bext values are averaged to derive the annual average bext. The
annual average bext values were used to calculate the annual average deciviews (dv). The relative
humidity correction factor f(rh) used to calculate the annual average visibility was calculated from the
hourly average modeled relative humidity at each grid cell for each time period. The climatological f(rh)
values at each Class I area could not be used because annual average visibility calculations are needed
for each grid cell.
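
       As a rough sketch of the aggregation described in footnote 14, the example below assumes
IMPROVE-style extinction coefficients, a hypothetical f(rh) lookup table, and one plausible
reading of how the hourly modeled relative humidity is used; none of it is code or data from
this analysis.

    import math

    # Hedged sketch of the grid-cell annual average visibility calculation in
    # footnote 14; the coefficients, f(rh) table, and hourly-RH treatment are
    # illustrative assumptions.

    def f_rh(rh_percent, growth_table):
        """Hygroscopic growth factor from a (hypothetical) table keyed by
        integer relative humidity, capped here at 95 percent."""
        return growth_table[min(int(round(rh_percent)), 95)]

    def daily_frh(hourly_rh, growth_table):
        """One plausible reading of footnote 14: average the f(rh) values
        derived from the hourly modeled relative humidity over the day."""
        return sum(f_rh(rh, growth_table) for rh in hourly_rh) / len(hourly_rh)

    def daily_bext(conc_ugm3, frh):
        """Reconstructed extinction (Mm^-1) for one day and grid cell using
        IMPROVE-style coefficients (assumed for illustration)."""
        return (3.0 * frh * (conc_ugm3["SO4"] + conc_ugm3["NO3"])
                + 4.0 * conc_ugm3["OC"]
                + 10.0 * conc_ugm3["EC"]
                + 1.0 * conc_ugm3["crustal"]
                + 10.0)  # Rayleigh scattering

    def annual_average_deciview(daily_bext_values):
        """Average the daily bext values over the year, then convert the
        annual average bext to deciviews, as footnote 14 describes."""
        mean_bext = sum(daily_bext_values) / len(daily_bext_values)
        return 10.0 * math.log(mean_bext / 10.0)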

                                            54

-------
V. References

Environ, 2002: User's Guide: Comprehensive Air Quality Model with Extensions (CAMx),
Novato, CA.

EPA, 1991: Guideline for Regulatory Application of the Urban Airshed Model, EPA-450/4-91-
013, Office of Air Quality Planning and Standards, Research Triangle Park, NC.

EPA, 1999a: Draft Guidance on the Use of Models and Other Analyses in Attainment
Demonstrations for the 8-Hour Ozone NAAQS, Office of Air Quality Planning and Standards,
Research Triangle Park, NC.

EPA, 1999b: Technical Support Document for the Tier 2/Gasoline Sulfur Ozone Modeling
Analyses, EPA420-R-99-031, Research Triangle Park, NC.

EPA, 2000a: Technical Support Document for the Heavy Duty Engine and Vehicle Standards
and Highway Diesel Fuel Sulfur Control Requirements: Air Quality Modeling Analyses,
EPA420-R-00-0208, Research Triangle Park, NC.

EPA, 2000b: Guidance for Demonstrating Attainment of Air Quality Goals for PM2.5 and
Regional Haze; Draft 1.1, Office of Air Quality Planning and Standards, Research  Triangle Park,
NC.

EPA, 2001a: Draft Guidance for Tracking Progress Under the Regional Haze Rule, Office of Air
Quality Planning and Standards, Research Triangle Park, NC.

EPA, 2001b: Draft Guidance for Estimating Natural Visibility Conditions Under the Regional
Haze Program, Office of Air Quality Planning and Standards, Research Triangle Park, NC.

EPA, 2002: Clean Air Status and Trends Network (CASTNet), 2001 Annual Report.

EPA, 2003a: Procedures for Developing Base Year and Future Year Mass Emissions Inventories
for the Nonroad Diesel Engine Rulemaking, Office of Air Quality Planning and Standards,
Research Triangle Park, NC.

EPA, 2003b: Air Quality Data Analysis 1999-2001, Technical Support Document for Regulatory
Actions, Office of Air Quality Planning and Standards, Research Triangle Park, NC.

EPA, 2003c: Technical Support Document for the Nonroad Land-Based Diesel Engine
Standards: Air Quality Modeling Analyses, Office of Air Quality Planning and Standards, Research
Triangle Park, NC.

Grell, G., J. Dudhia, and D. Stauffer, 1994: A Description of the Fifth-Generation Penn
State/NCAR Mesoscale Model (MM5), NCAR/TN-398+STR., 138 pp, National Center for
Atmospheric Research, Boulder CO.
                                         55

-------
Houyoux, M., J. Vukovich, and J. Brandmeyer, 2000: Sparse Matrix Operator Kernel Emissions
Modeling System (SMOKE) User Manual, Version 1.1.2 Draft, MCNC-North Carolina
Supercomputing Center Environmental Programs. (Updates at
http://www.cmascenter.org/modelclear.htmltfsmoke)

ICF Kaiser, 2002: User's Guide to the Regional Modeling System for Aerosols and Deposition
(REMSAD) Version 7, San Rafael, CA.

IMPROVE, 2000: Spatial and Seasonal Patterns and Temporal Variability of Haze and its
Constituents in the United States: Report III. Cooperative Institute for Research in the
Atmosphere, ISSN: 0737-5352-47.

Griffin, R.J., D.R. Cocker III, R.C. Flagan, and J.H. Seinfeld, 1999: "Organic aerosol formation
from the oxidation of biogenic hydrocarbons" J. Geophysical Research, Vol. 104, pp.  3555-
3567.

Kim, Y.P., J.H. Seinfeld, and P. Saxena, 1993. "Atmospheric Gas-Aerosol Equilibrium I.
Thermodynamic Model." Aerosol Science and Technology, Vol. 19, pp. 157-181.

Mansell, G., 2000: User's Instructions for the Phase 2 REMSAD Preprocessors, Environ
International, Novato, CA.

NADP, 2002: National Atmospheric Deposition Program 2002 Annual Summary.

Odum, J.R., T.P.W. Jungkamp, R.J. Griffin, R.C. Flagan, and J.H. Seinfeld, 1997: "The
Atmospheric Aerosol-Forming Potential of Whole Gasoline Vapor" Science, Vol. 276, pp. 96-
99.

Olerud, D., K. Alapaty, and N. Wheeler, 2000: Meteorological Modeling of 1996 for the United
States with MM5. MCNC-Environmental Programs,  Research Triangle Park, NC.

Pielke, R.A., W.R. Cotton, R.L. Walko, C.J.  Tremback, W.A. Lyons, L.D. Grasso, M.E.
Nicholls, M.D. Moran, D.A. Wesley, T.J. Lee, and J.H. Copeland, 1992: A Comprehensive
Meteorological Modeling System - RAMS, Meteor. Atmos. Phys., Vol. 49, pp. 69-91.

Saxena, P., A.B. Hudischewskyj, C. Seigneur, and J.H. Seinfeld, 1986: "A Comparative Study of
Equilibrium Approaches to the Chemical Characterization of Secondary Aerosols." Atmospheric
Environment, Vol. 20, pp. 1471-1483.

Seigneur, C., G. Hidy, I. Tombach, J. Vimont, P. Amar, 1999: "Scientific Peer-Review of the
Regulatory Modeling System for Aerosols and Deposition (REMSAD)." The KEVRIC
Company, Inc., Durham, NC.

Sistla, Gopal, 1999: Personal communication.
                                          56

-------
SAIC, 2001: "Interpolating Relative Humidity Weighting Factors to Calculate Visibility
Impairment and the Effects of IMPROVE Monitor Outliers" EPA Contract No. 68-D-98-113.
http://vista.cira.colostate.edu/improve/Publications/GuidanceDocs/DraftReportSept20.pdf

Systems Applications International, 1996: User's Guide to the Variable-Grid Urban Airshed
Model (UAM-V), SYSAPP-96-95/27r, San Rafael CA.

Wesely, M.L., 1989: "Parameterization of Surface Resistances to Gaseous Dry Deposition in
Regional-Scale Numerical Models" Atmospheric Environment, Vol. 23, No. 6, pp. 1293-1304.

Whitten, Gary Z., 1999: Computer Efficient Photochemistry for Simultaneous Modeling of
Smog and Secondary Particulate Precursors, Systems Applications International, San Rafael, CA.
                                          57

-------