EPA-454/R-22-006
September 2022

Meteorological Model Performance for Annual 2017 Simulation WRF v3.8

U.S. Environmental Protection Agency
Office of Air Quality Planning and Standards
Air Quality Assessment Division
Research Triangle Park, NC
1. INTRODUCTION
The Weather Research and Forecasting model (WRF) was applied for the entire year of 2017 to
generate meteorological data to support emissions and photochemical modeling applications
for this year. The WRF meteorological fields will be converted to air quality modeling input data
and used to support assessments of ozone, PM2.5, visibility, and a variety of toxics.
The WRF model was applied to the 12 km continental United States (12US) scale domain,
initialized directly from meteorological analysis data. Model parameterizations and options
outlined in this document were chosen based on a series of sensitivity runs performed by U.S.
Environmental Protection Agency (USEPA) Office of Research and Development, which identified
an optimal configuration based on temperature, mixing ratio, and wind field performance. All WRF
simulations were done by CSRA under contract to the USEPA.
2. MODEL CONFIGURATION
Version 3.8 of the WRF model, Advanced Research WRF (ARW) core (Skamarock et al., 2008), was
used for generating the 2017 simulation. Selected physics options include Pleim-Xiu land
surface model, Asymmetric Convective Model version 2 planetary boundary layer scheme, Kain-
Fritsch cumulus parameterization utilizing the moisture-advection trigger (Ma and Tan, 2009),
Morrison double moment microphysics, and RRTMG longwave and shortwave radiation
schemes (Gilliam and Pleim, 2010).
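These scheme choices map onto the WRF &physics namelist. The sketch below is illustrative only: it records the commonly used WRF v3.8 option codes for the schemes named above (the actual namelist.input for this simulation is not reproduced in this report, the to_namelist_lines helper is hypothetical, and the Ma and Tan (2009) trigger is assumed to correspond to the kfeta_trigger setting).

```python
# Sketch only: assumed mapping of the physics options named above to the
# standard WRF v3.8 &physics namelist codes. The study's actual namelist.input
# is not part of this report and may contain additional settings.
WRF_PHYSICS = {
    "sf_surface_physics": 7,  # Pleim-Xiu land surface model
    "sf_sfclay_physics": 7,   # Pleim-Xiu surface layer (typically paired with the PX LSM)
    "bl_pbl_physics": 7,      # ACM2 (Asymmetric Convective Model version 2) PBL scheme
    "cu_physics": 1,          # Kain-Fritsch cumulus parameterization
    "kfeta_trigger": 2,       # assumed: moisture-advection modulated KF trigger (Ma and Tan, 2009)
    "mp_physics": 10,         # Morrison double-moment microphysics
    "ra_lw_physics": 4,       # RRTMG longwave radiation
    "ra_sw_physics": 4,       # RRTMG shortwave radiation
}

def to_namelist_lines(options):
    """Render the option dictionary as '&physics'-style namelist lines."""
    return [f" {name} = {value}," for name, value in options.items()]

if __name__ == "__main__":
    print("&physics")
    print("\n".join(to_namelist_lines(WRF_PHYSICS)))
    print("/")
```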
The 12US WRF model was initialized using the 12km North American Model (12NAM) analysis
product provided by National Climatic Data Center (NCDC). Where 12NAM data was
unavailable, the 40km Eta Data Assimilation System (EDAS) analysis (ds609.2) from the National
Center for Atmospheric Research (NCAR) was used. Analysis nudging for temperature, wind,
and moisture was applied above the boundary layer only. The model simulations were
conducted continuously. The 'ipxwrf' program was used to initialize deep soil moisture at the
start of the run using a 10-day spinup period (Gilliam and Pleim, 2010). Land use and land cover
data were based on the 2011 National Land Cover Database (NLCD 2011). Sea surface
temperatures were ingested from the Group for High Resolution Sea Surface Temperatures
(GHRSST) (Stammer et al., 2003) 1km SST data.
Additionally, lightning data assimilation was utilized to suppress (force) deep convection where
lightning is absent (present) in observational data. This method is described by Heath et al.
(2016) and was employed to help improve precipitation estimates generated by the model.
Figure 2.1 shows the 12US domain, which utilized a Lambert conformal projection centered at
(-97, 40) with true latitudes of 33 and 45 degrees north. The 12US domain contains 412 cells in
the X direction and 372 cells in the Y direction. The atmosphere is resolved with 35 vertical
layers up to 50 mb (see Table 2.1), with the thinnest layers being nearest the surface to better
resolve the planetary boundary layer (PBL).
WRF Layer   Height (m)   Pressure (Pa)   Sigma
35          17,556        5000           0.000
34          14,780        9750           0.050
33          12,822       14500           0.100
32          11,282       19250           0.150
31          10,002       24000           0.200
30           8,901       28750           0.250
29           7,932       33500           0.300
28           7,064       38250           0.350
27           6,275       43000           0.400
26           5,553       47750           0.450
25           4,885       52500           0.500
24           4,264       57250           0.550
23           3,683       62000           0.600
22           3,136       66750           0.650
21           2,619       71500           0.700
20           2,226       75300           0.740
19           1,941       78150           0.770
18           1,665       81000           0.800
17           1,485       82900           0.820
16           1,308       84800           0.840
15           1,134       86700           0.860
14             964       88600           0.880
13             797       90500           0.900
12             714       91450           0.910
11             632       92400           0.920
10             551       93350           0.930
9              470       94300           0.940
8              390       95250           0.950
7              311       96200           0.960
6              232       97150           0.970
5              154       98100           0.980
4              115       98575           0.985
3               77       99050           0.990
2               38       99525           0.995
1               19       99763           0.9975
Surface          0      100000           1.000
Table 2.1 WRF layers and their approximate height above ground level.
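The pressure column in Table 2.1 follows from the terrain-following sigma definition. The snippet below is a minimal sketch, assuming the 5,000 Pa model top (50 mb) and the 100,000 Pa reference surface pressure implied by the table values.

```python
# Reconstruct the Table 2.1 pressure column from the sigma values.
# Assumes p = sigma * (p_sfc - p_top) + p_top with p_top = 5,000 Pa (50 mb)
# and a reference surface pressure of 100,000 Pa, which reproduces the
# tabulated pressures.
P_TOP = 5_000.0     # Pa, model top (50 mb)
P_SFC = 100_000.0   # Pa, reference surface pressure

def sigma_to_pressure(sigma, p_sfc=P_SFC, p_top=P_TOP):
    """Pressure (Pa) at a given sigma level."""
    return sigma * (p_sfc - p_top) + p_top

if __name__ == "__main__":
    for sigma in (0.000, 0.050, 0.740, 0.9975, 1.000):
        print(f"sigma={sigma:6.4f} -> {sigma_to_pressure(sigma):9.1f} Pa")
    # Expected: 5000.0, 9750.0, 75300.0, 99762.5 (printed as 99763), 100000.0
```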
Figure 2.1. Map of WRF model domain: 12US.
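For illustration, the grid geometry described above can be reproduced with pyproj. The placement of the grid origin (domain centered on the projection origin) and the spherical earth radius of 6,370 km are common WRF conventions assumed here, not settings documented in this report.

```python
# Minimal sketch of the 12US grid: Lambert conformal projection centered at
# (-97, 40) with true latitudes of 33N and 45N, and 412 x 372 cells at 12 km.
# Assumes the domain is centered on the projection origin; the exact corner
# offsets used in the actual WPS namelist are not given in this report.
import numpy as np
from pyproj import Proj

lcc = Proj(proj="lcc", lat_1=33.0, lat_2=45.0, lat_0=40.0, lon_0=-97.0,
           a=6370000.0, b=6370000.0)  # spherical earth commonly used by WRF

nx, ny, dx = 412, 372, 12_000.0  # cells in X, cells in Y, grid spacing (m)

# Cell-center coordinates, assuming the grid is centered on (lat_0, lon_0).
x = (np.arange(nx) - (nx - 1) / 2.0) * dx
y = (np.arange(ny) - (ny - 1) / 2.0) * dx
xx, yy = np.meshgrid(x, y)
lon, lat = lcc(xx, yy, inverse=True)  # geographic coordinates of cell centers
print(lon.shape, float(lat.min()), float(lat.max()))
```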
3. MODEL PERFORMANCE DESCRIPTION
The WRF model simulations were evaluated to determine whether the output fields represent a
reasonable approximation of the actual meteorology that occurred during the modeling period.
Identifying and quantifying biases and errors in these fields allows for a downstream assessment of how the
air quality modeling results are impacted by the meteorological data. For the purposes of this
assessment, 2-meter temperature and mixing ratio, 10-meter wind speed and direction, and
shortwave radiation are quantitatively evaluated. A qualitative and quantitative evaluation of
precipitation is also provided.
The observation database for surface-based temperature, wind speed and direction, and mixing
ratio is based on measurements made at United States (i.e., National Weather Service) and
Canadian (i.e., Environment Canada) airports. The observational dataset (ds472 network) is
available from NCAR. Monitors used for evaluation are shown in Figure 3.1.
Figure 3.1 Stations used for model performance: ds472 network.
Shortwave downward radiation measurements are taken at Surface Radiation Budget Network
(SURFRAD) (https://www.esrl.noaa.gov/gmd/grad/surfrad/index.html) and SOLRAD (formerly
ISIS) (https://www.esrl.noaa.gov/gmd/grad/solrad/index.html) monitor locations. The
SURFRAD network consists of 7 sites and the SOLRAD network consists of 9 sites across the
United States (see Figure 3.2). Both networks are operated by the National Oceanic and
Atmospheric Administration (NOAA), with SURFRAD sites existing as a subset of SOLRAD
monitors that provide higher level radiation information not used in this evaluation.
Figure 3.2. Location of SOLRAD and SURFRAD radiation monitors.
Rainfall amounts are estimated by the Parameter-elevation Relationships on Independent
Slopes Model (PRISM), which uses an elevation-based regression model to analyze
precipitation. PRISM's horizontal resolution is approximately 2 to 4 km, and the analysis is
re-projected to the WRF modeling domain for direct comparison to model estimates. The rainfall
analysis is limited to the contiguous United States because the PRISM model utilizes elevation and
measured precipitation data at automated weather stations.
Model performance (i.e., temperature, wind speed, and mixing ratio) is described using
quantitative metrics: mean bias, mean (gross) error, fractional bias, and fractional error (Boylan
and Russell, 2006). These metrics are useful because they describe model performance in the
measured units of the meteorological variable and as a normalized percentage. Since wind
direction is reported in compass degrees, estimating performance metrics for wind direction is
problematic as modeled and observed northerly winds may be similar but differences would
result in a very large artificial bias. For example, the absolute difference in a northerly wind
direction measured in compass degrees of 1° and 359° is 358° when the actual difference is only
2°. To address this issue, wind field displacement, or the difference in the U and V wind components
between modeled (M) and observed (O) values, is used to assess wind vector performance
(Equation 1). Performance is best when this metric approaches 0.

(1) Wind displacement (km) = [(Um − Uo) + (Vm − Vo)] × (1 km/1000 m) × (3600 s/hr) × (1 hr)
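The statistics reported in the tables that follow (MB, MAE, NMB, NME, RMSE) and the wind displacement of Equation 1 can be written compactly. The sketch below is illustrative rather than the exact code used for this evaluation: it assumes paired hourly model and observation arrays, and it interprets the displacement as the magnitude of the hourly U/V vector difference converted to kilometers over one hour.

```python
# Illustrative implementations of the evaluation statistics reported in
# Tables 3.1.1, 3.2.1, and 3.3.1, plus an hourly wind displacement based on
# Equation 1. Assumes `mod` and `obs` are paired 1-D arrays of hourly values.
import numpy as np

def mean_bias(mod, obs):             # MB, in the variable's units
    return np.mean(mod - obs)

def mean_abs_error(mod, obs):        # MAE (mean gross error)
    return np.mean(np.abs(mod - obs))

def normalized_mean_bias(mod, obs):  # NMB, percent
    return 100.0 * np.sum(mod - obs) / np.sum(obs)

def normalized_mean_error(mod, obs): # NME, percent
    return 100.0 * np.sum(np.abs(mod - obs)) / np.sum(obs)

def rmse(mod, obs):                  # root mean square error
    return np.sqrt(np.mean((mod - obs) ** 2))

def wind_displacement_km(u_mod, u_obs, v_mod, v_obs):
    """One-hour wind displacement (km) from U/V differences (m/s).

    Interpreted here as the magnitude of the hourly vector difference,
    converted with the (1 km/1000 m)*(3600 s/hr)*(1 hr) factors of Equation 1.
    """
    du = u_mod - u_obs
    dv = v_mod - v_obs
    return np.hypot(du, dv) * 3600.0 / 1000.0
```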
Rainfall performance is examined spatially using side-by-side comparisons of monthly total
rainfall plots. The WRF model outputs predictions approximately 15 meters above the surface,
while observations are taken at 10 meters. WRF output also represents near-instantaneous values
(90-second time step), as opposed to the longer averaging times used at monitoring stations. These
differences should be considered when interpreting model performance metrics.
3.1 Model Performance for Winds
WRF-predicted wind speed estimates are compared to surface-based measurements made in
the ds472 network described earlier and shown below in Figure 3.1.1. Regional analysis of
statistical metrics for wind speed performance by quarter1 is shown in Table 3.1.1.
WRF tends to slightly overpredict wind speeds in the early morning and afternoon hours, while
slightly underpredicting wind speeds in the late evening and overnight hours. There is no
significant seasonal variability noted in terms of wind speed.
The monthly spatial distributions of the wind speed biases (m/s) for all hours (Figures 3.1.2-
3.1.5) are presented. In general, WRF slightly overpredicts (0.25 to 0.5 m/s) across much of the
eastern US. Conversely, WRF tends to underpredict (-0.25 to -1 m/s) wind speeds in the
western US, which persists across much of the year. As noted above, these biases generally
persist regardless of changes in season.
1 Quarters are Q1 (January, February, March), Q2 (April, May, June), Q3 (July, August, September), and Q4
(October, November, December).
Figure 3.1.1. Distribution of hourly bias by hour of day (GMT) and of hourly bias, error, fractional bias, and
fractional error for wind speed by month for the 12US domain.
Figure 3.1.2. Spatial distribution of wind speed bias (m/s) across all hours for the months of
January, February, and March (top to bottom).
Figure 3.1.3. Spatial distribution of wind speed bias (m/s) across all hours for the months of
April, May, and June (top to bottom).
Figure 3.1.4. Spatial distribution of wind speed bias (m/s) across all hours for the months of
July, August, and September (top to bottom).
Figure 3.1.5. Spatial distribution of wind speed bias (m/s) across all hours for the months of
October, November, and December (top to bottom).
Climate Region       Season  Mean Obs  Mean Mod  MAE    MB      NMB (%)  NME (%)  RMSE
Northeast            Q1      4.58      4.24      1.37   0.05    1.18     29.89    1.93
                     Q2      4.04      3.75      1.22   0.05    1.24     30.11    1.72
                     Q3      3.4       2.96      1.03   -0.02   -0.57    30.37    1.48
                     Q4      4.08      3.71      1.26   0.12    2.89     30.81    1.82
N. Rockies & Plains  Q1      5.16      4.24      1.48   -0.69   -13.37   28.71    2
                     Q2      5.04      4.26      1.39   -0.55   -10.91   27.48    1.86
                     Q3      4.13      3.51      1.25   -0.38   -9.28    30.35    1.71
                     Q4      5.27      4.47      1.43   -0.55   -10.48   27.19    1.95
Northwest            Q1      4.06      3.34      1.52   -0.39   -9.72    37.34    2.02
                     Q2      3.96      3.35      1.33   -0.29   -7.45    33.59    1.75
                     Q3      3.47      2.87      1.15   -0.32   -9.3     33.19    1.51
                     Q4      3.77      3.08      1.38   -0.34   -9.02    36.6     1.83
Ohio Valley          Q1      4.55      4.17      1.11   -0.11   -2.48    24.44    1.46
                     Q2      4.27      3.9       1.11   -0.01   -0.28    26.04    1.48
                     Q3      3.14      2.7       0.89   -0.05   -1.47    28.19    1.18
                     Q4      4.08      3.71      1.01   0.01    0.19     24.72    1.32
South                Q1      4.76      4.21      1.22   -0.3    -6.34    25.67    1.64
                     Q2      4.72      4.18      1.25   -0.21   -4.38    26.39    1.68
                     Q3      3.8       3.26      1.05   -0.17   -4.56    27.67    1.43
                     Q4      4.27      3.69      1.08   -0.21   -4.82    25.3     1.45
Southeast            Q1      3.79      3.56      1.16   0.22    5.77     30.73    1.54
                     Q2      3.64      3.34      1.14   0.17    4.72     31.42    1.52
                     Q3      3.19      2.72      1.04   0.01    0.47     32.51    1.42
                     Q4      3.41      3.09      1.05   0.21    6.19     30.85    1.4
Southwest            Q1      4.64      3.79      1.64   -0.57   -12.25   35.4     2.25
                     Q2      4.74      3.88      1.62   -0.61   -12.84   34.19    2.19
                     Q3      3.94      3.08      1.52   -0.67   -16.9    38.59    2.07
                     Q4      4.34      3.47      1.57   -0.59   -13.68   36.22    2.2
Upper Midwest        Q1      4.61      4.3       1.15   -0.02   -0.46    24.85    1.52
                     Q2      4.4       4.16      1.17   0.08    1.77     26.64    1.54
                     Q3      3.48      3.36      1.02   0.27    7.69     29.18    1.34
                     Q4      4.52      4.43      1.17   0.23    5.17     25.8     1.54
West                 Q1      4.08      3.39      1.46   -0.31   -7.66    35.76    1.98
                     Q2      4.32      3.61      1.36   -0.37   -8.54    31.47    1.82
                     Q3      3.79      3.01      1.26   -0.5    -13.26   33.18    1.68
                     Q4      3.54      2.81      1.31   -0.36   -10.14   36.89    1.77
Table 3.1.1. Mean observed, mean modeled, mean bias (MB), mean absolute error (MAE),
normalized mean bias (NMB), normalized mean error (NME), and root mean square error
(RMSE) for wind speed (m/s).
Wind vector displacement (km) is presented below (Figure 3.1.6) utilizing the ds472
observation network described earlier. These plots show the entire distribution of hourly wind
displacement by month and by hour of the day. Overall, model performance is adequate in
terms of wind vector differences. The average wind displacement for the WRF simulation is
around 5 km for all months and hours of the day. The interquartile ranges are roughly 2-10 km.
As the displacement is generally less than the 12 km resolution of the model, minimal impacts due to
displacement of wind vectors are expected.
Figure 3.1.6. Wind displacement (km): distribution of hourly values by month and by hour of day.
3.2 Temperature
Temperature estimates are compared to the ds472 observation network described earlier and
are presented below (Figure 3.2.1). Regional analysis of statistical metrics for temperature
performance by quarter is shown in Table 3.2.1.
Overall, WRF slightly overpredicts temperatures across most months of the year. The range of
biases decreases slightly during the late spring and early summer months (April-July) compared
to the rest of the year, with the interquartile range (IQR) becoming more tightly centered
around zero. Model error decreases considerably during the late spring and much of the
summer, as well. Overall, with an average IQR of +/- 1 degree, this is considered adequate
model performance.
In Figures 3.2.2-3.2.5, the spatial distribution of monthly biases is presented across all hours. WRF
generally overpredicts temperatures slightly across most months of the year. A more noticeable
overprediction is noted during the months of July, October, and December, with an
overprediction on the order of 1 to 1.5 degrees. In areas of the western US, performance for
temperature is mixed, with persistent significant overpredictions and underpredictions
observed in varying locations.
Figure 3.2.1. Distribution of hourly bias by hour and hourly bias, error, fractional bias, and
fractional error for temperature by month.
Figure 3.2.2. Spatial distribution of temperature bias (C) across all hours for the months of
January, February, and March (top to bottom).
Figure 3.2.3. Spatial distribution of temperature bias (C) across all hours for the months of April,
May, and June (top to bottom).
Figure 3.2.4. Spatial distribution of temperature bias (C) across all hours for the months of July,
August, and September (top to bottom).
Figure 3.2.5. Spatial distribution of temperature bias (C) across all hours for the months of
October, November, and December (top to bottom).
Climate Region       Season  Mean Obs  Mean Mod  MAE    MB      NMB (%)  NME (%)  RMSE
Northeast            Q1      274.49    --        1.77   -0.18   -0.06    0.64     2.35
                     Q2      288.27    288.34    1.5    0.07    0.02     0.52     2.03
                     Q3      293.59    294.02    1.33   0.43    0.15     0.45     1.79
                     Q4      279.4     279.65    1.67   0.25    0.09     0.6      2.21
N. Rockies & Plains  Q1      271.24    271.54    2.07   0.31    0.11     0.76     2.78
                     Q2      286.57    286.86    1.54   0.29    0.1      0.54     2.02
                     Q3      293       293.36    1.67   0.36    0.12     0.57     2.19
                     Q4      275.13    275.56    1.98   0.42    0.15     0.72     2.58
Northwest            Q1      275.27    275.45    1.83   0.17    0.06     0.66     2.51
                     Q2      286.22    286.42    1.51   0.19    0.07     0.53     2
                     Q3      292.78    293.35    1.87   0.57    0.2      0.64     2.48
                     Q4      278.28    278.72    1.79   0.45    0.16     0.64     2.38
Ohio Valley          Q1      278.25    278.29    1.59   0.03    0.01     0.57     2.06
                     Q2      291.21    291.54    1.35   0.33    0.11     0.46     1.77
                     Q3      295.12    295.55    1.25   0.43    0.15     0.42     1.65
                     Q4      280.53    280.84    1.61   0.3     0.11     0.57     2.06
South                Q1      286.41    286.58    1.75   0.17    0.06     0.61     2.25
                     Q2      295.37    295.62    1.28   0.24    0.08     0.43     1.72
                     Q3      299.39    299.63    1.22   0.24    0.08     0.41     1.62
                     Q4      287.16    287.48    1.71   0.32    0.11     0.59     2.2
Southeast            Q1      285.95    286       1.73   0.04    0.02     0.6      2.25
                     Q2      295.09    295.32    1.33   0.23    0.08     0.45     1.76
                     Q3      298.32    298.54    1.21   0.21    0.07     0.41     1.6
                     Q4      287.25    287.48    1.61   0.23    0.08     0.56     2.09
Southwest            Q1      278.36    278.69    2.1    0.34    0.12     0.75     2.79
                     Q2      289.61    289.85    1.91   0.24    0.08     0.66     2.54
                     Q3      294.72    295.04    1.98   0.32    0.11     0.67     2.64
                     Q4      280.85    281.55    2.32   0.7     0.25     0.83     3
Upper Midwest        Q1      270.63    270.57    1.46   -0.06   -0.02    0.54     1.92
                     Q2      286.82    287       1.47   0.17    0.06     0.51     1.94
                     Q3      292.36    292.83    1.32   0.47    0.16     0.45     1.74
                     Q4      274.83    275.11    1.51   0.28    0.1      0.55     1.98
West                 Q1      283.84    283.97    1.61   0.13    0.05     0.57     2.18
                     Q2      291.64    291.75    1.68   0.11    0.04     0.57     2.24
                     Q3      296.71    296.89    1.81   0.18    0.06     0.61     2.45
                     Q4      286.54    287.07    2.1    0.53    0.18     0.73     2.8
Table 3.2.1. Mean observed, mean modeled, mean bias (MB), mean absolute error (MAE),
normalized mean bias (NMB), normalized mean error (NME), and root mean square error
(RMSE) for temperature (K).
3.3 Mixing Ratio
Water mixing ratio estimates are compared to the ds472 observation network described earlier
and are presented below (Figure 3.3.1). Regional analysis of statistical metrics for water vapor
mixing ratio performance by quarter is shown in Table 3.3.1.
The WRF simulation slightly overpredicts moisture across most hours of the day, with a more
noticeable overprediction during the late evening and overnight hours. Additionally, there is
more uncertainty in model predictions during the spring and summer months. This increase in
error is explained by the increased convective activity and influx of moist air masses that are
typical of that time of year. In general, WRF performance was adequate for water vapor mixing
ratio.
The monthly spatial distributions of the mixing ratio bias across all hours are shown in Figures
3.3.2-3.3.5. As noted in the earlier figures, a general overprediction of moisture is observed
across much of the year. Some slight variations appear across regions, with a noticeable
underprediction of moisture that persists across the Southeast for much of the year. Mixing
ratio is noticeably overpredicted during the summer months across the western
US, with biases of 1-2 g/kg.
Figure 3.3.1. Distribution of hourly bias by hour and hourly bias, error, fractional bias, and
fractional error for water vapor mixing ratio by month.
Figure 3.3.2. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of January, February, and March (top to bottom).
Figure 3.3.3. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of April, May, and June (top to bottom).
Figure 3.3.4. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of July, August, and September (top to bottom).
Figure 3.3.5. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of October, November, and December (top to bottom).
Climate Region       Season  Mean Obs  Mean Mod  MAE    MB      NMB (%)  NME (%)  RMSE
Northeast            Q1      3.23      3.61      0.53   0.38    11.83    16.44    0.74
                     Q2      8.02      8.39      0.88   0.37    4.63     10.97    1.19
                     Q3      11.83     11.87     0.92   0.05    0.39     7.75     1.22
                     Q4      5.2       5.47      0.56   0.27    5.14     10.82    0.76
N. Rockies & Plains  Q1      2.84      2.99      0.43   0.15    5.17     15.15    0.6
                     Q2      6.28      6.63      0.87   0.35    5.61     13.89    1.21
                     Q3      9.24      10.06     1.31   0.83    8.95     14.14    1.72
                     Q4      3.37      3.56      0.44   0.19    5.69     13.19    0.61
Northwest            Q1      3.98      4.19      0.5    0.22    5.46     12.6     0.67
                     Q2      6.21      6.35      0.71   0.14    2.32     11.45    0.98
                     Q3      7.6       8.12      1.08   0.52    6.82     14.21    1.48
                     Q4      4.63      4.76      0.54   0.13    2.89     11.7     0.72
Ohio Valley          Q1      4.45      4.64      0.55   0.19    4.28     12.34    0.78
                     Q2      9.31      9.82      1.01   0.51    5.45     10.82    1.36
                     Q3      12.87     13.19     1.04   0.31    2.43     8.11     1.4
                     Q4      5.39      5.68      0.6    0.28    5.26     11.13    0.82
South                Q1      7.2       7.4       0.79   0.21    2.88     11.04    1.12
                     Q2      12.23     12.61     1.11   0.38    3.09     9.04     1.51
                     Q3      15.44     15.85     1.28   0.4     2.62     8.29     1.69
                     Q4      7.81      8         0.8    0.18    2.34     10.24    1.1
Southeast            Q1      6.86      7.12      0.83   0.26    3.73     12.04    1.12
                     Q2      12.42     12.67     1.13   0.25    2        9.11     1.51
                     Q3      16.06     16.31     1.27   0.26    1.59     7.89     1.67
                     Q4      8.61      8.61      0.8    0       0.02     9.28     1.09
Southwest            Q1      3.45      3.75      0.61   0.3     8.8      17.78    0.82
                     Q2      4.72      5.28      1.03   0.57    12.05    21.77    1.38
                     Q3      8.64      9.62      1.5    0.98    11.37    17.38    1.92
                     Q4      3.49      3.9       0.73   0.41    11.88    20.97    1.03
Upper Midwest        Q1      2.85      2.98      0.38   0.14    4.79     13.35    0.54
                     Q2      7.03      7.64      0.96   0.6     8.55     13.61    1.3
                     Q3      10.98     11.36     0.97   0.38    3.45     8.81     1.3
                     Q4      3.9       4.18      0.47   0.28    7.17     11.96    0.66
West                 Q1      5.84      6         0.7    0.16    2.77     12.03    1
                     Q2      6.92      7.08      0.93   0.16    2.27     13.37    1.29
                     Q3      9.13      9.67      1.22   0.54    5.87     13.41    1.68
                     Q4      5.24      5.52      0.95   0.28    5.42     18.24    1.37
Table 3.3.1. Mean observed, mean modeled, mean bias (MB), mean absolute error (MAE),
normalized mean bias (NMB), normalized mean error (NME), and root mean square error
(RMSE) for water vapor mixing ratio (g/kg).
3.4 Precipitation
Monthly total rainfall is plotted for each grid cell to assess how well the model captures the
spatial variability and magnitude of convective and non-convective rainfall. As described earlier,
the PRISM estimations for rainfall are only within the continental United States. WRF rainfall
estimates by month are shown for all grid cells in the domain. Monthly total estimates are
shown in Figures 3.4.1 through 3.4.12.
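As an illustration of how such monthly totals can be assembled, the sketch below sums WRF's standard cumulative precipitation fields (RAINC and RAINNC) over a month and differences the result against a PRISM monthly total already regridded to the 12US grid. The file handling and the pre-regridded PRISM array are assumptions about the workflow, not a description of the exact processing used for this report.

```python
# Illustrative sketch: monthly total precipitation (inches) from WRF's
# cumulative convective (RAINC) and grid-scale (RAINNC) rain fields, and the
# difference against a PRISM monthly total regridded to the WRF grid.
# Assumes a continuous simulation, so the monthly total is the end-of-month
# accumulation minus the start-of-month accumulation.
import numpy as np
from netCDF4 import Dataset

MM_PER_INCH = 25.4

def wrf_monthly_total_inches(first_file, last_file):
    """Monthly total precip (in) from the first and last wrfout files of a month."""
    with Dataset(first_file) as f0, Dataset(last_file) as f1:
        start = f0["RAINC"][0] + f0["RAINNC"][0]    # mm, cumulative at month start
        end = f1["RAINC"][-1] + f1["RAINNC"][-1]    # mm, cumulative at month end
    return (end - start) / MM_PER_INCH

def precip_difference(wrf_total_in, prism_total_in):
    """WRF minus PRISM monthly totals (in) on the common 12US grid."""
    return np.asarray(wrf_total_in) - np.asarray(prism_total_in)
```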
In general, WRF performs adequately in terms of the spatial patterns and magnitude of
precipitation across the US throughout the year. WRF struggles with representing precipitation
in areas of complex terrain (e.g., northern CA), particularly during the late winter and early
spring months. In general, the simulation overpredicts precipitation across the western areas of
the country during most months, with notable overpredictions of precipitation during periods
of enhanced convective activity. Significant overpredictions are noted in the south-central US
during May and across the desert Southwest and Front Range of the Rockies during July and
August. The Deep South has a noted underprediction that persists across much of the year.
Figure 3.4.1. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for January.
Figure 3.4.2. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for February.
Figure 3.4.3. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for March.
Figure 3.4.4. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for April.
Figure 3.4.5. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for May.
Figure 3.4.6. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for June.
Figure 3.4.7. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for July.
Figure 3.4.8. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for August.
Figure 3.4.9. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for September.
Figure 3.4.10. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for October.
Figure 3.4.11. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for November.
Figure 3.4.12. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for December.
3.5 Solar Radiation
Photosynthetically activated radiation (PAR) is a fraction of shortwave downward radiation and
is an important input for the biogenic emissions model for estimating isoprene (Carlton and
Baker, 2011). Isoprene emissions are important for regional ozone chemistry and play a role in
secondary organic aerosol formation. Radiation performance evaluation also gives an indirect
assessment of how well the model captures cloud formation during daylight hours.
Shortwave downward radiation estimates are compared to surface-based measurements made
at SURFRAD and SOLRAD network sites (Figure 3.5.1).
Overall, WRF has little bias in shortwave radiation predictions during the fall and winter
months, but overpredicts slightly in general across most months. Biases tend to grow during the
spring and peak in the summer, though the spread in overpredictions tends to be less than 50
W/m2 on average, with a median bias close to zero.
More variability is noted on an hourly basis. WRF tends to overpredict shortwave radiation
across all daytime hours. The median overprediction at the time of greatest incoming solar
radiation is less than 50 W/m2. A significant spread in the model biases is noted in the
afternoon hours during peak radiation. These errors are likely attributable to the model being
unable to accurately simulate cloud features at subgrid (<12km) scales.
Figure 3.5.1. Distribution of hourly bias for shortwave radiation (W/m2) by month (top) and by
hour of day (GMT) (bottom) for the 12US domain in 2017.
4. CLIMATE REPRESENTATIVENESS OF 2017
Figures 4.1 and 4.2 show the divisional rankings for observed temperatures across the US for
2017. A climatic representation of the precipitation for 2017 is shown in Figures 4.3 and 4.4. We
can use these plots to determine whether the conditions in a specific year are particularly
anomalous. Additionally, we can assess a specific year's suitability for use in photochemical
modeling in terms of its conduciveness to photochemical production of secondary pollutants.
Temperatures in 2017 were above average to much above average across several months of
the year, with record warmth observed in the central and eastern US during the late winter and
early spring months. Normal to slightly below normal conditions were observed during the
summer months for a large portion of the country, with much below average temperatures in
late summer.
In general, 2017 was wetter than normal for most of the year, though below average
precipitation was observed for the late fall and winter months.
Figure 4.1. Climatic temperature rankings by climate division: January to June 2017.
http://www.ncdc.noaa.gov/temp-and-precip/maps.php
Figure 4.2. Climatic temperature rankings by climate division: July to December 2017.
http://www.ncdc.noaa.gov/temp-and-precip/maps.php
Figure 4.3. Climatic rainfall rankings by climate division: January to June 2017.
http://www.ncdc.noaa.gov/temp-and-precip/maps.php
Figure 4.4. Climatic rainfall rankings by climate division: July to December 2017.
https://www.ncdc.noaa.gov/sotc/
5. REFERENCES
Boylan, J.W., Russell, A.G., 2006. PM and light extinction model performance metrics, goals, and
criteria for three-dimensional air quality models. Atmospheric Environment 40, 4946-4959.
Carlton, A.G., Baker, K.R., 2011. Photochemical Modeling of the Ozark Isoprene Volcano:
MEGAN, BEIS, and Their Impacts on Air Quality Predictions. Environmental Science &
Technology 45, 4438-4445.
Cooper, O.R., Stohl, A., Hubler, G., Hsie, E.Y., Parrish, D.D., Tuck, A.F., Kiladis, G.N., Oltmans, S.J.,
Johnson, B.J., Shapiro, M., Moody, J.L., Lefohn, A.S., 2005. Direct Transport of Midlatitude
Stratospheric Ozone into the Lower Troposphere and Marine Boundary Layer of the Pacific
Ocean. Journal of Geophysical Research - Atmospheres 110, D23310,
doi:10.1029/2005JD005783.
ENVIRON, 2008. User's Guide Comprehensive Air Quality Model with Extensions. ENVIRON
International Corporation, Novato.
Gilliam, R.C., Pleim, J.E., 2010. Performance Assessment of New Land Surface and Planetary
Boundary Layer Physics in the WRF-ARW. Journal of Applied Meteorology and Climatology 49,
760-774.
Heath, Nicholas K., Pleim, J.E., Gilliam, R., Kang, D., 2016. A simple lightning assimilation
technique for improving retrospective WRF simulations. Journal of Advances in Modeling Earth
Systems. 8. 10.1002/2016MS000735.
Langford, A.O., Reid, S.J., 1998. Dissipation and Mixing of a Small-Scale Stratospheric Intrusion
in the Upper Troposphere. Journal of Geophysical Research 103, 31265-31276.
Otte, T.L., Pleim, J.E., 2010. The Meteorology-Chemistry Interface Processor (MCIP) for the
CMAQ modeling system: updates through MCIPv3.4.1. Geoscientific Model Development 3,
243-256.
Skamarock, W.C., Klemp, J.B., Dudhia, J., Gill, D.O., Barker, D.M., Duda, M.G., Huang, X., Wang,
W., Powers, J.G., 2008. A Description of the Advanced Research WRF Version 3.
Stammer, D., F.J. Wentz, and C.L. Gentemann, 2003, Validation of Microwave Sea Surface
Temperature Measurements for Climate Purposes, J. Climate, 16, 73-87.
United States Environmental Protection Agency
Office of Air Quality Planning and Standards
Air Quality Assessment Division
Research Triangle Park, NC

Publication No. EPA-454/R-22-006
September 2022