Meteorological Model Performance for Annual 2018 Simulation WRF v3.8

EPA-454/R-23-006
July 2023

U.S. Environmental Protection Agency
Office of Air Quality Planning and Standards
Air Quality Assessment Division
Research Triangle Park, NC
1. INTRODUCTION
The Weather Research and Forecasting model (WRF) was applied for the entire year of 2018 to
generate meteorological data to support emissions and photochemical modeling applications
for this year. The WRF meteorological fields will be converted to air quality modeling input data
and used to support assessments of ozone, PM2.5, visibility, and a variety of toxics.
The WRF model was applied to the 12 km continental United States (12US) scale domain,
initialized directly from meteorological analysis data. Model parameterizations and options
outlined in this document were chosen based on a series of sensitivity runs performed by the U.S.
Environmental Protection Agency (USEPA) Office of Research and Development, which identified
an optimal configuration based on temperature, mixing ratio, and wind field performance. All WRF
simulations were done by CSRA under contract to the USEPA.
2. MODEL CONFIGURATION
Version 3.8 of the WRF model, Advanced Research WRF (ARW) core (Skamarock et al., 2008), was
used to generate the 2018 simulation. Selected physics options include the Pleim-Xiu land
surface model, the Asymmetric Convective Model version 2 planetary boundary layer scheme, the
Kain-Fritsch cumulus parameterization utilizing the moisture-advection trigger (Ma and Tan, 2009),
Morrison double-moment microphysics, and the RRTMG longwave and shortwave radiation
schemes (Gilliam and Pleim, 2010).
The 12US WRF model was initialized using the 12km North American Model (12NAM) analysis
product provided by National Climatic Data Center (NCDC). Where 12NAM data was
unavailable, the 40km Eta Data Assimilation System (EDAS) analysis (ds609.2) from the National
Center for Atmospheric Research (NCAR) was used. Analysis nudging for temperature, wind,
and moisture was applied above the boundary layer only. The model simulations were
conducted continuously. The 'ipxwrf' program was used to initialize deep soil moisture at the
start of the run using a 10-day spinup period (Gilliam and Pleim, 2010). Landuse and land cover
data were based on the 2011 National Land Cover Database (NLCD 2011)1. Sea surface
temperatures were ingested from the Group for High Resolution Sea Surface Temperatures
(GHRSST) (Stammer et al., 2003) 1km SST data.
Additionally, lightning data assimilation was utilized to suppress (force) deep convection where
lightning is absent (present) in observational data. This method is described by Heath et al.
(2016) and was employed to help improve precipitation estimates generated by the model.
1 Subsequent review of the landuse data indicated that the data provided from NCAR had the incorrect byte
allocation, resulting in slight differences of fractional landuse allocation. No significant performance degradation
resulted from the use of this data.
Figure 2.1 shows the 12US domain, which utilized a Lambert conformal projection centered at
(-97, 40) with true latitudes of 33 and 45 degrees north. The 12US domain contains 412 cells in
the X direction and 372 cells in the Y direction. The atmosphere is resolved with 35 vertical
layers up to 50 mb (see Table 2.1), with the thinnest layers nearest the surface to better
resolve the planetary boundary layer (PBL).
WRF Layer   Height (m)   Pressure (Pa)   Sigma
35          17,556        5000           0.000
34          14,780        9750           0.050
33          12,822       14500           0.100
32          11,282       19250           0.150
31          10,002       24000           0.200
30           8,901       28750           0.250
29           7,932       33500           0.300
28           7,064       38250           0.350
27           6,275       43000           0.400
26           5,553       47750           0.450
25           4,885       52500           0.500
24           4,264       57250           0.550
23           3,683       62000           0.600
22           3,136       66750           0.650
21           2,619       71500           0.700
20           2,226       75300           0.740
19           1,941       78150           0.770
18           1,665       81000           0.800
17           1,485       82900           0.820
16           1,308       84800           0.840
15           1,134       86700           0.860
14             964       88600           0.880
13             797       90500           0.900
12             714       91450           0.910
11             632       92400           0.920
10             551       93350           0.930
9              470       94300           0.940
8              390       95250           0.950
7              311       96200           0.960
6              232       97150           0.970
5              154       98100           0.980
4              115       98575           0.985
3               77       99050           0.990
2               38       99525           0.995
1               19       99763           0.9975
Surface          0      100000           1.000
Table 2.1 WRF layers and their approximate height above ground level.
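The pressure column in Table 2.1 follows directly from the sigma values through the standard terrain-following relationship. As an illustrative check on the table (not part of the original processing), a minimal Python sketch is shown below; it assumes the 100,000 Pa reference surface pressure and 5,000 Pa model top listed in the table, whereas actual WRF pressures vary with terrain and weather.

```python
# Sketch: reproduce the pressure column of Table 2.1 from the sigma values.
# Assumes the reference surface pressure (100,000 Pa) and model top (5,000 Pa)
# listed in the table; values are approximate, as in the table itself.

P_SURFACE = 100000.0  # Pa
P_TOP = 5000.0        # Pa (50 mb model top)

def sigma_to_pressure(sigma: float) -> float:
    """Terrain-following coordinate: p = p_top + sigma * (p_surface - p_top)."""
    return P_TOP + sigma * (P_SURFACE - P_TOP)

if __name__ == "__main__":
    for sigma in (0.000, 0.050, 0.500, 0.9975, 1.000):
        print(f"sigma={sigma:6.4f} -> {sigma_to_pressure(sigma):8.0f} Pa")
    # sigma=0.0500 -> 9750 Pa, matching layer 34 in Table 2.1
```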
Figure 2.1. Map of WRF model domain: 12US.
3. MODEL PERFORMANCE DESCRIPTION
The WRF model simulations were evaluated to determine whether the output fields represent a
reasonable approximation of the actual meteorology that occurred during the modeling period.
Identifying and quantifying biases and errors in these fields allows for a downstream assessment of how the
meteorological data impact the air quality modeling results. For the purposes of this
assessment, 2-meter temperature and mixing ratio, 10-meter wind speed and direction, and
shortwave radiation are quantitatively evaluated. A qualitative and quantitative evaluation of
precipitation is also provided.
The observation database for surface-based temperature, wind speed and direction, and mixing
ratio is based on measurements made at United States (i.e., National Weather Service) and
Canadian (i.e., Environment Canada) airports. The observational dataset (ds472 network) is
available from NCAR. Monitors used for evaluation are shown in Figure 3.1.
Figure 3.1 Stations used for model performance: ds472 network.
Shortwave downward radiation measurements are taken at surface-based monitor locations
and this data is obtained through the Baseline Surface Radiation Network (BSRN,
https://bsrn.awi.de/). This network is global and a map of the locations used in this evaluation
is shown below (see Figure 3.2).
Figure 3.2. Location of radiation monitors.
Rainfall amounts are estimated by the Parameter-elevation Relationships on Independent
Slopes Model (PRISM), which uses an elevation-based regression model to analyze
precipitation. PRISM's horizontal resolution is approximately 2 to 4 km, and its analysis is
re-projected to the WRF modeling domain for direct comparison to model estimates. The rainfall
analysis is limited to the contiguous United States because PRISM relies on elevation and
measured precipitation data from automated weather stations.
Model performance (i.e., temperature, wind speed, and mixing ratio) is described using
quantitative metrics: mean bias, mean (gross) error, fractional bias, and fractional error (Boylan
and Russell, 2006). These metrics are useful because they describe model performance both in
the measured units of the meteorological variable and as a normalized percentage. Since wind
direction is reported in compass degrees, estimating performance metrics for wind direction is
problematic: modeled and observed northerly winds may be very similar, yet the numerical
difference can produce a large artificial bias. For example, the absolute difference between wind
directions of 1° and 359° is 358°, when the actual difference is only 2°. To address this issue,
wind field displacement, based on the difference in the U and V wind components between
modeled (M) and observed (O) values, is used to assess wind vector performance
(Equation 1). Performance is best when this metric approaches 0.
(1) Wind displacement (km) = sqrt[(Um - Uo)^2 + (Vm - Vo)^2] * (1 km / 1000 m) * (3600 s/hr) * (1 hr)
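To make these definitions concrete, the sketch below computes the bias and error metrics and the hourly wind displacement of Equation 1 from paired model and observation arrays. The array names and the use of the vector-difference magnitude are illustrative assumptions about the calculation, not the exact code used to produce this report.

```python
import numpy as np

# Sketch of the evaluation metrics described above, applied to paired hourly
# model (m) and observed (o) values. Array and function names are illustrative.

def mean_bias(m, o):
    return np.mean(m - o)

def mean_error(m, o):
    return np.mean(np.abs(m - o))

def fractional_bias(m, o):
    # Normalized by the average of the model and observed values, in percent.
    return 100.0 * np.mean(2.0 * (m - o) / (m + o))

def fractional_error(m, o):
    return 100.0 * np.mean(2.0 * np.abs(m - o) / (m + o))

def wind_displacement_km(u_m, u_o, v_m, v_o):
    """Equation 1: displacement (km) implied by the model-minus-observed wind
    vector difference (m/s) accumulated over one hour."""
    du = u_m - u_o
    dv = v_m - v_o
    return np.hypot(du, dv) * (1.0 / 1000.0) * 3600.0 * 1.0

# Example with small synthetic arrays (values are illustrative only):
mod = np.array([4.2, 3.9, 5.1])
obs = np.array([4.0, 4.2, 4.8])
print(mean_bias(mod, obs), mean_error(mod, obs), fractional_bias(mod, obs))
```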
Rainfall performance is examined spatially using side-by-side comparisons of monthly total
rainfall plots. The WRF model outputs predictions approximately 15 meters above the surface,
while observations are made at 10 meters. WRF output also represents near-instantaneous values
(90-second time step), whereas monitoring stations report values over longer averaging times.
Both differences should be considered when interpreting model performance metrics.
3.1 Model Performance for Winds
WRF-predicted wind speed estimates are compared to surface-based measurements made in
the ds472 network described earlier, with results shown below in Figure 3.1.1. Regional analysis of
statistical metrics for wind speed performance by quarter2 is shown in Table 3.1.1.
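The region-by-quarter statistics in Table 3.1.1 can be thought of as grouped summaries of the paired hourly data. A minimal pandas sketch of that aggregation is shown below; the column names and the assignment of stations to climate regions are assumptions about the workflow, not the exact processing used for this report.

```python
import pandas as pd

# Sketch: aggregate paired hourly wind speed values into region/quarter
# statistics like those in Table 3.1.1. Column names ("time", "region",
# "obs", "mod") are assumptions.

def add_quarter(df: pd.DataFrame) -> pd.DataFrame:
    # Q1 = Jan-Mar, Q2 = Apr-Jun, Q3 = Jul-Sep, Q4 = Oct-Dec (see footnote 2).
    df = df.copy()
    month = pd.to_datetime(df["time"]).dt.month
    df["quarter"] = "Q" + ((month - 1) // 3 + 1).astype(str)
    return df

def regional_stats(df: pd.DataFrame) -> pd.DataFrame:
    grouped = add_quarter(df).groupby(["region", "quarter"])
    return grouped.apply(
        lambda g: pd.Series({
            "Mean Obs": g["obs"].mean(),
            "Mean Mod": g["mod"].mean(),
            "MAE": (g["mod"] - g["obs"]).abs().mean(),
            "MB": (g["mod"] - g["obs"]).mean(),
            "NMB": 100.0 * (g["mod"] - g["obs"]).sum() / g["obs"].sum(),
            "NME": 100.0 * (g["mod"] - g["obs"]).abs().sum() / g["obs"].sum(),
            "RMSE": (((g["mod"] - g["obs"]) ** 2).mean()) ** 0.5,
        })
    )
```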
Statistically, WRF appears to perform very well for wind speed as the median value for the bias
is centered around zero for all hours of the day and months of the year. However, there is a
noticeable pattern in the performance when we examine wind speeds on a spatial scale
(Figures 3.1.2-3.1.5).
In general, WRF slightly overpredicts wind speeds (0.25 to 0.5 m/s) across much of the eastern US.
Conversely, WRF tends to underpredict wind speeds (-0.25 to -1 m/s) in the western US. As noted
above, these biases generally persist throughout the year regardless of season.
2 Quarters are Q1 (January, February, March), Q2 (April, May, June), Q3 (July, August, September), and Q4
(October, November, December).
Figure 3.1.1. Distribution of hourly bias by hour and hourly bias, error, fractional bias, and
fractional error for wind speed by month for 12US domain.
Figure 3.1.2. Spatial distribution of wind speed bias (m/s) across all hours for the months of
January, February, and March (top to bottom).
Figure 3.1.3. Spatial distribution of wind speed bias (m/s) across all hours for the months of
April, May, and June (top to bottom).
Figure 3.1.4. Spatial distribution of wind speed bias (m/s) across all hours for the months of
July, August, and September (top to bottom).
Figure 3.1.5. Spatial distribution of wind speed bias (m/s) across all hours for the months of
October, November, and December (top to bottom).
Climate Region        Season   Mean Obs   Mean Mod   MAE    MB      NMB      NME     RMSE
Northeast             Q1       4.49       4.2        1.34   0.08    1.86     29.88   1.9
                      Q2       3.89       3.49       1.18   -0.02   -0.58    30.38   1.7
                      Q3       3.45       3.08       1.07   0.02    0.61     30.84   1.53
                      Q4       4.14       3.75       1.29   0.06    1.46     31.21   1.85
N. Rockies & Plains   Q1       5.04       4.12       1.46   -0.7    -13.8    28.94   1.99
                      Q2       4.93       4.14       1.42   -0.57   -11.5    28.76   1.93
                      Q3       4.21       3.64       1.24   -0.33   -7.95    29.38   1.68
                      Q4       4.77       4.04       1.32   -0.48   -10.08   27.61   1.79
Northwest             Q1       4          3.29       1.42   -0.4    -9.9     35.52   1.89
                      Q2       4.01       3.35       1.32   -0.35   -8.64    32.95   1.75
                      Q3       3.61       2.94       1.17   -0.38   -10.48   32.37   1.53
                      Q4       3.63       2.93       1.32   -0.35   -9.59    36.41   1.77
Ohio Valley           Q1       4.46       4.12       1.12   -0.06   -1.32    25.02   1.54
                      Q2       3.97       3.49       1.09   -0.13   -3.2     27.38   1.48
                      Q3       3.23       2.89       0.92   0.04    1.18     28.48   1.24
                      Q4       3.95       3.63       1.01   0.02    0.58     25.58   1.38
South                 Q1       4.77       4.18       1.23   -0.33   -6.85    25.69   1.65
                      Q2       4.76       4.13       1.24   -0.28   -5.92    26.03   1.68
                      Q3       3.77       3.21       1.06   -0.23   -6.05    28.23   1.45
                      Q4       4.3        3.81       1.11   -0.17   -3.86    25.69   1.49
Southeast             Q1       3.88       3.78       1.21   0.32    8.3      31.12   1.63
                      Q2       3.48       3.12       1.1    0.07    2.12     31.61   1.47
                      Q3       3.05       2.59       1.02   -0.04   -1.26    33.39   1.4
                      Q4       3.51       3.27       1.13   0.27    7.59     32.19   1.55
Southwest             Q1       4.57       3.66       1.62   -0.61   -13.28   35.49   2.25
                      Q2       4.88       3.99       1.63   -0.63   -12.91   33.45   2.2
                      Q3       4.02       3.15       1.53   -0.68   -16.88   38.01   2.08
                      Q4       4.01       3.18       1.45   -0.58   -14.4    36.25   2.01
Upper Midwest         Q1       4.5        4.26       1.14   0.05    1.19     25.34   1.55
                      Q2       4.18       3.85       1.17   0.01    0.35     27.94   1.57
                      Q3       3.59       3.47       1.07   0.28    7.89     29.82   1.44
                      Q4       4.22       4.06       1.11   0.16    3.86     26.35   1.49
West                  Q1       3.78       3.06       1.34   -0.35   -9.23    35.39   1.8
                      Q2       4.34       3.6        1.36   -0.41   -9.4     31.27   1.81
                      Q3       3.84       3.05       1.25   -0.5    -13      32.59   1.66
                      Q4       3.53       2.78       1.29   -0.41   -11.67   36.43   1.74
Table 3.1.1. Mean observed, mean modeled, mean absolute error (MAE), mean bias (MB),
normalized mean bias (NMB), normalized mean error (NME), and root mean square error
(RMSE) for wind speed (m/s).
Wind vector displacement (km) is presented below (Figure 3.1.6) utilizing the ds472
observation network described earlier. These plots show the entire distribution of hourly wind
displacement by month and by hour of the day. Overall, model performance is adequate in
terms of wind vector differences. The average wind displacement for the WRF simulation is
around 5 km for all months and hours of the day. The interquartile ranges are roughly 2-10 km.
As the displacement is generally less than the resolution of the model, minimal impacts due to
displacement of wind vectors are expected.
Figure 3.1.6. Distribution of hourly wind displacement by hour and month.
3.2 Temperature
Temperature estimates are compared to the ds472 observation network described earlier and
are presented below (Figure 3.2.1). Regional analysis of statistical metrics for temperature
performance by quarter is shown in Table 3.2.1.
WRF performs very well in terms of predicting temperature, showing a bias that oscillates
around 0 degrees for most hours of the day and months of the year. Model error decreases
noticeably during the Spring and Summer months. In general, an interquartile range of the bias
within +/- 1 degree represents acceptable model performance at this scale.
In Figures 3.2.2-3.2.5, the spatial distribution of monthly biases is presented across all hours. WRF
generally underpredicts temperatures slightly across the eastern US during the Winter into the
early Spring, with the underprediction persisting longest in the Northeast. A more noticeable
overprediction is noted across the eastern US from June through October, with an
average overprediction of 1 degree. In areas of the western US, performance for temperature is
mixed, with persistent significant overpredictions and underpredictions observed in varying
locations.
Figure 3.2.1. Distribution of hourly bias by hour and hourly bias, error, fractional bias, and
fractional error for temperature by month.
Figure 3.2.2. Spatial distribution of temperature bias (C) across all hours for the months of
January, February, and March (top to bottom).
Figure 3.2.3. Spatial distribution of temperature bias (C) across all hours for the months of
April, May, and June (top to bottom).
Figure 3.2.4. Spatial distribution of temperature bias (C) across all hours for the months of
July, August, and September (top to bottom).
Figure 3.2.5. Spatial distribution of temperature bias (C) across all hours for the months of
October, November, and December (top to bottom).
Climate Region        Season   Mean Obs   Mean Mod   MAE    MB      NMB     NME    RMSE
Northeast             Q1       273.68     273.29     1.74   -0.39   -0.14   0.64   2.34
                      Q2       287.91     287.99     1.58   0.08    0.03    0.55   2.14
                      Q3       295.05     295.43     1.37   0.39    0.13    0.47   1.86
                      Q4       278.53     278.52     1.61   -0.01   -0.01   0.58   2.13
N. Rockies & Plains   Q1       268.76     268.97     2.08   0.21    0.08    0.77   2.77
                      Q2       286.7      286.99     1.58   0.29    0.1     0.55   2.1
                      Q3       292.59     293.05     1.64   0.46    0.16    0.56   2.17
                      Q4       273.73     274.01     1.81   0.29    0.1     0.66   2.4
Northwest             Q1       276.89     276.89     1.57   0       0       0.57   2.1
                      Q2       286.95     287.15     1.52   0.21    0.07    0.53   2.04
                      Q3       292.26     292.71     1.87   0.44    0.15    0.64   2.5
                      Q4       278.9      279.32     1.79   0.42    0.15    0.64   2.41
Ohio Valley           Q1       275.12     275.02     1.62   -0.09   -0.03   0.59   2.13
                      Q2       291.27     291.51     1.42   0.24    0.08    0.49   1.92
                      Q3       296.27     296.67     1.21   0.4     0.14    0.41   1.6
                      Q4       279.56     279.74     1.42   0.18    0.06    0.51   1.88
South                 Q1       283.48     283.63     1.74   0.15    0.05    0.61   2.27
                      Q2       296.22     296.39     1.31   0.17    0.06    0.44   1.78
                      Q3       299.94     300.13     1.25   0.18    0.06    0.42   1.68
                      Q4       285.58     285.79     1.5    0.21    0.07    0.52   1.97
Southeast             Q1       284.01     284.03     1.68   0.02    0.01    0.59   2.2
                      Q2       294.93     295.13     1.33   0.2     0.07    0.45   1.78
                      Q3       299        299.21     1.21   0.2     0.07    0.41   1.61
                      Q4       287.21     287.35     1.53   0.13    0.05    0.53   2.03
Southwest             Q1       277.43     277.83     2.19   0.39    0.14    0.79   2.87
                      Q2       291.17     291.39     2.01   0.22    0.08    0.69   2.68
                      Q3       295.7      296.02     2.03   0.32    0.11    0.69   2.72
                      Q4       278.43     279.1      1.99   0.66    0.24    0.72   2.62
Upper Midwest         Q1       267.93     267.59     1.72   -0.34   -0.13   0.64   2.28
                      Q2       286.64     286.75     1.57   0.11    0.04    0.55   2.14
                      Q3       292.73     293.3      1.37   0.57    0.2     0.47   1.84
                      Q4       273.67     273.79     1.43   0.12    0.04    0.52   1.97
West                  Q1       283.75     283.99     1.74   0.24    0.08    0.61   2.36
                      Q2       291.3      291.37     1.61   0.07    0.02    0.55   2.19
                      Q3       296.68     296.86     1.84   0.19    0.06    0.62   2.52
                      Q4       285.99     286.45     1.91   0.46    0.67    2.51   2.58
Table 3.2.1. Mean observed, mean modeled, mean absolute error (MAE), mean bias (MB),
normalized mean bias (NMB), normalized mean error (NME), and root mean square error
(RMSE) for temperature (K).
3.3 Mixing Ratio
Water mixing ratio estimates are compared to the ds472 observation network described earlier
and are presented below (Figure 3.3.1). Regional analysis of statistical metrics for water vapor
mixing ratio performance by quarter is shown in Table 3.3.1.
Mixing ratio is generally overpredicted across most hours of the day, with a greater spread in
the bias in the early morning and evening hours. Increased spread in the bias also occurs during
the late Spring to early Fall, when increased moisture levels across the country are noted. In
general, the model error is less than 1 g/kg across the year and all hours of the day.
The monthly spatial distributions of the mixing ratio bias across all hours are shown in Figures
3.3.2-3.3.5. As noted in the earlier figures, a general overprediction of moisture is observed
across much of the year. Some slight variations appear across regions, with a noticeable
underprediction of moisture that persists across the Southeast for much of the year. Mixing
ratio is noticeably overpredicted during the summer months across the western US, with biases
of 1-2 g/kg.
Figure 3.3.1. Distribution of hourly bias by hour and hourly bias, error, fractional bias, and
fractional error for water vapor mixing ratio by month.
Figure 3.3.2. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of January, February, and March (top to bottom).
Figure 3.3.3. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of April, May, and June (top to bottom).
Figure 3.3.4. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of July, August, and September (top to bottom).
Figure 3.3.5. Spatial distribution of water vapor mixing ratio bias (g/kg) across all hours for the
months of October, November, and December (top to bottom).
Climate Region        Season   Mean Obs   Mean Mod   MAE    MB      NMB     NME     RMSE
Northeast             Q1       3.22       3.62       0.57   0.4     12.56   17.66   0.79
                      Q2       7.99       8.42       0.94   0.44    5.47    11.82   1.25
                      Q3       13.42      13.44      0.98   0.02    0.11    7.3     1.31
                      Q4       4.96       5.31       0.63   0.36    7.24    12.7    0.84
N. Rockies & Plains   Q1       2.46       2.56       0.38   0.1     4.06    15.57   0.53
                      Q2       7.44       7.76       0.92   0.32    4.34    12.4    1.33
                      Q3       9.77       10.38      1.2    0.61    6.24    12.3    1.62
                      Q4       3.34       3.58       0.43   0.24    7.04    12.96   0.61
Northwest             Q1       4.18       4.32       0.48   0.14    3.38    11.4    0.67
                      Q2       6.46       6.55       0.71   0.09    1.41    11      0.98
                      Q3       7.15       7.73       1.03   0.58    8.17    14.37   1.42
                      Q4       4.78       4.9        0.57   0.12    2.41    11.84   0.76
Ohio Valley           Q1       3.75       4.05       0.59   0.31    8.25    15.78   0.88
                      Q2       10.35      10.97      1.1    0.62    6.01    10.6    1.47
                      Q3       14.25      14.59      1.09   0.34    2.41    7.67    1.48
                      Q4       5.36       5.64       0.6    0.28    5.23    11.1    0.83
South                 Q1       6.17       6.38       0.72   0.21    3.45    11.69   1.05
                      Q2       12.44      12.98      1.16   0.54    4.31    9.29    1.58
                      Q3       15.8       16.41      1.31   0.62    3.9     8.32    1.73
                      Q4       7.72       7.96       0.75   0.24    3.1     9.71    1.08
Southeast             Q1       6.59       6.75       0.73   0.16    2.42    11.05   1.05
                      Q2       12.73      13.16      1.12   0.42    3.31    8.77    1.52
                      Q3       16.99      17.37      1.25   0.38    2.23    7.33    1.64
                      Q4       8.84       9.03       0.85   0.19    2.18    9.64    1.16
Southwest             Q1       2.8        3.05       0.57   0.24    8.71    20.52   0.8
                      Q2       4.47       5.19       1.14   0.72    16.02   25.45   1.51
                      Q3       8.48       9.64       1.57   1.16    13.66   18.5    2.01
                      Q4       4.01       4.21       0.61   0.2     5.12    15.21   0.88
Upper Midwest         Q1       2.25       2.42       0.38   0.17    7.36    16.91   0.61
                      Q2       7.78       8.39       1.04   0.61    7.9     13.36   1.46
                      Q3       11.7       12.13      1.02   0.43    3.64    8.71    1.38
                      Q4       3.63       3.96       0.47   0.33    9.07    12.87   0.68
West                  Q1       5.08       5.07       0.73   -0.01   -0.19   14.35   1.05
                      Q2       6.77       7.06       0.87   0.29    4.26    12.82   1.22
                      Q3       8.76       9.4        1.26   0.64    7.34    14.34   1.7
                      Q4       5.64       5.85       0.88   0.21    3.81    15.52   1.23
Table 3.3.1. Mean observed, mean modeled, mean absolute error (MAE), mean bias (MB),
normalized mean bias (NMB), normalized mean error (NME), and root mean square error
(RMSE) for water vapor mixing ratio (g/kg).
3.4 Precipitation
Monthly total rainfall is plotted for each grid cell to assess how well the model captures the
spatial variability and magnitude of convective and non-convective rainfall. As described earlier,
the PRISM rainfall estimates are only available within the continental United States. With the
lightning assimilation mentioned earlier, the model triggers (suppresses) convection where
lightning is observed (not observed). This assimilation is particularly useful in constraining the
model's convection scheme, which at times has been observed to be inaccurately active. WRF
rainfall estimates by month are shown for all grid cells in the domain. Monthly total estimates
are shown in Figures 3.4.1 through 3.4.12.
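As a sketch of how the monthly totals behind Figures 3.4.1 through 3.4.12 might be assembled, the snippet below converts WRF's cumulative precipitation fields to a monthly total in inches and differences it against a PRISM analysis assumed to already be re-projected to the WRF grid. The variable handling and the regridded PRISM array are assumptions about the workflow, not the exact processing used for this report.

```python
import numpy as np

# Sketch: monthly precipitation total from WRF's cumulative rain fields
# (RAINC = convective, RAINNC = grid-scale, both in mm since model start),
# compared to a PRISM monthly total assumed re-projected to the WRF grid.

MM_PER_INCH = 25.4

def monthly_total_inches(rainc_start, rainnc_start, rainc_end, rainnc_end):
    """Monthly total (inches) from cumulative fields at the month's start and end."""
    total_mm = (rainc_end + rainnc_end) - (rainc_start + rainnc_start)
    return total_mm / MM_PER_INCH

# Example with toy 2x2 fields (values are illustrative only):
rainc0 = np.zeros((2, 2)); rainnc0 = np.zeros((2, 2))
rainc1 = np.full((2, 2), 20.0); rainnc1 = np.full((2, 2), 55.0)
wrf_in = monthly_total_inches(rainc0, rainnc0, rainc1, rainnc1)

prism_in = np.full((2, 2), 3.2)   # PRISM total on the WRF grid (assumed input)
difference = wrf_in - prism_in    # corresponds to the "Difference" panel
print(wrf_in, difference)
```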
Overall, the model captures the general spatial patterns and magnitude of the precipitation
across the US throughout the year. Precipitation is generally underpredicted across the
southern US during February and October through December. There is a general overprediction
that is noted across the western US, particularly in areas of complex terrain (e.g., northern CA,
the Rockies, etc.), especially during the Summer months. It should also be noted there is a slight
overprediction of precipitation during March and April in the Northeast.
Figure 3.4.1. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for January.
Figure 3.4.2. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for February.
Figure 3.4.3. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for March.
Figure 3.4.4. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for April.
Figure 3.4.5. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for May.
Figure 3.4.6. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for June.
Figure 3.4.7. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for July.
Figure 3.4.8. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for August.
Figure 3.4.9. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for September.
Figure 3.4.10. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for October.
Figure 3.4.11. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for November.
Figure 3.4.12. PRISM analysis (top left) and WRF (top right) estimated monthly total rainfall (in)
and the difference (bottom) for December.
3.5 Solar Radiation
Photosynthetically active radiation (PAR) is a fraction of shortwave downward radiation and
is an important input to the biogenic emissions model for estimating isoprene emissions (Carlton and
Baker, 2011). Isoprene emissions are important for regional ozone chemistry and play a role in
secondary organic aerosol formation. The radiation performance evaluation also gives an indirect
assessment of how well the model captures cloud formation during daylight hours.
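For context on how the evaluated field feeds the biogenic model, a common simplification treats PAR as a roughly fixed fraction of downward shortwave radiation. The sketch below illustrates that conversion; the fraction used is a widely cited approximation and is an assumption here, not necessarily the value applied in the biogenic emissions processing for this application.

```python
# Sketch: deriving PAR from modeled downward shortwave radiation (W/m2).
# The ~0.45-0.5 fraction is a common approximation and is an assumption here,
# not necessarily the value used by the emissions model in this application.

PAR_FRACTION = 0.5

def par_from_shortwave(swdown_wm2: float) -> float:
    """Approximate photosynthetically active radiation (W/m2) from shortwave."""
    return PAR_FRACTION * swdown_wm2

print(par_from_shortwave(600.0))  # ~300 W/m2 under the assumed fraction
```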
Shortwave downward radiation estimates are compared to surface-based measurements and
shown below (Figure 3.5.1).
In general, WRF slightly overpredicts shortwave radiation across all months of the year,
showing a greater spread in the overprediction during the late Spring to early Fall months.
Overall, the median bias in WRF for all months of the year is roughly 10-30 W/m2.
More variability is noted on an hourly basis, as WRF overpredicts shortwave radiation across all
daytime hours. The median bias during the hours of greatest downward shortwave radiation is less
than 20-30 W/m2. A greater spread in the overprediction is noted during the afternoon to early
evening hours. The model's inability to accurately simulate subgrid clouds at 12 km resolution is
likely the cause of these errors.
Figure 3.5.1. Distribution of shortwave radiation bias (W/m2) by month and by hour of day (GMT): 12US 2018.
Northeast during August and September. Conversely, abnormally cool conditions were
observed in the central US during April and September.
With regards to precipitation, 2018 was primarily slightly below to slightly above average across
the country. Overall, this would appear to be a near-normal climatology for precipitation.
Notably, February and September were wetter than usual in the central and eastern US.
Figure 4.1 Climatic temperature rankings by climate division: January to June 2018.
http://www.ncdc.noaa.gov/temp-and-precip/maps.php
Figure 4.2 Climatic temperature rankings by climate division: July to December 2018.
http://www.ncdc.noaa.gov/temp-and-precip/maps.php
Figure 4.3 Climatic rainfall rankings by climate division: January to June 2018.
http://www.ncdc.noaa.gov/temp-and-precip/maps.php
Figure 4.4 Climatic rainfall rankings by climate division: July to December 2018.
https://www.ncdc.noaa.gov/sotc/
5. REFERENCES
Boylan, J.W., Russell, A.G., 2006. PM and light extinction model performance metrics, goals, and
criteria for three-dimensional air quality models. Atmospheric Environment 40, 4946-4959.
Carlton, A.G., Baker, K.R., 2011. Photochemical Modeling of the Ozark Isoprene Volcano:
MEGAN, BEIS, and Their Impacts on Air Quality Predictions. Environmental Science &
Technology 45, 4438-4445.
Cooper, O.R., Stohl, A., Hubler, G., Hsie, E.Y., Parrish, D.D., Tuck, A.F., Kiladis, G.N., Oltmans, S.J.,
Johnson, B.J., Shapiro, M., Moody, J.L., Lefohn, A.S., 2005. Direct Transport of Midlatitude
Stratospheric Ozone into the Lower Troposphere and Marine Boundary Layer of the Pacific
Ocean. Journal of Geophysical Research - Atmospheres 110, D23310,
doi:10.1029/2005JD005783.
ENVIRON, 2008. User's Guide Comprehensive Air Quality Model with Extensions. ENVIRON
International Corporation, Novato.
Gilliam, R.C., Pleim, J.E., 2010. Performance Assessment of New Land Surface and Planetary
Boundary Layer Physics in the WRF-ARW. Journal of Applied Meteorology and Climatology 49,
760-774.
Heath, N.K., Pleim, J.E., Gilliam, R., Kang, D., 2016. A simple lightning assimilation
technique for improving retrospective WRF simulations. Journal of Advances in Modeling Earth
Systems 8, doi:10.1002/2016MS000735.
Langford, A.O., Reid, S.J., 1998. Dissipation and Mixing of a Small-Scale Stratospheric Intrusion
in the Upper Troposphere. Journal of Geophysical Research 103, 31265-31276.
Otte, T.L., Pleim, J.E., 2010. The Meteorology-Chemistry Interface Processor (MCIP) for the
CMAQ modeling system: updates through MCIPv3.4.1. Geoscientific Model Development 3,
243-256.
Skamarock, W.C., Klemp, J.B., Dudhia, J., Gill, D.O., Barker, D.M., Duda, M.G., Huang, X., Wang,
W., Powers, J.G., 2008. A Description of the Advanced Research WRF Version 3.
Stammer, D., F.J. Wentz, and C.L. Gentemann, 2003, Validation of Microwave Sea Surface
Temperature Measurements for Climate Purposes, J. Climate, 16, 73-87.
United States Environmental Protection Agency
Office of Air Quality Planning and Standards
Air Quality Assessment Division
Research Triangle Park, NC
Publication No. EPA-454/R-23-006
July 2023