EPA 910-R-15-001b
United States Environmental Protection Agency
Region 10 (Alaska, Idaho, Oregon, Washington)
1200 Sixth Avenue, Seattle, WA 98101
Office of Environmental Assessment
October 2015
-------
Combined WRF/MMIF/
AERCOARE/AERMOD Overwater
Modeling Approach for Offshore
Emission Sources
Volume 2 - Evaluation of Weather
Research and Forecasting Model
Simulations for Five Tracer Gas
Studies with AERMOD
EPA Contract No. EP-W-09-028
Work Assignment No. M12PG00033R
Prepared for:
U.S. Environmental Protection Agency
Region 10
1200 Sixth Avenue
Seattle, WA 98101
and
U.S. Department of the Interior
Bureau of Ocean Energy Management
45600 Woodland Road
Sterling, VA 20166
Prepared by:
Ramboll Environ US Corporation
773 San Marin Drive, Suite 2115
Novato, CA, 94998
and
Amec Foster Wheeler
Environmental & Infrastructure, Inc.
4021 Stirrup Creek Dr., Suite 100
Durham, NC 27703
October 2015
-------
The Region 10 Project Officer for the Interagency Agreement No. M12PGT00033R and EPA
contract number EP-W-09-028 was Herman Wong with technical support provided by Robert
Kotchenruther, PhD and Robert Elleman, PhD. From BOEM, the Project Officer was Eric J.
Wolvovsky and the Technical Coordinator was Ronald Lai, PhD. The Project Lead for the prime
contractor Amec Foster Wheeler was James Paumier while Project Lead for subcontractor
Ramboll Environ was Ken Richmond. Peer review of draft Volume 2 and/or draft Volume 3 was
provided by Steven Hanna, PhD of Hanna Consultants, Robert Paine, CCM of AECOM and
Christopher Lindsey, Shell Exploration and Production. Their reviews and comments are greatly
appreciated by R10 and BOEM.
The collaboration study was funded in part by the U.S. Department of the Interior, Bureau of
Ocean Energy Management, Environmental Studies Program, Washington DC, and the U.S.
Environmental Protection Agency, Region 10, Seattle, WA.
-------
DISCLAIMER
The opinions, findings, conclusions, or recommendations expressed in this report are those of
the authors and do not necessarily reflect the view of the U.S. Environmental Protection Agency
or the U.S. Department of the Interior, Bureau of Ocean Energy Management, nor does the
mention of trade names or commercial products constitute endorsement or recommendation for
use by the Federal Government.
-------
PREFACE
Under Interagency Agreement (IA) Number M12PGT00033R, dated 9 August 2012, between the
U.S. Environmental Protection Agency (EPA), Region 10 and the U.S. Department of the Interior
(DOI), Bureau of Safety and Environmental Enforcement (BSEE), acting on behalf of the Bureau
of Ocean Energy Management (BOEM), the recommended American Meteorological
Society/Environmental Protection Agency Regulatory Model (AERMOD) dispersion program
continues to be studied for assessing air quality concentration impacts from emission sources
located at overwater locations. Specifically, the work scope under the IA calls for Region
10 and BOEM to (1) assess the use of AERMOD as a replacement for the Offshore and Coastal
Dispersion (OCD) model in a near-source (< 1,000 meters source-receptor distance) ambient air
quality impact analysis for sea surface based emission sources and (2) evaluate the use of
Weather Research and Forecasting (WRF) model predicted meteorology with AERMOD in lieu
of overwater meteorological measurements from platforms and buoys.
Results of the Region 10/BOEM collaboration study are described in a three-volume report.
Volume 1 describes all six tasks completed under the IA; however, only a summary of the work
completed under Task 2 and Task 3 appears in Volume 1. Volume 2 and Volume 3 provide a
detailed description of the work in Task 2 and Task 3, respectively. The six tasks are:
Task 1. Evaluation of two Outer Continental Shelf Weather Research and Forecasting Model
Simulations
Task 2. Evaluation of Weather Research and Forecasting Model Simulations for Five Tracer
Gas Studies with AERMOD
Task 3. Analysis of AERMOD Performance Using Weather Research and Forecasting Model
Predicted Meteorology and Measured Meteorology in the Arctic
Task 4. Comparison of Predicted and Measured Mixing Heights
Task 5. Development of AERSCREEN for Arctic Outer Continental Shelf Application
Task 6. Collaboration Study Seminar
Prior to the collaboration study, Region 10 on 1 April 2011 approved the use of the Coupled
Ocean-Atmosphere Response Experiment (COARE) air-sea flux algorithm with AERMOD to
preprocess overwater measured meteorological data from platforms and buoys. Initially, the
preprocessing of the overwater measurements was done manually with COARE. Subsequently,
Region 10 funded a study, completed in September 2012, that coded the COARE air-
sea flux procedure into a meteorological data preprocessor program called AERMOD-COARE
(AERCOARE). The AERCOARE program was uploaded to the EPA Support Center for
Regulatory Atmospheric Modeling (SCRAM) website on 23 May 2013 as a beta option for case-
by-case approval by EPA regional offices.
-------
TABLE OF CONTENTS
LIST OF FIGURES IX
LIST OF TABLES XIV
LIST OF ABBREVIATIONS AND ACRONYMS XV
1 INTRODUCTION 1
2 WRF SIMULATIONS OF FIVE TRACER DISPERSION STUDIES 5
2.1 OVERVIEW OF THE TRACER STUDIES AND WRF DOMAINS 5
2.1.1 Cameron 5
2.1.2 Carpinteria 9
2.1.3 Oresund 11
2.1.4 Pismo Beach 15
2.1.5 Ventura 18
2.2 WRF CONFIGURATIONS FOR THE FIVE TRACER STUDIES 20
2.3 WRF PERFORMANCE EVALUATION FOR THE FIVE TRACER
STUDIES 38
2.3.1 METSTAT Benchmarks 39
2.3.2 Cameron WRF Simulation Performance 41
2.3.3 Carpinteria WRF Simulation Performance 42
2.3.4 Oresund WRF Simulation Performance 42
2.3.5 Pismo WRF Simulation Performance 43
2.3.6 Ventura WRF Simulation Performance 43
2.3.7 WRF Performance Evaluation Conclusions 55
3 DEVELOPMENT OF METEOROLOGICAL INPUTS FOR AERMOD 59
3.1 AERMOD INPUT METEOROLOGY FILES 59
3.2 BUOY METEOROLOGY PROCESSING WITH AERCOARE 62
3.3 WRF METEOROLOGY EXTRACTION METHODS 62
3.4 EVALUATION OF EXTRACTED METEOROLOGY FROM WRF 64
3.4.1 Cameron 67
3.4.2 Carpinteria 78
3.4.3 Oresund 86
-------
3.4.4 Pismo Beach 93
3.4.5 Ventura 104
3.5 DISCUSSION 114
4 AERMOD MODELING AND RESULTS 117
4.1 AERMOD METHODOLOGY 117
4.2 STATISTICAL MEASURES AND METHODS 119
4.3 AERMOD MODELING RESULTS 121
4.3.1 Cameron 121
4.3.2 Carpinteria 132
4.3.3 Oresund 142
4.3.4 Pismo 148
4.3.5 Ventura 158
5 DISCUSSION 169
5.1 PRIMARY QUESTIONS 169
5.2 CONCLUSIONS 173
REFERENCES 175
APPENDIX A: TASK 2 PROTOCOL
APPENDIX B: REPORT DISK
-------
LIST OF FIGURES
Figure 1. Cameron Experiment Configuration 7
Figure 2. Carpinteria Experiment Configuration 9
Figure 3. Oresund Experiment Configuration 12
Figure 4. Pismo Beach Experiment Configuration 15
Figure 5. Ventura Experiment Configuration 18
Figure 6. Cameron (LA) WRF Domain Map 28
Figure 7. Cameron WRF 1.33 km Domain (solid magenta line) 29
Figure 8. Carpinteria (CA) WRF Domain Map 30
Figure 9. Carpinteria WRF 1.33 km Domain (solid magenta line) 31
Figure 10. Oresund WRF Domain Map 32
Figure 11. Oresund WRF 1.33 km Domain (solid magenta line) 33
Figure 12. Pismo Beach (CA) WRF Domain Map 34
Figure 13. Pismo Beach WRF 1.33 km Domain (solid magenta line) 35
Figure 14. Ventura (CA) WRF Domain Map 36
Figure 15. Ventura WRF 1.33 km Domain (solid magenta line) 37
Figure 16. Cameron wind speed METSTAT results 45
Figure 17. Cameron wind direction METSTAT results 45
Figure 18. Cameron temperature METSTAT results 46
Figure 19. Carpinteria wind speed METSTAT results 47
Figure 20. Carpinteria wind direction METSTAT results 47
Figure 21. Carpinteria temperature METSTAT results 48
Figure 22. Carpinteria humidity METSTAT results 48
Figure 23. Oresund wind speed METSTAT results 49
Figure 24. Oresund wind direction METSTAT results 49
Figure 25. Oresund temperature METSTAT results 50
Figure 26. Oresund humidity METSTAT results 50
Figure 27. Pismo wind speed METSTAT results 51
Figure 28. Pismo wind direction METSTAT results 51
Figure 29. Pismo temperature METSTAT results 52
Figure 30. Pismo humidity METSTAT results 52
-------
Figure 31. Ventura wind speed METSTAT results 53
Figure 32. Ventura wind direction METSTAT results 53
Figure 33. Ventura temperature METSTAT results 54
Figure 34. Ventura humidity METSTAT results 54
Figure 35. Cameron wind speed time series: winter releases (top) and summer releases
(bottom) 70
Figure 36. Cameron air temperature time series: winter releases (top) and summer releases
(bottom) 71
Figure 37. Cameron SST time series: winter releases (top) and summer releases (bottom) 72
Figure 38. Cameron air-sea temperature difference time series: winter releases (top) and
summer releases (bottom) 73
Figure 39. Cameron inverse of L time series: winter releases (top) and summer releases
(bottom) 74
Figure 40. Cameron PBL height (RCALF) time series: winter releases (top) and summer
releases (bottom) 75
Figure 41. Cameron PBL height (RCALT) time series: winter releases (top) and summer
releases (bottom) 76
Figure 42. Cameron relative humidity time series: winter releases (top) and summer releases
(bottom) 77
Figure 43. Carpinteria wind speed time series 81
Figure 44. Carpinteria air temperature time series 81
Figure 45. Carpinteria sea surface temperature time series 82
Figure 46. Carpinteria air-sea temperature difference time series 82
Figure 47. Carpinteria inverse of L time series 83
Figure 48. Carpinteria PBL height (RCALF) time series 83
Figure 49. Carpinteria PBL height (RCALT) time series 84
Figure 50. Carpinteria relative humidity time series 84
Figure 51. Carpinteria wind direction time series 85
Figure 52. Oresund wind speed time series 88
Figure 53. Oresund air temperature time series 89
Figure 54. Oresund sea surface temperature time series 89
Figure 55. Oresund air-sea temperature difference time series: 90
Figure 56. Oresund inverse of L time series 90
Figure 57. Oresund PBL height (RCALF) time series 91
-------
Figure 58. Oresund PBL height (RCALT) time series 91
Figure 59. Oresund relative humidity time series 92
Figure 60. Oresund wind direction time series 92
Figure 61. Pismo wind speed time series: winter (top) and summer (bottom) releases 96
Figure 62. Pismo air temperature time series: winter (top) and summer (bottom) releases 97
Figure 63. Pismo SST time series: winter (top) and summer (bottom) releases 98
Figure 64. Pismo air-sea temperature difference time series: winter (top) and summer (bottom)
releases 99
Figure 65. Pismo inverse of L time series: winter (top) and summer (bottom) releases 100
Figure 66. Pismo PBL height time series: winter (top) and summer (bottom) releases 101
Figure 67. Pismo PBL height (MMIF re-calc.) time series: winter (top) and summer (bottom)
releases 102
Figure 68. Pismo relative humidity time series: winter (top) and summer (bottom) releases.... 103
Figure 69. Ventura wind speed time series: summer (top) and winter (bottom) releases 106
Figure 70. Ventura air temperature time series: summer (top) and winter (bottom) releases 107
Figure 71. Ventura SST time series: summer (top) and winter (bottom) releases 108
Figure 72. Ventura air-sea temperature difference time series: summer (top) and winter (bottom)
releases 109
Figure 73. Ventura inverse of L time series: summer (top) and winter (bottom) releases 110
Figure 74. Ventura PBL height time series: summer (top) and winter (bottom) releases 111
Figure 75. Ventura PBL height (MMIF re-calc.) time series: summer (top) and winter (bottom)
releases 112
Figure 76. Ventura relative humidity time series: summer (top) and winter (bottom) releases. 113
Figure 77. Cameron AERMOD Results Time Series: winter cases (top) and summer cases
(bottom) 125
Figure 78. Cameron AERMOD Results Scatter Plot-NARR.MYJ 126
Figure 79. Cameron AERMOD Results Q-Q Plot-NARR.MYJ 126
Figure 80. Cameron AERMOD Results Scatter Plot-NARR.UW 127
Figure 81. Cameron AERMOD Results Q-Q Plot-NARR.UW 127
Figure 82. Cameron AERMOD Results Scatter Plot - NARR.YSU 128
Figure 83. Cameron NARR.YSU Q-Q Plot 128
Figure 84. Cameron ERA.MYJ Scatter Plot 129
Figure 85. Cameron AERMOD Results Q-Q Plot-ERA.MYJ 129
-------
Figure 86. Cameron AERMOD Results Scatter Plot-ERA.UW 130
Figure 87. Cameron AERMOD Results Q-Q Plot-ERA.UW 130
Figure 88. Cameron AERMOD Results Scatter Plot - ERA.YSU 131
Figure 89. Cameron AERMOD Results Q-Q Plot-ERA.YSU 131
Figure 90. Carpinteria AERMOD Results Time Series 135
Figure 91. Carpinteria AERMOD Results Scatter Plot-NARR.MYJ 135
Figure 92. Carpinteria AERMOD Results Q-Q Plot-NARR.MYJ 136
Figure 93. Carpinteria AERMOD Results Scatter Plot - NARR.UW 136
Figure 94. Carpinteria AERMOD Results Q-Q Plot-NARR.UW 137
Figure 95. Carpinteria AERMOD Results Scatter Plot - NARR.YSU 137
Figure 96. Carpinteria AERMOD Results Q-Q Plot-NARR.YSU 138
Figure 97. Carpinteria AERMOD Results Scatter Plot-ERA. MYJ 138
Figure 98. Carpinteria AERMOD Results Q-Q Plot- ERA.MYJ 139
Figure 99. Carpinteria AERMOD Results Scatter Plot-ERA.UW 139
Figure 100. Carpinteria AERMOD Results Q-Q Plot-ERA.UW 140
Figure 101. Carpinteria AERMOD Results Scatter Plot - ERA.YSU 140
Figure 102. Carpinteria AERMOD Results Q-Q Plot - ERA.YSU 141
Figure 103. Oresund AERMOD Results Time series 144
Figure 104. Oresund AERMOD Results Scatter Plot-ERA.MYJ 145
Figure 105. Oresund AERMOD Results Q-Q Plot-ERA.MYJ 145
Figure 106. Oresund AERMOD Results Scatter Plot-ERA.UW 146
Figure 107. Oresund AERMOD Results Q-Q Plot-ERA.UW 146
Figure 108. Oresund AERMOD Results Scatter Plot - ERA.YSU 147
Figure 109. Oresund AERMOD Results Q-Q Plot- ERA.YSU 147
Figure 110. Pismo AERMOD Results Time Series: winter cases (top) and summer cases
(bottom) 151
Figure 111. Pismo AERMOD Results Scatter Plot-NARR.MYJ 152
Figure 112. Pismo AERMOD Results Q-Q Plot-NARR.MYJ 152
Figure 113. Pismo AERMOD Results Scatter Plot-NARR.UW 153
Figure 114. Pismo AERMOD Results Q-Q Plot-NARR.UW 153
Figure 115. Pismo AERMOD Results Scatter Plot - NARR.YSU 154
Figure 116. Pismo AERMOD Results Q-Q Plot-NARR.YSU 154
-------
Figure 117. Pismo AERMOD Results Scatter Plot-ERA.MYJ 155
Figure 118. Pismo AERMOD Results Q-Q Plot-ERA.MYJ 155
Figure 119. Pismo AERMOD Results Scatter Plot-ERA.UW 156
Figure 120. Pismo AERMOD Results Q-Q Plot-ERA.UW 156
Figure 121. Pismo AERMOD Results Scatter Plot - ERA.YSU 157
Figure 122. Pismo AERMOD Results Q-Q Plot-ERA.YSU 157
Figure 123. Ventura AERMOD Results Time Series: summer cases (top) and winter cases
(bottom) 161
Figure 124. Ventura AERMOD Results Scatter Plot - NARR.MYJ 162
Figure 125. Ventura AERMOD Results Q-Q Plot-NARR.MYJ 162
Figure 126. Ventura AERMOD results scatter plot-NARR.UW 163
Figure 127. Ventura AERMOD Results Q-Q Plot-NARR.UW 163
Figure 128. Ventura AERMOD Results Scatter Plot - NARR.YSU 164
Figure 129. Ventura AERMOD Results Q-Q Plot - NARR.YSU 164
Figure 130. Ventura AERMOD Results Scatter Plot-ERA. MYJ 165
Figure 131. Ventura AERMOD Results Q-Q Plot - ERA.MYJ 165
Figure 132. Ventura AERMOD Results Scatter Plot-ERA.UW 166
Figure 133. Ventura AERMOD Results Q-Q Plot - ERA.UW 166
Figure 134. Ventura AERMOD Results Scatter Plot - ERA.YSU 167
Figure 135. Ventura AERMOD Results Q-Q Plot-ERA.YSU 167
-------
LIST OF TABLES
Table 1. Cameron Tracer Study Meteorology, Source, and Receptor Information 8
Table 2. Carpinteria Tracer Study Meteorology and Release Parameters 10
Table 3. Oresund Experiment Meteorology and Tracer Release Parameters 13
Table 4. Pismo Beach Experiment Meteorology and Release Parameters 16
Table 5. Ventura Tracer Experiment Meteorology and Release Parameters 19
Table 6. WRF Scenarios and Naming Convention 22
Table 7. WRF Simulation Times 22
Table 8. WRF Physics Parameterization Scheme Options Selected for WRF Modeling 24
Table 9. WRF Vertical Grid Setup 26
Table 10. Meteorological Model Performance Benchmarks for Simple and Complex Conditions.
40
Table 11. Results summary for METSTAT analyses 57
Table 12. Meteorological Fields in the AERMOD Meteorology Input Files 59
Table 13. WRF Meteorology Extraction Options 64
Table 14. WRF meteorology bias and error compared to tracer study measurements 65
Table 15. AERMOD Configuration for Each Study 119
Table 16. Cameron AERMOD Results Performance Statistics 124
Table 17. Carpinteria AERMOD Results Performance Statistics 134
Table 18. Oresund AERMOD Performance Statistics 143
Table 19. Pismo AERMOD Performance Statistics 150
Table 20. Ventura AERMOD Performance Statistics 160
Table 21. Concentration Q-Q Plot Distribution Upper-end Evaluation 172
Table 22. Summary of Model Evaluation Findings 173
-------
LIST OF ABBREVIATIONS AND ACRONYMS
AERC WRF meteorology extraction cases processed by AERCOARE
AERMIC American Meteorological Society/Environmental Protection Agency
Regulatory Model Improvement Committee
AERMOD American Meteorological Society/Environmental Protection Agency
Regulatory Model
AERCOARE AERMOD-COARE
ASTD Air-Sea Temperature Difference
AIDJEX Arctic Ice Dynamics Joint Experiment
B Bowen ratio
BOEM U.S. Dept. of the Interior Bureau of Ocean Energy Management
BSEE Bureau of Safety and Environmental Enforcement
c Model constant
co Observed concentration value
cp Predicted concentration value
c̄ Average concentration value
cn nth highest concentration
°C Degrees Centigrade
COARE Coupled Ocean Atmospheric Response Experiment
DOI Department of the Interior
ECMWF European Centre for Medium-Range Weather Forecasts
EPA U.S. Environmental Protection Agency
ERA ECMWF Reanalysis Project
ERA-40 ERA 45-year global atmospheric reanalysis
ERA-I ERA Interim global atmospheric reanalysis
eta Vertical pressure coordinate in WRF
f Coriolis parameter
FF2 Fraction-factor-of-two
FNMOC Fleet Numerical Meteorology and Oceanography Center
g Grams
H Sensible heat flux
ISC3 Industrial Source Complex
K Degrees kelvin
kg Kilograms
km Kilometers
-------
L Monin-Obukhov length
LCC Lambert Conformal Conic
m Meters
METSTAT Meteorological Statistics
MG Geometric mean bias
MIXH PBL height or "mixing height"
MMIF Mesoscale Model Interface Program
MYJ Mellor-Yamada-Janjic
NAAQS National Ambient Air Quality Standards
NARR North American Regional Reanalysis
NBDC National Buoy Data Center
NCAR National Center for Atmospheric Research
NCEP National Center for Environmental Prediction
NOAA National Oceanic and Atmospheric Administration
NSR New Source Review
O Observed value
OBS, obs Label for observation-based AERMOD simulations
OCD Offshore and Coastal Dispersion
OCS Outer Continental Shelf
OLM Ozone Limiting Method
P Sea level atmospheric pressure (also used to indicate "predicted" value in
statistical calculations)
p Predicted value
PBL Planetary boundary layer
PFL Profile file input to AERMOD
PRIME Plume Rise Model Enhancements
PSD Prevention of Significant Deterioration
PVMRM Plume Volume Molar Ratio Method
Q-Q Quantile-Quantile
r Albedo
RCALF Label for WRF-MMIF AERMOD simulations where PBL height was not
recalculated by MMIF
RCALT Label for WRF-MMIF AERMOD simulations where PBL height was
recalculated by MMIF
rg Geometric correlation coefficient
RH Relative Humidity
RHC Robust High Concentration
-------
RMSE Root Mean Square Error
RPO Regional Planning Organization
RTG Real Time Global sea-surface temperature analysis (from NCEP)
s Seconds
SFC AERMOD surface meteorology input file
SST Sea Surface Temperature
T Temperature
TKE Turbulent Kinetic Energy
TMS Total Model Score statistical measure
U Zonal wind component
UW-PBL University of Washington Shallow Convection PBL
u* Friction Velocity
V Meridional wind component
VG Geometric variance
VPTLR Virtual Potential Temperature Lapse Rate
w* Convective scaling velocity
W Watts
WD Wind Direction
WRF Weather Research and Forecasting Mesoscale Model
WS Wind Speed
YSU Yonsei University
z Height above the surface
z0 Roughness length
zic Convective PBL height
zim Mechanical PBL height
° Degrees, angular
μg Geometric mean
σθ Standard deviation of wind direction
σw Standard deviation of vertical wind speed
ψ Stability correction parameter
-------
1 INTRODUCTION
Air quality modeling and impact assessment must be conducted for the New Source Review
(NSR) of significant sources of air pollutant emissions as promulgated by the U.S.
Environmental Protection Agency (EPA) and the U.S. Department of the Interior (DOI), Bureau
of Ocean Energy Management (BOEM). Given the recent and likely future continued expansion
of oil and mineral exploration and extraction activities in the Arctic Ocean off the coast of Alaska
along the Outer Continental Shelf (OCS) and other marine locations (e.g., mid-latitudes and
tropics), there will continue to be more demand for air quality permits and exploratory/
development plans related to such activities. The EPA and BOEM must therefore provide
modeling tools to adequately assess air quality impacts over the OCS and other overwater
regions.
The American Meteorological Society/Environmental Protection Agency Regulatory Model
(AERMOD) modeling system (USEPA, 2004c) is the preferred near-field (< 50 kilometers [km])
model used for the air quality assessment requirements of air emissions permitting1. However,
AERMOD's meteorological preprocessor, AERMET (USEPA, 2004b), was not designed to
process meteorological conditions over ocean waters and more extreme climates. Over land,
energy fluxes are strongly driven by the diurnal cycle of heating and cooling. Over water, fluxes
are more dependent on air-sea temperature differences that are only slightly affected by diurnal
heating and cooling. In addition, the meteorological observations necessary to drive the
dispersion models are often not available, especially in the Arctic Ocean. For applications in the
Arctic, the remote location and seasonal sea-ice pose significant logistical problems for the
deployment of buoys and other offshore measurement platforms.
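The overwater behavior can be made concrete with the bulk aerodynamic relationships on which COARE-type air-sea flux algorithms are built. The expressions below are a generic textbook sketch rather than the notation used later in this report; CD, CH, and CE are bulk transfer coefficients, U is the mean wind speed at the reference height, Ts and Ta are the sea surface and near-surface air temperatures, and qs and qa are the corresponding specific humidities.

```latex
% Bulk aerodynamic forms of the overwater surface fluxes (generic sketch)
\begin{align}
  \tau &= \rho\, C_D\, U^{2}                  &&\text{(momentum flux)} \\
  H    &= \rho\, c_p\, C_H\, U\,(T_s - T_a)   &&\text{(sensible heat flux)} \\
  E    &= \rho\, L_v\, C_E\, U\,(q_s - q_a)   &&\text{(latent heat flux)}
\end{align}
```

Because the sea surface temperature changes slowly, the overwater fluxes track the air-sea differences (Ts - Ta and qs - qa) rather than the strong diurnal heating and cooling cycle that drives the fluxes over land.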
The dispersion model currently preferred by the EPA for offshore assessment is the Offshore
and Coastal Dispersion (OCD) model (DiCristofaro & Hanna, 1989), as promulgated under 40
CFR Part 51, Appendix W. However, OCD lacks features required for a robust modern
environmental assessment. OCD does not contain the PRIME downwash algorithm (Schulman,
et al., 2002), the Plume Volume Molar Ratio Method (PVMRM) (Hanrahan, 1999), or the Ozone
Limiting Method (OLM) (Cole & Summerhays, 1979), nor can it calculate the receptor-averaged
percentiles of sulfur dioxide (SO2), nitrogen dioxide (NO2), and fine particulate matter (PM2.5)
concentrations needed for a compliance demonstration.
State-of-the-art overwater parameterization schemes are used in the Coupled Ocean-
Atmosphere Response Experiment (COARE) air-sea flux algorithms. These algorithms have
1 As promulgated under 40 CFR Part 51, Appendix W. The AERMOD modeling system is available to the
public at the EPA modeling website: http://www.epa.gov/scram001/dispersion_prefrec.htm
-------
been used to develop the AERMOD-COARE (AERCOARE)2 model (USEPA, 2012), a
counterpart to AERMET, that takes air-sea temperature difference and other features of marine
influence into account to compute the meteorological fields required for AERMOD modeling
over open-water and coastal environments. AERCOARE-AERMOD (using the current beta
version of AERCOARE) has been approved by EPA Region 10, with concurrence from the EPA
Model Clearinghouse, as an acceptable alternative approach for modeling emission sources
located in Arctic, mid-latitude, and tropical overwater environments. Use of the model still
requires a procedural protocol in accordance with Appendix W and review and acceptance by
the governing regulatory body on a case-by-case basis (Bridgers, 2011) (Wong, 2011).
Currently accepted overwater dispersion modeling methods require observational datasets.
These datasets are generally provided by meteorological buoys or instruments on platforms.
However, the observational coverage of the earth's oceans is sparse. It would be advantageous
if mesoscale meteorological models, such as the Weather Research and Forecasting (WRF)
model (National Center for Atmospheric Research, 2014) (Skamarock, et al., 2008), could be
used to provide meteorological data for AERMOD in areas where observational data are
lacking.
The Mesoscale Model Interface Program (MMIF)3 (Brashers & Emery, 2014) converts
prognostic meteorological model output fields into the formats required for direct input to
several dispersion models, and it provides several ways to supply overwater meteorological
input for AERMOD. MMIF can supply AERMOD-ready meteorology directly from WRF, produce
input data for AERMET (suitable for overland situations), or produce input data for AERCOARE.
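As a purely illustrative sketch of those three pathways (the labels and function below are hypothetical and are not MMIF options, file names, or commands), the processing chains can be summarized as:

```python
# Illustrative sketch of the three processing pathways described above.
# The pathway labels are descriptive only; they are not MMIF options.

def met_processing_chain(pathway: str) -> list[str]:
    """Return the sequence of tools between WRF output and an AERMOD run."""
    chains = {
        # WRF fields converted straight into AERMOD-ready meteorology
        "direct": ["WRF", "MMIF", "AERMOD"],
        # WRF fields formatted as input for AERMET (overland situations)
        "aermet": ["WRF", "MMIF", "AERMET", "AERMOD"],
        # WRF fields formatted as input for AERCOARE, which applies the
        # COARE air-sea flux algorithm before AERMOD (overwater situations)
        "aercoare": ["WRF", "MMIF", "AERCOARE", "AERMOD"],
    }
    return chains[pathway]

if __name__ == "__main__":
    for p in ("direct", "aermet", "aercoare"):
        print(f"{p}: " + " -> ".join(met_processing_chain(p)))
```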
The study summarized in this report was conducted to evaluate alternative methods for
supplying meteorological variables to AERMOD for regulatory air quality modeling of sources
located over the ocean. It is hypothesized that given an appropriate overwater meteorological
dataset, AERMOD can be used for New Source Review (NSR) and exploratory plans following
the same procedures as used for sources over land. The study evaluated a combined modeling
approach where the meteorological variables were provided by the WRF mesoscale model and
then processed by a combination of the MMIF and, optionally, AERCOARE. The WRF
meteorology was used to drive AERMOD dispersion modeling of several test cases described in
Section 2. Section 3 describes the development of the meteorological inputs for AERMOD. The
results of this Task were compared to results of AERMOD modeling driven by observational
datasets, as discussed in Section 4.
2 AERCOARE is made publicly available by the U.S. EPA at the website:
http://www.epa.gov/ttn/scram/dispersion_related.htm
3 MMIF-Beta is provided as "related" alternative software for regulatory dispersion modeling by the U.S.
EPA at the website: http://www.epa.gov/scram001/dispersion_related.htm
-------
The purpose of the study was to provide evidence to help answer some of the following
questions: How well does WRF predict overwater surface meteorology? Are pollutant
concentrations predicted by AERMOD driven by WRF lower than concentrations predicted by
AERMOD driven by observations (processed through AERCOARE)? What WRF modeling
configurations provide the best AERMOD inputs, based on the most accurate AERMOD
predictions? How sensitive is AERMOD to differences between the WRF meteorology and
observations for simulations of typical offshore sources?
The study compared WRF-driven AERMOD dispersion results to the concentrations measured
during five offshore tracer dispersion field experiments. Four of the five studies, involving
experiments conducted over North American waters, were used previously to evaluate COARE
and AERCOARE using actual overwater observations (ENVIRON Int. Corp., 2010) (ENVIRON
Int. Corp., 2012). The current study also included an evaluation of results using parameters from
the Oresund Nordic Dispersion Experiment conducted in 1984 (Gryning, 1985).
This report summarizes the methodology used for each of the elements of the study, presents
the results of the performance evaluations, and analyzes how modeling options affect model
performance. These elements are:
i) Setup and procedures for running WRF for the five tracer study periods,
ii) Evaluation of WRF performance for these periods,
iii) Extracting and processing meteorological input files for AERMOD,
iv) Procedures for running AERMOD for the five tracer study cases using both the
extracted WRF data and real meteorological observations from the studies,
v) Evaluation of the performance of AERMOD; comparisons of AERMOD performance
using extracted WRF data to measurements and AERMOD results from simulations
using observed meteorology,
vi) Analysis of the influence of the meteorological data on AERMOD performance.
The WRF and AERMOD modeling and analysis conducted for this study were conducted
according to a modeling protocol reviewed and accepted by EPA and BOEM prior to initiating
the study. This protocol is included with this report as Appendix A.
-------
2 WRF SIMULATIONS OF FIVE TRACER DISPERSION STUDIES
WRF modeling was conducted to provide meteorology for each of the five tracer studies.
Modeling was conducted using archived reanalysis data to provide the initialization data and
boundary conditions for WRF. A detailed description of each of the five tracer studies is
provided in Section 2.1. The configuration of WRF and the modeling domains used for each
study are described in Section 2.2.
2.1 Overview of the Tracer Studies and WRF Domains
The five historical tracer dispersion field studies selected for this study are:
• Cameron, LA: July 1981 and February 1982 (Dabberdt, et al., 1982)
• Carpinteria, CA: September 1985 (Johnson & Spangler, 1986)
• Oresund (between Denmark and Sweden): May/June 1984 (Gryning, 1985)
• Pismo Beach, CA: December 1981 and June 1982 (Schacher, et al., 1982)
• Ventura, CA: September 1980 and January 1981 (Schacher, et al., 1982)
The four North American studies listed above have been used for offshore dispersion model
development, including for the OCD model (Hanna, et al., 1985), CALPUFF (Earth Tech, 2006a),
and most recently AERCOARE (ENVIRON Int. Corp., 2010) (ENVIRON Int. Corp., 2012).
The tracer experiment datasets are well known to EPA and have a history of use for model
benchmark testing and development. The meteorology and concentration data from the tracer
studies were obtained from the OCS Model Evaluation Archive (Earth Tech, 2006b). The
Cameron and Pismo Beach studies provide tracer measurements for simple level terrain near
the coastline. They are useful for analyzing model performance under marine meteorology. The
Ventura study also involved simple flat terrain, but the receptors were located 500 meters (m) to
1 kilometer (km) inland from the shoreline. The Carpinteria study involved short distance, low
wind transport conditions and receptors located on tall bluffs along the shoreline. The Oresund
study involved longer transport distances (25 km to 40 km) with tracer releases on both sides of
the Oresund strait separating Denmark and Sweden.
The tracer study databases contain estimates of planetary boundary layer (PBL) height for each
tracer release event. PBL height was not directly measured and the values included in the
database are estimates likely provided by parameterization schemes. Extensive investigation
into the origins of these values was beyond the scope of this study. They have been used in
previous modeling studies for the evaluation of the OCD model (Hanna, et al., 1984) and
CALPUFF (Earth Tech, 2006a). They are used in this study and assumed to represent the true
PBL height value at the time of the release for the purpose of evaluating WRF performance.
However, it is noted that they may be erroneous to some degree.
2.1.1 Cameron
The Cameron experiment was conducted during July 1981 and February 1982. The periods
selected offered samples from both summer and winter conditions off the coast of Louisiana.
-------
During the experiment, tracer gas was released from both a boat and a low-profile platform at a
height of 13 m. No downwash was considered for the platform releases but a building width of
20 m and height of 7 m was used for the boat-based releases. Receptors were located on flat
terrain near the shoreline with transport distances ranging from 4 to 10 km. Figure 1 shows the
land use, release points, receptor array, and meteorological monitoring stations for the
experiment (developed from graphics files supplied by the EPA from the OCS Model Evaluation
Archive (Earth Tech, 2006b)). Wind speed was measured at 10 m for the July releases and
18 m for the February releases.
The meteorological conditions and tracer release parameters from the Cameron study are listed
in Table 1. Meteorological data were collected on a 10 m mast installed on the coast, 5 to 20 m
from the water, and a 25 m mast located 2 km inland. Air temperature was measured at both
10 m and 18 m at different periods during the July releases and solely at 18 m during the
February releases (at the location of the "10 m mast" indicated in Figure 1). For a few hours,
inconsistencies were observed between the air-sea temperature difference and the virtual
potential temperature lapse rate (VPTLR): the VPTLR indicates stable conditions for some
periods when the air-sea temperature difference indicates unstable conditions. It is possible
that a low mixed layer was not captured by the PBL height measurements or that one of the
measurements was not truly indicative of the regional boundary layer conditions at the time.
Hanna et al. (1984) noted that these inconsistencies were likely the result of advection
inversions, common along the Louisiana coast in winter because of the strong gradient of
sea-surface temperature near the coast. To address the inconsistency, the air-sea temperature
difference for these periods was adjusted to be at least as stable as indicated by the virtual
potential temperature lapse rate. The same adjustments were applied in the evaluation by
ENVIRON Int. Corp. (2012) and the offshore modeling evaluation of CALPUFF by Earth Tech
(2006a).
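The report does not give the exact adjustment rule; one possible reading of it, shown only as a hypothetical sketch, is to raise the air-sea temperature difference to a stability floor whenever the measured lapse rate indicates stable conditions:

```python
# Hypothetical sketch of the consistency adjustment described above.  The
# rule and the 0.0 K floor are assumptions for illustration; the report
# states only that the air-sea temperature difference (ASTD) was adjusted
# to be at least as stable as the virtual potential temperature lapse
# rate (VPTLR) indicates.

def reconcile_astd(astd_k: float, vptlr_k_per_m: float,
                   stable_floor_k: float = 0.0) -> float:
    """Return an ASTD (air minus sea, K) consistent with a stable VPTLR."""
    if vptlr_k_per_m > 0.0 and astd_k < stable_floor_k:
        # The lapse rate says the layer is stable, but a negative ASTD
        # (water warmer than air) would imply unstable conditions; nudge
        # the ASTD up to the floor value.
        return stable_floor_k
    return astd_k

# Example: a stable lapse rate (0.03 K/m) paired with a slightly negative ASTD
print(reconcile_astd(-0.4, 0.03))   # -> 0.0
```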
-------
[Figure: land use map of the Cameron, LA study area (UTM Zone 15N) showing tracer release points, sampler locations, and the 10 m and 25 m meteorological masts.]
Figure 1. Cameron Experiment Configuration.
-------
Table 1. Cameron Tracer Study Meteorology, Source, and Receptor Information.
Date/Time | Wind Obs. Ht. (m) | Temp/RH Obs. Ht. (m) | Wind Dir. (deg) | Wind Speed (m/s) | Mix Ht. (m) | RH (%) | Air Temp. (K) | Air-Sea Temp Diff. (K) a | Virt. Pot. Temp Grad. (K/m) | Sigma-Theta (deg) | Rel. Ht. (m) | Bldg. Ht. (m) | Bldg. Wid. (m)
7/20/81 14:00 | 10 | 10 | 202 | 4.6 | 800 | 63 | 302.4 | -2.7 | 0.00 | 6.39 | 13.0 | 0.0 | 0.0
7/20/81 15:00 | 10 | 10 | 210 | 4.8 | 800 | 64 | 302.6 | -2.6 | 0.00 | 4.92 | 13.0 | 0.0 | 0.0
7/23/81 17:00 | 10 | 18 | 232 | 4.3 | 225 | 73 | 303.6 | -1.4 | 0.00 | 4.74 | 13.0 | 0.0 | 0.0
7/23/81 18:00 | 10 | 18 | 229 | 5.1 | 225 | 74 | 303.7 | -1.2 | 0.00 | 4.74 | 13.0 | 0.0 | 0.0
7/27/81 20:00 | 10 | 18 | 176 | 2.1 | 400 | 82 | 300.2 | -4.4 | 0.00 | - | 13.0 | 0.0 | 0.0
7/27/81 22:00 | 10 | 18 | 151 | 4.5 | 450 | 82 | 300.0 | -4.5 | 0.00 | - | 13.0 | 0.0 | 0.0
7/29/81 16:00 | 10 | 18 | 218 | 4.6 | 420 | 69 | 303.0 | -2.2 | 0.00 | 9.59 | 13.0 | 0.0 | 0.0
7/29/81 17:00 | 10 | 18 | 240 | 5.0 | 430 | 68 | 303.0 | -2.0 | 0.00 | 6.45 | 13.0 | 0.0 | 0.0
7/29/81 19:00 | 10 | 18 | 241 | 5.0 | 450 | 68 | 303.1 | -1.7 | 0.00 | 9.59 | 13.0 | 0.0 | 0.0
2/15/82 16:00 | 10 | 10 | 142 | 5.7 | 200 | 89 | 287.4 | 0.5 | 0.06 | - | 13.0 | 7.0 | 20.0
2/15/82 17:00 | 10 | 10 | 134 | 5.6 | 200 | 88 | 287.1 | 0.5 | 0.06 | - | 13.0 | 7.0 | 20.0
2/15/82 20:00 | 10 | 10 | 147 | 5.9 | 200 | 87 | 287.4 | 0.5 | 0.06 | - | 13.0 | 7.0 | 20.0
2/17/82 14:00 | 10 | 10 | 178 | 3.3 | 200 | 93 | 288.8 | 2.1 | 0.03 | 2.46 | 13.0 | 0.0 | 0.0
2/17/82 15:00 | 18 | 18 | 195 | 3.7 | 200 | 93 | 288.1 | 0.9 | 0.03 | 7.63 | 13.0 | 0.0 | 0.0
2/17/82 16:00 | 18 | 18 | 210 | 4.3 | 200 | 93 | 288.0 | 0.4 | 0.03 | 3.89 | 13.0 | 0.0 | 0.0
2/17/82 17:00 | 18 | 18 | 206 | 3.5 | 200 | 93 | 287.7 | 0.4 | 0.03 | 3.78 | 13.0 | 0.0 | 0.0
2/17/82 18:00 | 18 | 18 | 193 | 3.5 | 200 | 93 | 287.4 | 0.4 | 0.03 | 2.06 | 13.0 | 0.0 | 0.0
2/22/82 14:00 | 18 | 18 | 171 | 5.2 | 100 | 75 | 290.6 | 1.3 | 0.03 | 2.69 | 13.0 | 0.0 | 0.0
2/22/82 16:00 | 18 | 18 | 172 | 4.7 | 100 | 76 | 290.6 | 0.9 | 0.03 | 2.41 | 13.0 | 0.0 | 0.0
2/22/82 17:00 | 18 | 18 | 182 | 4.5 | 100 | 76 | 290.9 | 0.8 | 0.03 | 2.81 | 13.0 | 0.0 | 0.0
2/23/82 14:00 | 18 | 18 | 152 | 4.8 | 50 | 84 | 291.5 | 3.7 | 0.03 | 0.63 | 13.0 | 0.0 | 0.0
2/23/82 17:00 | 18 | 18 | 165 | 6.2 | 80 | 88 | 291.2 | 2.3 | 0.03 | 3.21 | 13.0 | 0.0 | 0.0
2/24/82 15:00 | 18 | 18 | 143 | 3.7 | 50 | 49 | 293.1 | 5.0 | 0.05 | 2.75 | 13.0 | 7.0 | 20.0
2/24/82 16:00 | 18 | 18 | 143 | 3.7 | 50 | 50 | 292.9 | 4.6 | 0.05 | 3.21 | 13.0 | 7.0 | 20.0
2/24/82 17:00 | 18 | 18 | 140 | 3.5 | 50 | 50 | 292.9 | 4.7 | 0.05 | 3.26 | 13.0 | 7.0 | 20.0
2/24/82 19:00 | 18 | 18 | 156 | 4.1 | 50 | 52 | 290.7 | 2.7 | 0.05 | 2.63 | 13.0 | 7.0 | 20.0
a Air-sea temperatures highlighted in red indicate revised values adjusted to fit the stable conditions observed during these periods.
-------
2.1.2 Carpinteria
The Carpinteria tracer study was conducted during September - October 1985 to examine
offshore impacts caused by interaction with complex coastal terrain and shoreline fumigation.
Only the complex terrain tracer dataset (September releases) was evaluated in the study
because AERCOARE and AERMOD do not contain shoreline fumigation parameterization
modules. The receptors were placed along a 20-30 m high bluff within 0.8 to 1.5 km of the
offshore tracer release points. The experiment setup is shown in Figure 2 (graphics files
provided by EPA as a component of the OCS Model Evaluation Data Archive (Earth Tech,
2006b)). Tracer gas was released at heights of 18 and 61 m, between 300 and 700 m from
shore. Wind was measured on a tethersonde at varying heights ranging from 24 to 49 m.
Temperature was measured at a constant height of 9 m. The meteorological measurements
were taken near the tracer release locations. The reported constant mixed layer height of 500 m
for the entire record is suspect, especially during periods of light winds and stable conditions.
The meteorology and experiment parameters for each of the 27 release periods are listed in
Table 2.
[Figure: land use map of the Carpinteria, CA study area (UTM Zone 11N) showing complex-terrain and fumigation tracer release points and sampler locations.]
Figure 2. Carpinteria Experiment Configuration.
-------
Table 2. Carpinteria Tracer Study Meteorology and Release Parameters.
Date/Time | Wind Obs. Ht. (m) | Temp/RH Obs. Ht. (m) | Wind Dir. (deg) | Wind Speed (m/s) | Mix Ht. (m) | RH (%) | Air Temp. (K) | Air-Sea Temp Diff. (K) | Virt. Pot. Temp Grad. (K/m) | Sigma-Theta (deg) | Rel. Ht. (m)
9/19/85 9:00 | 30 | 9 | 259.7 | 1.3 | 500 | 78.8 | 289.45 | -1.1 | 0.00 | 26.84 | 30.5
9/19/85 10:00 | 30 | 9 | 235.4 | 1.3 | 500 | 79.0 | 289.95 | -0.8 | 0.00 | 28.41 | 30.5
9/19/85 11:00 | 30 | 9 | 214.1 | 2.6 | 500 | 80.1 | 290.15 | -0.7 | 0.00 | 24.42 | 30.5
9/19/85 12:00 | 30 | 9 | 252.9 | 3.1 | 500 | 80.1 | 290.25 | -0.7 | 0.00 | 32.86 | 30.5
9/22/85 9:00 | 30 | 9 | 220.8 | 1.0 | 500 | 70.6 | 290.55 | 0.5 | 0.02 | 32.13 | 18.3
9/22/85 10:00 | 30 | 9 | 251.1 | 1.2 | 500 | 81.0 | 290.15 | 0.3 | 0.02 | 17.43 | 18.3
9/22/85 11:00 | 30 | 9 | 253.8 | 2.4 | 500 | 92.1 | 289.55 | 1.0 | 0.02 | 7.97 | 18.3
9/22/85 11:00 | 30 | 9 | 230.0 | 2.4 | 500 | 92.1 | 289.55 | 1.0 | 0.02 | 7.97 | 36.6
9/22/85 12:00 | 30 | 9 | 248.4 | 2.8 | 500 | 91.1 | 289.45 | 1.1 | 0.02 | 17.43 | 18.3
9/22/85 12:00 | 30 | 9 | 237.7 | 2.8 | 500 | 91.1 | 289.45 | 1.1 | 0.02 | 17.43 | 36.6
9/25/85 10:00 | 24 | 9 | 163.8 | 1.0 | 500 | 60.3 | 294.35 | 2.8 | 0.01 | 41.67 | 24.4
9/25/85 11:00 | 46 | 9 | 163.8 | 1.6 | 500 | 69.9 | 294.15 | 2.3 | 0.01 | 9.87 | 24.4
9/25/85 12:00 | 46 | 9 | 165.6 | 1.0 | 500 | 90.3 | 294.05 | 2.1 | 0.01 | 26.06 | 24.4
9/25/85 13:00 | 46 | 9 | 175.0 | 1.0 | 500 | 90.4 | 294.55 | 2.7 | 0.01 | 18.37 | 24.4
9/26/85 12:00 | 49 | 9 | 262.0 | 3.8 | 500 | 83.5 | 291.85 | -0.7 | 0.00 | 10.87 | 24.4
9/26/85 13:00 | 49 | 9 | 262.2 | 4.0 | 500 | 81.0 | 291.95 | -1.0 | 0.00 | 11.80 | 24.4
9/28/85 10:00 | 24 | 9 | 155.8 | 5.4 | 500 | 85.1 | 291.25 | -0.6 | 0.00 | 8.92 | 24.4
9/28/85 10:00 | 24 | 9 | 155.8 | 5.4 | 500 | 85.1 | 291.25 | -0.6 | 0.00 | 8.92 | 42.7
9/28/85 11:00 | 24 | 9 | 174.7 | 3.2 | 500 | 84.1 | 291.15 | -0.8 | 0.00 | 10.87 | 24.4
9/28/85 11:00 | 24 | 9 | 177.0 | 3.2 | 500 | 84.1 | 291.15 | -0.8 | 0.00 | 10.87 | 42.7
9/28/85 13:00 | 24 | 9 | 234.5 | 1.5 | 500 | 82.5 | 291.45 | -0.6 | 0.00 | 10.87 | 24.4
9/28/85 13:00 | 24 | 9 | 229.5 | 1.5 | 500 | 82.5 | 291.45 | -0.6 | 0.00 | 10.87 | 39.6
9/28/85 14:00 | 24 | 9 | 215.0 | 2.1 | 500 | 81.7 | 291.65 | -0.3 | 0.00 | 11.80 | 24.4
9/28/85 14:00 | 24 | 9 | 215.0 | 2.1 | 500 | 81.7 | 291.65 | -0.3 | 0.00 | 11.80 | 39.6
9/29/85 11:00 | 30 | 9 | 243.7 | 3.4 | 500 | 86.0 | 291.35 | -0.3 | 0.00 | 18.37 | 30.5
9/29/85 12:00 | 30 | 9 | 238.9 | 3.1 | 500 | 87.8 | 291.25 | -0.4 | 0.00 | 4.97 | 30.5
9/29/85 12:00 | 30 | 9 | 232.7 | 3.1 | 500 | 87.8 | 291.25 | -0.4 | 0.00 | 4.97 | 61.0
-------
2.1.3 Oresund
The Oresund experiment was conducted in spring 1984 between May 15 and June 14, to study
the effects of variations of surface meteorological conditions from warm land to cool marine
areas on the transport of pollutants. Instrumentation was set up on both sides of the 20 km wide
Oresund Strait dividing Denmark from Sweden to measure meteorology and tracer
concentration. Many different types of meteorological measurement devices were used in an
attempt to describe the boundary layer conditions during the experiment. The overwater
meteorology was characterized solely using the measurements from the Oskarsgrundet NE
lighthouse. Non-buoyant tracer was released from a tower either at 95 m above the ground at
Barseback, Sweden or at 115 m at Gladsaxe, Denmark. Concentrations were sampled along
arcs set up on the shore opposite the tracer release and 2-8 km inland. PBL heights over the
Strait were estimated from data collected by minisondes launched from a boat.
The meteorological conditions and parameters of each of the 36 release periods simulated are
listed in Table 3. Meteorological measurements used were from the Oskarsgrundet NE
lighthouse site. Air-sea temperature differences were as great as 6-8 degrees Centigrade (°C)
during several of the experiment days due to warm advection over cool water.
-------
[Figure: map of the Strait of Oresund experiment area.]
Figure 3. Oresund Experiment Configuration.
-------
Table 3. Oresund Experiment Meteorology and Tracer Release Parameters.
Date/Time | Wind Obs. Ht. (m) | Temp/RH Obs. Ht. (m) | Wind Dir. (deg) | Wind Speed (m/s) | Mix Ht. (m) | RH (%) | Air Temp. (K) | Air-Sea Temp Diff. (K) | Virt. Pot. Temp Grad. (K/m) | Rel. Ht. (m)
5/16/1984 10:00 | 10.0 | 10.0 | 125 | 3.8 | 50 | 97 | 285.8 | 4.1 | 0.05 | 95
5/16/1984 11:00 | 10.0 | 10.0 | 135 | 4.6 | 50 | 94 | 286.4 | 4.7 | 0.05 | 95
5/16/1984 12:00 | 10.0 | 10.0 | 136 | 5.0 | 50 | 88 | 287.5 | 5.7 | 0.05 | 95
5/16/1984 13:00 | 10.0 | 10.0 | 130 | 5.0 | 50 | 83 | 288.5 | 6.7 | 0.05 | 95
5/16/1984 14:00 | 10.0 | 10.0 | 130 | 4.2 | 50 | 78 | 289.7 | 7.8 | 0.05 | 95
5/16/1984 15:00 | 10.0 | 10.0 | 131 | 7.2 | 50 | 70 | 289.7 | 7.8 | 0.05 | 95
5/18/1984 9:00 | 10.0 | 10.0 | 202 | 6.3 | 800 | 97 | 282.7 | 0.1 | -0.008 | 115
5/18/1984 10:00 | 10.0 | 10.0 | 204 | 6.1 | 800 | 97 | 283.0 | 0.4 | -0.008 | 115
5/18/1984 11:00 | 10.0 | 10.0 | 201 | 6.0 | 800 | 96 | 283.3 | 0.6 | -0.008 | 115
5/18/1984 12:00 | 10.0 | 10.0 | 210 | 5.2 | 800 | 94 | 283.3 | 0.5 | -0.008 | 115
5/18/1984 13:00 | 10.0 | 10.0 | 199 | 3.1 | 800 | 90 | 283.6 | 0.8 | -0.008 | 115
5/22/1984 10:00 | 10.0 | 10.0 | 73 | 10.2 | 50 | 85 | 288.3 | 5.3 | 0.04 | 95
5/22/1984 11:00 | 10.0 | 10.0 | 73 | 11.0 | 50 | 83 | 288.8 | 5.8 | 0.04 | 95
5/22/1984 12:00 | 10.0 | 10.0 | 76 | 9.4 | 50 | 80 | 289.3 | 6.3 | 0.04 | 95
5/29/1984 9:00 | 10.0 | 10.0 | 75 | 9.0 | 700 | 88 | 287.7 | 3.8 | -0.008 | 95
5/29/1984 10:00 | 10.0 | 10.0 | 74 | 8.7 | 700 | 84 | 288.9 | 4.9 | -0.008 | 95
5/29/1984 11:00 | 10.0 | 10.0 | 75 | 7.0 | 100 | 79 | 289.6 | 5.5 | 0.035 | 95
5/29/1984 12:00 | 10.0 | 10.0 | 76 | 7.0 | 100 | 73 | 290.3 | 6.1 | 0.035 | 95
6/4/1984 9:00 | 10.0 | 10.0 | 74 | 8.3 | 50 | 86 | 290.6 | 4.8 | 0.035 | 95
6/4/1984 10:00 | 10.0 | 10.0 | 75 | 8.1 | 50 | 84 | 291.4 | 5.6 | 0.035 | 95
6/4/1984 11:00 | 10.0 | 10.0 | 69 | 9.5 | 50 | 84 | 291.6 | 5.8 | 0.035 | 95
6/4/1984 12:00 | 10.0 | 10.0 | 80 | 9.0 | 50 | 79 | 293.3 | 7.5 | 0.035 | 95
-------
Table 3 (continued). Oresund Experiment Meteorology and Tracer Release Parameters.
Date/Time | Wind Obs. Ht. (m) | Temp/RH Obs. Ht. (m) | Wind Dir. (deg) | Wind Speed (m/s) | Mix Ht. (m) | RH (%) | Air Temp. (K) | Air-Sea Temp Diff. (K) | Virt. Pot. Temp Grad. (K/m) | Rel. Ht. (m)
6/5/1984 9:00 | 10.0 | 10.0 | 71 | 7.7 | 50 | 88 | 290.0 | 5.2 | 0.04 | 95
6/5/1984 10:00 | 10.0 | 10.0 | 76 | 9.0 | 50 | 84 | 290.8 | 6 | 0.04 | 95
6/5/1984 11:00 | 10.0 | 10.0 | 76 | 10.6 | 50 | 80 | 292.0 | 7.1 | 0.04 | 95
6/5/1984 12:00 | 10.0 | 10.0 | 74 | 8.5 | 50 | 77 | 292.2 | 7.3 | 0.04 | 95
6/12/1984 9:00 | 10.0 | 10.0 | 228 | 3.5 | 2200 | 75 | 285.3 | -1.6 | 0.04 | 115
6/12/1984 10:00 | 10.0 | 10.0 | 228 | 3.5 | 2200 | 70 | 285.2 | -2.1 | -0.008 | 115
6/12/1984 11:00 | 10.0 | 10.0 | 205 | 3.2 | 2200 | 69 | 285.4 | -1.9 | -0.008 | 115
6/12/1984 12:00 | 10.0 | 10.0 | 198 | 4.7 | 2200 | 74 | 285.7 | -1.6 | -0.008 | 115
6/12/1984 13:00 | 10.0 | 10.0 | 193 | 4.6 | 2200 | 77 | 286.2 | -1.1 | -0.008 | 115
6/14/1984 11:00 | 10.0 | 10.0 | 331 | 4.2 | 2300 | 76 | 286.7 | 0.2 | -0.008 | 115
6/14/1984 12:00 | 10.0 | 10.0 | 350 | 4.5 | 2300 | 72 | 287.1 | 0.5 | -0.008 | 115
6/14/1984 13:00 | 10.0 | 10.0 | 294 | 6.3 | 2300 | 72 | 287.2 | 0 | -0.008 | 115
6/14/1984 14:00 | 10.0 | 10.0 | 291 | 7.0 | 2300 | 74 | 287.4 | 1 | -0.008 | 115
-------
2.1.4 Pismo Beach
The Pismo Beach tracer experiment was conducted in December 1981 and June 1982 along
the coast of California. The experiment involved the release of a tracer at 13.1 - 13.6 m above
the water from a boat mast about 5-7 km off the coast during onshore flow. Downwash was
assumed for the releases, given a boat 7 m high and 20 m wide. Sensors were placed along the
shoreline to measure the concentration of tracer gas. A small group of sensors were also placed
about 7-8 km inland. The receptor locations, release points, and terrain/land use details are
shown in Figure 4. Details for the 31 different release cases are listed in Table 4. Meteorological
data used in the study included wind measured at 20.5 m and temperature at 7 m, both
measured at the release location. Vertical temperature gradients were measured over the water
by aircraft. Similar to the Cameron study, there were some inconsistencies between the air-sea
temperature difference and virtual potential temperature lapse rate. The air-sea temperature
differences were adjusted to coincide with the stability conditions determined by the
atmospheric lapse rate, similar to what was done for the Cameron study.
[Figure: land use map of the Pismo Beach, CA study area (UTM Zone 10N) showing tracer release points and sampler locations.]
Figure 4. Pismo Beach Experiment Configuration.
-------
Table 4. Pismo Beach Experiment Meteorology and Release Parameters.
Date/Time | Wind Obs. Ht. (m) | Temp/RH Obs. Ht. (m) | Wind Dir. (deg) | Wind Speed (m/s) | Mix Ht. (m) | Rel. Humid. (%) | Air Temp. (K) | Air-Sea Temp Diff. (K) a | Virt. Pot. Temp Grad. (K/m) | Sigma-Theta (deg) | Release Height (m)
12/8/81 15:00 | 20.5 | 7.0 | 261 | 2.2 | 100 | 67 | 287.7 | 1.30 | 0.030 | 9.43 | 13.1
12/8/81 16:00 | 20.5 | 7.0 | 284 | 1.6 | 100 | 75 | 287.5 | 1.20 | 0.030 | 12.90 | 13.1
12/11/81 14:00 | 20.5 | 7.0 | 275 | 4.5 | 600 | 74 | 285.6 | 0.00 | 0.010 | 5.60 | 13.1
12/11/81 15:00 | 20.5 | 7.0 | 283 | 5.4 | 600 | 73 | 286.1 | 0.00 | 0.010 | 4.57 | 13.1
12/11/81 17:00 | 20.5 | 7.0 | 289 | 8.6 | 700 | 84 | 286.0 | 0.10 | 0.010 | 2.12 | 13.1
12/11/81 19:00 | 20.5 | 7.0 | 305 | 7.9 | 900 | 81 | 286.1 | 0.20 | 0.010 | 45.00 | 13.1
12/13/81 14:00 | 20.5 | 7.0 | 289 | 5.4 | 50 | 95 | 285.5 | -0.80 | 0.000 | 0.92 | 13.1
12/13/81 15:00 | 20.5 | 7.0 | 280 | 6.1 | 50 | 97 | 285.3 | -0.80 | 0.000 | 2.41 | 13.1
12/13/81 17:00 | 20.5 | 7.0 | 301 | 7.9 | 50 | 92 | 286.2 | 0.35 | 0.060 | 1.89 | 13.1
12/14/81 13:00 | 20.5 | 7.0 | 292 | 7.7 | 50 | 79 | 287.2 | 1.30 | 0.020 | 1.20 | 13.1
12/14/81 15:00 | 20.5 | 7.0 | 292 | 10.9 | 50 | 90 | 286.4 | 0.40 | 0.020 | 1.20 | 13.1
12/14/81 17:00 | 20.5 | 7.0 | 296 | 9.9 | 50 | 88 | 286.7 | 0.90 | 0.020 | 1.78 | 13.1
12/15/81 13:00 | 20.5 | 7.0 | 304 | 5.6 | 50 | 88 | 286.1 | 0.30 | 0.010 | 14.41 | 13.1
12/15/81 14:00 | 20.5 | 7.0 | 299 | 6.1 | 50 | 83 | 287.7 | 1.10 | 0.010 | 45.00 | 13.1
12/15/81 19:00 | 20.5 | 7.0 | 321 | 1.6 | 50 | 70 | 289.4 | 3.40 | 0.030 | 45.00 | 13.1
6/21/82 15:00 | 20.5 | 7.0 | 276 | 4.3 | 800 | 84 | 287.5 | 1.50 | 0.008 | 1.37 | 13.6
6/21/82 16:00 | 20.5 | 7.0 | 269 | 3.8 | 800 | 86 | 287.3 | 1.40 | 0.008 | 2.12 | 13.6
6/21/82 17:00 | 20.5 | 7.0 | 261 | 2.7 | 800 | 87 | 287.3 | 1.50 | 0.008 | 6.84 | 13.6
6/21/82 18:00 | 20.5 | 7.0 | 276 | 3.0 | 800 | 89 | 286.9 | 1.20 | 0.008 | 19.70 | 13.6
6/22/82 15:00 | 20.5 | 7.0 | 274 | 3.7 | 700 | 80 | 288.6 | 1.70 | 0.005 | 6.05 | 13.6
-------
Table 4 (continued). Pismo Beach Experiment Meteorology and Release Parameters.
Date/Time | Wind Obs. Ht. (m) | Temp/RH Obs. Ht. (m) | Wind Dir. (deg) | Wind Speed (m/s) | Mix Ht. (m) | Rel. Humid. (%) | Air Temp. (K) | Air-Sea Temp Diff. (K) a | Virt. Pot. Temp Grad. (K/m) | Sigma-Theta (deg) | Release Height (m)
6/22/82 16:00 | 20.5 | 7.0 | 268 | 5.2 | 700 | 78 | 288.8 | 2.10 | 0.005 | 3.32 | 13.6
6/22/82 19:00 | 20.5 | 7.0 | 289 | 3.2 | 700 | 84 | 287.2 | 1.30 | 0.005 | 10.59 | 13.6
6/24/82 13:00 | 20.5 | 7.0 | 269 | 3.9 | 600 | 82 | 288.1 | 0.90 | 0.010 | 27.79 | 13.6
6/24/82 15:00 | 20.5 | 7.0 | 269 | 5.3 | 600 | 84 | 288.1 | 0.60 | 0.010 | 7.46 | 13.6
6/25/82 12:00 | 20.5 | 7.0 | 286 | 5.6 | 100 | 76 | 288.9 | 2.20 | 0.010 | 1.37 | 13.6
6/25/82 13:00 | 20.5 | 7.0 | 280 | 6.5 | 100 | 80 | 288.5 | 2.60 | 0.010 | 1.60 | 13.6
6/25/82 15:00 | 20.5 | 7.0 | 286 | 9.8 | 100 | 82 | 288.3 | 2.60 | 0.010 | 5.48 | 13.6
6/25/82 16:00 | 20.5 | 7.0 | 288 | 9.1 | 100 | 82 | 288.3 | 2.90 | 0.010 | 0.92 | 13.6
6/25/82 17:00 | 20.5 | 7.0 | 290 | 9.5 | 100 | 81 | 288.4 | 3.20 | 0.010 | 1.20 | 13.6
6/27/82 16:00 | 20.5 | 7.0 | 287 | 12.7 | 100 | 93 | 287.0 | 3.40 | 0.010 | 1.09 | 13.6
6/27/82 18:00 | 20.5 | 7.0 | 285 | 10.2 | 100 | 94 | 287.7 | 3.70 | 0.010 | 7.74 | 13.6
a Air-sea temperatures highlighted in red indicate revised values adjusted to fit the stable conditions observed during these periods.
-------
2.1.5 Ventura
The Ventura tracer experiment was conducted in September 1980 and January 1981. Similar to
Pismo Beach, tracer was released from a boat off the coast of California during onshore flow
conditions. The primary receptors were placed 0.5 - 1.0 km from the shoreline and a secondary
group of receptors were placed 7 - 9 km inland. The release points and receptor arrangement
are shown in Figure 5. Non-buoyant tracer was released from a boat at about 8 m above the
water. Downwash was assumed for the releases, given a boat 7 m high and 20 m wide.
Meteorology and release parameters for each release period are listed in Table 5. Meteorology
data were collected at the release points including wind measured at 20.5 m and temperature at
7 m. Air-sea temperature difference values were revised to correspond to stability conditions
determined by the measured atmospheric temperature gradients. Vertical temperature gradient
over the water was measured by aircraft.
[Figure: map of the Ventura, CA study area (UTM Zone 11N) showing tracer release points and sampler locations.]
Figure 5. Ventura Experiment Configuration.
-------
Table 5. Ventura Tracer Experiment Meteorology and Release Parameters.
Date/Time | Wind Obs. Ht. (m) | Temp/RH Obs. Ht. (m) | Wind Dir. (deg) | Wind Speed (m/s) | Mix Ht. (m) | RH (%) | Air Temp. (K) | Air-Sea Temp Diff. (K) a | Virt. Pot. Temp Grad. (K/m) | Sigma-Theta (deg) | Release Height (m) | Bldg. Ht. (m) | Bldg. Wid. (m)
9/24/80 16:00 | 20.5 | 7.0 | 266 | 4.1 | 400 | 72 | 288.3 | -2.1 | 0.00 | 8.0 | 8.1 | 7.0 | 20.0
9/24/80 18:00 | 20.5 | 7.0 | 281 | 6.2 | 400 | 78 | 288.0 | -2.0 | 0.00 | 6.5 | 8.1 | 7.0 | 20.0
9/24/80 19:00 | 20.5 | 7.0 | 292 | 6.9 | 400 | 77 | 288.0 | -2.1 | 0.00 | 6.0 | 8.1 | 7.0 | 20.0
9/27/80 14:00 | 20.5 | 7.0 | 272 | 6.3 | 400 | 80 | 288.0 | -1.9 | 0.00 | 4.7 | 8.1 | 7.0 | 20.0
9/27/80 19:00 | 20.5 | 7.0 | 272 | 6.1 | 400 | 80 | 289.0 | -1.0 | 0.00 | 3.6 | 8.1 | 7.0 | 20.0
9/28/80 18:00 | 20.5 | 7.0 | 265 | 3.1 | 250 | 80 | 290.0 | 0.0 | 0.01 | 4.4 | 8.1 | 7.0 | 20.0
9/29/80 14:00 | 20.5 | 7.0 | 256 | 3.3 | 100 | 76 | 288.7 | 0.1 | 0.03 | 5.0 | 8.1 | 7.0 | 20.0
9/29/80 16:00 | 20.5 | 7.0 | 264 | 5.1 | 100 | 76 | 289.3 | 0.1 | 0.03 | 3.9 | 8.1 | 7.0 | 20.0
9/29/80 18:00 | 20.5 | 7.0 | 264 | 5.2 | 50 | 76 | 289.2 | 0.1 | 0.03 | 5.2 | 8.1 | 7.0 | 20.0
1/6/81 16:00 | 20.5 | 7.0 | 276 | 4.0 | 50 | 60 | 290.3 | 1.6 | 0.01 | 21.5 | 8.1 | 7.0 | 20.0
1/6/81 17:00 | 20.5 | 7.0 | 283 | 5.1 | 50 | 58 | 290.6 | 1.7 | 0.01 | 13.1 | 8.1 | 7.0 | 20.0
1/6/81 18:00 | 20.5 | 7.0 | 276 | 4.9 | 50 | 60 | 290.4 | 1.8 | 0.01 | 9.4 | 8.1 | 7.0 | 20.0
1/9/81 15:00 | 20.5 | 7.0 | 286 | 4.7 | 100 | 87 | 287.6 | -0.9 | 0.00 | 3.4 | 8.1 | 7.0 | 20.0
1/9/81 16:00 | 20.5 | 7.0 | 277 | 4.6 | 100 | 85 | 288.0 | -0.5 | 0.00 | 4.8 | 8.1 | 7.0 | 20.0
1/9/81 18:00 | 20.5 | 7.0 | 274 | 4.9 | 100 | 87 | 288.2 | -0.3 | 0.00 | 3.1 | 8.1 | 7.0 | 20.0
1/13/81 15:00 | 20.5 | 7.0 | 274 | 5.8 | 50 | 65 | 290.1 | 1.4 | 0.01 | 11.6 | 8.1 | 7.0 | 20.0
1/13/81 17:00 | 20.5 | 7.0 | 242 | 4.2 | 50 | 84 | 289.0 | 0.4 | 0.01 | 8.5 | 8.1 | 7.0 | 20.0
a Air-sea temperatures highlighted in red indicate revised values adjusted to fit the stable conditions observed during these periods.
-------
2.2 WRF Configurations for the Five Tracer Studies
The mesoscale atmospheric simulations for the study were developed using the National Center
for Atmospheric Research's (NCAR's) community-developed WRF model (dynamical core
version 3.4.1) (National Center for Atmospheric Research, 2014). WRF is a limited-area, non-
hydrostatic, terrain-following "eta"-coordinate mesoscale model. WRF is the state-of-the-art
mesoscale model used today to provide meteorological fields for long-range air dispersion
modeling. WRF is versatile in that it can use many different types of initialization datasets and
contains numerous physics modules (Skamarock, et al., 2008).
Mesoscale model results generally have some bias and error and no single configuration
provides the best simulation in all circumstances (Wee, et al., 2012) (Mass, et al., 2008)
(Larson, et al., 2001). Because a single choice of model setup yields a single deterministic
solution, the accuracy and variability of the WRF results are critical to evaluating WRF as an
input to downstream dispersion models. The modeling of each historical field study was
therefore conducted as an ensemble of simulations, varying the reanalysis input dataset and the PBL scheme.
Reanalysis datasets are 3-dimensional grids of historical meteorological data developed for
meteorological and climatological studies. They are built to provide a "best guess" of the state of
the atmosphere using a combination of assimilated measurements and smoothing techniques.
WRF uses the analysis data for initialization, boundary conditions, and analysis nudging (data
assimilation using the gridded analysis data). The reanalysis dataset provides the
meteorological fields forming the initialization and boundary condition fields for the model,
including the sea surface temperature (SST). A number of different organizations have
developed reanalysis datasets using different techniques, grid resolutions, and observational
datasets (Lahoz, et al., 2010).
The PBL scheme parameterizes the interaction of the free atmosphere with the layers of the
atmosphere nearest to the ground. The PBL is the most turbulent portion of the atmosphere,
where turbulent fluctuations transfer momentum from the free atmosphere to the surface.
Turbulent motions in the atmosphere occur at sub-grid scales (for mesoscale modeling) and
therefore must be parameterized. WRF offers a number of PBL parameterization schemes,
each formulated using different techniques and assumptions. The PBL scheme can have a
significant influence on the surface meteorological parameters that are important for offshore
dispersion modeling: surface wind, vertical temperature gradient, and PBL height are all
determined by the PBL scheme. We are unaware of any regulatory guidance that recommends
which PBL scheme should be used for given conditions.
A number of studies have examined the sensitivity of mesoscale model predictions to the choice
of PBL scheme (Zhang & Zheng, 2012) (Hu, et al., 2010) (Xie, et al., 2012). In general, these
studies have found the greatest differences in model results between local closure schemes and
non-local closure schemes. Local closure schemes rely on K-theory (Stull, 1988), in which
turbulent mixing is a function of the local wind shear; they work best for neutral or stable
atmospheric conditions where mixing is dominated by shear. Non-local closure schemes
-------
provide additional parameterization of processes, such as large buoyant eddies, that contribute
to mixing independently of the local wind shear. Non-local closure schemes have been found to
improve the performance of WRF during convective conditions, resulting in better predictions of
wind speed, temperature gradient, and PBL height (Xie, et al., 2012).
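Schematically, the distinction lies in how the turbulent heat flux is closed. The following is the standard textbook form (after Stull, 1988) rather than the exact formulation of any particular WRF scheme; Kh is an eddy diffusivity and γh a counter-gradient term.

```latex
% Local (K-theory) closure: the flux is proportional to the local gradient
\[ \overline{w'\theta'} = -K_h \,\frac{\partial \theta}{\partial z} \]
% Non-local closure: a counter-gradient term represents mixing by large
% buoyant eddies that is independent of the local wind shear
\[ \overline{w'\theta'} = -K_h \left( \frac{\partial \theta}{\partial z} - \gamma_h \right) \]
```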
The reanalysis dataset and PBL scheme selections can significantly affect the simulation, and it
is difficult to anticipate the influence of each selection on the solution. The ensemble examined
in this study included all six possible combinations of the reanalysis datasets and PBL schemes listed below:
• Reanalysis Datasets:
o European Centre for Medium-Range Weather Forecasts' (ECMWF) Reanalysis
Project (ERA) ERA-Interim dataset (ERA-I; 6-hourly analysis output, about 0.5
degree horizontal resolution)
o North American Regional Reanalysis (NARR) (used for U.S. cases only; 3-hourly
analysis dataset, about 0.3 degree horizontal resolution)
• PBL Schemes:
o Yonsei University (YSU) PBL scheme (Hong, et al., 2006): a non-local closure
scheme.
o Mellor-Yamada-Janjic (MYJ) PBL scheme (Mellor & Yamada, 1982) (Janjic,
1994): a local-closure, one-dimensional prognostic turbulent kinetic energy
scheme.
o University of Washington Shallow Convection (UW-PBL) scheme
(Bretherton & Park, 2009): a non-local closure scheme integrated with
sophisticated moist physics parameterizations.
The 25-year NARR retrospective production period (1979-2003) uses a recent version of the
Noah land surface model and incorporates hourly precipitation analyses. NARR's 32-km
horizontal spacing makes it the highest spatial resolution product available for the periods
simulated in this study. The model's vertical structure contains 45 layers and output is 3-hourly
(Mesinger, 2006). NARR has been used previously for WRF modeling of Arctic overwater
meteorology in a BOEM study (Zhang, 2011).
ERA-Interim (ERA-I) is a global atmospheric reanalysis (1979-present) that represents an
improvement over ECMWF's ERA-40 global atmospheric reanalysis dataset. ERA-I provides
6-hourly output, 37 pressure levels, and 0.75° x 0.75° spatial resolution. Its global coverage
makes it the only choice for the Oresund study area (Simmons, et al., 2006).
The YSU and MYJ PBL schemes are robust, widely used in the WRF modeling community, and
have been applied extensively in WRF modeling for regulatory applications, while the UW-PBL
scheme has been shown to reduce climate bias by 7% in the Community Atmosphere Model
(Park, 2009). YSU and UW-PBL are non-local closure schemes, whereas MYJ is a local closure
scheme. The UW-PBL scheme is ideal because it includes parameterizations accounting for
moisture and convection effects on fluxes in the PBL. The scheme was explicitly
21
-------
built to improve predictions of the marine boundary layer. The six WRF configurations tested
and the corresponding naming conventions used in this study are listed in Table 6.
Table 6. WRF Scenarios and Naming Convention.

  Scenario     Description
  ERA.UW       ERA-Interim initialization dataset, UW-PBL PBL scheme
  ERA.YSU      ERA-Interim initialization dataset, YSU PBL scheme
  ERA.MYJ      ERA-Interim initialization dataset, MYJ PBL scheme
  NARR.UW      NARR initialization dataset, UW-PBL PBL scheme
  NARR.YSU     NARR initialization dataset, YSU PBL scheme
  NARR.MYJ     NARR initialization dataset, MYJ PBL scheme
WRF simulations were produced for each of the six configurations for all five tracer dispersion
studies, except that only the ERA-based configurations were run for Oresund, which lies outside
the geographic coverage of the NARR dataset. MMIF was used to extract time series of
meteorological fields at locations corresponding to the measurement sites for each tracer study.
Relative humidity (RH), wind speed and direction, PBL height, and the Monin-Obukhov
length (L) were extracted and compared to the measurements. The results of this analysis,
reported in Section 2.4 using the methods described in Section 2.3, indicate that WRF
simulations initialized with ERA outperform simulations initialized with NARR.
The WRF configuration included simulation blocks ranging from 1.5 to 6.5 days, with a minimum
of 12 hours for model spin-up prior to experimental tracer release times. The spin-up time allows
the model to develop sub-grid scale processes, including vorticity and moisture fields. Table 7
summarizes the date ranges for the tracer studies and corresponding regional weather model
initializations.
Table 7. WRF Simulation Times.

Location: Historical Field Study Date Ranges — WRF Initializations

Cameron, LA
  Period 1: 08Z 07/20/1981 to 13Z 07/29/1981 — 12Z 07/19/1981, 12Z 07/24/1981
  Period 2: 08Z 02/15/1982 to 14Z 02/24/1982 — 12Z 02/14/1982, 12Z 02/19/1982

Carpinteria, CA
  Period 1: 09/19/1985 to 09/29/1985 (Complex Terrain Study only) — 12Z 09/18/1985, 12Z 09/24/1985

Pismo Beach, CA
  Period 1: 12/08/1981 to 12/15/1981 — 12Z 12/06/1981, 12Z 12/11/1981
  Period 2: 06/21/1982 to 06/27/1982 — 12Z 06/19/1982, 12Z 06/24/1982

Ventura, CA
  Period 1: 09/27/1980 to 09/29/1980 — 12Z 09/23/1980
  Period 2: 01/06/1981 to 01/13/1981 — 12Z 01/05/1981, 12Z 01/10/1981

Oresund, Denmark/Sweden
  Period 1: 11Z 05/16/1984 to 13Z 06/05/1984 — 12Z 05/15/1984, 12Z 05/21/1984, 12Z 05/28/1984, 12Z 06/03/1984
  Period 2: 10Z 06/12/1984 to 15Z 06/14/1984 — 12Z 06/11/1984
For the WRF modeling, the SST values in the initialization dataset (NARR or ERA-I) were used
because higher resolution SSTs were not available for the early 1980s. About half of the field
study periods occurred before the first 5-channel Advanced Very High Resolution Radiometer was
launched on the NOAA-7 satellite in June 1981, with usable datasets starting September 1, 1981;
using the higher resolution SST dataset for only a subset of the simulations would have
introduced another source of uncertainty. It should be stressed, however, that modern
applications of WRF can (and do) use high resolution satellite-based SST products to provide
the lower boundary condition.
The sub-grid-scale fluxes at the lower boundary of WRF were treated by the four-layer Noah
land-surface model. The Noah land-surface model was also used for a similar WRF transport
study along the California coast (Yver, et al., 2012). The WRF options used for the study are
listed in Table 8.
23
-------
Table 8. WRF Physics Parameterization Scheme Options Selected for WRF Modeling.

Micro-physics (mp_physics)
  Option selected: Thompson (WRF option 8)
  Description: Moisture physics parameterization. Thompson scheme: 6-class hydrometeors.
  Source: Thompson et al. (2008)

PBL physics (bl_pbl_physics)
  Option selected: YSU, MYJ, UW-PBL (WRF options 1, 2, and 9, respectively)
  Description: YSU: non-local TKE scheme; MYJ: local TKE scheme; UW-PBL: non-local TKE scheme with moist physics.
  Source: Hong et al. (2006); Mellor and Yamada (1982) and Janjic (1994); Bretherton and Park (2009)

Cumulus / convection (cu_physics)
  Option selected: Kain-Fritsch (WRF option 1)
  Description: Sub-grid convection scheme using a mass-flux approach. Used on the 36 and 12 km domains only; convection is resolved explicitly on the high resolution domains. Also used the Kain-Fritsch "Eta" moisture advection trigger.
  Source: Kain (2004)

Radiation (ra_sw_physics, ra_lw_physics)
  Option selected: RRTMG (WRF option 4)
  Description: Rapid radiative transfer model using cloud overlap schemes.
  Source: Iacono et al. (2008)

Land surface (sf_surface_physics)
  Option selected: Unified Noah (WRF option 2)
  Description: 4-layer soil model with fractional snow cover, frozen soil physics, and ice sheet cover physics.
  Source: Tewari et al. (2004)

Surface layer (sf_sfclay_physics)
  Option selected: ETA M-O similarity (WRF option 2)
  Description: Monin-Obukhov similarity theory based scheme.
  Source: Janjic (1994)
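For readers configuring a comparable run, the Table 8 selections map onto variables in WRF's
&physics namelist. The sketch below records that mapping as a simple Python dictionary; the
option numbers come from Table 8, but the layout of the study's actual namelist.input (per-domain
lists and any additional switches) is not reproduced in this report and is an assumption here.

    # Minimal sketch of the Table 8 physics selections as WRF namelist variables.
    physics_namelist = {
        "mp_physics": 8,            # Thompson microphysics
        "bl_pbl_physics": 1,        # 1 = YSU, 2 = MYJ, 9 = UW-PBL (varies by ensemble member)
        "cu_physics": 1,            # Kain-Fritsch on the 36 and 12 km domains only (0 on 4 / 1.33 km)
        "ra_sw_physics": 4,         # RRTMG shortwave
        "ra_lw_physics": 4,         # RRTMG longwave
        "sf_surface_physics": 2,    # Unified Noah land-surface model
        "sf_sfclay_physics": 2,     # Eta Monin-Obukhov similarity surface layer
    }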
Nudging may be used in WRF in an attempt to reduce local error in the meteorological fields
when observations or analyses are available at a given time and location. Options exist to nudge
the wind, temperature, and moisture fields toward 3-D and 2-D analysis fields, and to nudge
toward surface observations using specified horizontal and vertical radii of influence. When
nudging is applied, meteorological variables at adjacent grid points are relaxed toward the
observed or analysis value, weighted by distance. Excessive nudging may lead to non-physical
development of weather patterns if mass balance is significantly distorted, so the weighting
factors must be tuned to prevent too much correction of the fields. For the WRF simulations in
this study, observation nudging was not used, analysis nudging was used on the 36 and 12 km
domains, and nudging of wind, moisture, and temperature within the PBL was not used, following
the advice of Stauffer et al. (1991).
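The nudging strategy just described (analysis nudging on the 36 and 12 km grids only, no nudging
within the PBL, no observation nudging inside WRF) corresponds to WRF's &fdda namelist
controls. The dictionary below is a hedged sketch of that configuration; the variable names are
standard WRF options, but the per-domain values shown are illustrative assumptions rather than
the study's actual settings.

    # Minimal sketch of analysis-nudging controls for a four-domain run
    # (36, 12, 4, and 1.33 km), expressed as assumed WRF &fdda settings.
    fdda_namelist = {
        "grid_fdda":            [1, 1, 0, 0],  # analysis nudging on the two coarse domains only
        "if_no_pbl_nudging_uv": [1, 1, 0, 0],  # exclude wind nudging within the PBL
        "if_no_pbl_nudging_t":  [1, 1, 0, 0],  # exclude temperature nudging within the PBL
        "if_no_pbl_nudging_q":  [1, 1, 0, 0],  # exclude moisture nudging within the PBL
    }
    # Observation nudging inside WRF (obs_nudge_opt) was not used in this study;
    # obsgrid was instead run beforehand to enhance the analysis dataset.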
Observation nudging can also be used in a preliminary step using the preprocessor "obsgrid" in
an attempt to improve the analysis dataset. Observation nudging was used on the analysis
dataset in this study.
To compare the quality of the simulations, time series of temperature, air-sea temperature
difference, wind speed, wind direction, PBL height, and lapse rate extracted from the WRF
solutions were analyzed and compared to the available in-situ measurements.
The WRF vertical structure was assigned 37 levels, concentrated toward the surface. The
boundary layer resolution uses finer vertical spacing than is typical for simulations over land; it is
anticipated that this will help the meteorological fields respond more explicitly to dynamical
influences. Table 9 shows the setup of the vertical grid that defines the layer heights over the
entire domain in the WRF runs, including layer average height and thickness estimates based on
the hypsometric equation, Eq. (1):

    dz = -(Rd T / g) (dp / p)                                              (1)

where Rd is the gas constant for dry air, T is temperature, g is gravitational acceleration, and p is
pressure.
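As an illustration of how the Table 9 height and thickness estimates can be generated, the sketch
below integrates the hypsometric relation upward through a constant-lapse-rate atmosphere
using the constants given in the Table 9 footnote (P0 = 1000 mb, Ptop = 50 mb, T0 = 20.15 °C,
dT/dz = -6.5 °C/km). It is a minimal reconstruction for checking the tabulated values, not the WRF
code itself.

    # Minimal sketch: recover level heights from eta values via Eq. (1),
    # assuming the constant-lapse-rate atmosphere of the Table 9 footnote.
    import math

    P0, PTOP = 1000.0, 50.0        # hPa, from the Table 9 footnote
    T0, LAPSE = 293.3, 0.0065      # K (20.15 C) and K/m

    def height_from_eta(eta, dz_step=1.0):
        """Step upward in height until the pressure implied by 'eta' is reached."""
        Rd, g = 287.05, 9.81
        p_target = eta * (P0 - PTOP) + PTOP      # eta level -> pressure (hPa)
        z, p = 0.0, P0
        while p > p_target:
            T = T0 - LAPSE * z                   # temperature at the current height
            p *= math.exp(-g * dz_step / (Rd * T))   # hydrostatic pressure decay over dz_step
            z += dz_step
        return z, p_target

    for eta in (1.0, 0.9985, 0.997, 0.5, 0.0):
        z, p = height_from_eta(eta)
        print(f"eta={eta:6.4f}  p={p:7.1f} hPa  z={z:8.1f} m")

For example, eta = 0.9985 yields about 12 m and eta = 0.5 about 5,200 m, consistent with the
second and twenty-sixth rows of Table 9.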
The U.S. modeling domains for WRF are defined on the Lambert Conformal Conic (LCC) map
projection identical to the national Regional Planning Organization (RPO) domains, with an
outermost RPO domain (36 km) and telescoping 12 km, 4 km, and 1.33 km nests to capture the
fine detail of coastlines and adjacent topography. The domain configuration for Oresund is
similar, but the location requires a projection defined for Northern Europe. Each of the
inner-most domains was centered on the region of the respective tracer experiment. The
domain setups for the Cameron, Carpinteria, Oresund, Pismo, and Ventura WRF simulations
are shown in Figure 6 through Figure 15, respectively. The 1.33 km domain figures show the
"5-point" buffer zone, a transition region where boundary condition adjustments may lead to
"edge effects." The meteorological fields within the buffer zone should not be used for
dispersion modeling.
25
-------
Table 9. WRF Vertical Grid Setup.

Level*  "eta" level  Pressure (mb)  Height (m)  Mid. Height (m)  Layer Thickness (m)
  1       1.0000        1000            0.0            -                -
  2       0.9985         999           12.2           6.1             12.2
  3       0.9970         997           24.5          18.4             12.2
  4       0.9950         995           40.8          32.7             16.4
  5       0.9930         993           57.2          49.0             16.4
  6       0.9910         991           73.6          65.4             16.4
  7       0.9880         989           98.3          85.9             24.7
  8       0.9850         986          123.0         110.6             24.7
  9       0.9800         981          164.3         143.6             41.3
 10       0.9700         972          247.4         205.9             83.1
 11       0.9600         962          331.2         289.3             83.8
 12       0.9500         953          415.7         373.4             84.5
 13       0.9400         943          500.8         458.2             85.1
 14       0.9300         934          586.6         543.7             85.8
 15       0.9100         915          760.5         673.5            173.8
 16       0.8900         896          937.2         848.8            176.8
 17       0.8700         877         1117.1        1027.1            179.8
 18       0.8400         848         1392.8        1254.9            275.8
 19       0.8000         810         1772.4        1582.6            379.6
 20       0.7600         772         2166.7        1969.6            394.3
 21       0.7200         734         2577.0        2371.9            410.3
 22       0.6800         696         3005.0        2791.0            427.9
 23       0.6400         658         3452.2        3228.6            447.3
 24       0.6000         620         3921.0        3686.6            468.7
 25       0.5500         573         4540.7        4230.8            619.8
 26       0.5000         525         5203.7        4872.2            662.9
 27       0.4500         478         5917.1        5560.4            713.4
 28       0.4000         430         6690.5        6303.8            773.4
 29       0.3500         383         7536.4        7113.5            846.0
 30       0.3000         335         8472.3        8004.4            935.8
 31       0.2500         288         9522.5        8997.4           1050.2
 32       0.2000         240        10724.1       10123.3           1201.6
 33       0.1500         193        12136.7       11430.4           1412.6
 34       0.1000         145        13866.9       13001.8           1730.1
 35       0.0600         107        15621.6       14744.2           1754.7
 36       0.0270          76        17503.4       16562.5           1881.8
 37       0.0000          50        19594.2       18548.8           2090.8

*Calculated using P0 = 1000 mb, Ptop = 50 mb, T0 = 20.15 °C, and dT/dz = -6.5 °C/km; see Eq. (1).
27
-------
Figure 6. Cameron (LA) WRF Domain Map. The entire map contains the 36 km domain, while
d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains, respectively.
28
-------
Figure 7. Cameron WRF 1.33 km Domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line) and the two stations used for the inner domain METSTAT analysis.
(Map coordinates: UTM Zone 15N, datum NAS-C.)
29
-------
Figure 8. Carpinteria (CA) WRF Domain Map. The entire map contains the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains, respectively.
30
-------
Figure 9. Carpinteria WRF 1.33 km Domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line) and the five stations used for the inner domain METSTAT analysis. The
map also marks the sampler locations and tracer release points for the complex terrain and
fumigation experiments. (Map coordinates: UTM Zone 11N, datum NAS-C.)
31
-------
Figure 10. Oresund WRF Domain Map. The entire map contains the 36 km domain, while
d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains, respectively.
32
-------
Figure 11. Oresund WRF 1.33 km Domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line) and the eight stations used for the inner domain METSTAT analysis. The
map shows the Strait of Oresund with land use categories and 25 m terrain contours. (Map
coordinates: UTM Zone 33N, datum EUR-M.)
33
-------
Figure 12. Pismo Beach (CA) WRF Domain Map. The entire map contains the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains, respectively.
34
-------
Figure 13. Pismo Beach WRF 1.33 km Domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line) and the two stations used for the inner domain METSTAT analysis.
(Map coordinates: UTM Zone 10N, datum NAS-C.)
35
-------
Figure 14. Ventura (CA) WRF Domain Map. The entire map contains the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains, respectively.
36
-------
Figure 15. Ventura WRF 1.33 km Domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line) and the six stations used for the inner domain METSTAT analysis.
(Map coordinates: UTM Zone 11N, datum NAS-C.)
37
-------
2.3 WRF Performance Evaluation for the Five Tracer Studies
WRF performance was assessed in two ways: quantitatively, by computing statistics relating
WRF meteorology to observed values, and qualitatively, by graphical comparison of WRF
meteorology to observed values. The quantitative analysis was conducted using the publicly
available METSTAT software (ENVIRON Int. Corp., 2014). METSTAT calculates a suite of
model performance statistics using observations of wind speed, wind direction, temperature,
and moisture; WRF predictions are extracted from the nearest grid cell for comparison to the
observed values. METSTAT computes metrics for bias, error, and correlation and compares
them to a set of benchmarks representing good model performance (Emery et al., 2001).
Statistical measures calculated by METSTAT include observation and prediction means,
prediction bias, and prediction error.
Mean observation (MO) is calculated using values from all sites for a given time period by Eq. (2):

    M_O = (1/(I·J)) Σ_{j=1}^{J} Σ_{i=1}^{I} O_ij                            (2)

where O_ij is the individual observed quantity at site i and time j, and the summations are over all
sites (I) and all time periods (J).
Mean prediction (MP) is calculated from simulation results interpolated to each observation used
to calculate the mean observation for a given time period by Eq. (3):

    M_P = (1/(I·J)) Σ_{j=1}^{J} Σ_{i=1}^{I} P_ij                            (3)

where P_ij is the individual predicted quantity at site i and time j. The predicted mean wind speed
and mean resultant direction are derived from the vector average of the east-west (u) and
north-south (v) wind components.
Bias (B) is calculated as the mean difference in prediction-observation pairings with valid data
within a given analysis region and for a given time period by Eq. (4):

    B = (1/(I·J)) Σ_{j=1}^{J} Σ_{i=1}^{I} (P_ij - O_ij)                     (4)
38
-------
Gross error (E) is calculated as the mean absolute difference in prediction-observation pairings
with valid data within a given analysis region and for a given time period by Eq. (5):

    E = (1/(I·J)) Σ_{j=1}^{J} Σ_{i=1}^{I} |P_ij - O_ij|                     (5)

Note that the bias and gross error for winds are calculated from the predicted-observed residuals
in speed and direction (not from the vector components u and v). The direction error for a given
prediction-observation pairing is limited to the range 0 to ±180°.
Root mean square error (RMSE) is calculated as the square root of the mean squared difference
in prediction-observation pairings with valid data within a given analysis region and for a given
time period by Eq. (6):

    RMSE = [ (1/(I·J)) Σ_{j=1}^{J} Σ_{i=1}^{I} (P_ij - O_ij)^2 ]^(1/2)      (6)
The RMSE, as with the gross error, is a good overall measure of model performance. However,
since large errors are weighted heavily (due to squaring), large errors in a small sub-region may
produce a large RMSE even though the errors may be small and quite acceptable elsewhere.
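A minimal sketch of these statistics is given below, assuming matched one-dimensional arrays of
hourly prediction-observation pairs; the function and variable names are illustrative and are not
METSTAT's own code.

    # Sketch of the Eq. (4)-(6) statistics plus the wind-direction conventions noted above.
    import numpy as np

    def wrap_dir_residual(pred_dir, obs_dir):
        """Limit wind-direction residuals to the range -180 to +180 degrees."""
        return (np.asarray(pred_dir) - np.asarray(obs_dir) + 180.0) % 360.0 - 180.0

    def bias(pred, obs):
        return np.mean(np.asarray(pred) - np.asarray(obs))                   # Eq. (4)

    def gross_error(pred, obs):
        return np.mean(np.abs(np.asarray(pred) - np.asarray(obs)))           # Eq. (5)

    def rmse(pred, obs):
        return np.sqrt(np.mean((np.asarray(pred) - np.asarray(obs)) ** 2))   # Eq. (6)

    def vector_mean_wind(speed, direction_deg):
        """Mean wind speed and resultant direction from u/v vector averaging (Eq. 3 note)."""
        rad = np.deg2rad(direction_deg)
        u = -np.asarray(speed) * np.sin(rad)     # meteorological convention: direction is "from"
        v = -np.asarray(speed) * np.cos(rad)
        mean_dir = np.rad2deg(np.arctan2(-u.mean(), -v.mean())) % 360.0
        return np.hypot(u.mean(), v.mean()), mean_dir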
2.3.1 METSTAT Benchmarks
The METSTAT benchmarks were developed using the results of about 30 meteorological model
performance simulations performed to support air quality studies of urban areas (Emery et al.,
2001). As part of the Western Regional Air Partnership (WRAP) meteorological modeling of the
western United States, including the Rocky Mountain region as well as the complex conditions in
Alaska, another set of model performance benchmarks was developed for complex conditions
(Kemball-Cook et al., 2005).
Table 10 lists the meteorological model performance benchmarks for simple and complex
conditions (Kemball-Cook et al., 2005). Overall, the benchmarks provide a measure of how well
the WRF model performs relative to other modeling cases in the U.S.; given the wide variety of
landforms, weather, and climatic regions in the U.S., these benchmarks are likely applicable to
most regions of the world. Most surface meteorological stations used in the METSTAT analysis
are land-based coastal stations. Point measurements along the coast can be influenced by
marine or land-based boundary layers, depending on the conditions at any given moment, and
strong gradients of temperature, relative humidity, and cloud can exist at the interface between
the marine and land PBLs. Large differences between WRF and the measurements may occur if
the WRF grid resolution is not fine enough to resolve these tight gradients, or if the WRF grid cell
is not located within the same PBL (marine or land-based) as the measurement location. Given
the complexity of meteorological conditions in the PBL along the land-sea interface, the complex
terrain criteria are assumed to provide a more suitable set of performance goals for this study.
Table 10. Meteorological Model Performance Benchmarks for Simple and Complex Conditions.

  Parameter               Simple            Complex
  Temperature Bias        < ±0.5 K          < ±2.0 K
  Temperature Error       < 2.0 K           < 3.5 K
  Humidity Bias           < ±1.0 g/kg       < ±0.8 g/kg
  Humidity Error          < 2.0 g/kg        < 2.0 g/kg
  Wind Speed Bias         < ±0.5 m/s        < ±1.5 m/s
  Wind Speed RMSE         < 2.0 m/s         < 2.5 m/s
  Wind Direction Bias     < ±10 degrees     < ±10 degrees
  Wind Direction Error    < 30 degrees      < 55 degrees
Although the METSTAT analysis can be applied to individual meteorological station datasets, it is
typically used to evaluate performance against a group of stations within the WRF domain. This
approach is advantageous because it characterizes performance across the entire domain and
dilutes the large biases that can occur at any individual site (for example, where the local climate
is heavily influenced by small-scale terrain or roughness features not resolved by the WRF grid).
However, if too many stations are used in the analysis, the statistics may be unduly smoothed
and not truly representative of WRF performance. Each tracer study was evaluated with a small
number of stations (fewer than 10).
METSTAT statistical results are typically displayed in a "soccerplot." This type of plot contains
the statistical results for selected periods plotted with respect to the simple and complex "goals"
listed in Table 10. If a point is located within the goals, it indicates that the METSTAT results
from the given period satisfy the criteria benchmarks for the meteorological variable in question.
If the point is outside the goals, it does not satisfy the criteria and indicates that WRF
performance was poor for the particular period and region evaluated. If the point is within the
complex criteria goals but outside the simple criteria goals, WRF performance is satisfactory for
complex terrain and meteorological conditions, but not necessarily for simple terrain and
meteorological conditions. If the point is inside both the simple and complex criteria goals, WRF
performance is considered satisfactory. This result indicates good agreement between
observations and the simulated surface meteorology. However, for the sites and periods
evaluated in this study, only a few local surface datasets were available.
METSTAT was used to evaluate the performance of each WRF simulation from this study using
surface meteorological data from the U.S. National Climatic Data Center DS3505 database. The
database contains records of most official surface meteorological stations from airports, military
bases, reservoirs/dams, agricultural sites, and other sources dating from 1901 to the present.
40
-------
The measurements taken during each tracer study, reported in the tracer study databases, were
not used for the METSTAT analysis.
Additional WRF performance analysis is conducted using the time series of meteorological
variables extracted at the tracer study measurement sites with MMIF for each case. The time
series comparisons are a more valuable tool for understanding the influence of meteorology on
AERMOD prediction performance because the meteorology at the extracted point is used to
drive AERMOD. The METSTAT analyses are valuable for evaluating the performance of the
WRF simulations on a domain-wide level.
For the purposes of this study, wind direction errors are not critical because, following common
practice, the highest concentration predicted at any receptor is compared with the observed
highest concentration. The soccerplots for wind direction error and bias are included below for
completeness.
The reader is also reminded of the lack of a high-resolution SST dataset for these historical
simulations. Modern WRF applications typically use high resolution datasets that can resolve
changes in SST relative to the shoreline. The reanalysis SST data are provided every six hours,
the same rate as the meteorological parameters.
2.3.2 Cameron WRF Simulation Performance
METSTAT was used to compare the Cameron 1.3 km grid WRF solutions to two surface station
datasets located within the domain. The stations were located within the tracer study area at
coastal locations. Performance was evaluated separately for the winter and summer periods for
each of the WRF scenarios modeled. The analyses were conducted using a month of
meteorology at each station, corresponding to the months of the tracer studies. Therefore, the
periods of meteorology analyzed with METSTAT contain more than just the hours of the tracer
studies. The results are plotted for wind and temperature performance and compared against
the performance goals for simple and complex conditions. The results are shown in Figure 16,
Figure 17, and Figure 18 for wind speed, wind direction, and temperature METSTAT results,
respectively (humidity results were not available for Cameron).
The results indicate that WRF performed within the complex criteria goals for wind speed and
temperature during the summer periods. Temperature error and bias were also within the
complex criteria goals for the winter period, but a positive wind speed bias and error exceeding
the complex criteria goals was prevalent during the winter periods for most of the WRF cases.
Wind direction bias of -10 to -40 degrees during the summer periods exceeded the complex
criteria goals. With respect to wind performance, the best performing simulation was NARR.UW.
Most WRF simulations performed similarly with respect to temperature, all within the complex
criteria but with a slight cool bias.
It is hypothesized that the WRF cold bias of 1-2 °C is at least partially due to the influence of the
cooler sea on the average temperature in the WRF grid cell. This likely results in a cold bias
when WRF is compared to the land-based coastal measurement locations. WRF results likely
compare better to the land-based measurements during onshore flow. During offshore flow, the
measurement location will be dominated by land-based PBL processes, but a portion of the WRF
grid cell will still be influenced by the sea. This same effect is hypothesized to result in a WRF
warm bias when WRF results are compared to offshore measurements near the coastline. Both
onshore and offshore winds occurred during the summer period, but onshore winds were
dominant.
Another possibility is that the WRF PBL schemes allow too much or too little vertical mixing at the
coastline. During onshore flow, too little mixing would be expected to produce a cold bias due to
the lack of entrainment of warmer, drier air from above the marine PBL, and could also explain
the wind direction bias. However, too little mixing would also be expected to bias wind speed
low, whereas the results indicate that wind speed is biased high by 1-2 m/s.
2.3.3 Carpinteria WRF Simulation Performance
The Carpinteria METSTAT evaluation used DS3505 meteorological datasets from five sites.
Although none of the stations are within the study area, as seen in Figure 9, the stations are
located on the coast. The results are shown in Figure 19, Figure 20, Figure 21, and Figure 22
for wind speed, wind direction, temperature, and humidity METSTAT results, respectively. All of
the statistics fall within the acceptable range for complex terrain. Most of the WRF simulations
had satisfactory temperature performance, with bias and error values falling within the simple
terrain criteria range. Humidity was also assessed, and the statistics fell within simple terrain
criteria. All WRF configurations had similar performance. MYJ-based simulations had slightly
less bias and error in wind speed performance, but more bias in wind direction.
Carpinteria temperature bias and error were relatively low, falling within the simple terrain criteria
for most cases. A cold bias could have been expected, given the hypothesis presented in
Section 2.3.2 regarding the influence of WRF cell averaging on results at the coastline.
2.3.4 Oresund WRF Simulation Performance
The Oresund METSTAT soccerplots are shown in Figure 23, Figure 24, Figure 25, and Figure 26
for wind speed, wind direction, temperature, and humidity METSTAT results, respectively. The
Oresund METSTAT analysis was conducted using the eight surface stations located within the
1.3-km domain, as shown in Figure 11. The stations are spaced evenly across the domain at
both coastal and inland sites. The plots indicate that performance statistics fell within the criteria
range for simple terrain conditions for most WRF simulations and periods, and all statistics are
within the complex criteria ranges. Temperature and wind predictions were slightly better in June
than in May, with lower error and bias. All WRF configurations had similar performance scores.
Similar to Carpinteria, the Oresund simulation temperature bias and error are low. The slight cold
bias of up to -1 °C corresponds to the trend seen in the Cameron analysis, where it was
hypothesized (Section 2.3.2) that the cold bias results from comparing WRF to land-based
coastal stations when a portion of the WRF grid cell lies in the marine PBL.
42
-------
2.3.5 Pismo WRF Simulation Performance
The Pismo Beach performance statistics are shown in Figure 27, Figure 28, Figure 29, and
Figure 30 for wind speed, wind direction, temperature, and humidity METSTAT results,
respectively. The METSTAT evaluation was conducted using two surface meteorological
stations. The stations are not near the tracer study area and are located well inland, as seen in
Figure 13; the evaluation of the WRF simulation of marine meteorology is therefore limited for the
Pismo Beach case in the METSTAT analysis. Wind speed and humidity error and bias for most
winter and summer periods are within the simple terrain criteria. Wind speed performance is
within the simple terrain criteria for most of the WRF cases for the summer period and within the
complex terrain criteria for the winter periods, falling just outside the simple terrain criteria due to
high bias. Wind direction error and bias are high (error exceeding 60 degrees) and outside the
complex criteria ranges for all of the winter WRF simulation cases, although the ERA-based
simulations resulted in lower bias and error than the NARR-based simulations. Similarly,
temperature is poorly predicted by all WRF simulations during the winter periods, with a warm
bias of 2-5 °C exceeding the bias criteria range. Temperature error was within the acceptable
range for the ERA simulations only.
The ERA simulations have better performance overall compared to the NARR simulations. The
UW-PBL and YSU PBL schemes also have slightly better performance than the MYJ scheme.
Given the lack of coastal or overwater datasets available for evaluation of WRF performance,
the reported results cannot be considered truly indicative of WRF performance overall for the
Pismo Beach tracer study. The two surface stations used for the analysis are both located in
urban areas, in valleys adjacent to a chain of foothills. In general, the temperature and wind bias
and error for these two sites during the winter simulations are high enough to indicate that the
measurement dataset or WRF results may be suspect.
2.3.6 Ventura WRF Simulation Performance
The WRF simulation performance statistics are shown in Figure 31, Figure 32, Figure 33, and
Figure 34 for wind speed, wind direction, temperature, and humidity METSTAT results,
respectively. METSTAT used meteorological data from six observation sites to calculate
performance statistics for each WRF simulation, as shown in Figure 15. Wind speed performance
falls within the simple terrain criteria goals for error and bias for most of the summer cases and
several of the winter cases. Only the NARR.MYJ simulation exceeds the complex terrain criteria
for wind speed error. The ERA.MYJ and ERA.UW simulations perform the best with very low
bias and the lowest error per season. Temperature performance is within the complex terrain
criteria range for all winter cases and most summer cases (the MYJ simulations exceed the
criteria range due to high positive bias). All wind direction error statistics are within complex
criteria goals, but bias exceeds the goal for ERA winter simulations and ERA.UW and ERA.YSU
summer simulations. Humidity performance falls within the criteria range for all winter cases and
the ERA summer cases.
All WRF simulations of the summer period resulted in a warm bias. The NARR runs produced a
significant bias of 1-2 °C, exceeding the complex terrain criteria (except for NARR.UW). Four of
the meteorological stations used are located on the coast and one station is on an island. The
locations are similar in character to the stations used in the other studies, where a cold bias was
attributed to averaging over the WRF grid cell; however, a warm bias is prevalent in the Ventura
simulations. Although both offshore and onshore winds occurred during the summer (September)
period, offshore winds dominated. The warm bias therefore conflicts with the WRF cell-averaging
hypothesis: with offshore winds dominant, the WRF results would be expected to be prone to a
cold bias.
44
-------
Figure 16. Cameron wind speed METSTAT results.
Figure 17. Cameron wind direction METSTAT results.
45
-------
Figure 18. Cameron temperature METSTAT results.
46
-------
Figure 19. Carpinteria wind speed METSTAT results.
Figure 20. Carpinteria wind direction METSTAT results.
47
-------
Figure 21. Carpinteria temperature METSTAT results.
Figure 22. Carpinteria humidity METSTAT results.
48
-------
Figure 23. Oresund wind speed METSTAT results.
Figure 24. Oresund wind direction METSTAT results.
49
-------
Figure 25. Oresund temperature METSTAT results.
Figure 26. Oresund humidity METSTAT results.
50
-------
Figure 27. Pismo wind speed METSTAT results.
Figure 28. Pismo wind direction METSTAT results.
51
-------
Figure 29. Pismo temperature METSTAT results.
Figure 30. Pismo humidity METSTAT results.
52
-------
Figure 31. Ventura wind speed METSTAT results.
Figure 32. Ventura wind direction METSTAT results.
53
-------
Figure 33. Ventura temperature METSTAT results.
Figure 34. Ventura humidity METSTAT results.
-------
2.3.7 WRF Performance Evaluation Conclusions
The top two best performing WRF runs (least error and bias) for each meteorological variable are
listed in Table 11. The METSTAT analyses revealed that WRF performance was within the
complex criteria for the majority of cases studied and showed that no single WRF configuration is
preferred for all cases, although the ERA.UW runs produced the least bias and error on average.
It is possible that WRF performance is climate or regionally specific, and recommendations could
vary by region.
It is evident from this analysis that ERA-based simulations generally perform better than
NARR-based simulations. ERA-based runs were the top performers for wind speed and direction
for Pismo and Ventura (Oresund used only ERA-based runs), while NARR-based runs were the
top performers for Carpinteria. At Cameron, NARR-based runs performed better for wind speed
and ERA-based runs performed better for wind direction. For temperature, ERA-based runs
performed better overall, but NARR runs outperformed ERA runs in the summer cases of
Cameron, Carpinteria, Pismo, and Ventura. For humidity, ERA-based runs performed better than
NARR runs in all cases for which humidity data were available.
It is also evident from the results that no PBL scheme stands out as the top performer in all
cases, although the ERA.UW configuration tended to perform the best on average. For wind
speed, MYJ outperformed UW and YSU for Carpinteria and Oresund, while the UW scheme was
a top performer for Cameron and was either the top or second-best performer in a majority of the
cases. For wind direction, the YSU scheme was a top performer in most cases. The UW scheme
was also a top or second-best performer in the majority of cases for temperature and humidity.
Given the results of this analysis, we would recommend adoption of the ERA.UW configuration.
This WRF configuration performed better overall and also contains a parameterization scheme
developed for modeling of the marine boundary layer. However, the number of cases analyzed in
this study may be too limited to determine preferred WRF configurations with confidence, and an
expanded study involving multiple simulated years over multiple regions and climates is
recommended to confirm this recommendation. The bulk of the conditions analyzed involved the
California tracer studies (Carpinteria, Pismo Beach, and Ventura), all located near Los Angeles,
CA. These studies provided the opportunity to investigate WRF performance over a subtropical
coastal region characterized by relatively warm, dry air and cool ocean waters. The Cameron
study was conducted in a subtropical coastal climate characterized by warm, humid air and warm
ocean waters, and the Oresund study was conducted in a mid-latitude coastal climate
characterized by relatively cool summers, cold winters, and cold ocean waters. Further study of
climatic conditions not addressed here is recommended, including mid-latitude warm-water,
mid-latitude cold-water, and arctic marine climates.
The METSTAT analysis has provided a review of WRF performance for each tracer study on a
domain-wide scale. WRF performance as indicated by METSTAT may or may not indicate
favorable performance at the locations of the tracer releases. Further analysis in Section 3
evaluates WRF performance at the location of the meteorological measurements from each
tracer study and also describes the AERMOD settings and methodology, which make direct use
of the meteorology at this location. Because the AERMOD simulations use the meteorology at
the location of the tracer release, WRF performance at this location is more important than
performance across the entire domain. The analysis compares the performance at this point to
the METSTAT results for each case.
56
-------
Table 11. Results summary for METSTAT analyses.

Cameron, Feb.
  Wind Speed (m/s): Error exceeds complex criteria; biased +1-2 m/s. Best: NARR.UW, NARR.YSU
  Wind Dir.: All within complex criteria for error; YSU runs' bias of 10-15° exceeds complex criteria. Best: ERA.UW, NARR.UW
  Temp. (°C): NARR.YSU exceeds complex criteria with error of about 4°; all runs cold bias of -1 to -2°. Best: ERA.MYJ, ERA.UW
  Humidity (g/kg): NA

Cameron, Jul.
  Wind Speed (m/s): All within complex criteria for bias, simple criteria for error. Best: NARR.UW, NARR.MYJ
  Wind Dir.: High bias and error of 20-40° in all runs. Best: ERA.YSU, ERA.UW
  Temp. (°C): All within complex criteria; cold bias of -1 to -2° in all cases. Best: NARR.MYJ, ERA.MYJ
  Humidity (g/kg): NA

Carpinteria, Sep.
  Wind Speed (m/s): All within complex criteria for bias, simple criteria for error (except ERA.MYJ); low bias of -1 m/s for YSU and UW cases. Best: NARR.MYJ, ERA.MYJ
  Wind Dir.: All within simple criteria for bias and complex criteria for error; consistently high error of 40-50°. Best: NARR.YSU, NARR.UW
  Temp. (°C): All within simple criteria for error; most cases within simple criteria for bias. Best: NARR.UW, ERA.YSU
  Humidity (g/kg): All within simple criteria. Best: ERA.MYJ, ERA.UW

Oresund, May
  Wind Speed (m/s): All within complex criteria for error and bias; MYJ runs within simple criteria. Best: ERA.MYJ, ERA.UW
  Wind Dir.: All within simple criteria. Best: ERA.MYJ, ERA.YSU
  Temp. (°C): All within simple criteria for error, complex criteria for bias; cold bias of 0.5-1.0°. Best: ERA.UW, ERA.MYJ
  Humidity (g/kg): All within simple criteria. Best: ERA.UW, ERA.YSU

Oresund, June
  Wind Speed (m/s): All within complex criteria for error and bias; MYJ runs within simple criteria. Best: ERA.MYJ, ERA.UW
  Wind Dir.: All within simple criteria. Best: ERA.YSU, ERA.MYJ
  Temp. (°C): All within simple criteria. Best: ERA.UW, ERA.YSU
  Humidity (g/kg): All within simple criteria. Best: ERA.UW, ERA.YSU

Pismo, Dec.
  Wind Speed (m/s): All within complex criteria for bias, simple criteria for error; all biased high 0.5-1 m/s. Best: ERA.YSU, ERA.UW
  Wind Dir.: Very high bias and error; error > 60° in all cases. Best: ERA.YSU, ERA.UW
  Temp. (°C): Very high bias; all runs exceed complex criteria with warm bias of 2-4°. Best: ERA.UW, ERA.YSU
  Humidity (g/kg): All within simple criteria. Best: ERA.MYJ, ERA.UW

Pismo, Jun.
  Wind Speed (m/s): All within complex criteria for bias, simple criteria for error. Best: ERA.UW, ERA.YSU
  Wind Dir.: Bias and error within complex criteria in all cases. Best: ERA.YSU, ERA.MYJ
  Temp. (°C): All within complex criteria for bias, simple criteria for error. Best: NARR.YSU, ERA.UW
  Humidity (g/kg): All within simple criteria. Best: ERA.UW, ERA.MYJ

Ventura, Jan.
  Wind Speed (m/s): Only NARR.MYJ exceeds complex criteria; tendency for positive bias in most runs. Best: ERA.UW, ERA.YSU
  Wind Dir.: Some cases exceed complex criteria for bias; all cases within complex criteria for error. Best: ERA.MYJ, ERA.YSU
  Temp. (°C): All cases within complex criteria; slight cold bias of -0.25° in most runs. Best: ERA.YSU, ERA.UW
  Humidity (g/kg): All within simple criteria. Best: ERA.YSU, ERA.UW

Ventura, Sep.
  Wind Speed (m/s): Most runs within simple criteria for error and bias. Best: ERA.YSU, ERA.UW
  Wind Dir.: ERA runs exceed complex criteria for bias, with bias > 15°; all cases within complex criteria for error. Best: NARR.UW, ERA.YSU
  Temp. (°C): High warm bias in most cases; MYJ cases exceed complex criteria. Best: NARR.YSU, ERA.YSU
  Humidity (g/kg): NARR cases exceed complex and simple criteria with positive bias of 1 to 1.5 g/kg. Best: ERA.YSU, ERA.UW

OVERALL
  Wind Speed (m/s): Most cases within complex criteria. Best: ERA.UW
  Wind Dir.: Cases with high bias and error are common; most within complex criteria. Best: ERA.YSU
  Temp. (°C): Most cases within complex criteria; cold bias common. Best: ERA.UW
  Humidity (g/kg): Most cases within simple criteria. Best: ERA.UW
58
-------
3 DEVELOPMENT OF METEOROLOGICAL INPUTS FOR AERMOD
The AERMOD modeling system was developed as the next generation regulatory air quality
dispersion model, designed to incorporate a state-of-the-art PBL structure based on Monin-
Obukhov similarity theory. Monin-Obukhov theory uses scaling factors based on the rates of heat
and momentum flux to describe the structure and evolution of the PBL. AERMOD replaced the
Industrial Source Complex (ISC3) modeling system, which used Pasquill-Gifford stability classes
and corresponding lookup tables to estimate the dispersion scaling parameters. AERMOD
requires a complex set of meteorological inputs to characterize the PBL structure and the
turbulence parameters used to estimate rates of dispersion. A full description of the formulas and
parameterization schemes used in AERMOD and its meteorological preprocessor AERMET can
be found in USEPA (2004a).
3.1 AERMOD Input Meteorology Files
AERMOD requires two meteorological files as input: a surface meteorological file (SFC file) and
a profile file (PFL file) from one or more levels of wind and temperature data. Each file contains
a time series of hourly-averaged meteorological variables. The PFL file need only include wind
and temperature information at a single height, but turbulence measurements and information at
additional heights may improve the accuracy of the simulation by providing a more complete
description of the atmospheric structure for AERMOD. The meteorological variables contained
in the SFC and PFL files are listed and described in Table 12.
Table 12. Meteorological Fields in the AERMOD Meteorology Input Files.

Sensible heat flux (H, W/m2): The rate of heat transfer to the atmosphere from the ground. A
positive H means the ground is heating the PBL; a negative H means the ground is cooling the
PBL.

Friction velocity (u*, m/s): Characteristic velocity scaling factor used to describe the rate of
transfer of energy from atmospheric momentum to the surface through turbulent motions.
Mechanical turbulence and the rate of pollutant dispersion in the PBL are functions of u*.

Convective scaling velocity (w*, m/s): Characteristic vertical velocity scaling factor used to
describe the transfer of momentum due to convective processes in the PBL. It is used to
estimate turbulence and corresponding rates of dispersion in the convective PBL.

Vertical potential temperature gradient above the mixed layer (dθ/dz, °C/m): Used in convective
conditions only; describes the gradient of potential temperature (the temperature a parcel of air
would have at sea-level pressure) in the interfacial layer above the well-mixed layer. This value
specifies the "strength" of the top of the well-mixed layer for describing the fraction of plume
penetration into and above the interfacial layer.

Potential PBL height under convective processes (zic, m): Height of the mixed layer possible
under convective forcing. AERMOD uses the maximum of zic or zim for the PBL height.

Potential PBL height under mechanical processes (zim, m): Height of the mixed layer possible
under mechanical forcing. AERMOD uses the maximum of zic or zim for the PBL height.

Monin-Obukhov length (L, m): The fundamental scaling parameter of Monin-Obukhov similarity
theory, used to define the influence of buoyancy-induced and mechanical turbulence on the
structure of the surface layer of the atmosphere. In stable conditions it can be considered the
relative height at which buoyant production of turbulent energy equals that produced by
mechanical / wind-shear processes. A negative L indicates unstable, convective conditions, while
a positive L indicates stable conditions. A large absolute value of L is indicative of neutral
conditions, while small absolute values of L are indicative of strongly stable or unstable
conditions.

Roughness length (z0, m): A scaling parameter used to describe the influence of ground surface
friction on the structure of the PBL. Values of z0 over the ocean are very low (10^-5 to 10^-3 m)
and are a function of wave height (Arya, 1988).

Bowen ratio: The ratio of sensible heat flux to latent heat flux from the ground. Values > 1 occur
in drier conditions, when most heat flux is in the form of sensible heat; values < 1 occur in moist
conditions, when sufficient surface moisture is available for evaporation. In marine environments
the Bowen ratio is always small. It is used by AERMET to estimate sensible and latent heat
fluxes during unstable conditions and is passed through to AERMOD to estimate the deposition
of gases.

Albedo: The fraction of total incident solar radiation reflected by the earth's surface. It is used by
AERMET to estimate the surface radiation balance.

Wind speed (WS, m/s): The 1-hour average scalar wind speed at a specified measurement
height. The typical measurement height is 10 m (the meteorological file provides a column to
specify measurement height), but it may be as low as a few meters on meteorological buoys.

Wind direction (WD, degrees): The 1-hour average wind direction at a specified measurement
height.

Temperature (K): The 1-hour average atmospheric temperature at a specified measurement
height, typically 2 m (the meteorological file provides a column to specify measurement height).

Standard deviation of wind direction (σθ, degrees): Provided in the PFL file only. Standard
deviation of the wind direction during the period.

Standard deviation of vertical wind speed (σw, m/s): Provided in the PFL file only. Standard
deviation of the vertical wind speed during the period.
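As a small illustration of how the sign and magnitude of L described above are interpreted, the
helper below classifies an hour as stable, unstable, or near-neutral. The 100 m near-neutral
threshold is an arbitrary assumption for the sketch; AERMOD itself uses L directly rather than a
categorical classification.

    # Illustrative helper based on the description of L in Table 12 (not part of AERMOD).
    def stability_from_L(L_m: float, near_neutral_limit: float = 100.0) -> str:
        if abs(L_m) >= near_neutral_limit:
            return "near-neutral"                 # large |L|: shear-dominated, near-neutral
        return "stable" if L_m > 0 else "unstable (convective)"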
The input meteorology files to AERMOD are typically built by AERMET, the accepted
preprocessor program for regulatory modeling of land-based emission sources. MMIF and
AERCOARE are the alternative preprocessors examined in this study. The MMIF program
provides a method to extract WRF data directly from the WRF output files to create SFC and
PFL files for AERMOD, using the fields available in WRF to estimate the meteorological
variables listed in Table 12. Alternatively, MMIF can create input files for AERCOARE, which
can be used to create the necessary meteorological files for AERMOD for over-water
applications.
Meteorological preprocessors produce the AERMOD meteorological parameters from a set of
raw meteorological measurements or model output data such as wind speed, wind direction,
temperature, solar radiation, differential temperature, humidity, cloud cover, and atmospheric
pressure. The quality of the AERMOD meteorological input is therefore highly dependent on the
representativeness of the raw meteorology fed to the preprocessor.
AERCOARE requires the overwater air-sea temperature difference, relative humidity, and surface
wind speed to characterize the surface layer energy fluxes. The resulting stability of the
overwater surface layer is highly dependent on the sign of the air-sea temperature difference,
especially during light to moderate wind speeds.
61
-------
3.2 Buoy Meteorology Processing with AERCOARE
The meteorological measurements from each tracer study were used to create input meteorology
for AERCOARE. The AERMOD surface (SFC) and upper-air (PFL) files were built using
AERCOARE with the defaults recommended in Richmond and Morris (2012). The PBL height was
calculated by AERCOARE, which computes a mechanical PBL height (zim) using the Venkatram
algorithm (Venkatram, 1980) and uses the observed PBL height for the convective PBL height
(zic). The Venkatram algorithm is also used by AERMET to estimate the mechanical mixing
height (Cimorelli et al., 2004). The SST depth was set to -0.5 m for all studies. The wind
measurement height was set to 10.0, 9.0, 10.0, 7.0, and 7.0 m for Cameron, Carpinteria,
Oresund, Pismo, and Ventura, respectively, to correspond to the experimental measurement
heights.
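The Venkatram (1980) mechanical mixing height is commonly written as zim = 2300 u*^(3/2),
with u* in m/s and zim in m. The sketch below assumes that form; the exact coefficient and any
additional adjustments applied inside AERCOARE or AERMET are not reproduced in this report.

    # Sketch of the commonly cited Venkatram (1980) mechanical mixing-height estimate.
    def venkatram_mechanical_pbl_height(u_star: float) -> float:
        return 2300.0 * u_star ** 1.5

    # Example: u* = 0.3 m/s over water gives roughly 380 m.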
The performance of the WRF-driven AERMOD simulations was compared to results from two
observation-driven cases taken from a previous study (Richmond & Morris, 2012). The
AERCOARE options for these cases, illustrated in the sketch below, were:
• Case 1: Require Abs(L) > 5 m, use measured σθ, and use the Venkatram equation for
  zim, requiring zim > 25 m. (Only Case 2 was evaluated for Oresund because overwater
  σθ data were not available for the period of the study.)
• Case 2: Require Abs(L) > 5 m, use AERMOD-predicted σθ, and use the Venkatram
  equation for zim, requiring zim > 25 m.
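A minimal sketch of the lower limits applied in these two cases is given below, written as a
post-check on hourly values of L and zim. AERCOARE applies equivalent limits internally; the
function is illustrative only.

    # Sketch: enforce |L| > 5 m and zim > 25 m on an hourly record, as in Cases 1 and 2.
    def apply_aercoare_floors(L_m: float, zim_m: float):
        if abs(L_m) < 5.0:
            L_m = 5.0 if L_m >= 0.0 else -5.0   # push |L| out to the 5 m limit, keeping its sign
        zim_m = max(zim_m, 25.0)                # impose the 25 m minimum mechanical PBL height
        return L_m, zim_m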
3.3 WRF Meteorology Extraction Methods
AERMOD meteorological files (SFC and PFL files) are not produced by WRF directly. The
MMIF program is used to extract meteorological data from WRF output files and build the
AERMOD SFC and PFL input files. The variables listed in the SFC and PFL files can be
calculated directly or indirectly by MMIF using the fields available in the WRF output files. The
three methods MMIF can create meteorological files for AERMOD are:
a) Create onsite, upper air, and land use data for AERMET processing,
b) Create AERCOARE input files and run AERCOARE, or
c) Create AERMET-like SFC and PFL files directly.
Method a would be inappropriate for overwater dispersion studies because AERMET is only
configured for overland meteorology. Methods b and c are both tested in this study. For method
b, the WRF simulations provide the variables that might be measured by a buoy, ship, or offshore
platform, such as wind speed, wind direction, air temperature, and SST. These are provided to
AERCOARE, which produces the AERMOD SFC and PFL files using its specialized overwater
algorithms. For method c, MMIF passes through or calculates all variables directly from the WRF
output files. If not available from the WRF output, the similarity scaling variables L and w* are
determined by MMIF from the Richardson-number methodology defined in Louis (1979). In this
study, the variable L is calculated and supplied directly by WRF under all scenarios simulated;
the variable w* is not calculated by WRF, so it is calculated by MMIF.
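For reference, the convective velocity scale is conventionally defined as
w* = [(g/T)(H/ρcp) zi]^(1/3). The sketch below assumes that standard definition; whether MMIF
applies exactly this form, and with which density and temperature values, is not specified in this
report.

    # Sketch of the standard convective velocity scale w* (assumed, not MMIF's code).
    def convective_velocity_scale(H_wm2: float, zi_m: float, T_k: float,
                                  rho: float = 1.2, cp: float = 1004.0,
                                  g: float = 9.81) -> float:
        if H_wm2 <= 0.0 or zi_m <= 0.0:
            return 0.0                    # w* applies only to convective (H > 0) conditions
        return ((g / T_k) * (H_wm2 / (rho * cp)) * zi_m) ** (1.0 / 3.0)

    # Example: H = 100 W/m2, zi = 1000 m, T = 290 K gives roughly 1.4 m/s.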
WRF estimates PBL height through the PBL parameterization scheme applied; each scheme
uses a different method to diagnose PBL height. The WRF PBL heights are also quantized to the
nearest vertical grid cell center and thus can vary abruptly over small spatial distances. MMIF
can instead rediagnose the PBL height using the bulk Richardson approach of Vogelezang and
Holtslag (1996). This option provides a uniform method for estimating PBL height regardless of
the PBL scheme used, and, unlike the Venkatram (1980) method used in AERCOARE and
AERMET, it takes the vertical structure of the atmosphere into account. AERMOD simulations
can be very sensitive to the PBL height (Richmond & Morris, 2012); therefore, MMIF-rediagnosed
PBL heights may produce significantly different predicted concentrations than the PBL heights
used internally by WRF.
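A hedged sketch of a bulk-Richardson PBL-height diagnosis follows: the PBL top is taken as the
lowest level at which the bulk Richardson number computed against the surface exceeds a
critical value. The simple surface-based form and the 0.25 threshold are assumptions for
illustration; MMIF's option follows the Vogelezang and Holtslag (1996) formulation, which includes
additional terms.

    # Sketch of a bulk-Richardson PBL-height diagnosis from model profiles.
    import numpy as np

    def bulk_richardson_pbl_height(z, theta_v, u, v, ric=0.25, g=9.81):
        """z, theta_v, u, v: 1-D profiles from the surface upward."""
        z = np.asarray(z, float); thv = np.asarray(theta_v, float)
        u = np.asarray(u, float); v = np.asarray(v, float)
        wind2 = np.maximum(u**2 + v**2, 0.01)              # avoid division by zero in calm layers
        rib = g * z * (thv - thv[0]) / (thv[0] * wind2)    # bulk Richardson number vs. the surface
        above = np.where(rib > ric)[0]
        return z[above[0]] if above.size else z[-1]        # default to the top level if never exceeded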
One of the goals of this study is to offer a recommendation for the MMIF extraction method used
to provide the meteorology for AERMOD. Given the two overwater methods described above,
four different extraction options were tested:
1. MMIF was applied to extract and prepare data sets for direct use by AERMOD
   (MMIF produces the AERMOD SFC and PFL input files directly). The PBL height
   predicted by WRF is used in the SFC file (referred to as "RCALF" simulations, where
   the PBL height was not recalculated).
2. Same as Option 1, but the PBL height was rediagnosed from the wind speed and
   potential temperature profiles using the bulk Richardson algorithm within MMIF
   (referred to as "RCALT" simulations, where the PBL height was recalculated).
3. MMIF was applied to extract the key overwater meteorological variables of wind
   speed, wind direction, temperature, humidity, and PBL height from the WRF results.
   The MMIF-extracted data were used to build an AERCOARE input file, and AERCOARE
   used these variables to predict the surface energy fluxes, surface roughness length, and
   other variables needed for the AERMOD simulations. For the current study, AERCOARE
   was applied using the defaults recommended in the AERCOARE model evaluation study
   (Richmond & Morris, 2012).
4. Same as Option 3, but the PBL height was rediagnosed using the bulk Richardson
   algorithm within MMIF.
The naming convention and a description of the four extraction methods are listed in Table 13.

Table 13. WRF Meteorology Extraction Options.

  WRF Extraction Method    Process Path
  1) MMIFa.RCALF           WRF -> MMIF -> AERMOD
  2) MMIFa.RCALT           WRF -> MMIF (with PBL diagnosis) -> AERMOD
  3) AERCa.RCALF           WRF -> MMIF -> AERCOARE -> AERMOD
  4) AERCa.RCALT           WRF -> MMIF (with PBL diagnosis) -> AERCOARE -> AERMOD

  a An alternative naming convention is still used in some of the graphics in this report: MMIF: "aerF", AERC: "aerT".
Latitude and longitude coordinates must be supplied for the extraction point. MMIF identifies the
nearest WRF grid point to the coordinates specified and extracts the data time series from this
point (no interpolation between points). The extraction points used were selected to correspond
with the meteorological measurement sites as closely as possible given the provided
information. If MMIF is to be used to provide meteorology for AERMOD for overwater cases,
care must be taken to ensure that the data are extracted for a grid cell that is over water. It is
possible that the nearest grid cell to a near-shore source is located over land. In this case, the
modeler would need to locate the nearest overwater grid cell and configure MMIF to extract data
from the appropriate cell. All extractions used for this study were from grid points over water.
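The overwater check described above can be automated before running MMIF. The sketch below
picks the nearest WRF grid cell to a requested latitude/longitude and falls back to the nearest
water cell when that cell is over land. The array names (lat2d, lon2d, and landmask with 1 = land
and 0 = water) are assumptions about how the WRF fields might be held in memory, and the flat
lat/lon distance ignores the map projection, which is adequate only for a nearest-cell search.

    # Sketch: select the nearest overwater WRF grid cell for MMIF extraction.
    import numpy as np

    def nearest_overwater_cell(lat2d, lon2d, landmask, lat0, lon0):
        dist2 = (np.asarray(lat2d) - lat0) ** 2 + (np.asarray(lon2d) - lon0) ** 2
        j, i = np.unravel_index(np.argmin(dist2), dist2.shape)
        if landmask[j, i] == 0:
            return j, i                                   # nearest cell is already over water
        dist2_water = np.where(np.asarray(landmask) == 0, dist2, np.inf)
        return np.unravel_index(np.argmin(dist2_water), dist2_water.shape)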
3.4 Evaluation of Extracted Meteorology from WRF
The performance of each WRF configuration (the six combinations of two reanalysis datasets and
three PBL schemes discussed previously) can also be evaluated by comparing the extracted
WRF data to the tracer experiment meteorological measurements. Time series of wind speed, air
temperature, sea temperature, air-sea temperature difference, relative humidity, WRF PBL height,
and MMIF RCALT PBL height were prepared for each WRF scenario for comparison to the
measurements. These data are used by MMIF or AERCOARE (refer to Table 13) to produce the
scaling variables, such as the Monin-Obukhov length, used by AERMOD. Time series of L from
each WRF extraction were also developed for comparison to the L values calculated by
AERCOARE from the measurement datasets. Wind direction was evaluated for the complex
terrain experiments (Oresund and Carpinteria). A discussion of the evaluation methodology can
be found in Section 4.1.
Each figure included in the subsequent sections contains points representing the average
measured value and WRF value of the respective meteorological variables for the hour of each
tracer release event. Tracer release events typically spanned several consecutive hours, and
releases were generally conducted several days to several weeks apart. Because the figures are
presented as seasonal time series, data points for release events covering multiple consecutive
hours appear bunched together; where the bunched symbols overlap, there is little difference
among the WRF results. The seasonal bias and error (RMSE for wind speed) for all WRF
simulations for wind speed, temperature, SST, air-sea temperature difference, relative humidity,
and PBL height are listed in Table 14. The results for each tracer study are discussed following
Table 14.
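The statistics reported in Table 14 are straightforward: bias is the mean of the predicted-minus-observed differences and the wind speed error is the RMSE; the error columns for the other variables are taken here to be the mean absolute (gross) error, which is the usual METSTAT convention but is an assumption about the source table. The short sketch below, with made-up hourly values, shows the calculation and a comparison against the simple terrain wind speed benchmarks quoted in the table.

    import numpy as np

    def bias_rmse_mae(predicted, observed):
        """Mean bias, RMSE, and mean absolute (gross) error of paired hourly values."""
        diff = np.asarray(predicted, dtype=float) - np.asarray(observed, dtype=float)
        return diff.mean(), np.sqrt((diff ** 2).mean()), np.abs(diff).mean()

    # Made-up hourly wind speeds (m/s) for a handful of tracer release hours
    wrf_ws = [4.2, 5.1, 6.3, 5.8, 4.9]
    obs_ws = [3.8, 5.5, 5.9, 6.4, 4.1]
    bias, rmse, mae = bias_rmse_mae(wrf_ws, obs_ws)
    print(f"bias = {bias:+.2f} m/s, RMSE = {rmse:.2f} m/s, gross error = {mae:.2f} m/s")
    print("within simple terrain criteria:", abs(bias) < 0.5 and rmse < 2.0)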
Table 14. WRF meteorology bias and error compared to tracer study measurements.
Each cell lists bias/error; the error statistic for wind speed is RMSE. Units: wind speed (m/s);
temperature, SST, and air-sea temperature difference (°C); relative humidity (%); PBL heights (m).
The criteria rows give the METSTAT benchmarks for simple and complex terrain (NA = no benchmark).

Site (month, hours)        Scenario    Wind speed    Temp.        SST         Air-Sea      Relative    WRF PBL      MMIF (RCALT)
                                                                              Temp. Diff.  Humidity    Height       PBL Height
METSTAT criteria (simple terrain)      <±0.5/<2.0    <±0.5/<2.0   NA          NA           NA          NA           NA
METSTAT criteria (complex terrain)     <±1.5/<2.5    <±2.0/<3.5   NA          NA           NA          NA           NA
Cameron (Feb, 17 h)        ERA.MYJ     -0.3/1.5      2.7/2.9      4.6/4.6     -1.9/1.9     16/16       29/182       -29/75
                           ERA.UW       0.3/1.3      2.8/3.0      4.6/4.6     -1.8/1.8     13/14       -33/86       -16/69
                           ERA.YSU      0.8/1.4      2.3/2.4      4.6/4.6     -2.3/2.4     14/14       61/86        -7/55
                           NARR.MYJ     0.2/1.3      2.6/2.8      4.3/4.3     -1.7/1.7     15/15       -36/117      -50/93
                           NARR.UW      0.4/1.3      2.7/2.9      4.3/4.3     -1.6/1.7     15/15       -57/62       -46/78
                           NARR.YSU     0.6/1.3      2.1/2.3      4.3/4.3     -2.2/2.3     16/16       13/59        -27/55
Cameron (Jul, 9 h)         ERA.MYJ     -0.1/1.6      0.4/0.7      -1.7/1.7    2.2/2.2      18/18       -442/442     -175/286
                           ERA.UW       0.7/1.3      0.4/0.8      -1.7/1.7    2.2/2.2      8/8         -231/261     -133/221
                           ERA.YSU      0.4/0.8      0.4/0.7      -1.7/1.7    2.2/2.2      9/9         -61/202      -68/210
                           NARR.MYJ     0.4/2.1      0.1/0.8      -1.9/1.9    2.0/2.0      17/17       -221/485     -88/245
                           NARR.UW      1.3/1.7      0.3/0.9      -1.9/1.9    2.1/2.1      6/11        -32/257      31/211
                           NARR.YSU     1.2/1.7      0.3/0.8      -1.9/1.9    2.2/2.2      7/9         122/213      76/183
Carpinteria (Sep, 20 h)    ERA.MYJ     -1.0/1.3      -1.5/1.4     -1.5/1.2    0.0/0.8      8/11        -477/477     -434/434
                           ERA.UW      -1.1/1.4      -1.6/1.4     -1.5/1.2    -0.2/0.9     10/13       -484/484     -444/444
                           ERA.YSU     -0.8/1.1      -1.6/1.4     -1.5/1.2    -0.2/0.9     8/11        -421/421     -413/413
                           NARR.MYJ    -1.2/1.3      -1.4/1.2     -1.3/1.1    -0.1/0.6     8/13        -477/477     -437/437
                           NARR.UW     -1.0/1.6      -1.4/1.2     -1.3/1.1    -0.1/0.8     8/11        -476/476     -435/435
                           NARR.YSU    -0.9/1.0      -1.8/1.4     -1.3/1.1    -0.5/1.0     7/11        -378/378     -375/375
Oresund (May-June, 70 h)   ERA.MYJ     -0.9/2.1      -2.2/2.2     -0.6/0.8    -1.6/2.0     6/6         -676/788     -716/716
                           ERA.UW      -1.3/2.9      -2.0/2.0     -0.6/0.8    -1.4/1.9     2/6         -732/732     -720/720
                           ERA.YSU     -1.1/1.9      -2.0/2.0     -0.6/0.8    -1.4/2.0     2/6         -678/719     -710/710
Pismo (Dec, 16 h)          ERA.MYJ      1.8/2.4      0.8/1.0      0.9/0.9     -0.1/0.5     3/6         200/333      -167/194
                           ERA.UW       1.7/2.3      0.6/0.9      0.9/0.9     -0.3/0.6     5/7         -190/190     -168/185
                           ERA.YSU      1.6/2.4      0.7/0.9      0.9/0.9     -0.2/0.5     3/6         -164/212     -155/204
                           NARR.MYJ     0.6/2.4      1.4/1.5      1.5/1.5     -0.2/0.6     3/8         41/301       -176/180
                           NARR.UW      0.6/2.1      1.4/1.5      1.5/1.5     -0.2/0.6     4/8         -183/184     -170/176
                           NARR.YSU     0.8/2.2      1.0/1.7      1.5/1.5     -0.5/0.9     4/9         -159/190     -153/185
Pismo (Jun, 15 h)          ERA.MYJ      1.4/2.4      -0.9/1.1     1.3/1.4     -2.3/2.3     9/10        -256/423     -291/299
                           ERA.UW       0.7/2.3      -0.6/0.8     1.3/1.4     -1.9/1.9     8/8         -307/329     -281/291
                           ERA.YSU      1.2/2.3      -1.3/1.3     1.3/1.4     -2.6/2.6     10/10       -249/298     -248/261
                           NARR.MYJ     0.1/1.3      0.1/0.5      2.1/2.1     -2.0/2.0     10/10       -304/457     -313/325
                           NARR.UW     -0.6/1.7      0.1/0.5      2.1/2.1     -2.0/2.0     11/11       -356/369     -303/323
                           NARR.YSU     0.1/1.7      -1.0/1.1     2.1/2.1     -3.1/3.1     15/15       -325/347     -302/328
Ventura (Jan, 8 h)         ERA.MYJ     -1.6/1.6      -1.1/1.1     -1.0/1.0    -0.1/0.7     8/12        -46/46       -39/39
                           ERA.UW      -2.0/2.0      -1.3/1.3     -1.0/1.0    -0.3/0.8     10/14       -50/50       -36/36
                           ERA.YSU     -1.9/1.9      -1.2/1.2     -1.0/1.0    -0.1/0.6     9/11        -32/35       -32/32
                           NARR.MYJ    -2.0/2.0      -0.8/0.9     -0.6/0.6    -0.2/0.6     6/10        -46/46       -35/35
                           NARR.UW     -2.0/2.0      -0.8/0.8     -0.6/0.6    -0.2/0.6     1/15        -33/33       -23/23
                           NARR.YSU    -0.7/1.4      0.0/0.7      -0.6/0.6    0.6/1.0      -7/16       31/75        -23/24
Ventura (Sep, 9 h)         ERA.MYJ     -0.9/1.1      1.2/1.3      -0.5/0.7    1.7/1.7      13/13       -254/254     -238/238
                           ERA.UW      -1.1/1.2      0.9/1.1      -0.5/0.7    1.5/1.5      18/18       -241/241     -232/232
                           ERA.YSU     -0.7/1.0      1.1/1.2      -0.5/0.7    1.6/1.6      16/16       -216/216     -227/227
                           NARR.MYJ    -0.5/1.2      2.3/2.3      1.2/1.2     1.2/1.2      14/14       -240/240     -220/220
                           NARR.UW     -0.1/0.8      2.2/2.2      1.2/1.2     1.0/1.0      17/17       -223/223     -204/204
                           NARR.YSU    -0.2/1.2      0.6/0.9      1.2/1.2     -0.6/1.1     21/21       -191/192     -187/188
3.4.1 Cameron
Figure 35 - Figure 42 contain the summer and winter time series plots of the WRF and
measured meteorology. Each variable is discussed in the sections below.
3.4.1.1 Wind speed
Wind speed results are shown in Figure 35 for the winter and summer periods. The RMSE as
seen in Table 14 is within simple terrain criteria for all WRF scenarios except for summer
NARR.MYJ, which is within the complex criteria. Most of the WRF cases are biased high by up
to about 1 m/s on average except for ERA.MYJ cases which are biased slightly low. All of the
MYJ cases are within simple criteria for bias. The best performing WRF case, with least total
bias and RMSE, is the ERA.YSU case for the summer and NARR.MYJ for the winter. The July
27, 1981 release period results in the greatest differences between WRF estimates and
measurements, with the NARR-based simulations overpredicting wind speed by 3-6 m/s,
contributing to the highest RMSE (NARR.MYJ summer) for Cameron wind speed.
These findings coincide with the summer period METSTAT results in Section 2.3.2, where WRF
results fall within the acceptable ranges for complex terrain. The winter period METSTAT
analysis showed wind speed RMSE exceeding the complex criteria for all runs except for
NARR.UW. The measurement location RMSE and bias are all within the complex criteria for the
winter period. The UW and YSU runs are within the simple terrain criteria at the measurement
location. The overprediction of wind speed at the measurement point during the winter periods
does correspond with the bias indicated by the METSTAT results.
3.4.1.2 Air Temperature
All of the WRF simulations overpredict air temperature for most of the winter period.
Temperature predictions during the July periods were closer to the measurements as displayed
in Figure 36. All of the WRF predictions are very similar except for the Feb. 17th period, where
the YSU simulations (both ERA and NARR) overpredict temperature by about 1-2 °C compared
to the other cases that overpredict temperature by 4-5 °C.
The values for bias and error listed in Table 14 emphasize the overprediction of temperature
during the winter period. Warm bias in all WRF cases exceeds the complex criteria. NARR.YSU,
with a warm bias of 2.1 °C, only slightly exceeds the criteria of 2.0 °C. Temperature error is
within the complex criteria goal, but every case exceeds the simple terrain criteria.
The lower and less biased temperatures in the YSU cases correspond to more cloud cover than
the other simulations. The YSU simulations also result in precipitation during this period while
the other simulations were dry. Wind direction is similar in all simulations but wind speed is a bit
higher in the YSU simulations (about 5 m/s vs. 4 m/s in the other simulations: observations are
about 4 m/s at the time).
A review of the surface weather measurements in the domain reveals the tracer study
measurements and YSU temperatures correspond closely to regional temperatures at the
beginning of the period. However, rapid warming occurs in the region by the end of the period,
increasing temperatures up to the magnitude indicated by the UW-PBL and MYJ simulations. At
the end of the period, the buoy measurement is much cooler with air temperatures at about
14 °C compared to other regional temperatures in the 17-24 °C range. These results indicate
that the YSU simulation correctly predicts persistence of the cloud-topped marine PBL,
whereas the UW and MYJ simulations dissipate the cloud too quickly. Surprisingly, the UW
simulations are the most positively biased, with a warm bias of about 3 °C, despite being
configured specifically for improved marine boundary layer modeling.
The METSTAT results, shown in Figure 18, though within the set criteria, indicated that the
WRF simulations underpredicted regional temperature in both the summer and winter periods. This
conflicts with the overpredictions at the buoy location during the winter periods. The YSU cases
are also the poorer performing WRF simulations according to the METSTAT analysis, but are
the best performing cases when compared to the tracer study measurement location.
WRF prediction of air temperature during the summer period was more favorable, with bias and
error from all cases falling within the simple terrain goals. All WRF simulations overpredict
temperature by about 2 °C for the tracer study hours on July 27th, but predict temperature with
less than 1 °C error in the other periods.
3.4.1.3 Sea Surface Temperature, Air-Sea Temperature Difference, and Monin-Obukhov Length
SST is not a variable estimated by WRF; it is a static value taken from the reanalysis data,
so the choice of PBL scheme has no bearing on SST. The SST values may vary between
reanalysis datasets, but the ERA and NARR have very similar values during both the summer
and winter periods of the experiment. The reanalysis SST values are highly overpredicted (by
about 3-6 °C) during the winter periods and underpredicted (by about 1-2 °C) during the summer
periods as shown in Figure 37.
The measurement-based air-sea temperature differences were positive in the winter periods as
presented in Figure 38. These differences will promote a negative heat flux as heat is
transferred from the atmosphere to the sea surface, promoting stable conditions as the surface
layer of the atmosphere is cooled. As a result, 1/L is positive. The predicted differences are near
0 °C during the winter periods except for the negative values resulting from the YSU
simulations: a direct result of the overpredicted reanalysis SST values during this period. The
WRF-based predictions result in fairly neutral conditions, as indicated in Figure 39 by the 1/L
values near zero. Negative 1/L values (unstable conditions) are predicted by the YSU
simulations, a consequence of the erroneous sign of their air-sea temperature difference: the
YSU air-sea temperature difference is negative, producing an upward heat flux from the sea and
unstable atmospheric conditions, whereas the measured air-sea temperature differences were positive.
The summer period measurements indicate a negative air-sea temperature difference as shown
in Figure 38. The WRF simulations resulted in an air-sea temperature difference near 0 °C,
perhaps due to the cold bias of the SST in the reanalysis datasets. The observation-based
meteorological conditions result in a positive heat flux from the sea surface to the surface layer
of the atmosphere resulting in unstable conditions. These conditions are evident based on
negative values of 1/L. The lack of significant air-sea temperature difference results in slightly
unstable conditions in each of the WRF simulations. The MYJ simulations produce the most
unstable conditions of the WRF scenarios, though not nearly as unstable as the measurements
indicate. The better MYJ performance is due primarily to slightly cooler predicted air
temperatures than the UW and YSU simulations. The lower air temperature supports more heat
flux from the sea to the atmosphere.
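The chain from air-sea temperature difference to heat flux sign to 1/L that drives these stability results can be sketched with a simple bulk estimate. The fragment below is schematic only: it fixes the friction velocity and the heat transfer coefficient, which AERCOARE actually iterates for, so the magnitudes are rough while the signs follow the physics described above.

    RHO = 1.2        # air density (kg/m^3)
    CP = 1004.0      # specific heat of air at constant pressure (J/kg/K)
    KAPPA = 0.4      # von Karman constant
    G = 9.81         # gravitational acceleration (m/s^2)

    def bulk_heat_flux_and_inverse_L(t_air_c, sst_c, wind_speed, u_star=0.25, c_h=1.1e-3):
        """Schematic bulk estimate of the sensible heat flux (positive upward)
        and of 1/L. u_star and the transfer coefficient c_h are fixed
        placeholder values here, not the iterated COARE solution."""
        theta = t_air_c + 273.15                                     # K, near-surface temperature
        heat_flux = RHO * CP * c_h * wind_speed * (sst_c - t_air_c)  # W/m^2
        inv_L = -KAPPA * G * heat_flux / (RHO * CP * theta * u_star ** 3)
        return heat_flux, inv_L

    # Winter-like case: air warmer than the sea -> downward flux, stable (1/L > 0)
    print(bulk_heat_flux_and_inverse_L(t_air_c=16.0, sst_c=14.0, wind_speed=4.0))
    # Summer-like case: sea warmer than the air -> upward flux, unstable (1/L < 0)
    print(bulk_heat_flux_and_inverse_L(t_air_c=28.0, sst_c=30.0, wind_speed=4.0))

The first call (air warmer than the sea) returns a downward flux and a positive 1/L; the second (sea warmer than the air) returns the opposite, mirroring the winter and summer Cameron cases discussed above.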
3.4.1.4 PBL Heights
The time series of Cameron experiment and WRF PBL heights for both winter and summer
periods are shown in Figure 40, with the corresponding bias and error listed in Table 14. It is
evident the WRF PBL heights (RCALF
cases) are scattered over a wide range. For example, minimum PBL heights for the Feb. 17
experiment cases are about 25 m (NARR.UW), while maximum PBL heights are near 200 m
(ERA.YSU). Most WRF runs are biased towards PBL heights that are lower than the
measurements, as seen in Table 14. The NARR.YSU case results in the least bias and error, +13 m
and 59 m, respectively, for the winter periods. The NARR.UW case results in the least bias, -32 m,
for the summer periods, and the ERA.YSU case results in the least error, 202 m, for the summer
periods. The YSU runs tend to over-predict PBL height but are the most accurate overall. The MYJ
cases result in significant low bias in the summer cases, -221 m and -442 m for the NARR and
ERA cases, respectively. The non-local closure used in the YSU scheme is evidently important,
especially in the summer, resulting in WRF runs that outperform those using the MYJ scheme.
PBL heights calculated by MMIF (RCALT cases) are plotted in Figure 41. Qualitatively, it is
evident that MMIF improves the overall prediction of PBL height because the scatter of WRF
estimates is lower and most estimates are more similar in magnitude to the measurements. The
bias and error results for each case, included in Table 14, are improved over the WRF RCALF
cases. The ERA.YSU is the better performing case for winter periods, with a bias of -7 m and
average error of 55 m. All WRF cases are biased low, under-predicting PBL height by up to 50
m on average. The ERA cases are also biased low in the summer cases, -175 m for ERA.MYJ.
The NARR.UW is the best performing case for summer periods, with a positive bias of +31 m
and error of 211 m.
3.4.1.5 Relative Humidity
WRF generally overpredicts RH over both summer and winter periods as shown qualitatively in
Figure 42. It is possible that the Feb. 24th RH measurements are erroneous given that values of
50% are anomalously low for a marine environment. The surface weather observations in the
domain at that time suggest RH was in the range of 70-80%, similar to the range of
RH values measured on Feb. 22nd and Feb. 23rd. The average bias and error values are
included in Table 14. All WRF cases predict RH values higher than measured. The UW PBL
runs result in lower bias and error overall, compared to the other PBL schemes.
Figure 35. Cameron wind speed time series: winter releases (top) and summer releases (bottom).
Figure 36. Cameron air temperature time series: winter releases (top) and summer releases (bottom).
Figure 37. Cameron SST time series: winter releases (top) and summer releases (bottom).
Figure 38. Cameron air-sea temperature difference time series: winter releases (top) and summer releases (bottom).
Figure 39. Cameron inverse of L time series: winter releases (top) and summer releases (bottom).
Figure 40. Cameron PBL height (RCALF) time series: winter releases (top) and summer releases (bottom).
Figure 41. Cameron PBL height (RCALT) time series: winter releases (top) and summer releases (bottom).
Figure 42. Cameron relative humidity time series: winter releases (top) and summer releases (bottom).
3.4.2 Carpinteria
Figure 43 - Figure 51 show the time series plots for the Carpinteria meteorological variables.
The Carpinteria experiment was conducted in September of 1985 and therefore did not involve
tracer releases during winter conditions. Bias and error values for each WRF simulation are
listed in Table 14.
3.4.2.1 Wind Speed
Wind speeds were low during the Carpinteria experiment as depicted in Figure 43.
Measurements fall within the range of 1-5 m/s during the tracer release periods. The ERA-
based WRF simulations underpredict wind speed during the Sept. 22nd release period and
overpredict wind speed during the Sept. 25th release period. The NARR-based WRF cases
appear qualitatively to predict wind speed better in general except for the Sept. 26th release
period where only the ERA.UW simulation captures the higher wind speeds (about 4 m/s versus
about 2 m/s predicted by the other simulations) measured during the period. Overall, all WRF
cases are biased low, with average wind speed bias of about -1 m/s. The average bias values
for all cases exceed the simple terrain criteria for wind speed, but fall within the complex terrain
criteria. The wind speed RMSE is within the simple terrain criteria for all cases. The YSU cases
perform slightly better, with lowest bias (-0.8 to -0.9 m/s compared to -1.0 to -1.2 m/s for UW
and MYJ cases) and lower error.
The regional METSTAT results, shown in Figure 19, indicated that the MYJ-based simulations
performed the best for regional prediction of wind speed. The NARR.MYJ case is the most biased
at the measurement location, yet was among the least biased regionally. However, the overall bias
and error are relatively low, all within the complex criteria.
3.4.2.2 Air Temperature
All WRF simulations predict nearly the same temperature over all of the release periods, as
evident in Figure 44. The greatest differences are during the Sept. 25th release, where the ERA-
based simulations are about 1 °C cooler than the NARR-based simulations. The WRF cases
result in a cold bias of -1.4 to -1.8 °C, as seen in Table 14, with YSU runs the coolest. The cold
bias is within the METSTAT complex criteria goal but exceeds the simple terrain criteria.
Temperature error is within the simple terrain criteria for all WRF cases.
The regional METSTAT results, shown in Figure 21, are not a perfect predictor of bias and
error at the tracer study meteorological measurement location. The onshore-based METSTAT
results indicate regional performance falls within the simple terrain criteria for most of the WRF
cases (except NARR.YSU with cold bias exceeding the criteria and ERA.MYJ with warm bias
exceeding the criteria). Regional ERA.MYJ temperature is biased warm by about +1.0 °C, while the
local value is biased low at -1.5 °C. The most extreme cold bias occurs with the NARR.YSU runs,
at values of -0.8 and -1.8 °C for the regional METSTAT and local experiment measurement location
analyses, respectively.
3.4.2.3 Sea Surface Temperature, Air-Sea Temperature Difference, and Monin-Obukhov Length
The reanalysis SST values are generally the same for NARR and ERA datasets as shown in
Figure 45. The reanalysis datasets overpredict SST by 1-2 °C on Sept. 22nd and underpredict
SST the remainder of the periods by 1-2 °C.
Air-sea temperature difference shown in Figure 46 is predicted well by all WRF simulations
during the Sept. 19th, Sept. 28th, and Sept. 29th release periods. The cold air temperature bias
combines with the low SST analysis values to yield air-sea temperature differences with
relatively low error. During the Sept. 22nd period, the measured difference is positive (air
temperature is greater than SST) while the WRF differences are near zero and slightly negative.
The measured differences during the Sept. 25th period are +2 to +3 °C compared to the
WRF differences in the range of 0 to +2 °C. During the Sept. 26th period, the ERA.YSU and
ERA.MYJ simulations predict a positive difference, conflicting with the negative measured
difference. The ERA.MYJ case results in the least overall air-sea temperature difference bias of
0.0 °C and NARR.MYJ runs result in the least overall error of 0.6 °C.
The 1/L values shown in Figure 47 vary significantly. All WRF cases predict unstable conditions
(1/L < 0) during the first and the two last release periods, corresponding to the stability
conditions predicted using measured values. Most WRF cases predict unstable conditions (1/L
< 0) during the Sept. 22nd releases, while the measurements result in highly stable conditions.
3.4.2.4 PBL heights
The PBL height values from the experiment database are fixed at 500 m. This value is higher
than all WRF and MMIF PBL heights during the experiment as presented in Figure 48 and
Figure 49. The constant observed PBL height may not be representative of real conditions,
especially during the statically-stable periods occurring on Sept. 22nd and Sept. 25th (periods
with 1/L > 0). The YSU WRF simulations perform the best, with PBL height estimates in the
100-200 m height range. The MYJ and UW-PBL simulations generally predict PBL height values
below 30 m throughout the period. All simulations predict low PBL heights for the Sept. 25th
period. Referring to Table 14, all of the WRF cases result in significant negative bias in the
range of -375 to -484 m. The YSU runs result in the least bias and error overall. The MMIF
RCALT cases result in slightly improved estimates, with the negative PBL height bias reduced
only marginally. For example, the best performing NARR.YSU case has a WRF PBL height bias of
-378 m that is reduced to -375 m with MMIF recalculation.
3.4.2.5 Relative Humidity
As shown in Figure 50, all WRF simulations overpredicted RH except for a few periods where
measured RH spiked into the 90% range. There is no WRF simulation that produced better RH
estimates over all periods. The overall positive bias is lowest for the NARR.YSU case, 7%
compared to 8-10% for the other cases (percent signifies relative humidity percentage points, not a
percent change in value). The WRF cases result in an RH error ranging from 11% to 13%. This
bias and error is lower overall than the bias and error that resulted from the Cameron
simulations.
3.4.2.6 Wind Direction
The time series of predicted and observed wind-direction are shown in Figure 51. Most of the
WRF simulations predict the general wind direction within +/- 30° with some exceptions. Some
notable outliers are produced by the NARR.MYJ and NARR.YSU simulations. The NARR.UW
simulation produces the best overall estimates of wind direction on average, but there is no
WRF simulation that distinctly out-performs the other simulations. These results correspond well
to the regional METSTAT results, with average error within the complex criteria for wind
direction error.
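Wind direction statistics require the predicted-minus-observed difference to be wrapped into the ±180° range before averaging, so that a prediction of 350° against an observation of 10° counts as a 20° error rather than 340°. The sketch below illustrates that folding; it mirrors the usual METSTAT convention but is an assumption, not an excerpt of the METSTAT code.

    import numpy as np

    def wind_direction_bias_error(predicted_deg, observed_deg):
        """Bias and mean absolute error of wind direction, with the difference
        wrapped into the interval [-180, 180) degrees."""
        diff = np.asarray(predicted_deg, float) - np.asarray(observed_deg, float)
        diff = (diff + 180.0) % 360.0 - 180.0
        return diff.mean(), np.abs(diff).mean()

    # Made-up hourly values: the 350 vs 10 degree pair wraps to a -20 degree difference
    print(wind_direction_bias_error([350.0, 20.0, 200.0], [10.0, 40.0, 230.0]))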
Figure 43. Carpinteria wind speed time series.
Figure 44. Carpinteria air temperature time series.
Figure 45. Carpinteria sea surface temperature time series.
Figure 46. Carpinteria air-sea temperature difference time series.
Figure 47. Carpinteria inverse of L time series.
Figure 48. Carpinteria PBL height (RCALF) time series.
Figure 49. Carpinteria PBL height (RCALT) time series.
Figure 50. Carpinteria relative humidity time series.
Figure 51. Carpinteria wind direction time series.
3.4.3 Oresund
Figure 52 - Figure 60 contain the time series of meteorological variables for the Oresund
experiment. The Oresund experiment period was from May to June 1984, so no wintertime
tracer releases were studied. Only ERA-based WRF cases are analyzed since the NARR
reanalysis only covers North America. Bias and error values for each WRF simulation are listed
in Table 14.
3.4.3.1 Wind Speed
The observed wind speeds during the Oresund experiment, detailed in Figure 52, were higher in
magnitude than in the other experiments. The magnitude of surface wind speed over the Oresund
Strait was generally well predicted by all WRF simulations except for the UW-based simulations
and isolated hours from the other WRF options. Wind speeds were underestimated significantly by
the MYJ and YSU simulations for a few hours for the May 16th release period. The UW-based
simulations underpredict wind speed by 2-3 m/s for most of the experiment periods. The overall
negative wind speed bias at the measurement location of -1.3 and -1.1 m/s for the UW and
YSU cases, respectively, corresponds to the regional METSTAT results, which also indicated
negative wind speed bias of -0.7 m/s to -1.0 m/s. However, the regional METSTAT analysis
indicated MYJ produced very little bias, whereas bias at the measurement location was -0.9 m/s
for the MYJ case. The local measurement location bias is within the criteria goals for complex
terrain conditions but exceeds the simple terrain criteria. The YSU RMSE is within the simple
terrain criteria for wind speed. The MYJ case is within the complex terrain criteria for wind speed
RMSE with a value of 2.1 m/s and the UW case at 2.9 m/s RMSE exceeds the complex criteria.
3.4.3.2 Air Temperature
All WRF simulations underpredict air temperature by 1-4 °C for all release periods as shown in
Figure 53. The scatter between WRF cases tends to be low, with all solutions falling within a few
degrees of one another. WRF cases underpredict air temperature by 5-7 °C during the May 16th
release period and by about 1 °C during the June 12th and 13th release periods. During all other
periods, WRF underpredicts temperature by a few degrees. The cold bias at the measurement location
corresponds to the regional cold bias for May releases indicated in the METSTAT results.
Overall temperature bias, reported in Table 14, ranges from -2.0 °C for the UW and YSU cases
to -2.2 °C for the MYJ case. The MYJ bias exceeds the complex criteria. Temperature error is
within the simple terrain criteria for the UW and YSU cases and within the complex terrain criteria
for the MYJ case.
3.4.3.3 Sea Surface Temperature, Air-Sea Temperature Difference, and Monin-Obukhov Length
The time series of ERA SST is shown in Figure 54. Qualitatively, the SST data appear to be
accurate and are therefore not the source of the cold bias in the air temperature. The ERA SST is
within a degree of the measurements for all but the June 12th period, where the difference is
about 1-2 °C. The overall cold bias is -0.6 °C.
Air-sea temperature differences are biased low, as seen qualitatively in Figure 55. Overall
biases of -1.4 °C for the UW and YSU runs and -1.6 °C for the MYJ runs are listed in Table 14.
The air temperature cold bias is the main factor in air-sea temperature difference prediction
error. However, the sign of the WRF-simulated air-sea temperature difference is generally
correct for all periods, resulting in correct estimates of stable and unstable conditions, as
indicated by the 1/L plot, Figure 56. The values of 1/L from the WRF simulations match the sign
and general magnitude for most cases. The 1/L values are positive for the WRF and measurement
cases during most of the May and early June periods, ranging from +0.02 to 0.2 m^-1. The WRF
meteorology was too statically neutral during the May 18th and June 12th periods (values of 0.0
to -0.04 m^-1), where the measurements indicated more stable and unstable conditions (values of
-0.02 to -0.12 m^-1), respectively.
3.4.3.4 PBL Heights
WRF PBL height estimates varied greatly, but the YSU cases tended to perform the best overall
as seen in Figure 57. The UW and MYJ simulations tended to be too statically-stable during a
majority of the release periods. The MMIF recalculation presented in Figure 58 was the most
beneficial for the MYJ cases. The predicted PBL heights were closer in magnitude to the
measurements. Referring to Table 14, all WRF cases result in significant negative bias from
-680 to -730 m. The MMIF RCALT cases do not improve PBL height prediction, resulting in the
same magnitude of negative bias.
Given the high level of static-stability determined by air-sea temperature difference and other
factors, it is surprising the Oresund study PBL height values are high, exceeding 500 m in some
cases. It is possible these measurements are not truly representative of the coastal atmospheric
surface layer at the time of the experiment. However, the high wind speeds during the
experiment should tend to correlate with higher PBL heights.
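The expectation that stronger winds support deeper boundary layers can be made concrete with a rough neutral-limit scaling, h ≈ c·u*/|f|, where u* is the friction velocity and f the Coriolis parameter. The sketch below is only illustrative: the proportionality constant, roughness length, and reference height are assumed values, and this is not the diagnostic used by WRF or MMIF.

    import numpy as np

    KAPPA = 0.4  # von Karman constant

    def neutral_pbl_height(wind_speed, z_ref=10.0, z0=1.5e-4, lat_deg=56.0, c=0.2):
        """Rough neutral-limit mixing height h ~ c * u_star / |f|, with u_star taken
        from a neutral logarithmic wind profile at z_ref. The constant c, the
        roughness length z0, and the reference height are illustrative choices."""
        f = 2.0 * 7.292e-5 * np.sin(np.radians(lat_deg))    # Coriolis parameter (1/s)
        u_star = KAPPA * wind_speed / np.log(z_ref / z0)    # neutral log-law friction velocity
        return c * u_star / abs(f)

    for ws in (4.0, 8.0, 12.0):
        print(f"{ws:4.1f} m/s -> {neutral_pbl_height(ws):5.0f} m")

With these assumed values, winds of 8-12 m/s map to mixing heights on the order of 500-700 m, which is broadly consistent with the higher observed Oresund PBL heights noted above.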
3.4.3.5 Relative Humidity
The WRF simulations result in RH estimates comparable to the measurements as shown in
Figure 59. The WRF simulations overpredict RH by about 10% over the June 12th release
periods. None of the PBL schemes has better skill than the others for prediction of the RH
observations. RH is biased slightly positive in all WRF cases, by 2-6% (where percent is the
increment in relative humidity, not percent difference in value). RH error is 6% for all WRF
cases. The overall RH bias and error at the tracer study measurement location are lower for
Oresund than for the other tracer studies.
3.4.3.6 Wind Direction
The predicted and observed wind directions for the release periods during the Oresund
experiment are shown in Figure 60. For several of the periods, notably May 22nd through June
12th, a majority of the WRF simulations perform well, with average wind-direction predictions
within 10-20° of the measurements. The WRF predictions are scattered for the May 16th and
June 14th release periods. The UW PBL scheme performs the best overall with the lowest
average wind direction error. The WRF performance at the measurement location matches the
satisfactory performance indicated by METSTAT. Wind direction error and bias are within the
criteria range for complex and simple terrain conditions for all WRF simulations.
Figure 52. Oresund wind speed time series.
Figure 53. Oresund air temperature time series.
Figure 54. Oresund sea surface temperature time series.
Figure 55. Oresund air-sea temperature difference time series.
Figure 56. Oresund inverse of L time series.
Figure 57. Oresund PBL height (RCALF) time series.
Figure 58. Oresund PBL height (RCALT) time series.
Figure 59. Oresund relative humidity time series.
Figure 60. Oresund wind direction time series.
3.4.4 Pismo Beach
Figure 61 - Figure 68 contain the time series of meteorological variables for the Pismo Beach
experiment WRF simulations and measurements. The Pismo Beach experiment periods were
December 1981 and June 1982 and each figure contains separate time series for these periods.
Bias and error from each WRF case is included in Table 14.
3.4.4.1 Wind Speed
The Pismo experiment occurred during relatively high winds of 8-10 m/s measured in the winter
and 10-14 m/s measured in the summer. Wind speed was also highly variable as shown in
Figure 61. For example, during just the December 11th release period, wind speeds ranged from
4.5 to 8.6 m/s. The WRF cases overpredict wind speed most significantly, by +3 to +4 m/s,
during release hours on December 11th and December 15th.
Overall wind speed RMSE exceeded the complex terrain criteria for all WRF cases during the
winter periods. The ERA cases exceeded the complex terrain bias criteria of ±1.5 m/s, with overall
positive bias in the range of 1.6 - 1.8 m/s. The NARR cases were within the complex
terrain criteria but exceeded the simple terrain criteria.
During summer periods, all WRF cases exceeded complex criteria RMSE except for NARR.MYJ
and NARR.YSU cases. The NARR.MYJ and NARR.YSU wind speed bias was small, at
+0.1 m/s, within the simple terrain criteria of 0.5 m/s.
The regional METSTAT results, shown in Figure 27, indicated predicted regional wind speed
conditions were within the criteria for simple terrain conditions for the summer periods, but
biased high for winter periods. The NARR bias at the study measurement location corresponds
to the regional METSTAT results, but RMSE exceeds the simple terrain criteria. The positive
wind speed bias at the measurement location corresponds to the regional positive bias.
3.4.4.2 Air Temperature
There are only small differences among the WRF solutions for air temperature over both seasons
(generally within 1-2 °C of each other), as shown in Figure 62. The WRF cases tend to
underpredict air temperature by 1-2 °C during the summer period, with NARR.MYJ and
NARR.UW being the best performers at about 0.5 °C average error. During the winter
periods the WRF cases tend to overpredict air temperature by 1-2 °C. Overall bias and error are
within the complex criteria for the winter and summer periods. The NARR.MYJ and NARR.UW
cases are within simple terrain criteria for the summer periods.
The regional METSTAT results, shown in Figure 29, indicated WRF regional temperatures were
biased warm for the winter period, exceeding the criteria for complex terrain conditions. This
warm bias corresponds to the warm bias from each WRF case for winter periods. However,
NARR bias at the measurement location is within the complex criteria for winter periods,
whereas the regional bias exceeded the complex criteria. Regional and local measurement
location summer temperature bias and error were within the complex criteria for all WRF cases.
3.4.4.3 Sea Surface Temperature, Air-Sea Temperature Difference, and Monin-Obukhov Length
As seen in Figure 63, the NARR SSTs were about 0.5 °C higher than the ERA SSTs for both
the winter and summer periods. The ERA SSTs were all about 1 °C higher than the
measurements during the winter period. The summer WRF SSTs are about 1 °C greater than
the observations for the first set of release periods, but are highly overpredicted by 2 - 5 °C
during the last two release periods.
The combination of WRF underpredicted summer air temperatures and overpredicted summer
SSTs results in erroneous air-sea temperature differences during the summer period, shown in
Figure 64. The air-sea temperature difference of the WRF cases is biased cold, 2 to 4 °C cooler
than the measurements. The measurement-based air-sea temperature differences are positive,
resulting in stable conditions. The summer YSU simulations result in negative differences and
the MYJ and UW simulations result in differences near 0 °C. The WRF predictions are more
accurate in the winter periods, resulting in a correct sign and magnitude of air-sea temperature
difference for the majority of the simulations.
The effect of the air-sea temperature differences on stability is evident through examination of
the inverse of L results in Figure 65. Summer measurement-based 1/L values are high (many >
0.04), indicating strongly stable conditions. The WRF simulations result in near neutral stability
conditions with 1/L values near zero. The NARR.YSU simulations result in unstable conditions
for many of the summer release cases. The winter WRF predictions of 1/L are much better,
generally matching the magnitude and sign of the measurement-based 1/L values. The winter
1/L values are near zero for most of releases, indicating neutral static-stability conditions.
Overall winter WRF air-sea temperature difference bias, listed in Table 14, ranges from -0.1 to
-0.5 °C, with ERA.MYJ the least biased and NARR.YSU the most biased. Summer bias is more
significant, ranging from -1.9 to -3.1 °C, with ERA.UW the least biased and NARR.YSU the most biased.
3.4.4.4 PBL heights
Pismo experiment PBL height measurements are high at 600 - 800 m during the first three
summer release periods. These heights would seem to conflict with the stability conditions
indicated by the 1/L values for these periods. Despite the neutral and unstable conditions
predicted by the WRF simulations for these periods, the WRF PBL heights are much lower than
the measurements as exhibited in Figure 66. The WRF solutions are also highly scattered for
these periods. The last two release periods of summer were characterized by PBL heights at
about 100 m. The magnitude of the WRF simulation PBL heights generally matches the
measurements during these periods, with UW-PBL simulations performing the best. The
recalculated WRF PBL heights presented in Figure 67 are less variable and more closely match
the observations for all summer release periods.
Tracer study PBL heights during the winter periods range from 50 - 100 m except for the
December 11th release, where heights reach 600 - 900 m. The WRF PBL height predictions
vary over a large range. Overall, WRF PBL heights are biased negative ranging from about
-250 m to -350 m over the summer periods. The ERA.YSU has the least bias, with values of
-249 m and -248 m for the WRF and MMIF PBL heights, respectively. The winter NARR.MYJ PBL
height bias is only +41 m, but error is high compared to the other WRF cases, at 301 m
(compared to 180-200 m in the best performing cases). The NARR.YSU WRF and MMIF PBL
estimates have the least overall error and bias.
The variability of the recalculated PBL heights is much lower for the winter releases. However,
the MMIF cases do not result in less overall bias. The NARR.UW recalculated PBL heights
perform the best overall.
3.4.4.5 Relative Humidity
The WRF simulations overpredict RH by 10-20% for the summer release periods (actual RH
percent value, not percent increase). The WRF RH predictions for the winter release periods are
much better, generally matching the observed values for most of the release periods. Overall,
the WRF RH results were positively biased, with RH +3 to +4% in winter and +8 to +15% in the
summer. Figure 68 shows the measured and predicted RH values.
Figure 61. Pismo wind speed time series: winter (top) and summer (bottom) releases.
Figure 62. Pismo air temperature time series: winter (top) and summer (bottom) releases.
Figure 63. Pismo SST time series: winter (top) and summer (bottom) releases.
Figure 64. Pismo air-sea temperature difference time series: winter (top) and summer (bottom) releases.
Figure 65. Pismo inverse of L time series: winter (top) and summer (bottom) releases.
Figure 66. Pismo PBL height time series: winter (top) and summer (bottom) releases.
Figure 67. Pismo PBL height (MMIF re-calc.) time series: winter (top) and summer (bottom) releases.
Figure 68. Pismo relative humidity time series: winter (top) and summer (bottom) releases.
3.4.5 Ventura
Figure 69 - Figure 73 contain the time series of meteorological variables for the Ventura
experiment and WRF simulations. The Ventura experiment periods were September 1980 and
January 1981. Each figure contains separate time series for these periods. Bias and error
values from each WRF case are included in Table 14.
3.4.5.1 Wind Speed
Wind speed during the Ventura experiment shown in Figure 69 was measured to be in the 3-
7 m/s range during the summer campaign and 4-6 m/s during the winter campaign. Overall
WRF bias and error was comparatively high for the winter periods, exceeding the complex
terrain criteria for bias in all cases except for NARR.YSU, as seen in Table 14. All WRF cases
resulted in a negative wind speed bias of -0.7 to -2.0 m/s for the winter periods. The NARR.YSU
and ERA.MYJ RMSE was within the simple terrain criteria. The NARR.YSU case resulted in the
best performance overall.
For the summer periods, the NARR runs resulted in slight negative bias of -0.2 to -0.5 m/s,
falling within the simple terrain criteria. The ERA bias was higher, but still within the complex
terrain criteria. All summer period RMSE was within the simple terrain criteria. The NARR.UW
case produced the least magnitude of bias and error.
The regional METSTAT simulations indicated regional wind speed predictions were within the
criteria goals for complex terrain conditions for both summer and winter cases, shown in Figure
31. Winter wind speeds tended to be positively biased.
3.4.5.2 Air Temperature
The WRF solutions overpredict air temperature by 1-3 °C during the summer cases. During the
winter campaign, NARR.YSU overpredicts temperature by about 1 °C during the first release
period, while the other simulations underpredict by 1-2 °C. The predictions closely match
measurements during the second winter release period and underpredict air temperature by 1-2
°C during the last release period. Measured and predicted air temperatures are presented in
Figure 70.
Overall, the WRF cases resulted in a cold bias of -0.8 to -1.3 °C for all cases except for
NARR.YSU which resulted in a bias of 0.0 °C. WRF resulted in warm bias of 0.6 to 2.3 °C for
summer periods. The NARR.YSU again was the best performer, with a bias of +0.6 °C and low
error of 0.9 °C, within the simple terrain criteria.
3.4.5.3 Sea Surface Temperature, Air-Sea Temperature Difference, and Monin-Obukhov Length
The ERA SSTs are about 0.5 °C cooler than the measurements (except for the last release
period where they are about the same as the measurements) and the NARR SSTs are
0.5 - 2 °C warmer than the measurements during the summer campaign. The ERA and NARR
SSTs are 0.5 - 1 °C cooler than the measured SSTs during the winter period. SSTs are shown
in Figure 71.
The sign and magnitude of the WRF-simulated air-sea temperature differences for the winter
study closely match the measured differences. During the summer study period, the measured
air-sea temperature differences are negative for the first three release periods. The NARR-
based simulations result in the correct sign during these periods, but the ERA-based simulations
incorrectly predict differences in the range of 0 to +1 °C. During the last two release periods,
all WRF cases except for the NARR.YSU simulation predict the correct air-sea temperature
differences as measured (0 to +1 °C). Figure 72 shows the air-sea temperature differences.
The measurement-based 1/L values presented in Figure 73 are slightly negative during the
summer campaign, indicative of slightly unstable conditions. Most of the WRF simulations are
unable to accurately capture these conditions, with 1/L predictions indicative of neutral and
slightly stable conditions. The 1/L WRF predictions match the sign and magnitude of the
measurement based 1/L values well during the winter campaign. Positive 1/L values,
corresponding to stable conditions, were predicted during the winter campaign.
3.4.5.4 PBL Heights
Experiment PBL heights presented in Figure 74 fall within the 100 - 400 m range during the
summer campaign. These higher heights correspond with the slightly unstable atmospheric
stability conditions (stable conditions would result in lower heights). The WRF solutions tended
to underpredict the PBL heights with estimated values below 100 m. The NARR.YSU
simulations predict the highest PBL heights, but still much lower than the measured heights.
The recalculated WRF PBL heights in Figure 75 did not remove this bias, with predictions falling
well below the measured heights.
The winter PBL height measurements are in the range of 50 - 100 m and are better
characterized by the WRF simulations as shown in Figure 74. During the first release period the
WRF NARR.YSU option overpredicts the PBL height while the MYJ and UW cases underpredict
PBL height.
3.4.5.5 Relative Humidity
The WRF simulations overpredict RH during the September release periods by 10-20%. The
overall RH positive bias is 13 - 21%, with ERA.MYJ the best performing. The winter period RH
observations are scattered and some of the WRF simulations predict RH values that match
closely to the observations. Relative humidity observations and predictions are exhibited in
Figure 76.
Figure 69. Ventura wind speed time series: summer (top) and winter (bottom) releases.
Figure 70. Ventura air temperature time series: summer (top) and winter (bottom) releases.
Figure 71. Ventura SST time series: summer (top) and winter (bottom) releases.
Figure 72. Ventura air-sea temperature difference time series: summer (top) and winter (bottom) releases.
Figure 73. Ventura inverse of L time series: summer (top) and winter (bottom) releases.
Figure 74. Ventura PBL height time series: summer (top) and winter (bottom) releases.
Figure 75. Ventura PBL height (MMIF re-calc.) time series: summer (top) and winter (bottom) releases.
Figure 76. Ventura relative humidity time series: summer (top) and winter (bottom) releases.
3.5 Discussion
The analysis of WRF meteorology at the location of the experiment meteorological
measurements has revealed that WRF sometimes predicts conditions well, with low error and
bias, and in other cases WRF predictions are poor, with overall bias and error that exceeds
complex terrain criteria used within METSTAT analyses.
In general, the regional bias and error computed by METSTAT correlates with the bias and error
of the WRF predictions at the tracer study measurement location. This conclusion is supported
by these findings:
• Cameron: wind speed bias and error at the measurement location corresponds with
the same regional trends indicated by the METSTAT analysis. High positive wind
speed biases over winter and summer periods were found for both METSTAT and
buoy analysis results.
• Oresund: low wind speed and temperature bias at the measurement location
corresponded to the low bias indicated by METSTAT.
• Pismo: high wind speed bias during the winter and summer periods at the
measurement location corresponded to regional high bias indicated by METSTAT.
Warm temperature bias at the measurement location during the winter period
corresponded to warm bias indicated by METSTAT for the region.
• Ventura: warm temperature bias during the September releases was indicated for
the region as well as the measurement location.
However, not all of the findings support this conclusion. The cases where regional METSTAT
results did not correlate with predictions at the measurement location included:
• Cameron: regional METSTAT results indicated a cold temperature bias, while a warm
bias occurred at the measurement location.
• Carpinteria: a temperature error in the range of 1-2 °C and relatively low bias was
indicated by METSTAT for the region. This conflicts with the larger error and bias
specific to the measurement location with a cold bias of 2-3 °C.
• Pismo: a negative wind speed bias at the measurement location during summer
periods did not correspond to the regional positive wind speed bias.
• Ventura: wind speed was negatively biased at the measurement location in winter but
positively biased for the region.
Overall, the NARR.YSU was the better performer across all studies for wind speed. The YSU
PBL scheme was the better performer overall, resulting in less error and bias for most cases
except for the Cameron winter cases and the Pismo winter cases. For temperature, the NARR.YSU
and ERA.YSU cases were the better performers for most studies; the ERA.UW cases resulted in
less bias and error for Pismo, however. The YSU runs were also the better performers for PBL
height prediction for all cases except the Cameron summer cases.
Overall, the WRF predictions of wind speed and temperature are within the acceptable range for
complex terrain in terms of error and bias. However, small differences in temperature can result
114
-------
in an opposite sign of the air-sea temperature difference, because the differences between SST
and air temperature are relatively small. This has large implications for atmospheric stability,
since both the WRF and AERCOARE boundary layer schemes are sensitive to the sign of the
heat flux, especially during low winds. Air-sea temperature difference errors, evident for the
Carpinteria, Pismo, and Ventura periods, affect AERMOD concentration predictions because the
rate of dispersion is a function of the atmospheric stability characterized by the
Monin-Obukhov length (L).
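The following minimal sketch illustrates the sign sensitivity described above. It is illustrative
only; the values and the function are assumptions for illustration and are not taken from the
study datasets.

    # Illustrative only: a small SST or air-temperature error can flip the sign of the
    # air-sea temperature difference and therefore the implied sense of the surface
    # heat flux (stable versus unstable conditions).
    def stability_from_air_sea(t_air_c, sst_c):
        """Crude stability label from the air-sea temperature difference (deg C)."""
        dt = t_air_c - sst_c              # air minus sea
        if dt > 0.0:
            return "stable (downward heat flux)"
        if dt < 0.0:
            return "unstable (upward heat flux)"
        return "near neutral"

    print(stability_from_air_sea(15.4, 15.0))   # stable
    print(stability_from_air_sea(14.8, 15.0))   # a 0.6 deg C change flips the sign: unstable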
WRF/MMIF PBL height predictability was relatively independent of errors in the prediction of
unstable or stable conditions. For example, for Cameron, despite WRF producing overly
neutral stability compared to the highly stable conditions observed, PBL height predictions
compared well to the magnitude of the observed values. For Carpinteria, where stability
conditions were predicted quite well as indicated by the L results, WRF/MMIF underpredicted
PBL height substantially. For Oresund, Ventura, and Pismo, WRF did not predict conditions as
stable as at Cameron and Carpinteria, based on predictions of air-sea temperature difference and
L, but still underpredicted PBL height. It is possible the "measured" values of PBL height in the
studies are misleading and not based on robust measurements. Often the estimates were
obtained from methods that conflicted with each other. For stable conditions, some estimates
appear to correspond more with the height of the temperature inversion aloft than with the
height of the surface inversion.
It is evident that WRF predictions could be improved with more representative SST analysis data.
Cameron SSTs used in WRF differed by as much as 6 °C from the buoy measurements. The
impact of this was dampened by the fact that WRF air temperatures were generally close in
value to the SST (air-sea temperature differences were small), resulting in relatively neutral
stability conditions for all periods. In other cases, however, SST data biases and lack of
resolution produced an incorrect sign of the heat flux and, consequently, poor prediction of
atmospheric stability.
115
-------
[Blank]
116
-------
4 AERMOD MODELING AND RESULTS
The WRF meteorology for each of the field studies was extracted from the WRF model output
files using MMIF and applied to generate datasets both for AERCOARE processing and for
direct use in AERMOD. These two pathways were used to compare how the surface energy
fluxes predicted by AERCOARE and by WRF influence AERMOD performance.
AERMOD simulated tracer releases for each field study and the resulting predictions were
compared to observations using the same statistical procedures as were employed in previous
AERCOARE model evaluation studies (Richmond & Morris, 2012). Model performance statistics
were prepared for each field study and each of the four extraction methods listed in Table 13.
A variety of graphical and statistical techniques were used to evaluate the modeled
concentrations. The methods used were the same as applied in previous OCD, CALPUFF, and
AERCOARE evaluations (DiCristofaro & Hanna, 1989; ENVIRON Int. Corp., 2010; Emery, et al.,
2001; Hanna, et al., 1985; Richmond & Morris, 2012). Q-Q plots compared predicted versus
observed concentration probability distributions. Log-log scatter plots were used to evaluate the
temporal relationship between observed and predicted concentration. Further description of the
statistical measures used is provided below.
4.1 AERMOD Methodology
The methodology used to test model performance for the simple terrain (Cameron, Ventura, and
Pismo Beach) studies was adopted from the techniques used in previous evaluations of the
OCD model (DiCristofaro & Hanna, 1989), where the highest measured concentration from
each release case is used to construct the statistics. For each release case, the maximum
measured concentration and the downwind distance of its receptor are identified. AERMOD was
then used to predict the concentration at the same downwind distance as the observed maximum,
assuming the maximum occurred at the plume centerline. This approach is warranted in simple terrain cases
because the maximum concentration will generally occur along the trajectory of the plume
centerline captured by the arc of samplers used in the field studies.
This approach simplifies the study. A more robust approach would be to fully resolve the
modeling domain and compare all measured concentrations. However, it is advantageous to
conduct the study in a manner consistent with previous benchmark testing of the OCD model.
Note that this approach removes the influence of WRF wind-direction performance.
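A minimal sketch of this release-by-release pairing is given below. The data structures and the
numbers are hypothetical and serve only to show how the (observed, predicted) pairs that feed
the Section 4.2 statistics are formed.

    # Illustrative pairing for the simple-terrain protocol: the observed arc maximum for
    # each release is compared with the AERMOD centerline prediction at the same
    # downwind distance.  All values below are made up for illustration.
    observed_max = {1: 38.2, 2: 12.5, 3: 4.7}               # max measured concentration per release
    predicted_at_obs_distance = {1: 30.1, 2: 18.0, 3: 3.9}  # AERMOD centerline value at that distance

    pairs = [(observed_max[k], predicted_at_obs_distance[k]) for k in sorted(observed_max)]
    # pairs -> [(38.2, 30.1), (12.5, 18.0), (4.7, 3.9)]; these pairs feed the statistics of Section 4.2.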
For the complex terrain studies (Carpinteria and Oresund), the full set of receptors is used at
the positions and elevations documented in the archives. Wind direction is not fixed, so WRF
wind-direction performance is allowed to influence the AERMOD results. The maximum
measured concentration is identified for comparison to the maximum simulated concentration in
each release case.
AERMOD version 14134 was used for the study, using regulatory defaults except as noted. For
each field study, AERMOD was applied to simulate overwater tracer dispersion using
117
-------
meteorology from both field measurements and WRF. For the field measurement cases
AERMOD meteorological input files were developed using AERCOARE. AERMOD
meteorological input files for the WRF cases were developed using the four MMIF options
discussed previously (See Table 13).
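For reference, the four extraction-method labels used in the tables and figures of this section
are summarized below as a small lookup. The labels are the study's; the dictionary itself is only
a convenience sketch.

    # The four WRF-meteorology extraction methods referenced in Tables 16-20.
    EXTRACTION_METHODS = {
        "AERC.RCALT": "MMIF-extracted WRF data re-processed with AERCOARE; MMIF-diagnosed mixing heights",
        "AERC.RCALF": "MMIF-extracted WRF data re-processed with AERCOARE; WRF mixing heights",
        "MMIF.RCALT": "MMIF-extracted WRF data passed directly to AERMOD; MMIF-diagnosed mixing heights",
        "MMIF.RCALF": "MMIF-extracted WRF data passed directly to AERMOD; WRF mixing heights",
    }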
Scalar wind speeds were assumed for the observations and vector wind speeds were assumed
for the MMIF extracted wind speeds from WRF. It was necessary to specify the "Beta" flag
under the "MODELOPT" keyword in the Control Options pathway in AERMOD to use the MMIF
extracted data. The Beta flag is used to alert AERMOD that non-regulatory default
methodologies are being used.
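An illustrative Control Options fragment is shown below. It is a sketch only: the title and the
pollutant name are hypothetical, and the fragment is included simply to show where the BETA
option appears on the MODELOPT record; the surrounding records are shown schematically.

    CO STARTING
    CO TITLEONE  Example overwater tracer release driven by MMIF-extracted WRF meteorology
    CO MODELOPT  BETA CONC
    CO AVERTIME  1
    CO POLLUTID  SF6
    CO RUNORNOT  RUN
    CO FINISHED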
To be independent of the tracer emission rate, the AERMOD simulations were performed with a
unit emission rate of 1 g/s. The resulting AERMOD concentrations were divided by the tracer
release rate to provide normalized concentrations with units of µs/m³.
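As a unit check (a worked example, not a value from the study):

    \frac{\chi}{Q} = \frac{1\ \mu\mathrm{g\,m^{-3}}}{1\ \mathrm{g\,s^{-1}}}
                   = \frac{10^{-6}\ \mathrm{g\,m^{-3}}}{1\ \mathrm{g\,s^{-1}}}
                   = 10^{-6}\ \mathrm{s\,m^{-3}}
                   = 1\ \mu\mathrm{s\,m^{-3}}

so a modeled concentration of 25 µg/m³ from a 1 g/s unit-emission run corresponds to a
normalized concentration (χ/Q) of 25 µs/m³.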
For the measurement-based simulations, AERCOARE was applied using default options for
surface roughness, warm layer heating, and cool skin effects. Observed PBL heights were used
for the convective PBL height. Mechanical PBL heights were determined using the Venkatram
option (Venkatram, 1980). A minimum PBL height of 25 m and a minimum |L| of 5 m (USEPA,
2012) were applied. For these cases, AERMOD simulations were performed with ("Case 1") and
without ("Case 2") the measured standard deviation of wind direction (σθ), as discussed in
Richmond and Morris (2012). Such data were not available for the Oresund study, so results are
only available for the Case 2 set of options.
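A minimal sketch of how the 25 m mixing-height floor and the 5 m |L| floor could be applied to
an hourly record is shown below; the variable names are assumptions for illustration, and this is
not the AERCOARE code itself.

    import math

    MIN_PBL_M = 25.0   # minimum PBL height applied in this study
    MIN_ABS_L = 5.0    # minimum |Monin-Obukhov length| applied in this study

    def apply_floors(pbl_height_m, obukhov_length_m):
        """Clamp one hour of processed met data to the study minima."""
        z_i = max(pbl_height_m, MIN_PBL_M)
        if abs(obukhov_length_m) < MIN_ABS_L:
            obukhov_length_m = math.copysign(MIN_ABS_L, obukhov_length_m)
        return z_i, obukhov_length_m

    print(apply_floors(12.0, -2.3))   # -> (25.0, -5.0)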
Table 15 summarizes the details of the AERMOD methodology for each case. For each
AERMOD simulation, unit emission rates were used, building downwash from the release boats
was specified as in previous studies, and release heights varied by study. The total
number of AERMOD simulations for each site is also listed in Table 15. A total of 3,151
AERMOD simulations (676 + 702 + 525 + 806 + 442 across the five studies) were conducted to
account for the various combinations consisting of:
• Five tracer experiments, each tracer release case simulated separately,
• Six WRF configurations,
• Four WRF-meteorology extraction methods, and
• Measurement-based AERMOD simulations using Case 1 and Case 2 options.
118
-------
Table 15. AERMOD Configuration for Each Study.

Tracer Study | Type | AERMOD Receptors | Release Cases and Release Height | Downwash | Total AERMOD Simulations
Cameron | Simple Terrain | 1 per run | 26 (9 summer, 17 winter); release height of 13 m | Only on Feb. 24th (4 releases from a boat) (a) | 676
Carpinteria | Complex Terrain | 47 per run | 27 (all summer); release heights from 24.4 to 61 m | None | 702
Oresund | Complex Terrain | Min. of 9, max. of 30 (varied for each run) | 21 (all summer); release heights of 95 and 115 m | None | 525
Pismo Beach | Simple Terrain | 1 per run | 31 (16 summer, 15 winter); release heights of 13.1 and 13.6 m | Yes (boat) (a) | 806
Ventura | Simple Terrain | 1 per run | 17 (9 summer, 8 winter); release height of 8.1 m | Yes (boat) (a) | 442

a Boats were estimated to be 20 m wide and 7 m tall following the assumptions used in the OCD and CALPUFF
model evaluation studies.
4.2 Statistical Measures and Methods
The statistical measures and methods are similar to the techniques applied in the EPA
evaluation of AERMOD (USEPA, 2003). The tools used for the evaluation are described below:
• Quantile-quantile (Q-Q) plots were prepared to test the ability of the model predictions to
represent the frequency distribution of the observations. Q-Q plots are simple ranked
pairings of predicted and observed concentration, such that any rank of the predicted
concentration is plotted against the same ranking of the observed concentration. The
Q-Q plots can be inspected to examine whether the predictions are biased towards
underestimates at the important upper-end of the frequency distribution. Q-Q plots were
developed for each WRF scenario. Each plot contains measurement-based Case 1 and
Case 2 (Cases are described in Section 3.2) AERMOD results and the four
WRF-extraction AERMOD results.
• Log-log scatter diagrams were prepared to test the ability of the model to explain the
temporal variability in the observations. Each plot contains a plot of the measurement-
based Case 1 and Case 2 AERMOD results and the four WRF-extraction AERMOD
results (the four extraction methods described in Table 13).
• The robust highest concentration (RHC) has been used in most EPA model evaluation
studies to measure the model's ability to characterize the upper end of the frequency
distribution. Note that this can also be accomplished by visual inspection of the Q-Q
plots.
119
-------
    RHC = c_n + (\bar{c} - c_n)\,\ln\!\left(\frac{3n-1}{2}\right)                                  (7)

where c_n is the nth highest concentration and \bar{c} is the average of the (n-1) highest
concentrations. For the small sample size data sets in the current analysis, n was set to 10.

• Fractional factor of two (FF2): the ratio of the number of predictions within a factor of two
of the measurements to the total number of predictions.

• Geometric correlation coefficient: the standard correlation coefficient computed using the
natural log of the predictions and measurements, calculated as follows:

    r_g = \frac{\sum_i (\ln c_{o,i} - \overline{\ln c_o})(\ln c_{p,i} - \overline{\ln c_p})}
               {\sqrt{\sum_i (\ln c_{o,i} - \overline{\ln c_o})^2 \; \sum_i (\ln c_{p,i} - \overline{\ln c_p})^2}}   (8)

• Geometric mean: the nth root of the product of n numbers. The geometric mean provides a
method to evaluate a general expected value with dampened outlier influence. The geometric
mean is calculated as follows:

    \mu_g = \left(\prod_{i=1}^{n} c_i\right)^{1/n} = \exp\!\left(\frac{1}{n}\sum_{i=1}^{n} \ln c_i\right)            (9)

• Geometric mean bias (MG): a symmetric measure independent of the magnitude of the
concentration. The value of MG indicates whether a model is prone to underpredict (MG > 1) or
overpredict (MG < 1) and is therefore an indicator of model bias. MG is calculated as follows:

    MG = \exp\!\left(\overline{\ln c_o} - \overline{\ln c_p}\right)                                (10)

where c_o and c_p are the observed and predicted concentrations, respectively.

• Geometric mean variance (VG): a measure of the precision of the dataset. A perfect
model would result in VG = 1. VG is calculated as follows:

    VG = \exp\!\left[\overline{\left(\ln c_o - \ln c_p\right)^2}\right]                            (11)

• Total modeling score (TMS): To summarize the modeling results with one composite
score, a "model score" value was calculated for each AERMOD case. The formula for
this score is basically an average of five statistics: the FF2, geometric correlation
120
-------
coefficient, geometric mean, RHC, and VG with equal weighting. MG is not included in the
model score because the geometric mean term is an equivalent measure. The value ranges from 0 to 1,
with 1 being a "perfect" model:

    TMS = \frac{1}{5}\left[\mathrm{FF2} + r_g
          + \frac{\min(\mu_{g,p},\,\mu_{g,obs})}{\max(\mu_{g,p},\,\mu_{g,obs})}
          + \frac{\min(\mathrm{RHC}_p,\,\mathrm{RHC}_{obs})}{\max(\mathrm{RHC}_p,\,\mathrm{RHC}_{obs})}
          + \frac{1}{\mathrm{VG}_p}\right]                                                         (12)

where the subscript p indicates the statistic based on the model-predicted AERMOD results and
obs indicates the statistic based on the observed concentrations. A short computational sketch of
these measures is given below.
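The following is a minimal computational sketch of these measures, assuming paired hourly
lists of strictly positive observed and predicted concentrations. It is illustrative only and is not
the processing code used in the study; the ratio forms inside the TMS follow equation (12) as
reconstructed above. Q-Q pairs (first bullet above) are simply the sorted observed values paired
with the sorted predicted values.

    import math

    def rhc(values, n=10):
        """Robust highest concentration, eq. (7): c_n + (cbar - c_n) * ln((3n-1)/2)."""
        v = sorted(values, reverse=True)
        n = min(n, len(v))
        c_n = v[n - 1]                      # nth highest concentration
        cbar = sum(v[:n - 1]) / (n - 1)     # mean of the (n-1) highest concentrations
        return c_n + (cbar - c_n) * math.log((3.0 * n - 1.0) / 2.0)

    def evaluation_stats(obs, pred):
        """FF2, geometric correlation, geometric mean, MG, VG, RHC, and TMS (eqs. 7-12)."""
        lo = [math.log(c) for c in obs]
        lp = [math.log(c) for c in pred]
        mo, mp = sum(lo) / len(lo), sum(lp) / len(lp)
        ff2 = sum(1 for o, p in zip(obs, pred) if 0.5 <= p / o <= 2.0) / len(obs)
        cov = sum((a - mo) * (b - mp) for a, b in zip(lo, lp))
        r_g = cov / math.sqrt(sum((a - mo) ** 2 for a in lo) * sum((b - mp) ** 2 for b in lp))
        gm_o, gm_p = math.exp(mo), math.exp(mp)
        mg = math.exp(mo - mp)                                   # geometric mean bias
        vg = math.exp(sum((a - b) ** 2 for a, b in zip(lo, lp)) / len(lo))
        tms = (ff2 + r_g
               + min(gm_p, gm_o) / max(gm_p, gm_o)
               + min(rhc(pred), rhc(obs)) / max(rhc(pred), rhc(obs))
               + 1.0 / vg) / 5.0
        return {"FF2": ff2, "r_g": r_g, "geo_mean": gm_p, "MG": mg,
                "VG": vg, "RHC": rhc(pred), "TMS": tms}

    # Q-Q pairs: rank-ordered observed vs. rank-ordered predicted concentrations.
    # qq_pairs = list(zip(sorted(obs), sorted(pred)))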
4.3 AERMOD Modeling Results
The qualitative and quantitative results are presented and discussed in this section for each
dispersion study separately. Refer to Table 15 for details related to each study. The statistical
results are included in tables for each tracer study. The TMS scores of WRF-based AERMOD
simulations that met or exceeded the measurement-based TMS scores (the lowest TMS score
of Case 1 or Case 2 AERMOD simulations) are highlighted in red in these tables.
The scatter plots and Q-Q plots in each section compare AERMOD predicted concentrations to
observed concentrations (expressed in µs/m³). The underlining in the bulleted items is intended
to assist in understanding the legend for the scatter plots and Q-Q plots.
• Case 1 buoy meteorology using measurements of wind direction variance (σθ)
• Case 2 buoy meteorology using wind direction variance predicted by AERMOD
• WRF meteorology extracted by MMIF and processed with AERCOARE and MMIF-
diagnosed mixing height (AERC.RCALT)
• WRF meteorology extracted by MMIF and processed with AERCOARE and WRF mixing
height (AERC.RCALF)
• WRF meteorology extracted by MMIF and MMIF-diagnosed mixing height (MMIF.RCALT)
• WRF meteorology extracted by MMIF and WRF mixing height (MMIF.RCALF)
4.3.1 Cameron
The AERMOD modeling performance statistics for the Cameron simulations are listed in Table
16. Time series plots of AERMOD results and measured concentrations for Cameron are shown
in Figure 77. The concentration time series plots show factor-of-two lines (in light blue) for the
measurement time series. Scatter plots and Q-Q plots of AERMOD results from each WRF
simulation are shown in Figure 78 - Figure 89.
The modeling results indicate some AERMOD simulations driven by WRF data perform as well
as or better than the observation-based (obs) AERMOD simulations for Cameron. The best
performing simulation, based on the TMS, is the NARR.UW MMIF.RCALF (NARR reanalysis,
UW-PBL, direct extraction from WRF without AERCOARE re-processing, using WRF-derived
PBL heights) case. This case resulted in a higher FF2, better VG and MG, and better RHC than
121
-------
the observation-based simulations. The correlation coefficient is high, but not quite as high as
for the observation-driven Case 1 and Case 2 simulations. The ERA.UW AERC.RCALT
simulation also resulted in a higher TMS than the observation-based simulations. For this
simulation, VG and FF2 were better than the measurement-based simulations. Geometric
correlation was also equivalent to the measurement-based simulations, but geometric mean and
RHC were lower than produced by the measurement-based simulations.
The NARR.UW and ERA.UW scatter plots and Q-Q plots are shown in Figure 80 - Figure 81
and Figure 86 - Figure 87, respectively. The plots show that the best WRF-based simulations
overpredicted concentration in the middle range and underpredicted concentration over the
range of maximum concentration. Measurement-based simulations underpredicted
concentration over the low range and overpredicted concentrations over the middle range of
predictions. The measurement-based maximum concentration predictions coincide with the
maximum measured concentrations. In this case, the good WRF score performance was not
indicative of the performance at the upper end of the frequency distribution.
The ERA.UW MMIF.RCALF simulation (ERA reanalysis, direct WRF extraction, WRF PBL heights)
results in upper-range concentrations that coincide well with the measured concentrations, as
can be seen in Figure 86. This case is slightly more conservative than the measurement-based
simulations at the upper end of concentrations. The TMS for this simulation is 0.60, which is
slightly below the measurement-based TMS scores. This simulation exceeds the performance
of the measurement-based simulations in FF2 and VG, but its geometric mean and RHC are
lower than the measurements and measurement-based simulations. In this case, the geometric
mean and RHC do not provide good indicators of the model's performance at the higher end of
measured concentrations.
The best performing WRF simulation (NARR.UW MMIF.RCALF) was one of the cases with a
better prediction of PBL height for both the winter and summer cases. All WRF cases were too
neutral compared to the highly stable conditions indicated by the observations.
The worst performing WRF simulations were the NARR.MYJ simulations using WRF PBL
heights: TMS scores were very low at 0.16 and 0.19 for the MMIF and AERC cases,
respectively. These simulations highly overpredicted concentration during both the winter and
summer campaigns. A look at the Q-Q and scatter plots from these simulations reveals a trend
similar to that of the best performing WRF simulations: overprediction of concentration at the
middle range of predictions and some underprediction at the upper range.
The time series plots in Figure 77 illustrate some of these findings more clearly. The WRF PBL
heights are too low and lead to excessively high concentrations. The measurement-based
concentration predictions are lower than the measured values over the summer period, likely
due to excessively high PBL heights. MMIF-recalculated PBL heights result in better predictions.
Overprediction was caused mainly by underprediction of PBL height: these simulations resulted
in the lowest PBL heights for both the winter and summer campaigns. The MYJ simulation PBL
height predictions were improved after MMIF re-calculation, leading to TMS scores of the same
magnitude as the observation cases. Since all simulations had similar errors in air-sea
122
-------
temperature difference, it appears the combination of the local closure schemes used in MYJ
and erroneous air-sea temperature differences led to even greater overprediction of
atmospheric stability.
Based on TMS score, other WRF-based simulations performed as well as or better than the
observation-based AERMOD cases. ERA.UW had the best average TMS score across all
four extraction methods. All WRF cases likely would have performed better if more unstable
conditions had been predicted during the summer periods (near-neutral conditions were predicted).
The observation-based AERMOD simulations overpredicted concentration slightly during the
winter periods. The mechanical mixing heights (z_im) used were lower than the measured heights,
contributing to the overpredicted concentrations.
In conclusion, both ERA and NARR simulations using the YSU and UW PBL schemes perform fairly
well, with several extraction methods resulting in AERMOD results as good as or better than the
observation-based simulations. MYJ-based simulations performed the worst overall, but were
improved when PBL heights were recalculated by MMIF. The overly-neutral atmospheric
stabilities predicted by WRF did not result in excessively low concentration predictions for the
best performing simulations, possibly because the tracer study PBL heights were on
average of the same magnitude as or higher than the WRF/MMIF PBL heights.
Meteorology extracted directly from the ERA.UW WRF simulation, using WRF PBL heights,
resulted in AERMOD concentration predictions nearly as accurate as, and slightly more
conservative on the upper end than, those based on the measurements.
Most WRF simulations underpredicted concentration over the highest range of measured
concentrations. It is likely that more accurate SSTs would have resulted in more stable conditions
predicted by WRF, leading to AERMOD concentration predictions as conservative as the
measurement-based cases.
123
-------
Table 16. Cameron AERMOD Results Performance Statistics.

Case | Extraction Method | FF2 | Geo. Corr. Coeff. | Geo. Mean | RHC | VG | MG | Total Model Score (b)
Observed Concentrations (a) | - | 1.00 | 1.00 | 3.17 | 40.32 | 1.00 | 1.00 | 1.00
Case 1 Obs | obs. σθ | 0.42 | 0.83 | 4.04 | 49.93 | 3.02 | 0.78 | 0.64
Case 2 Obs | no obs. σθ | 0.42 | 0.81 | 4.17 | 51.26 | 3.60 | 0.76 | 0.61
NARR.MYJ | AERC.RCALT | 0.46 | 0.67 | 3.46 | 14.27 | 3.01 | 0.91 | 0.55
NARR.MYJ | AERC.RCALF | 0.12 | -0.06 | 21.84 | 52.58 | 688.05 | 0.15 | 0.19
NARR.MYJ | MMIF.RCALT | 0.46 | 0.67 | 3.15 | 13.24 | 2.94 | 1.01 | 0.56
NARR.MYJ | MMIF.RCALF | 0.12 | -0.04 | 21.44 | 54.75 | 648.03 | 0.15 | 0.19
NARR.UW | AERC.RCALT | 0.54 | 0.81 | 2.59 | 12.65 | 2.08 | 1.22 | 0.59
NARR.UW | AERC.RCALF | 0.46 | 0.71 | 4.22 | 21.66 | 2.97 | 0.75 | 0.56
NARR.UW | MMIF.RCALT | 0.46 | 0.82 | 1.96 | 7.87 | 2.56 | 1.61 | 0.50
NARR.UW | MMIF.RCALF | 0.54 | 0.74 | 3.60 | 46.14 | 2.72 | 0.88 | 0.68
NARR.YSU | AERC.RCALT | 0.58 | 0.83 | 2.03 | 14.32 | 2.23 | 1.56 | 0.57
NARR.YSU | AERC.RCALF | 0.54 | 0.83 | 1.78 | 14.97 | 2.55 | 1.78 | 0.54
NARR.YSU | MMIF.RCALT | 0.42 | 0.83 | 1.57 | 10.70 | 3.17 | 2.02 | 0.47
NARR.YSU | MMIF.RCALF | 0.38 | 0.83 | 1.46 | 12.31 | 3.40 | 2.17 | 0.46
ERA.MYJ | AERC.RCALT | 0.54 | 0.77 | 3.14 | 15.29 | 2.26 | 1.01 | 0.62
ERA.MYJ | AERC.RCALF | 0.12 | 0.24 | 10.89 | 38.52 | 101.01 | 0.29 | 0.32
ERA.MYJ | MMIF.RCALT | 0.62 | 0.79 | 3.21 | 13.29 | 2.20 | 0.99 | 0.63
ERA.MYJ | MMIF.RCALF | 0.19 | 0.36 | 13.17 | 42.68 | 105.02 | 0.24 | 0.35
ERA.UW | AERC.RCALT | 0.54 | 0.82 | 2.82 | 21.23 | 2.16 | 1.12 | 0.65
ERA.UW | AERC.RCALF | 0.31 | 0.73 | 4.05 | 43.53 | 3.97 | 0.78 | 0.60
ERA.UW | MMIF.RCALT | 0.62 | 0.90 | 1.90 | 12.30 | 1.92 | 1.67 | 0.59
ERA.UW | MMIF.RCALF | 0.54 | 0.78 | 2.58 | 18.95 | 2.48 | 1.23 | 0.60
ERA.YSU | AERC.RCALT | 0.50 | 0.85 | 2.19 | 18.69 | 2.10 | 1.45 | 0.60
ERA.YSU | AERC.RCALF | 0.50 | 0.82 | 2.03 | 16.88 | 2.50 | 1.56 | 0.56
ERA.YSU | MMIF.RCALT | 0.50 | 0.91 | 1.63 | 15.41 | 2.18 | 1.95 | 0.55
ERA.YSU | MMIF.RCALF | 0.42 | 0.91 | 1.49 | 11.40 | 2.51 | 2.13 | 0.50

a Values of 1.0 for FF2, Geo. Corr., VG, MG, and Model Score are included for the observations as a reminder of the
modeling performance minimum values.
b Total Model Scores indicating WRF-based AERMOD performance as good as or better than the measurement-based
AERMOD cases are highlighted in bold red.
124
-------
Figure 77. Cameron AERMOD Results Time Series: winter cases (top) and summer cases
(bottom).
125
-------
Figure 78. Cameron AERMOD Results Scatter Plot - NARR.MYJ.
Figure 79. Cameron AERMOD Results Q-Q Plot - NARR.MYJ.
126
-------
Figure 80. Cameron AERMOD Results Scatter Plot - NARR.UW.
Figure 81. Cameron AERMOD Results Q-Q Plot - NARR.UW.
127
-------
Figure 82. Cameron AERMOD Results Scatter Plot - NARR.YSU.
Figure 83. Cameron AERMOD Results Q-Q Plot - NARR.YSU.
128
-------
Figure 84. Cameron AERMOD Results Scatter Plot - ERA.MYJ.
Figure 85. Cameron AERMOD Results Q-Q Plot - ERA.MYJ.
129
-------
Figure 86. Cameron AERMOD Results Scatter Plot - ERA.UW.
Figure 87. Cameron AERMOD Results Q-Q Plot - ERA.UW.
130
-------
Figure 88. Cameron AERMOD Results Scatter Plot - ERA.YSU.
Figure 89. Cameron AERMOD Results Q-Q Plot - ERA.YSU.
131
-------
4.3.2 Carpinteria
The AERMOD modeling performance statistics for all Carpinteria simulations are listed in Table
17. Time series plots of AERMOD results and measured concentrations for Carpinteria are
shown in Figure 90. Scatter plots and Q-Q plots of AERMOD results from each WRF simulation
are shown in Figure 91 - Figure 102.
The observation-based AERMOD simulations resulted in a TMS of 0.64. Most of the WRF-
based AERMOD cases performed worse than the observation simulations processed through
AERCOARE, based on TMS score. The WRF case ERA.UW MMIF.RCALT (ERA reanalysis,
direct meteorology from WRF, MMIF-recalculated PBL heights) resulted in a TMS score near
0.55, performing nearly as well as the observation-driven cases. The ERA.YSU simulations
resulted in TMS scores about 0.50 (all extraction methods). These WRF-based simulations
resulted in geometric means and RHC scores slightly higher than the observation-based scores,
suggesting that the results are conservative. However, FF2 and correlation were lower than the
measurement-based simulations. The variance of the WRF-based simulations was very high
except for ERA.UW.
The Q-Q plots reveal the Case 1 measurement-based AERMOD simulation underpredicts
concentration a majority of the time. The result of this is a geometric mean that is well under the
observed mean. Case 2 measurement-based results are better - the mean and maximum
predicted concentrations are of the same magnitude as the measurements. The scatter plots
and Q-Q plots reveal that the ERA-based simulations perform well on the higher end of the
measured concentrations. The ERA WRF simulations tend to overpredict in the middle range
and underpredict at the lower range of measured concentrations. The error and variance in the
middle and lower ranges of the ERA simulations depress the TMS score. Overall Carpinteria
performance is better than the TMS suggests, however, considering that the WRF-based
simulations perform well across the critical upper end of the concentration distribution.
The measurement-based simulations use the diagnosed PBL height of 25 m during the stable
periods on Sept. 22nd and 25th instead of the measured 500 m; the best WRF cases match these
roughly 25 m PBL heights. WRF wind speeds and the sign of L also agree with the conditions
implied by the measurements during these periods. The highest measured concentrations
occurred during these periods, as shown in the time series in Figure 90. Both the
measurement-based and WRF-based AERMOD simulations perform well during these
highest-concentration periods.
WRF tends to both underpredict and overpredict wind speeds over different periods, but the
ERA.UW case is the only WRF option that accurately simulates the higher wind speeds on
Sept. 26th. The accurate wind speed predictions help assure better AERMOD performance for
the ERA.UW cases, improving accuracy mainly within the middle range of concentration
measurements.
The NARR.MYJ WRF simulations are the worst performers overall. These simulations produce
overly-unstable atmospheric conditions during many of the periods due to erroneous air-sea
temperature difference.
132
-------
In conclusion, though the TMS scores of all WRF simulations are lower than the scores of the
measurement-based simulations, the WRF ERA-based AERMOD simulations (and especially
the ERA.UW simulation) are shown to perform well at the higher end of measured
concentrations. The meteorological parameters predicted by the ERA-based WRF solutions
generally match the measured conditions (and rediagnosed PBL heights) during the periods of
maximum concentration. WRF-based TMS scores were lower than measurement-based TMS
scores due to high variance related to excessive overprediction and underprediction of
concentration at the middle and lower ranges of measured concentrations, respectively.
133
-------
Table 17. Carpinteria AERMOD Results Performance Statistics.

Case | Extraction Method | FF2 | Geo. Corr. Coeff. | Geo. Mean | RHC | VG | MG | Total Model Score
Observed Concentrations (a) | - | 1.0 | 1.0 | 20.12 | 141.39 | 1.00 | 1.00 | 1.00
Case 1 Obs | obs. σθ | 0.56 | 0.68 | 12.42 | 146.71 | 2.54 | 1.62 | 0.64
Case 2 Obs | no obs. σθ | 0.67 | 0.76 | 24.01 | 319.29 | 2.08 | 0.84 | 0.64
NARR.MYJ | AERC.RCALT | 0.26 | 0.14 | 1.98 | 274.60 | >1000.0 | 10.16 | 0.20
NARR.MYJ | AERC.RCALF | 0.22 | 0.33 | 1.56 | 264.45 | >1000.0 | 12.88 | 0.23
NARR.MYJ | MMIF.RCALT | 0.22 | 0.06 | 3.29 | 384.07 | >1000.0 | 6.11 | 0.16
NARR.MYJ | MMIF.RCALF | 0.19 | 0.40 | 1.77 | 510.40 | >1000.0 | 11.36 | 0.19
NARR.UW | AERC.RCALT | 0.35 | -- | -- | 102.55 | -- | -- | 0.21
NARR.UW | AERC.RCALF | 0.15 | -- | -- | 161.54 | -- | -- | 0.21
NARR.UW | MMIF.RCALT | 0.37 | 0.24 | 8.25 | 251.29 | >1000.0 | 2.44 | 0.32
NARR.UW | MMIF.RCALF | 0.30 | 0.44 | 5.42 | 239.07 | >1000.0 | 3.71 | 0.32
NARR.YSU | AERC.RCALT | 0.31 | -- | -- | 133.50 | -- | -- | 0.25
NARR.YSU | AERC.RCALF | 0.31 | -- | -- | 130.65 | -- | -- | 0.25
NARR.YSU | MMIF.RCALT | 0.41 | 0.33 | 6.29 | 166.54 | 538.17 | 3.20 | 0.38
NARR.YSU | MMIF.RCALF | 0.37 | 0.33 | 6.60 | 185.99 | 533.09 | 3.05 | 0.36
ERA.MYJ | AERC.RCALT | 0.44 | 0.33 | 8.35 | 119.56 | >1000.0 | 2.41 | 0.41
ERA.MYJ | AERC.RCALF | 0.30 | 0.38 | 10.30 | 121.90 | >1000.0 | 1.95 | 0.41
ERA.MYJ | MMIF.RCALT | 0.33 | 0.29 | 14.76 | 242.80 | >1000.0 | 1.36 | 0.39
ERA.MYJ | MMIF.RCALF | 0.26 | 0.39 | 10.89 | 191.07 | >1000.0 | 1.85 | 0.39
ERA.UW | AERC.RCALT | 0.41 | 0.43 | 13.28 | 221.68 | 259.31 | 1.52 | 0.43
ERA.UW | AERC.RCALF | 0.30 | 0.63 | 9.56 | 182.68 | >1000.0 | 2.10 | 0.44
ERA.UW | MMIF.RCALT | 0.52 | 0.57 | 23.85 | 224.85 | 4.78 | 0.84 | 0.55
ERA.UW | MMIF.RCALF | 0.26 | 0.62 | 7.25 | 211.54 | >1000.0 | 2.77 | 0.38
ERA.YSU | AERC.RCALT | 0.44 | 0.27 | 20.46 | 173.18 | 42.89 | 0.98 | 0.51
ERA.YSU | AERC.RCALF | 0.44 | 0.24 | 22.01 | 182.31 | 49.46 | 0.91 | 0.48
ERA.YSU | MMIF.RCALT | 0.56 | 0.36 | 20.26 | 208.27 | 33.69 | 0.99 | 0.52
ERA.YSU | MMIF.RCALF | 0.48 | 0.36 | 21.06 | 193.33 | 28.73 | 0.96 | 0.51

a Values of 1.0 for FF2, Geo. Corr., VG, MG, and Model Score are included for the observations as a reminder of the
modeling minimum values.
Note that geometric statistics posted as "--" are cases where the calculation cannot be made due to a concentration
of 0.0 in the record.
134
-------
Figure 90. Carpinteria AERMOD Results Time Series.
Figure 91. Carpinteria AERMOD Results Scatter Plot - NARR.MYJ.
135
-------
Figure 92. Carpinteria AERMOD Results Q-Q Plot - NARR.MYJ.
Figure 93. Carpinteria AERMOD Results Scatter Plot - NARR.UW.
136
-------
Figure 94. Carpinteria AERMOD Results Q-Q Plot - NARR.UW.
Figure 95. Carpinteria AERMOD Results Scatter Plot - NARR.YSU.
137
-------
Figure 96. Carpinteria AERMOD Results Q-Q Plot - NARR.YSU.
Figure 97. Carpinteria AERMOD Results Scatter Plot - ERA.MYJ.
138
-------
Figure 98. Carpinteria AERMOD Results Q-Q Plot - ERA.MYJ.
Figure 99. Carpinteria AERMOD Results Scatter Plot - ERA.UW.
139
-------
Figure 100. Carpinteria AERMOD Results Q-Q Plot - ERA.UW.
Figure 101. Carpinteria AERMOD Results Scatter Plot - ERA.YSU.
140
-------
Figure 102. Carpinteria AERMOD Results Q-Q Plot - ERA.YSU.
141
-------
4.3.3 Oresund
The AERMOD modeling performance statistics for the Oresund simulations are listed in Table
18. Time series plots of AERMOD results and measured concentrations for Oresund are shown
in Figure 103. Scatter plots and Q-Q plots of AERMOD results from each WRF simulation are
shown in Figure 104 - Figure 109.
The performance statistic scores for the observation-based AERMOD cases are relatively low.
FF2 is <0.2 and correlation is <0.2. However, Geo. mean, RHC, and MG results are similar in
magnitude to the scores produced with the observations. The measurement-based TMS score
is 0.43. The best performing WRF-based AERMOD simulations result in TMS scores similar in
magnitude to the measurement-based TMS score. The ERA.MYJ "MMIF" simulations (direct
extraction from WRF without AERCOARE processing) results in FF2 scores that are much
higher than the measurement based simulation. The other statistic scores are similar to the
measurement-based simulation except for VG. The high WRF-based VG scores are a result of
the wide variance in simulated concentrations. The scatter indicates that there is little skill in
WRF-AERMOD predictability.
The ERA.UW Q-Q plot (Figure 107) illustrates the significant underpredictions of concentration
resulting from the AERCOARE-based simulations. The Q-Q plot reveals that maximum
predicted concentrations from the WRF-MMIF simulations match well to the highest observed
concentrations. However, the scatter plot reveals that the matching highest concentrations may
be fortuitous because the predicted maxes do not occur during the same release periods as the
observed maxes. The scatter plots also illustrate the significant scatter of concentration results
produced by all of the simulations. The high scatter is also evident on the concentration time
series plot.
The ERA.UW simulation wind speed predictions are the most erroneous and likely a large factor
in the poor performance of the AERMOD simulations. Wind speeds are underpredicted in most
cases. The ERA.MYJ simulations predicted wind speed the best, contributing to the better
AERMOD performance. All WRF simulations predicted similar air-sea temperature differences
and stability conditions to those observed (stable conditions except for June 12th). However, the
WRF PBL heights (both extracted and recalculated) were in general much lower than the
observed values, contributing to the concentration overpredictions. WRF also predicted much
lower wind speeds than observed; these correspond with the lower PBL height predictions.
Given the transport distances of this case (20-30 km), this tracer study may be the most
susceptible to meteorological heterogeneity of the five studies evaluated. AERMOD, being a
"straight-line" Gaussian model, does not account for the heterogeneity of meteorological
conditions between the source and the receptor. This is especially a factor when transport
occurs over water-land boundaries, where PBL conditions can change dramatically over short
distances. The high level of variance evident in the concentration scatter plots may be a result
of meteorological heterogeneity.
In conclusion, the measurement-based AERMOD simulations result in fairly poor performance
scores, demonstrating little skill in predicting the measured concentrations. The
142
-------
ERA.MYJ-based AERMOD simulations that do not utilize AERCOARE perform about as well as
the measurement-based simulation. Underpredicted wind speed is the cause of poorer
performance in the WRF simulations that perform the worst. However, neither the observation-
driven nor the WRF-driven AERMOD simulations explain the temporal variability of the observed
concentrations. AERMOD does not consider the complex meteorology or shoreline fumigation
observed in the Oresund study.
Table 18. Oresund AERMOD Performance Statistics.

Case | Extraction Method | FF2 | Geo. Corr. Coeff. | Geo. Mean | RHC | VG | MG | Total Model Score
Observed Concentrations (a) | - | 1.00 | 1.00 | 0.04 | 0.28 | 1.00 | 1.00 | 1.00
Case 2 Obs | no obs. σθ | 0.19 | 0.18 | 0.04 | 0.20 | 9.10 | 1.05 | 0.43
ERA.MYJ | AERC.RCALT | 0.11 | -- | -- | 0.10 | -- | -- | 0.09
ERA.MYJ | AERC.RCALF | 0.16 | -- | -- | 0.09 | -- | -- | 0.10
ERA.MYJ | MMIF.RCALT | 0.43 | 0.12 | 0.03 | 0.25 | 341.65 | 1.73 | 0.41
ERA.MYJ | MMIF.RCALF | 0.33 | 0.11 | 0.02 | 0.22 | 478.76 | 2.08 | 0.35
ERA.UW | AERC.RCALT | 0.14 | -0.36 | 0.00 | 0.21 | >1000 | 72.98 | 0.11
ERA.UW | AERC.RCALF | 0.14 | -0.17 | 0.00 | 0.14 | >1000 | 68.23 | 0.10
ERA.UW | MMIF.RCALT | 0.33 | 0.18 | 0.01 | 0.37 | >1000 | 4.34 | 0.25
ERA.UW | MMIF.RCALF | 0.24 | 0.42 | 0.01 | 0.25 | >1000 | 4.82 | 0.36
ERA.YSU | AERC.RCALT | 0.00 | -- | -- | 0.12 | -- | -- | 0.09
ERA.YSU | AERC.RCALF | 0.05 | -- | -- | 0.15 | -- | -- | 0.12
ERA.YSU | MMIF.RCALT | 0.48 | -0.17 | 0.04 | 0.39 | 41.94 | 1.03 | 0.20
ERA.YSU | MMIF.RCALF | 0.19 | -0.15 | 0.09 | 0.36 | 9.91 | 0.50 | 0.28

a Values of 1.0 for FF2, Geo. Corr., VG, MG, and Model Score are included for the observations as a reminder of the
statistical minimum values.
143
-------
Figure 103. Oresund AERMOD Results Time series.
144
-------
Figure 104. Oresund AERMOD Results Scatter Plot - ERA.MYJ.
Figure 105. Oresund AERMOD Results Q-Q Plot - ERA.MYJ.
145
-------
Figure 106. Oresund AERMOD Results Scatter Plot - ERA.UW.
Figure 107. Oresund AERMOD Results Q-Q Plot - ERA.UW.
146
-------
Figure 108. Oresund AERMOD Results Scatter Plot - ERA.YSU.
Figure 109. Oresund AERMOD Results Q-Q Plot - ERA.YSU.
147
-------
4.3.4 Pismo
The AERMOD modeling performance statistics for the Pismo AERMOD simulations are listed in
Table 19. Time series plots of AERMOD predictions and measured concentrations for Pismo
are shown in Figure 110. Scatter plots and Q-Q plots of AERMOD results from each WRF
simulation are shown in Figure 111 - Figure 122.
The measurement-based cases result in relatively high variance and low correlations and FF2
scores. The result is a relatively low TMS of 0.40. Measurement-based AERMOD simulations
overpredicted concentrations significantly during the summer campaign and underpredicted
concentration for the majority of releases during the winter campaign. Under the stable summer
conditions, the measurement-based cases use calculated mechanical mixing height (z_im)
estimates that are much lower than the observed heights (about 50 m vs. >500 m).
The stable conditions and the recalculated lower PBL heights are supported by the observed
air-sea temperature difference. The stable conditions contribute to the high concentrations
predicted by AERMOD. It is possible that the local stability conditions determined by the
observations are not representative of the region as a whole. Although AERMOD tends to be a
conservative model, the degree of overprediction in this case suggests that PBL stability was
not as severe as indicated by the measurements.
During the winter campaign the air-sea temperature difference results in negative L and
unstable conditions. The relatively high PBL heights contribute to low concentration predictions.
In this case, it is again possible the local gradients and unstable conditions were not
representative of the region as a whole. WRF predicts more neutral conditions and lower PBL
heights.
The winter time series plot reveals several hours on Dec. 11th where WRF highly overpredicts
wind speed. The WRF-based AERMOD simulations underpredict concentration during these
periods but not as severely as the measurement-based simulations. Results during this period
suggest that AERMOD accuracy is more dependent on PBL height accuracy than on wind speed
accuracy.
Most of the WRF-based AERMOD simulations result in performance scores as good as or better
than the observation-based AERMOD cases. WRF-simulated air temperatures were all slightly
overpredicted in the summer, resulting in air-sea temperature differences near 0 °C. The WRF
air-sea temperature differences supported neutral to slightly unstable atmospheric stability
conditions. The resulting PBL heights (both extracted and recalculated) are closer to the
measurements. Better meteorological predictions result in better AERMOD results than those
obtained using measurement-based meteorology.
The ERA.YSU cases perform the best, with the AERC.RCALF extraction method (AERCOARE
processing, using WRF PBL heights) resulting in a TMS of 0.75. All of the ERA.YSU simulations
result in performance scores that exceed the measurement-based simulations.
The NARR.MYJ WRF simulations achieve the lowest performance scores. These simulations
highly overpredict PBL height and underpredict wind speed during some periods in the winter.
148
-------
Large underpredictions and overpredictions during different periods result in overall poor
performance and high variance. The lower performance scores of the measurement-based
cases are caused mainly by the series of concentration overpredictions occurring during the
June 21st and June 22nd releases. The time series plot (Figure 110) illustrates the high degree of
concentration overprediction during this period by the measurement-based AERMOD
simulations and the MYJ-based WRF AERMOD simulations (specifically, the simulations that
used WRF PBL heights). The measurement-based simulations and the erroneous WRF cases
calculate concentrations of 10-40 µs/m³ over this period, compared to the 0.1-0.5 µs/m³ range of
measured values. In these cases, the measurements result in positive air-sea temperature
differences that support stable conditions. The YSU-based WRF simulations support unstable
conditions, resulting in AERMOD underpredictions. WRF PBL heights from the MYJ simulations
and the measurement-based PBL heights are excessively low and are the main cause of the
AERMOD overpredictions.
The relatively good performance by the WRF-based simulations is not as visually evident on the
scatter plot and Q-Q plots. The relatively low geometric correlation scores reflect the level of
scatter seen in the plots. The best-performing ERA.YSU AERC.RCALF and AERC.RCALT
simulations fall within the factor-of-2 lines for the upper range of concentrations, as seen on the
plots. The scatter plots and Q-Q plots illustrate the high degree of measurement-based
AERMOD overprediction. The maximum concentrations from the ERA.YSU AERC.RCALF and
MMIF.RCALF compare well to the measurements.
Overall, the results of the Pismo study comparison demonstrate that in some cases the smoothed
meteorological fields produced by WRF may be more advantageous for AERMOD modeling
than the pinpoint meteorological records available at a buoy. The regional meteorology in this
case was likely less statically stable than suggested by the measurements, based on the PBL
height measurements and the lower predictability of the measurement-based AERMOD
simulations.
-------
Table 19. Pismo AERMOD Performance Statistics.

Case | Extraction Method | FF2 | Geo. Corr. Coeff. | Geo. Mean | RHC | VG | MG | Total Model Score (b)
Observed Concentrations (a) | - | 1.00 | 1.00 | 3.46 | 8.96 | 1.00 | 1.00 | 1.00
Case 1 Obs | obs. σθ | 0.42 | 0.34 | 3.14 | 34.95 | 10.50 | 1.10 | 0.40
Case 2 Obs | no obs. σθ | 0.26 | 0.04 | 5.86 | 54.70 | 13.06 | 0.59 | 0.23
NARR.MYJ | AERC.RCALT | 0.42 | 0.33 | 2.41 | 27.73 | 2.45 | 1.44 | 0.44
NARR.MYJ | AERC.RCALF | 0.13 | -0.01 | 2.78 | 36.69 | 17.99 | 1.25 | 0.24
NARR.MYJ | MMIF.RCALT | 0.42 | 0.31 | 2.31 | 23.61 | 2.46 | 1.49 | 0.44
NARR.MYJ | MMIF.RCALF | 0.16 | -0.05 | 2.70 | 31.98 | 23.76 | 1.28 | 0.24
NARR.UW | AERC.RCALT | 0.52 | 0.31 | 2.70 | 27.20 | 2.28 | 1.28 | 0.48
NARR.UW | AERC.RCALF | 0.48 | 0.18 | 3.74 | 30.78 | 2.58 | 0.93 | 0.45
NARR.UW | MMIF.RCALT | 0.55 | 0.32 | 2.46 | 24.53 | 2.34 | 1.41 | 0.47
NARR.UW | MMIF.RCALF | 0.45 | 0.15 | 3.48 | 36.20 | 3.10 | 0.99 | 0.43
NARR.YSU | AERC.RCALT | 0.45 | 0.50 | 2.07 | 29.22 | 2.87 | 1.67 | 0.44
NARR.YSU | AERC.RCALF | 0.48 | 0.50 | 2.44 | 33.83 | 2.61 | 1.42 | 0.47
NARR.YSU | MMIF.RCALT | 0.35 | 0.37 | 1.60 | 22.74 | 4.08 | 2.16 | 0.37
NARR.YSU | MMIF.RCALF | 0.42 | 0.45 | 1.91 | 27.64 | 3.14 | 1.82 | 0.41
ERA.MYJ | AERC.RCALT | 0.52 | 0.20 | 3.73 | 25.36 | 2.02 | 0.93 | 0.50
ERA.MYJ | AERC.RCALF | 0.26 | 0.08 | 5.36 | 42.73 | 16.32 | 0.65 | 0.25
ERA.MYJ | MMIF.RCALT | 0.45 | 0.24 | 3.60 | 26.30 | 2.08 | 0.96 | 0.49
ERA.MYJ | MMIF.RCALF | 0.23 | 0.10 | 5.43 | 43.13 | 19.33 | 0.64 | 0.24
ERA.UW | AERC.RCALT | 0.45 | 0.18 | 3.89 | 22.77 | 2.01 | 0.89 | 0.48
ERA.UW | AERC.RCALF | 0.48 | 0.18 | 5.96 | 33.47 | 3.10 | 0.58 | 0.37
ERA.UW | MMIF.RCALT | 0.42 | 0.26 | 2.99 | 16.04 | 1.82 | 1.16 | 0.53
ERA.UW | MMIF.RCALF | 0.52 | 0.17 | 4.76 | 44.36 | 3.08 | 0.73 | 0.39
ERA.YSU | AERC.RCALT | 0.65 | 0.35 | 2.64 | 7.13 | 1.49 | 1.31 | 0.65
ERA.YSU | AERC.RCALF | 0.81 | 0.37 | 3.22 | 8.17 | 1.36 | 1.07 | 0.75
ERA.YSU | MMIF.RCALT | 0.55 | 0.22 | 2.14 | 6.40 | 1.95 | 1.62 | 0.52
ERA.YSU | MMIF.RCALF | 0.71 | 0.23 | 2.49 | 5.94 | 1.61 | 1.39 | 0.59

a Values of 1.0 for FF2, Geo. Corr., VG, MG, and Model Score are included for the observations as a reminder of the
modeling statistical minimum values.
b Total Model Scores indicating WRF-based AERMOD performance as good as or better than the measurement-based
AERMOD cases are highlighted in bold red.
150
-------
Figure 110. Pismo AERMOD Results Time Series: winter cases (top) and summer cases
(bottom).
151
-------
Figure 111. Pismo AERMOD Results Scatter Plot - NARR.MYJ.
Figure 112. Pismo AERMOD Results Q-Q Plot - NARR.MYJ.
152
-------
Figure 113. Pismo AERMOD Results Scatter Plot - NARR.UW.
Figure 114. Pismo AERMOD Results Q-Q Plot - NARR.UW.
153
-------
Figure 115. Pismo AERMOD Results Scatter Plot - NARR.YSU.
Figure 116. Pismo AERMOD Results Q-Q Plot - NARR.YSU.
154
-------
Figure 117. Pismo AERMOD Results Scatter Plot - ERA.MYJ.
Figure 118. Pismo AERMOD Results Q-Q Plot - ERA.MYJ.
155
-------
Figure 119. Pismo AERMOD Results Scatter Plot - ERA.UW.
Figure 120. Pismo AERMOD Results Q-Q Plot - ERA.UW.
156
-------
Figure 121. Pismo AERMOD Results Scatter Plot - ERA.YSU.
Figure 122. Pismo AERMOD Results Q-Q Plot - ERA.YSU.
157
-------
4.3.5 Ventura
The AERMOD modeling performance statistics for Ventura AERMOD simulations are listed in
Table 20. Time series plots of AERMOD results and measured concentrations for Ventura are
shown in Figure 123. Scatter plots and Q-Q plots of AERMOD results from each WRF
simulation are shown in Figure 124 - Figure 135.
The Case 1 measurement-based AERMOD simulations performed relatively well, with a TMS of
0.72 supported by a strong FF2 of 0.76, a correlation of 0.75, and geometric mean and RHC
predictions near the measured values. The good performance is visually evident in the time
series plots for both the winter and summer campaigns. Variance was also low, with a VG of
1.69. Most of the summer cases occurred during unstable conditions, consistent with the
negative air-sea temperature difference. The observation-based simulations used observed PBL
heights during unstable conditions. During the winter cases, the observation-based simulations
slightly overpredict concentration, perhaps because of the low PBL heights predicted under stable conditions.
The Case 2 measurement-based AERMOD simulations tend to overpredict concentration in
more instances than the Case 1 simulations, perhaps due to the absence of sigma theta
observations.
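For reference, several of the performance measures quoted throughout this section (FF2, MG, VG, the log-space correlation, and RHC) can be computed as in the short sketch below. This is a hedged illustration using standard definitions from the model-evaluation literature; the report's exact implementation, including the composite Total Model Score weighting (not reproduced here) and the default number of values used in the RHC fit, may differ.

```python
import numpy as np

def evaluation_metrics(obs, pred):
    """Common dispersion-evaluation statistics (standard definitions, assumed here).

    obs, pred : paired arrays of observed and predicted concentrations (> 0).
    Returns FF2 (fraction within a factor of 2), MG (geometric mean bias),
    VG (geometric variance), and the correlation of the log concentrations.
    """
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    ratio = pred / obs
    ff2 = np.mean((ratio >= 0.5) & (ratio <= 2.0))        # fraction within a factor of 2
    ln_diff = np.log(obs) - np.log(pred)
    mg = np.exp(np.mean(ln_diff))                          # < 1 indicates overprediction
    vg = np.exp(np.mean(ln_diff ** 2))                     # 1.0 is a perfect score
    corr = np.corrcoef(np.log(obs), np.log(pred))[0, 1]    # geometric (log-space) correlation
    return {"FF2": ff2, "MG": mg, "VG": vg, "GeoCorr": corr}

def robust_highest_concentration(values, r=26):
    """Robust highest concentration using the usual exponential tail fit; r=26 is an
    assumed default and may not match the report's choice."""
    v = np.sort(np.asarray(values, float))[::-1]
    r = min(r, v.size)
    top_mean = v[: r - 1].mean()                           # mean of the r-1 largest values
    return v[r - 1] + (top_mean - v[r - 1]) * np.log((3 * r - 1) / 2.0)
```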
All WRF simulation performance scores are much lower than the measurement-based scores.
The concentration time series plot shows that the WRF-based AERMOD simulations strongly
overpredict concentration, by more than an order of magnitude in some cases, for both the
summer and winter campaign periods. The ERA.YSU-based AERMOD simulations perform
slightly better than the other simulations, but no TMS score exceeds 0.2. All FF2 scores are low
and VG scores are high. The geometric mean and RHC scores reflect the high level of
overprediction by AERMOD simulations using WRF meteorology. These trends are evident on
the scatter plots and Q-Q plots for all simulations; the concentrations exceed the upper FF2
lines. The Case 2 measurement-based simulations also strongly overpredict concentration in
some cases.
All WRF cases slightly overpredicted air temperature. Although the SST analysis data were within
0.5-1°C of the measured values, the air-sea temperature difference was opposite in sign
to the observed conditions. The ERA-based simulations produce slightly stable conditions for
the summer cases, and the NARR-based simulations produce neutral conditions. The poor
characterization of the observed air-sea temperature differences resulted in excessively stable
conditions and lower PBL heights. This in turn results in highly overpredicted concentrations by
all WRF-based AERMOD simulations during the summer study. The WRF simulations also
underpredicted wind speed during the winter periods and a few of the summer periods.
The observations supported unstable atmospheric conditions over all of the summer campaign
periods and a few of the winter campaign periods. During the unstable periods, the
measurement-based AERMOD simulations used the tracer study PBL heights, measured within
the range of 100-400 m. The WRF simulations produce minimum PBL heights near 25 m during
these same periods.
158
-------
Overall, the Ventura case study demonstrates that small differences in SST and air temperature
can have a dramatic influence on the prediction of atmospheric stability and PBL height.
Although the WRF SST and air temperature values are within 0.5-2.0°C of the measurements
for most of the periods, the sign of the air-sea temperature difference can be wrong, leading to
large differences in PBL height, wind speed, and L. For this study, these differences resulted in
poor predictions of concentration by AERMOD.
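To make this sensitivity concrete, the hedged sketch below (not part of the study's tool chain; all variable names are illustrative) flags hours where the modeled and observed air-sea temperature differences disagree in sign despite small absolute errors, which is the failure mode described above for Ventura.

```python
import numpy as np

def stability_sign_errors(t_air_wrf, sst_wrf, t_air_obs, sst_obs, max_error_c=2.0):
    """Return a boolean mask of hours where WRF reverses the sign of the
    air-sea temperature difference (and hence the stability regime) even
    though its absolute temperature errors are small.

    Inputs are paired hourly arrays in consistent units (e.g., deg C).
    The 2 C error threshold is an assumption for illustration.
    """
    dt_wrf = np.asarray(t_air_wrf, float) - np.asarray(sst_wrf, float)   # > 0 ~ stable
    dt_obs = np.asarray(t_air_obs, float) - np.asarray(sst_obs, float)   # < 0 ~ unstable
    sign_flip = np.sign(dt_wrf) != np.sign(dt_obs)
    small_t_error = np.abs(np.asarray(t_air_wrf, float) - np.asarray(t_air_obs, float)) < max_error_c
    small_sst_error = np.abs(np.asarray(sst_wrf, float) - np.asarray(sst_obs, float)) < max_error_c
    return sign_flip & small_t_error & small_sst_error
```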
159
-------
Table 20. Ventura AERMOD Performance Statistics.

| Case | Extraction Method | FF2 | Geo. Corr. Coeff. | Geo. Mean | RHC | VG | MG | Total Model Score |
|---|---|---|---|---|---|---|---|---|
| Observed Concentrations^a | - | 1.00 | 1.00 | 1.20 | 4.26 | 1.00 | 1.00 | 1.00 |
| Case 1 Obs | obs. σθ | 0.76 | 0.75 | 1.59 | 5.62 | 1.69 | 0.75 | 0.72 |
| Case 2 Obs | no obs. σθ | 0.59 | 0.62 | 2.39 | 19.74 | 5.26 | 0.50 | 0.42 |
| NARR.MYJ | AERC.RCALT | 0.00 | 0.39 | 16.70 | 40.02 | 1979.25 | 0.07 | 0.11 |
| NARR.MYJ | AERC.RCALF | 0.00 | 0.42 | 20.95 | 48.18 | 5920.54 | 0.06 | 0.11 |
| NARR.MYJ | MMIF.RCALT | 0.00 | 0.37 | 17.39 | 37.74 | 2466.22 | 0.07 | 0.11 |
| NARR.MYJ | MMIF.RCALF | 0.00 | 0.46 | 22.52 | 46.77 | 8716.45 | 0.05 | 0.12 |
| NARR.UW | AERC.RCALT | 0.00 | 0.42 | 15.26 | 37.31 | 1322.41 | 0.08 | 0.12 |
| NARR.UW | AERC.RCALF | 0.00 | 0.53 | 17.45 | 45.72 | 2245.85 | 0.07 | 0.14 |
| NARR.UW | MMIF.RCALT | 0.24 | 0.17 | 11.10 | 49.57 | 513.92 | 0.11 | 0.12 |
| NARR.UW | MMIF.RCALF | 0.00 | 0.30 | 15.06 | 75.63 | 1781.98 | 0.08 | 0.09 |
| NARR.YSU | AERC.RCALT | 0.00 | 0.45 | 15.85 | 53.35 | 1483.71 | 0.08 | 0.12 |
| NARR.YSU | AERC.RCALF | 0.00 | 0.55 | 16.29 | 54.59 | 1533.42 | 0.07 | 0.14 |
| NARR.YSU | MMIF.RCALT | 0.18 | 0.21 | 11.59 | 62.68 | 683.58 | 0.10 | 0.11 |
| NARR.YSU | MMIF.RCALF | 0.18 | 0.27 | 11.09 | 57.56 | 458.43 | 0.11 | 0.13 |
| ERA.MYJ | AERC.RCALT | 0.00 | 0.63 | 11.92 | 49.56 | 337.85 | 0.10 | 0.16 |
| ERA.MYJ | AERC.RCALF | 0.00 | 0.64 | 17.08 | 54.10 | 2016.82 | 0.07 | 0.16 |
| ERA.MYJ | MMIF.RCALT | 0.00 | 0.64 | 12.19 | 47.44 | 367.98 | 0.10 | 0.17 |
| ERA.MYJ | MMIF.RCALF | 0.06 | 0.66 | 18.42 | 57.90 | 2912.69 | 0.06 | 0.17 |
| ERA.UW | AERC.RCALT | 0.00 | 0.56 | 9.28 | 44.78 | 130.96 | 0.13 | 0.16 |
| ERA.UW | AERC.RCALF | 0.00 | 0.53 | 12.51 | 58.29 | 473.90 | 0.10 | 0.14 |
| ERA.UW | MMIF.RCALT | 0.06 | 0.45 | 7.66 | 40.19 | 70.26 | 0.16 | 0.16 |
| ERA.UW | MMIF.RCALF | 0.00 | 0.47 | 11.32 | 50.80 | 322.69 | 0.11 | 0.13 |
| ERA.YSU | AERC.RCALT | 0.06 | 0.51 | 7.19 | 30.62 | 46.29 | 0.17 | 0.18 |
| ERA.YSU | AERC.RCALF | 0.00 | 0.51 | 6.95 | 24.64 | 37.91 | 0.17 | 0.18 |
| ERA.YSU | MMIF.RCALT | 0.12 | 0.39 | 5.47 | 22.46 | 20.61 | 0.22 | 0.19 |
| ERA.YSU | MMIF.RCALF | 0.12 | 0.37 | 5.84 | 21.27 | 24.42 | 0.20 | 0.19 |

^a Values of 1.0 for FF2, Geo. Corr., VG, MG, and Model Score are included for the observations as a reminder of the statistical minimum values.
160
-------
Figure 123. Ventura AERMOD Results Time Series: summer cases (top) and winter cases
(bottom).
161
-------
Figure 124. Ventura AERMOD Results Scatter Plot - NARR.MYJ.
Figure 125. Ventura AERMOD Results Q-Q Plot - NARR.MYJ.
162
-------
Figure 126. Ventura AERMOD Results Scatter Plot - NARR.UW.
Figure 127. Ventura AERMOD Results Q-Q Plot - NARR.UW.
163
-------
Figure 128. Ventura AERMOD Results Scatter Plot - NARR.YSU.
Figure 129. Ventura AERMOD Results Q-Q Plot - NARR.YSU.
164
-------
Figure 130. Ventura AERMOD Results Scatter Plot - ERA.MYJ.
Figure 131. Ventura AERMOD Results Q-Q Plot - ERA.MYJ.
165
-------
Figure 132. Ventura AERMOD Results Scatter Plot - ERA.UW.
Figure 133. Ventura AERMOD Results Q-Q Plot - ERA.UW.
166
-------
Figure 134. Ventura AERMOD Results Scatter Plot - ERA.YSU.
Figure 135. Ventura AERMOD Results Q-Q Plot - ERA.YSU.
167
-------
[Blank]
168
-------
5 DISCUSSION
The results of this study suggest that small differences in the key meteorological variables can result
in large differences in predicted tracer concentration for a given hour. Although many of the
WRF simulations perform quite well when compared to regional surface observations of winds
and temperatures, small differences near the overwater point of release can result in prediction
of the opposite stability regime (stable vs. unstable or vice versa).
A relatively small error in air temperature or SST can have a large effect on the diagnosed
stability because stability depends on the sign of the air-sea temperature difference, i.e., whether
it is positive or negative. Warm air advected over cool water results in stable conditions, while
cool air advected over warm water results in convective, unstable conditions. Spatial gradients
of SST near the coast and wind direction therefore play key roles in the simulation of stability
and PBL heights over the water.
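The sign dependence can be summarized with the standard bulk-flux and Monin-Obukhov relations (standard textbook forms, not equations reproduced from this report):

```latex
H_s \;\propto\; \rho\, c_p\, C_H\, U \left(T_{sea} - T_{air}\right), \qquad
L \;=\; -\,\frac{u_*^{3}\,\overline{\theta_v}}{k\,g\,\overline{w'\theta_v'}}
\quad\Longrightarrow\quad
\operatorname{sign}(L) \;=\; -\operatorname{sign}(H_s) \;=\; \operatorname{sign}\!\left(T_{air} - T_{sea}\right).
```

Thus even a sub-degree error that flips the sign of (T_air - T_sea) flips the sign of the surface heat flux and of L, switching AERMOD between the stable and convective branches of its formulation.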
The modeling performance analysis of the five tracer experiments examined in this study
demonstrated that WRF-based AERMOD simulations can produce estimates of concentration as
good as or better than AERMOD simulations using observations, but not in all cases. For the
Cameron, Pismo, Oresund, and Carpinteria studies, some of the AERMOD simulations driven
by WRF meteorology had performance statistics similar to or better than those of simulations driven by
observed meteorology. The poorer performing AERMOD simulations, both WRF-driven and
observation-driven, occurred when the meteorological inputs produced atmospheric stability
conditions that were not likely representative of the larger-scale stability at the time of the study.
This behavior, however, reflects a fundamental limitation of dispersion modeling that relies on
meteorology at a single point. Models that use a 3-dimensional grid of meteorological variables
are likely more appropriate for dispersion modeling of heterogeneous conditions.
For the Oresund study, all AERMOD simulations performed poorly when compared to the observed
tracer concentrations. This study was characterized by combined overwater and overland transport,
25-40 km transport distances, highly elevated releases, and observed shoreline fumigation. Poor
model performance in this instance is likely the result of the limits of AERMOD's formulation, not
of inaccurate characterization of the overwater surface conditions predicted by WRF.
It should also be noted that the limits suggested by Richmond & Morris (2012), namely restricting
the PBL height to be greater than 25 m and the absolute value of L to be greater than 5 m, were
implemented only in the cases that applied AERCOARE, not in the direct WRF-MMIF extraction
cases.
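A minimal sketch of those floors is shown below, assuming hourly records that carry a mixing height and a Monin-Obukhov length; the variable names and the interpretation of the |L| limit as 5 m are illustrative assumptions, not an excerpt of AERCOARE or MMIF code.

```python
def apply_overwater_limits(mixing_height_m, obukhov_length_m,
                           min_pbl_m=25.0, min_abs_l_m=5.0):
    """Apply Richmond & Morris (2012)-style floors for one hour of overwater met data:
    keep the PBL (mixing) height at or above 25 m and |L| at or above 5 m.
    Names and signature are illustrative only."""
    zi = max(mixing_height_m, min_pbl_m)
    if abs(obukhov_length_m) < min_abs_l_m:
        # Preserve the sign of L (stable vs. unstable) while enforcing the floor.
        sign = 1.0 if obukhov_length_m >= 0 else -1.0
        obukhov_length_m = sign * min_abs_l_m
    return zi, obukhov_length_m
```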
5.1 Primary Questions
In general, do WRF-driven simulations perform as well as AERCOARE/AERMOD results based
on observations? Was the performance comparable for some locations and not others? Why?
Some of the WRF-driven AERMOD simulations performed as well as or better than the
observation-based simulations, as judged by the TMS score and a qualitative analysis of the
results. These included:
169
-------
• Cameron: 7 of the 24 WRF cases. The poorer performing simulations produced
excessive atmospheric stability, leading to overprediction of concentration. Results were
most sensitive to the characterization of the air-sea temperature differences.
• Carpinteria: 4 of the 24 cases. The majority of WRF cases overpredict concentration;
however, in some cases WRF predicts unstable conditions when stable or neutral conditions
were observed. The better performing simulations had better predictions
of stability and wind speed. The air-sea temperature difference was the driving factor
influencing the overall model performance results for Carpinteria.
• Oresund: 2 of the 12 cases. The best performing WRF-based AERMOD simulations and
the observation-based AERMOD simulations produce conservative concentration estimates
at the upper end of the frequency distribution. However, although the frequency
distributions match the measurements, the high and low values do not match temporally;
therefore, the favorable AERMOD results may be fortuitous. Given the
transport distances and the complexity of the meteorology, it can be argued that the Oresund
results do not provide a reasonable measure for this study.
• Pismo: 20 of the 24 cases. The observation-based cases tended to overpredict
concentration; the low PBL heights (underpredicted compared to observations) and stable
conditions are supported by the air-sea temperature difference. The WRF-based
simulations produce more neutral and unstable conditions, resulting in better PBL height
agreement. The best performing WRF-based AERMOD simulations produce relatively
accurate concentration estimates, and their concentration predictions at the upper end are
conservative. However, the poorer WRF-based AERMOD simulations underpredict
concentration at the upper end of the distribution.
Is it necessary to use AERCOARE, or should WRF predictions of the surface fluxes and other
necessary parameters be passed directly through to AERMOD?
• There was no discernible advantage to using AERCOARE. The overprediction of the
highest concentrations by some WRF-MMIF simulations might have been reduced by
limiting the minimum PBL height and L, as was done for the AERCOARE simulations.
Although not applied in this study, MMIF also has the option of restricting PBL height
and L. The WRF-driven AERMOD simulations with the "MMIF" (direct extraction) and
"AERC" (AERCOARE processed) extractions performed differently, but neither was
consistently the better-performing option. Perhaps the differences between the
AERCOARE and WRF internal PBL options for prediction of the surface energy fluxes
were small compared to the basic ability of WRF to predict the correct winds and
air-sea temperature differences.
Did the WRF-predicted or the MMIF-rediagnosed PBL height perform better?
• There was a discernible advantage to using the MMIF-rediagnosed (rcalT) PBL heights.
The majority of the simulations that performed as well as the observation-driven
simulations were the MMIF-rediagnosed PBL height cases. The rediagnosed PBL
height cases, in general, performed better than the AERMOD simulations using the
WRF PBL heights. In some instances the WRF PBL heights were much lower than
the MMIF-rediagnosed PBL heights. The time series plots show that the MMIF
rediagnosis reduces the PBL height variance among the different WRF simulations and
results in predictions that more closely match the observed PBL heights.
170
-------
From a regulatory perspective, were any of the WRF-driven options consistently biased toward
underprediction?
• There was no consistent bias; underprediction will occur in cases where error in the air-sea
temperature difference results in a lower L (less stable, more neutral, or more
unstable conditions than observed). Poorer performing WRF simulations in the
Cameron, Oresund, and Ventura cases resulted in excessive overprediction
due to excessively stable conditions, while poorer performing WRF simulations in
the Carpinteria cases resulted in underprediction due to excessively unstable
conditions.
• Focusing on the upper end of the concentration distribution, the best performing
WRF simulations tended to match or slightly overpredict the measured
concentrations. There were a few cases where the WRF simulations underpredicted
concentrations at the upper end. An analysis of AERMOD results at the upper end
of the concentration distribution is included in Table 21. Subjective judgment, based
on the trends indicated on the Q-Q plots, was used to decide whether each model
case overpredicted, underpredicted, or matched the measured concentrations. The
simulations with the highest TMS performance scores are highlighted (the top three
from each tracer study).
• The Q-Q plot analysis summarized in Table 21 shows that most AERMOD cases
produce concentration results that match or exceed the measured concentrations at
the upper end of the distribution. This indicates that WRF-based AERMOD
simulations produce conservative concentration estimates, although not necessarily for the
same periods as the measured maxima.
• Overall, this analysis indicates that the ERA.YSU WRF simulations using WRF
meteorology directly (without AERCOARE processing) provide the most
representative, yet conservative, method for WRF-based AERMOD modeling. These
simulations resulted in the highest frequency of top-performing TMS scores. Only
for the Cameron study does this method perform worse than the measurement-based
method.
171
-------
Table 21. Concentration Q-Q Plot Distribution Upper-end Evaluation.

| Study | Extraction method | NARR.MYJ | NARR.UW | NARR.YSU | ERA.MYJ | ERA.UW | ERA.YSU |
|---|---|---|---|---|---|---|---|
| Cameron | Measurement-based Case 1 | M | | | | | |
| Cameron | Measurement-based Case 2 | M | | | | | |
| Cameron | MMIF.RCALF | M | U | U+ | O | M | U+ |
| Cameron | MMIF.RCALT | U+ | u+ | U+ | u+ | U+ | u+ |
| Cameron | AERC.RCALF | M | M | U+ | O | U+ | u+ |
| Cameron | AERC.RCALT | U+ | U+ | U+ | u+ | u+ | u+ |
| Carpinteria | Measurement-based Case 1 | O | | | | | |
| Carpinteria | Measurement-based Case 2 | O | | | | | |
| Carpinteria | MMIF.RCALF | O | O | O | O | o | o |
| Carpinteria | MMIF.RCALT | O | O | O | O | O | o |
| Carpinteria | AERC.RCALF | O | O | O | M | o | o |
| Carpinteria | AERC.RCALT | U | O | O | M | o | o |
| Oresund | Measurement-based Case 2 | U | | | | | |
| Oresund | MMIF.RCALF | - | - | - | U | M | o |
| Oresund | MMIF.RCALT | - | - | - | O | M | o |
| Oresund | AERC.RCALF | - | - | - | u+ | U+ | u+ |
| Oresund | AERC.RCALT | - | - | - | u+ | U+ | u+ |
| Pismo | Measurement-based Case 1 | o+ | | | | | |
| Pismo | Measurement-based Case 2 | o+ | | | | | |
| Pismo | MMIF.RCALF | o+ | o+ | U | o+ | O+ | o+ |
| Pismo | MMIF.RCALT | o+ | o+ | U | o+ | O+ | o+ |
| Pismo | AERC.RCALF | o+ | o+ | M | o+ | O+ | o+ |
| Pismo | AERC.RCALT | o+ | o+ | U | o+ | O+ | o+ |
| Ventura | Measurement-based Case 1 | O | | | | | |
| Ventura | Measurement-based Case 2 | O+ | | | | | |
| Ventura | MMIF.RCALF | o+ | o+ | O+ | o+ | O+ | o+ |
| Ventura | MMIF.RCALT | o+ | o+ | O+ | o+ | O+ | o+ |
| Ventura | AERC.RCALF | o+ | o+ | O+ | o+ | O+ | o+ |
| Ventura | AERC.RCALT | o+ | o+ | o+ | o+ | O+ | o+ |

U: underpredicts, O: overpredicts, M: equivalent values; +: significant over- or underprediction, generally exceeding a factor of 2. Note that these ratings are based on a subjective visual impression provided by the scatter plots.
Highlighted and italicized values indicate the "best" three performing WRF simulations for each tracer study based on performance scores.
A dash (-) indicates a reanalysis/PBL combination that was not simulated for that study (NARR was applied to the U.S. studies only). Ratings for the measurement-based cases do not depend on the WRF configuration and are shown as a single value.
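The subjective U/O/M ratings in Table 21 could also be approximated programmatically; the sketch below is one hedged way to do so. The 90th-percentile cutoff and the ±30% band for "M" are illustrative assumptions, not the report's criteria, although the factor-of-2 threshold for "+" follows the table legend.

```python
import numpy as np

def upper_end_rating(obs, pred, quantile=0.9):
    """Rate upper-end agreement in the spirit of Table 21: 'U'/'O'/'M', with '+'
    appended when the upper-end ratio exceeds a factor of 2."""
    obs_top = np.quantile(np.asarray(obs, float), quantile)
    pred_top = np.quantile(np.asarray(pred, float), quantile)
    ratio = pred_top / obs_top
    if 0.7 <= ratio <= 1.3:           # roughly equivalent upper-end values
        return "M"
    rating = "O" if ratio > 1.0 else "U"
    if ratio > 2.0 or ratio < 0.5:    # beyond a factor of 2
        rating += "+"
    return rating
```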
172
-------
Table 22 summarizes the results for the individual experiments (Table 16 - Table 20).
Table 22. Summary of Model Evaluation Findings.

| Scenario | Case 1 Obs TMS | Case 2 Obs TMS | Best WRF Configuration | Best Extraction Method | Best WRF TMS |
|---|---|---|---|---|---|
| Cameron | 0.64 | 0.61 | NARR.UW | MMIF.RCALF | 0.68 |
| Carpinteria | 0.64 | 0.64 | ERA.UW | MMIF.RCALT | 0.55 |
| Oresund | - | 0.43 | ERA.MYJ | MMIF.RCALT | 0.41 |
| Pismo | 0.4 | 0.23 | ERA.YSU | AERC.RCALF | 0.75 |
| Ventura | 0.72 | 0.42 | ERA.YSU | MMIF.RCALF | 0.19 |
5.2 Conclusions
Based on the results of this study, the following conclusions can be made:
• ERA reanalysis datasets offered better WRF predictions of meteorology regionally, but the
NARR reanalysis performed better in some cases at the local tracer study
meteorological measurement sites. ERA-based runs resulted in better AERMOD results
overall. The YSU and UW PBL schemes resulted in better predictions of meteorology
overall, leading to better AERMOD predictions.
• The METSTAT analyses suggested that most WRF simulations met the performance criteria
goals for "complex terrain" conditions. The comparison with overwater measurements from
the archived buoy data suggested that the METSTAT performance could be used as a
predictor at the site, despite the lack of overwater measurements in the METSTAT
analysis itself. However, the meteorological analysis suggests that small errors in SST and
air temperature can result in misdiagnosed stability conditions that can have profound
effects on the AERMOD results.
• The results suggest that representative SST data are necessary to prevent misdiagnosis of
the surface-layer heat flux and stability. The SST data from the periods of the tracer studies
that were integrated into the reanalysis data are not as representative or as well resolved as
today's datasets; SST data are now collected by satellite at high resolution. It
is likely that modern SST data are more accurate and that air-sea temperature differences
estimated by WRF are now less likely to result in a misdiagnosis of atmospheric stability
conditions.
• AERMOD results produced using meteorology extracted from the ERA.YSU WRF
simulations had the highest frequency of top-performing TMS scores. This combination
also tended to more closely match the upper-range concentrations.
• Direct extraction by MMIF without AERCOARE produced more cases where
concentration predictions were conservative.
• The MMIF rediagnosis of PBL height should be used to prevent excessively low PBL heights
in the SFC files.
173
-------
[Blank]
174
-------
REFERENCES
Angevine, W., 2012. Performance Results with the Total Energy - Mass Flux PBL scheme.
[Online]
Available at: http://www.mmm.ucar.edu/wrf/users/workshops/WS2012/ppts/3.2.pdf
[Accessed 22 01 2013].
Anon., 2010. NOAA/NCDC. [Online]
Available at: http://www.ncdc.noaa.gov/oa/climate/rcsg/datasets.html#surface
[Accessed 15 02 2013].
Arya, P., 1988. Introduction to Micrometeorology. London: Academic Press.
Brashers, B. & Emery, C., 2014. The Mesoscale Model Interface Program (MMIF) Draft User's
Manual, Novato, CA: ENVIRON Int. Corp. Air Sciences Group, Prepared for U.S. EPA Air
Quality Assessment Division.
Bretherton, C. & Park, S., 2009. A New Moist Turbulence Parameterization in the Community
Atmosphere Model. J. Climate, Volume 22, pp. 3422-3448.
Bridgers, G., 2011. Model Clearinghouse Review of AERMOD-COARE as an Alternative Model
for Application in an Arctic Marine Ice Free Environment. Research Triangle Park(North
Carolina): U.S. EPA.
Cimorelli, A. et al., 2004. AERMOD: Description of model formulation, s.l.: USEPA, EPA-454/R-
03-004.
Cole, J. & Summerhays, J., 1979. A Review of Techniques Available for Estimation of Short-
Term NO2 Concentrations. Journal of the Air Pollution Control Association, 29(8), pp. 812-
817.
Dabberdt, W., Brodzinsky, R., Cantrell, B. & Ruff, R., 1982. Atmospheric Dispersion Over Water
and in the Shoreline Transition Zone, Final Report Volume II: Data, Menlo Park, CA:
Prepared for American Petroleum Institute by SRI International.
DiCristofaro, D. & Hanna, S., 1989. OCD: The Offshore and Coastal Dispersion Model, s.l.:
Prepared for U.S. Dept. of Interior MMS, Report #A085-1.
Doran, J. & Gryning, S., 1987. Wind and Temperature Structure over a Land-Water-Land-Area.
American Meteorological Society.
Earth Tech, 2006a. Development of the Next Generation of Air Quality Models for the Outer
Continental Shelf (OCS) Applications, Final Report: Volume 1, Contract 1435-01-01-CT-
31071: Prepared for U.S. Dept. of the Interior, Minerals Management Service.
Earth Tech, 2006b. Development of the Next Generation Air Quality Models for Outer
Continental Shelf (OCS) Applications - Model Evaluation Data Archive, Contract No. 1435-
0101-CT-31071: Prepared for U.S. Dept. of the Interior, Minerals Management Service.
175
-------
Emery, C., Tai, E. & Yarwood, G., 2001. Enhanced meteorological modeling and performance
evaluation for two Texas ozone episodes, Novato, CA: Prepared for the Texas Nat. Res.
Cons. Commission by ENVIRON Int. Corp..
ENVIRON Int. Corp., 2010. Evaluation of the COARE-AERMOD Alternative Modeling Approach
Support for Simulation of Shell Exploratory Drilling Sources In the Beaufort and Chukchi
Seas, Lynnwood, WA 98036: ENVIRON, 19020 33rd Ave. W., Suite 310.
ENVIRON Int. Corp., 2012. Evaluation of the Combined AERCOARE/AERMOD Modeling
Approach for Offshore Sources, Novato, California: ENVIRON Int. Corp., 773 San Marin
Drive, Suite 2115.
ENVIRON Int. Corp., 2014. METSTAT. [Online]
Available at: http://www.camx.com/download/support-software.apx
Gryning, S.-E., 1985. The Oresund Experiment - A Nordic Mesoscale Dispersion Experiment
over a Land-Water-Land Area. Bull. Am. Meteor. Soc., 66(11), pp. 1403-1407.
Hahmann, A. N. et al., 2011. Simulating the Vertical Structure of the Wind with the WRF Model,
s.l.:
http://www.mmm.ucar.edu/wrf/users/workshops/WS2011/Power%20Points%202011/5_4_Hahmann_WRFWorkshop_11.pdf.
Hanna, S., Schulman, L., Paine, R. & Pleim, J., 1984. Users Guide to the Offshore and Coastal
Dispersion (OCD) Model., Concord, MA: Environmental Research & Technology, Inc..
Hanna, S. et al., 1985. Development and Evaluation of the Offshore and Coastal Dispersion
Model. J. Air Poll. Contr. Assoc., Volume 35, pp. 1039-1047.
Hanrahan, P., 1999. The Plume Volume Molar Ratio Method for Determining NO2/NOx Ratios
for Modeling, Part 1: Methodology. Journal of the Air & Waste Management Association,
Volume 49, pp. 1324-1331.
Hong, S.-Y., Noh, Y. & Dudhia, J., 2006. A New Vertical Diffusion Package with an Explicit
Treatment of Entrainment Processes. Mon. Weather Rev., Volume 134, pp. 2318-2341.
Hu, X., Nielsen-Gammon, J. & Zhang, F., 2010. Evaluation of Three Planetary Boundary Layer
Schemes in the WRF Model. Journal of Applied Meteorology and Climatology, Volume 49,
pp. 1831-1844.
Iacono, M. et al., 2008. Radiative forcing by long-lived greenhouse gases: calculations with the
AER radiative transfer models. Journal of Geophysical Research.
Janjic, Z., 1994. The step-mountain eta coordinate model: further developments of the
convection, viscous sublayer and turbulence closure schemes. Mon. Weather Rev., Volume
122, pp. 927-945.
Johnson, V. & Spangler, T., 1986. Tracer Study Conducted to Acquire Data for Evaluation of Air
Quality Dispersion Models, San Diego, CA: WESTEC Services, Inc. for the American
Petroleum Institute.
176
-------
Kain, J., 2004. The Kain-Fritsch convective parameterization: an update.. Journal of Applied
Meteorology, Volume 43, pp. 170-181.
Kemball-Cook, S., Jia, Y., Emery, C. & Morris, R., 2005. Alaska MM5 Modeling for the 2002
Annual Period to Support Visibility Modeling, Novato, CA: Prepared for the Western
Regional Air Partnership by ENVIRON Int. Corp..
Lahoz, W., Khattatov, B. & Menard, R. (., 2010. Data Assimilation: making sense of
observations. London: Springer.
Larson, V. et al., 2001. Systematic biases in the microphysics and thermodynamics of numerical
models that ignore subgrid-scale variability. Journal of Atmospheric Science, Volume 58, pp.
1117-1128.
Louis, J., 1979. A Parameteric Model of Vertical Eddy Fluxes in the Atmosphere. J. Atmos. Sci.,
Volume 35, pp. 187-202.
Mass, C. et al., 2008. Removal of systematic model bias on a model grid. Weather and
Forecasting, Volume 23, pp. 438-459.
McNally, D. & Wilkinson, J. G., 2011. Model Application and Evaluation: ConocoPhillips Chukchi
Sea WRF Model Application., Arvada, Colorado: Alpine Geophysics, LLC.
Mellor, G. & Yamada, T., 1982. Development of a turbulence closure model for geophysical fluid
problems. Rev. Geophys. Space Phys., Volume 20, pp. 851-875.
Mesinger, F. et al., 2006. North American Regional Reanalysis. Bull. Amer. Meteor. Soc., pp. 343-360.
National Center for Atmospheric Research, 2014. Weather Research & Forecasting (WRF)
ARW Version 3 Modeling System User's Guide, s.l.: NCAR Mesoscale & Meteorology
Division.
Park, S. & Bretherton, C. S., 2009. The University of Washington Shallow Convection and Moist
Turbulence Schemes and Their Impact on Climate Simulations with the Community Atmosphere
Model. J. Climate, pp. 3449-3469.
Richmond, K. & Morris, R., 2012. Evaluation of the Combined AERCOARE/AERMOD Modeling
Approach for Offshore Sources, s.l.: ENVIRON Int. Corp. Prepared for USEPA R.10, EPA-
910-R-12-007.
Schacher, G. et al., 1982. California Coastal Offshore Transport and Diffusion Experiments:
Meteorological Conditions and Data, Monterey, CA: Report NPS-61-82-007, Naval
Postgraduate School.
Schulman, L., Strimaitis, D. & Scire, J., 2002. Development and Evaluation of the PRIME plume
rise and building downwash model. Journal of the Air & Waste Management Association,
Volume 50, pp. 278-390.
Simmons, A., Uppala, S., Dee, D. & Kobayashi, S., 2006. ERA-Interim: New ECMWF reanalysis
products from 1989 onwards. ECMWF Newsletter, pp. 26-35.
177
-------
Skamarock, W. et al., 2008. A Description of the Advanced Research WRF Model, Version 3,
s.l.: Nat. Center for Atmos. Research, Univ. Corp. Atmos. Research.
Stauffer, D., Seaman, N. & Binkowski, F., 1991. Use of four-dimensional data assimilation in a
limited-area mesoscale model. Part II: effects of data assimilation within the planetary
boundary layer. Mon. Weather Rev., Volume 119, pp. 734-754.
Stull, R., 1988. An Introduction to Boundary Layer Meteorology. Dordrecht, Netherlands: Kluwer
Academic Publishers.
Tewari, M. et al., 2004. Implementation and verification of the unified NOAH land surface model
in the WRF model. Seattle, WA, s.n.
Thompson, G., Field, P., Rasmussen, R. & Hall, W., 2008. Explicit Forecasts of Winter
Precipitation using an Improved Bulk Microphysics Scheme. Part II: Implementation of a
New Snow Parameterization. Monthly Weather Review, Volume 136, pp. 5095-5115.
USEPA, 2003. AERMOD: Latest Features and Evaluation Results, Research Triangle Park, NC:
U.S. EPA, OAQPA, EPA-454/R-03-003.
USEPA, 2004a. AERMOD: Description of model formulation, Research Triangle Park, North
Carolina: U.S. Environ. Protection Agency, EPA-454/R-03-004.
USEPA, 2004b. User's Guide for the AERMOD Meteorological Preprocessor AERMET,
Research Triangle Park, North Carolina: U.S. Environ. Protection Agency, EPA-454/B03-
002.
USEPA, 2004c. User's Guide for the AMS/EPA Regulatory Model AERMOD, Research Triangle
Park, North Carolina: U.S. Environ. Protection Agency, EPA-454/B-03-001.
USEPA, 2004d. User's Guide for the AMS/EPA Regulatory Model AERMOD, Research Triangle
Park, North Carolina: U.S. Environ. Protection Agency, EPA-454/B-03-001.
USEPA, 2012. User's Manual AERCOARE Version 1.0, Seattle, WA: U.S. Environ. Protection
Agency Region 10, EPA-910-R-12-008.
Venkatram, A., 1980. Estimating the Monin-Obukhov Length in the Stable Boundary Layer for
Dispersion Calculations. Boundary Layer Meteor., Volume 19, pp. 481-485.
Vogelezang, D. & Holtslag, A., 1996. Evaluation and model impacts of alternative boundary-
layer height formulations. Boundary Layer Meteor., Volume 81, pp. 245-269.
Wee, T. et al., 2012. Two Overlooked Biases of the Advanced Research WRF (ARW) Model in
Geopotential Height and Temperature. Monthly Weather Review, Volume 140, pp. 3907-
3918.
Wong, H., 2011. COARE Bulk Flux Algorithm to Generate Hourly Meteorological Data for Use
with AERMOD. Seattle (WA): U.S. EPA Region 10.
178
-------
Xie, B., Fung, J., Chan, A. & Lau, A., 2012. Evaluation of nonlocal and local planetary boundary
layer schemes in the WRF model. Journal of Geophysical Research, Volume 117, pp. 1-26.
Yver, C. et al., 2012. Evaluating transport in the WRF model along the California Coast. Atmos.
Chem. Phys. Discuss., pp. 16851-16884.
Zhang, D. & Zheng, W., 2012. Diurnal cycles of surface winds and temperatures as simulated
by five boundary layer parameterizations. Journal of Applied Meteorology, 117(012103,
doi: 10.1029/2011JD017080), pp. 1-26.
Zhang, J., 2011. Beaufort and Chukchi Seas Mesoscale Meteorology Model Study, s.l.: s.n.
179
-------
[Blank]
180
-------
APPENDIX A: TASK 2 PROTOCOL
-------
-------
DRAFT Overwater Dispersion Modeling
Task 2 Protocol
AMEC RFP # 12-6480110233-TC-3902
Federal Prime Contract # EP-W-09-028
Prepared for:
AMEC Environment & Infrastructure, Inc.
502 W. Germantown Pike, Suite 850
Plymouth Meeting, PA 19462-1308
Attention: Thomas Carr
Prepared by:
ENVIRON International Corporation
773 San Marin Drive, Suite 2115
Novato, California, 94945
www.environcorp.com
P-415-899-0700
F-415-899-0707
March 11, 2013
ENVIRON
-------
Final Overwater Dispersion Modeling
Task 2 Protocol
INTRODUCTION
The primary objective of the current study is to test and evaluate AERMOD on the outer
continental shelf (OCS). The current modeling procedures for sources on land use the
AERMOD modeling system. The meteorological AERMET processor included in the system is
inappropriate for OCS sources because the energy fluxes over water are not strongly driven by
diurnal heating and cooling. In addition, the meteorological observations necessary to drive the
dispersion models are commonly not available, especially in the Arctic Ocean. For applications
in the Arctic, the remote location and seasonal sea-ice pose logistical problems for the
deployment of buoys or offshore measurement platforms.
This study evaluates a combined modeling approach where the meteorological variables are
provided by the Weather Research and Forecasting (WRF) mesoscale model, and then
processed by a combination of a new Mesoscale Model Interface program (MMIF) and,
optionally, AERCOARE (a replacement for AERMET suitable for overwater conditions). Given
an appropriate overwater meteorological dataset, AERMOD can then be applied for New
Source Review following the same procedures as used for sources over land.
The remainder of this document presents a protocol for Task 2 of the study. Task 2 compares
WRF-driven AERMOD dispersion predictions against the concentrations observed in five
offshore tracer studies. The same four North American studies were used previously to evaluate
AERCOARE using actual overwater observations (ENVIRON, 2010); the current study also
includes Oresund. In this task, WRF is used to provide predictions in place of the overwater
observations, and model performance using WRF is compared to the performance found with actual
observations.
Task 2: Evaluate the use of WRF Solutions with AERMOD
AMEC and ENVIRON prepared a Work Plan outlining the various tasks and objectives of the
current study. As directed by EPA and AMEC, the second task is to generate WRF simulations
that match the five offshore tracer studies and then model the tracer dispersion using multiple
configurations of MMIF/AERCOARE/AERMOD. The protocol includes additional information on
data, options, and issues that were not fully described in the Work Plan. With an approved
protocol, ENVIRON staff will perform the following subtasks:
Task 2a: Generate WRF simulations to match the five offshore tracer studies
ENVIRON will perform meteorological simulations to match five historical field studies
conducted in:
• Cameron, LA: July 1981 and February 1982
• Pismo Beach, CA: December 1981 and June 1982
• Carpinteria, CA: September 1985 and October 1985
• Ventura, CA: September 1980 and January 1981
• Oresund (between Denmark and Sweden): May/June 1984
March 11,2013 1 ENVIRON
-------
Final Overwater Dispersion Modeling
Task 2 Protocol
The four North American studies listed have been used for OCS model development, including
for OCD, CALPUFF, and most recently for AERCOARE. For these simulations, ENVIRON has
selected the National Center for Atmospheric Research's (NCAR's) community-developed WRF
model (dynamical core version 3.4.1). WRF is a limited-area, non-hydrostatic, terrain-following
eta-coordinate mesoscale model.
WRF must be optimized for land-sea contrast, including Sea Surface Temperature (SST), the
land-sea-breeze circulation, and the correspondingly influenced temperature structure because
the tracer studies are in an offshore coastal environment. ENVIRON's WRF configurations
attempt to capture the timing and location of rapidly-changing and diurnally-influenced land-sea-
breeze regimes, which will be critical to a successful forecast. To do this, ENVIRON's model
configuration should include the most accurate initial inputs, regionally applicable physics
choices, and nudging selected to incorporate local field-study data, combined with the best
SSTs and land surface models available.
ENVIRON's base case configuration will include 5.5-day simulation blocks, with a minimum of
12 hours of model spin-up prior to the experimental tracer release times. The spin-up time allows
the model to develop sub-grid scale processes, including the vorticity and moisture fields. Table
2 summarizes the date ranges for the tracer studies and the corresponding regional weather model
initializations. The U.S. modeling domains are defined on the Lambert Conformal Conic (LCC)
map projection identical to the National Regional Planning Organization (RPO) domains, with an
outermost RPO domain (36 km) and telescoping 12-4-1.33 km nests to capture the fine detail of
coastlines and adjacent topography. The domain configuration for Oresund is similar; however,
the location requires a projection defined for Northern Europe. The domains are shown in Figures
1-5. Figures showing the innermost 1.33 km domain nests can be found in the discussion of
Task 2c, starting on page 11.
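The initialization schedule in Table 2 follows directly from the 5.5-day block length and the 12-hour minimum spin-up described above. The sketch below reproduces that scheduling logic for illustration only; it is not the project's run script, and the cycle-hour argument is an assumption.

```python
from datetime import datetime, timedelta

def wrf_init_times(study_start, study_end, cycle_hour=12):
    """Illustrative scheduling of 5.5-day WRF blocks with >= 12 h of spin-up,
    consistent with the pattern in Table 2 (not the project's actual scripts)."""
    run_length = timedelta(hours=132)       # 5.5 days per block
    restart_step = timedelta(days=5)        # consecutive blocks overlap by 12 hours
    # Initialize at the cycle hour on the day before the study begins (>= 12 h spin-up).
    first = datetime(study_start.year, study_start.month, study_start.day,
                     cycle_hour) - timedelta(days=1)
    inits, t = [], first
    while t < study_end:                    # add blocks until the study period is covered
        inits.append(t)
        if t + run_length >= study_end:
            break
        t += restart_step
    return inits

# Example: the Carpinteria period reproduces the Table 2 initializations
# of 00Z on 09/18, 09/23, and 09/28/1985.
print(wrf_init_times(datetime(1985, 9, 19), datetime(1985, 9, 29, 23), cycle_hour=0))
```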
The planned model vertical structure includes 37 levels, disproportionately stacked toward the
surface, where finer vertical resolution is necessary to capture the boundary layer structure and
therefore coastal weather (see Table 3). The proposed boundary layer resolution uses finer
vertical spacing than ENVIRON typically uses for simulations over land, as we anticipate this
will help winds and temperatures respond more explicitly to dynamical influences.
ENVIRON will include high resolution sea surface temperatures from NOAA's 1/4 degree
Optimum Interpolation (OI) dataset V2 (AVHRR). The OI dataset will replace the global model
values from September 1981 onward. Both the winter and summer periods at Ventura and the
winter period at Cameron occurred before the start of the OI SST dataset and will therefore use
the coarser SST values found in the initialization dataset (NARR or ERA-I). The nudging
techniques described later in this section aim to correct air-sea temperature biases resulting
from the coarser SST data available in the early 1980s. The subgrid-scale fluxes at the lower
boundary of WRF will be treated by the four-layer NOAH land-surface model, which has been
used extensively at ENVIRON with success. The NOAH land-surface model was also used for a
similar WRF transport study along the California coast (Yver et al., 2012).
March 11,2013 2 ENVIRON
-------
Final Overwater Dispersion Modeling
Task 2 Protocol
Other WRF options ENVIRON proposes are: the sophisticated Morrison 2-moment
microphysics scheme, which predicts both number concentrations and mixing ratios; the Rapid
Radiative Transfer Model (RRTMG) longwave/shortwave radiation physics; and the
Monin-Obukhov (Janjic) surface layer physics. ENVIRON will parameterize cumulus convection
on the 36 and 12 km domains using the Kain-Fritsch WRF option with the new Kain-Fritsch "Eta"
trigger. We note that the Grell 3-D cumulus algorithm was used in the Yver et al. California
coastal tracer study; if cumulus performance in this study is unsatisfactory, Grell 3-D will be
considered. At the higher resolutions, 4 and 1.33 km, convection will be treated explicitly by the
model (no cumulus parameterization).
ENVIRON understands that mesoscale models will generally have some bias and that no single
configuration provides the best simulation in all circumstances. The accuracy and variability of
the WRF model are critical to evaluating its success as an input to downstream dispersion
models, because a single choice of model setup represents a single deterministic solution.
ENVIRON therefore proposes to approach the simulation of each historical field study as an
ensemble of simulations, varying two inputs that are highly influential and whose influence is
difficult to anticipate: the reanalysis input and the planetary boundary layer (PBL) scheme
selection (a trivial enumeration of the eight ensemble members is sketched after the list below).
The ensemble will include all eight possible combinations of the reanalyses and PBL schemes
listed below:
• European Centre for Medium-Range Weather Forecasts ERA-Interim dataset (ERA-I,
6-hourly analysis output, ~0.5 degree)
• North American Regional Reanalysis (NARR, U.S. cases only, 3-hourly analysis dataset,
~0.3 degree)
• Yonsei University (YSU) PBL scheme
• Mellor-Yamada-Janjic (MYJ) PBL scheme
• University of Washington Shallow Convection (UW-PBL) PBL scheme
• Total Energy - Mass Flux (TEMF) PBL scheme
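As referenced above, the eight ensemble members are simply the cross product of the two reanalyses and the four PBL schemes; the trivial sketch below only enumerates labels and is not a run script.

```python
from itertools import product

reanalyses = ["ERA", "NARR"]             # ERA = ERA-Interim; NARR for the U.S. cases only
pbl_schemes = ["YSU", "MYJ", "UW", "TEMF"]

# Produces labels in the document's naming style, e.g. 'ERA.YSU', ..., 'NARR.TEMF'.
ensemble = [f"{ra}.{pbl}" for ra, pbl in product(reanalyses, pbl_schemes)]
print(len(ensemble), ensemble)           # 8 members
```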
The first two PBL schemes have been used extensively in ENVIRON modeling, while the UW-PBL
scheme has been shown to reduce climate bias by 7% in the Community Atmosphere Model (Park,
2009). The TEMF scheme, with total turbulent energy as a prognostic variable and integrated
shallow cloud, is intended to improve simulations with shallow cloud and/or stable boundary
layers and therefore deserves testing in this study (Angevine, 2012). The ERA-I dataset has
replaced the Global Forecast System (GFS) model as the standard at ENVIRON due to its higher
accuracy (Angevine, 2012). NARR was employed successfully to model Arctic overwater
meteorology in a BOEM study (Zhang, 2011).
Effective nudging will be critical to successfully modeling meteorological conditions where
satellite-derived SSTs do not exist. Nudging provides an opportunity to correct for biases in the
air-sea temperature difference that would otherwise profoundly affect boundary layer structure
and wind speeds. ENVIRON has experimented extensively with nudging in WRF; there are
options to nudge the wind, temperature, and moisture fields toward 3-D and 2-D analysis fields,
and options to nudge toward surface observations using specified horizontal and vertical radii of
influence. Excessive nudging, however, leads to non-physical development of weather patterns.
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
ENVIRON recommends 3-D nudging toward analysis grids for wind, temperature, and moisture
on the 36 and 12 km domains. Nudging within the PBL prevents the development of physical
boundary layer processes; therefore, ENVIRON's strategy involves 3-D nudging above the PBL
only, and surface analysis nudging at the base of the PBL will not be employed. Observational
nudging with a radius of influence of 50 km or less and a vertical radius of influence up to
approximately 900 mb will be employed for temperature and wind speed, but not moisture (to
avoid the development of spurious convection), using the DS3505 data.1 In this manner, we
hope to capture features such as the deceleration of near-surface winds blowing from a heated
land surface associated with a stable layer that develops over a cool water surface, as observed
and modeled at Oresund (Doran & Gryning, 1987). The use of observational data may adversely
impact the model if overwater data are represented by the model as being on land or vice versa;
in this case, ENVIRON may choose to exclude certain data from the objective analysis, or
perhaps turn off observational nudging entirely. Table 1 lists the types of nudging employed, the
variables nudged, and the strength of the nudging coefficients.2
In order to compare the quality of the simulation, ENVIRON will analyze all available
meteorological datasets and will compare in-situ data with time-series plots of temperature, air-
sea temperature difference, wind speed, wind direction, mixing height, and lapse rates.
Table 1. Nudging in WRF model simulations

| Domain (km) | 3-D Nudging Variables | 3-D Nudging Coefficients | Observational Nudging Variables | Observational Nudging Coefficients |
|---|---|---|---|---|
| 36 | Q, UV, T | 0.0003 | | |
| 12 | Q, UV, T | 0.0003 | | |
| 4 | | | UV, T | 0.0005 |
| 1.33 | | | | |
1 DS3505 integrated surface hourly (ISH) worldwide station data include extensive automated QC on all data and
additional manual QC for USAF, NAVY, and NWS stations. The dataset integrates all data from DS9956, DS3280, and DS3240.
Its roughly 10,000 currently active stations report wind speed and direction, wind gust, temperature, dew point, cloud data,
sea level pressure, altimeter setting, station pressure, present weather, visibility, precipitation amounts for various
time periods, snow depth, and various other elements as observed by each station (NOAA/NCDC, 2010).
2 ENVIRON will employ different nudging coefficients, nudging variables, or nudging in the PBL if the simulation
diverges from the input grids or lacks meteorological validity.
March 11, 2013
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Table 2. Historical Field Study Dates and WRF Initializations

| Location | Historical Field Study Date Ranges | WRF Initializations |
|---|---|---|
| Cameron, LA | Period 1: 08Z 08/20/1981 to 13Z 08/29/1981; Period 2: 08Z 02/15/1982 to 14Z 02/24/1982 | Period 1: 12Z 08/19/1981, 12Z 08/24/1981; Period 2: 12Z 02/14/1982, 12Z 02/19/1982 |
| Carpinteria, CA | Period 1: 09/19/1985 to 09/29/1985 (Complex Terrain Study only) | Period 1: 00Z 09/18/1985, 00Z 09/23/1985, 00Z 09/28/1985 |
| Pismo Beach, CA | Period 1: 12/08/1981 to 12/15/1981; Period 2: 06/21/1982 to 06/27/1982 | Period 1: 00Z 12/07/1981, 00Z 12/12/1981; Period 2: 00Z 06/20/1982, 00Z 06/25/1982 |
| Ventura, CA | Period 1: 01/06/1980 to 01/13/1980; Period 2: 09/27/1981 to 09/29/1981 | Period 1: 12Z 01/05/1980, 12Z 01/10/1980; Period 2: 12Z 09/26/1981 |
| Oresund, Denmark/Sweden | Period 1: 11Z 05/16/1984 to 13Z 06/05/1984; Period 2: 10Z 06/12/1984 to 15Z 06/14/1984 | Period 1: 12Z 05/15/1984, 12Z 05/20/1984, 12Z 05/25/1984, 12Z 05/30/1984, 12Z 06/04/1984; Period 2: 12Z 06/11/1984 |
March 11, 2013
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Table 3. WRF Model 37 Vertical Levels.

| Level | Eta | Pressure (mb) | Height (m) | Mid Height (m) | Dz (m) |
|---|---|---|---|---|---|
| 1 | 1 | 1000 | 0.0 | 6.1 | 12.2 |
| 2 | 0.9985 | 999 | 12.2 | 18.4 | 12.2 |
| 3 | 0.997 | 997 | 24.5 | 32.7 | 16.4 |
| 4 | 0.995 | 995 | 40.8 | 49.0 | 16.4 |
| 5 | 0.993 | 993 | 57.2 | 65.4 | 16.4 |
| 6 | 0.991 | 991 | 73.6 | 85.9 | 24.7 |
| 7 | 0.988 | 989 | 98.3 | 110.6 | 24.7 |
| 8 | 0.985 | 986 | 123.0 | 143.6 | 41.3 |
| 9 | 0.98 | 981 | 164.3 | 205.9 | 83.1 |
| 10 | 0.97 | 972 | 247.4 | 289.3 | 83.8 |
| 11 | 0.96 | 962 | 331.2 | 373.4 | 84.5 |
| 12 | 0.95 | 953 | 415.7 | 458.2 | 85.1 |
| 13 | 0.94 | 943 | 500.8 | 543.7 | 85.8 |
| 14 | 0.93 | 934 | 586.6 | 673.5 | 173.8 |
| 15 | 0.91 | 915 | 760.5 | 848.8 | 176.8 |
| 16 | 0.89 | 896 | 937.2 | 1027.1 | 179.8 |
| 17 | 0.87 | 877 | 1117.1 | 1254.9 | 275.8 |
| 18 | 0.84 | 848 | 1392.8 | 1582.6 | 379.6 |
| 19 | 0.8 | 810 | 1772.4 | 1969.6 | 394.3 |
| 20 | 0.76 | 772 | 2166.7 | 2371.9 | 410.3 |
| 21 | 0.72 | 734 | 2577.0 | 2791.0 | 427.9 |
| 22 | 0.68 | 696 | 3005.0 | 3228.6 | 447.3 |
| 23 | 0.64 | 658 | 3452.2 | 3686.6 | 468.7 |
| 24 | 0.6 | 620 | 3921.0 | 4230.8 | 619.8 |
| 25 | 0.55 | 573 | 4540.7 | 4872.2 | 662.9 |
| 26 | 0.5 | 525 | 5203.7 | 5560.4 | 713.4 |
| 27 | 0.45 | 478 | 5917.1 | 6303.8 | 773.4 |
| 28 | 0.4 | 430 | 6690.5 | 7113.5 | 846.0 |
| 29 | 0.35 | 383 | 7536.4 | 8004.4 | 935.8 |
| 30 | 0.3 | 335 | 8472.3 | 8997.4 | 1050.2 |
| 31 | 0.25 | 288 | 9522.5 | 10123.3 | 1201.6 |
| 32 | 0.2 | 240 | 10724.1 | 11430.4 | 1412.6 |
| 33 | 0.15 | 193 | 12136.7 | 13001.8 | 1730.1 |
| 34 | 0.1 | 145 | 13866.9 | 14744.2 | 1754.7 |
| 35 | 0.06 | 107 | 15621.6 | 16562.5 | 1881.8 |
| 36 | 0.027 | 76 | 17503.4 | 18548.8 | 2090.8 |
| 37 | 0 | 50 | 19594.2 | | |

Note: Calculated using P0 = 1000 mb, Ptop = 50 mb, T0 = 20.15 C, and dT/dz = -6.5 C/km. Mid Height and Dz refer to the layer between the given level and the next level up.
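The level heights in Table 3 can be reproduced from the note's constants with the hypsometric relation for a constant-lapse-rate atmosphere. The short sketch below is a check of that calculation for illustration only; it is not project code.

```python
# Constants taken from the Table 3 note (constant-lapse-rate atmosphere).
P0, PTOP = 1000.0, 50.0        # hPa (mb)
T0 = 20.15 + 273.15            # surface temperature, K
GAMMA = 6.5e-3                 # lapse rate, K/m
RD, G = 287.0, 9.81            # dry-air gas constant (J/kg/K), gravity (m/s^2)

def eta_to_pressure(eta):
    """Terrain-following eta to pressure (hPa) for a flat, sea-level column."""
    return PTOP + eta * (P0 - PTOP)

def pressure_to_height(p):
    """Height (m) above the surface for a constant-lapse-rate atmosphere."""
    return (T0 / GAMMA) * (1.0 - (p / P0) ** (RD * GAMMA / G))

# Example: eta = 0.9985 yields a height of ~12.2 m, matching level 2 in Table 3.
print(round(pressure_to_height(eta_to_pressure(0.9985)), 1))
```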
March 11, 2013
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Figure 1. Pismo Beach (CA) WRF domain map. The entire map illustrates the 36 km
domain, while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
March 11, 2013
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Figure 2. Carpinteria (CA) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
March 11, 2013
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Figure 3. Ventura (CA) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
March 11, 2013
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Figure 4. Cameron (LA) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
March 11, 2013
10
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Figure 5. Oresund (DK) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
March 11, 2013
11
ENVIRON
-------
Final Overwater Dispersion Modeling
Task 2 Protocol
Task 2b: Run AERMOD using WRF solutions
As mentioned in the Task 1 Protocol for this study, there are many different options for the
preparation of WRF-predicted meteorology for use by AERMOD. ENVIRON proposes to
examine and compare the following four options for each ensemble member of each
case:
1. MMIF will be applied to extract and prepare data sets for direct use by AERMOD. All
variables will be as predicted by the WRF simulations including the surface energy
fluxes, surface roughness and planetary boundary layer (PBL) height.
2. As in Option 1), but the PBL height will be re-diagnosed from the wind speed and
potential temperature profiles using the bulk-Richardson algorithm within MMIF.
AERMOD simulations can be very sensitive to the PBL height (ENVIRON, 2012), and the
MMIF-processed PBL height may provide significantly different predicted
concentrations than the PBL height used internally by WRF (a simplified sketch of a
bulk-Richardson diagnosis is given after this list).
3. MMIF will be applied to extract the key meteorological variables of overwater wind
speed, wind direction, temperature, humidity and PBL height. AERCOARE will use
these variables to predict the surface energy fluxes, surface roughness length and
other variables needed for the AERMOD simulations. AERCOARE has a surface
layer scheme developed specifically to predict surface fluxes from overwater
measurements. In this application, the WRF simulations provide the variables that
might be measured by a buoy, ship or offshore platform. AERCOARE can also be
applied using a number of different options. For the current study, we propose to
apply AERCOARE using the defaults recommended in the AERCOARE model
evaluation study (ENVIRON, 2012).
4. As in Option 3), but the PBL height will be re-diagnosed using the Bulk-Richardson
algorithm within MMIF. AERCOARE will be applied as in Option 3.
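As referenced in Option 2, the following is a simplified, illustrative bulk-Richardson diagnosis of the PBL height. It is not MMIF source code; details such as the critical Richardson number, the treatment of virtual temperature, and the interpolation scheme are assumptions here and may differ from MMIF's actual implementation.

```python
import numpy as np

def bulk_richardson_pbl_height(z, theta_v, u, v, ri_crit=0.25):
    """Illustrative bulk-Richardson PBL-height diagnosis (not MMIF source code).

    z       : heights above the surface (m), increasing
    theta_v : virtual potential temperature profile (K)
    u, v    : wind components (m/s)
    ri_crit : assumed critical bulk Richardson number
    """
    g = 9.81
    z = np.asarray(z, float)
    theta_v = np.asarray(theta_v, float)
    wind_sq = np.asarray(u, float) ** 2 + np.asarray(v, float) ** 2
    wind_sq = np.maximum(wind_sq, 1e-3)                # avoid division by zero in calm layers
    rib = g * z * (theta_v - theta_v[0]) / (theta_v[0] * wind_sq)
    above = np.where(rib > ri_crit)[0]
    if above.size == 0:
        return z[-1]                                    # PBL top not found below the model top
    k = above[0]
    if k == 0:
        return z[0]
    # Linear interpolation between the two levels bracketing the critical value.
    frac = (ri_crit - rib[k - 1]) / (rib[k] - rib[k - 1])
    return z[k - 1] + frac * (z[k] - z[k - 1])
```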
Task 2c: Compare and analyze WRF-driven AERMOD predictions against data collected
from the five field studies
ENVIRON will extract meteorological datasets from the WRF simulations for each period of
the field studies in Cameron, Ventura, Pismo Beach, Carpinteria, and Oresund using MMIF.
In Task 2b, MMIF is applied to generate datasets both for AERCOARE processing and
for direct use by AERMOD, bypassing AERCOARE. These two different methods can be used to
contrast the differences between the surface energy fluxes predicted by AERCOARE
versus the internal algorithms selected for the WRF simulations.
AERMOD will simulate the tracer releases for each field study, and the resulting predictions will be
compared to observations using the same statistical procedures as were employed in
previous AERCOARE model evaluation studies (ENVIRON, 2012). We will compare the
model performance of: WRF-driven AERCOARE versus WRF-driven AERMOD; WRF-
driven AERCOARE versus meteorological observation-driven AERCOARE; and WRF-
driven AERCOARE independent of wind-direction versus meteorological observation-driven
March 11,2013 12 ENVIRON
-------
Final Overwater Dispersion Modeling
Task 2 Protocol
AERCOARE3. In addition to concentration predictions, the meteorological predictions from
WRF will be compared to the measurements from the field studies. We will diagnose the
important variables and options that resulted in different predicted concentrations for the
various cases considered. ENVIRON proposes to use a variety of graphical techniques to
represent the tracer versus modeled concentrations as performed in previous OCD, CALPUFF,
and AERCOARE evaluations. Graphical analysis includes Q-Q plots which compare two
probability distributions, in this case predicted versus observed concentration, by plotting their
quantiles against each other with logarithmic scales on the axes. Log-log scatter plots are
employed to evaluate the temporal relationship between observed and predicted concentration.
The former identifies biases in the model related to the magnitude of the concentrations, while
the latter identifies whether the model can explain the observed temporal variability.
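As an illustration of the two graphical comparisons described above, the sketch below builds both plot types. It is a sketch only; axis units, factor-of-2 guide lines, and styling in the report's figures may differ.

```python
import numpy as np
import matplotlib.pyplot as plt

def qq_and_scatter(obs, pred):
    """Log-log scatter plot (values paired in time) and Q-Q plot (sorted distributions)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 5))

    # Scatter: paired values, so the temporal correspondence is preserved.
    ax1.loglog(obs, pred, "k.", alpha=0.6)
    lims = [min(obs.min(), pred.min()), max(obs.max(), pred.max())]
    for factor, style in [(1.0, "k-"), (2.0, "k--"), (0.5, "k--")]:
        ax1.loglog(lims, [factor * x for x in lims], style, linewidth=0.8)  # 1:1 and factor-of-2 lines
    ax1.set_xlabel("Observed concentration")
    ax1.set_ylabel("Predicted concentration")
    ax1.set_title("Scatter (paired in time)")

    # Q-Q: both distributions sorted, so only the magnitudes are compared.
    ax2.loglog(np.sort(obs), np.sort(pred), "k.", alpha=0.6)
    ax2.loglog(lims, lims, "k-", linewidth=0.8)
    ax2.set_xlabel("Observed concentration (quantiles)")
    ax2.set_ylabel("Predicted concentration (quantiles)")
    ax2.set_title("Q-Q (sorted)")
    return fig
```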
Some of the differences and details for Task 2 are discussed in the following subsections for
each of the five field studies.
Pismo Beach. The Pismo Beach experiment was conducted during December 1981 and June
1982. Figure 6, below, illustrates the WRF domain relative to the tracer experiment, while Figure
7 zooms in to illustrate just the tracer experiment setup. The tracer was released from a boat
mast 13.1-13.6 m above the water. Peak concentrations occurred near the shoreline at
sampling distances from 6 to 8 kilometers away. The Pismo Beach evaluation database
consists of 31 sampling periods.
The meteorological data show discrepancies between the air-sea temperature difference and
the lapse rate at times during this field experiment; sometimes the lapse rate indicates a stable
boundary layer while the air-sea temperature difference indicates unstable conditions. Several
previous modeling studies relied on the lapse rate and corrected the air-sea temperature difference
to be at least as stable as indicated by the lapse rate (ENVIRON, 2010). This method will be applied
for this study as well.
3 In previous OCD, CALPUFF, and AERCOARE model evaluation analyses, wind directions were assigned to ensure simulated
plumes were centered on the receptor with the highest prediction. This focused the previous evaluations on plume diffusion
rather than plume transport. The proposed evaluation will also make a distinction between differences caused by apparent
plume transport errors and differences resulting from WRF's predicted boundary layer structure.
March 11,2013 13 ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Figure 6. Pismo Beach WRF 1.33 km domain (solid magenta line), with 5 point grid cell
buffer (dashed magenta line)
March 11, 2013
14
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Figure 7. Pismo Beach tracer release and sampler location map with land use in the background
March 11, 2013
15
ENVIRON
-------
Final
Overwater Dispersion Modeling
Task 2 Protocol
Carpinteria. The Carpinteria experiment was conducted in September and October of 1985.
Studies examined impacts caused by both interaction with complex terrain and shoreline
fumigation. Due to limitations in the AERCOARE-AERMOD approach, only complex terrain data
can be analyzed in this study. Figure 8, below, illustrates the WRF domain relative to the tracer
experiment, while Figure 9 zooms in to illustrate just the tracer experiment setup. Shoreline
receptors on a 20 to 30 meter high bluff are located within 0.8 to 1.5 km of the offshore
tethersonde release. Very light winds were observed during much of the study period. A
constant mixing height in the dataset suggests a problem with the instrumentation. ENVIRON
will use WRF mixing heights as input to AERCOARE in all cases as a best available estimate.
Figure 8. Carpinteria WRF 1.33 km domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line); tracer releases and sampler locations for the complex terrain and
fumigation experiments are also shown; map coordinates are UTM East (km), Zone 11N,
Datum NAS-C.
Figure 9. Carpinteria tracer release and sampler location map with land use in the background;
sampler locations and tracer releases are shown separately for the complex terrain and
fumigation experiments; map coordinates are UTM East (km), Zone 11N, Datum NAS-C.
Cameron. The Cameron experiment includes 26 tracer samples from field studies in July 1981
and February 1982. Tracers were released from a boat and from a low-profile platform (13 m).
As in the Pismo Beach study, the receptors are located in flat terrain near the shoreline, with
transport distances ranging from 4 to 10 km. Figure 10, below, illustrates the WRF domain
relative to the tracer experiment, while Figure 11 zooms in to illustrate just the tracer
experiment setup. Meteorological discrepancies similar to those in the Pismo Beach study exist
here, and the same lapse-rate-based correction of the air-sea temperature difference will be
applied in this analysis.
Figure 10. Cameron WRF 1.33 km domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line); map coordinates are UTM East (km), Zone 15N, Datum NAS-C.
Figure 11. Cameron tracer release and sampler location map with land use in the background;
map coordinates are UTM East (km), Zone 15N, Datum NAS-C.
Ventura. The tracer dispersion study in the Ventura, California area was conducted along the
California coast on 4 days in September 1980 and 4 days in January 1981. Data from all 4 of
the September days and 3 of the 4 January test days are in the dataset. The SF6 tracer was
released about 8 m above the water from a boat located 6-8 km from shore and was sampled
along two arcs, each about 10 to 12 km long. The first arc is 0.25 to 1 km from the shoreline
and the second arc is about 7 km from the shoreline. Figure 12, below, illustrates the WRF
domain relative to the tracer experiment, while Figure 13 zooms in to illustrate just the tracer
experiment setup. Meteorological data used in previous evaluations include the wind at 20.5 m,
the temperature at 7 m, the air-sea temperature difference measured at the release location,
and the vertical temperature gradient measured over the water by aircraft.
Figure 12. Ventura WRF 1.33 km domain (solid magenta line), with 5-point grid cell buffer
(dashed magenta line); map coordinates are UTM East (km), Zone 11N, Datum NAS-C.
Figure 13. Ventura tracer release and sampler location map with land use in the background;
map coordinates are UTM East (km), Zone 11N, Datum NAS-C.
Oresund. The tracer dispersion study over the Strait of Oresund was conducted between the
coasts of Denmark and Sweden on 9 days between May 15 and June 14, 1984. SF6 was
released as a non-buoyant tracer from a tower either 95 m above the ground at Barseback on
the Swedish side of the strait or 115 m above the ground at Gladsaxe on the Danish side, and
was sampled along arcs set up on the opposite shore and at distances of 2-8 km inland. Air-sea
temperature differences were as large as 6-8°C on five of the experiment days, due to warm-air
advection over the cooler water.
Meteorological data in the study included measurements from a lighthouse in the strait,
meteorological towers and masts, SODARs, 3-hourly radiosondes, and occasional minisondes
released from a boat in the strait. Figure 14, below, illustrates the WRF domain relative to the
tracer experiment.
Figure 14. Oresund tracer release and sampler location map with land use in the background;
meteorological stations (radiosonde, surface/mast, tower, SODAR, and sea sites) and a 25 m
terrain contour are also shown; map coordinates are UTM East (km), Zone 33N, Datum EUR-M.
REFERENCES
NOAA/NCDC. (2010, September 15). Retrieved February 15, 2013, from
http://www.ncdc.noaa.gov/oa/climate/rcsg/datasets.htmlWsurface
Angevine, W. (2012). Performance Results with the Total Energy - Mass Flux PBL Scheme.
Retrieved January 22, 2013, from
http://www.mmm.ucar.edu/wrf/users/workshops/WS2012/ppts/3.2.pdf
Doran, J.C., G. S. (1987). Wind and Temperature Structure over a Land-Water-Land Area.
American Meteorological Society.
Environ. (2010). Evaluation of the COARE-AERMOD Alternative Modeling Approach in Support
of Simulation of Shell Exploratory Drilling Sources in the Beaufort and Chukchi Seas.
Hahmann, A. N., Draxl, C., Pena, A., Badger, J., Larsen, X., and Nielsen, J. R. (2011). Simulating
the Vertical Structure of the Wind with the WRF Model.
http://www.mmm.ucar.edu/wrf/users/workshops/WS2011/Power%20Points%202011/5_4_Hahmann_WRFWorkshop_ll.pdf
McNally, D., and Wilkinson, J. G. (2011). Model Application and Evaluation: ConocoPhillips
Chukchi Sea WRF Model Application. Arvada, Colorado: Alpine Geophysics, LLC.
Park, S. B. (2009). The University of Washington Shallow Convection and Moist Turbulence
Schemes and Their Impact on Climate Simulations with the Community Atmosphere
Model. J. Climate, 3449-3469.
Yver, C., G. H.-S. (2012). Evaluating transport in the WRF model along the California Coast.
Atmos. Chem. Phys. Discuss., 16851-16884.
Zhang, J. (2011). Beaufort and Chukchi Seas Mesoscale Meteorology Model Study.
APPENDIX B: REPORT DISK
Volume 2 results can be requested from
Eric Wolvovsky
BOEM/OEP/DEA
Mail Stop: VAM-OEP
45600 Woodland Road
Sterling VA 20166
703-787-1719
Email: eric.wolvovsky@boem.gov