EPA 910-R-15-001a
United States Environmental Protection Agency, Region 10 (Alaska, Idaho, Oregon, Washington)
Office of Environmental Assessment
1200 Sixth Avenue, Seattle, WA 98101
October 2015


              Combined WRF/MMIF/AERCOARE/AERMOD Overwater Modeling
                     Approach for Offshore Emission Sources

                         Volume 1 - Project Report

-------
            Combined WRF/MMIF/AERCOARE/AERMOD Overwater Modeling
                  Approach for Offshore Emission Sources


           Volume 1 - Project Report
           EPA Contract No. EP-W-09-028
      Work Assignment No. M12PG00033R
                             Prepared for:
            U.S. Environmental Protection Agency
                               Region 10
                         1200 Sixth Avenue
                         Seattle, WA 98101

                                    and

                 U.S. Department of the Interior
            Bureau of Ocean Energy Management
                      45600 Woodland Road
                         Sterling, VA 20166

                             Prepared by:
                Ramboll Environ US Corporation
                773 San Marin Drive, Suite 2115
                         Novato, CA 94998

                                    and

                       Amec Foster Wheeler
              Environmental & Infrastructure, Inc.
               4021 Stirrup Creek Dr., Suite 100
                         Durham, NC 27703
                             October 2015

-------
The Region 10 Project Officer for the Interagency Agreement No. M12PGT00033R and EPA
contract number EP-W-09-028 was Herman Wong with technical support provided by Robert
Kotchenruther, PhD and Robert Elleman, PhD. From BOEM, the Project Officer was Eric J.
Wolvovsky and the Technical Coordinator was Ronald Lai, PhD. The Project Lead for the prime
contractor Amec Foster Wheeler was James Paumier, while the Project Lead for subcontractor
Ramboll Environ was Ken Richmond. Peer review of draft Volume 2 and/or draft Volume 3
was provided by Steven Hanna, PhD of Hanna Consultants; Robert Paine, CCM of AECOM; and
Christopher Lindsey, Shell Exploration and Production. Their reviews and comments are greatly
appreciated by R10 and BOEM.
The collaboration study was funded in part  by the U.S. Department of the Interior, Bureau of
Ocean Energy Management, Environmental Studies Program, Washington DC, and the U.S.
Environmental Protection Agency, Region 10, Seattle, WA.

-------
                                   DISCLAIMER
The opinions, findings, conclusions, or recommendations expressed in this report are those of
the authors and do not necessarily reflect the view of the U.S. Environmental Protection Agency
or the U.S. Department of the Interior, Bureau of Ocean Energy Management, nor does the
mention of trade names or commercial products constitute endorsement or recommendation for
use by the Federal Government.

-------
                                    PREFACE
The recommended American Meteorological Society/Environmental Protection Agency
Regulatory Model (AERMOD) dispersion program continues to be studied for assessing air
quality concentration impacts from emission sources located at overwater locations under an
Interagency Agreement (IA) Number M12PGT00033R dated 9 August 2012 between the U.S.
Environmental Protection Agency (EPA), Region 10 and the U.S. Department of the Interior
(DOI), Bureau of Safety and Environmental Enforcement (BSEE) on behalf of the Bureau of
Ocean Energy Management (BOEM). Specifically, the work scope under the IA calls for Region
10 and BOEM to (1) assess the use of AERMOD as a replacement for the Offshore and Coastal
Dispersion (OCD) model in a near-source (< 1,000 meters source-receptor distance) ambient air
quality impact analysis for sea surface based emission sources and (2) evaluate the use of
Weather Research and Forecasting (WRF) model predicted meteorology with AERMOD in lieu
of overwater meteorological measurements from platforms and buoys.
Results of the Region 10/BOEM collaboration study are described in a three volume report.
Volume 1 describes all six tasks completed under the IA. However, only a summary of the work
completed under Task 2 and Task 3 appears in Volume 1. Volume 2 and Volume 3 provide a
detailed description of the work in Task 2 and Task 3, respectively. The six tasks are:
Task 1. Evaluation of two Outer Continental Shelf Weather Research and Forecasting Model
Simulations
Task 2. Evaluation of Weather Research and Forecasting Model Simulations for Five Tracer
Gas Studies with AERMOD
Task 3. Analysis of AERMOD Performance Using Weather Research and Forecasting Model
Predicted Meteorology and Measured Meteorology in the Arctic
Task 4. Comparison of Predicted and Measured Mixing Heights
Task 5. Development of AERSCREEN for Arctic Outer Continental Shelf Application
Task 6. Collaboration Study Seminar
Prior to the collaboration study, Region 10 on 1 April 2011  approved the use of the Coupled
Ocean-Atmosphere Response Experiment (COARE) air-sea flux algorithm with AERMOD to
preprocess overwater measured meteorological data from  platforms and buoys. Initially, the
preprocessing of the overwater measurements was done manually with COARE. Subsequently,
Region 10 funded a study that was completed in September 2012 that coded the COARE air-
sea flux procedure into a meteorological data preprocessor program called AERMOD-COARE
(AERCOARE). The AERCOARE program was uploaded to the EPA Support Center for
Regulatory Atmospheric Modeling  (SCRAM) website on 23 May 2013 as a beta option for case-
by-case approval by EPA regional  offices.

-------
-------
                          TABLE OF CONTENTS
LIST OF FIGURES	IX
LIST OF TABLES	XI
LIST OF ABBREVIATIONS AND ACRONYMS	XIII
1  INTRODUCTION	1
2  TASK 1 - EVALUATION OF TWO OUTER CONTINENTAL SHELF WEATHER
   RESEARCH AND FORECASTING MODEL SIMULATIONS	5
   2.1   Introduction	5
   2.2   Statement of Work	5
   2.3   Analysis	7
   2.4   Results	8
     2.4.1     Updates to MMIF	8
     2.4.2     Suitability of the two WRF datasets	9
3  TASK 2 - EVALUATION OF WEATHER RESEARCH AND FORECASTING
   MODEL SIMULATIONS FOR FIVE TRACER GAS STUDIES WITH AERMOD	11
4  TASK 3 - ANALYSIS OF AERMOD PERFORMANCE USING WEATHER
   RESEARCH AND FORECASTING MODEL PREDICTED METEOROLOGY AND
   MEASURED METEOROLOGY IN THE ARCTIC	17
5  TASK 4 - COMPARISON OF PREDICTED AND MEASURED MIXING
   HEIGHTS	25
   5.1   Overview and Objective	25
     5.1.1     Upper-air and surface observations	25
     5.1.2     WRF simulations	27
   5.2   Analysis	27
   5.3   Results	29
     5.3.1     FNMOC WRF evaluated using Endeavor Island profiler	29
     5.3.2     RTG WRF evaluated using JAMSTEC soundings	30
     5.3.3     FNMOC WRF evaluated using Point Barrow soundings	31
   5.4   Conclusions	31

-------
6  TASK 5 - DEVELOPMENT OF AERSCREEN FOR ARCTIC OUTER
   CONTINENTAL SHELF APPLICATIONS	51
   6.1    Introduction	51
   6.2    Approach and Methodology	52
   6.3    Overview	52
   6.4    Methods	53
      6.4.1    AERCOARE input requirements	53
      6.4.2    AERCOARE overwater datasets	55
      6.4.3    Emission sources	61
      6.4.4    Receptor grid	62
      6.4.6    AERSCREEN simulations	66
   6.5    Results	66
   6.6    Recommendations for Future Work	69
7  TASK 6 - COLLABORATION STUDY SEMINAR	85
   7.1    Introduction	85
   7.2    Session Overviews	85
      7.2.1    Day 1, Morning	85
      7.2.2    Day 1, Afternoon	86
      7.2.3    Day 2, Morning	87
      7.2.4    Day 2, Afternoon	87
8  REFERENCES	91
APPENDIX A: PROTOCOLS
APPENDIX B: PEER REVIEW COMMENTS FOR VOLUME 2 AND VOLUME 3

-------
                                 LIST OF FIGURES

Figure 1. Alpine/CP 36-12-4 km domains, covering the Chukchi Sea	6
Figure 2. UAF/BOEM 10-km domain, covering the Chukchi and Beaufort Seas	6
Figure 3. Overwater meteorological measurement sites and corresponding WRF inner domain
extraction points	18
Figure 4. Source locations and structures and innermost receptor ring	20
Figure 5. Location of the profiler station on Endeavor Island	26
Figure 6. The K&Z profiler (left) and a view of the station looking North (right)	26
Figure 7. Profiler retrievals and WRF soundings for 2010-07-27 07:00 LST	33
Figure 8. Profiler retrievals and WRF soundings for 2010-08-20 16:00 LST	34
Figure 9. Profiler retrievals and WRF soundings for 2010-08-16 17:00 LST	35
Figure 10. WRF's PBLH vs hand-analyzed inversion base (ZiBase)	36
Figure 11. MMIF's Critical Bulk Richardson mixing height vs. hand-analyzed inversion base
(ZiBase)	37
Figure 12. WRF's PBLH vs. the Critical Bulk Richardson mixing height from the profiler	38
Figure 13. Critical Bulk Richardson mixing heights from MMIF's vs. from the profiler	39
Figure 14. MMIF's Critical Bulk Richardson mixing height vs. AERMOD's mechanical mixing
height	40
Figure 15. MMIF's Critical Bulk Richardson mixing height vs. AERMOD's convective mixing
height	41
Figure 16. Locations of the JAMSTEC 2009 soundings	42
Figure 17. RTG WRF and JAMSTEC profiles for 2009-09-11 06:00 UTC and 2009-09-28 12:00
UTC	43
Figure 18. RTG WRF and JAMSTEC profiles for 2009-09-30 18:00 UTC and 2009-10-03 18:00
UTC	44
Figure 19. RTG WRF and JAMSTEC profiles for 2009-10-07 18:00 UTC and 2009-10-09 09:00
UTC	45
Figure 20. RTG WRF and JAMSTEC profiles for 2009-10-10 06:00 UTC and 2009-10-10 18:00
UTC	46
Figure 21. JAMSTEC observed vs. WRF-MMIF mixing heights, both derived using the critical
bulk Richardson number technique (4km in red, 12km in blue)	47

-------
Figure 22. FNMOC WRF and Pt. Barrow profiles for 2010-08-14 12:00 UTC and 2010-08-28
12:00 UTC	48
Figure 23. FNMOC WRF and Pt. Barrow Profiles for 2010-09-20 12:00 UTC and 2011-06-27
12:00 UTC	49
Figure 24. Hypothetical Drill Ship Layout and Emission Source Locations	62
Figure 25. Screening Modeling Receptor Grid	63
Figure 26. Roster of Attendees	88
Figure 27. Agenda	89

-------
                                LIST OF TABLES

Table 1. Model configuration for UAF/BOEM and Alpine/CP WRF simulations	7
Table 2. WRF AERMOD Meteorology Extraction Methods	12
Table 3. Hypothetical Drill Ship Emission Sources	19
Table 4. Data Dependent AERCOARE Options	53
Table 5. Task 3 AERCOARE Data Requirements	54
Table 6. Additional Required AERCOARE Input Parameters (Control File)	55
Table 7. Review of Task 3 AERCOARE Overwater Data	57
Table 8. Ambient Air and Sea Surface Temperatures at Selected Arctic Buoys	59
Table 9. Initial Meteorological Screening Values (COARESCREEN1)	60
Table 10. Second Set of Meteorological Screening Values (COARESCREEN2)	61
Table 11. Third Set of Meteorological Screening Values (COARESCREEN3)	61
Table 12. AERMOD Simulations Identified By Meteorological Data Set	65
Table 13. AERSCREEN Input Values	66
Table 14. Percent of Model Runs Where Screening Results Are Conservative Compared to
Refined Modeling Results	67
Table 15. Ratio of COARESCREEN-to-AERSCREEN 1-hour H1H Concentrations	68
Table 16. Comparison of Screening and Refined Modeling for Release Point S1P1	70
Table 17. Comparison of Screening and Refined Modeling for Release Point S1P2	71
Table 18. Comparison of Screening and Refined Modeling for Release Point S2P1	72
Table 19. Comparison of Screening and Refined Modeling for Release Point S2P2	73
Table 20. Comparison of Screening and Refined Modeling for Release Point S2P3	74
Table 21. Comparison of Screening and Refined Modeling for Release Point S3P1	75
Table 22. Comparison of Screening and Refined Modeling for Release Point S3P2	76
Table 23. Comparison of Screening and Refined Modeling for Release Point S3P3	77
Table 24. Comparison of Screening and Refined Modeling for Release Point S4P1	78
Table 25. Comparison of Screening and Refined Modeling for Release Point S4P2	79
Table 26. Comparison of Screening and Refined Modeling for Release Point S4P3	80
Table 27. Comparison of Screening and Refined Modeling for Release Point S5P1	81
Table 28. Comparison of Screening and Refined Modeling for Release Point S5P2	82

-------
Table 29. Comparison of Screening and Refined Modeling for Release Point S5P3	83
Table 30. Comparison of Screening and Refined Modeling for All Release Points Combined...84

-------
                  LIST OF ABBREVIATIONS AND ACRONYMS
AERC	WRF meteorology extraction cases processed by AERCOARE
AERMIC	American Meteorological Society/Environmental Protection Agency
                  Regulatory Model Improvement Committee
AERMOD	American Meteorological Society/Environmental Protection Agency
                  Regulatory Model
AERCOARE	AERMOD-COARE
ASTD	Air-Sea Temperature Difference
AIDJEX	Arctic Ice Dynamics Joint Experiment

β	Bowen ratio
BOEM	Bureau of Ocean Energy Management
BSEE	Bureau of Safety and Environmental Enforcement

c	Model constant
c0	Observed concentration value
cp	Predicted concentration value
c̄	Average concentration value
cn	nth highest concentration
°C	Degrees centigrade
COARE	Coupled Ocean-Atmosphere Response Experiment

DOI	U.S.  Department of the Interior

ECMWF	European Center for Medium-Range Weather Forecasts
EPA	U.S.  Environmental Protection Agency
ERA	ECMWF Reanalysis
ERA-40	ERA 45-year global atmospheric reanalysis
ERA-I	ERA Interim
eta	Vertical pressure coordinate in WRF

f	Coriolis parameter
FF2	Fraction-factor-of-two
FNMOC	Fleet Numerical Meteorology and Oceanography Center

g	Grams

H	Sensible heat flux

ISC3	Industrial Source Complex 3

K	kelvin
kg	Kilograms
km	Kilometers

-------
L	Monin-Obukhov length
LCC	Lambert Conformal Conic

m	Meters
METSTAT	Meteorological Statistics
MG	Geometric mean bias
MIXH	PBL height or "mixing height"
MMIF	Mesoscale Model Interface
MYJ	Mellor-Yamada-Janjic

NAAQS	National Ambient Air Quality Standards
NARR	North American Regional Reanalysis
NDBC	National Data Buoy Center
NCAR	National Center for Atmospheric Research
NCEP	National Centers for Environmental Prediction
NOAA	National Oceanic and Atmospheric Administration
NSR	New Source Review

O	Observed value
OBS, obs	Label for observation-based AERMOD simulations
OCD	Offshore and Coastal Dispersion
OCS	Outer Continental Shelf
OLM	Ozone Limiting Method

P	Sea Level Atmospheric Pressure (also used to indicate "predicted" value
                   in statistical calculations).
p	Predicted value
PBL	Planetary boundary layer
PFL	Profile file input to AERMOD
PRIME	Plume Rise Model  Enhancements
PSD	Prevention of Significant Deterioration
PVMRM	Plume Volume Molar Ratio Method

Q-Q	Quantile-Quantile

r	Albedo
RCALF	Label for WRF-MMIF AERMOD simulations where the PBL height was not
                   recalculated by MMIF
RCALT	Label for WRF-MMIF AERMOD simulations where the PBL height was
                   recalculated by MMIF
rg	Geometric Correlation Coefficient
RH	Relative humidity
RHC	Robust  High Concentration

-------
RMSE	Root Mean Square Error
RPO	Regional Planning Organization
RTG	Real Time Global sea-surface temperature analysis (from NCEP)

s	Seconds
SFC	AERMOD surface meteorology input file
SST	Sea Surface Temperature

T	Temperature
TKE	Turbulent Kinetic Energy, Thermal Kinetic Energy
TMS	Total Model Score statistical measure

U	Zonal wind component
UW-PBL	University of Washington Shallow Convection PBL
u*	Friction velocity

V	Meridional wind component
VG	Geometric Variance
VPTLR	Virtual Potential Temperature Lapse Rate

w*	Convective scaling velocity
W	Watts
WD	Wind Direction
WRF	Weather Research and Forecasting
WS	Wind Speed

YSU	Yonsei University

z	Height above the surface
z0	Roughness length
zic	Convective PBL height
zim	Mechanical PBL height

°	Degrees angular
μg	Geometric mean
σθ	Standard deviation of wind direction
σw	Standard deviation of vertical wind speed
ψ	Stability correction parameter

-------
-------
1      INTRODUCTION
Air quality modeling and impact assessment must be conducted for the New Source Review
(NSR) of significant sources of air pollutant emissions as promulgated by the U.S.
Environmental Protection Agency (EPA) and the U.S. Department of the Interior (DOI), Bureau
of Ocean Energy Management (BOEM). Given the recent and likely continued expansion of oil
and mineral exploration and extraction activities along the Outer Continental Shelf (OCS) off the
coast of Alaska and in other marine locations (e.g., mid-latitudes and tropics), there will
continue to be growing demand for air quality permits and
exploratory/development plans related to such activities. The EPA and BOEM must therefore
provide modeling tools that can adequately assess air quality impacts over the OCS and other
overwater regions.
The American Meteorological Society/Environmental Protection Agency Regulatory Model
(AERMOD) modeling system (USEPA, 2004c) is the preferred near-field (< 50 kilometers [km])
model used for the air quality assessment requirements of air emissions permitting1. However,
AERMOD's meteorological preprocessor, AERMET (USEPA, 2004a) was not designed to
process meteorological conditions over ocean waters and in more extreme climates. Over land,
energy fluxes are strongly driven by the diurnal cycle of heating and cooling. Over water, fluxes
are more dependent on air-sea temperature differences that are only slightly affected by diurnal
heating and cooling. In addition, the meteorological observations necessary to drive the
dispersion models are often not available, especially in the Arctic Ocean. For applications  in the
Arctic, the remote location and seasonal sea-ice pose significant logistical problems for the
deployment of buoys and other offshore measurement platforms. AERMAP, the AERMOD terrain
preprocessor, is likewise not applicable at overwater locations.
The dispersion model currently preferred by the EPA for offshore assessment of emission
sources is the Offshore and Coastal Dispersion (OCD) model (DiCristofaro & Hanna, 1989), as
promulgated under 40 CFR Part 51, Appendix W. However, OCD lacks the features required for
robust modern environmental assessment. OCD does not contain the PRIME downwash
algorithm (Schulman, et al., 2002), the Plume Volume Molar Ratio Method (PVMRM) (Hanrahan,
1999), or the Ozone Limiting Method (OLM) (Cole & Summerhays, 1979), nor the capability to
calculate receptor-averaged percentiles of sulfur dioxide (SO2), nitrogen dioxide
(NO2), and particulate matter less than or equal to 2.5 microns (PM2.5) concentrations for a
compliance demonstration.
State-of-the-art overwater parameterization schemes are used in the  Coupled Ocean-
Atmosphere Response Experiment (COARE) air-sea flux algorithms.  These algorithms have
1 As promulgated under 40 CFR Part 51, Appendix W. The AERMOD modeling system is available to the
public at the EPA modeling website: http://www.epa.gov/scram001/dispersion_prefrec.htm

-------
been used to develop the AERMOD-COARE (AERCOARE)2 model (USEPA, 2012), a
counterpart to AERMET, to preprocess overwater observational data. AERCOARE takes air-sea
temperature difference and other features of marine influence into account to compute the
meteorological fields required for AERMOD modeling over the open water and coastal
environments. AERCOARE-AERMOD (using the current beta version of AERCOARE) has been
approved by EPA Region 10, with concurrence from the EPA Model Clearinghouse, as an acceptable
alternative approach for modeling emission sources located in Arctic, mid-latitude, and
tropical overwater environments. Use of the model still requires a procedural protocol in
accordance with Appendix W and review and acceptance by the appropriate EPA regional office
on a case-by-case basis (Tikvart, 1988; Bridgers, 2011; Wong, 2011).
Currently accepted overwater dispersion modeling methods require observational datasets.
These datasets are generally provided by meteorological buoys or instruments on platforms.
However, the observational coverage of the earth's oceans is sparse. It would be advantageous
if output from mesoscale meteorological models, such as the Weather Research and
Forecasting (WRF) model (NCAR, 2014) (Skamarock, et al., 2008), could be used to provide
hourly prognostic meteorological data for AERMOD in areas where observational data are
lacking.
To evaluate the use of prognostic meteorological data with AERMOD, the collaboration study
consists of six tasks, of which Task 2 and Task 3 contain the evaluations and analyses. Prior to
carrying out the work, protocols for Tasks 1, 2, 3, and 5 were submitted to Region 10 and
BOEM for acceptance. The protocols appear in Appendix A.  The Task 4 work followed the  work
scope in the proposal. An agenda was used  in lieu of a protocol for Task 6. The six tasks are:
Task 1. Evaluation of two Outer Continental Shelf Weather Research and Forecasting Model
Simulations
Task 2. Evaluation of Weather Research and Forecasting Model Simulations for Five Tracer
Gas Studies with AERMOD
Task 3. Analysis of AERMOD Performance Using Weather Research and Forecasting Model
Predicted Meteorology and Measured Meteorology in the Arctic
Task 4. Comparison of Predicted and Measured Mixing Heights
Task 5. Development of AERSCREEN for Arctic Outer Continental Shelf Application
Task 6. Collaboration Study Seminar
2 AERCOARE is publicly available from the U.S. EPA at the website:
http://www.epa.gov/ttn/scram/dispersion_related.htm

-------
In this Volume 1, the work performed for all six tasks is presented. However, only summaries
of the work completed under Task 2 and Task 3 are provided. Volume 2 provides detailed
comparisons of WRF-driven AERMOD predictions against the concentrations measured during
five offshore tracer dispersion field experiments. Volume 3 summarizes the evaluation of
alternative methods for supplying meteorological variables to AERMOD for regulatory air quality
modeling of sources located over the ocean.  Volume 2 and Volume 3 were submitted for peer
review. The response to the peer review comments for these two volumes is provided in
Appendix B.

-------
-------
2     TASK 1 - EVALUATION OF TWO OUTER CONTINENTAL SHELF WEATHER
      RESEARCH AND FORECASTING MODEL SIMULATIONS

2.1   Introduction

Task 1 of the collaboration study examines two existing WRF datasets for the Arctic Ocean that
might be used to provide the necessary meteorological variables for dispersion model
simulations of OCS sources within their domains. The task objective is to examine the
differences between the two datasets, examine model performance with overwater
measurements, apply the Mesoscale Model Interface (MMIF) program and AERCOARE to the
datasets using several different options, and compare AERMOD model predictions from the
resulting datasets using simulations of typical OCS sources. The protocol for Task 1 is in
Appendix A.

2.2   Statement of Work

At the time, there were two existing WRF simulations of the North Slope of Alaska. The first was
generated  by Alpine Geophysics, LLC (Alpine) under contract to ConocoPhillips (CP) which was
subsequently submitted to EPA Region 10 in support of a Prevention of Significant Deterioration
(PSD) application (McNally and Wilkinson, 2011). This PSD application covered lease blocks in
the Chukchi Sea. The second WRF simulation was developed by the University of Alaska,
Fairbanks (UAF) under contract to BOEM. The "Chukchi/Beaufort Seas Mesoscale Meteorology
Modeling Study" (MMM) produced a 31-year WRF simulation designed to support oil spill risk
assessments (Zhang, 2013). UAF first completed an initial 5-year simulation covering the period
from 2005 to 2009, with the final 31-year dataset to follow.
The three subtasks described in the protocol, found in Appendix A,  are:

   •  Generate AERMOD results from the two different WRF datasets
   •  Compare AERMOD results from the two different WRF datasets
   •  Compare the WRF runs

The modeling  domains for the Alpine/CP and the UAF/BOEM WRF simulations are shown in
Figure 1 and Figure 2, respectively. Table 1 summarizes the key WRF model options and input
data sources for the two simulations.
During the first subtask, MMIF would preprocess the WRF simulations to generate four
meteorological data sets. For the two direct meteorological data inputs into AERMOD, MMIF is
run separately to output a set of surface and profile files with and without rediagnosed planetary
boundary layer (PBL) heights. Similarly, for the two indirect meteorological data inputs into
AERMOD, MMIF is run separately to produce a data file with and without rediagnosed PBL
heights that is read by AERCOARE (USEPA, 2012) to output two sets of surface and profile
files. (See Section 3 for more discussion.) Using hypothetical yet typical overwater emission
sources at five (5) buoy locations, AERMOD would be run using the above four meteorological
files and the buoy observations processed with AERCOARE. The AERMOD results  would be
examined,  and the source of any discrepancies sought and explained.

-------
      Figure 1. Alpine/CP 36-12-4 km domains, covering the Chukchi Sea.

Figure 2. UAF/BOEM 10-km domain, covering the Chukchi and Beaufort Seas.

-------
      Table 1. Model configuration for UAF/BOEM and Alpine/CP WRF simulations.

Parameter                  UAF/BOEM WRF                            Alpine/CP WRF
Model Grid                 10 km with 49 vertical levels           36-12-4 km with 37 vertical levels
Forcing Data               ERA interim reanalysis (ERA-I)          GFS 1/4 degree dataset
PBL                        MYJ                                     YSU
Microphysics               Morrison                                Morrison
LW/SW Radiation            RRTM/RRTMG                              RRTM/RRTMG
Surface Layer Physics      ETA similarity                          MM5 similarity
Land-Surface Model         NOAH w/ improved sea ice albedo         NOAH
Cumulus Physics            Kain-Fritsch                            Kain-Fritsch (36-12 km only)
Data Assimilation          In situ surface obs., radiosondes,      Nudging to MADIS data on 4-km domain
(obs. nudging)             QuikSCAT sfc winds, MODIS profiles,     with a radius of influence of 50 km.
                           COSMIC profiles.
Analysis Nudging           Three-wavenumber spectral nudging       36 & 12 km for winds and temperature
                           of all variables and all levels.        at all model levels.
Lower Boundary             AMSR-E sea ice thickness and            GFS initialized using static SST for each
                           concentration, and CMC snow depth.      5-day block. SST from NCEP RTG 1/12
                                                                   degree analysis.
Vertical Velocity Damping  Off                                     On
Advection                  Positive-definite                       Monotonic
Simulation Period          2005-2009                               2007-2009 (June 15 - December 3 only)
2.3    Analysis
After the first MMIF runs were completed using the UAF/BOEM initial 5-year dataset, peculiar
values were noticed every three hours for certain variables in the AERMET-like output. For
example, the PBL height was seemingly reasonable for two consecutive hours of simulated
data before it rose to an unreasonable value near 20 kilometers (km) for the third hour, then

-------
back to reasonable values for two hours, and so on. Precipitation fields in the WRF output had
similarly unexpected values every three hours.
Upon investigating these values, it was found that UAF was not performing traditional hindcast
runs with WRF, but was using the variational data assimilation version of WRF (WRF Data
Assimilation System [WRFDA]) to perform a reanalysis every three hours. WRFDA was being
run to generate analyses at a three-hour interval, which in turn were being used to initialize a
two-hour WRF run to fill in the hours until the next WRFDA reanalysis. The PBL height problem
stems from differences between WRFDA and WRF. WRFDA outputs prognostic variables such
as wind speed, pressure, temperature, and specific humidity in the same way that WRF does,
but does not output diagnostic variables, such as PBL height, 2-meter (m) temperature, or
friction velocity (u*), in the same way as WRF.

The final 31-year version of the UAF/BOEM WRF simulation did not use this hybrid
WRFDA+WRF approach. Instead, UAF performed a WRFDA reanalysis for each hour, then ran
the output through WRF for a few time steps, just enough for WRF to diagnose parameters
such as the PBL height. Presumably, each WRFDA run covered some hours before the
analysis time to allow the model to "spin up" finer-scale fields such as potential vorticity. Typical
hindcast WRF runs discard the first 12-24 hours of a simulation to account for "spin-up".
The Alpine/CP WRF dataset includes output from each (nested) domain and was briefly
investigated for the differences in AERMOD output using the 12 km grid spacing data and the
4 km grid spacing data as  input. The differences were unremarkable, and highlighted certain
deficiencies in MMIF noted in the next section.

2.4   Results

2.4.1  Updates to MMIF
During the course of Task  1,  MMIF was modified in two ways. First, previous versions of MMIF
in AERCOARE mode would print an error statement when the extraction point was over a solid
surface as detected by the land use category for the point. This caused MMIF to stop when the
extraction  point was covered with ice. During the early summer ice melt period, the sea ice
coverage fraction can vary at a particular point, causing havoc in an MMIF run. MMIF was
changed to print a warning instead of exiting with an error. Because the warning is printed for
each hour processed, the time stamp when the dominant land use category switches from ice to
ocean can easily be found.
Second, older versions of MMIF passed through all wind speeds when run in AERMOD
("direct") mode. Even wind speeds as low as 0.00001 meters/second (m/s) were passed to
AERMOD, which then calculated a correspondingly  high concentration. AERCOARE has the
ability to set a minimum wind speed as an option, and on 8 March 2013 EPA issued a
clarification memo discussing the use of a 0.5 m/s minimum wind speed threshold for the
AERMOD  modeling system (USEPA, 2013). An option setting the minimum wind speed for
AERMOD  related processing was therefore added to MMIF, with a default value of 0.5 m/s.

-------
Two additional updates to MMIF occurred after the end of work on Task 1 and are noted here.
The minimum mixing height that MMIF would produce had been set to the middle of the lowest
model layer, consistent with CALPUFF requirements. Following the recommendations for
AERCOARE settings, an optional minimum mixing height of 25 m was added to MMIF. Similarly
and also following the AERCOARE recommendations, an option controlling the minimum
absolute value of the Monin-Obukhov length was added, with a default value of 5 m (|L| > 5 m).
These were required to be able to make a fair comparison between WRF+MMIF+AERMOD
output and WRF+MMIF+AERCOARE+AERMOD output as described in the following Section 3.
More information on these additional changes to MMIF can be found in Volume 3, the full report
for Task 3 (Analysis of AERMOD Performance Using Weather Research and Forecasting Model
Predicted and Measured Meteorology in the Arctic).
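To make the combined limits concrete, the following minimal Python sketch applies the 0.5 m/s
minimum wind speed, the 25 m minimum mixing height, and the |L| > 5 m floor to a single hour of
data. The function and variable names are illustrative assumptions only and are not part of MMIF
or AERCOARE.

# Illustrative sketch only; names and structure are assumptions, not MMIF/AERCOARE code.
MIN_WIND_SPEED = 0.5            # m/s, per the 2013 EPA clarification memo default
MIN_MIXING_HEIGHT = 25.0        # m, per the AERCOARE recommendations
MIN_ABS_MONIN_OBUKHOV = 5.0     # m, floor on |L|

def apply_overwater_limits(wind_speed, mixing_height, monin_obukhov_length):
    """Clamp one hour of met data to the minimum values discussed above."""
    wind_speed = max(wind_speed, MIN_WIND_SPEED)
    mixing_height = max(mixing_height, MIN_MIXING_HEIGHT)
    if abs(monin_obukhov_length) < MIN_ABS_MONIN_OBUKHOV:
        # Keep the sign so the stability regime (stable vs. unstable) is preserved.
        sign = 1.0 if monin_obukhov_length >= 0.0 else -1.0
        monin_obukhov_length = sign * MIN_ABS_MONIN_OBUKHOV
    return wind_speed, mixing_height, monin_obukhov_length

# Example: a nearly calm, shallow, slightly unstable hour.
print(apply_overwater_limits(0.0001, 8.0, -2.0))   # -> (0.5, 25.0, -5.0)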

2.4.2  Suitability of the two WRF datasets
The Alpine/CP WRF 4 km domain dataset does not extend far enough east to cover the
Beaufort Sea where many potential lease blocks exist and several have already been leased.
Additionally, the Alpine/CP WRF dataset covers only the open water period of three years and
could not be used in dispersion modeling assessments of permanent sources. For these
reasons, it was deemed unsuitable for the long term needs of this project and future permitting.
Due to the problems in the UAF/BOEM initial 5 year WRF dataset noted above  in  Section 2.3, it
was also deemed unsuitable for the long term needs of this project and future permitting. The
decision was made to pursue new hindcast WRF modeling under Task 3, and not wait for the
final UAF/BOEM 31 year dataset to become available.
In addition, the smallest WRF mesh size was 10 km. Potential sources, especially in the
Beaufort Sea, would likely be within 2 - 3 grid points or closer to the shoreline. In order to better
resolve the open water transport between source and the shoreline, a simulation with a finer
mesh size is desirable. The inner domain of the WRF modeling described in Task 3 has a mesh
size of 4 km.

-------
-------
3     TASK 2 - EVALUATION OF WEATHER RESEARCH AND FORECASTING
      MODEL SIMULATIONS FOR FIVE TRACER GAS STUDIES WITH AERMOD
This section summarizes the Task 2 tracer dispersion studies reported in Volume 2, presenting
the methodology used for each of the elements of the task, describing the results of the
performance evaluations, and analyzing how modeling options affect model performance.  The
protocol for this task is in Appendix A.
The purpose of the task is to provide evidence to help answer some of the following questions:
    •  How well does WRF predict overwater surface meteorology?
    •  Are pollutant concentrations predicted by AERMOD driven by WRF meteorology as
      conservative as those predicted by AERMOD driven by observations (processed through
      AERCOARE)?
    •  What WRF modeling configurations and  meteorology extraction methods provide the
      best AERMOD inputs, based on the most accurate AERMOD predictions?
    •  How sensitive is AERMOD to differences between the WRF extracted meteorology and
      observations for simulations of typical offshore sources?
To answer these questions five historical tracer dispersion field studies were selected for this
task:
    •  Ventura, CA: September 1980 and January 1981;
    •  Pismo  Beach, CA: December 1981 and June 1982;
    •  Cameron, LA: July 1981  and February 1982;
    •  Oresund (between Denmark and Sweden): May/June 1984; and
    •  Carpinteria, CA: September 1985.
The four North American studies have been used for previous off-shore dispersion model
development.  The tracer experiment datasets are well known to EPA and have a history of use
for model benchmark testing and development. The Cameron and Pismo Beach studies provide
tracer measurements for simple level terrain near the coastline and are useful for analyzing
model performance with marine influence only. The Ventura study also involves simple flat
terrain, but the receptors are located 500 m to 1  km inland from the shoreline. The Carpinteria
study involved short distance, low wind transport conditions and receptors located on tall bluffs
along the shoreline. The Oresund study involved longer transport distances (25 km to 40 km)
with tracer releases on both sides of the Oresund strait separating Denmark and Sweden.
Mesoscale model results generally have some bias and no single configuration provides the
best simulation in all circumstances. Since a single choice of model setup represents a single
deterministic solution, the accuracy and variability of the WRF model are critical to evaluate its
success as an input to downstream dispersion models. The modeling of each historical field
study was conducted as an ensemble of simulations  using two reanalysis data sets and three
PBL schemes, resulting in six possible combinations of reanalysis and PBL scheme. The two
reanalysis input data sets were:  1) European Center for Medium Range Weather Forecasts

-------
(ECMWF) Reanalysis Project (ERA) reanalysis data and 2) North American Regional
Reanalysis (NARR). The PBL schemes were: 1) Yonsei University (YSU) (Hong et al., 2006); 2)
Mellor-Yamada-Janjic (MYJ) (Mellor & Yamada, 1982; Janjic, 1994); and 3) University of
Washington Shallow Convection (Bretherton & Park, 2009).
AERMOD requires two input files of meteorology: a file of scalar values (SFC file) and a file of
multi-level values (PFL file). The files can be generated for AERMOD directly or indirectly by
MMIF using the fields available in the WRF output files. Four extraction methods were used to
generate the necessary files for AERMOD:
   1. MMIF was applied to extract and prepare data sets for direct use by AERMOD (MMIF
      produces the AERMOD SFC and PFL input files directly). The PBL height predicted by
      WRF is used in the SFC file.
   2. Same as Method 1, but the PBL height was rediagnosed from the wind speed and
      potential temperature profiles using the bulk Richardson algorithm within MMIF.
   3. MMIF was applied to extract the key meteorological variables of overwater wind speed,
      wind direction, temperature, humidity, and PBL height from WRF results. The MMIF
      extracted data were used to build an AERCOARE input file. AERCOARE used these
      variables to predict the surface energy fluxes, surface roughness length, and other
      variables needed for the AERMOD simulations. For this task, AERCOARE was applied
      using the defaults recommended in the AERCOARE model evaluation study (Richmond
      & Morris, 2012).
   4. Same as for Method 3, but the PBL height was rediagnosed using the bulk Richardson
      algorithm within MMIF.
The naming convention and description of the four extraction methods are listed in Table 2.
Note that "RCALT" refers to extractions with MMIF and rediagnosed PBL height and "RCALF"
refers to direct use of the WRF PBL height (with the minimum 25 m PBL height applied). Also, note
that "AERC" refers to simulations with additional AERCOARE processing after MMIF extraction,
and "MMIF" refers to simulations using the WRF meteorology directly without further processing
by AERCOARE.

                Table 2. WRF AERMOD Meteorology Extraction Methods.

WRF Extraction Method    Process Path
1) MMIF.RCALF            WRF → MMIF → AERMOD
2) MMIF.RCALT            WRF → MMIF (with PBL diagnosis) → AERMOD
3) AERC.RCALF            WRF → MMIF → AERCOARE → AERMOD
4) AERC.RCALT            WRF → MMIF (with PBL diagnosis) → AERCOARE → AERMOD
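The critical bulk Richardson number rediagnosis used in Methods 2 and 4 can be sketched as
follows. The sketch uses the common textbook form of the bulk Richardson number with an
assumed critical value of 0.25; MMIF's actual implementation details (surface reference values,
interpolation between levels, critical value) may differ, so treat this only as an illustration of the
idea.

def bulk_richardson_mixing_height(z, theta_v, u, v, ri_crit=0.25):
    """Lowest height (m AGL) where the bulk Richardson number exceeds ri_crit.

    z       : level heights above ground (m), ordered upward from the surface
    theta_v : virtual potential temperature at each level (K)
    u, v    : wind components at each level (m/s)
    """
    g = 9.81
    for k in range(1, len(z)):
        shear_sq = u[k] ** 2 + v[k] ** 2
        if shear_sq == 0.0:
            continue
        ri_b = g * z[k] * (theta_v[k] - theta_v[0]) / (theta_v[0] * shear_sq)
        if ri_b > ri_crit:
            return z[k]      # first level decoupled from the surface
    return z[-1]             # profile mixed through the top of the data

# Example: a stable Arctic profile capped near 200 m.
heights = [10.0, 50.0, 100.0, 200.0, 400.0]
theta_v = [271.0, 271.2, 271.5, 274.0, 277.0]
u_wind  = [3.0, 4.0, 5.0, 6.0, 7.0]
v_wind  = [0.0, 0.0, 0.0, 0.0, 0.0]
print(bulk_richardson_mixing_height(heights, theta_v, u_wind, v_wind))   # 200.0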

-------
For the measurement-based simulations, AERCOARE was applied using default options for
surface roughness, warm-layer heating, and cool-skin effects; observed PBL heights for the
convective PBL height; mechanical PBL heights using the Venkatram option (Venkatram, 1980);
a minimum PBL height of 25 m; and a minimum |L| of 5 m. For these cases, AERMOD
simulations were performed with ("Case 1")3 and without ("Case 2") the measured standard
deviation of wind direction (σθ), that is,

    •  Case 1: Require Abs(L) > 5 m, use σθ measurements, and use the Venkatram equation
       for zim and require zim > 25 m.
    •  Case 2: Require Abs(L) > 5 m, use AERMOD-predicted σθ, and use the Venkatram
       equation for zim and require zim > 25 m.
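For reference, the Venkatram (1980) mechanical mixing height option is often written as
zim ≈ 2300 u*^1.5 (zim in meters, u* in m/s). The sketch below combines that relation with the 25 m
floor noted above; the constant and any smoothing applied inside AERCOARE/AERMOD should be
confirmed against the model documentation, so this is only an illustration.

def venkatram_mechanical_mixing_height(u_star, z_min=25.0):
    """Equilibrium mechanical mixing height (m) from friction velocity u* (m/s).

    Uses the commonly cited Venkatram (1980) relation zim ~ 2300 * u***1.5,
    with the 25 m floor applied per the options described above.
    """
    return max(2300.0 * u_star ** 1.5, z_min)

for u_star in (0.05, 0.15, 0.40):
    print(u_star, round(venkatram_mechanical_mixing_height(u_star), 1))
# 0.05 -> 25.7, 0.15 -> 133.6, 0.40 -> 581.9 (meters)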

A total of 3,151 AERMOD simulations were conducted to account for the various combinations
consisting of:
    •  Five tracer experiments, each tracer release case simulated separately,
    •  Six WRF configurations,
    •  Four WRF-meteorology extraction methods,
    •  Measurement-based AERMOD simulations using Case 1 and Case 2 options.
WRF performance was assessed in two ways: quantitatively by computing statistics relating
WRF-predicted surface meteorology to observed values and qualitatively by graphical
comparison of extracted WRF meteorology to observed values. The quantitative analysis was
conducted using the publicly available software METSTAT (ENVIRON Int. Corp., 2014).
METSTAT calculates a suite of model performance statistics using wind speed and direction,
temperature, and moisture observations. WRF predictions are extracted from the nearest grid
cell for comparison to the observed values. METSTAT computes metrics for bias, error, and
correlation and compares them to a set of performance benchmarks set for ideal model
performance (Emery, et al., 2001). Graphical analysis includes Q-Q plots comparing predicted
versus observed concentration probability distributions. Log-log scatter plots are employed to
evaluate the temporal relationship between observed and predicted concentration. The
statistical and graphical results are presented in Volume 2.
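As an illustration of the kind of paired statistics involved, the sketch below computes a simple bias
and root mean square error and assembles rank-ordered pairs of the sort plotted in a Q-Q
comparison. METSTAT itself computes a larger suite of benchmarked metrics; this is only a
schematic of the idea, with made-up example numbers.

import math

def paired_stats(observed, predicted):
    """Simple paired bias and RMSE, plus rank-ordered pairs for a Q-Q comparison."""
    n = len(observed)
    bias = sum(p - o for o, p in zip(observed, predicted)) / n
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(observed, predicted)) / n)
    qq_pairs = list(zip(sorted(observed), sorted(predicted)))   # paired by rank, not by hour
    return bias, rmse, qq_pairs

observed = [2.1, 3.4, 5.0, 4.2, 6.8]    # e.g., observed wind speeds (m/s)
predicted = [2.5, 3.0, 5.5, 4.0, 7.5]   # e.g., WRF values from the nearest grid cell
bias, rmse, qq_pairs = paired_stats(observed, predicted)
print(round(bias, 2), round(rmse, 2))   # 0.2 0.47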
The results of the task suggest that small differences in the key meteorological variables can
result in large differences in predicted tracer concentration for a given hour. Although many of
the WRF simulations perform quite well when compared to regional surface observation of
3 Note that only Case 2 was evaluated for Oresund because overwater σθ data were not available for the
period of the study.

-------
winds and temperatures, small differences near the overwater point of release can result in
prediction of the opposite stability (stable vs. unstable or vice-versa).
Relatively small error in air temperature or sea surface temperature (SST) can have a large
effect on stability class because stability is a function of the "sign" of air-sea temperature
difference. Warm air advected over cool water results in stable conditions, while cool air
advected over warm water results in convective unstable conditions. Spatial gradients of SST
near the coast and wind direction thus play key roles in the simulation of the stability and
planetary boundary layer (PBL) heights over the water.
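Because the regime hinges on the sign of the air-sea temperature difference, a small temperature
error can flip the diagnosis, as the trivial sketch below illustrates (this is only the sign test described
above, not a COARE flux calculation).

def overwater_stability(t_air, sst):
    """Classify the regime from the air-sea temperature difference (degrees C).

    Warm air over cooler water -> stable; cool air over warmer water -> unstable.
    This is only the sign test described in the text, not a flux calculation.
    """
    astd = t_air - sst
    if astd > 0.0:
        return "stable"
    if astd < 0.0:
        return "unstable"
    return "near-neutral"

# A few tenths of a degree of SST error can flip the diagnosed regime for an hour.
print(overwater_stability(t_air=8.2, sst=8.5))   # unstable
print(overwater_stability(t_air=8.2, sst=7.9))   # stable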
The modeling performance analysis of the five tracer experiments demonstrated that WRF-based
AERMOD simulations can result in estimates of concentration as good as or better than
AERMOD simulations using observations, but not in all cases.
For the Cameron, Pismo, Oresund, and Carpinteria studies, some of the AERMOD simulations
driven by WRF meteorology had  better or similar performance statistics than simulations driven
by observed meteorology. The poorer performing AERMOD simulations, both WRF-driven and
observation-driven, occurred when the meteorological inputs produced atmospheric stability
conditions that were not likely representative of the larger-scale stability at the time of the study.
This, however, reflects a fundamental limitation of dispersion modeling that relies on
meteorology at a single point. Models that use a 3-dimensional grid of meteorological variables
are likely more appropriate for dispersion modeling of heterogeneous conditions.
For the Oresund study, all AERMOD simulations performed poorly when compared to observed
tracer concentrations. This study was characterized by overwater and overland transport,
25-40 km transport distances, high elevated releases, and observed shoreline fumigation. Poor
model performance in this instance is likely the result  of the limits of AERMOD's formulation, not
inaccurate characterization of surface conditions over water predicted by WRF.
It should also  be noted that the limits suggested by Richmond & Morris (2012), namely  limiting
the PBL height to be greater than 25 m and the absolute value of L to be greater than 5 m, were
only  implemented in cases that applied AERCOARE and not the direct WRF-MMIF extraction
cases.
Based on the  results of this study, the following conclusions can be made:
   •   ERA reanalysis datasets offered better WRF predictions of  meteorology regionally, but
       NARR reanalysis datasets performed better in some cases at the local tracer study
       meteorology measurement sites. ERA-based runs resulted  in better AERMOD results
       overall. Both YSU and UW PBL  schemes resulted  in better predictions of meteorology
       overall, leading to better AERMOD predictions.
   •   The METSTAT analyses suggested most WRF simulations met the performance criteria
       goals for "complex terrain" conditions. The comparison of overwater measurements from
       the archived buoy data suggested the METSTAT performance could be used as a
       predictor at the site, despite a lack of overwater measurements in the METSTAT
       analysis itself. However, the meteorological analysis suggests small errors in SST and

-------
       air temperature can result in misdiagnosed stability conditions that can have profound
       effects on the AERMOD results.
   •   The results suggest representative SST data are necessary to prevent misdiagnosis of
       surface-layer heat flux and stability. The SST data from the periods of the tracer studies
       integrated into the reanalysis data are not as representative or as resolved as today's
       datasets. Today SST data are collected from sophisticated satellites at high resolution. It
       is likely modern SST data are more accurate and air-sea temperature differences
       estimated by WRF are less  likely to result in a misdiagnosis of atmospheric stability
       conditions.
   •   AERMOD results produced  using meteorology extracted  from the ERA-YSU WRF
       simulations produced the simulations with the highest frequency of top performing Total
       Model Score (TMS). This combination also tended to more closely match the upper-
       range concentration predictions.
   •   Direct extraction by MMIF without AERCOARE produced more cases where
       concentration predictions were conservative.
The MMIF rediagnosis of PBL height should be used to prevent excessively low PBL heights in
the SFC files.

-------
-------
4     TASK 3 - ANALYSIS OF AERMOD PERFORMANCE USING WEATHER
      RESEARCH AND FORECASTING MODEL PREDICTED METEOROLOGY AND
      MEASURED METEOROLOGY IN THE ARCTIC
This section summarizes the Task 3 AERMOD performance evaluation reported in Volume 3.
The purpose of this task is to evaluate alternative methods for supplying meteorological
variables to AERMOD for regulatory air quality modeling of sources located over the water. The
protocol  for this task is in Appendix A.
It is hypothesized given an appropriate overwater meteorological dataset that AERMOD can be
applied for NSR following the same procedures as used for sources over land. This task
evaluates a combined modeling approach where the meteorological variables are provided by
WRF, and then processed by a combination of MMIF and, optionally, AERCOARE. The
extracted meteorology is used to drive AERMOD for several test cases. The results are then
compared to results of AERMOD driven by observational datasets over the Chukchi  and
Beaufort Seas along the Arctic coasts of Alaska.
The aim  of this task is to  provide evidence to help answer the following questions:
   •  How well does WRF predict overwater surface meteorology in the Arctic?
   •  Are pollutant concentrations predicted by AERMOD  driven by WRF meteorology as
      conservative as those predicted by AERMOD driven by observations (processed through
      AERCOARE)?
   •  What WRF modeling configurations and meteorology extraction methods provide the
      best AERMOD inputs?
   •  How sensitive is AERMOD to differences between the WRF extracted meteorology and
      observations for simulations of typical OCS sources?
The first part of this task generated a WRF meteorological dataset suitable for dispersion
modeling in the Arctic, employed various combinations of MMIF and AERCOARE  to extract
modeled and observational meteorology over water, and used these datasets to drive AERMOD
simulations for ice-free periods of 2009 - 2012, where overwater observational datasets were
available. Results from the buoy-based and WRF-based AERMOD simulations were compared
and contrasted to address the questions above.
Meteorological observation datasets from four overwater locations were obtained for this task.
Two of the locations were in the Beaufort Sea and two were in the Chukchi Sea. Data were
available at these locations for various time-spans during the ice-free summer and autumn
periods of 2010, 2011, and 2012. The sites B2,  B3, C1 and  C2 are shown in Figure 3 and
described in detail in Volume 3. The 2010 - 2012 period was selected to take advantage of the
vertical temperature profiler data collected at Endeavor Island during this period. The profiler is
a passive microwave radiometer operating from 2010 to 2012 at the offshore Endeavor Island
facility near Prudhoe Bay, Alaska. The profiler data were used to assist in the estimates of
atmospheric planetary boundary layer (PBL) height at each of the sites. The term "PBL height"

-------
is used to indicate the height or depth of the mixing layer and is synonymous with "mixing
height." These terms will be used interchangeably throughout this task.
Volume 3 summarizes the methodology and results for each element of the investigation,
including:

   •  The methodology used for the WRF simulations,

   •  Evaluation of the WRF performance,

   •  The methodologies used to prepare AERMOD meteorology from both the observational
      datasets and the WRF simulations,

   •  The AERMOD modeling approach and methodology,

   •  Evaluation of the AERMOD results and comparisons of observation-based and
      WRF-based AERMOD results, and

   •  Examination of the influence of the meteorological data on AERMOD performance.
[Figure 3 is a map showing the overwater measurement sites B2, B3 (Beaufort-Sivulliq, 2010-2012),
C1 (Chukchi-Klondike), and C2 (Chukchi-Burger, 2010 and 2012), the Endeavor Island (ENDV)
profiler location, and the corresponding WRF inner domain grid points extracted for each site.]

            Figure 3. Overwater meteorological measurement sites and
                corresponding WRF inner domain extraction points.

EPA provided five unique source group configurations for this study (Wong, 2012), as shown in
Table 3. Each group represented a hypothetical OCS source with stack characteristics typical of
drill ship sources that have operated on the OCS in the recent past or have been proposed in
recent permit applications for the Arctic. Fourteen stacks were divided among the five source
groups and located at the center of a hypothetical drill ship configuration (Figure 4). Each

-------
source group contained multiple vertical stacks with warm, buoyant plumes. Stack heights for all
14 sources ranged from 10 m to 39 m.

                  Table 3. Hypothetical Drill Ship Emission Sources.

                                              Stack    Stack Gas  Stack Gas      Stack
Source                             Source     Height   Temp       Exit Velocity  Diameter
Group   Unit                       ID         (m)      (K)        (m/s)          (m)       Downwash
1       Diesel engine              S1P1       16       700        30             0.50      No
        Incinerator                S1P2       14       550        20             0.40      No
2       Diesel engine              S2P1       18       680        28             0.40      No
        Boiler                     S2P2       17       500        10             0.45      No
        Incinerator                S2P3       10       525        17             0.40      No
3       Propulsion engine          S3P1       25       570        30             0.60      No
        Generator                  S3P2       20       610        22             0.25      No
        Boiler                     S3P3       15       420        2              0.30      No
4       Diesel engine              S4P1       39       580        21             0.70      No
        Winch                      S4P2       25       580        14             0.20      No
        Heater                     S4P3       23       510        42             0.15      No
5       Diesel engine              S5P1       18       680        28             0.40      Yes
        Boiler                     S5P2       17       500        10             0.45      Yes
        Incinerator                S5P3       10       525        17             0.40      Yes
For the first four source groups (totaling eleven stacks), building downwash was not applied.
AERMOD can account for the influence of the wakes of structures on downwind concentrations
using the Plume Rise Model Enhancements (PRIME) model. To examine the effects of building
downwash, Source Group #2 (with three stacks) was modeled as Source Group #5 with the drill
ship structure shown in Figure 4.
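As an aside, the buoyancy of these warm stacks can be gauged with the standard Briggs buoyancy
flux parameter, Fb = g vs ds² (Ts − Ta) / (4 Ts). The sketch below evaluates it for source S1P1 from
Table 3 using an assumed ambient temperature of 275 K; the ambient value is an illustrative
assumption, not a study input.

def briggs_buoyancy_flux(exit_velocity, diameter, stack_temp, ambient_temp):
    """Briggs buoyancy flux parameter Fb (m^4/s^3) from stack exit conditions."""
    g = 9.81
    return g * exit_velocity * diameter ** 2 * (stack_temp - ambient_temp) / (4.0 * stack_temp)

# Source S1P1 from Table 3: 30 m/s exit velocity, 0.50 m diameter, 700 K; assumed 275 K ambient.
print(round(briggs_buoyancy_flux(30.0, 0.50, 700.0, 275.0), 1))   # ~11.2 m^4/s^3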

-------
         Figure 4. Source locations and structures and innermost receptor rings.

AERMOD predicts pollutant concentrations at locations based on their distance from a source.
For this task, a network of 50 receptor rings was used. Each ring contained 360 receptors at 1°
spacing. The rings were centered at the same origin as the sources with incremental radial
spacing of downwind distance based on a geometric series from 30 m to 10 km as shown in
Figure 4. Receptors were placed at a height of 0.0 m (no flagpole receptors). The total number
of receptors was 18,000. The vessel to the south had no sources and no downwash influence on
the drill rig sources. The vessel was considered part of the ambient air.
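A sketch of how such a polar receptor network can be constructed is shown below: 50 ring radii
spaced as a geometric series from 30 m to 10 km, each with 360 receptors at 1° spacing, giving
18,000 receptors in total. The exact radii used in the study are given in Volume 3; this snippet only
reproduces the construction described above.

import math

def polar_receptor_grid(r_min=30.0, r_max=10000.0, n_rings=50, n_azimuths=360):
    """Receptor (x, y) coordinates on rings whose radii follow a geometric series."""
    ratio = (r_max / r_min) ** (1.0 / (n_rings - 1))
    radii = [r_min * ratio ** i for i in range(n_rings)]
    receptors = []
    for radius in radii:
        for deg in range(n_azimuths):
            theta = math.radians(deg)
            # x east, y north, origin at the center of the source group
            receptors.append((radius * math.sin(theta), radius * math.cos(theta)))
    return radii, receptors

radii, receptors = polar_receptor_grid()
print(len(receptors))                           # 18000
print(round(radii[0], 1), round(radii[-1], 1))  # 30.0 10000.0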
The meteorological data were extracted from WRF and processed  by MMIF the same way as
described in Task 2. Similarly, the same limits on Monin-Obukhov length and PBL height
presented in Task 2 apply here as well (i.e., a minimum PBL height of 25 m and a minimum |L|
of 5 m).
In this task, AERMOD concentrations were calculated  using meteorology from overwater
observations and meteorology extracted from WRF simulations. The  maximum predicted
concentrations at each receptor ring were extracted and the observation-driven AERMOD
results were compared directly to the WRF-driven AERMOD results. This approach simplifies

-------
the investigation of the bias of the WRF simulations and removes some of the influence of wind
direction differences.
AERMOD version 14134 was used for this study with all regulatory default options, except that the
"VECTORWS" flag (treating wind speeds as vector mean, or resultant, wind speeds rather than
scalar means) was used for the WRF wind speeds. It was necessary to specify the "Beta" option in
AERMOD to use the MMIF extracted data, which allows for new features in AERMOD that are
in a draft BETA-test status. Five different block averaging periods were simulated for each
combination of site, year, and source type: 1-hour, 3-hour, 8-hour, 24-hour, and period-long
averaging periods. These averaging times were selected because they correspond to the
averaging periods applicable to the NAAQS. Comparison with statistical standards was not
performed. To ensure emission rate independence, AERMOD simulations were
conducted using a unit emission rate of 1 g/s for each stack. The resulting AERMOD concentrations
were divided by the release rates to provide normalized concentrations with units of µs/m3.
This study consisted of 1,125 AERMOD simulations, conducted to cover all of the possible
combinations (3 sites per year × 3 years × 5 source groups × 5 averaging periods × 5
meteorological datasets = 1,125):
   •  3 sites per year made up of different combinations of sites:
          o  2010: B2, B3, C2
           o  2011: B2, B3, C1
          o  2012: B3, C2, C1
   •  3 years of WRF simulations (2010-2012),
   •  5 source groups,
   •  5 averaging periods,
   •  5 meteorological datasets:
      i)      Observations
      ii)     MMIF.RCALF WRF extractions
      iii)     MMIF.RCALT WRF extractions
       iv)    AERC.RCALF WRF extractions
       v)     AERC.RCALT WRF extractions
The definitions of the WRF extractions in ii) through v) are explained in Table 2 of Section 3.
A set of summary statistics, defined in Volume 3, was calculated for each simulation to
evaluate the performance of WRF-based simulations compared to the observation-based
simulations. The summary statistics can be found in Appendix B of Volume 3.
In summary, the analyses suggest WRF was able to produce hourly meteorological datasets
that compared favorably to overwater measurements. The regional METSTAT analyses found
temperature and wind speed were within simple-terrain criteria for the majority of periods. Wind
speed at overwater sites was biased high in October only. Overall, WRF meteorology at all four
sites agreed with measured data, but with some biases.  Sea Surface temperatures were in
                                         21

-------
agreement a majority of the time, but varied up to several degrees for short periods. WRF PEL
heights were biased low on average during stable periods, but were of similar magnitude to
measurements during unstable periods.
From a qualitative perspective, most of the WRF-based AERMOD simulations produced favorable
concentration maxima, falling within a factor of two of the observation-based AERMOD results and
producing RHC values within 10-20% of the observation-driven AERMOD results a majority of the
time. Maximum concentrations at distances greater than 1,000 m from the source tended to be
conservative. Maximum concentrations within 1,000 m of the source were underpredicted for the
tall-stack simulations (i.e., Source Groups #3 and #4) due to the persistence of overly stable
conditions that prevented near-source mixing to the surface. In general, maximum concentrations
tended to occur from 100 m to 1,000 m during unstable conditions characterized by higher PBL
heights, owing to the increased rate of vertical mixing. The WRF-based predictions for Source
Group #5 (downwash cases) were consistently the best performers of the five source groups when
compared to the observation-based predictions.
The main conclusions of the study are summarized as responses  to the set of questions below:
   •   Is there a consistent bias across source type and/or location (e.g. Chukchi vs. Beaufort)?
       In particular are there any instances where the WRF simulations result in a bias towards
       underprediction compared to using the buoy observations?
       For the sources considered in this task, the absolute maximum concentrations occurred
       within 1,000 m of the source during unstable conditions. There was no consistent bias at
       the Chukchi or Beaufort sites for maximum short-term average concentrations. Prediction
       accuracy (with respect to the observation-based predictions) was better at the Chukchi
       sites because the WRF-MMIF PBL heights were used for the observation-based simulations
       (no PBL height measurements were available for these sites). With respect to averaging
       time, the long-term period-averaged concentrations were underpredicted in some
       instances. For example, the simulations using the MMIF recalculated PBL heights
       underpredicted the long-term maximum concentrations at site C2.
       The FF2 scores were persistently lower at site B2 than at the other sites. Concentrations
       were typically overpredicted at this site at distances greater than 1,000 m and
       underpredicted at distances less than 1,000 m due to the high frequency of the minimum
       PBL height.
       At distances less than 1,000 m, WRF-based Source Group #4 simulations underpredicted
       concentrations with respect to the observation-based simulations. The taller stack groups
       (Source Groups #3 and #4) were found to be more sensitive to differences in meteorology
       than the other groups. At distances greater than 1,000 m, tall-stack maximum
       concentrations were underpredicted in cases where the PBL height was overpredicted. High
       PBL heights corresponded to unstable conditions that promote vertical mixing and
       concentration maxima for tall stacks in the near-source region (< 1,000 m) but promote
       lower concentrations in the far-source region (> 1,000 m). On the other hand, tall-stack
       concentrations were underpredicted near the source when the PBL height was
       underpredicted.
       The MMIF rediagnosis (RCALT) of PBL height tended to improve WRF-based AERMOD
       performance by producing PBL heights that agreed better with the observation-based PBL
       heights.
   •   For locations where WRF performed better, does that ultimately translate to different
       dispersion model results?
       The short-term maximum concentrations were less sensitive to bias in the WRF results,
       likely because the concentration maxima occurred during the most extreme atmospheric
       stability conditions (either stable or unstable). The optional MMIF PBL height and L limits
       result in observation- and WRF-based meteorological simulations that are quite similar
       during the most extreme conditions.
       The ice-free period-average maximum concentrations at distances greater than 1,000 m
       were the most sensitive to long-term bias in the meteorology. Underpredicted wind speeds
       at Sites B2 and B3 favored conservative period-averaged concentrations beyond 1,000 m.
       Site B2 period-average far-source concentrations were highly conservative, with RHC
       values more than a factor of two above the observation-based values.
       It is highly recommended that the FNMOC SST analysis, or a similar high-resolution SST
       dataset based on both remotely sensed and in-situ measurements, be used instead of
       alternative datasets such as the NCEP RTG for simulations of open-water periods in the
       Beaufort Sea. The Mackenzie River warm-water outflow plume is a prevalent feature of the
       Beaufort Sea in summer, and a low-resolution SST analysis or excessive smoothing may
       result in erroneous air-sea temperature difference estimates. The FNMOC SST analyses
       gave a better spatial and temporal description of the SST distribution and gradient across
       the Beaufort Sea over the 2010-2012 periods analyzed, as discussed in Section 5.1.2.
   •   Did it make any difference when WRF predictions were processed by AERCOARE, as
       opposed to using the WRF surface energy fluxes directly?
       Overall, there was little discernible advantage in using AERCOARE. Considering average
       TMS, the "MMIF" runs (direct extraction from WRF without AERCOARE processing)
       resulted in slightly higher scores. However, for the 1-hour averaging period RHC scores,
       the AERC.RCALT simulations performed the best overall. The MMIF recalculation of PBL
       height has a much greater influence on the accuracy of the simulations than AERCOARE
       processing does.
•  Does it make any difference when PBL heights are rediagnosed by MMIF?
       Overall, maximum concentration results were more accurate and more conservative when
       the MMIF rediagnosis was applied.
       The concentration results from the shorter stack groups (Source Groups #1, #2, and #5)
       and the downwash-affected sources were less sensitive to differences in the PBL height.
       If the plume is already near ground level, maximum ground-level concentrations occur
       near the source and are less sensitive to the height of the PBL. Concentration maxima
       from taller stacks are much more sensitive to the PBL height. Note that the height of the
       tall stacks used in this study is close to the minimum PBL height (25 m); if the minimum
       PBL height were greater than the tallest stack, the concentration estimates would likely
       be more comparable.
5      TASK 4 - COMPARISON OF PREDICTED AND MEASURED MIXING
       HEIGHTS

5.1    Overview and Objective
The goal of this Task is to compare upper-air observations with WRF simulations.

5.1.1   Upper-air and surface observations
In  spring 2010, a Kipp & Zonen radiometric profiler and an instrumented meteorological tower
was installed on Endeavor Island (or Endicott) (Hoefler Consulting Group, 2003). The profiler
collected upper measurements from mid May 2010 and to late November 2012 while the tower
recorded surface data from mid May 2010 to the end of August 2013. Endeavor is a man-made
island located off the northern coast of Alaska in the Beaufort Sea and 15 miles northeast of
Prudhoe Bay. A  three mile gravel road connects Endicott's two gravel islands; the Main
Production Island (MPI) and the Satellite Drilling Island (SDI). The location where the station
was installed is indicated in Figure 5. The profiler and a view of the co-located meteorological
station is shown in Figure 6.
The profiler is a passive microwave radiometer that measures radiances emitted from the
atmosphere over an optical path length of a few hundred meters. Every few minutes, the
radiometer performs a series of measurements at angles ranging from horizontal to vertical.
From these radiances, a profile of air temperature can be inferred via a mathematical retrieval
technique. Values are reported every 10 m up to 100 m, every 25 m up to 200 m, and every 50 m
up to 1000 m; no data are available above 1000 m. The temperature soundings, when compared
with co-located radiosonde ascents, appear somewhat "smoothed" in the vertical due to the
nature of the retrieval. No retrievals of wind speed, wind direction, or humidity are available
from this profiler. The profiler was installed on the north end of SDI and pointed due north out
over the water.
Located next to the profiler was a 12  m meteorological tower, instrumented to measure wind
speed and wind  direction at 10 m, temperature at 10 and 2 m, and solar radiation. The
temperature measurement is used to scale the retrieval because the profiler senses the change
of temperature in the vertical. The distance from the station to the water's edge was on the
order of 10 m, which is short enough that the internal thermal boundary layer growing from
the land surface would not affect the air measurements at the site. The measurements, both
radiometric and  traditional, can therefore be assumed to be representative of the adjacent
ocean.
Periodically, the  Japan Agency for Marine-Earth Science and Technology (JAMSTEC) sends
one of its research vessels into the Chukchi and/or Beaufort Seas for a research cruise. Such
cruises occurred in 2008, 2009, 2010, and 2012. Hourly near-surface measurements and twice-
daily upper-air measurements were collected in addition to the other scientific objectives of each
cruise. There is a delay in releasing these data to the public to allow JAMSTEC researchers to
publish journal articles based on the data. The upper-air soundings from 2008 and 2009 have
been released, but the soundings from 2010 and 2012  have not yet been made public.

The twice-daily radiosondes released at Barrow (PABR) are not representative of the marine
boundary layer and are not considered part of this analysis. However, PABR data (both
sounding and surface data) were used in the observational nudging of the 4 km domain of the
WRF simulations described below. WRF performance at PABR should therefore not be
considered an independent test, unlike the profiler or JAMSTEC comparisons (which were not
used in the nudging dataset).

               Figure 5. Location of the profiler station on Endeavor Island.

       Figure 6. The K&Z profiler (left) and a view of the station looking north (right).

5.1.2  WRF simulations
As summarized in Section 4 and detailed in Volume 3, a three-year WRF run was produced,
spanning the years 2009-2011. This simulation used the NCEP RTG SST dataset (NOAA,
Environmental Modeling Center, no date [n.d.]) for the lower boundary condition and as
initialization data required by WRF. This dataset is a smoothed 0.083 degree (1/12th degree in
latitude and longitude or approximately  9 km) once-daily SST analysis. This WRF simulation
was later extended to 2012 and 2013, though only 2012 was completed before problems were
detected. It was discovered that the RTG SST data did not handle the Mackenzie River outflow
very well, leading to large (8-12 °C) errors in WRF SST compared to the buoys in the Beaufort
Sea. An unpublished re-run of the RTG product covering much of the problematic period was
found on the NCEP FTP server, but that re-processed dataset only slightly improved the WRF
SST performance at the Beaufort buoys. This dataset is referred to as "RTG-WRF".
A second WRF simulation was  run using the SST analysis from the FNMOC (USGODAE Data
Catalog, n.d.) in place of the RTG SST  product. This analysis is produced four times per day at
a horizontal resolution of 9 km.  Only the periods when buoys were deployed were re-run using
this SST dataset. This dataset was used in the Task 3 analysis comparing WRF-driven
AERMOD results to buoy-driven AERMOD results, so only the open-water period when buoy
data exists was required. This dataset covers August, September, and October of 2010, 2011,
and 2012. This dataset is referred to as "FNMOC-WRF".
The FNMOC-WRF output was used in the analysis involving the Endeavor Island profiler
because of its proximity to the Beaufort Sea buoys, which had shown problems related to the
use of the RTG SST product. This limits the analysis to the three years of open-water periods,
2010 to 2012, using the FNMOC-WRF.
Because of the limited overlap in time between the JAMSTEC data and the two WRF
simulations, only the RTG-WRF was used in the analysis involving the JAMSTEC soundings
from the 2009 cruise (the FNMOC-WRF does not begin until 2010).

5.2   Analysis
The Kipp & Zonen radiometer's sub-hourly profiles were averaged to create a mean sounding
for each hour of operation. The 4 km  domain WRF output for the grid  cell containing the profiler
was extracted, with no horizontal interpolation between  the nearest grid cells.  Although the
profiler data can be plotted along with WRF data, the profiler's retrievals are smoothed as a
result of the averaging of the sub-hourly profiles. Consequently, a direct comparison was not
considered useful. Instead, this analysis focuses on mixing heights derived (analyzed) from the
profiler temperature data since  concentrations estimated by AERMOD can be sensitive to this
value.

The mixed layer height was analyzed for each observed sounding (profiler, JAMSTEC, PABR)
using two different approaches:
   1.  A plot of the sounding, visually identifying the base and top of any inversion layer. These
       are labeled "ZiBase" and "ZiTop" in the plots that follow.
   2.  The profiler soundings were numerically analyzed using the Critical Bulk Richardson
       Number (CBRN) calculation. These are labeled "ZiRib" in the plots that follow.

The Richardson number (Ri) (Vogelezang and Holtslag, 1996) is named after Lewis Fry
Richardson (1881-1953) and is essentially the ratio of the potential energy to the kinetic
energy of an air parcel. It is most often expressed using gradients, which are approximated
when using quantized (layered) data by the bulk Richardson number RiB:

    RiB(z) = g (z - zs) (θz - θs) / { θs [ (Uz - Us)² + b u*² ] }

Here "g" is the acceleration due to gravity, "z" is the height above the surface, "θ" is the
potential temperature, "U" is the wind speed, "u*" is the friction velocity, b is a constant taken
to be 100, and the subscript "s" refers to the near-surface observation (e.g., the typical 10 m
wind speed and 2 m temperature). Starting at the surface, the height "z" is increased until the
bulk Richardson number exceeds a critical value, called the CBRN. Experiments have identified
a CBRN of 0.25 over land (Vogelezang and Holtslag, 1996) and 0.05 over water (Gryning and
Batchvarova, 2003) when using this formulation.
Because the profiler does not provide a retrieval of wind speed, Monin-Obukhov surface-layer
similarity theory was used to extrapolate the wind speed measured at 10 m to the levels of the
profiler's temperature retrievals. The appropriate equation is the surface-layer wind profile:

    U(z) = (u*/k) [ ln(z/z0) - Ψ(z/L) ]

Here "k" is von Karman's constant, "z0" is the roughness length, "L" is the Monin-Obukhov
length, and "Ψ" is the stability correction function.
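A compact sketch of the procedure described above is given below. The stability correction uses common Dyer/Businger forms, which is an assumption; the exact functions used in the study may differ.

```python
import numpy as np

KAPPA = 0.4  # von Karman constant

def psi_m(zeta):
    """Stability correction for momentum (common Dyer/Businger forms)."""
    if zeta >= 0:                                    # stable
        return -5.0 * zeta
    x = (1.0 - 16.0 * zeta) ** 0.25                  # unstable
    return (2.0 * np.log((1.0 + x) / 2.0) + np.log((1.0 + x * x) / 2.0)
            - 2.0 * np.arctan(x) + np.pi / 2.0)

def wind_at(z, ustar, z0, L):
    """Extrapolate wind speed to height z with the surface-layer profile above."""
    return (ustar / KAPPA) * (np.log(z / z0) - psi_m(z / L))

def cbrn_mixing_height(z_levels, theta, theta_s, u_s, ustar, z0, L,
                       b=100.0, ri_crit=0.05):
    """Raise z until Ri_B first exceeds the critical value (0.05 over water)."""
    for z, th in zip(z_levels, theta):
        u = wind_at(z, ustar, z0, L)
        ri_b = 9.81 * z * (th - theta_s) / (theta_s * ((u - u_s) ** 2 + b * ustar ** 2))
        if ri_b > ri_crit:
            return z
    return z_levels[-1]                              # no exceedance below the top level
```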
In its output files, WRF includes the 2-dimensional variable "PBLH", listed as the "PBL HEIGHT"
in meters above ground level (AGL). Each PBL parameterization choice within WRF can have its
own definition of PBL height, so the same PBLH may not be diagnosed from identical profiles.
Further, WRF's PBLH is not a continuous quantity, but is quantized to the nearest layer's
mid-point. This makes WRF's PBLH values dependent on the user's choice of the vertical
pressure coordinate (eta) levels in WRF. Because the conversion of the vertical pressure
coordinate levels to height above the surface varies in both time and space, this estimate of the
PBL height can be misleading. In the WRF simulations, the layers are approximately 11 m thick
near the surface, ~75 m thick near 500 m AGL, and ~240 m thick near 1000 m AGL. The
discreteness of WRF's PBLH could therefore lead to a 10-20% error. WRF's PBLH is labeled as
"ZiWRF" in the plots that follow.
The WRF output was therefore processed with MMIF twice: once using the "pass through" of
the WRF PBLH, and once using MMIF's numerical CBRN calculation of the PBL height. MMIF's
implementation of this numerical method is very similar to the one used to process the profiler
data, except that MMIF interpolates the WRF data vertically to provide a smoother (less
quantized) output; the numerical technique used to analyze the profiler data did not attempt to
interpolate between the retrieval layers. These heights are labeled "ZiMMIF" in the plots that
follow. ZiMMIF is the result of the CBRN analysis of the WRF data, while ZiRib is the result of
the same type of analysis of the profiler's retrieval.

5.3    Results

5.3.1   FNMOC WRF evaluated using Endeavor Island profiler
Some typical sounding plots are shown in Figure 7 through Figure 9. The solid black line is the
profiler retrieval of temperature (left panels), converted to potential temperature (right panels)
using the hypsometric equation (AMS, 2012). The dashed black line is the WRF data profile.
The current conditions taken from an AERMET run using the 10 m tower data are printed near
the top-left of the right panels. The various estimates of the mixing height Z, are shown by
horizontal lines with labels. To summarize:
    •   ZiBase represents the typical American Meteorological Society (AMS) definition of an
       inversion, the  level where the temperature starts to increase with height. ZiTop is the
       level  where it  starts to decrease again.
    •   ZiRis represents  the CBRN technique applied to the profiler.
    •   ZiMMIF represents the CBRN technique applied to the WRF data.
    •   ZiWRF represents WRF's internal value for the PEL depth, i.e. the output variable
       PBLH.
    •   Zim and Zic (when present) represent AERMET's mechanical and convective mixed
       layer heights,  respectively. AERMET  is  not always able to calculate these, due to
       missing input  data.

WRF is generally reproducing the shape of the profile, even if there is a constant offset in the
absolute value. Recall that the profiler uses the 10 meter air temperature measured
independently  to "anchor" its profiles - it does not retrieve the absolute value of temperature,
just the change of temperature with height.
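For reference, a minimal sketch of the temperature-to-potential-temperature conversion is shown below. It uses the common dry-adiabatic approximation theta(z) ~ T(z) + (g/cp) z rather than a full hypsometric calculation, so it is only an approximation of the report's method, not a reproduction of it.

```python
# Sketch: convert a retrieved temperature profile T(z) to potential temperature
# using the dry-adiabatic approximation theta(z) ~ T(z) + (g/cp) * z.  The
# report's exact (hypsometric) implementation may differ slightly.
G_OVER_CP = 9.81 / 1004.0   # ~0.0098 K/m

def potential_temperature_k(temp_c, height_m):
    return [(t + 273.15) + G_OVER_CP * z for t, z in zip(temp_c, height_m)]

print(potential_temperature_k([8.0, 7.5, 7.2], [10.0, 100.0, 500.0]))
```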
Note that the hand-analyzed inversion base is often at the surface (i.e., a "surface-based"
inversion), yet the wind speeds measured by the 10 m tower are considerable. The shear-driven
turbulence alone should be creating a well-mixed layer, but presumably strong advection and
surface fluxes are maintaining the temperature profile. This quandary is what the CBRN analysis
is designed to resolve: the RiB-based mixed layer is a compromise between the shear-driven
turbulent mixing and the buoyancy-driven turbulent mixing (or suppression thereof).
It is difficult to interpret the retrieval in Figure 9. A shallow super-adiabatic layer might exist
near the surface when there is a large upward heat flux, but a 200+ m deep super-adiabatic layer
aloft is not physically possible, because the warmer, less dense air below (around 200 m) would
rapidly exchange with the cooler, more dense air above (around 500 m) and the temperature
gradient would be moderated. The profiler may have been affected by radiation from clouds or
by other unknown errors in its retrieval algorithm.
Figure 10 through Figure 15 show the various identifications of the mixed layer height, plotted
against each other. Several statistics are shown in the upper-left corner of each plot, including
the number of points, the squared correlation coefficient, the root-mean-square error, the bias,
and the percentage of the data that fall within a factor of two of the 1:1 "perfect fit" line. A
least-squares fit forced through the origin (the dashed line) is also plotted, with its slope given in
the lower-right corner.
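The statistics printed on the scatter plots can be reproduced with a short routine such as the following sketch (variable names are illustrative):

```python
import numpy as np

def scatter_stats(obs, mod):
    """Sketch of the plot statistics: N, R^2, RMSE, bias, fraction within a
    factor of two of the 1:1 line, and the slope of a zero-intercept fit."""
    x, y = np.asarray(obs, dtype=float), np.asarray(mod, dtype=float)
    return {
        "N": x.size,
        "R2": np.corrcoef(x, y)[0, 1] ** 2,
        "RMSE": float(np.sqrt(np.mean((y - x) ** 2))),
        "Bias": float(np.mean(y - x)),
        "FAC2": float(np.mean((y >= 0.5 * x) & (y <= 2.0 * x))),
        "Slope": float(np.sum(x * y) / np.sum(x * x)),   # fit forced through origin
    }

print(scatter_stats([100.0, 250.0, 600.0], [120.0, 200.0, 900.0]))
```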
As expected, ZiBase (the inversion base identified visually) has very little correlation with either
the RiB-based or WRF's identification of the mixed layer height. Even the profiler's CBRN mixing
height has relatively little correlation with WRF's PBLH. In Figure 13, except for the lobe of
points where the WRF + MMIF diagnosed mixing height is high and the diagnosis from the
profiler data is low, there appears to be some correlation. Note that the profiler's highest
retrieval is 1000 m above the surface, but no similar constraint exists for the WRF + MMIF
based Zi. The quantization of the profiler data can also be seen, in contrast to MMIF's ability to
interpolate between WRF layers when finding the RiB mixing height. Given the smoothed nature
of the profiler's retrievals, the other potential sources of error from the profiler, and the
somewhat sensitive nature of the CBRN analysis, one might not expect a very high correlation.
AERMET's mechanical and convective mixed layer heights also do not compare favorably to the
profiler's mixed layer heights. The mechanical mixed layer height is generally too high, and
often would exceed AERMET's  internal limit of 4000 m. Even the convective mixed layer height
is often much higher than the profiler's. This underscores why AERMET  is not appropriate for
overwater conditions, as it assumes 90% of solar radiation reaching the surface goes into
heating (deepening) the mixed layer.

5.3.2  RTG WRF evaluated using JAMSTEC soundings
Evaluation of the FNMOC WRF run is not possible, because JAMSTEC has not yet released to
the public the soundings from cruises after 2009, and the FNMOC WRF  run started in 2010.
Instead, a  previous version of the WRF dataset developed under Task 3 will be used. This WRF
simulation for 2009 used the RTG SST product, which spread the influence of the warm
Mackenzie River outflow far into the Beaufort Sea. The JAMSTEC radiosondes were launched
far enough west (see Figure 16) to escape most of the influence of this problem. Sample
sounding plots showing WRF and JAMSTEC soundings are shown in Figure 17 through Figure
20. A total of 47 atmospheric soundings were taken within the 4 km domain during this cruise,
between September 10 and October 11, 2009. These data were not
used in the observational nudging of the WRF run; this is an independent evaluation of WRF's
performance.
Figure 21 shows a comparison between the mixing heights derived via the CBRN technique
applied to the JAMSTEC soundings and to the WRF soundings. This is the same technique that
MMIF uses to rediagnose the mixing height. Forty-seven of the points plotted were from the
4 km WRF domain (plotted as red plus signs) and 101 points were from the 12 km domain.
Although the R² value is relatively low, nearly three-quarters of the data points fall within the
factor-of-two lines.
WRF captures the essence and variation of the JAMSTEC soundings remarkably well during
this short period of open water. Although the elevated inversions are sharper in the observations
than in the simulation, the WRF inversions are probably strong enough to set the mixed layer
depth at approximately the correct height. Because of the small sample size and the resulting
lack of statistical significance, no CBRN analysis has been completed on the JAMSTEC
soundings.  Note also that no surface-based inversions were observed during the 2009
JAMSTEC cruise.

5.3.3  FNMOC WRF evaluated using Point Barrow soundings
Sample vertical  profile plots showing 4 km WRF modeled data and observed upper-air sounding
data from Point Barrow, AK (PABR) are shown in Figure 22 and Figure 23. WRF handled
temperature and moisture well through the vertical atmosphere, with an accurate representation
of observed conditions within the PBL. In Figure 22, WRF better represented the surface-based
radiation inversion in the left panel. In Figure 23, the left panel depicts WRF slightly
over-predicting the strong subsidence inversion beginning around 700 m. The right panel
displays a slight cold bias at the surface, with modeled temperatures warming a bit too quickly in
the shallow first 200 m layer during the early morning hours. The summertime vertical profiles
reflect improved model performance, with WRF predicting the height and depth of inversions in
the lower levels of the atmosphere.
It should be noted that the PABR upper-air data were used in the observational nudging of the
WRF run, so it is not surprising that WRF is similar to PABR.

5.4   Conclusions
After reviewing the many sounding plots of the profiler vs. WRF and of the JAMSTEC soundings
vs. WRF, it is apparent that the CBRN technique is difficult to apply. Quite often in the
JAMSTEC plots there is a small "hack" (kink) in the trace of wind speed at the lowest three
levels. The lowest level reported in weather balloon ascents is typically from another
(collocated) instrument, not from the weather balloon itself; the second level is actually the first
from the sonde, and is often lower than the first or third report. This can be seen in the bottom
panel of both sides of Figure 17, and is very clear in Figure 19. This "hack" in the trace can
cause large changes in the calculated bulk Richardson number, leading to too much sensitivity
in identifying the mixed layer.
For example, although the mixing heights identified using the WRF profile (horizontal blue line)
and using the sonde profile (horizontal red line) are similar, both are driven by relatively small
wind speed changes near the surface, and both miss what a trained meteorologist would call
the mixed layer (800 m for 2009-09-11 06 GMT and 1800 m for 2009-09-28 12 GMT).

In the "RTG WRF" comparisons of Figure 17 through Figure 21, the WRF profiles match the
JAMSTEC sonde profiles well, with only relatively small errors in the tops of the mixed layer that
a meteorologist would identify from the potential temperature traces. The wind speed profiles
are simply too "noisy", with small changes ("wiggles") over layers ~100 m deep falsely triggering
the CBRN criterion and leading to mixing layers that do not agree with what a meteorologist
might identify from viewing the sounding.
Based on the above evaluations, more study of a consistent method to identify mixing layers,
both from WRF and from observed profiles, is warranted before the WRF-AERCOARE
methodology can be considered complete. The comparisons of WRF vs. JAMSTEC profiles,
which, unlike the Point Barrow profiles, were not used to nudge the WRF run and are therefore
independent, show how accurately WRF can represent the profiles. The method used to identify
the mixing height from any profile (WRF or measured), however, is too sensitive to small
changes in the profiles.
           Figure 7. Profiler retrievals and WRF soundings for 2010-07-27 07:00 LST.
           Figure 8. Profiler retrievals and WRF soundings for 2010-08-20 16:00 LST.
           Figure 9. Profiler retrievals and WRF soundings for 2010-08-16 17:00 LST.
     Figure 10. WRF's PBLH vs. hand-analyzed inversion base (ZiBase).
     (N = 6951, R² = 15%, RMSE = 300 m, bias = -19 m, 32% within a factor of 2, slope = 0.64)
     Figure 11. MMIF's Critical Bulk Richardson mixing height vs. hand-analyzed inversion base
     (ZiBase).
Figure 12. WRF's PBLH vs. the Critical Bulk Richardson mixing height from the profiler.
(N = 6796, R² = 42%, RMSE = 217 m, bias = -50 m, 30% within a factor of 2, slope = 0.84)
Figure 13. Critical Bulk Richardson mixing heights from MMIF vs. from the profiler.
(N = 6796, R² = 49%, RMSE = 180 m, bias = -20 m, 46% within a factor of 2, slope = 0.91)
Figure 14. MMIF's Critical Bulk Richardson mixing height vs. AERMOD's mechanical mixing
height. (N = 6815, R² = 17%, RMSE = 1635 m, bias = -1299 m, 7.5% within a factor of 2)
Figure 15. MMIF's Critical Bulk Richardson mixing height vs. AERMOD's convective mixing
height. (N = 364, R² ≈ 0%, RMSE = 176 m, bias = -106 m, 20% within a factor of 2, slope = 0.23)
Figure 16. Locations of the JAMSTEC 2009 soundings.
Figure 17. RTG WRF and JAMSTEC profiles for 2009-09-11 06:00 UTC and 2009-09-28
                                       12:00 UTC.
Figure 18. RTG WRF and JAMSTEC profiles for 2009-09-30 18:00 UTC and 2009-10-03
                                       18:00 UTC.
Figure 19. RTG WRF and JAMSTEC profiles for 2009-10-07 18:00 UTC and 2009-10-09
                                       09:00 UTC.
Figure 20. RTG WRF and JAMSTEC profiles for 2009-10-10 06:00 UTC and 2009-10-10
                                       18:00 UTC.
Figure 21. JAMSTEC observed vs. WRF-MMIF mixing heights, both derived using the
       critical bulk Richardson number technique (4 km in red, 12 km in blue).
       (4 km WRF shown as red plus signs; N = 146, R² = 41%, RMSE = 239 m, bias = -116 m,
       72% within a factor of 2, slope = 0.75)
Figure 22. FNMOC WRF and Pt. Barrow profiles for 2010-08-14 12:00 UTC and 2010-08-28 12:00 UTC.
Figure 23. FNMOC WRF and Pt. Barrow profiles for 2010-09-20 12:00 UTC and 2011-06-27 12:00 UTC.
6     TASK 5 - DEVELOPMENT OF AERSCREEN FOR ARCTIC OUTER
      CONTINENTAL SHELF APPLICATIONS

6.1   Introduction
The AERMOD modeling system (USEPA, 2004c) is the preferred refined regulatory air
dispersion model for assessing near-field (within 50 kilometers) impacts of air emissions for air
permit compliance demonstrations.4 Given the likely continued expansion of oil and mineral
exploration and extraction activities in the Arctic Ocean off the coast of Alaska and at other
marine locations, there will be an increasing demand for air quality permits and exploratory
plans related to such activities. In many circumstances, screening modeling, which should
produce estimates that are conservative relative to refined models, is an accepted alternative
that requires fewer resources and can be completed in less time. Screening modeling is
especially useful when the data required by refined models, such as AERMOD, are not readily
available, as is often the case in remote marine environments.
The EPA's preferred screening dispersion model for regulatory applications is AERSCREEN
(USEPA, 2011), which functions as an interface for the AERMOD modeling system and
executes AERMOD in screening mode. AERSCREEN utilizes the MAKEMET program (USEPA,
2011) to generate screening meteorology intended to include worst-case meteorological
conditions for the environment being modeled. Many of the algorithms in MAKEMET that
compute the boundary layer parameters are based on AERMOD's meteorological preprocessor,
AERMET (USEPA, 2004a), which was designed to process land-based surface and upper-air
meteorology. These algorithms do not account for the unique boundary layer conditions
common to overwater environments, nor has MAKEMET been evaluated to determine whether
its use is appropriate in overwater environments.
The purpose of Task 5 is to develop and evaluate a systematic approach for generating
screening meteorology representative of overwater conditions that can be incorporated into
AERSCREEN in place of MAKEMET, or used directly by AERMOD in place of observed data.
Additionally, the scaling factors currently used to convert 1-hour screening concentrations to
3-hour, 8-hour, 24-hour, and annual averaging periods for land-based screening applications
were to be evaluated for marine environments and modified if necessary. The effort completed
toward these goals thus far is primarily investigative. This report summarizes the current status
of the effort and recommendations for continued future work.
4 As promulgated under 40 CFR Part 51, Appendix W. The AERMOD modeling system is available to the
public at the EPA modeling website: http://www.epa.gov/scram001/dispersion prefrec.htm
6.2    Approach and Methodology
Meteorological data collected offshore that include even the minimum observed parameters
needed to compute the boundary layer parameters required by AERMOD are limited at best,
and non-existent in many remote areas where permit modeling will be necessary. This is
particularly true in the Arctic, due to extreme weather and the presence of seasonal sea ice. The
lack of readily available data presents a challenge for the analyses, testing, and validation
needed to ensure that screening data are representative of worst-case meteorological conditions
and will produce conservative predicted concentrations compared to refined modeling. Because
so few observed overwater meteorological data appropriate for characterizing the marine
boundary layer are available to guide the development of screening meteorology, Task 5 efforts
have depended heavily on meteorological data products generated under Task 3.
Task 3 focused on the evaluation of refined modeling techniques using prognostic data with
AERMOD in the Arctic waters off the coast of Alaska. Products developed under Task 3 include:
data from simulations produced with the WRF model (NCAR,  2014) extracted using MMIF
(Brashers & Emery, 2014); surface observations collected from buoys in the Chukchi and
Beaufort Seas; and mixing heights collected with a temperature profiler on Endeavor Island in
the Beaufort Sea. These data are assumed to be representative of meteorological conditions in
the Arctic.
Initial efforts to develop overwater screening meteorology have focused on analyses of these
Task 3 products and the development of exploratory sets of screening data intended to be
broadly applicable to the Arctic open-water (ice-free) environment. The knowledge and
experience gained while initially focusing on the Arctic will be transferable to other marine
environments, such as mid-latitude and tropical regions, in developing a robust system for
generating overwater screening data. This section summarizes the use of these data to develop
a set of exploratory screening meteorology for overwater applications in the Arctic.

6.3    Overview
The AERCOARE (USEPA, 2012) program discussed in previous tasks was developed as a
counterpart to AERMET for processing overwater meteorological data. AERCOARE utilizes the
COARE bulk air-sea flux algorithms5 to compute the boundary layer parameters required by
AERMOD. These algorithms use the air-sea temperature difference and other features of the
marine influence to characterize the boundary layer over open water and are more
representative than the land-based algorithms in AERMET and MAKEMET. AERCOARE has
been approved by EPA Region 10 as an acceptable alternative modeling approach for modeling
in the ice-free environments of the Beaufort and Chukchi Seas.

5 A technical description of version 3.0 of the COARE algorithm can be found at
ftp://ftp.etl.noaa.gov/users/cfairall/wcrp_wgsf/computer_program/cor3_0/ and
http://www.coaps.fsu.edu/COARE/flux_algor/.
To take advantage of the COARE algorithms, overwater screening data consisting of hourly
values for the minimum set of meteorological parameters required to run AERCOARE were
developed and formatted for input to AERCOARE. These overwater datasets were based on
analyses of the Task 3 data products and on additional Arctic buoy data obtained from the
National Oceanic and Atmospheric Administration's (NOAA's) National Data Buoy Center
(NDBC).6 The overwater hourly datasets were processed with AERCOARE, and the resultant
surface (SFC) and profile (PFL) files were input directly to AERMOD to model a set of point
sources representative of those typically found on a drill ship.
AERMOD results from simulations using the exploratory screening datasets were compared to
AERMOD results from simulations using: 1) buoy data collected in the Beaufort and Chukchi
Seas, supplemented with mixing heights derived from Endeavor Island profiler data and
processed with AERCOARE; 2) buoy data collected in the Beaufort and Chukchi Seas by Shell
Oil and prepared using the COARE algorithms prior to the development of AERCOARE; 3) data
extracted from WRF simulations using MMIF and processed with AERCOARE; and 4) WRF data
extracted with MMIF for direct input to AERMOD (bypassing AERCOARE).

6.4   Methods

6.4.1  AERCOARE input requirements
The meteorological input parameters required by AERCOARE vary based on options specified
by the user in the AERCOARE control file. AERCOARE was run using the default options and
settings listed in Table 4.

                    Table 4. Data Dependent AERCOARE Options.

     Option    Option and Setting Description                          Data Requirements
     mixopt    Mixing heights option: mixing heights are included      Hourly mixing heights
               in the overwater file
     jwarm     Warm-layer effects: warm-layer effects not included     None
     icool     Cool-skin effects: cool-skin effects not included       None
     jwave     Surface roughness: surface roughness treated as a       None
               function of friction velocity

6 http://www.ndbc.noaa.gov/
The minimum set of meteorological parameters required by AERCOARE was satisfied using a
combination of hourly values included in an overwater data file and single values specified in the
AERCOARE control file that were applied to each hour. Table 5 lists the specific AERCOARE
data requirements and indicates whether each requirement was satisfied using an hourly value
included on each record in the overwater data file or a single value specified in the control file.

                     Table 5. Task 3 AERCOARE Data Requirements.

     Parameter                              Format/Units            Hourly or      Single
                                                                    Single Value   Value
     Year                                   4-digits                Hourly
     Month                                  1-12                    Hourly
     Day                                    1-31                    Hourly
     Hour                                   1-24                    Hourly
     Latitude                               Decimal Degrees (°)     Single         72.0
     Longitude                              Decimal Degrees         Single         158.0
     Time Zone                              Integer                 Single         9
     Wind Speed                             Meters/Second (m/s)     Hourly
     Wind Direction                         Decimal Degrees         Hourly
     Sea Temperature                        Degrees Celsius (°C)    Hourly
     Ambient Air Temperature                Degrees Celsius         Hourly
     Relative Humidity                      %                       Hourly
     Relative Humidity Measurement Height   Meters (m)              Single         3.0
     Wind Measurement Height                Meters                  Single         3.5
     Air Temperature Measurement Height     Meters                  Single         3.0
     Sea Temperature Measurement Depth      Meters                  Single         1.2
     Mixing Height                          Meters                  Hourly
     Atmospheric Pressure                   Millibars (mb)          Single         1013.2
Table 6 lists additional parameters that are required in the AERCOARE control file and the
values that were specified when processing the exploratory overwater screening datasets. The
values are consistent with those used to process the WRF- and buoy-based data in Task 3.

        Table 6. Additional Required AERCOARE Input Parameters (Control File).

           Parameter                                            Values
           Mixing Height for COARE Gustiness (m)                     600.0
           Minimum Mixing Height (m)                               25.0
           Default Vertical Potential Temperature Gradient (degrees C/m)   0.01
           Minimum Absolute Value of Monin-Obukhov Length (m)         5.0
           Calm Wind Threshold (m/s)                               0.5
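A minimal sketch of assembling such an input set is shown below. The column order and file layout are illustrative only; the actual AERCOARE overwater file format is defined in the AERCOARE user's guide, and the single-valued items of Tables 5 and 6 belong in the control file rather than in the hourly file.

```python
import csv

# Sketch: write the hourly overwater records listed in Table 5 to a simple CSV
# file.  The column order and file layout are illustrative; the single-valued
# items (latitude, measurement heights, pressure, and the Table 6 options) are
# supplied in the AERCOARE control file rather than in this hourly file.
HOURLY_FIELDS = ["year", "month", "day", "hour", "wind_speed_ms", "wind_dir_deg",
                 "sea_temp_c", "air_temp_c", "rh_pct", "mix_ht_m"]

def write_overwater_file(path, hourly_records):
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=HOURLY_FIELDS)
        writer.writeheader()
        writer.writerows(hourly_records)

write_overwater_file("screening_overwater.csv", [
    {"year": 2011, "month": 8, "day": 1, "hour": 1, "wind_speed_ms": 5.0,
     "wind_dir_deg": 270.0, "sea_temp_c": 3.0, "air_temp_c": 1.0,
     "rh_pct": 95.0, "mix_ht_m": 200.0},
])
```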

6.4.2   AERCOARE overwater datasets
A dataset that is ideal for screening modeling should include a robust set of combinations of
meteorological parameter values that closely represents the full range of conditions potentially
observed at the application site, including worst-case dispersion conditions for a range of
representative emission source types and configurations.
The AERCOARE overwater datasets developed under Task 3 were reviewed to determine a
range of hourly values representative of the Arctic for the following meteorological parameters
required by AERCOARE (a brief combination sketch follows the list):
   •   Wind speed (Ws),
   •   Sea Temperature (Tsea),
   •   Ambient Air Temperature (Tair),
   •   Relative Humidity (RH), and
   •   Mixing Height (Mix Ht).
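As a simple illustration of how exploratory screening combinations can be assembled from such ranges, the sketch below enumerates a matrix from hypothetical minimum, intermediate, and maximum values; the actual exploratory values developed for Task 5 are those discussed later in this section.

```python
from itertools import product

# Sketch with hypothetical values: build a matrix of candidate screening hours
# from minimum, intermediate, and maximum values of the parameters reviewed
# above.  The actual exploratory values chosen under Task 5 are described in
# the remainder of this section.
wind_speed_ms   = [0.5, 2.0, 5.0, 10.0, 15.0]
air_sea_dt_c    = [-8.0, -3.0, 0.0, 3.0, 6.0]      # Tair - Tsea
mixing_height_m = [25.0, 100.0, 300.0, 1000.0]
rel_humidity    = [70.0, 100.0]

screening_matrix = [{"ws": ws, "dt": dt, "zi": zi, "rh": rh}
                    for ws, dt, zi, rh in product(wind_speed_ms, air_sea_dt_c,
                                                  mixing_height_m, rel_humidity)]
print(len(screening_matrix))   # 5 x 5 x 4 x 2 = 200 candidate screening hours
```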
Though not input directly into AERCOARE, the hourly air-sea temperature difference was also
assessed, since it is computed by AERCOARE and is a key input for computing the boundary
layer parameters. The air-sea temperature difference was also assessed from additional buoy
data obtained from the NDBC.
Under Task 3, overwater datasets formatted for input to AERCOARE were developed from
meteorological data collected at four overwater buoy sites during the ice-free periods of 2010-
2012. Mixing heights were derived using data from a land-based temperature profiler on
Endeavor Island. Two of the buoy sites, "Burger" and "Klondike," are located in the Chukchi Sea,
while "Sivulliq" and "Reindeer Island," along with the Endeavor Island profiler, are located in the
Beaufort Sea. The four buoy sites and the Endeavor Island profiler are shown in Figure 3.
In addition, data were extracted from WRF simulations for the grid points closest to each of the
four buoy sites (B2, B3, C1, and C2), as shown in Figure 3, for the corresponding ice-free
periods. Data were extracted using MMIF in a format suitable for input to AERCOARE. The
WRF-derived mixing heights were rediagnosed as described in Task 3, and both the original
WRF-derived mixing heights and the rediagnosed mixing heights were retained as separate
datasets. Thus, the AERCOARE overwater datasets reviewed from Task 3 include: 1) buoy-based
observations, 2) WRF with WRF-derived mixing heights, and 3) WRF with rediagnosed mixing
heights. Detailed descriptions of these datasets and the methods used in their development can
be found in the Task 3 report.
Table 7 lists the AERCOARE overwater datasets that were reviewed, the minimum and
maximum values for the parameters listed above, and the calculated air-sea temperature
difference. Table 8 lists the additional buoys that were reviewed for air-sea temperature
difference.
                                          56

-------
Table 7. Review of Task 3 AERCOARE Overwater Data.

[Table 7 lists, for each Task 3 AERCOARE overwater dataset reviewed (buoy observations and
WRF with WRF-derived mixing heights at locations B2, B3, C1, and C2 over the ice-free periods
of 2010-2012), the period covered and the minimum/maximum hourly values of wind speed (m/s),
sea temperature (C), ambient air temperature (C), air-sea temperature difference (C), relative
humidity (%), and mixing height (m).]
                      57

-------
Table 7, continued. Review of Task 3 AERCOARE Overwater Data.

[The continuation lists the same columns (period covered and minimum/maximum hourly values
of wind speed (m/s), sea temperature (C), ambient air temperature (C), air-sea temperature
difference (C), relative humidity (%), and mixing height (m)) for the WRF datasets with
rediagnosed mixing heights at locations B2, B3, C1, and C2 over the ice-free periods of
2010-2012.]
                           58

-------
      Table 8. Ambient Air and Sea Surface Temperatures at Selected Arctic Buoys.
 Buoy ID   Location       Year    Tair-Tsea (C)   Tair-Tsea (C)   Tair-Tsea (C)
                                     Minimum         Maximum         Median
 48012     Lay Point      2013        -6.4             1.5            -3.2
 48211     Camden Bay     2011        -7.0             4.4             0.4
 48211     Camden Bay     2012        -7.0             4.8             0.0
 48211     Camden Bay     2013        -6.2             8.4            -0.6
 48212     Harrison Bay   2011        -4.7             2.0            -0.8
 48212     Harrison Bay   2012        -4.7             2.8            -0.9
 48212     Harrison Bay   2013        -4.2             2.3            -0.6
 48213     Burger         2012        -4.2             2.3            -0.6
 48213     Burger         2013        -6.9             0.3            -3.0
 48214     Klondike       2012        -6.2             1.9            -2.6
 48214     Klondike       2013        -7.0             0.8            -2.9
 PRDA2     Prudhoe Bay    2008       -26.7            -3.8           -10.0
 PRDA2     Prudhoe Bay    2009       -12.5            14.2            -2.0
 PRDA2     Prudhoe Bay    2010       -14.3            13.9            -0.4
 PRDA2     Prudhoe Bay    2011        -3.9            11.7             2.4
 PRDA2     Prudhoe Bay    2012       -11.3             0.0             0.5
 PRDA2     Prudhoe Bay    2013        -7.8            16.5             1.6
Initial minimum, maximum, and intermediate exploratory values were defined for wind speed,
sea surface temperature, ambient air temperature, relative humidity, and mixing height. A
program was developed to generate a complete set of all possible combinations of values for
the five parameters. Each combination was then replicated for 36 wind directions (every 10
degrees from 0 to 350 degrees) and formatted as an hourly record in the overwater file for input
to AERCOARE. While it was understood that combining data in this way would produce
combinations of meteorological parameters that are likely inconsistent or unrealistic, the
approach was chosen for its simplicity as an investigative means of determining where
refinements are needed.
Table 9 lists the initial range of values, including intermediate values, for each of the
parameters. From here on, this initial screening dataset is referenced as the COARESCREEN1
                                         59

-------
dataset. It should be noted that while mixing heights less than 25 meters are common, a lower
limit of 25 meters is consistent with the limit set in the AERCOARE control files under Task 3.

          Table 9. Initial Meteorological Screening Values (COARESCREEN1).

           Parameter                   Values
           Wind speed (m/s)              0.5, 2.0, 7.0, 15.0
           Sea Temperature (C)           -2.0, 2.0, 8.0, 15.0
           Ambient Air Temperature (C)     -10.0, -2.0, 8.0, 15.0
           Relative Humidity (%)           50.0, 65.0, 85.0, 100.0
           Mixing Height (m)              25.0, 100.0, 300.0, 600.0, 1000.0

The screening values in Table 9, combined with 36 wind directions, yielded 46,080 hourly data
records, equivalent to 5.26 years of meteorological data.
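The combination-and-replication step can be illustrated with a minimal Python sketch. The value lists below are taken from Table 9, but the output layout (a simple comma-delimited file with hypothetical column names) is illustrative only and does not reproduce the actual AERCOARE overwater input format.

    # Minimal sketch: enumerate all combinations of the Table 9 screening values
    # and replicate each combination for 36 wind directions (0-350 deg by 10 deg).
    # The CSV layout and column names are illustrative; the real AERCOARE
    # overwater file uses its own fixed format.
    import csv
    import itertools

    wind_speeds  = [0.5, 2.0, 7.0, 15.0]                 # m/s
    sea_temps    = [-2.0, 2.0, 8.0, 15.0]                # deg C
    air_temps    = [-10.0, -2.0, 8.0, 15.0]              # deg C
    rel_humidity = [50.0, 65.0, 85.0, 100.0]             # %
    mixing_hts   = [25.0, 100.0, 300.0, 600.0, 1000.0]   # m
    wind_dirs    = range(0, 360, 10)                     # 36 directions

    with open("coarescreen1_draft.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ws", "wd", "tsea", "tair", "rh", "mixht"])
        n = 0
        for ws, tsea, tair, rh, zi in itertools.product(
                wind_speeds, sea_temps, air_temps, rel_humidity, mixing_hts):
            for wd in wind_dirs:
                writer.writerow([ws, wd, tsea, tair, rh, zi])
                n += 1

    print(n)  # 4*4*4*4*5*36 = 46,080 hourly records (about 5.26 years)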
Recognizing that the screening values in Table 9 generate duplicate values of the air-sea
temperature difference as well as improbable extreme conditions (e.g., an air-sea temperature
difference of -25 degrees Celsius combined with a mixing height of 1000 meters), an alternate
overwater dataset (COARESCREEN2) was generated that varied the ambient air temperature
while keeping the sea surface temperature at a constant 1.0 degree Celsius. The magnitude of
the air-sea temperature difference is an important parameter in AERCOARE, while the actual air
and sea temperature values are thought to be of less importance. In COARESCREEN2, the
range of the air-sea temperature difference was also reduced at both ends to avoid extreme
conditions rarely seen in the datasets that were reviewed, such as an air-sea temperature
difference of -25 degrees Celsius. The largest magnitude of the air-sea temperature difference
in the data evaluated for this task is 26.7 degrees Celsius; the air-sea temperature difference
reached -26.7 degrees Celsius in 2008 at the Prudhoe Bay buoy. For all other datasets that were
reviewed, the computed maximum and minimum air-sea temperature differences fell within the
range of -15 to +17  degrees Celsius (see Table 7 and Table 8). The ranges of screening values
for the COARESCREEN2 dataset are listed in Table 10. These changes yielded 25,920 hourly
records, the equivalent of about 3 years of meteorological  data due to the removal of duplication
in the computed air-sea temperature difference.
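(With the single sea temperature, the COARESCREEN2 combinations work out to 4 wind speeds
x 9 air temperatures x 4 relative humidities x 5 mixing heights x 36 wind directions = 25,920
hourly records.)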
                                          60

-------
      Table 10. Second Set of Meteorological Screening Values (COARESCREEN2).

           Parameter                   Values
           Wind speed (m/s)              0.5, 2.0, 7.0, 15.0
           Sea Temperature (C)           1.0
           Ambient Air Temperature (C)     -9.0, -6.0, -3.0, -1.0, 1.0, 3.0, 5.0, 8.0, 11.0
           Relative Humidity (%)          50.0, 65.0, 85.0, 100.0
           Mixing Height (m)              25.0, 100.0, 300.0, 600.0, 1000.0

A third variation (COARESCREEN3) was generated that increased the resolution of the wind
speeds, slightly reduced the range in air-sea temperature difference, reduced the range in the
relative humidity, and increased the maximum mixing height. The ranges of the screening
values for the COARESCREEN3 dataset are listed in Table 11. These combinations yielded a
total of 42,336 hourly records, equivalent to about 4.8 years of meteorological data.
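(These totals follow from 7 wind speeds x 7 air temperatures x 3 relative humidities x 8 mixing
heights x 36 wind directions = 42,336 hourly records.)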

       Table 11. Third Set of Meteorological Screening Values (COARESCREEN3).

        Parameter                   Values
        Wind speed (m/s)              0.5, 1.0, 1.5, 2.0, 3.0, 7.0, 12.0
        Sea Temperature (C)           1.0
        Ambient Air Temperature (C)     -6.0, -3.0, -1.0, 1.0, 3.0, 5.0, 8.0
        Relative Humidity (%)           60.0, 85.0, 100.0
        Mixing Height (m)              25.0, 50.0, 75.0, 100.0, 300.0, 500.0, 900.0, 1500.0

6.4.3   Emission sources
The hypothetical drill ship emission sources and their configurations were provided by EPA
Region 10 and are the same sources used in the Task 3 evaluation. They include 11 uniquely
configured stacks from four sources. Each of the 11 stacks was modeled ignoring the effects of
building downwash from the presence of the drill ship. The configurations of three of the 11
stacks (source 2) were duplicated and modeled with AERMOD's PRIME downwash algorithm
applied. The different source groups, stacks, and their respective  configurations are provided
in Table 3.
                                          61

-------
The locations of the emission points for this task deviate from those used in the Task 3
analyses, in which all emission points were collocated at the center of the drill ship. For
screening modeling purposes, to simulate a permit modeling demonstration, the emission points
were grouped and collocated at three different locations on the drill ship. The footprint of the drill
ship, building tiers, and the locations of the emission points, identified by source ID, are shown
in Figure 24. The building parameters for emission points S5P1, S5P2, and S5P3 were
developed with the Building Profile Input Program for PRIME (BPIPPRM) (USEPA, 1993).

         Figure 24. Hypothetical Drill Ship Layout and Emission Source Locations.

6.4.4   Receptor grid
Preliminary AERMOD modeling was performed with both a coarse Cartesian receptor grid and
the polar receptor grid used in the Task 3 analyses to identify the approximate distance to the
maximum 1-hour predicted ground-level concentration for the individual emission points
defined in Table 3. The maximum  1-hour concentration for each of the emission sources
occurred within 2000 meters from  the center of the drill ship. To simulate a generic screening
modeling analysis using AERMOD in which the maximum predicted 1-hour concentration should
be determined, a nested Cartesian receptor grid was developed. The edge of the vessel was
considered to be the ambient air boundary. Receptors were also placed along the edge of the
vessel and  omitted from within the footprint of the vessel. Figure 25 shows the layout of the
nested Cartesian receptor grid and the receptors along the edge of the ship. Receptor spacing is
as follows (a generation sketch follows Figure 25):
   •   25 meter spacing along the edge of the ship,
   •   25 meter spacing from the  ship's edge  out to 500 meters,
   •   50 meter spacing from 500 meters out to 1500 meters,  and
                                          62

-------
   •   100 meter spacing from 1500 meters out to 2100 meters.
                     Figure 25. Screening Modeling Receptor Grid.
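A minimal sketch of one way to generate such a nested grid is shown below. The rectangular ship footprint half-dimensions and the square outer extents are illustrative assumptions (they are not specified in this section), and the sketch measures spacing from the ship center rather than from the ship's edge and omits the receptors placed along the edge.

    # Sketch: build a nested Cartesian receptor grid centered on the drill ship,
    # with 25 m spacing out to 500 m, 50 m out to 1500 m, and 100 m out to 2100 m,
    # excluding receptors that fall inside the (assumed rectangular) ship footprint.
    # The footprint half-dimensions below are hypothetical placeholders.
    import numpy as np

    HALF_X, HALF_Y = 120.0, 40.0   # assumed ship half-length / half-width (m)

    def ring_coords(spacing, extent):
        """1-D coordinates at the given spacing from -extent to +extent."""
        return np.arange(-extent, extent + spacing / 2.0, spacing)

    def nested_grid():
        receptors = set()
        for spacing, extent in [(25.0, 500.0), (50.0, 1500.0), (100.0, 2100.0)]:
            coords = ring_coords(spacing, extent)
            for x in coords:
                for y in coords:
                    # drop receptors inside the ship footprint (ambient air boundary)
                    if abs(x) <= HALF_X and abs(y) <= HALF_Y:
                        continue
                    receptors.add((round(float(x), 1), round(float(y), 1)))
        return sorted(receptors)

    grid = nested_grid()
    print(len(grid), "receptors")

Because the 50-meter coordinates inside 500 meters and the 100-meter coordinates inside 1500 meters coincide with points of the finer grids, the set union reproduces the nested spacing without duplicate receptors.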


6.4.5  AERMOD simulations
The 14 emission points defined in Table 3 were modeled with AERMOD using the exploratory
screening meteorological datasets (COARESCREEN1, COARESCREEN2, and
COARESCREEN3) derived with AERCOARE. For comparison, the emission points were also
modeled with a subset of the AERMOD-ready meteorological datasets developed under Task 3
and with AERMOD-ready data provided by EPA Region 10 that were developed by Shell Oil
(Shell) and used, with EPA's approval, to support permit applications for offshore exploratory
drilling. The files
developed by Shell were derived from buoy data in the Chukchi and Beaufort Seas using the
COARE algorithms prior to the development of AERCOARE.
AERMOD-ready files developed under Task 3 include meteorological datasets developed from
buoy observations in the Beaufort and Chukchi Seas and mixing  heights from a temperature
profiler located on Endeavor Island (see Figure 3). Observation data were prepared for
AERMOD using AERCOARE, as discussed previously. AERMOD-ready datasets extracted
from WRF simulations for the same buoy locations using MMIF were also developed under
Task 3. WRF-based data include data extracted for direct input into AERMOD and data extracted
and subsequently processed with AERCOARE. In both categories of datasets, those
that bypassed AERCOARE and those processed with AERCOARE,  a copy of the data was
generated in which the WRF-derived mixing heights were retained and a second copy was
                                        63

-------
developed in which the mixing heights were rediagnosed. Additional details about AERMOD-
ready datasets developed under Task 3 can be found in Volume 3.
Each of the uniquely configured emission release points was defined as a separate source
group in the AERMOD control file to obtain a maximum 1-hour predicted concentration specific
to each configuration. Emission points were also combined into a single source group (ALL) to
obtain a 1-hour maximum concentration for the combined set of emission points. Each
configuration was modeled using a unit emission rate of 1.0 gram/second. Source and receptor
elevations and receptor height scales were set to 0.0 meters to indicate flat terrain.  Individual
years were modeled and limited to a time span that falls within the ice-free period.
The meteorological data for which AERMOD simulations were  performed, including screening
meteorology, are identified and described in Table 12. The temporal period that was modeled
for each dataset is also specified.
                                          64

-------
Table 12. AERMOD Simulations Identified By Meteorological Data Set.
Dataset                   Period               Description
COARESCREEN1              -                    Initial screening dataset
COARESCREEN2              -                    Alternate screening dataset
COARESCREEN3              -                    Second alternate screening dataset
09-Beaufort-Shell         8/5 - 10/12/2009     Shell Oil dataset, Beaufort Sea buoy (approved by EPA R10)
10-Beaufort-Shell         8/14 - 10/10/2010    Shell Oil dataset, Beaufort Sea buoy (approved by EPA R10)
09-Chukchi-Shell          8/5 - 10/12/2009     Shell Oil dataset, Chukchi Sea buoy (approved by EPA R10)
10-Chukchi-Shell          7/27 - 10/17/2010    Shell Oil dataset, Chukchi Sea buoy (approved by EPA R10)
10-WRF-AERCOARE-B2-F      7/28 - 10/15/2010    WRF, WRF mixing heights, location B2, MMIF to AERCOARE
10-WRF-AERCOARE-B2-T      7/28 - 10/15/2010    WRF, location B2, MMIF to AERCOARE, rediag. mixing heights
11-WRF-AERCOARE-B3-F      7/28 - 10/15/2011    WRF, WRF mixing heights, location B3, MMIF to AERCOARE
11-WRF-AERCOARE-B3-T      7/28 - 10/15/2011    WRF, location B3, MMIF to AERCOARE, rediag. mixing heights
12-WRF-AERCOARE-C1-F      7/27 - 10/15/2012    WRF, WRF mixing heights, location C1, MMIF to AERCOARE
12-WRF-AERCOARE-C1-T      7/27 - 10/15/2012    WRF, location C1, MMIF to AERCOARE, rediag. mixing heights
10-WRF-AERMOD-B2-F        7/28 - 10/15/2010    WRF, WRF mixing heights, location B2, MMIF to AERMOD
10-WRF-AERMOD-B2-T        7/28 - 10/15/2010    WRF, location B2, MMIF to AERMOD, rediag. mixing heights
11-WRF-AERMOD-B3-F        7/28 - 10/15/2011    WRF, WRF mixing heights, location B3, MMIF to AERMOD
11-WRF-AERMOD-B3-T        7/28 - 10/15/2011    WRF, location B3, MMIF to AERMOD, rediag. mixing heights
12-WRF-AERMOD-C1-F        7/27 - 10/15/2012    WRF, WRF mixing heights, location C1, MMIF to AERMOD
12-WRF-AERMOD-C1-T        7/27 - 10/15/2012    WRF, location C1, MMIF to AERMOD, rediag. mixing heights
10-OBS-B2                 8/18 - 9/24/2010     Buoy observations, location B2, processed with AERCOARE
11-OBS-B2                 7/30 - 9/19/2011     Buoy observations, location B2, processed with AERCOARE
10-OBS-B3                 8/14 - 10/10/2010    Buoy observations, location B3, processed with AERCOARE
11-OBS-B3                 8/2 - 10/25/2011     Buoy observations, location B3, processed with AERCOARE
12-OBS-B3                 8/31 - 10/6/2012     Buoy observations, location B3, processed with AERCOARE
11-OBS-C1                 8/2 - 10/7/2011      Buoy observations, location C1, processed with AERCOARE
12-OBS-C1                 9/1 - 10/10/2012     Buoy observations, location C1, processed with AERCOARE
10-OBS-C2                 7/27 - 10/3/2010     Buoy observations, location C2, processed with AERCOARE
12-OBS-C2                 9/1 - 10/6/2012      Buoy observations, location C2, processed with AERCOARE
                              65

-------
6.4.6  AERSCREEN simulations

Each of the 14 hypothetical drill ship emission release points listed in Table 3 was modeled
individually using AERSCREEN for comparison with the results using the exploratory screening
datasets and the AERMOD-ready meteorological datasets developed under Task 3. In addition
to the source specific parameters (Table 3) supplied to AERSCREEN, Table 13 lists input
parameters required by AERSCREEN and the respective values used for all emission points
that were modeled. It should be noted that the BPIPPRM control files used to develop the
downwash parameters for emission points S5P1, S5P2, and S5P3 were supplied to AERSCREEN
in order to apply downwash effects for those sources.

                        Table 13. AERSCREEN Input Values.

        Parameter                           Value

        Minimum Ambient Air Temperature         248.0 degrees K, -25.5 degrees C
        Minimum Wind Speed                   0.5 meters/second
        Anemometer Height                    3.5 meters
        Albedo                              0.06
        Bowen Ratio                          1.0
        Surface Roughness Length              0.001 meters
        Distance to Ambient Air                 8.5 meters
        Probe Distance                        5000 meters
6.5   Results
A primary goal of a screening method for overwater dispersion modeling is a meteorological
dataset that will produce conservative results compared to refined modeling for a broad range of
conditions and source configurations (i.e., predicted concentrations using the screening method
should be greater than the predicted concentrations using refined modeling). The frequency and
degree of conservatism of the three exploratory datasets (COARESCREEN1,
COARESCREEN2, and COARESCREEN3) were evaluated against refined modeling results for
25 of the AERMOD-ready datasets developed under Task 3 (see Section 0). A comparison of
AERSCREEN results to the COARESCREEN and refined model results was also performed to
evaluate the performance of the existing AERSCREEN as an overwater screening method. The
highest predicted concentration (H1H) of the 1-hour averaging period for each emission release
point from each model run was used as the basis for the evaluation.
A comparison of predicted 1-hour H1H concentrations across the three exploratory
COARESCREEN datasets, AERSCREEN, and the refined model runs shows mixed results.
Table 14 is a summary of the percentage of the 25 refined model runs in which the screening
model produced conservative results compared to the refined model. Detailed comparisons of
                                        66

-------
    the screening results to the refined model results for each Source ID in Table 3 are provided in
    Table 16 through Table 29 at the end of this section. Table 30 presents the results for all
    release points combined.  Each table compares the ratio of the 1-hour H1H concentration for
    each of the screening model runs to that of each of the 25 refined model runs.
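
    The conservatism statistic in Table 14 can be expressed compactly: for each release point, it
    is the percentage of refined model runs whose 1-hour H1H concentration is matched or
    exceeded by the screening H1H (the treatment of exact ties is an assumption here). A minimal
    sketch, assuming the H1H values have already been extracted into plain Python lists, follows.

    # Sketch: percent of refined model runs for which a screening result is
    # conservative (screening 1-hour H1H >= refined 1-hour H1H) and the
    # screening-to-refined ratios. H1H inputs are assumed to be plain floats
    # already pulled from the AERMOD/AERSCREEN output files.

    def conservatism(screening_h1h, refined_h1h_list):
        """Return (percent conservative, list of screening-to-refined ratios)."""
        ratios = [screening_h1h / r for r in refined_h1h_list]
        pct = 100.0 * sum(1 for r in ratios if r >= 1.0) / len(ratios)
        return pct, ratios

    # Example with the S1P1 values from Table 16 (COARESCREEN1 vs. two refined runs):
    pct, ratios = conservatism(30.36, [15.71, 14.48])
    print(round(pct), [round(r, 2) for r in ratios])   # 100 [1.93, 2.1]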

          Table 14. Percent of Model Runs Where Screening Results Are Conservative
                           Compared to Refined Modeling Results.

    Screening       S1P1  S1P2  S2P1  S2P2  S2P3  S3P1  S3P2  S3P3  S4P1  S4P2  S4P3  S5P1  S5P2  S5P3   ALL
    Method
    AERSCREEN         56    72    84    72    44    72    72    84   100    68    68     0     0   100     -
    COARESCREEN1     100   100   100   100   100   100   100    96   100   100   100   100    24    72    96
    COARESCREEN2     100    96    80    72   100    64    72    92    88    56    60    76     0    32    96
    COARESCREEN3      92    96    36    32   100    60    48    96    88    56    48    76   100   100    96

    (Release points S5P1, S5P2, and S5P3 are modeled with building downwash.)

Results are based on comparing the highest predicted 1-hour concentration from the screening model run to the
highest predicted 1-hour concentration from each of the 25 refined model runs. Because AERSCREEN is limited to modeling a
single emission source, it was not possible to model all sources as a combined source group ALL using AERSCREEN.

    Table 14 shows that, as a screening method for overwater Arctic applications, AERSCREEN was
    conservative for a majority of the model runs. It was conservative in more than 50% of the model
    runs for all but one release point, S2P3. However, AERSCREEN was conservative in 100% of the
    model runs for only two sources, S4P1 and S5P3. Conversely, the COARESCREEN1 dataset
    produced conservative results 100% of the time for all but three sources (S3P3, S5P2, and
    S5P3), two of which are sources to which building downwash was applied. The results of the
    COARESCREEN2 dataset are similar to AERSCREEN: generally conservative for a majority of
    the model runs but conservative 100% of the time for only two sources (S1P1 and S2P3).
    COARESCREEN3 performed even more poorly, with the exception of the downwash sources, for
    which it was conservative 100% of the time for S5P2 and S5P3.
    Table 15 is a summary of the ratio of the 1-hour H1H concentrations of each of the
    COARESCREEN  datasets to AERSCREEN for each release point configuration. The
    COARESCREEN1 dataset yielded a higher concentration than AERSCREEN for all but one
    release point while the COARESCREEN2 and COARESCREEN3 datasets yielded higher
    concentrations than AERSCREEN for just over half of the sources.
                                             67

-------
     Table 15. Ratio of COARESCREEN-to-AERSCREEN 1-hour H1H Concentrations.
 Release Point    Ratio: COARESCREEN1-    Ratio: COARESCREEN2-    Ratio: COARESCREEN3-
                  to-AERSCREEN            to-AERSCREEN            to-AERSCREEN
 S1P1                   1.91                    1.28                    1.06
 S1P2                   2.18                    1.14                    1.09
 S2P1                   2.05                    0.98*                   0.88*
 S2P2                   2.37                    1.03                    0.91*
 S2P3                   1.94                    1.40                    1.34
 S3P1                   2.06                    0.88*                   0.83*
 S3P2                   2.41                    1.03                    0.93*
 S3P3                   1.81                    1.06                    1.16
 S4P1                   1.49                    0.75*                   0.74*
 S4P2                   2.26                    0.96*                   0.93*
 S4P3                   2.24                    0.96*                   0.90*
 S5P1                   1.78                    1.51                    1.49
 S5P2                   1.04                    0.99*                   1.57
 S5P3                   0.65*                   0.61*                   0.71*

            * Ratio less than or equal to 1.0, for which the AERSCREEN 1-hour H1H
            concentration is greater than the COARESCREEN 1-hour H1H concentration.

To summarize, while the COARESCREEN1 dataset appears to perform well with regard to the
criterion of conservatism over refined modeling, it should be noted that the COARESCREEN1
dataset includes data combinations that are likely unrealistic for the Arctic environment. There is
a significant degradation in the performance of the COARESCREEN2 and COARESCREEN3
datasets. While these datasets also include improbable combinations of meteorological
parameters, such combinations were thought to be reduced by the changes made to the air-sea
temperature difference, which is a critical parameter computed and used by AERCOARE to
characterize the boundary layer. More evaluation is needed to discern realistic worst-case
conditions in overwater environments to ensure the screening data are representative of those
conditions and concentrations are not the result of improbable or impossible meteorological
conditions.
                                         68

-------
6.6    Recommendations for Future Work
As noted previously, additional investigation and evaluation of the screening and refined model
results are needed. Because of the simple manner in which they were compiled, the exploratory
COARESCREEN datasets contain meteorological conditions that are unrealistic and others that
are likely improbable in the Arctic. Similarly, there may be worst-case conditions for dispersion
that are not well represented in the screening datasets. Each of the COARESCREEN datasets
is shown to have strengths and weaknesses for specific release point configurations. While
the range of each meteorological parameter with regard to the minimum and maximum values
should adequately represent the ice-free Arctic based on the review of the Task 3 data
products, the resolution of the intermediate values may not be adequate to represent worst-case
conditions for all source types and configurations evaluated to this point. This is dependent on
the sensitivity of a given parameter and its relationship to other parameters in characterizing the
boundary layer. The resolution, that is, the number of values and the incremental difference
between them, used for a parameter to which AERCOARE or AERMOD is sensitive will need to
be finer than for parameters to which the models are less sensitive. A finer resolution in the
intermediate values results
in a greater number of steps between minimum  and maximum boundary points, which in turn
yields a larger dataset when all possible combinations  are compiled. Thus, care needs to be
given to balancing the need for a higher resolution in the intermediate values against keeping
the number of hourly records to a manageable size for screening modeling.
To assess these potential issues, the hourly meteorological data corresponding with the 1-hour
H1H concentrations for each release point configuration should be extracted, reviewed, and
compared for all refined and screening model runs. For those cases in which the H1H
concentration from the refined model run is higher than that from the screening modeling, it
should be determined whether any of the COARESCREEN datasets contains an hourly
meteorological record
that closely imitates the meteorology used in the refined model run. If the meteorology is similar
but predicted concentrations are significantly different,  this would suggest a sensitivity to one or
more meteorological parameters. The resolution of one or more of the meteorological
parameters input to AERCOARE may be too coarse to represent conditions that produce similar
results. If the screening datasets do  not contain comparable meteorology, adjustments will  be
needed to incorporate representative conditions in a final COARESCREEN dataset.
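A minimal sketch of the extraction step is shown below, assuming the hourly predicted concentrations and the hourly meteorology have already been read into simple Python structures; the report does not prescribe a file format for this step, so the data shapes here are illustrative only.

    # Sketch: find the hour that produced the 1-hour H1H concentration for a
    # release point and pull the matching hourly meteorological record.
    # Inputs are assumed to be plain Python dictionaries keyed by timestamp.

    def h1h_meteorology(hourly_conc, hourly_met):
        """hourly_conc: {timestamp: concentration}; hourly_met: {timestamp: met values}."""
        peak_hour = max(hourly_conc, key=hourly_conc.get)
        return peak_hour, hourly_conc[peak_hour], hourly_met.get(peak_hour)

    # Example with toy values (illustrative only):
    conc = {"2010-08-18T14": 12.3, "2010-08-18T15": 16.1, "2010-08-18T16": 9.8}
    met = {"2010-08-18T15": {"ws": 2.0, "dT": -1.5, "mixht": 50.0}}
    print(h1h_meteorology(conc, met))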
For those cases in which the screening results are significantly higher than the  refined model
runs, the screening meteorology should be evaluated to determine if the conditions represented
are realistic. Unrealistic and improbable conditions should be omitted from the final
COARESCREEN dataset. A categorical data analysis of the Task 3 data products should be
performed to identify combinations of screening values that are  most likely not representative of
Arctic conditions. These levels of review of the meteorology have been initiated but  not yet
completed.
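One simple form such a categorical analysis could take is to bin the Task 3 hourly records by a few key parameters and flag screening combinations that fall in bins never observed. The sketch below illustrates the idea; the column names and bin edges are assumptions for illustration and are not taken from the Task 3 data products.

    # Sketch: flag COARESCREEN combinations whose (wind speed, air-sea temperature
    # difference, mixing height) category was never observed in the Task 3 data.
    # Column names ("ws", "tair", "tsea", "mixht") and bin edges are assumptions.
    import pandas as pd

    ws_bins = [0, 1, 2, 4, 8, 20]            # m/s
    dt_bins = [-15, -8, -4, -1, 1, 4, 17]    # deg C (Tair - Tsea)
    mh_bins = [0, 50, 100, 300, 600, 2000]   # m

    def categorize(df):
        """Assign each hourly record to a (wind speed, delta-T, mixing height) category."""
        return pd.DataFrame({
            "ws_cat": pd.cut(df["ws"], ws_bins),
            "dt_cat": pd.cut(df["tair"] - df["tsea"], dt_bins),
            "mh_cat": pd.cut(df["mixht"], mh_bins),
        })

    def flag_unobserved(screen_df, task3_df):
        """Return a list of booleans marking screening rows never seen in the Task 3 data."""
        observed = set(map(tuple, categorize(task3_df).itertuples(index=False)))
        cats = categorize(screen_df)
        return [tuple(row) not in observed for row in cats.itertuples(index=False)]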
Finally, an evaluation of scaling parameters for the 3-, 8-, 24-hr, and annual averaging periods
is needed to make this a  viable option for screening modeling in the Arctic.
                                          69

-------
Table 16. Comparison of Screening and Refined Modeling for Release Point S1P1
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
-
Period
-
-
-
-
08/05 - 10/12
08/14-10/10
08/05 - 10/12
-
B2
07/27 - 10/17
07/28 - 10/15
B2 07/28 - 10/15
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
B3 08/14-10/10
B3
B3
Cl
Cl
C2
C2
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
-
-
F
T
F
j
F
T
p
T
F
T
F
T
-
-
-
-
-
-
-
_
1-hr HIM
(Lig/m3)
15.86
30.36
20.28
16.86
15.71
14.48
15.90
16.14
15.43
15.56
15.73
16.20
14.70
15.31
15.69
16.20
16.44
16.95
14.95
15.56
17.05
16.10
16.21
14.93
13.99
16.14
15.80
16.28
15.03
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
1.01
1.10
1.00
0.98
1.03
1.02
1.01
0.98
1.08
1.04
1.01
0.98
0.96
0.94
1.06
1.02
0.93
0.99
0.98
1.06
1.13
0.98
1.00
0.97
1.06
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
1.93
2.10
1.91
1.88
1.97
1.95
1.93
1.87
2.06
1.98
1.94
1.87
1.85
1.79
2.03
1.95
1.78
1.89
1.87
2.03
2.17
1.88
1.92
1.87
2.02
Ratio: Ratio:
COARESCREEN2- COARESCREEN3-
to-Refined to-Refined
-
-
-
-
1.29
1.40
1.28
1.26
1.31
1.30
1.29
1.25
1.38
1.32
1.29
-
-
-
-
1.07
1.16
1.06
1.04
1.09
1.08
1.07
1.04
1.15
1.10
1.07
1.25 1.04
1.23
1.20
1.36
1.30
1.19
1.26
1.25
1.36
1.45
1.26
1.28
1.25
1.35
1.03
0.99
1.13
1.08
0.99
1.05
1.04
1.13
1.20
1.04
1.07
1.04
1.12
                                  70

-------
Table 17. Comparison of Screening and Refined Modeling for Release Point S1P2
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
—tr~
Period
-
	 ; 	
08/05 - 10/12
08/14-10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
B2 07/28 - 10/15
B3
B3
07/28 - 10/15
07/28 - 10/15
Cl 07/27 - 10/15
Cl
B2
07/27 - 10/15
07/28 - 10/15
B2 07/28 - 10/15
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14-10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
C2 09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
F
T
F
T
F
T
F
T
F
j
F
T
-
-
-
-
-
-
-
1-hr HIM
(Lig/m3)
15.86
30.36
20.28
16.86
15.71
14.48
15.90
16.14
15.43
15.56
15.73
16.20
14.70
15.31
15.69
16.20
16.44
16.95
14.95
15.56
17.05
16.10
16.21
14.93
13.99
16.14
15.80
16.28
15.03
Ratio:
AERSCREEN-
to- Refined
-
-
1.01
1.10
1.00
0.98
1.03
1.02
1.01
0.98
1.08
1.04
1.01
0.98
0.96
0.94
1.06
1.02
0.93
0.99
0.98
1.06
1.13
0.98
1.00
0.97
1.06
Ratio: Ratio:
COARESCREEN1- COARESCREEN2-
to-Refined to-Refined
-
	 ; 	
1.93
2.10
1.91
1.88
1.97
1.95
1.93
1.87
2.06
1.98
1.94
1.87
1.85
1.79
2.03
1.95
1.78
1.89
1.87
2.03
2.17
1.88
1.92
1.87
2.02
-
	 ; 	
1.29
1.40
1.28
1.26
1.31
1.30
1.29
1.25
1.38
1.32
1.29
1.25
1.23
1.20
1.36
1.30
1.19
1.26
1.25
1.36
1.45
1.26
1.28
1.25
1.35
Ratio:
COARESCREEN3-
to- Refined
-
-
1.07
1.16
1.06
1.04
1.09
1.08
1.07
1.04
1.15
1.10
1.07
1.04
1.03
0.99
1.13
1.08
0.99
1.05
1.04
1.13
1.20
1.04
1.07
1.04
1.12
                                  71

-------
Table 18. Comparison of Screening and Refined Modeling for Release Point S2P1
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09- Beaufort-Shell
10- Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
Location
-
-
-
-
-
B2
B2
B3
B3
2012 Cl
2012 Cl
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
B2
B2
Period
-
-
-
Recalc.
Mix Hts
-
-
-
-
08/05 - 10/12
08/14-10/10
08/05 - 10/12
07/27 - 10/17
07/28-10/15
07/28-10/15
07/28-10/15
07/28-10/15
-
F
T
F
T
07/27 - 10/15 F
07/27 - 10/15
07/28-10/15
07/28-10/15
B3 07/28-10/15
B3 07/28-10/15
Cl 07/27 - 10/15
Cl 07/27 - 10/15
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
08/18 - 09/24
07/30-09/18
08/14-10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
T
F
T
F
T
F
T
-
-
-
1-hr HIM
(lig/m3)
19.23
39.47
18.75
16.83
20.67
17.28
19.80
17.76
17.75
19.48
17.83
18.69
15.37
15.56
16.48
19.20
17.93
23.34
16.06
16.25
17.05
18.60
17.27
16.59
15.77
16.92
16.76
17.16
15.68
Ratio:
AERSCREEN-
to-Refined
-
-
-
0.93
1.11
0.97
1.08
1.08
0.99
1.08
1.03
1.25
1.24
1.17
1.00
1.07
0.82
1.20
1.18
1.13
1.03
1.11
1.16
1.22
1.14
1.15
1.12
1.23
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
1.91
2.28
1.99
2.22
2.22
2.03
2.21
2.11
2.57
2.54
2.40
2.06
2.20
1.69
2.46
2.43
2.31
2.12
2.29
2.38
2.50
2.33
2.35
2.30
2.52
Ratio:
COARESCREEN2-
to- Refined
-
-
-
0.91
1.08
0.95
1.06
1.06
0.96
1.05
1.00
1.22
1.21
1.14
0.98
1.05
0.80
1.17
1.15
1.10
1.01
1.09
1.13
1.19
1.11
1.12
1.09
1.20
Ratio:
COARESCREEN3-
to-Refined
-
-
-
0.81
0.97
0.85
0.95
0.95
0.86
0.94
0.90
1.10
1.08
1.02
0.88
0.94
0.72
1.05
1.04
0.99
0.91
0.97
1.01
1.07
1.00
1.00
0.98
1.07
                                  72

-------
Table 19. Comparison of Screening and Refined Modeling for Release Point S2P2
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
-
F
T
F
T
F
T
F
T
F
T
F
T
-
-
-
-
-
-
-
-
-
1-hr HIM
(lig/m3)
35.46
83.90
36.35
32.20
39.43
36.99
37.89
32.10
34.05
40.40
32.61
35.12
30.55
33.36
31.15
36.79
30.37
50.54
28.81
32.99
39.68
34.58
31.05
34.04
28.73
34.63
29.61
32.36
33.17
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
0.90
0.96
0.94
1.10
1.04
0.88
1.09
1.01
1.16
1.06
1.14
0.96
1.17
0.70
1.23
1.07
0.89
1.03
1.14
1.04
1.23
1.02
1.20
1.10
1.07
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
2.13
2.27
2.21
2.61
2.46
2.08
2.57
2.39
2.75
2.52
2.69
2.28
2.76
1.66
2.91
2.54
2.11
2.43
2.70
2.46
2.92
2.42
2.83
2.59
2.53
Ratio:
COARESCREEN2-
to- Refined
-
-
-
-
0.92
0.98
0.96
1.13
1.07
0.90
1.11
1.04
1.19
1.09
1.17
0.99
1.20
0.72
1.26
1.10
0.92
1.05
1.17
1.07
1.27
1.05
1.23
1.12
1.10
Ratio:
COARESCREEN3-
to-Refined
-
-
-
-
0.82
0.87
0.85
1.00
0.95
0.80
0.99
0.92
1.05
0.97
1.03
0.88
1.06
0.64
1.12
0.98
0.81
0.93
1.04
0.95
1.12
0.93
1.09
1.00
0.97
                                  73

-------
Table 20. Comparison of Screening and Refined Modeling for Release Point S2P3
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
10-OBS-B2 2010
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
F
T
F
T
F
T
F
T
F
T
F
T
1-hr HIM
(lig/m3)
52.82
102.23
74.11
70.96
62.66
56.29
51.49
52.80
52.46
54.96
52.01
52.91
53.16
53.69
51.71
52.72
51.56
53.21
52.35
54.29
57.12
60.88
55.94
53.11
51.64
52.36
53.46
53.90
51.51
Ratio:
AERSCREEN-
to- Refined
-
-
-
0.84
0.94
1.03
1.00
1.01
0.96
1.02
1.00
0.99
0.98
1.02
1.00
1.02
0.99
1.01
0.97
0.92
0.87
0.94
0.99
1.02
1.01
0.99
0.98
1.03
Ratio: Ratio: Ratio:
COARESCREEN1- COARESCREEN2- COARESCREEN3-
to-Refined to-Refined to-Refined
-
-
-
1.63
1.82
1.99
1.94
1.95
1.86
1.97
1.93
1.92
1.90
1.98
1.94
1.98
1.92
1.95
1.88
1.79
1.68
1.83
1.93
1.98
1.95
1.91
1.90
1.98
-
-
-
1.18
1.32
1.44
1.40
1.41
1.35
1.42
1.40
1.39
1.38
1.43
1.41
1.44
1.39
1.42
1.36
1.30
1.22
1.32
1.40
1.44
1.42
1.39
1.37
1.44
-
-
-
-
1.13
1.26
1.38
1.34
1.35
1.29
1.36
1.34
1.33
1.32
1.37
1.35
1.38
1.33
1.36
1.31
1.24
1.17
1.27
1.34
1.37
1.36
1.33
1.32
1.38
                                  74

-------
Table 21. Comparison of Screening and Refined Modeling for Release Point S3P1
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
10-OBS-B2 2010
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
F
T
F
T
F
T
F
T
F
T
F
T
1-hr HIM
(lig/m3)
9.30
19.11
8.16
7.69
10.90
9.20
9.50
9.40
7.64
10.18
8.02
9.41
7.17
7.20
7.50
10.01
8.24
9.65
7.48
7.39
7.54
7.60
7.44
7.36
6.52
7.42
7.37
7.53
6.96
Ratio:
AERSCREEN-
to- Refined
-
-
-
0.85
1.01
0.98
0.99
1.22
0.91
1.16
0.99
1.30
1.29
1.24
0.93
1.13
0.96
1.24
1.26
1.23
1.22
1.25
1.26
1.43
1.25
1.26
1.24
1.34
Ratio: Ratio: Ratio:
COARESCREEN1- COARESCREEN2- COARESCREEN3-
to-Refined to-Refined to-Refined
-
-
-
-
1.75
2.08
2.01
2.03
2.50
1.88
2.38
2.03
2.67
2.65
2.55
1.91
2.32
1.98
2.56
2.59
2.53
2.51
2.57
2.60
2.93
-
-
-
0.75
0.89
0.86
0.87
1.07
0.80
^^ 1.02
0.87
1.14
1.13
1.09
0.82
0.99
0.85
1.09
1.10
1.08
1.07
1.10
1.11
1.25
2.58 1.10
2.59 1.11
2.54 1.08
2.75 1.17
-
-
-
0.71
0.84
0.81
0.82
1.01
0.76
0.96
0.82
1.07
1.07
1.02
0.77
0.93
0.80
1.03
1.04
1.02
1.01
1.03
1.04
1.18
1.04
1.04
1.02
1.10
                                   75

-------
Table 22. Comparison of Screening and Refined Modeling for Release Point S3P2
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
-
F
T
F
T
F
T
F
T
F
T
F
T
-
-
-
-
-
-
-
-
-
1-hr HIM
(lig/m3)
32.71
78.75
33.58
30.56
40.21
30.69
39.47
31.80
33.89
39.59
31.10
32.21
24.57
27.98
30.84
35.78
28.69
53.49
24.55
30.23
33.66
31.01
26.37
29.31
26.87
28.91
27.19
27.23
27.56
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
0.81
1.07
0.83
1.03
0.97
0.83
1.05
1.02
1.33
1.17
1.06
0.91
1.14
0.61
1.33
1.08
0.97
1.05
1.24
1.12
1.22
1.13
1.20
1.20
1.19
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
1.96
2.57
2.00
2.48
2.32
1.99
2.53
2.44
3.21
2.81
2.55
2.20
2.74
1.47
3.21
2.60
2.34
2.54
2.99
2.69
2.93
2.72
2.90
2.89
2.86
Ratio:
COARESCREEN2-
to- Refined
-
-
-
-
0.84
1.09
0.85
1.06
0.99
0.85
1.08
1.04
1.37
1.20
1.09
0.94
1.17
0.63
1.37
1.11
1.00
1.08
1.27
1.15
1.25
1.16
1.24
1.23
1.22
Ratio:
COARESCREEN3-
to-Refined
-
-
-
-
0.76
1.00
0.77
0.96
0.90
0.77
0.98
0.95
1.24
1.09
0.99
0.85
1.07
0.57
1.24
1.01
0.91
0.99
1.16
1.04
1.14
1.06
1.12
1.12
1.11
                                  76

-------
Table 23. Comparison of Screening and Refined Modeling for Release Point S3P3
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
10-OBS-B2 2010
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
F
T
F
T
F
T
F
T
F
T
F
T
1-hr HIM
(lig/m3)
159.10
287.23
169.10
184.72
171.53
131.06
149.54
146.87
166.69
148.91
150.61
156.97
117.55
125.81
134.86
125.75
131.51
475.10
130.90
166.01
153.72
150.26
128.92
118.32
140.04
156.76
152.91
137.90
134.37
Ratio:
AERSCREEN-
to- Refined
-
-
-
0.93
1.21
1.06
1.08
0.95
1.07
1.06
1.01
1.35
1.26
1.18
1.27
1.21
0.33
1.22
0.96
1.03
1.06
1.23
1.34
1.14
1.01
1.04
1.15
1.18
Ratio: Ratio: Ratio:
COARESCREEN1- COARESCREEN2- COARESCREEN3-
to-Refined to-Refined to-Refined
-
-
-
1.67
2.19
1.92
1.96
1.72
1.93
1.91
1.83
2.44
2.28
2.13
2.28
2.18
0.60
2.19
1.73
1.87
1.91
2.23
2.43
2.05
1.83
1.88
2.08
2.14
-
-
-
0.99
1.29
1.13
1.15
1.01
1.14
1.12
1.08
1.44
1.34
1.25
1.34
1.29
0.36
1.29
1.02
1.10
1.13
1.31
1.43
1.21
1.08
1.11
1.23
1.26
-
-
-
-
1.08
1.41
1.24
1.26
1.11
1.24
1.23
1.18
1.57
1.47
1.37
1.47
1.40
0.39
1.41
1.11
1.20
1.23
1.43
1.56
1.32
1.18
1.21
1.34
1.37
                                  77

-------
Table 24. Comparison of Screening and Refined Modeling for Release Point S4P1
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
10-OBS-B2 2010
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
F
T
F
T
F
T
F
T
F
T
F
T
1-hr HIM
(lig/m3)
9.37
13.97
6.99
6.91
7.38
6.65
7.49
6.57
4.26
6.53
4.80
6.78
4.15
4.46
4.08
6.51
4.61
7.07
4.88
4.95
5.51
5.32
5.85
5.46
5.06
5.72
5.22
4.38
4.38
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
1.27
1.41
1.25
1.43
2.20
1.44
1.95
1.38
2.26
2.10
2.29
1.44
2.03
1.33
1.92
1.89
1.70
1.76
1.60
1.71
1.85
1.64
1.79
2.14
2.14
Ratio:
COARESCREEN1-
to-Refined
-
-
-
1.89
2.10
1.86
2.13
3.28
2.14
2.91
2.06
3.36
3.13
3.42
2.14
3.03
1.98
2.86
2.82
2.54
2.63
2.39
2.56
2.76
Ratio: Ratio:
COARESCREEN2- COARESCREEN3-
to-Refined to-Refined
-
-
-
0.95
1.05
0.93
1.06
1.64
1.07
1.45
1.03
1.68
1.57
1.71
1.07
1.52
0.99
1.43
1.41
1.27
1.31
1.19
1.28
1.38
2.44 1.22
2.68 1.34
3.19 1.60
3.19 1.60
-
-
-
0.94
1.04
0.92
1.05
1.62
1.06
1.44
1.02
1.66
1.55
1.69
1.06
1.50
0.98
1.42
1.40
1.25
1.30
1.18
1.26
1.37
1.21
1.32
1.58
1.58
                                   78

-------
Table 25. Comparison of Screening and Refined Modeling for Release Point S4P2
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
-
F
T
F
T
F
T
F
T
F
T
F
T
-
-
-
-
-
-
-
-
-
1-hr HIM
(lig/m3)
41.87
94.50
40.12
39.07
53.82
36.61
54.68
42.42
30.72
52.44
40.34
42.64
30.33
29.15
29.00
46.02
32.27
81.51
35.18
49.62
34.50
37.75
32.70
32.06
37.96
40.43
40.37
28.90
38.15
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
0.78
1.14
0.77
0.99
1.36
0.80
1.04
0.98
1.38
1.44
1.44
0.91
1.30
0.51
1.19
0.84
1.21
1.11
1.28
1.31
1.10
1.04
1.04
1.45
1.10
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
1.76
2.58
1.73
2.23
3.08
1.80
2.34
2.22
3.12
3.24
3.26
2.05
2.93
1.16
2.69
1.90
2.74
2.50
2.89
2.95
2.49
2.34
2.34
3.27
2.48
Ratio:
COARESCREEN2-
to- Refined
-
-
-
-
0.75
1.10
0.73
0.95
1.31
0.77
0.99
0.94
1.32
1.38
1.38
0.87
1.24
0.49
1.14
0.81
1.16
1.06
1.23
1.25
1.06
0.99
0.99
1.39
1.05
Ratio:
COARESCREEN3-
to-Refined
-
-
-
-
0.73
1.07
0.71
0.92
1.27
0.75
0.97
0.92
1.29
1.34
1.35
0.85
1.21
0.48
1.11
0.79
1.13
1.04
1.19
1.22
1.03
0.97
0.97
1.35
1.02
                                  79

-------
Table 26. Comparison of Screening and Refined Modeling for Release Point S4P3
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
-
F
T
F
T
F
T
F
T
F
T
F
T
-
-
-
-
-
-
-
-
-
1-hr HIM
(lig/m3)
37.40
83.59
35.87
33.74
48.41
33.43
48.37
36.98
39.88
46.37
35.58
38.16
27.02
26.48
36.00
41.94
33.65
68.33
28.99
39.79
31.16
33.29
28.77
29.71
33.28
34.93
34.60
25.90
33.23
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
0.77
1.12
0.77
1.01
0.94
0.81
1.05
0.98
1.38
1.41
1.04
0.89
1.11
0.55
1.29
0.94
1.20
1.12
1.30
1.26
1.12
1.07
1.08
1.44
1.13
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
1.73
2.50
1.73
2.26
2.10
1.80
2.35
2.19
3.09
3.16
2.32
1.99
2.48
1.22
2.88
2.10
2.68
2.51
2.91
2.81
2.51
2.39
2.42
3.23
2.52
Ratio:
COARESCREEN2-
to- Refined
-
-
-
-
0.74
1.07
0.74
0.97
0.90
0.77
1.01
0.94
1.33
1.35
1.00
0.86
^^ 1.07
0.52
1.24
0.90
1.15
1.08
1.25
1.21
1.08
1.03
1.04
1.39
1.08
Ratio:
COARESCREEN3-
to-Refined
-
-
-
-
0.70
1.01
0.70
0.91
0.85
0.73
0.95
0.88
1.25
1.27
0.94
0.80
1.00
0.49
1.16
0.85
1.08
1.01
1.17
1.14
1.01
0.97
0.98
1.30
1.02
                                  80

-------
Table 27. Comparison of Screening and Refined Modeling for Release Point S5P1
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
10-OBS-B2 2010
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
F
T
F
T
F
T
F
T
F
T
F
T
1-hr HIM
(lig/m3)
51.98
92.46
78.66
77.65
54.32
64.07
61.37
60.01
84.10
58.61
85.30
57.40
84.79
56.93
87.23
60.01
86.77
57.58
86.80
56.93
64.88
72.38
57.78
58.02
56.50
60.46
58.41
60.39
55.92
Ratio:
AERSCREEN-
to- Refined
-
-
-
0.96
0.81
0.85
0.87
0.62
0.89
0.61
0.91
0.61
0.91
0.60
0.87
0.60
0.90
0.60
0.91
0.80
0.72
0.90
0.90
0.92
0.86
0.89
0.86
0.93
Ratio:
COARESCREEN1-
to-Refined
-
-
1.70
1.44
1.51
1.54
1.10
1.58
1.08
1.61
1.09
1.62
1.06
1.54
1.07
1.61
1.07
1.62
1.43
1.28
1.60
1.59
1.64
1.53
1.58
1.53
1.65
Ratio: Ratio:
COARESCREEN2- COARESCREEN3-
to-Refined to-Refined
-
-
-
-
1.45
1.23
1.28
1.31
0.94
1.34
0.92
1.37
0.93
1.38
0.90
1.31
0.91
1.37
0.91
1.38
1.21
1.09
1.36
1.36
1.39
1.30
1.35
1.30
1.41
-
-
-
-
1.43
1.21
1.27
1.29
0.92
1.32
0.91
1.35
0.92
1.36
0.89
1.29
0.89
1.35
0.89
1.36
1.20
1.07
1.34
1.34
1.37
1.28
1.33
1.29
1.39
                                   81

-------
Table 28. Comparison of Screening and Refined Modeling for Release Point S5P2
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
-
-
F
T
F
T
F
T
F
T
F
T
F
T
-
-
-
-
-
-
-
-
-
1-hr HIM
(lig/m3)
127.10
132.69
125.92
199.56
132.93
152.61
137.88
136.07
182.17
132.91
179.95
129.05
182.16
131.79
182.57
132.34
189.00
127.66
175.59
133.63
157.49
156.83
127.65
133.26
130.48
138.85
175.18
166.44
143.93
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
0.96
0.83
0.92
0.93
0.70
0.96
0.71
0.98
0.70
0.96
0.70
0.96
0.67
1.00
0.72
0.95
0.81
0.81
1.00
0.95
0.97
0.92
0.73
0.76
0.88
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
1.00
0.87
0.96
0.98
0.73
1.00
0.74
1.03
0.73
1.01
0.73
1.00
0.70
1.04
0.76
0.99
0.84
0.85
1.04
1.00
1.02
0.96
0.76
0.80
0.92
Ratio:
COARESCREEN2-
to- Refined
-
-
-
-
0.95
0.83
0.91
0.93
0.69
0.95
0.70
0.98
0.69
0.96
0.69
0.95
0.67
0.99
0.72
0.94
0.80
0.80
0.99
0.94
0.97
0.91
0.72
0.76
0.87
Ratio:
COARESCREEN3-
to-Refined
-
-
-
-
1.50
1.31
1.45
1.47
1.10
1.50
1.11
1.55
1.10
1.51
1.09
1.51
1.06
1.56
1.14
1.49
1.27
1.27
1.56
1.50
1.53
1.44
1.14
1.20
1.39
                                  82

-------
Table 29. Comparison of Screening and Refined Modeling for Release Point S5P3
Dataset
AERSCREEN
COARESCREEN1
COARESCREEN2
COARESCREEN3
09-Beaufort-Shell
10-Beaufort-Shell
09-Chukchi-Shell
10-Chukchi-Shell
10-WRF-AERCOARE-B2-F
10-WRF-AERCOARE-B2-T
11-WRF-AERCOARE-B3-F
11-WRF-AERCOARE-B3-T
12-WRF-AERCOARE-C1-F
12-WRF-AERCOARE-C1-T
10-WRF-AERMOD-B2-F
10-WRF-AERMOD-B2-T
11-WRF-AERMOD-B3-F
11-WRF-AERMOD-B3-T
12-WRF-AERMOD-C1-F
12-WRF-AERMOD-C1-T
10-OBS-B2
11-OBS-B2
10-OBS-B3
11-OBS-B3
12-OBS-B3
11-OBS-C1
12-OBS-C1
10-OBS-C2
12-OBS-C2
Year
-
-
-
-
2009
2010
2009
2010
2010
2010
2011
2011
2012
2012
2010
2010
2011
2011
2012
2012
2010
2011
2010
2011
2012
2011
2012
2010
2012
Location
-
-
-
-
-
-
-
-
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
Cl
Cl
B2
B2
B3
B3
B3
Cl
Cl
C2
C2
Period
-
-
-
-
08/05 - 10/12
08/14 - 10/10
08/05 - 10/12
07/27 - 10/17
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/28 - 10/15
07/27 - 10/15
07/27 - 10/15
08/18 - 09/24
07/30 - 09/18
08/14 - 10/10
08/02 - 10/25
08/31 - 10/06
08/02 - 10/07
09/01 - 10/10
07/27 - 10/03
09/01 - 10/06
Recalc.
Mix Hts
-
-
-
-
-
-
-
-
F
T
F
T
F
T
F
T
F
T
F
T
-
-
-
-
-
-
-
-
-
1-hr HIM
(lig/m3)
813.30
527.71
497.26
576.34
505.38
499.96
514.87
531.01
499.28
500.90
478.30
494.52
543.50
543.54
498.92
500.09
478.04
493.46
542.86
540.96
499.06
497.22
505.04
544.72
493.11
493.57
529.38
497.79
486.74
Ratio:
AERSCREEN-
to- Refined
-
-
-
-
1.61
1.63
1.58
1.53
1.63
1.62
1.70
1.64
1.50
1.50
1.63
1.63
1.70
1.65
1.50
1.50
1.63
1.64
1.61
1.49
1.65
1.65
1.54
1.63
1.67
Ratio:
COARESCREEN1-
to-Refined
-
-
-
-
1.04
1.06
1.02
0.99
1.06
1.05
1.10
1.07
0.97
0.97
1.06
1.06
1.10
1.07
0.97
0.98
1.06
1.06
1.04
0.97
1.07
1.07
1.00
1.06
1.08
Ratio:
COARESCREEN2-
to- Refined
-
-
-
-
0.98
0.99
0.97
0.94
1.00
0.99
1.04
1.01
0.91
0.91
1.00
0.99
1.04
1.01
0.92
0.92
1.00
1.00
0.98
0.91
1.01
1.01
0.94
1.00
1.02
Ratio:
COARESCREEN3-
to-Refined
-
-
-
-
1.14
1.15
1.12
1.09
1.15
1.15
1.20
1.17
1.06
1.06
1.16
1.15
1.21
1.17
1.06
1.07
1.15
1.16
1.14
1.06
1.17
1.17
1.09
1.16
1.18
                                  83

-------
Table 30. Comparison of Screening and Refined Modeling for All Release Points Combined
Dataset                 Year   Loc.  Period          Rediag.   1-hr HIM    Ratio: Screening-to-Refined
                                                     Mix Hts   (µg/m3)     COARESCREEN1   COARESCREEN2   COARESCREEN3
COARESCREEN1            -      -     -               -          966.44     -              -              -
COARESCREEN2            -      -     -               -          669.52     -              -              -
COARESCREEN3            -      -     -               -          576.58     -              -              -
09-Beaufort-Shell       2009   -     08/05 - 10/12   -          556.39     1.74           1.20           1.04
10-Beaufort-Shell       2010   -     08/14 - 10/10   -          499.96     1.15           1.34           1.15
09-Chukchi-Shell        2009   -     08/05 - 10/12   -          514.87     1.12           1.30           1.12
10-Chukchi-Shell        2010   -     07/27 - 10/17   -          531.01     1.09           1.26           1.09
10-WRF-AERCOARE-B2-F    2010   B2    07/28 - 10/15   F          499.28     1.15           1.34           1.15
10-WRF-AERCOARE-B2-T    2010   B2    07/28 - 10/15   T          500.90     1.15           1.34           1.15
11-WRF-AERCOARE-B3-F    2011   B3    07/28 - 10/15   F          478.30     1.21           1.40           1.21
11-WRF-AERCOARE-B3-T    2011   B3    07/28 - 10/15   T          494.52     1.17           1.35           1.17
12-WRF-AERCOARE-C1-F    2012   C1    07/27 - 10/15   F          543.50     1.06           1.23           1.06
12-WRF-AERCOARE-C1-T    2012   C1    07/27 - 10/15   T          543.54     1.06           1.23           1.06
10-WRF-AERMOD-B2-F      2010   B2    07/28 - 10/15   F          498.92     1.16           1.34           1.16
10-WRF-AERMOD-B2-T      2010   B2    07/28 - 10/15   T          500.09     1.15           1.34           1.15
11-WRF-AERMOD-B3-F      2011   B3    07/28 - 10/15   F          478.05     1.21           1.40           1.21
11-WRF-AERMOD-B3-T      2011   B3    07/28 - 10/15   T          797.12     0.72           0.84           0.72
12-WRF-AERMOD-C1-F      2012   C1    07/27 - 10/15   F          542.87     1.06           1.23           1.06
12-WRF-AERMOD-C1-T      2012   C1    07/27 - 10/15   T          540.96     1.07           1.24           1.07
10-OBS-B2               2010   B2    08/18 - 09/24   -          499.41     1.15           1.34           1.15
11-OBS-B2               2011   B2    07/30 - 09/18   -          505.43     1.14           1.32           1.14
10-OBS-B3               2010   B3    08/14 - 10/10   -          505.05     1.14           1.33           1.14
11-OBS-B3               2011   B3    08/02 - 10/25   -          545.01     1.06           1.23           1.06
12-OBS-B3               2012   B3    08/31 - 10/06   -          493.11     1.17           1.36           1.17
11-OBS-C1               2011   C1    08/02 - 10/07   -          493.57     1.17           1.36           1.17
12-OBS-C1               2012   C1    09/01 - 10/10   -          529.39     1.09           1.26           1.09
10-OBS-C2               2010   C2    07/27 - 10/03   -          497.79     1.16           1.34           1.16
12-OBS-C2               2012   C2    09/01 - 10/06   -          486.74     1.18           1.38           1.18
                                      84

-------
7     TASK 6 - COLLABORATION STUDY SEMINAR

7.1   Introduction
EPA Region 10 sponsored a two-day collaboration seminar for stakeholders within Federal
agencies who will depend on the techniques, methods, and knowledge gained from Tasks 1
through 5, which are the primary focus of this report. The seminar was hosted by EPA
Region 4 at the Sam Nunn Atlanta Federal Center in downtown Atlanta, GA, September 16-17,
2014. The purpose was to inform the stakeholders of the status of each task, including
preliminary results, and to provide hands-on demonstrations of the software evaluated as part of
those tasks.

Software demonstrations included the MMIF and AERCOARE programs configured for the
AERMOD dispersion model. MMIF is used to extract prognostic meteorological model output
fields from the WRF Model and convert them to the formats required for input into AERCOARE,
or for direct input into AERMOD. AERCOARE was used to prepare meteorological data for input
to AERMOD, including data collected at offshore buoys and prognostic data extracted from WRF
output using MMIF. EPA and BOEM anticipate these programs will be essential for preparing
the overwater meteorological data needed for air dispersion modeling to support air permit
applications and exploratory plans for offshore drilling activities.

Attendees included staff from the BOEM, BSEE,  EPA Region 2 (R2), EPA Region 4 (R4), and
EPA Region 10 (R10). Also in attendance were staff from AMEC  Environment & Infrastructure
(AMEC) and ENVIRON International Corporation (ENVIRON), who  are under contract with
AMEC. George Bridgers, EPA Office of Air Quality Planning and Standards, attended via
conference call. The completed attendance log is shown in  Figure 26.

The agenda, which includes a schedule of presentations and demonstrations,  is shown in
Figure 27 of this task report. The agenda and schedule generally were followed as shown, with
only slight modifications. The times and order of some presentations shifted as needed. The
sections that follow provide a brief overview of the presentations and demonstrations.

7.2   Session Overviews

7.2.1  Day 1, Morning
The seminar was opened by Herman Wong, R10, with an overview of the needs and events that
led up to the project objectives represented in Tasks 1 through 5. A primary impetus of this
study is the need to replace the OCD Model with updated methods, procedures, and software
that are integrated with a state-of-the-science model such as AERMOD and that are
scientifically valid for overwater environments. The meteorological data required for dispersion
modeling are often sparse or nonexistent at offshore locations, so another motivation of this
collaboration study is to test the validity of using prognostic data in place of observed data when
observed data are not available. MMIF and AERCOARE are potentially valuable tools that can
improve the current state of offshore dispersion modeling and were evaluated as part of this study.

                                         85

-------
Opening comments were followed by presentations from R4 and BOEM to provide an overview
of current overwater projects, lease sales, and studies in the Atlantic, Gulf of Mexico, and
Alaskan regions. The activities presented further underscore the need for better tools to assess
the impact of offshore drilling activities on air quality and ensure these activities comply with
ambient air quality standards.
ENVIRON filled the remainder of the first morning session with a series of presentations that
introduced the AERCOARE program and described in detail the objectives, status, and
preliminary results of Tasks 1 through 4: a comparison of North Slope WRF solutions (Task 1),
an evaluation of WRF solutions with AERMOD (Task 2), Arctic WRF and AERMOD sensitivity
evaluations (Task 3), and an evaluation of measured and predicted planetary boundary layer
heights (Task 4). Each of these tasks is covered in detail in earlier sections of this report or in
separate reports (Tasks 2 and 3).

7.2.2   Day 1, Afternoon
The first afternoon session began with a presentation by ENVIRON on the regulatory issues
associated with the use of models and methods that are not included as guideline models in
40 CFR Part 51, Appendix W, commonly known as the Guideline on Air Quality Models or simply
Appendix W. Non-guideline models and methods are considered alternative models and must
meet the criteria in Section 3.2.2 of Appendix W and receive EPA regional office approval prior
to use in a regulatory application. The use of MMIF and AERCOARE is considered a change to
the guideline model, AERMOD, and thus constitutes an alternative model. In April 2011, R10
authorized the use of the COARE algorithm for two Shell permit applications based on a Section
3.2.2.e demonstration. R10 more recently approved the use of AERCOARE based on a similar
Section 3.2.2.e demonstration. The past use of the COARE algorithm and AERCOARE, together
with the evaluation results from this project, will be used to support proposed changes to
Appendix W recognizing the use of WRF as a data source and of MMIF and AERCOARE, in
place of AERMET, to process overwater data for AERMOD in guideline overwater applications.
The afternoon session continued with a presentation by AMEC on the status of the evaluation of
screening modeling methods for the 1-hour averaging period and of scaling factors for the 3-hour,
8-hour, 24-hour, and annual averaging periods for overwater applications (Task 5). An exploratory
AERMOD-ready screening meteorological dataset was developed using AERCOARE. Typical
emissions from drill ship sources were modeled at multiple buoy locations with AERMOD using
the screening dataset, and the runs were repeated using the data products generated for Task 3
from WRF simulations and buoy observations. The same sources were also modeled using the
guideline screening model AERSCREEN. Maximum 1-hour modeled concentrations were
compared to evaluate the exploratory AERCOARE-derived dataset and the existing guideline
model AERSCREEN as conservative approaches for screening modeling. Preliminary results
were presented showing the potential of the AERCOARE-derived screening dataset as a
conservative screening approach relative to AERSCREEN.
                                         86

-------
The session ended with an open discussion of the overall overwater approach to dispersion
modeling with AERMOD including the use of prognostic data from WRF, MMIF,  and
AERCOARE.

7.2.3  Day 2, Morning
The morning session of the second day focused primarily on meteorological data performance,
with a presentation of METSTAT and of WRF model statistical performance, and demonstrations
of MMIF and AERCOARE. METSTAT is a statistical evaluation program for meteorological data
developed by ENVIRON to calculate statistics for a single station or across multiple stations,
paired in time. ENVIRON used METSTAT to evaluate primary meteorological variables such as
wind speed, wind direction, temperature, and humidity from WRF simulations and from WRF-based
data processed with MMIF and AERCOARE against observed data.
Attendees participated in hands-on demonstrations of AERCOARE and MMIF to extract and
process WRF output for input to AERMOD. ENVIRON provided sample WRF output and ready-
made control files for MMIF and AERCOARE, and each attendee was provided with a desktop
workstation. During the demonstrations, ENVIRON discussed the MMIF and AERCOARE options
and required data elements, and directed participants through the steps required to run MMIF to
extract WRF data for input to AERCOARE. Likewise, participants were directed through the
steps required to process the overwater data extracted with MMIF using AERCOARE,
generating the surface and profile files required by AERMOD.
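To make the demonstrated workflow concrete, the short sketch below chains the three programs in the order used during the hands-on session. It is illustrative only: the executable names, control-file names, and command-line conventions shown here are assumptions for this example, not the files distributed at the seminar, and the actual invocation details should be taken from the MMIF, AERCOARE, and AERMOD user's guides.

import subprocess

def run(cmd):
    """Run one step of the WRF -> MMIF -> AERCOARE -> AERMOD chain; stop on error."""
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. MMIF extracts the overwater point(s) from the WRF output files named in its
#    control file and writes either AERCOARE-ready or AERMOD-ready meteorology.
run(["mmif.exe", "mmif.inp"])          # hypothetical control-file name

# 2. AERCOARE applies the COARE bulk-flux algorithm to the extracted "buoy-like"
#    variables and writes the AERMOD surface (.sfc) and profile (.pfl) files.
run(["aercoare.exe", "aercoare.inp"])  # hypothetical control-file name

# 3. AERMOD reads its runstream input file, which points at the .sfc/.pfl files
#    produced in step 2.
run(["aermod.exe"])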

7.2.4  Day 2, Afternoon
A demonstration of the AERMOD model was conducted during the final session of the seminar.
As in the MMIF and AERCOARE demonstrations, ready-made control files were provided, and
participants were walked through the steps to run an AERMOD simulation using the
AERMOD-ready files generated in the previous demonstrations of MMIF and AERCOARE. The
final session ended with a question-and-answer period and a wrap-up by Herman Wong,
including the next steps.
                                        87

-------
DOI BOEM/EPA Region 10 Collaboration Study Seminar and Demonstration
                      September 16-17, 2014
                   EPA Region 4 Office, Atlanta, GA

[Figure 26. Attendance log (scanned sign-in sheet) listing each attendee's name, agency, email address, telephone number, and sign-in initials.]
-------
                      Agenda
R10/BOEM Collaboration Study Seminar and Demonstration
                September 16-17, 2014
                 EPA Region 4 Office
                     Atlanta, GA

Tuesday, September 16, 2014 - Morning
Time         Min.  Topic                                                   Presenter(s)
8:15-8:20    5     Introduction to collaboration study seminar             H. Wong, R10
8:20-8:25    5     Introductions                                           Attendees
8:25-8:35    10    Overview and Objective                                  H. Wong, R10
8:35-8:50    10    Welcome, logistics and overwater projects               K. Lusky, R4
8:50-9:00    10    Atlantic Region projects/lease sales/studies            A. McCoy, BOEM
9:00-9:10    10    GOM Region projects/lease sales/studies                 H. Ensz, BOEM
9:10-9:20    10    AK Region projects/lease sales/studies                  -
9:20-9:50    30    Introduction to AERCOARE                                K. Richmond, ENVIRON
9:50-10:00   10    Task 1 - Comparison of North Slope WRF Solutions        B. Brashers, ENVIRON
10:00-10:30  30    Break                                                   -
10:30-11:00  30    Task 2 - Evaluate WRF Solutions with AERMOD             K. Richmond, ENVIRON
11:00-11:30  30    Task 3 - Arctic WRF & AERMOD Sensitivity Evaluations    B. Brashers, ENVIRON
11:30-12:00  30    Task 4 - Evaluate Measured vs Predicted PBL Heights     B. Brashers, ENVIRON

Tuesday, September 16, 2014 - Afternoon
12:00-1:15   75    Lunch                                                   -
1:15-1:30    15    Regulatory Issues - Section 3.2.2.e, etc.               K. Richmond, ENVIRON
1:30-2:00    30    WRF-MMIF & AERCOARE with AERMOD, Appendix W             H. Wong, R10
2:00-2:30    30    Group Discussion about Overall Overwater Approach       -
2:30-3:00    30    Break                                                   -
3:00-3:30    30    Task 5 - Introduction to COARESCRN                      C. Tillerson, AMEC
3:30-4:00    30    COARESCRN Demonstration                                 C. Tillerson, AMEC
4:00-4:30    30    Q & A                                                   -

Wednesday, September 17, 2014 - Morning
8:00-8:30    30    Applying AERCOARE                                       K. Richmond, ENVIRON
8:30-9:15    45    AERCOARE with Measurements Demonstration                Both, ENVIRON
9:15-9:30    15    Q & A                                                   -
9:30-9:45    15    WRF Simulations (Source of data: acceptability, etc.)   B. Brashers, ENVIRON
9:45-10:00   15    METSTAT and Performance Goals                           B. Brashers, ENVIRON
10:00-10:15  15    Break                                                   -
10:15-10:45  30    Introduction to MMIF                                    B. Brashers, ENVIRON
10:45-12:00  75    WRF - MMIF - AERCOARE Option - AERMOD Demo              Both, ENVIRON

Wednesday, September 17, 2014 - Afternoon
12:00-1:15   75    Lunch                                                   -
1:15-1:30    15    Q & A                                                   -
1:30-3:00    90    WRF - MMIF - AERMOD Option - AERMOD Demo                Both, ENVIRON
3:00-3:30    30    Break                                                   -
3:30-3:45    15    Q & A                                                   -
3:45-4:00    15    Summary and what next?                                  H. Wong, R10
4:00-4:30    30    Q & A and Wrap-up                                       -
                Figure 27. Agenda.
                       89

-------
[Blank]
  90

-------
8     REFERENCES
AMS (American Meteorological Society), 2012: Glossary of Meteorology. Retrieved from
   http://glossary.ametsoc.org/wiki/Hypsometric_equation, August 23, 2015.
Brashers, B. & Emery, C., 2014. The Mesoscale Model Interface Program (MMIF) Draft User's
   Manual, Novato, CA: ENVIRON Int. Corp. Air Sciences Group, Prepared for U.S. EPA Air
   Quality Assessment Division.
Bretherton, C. & Park, S., 2009. A New Moist Turbulence Parameterization in the Community
   Atmosphere Model. J. Climate, Volume 22, pp. 3422-3448.
Bridgers, G., 2011. Model Clearinghouse Review of AERMOD-COARE as an Alternative Model
   for Application in an Arctic Marine Ice Free Environment. Research Triangle Park (North
   Carolina): U.S.  EPA.
DiCristofaro, D. & Hanna, S., 1989. OCD: The Offshore and Coastal Dispersion Model, s.l.:
   Prepared for U.S. Dept. of Interior MMS, Report #A085-1.
Emery, C., Tai, E. & Yarwood, G., 2001. Enhanced meteorological modeling and performance
   evaluation for two Texas ozone episodes, Novato, CA: Prepared for the Texas Nat. Res.
   Cons. Commission by ENVIRON Int. Corp.
ENVIRON Int. Corp., 2010. Evaluation of the COARE-AERMOD Alternative Modeling Approach
   Support for Simulation of Shell Exploratory Drilling Sources In the Beaufort and Chukchi
   Seas, Lynnwood, WA 98036: ENVIRON, 19020 33rd Ave. W., Suite 310.
ENVIRON Int. Corp., 2012. Evaluation of the Combined AERCOARE/AERMOD Modeling
   Approach for Offshore Sources, Novato,  California: ENVIRON Int. Corp., 773 San Marin
   Drive, Suite 2115.
ENVIRON Int. Corp., 2014. METSTAT. [Online]
   Available at: http://www.camx.com/download/support-software.apx
Gryning, S-E. and Batchvarova, E., 2003:  Marine atmospheric boundary-layer height estimated
   from NWP model output. Int. Journal of Environment and Pollution, 20, 147-153.
Hong, S.-Y., Noh, Y. & Dudhia, J., 2006. A New Vertical Diffusion Package with an Explicit
   Treatment of Entrainment Processes. Mon. Weather Rev., Volume 134, pp. 2318-2341.
Janjic, Z., 1994. The step-mountain eta coordinate model: further developments of the
   convection, viscous sublayer and turbulence closure schemes. Mon. Weather Rev., Volume
   122,  pp. 927-945.
McNally, D. & Wilkinson, J. G., 2011. Model Application and Evaluation: ConocoPhillips Chukchi
   Sea  WRF Model Application., Arvada, Colorado: Alpine Geophysics, LLC.
Mellor, G. & Yamada, T., 1982. Development of a turbulence closure model for geophysical fluid
   problems. Rev. Geophys. Space Phys., Volume 20, pp. 851-875.
                                        91

-------
NCAR, 2014. Weather Research & Forecasting (WRF) ARW Version 3 Modeling System User's
   Guide, s.l.: National Center for Atmospheric Research Mesoscale & Meteorology Division.
NOAA Environmental Modeling Center, (n.d.): Real-time, global, sea surface temperature
   (RTG_SST_HR) analysis. Retrieved from http://polar.ncep.noaa.gov/sst/rtg_high_res/
Richmond, K. & Morris, R., 2012. Evaluation of the Combined AERCOARE/AERMOD Modeling
   Approach for Offshore Sources, s.l.: ENVIRON Int. Corp. Prepared for USEPA R.10, EPA-
   910-R-12-007.
Skamarock, W. et al., 2008. A Description of the Advanced Research WRF Model, Version 3,
   s.l.: Nat. Center for Atmos. Research, Univ. Corp. Atmos. Research.
Tikvart, J. 1988. Revised Model Clearinghouse Memorandum Operation Plan. Research
   Triangle Park, NC: U.S. Environmental Protection Agency, OAQPS.
USEPA, 1993. User's Guide to the Building Profile Input Program, Research Triangle Park, NC:
   U.S. Environmental Protection Agency Region 10, EPA-454/R-93-038.
USEPA, 2003. AERMOD: Latest Features and Evaluation Results, Research Triangle Park, NC:
   U.S. Environmental Protection Agency, OAQPS,  EPA-454/R-03-003.
USEPA, 2004a. AERMOD: Description of model formulation, Research Triangle Park, North
   Carolina: U.S. Environmental Protection Agency, EPA-454/R-03-004.
USEPA, 2004b. User's Guide for the AERMOD Meteorological Preprocessor AERMET,
   Research Triangle Park, North Carolina: U.S. Environmental Protection Agency,
   EPA-454/B-03-002.
USEPA, 2004c. User's Guide for the AMS/EPA Regulatory Model AERMOD, Research Triangle
   Park, North Carolina: U.S. Environmental Protection Agency, EPA-454/B-03-001.
USEPA, 2004d. User's Guide for the AMS/EPA Regulatory Model AERMOD, Research Triangle
   Park, North Carolina: U.S. Environmental Protection Agency, EPA-454/B-03-001.
USEPA, 2011. AERSCREEN User's Guide, Research Triangle Park, NC: U.S.  Environmental
   Protection Agency, EPA-454/B-11-001.
USEPA, 2012. User's Manual AERCOARE Version 1.0, Seattle,  WA: Environmental  Protection
   Agency Region 10, EPA-910-R-12-008.
US EPA, 2013: Use of ASOS meteorological data in AERMOD dispersion modeling.
   Memorandum from Tyler Fox, available at
   http://www.epa.gov/ttn/scram/guidance/clarification/20130308_Met_Data_Clarification.pdf.
USGODAE Data Catalog, (n.d.): FNMOC High Resolution SST/Sea Ice Analysis for GHRSST.
   Retrieved from http://www.usgodae.org/cgi-
   bin/datalist.pl?summary=Go&dset=fnmoc_ghrsst#
                                        92

-------
Venkatram, A., 1980. Estimating the Monin-Obukhov Length in the Stable Boundary Layer for
   Dispersion Calculations. Boundary Layer Meteor., 19, pp. 481-485.
Vogelezang, D. H. P. and A. A. M. Holtslag, 1996: Evaluation and Model Impacts of Alternative
   Boundary-Layer Height Formulations. Boundary-Layer Meteorology, 81, 245-269.
Wong, H., 2011. COARE Bulk Flux Algorithm to Generate Hourly Meteorological Data for use
   with AERMOD. Seattle (WA): U.S. EPA Region 10.
Wong, H., 2012. (personal communication). s.l.: s.n.
Zhang, J., 2013. Beaufort and Chukchi Seas Mesoscale Meteorology Model Study, Final
   Report, s.l.: s.n., 4 pp.
                                         93

-------
[Blank]
  94

-------
APPENDIX A: PROTOCOLS

-------
[Blank]

-------
APPENDIX A.1: TASK 1 PROTOCOL

-------
[Blank]

-------
 Overwater Dispersion Modeling
                  Task 1 Protocol
 AMEC RFP # 12-6480110233-TC-3902
Federal Prime Contract # EP-W-09-028
                        Prepared for:
   AMEC Environment & Infrastructure, Inc.
       502 W. Germantown Pike, Suite 850
       Plymouth Meeting, PA 19462-1308
                Attention: Thomas Carr

                        Prepared by:
       ENVIRON International Corporation
          773 San Marin Drive, Suite 2115
               Novato, California, 94945
                www.environcorp.com
                      P-415-899-0700
                      F-415-899-0707
                      January 7, 2013
                  ENVIRON

-------
Final                                                          Overwater Dispersion Modeling
                                                                        Task 1 Protocol

INTRODUCTION
The primary objective of the current study is to test and evaluate AERMOD on the outer
continental shelf (OCS). The current modeling procedures for sources on land use the
AERMOD modeling system. The meteorological AERMET processor included in the system is
inappropriate for OCS sources because the energy fluxes over water are not strongly driven by
diurnal heating and cooling. In addition, the meteorological observations necessary to drive the
dispersion models are commonly not available, especially in the Arctic Ocean. For applications
in the Arctic, the remote location and seasonal sea-ice pose logistical problems for the
deployment of buoys or offshore measurement platforms.
This study evaluates a combined modeling approach where the meteorological variables are
provided by the Weather Research and Forecasting (WRF) mesoscale model, and then
processed by a combination of a new Mesoscale Model  Interface program (MMIF) and,
optionally, AERCOARE (a replacement for AERMET suitable for overwater conditions). Given
an appropriate overwater meteorological dataset, AERMOD can then be applied for New
Source Review following the same procedures as used for sources over land. The remainder of
this document presents a protocol for Task 1 of the study.

Task 1: Examine differences between two OCS WRF solutions for the Arctic
AMEC and ENVIRON prepared a Work Plan outlining the various tasks and objectives of the
current study. As directed by EPA,  the first task in the study examines two existing WRF
datasets for the Arctic Ocean that might be used to provide the necessary meteorological
variables for dispersion model simulations of OCS sources within their domains. The task
objective is to examine  the differences between the two  datasets, examine model performance
especially compared to overwater measurements, apply MMIF  and AERCOARE to the datasets
using several different options, and compare AERMOD model predictions from the resulting
datasets using simulations of typical OCS sources. The  protocol includes additional information
on data, options, and issues that were not fully described in the Work Plan. With an approved
protocol, ENVIRON  staff will perform the following subtasks:

Task 1a: Generate AERMOD results from two different WRF datasets
The first task compares AERMOD simulations of hypothetical OCS sources using
meteorological variables extracted with MMIF, and optionally reprocessed with AERCOARE for
two WRF datasets. EPA provided two WRF diagnostic simulations of the Arctic for this task: (1)
the Bureau of Ocean Energy Management (BOEM) 2005-2009 runs conducted by the University
of Alaska at Fairbanks (UAF) (Zhang et al., 2011); and (2) the ConocoPhillips 2007-2009 runs
(McNally and Wilkinson, 2011). Figure 1 and Figure 2 illustrate  the ConocoPhillips 36-12-4 km
and BOEM 10 km simulation domains, respectively.
January 2013                               1                                  ENVIRON

-------
Final
                               Overwater Dispersion Modeling
                                           Task 1 Protocol
Figure 1. ConocoPhillips 36-12-4 km domains, covering the Chukchi Sea.

Figure 2. BOEM 10-km domain, covering the Chukchi and Beaufort Seas.
January 2013
                                               ENVIRON

-------
Final                                                          Overwater Dispersion Modeling
                                                                         Task 1 Protocol

The study will focus on the 2009 simulations from each WRF dataset. In addition to providing
the necessary variables for AERMOD simulations of OCS sources, during this year offshore
meteorological measurements are available to evaluate some aspects of WRF model
performance. There are many different options for the preparation of WRF-predicted
meteorology for use by AERMOD. ENVIRON proposes to examine and compare at least four
options as follows:
 1.  MMIF will be applied to extract and prepare data sets for direct use by AERMOD. All
     variables will be as predicted by the WRF simulations including the surface energy fluxes,
     surface roughness and planetary boundary layer (PBL) height.
 2.  As in Option 1), but the PBL height will be re-diagnosed from the wind speed and potential
     temperature profiles using the Bulk-Richardson algorithm within MMIF. Based on the
     August to October 2010 monitoring data collected by Shell in the Beaufort Sea, PBL
     heights range from 10 m to 700 m, with a median height of 80 m. AERMOD simulations
     can  be very  sensitive to the PBL height (ENVIRON, 2012) and MMIF processed PBL
     height may provide significantly different predicted concentrations than the PBL height
      used internally by WRF (a minimal sketch of this bulk Richardson re-diagnosis follows this list).
 3.  MMIF will be applied to extract the key meteorological variables of overwater wind speed,
     wind direction, temperature, humidity and PBL height. AERCOARE will use these variables
     to predict the surface energy fluxes, surface roughness length and other variables needed
     for the AERMOD simulations. AERCOARE has a surface layer scheme developed
     specifically to predict surface fluxes from overwater measurements. In this application, the
     WRF simulations provide the variables that might be measured by a buoy, ship or offshore
     platform. AERCOARE can also be applied using a number of different options. For the
     current study, we propose to apply AERCOARE using the defaults recommended in the
      AERCOARE model evaluation study (ENVIRON, 2012).
 4.  As in Option 3), but the PBL height will be re-diagnosed using the Bulk-Richardson
     algorithm within MMIF. AERCOARE will be applied as in Option 3.
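Options 2 and 4 call for re-diagnosing the PBL height from the extracted wind and potential temperature profiles using a bulk Richardson number. The following is a minimal sketch of that approach, for illustration only; it is not the MMIF source code, and the critical Richardson number of 0.25 and the simplified surface-layer treatment are assumptions.

import numpy as np

def bulk_richardson_pbl_height(z, theta_v, u, v, ri_crit=0.25):
    """Estimate the PBL height as the lowest level where the bulk Richardson number,
    computed between the lowest level and level z, first exceeds ri_crit.

    z       : heights above the surface (m), lowest level first
    theta_v : virtual potential temperature (K) at each level
    u, v    : wind components (m/s) at each level
    """
    g = 9.81
    for k in range(1, len(z)):
        wind2 = u[k] ** 2 + v[k] ** 2
        if wind2 == 0.0:
            continue
        ri_b = g * (theta_v[k] - theta_v[0]) * (z[k] - z[0]) / (theta_v[0] * wind2)
        if ri_b >= ri_crit:
            return z[k]          # first level exceeding the critical value
    return z[-1]                 # mixed through the whole profile supplied

# Example: a shallow stable marine layer
z       = np.array([10.0, 50.0, 100.0, 200.0, 400.0])
theta_v = np.array([272.0, 272.2, 272.9, 274.5, 277.0])
u       = np.array([4.0, 5.0, 6.0, 7.0, 8.0])
v       = np.zeros(5)
print(bulk_richardson_pbl_height(z, theta_v, u, v))   # -> 200.0 m for these numbers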
Figure 3 shows a  flow diagram of the direct options for preparing meteorological datasets for the
two WRF simulations.
ENVIRON will conduct AERMOD simulations for five OCS hypothetical sources, at five
locations, using the two WRF datasets, processed with four different techniques. The five
hypothetical OCS sources provided by the EPA are shown in Table 1. Source 5 is the same as
Source 2 but will include downwash (i.e., the Plume Rise Model Enhancements (PRIME) algorithm
of Schulman et al., 2000), using the building height/layout provided by the EPA (see the source map in Figure
4). The stacks for each source will be located at the five locations shown in Figure 5. The
source locations were selected to be close to available buoy measurement sites in both the
January 2013                                3                                   ENVIRON

-------
Final
                            Overwater Dispersion Modeling
                                        Task 1 Protocol
Chukchi Sea and Beaufort Sea and also to contrast the results from sites relatively close to
shore and well offshore.1
[Flow chart: each WRF dataset is processed along four pathways - MMIF with the WRF PBL height,
MMIF with the re-diagnosed (MMIF) PBL height, and each of these optionally passed through
AERCOARE - with every pathway feeding AERMOD runs for 5 sites x 5 sources.]
Figure 3. Flow chart of AERMOD pathways.
1  Note, because the ConocoPhillips 4-km domain doesn't extend far enough East to capture all sites in the Beaufort Sea, the
  data will be extracted from the 12-km domain.
January 2013
                                            ENVIRON

-------
Final
                                      Overwater Dispersion Modeling
                                                 Task 1 Protocol
[Plan view of the Source 5 platform layout used for PRIME downwash, showing the stack locations
and structure heights: PAD-H (19.81 m), KS-BLDG (3.05 m), KT-BLDG (13.72 m), KB-BLDG (7.62 m),
Deck (4.57 m), ENGBLD (10.67 m), and the INCINER stack.]
Figure 4. Source map for Source 5 in Table 1, with stack locations, and heights (m).
Figure 5. Sites of buoy data and co-located sources for AERMOD evaluation in Task 1.
January 2013
                                                     ENVIRON

-------
Final
Overwater Dispersion Modeling
           Task 1 Protocol
Table 1. Stack parameters for hypothetical OCS sources
Source    Downwash  Stack    Stack       Exit            Exit        Exit
                             Height (m)  Temperature (K) Velocity    Diameter
                                                         (m/s)       (m)
Source 1  None      Stack 1  16          700             30          0.5
                    Stack 2  14          550             20          0.4
Source 2  None      Stack 1  18          680             28          0.4
                    Stack 2  17          500             10          0.45
                    Stack 3  10          525             17          0.4
Source 3  None      Stack 1  25          570             30          0.6
                    Stack 2  20          610             22          0.25
                    Stack 3  15          420             2           0.3
Source 4  None      Stack 1  39          580             21          0.7
                    Stack 2  25          580             14          0.2
                    Stack 3  23          510             42          0.15
Source 5  Yes       Stack 1  18          680             28          0.4
                    Stack 2  17          500             10          0.45
                    Stack 3  10          525             17          0.4
AERMOD simulations will be performed, and a time series of predicted concentrations obtained
from the open-water periods of 2009. The "open-water" period will be selected using the
criterion that the sea-ice must be less than 50 percent in both datasets at the source location.
The analysis will calculate the predicted concentrations for each of the relevant averaging
periods for the criteria pollutants (e.g., 1-hour, 8-hour, 24-hour, and period) for OCS sources. In
order to emphasize the averaging periods from the relatively new NO2 and SO2 National
Ambient Air Quality Standards (NAAQS), we will also extract the daily maximum hourly
concentrations.
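As a rough illustration of the post-processing just described, the sketch below filters an hourly concentration time series to the open-water period (sea-ice fraction below 50 percent in both WRF datasets) and extracts the daily maximum 1-hour concentrations emphasized for the NO2 and SO2 NAAQS. The column names and the pandas-based approach are assumptions for this example, not part of any AERMOD post-processor.

import pandas as pd

# Hypothetical hourly AERMOD output joined with sea-ice fractions from both WRF datasets.
df = pd.DataFrame({
    "time":      pd.date_range("2009-07-01", periods=8, freq="h"),
    "conc_ugm3": [12.0, 30.5, 8.2, 44.1, 5.0, 61.3, 22.8, 9.9],
    "ice_boem":  [0.10, 0.10, 0.20, 0.30, 0.60, 0.20, 0.10, 0.10],
    "ice_cop":   [0.15, 0.20, 0.10, 0.25, 0.40, 0.70, 0.10, 0.05],
})

# Open-water criterion: sea ice < 50% in BOTH datasets at the source location.
open_water = df[(df["ice_boem"] < 0.5) & (df["ice_cop"] < 0.5)]

# Daily maximum 1-hour concentration (basis of the 1-hour NO2/SO2 comparisons).
daily_max_1hr = open_water.set_index("time")["conc_ugm3"].resample("D").max()

# Longer averaging periods are built from the same filtered series; a simple
# 24-hour block average is shown here as an example.
block_24hr = open_water.set_index("time")["conc_ugm3"].resample("24h").mean()
print(daily_max_1hr, block_24hr, sep="\n")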
ENVIRON proposes to employ a polar-receptor grid for the AERMOD simulations. Receptor
locations for AERMOD will be linear arrays radially fanning out from each co-located source in
one degree increments (360 radii).  We will sample at the following radial distances: 30 m, 50 m,
75 m, 100 m,  125 m, 150 m, 175 m, 200 m, 300 m, 400 m, 500 m, 750 m, 1 km, 1.5 km, 2 km,
3 km, 4 km, 5 km, 6 km, 7 km, 8 km, 9 km, and 10 km.
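A minimal sketch of how such a polar receptor network could be generated (360 one-degree radials at the listed ring distances) is shown below. It simply writes discrete Cartesian receptor coordinates and is not AERMOD's built-in polar-grid (GRIDPOLR) input.

import math

# Ring distances (m) from the protocol: 30 m out to 10 km.
rings = [30, 50, 75, 100, 125, 150, 175, 200, 300, 400, 500,
         750, 1000, 1500, 2000, 3000, 4000, 5000, 6000, 7000,
         8000, 9000, 10000]

def polar_receptors(x0, y0, rings, n_radials=360):
    """Yield (x, y) receptor coordinates on radial arms around the source at (x0, y0)."""
    for i in range(n_radials):              # one-degree increments
        az = math.radians(i)                # azimuth measured clockwise from north
        for r in rings:
            yield (x0 + r * math.sin(az), y0 + r * math.cos(az))

receptors = list(polar_receptors(0.0, 0.0, rings))
print(len(receptors))    # 360 radials x 23 rings = 8280 receptors per source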

Task 1b: Compare AERMOD results from the two different WRF datasets
The output from the 200 (2 WRF sets x 4 MMIF options x 5 sources x 5 locations) AERMOD
runs performed in Task 1a will be statistically analyzed and compared to each other. The tools
used to examine the different results will include contour plots,  quantile-quantile (Q-Q) plots,
statistical tables summarizing  each  averaging period by radial distance from the source, and
scatter diagrams. The source of these differences will be traced back through the process to
find the controlling aspect or feature of the upstream program that caused the difference. The
importance of each will be investigated and reported. Note that because there are many different
simulations, averaging periods, and ways to summarize the results, cross-matrices containing
results from every possible comparison will not all be reported.
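A quantile-quantile comparison of two AERMOD runs, as proposed above, simply pairs the independently sorted concentration distributions (i.e., unpaired in time). A minimal sketch, assuming matplotlib is available, is given below for illustration.

import numpy as np
import matplotlib.pyplot as plt

def qq_plot(conc_a, conc_b, label_a="WRF dataset A", label_b="WRF dataset B"):
    """Plot sorted concentrations from one AERMOD run against another (Q-Q plot)."""
    a = np.sort(np.asarray(conc_a, dtype=float))
    b = np.sort(np.asarray(conc_b, dtype=float))
    # Compare matching quantiles in case the two series have different lengths.
    qs = np.linspace(0.0, 1.0, min(len(a), len(b)))
    a_q = np.quantile(a, qs)
    b_q = np.quantile(b, qs)
    plt.loglog(a_q, b_q, "k.", ms=3)
    lim = [min(a_q.min(), b_q.min()), max(a_q.max(), b_q.max())]
    plt.loglog(lim, lim, "b-", lw=1)   # 1:1 reference line
    plt.xlabel(f"{label_a} concentration")
    plt.ylabel(f"{label_b} concentration")
    plt.show()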
January 2013
               ENVIRON

-------
Final                                                          Overwater Dispersion Modeling
                                                                         Task 1 Protocol

Task 1c: Compare the WRF runs
The WRF simulation datasets will each be subjected to a limited model performance evaluation
(MPE) using the METSTAT program. METSTAT uses surface meteorological observations and
extracted WRF data (paired in time and space) to calculate a series of statistical measures
designed to examine WRF's ability to characterize the observations. ENVIRON will use as full
an observed dataset as feasible, including MADIS data over land and buoy data over water. We
have requested the verification dataset used by BOEM for its WRF MPE, and will use it if we are
allowed. The ConocoPhillips simulations overlap the BOEM simulations from June 15th through
the end of November for the years 2007-2009.  ENVIRON will focus the meteorological
evaluation on July through November, but will include  other months if quality-controlled
overwater data sources exist during the sea-ice season (most buoy data becomes unreliable
during the sea-ice season).
The BOEM and ConocoPhillips domains need to be compared as directly as possible. As such,
ENVIRON will perform three comparisons:
   1.  ConocoPhillips 4 km and 12 km domains using  observations that fall within the 4 km
      domain only, to determine whether resolution affects the modeling accuracy
   2.  ConocoPhillips 12 km against BOEM 10 km domain, using observations from the
      intersection of the two domains
   3.  ConocoPhillips 4 km against BOEM 10 km domains, with observations from the
      overlapping region
The ConocoPhillips 4 km domain  omits much of the Beaufort Sea as illustrated in the MADIS
station plot (Figure 6). In each case, a 5 grid cell buffer will be excluded from the evaluation due
to edge effects.
In order to separate over land from overwater WRF model performance we will re-run
METSTAT with subsets of the data corresponding to over land and overwater sources.
ENVIRON will use METSTAT to evaluate WRF for surface wind speed, wind direction,
temperature, and humidity. These plots include performance goals (Emery et al., 2001;
Kemball-Cook et al., 2005) for complex and simple terrain that are not designed to define
passing or failing grades, but rather to assist in interpreting and comparing meteorological
model performance across applications. We will also compare annual wind roses from each
WRF simulation by processing the data through MMIF and AERCOARE. For each model
simulation, we'll also examine temporal and spatial distributions of air-sea temperature
differences using monthly and annual average plots.
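The sketch below shows the kind of paired-in-time statistics METSTAT reports (bias, gross error, and RMSE), with wind direction differences wrapped to the shortest angular distance. It is a simplified stand-in for illustration, not the METSTAT code, and the benchmarks against which such statistics would be judged are those of Emery et al. (2001) and Kemball-Cook et al. (2005).

import numpy as np

def wind_dir_diff(pred, obs):
    """Smallest signed difference between predicted and observed directions (degrees)."""
    return (np.asarray(pred) - np.asarray(obs) + 180.0) % 360.0 - 180.0

def paired_stats(pred, obs, circular=False):
    """Bias, gross (mean absolute) error, and RMSE for series paired in time."""
    diff = wind_dir_diff(pred, obs) if circular else np.asarray(pred) - np.asarray(obs)
    return {
        "bias": float(np.mean(diff)),
        "gross_error": float(np.mean(np.abs(diff))),
        "rmse": float(np.sqrt(np.mean(diff ** 2))),
    }

# Example with a few hours of WRF-extracted vs. buoy-observed values
ws_stats = paired_stats([6.2, 7.1, 5.4], [5.8, 7.5, 5.0])                  # wind speed (m/s)
wd_stats = paired_stats([350.0, 10.0, 20.0], [10.0, 355.0, 25.0], True)    # wind direction (deg)
print(ws_stats, wd_stats, sep="\n")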
Table 2 illustrates the primary similarities and differences between the models. The two runs
have very important differences in forcing data (GFS vs. ERA-Interim), vertical PBL scheme,
grid resolution, and domain size/shape. There are also more subtle differences in the
observational nudging data and nudging strengths.
January 2013                                7                                   ENVIRON

-------
Final
                                                Overwater Dispersion Modeling
                                                           Task 1 Protocol
Figure 6. MADIS observations in ConocoPhillips 4 km domain. Taken from Figure 2-3
from McNally & Wilkinson (2011).
January 2013
                                                               ENVIRON

-------
Final
Overwater Dispersion Modeling
           Task 1 Protocol
Table 2. Model configuration for BOEM and ConocoPhillips WRF simulations
Parameter                BOEM WRF                                ConocoPhillips WRF
Model Grid               10 km with 49 vertical levels           36-12-4 km with 37 vertical levels
Forcing Data             ERA interim reanalysis (ERA-I)          GFS 1/2 degree dataset
PBL                      MYJ                                     YSU
Microphysics             Morrison                                Morrison
LW/SW radiation          RRTM/RRTMG                              RRTM/RRTMG
Surface Layer Physics    ETA similarity                          MM5 similarity
LSM                      NOAH w/ improved sea ice albedo         NOAH
Cumulus Physics          Kain-Fritsch                            Kain-Fritsch (36-12 only)
Data Assimilation        In situ surface obs., radiosondes,      Nudging to MADIS data on 4-km domain
(obs. nudging)           QuikSCAT sfc winds, MODIS profiles,     with a radius of influence of 50 km.
                         COSMIC profiles.
Analysis Nudging         Three-wavenumber spectral nudging of    36 & 12 km for winds and temperature
                         all variables and all levels.           at all model levels.
Lower boundary           AMSR-E sea ice thickness and conc.,     GFS initialized using SST-update option.
                         and CMC snow depth.                     Water temperatures from NCEP RTG
                                                                 1/12 degree analysis.
W damping                Off                                     On
Advection                Positive-Definite                       Monotonic
Years                    2005-2009                               2007-2009 (June 15 - December 3 only)
January 2013
              ENVIRON

-------
Final                                                           Overwater Dispersion Modeling
                                                                         Task 1 Protocol

REFERENCES
Emery, C.A., E. Tai, and G. Yarwood. (2001). Enhanced Meteorological Modeling and
       Performance Evaluation for Two Texas Ozone Episodes. Prepared for the Texas Natural
       Resource Conservation Commission, by ENVIRON International Corp, Novato, CA.

ENVIRON (2012). Evaluation of the Combined AERCOARE/AERMOD Modeling Approach for
       Offshore Sources. Prepared for U.S. EPA Region 10, Seattle, WA 98101. EPA-910-R-12-007,
       October 2012.
Kemball-Cook, S., Y. Jia, C. Emery, R. Morris, Z. Wang, G. Tonnesen. (2005). Alaska MM5
       Modeling for the 2002 Annual  Period to Support Visibility Modeling. Prepared for the
       Western Regional Partnership, Denver, CO. Prepared by ENVIRON International
       Corporation, Novato, CA and University of California at Riverside, Center for
       Environmental Research and Technology, September, 2005.
       http://pah.cert.ucr.edu/aqm/308/docs/alaska/Alaska_MM5_DraftReport_Sept05.pdf.

McNally, D., & Wilkinson, J. G. (2011). Model Application and Evaluation: ConocoPhillips
       Chukchi Sea WRF Model Application. Arvada, Colorado: Alpine Geophysics, LLC.
Schulman, L. L, D. G. Strimaitis, and J. S. Scire (1997).  Addendum to  ISC3 User's Guide: The
       PRIME Plume Rise and Building Downwash Model.  Submitted by Electric Power
       Research Institute.  Prepared by Earth Tech, Inc., Concord, MA, November 1997.
Zhang, J., Liu, F., Krieger, J., Tao, W., and Zhang, X. (2011). Beaufort and Chukchi Seas
       Mesoscale Meteorology Model Study.
January 2013                               10                                  ENVIRON

-------
[Blank]

-------
APPENDIX A.2: TASK 2 DRAFT PROTOCOL

-------
[Blank]

-------
DRAFT Overwater Dispersion Modeling
                        Task 2 Protocol
       AMEC RFP # 12-6480110233-TC-3902
      Federal Prime Contract # EP-W-09-028
                              Prepared for:
         AMEC Environment & Infrastructure, Inc.
             502 W. Germantown Pike, Suite 850
              Plymouth Meeting, PA 19462-1308
                      Attention: Thomas Carr

                              Prepared by:
             ENVIRON International Corporation
                773 San Marin Drive, Suite 2115
                     Novato, California, 94945
                      www.environcorp.com
                            P-415-899-0700
                            F-415-899-0707
                            March 11, 2013
                        ENVIRON

-------
Final                                                          Overwater Dispersion Modeling
                                                                        Task 2 Protocol

INTRODUCTION
The primary objective of the current study is to test and evaluate AERMOD on the outer
continental shelf (OCS). The current modeling procedures for sources on land use the
AERMOD modeling system. The meteorological AERMET processor included in the system is
inappropriate for OCS sources because the energy fluxes over water are not strongly driven by
diurnal heating and cooling. In addition, the meteorological observations necessary to drive the
dispersion models are commonly not available, especially in the Arctic Ocean. For applications
in the Arctic, the remote location and seasonal sea-ice pose logistical problems for the
deployment of buoys or offshore measurement platforms.
This study evaluates a combined modeling  approach where the meteorological variables are
provided by the Weather Research and Forecasting (WRF) mesoscale model, and then
processed by a combination of a new Mesoscale Model Interface program (MMIF) and,
optionally, AERCOARE (a replacement for AERMET suitable for overwater conditions). Given
an appropriate overwater meteorological dataset, AERMOD can then be applied for New
Source Review following the same procedures as used for sources over land.
The remainder of this document presents a protocol for Task 2 of the study. Task 2 compares
WRF-driven AERMOD dispersion predictions against the concentrations observed in five
offshore tracer studies. The same four North American studies were used previously to evaluate
AERCOARE using actual overwater observations (ENVIRON, 2010); the current study also
includes Oresund. In this task, WRF is used to provide predictions of the overwater
observations. Model performance using WRF is compared to the performance found with actual
observations.

Task 2: Evaluate the use of WRF Solutions with AERMOD
AMEC and ENVIRON prepared a Work Plan outlining the various tasks and objectives of the
current study. As directed by EPA and AMEC, the second task: To generate WRF simulations
that match five offshore tracer studies and then model the tracer dispersion using multiple
configurations of MMIF/AERCOARE/AERMOD.  The protocol includes additional information on
data, options, and issues that were not fully described in the Work Plan. Wth an approved
protocol, ENVIRON staff will perform the following subtasks:

Task 2a: Generate WRF simulations to match the five offshore tracer studies
ENVIRON will perform meteorological simulations to match five historical field studies
conducted in:
 •  Cameron, LA: July 1981 and February 1982
 •  Pismo Beach, CA: December 1981  and June 1982
 •  Carpinteria,  CA: September 1985 and  October 1985
 •  Ventura,  CA: September 1980 and January 1981
 •  Oresund (between Denmark and Sweden): May/June 1984
March 11,2013                              1                                   ENVIRON

-------
Final                                                           Overwater Dispersion Modeling
                                                                          Task 2 Protocol

The four North American studies listed have been used for OCS model development including
for OCD, CALPUFF, and most recently for AERCOARE. For these simulations, ENVIRON has
selected the National Center for Atmospheric Research's (NCAR's) community-developed WRF
model (dynamical core version 3.4.1). WRF is a limited-area, non-hydrostatic, terrain-following
eta-coordinate mesoscale model.
WRF must be optimized for land-sea contrast, including Sea Surface Temperature (SST), the
land-sea-breeze circulation, and the correspondingly influenced temperature structure because
the tracer studies are in an offshore coastal environment. ENVIRON's WRF configurations
attempt to capture the timing and location of rapidly-changing and diurnally-influenced land-sea-
breeze regimes, which will be critical to a successful simulation. To do this, ENVIRON's model
configuration should include the most accurate initial inputs, regionally applicable physics
choices, and a nudging selection that incorporates local field-study data, combined with the best
SSTs and land surface models available.
ENVIRON's base case configuration will include 5.5-day simulation blocks, with a minimum of
12 hours for model spin-up prior to experimental tracer release times. The spin-up time allows
for the model to develop sub-grid scale processes, including vorticity and moisture fields. Table
2 summarizes the date ranges for the tracer studies and corresponding regional weather model
initializations. The U.S. modeling domains are defined on the Lambert Conformal Conic (LCC)
map projection identical to the National Regional Planning Organization (RPO) domains, with an
outermost RPO domain (36 km) and telescoping 12-4-1.33 km nests to capture the fine detail of
coastlines and adjacent topography. The domain configuration for Oresund is similar; however,
the location requires a  projection defined for Northern Europe. Domains are provided in Figures
1-5. Figures showing the inner-most 1.33 km domain  nests can be found in the discussion of
Task 2c, starting on page 11.
The planned model vertical structure includes 37 levels, stacked disproportionately toward the
surface, where finer vertical structure is critical to modeling the boundary layer and therefore
coastal weather (see Table 3). The proposed boundary layer
resolution will use finer vertical spacing than ENVIRON typically uses for most simulations over
land, as we anticipate this will help winds and temperatures respond more explicitly to
dynamical influences.
ENVIRON will include high resolution sea surface temperatures from NOAA's 1/4 degree
Optimum Interpolation (OI) dataset V2 (AVHRR). The OI dataset will replace the global model
values from September 1981 onward. Both the winter and summer periods at Ventura and the
winter period at Cameron occurred before the OI SST dataset starts, and will use the coarser
later in this section aim to correct air-sea temperature biases resulting from coarser SST data
available in the early 1980's. The subgrid-scale fluxes at the lower boundary of WRF will be
treated by the four-layer NOAH land-surface model, which has been used extensively at
ENVIRON with success. The NOAH land-surface model was also used for a similar WRF
transport study along the California coast (Yver C., 2012).
March 11,2013                               2                                   ENVIRON

-------
Final                                                           Overwater Dispersion Modeling
                                                                         Task 2 Protocol

Other WRF options ENVIRON proposes are: the sophisticated Morrison 2-moment
microphysics that predicts both number concentration and mixing ratios, the Rapid Radiative
Transfer Model (RRTMG) long-wave/shortwave physics, and Monin-Obukhov (Janjic) surface
physics. ENVIRON will parameterize cumulus on the 36 and 12 km domains using the Kain-
Fritsch WRF option, and will use the new Kain-Fritsch "Eta" trigger. We note the Grell 3-D cumulus
algorithm was used in the Yver California coastal tracer study. If cumulus performance in this
study is unsatisfactory, Grell 3-D will be considered. At higher resolution (4 and 1.33 km),
convection will be treated explicitly by the model rather than parameterized.
ENVIRON understands that mesoscale models will generally have some bias and no single
solution provides the best simulation in all circumstances.  The accuracy and variability of the
WRF model are critical to evaluate its success as an input to downstream dispersion models as
a single choice of model setup represents a single deterministic solution. Thus, ENVIRON
proposes to approach the simulation of each historical field study as an ensemble of
simulations, varying two choices that are highly influential and whose effects are difficult to
anticipate: the reanalysis input and the planetary boundary layer (PBL) scheme. The ensemble
will include all eight possible combinations of the reanalysis datasets and PBL schemes listed below:
    •  European Center for Medium Range Weather Forecast's ERA-Interim dataset (ERA-I,
       6-hourly analysis output, ~0.5 degree)
    •  North America Regional Reanalysis (NARR, U.S. cases only, 3-hourly analysis dataset,
       ~0.3 degree)
    •  Yonsei University (YSU) PBL scheme
    •  Mellor-Yamada-Janjic (MYJ) PBL scheme
    •  University of Washington Shallow Convection (UW-PBL) PBL scheme
    •  Total Energy - Mass Flux (TEMF) PBL scheme
The first two PBL selections have been used extensively in ENVIRON modeling, while the UW-
PBL has been shown to reduce climate bias by 7% in the Community Atmosphere Model (Park,
2009). The TEMF  scheme with total turbulent energy as a prognostic variable and integrated
shallow cloud is intended to improve simulations with  shallow cloud and/or stable boundary
layers and therefore deserves testing in this study  (Angevine, 2012). The ERA-I dataset has
replaced the Global Forecast System (GFS) model as our standard at ENVIRON  due to higher
accuracy (Angevine, 2012). NARR was employed successfully to model arctic over-water
meteorology in a BOEM study (Zhang, 2011).
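For clarity, the eight-member ensemble is simply the cross product of the two reanalysis datasets and the four PBL schemes listed above. A trivial sketch of how the members might be enumerated follows; the member names are illustrative only.

from itertools import product

reanalyses  = ["ERA-I", "NARR"]            # NARR applies to the U.S. cases only
pbl_schemes = ["YSU", "MYJ", "UW", "TEMF"]

members = [f"{ra}_{pbl}" for ra, pbl in product(reanalyses, pbl_schemes)]
print(len(members), members)               # 8 members, e.g. ERA-I_YSU ... NARR_TEMF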
Effective nudging will be critical to successful modeling of meteorological conditions where
satellite-derived SST's do not exist. Nudging provides an opportunity to correct for biases in the
air-sea temperature difference that will profoundly affect boundary layer structure and wind
speeds. ENVIRON has experimented extensively with nudging in WRF;  there are options to
nudge wind, temperature, and moisture fields toward 3-D and 2-D analysis fields; and options to
nudge toward surface observations using specified horizontal and vertical radii of influence.
Nudging excessively leads to non-physical development of weather patterns.
March 11,2013                               3                                  ENVIRON

-------
Final
Overwater Dispersion Modeling
            Task 2 Protocol
ENVIRON recommends 3-D nudging toward analysis grids for wind, temperature, and moisture
for the 36 and 12 km domains. PBL nudging prevents the development of physical boundary
layer processes. Therefore, ENVIRON's strategy involves 3-D nudging above the PBL only;
surface analysis nudging, at the base of the PBL, will not be employed. Observational nudging
with a radius of influence of 50 km or less and a vertical radius of influence up to approximately
900 mb will be employed for temperature and wind speed, but not moisture (to avoid the
development of spurious convection), using the ds3505 data.1 In this manner, we hope to
capture features such as the deceleration of near-surface winds blowing from a heated land
surface associated with a stable layer that develops over a cool water surface, as observed and
modeled at Oresund (Doran J.C., 1987). The  use of observational data may adversely impact
the model if over-water data is represented by the model on land or vice-versa. In this case,
ENVIRON may choose to exclude certain data from the objectively analyzed data, or  perhaps
turn off observational nudging entirely. Table 1 provides the types of nudging employed, the
variables nudged, and  the strength of the nudging coefficients employed.2
In order to compare the quality of the simulation, ENVIRON will analyze all available
meteorological datasets and will compare in-situ data with time-series plots of temperature, air-
sea temperature difference, wind speed, wind direction, mixing height, and lapse rates.
                     Table 1. Nudging in WRF model simulations
Domain (km)   3-D Nudging                       Observational Nudging
              Variables     Coefficients        Variables     Coefficients
36            Q, UV, T      0.0003              -             -
12            Q, UV, T      0.0003              -             -
4             -             -                   -             -
1.33          -             -                   UV, T         0.0005
1 DS3505 integrated surface hourly (ISH) worldwide station data includes extensive automated QC on all data and
additional manual QC for USAF, NAVY, and NWS stations. It integrates all data from DS9956, DS3280, and DS3240.
10,000 currently active stations report wind speed and direction, wind gust, temperature, dew point, cloud data,
sea level pressure, altimeter setting, station pressure, present weather, visibility, precipitation amounts for various
time periods, snow depth, and various other elements as observed by each station. (NOAA/NCDC, 2010)
2 ENVIRON will employ different nudging coefficients, nudging variables, or nudging in the PBL if the simulation
diverges from the input grids or lacks meteorological validity.
March 11, 2013
                ENVIRON

-------
Final
Overwater Dispersion Modeling
            Task 2 Protocol
            Table 2. Historical Field Study Dates and WRF Initializations
Location          Historical Field Study Date Ranges           WRF Initializations
Cameron, LA       Period 1: 08Z 08/20/1981 to 13Z 08/29/1981   Period 1: 12Z 08/19/1981; 12Z 08/24/1981
                  Period 2: 08Z 02/15/1982 to 14Z 02/24/1982   Period 2: 12Z 02/14/1982; 12Z 02/19/1982
Carpinteria, CA   Period 1: 09/19/1985 to 09/29/1985           Period 1: 00Z 09/18/1985; 00Z 09/23/1985;
                  (Complex Terrain Study only)                            00Z 09/28/1985
Pismo Beach, CA   Period 1: 12/08/1981 to 12/15/1981           Period 1: 00Z 12/07/1981; 00Z 12/12/1981
                  Period 2: 06/21/1982 to 06/27/1982           Period 2: 00Z 06/20/1982; 00Z 06/25/1982
Ventura, CA       Period 1: 01/06/1980 to 01/13/1980           Period 1: 12Z 01/05/1980; 12Z 01/10/1980
                  Period 2: 09/27/1981 to 09/29/1981           Period 2: 12Z 09/26/1981
Oresund,          Period 1: 11Z 05/16/1984 to 13Z 06/05/1984   Period 1: 12Z 05/15/1984; 12Z 05/20/1984;
Denmark/Sweden                                                           12Z 05/25/1984; 12Z 05/30/1984;
                                                                         12Z 06/04/1984
                  Period 2: 10Z 06/12/1984 to 15Z 06/14/1984   Period 2: 12Z 06/11/1984
March 11, 2013
                ENVIRON

-------
Final
Overwater Dispersion Modeling
            Task 2 Protocol
                       Table 3. WRF Model 37 Vertical Levels.
Level   eta      Pressure   Height     Mid Height   Dz
                 (mb)       (m)        (m)          (m)
1       1        1000       0.0        -            -
2       0.9985   999        12.2       6.1          12.2
3       0.997    997        24.5       18.4         12.2
4       0.995    995        40.8       32.7         16.4
5       0.993    993        57.2       49.0         16.4
6       0.991    991        73.6       65.4         16.4
7       0.988    989        98.3       85.9         24.7
8       0.985    986        123.0      110.6        24.7
9       0.98     981        164.3      143.6        41.3
10      0.97     972        247.4      205.9        83.1
11      0.96     962        331.2      289.3        83.8
12      0.95     953        415.7      373.4        84.5
13      0.94     943        500.8      458.2        85.1
14      0.93     934        586.6      543.7        85.8
15      0.91     915        760.5      673.5        173.8
16      0.89     896        937.2      848.8        176.8
17      0.87     877        1117.1     1027.1       179.8
18      0.84     848        1392.8     1254.9       275.8
19      0.8      810        1772.4     1582.6       379.6
20      0.76     772        2166.7     1969.6       394.3
21      0.72     734        2577.0     2371.9       410.3
22      0.68     696        3005.0     2791.0       427.9
23      0.64     658        3452.2     3228.6       447.3
24      0.6      620        3921.0     3686.6       468.7
25      0.55     573        4540.7     4230.8       619.8
26      0.5      525        5203.7     4872.2       662.9
27      0.45     478        5917.1     5560.4       713.4
28      0.4      430        6690.5     6303.8       773.4
29      0.35     383        7536.4     7113.5       846.0
30      0.3      335        8472.3     8004.4       935.8
31      0.25     288        9522.5     8997.4       1050.2
32      0.2      240        10724.1    10123.3      1201.6
33      0.15     193        12136.7    11430.4      1412.6
34      0.1      145        13866.9    13001.8      1730.1
35      0.06     107        15621.6    14744.2      1754.7
36      0.027    76         17503.4    16562.5      1881.8
37      0        50         19594.2    18548.8      2090.8
Note: Calculated using P0 = 1000 mb, Ptop = 50 mb, T0 = 20.15 C, and dT/dz = -6.5 C/km.
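The pressure and height columns in Table 3 follow from the eta definition and a constant-lapse-rate standard atmosphere using the constants in the note above. A short sketch reproducing them to within rounding is given below.

# Reproduce Table 3's pressure/height columns from the eta values and the note's constants.
p0, ptop  = 1000.0, 50.0     # mb
t0, lapse = 293.30, 0.0065   # surface temperature (20.15 C in K) and lapse rate (K/m)
rd, g     = 287.0, 9.81      # dry-air gas constant (J/kg/K), gravity (m/s2)

def eta_to_pressure(eta):
    """Pressure (mb) for a terrain-following eta level over a surface at p0."""
    return ptop + eta * (p0 - ptop)

def pressure_to_height(p):
    """Height (m) above the surface at pressure p in a constant-lapse-rate atmosphere."""
    return (t0 / lapse) * (1.0 - (p / p0) ** (rd * lapse / g))

for eta in (1.0, 0.89, 0.5, 0.0):
    p = eta_to_pressure(eta)
    print(f"eta={eta:5.3f}  p={p:6.1f} mb  z={pressure_to_height(p):8.1f} m")
# eta = 0.89 gives p ~ 895.5 mb and z ~ 937 m, matching the 896 mb / 937.2 m row above.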
March 11, 2013
                ENVIRON

-------
Final
                                    Overwater Dispersion Modeling
                                               Task 2 Protocol
Figure 1. Pismo Beach (CA) WRF domain map. The entire map illustrates the 36 km
domain, while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
March 11, 2013
                                                   ENVIRON

-------
Final
                                     Overwater Dispersion Modeling
                                                Task 2 Protocol
Figure 2. Carpinteria (CA) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
Figure 3. Ventura (CA) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
Figure 4. Cameron (LA) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
Figure 5. Oresund (DK) WRF domain map. The entire map illustrates the 36 km domain,
while d02, d03, and d04 correspond to the 12 km, 4 km, and 1.33 km domains,
respectively.
Task 2b: Run AERMOD using WRF solutions
As mentioned in the Task 1 Protocol for this study, there are many different options for the
preparation of WRF-predicted meteorology for use by AERMOD. ENVIRON proposes to
examine and compare the same four options as follows for each ensemble member of each
case:
   1.     MMIF will be applied to extract and prepare data sets for direct use by AERMOD. All
         variables will be as predicted by the WRF simulations including the surface energy
         fluxes, surface roughness and planetary boundary layer (PBL) height.
   2.     As in Option  1), but the PBL height will be re-diagnosed from the wind speed and
         potential temperature profiles using the Bulk-Richardson algorithm within MMIF.
         AERMOD simulations can be very sensitive to the PBL height (ENVIRON, 2012) and
         MMIF processed PBL height may provide significantly different predicted
         concentrations than the PBL height used internally by WRF.
   3.     MMIF will be applied to extract the key meteorological variables of overwater wind
         speed, wind direction, temperature, humidity and PBL height. AERCOARE will use
         these variables to predict the surface energy fluxes, surface roughness  length and
         other variables  needed for the AERMOD simulations. AERCOARE has  a surface
         layer scheme developed specifically to predict surface fluxes from overwater
         measurements. In this application, the WRF simulations provide the variables that
         might be measured by a buoy,  ship or offshore platform. AERCOARE can also be
         applied using a number of different options. For the current study, we propose to
          apply AERCOARE using the defaults recommended in the AERCOARE model
          evaluation study (ENVIRON, 2012).
   4.     As in Option  3), but the PBL height will be re-diagnosed using the Bulk-Richardson
         algorithm within MMIF. AERCOARE will be applied as in Option 3.
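The four options above amount to a 2 x 2 matrix: the source of the surface energy fluxes crossed with the source of the PBL height. The minimal sketch below only enumerates that matrix for bookkeeping; the labels are descriptive and are not MMIF or AERCOARE keywords.

    from itertools import product

    FLUX_SOURCES = ("WRF fluxes passed through MMIF", "fluxes re-derived by AERCOARE")
    PBL_SOURCES  = ("WRF-predicted PBL height", "MMIF Bulk-Richardson PBL height")

    # Option numbers 1-4 follow the order given in the text.
    OPTIONS = {n: {"fluxes": f, "pbl_height": p}
               for n, (f, p) in enumerate(product(FLUX_SOURCES, PBL_SOURCES), start=1)}

    for n, opt in OPTIONS.items():
        print(f"Option {n}: {opt['fluxes']}; {opt['pbl_height']}")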

Task 2c: Compare and analyze WRF-driven AERMOD predictions against data collected
from the five field studies
ENVIRON will extract meteorological datasets from the WRF simulations for each period of
the field studies in Cameron, Ventura, Pismo Beach, Carpinteria, and Oresund  using MMIF.
In Task 2b, MMIF is applied to generate datasets both for AERCOARE processing and
for direct use by AERMOD, bypassing AERCOARE. These two methods can be used to
contrast the differences between the surface energy fluxes predicted by AERCOARE
versus the internal algorithms selected for the WRF simulations.
AERMOD will simulate the tracer releases for each field study, and the resulting predictions will be
compared to observations using the same statistical procedures employed in previous
AERCOARE model evaluation studies (ENVIRON, 2012). We will compare the model
performance of: WRF-driven AERCOARE versus WRF-driven AERMOD; WRF-driven
AERCOARE versus meteorological observation-driven AERCOARE; and WRF-driven
AERCOARE independent of wind direction versus meteorological observation-driven
AERCOARE3. In addition to concentration predictions, the meteorological predictions from
WRF will be compared to the measurements from the field studies. We will diagnose the
important variables and options that resulted in different predicted concentrations for the
various cases considered. ENVIRON proposes to use a variety of graphical techniques to
represent the tracer versus modeled concentrations as performed in previous OCD, CALPUFF,
and AERCOARE evaluations. Graphical analysis includes Q-Q plots which compare two
probability distributions, in this case predicted versus observed concentration, by plotting their
quantiles against each other with logarithmic scales on the axes. Log-log scatter plots are
employed to evaluate the temporal relationship between observed and predicted concentration.
The former identifies biases in the model related to the magnitude of the concentrations, while
the latter identifies whether the model can explain the observed temporal variability.
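For reference, a minimal sketch of the Q-Q comparison described above is given below (assuming positive concentrations so that logarithmic axes are valid; quantiles are computed independently for each sample, so the plot compares distributions rather than hour-by-hour pairs).

    import numpy as np
    import matplotlib.pyplot as plt

    def qq_plot(observed, predicted, ax=None):
        """Q-Q plot of predicted vs. observed concentrations on log-log axes."""
        probs = np.linspace(0.02, 0.98, 49)
        obs_q = np.quantile(np.asarray(observed, dtype=float), probs)
        pred_q = np.quantile(np.asarray(predicted, dtype=float), probs)
        ax = ax or plt.gca()
        ax.loglog(obs_q, pred_q, "o", ms=4)
        lims = [min(obs_q.min(), pred_q.min()), max(obs_q.max(), pred_q.max())]
        ax.loglog(lims, lims, "k--", lw=1)   # 1:1 reference line
        ax.set_xlabel("Observed concentration")
        ax.set_ylabel("Predicted concentration")
        return ax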
Some of the differences and details for Task 2 are discussed in the following subsections for
each of the five field studies.
Pismo Beach. The Pismo Beach experiment was conducted during December 1981 and June
1982. Figure 6, below, illustrates the WRF domain relative to the tracer experiment, while Figure
7 zooms in to illustrate just the tracer experiment setup. The tracer was released from a boat
mast  13.1-13.6 m above the water. Peak concentrations occurred near the shoreline at
sampling distances from 6 to 8 kilometers away. The Pismo Beach evaluation database
consists of 31 sampling periods.
The meteorological data shows discrepancies between the air-sea temperature difference and
the lapse rate at times during  this field experiment - sometimes the lapse rate indicates a stable
boundary layer and the air-sea temperature difference indicates unstable conditions. Several
previous modeling studies relied  on the lapse rate and corrected the air-sea difference to be at
least as stable as indicated by the lapse rate (Environ, 2010).  This method will be applied for
this study as well.
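One plausible form of that correction is sketched below; this illustrates the idea only and is not necessarily the exact procedure of the Environ (2010) study. Here stability is judged by comparing the observed lapse rate to the dry adiabatic rate, and the sea surface temperature is the quantity adjusted.

    DRY_ADIABATIC = -0.0098   # K/m

    def correct_air_sea_difference(t_air, t_sea, dt_dz):
        """Force the air-sea difference to be no less stable than the lapse rate implies."""
        stable_lapse = dt_dz > DRY_ADIABATIC          # lapse rate indicates a stable layer
        unstable_air_sea = (t_air - t_sea) < 0.0      # sea warmer than air implies unstable
        if stable_lapse and unstable_air_sea:
            t_sea = t_air                             # limit the interface to neutral
        return t_air, t_sea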
3 In previous OCD, CALPUFF, and AERCOARE model evaluation analyses, wind directions were assigned to ensure simulated
plumes were centered on the receptor with the highest prediction. This focused the previous evaluations on plume diffusion
rather than plume transport. The proposed evaluation will also distinguish between differences caused by apparent
plume transport errors and differences resulting from WRF's predicted boundary layer structure.
Figure 6. Pismo Beach WRF 1.33 km domain (solid magenta line), with 5 point grid cell
buffer (dashed magenta line)
Figure 7. Pismo Beach tracer release and sampler location map with landuse in the background
Carpinteria. The Carpinteria experiment was conducted in September and October of 1985.
Studies examined impacts caused by both interaction with complex terrain and shoreline
fumigation. Due to limitations in the AERCOARE-AERMOD approach, only complex terrain data
can be analyzed in this study. Figure 8, below, illustrates the WRF domain relative to the tracer
experiment, while Figure 9 zooms in to illustrate just the tracer experiment setup. Shoreline
receptors on a 20 to 30 meter high bluff are located within 0.8 to 1.5 km of the offshore
tethersonde release. Very light winds were observed during much of the study period. A
constant mixing height in the dataset suggests a problem with the instrumentation. ENVIRON
will use WRF mixing heights as input to AERCOARE in all cases as a best available estimate.
Figure 8. Carpinteria WRF 1.33 km domain (solid magenta line), with 5 point grid cell
buffer (dashed magenta line)
Figure 9. Carpinteria tracer release and sampler location map with landuse in the
background
Cameron. The Cameron experiment includes 26 tracer samples from field studies in July 1981
and February 1982. Tracers were released from a boat and a low profile platform (13 m). As in
the Pismo Beach study, the receptors are located in flat terrain near the shoreline with transport
distances ranging from 4 to 10 km. Figure 10, below, illustrates the WRF domain relative to the
tracer experiment, while Figure 11 zooms in to illustrate just the tracer experiment setup.
Meteorological  discrepancies similar to those in the Pismo Beach  study exist here, and the air-
sea difference correction to lapse rate stability will also be applied for the analysis.
Figure 10. Cameron WRF 1.33 km domain (solid magenta line), with 5 point grid cell
buffer (dashed magenta line)
Figure 11. Cameron tracer release and sampler location map with landuse in the
background
Ventura. The tracer dispersion study in the Ventura, California area was conducted along the
California coast during 4 days in September 1980 and 4 days in January 1981. Data from all 4
of the days in September and 3 of the 4 test days in January are in the dataset. SF6 tracer was
released about 8 m above the water from a boat located 6-8 km from shore, and sampled along
2 arcs about 10 to 12 km long. The first arc is 1/4 km to 1 km from the shoreline and the second
arc is about 7 km from the shoreline. Figure 12, below, illustrates the WRF domain relative to
the tracer experiment, while Figure 13 zooms in to illustrate just the tracer experiment setup.
Meteorological data used in previous evaluations include wind at 20.5 m, temperature at 7 m,
and the air-sea temperature difference measured at the release location, as well as the vertical
temperature gradient measured over the water by an aircraft.
Figure 12. Ventura WRF 1.33 km domain (solid magenta line), with 5 point grid cell buffer
(dashed magenta line)
Figure 13. Ventura tracer release and sampler location map with landuse in the
background
Oresund. The tracer dispersion study over the Strait of Oresund was conducted between the
coasts of Denmark and Sweden during 9 days between May 15 and June 14, 1984. SF6 was
released as a non-buoyant tracer from a tower at either 95 m above the ground (Barseback,
Sweden) or 115 m above the ground (Gladsaxe, Denmark) on the opposite side of the strait, and
was sampled along arcs set up on the opposite shore and at distances 2-8 km inland. Air-sea
temperature differences were as large as 6-8°C on five of the experiment days, due to warm
advection over the cooler water.
Meteorological data in the study included a lighthouse in the strait, meteorological towers and
masts, SODARs, 3-hourly radiosondes, and occasional mini-sondes released from a boat in the
strait. Figure 14, below, illustrates the WRF domain relative to the tracer experiment.
Figure 14. Oresund tracer release and sampler location map with landuse in the
background
REFERENCES
NOAA/NCDC. (2010, 09 15). Retrieved 02 15, 2013, from
       http://www.ncdc.noaa.gov/oa/climate/rcsg/datasets.html#surface
Angevine, W. (2012). Performance Results with the Total Energy - Mass Flux PBL scheme.
       Retrieved 01 22, 2013, from
       http://www.mmm.ucar.edu/wrf/users/workshops/WS2012/ppts/3.2.pdf
Doran J.C., G. S. (1987). Wind and Temperature Structure over a Land-Water-Land-Area.
       American Meteorological Society.
Environ. (2010). Evaluation of the COARE-AERMOD Alternative Modeling Approach Support for
       Simulation of Shell Exploratory Drilling Sources In the Beaufort and Chukchi Seas.
Hahmann, A. N., Draxl, C., Pena, A., Badger, J., Larsen, X., & Nielsen, J. R. (2011). Simulating
       the Vertical Structure of the Wind with the WRF Model.
       http://www.mmm.ucar.edu/wrf/users/workshops/WS2011/Power%20Points%202011/
       5_4_Hahmann_WRFWorkshop_11.pdf.
McNally, D., & Wilkinson, J. G. (2011). Model Application and Evaluation: ConocoPhillips Chukchi
       Sea WRF Model Application. Arvada, Colorado: Alpine Geophysics, LLC.
Park, S. B. (2009). The University of Washington Shallow Convection and Moist Turbulence
       Schemes and Their Impact on Climate Simulations with the Community Atmosphere
       Model. J. Climate, 3449-3469.
Yver C, G. H.-S. (2012). Evaluating transport in the WRF model along the California Coast.
       Atmos. Chem. Phys. Discuss., pp. 16851-16884.
Zhang, J. (2011). Beaufort and Chukchi Seas Mesoscale Meteorology Model Study.

-------
APPENDIX A.3: TASK 3 PROTOCOL

-------
[Blank]

-------
Evaluate the AERMOD Performance Using
     Predicted or Measured Meteorology
                          Task 3 Protocol
          AMEC RFP # 12-6480110233-TC-3902
         Federal Prime Contract # EP-W-09-028
                                Prepared for:
            AMEC Environment & Infrastructure, Inc.
               502 W. Germantown Pike, Suite 850
                Plymouth Meeting, PA 19462-1308
                        Attention: Thomas Carr

                                Prepared by:
                ENVIRON International Corporation
                  773 San Marin Drive, Suite 2115
                       Novato, California, 94945
                         www.environcorp.com
                              P-415-899-0700
                              F-415-899-0707
                                 July 2, 2013
                          ENVIRON

-------

INTRODUCTION
The primary objective of the current study is to test and evaluate AERMOD on the outer
continental shelf (OCS). The current modeling procedures for sources on land use the
AERMOD modeling system. The meteorological AERMET processor included in the system is
inappropriate for OCS sources because the energy fluxes over water are not strongly driven by
diurnal heating and cooling. In addition, the meteorological observations necessary to drive the
dispersion models are commonly not available, especially in the Arctic Ocean. For applications
in the Arctic, the remote location and seasonal sea-ice pose logistical problems for the
deployment of buoys or offshore measurement platforms.
This study evaluates a combined modeling approach where the meteorological variables are
provided by a numerical weather prediction model, and then processed by a combination of a
new Mesoscale Model Interface program (MMIF) and, optionally, AERCOARE (a replacement
for AERMET suitable for overwater conditions). Given an appropriate overwater meteorological
dataset, AERMOD can then be applied for New Source Review following the same procedures
as used for sources over land.
The remainder of this document presents a protocol for Task 3 of the study. Task 3 generates a
WRF meteorological dataset for 2009-2011 suitable for dispersion modeling in the Arctic,
employs various combinations of MMIF and AERCOARE to extract modeled and observational
meteorology overwater, and uses that to drive AERMOD simulations.
The modeling period in the current protocol is 2009-2011, allowing for one year of overlap with
the BOEM/UAF 30-year WRF and observational dataset. This overlapping period would allow
for a "reanalysis vs. hind-cast" comparison. It also allows for an approximately 1.5 year overlap
with the profiler data collected at Endeavor Island (Jun 2010 to Dec 2011).  Extending the
simulation and  meteorological analysis through 2012 would provide an extra year of comparison
with the profiler and may be considered if additional funding is available.

Task 3: Evaluate the use of WRF Solutions with AERMOD
AMEC  and ENVIRON prepared a Work Plan outlining the various tasks and objectives of the
current study. As directed by EPA and AMEC, the third task protocol includes additional
information on data, options, and issues that were not fully described in the Work Plan. With an
approved protocol, ENVIRON staff will perform the following subtasks:

Task 3a:  Generate WRF simulations for calendar years 2009 through 2011.
For these simulations, ENVIRON has selected the National Center for Atmospheric Research's
(NCAR's) community-developed WRF model (dynamical core version 3.4.1). WRF is a limited-
area, non-hydrostatic, terrain-following eta-coordinate mesoscale model.
ENVIRON's 3-year WRF simulation (2009-2011) will include 5.5 day simulation blocks (starting
December 15th, 2008), with 12 hours overlapping to account for model spin-up. The spin-up time
allows for the model to develop sub-grid scale processes, including vorticity and moisture fields.
Given the high  latitude, the domains are defined on  a polar stereographic map projection. The
outermost 36 km domain encompasses all of Alaska and parts of Northern Canada and Russia;

a 12 km nest includes most of interior Alaska and the Bering, Chukchi, and Beaufort seas; and the
4 km nest focuses on the regions of the Chukchi and Beaufort seas containing active lease sites
and the entirety of Alaska's North Slope (see Figure 1). EPA guidance recommends a 50 km
buffer around CALPUFF sources and receptors, to allow for re-circulation of the puffs.
Additionally, the first five grid points on the edge of a nested WRF grid are contaminated by the
numerical down-scaling in WRF, and should not be used. Figure 2 illustrates 70 km (5 x 4km +
50km) buffers around the active lease areas with yellow dots (National  Park Service, 2010).
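As an illustration of the run schedule described above, the sketch below generates the hind-cast block start times for 2009-2011, assuming 5.5-day (132-hour) runs whose first 12 hours are discarded as spin-up so that each block advances the usable record by five days; the 00Z start on 15 December 2008 is an assumption for illustration.

    from datetime import datetime, timedelta

    BLOCK = timedelta(hours=132)   # 5.5-day WRF run
    SPINUP = timedelta(hours=12)   # discarded as model spin-up
    STEP = BLOCK - SPINUP          # usable record advances 5 days per block

    def run_blocks(first_start, last_valid):
        """Yield (initialization, first usable hour, end) for each hind-cast block."""
        start = first_start
        while start < last_valid:
            yield start, start + SPINUP, start + BLOCK
            start += STEP

    for init, usable, end in list(run_blocks(datetime(2008, 12, 15), datetime(2012, 1, 1)))[:3]:
        print(f"init {init:%Y-%m-%d %HZ}  usable {usable:%Y-%m-%d %HZ} to {end:%Y-%m-%d %HZ}")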
Figure 1. Proposed WRF 36km, 12km, and 4 km domains.
Figure 2. 12 km and 4 km WRF domains, with lease sites (magenta), the Arctic National
Wildlife Refuge (orange), Class I Areas (red), and 70 km buffers from active lease sites
(yellow dots).
The planned model vertical structure maintains the 37 vertical levels from the Task 2 WRF
modeling. Layers are stacked toward the surface to capture the coastal boundary layer and
sharp arctic wintertime temperature inversions (see Table 1). ENVIRON anticipates that the fine
vertical spacing will help winds and temperatures respond more explicitly to dynamical
influences.
       Table 1. WRF model 37 vertical levels with approximate heights AGL.
Level   eta      Pressure (mb)   Level Height (m)   Mid-layer Height (m)   Layer Thickness (m)
1       1.000    1000            0.0                --                     --
2       0.9985   999             10.8               5.4                    10.8
3       0.997    997             21.6               16.2                   10.8
4       0.995    996             36.0               28.8                   14.4
5       0.993    994             50.5               43.3                   14.5
6       0.991    992             65.0               57.7                   14.5
7       0.988    989             86.7               75.9                   21.8
8       0.985    987             108.5              97.6                   21.8
9       0.980    982             145.0              126.8                  36.4
10      0.970    973             218.3              181.6                  73.3
11      0.960    964             292.1              255.2                  73.8
12      0.950    955             366.5              329.3                  74.4
13      0.940    946             441.5              404.0                  75.0
14      0.930    937             517.1              479.3                  75.6
15      0.910    919             670.0              593.5                  152.9
16      0.890    901             825.3              747.6                  155.3
17      0.870    883             983.2              904.3                  157.9
18      0.840    856             1225.0             1104.1                 241.8
19      0.800    820             1557.1             1391.0                 332.2
20      0.760    784             1901.3             1729.2                 344.2
21      0.720    748             2258.5             2079.9                 357.2
22      0.680    712             2630.0             2444.3                 371.4
23      0.640    676             3016.9             2823.5                 387.0
24      0.600    640             3421.0             3219.0                 404.0
25      0.550    595             3952.7             3686.9                 531.8
26      0.500    550             4518.1             4235.4                 565.4
27      0.450    505             5122.3             4820.2                 604.2
28      0.400    460             5771.8             5447.1                 649.5
29      0.350    415             6475.0             6123.4                 703.2
30      0.300    370             7242.8             6858.9                 767.9
31      0.250    325             8090.5             7666.7                 847.6
32      0.200    280             9039.1             8564.8                 948.7
33      0.150    235             10120.5            9579.8                 1081.4
34      0.100    190             11385.0            10752.8                1264.5
35      0.060    154             12585.4            11985.2                1200.4
36      0.027    124             13761.3            13173.4                1175.9
37      0.000    100             14907.1            14334.2                1145.8
Note: Calculated using P0 = 1000 mb, Ptop = 100 mb, T0 = 0.00 C, and dT/dz = -6.5 C/km.

WRF must be optimized to simulate coastal arctic weather. To do this, ENVIRON's model
configuration should include the most accurate initial inputs, regionally applicable physics
choices, and effective nudging, combined with the best SST's, sea ice, and land surface models
available. ENVIRON's WRF will build upon the successful application of WRF to reanalyze 30
years (1979-2009) of arctic meteorology prepared by BOEM-UAF specifically to study surface
winds (Krieger, Zhang, Shulski, Fuhong, & Tao, 2012). UAF's method used data assimilation to
generate hourly reanalyses. By contrast, ENVIRON proposes running WRF as a hind-cast,
initializing the model from ECMWF reanalysis grids and running it for a 5.5-day period, using a
combination of 3D analysis nudging and observational nudging. ENVIRON's approach is very
similar to many other WRF model applications used to support photochemical grid modeling in
many parts of the country.
Table 2 shows ENVIRON's proposed WRF hind-cast treatments relative to the BOEM-UAF
reanalysis. The treatment of sea ice is critical to WRF modeling success. The BOEM-UAF
reanalysis employs modifications to the standard WRF package from a variant of WRF named
"Polar WRF" (Byrd Polar Research Center, 2013), which ENVIRON also proposes to use.
Although WRFv3.5 has been released, the Polar WRF modifications have not yet been made to
v3.5, and ENVIRON proposes to use Polar WRFv3.4.1 instead.
ENVIRON will also improve upon the 24 km CMC sea ice by ingesting the ~4 km gridded snow and
sea ice datasets from the National Ice Center (NIC) Ice Mapping System (IMS) available post-
2004 (National Ice Center, 2008). ENVIRON will employ the Morrison microphysics scheme,
which was designed specifically for arctic applications but has documented success at mid-
latitudes as well. ENVIRON concurs with the BOEM-UAF selection of the Rapid Radiative
Transfer Model for GCMs (RRTMG) radiation option, Monin-Obukhov (Janjic) surface layer
scheme, NOAH land surface model (with polar modifications), and the TKE-based Mellor-
Yamada-Janjic (MYJ) planetary boundary layer  scheme. Other ENVIRON sensitivity studies for
stable boundary layers in Alaska (for a confidential client) and Wyoming (Hahn, Brashers,
Emery, & McNally, 2012) indicated superior vertical profiles of temperature and moisture using
MYJ compared to YSU  and other planetary boundary layer schemes. The BOEM-UAF
reanalysis employs the relatively untested Grell-3D cumulus scheme. ENVIRON performed
sensitivity studies in the Four Corners region and found the Grell-3D scheme produced
very little convection during the summer compared to PRISM data. Thus, ENVIRON
proposes to use the Kain-Fritsch cumulus scheme on the 36 and 12 km domains, with explicit
convection (no parameterized scheme) on the 4 km domain. ENVIRON will update SSTs daily,
calculate the skin SST, and update deep soil temperatures following the usual WRF procedures.
Model inputs will use the European Centre for Medium-Range Weather Forecasts (ECMWF)
ERA-Interim dataset (ERA-I; 6-hourly analysis output at ~0.75° x 0.75° resolution).
Traditionally, ENVIRON recommends 3-D nudging toward analysis grids for wind, temperature,
and moisture for the 36 and 12 km domains. Analysis nudging within the PBL can prevent the
natural, dynamic development of boundary layer processes. Therefore, ENVIRON's strategy
involves 3-D analysis nudging above the PBL for the 36 km domain, and observational nudging
against DS-3505 data on the 4 km domain. 3-D analysis nudging of the 12 km domain using the 0.75°
ERA-I data directly would likely degrade model performance. Recent ENVIRON experience
with the WRF system's OBSGRID program in the data-sparse Four Corners region showed that
model performance was degraded when analysis nudging with OBSGRID output was used.
               Table 2. BOEM-UAF 30-year vs. ENVIRON proposed model options
Treatment               BOEM-UAF REANALYSIS                        ENVIRON 2009-2011
WRF Version             V3.2.1                                     V3.4.1
Snow/sea ice            BOEM-modified version of Polar WRF         All Polar WRF modules
                        codes for snow/sea ice processes
Boundary Conditions     ERA-Interim (0.75° grid spacing)           ERA-Interim (0.75° grid spacing)
Snow                    Canadian Meteorological Centre (CMC)       IMS 4-km NH daily snow
                        daily snow depth (24 km)
Sea Ice                 AMSR-E daily sea ice                       IMS 4-km NH daily sea ice
                        concentration/thickness (12.5 km)
Microphysics            Morrison                                   Morrison
Radiation               RRTMG shortwave and longwave               RRTMG shortwave and longwave
Surface Layer           Monin-Obukhov (Janjic) scheme              Monin-Obukhov (Janjic) scheme
Land Surface Model      Noah with Polar WRF modifications          Noah with Polar WRF modifications
PBL                     MYJ TKE                                    MYJ TKE
Cumulus                 Grell-3D                                   Kain-Fritsch (36/12 km only)
Time-varying SST        On                                         On
Calculate skin SST      On                                         On
Update Deep Soil Temp   Yes                                        Yes
Fractional Sea Ice      Yes                                        IMS 4-km dataset
Tice2tsk if2cold        True                                       True
Nudging                 Spectral                                   Spectral (u, v, theta, geopotential, and moisture)
FDDA                    Yes                                        No
Obs. nudging            No                                         Nudge toward DS-3505 data
1 DS-3505 integrated surface hourly (ISH) worldwide station data includes extensive automated QC on all data and
additional manual QC for USAF, NAVY, and NWS stations. It integrates all data from DS9956, DS3280, and DS3240.
10,000 currently active stations report wind speed and direction, wind gust, temperature, dew point, cloud data,
sea level pressure, altimeter setting, station pressure, present weather, visibility, precipitation amounts for various
time periods, snow depth, and various other elements as observed by each station (NOAA/NCDC, 2010).

Successful application of spectral nudging in the BOEM-UAF 30-year reanalysis dataset
warrants an attempt in this study. Spectral nudging is relatively new in WRF, and ENVIRON
proposes a limited nudging sensitivity test for February and August 2009 evaluating spectral
versus analysis nudging (Table 3). Spectral nudging configurations will be guided by the
parameters in Otte et al. (2012), which suggest limiting nudging above the tropopause and
adopting a ~6 hour timescale. One other limited sensitivity study will be performed comparing
analysis nudging both with and without using objectively analyzed input files during February
and August 2009. ENVIRON will also perform observational nudging (winds only) using NCDC
DS-3505 data on the 4 km domain,  using a 50 km radius of influence. ENVIRON excludes
nudging of temperature because in  coastal  areas it may weaken land-sea temperature contrasts
and adversely affect model performance; the majority of North Slope observational assets
reside along the coastline. Table 3 presents proposed relevant nudging parameters.

                       Table 3. Proposed WRF nudging coefficients
                                         Nudging Strengths (1/s)
Nudging          Domains Applied    Wind              Temperature (no PBL)   Humidity (no PBL)
Spectral         36/12/4 km         ~6 h timescale    ~6 h timescale         ~6 h timescale
3-D Analysis     36 km              3.5 x 10^-4       3.5 x 10^-4            3.5 x 10^-4
  (if required)
2-D Surface      None               --                --                     --
Observational    4 km               6.5 x 10^-4       None                   None
The 2009-2011 WRF simulation will be subjected to a model performance evaluation using the
METSTAT program to evaluate temperatures, winds, and humidity. METSTAT uses surface
meteorological observations and extracted WRF data (paired in time and space) to calculate a
series of statistical measures designed to examine WRF's ability to characterize the
observations. ENVIRON will evaluate the model against as full an observed dataset as feasible,
including DS-3505 data for 2009-2011 and the BOEM-UAF observational dataset for 2009
(extended as feasible) if obtained from UAF. Data contained in the BOEM-UAF dataset, but not
in the DS-3505 data will serve as an independent verification of model performance as this data
was not used for nudging. Additionally, the off-shore buoys analyzed in Task 1 represent
independent verification as they are not contained in the DS-3505 dataset.
Additionally, ENVIRON will employ METSTAT to evaluate the single overlapping year, 2009,
from the 30-year BOEM-UAF simulation. This will enable a direct comparison of the
meteorology for that year.
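METSTAT is an existing ENVIRON program; the sketch below only illustrates the kind of paired statistics (mean bias and gross error, with wind-direction differences wrapped to +/-180 degrees) that such an evaluation computes. Function names are illustrative.

    import numpy as np

    def bias_and_gross_error(model, obs):
        """Mean bias and gross (mean absolute) error for values paired in time and space."""
        model, obs = np.asarray(model, dtype=float), np.asarray(obs, dtype=float)
        valid = ~(np.isnan(model) | np.isnan(obs))
        diff = model[valid] - obs[valid]
        return diff.mean(), np.abs(diff).mean()

    def direction_bias_and_error(model_dir, obs_dir):
        """Wind-direction statistics with differences wrapped to [-180, 180) degrees."""
        d = (np.asarray(model_dir, dtype=float) - np.asarray(obs_dir, dtype=float) + 180.0) % 360.0 - 180.0
        return d.mean(), np.abs(d).mean()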
Task 3b: Extract meteorological datasets using WRF solutions for sites with overwater
        observations in the Arctic
ENVIRON will extract WRF solutions from five buoy locations in the Arctic (two sites in the
Chukchi Sea and three in the Beaufort Sea) as well as for the Endeavor Island profiler location
(Figure 3). As proposed in Task 1, ENVIRON will evaluate the four extractions and processing
options as follows:
 1.   MMIF will be applied to extract and prepare data sets for direct use by AERMOD. All
     variables will be as-predicted by the WRF simulations including the surface energy fluxes,
     surface roughness and planetary boundary layer (PBL) height.
 2.   As in Option 1), but the PBL height will be re-diagnosed from the wind speed and potential
     temperature profiles using the Bulk-Richardson algorithm within MMIF. Based on the
     August to October 2010 monitoring data collected by Shell in the Beaufort Sea,  PBL
     heights range from 10 m to 700 m, with a median height of 80 m. AERMOD simulations
     can be very sensitive to the PBL height (Richmond & Morris, 2012) and MMIF-processed
     PBL heights may provide significantly different predicted concentrations than the PBL
     height diagnosed by WRF.
 3.   MMIF will be applied to extract the key meteorological variables of overwater wind speed,
     wind direction, temperature, humidity, and PBL height. AERCOARE will use these
     variables to predict the surface energy fluxes, surface roughness length and other
     variables needed for the AERMOD simulations. AERCOARE has a surface layer scheme
     developed specifically to predict surface fluxes from overwater measurements. In this
     application, the WRF simulations provide model-derived alternatives for variables
     measured by a buoy, ship or offshore platform. AERCOARE can also be applied using a
     number of different options. For the current study, we propose to apply AERCOARE using
      the defaults recommended in the AERCOARE model evaluation study (Richmond & Morris,
     2012).
 4.   As in Option 3), but the PBL height will be re-diagnosed using the Bulk-Richardson
     algorithm within MMIF. AERCOARE will be applied as in Option 3.
Figure 3. Buoy sites in the Chukchi and Beaufort Seas; Endeavor Island profiler location.

Task 3c: AERCOARE using buoy and profiler data
ENVIRON will use surface input from the buoys in Figure 3 to drive AERCOARE. The buoys
provide wind speed, wind direction, temperature (at various heights), relative humidity, and sea
surface temperature.
The vertical temperature profiler at Endeavor Island will be used to extract hourly mixing heights
from April 2010 (when the profiler was installed) through 2011. These mixing heights will be
provided to AERCOARE to replace the WRF-diagnosed and AERCOARE-diagnosed mixing heights in
options 3 and 4, respectively, of Task 3b.

Task 3d: AERMOD
ENVIRON will conduct AERMOD simulations for the six OCS hypothetical sources using the
output from  the modeling-based approaches (MMIF/AERCOARE as described in Task 3b) to
drive AERMOD. These will be compared directly against AERMOD driven by the buoy and
profiler extraction in Task 3c. Simulations involving AERCOARE will be confined to the open
water time periods of 2009-2011, whereas MMIF/AERMOD runs will be performed for all
months. The analysis will calculate the predicted concentrations over relevant averaging periods
for the  criteria pollutants (e.g., 1-hour, 8-hour, 24-hour, and period) for OCS sources. As in Task
1, the sources will be modeled at the buoy and profiler location, with a polar grid of receptors
located along 360 equidistant radii at radial distances (30 m, 50 m, 75 m, 100 m, 125 m, 150 m,
175 m, 200 m, 300 m, 400 m, 500 m, 750 m, 1 km, 1.5 km, 2 km, 3 km, 4 km, 5 km, 6 km, 7 km,
8 km, 9 km, and 10 km) from the source.
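The receptor geometry described above (which AERMOD can also build internally with its polar-grid receptor keyword) is sketched below; the coordinate convention and function name are illustrative only.

    import numpy as np

    # Ring distances (m) listed in Task 3d
    RINGS = [30, 50, 75, 100, 125, 150, 175, 200, 300, 400, 500, 750,
             1000, 1500, 2000, 3000, 4000, 5000, 6000, 7000, 8000, 9000, 10000]

    def polar_receptors(x0=0.0, y0=0.0, n_radials=360):
        """(x, y) receptor coordinates on a polar grid centered on the source."""
        azimuths = np.deg2rad(np.arange(0.0, 360.0, 360.0 / n_radials))
        pts = [(x0 + r * np.sin(a), y0 + r * np.cos(a))   # 0 deg = north, increasing clockwise
               for r in RINGS for a in azimuths]
        return np.array(pts)

    print(polar_receptors().shape)   # 23 rings x 360 radials -> (8280, 2)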

Task 3e: AERMOD evaluation
ENVIRON  will evaluate the model-driven AERMOD performance in Task 3d against the
observationally-driven AERMOD runs using contour plots, scatter diagrams, Q-Q plots, and
statistics as necessary.
Task 3f: Conclusion from evaluation
ENVIRON will diagnose how various aspects of the modeling procedures influenced the
prediction of concentration (in particular maximum concentration) by assessing (1) the
influence of meteorology and model performance, (2) MMIF-recalculated PBL height vs. WRF
PBL height, and (3) MMIF with AERCOARE vs. MMIF fed directly into AERMOD.
REFERENCES
NOAA/NCDC. (2010, 09 15). Retrieved 02 15, 2013, from
       http://www.ncdc.noaa.gov/oa/climate/rcsg/datasets.html#surface
Bowden, J. H., Otte, T. L., Nolte, C. G., & Otte, M. J. (2012). Examining interior grid nudging
       techniques using two-way nesting in the WRF model for regional climate modeling. J.
       Climate, 25, 2805-2823.
Byrd Polar Research Center. (2013, 03 06). The Polar WRF. Retrieved 06 24, 2013, from Ohio
       State University: http://polarmet.osu.edu/PolarMet/pwrf.html
Environ. (2010). Evaluation of the COARE-AERMOD Alternative Modeling Approach Support for
       Simulation of Shell Exploratory Drilling Sources In the Beaufort and Chukchi Seas.
Hahn, R., Brashers, B., Emery, C., & McNally, D. (2012). Winter 2008 WRF Modeling of the Upper
       Green River Basin. Novato, CA: ENVIRON.
Krieger, J., Zhang, J., Shulski, M., Fuhong, L., & Tao, W. a. (2012). Toward Producing a
       Beaufort/Chukchi Seas Regional Reanalysis. Retrieved 06 12, 2013, from
       Beaufort/Chukchi Seas Mesoscale Meteorology Modeling Study: http://mms-
       meso.gi.alaska.edu/pub/amssl2-krieger-presentation.pdf
McNally, D.,  & Wilkinson, J. G. (2011). Model Application and Evaluation: ConocoPhillips Chukchi
       Sea WRF Model Application. Arvada, Colorado: Alpine Geophysics, LLC.
National Ice Center. (2008). IMS daily Northern Hemisphere snow and ice analysis at 4
        km and 24 km resolution. Retrieved 06 18, 2013, from National Snow and Ice Data
        Center: http://dx.doi.org/10.7265/N52R3PMC
National Park Service. (2010). Federal Land Managers' Air Quality Related Values Work Group
       (FLAG) Phase 1 Report—Revised. Retrieved 06 19, 2013, from
       http://www.nature.nps.gov/air/Pubs/pdf/flag/FLAG_2010.pdf
Otte, T. L., Otte, M. J., Bowden, J. H., & Nolte, C. G. (2012). Sensitivity of Spectral Nudging
       Toward Moisture. US Environmental Protection Agency.
Richmond, K., & Morris, R. (2012, 10). Evaluation of the Combined AERCOARE/AERMOD
       Modeling Approach for Offshore Sources. Retrieved 06 19, 2013, from EPA:
       http://www.epa.gov/ttn/scram/models/relat/aercoare/AERCOARE-Model-
       Evaluation.pdf
Zhang, J. (2011). Beaufort and Chukchi Seas Mesoscale Meteorology Model Study.

-------
APPENDIX A.4: TASK 5 DRAFT PROTOCOL

-------
[Blank]

-------
Task 5 Protocol - Draft                                               June 3, 2014

Introduction

The EPA's preferred screening dispersion model for regulatory applications is AERSCREEN
which functions as a wrapper program and interface for the AERMOD modeling system,
including BPIP and AERMAP. AERSCREEN utilizes the MAKEMET program to generate
screening  meteorology in  a format  required  by AERMOD and  subsequently executes
AERMOD in screening mode.

MAKEMET does not account for the unique boundary layer conditions common to overwater
environments such as are found over the outer continental shelf (OCS),  nor has MAKEMET
been evaluated to determine whether  it is adequate to use in marine environments. The
purpose of Task 5 in task order 0444 is to develop a mechanism for generating screening
meteorology  representative  of  overwater  conditions  that  can  either  be  input into
AERSCREEN  or  input directly into AERMOD  when AERMOD is  run in screening  mode.
Additionally, overwater scaling factors are to be  derived  to  convert 1-hour predicted
concentrations to 3-hour, 8-hour, 24-hour, and annual averaging periods.

Approach (Task 5a - Modified)

The  AERCOARE program  was recently developed as a  counterpart to  AERMET for
processing overwater meteorological data for use with AERMOD.  AERCOARE utilizes the
COARE bulk air-sea flux algorithm1 to compute boundary layer  parameters that are more
representative of the boundary layer over open water.
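For orientation, the bulk-formula idea behind the COARE algorithm is sketched below in a highly simplified, neutral-stability form. This is not the COARE algorithm itself (which iterates on stability and includes moisture, gustiness, and warm-layer/cool-skin physics); the transfer coefficients shown are only assumed, typical open-ocean values.

    import numpy as np

    RHO, CP = 1.22, 1004.0        # air density (kg/m3), specific heat (J/kg/K)
    CD, CH = 1.2e-3, 1.1e-3       # assumed neutral 10-m transfer coefficients

    def neutral_bulk_fluxes(u10, t_sea, t_air):
        """Simplified neutral-stability bulk estimates of momentum and sensible heat flux."""
        tau = RHO * CD * u10 ** 2                        # momentum flux (N/m2)
        hfx = RHO * CP * CH * u10 * (t_sea - t_air)      # sensible heat flux (W/m2)
        ustar = np.sqrt(tau / RHO)                       # friction velocity (m/s)
        return tau, hfx, ustar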

To address the need for representative overwater screening meteorology for use in offshore
dispersion  modeling, AMEC will  develop  a stand-alone overwater meteorological data
generator,  COARESCRN, which leverages the  advantages of the existing AERCOARE
program.   COARESCRN will function  as a  pre-processor to AERCOARE to generate an
hourly overwater meteorological dataset that contains the set of meteorological parameters
required by AERCOARE. AERCOARE will use the output from COARESCRN to compute
the boundary layer parameters and generate the meteorological surface (SFC) and  profile
(PFL) files that are  input to AERMOD.  The screening  meteorology files generated with
AERCOARE will be compatible with AERMOD when run in screening mode.

AERSCREEN, in  its current implementation, automatically initiates  the MAKEMET program
each time it runs and does not provide the option to use an existing meteorological data file.
For  this  reason, the  overwater  screening  data  generated  with  COARESCRN and
subsequently  processed with AERCOARE will not be compatible with AERSCREEN.
AERMOD will  have to be run in screening mode apart from AERSCREEN.  Any additional
1 A technical description of version 3.0 of the COARE algorithm can be found at
ftp://ftp.etl.noaa.gov/users/cfairall/wcrp_wgsf/computer_program/cor3_0/ and
http://www.coaps.fsu.edu/COARE/flux_algor/.

preparation of input data required by AERMOD such as building, terrain, and  receptor data
will have to be processed separately prior to running AERMOD.

The  meteorological  input  parameters required  by AERCOARE vary  based on options
specified  by the user.  To  limit the parameters generated by COARESCRN to a fixed set,
COARESCRN will assume AERCOARE will be run with the following default options:

   *   mixopt = 0    {requires mixing height in overwater data file}
   *   jwarm = 0    {warm-layer effects not included}
   *   jcool = 0      {cool-skin effects not included}
   *   jwave = 0     {surface roughness treated as a function of friction velocity}.

The overwater meteorological data file generated by COARESCRN will include  the following
fixed set of parameters on each hourly record:

   *   Year (4-digits)                    {input by user}
   *   Month (LST, 1-12)                 {derived}
   *   Day (LST, 1-31)                   {derived}
   *   Hour (LST,  1-24)                  {derived}
   *   Latitude (deg)                    {input by user}
   *   Longitude (deg)                   {input by user}
   *   Wind Speed (m/s)                  {derived}
   *   Wind Direction (degrees)           {derived}
   *   Sea Temperature (C)              {derived}
   *   Air Temperature (C)               {derived}
   *   Relative Humidity (%)              {derived}
   *   Wind Meas. Height  (m)             {input by user}
   *   Air Temp. Meas. Height (m)         {derived}
   *   Sea Temp.  Meas. Depth (m)        {input by user}
   *   Mixing Height (m)                  {derived}.

To generate a dataset that  contains the parameters above, COARESCRN will require the
following inputs from the user:

   *   Year (4-digits)
   *   Latitude (deg)
   *   Longitude (deg)
   *   Minimum Wind Speed (m/s, decimal)
   *   Minimum Air Temperature (C, whole degrees)
   *   Maximum Air Temperature (C, whole degrees)
   *   Minimum Air-Sea Temp Difference (C, whole degrees)
   *   Maximum Air Sea Temp Difference  (C, whole degrees)
   *   Air Temp. Meas. Height (m)
   *   Wind Meas. Height  (m)
   *   Sea Temp.  Meas. Depth (m)
   *   Minimum Relative Humidity
   *  Maximum Mixing Height (m)        input by user or derived based on latitude?

COARESCRN will be developed in Fortran as a command-line executable file that can be
run from within a Microsoft Windows command-prompt or a Linux scripting shell.  When
executed, COARESCRN will read an ASCII text control file that contains each of the
required user-defined input parameters listed above in  a prescribed order.

Screening Meteorology

COARESCRN will compile an hourly overwater dataset by varying  the values of the
meteorological parameters required by AERCOARE using the ranges specified by the user
in  the  COARESCRN  control file.    Values  will   be combined  to simulate  realistic
meteorological conditions.  Rather than generating a full year  of screening meteorology,
COARESCRN will generate only the number of hours needed to cover the range of values
indicated in the control file and any internal limits established for parameters that are not
input by the user.  Each combination  of values will be replicated for 36 wind directions
ranging from 10 - 360 degrees at 10 degree intervals.
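A minimal sketch of that combination logic is given below; the parameter lists are assumed to have been discretized already from the user-specified ranges, and the names and record structure are illustrative rather than COARESCRN's actual design.

    from itertools import product

    def screening_hours(wind_speeds, air_temps, air_sea_diffs, rel_hums, mixing_heights):
        """Enumerate screening combinations, each replicated over 36 wind directions."""
        wind_dirs = range(10, 361, 10)               # 10-360 degrees at 10-degree intervals
        for ws, ta, dt, rh, zi in product(wind_speeds, air_temps, air_sea_diffs,
                                          rel_hums, mixing_heights):
            for wd in wind_dirs:
                yield {"wspd": ws, "wdir": wd, "t_air": ta,
                       "t_sea": ta - dt, "rh": rh, "zi": zi}

    n = sum(1 for _ in screening_hours([2, 5, 10], [0, 10], [-2, 0, 2], [70, 90], [25, 100, 400]))
    print(n)   # 108 combinations x 36 directions = 3888 hourly records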

Hourly data will be assigned to an appropriate hour-of-day and season-of-year based on the
climate conditions implied by the latitude specified. The latitude will be categorized as either
tropical,  mid-latitude, or arctic.  Arctic data will be confined to the typical ice-free season
(July - September).  Limits for the hour-of-day and season-of-year assignments will  be
coded into COARESCRN based on an evaluation of observational buoy data obtained from
the National Data Buoy Center. Buoy data from tropical, mid-latitude, and arctic regions will
be evaluated to determine diurnal and seasonal variation of average minimum and maximum
values of air and sea temperature, air-sea  temperature difference, and relative humidity.
AMEC will consult with ENVIRON to identify reasonable  ranges of marine mixing heights by
season and time of day for different latitudes.

COARESCRN Evaluation (Task 5c - Modified)

COARESCRN  will  be  evaluated  by  assessing 1-hour concentrations  generated  with
AERMOD  using screening meteorology versus results from refined AERMOD simulations
using  measured and/or predicted (WRF) data processed with AERCOARE.  Task 3 of the
task order focuses on the evaluation of AERMOD performance using WRF and measured
meteorology prepared with AERCOARE.  The completion of task 3 will result in AERMOD
simulations that can be used to evaluate COARESCRN.  AMEC will consult with ENVIRON
and EPA  Region  10 to identify completed AERMOD simulations.   AMEC will prepare
comparable screening meteorology using COARESCRN and AERCOARE and rerun the
task  3 AERMOD simulations  using the non-default  SCREEN option  and the screening
meteorology.

The AERMOD simulations defined in  Task  3 are limited to the arctic region.  AMEC will
consult with ENVIRON and EPA Region 10 to determine if AERMOD simulations for non-
arctic applications (Gulf of Mexico or  other) exist that can be used to  further evaluate
COARESCRN.
Derivation of Scaling Factors (Task 5b - Modified)

The 1-hour concentrations generated from the screening and refined AERMOD simulations
used to evaluate COARESCRN will also be used as the basis for deriving overwater scaling
factors. Scaling factors will be derived that relate peak hourly predictions to 3-hour, 8-hour,
24-hour, and annual averaging periods.
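
One plausible form for these factors (an assumption for illustration, since the derivation details
are not specified here) is the ratio of the refined-model peak for each longer averaging period to
the peak 1-hour screening prediction, as in the short Python sketch below.  The concentrations
shown are hypothetical.

    # Sketch of one plausible way to derive averaging-period scaling factors.
    # Treating each factor as the ratio of the refined peak N-hour average to
    # the peak 1-hour screening concentration is an illustrative assumption.
    def scaling_factors(peak_1hr_screen, refined_peaks):
        """refined_peaks maps an averaging-period label to the refined AERMOD
        peak concentration for that period."""
        return {period: peak / peak_1hr_screen
                for period, peak in refined_peaks.items()}

    factors = scaling_factors(
        peak_1hr_screen=120.0,                         # hypothetical ug/m3
        refined_peaks={"3-hour": 95.0, "8-hour": 70.0,
                       "24-hour": 40.0, "annual": 6.0})
    print(factors)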

Comparison with AERSCREEN

If resources allow, the AERMOD simulations used in the COARESCRN evaluation will be
repeated using screening meteorology developed with AERSCREEN.  AMEC will compare
maximum 1-hour predicted concentrations from the  AERSCREEN-based, COARESCRN-
based,  and refined AERMOD simulations.   Scaled  3-hour, 8-hour,  24-hour,  and annual
averages from the AERSCREEN- and COARESCRN-based simulations will be compared to
the 3-hour, 8-hour, 24-hour, and annual averages computed by AERMOD during the refined
simulations.  AERSCREEN- and COARESCRN-based maximum 1-hour concentrations will
each be  scaled  using the  factors  built  into AERSCREEN  and those  derived  for
COARESCRN for additional comparison.

-------
APPENDIX B: PEER REVIEW COMMENTS FOR VOLUME 2 AND VOLUME 3

-------
[Blank]

-------
            RESPONSE TO COMMENTS
                    FOR
                DRAFT VERSION
          EVALUATION OF THE COMBINED
WRF/MMIF/AERCOARE/AERMOD OVERWATER MODELING
   APPROACH FOR OFFSHORE EMISSION SOURCES
            VOLUME 2 and VOLUME 3
                   October 2015

-------
[Blank]

-------
                           TABLE OF CONTENTS

ABBREVIATIONS AND ACRONYMS	ii
I      INTRODUCTION	 1
II     RESPONSE TO COMMENTS	2
   A.  General Comments	2
   B.  Volume 2 - Evaluation of Weather Research and Forecasting Model Simulations for Five
      Tracer Gas Studies with AERMOD	15
   C.  Volume 3 - Analysis of AERMOD Performance Using Weather Research and
      Forecasting Model Predicted Meteorology and Measured Meteorology in the Arctic	54

-------
                  LIST OF ABBREVIATIONS AND ACRONYMS
AERCOARE .......................... AERMOD-COARE
AERMET ............................ AERMIC Meteorological Preprocessor
AERMIC ............................ AMS/EPA Regulatory Model Improvement Committee
AERMOD ............................ AERMIC Model
AERSCREEN ......................... AERMOD-SCREEN
AMS ............................... American Meteorological Society

B ................................. Bias
Bo ................................ Bowen Ratio
BOEM .............................. United States Department of the Interior, Bureau of Ocean
                                    Energy Management
BPIP .............................. Building Profile Input Program
BPIPPRM ........................... BPIP for PRIME
BSEE .............................. Bureau of Safety and Environmental Enforcement

CALPUFF ........................... California Puff
CFR ............................... Code of Federal Regulations
COARE ............................. Coupled Ocean-Atmosphere Response Experiment

° ................................. Degrees
K/m ............................... Vertical Potential Temperature Gradient above the mixed layer
DOI ............................... United States Department of the Interior

E ................................. Gross Error
ECMWF ............................. European Centre for Medium-Range Weather Forecasts
EPA ............................... United States Environmental Protection Agency
ERA-I ............................. ECMWF Interim Reanalysis
Eta ............................... Greek letter η

FNMOC ............................. Fleet Numerical Meteorology and Oceanography Center

H ................................. Sensible Heat Flux

IA ................................ Interagency Agreement

km ................................ kilometers
L ................................. Monin-Obukhov Length

m ................................. meters
Mo ................................ Mean observation
Mp ................................ Mean prediction
mb ................................ millibar

-------
m/sec ............................. meters per second
METSTAT ........................... Meteorological Statistical
MMIF .............................. Mesoscale Model Interface
MYJ ............................... Mellor-Yamada-Janjic

NAAQS ............................. National Ambient Air Quality Standards
NARR .............................. North American Regional Reanalysis
NCAR .............................. National Center for Atmospheric Research
NOAA .............................. National Oceanic and Atmospheric Administration
NCDC .............................. National Climatic Data Center
NO2 ............................... Nitrogen Dioxide
NWP ............................... Numerical Weather Prediction

OCD ............................... Offshore and Coastal Dispersion
OCS ............................... Outer Continental Shelf
OLM ............................... Ozone Limiting Method

% ................................. Percent
PBL ............................... Planetary Boundary Layer
PFL ............................... Profile
PM2.5 ............................. Particulate matter less than or equal to 2.5 microns
PRIME ............................. Plume Rise Model Enhancements
PSD ............................... Prevention of Significant Deterioration
PVMRM ............................. Plume Volume Molar Ratio Method

QAPP .............................. Quality Assurance Project Plan
Q-Q ............................... Quantile-Quantile

r ................................. Albedo
Region 10 ......................... United States Environmental Protection Agency, Region 10
RMSE .............................. Root Mean Square Error

SFC ............................... Surface
SCRAM ............................. Support Center for Regulatory Atmospheric Modeling
Shell ............................. Shell Offshore Inc. or Shell Gulf of Mexico Inc.
σθ ................................ Standard Deviation of Wind Direction
σw ................................ Standard Deviation of Vertical Wind Speed
SO2 ............................... Sulfur Dioxide

T or θ ............................ Temperature
TIBL .............................. Thermal Internal Boundary Layer
TKE ............................... Turbulent Kinetic Energy

USN ............................... United States Navy
u* ................................ Surface Friction Velocity

-------
WD ................................ Wind Direction
WRF ............................... Weather Research and Forecasting
WS ................................ Wind Speed
w* ................................ Convective Velocity Scaling Factor

zic ............................... Convective Mixing Height
zim ............................... Mechanical Mixing Height
zo ................................ Surface Roughness Length

-------
I      INTRODUCTION

On 19 December 2014, the United States Environmental Protection Agency (EPA), Region 10 in
collaboration with the United States Department of the Interior (DOI), Bureau of Ocean Energy
Management (BOEM), emailed Volume 2 and Volume 3 of the Draft Evaluation of the Combined
WRF/MMIF/AERCOARE/AERMOD Overwater Modeling Approach for Offshore Emission
Sources to 18 peer reviewers. The peer reviewers consisted of individuals in government,
professional organizations, trade groups, and industry. For Volume 2 - Evaluation of Weather
Research and Forecasting Model Simulations for Five Tracer Gas Studies with AERMOD and
Volume 3 - Analysis of AERMOD Performance Using Weather Research and Forecasting
Model Predicted Meteorology and Measured Meteorology in the Arctic, there were three
respondents and two respondents, respectively. The aforementioned subtitles of Volume 2 and
Volume 3 were updated to provide a clearer understanding of their content.

The responses  are divided into three sections. Section A provides responses to 35 general
comments received on the two volumes. Section B supplies responses to 231 comments
received on Volume 2 while Section C furnishes responses to 17 comments received on
Volume 3. Section B and Section C are further subdivided into General Comments and Specific
Comments. Responses to comments totaled 283.

-------
II     RESPONSE TO COMMENTS

A.    General Comments

Comment A.1:  The two reports need clear and consistent titles that describe what is in each
report. The subheading about "Section 3.2.2e ...." means little to me. Say it in words.

Response:  The collaboration study report consists of three volumes. Volume 1 describes all
six tasks completed by Region 10 and BOEM under an Interagency Agreement.1 However, only
a summary of the work completed under Task 2 and Task 3 appears in Volume 1. Volume 2 and
Volume 3 provide detailed descriptions of the work in Task 2 and Task 3, respectively.

The six tasks are:

Task 1. Evaluation of two Outer Continental Shelf Weather Research and Forecasting Model
Simulations

Task 2. Evaluation of Weather Research and Forecasting Model Simulations for Five Tracer
Gas Studies with AERMOD

Task 3. Analysis of AERMOD Performance Using Weather Research and Forecasting Model
Predicted Meteorology and Measured Meteorology in the Arctic

Task 4. Comparison of Predicted and Measured Mixing Heights

Task 5. Development of AERSCREEN for Arctic Outer Continental Shelf Application

Task 6. Collaboration Study Seminar

The complete titles of the three volumes are:

Combined WRF/MMIF/AERCOARE/AERMOD Overwater Modeling Approach for Offshore
Emission  Sources, Volume 1 - Project Report

Combined WRF/MMIF/AERCOARE/AERMOD Overwater Modeling Approach for Offshore
Emission  Sources, Volume 2 - Evaluation of Weather Research and Forecasting Model
Simulations for Five Tracer Gas Studies with AERMOD

Combined WRF/MMIF/AERCOARE/AERMOD Overwater Modeling Approach for Offshore
Emission  Sources, Volume 3 - Analysis of AERMOD Performance Using Weather Research
and Forecasting Model Predicted Meteorology and Measured Meteorology in the Arctic

Comment A.2:  Front Cover: Why is this here and what does it mean (i.e., "A Section 3.2.2.e
Demonstration under Appendix W of 40 CFR 51")?
1  EPA Interagency Agreement/Amendment, RW-14-95830001 - 0, 20 August 2014.


-------
Response: The placeholder subtitle has been updated. See Response to Comment A.1.

Comment A.3:  Front Cover: Add a subheading explaining what is in this Task 2 report.  Use
parallel structure so that the Task 3 report has the same "look" but different subheadings.

Response: The placeholder subtitle has been updated. See Response to Comment A.1.

Comment A.4:  Inside Cover: Add a subheading explaining what is in this Task 2 report.  Use
parallel structure so that the Task 3 report has the same "look" but different subheadings.

Response: The placeholder subtitle has been updated. See Response to Comment A.1.

Comment A.5:  Maybe it was not in the scope of work, but it seems to me that since OCD is the
guideline model, it should have been  included in the Task 2 evaluation.  There are probably
available OCD evaluation results for many of the databases that could simply be referenced and
added to the comparison.

Response: There is a plan by Region 10 and BOEM  to compare the American Meteorological
Society/U.S. Environmental Protection Agency Regulatory Model Improvement Committee
(AERMIC) Model (AERMOD)2 with the Offshore and Coastal Dispersion (OCD) model3
predictions using the same four North American Weather Research and Forecasting (WRF)
model4 simulations generated in Task 2.  Essentially, the Mesoscale Model Interface (MMIF)
program5 will be modified to output the WRF hourly meteorology to run OCD during the period
of the Cameron, Carpinteria, Pismo Beach and Ventura tracer gas experiments. External files
containing the MMIF output meteorological data will then be read by OCD to calculate
concentration impacts. Statistical summaries will be generated for OCD and compared with the
Task 2 AERMOD statistical summaries.  The comparison will not include the Oresund tracer gas
experiment.

 Comment A.6:  In general, I still disagree with the general statement that OCD is based upon
outmoded science and is not suitable for current modeling tasks.  OCD is based upon similarity
theory, as is AERMOD.  Furthermore, OCD is specifically based upon overwater science, has  a
TIBL algorithm,  and incorporates knowledge of shoreline geometry.  In addition, the downwash
algorithm is specifically designed for offshore rigs, while PRIME is not. So, I think that
AERMOD is not clearly superior to OCD  in many respects.
2 USEPA. 2004. AERMOD: Description of Model Formulation. Report No. EPA-454/R-03-004. Research Triangle
Park, NC. September.
3 DiCristofaro, D.C. and S.R. Hanna. 1989. OCD: The Offshore and Coastal Dispersion Model, Version 4, Volume 1:
User's Guide and Volume II: Appendices. NTIS Nos PB 93-144384 and PB-93-144392. Sigma Research Corporation,
Westford, MA. 1989.
4 NCAR (National Center for Atmospheric Research). 2014. Weather Research & Forecasting (WRF) Advanced
Research WRF (ARW), Version 3 Modeling System User's Guide. NCAR Mesoscale & Meteorology Division,
Boulder, CO.
5 Brashers, B. and C. Emery. 2014. The Mesoscale Model Interface (MMIF) Program, Draft User's Manual.
ENVIRON Int. Corp., Air Sciences Group, Novato, CA.

-------
Response:  OCD is the EPA recommended air quality dispersion model to predict emission
impacts from a new or existing over water stationary source.6 The model contains two specific
features unique to overwater emission sources for the purpose of determining compliance with
National Ambient Air Quality Standards (NAAQS). They are the platform downwash algorithm7
and the Thermal Internal Boundary Layer (TIBL) algorithm, which are not found in AERMOD. Hence,
OCD is not an outdated model.  See Response A.7 for additional explanation.

Comment A.7:  Introduction statement, "However, OCD is an older model that utilizes
simplistic, outdated physical parameterization schemes and lacks the features required for
robust modern environmental assessment."  The overly negative statements about OCD should
be toned down.  OCD has quite reasonable scientific assumptions which are backed up by
references and observations.

Response:  During the Region  10 review of the Shell Offshore Inc.  (Shell) major and minor
source air permit applications to conduct exploratory drilling in the Arctic, it was determined that
OCD lacked the  "...features required for robust modern environmental assessment" to
adequately predict emission impacts from a new or existing overwater stationary source.8
These features, options and capabilities are contained in AERMOD and include the Plume Rise
Model  Enhancements (PRIME)  downwash algorithm,9 Plume Volume Molar Ratio Method
(PVMRM),10 and Ozone Limiting Method (OLM),11 and the capability to calculate receptor
averaged percentiles associated with sulfur dioxide (SO2), nitrogen dioxide (NO2), and
particulate matter less than or equal to 2.5 microns (PM2.5) concentrations for a compliance
demonstration with NAAQS. However, AERMOD was missing a meteorological data
preprocessor program for overwater application, a platform downwash algorithm and a TIBL
algorithm.

       a.      The over water meteorological data preprocessor deficiency was addressed when,
               on 01 April 2011,12 Region 10 approved, and on 6 May 2011,13 the EPA Modeling
               Clearinghouse concurred with, the use of the Coupled Ocean-
6  Code of Federal Regulators (CFR). Protection of the Environment. Appendix W to Part 51-Guideline on Air
Quality Models.
7  Petersen, R.L., 1986.  Wind Tunnel Investigation of the Effect of Platform-Type Structures on Dispersion of
Effluent from Short Stacks. Journal of the Air Pollution Control Association, 36, 1347-1352.
8  Winges, K. 2010. Shell Beaufort Sea Permit Application Meeting Notes. ENVIRON Project No.: 03220903A. July
15.
9  Schulman, L.L., D.G. Strimaitis and J.S. Scire, 2002. Development and Evaluation of the PRIME plume rise and
building downwash model. Journal of the Air & Waste Management Association, 50:278-390.
10  Hanrahan, P. L, 1999. The Plume Volume Molar Ratio Method for Determining NO2/NOx Ratios in Modeling-
Part 1: Methodology. Journal of Air & Waste Management Association, 49: 1324-1331
11 Cole, H.S. and J.E. Summerhays.  1979. A Review of Techniques Available for Estimation of Short-Term NO2
Concentrations. Journal of the Air Pollution Control Association, 29(8): 812-817
12  EPA. 2011.  Memorandum: COARE Bulk Flux Algorithm to Generate Hourly Meteorological Data for Use with
the AERMOD Dispersion Program; Section 3.2.2.e Alternative Refined Model Demonstration. From Herman Wong
EPA Regional Office Modeling Contact to Tyler Fox, Lead Air Quality Modeling Group, OAQPS. April 1.
13  EPA.  2011. Memorandum: Model Clearinghouse Review of AERMOD-COARE as an Alternative Model for
Application in an Arctic Marine Ice Free Environment. From George Bridgers, EPA Model Clearinghouse Director to
Herman Wong, EPA Regional Atmospheric Scientist, Office of Environmental Assessment, OEA-095, EPA Region 10.
May 6.

-------
              Atmosphere Response Experiment (COARE) algorithm.14 The over land
             AERMIC meteorological preprocessor program is called AERMET15 and is not
             designed to process meteorological conditions over ocean waters and more
              extreme climates. This is because over land energy fluxes are strongly driven by
             the diurnal cycle of heating and cooling while over water fluxes are more
             dependent on air-sea temperature differences that are only slightly affected by
             diurnal heating and cooling.

             On 16 July 2011, Region 10 publicly noticed the issuance of revised draft Outer
             Continental Shelf (OCS)/Prevention of Significant Deterioration (PSD) permits for
             Shell to conduct exploratory drilling in the Beaufort Sea and in the Chukchi Sea.
             Contained in the notices was a request for comments on the use of the
             alternative COARE algorithm with AERMOD since the approval and concurrence
             was made under Section 3.2.2.e of Appendix W, 40 CFR 51. Under this section,
              Region 10 addressed five elements, which were:

             1.     The model received scientific peer review,
             2.     The model can be demonstrated to be applicable to the problem on a
                    theoretical basis,
              3.     Data bases which are necessary to perform the analysis are available
                    and adequate,
             4.     Appropriate performance evaluations of the model have shown that the
                    model is  not biased toward underestimates, and
             5.     A protocol on the methods and procedures to be followed has been
                    established.

             Several public comments were received and responded to by Region 10.16

        b.     In a project funded by Region 10 in 2012, COARE was coded into the AERMOD-
             COARE (AERCOARE) model.17 AERCOARE has been available since 21 May
             2013 from the EPA Support Center for Regulatory Atmospheric Modeling
             (SCRAM) website for public review and use on a case-by-case basis.

       c.     Region  10 also funded a study in 2012 to add the platform downwash algorithm
             in OCD into AERMOD. This effort was not successful because the structure and
             routines of the two programs were different. Furthermore, Region 10 was  unable
             to fund in  2013 a wind tunnel study for a variety of platforms (e.g., monopod,
             three-legged, four-legged and jackup rig). The objective of the study was  to
14 Fairall, C.W. and E.F. Bradley. 2003. The TOGA-COARE Bulk Air-Sea Flux Algorithm. NOAA/ERL/ETL, 325
Broadway, Boulder, CO. September 2.
15 EPA, 2004. User's Guide for the AERMOD Meteorological Preprocessor (AERMET), Publication NO. EPA-454/B-
03-002.  U.S. EPA, Research Triangle Park, NC. November.
16 EPA. 2011. Supplemental Response to Comments for Outer Continental Shelf Prevention of Significant
Deterioration Permits, Noble Discoverer Drill Ship.  Permit No. R10OCS.PSD-AK-2010-01 and Permit No.
R10OCS/PSD-AK-09-01.  Region 10, Seattle, WA
17 ENVIRON. 2013. Evaluation of the Combined AERCOARE/AERMOD Modeling Approach for Offshore Sources.
EPA 910-R-12-007. Region 10, Seattle, WA. 98101. October

-------
             revise the PRIME downwash method in AERMOD using wind tunnel
             experiments to account for flow and dispersion resulting from the open space
             underneath the platform. PRIME currently assumes that the building obstacle is
             based at the ground surface. This revision will allow the use of the EPA Building
              Profile Input Program for PRIME (BPIPPRM),18 which outputs wind
             direction-specific building dimensions for each complex structure on a platform
             or for the platform as a whole. When their dimensions are input into AERMOD,
             refined concentration impacts can be estimated from a stationary source located
             in a marine environment. Complex structures can consist of oddly shaped multi-
              tier buildings, heliports, combustion engines, porous pipe racks, cranes, and
              cantilevers. OCD treats all the complex structures on a platform as one solid
             representative structure.

       d.    There are two TIBL algorithms available to be coded into AERMOD. They are
              the OCD TIBL algorithm and the Venkatram TIBL algorithm.19

In summary, the goal of Region 10 and BOEM  is to replace OCD with AERMOD using either
measured or predicted over water meteorological  data.

Comment A.8:  Section 2.1: The Ventura study also involves simple flat terrain, but the
receptors are located 500 meters (m) to 1  kilometer (km) inland from  the shoreline.

Response: This comment is a reference to a sentence contained within the Volume II report.
The comment does not specifically state a concern.

Comment A.9:  Section 2.1.2: Unlike previous model evaluations of OCD and CALPUFF, the
ENVIRON Int. Corp. (2010) study found AERMOD very sensitive to the observed PBL height
and it was recommended that a mechanical PBL estimate be used instead of the observations.

Response: This sentence was removed because it does not apply to this section.

Comment A.10: Section 2.3: METSTAT calculates a suite of model performance statistics
using wind speed and direction, temperature, and moisture observations.

Response: This sentence was corrected in the report to improve the clarity of the statement.

Comment A.11: As I mentioned on the phone, it would appear that the air-sea temperature
difference is a critical parameter that have a better chance for more accurate parameterization
with modern data.  But, the evaluation databases are old, so the availability of accurate satellite
data or other means to parameterize the stability are not available. So, the utility and relevance
of this entire evaluation exercise is questionable.  In lieu of creating another field study, I would
look for an opportunistic emission source on a  peninsula or in the water with monitoring data on
shore.
18 Schulman, L.L. 1997. Addendum to ISC3 User's Guide: The PRIME Plume Rise and Building Downwash Model.
Submitted by Electric Power Research Institute. Prepared by Earth Tech, Inc., Concord, MA. November.
19 Venkatram, A. 1977. A Model of Internal Boundary Layer Development.  Boundary-Layer Meteor., 11, 419-437.

-------
Response: The possibility of deficiencies in the measurement systems and parameters
included in the evaluation database was noted in the Volume II report. It was noted that mixing
height measurements in particular may be highly inaccurate. We agree that this study could be
improved using more modern over-water dispersion studies (i.e., tracer gas experiments).
However, we are unaware of any modern over-water tracer release studies that would be fit for
model evaluation. The Ventura, Pismo Beach, and Cameron tracer studies used in this study
were also  used to evaluate the OCD model (refer to Hanna et al., 1985).20 More recently,
Ventura, Pismo Beach, Cameron, Carpinteria and Oresund tracer release studies were used to
evaluate offshore enhancements of CALPUFF (Earth Tech, Inc., 2006).21  It is appropriate to use
these tracer studies for this work given the community history and familiarity with the studies
and from a perceived lack of more modern large-scale offshore tracer dispersion experiments.

Comment A.12:  Alternatively, just do a meteorological evaluation for a modern database just
to see if the simulation can accurately predict the stability parameters.

Response: This comment is outside the project work scope.

Comment A.13:  Figures 14-32 - consider putting these into an appendix! This is the first of a
few sections with an overwhelming number of figures.

Response: There  are four sites, several years of meteorology at each station, and seven
meteorological parameters plotted. Each of these parameters is reviewed individually to
assess WRF performance. The figures are discussed in each section as appropriate.  Inclusion of
this set of  plots in the report is warranted.

Comment A.14:  Section 3.2: The meteorological measurements from each tracer study were
used to create input meteorology for AERCOARE.  The AERMOD SFC and PFL files were
built using AERCOARE with defaults recommended in Richmond and Morris (2012). The PBL
height was calculated using option (1) in AERCOARE that calculates a mechanical PBL height
(zim) using the Venkatram algorithm (Venkatram, 1980).

Response: Sentence structure edited in report revision.

Comment A.15:  Figures 33-74: Rescue me from this onslaught!  I couldn't get through this
section, so I just skipped it.  Is there a way to put some of these figures into an appendix?

Response: Please refer to the response to Comment A.13.

Comment A.16:  Time series plots - they can get too complicated if too many different model
combinations  are attempted (e.g., Figure 88).  I recommend more than one per study such that
the maximum number of models on each one is limited to about 4 or 5 at the most.  The Q-Q
20 Hanna, S.R., L.L. Schulman, R.J. Paine, J.E. Pleim, M. Baer. 1985. Development and Evaluation of the Offshore and
Coastal Dispersion Model. J. of the Air Pollution Control Association, 35:1039-1047.
21 Earth Tech, Inc. 2006. Development of the Next Generation Air Quality Models for Outer Continental Shelf (OCS)
Applications - Final Report: Volume I. Prepared for the U.S. Dept. of the Interior, Minerals Management Service.

-------
plots are easier to follow because they have a limited number of different models included.
Unfortunately, this will increase the number of figures in a section with a whole lot of them
already.

Response: Figure formats were retained. The purpose of the plot is often to compare the
magnitude of results between sets of comparable runs and identify outliers. Plotting on separate
plots would make this more difficult. Some of the figures were enlarged to ease viewing.

Comment A.17:  As noted in the conclusions, "However, the meteorological analysis suggests
small errors in SST and air temperature can result in misdiagnosed stability conditions that can
have profound effects on the AERMOD results."  So, a meteorological evaluation with new
data is the next obvious step to see if using MMIF  makes any sense.  This study is
inconclusive, unfortunately,  but if you need to use  parameterized data, you can't go back in time
and hope to get anything useful.

Response: The opinion of the reviewer has been noted. It is beyond the scope of this work to
evaluate WRF-MMIF-AERMOD performance in comparison to additional databases (refer to
response to comment A.11).

Comment A.18:  WRF was run down to 1.33 km for each evaluation database.   How long does
such a run take, especially if it is necessary to model for 5 years?

Response: The time required to complete a year-long WRF simulation is dependent on grid
size and computational resources available. A typical year-long WRF simulation on ENVIRON's
high-performance computing cluster with an inner  grid resolution of 4 km requires at least a
week of computing time. A five year run would likely require more than a month of processing
time.

Comment A.19:  In general, AERMOD is highly sensitive to the PBL height, and we really don't
know what that height is, so that is a limitation for the Task 3 study, which is basically a model
vs. model study.  From the Task 2 evaluation, you might be able to determine if there is a best
performing combination of stability and PBL height (the most important prediction parameters).
So, the task 3 "gold standard" has uncertainty because of the difficulty in "observing" the PBL
height.

Response: The purpose of the Task 3 study was to determine if maximum concentrations
predicted by AERMOD driven by WRF model output differed substantially from AERMOD driven
by available offshore meteorological observations. The Endeavor Island profiler provides
estimates of PBL height in the vicinity of the Reindeer Island and Beaufort-Sivulliq (B2 and B3)
sites. Therefore, more weight should be placed on the model comparisons at these locations. It
was recognized that the mixing height estimates at the Chukchi Sea  buoy sites (C1 and C2)
were dubious. A revision of the Task 3 study has involved elimination of the use of the C1 and
C2 PBL height estimates. Instead, PBL heights from the WRF-MMIF recalculations have been
applied to the C1  and C2 observation meteorology datasets. Therefore,  PBL height has been
eliminated as a differing variable for the C1 and C2 comparisons. Study conclusions and
recommendations must continue to place more weight on the B2 and B3 comparisons since
AERMOD results are sensitive to PBL height differences.


-------
It can be noted that PBL height estimates typically used in AERMOD for New Source Review air
quality analyses are based on relatively simplistic parameterization schemes. PBL height
measurements are almost never available. PBL height is estimated in AERMET using a set of
parameterizations reliant on local surface data and nearest radiosonde data. Although morning
radiosonde data is used to initialize the morning convective mixing height, the radiosonde site is
often quite distant from or relatively unrepresentative of the site in question. Mechanical mixing
height is estimated using the Venkatram (1980)22 model:
                                     zim = 2400 u*^1.2
This model itself was formulated using a set of assumptions appropriate for mid-latitudes. The
constant value of 2400 was determined using linear regression against a single experiment
conducted in Minnesota, September 1973.23

Comment A.20: However, the real question is: which modeling approach is more accurate in
the near-field?  This is just a model vs. model comparison, so we really don't know if the obs-
based predictions have good skill.   However, if the basic point is to see if the WRF/MMIF
approach can provide results consistent with that of the observed met data, there is useful
information. Where the modeling approaches differ, we don't know whether the observed met
would perform against real concentrations better than the "synthetic" meteorology.

Response:  The point of the study was to evaluate whether the WRF/MMIF approach can
provide  results consistent with that of the observed meteorological data. Extensive evaluation of
the accuracy of AERMOD has been conducted by the EPA.24

Comment A.21: Jargon is used throughout the report and should be replaced by standard
terminology.

Response:   The report has been edited to remove some jargon when appropriate.

Comment A.22: This is not the first time or the first agency where this type of work has been
done. Nearly all other agencies in the US and elsewhere have developed linked NWP and
transport and dispersion models, have accounted for overwater boundary layers,  and have
evaluated NWP model predictions and dispersion model predictions. The EPA has several
related studies underway besides this one. Yet the report has very few references or
discussions about this large body of related work.  In particular, when any new method is
proposed, there should be an overview of related studies and there should be rationale provided
for why you decided to do something new rather than just adapt an existing system.
22 Venkatram, A. 1980. Estimating the Monin-Obukhov Length in the Stable Boundary Layer for Dispersion
Calculations. Boundary-Layer Meteorology. 19: 481-485.
23 Caughey, S.J., J.C. Wyngaard, J.C. Kaimal. 1979: Turbulence in the Evolving Stable Boundary Layer. Journal of the
Atmospheric Sciences. 36: 1041-1052.
24 U.S. Environmental Protection Agency. 2003. AERMOD: Latest Features and Evaluation Results. Pub. EPA-454/R-
03-003. U.S. EPA, Research Triangle Park, NC.

-------
Response: The purpose of the report is to provide a summary of the modeling results and an
analysis of those results. A full review of the larger body of related work and discussion on how
the current study relates to this work was beyond the scope of the Task 2 and Task 3 portions of
this study. A full literature review and discussion concerning how the current study relates to the
historical body of study would be appropriate but beyond this work scope.

Comment A.23: It looks like the evaluation software automatically produced over 100 plots and
tables. However, the text that attempts to explain each plot is brief and vague. Many
hypotheses are  brought up that do not seem to me to be valid. More thought needs to be given
to the many explanations.  Use standard terminology and not jargon.  Give quantitative results
(e.g., "a difference of 1.4 m/s") rather than vague subjective statements (e.g., "performs better",
"is much larger").

Response: The report  has been edited to address these concerns.

Comment A.24: In this report, many optional inputs and assumptions to meet models and
other models are tested. So the reader has to look at multiple sets of performance measures
and other outputs.  There are too many options. The authors should reduce the number of
analysed options to a few that can be discussed in detail, with scientific rationale and
quantitative results given.

Response: Certainly the Task 2 evaluations and Task 3 analyses presented in Volume 2 and
Volume 3, respectively, could have been reduced to focus on and explore specific options,
assumptions and outliers that influenced the results.  In this case, however, it is more
convenient for the reader to have all of the assumptions, options, and outliers presented in
the body of the report for review.

Comment A.25: There is a lack of reference by the scientists involved in this development to a
previous MMS-sponsored study where COARE was adapted for use in coastal environments
and to provide improved inputs to overwater dispersion models:

Hanna SR, MacDonald CP, Lilly M, Knoderer C and Huang CH. Analysis of three years of
boundary layer observations over the Gulf of Mexico and its shores. Estuarine, Coastal and
Shelf Science 2006; 70: 541-550.

The abstract includes the following sentences about COARE:

Estimates of surface momentum, sensible heat, and latent heat fluxes have been made from the
surface observations using the COARE software. Simulations by the National Weather Service's
Eta meteorological model are compared with the observations of surface fluxes and wind
profiles. The boundary layer is found to be unstable over 90% of the time, and  latent heat fluxes
are about five to ten times larger than sensible heat fluxes, as usually found over tropical
oceans. Eta model simulations of surface fluxes are within about 50% of COARE estimates of
the fluxes based on surface observations. Most of the time, COARE derived fluxes at 11 sites
are within a factor  of two of each other at any given hour. In multi-day case studies, COARE
calculations are found to agree with Eta model simulations of these fluxes and  parameters
within a factor of two most of the time.


-------
The paper was published in 2006 but the research was underway for five years previously,
including many communications with Fairall about his COARE algorithm. Ron Lai was one of
the MMS managers on our study, and I see that he is referred to in the current report. At the
least, the 2006 paper should be referred to in this report and possibly the new report's findings
compared with our 2006 findings.

Response: The Task 2 report did not contain a detailed review of the scientific literature
concerning over-water dispersion modeling. The purpose of the report was to provide a
summary of the modeling methodology and results. A detailed discussion of the work's
relationship to the broader body of related scientific work was beyond the scope of the study.
Such a discussion would be appropriate for a scientific journal article.

Comment A.26: Many figures and sections are devoted to evaluations of the WRF predictions
of boundary layer variables.  This is a very active topic of research throughout the world and
others' work should be discussed. For example, there should be reference to and discussion of
the seminal review paper on this topic:

Seaman, N.L., 2000. Meteorological modeling for air quality assessments. Atmospheric
Environment 34, 2231-2259.

I have also written a chapter in an AMS monograph (edited by Roger Pielke) on this topic, plus
the two specific papers below:

Hanna SR, Yang R., 2001. Evaluations of mesoscale model predictions of near-surface winds,
temperature gradients, and mixing depths. J Appl Meteorol 40,  1095-1104.

Hanna SR, Reen B, Hendrick E, Santos L, Stauffer D, Deng AJ, McQueen J, Tsidulko M, Janjic
Z, Jovic D, Sykes Rl, 2010. Comparison of observed, MM5 and  WRF-NMM model-simulated,
and HPAC-assumed boundary-layer meteorological variables for three days during the IHOP
field experiment. Bound-Layer Meteorol 134 (2), 285-306.

The current results should be compared with the methods and quantitative results in these and
other references.

Response: Refer to the response for Comment A.22

Comment A.27: What exactly is being compared (evaluated)?  I am often not quite sure. One
problem is that the actual observations are sometimes massaged (by interpolation or combining
with model predictions) to produce pseudo "observations". Also, the averaging time needs to be
clearly stated (e.g., one hour?).  Also, the locations of the "observation" and "prediction" need to
be definitively identified.  Where is the wind speed measured?  Show it on a map and state it in
the caption. Concerning the "model prediction", often this is made by an NWP model, and thus
represents a grid square or maybe a specific point. In a coastal area, the measurement domain
can include grid squares over water, over land, and partly over both, and the method  needs to
be clearly explained for producing the model output that is being evaluated.

-------
Response:  The only observations "massaged" are several air-sea temperature differences that
were modified to match the stability indicated by the vertical temperature gradient. The changes
are described in the report and are similar to changes made in previous work, including Hanna
et al. (1980),25 using these datasets to account for non-local influences on stability. A purpose of
this study is to evaluate if WRF meteorology, using conditions averaged over a WRF grid cell,
are sufficient for AERMOD modeling when compared to results provided by observed
meteorology.

Comment A.28: Mixing height or PEL height is something that can't easily be "observed". It is
especially uncertain in coastal areas where it may vary significantly over short distances. The
current report appears to have recalculated the "observed" mixing height based on applying
some parameterizations to limited observations, and has ended up with many low mixing
heights (< 100 m and sometimes only a few tens of meters).  The report then concludes that the
predicted concentrations vary strongly with this mixing height. This has been recognized for a
long time and is obviously caused by the fact that the vertical spread of a near-surface release
of pollutants is constrained by the mixing height. Thus C is inversely proportional to mixing
height. Note that this result in the current report may be due to a faulty parameterization of
mixing height. To reduce the chance of this happening, many current models do not allow firm
mixing heights less than about 100 m.  After all, during stable conditions, the vertical turbulent
dispersion is already much reduced.

Response:  The observed mixing heights  in the Task 2 report were obtained from the study
archives  for each study.  Mixing height was not recalculated for any of the periods or sites
modeled.

In the Task 3 study, mixing height was  derived for the two Beaufort meteorological buoy sites
using the Endeavor Island vertical temperature profiler data. For the two Chukchi meteorological
buoy sites, an empirical algorithm was designed to estimate mixing height because no mixing
height measurements were available for the Chukchi Sea sites. The deficiencies of such an
approach were noted during project review and as a response the study has been revised
removing the mixing height estimates formulated using the algorithm. For the study revision, the
mixing heights derived by WRF-MMIF were assigned to the 'observed' dataset, effectively
removing mixing height as  a variable of study.

A minimum mixing height of 25 m was implemented as a constraint for the WRF-MMIF output
for both the Task 2 and Task 3 studies.

Comment A.29: Related to previous comments, there is a need to continually clearly state
what is meant by "extracted WRF meteorology"  (jargon). Also, the term "METS[T]AT" seems to
be used as a noun and adjective in unexpected  places.  What exactly is meant by "METSTAT"?
Please expand the terminology so that it is more understandable to the anticipated readers.
25 Hanna, S.R., L.L. Schulman, R.J. Paine, J.E. Pleim. 1984. Users Guide to the Offshore and Coastal Dispersion (OCD)
Model. Environmental Research & Technology, Inc., Concord, MA, prepared for the Minerals Management Service
of the U.S. Dept. of the Interior under Contract No. 14-08-0001-21138.


-------
Response: The METSTAT software is introduced in Section 2.3 of the Task 2 report and
Section 3.0 of the Task 3 report. Both reports indicate that METSTAT is publicly available
software that is commonly used to assess WRF modeling performance. Use of the term
"extracted WRF meteorology" has been edited in the revised Task 3 report. The term refers to
AERMOD meteorological input files built by the MMIF program using meteorology extracted
from the WRF output files at the nearest grid point to the buoy or source.
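
For readers unfamiliar with the extraction step, the short Python sketch below shows the general
idea of selecting the WRF grid cell nearest a buoy location.  It is a simplified stand-in and not
the MMIF code; the coordinate arrays, the buoy location, and the distance measure are
illustrative assumptions.

    # Simplified illustration of selecting the WRF grid cell nearest a buoy.
    # This is not the MMIF code; names and the distance measure are simplified.
    import numpy as np

    def nearest_grid_indices(lat2d, lon2d, buoy_lat, buoy_lon):
        """Return (j, i) of the grid cell closest to the buoy location."""
        # Longitude differences are scaled by cos(latitude); adequate for a
        # small domain, ignoring full great-circle geometry.
        dlat = lat2d - buoy_lat
        dlon = (lon2d - buoy_lon) * np.cos(np.radians(buoy_lat))
        return np.unravel_index(np.argmin(dlat ** 2 + dlon ** 2), lat2d.shape)

    # Hypothetical 2-D coordinate arrays standing in for those in a wrfout file
    lat2d = np.linspace(70.0, 72.0, 50).reshape(50, 1) * np.ones((1, 60))
    lon2d = np.ones((50, 1)) * np.linspace(-150.0, -145.0, 60).reshape(1, 60)
    j, i = nearest_grid_indices(lat2d, lon2d, buoy_lat=71.2, buoy_lon=-147.5)
    print("nearest WRF cell indices:", j, i)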

Comment A.30: In several places there are discussions of the relation between surface-air
temperature difference, stability, and sensible heat fluxes. These are in general hard to
understand, and the cause and effect rationale is fuzzy. It is a "which came first- the chicken or
egg" dilemma.   My opinion is that actually all are inter-related, and it is important for you to
clearly identify what are the known (observed) inputs.

Response: Edits have been  made to clarify the discussion.

Comment A.31: Figures - Zoom in so you show only the area  with data, so the points can be
distinguished. Make the "observed" point more visible.

Response: The Task 2 report plots were  edited to improve the visibility of data points.

Comment A.32: The report is written for a technical person to read and understand and [not]
for the general public.

Response: The opinion  of the reviewer is noted.

Comment A.33: Refrain from using colons and semicolons.

Response: Edits have been  made to address this concern.

Comment A.34: Please justify the use of the "complex" benchmarks in METSTAT to judge
WRF performance at overwater locations.

Response: The complex benchmarks represent a set of criteria that are less stringent than the
simple-terrain benchmarks. Microclimate heterogeneity in regions of complex terrain increases
the likelihood that winds and temperature at any WRF grid point may vary from those at the
nearest measurement location. For example, if a WRF grid point happens to fall on the top of a
ridge and the nearest measurement is within an adjacent valley, the meteorology is more likely
to vary substantially.

It was stated in Section 3.2 of the Task 3 report that it could be assumed that the complex
criteria provided a suitable set of performance goals for the Arctic given the complexity of
meteorology in the region. No statement was made regarding the acceptable set of criteria for
onshore or offshore datasets in the study,  despite separate analyses of METSTAT results for
onshore and offshore datasets.

The report will be amended to state the simple criteria are likely more suitable for the analysis of
WRF performance offshore. Microclimate over the open Arctic Ocean will be homogeneous


-------
under most conditions over the distances corresponding to WRF grid spacing (4 km). This
assumption is a subjective judgment of the authors since we are unaware of any prior studies of
suitable METSTAT criteria for offshore locations specifically.

The report will also be edited to state the complex criteria may be more appropriate for the
evaluation of WRF performance over the onshore regions of the domain. This assumption is
based partially on the fact that most meteorological monitoring stations in Alaska are located on
the coast near population centers. Along the coast, meteorological conditions may vary
significantly over short distances given the differences in air-sea versus air-land interactions in
the PBL. There are often strong gradients of temperature, humidity, and cloud across the land-
sea PBL interface. This assumption is a subjective judgment of the authors. We are unaware of
any studies of suitable METSTAT criteria specifically for coastal locations.

Additional discussion was added to address this in the Task 2 report in Section 2.3.1.

Comment A.35: Some comments identified in Section B also are applicable to Section C.
They include the crowded symbols in graphics, use of jargons, semi-colons,  hyphens... etc.

Response:  Reports have been edited to address these concerns.

-------
B.     Volume 2 - Evaluation of Weather Research and Forecasting Model
       Simulations for Five Tracer Gas Studies with AERMOD

General Comments

Comment B.1:  Regarding your Task 4 discussion, I would like to point out that there is no
instrument that can directly observe the mixing height. Temperature profiles can be useful at
times, especially if there is a sharp capping inversion at the top of an obvious daytime adiabatic
mixed layer. But often (I find over 30 or 40 % of the time) the temperature profile is "fuzzy", with
no obvious  capping inversion (there may be multiple weak inversions or one deep slightly stable
layer).  And at night, when there is a ground based inversion, I  notice that some people define
the mixing height as the depth of the surface stable layer or the height of the low level jet. This
is not correct.

I feel that a lot of the questionable model performance that you report in the Task 2 report is due
to incorrect mixing height assessments.  Especially picking relatively low mixing heights that
cannot be scientifically justified.

Response: The observed mixing heights from the tracer experiments were obtained from the
study archives.  PBL heights from the "rcalF" WRF datasets were determined using the schemes
specific to each PBL model (UW and MYJ use TKE-based lower bounds to estimate PBL height
and YSU uses a critical Richardson number technique). Also, the MMIF PBL heights ("rcalT"
cases) are estimated using a bulk Richardson number approach. Richardson number methods
have been recommended for the practical estimation of mixing height for dispersion  modeling.26

It is recognized  by the authors that mixing height error is a contributor to areas of poor
performance in this study. One of the purposes of this study is  to assess if meteorological
variable error is significant enough to result in significant AERMOD prediction error. The  mixing
height schemes used represent standard tools that are used to predict mixing height in
regulatory dispersion models. In fact, the TKE and Richardson  number methods used are
arguably more sophisticated than the mixing  height model used in AERMET. The TKE and
Richardson number methods use a vertical profile of the atmosphere to estimate mixing height
while AERMET  relies on surface friction velocity only to estimate mixing height (for the stable
conditions in question).  AERMET estimates the mechanical mixing height using a formula from
Venkatram  (1980)27 that was itself a simple linear fit of the Zilitinkevich (1972)28 model to
averaged data from a single experiment.
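
To make the bulk Richardson number approach mentioned above concrete, the Python sketch below
diagnoses a PBL height as the lowest level at which the bulk Richardson number exceeds a critical
value.  The critical value of 0.25, the use of a dry potential temperature, and the example
profile are illustrative assumptions and do not reproduce the exact MMIF formulation.

    # Illustrative bulk Richardson number PBL-height diagnosis (not the exact
    # MMIF formulation).  The 0.25 critical value and the dry potential
    # temperature are simplifying assumptions.
    import numpy as np

    G = 9.81  # gravitational acceleration, m/s2

    def pbl_height_bulk_ri(z, theta, u, v, ri_crit=0.25):
        """z (m AGL), theta (K), u and v (m/s) are 1-D arrays ordered from the
        lowest level upward.  Returns the first height where the bulk
        Richardson number exceeds ri_crit."""
        for k in range(1, len(z)):
            shear2 = (u[k] - u[0]) ** 2 + (v[k] - v[0]) ** 2
            ri_b = G * z[k] * (theta[k] - theta[0]) / (theta[0] * max(shear2, 1e-6))
            if ri_b > ri_crit:
                return z[k]
        return z[-1]  # well mixed through the whole profile

    # Hypothetical stable overwater profile
    z = np.array([10., 50., 100., 200., 400., 800.])
    theta = np.array([271.0, 271.2, 271.6, 272.5, 274.0, 276.0])
    u = np.array([4.0, 6.0, 7.0, 8.0, 9.0, 10.0])
    v = np.zeros(6)
    print("diagnosed PBL height (m):", pbl_height_bulk_ri(z, theta, u, v))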

Comment B.2:  Section 3.4 -  Development of Meteorological Inputs for AERMOD.  It is
interesting to note that the WRF MYJ-based simulations resulted in the best predictions of
regional wind speed, as indicated  by METSTAT, for Cameron,  Carpinteria,  and Oresund.
However, it was also noted that better performance of WRF MYJ-based simulations was not
26 Seibert, P., F. Beyrich, S.E. Gryning, S. Joffre, A. Rasmussen, P. Tercier (1997): Mixing Height Determination for
Dispersion Modeling. COST Action 710, Report of Working Group 2,
27 Venkatram, A. 1980. Estimating the Monin-Obukhov Length in the Stable Boundary Layer for Dispersion
Calculations. Boundary-Layer Meteorology. 19: 481-485.
28 Zilitinkevich, S.S. 1972. On the Determination of the Height of the Ekman Boundary Layer. Boundary-Layer
Meteorology. 3: 141-145.


-------
evident at the offshore buoy location. This suggests that MYJ may not be the best over-water
scheme.  Yet, this is what was used for the Task 3 modeling analyses.

Response:  The results of the Task 2 study support the recommendation that YSU or UW PBL
schemes be used for WRF instead of the MYJ scheme.  As mentioned in Section 2.1 of the
Task 3 report, the MYJ scheme was used for Task 3 modeling because it has been the
preferred PBL scheme for Polar WRF based on the results of Hines and Bromwich (2008).29
However, it was noted that Bromwich, Hines, and Bai (2009)30 found YSU provided slightly
superior performance for WRF modeling over the Arctic Ocean, particularly during periods of
strong positive sensible heat flux. In Hines et al. (2011), both MYJ and YSU were used for
sensitivity tests of WRF performance over the Alaskan Arctic tundra. Although both simulations
showed good agreement with most surface observations, both tended to underpredict summer
cloud cover due to poor representation of Arctic stratus. This deficiency was blamed on YSU and
MYJ allowing excessive vertical mixing of moisture in the PBL. Hines et al. recommended an
improved PBL scheme with better treatment of vertical moisture mixing. The very purpose of the
creation of the UW PBL model was to improve moisture transport parameterization in the PBL to
better simulate marine stratocumulus-topped boundary layers.31 It is therefore plausible that the
UW PBL model may have outperformed the MYJ and YSU models if tested in this study.

Comment B.3: Section 5.1 - Primary Questions. A final table summarizing the model
evaluation findings might be useful to the reader.  This table could list the scenario,
observational TMS, best model, and its TMS.  For example:
Scenario       Case 1 Obs TMS   Case 2 Obs TMS   Best WRF Config.   Best Extraction Method   TMS
Cameron        0.64             0.61             NARR.UW            MMIF.RCALF               0.68
Carpinteria    0.64             0.64             ERA.UW             MMIF.RCALT               0.55
Oresund        -                0.43             ERA.MYJ            MMIF.RCALT               0.41
Pismo          0.4              0.23             ERA.YSU            AERC.RCALF               0.75
Ventura        0.72             0.42             ERA.YSU            MMIF.RCALF               0.19
Response:  We agree that it would be valuable to include such a table in the report summary. It
was added to the summary section of the report.

Specific Comments

Comment B.4: Preface, first paragraph, second sentence: I thought that the measurements
were assimilated into the model?
29 Hines, K.M., D.H. Bromwich. 2008. Development and Testing of Polar Weather Research and Forecasting (WRF)
Model. Part I: Greenland Ice Sheet Meteorology. Monthly Weather Review, 136: 1971-1989.
30 Bromwich, D.H., K.M. Hines, L.S. Bai. 2009. Development and testing of Polar Weather Research and Forecasting
model: 2. Arctic Ocean. Journal of Geophysical Research, 114:1-22, doi:10.1029/2008JD010300.
31 Bretherton, C.S., S. Park. 2009. A New Moist Turbulence Parameterization in the Community Atmosphere Model.
Journal of Climate, 22: 3422-3448.

-------
Response: Measurements were not used for observational nudging in the Task 2 WRF
simulations. 3-D nudging of fields towards the reanalysis data was used. A paragraph in Section
2.2 incorrectly states that observational nudging was utilized. This paragraph has been
removed.

Comment B.5:  Preface, first paragraph, second to the last sentence: Did any beta users
provide comments and how were they addressed by EPA?

Response: Region 10 noticed the issuance of draft air permits for the Shell Kulluk
Conical Drill Unit and the Noble Discoverer Drill Ship for exploratory drilling in the
Beaufort Sea and Chukchi Sea. As part of the notice, Region 10 requested public
comments on the use of the COARE algorithm approved under Section 3.2.2.e of
Appendix W in 40 CFR 51 to process over water measured meteorological data. In the
Response to Comments for the Noble Discoverer Drill Ship,32 Region 10 responded to six
comments (S.1 to S.6). Similar responses to comments can be found for the Kulluk.

See Response to Comment A.7 for additional discussion.

Comment B.6:  Preface, first paragraph, last sentence:  This has no meaning to me (i.e.,
"Section 3.2.2.e of Appendix W in 40 CFR 51").  Please  give a brief overview.

Response: See Response to Comment A.7.a.

Comment B.7:  Preface, last paragraph: You say Environmental Protection Agency twice in a
row.

Response: The last paragraph in the Preface has been corrected per the comment.

Comment B.8:  List of Abbreviations:  "cn". Makes no sense when compared to the previous
three concentration symbols.  Please revise.

Response: The description of the variable has been revised.

Comment B.9:  List of Abbreviations:  "ERA-I".  List only the words in the acronym.

Response: Corrected.

Comment B.10:  List of Abbreviations: Add "METSTAT".

Response: Description added to list of abbreviations.

Comment B.11:  List of Abbreviations: "MG". These words aren't correct.  Delete "Bias of the".

Response: Corrected to "Geometric mean bias."
32 USEPA. 2011. Supplemental Response to Comments for Outer Continental Shelf Prevention of Significant
Deterioration Permits, Noble Discoverer Drill Ship. Seattle, WA. September 19.


-------
Comment B.12:  List of Abbreviations:  "MMIF".  What does this mean? Only define the
acronyms and don't add other words.

Response: Additional details removed which stated that "MMIF" also indicated WRF
meteorology extraction cases without AERCOARE processing.

Comment B.13:  List of Abbreviations:  "OBS".  Very odd definitions.  I thought that OBS
referred to "observation".

Response: Description changed to address the reviewer's concern. "OBS" is used as a label
for AERMOD runs and is not used to indicate any specific measurement.

Comment B.14:  List of Abbreviations:  "P". Then an observation is OBS and a corresponding
prediction is P? The symbols should be more consistent.

Response: The abbreviation for predicted value is used in the definition of the statistics
(Equations 1-5). The corresponding abbreviation for observed value, "O," was not originally
included in the list of abbreviations. The description of this variable has been added to the list.

Comment B.15:  List of Abbreviations:  "Q". Volumes per unit volume (such as ppmv)?
Usually q is in g/m3.

Response: The reference to Q in the report has been removed.

Comment B.16:  List of Abbreviations:  "RCALF". Extraction? What is this? Prediction?

Response: The description has been edited. The abbreviation  is a labeling system used to
indicate AERMOD simulations driven by WRF-MMIF meteorology.

Comment B.17:  List of Abbreviations:  "s".  I never heard of this before. Why did you choose
this symbol?

Response:  The description has been  edited. The abbreviation is used to represent seconds
only.

Comment B.18:  List of Abbreviations:  "U". Zonal? Do you mean W-E component, positive for
wind from the west?

Response: Yes, "Zonal" is a common term used to indicate wind tangent to lines of latitude,
positive when blowing from west to east.

Comment B.19:  List of Abbreviations:  "ug".  Above you used "MG".

Response: MG was used for the geometric mean bias and ug was used for the geometric mean itself.

-------
Comment B.20: Section 1 - Introduction, second paragraph, first sentence. Show a distance
range for near field (< 50 km?).

Response: Added to report.

Comment B.21: Section 1 - Introduction, third paragraph, last sentence. What is the peer-
reviewed reference for this? As the developer of OCD, I disagree with these unsupported
negative statements. You should go through the several OCD scientific algorithms and provide
a point-by-point set of comments. I  maintain that most of the OCD components have withstood
the test of time and have been shown to be reasonable.

Response: Refer to responses for comments A.6 and A.7. We agree that the statement
describing OCD as "an older model that utilizes simplistic, outdated physical parameterizations"
was negative, unfounded, subjective, and incorrect. However, as demonstrated in the
responses to comments A.6 and A.7, there are advantages to an AERMOD-based alternative
modeling methodology for over-water sources.

Comment B.22: Section 1 - Introduction, fourth paragraph. First two sentences. The EPA
development seems to have ignored the previous MMS-supported development of a COARE-
based model for overwater applications as a meteorological processor for dispersion models:
Hanna SR, MacDonald CP, Lilly M, Knoderer C and Huang CH.  Analysis of three years of
boundary layer observations over the Gulf of Mexico and its shores.  Estuarine, Coastal and
Shelf Science 2006; 70: 541-550. This peer-reviewed paper should be discussed in this report.

Response: (repeat of response to comment A.25) The Task 2 report did not contain a detailed
review of the scientific literature concerning over-water dispersion modeling. The purpose of the
report was to provide a summary of the modeling methodology and results. A detailed
discussion of the work's relationship to the broader body of related scientific work was beyond
the scope of the study. Such a discussion would be appropriate for a scientific journal article or
summarizing document that refers to the Task 2 report to discuss how the report conclusions
relate to the greater body of scientific literature.

Comment B.23: Section 1 - Introduction, fifth paragraph.  I thought that CALPUFF contained
an overwater boundary layer parameterization.  That effort was supported by MMS and I was on
the science review committee. Many of the concepts in COARE (my version) and in OCD (such
as the overwater PBL height assumptions) were incorporated. Has that been reviewed in this
project?

Response: CALPUFF version 6 contains enhancements for over-water applications. These
were developed as part of the MMS "Development of the Next Generation Air Quality Models for
Outer Continental Shelf Applications" project completed in 2006.33 It was beyond the scope of
33Scire, J.S., Strimaitis, D.G., Robe, F.R., Popovic, J.M. and Phadnis, M.J., 2006. Development of the Next
Generation Air Quality Models for Outer Continental Shelf (OCS) Applications. Final Report: Volume 1. MMS Final
Report. Contract 1435-01-01-CT-31071, U.S. Department of the Interior, Minerals Management Service, Herndon,
Virginia. 131 pp.

the Task 2 work to evaluate the broader body of related work (refer to the response for
Comment B.22).

Comment B.24:  Section 1 - Introduction, sixth paragraph.  NOAA has their own interface for
WRF-HYSPLIT. DOD has a widely used interface for WRF-SCIPUFF.  The EPA system is not
the only one and the others should be referenced.

Response: Refer to the response to Comment B.22.

Comment B.25:  Section 1 - Introduction, sixth paragraph.  I thought that MMIF also could be
used with CALPUFF and CMAQ.

Response: MMIF can provide meteorological input data to a number of  models, including
CALPUFF and CMAQ.  It was unnecessary to list all of the features of MMIF in the report.

Comment B.26:  Section 1 - Introduction, seventh paragraph, third sentence.  The first study
(Chapter #) evaluated a combined modeling approach...

Response: This  has been corrected in the report.

Comment B.27:  Section 1 - Introduction, seventh paragraph. Fourth sentence.  Define
"extracted" as used in this context.

Response: "Extraction" in this context means that a program such as MMIF has extracted
meteorological data from WRF output files to create AERMET input files.  This sentence has
been edited.

Comment B.28:  Section 1 - Introduction, seventh paragraph. Fourth sentence. What results?
Concentrations?  Wind speeds?

Response: References to the report section numbers addressing these  details have been
added.

Comment B.29:  Section 1 - Introduction, eighth paragraph, first sentence.  The purpose of the
first study (Chapter*) was to provide evidence to help...

Response: This  has been corrected in the report.

Comment B.30:  Section 1 - Introduction, eighth paragraph, second sentence. Who said that
AERMOD was conservative?  I thought that it was meant to  be realistic?

Response: AERMOD was designed to be unbiased, as shown in EPA (2003).34 The sentence
references a concern that AERMOD driven by WRF meteorology will predict lower
concentrations than AERMOD driven by observed meteorology. The sentence has been edited.
34 USEPA. 2003. AERMOD: Latest Features and Evaluation Results. U.S. Environmental Protection Agency, Research
Triangle Park, EPA-454/R-03-003.

Comment B.31: Section 1 - Introduction, eighth paragraph, third sentence. Define how
extraction is used in sentence.

Response: Refer to the response for comment B.27. The term "extracted" is unnecessary in
this sentence  and has been removed.

Comment B.32: Section 1 - Introduction, eighth paragraph, fourth sentence. Define how
extraction is used in sentence.

Response: Refer to the response for comment B.27. This term "extracted" is unnecessary in
this sentence  and has been removed.

Comment B.33: Section 2.1 - Overview of the Tracer Studies and WRF Domains.  For the
experimental configuration figures (Figure 1  though Figure 4) or in the modeling domain figures
(such as Figure 6),  it would be useful to include a plot of the meteorological stations evaluated
in the METSTAT analyses.

Response: Plots showing the position of the METSTAT stations added to the revised report.

Comment B.34: Section 2.1.1 - Cameron,  first paragraph, last sentence. In this section, the
basic field experiment reports/references should be listed.

Response: References added.

Comment B.35: Section 2.1.1 - Cameron,  second paragraph, second sentence.  Is this
measured offshore, at the coast, or inland? (Several met measurement locations are shown in
Fig. 1)

Response: Meteorological data were collected at a 10 m mast installed on the coast, 5 to 20 m from the water, and at a 25 m mast located 2 km inland. These details have been added to the
report. The  locations of the masts are indicated on  Figure 1 of the report.

Comment B.36: Section 2.1.1 - Cameron,  second paragraph, fifth sentence.  Is there a
reference for this? It doesn't make sense to  me.

Response: Both the ENVIRON (2012) and Earth Tech  (2006) studies used adjusted air-sea
temperature differences to ensure consistent parameters for the atmospheric stability
conditions. The ENVIRON (2012)  reference was added to the report here. The inconsistency
between air-sea temperature differences and the vertical temperature gradient, a result of
horizontal advection of air masses in a region of strong SST gradient, was also noted in the
OCD model performance evaluations reported in Appendix C of Hanna et al. (1984).35 The
strongly stable conditions indicated by the temperature gradient conflicted with the sign of heat
35 Hanna, S.R., LL Schulman, R.J. Paine, J.E. Pleim. 1984. Users Guide to the Offshore and Coastal Dispersion (OCD)
Model. Environmental Research & Technology, Inc., Concord, MA, prepared for the Minerals Management Service
of the U.S. Dept. of the Interior under Contract No. 14-08-0001-21138.

flux indicated by the air-sea temperature differences. Hanna et al. reported the OCD model
underpredicted concentration when the air-sea temperature difference was used to estimate
stability parameters in these cases. They reported better agreement when strongly-stable
conditions were assigned to these cases, ignoring the air-sea temperature differences.  In the
current study we have adjusted the air-sea temperature differences to coincide with the strongly
stable conditions indicated by the vertical temperature gradient.

Sea surface temperatures were adjusted to ensure that the air-sea temperature difference
matched the vertical temperature gradient, given the reported temperature measurement
heights.
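
A minimal Python sketch of the kind of consistency adjustment described above is shown below. It is illustrative only: the function name, variable names, and the simple linear adjustment are assumptions made for the example and do not reproduce the exact procedure used in the study.

def adjust_sst_for_stability(t_air_c, sst_c, dtheta_dz, z_air_m=10.0):
    """Return an SST (deg C) consistent with the stability implied by the
    vertical potential temperature gradient dtheta_dz (K/m).

    dtheta_dz > 0 implies a stable surface layer (air warmer than sea);
    dtheta_dz < 0 implies an unstable surface layer (sea warmer than air).
    """
    air_sea_diff = t_air_c - sst_c
    inconsistent = (dtheta_dz > 0 and air_sea_diff < 0) or \
                   (dtheta_dz < 0 and air_sea_diff > 0)
    if inconsistent:
        # Reset SST so the air-sea difference matches the sign (and a
        # gradient-consistent magnitude) implied by dtheta_dz.
        return t_air_c - dtheta_dz * z_air_m
    return sst_c  # already consistent; leave the measurement unchanged

# Example: a strongly stable gradient (+0.05 K/m) but SST warmer than the air
print(adjust_sst_for_stability(t_air_c=12.0, sst_c=14.0, dtheta_dz=0.05))  # 11.5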

Comment B.37: Section 2.1.1 - Cameron, second paragraph, last sentence. Earth Tech was
using the data to evaluate CALPUFF. Did you compare these met inputs with what is in my
original OCD evaluation papers?  Note that Joe Chang has all the original data in our Modelers
Data Archive (MDA).

Response: Evaluation of the Earth Tech CALPUFF meteorology was beyond the scope of this
study.

Comment B.38: Section 2.1.1 - Cameron, Table 1. How were the Mixing Heights shown in the table observed or determined?

Response: These were the values reported in the Cameron study database. The data was
obtained from the OCS Model Evaluation Database provided by the EPA. The values were used
in prior evaluations of OCD and CALPUFF in the studies referenced in the report. It was beyond
the scope of this study to perform an extensive evaluation of the meteorological parameters
measured or derived in the tracer studies since these evaluations had been conducted  by
previous groups.

Comment B.39: Section 2.1.2 - Carpinteria, first paragraph, second sentence. Really!  EPA
models used to have a good shoreline fumigation algorithm. What happened to it? You could
use the algorithm in OCD or an alternate algorithm by Venkatram.

Response: AERCOARE and AERMOD do not contain shoreline fumigation modules. It was
beyond the scope of this study to evaluate or recommend shoreline fumigation models. It is
recognized that a shoreline fumigation module should be adopted into AERMOD if used for
offshore/coastal air quality impacts analysis.

Comment B.40: Section 2.1.2 - Carpinteria, first paragraph, fifth sentence.  Tracer gas was
released at heights of 18 and 61 m at distances # offshore...

Response: Released at a distance of 300 - 700 m offshore. This fact has been added to the
report.

Comment B.41: Section 2.1.2 - Carpinteria, first paragraph, seventh sentence.  I am probably
responsible for this, since analysis of many overwater boundary layer data sets suggested that
a good default mixing height is about 500 m. I had thought that was also incorporated in the
overwater CALPUFF.

Response: A fixed height of 500 m was reported in the project database. Inquiry into the
derivation of the mixing height used in the study was beyond the scope of this study.

Comment B.42:  Section 2.1.2 - Carpinteria, first paragraph, last sentence. This is probably
because you were using a too-low PBL height.  If you assume PBL height is 10 or 20 m of
course it would make a big difference. Another reason for assuming 500 m.

Response: The minimum PBL height for WRF or AERCOARE meteorology was set at 25 m for this study.  Further evaluation could be warranted to provide recommendations for a minimum PBL height, but such evaluation was beyond the scope of the current study.

Comment B.43:  Section 2.1.2 - Carpinteria, Figure 2. The points of interest are in a small area
in the lower part of the figure.  I recommend cutting  the top 2/3 of this figure and expanding the
area where sources, met instrument locations, and sampler locations are shown.

Response: The graphic has been altered to focus on the  area of interest.

Comment B.44:  Section 2.1.2 - Carpinteria.  For the Carpinteria tracer study, offshore impacts
from complex coastal terrain and shoreline fumigation were considered;  BOEM and EPA only
evaluated the complex terrain receptors and not the shoreline receptors  since AERMOD and
AERCOARE do not contain shoreline fumigation modules. However, it seems that the analysis
should have considered receptors  along the shoreline separately from receptors inland for the
AERCOARE evaluation, as AERMOD will likely over-predict concentrations the further inland
that a receptor is located. The consideration of shoreline receptors would give a more
representative indication of how WRF is working in the marine layer region.

Response: The scope of the study was to evaluate AERMOD as it is currently configured
(without a shoreline fumigation model), so it was appropriate to use both shoreline and inland
receptors,  especially since the purpose of using Carpinteria was to provide an examination of a
complex terrain case. Note all receptors were located atop a coastal bluff.

Comment B.45:  Section 2.1.3- Oresund, Table 3. The 50 m mixing  height will cause
problems since the release height  is above 50 m. Why did you decide on a 50 m mixing
height? You will get better results  with a larger mixing height (as in OCD).

Response: The mixing heights used were from the tracer study database (values reported in
the OCS Model Evaluation Archive provided by the  EPA to ENVIRON).

Comment B.46:  Section 2.1.3- Oresund.  The report should include figures for the Oresund
study  showing the study layout (e.g., receptors, sources, and meteorological sampling locations)
like what was presented for the other tracer studies.

Response: Figures showing the Oresund study layout and WRF domains have been included
in the report as Figures 3 and 11, respectively.

Comment B.47:  Section 2.1.4 - Pismo Beach, first paragraph, last sentence. Who did this?
What is the justification for the numbers chosen?

Response: Refer to the response to B.36.

Comment B.48:  Section 2.1.4 - Pismo Beach, Figure 3. Did you receive written approval for
this "borrowing"?

Response: References to "borrowed from Earth Tech, 2006" have been removed to avoid
confusion. The graphics were developed using mapping files provided by the EPA from the
publicly available OCS model evaluation archive. The OCS model evaluation archive was
originally developed for the U.S. Dept. of the Interior by Earth Tech, Inc. A reference to the OCS
model evaluation archive has been added to the report.

Comment B.49:  Section 2.1.4 - Pismo Beach, Table 4. Explain mixing height of 50 m with
unstable Delta-T?

Response: This is likely a case similar to those referred to in the response to Comment B.36
where advection inversions determine the mixing height (caused by non-local influences). The
air-sea temperature gradient was not adjusted for the 12/13/81 14:00 and 15:00  periods
referenced because the virtual potential temperature gradient was not positive. However, the
low mixing heights during these periods would support an adjustment to the air-sea temperature
difference following the methods described in the response to Comment B.36.

Comment B.50:  Section 2.1.4 - Pismo Beach, Table 4.  Give a reference for the mixing
heights.

Response: The data reported in the table were obtained from the OCS Model Evaluation
Archive as stated in the report and in the response to previous comments (B.45).

Comment B.51:  Section 2.1.4 - Pismo Beach, Table 4. What happened to the principle that
"all data are innocent until proven guilty"? References and better justification is needed for any
revisions to observations.

Response: Refer to the response for Comment B.36.

Comment B.52:  Section 2.1.4 - Pismo Beach.  For the Pismo Beach tracer study, offshore
impacts from complex coastal terrain and shoreline fumigation were considered; BOEM and
EPA only evaluated the  complex terrain receptors and not the shoreline receptors since
AERMOD and AERCOARE do not contain shoreline fumigation modules. However, it seems
that the analysis should  have considered receptors along the shoreline separately from
receptors inland for the AERCOARE evaluation, as AERMOD will likely over-predict
concentrations the further inland that a receptor is located. The consideration of shoreline
receptors would give a more representative indication of how WRF is working in  the marine
layer region.

Response:  Refer to the response for Comment B.44.

Comment B.53: Section 2.1.5 - Ventura, first paragraph, sixth sentence. Who assumed this?
The authors of the current report or the authors of the original field experiment report?

Response:  Refer to the response for Comment B.36.

Comment B.54: Section 2.1.5 - Ventura, Table 5. What happened to the principle that "all
data are innocent until proven guilty"? References and better justification is needed for any
revisions to observations.

Response:  Refer to the response for Comment B.36.

Comment B.55: Section 2.2 - WRF Configuration and Settings. This is a big jump. We were
just talking about field experiments and now we all of a sudden are talking about WRF.  Please
add a transition paragraph or anything else that might explain this transition and why.

Response:  An introductory paragraph has been added to the beginning of Section 2.0 to
explain the transitions. The title of Section 2.2 has also been changed to "WRF Configurations
for the Five Tracer Studies."

Comment B.56: Section 2.2 -WRF Configuration and Settings, first paragraph. Give a
reference or two (report, journal article...etc.)

Response:  References  added.

Comment B.57: Section 2.2 -WRF Configuration and Settings, second paragraph, first
sentence.  Provide a reference.

Response:  References  added.

Comment B.58: Section 2.2 -WRF Configuration and Settings, second paragraph, fourth
sentence.  This rationale is fuzzy.  I don't see why the conclusion follows from what has been
just said.

Response:  Sentence has been edited to clarify the assumptions being made.

Comment B.59: Section 2.2 -WRF Configuration and Settings, second paragraph, fifth
sentence.  Define what is meant by "reanalysis input dataset".

Response:  This has been edited to state "reanalysis dataset."

Comment B.60: Section 2.2 -WRF Configuration and Settings, second paragraph, sixth
sentence. Can the authors please help the average readers by better explaining their rationale?
This is currently written as if the audience is a group of WRF researchers. Actually the
audience is a group of persons used to running AERMOD or OCD.

Response: Additional discussion added to the report.

Comment B.61:  Section 2.2 -WRF Configuration and Settings, third paragraph, first sentence.
Explain "reanalysis dataset and PEL scheme selection..."

Response: Refer to response to Comment B.60.

Comment B.62:  Section 2.2 -WRF Configuration and Settings, third paragraph, second
sentence. Give more explanations of why reanalysis and PBL schemes are needed and how the
choices are made.

Response: Refer to response to Comment B.60.

Comment B.63:  Section 2.2. -WRF Configuration and Settings, after third paragraph, second
closed bullet.  These all seem to be for overland boundary layers.  If you look at a lot of
overwater boundary layers, you find that there is often a persistent mixing depth of about 500 m
that persists over broad areas.

Response: The three PBL schemes used in this study were not developed specifically to simulate overland boundary layers. The theory each is based on is applicable to both overland and overwater boundary layers. The UW scheme itself was specifically developed to improve simulation of marine boundary layers. A purpose of this study is to evaluate the PBL scheme performance for over-water conditions.

Comment B.64:  Section 2.2 -WRF Configuration and Settings, fourth paragraph, first
sentence.  Write out "NARR".

Response: NARR (North America Regional Reanalysis) is written out and defined  in Section
2.2 before it is used as an acronym.

Comment B.65:  Section 2.2 -WRF Configuration and Settings, fourth paragraph, second
sentence.  But the field experiments have land-use variability at scales less than 32 km.  Review
all the figures.

Response: Reanalysis datasets typically have too low of a resolution to fully resolve terrain
and land-use features. A mesoscale model  such as WRF is typically used with the purpose of
simulating meteorology at resolutions that resolve  important features that influence local
conditions.

Comment B.66: Section 2.2 -WRF Configuration and Settings, fourth paragraph, last
sentence.  But what about the field experiments where there is more spatial variability?

Response: Refer to response to Comment B.65.

Comment B.67:  Section 2.2 -WRF Configuration and Settings, sixth paragraph.  There should
be a more complete description of the PBL schemes so the reader can understand the
differences and limitations of each approach.  For example, where have Yonsei University (YSU)
and Mellor-Yamada-Janjic (MYJ) schemes been successfully applied?

Response: Further discussion has been added to Section 2.2 to respond to this Comment and
Comments B.61 and B.62. It is beyond the scope of this study to produce a full survey of the
scientific literature on this subject. The reader is directed to the additional citations added  to this
section.

Comment B.68:  Section 2.2 - WRF Configuration and Settings, sixth paragraph, first
sentence.  But not for overwater boundary layers.  Do they give the 50 m mixing heights that the
authors list in the tables for the field experiments?

Response: The results are discussed in Section 3. We concur it would have been ideal to set an MMIF minimum PBL height of 50 m to conform to the experimental minimum mixing heights. This was the original value recommended based on the values in the experimental database and the fact that it is the minimum mixing height in CALPUFF. The EPA requested use of a minimum mixing height of 25 m to provide more conservative concentration estimates. It should be noted that AERMOD allows for a minimum PBL height of 1 m. The 25 m value is a reasonable minimum mixing height for the most extreme stable periods, based on measurements reported in Hsu (1988) and Garratt (1992).

Comment B.69:  Section 2.2 -WRF Configuration and Settings, seventh paragraph, first
sentence.  What were the resolutions for the WRF solutions?

Response: This information is reported later in the Section. Each location was modeled using
a base WRF grid  at 36km resolution and nested grids at 12 km, 4 km, and 1.33 km resolution.
Vertical resolution is reported in Table 10.

Comment B.70:  Section 2.2 -WRF Configuration and Settings, seventh paragraph, last
sentence.  What statistical criteria were used to demonstrate that the WRF ERA simulations outperformed the WRF NARR simulations?

Response: The  analysis is described in Sections 2.3 and 2.4. References to these Sections
have been added to this paragraph.

Comment B.71:  Section 2.2 -WRF Configuration and Settings, eleventh paragraph, last
sentence.  Yes, this is my worry for these data sets. Why shouldn't you prefer to match the
observed wind speed and direction at the field experiment site?

Response: No observational nudging was used in this study. A purpose of this study is to
investigate how well AERMOD performs using WRF assuming local measurements are
unavailable for the location of interest.

Comment B.72:  Section 2.2 - WRF Configuration and Settings, eleventh to thirteenth
paragraph. Were the same meteorological stations  used for nudging WRF also used for the
METSTAT evaluation?

Response:  No observational nudging was used in this study. This statement referring to
observational nudging was incorrect and removed. This statement was an artifact from previous
planning. It was decided that observational nudging conflicted with the purpose of the study.

Comment B.73: Section 2.2 -WRF Configuration and Settings, thirteenth paragraph, fourth
sentence. But nudging is weak. I would hope that an attempt would be made to closely match
the on-site met observations.

Response:  Ideally the best WRF simulation would be the case that resulted in surface
conditions that most closely matched the on-site measurements. A purpose of the study was to
identify a WRF configuration that performed best, given an assortment of WRF scenarios using
different PBL schemes and the most widely available reanalysis data.

Comment B.74: Section 2.2 -WRF Configuration and Settings, sixteenth (or last) paragraph.
As noted in the text, the U.S. modeling domains for WRF are defined on the Lambert Conformal
Conic (LCC) map projection identical to the National Regional Planning Organization (RPO)
domains, with an outermost RPO domain (36 km) and telescoping 12 km - 4 km - 1.33 km
nests to capture the fine detail of coastlines and adjacent topography. What criteria were used
to establish the size of the 1.33-km  domain and buffer? Was there just a need to capture the
shoreline? For example, Catalina Island was not included in the 1.33-km domain (see Figures 7
and 8 of the Task 2 report). Would  this island feature potentially influence the winds in the
channel offshore?

Response: WRF nested domains were run with the standard 3:1 refinement ratio, so each nest's resolution is one-third of its parent domain's resolution. Thus, the 1.33 km resolution follows from the 36 km outermost domain (36, 12, 4, and 1.33 km), a common set of resolutions used for WRF modeling. The outermost domain resolution of 36 km is appropriate because it is similar in resolution to the reanalysis datasets used.

Comment B.75: Section 2.2 -WRF Configuration and Settings, Table 10. Explain table title
"WRF Vertical Grid  Setup" (e.g., for what exactly, overwater application...etc.).

Response: The Table lists the vertical levels used in these WRF runs. The modeler sets up the
vertical grid, defining the layer heights. The setup represents the vertical grid resolution used
over all of the domains in a WRF run.

Comment B.76: Section 2.3 - WRF Performance Evaluation.  Change title to read "WRF
Performance Evaluation at Five Overwater Field Experiment Sites."

Response: The change was made to  the report as suggested.

Comment B.77: Section 2.3 -WRF Performance Evaluation, first paragraph,  first sentence.
What does this mean? How is it different from "predicted surface meteorology" on the previous
line?

Response: The sentence has been edited.

Comment B.78:  Section 2.3 -WRF Performance Evaluation, first paragraph, second
sentence.  Has METSTAT been published in the peer-reviewed literature?

Response: METSTAT is a tool that has been used by studies reported in public documents
such as Emery et al. (2001).36
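
For readers unfamiliar with the statistics referenced throughout this section, the sketch below shows how paired bias, gross error, and RMSE are commonly computed, including the wrapping of wind direction differences to the range -180 to +180 degrees. This is an illustrative Python sketch, not METSTAT source code, and the function names are assumptions.

import math

def paired_stats(pred, obs):
    """Mean bias, gross error, and RMSE for paired predictions and observations."""
    diffs = [p - o for p, o in zip(pred, obs)]
    n = len(diffs)
    bias = sum(diffs) / n
    gross_error = sum(abs(d) for d in diffs) / n
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    return bias, gross_error, rmse

def wind_dir_diff(pred_deg, obs_deg):
    """Smallest signed difference between two wind directions, in degrees."""
    return (pred_deg - obs_deg + 180.0) % 360.0 - 180.0

# Example usage
print(paired_stats(pred=[5.1, 4.2, 6.0], obs=[4.0, 5.0, 6.5]))
print([wind_dir_diff(p, o) for p, o in zip([350.0, 10.0], [10.0, 355.0])])  # [-20.0, 15.0]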

Comment B.79:  Section 2.3 -WRF Performance Evaluation, first paragraph, fourth sentence.
What if the nearest grid cell includes both land and water and the observed point is obviously in
one or the other?

Response: MMIF prints a warning when the extraction point is over a solid surface, including
ice. This can also be problematic when sea ice coverage fraction varies at a particular point
during early summer ice melt.  The date and time stamp when the dominant land use category
switches from ice to ocean is reported.

Comment B.80:  Section 2.3 -WRF Performance Evaluation, first paragraph, fourth sentence.
For each field experiment and measurement site, a map should be included showing the
observation site and the "nearest grid cell".

Response: The METSTAT stations used are now added to the WRF domain figures. At 1.3 km
resolution, the nearest grid point is near to the measurement site.

Comment B.81:  Section 2.3 -WRF Performance Evaluation, fourth paragraph, second
sentence.  Describe how the predicted wind speed is interpolated to the actual measurement
height.

Response: WRF calculates 10m wind vectors using Monin-Obukhov surface layer similarity
theory.  The measurement height of most stations in the DS3505 database is 10m.
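
A minimal sketch of the neutral-limit form of that surface-layer similarity adjustment is shown below. The full Monin-Obukhov diagnostic adds stability correction functions psi_m(z/L), which are omitted here for brevity; the function name and the example values are assumptions, not values from the study.

import math

def log_law_wind(u_ref, z_ref, z_target, z0):
    """Adjust a wind speed from height z_ref to z_target with the neutral log law."""
    return u_ref * math.log(z_target / z0) / math.log(z_ref / z0)

# Example: adjust a wind speed from ~25 m to 10 m over water (roughness ~2e-4 m)
print(round(log_law_wind(u_ref=8.0, z_ref=25.0, z_target=10.0, z0=2.0e-4), 2))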

Comment B.82:  Section 2.3.1 - METSTAT Benchmarks, first paragraph. This seems too
parochial.  There are many published comparisons of this type in the peer reviewed literature.
Seaman (2001) presented a review at the time with a focus on meteorological variables of
interest to air quality models. Hanna and Zhang (2003) and Hanna et al. (2010) also published
summaries of typical performance  measures: Hanna SR, Reen B, Hendrick E, Santos L,
Stauffer D, Deng AJ, McQueen J, Tsidulko M, Janjic Z, Jovic D, Sykes RI. Comparison of observed, MM5 and WRF-NMM model-simulated, and HPAC-assumed boundary-layer meteorological variables for three days during the IHOP field experiment. Boundary-Layer Meteorology 2010; 134(2): 285-306.

Response: The METSTAT benchmarks have been used for many studies accepted by the
EPA for evaluation of WRF performance for air quality studies. It was therefore deemed
unnecessary to adopt an alternative set of performance standards from the scientific literature.
36 Emery, C.A., E. Tai, G. Yarwood. 2001. "Enhanced Meteorological Modeling and Performance
Evaluation for Two Texas Ozone Episodes." Prepared for the Texas Natural Resource Conservation
Commission, by ENVIRON International Corp, Novato, CA.

Comment B.83:  Section 2.3.1 - METSTAT Benchmarks, second paragraph, fourth sentence.
Earlier you said that the new model does not account for fumigation across the TIBL anyway.
Does WRF produce a TIBL (probably not because the TIBL happens within the 1.33 km grid
resolution)?

Response: The WRF PBL schemes used do not contain specific parameterizations for coastal
meteorological phenomena.

Comment B.84:  Section 2.3.1 - METSTAT Benchmarks, third paragraph. As noted by BOEM
and EPA in the text, "Although METSTAT analysis can be applied to individual meteorological
station datasets, it is typically used to evaluate performance against a group of stations within
the WRF domain.  This approach is advantageous because it exhibits performance across the
entire domain and relaxes high bias that can occur at any individual site (advantageous if the
climate at the site is heavily influenced by small-scale  local terrain or roughness features not
resolved in the WRF domain)."

We would agree with this approach if the entire WRF wind field were being used  in AERMOD.
However, WRF is being used to extract a single point,  which goes into AERMOD. Thus, what
matters is how well WRF  is representing the meteorological conditions at that single point, not
over the entire domain. So, the evaluation should focus more  on the release monitoring site
and for stations in the marine layer.

Response: For this study, evaluation of WRF  performance at the measurement locations  has
been conducted as reported in Section 4.0. We agree that it would be insightful to conduct a
METSTAT analysis over the  small set of hours from each tracer study. However, METSTAT
analysis was only conducted with the DS3505 datasets. If this  method is used in  a regulatory
context, measurements at the location of the source are assumed not available. In this case,
evaluation of WRF performance over the inner-most domain using  available surface datasets is
warranted.

Comment B.85:  Section 2.3.1 - METSTAT Benchmarks, third paragraph, last sentence.
"However, if too many stations are used in the analysis, the statistics may be unduly smoothed
and not truly representative of WRF performance." How many stations are too many? The text
lists  14 to 107 stations being used for smoothing.  How do the  authors determine whether there
is too much smoothing going on?

Response:  No recommendations are available in the literature for the number of stations to
use.  This is a topic of research. The number of stations used is a subjective judgment of the
modeler at this point. The modeler should select a subset of available data that provides the
most even distribution of measurement points  within the domain. The stations actually used for
the inner-most domains are now shown on the respective plots for each study.

Comment B.86:  Section 2.3.1 - METSTAT Benchmarks, Table 11. Need a category for delta-T
(i.e.,  temperature difference).

Response: The air-sea temperature difference and the 10 m - 2 m temperature difference are not variables currently used in METSTAT.

Comment B.87: Section 2.3.1 - METSTAT Benchmarks, Table 11. For Humidity Error, this
would be a function of T (i.e., saturation specific humidity).

Response: It is understood that there is a degree of covariance between all variables used in
the METSTAT analysis,  particularly temperature and  humidity.

Comment B.88: Section 2.3.1 - METSTAT Benchmarks, Table 11. For Wind Speed  RMSE,
Seaman (2001) and the Hanna papers agree with this. However, they also say that the
minimum RMSE is 1 m/s due to natural variability.

Response: A minimum wind speed is not applied.

Comment B.89: Section 2.3.1 - METSTAT Benchmarks, Table 11. For wind direction error,
this has been shown to be inversely proportional to wind  speed.

Response: It is understood that there is a degree of covariance between all variables used in
the METSTAT analysis.

Comment B.90: Section 2.3.1 - METSTAT Benchmarks, third paragraph, first sentence. These
field experiments only have 1, 2, or 3 stations. Explain which stations are used in these
evaluations and show them on a map for each site.

Response: The DS3505 stations used for each study have been added to the domain figures.

Comment B.91: Section 2.3.1 - METSTAT Benchmarks, fourth paragraph, fourth sentence.
The word "poor" is arbitrary for your definitions. It is better to use specific statistical methods
related to % confidence.

Response: Discussion added to indicate what is considered acceptable or poor. The
METSTAT error and bias goals have been proposed as sufficient benchmarks to evaluate WRF
performance for this study. This paragraph is stating what results are considered poor or
satisfactory based on comparisons of error and bias to the METSTAT benchmarks. It  is  beyond
the scope of this work to perform a more thorough statistical analysis of the results.

Comment B.92: Section 2.3.1 - METSTAT Benchmarks, fourth paragraph, seventh sentence.
The word "exceptional" is also arbitrary for your definitions.

Response: "Exceptional" changed to "satisfactory." Refer to response to Comment B.91.

Comment B.93: Section 2.3.1 - METSTAT Benchmarks, fourth paragraph, last sentence. After
last sentence, add "However, for our specific scenarios, only a few local sites are available."

Response: Sentence added.

Comment B.94: Section 2.3.1 - METSTAT Benchmarks, fifth paragraph, first sentence.  Clarify
if measurements were assimilated by the model?

Response: Measurements were not assimilated by the model.

Comment B.95:  Section 2.3.1 - METSTAT Benchmarks, fifth paragraph, second sentence. Clarify if the extracted points were the buoy meteorological stations?

Response: Extracted points were at DS3505 database stations. Evaluation of meteorology at
the measurement locations is in Section 3. The METSTAT analyses did not use any data from
the tracer-study databases.

Comment B.96: Section 2.3.2 - Cameron WRF Simulation Performance, first paragraph, first
sentence. The comparison should be emphasized for the station used for OCD applications.

Response: Evaluation of meteorology at the measurement locations is in Section 3. The
METSTAT analysis is used for a domain-wide survey of WRF performance.

Comment B.97: Section 2.3.2 - Cameron WRF Simulation Performance, second paragraph,
second sentence.  The word "satisfactory" is arbitrary.  Please list quantitative number to
describe performance.

Response: Refer to the response to Comment B.91. The sentence has been changed to
emphasize "satisfactory" is within METSTAT error and  bias "goals."

Comment B.98: Section 2.3.2 - Cameron WRF Simulation Performance, second paragraph,
third sentence.  Describe excessive wind direction bias in terms of a numerical value.

Response: The sentence was changed to a reference to bias and error with reference to the
METSTAT goals.

Comment B.99: Section 2.3.3 - Carpinteria WRF Simulation Performance, first paragraph.
Suggest listing quantitative performance measures rather than just saying "within the acceptable
range" or "slightly less bias".

Response: Changes have been made, but references to the METSTAT goals are retained.

Comment B.100:  Section 2.3.3. - Carpinteria WRF Simulation Performance, first paragraph.
Extending Comment B.82 to Carpinteria, "However, if too many stations are used in the
analysis, the statistics may be unduly smoothed and not truly representative of WRF
performance." How many stations are too many? The text lists 14 to 107 stations being used
for smoothing. How do the authors determine whether there is too much smoothing going on?

Response: Refer to response to comment B.85. The number of stations reported was also
incorrect. The stations reported were those used for METSTAT analysis of the outer-most
domain and not for the METSTAT analysis of the inner-most domain reported in this study. The
reported number of stations used for each study has been corrected and the stations are plotted
in the respective tracer study domain plots.

Comment B.101:  Section 2.3.4 - Oresund WRF Simulation Performance, first paragraph, first
sentence.  Need to reference a figure.

Response: Figures showing the Oresund  study layout and WRF domains have been included
in the report as Figures 3 and 11, respectively. Refer to response to Comment B.46.

Comment B.102:  Section 2.3.4 - Oresund WRF Simulation Performance, first paragraph, last
sentence.  Discuss temperature and wind prediction performance quantitatively.

Response: Refer to the response to Comment B.91. Results that fall within the METSTAT benchmarks are considered satisfactory.

Comment B.103:  Section 2.3.4 - Oresund WRF Simulation Performance, first paragraph. As
noted by BOEM and EPA in the text, "The number of stations used likely results in some
smoothing of the statistics. Therefore, WRF performance across portions of the domain may be
less favorable than suggested by the domain-wide statistics."

We believe this provides support for the argument of separating over-water from over-land
stations in the analyses since AERCOARE is focused on over-water conditions. In the
METSTAT analyses,  how do over-water or coastline WRF analyses compare?

Response: We agree that it would be advantageous to evaluate land-based and sea-based
statistics independently.  However, the small number of available datasets limits our ability to do
this. The majority of the sites used are at coastal locations. Refer to the response to Comment
B.100.

Comment B.104:  Section 2.3.5 - Pismo WRF Simulation Performance, first paragraph, first
sentence. Need to reference a figure.

Response: A reference has been added.

Comment B.105:  Section 2.3.5 - Pismo WRF Simulation Performance, first paragraph, fourth
sentence. Describe wind speed performance  in terms of performance measures.

Response: Refer to the response to Comment B.91. Results that fall within the METSTAT benchmarks are considered satisfactory.

Comment B.106:  Section 2.3.5 - Pismo WRF Simulation Performance, first paragraph, second
to the last sentence. Discuss the overprediction of temperature numerically?

Response: Values added to the discussion.

Comment B.107:  Section 2.3.5 - Pismo WRF Simulation Performance, second paragraph. Are
any of these differences statistically significant?

Response: Statistical significance is not quantified during a METSTAT analysis currently. It
would be useful if a measure of statistical significance was calculated, but it is beyond the scope
of this work to develop or adopt methodology for a more robust statistical analysis of these
results. It is likely, given the small number of DS3505 sites used in the Pismo analysis, that the
reported values are not statistically significant at a standard confidence level of 95%.

Comment B.108:  Section 2.3.6 - Ventura WRF Simulation Performance, first paragraph.
These short one paragraph summaries are too similar, and look like they were written using
copy and paste and fill in the different numbers. Provide quantitative results and discussions of
significant differences to this and  previous one paragraph summaries.

Response: Refer to the response to Comment B.91. Results that fall within the METSTAT benchmarks are considered satisfactory. The purpose of the report was to briefly
summarize  modeling results, not robustly evaluate all aspects of model performance and
causes.

Comment B.109:  Section 2.3.6 - Ventura WRF Simulation Performance, first paragraph, first
sentence. Need to reference a figure.

Response: A  reference has been added.

Comment B.110:  Figure 14 - Cameron Wind Speed METSTAT Results. General comment -
For EPA applications, a specific model output will be required.  Rather than showing six slightly
different model results, why not show just the one that you recommend for use? Some of these
options have a > 3 m/s RMSE, which to me seems excessive.

Response: The purpose of this study was to evaluate WRF performance using model options
to help identify settings to recommend for regulatory modeling.

Comment B.111:  Figure 15 - Cameron Wind Direction METSTAT Results. Do the models
with better performance on wind direction correspond to those with better performance on WS?

Response: No, the cases with the least wind speed bias and error had the greatest wind
direction bias and error, overall.

Comment B.112:  Figure 16 - Cameron Temperature METSTAT Results. What  is your
hypothesis for the large bias and RMSE?

Response: Cameron winter and summer temperatures were biased low (-1 to -2°C cold bias)
under all WRF  simulations. The bias and error were within the complex terrain criteria, however.
At the location  of the tracer study meteorological measurements, WRF over-estimated
temperature by 1-4°C in the winter. The SST analysis over-predicted SST by 3-6°C for the
winter period and under-predicted SST by 1-2°C for the  summer period. We did not examine
historical observations of cloudiness or other features that could be useful for examining this
further. It is possible, given a 1.3 km resolution, the tight temperature, cloud, and humidity
gradients along the coast are not resolved in WRF. The simulation results represent an average
over a grid cell. Given the strong gradients of meteorological conditions at the coastal interface,
a degree of bias and error could be expected comparing model results to discrete point
measurements. Although the stations are located on the coast, they are land-based stations, so
the influence of inland meteorology could be expected to influence the stations a portion of the
time. The WRF grid cell contains some fraction of sea and land influence. The influence of
cooler ocean water on the WRF grid cell is likely the cause of the cold bias. A quick analysis of
wind direction reveals that onshore winds are dominant most of the time. The hypothesis would
tend to conflict with this observation.

Another hypothesis is that the WRF PBL schemes are suppressing vertical mixing. This would prevent mixing of warmer, drier air into the marine PBL, resulting in a cold bias. However, wind speed is biased high, which conflicts with this alternative hypothesis. Also, the MYJ runs (with local closure) were the better performers for temperature. Since local closure tends to suppress vertical mixing, it is unlikely that suppressed vertical mixing is the cause of the cold bias.

Discussion has been added to Section 2.3.2 to address this comment.

Comment B.113:  Figure 18 - Carpinteria Wind Direction  METSTAT Results. This is where it
would be good to know the result for the wind station nearest to the source which would have
been used for prior OCD runs.

Response: This is evaluated in Section 3.

Comment B.114:  Figure 21 - Oresund Wind Speed METSTAT Results. Where were these
stations? Can you focus on the most representative site?

Response: The stations have been included in Figure 3. The comparison to tracer study
meteorological measurements is included in Section 3.

Comment B.115:  Figure 21 - Oresund Wind Speed METSTAT Results. How many stations?

Response: Compared to eight stations: refer to response for Comment B.46.

Comment B.116:  Figure 21 - Oresund Wind Speed METSTAT Results. Can you focus on the most representative site?

Response: The comparison to tracer study meteorological measurements is included in
Section 3. The purpose of the METSTAT analysis is to evaluate WRF on a domain-wide scale.

Comment B.117:  Figure 24 - Oresund Humidity METSTAT Results. As stated earlier, this all
depends on the temperature.

Response: It has been noted that there is covariance between the variables analyzed and that
the report does not state any assumptions that the variables are independent from one another.

Comment B.118:  Figure 26 - Pismo Wind Direction METSTAT Results. Errors are pretty
large. The max RMSE is about 104 degrees and you are close to that. What about the station
that would be used for OCD modeling?

Response: The comparison to tracer study meteorological measurements is included in
Section 3. The  purpose of the METSTAT analysis is to evaluate WRF on a domain-wide scale,
but there is an insufficient set of surface data for the period and location of the Pismo study.

Comment B.119:  Figure 27 - Pismo Temperature METSTAT Results.  Looks like the mean
bias is included in the error, which leads to the linear relation seen here.

Response: Noted.

Comment B.120:  Figure 30 - Ventura Wind Direction METSTAT Results.  What was wind
speed?  Must have been fairly low to give such a large WD error.

Response: Mean wind  speed during September at the six Ventura stations was 2.8 m/s with a
standard deviation of 2.1 m/s and skew of 0.75 (median was 2.5 m/s). Therefore,  wind speed
was not relatively low.

Comment B.121: Section 2.3.7 -WRF Performance Evaluation Conclusions. This section  is
rather non-quantitative and vague.  By inserting the arbitrary definitions of good performance
measures, you  have removed the possibility for an interested reader to "see for himself".  I
suggest inserting quantitative performance measures in the text.  Also, I don't see why we have
to look at the results for six model options. Can one model be  picked for focus?

Response: The purpose of this study was to evaluate WRF performance using model options
to help identify settings to recommend for regulatory modeling (refer to the response to Comment B.110). Additional discussion was added.

Comment B.122: Section 2.3.7 -WRF Performance Evaluation Conclusions, second
paragraph. As noted by BOEM and EPA in the text, "A PBL scheme preference cannot be
given based on these results.  Performance of UW-PBL simulations was slightly better for the
Cameron cases and the MYJ scheme was slightly better for the Carpinteria cases." This is
problematic for a regulatory application. It would be valuable to permit applicants if the
regulatory agency selected/approved a scheme, based on their review of the valuable datasets,
and established a consistent dataset for applicant use.

Response: The conclusion section has been expanded. Table 11 includes a summary
comparison of all cases. A recommendation to use the ERA.UW configuration has been added.

Comment B.123: Section 3 - Development of Meteorological Inputs for AERMOD, first
paragraph, first sentence.  Insert a transition paragraph between this new AERMOD  section and
the previous WRF section.

Response: Paragraph added to the end of Section 2 for transition.

Comment B.124: Table 12 - Meteorological Fields in the AERMOD Meteorology Input Files.
For "Albedo", Deposition? Not dispersion? By inference, the albedo is "used by AERMOD"
since it is a major parameter in AERMET in estimating sensible heat flux.

Response: Sentence was confusing as written. It has been corrected to address this concern.

Comment B.125: Table 12 - Meteorological Fields in the AERMOD Meteorology Input Files,
Wind Speed. The average (over what time period (e.g. 1-hour)) scalar wind speed at a specified
measurement height.

Response: The first paragraph of Section 3.1 states that all the variables are 1-hour averages.
The variable description has been edited to state that  it is a 1-hour average.

Comment B.126: Table 12 - Meteorological Fields in the AERMOD Meteorology Input Files,
Wind Direction. The average wind direction over time period # at a specified measurement
height.

Response: The first paragraph of Section 3.1 states that all the variables are 1-hour averages.
The variable description has been edited to state that  it is a 1-hour average.

Comment B.127: Section 3.1 -AERMOD Input Meteorology Files, fourth paragraph, last
sentence.  This also was a major point in the Hanna et al. OCD papers and in the Hanna et al.
paper on his adaptation of the COARE formula to overwater applications (2010).

Response: Noted,  but a reference was not added. Refer to the response to Comment A.22.

Comment B.128: Section 3.2 - Buoy Meteorology Processing with AERCOARE, first
paragraph, second sentence. What about the recommendations in the Hanna et al. papers on
evaluations of OCD with these field data sets?

Response: Refer to the response to Comment A.22.

Comment B.129: Section 3.2 - Buoy Meteorology Processing with AERCOARE, first
paragraph, third sentence.  Is this what is giving the questionable PBL heights in previous
sections? This method should have been discussed back in those sections.

Response: The PBL heights reported in the tables for each of the tracer studies in Section 2 were the "measured" values taken from the tracer study databases. No modeled PBL heights
have been discussed before Section 3.2. The Venkatram algorithm is used in AERMET to
estimate mechanical mixing height.

Comment B.130: Section 3.2 - Buoy Meteorology Processing with AERCOARE, first
paragraph, fourth sentence.  Sea surface temperature? Why should it be assumed to apply at a
height of 0.5 m? That is not the "sea surface".

Response: The SST depth was set to 0.5 m below the surface. This is a standard measurement depth for SST.

Comment B.131: Section 3.2. - Buoy Meteorology Processing with AERCOARE, after second
paragraph, first closed bullet. Justification for the use of the Venkatram equation.  I think that
this is the cause of many of the variations in predicted concentrations.

Response: The Venkatram method is adopted into AERCOARE. This is the same method that
is used in AERMOD.

Comment B.132: Section 3.3 - WRF Data Extraction Methods, after first paragraph, the three
MMIF processing methods. Why were these three chosen? They do  not acknowledge  previous
(past 30 years) evaluation studies with these datasets.  I would like to also see a default simple
met input file as used in OCD.

Response: These are the three methods used to produce SFC  and PFL meteorology  files for
AERMOD using the MMIF program.  An extraction for OCD meteorology is not available in the
MMIF program.

Comment B.133: Section 3.3 -WRF Data Extraction Methods, second paragraph, third
sentence. Which? Not "all".  Instead it is files similar to what AERMET would produce.

Response: Additional discussion added to address L and w*.

Comment B.134: Section 3.3 -WRF Data Extraction Methods, second paragraph, fourth
sentence.  Why are we switching to an Ri method?  AERMET can calculate these variables
without Ri and the Louis method.

Response: Under MMIF extraction case C described in Section 3.3,  MMIF provides the SFC
and PFL files for AERMOD directly and therefore no AERMET processing is used. Therefore, L
and w* must be calculated. MMIF uses the Louis method to do so.
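
For orientation, both quantities have standard definitions in terms of the surface fluxes. The Python sketch below shows only those textbook definitions; it is not a description of the Louis scheme or of MMIF's implementation, and the constants and example values are assumptions.

G = 9.81        # gravitational acceleration, m/s^2
KAPPA = 0.4     # von Karman constant
RHO = 1.2       # air density, kg/m^3 (illustrative)
CP = 1004.0     # specific heat of air, J/(kg K)

def obukhov_length(u_star, theta_v, sensible_heat_flux):
    """Monin-Obukhov length L (m) from friction velocity (m/s), virtual potential
    temperature (K), and surface sensible heat flux (W/m^2)."""
    w_theta = sensible_heat_flux / (RHO * CP)        # kinematic heat flux, K m/s
    return -u_star**3 * theta_v / (KAPPA * G * w_theta)

def convective_velocity(theta_v, sensible_heat_flux, zi):
    """Convective velocity scale w* (m/s); defined only for upward heat flux."""
    w_theta = sensible_heat_flux / (RHO * CP)
    return 0.0 if w_theta <= 0.0 else (G * zi * w_theta / theta_v) ** (1.0 / 3.0)

# Example with illustrative over-water values
print(round(obukhov_length(u_star=0.25, theta_v=288.0, sensible_heat_flux=30.0), 1))   # ~ -46 m
print(round(convective_velocity(theta_v=288.0, sensible_heat_flux=30.0, zi=500.0), 2)) # ~ 0.75 m/s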

Comment B.135: Section 3.3 -WRF Data Extraction Methods, second paragraph, fifth
sentence. But it is easily calculated from the sensible heat flux and the PBL height.

Response: Refer to the response to Comment B.134.

Comment B.136: Section 3.3 -WRF Data Extraction Methods, third paragraph, first sentence.
What schemes? Why use more than one?

Response: Refer to the response to Comment B.73. A purpose of this study was to explore
WRF performance under different PBL schemes.

Comment B.137: Section 3.3 -WRF Data Extraction Methods, third paragraph, third sentence.
As said above, why is Ri now becoming involved?

Response: Refer to the response to Comment B.134. MMIF can estimate PBL height using a Ri scheme. This method is explored in this study.

Comment B.138:  Section 3.3 -WRF Data Extraction Methods, third paragraph, fourth
sentence.  Any dispersion model is sensitive to PBL height, especially at low values. There is a
long record of publications on this fact, well before it was reported in 2012 by Richmond and
Morris.

Response: Noted, but the reference cited is directly related to this work.

Comment B.139:  Section 3.3 -WRF Data Extraction Methods, after fourth paragraph, Number
2.  Justify statement.

Response: The four extraction methods tested in this study were identified as the most
practical ways to directly use WRF data for AERMOD modeling. As discussed above, each PBL scheme used in WRF calculates PBL height differently. Extraction option #2 (PBL height calculation from MMIF) provides a uniform method for estimating PBL height regardless of the PBL scheme used. The MMIF PBL height estimation scheme of Vogelezang and Holtslag (1996) takes the vertical structure of the atmosphere into account, unlike the Venkatram (1980) method used in AERCOARE and AERMET.
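
For context, the Venkatram (1980) relationship referenced here estimates the mechanical mixing height from the friction velocity alone (this is the commonly cited form used in AERMET; it is given here for orientation only):

z_{im} = 2300 \, u_*^{3/2}

with z_{im} in meters and u_* in m/s. It therefore carries no information about the overlying vertical structure, whereas the Vogelezang and Holtslag (1996) approach scans upward through the profile for the height at which a bulk Richardson number first exceeds a critical value.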

Comment B.140:  Section 3.3 -WRF Data Extraction Methods, after fourth paragraph, Number
3.  I don't understand why you don't just use the surface energy fluxes and z0 etc directly
predicted by WRF, instead or recalculating them.

Response: The WRF parameters are used directly under extraction schemes #1 and #2.
Extraction schemes #3 and #4 use AERCOARE calculations. A purpose of this study is to
examine results under the different options. It was hypothesized in the report that perhaps
AERCOARE reprocessing improves meteorology for AERMOD modeling since it contains
specific algorithms for estimating marine boundary layer parameters.

Comment B.141:  Section 3.3 - WRF Data Extraction Methods, after fourth paragraph,  Number
3.  The reference is not a peer-reviewed journal article.  Instead it is a project report. A peer-
reviewed reference should be used to justify this.

Response: The report referenced is a related work that recommends AERCOARE settings. It
is a published EPA report that is relevant to this paragraph.

Comment B.142:  Table 13 -WRF Meteorology Extraction Options. I would like to see a single
well-justified method used here.  Instead, there are a few options picked out of hundreds that
could  be chosen.

Response: The methods tested have been identified as four of the most practical ways to
provide meteorology for AERMOD from WRF output. These were the methods identified by the
EPA for testing in this study.

Comment B.143:  Section 3.3 -WRF Data Extraction Methods, sixth paragraph, second
sentence.  What if the grid cell is mostly  over land but the  desired met information is for the
overwater part of the grid cell?

Response: This is addressed with new commentary added to this section. The modeler must
confirm that the grid cell land-use is specified as over water. It is recognized that mixed
land/water area within the cell may be a source of error.

Comment B.144: Section 3.4 - Evaluation of Extracted Meteorology from WRF.  It would seem
that the evaluation results would also depend on the "extraction method".

Response: A purpose of this study is to evaluate results under the different extraction methods
as specified in the introduction.

Comment B.145: Section 3.4 - Evaluation of Extracted Meteorology from WRF, first
paragraph, first sentence. What is this?  Extraction method? Model option?

Response: Additional information  added to state what the WRF scenarios are - the sentence
has been edited to note that it refers to the six configurations (reanalysis scheme and PBL
scheme).

Comment B.146: Section 3.4 - Evaluation of Extracted Meteorology from WRF, second
paragraph, third sentence. You lost me.  Try using standard terminology

Response: Corrected.

Comment B.147: Section 3.4.1 - Cameron, first paragraph second sentence. Reference not
identified.

Response: Wording changed to clarify meaning.

Comment B.148: Section 3.4.1.1 -Wind Speed, first paragraph, first sentence.  Please use
quantitative numbers.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.149: Section 3.4.1.2 - Air Temperature, first paragraph, first sentence.  All of the
WRF simulations overpredict air temperature (by about # C on average and with a range of # to
#) for most of the February period.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.150: Section 3.4.1.2 - Air Temperature, first paragraph, second sentence.
Temperature predictions  during the July periods were closer (within # C) to the measurements
as displayed in Figure 34.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.
                                          40

-------
Comment B.151:  Section 3.4.1.2 - Air Temperature, first paragraph, third sentence. How
similar were WRF predicted temperatures?

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.152:  Section 3.4.1.2 - Air Temperature, third paragraph, first sentence. A review
of the surface weather measurements in the domain reveals the buoy measurements and YSU
temperatures  correspond closely (within # C) to regional temperatures at the beginning of the
period.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.153:  Section 3.4.1.2 - Air Temperature, fourth paragraph, first sentence. Refer
back to the METSTAT criteria table.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.154:  Section 3.4.1.2 - Air Temperature, fourth paragraph, first sentence. Are the
poor performance of YSU cases statistically significant?

Response: Refer to the response to Comment B.91. Measures of statistical significance were
not conducted in this study for the  meteorological comparisons. It is recognized that the
comparisons would be enhanced with a more thorough statistical evaluation.

Comment B.155:  Section 3.4.1.3 - Sea  Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, first paragraph, first sentence. SST varies in space and time and  is not
static.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.156:  Section 3.4.1.3 - Sea  Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, first paragraph, second sentence.  ...but in these simulations it appears that
the ERA and NARR datasets have very similar values (within # C) during both the summer and
winter periods.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.157:  Section 3.4.1.3 - Sea  Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, first paragraph, third sentence.  But you just said that "SST is not a variable
estimated by WRF".

Response: Sentence edited to indicate that these are the reanalysis SST values.

                                          41

-------
Comment B.158:  Section 3.4.1.3 - Sea Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, second paragraph, third sentence. Just say "As a result, L is positive".

Response: Sentence edited.

Comment B.159:  Section 3.4.1.3 - Sea Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, second paragraph, fourth sentence. The authors need to reconcile these
statements with their previous statement 12 lines above.

Response: Emphasis was added earlier to state that the SST values are from the reanalysis
dataset instead of WRF.

Comment B.160:  Section 3.4.1.3 - Sea Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, second paragraph, last sentence. Explain what "erroneous" means as used
in this context.

Response: Explanation added to the report: "YSU air-sea temperature difference is negative,
resulting in heat flux from the sea and unstable atmospheric conditions. Measured air-sea
temperature differences were positive."
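
The sign convention described in this explanation can be summarized with a small illustrative sketch (the temperatures are hypothetical):

    def stability_from_air_sea_difference(t_air_c, t_sea_c):
        """Classify stability from the sign of the air-sea temperature difference."""
        dt = t_air_c - t_sea_c                       # air minus sea, deg C
        if dt < 0:
            return "unstable (upward sensible heat flux, L < 0)"
        if dt > 0:
            return "stable (downward sensible heat flux, L > 0)"
        return "near neutral (|L| large)"

    print(stability_from_air_sea_difference(4.0, 6.0))    # air cooler than the sea
    print(stability_from_air_sea_difference(12.0, 8.0))   # air warmer than the sea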

Comment B.161:  Section 3.4.1.3 - Sea Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, third paragraph, second sentence. Does not make sense.  Everything is tied
together. This comment applies to the next few lines too.

Response: This statement is emphasizing that the SST cold  bias results in an erroneously
neutral air-sea temperature difference. The measurements support a negative air-sea
temperature difference resulting in a different atmospheric stability regime.

Comment B.162:  Section 3.4.1.3 - Sea Surface Temperature, Air-Sea Temperature, Monin-
Obukhov Lengths, third paragraph, fourth sentence. I don't see how this  results in the
conclusion in the last part of the sentence.

Response: The MYJ cases result in slightly cooler air, resulting in a more negative air-sea
temperature difference. The greater difference results in more unstable conditions. The tracer
study measurements support highly unstable conditions during this period. Most WRF runs
supported near-neutral atmospheric conditions. The MYJ runs produce the more correct results
because the sign of the air-sea temperature difference matches the sign indicated by the
measurements.

Comment B.163:  Section 3.4.1.4 - PEL Heights, first paragraph. These six lines raise more
questions than they answer.  Please revise.

Response: This section has been edited.
                                         42

-------
Comment B.164:  Section 3.4.1.4 - PEL Heights, first paragraph, second sentence.  If
something is "measured", isn't it better? Or maybe it is not really "measured" but is
parameterized based on measurements of something else. Thus needs to be clarified.

Response: Wording has been changed to emphasize that the tracer experiment PEL heights
are values from the experiment databases rather than "measurements."

Comment B.165:  Section 3.4.1.4 - PEL Heights, first paragraph, fourth sentence. This sounds
very peculiar. The WRF model predicts something, and then MMIF recalculates it?

Response: This section has been edited.

Comment B.166:  Section 3.4.1.5 - Relative Humidity.  These five lines need to be revised.
They look like "a few random thoughts about RH".

Response: This section has been edited.

Comment B.167:  Figure 33 - Cameron Wind Speed Time Series: Winter Releases (top) and
Summer Releases (bottom).  Can the observation be shown more clearly?  Now it is often hard
to identify. Maybe use a big solid black symbol?

Response: Measurements are now highlighted as bright blue circles. The size and brightness
of the symbol have been altered in the plots to increase visibility.

Comment B.168:  Figure 33 - Cameron Wind Speed Time Series: Winter Releases (top) and
Summer Releases (bottom).  Explain which site. See my previous comments  asking for this.
Mark the site on the map given earlier.

Response: The measurement locations have been identified in Section 2.0 to address
previous comments. In some cases this involved addition of symbols on Section 2.0 plots to
identify the location.

Comment B.169:  Figure 40 - Cameron Relative Humidity Time Series:  Winter Releases (top)
and Summer Releases (bottom). I have the same comments on all of these figures. First, show
or say somewhere about the exact location of the site.  Second make the observed point more
visible.  And third, perhaps say or show where the WRF grid point is located that is used for
comparison.

Response: Refer to the responses to Comment B.167 and B.168.

Comment B.170:  Section 3.4.2 - Carpinteria.  Commenter has the same types of comments
for Carpinteria as for Cameron. Please make similar corrections here, plus  others indicated.

Response: Refer to the responses to Comment B.167 and B.168.

Comment B.171:  Section 3.4.2.1 -Wind Speed, first paragraph, second sentence. Which site
exactly.

                                         43

-------
Response: Refer to the responses to Comment B.167 and B.168.

Comment B.172:  Section 3.4.2.1 -Wind Speed, first paragraph, third sentence. Show
numerical values for underprediction and overprediction.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.173:  Section 3.4.2.1 -Wind Speed, first paragraph, fifth sentence.  Discuss the
statistical significance of the MYJ based simulations in lieu of using the word "best".

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.174:  Section 3.4.2.1 -Wind Speed, second paragraph, first sentence.  Explain
acronyms and avoid jargon.  Give number instead of "much higher".

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.175:  Section 3.4.2.1 -Wind Speed, second paragraph, first sentence.  It states
"...result in more unstable conditions increasing the wind speed through vertical mixing." These
are big leaps in logic. Actually everything is inter-related.

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion. The assumption discussed in
this sentence has been removed.

Comment B.176:  Section 3.4.2.1 -Wind Speed, second paragraph, last sentence.  What does
this statement mean exactly?

Response: This section has been rewritten and the last sentence was removed.

Comment B.177:  Section 3.4.2.2 - Air Temperature, first paragraph, first sentence. Show a
numerical value for "...nearly the same temperature over all of the release periods."

Response: Section has been edited to add more quantitative discussion. References to bias
and error values now posted in Table 14 add to the discussion.

Comment B.178:  Section 3.4.2.4 - PEL Heights, first paragraph, first sentence. Explain
"measurement" as used in this context.

Response: Reference to "measurement" has been removed - the statement now states that
these are PEL height values from the experiment database.
                                         44

-------
Comment B.179:  Section 3.4.2.4 - PEL Heights, first paragraph, second to the last sentence.
It is unlikely that "The MYJ and UW-PBL simulations generally predict PEL height values below
30m..."

Response: The reviewer has recommended a higher minimum PBL height in previous
comments. The EPA recommended using a minimum PBL height of 25 m for this study. It is
recognized that AERMET itself can produce minimum PBL heights of 1 m. RAMBOLL
ENVIRON would recommend further study to identify the most physically realistic minimum PBL
heights observed in nature. A survey of the literature could suffice.
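
For reference, the 25 m minimum used in this study amounts to a simple floor applied to the hourly mixing heights before they are passed to AERMOD. A minimal sketch follows; the hourly record layout is hypothetical.

    def apply_mixing_height_floor(hourly_records, floor_m=25.0):
        """Raise any mixing height below floor_m up to floor_m (25 m in this study)."""
        for rec in hourly_records:
            if rec["mixing_height_m"] < floor_m:
                rec["mixing_height_m"] = floor_m
        return hourly_records

    hours = [{"mixing_height_m": 8.0}, {"mixing_height_m": 140.0}]
    print(apply_mixing_height_floor(hours))    # first record raised to 25.0 m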

Comment B.180:  Section 3.4.2.4 - PBL Heights, second paragraph, first sentence.  The "re-
calculated PBL heights values" is a very arbitrary method.

Response: This discussion has been edited. Earlier discussion in the report regarding MMIF
PBL height calculation has been augmented.

Comment B.181:  Section 3.4.2.5 - Relative Humidity, first paragraph.  Paragraph is much too
brief.

Response: The discussion has been extended. Quantitative comparisons are included,  using
the values posted in the new Table 14.

Comment B.182:  Section 3.4.2.6 -Wind Direction, first paragraph, first sentence. Which
location exactly for the obs and for the WRF grid center?

Response: Discussion regarding the location of the experiment meteorological measurements
has been added to Section 2.0. The WRF grid was not added to the plots: the high resolution of
the WRF grid would place the grid cell very near the measurement location, virtually on top of
the measurement location in the plots.

Comment B.183:  Section 3.4.2.6 -Wind Direction, first paragraph, fourth sentence. Did you
do statistical tests with 95 % confidence limits?

Response: No statistical tests were conducted. Refer to the response to Comment B.154.

Comment B.184:  Figure 42 - Carpinteria Air Temperature Time Series.  Same comments on
these figures as for Cameron figures.

Response: Refer to the response to Comment B.168.

Comment B.185:  Section 3.4.3 - Oresund.  I have the same types of comments for Oresund
as I had for Cameron and Carpinteria in the sections above. Please make similar corrections
here, plus others I indicate.

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.
                                         45

-------
Comment B.186:  Section 3.4.3.1 -Wind Speed, first paragraph first sentence.  State exactly
where the observation took place and what heights...etc.

Response: Discussion regarding the location of the experiment meteorological measurements
has been added to Section 2.0.

Comment B.187:  Section 3.4.3.1 -Wind Speed, first paragraph second sentence. What is
basis and criteria for stating that the magnitude of the surface wind speed over the Oresund
Strait was "...was generally predicted by all WRF simulations except..."

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.

Comment B.188:  Section 3.4.3.1 -Wind Speed, first paragraph third sentence.  Quantify the
word "significantly".

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.

Comment B.189:  Section 3.4.3.2 - Air Temperature, first paragraph, first sentence.  State how
much WRF under predicts air temperature.

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.

Comment B.190:  Section 3.4.3.3 - Sea  Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, first paragraph, first sentence.  This sentence is an
opinion and needs scientific support.

Response: This sentence was edited to  remove the subjective statement.

Comment B.191:  Section 3.4.3.3 - Sea  Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, second paragraph, third sentence.  This sentence
needs more explanation because this shouldn't happen.

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.

Comment B.192:  Section 3.4.3.3 - Sea  Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, second paragraph, fourth sentence. Figure needs to
be provided.

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.
                                         46

-------
Comment B.193:  Section 3.4.3.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, second paragraph, fifth sentence. Quantitatively
discuss 1/L sign and general magnitude.

Response: The discussion has been extended to include a range of values.

Comment B.194:  Section 3.4.3.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, second paragraph, second to last sentence. Quantify
"too statistically neutral" as used in this context.

Response: The discussion has been extended to include a range of values.

Comment B.195:  Section 3.4.3.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, second paragraph, last sentence. Relate "less
favorably" to a value and if significance.

Response: The discussion has been edited to include more quantitative comparisons.

Comment B.196:  Section 3.4.3.4 - PEL Heights, first paragraph, first sentence.  How do you
know what is "best" since the "observed" comes from a different sort of model?

Response: We are assuming that the PEL heights included in the tracer study database are
the correct values and have acknowledged that they may be based on  parameterization
schemes and possibly quite erroneous. The report includes further discussion in the introduction
to note these assumptions.

Comment B.197:  Section 3.4.3.4 - PEL Heights, first paragraph, third sentence. "MMIF
recalculation" is very fishy and sounds like tuning.

Response: Refer to the response to Comment A.19. An aspect of this study was to evaluate
the performance using MMIF recalculated  PEL heights.

Comment B.198:  Section 3.4.3.4 - PEL Heights, second paragraph. Sounds like this was a
failure. Give more explanation.  Is this likely to occur elsewhere?

Response: Either WRF predicted excessively low PEL heights that were unrepresentative of the
conditions, or the PEL height reported in the study database was excessively high and
unrepresentative. Given the stable (L > 0) values and high wind speeds during this period, it is
unknown which values make the most sense. Additional discussion has been added to the report.

Comment B.199:  Section 3.4.3.5 - Relative Humidity, first paragraph. Too brief and informal
and non-quantitative.

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.
                                         47

-------
Comment B.200:  Section 3.4.3.6 -Wind Direction, first paragraph, fourth sentence.  Discuss
the significance of "best" as used in this context.

Response: The discussion has been extended.

Comment B.201:  Figure 51 - Oresund Wind Speed Time Series.  Same comments on these
figures as for previous sites discussed.

Response: Refer to the response to Comment B.168.

Comment B.202:  Section 3.4.4 - Pismo. I  have the same types of comments for Pismo Beach
as I had for Cameron, Carpinteria and Oresund in the sections above.  Please make similar
corrections here, plus others I  indicate.

Response: Refer to the response to Comment B.168. The discussion has been extended.
Quantitative comparisons are included, using the values posted in the new Table 14.

Comment B.203:  Section 3.4.4 - Pismo, first paragraph, first sentence. Where exactly is the
measurement taken and the center of the WRF grid?

Response: Refer to the response to Comment B.168.

Comment B.204:  Section 3.4.4.1 -Wind Speed, first paragraph, fourth sentence.  But is the
relative difference (divided by mean wind) any different?

Response: This sentence in the report is simply informing the reader of the high wind speed
measured  during the period.

Comment B.205:  Section 3.4.4.1 -Wind Speed, first paragraph, fifth sentence. Did you do
95% confidence tests?

Response: Refer to the response to Comment B.154.

Comment B.206:  Section 3.4.4.1 -Wind Speed, first paragraph, sixth sentence. Quantify
"quite well."

Response: The discussion has been extended. Quantitative comparisons are included, using
the values posted in the new Table 14.

Comment B.207:  Section 3.4.4.1 -Wind Speed, first paragraph, sixth sentence.  Discuss the
meaning of "match the magnitude".

Response: The discussion has been extended.

Comment B.208:  Section 3.4.4.1 -Wind Speed, first paragraph, last sentence. Quantify
"overpredict."
                                         48

-------
Response: The discussion has been extended. Quantitative comparisons are included.

Comment B.209:  Section 3.4.4.2 - Air Temperature, first paragraph, first sentence.  What
would you consider "large"? (Hanna 20150205)

Response: The discussion has been extended. Quantitative comparisons are included.

Comment B.210:  Section 3.4.4.2 - Air Temperature, first paragraph, second sentence.
Significance test?

Response: Refer to the response to Comment B.154.

Comment B.211:  Section 3.4.4.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, first paragraph, first sentence.  Remind the reader of
the definition of NARR and ERA.

Response: These acronyms are listed in the Abbreviation and Acronyms page and introduced
early in the report.

Comment B.212:  Section 3.4.4.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, first paragraph, second sentence. Which
measurements and where?

Response: Refer to the response to Comment B.168.

Comment B.213:  Section 3.4.4.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, first paragraph, third sentence. Quantify "highly".

Response: A value has been added to the report.

Comment B.214:  Section 3.4.4.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, Second and third paragraphs.  I still don't see how this
happens.  The  model should be predicting a reasonable surface sensible and latent heat flux
which will guarantee the air-sea temperature difference to be reasonable.

Response: The SST is not dynamic; it is provided by the reanalysis dataset and is therefore
fixed at any given point and time.

Comment B.215:  Section 3.4.4.3 - Sea Surface Temperature, Air-Sea Temperature
Difference, and Monin-Obukhov Lengths, Second and third paragraphs.  It should be noted that
all of these tracer experiments were conducted only if the wind direction was expected to bring
the plume towards the coast. Thus offshore winds never occur.  This biases the stability
patterns, since an onshore wind is more likely to be associated with an equilibrium boundary layer.
My comment was based on the fact that, with a long fetch over water, the boundary layer
thermal stability is going to be nearly neutral, with air and sea temperature difference close to
zero, and relatively small magnitude sensible heat flux.  (However,  the latent heat flux may be
much larger than the sensible  heat flux). In contrast, with a short (< 5 or 10 km) fetch over

                                         49

-------
water, such as happened in several of the 1998-2005 analyses reported in my ten year old
GOM paper (I was a sub to STI who had the prime contract from MMS), the air is likely to have
originated over land and the air and sea temperature differences can be relatively large one way
or the other.  My paper discusses this and presents some examples. For example, after a cold
front passes in November, there can be cold air (30 F) with NW winds blowing offshore, while
the GOM is still warm with sea level temperatures of 65 F. This can cause strong instability for
many km offshore.  I see this in Maine and it is sometimes evidenced by mirages, caused  by the
same phenomenon as over a desert in July.  After several 10s of kms, the air may adjust to the
underlying warm surface.  However, nearer the shore we observed extreme sensible and latent
heat fluxes during those occasional periods.  The opposite can happen with offshore winds in
May, when there could be 80  F temperatures onshore and 55 F water temperatures, leading to
extreme thermal stability.  This was also observed in the GOM buoy and offshore platform data
that we analyzed.  In these cases, you can sometimes have the peculiar situation with
downward sensible heat fluxes and upward latent heat fluxes.  Does your version of COARE
and your L estimation method use the total buoyant energy flux (sensible and latent), which is
the correct way to do it?

The above phenomena with short fetches can be observed at all places in the world, even the
arctic.

In the Cameron, Carpinteria, Pismo Beach, and Ventura field experiments that you analyzed
(and I analyzed in evaluating OCD 20 or 30 years ago), the tracer  releases took place only if
there was confidence that the resulting plume would "hit" the monitors, which were nearly all on
the shoreline and slightly inland. Thus there was by necessity a persistent onshore wind,
implying that the air had passed over  a long fetch of open water, and thus the overwater
boundary layer was in equilibrium and relatively deep. Also it follows that the mixing heights are
not small.

Response: The COARE algorithm explicitly includes similarity profiles of both moisture and
temperature (i.e., not virtual temperature), so sensible heat fluxes can be downward while
moisture evaporating from the surface causes an upward latent heat flux. The definition of the
Monin-Obukhov length includes  both the sensible and latent heat contributions to overall
buoyancy. Given  light winds and downward sensible heat flux, marine PEL height can be quite
low. Both WRF and the measurements indicate PEL conditions that are not in equilibrium.  Air
temperatures warmer than the SST can result in surface temperature inversions, leading to low
PEL height, as interpreted for AERMOD modeling.
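
The buoyancy-flux form of the Obukhov length referred to above can be illustrated with the sketch below: a downward sensible heat flux combined with a sufficiently large upward latent heat flux can still give an upward buoyancy flux and L < 0. The constants and example fluxes are illustrative assumptions, not values from this study.

    def obukhov_length(u_star, t_kelvin, sensible_wm2, latent_wm2,
                       rho=1.2, cp=1004.0, lv=2.5e6, k=0.4, g=9.81):
        """Obukhov length from surface fluxes, using the virtual (buoyancy) heat flux."""
        w_theta = sensible_wm2 / (rho * cp)              # kinematic sensible heat flux
        w_q = latent_wm2 / (rho * lv)                    # kinematic moisture flux
        w_theta_v = w_theta + 0.61 * t_kelvin * w_q      # kinematic buoyancy flux
        return -u_star**3 * t_kelvin / (k * g * w_theta_v)

    # Downward sensible heat flux (-5 W/m2) with a larger upward latent heat flux
    # (+120 W/m2): the combined buoyancy flux is upward, so L is negative.
    print(round(obukhov_length(0.25, 288.0, -5.0, 120.0), 1))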

Comment B.216: Section 3.4.4.4 - PEL Heights, first paragraph,  first sentence. Why are you
surprised? Please note that your PEL "measurements" are not actually measured but are
parameterized. This needs to be brought to the forefront.

Response:  Discussion regarding the experiment PEL heights was added to the introduction.
The subjective term "surprisingly" has been removed because it is not appropriate.

Comment B.217: Section 3.4.4.4 - PEL Heights, first paragraph,  third sentence.  Explain the
scientific rationale for why this is happening in WRF.
                                         50

-------
Response: The most blatant cases of discrepancy are the Pismo June 22, 1982 releases
where the 1/L values are negative, but very near to zero, implying near-neutral atmospheric
stability conditions. Examining the NARR.UW and NARR.MYJ cases further, wind speeds are
light, in the 2 to 4 m/s range. PEL heights are low, at the minimum height. With low wind speeds
and low surface roughness, PEL height estimates might be expected to be low. However, we
would expect minimum PEL heights would occur only during the most stable periods, not near-
neutral periods. Further examination of the numerics of WRF would be required to fully
understand this phenomenon. The MMIF recalculation provides PEL heights that are more
intuitive and closer to the values reported in the tracer study database.

Comment B.218:  Section 3.4.4.4 - PEL Heights, second paragraph.  This uncertainty is why,
in OCD, I suggested just using a default mixing height of about 500 m. This removes any
possibility of strange  concentration predictions due to uncertain low mixing heights.

Response:  Refer to the response to comment B.179.

Comment B.219:  Section 3.4.4.4 - PEL Heights, third paragraph, second sentence. Explain
the use of "best" in this context.

Response: The discussion has been extended. Quantitative comparisons are included,  using
the values posted in the new Table 14.

Comment B.220:  Section 3.4.4.5 - Relative Humidity, first paragraph. Too terse and vague.
Expand.

Response: The discussion has been extended. Quantitative comparisons are included,  using
the values posted in the new Table 14.

Comment B.221:  Figure 59 - Pismo Wind Speed Time Series: Winter (top) and Summer
(bottom) Releases. Same comments as for previous field site figures.

Response:  Refer to the response to Comment B.168.

Comment B.222:  Figure 60 - Pismo Air Temperature Time Series: Winter (top) and Summer
(bottom) Releases. Please modify the C scale (say from 11  C to 17 C) so that the points
occupy more of the figure. Same comment for other figures where points are clustered together
in a small area on the figure.

Response:  Plots edited.

Comment B.223:  Figure 63 - Pismo Inverse of L Time Series: Winter (top) and Summer
(bottom) Releases. A log scale should be used for 1/L, to better show the details near 0.

Response: The comment is noted.
                                         51

-------
Comment B.224:  Section 3.4.5 -Ventura.  I have the same types of comments for Ventura as I
had for Cameron, Carpinteria, Oresund and Pismo beach in the sections above. Please make
similar corrections here, plus others I indicate.  (Hanna 20150205)

Response: Refer to the response to Comment B.168.

Comment B.225:  Section 3.5. Where is Section 3.5?

Response: Section 3.6 changed to Section 3.5.

Comment B.226:  Section 3.6 - Discussion, first paragraph, first sentence.  It is hard for the
reader to know exactly what is meant. Replace the vague generalization with specific evidence.
I do not understand what is being compared to what.

Response: Discussion edited. New introductory sentence added.

Comment B.227:  Section 3.6 - Discussion, after first paragraph, first closed bullet. You lost
me. I  thought that METSTAT was the software. What is meant by "METSTAT and buoy
analysis results"?

Response: Discussion edited to clarify meanings. "Buoy location" has been changed to
"tracer experiment meteorology measurement location" in many places.

Comment B.228:  Section 3.6 - Discussion, second paragraph. I  refrain from making further
comments on this section until these terms are better defined.  I thought that METSTAT was
used to evaluate the predictions?

Response: The paragraph is comparing the trends in bias and error between the regional
analysis (METSTAT) and the local tracer study measurement site analysis.

Comment B.229:  Section 3.6 - Discussion, fourth paragraph. These conclusions should be
quantitative rather than vague general statements without detailed support.

Response: Discussion has been edited substantially. Quantitative comparisons added or
referred to from earlier discussion.

Comment B.230:  Section 3.6 - Discussion, fifth paragraph.  The parameter that WRF has the
hardest time estimating seems to be the PEL height. In most cases, the recalculation approach
seems to cause the various scenario values to converge more.  For example, the Task 2 report
states: "It is possible the "measured" values of PEL height in the studies are misleading and not
based on robust measurements.  Often the  estimates were obtained from methods that
conflicted with each other."  Thus, it seems that future studies need to focus on  an accurate
definition of the PEL height and on accurate measurements of the parameters used to
characterize it.

Response: We agree, further study is warranted for PEL height estimation. The MMIF
recalculation option offers the most practical method at this time to unify PEL height predictions.

                                         52

-------
Comment B.231: Section 4.3.3 - Oresund, fifth paragraph.  As noted by BOEM and EPA in the
text, "Given the transport distances of this case (20-30 km), this study is the least suitable case
for the evaluation of AERMOD between the tracer studies used in this study. AERMOD, being a
"straight-line" Gaussian model, does not account for the heterogeneity of meteorological
conditions between the source and the receptor at the scales of time and space involved with
long-range transport."

In the current regulatory context, the range of 20-30 km is not considered long-range transport.
AERMOD is routinely utilized at distances out to 50 km.  Thus, this statement is inconsistent
with regulatory practice.  However, it is true that when transport occurs over water-land
boundaries,  PEL conditions can change  dramatically over short distances.  Thus, for this study,
only marine-influenced receptors should be considered or at  least evaluated separately.

Response:  This discussion has been edited in the report to  take into account the commenter's
concerns. Distances within 50 km  are not considered long-range transport in a  regulatory
context. The receptors used in this study were placed at the locations where tracer
concentration was measured during the experiments. The measurements were collected at
locations on the shoreline and inland. It would be advantageous to conduct a similar analysis
using only over-water receptors if such a tracer study can be identified. The scope of this work was to
use these five tracer studies that have been used historically for model validation.
                                          53

-------
[Blank]

-------
C.    Volume 3 - Analysis of AERMOD Performance Using Weather Research
and Forecasting Model Predicted Meteorology and Measured Meteorology in the
Arctic

General Comments

Comment C.1:  In many places, the words "wind" and "speed" are joined as one word; I don't
think that this is correct.

Response: This has been corrected in the revised report.

Comment C.2:   Comments identified in Volume 2, such as the use of qualitative rather than
quantitative discussion, jargon, and difficult-to-read plots, also apply to Volume 3.

Response: This comment follows from general observations of the narrative in Volume 2.

Specific Comments

Comment C.3:  Preface:  Please say what is in this report as opposed to the second (Task 3)
report.

Response: Preface has been rewritten for all three volumes.

Comment C.4:  Figure 85 is  revealing in that the peak concentrations are generally
consistently predicted at distances beyond 200 m, but at closer distances, there is a huge
variation in the model performance (that is apparently due to  instability differences?).  It makes
sense that high instability could lead to high predictions in the near-field.  Maybe some
AERMOD debugging exercises are needed to further explain what is happening.

Response: If a plume is elevated above the ground, the concentration at the ground downwind of
the source is zero up to the distance where the plume has spread vertically enough to reach the
ground. In stable conditions, vertical plume spread is suppressed and the concentrations will be
zero or near zero for a greater distance downwind of the source. In unstable conditions, vertical
plume spread is enhanced and the plume reaches the surface more quickly, resulting in higher
ground-level concentrations nearer to the source.

The variation in model performance in the near-field in this study is sensitive to the relationship
between the plume heights and the minimum PEL height. The minimum PEL height of 25 m is similar
to the heights of many of the  stacks. AERMOD parameterizes plume interaction with the top of
the PEL including reflection and penetration. The wide variation in near-field concentrations is  a
result of these interactions and variations in the rate of plume spread.
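
The behavior described in this response can be illustrated with a textbook Gaussian-plume sketch (not AERMOD's formulation): for an elevated release, the ground-level centerline concentration remains near zero until the vertical spread becomes comparable to the release height, and faster vertical growth brings the plume to the surface closer to the source. The sigma curves and source parameters below are hypothetical.

    import math

    def glc_centerline(q_gs, u_ms, h_m, x_m, sy_coef, sz_coef):
        """Ground-level centerline concentration (g/m3) for an elevated point source,
        using simple power-law sigma curves (illustrative only)."""
        sy = sy_coef * x_m**0.9
        sz = sz_coef * x_m**0.9
        return (q_gs / (math.pi * u_ms * sy * sz)) * math.exp(-h_m**2 / (2.0 * sz**2))

    # 10 g/s release at 25 m, 5 m/s wind; slow vs fast vertical spread
    for x in (50.0, 100.0, 200.0, 500.0):
        slow = glc_centerline(10.0, 5.0, 25.0, x, 0.10, 0.05)    # stable-like spread
        fast = glc_centerline(10.0, 5.0, 25.0, x, 0.30, 0.25)    # unstable-like spread
        print(f"x={x:5.0f} m  slow-spread={slow:9.2e}  fast-spread={fast:9.2e}")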

Comment C.5:  As the averaging time is increased, the effect of the 1-hour average variations is
increasingly dampened, as expected.

Response: We concur with the reviewer's observation.
                                         54

-------
Comment C.6:  Figures 95 and 96 also exhibit large near-field differences between the obs-
based model and WRF/MMIF-based models that might need more debugging investigations.

Response: Note that figure numbers have changed in subsequent drafts of the report. Refer to
the response to Comment C.4.

Comment C.7:  Section 7 - Conclusions, seventh paragraph, first subparagraph. "The use of
WRF meteorology for AERMOD dispersion modeling resulted in similar concentrations
compared to the measurement-based AERMOD modeling. The higher WRF-driven
concentrations were conservative and within a factor-of-two of the higher predictions from the
measured meteorology simulations...there was no scenario modeled in this study where the
RHC values predicted by WRF were underpredicted (in comparison to  measurement-based
RHC results) by more than a factor of two.  This suggests WRF extracted meteorology can be
used as an alternative to offshore observations for air permitting in such areas."

Given that the higher WRF-driven concentrations were conservative and within a factor-of-two
of the higher predictions from the measured meteorology simulations (common criteria for
evaluating acceptable regulatory modeling performance), Shell/Air Sciences concurs with the
BOEM and EPA recommendation that WRF meteorology could be used as an alternative to
offshore observations for use in the AERMOD dispersion model for air  permitting purposes.

In general, the results show that WRF can predict reasonable values for most of the
meteorological variables at sites in the Arctic. We believe that the WRF-derived wind,
temperature, and relative humidity values were within reasonable levels. See Comment #3 below
for our discussion on the WRF-predicted mixing heights, which are more problematic.

Response: The opinion of the reviewer has been noted.

Comment C.8:  Section 7 - Conclusions, seventh paragraph, second subparagraph. "The WRF
simulated datasets should undergo considerable scrutiny prior to their application. We
recommend a focused evaluation against measurements in the offshore or coastal areas of the
study domain. The evaluation should consider model performance for  difference [sic] WRF PEL
schemes and  reanalysis datasets."

The Task 3 report introduction and conclusions are contradictory. The  report introduction
states: "This study aims to evaluate alternative methods for supplying meteorological variables
to AERMOD for regulatory air quality modeling of sources located over the ocean." But this is
not consistent with what the conclusion states (see quote in BOEM/EPA Recommendation #2,
above).  The conclusion puts permit  applicants in the onerous position  of collecting evaluation
datasets for their application without  any assurance that the WRF solution would  be acceptable
to the regulatory authority.

From the study, it was evident that some WRF schemes performed better compared to
observational  data than others, but no one scheme worked best.  Air Sciences suggests that
BOEM and EPA work to develop a consistent and representative WRF dataset for the Arctic for
all permit applicants. This way, a consistent meteorological framework would be  available for
                                         55

-------
permitting efforts and would avoid the lengthy and protracted agency negotiation about which
PEL scheme best "fits" the particular scenario, especially if evaluation datasets are lacking.

Analogous to how the Federal Land Managers (FLMs) issue modeling guidance (FLAG
guidance) and scrutinize and recommend pre-approved mesoscale meteorological datasets for
use with the CALPUFF dispersion model when evaluating Class I Area Wilderness area
impacts, it would be advantageous to offshore air permit applicants if BOEM and EPA would
evaluate and pre-approve acceptable WRF datasets for air permitting purposes in the Arctic,
and if BOEM and EPA would  provide more details/guidance on acceptable pre-approved
offshore modeling guidance rather than requiring an applicant to perform detailed, case-by-case
WRF performance evaluation studies during the air permit application process.

Response: We agree that it would be advantageous for pre-approved WRF datasets to be
provided to permit applicants  by the presiding agency. This would provide a consistent platform
for evaluation of permit applicants. The burden of WRF performance evaluation would be the
responsibility of the authority. However, it is beyond the scope of this study to offer
recommendations on the implementation of a program. The purpose of the study is to examine
the effectiveness of the use of WRF data for regulatory dispersion modeling and to provide
recommendations on the configuration of WRF.

Comment C.9:  Section 7 - Conclusions, seventh paragraph, third subparagraph. "The MMIF
PEL height rediagnosis (RCALT) option should be applied to obtain  more accurate and
conservative maximum AERMOD predictions.  The rediagnosis option provides a consistent
way to define the PEL height as opposed to the multiple definitions used by the various WRF
PEL schemes."

We generally concur with the use of this technical option based on the study results.  It appears
that the heat flux configuration of WRF-MMIF is overly sensitive to the air-sea temperature
difference, resulting in either stable (25 meter) or high/over-predicted PEL heights.  Because of
this, we agree that the use of the re-diagnosis option for the calculation of PEL heights is
warranted.  However, we recommend that additional studies be performed to better understand
and characterize the mixing height and heat flux portion of the model.

Does raising the PEL through the recalculation change the model thermodynamics and heat
fluxes? Many interpretations  of the results are based on an enhancement of unstable
conditions. Based on the text description, the recalculation algorithm just raises the PEL height,
but would not result in a change in near-surface flux conditions.  If that is not the case, then it
should be made clear, from the heat flux and mixing standpoint, what the implications of
artificially raising the PEL height would be.

Response:  MMIF extracts a set of parameters from WRF output including friction velocity, PEL
depth, surface roughness, ground temperature, and lowest layer gridded meteorological
parameters such as wind, temperature, and humidity. MMIF uses these values to estimate the
Monin-Obukhov length (L) (when not provided by WRF; L was provided by the WRF simulations
                                          56

-------
used in this study) and the convective velocity scale (w*) using Monin-Obukhov similarity theory and
the Richardson-number based methodology of Louis (1979).37

MMIF recalculates the PEL height using a Richardson-number methodology (Vogelezang and
Holtslag, 1996)38 that relies on the vertical gradients of temperature and wind. This method does
not use the flux scaling parameters (u* or w*) to estimate PEL height. After the new PEL height
has been assigned, w* is recalculated (when applicable: when L < 0). All other parameters are
not rediagnosed by MMIF after the new PEL height is formulated. The rate of vertical mixing
under convective conditions is proportional to w*.

The MMIF PEL height recalculation is not an "artificial" method to change PEL height, but a
model  based on solid scientific theory and assumptions. The Vogelezang and Holtslag method
has been shown to produce accurate estimates of PEL height under various conditions.39 The
MYJ PEL scheme in WRF uses the vertical gradient of turbulent kinetic energy (TKE) to
estimate PEL height, defining the height as the level where TKE decreases to a value of 0.1
m^2/s^2. The MYJ PEL scheme is not necessarily the more scientifically valid or more accurate
model  under any given set of conditions.

Other WRF PEL schemes, such as  the YSU model, use a critical-Richardson number scheme
similar to the model  used in MMIF. It is likely that MMIF would therefore produce PEL height
estimates more similar to WRF estimates when the YSU PEL model is used. Regardless of the
PEL parameterization scheme used in WRF, MMIF provides the ability to improve the PEL
height estimate by refining the depth to within 1/20th of a WRF vertical cell height. WRF PEL
height estimates are limited to the resolution of the vertical grid.

AERMOD calculates the vertical and lateral dispersion coefficients differently under convective
and stable conditions.  Under stable conditions the dispersion coefficients are a function of
friction velocity and PEL  height. The rate of decay of vertical and lateral turbulence with height
under stable conditions is a function of PEL height. Under convective/unstable conditions the
vertical and lateral dispersion coefficients are a function of convective scaling velocity and PEL
height. Therefore, with greater PEL height, the rates of lateral and vertical dispersion
determined by AERMOD will be greater.
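
As an illustration of the dependence described above, once a new PEL height has been diagnosed the convective velocity scale can be recomputed from the unchanged surface scaling parameters through the standard relation w* = u* (zi / (k |L|))^(1/3) for convective conditions (L < 0). The sketch below is illustrative and is not the MMIF source code.

    def recompute_w_star(u_star, obukhov_l, zi_new, k=0.4):
        """Convective velocity scale for the rediagnosed PEL height zi_new (L < 0 only)."""
        if obukhov_l >= 0:
            return None                              # w* is defined only for convective cases
        return u_star * (zi_new / (k * abs(obukhov_l))) ** (1.0 / 3.0)

    # A deeper rediagnosed PEL (400 m vs 100 m) gives a larger w*, and therefore faster
    # vertical dispersion in AERMOD's convective formulation.
    print(recompute_w_star(0.3, -50.0, 100.0))       # about 0.51 m/s
    print(recompute_w_star(0.3, -50.0, 400.0))       # about 0.81 m/s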

Comment C.10: Section 7 - Conclusions, seventh paragraph, fourth subparagraph. "There is no
discernible benefit in using AERCOARE to process meteorology [rather than MMIF] extracted
from WRF.  AERMOD results were  similar overall with and without AERCOARE processing [i.e.,
using MMIF]".

We concur with this observation  based on the study results.
37 Louis, J.F. (1979): A Parametric Model of Vertical Eddy Fluxes in the Atmosphere. J. Atmospheric Science, 35,
187-202.
38 Vogelezang, D., A. Holtslag (1996): Evolution and model impacts of alternative boundary layer formulations.
Boundary-Layer Meteorology, 81, 245-269.
39 Seibert, P.,  F. Beyrich, S.E. Gryning, S. Joffre, A. Rasmussen, P. Tercier (1997): Mixing Height Determination for
Dispersion Modeling. COST Action 710, Report of Working Group 2.

                                           57

-------
Response: The opinion of the reviewer has been noted.

Comment C.11: Section 7 - Conclusions, seventh paragraph, fifth subparagraph.  "A high
resolution SST dataset is recommended to capture near-shore temperature gradients.  Avoid
using the coarse SST data typically available in the meteorological reanalysis datasets."

We concur with this technical recommendation based on the study results.  BOEM and EPA
observed that the use of higher resolution SST data (data available from modern satellites)
would be useful for modeling purposes.  In the study, the SST spatial gradients were high in the
Beaufort Sea due to the prevalence of the MacKenzie River outflow plume.  The influence of the
river plume affected a much larger area than suggested by buoy measurements and more
refined SST datasets. This bias resulted in WRF predicting a different PEL structure
than was observed in some cases.

We agree that including high-resolution SST data is important for a WRF simulation. However,
errors in the SST dataset should also be addressed as they can potentially bias WRF results.

Response: The opinion of the reviewer has been noted. The recommendation has been
appended to note that it is recommended that the modeler compare SST analysis data to
available measurements before use in WRF.

Comment C.12: Section 7 - Conclusions, seventh paragraph, sixth subparagraph.  "When used
by AERMOD,  we recommend WRF extracted meteorology be filtered to avoid extreme
conditions not typically observed over water. In our study we defined calm conditions as 0.5
m/s, required mixing heights to be greater than 25 m, and did not allow the absolute value of the
Monin-Obukhov length (L) to be less than 5 m."

We concur with these types of technical options that are employed to avoid  extreme
meteorological conditions not typically observed over water, thus avoiding unrealistically
high/conservative modeled concentrations.

Response: The opinion of the reviewer has been noted.
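
For reference, the screening described in the quoted recommendation might be applied as in the sketch below, using the thresholds quoted above (calms below 0.5 m/s, a 25 m mixing-height floor, and |L| no smaller than 5 m). The hourly record layout is hypothetical.

    def screen_overwater_record(rec, calm_ms=0.5, min_zi_m=25.0, min_abs_l_m=5.0):
        """Apply the overwater screening thresholds to one hourly record."""
        rec = dict(rec)
        if rec["wind_speed_ms"] < calm_ms:
            rec["calm"] = True                                   # flag the hour as calm
        rec["mixing_height_m"] = max(rec["mixing_height_m"], min_zi_m)
        if abs(rec["obukhov_l_m"]) < min_abs_l_m:
            rec["obukhov_l_m"] = min_abs_l_m if rec["obukhov_l_m"] > 0 else -min_abs_l_m
        return rec

    print(screen_overwater_record(
        {"wind_speed_ms": 0.3, "mixing_height_m": 12.0, "obukhov_l_m": -2.0}))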

Comment C.13: Section 7 - Conclusions, seventh paragraph, seventh subparagraph. "Land-
based RH [relative humidity] data was used as a substitution for missing data in a number of
instances in this study. The drawbacks of such an approach were discussed. It is known that
RH data were collected at the buoys during these periods but was not included in the public
databases.  Efforts to obtain these datasets are still in progress. We recommend additional
analysis using the buoy RH if it can be obtained."

Response: Commenter statement noted.

Comment C.14: The evaluation studies provided a caveat for the use of onshore RH data
when offshore RH data were not available. Additional analysis using buoy RH data could be a
worthwhile analysis for BOEM and EPA to pursue to test model sensitivity.
                                         58

-------
Response: The relative humidity data missing from several of the buoy datasets was obtained
subsequent to the release of the draft report. The revised report contains analysis and
discussion related to results using the complete datasets. References to substitution of missing
relative humidity data with onshore measurements have been removed.

Comment C.15:  Section 7 - Conclusions, seventh paragraph, eighth subparagraph. "Finally, this
study compared WRF meteorological predictions and WRF-driven AERMOD simulations to
AERMOD applications prepared with measurements. Such datasets in the Arctic are limited to
a few locations, a couple seasons, and in some instances patched together with assumptions
that were difficult to assess. WRF can be used to account for locations and seasons outside of
the available datasets and the MMIF extractions likely provide a more robust and consistent
meteorological database to simulate sources in the Arctic."

Response:  Commenter statement noted.

Comment C.16:  Concur with the BOEM and EPA recommendation that WRF meteorology can
be used as an alternative to offshore observations for use in the AERMOD dispersion model for
air permitting purposes.

Response: The opinion of the reviewer is noted.

Comment C.17:  Section 7 - Conclusions, seventh paragraph, eighth subparagraph. "Finally, this
study compared WRF meteorological predictions and WRF-driven AERMOD simulations to
AERMOD applications prepared with measurements. Such datasets in the Arctic are limited to
a few locations, a couple seasons, and in some instances patched together with assumptions
that were difficult to assess. WRF can be used to account for locations and seasons outside of
the available datasets and the MMIF extractions likely provide a more robust and consistent
meteorological database to simulate sources in the Arctic."

For AERMOD to be more useful, we suggest that EPA consider adding fumigation to the
AERMOD model. EPA could also consider taking into account whether a  receptor is over water
and/or over land and then adjust the atmospheric dispersion accordingly.

Response: We agree that it would be useful for AERMOD to include a fumigation module, but
it is beyond the scope of this work.
                                         59

-------