United States Environmental Protection Agency
Office of Wetlands, Oceans, and Watersheds, Washington, DC 20460
Office of Air Quality Planning and Standards, Research Triangle Park, NC 27711
EPA-453/R-01-009
September 2001
Frequently Asked Questions
Atmospheric Deposition
A Handbook for Watershed Managers
-------
EPA-453/R-01-009
September 2001
Frequently Asked Questions About Atmospheric Deposition
A Handbook for Watershed Managers
Office of Wetlands, Oceans, and Watersheds
U.S. Environmental Protection Agency
Washington, DC 20460
Office of Air Quality Planning and Standards
U.S. Environmental Protection Agency
Research Triangle Park, NC 27711
-------
This handbook is a joint project of the EPA Office of Wetlands, Oceans, and Watersheds in the Office
of Water, and Office of Air Quality Planning and Standards in the Office of Air and Radiation.
Special thanks go to the technical and watershed management experts who participated in the two-day
workshop in March 2000 to develop ideas for the handbook. Thanks also go to the many knowledgeable
people who provided technical input to and reviewed the draft handbook.
-------
Decision Flow Diagram .......................................................................................................... 1
I. Purpose of Handbook .......................................................................................................... 2
II. Atmospheric Deposition 101 .............................................................................................. 3
    Where Does Atmospheric Deposition Come From? ........................................................ 5
    What Is the Link Between Emissions, Atmospheric Deposition, and Ecological Impacts
    of Pollutants? .................................................................................................................. 6
III. How Do I Know If Air Deposition Is a Problem for My Watershed? ................................ 8
IV. I Think Air Deposition Is a Problem. Now What? ........................................................... 12
    Paper Studies ................................................................................................................ 12
    Summary of Data Sources ............................................................................................ 14
V. Time To Ask For Help ...................................................................................................... 22
VI. How Do I Design An Air Deposition Assessment Strategy? ............................................ 24
    What Questions Need to be Answered? ........................................................................ 24
    What Information is Needed? ....................................................................................... 25
    What Else Should be Considered? ................................................................................ 26
VII. What You Need To Know About Air Deposition Monitoring ....................................... 31
    Station Setup ................................................................................................................ 31
    Deposition Monitoring ................................................................................................. 33
-------
XI. Now What? ..................................................................................................................... 64
    Coordination ................................................................................................................ 64
    Total Maximum Daily Loads ........................................................................................ 65
    Air Program Basics ....................................................................................................... 66
    Emission Reduction Methods ....................................................................................... 70
    What Forms Do Emission Reduction Rules Take? ........................................................ 71
    Managing Old Pollution ............................................................................................... 72
    An Iterative Process ...................................................................................................... 72
XII. Resources ....................................................................................................................... 73
    Federal Atmospheric Deposition Programs ................................................................... 73
    Non-Federal Programs with Air Deposition Components ............................................. 75
    Deposition Resources ................................................................................................... 75
    Watershed Transport Resources .................................................................................... 75
    Other Sources of Information ....................................................................................... 76
    References ..................................................................................................................... 77
XIII. Sources of Funding ....................................................................................................... 78
Appendices
    Appendix 1: Sources of Pollutants of Concern in the Great Waters and Coastal Areas ... 79
    Appendix 2: Dry Deposition Velocities .......................................................................... 81
    Appendix 3: List of Mobile Source Air Toxics ................................................................ 82
    Appendix 4: NADP-NTN Siting Criteria for Wet Deposition Sites ............................... 83
Index ..................................................................................................................................... 87
-------
APTI          Air Pollution Training Institute
CAA           Clean Air Act
CAAA          Clean Air Act Amendments
CALMET        California Meteorological Model
CALPUFF       California Puff Model
CASTNet       Clean Air Status and Trends Network
CBPO          Chesapeake Bay Program Office
CMAQ          Community Multiscale Air Quality model
EPA           Environmental Protection Agency
GLNPO         Great Lakes National Program Office
HCB           hexachlorobenzene
HYSPLIT       Hybrid Single Particle Lagrangian Integrated Trajectory Model
IADN          Integrated Atmospheric Deposition Network
IMPROVE       Interagency Monitoring of Protected Visual Environments
MACT          Maximum Achievable Control Technology
NAAQS         National Ambient Air Quality Standard
NADP-AIRMoN   National Atmospheric Deposition Program-Atmospheric Integrated Research
              Monitoring Network
NADP-MDN      National Atmospheric Deposition Program-Mercury Deposition Network
NADP-NTN      National Atmospheric Deposition Program-National Trends Network
NAICS         North American Industry Classification System
NAMS          National Air Monitoring Stations
NDAMN         National Dioxin Air Monitoring Network
NEI           National Emission Inventory
NEP           National Estuary Programs
NOAA          National Oceanic and Atmospheric Administration
NSPS          new source performance standard
NSR           new source review
-------
PAHs          polycyclic aromatic hydrocarbons
POM           polycyclic organic matter
PCB           polychlorinated biphenyl
QA/QC         quality assurance/quality control
QAPP          Quality Assurance Project Plan
RADM          Regional Acid Deposition Model
RELMAP        Regional Lagrangian Model of Air Pollution
REMSAD        Regulatory Modeling System for Aerosols and Deposition
SCC           Source Classification Codes
SIC           Standard Industrial Classification
SIP           state implementation plan
SLAMS         State and Local Air Monitoring Stations
SPMS          Special Purpose Monitoring Stations
TBEP          Tampa Bay Estuary Program
TMDL          total maximum daily load
TRI           Toxic Release Inventory
USGS          United States Geological Survey
VOC           volatile organic compounds
VI
-------
The diagram has been placed in the front of this handbook for ready reference. It summarizes the
decisionmaking process suggested in the handbook. The process presumes you have identified water
quality or ecological problems in your waterbody that could be due to pollution.
[Decision flow diagram, summarized: Do any of the following warning signs apply? (1) Known sources do not explain the amount or location of contaminants in the waterbody; (2) national or regional deposition modeling or monitoring maps show a large amount of deposition in the area; (3) air deposition has been identified as a significant source of pollution in a nearby or similar waterbody; (4) there is broad-scale water or sediment contamination but no hot spots or known discharges; (5) there are large sources of air pollutants upwind. If so, air deposition may be a problem. Do a paper study to estimate the air deposition load. Compare the air deposition load to other pollutant loads; are they on the same order of magnitude? If so, identify appropriate experts and form an advisory group to assist in designing a strategy. Design the assessment strategy with the advisory group and partners. Analyze information, and ...]
-------
I. Purpose of Handbook
The purpose of this handbook is to provide information about what
atmospheric deposition is, how it can be measured, and how the
significance of the problem may be determined for a particular area.
The handbook may not answer all your questions directly; rather, it is
intended to lead you in the right direction and provide enough
information to decide how to address the issue in your area.
Atmospheric deposition is now recognized in many
areas as a significant cause of water quality
problems, acidification of streams and lakes, and
toxic contamination of fish and the birds and
mammals that eat them. Several National Estuary
Programs (NEPs) have calculated that atmospheric
deposition of at least one pollutant is a significant
portion of the total pollutant load to their estuaries.
It is something water resource managers are finding
they may need to take into account if they are to be
effective stewards of their environmental resources.
There are challenges to managing the problem. For
example, traditionally there has been a separation of
air and water legislation and programs in all levels
of government. Atmospheric deposition does not
always fit neatly into most resource management
agencies' media-specific programs and organiza-
tional structures. Also, unlike effluents discharged
directly into a waterbody, the sources of air
pollution may be near the waterbody or distant,
such as in another state or perhaps even another
country.
However, in the last decade, there has been legisla-
tion explicitly to address atmospheric deposition.
For example, when the Clean Air Act (CAA) was
amended in 1990, Congress included authorization
to reduce emissions of sulfur dioxides and nitrogen
oxides from utilities to address the problem of acid
rain, which was having a detrimental effect in many
areas, including the Adirondacks region of New
York and northern New England. At the same time,
Congress added requirements to the CAA that the
Environmental Protection Agency (EPA) assess the
impact of atmospheric deposition of toxic air
emissions (and other air pollutants of concern) on
certain waterbodies collectively known as the Great
Waters. EPA's current guidance also specifies that
states should include waterbodies with atmospheric
sources of pollution on their lists of impaired
waters that require total maximum daily loads
(TMDLs). A few states have already developed a
TMDL with an allocation for atmospheric deposi-
tion as part of the total pollutant load. To address
this kind of multimedia problem, air and water
quality managers must work together closely.
If you believe water quality or ecological problems
in your waterbody result from particular pollutants,
you may need to consider air deposition as a
possible contributor of those pollutants. Your first
question is likely to be: How do I know if I need to
worry about atmospheric deposition? This
handbook helps you answer that question. If it
turns out you do need to worry about atmospheric
deposition, the questions then become: What kind
of data do I need and how much? For which
pollutants? How should I monitor? Where do the
monitoring sites need to be located? When should I
use models? Which ones? How do I identify which
sources are responsible? How do I translate all this
information into a coherent management strategy?
These questions were collected from local
watershed managers who are, or have been, in the
position of needing to know how atmospheric
deposition contributes to their water quality
problems and what they can do about it.
-------
II. Atmospheric Deposition 101
Pollutants released into the atmosphere do not "go away." Some stay in
their original form in the atmosphere, while others are transformed into
other chemicals (which may or may not be considered pollutants).
Some stay in the atmosphere; the remainder are removed from the
atmosphere (deposited) to the land or water. Pollutants can travel
anywhere from a few yards to a few thousand miles before depositing as
a part of the pollutant load to our land and water.
The pollutants that are often identified as having
significant atmospheric contributions in
waterbodies are sulfur compounds, nitrogen
compounds, mercury compounds, other heavy
metals, and a handful of anthropogenic (of human
origin) pesticides and industrial by-products,
including current-use pesticides and herbicides
such as atrazine. For a list of the pollutants that are
most commonly studied, see the box below.
Pollutants Commonly Studied in Atmospheric Deposition to Waterbodies
*Sulfur compounds
Nitrogen compounds
Mercury compounds
Lead compounds
Cadmium compounds
Chlorpyrifos
*Copper
*Zinc
Polychlorinated biphenyls (PCBs)
Diazinon
Dioxins/furans
Dieldrin
DDT/DDE
Hexachlorobenzene (HCB)
a-hexachlorocyclohexane (a-HCH)
Lindane
Toxaphene
Polycyclic organic matter (POM), including polycyclic
aromatic hydrocarbons (PAHs)
*Atrazine
*Indicates pollutants that are not identified as Great Waters
Pollutants of Concern.
Nitrogen inputs have been studied in several east
and Gulf Coast estuaries due to concerns about
eutrophication. Nitrogen from atmospheric
deposition is estimated to be as high as 10 to 40%
of the total input of nitrogen to many of these
estuaries and perhaps higher in a few cases.
Deposition of toxic pollutants is being studied in
the Great Lakes and several estuaries on the
Atlantic, Gulf, and Pacific coasts. For example,
copper deposition is known to contribute to the
ongoing total copper load to San Francisco Bay.
Roughly 80% of the mercury input to Lake
Michigan is estimated to be due to atmospheric
deposition. The contribution of pollutants from
deposition compared with other sources varies both
by waterbody and pollutant.
Not only is the type of pollutant important in
understanding air deposition, but the chemical
form of the pollutant is critically important as well.
Mercury, for example, is emitted in both the
elemental (Hg°) and divalent (Hg2+) form in the
case of coal-fired utilities. Elemental mercury can be
transported in the atmosphere for long distances
(e.g., thousands of miles). It can be oxidized in the
atmosphere and is deposited as divalent mercury in
complexes in precipitation or as particles. The
divalent form of mercury in the gas phase may be
removed from the atmosphere within a short
distance of its emission source (e.g., tens to
hundreds of miles). Solubility, reactivity, and
physical state are often different for each form of a
pollutant. Important chemical species of concern in
air deposition to waterbodies are shown on the
following page by physical state (solid, liquid, or gas).
-------
Physical State for Pollutants of Concern:

Present in Precipitation
NO3-    nitrate
NH4+    ammonium ion
Organic nitrogen compounds
SO42-   sulfate
HSO3-   bisulfite
SO32-   sulfite
Atrazine, alachlor, cyanazine (herbicides) and degradation products
PCBs    polychlorinated biphenyls

Present in Gaseous Phase
NO      nitric oxide
NO2     nitrogen dioxide
NH3     ammonia
N2O5    dinitrogen pentoxide
HNO3    nitric acid (vapor)
SO2     sulfur dioxide
POM     polycyclic organic matter
PAH     polycyclic aromatic hydrocarbons
Hg2+    mercuric ion
Hg°     elemental mercury
PCBs    polychlorinated biphenyls
D/F     dioxins/furans

Present in/on Particles
NH4+    ammonium ion
NO3-    nitrate
Organic nitrogen compounds
H2SO4   sulfuric acid
SO42-   sulfate
Hg2+    mercuric ion
HgCl2   mercuric chloride
HgO     mercuric oxide
Hg°     elemental mercury
Pb      lead
Cd      cadmium
PCBs    polychlorinated biphenyls
D/F     dioxins/furans
POM     polycyclic organic matter
[Diagram: movement of pollutants through the atmosphere. Anthropogenic and natural sources emit gases and particulate matter; pollutants undergo local or long-distance transport and changes in chemical/physical form; they reach land and water through wet deposition, dry particle deposition, and air/water gas exchange.]
-------
For most pollutants, deposition has been studied in
a limited number of geographic areas. However,
that does not mean that deposition is not a
problem in other geographic areas; rather, it is
unknown. The deposition monitoring networks
that cover the broadest geographic areas are for
sulfur and nitrogen compounds. As you begin to
consider deposition in your area, it may be difficult
to find existing data about deposition rates. This
handbook provides tips on where you could look
for existing information in the Summary of Data
Sources on page 14. For more information on air
deposition monitoring, see the section on What You
Need to Know about Air Deposition Monitoring
on page 31.
Monitoring extensively enough to answer all your
questions about deposition may be prohibitively
expensive. Therefore, models are often used to
complement or replace monitoring to help answer
these questions. For example, modeling can fill
spatial or temporal gaps in the data. Models also
can address questions that monitoring alone may be
unable to address. One example is predicting the
effect of various management options or future
emission reductions on deposition rates. Another is
figuring out what types of sources are responsible
for the deposition in your area. The pollutants for
which modeling has been used to estimate
deposition rates include nitrate, ammonia,
ammonium, sulfate, total mercury, cadmium,
dioxins, atrazine, and particulate matter. While
models are extremely useful, they are, by definition,
simplifications of reality. It is important to
understand what assumptions and simplifications a
model includes and their implications for the
answers you are seeking. Furthermore, even the best
models are only as reliable as the data input to
them. To the extent possible, models should be
calibrated against actual deposition measurements.
For more information on modeling, see What You
Need To Know About Air Deposition Modeling
on page 45.
Atmospheric deposition comes from emissions of
air pollutants from natural and human-made
(anthropogenic) sources. Some pollutants in the
atmosphere occur naturally, including nitrogen,
sulfur, mercury, lead, cadmium, copper, and zinc.
These pollutants also have significant anthropogenic
sources, which can rival or exceed emissions from
natural sources.
Primary anthropogenic sources of nitrogen include
burning fossil fuels (e.g., in power plants,
industries, and vehicles) and agricultural activities
(including fertilizer application and animal feedlots
and waste lagoons). Natural sources of nitrogen
emissions include lightning, natural burns (e.g.,
forest fires), and microbial activity. Sulfur oxides
are formed when fuel containing sulfur (mainly coal
and oil) is burned and during metal smelting and
other industrial processes.
Generally, the primary anthropogenic sources of
mercury emissions are from combustion of material
containing mercury, such as coal-burning utilities
and boilers, and waste incinerators. There are also
emissions from industrial processes, such as chlor-
alkali plants and gold mining operations.
Definitions of Common Terms
The Grasshopper Effect: Some pollutants
(especially PCBs, pesticides, toxaphene, and some
forms of mercury) do not stay where they are deposited;
depending on a host of variables that are not easy to
quantify, they can be re-emitted from contaminated
soil or water. This leads to a cycle in which a
pollutant is emitted from an original source,
transported some distance, and deposited. Then a
portion is re-emitted, transported further, and
redeposited. The cycle can repeat indefinitely.
There is evidence that these pollutants continue to hop
in this way until they reach northern climates or high
elevations. The phenomenon is sometimes
called the "cold condensation effect." It is due to the
physical and chemical properties of the pollutants.
However, substantial amounts of these pollutants are
still found at lower latitudes.
-------
Anthropogenic mercury emissions are a component
of a global mercury cycle. Because elemental
mercury can travel great distances in the atmo-
sphere, and mercury compounds have their own
natural cycle between the atmosphere, the earth,
and the oceans, a background level of mercury is
present in the atmosphere. This background
reservoir is continuously refreshed from natural and
anthropogenic sources around the world. Other
heavy metals have somewhat different sources; they
come primarily from smelting, metals production
or plating, and mining, as well as combustion of
material containing the metals. Some pollutants are
mainly emitted as combustion by-products.
Dioxins/furans are by-products of combustion of
organic material containing chlorine. Typically,
POM, including PAHs, also are by-products of
incomplete combustion of fossil fuels and plant
biomass.
Many atmospherically deposited compounds are
not naturally occurring in the environment. For
example, PCBs are a group of synthetic organic
compounds that were used widely in electrical
equipment. Pesticides and herbicides are typically
manufactured chemicals. The use of some of these
pollutants has been banned or restricted in the
United States. In those cases, emissions are mainly
from volatilization from contaminated soils, use
from existing stocks (if still allowed), and long-
range transport from other countries.
Common atmospherically deposited pollutants in
waterbodies are listed in Appendix 1 with their
major sources and uses. The discussion of
inventories (page 47) in the chapter What You Need
to Know About Air Deposition Modeling also
contains citations for additional information on
emission sources.
What Is the Link Between Emissions, Atmospheric Deposition, and Ecological
Impacts of Pollutants?
It is often difficult to make a direct connection
between emissions of any pollutant at one location
and deposition at another. Emissions from a
Definitions of Common Terms:
Direct deposition: The process of deposition of air
pollution directly into a body of water (usually
thought of in terms of a body of water like an
estuary or lake). The amount of pollution
reaching the waterbody in this way is the direct load
from atmospheric deposition.
Indirect deposition: The process of deposition of air
pollution to the rest of the watershed (both the land
and the water). Once pollutants are in the
watershed, some portion of them is transported
through runoff, rivers, streams, and groundwater to
the waterbody of concern. The portion that reaches
the waterbody by passing through the watershed is
the indirect load from atmospheric
deposition.
Wet deposition: Pollutants deposited in rain, snow,
or fog. Acid rain, which has been recognized
as a problem in Europe, Canada, Asia, and parts
of the United States, is an example of wet
deposition of sulfur and nitrogen compounds.
Dry deposition: Pollutants deposited when it is not
raining or snowing. This is a complicated process that
occurs in different ways depending on the
chemical nature of the particle being deposited and
the "stickiness" of the surface. Dry deposition of
particles can basically be thought of as similar to dust
collecting on a table.
particular source may spread over a wide area and
deposit in several watersheds. In addition,
deposition rates in any watershed are probably due
to a large number of sources and a variety of
meteorological patterns. There are situations,
however, where reducing emissions in a specific area
leads directly to reducing atmospheric deposition
rates. For example, when large reductions in sulfur
dioxide emissions were implemented in the Ohio
River Valley, a significant drop in sulfate deposition
was measured downwind in the highly sensitive
Adirondacks and New England.
Several studies suggest links between atmospheric
deposition and environmental impacts, including
one showing that rainwater containing typical levels
of nitrogen increased the rate of algal growth when
added to seawater. However, it is difficult to
actually trace most atmospheric pollutants into the
food web because pollutants that have been
-------
deposited through air deposition are difficult to
distinguish from those that entered the food chain
through other pathways. So, modeling is typically
used to make these links. For example, for mercury,
fate and transport modeling and exposure
assessments predict that the anthropogenic
contribution to the total amount of methylmercury
in fish is, in part, the result of mercury emissions
from industrial and combustion sources increasing
mercury body burdens (i.e., concentrations) in fish.
Furthermore, the consumption of fish is the
dominant pathway of exposure to methylmercury
for fish-consuming humans and wildlife. If a
pollutant is known to cause a particular environ-
mental problem and atmospheric deposition of that
pollutant is known to be a significant part of the
total load, it is plausible that atmospheric
deposition is responsible for a portion of the
environmental problem.
-------
III. How Do I Know If Air Deposition Is
a Problem for My Watershed?
Several signs suggest that atmospheric deposition may be a problem:
• Known water sources of pollution do not explain the amount
or location of contaminants found in the waterbody.
• National or regional deposition modeling or monitoring maps
indicate a large amount of deposition in the area. For a list of
places to find these maps see page 14.
• Atmospheric deposition has been identified as a significant
source of pollution in a nearby or similar waterbody.
• There is broad-scale (often low-level) water or sediment
contamination with toxic pollutants but no hot spots or
known discharges.
• Large sources of atmospheric pollutants are upwind.
PAHs Found in Casco Bay, ME
A comprehensive analysis of sediments in Casco Bay
revealed widespread distribution of organic
contaminants, such as PAHs, in sediments throughout
the Bay, including in areas away from population
centers, current discharges, or known historical water
discharges. This suggested that atmospheric
sources could be responsible for some contaminants. The
Casco Bay National Estuary Program Comprehensive
Conservation and Management Plan directed EPA, the
Maine Department of Environmental Protection, and
the Casco Bay Estuary Project to determine how
atmospheric deposition contributes to the problem of
sediment contamination. In response, the partners
initiated a three-year deposition monitoring program
for PAHs, among other pollutants. The PAHs
monitoring is one of only a handful in the country
and not part of any coordinated network. Wet and dry
deposition of PAHs to Casco Bay, as measured at the
Wolfe's Neck monitoring site, appear to represent a
significant source of PAHs to the Casco Bay Estuary. A
caveat must be attached to this statement, however, because
loading of PAHs to the Bay is not well
characterized for all sources in the watershed.
If the answer to all the questions is no, it is unlikely
that significant atmospheric deposition is taking
place in your watershed or it is masked by high
loads from other sources. These signs are "warning
flags." If the answer to any of the statements is
"yes," it does not necessarily mean air deposition is a
large source of pollution or that it must be con-
trolled. It simply suggests that it would be a good
idea to take a closer look at what amount of deposi-
tion might be occurring directly in the waterbody
or in the larger watershed and affecting water qual-
ity through indirect deposition.
The following examples show some ways that
watershed management programs discovered that
atmospheric deposition was a significant source of
pollution in their watersheds.
During the late 1970s and early 1980s, studies in
the Great Lakes area were finding elevated levels of
PCBs, mercury, toxaphene, and other pesticides in
fish, water, and sediments of remote lakes with no
direct water discharge of these contaminants.
-------
Atmospheric deposition was suggested as a cause,
and subsequent monitoring confirmed that
significant amounts of mercury were being
deposited with rainwater. Further studies showed
that sources in the area could account for some of
these contaminant loadings; but, in some cases,
contaminants were being transported long distances
through the air. From there, suspicion arose that
mercury was probably being deposited in other
locations and from other sources as well.
Furthermore, other pollutants could also be
deposited in the same way. Atmospheric deposition
is now recognized as a significant source of a large
number of toxic pollutants in the Great Lakes area,
as well as several other places around the country. It
is also now known that long-range transport of
these pollutants from sources in lower latitudes are
causing substantial contamination of otherwise
pristine Arctic and high elevation ecosystems.
The Delaware Inland Bays National Estuary
Program began an air deposition monitoring study
to help Delaware develop a TMDL. Atmospheric
deposition was suspected, in part due to studies
from North Carolina showing a large amount of
ammonia deposition downwind of industrial hog
farms on the Carolina coastal plain. The peninsula
on which Delaware sits, known as the Delmarva
Peninsula, is also home to large numbers of
concentrated animal feeding operations, although
on Delmarva they are mostly chicken farms. Also,
data from the Atmospheric Integrated Research
Monitoring Network (AIRMoN) site near the Bays
were used to gauge wet nitrogen deposition.
Researchers have indeed found a significant amount
of ammonia deposition into the Delaware Inland
Does No Local Source = No Air Deposition
Problem?
Not necessarily. Since air pollutants can be
transported long distances, local sources may or may
not be the most important contributor to an
atmospheric deposition problem. For example, the
Chesapeake Bay Program estimates that local
emissions from automobiles contribute about a third
of the nitrate deposition to the Chesapeake Bay.
Emissions from power plants, some of which are local
but many of which are hundreds of miles away,
contribute a third of the total nitrate deposition.
How far pollutants travel depends on their
characteristics. For example, ammonia is generally
transported relatively short distances (within a few
hundred miles), while mercury and pollutants
such as DDT appear to be transported
globally. Therefore, a lack of local air sources,
especially for long-lived toxic pollutants such as
mercury, does not rule out
atmospheric deposition as a significant pathway.
Bays and the coastal Atlantic ocean. These data were
combined with data from other sources to develop
a TMDL for nitrogen for the Delaware Inland
Bays. The TMDL does not include any reductions
in nitrogen loads from atmospheric sources other
than those that are predicted from existing federal
laws, but it does identify atmospheric deposition of
nitrogen as a significant source of pollution to the
fragile ecosystems of the Bays.
The Tampa Bay Estuary Program used data from a
nearby National Atmospheric Deposition Program
(NADP) site and local rainfall data to estimate how
nitrogen was being deposited directly into Tampa
Bay. That loading was compared to nitrogen loads
What Is An Airshed?
An airshed is different than a watershed. In a watershed, everything that falls in that area flows into a body of
water. An airshed, in contrast, is a theoretical idea. It defines the source area that contains the emissions that
contribute a certain percentage (usually 75%) of the deposition in a particular watershed or waterbody. In other
words, the area containing the sources shown by a model to be responsible for some of the atmospheric
deposition reaching an estuary is defined as the airshed. Airsheds are determined from models and differ for each
form of every pollutant (e.g., the ammonia and nitrogen oxides airsheds are different for the same waterbody).
Airsheds can be useful tools to explain atmospheric transport and the need to control sources far from the
waterbody of concern. Nitrogen airsheds have been developed for every east coast estuary; they can be found on EPA's
Air-Water Coordination group's Web site at http://www.epa.gov/owow/oceans/airdep. Examples of airsheds are
shown on the next page.
-------
[Maps: examples of nitrogen airsheds for selected estuaries. Airsheds developed by R. Dennis, Atmospheric Sciences Modeling Division: NOAA Air Resources Laboratory and USEPA National Exposure Research Laboratory.]
10
-------
from other sources, including wastewater treatment
plants and stormwater runoff. Because the amount
of nitrogen calculated as coming from the
atmosphere was comparable to the amount coming
from the other sources (about 27% of the total
load), the estuary program decided to set up an
atmospheric deposition monitoring program to
measure the deposition load more carefully. Local
monitoring in the Tampa Bay Estuary Program
confirmed that 25 to 30% of the nitrogen entering
the Bay comes from atmospheric deposition. The
program has begun working with local sources to
reduce that load.
11
-------
IV. I Think Air Deposition Is a Problem.
Now What?
So you think air deposition may be a problem in your watershed. Now what? The first step is to take a
closer look without actually spending money on equipment or data collection. This section describes the
desk or paper studies that experts recommend conducting before you collect
new data.
This also is a good time to start thinking about what expert assistance you
may need and asking for suggestions of people who can help. However,
consider waiting until after you have started or completed the paper studies
recommended in this chapter to get experts on board. The paper studies
will help you figure out what your next steps will be and the kind of
expertise you will need. Also, the experts will appreciate starting with the
information you will have already pulled together.
This section contains information on
• Paper studies
• Data sources.
The watershed problems that you are trying to
address point you to the pollutants for which you
think air deposition may be a problem. Now
would be the time to identify specifically the
pollutants you need to focus on, if you haven't
already. Doing a paper study for those pollutants
means taking a survey of information already
available and putting it in the context of the
watershed. This would include available informa-
tion about potential sources that could influence
deposition and estimates of their magnitude.
Information about sources can be found in
Appendix 1 and through emission inventories
(discussed on page 47). For toxic pollutants that
tend to persist in the environment, also think about
what sources may have contributed in the past, but
are not current contributors. For example, a waste
incinerator that closed two years ago may have
contributed significantly to the total load of
mercury and dioxins/furans cycling in your
waterbody.
You also want to get a rough "back-of-the-
envelope" estimate of the contribution from air
deposition compared with other inputs. If it is
difficult to come up with actual numerical
deposition estimates, you can still look at pieces of
the puzzle conceptually to help develop hypotheses
about what is happening. For example, informa-
tion about air pollution sources, prevailing winds,
and non-air sources of pollution will provide clues.
The results of the paper study will provide guidance
for your next steps. Let us say you have estimated
that the input of air deposition is very small relative
to other inputs or you have very little information
about another potentially important pathway of
pollutant to your waterbody. You may decide to
expend resources on studying or reducing other
inputs before proceeding with additional study of
the air inputs. Should you decide that your next
step is additional air deposition analysis, the paper
study should shed more light on the questions that
12
-------
you still need to answer and the kind of data you
need to gather.
There are several places to find information on
deposition rates. Air deposition and air quality
monitoring networks and individual research sites
already in place around the United States are good
places to start. The air quality monitoring networks
alone will not provide deposition rates; rather they
provide information about the concentration of
pollutants in the ambient air. However, as discussed
later in this section, air concentrations can be used
in estimating dry deposition rates. They can also be
used alone as evidence of where the highest
deposition is anticipated to occur if you don't have
enough information to estimate deposition rates.
You would expect more deposition in the general
areas where ambient air concentrations are highest.
You can also look for results of air quality or
deposition modeling studies.
For suggestions on places to start looking for
estimated deposition loading rates, see Summary of
Data Sources on page 14. Also look in the Great
Waters Reports to Congress (available on the Web
at www.epa.gov/oar/oaqps/gr8water/) or search
scientific paper databases for publications, especially
for more uncommon pollutants (those not listed
on page 3).
Definitions of Common Terms
Receptor: A place that receives a pollutant. The term
generally refers to a location in the
transport pathway that is relevant to the questions need-
ing to be answered. Examples of receptors are
the deposition monitoring site, the landscape the
pollutant deposits on, an estuary or lake where its
loadings are in question, or any place in the
ecological web where it might be found (such as
sediment or fish tissue). Scientists sometimes refer to
"measuring the form of pollutant at the receptor
site." This really means they want to know
how much of the pollutant in what form is showing
up at the place(s) where they are monitoring for it.
Dry Deposition
Dry deposition rate: The amount of a pollutant that
lands in a particular area over a certain period of
time, for example, 50 kilograms/hectare/year (kg/ha/
yr). This is the number to know for management
purposes.
Dry deposition velocity: A term in the equation
below that describes how fast a particle or a gas
deposits to the ground. The entire equation is:
dry deposition (D) = C x Vd for a given
period of time
where
Vd = dry deposition velocity
C = pollutant concentration in air at a
reference height
Dry deposition monitoring methods either measure
deposition directly (rarely, due to method limitations)
or measure the pollutant concentration in air and use
a modeled velocity to calculate the deposition rate or flux.
Deposition rates are often reported as pounds/
hectare/year (lb/ha/yr), grams/square meter/day
(g/m2/day), or kilograms/year (kg/yr) for a
specific waterbody. They may be reported as wet
deposition rates, dry deposition rates (not
deposition velocities, as described below), and total
deposition rates. If the total deposition rate is
reported, check with the author to find out if it
includes wet and dry deposition of all forms of the
pollutant. For example, total nitrogen deposition
sometimes means wet and dry deposition of nitric
acid; sometimes wet and dry deposition of nitric
acid and ammonium; sometimes wet and dry
deposition of nitric acid, ammonium, and ammonia;
and sometimes wet and dry deposition of nitric acid,
ammonium, ammonia, and organic nitrogen.
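Because reported rates come in several different units, it often helps to convert everything to a single unit before comparing numbers. The short Python sketch below shows one way to do this; the function names and example values are ours, chosen only for illustration.

    # Convert common deposition-rate units to kilograms per hectare per year (kg/ha/yr).
    # Conversion factors: 1 lb = 0.4536 kg; 1 ha = 10,000 m2; 1 g = 0.001 kg; 1 yr = 365 days.

    LB_TO_KG = 0.4536
    M2_PER_HA = 10_000.0
    DAYS_PER_YEAR = 365.0

    def lb_per_ha_yr_to_kg_per_ha_yr(rate_lb_ha_yr):
        """Pounds/hectare/year -> kilograms/hectare/year."""
        return rate_lb_ha_yr * LB_TO_KG

    def g_per_m2_day_to_kg_per_ha_yr(rate_g_m2_day):
        """Grams/square meter/day -> kilograms/hectare/year."""
        return rate_g_m2_day * 0.001 * M2_PER_HA * DAYS_PER_YEAR

    print(lb_per_ha_yr_to_kg_per_ha_yr(10.0))    # 10 lb/ha/yr    -> about 4.5 kg/ha/yr
    print(g_per_m2_day_to_kg_per_ha_yr(0.001))   # 0.001 g/m2/day -> 3.65 kg/ha/yr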
If you cannot find data on dry deposition rates, you
may be able to estimate them with dry deposition
velocities and concentrations of the pollutants in
ambient air, as noted in the Dry Deposition box at
the left. Velocity measurements describe how fast a
particle or gas falls to a particular type of surface,
such as the outside surface of a leaf, but they do not
indicate how many particles or how much gas falls.
They are sometimes reported in the scientific
literature as centimeters/second. A table showing
some of the dry deposition velocities from the
literature is in Appendix 2.
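If you have an ambient air concentration and a borrowed dry deposition velocity, the equation in the Dry Deposition box (dry deposition = C x Vd) can be applied directly. The Python sketch below shows the arithmetic and the unit bookkeeping; the concentration and velocity used here are hypothetical placeholders, not measured values.

    # Estimate an annual dry deposition flux from an air concentration and a
    # deposition velocity using D = C x Vd (see the Dry Deposition box).

    SECONDS_PER_YEAR = 365.0 * 24 * 3600

    def dry_deposition_flux(conc_ug_per_m3, velocity_cm_per_s):
        """Return the annual dry deposition flux in micrograms per square meter per year."""
        velocity_m_per_s = velocity_cm_per_s / 100.0           # cm/s -> m/s
        flux_ug_per_m2_s = conc_ug_per_m3 * velocity_m_per_s   # ug/m3 * m/s = ug/m2/s
        return flux_ug_per_m2_s * SECONDS_PER_YEAR             # ug/m2/s -> ug/m2/yr

    # Hypothetical inputs: 0.002 ug/m3 in air and a 0.5 cm/s deposition velocity.
    print(round(dry_deposition_flux(0.002, 0.5), 1))  # roughly 315 ug/m2/yr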
Deposition rates reported in the literature are very
sensitive to the landscape and meteorological
conditions. They are good enough for a paper study
first estimate, but they should be used with caution
13
-------
to calculate more exact dry deposition loads.
Generally the more the conditions under which the
reported value was collected resemble your
conditions (surface, meteorology, and nearby
sources), the more accurate the deposition rate is
likely to be. For example, if the watershed is just
downwind of a large oil refinery complex, borrow a
deposition rate from another area that has similar
emissions upwind rather than the watershed next
door with no emissions from that industry. The key
to picking a good deposition rate or rates is to have
a general knowledge of what industries emit which
pollutants and where they are located. If the
reported values are relatively close together (within
one order of magnitude), it is easiest to average
them and use the average deposition rate for back-
of-the-envelope calculations. If there is a large
difference between them (or if you're not sure if the
difference is large), pick a high one and a low one to
do the back-of-the-envelope calculations of loadings
with each.
To do a back-of-the-envelope calculation of the
direct atmospheric load to the body of water,
multiply the deposition rate by the area of the
waterbody. For example, if the low reported
deposition rate for total mercury deposition is
0.1 micrograms per square meter per year (ug/m2/yr),
and the waterbody is 5,193 m2, the direct mercury
load is 519 ug/yr. If the high reported deposition
rate is 15 ug/m2/yr, the total load to the waterbody
is 77,895 ug/yr. Therefore, the direct atmospheric
mercury load probably falls between 500 and
80,000 ug/yr. This translates into 0.0005 grams per
year (g/yr) and 0.08 g/yr. This may not sound like a
lot, but it can be, depending on how much comes
from other loads and how sensitive the ecosystem is.
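The mercury example above can be reproduced with a few lines of arithmetic. The Python sketch below is only a back-of-the-envelope aid; the deposition rates and waterbody area are the illustrative numbers from the text, not data for any real waterbody.

    # Back-of-the-envelope direct atmospheric load: deposition rate x waterbody area.
    # Values are the illustrative mercury numbers used in the text.

    AREA_M2 = 5_193.0            # waterbody surface area, m2
    LOW_RATE_UG_M2_YR = 0.1      # low reported total mercury deposition rate, ug/m2/yr
    HIGH_RATE_UG_M2_YR = 15.0    # high reported rate, ug/m2/yr

    def direct_load_g_per_yr(rate_ug_m2_yr, area_m2):
        """Direct atmospheric load in grams per year."""
        load_ug_per_yr = rate_ug_m2_yr * area_m2
        return load_ug_per_yr / 1.0e6    # micrograms -> grams

    low = direct_load_g_per_yr(LOW_RATE_UG_M2_YR, AREA_M2)     # about 0.0005 g/yr (519 ug/yr)
    high = direct_load_g_per_yr(HIGH_RATE_UG_M2_YR, AREA_M2)   # about 0.08 g/yr (77,895 ug/yr)
    print(low, high)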
Compare the estimated load(s) to loads from other
sources. Are they roughly similar, or does one dwarf
the others? It helps to calculate the estimated
percent load of atmospheric sources. For example, if
the other known sources are a waterborne point
source on the lake that discharges 0.1 g/yr and a
tributary that carries approximately 0.05 g/yr, the
percentage of atmospheric load ranges from <0.3%
to 53%. In this case, since it is possible that air
deposition is a significant source, it is probably
worth spending time and/or money refining the
estimated loading. In general, if air deposition is
more than 10 to 15% of the total load, it is
probably worth spending additional resources to
refine the estimates by measuring atmospheric
deposition directly in the watershed. This does not
mean you have to control air sources, just that it is
worth knowing more accurately how important
they are.
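One way to make the comparison described above is to express the estimated atmospheric load as a percentage of the loads from the other known sources. The Python sketch below follows that convention with the same illustrative numbers; expressing the load as a share of the combined total instead would give somewhat different percentages.

    # Compare the estimated atmospheric load range with the loads from other known
    # sources (a point source at 0.1 g/yr plus a tributary at 0.05 g/yr).

    OTHER_LOADS_G_YR = 0.1 + 0.05                 # other known sources, g/yr
    AIR_LOW_G_YR, AIR_HIGH_G_YR = 0.0005, 0.08    # estimated atmospheric load range, g/yr

    def percent_of_other_loads(air_load_g_yr, other_loads_g_yr):
        """Atmospheric load expressed as a percentage of the other known loads."""
        return 100.0 * air_load_g_yr / other_loads_g_yr

    print(round(percent_of_other_loads(AIR_LOW_G_YR, OTHER_LOADS_G_YR), 1))   # about 0.3%
    print(round(percent_of_other_loads(AIR_HIGH_G_YR, OTHER_LOADS_G_YR), 1))  # about 53%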
Summary of Data Sources
This section describes a variety of sources of
information for your paper study. The national
deposition and air monitoring networks are an
important place to look. The majority of this
section is devoted to describing these networks.
Other sources are air deposition research or local
studies and results from runs of air quality or
deposition models.
Some monitoring sites measure only wet deposition
of a few pollutants and some measure only dry
deposition; some measure deposition daily and
some measure it weekly. The following paragraphs
summarize some of the monitoring networks and
individual studies that may have useful information
to help with the paper study. Contact each of these
programs directly for the latest information on
where sites are located, details on specific
constituents measured, and the latest measured
deposition rates at each site.
NADP-National Trends Network (NTN). The
NADP was originally set up in 1978 to measure
deposition of pollutants that cause acid rain. (The
NTN was once a separate network, but has now
completely merged into the NADP umbrella.)
NADP-NTN is considered the standard for wet
deposition measurements of sulfate, nitrate,
ammonium, and orthophosphate. Samples are
collected on a weekly basis. The network consists of
over 200 sites, many of which are located in
relatively rural or remote areas. These sites are also
predominantly inland. The sites were (and still
primarily are) set up far away from sources to
measure regional deposition rates and not the
14
-------
specific impact of a particular source. The sites,
therefore, are very valuable in showing trends over
space and time, but they cannot be used to identify
sources of pollution very well. The sites are almost
exclusively inland because sea salt complicates the
measurement of sulfate (makes it a little less
accurate). This is less of a problem for measuring
nitrate and ammonium, and recently NADP added
a handful of coastal sites to the two or three that
were already operational.
NADP is a multi-agency network run out of the
Illinois State Water Survey. It has a technical
committee of scientists and agency managers that
oversee the network and make sure it continues to
provide highly reliable data. All data collected by
NADP are available on the Web at no cost through
the NADP home page at http://nadp.sws.uiuc.edu.
The data are also interpolated using a Kriging
model (similar to a statistical least squares fit) to
produce maps of nationwide deposition rates. These
maps can be accessed on the Web for every year
since 1994, as well as the monthly and yearly
deposition rates at each monitoring site. These
deposition rates are very useful in doing back-of-the
envelope calculations for watersheds nationwide.
NADP-Mercury Deposition Network (MDN).
The MDN is one subnetwork of NADP. The first
Definitions of Common Terms
Aerosol: A technical term for a population of very
small solid or liquid particles suspended in the atmosphere.
Pollutants that occur as aerosols include nitric acid
and sulfuric acid.
Oxidants: A group of highly reactive
chemicals produced in the atmosphere, usually from
chemical reactions of NOx and volatile organic
compounds (VOCs) in sunlight. Examples of oxidants
include ozone (O3), hydrogen peroxide (H2O2), and
organic peroxides. Oxidants react with many of the
compounds of interest to atmospheric deposition and
alter their behavior. For example, sulfur dioxide (SO2)
is relatively insoluble in water. However, when it is
oxidized, the resulting sulfate (SO4) is
much more soluble. Soluble compounds tend to
"wash out" in rain much more easily than insoluble
ones. In short, oxidants change the chemistry of
pollutants in the atmosphere and the processes
that control deposition.
sites became operational in 1995, and the network
became an official part of NADP in 1996. MDN
contains approximately 50 mercury deposition
monitoring sites nationwide. Most of those sites
monitor for total wet mercury deposition; some
also monitor for wet methylmercury deposition.
Samples are collected on a weekly basis. There is no
widely accepted method for measuring dry mercury
deposition on a routine basis. The data from MDN
can also be accessed on the NADP Web site at
http://nadp.sws.uiuc.edu.
NADP-AIRMoN. AIRMoN, the smallest
subnetwork of the NADP, is sponsored by the Air
Resources Laboratory of the National Oceanic
and Atmospheric Administration (NOAA)
and run by NADP. The purpose of the network is
to provide research-grade monitoring data to the
NADP and the data users, especially modelers. The
first sites were installed in 1992, and the network
actively encourages coastal and urban sites.
AIRMoN consists of about 22 sites (13 dry and
nine wet) and measures sulfur and nitrogen
compounds (dry) and several cations and anions
(wet) on a daily (instead of weekly) basis. AIRMoN
data can also be accessed on the NADP Web site at
http://nadp.sws.uiuc.edu.
Clean Air Status and Trends Network
(CASTNet). CASTNet is the nation's primary
monitoring network for measuring dry acidic
deposition. In conjunction with other national
monitoring networks, CASTNet is used to
determine the effectiveness of national emission
control programs. The network was built from an
old dry deposition monitoring network started in
the mid-1980s and a new commitment in the 1990
CAA to measuring long-term status and trends.
There are approximately 80 CASTNet sites
nationwide. CASTNet sites use filterpacks to
collect ambient air samples, and deposition
velocities are calculated using the Multi-Layer
Model. The deposition velocity and air concentra-
tion are plugged into an equation to get deposition
flux or rate. Measurements at each site include
weekly average atmospheric concentrations of
sulfate, nitrate, ammonium, sulfur dioxide, and
nitric acid, and meteorological conditions required
18
-------
for calculating dry deposition rates. Because of the
interdependence of wet and dry deposition, NADP
wet deposition data are collected at or near all
CASTNet sites. CASTNet data can be accessed
from EPA's Office of Air and Radiation at http://
www.epa.gov/castnet/.
Integrated Atmospheric Deposition Network
(IADN). This network has measured deposition of
toxic contaminants for the Great Lakes region since
1990 on a master site/satellite plan. IADN is a joint
project between the United States and Canada, and
monitoring sites are set up on both sides of the
border. There is one "master site" near each lake
and several "satellite" sites around the lakes that
monitor less frequently and often for fewer
parameters. This approach attempts to capture both
temporal variability (the frequent monitoring at the
master site to measure trends) and spatial variability
(several sites for each lake to get at source
attribution). The sites are rural to measure
background or regional deposition and not hot
spots from cities. IADN monitors for wet and dry
deposition of PCBs, banned pesticides, and some
PAHs. IADN measured trace metals historically,
and the Canadian sites continue to do so. The U.S.
sites do not yet include mercury deposition
monitoring. However, both wet divalent and
methylmercury will be sampled at Sleeping Bear
Dunes, Michigan, beginning in late 2001. Data can
be accessed from the Great Lakes National Program
Office at http://www.epa.gov/glnpo/monitoring/
air.
National Dioxin Air Monitoring Network
(NDAMN). NDAMN is a relatively new research
network that has only been in operation since 1998.
The network has been implemented in stages, with
only nine sites initially set out and 11 as of June
2000. The complete network consists of 30 sites
that monitor ambient air for a suite of dioxins/
furans and dioxin-like PCBs. The sites are located
in predominately rural areas to measure background
levels of dioxins/furans/PCBs contamination, allow
for geographic comparisons of dioxins/furans/PCBs
levels, and provide information on long-range
transport of dioxins/furans/PCBs. For NDAMN
information, contact http://www.epa.gov/ncea/
lpage.htm.
Interagency Monitoring of Protected Visual
Environments (IMPROVE). IMPROVE sites are
primarily located in national parks or wilderness
areas. They collect ambient air data (not deposition)
on aerosols, particulate matter (large and fine), and
other visibility-related pollution. The sites are
designed to provide data for federal visibility
regulations, identify sources of man-made pollution
that reduces visibility, and assess whether the goal of
no man-made visibility impairment in national
parks (Class 1 areas) is being met over the long
term.
The IMPROVE Web site provides general public
access to the data and information on network sites
at http://vista.cira.colostate.edu/improve/Data/
IMPROVE/improve___data.htm. Aerosol data for all
101 sampler locations, including carbon, are
available in seasonal summaries by year on the
University of California, Davis, FTP site at http://
improve.cnl.ucdavis.edu.
National Air Monitoring Stations/State and
Local Air Monitoring Stations/Special Purpose
Monitoring Stations (NAMS/SLAMS/SPMS).
NAMS, SLAMS, and SPMS are ambient air
quality monitoring networks. NAMS and SLAMS
have been maintained for many years, while SPMS
Definitions of Common Terms:
Primary pollutant: A pollutant that is emitted
directly into the atmosphere.
Secondary pollutant: A pollutant that is formed in
the atmosphere from a reaction between other
compounds. Examples of secondary particles include
most nitrogen compounds that are atmospherically
deposited (nitric acid, ammonium nitrate, and ammonium
sulfate).
Precursor: Compounds that react in the
atmosphere to form other pollutants. For example, NOx
and VOCs (volatile organic compounds) react to
form ozone; they are called "ozone precursors."
Since ozone is primarily formed in the atmosphere
from these precursors, the precursors themselves
are controlled to meet the ozone NAAQS (National
Ambient Air Quality Standard).
19
-------
may be run for intermittent periods of time,
ranging from a few months to a few years. The
available data usually include the six criteria
pollutants: CO, O3, NO2, SO2, lead, and
particulate matter (PM10). Some fine particulate
matter (PM2.5) and air toxics data are also beginning
to be made available. Toxics generally have been
monitored on the local scale for short periods of
time.
EPA has led a major effort to establish a national
PM2.5 monitoring network to achieve both
regulatory and research objectives related to the new
ambient standard for respirable particulates. The
PM2.5 monitoring network consists of three major
components: mass monitoring sites (several
hundred NAMS/SLAMS sites), routine chemical
Estuaries at Which Atmospheric Deposition Has Been Studied
Atmospheric deposition studies outside the
national monitoring networks have been conducted
at these estuaries. You can contact the
corresponding National Estuary Program or
National Estuarine Research Reserve for additional
information.
Albemarle-Pamlico Estuary (nitrogen and trace elements)
Casco Bay (PAHs, Hg, nitrogen)
Chesapeake Bay (nitrogen and toxic compounds)
Coastal (nitrogen, PAHs, PCBs, pesticides, trace metals)
Delaware Bay (nitrogen and trace elements including Hg)
Delaware Inland Bays (nitrogen and trace elements)
Galveston Bay (TRIADS [PAHs, PCBs, and pesticides])
Long Island Sound (nitrogen and Hg)
Massachusetts Bays (PAHs and metals)
New York/New Jersey Harbor and State of New
Jersey (nitrogen and Pb, Ni, Cu, PCBs, PAHs, and
dioxin)
San Francisco Bay (Hg, Cu, PCBs, and PAHs)
Santa Monica Bay (metals and organics)
Sarasota Bay (nitrogen)
Tampa Bay (metals, including Hg; ambient air
concentrations of pesticides, PCBs, PAHs)
Waquoit Bay National Estuarine Research Reserve
(nitrogen)
speciation of the fine particulate sample fraction
sites (select NAMS/SLAMS sites and SPMS sites,
as part of research efforts), and targeted geographical
areas for special research data collection efforts
aimed at identifying links between health effects
and fine particulates, commonly known as
"Supersites." (Eight metro areas have been
established: Phase I—Atlanta in 2000, Phase II—
Baltimore, Fresno, Houston, Los Angeles, New
York, Pittsburgh, and St. Louis in December 2000
through 2001.) For information about PM2.5
monitoring efforts, access the EPA TTN AMTIC
Web site at http://www.epa.gov/ttn/amtic/.
While ambient air data do not directly provide
deposition rates, they do give a general picture of
overall air quality and the potential for deposition.
These networks are mostly managed by EPA's
Office of Air Quality Planning and Standards and
maintained by the various state and local agencies.
Check with your state air quality agency and its
monitoring group to find out where these sites are
located and what they are measuring in your area.
Data from all three networks are available from the
EPA AIRData Web site at http://www.epa.gov/air/
data/monitors.html.
Ambient Air Toxics Monitoring. The EPA is
working with state and local air monitoring
agencies to develop an air toxics monitoring
network with the following objectives: to
characterize air toxics problems on a national scale,
to provide a means to obtain data on a more
localized basis as appropriate and necessary, and to
help evaluate air quality models. The goal is to
build on monitoring already in place in state, local,
and tribal programs, as well as other national
networks. As the air toxics network is phased in,
the pollutants to be monitored are expected to
include several of the compounds of concern for air
deposition, such as mercury, POM, and metals.
More information can be found at http://
www.epa.gov/ttn/amtic/airtoxpg.html.
Apart from the national networks, other air
deposition studies have been or are being conducted
at the regional or local level. Information about
20
-------
these results can be found in the scientific literature.
You also could contact the state or local environ-
mental agencies or organizations that focus on
particular waterbodies (e.g., National Estuary
Programs, the Chesapeake Bay Program, the Great
Lakes National Program Office) to get leads on
research studies. The table below provides the
results of monitoring studies on several
waterbodies.
This is not an exhaustive list. The Atmospheric
Exchange Over Lake and Oceans Study targeted the
Lake Michigan and Chesapeake Bay areas for
examining the impact of urban plumes from
Chicago and Baltimore, respectively, on atmo-
spheric loadings of trace metals, including mercury,
and organics (PAHs and PCBs) on the waterbodies.
Mass balance studies of individual waterbodies have
been performed on Lake Michigan (PCBs, mercury,
trans-nonachlor, and atrazine) and Green Bay
(PCBs, dieldrin, cadmium, lead) in Wisconsin.
The Chesapeake Bay Atmospheric Deposition
Study was designed to study the loads of a variety
of trace metals and organic contaminants in the
Chesapeake Bay. The New Jersey Atmospheric
Deposition Network was designed to study
deposition of PAHs; PCBs; a suite of pesticides,
nutrients, and selected trace metals; and mercury to
sensitive watersheds in New Jersey.
Other potential sources of information for your
paper study are results from deposition or air
quality modeling runs that have already been done.
Some models provide results on a national or
regional scale; others on a more local scale.
Descriptions of several models are provided in the
chapter "What You Need to Know About Air
Deposition Modeling." The programs listed in the
Resources section in the back of this handbook are
a good place to look for modeling that has been
done. Another suggestion is to contact your state or
local air pollution agency.
Atmospheric Nitrogen Loads Relative to Total Nitrogen Loads in Selected Great Waters*

Waterbody                           Total Nitrogen Load   Atmospheric Nitrogen    Percent Load From
                                    (million kg/yr)       Load (million kg/yr)    the Atmosphere
Albemarle-Pamlico Sounds                   23                     9                      38
Chesapeake Bay                            170                    36                      21
Delaware Bay                               54                     8                      15
Long Island Sound                          60                    12                      20
Narragansett Bay                            5                     0.6                    12
New York Bight                            164                    62                      38

Based on ADN loads from the watershed only (excluding direct nitrogen deposition to the bay surface):
Waquoit Bay, MA                              .022                  .0065                 29

Based on ADN directly to the waterbody (excluding ADN loads from the watershed):
Delaware Inland Bays                        1.3                    .28                   21
Flanders Bay, NY                             .36                   .027                   7
Guadalupe Estuary, TX                       4.2 - 15.9             .31                   2 - 8
Massachusetts Bays                         22 - 30                1.6 - 6                5 - 27
Narragansett Bay                            9                      .4                     4
Newport River Coastal Waters, NC             .27 - .85             .095 - .68            >35
Potomac River, MD                          35.5                   1.9                     5
Sarasota Bay, FL                             .6                    .16                   26
Tampa Bay, FL                               3.8                   1.1                    28

ADN = atmospheric deposition of nitrogen
*Table from Deposition of Air Pollutants to the Great Waters—3rd Report to Congress. EPA-453/R-00-005,
June 2000. Original literature references included in the report.
21
-------
V. Time To Ask For Help
If you have made it this far, you have a preliminary estimate that says air deposition may be significant.
Now it is time to ask for help getting through the rest of the steps. No one can design and carry out an air
deposition assessment by themselves; and, unless you are an atmospheric scientist, there will be a steep
learning curve. So the most important tool you can find is a group of people who are willing to help and
who have done this before.
Managers that have done atmospheric deposition
assessments strongly recommend setting up some
sort of advisory group to answer the science
questions. Consider a variety of research scientists
with different perspectives and expertise, such as
national experts, scientists doing related work in the
region, people with an understanding of how
atmospheric deposition fits into the bigger picture
of watershed management, and scientists from a
local college or university. These experts will be
able to advise you on many technical questions. For
example, for a monitoring study, questions include
site locations, what to monitor, how frequently to
sample, methodologies to use, how to develop a
quality assurance plan, and how to interpret the
data collected. Similarly, for a modeling
assessment, there are questions like what model to
use, what data inputs to use, and how to interpret
results.
You may also want to have local stakeholders
involved with the advisory group from the
beginning. These include representatives from
environmental organizations, scientists and/or
officials from different levels of government (e.g.,
federal, state, tribal, county, municipal), people
working on related projects in the area, and
representatives from industry. They could bring
information and various perspectives that would
positively affect the design of the assessment. For
example, stakeholders from the governmental sector
generally have a better understanding of the
management questions than most scientific experts
and can assist in framing the scientific dialogue
about how to identify data needs and acceptable
uncertainty levels. Many times, these kinds of
stakeholders are instrumental in identifying
opportunities for phasing data collection efforts,
beginning with screening-level analysis, that may
lead to increasingly sophisticated inquiries.
Furthermore, since the local stakeholders may be
expected to make decisions, make changes, or
spend resources based on the study's findings,
getting their buy-in to the study design could
smooth the implementation process.
The advisory group should meet in person at least
once at the program design stage to map out a data
collection strategy they can all agree on. This is
especially important in situations where the project
is not similar to ones that have been done before,
where there is no widely accepted method to
answer the question(s) that need to be answered, or
where the situation is controversial. If there are
conflicting scientific views relating to the study
design within your group, you may want to ask the
scientists to try to reach consensus among
themselves on one or a few approaches they could
recommend. In this way, you will not be in the
position of trying to reconcile conflicting technical
advice. In addition, you will reduce the potential
for later disagreements over the validity of the
assessment.
After there is consensus on the strategy, most
simple questions can be answered by telephone
either one-on-one or in small conference calls.
However, it is good to hold periodic open
meetings or otherwise keep the advisory group
involved as the process continues, both to take
advantage of new research or other breakthroughs
and to keep the "buy-in" that was developed at the
beginning.
22
-------
Colleges and universities are indispensable. Not only do they provide technical expertise and a link
to the latest information from the academic world, they can often also help with resources. Some have
equipment that can be borrowed, some have laboratories that can run samples for free or at a
reduced cost, and they have a large number of low-cost hands available to work. For example, graduate
students can collect samples and analyze them as part of their research, and data analysis can be done
by classes of statistics students. This is not to say that there is always a wealth of resources waiting to
be taken advantage of on local campuses. However, many students and professors look for practical
applications for their research, and you need research for a practical application, so see if you
can help each other out!

Several managers who have gone through the process of setting up an air deposition assessment
say it is critical to have at least one point-person you always go to first. This person should be part of
the team, well-respected by the rest of the group and by you, and available to offer advice regularly.
A point-person is indispensable in answering the "stupid" questions (which usually aren't) and
generally provides both technical and moral support as you work your way up the learning curve.
23
-------
VI. How Do I Design an Air Deposition Assessment Strategy?
The most important question to ask yourself when designing an assessment strategy is "What question am
I trying to answer?" What to monitor for, what type of monitoring equipment to choose, where it is
placed, how often samples are collected, how deposition data are coordinated with water quality data, and
whether models are used (and if so, which one(s) and how) are all dependent on what question(s) need to
be answered. It is also important to remember the "assessment" part of
the project. To answer any questions, you must dedicate sufficient time
and resources to interpret or analyze the data collected. This is not a
negligible cost. Most atmospheric deposition studies have to dedicate
30% of the budget to data analysis. The advisory group will help design
the strategy, but here are some things to keep in mind as you go through
that process.
• What questions need to be answered?
• What information is needed?
• What else should be considered?
What Questions Need to be Answered?
When thinking about the questions that need to be
answered, you need to look ahead to how you plan
to use the information gathered. It is likely that
most of the users of this handbook want to gather
information that can be used in making manage-
ment decisions, rather than primarily for basic
research. Some of the typical questions are
• How important is atmospheric deposition of a
particular pollutant compared to other sources?
• How does it affect the bay/estuary/lake?
• Are there biological or ecological effects?
• How much deposition is falling on the
watershed and ending up in the bay/estuary/
lake?
• How much is coming from in-state sources
versus out-of-state sources?
• How much is coming from a source or source
category (e.g., utilities, certain agricultural
practices, pulp and paper mills, automobiles,
etc.)?
• How much is coming from a single source
(e.g., a particular industry or animal-feeding
operation upwind)?
• What do I want the data to do? (Objectives
defined, along with statistical uncertainty.)
• What degree of certainty in the answer is
required for decision makers?
The first step is to clearly identify your specific
objectives and prioritize the question(s) that need to
be answered. There may be more than one. You also
need to ask yourself how much certainty you need
in the results to make decisions or to support
various management approaches you may use.
These questions will help the partners and experts
focus on the problem at hand (and design a better
assessment strategy). They will guide decisionmak-
ing throughout the study and will help assess the
results of the study and determine if it has been
successful. That is not to say the study won't
uncover questions you didn't know you had or
open up paths that hadn't been thought of before.
But if you need answers to specific questions, you'd
better make sure you have them when the study is
24
-------
complete (or a good explanation of why you don't).
It is a good idea to prioritize the questions that need
to be answered so strategic decisions can be made if
it is not possible to answer all the questions given
the time and/or resources available.
The quality of the data to be collected must be
defined during the design stage. A confidence
interval or some other statistical parameter by
which data quality can be assessed should be devel-
oped. The advisory board will prove useful, if not
essential, in this effort.
What Information is Needed?
The information needed depends on the question(s)
that need to be answered and the tools used to
answer them. Details on different methods of
atmospheric deposition assessment are in the
sections on What You Need to Know About Air
Deposition Monitoring and What You Need to
Know About Air Deposition Modeling. Here is a
short discussion of what types of information are
needed to get particular kinds of data.
Wet Deposition Rates. Wet deposition rates are
the easiest to measure directly because wet deposi-
tion can be measured with a precipitation sampler.
Several wet deposition collectors are commercially
available that open and close automatically. For
more information on how to monitor wet deposi-
tion, see page 34.
Dry Deposition Rates. Estimating dry deposition
is more complicated. As noted earlier, dry
deposition depends on many factors, including
meteorological conditions, characteristics of the
pollutants being deposited (e.g., particle size), and
characteristics of the surface on which the
deposition occurs. There are several different types
of dry deposition methods to choose from (see
page 34 for descriptions). The data from each type
must be converted into deposition rates using a
modeled deposition velocity or one taken from the
literature. To accurately calculate deposition rates,
detailed meteorological data must be collected on
wind speed and direction, as well as temperature
and humidity at a specific reference height above
the ground. This is usually several meters off the
ground, so dry deposition sites are also called
towers.
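As a rough illustration of the arithmetic involved, the sketch below converts an ambient air concentration and a deposition velocity into an annual dry deposition rate. It is a minimal example, not a substitute for the models discussed later; the concentration and velocity values are hypothetical placeholders, and a real assessment would use site-appropriate velocities from Appendix 2, the literature, or a model.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def dry_deposition_rate_kg_ha_yr(concentration_ug_m3, velocity_cm_s):
    """Annual dry deposition rate from flux = concentration x deposition velocity."""
    velocity_m_s = velocity_cm_s / 100.0                 # cm/s -> m/s
    flux_ug_m2_s = concentration_ug_m3 * velocity_m_s    # micrograms per square meter per second
    ug_m2_yr = flux_ug_m2_s * SECONDS_PER_YEAR
    return ug_m2_yr * 1e-5                               # 1 ug/m2 = 1e-5 kg/ha

# Hypothetical inputs: 1.5 ug/m3 of a gaseous pollutant with a 0.4 cm/s deposition velocity.
print(round(dry_deposition_rate_kg_ha_yr(1.5, 0.4), 1), "kg/ha/yr")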
Indirect Deposition Load. To determine indirect
deposition load, you need to know not only how
much is deposited to the watershed, but also how
much of the deposited pollutant reaches the
waterbody of concern via surface runoff or ground-
water. The proportion of pollutant retained versus
the proportion transported (transmission coeffi-
cient) is estimated either from a set of runoff
coefficients or a watershed transport model. These
values vary greatly for each pollutant and each
watershed. They are influenced by land use, soil
type and permeability, vegetation, slope, and stream
density, depth, temperature, and discharge. For
more information on how to calculate indirect
deposition loads, see page 38.
Percentage of Load Due to Atmospheric Deposi-
tion. To know the percentage of load due to
atmospheric deposition, you need both an estimate
of the load due to atmospheric deposition and
estimates of the loads due to other pathways. The
estimate will only be as accurate as the estimated
loading rates from the various pathways. Depending
The Microlayer: The microlayer is the thin (microns—millionths of a meter—thick) surface layer of water.
Pollutants that are hydrophobic (don't mix well with water) tend to collect in the microlayer. Oil forming on
the surface of the water is an example of this. This means that animals who live in the micro-
layer, or who eat food from the microlayer, are exposed to far higher concentrations of hydrophobic pollutants
than those who do not. Therefore, in order to fully understand the ecological impact of hydrophobic pollutants,
you may need to study the microlayer separately from the rest of the water column. If the microlayer is thick and the
waterbody shallow, then the microlayer can play a significant role in the processes of deposition and
revolatilization. However, some scientists believe that for deeper waterbodies the microlayer is not a large enough reservoir to
be important. Some hydrophobic pollutants include PAHs, PCBs, organic metal compounds, dioxins/furans,
and pesticides/herbicides.
25
-------
on your information needs, you may group these
other loads into categories (e.g., point sources and
nonpoint sources) or break them into more specific
categories (e.g., wastewater treatment plants or
stormwater runoff).
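A minimal sketch of that bookkeeping is shown below: loads are grouped by pathway and the atmospheric share of the total is reported. The pathway categories and loading values are hypothetical placeholders, not data from any particular waterbody.

loads_kg_per_yr = {
    "atmospheric (direct + indirect)": 36_000,   # hypothetical values
    "point sources": 90_000,
    "other nonpoint sources": 44_000,
}

total = sum(loads_kg_per_yr.values())
for pathway, load in loads_kg_per_yr.items():
    share = 100.0 * load / total
    print(f"{pathway}: {load:,} kg/yr ({share:.0f}% of total load)")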
Source Attribution. Once the total deposition rate
is known, the process of identifying sources can
begin. This is done in one of two ways: by using
some sort of tracer or by modeling. To use
tracers, samples must be analyzed for the tracer(s)
used and the unique chemical composition or
fingerprint of a source or source category must be
known. To do back-trajectory modeling related to
deposition, you need deposition data collected over
a day or less and access to meteorological data
temporally resolved on a short-term basis. To do
source-receptor modeling, you need sophisticated
meteorological data sets and complete emissions
inventories. For more information on source
attribution see page 57.
Ecological Impacts of Deposition. This is compli-
cated for several reasons. One is that the effects of
atmospherically deposited pollutants are not easy to
separate from the effects of the same pollutants
coming from other sources. The second is that
many potential environmental effects and indicators
can be measured. Another complication is differences
in waterbodies: the same level of a pollutant can
produce a greatly different response in one system
than in another.
Some of the common environmental effects are
lake acidification (for sulfate and nitrogen
deposition), fish/bird tissue loads and microlayer
assays (for toxic bioaccumulating pollutants), forest
health (both tree and soil), and symptoms of
eutrophication. In addition to measuring specific
ecological indicators, it is necessary to know the
percentage of the total pollutant load that comes
from atmospheric sources and possible (or even
proven!) mechanisms for the atmospheric load to
cause the observed ecological effects.
Types of data ideally needed in some aspect of
atmospheric deposition studies include:
• Wet deposition rates
• Dry deposition rates
• Ambient air quality data and deposition
velocities (to calculate the dry deposition rate)
• Meteorological data (rainfall amount daily or
weekly, wind speed, wind direction)
• Good inventories of sources that emit pollut-
ants of concern locally, regionally, and perhaps
nationally. An ideal inventory should include all
sources, emission heights, speciation of emis-
sions, rates of emissions, the exit velocity, and
the stack gas temperature.
• Sophisticated meteorological data sets (to input
into transport models)
• Watershed transport ratios or models
• Loading rates from sources other than
atmospheric sources
• Emissions chemical "fingerprints"
• Ecological data showing impacts of atmospheric
deposition.
It is important to note that while this is a list of
ideal data, it is highly unlikely you will have all of
it. This list should be considered a goal, not the
bare minimum you need to know before any
decisions can be made.
What Else Should be Considered?
Do I Monitor or Model First?
Most studies monitor first. Unless there is already a
good set of data to put into the model you want to
use, that is a good example to follow. You should
know what kind of modeling you plan to do before
you begin monitoring, however.
26
-------
Modeling and monitoring are really two parts of an
iterative process. Generally there will already be
some monitoring or modeling done in the water-
shed before your project begins, so part of the
decision is based on what data are already available.
Many assessment strategies monitor first and use
modeling to fill in gaps in monitoring data, identify
sources, and make predictions about what emissions
reductions are needed. Some of the modeling may
be already done on a national scale, so check with
your state air quality agency and the EPA Air-Water
Coordination group or the Great Waters Program
to see what is available or "in the works." (For a list
of federal contacts see the Resources section on
page 73; for state contacts see the EPA Aerometric
Information Retrieval System database.)
Air deposition monitoring and modeling are best
thought of as complementary strategies that,
together, can provide a large amount of information
for managers and for the public about the sources
and importance of atmospheric deposition in a
watershed. For this to happen, however, monitoring
and modeling researchers need to communicate
their needs to one another and understand the
strengths and limitations of each approach. Any
manager who uses monitoring and modeling data
must, in turn, be clear about how they plan to use
Monitoring networks use techniques such as
kriging or a least-squares fit to estimate variations
in wet deposition over the areas between
monitoring sites. The results are often turned into
isopleth maps, which show contours of deposition
amounts or concentrations that look similar to
topographic maps. However, unlike topographic
maps, which present elevations that actually exist
on the ground, these maps are interpolations
of deposition between monitoring
sites. This is the technique used to create the
nationwide deposition maps NADP produces from
its 200+ sites. It is really only reliable over regions
with many wet deposition monitoring sites. It
cannot be used for dry deposition because dry
deposition rates are dependent on the
immediate meteorology. An example of an
ammonium deposition isopleth map (NADP's 1999
map) is on page 29.
it and what kinds of questions they need answered.
A given model may be used to answer some ques-
tions, but it usually cannot do all of them. The
choice of which model(s) to use—or the decision to
create a new one—will be guided in large part by
what questions need to be answered.
If you think of monitoring and modeling as
complementary techniques and if you know what
data you have and what data the model(s) you
might use require, it will often be obvious what to
do first. Your advisory group should be able to help
you make the final decision.
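The interpolation described in the box above can be sketched in a few lines of code. The example below uses simple inverse-distance weighting as a stand-in for kriging or a least-squares surface fit; the site coordinates and wet deposition values are hypothetical, and a real isopleth map would be contoured from a much denser network of sites.

import numpy as np

sites = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 40.0]])   # site locations (km), hypothetical
wet_dep = np.array([6.0, 9.5, 7.2])                          # measured wet deposition, kg/ha/yr

def estimate(point, power=2.0):
    """Inverse-distance-weighted estimate of deposition at an unmonitored point."""
    dist = np.linalg.norm(sites - point, axis=1)
    if dist.min() < 1e-9:                    # the point coincides with a monitoring site
        return float(wet_dep[dist.argmin()])
    weights = 1.0 / dist**power
    return float(np.sum(weights * wet_dep) / np.sum(weights))

# Estimates over a coarse grid; contouring values like these yields an isopleth-style map.
for x in range(0, 60, 20):
    for y in range(0, 60, 20):
        print((x, y), round(estimate(np.array([x, y], dtype=float)), 2))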
Should I Coordinate with Existing Air Monitoring Sites?
For practical purposes, the more you can coordinate
the better. Existing air monitoring stations already
have power and security, there is access to the site,
and there may be some equipment, particularly
meteorological equipment, that you can share
instead of buying your own. Perhaps most impor-
tantly, coordinating almost always saves operator
time (and therefore costs) for sample collection and
site maintenance.
It is, however, not essential to collocate air deposi-
tion monitoring with either existing air or water
quality monitoring sites. Generally it is easier to
compare results if the sites are relatively close by,
but that is not a good enough reason to locate an air
deposition site in a particular location. Develop a
set of criteria for the air deposition site based on the
questions you need to answer and carefully evaluate
whether an existing site will meet those criteria. If
it will, go ahead and collocate. But if it will
not, locate the air deposition site in an area that will
allow you to answer the key questions.
Should I Join an Existing Monitoring Network?
The decision to join an existing monitoring net-
work or create an independent site or network of
sites depends (once again) largely on the question(s)
the data need to answer. Generally, national net-
works are designed to measure deposition rates on a
large scale. They are not designed to show differ-
ences in deposition rates over small areas (although
-------
sometimes they can), show the impacts of particular
sources on a particular area, or identify sources.
They do show long-term trends of "regional"
deposition rates and are important databases for
scientists studying the effects of deposition. For
example, NADP data clearly show the gradient in
deposition from the Midwest that spreads north
and east into New York and northern New
England. They also clearly show the impacts of the
1990 CAA amendments (CAAA) on reductions in
sulfate deposition in New England and New York.
NADP sites do not indicate which particular
sources affected by that legislation caused the
original deposition or the decrease. AIRMoN sites
can be used in back-trajectory analyses to identify
source regions, but may not be adequate for all
source identification needs.
National network sites are very important because
they allow us to see the big picture on a national
scale and identify how local areas and regions fit
into that big picture. They are usually not good at
differentiating deposition rates over a small area,
although they can in some instances. Many local
managers have discovered that they end up needing
both national network sites and independent sites
over the course of a many-year study to answer all
their questions.
A benefit of national networks is that sampling,
analyses, and quality assurance are all done uni-
formly, in accord with established methods, which
makes it easier to compare data between sites or
geographic regions. Networks can also provide a
source of expertise and can provide a forum to
exchange ideas among experts in the field. The
national networks ask for a minimum five-year
commitment to maintain the site and collect
samples. This requirement ensures that inter-annual
variation can be measured. However, this commit-
ment is a cost consideration. These networks were
discussed in general on pages 14-19. More detail is
provided below, such as estimated site operation
costs.
To join the NADP-NTN or -MDN, the site needs
to be approved by the NADP. This includes agree-
ment that the network needs a site in the general
location of interest. (Because the NADP strives to
get regional representation of deposition rates, it is
not interested in having sites clumped together.)
Once the NADP agrees that a site in the general
area is needed, the actual site location, maintenance
plan, and quality assurance plan require approval
based on NADP criteria. This involves submitting
the paperwork requested by the NADP. For
examples of NADP siting criteria, see Appendix 4.
The NADP does not provide funding to install or
maintain the site, collect samples, or analyze
samples. The NADP charges a small fee for pro-
gram coordination that includes data analysis/
interpretation and the production of national maps
showing spatial variability of deposition. The
analysis does not include local watershed-scale data
analysis and, by itself, is not useful for most source
identification. NADP sites can be used in some
situations as reference sites to show lower deposi-
tion rates upwind of major sources. The NTN sites
measure wet deposition of a handful of pollutants
(sulfate, nitrate, orthophosphate, ammonium,
calcium, magnesium, potassium, sodium, and pH).
The MDN sites measure total wet mercury deposi-
tion or, for additional sample analysis fees, wet
methylmercury deposition.
In 1999, the first-year costs (buying the equipment
and a year's worth of monitoring) for the NTN
cost approximately $17,000. This does not include
the actual installation costs (digging the hole in the
ground and installing the monitor and rain gauge).
These site-specific costs can vary significantly. Each
additional year costs approximately $15,000 (for
site operation, labor, electricity, and sample analy-
sis). MDN sites are more expensive than the basic
NTN sites because the equipment has to be modi-
fied before sample collection can begin, and mer-
cury analysis is expensive. In 1999, the first-year
costs (buying the equipment and a year's worth of
sampling) were approximately $21,000. Annual
operating (sampling and analysis) costs are approxi-
mately $12,000, and more if methylmercury
samples are analyzed. Samples are collected weekly
(9 am on Tuesday) by operators at each NTN and
MDN site and shipped to a single laboratory (one
for NTN and another for MDN) for analysis.
28
-------
[Figure: Isopleth map of ammonium ion wet deposition based on NADP monitoring data, 1999.]
29
-------
AIRMoN, the smallest subnetwork of the NADP,
is sponsored by the Air Resources Laboratory of
NOAA. The first sites were installed in 1992, but
additional sites can still be added to the network.
New sites are approved by the NOAA Air
Resources Lab. AIRMoN consists of about 22 sites
and measures a range of pollutants (for both wet
and dry deposition) on a daily (instead of weekly)
basis. The network also refrigerates samples to
preserve them (which NADP-NTN does not). The
pollutants measured (wet, dry, or both) include
nitrogen oxides (NOx) and sulfur dioxide (SO2).
This allows AIRMoN data to be used for back-
trajectory analysis (a source attribution method)
because data can be matched with meteorological
patterns to identify source regions of the pollutants.
This is not possible with weekly samples since the
wind shifts so many times during a seven-day
period. A typical AIRMoN wet site costs
approximately $28,000 for the first year (excluding
installation costs). Each additional year costs
approximately $20,000, primarily for sample
analysis. A typical AIRMoN dry site costs
approximately $35,000 for the first year and
$30,000 to run for each additional year.
The other national dry deposition network is
CASTNet, which was established in 1987 to
determine spatial patterns and geographic trends in
air pollution (and to measure the effectiveness of
the CAAA of 1990). There are approximately 80
CASTNet sites in the country. CASTNet sites use
filterpacks to collect ambient air samples, and
deposition velocities are calculated using the Multi-
Layer Model. Deposition rates are then calculated
from the ambient air samples and the deposition
velocity. Various pollutants are measured at each
site, but there is the capability to measure ambient
gaseous nitric acid and sulfur dioxide; particulate
sulfate, nitrate, and ammonium; and base cations
(potassium, calcium, etc.).
CASTNet sites are expensive to operate because of
the large number of pollutants being analyzed and
the process required to analyze dry deposition. In
1999, the first year costs were approximately
$78,000. Each additional year costs approximately
$42,000. These costs include installation (roughly
$30,000), operating site costs ($10,000 and up
annually), and annual analysis ($38,000 to
$43,000).
30
-------
VII. What You Need to Know About
Air Deposition Monitoring
Although the concept of air deposition monitoring is pretty simple, the reality is rather complicated. There
are a variety of methods to choose from and a laundry list of things to watch out for, as well as dozens of
decisions to make about all the details. This section lays out many of the options you have to choose from
and issues you have to consider, outlines background information on the pros and cons of different
methods of monitoring, and provides estimates of the scale of monitoring you can achieve given a certain
amount of resources.
A word of caution about monitoring is probably appropriate here. Air deposition monitoring data, like any
other type of monitoring data, are only as good as the monitoring design allows them to be. In other
words, there are dozens of reasons why the results you get may not be really representative of what
deposition is happening now, what has happened, or what generally occurs in the watershed. Monitoring
sites should be chosen carefully and maintained for as long as possible to minimize these problems, but you
should always ask yourself (as with any other monitoring effort) how accurate and representative your data
need to be for their intended purpose.
This section contains information on
Station setup
Deposition monitoring
Estimating the indirect deposition load
Uncertainty, errors, and quality assurance
How much data can I get for $15,000, $50,000,
$400,000 a year?
How much monitoring is enough?
Station Setup
Picking a good site is one of the most important
things to do when designing a monitoring strategy.
A monitoring site must meet the goals of the
study—it must be placed to answer the questions
that need to be answered—and the data it produces
must be scientifically defensible.
If you can place just one monitoring site, it should
almost always be where it can measure "regional"
deposition. That is, it should measure some sort of
average of what happens in the area, not "hotspots"
from particular sources. NADP sites are regional
sites; following their site location criteria will give
you a good estimate of representative deposition
rates. A copy of the NADP site criteria is included
in Appendix 4. Dry deposition sites also are located
to get regionally representative results, but this is
much more difficult to do than it is for wet
deposition sites because of the acute sensitivity of
dry deposition to the surrounding landscape.
If you can put out two or more sites, things get
more complicated. Often two sites are organized in
an "upwind, downwind" system where one site is
upwind of suspected sources (often well inland) and
31
-------
How to Distribute Monitoring Sites

The general rule of thumb is to balance temporal
coverage (lots of data points at one site
over a long time) against spatial coverage (data
points from many different sites). Temporal coverage
allows more results to be
concluded for a particular point; but without
spatial coverage it is not possible to know how
representative those results are for the entire
watershed. Some networks use a master-satellite
approach to get the best of both worlds. In this
approach, there is one master
station where deposition is measured frequently
and satellite stations spread out through the
watershed where deposition is measured less frequently.
The master station can be used to collect
speciation data that would be too expensive to
collect at all the sites. An example of a master-
satellite network is IADN (see page 18 for more
information on IADN).

The decision of where to locate sites depends on
the question(s) to be answered. For example, if a
particular source is suspected and the study
would like to confirm its significance, sites should
be placed upwind and downwind of it, and the results
compared. If the goal is to accurately measure the
load reaching an estuary or lake, the sites
should be geographically spread out around
the waterbody of concern, and none of them should be
directly downwind of a single source. Sites should
be placed in "hotspots" of especially
high deposition only if they are what you
want to specifically characterize.
the other is downwind on or near the coast. With
additional resources, sites can also be set up upwind
of other source areas or located close to large
suspected sources in an attempt to characterize the
influence of a particular source. Often local or
regional networks with multiple sites do not
measure every pollutant at every site; the samples
are strategically analyzed based on either initial
measurements or the researchers' best understanding
of what pollutants are important to measure where.
In that situation, unanalyzed samples are usually
stored for future analysis for additional pollutants at
a later time, if necessary.
Every site requires some type of deposition sampler
and some type of equipment to measure
meteorological data. Dry deposition sites require
more sophisticated meteorological equipment than
wet sites. When choosing site location, think about
access (hard for other people, not extremely
difficult for you), availability of power (if you can
go with battery- and/or solar-powered equipment,
great; if not, you'll have to bring power in or
choose a site where it already exists), and security.
Almost all sites have fencing around them or are
located on rooftops or in some other access-
restricted area such as an Army base. These practical
concerns are as important as choosing a site without
interference from nearby sources or objects and that
is the right distance and direction from suspected
sources.
Sampling on islands or from boats is attractive
because this is often as close as you can get to
measuring deposition rates over water. Another
advantage of sampling from boats or ships is that,
due to their mobility, they can be used to sample at
various locations around the waterbody. Few
sampling programs end up collecting samples from
either boats or ships, however, because the logistics
are so difficult. Sampling from boats, in addition to
being very costly, also has the drawback that
emissions from the boat can contaminate the
samples, and the boat disturbs the air flow around
it, making it difficult to sample ambient air. This
seriously compromises the accuracy of the
measurements. If you decide to sample from boats,
make sure your quality assurance plan addresses
how you will avoid contamination from the boat
exhaust and avoid interference from the boat
structure itself. Sometimes samplers are set up on
buoys and the samples collected by boat (but not
from a boat), but this also makes access (and getting
adequate power supply!) difficult and has the
potential to contaminate samples.
Several sampling frequencies are regularly used (e.g.,
12-hour, daily, or weekly). Which one is chosen has
a large impact on what the data can be used for. It
also has a large impact on how much it costs to do
the monitoring. Although the cost of equipment
32
-------
for different methods varies dramatically, in general,
the more samples that have to be analyzed, the
more expensive the monitoring.
Longer sampling frequencies (weekly/monthly) are
usually adequate for measuring deposition over a
longer term, while shorter frequencies (daily/12-
hourly) are required for determining depositional
processes and delineating emission sources.
Twelve-hour sampling is recommended by many
experts to get accurate dry deposition measure-
ments. Temperature and humidity changes have a
significant influence on measurements; therefore,
samples can become altered while waiting for
analysis. Twelve-hour sampling minimizes the
temperature and humidity changes that any
particular sample undergoes. This is often done by
having one filterpack or denuder switch on to
collect a sample during the day and another switch
on to collect a sample at night. This can be done
with a series of denuders or filterpacks for several
days.
Daily sampling can be done for wet or dry
samples. Daily sampling allows the data to be used
in back-trajectory analyses (a method of matching
wind direction and pollutant load to help identify
sources). This also tends to be the most accurate for
wet deposition samples because there is the least
opportunity for any particular sample to be
contaminated or otherwise altered. AIRMoN uses a
daily sampling frequency.
Weekly sampling is probably most common for
both wet and dry deposition. Weekly samples
cannot be used for back-trajectory analyses (because
the wind direction shifts frequently over that
period). NADP-NTN, MDN, and CASTNet use
a weekly sampling frequency.
Integrated samples are used to measure ambient
air concentrations to support dry deposition
calculations. In integrated sampling, air samples are
collected on a filter (particles) or reactive/absorbing
medium (gases) and subsequently taken to a
laboratory for analysis. Samplers are set out for
anywhere from a few days to several months, and
the data are averaged over that period. This leaves a
large amount of time for samples to be contami-
nated or altered by humidity and temperature
changes. This may affect how well deposition
velocities (and therefore deposition rates) can be
calculated. It is done regularly anyway to minimize
the cost of sample analysis and, in some cases,
because shorter sampling frequencies do not collect
enough trace pollutants (such as dioxins/furans) to
be accurately measured. As samplers become better
at detecting very small amounts of pollutants, this
technique will probably be used less and less because
of the issues with contamination and alteration. It
should be emphasized that although pollutant levels
may be low enough that samplers must stay active
for a long period of time, the small amount of
pollution measured may cause significant water
quality impacts.
In situ continuous/semi-continuous samplers
collect and analyze air samples at the sampler
location at very small time intervals (such as 5
seconds or 15 minutes) and store the data until they
are retrieved by an operator. Typically, data are
aggregated to one-hour reporting periods for
interpretation and comparison with predictive
models. An assortment of techniques is available to
capture gases and particle-bound ions, metals, and
carbon. Sampling at small time intervals is good for
comparing data with results from an airplane study
or to measure extremely precise differences in
meteorology and air concentration over a short
period of time.
There are only a handful of ways to measure
atmospheric deposition. Which one you choose
depends to a large extent on what question needs to
be answered, what needs to be measured, what
assumptions the advisory group is most comfort-
able with, and the resources available. The
preferences of the scientist(s) actually doing the
monitoring are also important.
Measuring atmospheric deposition is not a simple
process. "Clean techniques" are especially important
33
-------
for toxic pollutants, which are often (but not
always) measured in small quantities. All staff or
volunteers who collect atmospheric deposition
samples should have some basic training, and those
collecting dry deposition samples should be
especially well prepared. NADP provides good
training materials for wet deposition monitoring
methods, but dry deposition training often must be
on an individual basis by researchers or experienced
field staff. Don't pinch pennies with training. Many
data sets have been shown to be virtually worthless
because proper sampling and handling techniques
were not observed or analytical techniques were
employed that were not sensitive enough.
Wet deposition is measured by collecting rain and
snow. Almost any pollutant can be measured using
this method, and some isotopes of some pollutants
can also be measured (for more information on
isotope analysis see page 61). The basic equipment
is a collector, such as a bucket, tray, or funnel
connected to a bottle. A collector typically has an
automated cover that keeps dry deposition and
debris out when it is not raining and slides away
from the collector to uncover it when it is raining.
Snowfall also triggers the collector to be uncovered,
so deposition in snowfall is measured. However, the
capture efficiency for snowfall may be poor due to
the aerodynamics of trying to capture blowing
snow. It is also possible for something else, like bird
droppings, to trigger the collector to open. This can
contaminate the sample. Sometimes samples fall as
rain, but freeze before they are collected, which
complicates the analysis. Special handling proce-
dures, materials, and other considerations are
required for collecting samples for metals, organic
compounds, or isotope analysis.
Regardless of whether they measure regional
deposition or hotspots, it makes sense for wet
deposition sites to follow the local NADP siting
criteria (located in an open area where trees and
buildings will have a minimum amount of
interference). Sites measuring regional deposition
rates should also not be too close (either upwind or
downwind) to any major sources. (The regional
34
NADP siting criteria address these issues. See
Appendix 4.)
Dry deposition can be measured in several ways:
1) collecting dry particles and gases on some sort of
surface (surrogate surfaces method), 2) measuring
the amount of dry particles and gases in the air and
calculating a deposition rate (ambient air sampling),
and 3) measuring deposition at a specific location
with a dry collector. See the section on Resources
for references on dry deposition sampling.
Ambient air sampling methods are considered the
most accurate by many researchers. In these systems,
the deposition rate is calculated by models based on
an equation that includes a deposition velocity and
an air concentration. Unless the monitoring is
research-grade, use deposition velocities from the
literature or a commonly accepted model in lieu of
site-specific data.
A table of some dry deposition velocities from the
literature is provided in Appendix 2. For more
sophisticated estimates, check with experts for the
best rate to use.
Measuring dry deposition also requires collecting
meteorological data. This must be done at the same
interval the model requires. The data you need to
collect usually include wind speed, wind direction,
humidity, temperature, solar radiation, and rainfall.
The deposition rate also varies depending on the
characteristics of the surface the pollutant falls on,
although these differences are usually estimated
rather than measured. For example, dry deposition
rates are very different over a parking lot than over a
forest of broadleaf trees. Similarly, they can differ
over a waterbody and its adjacent shoreline. The key
is to know how you will calculate deposition rates
before setting up the station so the appropriate data
are collected.
Dry deposition sites are very difficult to locate well.
This is because the samplers are highly sensitive to
local meteorology and the surrounding surfaces.
Buildings divert wind much the same way rocks in
a stream disturb the smooth flow of water. The
type of surface also changes the way the wind flows.
-------
The optimum dry deposition site has a long
uniform fetch (smooth open area over which wind
can blow) and is not close to any known sources.
Since that does not happen very often in real life,
pick the best site you can and accept its limitations
in the data interpretation.
From a practical point of view it is recommended
that you collocate dry deposition samplers with wet
deposition samplers. This minimizes the headaches
involved with setting up and maintaining sites and
buying extra rain gauges. Collocated samplers also
provide a more complete picture at one location
which is sometimes more useful than half the
picture at two different locations.
Filterpacks. Filterpacks are basically systems in
which air is pulled through a series of filters.
Pollutants collect on the filters based on their size
and chemical characteristics. Filterpacks collect all
particle sizes through an "open" inlet. Many particle
samplers, like those used for regulatory purposes,
collect specific particle size fractions, such as
particulate matter up to 2.5 microns in diameter
(PM2.5) and particulate matter up to 10 microns in
diameter (PM10). Ideally, one benefit of using this
method for toxic deposition monitoring is that the
samples can be easily and cheaply analyzed for a
large number of pollutants. This method also can
provide data useful for characterizing the signature
of a particular source or source category when using
the chemical mass balance method of source
identification. For more information on chemical
mass balance sampling, see page 59. Filterpacks are
attractive because they are relatively cheap compared
to other types of active samplers (denuders and
dichotomous samplers). The downside of
filterpacks is that they are often put out for long
periods of time (more than one week). The night/
day cooling and heating cycle can cause the ratio of
gases and particles collected in the system to change.
This is fine for measuring the total amount of
pollutant, but severely compromises your ability to
measure deposition velocity (because gases and
particles fall at different rates). CASTNet, however,
uses them on a weekly basis. If filterpacks are used
for short periods of time (12 or 24 hours), they can
be highly effective (and cheap) measurement tools.
Gas Trap Samplers. Gas trap samplers are very
similar to filterpacks except that they are specifically
designed to capture semi-volatile organic pollutants.
Examples of pollutants for which these would be
used include PAHs, PCBs, and dioxins/furans. The
sample is pulled through a filter and then through a
tube containing some sort of polyurethane or
sorbent resin material (some use a material very
similar to foam seat cushioning!) that acts as a filter
to capture the organic pollutants. The pollutants are
extracted from the material in the tube and the
filter in the lab. These systems can be highly
accurate for measuring small amounts of organic
pollutants if proper clean field techniques are used.
They are not cheap, however; and technical
assistance with them is required to select the
appropriate materials for the pollutant of concern.
Denuders. Typically, denuders are used to separate
gas phase chemicals from those bound in
paniculate. In this system, air is pulled through
tubes coated with a chemical that will "stick" to the
pollutant being measured and filters to catch
additional pollutants. The order of the denuders
and filters is important because different pollutants
are captured by each component. Some denuders
have a device to sort particles on the front called a
cyclone that blocks large particles from entering the
tube. It was originally designed to keep soot and
other large particles out of the denuders to prevent
contamination. Evidence is mounting, however,
that the device also keeps large nitrogen particles
out. This is a problem only in coastal areas where
sea salt in the air strips nitric acid and sulfates from
the air and forms large particles of sodium nitrate
(NaNO3). These particles get caught on the cyclone
and are not counted in the measured nitrogen load.
Therefore, denuders probably underestimate
nitrogen deposition in coastal areas. This is thought
to be a solvable problem, but no precise method
has yet emerged as a standard way to solve it. Even
with this limitation, denuders are considered by
many experts to be the best samplers to measure dry
deposition of nitrate and ammonia. However, they
are significantly more expensive than filterpacks.
The specific chemicals used on the denuder and the
order of denuder tubes and filters will vary
depending on what you are trying to measure and
35
-------
the chemistry of the air in your area. So this is yet
another place where help from technical experts will
be necessary to make the technology meet your
specific needs.
Dichotomous samplers (dicots). A dichotomous
sampler measures the amount of particles of
different sizes, but does not differentiate between
different types of pollutants in the air. This is
helpful in some circumstances where data from a
denuder or filterpack do not clearly indicate what
form of the pollutants are being deposited. Since
the size of a particle is a key factor in how quickly it
deposits, using a dicot with a filterpack or denuder
often allows you to estimate more accurate dry
deposition rates. Often dicots are set to differentiate
particles smaller than 2.5 microns from those that
are larger. This breakoff point is commonly used in
air sample analysis because particles smaller than
approximately 2.5 microns behave very differently
(more like gases) than those that are larger.
Continuous air quality samplers. Continuous air
quality samplers may do what you need either on
their own or with some modifications. Generally
they collect ozone precursors (NOx and VOCs),
SO2, and particulate matter. They are common and
easy to obtain. Existing continuous air sampler data
may already be available through the NAMS/
SLAMS/SPMS network or from NOx, SOx, or
PM monitoring stations already in place in many
urban areas and national parks. The ambient air
quality data require significant translations to get
estimated deposition rates. It is possible to do this
as long as all the other data needed (deposition
velocity, particle size, meteorology) are also
collected.
Dry bucket/tray/funnels. This design attempts to
measure dry deposition directly by doing the
opposite of the wet deposition sampler: the
collector is covered when it rains. Because the
collector is open for long periods of time, it is
highly susceptible to contamination from
windblown dust or debris. The collector tends to
over-collect large particles and under-collect small
particles or gases. The deposition surface is also not
very similar to any situation found in nature. For
these reasons, particularly the concerns about the
"unnaturalness" of the surface, data from these
samplers are considered inaccurate by most
researchers.
Wet bucket/tray/funnels. This is the same concept
as the dry collector, except there is water in the
collector. The purpose of this design is to simulate
deposition to a body of water (rather than a dry
surface). It has the same problems with contamina-
tion as the dry design, but it does have a somewhat
more realistic deposition surface (the water).
However, the deposition of certain gaseous
pollutants (such as NH3, SO2, and Hg°) to the
water surface strongly depends on the pH of the
water. It is difficult to maintain pH conditions in
the collector similar to those that would be found
in the waterbody. It is still considered highly
inaccurate by most researchers.
Surrogate surfaces. Surrogate surfaces are a
variation of the dry collector design where the
deposition sampling surface is constructed to
resemble a natural surface. Unfortunately, it is
extremely difficult (some say impossible) to
construct a surface that resembles a natural surface
(let alone lots of different kinds of natural surfaces).
Many different surfaces can be used, but usually
they are "distressed" in some way to provide an
uneven surface (instead of the artificially smooth
surface of most man-made materials). While this
sample design appears to have potential, it also can
be contaminated very easily. Some scientists believe
it may be useful for larger particle species, but less
so for smaller particles or gases. Therefore, many
(but not all) experts are highly skeptical of data
coming from these types of sampling devices.
Passive samplers. Passive samplers rely on passive
diffusion to trap a gaseous species on an impreg-
nated filter. No air is pulled through the sampler.
One passive sampler design is a badge sampler,
which looks like a small petri dish made out of
nylon-like material. There is a top filter/screen to
keep out larger particles and an internal filter
saturated with different compounds depending on
the type of pollutant being measured. The passive
sampler is placed in the field in a location protected
36
-------
from rain (under some type of roof) for anywhere
from a few days to a few weeks.
These samplers are attractive because they are
extremely cheap, and a watershed could be
blanketed with them for relatively little money.
They do have to be referenced to other active
samplers (filterpacks or denuders), however; and
some researchers still consider them inaccurate.
Some studies have been able to develop correlations
between badge samplers and other types of dry
deposition sampling. They are probably most useful
as a scoping method to identify deposition
"hotspot" areas. Once the badge samplers have
identified those "hotspots," another sampling
method should be used to quantify the actual
deposition load accurately. They are less accurate for
low rates of deposition than for intermediate rates;
to measure very high deposition rates just leave
them out for a shorter period of time.
As you surely noticed, none of these methods of
measuring dry deposition is as accurate as scientists
would like; and there is a pressing need to develop
better (and cheaper!) methods. This is not to say
that scientists don't know anything about dry
deposition; just that cost-effective methods in many
cases have not kept up with the science.
Inferred dry deposition. Because the technology
for measuring dry deposition has not kept up with
science or management needs, and because the more
accurate sampling systems are expensive, dry
deposition of a few pollutants is sometimes simply
inferred from wet deposition. This crude estimate
assumes that dry deposition is equal to wet
deposition, i.e., total deposition equals wet
deposition x 2. This ratio comes from some initial
measurements made in a few locations on the east
coast for nitrogen (nitrate) and mercury, and
appears to be holding up well as newer research
confirms that it is a reasonable estimate. It probably
works well for annual averages of nitrogen (nitrate)
and mercury (if you want to know seasonal averages
it is not very accurate) in places that get about a
meter of rain per year. It does not necessarily hold
true for other pollutants (e.g., most metals) or in
other climates. For example, in southern California,
dry deposition is a larger portion of the load of
most pollutants simply because rain is infrequent.
The pollutant speciation of nearby sources can also
make this estimate grossly inaccurate. Further,
divalent mercury is highly water soluble. Sources
that emit a large amount of divalent mercury, such
as medical waste incinerators, may cause local
(within approximately 10 to 25 km) wet
deposition to be significantly more than 50% of
the total deposition.
The decision whether or not to measure dry
deposition depends on several factors, including the
frequency of rainfall in your climate, whether or
not there are standard methods for analyzing dry
deposition for the pollutants of concern, and how
accurately the atmospheric load needs to be
estimated.
Revolatilization is the opposite process from
deposition. It happens when volatile pollutants are
released from lakes and estuaries to the atmosphere.
Only volatile or semivolatile pollutants can do this;
the most common ones are mercury, DDT/DDE
and other banned pesticides, PCBs, and HCB.
Recent research suggests that ammonia also has this
behavior. Revolatilization is not measured directly;
there is no collection system to catch the gases
wafting off the surface of the water the way buckets
of rain are collected. Rather, it is calculated based on
the concentration of pollutant in the air and in the
water column and the chemical/physical properties
of the pollutant.
Revolatilization rates are generally determined by
short-term measurements that are extrapolated over
long time periods (not on a weekly or daily basis
the way deposition rates are calculated). For
example, surface water and air sampling data
indicate that the annual loss of PCBs from
Chesapeake Bay from net volatilization (-403 kg/
yr) is 10 times greater than inputs from wet and dry
deposition (37 kg/yr) and at least two times greater
than the loadings from the Susquehanna River
(165 kg/yr). (A negative "net gas exchange" is
volatilization, positive "net gas exchange" would be
deposition).
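The kind of calculation described above can be sketched with a simplified two-film gas exchange formulation, in which the net flux is driven by the difference between the water-column concentration and the air concentration expressed in water-equivalent terms through a Henry's law constant. All of the inputs below (transfer velocity, Henry's law constant, concentrations, and surface area) are hypothetical placeholders rather than values from any study; real estimates also adjust these quantities for temperature and wind speed.

SECONDS_PER_YEAR = 365.25 * 24 * 3600

def net_gas_exchange_kg_yr(c_water_ng_L, c_air_ng_m3, henry_dimensionless,
                           transfer_velocity_m_d, area_km2):
    """Net gas exchange over a waterbody; positive = deposition, negative = volatilization."""
    c_water_ng_m3 = c_water_ng_L * 1e3                  # ng/L -> ng/m3
    c_air_equiv = c_air_ng_m3 / henry_dimensionless     # air concentration as a water-equivalent value
    k_m_s = transfer_velocity_m_d / 86400.0             # m/day -> m/s
    flux_ng_m2_s = k_m_s * (c_air_equiv - c_water_ng_m3)
    return flux_ng_m2_s * (area_km2 * 1e6) * SECONDS_PER_YEAR * 1e-12   # ng -> kg

# Hypothetical inputs for a PCB-like compound over a 1,000 km2 waterbody.
print(round(net_gas_exchange_kg_yr(0.5, 0.2, 0.01, 0.5, 1000)), "kg/yr")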
-------
Watershed Pass-Through Rates

These rates apply only to inorganic nitrogen and vary
within each land use type because of
differences in age of vegetation, soil type, ecological
history, rainfall, slope, the presence of vegetative
stream buffers, and other factors. But they are a good
place to start for nitrogen compounds; for other
pollutants the pass-through rates may be very
different.

Forests: Pass-through estimates range from 0-5% in
one study to 20% in two others.

Pasture: Pass-through estimates range from 0.4-6% in
one study, 20% in another, and 30% in a third.

Cropland: Pass-through estimates range from 0.3-
24% in one study, 30% in a second, and 40% in a
third.

Residential: Pass-through estimates range from 5-
38% in one study, 65% in a second, and 75% in a
third.

Data from Valigura et al., 1996. NOAA Coastal Ocean
Program Decision Analysis Series No. 9, Atmospheric Nutrient
Input to Coastal Areas: Reducing the Uncertainties (http://
www.cop.noaa.gov/pubs/das/dasf.html).
The indirect load is the deposition to the watershed
that makes its way into the waterbody of concern.
The deposition rate is measured in exactly the same
way, but not all of the deposition reaches the
waterbody. Some percentage is retained in soils,
some may be taken up by plants on land, some may
settle to the bottom of lakes or slow sections of
rivers, and some may be taken up by aquatic
vegetation. The importance of each of these
"storage" or "retention" processes depends on the
watershed characteristics and the behavior of the
pollutants. The percentage of pollutants that
actually reach the waterbody of concern is called the
"pass-through" rate. The actual amount of
deposition that reaches the waterbody by way of the
watershed is called the indirect atmospheric load.
Original estimates of indirect loads used a 5 to 10%
pass-through rate for no reason other than that it
was the best guess anyone had. It is now considered
too much of an oversimplification to be useful, but
many older estimates of indirect load rely on it.
There are two methods to estimate the indirect
load. The first is to estimate pass-through rates
more accurately than the simple 10% guess by
assigning different rates to different land use types.
Using this method, forests could be given a pass-
through rate of 10%, but urban areas a pass-
through rate of 90%. An additional pass-through
rate can be assigned to the stream or river
transporting the pollutant to the waterbody of
concern. This gives a good back-of-the-envelope
estimate. If you choose this method, it is often
useful to do it twice, once with conservative (low)
estimates of pass-through rates and once with
higher ones. For an example of how this works, see
the box on the next page.
The second method is to do more complex
watershed modeling that simulates the transport of
pollutants through the watershed. A model like
this includes runoff rates such as those used above
to get the pollutants into the waterways, and then
represents the complex ecology that transports the
pollutants to the waterbody of concern. These
models are significantly more complex and can
require substantial resources to run, but they are
good at capturing the complex in-stream chemistry
for pollutants where this is important (such as
nitrogen and mercury). The runoff coefficients used
in the back-of-the-envelope calculations are usually
calculated from this kind of model. One national
watershed transport model is the United States
Geological Survey (USGS) SPARROW model.
There are many other watershed models that can be
used instead, many of which may be based on local
watersheds. It may be useful to use the same model
used to calculate tributary loads to the bay or
estuary. Using the same model for all watershed
Watershed Transport Models
Many watershed transport models can be used to
calculate how much deposition reaches a lake, bay,
or estuary. Some are national models and can be used
anywhere; others are designed for use in
specific watersheds or areas. Not all existing
watershed models can be used to calculate indirect
deposition loads; some do not allow atmospheric
loads as an input. Check on the models you currently
use to see if they will work. The USGS SPARROW
model can work in any watershed, but it must be
run by the developers and it may take a long time to
get the results you need. For a list of resources on
watershed transport models, see page 75.
38
-------
transport (both air deposition and upstream
nonpoint source and point sources) makes it easier
to put the atmospheric load in context of other
loads because all the loads are calculated based on
the same assumptions.
Once the indirect atmospheric load is calculated, it
is added to the direct load that was calculated earlier
(by multiplying the deposition rate by the area of
water). As you can imagine, it is more accurate to
use a watershed model to estimate indirect loads,
but coefficients are a good first step.
Early in the process of designing an air deposition
study, it is important to figure out the performance
criteria of the data, meaning the quantity and
quality of the data needed to answer the questions
of the study with the desired certainty. One tool
you should find useful to facilitate planning data
collection activities in a systematic way is the data
quality objectives process. The outcome of the
process is a set of qualitative and quantitative
statements called data quality objectives that clarify
study objectives, define the appropriate type of
data, and specify tolerable levels of potential
decision errors that will be used as the basis for
establishing the quality and quantity of data needed
to support decisions.
Quality assurance refers to activities that ensure that
the quality of the results of the work done meets
the needs determined up front in the project. All
projects that receive federal funding need to submit
Quality Assurance Project Plans (QAPPs); most
other funders will require one as well. The QAPP
covers quality assurance activities related to all stages
of a project, including planning, management, and
oversight of the project, and collection and
management of the data. This plan will usually have
to be submitted after the project has been approved,
but before funds are released and field work begins.
An outline of a QAPP is included in a box on the
next page. For more information on developing
quality assurance programs, data quality objectives,
and QAPPs, you can look at EPA's quality system
Web site (www.epa.gov/quality). This site includes
reference documents, training opportunities,
example documents, and links to other references
and examples. In addition, you can get examples by
asking other managers who have done atmospheric
deposition projects if they are willing to share their
QAPPs.
Hypothetical Example of the Simple Pass-Through Method
Assume that the estimate of total inorganic nitrogen deposition in the literature is between 6 and 14 kg/ha/yr on
a 354-square-mile watershed. The deposition rate used is 10 kg/ha/yr. According to county records, the
watershed is 1% urban, 67% forested, 30% farmland, and 2% wetland. The pass-through rates from the literature
are 75% for urban areas, 10% for forests, 40% for farmland (annual crops), and 10%* for wetlands. The watershed is
90,624 ha, of which 906 ha are urban, 60,718 ha are forested, 27,187 ha are farmland, and 1,812 ha are wetland.
The indirect loads are
• 6,795 kg/yr from the urban area (906 ha x 10 kg/ha/yr x 0.75 pass-through)
• 60,718 kg/yr from the forested area (60,718 ha x 10 kg/ha/yr x 0.1 pass-through)
• 108,748 kg/yr from the farmland (27,187 ha x 10 kg/ha/yr x 0.4 pass-through)
• 1,812 kg/yr from the wetland area* (1,812 ha x 10 kg/ha/yr x 0.1 pass-through)
The total indirect load from the watershed is 178,073 kg/yr. An additional correction factor is to estimate in-
stream losses. They depend on the speed of water flow and, for biologically active pollutants, the season. After the
indirect loads are corrected for in-stream losses (approximately 50% could be a ballpark estimate), they can be
added to the direct deposition load for the total atmospheric load. Given the uncertainty of pass-through estimates,
it can be useful to run through this calculation twice, once with high pass-through estimates and once with low ones.
*These estimates were made up as a guess; they are not real literature values! (See top box on page 38.)
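For readers who want to see the arithmetic spelled out, the following minimal sketch (in Python) reproduces the calculation in the box above. The deposition rate, land use areas, pass-through rates, and 50% in-stream loss come from the box; the open-water area used for the direct load is an additional made-up value for illustration only.

    DEPOSITION_RATE = 10.0   # kg/ha/yr, the midpoint of the 6-14 kg/ha/yr range

    # (area in hectares, pass-through fraction) for each land use in the box
    land_use = {
        "urban":    (906,   0.75),
        "forest":   (60718, 0.10),
        "farmland": (27187, 0.40),
        "wetland":  (1812,  0.10),
    }

    indirect_by_use = {use: area * DEPOSITION_RATE * frac
                       for use, (area, frac) in land_use.items()}
    indirect_load = sum(indirect_by_use.values())   # kg/yr before in-stream losses

    in_stream_loss = 0.50                           # ballpark in-stream loss estimate
    indirect_to_waterbody = indirect_load * (1 - in_stream_loss)

    # The direct load applies the same deposition rate to the open-water surface
    # itself; the 5,000 ha used here is a made-up area for illustration only.
    water_area_ha = 5000
    direct_load = DEPOSITION_RATE * water_area_ha

    print(f"Indirect load from the watershed: {indirect_load:,.0f} kg/yr")
    print(f"Indirect load after in-stream losses: {indirect_to_waterbody:,.0f} kg/yr")
    print(f"Total atmospheric load: {direct_load + indirect_to_waterbody:,.0f} kg/yr")

Running the sketch with the box values gives the 178,073 kg/yr indirect load shown above; swapping in high and low pass-through rates shows how wide the bracket on the estimate can be.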
39
-------
Outline of a QAPP
Below is a working outline for a QAPP for a nitrogen
deposition monitoring project.
I. Data Quality Objectives
II. Ambient Air Nitrogen Species Particle
Distribution
III. Wet Deposition Monitoring
IV. Dry Deposition Monitoring
V. Measurement of Ambient Air Species
VI. Modeling Nitrogen Dry Deposition
VII. Modeling Nitrogen Oxide Transport,
Dispersion, Transformation, and Deposition
VIII. Estimating Uncertainty
IX. Quality Assurance/Quality Control
It is very important not to cut corners when it comes
to quality assurance/quality control (QA/QC). This
means using field and lab blanks, duplicate samples,
split samples, spiked samples, audits, and other QA/
QC techniques to ensure high quality data. Basically,
there is no point in spending thousands of
dollars on fancy equipment if there is no QA/QC
analysis to tell you how accurate the results are.
Without knowing the accuracy of those numbers, the
results are wide open to criticism from all sides, and
management decisions will be very hard to make and
to implement. Plan on spending 25 to 30% of your
budget on QA/QC activities.
For data collection and analysis, you will need to
consider quality control with respect to how the
samples are collected, transported, stored, and
analyzed. When choosing a sampling methodology,
you need to consider the detection limits of the
methods and ensure that the sample volume is
sufficient. For example, dioxins/furans are found at
low levels in the ambient air. Consequently, the
sample volume needed to get detectable amounts is
high. In addition, maintenance and operation of
both field and laboratory equipment are important
to quality control. Various quality control
techniques are going to be essential for you to have
a data set that is usable. These could include field
and laboratory blank samples, duplicate samples,
split samples, spiked samples, a tracking system that
clearly delineates who is responsible for the samples
at what points in the process, and training for staff.
You also need to remember other potential
uncertainties in addition to those directly related to
measurement and analysis of samples. Many of
these have been touched on in other sections of this
handbook. One example is the representativeness
of the sites and time periods you sample relative to
what you are trying to measure. Another example is
the uncertainty of assuming deposition velocities
derived from the scientific literature to estimate dry
deposition. If you are estimating the contribution
of air deposition relative to other pathways for
pollutants entering the waterbody, there is the
uncertainty of the estimates for those other
pathways.
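To make the deposition velocity approach mentioned above concrete, the following minimal sketch (in Python) shows how a literature deposition velocity is typically combined with a measured ambient concentration to estimate a dry deposition flux. The concentration and velocity values are placeholders, not recommendations for any particular pollutant.

    concentration_ug_m3 = 2.0       # measured ambient air concentration (ug/m3), placeholder
    deposition_velocity_cm_s = 0.5  # deposition velocity from the literature (cm/s), placeholder

    # flux (ug/m2/s) = concentration (ug/m3) x deposition velocity (m/s)
    flux_ug_m2_s = concentration_ug_m3 * (deposition_velocity_cm_s / 100.0)

    # convert to an annual areal loading: ug -> kg is 1e-9, m2 -> ha is 1e4
    SECONDS_PER_YEAR = 365 * 24 * 3600
    flux_kg_ha_yr = flux_ug_m2_s * SECONDS_PER_YEAR * 1e-9 * 1e4

    print(f"Estimated dry deposition: {flux_kg_ha_yr:.2f} kg/ha/yr")

The uncertainty in the chosen deposition velocity carries straight through to the flux estimate, which is why the literature ranges matter so much.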
Quality assurance is the key that makes the whole
package work. If you don't know the error and
uncertainty associated with the data collected in
your study, or that the data are of a sufficiently
good quality to support the management decisions
and actions, then the data are not much more than
a collection of numbers. This quote from a
researcher working on atmospheric deposition of
nitrogen says it best:
Wet deposition estimates depend on
capturing every rain event and retrieving
uncontaminated, undegraded samples. NO
SMALL FEAT! The dry deposition models
rely on empirical algorithms with relatively
large uncertainties, not to mention the
uncertainties associated with the input data.
The literature contains estimates of these
uncertainties and should quickly disabuse
anyone of the [idea that] accuracy... can be
obtained in their nitrogen [or other
pollutant] deposition estimates.
This is not to scare you off; just to insert a note of
caution about the potential difficulties of getting
accurate data. Plan on spending a significant
portion of your budget (e.g., 25 to 30%) on
quality assurance/quality control activities.
What Can I Get for $15,000, $50,000, or $400,000 a Year?
This section provides an idea of what kind of data
you might get under three different budget
scenarios. It is intended to give you a feeling for the
costs of monitoring, as well as some suggestions for
40
-------
low budgets. The budget categories apply to each
pollutant (i.e., you cannot get everything in the box
at the right for $15,000, just one pollutant). There
are some exceptions to this, however, particularly in
regard to metals. Once the equipment is in place to
sample for one metal, analyzing for several
additional ones would probably be possible within
the $15,000 budget. In most cases, however, the
budget applies to only one pollutant and a rather
basic approach with limited data analysis.
Keep in mind that these are estimates only; costs
may be significantly higher (or, with lots of in-kind
support, lower) in some situations. The majority of
the scientists and managers who have conducted
atmospheric deposition studies stress the
importance of partnering, borrowing, leveraging,
and any other legal means to make the money
stretch farther. The cost for the data analysis and
interpretation is extra. If you spend $15,000 on
data collection and sample analysis, expect to spend
another $5,000 or more for data analysis; if you
spend $50,000 on data collection, expect to spend
another $10,000 or more on data analysis; and if
you spend $400,000 on data collection, expect to
spend as much as $100,000 more on data analysis.
That said, here are some experts' opinions of the
best data to collect in each price range.
These budget categories for monitoring can also be
thought of as goals you can reach over time. In
other words, start with the $15,000 version, then
add a small piece every year; and after four years you
will be doing the $50,000 monitoring scenario.
Remember, these suggestions are guidelines only.
The goal is to design a monitoring strategy that
answers as many of the questions you have as
possible given the resources you have.
$15,000 a Year
All experts agree that only the bare minimum
amount of sampling for a single pollutant could be
done for $15,000/year. In some cases, such as for
PAHs and other semivolatile compounds, no
sample analysis can be done.
In that case, some experts suggest collecting the
samples anyway and analyzing them when sufficient
$15,000/Pollutant
Nitrate or ammonia: Look for wet and dry values
reported at nearby sites or model runs in the literature.
The inventories in the model runs should be
checked against local knowledge of sources, and land
use data should be collected from USGS or the Soil
Conservation Service (SCS). Could set up an NADP
wet-only site.
Mercury: Collect 50 to 80 event-specific precipitation
samples over the year at a single site (probably with
low-cost labor such as a student or intern). Analysis for
mercury costs roughly
$100 per sample. Samples should be split, with half saved for
future analysis. Dry deposition cannot be measured
with this budget; use rates found in the literature or the
1:1 wet-to-dry ratio (the ratio method only works if
the climate is not extremely dry and there are no
sources of divalent mercury nearby).
PAHs: No significant monitoring possible. Look for wet
and dry values at nearby sites or model runs
in the literature, or collect the samples and store them
for future analysis.
PCBs: Same as PAHs.
Dioxins/furans: Use an ambient air sampler to
collect two-month integrated samples; estimate
deposition and revolatilization for the most
common congeners.
Cadmium and other heavy metals: Measure either
wet or dry deposition (wet in buckets and dry using a
filter-pack). Get a crude estimate by multiplying by a wet-
to-dry ratio assumption. Samples generally are very
cheap (about $20/sample) once collected.
Very useful to collocate with sites for other pollutants,
like mercury (if available).
Current-use pesticides: Target monitoring to the
portion of the year when use is highest (in areas where
year-round agriculture is practiced, measure year-
round). You cannot, of course, extrapolate deposition
from the highest-use times to the entire year; this
depends on what the
questions are. It will probably be possible to measure only one
compound on a short-term, event basis.
Historic-use pesticides: Do a paired study of
atmospheric deposition (wet and dry) and water
column measurements to estimate revolatilization.
It will probably be possible to measure only one
compound on a short-term, event basis.
funds become available. If samples are collected, but
not analyzed immediately, make sure they are
stored properly! This often (but not always) means
deep-freeze or refrigeration. Find storage protocols
appropriate for the pollutants you want to measure
from researchers and follow them. This is to make
sure the samples can be analyzed accurately later.
41
-------
Another tactic is to take a large number of samples
and only analyze a few that you suspect will give
good results. The advisory group will help you
figure out what samples those are. Then the data
from these initial analyses can be used to leverage
funds to get the rest of the samples analyzed and, if
necessary, continue the monitoring program.
In addition to representative sample analysis and
getting all the in-kind donations possible, explore
the opportunities to piggyback on existing sites.
Much of the start-up costs for sampling consist of
securing a site, getting power brought in, and
buying a rain gauge and other meteorological
equipment. If there is an existing air quality
monitoring site with some or all of that already
done, it will make beginning a sampling project
much easier. Some pollutants, especially metals, can
often be measured very cheaply and easily by adding
a filter to, or analyzing additional samples collected
from, monitoring equipment already in place.
Note, however, that some researchers have tried
analyzing sub-samples for organics or metals from a
standard wet deposition collector being used for
nitrogen, and have not gotten useful results because
sampling and handling protocols necessary for the
organics and metals were not followed. So, when
considering shortcuts, look to see what success (or
lack thereof) others have had.
Regardless of how many samples are collected and
analyzed, it is critical to interpret the data. That is,
once the samples have been analyzed in a lab, the
data must be analyzed to figure out what they are
telling you. You will get better interpretation if
someone with experience analyzing atmospheric
deposition data analyzes your data. Data interpreta-
tion usually costs between 25 and 30% of the total
project cost. So an additional $5,000 or more
would be budgeted, either in the first or second
year of the project for data analysis. It cannot be
overemphasized how critical this part of the project
is; without it, all the effort into sampling and
laboratory work just results in a pile of numbers.
For robust data that can be used in decisionmaking,
the interpretation must be done well.
For $50,000 it is possible to have at least one
monitoring site for almost any pollutant. For
organic pollutants such as PAHs, PCBs, dioxins/
furans, and historic-use pesticides, it is unlikely that
you could have a replicate sampler for quality
control purposes for this budget without borrowing
one. You have to consider your data quality needs
when deciding what sampling to do on this budget.
For nitrogen it will be possible to have two or even
three sites, depending on whether dry deposition
will be measured. To conserve costs, it may still be
worthwhile to set up several sites and selectively
analyze samples. The results may be used to prove
the need for more resources. If initial results do not
indicate a substantial amount of deposition, store
all the samples anyway and analyze them within a
few years to make sure there are not spikes in
deposition or other events that the selective sample
analysis did not reveal.
$50,000/Pollutant
Nitrogen or ammonia: Put out samplers at two (or
three) sites located in relation to suspected sources;
measure both wet and dry deposition. Annular
denuders or filter packs are preferred for dry
deposition analysis.
Mercury: Add measurements for
reactive mercury (Hg2+) or wet deposition sampling
for methyl mercury, as well as total mercury. Pass-
through studies are probably too expensive unless
most of the monitoring and/or modeling has already
been done.
PAHs: Take a few samples over short time periods
during different times of the year.
PCBs: Use a single ambient air sampler to collect
integrated samples; estimate deposition and
revolatilization from literature values.
Dioxins/furans: Add additional sites; measure
precipitation as well as ambient air samples.
Cadmium and other heavy metals: Add more
sites, and wet and dry measurements to refine the
wet-to-dry ratio.
Current-use pesticides: Add more sites
(measure more often or more regularly), and measure
more compounds.
Historic-use pesticides: Measure more often and measure
more compounds.
42
-------
It is critical to save enough resources to get the data
analyzed. In fact, since you probably generate more
data under this scenario, it is even more important.
It will still cost between 25 and 30% of the
amount spent monitoring to get the data analyzed,
so budget an additional $14,000 or more to get
data from these types of studies analyzed.
The theme for this scenario is "put out more sites
that cover more pollutants or species (or flavors) of
pollutants." At this point, you could afford to
monitor for any pollutant and you could afford to
do it in quite a few locations. Those locations
should be chosen based on the question(s) you need
to answer, but given these resources, sites could be
located both to provide regional deposition rates
and to measure deposition from specific hotspots.
You could also decide to dedicate some of the
resources toward modeling. This could be either
developing a good watershed transport model or
using air deposition or source attribution models to
begin thinking about management options.
Another type of study that might be done in a
research mode is an airplane study. Airplane studies
measure the spatial heterogeneity of deposition
rates. They typically are done by researchers to
better understand atmospheric chemistry.
Therefore, such studies are not likely to be cost-
effective for meeting the needs of watershed
managers. Basically, many different kinds of
samplers are loaded on an airplane with an air intake
designed to minimize taking up exhaust from the
engines. The samplers are then run to collect huge
amounts of data on pollutants and meteorology at
different heights. The samples must be taken while
at least one on-ground dry deposition sampler is
making continuous measurements. The samples
taken from the airplane are then compared to
samples collected at the ground sampler(s). This
information on how pollutants interact in different
layers of the atmosphere is used to refine deposition
estimates. Airplane studies also show what kind of
spatial variability there is in the dry deposition
estimates coming from the sampling sites. An
airplane study can cost at least $100,000 for
$400,000/Pollutant
Nitrogen or ammonia: Add more sites; measure
dry deposition and organic nitrogen at several.
Mercury: Add more sites, measure speciation
(total, elemental, methylmercury) at several. Can
measure dry deposition as well.
PAHs, PCBs: Add more sites, measure different
species. Measuring common compounds that were not
already measured (there are 10,000+ different
PAHs and 200+ PCBs) will help identify sources.
Measure the water column, as well as air deposition, to
get volatilization and microlayer effects.
Dioxins/furans: Measure at more wet and dry sites,
measure more frequently, measure particle-phase and
vapor-phase concentrations, and take
sediment core samples.
Cadmium and other heavy metals: Same as for the
$50,000 scenario.
Current-use pesticides: Add more sites and measure
all compounds of interest; measure the water column, as
well as air deposition, to get volatilization.
Historic-use pesticides: Measure more often and
measure more compounds.
approximately 50 hours of flying time, plus data
analysis.
Once again, whether or not an airplane study is
done, it is important to save 25 to 30% of the total
project budget to properly analyze the data.
How Much Data is Enough?
There is no such thing as "enough data" to most
researchers. Long-term data sets are so rare and so
valuable that it is almost impossible for any
researcher to say that enough monitoring has taken
place. However, you are in the business of
managing resources, not doing research on them, so
you will get to the point where enough is enough.
If possible, try to turn your site over to someone
else who will keep it going rather than stopping
monitoring altogether.
The length of time you should monitor depends on
what questions you are trying to answer. A single
year of deposition monitoring is often used to
determine what proportion of the total pollutant
load comes from atmospheric deposition. This is
not optimal because deposition rates vary some
from year to year depending on emission rates and,
43
-------
more importantly, changes in meteorology. There-
fore, deposition rates during a drought year, or
during a year in which your site was hit by three
hurricanes, may not be representative of what
generally happens at the site. This doesn't mean you
can't use one year's worth of deposition data. It just
means that if you do, the uncertainty of your
estimate is larger than if you use averaged deposi-
tion rates for three or five years.
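As a simple illustration of why multi-year averages are preferred, the following minimal sketch (in Python) compares a single year's deposition rate with a five-year average and the year-to-year spread. The yearly rates are invented for illustration only.

    import statistics

    # invented yearly deposition rates (kg/ha/yr) over five years of monitoring
    yearly_rates = [8.1, 12.4, 9.7, 6.3, 11.0]

    one_year = yearly_rates[0]
    five_year_mean = statistics.mean(yearly_rates)
    spread = statistics.stdev(yearly_rates)

    print(f"Single-year estimate: {one_year:.1f} kg/ha/yr")
    print(f"Five-year average: {five_year_mean:.1f} kg/ha/yr "
          f"(year-to-year spread about +/- {spread:.1f} kg/ha/yr)")

A single dry or unusually wet year can sit well away from the multi-year mean, which is exactly the variability the averaging is meant to smooth out.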
Because of this variability in deposition rates from
year to year, it is recommended that a site be active
for at least five years. NADP requires new sites to
commit to operating the site for five years as well.
This is long enough both to get a good grasp on the
"real" average annual deposition rate and to know if
rates from year five really are the same or different
than rates from year one. Therefore, to know if
you are receiving more deposition or less than in
the past, you have to make a significant commit-
ment to monitoring. This is especially relevant
when the purpose of monitoring is to quantify
changes as a result of particular management
actions. In other words, monitoring for a year or
two to determine if there is a problem, then doing
something about it is great. It is even better,
however, to continue monitoring to assess how
successful that management action was. Assessment
not only strengthens the argument for controlling
atmospheric sources in your watershed; it helps
other watersheds in similar situations build the case
for the management actions they need to take.
44
-------
VIII. What You Need to Know About
Air Deposition Modeling
The first thing to know is that you will not have to do the modeling. While you may be the site operator
for a monitoring site(s) and collect the samples, and may in a few cases analyze them, modeling is technical
enough that it requires individuals with significant training to do well. The downside of this is that it
tends to be more expensive. The upside is that you only need to know enough to be a good interpreter;
you don't have to become a modeler.
Two types of modeling are used to assess atmospheric deposition: deposition models and source
attribution models. Sometimes models designed to answer questions about deposition rates can also
answer questions about source identification and vice versa. More often, different models or different
model runs are used to answer questions about source attribution. It is useful to think of them as two
different kinds of models because they are used at different points in the assessment process to answer
different questions. This section discusses atmospheric deposition models, that is, models that help answer the
question "How much of a pollutant is being deposited on the watershed?" Source identification and
attribution models are discussed in Chapter X on Source Attribution.
Modeling is an art. On one hand, models are easy to believe because they present easy-to-understand
pictures of what we think is going on. On the other hand, because they are by definition simplifications of
reality, they are easy to criticize because they always leave something out. The key to developing successful
models is to justify what has been simplified or left out based on expert knowledge and sound science.
This section contains information on
• Questions that air deposition modeling can answer
• Basic theory of air deposition models
• Comparison shopping among models and modeling inputs
— Inventories
— Meteorological data
— Models.
It must be stressed that all models rely on the
quality of the data and reasonableness of the
assumptions that go into making them. There are
two keys to good models that must occur together:
accurate atmospheric chemistry and transport
equations and good input data (inventories and
meteorological data). In other words, there is no
point in having a highly accurate model if the
input data are not good—the "good" model will
still turn out incorrect results. It is possible to get
the "right" answer—one that agrees with the
monitoring data or that confirms suspected
linkages—for the wrong reason. To avoid that trap
and accurately answer questions, the limitations
and sensitivity of the model must be clearly
understood, and the model must be based on
45
-------
reliable data with known error margins and reason-
able assumptions.
Given those caveats, atmospheric deposition models
can do the following:
• Summarize current conditions to help to
identify management options
• Fill in spatial or temporal holes left by a
monitoring program
• Predict future conditions due to growth (e.g.,
economic development) or regulatory changes
• Estimate what reductions are necessary to reach
specific goals (such as a particular nitrogen
loading or concentration in an estuary)
• Detect what changes in deposition rates will be
significant to ecological or human health.
Models are generally classified as LaGrangian or
Eulerian. The difference between them has to do
with how calculations are made. LaGrangian
models track emission plumes that spread out
toward some receptors (an example being an
estuary) based on their chemical and physical
parameters and the meteorology.
Eulerian models do calculations based on grids.
These grids are areas over which inputs are averaged,
calculations are performed, and deposition is
averaged. For example, a model running with a
36-km grid (36 km on a side) assigns each source to
be emitted in a particular grid and calculates atmo-
spheric chemistry, transport, and deposition on
those pollutants over a certain amount of time. The
model then sends the pollutants to the next grids
according to the results of those calculations. The
key feature of grid size for model users is that the
deposition rate is estimated for the entire grid area.
For example, the 36-km grid calculates a single
deposition rate for a 1,296-km2 area. For finer
resolution (to see differences on a smaller scale), the
model must be run on a smaller grid. This means
more calculations, which increase the amount of
time it takes to run the model. The first atmo-
spheric deposition models were run with grids of
Definitions of Common Terms
Grid: A grid is the scale at which an Eulerian model
averages emissions, meteorology, atmospheric
chemistry, and deposition rates. The smaller the grid
size, the higher the resolution of the deposition
rates. This does not necessarily mean that the
deposition rates are more accurate, just that they are
calculated over smaller areas. Grid sizes are
important when trying to interpret deposition rates.
Large grid sizes work better to capture deposition
on a rougher scale (such as from one source region
to another). To get finer resolution (such as
deposition gradients from a particular source), small
grids are better. Small grids may cause calculational
problems that lead to inaccurate results, however,
if the input data or equations are not meant to be
iterated at that frequency. Small grids may also cause
problems if the meteorological data set is on a
larger grid size. Emission sources can be allocated
over any grid size, but meteorological data collected
at 80-km grids may be too coarse to use in 36-km
model grids. Make sure the conditions of the model
runs, including grid size, are explained to the
advisory group before modeling begins.
60 or 80 km/side. Many are now run with 36-km
grids, and some can be run on grids as small as
12 km or even 100 m.
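To make the effect of grid size on computational burden concrete, the following minimal sketch (in Python) counts the grid cells per vertical layer needed to cover a square modeling domain at the grid sizes mentioned above; the 1,000-km domain is an arbitrary example, not a value from any particular model.

    import math

    domain_km = 1000   # one side of a square modeling domain (arbitrary example)

    for grid_km in (80, 36, 12):
        cells_per_side = math.ceil(domain_km / grid_km)
        cells_per_layer = cells_per_side ** 2
        print(f"{grid_km:>2}-km grid: {cells_per_side} x {cells_per_side} = "
              f"{cells_per_layer:,} cells per layer, each covering {grid_km ** 2:,} km2")

Going from a 36-km grid to a 12-km grid multiplies the number of cells per layer by nine, and every vertical layer multiplies the total again, which is why finer resolution quickly demands more computing time.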
It should be pointed out that the term "grid" is
somewhat inaccurate. The grids are actually three-
dimensional rectangles of air space. The vertical
space in these models is made up of "layers" of
different thicknesses. The thickness of the layers
varies; one may be 1 m thick, another 10 m thick,
another 100 m thick, and so on. These layers
correspond to different layers in the atmosphere
where processes happen differently based on the
physical characteristics of air at that height. Simply
described, layers are the way models capture the fact
that emissions from cars (which are very low to the
ground) are subject to different meteorological
conditions and travel differently than emissions
from 100-m smokestacks. This is because meteoro-
logical conditions (such as wind speed and direc-
tion) vary with height above the ground. For
example, long-range transport of pollutants from
China to the west coast of the United States appears
to occur only when meteorological conditions push
pollutants into high air currents. The number of
46
-------
layers is one of the factors determining how accu-
rately the model represents the atmospheric chemis-
try and transport of pollutants.
Eulerian models are good at capturing the complex
non-linear chemistry necessary to model ozone,
nitrogen, sulfur, and many toxic compounds (e.g.,
mercury) accurately. LaGrangian models generally
work well for those toxic compounds that have
fairly linear atmospheric chemistry. Linear
atmospheric chemistry means that a compound
generally does not react or change from the point at
which it is emitted through transport to the point
of deposition. An example is cadmium. It is
emitted, transported some distance based on wind
speed and direction, and deposited based on its size
(which is about the same as when it was emitted)
and some meteorologic parameters. Non-linear or
complex atmospheric chemistry means that a
compound undergoes chemical and physical
changes in the atmosphere from the time it is
emitted to when it is deposited. For example,
nitrogen compounds undergo many chemical
reactions in the atmosphere, which depend on
several factors including sunlight, water vapor, and
other chemicals. The nitrogen deposited is likely in
a different form from that emitted.
Eulerian models include RADM, REMSAD, and
Models3 (see pages 50 through 52 for a discussion
of these models). RELMAP and CALPUFF are
LaGrangian models, and HYSPLIT is a LaGrangian
model that can be run in an Eulerian mode.
Shopping among models is not shopping to get the
best price, although that may enter into it. Rather,
it is wading through the various options to figure
out which one has the features that can give you the
kind of answer you need. The choice of model
should be made with the advisory group and, in
most cases, should be made before monitoring
begins. Sensitivity analyses may be a useful tool to
inform you about critical data inputs to the model.
Models are one of the "uses" to which the data will
be put—as an input, validation, or two pieces of a
larger puzzle. Thus, it is critical to know the uses of
the monitoring data before they are collected, so
that the sampling is conducted with the proper
temporal and spatial resolution and includes all the
parameters required by the model. Therefore, the
type of modeling the project will include should be
identified as soon as possible in the design process.
This section provides an overview of several models
and modeling inputs and discusses some of the
differences among models and among modeling
inputs.
Inventories
Emissions inventories are collections of information
on releases of pollutants to the air over a specified
time for particular sources or geographic area (e.g.,
county). Inventories also may have other
information about the emissions, such as the form
in which the pollutant is emitted (e.g., elemental
mercury), the height at which the pollutant is
released (e.g., ground level or from a hundred-foot
stack), the temperature and velocity of the release,
and detailed information about the location of the
release points (e.g., latitude and longitude).
Information about the emissions is necessary input
to any deposition model. In Eulerian models, each
grid in the model contains emissions of all the
sources located in that area. In LaGrangian models,
the emissions are emitted as a plume from each
source. These emissions are then transported, may
be transformed, and may be deposited. The fate is
dependent on multiple meteorological, chemical,
and physical factors.
As simple as an inventory sounds in theory, the
reality is quite complicated because there are many
types of air pollution sources. One way to broadly
classify sources is by point, area, mobile, and
biogenic sources. Point sources are larger stationary
sources, such as factories and electric power plants.
Area sources are smaller stationary sources, such as
dry cleaners, degreasing operations, or houses.
Mobile sources include cars, trucks, buses, airplanes,
and other sources that move. Biogenic sources
include trees and vegetation, gas seeps, and
microbial activity. An inventory may include
-------
estimates of emissions from all or some of these
classes of sources.
Many inventories use codes to classify various types
of sources. You may find Standard Industrial
Classification (SIC) codes, which were used by the
U.S. Census Bureau to identify the primary type of
activities an establishment is engaged in. The SIC
codes are available at http://www.census.gov/epcd/
www/sic.html. A new classification system is
replacing the SIC codes; it is the North American
Industry Classification System (NAICS), which
will be standardized across the United States,
Mexico, and Canada. Information about the
NAICS and how it relates to the SIC codes can be
found at http://www.census.gov/epcd/www/
naics.html.
Other codes commonly appear in EPA and state-
developed inventories. Source Classification Codes
(SCC) (not to be confused with SIC codes!)
describe the type of process that releases emissions.
For example, there is an SCC for liquefaction
(mercury cell process) at chlor-alkali production
facilities. For the National Toxics Inventory
(described below), there are also codes that connect
the emissions information to a given regulatory
category. These are Maximum Achievable Control
Technology (MACT) codes. Additional informa-
tion on SCC and MACT codes, as well as others,
can be found at http://www.epa.gov/ttn/chief/
codes/index.html#nei.
The data in an inventory are likely to be derived
from a combination of measurement and estima-
tion techniques. Some large point sources have
monitors that continually measure the pollutants at
the stack. For the majority of sources, though,
emissions are estimated using other methods. These
methods include tests of emissions from the source
over a short period of time and extrapolated over a
longer period, material balances, emission factors,
analysis of fuel, emission estimation models,
engineering judgment, or a combination thereof.
An emission factor is the relationship between the
amount of the emissions released and the activity of
the producer. For example, emission factors can be
reported as emissions per hour of production or
emissions per widget produced. An emission
estimation model relates multiple parameters that
influence emissions. For example, emissions from
cars and trucks are estimated from a combination of
factors, including estimates of vehicle miles trav-
eled, fuel type, vehicle model, and types of travel.
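As a simple illustration of the emission factor approach described above, the following minimal sketch (in Python) converts an activity level into an annual emission estimate. The factor and production figures are hypothetical, not values taken from any EPA inventory or emission factor compilation.

    emission_factor_kg_per_ton = 0.05   # kg of pollutant emitted per ton of product (hypothetical)
    annual_production_tons = 200_000    # activity level: tons of product per year (hypothetical)

    annual_emissions_kg = emission_factor_kg_per_ton * annual_production_tons
    print(f"Estimated emissions: {annual_emissions_kg:,.0f} kg/yr")

The quality of such an estimate rests entirely on how well the emission factor represents the actual process, which is why inventories note the estimation method behind each entry.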
If you plan to develop an inventory, several factors
will influence your decisions on what sources and
other information to include and what emission
estimation techniques to use. These factors include
the intended use of the inventory; the quality of
data needed to achieve the intended use; the
availability of information; and the availability of
time, money, and personnel.
In most cases, you will want to work with an
inventory that has already been developed. You
should understand what types of sources are
included and, in general, the methods used to
estimate emissions and the quality of the estimates.
You also may want to look at the local sources to
see if the information is consistent with your
knowledge of them. Do you know of sources that
haven't been included? Is the information in the
inventory about a particular source really different
from what you'd expect based on your local
knowledge? Don't forget that inventories are huge
undertakings and also represent a particular time
period. You may find that the quality of the data
does not match your needs, that you need more up-
to-date information, or that an error has occurred.
Therefore, you may decide to make adjustments to
an existing inventory for your modeling work. It
would be helpful to pass on your improved or up-
to-date information to your local, state, or tribal
agency. They may find it useful in improving their
inventories.
Below are descriptions of several national or
regional emission inventory efforts. They typically
represent emissions over the period of a year (for
example, a 1996 inventory would represent
emissions over the calendar year 1996). A given
year's inventory may be updated periodically to
incorporate improved or new information that was
not available when the inventory was first
published.
48
-------
National Emission Inventories Prepared by the
EPA. The U.S. EPA prepares a national emission
inventory with input from numerous state and local
air agencies. These data are used for air dispersion
modeling, regional strategy development, regulation
setting, air toxics risk assessments, and tracking
trends in emissions over time. The National
Emission Trends (NET) database has emissions data
for 1985 through 1998 for the pollutants known as
"criteria pollutants" (see Air Program Basics in the
Now What section of this handbook, page 64).
These pollutants include nitrogen oxides, ammonia,
and lead, among others. The data in the NET for
emissions of nitrogen oxides and sulfur oxides for
electric utilities are from the acid rain program
(http://www.epa.gov/airmarkets/).
Emissions data for air toxics are available for 1993
and 1996 in the National Toxics Inventory (NTI)
database. These air toxic pollutants include mercury,
cadmium, lead, PCB, POM (and PAH), and HCB.
For 1999, the criteria and toxic emissions data are
being prepared in an integrated fashion in the
National Emission Inventory (NEI), which will
take the place of the NET and the NTI. The NEI
will be conducted on a three-year basis (e.g., 1999,
2002, 2005). These inventories have information
about larger point sources on an individual source
basis. For smaller area sources and mobile sources,
emissions are aggregated at the county level. More
information about these inventories can be found at
http://www.epa.gov/ttn/chief/net/index.html.
Summary data from the NTI and NET can be
accessed at http://www.epa.gov/air/data/
sources.html. More detailed data can be accessed
through http://www.epa.gov/ttn/chief/net/
index.html#dwnld.
The more general site, http://www.epa.gov/ttn/
chief, also includes basic information about
inventories in general, emission factors, models and
other methodologies for estimating emissions, and
continuing efforts to improve the inventories. For
example, there are efforts to better speciate
emissions of certain pollutants, such as mercury,
from electric power plants. There are also links to
state sites, which may contain more up-to-date
inventory information.
Toxics Release Inventory (TRI). The Emergency
Planning and Community Right-To-Know Act
authorizes the TRI to provide the public with
information about potentially hazardous chemicals
and their use in their communities. Industrial
facilities are required to report annually on releases
of toxic chemicals into the air, water, and land, as
well as other information. This inventory includes
primarily larger industrial point sources. Area
sources, mobile sources, and natural sources are not
included. The pollutants reported in the TRI
include mercury, cadmium, lead, dioxins and
furans, PCBs, PAHs, HCB, and several pesticides.
Additional information and links to the TRI
database can be found at http://www.epa.gov/
triinter/index.htm.
Dioxin Inventory as Part of EPA Dioxin
Reassessment. In 1992, the EPA's Office of
Research and Development began an effort to
reassess the exposure and health effects associated
with dioxin. The effort includes an inventory of
dioxin sources in the United States. The inventory
and reassessment are in draft form as of the writing
of this handbook. Additional information about
the inventory and reassessment can be found at
http://www.epa.gov/ncea/dioxin.htm and
http://www.epa.gov/nceawww1/diox.htm.
Great Lakes Regional Air Toxics Inventory. The
air regulatory agencies in the eight Great Lakes
states have worked collaboratively to develop an air
emissions inventory of toxic pollutants in these
states for various years. The results of this inventory
are largely the same as in the NTI discussed above,
since a primary source of data from the NTI is
information submitted by state agencies. More
information about this inventory effort can be
found at http://www.glc.org/air/air3.html.
Meteorological Data
It is critical to have good meteorological data for
atmospheric deposition modeling. From your
monitoring sites, you may have some meteorologi-
cal data, but you probably would not have collected
all the data needed for the modeling efforts. You
would then look toward the larger meteorological
databases that are regional or national in scope.
49
-------
Models use three types of meteorological data. One
type, called episodic meteorological data, is a short
(several days to several weeks in length) segment
used to represent a longer period of time. Often
several of these are run to estimate annual deposi-
tion rates. For example, RADM (see page 51) uses
four two-week meteorological data segments to
calculate annual nitrogen deposition rates. These
segments are supposed to be representative of
different weather patterns to capture the range of
deposition rates that can occur under different
atmospheric conditions. The disadvantage is that
actual annual results are not provided by the model.
Some models use an "average" meteorological year
constructed by averaging monthly data over several
years. For example, Tampa Bay Estuary Program
uses an average year where January's modeled
meteorology is the average of ten Januaries,
February's is the average of ten Februaries, and so
on. Some models, such as REMSAD, use an entire
365 days of meteorological data to drive the model.
The advantage of this is that it represents actual
events that can be easily verified by monitoring
data. The disadvantage is that any given run ignores
interannual variability.
Which type of meteorological data is used depends
mostly on the complexity of the chosen model. It
would be difficult and resource-intensive to run a
very complex model for a full year. If there is a
choice of using an entire year versus a subset or an
average year, there are a few things to keep in mind.
Yearlong data are good at catching seasonal changes
and patterns. Because it is a continuous stream of
data, the model captures the slight changes in initial
meteorological conditions that can have large
impacts on deposition rates. It is also possible to
run these models using several different years of
meteorological data to capture interannual variabil-
ity. A 10-year average meteorological data set
captures all that variability in one run, and the
episodic model does to some extent as well, de-
pending on how well the segments were chosen.
This reduces the chance of basing management
decisions on abnormal years (making the right
decision for drought years, but the wrong decision
in the long term). The downside is that this takes
additional time and resources.
One final thing to keep in mind is that meteoro-
logical and emissions data may not coincide with
each other or with the data from deposition moni-
tors that can validate the model. For example, a
model may run the 1990 emissions inventory using
1997 meteorological data. The weakness of these
model runs is that it is more difficult to verify them
with monitoring data (because no data were ever
collected that meet both those assumptions). It is
preferable for model evaluation to choose specific
years to run when the best meteorological data,
emissions inventories, and monitoring data overlap.
Models
The goal of this section is to illustrate what models
may be useful for air deposition studies and is not
intended to be exhaustive. Some of the air disper-
sion/transport models described are in the research
stages of development or are not generally available
for use. Some may require experts to run them for
you.
Regulatory Modeling System for Aerosols and
Deposition (REMSAD). REMSAD models
nationwide wet and dry deposition for mercury,
nitrogen (nitrate, nitric acid, and ammonia), sulfate
particles, cadmium, atrazine, dioxins, acids, and
POM. Lead, HCB, and PCBs are planned to be
added to REMSAD. Other pollutants could be
added with relative ease. The model is usually run
for a full 365 days of meteorological data to move
emissions from the sources, calculate transport and
transformation rates, and deposit them in grids
from 12 to 36 km2. However, the model can be
run for shorter or longer periods. It is capable of
simultaneously modeling concentrations of primary
and secondary particles. In other words, REMSAD
can model deposition of nitrogen and mercury (or
other pollutants) at the same time. This provides
opportunities for different agencies or organizations
to jointly fund one run that can answer several
questions for each of them. It is simple enough to
run on a high-end desktop PC (although each run
may take a week) so it does not require supercom-
50
-------
puter time. (This is because it has simpler atmo-
spheric chemistry than models that require bigger
computers.) The model is non-proprietary so
anyone can get it from EPA.
Regional Acid Deposition Model (RADM).
RADM models deposition to the eastern half of the
United States for secondary particles (principally
nitrate and sulfate) and acidic deposition (princi-
pally nitric acid, nitrate, and sulfate). A newer
version, the extended RADM, includes ammonia
deposition. The model uses an "episodic meteorol-
ogy" approach by running a single two-week
segment of meteorology for each season to calculate
transport, transformation rates, and deposition.
RADM has more complicated chemical transforma-
tion equations than REMSAD. These equations are
intended to give a more accurate representation of
the atmospheric processes and deposition rates.
RADM uses a 20 to 80 km2 grid and is generally
run on a supercomputer by the model developers.
Models3/CMAQ. This modeling system is actually
a framework of a graphical user interface, an atmo-
spheric transport model, and data analysis tools. As
of 2001, CMAQ (Community Modeling for Air
Quality, which includes a deposition component
very similar to the RADM model) is available as an
independent air quality model. The Models3/
CMAQ system is also available and is currently
being evaluated. CMAQ models acid precipitation
(primarily nitrate, sulfate, and nitric acid), photo-
chemical oxidants (such as NOx, VOCs, and
ozone), and aerosol chemical and physical proper-
ties. The domain size (the area being modeled) can
range from 100 to 5,000 km. It can be run at
different grid sizes, including grids as small as
12 km. The model can be run from a workstation
(no supercomputer is required).
California Puff Model (CALPUFF). This
modeling system is composed of a diagnostic
meteorological processor (CALMET) and a puff
dispersion model (CALPUFF). CALMET can
accommodate meteorological data from a variety of
sources (including processed meteorological data
such as might be used to drive Models3/CMAQ,
RADM, or REMSAD). Since CALMET performs
a diagnostic wind analysis, the characteristics of the
meteorological conditions improve as the data
describing the situation improve. For instance, in
situations where the terrain influences are severe (as
within a system of interconnected deep valleys),
CALMET will require detailed terrain height
information to provide reasonable results, and its
results will improve dramatically when local
meteorological observations are available.
Furthermore, since diagnostic methods are being
used, application of CALMET requires an
experienced professional. CALPUFF is designed to
identify the impacts of sources on deposition of
gases and particulates. It includes a simplified
representation of sulfate and nitrate chemistry,
which is known to underestimate sulfate formation
since it does not address aqueous phase (in cloud)
sulfate formation (which is addressed by the other
models listed in this section). CALPUFF has been
shown to perform well for characterization of
pollutant transport and dispersion at distances of
300 km. Recent enhancements to CALPUFF have
been made to extend the distances to which it can
be applied and to address aqueous-phase sulfate
chemistry, but these enhancements have not been
fully tested and evaluated at this time. The model is
non-proprietary and can be run on a high-end
desktop PC (although some runs may take a week
or two).
Regional LaGrangian Model of Air Pollution
(RELMAP). This is one of the older atmospheric
deposition models that still sees some use by U.S.
EPA and others for estimating deposition rates for
unreactive pollutants, like most heavy metals and
dioxins. It has been used to analyze sulfur
deposition and deposition of mercury, cadmium,
lead, and other toxic pollutants. It was the primary
model used in the 1997 EPA Mercury Study
Report to Congress to assess the long-range
atmospheric transport of mercury emitted from
anthropogenic sources in the United States. This
analysis generated annual average deposition values
across the United States. The model is a regional
scale model and was developed using various
assumptions about wind and precipitation patterns
that do not hold true for smaller scales. Therefore,
51
-------
the geographic area that you want to get results for
should be relatively large, and the size of the
individual grid elements used by the model to
calculate concentrations and depositions should be
at least 20 km in size. The model is simple in
comparison to newer models now available or
under development, because of the linear
atmospheric chemistry and basic meteorological
processes it uses. It is best for heavy metals and
other relatively unreactive anthropogenic
compounds like dioxins. The value of RELMAP
lies in the fact that it is a straightforward way to get
an initial estimate of the deposition rate for
pollutants for which a more complex model is not
required. For pollutants such as sulfur, nitrogen,
mercury, and some reactive organics, where more
sophisticated models are available, those models
should be considered instead of RELMAP.
Hybrid Single Particle LaGrangian Integrated
Trajectory Model (HYSPLIT). HYSPLIT is being
used to model the fate and transport of dioxins,
atrazine, and mercury and can be configured to
model many other pollutants as well. The model is
usually run on 365 days of meteorological data, but
both shorter and longer periods can be run.
HYSPLIT can accept several kinds of meteorologi-
cal datasets. HYSPLIT can be downloaded or run at
a Web site of the NOAA Air Resource Laboratory
(http://www.arl.noaa.gov/ss/models/hysplit.html).
A module called TRANSCO has been developed
(for dioxins and atrazine to date) to allow the
model to output detailed source-receptor
information. In other words, HYSPLIT/
TRANSCO can estimate the contribution of each
source in the emission inventory to the modeled
estimate of total deposition for any given receptor
(such as a lake or estuary). This model must be run
by scientists at NOAA Air Resources Laboratory.
Modeling Costs. It is extremely difficult to put a
dollar value on what a modeling effort for a
particular pollutant in a specific geographical area
might require. Some of the cost depends on how
much work you need to do to get inputs together
and into a format that can be used in the model.
The degree of complexity of the pollutant
chemistry and the location of interest can have a
sizable impact on the resources required. The
resources required are also dependent on how much
model input data are already available. The
following table identifies some of the factors that
will influence the cost of modeling. The order of
magnitude values are provided for general
illustrative purposes and should not be relied upon
to establish a budget for a specific modeling effort.
52
-------
Questions that Influence Deposition Modeling Costs
Question: Do the pollutant(s) involved have simple or complex atmospheric chemistry?
  More expensive: Complex (e.g., nitrogen)
  Less expensive: Simple (e.g., cadmium)
Question: What quality of emission inventory is available?
  More expensive: None previously done
  Less expensive: Existing (good and complete) inventory
Question: What geographic scale of analysis is sought; i.e., what is the geographical scale of the sources
which may potentially contribute significantly to the waterbody?
  More expensive: Continental to global (e.g., PCBs, HCB, Hg)
  Less expensive: Local to regional (e.g., some PAHs)
Question: How complex is the terrain?
  More expensive: Complex with microscale weather (land-water interactions)
  Less expensive: Simple (flat, without local weather anomalies)
Question: Does adequate meteorological data exist for the modeling domain? Any unique microscale
weather influences?
  More expensive: Coastal environment with microscale meteorology (fog, lake-effect snow); situations
  where meteorological datasets to drive the modeling do not already exist and must be created (from
  databases of weather observations) using meteorological models.
  Less expensive: Adequate meteorological data to drive the model already exists, is easily obtainable,
  and is in a form compatible with the fate and transport model being used. The data must have
  sufficient temporal, horizontal, and vertical resolution to capture significant meteorological
  phenomena affecting the fate and transport of the pollutant in the modeling domain.
Question: Has the model being considered been used for this pollutant before?
  More expensive: No prior use with the pollutant
  Less expensive: Well utilized for the pollutant of concern
Question: Has the pollutant been analyzed successfully by any model before?
  More expensive: No, substantial work must be performed to "figure out" how to model the pollutant.
  Less expensive: Yes, a body of work is available in the scientific literature to aid in the adaptation of
  the model to simulation of the pollutant.
Question: Can this effort be coordinated with existing studies underway?
  More expensive: No other studies are being performed or are planned in the area.
  Less expensive: Modeling is already being conducted in the region of interest for this pollutant, and
  analysis for your receptor can be added relatively easily to this study.
Order of magnitude potential costs: more expensive scenarios, $100,000 to $500,000; less expensive
scenarios, $10,000 to $100,000.
53
-------
IX. Steps of a Well-Designed Assessment
This chapter summarizes the previous sections of this handbook into a stepwise strategy for easy reference.
You have already identified water quality or ecological problems in your waterbody to which air deposition
may contribute. The intent of the strategy is for you to learn as much as you can initially from existing
information and then plan to carefully determine what information you really need before expending large
amounts of resources on a monitoring and modeling program. This strategy is not intended as a
requirement; assessments can be well designed using other strategies. Yet, the steps presented below are ones
that managers and scientists involved in air deposition assessments have found useful and cost effective.
Step 1: Do a paper study. Look up estimated, measured, or modeled deposition rates in your watershed
from national assessments that have already taken place. A good place to start is the NADP assessment for
nitrogen and mercury compounds. The IADN is a good place to look for deposition rates of toxic pollut-
ants in the Great Lakes region. If those analyses do
not cover the pollutants of concern, look for
deposition rates estimated for other nearby water-
sheds in the research literature, for ambient air data,
or for emissions inventories.
Step 2: Perform a rough calculation to estimate the
load from atmospheric deposition. Take the
estimated deposition rate and convert that to a
loading by multiplying the rate by the area of the
estuary or body of water. Then add the indirect
deposition load to the direct deposition load. If the
paper study turns up several widely differing
deposition rates, take one at the high end and one
at the low end and calculate two estimated
loadings.
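A minimal sketch (in Python) of this rough calculation is shown below; the deposition rates, water surface area, and indirect load estimates are illustrative placeholders, not values from an actual paper study.

    low_rate, high_rate = 6.0, 14.0    # kg/ha/yr, low and high ends from the paper study
    water_area_ha = 12_000             # surface area of the waterbody (placeholder)

    # indirect loads (kg/yr) from the watershed pass-through estimate, low and high
    indirect_low, indirect_high = 40_000.0, 120_000.0

    for label, rate, indirect in (("Low", low_rate, indirect_low),
                                  ("High", high_rate, indirect_high)):
        direct = rate * water_area_ha
        print(f"{label} estimate: direct {direct:,.0f} kg/yr + indirect {indirect:,.0f} kg/yr "
              f"= {direct + indirect:,.0f} kg/yr")

Carrying both the low and high estimates into Step 3 shows whether the conclusion about the relative importance of atmospheric deposition holds across the full range of uncertainty.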
Step 3: Compare the estimated deposition load
with other pollutant sources, including point
source discharges and nonpoint source discharges
such as stormwater or agricultural runoff and
erosion. If the estimated deposition load is on the same order of magnitude as or larger than other sources, continue additional investigation of the air deposition component. However, if the estimated deposition load is small or comparatively much smaller than other loadings, you may want to consider using your limited resources to better understand the loadings from point sources and other nonpoint sources rather than launch into additional quantification of the air deposition. This does not mean that there are no environmental impacts from air deposition, only that air deposition monitoring is much more expensive than water monitoring.
Once the paper study has been done, the estimated loadings to the waterbody have been calculated, and
the atmospheric loadings are estimated to be a significant portion of the total load, it is time to think
about atmospheric deposition monitoring and/or modeling.
Putting Atmospheric Deposition in Context with Other Sources of the Pollutant

The Sarasota Bay National Estuary Program began to suspect that atmospheric deposition of nitrogen might be a significant problem in the bay when deposition was identified as between 25 and 30% of the nitrogen load next door in Tampa Bay. However, there was reason to think atmospheric deposition was not as significant a source as stormwater runoff and point source loadings. So the NEP conducted a monitoring study to compare atmospheric and stormwater loadings to the bay. The purpose of the study was to determine the most significant source of nitrogen to the bay. Preliminary results indicate that, although there is substantial atmospheric deposition to the bay, the atmospheric pathway is not as significant a source as stormwater loadings. This illustrates the importance of putting the atmospheric load in context with the other sources of the pollutant so that priorities can be established to use limited resources in the most environmentally effective manner.
54
-------
Step 4: Decide the question(s) the monitoring and/or modeling needs to answer. Do you want to know total annual loads to a body of water? Are there particular seasons or weather events that are more important than others? Do you want to know the effect of atmospheric deposition on the plants and animals in the river, lake, or estuary? Do you want to know how much deposition is coming from local sources or in-state versus out-of-state sources? Do you want to identify the atmospheric deposition coming from a particular source category in your area (e.g., several municipal waste incinerators) or a particular source (e.g., a single coal-fired utility)?

Step 5: Look for partners. This may include agencies, universities, researchers, non-profit groups, and anyone else who is already doing any atmospheric work, has any interest in doing atmospheric work, or has resources (money or staff) available to help conduct atmospheric deposition work. Atmospheric deposition monitoring and modeling are time-intensive and expensive and generally require a relatively high level of experience and expertise to do well. Therefore, the key to a successful atmospheric deposition study is to leverage as much support as possible at every stage of the project.
Step 6: Form an advisory group. The technical portion of the group will answer questions about the monitoring details—what type of monitoring equipment to use, what protocols to use, what type of modeling is appropriate, what data will be needed to do adequate data analysis and how to get it, and other technical details. The non-technical members will make sure the study is an accepted part of the larger management framework and that the data collected can and will be used for management purposes. The point is to get buy-in as early as possible to avoid "good-data, bad-data" arguments later on.
Example of an Advisory Committee

The Tampa Bay Estuary Program (TBEP) was one of the first to assess nitrogen deposition to a coastal ecosystem. Since TBEP has no atmospheric scientist on staff, the TBEP senior scientist assembled a national advisory group to help develop the program. The advisory group includes nationally recognized experts in wet and dry deposition monitoring methodologies for nitrate and ammonia (and more recently mercury), national atmospheric deposition program staff with technical knowledge of modeling, and local stakeholders, including the counties and Tampa Electric Company. Since TBEP does not do most of the monitoring or modeling work itself, the county and university scientists doing the work also sit on the advisory group. The advisory group meets periodically to answer specific complex questions that require a group consensus. The advisory group responds to other questions on an as-needed basis through individual telephone calls, conference calls, or written correspondence.
Step 7: Decide on an assessment strategy. This may be
monitoring, modeling, or more likely some combina-
tion of the two. The advisory group should generally
agree with the strategy you choose. If they do not,
make sure other scientists do (then put them on your
advisory committee). If the strategy is not scientifically
defensible, it is not worth doing because the ensuing
discussion will revolve around the data, not what to do
about the results; and few people will feel comfortable
making decisions based on questionable data.
Step 8: Find the resources to carry out the strategy.
These may be grants, contributions from non-profit
groups or industry, or in-kind donations. Save 30% of
the total to interpret the results or get an in-kind
donation to do it. If resources are tight, analyze only a
subset of the samples you collected and store the rest.
The stored samples can be analyzed at a later date if
warranted by the initial results and additional funds
become available.
55
-------
Step 9: Begin the assessment. Remember to use the advisory group to answer questions and get around
roadblocks. Begin analyzing data as soon as possible.
Step 10: Reassess periodically. When the initial phase of the assessment is done (approximately one year or
so), take stock. What has been learned? What still needs to be learned? Have priorities changed? Remember
the theory of adaptive management: it is OK (actually it is more than OK, it is a good idea) to change the
strategy if it is not working or not meeting the needs for which it was designed.
56
-------
X. Identifying the Sources
If atmospheric deposition is identified as a significant problem, the next step is often to identify the responsible sources. Some sort of source attribution is generally necessary before anything can be done to solve the problem associated with the source. Source attribution is often called "attribution" instead of "identification" because, in many cases, the exact source cannot be identified from the crowd of possibilities. Instead, certain types or categories of sources (e.g., municipal waste combustors within 50 miles or hog farms from six counties) or several combinations of sources in a geographic area are identified as contributors to the total atmospheric deposition load. (Note: A description of systems for classifying industry or source categories can be found in the section on inventories beginning on page 47.)
Sometimes the results of this type of analysis lead to the development of laws or agreements to reduce
emissions from particular types of sources. For example, Title IV of the 1990 CAAA was passed to
regulate certain utilities based on knowledge that the source category as a whole was responsible for a
substantial portion of the lake acidification problem in the Adirondacks. Therefore, "source attribution"
does not necessarily mean being able to point to the specific smokestack or area source; it may mean
narrowing down the options to a collection of sources that all contribute to some portion of the problem.
Source attribution may be the most technically difficult part of solving environmental problems caused by
atmospheric deposition. Despite all the concerns about accuracy and the sensitivity of deposition sampling
methods, contamination, and quality control, measuring deposition is relatively straightforward. Source
identification, in contrast, involves some sort of tracking of the pollutant from the source to the area
where it is being deposited. This is complicated by the fact that many sources emit the same pollutants.
Furthermore, the pollutants are dispersed and do not necessarily travel in a straight line, and they may be
transformed in the atmosphere before being deposited. Therefore, you
should expect to work with individuals trained and experienced in the
methods used for source attribution.
This section contains information on
• Identifying sources
• Designing a source attribution strategy.
Back-Trajectory Analysis

Back-trajectory analysis is an analysis of meteoro-
logical data, specifically air transport, to estimate
the location of an air parcel earlier in its history. An
air-parcel trajectory is the path of a parcel of air as it
is transported by the wind. A backward trajectory
follows the parcel of air backward in time. For a
given deposition sample, a back-trajectory analysis
would help answer the question: "Where did the air
that carried the pollutant to my sampler come
from?" For example, the analysis might indicate
that three hours earlier the air parcel carrying the
pollutant had been six miles to the west of the
sampler, and nine hours earlier, it had been nine
miles northwest of the sampler. This analysis alone
cannot tell you which types of sources or individual
sources emitted the pollutants because you don't
know how far along the trajectory (i.e., how far
back in time) the pollutant originally was emitted
into the air parcel. You also do not know from the
trajectory alone what portion, if any, of the
pollutant was deposited to the ground by
precipitation or dry deposition processes before
reaching the sampler. Further, just as forward
-------
trajectories are only estimates of where a pollutant
will be in the future because of dispersion, the same
applies for back trajectories. For instance, a
pollutant source that is located near, but not
directly along, a back trajectory may be contribut-
ing to the pollutant measured at the sampler.
In spite of its limitations, back-trajectory analysis
can provide useful information for pollutants for
which there is not an inventory of emissions
sources. The direction of transport of the air parcels
bringing pollutants to the site can provide clues
about potential sources. It can be used as a screening
tool, even if you have an emissions inventory. A
comparison of the trajectory patterns with data
from emissions inventories would help you deter-
mine which sources should be examined with
additional analyses. Note, however, that if the
pollutant has been through the grasshopper effect
(defined on page 5), then the back-trajectory
analysis would indicate the area of the last point at
which the pollutant was re-emitted to the air, rather
than the original emission source.
There are several conditions for running a back-
trajectory analysis. The deposition sample should
have been collected over a time period of a day or
less. Over a longer period, the weather patterns are
usually too variable to reasonably estimate a trajec-
tory associated with the deposition sample. In
addition, the meteorological data used should be
temporally resolved on a short-term (one-hour or
three-hour) basis, rather than a daily basis, to
account for precipitation events, variations in wind,
etc. If you have wet-deposition samples, then the
trajectory should start during the time of the
precipitation, which you know if you have hourly,
as opposed to daily, precipitation data. Archived,
model-generated weather data based upon measured
weather data from surface sites and balloons are
available on the NOAA Ready Web site at http://
www.arl.noaa.gov/ready.html. This allows you to
run the back trajectory based on these weather data
archives, without having to have any meteorological
data of your own. However, on-site weather data
may be important, if the flow around the site is
very complex (e.g., the site is located in an area of
very complex terrain, or the area of interest is very
small, say on the order of 50 square miles). In this
case, the back trajectories couldn't be run back very
far because the wind direction and speed at the site
would be different than that in the regions sur-
rounding the site. Remember, to make a meaning-
ful calculation, the back-trajectory model has to
have appropriately resolved meteorological data in
the entire domain that the air parcel may have come
from. The better the meteorological data represent
reality, the better the trajectory will be.
Meteorological expertise is also needed to determine
what initial height(s) should be used for the analy-
sis. Air parcels in different vertical layers of the
atmosphere can experience different trajectories. A
ground-level trajectory is problematic due to the
effects of varying terrain on the flow of air, and it is
unlikely to represent the layer at which the pollut-
ant was carried before it was deposited. Too high of
a trajectory would probably not be representative of
the path of the pollutant as it was transported to
the sampling site. The choice of the initial height
should take into consideration meteorological
conditions when the deposition sample was taken
(e.g., daytime with vertical mixing or nighttime
with stable air masses). Multiple runs of the analysis
at various heights will provide a range of possible
trajectories. Looking at the area bounded by the
range provides a better chance of catching the true
path of the pollutant.
An additional reason for expertise in meteorology is
to determine how far back in time you can run the
analysis before the potential errors become so large
that it becomes a useless exercise. Typically, you
would expect runs that go back 12 to 48 hours.
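To make the trajectory idea concrete, the short Python sketch below steps a single air parcel backward in time through hourly wind vectors. It is only an illustration of the concept; real back-trajectory analyses use a model such as HYSPLIT with three-dimensional gridded meteorological data, and the wind values here are hypothetical.

# Minimal back-trajectory illustration: step a parcel backward through hourly winds.
# Hypothetical hourly wind components (u = east-west, v = north-south) in km/h,
# listed from the sampling time going backward in time.
hourly_winds_kmh = [(18, 5), (20, 4), (22, -2), (15, -6), (12, -8), (10, -10)]

x_km, y_km = 0.0, 0.0   # parcel position relative to the sampler (east, north)
dt_h = 1.0              # one-hour steps

for hour, (u, v) in enumerate(hourly_winds_kmh, start=1):
    # Moving backward in time means subtracting the transport during that hour.
    x_km -= u * dt_h
    y_km -= v * dt_h
    print(f"{hour} h earlier: parcel was ~{x_km:.0f} km east / {y_km:.0f} km north of sampler")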
An approach that can be thought of as "back-
trajectory plus" is called cluster analysis. If a lot of
monitoring data are suitable for a back-trajectory
analysis (for example, daily deposition samples
taken over a five-year period), statistical analyses can
be done to determine the probability of pollutants
coming from various places. First, the meteorologi-
cal data are statistically analyzed to determine
patterns or "clusters" of trajectories. Then, patterns
58
-------
in deposition can be examined in comparison to the
clusters. This provides a more powerful analysis than
looking at just a small set of deposition samples.
The obvious drawback is the amount of data
required.
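A minimal sketch of the cluster idea is shown below, assuming trajectory endpoints can stand in for whole trajectories: endpoints are grouped with an off-the-shelf k-means routine and deposition is then averaged within each cluster. All coordinates and deposition values are synthetic, and a real study would cluster full trajectory paths with far more care.

# Illustrative trajectory clustering: group back-trajectory endpoints, then compare
# average deposition for samples associated with each cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical 24-hour back-trajectory endpoints (km east, km north of the sampler)
# for 200 daily deposition samples, drawn from two synthetic transport regimes.
endpoints = np.vstack([
    rng.normal(loc=(-300, 50), scale=40, size=(120, 2)),   # regime A (from the west)
    rng.normal(loc=(80, -250), scale=40, size=(80, 2)),    # regime B (from the south)
])
deposition = np.concatenate([
    rng.normal(12.0, 2.0, 120),   # hypothetical deposition (ug/m2) under regime A
    rng.normal(4.0, 1.5, 80),     # hypothetical deposition under regime B
])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(endpoints)
for k in range(2):
    print(f"cluster {k}: mean deposition {deposition[labels == k].mean():.1f} ug/m2, "
          f"n = {(labels == k).sum()}")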
One advantage of the back-trajectory approach is
that it is relatively simple and inexpensive compared
to other source attribution analyses. The HYSPLIT
model described in the previous modeling chapter
(see page 52) can be run in a back-trajectory mode.
This model can be downloaded or run at a Web site
of the NOAA Air Resources Laboratory (http://
www.arl.noaa.gov/ss/models/hysplit.html). General
information about trajectories and other modeling
topics can be found at http://www.arl.noaa.gov/
slides/ready/index.html.
Chemical Mass Balance

This technique matches deposition to a source
category based on the chemical "fingerprint" of the
source and the measured deposition. The fingerprint
is the unique profile of chemicals that every process
emits.
The chemical mass balance technique does not
identify which smokestack, area, or mobile source
the emissions are coming from, just that it is, for
example, a pulp and paper factory and not a chlor-alkali plant. Usually the entire fingerprint does not have to be measured, only some key compounds. For example, researchers on the Chesapeake Bay know they are getting deposition from Philadelphia every time antimony (from a large antimony roaster in Philadelphia) shows up in higher concentrations in the deposition samples. Sometimes fingerprints from multiple sources can be identified at a particular site by "backing out" one signature after another. Some source types have known emissions fingerprints; others are less well known or not known. Your advisory group will help you determine which sources have good data available. If you do not have good data on source types potentially impacting a site, it is not worth doing this type of analysis because you will not be able to match the data you collect with any known sources.

An Example of the Chemical Mass Balance Technique

A particular incinerator may have an emission fingerprint that includes particulate matter of a certain size, dioxins, mercury, cadmium, copper, and antimony. Generally, if the concern from the water quality standpoint is mercury and dioxins, those will be the only compounds measured at the deposition site. But simply measuring the mercury and dioxins does not give much hint of where they are coming from. If the incinerator is suspected and its fingerprint can be found or measured, there is another option: the deposition samples can be analyzed for antimony as well as the mercury and dioxins. If they are found together, the incinerator may be contributing to deposition at the site. If they are not, the incinerator is probably not contributing to deposition at the site.
It may be easy to monitor for the key compounds
if the deposition sampling is being done with
filterpacks, depending on what you are looking
for. Additional metals can be measured from
filterpacks for very little extra money and can
provide invaluable information about potential
sources.
The chemical mass balance technique is
particularly useful when used in conjunction with
back-trajectory analysis. The back-trajectory
analysis provides the general path of the air mass
that caused the deposition, and the chemical mass
balance analysis provides the type of source to
look for in that path. The analysis cannot pin
down the exact source location (unless there is
only one possibility), but it can significantly
narrow the range of possibilities and may provide
enough information to begin the process of
getting reductions from those sources. The
chemical mass balance technique is a little more
resource- and time-intensive than back-trajectory
modeling. While there may not be additional
equipment to buy, there are some additional
sample analysis costs, and the time to find source
fingerprints and match them with the data results.
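Conceptually, the chemical mass balance calculation is a least-squares fit: the measured chemical profile of a deposition sample is expressed as a weighted sum of source fingerprints, and the fitted weights are the estimated source contributions. The sketch below illustrates that idea with invented fingerprints and measurements; actual applications rely on measured source profiles, uncertainty weighting, and established receptor-modeling software rather than this bare-bones fit.

# Illustrative chemical mass balance: solve measured = fingerprints @ contributions.
import numpy as np

# Hypothetical source fingerprints (relative composition by species; columns = sources).
fingerprints = np.array([
    # incinerator, coal utility
    [0.40, 0.55],   # mercury
    [0.30, 0.05],   # dioxins
    [0.25, 0.05],   # antimony
    [0.05, 0.35],   # cadmium
])
measured = np.array([0.46, 0.20, 0.17, 0.17])  # hypothetical deposition-sample profile

contributions, residual, *_ = np.linalg.lstsq(fingerprints, measured, rcond=None)
for name, c in zip(["incinerator", "coal utility"], contributions):
    print(f"{name}: estimated contribution {c:.2f}")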
59
-------
Dispersion Modeling

Dispersion modeling involves using an already-
developed model or developing a new model to
simulate transport and deposition from emission
sources to the waterbody of concern. It can be
thought of as "forward-trajectory" modeling;
instead of going backward from deposition site to
the source, dispersion modeling models transport
forward from the source to the deposition site. This
type of modeling is data-intensive. You'll need good
emissions inventory and meteorological informa-
tion.
Various approaches can be used. You can do facility-
specific model runs with just the emissions from
one or a particular set of sources. For example, if
you are interested in learning about the contribu-
tion of dioxins from local waste combustors, you
could do dispersion modeling of the dioxin
emissions just from those sources. An alternative
approach is to do a baseline run with all the
emissions in the inventory, then do additional runs
without the emissions from a given source or set of
sources. A comparison of the two runs would
indicate the contribution from the source or sources
in which you are interested. For example, to
estimate the contribution of each suspected source
of mercury, a model run would be made that
includes emissions from all sources. That is the
baseline. A second run would be made without emissions from one source type (say, coal-fired utilities), then another run would be made with the coal-fired utilities included but without municipal waste combustors,
and so on. Your modeling experts will be able to
help you choose the most appropriate approach to
answer your questions. For example, the first
approach is not viable if the model needs a broad
inventory of emissions to simulate atmospheric
chemistry that could affect deposition rates.
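The baseline and zero-out runs described above are compared by simple subtraction of their deposition totals. A minimal illustration, with hypothetical numbers standing in for actual model output:

# Hypothetical annual mercury deposition to the waterbody (kg/yr) from model runs.
baseline_run = 120.0                   # all inventoried sources included
run_without_coal_utilities = 85.0      # same run, coal-fired utility emissions removed
run_without_waste_combustors = 98.0    # same run, municipal waste combustor emissions removed

print("coal-fired utilities contribute ~", baseline_run - run_without_coal_utilities, "kg/yr")
print("waste combustors contribute ~", baseline_run - run_without_waste_combustors, "kg/yr")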
A new idea in dispersion modeling is to use some
type of source labeling to identify sources within
the model. This can be thought of as "virtual"
labeling: labeling all emissions from one source as
watermelon jellybeans, all from another source as
cherry, from a third source as licorice, and so on.
The fraction of watermelon jellybeans deposited at
any location is the fraction of pollution that source
(say, incinerators in the Denver metropolitan area) is
responsible for. This type of labeling also works
much better for compounds that travel in
mathematically simple ways. It has not yet been
tried for pollutants with non-linear chemistry, such
as nitrogen, whose atmospheric chemistry (and
therefore transport and deposition) depends on the
presence of other nitrogen compounds, sulfur
compounds, VOCs, heat, and ozone.
Dispersion modeling must be groundtruthed with
actual meteorological, ambient air, and deposition
measurements. The grid size must be chosen
carefully to give the resolution needed, and the
meteorological data must be compatible with that
grid size. Smaller grid sizes generally increase the
amount of time it takes to run the model and the
expense.
Although it is possible to develop a new dispersion
model, it is highly recommended that managers use
one "off the shelf" that has already been developed
and tested. Not only is it cheaper and easier, but the
results will be more robust because the model itself
has already been tested and peer-reviewed. The
chapter What You Need to Know About Air
Deposition Modeling describes several dispersion
models that could be considered. Since these
models also can be run to estimate total deposition
rates, it is important to be clear on what data are
used and how the model is being run. This is where
building a good working relationship with the
modelers from the beginning of the project will pay
off; they will make sure the model runs are robust
scientifically and answer the questions you need
answered.
Dispersion modeling analysis can get expensive,
even if an existing model is used. It depends to a
large extent on which model is being used
(generally, a model you can run yourself is cheaper
than one you have to pay someone to run for you)
and what form the input data are in. Often a large
part of the cost of running a dispersion model is
preparing the inventory and meteorological data to
feed into the model. It is important to use the best
inventory and meteorological data available for the
60
-------
year(s) in which you are interested. Although a set
of several dispersion model runs can cost as much as
several hundred thousand dollars, it can cost an
order of magnitude less if most of the meteorologi-
cal and emissions data needed for modeling are
already available.
Isotopic Ratios

Elements are present on earth as different isotopes,
or weights. A well-known example is a radioactive
isotope of carbon, carbon 14. (The most abundant
carbon isotope is carbon 12.) Carbon 14 is used to
determine the age of a particular material based on
its ratio of carbon 14 to carbon 12 (14C/12C).
Isotopic nitrogen ratios use the same characteristic
in a slightly different way. Instead of measuring the
age of a material based on a decay rate, nitrogen
isotope ratios can narrow down the source by
matching the 15N/14N ratio at the source and in
the deposition sample (or other receptor).
Source categories emit different ratios of 15N/14N. The ratio is usually presented as the "enrichment" of 15N versus some measured baseline, which is expressed as "delta 15N" (δ15N). A positive δ15N is enrichment of 15N; a negative δ15N is depletion of 15N. For example, one study measured the δ15N of fertilizer as approximately 0‰ (parts per thousand; like a percent except out of 1,000 instead of 100), nonpoint runoff from agricultural fields as +6 to +9‰, and effluent from sewage treatment plants as +11 to +14‰. Animal waste lagoons are even more enriched in 15N, ranging from +16 to +27‰ depending on the animal species. In Europe, studies have shown that dissolved ammonium has a δ15N of approximately -12‰. Isotopic ratios for atmospheric sources can also be measured and used in the same way.
The δ15N can be measured at the receptor (a
receptor could be the deposition samples, rivers,
estuaries, or estuarine plants or animals) to estimate
the source of nitrogen. Like the chemical mass
balance method, this method only distinguishes
classes of sources, not the actual source. For
example, deposition measured at a particular site
may have a ratio that is similar to that of organic
fertilizer rather than fossil fuel combustion. That
would suggest agricultural sources rather than
power plants or mobile sources.
Isotope ratios have a distinct advantage over the
other methods in that they can be used to measure
the percentage of nitrogen from air deposition in
receptors other than the deposition sampling site.
The δ15N can be measured in runoff, streams,
estuaries, or algae. Theoretically, this would make it
possible to estimate what role atmospheric
deposition had in any particular algae bloom.
There are some uncertainties with using isotopic
ratios as tracers. The biggest uncertainty is that what
is measured at any given receptor is sometimes a
mixture of nitrogen from different sources. A
mixture of sources can do one of two things. It can
produce a ratio at the receptor site that is not the
same as any of the sources. Or, the ratio at the
receptor site can be the same as a particular source
type (say, +12‰), but actually be the result of a mixture of source types with ratios of +6‰ and +18‰. One way to get around some of this problem is to use more than one isotopic ratio. Oxygen ratios (18O/16O) are sometimes used in conjunction with nitrogen ratios to narrow down the source type. For example, if the δ15N indicates two source categories as possible sources, the δ18O
may point to one of those sources, making it the
likely source. Like many of these methods, isotope
ratios are best used in conjunction with other
methods and not relied upon as the only method to
identify sources.
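For readers who want the notation spelled out, the sketch below shows the delta definition and the two-end-member mixing arithmetic in Python. The reference ratio and end-member values are generic illustrations, not measurements from this handbook, and a real analysis would carry uncertainty and fractionation corrections.

# delta-15N definition and a simple two-end-member mixing calculation (illustrative).

R_STANDARD = 0.0036765  # 15N/14N of atmospheric N2, the conventional reference

def delta15N_permil(r_sample):
    """delta15N (per mil) = (Rsample / Rstandard - 1) * 1000."""
    return (r_sample / R_STANDARD - 1.0) * 1000.0

def mixing_fraction(delta_mix, delta_a, delta_b):
    """Fraction of nitrogen from end member A in a two-source mixture."""
    return (delta_mix - delta_b) / (delta_a - delta_b)

print(f"example: R = 0.003720 -> delta15N = {delta15N_permil(0.003720):.1f} per mil")

# A receptor value of +12 per mil could come from one source, or from a mixture of
# hypothetical end members at +6 and +18 per mil:
f_a = mixing_fraction(12.0, 6.0, 18.0)
print(f"fraction from the +6 per-mil source: {f_a:.2f}")
print(f"fraction from the +18 per-mil source: {1 - f_a:.2f}")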
Another potential limitation of the use of isotopic
ratio tracers is that, in the atmosphere, one must
assume the isotopic composition does not change
from the emission source to the point of
deposition. Other assumptions about how these
ratios are enriched or depleted as nitrogen inputs
move within the food chain are also required when
carrying these measurements into surface waters.
This method of source identification is being tested
in Sarasota Bay NEP and the Neuse River Basin in
North Carolina, among other places, but it is not
yet well-developed or accepted for use with
61
-------
nitrogen compounds in the United States. Most
American researchers are unfamiliar with the
method, and there are few laboratories with the
capability to analyze nitrogen compounds for their
isotopic signatures. If you do decide to use this
technique, make sure that the advisory group is
committed to the choice and that the necessary
equipment and expertise are available. Both could
be resource-intensive.
Tracers
Artificial tracers can sometimes be inserted into the
emissions from a suspected source and tracked
through ambient air and deposition monitoring.
This is usually done to verify models. While
theoretically very simple, the reality is much more
difficult. For a tracer to work well, it must behave
like the pollutant of concern (travel and deposit in a
similar way), be inert (not affect pollutant of
concern), be unique and benign in the environment
(not emitted by other sources), and be amenable to
detection at very low concentrations. These
conditions make it difficult to find suitable tracers.
The benefit of using a tracer is that, if it does work,
it identifies a specific source. Other methods, except
dispersion modeling, can only narrow down source
areas or categories.
If a suitable tracer can be found, there is still the
substantial logistical problem of getting access to
the suspected emission site and inserting the tracer
into the emission stream. This is not a small
problem. It requires having an extremely good
working relationship with suspected sources and
sometimes getting permission from the state or
EPA. Tracers are usually a "one-shot deal" so it is
important to get everything right the first time.
There must be a suite of monitors available to track
the tracer toward the area of interest. This method
is (obviously) very sensitive to existing weather
conditions, and the weather scenario should be
chosen carefully. Once you go to all the work of
finding a tracer and getting permission to use it, the
whole thing will be useless (or worse) if you insert
the tracer during the 20% of the time the wind
blows away from your watershed. Therefore,
extreme care needs to be used when using tracers. It
can also be costly to hire the necessary experts to
conduct the field work associated with tracer
studies. Tracers should probably be viewed as the
last resort, to be used only if all the other
techniques for source identification do not provide
the needed information.
Designing a Source Attribution Strategy
The easiest way to design a source attribution
strategy is to start with the simple methods and
work your way up to the more complex (and more
expensive) ones.
The first thing you should do is find out what
sources emit the pollutant(s) in which you are
interested. A list of pollutants and their largest
sources is in Appendix 1.
Then get a recent inventory for your area and see if
it makes sense. It helps to do this with someone
who knows inventories, but it also requires
someone with a working knowledge of the local
area. Inventories may miss sources or have old data
that do not take into account changes in land use,
production methods at factories, closure of
industrial sites, or other obvious things that only
someone familiar with the area would know.
The first source attribution tool to try is the back-
trajectory analysis. It is free—just download it from
the NOAA Web page—and does not require any
special training to use. Compare the results with the
inventory to see how suspected sources compare to
the back-trajectory results. If there are suspected
sources in the path of the most contaminated air
masses, it is reasonable to suspect those sources are
responsible. This is as far as many managers go to
identify sources. With this data, many feel
comfortable approaching sources and/or states to
talk about options to reduce emissions.
If that is not enough information, the chemical
mass balance analysis, where source fingerprints are
matched with chemical profiles found in deposition
samples, can be used to support back-trajectory
results for toxics. Isotopic ratios can do the same for
nitrogen, yet there can be many uncertainties with
their use; and they are best suited for identifying
62
-------
what portion of nitrogen in the environment
comes from atmospheric deposition (as opposed to
what atmospheric sources are emitting it).
Dispersion modeling is a higher-end tool to use if
more concrete connections are needed. Much of
this is done at the federal level on a regional or
national scale in support of regional or national
regulations, but it is sometimes done at the state
and local level as well. Check to see what types of
source-receptor modeling results are or will be
available before deciding to do your own.
Tracers are a last resort if none of the other methods
work. They are extremely difficult to do well and
should only be done if it is certain that results will
be useful in the management context.
63
-------
XI. Now What?
So you've gone through the whole process—thinking that you might have a problem, confirming that you
have a problem, making estimates about the importance of atmospheric deposition, refining those
estimates with monitoring and/or modeling, getting an idea of where the pollution is coming from—and
now you know. Atmospheric deposition from several sources is a significant source of pollution to the
watershed. Now what?
Your next steps will depend on the underlying goals, requirements, and processes involved in your
watershed management activities. For your purposes, it may be sufficient to know how much air
deposition is contributing to pollution compared to other sources. Your management activities may then
focus on other sources of pollutants to the waterbody. On the other hand, you may want to expend
additional effort on the air sources. In doing so, you will work with air specialists to investigate what
regulations will be implemented and what reductions might be expected. If additional emission
reductions are important to achieving your waterbody goals, you can act
as a catalyst for initiating new or more stringent regulations at the
federal, state, tribal, or local level or for providing incentives to reduce
air pollution voluntarily.
The information provided in this section should help you work more
effectively with other water and air professionals in government and
industry. Specifically, this section discusses
• The importance of coordination with air and water specialists,
as well as with other agencies, in dealing with air deposition
• The requirements for state water quality managers to develop TMDLs
for impaired waters, particularly as they relate to air deposition
• Air pollution programs and tools for reducing emissions
• Managing old pollution
• Follow-up processes.
One of the biggest lessons everyone who works
with air deposition learns is the degree to which the
air and water environments are connected. Because
air and water management tends to be separate, it is
important for managers to build relationships with
their counterparts. You will find that, to get any-
thing done about atmospheric deposition, you need
to have a good relationship with local, state, and
tribal air pollution managers. This involves under-
standing each other's language, understanding the
limitations of what each can do, and brainstorming
about how management strategies can fit together.
64
The air program professionals can be helpful in
many ways. From them, you can find out about the
sources of pollutants you are concerned about. You
can learn what restrictions on emissions are in place
for those sources, as well as what reductions are
planned in the near future. They are also a resource
for determining how possible additional reductions
could be achieved, both from a technical and legal
standpoint.
Your work may benefit the air program as well. It
can show where there are water quality benefits, as
well as air quality benefits, of emission reduction
activities. For example, assume an air program goal
-------
is to reduce air emissions from cars by encouraging
alternative commuting options. Also, assume you
have partnerships with active community organiza-
tions interested in educating the public on water-
shed protection. These organizations could
disseminate information about benefits to the
watershed from reducing air pollution from cars,
which gives commuters additional incentives to use
alternative commuting options.
In the context of TMDLs (discussed in more detail
in the next section), there should be good
coordination between air and water agencies at the
state and local levels. This is to ensure that any in-
state load reductions called for in a TMDL can be
accomplished through air permits or other
mechanisms. (Although air permits have not yet
been revised to meet load reductions assigned in a
TMDL, in some cases they may. TMDL
implementation might then have to be consistent
with the state permitting schedule, as is the case
with water permits.) Good coordination is also
necessary to ensure that the data needed to develop
TMDLs in the first place are collected. In some
tribes, states, EPA regions, and other water
management agencies, there is already a good
relationship between air and water programs. In
others, the air and water organizations are not used
to working together, and both sides need to make
the effort to make cooperation a reality.
In addition to coordination between air and water
programs, it may be important to work with other
agencies. For example, for agricultural sources of air
emissions, it would be useful to coordinate with
local representatives of the U.S. Department of
Agriculture's Natural Resources Conservation
Service. They will have contacts with the
agricultural community and be knowledgeable
about their management practices. The National
Park Service or the U.S. Forest Service may also be
useful partners.
In many cases, the process of developing a
TMDL may be the catalyst for doing the initial
investigation of atmospheric deposition. A TMDL
basically identifies the maximum amount of a
pollutant that can be in the water and still meet
state water quality standards, as well as how much
pollutant loads need to be reduced to meet water
quality standards. A TMDL also divides up or
"allocates" pollutant loads among the various
sources discharging to a waterbody. In some cases,
the TMDL connection won't be made until after
the deposition is estimated. Regardless of the initial
reason for characterizing atmospheric deposition,
once you have identified it as a significant source of
pollution to an impaired water body, it is possible
that the issues surrounding TMDLs will be raised.
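As general background, a TMDL divides an allowable total load among point sources (wasteload allocations), nonpoint sources including atmospheric deposition (load allocations), and a margin of safety. The sketch below is a generic illustration of that bookkeeping with hypothetical loads; it is not drawn from any actual TMDL.

# Generic TMDL bookkeeping with hypothetical loads (kg of pollutant per year).
allowable_load = 900_000       # load the waterbody can receive and still meet standards
margin_of_safety = 90_000      # set aside for uncertainty

current_loads = {
    "point sources (wasteload allocation)": 400_000,
    "stormwater and agricultural runoff": 350_000,
    "atmospheric deposition": 250_000,
}

current_total = sum(current_loads.values())
allocatable = allowable_load - margin_of_safety
reduction_needed = current_total - allocatable

print(f"current total load: {current_total:,} kg/yr")
print(f"allocatable load (after margin of safety): {allocatable:,} kg/yr")
print(f"overall reduction needed: {reduction_needed:,} kg/yr "
      f"({100 * reduction_needed / current_total:.0f}%)")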
Under the Clean Water Act, each state must
develop a list of waterbodies where water quality
standards developed by these jurisdictions are not
being met. This list is called the 303(d) list, after
the section of the Clean Water Act requiring the
list. In the 1998 guidance for the 1998 303(d) lists,
EPA reiterated that the section 303(d) list provides
a comprehensive inventory of waterbodies impaired
by all sources, including point sources, nonpoint
sources, or a combination of both. The guidance
also clarifies that states should list all waters
impaired either entirely or partially due to
pollutants from atmospheric deposition.
A major challenge of any management strategy that
includes atmospheric deposition is figuring out
how to achieve the load reductions in air sources
necessary to meet water quality standards. The
TMDL program does not create any new laws to
address specific sources; it relies on other existing
laws or programs to implement any load reductions
specified in a TMDL. The most frequent concern
raised about air deposition and TMDLs is the
limited ability a state has to control sources outside
its boundaries. In some cases, out-of-state sources
may be responsible for a large part of the
atmospheric deposition load, but local sources can
also contribute a significant amount.
States first need to make sure they have identified
local sources (both air and water) that they have the
authority to control through state regulations or
voluntary agreements. States can then identify how
much pollution is coming from out-of-state
sources (both air and water). A state will have to
65
-------
coordinate with other states and EPA to determine
how best to address those sources.
It is important to remember that, in developing a
TMDL, a state may find that it is possible to
achieve state water quality standards through
reductions in water point sources alone, without
including any reductions in loadings from air
sources. However, if a large portion of the total
load is from air sources, and that load is not
considered in the TMDL, the reductions from the
water sources may not be sufficient to achieve the
intended water quality benefits (or attain the water
quality standard) because there is "extra" pollution
that is not accounted for in the TMDL. Therefore,
it is important that atmospheric deposition be
included in the development of TMDLs.
To date, states have developed a small number of
TMDLs for nitrogen that identify the total loading
from atmospheric deposition, along with the
nitrogen loadings from other sources. Some of the
methods described earlier were used to estimate the
contributions from air sources, such as the use of
deposition estimates produced by the RADM
model. In determining the nitrogen reductions
needed to meet water quality standards, these
TMDLs consider as reductions in atmospheric
loadings the estimated emissions reductions
expected as a result of existing federal legislation or
regulations. For example, a few TMDLs have stated
that total nitrogen deposition is expected to
decrease by a certain amount as a result of
implementing existing CAA regulations. These
regulations include national ambient air quality standards for particulate matter and ozone (and other requirements designed to help achieve them), the acid rain program, and mobile
source regulations. (The next section on air
program basics will orient you to these regulations,
and those noted in the following paragraph, if you
are not familiar with them.) Another nitrogen
TMDL referenced a local effort to improve sea
grass productivity in an estuary through voluntary
nitrogen reductions by local utilities and other
initiatives.
Efforts are also under way by EPA and some states
to develop TMDLs for waterbodies impaired
primarily by the atmospheric deposition of
mercury. Several TMDLs estimate the current or
baseline mercury deposition. These TMDLs use
various methods to estimate deposition, such as
deposition data from mercury deposition network
sites and modeling data from the 1997 Mercury
Study Report to Congress. Similar to the nitrogen
TMDLs described above, expected reductions in
mercury deposition are estimated from the baseline
based on CAA regulations already promulgated but
not fully implemented. For mercury, these include
national standards for sources of hazardous air
pollutants and for solid waste combustion units.
The anticipated reductions are then compared with
the total reductions needed to meet water quality
standards.
The technical approach for estimating current and
projected atmospheric loads may vary from site to
site and pollutant to pollutant. Several techniques
(as described earlier) can be used to identify sources
of air pollutants. EPA is conducting a pilot project
to test methods for determining the relative
contributions of local and out-of-state sources of
mercury, as well as several source categories in
specific geographic areas. A report from that pilot
project is expected to be available in 2002.
As our ability to quantify the relative contribution
of different sources or source categories improves, it
will be possible to develop more targeted
approaches to reducing atmospheric deposition
loads in a TMDL context.
The federal legislation that gives the EPA the
authority to regulate emission of pollutants into the
air is the Clean Air Act. It was originally passed in
1970, and was most recently amended in 1990.
Two CAA programs that specifically address
deposition to waterbodies are the Great Waters
program and the Acid Rain program.
The Great Waters program was created to study
the extent and effects of atmospheric deposition to
the Great Waters. It can be found in section 112(m)
of the CAA. The Great Waters include the Great
Lakes, Chesapeake Bay, Lake Champlain, and
-------
estuaries in the National Estuary Program and the
National Estuarine Research Reserves. The program
published three reports to Congress on "Deposition
of Air Pollutants to the Great Waters," most
recently in June 2000. The EPA was also required
under this portion of the CAA to determine
whether it has sufficient authority to adequately
address the problems of atmospheric deposition of
pollutants of concern to the Great Waters and in
1997 determined that it does have adequate
authority.
The Acid Rain program (Title IV of the CAA) was
created to reduce the adverse effects of acid
deposition by reducing emissions of nitrogen oxides
(NOx) and sulfur oxides (SOx) from electric
utilities. This program features a cap-and-trade
program that caps sulfur dioxide emissions at one-
half of 1980 levels and requires each utility to hold
one allowance for every ton of sulfur dioxide (SO2)
emitted. Utilities can trade these allowances,
allowing the industry as a whole to comply with
the cap at a lower cost than traditional regulatory
methods. Utilities can also bank allowances for
future use, a feature that caused many sources to
reduce their emissions in the first years of the
program more than required. For NOx, the Acid
Rain program limits the rate of emissions from
utilities (i.e., pounds of NOx emitted per British
thermal unit of power generated). There is no total
cap on NOx emissions from electric utilities under
this program. Both the SO2 and NOx emission
reductions required by the acid deposition program
are being implemented in two phases: Phase I began
for SO2 in 1995 and for NOx in January 1996.
Phase II for both pollutants became effective in
2000.
While the Great Waters and Acid Rain programs
primarily address air deposition, other CAA
programs address air problems in addition to
deposition, such as smog and risks due to inhalation
of toxic pollutants. Nevertheless, these other
programs reduce emissions that may adversely affect
aquatic ecosystems.
In the ambient air quality program, national
ambient air quality standards (NAAQS) set national
standards for acceptable concentrations of specific
pollutants in outdoor air. These pollutants, called criteria pollutants, are found commonly throughout the country. They threaten public health and the environment across broad regions of the country and are emitted in relatively large quantities by a variety of sources.

Criteria Pollutants
Carbon monoxide (CO)
Lead (Pb)
Nitrogen oxides (NOx)
Ozone
Particulate matter (PM)
Sulfur oxides (SOx)
Areas of the country where measured air quality
does not exceed the NAAQS more often than
allowed are designated attainment areas; areas
where air quality exceeds the NAAQS more often
than allowed are non-attainment areas. Areas that
have gone from non-attainment status to
attainment status are called maintenance areas.
States are required to have state implementation
plans (SIPs) describing how they will achieve the
NAAQS. These plans include requirements for
emissions from stationary and mobile sources (such
as inspection and maintenance programs). They also
include requirements for precursors to criteria
pollutants, namely VOCs, a group of organic
pollutants that react with NOx in the presence of
sunlight to form ozone. States have the latitude to
develop specific requirements, although the SIPs are
subject to approval by EPA.
The EPA recently published more stringent
NAAQS for ozone and PM. The new PM
NAAQS is for particles smaller than 2.5 microns in diameter, hence it is called the PM2.5 standard. This
standard focuses on smaller (or finer) particles than
previous PM standards. Implementation strategies
for these NAAQS are still being developed by EPA.
The air quality program also includes provisions to
address regional problems in air pollution due to
criteria pollutants, such as pollutants carried from
one state to another. Certain states may be required
to impose controls on sources in their state to
-------
reduce pollution in a downwind state. The "NOx SIP Call" is a rule that requires most states in the eastern half of the country to develop plans to reduce NOx emissions that travel downwind and cross state borders, contributing to smog formation in the eastern United States. This rule assigns a summertime NOx emissions limit, or budget, for each affected state and requires states to submit SIPs showing how they will allocate the budget by 2004. Another rule, published in January 2000 under the authority of section 126 of the CAA, establishes federal NOx emission limits for certain industrial sources in particular eastern states to reduce transport into other states.

The control requirements for a stationary source not only depend on the attainment status of the area, but also on whether it is a new source of emissions. The programs that specifically affect new sources of criteria pollutants are the new source review (NSR) program, which is implemented by the states, and the nationally set new source performance standards (NSPSs).

The air toxics program sets national standards to reduce emissions of hazardous air pollutants from industrial and commercial sources. The pollutants covered by the air toxics program, called hazardous air pollutants, were listed originally in the 1990 amendments to the CAA. Pollutants can be added to or removed from the list by the EPA following the criteria provided in the Act. The list includes many metals and metal compounds, POM, and dioxins/furans. You can find the full list of hazardous air pollutants at http://www.epa.gov/ttn/atw/. Many air professionals refer to the air toxics program as the "Title III" program because it falls under Title III of the 1990 Amendments to the CAA. Yet, in the CAA itself, the air toxics program is under Title I, in section 112.

The air toxics program has two phases. In the first phase, EPA develops national regulations for categories of stationary industrial and commercial sources based on the best emission levels already being achieved by similar sources in the country. These technology-based regulations are called MACT standards. The great majority of these MACT standards are already completed or are close to being completed.

In the second phase, EPA applies a risk-based approach to assess how these technology-based regulations are reducing health and environmental risks. If existing technology-based standards are not sufficient to meet these risk-based goals, EPA is required to promulgate additional regulations. This second phase is called the residual risk assessment. EPA has begun residual risk assessments for several source categories regulated in the early years of the MACT phase of the program.

Some of the categories of sources being regulated under the air toxics program include chlorine manufacturing (i.e., chlor-alkali facilities) and hazardous waste combustion facilities. Electric utilities are also being regulated. The CAA required EPA to study the emissions of air toxics from utilities and determine whether utilities should be regulated under the toxics program. The EPA's positive determination to regulate electric utilities for air toxics was made in December 2000. All of these example source categories emit mercury.

The air toxics program also includes the Great Waters program (described above) and the urban air toxics strategy, which has several components and considers both stationary and mobile sources of pollution. One of the components of this strategy is to list categories of smaller (area) stationary sources for regulation.
Major and Area Sources

Both terms refer to stationary sources. Major sources generally are large, such as manufacturing facilities and electric utilities. Area sources generally are smaller sources, such as dry cleaning facilities, often numerous and spread out over a geographic area. The various air programs have different specific definitions for these terms. A primary differentiation between program definitions is the amount of pollutant emitted (or potentially emitted) from a source required to define it as "major." A given facility could be a major source for pollutant A, but an area (or non-major) source for pollutant B.
68
-------
Solid waste combustion units (e.g., municipal
waste combustion units, hospital and medical waste
incinerators) are regulated under section 129 of the
CAA, which covers several toxic pollutants
(including mercury and dioxins) and criteria
pollutants (including NOx). These regulations
currently being implemented are technology-based,
similar to the NSPS and the MACT standards
described above. Residual risk assessment is also
required for these sources.
Emissions from mobile sources and the
transportation sector are addressed by require-
ments for motor vehicles, fuels, and non-road
engines (e.g., aircraft, boats, trains, farm and garden
equipment). State motor vehicle emissions
inspections and maintenance programs are also an
important component of mobile source emission
reductions. In addition, there are programs to
encourage travel choices that minimize emissions.
These programs reduce NOx and air toxics, as well
as other pollutants. They are sometimes called Title
II programs, since they are under Title II of the
CAA.
Several recent rules that address emissions from
mobile sources include the Tier 2 rule that requires
tailpipe emission standards for all passenger vehicles,
including sport utility vehicles, minivans, and
pickup trucks. In addition, the recent heavy-duty diesel rule includes new standards for heavy-duty trucks and buses, and requirements to reduce the sulfur content in diesel fuel. A third new rule, called the 202(l) rule, identifies the compounds that should be considered mobile source air toxics (see Appendix 3), and evaluates the effectiveness of mobile source rules in reducing air toxics emissions. It also sets new gasoline toxic emission performance standards to ensure that refiners maintain their average 1998-2000 gasoline toxic emission performance levels. In addition, the rule establishes a Technical Analysis Plan that EPA will implement in continuing to conduct research and analysis on mobile source air toxics.

Additional Air Program Information

EPA Web Sites
http://www.epa.gov/oar/
http://www.epa.gov/oar/oaqps/gr8water/
http://www.epa.gov/airmarkets/
http://www.epa.gov/owow/oceans/airdep/index.html

The Plain English Guide to the Clean Air Act (EPA-400-K-93-001, April 1993)

Air-Water Interface Work Plan

Federal Register - each spring and fall an agenda of EPA regulatory and deregulatory actions is published.
A given source can be affected by multiple
programs, either because the source emits multiple
pollutants or because the pollutant it emits is both a
criteria pollutant and an air toxic. For example,
POM and metals are hazardous air pollutants, and
therefore are included in the air toxics program. These pollutants also are typically emitted in the form of particulate matter, a criteria pollutant. So the source could be affected by limits to help achieve the NAAQS, by the NSR program, and by an NSPS and MACT standard. Electric utilities are affected by the Acid Deposition program in addition to one or more of these other programs.

Air Pollution Training

The Air Pollution Training Institute (APTI) offers classroom, self-instruction, and other training formats. Information about the APTI can be found at http://www.epa.gov/oar/oaqps/eesg/apti.html.
You may want to know the bottom line—how
much of a given pollutant is a source allowed to
emit—rather than all the underlying legal
authorities. A source may have this information
consolidated into a federal permit or a state, local,
or tribal permit. Several of the stationary sources,
particularly the large sources, are required to obtain
federal air pollution permits to ensure compliance.
These permits (also referred to as Title V permits,
after the title of the CAA that authorizes the
program) consolidate all the air pollution control
requirements into a single, comprehensive
document that covers all aspects of a source's year-
to-year air pollution activities. Another federal
69
-------
program requires businesses that build new sources
of criteria pollutants or make significant changes to
existing sources to have "preconstruction" or "new
source review" permits. These permits are required
to ensure that large new sources do not cause
significant health or environmental threats and that
these sources are well controlled. Contact either
industry sources, or your local, state, and/or tribal
air programs to obtain permit information for
sources in your area.
If you are trying to figure out what reductions a
regulation is expected to achieve, there are a few
points to keep in mind. One is that there is likely
to be a delay between the time a rule or require-
ment is published and when it is fully implemented
and the reductions are in effect. For example, a rule
affecting the tailpipe emissions from cars or trucks
may apply to new vehicles starting in 2004. It
would take several years before the large majority of
vehicles on the road would meet these tailpipe
standards. An industrial facility may be given up to
three years to put controls in place after promulga-
tion of a MACT standard. A second point to keep
in mind is that a given rule may not achieve the
same amount of reduction or percent reduction
across the board for all facilities. Sometimes
requirements are different for different processes in
an industry. Or, sources may emit varying amounts
of pollutants before the regulation is in place, and
the regulation may bring them all to the same level.
For facility-specific information, you may need to
work more closely with the state or local agency to
determine the amount of reduction expected for the
facility under a given rule, rather than using the
more generalized national estimated percent
reduction expected from the rule.
Emissions—and subsequent deposition—can be
reduced in three ways: changing the resources
(inputs) going into the process, changing the
process, or using control systems to reduce or treat
the outputs. Depending on the emission source, all
of these methods may be applicable, or perhaps
only one. Factors that could be considered in the
choice of methods include the type of source, the
pollutants, technical feasibility, emission-reduction potential, cost, cost-effectiveness, economic impacts, and other potential repercussions. Examples of repercussions include an increase of other problematic pollutants, an increase of pollutants to other media, or safety concerns. If the emission reduction method is chosen as the result of a law, such as the CAA or a regulation under its authority, the law may be specific about what factors can be considered or the required performance of the method.

Pollution Prevention

Changing inputs to a process
• Using lead-free gasoline
• Separating mercury-containing batteries from municipal waste before combustion

Changing the process
• Producing chlorine using a process without mercury cells
• Using a foam or wetting agent that suppresses chromium emissions in electroplating

Changing outputs of a process
• Using an integrated gas scrubbing system on municipal waste combustion units
Changing the inputs going into the process and
changing the process are pollution-prevention types
of approaches. Preventing or not causing the
pollution in the first place is desirable and certainly
worth consideration. However, approaches need to
be considered in the framework of all the factors.
Sometimes they do not meet the necessary criteria,
or other factors make another approach more
desirable. For example, changing the process may
achieve a small reduction in emissions, but it may
not be sufficient to meet the pollution reduction
needs. Or, the substitution of another input to a
process creates another pollution problem, such as
the emission of a more toxic pollutant.
An example of changing inputs to a process is the
use of lead-free gasoline in automobiles. Lead
emissions in the United States fell dramatically with
this change. Another example is separating mercury-
containing batteries from a municipal waste stream
before the waste is combusted. That reduces
mercury emissions from the waste combustion
process.
An example of changing the process can be found in
the chlor-alkali industry, which produces chlorine
for use in water treatment and swimming pools,
among other things. One process, which is still used
in a few plants, uses mercury cells and creates
emissions of mercury. A newer process uses
membranes and no mercury, so there are no
mercury emissions to control. Another example is
for chromium electroplating tanks. A foam or a
wetting agent that suppresses emissions of
chromium during the electroplating process can be
added to the tanks. These methods to suppress
emissions do not affect the electroplating itself, but
prevent the need to treat emissions with an add-on
device. A third example would be housekeeping
measures such as wetting or cleaning surfaces to
prevent fugitive dust emissions.
For treating the output of a process, municipal
waste combustion units provide a good example.
These units typically use an integrated gas scrubbing
system to control dioxins, mercury, cadmium, lead,
particulate matter, hydrogen chloride, and sulfur dioxide emissions. The system consists of a spray dryer to condense the pollutants into a particulate form, followed by a fabric filter to remove the particulate from the gas stream. Activated carbon is
also injected into the spray dryer to enhance the
removal of mercury and dioxins.
What Forms Do Emission Reduction Rules Take?
Some regulations specify the emission reduction
methods that are to be used by a given type of
source. However, such specificity is rare. Typically, a
rule has emission limits that can be achieved by a
variety of methods. Examples of emission limits
include percent reduction of pollutant emissions,
concentration in the gas stream (e.g., parts per
million or grams per cubic meter of gas), amount
of emissions per time period (e.g., tons of emissions
per month), and amount of emissions per product
made (e.g., grams of emissions per kilogram of
product produced). Rules often include multiple
emission limits in order to not preclude emission
reduction methods.
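To make these limit formats concrete, the short sketch below (not part of this handbook) converts one hypothetical set of stack measurements into two of the forms described above: tons of emissions per month and grams of emissions per kilogram of product. All values, including the assumption of 720 operating hours per month, are illustrative.

```python
# Illustrative sketch: expressing a measured emission rate in two of the
# emission-limit formats discussed above. All numbers are hypothetical.

GRAMS_PER_TON = 907_184.74  # grams in one short ton


def monthly_emissions_tons(concentration_g_per_m3: float,
                           gas_flow_m3_per_hr: float,
                           hours_per_month: float = 720.0) -> float:
    """Convert a stack-gas concentration and flow rate to tons emitted per month."""
    grams = concentration_g_per_m3 * gas_flow_m3_per_hr * hours_per_month
    return grams / GRAMS_PER_TON


def emissions_per_product(concentration_g_per_m3: float,
                          gas_flow_m3_per_hr: float,
                          product_kg_per_hr: float) -> float:
    """Grams of pollutant emitted per kilogram of product made."""
    return (concentration_g_per_m3 * gas_flow_m3_per_hr) / product_kg_per_hr


if __name__ == "__main__":
    # Hypothetical facility: 0.02 g/m3 in the stack gas, 50,000 m3/hr of gas,
    # and 10,000 kg/hr of product.
    print(f"{monthly_emissions_tons(0.02, 50_000):.1f} tons per month")
    print(f"{emissions_per_product(0.02, 50_000, 10_000):.2f} g per kg of product")
```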
Economic or market incentives are ways to provide
for more flexibility, efficiency, and emission reduc-
tion beyond what might be achieved without the
incentive, while maintaining environmental protec-
tion, accountability, and enforceability. Incentives
include emission trading programs, financial
mechanisms, clean air investment funds, and public
information programs. Such incentive programs are
allowed to varying degrees under different sections
of the CAA.
Emission trading programs create transferable
emission reductions or emissions allowances. The
cost of achieving an emission reduction may be
relatively low for some sources, but high for others.
In these situations, both types of sources may
benefit by trading reductions or allowances.
Emissions trading can be designed for various
sources within a single facility, or across a set of
facilities within a particular industry, geographic
region, or nationwide. Some emission trading
programs include an overall cap or budget on the
total emissions from a particular set of sources. The Acid Rain program is an example of a cap-and-trade design, where excess allowances can be traded among electric utilities nationwide. On the other hand, open market trading programs do not have an overall emissions cap, but allow the flexibility to include sources of pollution that are not normally regulated.

Emissions Trading Example
A regulation requires facilities to reduce emissions to 50 tons per year and allows emissions trading among facilities. Facility A can reduce its emissions to 30 tons per year for a relatively low cost. For Facility B, the cost to reduce emissions to 50 tons per year is relatively high, but the cost to reduce emissions to 70 tons per year is relatively low. Facility B could meet the requirements of the regulation by reducing its emissions to 70 tons per year and buying Facility A's extra 20 tons per year of emission reduction (50 minus 30 tons per year). Facility A would have to commit to achieving the emission reduction it sells to Facility B.
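The arithmetic in the trading example above can be written out as a short calculation. The sketch below is illustrative only and is not part of any EPA trading program; the facilities, limit, and tonnages are the hypothetical ones from the example.

```python
# Illustrative sketch: the arithmetic behind the emissions trading example above.


def surplus_reduction(limit_tpy: float, actual_tpy: float) -> float:
    """Tons per year of reduction beyond the limit that a facility could sell."""
    return max(limit_tpy - actual_tpy, 0.0)


def complies_after_trading(limit_tpy: float, actual_tpy: float,
                           purchased_credits_tpy: float) -> bool:
    """A facility complies if its emissions minus purchased credits meet the limit."""
    return actual_tpy - purchased_credits_tpy <= limit_tpy


if __name__ == "__main__":
    LIMIT = 50.0                 # tons per year, per the regulation in the example
    facility_a_emissions = 30.0  # reduces well below the limit at low cost
    facility_b_emissions = 70.0  # reducing further would be relatively expensive

    credits = surplus_reduction(LIMIT, facility_a_emissions)  # 20 tons per year
    print(f"Facility A can sell {credits:.0f} tons/yr of surplus reduction")
    print("Facility B complies after buying them:",
          complies_after_trading(LIMIT, facility_b_emissions, credits))
```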
Another type of incentive is a financial mechanism
program, such as fees, taxes, or subsidies targeted at
pollution-reducing activities or projects. Examples
include fees on emissions, subsidies for purchasing
zero-emitting vehicles, or time saving mechanisms,
such as high-occupancy vehicle lanes to encourage
carpooling.
A third type of incentive is a clean air investment
fund. Such a fund would provide cost relief for
sources when the cost of emission reductions is
high. Sources that exceed an established cost-per-
ton for emissions control pay into the fund in lieu
of reducing emissions. The fund manager procures
emission reductions elsewhere. As such, it includes
elements of both emissions trading and financial
mechanisms.
A public information program can also be
considered a type of economic incentive program.
Such programs encourage the public to make
choices that reduce air emissions. In doing so, the
public becomes more aware of environmental
issues, such as the links between air emissions and
water quality, and the impact of their daily choices
on the environment.
The EPA published guidance in January 2001 for
states that want to develop economic incentive
programs to improve air quality and visibility. You
may find this guidance useful in determining what
questions need to be addressed in designing any
economic incentive program. It can be found at
http://www.epa.gov/ttn/ecas/innostra.html.
Managing Old Pollution

Persistent pollutants are unique because they often lack discrete sources; rather, they are emitted from
residues in soils or sediments (e.g., DDT and
PCBs) or by accident in fires or spills (e.g., PAHs
and dioxins/furans). Management efforts for these
compounds must include identifying historic sites
where they may have been used or created (often
old, abandoned industrial property) or applied for
agricultural purposes (banned or cancelled pesticides
or herbicides). A review of resources relating to
waste site cleanup and local agricultural practices
may yield viable reduction strategies that could
lower volatilization and/or erosion and resuspension
of these persistent compounds. Remediation of key
waste sites is a prime example of one of the
practices that may be considered for nonpoint or
historic-use materials where traditional control
measures are unavailable.
An Iterative Process
Once control options are in place, it will be
necessary to monitor them to ensure that the
controls are actually having a positive impact on
water quality or that your predictions about the
benefits of implementing regulations are in the
right ballpark. As population density and vehicle
miles traveled increase, air deposition might
become a more significant part of the problem than
originally estimated. Like any potential threat to an
ecosystem, the impact of atmospheric deposition
will need to be reevaluated from time to time to
ensure that water quality and ecosystem protections
are maintained.
XII. Resources

Federal Atmospheric Deposition Programs
Formed as part of the Air-Water Initiative of EPA's Office of Water in 1995, the group is responsible for
coordination of air-water issues within EPA, especially those pertaining to coastal ecosystems. More
information on air deposition and air-water coordination and links to other programs and resources are
available at http://www.epa.gov/owow/oceans/airdep/
The Great Waters Program was formed in response to the Great Waters section of the 1990 Clean Air Act.
The Great Waters include the Great Lakes, Chesapeake Bay, Lake Champlain, the National Estuary
Program, and the National Estuarine Research Reserves. The program coordinates atmospheric deposition
issues under CAA section 112 (hazardous air pollutants) within the Office of Air and Radiation and is one
of the main liaisons with the Office of Water on air-water issues. For more information on the Great
Waters Program and links to its publications, including the Great Waters Reports to Congress, visit the
Web site at http://www.epa.gov/oar/oaqps/gr8water/.
EPA Clean Air Markets Division
Formed initially in response to Title IV (the Acid Rain Program) of the 1990 Clean Air Act, the Clean Air
Markets Division administers the national SOx trading program, as well as much of the NOx emissions
reduction program and coordinates the atmospheric deposition assessments and CASTNet monitoring
program. The program is the lead office in the Office of Air and Radiation for most nitrogen deposition
issues under Title IV and one of the main liaisons with the Office of Water on air-water issues. For more
information on the Clean Air Markets Division, visit the Web site at: http://www.epa.gov/airmarkets/.
NOAA Air Resources Laboratory
The NOAA Air Resources Laboratory supports and conducts a significant amount of research, monitor-
ing, and modeling on atmospheric chemistry and pollutant transport and its relationship to atmospheric
deposition. For more information about specific projects and the ARL READY information on models,
including HYSPLIT, visit the Web site at http://www.arl.noaa.gov/research/themes/aq.html.
USGS does substantial research on atmospheric deposition and water quality, including being the largest
single supporter of the NADP and developing and using the SPARROW model, which can estimate the
transport of atmospherically deposited pollutants through watersheds. Although the agency works
primarily with nitrate and sulfate deposition, it does some work with toxics as well (http://btdqs.usgs.gov/
acidrain/index.html).
EPA Regional Offices
EPA Regional Offices work with states and local programs to characterize and address deposition. Those
that have been most active are noted below.
EPA Region 1
EPA Region 1 is working closely with the Casco Bay National Estuary Program on PAH deposition monitoring in the northeast (http://www.epa.gov/region1).
EPA Region 2
EPA Region 2 is working closely with the State of New Jersey on a toxics deposition monitoring program for New Jersey and the New York Harbor area (http://www.epa.gov/region2).
EPA Region 4
EPA Region 4 supports national estuary programs in Region 4 and their air deposition monitoring activities (http://www.epa.gov/region4).
EPA Region 5
EPA Region 5 works closely with the Great Lakes National Program Office (GLNPO) and other EPA offices on toxics deposition to the Great Lakes (http://www.epa.gov/region5/).
EPA Region 9
EPA Region 9 supports air deposition activities in Region 9 and encourages cooperation between air and water management agencies at regional and local levels (http://www.epa.gov/region9).
The Chesapeake Bay Program Office (CBPO) was one of the first organizations to investigate the role of
atmospheric deposition in any coastal area. The program's experiences have helped numerous other coastal
areas become involved in research/monitoring of atmospheric deposition of nitrogen. CBPO has also
conducted some investigations on the deposition of chemical contaminants (http://www.chesapeakebay.net).
GLNPO is the leading national program for toxics deposition monitoring and research. It also works
closely with its Canadian counterparts on binational monitoring and management strategies (http://www.epa.gov/glnpo/).
The Gulf of Mexico Program Office helped coordinate a workshop on atmospheric deposition and the
hypoxic zone with other EPA offices and the Ecological Society of America (http://pelican.gmpo.gov/).
Coastal management programs that have conducted or are conducting air deposition assessments:
(Programs with * were not conducting assessments in 2000)
Casco Bay NEP: http://www.cascobay.usm.maine.edu/
*Massachusetts Bay NEP: http://www.state.ma.us/massbays/
*Waquoit Bay NERR: http://inlet.geol.sc.edu/WQB/
*Narragansett Bay NEP: http://www.nbep.org/
*Long Island Sound NEP: http://www.epa.gov/region01/eco/lis/
Peconic Bay NEP: http://www.co.suffolk.ny.us/health/eq/pep.html
New York/New Jersey Harbor NEP: http://www.harborestuary.org/
Delaware Inland Bays NEP: http://www.udel.edu/CIB/
Maryland Coastal Bays NEP: http://www.dnr.state.md.us/coastalbays/
Albemarle-Pamlico NEP: http://h2o.enr.state.nc.us/nep/
Indian River Lagoon NEP: http://www.epa.gov/OWOW/oceans/lagoon/
Charlotte Harbor NEP: http://www.charlotteharbornep.com/
Sarasota Bay NEP: http://www.sarasotabay.org/
Tampa Bay NEP: http://www.tbep.org/
Mobile Bay NEP: http://www.mobilebaynep.com/
Galveston Bay NEP: http://gbep.tamug.tamu.edu/
Coastal Bend (Corpus Christi) NEP: http://tarpon.tamucc.edu/
Santa Monica Bay NEP: http://www.smbay.org/
San Francisco Bay NEP: http://www.abag.ca.gov/bayarea/sfep/sfep.html
*Puget Sound NEP: http://www.wa.gov/puget_sound/
Wesely, M. L. and B. B. Hicks. 2000. A Review of the Current Status of Knowledge on Dry Deposition. Atmospheric Environment 34:2261-2282.
Lovett, G. M. 1994. Atmospheric Deposition of Nutrients and Pollutants in North America: an
Ecological Perspective. Ecological Applications 4(4):629-650.
Valigura, Richard A., Alexander, Richard B., Castro, Mark S., Meyers, Tilden P., Paerl, Hans W., Stacey, Paul E., Turner, R. Eugene, editors (2000). Nitrogen Loading in Coastal Water Bodies: An Atmospheric Perspective, American Geophysical Union, Washington, DC.
More information on the chemical mass balance source identification technique can be found in Gordon,
G.E. 1991. Airborne Particles on Global and Regional Scales, Environmental Science and Technology
25(11):1822-1828.
U.S. Geologic Survey SPARROW model information can be found at http://water.usgs.gov/nawqa/
sparrow.
Compendium of Tools for Watershed Assessment and TMDL Development, EPA document number:
EPA841-B-97-006, May 1997.
Watershed Analysis Risk Management Framework. Atmospheric deposition data can be an input to the
model, and the model can estimate watershed transport. Works best for smaller watersheds. For more
information, see http://systechengineering.com/warmf.htm.
The Ecological Society of America has several publications that can be used as education tools. The Issues
in Ecology series is particularly useful for students; several workshop reports also include good information
and useful references on atmospheric deposition on the Atlantic, Gulf, and Pacific coasts (http://esa.sdsc.edu).
Other Sources of Information
A treasure trove of useful air source, emissions, and air quality information and an up-to-date listing of air data contacts in state and local agencies is available from the EPA AIRS database at http://www.epa.gov/airs/.
Information on EPA's TMDL Air Deposition Pilot Project can be found at http://www.epa.gov/owow/
tmdl/madpp.html.
Great Lakes Commission at http://www.glc.org/air/air3.html.
Great Lakes Air Toxics Inventory (1996 report files in Adobe .pdf format) http://www.glc.org/air/
1996/1996.html.
EPA's 3rd Great Waters Report to Congress, which is available from EPA on the Great Waters web page
or in hard copy (Deposition of Air Pollutants to the Great Waters, Third Report to Congress, EPA-453/
R-00-005, June 2000) contains a summary of air deposition monitoring activities across the country. The Second Report to Congress (1997) is also available on the web at http://www.epa.gov/oar/oaqps/gr8water/.
Mercury Study Report to Congress is on the EPA Web page at http://www.epa.gov/oar/mercury.html.
This report was sent to Congress in 1997 and is a complete analysis of all the information on the sources
and effects of mercury in the environment, including atmospheric sources. It includes estimates of the
proportion of mercury contamination caused by atmospheric deposition and estimates of sources or
source categories of atmospherically deposited mercury.
NAPAP Report to Congress is sent to Congress every two years and discusses emissions, deposition, and
ecological effects of nitrogen and sulfur and any ecological changes resulting from implementation of
federal regulations (http://www.iiiiic.noaa.gov/CENR/NAPAP/).
EPA Persistent Bioaccumulative Toxics (PBT) program develops action plans to minimize impacts of
pollutants that are toxic, bioaccumulative, and persistent in the environment. Pollutants were prioritized
by EPA, and several draft plans have already been developed. For more information on the plans, visit the
PBT Web page at http://www.epa.gov/opptintr/pbt/.
Information on toxic pollutants can be found on EPA's Air Toxics Web page at http://www.epa.gov/ttn/atw/. Information on the criteria pollutants (carbon monoxide, nitrogen dioxide, sulfur dioxide, lead, particulate matter, and ozone) can be found at http://www.epa.gov/oar/oaqps/emissiis.html. Information on EPA's mobile source regulations is at http://www.epa.gov/otaq/. Information on utilities can be found at several different places, including http://www.epa.gov/airmarkets/ and http://www.epa.gov/ttn/atw/combust/utiltox/utoxpg.html.
National Atmospheric Deposition Program data can be accessed at http://nadp.sws.uiuc.edu.
AIRMoN data can be accessed at http://www.arl.noaa.gov/research/programs/airmon.html or the NADP Web site at http://www.sws.uiuc.edu.
CASTNet data can be accessed at http://www.epa.gov/castnet.
The NOAA HYSPLIT back trajectory model can be accessed on the NOAA Web page at http://
www.arl.noaa.gov/ss/models/hysplit.html.
References
Butler, T. J., and Likens, G. E. (1998). "Weekly and Daily Precipitation Chemistry Network Comparisons in the Eastern U.S.: NADP/NTN vs. MAP3S/AIRMoN," Atmospheric Environment 32(21):3749-3765.
ESA Sustainable Biosphere Initiative Project Office, Workshop Report (1997). Atmospheric Nitrogen Deposition to Coastal Watersheds, http://esa.sdsc.edu/sbindepl.htm.
Finlayson-Pitts, B. J. and Pitts, J. N. (2000). Chemistry of the Upper and Lower Atmosphere, Academic
Press, Chapter 9, Section C, pp. 380-393.
Hoff, R. M., Strachan, M. J., Sweet, C. W., Chan, C. H., Shackleton, M., Bidleman, T. F., Brice, K. A., Burniston, D. A., Cussion, S., Gatz, D. F., Harlin, K., and Schroeder, W. H. (1996). "Atmospheric
deposition of toxic chemicals to the Great Lakes: a review of data through 1994," Atmospheric Environ-
ment, Vol. 30, No. 20, pp. 3505-3527.
Lamb, D. and Comrie, L. (1993). "Comparability and Precision of MAP3S and NADP/NTN
Precipitation Chemistry Data at an Acidic Site in Eastern North America," Atmospheric Environment
27:1993-2008.
National Atmospheric Deposition Program (NRSP-3)/National Trends Network (2000). NADP Office,
Illinois State Water Survey, 2204 Griffith Drive, Champaign, IL 61820.
Sisterson, D. L., Wurfel, B. E., and Lesht, M. M. (1985). "Chemical Differences Between Event and
Weekly Precipitation Samples in Eastern Illinois," Atmospheric Environment 19:1453-1469.
USEPA (1997) Mercury Study Report to Congress, Volume 1, Executive Summary, EPA-452/R-97-003.
USEPA-OAQPS (2000) Deposition of Air Pollutants to the Great Waters, Third Report to Congress, EPA-
453/R-00-005.
USEPA-ORD-NCEA, Draft Dioxin Reassessment, Vol. 3, Chapter 2, pp. 16-26, http://www.epa.gov/
ncea/pdfs/dioxin/part1and2.htm.
Valigura, R. A., Luke, W. T., Artz, R. S., and Hicks, B. B. (1996) Atmospheric Nutrient Input to Coastal
Areas: Reducing the Uncertainties, NOAA Coastal Ocean Program Office, Decision Analysis Series No. 9.
Vet, R. J., Sirois, A., Lamb, D., and Artz, R. (1989). Intercomparisons of Precipitation Chemistry Data
Obtained Using CAPMoN and NADP/NTN Protocols. NOAA Technical memorandum ERL/ARL-174,
39 pp.
XIII. Sources of Funding
Sources of funding for atmospheric deposition projects change regularly. The sources listed here are programs that have funded atmospheric deposition monitoring in the past and have not indicated that they will not be doing it in the future, or programs whose goals are likely to be consistent with atmospheric deposition monitoring programs. Of course, you should contact any programs directly for the latest information about what type of projects they are funding, how large the grants are, when requests for proposals are sent out, when submissions are due, and all the other important details.
The Office of Water has funded air deposition monitoring projects in several coastal communities for
several years, http://www.epa.gov/owow/oceans/airdep
The Great Waters Program has funded air deposition monitoring and research projects in the past, http://
www.epa.gov/oar/oaqps/gr8water/
Catalog of Federal Funding Sources for Watershed Protection
This catalog includes links to pages of non-federal sources as well, http://www.epa.gov/owow/watershed/
wacademy/fund.html
Know Your Watershed
This Purdue University site includes contact information for state watershed protection efforts, http://
www.ctic.purdue.edu/KYW/wspartners/statewscontacts.html
These federal-state partnerships are funded by section 104 of the 1984 WRRA to promote research on water
quality issues. Funding information is available on each state's Web page (linked to the site), but some state
Web pages are more complete than others, http://water.usgs.gov/wrri
Cooperative effort between the Northeast-Midwest Institute and the Marine Sciences Consortium. Many of
the links on this page are redundant, but there are a few new ones under the federal funding sources heading.
http://www.nemw.org/water.htm#nps
This resource is located at the University of Maryland and has links to funding sources as well as general
information on fundraising. http://www.mdsg.umd.edu/EFC/
Chronicle of Philanthropy
Although you must pay to access many parts of this site, it has great information on private funding sources. http://www.philanthropy.com
A resource for those in the Great Lakes area on all sorts of things, including funding sources, http://www.great-lakes.net/
The Chesapeake Bay Program has a grants program for atmospheric deposition research on issues affecting the Bay. http://www.chesapeakebay.net
Appendix 1: Sources of Pollutants of Concern in the Great Waters and Coastal Areas
Mercury and Compounds: Naturally occurring element often used in thermometers, electrical
equipment (such as batteries and switching equipment), industrial control instruments, and industrial
processes (e.g., Chlor-alkali plants). Released during combustion of fossil fuels (e.g., coal, oil); incinera-
tion of municipal, medical, and hazardous waste; and from numerous manufacturing and natural
processes. Banned as a paint additive in U.S. in both interior (1990) and exterior (1991) paint. Being
phased out of batteries. Removed from catalysts, turf products, and explosives.
Cadmium and Compounds: Naturally occurring element used in metals production processes, batteries,
and solder. Often released during combustion of fossil fuels and waste oil, and during mining and
smelting operations.
Lead and Compounds: Naturally occurring element historically used in gasoline and paint additives, and
still used in storage batteries, solder, and ammunition. Released from many combustion and manufactur-
ing processes and from motor vehicles. Use in paint additives restricted in U.S. in 1971. U.S. restrictions
on use in gasoline additives began in 1973 and have continued through the present, with a major use
reduction in the mid-1980s.
POM (includes PAHs): Naturally occurring substances that are by-products of the incomplete combustion of fossil fuels and plant and animal biomass (e.g., forest fires). Also, by-products from steel
and coke production and waste incineration.
Dioxins/Furans: By-products of combustion of organic material containing chlorine, chlorine bleaching
in pulp and paper manufacturing, and diesel-fueled vehicles. Also a contaminant in some pesticides.
Nitrogen Compounds: By-products of power generation, industrial, and motor vehicle fossil fuel
combustion processes (NOx). Also, compounds used in fertilizers and released from agricultural animal manures (NH3).
PCBs: Industrial chemicals used widely in the U.S. from 1929 until 1978 for many purposes, such as
coolants and lubricants and in electrical equipment (e.g., transformers and capacitors). In the U.S.,
manufacture stopped in 1977 and uses were significantly restricted in 1979. Still used for some purposes
because of stability and heat resistance, and still present in certain electrical equipment used throughout the
U.S.
Chlordane: Insecticide used widely in the 1970s and 1980s. All U.S. uses except termite control canceled
in 1978; use for termite control voluntarily suspended in 1988. Use of existing stocks permitted.
a See the Third Report to Congress, 2000, Deposition of Air Pollutants to the Great Waters (U.S. EPA 2000).
b POM is a large class of chemicals consisting of organic compounds having multiple benzene rings and a boiling point greater than 100° C. Polycyclic aromatic hydrocarbons (PAHs) are a chemical class that is a subset of POM.
DDT/DDE: Insecticide used widely from introduction in 1946 until significantly restricted in U.S.
in 1972. Still used in other countries. Used in U.S. for agriculture and public health purposes only
with special permits.
Dieldrin: Insecticide used widely after introduction in late 1940s. Used in U.S. for termite control
from 1972 until registration voluntarily suspended in 1987.
Hexachlorobenzene: Fungicide used as seed protectant until 1985. By-product of chlorinated
compound and pesticide manufacturing. Also a by-product of combustion of chlorine-containing
materials. Present as a contaminant in some pesticides.
Hexachlorocyclohexane: Component of technical-HCH, an insecticide for which use is restricted in the U.S., but which is used widely in other countries.
Lindane: An insecticide used on food crops and forests, and to control lice and scabies in livestock and
humans. Currently used primarily in China, India, and Mexico. U.S. production stopped in 1977. Use
was restricted in 1983; many uses are still registered, but are expected to be voluntarily discontinued in
the future.
Toxaphene: Insecticide used widely on cotton in the southern U.S. until the late 1970s. Most U.S.
uses banned in 1982; remaining uses canceled in 1987.
Appendix 2: Dry Deposition Velocities
Table of Dry Deposition Velocities From Literature
Compound                        Deposition Velocity (cm/sec)    Type of Surface
NO3 (aerosol)                   0.1                             exterior surfaces
HNO3 (nitric acid, aerosol)     0.1 - 0.5                       exterior surfaces, leaf interiors
NH3 (ammonia)                   0.5 - 5                         exterior surfaces, leaf interiors
NH4+ (aerosol)                  0.72                            pine forest
large particles (> 2 µm)        0.5 - 2                         exterior surfaces
small particles (< 2 µm)        < 0.5                           exterior surfaces
lead                            0.28 - 0.96                     various exterior surfaces
cadmium                         0.45 - 1.5                      various exterior surfaces
copper                          0.43 - 1.5                      various exterior surfaces
iron                            0.85 - 2.7                      various exterior surfaces
manganese                       0.62 - 2.1                      various exterior surfaces
zinc                            0.44 - 1.5                      various exterior surfaces
Data from: Hill, A. C. and E. M. Chamberlain. 1974. The Removal of Water Soluble Gases from the
Atmosphere by Vegetation, in Atmosphere-Surface Exchange of Particulate and Gaseous Pollutants, B.
Hicks, ed. Energy and Research Development Administration, NTIS CONF-740921,pp 153-169;
Judekis, H. S. and A. G. Wren. 1978. Laboratory Measurements of NO and NO2 Depositions onto Soil and
Cement Surfaces, Atmospheric Environment 12:2315-2087; Lovett, G. M. 1994. Atmospheric Deposition
of Nutrients and Pollutants in North America: An Ecological Perspective, Ecological Applications 4:629-650.
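The velocities in this table are typically combined with a measured air concentration to estimate a dry deposition flux. The sketch below is illustrative only and is not part of the handbook; the pollutant, concentration, and velocity in the example are hypothetical.

```python
# Illustrative sketch: annual dry deposition flux estimated as
# deposition velocity x ambient air concentration.

SECONDS_PER_YEAR = 365.25 * 24 * 3600


def annual_dry_deposition(velocity_cm_per_s: float,
                          concentration_ug_per_m3: float) -> float:
    """Return dry deposition in micrograms per square meter per year.

    velocity_cm_per_s: deposition velocity (cm/sec), e.g., from the table above
    concentration_ug_per_m3: measured ambient air concentration (ug/m3)
    """
    velocity_m_per_s = velocity_cm_per_s / 100.0  # convert cm/sec to m/sec
    flux_ug_per_m2_per_s = velocity_m_per_s * concentration_ug_per_m3
    return flux_ug_per_m2_per_s * SECONDS_PER_YEAR


if __name__ == "__main__":
    # Hypothetical example: a metal at an assumed 0.01 ug/m3 with a
    # mid-range deposition velocity of 0.6 cm/sec.
    print(f"{annual_dry_deposition(0.6, 0.01):.0f} ug per square meter per year")
```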
Appendix 3: List of Mobile Source Air Toxics
Acetaldehyde
Acrolein
Arsenic and compounds1
Benzene
1,3-Butadiene
Chromium and compounds
Dioxins/Furans2
Diesel Particulate Matter and Diesel Exhaust Organic Gases (DPM and DEOG)
Ethylbenzene
Formaldehyde
n-Hexane
Lead compounds1
Manganese compounds1
Mercury and compounds1
Naphthalene
Nickel compounds1
Polycyclic Organic Matter (POM)3
Styrene
Toluene
Xylene
1 Although the different metal compounds generally differ in their toxicity, the on-road mobile source inventory contains emissions estimates for total metal compounds (i.e., the sum of all forms).
2 This entry refers to two large groups of chlorinated compounds. In assessing their cancer risks, their quantitative potencies are usually derived from that of the most toxic, 2,3,7,8-tetrachlorodibenzo-p-dioxin.
3 Polycyclic organic matter includes organic compounds with more than one benzene ring and which have a boiling point greater than or equal to 100 degrees centigrade. A group of seven polynuclear aromatic hydrocarbons, which have been identified by EPA as probable human carcinogens (benz(a)anthracene, benzo(b)fluoranthene, benzo(k)fluoranthene, benzo(a)pyrene, chrysene, 7,12-dimethylbenz(a)anthracene, and indeno(1,2,3-cd)pyrene), are used here as surrogates for the larger group of POM compounds.
Data from 40 Code of Federal Regulations Parts 80 and 86, Control of Emissions of Hazardous Pollutants
from Mobile Sources; Final Rule (FR: March 29, 2001, pp. 17229-17273).
Appendix 4: NADP-NTN Siting Criteria for Wet Deposition Sites
1.0 General Considerations
Monitoring sites for the networks are selected to represent major physiographic, agricultural, aquatic,
and forested areas within each cooperating state, region, or ecoregion. Wherever possible, collection
sites include locations where watershed, marine, freshwater, or other hydrological research is already
under way, or where research is being conducted on nutrient cycling, air pollution, or atmospheric
chemistry. Additional consideration is given on the basis of available knowledge of emission sources,
prevalent forms of deposition, frequency of precipitation events, and other meteorological and
atmospheric processes that influence the deposition of substances in each area. This background
information permits meaningful interpretations of spatial, seasonal, and temporal variations in the
chemistry of wet and dry deposition both regionally and nationally.
2.0 Collocation With Other Programs
The collocation of monitoring equipment with other programs is encouraged. Some precautions,
however, need to be observed when collocating sampling or monitoring equipment.
Sampling sites can be overused to the point where one program becomes compromised by the
addition of extra equipment. Besides violating the siting criteria outlined in Section 3.0, increased
visitation to a site increases the chance of contamination to the sampling receptacles. Disturbances in
air movement about the site by other than natural phenomena can reach a point where what is
sampled is no longer representative of the region, but only represents the local congested environ-
ment.
3.0 Collector and Rain Gage Siting Criteria
3.1 Regional Requirements. The rain gage and collector should be located in an area that typifies a
region and minimizes the impact of local point or area sources. However, if a region is characterized
by a certain type of agricultural land use or industrialization, the collector should be located to
provide representation of such extensive deposition sources.
Specific sources of concern include industrial operations and suburban/urban area related sources.
Industrial operations such as power plants, chemical plants, and manufacturing facilities should be at
least 10 km away from the collector. If the emission sources are located in the general upwind
direction (i.e., the mean annual west-east flow in most cases) from the collector, then this distance
should be increased to 20 km. The same criterion also applies to suburban/urban areas whose
population approximates 10,000 people. For larger population centers (i.e., greater than 75,000) the
collector should be no closer than 20 km. This distance is doubled, to 40 km, if the population is
upwind from the collector. Beyond 50 km both industrial and urban sources are generally assumed to
blend in with the typical characteristics of the region.
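The separation distances above lend themselves to a simple screening check. The sketch below is illustrative only and is not part of the NADP criteria; it encodes one reading of Section 3.1, in which being upwind of the collector doubles the required distance.

```python
# Illustrative sketch: minimum collector-to-source distances from Section 3.1.


def min_distance_km(source_type: str, upwind: bool, population: int = 0) -> float:
    """Return the minimum separation distance, in kilometers, for a nearby source."""
    if source_type == "industrial":
        return 20.0 if upwind else 10.0
    if source_type == "urban":
        if population > 75_000:
            return 40.0 if upwind else 20.0
        return 20.0 if upwind else 10.0  # towns of roughly 10,000 people
    raise ValueError("unknown source type")


if __name__ == "__main__":
    print(min_distance_km("industrial", upwind=True))                  # 20.0 km
    print(min_distance_km("urban", upwind=False, population=100_000))  # 20.0 km
```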
3.2 Local Requirements. Transportation-related sources, agricultural operations, and surface storage
of certain types of products are typically the most troublesome sources to identify and quantify once
regional requirements for industrial sources have been met (Section 3.1). No moving sources of
pollution, such as air, ground, or water traffic or the medium on which they traverse (e.g., runway,
taxiway, road, tracks, or navigable river) should be within 100 meters of the collector. The local road
net around the site is of particular concern. Traffic volume and type will largely determine the impact
of these types of sources on the site. Feedlots, dairy barns, etc., in which large concentrations of
animals are housed should be no closer than 500 meters from the collector. Grazing animals and
pasture should be no closer than 20 meters from the collector. Parking lots and maintenance yards
also need to be kept at least 100 meters from the collector. Local sources, whether point, line or area
sources, will greatly influence the suitability of a site to serve as a long-term regionally representative
station. Land development in future years may further compromise the site's usefulness as a station.
For these reasons, consideration should be given to alternate sites in the event that the original site is
no longer representative of the region.
3.3 On-Site Requirements. The site should be accessible in both summer and winter and be at low risk of vandalism. Further, the collector and rain gage should be sited to conform as nearly as
possible with the following:
1. The collector should be installed over undisturbed land on its standard 1-meter-high aluminum
base. Naturally vegetated, level areas are preferred, but grassed areas and slopes up to ± 15% will be
tolerated. Sudden changes in slope within 30 meters of the collector should also be avoided.
Ground cover should surround the collector for a distance of approximately 30 meters. In farm
areas a vegetated buffer strip must surround the collector for at least 30 meters.
2. Annual vegetation within the site should be maintained at less than two feet in height.
3. No object or structure shall project onto the collector or rain gage with an angle greater than 45° from the horizontal (30° is considered optimal, but 45° is the highest angle acceptable). Therefore, the distance from the sampler to the object must be at least equal to the height of the object (preferably twice the height of the object). Residential dwellings must be kept twice their height from the collector (30°). Pay particular attention to anemometer towers and overhead wires (Figure 1; a sketch illustrating this angle check follows this list).
[Figure 1]
4. Residential structures within 30 meters of the collector should not be within the 30° cone of the
mean wind direction (Figure 2).
[Figure 2]
5. The base of the collector should not be enclosed. Further, any object over 1 meter high with
sufficient mass to deflect wind should not be located within 5 meters of the collector. Alter wind
shields and open fences are excluded from this requirement.
6. The rain gage should be within 30 meters of the collector, but no closer than 5 meters. Its orifice
should be located within one foot of the same plane as the orifice of the collector. In snow
accumulation areas this may require a separate platform for the rain gage.
7. In areas where more than 20% of annual precipitation is snow, rain gages must be equipped with
an Alter wind shield. This shield should be installed such that the pivot axis of the shield is at the
same level as the top of the rain gage.
8. In areas having an accumulation of over 0.5 meter of snow per year, the collector and rain gage
may be raised off the ground on a platform. The platform should be no higher than the maximum
anticipated snow pack. In general, platforms are discouraged. Note: The 5-meter separation
between the rain gage and collector must be maintained (item 6).
9. Collectors located in areas which normally receive snow should have a properly counterweighted
snow roof installed on the moving lid of the collector only if problems with the opening and
closing are encountered. If installed, the roof will be left on year round.
10. Changes or modifications to established or approved sites or to its equipment must be submitted
to the Program Coordinator's Office prior to implementation. This includes moving the site,
siting other equipment in close proximity to the existing collectors (30 meters), installation of
snow roofs, etc. In the event additional equipment is added to the site or a change in location
becomes necessary, the following information is needed:
a. A brief letter to the Program Coordinator's Office requesting the change and documenting its
need.
b. Sites moving within the 30 meters surrounding the original location of the collector will be
required to file a new site sketch with pictures and negatives, along with a letter stating when
and why the site was moved.
c. Sites moving greater than 30 meters but less than 10 km will be required to file a new Site
Description Questionnaire, site sketch map, and pictures with negatives. A new topographical
map will be required only if the site moves off the old quad.
d. Sites moving further than 10 km or into a different type of topography, ecoregion, or land
use must reapply for admission to the network as a new site. Such a move requires submis-
sion of a complete set of siting documents to the coordinator's office for approval. A new site
name, AL code, and station number will be assigned to the new site.
11. All collector location changes (orientation, moves on or off platforms, elevation, short moves,
long moves, etc.) will be documented so that data users have the ability to determine if a change
in data correlates with some physical change at the site.
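As noted in item 3, the projection-angle rule can be checked with simple geometry. The sketch below is illustrative only and is not part of the siting criteria; it assumes the angle is measured from the collector orifice on its 1-meter base, which the criteria do not state explicitly.

```python
# Illustrative sketch: checking the projection-angle rule in item 3 above.
import math

COLLECTOR_HEIGHT_M = 1.0  # assumed height of the collector orifice


def projection_angle_deg(object_height_m: float, distance_m: float) -> float:
    """Angle from the collector orifice to the top of a nearby object, in degrees."""
    rise = object_height_m - COLLECTOR_HEIGHT_M
    return math.degrees(math.atan2(rise, distance_m))


def check_object(object_height_m: float, distance_m: float) -> str:
    angle = projection_angle_deg(object_height_m, distance_m)
    if angle > 45.0:
        return f"fails the criterion ({angle:.0f} degrees > 45)"
    if angle > 30.0:
        return f"acceptable but not optimal ({angle:.0f} degrees)"
    return f"meets the preferred 30-degree limit ({angle:.0f} degrees)"


if __name__ == "__main__":
    # Hypothetical example: a 6-meter tree located 8 meters from the collector.
    print(check_object(6.0, 8.0))
```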
A
air deposition
direct. See direct deposition
dry. See dry deposition
indirect. See indirect deposition
revolatilization. See revolatilization
wet. See wet deposition
AIRMoN. See NADP-AIRMoN
airshed 9, 10
ambient air 30
monitoring 62
ammonium 13, 14, 18, 19, 27, 28, 30, 61
anthropogenic source 5
Atmospheric Exchange Over Lakes and Oceans 21
atrazine 3, 50, 52
B
back trajectory analysis 57, 59, 77
C
CAA 2, 18, 28, 30, 57, 66, 67, 68, 69, 70, 73
cadmium 3, 5, 41, 42, 43, 50, 51, 59
California Puff Model. See CALPUFF
CALPUFF 47, 51
CASTNet 18, 19, 30, 73, 77
chemical mass balance 35, 59, 61, 62, 75
Chesapeake Bay Atmospheric Deposition Study 21
Clean Air Act. See CAA
Clean Air Status and Trends Network. See CASTNet
Community Modeling for Air Quality. See Models3/CMAQ
copper 3, 5, 59
costs
modeling. See modeling
criteria pollutant 67, 68, 69, 70, 76
D
DDE 3, 37
DDT 3, 9, 37, 72
denuder 33, 35, 36, 37, 42
deposition
direct. See direct deposition
dry. See dry deposition
indirect. See indirect deposition
velocity 13, 18, 25, 30, 34, 35, 36
wet. See wet deposition
dichotomous sampler 35, 36
dioxins 3, 19, 25, 33, 41, 42, 43, 68, 83
dioxins/furans 3, 19, 25, 41, 42, 43, 83
direct deposition 6, 38, 54
dry deposition 6, 13, 14, 16-19, 25-27, 30-37, 40-
43, 50, 54, 75
ecological impacts 6, 26
emission inventory 52
emission trading 71
emissions 5, 6, 9, 14, 26, 27, 32, 46, 50, 59, 60,
62, 64-73, 76
equipment 28, 32, 34, 40, 41, 42
eutrophication 26
F
filterpack 18, 30, 33, 35, 36, 37, 41, 42, 59
funding sources 78
furans 3, 19, 25, 41, 42, 43, 83
G
gas trap sampler 35
Great Waters Program 27, 73, 78
H
hazardous air pollutants 73, 83
HCB 3, 37
hexachlorobenzene. See HCB
Hybrid Single Particle LaGrangian Integrated Trajectory Model. See HYSPLIT
HYSPLIT 47, 52, 73, 77
I
IADN 19, 32, 37, 54
IMPROVE 20
indirect
deposition 54
load 6, 25, 31, 38, 39
information needs 24
Integrated Atmospheric Deposition Network. See IADN
Interagency Monitoring of Protected Visual Environments. See IMPROVE
isotope 34, 61
lead 3, 5, 6, 46, 51, 59, 73, 74, 76, 83
lindane 3
loading 9, 13, 14, 26, 46, 54, 55. See also TMDL
local source 9, 11, 32, 55. See also tributary contributions
M
mercury 3, 5, 8, 9, 14, 18, 19, 28, 37, 38, 41-43, 47, 50-
52, 54, 59, 60, 66, 68, 69, 70, 71, 76, 83
Mercury Deposition Network. See NADP-MDN
meteorological data 25, 26, 32, 34, 45, 46, 49, 50,
52, 60
mobile source 59, 61, 76
modeling 5, 8, 10, 25-27, 38, 40, 42, 43, 45-47, 49, 50,
51, 54, 55, 59, 60, 62, 63, 66, 73. See also source-
receptor modeling
costs 40
Models3/CMAQ 51
monitoring. See equipment, ambient air monitoring, direct
deposition, indirect deposition
networks. See CASTNet, IADN, IMPROVE, NADP-AIRMoN, NADP-MDN, NADP-NTN, NAMS/SLAMS/SPMS, NDAMN
studies. See studies
N
NAAQS 19, 67, 69
NADP 9
NADP-AIRMoN 18
NADP-MDN 18, 18-20, 28, 33
NADP-NTN 14, 28, 33
NAMS/SLAMS/SPMS 20, 36
National Ambient Air Quality Standard. See NAAQS
National Atmospheric Deposition Program. See NADP, NADP-AIRMoN, NADP-MDN, NADP-NTN
NDAMN 19, 20
networks 13-36
New Jersey Atmospheric Deposition Network 21
nitric acid 13, 18, 19, 30, 35, 50, 51
nitrogen 3, 5, 6, 9, 11, 13, 19, 26, 30, 35, 37, 38, 40, 42,
43, 46, 47, 50, 52, 54, 60, 61, 62, 66, 73, 74, 76
NTN. See NADP-NTN
PAH 3, 8, 19, 25, 41, 42, 43, 72, 74, 79
PCBs 3, 5, 8, 19, 25, 37, 41, 42, 43, 72
point source 14, 39, 54, 78
pollutants
atrazine. See atrazine
copper. See copper
DDE. See DDE
DDT. See DDT
dioxins/furans. See dioxins/furans
hexachlorobenzene. See HCB
lead. 5
TECHNICAL REPORT DATA
(Please read Instructions on reverse before completing)
1. REPORT NO.
EPA-453/R-01-009
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
Frequently Asked Questions About Atmospheric Deposition - A
Handbook for Watershed Managers
5. REPORT DATE
September 2001
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
U.S. Environmental Protection Agency
Office of Air Quality Planning and Standards
Office of Air and Radiation
Research Triangle Park, NC 27711
10. PROGRAM ELEMENT NO.
11. CONTRACT/GRANT NO.
68-D-98-030
12. SPONSORING AGENCY NAME AND ADDRESS
Director
Office of Air Quality Planning and Standards
Office of Air and Radiation
U.S. Environmental Protection Agency
Research Triangle Park, NC 27711
13. TYPE OF REPORT AND PERIOD COVERED
Final
14. SPONSORING AGENCY CODE
EPA/200/04
15. SUPPLEMENTARY NOTES
This report is issued jointly by the sponsoring agency (in 12) and the U.S. Environmental Protection
Agency;Office of Wetlands, Oceans, and Watersheds; Office of Water; Washington, DC 20460
16. ABSTRACT
Atmospheric deposition is recognized in many areas as contributing to water quality problems. This handbook provides basic information about atmospheric deposition, about how to determine whether it is problematic for a waterbody, and about air and water programs that could help address the problem.
17.
KEY WORDS AND DOCUMENT ANALYSIS
DESCRIPTORS
b. IDENTIFIERS/OPEN ENDED TERMS
c. COSATI Field/Group
Air Pollution, Air Pollution control,
Atmospheric Deposition, Air
Toxics, Hazardous Air Pollutants,
Great Waters, Nitrogen Compounds,
Eutrophication, Estuaries, Water
Pollution
18. DISTRIBUTION STATEMENT
Release Unlimited
19. SECURITY CLASS (Report)
Unclassified
20. SECURITY CLASS (Page)
Unclassified
21. NO. OF PAGES
97
22. PRICE
EPA Form 2220-1 (Rev. 4-77) PREVIOUS EDITION IS OBSOLETE